https://en.wikipedia.org/wiki/Automorphism%20group | In mathematics, the automorphism group of an object X is the group consisting of automorphisms of X under composition of morphisms. For example, if X is a finite-dimensional vector space, then the automorphism group of X is the group of invertible linear transformations from X to itself (the general linear group of X). If instead X is a group, then its automorphism group is the group consisting of all group automorphisms of X.
Especially in geometric contexts, an automorphism group is also called a symmetry group. A subgroup of an automorphism group is sometimes called a transformation group.
Automorphism groups are studied in a general way in the field of category theory.
Examples
If X is a set with no additional structure, then any bijection from X to itself is an automorphism, and hence the automorphism group of X in this case is precisely the symmetric group of X. If the set X has additional structure, then it may be the case that not all bijections on the set preserve this structure, in which case the automorphism group will be a subgroup of the symmetric group on X. Some examples of this include the following:
The automorphism group of a field extension L/K is the group consisting of field automorphisms of L that fix K. If the field extension is Galois, the automorphism group is called the Galois group of the field extension.
The automorphism group of the projective n-space over a field k is the projective linear group PGL(n+1, k).
The automorphism group of a finite cyclic group of order n is isomorphic to (Z/nZ)^×, the multiplicative group of integers modulo n, with the isomorphism given by a ↦ σ_a, where σ_a(x) = x^a. In particular, (Z/nZ)^× is an abelian group.
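As a quick computational check of the cyclic-group example (an illustrative sketch: written additively, each unit a modulo n gives the automorphism x ↦ a·x of Z/nZ):

```python
from math import gcd

def units_mod_n(n):
    """Elements of (Z/nZ)^x: residues coprime to n."""
    return [a for a in range(1, n) if gcd(a, n) == 1]

def automorphisms_of_cyclic(n):
    """Each unit a defines the automorphism x -> a*x mod n of Z/nZ (additive notation)."""
    return {a: {x: (a * x) % n for x in range(n)} for a in units_mod_n(n)}

n = 10
auts = automorphisms_of_cyclic(n)
print(len(auts))   # order of Aut(Z/nZ) = phi(10) = 4

# Composing the maps for units a and b gives the map for a*b mod n,
# confirming the isomorphism with the multiplicative group mod n:
a, b = 3, 7
composed = {x: auts[a][auts[b][x]] for x in range(n)}
print(composed == auts[(a * b) % n])   # True
```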
The automorphism group of a finite-dimensional real Lie algebra 𝔤 has the structure of a (real) Lie group (in fact, it is even a linear algebraic group: see below). If G is a Lie group with Lie algebra 𝔤, then the automorphism group of G has a structure of a Lie group induced from that on the automorphism group of 𝔤.
If G is a group acting on a set |
https://en.wikipedia.org/wiki/Ethics%20of%20quantification | Ethics of quantification is the study of the ethical issues associated with different visible or invisible forms of quantification. These include algorithms, metrics/indicators, and statistical and mathematical modelling, as noted in a review of various aspects of the sociology of quantification.
According to Espeland and Stevens, an ethics of quantification would naturally descend from a sociology of quantification, especially in an age where democracy, merit, participation, accountability and even ‘‘fairness’’ are assumed to be best discovered and appreciated via numbers. In his classic work Trust in Numbers, Theodore M. Porter notes how numbers meet a demand for quantified objectivity, and may for this reason be used by bureaucracies or institutions to gain legitimacy and epistemic authority.
For Andy Stirling of the STEPS Centre at Sussex University, there is a rhetorical element around concepts such as ‘expected utility’, ‘decision theory’, ‘life cycle assessment’, ‘ecosystem services’, ‘sound scientific decisions’ and ‘evidence-based policy’. The instrumental application of these techniques and their use of quantification to deliver an impression of accuracy may raise ethical concerns.
For Sheila Jasanoff these technologies of quantification can be labeled as 'Technologies of hubris', whose function is to reassure the public while keeping the wheels of science and industry turning. The downside of the technologies of hubris is that they may generate overconfidence thanks to the appearance of exhaustivity; they can preempt a political discussion by transforming a political problem into a technical one; and remain fundamentally limited in processing what takes place outside their restricted range of assumptions.
Jasanoff contrasts technologies of hubris with 'technologies of humility' which admit the existence of ambiguity, indeterminacy and complexity, and strive to bring to the surface the ethical nature of problems. Technologies of humility are also sensit |
https://en.wikipedia.org/wiki/Integrated%20services | In computer networking, integrated services or IntServ is an architecture that specifies the elements to guarantee quality of service (QoS) on networks. IntServ can for example be used to allow video and sound to reach the receiver without interruption.
IntServ specifies a fine-grained QoS system, which is often contrasted with DiffServ's coarse-grained control system.
Under IntServ, every router in the system implements IntServ, and every application that requires some kind of QoS guarantee has to make an individual reservation. Flow specs describe what the reservation is for, while RSVP is the underlying mechanism to signal it across the network.
Flow specs
There are two parts to a flow spec:
What does the traffic look like? Done in the Traffic SPECification part, also known as TSPEC.
What guarantees does it need? Done in the service Request SPECification part, also known as RSPEC.
TSPECs include token bucket algorithm parameters. The idea is that there is a token bucket which slowly fills up with tokens, arriving at a constant rate. Every packet which is sent requires a token, and if there are no tokens, then it cannot be sent. Thus, the rate at which tokens arrive dictates the average rate of traffic flow, while the depth of the bucket dictates how 'bursty' the traffic is allowed to be.
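The token-bucket behaviour described above can be sketched in a few lines (an illustrative Python model; the class and parameter names are assumptions, not taken from any IntServ specification):

```python
class TokenBucket:
    """Token bucket: tokens arrive at `rate` per second up to capacity `depth`;
    a packet may be sent only if a token is available."""
    def __init__(self, rate, depth):
        self.rate = rate        # token arrival rate (tokens/second) -> average traffic rate
        self.depth = depth      # bucket capacity -> maximum allowed burst
        self.tokens = depth     # start with a full bucket
        self.last = 0.0         # time of the previous send attempt

    def allow(self, now):
        """Try to send one packet at time `now`; True if a token was available."""
        self.tokens = min(self.depth, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

# Parameters matching the video example that follows: 750 tokens/s, burst of 10.
bucket = TokenBucket(rate=750, depth=10)
print(all(bucket.allow(0.0) for _ in range(10)))   # True: a 10-packet frame fits the depth
print(bucket.allow(0.0))                           # False: bucket empty, burst exceeded
```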
TSPECs typically just specify the token rate and the bucket depth. For example, a video with a refresh rate of 75 frames per second, with each frame taking 10 packets, might specify a token rate of 750 Hz, and a bucket depth of only 10. The bucket depth would be sufficient to accommodate the 'burst' associated with sending an entire frame all at once. On the other hand, a conversation would need a lower token rate, but a much higher bucket depth. This is because there are often pauses in conversations, so they can make do with fewer tokens by not sending the gaps between words and sentences. However, this means the bucket depth needs to be increased to compensate for the |
https://en.wikipedia.org/wiki/Bachelor%20of%20Mathematics | A Bachelor of Mathematics (abbreviated B.Math or BMath) is an undergraduate academic degree awarded for successfully completing a program of study in mathematics or related disciplines, such as applied mathematics, actuarial science, computational science, data analytics, financial mathematics, mathematical physics, pure mathematics, operations research or statistics. The Bachelor of Mathematics caters to high-achieving students seeking to develop a comprehensive specialised knowledge in a field of mathematics or a high level of sophistication in the applications of mathematics.
In practice, this is essentially equivalent to a Bachelor of Science or Bachelor of Arts degree with a speciality in mathematics. Relatively few institutions award Bachelor of Mathematics degrees, and the distinction between those that do and those that award B.Sc or B.A. degrees for mathematics is usually bureaucratic, rather than curriculum related.
List of institutions awarding Bachelor of Mathematics degrees
Australia
Flinders University, Adelaide, South Australia
Queensland University of Technology, Brisbane, Queensland
The Australian National University, Canberra, Australian Capital Territory (a Bachelor of Mathematical Sciences BMASC)
University of Adelaide, Adelaide, South Australia (a Bachelor of Mathematical Sciences BMathSc or Bachelor of Mathematical and Computer Sciences BMath&CompSc)
University of Newcastle, Newcastle, New South Wales
University of Western Sydney - Penrith, Parramatta, Campbelltown campuses in NSW.
Macquarie University, North Ryde, NSW.
University of Queensland, Brisbane, Queensland
University of South Australia, Adelaide, South Australia (a Bachelor of Mathematical Sciences BMathSc)
University of Wollongong, Wollongong, New South Wales
Bangladesh
University of Dhaka, Dhaka, Bangladesh
Jagannath University, Dhaka, Bangladesh
University of Chittagong, Chittagong, Bangladesh
Noakhali University of Science and Technology, Noakhali, Bangladesh
Can |
https://en.wikipedia.org/wiki/Trusted%20Information%20Systems | Trusted Information Systems (TIS) was a computer security research and development company during the 1980s and 1990s, performing computer and communications (information) security research for organizations such as NSA, DARPA, ARL, AFRL, SPAWAR, and others.
History
TIS was founded in 1983 by NSA veteran Steve Walker, and at various times employed notable information security experts including David Elliott Bell, Martha Branstad, John Pescatore, Marv Schaefer, Steve Crocker, Marcus Ranum, Wei Xu, John Williams, Steve Lipner and Carl Ellison. TIS was headquartered in Glenwood, Maryland, in a rural location. The company was started in Walker's basement on Shady Lane in Glenwood, MD. As the company grew, rather than move to Baltimore or the Washington D.C. suburbs, a small office building was constructed on land next to Walker's new home on Route 97.
Products
TIS projects included the following:
Trusted Xenix, the first commercially available B2 operating system;
Trusted Mach, a research project that influenced DTOS and eventually SELinux;
Domain and Type Enforcement (DTE) which likewise influenced SELinux;
FWTK Firewall Toolkit (the first open source firewall software) in 1993;
The first whitehouse.gov e-mail server, hosted at TIS headquarters from June 1, 1993 to January 20, 1995;
Gauntlet Firewall in 1994, one of the first commercial firewall products, with support for a broad range of Internet standards, including S/MIME, SNMP, DNS, DNSSEC, and many others. This firewall marked the inception of the third-generation firewall;
IP Security (IPsec) product in late 1994, known as the first commercial IPsec VPN product;
Encryption Recovery technology integrated with IPSEC, ISAKMP, IKE, and RSA.
TIS's operating system work directly affected BSD/OS, on which the Gauntlet Firewall and IPsec products were based, as well as Linux, FreeBSD, HP-UX, SunOS, Darwin, and others.
Post company
The company went public in 1996, and soon afterwards attempted to acquire |
https://en.wikipedia.org/wiki/PANDAS | Pediatric autoimmune neuropsychiatric disorders associated with streptococcal infections (PANDAS) is a controversial hypothetical diagnosis for a subset of children with rapid onset of obsessive-compulsive disorder (OCD) or tic disorders. Symptoms are proposed to be caused by group A streptococcal (GAS), and more specifically, group A beta-hemolytic streptococcal (GABHS) infections. OCD and tic disorders are hypothesized to arise in a subset of children as a result of a post-streptococcal autoimmune process. The proposed link between infection and these disorders is that an autoimmune reaction to infection produces antibodies that interfere with basal ganglia function, causing symptom exacerbations, and this autoimmune response results in a broad range of neuropsychiatric symptoms.
The PANDAS hypothesis, first described in 1998, was based on observations in clinical case studies by Susan Swedo et al. at the US National Institute of Mental Health and in subsequent clinical trials where children appeared to have dramatic and sudden OCD exacerbations and tic disorders following infections. Whether PANDAS was a distinct entity differing from other cases of tic disorders or OCD is debated. As the PANDAS hypothesis was unconfirmed and unsupported by data, a new definition was proposed by Swedo and colleagues in 2012. In addition to the 2012 broader pediatric acute-onset neuropsychiatric syndrome (PANS), two other categories have been proposed: childhood acute neuropsychiatric symptoms (CANS) and pediatric infection-triggered autoimmune neuropsychiatric disorders (PITAND). The CANS/PANS hypotheses include different possible mechanisms underlying acute-onset neuropsychiatric conditions, but do not exclude GAS infections as a cause in a subset of individuals. PANDAS, PANS and CANS are the focus of clinical and laboratory research but remain unproven.
There is no diagnostic test to accurately confirm PANDAS; the diagnostic criteria are unevenly applied and the conditions ma |
https://en.wikipedia.org/wiki/Motorola%206847 | The MC6847 is a Video Display Generator (VDG) first introduced by Motorola in 1978 and used in the TRS-80 Color Computer, Dragon 32/64, Laser 200, TRS-80 MC-10/Matra Alice, NEC PC-6000 series, Acorn Atom, and the APF Imagination Machine, among others. It is a relatively simple display generator compared to other display chips of the time. It is capable of displaying alphanumeric text, semigraphics and raster graphics contained within a roughly square display matrix 256 pixels wide by 192 lines high.
The ROM includes a 5 × 7 pixel font, compatible with 6-bit ASCII. Effects such as inverse video or colored text (green on dark green; orange on dark orange) are possible.
The hardware palette is composed of twelve colors: black, green, yellow, blue, red, buff (almost-but-not-quite white), cyan, magenta, and orange (two extra colors, dark green and dark orange, are the ink colours for all alphanumeric text mode characters, and a light orange color is available as an alternative to green as the background color). According to the MC6847 datasheet, the colors are formed by the combination of three signals: Y (with six possible levels), φA (or Pb, with three possible levels) and φB (or Pr, with three possible levels), based on the YPbPr colorspace, and then converted for output into an NTSC analog signal.
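To illustrate the principle of forming colors from a luma level plus two color-difference levels, here is the standard analog YPbPr-to-R'G'B' relation (BT.601 coefficients). This is only an illustration of the colorspace the datasheet references; the chip's actual analog circuitry and level quantization differ:

```python
def ypbpr_to_rgb(y, pb, pr):
    """Standard analog YPbPr -> R'G'B' conversion (BT.601 coefficients),
    with each output channel clamped to [0, 1]."""
    r = y + 1.402 * pr
    g = y - 0.344136 * pb - 0.714136 * pr
    b = y + 1.772 * pb
    return tuple(max(0.0, min(1.0, c)) for c in (r, g, b))

print(ypbpr_to_rgb(1.0, 0.0, 0.0))   # maximum luma, no color difference -> white
print(ypbpr_to_rgb(0.0, 0.0, 0.0))   # minimum luma -> black
```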
The low display resolution is a necessity of using television sets as display monitors. Making the display wider risked cutting off characters due to overscan. Compressing more dots into the display window would easily exceed the resolution of the television and be useless.
Variants
According to the datasheets, there are non-interlaced (6847) and interlaced (6847Y) variants, plus the 6847T1 (non-interlaced only). The chips can be found with ceramic (L suffix), plastic (P suffix) or CERDIP (S suffix) packages.
Die pictures
Signal levels and color palette
The chip outputs an NTSC-compatible progressive scan signal composed of one field of 262 lines, 60 times per second.
According to the MC6847 |
https://en.wikipedia.org/wiki/Tatung%20Company | Tatung Company is a multinational corporation established in 1918 and headquartered in Zhongshan, Taipei, Taiwan.
Description
Established in 1918 and headquartered in Taipei, Tatung Company holds 3 business groups, which include 8 business units: Industrial Appliance BU, Motor BU, Wire & Cable BU, Solar BU, Smart Meter BU, System Integration BU, Appliance BU, and Advanced Electronics BU. As a conglomerate, Tatung's investees are involved in major industries such as optoelectronics, energy, system integration, industrial systems, branding retail channels, and asset development.
History
Xie Zhi Business Enterprise, the forerunner of Tatung Company, was established in 1918 by Shang-Zhi Lin. It was involved in high-profile construction projects, including the Tamsui River embankment project and the Executive Yuan building.
In 1939, Tatung Iron Works was established as the company ventured into iron and steel manufacturing. Following the arrival of the ROC administration in 1945, Tatung Iron Works was renamed to Tatung Steel and Machinery Manufacturing Company. The company began mass production of electrical motors and appliances 10 years later in 1949.
In 1962 the company became publicly listed on the Taiwan Stock Exchange, and was renamed to Tatung Company in 1968. A year later, Tatung began production of color TVs, and adopted the "Tatung Boy" mascot, which became a Taiwanese cultural symbol.
Timeline
1970
Revenues exceeded NT$2.2 billion, making Tatung Taiwan's foremost private company.
1972
W. S. Lin, the grandson of Shang-Zhi Lin, was appointed as president of Tatung. Shortly thereafter he was implicated in a case of embezzlement at Tatung which would take more than ten years to litigate.
1977
Participated in the Ten Major Construction Projects with the construction of a slag treatment facility for China Steel and provision for Chiang Kai-shek International Airport's power control station.
2000
Chunghwa Picture Tubes was listed on the OTC marke |
https://en.wikipedia.org/wiki/Radio%20modem | Radio modems are modems that transfer data wirelessly across a range of up to tens of kilometres.
Using radio modems is a modern way to create Private Radio Networks (PRN). Private radio networks are used in critical industrial applications when real-time data communication is needed. Radio modems enable users to be independent of telecommunication or satellite network operators. In most cases users use licensed frequencies in either the UHF or VHF bands. In certain areas licensed frequencies may be reserved for a given user, ensuring less likelihood of radio interference from other RF transmitters. Licence-free frequencies are also available in most countries, enabling easy implementation, but other users may use the same frequency, so a given channel may be blocked.
Typical users for radio modems are: Land survey differential GPS, fleet management applications, SCADA applications (utility distribution networks), automated meter reading (AMR), telemetry applications and many more. Since applications usually require high reliability of data transfer and very high uptime, radio performance plays a key role. Factors influencing radio performance are: antenna height and type, the sensitivity of the radio, the output power of the radio and the complete system design.
See also
Flow control (data)
SATEL
Racom |
https://en.wikipedia.org/wiki/List%20of%20software%20based%20on%20Kodi%20and%20XBMC | This is a list of software projects or products that are third-party source ports, modified forks, or derivative works directly based on Kodi Entertainment Center (formerly XBMC Media Center), an open source media player application and entertainment platform developed by the non-profit technology consortium XBMC Foundation.
Kodi-XBMC is royalty-free and cross-platform. The core code is written in C++ and is open-source licensed under GNU GPL v2. It offers the possibility for easy rebranding by an original design manufacturer (ODM) or original equipment manufacturer (OEM), with customizing of interface look and feel using skins, and simple plug-ins from third-party developers, available via Python scripts for content extensions. Thus, many systems integrators have created modified versions of Kodi, along with a just enough operating system (JeOS) that are mostly used as a software appliance suite in a variety of devices including smart TVs, set-top boxes, digital signage, hotel television systems, in-flight entertainment platforms, frontend for pay-TV operators using IPTV or Pay-per-view, and network connected digital media players.
Popular derivative applications and devices such as MediaPortal, Plex, LibreELEC, OpenELEC, ToFu, Boxee, Horizon TV, and PrismCube have all initially been spun off from the Kodi Entertainment Center code base as their main software framework to create new digital ecosystems.
9x9 Player for 9x9CloudTV
9x9 Player, by the Santa Clara, CA-based 9x9Network, is an open source software media player client for 9x9Network's 9x9CloudTV peer-to-peer TV delivery network over the internet. The frontend of this media player client uses XBMC's source code as its application framework platform, and 9x9Network as a company also used to be an official sponsor of the XBMC development project.
Boxee
Boxee, made by startup company Boxee Inc., is a freeware and partly open source software cross-platform media center and entertainment hub with social networking fea |
https://en.wikipedia.org/wiki/Pinnacle | A pinnacle is an architectural element originally forming the cap or crown of a buttress or small turret, but afterwards used on parapets at the corners of towers and in many other situations. The pinnacle looks like a small spire. It was mainly used in Gothic architecture.
The pinnacle had two purposes:
Ornamental – adding to the loftiness and verticality of the structure. They sometimes ended with statues, such as in Milan Cathedral.
Structural – the pinnacles were very heavy and often rectified with lead, in order to enable the flying buttresses to contain the stress of the structure's vaults and roof. This was done by adding compressive stress (a result of the pinnacle's weight) to the thrust vector, thus shifting it downwards rather than sideways.
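The vector argument in the structural purpose above can be made concrete with a small sketch (the forces are purely illustrative numbers, not measurements from any real building): the heavier the pinnacle, the more steeply the resultant force points downward into the buttress.

```python
import math

# Hypothetical horizontal thrust from the vault, resisted by a flying buttress.
thrust_h = 100.0  # kN, illustrative

# Adding pinnacle weight (a vertical compressive force) rotates the resultant
# of the two forces downward, so it stays within the masonry of the buttress.
for weight in (0.0, 50.0, 100.0, 200.0):   # pinnacle weight in kN, illustrative
    angle = math.degrees(math.atan2(weight, thrust_h))
    print(f"weight {weight:6.1f} kN -> resultant {angle:5.1f} degrees below horizontal")
```

With no pinnacle the resultant is purely sideways (0 degrees); a pinnacle twice as heavy as the thrust steers it to about 63 degrees below horizontal.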
History
The accounts of Jesus' temptations in Matthew's and Luke's gospels both suggest that the Second Temple in Jerusalem had one or more pinnacles:
Then he (Satan) brought Him to Jerusalem, set Him on the pinnacle of the temple, and said to Him, “If You are the Son of God, throw Yourself down from here.
Some have stated that there were no pinnacles in the Romanesque style, but conical caps to circular buttresses, with finial terminations, are not uncommon in France at very early periods. Eugène Viollet-le-Duc gives examples from Saint-Germer-de-Fly Abbey and the Basilica of Saint-Remi, and there is one of similar form at the west front of Rochester Cathedral.
In the 12th-century Romanesque style, two examples have been cited: one from Bredon in Worcestershire, and the other from Cleeve in Gloucestershire. In these the buttresses run up, forming a sort of square turret crowned with a pyramidal cap, very much like those of the next period, the Early English.
In this and the following styles, mainly in Gothic architecture, the pinnacle seems generally to have had its appropriate uses. It was a weight to counteract the thrust of the vaults, particularly where there were flying buttresses; it stopped the tendenc |
https://en.wikipedia.org/wiki/HCONDELs | hCONDELs refer to regions of deletions within the human genome containing sequences that are highly conserved among closely related relatives. Almost all of these deletions fall within regions that perform non-coding functions. These represent a new class of regulatory sequences and may have played an important role in the development of specific traits and behavior that distinguish closely related organisms from each other.
Nomenclature
The group of CONDELs of a specific organism is specified by prefixing the CONDELs with the first letter of the organism. For instance, hCONDELs refer to the group of CONDELs found in humans whereas mCONDELs and cCONDELs refer to mouse and chimpanzee CONDELs respectively.
Identification of CONDELs
The term hCONDEL was first used in the 2011 Nature article by McLean et al. in whole-genome comparison analysis. This involved first identifying a subset of 37,251 human deletions (hDELs) through pairwise comparisons of the chimpanzee and macaque genomes. Chimpanzee sequences highly conserved in other species were then identified by pairwise alignment of chimpanzee with macaque, mouse and chicken sequences with BLASTZ, followed by multiple alignment of the pairwise alignments done with MULTIZ. The highly conserved chimpanzee sequences were searched against the human genome using BLAT to identify conserved regions not present in humans. This identified 583 regions of deletions that were then referred to as hCONDELs. 510 of these identified hCONDELs were then validated computationally, with 39 of these being validated by polymerase chain reaction (PCR).
Characteristics
hCONDELs cover approximately 0.14% of the chimpanzee genome. The number of hCONDELs currently identified is 583 using the genome-wide comparison method; however, validation of these predicted regions of deletion through polymerase chain reaction methods produces 510 hCONDELs. The remainder are either false positives or non-existent genes. hCONDELs ha |
https://en.wikipedia.org/wiki/Catterline%20Cartie%20Challenge | The Catterline Cartie Challenge is a competition for homemade soapbox carts (or "carties", as they are known locally) held annually in Catterline, near Stonehaven, Scotland. It is part of the Catterline Gala Weekend, held on the second weekend in June, with the carties being displayed at the gala on the Saturday and then time-trialled down the brae from the Creel Inn to the harbour the following day.
It was first held on 11/12 June 2005, when 11 carties were entered. The number of entries has grown in subsequent years, and in 2008 there were 26 carties taking part.
Prizes are awarded for the single fastest run (The Connons Shield), fastest aggregate time (Constructors Championship), Best Engineered, Best Decorated, Champagne Moment, Furthest Travelled, Cartie Sprint and "The Great Catterline Cartie Race".
The course is almost exactly 1000 ft (304.8m) long with a drop of almost 100 ft (30.5m) from start to finish, and the carties can reach speeds of around 30 mph at the finish line. As a result, the construction rules require the carties to have adequate brakes and steering. Other than these safety considerations, however, there are very few restrictions on the size and shape of the carties, and as a consequence there tends to be a wide range of designs entered, with many teams eschewing pure speed in favour of colourful novelty carties. These carties are very popular with the spectators and are often more memorable than the eventual winners.
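A back-of-the-envelope check of the quoted figures (an illustrative calculation, not from the source): energy conservation gives the frictionless top speed for the stated drop, and the gap between that and the observed ~30 mph shows how much is lost to rolling resistance, drag and braking.

```python
import math

g = 9.81      # gravitational acceleration, m/s^2
drop = 30.5   # course drop in metres (~100 ft, from the text)

# Frictionless speed at the bottom, from energy conservation: v = sqrt(2*g*h)
v_ideal = math.sqrt(2 * g * drop)
print(f"{v_ideal:.1f} m/s = {v_ideal * 2.23694:.0f} mph")   # ~24.5 m/s, ~55 mph

# Observed ~30 mph implies roughly (30/55)^2, i.e. about 30% of the potential
# energy surviving as kinetic energy at the finish line.
```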
Winners
2009
Connons Shield
1st. The Cheats / Tequilla Slammer
2nd. A La Cartie / The Auld Alliance
3rd. Bervie Allstars / The Bervie Bomber
Constructors Trophy
1st. The Cheats / Tequilla Slammer
2nd. Bitter and Twisted / Once a Fortnight
3rd. Team Weasel / The Flying Ferret
2008
Connons Shield
1st. A La Cartie / The Auld Alliance
2nd. Firstdrive Cars / The Bandit
3rd. Team Riley / The Black Bomber
Constructors Trophy
1st. Bitter and Twisted / Once a Fortnight
2nd. Firstdrive Cars / The Bandit
3r |
https://en.wikipedia.org/wiki/Laser-assisted%20device%20alteration | Laser-assisted device alteration (LADA) is a laser-based timing analysis technique used in the failure analysis of semiconductor devices. The laser is used to temporarily alter the operating characteristics of transistors on the device.
Theory of operation
The LADA technique targets a variable power continuous wave (CW) laser at specific device transistors. The laser is typically of a short wavelength variety on the order of 1064 nm. This allows the laser to generate photo carriers in the silicon without resulting in localized heating of the device. The LADA technique is somewhat similar in execution to the Soft Defect Localization (SDL) technique, except that SDL uses a longer wavelength laser (1340 nm) in order to induce localized heating rather than generate photo carriers. Both techniques require the device to be scanned with a laser while it is under active stimulation by the tester.
The device being tested is electrically stimulated and the device output is monitored. This technique is applied to the back side of the semiconductor device, thereby allowing direct access of the laser to the device active diffusion regions. The effect of the laser on the active transistor region is to generate a localized photocurrent. This photocurrent is a temporary effect and only occurs during the time that the laser is stimulating the target region. The creation of this photocurrent alters the transistor operating parameters, which may be observed as a change in function of the device. The effect of this change in parameters may be to speed up or slow down the operation of the device. This makes LADA a suitable technique for determining critical timing paths within a semiconductor circuit.
The laser has differing effects on NMOS and PMOS transistors. In the case of NMOS, the transistor will turn on. For PMOS, however, the effect is to lower the transistor threshold voltage. The effect on the PMOS transistor becomes proportionately stronger as the laser power is increased |
https://en.wikipedia.org/wiki/Delay-line%20oscillator | A delay-line oscillator is a form of electronic oscillator that uses a delay line as its principal timing element.
The circuit is set to oscillate by inverting the output of the delay line and feeding that signal back to the input of the delay line with appropriate amplification. The simplest style of delay-line oscillator, when properly designed, will oscillate with period exactly two times the delay period of the delay line. Additional outputs that are correlated in frequency with the main output but vary in phase can be derived by using additional taps from within the delay line.
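The period-doubling behaviour described above can be demonstrated with a discrete-time sketch (illustrative Python; the delay D is an assumed value): a FIFO models the delay line, and its inverted output is fed back to its input.

```python
# Discrete-time sketch of a delay-line oscillator: the delay line is a FIFO of
# D samples, and its inverted output is fed back into its input each step.
from collections import deque

D = 5                                  # delay, in time steps (assumed value)
line = deque([0] * D, maxlen=D)        # delay line initialised to logic 0
output = []
for _ in range(4 * D):
    out = line[0]                      # value emerging from the delay line
    output.append(out)
    line.append(1 - out)               # invert and feed back into the line

print(output)   # 0,0,0,0,0,1,1,1,1,1,0,... -> period = 2*D = 10 steps
```

Each logic level persists for one trip through the line before the inverted copy emerges, so a full cycle takes two delay periods, as stated above.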
The delay line may be realized with a physical delay line (such as an LC network or a transmission line). In contrast to a Phase-shift oscillator in which LC components are lumped, the capacitances and inductances are distributed through the length of the delay line. A ring oscillator uses a delay line formed from the gate delay of a cascade of logic gates. The timing of a circuit using a physical delay line is usually much more accurate. It is also easier to get such a circuit to oscillate in the desired mode.
The delay-line oscillator may be allowed to free run or it may be gated for use in asynchronous logic.
Since the optical cavity is a delay line, a laser can be regarded as a special case of the delay-line oscillator.
See also
Opto-electronic oscillator |
https://en.wikipedia.org/wiki/Receptor%20potential | A receptor potential, also known as a generator potential, is a type of graded potential: the transmembrane potential difference produced by activation of a sensory receptor.
A receptor potential is often produced by sensory transduction. It is generally a depolarizing event resulting from inward current flow. The influx of current will often bring the membrane potential of the sensory receptor towards the threshold for triggering an action potential. Receptor potential can work to trigger an action potential either within the same neuron or on an adjacent cell. Within the same neuron, a receptor potential can cause local current to flow to a region capable of generating an action potential by opening voltage-gated ion channels. A receptor potential can also cause the release of neurotransmitters from one cell that will act on another cell, generating an action potential in the second cell. The magnitude of the receptor potential determines the frequency with which action potentials are generated and is controlled by adaptation, stimulus strength, and temporal summation of successive receptor potentials. Receptor potential relies on receptor sensitivity which can adapt slowly, resulting in a slowly decaying receptor potential or rapidly, resulting in a quickly generated but shorter-lasting receptor potential.
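The relationship described above, where potential magnitude sets firing frequency and adaptation makes the potential decay, can be caricatured in a toy model (all parameters and the linear rate coding are assumptions for illustration, not physiological values):

```python
# Toy model: firing rate proportional to receptor potential above threshold,
# with the potential decaying exponentially as the receptor adapts.
import math

def receptor_potential(t, amplitude=20.0, tau=0.5):
    """Receptor potential (mV) at time t (s) after stimulus onset,
    decaying with adaptation time constant tau (assumed values)."""
    return amplitude * math.exp(-t / tau)

def firing_rate(potential_mv, gain=5.0, threshold=5.0):
    """Action-potential frequency (Hz): zero below threshold, linear above it."""
    return max(0.0, gain * (potential_mv - threshold))

for t in (0.0, 0.5, 1.0, 2.0):
    v = receptor_potential(t)
    print(f"t={t:.1f}s  potential={v:5.1f} mV  rate={firing_rate(v):5.1f} Hz")
```

A rapidly adapting receptor corresponds to a small tau (quickly generated but short-lasting response); a slowly adapting one to a large tau.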
An example of a receptor potential is in a taste bud, where taste is converted into an electrical signal sent to the brain. When stimulated, the taste bud triggers the release of neurotransmitters through exocytosis of synaptic vesicles from the presynaptic membrane. The neurotransmitter molecules diffuse across the synaptic cleft to the postsynaptic membrane of the primary sensory neuron, where they elicit an action potential.
See also
Resting potential
Action potential |
https://en.wikipedia.org/wiki/Baghdad%20Battery | The Baghdad Battery is the name given to a set of three artifacts which were found together: a ceramic pot, a tube of copper, and a rod of iron. It was discovered in present-day Khujut Rabu, Iraq in 1936, close to the metropolis of Ctesiphon, the capital of the Parthian (150 BC – 223 AD) and Sasanian (224–650 AD) empires, and it is believed to date from either of these periods. Similar artifacts have been found at nearby sites.
Its origin and purpose remain unclear. It was hypothesized by Wilhelm König, at the time director of the National Museum of Iraq, that the object functioned as a galvanic cell, possibly used for electroplating, or some kind of electrotherapy, but there is no electroplated object known from this period, and the claims are near-universally rejected by archaeologists. An alternative explanation is that it functioned as a storage vessel for sacred scrolls.
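For illustration only — the galvanic-cell hypothesis is, as noted, near-universally rejected — the open-circuit voltage an iron/copper pair could in principle produce follows from standard electrode potentials (real electrolytes and conditions would yield less):

```python
# Standard electrode potentials (V vs. the standard hydrogen electrode).
# A galvanic cell's ideal voltage is the cathode potential minus the anode's.
E_copper = +0.34   # Cu2+ + 2e- -> Cu (cathode)
E_iron   = -0.44   # Fe2+ + 2e- -> Fe (anode)

cell_voltage = E_copper - E_iron
print(f"{cell_voltage:.2f} V")   # ~0.78 V under standard conditions
```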
The artifact disappeared in 2003 during the US-led invasion of Iraq.
Physical description and dating
The artifacts consist of a terracotta pot approximately tall, with a mouth, containing a cylinder made of a rolled copper sheet, which houses a single iron rod. At the top, the iron rod is isolated from the copper by bitumen, with plugs or stoppers, and both rod and cylinder fit snugly inside the opening of the jar. The copper cylinder is not watertight, so if the jar were filled with a liquid, this would surround the iron rod as well. The artifact had been exposed to the weather and had suffered corrosion.
Austrian archeologist Wilhelm König thought the objects might date to the Parthian period, between 250 BC and AD 224. However, according to St John Simpson of the Near Eastern department of the British Museum, their original excavation and context were not well-recorded, and evidence for this date range is very weak. Furthermore, the style of the pottery is Sassanid (224–640).
Albert Al-Haik noted original reports from the 1936 dig at Khuyut Rabbou'a giving the location as an area nor |
https://en.wikipedia.org/wiki/Gular%20skin | Gular skin (throat skin), in ornithology, is an area of featherless skin on birds that joins the lower mandible of the beak (or bill) to the bird's neck. Other vertebrate taxa may have a comparable anatomical structure that is referred to as either a gular sac, throat sac, vocal sac or gular fold.
In birds
Gular skin can be very prominent, for example in members of the order Phalacrocoraciformes as well as in pelicans (which likely share a common ancestor). In many species, the gular skin forms a flap, or gular pouch, which is generally used to store fish and other prey while hunting.
In cormorants, the gular skin is often colored, contrasting with the otherwise plain black or black-and-white appearance of the bird. This presumably serves some function in social signalling, since the colors become more pronounced in breeding adults.
In frigatebirds, the gular skin (or gular sac or throat sac) is used dramatically. During courtship display, the male forces air into the sac, causing it to inflate over a period of 20 minutes into a startlingly huge red balloon.
Because cormorants are closer relatives of gannets and anhingas (which have no prominent gular pouch) than of frigatebirds or pelicans, it can be seen that the gular pouch is either plesiomorphic or was acquired by parallel evolution.
In other vertebrates
The orangutan is the only great ape known to have this characteristic, and in that species it is present only in males. In addition, the walrus and some species of gibbon, such as the siamang, have a throat sac. Many amphibians will inflate their vocal sac to create certain vocalizations in order to communicate, scare off rivals (to proclaim territory or dominance), and to locate and attract a mate. The gular sac in this instance amplifies their voice so it sounds louder and seemingly closer. Some species of lizard also have a gular fold and, consequently, gular scales.
The theropod dinosaur Pelecanimimus, which lived in the early Cretaceous Period 130 million years ago |
https://en.wikipedia.org/wiki/17776 | 17776 (also known as What Football Will Look Like in the Future) is a serialized speculative fiction multimedia narrative by Jon Bois, published online through SB Nation. Set in the distant future in which all humans have become immortal and infertile, the series follows three sapient space probes that watch humanity play an evolved form of American football in which games can be played for millennia over distances of thousands of miles. The series debuted on July 5, 2017, and new chapters were published daily until the series concluded with its twenty-fifth chapter on July 15, 2017.
Bois began developing 17776 in 2016. Because the story incorporates text, animated GIFs, still images, and videos hosted on YouTube, new tools were developed to allow it to be hosted efficiently on the SB Nation website. The work explores themes of consciousness, hope, despair, and why humans play sports. 17776 was well received by critics, who praised it for its innovative use of its medium and for the depth of emotion it evoked. In 2018, the story won a National Magazine Award for Digital Innovation and was longlisted for both the Hugo Awards for Best Novella and Best Graphic Story.
It is followed by a sequel series: 20020, released from September to October 2020, which Bois intends to follow up with a further series entitled 20021. The sequel series follows a 111-team game of college football on fields spanning 236 million yards across the United States.
Premise
The story takes place on a future Earth where humans stopped dying, aging, and being born in 2026. All social ills were subsequently eliminated, and technology preventing humans from any injury was developed. In the United States, American football evolved to include new rules, including those that allow fields thousands of miles long, hundreds of in-game players, and games millennia long. Over time, computers gained sentience due to constant exposure to broadcast human data.
By the year 17776, the space probe Pioneer 9 |
https://en.wikipedia.org/wiki/Deniable%20authentication | In cryptography, deniable authentication refers to message authentication between a set of participants where the participants themselves can be confident in the authenticity of the messages, but it cannot be proved to a third party after the event.
In practice, deniable authentication between two parties can be achieved through the use of message authentication codes (MACs) by making sure that if an attacker is able to decrypt the messages, they would also know the MAC key as part of the protocol, and would thus be able to forge authentic-looking messages. For example, in the Off-the-Record Messaging (OTR) protocol, MAC keys are derived from the asymmetric decryption key through a cryptographic hash function. In addition to that, the OTR protocol also reveals used MAC keys as part of the next message, after they have already been used to authenticate previously received messages, and will not be re-used.
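The forgeability argument can be sketched in a few lines. The snippet below is an illustrative sketch, not the OTR protocol itself: it uses HMAC-SHA256 with an assumed shared key to show that either key holder can produce a valid tag, so a transcript cannot prove authorship to a third party.

```python
import hashlib
import hmac

def mac(key: bytes, message: bytes) -> bytes:
    """Compute an HMAC-SHA256 tag over the message."""
    return hmac.new(key, message, hashlib.sha256).digest()

# Alice and Bob share a symmetric key (illustrative value).
shared_key = b"session key known to both parties"

# Alice authenticates a message to Bob.
msg = b"meet at noon"
tag = mac(shared_key, msg)

# Bob verifies the tag with the same key ...
assert hmac.compare_digest(tag, mac(shared_key, msg))

# ... but Bob holds the very same key, so he could have produced the
# tag himself: the pair (msg, tag) proves nothing to a third party.
forged_tag = mac(shared_key, b"Alice never said this")
assert hmac.compare_digest(forged_tag, mac(shared_key, b"Alice never said this"))
```

Because verification and forgery use the same operation, the MAC authenticates messages between the participants while remaining deniable afterwards.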
See also
Deniable encryption
Plausible deniability
Malleability
Undeniable signature |
https://en.wikipedia.org/wiki/Methuselah%20Foundation | The Methuselah Foundation is an American global non-profit organization based in Springfield, Virginia, with a declared mission to "make 90 the new 50 by 2030" by supporting tissue engineering and regenerative medicine therapies. The organization was originally incorporated by David Gobel in 2001 as the Performance Prize Society, a name inspired by the British government's Longitude Act, which offered monetary rewards for anyone who could devise a portable, practical solution for determining a ship's longitude.
Founding
In 2003, David Gobel, Aubrey de Grey, and Dane Gobel rebranded the organization as the Methuselah Foundation, named after Methuselah, the grandfather of Noah in the Hebrew Bible, whose lifespan was recorded as 969 years.
The new name was introduced at the 32nd Annual Meeting of the American Aging Association, where they awarded the first Methuselah Mouse Prize to Andrej Bartke for his work on mice that lived the equivalent of 180 human years.
The Foundation's work includes:
incubating and investing in early-stage life science companies,
funding scientific research,
providing fiscal sponsorship to aligned projects,
and sponsoring inducement prizes.
Throughout its history, the Foundation has helped to reshape the perception of longevity research among the public and the scientific community. When it was launched, the field of longevity science was largely thought to be a playground for eccentrics. Today, anti-aging or longevity research exists in the scientific mainstream and represents a $7 trillion marketplace.
All of the Foundation's grants, investments, prizes and policy decisions follow seven strategies:
New Parts for People recognizes that age takes a toll on human bodies. The strategy is designed to promote technologies that create replacement parts for human bodies, such as organs, cartilage, bones, and vasculature.
Get the crud out acknowledges that cellular processes can result in harmful byproducts over time. The strategy promo |
https://en.wikipedia.org/wiki/Cryptography%20law | Cryptography is the practice and study of encrypting information, or in other words, securing information from unauthorized access. There are many different cryptography laws in different nations. Some countries prohibit the export of cryptography software and/or encryption algorithms or cryptanalysis methods. Some countries require decryption keys to be recoverable in case of a police investigation.
Overview
Issues regarding cryptography law fall into four categories:
Export control, which is the restriction on export of cryptography methods within a country to other countries or commercial entities. There are international export control agreements, the main one being the Wassenaar Arrangement. The Wassenaar Arrangement was created after the dissolution of COCOM (Coordinating Committee for Multilateral Export Controls), which in 1989 "decontrolled password and authentication-only cryptography."
Import controls, which is the restriction on using certain types of cryptography within a country.
Patent issues, which deal with the use of cryptography tools that are patented.
Search and seizure issues, on whether and under what circumstances, a person can be compelled to decrypt data files or reveal an encryption key.
Legal issues
Prohibitions
Cryptography has long been of interest to intelligence gathering and law enforcement agencies. Secret communications may be criminal or even treasonous. Because of its facilitation of privacy, and the diminution of privacy attendant on its prohibition, cryptography is also of considerable interest to civil rights supporters. Accordingly, there has been a history of controversial legal issues surrounding cryptography, especially since the advent of inexpensive computers has made widespread access to high-quality cryptography possible.
In some countries, even the domestic use of cryptography is, or has been, restricted. Until 1999, France significantly restricted the use of cryptography domestically, though it has since rel |
https://en.wikipedia.org/wiki/PRR35 | Proline rich 35 is a protein that in humans is encoded by the PRR35 gene. |
https://en.wikipedia.org/wiki/Process%20control%20monitoring | In the application of integrated circuits, process control monitoring (PCM) is the procedure followed to obtain detailed information about the process used.
PCM is associated with designing and fabricating special structures that can monitor technology specific parameters such as Vth in CMOS and Vbe in bipolars. These structures are placed across the wafer at specific locations along with the chip produced so that a closer look into the process variation is possible.
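As a rough illustration of how readings from such monitor structures might be aggregated across a wafer (the site names, Vth values, and the 20 mV control limit below are all invented for the sketch, not taken from any real process):

```python
import statistics

# Hypothetical threshold-voltage (Vth) readings, in volts, from PCM
# test structures placed at five sites across a wafer.
vth_by_site = {
    "center":      0.452,
    "top_edge":    0.461,
    "bottom_edge": 0.458,
    "left_edge":   0.447,
    "right_edge":  0.455,
}

readings = list(vth_by_site.values())
mean_vth = statistics.mean(readings)
spread = max(readings) - min(readings)

# Flag the wafer if the cross-wafer Vth spread exceeds a
# (hypothetical) process-control limit of 20 mV.
within_spec = spread <= 0.020

print(f"mean Vth = {mean_vth:.3f} V, spread = {spread * 1000:.1f} mV, "
      f"within spec: {within_spec}")
```

Collecting the same parameter at several locations is what makes the spatial process variation visible, which is the point of placing the structures across the wafer rather than at a single site.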
https://en.wikipedia.org/wiki/Exchange%20operator | In quantum mechanics, the exchange operator , also known as permutation operator, is a quantum mechanical operator that acts on states in Fock space. The exchange operator acts by switching the labels on any two identical particles described by the joint position quantum state . Since the particles are identical, the notion of exchange symmetry requires that the exchange operator be unitary.
Construction
In three or higher dimensions, the exchange operator can represent a literal exchange of the positions of the pair of particles by motion of the particles in an adiabatic process, with all other particles held fixed. Such motion is often not carried out in practice. Rather, the operation is treated as a "what if" similar to a parity inversion or time reversal operation. Consider two repeated operations of such a particle exchange:
Therefore, is not only unitary but also an operator square root of 1, which leaves the possibilities
Both signs are realized in nature. Particles satisfying the case of +1 are called bosons, and particles satisfying the case of −1 are called fermions. The spin–statistics theorem dictates that all particles with integer spin are bosons whereas all particles with half-integer spin are fermions.
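The two-exchange argument can be written out explicitly; in notation introduced here for concreteness (the article's own inline symbols did not survive extraction), for a pair state:

```latex
\hat{P}\,\psi(x_1, x_2) = \psi(x_2, x_1),
\qquad
\hat{P}^{2}\,\psi(x_1, x_2) = \psi(x_1, x_2)
\;\Longrightarrow\;
\hat{P}^{2} = \mathbb{1},
```

so any eigenvalue λ of the (unitary, hence here Hermitian) exchange operator satisfies λ² = 1, leaving λ = +1 for bosons and λ = −1 for fermions.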
The exchange operator commutes with the Hamiltonian and is therefore a conserved quantity. Therefore, it is always possible and usually most convenient to choose a basis in which the states are eigenstates of the exchange operator. Such a state is either completely symmetric under exchange of all identical bosons or completely antisymmetric under exchange of all identical fermions of the system. To do so for fermions, for example, the antisymmetrizer builds such a completely antisymmetric state.
In 2 dimensions, the adiabatic exchange of particles is not necessarily possible. Instead, the eigenvalues of the exchange operator may be complex phase factors (in which case is not Hermitian), see anyon for this case. The exchange operato |
https://en.wikipedia.org/wiki/Komar%20superpotential | In general relativity, the Komar superpotential, corresponding to the invariance of the Hilbert–Einstein Lagrangian , is the tensor density:
associated with a vector field , and where denotes covariant derivative with respect to the Levi-Civita connection.
The Komar two-form:
where denotes interior product, generalizes to an arbitrary vector field the so-called above Komar superpotential, which was originally derived for timelike Killing vector fields.
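For reference, one commonly cited convention writes the superpotential and the associated two-form as follows; this is a sketch in notation introduced here, with the overall normalization (8πG versus 16πG) varying between authors:

```latex
\mathcal{U}^{\alpha\beta}(\xi)
  = \frac{\sqrt{-g}}{8\pi G}\,\nabla^{[\alpha}\xi^{\beta]}
  = \frac{\sqrt{-g}}{16\pi G}
    \left(\nabla^{\alpha}\xi^{\beta} - \nabla^{\beta}\xi^{\alpha}\right),
\qquad
\mathcal{K}(\xi) \;\propto\; \star\,\mathrm{d}\xi^{\flat},
```

where the two-form is proportional to the Hodge dual of the exterior derivative of the one-form metrically dual to the vector field ξ.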
The Komar superpotential is affected by the anomalous factor problem: when computed, for example, on the Kerr–Newman solution, it produces the correct angular momentum but just one-half of the expected mass.
See also
Superpotential
Einstein–Hilbert action
Komar mass
Tensor calculus
Christoffel symbols
Riemann curvature tensor
Notes |
https://en.wikipedia.org/wiki/RTAI | Real-time application interface (RTAI) is a real-time extension for the Linux kernel, which lets users write applications with strict timing constraints for Linux. Like Linux itself, the RTAI software is a community effort. RTAI provides deterministic response to interrupts, and both POSIX-compliant and native RTAI real-time tasks. RTAI supports several architectures, including IA-32 (with and without FPU and TSC), x86-64, PowerPC, ARM (StrongARM and ARM7: clps711x-family, Cirrus Logic EP7xxx, CS89712, PXA25x), and MIPS.
RTAI consists mainly of two parts: an Adeos-based patch to the Linux kernel which introduces a hardware abstraction layer, and a broad variety of services which make the lives of real-time programmers easier. RTAI versions over 3.0 use an Adeos kernel patch, slightly modified in the x86 architecture case, providing additional abstraction and much lessened dependencies on the "patched" operating system. Adeos is a kernel patch comprising an Interrupt Pipeline in which different operating system domains register interrupt handlers. This way, RTAI can transparently take over interrupts while leaving the processing of all others to Linux. Use of Adeos also frees RTAI from the patent restrictions arising from the RTLinux project.
RTAI-XML
RTAI-XML is a server component of RTAI, implementing a service-oriented way to design and develop real-time (RT) control applications.
This project was born to meet the needs of a university group that wanted a flexible platform for teaching control systems design, one that would allow students to test their programs remotely over the Internet. Moving from initial wishful thinking to a real implementation gave rise to the alpha version of RTAI-XML, which showed the potential impact of the basic idea of cleanly separating hard and soft real-time tasks in the program logic. What was originally necessary to ensure that students could not crash the RT process is now becoming a new RTAI paradigm.
RTAI-XML consists of a server compo |
https://en.wikipedia.org/wiki/Digital%20badge | Digital badges (also known as ebadges, or singularly as ebadge) are a validated indicator of accomplishment, skill, quality or interest that can be earned in various learning environments.
Origin and development
Traditional physical badges have been used for many years by various organizations such as the Russian Army and the Boy Scouts of America to give members a physical emblem to display the accomplishment of various achievements.
While physical badges have been in use for hundreds of years, the idea of digital badges is a relatively recent development drawn from research into gamification. As game elements, badges have been used by organizations such as Foursquare and Huffington Post to reward users for accomplishing certain tasks. In 2005, Microsoft introduced the Xbox 360 Gamerscore system, which is considered to be the original implementation of an achievement system.
According to Shields & Chugh (2017, pg 1817), "digital badges are quickly becoming an appropriate, easy and efficient way for educators, community groups and other professional organisations to exhibit and reward participants for skills obtained in professional development or formal and informal learning".
In 2007, Eva Baker, the President of the American Educational Research Association (AERA), gave the Presidential Address at their annual conference on the need to develop merit-badge-like "Qualifications" that certify accomplishments, not through standardized tests, but as "an integrated experience with performance requirements." Such a system would apply to learning both in and out of school and support youth to develop and pursue passionate interests. Baker envisioned youth assembling "their unique Qualifications to show to their families, to adults in university and workforce, and to themselves." Ultimately, Baker believed "the path of Qualifications shifts attention from schoolwork to usable and compelling skills, from school life to real life."
In early 2010, the digital badge serv |
https://en.wikipedia.org/wiki/Calcium%20signaling | Calcium signaling is the use of calcium ions (Ca2+) to communicate and drive intracellular processes, often as a step in signal transduction. Ca2+ is important for cellular signaling, for once it enters the cytosol it exerts allosteric regulatory effects on many enzymes and proteins. Ca2+ can act in signal transduction resulting from activation of ion channels or as a second messenger caused by indirect signal transduction pathways such as G protein-coupled receptors.
Concentration regulation
The resting concentration of Ca2+ in the cytoplasm is normally maintained around 100 nM. This is 20,000- to 100,000-fold lower than typical extracellular concentration. To maintain this low concentration, Ca2+ is actively pumped from the cytosol to the extracellular space, the endoplasmic reticulum (ER), and sometimes into the mitochondria. Certain proteins of the cytoplasm and organelles act as buffers by binding Ca2+. Signaling occurs when the cell is stimulated to release Ca2+ ions from intracellular stores, and/or when Ca2+ enters the cell through plasma membrane ion channels. Under certain conditions, the intracellular Ca2+ concentration may begin to oscillate at a specific frequency.
Phospholipase C pathway
Specific signals can trigger a sudden increase in the cytoplasmic Ca2+ levels to 500–1,000 nM by opening channels in the ER or the plasma membrane. The most common signaling pathway that increases cytoplasmic calcium concentration is the phospholipase C (PLC) pathway.
Many cell surface receptors, including G protein-coupled receptors and receptor tyrosine kinases, activate the PLC enzyme.
PLC uses hydrolysis of the membrane phospholipid PIP2 to form IP3 and diacylglycerol (DAG), two classic secondary messengers.
DAG attaches to the plasma membrane and recruits protein kinase C (PKC).
IP3 diffuses to the ER and is bound to the IP3 receptor.
The IP3 receptor serves as a Ca2+ channel, and releases Ca2+ from the ER.
The Ca2+ bind to PKC and o |
https://en.wikipedia.org/wiki/List%20of%20ideological%20symbols | This is a partial list of symbols and labels used by political parties and groups around the world. Some symbols are associated with a worldwide ideology or movement, and used by many parties that support that ideology. Others are country-specific.
Colors
Worldwide
Black – anarchism, Arab nationalism, black nationalism, Islamism, pirate parties
Blue – American liberalism, conservatism, Japanese liberalism, men's rights movement, pro-Europeanism, Zionism
Brown – fascism, nazism
Gold – capitalism, classical liberalism, right-libertarianism
Green – agrarianism, anarcho-egoism, anarcho-primitivism, Arab nationalism, black nationalism, capitalism, environmentalism, green anarchism, green politics, Irish republicanism, Islamism
Gray – independent politicians
Lavender – LGBT movements, transgender rights movement
Magenta – centrism
Orange – Christian democracy, populism, mutualist anarchism, classical liberalism, Ulster unionism
Pink – feminism, LGBT movements, transgender rights movement
Purple – monarchism, royalism
Red – American conservatism, Arab nationalism, communism, democratic socialism, Japanese conservatism, social democracy, socialism, Zionism
Saffron – Hindu nationalism
White – anti-communism, Arab nationalism, independent politicians, monarchism, pacifism, white nationalism, Zionism
Yellow – liberalism, right-libertarianism
Australia
Blue – The Liberal Party
Red – The Labor Party
Green – The Greens
Green and yellow – The National Party
Bangladesh
Blue, red and green – Bangladesh Nationalist Party
Fern green – Bangladesh Jamaat-e-Islami
Green – Bangladesh Awami League
Yellow – Jatiyo Party
Canada
Blue – Conservative Party of Canada
Green – Green Party of Canada
Light blue – Bloc Québécois
Orange – New Democratic Party
Purple – People's Party of Canada
Red – Liberal Party of Canada
France
Red – La France Insoumise
Red – Parti Communiste Français
Pink – Parti Socialiste
Green – Europe Ecologie Les Verts
Ora |
https://en.wikipedia.org/wiki/Illumio | Illumio is an American business data center and cloud computing security company.
History
Illumio was founded in 2013 by Andrew Rubin and P. J. Kirner and is headquartered in Sunnyvale, California, United States.
The initial $8 million round of venture capital was led by Andreessen Horowitz. Steve Herrod, former CTO of VMware and managing director of General Catalyst, led the company’s $34.5 million round with participation by Formation 8, Data Collective, Salesforce.com CEO Marc Benioff, and Yahoo! founder Jerry Yang. BlackRock and Accel led a $100 million round with participation from a number of investors, including private investors John W. Thompson, Jerry Yang, and Marc Benioff, as well as previous investors.
In September 2019, Illumio was ranked #25 in the Forbes Cloud 100 list.
In February 2020, Stowe Australia, Australia's oldest and largest private electrical contractor, selected Illumio to provide security for its data centers across Australia.
Technology
Illumio’s technology decouples security from the underlying network and hypervisor. This allows for a security approach that works across a variety of computing environments, including private data centers, private clouds, and public clouds.
Illumio Adaptive Security Platform (ASP) uses the context (state, relationships, etc.) of workloads (bare-metal and virtual servers, etc.) in the computing environment and keeps security policies intact.
Unlike traditional security systems such as firewalls that rely on imperative programming techniques due to static networking constructs, Illumio Adaptive Security Platform is based on declarative programming and computes security in real time. |
https://en.wikipedia.org/wiki/Langmuir%20circulation | In physical oceanography, Langmuir circulation consists of a series of shallow, slow, counter-rotating vortices at the ocean's surface aligned with the wind.
These circulations are developed when wind blows steadily over the sea surface.
Irving Langmuir discovered this phenomenon after observing windrows of seaweed in the Sargasso Sea in 1927.
Langmuir circulations are confined to the mixed layer; however, it is not yet clear how strongly they can cause mixing at the base of the mixed layer.
Theory
The driving force of these circulations is an interaction of the mean flow with wave averaged flows of the surface waves.
Stokes drift velocity of the waves stretches and tilts the vorticity of the flow near the surface.
The production of vorticity in the upper ocean is balanced by downward (often turbulent) diffusion.
For a flow driven by a wind characterized by friction velocity the ratio of vorticity diffusion and production defines the Langmuir number
where the first definition is for a monochromatic wave field of amplitude , frequency , and wavenumber and the second uses a generic inverse length scale , and Stokes velocity scale .
This is exemplified by the Craik–Leibovich equations
which are an approximation of the Lagrangian mean.
In the Boussinesq approximation the governing equations can be written
where
is the fluid velocity,
is planetary rotation,
is the Stokes drift velocity of the surface wave field,
is the pressure,
is the acceleration due to gravity,
is the density,
is the reference density,
is the viscosity, and
is the diffusivity.
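The displayed equations can be summarized as follows; this is a commonly quoted form of the Craik–Leibovich momentum equation under the Boussinesq approximation, sketched here in notation matching the symbol list above (signs and grouping of terms vary between authors):

```latex
\frac{\partial\mathbf{u}}{\partial t}
 + (\mathbf{u}\cdot\nabla)\,\mathbf{u}
 + \mathbf{f}\times\left(\mathbf{u}+\mathbf{u}_s\right)
 = -\frac{1}{\rho_0}\nabla p
   + \mathbf{u}_s\times\left(\nabla\times\mathbf{u}\right)
   - \frac{\rho}{\rho_0}\,g\,\hat{\mathbf{z}}
   + \nu\,\nabla^{2}\mathbf{u},
```

where the vortex force u_s × (∇ × u) is the term distinctive of the Craik–Leibovich equations, and a widely used turbulent Langmuir number is La_t = (u*/U_s)^(1/2), the square root of the ratio of the friction velocity to the Stokes velocity scale.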
In the open ocean conditions where there may not be a dominant length scale controlling the scale of the Langmuir cells the concept of Langmuir Turbulence is advanced.
Observations
The circulation has been observed to lie between 0° and 20° to the right of the wind in the northern hemisphere, with the helix forming bands of divergence and convergence at the surface.
At the convergence zones, there ar |
https://en.wikipedia.org/wiki/Allan%20Hills%2077005 | Allan Hills 77005 (also known as Allan Hills A77005, ALHA77005, ALH77005 and ALH-77005) is a Martian meteorite that was found in the Allan Hills of Antarctica in 1977 by a Japanese National Institute of Polar Research mission team and ANSMET. Like other members of the group of SNCs (shergottite, nakhlite, chassignite), ALH-77005 is thought to be from Mars.
Description
On discovery, the mass of ALH-77005 was . Initial geological examination determined that the meteorite was composed of ~55% olivine, ~35% pyroxene, ~8% maskelynite and ~2% opaques.
In March 2019, researchers reported the possibility of biosignatures in this Martian meteorite based on its microtexture and morphology as detected with optical microscopy and FTIR-ATR microscopy, and on the detection of mineralized organic compounds, suggesting that microbial life could have existed on the planet Mars. More broadly, and as a result of their studies, the researchers suggest Solar System materials should be carefully studied to determine whether there may be signs of microbial forms within other space rocks as well.
See also
Allan Hills 84001
Glossary of meteoritics
History of Mars observation
Life on Mars
List of Martian meteorites on Earth
List of meteorites on Mars
Nakhla meteorite
Mars sample return mission
Panspermia
Shergotty meteorite
Water on Mars |
https://en.wikipedia.org/wiki/Brine%20rejection | Brine rejection is a process that occurs when salty water freezes. The salts do not fit in the crystal structure of water ice, so the salt is expelled.
Since the oceans are salty, this process is important in nature. Salt rejected by the forming sea ice drains into the surrounding seawater, creating saltier, denser brine. The denser brine sinks, influencing ocean circulation.
Formation
As water reaches the temperature where it begins to crystallize and form ice, salt ions are rejected from the lattices within the ice and either forced out into the surrounding water or trapped among the ice crystals in pockets called brine cells. Generally, sea ice has a salinity ranging from 0 psu at the surface to 4 psu at the base. The faster this freezing process occurs, the more brine cells are left in the ice. Once the ice reaches a critical thickness, roughly 15 cm, the concentration of salt ions in the liquid around the ice begins to increase as leftover brine is rejected from the cells. This increase is associated with the appearance of strong convective plumes, which flow from channels within the ice and carry a significant salt flux. The brine that drains from the newly formed ice is replaced by a weak flow of relatively fresh water from the liquid region below it. The new water partially freezes within the pores of the ice, increasing the solidity of the ice.
As sea ice ages and thickens, the initial salinity of the ice decreases due to the rejection of brine over time [Fig. 2]. While the sea ice ages, desalinization occurs to such a degree that some multiyear ice has a salinity of less than 1 psu. This occurs in three different ways:
solute diffusion - this depends on the fact that brine inclusions trapped in ice will begin to migrate toward the warmer end of the ice block. The ice block is warmest at the water-ice interface, thus pushing the brine out into the water surrounding the ice.
gravity drainage - Gravity drainage involves the movement of brine d |
https://en.wikipedia.org/wiki/Kryptographik | Kryptographik Lehrbuch der Geheimschreibekunst (Cryptology: Instruction Book on the Art of Secret Writing) is an 1809 book on cryptography written by Johann Ludwig Klüber.
In 2011 the National Security Agency included a copy of Kryptographik, used by a German cryptographer during World War II, as part of a 50,000 page release of classified documents. |
https://en.wikipedia.org/wiki/Control%20variable | A control variable (or scientific constant) in scientific experimentation is an experimental element which is constant (controlled) and unchanged throughout the course of the investigation. Control variables could strongly influence experimental results if they were not held constant during the experiment, which would obscure the relationship between the dependent variable (DV) and the independent variable (IV). The control variables themselves are not of primary interest to the experimenter.
Usage
A variable that is held constant in an experiment in order to assess the relationship between other variables is a control variable. A control variable is an element that is not changed throughout an experiment because its unchanging state allows better understanding of the relationship between the other variables being tested.
In any system existing in a natural state, many variables may be interdependent, with each affecting the other. Scientific experiments test the relationship of an IV (or independent variable: that element that is manipulated by the experimenter) to the DV (or dependent variable: that element affected by the manipulation of the IV). Any additional independent variable can be a control variable.
A control variable is an experimental condition or element that is kept the same throughout the experiment, and it is not of primary concern in the experiment, nor will it influence the outcome of the experiment. Any unexpected (e.g.: uncontrolled) change in a control variable during an experiment would invalidate the correlation of dependent variables (DV) to the independent variable (IV), thus skewing the results, and invalidating the working hypothesis. This indicates the presence of a spurious relationship existing within experimental parameters. Unexpected results may result from the presence of a confounding variable, thus requiring a re-working of the initial experimental hypothesis. Confounding variables are a threat to the internal validity of |
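A minimal simulation can illustrate why holding a variable constant matters. In the sketch below (the model, coefficients, and noise levels are invented for illustration), the outcome y depends on both the IV x and a background variable z; when z is allowed to co-vary with x instead of being controlled, the estimated effect of x is badly biased.

```python
import random

random.seed(0)

def estimated_slope(control_z: bool, n: int = 10_000) -> float:
    """Estimate the slope of y on x.  True model: y = 2*x + 3*z + noise.
    If control_z is True, z is held at a constant; otherwise z is
    confounded with x."""
    xs, ys = [], []
    for _ in range(n):
        x = random.uniform(0, 1)
        z = 0.5 if control_z else x + random.uniform(-0.1, 0.1)
        y = 2 * x + 3 * z + random.gauss(0, 0.05)
        xs.append(x)
        ys.append(y)
    # Ordinary least-squares slope of y on x.
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((xi - mx) * (yi - my) for xi, yi in zip(xs, ys))
    var = sum((xi - mx) ** 2 for xi in xs)
    return cov / var

print(f"slope with z controlled:  {estimated_slope(True):.2f}")   # near 2, the true effect of x
print(f"slope with z confounded:  {estimated_slope(False):.2f}")  # near 5, biased by z
```

The uncontrolled case is exactly the spurious-relationship scenario described above: the apparent effect of the IV absorbs the contribution of the uncontrolled variable.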
https://en.wikipedia.org/wiki/Affix%20grammar | An affix grammar is a kind of formal grammar; it is used to describe the syntax of languages, mainly computer languages, using an approach based on how natural language is typically described.
The grammatical rules of an affix grammar are those of a context-free grammar, except that certain parts in the nonterminals (the affixes) are used as arguments. If the same affix occurs multiple times in a rule, its value must agree, i.e. it must be the same everywhere. In some types of affix grammar, more complex relationships between affix values are possible.
Example
We can describe an extremely simple fragment of English in the following manner:
Sentence → Subject Predicate
Subject → Noun
Predicate → Verb Object
Object → Noun
Noun → John
Noun → Mary
Noun → children
Noun → parents
Verb → like
Verb → likes
Verb → help
Verb → helps
This context-free grammar describes simple sentences such as
John likes children
Mary helps John
children help parents
parents like John
With more nouns and verbs, and more rules to introduce other parts of speech, a large range of English sentences can be described; so this is a promising approach for describing the syntax of English.
However, the given grammar also describes sentences such as
John like children
children helps parents
These sentences are ungrammatical: in English, the subject and verb each have a grammatical number, and the two must agree.
An affix grammar can express this directly:
Sentence → Subject + number Predicate + number
Subject + number → Noun + number
Predicate + number → Verb + number Object
Object → Noun + number
Noun + singular → John
Noun + singular → Mary
Noun + plural → children
Noun + plural → parents
Verb + singular → likes
Verb + plural → like
Verb + singular → helps
Verb + plural → help
This grammar only describes correct English sentences, although it could be argued that
John likes John
is still incorrect and should instead read
John likes himself
This, too, can be incorpora |
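The agreement check expressed by the affix grammar above can be sketched in code. The following Python fragment (an illustrative toy recognizer, not a standard affix-grammar tool) encodes the number affix in a lexicon and accepts a three-word Noun-Verb-Noun sentence only when the subject noun and the verb carry the same affix value:

```python
# A minimal recognizer for the number-affix grammar above. The lexicon
# pairs each terminal with its category and its number affix.
LEXICON = {
    "John":     ("Noun", "singular"),
    "Mary":     ("Noun", "singular"),
    "children": ("Noun", "plural"),
    "parents":  ("Noun", "plural"),
    "likes":    ("Verb", "singular"),
    "like":     ("Verb", "plural"),
    "helps":    ("Verb", "singular"),
    "help":     ("Verb", "plural"),
}

def is_sentence(words):
    """Sentence -> Subject+number Predicate+number, i.e. Noun Verb Noun,
    where the subject noun and the verb must agree in number.
    The object noun's number is unconstrained, as in the grammar."""
    if len(words) != 3:
        return False
    try:
        subj_cat, subj_num = LEXICON[words[0]]
        verb_cat, verb_num = LEXICON[words[1]]
        obj_cat, _ = LEXICON[words[2]]
    except KeyError:
        return False  # unknown terminal
    return (subj_cat == "Noun" and verb_cat == "Verb" and obj_cat == "Noun"
            and subj_num == verb_num)  # the shared affix must agree

print(is_sentence("John likes children".split()))    # True
print(is_sentence("John like children".split()))     # False: number clash
print(is_sentence("children helps parents".split())) # False: number clash
```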
https://en.wikipedia.org/wiki/Index%20of%20physics%20articles%20%28B%29 | The index of physics articles is split into multiple pages due to its size.
To navigate by individual letter use the table of contents below.
B
B-factory
B-tagging
B-theory of time
B. V. Bowden, Baron Bowden
B2FH paper
BBGKY hierarchy
BCS: 50 Years (book)
BCS theory
BESS (experiment)
BESSY
BF model
BKL singularity
BL Lac object
BOOMERanG experiment
BPST instanton
BRST formalism
BRST quantization
BTZ black hole
BTeV experiment
BX442
B Reactor
B meson
B − L
BaBar experiment
Baby brane
Bach tensor
Back-reaction
Back pressure
Back scattering alignment
Background-oriented schlieren technique
Background count
Background field method
Background independence
Background noise
Background radiation
Backscatter
Backscatter X-ray
Backward-wave media
Backward wave oscillator
Bad Science: The Short Life and Weird Times of Cold Fusion
Bagger–Lambert–Gustavsson action
Bagnold formula
Bahram Mashhoon
Baien Tomlin
Baikal Deep Underwater Neutrino Telescope
Bainbridge mass spectrometer
Baksan Neutrino Observatory
Bak–Tang–Wiesenfeld sandpile
Baldwin–Lomax model
Balfour Stewart
Ball bearing motor
Ball lightning
Ballistic coefficient
Ballistic conduction
Ballistic galvanometer
Ballistic pendulum
Ballistic reentry
Ballistic transport
Ballotechnics
Balmer series
Balseiro Institute
Balthasar van der Pol
Balázs Győrffy
Banana equivalent dose
Band bending
Band diagram
Band gap
Band mapping
Band of stability
Band offset
Bandwidth-limited pulse
Banesh Hoffmann
Bangladesh Council of Scientific and Industrial Research
Banked turn
Banks–Zaks fixed point
Barber–Layden–Power effect
Bare mass
Bargeboard (aerodynamics)
Bargmann's limit
Bargmann–Wigner equations
Barkhausen effect
Barlow's law
Barlow lens
Barn (unit)
Barnett effect
Baroclinity
Barotropic
Barotropic vorticity equation
Barrel (disambiguation)
Barrett–Crane model
Barry M. McCoy
Barry Simon
Barton's Pendulums
Barton Zwiebach
Baryogenesis
Baryon
Baryon Oscillation Spectroscopic Survey
Baryon acoustic oscillations
Baryon asymmetry
Baryon nu |
https://en.wikipedia.org/wiki/Call%20signs%20in%20Russia | Call signs in Russia are unique identifiers for telecommunications and broadcasting. Call signs are regulated internationally by the ITU as well as nationally by Ministry of Communications and Mass Media of the Russian Federation. The latter is responsible for providing policy on the allocation of Russia's radio spectrum to support efficient, reliable and responsive wireless telecommunications and broadcasting infrastructure.
In 1991 Russia inherited the largest portion of the former Soviet Union's allocated call signs. The other post-USSR countries which inherited parts of the ITU UAA–UZZ call sign block are Uzbekistan, Kazakhstan, and Ukraine.
Call sign blocks for telecommunication
The International Telecommunication Union has assigned Russia the following call sign blocks for all radio communication, broadcasting or transmission:
While not directly related to call signs, the International Telecommunication Union (ITU) has further divided all countries assigned amateur radio prefixes into three regions; Russia is located in ITU Region 1.
Call sign assignments for amateur radio
Amateur radio or ham radio call signs are unique identifiers for the 24,000 licensed operators.
Russia uses the following 1-letter and 2-letter prefixes in amateur radio call signs for normal operation: R, RA, RK, RN, RU, RV, RW, RX, RZ, and UA. Any of these prefixes can be used in any of Russia's federal subjects. The other prefixes are reserved for special operation.
It uses the numerals 1, 2, 3, 4, 5, 6, 7, 8, 9, and 0 to separate prefixes from suffixes, and to indicate in which of the six regions the amateur was assigned the call sign.
Russia uses the first letter of the suffix to designate a specific federal subject in each respective region. This means that for most call signs the numeral and first letter of the suffix identifies which federal subject the operator was licensed in.
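The decomposition described above can be sketched in code. The following Python fragment splits a call sign into prefix, region numeral, and suffix; the prefix list is the one given above, while the regex shape, suffix length, and the `subject_key` helper are illustrative assumptions (the actual mapping of numeral-plus-letter combinations to federal subjects is not reproduced here):

```python
import re

# Normal-operation prefixes from the text; two-letter prefixes are listed
# before "R" so the regex alternation prefers the longer match.
PREFIXES = ["RA", "RK", "RN", "RU", "RV", "RW", "RX", "RZ", "UA", "R"]

CALLSIGN_RE = re.compile(
    r"^(?P<prefix>" + "|".join(PREFIXES) + r")"  # normal-operation prefix
    r"(?P<region>[0-9])"                         # separating numeral
    r"(?P<suffix>[A-Z]{1,3})$"                   # suffix letters
)

def parse_callsign(cs):
    """Split a call sign; subject_key (numeral + first suffix letter)
    is the pair that identifies the federal subject per the text."""
    m = CALLSIGN_RE.match(cs.upper())
    if not m:
        return None
    d = m.groupdict()
    d["subject_key"] = d["region"] + d["suffix"][0]
    return d

print(parse_callsign("UA1ABC"))
# {'prefix': 'UA', 'region': '1', 'suffix': 'ABC', 'subject_key': '1A'}
print(parse_callsign("RZ9XY")["subject_key"])  # '9X'
```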
Northwest Russia
Central Russia
Volga River
North Caucasus
Urals and West Siberia
East |
https://en.wikipedia.org/wiki/Nucleotide%20exchange%20factor | Nucleotide exchange factors (NEFs) are proteins that stimulate the exchange (replacement) of nucleoside diphosphates for nucleoside triphosphates bound to other proteins.
Function
Many cellular proteins cleave (hydrolyze) nucleoside triphosphates–adenosine triphosphate (ATP) or guanosine triphosphate (GTP)–to their diphosphate forms (ADP and GDP) as a source of energy and to drive conformational changes. These changes in turn affect the structural, enzymatic, or signalling properties of the protein.
Nucleotide exchange factors actively assist in the exchange of depleted nucleoside diphosphates for fresh nucleoside triphosphates. NEFs are specific for the nucleotides they exchange (ADP or GDP, but not both) and are often specific to a single protein or class of proteins with which they interact.
See also
Nucleoside-diphosphate kinase
Guanine nucleotide exchange factor |
https://en.wikipedia.org/wiki/Language%20disorder | Language disorders or language impairments are disorders that involve the processing of linguistic information. Problems that may be experienced can involve grammar (syntax and/or morphology), semantics (meaning), or other aspects of language. These problems may be receptive (involving impaired language comprehension), expressive (involving language production), or a combination of both. Examples include specific language impairment, better defined as developmental language disorder, or DLD, and aphasia, among others. Language disorders can affect both spoken and written language, and can also affect sign language; typically, all forms of language will be impaired.
Current data indicates that 7% of young children display language disorder, with boys being diagnosed twice as often as girls.
Preliminary research on potential risk factors has suggested that biological components, such as low birth weight, prematurity, general birth complications, and male gender, as well as family history and low parental education, can increase the chance of developing language disorders.
For children with phonological and expressive language difficulties, there is evidence supporting speech and language therapy. However, the same therapy is much less effective for receptive language difficulties. These results are consistent with the poorer prognosis for receptive language impairments, which are generally accompanied by problems in reading comprehension.
Note that these are distinct from speech disorders, which involve difficulty with the act of speech production, but not with language.
Language disorders tend to manifest in two different ways: receptive language disorders (where one cannot properly comprehend language) and expressive language disorders (where one cannot properly communicate their intended message).
Receptive language disorders
Receptive language disorders can be acquired (as in the case of receptive aphasia) or developmental (most often the latter) |
https://en.wikipedia.org/wiki/Eflornithine | Eflornithine, sold under the brand name Vaniqa among others, is a medication used to treat African trypanosomiasis (sleeping sickness) and excessive hair growth on the face in women. Specifically it is used for the 2nd stage of sleeping sickness caused by T. b. gambiense and may be used with nifurtimox. It is taken intravenously (injection into a vein) or topically. It has also been given orally on at least some rare occasions for the treatment of African trypanosomiasis.
Common side effects when applied as a cream include rash, redness, and burning. Side effects of the injectable form include bone marrow suppression, vomiting, and seizures. It is unclear if it is safe to use during pregnancy or breastfeeding. It is recommended typically for children over the age of 12.
Eflornithine was developed in the 1970s and came into medical use in 1990. It is on the World Health Organization's List of Essential Medicines. In the United States the injectable form can be obtained from the Centers for Disease Control and Prevention. In regions of the world where the disease is common eflornithine is provided for free by the World Health Organization.
Medical uses
Sleeping sickness
Sleeping sickness, or trypanosomiasis, is treated with pentamidine or suramin (depending on the subspecies of parasite) delivered by intramuscular injection in the first phase of the disease, and with melarsoprol and eflornithine by intravenous injection in the second phase of the disease. Eflornithine is commonly given in combination with nifurtimox, which reduces the treatment time to 7 days of eflornithine infusions plus 10 days of oral nifurtimox tablets.
Eflornithine is also effective in combination with other drugs, such as melarsoprol and nifurtimox. A study in 2005 compared the safety of eflornithine alone to melarsoprol and found eflornithine to be safer and more effective in treating second-stage sleeping sickness caused by Trypanosoma brucei gambiense. Eflornithine is not effective in the treatment of Try |
https://en.wikipedia.org/wiki/Rhinogradentia | Rhinogradentia is a fictitious order of extinct shrew-like mammals invented by German zoologist Gerolf Steiner. Members of the order, known as rhinogrades or snouters, are characterized by a nose-like feature called a "nasorium", which evolved to fulfill a wide variety of functions in different species. Steiner also created a fictional persona, naturalist Harald Stümpke, who is credited as author of the 1961 book Bau und Leben der Rhinogradentia (translated into English in 1967 as The Snouters: Form and Life of the Rhinogrades). According to Steiner, it is the only remaining record of the animals, which were wiped out, along with all the world's Rhinogradentia researchers, when the small Pacific archipelago they inhabited sank into the ocean due to nearby atomic bomb testing.
Successfully mimicking a genuine scientific work, Rhinogradentia has appeared in several publications without any note of its fictitious nature, sometimes in connection with April Fools' Day.
Background
Rhinogradentia, their island home of Hy-yi-yi, zoologist Harald Stümpke, and a host of other people, places, and documents are fictional creations of Gerolf Steiner (1908–2009), a German zoologist. Steiner is best known for his fictional work as Stümpke, but he was an accomplished zoologist in his own right. He held a professorship at the University of Heidelberg and later the Technical University of Karlsruhe, where he occupied the department chair from 1962 to 1973.
Steiner was also interested in illustration, and in 1945 drew a picture for one of his students as thanks for some food. He took inspiration from a short nonsense poem by Christian Morgenstern, The Nasobame (Das Nasobēm) about an animal that walked using its nose. He took to the drawing, made a copy for himself, and later incorporated the creatures into his teaching. According to Bud Webster, Steiner's motivation for writing a book about them was instructional, to illustrate "how animals evolve in isolation", but Joe Cain specul |
https://en.wikipedia.org/wiki/Max%20Planck%20Institute%20for%20Physics | The Max Planck Institute for Physics (MPP) is a physics institute in Munich, Germany that specializes in high energy physics and astroparticle physics. It is part of the Max-Planck-Gesellschaft and is also known as the Werner Heisenberg Institute, after its first director in its current location.
The founding of the institute traces back to 1914, as an idea from Fritz Haber, Walther Nernst, Max Planck, Emil Warburg, and Heinrich Rubens. On October 1, 1917, the institute was officially founded in Berlin as the Kaiser-Wilhelm-Institut für Physik (KWIP, Kaiser Wilhelm Institute for Physics), with Albert Einstein as the first head director. In October 1922, Max von Laue succeeded Einstein as managing director. Einstein gave up his position as a director of the institute in April 1933. The institute took part in the German nuclear weapon project from 1939 to 1942.
In June 1942, Werner Heisenberg took over as managing director. A year after the end of fighting in Europe in World War II, the institute was moved to Göttingen and renamed the Max Planck Institute for Physics, with Heisenberg continuing as managing director. In 1946, Carl Friedrich von Weizsäcker and Karl Wirtz joined the faculty as the directors for theoretical and experimental physics, respectively.
In 1955 the institute made the decision to move to Munich, and soon after began construction of its current building, designed by Sep Ruf. The institute moved into its current location on September 1, 1958, and took on the new name the Max Planck Institute for Physics and Astrophysics, still with Heisenberg as the managing director. In 1991, the institute was split into the Max Planck Institute for Physics, the Max Planck Institute for Astrophysics and the Max Planck Institute for Extraterrestrial Physics.
Structure
There are three departments with multiple research groups:
Structure of matter
Innovative calculation methods in particle physics (Giulia Zanderighi)
Quantum field theory and scattering amplitudes (Joha |
https://en.wikipedia.org/wiki/List%20of%20online%20database%20creator%20apps | This list of online database creator apps lists notable web apps where end users with minimal database administration expertise can create online databases to share with team members.
Users need not have the coding skills to manage the solution stack themselves, because the web app already provides this predefined functionality. Such online database creator apps serve the gap between IT professionals (who can manage such a stack themselves) and people who would not create databases at all anyway. In other words, they provide a low-code way of doing database administration. As the concept of low-code development in general continues to evolve, some of the brands that began as online database creator apps are evolving into low-code development platforms for both the databases and the custom apps that use them. |
https://en.wikipedia.org/wiki/Tragicus | The tragicus (muscle of tragus or Valsalva muscle) is an intrinsic muscle of the outer ear.
It is a short, flattened vertical band on the lateral surface of the tragus.
While the muscle modifies the auricular shape only minimally in the majority of individuals, this action could increase the opening of the external acoustic meatus in some.
Additional images
See also
Intrinsic muscles of external ear |
https://en.wikipedia.org/wiki/Nanotextured%20surface | A nanotextured surface (NTS) is a surface which is covered with nano-sized structures. Such surfaces have one dimension on the nanoscale, i.e., only the thickness of the surface of an object is between 0.1 and 100 nm. They are currently gaining popularity because of the special applications enabled by their unique physical properties. Nanotextured surfaces come in various forms, such as cones, columns, or fibers. They can repel water and oil (superamphiphobic), ice (anti-icing), and microorganisms (antifouling), and are thus self-cleaning. They are simultaneously anti-reflective and transparent, hence they are termed smart surfaces.
In research published online October 21, 2013, in Advanced Materials, a group of scientists at the U.S. Department of Energy's Brookhaven National Laboratory (BNL), led by BNL physicist and lead author Antonio Checco, proposed that nanotexturing surfaces in the form of cones produces highly water-repellent surfaces. These nano-cone textures are superhydrophobic, or super-water-hating.
See also |
https://en.wikipedia.org/wiki/Front%20Line%20%28video%20game%29 | Front Line is a military-themed run and gun video game released by Taito for arcades in November 1982. It was one of the first overhead run and gun games, a precursor to many similarly-themed games of the mid-to-late 1980s. Front Line is controlled with a joystick, a single button, and a rotary dial that can be pushed in like a button. The single button is used to throw grenades and to enter and exit tanks, while the rotary dial aims and fires the player's gun.
The game was created by Tetsuya Sasaki. It was a commercial success in Japan, where it was the seventh highest-grossing arcade game of 1982. However, it received a mixed critical and commercial reception in Western markets, with praise for its originality but criticism for its difficulty. The game's overhead run and gun formula preceded Capcom's Commando (1985) by several years. The SNK shooters TNK III (1985) and Ikari Warriors (1986) follow conventions established by Front Line, including the vertically scrolling levels, entering/exiting tanks, and not dying when an occupied tank is destroyed.
Gameplay
Playing as a lone soldier, the player's ultimate objective is to lob a hand grenade into the enemy's fort, first by fighting off infantry units and then battling tanks before finally reaching the opponent's compound.
The player begins with two weapons: a pistol and grenades, with no ammo limit. Once the player has advanced far enough into enemy territory, there is a "tank warfare" stage in which the player can hijack a tank to fight off other enemy tanks.
There are two types of tanks available: a light tank armed with a machine gun and a heavy tank armed with a cannon. The light tank is more nimble, but can be easily destroyed by the enemy. The heavy tank is slower, but can sustain one hit from a light tank; a second hit from a light tank will destroy it. A single shot from a heavy tank will destroy either type of tank. If a partially damaged tank is evacuated, the player can jump back in and resume its normal oper |
https://en.wikipedia.org/wiki/Highway%20Encounter | Highway Encounter is a video game published for the ZX Spectrum, Amstrad CPC, MSX, Commodore 64, Sharp MZ, and Tatung Einstein by Vortex Software in 1985. It was written by Costa Panayi who also coded Android, Android Two, TLL, Cyclone, and Revolution.
Summary
Highway Encounter is a strategy/action game played from a 3D isometric perspective in which you must successfully chaperone a bomb along a long, straight stretch of highway and into the alien base at the end of it. There are thirty screens to pass through and most are filled with hazards that threaten to block your progress (such as barrels) or destroy you (aliens and explosive mines).
Players control a robotic "Vorton" (resembling a Dalek from Doctor Who), and one of the features that gives Highway Encounter its unique appeal is that the bomb is constantly being pushed onwards by your extra lives: four more Vortons, who accompany you along the highway. A key strategic element of the game is for the player character to travel several screens ahead of the bomb to clear a safe path for it; normally this is done by temporarily blocking the bomb's forward motion. However, if the bomb is left in an unsafe location, it is possible for all your extra lives to be lost without the player character being destroyed once. Once all spare lives are lost, the player character must manually push the bomb.
Reception
The Spectrum version of the game was voted number 40 in the Your Sinclair Official Top 100 Games of All Time.
Legacy
An unfinished and officially unreleased version for the Atari ST, programmed in 1990 by Mark Haigh-Hutchinson with graphics by Costa Panayi, is available to download. Versions for the Amiga and Sega Mega Drive were also planned, but Haigh-Hutchinson stated that the Mega Drive version was left unpublished.
Highway Encounter was followed by a sequel, Alien Highway, in 1986. |
https://en.wikipedia.org/wiki/Nikos%20Kyrpides | Nikos Kyrpides (Greek: Νίκος Κυρπίδης) is a Greek-American bioscientist who has worked on the origins of life, information processing, bioinformatics, microbiology, metagenomics and microbiome data science. He is a senior staff scientist at the Berkeley National Laboratory, head of the Prokaryote Super Program and leads the Microbiome Data Science program at the US Department of Energy Joint Genome Institute.
Education
Kyrpides was born in Serres, Greece, where he studied biology at the Aristotle University of Thessaloniki and received his PhD in molecular biology and biotechnology from the University of Crete. He pursued postdoctoral studies in microbiology with Carl Woese at the University of Illinois at Urbana-Champaign and in bioinformatics with Ross Overbeek at the Argonne National Laboratory. From 1999 to 2004 Kyrpides worked in the biotech industry in Chicago, where he led the development of genome analysis and bioinformatics. He joined the United States Department of Energy Joint Genome Institute (JGI) in 2004 to lead the Genome Biology Program and develop the data management and comparative analysis platforms for microbial genomes and metagenomes. Kyrpides became the Metagenomics Program head in 2010 and founded the Prokaryotic Super Program in 2011, which he still leads with the Microbiome Data Science Group.
Research
Kyrpides's early work focused on the origins and evolution of the genetic code. In collaboration with Christos Ouzounis, he developed a series of hypotheses for the transfer of information from proteins to nucleic acids known as reverse interpretation. With the advent of genomics, Kyrpides turned his interest to the study and understanding of the last universal common ancestor. With Ouzounis he coined the acronym "LUCA" at a conference organized by Patrick Forterre at Les Treilles, France, and performed some of the first comparative genome analysis to predict the gene content of the LUCA. Kyrpides's work on the information processing syst |
https://en.wikipedia.org/wiki/Drama%20theory | Drama theory is one of the problem structuring methods in operations research. It is based on game theory and adapts the use of games to complex organisational situations, accounting for emotional responses that can provoke irrational reactions and lead the players to redefine the game. In a drama, emotions trigger rationalizations that create changes in the game, and so change follows change until either all conflicts are resolved or action becomes necessary. The game as redefined is then played.
Drama theory was devised by professor Nigel Howard in the early 1990s and, since then, has been applied to defense, political, health, industrial relations, and commercial problems. Drama theory is an extension of Howard's metagame analysis work, developed at the University of Pennsylvania in the late 1960s and presented formally in his book Paradoxes of Rationality, published by MIT Press. Metagame analysis was originally used to advise on the Strategic Arms Limitation Talks (SALT).
Basics of drama theory
A drama unfolds through episodes in which characters interact. The episode is a period of preplay communication between characters who, after communicating, act as players in a game that is constructed through the dialogue between them. The action that follows the episode is the playing out of this game; it sets up the next episode. Most drama-theoretic terminology is derived from a theatrical model applied to real life interactions; thus, an episode goes through phases of scene-setting, build-up, climax and decision. This is followed by denouement, which is the action that sets up the next episode. The term drama theory and the use of theatrical terminology is justified by the fact that the theory applies to stage plays and fictional plots as well as to politics, war, business, personal and community relations, psychology, history and other kinds of human interaction. It was applied to help with the structuring of The Prisoner's Dilemma, a West End play by David E |
https://en.wikipedia.org/wiki/Scapegoating | Scapegoating is the practice of singling out a person or group for unmerited blame and consequent negative treatment. Scapegoating may be conducted by individuals against individuals (e.g. "he did it, not me!"), individuals against groups (e.g., "I couldn't see anything because of all the tall people"), groups against individuals (e.g., "He was the reason our team didn't win"), and groups against groups.
A scapegoat may be an adult, child, sibling, employee, peer, ethnic, political or religious group, or country. A whipping boy, identified patient, or "fall guy" are forms of scapegoat.
Scapegoating has its origins in a ritual of atonement described in chapter 16 of the Biblical Book of Leviticus, in which a goat (or ass) is released into the wilderness bearing all the sins of the community, which have been placed on the goat's head by a priest.
At the individual level
A medical definition of scapegoating is:
Scapegoated groups throughout history have included almost every imaginable group of people: genders, religions, people of different races, nations, or sexual orientations, people with different political beliefs, or people differing in behaviour from the majority. However, scapegoating may also be applied to organizations, such as governments, corporations, or various political groups.
Its archetype
Jungian analyst Sylvia Brinton Perera situates its mythology of shadow and guilt. Individuals experience it at the archetypal level. As an ancient social process to rid a community of its past evil deeds and reconnect it to the sacred realm, the scapegoat appeared in a biblical rite, which involved two goats and the pre-Judaic, chthonic god Azazel. In the modern scapegoat complex, however, "the energy field has been radically broken apart" and the libido "split off from consciousness". Azazel's role is deformed into an accuser of the scapegoated victim.
Blame for breaking a perfectionist moral code, for instance, might be measured out by aggressive scapegoate |
https://en.wikipedia.org/wiki/English%20Short%20Title%20Catalogue | The English Short Title Catalogue (ESTC) is a union short-title catalogue of works published between 1473 and 1800, in Britain and its former colonies, notably those in North America, and primarily in English, drawing on the collections of the British Library and other libraries in Britain and around the world. It is co-managed by the British Library and the Center for Bibliographical Studies and Research (CBSR) at the University of California, Riverside. The database is freely searchable.
History
The ESTC began life as the Eighteenth-Century Short Title Catalogue, with the same abbreviation, covering only 1701 to 1800. Earlier printed works had been catalogued in A. W. Pollard and G. R. Redgrave's Short Title Catalogue (1st edn 1926; 2nd edn, 1976–91) for the period 1473 to 1640; and Donald Goddard Wing's similarly titled bibliography (1945–51, with later supplements and addenda) for the period 1641 to 1700. These works were eventually incorporated into the database.
See also
Books in the United Kingdom
Books in the United States
Incunabula Short Title Catalogue
Universal Short Title Catalogue |
https://en.wikipedia.org/wiki/Myth%20of%20the%20spat-on%20Vietnam%20veteran | There is a persistent myth or misconception that many Vietnam veterans were spat on and vilified by antiwar protesters during the late 1960s and early 70s. These stories, which overwhelmingly surfaced many years after the war, usually involve an antiwar female spitting on a veteran, often yelling "baby killer". Most occur in U.S. civilian airports, usually San Francisco International, as GIs returned from the war zone in their uniforms. Perhaps the most well known version of this caricature appeared in 1982 in the first Rambo movie, First Blood, where ex-Green Beret John Rambo, played by Sylvester Stallone, furiously complained about returning "to the world" and seeing "all those maggots at the airport. Protesting me. Spitting. Calling me baby killer". It also appeared in an issue of Marvel Comics GI Joe series in January 1988. But most often now it is used as a talking point by politicians and opinion makers when arguing for the importance of supporting U.S. troops, especially during war time. For example, during the Iraq War, Daniel Henninger, deputy editor of The Wall Street Journal editorial page wrote about the "horrifying" images of GIs being spat on as they returned from Vietnam.
No unambiguous documented incident of this behavior has ever surfaced, despite repeated and concerted efforts to uncover one. The few dubious examples brought forward have been the object of much debate and controversy. Only 1 percent of Vietnam veterans, according to a Veterans Administration-commissioned Harris Poll conducted in 1971, described their reception from friends and family as "not at all friendly", and only 3 percent described their reception from people their own age as "unfriendly". Moreover, there is ample and well-documented evidence of a mutually supportive, empathetic relationship between GIs, veterans, and antiwar forces during the Vietnam War. Martin Luther King Jr. spoke to this in April 1967 when he chastised "those who are seeking to make it appear |
https://en.wikipedia.org/wiki/Intermetallic%20particle | Intermetallic particles form during solidification of metallic alloys.
Aluminium alloys
Al-Si-Cu-Mg alloys
For example, Al-Si-Cu-Mg alloys form the plate-like Al5FeSi intermetallic phase, the Chinese-script-like Al8Fe2Si phase, Al2Cu, and others. The size and morphology of these intermetallic phases control the mechanical properties of these alloys, especially their strength and ductility. The size of these phases depends on the secondary dendrite arm spacing of the primary phase in the microstructure, as well as on the Si content of the alloy.
Phases and crystal structures
Magnesium alloys
WE 43
An in-situ synchrotron diffraction experiment on Electron alloy WE 43 (Mg4Y3Nd) shows that this alloy forms the following intermetallic phases: Mg12Nd, Mg14Y4Nd, and Mg24Y5.
Phases and crystal structures
AZ 91 |
https://en.wikipedia.org/wiki/Stirling%20transform | In combinatorial mathematics, the Stirling transform of a sequence { a_n : n = 1, 2, 3, ... } of numbers is the sequence { b_n : n = 1, 2, 3, ... } given by
$b_n = \sum_{k=1}^{n} \left\{ {n \atop k} \right\} a_k,$
where $\left\{ {n \atop k} \right\}$ is the Stirling number of the second kind, also denoted $S(n,k)$ (with a capital S), which is the number of partitions of a set of size n into k parts.
The inverse transform is
$a_n = \sum_{k=1}^{n} s(n,k)\, b_k,$
where $s(n,k)$ (with a lower-case s) is a Stirling number of the first kind.
Bernstein and Sloane (cited below) state "If an is the number of objects in some class with points labeled 1, 2, ..., n (with all labels distinct, i.e. ordinary labeled structures), then bn is the number of objects with points labeled 1, 2, ..., n (with repetitions allowed)."
If
f(x) = ∑n an x^n/n!
is a formal power series, and
g(x) = ∑n bn x^n/n!
with an and bn as above, then
g(x) = f(e^x − 1).
Likewise, the inverse transform leads to the generating function identity
f(x) = g(log(1 + x)).
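The transform pair above can be checked numerically. The sketch below, assuming Python, computes both kinds of Stirling numbers from their standard recurrences; as a sanity check, applying the transform to the all-ones sequence yields the Bell numbers 1, 2, 5, 15, ...

```python
def stirling2(n, k):
    # Stirling numbers of the second kind: S(n,k) = k*S(n-1,k) + S(n-1,k-1)
    if n == 0 and k == 0:
        return 1
    if n == 0 or k == 0:
        return 0
    return k * stirling2(n - 1, k) + stirling2(n - 1, k - 1)

def stirling1(n, k):
    # Signed Stirling numbers of the first kind: s(n,k) = s(n-1,k-1) - (n-1)*s(n-1,k)
    if n == 0 and k == 0:
        return 1
    if n == 0 or k == 0:
        return 0
    return stirling1(n - 1, k - 1) - (n - 1) * stirling1(n - 1, k)

def stirling_transform(a):
    # b_n = sum_k S(n,k) a_k, with a[0] playing the role of a_1
    return [sum(stirling2(n, k) * a[k - 1] for k in range(1, n + 1))
            for n in range(1, len(a) + 1)]

def inverse_stirling_transform(b):
    # a_n = sum_k s(n,k) b_k
    return [sum(stirling1(n, k) * b[k - 1] for k in range(1, n + 1))
            for n in range(1, len(b) + 1)]

print(stirling_transform([1, 1, 1, 1]))  # [1, 2, 5, 15], the Bell numbers
```

Because the matrices S(n,k) and s(n,k) are mutual inverses, applying the two transforms in succession returns the original sequence.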
See also
Binomial transform
Generating function transformation
List of factorial and binomial topics |
https://en.wikipedia.org/wiki/Hydrophobicity%20scales | Hydrophobicity scales are values that define the relative hydrophobicity or hydrophilicity of amino acid residues. The more positive the value, the more hydrophobic are the amino acids located in that region of the protein. These scales are commonly used to predict the transmembrane alpha-helices of membrane proteins. When the amino acids of a protein are scanned consecutively, changes in value indicate the attraction of specific protein regions towards the hydrophobic region inside the lipid bilayer.
The hydrophobic or hydrophilic character of a compound or amino acid is its hydropathic character, hydropathicity, or hydropathy.
Hydrophobicity and the hydrophobic effect
The hydrophobic effect represents the tendency of water to exclude non-polar molecules. The effect originates from the disruption of highly dynamic hydrogen bonds between molecules of liquid water. Polar chemical groups, such as the OH group in methanol, do not cause the hydrophobic effect. However, a pure hydrocarbon molecule, for example hexane, cannot accept or donate hydrogen bonds to water. Introduction of hexane into water causes disruption of the hydrogen bonding network between water molecules. The hydrogen bonds are partially reconstructed by building a water "cage" around the hexane molecule, similar to that in clathrate hydrates formed at lower temperatures. The mobility of water molecules in the "cage" (or solvation shell) is strongly restricted. This leads to significant losses in translational and rotational entropy of water molecules and makes the process unfavorable in terms of free energy of the system. In terms of thermodynamics, the hydrophobic effect is the free energy change of water surrounding a solute. A positive free energy change of the surrounding solvent indicates hydrophobicity, whereas a negative free energy change implies hydrophilicity. In this way, the hydrophobic effect not only can be localized but also decomposed into enthalpic and entropic contributions.
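The consecutive scanning mentioned above can be sketched as a sliding-window average. The per-residue values below are a small subset of the Kyte-Doolittle hydropathy scale, one common choice; both the subset and the example sequence are illustrative:

```python
# Subset of the Kyte-Doolittle hydropathy scale (positive = hydrophobic)
KD = {'I': 4.5, 'V': 4.2, 'L': 3.8, 'F': 2.8, 'A': 1.8,
      'G': -0.4, 'S': -0.8, 'K': -3.9, 'R': -4.5, 'D': -3.5}

def hydropathy_profile(seq, window=5):
    # Average hydropathy over a window centred on each residue; sustained
    # positive stretches hint at membrane-spanning (hydrophobic) segments.
    half = window // 2
    return [sum(KD[aa] for aa in seq[i - half:i + half + 1]) / window
            for i in range(half, len(seq) - half)]

profile = hydropathy_profile("KDRGAILVVLIAGSKDR")
# the charged ends score negative; the central hydrophobic run scores positive
```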
Types of amino acid hyd |
https://en.wikipedia.org/wiki/Public%20Services%20Network | The Public Services Network (PSN) is the UK government's high-performance network, which helps public sector organisations work together, reduce duplication and share resources. It unified the provision of network infrastructure across the United Kingdom public sector into an interconnected "network of networks" to increase efficiency and reduce overall public expenditure. It is now a legacy network and public sector organisations are being migrated to using services on the public internet.
Origins
The Public Services Network (PSN) was launched officially as part of the Transformational Government Strategy commencing in 2005, under the original name of the Public Sector Network.
Prior to this, some parts of local government had already successfully implemented the concept. The Hampshire Public Services Network (HPSN) was the first PSN, launched in 1999, followed closely by Kent County Council's partnership, the KPSN. The HPSN encompassed all of the borough, district and unitary councils, together with the County Council, the Fire Services, the Isle of Wight Council and 540 schools. National PSN technical and architecture compliance criteria were established from 2007 by GDS, working with local government leaders from Socitm (the Society of Information Technology Management) on the National CIO Council and the Local CIO Council.
The PSN's aim was to bring public services organisations with a common interest onto a single, coherent and standards-based ‘network of networks’. This would create influence, economies of scale and a commonality of standards for secure and easy inter-connection between public service organisations.
The original concept of a network of networks strategy was based upon the work already undertaken in local government and recognition of Communities of Interest (COI) within the Criminal Justice Sector during work by the Office for Criminal Justice Reform (OCJR) between 2005 and 2007 to enable data sharing across business units.
In thi |
https://en.wikipedia.org/wiki/Superwind | A superwind is an extremely dense wind emanating from asymptotic giant branch stars towards the end of their lives.
See also
Cosmic wind
Solar wind
Stellar wind
Planetary wind
Stellar-wind bubble
Colliding-wind binary
Pulsar wind nebula
Galactic superwind |
https://en.wikipedia.org/wiki/ASD%20%28database%29 | Allostery is the most direct and efficient way to regulate the function of a biological macromolecule, induced by the binding of a ligand at an allosteric site topographically distinct from the orthosteric site. Due to its inherently high receptor selectivity and lower target-based toxicity, allostery is also expected to play a growing role in drug discovery and bioengineering, leading to rapid growth in allosteric findings.
Allosteric Database (ASD) provides a central resource for the display, search and analysis of the structure, function and related annotation for allosteric molecules. Currently, ASD contains allosteric proteins from more than 100 species and modulators in three categories (activators, inhibitors, and regulators). Each protein is annotated with a detailed description of allostery, biological process and related diseases, and each modulator with binding affinity, physicochemical properties and therapeutic area. Integrating the information of allosteric proteins in ASD should allow for the prediction of allostery for unknown proteins and eventually make them ideal targets for experimental validation. In addition, modulators curated in ASD can be used to investigate potent allosteric targets for the query compound, and also help chemists implement structure modifications for novel allosteric drug designs. Therefore, ASD could be a platform and a starting point for biologists and medicinal chemists for furthering allosteric research. |
https://en.wikipedia.org/wiki/Acarospora%20americana | Acarospora americana is a dark brown to black verruculose to areolate or squamulose crustose lichen with deeply immersed reddish to blackish-brown apothecia found in the Sierra Nevada and other southern California mountain ranges. Lichen spot tests are all negative. |
https://en.wikipedia.org/wiki/Altered%20level%20of%20consciousness | An altered level of consciousness is any measure of arousal other than normal. Level of consciousness (LOC) is a measurement of a person's arousability and responsiveness to stimuli from the environment. A mildly depressed level of consciousness or alertness may be classed as lethargy; someone in this state can be aroused with little difficulty. People who are obtunded have a more depressed level of consciousness and cannot be fully aroused. Those who are not able to be aroused from a sleep-like state are said to be stuporous. Coma is the inability to make any purposeful response. Scales such as the Glasgow coma scale have been designed to measure the level of consciousness.
An altered level of consciousness can result from a variety of factors, including alterations in the chemical environment of the brain (e.g. exposure to poisons or intoxicants), insufficient oxygen or blood flow in the brain, and excessive pressure within the skull. Prolonged unconsciousness is understood to be a sign of a medical emergency. A deficit in the level of consciousness suggests that either both cerebral hemispheres or the reticular activating system has been injured. A decreased level of consciousness correlates to increased morbidity (sickness) and mortality (death). Thus it is a valuable measure of a patient's medical and neurological status. In fact, some sources consider level of consciousness to be one of the vital signs.
Definition
Scales and terms to classify the levels of consciousness differ, but in general, reduction in response to stimuli indicates an altered level of consciousness:
Altered level of consciousness is sometimes described as altered sensorium.
Glasgow Coma Scale
The most commonly used tool for measuring LOC objectively is the Glasgow Coma Scale (GCS). It has come into almost universal use for assessing people with brain injury, or an altered level of consciousness. Verbal, motor, and eye-opening responses to stimuli are measured, scored, and a |
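As a sketch of how the three response categories combine, the component ranges below follow the standard scale (eye opening 1 to 4, verbal 1 to 5, motor 1 to 6, for a total of 3 to 15); the function name is our own:

```python
def glasgow_coma_score(eye, verbal, motor):
    # Total GCS is simply the sum of the three component scores (range 3-15)
    if not (1 <= eye <= 4 and 1 <= verbal <= 5 and 1 <= motor <= 6):
        raise ValueError("GCS component out of range")
    return eye + verbal + motor

print(glasgow_coma_score(4, 5, 6))  # 15: fully responsive
print(glasgow_coma_score(1, 1, 1))  # 3: no response in any category
```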
https://en.wikipedia.org/wiki/Developmental%20systems%20theory | Developmental systems theory (DST) is an overarching theoretical perspective on biological development, heredity, and evolution. It emphasizes the shared contributions of genes, environment, and epigenetic factors on developmental processes. DST, unlike conventional scientific theories, is not directly used to help make predictions for testing experimental results; instead, it is seen as a collection of philosophical, psychological, and scientific models of development and evolution. As a whole, these models argue the inadequacy of the modern evolutionary synthesis on the roles of genes and natural selection as the principal explanation of living structures. Developmental systems theory embraces a large range of positions that expand biological explanations of organismal development and hold modern evolutionary theory as a misconception of the nature of living processes.
Overview
All versions of developmental systems theory espouse the view that:
All biological processes (including both evolution and development) operate by continually assembling new structures.
Each such structure transcends the structures from which it arose and has its own systematic characteristics, information, functions and laws.
Conversely, each such structure is ultimately irreducible to any lower (or higher) level of structure, and can be described and explained only on its own terms.
Furthermore, the major processes through which life as a whole operates, including evolution, heredity and the development of particular organisms, can only be accounted for by incorporating many more layers of structure and process than the conventional concepts of ‘gene’ and ‘environment’ normally allow for.
In other words, although it does not claim that all structures are equal, developmental systems theory is fundamentally opposed to reductionism of all kinds. In short, developmental systems theory intends to formulate a perspective which does not presume the causal (or ontological) priority of any p |
https://en.wikipedia.org/wiki/Structure%20and%20Interpretation%20of%20Computer%20Programs | Structure and Interpretation of Computer Programs (SICP) is a computer science textbook by Massachusetts Institute of Technology professors Harold Abelson and Gerald Jay Sussman with Julie Sussman. It is known as the "Wizard Book" in hacker culture. It teaches fundamental principles of computer programming, including recursion, abstraction, modularity, and programming language design and implementation.
MIT Press published the first edition in 1984, and the second edition in 1996. It was formerly used as the textbook for MIT's introductory course in computer science. SICP focuses on discovering general patterns for solving specific problems, and building software systems that make use of those patterns.
MIT Press published the JavaScript edition in 2022.
Content
The book describes computer science concepts using Scheme, a dialect of Lisp. It also uses a virtual register machine and assembler to implement Lisp interpreters and compilers.
Topics in the books are:
Chapter 1: Building Abstractions with Procedures
The Elements of Programming
Procedures and the Processes They Generate
Formulating Abstractions with Higher-Order Procedures
Chapter 2: Building Abstractions with Data
Introduction to Data Abstraction
Hierarchical Data and the Closure Property
Symbolic Data
Multiple Representations for Abstract Data
Systems with Generic Operations
Chapter 3: Modularity, Objects, and State
Assignment and Local State
The Environment Model of Evaluation
Modeling with Mutable Data
Concurrency: Time Is of the Essence
Streams
Chapter 4: Metalinguistic Abstraction
The Metacircular Evaluator
Variations on a Scheme – Lazy Evaluation
Variations on a Scheme – Nondeterministic Computing
Logic Programming
Chapter 5: Computing with Register Machines
Designing Register Machines
A Register-Machine Simulator
Storage Allocation and Garbage Collection
The Explicit-Control Evaluator
Compilation
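To give a flavour of Chapter 1's "Formulating Abstractions with Higher-Order Procedures", here is a rough Python rendering (the book itself uses Scheme) of the generic summation procedure developed in that chapter:

```python
def sum_over(term, a, nxt, b):
    # Accumulate term(k) for k = a, nxt(a), ... while k <= b.
    # Passing procedures as arguments abstracts the pattern shared by
    # summing integers, summing cubes, series approximations, and so on.
    total = 0
    while a <= b:
        total += term(a)
        a = nxt(a)
    return total

def inc(x):
    return x + 1

print(sum_over(lambda x: x ** 3, 1, inc, 10))  # 3025, the sum of the first ten cubes
```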
Characters
Several fictional characters appear in the |
https://en.wikipedia.org/wiki/Garlic%20ice%20cream | Garlic ice cream is a flavour of ice cream consisting mainly of vanilla, or honey, and cream as a base, to which garlic is added. It has been featured at many garlic festivals, including the Gilroy Garlic Festival in Gilroy, California.
Preparation and description
According to a recipe by the San Francisco-based restaurant, The Stinking Rose, which is well known for including garlic in all of its dishes, garlic ice cream is basically vanilla ice cream with some garlic. The Scandinavian Garlic & Shots, which is based in Södermalm, serves garlic ice cream which is essentially a combination of honey flavoured ice cream and garlic. Garlic ice cream is savoury in taste.
Reception
Tasters of garlic ice cream at the 2012 North Quabbin Garlic and Arts Festival had positive feedback for the ice cream flavour, with one of them commenting that it "is really creamy, subtle flavors, but you can taste the garlic".
Notable use
Garlic ice cream has been a featured dish at several garlic conventions. Examples include the 1986 Ithaca Garlic Festival in Ithaca, New York and the Gilroy Garlic Festival, which has included garlic ice cream as one of its featured garlic dishes a handful of times, including in 2000 and 2005. It has also been showcased at the 2011 Toronto Garlic Festival in Toronto, Canada and the 2012 North Quabbin Garlic and Arts Festival at Forster's Farm in Orange, Massachusetts.
Garlic ice cream is a food item on The Stinking Rose's food menu. The ice cream flavour is treated as a "sauce" to accompany food items like steak, although it can also be consumed as a dessert item.
See also
List of garlic dishes |
https://en.wikipedia.org/wiki/Mathematica%3A%20A%20World%20of%20Numbers...%20and%20Beyond | Mathematica: A World of Numbers... and Beyond is a kinetic and static exhibition of mathematical concepts designed by Charles and Ray Eames, originally debuted at the California Museum of Science and Industry in 1961. Duplicates have since been made, and they (as well as the original) have been moved to other institutions.
History
In March 1961, a new science wing at the California Museum of Science and Industry in Los Angeles opened. The IBM Corporation had been asked by the Museum to make a contribution; IBM in turn asked the famous California designer team of Charles Eames and his wife Ray Eames to come up with a good proposal. The result was that the Eames Office was commissioned by IBM to design an interactive exhibition called Mathematica: A World of Numbers... and Beyond. This was the first of many exhibitions designed by the Eames Office.
The exhibition stayed at the Museum until January 1998, making it the longest running of any corporate sponsored museum exhibition. Furthermore, it is the only one of the dozens of exhibitions designed by the Office of Charles and Ray Eames that is still extant. This original Mathematica exhibition was reassembled for display at the Alyce de Roulet Williamson Gallery at Art Center College of Design in Pasadena, California, July 30 through October 1, 2000. It is now owned by and on display at the New York Hall of Science, though it currently lacks the overhead plaques with quotations from mathematicians that were part of the original installation.
Duplicates
In November 1961, an exact duplicate was made for Chicago's Museum of Science and Industry, where it was shown until late 1980. From there it was sold and relocated to the Museum of Science in Boston, Massachusetts, where it is permanently on display. The Boston installation bears the closest resemblance to the original Eames design, including numerous overhead plaques featuring historic quotations from famous mathematicians. As part of a refurbishment, a graphic p |
https://en.wikipedia.org/wiki/Immunoisolate | In general, immunoisolation is the process of protecting implanted material such as biopolymers, cells, or drug release carriers from an immune reaction. The most prominent means of accomplishing this is through the use of cell encapsulation. |
https://en.wikipedia.org/wiki/Solenoid%20%28mathematics%29 | This page discusses a class of topological groups. For the wrapped loop of wire, see Solenoid.
In mathematics, a solenoid is a compact connected topological space (i.e. a continuum) that may be obtained as the inverse limit of an inverse system of topological groups and continuous homomorphisms
where each Si is a circle and fi is the map that uniformly wraps the circle Si+1 ni times (ni ≥ 2) around the circle Si. This construction can be carried out geometrically in the three-dimensional Euclidean space R3. A solenoid is a one-dimensional homogeneous indecomposable continuum that has the structure of a compact topological group.
Solenoids were first introduced by Vietoris for the ni = 2 case, and by van Dantzig for the case ni = n, where n ≥ 2 is fixed. Such a solenoid arises as a one-dimensional expanding attractor, or Smale–Williams attractor, and forms an important example in the theory of hyperbolic dynamical systems.
Construction
Geometric construction and the Smale–Williams attractor
Each solenoid may be constructed as the intersection of a nested system of embedded solid tori in R3.
Fix a sequence of natural numbers {ni}, ni ≥ 2. Let T0 = S1 × D be a solid torus. For each i ≥ 0, choose a solid torus Ti+1 that is wrapped longitudinally ni times inside the solid torus Ti. Then their intersection
is homeomorphic to the solenoid constructed as the inverse limit of the system of circles with the maps determined by the sequence {ni}.
Here is a variant of this construction isolated by Stephen Smale as an example of an expanding attractor in the theory of smooth dynamical systems. Denote the angular coordinate on the circle S1 by t (it is defined mod 2π) and consider the complex coordinate z on the two-dimensional unit disk D. Let f be the map of the solid torus T = S1 × D into itself given by the explicit formula
f(t, z) = (2t, (1/4)z + (1/2)e^(it)).
This map is a smooth embedding of T into itself that preserves the foliation by meridional disks (the constants 1/2 and 1/4 are somewhat arbitrary, but it is essential tha |
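A quick numerical illustration of such a map, under the assumption that it takes the commonly quoted form f(t, z) = (2t, z/4 + e^(it)/2): iterating it keeps points inside the solid torus and contracts the disk coordinate toward the attractor.

```python
import cmath
import math

def smale_map(t, z):
    # Angle doubles around S1; the disk coordinate is contracted by 1/4
    # and offset by (1/2)e^{it}, so images of meridional disks stay disjoint.
    return (2 * t) % (2 * math.pi), z / 4 + cmath.exp(1j * t) / 2

t, z = 1.0, 0.9 + 0.3j  # a point of T = S1 x D (|z| <= 1)
for _ in range(25):
    t, z = smale_map(t, z)
# |z| is now trapped below 3/4, since |z|/4 + 1/2 <= 3/4 whenever |z| <= 1
```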
https://en.wikipedia.org/wiki/ISO%2031-7 | ISO 31-7 is the part of international standard ISO 31 that defines names and symbols for quantities and units related to acoustics. It is superseded by ISO 80000-8.
Its definitions include:
Acoustics |
https://en.wikipedia.org/wiki/Root%20mean%20square | In mathematics and its applications, the root mean square of a set of numbers (abbreviated as RMS, or rms and denoted in formulas as either or ) is defined as the square root of the mean square (the arithmetic mean of the squares) of the set.
The RMS is also known as the quadratic mean (denoted ) and is a particular case of the generalized mean. The RMS of a continuously varying function (denoted ) can be defined in terms of an integral of the squares of the instantaneous values during a cycle.
For alternating electric current, RMS is equal to the value of the constant direct current that would produce the same power dissipation in a resistive load.
In estimation theory, the root-mean-square deviation of an estimator is a measure of the imperfection of the fit of the estimator to the data.
Definition
The RMS value of a set of values (or a continuous-time waveform) is the square root of the arithmetic mean of the squares of the values, or the square of the function that defines the continuous waveform. In physics, the RMS current value can also be defined as the "value of the direct current that dissipates the same power in a resistor."
In the case of a set of n values {x1, x2, ..., xn}, the RMS is
xRMS = sqrt( (x1^2 + x2^2 + ... + xn^2) / n ).
The corresponding formula for a continuous function (or waveform) f(t) defined over the interval T1 ≤ t ≤ T2 is
fRMS = sqrt( 1/(T2 − T1) · ∫[T1..T2] f(t)^2 dt ),
and the RMS for a function over all time is
fRMS = lim T→∞ sqrt( 1/(2T) · ∫[−T..T] f(t)^2 dt ).
The RMS over all time of a periodic function is equal to the RMS of one period of the function. The RMS value of a continuous function or signal can be approximated by taking the RMS of a sample consisting of equally spaced observations. Additionally, the RMS value of various waveforms can also be determined without calculus, as shown by Cartwright.
In the case of the RMS statistic of a random process, the expected value is used instead of the mean.
In common waveforms
If the waveform is a pure sine wave, the relationships between amplitudes (peak-to-peak, peak) and RMS are fixed and known, as they are for any continuous periodic wave. |
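A short numerical check, assuming Python: sampling one full period of a unit-amplitude sine wave recovers the familiar ratio RMS = peak / sqrt(2).

```python
import math

def rms(values):
    # Square root of the arithmetic mean of the squares
    return math.sqrt(sum(v * v for v in values) / len(values))

N = 1000
samples = [math.sin(2 * math.pi * i / N) for i in range(N)]
rms_sine = rms(samples)
print(rms_sine)  # ~0.7071, i.e. 1/sqrt(2)
```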
https://en.wikipedia.org/wiki/Ars%20Magna%20Lucis%20et%20Umbrae | Ars Magna Lucis et Umbrae ("The Great Art of Light and Shadow") is a 1646 work by the Jesuit scholar Athanasius Kircher. It was dedicated to Ferdinand IV, King of the Romans and published in Rome by Lodovico Grignani. A second edition was published in Amsterdam in 1671 by Johann Jansson. Ars Magna was the first description published in Europe of the illumination and projection of images. The book contains the first printed illustration of Saturn and the 1671 edition also contained a description of the magic lantern.
Ars magna lucis et umbrae followed soon after Kircher's work on magnetism, Magnes sive de Arte Magnetica (1641) and the title was a play on words. In his introduction Kircher notes that the word 'magna' alluded to the powers of the magnet, so that the title could also be read “The Magnetic Art of Light and Shadow”. The work was well known for several decades.
Content
Ars Magna is the first of Kircher's works to follow a symbolic structure. It consists of ten books, represented as the ten strings of the instrument with which the psalmist praises the Lord in Psalm 143. The ten books also have a kabbalistic significance, betokening the ten sefirot.
Kircher dealt comprehensively with many different aspects of light, including physical, astronomical, astrological and metaphysical. He discussed phenomena such as fluorescence, phosphorescence and luminescence, optics and perspective. He also described pareidolia. The work deals first with the Sun, Moon, stars, comets, eclipses and planets. It also discusses phenomena related to light, such as optical illusions, colour, refraction, projection and distortion. The work includes one of the first scientific studies of phosphorescence and the luminosity of fireflies. He devoted much care to descriptions of instruments such as sundials, moondials and mirrors that make use of light. He had written extensively on these subjects in an earlier work, the Primitiae gnomoniciae catroptricae. Kircher also discussed the "magic |
https://en.wikipedia.org/wiki/Coherent%20turbulent%20structure | Turbulent flows are complex multi-scale and chaotic motions that need to be classified into more elementary components, referred to as coherent turbulent structures. Such a structure must have temporal coherence, i.e. it must persist in its form for long enough periods that the methods of time-averaged statistics can be applied. Coherent structures are typically studied on very large scales, but can be broken down into more elementary structures with coherent properties of their own; examples include hairpin vortices. Hairpins and coherent structures have been studied and noticed in data since the 1930s, and have since been cited in thousands of scientific papers and reviews.
Flow visualization experiments, using smoke and dye as tracers, have been historically used to simulate coherent structures and verify theories, but computer models are now the dominant tools widely used in the field to verify and understand the formation, evolution, and other properties of such structures. The kinematic properties of these motions include size, scale, shape, vorticity, energy, and the dynamic properties govern the way coherent structures grow, evolve, and decay. Most coherent structures are studied only within the confined forms of simple wall turbulence, which approximates the coherence to be steady, fully developed, incompressible, and with a zero pressure gradient in the boundary layer. Although such approximations depart from reality, they contain sufficient parameters needed to understand turbulent coherent structures in a highly conceptual degree.
History and Discovery
The presence of organized motions and structures in turbulent shear flows was apparent for a long time, and had been additionally implied by the mixing length hypothesis even before the concept was explicitly stated in the literature. There were also early correlation data found by measuring jets and turbulent wakes, particularly by Corrsin and Roshko. Hama's hydrogen bubble technique, which used flow visu |
https://en.wikipedia.org/wiki/Runaway%20electrons | The term runaway electrons (RE) is used to denote electrons that undergo free fall acceleration into the realm of relativistic particles. REs may be classified as thermal (lower energy) or relativistic. The study of runaway electrons is thought to be fundamental to our understanding of High-Energy Atmospheric Physics. They are also seen in tokamak fusion devices, where they can damage the reactors.
Lightning
Runaway electrons are the core element of the runaway breakdown based theory of lightning propagation. Since C.T.R. Wilson's work in 1925, research has been conducted to study the possibility of runaway electrons, cosmic ray based or otherwise, initiating the processes required to generate lightning.
Extraterrestrial Occurrence
Electron-runaway-based lightning may be occurring on the four Jovian planets in addition to Earth. Simulation studies predict that runaway breakdown processes are likely to occur on these gaseous planets far more easily than on Earth, as the threshold for runaway breakdown to begin is far smaller.
High Energy Plasma
The runaway electron phenomenon has been observed in high energy plasmas. They can pose a threat to machines and experiments in which these plasmas exist, including ITER. Several studies exist examining the properties of runaway electrons in these environments (tokamak), searching to better suppress the detrimental effects of these unwanted runaway electrons. Recent measurements reveal higher-than-expected impurity ion diffusion in runaway electron plateaus, possibly due to turbulence. The choice between low and high atomic number (Z) gas injections for disruption mitigation techniques requires a better understanding of the impurity ion transport, as these ions may not completely mix at impact, affecting the prevention of runaway electron wall damage in large tokamak concepts, like ITER.
Computer and Numerical Simulations
This highly complex phenomenon has proved difficult to model with traditional systems, but has been modelled in p |
https://en.wikipedia.org/wiki/Subcutaneous%20tissue%20of%20perineum | The subcutaneous tissue of perineum (or superficial perineal fascia) is a layer of subcutaneous tissue surrounding the region of the perineal body.
The superficial fascia of this region consists of two layers, superficial and deep.
The superficial layer is thick, loose, areolar in texture, and contains in its meshes much adipose tissue, the amount of which varies in different subjects. In front, it is continuous with the dartos tunic of the scrotum; behind, with the subcutaneous areolar tissue surrounding the anus; and, on either side, with the same fascia on the inner sides of the thighs. In the middle line, it is adherent to the skin on the raphe and to the deep layer of the superficial fascia.
The deep layer of superficial fascia (fascia of Colles) is thin, aponeurotic in structure, and of considerable strength, serving to bind down the muscles of the root of the penis. |
https://en.wikipedia.org/wiki/Fan%20triangulation | In computational geometry, a fan triangulation is a simple way to triangulate a polygon by choosing a vertex and drawing edges to all of the other vertices of the polygon. Not every polygon can be triangulated this way, so this method is usually only used for convex polygons.
Properties
Aside from the properties of all triangulations, fan triangulations have the following properties:
All convex polygons, but not all polygons, can be fan triangulated.
Polygons with only one concave vertex can always be fan triangulated, as long as the diagonals are drawn from the concave vertex.
It can be known if a polygon can be fan triangulated by solving the Art gallery problem, in order to determine whether there is at least one vertex that is visible from every point in the polygon.
The fan triangulation of a polygon with n vertices uses n − 3 diagonals, and generates n − 2 triangles.
Generating the list of triangles is trivial if an ordered list of vertices is available, and can be computed in linear time. As such, it is unnecessary to explicitly store the list of triangles, and therefore, many graphical libraries implement primitives to represent polygons based on this triangulation.
Although this triangulation is fit for solving certain problems, such as Rasterisation, or collision detection, it may be unfit for other tasks because the origin vertex accumulates a high number of neighbors, and the internal angles of the triangulation are unevenly distributed.
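The linear-time construction described above amounts to a one-liner; a minimal Python sketch (the function name and example polygon are our own):

```python
def fan_triangulate(vertices):
    # For an ordered list of polygon vertices, emit triangles that all
    # share the fan origin vertices[0]: (v0, v1, v2), (v0, v2, v3), ...
    return [(vertices[0], vertices[i], vertices[i + 1])
            for i in range(1, len(vertices) - 1)]

hexagon = [(0, 0), (2, 0), (3, 1), (2, 2), (0, 2), (-1, 1)]  # convex, CCW
triangles = fan_triangulate(hexagon)  # n - 2 = 4 triangles
```

Because the triangle indices follow directly from the vertex order, many graphics APIs store only the vertex list and treat the fan implicitly, as the article notes.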
See also
Triangle fan |
https://en.wikipedia.org/wiki/William%20A.%20Mitchell | Dr. William A. Mitchell (October 21, 1911 – July 26, 2004) was an American food chemist who, while working for General Foods Corporation between 1941 and 1976, was the key inventor behind Pop Rocks, Tang, Cool Whip, and powdered egg whites. During his career he received over 70 patents.
Early life
He was born in Raymond, Minnesota. When he was a teenager, he ran the sugar crystallization tanks at the American Sugar Beet Company and slept two hours a night before getting to school. He earned an undergraduate degree at Cotner College in Lincoln, Nebraska and then graduated with a master's degree in chemistry from the University of Nebraska.
Career
Mitchell got a research job at an Agricultural Experiment Station in Lincoln, Nebraska. A lab accident there left him with second- and third-degree burns over most of his body. He joined General Foods in 1941. His first major success came with a tapioca substitute he helped develop during World War II, in response to the disruption of cassava supplies. Because of this, tapioca quickly became known as "Mitchell mud" within the US WW II infantry.
In 1957, he invented a powdered fruit-flavored vitamin-enhanced drink mix that became known as Tang Flavor Crystals. NASA started using Tang in 1962 in their space program.
In 1956, he tried to create instantly self-carbonating soda, which resulted in the creation of Pop Rocks. Although Pop Rocks weren't sold until 1975, he received patent 3,012,893 for its manufacturing process in 1961.
In 1967, he introduced Cool Whip, which became the largest and most profitable line in its division very quickly.
He received 70 patents in total during his career. Mitchell was a resident of Lincoln Park, New Jersey for many years before moving out west after his retirement in 1976.
Personal life
He was married to Ruth Cobbey Mitchell and they had seven children. His daughter, Cheryl Mitchell, also became a food scientist. He moved to Stockton after Ruth's death in 1999. Mitchell died of he |
https://en.wikipedia.org/wiki/Cloud%20Foundry | Cloud Foundry is an open source, multi-cloud application platform as a service (PaaS) governed by the Cloud Foundry Foundation, a 501(c)(6) organization.
The software was originally developed by VMware, transferred to Pivotal Software (a joint venture by EMC, VMware and General Electric), who then transferred the software to the Cloud Foundry Foundation upon its inception in 2015.
History
Originally conceived in 2009, Cloud Foundry was designed and developed by a small team at VMware led by Derek Collison and was originally called Project B29. At the time, a different PaaS project written in Java for Amazon EC2 used the name Cloud Foundry. It was founded by Chris Richardson in 2008 and acquired by SpringSource in 2009, the same year VMware acquired SpringSource. The current project is unrelated to the project under SpringSource, but the name was adopted when the original SpringSource project ended.
The announcement of Cloud Foundry took place in April 2011. A year later, in April 2012, BOSH, an open source tool chain for release engineering, deployment, and life-cycle management of large scale distributed services, was publicly launched. In April 2013, Pivotal was created from EMC and VMware, to market assets including Cloud Foundry, RabbitMQ and Spring.
By February 2014, it was announced that there would be an open governance foundation established with seven Platinum members and two Gold members.
In May 2014, there was an announcement of expanded membership with the addition of eight new companies. By December 2014, the membership had increased to 40.
Cloud Foundry Foundation
In January 2015, the Cloud Foundry Foundation was created as an independent not-for-profit 501(c)(6) Linux Foundation Collaborative Project.
Following the creation of the Cloud Foundry Foundation, the Cloud Foundry software (source code and all associated trademarks) was transferred to be held by the open source software foundation. It is primarily written in Ruby, Go and Java.
As of |
https://en.wikipedia.org/wiki/Parabiaugmented%20hexagonal%20prism | In geometry, the parabiaugmented hexagonal prism is one of the Johnson solids (). As the name suggests, it can be constructed by doubly augmenting a hexagonal prism by attaching square pyramids () to two of its nonadjacent, parallel (opposite) equatorial faces. Attaching the pyramids to nonadjacent, nonparallel equatorial faces yields a metabiaugmented hexagonal prism (). (The solid obtained by attaching pyramids to adjacent equatorial faces is not convex, and thus not a Johnson solid.)
External links
Johnson solids |
https://en.wikipedia.org/wiki/Hockey%20Night%20in%20Canada | CBC Television has aired National Hockey League (NHL) broadcasts under the Hockey Night in Canada (often abbreviated Hockey Night or HNiC) brand that is primarily associated with its Saturday night NHL broadcasts throughout its history in various platforms.
Saturday NHL broadcasts began in 1931 on the CNR Radio network and debuted on television in 1952. Initially, games were aired once a week; doubleheaders debuted in 1995 with 7:30 pm and 10:30 pm (ET) start times. Since 1998, the games have begun at 7:00 pm and 10:00 pm (ET). The broadcast features various segments during the intermissions and between games, as well as pre- and post-game coverage of the night's games, and player interviews. It also presents the hosts' opinions on news and issues in the league.
The Hockey Night in Canada brand is owned by the CBC and was exclusively used by CBC Sports through the end of the 2013–14 NHL season. Beginning in the 2014–15 season, the brand is being licensed to Rogers Communications for Sportsnet-produced Saturday NHL broadcasts airing on CBC Television as well as the Rogers-owned Citytv and Sportsnet networks. Rogers had secured exclusive national multimedia rights to NHL games beginning in 2014–15, and sublicensed Saturday night and playoff games to the CBC. This sub-license agreement runs through the end of the Rogers deal with the NHL.
History
Radio
Hockey broadcasting originated with play-by-play radio broadcasts from Toronto's Arena Gardens, which began on February 8, 1923, on Toronto station CFCA when Norman Albert announced the third period of play of an intermediate men's Ontario Hockey Association game. Foster Hewitt took over announcing duties within a month and, after several years of sporadic coverage that began to include National Hockey League games, the broadcasts went national in 1931 as the General Motors Hockey Broadcast. The program began broadcasting Saturday-night Toronto Maple Leafs games on November 12, 1931 over the Canadian N |
https://en.wikipedia.org/wiki/Hyper-IgM%20syndrome%20type%202 | Hyper IgM Syndrome Type 2 is a rare disease. Unlike patients with other hyper-IgM syndromes, the Type 2 patients identified thus far have not presented with a history of opportunistic infections, which is unusual for an immunodeficiency syndrome. The responsible genetic lesion is in the AICDA gene, found at 12p13.
Hyper IgM syndromes
Hyper IgM syndromes are a group of primary immune deficiency disorders characterized by defective CD40 signaling in B cells, affecting class switch recombination (CSR) and somatic hypermutation. Immunoglobulin (Ig) class switch recombination deficiencies are characterized by elevated serum IgM levels and a considerable deficiency in immunoglobulins G (IgG), A (IgA) and E (IgE). As a consequence, people with HIGM have an increased susceptibility to infections.
Signs and symptoms
Hyper IgM syndrome can be associated with the following conditions:
Infection/Pneumocystis pneumonia (PCP), which is common in infants with hyper IgM syndrome, is a serious illness. PCP is one of the most frequent and severe opportunistic infections in people with weakened immune systems.
Hepatitis (Hepatitis C)
Chronic diarrhea
Hypothyroidism
Neutropenia
Arthritis
Encephalopathy (degenerative)
Cause
Different genetic defects cause HIgM syndrome, the vast majority are inherited as an X-linked recessive genetic trait and most with the condition are male.
IgM is the form of antibody that all B cells produce initially, before they undergo class switching. Healthy B cells efficiently switch to other types of antibodies as needed to attack invading bacteria, viruses, and other pathogens. In people with hyper IgM syndromes, the B cells keep making IgM antibodies because they cannot switch to a different antibody class. This results in an overproduction of IgM antibodies and an underproduction of IgA, IgG, and IgE.
Pathophysiology
CD40 is a costimulatory receptor on B cells that, when bound to CD40 ligand (CD40L), sends a signal to the B-cell receptor. When there is a defect in |
https://en.wikipedia.org/wiki/Exposure%20fusion | In image processing, computer graphics, and photography, exposure fusion is a technique for blending multiple exposures of the same scene (bracketing) into a single image. As in high dynamic range imaging (HDRI or just HDR), the goal is to capture a scene with a higher dynamic range than the camera is capable of capturing with a single exposure.
Technique
By using different exposure parameters on the same scene, a wider dynamic range can be represented and later merged into an image with better dynamic range. After correcting for small shifts that may inadvertently happen with hand-held devices, the full images can be fused in two ways:
A higher dynamic range raw image can be reassembled and tone-mapped like usual HDR images, or more commonly:
A blended image can be directly produced without reconstructing a higher bit-depth.
The former method assumes a linear response from the camera, which may be provided by DNG or other raw formats. Some variants can take developed images, but the process of reconstructing the intensities is complicated and noisy, compromising the effective dynamic range.
The latter method [Mertens-Kautz-Van Reeth (MKVr)] only cares about aligning features and taking the best parts, automatically (by contrast, saturation, and proper exposure) or manually, so it is immune to this drawback. However, it cannot be considered a true HDR technique because no HDR image is ever created. The image does look better on displays, but the resulting bit depth of the image is equal to the input depth, unlike in a true HDR image, where a greater bit depth allows storing more detailed intensity changes. Flexibility being its strength, this method can be extended to perform focus stacking by using contrast as the sole criterion.
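A minimal sketch of MKVr-style weighted blending, assuming float images in [0, 1] and blending naively per pixel (the full method blends per level of a Laplacian pyramid to avoid seams); the weight formulas follow the contrast/saturation/well-exposedness idea described above, with arbitrary illustrative constants:

```python
import numpy as np

def exposure_fusion_weights(img, sigma=0.2):
    """Per-pixel quality weights: contrast x saturation x well-exposedness.
    img: float array of shape (H, W, 3) with values in [0, 1]."""
    gray = img.mean(axis=2)
    # Contrast: magnitude of a discrete Laplacian response on the grayscale image.
    lap = np.abs(
        -4 * gray
        + np.roll(gray, 1, 0) + np.roll(gray, -1, 0)
        + np.roll(gray, 1, 1) + np.roll(gray, -1, 1)
    )
    # Saturation: standard deviation across the colour channels.
    sat = img.std(axis=2)
    # Well-exposedness: Gaussian preference for mid-range intensities.
    wexp = np.exp(-((img - 0.5) ** 2) / (2 * sigma ** 2)).prod(axis=2)
    return lap * sat * wexp + 1e-12  # epsilon avoids an all-zero weight map

def fuse(exposures):
    """Blend a stack of registered exposures by their normalised weights."""
    weights = np.stack([exposure_fusion_weights(e) for e in exposures])
    weights /= weights.sum(axis=0, keepdims=True)   # weights sum to 1 per pixel
    stack = np.stack(exposures)
    return (weights[..., None] * stack).sum(axis=0)
```

Because each output pixel is a convex combination of the input pixels, the result stays in the input range and at the input bit depth, which is exactly the "no true HDR image is created" property noted above.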
In photomicrography
In photomicrography, the exposure fusion is often the only way to acquire properly exposed images from stereomicroscopes. One of the software solutions designed for photomicrography is the HDR module for QuickPH |
https://en.wikipedia.org/wiki/Glabrousness | Glabrousness (from the Latin glaber meaning "bald", "hairless", "shaved", "smooth") is the technical term for a lack of hair, down, setae, trichomes or other such covering. A glabrous surface may be a natural characteristic of all or part of a plant or animal, or be due to loss because of a physical condition, such as alopecia universalis in humans, which causes hair to fall out or not regrow.
In botany
Glabrousness, or otherwise, of leaves, stems, and fruit is a feature commonly mentioned in plant keys; in botany and mycology, a glabrous morphological feature is one that is smooth and may be glossy. It has no bristles or hair-like structures such as trichomes. In anything like the zoological sense, no plants or fungi have hair or wool, although some structures may resemble such materials.
The term "glabrous" strictly applies only to features that lack trichomes at all times. When an organ bears trichomes at first, but loses them with age, the term used is glabrescent.
In the model plant Arabidopsis thaliana, trichome formation is initiated by the GLABROUS1 protein. Knockouts of the corresponding gene lead to glabrous plants. This phenotype has already been used in gene editing experiments and might be of interest as visual marker for plant research to improve gene editing methods such as CRISPR/Cas9.
In zoology
In varying degrees most mammals have some skin areas without natural hair. On the human body, glabrous skin is found on the ventral portion of the fingers, palms, soles of feet and lips, which are all parts of the body most closely associated with interacting with the world around us, as are the labia minora and glans penis. There are four main types of mechanoreceptors in the glabrous skin of humans: Pacinian corpuscles, Meissner's corpuscles, Merkel's discs, and Ruffini corpuscles.
The naked mole-rat (Heterocephalus glaber) has evolved skin lacking the general pelage (hair covering) typical of mammals, yet has retained long, very sparsely scattered tactile hairs over i |
https://en.wikipedia.org/wiki/Covert%20channel | In computer security, a covert channel is a type of attack that creates a capability to transfer information objects between processes that are not supposed to be allowed to communicate by the computer security policy. The term, coined in 1973 by Butler Lampson, is defined as channels "not intended for information transfer at all, such as the service program's effect on system load," to distinguish it from legitimate channels that are subject to access controls by COMPUSEC.
Characteristics
A covert channel is so called because it is hidden from the access control mechanisms of secure operating systems since it does not use the legitimate data transfer mechanisms of the computer system (typically, read and write), and therefore cannot be detected or controlled by the security mechanisms that underlie secure operating systems. Covert channels are exceedingly hard to install in real systems, and can often be detected by monitoring system performance. In addition, they suffer from a low signal-to-noise ratio and low data rates (typically, on the order of a few bits per second). They can also be removed manually with a high degree of assurance from secure systems by well established covert channel analysis strategies.
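The low-data-rate, noisy character of such channels can be illustrated with a toy simulation of a timing covert channel: a "sender" modulates inter-event delays (standing in for, say, its effect on system load) to encode bits, and a "receiver" recovers them by thresholding the delays it observes. The delay values, noise level, and threshold here are arbitrary choices for the sketch; nothing real is exfiltrated:

```python
import random

SHORT, LONG = 0.010, 0.060            # seconds: delay encoding a 0-bit vs a 1-bit
THRESHOLD = (SHORT + LONG) / 2        # receiver's decision boundary

def send(bits, rng):
    """Return the noisy delay sequence an observer would measure,
    one delay per bit, perturbed by Gaussian timing jitter."""
    return [(LONG if b else SHORT) + rng.gauss(0, 0.004) for b in bits]

def receive(delays):
    """Threshold each observed delay back into a bit."""
    return [1 if d > THRESHOLD else 0 for d in delays]
```

With per-symbol delays tens of milliseconds long, the channel carries only tens of bits per second, and any additional system noise directly raises the error rate, matching the signal-to-noise limitations described above.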
Covert channels are distinct from, and often confused with, legitimate channel exploitations that attack low-assurance pseudo-secure systems using schemes such as steganography or even less sophisticated schemes to disguise prohibited objects inside of legitimate information objects. The legitimate channel misuse by steganography is specifically not a form of covert channel.
Covert channels can tunnel through secure operating systems and require special measures to control. Covert channel analysis is the only proven way to control covert channels. By contrast, secure operating systems can easily prevent misuse of legitimate channels, so distinguishing both is important. Analysis of legitimate channels for hidden objects is often misrepresente |
https://en.wikipedia.org/wiki/Hemerythrin | Hemerythrin (also spelled haemerythrin; , ) is an oligomeric protein responsible for oxygen (O2) transport in the marine invertebrate phyla of sipunculids, priapulids, brachiopods, and in a single annelid worm genus, Magelona. Myohemerythrin is a monomeric O2-binding protein found in the muscles of marine invertebrates. Hemerythrin and myohemerythrin are essentially colorless when deoxygenated, but turn a violet-pink in the oxygenated state.
Hemerythrin does not, as the name might suggest, contain a heme. The names of the blood oxygen transporters hemoglobin, hemocyanin and hemerythrin do not refer to the heme group (found only in globins); instead, these names are derived from the Greek word for blood. Hemerythrin may also contribute to innate immunity and anterior tissue regeneration in certain worms.
O2 binding mechanism
The mechanism of dioxygen binding is unusual. Most O2 carriers operate via formation of dioxygen complexes, but hemerythrin holds the O2 as a hydroperoxide (OOH−). The site that binds O2 consists of a pair of iron centres. The iron atoms are bound to the protein through the carboxylate side chains of a glutamate and aspartates, as well as through five histidine residues. Hemerythrin and myohemerythrin are often described according to the oxidation and ligation states of the iron centre:
The uptake of O2 by hemerythrin is accompanied by two-electron oxidation of the diferrous centre to produce a hydroperoxide (OOH−) complex. The binding of O2 is roughly described in this diagram:
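Schematically, the overall transformation can be summarized as follows (a textual stand-in consistent with the description of states A and B below, not the original figure):

```latex
\[
\underbrace{\mathrm{Fe^{II}}\,(\mu\text{-}\mathrm{OH})\,\mathrm{Fe^{II}}}_{\text{deoxyhemerythrin}}
\;+\; \mathrm{O_2}
\;\longrightarrow\;
\underbrace{\mathrm{Fe^{III}}\,(\mu\text{-}\mathrm{O})\,\mathrm{Fe^{III}}\!-\!\mathrm{OOH^-}}_{\text{oxyhemerythrin}}
\]
```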
Deoxyhemerythrin contains two high-spin ferrous ions bridged by a hydroxyl group (A). One iron is hexacoordinate and the other is pentacoordinate. The hydroxyl group serves as a bridging ligand but also functions as a proton donor to the O2 substrate. This proton transfer results in the formation of a single-oxygen-atom (μ-oxo) bridge in oxy- and methemerythrin. O2 binds to the pentacoordinate Fe2+ centre at the vacant coordination site (B). Then electrons are transferred |
https://en.wikipedia.org/wiki/Channel%20memory | An automatic channel memory system (ACMS) is a system in which a digitally controlled radio tuner such as a TV set or VCR could search and memorize TV channels automatically. While more common in television, it can also be used to store presets for radio stations. This is often called a channel scan, though that may also refer to a "preview" mode which plays each station it finds for a few seconds and then moves on to the next, without affecting memory.
Channel scanning
A typical TV device allows an automatic channel scan to be performed from a menu accessed by a button on the TV set, or sometimes only on the remote control. This applied first to analog TV sets, initially those with digital LED displays and later those with on-screen displays. These simply searched for the video carrier signal on every channel. (Before the advent of ACMS, many sets would search for the next channel every time the channel was changed.)
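The analog scan described above is, in essence, a loop that probes each channel and memorizes those with a detectable signal. A toy sketch, where the probe function and channel plan are hypothetical stand-ins for real tuner hardware:

```python
def channel_scan(probe, channel_plan):
    """Automatic channel memory: probe every channel in the plan and
    memorize only those where a valid signal is found.
    `probe(ch)` returns signal/station info, or None if nothing is detected."""
    memory = {}
    for ch in channel_plan:
        info = probe(ch)
        if info is not None:
            memory[ch] = info
    return memory
```

For example, simulating a market with three receivable stations, `channel_scan(stations.get, range(2, 70))` fills the channel memory with just those three entries; a digital tuner would additionally remap each physical channel to its virtual number from the broadcast metadata.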
It now also applies to digital TV, which must not only find the signal itself, but also decode its metadata enough to remap channel numbers to their proper locations. In the case of the American ATSC system, the ATSC tuner uses PSIP metadata to do this. The internal channel map for digital TV stations is different from the presets or "favorites" that the user has programmed. Just as with analog TV (which worked only by turning a preset on or off for each station/channel), users of digital television adapters and other similar tuners can choose to ignore channels that are still in the channel map.
Analog station presets and digital channel maps are normally deleted when a new scan is started. On some tuners, digital channel maps can be added to with an "easy-add" channel scan, which is useful for finding new stations without losing old ones that may be weak or currently off-air, or not aimed at with an antenna rotator or other set-top TV antenna adjustment. If a station adds a digital subchannel, most digital TV tuners will find it automati |
https://en.wikipedia.org/wiki/Microwindows | In computing, Nano-X is a windowing system which is full featured enough to be used on a PC, an embedded system or a PDA. It is an Open Source project aimed at bringing the features of modern graphical windowing environments to smaller devices and platforms. The project was renamed from Microwindows due to legal threats from Microsoft regarding the Windows trademark.
Overview
The Nano-X Window System is extremely portable and completely written in C. It has been ported to Intel 16-, 32- and 64-bit CPUs, the Broadcom BCM2837 ARM Cortex-A53, as well as the MIPS R4000 (NEC Vr41xx), StrongARM and PowerPC chips found on handheld and pocket PCs.
The Nano-X Window System currently runs on Linux systems with kernel framebuffer support, or using an X11 driver that allows Microwindows applications to be run on top of the X Window desktop. This driver emulates all of Microwindows' truecolor and palette modes so that an application can be previewed using the target system's display characteristics directly on the desktop display, regardless of the desktop display characteristics. In addition, it has been ported to Windows, Emscripten, Android (based on the Allegro library), and MS-DOS. Microwindows screen drivers have been written based on the SDL1 and SDL2 libraries plus the Allegro and SVGAlib libraries. There are also a VESA and a VGA 16 color 4 planes driver.
Architecture
Layered Design
Microwindows is essentially a layered design that allows different layers to be used or rewritten to suit the needs of the implementation. At the lowest level, screen, mouse/touchpad and keyboard drivers provide access to the actual display and other user-input hardware. At the mid level, a portable graphics engine is implemented, providing support for line draws, area fills, polygons, clipping and color models. At the upper level, three APIs are implemented, providing access for the graphics application programmer. Currently, Microwindows supports the Xlib, Nano-X and Windows Win32/ |
https://en.wikipedia.org/wiki/Ketosis | Ketosis is a metabolic state characterized by elevated levels of ketone bodies in the blood or urine. Physiological ketosis is a normal response to low glucose availability, such as low-carbohydrate diets or fasting, that provides an additional energy source for the brain in the form of ketones. In physiological ketosis, ketones in the blood are elevated above baseline levels, but the body's acid–base homeostasis is maintained. This contrasts with ketoacidosis, an uncontrolled production of ketones that occurs in pathologic states and causes a metabolic acidosis, which is a medical emergency. Ketoacidosis is most commonly the result of complete insulin deficiency in type 1 diabetes or late-stage type 2 diabetes. Ketone levels can be measured in blood, urine or breath and are generally between 0.5 and 3.0 millimolar (mM) in physiological ketosis, while ketoacidosis may cause blood concentrations greater than 10 mM.
Trace levels of ketones are always present in the blood and increase when blood glucose reserves are low and the liver shifts from primarily metabolizing carbohydrates to metabolizing fatty acids. This occurs during states of increased fatty acid oxidation such as fasting, starvation, carbohydrate restriction, or prolonged exercise. When the liver rapidly metabolizes fatty acids into acetyl-CoA, some acetyl-CoA molecules can then be converted into ketone bodies: acetoacetate, beta-hydroxybutyrate, and acetone. These ketone bodies can function as an energy source as well as signalling molecules. The liver itself cannot utilize these molecules for energy, so the ketone bodies are released into the blood for use by peripheral tissues including the brain.
When ketosis is induced by carbohydrate restriction, it is sometimes referred to as nutritional ketosis. A low-carbohydrate, moderate protein diet that can lead to ketosis is called a ketogenic diet. Ketosis is well-established as a treatment for epilepsy and is also effective in treating type 2 diabetes.
|
https://en.wikipedia.org/wiki/Pharmacodiagnostic%20testing | Pharmacodiagnostic testing is pre-treatment testing performed in order to determine whether or not a patient is likely to respond to a given therapy. This type of test is classified as a predictive test and is a prerequisite for the implementation of stratified and personalized medicine. |
https://en.wikipedia.org/wiki/Hardy%20space | In complex analysis, the Hardy spaces (or Hardy classes) Hp are certain spaces of holomorphic functions on the unit disk or upper half plane. They were introduced by Frigyes Riesz, who named them after G. H. Hardy because of an earlier paper of Hardy's. In real analysis Hardy spaces are certain spaces of distributions on the real line, which are (in the sense of distributions) boundary values of the holomorphic functions of the complex Hardy spaces, and are related to the Lp spaces of functional analysis. For 1 ≤ p < ∞ these real Hardy spaces Hp are certain subsets of Lp, while for p < 1 the Lp spaces have some undesirable properties, and the Hardy spaces are much better behaved.
There are also higher-dimensional generalizations, consisting of certain holomorphic functions on tube domains in the complex case, or certain spaces of distributions on Rn in the real case.
Hardy spaces have a number of applications in mathematical analysis itself, as well as in control theory (such as H∞ methods) and in scattering theory.
Hardy spaces for the unit disk
For spaces of holomorphic functions on the open unit disk, the Hardy space H2 consists of the functions f whose mean square value on the circle of radius r remains bounded as r → 1 from below.
More generally, the Hardy space Hp for 0 < p < ∞ is the class of holomorphic functions f on the open unit disk satisfying
This class Hp is a vector space. The number on the left side of the above inequality is the Hardy space p-norm for f, denoted by ‖f‖Hp. It is a norm when p ≥ 1, but not when 0 < p < 1.
The space H∞ is defined as the vector space of bounded holomorphic functions on the disk, with the norm
For 0 < p ≤ q ≤ ∞, the class Hq is a subset of Hp, and the Hp-norm is increasing with p (it is a consequence of Hölder's inequality that the Lp-norm is increasing for probability measures, i.e. measures with total mass 1).
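In standard notation, the defining condition for Hp (0 < p < ∞) and the norm on H∞ referred to above are:

```latex
\[
\|f\|_{H^p} \;=\; \sup_{0<r<1}
\left(\frac{1}{2\pi}\int_0^{2\pi}\bigl|f(re^{i\theta})\bigr|^{p}\,d\theta\right)^{1/p} < \infty,
\qquad
\|f\|_{H^\infty} \;=\; \sup_{|z|<1}\,|f(z)|.
\]
```

For p = 2 the supremum is exactly the bounded mean square value on circles of radius r mentioned at the start of this section.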
Hardy spaces on the unit circle
The Hardy spaces defined in the preceding section can also be viewed as certain closed |
https://en.wikipedia.org/wiki/Tonic%20sol-fa | Tonic sol-fa (or tonic sol-fah) is a pedagogical technique for teaching sight-singing, invented by Sarah Ann Glover (1785–1867) of Norwich, England and popularised by John Curwen, who adapted it from a number of earlier musical systems. It uses a system of musical notation based on movable do solfège, whereby every note is given a name according to its relationship with other notes in the key: the usual staff notation is replaced with anglicized solfège syllables (e.g. do, re, mi, fa, sol, la, ti, do) or their abbreviations (d, r, m, f, s, l, t, d). "Do" is chosen to be the tonic of whatever key is being used (thus the terminology moveable Do in contrast to the fixed Do system used by John Pyke Hullah). The original solfège sequence started with "Ut", the first syllable of the hymn Ut queant laxis, which later became "Do".
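The movable-do principle, in which the same syllables name different pitches depending on the chosen tonic, can be sketched as a small mapping. The pitch spellings below are deliberately simplified (sharps only, major scale, single octave):

```python
# Movable do: each syllable names a scale degree relative to the tonic,
# here expressed as semitone offsets within a major scale.
SOLFA = {"do": 0, "re": 2, "mi": 4, "fa": 5, "sol": 7, "la": 9, "ti": 11}
NOTES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def realize(syllables, tonic="C"):
    """Map solfège syllables to pitch names in the given major key."""
    base = NOTES.index(tonic)
    return [NOTES[(base + SOLFA[s]) % 12] for s in syllables]
```

Under fixed do (as in Hullah's system) "do" would always mean C; under movable do, `realize(["do", "mi", "sol"], tonic="G")` names the G major triad instead of the C major one.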
Overview
Glover developed her method in Norwich from 1812, resulting in the "Norwich Sol-fa Ladder" which she used to teach children to sing. She published her work in the Manual of the Norwich Sol-fa System (1845) and Tetrachordal System (1850).
Curwen was commissioned by a conference of Sunday school teachers in 1841 to find and promote a
way of teaching music for Sunday school singing. He took elements of the Norwich Sol-fa and other techniques later adding hand signals. It was intended that his method could teach singing initially from the Sol-fa and then a transition to staff notation.
Curwen brought out his Grammar of Vocal Music in 1843, and in 1853 started the Tonic Sol-Fa Association. The Standard Course of Lessons on the Tonic Sol-fa Method of Teaching to Sing was published in 1858.
In 1872, Curwen changed his former course of using the Sol-fa system as an aid to sight reading, when that edition of his Standard Course of Lessons excluded the staff and relied solely on Tonic Sol-fa.
In 1879 the Tonic Sol-Fa College was opened. Curwen also began publishing, and brought out a periodical called the Tonic Sol-fa Reporter and Magazine of |
https://en.wikipedia.org/wiki/McNaughton%27s%20theorem | In automata theory, McNaughton's theorem asserts that the set of ω-regular languages is identical to the set of languages recognizable by deterministic Muller automata.
This theorem is proven by supplying an algorithm to construct a deterministic Muller automaton for any ω-regular language and vice versa.
This theorem has many important consequences.
Since (non-deterministic) Büchi automata and ω-regular languages are equally expressive, the theorem implies that Büchi automata and deterministic Muller automata are equally expressive.
Since complementation of deterministic Muller automata is trivial, the theorem implies that Büchi automata/ω-regular languages are closed under complementation.
Original statement
In McNaughton's original paper, the theorem was stated as:
"An ω-event is regular if and only if it is finite-state."
In modern terminology, ω-events are commonly referred to as ω-languages. Following McNaughton's definition, an ω-event is a finite-state event if there exists a deterministic Muller automaton that recognizes it.
Constructing an ω-regular language from a deterministic Muller automaton
One direction of the theorem can be proven by showing that any given Muller automaton recognizes an ω-regular language.
Suppose A = (Q,Σ,δ,q0,F) is a deterministic Muller automaton. The union of finitely many ω-regular languages produces an ω-regular language; therefore it can be assumed without loss of generality that the Muller acceptance condition F contains exactly one set of states {q1, ... ,qn}.
Let α be the regular language whose elements take A from q0 to q1. For 1 ≤ i ≤ n, let βi be a regular language whose elements take A from qi to q(i mod n)+1 without passing through any state outside of {q1, ... ,qn}. It is claimed that α(β1 ... βn)ω is the ω-regular language recognized by the Muller automaton A. It is proved as follows.
Suppose w is a word accepted by A. Let ρ be the run that led to the acceptance of w. For a time i |
https://en.wikipedia.org/wiki/Melioidosis | Melioidosis is an infectious disease caused by a gram-negative bacterium called Burkholderia pseudomallei. Most people exposed to B. pseudomallei experience no symptoms; however, those who do experience symptoms have signs and symptoms that range from mild, such as fever and skin changes, to severe with pneumonia, abscesses, and septic shock that could cause death. Approximately 10% of people with melioidosis develop symptoms that last longer than two months, termed "chronic melioidosis".
Humans are infected with B. pseudomallei by contact with contaminated soil or water. The bacteria enter the body through wounds, inhalation, or ingestion. Person-to-person or animal-to-human transmission is extremely rare. The infection is constantly present in Southeast Asia, particularly in northeast Thailand, and in northern Australia. In temperate regions such as Europe and the United States, melioidosis cases are usually imported from countries where melioidosis is endemic. The signs and symptoms of melioidosis resemble those of tuberculosis, and misdiagnosis is common. Diagnosis is usually confirmed by the growth of B. pseudomallei from an infected person's blood or other bodily fluid such as pus, sputum, or urine. Those with melioidosis are treated first with an "intensive phase" course of intravenous antibiotics (most commonly ceftazidime), followed by a several-month treatment course of co-trimoxazole. In countries with advanced healthcare systems, approximately 10% of people with melioidosis die from the disease. In less developed countries, the death rate can reach 40%.
Efforts to prevent melioidosis include: wearing protective gear while handling contaminated water or soil, practising hand hygiene, drinking boiled water, and avoiding direct contact with soil, water, or heavy rain. There is little evidence supporting the use of melioidosis prophylaxis in humans. The antibiotic co-trimoxazole is used as a preventative only for individuals at high risk for getting the diseas |
https://en.wikipedia.org/wiki/Diffusion%20barrier | A diffusion barrier is a thin layer (usually micrometres thick) of metal usually placed between two other metals. It is done to act as a barrier to protect either one of the metals from corrupting the other.
Adhesion of a plated metal layer to its substrate requires a physical interlocking, inter-diffusion of the deposit or a chemical bonding between plate and substrate in order to work. The role of a diffusion barrier is to prevent or to retard the inter-diffusion of the two superposed metals. Therefore, to be effective, a good diffusion barrier requires inertness with respect to adjacent materials. To obtain good adhesion and a diffusion barrier simultaneously, the bonding between layers needs to come from a chemical reaction of limited range at both boundaries. Materials providing good adhesion are not necessarily good diffusion barriers and vice versa. Consequently, there are cases where two or more separate layers must be used to provide a proper interface between substrates.
Selection
The choice of diffusion barrier depends on the final function; anticipated operating temperature and required service life are critical parameters in selecting diffusion barrier materials. Many thin film metal combinations have been evaluated for their adhesion and diffusion barrier properties.
Aluminum provides good electrical and thermal conductivity, adhesion and reliability because of its oxygen reactivity and the self-passivation properties of its oxide.
Copper also reacts easily with oxygen, but its oxides have poor adhesion properties. As for gold, its virtue lies in its inertness and ease of application; its drawback is its cost.
Chromium has excellent adhesion to many materials because of its reactivity. Its affinity for oxygen forms a thin stable oxide coat on the outer surface, creating a passivation layer which prevents further oxidation of the chromium, and of the underlying metal (if any), even in corrosive environments. Chromium plating on steel for automotive |
https://en.wikipedia.org/wiki/Vent%20pecking | Vent pecking is an abnormal behaviour of birds performed primarily by commercial egg-laying hens. It is characterised by pecking damage to the cloaca, the surrounding skin and underlying tissue. Vent pecking frequently occurs immediately after an egg has been laid, when the cloaca often remains partly everted, exposing the mucosa, which is red from the physical trauma of oviposition and may bleed if the tissue is torn during egg-laying. Vent pecking clearly causes pain and distress to the bird being pecked. Tearing of the skin increases susceptibility to disease and may lead to cannibalism, with possible evisceration of the pecked bird and, ultimately, death.
Prevalence and severity
Surveys have shown that 27% of farmers reported seeing damage to the vents of their hens, and 36.9% of farmers reported that vent pecking had occurred in their previous flock. While farmers attributed only 1.3% of mortalities to vent pecking, the most common findings at autopsy were different types of cannibalism (65.51%), with vent cannibalism (38.57%) the most common. The type of housing system markedly affects the prevalence of vent pecking, with 22.5% of hens affected in free-range systems, 10.0% in barn systems, 6.2% in conventional cages and 1.6% in furnished cages, with a similar rank order for the severity of vent pecking injuries.
Causation
The causes and development of vent pecking are multifarious.
Risk factors that have been identified as increasing vent pecking include dim lights placed in nest boxes to encourage hens to use the boxes, the diet being changed more than three times during the egg laying period, the use of bell drinkers, and the hens beginning to lay earlier than 20 weeks of age. Vent pecking is associated with indicators of stress, e.g. fluctuating asymmetry, heterophil to lymphocyte ratio, and tonic immobility duration. Vent pecking can be related to disease or immune challenge as it sometimes becomes p |
https://en.wikipedia.org/wiki/68K/OS | 68K/OS was a computer operating system developed by GST Computer Systems for the Sinclair QL microcomputer.
It was commissioned by Sinclair Research in February 1983. However, after the official launch of the QL in January 1984, 68K/OS was rejected, and production QLs shipped with Sinclair's own Qdos operating system.
GST later released 68K/OS as an alternative to Qdos, in the form of an EPROM expansion card, and also planned to use it on single-board computers based on the QL's hardware.
The operating system was developed by Chris Scheybeler, Tim Ward, Howard Chalkley and others.
Because few ROM cards were made, surviving examples now fetch a high price: on 4 February 2010, one sold for £310 on eBay. |
https://en.wikipedia.org/wiki/Honeypot%20%28computing%29 | In computer terminology, a honeypot is a computer security mechanism set to detect, deflect, or, in some manner, counteract attempts at unauthorized use of information systems. Generally, a honeypot consists of data (for example, in a network site) that appears to be a legitimate part of the site which contains information or resources of value to attackers. It is actually isolated, monitored, and capable of blocking or analyzing the attackers. This is similar to police sting operations, colloquially known as "baiting" a suspect.
The main use for this network decoy is to distract potential attackers from more important information and machines on the real network, learn about the forms of attacks they can suffer, and examine such attacks during and after the exploitation of a honeypot.
It provides a way to detect and study vulnerabilities in a specific network system. A honeypot is a decoy used to protect a network from present or future attacks.
Types
Honeypots can be differentiated based on whether they are physical or virtual:
Physical honeypots: a real machine with its own IP address that simulates the behaviours modelled by the system. This modality is used less often because of the high cost of acquiring new machines, maintaining them, and configuring specialized hardware.
Virtual honeypots: these allow one to install and simulate hosts on the network running different operating systems, but doing so requires simulating the TCP/IP stack of the target operating system. This modality is more common.
Honeypots can be classified based on their deployment (use/action) and based on their level of involvement. Based on deployment, honeypots may be classified as:
production honeypots
research honeypots
Production honeypots are easy to use, capture only limited information, and are used primarily by corporations. Production honeypots are placed inside the production network with other production |
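The logging core of a low-interaction honeypot can be sketched in a few lines. This is an illustrative sketch only: the port, the fake SSH banner, and the log format are arbitrary assumptions, not taken from any honeypot product.

```python
# Minimal low-interaction honeypot sketch: listen on a decoy port,
# record every connection attempt, and present a fake service banner.
import datetime
import socket

def run_honeypot(host="0.0.0.0", port=2222, max_conns=1):
    """Listen on a decoy port and log each connection attempt."""
    log = []
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((host, port))
        srv.listen()
        for _ in range(max_conns):
            conn, addr = srv.accept()
            with conn:
                # Record who connected and when -- the honeypot's real output.
                ts = datetime.datetime.now(datetime.timezone.utc).isoformat()
                log.append((ts, addr[0]))
                # A fake SSH banner makes the decoy look like a real service.
                conn.sendall(b"SSH-2.0-OpenSSH_8.9\r\n")
    return log
```

A real deployment would run this isolated from production systems and forward the log to a monitoring host; here the function simply returns the collected entries.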
https://en.wikipedia.org/wiki/Bilinear%20time%E2%80%93frequency%20distribution | Bilinear time–frequency distributions, or quadratic time–frequency distributions, arise in a sub-field of signal analysis and signal processing called time–frequency signal processing, and in the statistical analysis of time series data. Such methods are used where the frequency composition of a signal may be changing over time; this sub-field used to be called time–frequency signal analysis, and is now more often called time–frequency signal processing due to progress in applying these methods to a wide range of signal-processing problems.
Background
Methods for analyzing time series, in both signal analysis and time series analysis, have been developed as essentially separate methodologies applicable to, and based in, either the time or the frequency domain. A mixed approach is required in time–frequency analysis techniques, which are especially effective in analyzing non-stationary signals, whose frequency distribution and magnitude vary with time. Examples of these are acoustic signals. Classes of "quadratic time–frequency distributions" (or "bilinear time–frequency distributions") are used for time–frequency signal analysis. This class is similar in formulation to Cohen's class distribution function, which was used in 1966 in the context of quantum mechanics. This distribution function is mathematically similar to a generalized time–frequency representation which utilizes bilinear transformations. Compared with other time–frequency analysis techniques, such as the short-time Fourier transform (STFT), the bilinear transformation (or quadratic time–frequency distributions) may not have higher clarity for most practical signals, but it provides an alternative framework to investigate new definitions and new methods. While it does suffer from inherent cross-term contamination when analyzing multi-component signals, by using a carefully chosen window function the interference can be significantly mitigated, at the expens |
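The prototypical member of this quadratic class is the Wigner–Ville distribution, built from the instantaneous autocorrelation x(t+τ)·x*(t−τ). A minimal discrete sketch (an illustrative NumPy implementation for a complex/analytic input, not tied to any particular library; the function name is our own):

```python
import numpy as np

def wigner_ville(x):
    """Discrete pseudo Wigner-Ville distribution of a complex signal x.

    Returns an (N, N) array W[k, n]: frequency bin k at time sample n.
    Because the lag enters as +/- tau, a tone at normalized frequency f0
    peaks at bin 2 * f0 * N (the usual frequency-axis doubling).
    """
    N = len(x)
    W = np.zeros((N, N))
    for n in range(N):
        # Largest lag keeping both n + tau and n - tau inside the signal.
        taumax = min(n, N - 1 - n)
        tau = np.arange(-taumax, taumax + 1)
        kernel = np.zeros(N, dtype=complex)
        # Instantaneous autocorrelation x[n+tau] * conj(x[n-tau]),
        # stored with circular indexing so the FFT sees lag 0 at index 0.
        kernel[tau % N] = x[n + tau] * np.conj(x[n - tau])
        # Hermitian symmetry in tau makes the FFT real-valued.
        W[:, n] = np.real(np.fft.fft(kernel))
    return W
```

For a two-component signal, plotting W exhibits the cross-term interference midway between the components that the passage describes.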
https://en.wikipedia.org/wiki/Prunus%20americana | Prunus americana, commonly called the American plum, wild plum, or Marshall's large yellow sweet plum, is a species of Prunus native to North America from Saskatchewan and Idaho south to New Mexico and east to Québec, Maine and Florida.
Prunus americana has often been planted outside its native range and sometimes escapes cultivation. It is commonly confused with the Canada plum (Prunus nigra), although the fruit is smaller and rounder and bright red as opposed to yellow. Many cultivated varieties have been derived from this species. It forms an excellent stock upon which to graft the domestic plum.
Description
The American plum grows as a large shrub or small tree, reaching up to . It is adapted to coarse- and medium-textured soils, but not to fine soils (silt or clay). Beneficially, the shrub survives harsh winters, down to temperatures of −40 degrees Fahrenheit, but has little tolerance for shade, drought, or fire. Its growth is most active in spring and summer; it blooms in spring and starts fruiting in summer. It propagates naturally by seed, expanding as a stand relatively slowly because of its long time to maturity when grown from seed.
The roots are shallow, widely spread, and send up suckers. The numerous stems per plant become scaly with age. The tree has a crown width and height of 10 feet at maturity. The branches are thorny. The leaves are alternately arranged, with an oval shape. The leaf length is usually long. The upper surface of the leaf is dark green; the underside is smooth and pale. The small white flowers with five petals occur singly or in clusters in the leaf axils. The globular fruits are about in diameter.
Taxonomy
Prunus americana var. lanata Sudw is considered a synonym of Prunus mexicana, and Prunus americana var. nigra is considered a synonym of Prunus nigra.
Chickasaw plum (Prunus angustifolia Marsh.) hybridizes naturally with P. americana to produce P. × orthosepala Koehne.
In cultivation, many crosses have been made between A |
https://en.wikipedia.org/wiki/Real%20closed%20ring | In mathematics, a real closed ring (RCR) is a commutative ring A that is a subring of a product of real closed fields, which is closed under continuous semi-algebraic functions defined over the integers.
Examples of real closed rings
Since the rigorous definition of a real closed ring is of technical nature it is convenient to see a list of prominent examples first. The following rings are all real closed rings:
real closed fields. These are exactly the real closed rings that are fields.
the ring of all real-valued continuous functions on a completely regular space X. Also, the ring of all bounded real-valued continuous functions on X is real closed.
convex subrings of real closed fields. These are precisely those real closed rings which are also valuation rings and were initially studied by Cherlin and Dickmann (they used the term "real closed ring" for what is now called a "real closed valuation ring").
the ring A of all continuous semi-algebraic functions on a semi-algebraic set of a real closed field (with values in that field). Also, the subring of all bounded (in any sense) functions in A is real closed.
(generalizing the previous example) the ring of all (bounded) continuous definable functions on a definable set S of an arbitrary first-order expansion M of a real closed field (with values in M). Also, the ring of all (bounded) definable functions is real closed.
Real closed rings are precisely the rings of global sections of affine real closed spaces (a generalization of semialgebraic spaces) and in this context they were invented by Niels Schwartz in the early 1980s.
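As a sanity check on the C(X) example above, one can verify directly the divisibility property that appears as the convexity condition in the formal definition (a sketch; c below is a name we introduce for the witness function):

```latex
% In C(X): if 0 <= a <= b pointwise, then b divides a^2.
% Define c(x) = a(x)^2 / b(x) where b(x) > 0 and c(x) = 0 where b(x) = 0.
% Since 0 <= a <= b gives a^2 <= ab, we have 0 <= c <= a, so c is squeezed
% to 0 wherever a (hence b) vanishes, making c continuous, and a^2 = b c.
\[
0 \le a \le b \;\Longrightarrow\; a^2 = b\,c,
\qquad
c(x) =
\begin{cases}
a(x)^2 / b(x) & \text{if } b(x) > 0,\\[2pt]
0 & \text{if } b(x) = 0,
\end{cases}
\qquad 0 \le c \le a .
\]
```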
Definition
A real closed ring is a reduced, commutative unital ring A which has the following properties:
The set of squares of A is the set of nonnegative elements of a partial order ≤ on A and (A,≤) is an f-ring.
Convexity condition: For all a, b in A, if 0 ≤ a ≤ b then b | a².
For every prime ideal p of A, the residue class ring A/p is integrally closed and its field of fractions |
https://en.wikipedia.org/wiki/Deferrisoma%20camini | Deferrisoma camini is a moderately thermophilic and anaerobic bacterium from the genus of Deferrisoma which has been isolated from a deep-sea hydrothermal vent from the Eastern Lau Spreading Centre in the Pacific Ocean. |