| source | text |
|---|---|
https://en.wikipedia.org/wiki/Chernobyl%20Diaries | Chernobyl Diaries is a 2012 American disaster horror film co-written and produced by Oren Peli and directed by Brad Parker, in his directorial debut. The film stars Jonathan Sadowski, Jesse McCartney, Devin Kelley, Olivia Taylor Dudley, Ingrid Bolsø Berdal, Nathan Phillips, and Dimitri Diatchenko, and was shot on locations in Pripyat, Ukraine, as well as Hungary, and Serbia.
Plot
Chris, his girlfriend Natalie, and their mutual friend Amanda are traveling across Europe. They stop in Kyiv, Ukraine, to visit Chris' brother, Paul, before heading on to Moscow, Russia, where Chris intends to propose to Natalie.
Paul suggests they go for an extreme tour of Pripyat, an abandoned town which sits in the shadow of the Chernobyl Nuclear Power Plant, the site of the 1986 Chernobyl nuclear disaster. Chris is against going on the tour and would rather stay on the original plan of going to Moscow, but Paul insists. They meet tour guide Yuri and are joined by a backpacking couple, Norwegian Zoe and Australian Michael. Yuri drives them through Ukraine, before they arrive at a Chernobyl Exclusion Zone checkpoint, where they are refused entry by the Ukrainian military. He then takes them to an alternate entry he discovered years ago.
The group stops at a river, where Yuri points out a large, mutated fish apparently able to live on land; while they return to their van, several other mutant fish are seen. The group worries about radiation poisoning, but Yuri uses a Geiger counter to assure them that radiation levels are safe. After spending a few hours exploring, Yuri takes them to the upper floor of an apartment building and shows them the Chernobyl nuclear plant on the near horizon. They hear noises at the other end of the apartment, which turn out to come from a bear that runs through the hallway past them without harming them.
The group prepares to leave Pripyat, but Yuri finds the wires in his van have been chewed through. He tries to radio for help, to no avail. As night falls, the group decides on whether |
https://en.wikipedia.org/wiki/Real%20number | In mathematics, a real number is a number that can be used to measure a continuous one-dimensional quantity such as a distance, duration or temperature. Here, continuous means that pairs of values can have arbitrarily small differences. Every real number can be almost uniquely represented by an infinite decimal expansion.
The real numbers are fundamental in calculus (and more generally in all mathematics), in particular by their role in the classical definitions of limits, continuity and derivatives.
The set of real numbers is denoted ℝ (or R) and is sometimes called "the reals".
The adjective real, used in the 17th century by René Descartes, distinguishes real numbers from imaginary numbers such as the square roots of −1.
The real numbers include the rational numbers, such as the integers and fractions. The rest of the real numbers are called irrational numbers. Some irrational numbers (as well as all the rationals) are roots of a polynomial with integer coefficients, such as the square root of 2; these are called algebraic numbers. There are also real numbers which are not, such as π; these are called transcendental numbers.
Real numbers can be thought of as all points on a line called the number line or real line, where the points corresponding to integers are equally spaced.
Conversely, analytic geometry is the association of points on lines (especially axis lines) to real numbers such that geometric displacements are proportional to differences between corresponding numbers.
The informal descriptions above of the real numbers are not sufficient for ensuring the correctness of proofs of theorems involving real numbers. The realization that a better definition was needed, and the elaboration of such a definition was a major development of 19th-century mathematics and is the foundation of real analysis, the study of real functions and real-valued sequences. A current axiomatic definition is that real numbers form the unique (up to an isomorphism) Dedekind-c |
https://en.wikipedia.org/wiki/Collectin-10 | Collectin-10, also known as collectin liver 1, is a collectin protein that in humans is encoded by the COLEC10 gene. Its structure is similar to mannan-binding lectin (MBL).
Collectin liver 1 (CL-L1) shows carbohydrate selectivity very similar to that of MBL. Two other discovered collectins are collectin placenta 1 (CL-P1) and collectin kidney 1 (CL-K1). The CL-L1 gene is located on chromosome 8 at q23–24.1. Research has established CL-L1 as a serum protein. |
https://en.wikipedia.org/wiki/Immunity%20passport | An immunity passport, immunity certificate, health pass or release certificate (among other names used by various local authorities) is a document, whether in paper or digital format, attesting that its bearer has a degree of immunity to a contagious disease. Public certification is an action that governments can take to mitigate an epidemic.
Because it may take into account natural immunity or very recent negative test results, an immunity passport cannot be reduced to a vaccination record or vaccination certificate, which proves only that someone has received certain vaccines, verified by the medical records of the clinic where the vaccines were given. An example of the latter is the Carte Jaune ("yellow card") issued by the World Health Organization (WHO), which serves as an official vaccination record.
The concept of immunity passports received much attention during the COVID-19 pandemic as a potential way to contain the pandemic and permit faster economic recovery. Reliable serological testing for antibodies against SARS-CoV-2 virus is done to certify people as relatively immune to COVID-19 and issue immunity documentation.
History
Quarantine has been used since ancient times as a method of limiting the spread of infectious disease. Consequently, there has also been a need for documents attesting that a person has completed quarantine or is otherwise known not to be infectious. One of the oldest known immunity passports, issued in 1578 in Venice, was found by Jacek Partyka, and since the 1600s, various Italian states issued fedi di sanità to exempt their bearers from quarantine.
The International Certificate of Vaccination (Carte Jaune) is a certificate of vaccination and prophylaxis, not immunity. The document has remained largely unchanged since it was adopted by the International Sanitary Convention of 1944. The certificate is most commonly associated with Yellow Fever, but it is also used to track vaccination against other illnesses.
Modern definition
An immunity certificate is a legal do |
https://en.wikipedia.org/wiki/Bias%E2%80%93variance%20tradeoff | In statistics and machine learning, the bias–variance tradeoff describes the relationship between a model's complexity, the accuracy of its predictions, and how well it can make predictions on previously unseen data that were not used to train the model. In general, as we increase the number of tunable parameters in a model, it becomes more flexible, and can better fit a training data set. It is said to have lower error, or bias. However, for more flexible models, there will tend to be greater variance to the model fit each time we take a set of samples to create a new training data set. It is said that there is greater variance in the model's estimated parameters.
The bias–variance dilemma or bias–variance problem is the conflict in trying to simultaneously minimize these two sources of error that prevent supervised learning algorithms from generalizing beyond their training set:
The bias error is an error from erroneous assumptions in the learning algorithm. High bias can cause an algorithm to miss the relevant relations between features and target outputs (underfitting).
The variance is an error from sensitivity to small fluctuations in the training set. High variance may result from an algorithm modeling the random noise in the training data (overfitting).
The bias–variance decomposition is a way of analyzing a learning algorithm's expected generalization error with respect to a particular problem as a sum of three terms, the bias, variance, and a quantity called the irreducible error, resulting from noise in the problem itself.
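The three-term decomposition described above can be checked numerically. The sketch below is an illustration (function names and parameter values are invented, not from the article): it fits polynomials of two degrees to many noisy training sets drawn from sin(x) and estimates the squared bias and variance of the resulting predictors.

```python
import numpy as np

def bias_variance(degree, n_sets=500, n_train=30, sigma=0.3, seed=0):
    """Monte Carlo estimate of bias^2 and variance for polynomial
    regression of a given degree on noisy samples of sin(x)."""
    rng = np.random.default_rng(seed)
    x_test = np.linspace(0.0, np.pi, 50)
    preds = np.empty((n_sets, x_test.size))
    for i in range(n_sets):
        # Each iteration draws a fresh training set, mimicking
        # "each time we take a set of samples".
        x = rng.uniform(0.0, np.pi, n_train)
        y = np.sin(x) + rng.normal(0.0, sigma, n_train)
        coefs = np.polyfit(x, y, degree)
        preds[i] = np.polyval(coefs, x_test)
    mean_pred = preds.mean(axis=0)
    bias2 = np.mean((mean_pred - np.sin(x_test)) ** 2)
    variance = preds.var(axis=0).mean()
    return bias2, variance, sigma ** 2  # irreducible error is sigma^2

# A rigid line underfits sin(x) (high bias); a degree-5 polynomial
# tracks it closely but varies more across training sets.
b_lin, v_lin, _ = bias_variance(degree=1)
b_poly, v_poly, _ = bias_variance(degree=5)
```

With these settings the linear model shows the much larger bias term while the degree-5 model shows the larger variance, which is the tradeoff in miniature.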
Motivation
The bias–variance tradeoff is a central problem in supervised learning. Ideally, one wants to choose a model that both accurately captures the regularities in its training data, but also generalizes well to unseen data. Unfortunately, it is typically impossible to do both simultaneously. High-variance learning methods may be able to represent their training set well but are at risk of overfitting to noisy or unrepresenta |
https://en.wikipedia.org/wiki/Papernaia | Papernaia is a subgenus of the genus Plasmodium, all of which are parasitic protozoa. The subgenus was created in 2010 by Landau et al. It may be synonymous with Novyella.
Species in this subgenus infect birds with malaria.
|
https://en.wikipedia.org/wiki/DHR1%20domain | DHR1 (DOCK homology region 1), also known as CZH1 or Docker1, is a protein domain of approximately 200–250 amino acids that is present in the DOCK family of signalling proteins. This domain binds phospholipids and so may assist in recruitment to cellular membranes. There is evidence that this domain may also mediate protein–protein interactions. |
https://en.wikipedia.org/wiki/Askaryan%20radiation | The Askaryan radiation also known as Askaryan effect is the phenomenon whereby a particle traveling faster than the phase velocity of light in a dense dielectric (such as salt, ice or the lunar regolith) produces a shower of secondary charged particles which contains a charge anisotropy and emits a cone of coherent radiation in the radio or microwave part of the electromagnetic spectrum. The signal is a result of the Cherenkov radiation from individual particles in the shower. Wavelengths greater than the extent of the shower interfere constructively and thus create a radio or microwave signal which is strongest at the Cherenkov angle. The effect is named after Gurgen Askaryan, a Soviet-Armenian physicist who postulated it in 1962.
The radiation was first observed experimentally in 2000, 38 years after its theoretical prediction. So far the effect has been observed in silica sand, rock salt, ice, and Earth's atmosphere.
The effect is of primary interest in using bulk matter to detect ultra-high energy neutrinos. The Antarctic Impulse Transient Antenna (ANITA) experiment uses antennas attached to a balloon flying over Antarctica to detect the Askaryan radiation produced as cosmic neutrinos travel through the ice. Several experiments have also used the Moon as a neutrino detector based on detection of the Askaryan radiation.
See also
Cherenkov radiation |
https://en.wikipedia.org/wiki/Plant%20disease%20resistance | Plant disease resistance protects plants from pathogens in two ways: by pre-formed structures and chemicals, and by infection-induced responses of the immune system. Relative to a susceptible plant, disease resistance is the reduction of pathogen growth on or in the plant (and hence a reduction of disease), while the term disease tolerance describes plants that exhibit little disease damage despite substantial pathogen levels. Disease outcome is determined by the three-way interaction of the pathogen, the plant and the environmental conditions (an interaction known as the disease triangle).
Defense-activating compounds can move cell-to-cell and systemically through the plant's vascular system. However, plants do not have circulating immune cells, so most cell types exhibit a broad suite of antimicrobial defenses. Although obvious qualitative differences in disease resistance can be observed when multiple specimens are compared (allowing classification as “resistant” or “susceptible” after infection by the same pathogen strain at similar inoculum levels in similar environments), a gradation of quantitative differences in disease resistance is more typically observed between plant strains or genotypes. Plants consistently resist certain pathogens but succumb to others; resistance is usually specific to certain pathogen species or pathogen strains.
Background
Plant disease resistance is crucial to the reliable production of food, and it provides significant reductions in agricultural use of land, water, fuel and other inputs. Plants in both natural and cultivated populations carry inherent disease resistance, but this has not always protected them.
The late blight Great Famine of Ireland of the 1840s was caused by the oomycete Phytophthora infestans. The world’s first mass-cultivated banana cultivar Gros Michel was lost in the 1920s to Panama disease caused by the fungus Fusarium oxysporum. The current wheat stem rust, leaf rust and yellow stripe rust epidemics s |
https://en.wikipedia.org/wiki/Russula%20herrerae | Russula herrerae is an edible mushroom in the genus Russula. Described as new to science in 2002, it is found only in its type locality in Mexico, where it grows in temperate oak forests near the village of San Francisco Temezontla in the state of Tlaxcala. The specific epithet herrerae honors Mexican mycologist Teófilo Herrera. R. herrerae is classified in the section Plorantes, subsection Lactarioideae.
Description
The fruit bodies have a white cap that is in diameter. The cap margin is appendiculate, meaning that there are patches of the partial veil attached to it. The brittle white gills have an adnate to decurrent attachment to the stem and are distantly spaced, with many lamellulae (short gills) interspersed between them. The white to yellowish stem measures long by thick and is equal in width throughout, or tapers towards the base. The color of the spore print ranges from white to pale cream.
The mushrooms are considered edible by most inhabitants of San Francisco Temezontla, who call it hongo blanco (white mushroom) or hongo blanco de ocote (pine white mushroom).
See also
List of Russula species |
https://en.wikipedia.org/wiki/New%20South%20Wales%20Food%20Authority | The New South Wales Food Authority (NSW Food Authority) is a statutory authority of the Government of New South Wales, responsible for food safety and food labelling regulation in the state as well as consumer food safety promotion. It is part of the DPI Biosecurity and Food Safety Branch within the Department of Primary Industries, which is part of the Department of Regional NSW.
The authority was established in April 2004 with the amalgamation of SafeFood Production NSW and the food inspection activities of the Department of Health (being the former Food Branch and the food inspection staff of the Area Health Services). SafeFood Production NSW was itself the product of a series of amalgamations of industry-specific organisations. NSW is the first Australian state to consolidate such a wide range of food safety and food labelling functions into a single, central government agency.
Remit
The authority enforces the Food Act 2003 (NSW) and associated regulations within New South Wales in respect of all food for sale. The act brings the bi-national Food Standards Code maintained by Food Standards Australia New Zealand into force within the state. The authority designs and monitors food safety schemes under the Food Regulation 2015 for higher risk industry sectors in the state, licenses food businesses under those schemes, receives notifications for non-retail food businesses, investigates food hygiene and labelling complaints for businesses it monitors, coordinates recalls of food products by companies in the state, coordinates the state's input to national food standards policy and implementation, and advises the NSW Minister for Primary Industries on food standards.
Completing the end-to-end remit of the agency, the Food Act also gives the authority a legislative responsibility 'to provide advice, information, community education and assistance in relation to matters connected with food safety or other interests of consumers in food'.
Scope of activities
The aut |
https://en.wikipedia.org/wiki/Washington%20%28state%29%20statistical%20areas | The U.S. state of Washington currently has 28 statistical areas that have been delineated by the Office of Management and Budget (OMB). On March 6, 2020, the OMB delineated six combined statistical areas, 13 metropolitan statistical areas, and nine micropolitan statistical areas in Washington.
Statistical areas
The Office of Management and Budget (OMB) has designated more than 1,000 statistical areas for the United States and Puerto Rico. These statistical areas are important geographic delineations of population clusters used by the OMB, the United States Census Bureau, planning organizations, and federal, state, and local government entities.
The OMB defines a core-based statistical area (commonly referred to as a CBSA) as "a statistical geographic entity consisting of the county or counties (or county-equivalents) associated with at least one core of at least 10,000 population, plus adjacent counties having a high degree of social and economic integration with the core as measured through commuting ties with the counties containing the core." The OMB further divides core-based statistical areas into metropolitan statistical areas (MSAs) that have "a population of at least 50,000" and micropolitan statistical areas (μSAs) that have "a population of at least 10,000, but less than 50,000."
The OMB defines a combined statistical area (CSA) as "a geographic entity consisting of two or more adjacent core-based statistical areas with employment interchange measures of at least 15%." The primary statistical areas (PSAs) include all combined statistical areas and any core-based statistical area that is not a constituent of a combined statistical area.
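The population thresholds quoted above amount to a simple classification rule. The function below is purely illustrative (the OMB publishes delineations, not code, and the name is invented):

```python
def classify_cbsa(core_population: int) -> str:
    """Classify a core-based statistical area by the population of its
    core, using the OMB thresholds quoted above (illustrative only)."""
    if core_population >= 50_000:
        return "metropolitan statistical area"
    if core_population >= 10_000:
        return "micropolitan statistical area"
    return "not a CBSA core"
```

For example, a core of 60,000 people anchors an MSA, while a core of 20,000 anchors a micropolitan statistical area.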
Table
The table below describes the 28 United States statistical areas and 39 counties of the State of Washington with the following information:
The combined statistical area (CSA) as designated by the OMB.
The CSA population according to 2019 US Census Bureau population estimates.
The core based statistical area (CBSA) as designated by |
https://en.wikipedia.org/wiki/Discriminant%20Book | The Discriminant Book (German: Kenngruppenbuch; literally: identification-group book), shortened to K-Book (K. Buch), and also known as the indicator group book or identification group book, was a secret distribution list in booklet form, which listed trigraphs in random order. The Kenngruppenbuch was introduced in May 1937, and used by the Kriegsmarine (German War Navy) during World War II as part of the Naval Enigma message encipherment procedure, to ensure secret and confidential communication between Karl Dönitz, Commander of Submarines (BdU) in the Atlantic and in the Mediterranean operating German submarines. The Kenngruppenbuch was used in the generation of the Enigma message Key that was transmitted within the message Indicator. The Kenngruppenbuch was used from 5 October 1941, for the Enigma Model M3, and from 1 February 1942 exclusively for the Enigma M4. It must not be confused with the Kenngruppenheft which was used with the Short Signal Book (German: Kurzsignalbuch).
History
The Kenngruppenbuch was a large document; its first edition came into force in 1938 and remained mostly unchanged when a second edition was released in 1941. The Zuteilungsliste (allocation list), however, was continually updated. After 1 May 1937, the Kriegsmarine stopped using an indicating system that repeated the message key within the indicator, a serious security flaw that was still being used by the Luftwaffe (German Air Force) and Heer (German Army) at the beginning of 1940; dropping it made the Naval Enigma more secure, and the K Book was introduced to avert this flaw.
On 9 May 1941, when a version of the K Book was recovered from U-boat U-110, Joan Clarke, and her compatriots at Hut 8, the section at Bletchley Park tasked with solving German naval (Kriegsmarine) Enigma messages, noticed that German telegraphists were not acting in a random way, which they were supposed to when making up the message Indicator. Rather than selecting a r |
https://en.wikipedia.org/wiki/Composability | Composability is a system design principle that deals with the inter-relationships of components. A highly composable system provides components that can be selected and assembled in various combinations to satisfy specific user requirements. In information systems, the essential features that make a component composable are that it be:
self-contained (modular): it can be deployed independently – note that it may cooperate with other components, but dependent components are replaceable
stateless: it treats each request as an independent transaction, unrelated to any previous request. Stateless is just one technique; managed state and transactional systems can also be composable, but with greater difficulty.
It is widely believed that composable systems are more trustworthy than non-composable systems because it is easier to evaluate their individual parts.
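The two properties listed above can be made concrete by modelling components as pure functions: a stateless, self-contained component depends only on its input, so components can be chained in any combination and swapped independently. The following sketch is an illustration of the principle, not drawn from the article:

```python
from functools import reduce

def compose(*components):
    """Chain stateless components: compose(f, g)(x) == f(g(x)).
    Because each component treats its input as an independent request,
    any one of them can be replaced without affecting the others."""
    return reduce(lambda f, g: lambda x: f(g(x)), components, lambda x: x)

# Hypothetical pipeline assembled from three interchangeable parts:
# normalise the text, tokenise it, then count the tokens.
normalise = str.lower
tokenise = str.split
count = len
pipeline = compose(count, tokenise, normalise)
```

Replacing `tokenise` with a different splitter yields a new pipeline without touching the other components, which is the recombination property the definition describes.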
Simulation theory
In simulation theory, current literature distinguishes between Composability of Models and Interoperability of Simulation. Modeling is understood as the purposeful abstraction of reality, resulting in the formal specification of a conceptualization and underlying assumptions and constraints. Modeling and simulation (M&S) is, in particular, interested in models that are used to support the implementation of an executable version on a computer. The execution of a model over time is understood as the simulation. While modeling targets the conceptualization, simulation challenges mainly focus on implementation, in other words, modeling resides on the abstraction level, whereas simulation resides on the implementation level. Following the ideas derived from the Levels of Conceptual Interoperability model (LCIM), Composability addresses the model challenges on higher levels, interoperability deals with simulation implementation issues, and integratability with network questions. Tolk proposes the following definitions: Interoperability allows exchanging information between the systems and using |
https://en.wikipedia.org/wiki/Sound%20server | A sound server is software that manages the use of and access to audio devices (usually a sound card). It commonly runs as a background process.
Sound server in an operating system
In a Unix-like operating system, a sound server mixes different data streams (usually raw PCM audio) and sends out a single unified audio to an output device. The mixing is usually done by software, or by hardware if there is a supported sound card.
Layers
The "sound stack" can be visualized as follows, with programs in the upper layers calling elements in the lower layers:
Applications (e.g. mp3 player, web video)
Sound server (e.g. aRts, ESD, JACK, PulseAudio)
Sound subsystem (described as kernel modules or drivers; e.g. OSS, ALSA)
Operating system kernel (e.g. Linux, Unix)
Motivation
Sound servers appeared in Unix-like operating systems after limitations in Open Sound System were recognized. OSS is a basic sound interface that was incapable of playing multiple streams simultaneously, dealing with multiple sound cards, or streaming sound over the network.
A sound server can provide these features by running as a daemon. It receives calls from different programs and sound flows, mixes the streams, and sends raw audio out to the audio device.
With a sound server, users can also configure global and per-application sound preferences.
Diversification and problems
There are multiple sound servers; some focus on providing very low latency, while others concentrate on features suitable for general desktop systems. While this diversification allows a user to choose just the features that are important to a particular application, it also forces developers to accommodate these options by writing code compatible with the various sound servers available. Consequently, this variety has resulted in a desire for a standard API to unify efforts.
List of sound servers
aRts
Enlightened Sound Daemon
JACK
Network Audio System
PipeWire
PulseAudio
sndio - OpenBSD audio an |
https://en.wikipedia.org/wiki/Outcome%20primacy | Outcome primacy is a psychological phenomenon in which the outcome of a subject's first experience with a given task or decision has a lasting effect on subsequent behavior. Outcome primacy can account for much of the underweighting of rare events in experience-based decisions, where participants apparently underestimate small probabilities (in contrast to prospect theory, in which people tend to overestimate low probabilities when lotteries are described).
Modelling
Behaviour in such tasks can be modelled using a standard, model-free reinforcement learning algorithm. In this model, the values of the different actions are learned over time and are used to determine the next action according to a predefined action-selection rule. It was shown that a substantial effect of first experience on behaviour is consistent with the reinforcement learning model if one assumes that the outcome of the first experience resets the values of the experienced actions, but not if symmetric initial conditions are assumed.
Moreover, the predictive power of the resetting model outperforms previously published models regarding the aggregate choice behaviour.
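The resetting idea can be sketched with a minimal delta-rule value update. This is a hedged illustration only; the parameter values and names are invented and are not taken from the published models:

```python
def value_trajectory(rewards, alpha=0.1, q0=0.5, reset_first=True):
    """Track the estimated value of one action under the delta rule
    Q <- Q + alpha * (r - Q). With reset_first=True, the very first
    outcome overwrites the initial value q0 entirely, mimicking the
    'resetting' account of outcome primacy."""
    q = q0
    history = []
    for i, r in enumerate(rewards):
        if reset_first and i == 0:
            q = r  # first experience resets the value
        else:
            q += alpha * (r - q)
        history.append(q)
    return history

# A single bad first outcome (r = 0) depresses the value estimate far
# longer under resetting than under the symmetric initial condition.
with_reset = value_trajectory([0, 1, 1, 1])
without_reset = value_trajectory([0, 1, 1, 1], reset_first=False)
```

Under resetting, the trajectory starts at 0 and recovers only slowly (0, 0.1, 0.19, 0.271), whereas with symmetric initial conditions the same reward sequence leaves the estimate near its prior, which is the disproportionate first-experience effect described above.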
Relation to other forms of primacy
These findings suggest that first experience has a disproportionately large effect on subsequent actions, similar to primacy effects in other fields of cognitive psychology, such as in the application of the serial position effect. The mechanism of resetting of the initial conditions that underlies outcome primacy may thus also account for other forms of primacy. |
https://en.wikipedia.org/wiki/140%20%28number%29 | 140 (one hundred [and] forty) is the natural number following 139 and preceding 141.
In mathematics
140 is an abundant number and a harmonic divisor number. It is the sum of the squares of the first seven integers, which makes it a square pyramidal number.
140 is an odious number because it has an odd number of ones in its binary representation. The sum of Euler's totient function φ(x) over the first twenty-one integers is 140.
140 is a repdigit in bases 13, 19, 27, 34, 69, and 139.
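The number-theoretic claims above are all mechanically checkable. The short script below (helper names are my own) verifies that 140 is abundant, a harmonic divisor number, square pyramidal, odious, the totient sum of the first twenty-one integers, and a repdigit in exactly the listed bases:

```python
from math import gcd

def divisors(n):
    """All positive divisors of n, including n itself."""
    return [d for d in range(1, n + 1) if n % d == 0]

def totient(n):
    """Euler's totient function, by direct count."""
    return sum(1 for k in range(1, n + 1) if gcd(k, n) == 1)

def is_repdigit(n, base):
    """True if every base-`base` digit of n is the same."""
    digits = set()
    while n:
        digits.add(n % base)
        n //= base
    return len(digits) == 1

n = 140
abundant = sum(divisors(n)) - n > n                  # proper divisors sum to 196
harmonic = (n * len(divisors(n))) % sum(divisors(n)) == 0  # harmonic mean of divisors is 5
square_pyramidal = n == sum(k * k for k in range(1, 8))    # 1^2 + ... + 7^2
odious = bin(n).count("1") % 2 == 1                  # 0b10001100 has three 1s
totient_sum = sum(totient(k) for k in range(1, 22))  # first twenty-one integers
repdigit_bases = [b for b in range(2, n) if is_repdigit(n, b)]
```

Running it confirms each property, with `repdigit_bases` coming out as [13, 19, 27, 34, 69, 139].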
In other fields
140 is also:
The number of varieties of ash from different pipe, cigar, and cigarette tobaccos distinguished in Sherlock Holmes's monograph.
The former Twitter entry-character limit, a well-known characteristic of the service (based on the text messaging limit)
A film, based on the Twitter entry-character limit, created and edited by Frank Kelly of Ireland
The age at which Job died
The atomic number of unquadnilium, a temporary chemical element
PRO 140 antibody found on T lymphocytes of the human immune system
Telephone directory assistance in Egypt
A video game developed by Jeppe Carlsen
The BPM (tempo) of the music genre Dubstep
See also
List of highways numbered 140
United Nations Security Council Resolution 140
United States Supreme Court cases, Volume 140 |
https://en.wikipedia.org/wiki/Cryptocat | Cryptocat is a discontinued open-source desktop application for encrypted online chat, available for Windows, OS X, and Linux. It uses end-to-end encryption to secure all communications with other Cryptocat users. Users are given the option of independently verifying their buddies' device lists and are notified when a buddy's device list is modified; all updates are verified through the built-in update downloader.
Cryptocat was created by Nadim Kobeissi and further developed with a community of open-source contributors; it is published under the terms of the GPLv3 license.
History
Cryptocat was first launched on 19 May 2011 as a web application.
In June 2012, Kobeissi said he was detained at the U.S. border by the DHS and questioned about Cryptocat's censorship resistance. He tweeted about the incident afterwards, resulting in media coverage and a spike in the popularity of the software.
In June 2013, security researcher Steve Thomas pointed out a security bug that could be used to decrypt any group chat message that had taken place using Cryptocat between September 2012 and 19 April 2013. Private messages were not affected, and the bug had been resolved a month before. In response, Cryptocat issued a security advisory, requested that all users ensure that they had upgraded, and informed users that past group conversations may have been compromised.
In February 2014, an audit by iSec Partners criticized Cryptocat's authentication model as insufficient. In response, Cryptocat made improvements to user authentication, making it easier for users to authenticate and detect man-in-the-middle attacks.
In February 2016, citing dissatisfaction with the project's current state after 19 months of non-maintenance, Kobeissi announced that he would be taking Cryptocat temporarily offline and discontinuing the development of its mobile application, pending a complete rewrite and relaunch of the software. In M |
https://en.wikipedia.org/wiki/Heat%20of%20formation%20group%20additivity | Heat of formation group additivity methods in thermochemistry enable the calculation and prediction of heat of formation of organic compounds based on additivity. This method was pioneered by S. W. Benson.
Benson model
Starting with simple linear and branched alkanes and alkenes, the method works by collecting a large number of experimental heat of formation data (see: Heat of Formation table) and then dividing each molecule into distinct groups, each consisting of a central atom with multiple ligands:
X-(A)i(B)j(C)k(D)l
Each group is then assigned an empirical incremental value that is independent of its position inside the molecule and of the nature of its neighbors:
P primary C-(C)(H)3 -10.00
S secondary C-(C)2(H)2 -5.00
T tertiary C-(C)3(H) -2.40
Q quaternary C-(C)4 -0.10
gauche correction +0.80
1,5 pentane interference correction +1.60
(all values in kcal/mol at 298 K)
The following example illustrates how these values can be derived.
The experimental heat of formation of ethane is -20.03 kcal/mol, and ethane consists of 2 P groups. Likewise, propane (-25.02 kcal/mol) can be written as 2P + S, isobutane (-32.07 kcal/mol) as 3P + T, and neopentane (-40.18 kcal/mol) as 4P + Q. These four equations in four unknowns yield estimates for P (-10.01 kcal/mol), S (-4.99 kcal/mol), T (-2.03 kcal/mol) and Q (-0.12 kcal/mol). The accuracy increases as the dataset grows.
The data allow the calculation of heats of formation for isomers. For example, the pentanes:
n-pentane = 2P + 3S = -35 (exp. -35 kcal/mol)
isopentane = 3P + S + T + 1 gauche correction = -36.6 (exp. -36.7 kcal/mol)
neopentane = 4P + Q = -40.1 (exp. -40.1 kcal/mol)
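The fitting and prediction steps above are just a small linear system. The sketch below (illustrative code, not Benson's original procedure) solves for the four group increments from the experimental data quoted earlier and then reproduces the pentane estimates using the rounded table values:

```python
import numpy as np

# Rows: counts of (P, S, T, Q) groups per molecule; right-hand side:
# experimental heats of formation (kcal/mol) quoted in the text.
counts = np.array([
    [2, 0, 0, 0],   # ethane     = 2P
    [2, 1, 0, 0],   # propane    = 2P + S
    [3, 0, 1, 0],   # isobutane  = 3P + T
    [4, 0, 0, 1],   # neopentane = 4P + Q
], dtype=float)
dHf = np.array([-20.03, -25.02, -32.07, -40.18])
P, S, T, Q = np.linalg.solve(counts, dHf)  # P ~ -10.0, S ~ -5.0, ...

# Predictions for the pentane isomers with the rounded table values,
# including the gauche correction for isopentane.
Pt, St, Tt, Qt, gauche = -10.00, -5.00, -2.40, -0.10, 0.80
n_pentane = 2 * Pt + 3 * St             # -35.0
isopentane = 3 * Pt + St + Tt + gauche  # -36.6
neopentane = 4 * Pt + Qt                # -40.1
```

With a larger dataset the system becomes overdetermined and a least-squares fit would replace the exact solve.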
The group additivities for alkenes are:
Cd-(H)2 +6.27
Cd-(C)(H) +8.55
Cd-(C)2 +10.19
Cd-(Cd)(H) +6.78
Cd-(Cd)(C) +8.76
C-(Cd)(H)3 -10.00
C-(Cd)(C)(H)2 -4.80
C-(Cd)(C)2(H) -1.67
C-(Cd)(C)3 +1.77
C-(Cd)2(H)2 -4.30
cis correction +1.10
alkene gauche correction +0.80
In alkenes the cis isomer is always less s |
https://en.wikipedia.org/wiki/Repetition%20blindness | Repetition blindness (RB) is a phenomenon observed in rapid serial visual presentation: people are sometimes poor at recognizing when things happen twice. Repetition blindness is the failure to recognize the second occurrence of a visual display. The two displays are shown sequentially, possibly with other stimulus displays in between, and each is shown only briefly, usually for about 150 milliseconds (Kanwisher, 1987). If stimuli are shown in between, RB can occur over a time interval of up to 600 milliseconds; without other stimuli displayed between the two repeated stimuli, RB lasts only about 250 milliseconds (Luo & Caramazza, 1995). Stimuli in repetition blindness tasks are usually words presented in lists or in sentences; RB also occurs for phonologically similar items (Bavelier & Potter, 1992), for pictures, and for mixed pairs of pictures and words, such as a picture of the sun and the word "sun" (Bavelier, 1994). The most popular task used to examine repetition blindness is to show words one after another on a screen in rapid succession and ask participants to recall the words they saw; this is the rapid serial visual presentation (RSVP) task. Repetition blindness is evident when missing the second word yields an inaccurate sentence. For example, given the RSVP sentence "When she spilled the ink there was ink all over," participants will recall seeing "When she spilled the ink there was all over," missing the second occurrence of "ink" (Kanwisher, 1987). This finding supports the view that people are "blind" to the second occurrence of a repeated item in an RSVP series.
For example, a subject's chances of correctly reporting both appearances of the word "cat" in the RSVP stream "dog mouse cat elephant cat snake" are lower than their chances of reporting the third and fifth words in the stream "dog mouse cat elephant pig snake".
The precise mechanism underlying RB has been extensively debated. Nancy Kanwisher has argued that it involves failure to t |
https://en.wikipedia.org/wiki/Eduroam | eduroam (education roaming) is an international Wi-Fi internet access roaming service for users in research, higher education and further education. It provides researchers, teachers, and students with network access when visiting an institution other than their own. Users are authenticated with credentials from their home institution, regardless of the location of the eduroam access point. Authorization to access the Internet and other resources is handled by the visited institution. Users do not have to pay to use eduroam.
In some countries, Internet access via eduroam is also available at other locations than the participating institutions, e.g. in libraries, public buildings, railway stations, city centres and airports.
History
The eduroam initiative started in 2002 when during the preparations for the creation of TERENA's task force TF-Mobility, Klaas Wierenga of SURFnet shared the idea of combining a RADIUS-based infrastructure with IEEE 802.1X technology to provide roaming network access across research and education networks. Initially, the service was joined by institutions in the Netherlands, Germany, Finland, Portugal, Croatia and the United Kingdom. Later, other NRENs in Europe embraced the idea and started joining the infrastructure, which was then called eduroam. Since 2004, the European Union co-funded further research and development work related to the eduroam service through the GN2 and GN3 projects. From September 2007, the European Union also funded through these projects the continued operation and maintenance of the eduroam service at the European level.
The first non-European country to join eduroam was Australia, in December 2004. In Canada, eduroam started as an initiative of the University of British Columbia, which was later taken over by CANARIE as a service of its Canadian Access Federation. In the United States, eduroam was initially a pilot project between the National Science Foundation and the University of Tennessee (UTK). In 201 |
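The federated RADIUS/802.1X design described above is visible in client configuration: the realm after the "@" in the user's identity is what the visited site's RADIUS proxy uses to route the authentication request back to the home institution. Below is an illustrative (not institution-specific) `wpa_supplicant` network block; the realm, identity, and certificate path are placeholders.

```
# Illustrative eduroam entry for wpa_supplicant.conf.
network={
    ssid="eduroam"
    key_mgmt=WPA-EAP
    eap=TTLS
    phase2="auth=PAP"
    # The realm after '@' routes the RADIUS request to the home institution.
    identity="jdoe@home-university.example"
    # Outer identity hides the username from the visited network.
    anonymous_identity="anonymous@home-university.example"
    password="..."
    # Validate the home institution's RADIUS server certificate.
    ca_cert="/etc/ssl/certs/home-university-ca.pem"
}
```

Actual EAP method (TTLS, PEAP, or TLS) and inner authentication vary by institution; the routing-by-realm principle is the same in each case.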
https://en.wikipedia.org/wiki/Profidia | Profidia is an extinct genus of leaf beetles in the subfamily Eumolpinae. It contains only one species, Profidia nitida. It is known from Oligo-Miocene amber found near Simojovel in Chiapas, Mexico.
The species was described by American entomologist Judson Linsley Gressitt in 1963, using a single specimen (UCMP 12630) from the collections of the University of California Museum of Paleontology in Berkeley, California. |
https://en.wikipedia.org/wiki/Knowledge-based%20authentication | Knowledge-based authentication, commonly referred to as KBA, is a method of authentication which seeks to prove the identity of someone accessing a service such as a financial institution or website. As the name suggests, KBA requires the knowledge of private information from the individual to prove that the person providing the identity information is the owner of the identity. There are two types of KBA: static KBA, which is based on a pre-agreed set of shared secrets, and dynamic KBA, which is based on questions generated from a wider base of personal information.
Static KBA (shared secrets)
Static KBA, also referred to as "shared secrets" or "shared secret questions," is commonly used by banks, financial services companies and e-mail providers to prove the identity of the customer before allowing account access or, as a fall-back, if the user forgets their password. At the point of initial contact with a customer, a business using static KBA must collect the information to be shared between the provider and customer—most commonly the questions and corresponding answers. This data must then be stored only to be retrieved when the customer comes back to access the account.
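The storage requirement described above is usually met the same way passwords are stored: only a salted hash of the (normalized) answer is kept. The sketch below is an illustrative assumption about how a provider might implement this, not a description of any particular system; the normalization rules and iteration count are choices made for the example.

```python
# Sketch of static KBA ("shared secrets") enrollment and verification.
import hashlib
import hmac
import os

def _normalize(answer: str) -> str:
    # Users rarely retype free-text answers byte-for-byte, so answers are
    # canonicalized (case folding, whitespace collapsing) before hashing.
    return " ".join(answer.strip().lower().split())

def enroll(answer: str) -> tuple[bytes, bytes]:
    """At account setup: store only a salted hash of the shared secret."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac(
        "sha256", _normalize(answer).encode(), salt, 100_000)
    return salt, digest

def verify(stored: tuple[bytes, bytes], attempt: str) -> bool:
    """At account recovery: hash the attempt and compare in constant time."""
    salt, digest = stored
    candidate = hashlib.pbkdf2_hmac(
        "sha256", _normalize(attempt).encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)

record = enroll("  Springfield ")
print(verify(record, "springfield"))   # True
print(verify(record, "Shelbyville"))   # False
```

Note that hashing does nothing to address the core weakness discussed next: the answer itself may be public knowledge.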
The weakness of static KBA was demonstrated in a 2008 incident in which unauthorized access was gained to the e-mail account of former Alaska Governor Sarah Palin. The Yahoo! account's password could be reset using shared secret questions, including "where did you meet your spouse?", along with the governor's date of birth and ZIP code, answers that were easily available online.
Some identity verification providers have recently introduced secret sounds or pictures in an effort to help secure sites and information. These tactics require the same methods of data storage and retrieval as secret questions.
Dynamic KBA
Dynamic KBA is a high level of authentication that uses knowledge questions to verify each individual identity but does not require the person to have provided the |
https://en.wikipedia.org/wiki/IBM%20System/36 | The IBM System/36 (often abbreviated as S/36) was a midrange computer marketed by IBM from 1983 to 2000 - a multi-user, multi-tasking successor to the System/34.
Like the System/34 and the older System/32, the System/36 was primarily programmed in the RPG II language. One of the machine's optional features was an off-line storage mechanism (on the 5360 model) that utilized "magazines" – boxes of 8-inch floppies that the machine could load and eject in a nonsequential fashion. The System/36 also had many mainframe features such as programmable job queues and scheduling priority levels.
While these systems were similar to other manufacturers' minicomputers, IBM itself described the System/32, System/34 and System/36 as "small systems" and later as midrange computers along with the System/38 and succeeding IBM AS/400 range.
The AS/400 series and IBM Power Systems running IBM i can run System/36 code in the System/36 Environment, although the code needs to be recompiled on IBM i first.
Overview of the IBM System/36
The IBM System/36 was a popular small business computer system, first announced on 16 May 1983 and shipped later that year. It had a 17-year product lifespan. The first model of the System/36 was the 5360.
In the 1970s, the US Department of Justice brought an antitrust lawsuit against IBM, claiming it was using unlawful practices to knock out competitors. At this time, IBM had been about to consolidate its entire line (System/370, 4300, System/32, System/34, System/38) into one "family" of computers with the same ISAM database technology, programming languages, and hardware architecture. After the lawsuit was filed, IBM decided it would have two families: the System/38 line, intended for large companies and representing IBM's future direction, and the System/36 line, intended for small companies who had used the company's legacy System/32/34 computers. In the late 1980s the lawsuit was dropped, and IBM decided to recombine the two product lines, cre |
https://en.wikipedia.org/wiki/Ranitidine%20bismuth%20citrate | Ranitidine bismuth citrate is a drug with antisecretory and bactericidal action.
Group affiliation: H2-histamine receptor blocker.
Form
Physical
Tablets. Each film-coated tablet contains 400 mg of ranitidine bismuth citrate; 14 tablets per blister, with 1, 2 or 4 blisters per box.
Chemical
N-[2-[[[5-[(Dimethylamino)methyl]-2-furanyl]methyl]thio]ethyl]-N'-methyl-2-nitro-1,1-ethenediamine bismuth citrate.
Pharmacological properties
Antiulcer action, mediated by the two active components. Ranitidine, a blocker of H2-histamine receptors, suppresses basal and stimulated gastric acid secretion both day and night, reducing the volume of secretion as well as the concentration of hydrochloric acid and pepsin secreted. Bismuth citrate has a bactericidal effect on Helicobacter pylori in vitro and a protective effect on the gastric mucosa.
Indications
Stomach ulcer and duodenal ulcer, including those associated with Helicobacter pylori (in combination with amoxicillin or clarithromycin).
Prospects for repurposing
Ranitidine bismuth citrate may also be effective against the SARS-CoV-2 coronavirus: it exhibits low cytotoxicity and can protect cells from SARS-CoV-2 infection, with a selectivity index several times higher than that of remdesivir. This is because it inhibits the SARS-CoV-2 helicase Nsp13, which is required for viral replication, by irreversibly displacing zinc(II) ions from the enzyme with bismuth(III) ions.
Contraindications
Hypersensitivity, acute porphyria, chronic renal failure (creatinine clearance less than 25 ml/min), childhood (under 14 years), pregnancy, lactation.
Side effects
From the digestive system: gastralgia, diarrhea or constipation, increased activity of hepatic transaminases, hepatitis (hepatocellular, cholestatic or mixed, with or without jaundice, usually reversible), acute pancreatitis. From the cardiovascular system: decreased blood pressure, bradycardia, AV block, chest pa |
https://en.wikipedia.org/wiki/History%20of%20geodesy | The history of geodesy (/dʒiːˈɒdɪsi/), concerning developments in measuring and representing the planet Earth, began during antiquity and ultimately blossomed during the Age of Enlightenment.
Many early conceptions of the Earth held it to be flat, with the heavens being a physical dome spanning over it. Early arguments for a spherical Earth pointed to various more subtle empirical observations, including how lunar eclipses were seen as circular shadows, as well as the fact that Polaris is seen lower in the sky as one travels southward.
Hellenic world
Initial developments
Though the earliest written mention of a spherical Earth comes from ancient Greek sources, there is no account of how the sphericity of Earth was discovered, or if it was initially simply a guess. A plausible explanation given by the historian Otto E. Neugebauer is that it was "the experience of travellers that suggested such an explanation for the variation in the observable altitude of the pole and the change in the area of circumpolar stars, a change that was quite drastic between Greek settlements" around the eastern Mediterranean Sea, particularly those between the Nile Delta and Crimea.
Another possible explanation can be traced back to earlier Phoenician sailors. The first circumnavigation of Africa is described as being undertaken by Phoenician explorers employed by Egyptian pharaoh Necho II c. 610–595 BC. In The Histories, written 431–425 BC, Herodotus cast doubt on a report of the Sun observed shining from the north. He stated that the phenomenon was observed by Phoenician explorers during their circumnavigation of Africa (The Histories, 4.42) who claimed to have had the Sun on their right when circumnavigating in a clockwise direction. To modern historians, these details confirm the truth of the Phoenicians' report. The historian Dmitri Panchenko hypothesizes that it was the Phoenician circumnavigation of Africa that inspired the theory of a spherical Earth, the earliest mention of which wa |
https://en.wikipedia.org/wiki/K%C2%B7p%20perturbation%20theory | In solid-state physics, the k·p perturbation theory is an approximated semi-empirical approach for calculating the band structure (particularly effective mass) and optical properties of crystalline solids. It is pronounced "k dot p", and is also called the "k·p method". This theory has been applied specifically in the framework of the Luttinger–Kohn model (after Joaquin Mazdak Luttinger and Walter Kohn), and of the Kane model (after Evan O. Kane).
Background and derivation
Bloch's theorem and wavevectors
According to quantum mechanics (in the single-electron approximation), the quasi-free electrons in any solid are characterized by wavefunctions which are eigenstates of the following stationary Schrödinger equation:
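With p, V, and m as defined below, the standard single-electron stationary Schrödinger equation referred to here is:

```latex
\left[\frac{\mathbf{p}^2}{2m} + V(\mathbf{r})\right]\psi(\mathbf{r}) = E\,\psi(\mathbf{r})
```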
where p is the quantum-mechanical momentum operator, V is the potential, and m is the vacuum mass of the electron. (This equation neglects the spin–orbit effect; see below.)
In a crystalline solid, V is a periodic function, with the same periodicity as the crystal lattice. Bloch's theorem proves that the solutions to this differential equation can be written as follows:
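With k, n, and un,k as defined below, the standard Bloch form of these solutions is:

```latex
\psi_{n\mathbf{k}}(\mathbf{r}) = e^{i\mathbf{k}\cdot\mathbf{r}}\, u_{n\mathbf{k}}(\mathbf{r})
```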
where k is a vector (called the wavevector), n is a discrete index (called the band index), and un,k is a function with the same periodicity as the crystal lattice.
For any given n, the associated states are called a band. In each band, there will be a relation between the wavevector k and the energy of the state En,k, called the band dispersion. Calculating this dispersion is one of the primary applications of k·p perturbation theory.
Perturbation theory
The periodic function un,k satisfies the following Schrödinger-type equation (simply, a direct expansion of the Schrödinger equation with a Bloch-type wave function):<ref name=Yu2.6>
{{cite book
|author=P. Yu, M. Cardona
|year=2005
|title=Fundamentals of Semiconductors: Physics and Materials Properties
|url=https://books.google.com/books?id=W9pdJZoAeyEC&pg=PA244
|edition=3rd
|page=Section 2.6, pp. 68 ff |no-pp=yes
|publi |
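In standard notation, the Schrödinger-type equation satisfied by the periodic part un,k is obtained by substituting the Bloch form into the original Schrödinger equation:

```latex
\left[\frac{\mathbf{p}^2}{2m} + \frac{\hbar}{m}\,\mathbf{k}\cdot\mathbf{p}
 + \frac{\hbar^2 k^2}{2m} + V(\mathbf{r})\right] u_{n\mathbf{k}}
 = E_{n\mathbf{k}}\, u_{n\mathbf{k}}
```

The k·p term, linear in the wavevector, is the piece treated as the perturbation in k·p theory.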
https://en.wikipedia.org/wiki/Sequenom | Sequenom is an American company based in San Diego, California. It develops enabling molecular technologies and highly sensitive laboratory genetic tests for NIPT. Sequenom's wholly owned subsidiary, Sequenom Center for Molecular Medicine (SCMM), offers multiple clinical molecular genetics tests to patients, including MaterniT21 PLUS, a noninvasive prenatal test for trisomy 21, trisomy 18, and trisomy 13, and the SensiGene Fetal RhD genotyping test.
In June 2014 the company sold its biosciences unit to Agena Bioscience for up to $35.8 million. In July 2016, it was announced that diagnostic and testing giant LabCorp would acquire Sequenom, paying $2.40 for every outstanding share of Sequenom stock. The acquisition was completed in September 2016.
Competition
Companies also offering non-invasive prenatal genetic testing include Ariosa, Ravgen, Illumina (Verinata Health), PerkinElmer and Natera (The Panorama Prenatal Test). Other companies and universities that are working towards developing non-invasive prenatal testing include Stanford University.
Patent litigation
In January 2012, Sequenom entered a patent battle with competing companies, Ariosa and Natera, accusing them of infringing the '540 patent. The cases are Sequenom Inc. v. Natera Inc. 12-cv-0184, Sequenom v. Ariosa Diagnostics Inc., 12-cv-0189, U.S. District Court, Southern District of California (San Diego), and Ariosa v. Sequenom.
Verinata Health and Stanford University later filed suit against Sequenom in a dispute over the 'Quake patent'. Verinata claims that Sequenom's lawyers sent it a letter in 2010 alleging that "'the practice of non-invasive prenatal diagnostics, including diagnosis of the Down Syndrome and other genetic disorders, using cell-free nucleic acids in a sample of maternal blood infringes' the '540 patent, as well as the claims of a pending United States Patent Application." The '540 patent was invented by Isis Ltd. and expires in 2017.
Stanford University owns the Qu |
https://en.wikipedia.org/wiki/Confuciusornis | Confuciusornis is a genus of basal crow-sized avialan from the Early Cretaceous Period of the Yixian and Jiufotang Formations of China, dating from 125 to 120 million years ago. Like modern birds, Confuciusornis had a toothless beak, but closer and later relatives of modern birds such as Hesperornis and Ichthyornis were toothed, indicating that the loss of teeth occurred convergently in Confuciusornis and living birds. It was thought to be the oldest known bird to have a beak, though this title now belongs to an earlier relative Eoconfuciusornis. It was named after the Chinese moral philosopher Confucius (551–479 BC). Confuciusornis is one of the most abundant vertebrates found in the Yixian Formation, and several hundred complete specimens have been found.
History of discovery
In November 1993, the Chinese paleontologists Hou Lianhai and Hu Yaoming of the Institute of Vertebrate Paleontology and Paleoanthropology (IVPP) at Beijing visited fossil collector Zhang He at his home in Jinzhou, where he showed them a fossil bird specimen that he had bought at a local flea market. In December, Hou learned about a second specimen, which had been discovered by a farmer named Yang Yushan. Both specimens were found in the same locality in Shangyuan, Beipiao. In 1995, these two specimens, as well as a third one, were formally described as a new genus and species of bird, Confuciusornis sanctus, by Hou and colleagues. The generic name combines the philosopher Confucius with Greek ὄρνις (ornis), "bird". The specific name means "holy one" in Latin and is a translation of Chinese 圣贤 (shèngxián), "sage," again in reference to Confucius. The first discovered specimen was designated the holotype and catalogued under the specimen number IVPP V10918; it comprises a partial skeleton with skull and parts of the forelimb. Of the other two skeletons, one (paratype, IVPP V10895) comprises a complete pelvis and hind limb, and the other (paratype, IVPP V10919–10925) a fragmentary hind limb |
https://en.wikipedia.org/wiki/Acceleration%20%28special%20relativity%29 | Accelerations in special relativity (SR) follow, as in Newtonian Mechanics, by differentiation of velocity with respect to time. Because of the Lorentz transformation and time dilation, the concepts of time and distance become more complex, which also leads to more complex definitions of "acceleration". SR as the theory of flat Minkowski spacetime remains valid in the presence of accelerations, because general relativity (GR) is only required when there is curvature of spacetime caused by the energy–momentum tensor (which is mainly determined by mass). However, since the amount of spacetime curvature is not particularly high on Earth or its vicinity, SR remains valid for most practical purposes, such as experiments in particle accelerators.
One can derive transformation formulas for ordinary accelerations in three spatial dimensions (three-acceleration or coordinate acceleration) as measured in an external inertial frame of reference, as well as for the special case of proper acceleration measured by a comoving accelerometer. Another useful formalism is four-acceleration, as its components can be connected in different inertial frames by a Lorentz transformation. Also equations of motion can be formulated which connect acceleration and force. Equations for several forms of acceleration of bodies and their curved world lines follow from these formulas by integration. Well known special cases are hyperbolic motion for constant longitudinal proper acceleration or uniform circular motion. Eventually, it is also possible to describe these phenomena in accelerated frames in the context of special relativity, see Proper reference frame (flat spacetime). In such frames, effects arise which are analogous to homogeneous gravitational fields, which have some formal similarities to the real, inhomogeneous gravitational fields of curved spacetime in general relativity. In the case of hyperbolic motion one can use Rindler coordinates, in the case of uniform circular motion one c |
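The quantities named above can be summarized in standard notation (metric signature (−,+,+,+); these are textbook results, not formulas from this excerpt). The four-acceleration is the proper-time derivative of the four-velocity, and the normalization of the four-velocity forces the two to be orthogonal:

```latex
A^\mu = \frac{dU^\mu}{d\tau}, \qquad
U^\mu U_\mu = -c^2 \;\;\Rightarrow\;\; A^\mu U_\mu = 0 .
```

For hyperbolic motion with constant proper acceleration α, the world line parameterized by proper time τ is

```latex
x(\tau) = \frac{c^2}{\alpha}\cosh\!\frac{\alpha\tau}{c}, \qquad
t(\tau) = \frac{c}{\alpha}\sinh\!\frac{\alpha\tau}{c},
```

which traces a hyperbola x² − c²t² = (c²/α)² in a Minkowski diagram.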
https://en.wikipedia.org/wiki/Nuage%20%28cell%20biology%29 | Nuage are Drosophila melanogaster germline granules. These electron-dense perinuclear structures are the hallmark of Drosophila melanogaster germline cells and can silence the selfish genetic elements in Drosophila melanogaster. The term 'Nuage' comes from the French word for 'cloud', as they appear as nebulous electron-dense bodies by electron microscopy. They are found in nurse cells of the developing Drosophila melanogaster egg chamber and are composed of various types of proteins, including RNA helicases, Tudor domain proteins, and Piwi-clade Argonaute proteins, in addition to a PRMT5 methylosome composed of Capsuléen and its co-factor, Valois (MEP50).
See piRNA for more information. |
https://en.wikipedia.org/wiki/NetIQ%20eDirectory | eDirectory is an X.500-compatible directory service software product from NetIQ. Previously owned by Novell, the product has also been known as Novell Directory Services (NDS) and sometimes referred to as NetWare Directory Services. NDS was initially released by Novell in 1993 for NetWare 4, replacing the NetWare bindery mechanism used in previous versions, for centrally managing access to resources on multiple servers and computers within a given network. eDirectory is a hierarchical, object-oriented database used to represent certain assets in an organization in a logical tree, including organizations, organizational units, people, positions, servers, volumes, workstations, applications, printers, services, and groups to name just a few.
Features
eDirectory uses dynamic rights inheritance, which allows both global and specific access controls. Access rights to objects in the tree are determined at the time of the request and are determined by the rights assigned to the objects by virtue of their location in the tree, any security equivalences, and individual assignments. The software supports partitioning at any point in the tree, as well as replication of any partition to any number of servers. Replication between servers occurs periodically using deltas of the objects. Each server can act as a master of the information it holds (provided the replica is not read only). Additionally, replicas may be filtered to only include defined attributes to increase speed (for example, a replica may be configured to only include a name and phone number for use in a corporate address book, as opposed to the entire directory user profile).
The software supports referential integrity, multi-master replication, and has a modular authentication architecture. It can be accessed via LDAP, DSML, SOAP, ODBC, JDBC, JNDI, and ADSI.
Supported platforms
Windows 2000
Windows Server 2003
Windows Server 2008
Windows Server 2012
SUSE Linux Enterprise Server
Red Hat Enterprise Li |
https://en.wikipedia.org/wiki/NCKAP1 | Nck-associated protein 1 is a protein that in humans is encoded by the NCKAP1 gene.
Interactions
NCKAP1 has been shown to interact with RAC1 and ABI1. |
https://en.wikipedia.org/wiki/Spirulina%20%28dietary%20supplement%29 | Spirulina is a biomass of cyanobacteria (blue-green algae) that can be consumed by humans and animals. The three species are Arthrospira platensis, A. fusiformis, and A. maxima.
Cultivated worldwide, Arthrospira is used as a dietary supplement or whole food. It is also used as a feed supplement in the aquaculture, aquarium, and poultry industries.
Etymology and ecology
The species A. maxima and A. platensis were once classified in the genus Spirulina. The common name, spirulina, refers to the dried biomass of A. platensis, which belongs to photosynthetic bacteria that cover the groups Cyanobacteria and Prochlorophyta. Scientifically, a distinction exists between spirulina and the genus Arthrospira. Species of Arthrospira have been isolated from alkaline brackish and saline waters in tropical and subtropical regions. Among the various species included in the genus Arthrospira, A. platensis is the most widely distributed and is mainly found in Africa, but also in Asia. A. maxima is believed to be found in California and Mexico. The term spirulina remains in use for historical reasons.
Arthrospira species are free-floating, filamentous cyanobacteria characterized by cylindrical, multicellular trichomes in an open left-handed helix. They occur naturally in tropical and subtropical lakes with high pH and high concentrations of carbonate and bicarbonate. A. platensis occurs in Africa, Asia, and South America, whereas A. maxima is confined to Central America. Most cultivated spirulina is produced in open-channel raceway ponds, with paddle wheels used to agitate the water.
Spirulina thrives at a pH around 8.5 and above and a temperature around . They are autotrophic, meaning that they are able to make their own food, and do not need a living energy or organic carbon source. A nutrient feed for growing it is:
Baking soda
Potassium nitrate
Sea salt
Potassium phosphate
Iron sulphate
Historical use
Spirulina was a food source for the Aztecs and other Mesoamer |
https://en.wikipedia.org/wiki/Induction-recursion | In intuitionistic type theory (ITT), a discipline within mathematical logic, induction-recursion is a feature for simultaneously declaring a type and function on that type. It allows the creation of larger types, such as universes, than inductive types. The types created still remain predicative inside ITT.
An inductive definition is given by rules for generating elements of a type. One can then define functions from that type by induction on the way the elements of the type are generated. Induction-recursion generalizes this situation since one can simultaneously define the type and the function, because the rules for generating elements of the type are allowed to refer to the function.
Induction-recursion can be used to define large types including various universe constructions. It increases the proof-theoretic strength of type theory substantially. Nevertheless, inductive-recursive definitions are still considered predicative.
Background
Induction-Recursion came out of investigations into the rules of Martin-Löf's intuitionistic type theory. The type theory has a number of "type formers" and four kinds of rules for each one. Martin-Löf had hinted that the rules for each type former followed a pattern, which preserved the properties of the type theory (e.g., strong normalization, predicativity). Researchers started looking for the most general description of the pattern, since that would tell what kinds of type formers could be added (or not added!) to extend the type theory.
The "universe" type former was the most interesting, because when the rules were written "à la Tarski", they simultaneously defined the "universe type" and a function that operated on it. This eventually led Dybjer to Induction-Recursion.
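The universe à la Tarski is the canonical example: a set U of codes and a decoding function T are defined simultaneously, with T appearing in the type of U's constructors. A minimal sketch in standard type-theoretic notation (only a code for ℕ and for Π-types shown):

```latex
\begin{aligned}
&U : \mathsf{Set} \qquad\qquad T : U \to \mathsf{Set}\\[2pt]
&\hat{\mathbb{N}} : U
  &&T(\hat{\mathbb{N}}) = \mathbb{N}\\
&\hat{\Pi} : (a : U) \to \big(T(a) \to U\big) \to U
  &&T\big(\hat{\Pi}(a,b)\big) = (x : T(a)) \to T\big(b(x)\big)
\end{aligned}
```

Note how the constructor Π̂ mentions T in its own type; that mutual dependence is exactly what plain inductive definitions cannot express and induction-recursion can.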
Dybjer's initial papers called Induction-Recursion a "schema" for rules. It stated what type formers could be added to the type theory. Later, he and Setzer would write a new type former with rules that allowed new Induction-Recursio |
https://en.wikipedia.org/wiki/Goat | The goat or domestic goat (Capra hircus) is a domesticated species of goat-antelope typically kept as livestock. It was domesticated from the wild goat (C. aegagrus) of Southwest Asia and Eastern Europe. The goat is a member of the animal family Bovidae and the tribe Caprini, meaning it is closely related to the sheep. There are over 300 distinct breeds of goat. It is one of the oldest domesticated species of animal: archaeological evidence indicates that its earliest domestication occurred in Iran 10,000 calibrated calendar years ago. Goats have been used for milk, meat, fur, and skins across much of the world. Milk from goats is often turned into goat cheese.
In 2011, there were more than 924 million goats living in the world, according to the UN Food and Agriculture Organization.
Etymology
The Modern English word goat comes from Old English gāt "she-goat, goat in general", which in turn derives from Proto-Germanic *gaitaz (cf. Dutch/Frisian/Icelandic/Norwegian geit, German Geiß, and Gothic gaits), ultimately from Proto-Indo-European *ǵʰaidos meaning "young goat" (cf. Latin haedus "kid"). To refer to the male goat, Old English used bucca (cf. Dutch/Frisian bok and giving modern buck) until ousted by hegote, hegoote in the late 12th century. Nanny goat (females) originated in the 18th century, and billy goat (for males) originated in the 19th century.
Female goats are referred to as does or nannies, intact males are called bucks or billies, and juvenile goats of both sexes are called kids. Castrated males are called wethers. While the words hircine and caprine both refer to anything having a goat-like quality, hircine is used most often to emphasize the distinct smell of domestic goats.
History
Goats are among the earliest animals domesticated by humans. The most recent genetic analysis confirms the archaeological evidence that the wild bezoar ibex of the Zagros Mountains is the likely original ancestor of probably all domestic goats today.
Neolithi |
https://en.wikipedia.org/wiki/CCNA | CCNA (Cisco Certified Network Associate) is an information technology (IT) certification from Cisco Systems. CCNA certification is an associate-level Cisco Career certification.
The Cisco exams have changed several times in response to changing IT trends. In 2020, Cisco announced an update to its certification program that "Consolidated and updated associate-level training and certification." Cisco has consolidated the previous different types of Cisco-certified Network Associate with a general CCNA certification.
The content of the exams is proprietary. Cisco and its learning partners offer a variety of different training methods, including books published by Cisco Press, and online and classroom courses available under the title "Interconnecting Cisco Network Devices".
Exam
To achieve a CCNA certification, candidates must earn a passing score on Cisco exam No. 200-301. After the exam, candidates receive a score report along with a score breakdown by exam section and the passing score for the given exam.
The exam tests a candidate's knowledge and skills required to install, operate, and troubleshoot a small to medium size enterprise branch network. The exam covers a broad range of fundamentals, including network fundamentals, network access, IP connectivity, IP services, security fundamentals, automation, and programmability.
Prerequisites
There are no prerequisites for taking the CCNA certification exam. An entry-level starting point in networking is the CCT (Cisco Certified Technician) certification.
Validity
The validity of CCNA Certification is three years. Renewal requires certification holders to register for and pass the same or higher level Cisco re-certification exam(s) every three years.
See also
Cisco Networking Academy
Cisco certifications
DevNet
Cyber Ops
CCNP
CCIE Certification |
https://en.wikipedia.org/wiki/Differential%20game | In game theory, differential games are a group of problems related to the modeling and analysis of conflict in the context of a dynamical system. More specifically, a state variable or variables evolve over time according to a differential equation. Early analyses reflected military interests, considering two actors—the pursuer and the evader—with diametrically opposed goals. More recent analyses have reflected engineering or economic considerations.
Connection to optimal control
Differential games are related closely with optimal control problems. In an optimal control problem there is a single control and a single criterion to be optimized; differential game theory generalizes this to two controls and two criteria, one for each player. Each player attempts to control the state of the system so as to achieve its goal; the system responds to the inputs of all players.
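The two-player structure just described can be written in standard notation (the symbols f, g_i, S_i, and the horizon T are generic placeholders, not quantities defined in this article). The state evolves according to a controlled differential equation,

```latex
\dot{x}(t) = f\big(x(t),\, u_1(t),\, u_2(t),\, t\big), \qquad x(0) = x_0,
```

and each player i = 1, 2 chooses the control u_i to optimize their own criterion,

```latex
J_i = \int_0^T g_i\big(x(t),\, u_1(t),\, u_2(t),\, t\big)\,dt + S_i\big(x(T)\big).
```

Setting u_2 aside (or fixing it) collapses this to a standard optimal control problem, which is the sense in which differential games generalize optimal control.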
History
In the study of competition, differential games have been employed since a 1925 article by Charles F. Roos. The first to study the formal theory of differential games was Rufus Isaacs, publishing a textbook treatment in 1965. One of the first games analyzed was the 'homicidal chauffeur game'.
Random time horizon
Games with a random time horizon are a particular case of differential games. In such games, the terminal time is a random variable with a given probability distribution function. Therefore, the players maximize the mathematical expectation of the cost function. It was shown that the modified optimization problem can be reformulated as a discounted differential game over an infinite time interval.
Applications
Differential games have been applied to economics. Recent developments include adding stochasticity to differential games and the derivation of the stochastic feedback Nash equilibrium (SFNE). A recent example is the stochastic differential game of capitalism by Leong and Huang (2010). In 2016 Yuliy Sannikov received the John Bates Clark Medal from the American Economic As |
https://en.wikipedia.org/wiki/Cooperative%20eye%20hypothesis | The cooperative eye hypothesis is a proposed explanation for the appearance of the human eye. It suggests that the eye's distinctive visible characteristics evolved to make it easier for humans to follow another's gaze while communicating or while working together on tasks.
Differences in primate eyes
Unlike other primates, all human beings have eyes with a distinct colour contrast between the white sclera, the coloured iris, and the black pupil. This is due to a lack of pigment in the sclera. Other primates mostly have pigmented sclerae that are brown or dark in colour. There is also a higher contrast between human skin, sclera, and irises. Human eyes are also larger in proportion to body size, and are longer horizontally. Among primates, humans are the only species where the outline of the eye and the position of the iris can be clearly seen in each individual.
Studies
The cooperative eye hypothesis was first proposed by H. Kobayashi and S. Kohshima in 2002 and was subsequently tested by Michael Tomasello and others at the Max Planck Institute for Evolutionary Anthropology in Germany. Researchers examined the effect of head and eye movement on changing gaze direction in humans and other great apes. A human experimenter, observed by either a human infant, a gorilla, a bonobo, or a chimpanzee, did one of four actions:
Tilted his head up while closing his eyes
Looked at the ceiling with his eyes while keeping his head stationary
Looked at the ceiling with his head and his eyes
Looked straight ahead without moving his head or his eyes
The apes were most likely to follow the gaze of the experimenter when only his head moved. The infants followed the gaze more often when only the eyes moved.
The results suggest that humans depend more on eye movements than head movements when trying to follow the gaze of another. Anthropologists not involved in the study have called the hypothesis plausible, noting that "human infants and children both infer cooperative intentions |
https://en.wikipedia.org/wiki/Tata%20Neu | Tata Neu is a multi-purpose super-app, developed in India by the Tata Group. It is the country's first super-app. The app was launched to coincide with the start of a 2022 Indian Premier League match.
Tata Neu was launched on 7 April 2022; on launch day its servers were overloaded. Sales numbers missed the million targets set by Tata Digital. Reports stated that Tata Digital received as much as billion in funding from Tata Group for the Tata Neu app and additional investments. By the end of May the app had nearly 11 million downloads; however, it was plagued by constant glitches and slow response times, leading to a reduction in customer usage.
As reports of a high number of bugs and a sluggish user experience continued, it emerged that Tata Neu's CTO had resigned within four months, and the app faced backlash over the sharing of customer data between group companies. In January 2023, Mukesh Bansal, the president of Tata Digital and head of operations for Tata Neu, stepped down, with Pratik Pal, the CEO of Tata Digital, taking over all business decisions at the firm. Soon afterwards, additional reports emerged that Tata Neu was set to miss its first-year GMV target by as much as 50%: internal projections put GMV by March 2023 at billion versus the expected billion (which had been scaled down from billion).
The Tata Neu app, which was revamped in time for the 2023 Indian Premier League (IPL) season, saw a significant increase in downloads and membership following the tournament. The Tata Group's sponsorship of the IPL and the heavy promotion of the app during matches likely played a role in this growth. Following the revamp, Neu's rating on the Google Play store improved from 3.8 to 4.2.
https://en.wikipedia.org/wiki/Logorama | Logorama is a 2009 French adult animated short film produced by the French graphic design and animation studio H5 as their first and only animated project. Co-written and directed by François Alaux, Hervé de Crécy and Ludovic Houplain, the film is set in a stylized version of Los Angeles and portrays various events as being told entirely through the extensive use of more than 2,000 contemporary and historical company logos and mascots. The short's voice cast consists of Bob Stephenson, David Fincher, Aja Evans, Sherman Augustus, Joel Michaely, Matt Winston, Gregory J. Pruss, Josh Eichenbaum, Jaime Ray Newman and Andrew Kevin Walker.
Upon its release on May 20, 2009 in Cannes, Logorama received numerous awards and nominations, including winning both the Kodak Discovery Award for Best Short Film (the Kodak Prix) at the 2009 Cannes Film Festival and the Academy Award for Best Animated Short Film at the 82nd Academy Awards the following year.
Plot
In Los Angeles the buildings and inhabitants are commercial branding: Birds are in the form of Bentley and Aston Martin logos and MSN's butterfly; pedestrians are in the shape of the AIM icon; and overhead highway signs are mounted on Atlantic Records logos, among others. The Original Pringles mascot pulls into a Pizza Hut restaurant's parking lot and propositions an Esso waitress on a smoking break. Meanwhile, BiC Boy students alongside Bob's Big Boy and Haribo Boy are on a tour at a zoo led by Mr. Clean; hating the tour, Big Boy and Haribo Boy hop off the tour train and soon begin to harass the MGM lion by mooning and throwing a bottle of Coca-Cola at it, prompting the zoo's owner, the Jolly Green Giant, to scold them.
As Michelin Man police officers Mitch and Mike order lunch at KFC, a call comes in over the radio stating that a criminal named Ronald McDonald is on the loose in a red delivery truck; the police give chase. Many innocent bystanders are imperiled or injured. Meanwhile, the students on tour
https://en.wikipedia.org/wiki/Varicosavirus | Varicosavirus is a genus of plant viruses associated with swelling in plant vein tissues. They are negative-sense, single-stranded RNA viruses. The genus contains three species.
Taxonomy
The genus contains the following species:
Alopecurus varicosavirus
Lettuce big-vein associated varicosavirus
Trifolium varicosavirus
Structure
Virions consist of a non-enveloped rod-shaped capsid, having a helical symmetry of 120–360 nm in length, and a width of 18–30 nm.
Genome
The genome consists of a bi-segmented linear, single-stranded negative sense RNA. The first segment is about 6350–7000 nucleotides in length; the second, about 5630–6500 nucleotides in length. |
https://en.wikipedia.org/wiki/Notre%20Dame%20Journal%20of%20Formal%20Logic | The Notre Dame Journal of Formal Logic is a quarterly peer-reviewed scientific journal covering the foundations of mathematics and related fields of mathematical logic, as well as philosophy of mathematics. It was established in 1960 and is published by Duke University Press on behalf of the University of Notre Dame. The editors-in-chief are Curtis Franks and Anand Pillay (University of Notre Dame).
Abstracting and indexing
The journal is abstracted and indexed in:
According to the Journal Citation Reports, the journal has a 2012 impact factor of 0.431. |
https://en.wikipedia.org/wiki/Nurses%20Improving%20Care%20for%20Healthsystem%20Elders | Nurses Improving Care for Healthsystem Elders (NICHE) is a program of the Hartford Institute for Geriatric Nursing at New York University College of Nursing that works to achieve systematic nursing change to benefit hospitalized older patients. Founded in 1992, NICHE has evolved into a national geriatric nursing program comprising over 620 hospitals in more than 40 states, as well as parts of Canada.
Unlike similar programs, NICHE does not prescribe how institutions should modify geriatric care; rather, it provides the materials and services necessary to stimulate a change in the culture of healthcare facilities to achieve patient-centered care for older adults.
History
NICHE is based on the 1981 Geriatric Resource Nurse (GRN) model, in which unit-based GRN nurses provided consultation to other staff nurses, leading to improved care of older adults by creating standard protocols for common geriatric problems and enhancing the expertise of staff nurses. Initially part of the Hartford Foundation's Hospital Outcomes Program for the Elderly (HOPE) multi-site initiative, the NICHE program was officially created in 1992 after a series of grants for the project. Through a rigorous screening process using the Geriatric Institutional Assessment Profile (GIAP), an instrument designed to help hospitals analyze the needs of their elderly patients and determine gaps in geriatric care provision, the NICHE program has been an integral part of the Hartford Institute since 1996.
In 2006, the Hartford Institute received funding from Atlantic Philanthropies to develop a business plan to expand its organizational capacity, dramatically improve the program's "toolkit"—particularly its measurement and reporting capacity—and initiate outreach to accelerate adoption of the program by additional hospitals.
The NICHE Program
The NICHE Program provides the principles, resources and tools to stimulate a change in the culture of health care facilities and achieve patient-centered care for olde |
https://en.wikipedia.org/wiki/SaTScan | SaTScan is a software tool that employs scan statistics for the spatial and temporal analysis of clusters of events. The software is trademarked by Martin Kulldorff, and was originally designed for public health and epidemiology to identify clusters of cases in both space (geographical location) and time, and to perform statistical analysis to determine if these clusters are significantly different from what would be expected by chance. The software provides a user-friendly interface and a range of statistical methods, making it accessible to researchers and practitioners. While not a full Geographic Information System, the outputs from SaTScan can be integrated with software such as ArcGIS or QGIS to visualize and analyze spatial data, and to map the distribution of various phenomena.
Analysis
SaTScan employs scan statistics to identify clusters of phenomena in space and time. Scan statistics evaluate a study area with regular shapes (usually circles) of varying sizes. For each circle, the software computes whether the phenomena inside the circle differ significantly from what would be expected relative to the area outside the circle.
SaTScan can analyze data retrospectively or prospectively. It can look at the data spatially, temporally, or simultaneously incorporate both space and time. SaTScan can incorporate numerous probability models, including Poisson distribution, Bernoulli distribution, Monte Carlo method, and multinomial distribution. Using these, it can look for areas of higher and lower occurrences of phenomena than expected.
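The circular Poisson scan idea can be sketched as follows. This is a toy illustration with invented function names and data; real SaTScan also assesses cluster significance with Monte Carlo replications, which this sketch omits.

```python
import math

def poisson_llr(c, e, C):
    """Kulldorff-style log-likelihood ratio for a zone with observed count c
    and expected count e, out of C total cases; 0 unless the zone is a hotspot."""
    if c <= e or e <= 0.0 or c >= C:
        return 0.0
    return c * math.log(c / e) + (C - c) * math.log((C - c) / (C - e))

def circular_scan(points):
    """points: list of (x, y, observed, expected).
    Grow a circle around each candidate centre one point at a time, and
    return (best_llr, (centre_x, centre_y, n_points_inside))."""
    C = sum(p[2] for p in points)
    E = sum(p[3] for p in points)
    best_llr, best_zone = 0.0, None
    for cx, cy, _, _ in points:
        # Sort points by distance from the candidate centre, then grow the circle.
        ordered = sorted(points, key=lambda p: (p[0] - cx) ** 2 + (p[1] - cy) ** 2)
        c = e = 0.0
        for k, p in enumerate(ordered):
            c += p[2]
            e += p[3] * C / E   # rescale expectations so they sum to C
            llr = poisson_llr(c, e, C)
            if llr > best_llr:
                best_llr, best_zone = llr, (cx, cy, k + 1)
    return best_llr, best_zone
```

The most likely cluster is the circle maximising the likelihood ratio of the "inside differs from outside" hypothesis.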
Results are output into a variety of formats, including ESRI Shapefile, HTML, and KML.
History
SaTScan was developed by a group of epidemiologists and statisticians led by Martin Kulldorff, a Swedish biostatistician and professor of medicine at Harvard Medical School. Version 1.0 of the software was released in 1997, and it has since become a widely used tool in public health research and practice.
SaTScan was developed in res |
https://en.wikipedia.org/wiki/Divisorial%20scheme | In algebraic geometry, a divisorial scheme is a scheme admitting an ample family of line bundles, as opposed to an ample line bundle. In particular, a quasi-projective variety is a divisorial scheme, and the notion is a generalization of "quasi-projective". It was introduced by Borelli (in the case of a variety) as well as in SGA 6 (in the case of a scheme). The term "divisorial" refers to the fact that "the topology of these varieties is determined by their positive divisors." The class of divisorial schemes is quite large: it includes affine schemes, separated regular (noetherian) schemes and subschemes of a divisorial scheme (such as projective varieties).
Definition
Here is the definition in SGA 6, which is a more general version of the definition of Borelli. Given a quasi-compact quasi-separated scheme X, a family of invertible sheaves on it is said to be an ample family if the open subsets X_f = {x : f(x) ≠ 0}, where f runs over global sections of positive tensor powers of members of the family, form a base of the (Zariski) topology on X; in other words, there is an open affine cover of X consisting of open sets of this form. A scheme is then said to be divisorial if there exists such an ample family of invertible sheaves.
Properties and counterexample
Since a subscheme of a divisorial scheme is divisorial, "divisorial" is a necessary condition for a scheme to be embedded into a smooth variety (or more generally a separated Noetherian regular scheme). To an extent, it is also a sufficient condition.
A divisorial scheme has the resolution property; i.e., a coherent sheaf is a quotient of a vector bundle. In particular, a scheme that does not have the resolution property is an example of a non-divisorial scheme.
See also
Jouanolou's trick |
https://en.wikipedia.org/wiki/Video%20game%20remake | A video game remake is a video game closely adapted from an earlier title, usually for the purpose of modernizing a game with updated graphics for newer hardware and gameplay for contemporary audiences. Typically, a remake of such game software shares essentially the same title, fundamental gameplay concepts, and core story elements of the original game, although some aspects of the original game may have been changed for the remake.
Remakes are often made by the original developer or copyright holder, and sometimes by the fan community. If created by the community, video game remakes are sometimes also called fangames and can be seen as part of the retro gaming phenomenon.
Definition
A remake offers a newer interpretation of an older work, characterized by updated or changed assets. For example, The Legend of Zelda: Ocarina of Time 3D and The Legend of Zelda: Majora's Mask 3D for the Nintendo 3DS are considered remakes of their original versions for the Nintendo 64, and not remasters or ports, since there are new character models and texture packs. The Legend of Zelda: Wind Waker HD for Wii U would be considered a remaster, since it retains the same, albeit upscaled, aesthetics as the original.
A remake typically maintains the same story, genre, and fundamental gameplay ideas of the original work. The intent of a remake is usually to take an older game that has become outdated and update it for a new platform and audience. A remake will not necessarily preserve the original gameplay especially if it is dated, instead remaking the gameplay to conform to the conventions of contemporary games or later titles in the same series in order to make a game marketable to a new audience.
For example, for Sierra's 1991 remake of Space Quest, the developers used the engine, point-and-click interface, and graphical style of Space Quest IV: Roger Wilco and The Time Rippers, replacing the original graphics and text parser interface of the original. However, other elem |
https://en.wikipedia.org/wiki/Parity%20of%20a%20permutation | In mathematics, when X is a finite set with at least two elements, the permutations of X (i.e. the bijective functions from X to X) fall into two classes of equal size: the even permutations and the odd permutations. If any total ordering of X is fixed, the parity (oddness or evenness) of a permutation σ of X can be defined as the parity of the number of inversions for σ, i.e., of pairs of elements x, y of X such that x < y and σ(x) > σ(y).
The sign, signature, or signum of a permutation σ is denoted sgn(σ) and defined as +1 if σ is even and −1 if σ is odd. The signature defines the alternating character of the symmetric group Sn. Another notation for the sign of a permutation is given by the more general Levi-Civita symbol ε_σ, which is defined for all maps from X to X, and has value zero for non-bijective maps.
The sign of a permutation can be explicitly expressed as

sgn(σ) = (−1)^N(σ)

where N(σ) is the number of inversions in σ.
Alternatively, the sign of a permutation σ can be defined from its decomposition into a product of transpositions as

sgn(σ) = (−1)^m

where m is the number of transpositions in the decomposition. Although such a decomposition is not unique, the parity of the number of transpositions in all decompositions is the same, implying that the sign of a permutation is well-defined.
Example
Consider the permutation σ of the set {1, 2, 3, 4, 5} defined by σ(1) = 3, σ(2) = 4, σ(3) = 5, σ(4) = 2, and σ(5) = 1. In one-line notation, this permutation is denoted 34521. It can be obtained from the identity permutation 12345 by three transpositions: first exchange the numbers 2 and 4, then exchange 3 and 5, and finally exchange 1 and 3. This shows that the given permutation σ is odd. Following the method of the cycle notation article, this could be written, composing from right to left, as

σ = (1 3 5)(2 4) = (1 3)(3 5)(2 4).
There are many other ways of writing σ as a composition of transpositions, for instance

σ = (1 5)(1 3)(1 2)(1 4)(1 2),
but it is impossible to write it as a product of an even number of transpositions.
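The example can be checked mechanically. A small sketch (the function names are mine) computes the sign both by counting inversions and by counting transpositions, which must agree:

```python
from itertools import combinations

def sign_by_inversions(perm):
    """sgn(σ) = (−1)^N(σ), where N(σ) counts inversions: position pairs
    i < j with perm[i] > perm[j].  `perm` is one-line notation over 1..n."""
    n_inv = sum(1 for i, j in combinations(range(len(perm)), 2)
                if perm[i] > perm[j])
    return -1 if n_inv % 2 else 1

def sign_by_transpositions(perm):
    """sgn(σ) = (−1)^m: sort the one-line form with swaps (each swap is
    a transposition) and count the swaps used."""
    p = list(perm)
    m = 0
    for i in range(len(p)):
        j = p.index(i + 1)
        if j != i:
            p[i], p[j] = p[j], p[i]
            m += 1
    return -1 if m % 2 else 1
```

For the permutation 34521 above, both functions return −1, confirming that it is odd.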
Properties
The identity permutation is an even permutation. An even permutation can be obtai |
https://en.wikipedia.org/wiki/ADF-H%20domain | In molecular biology, ADF-H domain (actin-depolymerising factor homology domain) is an approximately 150 amino acid motif that is present in three phylogenetically distinct classes of eukaryotic actin-binding proteins.
ADF/cofilins, which include ADF, cofilin, destrin, actophorin, coactosin, depactin and glia maturation factors (GMFs) beta and gamma. ADF/cofilins are small actin-binding proteins composed of a single ADF-H domain. They bind both actin monomers and filaments and promote rapid filament turnover in cells by depolymerising/fragmenting actin filaments. ADF/cofilins bind ADP-actin with higher affinity than ATP-actin and inhibit the spontaneous nucleotide exchange on actin monomers.
Twinfilins, which are actin monomer-binding proteins that are composed of two ADF-H domains.
Abp1/Drebrins, which are relatively large proteins composed of an N-terminal ADF-H domain followed by a variable region and a C-terminal SH3 domain. Abp1/Drebrins interact only with actin filaments and do not promote filament depolymerisation or fragmentation. Although these proteins are biochemically distinct and play different roles in actin dynamics, they all appear to use the ADF-H domain for their interactions with actin.
The ADF-H domain consists of a six-stranded mixed beta-sheet in which the four central strands (beta2–beta5) are anti-parallel and the two edge strands (beta1 and beta6) run parallel with the neighbouring strands. The sheet is surrounded by two alpha-helices on each side.
https://en.wikipedia.org/wiki/Genetics%20Society%20of%20America%20Medal | The Genetics Society of America Medal is a medal awarded by the Genetics Society of America (GSA) for outstanding contributions to the field of genetics in the last 15 years.
The medal was established by the society in 1981 and recognizes members who have made recent contributions to the field.
Award recipients
Source: Genetics Society of America
2022 Margaret Fuller, Stanford University School of Medicine
2021 Douglas Koshland
2020 Bonnie Bassler, Princeton University
2019 Anne Villeneuve
2018 Mariana Wolfner
2017 David Kingsley
2016 Detlef Weigel
2015 Steven Henikoff
2014 Angelika B. Amon
2013 Elaine Ostrander
2012 Joanne Chory
2011 John Carlson
2010 Barbara J. Meyer
2009 Marian Carlson
2008 Susan Lindquist
2007 Shirley M. Tilghman
2006 Victor Ambros
2005 Stephen J. Elledge
2004 Trudy F. Mackay
2003 Jeffrey C. Hall
2002 Andrew Z. Fire (Nobel Prize in Physiology or Medicine)
2001 H. Robert Horvitz (Nobel Prize in Physiology or Medicine)
2000 Jack W. Szostak (Nobel Prize in Physiology or Medicine)
1999
1998 Ronald W. Davis
1997 Christine Guthrie
1996 Elliot Meyerowitz
1995 Eric F. Wieschaus (Nobel Prize in Physiology or Medicine)
1994 Leland H. Hartwell (Nobel Prize in Physiology or Medicine)
1993 Jonathan R. Beckwith
1992 Maynard V. Olson
1991 Bruce S. Baker
1990 Nancy Kleckner
1989 Allan C. Spradling
1988 David Botstein and Ira Herskowitz
1987 Sydney Brenner (Nobel Prize in Physiology or Medicine)
1986 Gerald Rubin
1985 Philip Leder
1984 David S. Hogness
1983 Charles Yanofsky
1982 Gerald R. Fink
1981 Beatrice Mintz
See also
List of genetics awards |
https://en.wikipedia.org/wiki/Nicholson%E2%80%93Bailey%20model | The Nicholson–Bailey model was developed in the 1930s to describe the population dynamics of a coupled host–parasitoid system. It is named after Alexander John Nicholson and Victor Albert Bailey. Host–parasite and prey–predator systems can also be represented with the Nicholson–Bailey model. The model is closely related to the Lotka–Volterra model, which describes the dynamics of antagonistic populations (prey and predators) using differential equations.
The model uses (discrete-time) difference equations to describe the growth of host–parasitoid populations. It assumes that parasitoids search for hosts at random, and that both parasitoids and hosts are distributed in a non-contiguous ("clumped") fashion in the environment. In its original form, the model does not allow for stable coexistence; subsequent refinements, notably the addition of density dependence to several terms, allow such coexistence.
Equations
Derivation
The model is defined in discrete time. It is usually expressed as

H_(t+1) = k H_t e^(−a P_t)
P_(t+1) = c H_t (1 − e^(−a P_t))

with H the population size of the host, P the population size of the parasitoid, k the reproductive rate of the host, a the searching efficiency of the parasitoid, and c the average number of viable eggs that a parasitoid lays on a single host.
This model can be explained in terms of probability: e^(−a P_t) is the probability that a host will survive the parasitoids, whereas 1 − e^(−a P_t) is the probability that it will not, bearing in mind that the parasitoid egg laid in it will eventually hatch into a larva.
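The update rule is simple to iterate directly. A minimal simulation sketch (function name and parameter values are mine, chosen only for illustration):

```python
import math

def nicholson_bailey(H0, P0, k, a, c, steps):
    """Iterate H_(t+1) = k·H_t·exp(−a·P_t), P_(t+1) = c·H_t·(1 − exp(−a·P_t))
    and return the trajectory [(H_0, P_0), ..., (H_steps, P_steps)]."""
    H, P = H0, P0
    trajectory = [(H, P)]
    for _ in range(steps):
        escape = math.exp(-a * P)  # probability a host escapes parasitism
        H, P = k * H * escape, c * H * (1 - escape)
        trajectory.append((H, P))
    return trajectory
```

Starting exactly at an equilibrium, the populations stay put; starting even slightly off it (for k > 1), the oscillations grow from step to step, illustrating the instability of the original model.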
Analysis of the Nicholson–Bailey model
When k < 1, (0, 0) is the unique non-negative fixed point and all non-negative solutions converge to it. When k = 1, all non-negative solutions lie on the level curves of a certain function and converge to a fixed point on the H-axis. When k > 1, this system admits one unstable positive fixed point, at

(H*, P*) = (k ln k / (a c (k − 1)), ln k / a).
It has been proven that all positive solutions whose initial conditions are not equal to (H*, P*) are unbounded and exhibit oscillations with infinitely increasing amplitude.
V |
https://en.wikipedia.org/wiki/Vegan%20Awareness%20Foundation | The Vegan Awareness Foundation, also known as Vegan Action, is a 501(c)(3) non-profit organization in Virginia, United States, founded in 1995. Its declared goal is to help animals, the environment, and human health by educating the public about the benefits of a vegan lifestyle and encouraging the spread of vegan food options through public outreach campaigns. One of the goals of Vegan Action is to grow the vegan marketplace and increase the availability of vegan products. It has introduced a logo to certify vegan products, brought vegan food options into schools nationwide, and promoted the ideas behind veganism.
Activities and campaigns
Vegan certification
Since 2000, Vegan Action has administered a "Certified Vegan" logo, a registered trademark applied to foods, clothing, cosmetics and other items that contain no animal products and are not tested on animals. The logo is aimed at consumers interested in vegan products and helps vegans shop without consulting ingredient lists. Since its introduction, the logo has been used by over 800 companies on thousands of products.
McVegan
Each year, Vegan Action helps people worldwide organize McVegan events, which involve passing out thousands of free vegan food samples and dietary information to the public. In 1995, McDonald's threatened to sue Vegan Action for trademark infringement over McVegan shirts which featured McDonald's well-known golden arches with the logo "McVegan. Billions and billions saved." Instead of backing down, Vegan Action enlisted pro-bono services from an intellectual property legal firm and developed a defense based on the First Amendment's protection of parody. The story was picked up by The Los Angeles Times, San Francisco Examiner, Houston Chronicle, Chicago Tribune, the AP and UPI wires, National Public Radio, and four local television networks. After two weeks of widespread press, McDonald's backed down and formally withdrew their threat of legal action.
Dorm food
Vegan Action has worke |
https://en.wikipedia.org/wiki/List%20of%20marine%20ecoregions | The following is a list of marine ecoregions, as defined by the WWF and The Nature Conservancy.
The WWF/Nature Conservancy scheme groups the individual ecoregions into 12 marine realms, which represent the broad latitudinal divisions of polar, temperate, and tropical seas, with subdivisions based on ocean basins. The marine realms are subdivided into 62 marine provinces, which include one or more of the 232 marine ecoregions.
The WWF/Nature Conservancy scheme currently encompasses only coastal and continental shelf areas.
Arctic realm
(no provinces identified)
North Greenland
North and East Iceland
East Greenland Shelf
West Greenland Shelf
Northern Grand Banks-Southern Labrador
Northern Labrador
Baffin Bay-Davis Strait
Hudson Complex
Lancaster Sound
High Arctic Archipelago
Beaufort-Amundsen-Viscount Melville-Queen Maud
Beaufort Sea-continental coast and shelf
Chukchi Sea
Eastern Bering Sea
East Siberian Sea
Laptev Sea
Kara Sea
North and East Barents Sea
White Sea
Temperate Northern Atlantic
Northern European Seas
South and West Iceland
Faroe Plateau
Southern Norway
Northern Norway and Finnmark
Baltic Sea
North Sea
Celtic Seas
Lusitanian
South European Atlantic Shelf
Saharan Upwelling
Azores Canaries Madeira
Mediterranean Sea
Adriatic Sea
Aegean Sea
Levantine Sea
Tunisian Plateau/Gulf of Sidra
Ionian Sea
Western Mediterranean
Alboran Sea
Black Sea
Black Sea
Cold Temperate Northwest Atlantic
Gulf of St. Lawrence-Eastern Scotian Shelf
Southern Grand Banks-South Newfoundland
Scotian Shelf
Gulf of Maine-Bay of Fundy
Virginian
Warm Temperate Northwest Atlantic
Carolinian
Northern Gulf of Mexico
Temperate Northern Pacific
Cold Temperate Northwest Pacific
Sea of Okhotsk
Kamchatka Shelf and Coast
Oyashio Current
Northern Honshu
Sea of Japan
Yellow Sea
Warm Temperate Northwest Pacific
Central Kuroshio Current
East China Sea
Cold Temperate Northeast Pacific
Aleutian Islands
Gulf of Alaska
North American Pacific Fjordland
Puget Trough/Georgia Basin
Oregon, Washin |
https://en.wikipedia.org/wiki/Exome%20sequencing | Exome sequencing, also known as whole exome sequencing (WES), is a genomic technique for sequencing all of the protein-coding regions of genes in a genome (known as the exome). It consists of two steps: the first step is to select only the subset of DNA that encodes proteins. These regions are known as exons—humans have about 180,000 exons, constituting about 1% of the human genome, or approximately 30 million base pairs. The second step is to sequence the exonic DNA using any high-throughput DNA sequencing technology.
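The quoted figures are mutually consistent; a quick arithmetic check (the ~3.1 Gbp haploid genome size is my assumption, not stated above):

```python
genome_bp = 3.1e9   # approximate haploid human genome size (assumed)
exome_bp = 30e6     # ~30 million exonic base pairs, per the text
fraction = exome_bp / genome_bp
print(f"exome fraction of genome: {fraction:.2%}")  # prints: exome fraction of genome: 0.97%
```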
The goal of this approach is to identify genetic variants that alter protein sequences, and to do this at a much lower cost than whole-genome sequencing. Since these variants can be responsible for both Mendelian and common polygenic diseases, such as Alzheimer's disease, whole exome sequencing has been applied both in academic research and as a clinical diagnostic.
Motivation and comparison to other approaches
Exome sequencing is especially effective in the study of rare Mendelian diseases, because it is an efficient way to identify the genetic variants in all of an individual's genes. These diseases are most often caused by very rare genetic variants that are only present in a tiny number of individuals; by contrast, techniques such as SNP arrays can only detect shared genetic variants that are common to many individuals in the wider population. Furthermore, because severe disease-causing variants are much more likely (but by no means exclusively) to be in the protein coding sequence, focusing on this 1% costs far less than whole genome sequencing but still detects a high yield of relevant variants.
In the past, clinical genetic tests were chosen based on the clinical presentation of the patient (i.e. focused on one gene or a small number known to be associated with a particular syndrome), or surveyed only certain types of variation (e.g. comparative genomic hybridization) but provided definitive genetic diagnoses in fewer than half of all patien |
https://en.wikipedia.org/wiki/Psi-theory | Psi-theory, developed by Dietrich Dörner at the University of Bamberg, is a systemic psychological theory covering human action regulation, intention selection and emotion. It models the human mind as an information processing agent, controlled by a set of basic physiological, social and cognitive drives. Perceptual and cognitive processing are directed and modulated by these drives, which allow the autonomous establishment and pursuit of goals in an open environment.
Next to the motivational and emotional system, Psi-theory suggests a neuro-symbolic model of representation, which encodes semantic relationships in a hierarchical spreading activation network. The representations are grounded in sensors and actuators, and are acquired by autonomous exploration.
Main assumptions
The concepts of Psi-theory may be reduced to a set of basic assumptions. Psi-theory describes a cognitive system as a structure consisting of relationships and dependencies that is designed to maintain a homeostatic balance in the face of a dynamic environment.
Representation
Psi-theory suggests hierarchical networks of nodes as a universal mode of representation for declarative, procedural and tacit knowledge. These nodes may encode localist and distributed representations. The activity of the system is modeled using modulated and directional spreading of activation within these networks.
Plans, episodes, situations and objects are described with a semantic network formalism that relies on a fixed number of pre-defined link types, which especially encode causal/sequential ordering, and partonomic hierarchies (the theory specifies four basic link-types). Special nodes (representing neural circuits) control the spread of activation and the forming of temporary or permanent associations and their dissociations.
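As an illustration only (node names, weights, and the decay factor are invented; the theory itself specifies richer link types and control nodes), directional spreading of activation over such a network can be sketched as:

```python
def spread_activation(edges, activation, decay=0.5, steps=1):
    """Propagate activation along directed, weighted links.
    edges: {source: [(target, weight), ...]};  activation: {node: level}."""
    for _ in range(steps):
        nxt = dict(activation)
        for src, links in edges.items():
            for dst, w in links:
                nxt[dst] = nxt.get(dst, 0.0) + decay * w * activation.get(src, 0.0)
        activation = nxt
    return activation

# A toy partonomic fragment: activating "door" primes "room", then "house".
edges = {"door": [("room", 1.0)], "room": [("house", 1.0)]}
primed = spread_activation(edges, {"door": 1.0}, steps=2)
```

After two sweeps, activation has propagated one link per step, with the decay factor attenuating more distant associations.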
Memory
At any time, the Psi agent possesses a world model (situation image). This is extrapolated into a branching expectation horizon (consisting of anticipated developments and acti |
https://en.wikipedia.org/wiki/Nasonov%27s%20gland | Nasonov's gland produces a pheromone used in recruitment in worker honeybees. The pheromone can serve the purposes of attracting workers to a settled swarm and draw bees who have lost their way back to the hive. It is used to recruit workers to food that lacks a characteristic scent and lead bees to water sources. The gland is located on the dorsal side of the abdomen. Its opening is located at the base of the last tergite at the tip of the abdomen.
The gland was first described in 1882 by the Russian zoologist Nikolai Viktorovich Nasonov (February 14, 1855 – February 11, 1939). Nasonov thought that the gland performed perspiration; it was Frederick William Lambert Sladen (May 30, 1876 - 1921) of England who in 1901 first proposed that the gland produced a pheromone.
See also
Nasonov pheromone |
https://en.wikipedia.org/wiki/CECPQ2 | In cryptography, Combined Elliptic-Curve and Post-Quantum 2 (CECPQ2) is a quantum secure modification to Transport Layer Security (TLS) 1.3 developed by Google. It is intended to be used experimentally, to help evaluate the performance of post quantum key-exchange algorithms on actual users' devices.
Details
Similarly to its predecessor CECPQ1, CECPQ2 aims to provide confidentiality against an attacker with a large-scale quantum computer. It is essentially a plugin for the TLS key-agreement stage. CECPQ2 combines two key exchange mechanisms: the classical X25519 and the HRSS (Hülsing, Rijneveld, Schanck, and Schwabe) scheme, an instantiation of the NTRU lattice-based key exchange primitive. Additionally, Kris Kwiatkowski has implemented and deployed an alternative version of the post-quantum key exchange algorithm, titled CECPQ2b. Similarly to CECPQ2, this is also a hybrid post-quantum key exchange scheme, based on supersingular isogeny key exchange (SIKE) instead of HRSS.
CECPQ2 uses 32 bytes of shared secret material derived from the classical X25519 mechanism, and 32 bytes of shared secret material derived from the quantum-secure HRSS mechanism. The resulting bytes are concatenated and used as secret key. Concatenation is meant to assure that the protocol provides at least the same security level as widely used X25519, should HRSS be found insecure.
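The combination step can be sketched as follows. This is an illustrative sketch only: the variable names and the final hash-based derivation step are assumptions, not the actual TLS 1.3 key schedule.

```python
# Illustrative sketch of CECPQ2-style hybrid secret combination.
# The placeholder secrets and the SHA-256 derivation step are
# assumptions for illustration, not the real TLS 1.3 key schedule.
import hashlib

x25519_secret = bytes(32)   # placeholder classical shared secret
hrss_secret = bytes(32)     # placeholder post-quantum shared secret

# Concatenate: the result is secure as long as either input is.
combined = x25519_secret + hrss_secret
assert len(combined) == 64

# Derive a session key from the concatenated material.
session_key = hashlib.sha256(combined).digest()
print(len(session_key))  # 32
```

Because the two secrets are simply concatenated before key derivation, an attacker must break both X25519 and HRSS to recover the derived key.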
The algorithm was to be deployed on both the server side using Cloudflare's infrastructure, and the client side using Google Chrome Canary. Since both parties need to support the algorithm for it to be chosen, this experiment is available only to Chrome Canary users accessing websites hosted by Cloudflare.
It was estimated that the experiment started mid-2019. It was considered a step in a general program at Cloudflare to transition to post-quantum safe cryptographic primitives.
Support for CECPQ2 was removed from BoringSSL in April 2023.
See also
Elliptic-curve Diffie–Hellman |
https://en.wikipedia.org/wiki/ChoKyun%20Rha | ChoKyun Rha (October 5, 1933 – March 2, 2021) was a Korean-born American food technologist, inventor, and professor of biomaterials science and engineering at the Massachusetts Institute of Technology (MIT). She was the first Asian woman awarded tenure at MIT.
Early life
ChoKyun Rha was born in Seoul, the daughter of SaeJin Rha and Young Soon Choi Rha. Her father was a physician and dean of the medical school at Seoul National University. She moved to the United States in 1956, and attended Miami University in Ohio, before enrolling at MIT as an undergraduate. She finished a bachelor's degree in 1962, with a senior thesis on the storage of dried scallions. She stayed at MIT to earn master's degrees in 1964 and 1966, and completed a doctoral degree in 1967, with a dissertation titled "Thermal Sterilization of Flexibly Packaged Foods".
Career
Rha was a professor of biomaterials science and engineering at MIT, until her retirement in 2006. In 1980, she became the first Asian woman to earn tenure at MIT. She helped establish Genzyme, a biotechnology firm, and founded and directed the Malaysia-MIT Biotechnology Partnership Program. She endowed a professorship in industrial biotechnology at MIT. She was a co-founder of Women’s World Banking, a microfinancing program.
Rha's research focused on biochemistry and biotechnology for food and other applications. Her work was published in academic journals including the Journal of Food Science, Nature Biotechnology, Applied Microbiology and Biotechnology, Bioresource Technology, Biotechnology Letters, and the British Journal of Nutrition. She earned the first of her several patents in 1988, for an encapsulation process. As part of her work in Malaysia, she developed several patented products derived from palm oil.
Publications
"Evaluation of cheese texture" (1978, with Cho Lee and Em Imoto)
"Microstructure of soybean protein aggregates and its relation to the physical and textural properties of the curd" (1978, with Cho Lee)
|
https://en.wikipedia.org/wiki/Reynolds%20analogy | The Reynolds analogy is popularly known to relate turbulent momentum and heat transfer. That is because in a turbulent flow (in a pipe or in a boundary layer) the transport of momentum and the transport of heat largely depend on the same turbulent eddies: the velocity and the temperature profiles have the same shape.
The main assumption is that heat flux q/A in a turbulent system is analogous to momentum flux τ, which suggests that the ratio τ/(q/A) must be constant for all radial positions.
The complete Reynolds analogy* is:
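In its standard textbook form (stated here for completeness; this is the usual form for Pr = Sc ≈ 1, not reproduced from the source), the complete analogy equates the skin-friction coefficient with the heat- and mass-transfer Stanton numbers:

```latex
\frac{C_f}{2} = \mathrm{St} = \mathrm{St}_m,
\qquad\text{i.e.}\qquad
\frac{\tau_s}{\rho V^2} = \frac{h}{\rho c_p V} = \frac{k_c'}{V}
```

where $\tau_s$ is the wall shear stress, $h$ the heat-transfer coefficient, $k_c'$ the mass-transfer coefficient, and $V$ the free-stream velocity.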
Experimental data for gas streams agree approximately with above equation if the Schmidt and Prandtl numbers are near 1.0 and only skin friction is present in flow past a flat plate or inside a pipe. When liquids are present and/or form drag is present, the analogy is conventionally known to be invalid.
In 2008, the qualitative validity of the Reynolds analogy was revisited for laminar flow of an incompressible fluid with variable dynamic viscosity (μ). It was shown that the inverse dependence between the Reynolds number (Re) and the skin-friction coefficient (cf) is the basis for the validity of the Reynolds analogy in laminar convective flows with both constant and variable μ. For constant μ it reduces to the popular form of the Stanton number (St) increasing with increasing Re, whereas for variable μ it reduces to St increasing with decreasing Re. Consequently, the Chilton–Colburn analogy of St·Pr^(2/3) increasing with increasing cf is qualitatively valid whenever the Reynolds analogy is valid. Further, the validity of the Reynolds analogy is linked to the applicability of Prigogine's theorem of minimum entropy production. Thus, the Reynolds analogy is valid for flows that are close to developed, for which changes in the gradients of the field variables (velocity and temperature) along the flow are small.
See also
Reynolds number
Chilton and Colburn J-factor analogy |
https://en.wikipedia.org/wiki/Information%20Object%20Class%20%28ASN.1%29 | ASN.1 Information Object Class is a concept widely used in ASN.1 specifications to address issues related to protocol specification similar to issues addressed by CORBA/IDL specifications.
Information Object Classes are used for example to specify ROSE (Remote Operations Service Element) protocol in ASN.1.
Abbreviations
Abbreviations used throughout this article:
ASN.1 Abstract Syntax Notation One
IOC Information Object Class
IOS Information Object Set
IO Information Object
SQL Structured Query Language
PER Packed Encoding Rules
BER Basic Encoding Rules
IDL Interface Definition Language
CORBA Common Object Request Broker Architecture
IIOP Internet Inter-ORB Protocol
Introduction
The simplest way of thinking of ASN.1 Information Object Classes is to regard them as a way to represent IDL specification in ASN.1 using concepts derived from the relational databases theory and SQL syntax in particular.
The concepts used in ASN.1 are more flexible than those used in IDL because, continuing the analogy, they allow one to "customize the grammar" of the "IDL specification". ASN.1 encoding rules are used as a transfer syntax for remote invocations that resemble CORBA/IIOP.
In the light of this comparison, we can draw an approximate analogy between concepts used in Information Object Classes and SQL and IDL concepts as shown in Table 1.
Analogy by example
Table 2 illustrates by example correspondence of ASN.1 concepts to similar constructs found in SQL and IDL.
Parameterization
If you carefully examine the ASN.1 example presented in Table 2 and compare it to IDL concepts, you will see one important limitation on the ASN.1 side.
Our example ASN.1 data types which we agreed to compare to a high-level CORBA/IDL transfer syntax specification are limited to definition of such transfer syntax only for a single instance of what we compared to an IDL interface (Information Object Set in ASN.1 terms).
In other words, such transfer syntax is not generic and |
https://en.wikipedia.org/wiki/Rng%20%28algebra%29 | In mathematics, and more specifically in abstract algebra, a rng (or non-unital ring or pseudo-ring) is an algebraic structure satisfying the same properties as a ring, but without assuming the existence of a multiplicative identity. The term rng is meant to suggest that it is a ring without i, that is, without the requirement for an identity element.
There is no consensus in the community as to whether the existence of a multiplicative identity must be one of the ring axioms. The term rng was coined to alleviate this ambiguity when people want to refer explicitly to a ring without the axiom of multiplicative identity.
A number of algebras of functions considered in analysis are not unital, for instance the algebra of functions decreasing to zero at infinity, especially those with compact support on some (non-compact) space.
Definition
Formally, a rng is a set R with two binary operations called addition and multiplication such that
(R, +) is an abelian group,
(R, ·) is a semigroup,
Multiplication distributes over addition.
A rng homomorphism is a function from one rng to another such that
f(x + y) = f(x) + f(y)
f(x · y) = f(x) · f(y)
for all x and y in R.
If R and S are rings, then a ring homomorphism is the same as a rng homomorphism that maps 1 to 1.
Examples
All rings are rngs. A simple example of a rng that is not a ring is given by the even integers with the ordinary addition and multiplication of integers. Another example is given by the set of all 3-by-3 real matrices whose bottom row is zero. Both of these examples are instances of the general fact that every (one- or two-sided) ideal is a rng.
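The even-integer example can be checked mechanically. The following sketch (illustrative only; it samples a finite range of even integers) verifies closure under addition and multiplication and confirms that no even integer acts as a multiplicative identity:

```python
# Sketch: the even integers 2Z form a rng but not a ring.
# We sample closure of + and * on a finite range, and check that
# no even integer e satisfies e*x == x for all sampled even x.

evens = range(-10, 12, 2)  # -10, -8, ..., 10

# Closure under addition and multiplication (on the sample)
assert all((a + b) % 2 == 0 and (a * b) % 2 == 0
           for a in evens for b in evens)

# No multiplicative identity: e*x == x would force e == 1,
# which is not even, so every candidate fails.
has_identity = any(all(e * x == x for x in evens) for e in evens)
assert not has_identity
print("2Z is closed under + and *, but has no multiplicative identity")
```

The same check applied to the full integers would of course find the identity 1, which is exactly the element the even integers lack.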
Rngs often appear naturally in functional analysis when linear operators on infinite-dimensional vector spaces are considered. Take for instance any infinite-dimensional vector space V and consider the set of all linear operators with finite rank. Together with addition and composition of operators, this is a rng, but not
https://en.wikipedia.org/wiki/David%20Gale | David Gale (December 13, 1921 – March 7, 2008) was an American mathematician and economist. He was a professor emeritus at the University of California, Berkeley, affiliated with the departments of mathematics, economics, and industrial engineering and operations research. He has contributed to the fields of mathematical economics, game theory, and convex analysis.
Gale earned his B.A. from Swarthmore College, obtained an M.A. from the University of Michigan in 1947, and earned his Ph.D. in Mathematics at Princeton University in 1949. He taught at Brown University from 1950 to 1965 and then joined the faculty at the University of California, Berkeley.
Gale lived in Berkeley, California, and Paris, France with his partner Sandra Gilbert, feminist literary scholar and poet. He has three daughters and two grandsons.
Contribution
Gale's contributions to mathematical economics include an early proof of the existence of competitive equilibrium and his solution of the n-dimensional Ramsey problem in the theory of optimal economic growth.
Gale and F. M. Stewart initiated the study of infinite games with perfect information. This work led to fundamental contributions to mathematical logic.
Gale is the inventor of the game of Bridg-It (also known as "Game of Gale") and Chomp.
Gale played a fundamental role in the development of the theory of linear programming and linear inequalities. His classic 1960 book The Theory of Linear Economic Models continues to be a standard reference for this area.
The Gale transform is an involution on sets of points in projective space. The concept is important in optimization, coding theory, and algebraic geometry.
Gale's 1962 paper with Lloyd Shapley on the stable marriage problem provides the first formal statement and proof of a problem that has far-reaching implications in many matching markets. The resulting Gale–Shapley algorithm is currently being applied in New York and Boston public school systems in assigning students to sch |
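The deferred-acceptance procedure from the Gale–Shapley paper can be sketched as follows. This is a minimal illustrative implementation; the variable names and the tiny example instance are invented for illustration.

```python
# Illustrative sketch of the Gale-Shapley deferred-acceptance
# algorithm: proposers propose in preference order; reviewers
# tentatively hold their best offer so far.

def gale_shapley(prop_prefs, rev_prefs):
    # rank[r][p] = position of proposer p in reviewer r's list
    rank = {r: {p: i for i, p in enumerate(prefs)}
            for r, prefs in rev_prefs.items()}
    free = list(prop_prefs)            # proposers not yet matched
    next_choice = {p: 0 for p in prop_prefs}
    match = {}                         # reviewer -> proposer
    while free:
        p = free.pop()
        r = prop_prefs[p][next_choice[p]]
        next_choice[p] += 1
        if r not in match:
            match[r] = p               # reviewer was unmatched
        elif rank[r][p] < rank[r][match[r]]:
            free.append(match[r])      # bump the weaker proposer
            match[r] = p
        else:
            free.append(p)             # rejected; try next preference
    return match

men = {"m1": ["w1", "w2"], "m2": ["w1", "w2"]}
women = {"w1": ["m2", "m1"], "w2": ["m1", "m2"]}
print(gale_shapley(men, women))  # {'w1': 'm2', 'w2': 'm1'}
```

The resulting matching is stable: no man and woman both prefer each other to their assigned partners, which is the property exploited by the school-assignment systems mentioned above.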
https://en.wikipedia.org/wiki/Castleman%20Disease%20Collaborative%20Network | The Castleman Disease Collaborative Network (CDCN) is an organization focused on research and awareness of Castleman disease. It was founded in 2012 and has used a collaborative network approach to advance several research studies on Castleman disease.
History
The Castleman Disease Collaborative Network was founded in 2012 by Dr. Frits van Rhee and Dr. David Fajgenbaum, after Fajgenbaum was diagnosed with idiopathic multicentric Castleman disease as a medical student in 2010. Soon after its creation, the CDCN merged with the Castleman's Awareness and Research Effort (CARE).
Fajgenbaum has served as the executive director of the CDCN since its founding.
Activities
Research
The CDCN provides grant funding to support research on the etiology, pathogenesis, diagnosis, and management of Castleman disease. The CDCN is involved in longitudinal research initiatives designed to facilitate multiple projects.
ACCELERATE Natural History Study
The ACCELERATE (Accelerating Castleman Care with an Electronic Longitudinal registry, E-Repository, And Treatment Effectiveness research) natural history study is a collaborative project between the CDCN, Janssen Pharmaceuticals, and the University of Pennsylvania. It is a database of clinical information drawn from patients diagnosed with Castleman disease and includes symptoms, laboratory tests, imaging, pathology, and treatment approaches. The ACCELERATE study was designed to document the natural history of Castleman disease, the range of clinical features associated with the disease, and response to treatment. Patients can enroll themselves in the ACCELERATE study online.
Castlebank
The Castlebank is a collaboration between the CDCN and the University of Pennsylvania to house a centralized biobank of tissue samples donated by patients with Castleman disease and collected from researchers around the world. The Castlebank is used to support collaborative research projects requiring tissue samples.
Drug Repurposing
The CDCN is comm |
https://en.wikipedia.org/wiki/Confused%20flour%20beetle | The confused flour beetle (Tribolium confusum), a type of darkling beetle known as a flour beetle, is a common pest insect known for attacking and infesting stored flour and grain. They are one of the most common and most destructive insect pests for grain and other food products stored in silos, warehouses, grocery stores, and homes.
The "confused" in the beetle's name is due to it being confused with the red flour beetle, not because of its walking pattern.
Description
The confused flour beetle is very similar in appearance and habit to the red flour beetle, Tribolium castaneum and the destructive flour beetle, Tribolium destructor. Both the confused flour beetle and red flour beetle are small, about 3–6 mm (1/8-1/4 inch) in length, and reddish-brown in color. The primary distinguishing physical difference is the shape of their antennae: the confused flour beetle's antennae increase gradually in size and have four clubs, while the red flour beetle's antennae have only three. Additionally, red flour beetles have been known to fly short distances, while confused flour beetles do not. Tribolium destructor is much darker than either and less common.
Ecology
While confused (and red) flour beetles cannot feed on whole, undamaged grain, they are often found in large numbers in infested grains, feeding on broken grain, grain dust, and other household food items such as flour, rice, dried fruit, nuts, and beans. Both types of beetles are often found not only in infested grains, but in crevices in pantries and cabinets as well. Damage to food is caused somewhat by the beetles' feeding, but also by their dead bodies, fecal pellets, and foul-smelling secretions. In addition to creating a foul odor, the beetles' presence encourages the growth of mold.
Confused flour beetles are a common model organism in science. Several confused flour beetles were experimental subjects on the Bion 1 spacecraft, launched in 1973.
See also
Home stored product entomology |
https://en.wikipedia.org/wiki/Macdonald%20identities | In mathematics, the Macdonald identities are some infinite product identities associated to affine root systems, introduced by . They include as special cases the Jacobi triple product identity, Watson's quintuple product identity, several identities found by , and a 10-fold product identity found by .
and pointed out that the Macdonald identities are the analogs of the Weyl denominator formula for affine Kac–Moody algebras and superalgebras. |
https://en.wikipedia.org/wiki/Sputnik-1%20EMC/EMI%20lab%20model | Sputnik 1 EMC/EMI is a class of full-scale laboratory models of the Soviet Sputnik 1 satellite, made for ground testing of electromagnetic compatibility (EMC) and electromagnetic interference (EMI). The models, manufactured by OKB-1 and NII-885 (headed by Mikhail Ryazansky), were introduced on February 15, 1957.
Sputnik 1 EMC/EMI Lab Model Number 001
The first test model, Sputnik 1 EMC/EMI Lab Model 001, made on February 15, 1957, is located in the Deutsches Technikmuseum in Berlin, Germany. It comes from the former collections of the Russian institute NII-885, whose director at the time was Sputnik designer Dr. Mikhail Ryazansky. In 2007, on the occasion of the 50th anniversary of the launch of Sputnik 1, this first test model was put on public display at the Deutsches Technikmuseum.
Sputnik 1 EMC/EMI Lab Models Number 002 and 003
The other Sputnik 1 EMC/EMI Lab test models carry sequence numbers 002 and 003.
Date of production: 1957.
Producer: OKB-1 and NII-885 (team leader: Dr. Mikhail Ryazansky).
On July 20, 2016, a working Sputnik-1 EMC/EMI model, serial number 003, was sold at Bonhams in New York for US$269,000. It featured a still-operational transmitter and four antennas. Of four known Sputnik-1 test articles, this was the only one known to be functional.
Serial numbers 002 and 003 are held in private collections; both models were auctioned off by Bonhams, New York.
Two View Mockups Sputnik 1
Apart from the functional laboratory EMC/EMI models 001, 002, and 003, there are two view mockups, which do not contain any active radio equipment or electronics.
Of four known models, two reside in private hands, one is located at the Energia Corporate Museum outside Moscow, and one, lacking internal components, is displayed at the Museum of Flight in Seattle, Washington, US. |
https://en.wikipedia.org/wiki/Glutamine%20amidotransferase | In molecular biology, glutamine amidotransferases (GATase) are enzymes which catalyse the removal of the ammonia group from a glutamine molecule and its subsequent transfer to a specific substrate, thus creating a new carbon-nitrogen group on the substrate. This activity is found in a range of biosynthetic enzymes, including glutamine amidotransferase, anthranilate synthase component II, p-aminobenzoate, and glutamine-dependent carbamoyl-transferase (CPSase). Glutamine amidotransferase (GATase) domains can occur either as single polypeptides, as in glutamine amidotransferases, or as domains in a much larger multifunctional synthase protein, such as CPSase. On the basis of sequence similarities two classes of GATase domains have been identified: class-I (also known as trpG-type) and class-II (also known as purF-type). Class-I GATase domains are defined by a conserved catalytic triad consisting of cysteine, histidine and glutamate. Class-I GATase domains have been found in the following enzymes: the second component of anthranilate synthase and 4-amino-4-deoxychorismate (ADC) synthase; CTP synthase; GMP synthase; glutamine-dependent carbamoyl-phosphate synthase; phosphoribosylformylglycinamidine synthase II; and the histidine amidotransferase hisH. |
https://en.wikipedia.org/wiki/Eiectus | Eiectus is a potentially valid genus of extinct short-necked pliosaur that lived in the Early Cretaceous period. Fossil material has been recovered from the Wallumbilla Formation (Aptian) of Queensland and was initially classified under the related genus Kronosaurus until 2021.
History
Initial discoveries
A partial skull previously assigned to Kronosaurus queenslandicus that was discovered in 1929 in the same place as the holotype of K. queenslandicus probably belonged to Eiectus, and another skull discovered in 1935 near Telemon Station in Hughenden, Queensland and prepared in May 1936 may have also belonged to Eiectus, along with all other Albian remains previously referred to K. queenslandicus.
MCZ 1285: the Harvard specimen
In 1931 the Harvard Museum of Comparative Zoology (MCZ) sent an expedition to Australia for the dual purpose of procuring specimens – the museum being "weak in Australian animals and...desires[ing] to complete its series" – and to engage in "the study of the animals of the region when alive." The Harvard Australian Expedition (1931–1932), as it became known, was a six-man venture led by Harvard Professor William Morton Wheeler, with the others being Dr. P. Jackson Darlington Jr. (a renowned coleopterist), Dr. Glover Morrill Allen and his student Ralph Nicholson Ellis, medical officer Dr. Ira M. Dixon, and William E. Schevill (a graduate-student in his twenties and Associate Curator of Invertebrate Palaeontology). MCZ director Thomas Barbour said at the time "We shall hope for specimens' of the kangaroo, the wombat, the Tasmanian devil and Tasmanian wolf," and the mission was a success with over 300 mammal and thousands of insect specimens returning to the United States. Yet Mr. Schevill, the team's fossil enthusiast, remained in Australia after the others had departed and, in the winter of 1932, was told by the rancher R.W.H. Thomas of rocks with something "odd" poking out of them on his property near Hughenden. The rocks were limestone nodul |
https://en.wikipedia.org/wiki/4-Aminopyridine | 4-Aminopyridine (4-AP, fampridine, dalfampridine) is an organic compound with the chemical formula C5H4N–NH2. The molecule is one of the three isomeric amines of pyridine. It is used as a research tool in characterizing subtypes of the potassium channel. It has also been used as a drug, to manage some of the symptoms of multiple sclerosis, and is indicated for symptomatic improvement of walking in adults with several variations of the disease. It was undergoing Phase III clinical trials, and the U.S. Food and Drug Administration (FDA) approved the compound on January 22, 2010. Fampridine is also marketed as Ampyra (pronounced "am-PEER-ah," according to the maker's website) in the United States by Acorda Therapeutics and as Fampyra in the European Union, Canada, and Australia. In Canada, the medication has been approved for use by Health Canada since February 10, 2012.
Applications
In the laboratory, 4-AP is a useful pharmacological tool in studying various potassium conductances in physiology and biophysics. It is a relatively selective blocker of members of Kv1 (Shaker, KCNA) family of voltage-activated K+ channels. However, 4-AP has been shown to potentiate voltage-gated Ca2+ channel currents independent of effects on voltage-activated K+ channels.
Convulsant activity
4-Aminopyridine is a potent convulsant and is used to generate seizures in animal models for the evaluation of antiseizure agents.
Vertebrate pesticide
4-Aminopyridine is also used under the trade name Avitrol as 0.5% or 1% in bird control bait. It causes convulsions and, infrequently, death, depending on dosage. The manufacturer says the proper dose should cause epileptic-like convulsions which cause the poisoned birds to emit distress calls resulting in the flock leaving the site; if the dose was sub-lethal, the birds will recover after 4 or more hours without long-term ill effect. The amount of bait should be limited so that relatively few birds are poisoned, causing the remainder of the fl |
https://en.wikipedia.org/wiki/Subphrenic%20abscess | Subphrenic abscess is a disease characterized by an accumulation of infected fluid between the diaphragm, liver, and spleen. This abscess develops after surgical operations like splenectomy.
The condition presents with cough, increased respiratory rate with shallow respiration, diminished or absent breath sounds, hiccups, dullness on percussion, tenderness over the 8th–11th ribs, fever, chills, anorexia, and shoulder-tip pain on the affected side. Lack of treatment or misdiagnosis can quickly lead to sepsis, septic shock, and death.
Patients who develop peritonitis may get localized abscesses in the right or left subphrenic space. The right side is more common due to the high frequency of ruptured appendices and perforated duodenal ulcers.
Two common approaches to draining a subphrenic abscess are 1) incision inferior to or through the bed of the 12th rib (no need to create an opening in the pleura or peritoneum) 2) an anterior subphrenic abscess is often drained through a subcostal incision located inferior and parallel to the right costal margin. It is also associated with peritonitis. |
https://en.wikipedia.org/wiki/Nikolai%20G%C3%BCnther | Nikolai Maximovich Günther (, also transliterated as Nicholas M. Gunther or N. M. Gjunter) ( – May 4, 1941) was a Russian mathematician known for his work in potential theory and in integral and partial differential equations: later studies have uncovered his contributions to the theory of Gröbner bases.
He was an Invited Speaker of the ICM in 1924 at Toronto, in 1928 at Bologna, and in 1932 at Zurich.
Selected publications
. A large paper aimed at showing the applications of Radon integrals to problems of mathematical physics: the Mathematical Reviews review refers to a 1949 reprint published by the Chelsea Publishing Company.
.
, reviewed also by and by .
. The second edition of the monograph , now a classical textbook in potential theory, translated from the Russian original (edition prepared by V. I. Smirnov and H. L. Smolitskii), which was also translated into German as .
See also
Harmonic function
Integral equation
Radon measure
Notes |
https://en.wikipedia.org/wiki/Maxillary%20tuberosity | At the lower part of the infratemporal surface of the maxilla is a rounded eminence, the maxillary tuberosity, especially prominent after the growth of the wisdom tooth; it is rough on its lateral side for articulation with the pyramidal process of the palatine bone and in some cases articulates with the lateral pterygoid plate of the sphenoid.
It gives origin to a few fibers of the medial pterygoid muscle. |
https://en.wikipedia.org/wiki/Spatiotemporal%20Epidemiological%20Modeler | The Spatiotemporal Epidemiological Modeler (STEM) is free software available through the Eclipse Foundation. Originally developed by IBM Research, STEM is a framework and development tool designed to help scientists create and use spatial and temporal models of infectious disease. STEM uses a component software architecture based on the OSGi standard. The Eclipse Equinox platform is a reference implementation of that standard. By using a component software architecture, all of the components or elements required for a disease model, including the code and the data, are available as software building blocks that can be independently exchanged, extended, reused, or replaced. These building blocks are called Eclipse "plug-ins" or "extensions". STEM plug-ins contain denominator data for administrative regions of interest. The regions are indexed by standard (ISO 3166) codes.
STEM currently includes a large number of plug-ins for the 244 countries and dependent areas defined by the Geographic Coding Standard maintained by the International Organization for Standardization. These plug-ins contain global data including geographic data, population data, demographics, and basic models of disease. The disease models distributed with STEM include epidemiological compartment models. Other plug-ins describe relationships between regions including nearest-neighbor or adjacency relationships as well as information about transportation, such as connections by roads and a model of air transportation.
Relationships between regions can then be included in models of how a disease spreads from place to place. To accomplish this, STEM represents the world as a "graph". The nodes in the graph correspond to places or regions, and the edges in the graph describe relationships or connections between regions. Both the nodes and the edges can be labeled or "decorated" with a variety of denominator data and models. This graphical representation is implemented using the Eclipse Mo |
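The regions-as-graph idea described above can be sketched as follows. This is a toy illustration, not STEM's actual API: the region names, the edge mixing weight, and the transmission rate are all assumed values.

```python
# Toy illustration (not STEM's API): regions are graph nodes holding
# compartment data; edges carry a mixing weight; one step of a simple
# compartmental infection spread runs within and between regions.

nodes = {"A": {"S": 990.0, "I": 10.0}, "B": {"S": 1000.0, "I": 0.0}}
edges = [("A", "B", 0.05)]  # (region, region, mixing weight)

beta = 0.3  # assumed transmission rate

def step(nodes, edges, beta):
    new_inf = {n: 0.0 for n in nodes}
    # within-region transmission
    for n, c in nodes.items():
        total = c["S"] + c["I"]
        new_inf[n] += beta * c["S"] * c["I"] / total
    # between-region transmission along graph edges (both directions)
    for a, b, w in edges:
        for src, dst in ((a, b), (b, a)):
            tot = sum(nodes[dst].values())
            new_inf[dst] += w * beta * nodes[dst]["S"] * nodes[src]["I"] / tot
    for n, d in nodes.items():
        d["S"] -= new_inf[n]
        d["I"] += new_inf[n]
    return nodes

step(nodes, edges, beta)
print(round(nodes["B"]["I"], 3))  # 0.15
```

Because the edge carries a mixing weight, infection in region A seeds region B even though B starts with no cases, which is exactly the role of the labeled edges in STEM's graph.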
https://en.wikipedia.org/wiki/Pectoral%20axillary%20lymph%20nodes | An anterior or pectoral group consists of four or five glands along the lower border of the Pectoralis minor, in relation with the lateral thoracic artery.
Their afferents drain the skin and muscles of the anterior and lateral thoracic walls, and the central and lateral parts of the mamma; their efferents pass partly to the central and partly to the subclavicular groups of axillary glands.
Additional images |
https://en.wikipedia.org/wiki/Absorption%20%28chemistry%29 | In chemistry, absorption is a physical or chemical phenomenon or a process in which atoms, molecules or ions enter some bulk phase – liquid or solid material. This is a different process from adsorption, since molecules undergoing absorption are taken up by the volume, not by the surface (as in the case for adsorption).
A more common definition is that "Absorption is a chemical or physical phenomenon in which the molecules, atoms and ions of the substance getting absorbed enter into the bulk phase (gas, liquid or solid) of the material in which it is taken up."
A more general term is sorption, which covers absorption, adsorption, and ion exchange. Loosely, absorption is simply the condition in which one substance takes in another.
In many processes important in technology, the chemical absorption is used in place of the physical process, e.g., absorption of carbon dioxide by sodium hydroxide – such acid-base processes do not follow the Nernst partition law (see: solubility).
For some examples of this effect, see liquid-liquid extraction. It is possible to extract a solute from one liquid phase to another without a chemical reaction. Examples of such solutes are noble gases and osmium tetroxide.
Absorption can also mean that a substance captures and transforms energy. An absorbent distributes the material it captures throughout its whole volume, whereas an adsorbent holds it only on its surface. The process by which a gas or liquid penetrates into the bulk of an absorbent is known as absorption.
Equation
If absorption is a physical process not accompanied by any other physical or chemical process, it usually follows the Nernst distribution law:
"the ratio of concentrations of some solute species in two bulk phases that are in contact and at equilibrium is constant for a given solute and pair of bulk phases":
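Written out for a solute x partitioned between bulk phases 1 and 2, the law takes the standard form (standard textbook notation, supplied for completeness):

```latex
K_N(x) = \frac{[x]_1}{[x]_2}
```

where $[x]_1$ and $[x]_2$ are the equilibrium concentrations of x in phases 1 and 2.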
The value of constant KN depends on temperature and is called partition coefficient. This equation is valid if concentrations are not too large and if the species "x" |
https://en.wikipedia.org/wiki/Comcast | Comcast Corporation (formerly known as American Cable Systems and Comcast Holdings, stylized in all caps), incorporated and headquartered in Philadelphia, is the largest American multinational telecommunications and media conglomerate. The corporation is the second-largest broadcasting and cable television company in the world by revenue (behind AT&T), and is also the largest pay-TV company, the largest cable TV company, and largest home Internet service provider in the United States. Comcast is additionally the nation's third-largest home telephone service provider. It provides services to U.S. residential and commercial customers in 40 states and the District of Columbia. As the owner of the international media company NBCUniversal since 2011, Comcast is also a high-volume producer of feature films for theatrical exhibition and television programming, and a theme park operator.
Comcast owns and operates the Xfinity residential cable communications business segment and division; Comcast Business, a commercial services provider; and Xfinity Mobile, an MVNO of Verizon. Through NBCUniversal, it also owns and operates over-the-air national broadcast network channels such as NBC, Telemundo, TeleXitos, and Cozi TV; multiple cable-only channels such as MSNBC, CNBC, USA Network, Syfy, Oxygen, Bravo, and E!; the film studio Universal Pictures; the VOD streaming service Peacock; animation studios DreamWorks Animation, Illumination, and Universal Animation Studios; and Universal Destinations & Experiences. It also has significant holdings in digital distribution, such as thePlatform, which it acquired in 2006; and ad-tech company FreeWheel, which it acquired in 2014. Since October 2018, it has also been the parent company of Sky Group.
Comcast has been criticized and put under intense public scrutiny for a variety of reasons. Its customer satisfaction ratings were among the lowest in the cable industry during the years 2008–2010. It has violated net neutrality practices in |
https://en.wikipedia.org/wiki/Rawflow | RawFlow was a provider of live p2p streaming technology that enables internet broadcasting of audio and video. The company's technology is similar to Abacast and Octoshape.
Rawflow was incorporated in 2002 by Mikkel Dissing, Daniel Franklin and Stephen Dicks. Its main office was in London, UK. Velocix acquired RawFlow in July 2008. Alcatel-Lucent acquired Velocix in July 2009. The streaming media CDN solution by Alcatel-Lucent is called Velocix Digital Media Delivery Platform.
A peer-to-peer (or P2P) computer network relies on the computing power and bandwidth of the participants in the network rather than concentrating it in a relatively low number of servers. When using this technology, the bandwidth requirement of the broadcast is intelligently distributed over the entire network of participants, instead of being centralised at the broadcast's origin; as the audience grows so do the network resources available to distribute that broadcast without adding any additional bandwidth costs.
How does it work?
The RawFlow ICD (Intelligent Content Distribution) Server provides the initial contact point for clients in the network. When initially installed, the ICD Server connects to the broadcaster’s existing media server and begins receiving the stream, maintaining a buffer of stream data in memory at all times. It then begins accepting connections from clients and serving them with the stream. When launched by a user, the ICD Client first contacts the ICD Server and begins receiving the stream from it. The media player plays the stream as it is received by the ICD Client. If it has available resources, the ICD Client also accepts connections from other clients in the grid to which it may relay a portion or the whole stream it receives as requested.
The ICD Client monitors the quality of the stream it is receiving and upon any reduction of quality or loss of connection it again searches the grid for available resources while continuing to serve the media player fro |
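The server-first, then peer-relay fallback behaviour described above can be sketched roughly as follows. This is an illustrative sketch only; all class and method names are assumptions, not RawFlow's actual API.

```python
class ICDClient:
    """Illustrative sketch of an ICD-style client: stream from the server
    first, and on loss of the stream search the grid of peers for one with
    spare capacity, falling back to the server if none is found."""

    def __init__(self, server, peers):
        self.server = server   # ICD Server: the initial contact point
        self.peers = peers     # other clients in the grid
        self.source = server   # start by receiving the stream from the server

    def next_chunk(self):
        chunk = self.source.read()
        if chunk is None:      # reduction of quality / loss of connection
            # search the grid for a peer with available resources
            self.source = next(
                (p for p in self.peers if p.has_capacity()), self.server)
            chunk = self.source.read()
        return chunk
```

In this sketch the `read`/`has_capacity` interface stands in for the stream-quality monitoring and grid search the text describes; a real implementation would also relay received data onward to other clients.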
https://en.wikipedia.org/wiki/Relativistic%20Lagrangian%20mechanics | In theoretical physics, relativistic Lagrangian mechanics is Lagrangian mechanics applied in the context of special relativity and general relativity.
Lagrangian formulation in special relativity
Lagrangian mechanics can be formulated in special relativity as follows. Consider one particle (N particles are considered later).
Coordinate formulation
If a system is described by a Lagrangian L, the Euler–Lagrange equations

  d/dt (∂L/∂ṙ) − ∂L/∂r = 0

retain their form in special relativity, provided the Lagrangian generates equations of motion consistent with special relativity. Here r = (x, y, z) is the position vector of the particle as measured in some lab frame where Cartesian coordinates are used for simplicity, and
is the coordinate velocity, the derivative of position r with respect to coordinate time t. (Throughout this article, overdots are with respect to coordinate time, not proper time). It is possible to transform the position coordinates to generalized coordinates exactly as in non-relativistic mechanics, r = r(q, t). Taking the total differential of r obtains the transformation of velocity v to the generalized coordinates, generalized velocities, and coordinate time
remains the same. However, the energy of a moving particle is different from non-relativistic mechanics. It is instructive to look at the total relativistic energy of a free test particle. An observer in the lab frame defines events by coordinates r and coordinate time t, and measures the particle to have coordinate velocity v = dr/dt. By contrast, an observer moving with the particle will record a different time, this is the proper time, τ. Expanding in a power series, the first term is the particle's rest energy, plus its non-relativistic kinetic energy, followed by higher order relativistic corrections;
where c is the speed of light in vacuum. The differentials in t and τ are related by the Lorentz factor γ,

  dt = γ dτ,  where γ = 1/√(1 − v·v/c²)

and · is the dot product. The relativistic kinetic energy for an uncharged particle of rest mass m0 |
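The power series referred to above (the total relativistic energy of a free particle) is the standard expansion, reconstructed here because the excerpt omits the displayed equations:

```latex
E = \gamma m_0 c^2
  = \frac{m_0 c^2}{\sqrt{1 - \mathbf{v}\cdot\mathbf{v}/c^2}}
  = m_0 c^2 + \tfrac{1}{2} m_0 v^2 + \tfrac{3}{8}\,\frac{m_0 v^4}{c^2} + \cdots
```

The first term is the rest energy, the second the non-relativistic kinetic energy, and the remaining terms are the higher-order relativistic corrections.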
https://en.wikipedia.org/wiki/Choke%20exchange | A choke exchange is a telephone exchange designed to handle many simultaneous call attempts to telephone numbers of that exchange. Choke exchanges are typically used to service telephone numbers of talk radio caller and contest lines of radio stations and event ticket vendors.
Motivation
A central office might only have the physical plant resources to handle about 8% of its allocated telephone numbers simultaneously, based on historical call-traffic averages. A choke exchange has trunk facilities to other exchanges designed so that high call volume is handled through the choke connection rather than overwhelming the rest of the local telephone network. Other local exchanges have a limited number of direct trunks (junctions) to the choke exchange, which may serve only a handful of customers, such as a radio station contest line, that can experience many simultaneous calls. Instead of calls being overflowed to main or tandem routes shared with other traffic, unsuccessful callers receive a reorder tone from their local or tandem exchange. If the calls were overflowed to the tandem route, the caller would receive a busy tone from the exchange serving the radio station, and the sudden peak would disrupt calls between other customers.
With common-channel signaling (CCS), e.g., Signalling System No. 7, separate choke exchanges may not be required for these customers.
Examples
Examples of choke exchanges in North America have included:
One of the early choke lines (exchanges) was instituted because of a widely advertised contest by a local radio station in the Miami/Fort Lauderdale area. WHYI-FM advertised its "Last Contest", with an automobile as the top prize. Since the advertising lasted over a month, there was a very large volume of calls when the station announced that people should call in. So many calls came in that the local exchanges ran out of dial tones, a major problem because, at the time, a caller with no dial tone could not dial at all. After it was over, the |
https://en.wikipedia.org/wiki/Synovitis | Synovitis is the medical term for inflammation of the synovial membrane. This membrane lines joints that possess cavities, known as synovial joints. The condition is usually painful, particularly when the joint is moved. The joint usually swells due to synovial fluid collection.
Synovitis may occur in association with arthritis as well as lupus, gout, and other conditions. Synovitis is more commonly found in rheumatoid arthritis than in other forms of arthritis, and can thus serve as a distinguishing factor, although it is also present in many joints affected with osteoarthritis. In rheumatoid arthritis, the fibroblast-like synoviocytes, highly specialized mesenchymal cells found in the synovial membrane, play an active and prominent role in the synovitis. Long term occurrence of synovitis can result in degeneration of the joint.
Signs and symptoms
Synovitis causes joint tenderness or pain, swelling, and hard lumps, called nodules. When associated with rheumatoid arthritis, swelling is a better indicator than tenderness. The joints in the hands and fingers may be painful when pressed and when moving or gripping.
Diagnosis
A rheumatologist will aim to diagnose the cause of the patient’s pain by first determining whether it is inside the joint itself, meaning true synovitis, or if it is actually caused by an inflammation of the tendons, referred to as tendonitis. Imaging, such as an MRI or musculoskeletal ultrasound is often required to make a firm diagnosis.
Treatment
Synovitis symptoms can be treated with anti-inflammatory drugs such as NSAIDs. An injection of steroids may be done, directly into the affected joint. Injection of beta-emitting radioisotopes to locally treat synovitis has been performed in people for decades and is now being applied using tin-117m in veterinary medicine to treat canine elbow synovitis. Specific treatment depends on the underlying cause of the synovitis.
See also
Tenosynovitis
Transient synovitis
Knee effusion (water on th |
https://en.wikipedia.org/wiki/Jermain%20G.%20Porter | Jermain Gildersleeve Porter (January 8, 1852 – April 14, 1933) was an American astronomer and opponent of the theory of relativity.
Porter was born at Buffalo, New York. He studied at Hamilton College, was employed by the United States Coast and Geodetic Survey in 1878, and from 1884 to 1930 was director of the Cincinnati Observatory and professor at the University of Cincinnati. He observed comets and nebulae, but gained a name mainly through his three star catalogs (1895–1905) and through his studies of stellar motions, collected in the Catalog of Proper Motion Stars, I–IV (1915–18), Publications of the Cincinnati Observatory No. 18.
He also authored Variation of Latitude 1899–1906 (1908) and The Stars in Song and Legend (1901). Together with Elliott Smith, he published the Catalog of 4683 Stars of the Epoch 1900, Publications of the Cincinnati Observatory, No. 193, in which he also gave proper motions of his own determination. He also made a name for himself as an opponent of the theory of relativity.
Selected publications
Historical Sketch of the Cincinnati Observatory 1843-1893 (1893)
The Stars in Song and Legend (1901)
The Overthrow of Newton's Theory of Gravitation (1920)
The Relativity Deflection of Light: Facts versus Theory (1929)
Recent Textbooks and Relativity (1927) |
https://en.wikipedia.org/wiki/Geriatric%20psychiatry | Geriatric psychiatry, also known as geropsychiatry, psychogeriatrics or psychiatry of old age, is a branch of medicine and a subspecialty of psychiatry dealing with the study, prevention, and treatment of neurodegenerative disorders, cognitive impairment, and mental disorders in people of old age. Geriatric psychiatry as a subspecialty has significant overlap with the specialties of geriatric medicine, behavioural neurology, neuropsychiatry, neurology, and general psychiatry. Geriatric psychiatry has become an official subspecialty of psychiatry with a defined curriculum of study and core competencies.
History
Origins
The origins of geriatric psychiatry began with Alois Alzheimer, a German psychiatrist and neuropathologist who first identified amyloid plaques and neurofibrillary tangles in a fifty-year-old woman he called Auguste D. These plaques and tangles were later identified as being responsible for her behavioural symptoms, short-term memory loss, and psychiatric symptoms. These brain anomalies would become identifiers of what later became known as Alzheimer's disease.
Subspecialty
The subspecialty of geriatric psychiatry originated in the United Kingdom in the 1950s.
Naming
The geropsychiatric unit, the term for a hospital-based geriatric psychiatry program, was first introduced in 1984 by Norman White MD, when he opened New England's first specialized program at a community hospital in Rochester, New Hampshire. White is a pioneer in geriatric psychiatry, being among the first psychiatrists nationally to achieve board certification in the field. The prefix psycho- had been proposed for the geriatric program, but White, knowing New Englanders' aversion to anything psycho-, lobbied successfully for the name geropsychiatric rather than psychogeriatric.
Diseases
Diseases and disorders diagnosed or managed by geriatric psychiatrists include:
Dementia
Mild cognitive impairment
Alzheimer's disease
Vascular dementia
Dementia with Lewy bodies
Parkinson's disease |
https://en.wikipedia.org/wiki/Cryptocurrencies%20in%20Europe | The general notion of cryptocurrencies in Europe denotes the processes of legislative regulation, distribution, circulation, and storage of cryptocurrencies in Europe.
In April 2023, the EU Parliament passed the Markets in Crypto-Assets (MiCA) Regulation, a unified legal framework for crypto-assets within the European Union.
The legality of cryptocurrencies in Europe
There are some regulatory policy recommendations for EU states to follow in the course of cryptocurrency adoption and regulatory framework development that are given below in chronological order.
In 2013, the European Banking Authority (EBA) issued a public warning about the possible risks of virtual currencies.
In 2014, the EBA issued a decision on virtual currencies, which included a list of more than 70 risks associated with their dissemination.
In 2016, the European Central Bank issued an analysis of virtual currency schemes, acknowledging the potential advantages of virtual currencies.
The European Securities and Markets Authority (ESMA) published a study in 2017 on the use of distributed ledger technology (DLT) in securities markets.
Also in the same year, ESMA released two statements on initial coin offerings (ICOs), one on investor risks and the other on the laws that apply to companies that participate in these offers. After that, the European Commission directed the EBA and ESMA to evaluate the applicability and appropriateness of the existing EU financial services regulatory framework to crypto assets.
In 2018, the European Parliament released two reports about virtual currencies and central banks’ monetary policy.
In 2018, the Financial Stability Board (FSB) released a study on the crypto-asset market and its potential pathways for future financial stability concerns.
During the G7 meeting of July 2019, the risks posed by global stablecoin projects were discussed.
FINMA, the Swiss financial authority, published a supplement to its ICO guidelines outlining how it treats so-called ‘stable coins’ under Sw |
https://en.wikipedia.org/wiki/Web%20Mercator%20projection | Web Mercator, Google Web Mercator, Spherical Mercator, WGS 84 Web Mercator or WGS 84/Pseudo-Mercator is a variant of the Mercator map projection and is the de facto standard for Web mapping applications. It rose to prominence when Google Maps adopted it in 2005. It is used by virtually all major online map providers, including Google Maps, CARTO, Mapbox, Bing Maps, OpenStreetMap, Mapquest, Esri, and many others. Its official EPSG identifier is EPSG:3857, although others have been used historically.
Properties
Web Mercator is a slight variant of the Mercator projection, one used primarily in Web-based mapping programs. It uses the same formulas as the standard Mercator as used for small-scale maps. However, the Web Mercator uses the spherical formulas at all scales whereas large-scale Mercator maps normally use the ellipsoidal form of the projection. The discrepancy is imperceptible at the global scale but causes maps of local areas to deviate slightly from true ellipsoidal Mercator maps at the same scale. This deviation becomes more pronounced further from the equator, and can reach as much as 40 km on the ground.
While the Web Mercator's formulas are for the spherical form of the Mercator, geographical coordinates are required to be in the WGS 84 ellipsoidal datum. This discrepancy causes the projection to be slightly non-conformal. General lack of understanding that the Web Mercator differs from standard Mercator usage has caused considerable confusion and misuse. For all these reasons, the United States Department of Defense through the National Geospatial-Intelligence Agency has declared this map projection to be unacceptable for any official use.
Unlike most map projections for the sphere, the Web Mercator uses the equatorial radius of the WGS 84 spheroid, rather than some compromise between the equatorial and polar radii. This results in a slightly larger map compared to the map's stated (nominal) scale than for most maps.
Formulas
Formulas for the Web M |
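The spherical formulas described above can be sketched as follows. This is a minimal illustration of the forward projection only (lon/lat in degrees to metres); the function and variable names are my own, not from the article.

```python
import math

# WGS 84 equatorial radius in metres; Web Mercator uses it as the
# sphere radius at all scales (no ellipsoidal correction).
R = 6378137.0

def web_mercator(lon_deg: float, lat_deg: float) -> tuple[float, float]:
    """Project WGS 84 longitude/latitude (degrees) to Web Mercator x/y (metres)."""
    lam = math.radians(lon_deg)
    phi = math.radians(lat_deg)
    x = R * lam                                    # linear in longitude
    y = R * math.log(math.tan(math.pi / 4 + phi / 2))  # spherical Mercator
    return x, y
```

At longitude ±180° this gives x = ±πR ≈ ±20,037,508 m, which is why Web Mercator tiles cover a square of that half-width.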
https://en.wikipedia.org/wiki/Concept%20drift | In predictive analytics, data science, machine learning and related fields, concept drift or drift is an evolution of data that invalidates the data model. It happens when the statistical properties of the target variable, which the model is trying to predict, change over time in unforeseen ways. This causes problems because the predictions become less accurate as time passes. Drift detection and drift adaptation are of paramount importance in the fields that involve dynamically changing data and data models.
Predictive model decay
In machine learning and predictive analytics this drift phenomenon is called concept drift. In machine learning, a common element of a data model is the set of statistical properties, such as the probability distribution, of the actual data. If these deviate from the statistical properties of the training data set, then the learned predictions may become invalid if the drift is not addressed.
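A crude way to watch for such a deviation is to compare a statistic of incoming data against the training distribution. The sketch below is illustrative only (a mean-shift score, not one of the formal drift-detection methods the literature describes):

```python
import random
import statistics

def drift_score(train, live):
    """How many training standard deviations the live-data mean has
    shifted from the training mean. Illustrative indicator only,
    not a formal statistical drift test."""
    mu = statistics.mean(train)
    sigma = statistics.stdev(train)
    return abs(statistics.mean(live) - mu) / sigma

random.seed(0)
train   = [random.gauss(0.0, 1.0) for _ in range(1000)]  # training distribution
steady  = [random.gauss(0.0, 1.0) for _ in range(200)]   # same distribution
drifted = [random.gauss(1.5, 1.0) for _ in range(200)]   # shifted mean: drift
```

In practice one would alert when the score exceeds a chosen threshold and trigger model retraining or adaptation.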
Data configuration decay
Another important area is software engineering, where three types of data drift affecting data fidelity may be recognized. Changes in the software environment ("infrastructure drift") may invalidate software infrastructure configuration. "Structural drift" happens when the data schema changes, which may invalidate databases. "Semantic drift" is changes in the meaning of data while the structure does not change. In many cases this may happen in complicated applications when many independent developers introduce changes without proper awareness of the effects of their changes in other areas of the software system.
For many application systems, the nature of data on which they operate are subject to changes for various reasons, e.g., due to changes in business model, system updates, or switching the platform on which the system operates.
In the case of cloud computing, infrastructure drift that may affect the applications running on cloud may be caused by the updates of cloud software.
There are several types of detrimental effects o |
https://en.wikipedia.org/wiki/LU%20domain | The LU domain (Ly-6 antigen/uPAR) is an evolutionarily conserved protein domain of the three-finger protein superfamily. This domain is found in the extracellular domains of cell-surface receptors and in either GPI-anchored or secreted globular proteins, for example the Ly-6 family, CD59, and Sgp-2.
A variety of GPI-linked cell-surface glycoproteins are composed of one or more copies of a conserved LU domain of about 100 amino-acid residues. Most of these proteins contain only a single LU domain, though a small number of exceptions are known; the well-studied family member uPAR has three tandem LU domains.
Structure
This domain folds into five antiparallel beta sheets, a structure common to the three-finger protein family. The domain typically contains ten well-conserved cysteine residues involved in five disulfide bonds, though some examples such as two of the three uPAR domains have fewer.
Examples
Besides uPAR, other receptors with LU domains include members of the transforming growth factor beta receptor (TGF-beta) superfamily, such as the activin type 2 receptor; and bone morphogenetic protein receptor, type IA. Other LU domain proteins are small globular proteins such as CD59 antigen, LYNX1, SLURP1, and SLURP2.
Subfamilies
Urokinase plasminogen activator surface receptor
Cell-surface glycoprotein Ly-6/CD59
Human proteins containing this domain
ARS; CD177; CD59; LY6D; LY6E; LY6H; LYNX1;
LYPD2; LYPD3; LYPD4; LYPD5; LYPD6; PLAUR; PSCA;
SLURP2; SLURP1; SPACA4; TEX101;
Functions
Many LU domain containing proteins are involved in cholinergic signaling and bind acetylcholine receptors, notably linking their function to a common mechanism of 3FTx toxicity. Members of the Ly6/uPAR family are believed to be the evolutionary ancestors of the three-finger toxin (3FTx). Other LU proteins, such as the CD59 antigen, have well-studied functions in regulation of the immune system. |
https://en.wikipedia.org/wiki/Viral%20envelope | A viral envelope is the outermost layer of many types of viruses. It protects the genetic material in their life cycle when traveling between host cells. Not all viruses have envelopes. A viral envelope protein or E protein is a protein in the envelope, which may be acquired by the capsid from an infected host cell.
Numerous human pathogenic viruses in circulation are encased in lipid bilayers, and they infect their target cells by causing the viral envelope and cell membrane to fuse. Although there are effective vaccines against some of these viruses, there is no preventative or curative medicine for the majority of them. In most cases, the known vaccines operate by inducing antibodies that prevent the pathogen from entering cells. This happens in the case of enveloped viruses when the antibodies bind to the viral envelope proteins.
The membrane fusion event that triggers viral entrance is caused by the viral fusion protein. Many enveloped viruses only have one protein visible on the surface of the particle, which is required for both mediating adhesion to the cell surface and for the subsequent membrane fusion process. To create potentially protective vaccines for human pathogenic enveloped viruses for which there is currently no vaccine, it is essential to comprehend how antibodies interact with viral envelope proteins, particularly with the fusion protein, and how antibodies neutralize viruses.
Enveloped viruses enter cells by joining a cellular membrane to their lipid bilayer membrane. Priming by proteolytic processing, either of the fusion protein or of a companion protein, is necessary for the majority of viral fusion proteins. The priming stage then gets the fusion protein ready for triggering by the processes that go along with attachment and uptake, which frequently happens during transport of the fusion protein to the cell surface but may also happen extracellularly. So far, structural studies have revealed two kinds of viral fusion proteins. These pro |
https://en.wikipedia.org/wiki/Attention%20%28machine%20learning%29 | Machine learning-based attention is a mechanism mimicking cognitive attention. It calculates "soft" weights for each word, more precisely for its embedding, in the context window. It can do so either in parallel (as in transformers) or sequentially (as in recurrent neural networks). "Soft" weights can change during each runtime, in contrast to "hard" weights, which are (pre-)trained and fine-tuned and remain frozen afterwards.
Attention was developed to address the weaknesses of recurrent neural networks, where words in a sentence are slowly processed one at a time. Recurrent neural networks favor more recent words at the end of a sentence, while earlier words fade away in volatile neural activations. Attention gives all words equal access to any part of the sentence in a faster parallel scheme and no longer suffers the wait time of serial processing. Early uses attached this mechanism to a serial recurrent neural network's language-translation system, but later uses in transformer-based large language models removed the recurrent neural network and relied on the faster parallel attention scheme.
Predecessors
Predecessors of the mechanism were used in recurrent neural networks which, however, calculated "soft" weights sequentially and, at each step, considered the current word and other words within the context window. They were known as multiplicative modules, sigma pi units, and hyper-networks. They have been used in long short-term memory (LSTM) networks, multi-sensory data processing (sound, images, video, and text) in perceivers, fast weight controllers' memory, reasoning tasks in differentiable neural computers, and neural Turing machines.
Core Calculations
The attention network was designed to identify the highest correlations amongst words within a sentence, assuming that it has learned those patterns from the training corpus. This correlation is captured in neuronal weights through back-propagation from unsupervised pretraining. |
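The "soft" weighting described above is commonly realized as scaled dot-product attention, the standard formulation used in transformers (shown here as a general illustration, not as the specific network the article describes): each row of weights is a softmax over query–key similarities, and the output mixes the value vectors accordingly.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """softmax(Q K^T / sqrt(d_k)) V for (seq_len, d) arrays of
    query, key, and value vectors, one row per token embedding."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # pairwise similarities
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # "soft" weights: rows sum to 1
    return weights @ V                            # weighted mix of value vectors
```

Because the weights are computed for all token pairs at once with matrix products, every word can attend to any other word in parallel, which is the speed advantage over serial recurrent processing.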
https://en.wikipedia.org/wiki/Exterior%20gateway%20protocol | An exterior gateway protocol is an IP routing protocol used to exchange routing information between autonomous systems. This exchange is crucial for communications across the Internet. Notable exterior gateway protocols include Exterior Gateway Protocol (EGP), now obsolete, and Border Gateway Protocol (BGP).
By contrast, an interior gateway protocol is a type of protocol used for exchanging routing information between gateways (commonly routers) within an autonomous system (for example, a system of corporate local area networks). This routing information can then be used to route network-level protocols like IP. |
https://en.wikipedia.org/wiki/Convergence%20%28routing%29 | Convergence is the state of a set of routers that have the same topological information about the internetwork in which they operate. For a set of routers to have converged, they must have collected all available topology information from each other via the implemented routing protocol, the information they gathered must not contradict any other router's topology information in the set, and it must reflect the real state of the network. In other words: in a converged network all routers "agree" on what the network topology looks like.
Convergence is an important notion for a set of routers that engage in dynamic routing. All Interior Gateway Protocols rely on convergence to function properly. "To have, or be, converged" is the normal state of an operational autonomous system. The exterior gateway protocol BGP typically never converges because the Internet is too big for changes to be communicated fast enough.
Convergence process
When a routing protocol process is enabled, every participating router will attempt to exchange information about the topology of the network. The extent of this information exchange, the way it is sent and received, and the type of information required vary widely depending on the routing protocol in use, see e.g. RIP, OSPF, BGP4.
A state of convergence is achieved once all routing protocol-specific information has been distributed to all routers participating in the routing protocol process. Any change in the network that affects routing tables will break the convergence temporarily until this change has been successfully communicated to all other routers.
Convergence time
Convergence time is a measure of how fast a group of routers reach the state of convergence. It is one of the main design goals and an important performance indicator for routing protocols, which should implement a mechanism that allows all routers running the protocol to quickly and reliably converge. Of course, the size of the network also plays an imp |
https://en.wikipedia.org/wiki/Levenshtein%20distance | In information theory, linguistics, and computer science, the Levenshtein distance is a string metric for measuring the difference between two sequences. Informally, the Levenshtein distance between two words is the minimum number of single-character edits (insertions, deletions or substitutions) required to change one word into the other. It is named after the Soviet mathematician Vladimir Levenshtein, who considered this distance in 1965.
Levenshtein distance may also be referred to as edit distance, although that term may also denote a larger family of distance metrics known collectively as edit distance. It is closely related to pairwise string alignments.
Definition
The Levenshtein distance between two strings a and b (of length |a| and |b| respectively) is given by lev(a, b), where

  lev(a, b) =
    |a|                              if |b| = 0,
    |b|                              if |a| = 0,
    lev(tail(a), tail(b))            if head(a) = head(b),
    1 + min(lev(tail(a), b),
            lev(a, tail(b)),
            lev(tail(a), tail(b)))   otherwise,

where the tail of some string x is a string of all but the first character of x, and head(x) is the first character of x. Either the notation x[n] or x_n is used to refer to the nth character of the string x, counting from 0; thus head(x) = x[0].
Note that the first element in the minimum corresponds to deletion (from a to b), the second to insertion, and the third to replacement.
This definition corresponds directly to the naive recursive implementation.
Example
For example, the Levenshtein distance between "kitten" and "sitting" is 3, since the following 3 edits change one into the other, and there is no way to do it with fewer than 3 edits:
kitten → sitten (substitution of "s" for "k"),
sitten → sittin (substitution of "i" for "e"),
sittin → sitting (insertion of "g" at the end).
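In practice the recursive definition is implemented with dynamic programming rather than naive recursion. A minimal sketch of the Wagner–Fischer algorithm, keeping only two rows of the distance table:

```python
def levenshtein(a: str, b: str) -> int:
    """Wagner–Fischer dynamic programming over the recursive definition."""
    prev = list(range(len(b) + 1))  # distances from "" to each prefix of b
    for i, ca in enumerate(a, start=1):
        curr = [i]                  # distance from a[:i] to ""
        for j, cb in enumerate(b, start=1):
            curr.append(min(prev[j] + 1,                  # deletion
                            curr[j - 1] + 1,              # insertion
                            prev[j - 1] + (ca != cb)))    # substitution or match
        prev = curr
    return prev[-1]
```

This runs in O(|a|·|b|) time and O(min(|a|, |b|)) extra space (with the shorter string as b), versus the exponential cost of the naive recursion.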
Upper and lower bounds
The Levenshtein distance has several simple upper and lower bounds. These include:
It is at least the absolute value of the difference of the sizes of the two strings.
It is at most the length of the longer string.
It is zero if and only if the strings are equal.
If the strings have the same size, the Hamming distance is an upper bound on the Levenshtein distance. The Hamming distance is the number of positions at which the co |
https://en.wikipedia.org/wiki/PhyreEngine | PhyreEngine is a free-to-use game engine from Sony Interactive Entertainment, available under license only, compatible with
PlayStation 5, PlayStation 4, PlayStation 3, PlayStation VR, PlayStation Vita, PlayStation Portable, Nintendo Switch, Microsoft Windows (for OpenGL and DirectX 11), Google Android and Apple iOS. PhyreEngine has been adopted by several game studios and has been used in over 200 published titles.
Features
PhyreEngine is exclusively distributed to Sony licensees as an installable package that includes both full source code and Microsoft Windows tools, provided under its own flexible use license that allows any PlayStation 3 game developer, publisher or tools and middleware company to create software based partly or fully on PhyreEngine on any platform. The engine uses sophisticated parallel processing techniques that are optimized for the Synergistic Processor Unit (SPU) of the Cell Broadband Engine of PS3, but can be easily ported to other multi-core architectures.
PhyreEngine supports OpenGL and Direct3D, in addition to the low level PS3 LibGCM library. It provides fully functional “game templates” as source code, including support for Havok Complete XS, NVIDIA PhysX and Bullet for physics.
History
The development of PhyreEngine was started in 2003 to create a graphics engine for PlayStation 3. The first public demonstration occurred in 2006.
PhyreEngine was launched during the 2008 Game Developers Conference. New features (including deferred rendering) were showcased during GDC 2009. Version 2.40, released in March 2009, included a new “foliage rendering” system that provides tools and technology to render ultra-realistic trees and plants to be easily integrated into games.
Version 3.0, released in 2011, has a new and powerful asset pipeline, combining enhanced versions of the already robust exporters, with a powerful processing tool to generate optimized assets for each platform. Also new is the rewritten level editor, which permits a far more data-drive |
https://en.wikipedia.org/wiki/Immunoglobulin%20light%20chain | The immunoglobulin light chain is the small polypeptide subunit of an antibody (immunoglobulin).
A typical antibody is composed of two immunoglobulin (Ig) heavy chains and two Ig light chains.
In humans
There are two types of light chain in humans:
kappa (κ) chain, encoded by the immunoglobulin kappa locus (IGK@) on chromosome 2 (locus: 2p11.2)
lambda (λ) chain, encoded by the immunoglobulin lambda locus (IGL@) on chromosome 22 (locus: 22q11.2)
Antibodies are produced by B lymphocytes, each expressing only one class of light chain. Once set, light chain class remains fixed for the life of the B lymphocyte. In a healthy individual, the total kappa-to-lambda ratio is roughly 2:1 in serum (measuring intact whole antibodies) or 1:1.5 if measuring free light chains, with a highly divergent ratio indicative of neoplasm. The free light chain ratio ranges from 0.26 to 1.65. Both the kappa and the lambda chains can increase proportionately, maintaining a normal ratio. This is usually indicative of something other than a blood cell dyscrasia, such as kidney disease.
In other animals
The immunoglobulin light chain genes in tetrapods can be classified into three distinct groups: kappa (κ), lambda (λ), and sigma (σ). The divergence of the κ, λ, and σ isotypes preceded the radiation of tetrapods. The σ isotype was lost after the evolution of the amphibian lineage and before the emergence of the reptilian lineage.
Other types of light chains can be found in lower vertebrates, such as the Ig-Light-Iota chain of Chondrichthyes and Teleostei.
Camelids are unique among mammals as they also have fully functional antibodies which have two heavy chains, but lack the light chains usually paired with each heavy chain.
Sharks also possess, as part of their adaptive immune systems, a functional heavy-chain homodimeric antibody-like molecule referred to as IgNAR (immunoglobulin new antigen receptor). IgNAR is believed to have never had an associated light chain, in contrast with |
https://en.wikipedia.org/wiki/Commwarrior | Commwarrior is a Symbian Bluetooth worm that was the first to spread via Multimedia Messaging Service (MMS) and Bluetooth. The worm affects only the Nokia Series 60 software platform.
Infection
Commwarrior was particularly effective via the MMS vector it used to infect other phones. The infected message appeared to have been sent by a source known to the victim, leading even security-conscious users to open it; in fact, the message was sent to a contact chosen at random from the sender's address book. Once the message is opened, the virus attempts to install itself on the phone via a SIS file; once installed, the worm is executed every time the phone is switched on.
A secondary method of infection is to create a malicious .SIS file on the compromised phone. Once per minute thereafter, the worm attempts to send this file over Bluetooth to any nearby phone that has Bluetooth enabled.
Symptoms
According to Sophos, during installation the program has a one in six chance of displaying the following text: "CommWarrior v1.0 (c) 2005 by e10d0r" |
https://en.wikipedia.org/wiki/List%20of%20organs%20of%20the%20human%20body | This article contains a list of organs of the human body. A commonly cited count is 79 organs (the number rises if each bone and muscle is counted as an organ in its own right, a practice that is becoming more common); however, there is no universal standard definition of what constitutes an organ, and the status of some tissue groups as organs is debated. Because no single standard definition exists, the number of organs varies depending on how one defines an organ; this list, for example, contains about 103.
It is not clear that a single definition of an organ was used for every entry in this list; it appears to have been compiled partly on the basis of which Wikipedia articles on organs were available.
Musculoskeletal system
Skeleton
Joints
Ligaments
Muscular system
Tendons
Digestive system
Mouth
Teeth
Tongue
Lips
Salivary glands
Parotid glands
Submandibular glands
Sublingual glands
Pharynx
Esophagus
Stomach
Small intestine
Duodenum
Jejunum
Ileum
Large intestine
Cecum
Ascending colon
Transverse colon
Descending colon
Sigmoid colon
Rectum
Liver
Gallbladder
Mesentery
Pancreas
Anal canal
Appendix
Respiratory system
Nasal cavity
Pharynx
Larynx
Trachea
Bronchi
Bronchioles and smaller air passages
Lungs
Muscles of breathing
Urinary system
Kidneys
Ureter
Bladder
Urethra
Reproductive systems
Female reproductive system
Internal reproductive organs
Ovaries
Fallopian tubes
Uterus
Cervix
Vagina
External reproductive organs
Vulva
Clitoris
Male reproductive system
Internal reproductive organs
Testicles
Epididymis
Vas deferens
Prostate
External reproductive organs
Penis
Scrotum
Endocrine system
Pituitary gland
Pineal gland
Thyroid gland
Parathyroid glands
Adrenal glands
Pancreas
Circulatory system
Heart
Arteries
Veins
Capillaries
Lymphatic system
Lymphatic vessel
Lymph node
Bone marrow
Thymus
Spleen
Gut-associated lymphoid tissue
Tonsils
Interstitium
Nervous system
Central nervous system |
https://en.wikipedia.org/wiki/PelicanHPC | PelicanHPC is an operating system based on Debian Live, which provides a rapid means of setting up a high performance computer cluster.
PelicanHPC was formerly known as ParallelKNOPPIX.
Versions |
https://en.wikipedia.org/wiki/Mir-548%20microRNA%20precursor%20family | In molecular biology, mir-548 is a microRNA, a short RNA molecule. MicroRNAs regulate the expression levels of other genes through several mechanisms.
See also
MicroRNA |
https://en.wikipedia.org/wiki/Jackson%27s%20inequality | In approximation theory, Jackson's inequality is an inequality bounding the value of a function's best approximation by algebraic or trigonometric polynomials in terms of the modulus of continuity or modulus of smoothness of the function or of its derivatives. Informally speaking, the smoother the function is, the better it can be approximated by polynomials.
Statement: trigonometric polynomials
For trigonometric polynomials, the following was proved by Dunham Jackson:
Theorem 1: If $f\colon[0,2\pi]\to\mathbb{C}$ is an $r$-times differentiable periodic function such that
$$\left|f^{(r)}(x)\right|\leq 1,\qquad x\in[0,2\pi],$$
then, for every positive integer $n$, there exists a trigonometric polynomial $T_{n-1}$ of degree at most $n-1$ such that
$$\left|f(x)-T_{n-1}(x)\right|\leq\frac{C(r)}{n^{r}},\qquad x\in[0,2\pi],$$
where $C(r)$ depends only on $r$.
The Akhiezer–Krein–Favard theorem gives the sharp value of $C(r)$ (called the Akhiezer–Krein–Favard constant):
$$C(r)=\frac{4}{\pi}\sum_{k=0}^{\infty}\frac{(-1)^{k(r+1)}}{(2k+1)^{r+1}}.$$
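As a quick numerical illustration (our own, not part of the article), the Akhiezer–Krein–Favard series can be summed directly; for $r=1$ and $r=2$ it reproduces the known closed-form values $\pi/2$ and $\pi^{2}/8$:

```python
import math

def favard_constant(r, terms=100_000):
    # C(r) = (4/pi) * sum_{k>=0} (-1)^(k(r+1)) / (2k+1)^(r+1),
    # truncated after `terms` summands.
    s = sum((-1) ** (k * (r + 1)) / (2 * k + 1) ** (r + 1)
            for k in range(terms))
    return 4 / math.pi * s

print(favard_constant(1))  # close to pi/2   ~ 1.5708
print(favard_constant(2))  # close to pi^2/8 ~ 1.2337
```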
Jackson also proved the following generalisation of Theorem 1:
Theorem 2: One can find a trigonometric polynomial $T_{n}$ of degree $\leq n$ such that
$$|f(x)-T_{n}(x)|\leq\frac{c(r)\,\omega\!\left(\frac{1}{n},f^{(r)}\right)}{n^{r}},$$
where $\omega(\delta,g)$ denotes the modulus of continuity of the function $g$ with step $\delta$.
An even more general result of four authors can be formulated as the following Jackson theorem.
Theorem 3: For every natural number $n$, if $f$ is a $2\pi$-periodic continuous function, there exists a trigonometric polynomial $T_{n}$ of degree $\leq n$ such that
$$|f(x)-T_{n}(x)|\leq c(k)\,\omega_{k}\!\left(\frac{1}{n},f\right),\qquad x\in[0,2\pi],$$
where the constant $c(k)$ depends on $k\in\mathbb{N}$, and $\omega_{k}$ is the $k$-th order modulus of smoothness.
For $k=1$ this result was proved by Dunham Jackson. Antoni Zygmund proved the inequality in 1945 in the case when $k=2$ and $\omega_{2}(t,f)\leq ct$, $t>0$. Naum Akhiezer proved the theorem in the case $k=2$ in 1956. For $k>2$ this result was established by Sergey Stechkin in 1967.
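The $n^{-r}$ rate in Theorem 1 can be checked numerically. The sketch below is our own illustration: the test function $f(x)=\sum_{k\ge 1}\cos(kx)/k^{3}$ (which satisfies $|f'(x)|\leq\pi^{2}/6$) and the use of truncated Fourier partial sums as the approximating polynomials are arbitrary choices, and the partial sum is merely one admissible polynomial of degree at most $n-1$, so its error only gives an upper bound on the best-approximation error.

```python
import math

# f(x) = sum_{k=1}^{kmax} cos(kx)/k^3, a smooth 2*pi-periodic function
# with |f'(x)| <= sum 1/k^2 = pi^2/6. Jackson's Theorem 1 with r = 1
# then predicts a best-approximation error of at most
# C(1) * (pi^2/6) / n, where C(1) = pi/2.

def f(x, kmax=2000):
    return sum(math.cos(k * x) / k ** 3 for k in range(1, kmax + 1))

def partial_sum(x, n):
    # Fourier partial sum: a trigonometric polynomial of degree n.
    return sum(math.cos(k * x) / k ** 3 for k in range(1, n + 1))

xs = [2 * math.pi * i / 400 for i in range(400)]
fs = [f(x) for x in xs]
for n in (5, 10, 20):
    # Degree at most n-1, as in the statement of Theorem 1.
    err = max(abs(fx - partial_sum(x, n - 1)) for x, fx in zip(xs, fs))
    bound = (math.pi / 2) * (math.pi ** 2 / 6) / n
    print(n, err <= bound)  # the Jackson bound holds for each n
```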
Further remarks
Generalisations and extensions are called Jackson-type theorems. A converse to Jackson's inequality is given by Bernstein's theorem. See also constructive function theory. |
https://en.wikipedia.org/wiki/Synthetic%20mycoides | Synthetic mycoides refers to an artificial life form created by Craig Venter at the J. Craig Venter Institute in May 2010. A synthetic genome was transferred into an empty cell to form the bacterium, which was capable of self-replication and functioned solely from the transferred chromosomes.
https://en.wikipedia.org/wiki/Uniform%20convergence%20in%20probability | Uniform convergence in probability is a form of convergence in probability in statistical asymptotic theory and probability theory. It means that, under certain conditions, the empirical frequencies of all events in a certain event-family converge to their theoretical probabilities. Uniform convergence in probability has applications to statistics as well as machine learning as part of statistical learning theory.
The law of large numbers says that, for each single event $A$, its empirical frequency in a sequence of independent trials converges (with high probability) to its theoretical probability. In many applications, however, the need arises to judge simultaneously the probabilities of events of an entire class $S$ from one and the same sample. Moreover, it is required that the relative frequency of the events converge to the probability uniformly over the entire class of events $S$. The Uniform Convergence Theorem gives a sufficient condition for this convergence to hold. Roughly, if the event-family is sufficiently simple (its VC dimension is sufficiently small) then uniform convergence holds.
Definitions
For a class of predicates $H$ defined on a set $X$ and a set of samples $x=(x_{1},x_{2},\dots,x_{m})$, where $x_{i}\in X$, the empirical frequency of $h\in H$ on $x$ is
$$\widehat{Q}_{x}(h)=\frac{1}{m}\left|\{i:1\leq i\leq m,\,h(x_{i})=1\}\right|.$$
The theoretical probability of $h\in H$ is defined as
$$Q_{P}(h)=P\{y\in X:h(y)=1\}.$$
The Uniform Convergence Theorem states, roughly, that if $H$ is "simple" and we draw samples independently (with replacement) from $X$ according to any distribution $P$, then with high probability, the empirical frequency will be close to its expected value, which is the theoretical probability.
Here "simple" means that the Vapnik–Chervonenkis dimension of the class $H$ is small relative to the size of the sample. In other words, a sufficiently simple collection of functions behaves roughly the same on a small random sample as it does on the distribution as a whole.
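A small simulation makes this concrete. The choices below are our own illustration: the class of threshold predicates $h_{t}(x)=[x\leq t]$ on $[0,1]$ (a class of VC dimension 1), the uniform distribution, and the sample size are all arbitrary. The worst-case gap between empirical frequency and theoretical probability, taken over the whole class at once, comes out small:

```python
import random

def empirical_frequency(h, sample):
    """Fraction of sample points on which predicate h holds."""
    return sum(h(x) for x in sample) / len(sample)

random.seed(0)  # deterministic run for reproducibility

# Class of threshold predicates h_t(x) = [x <= t] on X = [0, 1].
# Under the uniform distribution P, the theoretical probability of
# h_t is exactly t.
thresholds = [i / 100 for i in range(101)]
sample = [random.random() for _ in range(20_000)]

# Largest deviation between empirical frequency and true probability
# over the entire class -- small with high probability, as the
# Uniform Convergence Theorem predicts for a low-VC-dimension class.
worst = max(abs(empirical_frequency(lambda x, t=t: x <= t, sample) - t)
            for t in thresholds)
print(worst)
```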
The Uniform Convergence Theorem was first proved by Vapnik and Chervonenkis using the concept of growth function.
Uniform convergence theorem
The sta |