source | text |
|---|---|
https://en.wikipedia.org/wiki/Cluster-expansion%20approach | The cluster-expansion approach is a technique in quantum mechanics that systematically truncates the BBGKY hierarchy problem that arises when the quantum dynamics of interacting systems are solved. This method is well suited for producing a closed set of numerically computable equations that can be applied to analyze a great variety of many-body and/or quantum-optical problems. For example, it is widely applied in semiconductor quantum optics, and it can be applied to generalize the semiconductor Bloch equations and semiconductor luminescence equations.
Background
Quantum theory essentially replaces classically accurate values by a probabilistic distribution that can be formulated using, e.g., a wavefunction, a density matrix, or a phase-space distribution. Conceptually, there is always, at least formally, a probability distribution behind each observable that is measured. Already in 1889, long before quantum physics was formulated, Thorvald N. Thiele proposed the cumulants that describe probabilistic distributions with as few quantities as possible; he called them half-invariants.
The cumulants form a sequence of quantities such as mean, variance, skewness, kurtosis, and so on, that identify the distribution with increasing accuracy as more cumulants are used.
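As an illustrative aside (not from the article), the first few cumulants can be computed directly from sample data: for a single variable, the first four cumulants are the mean, the variance, the third central moment, and μ₄ − 3μ₂².

```python
def first_four_cumulants(xs):
    """Return (k1, k2, k3, k4): mean, variance, and the third and fourth
    cumulants of a sample (population moments, no bias correction)."""
    n = len(xs)
    m = sum(xs) / n                       # k1: mean
    centered = [x - m for x in xs]
    mu2 = sum(d ** 2 for d in centered) / n   # k2: variance
    mu3 = sum(d ** 3 for d in centered) / n   # k3: third central moment
    mu4 = sum(d ** 4 for d in centered) / n
    k4 = mu4 - 3 * mu2 ** 2                   # k4: fourth cumulant
    return m, mu2, mu3, k4
```

Each successive cumulant pins down the distribution more precisely, which is exactly the sense in which a truncated cumulant description approximates the full distribution.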
The idea of cumulants was converted into quantum physics by Fritz Coester
and Hermann Kümmel
with the intention of studying nuclear many-body phenomena. Later, Jiří Čížek and Josef Paldus extended the approach to quantum chemistry in order to describe many-body phenomena in complex atoms and molecules. This work introduced the basis for the coupled-cluster approach, which mainly operates with many-body wavefunctions. The coupled-cluster approach is one of the most successful methods for solving the quantum states of complex molecules.
In solids, the many-body wavefunction has an overwhelmingly complicated structure such that the direct wave-function-solution techniques are intractable. The cluster expansion is a |
https://en.wikipedia.org/wiki/Particle%20damping | Particle damping is the use of particles moving freely in a cavity to produce a damping effect.
Introduction
Active and passive damping techniques are common methods of attenuating the resonant vibrations excited in a structure. Active damping techniques are not applicable under all circumstances due, for example, to power requirements, cost, environment, etc. Under such circumstances, passive damping techniques are a viable alternative. Various forms of passive damping exist, including viscous damping, viscoelastic damping, friction damping, and impact damping. Viscous and viscoelastic damping usually have a relatively strong dependence on temperature. Friction dampers, while applicable over wide temperature ranges, may degrade with wear. Due to these limitations, attention has been focused on impact dampers, particularly for application in cryogenic environments or at elevated temperatures.
Particle damping technology is a derivative of impact damping with several advantages. Impact damping refers to only a single (somewhat larger) auxiliary mass in a cavity, whereas particle damping is used to imply multiple auxiliary masses of small size in a cavity. The principle behind particle damping is the removal of vibratory energy through losses that occur during impact of granular particles which move freely within the boundaries of a cavity attached to a primary system. In practice, particle dampers are highly nonlinear dampers whose energy dissipation, or damping, is derived from a combination of loss mechanisms, including friction and momentum exchange. Because of the ability of particle dampers to perform through a wide range of temperatures and frequencies and survive for a longer life, they have been used in applications such as the weightless environments of outer space, in aircraft structures, to attenuate vibrations of civil structures, and even in tennis rackets.
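The momentum-exchange loss mechanism described above can be sketched with a simple, hypothetical single-impact model (an illustration, not a design formula): a particle of mass m striking the cavity wall with relative speed v and coefficient of restitution e dissipates kinetic energy 0.5·m·v²·(1 − e²).

```python
def impact_energy_loss(mass, v_rel, restitution):
    """Kinetic energy dissipated in one particle-wall impact under the
    coefficient-of-restitution model: dE = 0.5 * m * v^2 * (1 - e^2).
    Illustrative only: real particle dampers sum many such impacts and
    add inter-particle friction losses."""
    return 0.5 * mass * v_rel ** 2 * (1.0 - restitution ** 2)
```

A perfectly elastic impact (e = 1) dissipates nothing, while a perfectly plastic one (e = 0) absorbs the entire relative kinetic energy, which is why softer, lossier particle materials damp more per impact.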
Advantages of particle dampers
They can perform in a large range of temperature without loss |
https://en.wikipedia.org/wiki/Sharp%20%28music%29 | In music, sharp (French: dièse; Greek: díesis) means "higher in pitch". More specifically, in musical notation, sharp means "higher in pitch by one semitone (half step)". A sharp is the opposite of a flat, a lowering of pitch. The ♯ symbol itself is conjectured to be a condensed form of a German ligature (for scharf) or of a symbol for a "cancelled flat".
Examples
A sharp symbol, ♯, is used in key signatures or as an accidental. For instance, the music below has a key signature with three sharps (indicating either A major or F♯ minor, the relative minor), and the note A♯ has a sharp accidental.
Under twelve-tone equal temperament, the pitch B♯, for instance, sounds the same as, or is enharmonically equivalent to, C natural (C), and E♯ is enharmonically equivalent to F. However, in other tuning systems the enharmonic relationships generally differ from those of 12-EDO.
Key signature
When used in a key signature, the key is indicated by writing one or more sharp symbols to the right of the clef.
If sharps are used in a key signature, their effect continues to apply regardless of measure and octave, unless the key changes midway or a note is affected by another accidental such as a natural.
The key signature applies sharps in the following order: F♯, C♯, G♯, D♯, A♯, E♯, B♯. The major scale with one sharp is G major. In all sharp key signatures, the tonic note of the major scale is a minor second above the last sharp, and the tonic note of the minor scale is a major second below the last sharp.
If there are three or more sharps, the tonic note of the minor key is the note of the third-to-last sharp of the key signature. For example, C♯ minor has four sharps (F♯ C♯ G♯ D♯); the third sharp from the last is C♯, which gives C♯ minor. Each new scale begins a fifth above (or a fourth below) the previous scale.
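The rules above can be sketched in a short script (an illustration only; note names are spelled with ASCII "#"):

```python
# Sharps are added in fifths: F#, C#, G#, D#, A#, E#, B#.
SHARP_ORDER = ["F#", "C#", "G#", "D#", "A#", "E#", "B#"]
# Major keys by number of sharps, each a fifth above the previous.
MAJOR_KEYS = ["C", "G", "D", "A", "E", "B", "F#", "C#"]
# The relative minor lies a minor third below each major tonic.
MINOR_KEYS = ["A", "E", "B", "F#", "C#", "G#", "D#", "A#"]

def keys_for_sharps(n):
    """Return (sharps in the signature, major key, relative minor)
    for a key signature with n sharps (0 <= n <= 7)."""
    return SHARP_ORDER[:n], MAJOR_KEYS[n], MINOR_KEYS[n]
```

For three sharps this yields A major and F♯ minor, and for four sharps E major and C♯ minor, matching the examples in the text.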
Order of sharps
The order of sharps in key signature notation is F, C, G, D, A, E, B, each extra sharp being added successively in |
https://en.wikipedia.org/wiki/Radical%20axis | In Euclidean geometry, the radical axis of two non-concentric circles is the set of points whose powers with respect to the two circles are equal. For this reason the radical axis is also called the power line or power bisector of the two circles. In detail:
For two circles c₁, c₂ with centers M₁, M₂ and radii r₁, r₂, the powers of a point P with respect to the circles are Π₁(P) = |PM₁|² − r₁² and Π₂(P) = |PM₂|² − r₂².
Point P belongs to the radical axis if Π₁(P) = Π₂(P).
If the circles have two points in common, the radical axis is the common secant line of the circles.
If point P is outside the circles, P has equal tangential distance to both circles.
If the radii are equal, the radical axis is the perpendicular bisector of the line segment M₁M₂.
In any case the radical axis is a line perpendicular to the center line M₁M₂.
On notations
The notation radical axis was used by the French mathematician M. Chasles as axe radical.
J. V. Poncelet used chorde idéale.
J. Plücker introduced the term Chordale.
J. Steiner called the radical axis the line of equal powers (Linie der gleichen Potenzen), which led to the term power line (Potenzgerade).
Properties
Geometric shape and its position
Let x, m₁, m₂ be the position vectors of the points P, M₁, M₂. Then the defining equation of the radical line can be written as:
|x − m₁|² − r₁² = |x − m₂|² − r₂²
From the right equation one gets
2x · (m₂ − m₁) = |m₂|² − |m₁|² + r₁² − r₂²
The point set of the radical axis is indeed a line and is perpendicular to the line through the circle centers.
(m₂ − m₁ is a normal vector to the radical axis!)
Dividing the equation by 2|m₂ − m₁|, one gets the Hessian normal form. Inserting the position vectors of the centers yields the distances of the centers to the radical axis:
d₁ = (d² + r₁² − r₂²) / (2d),  d₂ = (d² + r₂² − r₁²) / (2d),
with d = |M₁M₂|.
(dᵢ may be negative if the radical axis is not between M₁, M₂.)
If the circles are intersecting at two points, the radical line runs through the common points. If they only touch each other, the radical line is the common tangent line.
Special positions
The radical axis of two intersecting circles is their common secant line.
The radical axis of two touching circles is their common tangent.
The radical axis of two non-intersecting circles is the common secant of two convenient equipower circles (see below).
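As a sketch of the defining condition, expanding the power equality |PM₁|² − r₁² = |PM₂|² − r₂² gives a linear equation in the coordinates of P, which can be computed directly:

```python
def radical_axis(c1, r1, c2, r2):
    """Return (a, b, c) such that a*x + b*y = c is the radical axis of the
    circles with centers c1, c2 (as (x, y) tuples) and radii r1, r2.
    Derived by expanding |P - M1|^2 - r1^2 = |P - M2|^2 - r2^2."""
    (x1, y1), (x2, y2) = c1, c2
    a = 2 * (x2 - x1)
    b = 2 * (y2 - y1)
    c = (x2 ** 2 + y2 ** 2 - x1 ** 2 - y1 ** 2) + r1 ** 2 - r2 ** 2
    return a, b, c
```

For two equal circles the result is the perpendicular bisector of the segment joining the centers, consistent with the special case noted above.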
Orthogonal circles
|
https://en.wikipedia.org/wiki/Drylands | Drylands are defined by a scarcity of water. Drylands are zones where precipitation is balanced by evaporation from surfaces and by transpiration by plants (evapotranspiration). The United Nations Environment Program defines drylands as tropical and temperate areas with an aridity index of less than 0.65. One can classify drylands into four sub-types:
Dry sub-humid lands
Semi-arid lands
Arid lands
Hyper-arid lands
Some authorities regard hyper-arid lands as deserts (United Nations Convention to Combat Desertification - UNCCD) although a number of the world's deserts include both hyper-arid and arid climate zones. The UNCCD excludes hyper-arid zones from its definition of drylands.
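The four sub-types can be sketched in code; the numeric boundaries below follow the commonly cited UNEP aridity-index scheme and are an assumption here, since the text only gives the overall 0.65 cut-off:

```python
def dryland_subtype(precip_mm, pet_mm):
    """Classify a site by the aridity index AI = P / PET (annual
    precipitation over potential evapotranspiration). Thresholds are the
    commonly cited UNEP values (assumed, not stated in the text):
    hyper-arid < 0.05, arid 0.05-0.20, semi-arid 0.20-0.50,
    dry sub-humid 0.50-0.65; AI >= 0.65 is not classed as dryland."""
    ai = precip_mm / pet_mm
    if ai < 0.05:
        return "hyper-arid"
    if ai < 0.20:
        return "arid"
    if ai < 0.50:
        return "semi-arid"
    if ai < 0.65:
        return "dry sub-humid"
    return "not dryland"
```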
Drylands cover 41.3% of the earth's land surface, including 15% of Latin America, 66% of Africa, 40% of Asia, and 24% of Europe. There is a significantly greater proportion of drylands in developing countries (72%), and the proportion increases with aridity: almost 100% of all hyper-arid lands are in the developing world. Nevertheless, the United States, Australia, and several countries in Southern Europe also contain significant dryland areas.
Drylands are complex, evolving structures whose characteristics and dynamic properties depend on many interrelated interactions between climate, soil, and vegetation.
Biodiversity
The livelihoods of millions of people in developing countries depend heavily on dryland biodiversity for their food security and well-being. Drylands, unlike more humid biomes, rely mostly on above-ground water runoff for redistribution of water, and almost all their water redistribution occurs on the surface. The lifestyle of dryland inhabitants provides global environmental benefits that contribute to halting climate change, such as carbon sequestration and species conservation. Dryland biodiversity is equally central to ensuring sustainable development, along with providing significant global economic value through the provision of ecosys |
https://en.wikipedia.org/wiki/Hybrid%20wood | Hybrid wood or wood hybrid systems (WHS) is a multilayer composite material, composed of a surface skin of composite wood (WPC) adhering to an underlying structural core, in general aluminum. Invented in Japan in 2008, this technological evolution is based on wood composite technology, which was conceived in 1972 by Sadao Nishibori and patented in 1983 as a substitute for threatened exotic wood species. WHS are not fibre-reinforced plastic (FRP).
In truth, speaking of hybrid wood profiles only makes sense when the outer layer (the composite wood) and the core (aluminum) adhere sufficiently to each other to prevent any delamination under any climatic conditions to which they may be exposed. To this day, only high-temperature coextrusion of a powerful adhesive and of composite wood allows the "fusion" of two materials as different as aluminum and composite wood. This adhesion is so strong that bending the profiles is possible even at very low radii, thus significantly broadening the fields of application.
In the construction, decoration or design industry sector, hybrid woods have the look, feel and sometimes smell of natural wood. They are easier to install and better performing than natural wood and their exceptional properties allow utilizations that are many times broader than those of wood. They are used in exterior and interior applications such as façades trims, cladding, louvers, screens, pergola, canopies or any durable installations such as urban furniture for instance.
Manufacturing
Wood hybrid profiles are obtained through extrusion. The wood composite covers a core of anodized aluminum. The optimal adhesion between these two materials is made possible by applying a coextruded intermediate adhesive layer. The wood composite layer can be applied to only one side of the profile if so required. The wood fibers to resins ratio as well as the type of core vary according to the desired characteristics. Manufacturing of these p |
https://en.wikipedia.org/wiki/Current%20ratio | The current ratio is a liquidity ratio that measures whether a firm has enough resources to meet its short-term obligations. It compares a firm's current assets to its current liabilities, and is expressed as follows:
Current ratio = Current assets / Current liabilities
The current ratio is an indication of a firm's liquidity. Acceptable current ratios vary from industry to industry. In many cases, a creditor would consider a high current ratio to be better than a low current ratio, because a high current ratio indicates that the company is more likely to pay the creditor back. Large current ratios are not always a good sign for investors. If the company's current ratio is too high it may indicate that the company is not efficiently using its current assets or its short-term financing facilities.
If current liabilities exceed current assets the current ratio will be less than 1. A current ratio of less than 1 indicates that the company may have problems meeting its short-term obligations. Some types of businesses can operate with a current ratio of less than one, however. If inventory turns into cash much more rapidly than the accounts payable become due, then the firm's current ratio can comfortably remain less than one. Inventory is valued at the cost of acquiring it and the firm intends to sell the inventory for more than this cost. The sale will therefore generate substantially more cash than the value of inventory on the balance sheet. Low current ratios can also be justified for businesses that can collect cash from customers long before they need to pay their suppliers.
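A minimal sketch of the computation, with the rough reading discussed above (the sample figures are hypothetical):

```python
def current_ratio(current_assets, current_liabilities):
    """Current ratio = current assets / current liabilities."""
    return current_assets / current_liabilities

def describe(ratio):
    """Rough reading of the ratio, per the discussion above."""
    if ratio < 1.0:
        return "current liabilities exceed current assets"
    return "current assets cover current liabilities"
```

A firm with 150 in current assets and 100 in current liabilities has a ratio of 1.5; as the text notes, whether any given value is "good" depends on the industry and on how quickly inventory converts to cash.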
Limitations
The ratio is only useful when two companies are compared within the same industry, because inter-industry business operations differ substantially.
To determine liquidity, the current ratio is not as helpful as the quick ratio, because it includes all those assets that may not be easily liquidated, like prepaid expenses and inventory.
See also
Debt ratio
Quick ratio
Ratio |
https://en.wikipedia.org/wiki/Zeitschrift%20f%C3%BCr%20Physik | Zeitschrift für Physik (English: Journal for Physics) is a defunct series of German peer-reviewed physics journals established in 1920 by Springer Berlin Heidelberg. The series ended publication in 1997, when it merged with other journals to form the new European Physical Journal series. It had expanded to four parts over the years.
History
Zeitschrift für Physik (1920–1975). The first three issues were published as a supplement to Verhandlungen der Deutschen Physikalischen Gesellschaft. The journal split into parts A and B in 1975.
Zeitschrift für Physik A (1975–1997). The original subtitle was Atoms and Nuclei. In 1986, it split into Zeitschrift für Physik A: Atomic Nuclei and Zeitschrift für Physik D: Atoms, Molecules and Clusters. Zeitschrift für Physik A continues as the European Physical Journal A.
Zeitschrift für Physik B (1975–1997) resulted from the split of Zeitschrift für Physik and the merger with Physics of Condensed Matter. Physics of Condensed Matter was itself the continuation of Physik der Kondensierten Materie. The original subtitle of Zeitschrift für Physik B was Condensed Matter and Quanta but changed to Condensed Matter in 1980. Zeitschrift für Physik B merged with Journal de Physique I to form the European Physical Journal B.
Zeitschrift für Physik C: Particles and Fields (1979–1997), continuing as the European Physical Journal C.
Zeitschrift für Physik D: Atoms, Molecules and Clusters (1986–1997). Zeitschrift für Physik D merged with Journal de Physique II to form the European Physical Journal D.
See also
Physikalische Zeitschrift
External links
Z. Phys. A website
Z. Phys. B website
Z. Phys. C website
Z. Phys. D website |
https://en.wikipedia.org/wiki/Future%20Evolution | Future Evolution is a book written by paleontologist Peter Ward and illustrated by Alexis Rockman. He addresses his own opinion of future evolution and compares it with Dougal Dixon's After Man: A Zoology of the Future and H. G. Wells's The Time Machine.
According to Ward, humanity may exist for a long time. Nevertheless, we are impacting our planet. He splits his book into different chronologies, starting with the near future (the next 1,000 years). Humanity would be struggling to support a massive population of 11 billion. Global warming raises sea levels. The ozone layer weakens. Most of the available land is devoted to agriculture due to the demand for food. Despite all this, oceanic wildlife remains largely untouched by most of these impacts, particularly the commercially farmed fish. This is, according to Ward, an era of extinction that would last about 10 million years (note that many human-caused extinctions have already occurred). After that, Earth gets stranger.
Ward labels the species that have the potential to survive in a human-infested world. These include dandelions, raccoons, owls, pigs, cattle, rats, snakes, and crows to name but a few. In the human-infested ecosystem, those preadapted to live amongst man survived and prospered. Ward describes garbage dumps 10 million years in the future infested with multiple species of rats, a snake with a sticky frog-like tongue to snap up rodents, and pigs with snouts specialized for rooting through garbage. The story's time traveller who views this new refuse-covered habitat is gruesomely attacked by ravenous flesh-eating crows.
Ward then questions the potential for humanity to evolve into a new species. According to him, this is incredibly unlikely: for this to happen, a human population must isolate itself and interbreed until it becomes a new species. Then he questions whether humanity will survive or drive itself to extinction through climate change, nuclear war, disease, or the threat of nanotechnology as a terrorist weapon |
https://en.wikipedia.org/wiki/Kurosh%20problem | In mathematics, the Kurosh problem is one general problem, and several more special questions, in ring theory. The general problem is known to have a negative solution, since one of the special cases has been shown to have counterexamples. These matters were brought up by Aleksandr Gennadievich Kurosh as analogues of the Burnside problem in group theory.
Kurosh asked whether there can be a finitely-generated infinite-dimensional algebraic algebra (the problem being to show this cannot happen). A special case is whether or not every nil algebra is locally nilpotent.
For PI-algebras the Kurosh problem has a positive solution.
For the general case, Golod constructed a counterexample (a finitely generated infinite-dimensional algebraic algebra) as an application of the Golod–Shafarevich theorem.
The Kurosh problem on group algebras concerns the augmentation ideal I. If I is a nil ideal, is the group algebra locally nilpotent?
There is an important problem often referred to as Kurosh's problem on division rings. The problem asks whether there exists an algebraic (over the center) division ring which is not locally finite. This problem remains unsolved. |
https://en.wikipedia.org/wiki/Non-volatile%20random-access%20memory | Non-volatile random-access memory (NVRAM) is random-access memory that retains data without applied power. This is in contrast to dynamic random-access memory (DRAM) and static random-access memory (SRAM), which both maintain data only for as long as power is applied, or forms of sequential-access memory such as magnetic tape, which cannot be randomly accessed but which retains data indefinitely without electric power.
Read-only memory devices can be used to store system firmware in embedded systems such as an automotive ignition system control or home appliance. They are also used to hold the initial processor instructions required to bootstrap a computer system. Read-write memory can be used to store calibration constants, passwords, or setup information, and may be integrated into a microcontroller.
If the main memory of a computer system were non-volatile, it would greatly reduce the time required to start a system after a power interruption. Currently existing types of semiconductor non-volatile memory have limitations in memory size, power consumption, or operating life that make them impractical for main memory. Development is ongoing on the use of non-volatile memory chips as a system's main memory, as persistent memory. A standard for persistent memory known as NVDIMM-P was published in 2021.
Early NVRAMs
Early computers used core and drum memory systems which were non-volatile as a byproduct of their construction. The most common form of memory through the 1960s was magnetic-core memory, which stored data in the polarity of small magnets. Since the magnets held their state even with the power removed, core memory was also non-volatile. Other memory types required constant power to retain data, such as vacuum tube or solid-state flip-flops, Williams tubes, and semiconductor memory (static or dynamic RAM).
Advances in semiconductor fabrication in the 1970s led to a new generation of solid state memories that magnetic-core memory could not match |
https://en.wikipedia.org/wiki/Vaccination%20Week%20In%20The%20Americas | Vaccination Week In The Americas (VWA) is an annual public health campaign by the member states of the Pan American Health Organization (PAHO) to promote equity and access to immunization. It is marked each year during the last week of April.
Since 2012, the campaign has been celebrated globally across all regions of the World Health Organization (WHO) under World Immunization Week. The WHO reports there are 25 diseases for which vaccines exist for their prevention and control.
History
VWA was initially proposed and implemented on April 23, 2002, under the presidency of Dr. Patricio Jamriska, Secretary of Health of the Republic of Ecuador and by the Ministers of Health in the Andean Region following a measles outbreak in Venezuela and Colombia. Originally endorsed through the adoption of Resolution CD44.R by the Directing Council of the Pan American Health Organization (PAHO), the overarching objectives of VWA are to
promote equity and access to immunization and strengthen immunization programs in the Region of the Americas by “reaching the unreached”, populations with otherwise limited access to regular health services and at heightened risk of contracting vaccine-preventable diseases. In the Americas, such populations can be found in areas such as urban fringes, low coverage municipalities, indigenous communities and rural and/or border zones.
VWA has been used to conduct large-scale vaccination campaigns, introduce new vaccines, carry out communication and social mobilization campaigns, and to integrate other health promotion measures. Since its launch in 2003, over 365 million individuals have been vaccinated through campaigns conducted under the framework of the initiative.
In April 2012, PAHO joined the other five WHO regions to celebrate World Immunization Week. This followed the launch of European Immunization Week (EIW) in the WHO European region in 2005, and Vaccination Week campaigns launched in the WHO Eastern Mediterranean region in 2010, and in th |
https://en.wikipedia.org/wiki/Amenamevir | Amenamevir (trade name Amenalief) is an antiviral drug used for the treatment of shingles (herpes zoster).
It acts as an inhibitor of the zoster virus's helicase–primase complex. Amenamevir was approved in Japan for the treatment of shingles in 2017. |
https://en.wikipedia.org/wiki/ParaHox | The ParaHox gene cluster is an array of homeobox genes (involved in morphogenesis, the regulation of patterns of anatomical development) from the Gsx, Xlox (Pdx) and Cdx gene families.
Regulatory gene cluster
These genes were first shown to be arranged into a physically-linked chromosomal cluster in amphioxus, an invertebrate with a single member of each of the three gene families. All the ParaHox genes in the amphioxus genome are therefore in the ParaHox gene cluster. In contrast, the human genome has six ParaHox genes (GSX1, GSX2, PDX1, CDX1, CDX2, CDX4), of which three genes (GSX1, PDX1 (=IPF1), CDX2) are physically linked to form a human ParaHox gene cluster on chromosome 13. Mouse has a homologous ParaHox gene cluster on chromosome 5. The other three human ParaHox genes are remnants from duplicated ParaHox gene clusters that were generated in the 2R genome duplications at the base of vertebrate evolution. Some vertebrates, notably chondrichthyan fish and coelacanths, have retained an additional ParaHox gene (PDX2).
The ParaHox gene cluster has been proposed to be a paralogue, or evolutionary sister, of the Hox gene cluster, the two gene clusters being descended from a segmental duplication early in animal evolution, preceding the divergence of cnidarians and bilaterian animals. It has been suggested that an ancient role of the ParaHox gene cluster in bilaterians was to specify or pattern the through-gut, with Gsx patterning the mouth, Xlox (=Pdx) patterning the midgut, and Cdx marking the anus. Gene expression and functional data lend tentative support to this hypothesis, although in many animals the roles of the genes have changed in evolution, notably the Gsx gene family, which plays a role in brain (not foregut) development in vertebrates. |
https://en.wikipedia.org/wiki/Permissive%20hypotension | Permissive hypotension or hypotensive resuscitation is the use of restrictive fluid therapy, specifically in the trauma patient, that increases systemic blood pressure without reaching normotension (normal blood pressure). The goal blood pressure for these patients is a mean arterial pressure of 40–50 mmHg or a systolic blood pressure of 80 mmHg or less, alongside certain clinical criteria. Following traumatic injury, some patients experience hypotension (low blood pressure) that is usually due to blood loss (hemorrhage) but can be due to other causes as well (for example, blood leaking around an abdominal aortic aneurysm). In the past, physicians were very aggressive with fluid resuscitation (giving fluids such as normal saline or lactated Ringer's through the vein) to try to bring the blood pressure to normal values. Recent studies have found that there is some benefit to allowing specific patients to experience some degree of hypotension in certain settings. This concept does not exclude therapy by means of i.v. fluids, inotropes or vasopressors; the only restriction is to avoid completely normalizing blood pressure in a context where blood loss may be enhanced. When a person starts to bleed, whether the bleed is large or small, the body starts a natural coagulation process that eventually stops the bleeding. Problems with fluid resuscitation without control of bleeding are thought to be secondary to dislodgement of the thrombus (blood clot) that is helping to control further bleeding. Thrombus dislodgement was found to occur at a systolic pressure greater than 80 mmHg. In addition, fluid resuscitation dilutes the coagulation factors that help form and stabilize a clot, making it harder for the body to use its natural mechanisms to stop the bleeding. These factors are aggravated by hypothermia (if fluids are administered without being warmed first, body temperature will drop).
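The blood-pressure targets above can be sketched with the standard MAP approximation (an illustration only, not clinical guidance):

```python
def mean_arterial_pressure(systolic, diastolic):
    """Estimate MAP with the standard approximation DBP + (SBP - DBP)/3."""
    return diastolic + (systolic - diastolic) / 3.0

def within_permissive_target(systolic, diastolic):
    """Check against the goals stated above: MAP of 40-50 mmHg, or a
    systolic pressure of 80 mmHg or less. Illustrative only; the actual
    decision also depends on clinical criteria not modeled here."""
    map_estimate = mean_arterial_pressure(systolic, diastolic)
    return 40.0 <= map_estimate <= 50.0 or systolic <= 80.0
```

Note that a normotensive 120/80 reading gives a MAP near 93 mmHg, well above the permissive range, which is precisely what this strategy avoids while hemorrhage is uncontrolled.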
It is becoming common in hemorrhaging patients without traumatic brain injury. |
https://en.wikipedia.org/wiki/Pleurotus%20nebrodensis | Pleurotus nebrodensis, commonly known as "fungus of basilisk" or "macchia carduncieddu(?)", is a fungus that was declared by the IUCN as critically endangered in 2006. This fungus only grows on limestone in northern Sicily in association with Cachrys ferulacea (family Apiaceae). The characteristics of the mushroom are its creamy white to yellow colour, its diameter of between , its extremely angled gills, and the breaking apart of the cap surface at maturity.
Taxonomy
The first record of the mushroom was in 1866 by Italian botanist Giuseppe Inzenga, who named it Agaricus nembrodensis; it was described as "the most delicious mushroom of the Sicilian mycological flora". This was widely agreed upon, which has led to widespread cultivation by professionals and amateurs. In 1886, French mycologist Lucien Quélet transferred the species to the genus Pleurotus. Recent research has shown that P. nebrodensis is closely related to, but distinct from, Pleurotus eryngii, which also occurs in the Mediterranean Basin and is also associated with plants in the family Apiaceae.
Decline
Pleurotus nebrodensis was classified as critically endangered because it is found in an area of less than , and the population has become fragmented. In addition, there are fewer mature fungi, and the species is losing its natural habitat. An additional reason for the decline is that collectors are now picking unripe specimens. It is currently estimated that fewer than 250 Pleurotus nebrodensis reach maturity every year.
Conservation
Currently there are no laws to protect Pleurotus nebrodensis. Even in protected areas there has not been a ban on picking of unripe specimens. However, a draft of rules has been created, and could be approved. This draft proposes protecting all ages of the Pleurotus nebrodensis in one part of Madonie Park, a sanctuary, while in another section of the park it will protect any non-mature mushroom. In addition to this, this fungus is being grown, like a crop, to reduce th |
https://en.wikipedia.org/wiki/Cardo | A cardo (plural cardines) was a north–south street in Ancient Roman cities and military camps as an integral component of city planning. The cardo maximus, or most often the cardo, was the main or central north–south-oriented street.
Etymology
The cardo maximus was the "hinge" or axis of the city, derived from Greek καρδίᾱ, kardia ("heart") and as such was generally lined with shops and vendors, and served as a hub of economic life.
Most Roman cities also had a Decumanus Maximus, an east–west street that served as a secondary main street. Due to varying geography, in some cities the Decumanus is the main street and the Cardo is secondary, but in general the Cardo maximus served as the primary street. The Forum was normally located at, or close to, the intersection of the Decumanus and the Cardo.
Examples
Apamea, Syria
The Cardo Maximus of Apamea, Syria ran through the centre of the city directly from north to south, linked the principal gates of the city, and was originally surrounded by 1200 columns with unique spiral fluting, each subsequent column spiralling in the opposite direction. The thoroughfare was about 1.85 kilometres long and 37 metres wide, as it was used for wheeled transport. The great colonnade was erected in the 2nd century and was still standing until the 12th century, when the earthquakes of 1157 and 1170 demolished it. The cardo was lined on both sides with civic and religious buildings.
Cologne, Germany
Hohe Strasse and Schildergasse in Cologne, Germany, may be taken as examples of streets that have kept their course and function of Cardo and Decumanus Maximus until this day.
Jerash, Jordan
The excavations at Jerash in Jordan have unearthed the remains of an ancient Roman city on the site, with the main feature of the city being a colonnaded cardo. The original road surface survived.
Jerusalem
During the visit of Hadrian to Judea in the 130s AD, Jerusalem's ruins were surveyed and Hadrian decided to build a Roman colony in its pla |
https://en.wikipedia.org/wiki/Inferior%20mesenteric%20lymph%20nodes | The inferior mesenteric lymph nodes consist of:
(a) small glands on the branches of the left colic and sigmoid arteries
(b) a group in the sigmoid mesocolon, around the superior hemorrhoidal artery
(c) a pararectal group in contact with the muscular coat of the rectum
Structure
The inferior mesenteric lymph nodes are lymph nodes present throughout the hindgut.
Afferents
The inferior mesenteric lymph nodes drain structures related to the hindgut; they receive lymph from the descending colon, sigmoid colon, and proximal part of the rectum.
Efferents
They drain into the superior mesenteric lymph nodes and ultimately to the preaortic lymph nodes. Lymph nodes surrounding the inferior mesenteric artery drain directly into the preaortic nodes.
Clinical significance
Colorectal cancer may metastasise to the inferior mesenteric lymph nodes. For this reason, the inferior mesenteric artery may be removed in people with lymph node-positive cancer. This has been proposed since at least 1908, by surgeon William Ernest Miles.
Additional images |
https://en.wikipedia.org/wiki/Family%20of%20curves | In geometry, a family of curves is a set of curves, each of which is given by a function or parametrization in which one or more of the parameters is variable. In general, the parameter(s) influence the shape of the curve in a way that is more complicated than a simple linear transformation. Sets of curves given by an implicit relation may also represent families of curves.
Families of curves appear frequently in solutions of differential equations; when an additive constant of integration is introduced, it will usually be manipulated algebraically until it no longer represents a simple linear transformation.
Families of curves may also arise in other areas. For example, all non-degenerate conic sections can be represented using a single polar equation with one parameter, the eccentricity e of the curve: r = ℓ/(1 + e cos θ), where ℓ is the semi-latus rectum. As the value of e changes, the appearance of the curve varies in a relatively complicated way.
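A small numerical sketch of this one-parameter family, assuming the standard polar form r = ℓ/(1 + e·cos θ) with semi-latus rectum ℓ = 1 (the function name is illustrative):

```python
import math

def conic_r(e, theta, l=1.0):
    # Polar form of a conic with one focus at the origin:
    # r = l / (1 + e*cos(theta)), where l is the semi-latus rectum.
    return l / (1 + e * math.cos(theta))

# e = 0 gives a circle (r constant); e = 0.5 an ellipse; e = 1 a parabola.
for e in (0.0, 0.5, 1.0):
    rs = [round(conic_r(e, t), 3) for t in (0.0, math.pi / 2)]
    print(e, rs)
```

For e = 0 the radius is the same at every angle (a circle), while increasing e stretches the curve away from the focus in a way no single linear transformation captures.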
Applications
Families of curves may arise in various topics in geometry, including the envelope of a set of curves and the caustic of a given curve.
Generalizations
In algebraic geometry, an algebraic generalization is given by the notion of a linear system of divisors.
External links
Algebraic geometry |
https://en.wikipedia.org/wiki/Aoyagi%20Metals%20Company | Aoyagi Metals Company (commonly known as Ayk in America and Aoyagi in Japan) was a Japanese company that became notable in the 1980s for its radio-controlled cars.
The company began in the 1960s manufacturing metal chassis for slotcar racing; 1978 saw the introduction of the RX1200, a 1:12 scale on-road racer, which was the start of the RX series of 1/12-scale on-road chassis. The RX2000 followed and took the Japan Model Racing Car Association All-Japan Championship title in 1980, the first of three consecutive titles for the company. In 1984, Joel Johnson won the title on his first trip to Japan with Trinity.
Conventional manufacturers mounted their differential gears between the right rear wheel and the motor compartment, but Ayk placed theirs within the axle, which centered the weight more evenly and kept the differential gears protected. By 1984, however, the company had switched to an outboard gear differential much like Delta's design of the time.
In 1982, the same year that Ayk took its third consecutive 1:12 title (with the RX3000 EXL480), the company followed other manufacturers into the 1:10 off-road buggy market with the 566 B Super Trail. The buggy included an aluminum chassis, an enclosed transmission with all-aluminum gears, and a waterproof radio case. This model was closely followed in the early eighties by a series of race-ready 1:10 off-road buggies which saw reasonable success on the semi-pro circuit.
By the mid-1980s, Ayk abandoned the 1:10 two-wheel-drive off-road to concentrate on off-road four-wheel-drive models and on-road 1/12 scale.
After winning the Japan Model Racing Car Association 1:12 on-road title with the Super Parsec, the company began its decline following the death of its president in 1989. The Japanese asset price bubble (バブル景気 baburu keiki) bursting in Japan was the other primary reason AYK closed down. These two events had more to do with AYK closing than the common overseas racing car market rumor.
One of its staff members, Tatsuro Watanabe, left the company by 1986 to |
https://en.wikipedia.org/wiki/Facilitation%20cascade | A facilitation cascade is a sequence of ecological interactions that occur when a species benefits a second species that in turn has a positive effect on a third species. These facilitative interactions can take the form of amelioration of environmental stress and/or provision of refuge from predation. Autogenic ecosystem engineering species, structural species, habitat-forming species, and foundation species are associated with the most commonly recognized examples of facilitation cascades, sometimes referred to as habitat cascades. Facilitation generally is a much broader concept that includes all forms of positive interactions, including pollination, seed dispersal, and co-evolved commensal and mutualistic relationships, such as between cnidarian hosts and Symbiodinium in corals, and between algae and fungi in lichens. As such, facilitation cascades are widespread through all of the Earth's major biomes, with consistently positive effects on the abundance and biodiversity of associated organisms.
Overview
Facilitation cascades occur when prevalent foundation species, or less abundant but ecologically important keystone species, are involved in a hierarchy of positive interactions and consist of a primary facilitator which positively affects one or more secondary facilitators which support a suite of beneficiary species. Facilitation cascades at a minimum have a primary and secondary facilitator, although tertiary, quaternary, etc. facilitators may be found in some systems.
A typical example of facilitation cascades in a tropical coastal ecosystem
Origin of concept and related terms
The term facilitation cascade was coined by Altieri, Silliman, and Bertness during a study on New England cobblestone beaches to explain the chain of positive interactions that allow a diverse community to exist in a habitat that is otherwise characterized by substrate instability, elevated temperatures, and desiccation stress. Cordgrass is able to establish independently, and t |
https://en.wikipedia.org/wiki/Hydrogen-like%20atom | A hydrogen-like atom (or hydrogenic atom) is any atom or ion with a single valence electron. These atoms are isoelectronic with hydrogen. Examples of hydrogen-like atoms include, but are not limited to, hydrogen itself, all alkali metals such as Rb and Cs, singly ionized alkaline earth metals such as Ca+ and Sr+ and other ions such as He+, Li2+, and Be3+ and isotopes of any of the above. A hydrogen-like atom includes a positively charged core consisting of the atomic nucleus and any core electrons as well as a single valence electron. Because helium is common in the universe, the spectroscopy of singly ionized helium is important in EUV astronomy, for example, of DO white dwarf stars.
The non-relativistic Schrödinger equation and relativistic Dirac equation for the hydrogen atom can be solved analytically, owing to the simplicity of the two-particle physical system. The one-electron wave function solutions are referred to as hydrogen-like atomic orbitals. Hydrogen-like atoms are of importance because their corresponding orbitals bear similarity to the hydrogen atomic orbitals.
Other systems may also be referred to as "hydrogen-like atoms", such as muonium (an electron orbiting an antimuon), positronium (an electron and a positron), certain exotic atoms (formed with other particles), or Rydberg atoms (in which one electron is in such a high energy state that it sees the rest of the atom effectively as a point charge).
Schrödinger solution
In the solution to the Schrödinger equation, which is non-relativistic, hydrogen-like atomic orbitals are eigenfunctions of the one-electron angular momentum operator L and its z component Lz. A hydrogen-like atomic orbital is uniquely identified by the values of the principal quantum number n, the angular momentum quantum number l, and the magnetic quantum number m. The energy eigenvalues do not depend on l or m, but solely on n. To these must be added the two-valued spin quantum number ms = ±1/2, setting the stage for the Aufbau p
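The n-only dependence of the energies can be written with the Bohr formula, E_n = −Z²·Ry/n², where Ry ≈ 13.6057 eV is the Rydberg energy; a quick numerical sketch (function name is illustrative):

```python
RYDBERG_EV = 13.605693  # Rydberg energy in eV

def energy_level(Z, n):
    # Bohr formula for a hydrogen-like ion of nuclear charge Z:
    # E_n = -Z^2 * Ry / n^2 (depends on n only, not on l or m)
    return -Z**2 * RYDBERG_EV / n**2

print(energy_level(1, 1))   # hydrogen ground state, about -13.6 eV
print(energy_level(2, 1))   # He+ ground state is 4x deeper
```

Note how the He+ ground state is four times deeper than hydrogen's, which is why singly ionized helium radiates in the EUV.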
https://en.wikipedia.org/wiki/Biochemical%20and%20Biophysical%20Research%20Communications | Biochemical and Biophysical Research Communications is a weekly peer-reviewed scientific journal covering all aspects of biochemistry and biophysics. It was established in 1959 by Academic Press and is currently published by Elsevier. The editor-in-chief is Wolfgang Baumeister (Max Planck Institute of Biochemistry).
According to the Journal Citation Reports, the journal has a 2020 impact factor of 3.575.
Abstracting and indexing
The journal is abstracted and indexed in: |
https://en.wikipedia.org/wiki/Technics%20Digital%20Link | The Technics Digital Link interface was introduced by Panasonic at the Internationale Funkausstellung 2014 in Berlin as an integral part of the new "R1 Reference Class" of audio components. At the same time, Panasonic relaunched the Technics brand itself.
So far, only two devices have been announced with this interface: the SE-R1 stereo power amplifier and the SU-R1 network audio control player/preamplifier.
Design
As can be seen in the product images, the Technics Digital Link uses an RJ45-style connector and therefore most likely twisted-pair cables with an unknown number of pairs.
The interface transmits digital audio data at a sampling frequency of 192 kHz with 32-bit resolution, together with some control data. The so-called "VR control data" is used to adjust the output volume inside the power amplifier rather than in the preamplifier, which would be the case in analogue setups.
The left and right channels of digital audio are transmitted on separate digital links and cables, which seems very unusual compared to S/PDIF, for example. If Cat.3 unshielded twisted-pair cable is used to connect the components, the cable itself is rated for bandwidths up to 16 MHz.
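As a back-of-the-envelope check, assuming the figures mean 192 kHz sampling at 32 bits per sample (the actual line coding and framing of the interface are not public):

```python
SAMPLE_RATE_HZ = 192_000
BITS_PER_SAMPLE = 32

# Raw audio payload for one channel, ignoring framing and control-data
# overhead, which is not publicly specified.
payload_bps = SAMPLE_RATE_HZ * BITS_PER_SAMPLE
print(payload_bps)  # 6144000, i.e. 6.144 Mbit/s per channel
```

Even with generous framing overhead, one channel's payload sits comfortably within what a 16 MHz-class cable can carry, consistent with the one-channel-per-link design.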
The connectors are clearly labeled as "IN" and "OUT", so the connections between the audio components will most likely be unidirectional point-to-point connections. |
https://en.wikipedia.org/wiki/Newman%27s%20conjecture | In mathematics, specifically in number theory, Newman's conjecture is a conjecture about the behavior of the partition function modulo any integer. Specifically, it states that for any integers m and r such that 0 ≤ r < m, the value of the partition function p(n) satisfies the congruence p(n) ≡ r (mod m) for infinitely many non-negative integers n. It was formulated by the mathematician Morris Newman in 1960. It is unsolved as of 2020.
History
Oddmund Kolberg was probably the first to prove a related result, namely that the partition function takes both even and odd values infinitely often. The proof employed was of elementary nature and easily accessible, and was proposed as an exercise by Newman in the American Mathematical Monthly.
A year later, in 1960, Newman proposed the conjecture and proved the cases m = 5 and m = 13 in his original paper, and m = 65 two years later.
Ken Ono, an American mathematician, made further advances by exhibiting sufficient conditions for the conjecture to hold for prime m. He first showed that Newman's conjecture holds for prime m if, for each r between 0 and m − 1, there exists a nonnegative integer n such that the following holds:
He used the result, together with a computer program, to prove the conjecture for all primes less than 1000 (except 3). Ahlgren expanded on his result to show that Ono's condition is, in fact, true for all composite numbers coprime to 6.
Three years later, Ono showed that for every prime greater than 3, one of the following must hold:
Newman's conjecture holds for m, or
for all nonnegative integers , and .
Using computer technology, he proved the theorem for all primes less than 200,000 (except 3).
Afterwards, Ahlgren and Boylan used Ono's criterion to extend Newman's conjecture to all primes except possibly 3. Two years later, they extended their result to all prime powers except powers of 2 or 3.
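The conjecture is easy to probe empirically for small moduli. A sketch using Euler's pentagonal-number recurrence for the partition function (the cutoff of 500 is arbitrary; finitely many values can of course only support, not prove, the conjecture):

```python
def partitions_upto(N):
    # Euler's pentagonal number recurrence:
    # p(n) = sum_{k>=1} (-1)^(k+1) * [p(n - k(3k-1)/2) + p(n - k(3k+1)/2)]
    p = [0] * (N + 1)
    p[0] = 1
    for n in range(1, N + 1):
        total, k = 0, 1
        while True:
            g1 = k * (3 * k - 1) // 2   # generalized pentagonal numbers
            g2 = k * (3 * k + 1) // 2
            if g1 > n:
                break
            sign = 1 if k % 2 else -1
            total += sign * p[n - g1]
            if g2 <= n:
                total += sign * p[n - g2]
            k += 1
        p[n] = total
    return p

p = partitions_upto(500)
# Every residue class mod 5 already occurs among the first 500 values,
# consistent with Newman's conjecture for m = 5.
residues = {p[n] % 5 for n in range(501)}
print(sorted(residues))  # [0, 1, 2, 3, 4]
```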
Partial progress and solved cases
The weaker statement that the congruence has at least one solution has been proved for all m. It was formerly known as the Erdő
https://en.wikipedia.org/wiki/Apache%20Shiro | Apache Shiro (pronounced "sheeroh", a Japanese word for castle ) is an open source software security framework that performs authentication, authorization, cryptography and session management. Shiro has been designed to be an intuitive and easy-to-use framework while still providing robust security features.
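Shiro's user, role, and permission model can be illustrated with its documented INI configuration format (a minimal sketch; the user names, passwords, and roles here are invented):

```ini
# shiro.ini -- a minimal realm definition
[users]
# username = password, role1, role2, ...
root  = secret, admin
guest = guest

[roles]
# role = comma-separated permission wildcards
admin = *
```

At runtime, Shiro resolves a subject's authentication against the `[users]` entries and its authorization against the permissions attached to the subject's roles.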
History
Shiro's predecessor, JSecurity, was founded in 2004 by Les Hazlewood and Jeremy Haile because they could not find a suitable Java security framework that operated well at the application level and they were frustrated with JAAS. Between 2004 and 2008, JSecurity was hosted on SourceForge and its committer list grew to include Peter Ledbrook, Alan Ditzel, and Tim Veil.
In 2008, the JSecurity project was submitted to the Apache Software Foundation (ASF) and accepted into its Incubator program to be stewarded by mentors in order to become a top-level Apache project. Under the ASF's Incubator, JSecurity was first renamed Ki (pronounced "key") and shortly afterwards renamed Shiro by the community because of trademark concerns.
The project continued to grow while in the Apache Incubator, adding Kalle Korhonen as a project committer. In July 2010, the Shiro community released its official version 1.0, marking a period of stability in the code base. Following the release of version 1.0, the Shiro community created a Project Management Committee and elected Les Hazlewood as its chair. On September 22, 2010, Shiro became a top-level project (TLP) in the Apache Software Foundation.
Releases
1.12.0 on 2023-07-18 (current stable release)
1.11.0 on 2023-01-13
1.10.1 on 2022-11-19
1.10.0 on 2022-10-10
1.9.1 on 2022-06-28
1.9.0 on 2022-03-22
1.8.0 on 2021-08-26
1.7.1 on 2021-01-31
1.7.0 on 2020-10-29
1.6.0 on 2020-08-17
1.5.3 on 2020-05-03
1.5.2 on 2020-03-23
1.5.1 on 2020-02-23
1.5.0 on 2020-01-24
1.4.2 on 2019-11-18
1.4.1 on 2019-04-18
1.4.0 on 2017-05-05
1.3.2 on 2016-09-11
1.3.1 on 2016-08-29
1.3.0 on 2016-07-25
1.2.6 on 2016-06-28
1.2.5 on 2016-05- |
https://en.wikipedia.org/wiki/HEPPS%20%28buffer%29 | HEPPS (EPPS) is a buffering agent used in biology and biochemistry. The pKa of HEPPS is 8.00. It is one of Good's buffers.
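Given the pKa, the working pH of a HEPPS buffer follows from the Henderson–Hasselbalch equation, a general relation for weak-acid buffers rather than anything specific to HEPPS:

```python
import math

PKA_HEPPS = 8.00

def buffer_ph(base_conc, acid_conc, pka=PKA_HEPPS):
    # Henderson-Hasselbalch: pH = pKa + log10([A-] / [HA])
    return pka + math.log10(base_conc / acid_conc)

print(buffer_ph(1.0, 1.0))   # equal concentrations -> pH equals the pKa, 8.0
print(buffer_ph(10.0, 1.0))  # 10:1 base:acid -> pH 9.0
```

Buffering capacity is greatest when the two concentrations are comparable, i.e. near pH 8 for HEPPS.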
Research on mice with Alzheimer's disease-like amyloid beta plaques has shown that HEPPS can cause the plaques to break up, reversing some of the symptoms in the mice. HEPPS was reported to dissociate amyloid beta oligomers in patients' plasma samples enabling blood diagnosis of Alzheimer's disease.
See also
CAPSO
CHES
HEPES |
https://en.wikipedia.org/wiki/Evolution%20in%20Mendelian%20Populations | "Evolution in Mendelian Populations" is a lengthy 1931 scientific paper on evolution by the American population geneticist Sewall Wright.
The paper was first published in Genetics volume 16, pages 97–159. In it, Wright outlines various concepts, including genetic drift, effective population size, and inbreeding.
The paper received a contemporary review by R. A. Fisher.
Overview
Earlier students of evolution such as Lamarck and those who postulated the inheritance of acquired characteristics (e.g. Theodor Eimer and Edward Drinker Cope) were concerned with heredity and sought a link from one generation to the next. Lamarck thought that bodily responses acquired in one generation should be passed along to future generations, which Wright refers to as "direct evolution". Wright argues that the birth of genetics stems from Mendelian inheritance principles, and so "any theory of evolution" must also be based on Mendelian inheritance.
See also
Evolutionary biology |
https://en.wikipedia.org/wiki/Camptodactyly | Camptodactyly is a medical condition that causes one or more fingers or toes to be permanently bent. It involves fixed flexion deformity of the proximal interphalangeal joints.
Camptodactyly can be caused by a genetic disorder. In that case, it is an autosomal dominant trait that is known for its incomplete genetic expressivity. This means that when a person has the genes for it, the condition may appear in both hands, one, or neither. A linkage scan proposed that the chromosomal locus of camptodactyly was 3q11.2-q13.12.
Causes
The specific cause of camptodactyly remains unknown, but a few deficiencies are known to lead to the condition: a deficient lumbrical muscle controlling flexion of the fingers, and abnormalities of the flexor and extensor tendons.
A number of congenital syndromes may also cause camptodactyly:
Jacobsen syndrome
Beals syndrome
Blau syndrome
Freeman–Sheldon syndrome
Cerebrohepatorenal syndrome
Weaver syndrome
Christian syndrome 1
Gordon syndrome
Jacobs arthropathy-camptodactyly syndrome
Lenz microphthalmia syndrome
Marshall–Smith–Weaver syndrome
Oculo-dento-digital syndrome
Tel Hashomer camptodactyly syndrome
Toriello–Carey syndrome
Trisomy 13
Stuve–Wiedemann syndrome
Loeys–Dietz syndrome
Fetal alcohol syndrome
Fryns syndrome
Marfan syndrome
Cranio-carpo-tarsal dystrophy
Genetics
The pattern of inheritance is determined by the phenotypic expression of a gene—which is called expressivity. Camptodactyly can be passed on through generations in various levels of phenotypic expression, which include both or only one hand. This means that the genetic expressivity is incomplete. It can be inherited from either parent.
In most of its cases, camptodactyly occurs sporadically, but it has been found in several studies that it is inherited as an autosomal dominant condition.
Treatment
If a contracture is less than 30 degrees, it may not interfere with normal functioning. The common treatment is splinting and occupation |
https://en.wikipedia.org/wiki/Heather%20Marsh | Heather Marsh is a philosopher, programmer and human rights activist. She is the author of the Binding Chaos series, a study of methods of mass collaboration.
Internet and journalism
In 2015 Marsh began working on a data project with a goal of allowing global collaboration on research and information without control by a specific platform. This is a continuation of her earlier project called the Global Square.
Activism
Marsh has been a transparency activist associated with Guantanamo activism, primarily for Canadian POW Omar Khadr, and Anonymous activity, particularly human rights issues. She has reported on and campaigned against human trafficking and violations committed by global resource corporations.
She has written investigative reports and interviews on Canadian juvenile Omar Khadr, one of the youngest prisoners of Guantanamo Bay. She was the national spokesperson for the Free Omar Khadr group in Canada.
She has reported on ritual killings in Gabon and began a research project to map connections between the people responding to a fracking protest in New Brunswick. She started the OpDeathEaters campaign with a goal of independent inquiries to investigate and a change in public discourse around human trafficking. The opGabon and opDeatheaters campaigns were the subject of a book, Crime, Justice and Social Media by Australian criminologist Michael Salter which featured extensive interviews with her. |
https://en.wikipedia.org/wiki/Acta%20Physica%20Polonica | Acta Physica Polonica is an open access peer-reviewed scientific journal covering research in physics. It was established by the Polish Physical Society in 1920 as Comptes Rendus des Séances de la Société Polonaise de Physique. In 1970 it was split into two journals, Acta Physica Polonica A and Acta Physica Polonica B. The two journals became independent in 1995, with series A published by the Institute of Physics of the Polish Academy of Sciences and series B published by the Jagiellonian University in cooperation with the Polish Academy of Arts and Sciences.
History
Acta Physica Polonica was established by the Polish Physical Society in 1920. In 1970 it was split into Acta Physica Polonica A (published by the Institute of Physics of the Polish Academy of Sciences), whose scope includes general physics, atomic and molecular physics, condensed matter physics, optics and quantum optics, biophysics, quantum information, and applied physics, and Acta Physica Polonica B (published by the Jagiellonian University), which covers mathematical physics, particle and nuclear physics, relativity, astrophysics, and statistical physics. The editors-in-chief are Jan Mostowski and Michał Praszałowicz. In 2008, Acta Physica Polonica B Proceedings Supplement was established.
Abstracting and Indexing
Acta Physica Polonica A has a 2013 impact factor of 0.604, while the B series has an impact factor of 0.998. Acta Physica Polonica B is part of the SCOAP3 initiative. |
https://en.wikipedia.org/wiki/Ecological%20stoichiometry | Ecological stoichiometry (more broadly referred to as biological stoichiometry) considers how the balance of energy and elements influences living systems. Similar to chemical stoichiometry, ecological stoichiometry is founded on constraints of mass balance as they apply to organisms and their interactions in ecosystems. Specifically, it asks how the balance of energy and elements affects, and is affected by, organisms and their interactions. Concepts of ecological stoichiometry have a long history in ecology, with early references to the constraints of mass balance made by Liebig, Lotka, and Redfield. These earlier concepts have been extended to explicitly link the elemental physiology of organisms to their food web interactions and ecosystem function.
Most work in ecological stoichiometry focuses on the interface between an organism and its resources. This interface, whether it is between plants and their nutrient resources or large herbivores and grasses, is often characterized by dramatic differences in the elemental composition of each part. The difference, or mismatch, between the elemental demands of organisms and the elemental composition of resources leads to an elemental imbalance. Consider termites, which have a tissue carbon:nitrogen ratio (C:N) of about 5 yet consume wood with a C:N ratio of 300–1000. Ecological stoichiometry primarily asks:
why do elemental imbalances arise in nature?
how is consumer physiology and life-history affected by elemental imbalances? and
what are the subsequent effects on ecosystem processes?
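The termite mismatch described above can be expressed as a simple ratio (a toy illustration; `elemental_imbalance` is not a standard term from the field):

```python
def elemental_imbalance(resource_cn, consumer_cn):
    # Ratio of resource C:N to consumer C:N; values much greater than 1
    # mean the resource is far more carbon-rich than the consumer's tissue.
    return resource_cn / consumer_cn

# Termite example from the text: tissue C:N ~5, wood C:N 300-1000.
print(elemental_imbalance(300, 5))   # 60.0
print(elemental_imbalance(1000, 5))  # 200.0
```

A consumer facing an imbalance of 60–200 must shed or respire excess carbon (or acquire nitrogen elsewhere, e.g. via gut symbionts) to maintain its own composition.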
Elemental imbalances arise for a number of physiological and evolutionary reasons related to the differences in the biological make up of organisms, such as differences in types and amounts of macromolecules, organelles, and tissues. Organisms differ in the flexibility of their biological make up and therefore in the degree to which organisms can maintain a constant chemical composition in the face of variations in their |
https://en.wikipedia.org/wiki/Theil%E2%80%93Sen%20estimator | In non-parametric statistics, the Theil–Sen estimator is a method for robustly fitting a line to sample points in the plane (simple linear regression) by choosing the median of the slopes of all lines through pairs of points. It has also been called Sen's slope estimator, slope selection, the single median method, the Kendall robust line-fit method, and the Kendall–Theil robust line. It is named after Henri Theil and Pranab K. Sen, who published papers on this method in 1950 and 1968 respectively, and after Maurice Kendall because of its relation to the Kendall tau rank correlation coefficient.
Theil–Sen regression has several advantages over ordinary least squares regression. It is insensitive to outliers. It can be used for significance tests even when residuals are not normally distributed. It can be significantly more accurate than non-robust simple linear regression (least squares) for skewed and heteroskedastic data, and competes well against least squares even for normally distributed data in terms of statistical power. It has been called "the most popular nonparametric technique for estimating a linear trend". There are fast algorithms for efficiently computing the parameters.
Definition
As defined by Theil (1950), the Theil–Sen estimator of a set of two-dimensional points is the median of the slopes determined by all pairs of sample points. Sen (1968) extended this definition to handle the case in which two data points have the same x-coordinate. In Sen's definition, one takes the median of the slopes defined only from pairs of points having distinct x-coordinates.
Once the slope m has been determined, one may determine a line from the sample points by setting the y-intercept b to be the median of the values yi − mxi. The fit line is then the line y = mx + b in slope–intercept form. As Sen observed, this choice of slope makes the Kendall tau rank correlation coefficient become approximately zero when it is used to compare the values xi with their associated residuals. Int
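The definition above translates directly into code. A minimal sketch (brute force over all pairs, O(n²); the sample points and the outlier are illustrative):

```python
import statistics
from itertools import combinations

def theil_sen(points):
    # Slope: median of the slopes over all pairs of points with
    # distinct x-coordinates (Sen's variant of the estimator).
    slopes = [(y2 - y1) / (x2 - x1)
              for (x1, y1), (x2, y2) in combinations(points, 2)
              if x1 != x2]
    m = statistics.median(slopes)
    # Intercept: median of y - m*x over the sample points.
    b = statistics.median(y - m * x for x, y in points)
    return m, b

# Points on y = 2x + 1 with one gross outlier; the fit is unaffected.
pts = [(0, 1), (1, 3), (2, 5), (3, 7), (4, 100)]
print(theil_sen(pts))  # (2.0, 1.0)
```

This illustrates the estimator's robustness: a single wild point shifts the least-squares line badly but leaves the median of pairwise slopes at 2.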
https://en.wikipedia.org/wiki/Clovis%20Vincent | Clovis Vincent was born September 26, 1879, in Ingré (Loiret) and died November 14, 1947, in Paris. He was a French neurologist and neurosurgeon. With Thierry de Martel (1875-1940), he was one of the founders of neurosurgery in France.
Career
A student of Professor Fulgence Raymond, Charcot's successor, Vincent had great admiration for Joseph Babinski.
In 1914, when the First World War broke out, he served as a 2nd class Doctor adjutant in a stretcher bearers corps assigned to the 46th Infantry Regiment. In February 1915, he participated in the Battle of Vauquois (Meuse department, Lorraine, north-eastern France). He received the Legion of Honor as a soldier and the Military Medal in 1915.
He was appointed chief physician of the neurological centre of the ninth French military region, located in the buildings of the Descartes high school in Tours. There, he developed a new treatment intended to return soldiers with psychiatric symptoms to the front. The soldiers suffering from shell shock ("obusite") underwent a "faradic treatment", more commonly known as "torpedoing": electric shocks of 60 mA to 100 mA were inflicted on those with a plicature syndrome.
On May 27, 1916, during a "torpedoing" session, the Zouave Baptiste Deschamps struck Clovis Vincent. A sensational trial followed, which the press summed up with the question: "Can a soldier refuse to be treated?"
In 1927, he went to Boston to see Harvey Cushing, a pioneer in neurosurgery.
On 19 December 1937, in Paris, Clovis Vincent tried surgery on the brain of Maurice Ravel. The composer woke up a short time after surgery, then plunged into a definitive coma. |
https://en.wikipedia.org/wiki/Huntington%27s%20Disease%20Outreach%20Project%20for%20Education%20at%20Stanford | The Huntington's disease Outreach Project for Education at Stanford (HOPES) is a student-run project at Stanford University dedicated to making scientific information about Huntington's disease (HD) more readily accessible to patients and the public. Initiated by Professor William H. Durham in 2000, HOPES is a team of faculty members and undergraduate students at Stanford that surveys the rapidly growing scientific and clinical literature on Huntington's disease. They then present this information in a web resource that reflects the current scientific understanding of HD.
The HOPES website provides information about topics including the causes and symptoms of HD, existing drugs and supplements that may help HD patients, recent advances in HD research and lifestyle choices for managing HD. Articles summarize and synthesize recent research on HD for a non-technical audience. The website is designed for people of all ages and scientific backgrounds. Material ranges from interactive articles about basic genetics, written for children, to more comprehensive topics in molecular neuroscience, such as the potential for stem cells to treat or cure HD.
Awards
In June 2008 HOPES was honored with the first annual “Giving a Voice to HD” award from the Huntington's Disease Society of America (HDSA), which recognizes an individual or group who has helped to raise awareness about HD in the community.
In October 2018, HOPES was honored with the Community Advocate Award at the 12th Annual San Francisco Team Hope Walk.
https://en.wikipedia.org/wiki/Far%20pointer | In a segmented architecture computer, a far pointer is a pointer which includes a segment selector, making it possible to point to addresses outside of the default segment.
Comparison and arithmetic on far pointers is problematic: there can be several different segment-offset address pairs pointing to one physical address.
In 16-bit x86
For example, in an Intel 8086, as well as in later processors running 16-bit code, a far pointer has two parts: a 16-bit segment value and a 16-bit offset value. A linear address is obtained by shifting the binary segment value four bits to the left and then adding the offset value. Hence the effective address is 20 bits wide (segment arithmetic can actually produce a 21-bit result, which led to the address wraparound and the Gate A20). There can be up to 4096 different segment-offset address pairs pointing to one physical address. To compare two far pointers, they must first be converted (normalized) to their 20-bit linear representation.
On C compilers targeting the 8086 processor family, far pointers were declared using a non-standard "far" qualifier. For example, char far *p; defined a far pointer to a char. The difficulty of normalizing far pointers could be avoided with the non-standard "huge" qualifier.
Example of a far pointer:

#include <stdio.h>

int main(void)
{
    char far *p = (char far *)0x55550005;  /* segment 0x5555, offset 0x0005 */
    char far *q = (char far *)0x53332225;  /* segment 0x5333, offset 0x2225 */

    *p = 80;
    (*p)++;               /* *p is now 81 */
    printf("%d", *q);     /* prints 81: p and q alias the same physical address */
    return 0;
}
Output of the above program: 81, because both addresses point to the same location.
Physical Address = (value of segment register) * 0x10 + (value of offset).
Location pointed to by pointer 'p' is : 0x5555 * 0x10 + 0x0005 = 0x55555
Location pointed to by pointer 'q' is : 0x5333 * 0x10 + 0x2225 = 0x55555
So, p and q both point to the same location 0x55555. |
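The segment arithmetic and normalization described above can be sketched in Python rather than real-mode C for brevity (function names are illustrative):

```python
def linear(seg, off):
    # Real-mode 8086 address: segment shifted left four bits plus offset,
    # wrapped to the 20-bit address bus (the A20 wraparound).
    return ((seg << 4) + off) & 0xFFFFF

def normalize(seg, off):
    # Canonical segment:offset form with the offset reduced to 0x0..0xF,
    # so that aliasing pairs compare equal.
    lin = linear(seg, off)
    return lin >> 4, lin & 0xF

print(hex(linear(0x5555, 0x0005)))                             # 0x55555
print(hex(linear(0x5333, 0x2225)))                             # 0x55555
print(normalize(0x5555, 0x0005) == normalize(0x5333, 0x2225))  # True
```

Comparing normalized forms is exactly what the non-standard "huge" pointer qualifier did automatically.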
https://en.wikipedia.org/wiki/Tree%20transducer | In theoretical computer science and formal language theory, a tree transducer (TT) is an abstract machine taking as input a tree, and generating output – generally other trees, but models producing words or other structures exist. Roughly speaking, tree transducers extend tree automata in the same way that word transducers extend word automata.
Manipulating tree structures instead of words enables TTs to model syntax-directed transformations of formal or natural languages. However, TTs are not as well-behaved as their word counterparts in terms of algorithmic complexity, closure properties, and so on. In particular, most of the main classes are not closed under composition.
The main classes of tree transducers are:
Top-Down Tree Transducers (TOP)
A TOP T is a tuple (Q, Σ, Γ, I, δ) such that:
Q is a finite set, the set of states;
Σ is a finite ranked alphabet, called the input alphabet;
Γ is a finite ranked alphabet, called the output alphabet;
I is a subset of Q, the set of initial states; and
δ is a set of rules of the form q(f(x1, …, xn)) → u, where f is a symbol of Σ, n is the arity of f, q is a state, and u is a tree on Γ and Q × Xn, such pairs being nullary.
Examples of rules and intuitions on semantics
For instance,
is a rule – one customarily writes instead of the pair – and its intuitive semantics is that, under the action of q, a tree with f at the root and three children is transformed into
where, recursively, and are replaced, respectively, with the application of on the first child and
with the application of on the third.
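A toy top-down tree transducer along these lines can be sketched in Python (the tuple encoding and rule format are ad hoc, not from the literature; the transducer below swaps the children of f and relabels leaves):

```python
# Trees as tuples: (symbol, child, child, ...); leaves as (symbol,).
# Rules map (state, input symbol) to an output template, in which
# ('var', q, i) means "apply state q to the i-th child (1-based)".
def apply_state(rules, state, tree):
    template = rules[(state, tree[0])]
    def build(t):
        if t[0] == 'var':
            _, q, i = t
            return apply_state(rules, q, tree[i])  # recursive call on child
        return (t[0],) + tuple(build(c) for c in t[1:])
    return build(template)

# q(f(x1, x2)) -> g(q(x2), q(x1))   and   q(a) -> b
rules = {
    ('q', 'f'): ('g', ('var', 'q', 2), ('var', 'q', 1)),
    ('q', 'a'): ('b',),
}
t = ('f', ('a',), ('f', ('a',), ('a',)))
print(apply_state(rules, 'q', t))
# ('g', ('g', ('b',), ('b',)), ('b',))
```

The right-hand-side templates mirror the rules of a TOP: output symbols at the top, with state-variable pairs as nullary placeholders that drive the recursion down the input tree.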
Semantics as term rewriting
The semantics of each state of the transducer T, and of T itself, is a binary relation between input trees (on Σ) and output trees (on Γ).
A way of defining the semantics formally is to see as a term rewriting system, provided that in the right-hand sides the calls are written in the form , where states q are unary symbols. Then the semantics of a state q is given by
The semantics of T is then defined as t |
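A deterministic top-down tree transducer of this kind can be prototyped in a few lines. The following Python sketch is illustrative: the tree encoding (tuples), the states, and the example rules are assumptions, not from the article.

```python
class Call:
    """A recursive call q(x_i) inside a rule's right-hand side."""
    def __init__(self, state, index):
        self.state, self.index = state, index

def run(rules, state, tree):
    """Apply a deterministic top-down tree transducer from `state` to `tree`.
    A tree is a tuple (symbol, child, child, ...); leaves have no children."""
    symbol, children = tree[0], tree[1:]
    template = rules[(state, symbol)]
    return instantiate(template, children, rules)

def instantiate(template, children, rules):
    if isinstance(template, Call):  # recursive call q(x_i) on child i
        return run(rules, template.state, children[template.index])
    symbol, subs = template[0], template[1:]
    return (symbol,) + tuple(instantiate(s, children, rules) for s in subs)

# Example transducer: swap the two children of every binary node 'f'.
rules = {
    ('q', 'f'): ('f', Call('q', 1), Call('q', 0)),
    ('q', 'a'): ('a',),
    ('q', 'b'): ('b',),
}
mirrored = run(rules, 'q', ('f', ('a',), ('f', ('a',), ('b',))))
print(mirrored)  # → ('f', ('f', ('b',), ('a',)), ('a',))
```

Each rule `(q, f) → template` corresponds to a rewrite q(f(x₁, …, xₙ)) → u, with `Call(q', i)` standing for the pair ⟨q′, xᵢ⟩.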
https://en.wikipedia.org/wiki/Alternating%20caps | Alternating caps, also known as studly caps or sticky caps (where "caps" is short for capital letters), is a form of text notation in which the capitalization of letters varies by some pattern, or arbitrarily (often also omitting spaces between words and occasionally some letters), such as "aLtErNaTiNg cApS", "sTuDlY cApS" or "sTiCKycApS".
History
According to the Jargon File, the origin and significance of the practice is obscure. The term "alternating case" has been used as early as the 1970s, in several studies on word identification.
Arbitrary variation found popularity among adolescent users during the BBS and early WWW eras of online culture, as if in parody of the marginally less idiosyncratic capitalization found in common trade and service marks of the time. This method has been used extensively since the 1980s in the BBS world and warez scene (for example in FILE_ID.DIZ and .nfo files) to signal an "elite" (or elitist) attitude; frequently used variants were lowercase vowels with capitalized consonants ("THiS iS aN eXCePTioNaLLy eLiTe SeNTeNCe.") or reversed capitals ("eXTENDED kEY gENERATOR pRO").
The iNiQUiTY BBS software based on Renegade had a feature to support two variants of this automatically: either all vowels would be uppercase or all vowels would be lowercase, with the consonants as the other case.
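The vowel-based variants described above are easy to automate. A small Python sketch (function names are illustrative; `y` is treated as a vowel, matching the quoted example):

```python
VOWELS = set("aeiouy")  # treating y as a vowel, as in the quoted example

def vowel_caps(text, vowels_upper=False):
    """Uppercase the vowels and lowercase the consonants, or the reverse."""
    def convert(ch):
        if not ch.isalpha():
            return ch
        is_vowel = ch.lower() in VOWELS
        return ch.upper() if is_vowel == vowels_upper else ch.lower()
    return "".join(convert(ch) for ch in text)

def mocking_caps(text):
    """'Mocking SpongeBob' style: alternate case letter by letter."""
    out, upper = [], False
    for ch in text:
        if ch.isalpha():
            out.append(ch.upper() if upper else ch.lower())
            upper = not upper
        else:
            out.append(ch)
    return "".join(out)

print(vowel_caps("This is an exceptionally elite sentence."))
# → "THiS iS aN eXCePTioNaLLy eLiTe SeNTeNCe."
print(mocking_caps("abcd"))  # → "aBcD"
```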
A meme known as "Mocking SpongeBob" popularized using alternating caps to convey a mocking tone starting in May 2017, leading to alternating caps becoming a mainstream method of conveying mockery in text.
Usage and effect
Alternating caps are typically used to display mockery in text messages.
The randomized capitalization leads to the flow of words being broken, making it harder for the text to be read as it disrupts word identification even when the size of the letters is the same as in uppercase or lowercase.
Unlike the use of all-lowercase letters, which suggests laziness as a motivation, alternating caps requires additional effort to type, ei |
https://en.wikipedia.org/wiki/Padmakar%E2%80%93Ivan%20index | In chemical graph theory, the Padmakar–Ivan (PI) index is a topological index of a molecule, used in biochemistry. The Padmakar–Ivan index is a generalization introduced by Padmakar V. Khadikar and Iván Gutman of the concept of the Wiener index, introduced by Harry Wiener. The Padmakar–Ivan index of a graph G is the sum over all edges uv of G of number of edges which are not equidistant from u and v.
Let G be a graph and e = uv an edge of G. Here n_eu(e|G) denotes the number of edges lying closer to the vertex u than to the vertex v, and n_ev(e|G) is the number of edges lying closer to the vertex v than to the vertex u. The Padmakar–Ivan index of a graph G is defined as PI(G) = Σ_{e=uv ∈ E(G)} [n_eu(e|G) + n_ev(e|G)].
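The index can be computed directly from shortest-path distances via breadth-first search. A Python sketch (the graph encoding and names are illustrative; the distance from an edge f = xy to a vertex u is taken as min(d(x,u), d(y,u)), so the edge e itself is automatically equidistant and excluded):

```python
from collections import deque

def bfs_dist(adj, src):
    """Hop distances from src to every vertex of an unweighted graph."""
    dist, queue = {src: 0}, deque([src])
    while queue:
        v = queue.popleft()
        for w in adj[v]:
            if w not in dist:
                dist[w] = dist[v] + 1
                queue.append(w)
    return dist

def pi_index(adj):
    """PI(G): for each edge uv, count edges not equidistant from u and v."""
    edges = {frozenset((v, w)) for v in adj for w in adj[v]}
    dist = {v: bfs_dist(adj, v) for v in adj}
    total = 0
    for e in edges:
        u, v = tuple(e)
        for f in edges:
            x, y = tuple(f)
            du = min(dist[u][x], dist[u][y])  # distance from edge f to u
            dv = min(dist[v][x], dist[v][y])  # distance from edge f to v
            if du != dv:
                total += 1
    return total

# 4-cycle: for each edge, the opposite edge is equidistant, so PI(C4) = 8.
c4 = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
print(pi_index(c4))  # → 8
```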
The PI index is very important in the study of quantitative structure–activity relationship for the classification models used in the chemical, biological sciences, engineering, and nanotechnology.
Examples
The PI index of Dendrimer Nanostar of the following figure can be calculated by |
https://en.wikipedia.org/wiki/Isichthys | Henry's mormyrid (Isichthys henryi) is a species of freshwater elephantfish in the family Mormyridae and the only member of its genus. It occurs in coastal river basins in West Africa, ranging as far southeast as the Kouilou-Niari River in Middle Africa. It reaches a length of about . |
https://en.wikipedia.org/wiki/Deoxyribozyme | Deoxyribozymes, also called DNA enzymes, DNAzymes, or catalytic DNA, are DNA oligonucleotides that are capable of performing a specific chemical reaction, often but not always catalytic. This is similar to the action of other biological enzymes, such as proteins or ribozymes (enzymes composed of RNA).
However, in contrast to the abundance of protein enzymes in biological systems and the discovery of biological ribozymes in the 1980s,
there is little evidence for naturally occurring deoxyribozymes.
Deoxyribozymes should not be confused with DNA aptamers which are oligonucleotides that selectively bind a target ligand, but do not catalyze a subsequent chemical reaction.
With the exception of ribozymes, nucleic acid molecules within cells primarily serve as storage of genetic information due to their ability to form complementary base pairs, which allows for high-fidelity copying and transfer of genetic information. In contrast, nucleic acid molecules are more limited in their catalytic ability, in comparison to protein enzymes, to just three types of interactions: hydrogen bonding, pi stacking, and metal-ion coordination. This is due to the limited number of functional groups of the nucleic acid monomers: while proteins are built from up to twenty different amino acids with various functional groups, nucleic acids are built from just four chemically similar nucleobases. In addition, DNA lacks the 2'-hydroxyl group found in RNA, which limits the catalytic competency of deoxyribozymes even in comparison to ribozymes.
In addition to the inherent inferiority of DNA catalytic activity, the apparent lack of naturally occurring deoxyribozymes may also be due to the primarily double-stranded conformation of DNA in biological systems which would limit its physical flexibility and ability to form tertiary structures, and so would drastically limit the ability of double-stranded DNA to act as a catalyst; though there are a few known instances of biological single-stranded |
https://en.wikipedia.org/wiki/STR%20multiplex%20system | An STR multiplex system is used to identify specific short tandem repeats (STRs). STR polymorphisms are genetic markers that may be used to identify a DNA sequence.
The FBI analyses 13 specific STR loci for their database. These may be used in many areas of genetics in addition to their forensic uses.
One can think of an STR multiplex system as a collection of specific STRs whose positions are conserved on a target genome. Hence these can be used as markers. A number of different STRs, along with their loci in a particular genome, can be used for genotyping.
For example, the STR multiplex system AmpFlSTR Profiler Plus, which analyses nine different STRs (D3S1358, vWA, FGA, D8S1179, D21S11, D18S51, D5S818, D13S317, D7S820) plus amelogenin for sex determination, is used for human identification purposes. |
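At its core, genotyping with STR markers amounts to counting repeat units at each locus. A toy Python sketch (the sequences and the GATA motif are made up for illustration, not real locus data):

```python
def max_repeat_count(sequence, motif):
    """Longest run of consecutive copies of `motif` in `sequence` --
    analogous to the allele number reported for an STR locus."""
    best = 0
    for start in range(len(sequence)):
        count, pos = 0, start
        while sequence[pos:pos + len(motif)] == motif:
            count += 1
            pos += len(motif)
        best = max(best, count)
    return best

# Toy reads around a hypothetical tetranucleotide (GATA) repeat locus:
allele_a = max_repeat_count("TTGATAGATAGATACC", "GATA")          # 3 repeats
allele_b = max_repeat_count("TTGATAGATAGATAGATAGATACC", "GATA")  # 5 repeats
print(allele_a, allele_b)  # → 3 5
```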
https://en.wikipedia.org/wiki/Mutator%20method | In computer science, a mutator method is a method used to control changes to a variable. They are also widely known as setter methods. Often a setter is accompanied by a getter, which returns the value of the private member variable. They are also known collectively as accessors.
The mutator method is most often used in object-oriented programming, in keeping with the principle of encapsulation. According to this principle, member variables of a class are made private to hide and protect them from other code, and can only be modified by a public member function (the mutator method), which takes the desired new value as a parameter, optionally validates it, and modifies the private member variable. Mutator methods can be compared to assignment operator overloading but they typically appear at different levels of the object hierarchy.
Mutator methods may also be used in non-object-oriented environments. In this case, a reference to the variable to be modified is passed to the mutator, along with the new value. In this scenario, the compiler cannot restrict code from bypassing the mutator method and changing the variable directly. The responsibility falls to the developers to ensure the variable is only modified through the mutator method and not modified directly.
In programming languages that support them, properties offer a convenient alternative without giving up the utility of encapsulation.
In the examples below, a fully implemented mutator method can also validate the input data or take further action such as triggering an event.
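For instance, a minimal Python version of such a validating mutator, alongside the equivalent property (the `Student` class is a hypothetical example, not from the article):

```python
class Student:
    def __init__(self, name):
        self._name = None      # private by convention
        self.set_name(name)

    def get_name(self):        # accessor (getter)
        return self._name

    def set_name(self, name):  # mutator (setter) with validation
        if not isinstance(name, str) or not name.strip():
            raise ValueError("name must be a non-empty string")
        self._name = name.strip()

    # A property offers the same encapsulation with attribute syntax.
    name = property(get_name, set_name)

s = Student("  Ada Lovelace ")
print(s.name)      # → "Ada Lovelace"
s.name = "Grace"   # still goes through set_name, so it is validated
```

Because `_name` is only conventionally private in Python, the property is what channels all reads and writes through the accessor and mutator.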
Implications
The alternative to defining mutator and accessor methods, or property blocks, is to give the instance variable some visibility other than private and access it directly from outside the objects. Much finer control of access rights can be defined using mutators and accessors. For example, a parameter may be made read-only simply by defining an accessor but not a mutator. The visibility of the two methods may be differen |
https://en.wikipedia.org/wiki/Trifluralin | Trifluralin is a commonly used pre-emergence herbicide. With about used in the United States in 2001, it is one of the most widely used herbicides. Trifluralin is generally applied to the soil to provide control of a variety of annual grass and broadleaf weed species. It inhibits root development by interrupting mitosis, and thus can control weeds as they germinate.
Environmental Regulation
Trifluralin has been banned in the European Union since 20 March 2008, primarily due to high toxicity to aquatic life.
Trifluralin is on the United States Environmental Protection Agency list of Hazardous Air Pollutants as a regulated substance under the Clean Air Act.
Environmental behavior
Trifluralin undergoes an extremely complex fate in the environment and is transiently transformed into many different products as it degrades, ultimately being incorporated into soil-bound residues or converted to carbon dioxide (mineralized). Among the more unusual behaviors of trifluralin is inactivation in wet soils. This has been linked to transformation of the herbicide by reduced soil minerals, which in turn had been previously reduced by soil microorganisms using them as electron acceptors in the absence of oxygen. This environmental degradation process has been reported for many structurally related herbicides (dinitroanilines) as well as a variety of explosives like TNT and picric acid. |
https://en.wikipedia.org/wiki/List%20of%20music%20archivists | This is a list of music archivists.
Music archivists
Santiago Álvarez
Alan Leeds
Alan Lomax
David Marks
Todd Matshikiza
Eleanor Mlotek
Einojuhani Rautavaara
Arthur Warren Darley
Tiny Tim
See also
List of sound archives
List of archivists |
https://en.wikipedia.org/wiki/Flora%20and%20fauna%20of%20Greenland | Although the bulk of its area is covered by ice caps inhospitable to most forms of life, Greenland's terrain and waters support a wide variety of plant and animal species. The northeastern part of the island is the world's largest national park. The flora and fauna of Greenland are strongly susceptible to changes associated with climate change.
The image galleries below link to information related to the flora and fauna of Greenland, including Latin taxonomy, Danish translations, and links to articles in the Danish Wikipedia, which can be helpful when searching for more information.
Flora
310 species of vascular plants were said to be found in Greenland in 2019, including 15 endemic species. Although individual plants can be profuse in favourable situations, relatively few plant species tend to be represented in a given place.
In northern Greenland, the ground is covered with a carpet of mosses and low-lying shrubs such as dwarf willows and crowberries. Flowering plants in the north include yellow poppy, Pedicularis, and Pyrola. Plant life in southern Greenland is more abundant, and certain plants, such as the dwarf birch and willow, may grow several feet high.
The only natural forest in Greenland is found in the Qinngua Valley. The forest consists mainly of downy birch (Betula pubescens) and grey-leaf willow (Salix glauca), growing up to tall, although nine stands of conifers had been cultivated elsewhere by 2007.
Horticulture shows a certain degree of success. Plants such as broccoli, radishes, spinach, leeks, lettuce, turnips, chervil, potatoes and parsley are grown up to considerable latitudes, while the very south of the country also rears asters, Nemophila, mignonette, rhubarb, sorrel and carrots. Over the decade to 2007, the growing season lengthened by as much as three weeks.
In the 13th-century Konungs skuggsjá (King's mirror), it is stated that the old Norsemen tried in vain to raise barley. Recent research from archaeological digs on Greenland |
https://en.wikipedia.org/wiki/Sign%20convention | In physics, a sign convention is a choice of the physical significance of signs (plus or minus) for a set of quantities, in a case where the choice of sign is arbitrary. "Arbitrary" here means that the same physical system can be correctly described using different choices for the signs, as long as one set of definitions is used consistently. The choices made may differ between authors. Disagreement about sign conventions is a frequent source of confusion, frustration, misunderstandings, and even outright errors in scientific work. In general, a sign convention is a special case of a choice of coordinate system for the case of one dimension.
Sometimes, the term "sign convention" is used more broadly to include factors of i and 2π, rather than just choices of sign.
Relativity
Metric signature
In relativity, the metric signature can be either (+, −, −, −) or (−, +, +, +). (Note that throughout this article we are displaying the signs of the eigenvalues of the metric in the order that presents the timelike component first, followed by the spacelike components). A similar convention is used in higher-dimensional relativistic theories; that is, (+, −, −, −, …) or (−, +, +, +, …). A choice of signature is associated with a variety of names:
(+, −, −, −):
Timelike convention
Particle physics convention
West coast convention
Mostly minuses
Landau–Lifshitz sign convention.
(−, +, +, +):
Spacelike convention
Relativity convention
East coast convention
Mostly pluses
Pauli convention
Cataloged below are the choices of various authors of some graduate textbooks:
(+, −, −, −):
Landau & Lifshitz
Gravitation: an introduction to current research (L. Witten)
Ray D'Inverno, Introducing Einstein's relativity.
(−, +, +, +):
Misner, Thorne and Wheeler
Spacetime and Geometry: An Introduction to General Relativity (Sean M. Carroll)
General Relativity (Wald) (Note that Wald changes signature to the timelike convention for Chapter 13 only.)
The signature (+, −, −, −) corresponds to the metric tensor
η_μν = diag(1, −1, −1, −1)
and gives as the relationship between mass and four-momentum
m²c² = p^μ p_μ = E²/c² − |p|²,
whereas the signature (−, +, +, +) corres |
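The two conventions can be compared numerically. A Python sketch in natural units (c = 1), with an illustrative four-momentum:

```python
# Minkowski inner product p·p under each sign convention (natural units, c = 1).
def minkowski_square(p, signature):
    return sum(s * x * x for s, x in zip(signature, p))

timelike = (+1, -1, -1, -1)   # "west coast" / particle physics convention
spacelike = (-1, +1, +1, +1)  # "east coast" / relativity convention

p = (5.0, 3.0, 0.0, 0.0)      # (E, px, py, pz) for a particle with m² = 16

print(minkowski_square(p, timelike))   # → 16.0, i.e. +m²
print(minkowski_square(p, spacelike))  # → -16.0, i.e. -m²
```

The physics is unchanged; only the overall sign of p·p flips between the two conventions.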
https://en.wikipedia.org/wiki/Optical%20recording | The history of optical recording can be divided into a few number of distinct major contributions. The pioneers of optical recording worked mostly independently, and their solutions to the
many technical challenges have very distinctive features, such as
reflective disc (Compaan and Kramer)
transparent disc (Gregg)
floppy disc (Russell)
rigid disc (Compaan and Kramer)
focused laser beam for read-out through transparent substrate (Compaan and Kramer).
Gregg 1958
Laserdisc technology, using a transparent disc, was invented by David Paul Gregg in 1958 (and patented in 1970 and 1990). By 1969 Philips had developed a videodisc in reflective mode, which has great advantages over the transparent mode. MCA and Philips decided to join their efforts. They first publicly demonstrated the videodisc in 1972. Laserdisc was first available on the market, in Atlanta, on December 15, 1978, two years after the VHS VCR and four years before the CD, which is based on Laserdisc technology. Philips produced the players and MCA produced the discs. The Philips/MCA cooperation was not successful, and discontinued after a few years. Several of the scientists responsible for the early research (John Winslow, Richard Wilkinson and Ray Dakin) founded Optical Disc Corporation (now ODC Nimbus).
Russell 1965
While working at Pacific Northwest National Laboratory, James Russell invented an optical storage
system for digital audio and video, patenting the concept in 1970.
The earliest patents by Russell, US 3,501,586 and 3,795,902, were filed in 1966 and 1969, respectively. He built prototypes, and the first was operating in 1973.
Russell had found a way to record digital information onto a photosensitive plate in tiny dark spots, each spot one micrometre from centre to centre, with a laser that wrote the binary patterns. Russell's first optical disc was distinctly different from the eventual compact disc product: the disc in the player was not read by laser light. A key characteristic of Russ |
https://en.wikipedia.org/wiki/Creeping%20wave | According to the principle of diffraction, when a wave front passes an obstruction, it spreads out into the shadowed space. A creeping wave in electromagnetism or acoustics is the wave that is diffracted around the shadowed surface of a smooth body such as a sphere.
Creeping waves greatly extend the ground wave propagation of long wavelength (low frequency) radio. They also cause both of a person's ears to hear a sound, rather than only the ear on the side of the head facing the origin of the sound. In radar ranging, the creeping wave return appears to come from behind the target.
Vladimir Fock made important contributions to the understanding and calculation of creeping waves. They are described by Airy functions.
Electromagnetic radiation
Acoustics
Radio |
https://en.wikipedia.org/wiki/Fast%20automatic%20restoration | Fast automatic restoration (FASTAR) is an automated fast response system developed and deployed by American Telephone & Telegraph (AT&T) in 1992 for the centralized restoration of its digital transport network. FASTAR automatically reroutes circuits over a spare protection capacity when a fiber-optic cable failure is detected, hence increasing service availability and reducing the impact of the outages in the network. Similar in operation is real-time restoration (RTR), developed and deployed by MCI and used in the MCI network to minimize the effects of a fiber cut.
Restoration techniques
It is a recovery technique used in computer networks and telecommunication networks such as mesh optical networks, where the backup path (the alternate path that affected traffic takes after a failure condition) and backup channel are computed in real time after the occurrence of a failure. This technique can be broadly classified into two: centralized restoration and distributed restoration.
Centralized restoration techniques
This technique uses a central controller which has access to complete up-to-date and accurate information about the network, the available resources, resources used, the physical topology of the network, the service demands etc. When failure is detected in any part of the network through some failure detection, identification and notification scheme, the central controller calculates a new re-route path around the failure based on the information in its database about the current state of the network. After this new route (backup path) is calculated, the central controller sends out commands to all the affected digital cross-connects to make appropriate reconfigurations to their switching elements in order to implement this new path. FASTAR and RTR restoration systems are examples of systems that use this restoration technique.
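The path-computation step above can be sketched as a shortest-path search over the controller's topology database. A minimal Python illustration (the cross-connect topology and all names are hypothetical, not from FASTAR or RTR):

```python
from collections import deque

def restore_path(topology, src, dst, failed_link):
    """Centralized restoration sketch: recompute a minimum-hop backup path
    that avoids the failed link, as the central controller would."""
    bad = frozenset(failed_link)
    parent = {src: None}
    queue = deque([src])
    while queue:
        v = queue.popleft()
        if v == dst:                       # rebuild the path back to src
            path = []
            while v is not None:
                path.append(v)
                v = parent[v]
            return path[::-1]
        for w in topology[v]:
            if frozenset((v, w)) != bad and w not in parent:
                parent[w] = v
                queue.append(w)
    return None  # no spare capacity around the failure: restoration fails

# Hypothetical topology; the primary path A-B-C loses link (A, B).
topology = {"A": ["B", "D"], "B": ["A", "C"],
            "C": ["B", "D"], "D": ["A", "C"]}
print(restore_path(topology, "A", "C", ("A", "B")))  # → ['A', 'D', 'C']
```

In a real system the controller would then command the affected digital cross-connects to switch onto the computed backup path.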
Distributed restoration techniques
In this restoration technique, no central controller is used, hence no up-to-date database o |
https://en.wikipedia.org/wiki/Othala | Othala (), also known as ēðel and odal, is a rune that represents the o and œ phonemes in the Elder Futhark and the Anglo-Saxon Futhorc writing systems respectively. Its name is derived from the reconstructed Proto-Germanic *ōþala- "heritage; inheritance, inherited estate". As it does not occur in Younger Futhark, it disappears from the Scandinavian record around the 8th century, however its usage continued in England into the 11th century.
As with other symbols used historically in Europe such as the swastika and Celtic cross, othala has been appropriated by far-right groups such as the Nazi party and neo-Nazis. The rune also continues to be used in non-racist contexts, both in Heathenry and in wider popular culture such as the works of J.R.R. Tolkien.
Name and etymology
The sole attested name of the rune is Old English ēðel, meaning "homeland". Based on this, and on cognates in other Germanic languages, the Proto-Germanic form *ōþala- can be reconstructed, meaning "ancestral land", "the land owned by one's kin", and by extension "property" or "inheritance". *ōþala- is in turn derived from a root meaning "nobility" and "disposition".
Terms derived from are formative elements in some Germanic names, notably Ulrich.
The term "odal" () refers to Scandinavian laws of inheritance which established land rights for families that had owned that parcel of land over a number of generations, restricting its sale to others. Among other aspects, this protected the inheritance rights of daughters against males from outside the immediate family. Some of these laws remain in effect today in Norway as the Odelsrett (allodial right). The tradition of Udal law found in Shetland, Orkney, and the Isle of Man, is from the same origin.
Elder Futhark o-rune
The o-rune is attested early, in inscriptions from the 3rd century, such as the Thorsberg chape (DR7) and the Vimose planer (Vimose-Høvelen, DR 206).
The corresponding Gothic letter is (derived from Greek Ω), which had the name oþal. The othala rune is found in some tr |
https://en.wikipedia.org/wiki/Switched%20Multi-megabit%20Data%20Service | Switched Multi-megabit Data Service (SMDS) was a connectionless service used to connect LANs, MANs and WANs to exchange data, in early 1990s. In Europe, the service was known as Connectionless Broadband Data Service (CBDS).
SMDS was specified by Bellcore, and was based on the IEEE 802.6 metropolitan area network (MAN) standard, as implemented by Bellcore, and used cell relay transport, Distributed Queue Dual Bus layer-2 switching arbitrator, and standard SONET or G.703 as access interfaces.
It is a switching service that provides data transmission in the range between 1.544 Mbit/s (T1 or DS1) to 45 Mbit/s (T3 or DS3). SMDS was developed by Bellcore as an interim service until Asynchronous Transfer Mode matured. SMDS was notable for its initial introduction of the 53-byte cell and cell switching approaches, as well as the method of inserting 53-byte cells onto G.703 and SONET. In the mid-1990s, SMDS was replaced, largely by Frame Relay. |
https://en.wikipedia.org/wiki/Netsci%20Conference | The International School and Conference on Network Science, also called NetSci, is an annual conference focusing on networks. It is organized yearly since 2006 by the Network Science Society. Physicists are especially prominently represented among the participants, though people from other backgrounds attend as well. The study of networks expanded at the end of the twentieth century, with increasing citation of some seminal papers.
Following this increase in interest from the scientific community, network science was examined by the National Research Council (NRC), the arm of the US National Academies in charge of offering policy recommendations to the US government. NRC assembled two panels, resulting in recommendations summarized in two NRC Reports, offering a definition of the field of network science. These reports not only documented the emergence of a new research field, but highlighted the field’s role for science, national competitiveness and security. The NetSci conference series was set up in 2006 to address the need of the new and emerging highly interdisciplinary network science community to meet and exchange ideas. The NetSci conference has been a yearly event since then. In 2015, a shorter regional conference, called NetSci-X, was added.
History
The formal NetSci conference series was preceded by several meetings:
Research Workshop on Graph Theory and Statistical Physics, ICTP Trieste, May 22–25 (2000)
XVIII Sitges Conference (2002)
COSIN Project Midterm Conference (2003)
CNLS Annual Conference 2003: Networks, Structure, Dynamics and Function, May 12–26, Santa Fe, New Mexico. Organized by Zoltán Toroczkai, Eli Ben-Naim, Hans Frauenfelder, Pieter Swart, supported by Los Alamos National Laboratory.
Aveiro Conference CNET 2004 August 29–September 2 (2004)
School and Workshop on Structure and Function of Complex Networks, ICTP Trieste May 16–28 (2005)
In 2006 these events became part of an organized structure with one network conference per yea |
https://en.wikipedia.org/wiki/Histamine | Histamine is an organic nitrogenous compound involved in local immune responses communication, as well as regulating physiological functions in the gut and acting as a neurotransmitter for the brain, spinal cord, and uterus. Since histamine was discovered in 1910, it has been considered a local hormone (autocoid) because it lacks the classic endocrine glands to secrete it; however, in recent years, histamine has been recognized as a central neurotransmitter. Histamine is involved in the inflammatory response and has a central role as a mediator of itching. As part of an immune response to foreign pathogens, histamine is produced by basophils and by mast cells found in nearby connective tissues. Histamine increases the permeability of the capillaries to white blood cells and some proteins, to allow them to engage pathogens in the infected tissues. It consists of an imidazole ring attached to an ethylamine chain; under physiological conditions, the amino group of the side-chain is protonated.
Properties
Histamine base, obtained as a mineral oil mull, melts at 83–84 °C. Hydrochloride and phosphorus salts form white hygroscopic crystals and are easily dissolved in water or ethanol, but not in ether. In aqueous solution, the imidazole ring of histamine exists in two tautomeric forms, identified by which of the two nitrogen atoms is protonated. The nitrogen farther away from the side chain is the 'tele' nitrogen and is denoted by a lowercase tau sign and the nitrogen closer to the side chain is the 'pros' nitrogen and is denoted by the pi sign. The tele tautomer, Nτ-H-histamine, is preferred in solution as compared to the pros tautomer, Nπ-H-histamine.
Histamine has two basic centres, namely the aliphatic amino group and whichever nitrogen atom of the imidazole ring does not already have a proton. Under physiological conditions, the aliphatic amino group (having a pKa around 9.4) will be protonated, whereas the second nitrogen of the imidazole ring (pKa ≈ 5.8) will no |
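The degree of protonation of each basic centre at a given pH follows from the Henderson–Hasselbalch relation. A quick Python check using the pKa values quoted above (physiological pH taken as 7.4):

```python
def fraction_protonated(pka, ph):
    """Henderson-Hasselbalch: fraction of a basic centre carrying a proton."""
    return 1.0 / (1.0 + 10 ** (ph - pka))

PH = 7.4  # physiological pH

# Aliphatic amino group, pKa ~ 9.4: almost fully protonated.
print(fraction_protonated(9.4, PH))  # ~ 0.99

# Imidazole ring nitrogen, pKa ~ 5.8: largely unprotonated.
print(fraction_protonated(5.8, PH))  # ~ 0.02
```

This reproduces the statement in the text: under physiological conditions the amino group is protonated while the second imidazole nitrogen is not.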
https://en.wikipedia.org/wiki/Western%20Digital%20FD1771 | The FD1771, sometimes WD1771, is the first in a line of floppy disk controllers produced by Western Digital. It uses single density FM encoding introduced in the IBM 3740. Later models in the series added support for MFM encoding and increasingly added onboard circuitry that formerly had to be implemented in external components. Originally packaged as 40-pin dual in-line package (DIP) format, later models moved to a 28-pin format that further lowered implementation costs.
Derivatives
The FD1771 was succeeded by many derivatives that were mostly software-compatible:
The FD1781 was designed for double density, but required external modulation and demodulation circuitry, so it could support MFM, M2FM, GCR or other double-density encodings.
The FD1791-FD1797 series added internal support for double density (MFM) modulation, compatible with the IBM System/34 disk format. They required an external data separator.
The WD1761-WD1767 series were versions of the FD179x series rated for a maximum clock frequency of 1 MHz, resulting in a data rate limit of 125 kbit/s for single density and 250 kbit/s for double density, thus preventing them from being used for 8-in (200 mm) floppy drives or the later "high-density" or 90 mm floppy drives. These were sold at a lower price point and widely used in home computer floppy drives.
The WD2791-WD2797 series added an internal data separator using an analog phase-locked loop, with some external passive components required for the VCO. They took a 1 MHz or 2 MHz clock and were intended for and drives.
The WD1770, WD1772, and WD1773 added an internal digital data separator and write precompensator, eliminating the need for external passive components but raising the clock rate requirement to 8 MHz. They supported double density, despite the apparent regression of the part number, and were packaged in 28-pin DIP packages.
The WD1772PH02-02 was a version of the chip that Atari fitted to the Atari STE which supported high density (500 |
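The single-density (FM) and double-density (MFM) schemes these controllers implement differ only in when a clock pulse is written. A Python sketch of the two encodings (the bit lists are illustrative; each data bit becomes a clock cell followed by a data cell):

```python
def fm_encode(bits):
    """FM (single density): a clock pulse precedes every data bit."""
    out = []
    for b in bits:
        out += [1, b]  # clock cell always 1, then the data cell
    return out

def mfm_encode(bits, prev=0):
    """MFM (double density): a clock pulse is written only between two 0
    data bits, roughly halving flux transitions for the same data rate."""
    out = []
    for b in bits:
        clock = 1 if (prev == 0 and b == 0) else 0
        out += [clock, b]
        prev = b
    return out

print(fm_encode([1, 0, 1]))   # → [1, 1, 1, 0, 1, 1]
print(mfm_encode([0, 0, 1]))  # → [1, 0, 1, 0, 0, 1]
```

Because MFM guarantees at most one transition per data bit, the data rate can double at the same maximum flux rate, which is why the 1 MHz parts top out at 125 kbit/s single density but 250 kbit/s double density.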
https://en.wikipedia.org/wiki/Antimatter | In modern physics, antimatter is defined as matter composed of the antiparticles (or "partners") of the corresponding particles in "ordinary" matter, and can be thought of as matter with reversed charge, parity, and time, known as CPT reversal. Antimatter occurs in natural processes like cosmic ray collisions and some types of radioactive decay, but only a tiny fraction of these have successfully been bound together in experiments to form antiatoms. Minuscule numbers of antiparticles can be generated at particle accelerators; however, total artificial production has been only a few nanograms. No macroscopic amount of antimatter has ever been assembled due to the extreme cost and difficulty of production and handling. Nonetheless, antimatter is an essential component of widely-available applications related to beta decay, such as positron emission tomography, radiation therapy, and industrial imaging.
In theory, a particle and its antiparticle (for example, a proton and an antiproton) have the same mass, but opposite electric charge, and other differences in quantum numbers.
A collision between any particle and its anti-particle partner leads to their mutual annihilation, giving rise to various proportions of intense photons (gamma rays), neutrinos, and sometimes less-massive particle–antiparticle pairs. The majority of the total energy of annihilation emerges in the form of ionizing radiation. If surrounding matter is present, the energy content of this radiation will be absorbed and converted into other forms of energy, such as heat or light. The amount of energy released is usually proportional to the total mass of the collided matter and antimatter, in accordance with the notable mass–energy equivalence equation, E = mc².
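For a sense of scale, annihilating one gram of matter with one gram of antimatter releases the rest energy of both masses. A quick Python check of E = mc²:

```python
C = 299_792_458.0  # speed of light, m/s

def annihilation_energy(matter_kg, antimatter_kg):
    """Rest energy released when a mass annihilates with an equal
    antimatter mass, E = mc^2 applied to the combined mass."""
    return (matter_kg + antimatter_kg) * C ** 2

joules = annihilation_energy(0.001, 0.001)  # 1 g of each
print(f"{joules:.3e} J")  # ~1.8e14 J, roughly 43 kilotons of TNT
```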
Antiparticles bind with each other to form antimatter, just as ordinary particles bind to form normal matter. For example, a positron (the antiparticle of the electron) and an antiproton (the antiparticle of the proton) can form an antihydrogen atom. |
https://en.wikipedia.org/wiki/Joy%20Morris | Joy Morris (born 1970) is a Canadian mathematician whose research involves group theory, graph theory, and the connections between the two through Cayley graphs. She is also interested in mathematics education, is the author of two open-access undergraduate mathematics textbooks, and oversees a program in which university mathematics education students provide a drop-in mathematics tutoring service for parents of middle school students. She is a professor of mathematics at the University of Lethbridge.
Education and career
Morris is originally from Toronto, Ontario. Both her parents had doctorates; she was the youngest of their four children, another of whom also earned a Ph.D. She was educated through various alternative-education and gifted-student programs in the Toronto public school system. She graduated from Trent University in 1992 with a double major in mathematics and English, and with fourth-year honours in mathematics earned in part through a summer research project with Brian Alspach at Simon Fraser University.
She entered graduate study directly after graduating, continuing to work with Alspach at Simon Fraser, and completed her doctorate in 2000 with a dissertation on Isomorphisms of Cayley Graphs. Morris joined the Lethbridge faculty in 2000, and was promoted to full professor in 2015. Her position as a professor at Lethbridge has been half-time.
Mathematics education
In 2017, after learning about the frustrating experiences of her middle-school daughter's friends' parents, Morris founded a drop-in mathematics tutoring center through the University of Lethbridge, in which Lethbridge mathematics education students would tutor middle-school parents on the mathematics their children were learning, and provide educational activities for the parents to do with their children. The program was successful, and has continued in subsequent years.
Other contributions
Morris's results in groups, graphs, and the symmetries of groups and graphs include a proof |
https://en.wikipedia.org/wiki/XIAP | X-linked inhibitor of apoptosis protein (XIAP), also known as inhibitor of apoptosis protein 3 (IAP3) and baculoviral IAP repeat-containing protein 4 (BIRC4), is a protein that stops apoptotic cell death. In humans, XIAP is encoded by the XIAP gene, located on the X chromosome.
XIAP is a member of the inhibitor of apoptosis family of proteins (IAP). IAPs were initially identified in baculoviruses, but XIAP is one of the homologous proteins found in mammals. It is so called because it was first identified from a 273 base pair site on the X chromosome. The protein is also called human IAP-like protein (hILP), because it is not as well conserved as the human IAPs hIAP-1 and hIAP-2. XIAP is the most potent human IAP protein currently identified.
Discovery
Neuronal apoptosis inhibitor protein (NAIP) was the first homolog to baculoviral IAPs identified in humans. With the sequencing data of NAIP, the gene sequence for a RING zinc-finger domain was discovered at site Xq24-25. Using PCR and cloning, three BIR domains and a RING finger were found on the protein, which became known as X-linked inhibitor of apoptosis protein. The transcript size of XIAP is 9.0 kb, with an open reading frame of 1.8 kb. XIAP mRNA has been observed in all human adult and fetal tissues "except peripheral blood leukocytes". The XIAP sequences led to the discovery of other members of the IAP family.
Structure
XIAP consists of three major types of structural elements (domains). Firstly, there is the baculoviral IAP repeat (BIR) domain consisting of approximately 70 amino acids, which characterizes all IAP. Secondly, there is a UBA domain, which allows XIAP to bind to ubiquitin. Thirdly, there is a zinc-binding domain, or a "carboxy-terminal RING Finger". XIAP has been characterized with three amino-terminal BIR domains followed by one UBA domain and finally one RING domain. Between the BIR-1 and BIR-2 domains, there is a linker-BIR-2 region that is thought to contai |
https://en.wikipedia.org/wiki/Layered%20permutation | In the mathematics of permutations, a layered permutation is a permutation that reverses contiguous blocks of elements. Equivalently, it is the direct sum of decreasing permutations.
One of the earlier works establishing the significance of layered permutations proved the Stanley–Wilf conjecture for classes of permutations forbidding a layered permutation, before the conjecture was proven more generally.
Example
For instance, the layered permutations of length four, with the reversed blocks separated by spaces, are the eight permutations
1 2 3 4
1 2 43
1 32 4
1 432
21 3 4
21 43
321 4
4321
Characterization by forbidden patterns
The layered permutations can also be equivalently described as the permutations that do not contain the permutation patterns 231 or 312. That is, no three elements in the permutation (regardless of whether they are consecutive) have the same ordering as either of these forbidden triples.
Enumeration
A layered permutation on the numbers from 1 to n can be uniquely described by the subset of the numbers from 2 to n that are the first element in a reversed block. (The number 1 is always the first element in its reversed block, so it is redundant for this description.) Because there are 2^(n−1) subsets of the numbers from 2 to n, there are also 2^(n−1) layered permutations of length n.
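The correspondence between subsets of block boundaries and layered permutations can be sketched in a few lines of Python (function names are illustrative); the pattern-avoidance characterization from the previous section serves as an independent check.

```python
from itertools import combinations, permutations

def layered_permutations(n):
    """Enumerate layered permutations of 1..n: each choice of block
    boundaries (a subset of {2, ..., n}) yields one permutation built
    from consecutive decreasing blocks."""
    result = []
    for r in range(n):
        for starts in combinations(range(2, n + 1), r):
            cuts = [1, *starts, n + 1]
            perm = []
            for lo, hi in zip(cuts, cuts[1:]):
                perm.extend(range(hi - 1, lo - 1, -1))  # block lo..hi-1, reversed
            result.append(tuple(perm))
    return result

def avoids(perm, pattern):
    """True if no subsequence of perm is order-isomorphic to pattern."""
    k = len(pattern)
    target = tuple(sorted(pattern).index(v) for v in pattern)
    return not any(
        tuple(sorted(sub).index(v) for v in sub) == target
        for sub in combinations(perm, k)
    )
```

For example, `layered_permutations(4)` returns eight permutations, matching both the count 2^(4−1) = 8 and the brute-force list of permutations of length 4 that avoid 231 and 312.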
The layered permutations are Wilf equivalent to other permutation classes, meaning that the numbers of permutations of each length are the same. For instance, the Gilbreath permutations are counted by the same function, 2^(n−1).
Superpatterns
The shortest superpattern of the layered permutations of length n is itself a layered permutation. Its length is a sorting number, the number of comparisons needed for binary insertion sort to sort n + 1 elements. For n = 1, 2, 3, ... these numbers are
1, 3, 5, 8, 11, 14, 17, 21, 25, 29, 33, 37, ...
and in general they are given by the formula

    (n + 1)⌈log2(n + 1)⌉ − 2^⌈log2(n + 1)⌉ + 1.
Related permutation classes
Every layered permutation is an involution. They are exactly the 231-avoiding involutions.
https://en.wikipedia.org/wiki/Gauge%20theory | In physics, a gauge theory is a type of field theory in which the Lagrangian, and hence the dynamics of the system itself, do not change under local transformations according to certain smooth families of operations (Lie groups). Formally, the Lagrangian is invariant.
The term gauge refers to any specific mathematical formalism to regulate redundant degrees of freedom in the Lagrangian of a physical system. The transformations between possible gauges, called gauge transformations, form a Lie group—referred to as the symmetry group or the gauge group of the theory. Associated with any Lie group is the Lie algebra of group generators. For each group generator there necessarily arises a corresponding field (usually a vector field) called the gauge field. Gauge fields are included in the Lagrangian to ensure its invariance under the local group transformations (called gauge invariance). When such a theory is quantized, the quanta of the gauge fields are called gauge bosons. If the symmetry group is non-commutative, then the gauge theory is referred to as non-abelian gauge theory, the usual example being the Yang–Mills theory.
Many powerful theories in physics are described by Lagrangians that are invariant under some symmetry transformation groups. When they are invariant under a transformation identically performed at every point in the spacetime in which the physical processes occur, they are said to have a global symmetry. Local symmetry, the cornerstone of gauge theories, is a stronger constraint. In fact, a global symmetry is just a local symmetry whose group's parameters are fixed in spacetime (the same way a constant value can be understood as a function of a certain parameter, the output of which is always the same).
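For the abelian case, this contrast between global and local symmetry can be written out explicitly. The following is the standard U(1) example from quantum electrodynamics (with e the coupling constant), included here as an illustration rather than taken from the article text:

```latex
% Global U(1) symmetry: a constant phase rotation of a charged field
\psi(x) \;\to\; e^{i\alpha}\,\psi(x), \qquad \alpha \text{ constant.}
% Promoting \alpha to a function of spacetime breaks the invariance of
% \partial_\mu \psi unless a gauge field A_\mu transforms to compensate:
\psi(x) \;\to\; e^{i\alpha(x)}\,\psi(x), \qquad
A_\mu(x) \;\to\; A_\mu(x) - \frac{1}{e}\,\partial_\mu \alpha(x),
% so that the covariant derivative transforms like the field itself:
D_\mu \psi \equiv (\partial_\mu + i e A_\mu)\,\psi
\;\to\; e^{i\alpha(x)}\, D_\mu \psi .
```

Because D_μψ picks up only the overall phase, any Lagrangian built from ψ and D_μψ in gauge-invariant combinations is unchanged by the local transformation.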
Gauge theories are important as the successful field theories explaining the dynamics of elementary particles. Quantum electrodynamics is an abelian gauge theory with the symmetry group U(1) and has one gauge field, the electromagnetic four-potential.
https://en.wikipedia.org/wiki/Penicillic%20acid | Penicillic acid is a mycotoxin that is produced by Aspergillus flavus and Penicillium roqueforti mold. It is also the major product of acid degradation of penicillin. Its first practical synthesis was reported in 1947 by Ralph Raphael, who had worked on penicillin during World War II. |
https://en.wikipedia.org/wiki/RV%20Calypso | RV Calypso is a former British Royal Navy minesweeper converted into a research vessel for the oceanographic researcher Jacques Cousteau, equipped with a mobile laboratory for underwater field research. She was severely damaged in 1996 and was to undergo a complete refurbishment in 2009–2011, but the work was never completed. The ship is named after the Greek mythological figure Calypso.
World War II British minesweeper (1941–1947)
Calypso was originally a minesweeper built by the Ballard Marine Railway Company of Seattle, Washington, United States for the United States Navy for loan to the British Royal Navy under lend-lease. A wooden-hulled vessel, she is built of Oregon pine.
She was a British yard minesweeper (BYMS) Mark 1 class motor minesweeper, laid down on 12 August 1941 with yard designation BYMS-26 and launched on 21 March 1942. She was commissioned into the Royal Navy in February 1943 as HMS J-826 and assigned to active service in the Mediterranean Sea, based in Malta, and was reclassified as BYMS-2026 in 1944. Following the end of World War II, she was decommissioned in July 1946 and laid up at Malta. On 1 August 1947 she was formally handed back to the US Navy and then struck from the US Naval Register, remaining in lay-up.
Maltese ferry (1949–1950)
In May 1949 she was bought by Joseph Gasan of Malta, who had secured the mail contract on the ferry route between Marfa, in the north of Malta, and Mġarr, Gozo in 1947. She was converted to a ferry and renamed Calypso G after the nymph Calypso, whose island of Ogygia was mythically associated with Gozo, entering service in March 1950. After only four months on the route, Gasan received an attractive offer and sold her.
Jacques Cousteau's Calypso (1950–1997)
The British millionaire and former Member of Parliament (MP), Thomas Loel Guinness bought Calypso in July 1950. He leased her to Cousteau for a symbolic one franc a year. He had two conditions: that Cousteau never ask him for money and that he |
https://en.wikipedia.org/wiki/AbsoluteTelnet | AbsoluteTelnet is a software terminal client for Windows that implements Telnet, SSH 1 and 2, SFTP, TAPI Dialup and direct COM port connections. It is commercial software, originally released in 1999 and is still in regular development by Brian Pence of Celestial Software.
Features
Some features of AbsoluteTelnet:
Emulates VT52, VT100, VT220, VT320, ANSI, Xterm, QNX, SCO-ANSI, ANSIBBS, and WYSE60
Password, Public-key, keyboard-interactive, Smartcard and GSSAPI authentication support
Supports Triple DES, Twofish, Blowfish, AES, ARCFOUR and CAST-128 ciphers
Tabbed interface for multiple concurrent connections (Dockable)
Scripting support using VBScript
Unicode 5.0 support, including bidirectional text, surrogates, combining characters, etc...
Passthru printing
IPv6 support.
IDNA support
Localized into 8 languages (English, German, French, Portuguese, Chinese, Russian, Norwegian and Hungarian)
Pocket PC support
XMODEM, YMODEM, and ZMODEM file transfer for all terminal connections
SSH File Transfer Protocol for ssh2 connections only
See also
Comparison of SSH clients |
https://en.wikipedia.org/wiki/Microdissection | Microdissection refers to a variety of techniques where a microscope is used to assist in dissection.
Different kinds of techniques involve microdissection:
Chromosome microdissection — use of fine glass needle under a microscope to remove a portion from a complete chromosome.
Laser microdissection — use of a laser through a microscope to dissect selected cells.
Laser capture microdissection — use of a laser through a microscope to cause selected cells to adhere to a film.
Microscopy |
https://en.wikipedia.org/wiki/Spin%20Hall%20effect | The spin Hall effect (SHE) is a transport phenomenon predicted by Russian physicists Mikhail I. Dyakonov and Vladimir I. Perel in 1971. It consists of the appearance of spin accumulation on the lateral surfaces of an electric current-carrying sample, the signs of the spin directions being opposite on the opposing boundaries. In a cylindrical wire, the current-induced surface spins will wind around the wire. When the current direction is reversed, the direction of spin orientation also reverses.
Definition
The spin Hall effect is a transport phenomenon consisting of the appearance of spin accumulation on the lateral surfaces of a sample carrying electric current. The opposing surface boundaries will have spins of opposite sign. It is analogous to the classical Hall effect, where charges of opposite sign appear on the opposing lateral surfaces in an electric-current carrying sample in a magnetic field. In the case of the classical Hall effect the charge build up at the boundaries is in compensation for the Lorentz force acting on the charge carriers in the sample due to the magnetic field. No magnetic field is needed for the spin Hall effect which is a purely spin-based phenomenon. The spin Hall effect belongs to the same family as the anomalous Hall effect, known for a long time in ferromagnets, which also originates from spin–orbit interaction.
History
The spin Hall effect (direct and inverse) was predicted by Russian physicists Mikhail I. Dyakonov and Vladimir I. Perel in 1971. They also introduced for the first time the notion of spin current.
In 1983 Averkiev and Dyakonov proposed a way to measure the inverse spin Hall effect under optical spin orientation in semiconductors. The first experimental demonstration of the inverse spin Hall effect, based on this idea, was performed by Bakun et al. in 1984.
The term "spin Hall effect" was introduced by Hirsch who re-predicted this effect in 1999.
Experimentally, the (direct) spin Hall effect was observed i |
https://en.wikipedia.org/wiki/Sunday%20Night%20Football%20%28Australian%20TV%20program%29 | Sunday Night Football is an Australian rules football sports broadcast television program that aired on the Seven Network from 28 April 1991 until 9 April 2000. It returned to broadcast on Seven from 6 April 2014 until 29 June 2014 in VIC, SA, WA and TAS, and on 7mate from 6 April 2014 to 29 June 2014 in NSW and QLD.
See also
Friday Night Football
Saturday Night Footy
Seven Sport § Australian rules football |
https://en.wikipedia.org/wiki/Road%20Fighter | Road Fighter is a racing arcade video game developed by Konami and released in 1984; it was the first racing game from the company. The goal is to reach the finish line within the stages without running out of time, hitting other cars or running out of fuel (which is refilled by hitting a special type of car). The game spawned a spiritual successor, Konami GT (1986), and two sequels, Midnight Run: Road Fighter 2 (1995) and Winding Heat (1996). A Japan-only sequel, Road Fighters (2010), was also released 14 years later.
Gameplay
The first two levels contain four courses, ranging from grassy plains to an over-water bridge to a seashore, mountains and finally a forest area; the arcade version contained six stages. The player controls a red Chevrolet Corvette; pressing the B button accelerates the car to around 224 km/h, while the A button increases the speed to 400 km/h. The player has a limited amount of fuel points (equal to about 100 seconds) and can earn more by touching special multi-colored cars. If the player collides with any other car or slips on occasionally appearing patches of oil, the car will spin out and, if not corrected, may crash into the side barriers, causing a loss of five to six fuel points. The NES and Famicom versions have a total of six types of opponents: one yellow, one red, three blue and one truck. Yellow cars travel along a straight line and occur in large numbers. Red cars are less likely to appear, but they will change the lane they are travelling in once to get in the way of the player. Blue cars are the game's main "enemies"; they vary in the way they change lanes and attempt to hit the player. Trucks go in a straight line, but colliding with them instantly destroys the player's car. Konami Man makes a cameo appearance, flying by the side of the road, if the player progresses to a certain point in the level without crashing (not included on course two in the NES and Famicom versions).
Ports
The game was later released for the MSX home computer syste |
https://en.wikipedia.org/wiki/Rasti | Rasti (from the German verb rasten, 'to secure, to place firmly, to lock into its place') is a construction toy made from plastic—similar to Lego, Tente and Mis Ladrillos—that was produced by the Knittax company in Argentina during the 1960s and 1970s. In 2007, manufacturing began again.
Construction
The construction method allows a person to join pieces by using medium pressure, locking the blocks together with semi-rigid plastic pins which, unlike other brands, avoid friction and wear between the pins and the internal surfaces of the blocks. This system prevents fragility and instability in the models, giving a solidity unequaled by any other construction toy or block system.
Quality
Rasti bricks produced in Argentina were made with quality in mind—the shafts were made in chromed steel with plastic points for joining the special holes in the wheels. They were so popular that the word "Rasti" became regularly used as a synonym for durability and stability. It is still used as a generic noun to describe objects that can be assembled or disassembled in pieces saying: "it was broken like a Rasti" or "you can build it like a Rasti".
Having obtained considerable popularity in the toy market in Argentina, Rasti was exported to countries such as Canada and Germany (from where the original design came) until its production moved to Brazil, where its license was granted to the company Hering, and it remained in production for several years. The brand "Rasti" comes from the German word "rasten", which means "to affirm, to establish giving solidity, stability and firmness", which is the concept that governs the plastic blocks that are embedded to form various structures.
In Europe, with new colors but with the same basic pieces, the Rasti still continued selling after the new millennium (“Rasti-2000”).
Between the more popular sets sold in the Rasti decades, there are the Minibox 600, the Multibox 800, the technical kits 501 and 502, the three variants of Rasti |
https://en.wikipedia.org/wiki/Computable%20model%20theory | Computable model theory is a branch of model theory which deals with questions of computability as they apply to model-theoretical structures.
Computable model theory introduces the ideas of computable and decidable models and theories and one of the basic problems is discovering whether or not computable or decidable models fulfilling certain model-theoretic conditions can be shown to exist.
Computable model theory was developed almost simultaneously by mathematicians in the West, primarily located in the United States and Australia, and Soviet Russia during the middle of the 20th century. Because of the Cold War there was little communication between these two groups and so a number of important results were discovered independently.
See also
Vaught conjecture |
https://en.wikipedia.org/wiki/1984%20Rajneeshee%20bioterror%20attack | In 1984, 751 people suffered food poisoning in The Dalles, Oregon, United States, due to the deliberate contamination of salad bars at ten local restaurants with Salmonella. A group of prominent followers of Rajneesh (later known as Osho) led by Ma Anand Sheela had hoped to incapacitate the voting population of the city so that their own candidates would win the 1984 Wasco County elections. The incident was the first and is still the single largest bioterrorist attack in U.S. history.
Rajneesh's followers had previously gained political control of Antelope, Oregon, as they were based in the nearby intentional community of Rajneeshpuram, and they now sought election to two of the three seats on the Wasco County Circuit Court that were up for election in November 1984. Some Rajneeshpuram officials feared that they would not get enough votes, so they decided to incapacitate voters in The Dalles, the largest population center in Wasco County. The chosen biological agent was Salmonella enterica Typhimurium, which was first delivered through glasses of water to two county commissioners and then at salad bars and in salad dressing.
As a result of the attack, 751 people contracted salmonellosis, 45 of whom were hospitalized, but none died. An initial investigation by the Oregon Health Authority and the Centers for Disease Control did not rule out deliberate contamination, and the agents and contamination were confirmed a year later, on February 28, 1985. Congressman James H. Weaver gave a speech in the U.S. House of Representatives in which he "accused the Rajneeshees of sprinkling Salmonella culture on salad bar ingredients in eight restaurants".
At a press conference in September 1985, Rajneesh accused several of his followers of participation in this and other crimes, including an aborted plan in 1985 to assassinate a United States Attorney, and he asked state and federal authorities to investigate. Oregon Attorney General David B. Frohnmayer set up an inter-agency ta |
https://en.wikipedia.org/wiki/RelayOne | RelayOne was an email to postal system service run by the United Kingdom's Royal Mail from 1998 until 2000. The company described it as a modern-day telegram.
History
The service launched in the United Kingdom in March 1998, and by April of that year a United States launch was planned.
The service cost £1.50 per page, and up to 50 pages could be sent for £5, which was significantly more than the price of a postage stamp. The email would be printed by Royal Mail in London and then posted in the regular mail to its recipient. A selection of greeting cards that could be printed was later added. Royal Mail's technology partner for the system was the American company Microsoft.
The service was discontinued in 2000 as it was not commercially viable. At the time of its demise RelayOne was handling 400 items a month. Royal Mail did however continue to support a similar free of charge system used to send mail to the British Armed Forces serving overseas. |
https://en.wikipedia.org/wiki/Point%20shooting | Point shooting (also known as target- or threat-focused shooting, intuitive shooting, instinctive shooting, subconscious tactical shooting, or hipfiring) is a practical shooting method where the shooter points a ranged weapon (typically a revolver or semiautomatic pistol) at a target without relying on the use of sights to aim. Emphasis is placed on fast draw and trying to score preemptive hits first. In close quarters combat, where life-threatening situations emerge very quickly, sighted marksmanship techniques become risky, so advocates of point shooting emphasize a less sighting-oriented style that prioritizes the tactical advantages of quick fire superiority and suppression.
Point shooting is also a technique used by trained archers and marksmen to improve general accuracy when using a bow, crossbow, firearm or other ranged weapon. By developing a muscle memory for a given weapon, the shooter can become so accustomed to the weapon's weight and balance in its typical shooting position as to remain relatively accurate without needing to focus on the sights to aim. With sustained practice, a shooter can develop a subconscious hand-eye coordination utilizing proprioceptive reflex, minimizing the concentration required for effective shooting.
Overview
One point shooting method, referred to as aimed point shooting, has been used and discussed since the early 19th century. The method employs the use of the index finger along the side of the gun to aim the gun, and the middle finger is used to pull the trigger. Mention of the use of the middle finger can be found in books from the early 1800s up through the 20th century: 1804, 1810, 1816, 1829 1835, 1885, 1898, 1900, 1908, 1912, and in many other military manuals on the M1911.
The United States Army's first instructional manual on the use of the M1911 pistol specifically mentions it, but in a cautionary way due to the design of the slide stop: the slide stop pin protrudes out from the right side of the pistol, and if |
https://en.wikipedia.org/wiki/Els%20Enfarinats | The annual festival of Els Enfarinats takes place in the town of Ibi in Alicante, Spain on December 28, as part of celebrations related to the Day of the Innocents. Els enfarinats comes from the Valencian word for "breading", and roughly translates to "the breaded ones" or "the floured ones". In the day-long festival, participants known as the Els Enfarinats dress in mock military dress and stage a mock coup d'état. Meanwhile, the Casats i Fadrins, accompanied by a band of street musicians called the Rondalla, known by the name of Sonet, Xirimita and Tabal, tour the city.
At 8am, the Els Enfarinats take the city under the slogan "New Justice", and at 9am the Race for Mayor takes place, in which it is decided who is to be Mayor of the Els Enfarinats. Then, the act of L'Aixavegó is carried out in the Plaça de l'Església (Church Square), where the Els Enfarinats reside. Here, it is decided that those who do not pay the fine will go to jail. At midday, a collection called the Arreplegada dels Enfarinats takes place through the streets of the old quarter and of the city centre of Ibi, terminating in the Sant Joaquim Sanctuary. They exercise their authority under a blaze of fireworks, flour bombs and eggs. At five o'clock in the afternoon the authority of the Els Enfarinats comes to an end, giving way to the celebration of the traditional Dansà.
The tradition is over 200 years old. |
https://en.wikipedia.org/wiki/George%20Gamow | George Gamow (born Georgiy Antonovich Gamov; March 4, 1904 – August 19, 1968) was a Soviet and American polymath, theoretical physicist and cosmologist. He was an early advocate and developer of Lemaître's Big Bang theory. He discovered a theoretical explanation of alpha decay by quantum tunneling, invented the liquid drop model and the first mathematical model of the atomic nucleus, and worked on radioactive decay, star formation, stellar nucleosynthesis and Big Bang nucleosynthesis (which he collectively called nucleocosmogenesis), and molecular genetics.
In his middle and late career, Gamow directed much of his attention to teaching and wrote popular books on science, including One Two Three... Infinity and the Mr Tompkins series of books (1939–1967). Some of his books are still in print more than a half-century after their original publication.
Early life and career
Gamow was born in Odessa, Russian Empire (now Odesa, Ukraine). His father taught Russian language and literature in high school, and his mother taught geography and history at a school for girls. In addition to Russian, Gamow learned to speak some French from his mother and German from a tutor. Gamow learned English in his college years and became fluent. Most of his early publications were in German or Russian, but he later used English for both technical papers and for the lay audience.
He was educated at the Institute of Physics and Mathematics in Odessa (1922–23) and at the University of Leningrad (1923–1929). Gamow studied under Alexander Friedmann in Leningrad, until Friedmann's early death in 1925, which required him to change dissertation advisors. At the university, Gamow made friends with three other students of theoretical physics, Lev Landau, Dmitri Ivanenko, and Matvey Bronshtein. The four formed a group they called the Three Musketeers, which met to discuss and analyze the ground-breaking papers on quantum mechanics published during those years. He later used the same phrase to |
https://en.wikipedia.org/wiki/Social%20statistics | Social statistics is the use of statistical measurement systems to study human behavior in a social environment. This can be accomplished through polling a group of people, evaluating a subset of data obtained about a group of people, or by observation and statistical analysis of a set of data that relates to people and their behaviors.
Statistics in the social sciences
History
Adolphe Quetelet was a proponent of social physics. In his book Physique sociale he presents distributions of human heights, age of marriage, time of birth and death, time series of human marriages, births and deaths, a survival density for humans, and a curve describing fecundity as a function of age. He also developed the Quetelet Index.
Francis Ysidro Edgeworth published "On Methods of Ascertaining Variations in the Rate of Births, Deaths, and Marriages" in 1885 which uses squares of differences for studying fluctuations and George Udny Yule published "On the Correlation of total Pauperism with Proportion of Out-Relief" in 1895.
A numerical calibration for the fertility curve was given by Karl Pearson in 1897 in his "The Chances of Death, and Other Studies in Evolution". In this book Pearson also uses standard deviation, correlation and skewness for studying humans.
Vilfredo Pareto published his analysis of the distribution of income in Great Britain and Ireland in 1897, this is now known as the Pareto principle.
Louis Guttman proposed that the values of ordinal variables can be represented by a Guttman scale, which is useful if the number of variables is large and allows the use of techniques such as ordinary least squares.
Macroeconomic statistical research has provided stylized facts, which include:
Bowley's law (1937) regarding the proportion between wages and national output
The Phillips curve (1958) regarding the relation between wages and unemployment
Statistics and statistical analyses have become a key feature of social science: statistics is employed in economics, psychol |
https://en.wikipedia.org/wiki/Huff%20model | In spatial analysis, the Huff model is a widely used tool for predicting the probability of a consumer visiting a site, as a function of the distance of the site, its attractiveness, and the relative attractiveness of alternatives. It was formulated by David Huff in 1963. It is used in marketing, economics, retail research and urban planning, and is implemented in several commercially available GIS systems.
Its relative ease of use and applicability to a wide range of problems contribute to its enduring appeal.
The formula is given as:

    P_ij = (A_j^α / D_ij^β) / Σ_{k=1..n} (A_k^α / D_ik^β)

where:
P_ij is the probability that a consumer at location i visits store j
A_j is a measure of attractiveness of store j
D_ij is the distance from the consumer's location, i, to store j
α is an attractiveness parameter
β is a distance decay parameter
n is the total number of stores, including store j
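A minimal sketch of the computation in Python (function and variable names are illustrative; attractiveness is often proxied by floor area, and the parameters α and β must be calibrated for each application rather than fixed by the model):

```python
def huff_probabilities(attractiveness, distances, alpha=1.0, beta=2.0):
    """Huff model: probability that a consumer at location i visits each
    store j, as (A_j**alpha / D_ij**beta) normalized over all n stores."""
    utilities = [a ** alpha / d ** beta
                 for a, d in zip(attractiveness, distances)]
    total = sum(utilities)
    return [u / total for u in utilities]

# Two stores at equal distance; the first is twice as attractive,
# so it captures two thirds of the demand.
probs = huff_probabilities([1000.0, 500.0], [2.0, 2.0])  # ≈ [0.667, 0.333]
```

The probabilities always sum to one across the n stores, and with equal attractiveness the nearer store receives the larger share, as the distance decay term intends.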
https://en.wikipedia.org/wiki/Specific%20ion%20interaction%20theory | In theoretical chemistry, Specific ion Interaction Theory (SIT theory) is a theory used to estimate single-ion activity coefficients in electrolyte solutions at relatively high concentrations. It does so by taking into consideration interaction coefficients between the various ions present in solution. Interaction coefficients are determined from equilibrium constant values obtained with solutions at various ionic strengths. The determination of SIT interaction coefficients also yields the value of the equilibrium constant at infinite dilution.
Background
The theory arises from the need to derive activity coefficients of solutes when their concentrations are too high to be predicted accurately by Debye–Hückel theory. These activity coefficients are needed because an equilibrium constant is defined in thermodynamics as a quotient of activities but is usually measured using concentrations. The protonation of a monobasic acid will be used to simplify the exposition. The equilibrium for protonation of the conjugate base, A−, of the acid may be written as
H+ + A- <=> HA
for which

    K = {HA} / ({H+} {A−})

where {HA} signifies the activity of the chemical species HA, etc. The role of water in the equilibrium has been ignored, as in all but the most concentrated solutions the activity of water is a constant. K is defined here as an association constant, the reciprocal of an acid dissociation constant.
Each activity term can be expressed as the product of a concentration and an activity coefficient. For example,

    {HA} = [HA] γHA

where the square brackets signify a concentration and γ is an activity coefficient. Thus the equilibrium constant can be expressed as a product of a concentration quotient and an activity coefficient quotient:

    K = ([HA] / ([H+] [A−])) × (γHA / (γH+ γA−))
Taking logarithms,

    log K = log ([HA] / ([H+] [A−])) + log γHA − log γH+ − log γA−
K0 is the hypothetical value that the equilibrium constant would have if the solution of the acid were so dilute that the activity coefficients were all equal to one.
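As an illustrative sketch not drawn from this article: the SIT expression for a single-ion activity coefficient is commonly written log10 γ_j = −z_j² D(I) + Σ_k ε(j,k) m_k, with D(I) = 0.509 √I / (1 + 1.5 √I) at 25 °C. The interaction coefficients ε and molalities m are experimental inputs; the numbers in the usage line below are placeholders, not tabulated values.

```python
import math

def log10_gamma_sit(charge, ionic_strength, interactions=()):
    """SIT estimate of log10 of a single-ion activity coefficient.
    interactions: iterable of (epsilon_jk, m_k) pairs for ions k of
    opposite charge to ion j; epsilon values come from published tables."""
    root = math.sqrt(ionic_strength)
    debye = 0.509 * root / (1.0 + 1.5 * root)  # extended Debye-Hückel term, 25 °C
    return -charge ** 2 * debye + sum(eps * m for eps, m in interactions)

# Placeholder example: a singly charged ion at I = 0.5 mol/kg with one
# interaction coefficient of 0.1 toward a counter-ion at 0.5 mol/kg.
lg = log10_gamma_sit(1, 0.5, [(0.1, 0.5)])
```

At infinite dilution the Debye term and the interaction sum both vanish, recovering γ = 1 and hence the limiting constant K0 described above.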
It is common practice to determine equilibrium constants in solutions conta
https://en.wikipedia.org/wiki/National%20Archive%20of%20Computerized%20Data%20on%20Aging | The National Archive of Computerized Data on Aging (NACDA), located within ICPSR, is funded by the National Institute on Aging (NIA). NACDA's mission is to advance research on aging by helping researchers to profit from the under-exploited potential of a broad range of datasets.
NACDA acquires and preserves data relevant to gerontological research, processing as needed to promote effective research use, disseminates them to researchers, and facilitates their use. By preserving and making available the largest library of electronic data on aging in the United States, NACDA offers opportunities for secondary analysis on major issues of scientific and policy relevance.
Description
NACDA is a program within the Inter-university Consortium for Political and Social Research (ICPSR) at the University of Michigan. The NACDA collection consists of over sixteen hundred datasets relevant to gerontological research and represents the world's largest collection of publicly available research data on the aging lifecourse.
History
The NACDA Program on Aging began over 45 years ago under the sponsorship of the United States Administration on Aging (AoA). At that time NACDA was seen as a novel experiment: neither the concept of a research archive devoted to aging issues nor the idea of making research data freely available to the public was well established. Over the years, NACDA's mission has changed both in scope and in direction. Originally conceived as a storehouse for data, NACDA has aggressively pursued a role of increasing involvement in the research community by actively promoting and distributing data. In 1984, the NIA became the sponsor of the National Archive of Computerized Data on Aging, and NACDA has flourished under its support. NACDA has also evolved and grown in response to changes in technology, in many instances leading the pace of change in methodology related to the storage, protection, and distribution of data.
NACDA was one of the first organizations |
https://en.wikipedia.org/wiki/Pharmaceutical%20formulation | Pharmaceutical formulation, in pharmaceutics, is the process in which different chemical substances, including the active drug, are combined to produce a final medicinal product. The word formulation is often used in a way that includes dosage form.
Stages and timeline
Formulation studies involve developing a preparation of the drug which is both stable and acceptable to the patients. For orally administered drugs, this usually involves incorporating the drug into a tablet or a capsule. It is important to make the distinction that a tablet contains a variety of other potentially inert substances apart from the drug itself, and studies have to be carried out to ensure that the encapsulated drug is compatible with these other substances in a way that does not cause harm, whether direct or indirect.
Preformulation involves the characterization of a drug's physical, chemical, and mechanical properties in order to choose what other ingredients (excipients) should be used in the preparation. In dealing with protein pre-formulation, the important aspect is to understand the solution behavior of a given protein under a variety of stress conditions such as freeze/thaw, temperature, shear stress among others to identify mechanisms of degradation and therefore its mitigation.
Formulation studies then consider such factors as particle size, polymorphism, pH, and solubility, as all of these can influence bioavailability and hence the activity of a drug. The drug must be combined with inactive ingredients by a method that ensures that the quantity of drug present is consistent in each dosage unit e.g. each tablet. The dosage should have a uniform appearance, with an acceptable taste, tablet hardness, and capsule disintegration.
It is unlikely that formulation studies will be complete by the time clinical trials commence. This means that simple preparations are developed initially for use in phase I clinical trials. These typically consist of hand-filled capsules containing a |
https://en.wikipedia.org/wiki/IDEDOS | IDEDOS is a ROM-based disk operating system written in 6502/65816 assembly language for the Commodore 64, 128 and SuperCPU. Its main purpose is to control ATA(PI) devices connected to an IDE64 cartridge and present them like normal Commodore drives. Additionally it supports networked drives (PCLink) and has a built-in machine code monitor and file manager.
Architecture
The C64 KERNAL uses a vector table at page 3 to allow redirection of common KERNAL file-handling and BASIC functions. IDEDOS uses this feature to hook into the C64 KERNAL.
The operating system itself is divided into four 16 KiB pages which are mapped in when required. The mapping is temporarily switched off while interrupts are running for increased compatibility; however, this causes a latency of roughly 40 μs.
Additional RAM for buffers and internal data is also mapped in, either from the IDE64 cartridge (28 KiB) or from the additional RAM of the SuperCPU. The standard KERNAL memory locations at page zero and page two are handled in a KERNAL-compatible way; temporarily used memory is restored after the routines are finished.
Beyond the KERNAL table, IDEDOS has two new calls for bulk data handling (read/write) which allow much faster data transfer rates than the character-based I/O.
The native file system is non-CBM style at the low level to allow partitions greater than 16 MiB. High-level features like the 16-character filenames or filetypes are retained. Due to complexity and memory requirements, the filesystem creation and consistency check is not part of the operating system, unlike CBM DOS or CMD DOS.
Additional filesystems like ISO 9660 or FAT are abstracted internally and mostly use the same handling routines, so that little difference is noticeable to user programs, except where some features are not fully implemented.
Device handling is done through additional device numbers assigned to the new devices. The device numbers for IDEDOS devices are configurable and are normally in the ran
https://en.wikipedia.org/wiki/VoFR | Voice over Frame Relay (VoFR) is a protocol to transfer voice over Frame Relay networks. VoFR uses two sub-protocols, FRF.11 and FRF.12. FRF.11 defines the frame format of VoFR, and FRF.12 is used for packet fragmentation and reassembly. |
https://en.wikipedia.org/wiki/Nofence | Nofence is a Norwegian company that makes GPS collars for farm animals (cattle, sheep, and goats) that discourage them from crossing virtual fences.
Oscar Hovde Berntsen has been working on the idea of virtual fencing, as an alternative to fixed electric fencing, since the 1990s. Nofence was incorporated in 2011. In 2016, there was a pilot project in Norway with 850 goats. In 2017, the Norwegian Food Safety Authority approved the use of Nofence for goats; and for cattle and sheep in 2020. Nofence are based in Batnfjordsøra, Norway, with a UK office in Warwick.
The solar-powered collars play a melody "ringtone" when the animal reaches the virtual fence and, if it continues, deliver a small shock, as with an electric fence. Farmers can use a mobile app to change boundaries throughout the day and avoid over-grazing. Fenceless grazing is supported by conservationists and farmers, particularly in sensitive or difficult upland areas where physical fencing would be impractical, expensive or inappropriate.
In September 2020, The Times reported that trials were being conducted at six sites in the UK, including Epping Forest, in Essex.
In December 2020, Nofence stated that 17,000 collars were in use in Norway. |
https://en.wikipedia.org/wiki/Herpetosiphon%20giganteus | Herpetosiphon giganteus is a species of bacteria in the genus Herpetosiphon known to produce 16 restriction enzymes.
H. giganteus has been studied for its gliding motility. |
https://en.wikipedia.org/wiki/Child%20Development%20Index | The Child Development Index (CDI) is an index combining performance measures specific to children—education, health and nutrition—to produce a score on a scale of 0 to 100. A zero score would be the best. The higher the score, the worse children are faring.
The Child Development Index was developed by Save the Children UK in 2008 through the contributions of Terry McKinley, Director of the Centre for Development Policy and Research at the School of Oriental and African Studies (SOAS), University of London, with support from Katerina Kyrili.
The indicators which make up the index were chosen because they are easily available, commonly understood, and clearly indicative of child wellbeing. The three indicators are:
Health: the under-five mortality rate (the probability of dying between birth and five years of age), scored on a scale of 0 to 340 deaths per 1,000 live births. This means that a zero score in this component equals an under-five mortality rate of 0 deaths per 1,000 live births, and a score of 100 equals our upper bound of 340 deaths per 1,000 live births. The upper bound is higher than any country has ever reached; Niger came the closest in the 1990s with 320 under-five deaths per 1,000 live births.
Nutrition: the percentage of under fives who are moderately or severely underweight. The common definition of moderately or severely underweight, which we use here, is being below two standard deviations of the median weight for age of the reference population.
Education: the percentage of primary school-age children who are not enrolled in school. For our measure of education deprivation, we use the opposite of the net primary enrolment rate, i.e., 100 minus the NER. This gives the percentage of primary school-age children who are not enrolled.
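Under the definitions above, each component is a 0 to 100 deprivation score. A minimal sketch of the computation, assuming the components are combined by simple averaging (the averaging rule and the example numbers are illustrative assumptions, not figures from the index report):

```python
def health_score(u5mr_per_1000: float) -> float:
    """Scale the under-five mortality rate onto 0-100 against the 340 upper bound."""
    return 100.0 * min(u5mr_per_1000, 340) / 340

def cdi(u5mr_per_1000: float, pct_underweight: float, pct_not_enrolled: float) -> float:
    # Assumed combination rule: simple average of the three 0-100 deprivation scores.
    return (health_score(u5mr_per_1000) + pct_underweight + pct_not_enrolled) / 3

# Hypothetical country: U5MR of 68 per 1,000 live births, 20% of under-fives
# underweight, 15% of primary-age children not enrolled.
score = cdi(68, 20, 15)  # health component alone: 100 * 68 / 340 = 20.0
print(round(score, 2))
```

Lower scores are better throughout, consistent with the scale described above.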
What does the Child Development Index tell us about how children are faring around the world?
Are some countries making good progress in improving child well-being? Is it getting worse in other c |
https://en.wikipedia.org/wiki/Teleradiology | Teleradiology is the transmission of radiological patient images, such as x-ray photographs, computed tomography (CT) scans, and magnetic resonance imaging (MRI) scans, from one location to another for the purposes of sharing studies with other radiologists and physicians. Teleradiology allows radiologists to provide services without actually having to be at the location of the patient. This is particularly important when a sub-specialist such as an MRI radiologist, neuroradiologist, pediatric radiologist, or musculoskeletal radiologist is needed, since these professionals are generally only located in large metropolitan areas working during daytime hours. Teleradiology allows for specialists to be available at all times.
Teleradiology utilizes standard network technologies such as the Internet, telephone lines, wide area networks, local area networks (LAN) and the latest advanced technologies such as medical cloud computing. Specialized software is used to transmit the images and enable the radiologist to effectively analyze potentially hundreds of images of a given study. Technologies such as advanced graphics processing, voice recognition, artificial intelligence, and image compression are often used in teleradiology. Through teleradiology and mobile DICOM viewers, images can be sent to another part of the hospital or to other locations around the world with equal effort.
Teleradiology is a growth technology given that imaging procedures are growing approximately 15% annually against an increase of only 2% in the radiologist population.
Reports
Teleradiologists can provide a preliminary read for emergency room cases and other emergency cases or a final read for the official patient record and for use in billing.
Preliminary reports include all pertinent findings and a telephone call for any critical findings. For some teleradiology services, the turnaround time is rapid with a 30-minute standard turnaround and expedited for critical and stroke studies.
Teleradiology final |
https://en.wikipedia.org/wiki/Geoboard | A geoboard is a mathematical manipulative used to explore basic concepts in plane geometry such as perimeter, area and the characteristics of triangles and other polygons. It consists of a physical board with a certain number of nails half driven in, around which are wrapped geo bands that are made of rubber. Normal rubber bands can also be used.
Geoboards were invented and popularized in the 1950s by Egyptian mathematician Caleb Gattegno (1911-1988).
Structure and use
A geoboard consists of a board, of which a variety of designs are used. Originally made of plywood and brass nails or pegs, geoboards are now usually made of plastic. They may have an upright square lattice of 9, 16, 25 or more nails, or a circle of nails around a central nail. Students are asked to place rubber bands around the nails to explore geometric concepts or to solve mathematical puzzles.
Geoboards may be used to learn about:
plane shapes;
translation;
rotation;
reflection;
similarity;
co-ordination;
counting;
right angles;
pattern;
classification;
scaling;
position;
congruence;
area;
perimeter.
Two-dimensional representations of the geoboard may be made on ordinary paper using rubber stamps, or special "geoboard paper" printed with geoboard diagrams may be used, to help capture a student's explanations of the concept they have discovered or illustrated on the geoboard. There are also a number of online virtual geoboards.
https://en.wikipedia.org/wiki/Phosphatidylinositol%204-phosphate | Phosphatidylinositol-4-phosphate (PtdIns4P, PI-4-P, PI4P, or PIP) is a precursor of phosphatidylinositol (4,5)-bisphosphate. PtdIns4P is prevalent in the membrane of the Golgi apparatus.
In the Golgi apparatus, PtdIns4P binds to the GTP-binding protein ARF and to effector proteins, including four-phosphate-adaptor protein 1 and 2 (PLEKHA3 and PLEKHA8). This three molecule complex recruits proteins that need to be carried to the cell membrane.
There is now evidence that PI-4-P is capable of deforming lipid systems into tightly curved assemblies; this is consistent with similar behaviour observed in phosphatidylinositol.
See also
Phosphatidylinositol 3-phosphate
Phosphatidylinositol 5-phosphate
Phosphatidylinositol (3,4,5)-trisphosphate |
https://en.wikipedia.org/wiki/CagZ | In molecular biology, CagZ is a protein produced by Helicobacter pylori (Campylobacter pylori). It is a 23 kDa protein consisting of a single compact L-shaped domain, composed of seven alpha-helices that run antiparallel to each other. 70% of the amino acids are in alpha-helix conformation and no beta-sheet is present. CagZ is essential for the translocation of the pathogenic protein CagA into host cells. |
https://en.wikipedia.org/wiki/Brazier%20effect | The Brazier effect was first discovered in 1927 by Brazier. He showed that when an initially straight tube was bent uniformly, the longitudinal tension and compression which resist the applied bending moment also tend to flatten or ovalise the cross-section. As the curvature increases, the flexural stiffness decreases. Brazier showed that under steadily increasing curvature the bending moment reaches a maximum value. After the bending moment reaches its maximum value, the structure becomes unstable, and so the object suddenly forms a "kink".
From Brazier’s analysis it follows that the crushing pressure increases with the square of the curvature of the section, and thus with the square of the bending moment.
See also
Bending |
https://en.wikipedia.org/wiki/Novo%20Nordisk | Novo Nordisk A/S is a Danish multinational pharmaceutical company headquartered in Bagsværd, Denmark, with production facilities in nine countries and affiliates or offices in five countries. Novo Nordisk is controlled by majority shareholder Novo Holdings A/S which holds approximately 28% of its shares and a majority (77%) of its voting shares.
Novo Nordisk manufactures and markets pharmaceutical products and services, specifically diabetes care medications and devices. Its main product is the drug semaglutide, used to treat diabetes under the brand name Ozempic and obesity under the brand name Wegovy. Novo Nordisk is also involved with hemostasis management, growth hormone therapy, and hormone replacement therapy. The company makes several drugs under various brand names, including Levemir, Tresiba, NovoLog, Novolin R, NovoSeven, NovoEight, and Victoza.
Novo Nordisk employs more than 48,000 people globally, and markets its products in 168 countries. The corporation was created in 1989, through a merger of two Danish companies which date back to the 1920s. The Novo Nordisk logo is the Apis bull, one of the sacred animals of ancient Egypt. Novo Nordisk is a full member of the European Federation of Pharmaceutical Industries and Associations (EFPIA).
The company was ranked 25th among the 100 Best Companies to Work For in 2010 and 72nd in 2014 by Fortune. In January 2012, Novo Nordisk was named the most sustainable company in the world by the business magazine Corporate Knights, while spin-off company Novozymes was named fourth. Novo Nordisk is the largest pharmaceutical company in Denmark. Its market capitalization exceeded the GDP of its domestic economy in 2023, and it is the highest valued pharmaceutical company in Europe.
History
1923
Nordisk Insulinlaboratorium commercialises the production of insulin.
1982–1994
The company established its presence in the United States in 1982.
In 1986, Novo Industri A/S acquired the Ferrosan Group, now named as "Novo |
https://en.wikipedia.org/wiki/Stable%20Diffusion | Stable Diffusion is a deep learning, text-to-image model released in 2022 based on diffusion techniques. It is primarily used to generate detailed images conditioned on text descriptions, though it can also be applied to other tasks such as inpainting, outpainting, and generating image-to-image translations guided by a text prompt. It was developed by researchers from the CompVis Group at Ludwig Maximilian University of Munich and Runway with a compute donation by Stability AI and training data from non-profit organizations.
Stable Diffusion is a latent diffusion model, a kind of deep generative artificial neural network. Its code and model weights have been released publicly, and it can run on most consumer hardware equipped with a modest GPU with at least 4 GB VRAM. This marked a departure from previous proprietary text-to-image models such as DALL-E and Midjourney which were accessible only via cloud services.
Development
The development of Stable Diffusion was funded and shaped by the start-up company Stability AI.
The technical license for the model was released by the CompVis group at Ludwig Maximilian University of Munich. Development was led by Patrick Esser of Runway and Robin Rombach of CompVis, who were among the researchers who had earlier invented the latent diffusion model architecture used by Stable Diffusion. Stability AI also credited EleutherAI and LAION (a German nonprofit which assembled the dataset on which Stable Diffusion was trained) as supporters of the project.
In October 2022, Stability AI raised US$101 million in a round led by Lightspeed Venture Partners and Coatue Management.
Technology
Architecture
Stable Diffusion uses a kind of diffusion model (DM), called a latent diffusion model (LDM) developed by the CompVis group at LMU Munich. Introduced in 2015, diffusion models are trained with the objective of removing successive applications of Gaussian noise on training images, which can be thought of as a sequence of denoising autoenc |
https://en.wikipedia.org/wiki/Code%20property%20graph | In computer science, a code property graph (CPG) is a computer program representation that captures syntactic structure, control flow, and data dependencies in a property graph. The concept was originally introduced to identify security vulnerabilities in C and C++ system code, but has since been employed to analyze web applications, cloud deployments, and smart contracts. Beyond vulnerability discovery, code property graphs find applications in code clone detection, attack-surface detection, exploit generation, measuring code testability, and backporting of security patches.
Definition
A code property graph of a program is a graph representation of the program obtained by merging its abstract syntax trees (AST), control-flow graphs (CFG) and program dependence graphs (PDG) at statement and predicate nodes. The resulting graph is a property graph, which is the underlying graph model of graph databases such as Neo4j, JanusGraph and OrientDB where data is stored in the nodes and edges as key-value pairs. In effect, code property graphs can be stored in graph databases and queried using graph query languages.
Example
Consider the following function from a C program:
void foo() {
int x = source();
if (x < MAX) {
int y = 2 * x;
sink(y);
}
}
The code property graph of the function is obtained by merging its abstract syntax tree, control-flow graph, and program dependence graph at statements and predicates as seen in the following figure:
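The merging of the three representations at shared statement nodes can be illustrated with a toy graph for foo(). The node labels and edge properties below are our own simplification, not the schema of any particular CPG implementation:

```python
# Statement/predicate nodes of foo(), shared by all three sub-graphs.
nodes = ["x = source()", "if (x < MAX)", "y = 2 * x", "sink(y)"]

# Edges of the merged property graph: (src, dst, key-value properties).
edges = [
    ("x = source()", "if (x < MAX)", {"type": "CFG"}),
    ("if (x < MAX)", "y = 2 * x",    {"type": "CFG", "condition": "true"}),
    ("y = 2 * x",    "sink(y)",      {"type": "CFG"}),
    ("if (x < MAX)", "y = 2 * x",    {"type": "PDG", "control": True}),
    ("x = source()", "y = 2 * x",    {"type": "PDG", "data": "x"}),
    ("y = 2 * x",    "sink(y)",      {"type": "PDG", "data": "y"}),
]

def reaches(src: str, dst: str, kind: str) -> bool:
    """Toy graph query: is there a path from src to dst using one edge type only?"""
    frontier, seen = [src], set()
    while frontier:
        n = frontier.pop()
        if n == dst:
            return True
        seen.add(n)
        frontier += [d for s, d, p in edges
                     if s == n and p["type"] == kind and d not in seen]
    return False

# Taint-style question: does data flow from the source() call to the sink() call?
print(reaches("x = source()", "sink(y)", "PDG"))  # True
```

In a real system such queries would be expressed in a graph query language against a graph database, as described above, but the principle of traversing typed edges over shared statement nodes is the same.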
Implementations
Joern CPG. The original code property graph was implemented for C/C++ in 2013 at University of Göttingen as part of the open-source code analysis tool Joern. This original version has been discontinued and superseded by the open-source Joern Project, which provides a formal code property graph specification applicable to multiple programming languages. The project provides code property graph generators for C/C++, Java, Java bytecode, Kotlin, Python, JavaScript, TypeScript, LLVM bitcode, and x86 binaries (via th |
https://en.wikipedia.org/wiki/Finitism | Finitism is a philosophy of mathematics that accepts the existence only of finite mathematical objects. It is best understood in comparison to the mainstream philosophy of mathematics where infinite mathematical objects (e.g., infinite sets) are accepted as legitimate.
Main idea
The main idea of finitistic mathematics is not accepting the existence of infinite objects such as infinite sets. While all natural numbers are accepted as existing, the set of all natural numbers is not considered to exist as a mathematical object. Therefore quantification over infinite domains is not considered meaningful. The mathematical theory often associated with finitism is Thoralf Skolem's primitive recursive arithmetic.
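Primitive recursive arithmetic builds every function from zero, the successor function, and the recursion scheme, without quantifying over infinite domains. A small illustrative sketch (using Python integers merely as stand-ins for the finitistically acceptable natural numbers):

```python
def succ(n: int) -> int:
    """Successor: the only primitive besides zero."""
    return n + 1

def add(m: int, n: int) -> int:
    # Defining equations: add(m, 0) = m ; add(m, succ(n)) = succ(add(m, n))
    return m if n == 0 else succ(add(m, n - 1))

def mul(m: int, n: int) -> int:
    # Defining equations: mul(m, 0) = 0 ; mul(m, succ(n)) = add(m, mul(m, n))
    return 0 if n == 0 else add(m, mul(m, n - 1))

print(add(3, 4), mul(3, 4))  # 7 12
```

Every such definition terminates on every input by construction, which is why primitive recursive arithmetic is acceptable from a finitist standpoint.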
History
The introduction of infinite mathematical objects occurred a few centuries ago when the use of infinite objects was already a controversial topic among mathematicians. The issue entered a new phase when Georg Cantor in 1874 introduced what is now called naive set theory and used it as a base for his work on transfinite numbers. When paradoxes such as Russell's paradox, Berry's paradox and the Burali-Forti paradox were discovered in Cantor's naive set theory, the issue became a heated topic among mathematicians.
There were various positions taken by mathematicians. All agreed about finite mathematical objects such as natural numbers. However there were disagreements regarding infinite mathematical objects.
One position was the intuitionistic mathematics that was advocated by L. E. J. Brouwer, which rejected the existence of infinite objects until they are constructed.
Another position was endorsed by David Hilbert: finite mathematical objects are concrete objects, infinite mathematical objects are ideal objects, and accepting ideal mathematical objects does not cause a problem regarding finite mathematical objects. More formally, Hilbert believed that it is possible to show that any theorem about finite mathematical objects that can be obtained using i |
https://en.wikipedia.org/wiki/Scorpionism%20in%20Central%20America | Scorpionism is the accidental envenomation of humans by toxic scorpions, which in severe cases results in death. It is seen all over the world but predominantly in tropical and subtropical areas. These areas include Mexico, northern South America and southeast Brazil in the Western Hemisphere. In the Eastern Hemisphere, scorpionism poses a public health threat in South Africa, the Middle East, and the Indian subcontinent.
Species involved
Scorpions are nocturnal animals that typically live in deserts, mountains, caves, and under rocks; they attack when disturbed. Scorpions that can inject dangerous venom with their sting belong to the family Buthidae. The Middle East and North Africa are home to the deadliest scorpions, belonging to the genera Buthus, Leiurus, Androctonus, and Hottentotta. In South America, the deadliest scorpions belong to the genus Tityus. In India and Mexico, the deadliest scorpions involved in scorpionism are Mesobuthus and Centruroides, respectively.
In Central America, most scorpion stings are mildly toxic to humans; however, Panama reported an incidence of 52 cases per 100,000 people in 2007, and between 1998 and 2006, 28 people died as a result of scorpion stings. In Panama, the scorpions responsible for these deaths belong to the genus Tityus, which is also found in parts of northern South America. Historically, the presence of these scorpions in Panama could be due to the closure of the Panamanian isthmus, which allowed the scorpions to migrate between Panama and the northern part of South America. Tityus pachyurus, found in Panama, is among the most medically important species. Envenomation by this scorpion is characterized by intense local pain that usually does not result in tissue injury. Scorpions
https://en.wikipedia.org/wiki/Penetrant%20%28biochemical%29 | A biochemical penetrant is a chemical that increases the ability of a poison to apply its toxic effect to a living organism.
Typically, the term penetrant, when used for a biochemical agent, refers to an agrichemical used with a weedkiller or fungicide. The term seems to be used for agrichemicals in British rather than North American English.
When mixed with a weedkiller (normally as an aqueous solution), the penetrant causes a plant to absorb the poison more effectively and so succumb more readily. Penetrants are most often used against plants that would otherwise resist the weedkiller, often because they have tough or shiny leaves that shed water easily.
See also
Surfactant |
https://en.wikipedia.org/wiki/Simple%20linear%20regression | In statistics, simple linear regression is a linear regression model with a single explanatory variable. That is, it concerns two-dimensional sample points with one independent variable and one dependent variable (conventionally, the x and y coordinates in a Cartesian coordinate system) and finds a linear function (a non-vertical straight line) that, as accurately as possible, predicts the dependent variable values as a function of the independent variable.
The adjective simple refers to the fact that the outcome variable is related to a single predictor.
It is common to make the additional stipulation that the ordinary least squares (OLS) method should be used: the accuracy of each predicted value is measured by its squared residual (vertical distance between the point of the data set and the fitted line), and the goal is to make the sum of these squared deviations as small as possible. Other regression methods that can be used in place of ordinary least squares include least absolute deviations (minimizing the sum of absolute values of residuals) and the Theil–Sen estimator (which chooses a line whose slope is the median of the slopes determined by pairs of sample points). Deming regression (total least squares) also finds a line that fits a set of two-dimensional sample points, but (unlike ordinary least squares, least absolute deviations, and median slope regression) it is not really an instance of simple linear regression, because it does not separate the coordinates into one dependent and one independent variable and could potentially return a vertical line as its fit.
The remainder of the article assumes an ordinary least squares regression.
In this case, the slope of the fitted line is equal to the correlation between y and x multiplied by the ratio of the standard deviation of y to that of x. The intercept of the fitted line is such that the line passes through the center of mass of the data points.
Fitting the regression line
Consider the model function y = α + βx, which describes a line with slope β and y-intercept α.
|
https://en.wikipedia.org/wiki/Acorn%20Archimedes | Acorn Archimedes is a family of personal computers designed by Acorn Computers of Cambridge, England. The systems are based on Acorn's own ARM architecture processors and the proprietary operating systems Arthur and RISC OS. The first models were introduced in 1987, and systems in the Archimedes family were sold until the mid-1990s.
ARM's RISC design, a 32-bit CPU using 26-bit addressing and running at 8 MHz, was stated as achieving 4 MIPS, a significant upgrade from 8-bit home computers such as Acorn's previous machines. During tests, claims were also made that it was the fastest micro in the world, running at 18 MIPS.
Two of the first models—the A305 and A310—were given the BBC branding, with BBC Enterprises regarding the machines as "a continuing part of the original computer literacy project". Dissatisfaction with the branding arrangement was voiced by competitor Research Machines and an industry group led by a Microsoft representative, the British Micro Federation, who advocated the use of "business standard" operating systems such as MS-DOS. Responding to claims that the BBC branding was "unethical" and "damaging", a BBC Enterprises representative claimed that, with regard to the BBC's ongoing computer literacy initiatives, bringing in "something totally new would be irresponsible".
The name "Acorn Archimedes" is commonly used to describe any of Acorn's contemporary designs based on the same architecture. This architecture can be broadly characterised as involving the ARM CPU and the first generation chipset consisting of MEMC (MEMory Controller), VIDC (VIDeo and sound Controller) and IOC (Input Output Controller).
History
Having introduced the BBC Micro in 1981, Acorn established itself as a major supplier to primary and secondary education in the United Kingdom. However, attempts to replicate this dominance in other sectors, such as home computing with the BBC Micro and Acorn Electron, and in other markets, including the United States and |
https://en.wikipedia.org/wiki/Terragen | Terragen is a scenery generator program for Microsoft Windows and Mac OS X developed and published by Planetside Software. It can be used to create renderings and animations of landscapes.
History
Released in stages (tech preview and beta) to a participating community, Terragen 2 was released to pre-purchasers on 2 April 2009. Terragen 2 is offered in feature limited freeware and full-featured commercial licenses.
Planetside Software released the first public version of Terragen 2 after more than three years of development of both the core technologies and the program itself. Since then there have been several released updates to both licenses of the software along the development cycle with a series of technology previews and a beta release. The "final" build was released on April 23, 2009, and more updates, including feature modules, are expected to be released later.
Planetside released Terragen 3 in August 2013. Version 3.1 was released in February 2014. Version 4 was released in 2016.
Terragen Classic
Terragen Classic is popular among amateur artists, which can be attributed to it being freeware, its intuitive interface, and its capability to create photorealistic landscapes when used skillfully. It can also use DEM (digital elevation model) files, and other graphic surface maps for rendering.
A commercial version of the software is also available and is capable of creating larger terrains, renders with higher image resolution, larger terrain files, and better post-render anti-aliasing than the freeware version.
The terrain is generated from a two-dimensional heightmap. The program contains facilities for importing and exporting heightmaps to images, for use in other programs.
Use in media
Rendering software contributed by PlanetSide proprietor Matt Fairclough was used by Digital Domain for effects in Star Trek Nemesis. Terragen Classic was used in The Golden Compass, the 2006 remake of The Wicker Man, games, and many TV commercials.
An image from w |
https://en.wikipedia.org/wiki/Ridge%20%28biology%29 | Ridges (regions of increased gene expression) are domains of the genome with high gene expression; their opposites are antiridges. The term was first used by Caron et al. in 2001. Characteristics of ridges are:
Gene dense
Contain many C and G nucleobases
Genes have short introns
High SINE repeat density
Low LINE repeat density
Discovery
Clustering of genes in prokaryotes has long been known. Their genes are grouped in operons, and genes within an operon share a common promoter unit; these genes are mostly functionally related. The prokaryotic genome is relatively simple and compact. In eukaryotes the genome is huge and only a small fraction of it consists of functional genes; furthermore, the genes are not arranged in operons, except in nematodes and trypanosomes, whose operons differ from prokaryotic operons. In eukaryotes each gene has a transcription regulation site of its own, so genes do not have to be in close proximity to be co-expressed. It was therefore long assumed that eukaryotic genes were randomly distributed across the genome due to the high rate of chromosome rearrangements. Once complete genome sequences became available, however, it became possible to locate a gene exactly and measure its distance to other genes.
The first eukaryotic genome ever sequenced was that of Saccharomyces cerevisiae, or budding yeast, in 1996. Half a year later, Velculescu et al. (1997) published a study in which they integrated SAGE data with the newly available genome map. Different genes are active at different points of the cell cycle, so they used SAGE data from three stages of the cycle (log phase, S-phase-arrested and G2/M-phase-arrested cells). Because every yeast gene has a promoter unit of its own, genes were not expected to cluster near one another — but they did. Clusters were present on all 16 yeast chromosomes.
A year later Cho et al. also reported (although in more detail) that certain |
https://en.wikipedia.org/wiki/Additively%20indecomposable%20ordinal | In set theory, a branch of mathematics, an additively indecomposable ordinal α is any ordinal number that is not 0 such that for any β, γ < α, we have β + γ < α. Additively indecomposable ordinals are also called gamma numbers or additive principal numbers. The class of additively indecomposable ordinals may be denoted H, from the German "Hauptzahl". The additively indecomposable ordinals are precisely those ordinals of the form ω^β for some ordinal β.
From the continuity of addition in its right argument, we get that if β < α and α is additively indecomposable, then β + α = α.
Obviously 1 = ω^0 is additively indecomposable, since 0 + 0 < 1. No finite ordinal other than 1 is additively indecomposable. Also, ω = ω^1 is additively indecomposable, since the sum of two finite ordinals is still finite. More generally, every infinite initial ordinal (an ordinal corresponding to a cardinal number) is additively indecomposable.
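As a worked contrast, the following checks (in LaTeX notation) show an ordinal that fails the definition next to one that satisfies it:

```latex
% omega + 1 is NOT additively indecomposable:
% both summands are below omega + 1, yet their sum reaches it.
\omega + 1 = \underbrace{\omega}_{<\,\omega+1} + \underbrace{1}_{<\,\omega+1}

% omega^2 IS additively indecomposable: any beta, gamma < omega^2
% satisfy beta < omega \cdot m and gamma < omega \cdot n for some
% finite m, n, so their sum stays below omega^2:
\beta + \gamma < \omega \cdot m + \omega \cdot n
             = \omega \cdot (m + n) < \omega^2
```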
The class of additively indecomposable numbers is closed and unbounded. Its enumerating function is normal, given by f(α) = ω^α.
The derivative of f(α) = ω^α (which enumerates its fixed points) is written ε_α. Ordinals of this form (that is, fixed points of α ↦ ω^α) are called epsilon numbers. The number ε_0 = ω^ω^ω^··· is therefore the first fixed point of the sequence ω, ω^ω, ω^ω^ω, …
Multiplicatively indecomposable
A similar notion can be defined for multiplication. If α is greater than the multiplicative identity, 1, and β < α and γ < α imply β·γ < α, then α is multiplicatively indecomposable. 2 is multiplicatively indecomposable since 1·1 = 1 < 2. Besides 2, the multiplicatively indecomposable ordinals (also called delta numbers) are those of the form ω^(ω^α) for any ordinal α. Every epsilon number is multiplicatively indecomposable; and every multiplicatively indecomposable ordinal (other than 2) is additively indecomposable. The delta numbers (other than 2) are the same as the prime ordinals that are limits.
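The multiplicative definition can likewise be checked on small cases (in LaTeX notation):

```latex
% omega = omega^(omega^0) IS multiplicatively indecomposable:
% any beta, gamma < omega are finite, and finite products stay finite.
\beta, \gamma < \omega \implies \beta \cdot \gamma < \omega

% omega \cdot 2 is NOT: both factors sit below omega \cdot 2,
% yet their product reaches it.
\omega \cdot 2 = \underbrace{\omega}_{<\,\omega\cdot 2}
               \cdot \underbrace{2}_{<\,\omega\cdot 2}
```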
Higher indecomposables
Exponentially indecomposable ordinals are equal to the epsilon numbers, tetrationally indecomposable ordinals are equal to t |