source | text |
|---|---|
https://en.wikipedia.org/wiki/Caenorhabditis%20sp.%2035 | Caenorhabditis sp. 35 is an as-yet-unnamed species of nematode in the genus Caenorhabditis. Two isolates were discovered in 2013 by N. Kanzaki in Banda Aceh, Indonesia, associated with Ficus hispida.
Genetic studies show that it is basal in the 'Elegans' supergroup with C. inopinata. |
https://en.wikipedia.org/wiki/Embedded%20value | The Embedded Value (EV) of a life insurance company is the present value of future profits plus adjusted net asset value. It is a construct from the field of actuarial science which allows insurance companies to be valued.
Background
Life insurance policies are long-term contracts, where the policyholder pays a premium to be covered against a possible future event (such as the death of the policyholder).
Future income for the insurer consists of premiums paid by policyholders whilst future outgoings comprise claims paid to policyholders as well as various expenses. The difference, combined with income on and release of statutory reserves, represents future profit.
Net asset value is the difference between the total assets and liabilities of an insurance company.
For companies, the net asset value is usually calculated at book value. This needs to be adjusted to market values for EV purposes. Furthermore, this value may be discounted to reflect the "lock-in" of some of the assets by their nature (an example of such a lock-in would be assets held within a with-profits fund).
Value of the insurer
EV measures the value of the insurer by adding today's value of the existing business (i.e. future profits) to the market value of net assets (i.e. accumulated past profits).
It is a conservative measure of the insurer's value in the sense that it only considers future profits from existing policies and so ignores the possibility that the insurer may sell new policies in future. It also excludes goodwill. As a result, the insurer is worth more than its EV.
Formula
Embedded Value is calculated as follows:
EV = PVFP + ANAV
where
EV = Embedded Value
PVFP = present value of future profits
ANAV = adjusted net asset value
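As a toy illustration of the formula, the two components can be computed as follows (all figures and the flat risk discount rate are hypothetical):

```python
# Illustrative Embedded Value calculation (hypothetical figures).
# PVFP: discount projected statutory profits from the existing book;
# ANAV: net assets restated at market value, less any lock-in discount.

def present_value(cashflows, rate):
    """Discount a list of year-end cashflows at a flat risk discount rate."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows, start=1))

future_profits = [12.0, 11.0, 9.5, 8.0, 6.0]  # projected profits per year, in millions
risk_discount_rate = 0.08                     # assumed risk discount rate

pvfp = present_value(future_profits, risk_discount_rate)
anav = 40.0                                   # adjusted net asset value, in millions

ev = pvfp + anav
print(round(ev, 2))
```

Because future profits are discounted, the PVFP is always less than the undiscounted sum of projected profits.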
Improvements
European embedded value (EEV) is a variation of EV, set up by the CFO Forum, which formalises the choice of parameters and the method of calculation so as to enable greater transparency and comparability.
Market |
https://en.wikipedia.org/wiki/Luca%20Gammaitoni | Luca Gammaitoni (born 16 June 1961 in Perugia) is a scientist in the area of noise and nonlinear dynamics. He is currently the Director of the Noise in Physical System Laboratory (NiPS Lab) at the Physics Department of the Università di Perugia, in Italy.
Education and career
He graduated in physics at the University of Perugia and obtained his Ph.D. in physics from the University of Pisa in 1991 (S. Santucci advisor). His thesis was entitled "Stochastic Resonance". He is currently Professor at the Faculty of Science of the University of Perugia in Italy. The Noise in Physical Systems (NiPS) Laboratory is a research facility within the Physics Department of the University of Perugia. NiPS has a long-standing tradition in studying physical systems in the presence of noise. Scientific interest ranges from stochastic nonlinear dynamics modelling to thermal noise measurements.
Scientific Interests
Stochastic nonlinear dynamics with specific reference to Stochastic Resonance, dithering, resonant trapping, resonant crossing phenomena.
Energy Harvesting, with specific reference to nonlinear vibration harvesting, micro and nanoscale energy management.
Energy efficiency in computing devices, with specific reference to micro and nanoscale logic gate devices
Thermal noise and non equilibrium relaxation processes in solid state systems
Thermodynamics of computing and fundamental limits in the physics of computation.
Publications
The Physics of Computing, Springer, 2021
Introduzione alla scienza dei computer, McGraw-Hill, 2004. |
https://en.wikipedia.org/wiki/BigBlueButton | BigBlueButton is a virtual classroom software program designed for online education. Accessed through a variety of Learning Management Systems, the application provides engagement tools and analytics for educators to interact with their students remotely. It is open source, except for some versions of its database software.
History
The project was started at Carleton University in 2007 by the Technology Innovation Management program.
The first version was written by Richard Alam (it was initially called the Blindside project) under the supervision of Tony Bailetti. BigBlueButton is an affiliate member of the Open Source Initiative. The BigBlueButton name comes from the initial concept that starting a web conference should be as simple as pressing a metaphorical big blue button.
In 2009 Richard Alam, Denis Zgonjanin, and Fred Dixon uploaded the BigBlueButton source code to Google Code and formed Blindside Networks, a company pursuing the traditional open source business model of providing paid support and services to the BigBlueButton community.
In 2010 the core developers added a whiteboard for annotating the uploaded presentation. Jeremy Thomerson added an application programming interface (API) which the BigBlueButton community subsequently used to integrate with Sakai, WordPress, Moodle 1.9, Moodle 2.0, Joomla, Redmine, Drupal, Tiki Wiki CMS Groupware, Foswiki, and LAMS. Google accepted BigBlueButton into the 2010 Google Summer of Code program. To encourage contributions from others, the core developers moved the source code from Google Code to GitHub. The project indicated its intent to create an independent, not-for-profit BigBlueButton Foundation to oversee future development.
In 2011, the core developers announced they were adding record and playback capabilities to BigBlueButton 0.80.
In 2020, the project released BigBlueButton 2.2, a full rewrite of the client and server to support HTML5.
In March 2020, BigBlueButton 2.2 was awarded by the Preside |
https://en.wikipedia.org/wiki/Karl%20Prachar | Karl Prachar (1924 – November 27, 1994) was an Austrian mathematician who worked in the area of analytic number theory. He is known for his much-acclaimed book on the distribution of the prime numbers, Primzahlverteilung (Springer Verlag, 1957).
Prachar received his doctorate in 1947 from the University of Vienna. |
https://en.wikipedia.org/wiki/Soil%20fertility | Soil fertility refers to the ability of soil to sustain agricultural plant growth, i.e. to provide plant habitat and result in sustained and consistent yields of high quality. It also refers to the soil's ability to supply plant/crop nutrients in the right quantities and qualities over a sustained period of time. A fertile soil has the following properties:
The ability to supply essential plant nutrients and water in adequate amounts and proportions for plant growth and reproduction; and
The absence of toxic substances that may inhibit plant growth, e.g. excess Fe2+, which leads to nutrient toxicity.
The following properties contribute to soil fertility in most situations:
Sufficient soil depth for adequate root growth and water retention;
Good internal drainage, allowing sufficient aeration for optimal root growth (although some plants, such as rice, tolerate waterlogging);
Topsoil, or O horizon, with sufficient soil organic matter for healthy soil structure and soil moisture retention;
Soil pH in the range 5.5 to 7.0 (suitable for most plants but some prefer or tolerate more acid or alkaline conditions);
Adequate concentrations of essential plant nutrients in plant-available forms;
Presence of a range of microorganisms that support plant growth.
In lands used for agriculture and other human activities, maintenance of soil fertility typically requires the use of soil conservation practices. This is because soil erosion and other forms of soil degradation generally result in a decline in quality with respect to one or more of the aspects indicated above.
Soil fertilization
Bioavailable phosphorus (available to soil life) is the element in soil that is most often lacking. Nitrogen and potassium are also needed in substantial amounts. For this reason these three elements are always identified on a commercial fertilizer analysis. For example, a 10-10-15 fertilizer has 10 percent nitrogen, 10 percent available phosphorus (P2O5) and 15 percent water-soluble potassium (K2O).
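The arithmetic behind a fertilizer grade is simple percentage-by-weight; the sketch below (function name and bag size are illustrative) converts a grade into nutrient masses:

```python
# Nutrient content implied by a fertilizer grade (hypothetical bag size).
# A grade like "10-10-15" means 10% N, 10% available phosphate (P2O5),
# and 15% soluble potash (K2O) by weight.

def nutrient_mass(grade, bag_kg):
    """Return kg of N, P2O5 and K2O in a bag of fertilizer of the given grade."""
    n_pct, p_pct, k_pct = (float(x) for x in grade.split("-"))
    return {
        "N": bag_kg * n_pct / 100,
        "P2O5": bag_kg * p_pct / 100,
        "K2O": bag_kg * k_pct / 100,
    }

print(nutrient_mass("10-10-15", 50))  # a 50 kg bag: 5 kg N, 5 kg P2O5, 7.5 kg K2O
```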
https://en.wikipedia.org/wiki/Institute%20of%20Applied%20Physics%20at%20the%20Ilia%20State%20University | The Institute of Applied Physics at Ilia State University (ISU) is headed by Prof. Ivane G. Murusidze.
Research
Research areas include:
Structural and electronic properties of low-dimensional systems, atomic clusters and nanostructures;
Biophysics of bacteria-metal interactions; microbial production of noble and semimetal nanoparticles;
Intense laser interactions with plasmas, advanced plasma-based acceleration concepts, new radiation sources; relativistic nonlinear optics;
Development and application of numerical techniques for the computer simulations of complex nonlinear systems.
Current research is focused on:
Computer modeling and simulations of boron nitride nanostructures, their electronic and size-dependent structural properties.
Biophysics of microbial synthesis of gold and silver nanoparticles by extremophilic archaeal strains.
Theoretical modeling and computer simulations of intense laser-plasma interactions, relativistic nonlinear optics of ultrashort, ultraintense laser pulses in plasmas.
Teaching & Education
The Institute of Applied Physics provides high-quality introductory, intermediate, and advanced-level courses in physics, statistics, computer modeling and applied mathematics to the School of Engineering and the School of Arts and Sciences at the Ilia State University.
There are four Schools in the university, each offering undergraduate, graduate and post-graduate programs.
|
https://en.wikipedia.org/wiki/Murderous%20Maths | Murderous Maths is a series of British educational books by author Kjartan Poskitt. Most of the books in the series are illustrated by illustrator Philip Reeve, with the exception of "The Secret Life of Codes", which is illustrated by Ian Baker, "Awesome Arithmetricks" illustrated by Daniel Postgate and Rob Davis, and "The Murderous Maths of Everything", also illustrated by Rob Davis.
The Murderous Maths books have been published in over 25 countries. The books, which are aimed at children aged 8 and above, teach maths, spanning from basic arithmetic to relatively complex concepts such as the quadratic formula and trigonometry. The books are written in an informal style similar to that of the Horrible Histories, Horrible Science and Horrible Geography series, involving evil geniuses, gangsters, and a generally comedic tone.
Development
The first two books of the series were originally part of "The Knowledge" (now "Totally") series, itself a spin-off of Horrible Histories. However, these books were eventually redesigned and they, as well as the rest of the titles in the series, now use the Murderous Maths banner. According to Poskitt, "these books have even found their way into schools and proved to be a boost to GCSE studies". The books are also available in foreign editions, including: German, Spanish, Polish, Czech, Greek, Dutch, Norwegian, Turkish, Croatian, Italian, Lithuanian, Korean, Danish, Hungarian, Finnish, Thai and Portuguese (Latin America). In 2009, the books were redesigned again, changing the cover art style and the titles of most of the books in the series.
Poskitt's goal, according to the Murderous Maths website, is to write books that are "something funny to read", have "good amusing illustrations", include "tricks", and "explaining the maths involved as clearly as possible". He adds that although he doesn't "work to any government imposed curriculum or any stage achievement levels", he has "been delighted to receive many messages of support and thanks |
https://en.wikipedia.org/wiki/Fibonacci%20polynomials | In mathematics, the Fibonacci polynomials are a polynomial sequence which can be considered as a generalization of the Fibonacci numbers. The polynomials generated in a similar way from the Lucas numbers are called Lucas polynomials.
Definition
These Fibonacci polynomials are defined by a recurrence relation:
The Lucas polynomials use the same recurrence with different starting values:
They can be defined for negative indices by
The Fibonacci polynomials form a sequence of orthogonal polynomials with and .
Examples
The first few Fibonacci polynomials are:
The first few Lucas polynomials are:
Properties
The degree of Fn is n − 1 and the degree of Ln is n.
The Fibonacci and Lucas numbers are recovered by evaluating the polynomials at x = 1; Pell numbers are recovered by evaluating Fn at x = 2.
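These evaluations can be checked directly from the recurrence F_n(x) = x·F_{n−1}(x) + F_{n−2}(x) with F_0 = 0, F_1 = 1 (and L_0 = 2, L_1 = x for the Lucas polynomials). The sketch below represents polynomials as coefficient lists; all function names are ours:

```python
# Fibonacci and Lucas polynomials via the shared recurrence
# p_n(x) = x * p_{n-1}(x) + p_{n-2}(x), with polynomials stored as
# coefficient lists [c0, c1, ...] (c_k is the coefficient of x^k).

def poly_add(a, b):
    n = max(len(a), len(b))
    a = a + [0] * (n - len(a))
    b = b + [0] * (n - len(b))
    return [x + y for x, y in zip(a, b)]

def poly_shift(a):
    """Multiply a polynomial by x."""
    return [0] + a

def poly_eval(a, x):
    return sum(c * x ** k for k, c in enumerate(a))

def fibonacci_poly(n):
    f = [[0], [1]]          # F_0 = 0, F_1 = 1
    for _ in range(2, n + 1):
        f.append(poly_add(poly_shift(f[-1]), f[-2]))
    return f[n]

def lucas_poly(n):
    l = [[2], [0, 1]]       # L_0 = 2, L_1 = x
    for _ in range(2, n + 1):
        l.append(poly_add(poly_shift(l[-1]), l[-2]))
    return l[n]

# F_n(1) gives the Fibonacci numbers, F_n(2) the Pell numbers:
print([poly_eval(fibonacci_poly(n), 1) for n in range(1, 8)])  # [1, 1, 2, 3, 5, 8, 13]
print([poly_eval(fibonacci_poly(n), 2) for n in range(1, 8)])  # [1, 2, 5, 12, 29, 70, 169]
```

Note also that, e.g., F_6(x) = x^5 + 4x^3 + 3x has degree 5 = n − 1, matching the degree property above.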
The ordinary generating functions for the sequences are:
The polynomials can be expressed in terms of Lucas sequences as
They can also be expressed in terms of Chebyshev polynomials and as
where is the imaginary unit.
Identities
As particular cases of Lucas sequences, Fibonacci polynomials satisfy a number of identities, such as
Closed form expressions, similar to Binet's formula are:
where
are the solutions (in t) of
For the Lucas polynomials with n > 0, we have
A relationship between the Fibonacci polynomials and the standard basis polynomials is given by
For example,
Combinatorial interpretation
If F(n,k) is the coefficient of xk in Fn(x), namely
then F(n,k) is the number of ways an n−1 by 1 rectangle can be tiled with 2 by 1 dominoes and 1 by 1 squares so that exactly k squares are used. Equivalently, F(n,k) is the number of ways of writing n−1 as an ordered sum involving only 1 and 2, so that 1 is used exactly k times. For example F(6,3)=4 and 5 can be written in 4 ways, 1+1+1+2, 1+1+2+1, 1+2+1+1, 2+1+1+1, as a sum involving only 1 and 2 with 1 used 3 times. By counting the number of times 1 and 2 are both used in such a sum, it is evident tha |
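The count in the example can be verified by brute-force enumeration of ordered sums of 1s and 2s (a small illustrative script; the function names are ours):

```python
# Count ordered sums of n-1 using only 1s and 2s with exactly k ones:
# the combinatorial reading of the coefficient F(n, k) above.

def compositions_12(total):
    """All ordered sequences of 1s and 2s summing to `total`."""
    results = []
    def extend(prefix, remaining):
        if remaining == 0:
            results.append(tuple(prefix))
            return
        for part in (1, 2):
            if part <= remaining:
                extend(prefix + [part], remaining - part)
    extend([], total)
    return results

# F(6, 3): ordered sums of 5 using 1 and 2, with 1 used exactly 3 times.
count = sum(1 for c in compositions_12(5) if c.count(1) == 3)
print(count)  # 4
```

The total number of such compositions of n − 1 is the Fibonacci number F_n(1), consistent with the tiling interpretation.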
https://en.wikipedia.org/wiki/Inferring%20horizontal%20gene%20transfer | Horizontal or lateral gene transfer (HGT or LGT) is the transmission of portions of genomic DNA between organisms through a process decoupled from vertical inheritance. In the presence of HGT events, different fragments of the genome are the result of different evolutionary histories. This can therefore complicate investigations of the evolutionary relatedness of lineages and species. Also, as HGT can bring into genomes radically different genotypes from distant lineages, or even new genes bearing new functions, it is a major source of phenotypic innovation and a mechanism of niche adaptation. For example, of particular relevance to human health is the lateral transfer of antibiotic resistance and pathogenicity determinants, leading to the emergence of pathogenic lineages.
Inferring horizontal gene transfer through computational identification of HGT events relies upon the investigation of sequence composition or evolutionary history of genes. Sequence composition-based ("parametric") methods search for deviations from the genomic average whereas evolutionary history-based ("phylogenetic") approaches identify genes whose evolutionary history significantly differs from that of the host species. The evaluation and benchmarking of HGT inference methods typically rely upon simulated genomes, for which the true history is known. On real data, different methods tend to infer different HGT events, and as a result it can be difficult to ascertain all but simple and clear-cut HGT events.
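As a minimal illustration of the parametric idea, one can scan a genome in fixed-size windows and flag those whose GC content deviates strongly from the genome-wide average. The window size and z-score threshold below are arbitrary choices for this sketch, not values from any published method:

```python
# Toy "parametric" HGT screen: flag windows whose GC content deviates
# from the mean across all windows by more than z standard deviations.

def gc_content(seq):
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

def flag_atypical_windows(genome, window=1000, z=2.0):
    """Return (start, gc) pairs for compositionally atypical windows."""
    windows = [(i, gc_content(genome[i:i + window]))
               for i in range(0, len(genome) - window + 1, window)]
    values = [gc for _, gc in windows]
    mean = sum(values) / len(values)
    var = sum((v - mean) ** 2 for v in values) / len(values)
    sd = var ** 0.5
    return [(i, gc) for i, gc in windows if sd > 0 and abs(gc - mean) > z * sd]
```

On a synthetic genome with a uniform composition plus one AT-rich insert, only the insert's window is flagged; real parametric methods use richer signatures (codon usage, k-mer frequencies) and more careful statistics.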
Overview
Horizontal gene transfer was first observed in 1928, in Frederick Griffith's experiment: by showing that virulence was able to pass from virulent to non-virulent strains of Streptococcus pneumoniae, Griffith demonstrated that genetic information can be horizontally transferred between bacteria via a mechanism known as transformation. Similar observations in the 1940s and 1950s showed evidence that conjugation and transduction are additional mechanisms of horizontal gene transfer.
To |
https://en.wikipedia.org/wiki/Sommerfeld%20identity | The Sommerfeld identity is a mathematical identity, due Arnold Sommerfeld, used in the theory of propagation of waves,
where
μ = √(λ² − k²) is to be taken with positive real part, to ensure the convergence of the integral and its vanishing in the limit z → ∞, and
r = √(ρ² + z²).
Here, r is the distance from the origin while ρ is the distance from the central axis of a cylinder, as in the cylindrical coordinate system. The notation for Bessel functions follows the German convention, to be consistent with the original notation used by Sommerfeld: the function I_0^(1) is the zeroth-order Bessel function of the first kind, better known by the notation J_0 in the English literature.
This identity is known as the Sommerfeld identity.
In alternative notation, the Sommerfeld identity can be more easily seen as an expansion of a spherical wave in terms of cylindrically-symmetric waves:
Where
The notation used here is different from that above: r is now the distance from the origin and ρ is the radial distance in a cylindrical coordinate system, defined as ρ = √(x² + y²). The physical interpretation is that a spherical wave can be expanded into a summation of cylindrical waves in the ρ direction, multiplied by a two-sided plane wave in the z direction; see the Jacobi–Anger expansion. The summation has to be taken over all the wavenumbers k_ρ.
The Sommerfeld identity is closely related to the two-dimensional Fourier transform with cylindrical symmetry, i.e., the Hankel transform. It is found by transforming the spherical wave along the in-plane coordinates (x, y, or ρ, φ) but not transforming along the height coordinate z. |
https://en.wikipedia.org/wiki/Nanchung | Nanchung is an invertebrate TRP channel that acts to sense mechanical force. Drosophila nanchung mutants show deficits in antennal sensation, including hearing and hygrosensation, and are unable to transduce sound stimuli. |
https://en.wikipedia.org/wiki/Protoplasma | Protoplasma is a bimonthly peer-reviewed scientific journal covering various aspects of protoplasm research. It is published by Springer Science+Business Media and was established in 1926. The editor-in-chief is P. Nick (University of Karlsruhe).
The journal publishes research articles, reviews, and commentaries related to protoplasm, including its cell structure, signal transduction, biotechnology, and genetic engineering.
Abstracting and indexing
The journal is abstracted and indexed in:
According to the Journal Citation Reports, the journal has a 2021 impact factor of 3.186. |
https://en.wikipedia.org/wiki/Vitellogenin | Vitellogenin (VTG, or less commonly VG) (from Latin vitellus, yolk, and genero, I produce) is a precursor of egg yolk that transports protein and some lipid from the liver through the blood to the growing oocytes where it becomes part of the yolk. Normally, it is only found in the blood or hemolymph of females, and can therefore be used as a biomarker in vertebrates of exposure to environmental estrogens which stimulate elevated levels in males as well as females. "Vitellogenin" is a synonymous term for the gene and the expressed protein. The protein product is classified as a glycolipoprotein, having properties of a sugar, fat and protein. It belongs to a family of several lipid transport proteins.
Vitellogenin is an egg yolk precursor found in the females of nearly all oviparous species including fish, amphibians, reptiles, birds, most invertebrates, and monotremes. Vitellogenin is the precursor of the lipoproteins and phosphoproteins that make up most of the protein content of yolk. In the presence of estrogenic endocrine disruptive chemicals (EDCs), male fish can express the gene in a dose dependent manner. This gene expression in male fish can be used as a molecular marker of exposure to estrogenic EDCs.
Function
Vitellogenin provides the major egg yolk protein that is a source of nutrients during early development of egg-laying (oviparous) vertebrates and invertebrates. Although vitellogenin also carries some lipid for deposition in the yolk, the primary mechanism for deposition of yolk lipid is instead via VLDLs, at least in birds and reptiles. Vitellogenin precursors are multi-domain apolipoproteins (proteins that bind to lipids to form lipoproteins), that are cleaved into distinct yolk proteins. Different vitellogenin proteins exist, which are composed of variable combinations of yolk protein components; however, the cleavage sites are conserved.
Components
In vertebrates, a complete vitellogenin is composed of:
an N-terminal signal peptide
https://en.wikipedia.org/wiki/MIME%20Object%20Security%20Services | MIME Object Security Services (MOSS) is a protocol that uses the multipart/signed and multipart/encrypted framework to apply digital signature and encryption services to MIME objects.
Details
The services are offered through the use of end-to-end cryptography between an originator and a recipient at the application layer. Asymmetric (public key) cryptography is used in support of the digital signature service and encryption key management. Symmetric (secret key) cryptography is used in support of the encryption service. The procedures are intended to be compatible with a wide range of public key management approaches, including both ad hoc and certificate-based schemes. Mechanisms are provided to support many public key management approaches.
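A minimal sketch of the multipart/signed container that MOSS builds on, using Python's standard email package. The protocol parameter follows the application/moss-signature content type associated with MOSS; the micalg value and signature bytes here are placeholders, not a real MOSS signature:

```python
# Build the two-part multipart/signed structure: the first part is the
# protected content, the second carries the (placeholder) signature.
from email.mime.application import MIMEApplication
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText

msg = MIMEMultipart(
    "signed",
    protocol="application/moss-signature",  # MOSS signature content type
    micalg="rsa-md5",                       # placeholder integrity-check algorithm
)
msg.attach(MIMEText("The text to be signed."))           # part 1: content
msg.attach(MIMEApplication(b"...signature octets...",    # part 2: signature
                           "moss-signature"))

print(msg.get_content_type())  # multipart/signed
```

The multipart/encrypted service is structured analogously, with a control part describing the encryption followed by the encrypted body.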
Spreading
MOSS was never widely deployed and is now abandoned, largely due to the popularity of PGP. |
https://en.wikipedia.org/wiki/Tripartite%20synapse | Tripartite synapse refers to the functional integration and physical proximity of the presynaptic membrane, postsynaptic membrane, and their intimate association with surrounding glia as well as the combined contributions of these three synaptic components to the production of activity at the chemical synapse. Tripartite synapses occur at a number of locations in the central nervous system with astrocytes and may also exist with Muller glia of retinal ganglion cells and Schwann cells at the neuromuscular junction. The term was first introduced in the late 1990s to account for a growing body of evidence that glia are not merely passive neuronal support cells but, instead, play an active role in the integration of synaptic information through bidirectional communication with the neuronal components of the synapse as mediated by neurotransmitters and gliotransmitters.
Evidence of the Tripartite Synapse
Evidence for the role of astrocytes in the integration and processing of synaptic information presents itself in a number of ways:
Astrocytes are excitable cells: In response to stimuli from any of the three components of the tripartite synapse, astrocytes are capable of producing transient changes in their intracellular calcium concentrations through release of calcium stores from the endoplasmic reticulum
Astrocytes communicate bidirectionally with neurons: Through changes in their calcium concentration excitability, astrocytes are able to detect neurotransmitters and other signals released from neurons at the synapse and can release their own neurotransmitters or gliotransmitters that are, in turn, capable of modifying the electrophysiological excitability of neurons.
Astrocytes are capable of responding selectively to stimuli: Astrocytes of the hippocampal stratum oriens form tripartite synapses with axonal projections from the alveus. The alveus projections can form either glutamatergic or cholinergic synapses with the stratum oriens, but the astrocytes of this r |
https://en.wikipedia.org/wiki/Amalka%20Supercomputing%20facility | The Amalka Supercomputing facility is the largest of the three Czech parallel supercomputers. It is used by the Department of Space Physics,
Institute of Atmospheric Physics, Academy of Sciences of the Czech Republic.
The primary task is computation and visualisation in the area of space research for the European Space Agency or NASA, such as a preparation of Demeter (satellite) launch.
Amalka Supercomputing facility is credited with computing the first kinetic magnetic field model of Mercury in the MESSENGER project. It also helped to understand the results from the Cluster II mission.
At present, the facility is supporting the THEMIS (Time History of Events and Macroscale Interactions during Substorms) project. The results will be useful in planning permanent human bases on the Moon that will be protected from the solar wind.
The current version runs Slackware Linux and delivers 6.38 TFlops. Expansion and optimization of the infrastructure is being implemented by Sprinx Systems. |
https://en.wikipedia.org/wiki/Handgun%20effectiveness | Handgun effectiveness is a measure of the stopping power of a handgun: its ability to incapacitate a hostile target as quickly and efficiently as possible.
Overview
Most handgun projectiles have significantly lower energy than centerfire rifles and shotguns. What they lack in power, they make up for in being small and lightweight, lending to concealability and practicality. Handgun power and the effectiveness of different cartridges are widely debated topics. Experimental research among civilians, law enforcement agencies, militaries, and ammunition companies is constantly ongoing. Factors that can influence handgun effectiveness include handgun design, bullet type, and bullet capabilities (e.g. wound mechanisms, penetration, velocity, and weight).
Factors
Cavitation
Most handgun projectiles wound primarily through the size of the hole they produce, known as a permanent cavity or simply a bullet hole. Rifles are capable of much higher velocities with similar cartridges and add temporary cavitation for additional lethality. Many handgun bullets move too slowly to cause temporary cavitation, but it may occur if the bullet fragments, strikes inelastic tissue (liver, spleen, kidneys, CNS), or transfers a sufficiently large amount of energy into the subject. This last instance usually requires a larger and/or higher-velocity projectile than is commonly used with handguns.
Penetration
One factor used to measure a handgun's effectiveness is penetration. The FBI requires that all service rounds achieve a minimum penetration depth in calibrated ballistic gelatin. This generally ensures a bullet will reach the vital human organs from many angles and through many different layers and materials of clothing. Penetration is often argued to be the most important factor in handgun cartridge wounding potential, outside the skill of the shooter.
Ballistic Pressure Wave/Hydrostatic Shock
There is a significant body of evidence that Hydrostatic shock (more precisely known as the ballistic pressure wave) can contrib |
https://en.wikipedia.org/wiki/Minoan%20Moulds%20of%20Palaikastro | The Minoan Moulds of Palaikastro () are two double-sided pieces of schist, formed in the Minoan period as casting moulds for plaques with figures and symbols. These include female figures with raised arms, labrys double axes (Λάβρυες, labryes) and opium poppy flowers or capsules, two double axes with indented edges, the Horns of Consecration symbol, and a sun-like disc with complex markings, which has been claimed by some researchers to be for making objects to use in astronomical predictions of solar and lunar eclipses.
They were found in 1899 near Palaikastro in the eastern part of Crete, and are now in the Herakleion Archeological Museum in Crete.
Description
Stefanos Xanthoudidis, who published the find in 1900, described the two moulds, which were made from relatively soft and brittle schist, as Plate Α and Plate Β. His plaster casts, which are also reproduced on the right-hand side, are mirror images of the original moulds. Both moulds are wide, high and thick, while the width of the plaster casts is .
The front of Plate Α shows a large disc with rectangular spokes and a serrated edge (which some are keen to interpret as "geared"), a female figure with raised arms, who holds flowers in her hands and a small disc with a cross in the centre on top of a bell-shaped and horizontally striped base, above a crescent. Double horns, the 'Horns of Consecration' of the Minoan culture, and a trident are shown on the rear. A small piece of the lower edge of the mould is broken-off.
The front of Plate B shows engravings of a couple of double axes, dissimilar in size with teethed edges. The double axe or labrys was a cultural, almost certainly religious, symbol of the Minoan culture, often used for votive offerings, as were goddess figures with uplifted hands. The rear of the plate shows a female figure with raised arms holding two double axes. A small piece of the lower edge of the mould is broken-off as well. Both plates are exhibited side by side in the Heraklion Arc |
https://en.wikipedia.org/wiki/Legacy-free%20PC | A legacy-free PC is a type of personal computer that lacks a floppy and/or optical disc drive, legacy ports, and an Industry Standard Architecture (ISA) bus (or sometimes, any internal expansion bus at all). According to Microsoft, "The basic goal for these requirements is that the operating system, devices, and end users cannot detect the presence of the following: ISA slots or devices; legacy floppy disk controller (FDC); and PS/2, serial, parallel, and game ports." The legacy ports are usually replaced with Universal Serial Bus (USB) ports. A USB adapter may be used if an older device must be connected to a PC lacking these ports. According to the 2001 edition of Microsoft's PC System Design Guide, a legacy-free PC must be able to boot from a USB device.
Removing older, usually bulkier ports and devices allows a legacy-free PC to be much more compact than earlier systems and many fall into the nettop or all-in-one form factor. Netbooks and ultrabooks could also be considered a portable form of a legacy-free PC. Legacy-free PCs can be more difficult to upgrade than a traditional beige box PC, and are more typically expected to be replaced completely when they become obsolete. Many legacy-free PCs include modern devices that may be used to replace ones omitted, such as a memory card reader replacing the floppy drive.
As the first decade of the 21st century progressed, the legacy-free PC went mainstream, with legacy ports removed from commonly available computer systems in all form factors. However, the PS/2 keyboard connector retains some use, as it offers capabilities (e.g. n-key rollover) not available over USB.
With those parts becoming increasingly rare on newer computers as of the late 2010s and early 2020s, the term "legacy-free PC" itself has also become increasingly rare.
History
Late 1980s
In 1987, IBM released the IBM PS/2 line with new internal architecture; the BIOS and the new PS/2 port and VGA port was introduced, but this li |
https://en.wikipedia.org/wiki/Artificial%20intelligence%20in%20video%20games | In video games, artificial intelligence (AI) is used to generate responsive, adaptive or intelligent behaviors primarily in non-player characters (NPCs) similar to human-like intelligence. Artificial intelligence has been an integral part of video games since their inception in the 1950s. AI in video games is a distinct subfield and differs from academic AI. It serves to improve the game-player experience rather than machine learning or decision making. During the golden age of arcade video games the idea of AI opponents was largely popularized in the form of graduated difficulty levels, distinct movement patterns, and in-game events dependent on the player's input. Modern games often implement existing techniques such as pathfinding and decision trees to guide the actions of NPCs. AI is often used in mechanisms which are not immediately visible to the user, such as data mining and procedural-content generation.
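As a small illustration of the kind of pathfinding technique mentioned above, a breadth-first search over a tile grid finds a shortest route for an NPC (real game engines typically use A* with a distance heuristic; BFS keeps the sketch minimal, and all names here are illustrative):

```python
# Minimal NPC pathfinding: breadth-first search on a tile grid.
from collections import deque

def find_path(grid, start, goal):
    """grid: list of strings, '#' = wall. Returns a list of (row, col)
    cells from start to goal (inclusive), or None if unreachable."""
    rows, cols = len(grid), len(grid[0])
    came_from = {start: None}   # also serves as the visited set
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:     # walk parents back to the start
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] != '#' and (nr, nc) not in came_from:
                came_from[(nr, nc)] = cell
                queue.append((nr, nc))
    return None

level = ["....",
         ".##.",
         "...."]
print(find_path(level, (0, 0), (2, 3)))
```

BFS guarantees a shortest path on an unweighted grid; swapping the queue for a priority queue ordered by cost-plus-heuristic turns the same skeleton into A*.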
In general, game AI does not mean, as might be thought and is sometimes depicted, a realization of an artificial person corresponding to an NPC in the manner of the Turing test or an artificial general intelligence.
Overview
The term "game AI" is used to refer to a broad set of algorithms that also include techniques from control theory, robotics, computer graphics and computer science in general, and so video game AI may often not constitute "true AI" in that such techniques do not necessarily facilitate computer learning or other standard criteria, only constituting "automated computation" or a predetermined and limited set of responses to a predetermined and limited set of inputs.
Many industries and corporate voices claim that so-called video game AI has come a long way in the sense that it has revolutionized the way humans interact with all forms of technology, although many expert researchers are skeptical of such claims, and particularly of the notion that such technologies fit the definition of "intelligence" standardly used in the |
https://en.wikipedia.org/wiki/Combinatory%20categorial%20grammar | Combinatory categorial grammar (CCG) is an efficiently parsable, yet linguistically expressive grammar formalism. It has a transparent interface between surface syntax and underlying semantic representation, including predicate–argument structure, quantification and information structure. The formalism generates constituency-based structures (as opposed to dependency-based ones) and is therefore a type of phrase structure grammar (as opposed to a dependency grammar).
CCG relies on combinatory logic, which has the same expressive power as the lambda calculus, but builds its expressions differently. The first linguistic and psycholinguistic arguments for basing the grammar on combinators were put forth by Steedman and Szabolcsi.
More recent prominent proponents of the approach are Pauline Jacobson and Jason Baldridge. In these new approaches, the combinator B (the compositor) is useful in creating long-distance dependencies, as in "Who do you think Mary is talking about?" and the combinator W (the duplicator) is useful as the lexical interpretation of reflexive pronouns, as in "Mary talks about herself". Together with I (the identity mapping) and C (the permutator) these form a set of primitive, non-interdefinable combinators. Jacobson interprets personal pronouns as the combinator I, and their binding is aided by a complex combinator Z, as in "Mary lost her way". Z is definable using W and B.
Parts of the formalism
The CCG formalism defines a number of combinators (application, composition, and type-raising being the most common). These operate on syntactically-typed lexical items, by means of Natural deduction style proofs. The goal of the proof is to find some way of applying the combinators to a sequence of lexical items until no lexical item is unused in the proof. The resulting type after the proof is complete is the type of the whole expression. Thus, proving that some sequence of words is a sentence of some language amounts to proving that the words reduc |
https://en.wikipedia.org/wiki/Melanoma-associated%20antigen | The mammalian members of the MAGE (melanoma-associated antigen) gene family were originally described as completely silent in normal adult tissues, with the exception of male germ cells and, for some of them, placenta. By contrast, these genes were expressed in various kinds of tumors.
MAGE-like genes have also been identified in non-mammalian species, like the zebrafish or Drosophila melanogaster. Although no MAGE homologous sequences have been identified in Caenorhabditis elegans, Saccharomyces cerevisiae or Schizosaccharomyces pombe, MAGE sequences have been found in several vegetal species, including Arabidopsis thaliana.
The only region of homology shared by all of the members of the family is a stretch of about 200 amino acids which has been named the MAGE conserved domain. The MAGE conserved domain is usually located close to the C-terminus, although it can also be found in a more central position in some proteins. The MAGE conserved domain is generally present as a single copy but it is duplicated in some proteins. It has been proposed that the MAGE conserved domain of MAGE-D proteins might interact with p75 neurotrophin or related receptors.
Human proteins containing this domain
MAGE-B1; MAGEA1; MAGEA10; MAGEA11; MAGEA12; MAGEA2B; MAGEA3;
MAGEA4; MAGEA6; MAGEA8; MAGEA9; MAGEB1; MAGEB10; MAGEB16; MAGEB18;
MAGEB2; MAGEB3; MAGEB4; MAGEB5; MAGEB6; MAGEB6B; MAGEC1; MAGEC2;
MAGEC3; MAGED1; MAGED2; MAGED4; MAGEE1; MAGEE2; MAGEF1; MAGEH1;
MAGEL2; NDN; NDNL2; |
https://en.wikipedia.org/wiki/Infinidat | Infinidat is an Israeli-American data storage company.
History
Infinidat was founded by Moshe Yanai in 2011. By 2015 it was valued at $1.2 billion, and in 2017 it was valued at $1.6 billion. The company has offices in 17 countries and two headquarters: one in Waltham, MA and one in Herzliya, Israel.
InfiniBox
In 2013 the company filed for thirty-nine patents, and later that year released its flagship product, the InfiniBox. Each system initially managed about five petabytes of data.
As of October 2017, the company had shipped about two exabytes' worth of storage to its customers. The company uses a mix of conventional disk and flash storage; its systems deliver better than one million IOPS and 99.99999 percent reliability. The product is used by large corporations and clients including cloud service providers, telecoms, financial services firms, healthcare providers, and others that require large amounts of data storage.
Funding
In 2015 the company received $150 million in funding during its Series B round led by TPG Growth.
In 2017, the company received $95 million in funding, in a Series C round led by Goldman Sachs. At this stage it had received $325 million in total funding. |
https://en.wikipedia.org/wiki/D%20%28programming%20language%29 | D, also known as dlang, is a multi-paradigm system programming language created by Walter Bright at Digital Mars and released in 2001. Andrei Alexandrescu joined the design and development effort in 2007. Though it originated as a re-engineering of C++, D is a profoundly different language: features of D can be considered streamlined and expanded-upon ideas from C++; however, D also draws inspiration from other high-level programming languages, notably Java, Python, Ruby, C#, and Eiffel.
D combines the performance and safety of compiled languages with the expressive power of modern dynamic and functional programming languages. Idiomatic D code is commonly as fast as equivalent C++ code, while also being shorter. The language as a whole is not memory-safe but includes optional attributes designed to guarantee memory safety of either subsets of or the whole program.
Type inference, automatic memory management and syntactic sugar for common types allow faster development, while bounds checking and design by contract find bugs earlier at runtime, and a concurrency-aware type system catches bugs at compile time.
Features
D was designed with lessons learned from practical C++ usage, rather than from a purely theoretical perspective. Although the language uses many C and C++ concepts, it also discards some, or takes different approaches (and syntax) to achieve some goals. As such, it is not source compatible with C and C++ source code in general, nor does it aim to be (some simpler code bases from these languages may happen to compile as D, or may require some porting). D has, however, been constrained in its design by the rule that any code that is legal in both C and D should behave in the same way.
D gained some features before C++, such as closures, anonymous functions, compile-time function execution, ranges, built-in container iteration concepts and type inference. D adds to the functionality of C++ by also implementing design by contract, unit testing, true modules, |
https://en.wikipedia.org/wiki/Mga%20%28protein%29 | Mga is a DNA-binding protein that activates the expression of several important virulence genes in Streptococcus pyogenes (group A Streptococcus, GAS) in response to changing environmental conditions. The family also contains VirR-like proteins, which match only at the C-terminus.
Mga is a wide-reaching regulator, affecting gene expression in over 10% of the S. pyogenes genome. The other large regulator of virulence in GAS is the CovR/S two-component system, which affects the expression of approximately 15% of the genome. The two systems are linked through another protein, RivR, and a small non-coding RNA, RivX.
https://en.wikipedia.org/wiki/Microbial%20cyst | A microbial cyst is a resting or dormant stage of a microorganism, usually a bacterium or a protist, or rarely an invertebrate animal, that helps the organism to survive unfavorable environmental conditions. It can be thought of as a state of suspended animation in which the metabolic processes of the cell are slowed and the cell ceases activities such as feeding and locomotion. Encystment, the formation of the cyst, also helps the microbe to disperse easily, from one host to another or to a more favorable environment. When the encysted microbe reaches an environment favorable to its growth and survival, the cyst wall breaks down by a process known as excystation. The exact stimulus for excystation is unknown for most protists.
Unfavorable environmental conditions that are not conducive to the growth of the microbe, such as lack of nutrients or oxygen, extreme temperatures, lack of moisture, and the presence of toxic chemicals, trigger the formation of a cyst.
Cysts serve three main functions: they protect against adverse changes in the environment, such as nutrient deficiency, desiccation, adverse pH, and low levels of oxygen; they are sites for nuclear reorganization and cell division; and, in parasitic species, they are the infectious stage between hosts.
Cyst formation across species
In bacteria
In bacteria (for instance, Azotobacter sp.), encystment occurs by changes in the cell wall; the cytoplasm contracts and the cell wall thickens. Bacterial cysts differ from endospores in the way they are formed and also the degree of resistance to unfavorable conditions. Endospores are much more resistant than cysts.
Bacteria do not always form a single cyst; a variety of cyst-formation events are known. For example, Rhodospirillum centenum can vary the number of cells per cyst, usually ranging from four to ten cells per cyst, depending on the environment.
In protists
Protists, especially protozoan parasites, are often exposed to very harsh conditions at various stages in t |
https://en.wikipedia.org/wiki/Owen%27s%20T%20function | In mathematics, Owen's T function T(h, a), named after statistician Donald Bruce Owen, is defined by

T(h, a) = (1/2π) ∫₀ᵃ exp(−½h²(1 + x²)) / (1 + x²) dx   (−∞ < h, a < +∞).
The function was first introduced by Owen in 1956.
Applications
The function T(h, a) gives the probability of the event (X > h and 0 < Y < aX) where X and Y are independent standard normal random variables.
This function can be used to calculate bivariate normal distribution probabilities and, from there, in the calculation of multivariate normal distribution probabilities.
It also frequently appears in various integrals involving Gaussian functions.
Computer algorithms for the accurate calculation of this function are available; quadrature methods have been employed since the 1970s.
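As a sketch of such a quadrature approach (the step count is illustrative and this is not any published algorithm), the defining integral can be approximated with composite Simpson's rule:

```python
import math

def owens_t(h, a, n=2000):
    """Approximate Owen's T(h, a) = (1/2π) ∫₀ᵃ exp(−h²(1+x²)/2)/(1+x²) dx
    by composite Simpson's rule with n subintervals (n must be even)."""
    f = lambda x: math.exp(-0.5 * h * h * (1 + x * x)) / (1 + x * x)
    dx = a / n
    s = f(0) + f(a)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(i * dx)
    return s * dx / 3 / (2 * math.pi)

# T(0, 1) = arctan(1)/(2π) = 1/8 exactly, a handy sanity check:
print(round(owens_t(0.0, 1.0), 6))  # → 0.125
```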
Properties
One useful special value is T(h, 1) = ½ Φ(h) [1 − Φ(h)], where Φ(x) is the standard normal cumulative distribution function.
More properties can be found in the literature. |
https://en.wikipedia.org/wiki/Epicormic%20shoot | An epicormic shoot is a shoot growing from an epicormic bud, which lies underneath the bark of a trunk, stem, or branch of a plant.
Epicormic buds lie dormant beneath the bark, their growth suppressed by hormones from active shoots higher up the plant. Under certain conditions, they grow into active shoots, such as when damage occurs to higher parts of the plant, or light levels are increased following removal of nearby plants. Epicormic buds and shoots occur in many woody species, but are absent from many others, such as most conifers.
Function
Human horticultural practices that exploit epicormic growth rely on plants that have epicormic budding capabilities for regenerative function in response to crown damage, such as through wind or fire.
Epicormic shoots are the means by which trees regrow after coppicing or pollarding, where the tree's trunk or branches are cut back on a regular cycle. These forestry techniques cannot be used on species which do not possess strong epicormic growth abilities.
Pruning leads to growth of suppressed shoots below the cut – these may be from epicormic buds, but they may also be other growth, such as normal buds or small shoots which are only partly suppressed.
Examples
Epicormic resprouting is typical of some tree species from fire-prone ecosystems.
As one of their responses to frequent bushfires which would destroy most other plants, many Eucalypt trees found widely throughout Australia have extensive epicormic buds which sprout following a fire, allowing the vegetative regeneration of branches from their trunks. These epicormic buds are highly protected, set deeper beneath the thick bark than in other tree species, allowing both the buds and vascular cambium to be insulated from the intense heat. Not all eucalypt trees possess this means of vegetative recovery, and the ability of a tree to survive and re-sprout depends on many factors, such as fire intensity, scorch height, and tree height, species, age, and size. Jarrah t |
https://en.wikipedia.org/wiki/LowerUnivalents | In proof compression, an area of mathematical logic, LowerUnivalents is an algorithm used for the compression of propositional resolution proofs. LowerUnivalents is a generalised algorithm of the LowerUnits, and it is able to lower not only units but also subproofs of non-unit clauses, provided that they satisfy some additional conditions. |
https://en.wikipedia.org/wiki/Advisory%20Committee%20on%20Human%20Radiation%20Experiments | The Advisory Committee on Human Radiation Experiments was established in 1994 to investigate questions of the record of the United States government with respect to human radiation experiments. The special committee was created by President Bill Clinton in Executive Order 12891, issued January 15, 1994. Ruth Faden of The Johns Hopkins Berman Institute of Bioethics chaired the committee.
The thousand-page final report of the Committee was released in October 1995 at a White House ceremony.
Background
The scandal first came to public attention in a newsletter called Science Trends in 1976 and in Mother Jones magazine in 1981. Mother Jones reporter Howard Rosenburg used the Freedom of Information Act to gather hundreds of documents in order to investigate total-body irradiation studies which were done at the Oak Ridge Institute for Nuclear Studies (now the Oak Ridge Institute for Science and Education). The Mother Jones article triggered a hearing before the Subcommittee on Investigations and Oversight of the House Science and Technology Committee. Congressman Al Gore of Tennessee chaired the hearing. Gore's subcommittee report stated that the radiation experiments were "satisfactory, but not perfect."
In November 1986, a report by the staff of Massachusetts Congressman Ed Markey was released, but received only cursory media coverage. Entitled "American Nuclear Guinea Pigs: Three decades of radiation experiments on U.S. citizens", the report stated that there had been 31 human radiation experiments involving nearly 700 people. Markey urged the Department of Energy to make every effort to find the experimental subjects and compensate them for damages, which did not occur. DOE officials knew who had conducted the experiments, and the names of some of the subjects. After the report was released, President Ronald Reagan and Vice-President George H. W. Bush resisted opening investigations of the radiation experiments.
The Markey report found that between 1945 and 1947 eighteen hospita |
https://en.wikipedia.org/wiki/Standardized%20mortality%20ratio | In epidemiology, the standardized mortality ratio or SMR is a quantity, expressed as either a ratio or a percentage, quantifying the increase or decrease in mortality of a study cohort with respect to the general population.
Standardized mortality ratio
The standardized mortality ratio is the ratio of observed deaths in the study group to expected deaths in the general population. This ratio can be expressed as a percentage simply by multiplying by 100.
The SMR may be quoted as either a ratio or a percentage. If the SMR is quoted as a ratio and is equal to 1.0, then this means the number of observed deaths equals that of expected cases. If higher than 1.0, then there is a higher number of deaths than is expected. SMR constitutes an indirect form of standardization. It has an advantage over the direct method of standardization since age-adjustment is permitted in situations where age stratification may not be available for the cohort being studied or where strata-specific data are subject to excessive random variability.
Definition
The requirements for calculating SMR for a cohort are:
The number of persons in each age group in the population being studied
The age specific death rates of the general population in the same age groups of the study population
The observed deaths in the study population
Expected deaths are then calculated simply by multiplying the death rates of the general population by the number of study participants in the corresponding age group, and summing the values over all age groups to arrive at the number of expected deaths. The study groups are weighted based on their particular distribution (for example, age), as opposed to the general population's distribution. This is a fundamental distinction between an indirect method of standardization such as SMR and direct standardization techniques.
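The calculation just described can be sketched in a few lines (the age bands, rates, and counts below are illustrative, not real data):

```python
def smr(observed_deaths, cohort_by_age, reference_rates):
    """Indirectly standardized mortality ratio.

    cohort_by_age:   persons in the study cohort per age group
    reference_rates: general-population death rates per age group
    """
    expected = sum(reference_rates[age] * n
                   for age, n in cohort_by_age.items())
    return observed_deaths / expected

cohort = {"40-59": 1000, "60-79": 500}     # persons per age band
rates = {"40-59": 0.001, "60-79": 0.010}   # deaths per person-year
# Expected deaths = 1000*0.001 + 500*0.010 = 6; with 9 observed:
print(round(smr(9, cohort, rates), 3))  # → 1.5
```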
The SMR may well be quoted with an indication of the uncertainty associated with its estimation, such as a confidence |
https://en.wikipedia.org/wiki/Minification%20%28programming%29 | Minification (also minimisation or minimization) is the process of removing all unnecessary characters from the source code of interpreted programming languages or markup languages without changing its functionality. These unnecessary characters usually include white space characters, new line characters, comments, and sometimes block delimiters, which are used to add readability to the code but are not required for it to execute. Minification reduces the size of the source code, making its transmission over a network (e.g. the Internet) more efficient. In programmer culture, aiming at extremely minified source code is the purpose of recreational code golf competitions.
Minification can be distinguished from the more general concept of data compression in that the minified source can be interpreted immediately without the need for an uncompression step: the same interpreter can work with both the original as well as with the minified source.
The goals of minification are not the same as the goals of obfuscation; the former is often intended to be reversed using a pretty-printer or unminifier. However, to achieve its goals, minification sometimes uses techniques also used by obfuscation; for example, shortening variable names and refactoring the source code. When minification uses such techniques, the pretty-printer or unminifier can only fully reverse the minification process if it is supplied details of the transformations done by such techniques. If not supplied those details, the reversed source code will contain different variable names and control flow, even though it will have the same functionality as the original source code.
Example
For example, the JavaScript code
// This is a comment that will be removed by the minifier
var array = [];
for (var i = 0; i < 20; i++) {
array[i] = i;
}
is equivalent to but longer than
for(var a=[],i=0;i<20;a[i]=i++);
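A toy minifier illustrating the comment- and whitespace-stripping steps can be written with a few regular expressions (a deliberately naive sketch: real minifiers such as JSMin tokenize the source, since regexes like these corrupt string literals and regex literals that contain `//`):

```python
import re

def naive_minify(js):
    """Strip comments and collapse whitespace in a JavaScript snippet.
    Deliberately simplistic: unsafe on string and regex literals."""
    js = re.sub(r"/\*.*?\*/", "", js, flags=re.S)        # block comments
    js = re.sub(r"//[^\n]*", "", js)                     # line comments
    js = re.sub(r"\s+", " ", js)                         # collapse whitespace
    js = re.sub(r"\s*([=;{}()\[\],<+])\s*", r"\1", js)   # drop space around punctuation
    return js.strip()

source = """
// This is a comment that will be removed by the minifier
var array = [];
for (var i = 0; i < 20; i++) {
  array[i] = i;
}
"""
print(naive_minify(source))
# → var array=[];for(var i=0;i<20;i++){array[i]=i;}
```

Note that the result is semantically equivalent but not as compact as the hand-minified one-liner in the article, which also restructures the loop.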
History
In 2001 Douglas Crockford introduced JSMin, which removed comments and whitespace from JavaScrip |
https://en.wikipedia.org/wiki/Multiple%20Console%20Time%20Sharing%20System | The Multiple Console Time Sharing System (MCTS) was an operating system developed by General Motors Research Laboratories in the 1970s for the Control Data Corporation STAR-100 supercomputer. MCTS was built to support GM's computer-aided design (CAD) applications.
MCTS was based on Multics.
See also
GM-NAA I/O
SHARE Operating System
Timeline of operating systems |
https://en.wikipedia.org/wiki/Upward%20continuation | Upward continuation is a method used in oil exploration and geophysics to estimate the values of a gravitational or magnetic field by using measurements at a lower elevation and extrapolating upward, assuming continuity. This technique is commonly used to merge different measurements to a common level so as to reduce scatter and allow for easier analysis.
See also
Petroleum geology |
https://en.wikipedia.org/wiki/Acoustic%20transmission | Acoustic transmission is the transmission of sounds through and between materials, including air, wall, and musical instruments.
The degree to which sound is transferred between two materials depends on how well their acoustical impedances match.
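For normal incidence at a boundary, the match can be quantified by the intensity transmission coefficient T = 4·Z₁·Z₂ / (Z₁ + Z₂)², sketched below (the impedance values are rough illustrative figures):

```python
def transmission_coefficient(z1, z2):
    """Fraction of incident acoustic intensity transmitted across a
    boundary between media of characteristic impedances z1 and z2
    (normal incidence)."""
    return 4 * z1 * z2 / (z1 + z2) ** 2

# Perfectly matched media transmit everything:
print(transmission_coefficient(420.0, 420.0))  # → 1.0
# Air (~420 rayl) to water (~1.48e6 rayl): nearly all sound reflects
print(round(transmission_coefficient(420.0, 1.48e6), 4))  # → 0.0011
```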
In musical instrument design
Musical instruments are generally designed to radiate sound effectively. A high-impedance part of the instrument, such as a string, transmits vibrations through a bridge (intermediate impedance) to a soundboard (lower impedance). The soundboard then moves the still lower-impedance air. Without the bridge and soundboard, the instrument would not transmit enough sound to the air and would be too quiet to perform with. An electric guitar has no soundboard; it instead uses electromagnetic pickups and artificial amplification. Without amplification, electric guitars are very quiet.
Stethoscope
Stethoscopes roughly match the acoustical impedance of the human body, so they transmit sounds from a patient's chest to the doctor's ear much more effectively than the air does. Putting an ear to someone's chest would have a similar effect.
Building acoustics
Acoustic transmission in building design refers to a number of processes by which sound can be transferred from one part of a building to another. Typically these are:
Airborne transmission - a noise source in one room sends air pressure waves which induce vibration to one side of a wall or element of structure setting it moving such that the other face of the wall vibrates in an adjacent room. Structural isolation therefore becomes an important consideration in the acoustic design of buildings. Highly sensitive areas of buildings, for example recording studios, may be almost entirely isolated from the rest of a structure by constructing the studios as effective boxes supported by springs. Air tightness also becomes an important control technique. A tightly sealed door might have reasonable sound reduction properties, but if it is left open only a few millimeters i |
https://en.wikipedia.org/wiki/Entwisleia | Entwisleia is a monotypic genus in the red algae family, Entwisleiaceae. There is just one species (the type species) in this genus,
Entwisleia bella, from south-eastern Tasmania and represents both a new family and a new order in the Nemaliophycidae.
It is a marine species found in the Derwent River estuary. It grows at depths between 5.0 and 9.0 m and is found scattered on mudstone reef flats dusted or shallowly covered by sand. The site at which it was found is subject to episodic high-rainfall events throughout the year and heavy swells in winter. It is a feathery dioecious seaweed, very like the freshwater red algae, Batrachospermum, but from DNA sequencing, appears to be quite unrelated. Scott et al.'s (2013) study shows it as a sister clade of the Colaconematales.
The genus was named to honour Tim Entwisle and was circumscribed by Fiona Jean Scott and Gerald Thompson Kraft in Eur. J. Phycol. Vol. 48 (Issue 4) on page 402 in 2013.
https://en.wikipedia.org/wiki/Content-addressable%20network | The content-addressable network (CAN) is a distributed, decentralized P2P infrastructure that provides hash table functionality on an Internet-like scale. CAN was one of the original four distributed hash table proposals, introduced concurrently with Chord, Pastry, and Tapestry.
Overview
Like other distributed hash tables, CAN is designed to be scalable, fault tolerant, and self-organizing. The architectural design is a virtual multi-dimensional Cartesian coordinate space, a type of overlay network, on a multi-torus. This n-dimensional coordinate space is a virtual logical address, completely independent of the physical location and physical connectivity of the nodes. Points within the space are identified with coordinates. The entire coordinate space is dynamically partitioned among all the nodes in the system such that every node possesses at least one distinct zone within the overall space.
Routing
A CAN node maintains a routing table that holds the IP address and virtual coordinate zone of each of its neighbors. A node routes a message towards a destination point in the coordinate space. The node first determines which neighboring zone is closest to the destination point, and then looks up that zone's node's IP address via the routing table.
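Greedy routing of this kind can be sketched as follows (the node and zone representations are illustrative, not from any particular CAN implementation):

```python
import math

def next_hop(neighbors, destination):
    """Pick the neighbor whose zone centre is closest to the destination
    point in the d-dimensional coordinate space.

    neighbors: dict mapping IP address -> zone centre coordinates
    """
    def dist(p, q):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))
    return min(neighbors, key=lambda ip: dist(neighbors[ip], destination))

# A node near (0.2, 0.2) routes a message toward the point (0.9, 0.9):
neighbors = {"10.0.0.1": (0.5, 0.3),    # east neighbor
             "10.0.0.2": (0.2, 0.5),    # north neighbor
             "10.0.0.3": (0.1, 0.1)}    # south-west neighbor
print(next_hop(neighbors, (0.9, 0.9)))  # → 10.0.0.1
```

Repeating this step at every hop moves the message monotonically closer to the zone that owns the destination point.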
Node joining
To join a CAN, a joining node must:
Find a node already in the overlay network.
Identify a zone that can be split
Update the routing tables of nodes neighboring the newly split zone.
To find a node already in the overlay network, bootstrapping nodes may be used to inform the joining node of IP addresses of nodes currently in the overlay network.
After the joining node receives an IP address of a node already in the CAN, it can attempt to identify a zone for itself. The joining node randomly picks a point in the coordinate space and sends a join request, directed to the random point, to one of the received IP addresses. The nodes already in the overlay network route the join request to the correct d |
https://en.wikipedia.org/wiki/Synthetic%20division | In algebra, synthetic division is a method for manually performing Euclidean division of polynomials, with less writing and fewer calculations than long division.
It is mostly taught for division by linear monic polynomials (known as Ruffini's rule), but the method can be generalized to division by any polynomial.
The advantages of synthetic division are that it allows one to calculate without writing variables, it uses few calculations, and it takes significantly less space on paper than long division. Also, the subtractions in long division are converted to additions by switching the signs at the very beginning, helping to prevent sign errors.
Regular synthetic division
The first example is synthetic division with only a monic linear denominator, x − 3.
The numerator (dividend) can be written as p(x) = x³ − 12x² + 0x − 42.
The zero of the denominator is 3.
The coefficients of p(x) are arranged as follows, with the zero of the denominator on the left:

  3 |   1   −12     0   −42

The 1 after the bar is "dropped" to the last row.
The dropped 1 is multiplied by the 3 before the bar, and the product (3) is placed in the next column of the middle row.
An addition is performed in the next column: −12 + 3 = −9.
The previous two steps are repeated and the following is obtained:

  3 |   1   −12     0   −42
    |         3   −27   −81
    ------------------------
        1    −9   −27  −123

Here, the last term (−123) is the remainder while the rest correspond to the coefficients of the quotient.
The terms are written with increasing degree from right to left, beginning with degree zero for the remainder and the result.
Hence the quotient and remainder are:

  q(x) = x² − 9x − 27,   r = −123.
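The tabular procedure translates directly into a short routine (a sketch; coefficient lists are given highest degree first):

```python
def synthetic_division(coeffs, r):
    """Divide a polynomial by (x - r) using synthetic division.

    coeffs: coefficients from highest to lowest degree.
    Returns (quotient coefficients, remainder).
    """
    row = [coeffs[0]]                # "drop" the leading coefficient
    for c in coeffs[1:]:
        row.append(c + r * row[-1])  # multiply by r, then add down the column
    return row[:-1], row[-1]

# x^3 - 12x^2 + 0x - 42 divided by x - 3:
print(synthetic_division([1, -12, 0, -42], 3))  # → ([1, -9, -27], -123)
```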
Evaluating polynomials by the remainder theorem
The above form of synthetic division is useful in the context of the polynomial remainder theorem for evaluating univariate polynomials. To summarize, the value of p(x) at x = a is equal to the remainder of the division of p(x) by (x − a).
The advantage of calculating the value this way is that it requires just over half as many multiplication steps as naive evaluation. An alternative evaluation strategy is Horner's method.
Expanded synthetic division
This method generalizes to division by any monic polynomial with only a slight modification with chan |
https://en.wikipedia.org/wiki/Naptumomab%20estafenatox | Naptumomab estafenatox (ABR-217620) is a drug being developed for the treatment of various types of cancer like non-small cell lung carcinoma and renal cell carcinoma.
Mechanism of action
Chemically, it is a fusion protein consisting of the antigen-binding fragment (Fab) of a monoclonal antibody with the superantigen staphylococcal enterotoxin A (SEA/E-120, "estafenatox"). The Fab binds to 5T4, an antigen expressed by various tumor cells, and the superantigen induces an immune response by activating T lymphocytes.
See also
Nacolomab tafenatox, a drug with a similar chemical structure and mechanism |
https://en.wikipedia.org/wiki/Zero-forcing%20precoding | Zero-forcing (or null-steering) precoding is a method of spatial signal processing by which a multiple antenna transmitter can null the multiuser interference in a multi-user MIMO wireless communication system. When the channel state information is perfectly known at the transmitter, the zero-forcing precoder is given by the pseudo-inverse of the channel matrix.
Mathematical description
In a multiple antenna downlink system which comprises an N_t-transmit-antenna access point and K single-receive-antenna users, such that K ≤ N_t, the received signal of user k is described as

y_k = h_k^T x + n_k,   k = 1, 2, …, K

where x = Σ_{i=1}^{K} √(P_i) s_i w_i is the N_t × 1 vector of transmitted symbols, n_k is the noise signal, h_k is the channel vector and w_i is the linear precoding vector. Here (·)^T denotes the matrix transpose, √(P_i) is the square root of the transmit power, and s_i is the message signal with zero mean and variance E[|s_i|²] = 1.

The above signal model can be more compactly re-written as

y = H^T W D s + n

where
y is the K × 1 received signal vector,
H = [h_1, h_2, …, h_K] is the N_t × K channel matrix,
W = [w_1, w_2, …, w_K] is the N_t × K precoding matrix,
D = diag(√P_1, …, √P_K) is a K × K diagonal power matrix, and
s = [s_1, s_2, …, s_K]^T is the K × 1 transmit signal.

A zero-forcing precoder is defined as a precoder where the beam w_i intended for user i is orthogonal to every channel vector h_j associated with the users j ≠ i. That is,

w_i ⊥ h_j   for all   j ≠ i.

Thus the interference caused by the signal meant for one user is effectively nullified for the rest of the users via the zero-forcing precoder.

From the fact that each beam generated by the zero-forcing precoder is orthogonal to all the other user channel vectors, one can rewrite the received signal as

y_k = h_k^T w_k √(P_k) s_k + n_k,   k = 1, 2, …, K.

The orthogonality condition can be expressed in matrix form as

H^T W = Q

where Q is some K × K diagonal matrix. Typically, Q is selected to be the identity matrix, which makes W the right Moore–Penrose pseudo-inverse of H^T, given by

W = H (H^T H)^{−1}.

Given this zero-forcing precoder design, the received signal at each user is decoupled from the others as

y_k = √(P_k) s_k + n_k,   k = 1, 2, …, K.
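The pseudo-inverse construction can be sketched numerically (random real-valued channel for illustration; practical channels are complex-valued):

```python
import numpy as np

rng = np.random.default_rng(0)
Nt, K = 4, 3                        # transmit antennas, single-antenna users
H = rng.standard_normal((Nt, K))    # channel matrix, columns are user channels

# Zero-forcing precoder: right pseudo-inverse of H^T, so that H^T W = I
W = H @ np.linalg.inv(H.T @ H)

# Each user's channel is orthogonal to every other user's beam:
print(np.allclose(H.T @ W, np.eye(K)))  # → True
```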
Quantifying the feedback amount
Quantify the amount of the feedback resource required to maintain at least a given throughput performance gap between zero-forcing with perfect feedback and wi |
https://en.wikipedia.org/wiki/Flag%20of%20the%20Western%20Union | The Western Union (WU) was a military alliance established between France, the United Kingdom and the three Benelux countries between 1948 and 1954. The flag of the Western Union, also referred to as the Western Union Standard, displays an unbroken chain of five rectangular links in the shape of an upside-down pentagon on a blue field, with a multicoloured border (red on the outside, gold, black and white) taken from the WU member states' flags.
Design
Field (badge design)
The field of the flag is blue, and displays an unbroken chain of five rectangular links in the shape of an upside-down pentagon.
Border
The border of the flag is a multicoloured border (red on the outside, then gold, black and white) taken from the WU member states' flags. The relative proportions of the border colours are approximately: red 3, each of the others 1. The total width of the border is approximately half the depth of the flag. The number of links symbolises the Western Union's five members.
History
The flag was first seen in October 1949. The flag might also have been introduced and used as a command flag of Commander in Chief Admiral of the Fleet Rhoderick McGrigor during Exercise Verity in 1949, the only major exercise held by the organisation. The flag was also flown on a car belonging to Bernard Montgomery, Chairman of the Commanders-in-Chief Committee.
A photo of the flag is shown in the book entitled Badges on Battledress by Howard N. Cole (Aldershot, Gale & Polden, 1953). The original caption states: 'NCOs of the Corps of Royal Military Police displaying the Western Union Standard which incorporates the badge of the Headquarters, Western Europe Commanders-in-Chief'. It does not say where the photo was taken, although it might well be Fontainebleau.
Modification
The flag ceased to be used upon the creation of NATO's Headquarters, Allied Land Forces Central Europe (AFCENT) in August 1953, at which point one extra link was added to the emblem, symbolising the United States. Simi |
https://en.wikipedia.org/wiki/Hilbert%20transform | In mathematics and signal processing, the Hilbert transform is a specific singular integral that takes a function u(t) of a real variable and produces another function of a real variable, H(u)(t). The Hilbert transform is given by the Cauchy principal value of the convolution with the function 1/(πt) (see the definition below). The Hilbert transform has a particularly simple representation in the frequency domain: it imparts a phase shift of ±90° (π/2 radians) to every frequency component of a function, the sign of the shift depending on the sign of the frequency (see below). The Hilbert transform is important in signal processing, where it is a component of the analytic representation of a real-valued signal u(t). The Hilbert transform was first introduced by David Hilbert in this setting, to solve a special case of the Riemann–Hilbert problem for analytic functions.
Definition
The Hilbert transform of u can be thought of as the convolution of u(t) with the function h(t) = 1/(πt), known as the Cauchy kernel. Because 1/t is not integrable across t = 0, the integral defining the convolution does not always converge. Instead, the Hilbert transform is defined using the Cauchy principal value (denoted here by p.v.). Explicitly, the Hilbert transform of a function (or signal) u(t) is given by
H(u)(t) = (1/π) p.v. ∫_{−∞}^{∞} u(τ)/(t − τ) dτ,
provided this integral exists as a principal value. This is precisely the convolution of u with the tempered distribution p.v. 1/(πt). Alternatively, by changing variables, the principal-value integral can be written explicitly as
H(u)(t) = (1/π) lim_{ε→0⁺} ∫_{ε}^{∞} (u(t − τ) − u(t + τ))/τ dτ.
When the Hilbert transform is applied twice in succession to a function u, the result is
H(H(u)) = −u,
provided the integrals defining both iterations converge in a suitable sense. In particular, the inverse transform is H⁻¹ = −H. This fact can most easily be seen by considering the effect of the Hilbert transform on the Fourier transform of u(t) (see below).
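The ±90° phase-shift property and the identity H(H(u)) = −u can be checked numerically with SciPy's FFT-based `hilbert` helper, which returns the analytic signal u + iH(u); the sampled cosine below is an illustrative choice:

```python
import numpy as np
from scipy.signal import hilbert

# Sample cos(8t) over an integer number of periods so the FFT-based
# transform sees a truly periodic signal.
n = 1024
t = np.linspace(0, 2 * np.pi, n, endpoint=False)
x = np.cos(8 * t)

# scipy.signal.hilbert returns the analytic signal x + i*H(x);
# its imaginary part is the Hilbert transform of x.
hx = np.imag(hilbert(x))

# H shifts each frequency component by -90 degrees, so H(cos) = sin,
# and applying it twice negates the signal: H(H(x)) = -x.
print(np.allclose(hx, np.sin(8 * t)))             # True
print(np.allclose(np.imag(hilbert(hx)), -x))      # True
```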
For an analytic function in the upper half-plane, the Hilbert transform describes the relationship between the real part and the imaginary part of the boundary values. That is, if f(z) is analytic in the upp
https://en.wikipedia.org/wiki/Richard%20Owen | Sir Richard Owen (20 July 1804 – 18 December 1892) was an English biologist, comparative anatomist and paleontologist. Owen is generally considered to have been an outstanding naturalist with a remarkable gift for interpreting fossils.
Owen produced a vast array of scientific work, but is probably best remembered today for coining the word Dinosauria (meaning "Terrible Reptile" or "Fearfully Great Reptile"). An outspoken critic of Charles Darwin's theory of evolution by natural selection, Owen agreed with Darwin that evolution occurred but thought it was more complex than outlined in Darwin's On the Origin of Species. Owen's approach to evolution can be considered to have anticipated the issues that have gained greater attention with the recent emergence of evolutionary developmental biology.
Owen was the first president of the Microscopical Society of London in 1839 and edited many issues of its journal – then known as The Microscopic Journal. Owen also campaigned for the natural specimens in the British Museum to be given a new home. This resulted in the establishment, in 1881, of the now world-famous Natural History Museum in South Kensington, London. Bill Bryson argues that, "by making the Natural History Museum an institution for everyone, Owen transformed our expectations of what museums are for."
While he made several contributions to science and public learning, Owen was a controversial figure among his contemporaries, both for his disagreements on matters of common descent, and for accusations that he took credit for other people's work.
Biography
Owen became a surgeon's apprentice in 1820 and was appointed to the Royal College of Surgeons in 1826. In 1836, Owen was appointed Hunterian professor at the Royal College, and in 1849, he succeeded William Clift as conservator of the Hunterian Museum. He held the latter office until 1856, when he became superintendent of the natural history department of the British Museum. He then devoted much of his energ |
https://en.wikipedia.org/wiki/BioPAX | BioPAX (Biological Pathway Exchange) is a RDF/OWL-based
standard language to represent biological pathways at the molecular and cellular level. Its major use is to facilitate the exchange of pathway data. Pathway data captures our understanding of biological processes, but
its rapid growth necessitates development of databases and computational tools to aid interpretation. However, the current fragmentation of pathway information across many
databases with incompatible formats presents barriers to its effective use. BioPAX solves this
problem by making pathway data substantially easier to collect, index, interpret and share.
BioPAX can represent metabolic and signaling pathways, molecular and genetic interactions and
gene regulation networks. BioPAX was created through a community process. Through BioPAX, millions of interactions organized into thousands of pathways across many organisms, from a
growing number of sources, are available. Thus, large amounts of pathway data are available in a
computable form to support visualization, analysis and biological discovery.
It is supported by a variety of online databases (e.g. Reactome) and tools. The latest released version is BioPAX Level 3. There is also an effort to create a version of BioPAX as part of OBO.
Governance and development
The next version of BioPAX, Level 4, is being developed by a community of researchers. Development is coordinated by the board of editors and facilitated by various BioPAX work groups.
Systems Biology Pathway Exchange (SBPAX) is an extension for Level 3 and proposal for Level 4 to add quantitative data and systems biology terms (such as Systems Biology Ontology). SBPAX export has been implemented by the pathway databases Signaling Gateway Molecule Pages, and the SABIO-Reaction Kinetics Database. SBPAX import has been implemented by the cellular modeling framework Virtual Cell.
Other proposals for Level 4 include improved support for Semantic Web, validation and visualization.
Databa |
https://en.wikipedia.org/wiki/Aluminium%20phosphide | Aluminium phosphide is a highly toxic inorganic compound with the chemical formula AlP, used as a wide band gap semiconductor and a fumigant. This colorless solid is generally sold as a grey-green-yellow powder due to the presence of impurities arising from hydrolysis and oxidation.
Properties
AlP crystals are dark grey to dark yellow in color and have a zincblende crystal structure with a lattice constant of 5.4510 Å at 300 K. They are thermodynamically stable up to 1,000 °C.
Aluminium phosphide reacts with water or acids to release phosphine:
AlP + 3 H2O → Al(OH)3 + PH3
AlP + 3 H+ → Al3+ + PH3
This reaction is the basis of its toxicity.
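As a back-of-the-envelope illustration of the hydrolysis stoichiometry above (one mole of phosphine evolved per mole of AlP), a short sketch using standard atomic weights:

```python
# Stoichiometry of AlP + 3 H2O -> Al(OH)3 + PH3: one mole of phosphine
# per mole of aluminium phosphide. Standard atomic weights in g/mol.
M_Al, M_P, M_H = 26.98, 30.97, 1.008
M_AlP = M_Al + M_P           # ~57.95 g/mol
M_PH3 = M_P + 3 * M_H        # ~33.99 g/mol

# Mass of phosphine evolved per gram of AlP hydrolysed.
phosphine_per_gram = M_PH3 / M_AlP
print(round(phosphine_per_gram, 3))  # 0.587
```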
Preparation
AlP is synthesized by combination of the elements:
4Al + P4 → 4AlP
Caution must be taken to avoid exposing the AlP to any sources of moisture, as this generates toxic phosphine gas. Phosphine also poses fire hazards, as it is a dangerous pyrophoric compound, igniting easily in air.
Uses
Pesticide
AlP is used as a rodenticide, insecticide, and fumigant for stored cereal grains. It is used to kill small verminous mammals such as moles and rodents. The tablets or pellets, known as "wheat pills", typically also contain other chemicals that evolve ammonia and carbon dioxide (e.g. ammonium carbamate), which help to reduce the potential for spontaneous ignition or explosion of the phosphine gas.
AlP is used as both a fumigant and an oral pesticide. As a rodenticide, aluminium phosphide pellets are provided as a mixture with food for consumption by the rodents. The acid in the digestive system of the rodent reacts with the phosphide to generate the toxic phosphine gas. Other pesticides similar to aluminium phosphide are zinc phosphide and calcium phosphide. In this application, aluminium phosphide can be encountered under various brand names, e.g. PestPhos, Quickphos, Celphos, Fostox, Fumitoxin, Phostek, Phostoxin, Talunex, Fieldphos, and Weevil-Cide. It generates phosphine gas according to the following hydrolysis equation
https://en.wikipedia.org/wiki/Tree%20hollow | A tree hollow or tree hole is a semi-enclosed cavity which has naturally formed in the trunk or branch of a tree. They are found mainly in old trees, whether living or not. Hollows form in many species of trees, and are a prominent feature of natural forests and woodlands, and act as a resource or habitat for a number of vertebrate and invertebrate animals.
Hollows may form as the result of physiological stress from natural forces causing the excavating and exposure of the heartwood. Forces may include wind, fire, heat, lightning, rain, attack from insects (such as ants or beetles), bacteria, or fungi. Also, trees may self-prune, dropping lower branches as they reach maturity, exposing the area where the branch was attached. Many animals further develop the hollows using instruments such as their beak, teeth or claws.
The size of hollows may depend on the age of the tree. For example, eucalypts develop hollows at all ages, but only once the trees are around 120 years old do they form hollows suitable for vertebrates, and it may take 220 years for hollows suitable for larger species to form.
Hollows in fallen timber are also very important for animals such as echidnas, numbats, chuditch and many reptiles. In streams, hollow logs may be important to aquatic animals for shelter and egg attachment.
Hollows are an important habitat for many wildlife species, especially where the use of hollows is obligate, as this means no other resource would be a feasible substitute. Animals may use hollows as diurnal or nocturnal shelter sites, as well as for rearing young, feeding, thermoregulation, and to facilitate ranging behaviour and dispersal. While use may also be opportunistic, rather than obligate, it may be difficult to determine the nature of a species' relationship to hollows—it may vary across a species' range, or depend on climatic conditions.
Animals will select a hollow based on factors including entrance size and shape, depth, and degree of insulation. Such factor |
https://en.wikipedia.org/wiki/Kalafungin | Kalafungin is a substance discovered in the 1960s and found to act as a broad-spectrum antibiotic in vitro. It was isolated from a strain of the bacterium Streptomyces tanashiensis.
It is not known to be marketed anywhere in the world. |
https://en.wikipedia.org/wiki/Content%20Security%20Policy | Content Security Policy (CSP) is a computer security standard introduced to prevent cross-site scripting (XSS), clickjacking and other code injection attacks resulting from execution of malicious content in the trusted web page context. It is a Candidate Recommendation of the W3C working group on Web Application Security, widely supported by modern web browsers. CSP provides a standard method for website owners to declare approved origins of content that browsers should be allowed to load on that website—covered types are JavaScript, CSS, HTML frames, web workers, fonts, images, embeddable objects such as Java applets, ActiveX, audio and video files, and other HTML5 features.
Status
The standard, originally named Content Restrictions, was proposed by Robert Hansen in 2004, first implemented in Firefox 4 and quickly picked up by other browsers. Version 1 of the standard was published in 2012 as a W3C candidate recommendation, quickly followed by further versions (Level 2) published in 2014. The draft of Level 3 is being developed, with the new features being quickly adopted by the web browsers.
The following header names are in use as part of experimental CSP implementations:
Content-Security-Policy – standard header name proposed by the W3C document. Google Chrome supports this as of version 25. Firefox supports this as of version 23, released on 6 August 2013. WebKit supports this as of version 528 (nightly build). Chromium-based Microsoft Edge support is similar to Chrome's.
X-WebKit-CSP – deprecated, experimental header introduced into Google Chrome, Safari and other WebKit-based web browsers in 2011.
X-Content-Security-Policy – deprecated, experimental header introduced in Gecko 2 based browsers (Firefox 4 to Firefox 22, Thunderbird 3.3, SeaMonkey 2.1).
A website can declare multiple CSP headers, also mixing enforcement and report-only ones. Each header will be processed separately by the browser.
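For illustration, a pair of such headers — one enforced, one report-only — might look like the following (the allowed origin and report endpoint are hypothetical examples, not from the source):

```http
Content-Security-Policy: default-src 'self'; script-src 'self' https://cdn.example.com
Content-Security-Policy-Report-Only: default-src 'self'; report-uri /csp-report
```

The first header blocks anything outside the declared sources; the second only reports violations to the given endpoint.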
CSP can also be delivered within the HTML code using a HTML |
https://en.wikipedia.org/wiki/Dyad%20symmetry | In genetics, dyad symmetry refers to two areas of a DNA strand whose base pair sequences are inverted repeats of each other. They are often described as palindromes.
For example, the following shows dyad symmetry between sequences GAATAC and GTATTC which are reverse complements of each other.
...GAATAC...CTG...GTATTC...
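The reverse-complement relationship in the example above can be sketched in a few lines (a minimal helper, assuming uppercase A/C/G/T input only):

```python
# Reverse complement of a DNA sequence (uppercase A/C/G/T assumed).
COMPLEMENT = str.maketrans("ACGT", "TGCA")

def reverse_complement(seq: str) -> str:
    """Complement each base, then reverse the strand."""
    return seq.translate(COMPLEMENT)[::-1]

# The two arms of the dyad from the example above map onto each other.
print(reverse_complement("GAATAC"))  # GTATTC
```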
Involvement in transcription
Since the two reverse complementary sequences will fold and base-pair with each other, the bases between them form a hairpin loop. This structure is thought to destabilize the binding of the RNA polymerase enzyme to DNA (hence terminating transcription). Dyad symmetry is known to have a role in the rho-independent method of transcription termination in E. coli. Regions of dyad symmetry in the DNA sequence stall the RNA polymerase enzyme as it transcribes them.
Involvement in prophage integration
Temperate bacteriophages integrate into the host genome at specific interrupted dyad symmetry sequences using the phage encoded enzyme integrase (see prophage integration). |
https://en.wikipedia.org/wiki/Bead%20theory | The bead theory is a disproved hypothesis that genes are arranged on the chromosome like beads on a necklace. This theory was first proposed by Thomas Hunt Morgan after discovering genes through his work with breeding red- and white-eyed fruit flies. According to this theory, the existence of a gene as a unit of inheritance is recognized through its mutant alleles. A mutant allele affects a single phenotypic character, maps to one chromosome locus, gives a mutant phenotype when paired and shows a Mendelian ratio when intercrossed. Several tenets of the bead theory are worth emphasizing:
1. The gene is viewed as a fundamental unit of structure, indivisible by crossing over. Crossing over takes place between genes (the beads in this model) but never within them.
2. The gene is viewed as the fundamental unit of change or mutation. It changes in toto from one allelic form into another; there are no smaller components within it that can change.
3. The gene is viewed as the fundamental unit of function (although the precise function of the gene is not specified in this model). Parts of a gene, if they exist, cannot function. Guido Pontecorvo continued to work on the basis of this theory until
Seymour Benzer showed in the 1950s that the bead theory was not correct. He demonstrated that a gene can be defined as a unit of function. A gene can be subdivided into a linear array of sites that are mutable and that can be recombined. The smallest units of mutation and recombination are now known to be correlated with single nucleotide pairs. |
https://en.wikipedia.org/wiki/Yaw%20%28rotation%29 | A yaw rotation is a movement around the yaw axis of a rigid body that changes the direction it is pointing, to the left or right of its direction of motion. The yaw rate or yaw velocity of a car, aircraft, projectile or other rigid body is the angular velocity of this rotation, or rate of change of the heading angle when the aircraft is horizontal. It is commonly measured in degrees per second or radians per second.
Another important concept is the yaw moment, or yawing moment, which is the component of a torque about the yaw axis.
Measurement
Yaw velocity can be measured by measuring the ground velocity at two geometrically separated points on the body, or by a gyroscope, or it can be synthesized from accelerometers and the like. It is the primary measure of how drivers sense a car's turning visually.
It is important in electronic stabilized vehicles. The yaw rate is directly related to the lateral acceleration of the vehicle turning at constant speed around a constant radius, by the relationship
v · ω = a_lat = v² / r, in consistent units (tangential speed v, yaw velocity ω, lateral acceleration a_lat, radius of turn r)
The sign convention can be established by rigorous attention to coordinate systems.
In a more general manoeuvre where the radius is varying, and/or the speed is varying, the above relationship no longer holds.
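The constant-radius relationship above can be checked with illustrative numbers (the speed and radius are arbitrary example values, not from the source):

```python
# Steady turn at constant speed and radius: v * yaw_rate = a_lat = v**2 / r.
v = 20.0   # tangential speed, m/s (illustrative)
r = 50.0   # radius of turn, m (illustrative)

yaw_rate = v / r          # rad/s
a_lat = v * yaw_rate      # m/s^2

print(yaw_rate, a_lat)  # 0.4 8.0
```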
Yaw rate control
The yaw rate can be measured with accelerometers in the vertical axis. Any device intended to measure the yaw rate is called a yaw rate sensor.
Road vehicles
Studying the stability of a road vehicle requires a reasonable approximation to the equations of motion.
The diagram illustrates a four-wheel vehicle, in which the front axle is located a metres ahead of the centre of gravity and the rear axle is b metres towards the rear from the centre of gravity. The body of the car is pointing in a direction θ (theta) while it is travelling in a direction ψ (psi). In general, these are not the same. The tyre treads at the region of contact po
https://en.wikipedia.org/wiki/Spectral%20submanifold | In dynamical systems, a spectral submanifold (SSM) is the unique smoothest invariant manifold serving as the nonlinear extension of a spectral subspace of a linear dynamical system under the addition of nonlinearities. SSM theory provides conditions for when invariant properties of eigenspaces of a linear dynamical system can be extended to a nonlinear system, and therefore motivates the use of SSMs in nonlinear dimensionality reduction.
Definition
Consider a nonlinear ordinary differential equation of the form
ẋ = A x + f(x),
with constant matrix A and the nonlinearities contained in the smooth function f.
Assume that Re λ < 0 for all eigenvalues λ of A, that is, the origin is an asymptotically stable fixed point. Now select a span E of eigenvectors of A. Then the eigenspace E is an invariant subspace of the linearized system ẋ = A x.
Under the addition of the nonlinearity f to the linear system, E generally perturbs into infinitely many invariant manifolds. Among these invariant manifolds, the unique smoothest one is referred to as the spectral submanifold.
An equivalent result for unstable SSMs holds for Re λ > 0.
Existence
The spectral submanifold tangent to E at the origin is guaranteed to exist provided that certain non-resonance conditions are satisfied by the eigenvalues in the spectrum of A. In particular, there can be no linear combination of the eigenvalues associated with E equal to one of the eigenvalues of A outside of the spectral subspace. If there is such an outer resonance, one can include the resonant mode into E and extend the analysis to a higher-dimensional SSM pertaining to the extended spectral subspace.
Non-autonomous extension
The theory on spectral submanifolds extends to nonlinear non-autonomous systems of the form
ẋ = A x + f(x) + ε g(x, Ω t),
with ε g a quasiperiodic forcing term.
Significance
Spectral submanifolds are useful for rigorous nonlinear dimensionality reduction in dynamical systems. The reduction of a high-dimensional phase space to a lower-dimensional manifold can lead to major simplifications by allowing for an accurate description |
https://en.wikipedia.org/wiki/Harold%20Scott%20MacDonald%20Coxeter | Harold Scott MacDonald "Donald" Coxeter (9 February 1907 – 31 March 2003) was a British-Canadian geometer and mathematician. He is regarded as one of the greatest geometers of the 20th century.
Biography
Coxeter was born in Kensington, England, to Harold Samuel Coxeter and Lucy. His father had taken over the family business of Coxeter & Son, manufacturers of surgical instruments and compressed gases (including a mechanism for anaesthetising surgical patients with nitrous oxide), but was able to retire early and focus on sculpting and baritone singing; Lucy Coxeter was a portrait and landscape painter who had attended the Royal Academy of Arts. A maternal cousin was the architect Sir Giles Gilbert Scott.
In his youth, Coxeter composed music and was an accomplished pianist at the age of 10. He felt that mathematics and music were intimately related, outlining his ideas in a 1962 article on "Music and Mathematics" in the Canadian Music Journal.
He was educated at King Alfred School, London, and St George's School, Harpenden, where his best friend was John Flinders Petrie, later a mathematician for whom Petrie polygons were named. He was accepted at King's College, Cambridge, in 1925, but decided to spend a year studying in hopes of gaining admittance to Trinity College, where the standard of mathematics was higher. Coxeter won an entrance scholarship and went to Trinity in 1926 to read mathematics. There he earned his BA (as Senior Wrangler) in 1928, and his doctorate in 1931. In 1932 he went to Princeton University for a year as a Rockefeller Fellow, where he worked with Hermann Weyl, Oswald Veblen, and Solomon Lefschetz. Returning to Trinity for a year, he attended Ludwig Wittgenstein's seminars on the philosophy of mathematics. In 1934 he spent a further year at Princeton as a Procter Fellow.
In 1936 Coxeter moved to the University of Toronto. In 1938 he and P. Du Val, H. T. Flather, and John Flinders Petrie published The Fifty-Nine Icosahedra with Universi |
https://en.wikipedia.org/wiki/Pascal%20Lee | Pascal Lee (born 1964) is co-founder and chairman of the Mars Institute, a planetary scientist at the SETI Institute, and the Principal Investigator of the Haughton-Mars Project (HMP) at NASA Ames Research Center in Mountain View, California. He holds an ME in geology and geophysics from the University of Paris, and a PhD in astronomy and space sciences from Cornell University.
Lee's research focuses on Mars, asteroids, and impact craters, in particular in connection with the history of water on planets and the possibility of extraterrestrial life. He is known internationally for his work on Moon and Mars analogs in the Arctic, Antarctica, and other extreme environments on Earth. He is the author and co-author of over 100 scientific publications, and first proposed the "Mars Always Cold, Sometimes Wet" model of Mars evolution based on field studies of the geology of Earth's polar regions.
In 1988, Lee wintered over for 402 days at Dumont d'Urville station, Adelie Land, Antarctica, where he served as the station's chief geophysicist. He also participated in five summer campaigns in Antarctica as a geologist and planetary scientist, in particular as a member of the US Antarctic Search for Meteorites (ANSMET) program.
In 1997, Lee initiated the Haughton-Mars Project (HMP), an international multidisciplinary field research project centered on science and exploration studies at the Haughton impact crater and surrounding terrain on Devon Island, Arctic Canada, viewed as an analog site for the Moon and Mars. Lee has led over 18 HMP field expeditions to date, including the "Northwest Passage Drive Expedition" in April 2009 and May 2010, and continues to serve as the HMP's Director in support of research for NASA and the Canadian Space Agency.
Pascal Lee is widely recognized for his efforts to advance the human exploration of Mars, in particular via its asteroid-like moons Phobos and Deimos.
Lee is a recipient of the United States Antarctic Service Medal and the Space |
https://en.wikipedia.org/wiki/Dirty%20paper%20coding | In telecommunications, dirty paper coding (DPC) or Costa precoding is a technique for efficient transmission of digital data through a channel subjected to some interference known to the transmitter. The technique consists of precoding the data in order to cancel the interference. Dirty-paper coding achieves the channel capacity, without a power penalty and without requiring the receiver to know the interfering signal.
The term dirty paper coding was coined by Max Costa who compared the technique to writing a message on a piece of paper which is partially soiled with random ink strokes or spots. By erasing and adding ink in the proper places, the writer can convey just as much information as if the paper were clean, even though the reader does not know where the dirt was. In this analogy, the paper is the channel, the dirt is interference, the writer is the transmitter, and the reader is the receiver.
Note that DPC at the encoder is an information-theoretic dual of Wyner-Ziv coding at the decoder.
Variants
Instances of dirty paper coding include Costa precoding (1983). Suboptimal approximations of dirty paper coding include Tomlinson-Harashima precoding (THP) published in 1971 and the vector perturbation technique of Hochwald et al. (2005).
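The modulo-presubtraction idea behind Tomlinson–Harashima-style precoding can be sketched as a toy example; the symbol range, data and interference values below are illustrative, and this sketch omits the feedback filtering of a real THP design:

```python
import numpy as np

A = 4.0  # symbols are confined to the interval [-A/2, A/2)

def mod_fold(x):
    """Wrap values back into [-A/2, A/2) by a modulo operation."""
    return (x + A / 2) % A - A / 2

s = np.array([1.0, -1.5, 0.5])   # data symbols (illustrative)
i = np.array([3.2, -7.1, 0.9])   # interference known at the transmitter

# Presubtract the known interference, then fold so transmit power
# stays bounded instead of growing with the interference.
x = mod_fold(s - i)
y = x + i                        # channel adds the interference back
recovered = mod_fold(y)          # receiver applies the same modulo

print(np.allclose(recovered, s))  # True
```

The receiver recovers the data without ever knowing the interference, mirroring the dirty-paper result, though the modulo step makes this only a suboptimal approximation.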
Design considerations
DPC and DPC-like techniques require knowledge of the interference state in a non-causal manner, such as the channel state information of all users and the other users' data. Hence, the design of a DPC-based system should include a procedure for feeding this side information to the transmitters.
Applications
In 2003, Caire and Shamai applied DPC to the multi-antenna multi-user downlink, which is referred to as the 'broadcast channel' by information theorists. Since then, DPC has seen widespread use in wireless networks and has been developed into an interference-aware coding technique for dynamic wireless networks.
Recently, DPC has also been used for "informed digital watermarking" and is the modulation mechanism used by |
https://en.wikipedia.org/wiki/Formation%20rule | In mathematical logic, formation rules are rules for describing which strings of symbols formed from the alphabet of a formal language are syntactically valid within the language. These rules address only the location and manipulation of the strings of the language. They do not describe anything else about the language, such as its semantics (i.e. what the strings mean). (See also formal grammar.)
Formal language
A formal language is an organized set of symbols, the essential feature of which is that it can be precisely defined in terms of just the shapes and locations of those symbols. Such a language can be defined, then, without any reference to any meanings of any of its expressions; it can exist before any interpretation is assigned to it—that is, before it has any meaning. A formal grammar determines which symbols and sets of symbols are formulas in a formal language.
Formal systems
A formal system (also called a logical calculus, or a logical system) consists of a formal language together with a deductive apparatus (also called a deductive system). The deductive apparatus may consist of a set of transformation rules (also called inference rules) or a set of axioms, or have both. A formal system is used to derive one expression from one or more other expressions. Propositional and predicate calculi are examples of formal systems.
Propositional and predicate logic
The formation rules of a propositional calculus may, for instance, take a form such that:
if we take Φ to be a propositional formula we can also take ¬Φ to be a formula;
if we take Φ and Ψ to be propositional formulas we can also take (Φ ∧ Ψ), (Φ ∨ Ψ), (Φ → Ψ) and (Φ ↔ Ψ) to also be formulas.
A predicate calculus will usually include all the same rules as a propositional calculus, with the addition of quantifiers such that if we take Φ to be a formula of propositional logic and α as a variable then we can take (∀α)Φ and (∃α)Φ each to be formulas of our predicate calculus.
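A minimal sketch of such formation rules as a recursive validity check (the tuple encoding and the connective names are an illustrative choice, not standard notation):

```python
# Formation rules for a toy propositional language:
# an atom is a formula; ("not", phi) is a formula; and (phi, op, psi)
# is a formula for each binary connective op below.
CONNECTIVES = {"and", "or", "->", "<->"}

def is_formula(expr) -> bool:
    # Atom: a single propositional variable name.
    if isinstance(expr, str):
        return expr.isalpha()
    # Negation: ("not", phi)
    if isinstance(expr, tuple) and len(expr) == 2 and expr[0] == "not":
        return is_formula(expr[1])
    # Binary connective: (phi, op, psi)
    if isinstance(expr, tuple) and len(expr) == 3 and expr[1] in CONNECTIVES:
        return is_formula(expr[0]) and is_formula(expr[2])
    return False

print(is_formula(("P", "and", ("not", "Q"))))  # True
```

The recursion mirrors the inductive definition: validity of a compound string reduces to validity of its parts, with no reference to meaning.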
See also
Finite state autom |
https://en.wikipedia.org/wiki/Holographic%20Versatile%20Card | The Holographic Versatile Card (HVC) was a proposed data storage format by Optware; the projected date for a Japanese launch had been the first half of 2007, pending finalization of the specification; however, as of March 2022, nothing had surfaced. One of its main advantages compared with discs was supposed to be the lack of moving parts when played. Optware claimed it would hold 30 GB of data, have a write speed three times faster than Blu-ray, and be approximately the size of a credit card. Optware claimed that at release the media would cost about ¥100 (roughly $1.20) each, that reader devices would initially cost about ¥200,000 (roughly $2,400), and that reader/writer devices would cost ¥1,000,000 (roughly $12,000, as per the exchange rate of April 2011) each.
See also
DVD
HD DVD
Holographic memory
Holographic Versatile Disc
Vaporware |
https://en.wikipedia.org/wiki/Anatoliy%20Skorokhod | Anatoliy Volodymyrovych Skorokhod (September 10, 1930 – January 3, 2011) was a Soviet and Ukrainian mathematician.
Skorokhod is well-known for a comprehensive treatise on the theory of stochastic processes, co-authored with Gikhman.
Career
Skorokhod worked at Kyiv University from 1956 to 1964. He was subsequently at the Institute of Mathematics of the National Academy of Sciences of Ukraine from 1964 until 2002. Since 1993, he had been a professor at Michigan State University in the US, and a member of the American Academy of Arts and Sciences.
He was an academician of the National Academy of Sciences of Ukraine from 1985 to his death in 2011.
His scientific works are on the theory of:
stochastic differential equations,
limit theorems of random processes,
distributions in infinite-dimensional spaces,
statistics of random processes and Markov processes.
Skorokhod authored over 450 scientific works, including more than 40 monographs and books.
Many terms and concepts have his name, including:
Skorokhod's embedding theorem
Skorokhod integral
Skorokhod's representation theorem
Skorokhod space
Skorokhod problem
Selected works
with I. I. Gikhman: Introduction to the theory of random processes, W. B. Saunders 1969, Dover 1996
with I. I. Gikhman: Stochastic Differential Equations, Springer Verlag 1972
with I. I. Gikhman: Controlled stochastic processes, Springer Verlag 1979
with I. I. Gikhman: The Theory of Stochastic Processes, Springer Verlag, 3 vols., 2004–2007
Random processes with independent increments, Kluwer 1991
Asymptotic methods in the theory of stochastic differential equations , American Mathematical Society 1989
Random linear operators, Reidel 1984
Studies in the theory of random processes, Dover 1982
Stochastic equations for complex systems, Reidel/Kluwer 1988
Stochastische Differentialgleichungen, Berlin, Akademie Verlag 1971
Integration in Hilbert Space, Springer Verlag 1974
with Yu. V. Prokhorov: Basic principles and applications of probab |
https://en.wikipedia.org/wiki/Kettapeptin | Kettapeptin is a depsipeptide antibiotic isolated from Streptomyces. It is effective against gram positive bacteria, as well as the fungi Candida albicans, Mucor miehei, and the microalgae Scenedesmus subspicatus. |
https://en.wikipedia.org/wiki/Military%20Cryptanalytics | Military Cryptanalytics (or MILCRYP as it is sometimes known) is a revision by Lambros D. Callimahos of the series of books written by William F. Friedman under the title Military Cryptanalysis. It may also contain contributions by other cryptanalysts. It was a training manual for National Security Agency and military cryptanalysts. It was published for government use between 1957 and 1977, though parts I and II were written in 1956 and 1959.
Callimahos on the work
From the Introduction in Part I, Volume I, by Callimahos:
"This text represents an extensive expansion and revision, both in scope and content, of the earlier work entitled 'Military Cryptanalysis, Part I' by William F. Friedman. This expansion and revision was necessitated by the considerable advancement made in the art since the publication of the previous text."
Callimahos referred to parts III–VI at the end of the first volume:
"...Part III will deal with varieties of aperiodic substitution systems, elementary cipher devices and cryptomechanisms, and will embrace a detailed treatment of cryptomathematics and diagnostic tests in cryptanalysis; Part IV will treat transposition and fractioning systems, and combined substitution-transposition systems; Part V will treat the reconstruction of codes, and the solution of enciphered code systems, and Part VI will treat the solution of representative machine cipher systems."
However, parts IV–VI were never completed.
Declassification
Both Military Cryptanalytics and Military Cryptanalysis have been subjects of Mandatory Declassification Review (MDR) requests, including one by John Gilmore in 1992-1993 and two by Charles Varga in 2004 and 2016.
All four parts of Military Cryptanalysis and the first two parts of the Military Cryptanalytics series have been declassified. The third part of Military Cryptanalytics was declassified in part in December 2020 and published by GovernmentAttic.org in 2021. In 1984 NSA released copies of Military Cryptanalytics parts I |
https://en.wikipedia.org/wiki/Pan-African%20Ornithological%20Congress | The Pan-African Ornithological Congress (PAOC) is a regular conference on African ornithology, usually held every four years at an African venue. Its geographic scope is:
"...the entire continent from North Africa to the Cape of Good Hope, and east to the Suez Canal and Red Sea; the Cape Verde, Madeira and Canary islands; also isolated Atlantic Oceanic islands nearer Africa than South America, Antarctic islands south of Africa and the African-facing coast of Antarctica, the Seychelles, Comoros, Socotra, Mascarene Islands and Madagascar. All continental shelf islands (e.g. Zanzibar, Fernando Po — now Bioko) are considered areas of interest, as are areas of provenance and intervening routes of migratory birds that visit Africa."
Its aims and purposes are, with regard to African birds, to:
Further their study
Promote their preservation as an integral part of African heritage
Foster their appreciation and discussion in relation to man, and
Disseminate information about them through international meetings (Congresses) and publications (Proceedings)
The constitution also states:
"Of vital importance to this scientific and educational organisation is the opportunity for free and open discussion of African avian biology, birds and their relations to man, and man’s effects on bird populations."
History
The concept of holding a conference focussing on the African avifauna originated in an invitation by Cecily Niven, then President of the South African Ornithological Society (SAOS, later BirdLife South Africa), to the 11th International Ornithological Congress (IOC), in Basel, Switzerland in 1954, to hold the 12th IOC in South Africa. Although the offer was not taken up at the time, it stimulated discussion about holding an independent conference on African birds, leading to the first PAOC in Livingstone, Northern Rhodesia, in 1957.
The first three congresses took place in southern Africa under the auspices of the SAOS, with the third (in Kruger National Park) largel |
https://en.wikipedia.org/wiki/Ergodic%20Ramsey%20theory | Ergodic Ramsey theory is a branch of mathematics where problems motivated by additive combinatorics are proven using ergodic theory.
History
Ergodic Ramsey theory arose shortly after Endre Szemerédi's proof that a set of positive upper density contains arbitrarily long arithmetic progressions, when Hillel Furstenberg gave a new proof of this theorem using ergodic theory. It has since produced combinatorial results, some of which have yet to be obtained by other means, and has also given a deeper understanding of the structure of measure-preserving dynamical systems.
Szemerédi's theorem
Szemerédi's theorem is a result in arithmetic combinatorics, concerning arithmetic progressions in subsets of the integers. In 1936, Erdős and Turán conjectured that every set of integers A with positive natural density contains a k-term arithmetic progression for every k. This conjecture, which became Szemerédi's theorem, generalizes the statement of van der Waerden's theorem. Hillel Furstenberg proved the theorem using ergodic principles in 1977.
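As a small illustrative check (not from the article), van der Waerden's theorem can be verified by brute force in its smallest nontrivial case: W(3, 2) = 9, i.e. every 2-coloring of {1, …, 9} contains a monochromatic 3-term arithmetic progression, while {1, …, 8} admits a 2-coloring avoiding one.

```python
# Brute-force verification that the van der Waerden number W(3, 2) is 9.
from itertools import product

def has_mono_3ap(coloring):
    """True if the coloring (a tuple over positions 0..n-1) contains a
    monochromatic 3-term arithmetic progression i, i+d, i+2d."""
    n = len(coloring)
    for i in range(n):
        for d in range(1, (n - i - 1) // 2 + 1):
            if coloring[i] == coloring[i + d] == coloring[i + 2 * d]:
                return True
    return False

# Every 2-coloring of 9 consecutive integers has a monochromatic 3-term AP...
assert all(has_mono_3ap(c) for c in product([0, 1], repeat=9))
# ...but some 2-coloring of 8 consecutive integers avoids one.
assert not all(has_mono_3ap(c) for c in product([0, 1], repeat=8))
print("W(3, 2) = 9")
```

Szemerédi's theorem strengthens this from colorings to any single color class of positive density.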
See also
IP set
Piecewise syndetic set
Ramsey theory
Syndetic set
Thick set |
https://en.wikipedia.org/wiki/Barbanel%E2%80%93Brams%20moving-knives%20procedure | The Barbanel–Brams rotating-knife procedure is a procedure for envy-free cake-cutting of a cake among three partners. It makes only two cuts, so each partner receives a single connected piece.
Its main advantage over the earlier Stromquist moving-knives procedure is that it requires only two moving knives, instead of four. The earlier Robertson–Webb rotating-knife procedure requires only one moving knife, but it works only for a two-dimensional cake, while the Barbanel–Brams procedure works also for a one-dimensional cake.
Procedure
Initially, each partner marks a point such that the cake to its left is worth exactly 1/3 to them. The leftmost mark is selected. Suppose this mark belongs to Alice. Alice is then asked to mark another point such that the cake to its left is worth exactly 2/3 to her. So now the cake is divided into three pieces that are equal for Alice.
Bob and Carl are asked to evaluate the two rightmost pieces. There are several cases:
Each of Bob and Carl prefers a different piece. Then, each receives his preferred piece, and Alice gets the leftmost piece.
Both Bob and Carl prefer the middle piece. Alice places two knives in the two endpoints of the middle piece and moves them inwards simultaneously, such that the two external pieces remain equal in her eyes. The value of the middle piece shrinks until, at some point, either Bob or Carl thinks it is equal to an external piece. The first that thinks so shouts "stop" and receives an external piece; Alice receives the other external piece and the non-shouter receives the middle piece.
Both Bob and Carl prefer the rightmost piece. Alice places two knives in the two endpoints of the middle piece and moves them rightwards simultaneously, such that the two leftmost pieces remain equal in her eyes. The value of the rightmost piece shrinks until, at some point, either Bob or Carl thinks it is equal to one of the leftmost pieces. The first that thinks so shouts "stop" and receives a leftmost piece; Alic |
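The first stage of the procedure (marking, selecting the divider, and classifying the other partners' preferences) can be sketched numerically. The function names and the example valuation densities below are illustrative, not from the source; with these densities, both non-dividers prefer the rightmost piece, which is the third case above.

```python
# Illustrative numerical sketch of the first stage of the Barbanel–Brams
# procedure. Valuations are density functions on the interval [0, 1];
# all names below are hypothetical.

def value(density, a, b, n=1000):
    """Approximate the value of the piece [a, b] by the midpoint rule."""
    if b <= a:
        return 0.0
    h = (b - a) / n
    return sum(density(a + (i + 0.5) * h) for i in range(n)) * h

def cut_at(density, target, lo=0.0, hi=1.0, iters=60):
    """Find x such that the piece [0, x] is worth `target`, by bisection."""
    for _ in range(iters):
        mid = (lo + hi) / 2
        if value(density, 0.0, mid) < target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def first_stage(densities):
    """Return the divider, their two cut points, and the others' preferences."""
    # Each partner marks the point where the left piece is worth 1/3 to them;
    # the partner with the leftmost mark becomes the divider ("Alice" in the text).
    marks = {name: cut_at(d, value(d, 0, 1) / 3) for name, d in densities.items()}
    divider = min(marks, key=marks.get)
    d = densities[divider]
    x1 = marks[divider]
    x2 = cut_at(d, 2 * value(d, 0, 1) / 3)   # divider's 2/3 point
    prefs = {}
    for name, dens in densities.items():
        if name == divider:
            continue
        middle, right = value(dens, x1, x2), value(dens, x2, 1)
        prefs[name] = "middle" if middle > right else "right"
    return divider, (x1, x2), prefs

valuations = {"Alice": lambda x: 1.0,          # uniform valuation
              "Bob":   lambda x: 2 * x,        # values the right end more
              "Carl":  lambda x: 2 * (1 - x)}  # values the left end more
print(first_stage(valuations))
```

The two moving-knife cases would then be resolved by a further one-parameter search over Alice's simultaneous knife positions.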
https://en.wikipedia.org/wiki/Nucleoside%20analogue | Nucleoside analogues are structural analogues of a nucleoside, which normally contain a nucleobase and a sugar. Nucleotide analogues are analogues of a nucleotide, which normally has one to three phosphates linked to a nucleoside. Both types of compounds can deviate from what they mimic in a number of ways, as changes can be made to any of the constituent parts (nucleobase, sugar, phosphate). They are related to nucleic acid analogues.
Nucleoside and nucleotide analogues can be used in therapeutic drugs, including a range of antiviral products used to prevent viral replication in infected cells. The most commonly used is acyclovir.
Nucleotide and nucleoside analogues can also be found naturally. Examples include ddhCTP (3′-deoxy-3′,4′-didehydro-CTP), produced by the human antiviral protein viperin, and sinefungin (an S-adenosylmethionine analogue) produced by some Streptomyces.
Function
These agents can be used against hepatitis B virus, hepatitis C virus, herpes simplex, and HIV. Once they are phosphorylated, they work as antimetabolites by being similar enough to nucleotides to be incorporated into growing DNA strands; but they act as chain terminators and stop viral DNA polymerase. They are not specific to viral DNA and also affect mitochondrial DNA. Because of this they have side effects such as bone marrow suppression.
There is a large family of nucleoside analogue reverse transcriptase inhibitors (NRTIs): DNA production by reverse transcriptase is very different from normal human DNA replication, so it is possible to design nucleoside analogues that are preferentially incorporated by the former. Some nucleoside analogues, however, can function both as NRTIs and polymerase inhibitors for other viruses (e.g., hepatitis B).
Less selective nucleoside analogues are used as chemotherapy agents to treat cancer, e.g. gemcitabine. They are also used as antiplatelet drugs to prevent the formation of blood clots, e.g. ticagrelor and cangrelor.
Resistance
Resistance can de |
https://en.wikipedia.org/wiki/Mori-Zwanzig%20formalism | The Mori–Zwanzig formalism, named after the physicists Hajime Mori and Robert Zwanzig, is a method of statistical physics. It allows the splitting of the dynamics of a system into a relevant and an irrelevant part using projection operators, which helps to find closed equations of motion for the relevant part. It is used e.g. in fluid mechanics or condensed matter physics.
Idea
Macroscopic systems with a large number of microscopic degrees of freedom are often well described by a small number of relevant variables, for example the magnetization in a system of spins. The Mori–Zwanzig formalism allows the finding of macroscopic equations that only depend on the relevant variables based on microscopic equations of motion of a system, which are usually determined by the Hamiltonian. The irrelevant part appears in the equations as noise. The formalism does not determine what the relevant variables are; these can typically be obtained from the properties of the system.
The observables describing the system form a Hilbert space. The projection operator then projects the dynamics onto the subspace spanned by the relevant variables. The irrelevant part of the dynamics then depends on the observables that are orthogonal to the relevant variables. A correlation function is used as a scalar product, which is why the formalism can also be used for analyzing the dynamics of correlation functions.
Derivation
A not explicitly time-dependent observable A obeys the Heisenberg equation of motion
dA/dt = iℒA,
where the Liouville operator ℒ is defined using the commutator, iℒA = (i/ħ)[H, A], in the quantum case and using the Poisson bracket, iℒA = {A, H}, in the classical case. We assume here that the Hamiltonian H does not have explicit time-dependence. The derivation can also be generalized towards time-dependent Hamiltonians. This equation is formally solved by
A(t) = e^(iℒt) A(0).
The projection operator P acting on an observable X is defined as
PX = (X, A)(A, A)⁻¹ A,
where A is the relevant variable (which can also be a vector of various observables), and (·, ·) is some sc |
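The derivation is cut off in this excerpt; in the standard presentation it arrives at the Mori generalized Langevin equation. As a hedged sketch in the usual notation (conventions vary between authors; this is not reconstructed from the text above):

```latex
% Standard Mori–Zwanzig result (sketch; sign and normalization
% conventions differ between references).
\begin{aligned}
\frac{d}{dt}A(t) &= i\Omega\,A(t) \;-\; \int_0^{t} K(s)\,A(t-s)\,ds \;+\; F(t),\\
i\Omega &= (A,\, i\mathcal{L}A)\,(A, A)^{-1},\\
F(t) &= e^{(1-P)\,i\mathcal{L}\,t}\,(1-P)\,i\mathcal{L}A,\\
K(t) &= (F,\, F(t))\,(A, A)^{-1}.
\end{aligned}
```

Here F(t) is the noise term, which stays orthogonal to the relevant variable for all times, and K(t) is the memory kernel, related to the noise by a fluctuation–dissipation relation.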
https://en.wikipedia.org/wiki/Astrodatabank | Astrodatabank is a wiki website containing a collection of astrological data. The freely accessible database features the birth details and associated birth charts of public figures and mundane events. The collection was started by astrologer Lois Rodden in 1979. Astrodatabank is currently owned and maintained by the Swiss company Astrodienst and is published in English.
History
In 1979, Lois Rodden started publishing birth data with her book, Profiles of Women, which eventually led to the formation of Astrodatabank. Mark McDonough developed the Astrodatabank database using the astrology birth data collected by Rodden over 40 years of research. Six months before she died, Rodden named Pat Taglilatelo as her successor. The data collection of another lifelong collector, Edwin Charles Steinbrecher, was integrated into the database during the next two years.
In July 2005, McDonough gave Richard Smoot ownership of the company. In 2008, Astrodatabank was bought by Astrodienst AG, the Swiss company of Alois Treindl, converted into a wiki project, and made freely accessible to all.
Astrodatabank as a wiki was released on 12 March 2009, in English, with 72,271 pages. The names Astrodatabank, Astro-Databank, AstroDatabank, and ADB all refer to one and the same project.
Application and Usage
The contents and data of the wiki have been recommended for research related to astrological studies. The project has been referred to by the National Council for Geocosmic Research for astrological research and has been recommended by the Astrological Association of Great Britain as a large collection of verified astrological charts and a useful resource for scientific research.
Researchers have imported Astrodatabank birth chart data into the Statistical Package for the Social Sciences (SPSS) for analysis. They claim that Astrodatabank sets the standards for rigorous astrological methodology and meet social scientific research d |
https://en.wikipedia.org/wiki/Timer%20coalescing | Timer coalescing is a computer system energy-saving technique that reduces central processing unit (CPU) power consumption by reducing the precision of software timers used for synchronization of process wake-ups, minimizing the number of times the CPU is forced to perform the relatively power-costly operation of entering and exiting idle states.
Implementations of timer coalescing
The Linux kernel gained support for deferrable timers in 2.6.22 and for controllable "timer slack" for threads in 2.6.28, allowing timer coalescing.
Timer coalescing has been a feature of Microsoft Windows from Windows 7 onward.
Apple's XNU-kernel-based OS X gained support as of OS X Mavericks.
FreeBSD supports it since September 2010.
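On Linux, the per-thread timer slack mentioned above can be read and set via prctl(2). A hedged ctypes sketch (Linux-only; the constant values are copied from <linux/prctl.h>):

```python
# Linux-only sketch: adjust this thread's timer slack via prctl(2).
# A wider slack lets the kernel defer and batch timer expirations,
# coalescing nearby wake-ups to save power.
import ctypes

PR_SET_TIMERSLACK = 29   # from <linux/prctl.h>
PR_GET_TIMERSLACK = 30

# Loading the main program's namespace resolves libc symbols on Linux.
libc = ctypes.CDLL(None, use_errno=True)

def set_timer_slack_ns(ns):
    """Allow the kernel to delay this thread's timers by up to `ns` nanoseconds."""
    if libc.prctl(PR_SET_TIMERSLACK, ctypes.c_ulong(ns), 0, 0, 0) != 0:
        raise OSError(ctypes.get_errno(), "prctl(PR_SET_TIMERSLACK) failed")

def get_timer_slack_ns():
    """Return the current timer slack of the calling thread, in nanoseconds."""
    return libc.prctl(PR_GET_TIMERSLACK, 0, 0, 0, 0)

set_timer_slack_ns(50_000_000)   # tolerate up to 50 ms of timer deferral
print(get_timer_slack_ns())
```

The default slack on Linux is 50 µs; background or power-sensitive threads can raise it so their wake-ups line up with others.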
See also
Advanced Configuration and Power Interface (ACPI)
Advanced Programmable Interrupt Controller (APIC)
High Precision Event Timer (HPET)
HLT (x86 instruction)
Interrupt coalescing
Programmable interval timer
Time Stamp Counter (TSC) |
https://en.wikipedia.org/wiki/Barrel%20cactus | Barrel cacti are various members of the two genera Echinocactus and Ferocactus, endemic to the deserts of Southwestern North America southward to north central Mexico. Some of the largest specimens are found in the Sonoran Desert.
Description
Some species of barrel cactus reach over in height at maturity, and have been known to reach in some regions. The ribs are numerous and pronounced, and the spines are long and can range in color from yellow to tan to red, depending on the age of the plant and the species. Flowers appear at the top of the plant only after many years. The barrel cactus can live to be over 100 years old.
Barrel cactus buds typically start to bloom in April with a bright yellow or orange flower. Pink and red varieties also exist but occur less frequently. The flowers only appear on the very top of the plant. As the flowers begin to wilt in early May, they may change color. A late summer desert rainstorm can produce a late bloom, as shown in the photograph below of the orange-flowered variety (it bloomed two days after a hurricane in mid-August and continued to bloom through the end of September).
Fruit
As the flowers wilt away, small pineapple-shaped greenish fruit may form. Left untouched, the fruit has been known to last a full calendar year. The fruit can be easily removed but is not usually consumed because it is fairly dry and bitter.
Facts
Native Americans collected the fruit as emergency food during extreme drought conditions.
The Seri people distinguished three species of barrel cactus:
Saguaro barrel cactus — Ferocactus cylindraceus
Siml caacöl (big barrel cactus), siml cöquicöt (killer barrel cactus) — Ferocactus emoryi
Siml áa (true barrel cactus) — Ferocactus wislizeni
In Mexico the flesh of the barrel cactus is candied and eaten as a treat.
Cultivation
Barrel cacti are cultivated by plant nurseries as an ornamental plant. They are considered easy to grow and relatively fast growing. They may produce round offshoots.
Barr |
https://en.wikipedia.org/wiki/Ruaniaceae | Ruaniaceae is an Actinomycete family with two monotypic genera.
Phylogeny
The currently accepted taxonomy is based on the List of Prokaryotic names with Standing in Nomenclature (LPSN) and National Center for Biotechnology Information (NCBI)
and the phylogeny is based on 16S rRNA-based LTP release 106 by The All-Species Living Tree Project |
https://en.wikipedia.org/wiki/Beutenbergiaceae | Beutenbergiaceae is an Actinomycete family.
Phylogeny
The currently accepted taxonomy is based on the List of Prokaryotic names with Standing in Nomenclature (LPSN) and National Center for Biotechnology Information (NCBI)
and the phylogeny is based on 16S rRNA-based LTP release 106 by The All-Species Living Tree Project: |
https://en.wikipedia.org/wiki/SDD-AGE | In biochemistry and molecular biology, SDD-AGE is short for Semi-Denaturing Detergent Agarose Gel Electrophoresis. This is a method for detecting and characterizing large protein polymers which are stable in 2% SDS at room temperature, unlike most large protein complexes. This method is very useful for studying prions and amyloids, which are characterized by the formation of proteinaceous polymers. Agarose is used for the gel since the SDS-resistant polymers are large (in the 200-4000+ kDa range) and cannot enter a conventional polyacrylamide gel, which has small pores. Agarose, on the other hand, has large pores, which allows for the separation of polymers.
Use of this method allowed researchers to understand that at least some types of prion aggregates existed in a two-level structure: protein molecules are grouped into polymers, which are very stable and withstand treatment with 2% SDS at room temperature, and the polymers are in turn bundled into aggregates, which dissociate under these conditions.
Differences in the size of polymers can indicate the efficiency of polymer fragmentation in vivo.
History
The method was created in the Molecular Genetics laboratory of the Russian Cardiology Research Institute and was published in 2003 by Kryndushkin et al. The original method used a TAE buffering system and incorporated a modified vacuum blotting system for the transfer of proteins onto a membrane (originally PVDF). The modified vacuum blotting system is actually a vacuum-assisted capillary transfer, since the vacuum only helps fluid that has already gone through the gel and membrane to leave the system.
Variations
Other modifications have also been used, such as the one described in Bagriantsev et al., using traditional wet transfer and a TGB buffering system, and others using semi-dry transfer or capillary transfer.
DD-AGE, a further variation of the method that uses fully denaturing conditions - including reducing agents such as dithiothreitol (DTT) and heat denaturat |
https://en.wikipedia.org/wiki/Science%20Translational%20Medicine | Science Translational Medicine is an interdisciplinary biomedical journal established in October 2009 by the American Association for the Advancement of Science.
It publishes basic, biomedical, translational, and clinical research about human diseases. According to Web of Science, the journal has a 2021 impact factor of 19.319.
Abstracting and indexing
The journal is abstracted and indexed by the major services with a focus on medicine and biology, including Science Citation Index & Web of Science, Index Medicus/MEDLINE/PubMed, and Scopus |
https://en.wikipedia.org/wiki/Riken | Riken is a large scientific research institute in Japan. Founded in 1917, it now has about 3,000 scientists on seven campuses across Japan, including the main site at Wakō, Saitama Prefecture, just outside Tokyo. Riken is a Designated National Research and Development Institute, and was formerly an Independent Administrative Institution.
Riken conducts research in many areas of science, including physics, chemistry, biology, genomics, medical science, engineering, and high-performance computing and computational science, ranging from basic research to practical applications, with 485 partners worldwide. It is almost entirely funded by the Japanese government, and its annual budget is about ¥88 billion (US$790 million).
Name
"Riken" is an acronym of the formal name , and its full name in Japanese is and in English is the Institute of Physical and Chemical Research.
History
In 1913, the well-known scientist Jokichi Takamine first proposed the establishment of a national science research institute in Japan. This task was taken on by Viscount Shibusawa Eiichi, a prominent businessman, and following a resolution by the Diet in 1915, Riken came into existence in March 1917. In its first incarnation, Riken was a private foundation (zaidan), funded by a combination of industry, the government, and the Imperial Household. It was located in the Komagome district of Tokyo, and its first director was the mathematician Baron Dairoku Kikuchi.
In 1927, Viscount Masatoshi Ōkōchi, the third director, established the Riken Concern (a zaibatsu). This was a group of spin-off companies that used Riken's scientific achievements for commercial ends and returned the profits to Riken. At its peak in 1939 the zaibatsu comprised about 121 factories and 63 companies, including Riken Kankōshi, which is now Ricoh.
During World War II, the Japanese army's atomic bomb program was conducted at Riken. In April 1945 the US bombed Riken's laboratories in Komagome, and in November, after the end of the |
https://en.wikipedia.org/wiki/Phase%20transition | In chemistry, thermodynamics, and other related fields, a phase transition (or phase change) is the physical process of transition between one state of a medium and another. Commonly the term is used to refer to changes among the basic states of matter: solid, liquid, and gas, and in rare cases, plasma. A phase of a thermodynamic system and the states of matter have uniform physical properties. During a phase transition of a given medium, certain properties of the medium change as a result of the change of external conditions, such as temperature or pressure. This can be a discontinuous change; for example, a liquid may become gas upon heating to its boiling point, resulting in an abrupt change in volume. The identification of the external conditions at which a transformation occurs defines the phase transition point.
Types of phase transition
States of matter
Phase transitions commonly refer to when a substance transforms from one of the four states of matter to another. At the phase transition point for a substance, for instance the boiling point, the two phases involved (liquid and vapor) have identical free energies and are therefore equally likely to exist. Below the boiling point, the liquid is the more stable state of the two, whereas above the boiling point the gaseous form is the more stable.
Common transitions between the solid, liquid, and gaseous phases of a single component, due to the effects of temperature and/or pressure are identified in the following table:
For a single component, the most stable phase at different temperatures and pressures can be shown on a phase diagram. Such a diagram usually depicts states in equilibrium. A phase transition usually occurs when the pressure or temperature changes and the system crosses from one region to another, like water turning from liquid to solid as soon as the temperature drops below the freezing point. In exception to the usual case, it is sometimes possible to change the state of a system dia |
https://en.wikipedia.org/wiki/Flora%20Graeca | Flora Graeca is a publication of the plants of Greece in the late 18th century, resulting from a survey by John Sibthorp and Ferdinand Bauer. The botanical descriptions and illustrations became highly valued by the English audience; the finely crafted and illustrated work was of both scientific and horticultural interest.
Sibthorp met the botanical illustrator Bauer in Vienna, where he had made a voyage to study a copy of Dioscorides' early botanical work, the renowned Vienna Dioscurides. This was the first part of a journey, to identify medicinal plants used in Greece; Bauer was to join the expedition as the illustrator. They were to record and collect a large number of novel specimens; their publication introduced these to an English audience. From March 1786 to December 1787 they surveyed the plants and animals of the eastern Mediterranean, Sibthorp collecting and describing, Bauer making dried specimens and producing colour-coded sketches. Bauer's work, including around a thousand intricate and annotated sketches, is now regarded as one of the finest examples of botanical illustration.
Sibthorp's volumes were to become a botanical publication: the original intention to produce a herbal or medical volume was transformed into a scientific survey. An accompanying volume, Fauna Graeca, and other planned works on the region were not realised.
Sibthorp assembled the descriptions and plates; at his death in 1796, his will included an endowment to see the book published. The task of preparing the works was undertaken by James Edward Smith, who issued the two volumes of the Prodromus in 1806 and 1813, and six volumes as Flora Graeca Sibthorpiana between 1806 and 1828. The seventh appeared in 1830, after Smith's death, and the remaining three were produced by John Lindley between 1833 and 1840.
Each volume contained a hundred plates, except the last, and these were engraved by James Sowerby. Only 30 copies of this set were issued, another 50 complete sets were rei |
https://en.wikipedia.org/wiki/FastSpring | Bright Market, LLC, dba FastSpring, is a software as a service (SaaS) company that offers a full service e-commerce platform for companies that sell software and other online digital products.
History
Founded in 2005 by Dan Engel, Ken White, Jason Foodman and Ryan Dewell, the company is based in Santa Barbara, California. The four founders put in a combined $30,000 to launch the company. FastSpring initially focused on companies selling desktop software and downloadable games, before moving into SaaS in 2011. In March 2011, FastSpring launched its subscription e-commerce platform for subscription-based businesses to manage online subscription payments. The company received its first outside investment in April 2013 for an estimated $12 million from Pylon Capital. The company's revenue would grow from less than $1 million in 2007 to over $100 million by 2013. In 2018, Accel-KKR purchased a majority stake in FastSpring.
FastSpring was recognized for a Silver Stevie Award for Customer Service Department of the Year in 2013 and again in 2014 and 2015. It is a five-time Inc. 5000 honoree, having been ranked as high as #41 in 2010.
In July 2018, Sian Wang was named CFO of the company. In March 2019, David Nachman was named CEO, succeeding Chris Lueck. Nachman was formerly CEO of government technology company Vision, and chief business officer of Velocify, a SaaS CRM business.
Products and services
FastSpring's platform offers digital commerce products for software, cloud-based, and as-a-service businesses, supporting a variety of digital products and distribution models. It enables purchases and subscriptions for desktop, mobile, and apps.
Global Online Payments
Subscription Management + Billing
Branded Checkout
Global Taxes + Financial Services
Risk Management + Compliance
Integrations |
https://en.wikipedia.org/wiki/Trilinear%20polarity | In Euclidean geometry, trilinear polarity is a certain correspondence between the points in the plane of a triangle not lying on the sides of the triangle and lines in the plane of the triangle not passing through the vertices of the triangle. "Although it is called a polarity, it is not really a polarity at all, for poles of concurrent lines are not collinear points." It was Jean-Victor Poncelet (1788–1867), a French engineer and mathematician, who introduced the idea of the trilinear polar of a point in 1865.
Definitions
Let ABC be a plane triangle and let P be any point in the plane of the triangle not lying on the sides of the triangle. Briefly, the trilinear polar of P is the axis of perspectivity of the cevian triangle of P and the triangle ABC.
In detail, let the lines AP, BP, CP meet the sidelines BC, CA, AB at D, E, F respectively. Triangle DEF is the cevian triangle of P with reference to triangle ABC. Let the pairs of lines (BC, EF), (CA, FD), (AB, DE) intersect at X, Y, Z respectively. By Desargues' theorem, the points X, Y, Z are collinear. The line of collinearity is the axis of perspectivity of triangle ABC and triangle DEF. The line XYZ is the trilinear polar of the point P.
The points X, Y, Z can also be obtained as the harmonic conjugates of D, E, F with respect to the pairs of points (B, C), (C, A), (A, B) respectively. Poncelet used this idea to define the concept of trilinear polars.
If the line L is the trilinear polar of the point P with respect to the reference triangle ABC then P is called the trilinear pole of the line L with respect to the reference triangle ABC.
Trilinear equation
Let the trilinear coordinates of the point P be (p : q : r). Then the trilinear equation of the trilinear polar of P is x/p + y/q + z/r = 0.
Construction of the trilinear pole
Let the line L meet the sides BC, CA, AB of triangle ABC at X, Y, Z respectively. Let the pairs of lines (BY, CZ), (CZ, AX), (AX, BY) meet at U, V, W. Triangles UVW and ABC are in perspective and let P be the center of perspectivity. P is the trilinear pole of the line L.
Some trilinear polars
Some of the trilinear polars are well known.
The trilinear polar of the centroid of triangle ABC is the line at infini |
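As a small numerical check (illustrative, not from the article): in trilinear coordinates the polar of P = (p : q : r) is the line x/p + y/q + z/r = 0, i.e. the line with coefficients (1/p, 1/q, 1/r), and for the centroid (whose trilinears are 1/a : 1/b : 1/c) this is proportional to the line at infinity ax + by + cz = 0. All names below are hypothetical.

```python
# Verify that the trilinear polar of the centroid is the line at infinity.
from math import isclose

def trilinear_polar(p, q, r):
    """Line coefficients of the trilinear polar of the point (p : q : r)."""
    return (1 / p, 1 / q, 1 / r)

def proportional(u, v):
    """True if the two coefficient triples define the same projective line."""
    t = u[0] / v[0]
    return all(isclose(ui, t * vi) for ui, vi in zip(u, v))

a, b, c = 7.0, 5.0, 4.0              # side lengths of a reference triangle
centroid = (b * c, c * a, a * b)     # trilinears of the centroid, (1/a : 1/b : 1/c)
line_at_infinity = (a, b, c)         # a*x + b*y + c*z = 0 in trilinears

assert proportional(trilinear_polar(*centroid), line_at_infinity)
print("polar of centroid = line at infinity")
```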
https://en.wikipedia.org/wiki/Lipozyme | Lipozyme, a registered trademark of Novo Nordisk A/S Corp., is a class of industrial enzymes, specifically: lipases.
Lipozymes can be differentiated by origin: they can be extracted from Mucor miehei, Thermomyces lanuginosus, Candida antarctica, and others. For industrial purposes, they can be immobilized on macroporous ion-exchange resins. Lipases like Lipozyme and Novozyme (a registered trademark of Novozymes) play a big role in the synthesis of biodiesel. Lipozyme is also offered as a food supplement clad in capsules. It comes in different activities, measured e.g. in IUN/g or KLU/g (IUN = Interesterification Unit, K = Kilo, LU = Lipase unit). |
https://en.wikipedia.org/wiki/Deep%20palmar%20branch%20of%20ulnar%20artery | The deep palmar branch of ulnar artery (deep volar branch, profunda branch) passes between the Abductor digiti minimi and Flexor digiti minimi brevis and through the origin of the Opponens digiti minimi; it anastomoses with the radial artery, and completes the deep volar arch.
See also
Deep branch of ulnar nerve |
https://en.wikipedia.org/wiki/Gallery%20of%20flags%20of%20dependent%20territories | This overview contains the flags of dependent territories and other areas of special sovereignty.
Australia
Chile
China
Denmark
Finland
France
Overseas collectivities and territory
Overseas departments and regions
Netherlands
Constituent countries
Special municipalities
New Zealand
Dependent territories
Special territorial authority
Portugal
Spain
United Kingdom
British Overseas Territories
Saint Helena, Ascension and Tristan da Cunha
Crown Dependencies
United States
See also
Armorial of dependent territories
Armorial of sovereign states
Flags of micronations
Gallery of sovereign state flags
List of country subdivision flags
List of former sovereign states
Lists of city flags
Lists and galleries of flags |
https://en.wikipedia.org/wiki/Cartan%27s%20theorem | Cartan's theorem may refer to several mathematical results by Élie Cartan:
Closed-subgroup theorem, 1930, that any closed subgroup of a Lie group is a Lie subgroup
Theorem of the highest weight, that the irreducible representations of Lie algebras or Lie groups are classified by their highest weights
Lie's third theorem, an equivalence between Lie algebras and simply-connected Lie groups
See also
Cartan's theorems A and B, c.1931 results by Henri Cartan concerning a coherent sheaf on a Stein manifold
Cartan's lemma, several results by Élie or Henri Cartan
Cartan–Dieudonné theorem, a result on orthogonal transformations and reflections |
https://en.wikipedia.org/wiki/Distributivity%20%28order%20theory%29 | In the mathematical area of order theory, there are various notions of the common concept of distributivity, applied to the formation of suprema and infima. Most of these apply to partially ordered sets that are at least lattices, but the concept can in fact reasonably be generalized to semilattices as well.
Distributive lattices
Probably the most common type of distributivity is the one defined for lattices, where the formation of binary suprema and infima provides the total operations of join (∨) and meet (∧). Distributivity of these two operations is then expressed by requiring that the identity
x ∧ (y ∨ z) = (x ∧ y) ∨ (x ∧ z)
hold for all elements x, y, and z. This distributivity law defines the class of distributive lattices. Note that this requirement can be rephrased by saying that binary meets preserve binary joins. The above statement is known to be equivalent to its order dual
x ∨ (y ∧ z) = (x ∨ y) ∧ (x ∨ z)
such that one of these properties suffices to define distributivity for lattices. Typical examples of distributive lattices are totally ordered sets, Boolean algebras, and Heyting algebras. Every finite distributive lattice is isomorphic to a lattice of sets, ordered by inclusion (Birkhoff's representation theorem).
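As a concrete instance (an illustrative example, not drawn from the article text): the divisors of a fixed integer, ordered by divisibility, form a lattice with meet = gcd and join = lcm, and this lattice is distributive. A brute-force check:

```python
from math import gcd
from itertools import product

def lcm(a, b):
    return a * b // gcd(a, b)

# Divisors of 60, ordered by divisibility: meet is gcd, join is lcm.
divisors = [d for d in range(1, 61) if 60 % d == 0]

# Verify x ∧ (y ∨ z) == (x ∧ y) ∨ (x ∧ z) for every triple.
for x, y, z in product(divisors, repeat=3):
    assert gcd(x, lcm(y, z)) == lcm(gcd(x, y), gcd(x, z))
print("distributive on the divisors of 60")
```

The dual identity holds automatically, since (as noted above) the two distributive laws are equivalent for lattices.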
Distributivity for semilattices
A semilattice is a partially ordered set with only one of the two lattice operations, either a meet- or a join-semilattice. Given that there is only one binary operation, distributivity obviously cannot be defined in the standard way. Nevertheless, because of the interaction of the single operation with the given order, the following definition of distributivity remains possible. A meet-semilattice is distributive, if for all a, b, and x:
If a ∧ b ≤ x then there exist a′ and b′ such that a ≤ a′, b ≤ b′ and x = a′ ∧ b′.
Distributive join-semilattices are defined dually: a join-semilattice is distributive, if for all a, b, and x:
If x ≤ a ∨ b then there exist a′ and b′ such that a′ ≤ a, b′ ≤ b and x = a′ ∨ b′.
In either case, a′ and b′ need not be unique.
These definit |
https://en.wikipedia.org/wiki/Knight%20shift | The Knight shift is a shift in the nuclear magnetic resonance (NMR) frequency of a paramagnetic substance, first published in 1949 by the UC Berkeley physicist Walter D. Knight.
For an ensemble of N spins in a magnetic induction field B₀, the nuclear Hamiltonian for the Knight shift is expressed in Cartesian form by
H = −∑_{i=1}^{N} γᵢ ħ Iᵢ · (1 + Kᵢ) · B₀,
where γᵢ is the gyromagnetic ratio for the ith spin, Iᵢ is a vector of the Cartesian nuclear angular momentum operators, and the 3 × 3 matrix Kᵢ is a second-rank tensor similar to the chemical shift shielding tensor.
The Knight shift refers to the relative shift K in NMR frequency for atoms in a metal (e.g. sodium) compared with the same atoms in a nonmetallic environment (e.g. sodium chloride). The observed shift reflects the local magnetic field produced at the sodium nucleus by the magnetization of the conduction electrons. The average local field in sodium augments the applied resonance field by approximately one part per 1000. In nonmetallic sodium chloride the local field is negligible in comparison.
The Knight shift is due to the conduction electrons in metals. They introduce an "extra" effective field at the nuclear site, due to the spin orientations of the conduction electrons in the presence of an external field. This is responsible for the shift observed in the nuclear magnetic resonance. The shift comes from two sources, one is the Pauli paramagnetic spin susceptibility, the other is the s-component wavefunctions at the nucleus.
Depending on the electronic structure, the Knight shift may be temperature dependent. However, in metals which normally have a broad featureless electronic density of states, Knight shifts are temperature independent. |
https://en.wikipedia.org/wiki/Lenovo | Lenovo Group Limited, often shortened to Lenovo ( , ), is a Chinese multinational technology company specializing in designing, manufacturing, and marketing consumer electronics, personal computers, software, business solutions, and related services. Products manufactured by the company include desktop computers, laptops, tablet computers, smartphones, workstations, servers, supercomputers, data storage devices, IT management software, and smart televisions. Its best-known brands include its ThinkPad business line of laptop computers (acquired from IBM), the IdeaPad, Yoga, and Legion consumer lines of laptop computers, and the IdeaCentre and ThinkCentre lines of desktop computers. As of 2021, Lenovo is the world's largest personal computer vendor by unit sales.
Lenovo has operations in over 60 countries and sells its products in around 180 countries. It is incorporated in Hong Kong, with global headquarters in Beijing and operational centres in Singapore and Morrisville, North Carolina, United States. It has research centres in Beijing, Chengdu, Yamato (Kanagawa Prefecture, Japan), Singapore, Shanghai, Shenzhen, and Morrisville, and also has Lenovo NEC Holdings, a joint venture with NEC that produces personal computers for the Japanese market.
History
1984–1993: Founding and early history
Lenovo was founded in Beijing on 1 November 1984 as Legend by a team of engineers led by Liu Chuanzhi and Danny Lui. Initially specializing in televisions, the company migrated towards manufacturing and marketing computers.
Liu Chuanzhi and his group of ten experienced engineers, teaming up with Danny Lui, officially founded Lenovo in Beijing on November 1, 1984, with 200,000 yuan. The Chinese government approved Lenovo's incorporation on the same day. Jia Xufu (贾续福), one of the founders of Lenovo, indicated that the first meeting in preparation for starting the company was held on October 17 the same year. Eleven people, the entirety of t |
https://en.wikipedia.org/wiki/CSM1 | CSM1 (protein name: Csm1p) is a protein that in Saccharomyces cerevisiae strain S288c is encoded by the CSM1 gene. |
https://en.wikipedia.org/wiki/Sci%20Phi%20Journal | Sci Phi Journal is a quarterly online magazine (formerly monthly, with a print option) devoted to publishing science fiction stories and essays "at the intersection between speculative philosophy", anthropology and other humanities, with a particular focus on "fictional non-fiction". The first issue was published in October 2014. Jason Rennie founded and helmed the publication with Ben Zwycky until mid-2017. The quarterly was then briefly managed by Ray Blank, and has been edited by Adam Gerencser and Mariano Martin Rodriguez since January 2019, the pair having relaunched the magazine as a "European project".
In November 2014, a short story by Lou Antonelli featured in the magazine's second issue was nominated for the 2015 Hugo Award for Best Short Story. In 2016, the journal was a finalist for the Hugo Award, and nominated for the Locus Award. At the 2022 EuroCon held in Luxembourg, Sci Phi Journal won the European SF Award for Best Magazine. Cover art and non-fiction essays featured in the magazine were also finalists for the 2022 Utopia Awards.
Notable authors
Notable authors published in the magazine include:
Lou Antonelli
Michael F. Flynn
Andrew Fraknoi
Sean Patrick Hazlett
L. Jagi Lamplighter
Edward M. Lerner
Paul Levinson
Brian Niemeier
Benjamin Rosenbaum
Luís Filipe Silva
Shweta Taneja
Ian Watson |
https://en.wikipedia.org/wiki/KISS%20%28algorithm%29 | KISS (Keep it Simple Stupid) is a family of pseudorandom number generators introduced by George Marsaglia. Starting from 1998 Marsaglia posted on various newsgroups, including sci.math, comp.lang.c, comp.lang.fortran and sci.stat.math, several versions of the generators. All KISS generators combine three or four independent random number generators with a view to improving the quality of randomness. KISS generators produce 32-bit or 64-bit random integers, from which random floating-point numbers can be constructed if desired. The original 1993 generator is based on the combination of a linear congruential generator and of two linear feedback shift-register generators. It has a period of 2⁹⁵, good speed and good statistical properties; however, it fails the LinearComplexity test in the Crush and BigCrush tests of the TestU01 suite. A newer version from 1999 is based on a linear congruential generator, a 3-shift linear feedback shift register and two multiply-with-carry generators. It is 10–20% slower than the 1993 version but has a larger period of 2¹²³ and passes all tests in TestU01. In 2009 Marsaglia presented a version based on 64-bit integers (appropriate for 64-bit processors) which combines a multiply-with-carry generator, a xorshift generator and a linear congruential generator. It has a period of around 2²⁵⁰ (around 10⁷⁵). |
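The 1999 combination of the four component generators can be sketched as follows. This is a hedged reconstruction: the multiplier and shift constants are those widely circulated in Marsaglia's newsgroup posts, and the default seeds are illustrative, not canonical.

```python
MASK = 0xFFFFFFFF  # keep all arithmetic in 32 bits

class KISS99:
    """Sketch of Marsaglia's 1999 KISS generator (constants as commonly
    quoted from his newsgroup posts; seeds here are illustrative)."""

    def __init__(self, z=362436069, w=521288629, jsr=123456789, jcong=380116160):
        self.z, self.w, self.jsr, self.jcong = z, w, jsr, jcong

    def next(self):
        # Two 16-bit multiply-with-carry generators, combined into 32 bits.
        self.z = (36969 * (self.z & 65535) + (self.z >> 16)) & MASK
        self.w = (18000 * (self.w & 65535) + (self.w >> 16)) & MASK
        mwc = ((self.z << 16) + self.w) & MASK
        # 3-shift linear feedback shift register (xorshift).
        self.jsr ^= (self.jsr << 17) & MASK
        self.jsr ^= self.jsr >> 13
        self.jsr ^= (self.jsr << 5) & MASK
        # Linear congruential generator.
        self.jcong = (69069 * self.jcong + 1234567) & MASK
        # Combine the three streams into one 32-bit output.
        return ((mwc ^ self.jcong) + self.jsr) & MASK

rng = KISS99()
print([rng.next() for _ in range(3)])
```

Combining independent generators this way lengthens the period and masks the statistical weaknesses of any single component.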
https://en.wikipedia.org/wiki/Euler%E2%80%93Rodrigues%20formula | In mathematics and mechanics, the Euler–Rodrigues formula describes the rotation of a vector in three dimensions. It is based on Rodrigues' rotation formula, but uses a different parametrization.
The rotation is described by four Euler parameters due to Leonhard Euler. The Rodrigues formula (named after Olinde Rodrigues), a method of calculating the position of a rotated point, is used in some software applications, such as flight simulators and computer games.
Definition
A rotation about the origin is represented by four real numbers, a, b, c, d, such that
a² + b² + c² + d² = 1.
When the rotation is applied, a point at position x = (x₁, x₂, x₃) rotates to its new position
x′ = ( (a² + b² − c² − d²)x₁ + 2(bc − ad)x₂ + 2(bd + ac)x₃,
2(bc + ad)x₁ + (a² − b² + c² − d²)x₂ + 2(cd − ab)x₃,
2(bd − ac)x₁ + 2(cd + ab)x₂ + (a² − b² − c² + d²)x₃ ).
Vector formulation
The parameter a may be called the scalar parameter and ω = (b, c, d) the vector parameter. In standard vector notation, the Rodrigues rotation formula takes the compact form
x′ = x + 2a(ω × x) + 2ω × (ω × x).
Symmetry
The parameters (a, b, c, d) and (−a, −b, −c, −d) describe the same rotation. Apart from this symmetry, every set of four parameters describes a unique rotation in three-dimensional space.
Composition of rotations
The composition of two rotations is itself a rotation. Let (a₁, b₁, c₁, d₁) and (a₂, b₂, c₂, d₂) be the Euler parameters of two rotations. The parameters for the compound rotation (rotation 2 after rotation 1) are as follows:
a = a₁a₂ − b₁b₂ − c₁c₂ − d₁d₂
b = a₁b₂ + b₁a₂ − c₁d₂ + d₁c₂
c = a₁c₂ + c₁a₂ − d₁b₂ + b₁d₂
d = a₁d₂ + d₁a₂ − b₁c₂ + c₁b₂
It is straightforward, though tedious, to check that a² + b² + c² + d² = 1. (This is essentially Euler's four-square identity, also used by Rodrigues.)
Rotation angle and rotation axis
Any central rotation in three dimensions is uniquely determined by its axis of rotation (represented by a unit vector k = (k₁, k₂, k₃)) and the rotation angle φ. The Euler parameters for this rotation are calculated as follows:
a = cos(φ/2), b = k₁ sin(φ/2), c = k₂ sin(φ/2), d = k₃ sin(φ/2).
Note that if φ is increased by a full rotation of 360 degrees, the arguments of sine and cosine only increase by 180 degrees. The resulting parameters are the opposite of the original values, (−a, −b, −c, −d); they represent the same rotation.
In particular, the identity transformation (null rotation, φ = 0) corresponds to parameter values (a, b, c, d) = (1, 0, 0, 0). Rotations of 180 degrees about any axis result in a = 0.
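The vector form of the rotation, x′ = x + 2a(ω × x) + 2ω × (ω × x) with ω = (b, c, d), can be sketched directly (function names here are ours, chosen for illustration):

```python
import math

def cross(u, v):
    """Cross product of two 3-vectors."""
    return (u[1]*v[2] - u[2]*v[1],
            u[2]*v[0] - u[0]*v[2],
            u[0]*v[1] - u[1]*v[0])

def euler_rodrigues_rotate(a, w, x):
    """Rotate point x by Euler parameters (a, w), with w = (b, c, d)."""
    t1 = cross(w, x)
    t2 = cross(w, t1)
    return tuple(x[i] + 2*a*t1[i] + 2*t2[i] for i in range(3))

def params_from_axis_angle(axis, phi):
    """a = cos(phi/2), (b, c, d) = sin(phi/2) times the unit axis."""
    n = math.sqrt(sum(c*c for c in axis))
    s = math.sin(phi / 2) / n
    return math.cos(phi / 2), tuple(s*c for c in axis)

# A 90-degree rotation about the z-axis sends (1, 0, 0) to (0, 1, 0).
a, w = params_from_axis_angle((0, 0, 1), math.pi / 2)
print(euler_rodrigues_rotate(a, w, (1, 0, 0)))
```

Because only the half-angle φ/2 enters the parameters, negating all four parameters leaves the computed rotation unchanged, matching the symmetry described above.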
Connection with quaternions
T |
https://en.wikipedia.org/wiki/Matchless%20%28film%29 | Matchless (also known as Mission Top Secret) is a 1967 Italian science fiction-comedy film directed by Alberto Lattuada. It parodies the spy film genre.
Cast
Patrick O'Neal as Perry 'Matchless' Liston
Princess Ira von Fürstenberg as Arabella
Donald Pleasence as Gregori Andreanu
Henry Silva as Hank Norris
Nicoletta Machiavelli as Tipsy
Sorrell Booke as Colonel Coolpepper
Howard St. John as General Shapiro
Jacques Herlin as O-Chin's Doctor
Reception
In a contemporary review, Variety stated that "if audiences are not yet satiated with spy spoofs, United Artists' Matchless may get an appreciative reception from the action market." The review noted that "the film doesn't take itself too seriously and makes some clever gibes at the expense of military and intelligence powers of all nationalities", but that "uneven acting and technical credits hamper the film's chances of rising above programmer status." |
https://en.wikipedia.org/wiki/Skin%20appendage | Skin appendages (or adnexa of skin) are anatomical skin-associated structures that serve a particular function including sensation, contractility, lubrication and heat loss in animals. In humans, some of the more common skin appendages are hairs (sensation, heat loss, filter for breathing, protection), arrector pili (smooth muscles that pull hairs straight), sebaceous glands (secrete sebum onto hair follicle, which oils the hair), sweat glands (can secrete sweat with strong odour (apocrine) or with a faint odour (merocrine or eccrine)), and nails (protection).
Skin appendages are derived from the skin, and are usually adjacent to it.
Types of appendages include hair, glands, and nails.
Glands
Sweat glands are distributed all over the body except nipples and outer genitals. Although the nipples do have the mammary glands, these are known as modified sweat glands.
Sebaceous glands are typically found opening into the shafts of hair. They are not on the palms of the hands or the soles of the feet. These glands secrete an oily, mildly antibacterial substance known as sebum, which also softens the skin. Their secretory activity is related to hormonal release. Acne occurs when these gland ducts are blocked.
Eccrine (merocrine) glands are the most common. Their secretions are very watery and contain some electrolytes.
Apocrine glands produce a fatty secretion, which gives them a characteristic odour. These are located in the inguinal and axillary regions of the body, and include the mammary glands as well as the ceruminous glands. |
https://en.wikipedia.org/wiki/Translational%20Research%20%28journal%29 | Translational Research: The Journal of Laboratory and Clinical Medicine is a monthly peer-reviewed medical journal covering translational research. It was established in 1915 as The Journal of Laboratory and Clinical Medicine obtaining its current title in 2006. Jeffrey Laurence (Weill Cornell Medical College) has been editor-in-chief since 2006. He was preceded by Dale Hammerschmidt. It is the official journal of the Central Society for Clinical and Translational Research. It is published by Mosby.
Abstracting and indexing
The journal is abstracted and indexed in:
According to the Journal Citation Reports, the journal has a 2014 impact factor of 5.03, ranking it second out of 30 journals in the category "Medical Laboratory Technology", 17th out of 153 journals in the category "Medicine, General & Internal", and 17th out of 123 journals in the category "Medicine, Research & Experimental". |
https://en.wikipedia.org/wiki/Clairaut%27s%20equation | In mathematical analysis, Clairaut's equation (or the Clairaut equation) is a differential equation of the form
y(x) = x·(dy/dx) + f(dy/dx),
where f is continuously differentiable. It is a particular case of the Lagrange differential equation. It is named after the French mathematician Alexis Clairaut, who introduced it in 1734.
Solution
To solve Clairaut's equation, one differentiates with respect to x, yielding
dy/dx = dy/dx + (x + f′(dy/dx))·(d²y/dx²),
so
(x + f′(dy/dx))·(d²y/dx²) = 0.
Hence, either
d²y/dx² = 0
or
x + f′(dy/dx) = 0.
In the former case, dy/dx = C for some constant C. Substituting this into Clairaut's equation, one obtains the family of straight line functions given by
y(x) = Cx + f(C),
the so-called general solution of Clairaut's equation.
The latter case,
x = −f′(dy/dx),
defines only one solution y(x), the so-called singular solution, whose graph is the envelope of the graphs of the general solutions. The singular solution is usually represented using parametric notation, as (x(p), y(p)), where p = dy/dx.
The parametric description of the singular solution has the form
x(p) = −f′(p), y(p) = f(p) − p·f′(p),
where p is a parameter.
Examples
The following curves represent the solutions to two Clairaut's equations:
In each case, the general solutions are depicted in black while the singular solution is in violet.
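As a concrete numerical check (the choice f(p) = p² is ours, for illustration): the general solution is then the line family y = Cx + C², and the singular solution, from the parametric form x(p) = −2p, y(p) = −p², is y = −x²/4. Both satisfy the equation y = x·y′ + (y′)²:

```python
def f(p):
    """Illustrative choice f(p) = p**2 for the Clairaut equation."""
    return p * p

def residual(x, y, dy):
    """Left minus right side of y = x*y' + f(y'); zero for a solution."""
    return y - (x * dy + f(dy))

# General solution: the line y = C*x + C**2, with constant slope y' = C.
C = 3.0
for x in (-2.0, 0.5, 4.0):
    assert abs(residual(x, C * x + C**2, C)) < 1e-12

# Singular solution: y = -x**2/4, with slope y' = -x/2 (the envelope).
for x in (-2.0, 0.5, 4.0):
    assert abs(residual(x, -x**2 / 4, -x / 2)) < 1e-12
print("both the line family and the envelope satisfy the equation")
```

The envelope property is visible here: each line y = Cx + C² touches the parabola y = −x²/4 at exactly the point where x = −2C.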
Extension
By extension, a first-order partial differential equation of the form
u = x·∂u/∂x + y·∂u/∂y + f(∂u/∂x, ∂u/∂y)
is also known as Clairaut's equation.
See also
D'Alembert's equation
Chrystal's equation
Legendre transformation
Notes |
https://en.wikipedia.org/wiki/Moir%C3%A9%20pattern | In mathematics, physics, and art, moiré patterns or moiré fringes are large-scale interference patterns that can be produced when a partially opaque ruled pattern with transparent gaps is overlaid on another similar pattern. For the moiré interference pattern to appear, the two patterns must not be completely identical, but rather displaced, rotated, or have slightly different pitch.
Moiré patterns appear in many situations. In printing, the printed pattern of dots can interfere with the image. In television and digital photography, a pattern on an object being photographed can interfere with the shape of the light sensors to generate unwanted artifacts. They are also sometimes created deliberately – in micrometers they are used to amplify the effects of very small movements.
In physics, its manifestation is wave interference such as that seen in the double-slit experiment and the beat phenomenon in acoustics.
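The beat analogy can be made quantitative. For two overlaid gratings of parallel lines with slightly different pitches p₁ and p₂, the moiré fringe spacing is p₁p₂/|p₂ − p₁| (a standard result, stated here as an addition to this excerpt):

```python
def moire_spacing(p1, p2):
    """Fringe spacing of two overlaid parallel gratings with pitches p1, p2."""
    if p1 == p2:
        raise ValueError("identical pitches produce no moire fringes")
    return p1 * p2 / abs(p2 - p1)

# Pitches of 1.00 mm and 1.05 mm beat into fringes spaced about 21 mm apart,
# a 20x amplification of the 0.05 mm pitch difference.
print(moire_spacing(1.00, 1.05))
```

This amplification of small differences is exactly what micrometer applications of moiré fringes exploit: the closer the two pitches, the larger the fringe spacing.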
Etymology
The term originates from moire (moiré in its French adjectival form), a type of textile, traditionally made of silk but now also made of cotton or synthetic fiber, with a rippled or "watered" appearance. Moire, or "watered textile", is made by pressing two layers of the textile when wet. The similar but imperfect spacing of the threads creates a characteristic pattern which remains after the fabric dries.
In French, the noun moire is in use from the 17th century, for "watered silk". It was a loan of the English mohair (attested 1610). In French usage, the noun gave rise to the verb moirer, "to produce a watered textile by weaving or pressing", by the 18th century. The adjective moiré formed from this verb is in use from at least 1823.
Pattern formation
Moiré patterns are often an artifact of images produced by various digital imaging and computer graphics techniques, for example when scanning a halftone picture or ray tracing a checkered plane (the latter being a special case of aliasing, due to undersampling a fine regular pattern) |
https://en.wikipedia.org/wiki/Jaltomata | Jaltomata is a genus of plants in the family Solanaceae. According to molecular phylogenies, Jaltomata is the sister genus to Solanum, which includes tomato, potato, and eggplant. Jaltomata has a neotropical distribution: species occur from the southwestern United States through Latin America and into the Andean region of South America. Species encompass a wide range of vegetative and reproductive trait variation, including growth habit (trailing herbs, erect herbs, and woody shrubs), floral size, shape and color, as well as fruit size and color. The fruits of some of the species are eaten by humans in Latin and South America. Depending on the species, fruits may be red, green, orange, or dark purple.
Etymology
The name comes from xāltomatl, lit. "sand tomato", the Nahuatl (Aztec) name for the species Jaltomata procumbens (earlier Saracha jaltomata). The Nahuatl X is pronounced like an English SH, but when borrowed into Mexican Spanish and spelled J, the pronunciation is like an English H. Both Mexican and US American botanists pronounce the J this way.
Species
Currently accepted species:
Jaltomata andersonii T. Mione
Jaltomata antillana (Krug & Urb.) D'Arcy
Jaltomata aspera (Ruiz & Pav.) T. Mione & F. G. Coe
Jaltomata atiquipa Mione & S. Leiva G.
Jaltomata auriculata (Miers) Mione
Jaltomata aypatensis S. Leiva, Mione & Quipuscoa
Jaltomata bernardelloana S. Leiva & Mione
Jaltomata bicolor (Ruiz & Pav.) Mione
Jaltomata biflora (Ruiz & Pav.) Benítez
Jaltomata bohsiana Mione & D.M. Spooner
Jaltomata cajacayensis S. Leiva & T. Mione
Jaltomata cajamarca T. Mione
Jaltomata calliantha S. Leiva & T. Mione
Jaltomata chihuahuensis (Bitter) Mione & Bye
Jaltomata confinis (C.V. Morton) J.L. Gentry
Jaltomata contorta (Ruiz & Pav.) Mione
Jaltomata cuyasensis S. Leiva, Quipuscoa & Sawyer
Jaltomata dendroidea S. Leiva & Mione
Jaltomata dentata (Ruiz & Pav.) Benitez
Jaltomata diversa (J.F. Macbr.) Mione
Jaltomata grandiflora (B.L. Rob. & Greenm.) D'Arcy, Mione & Davis |
https://en.wikipedia.org/wiki/Code%20signing | Code signing is the process of digitally signing executables and scripts to confirm the software author and guarantee that the code has not been altered or corrupted since it was signed. The process employs the use of a cryptographic hash to validate authenticity and integrity. Code signing was invented in 1995 by Michael Doyle, as part of the Eolas WebWish browser plug-in, which enabled the use of public-key cryptography to sign downloadable Web app program code using a secret key, so the plug-in code interpreter could then use the corresponding public key to authenticate the code before allowing it access to the code interpreter’s APIs.
Code signing can provide several valuable features. The most common use of code signing is to provide security when deploying; in some programming languages, it can also be used to help prevent namespace conflicts. Almost every code signing implementation will provide some sort of digital signature mechanism to verify the identity of the author or build system, and a checksum to verify that the object has not been modified. It can also be used to provide versioning information about an object or to store other metadata about an object.
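The integrity half of this mechanism, the checksum, can be illustrated with a simplified sketch (this uses only a hash, not a real signing implementation; genuine code signing additionally signs the digest with the publisher's private key so that the identity claim can be verified too):

```python
import hashlib

def digest(artifact: bytes) -> str:
    """Cryptographic hash that a verifier compares against the signed value."""
    return hashlib.sha256(artifact).hexdigest()

code = b"print('hello from a signed artifact')"
published = digest(code)  # in real code signing, this value would be signed

# The verifier recomputes the digest and checks that it matches.
assert digest(code) == published

# Any modification to the code changes the digest, so tampering is detected.
tampered = code + b"  # injected payload"
assert digest(tampered) != published
print("integrity check passed")
```

What the hash alone cannot provide is authenticity: anyone can recompute a digest over modified code, which is why the digest must be bound to the publisher's identity with a public-key signature, as described below.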
The efficacy of code signing as an authentication mechanism for software depends on the security of underpinning signing keys. As with other public key infrastructure (PKI) technologies, the integrity of the system relies on publishers securing their private keys against unauthorized access. Keys stored in software on general-purpose computers are susceptible to compromise. Therefore, it is more secure, and best practice, to store keys in secure, tamper-proof, cryptographic hardware devices known as hardware security modules or HSMs.
Providing security
Many code signing implementations will provide a way to sign the code using a system involving a pair of keys, one public and one private, similar to the process employed by TLS or SSH. For example, in the case of .NET, the developer uses a priv |
https://en.wikipedia.org/wiki/Realia%20%28library%20science%29 | In library classification systems, realia are three-dimensional objects from real life such as coins, tools, and textiles, that do not fit into the traditional categories of library material. They can be either human-made (artifacts, tools, utensils, etc.) or naturally occurring (specimens, samples, etc.), usually borrowed, purchased, or received as donation by a teacher, library, or museum for use in classroom instruction or in exhibits. Archival and manuscript collections often receive items of memorabilia such as badges, emblems, insignias, jewelry, leather goods, needlework, etc., in connection with gifts of personal papers. Most government or institutional archives reject gifts of non-documentary objects unless they have a documentary value. When accepting large bequests of mixed objects they normally have the donors sign legal documents giving permission to the archive to destroy, exchange, sell, or dispose in any way those objects which, according to the best judgement of the archivist, are not manuscripts (which can include typescripts or printouts) or are not immediately useful for understanding the manuscripts. Recently, the usage of this term has been criticized by librarians on the grounds that realia is also used to refer to artistic and historical artifacts and objects; they suggest the phrase "real world object" to describe the broader categories of three-dimensional objects in libraries.
Treatment in libraries
Most libraries usually have a well written, legally tight acquisitions policy that rejects beforehand any object which is not some kind of print or text-based document. There are some exceptions. Children's libraries sometimes have a toy collection, whose individual items are lent out after being cataloged as realia, or under a more specific material designation such as toy, or game. Some large libraries have a special mandate to keep objects related to a literary collection.
Some very large libraries have a public relations departme |
https://en.wikipedia.org/wiki/ACTL7A | Actin-like protein 7A is a protein that in humans is encoded by the ACTL7A gene.
The protein encoded by this gene is a member of a family of actin-related proteins (ARPs) which share significant amino acid sequence identity with conventional actins. Both actins and ARPs have an actin fold, which is an ATP-binding cleft, as a common feature. The ARPs are involved in diverse cellular processes, including vesicular transport, spindle orientation, nuclear migration and chromatin remodeling. This gene (ACTL7A) and the related gene ACTL7B are intronless and are located approximately 4 kb apart in a head-to-head orientation within the familial dysautonomia candidate region on 9q31. Based on mutational analysis of the ACTL7A gene in patients with this disorder, it was concluded that it is unlikely to be involved in the pathogenesis of dysautonomia. The ACTL7A gene is expressed in a wide variety of adult tissues; however, its exact function is not known. |
https://en.wikipedia.org/wiki/Photoconductance%20decay | Photoconductance decay or Photoconductivity decay (PCD or PC), is a non-destructive analytical technique used to measure the lifetime of minority charge carriers in a semiconductor, especially in silicon wafers. The technique studies the transient photoconductivity of a semiconductor sample during or after it is illuminated by a light pulse. Electron–hole pairs are first generated by the light pulse, and the photoconductivity of the sample declines as the carriers recombine.
PCD is an important characterisation step in determining the quality and expected performance of wafers before they are used to fabricate devices such as integrated circuits or solar cells. It is one of the most common methods of determining carrier lifetimes.
PCD uses a fast light source (e.g. a xenon flash lamp) to excite the test sample, causing free carriers to be generated. Excess carriers in the material cause it to become more conductive, and thus the number of excess carriers (Δn) can be measured over time by measuring the material conductivity. Conductivity can be measured through non-contact methods, such as through microwave reflectance, or inductive or capacitive coupling. A higher effective lifetime of minority charge carriers indicates that they can remain mobile in the wafer for a long time period before undergoing recombination.
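The transient analysis can be sketched on synthetic data (assuming a single-exponential decay Δσ(t) ∝ exp(−t/τ), which is an idealization; real wafers can show injection-dependent lifetimes):

```python
import math

tau_true = 100e-6  # assumed effective carrier lifetime: 100 microseconds
times = [i * 10e-6 for i in range(1, 21)]  # 10 us to 200 us after the flash
# Synthetic excess photoconductance after the light pulse, normalized.
sigma = [math.exp(-t / tau_true) for t in times]

# In transient PCD the lifetime is the negative inverse slope of
# ln(sigma) versus t; estimate it from the first and last samples.
slope = (math.log(sigma[-1]) - math.log(sigma[0])) / (times[-1] - times[0])
tau_est = -1.0 / slope
print(f"estimated lifetime: {tau_est * 1e6:.1f} us")
```

With real measurements one would fit the whole logarithmic decay rather than two points, and restrict the fit to the injection range of interest.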
History
Characterisation of minority carrier lifetimes through measurement of photoconductance decay was a technique used by Bell Laboratories as early as 1954 on silicon and germanium wafers during investigation of carrier trapping. A detailed method for measuring PCD was published soon after by MIT Lincoln Laboratory in 1955. A standard method for PCD was described in ASTM standards in 1971 for measurement of minority carrier lifetimes. A new method for quasi-steady-state photoconductance measurements was described in 1996 by Ronald Sinton.
Theory
The difference in dark and excited photoconductivity of the wafer is typically measured through monito |
https://en.wikipedia.org/wiki/PTS-GFL%20superfamily | The phosphotransferases system (PTS-GFL) superfamily is a superfamily of phosphotransferase enzymes that facilitate the transport of glucose, glucitol (G), fructose (F) and lactose (L). Classification has been established through phylogenic analysis and bioinformatics.
The bacterial phosphoenolpyruvate:sugar phosphotransferase system (PTS) transports and phosphorylates its sugar substrates in a single energy-coupled step. This transport process is dependent on several cytoplasmic phosphoryl transfer proteins (Enzyme I, HPr, Enzyme IIA, and Enzyme IIB) as well as the integral membrane sugar permease (IIC). The PTS Enzyme II complexes are derived from four independently evolving PTS Enzyme II complex superfamilies, which include the (1) Glucose (Glc), (2) Mannose (Man), (3) Ascorbate-Galactitol (Asc-Gat) and (4) Dihydroxyacetone (Dha) superfamilies.
The four families that make up the PTS-GFL superfamily include:
4.A.1 – The PTS Glucose-Glucoside (Glc) Family
4.A.2 – The PTS Fructose-Mannitol (Fru) Family
4.A.3 – The PTS Lactose-N,N'-Diacetylchitobiose-β-glucoside (Lac) Family
4.A.4 – The PTS Glucitol (Gut) Family
See also
Phosphotransferases system
Further reading
"TCDB - PTS-GFL Superfamily". www.tcdb.org.
Chang, Abraham B.; Lin, Ron; Keith Studley, W.; Tran, Can V.; Saier, Milton H. (2004-06-01). "Phylogeny as a guide to structure and function of membrane transport proteins". Molecular Membrane Biology 21 (3): 171–181. doi:10.1080/09687680410001720830. ISSN 0968-7688. PMID 15204625. |