source | text |
|---|---|
https://en.wikipedia.org/wiki/Token%20Ring | Token Ring is a physical and data link layer computer networking technology used to build local area networks. It was introduced by IBM in 1984, and standardized in 1989 as IEEE 802.5.
It uses a special three-byte frame called a token that is passed around a logical ring of workstations or servers. This token passing is a channel access method providing fair access for all stations, and eliminating the collisions of contention-based access methods.
Token Ring was a successful technology, particularly in corporate environments, but was gradually eclipsed by later versions of Ethernet. Gigabit Token Ring was standardized in 2001, but development has since stopped.
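The token-passing access method described above can be sketched as a toy round-robin simulation. This is illustrative only; real IEEE 802.5 token frames carry priority, reservation, and monitor bits not modeled here.

```python
# Toy simulation of token-passing channel access: only the station holding
# the token may transmit, so access is fair and collision-free by construction.
from collections import deque

def token_ring(stations, rounds):
    """Pass a token around a logical ring; return the transmit order."""
    ring = deque(stations)
    log = []
    for _ in range(rounds):
        log.append(ring[0])   # the current token holder may transmit one frame
        ring.rotate(-1)       # the token passes to the next station in the ring
    return log

print(token_ring(["A", "B", "C"], 6))  # ['A', 'B', 'C', 'A', 'B', 'C']
```

Contrast this with contention-based access (classic Ethernet), where two stations may transmit at once and must detect and recover from collisions.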
History
A wide range of different local area network technologies were developed in the early 1970s, of which one, the Cambridge Ring, had demonstrated the potential of a token passing ring topology, and many teams worldwide began working on their own implementations. At the IBM Zurich Research Laboratory Werner Bux and Hans Müller, in particular, worked on the design and development of IBM's Token Ring technology, while early work at MIT led to the Proteon 10 Mbit/s ProNet-10 Token Ring network in 1981, the same year that workstation vendor Apollo Computer introduced their proprietary 12 Mbit/s Apollo Token Ring (ATR) network running over 75-ohm RG-6U coaxial cabling. Proteon later evolved a 16 Mbit/s version that ran on unshielded twisted-pair cable.
1985 IBM launch
IBM launched their own proprietary Token Ring product on October 15, 1985. It ran at 4 Mbit/s, and attachment was possible from IBM PCs, midrange computers and mainframes. It used a convenient star-wired physical topology and ran over shielded twisted-pair cabling. Shortly thereafter it became the basis for the IEEE 802.5 standard.
During this time, IBM argued that Token Ring LANs were superior to Ethernet, especially under load, but these claims were debated.
In 1988 the faster 16 Mbit/s Token Ring was standardized by the 802.5 worki |
https://en.wikipedia.org/wiki/JBoss%20operations%20network | JBoss Operations Network (or JBoss ON or JON) is free software/open-source Java EE-based network management software. JBoss Operations Network is part of the JBoss Enterprise Middleware portfolio of software. JBoss ON is an administration and management platform for the development, testing, deployment, and monitoring of the application lifecycle. Because it is Java-based, the JBoss application server operates cross-platform: usable on any operating system that supports Java. JBoss ON was developed by JBoss, now a division of Red Hat.
Product features
JBoss ON provides performance, configuration, and inventory management in order to deploy, manage, and monitor the JBoss middleware portfolio, applications, and services.
JBoss ON provides management of the following:
Discovery and inventory
Configuration management
Application deployment
Perform and schedule actions on servers, applications and services
Availability management
Performance management
Provisioning (IT)
JBoss ON is the downstream of RHQ (see also section Associated Acronyms).
Licensing and pricing
The various JBoss application platforms are open source, but Red Hat charges to provide a support subscription for JBoss Enterprise Middleware.
Associated acronyms
Acronyms associated with JBoss ON:
RHQ - upstream open source project of JBoss ON. The current stable version is RHQ 4.13; the main difference between RHQ 4 and RHQ 3 is the transition of the UI framework to Google Web Toolkit.
Jopr - previously the JBossAS management bits (upstream) of JBoss ON - now integrated into the RHQ source base (since September 2009). Jopr used to use RHQ as its upstream. There will be no more separate Jopr releases.
JON - JBoss Operations Network (ON)
See also
List of JBoss software
Network monitoring system
Comparison of network monitoring systems
HyPerformix IPS Performance Optimizer
IBM Tivoli Framework |
https://en.wikipedia.org/wiki/Sallen%E2%80%93Key%20topology | The Sallen–Key topology is an electronic filter topology used to implement second-order active filters that is particularly valued for its simplicity. It is a degenerate form of a voltage-controlled voltage-source (VCVS) filter topology. It was introduced by R. P. Sallen and E. L. Key of MIT Lincoln Laboratory in 1955.
Explanation of operation
A VCVS filter uses a voltage amplifier with practically infinite input impedance and zero output impedance to implement a 2-pole low-pass, high-pass, bandpass, bandstop, or allpass response. The VCVS filter allows high Q factor and passband gain without the use of inductors. A VCVS filter also has the advantage of independence: VCVS filters can be cascaded without the stages affecting each other's tuning. A Sallen–Key filter is a variation on a VCVS filter that uses a unity-voltage-gain amplifier (i.e., a pure buffer amplifier).
History and implementation
In 1955, Sallen and Key used vacuum tube cathode follower amplifiers; the cathode follower is a reasonable approximation to an amplifier with unity voltage gain. Modern analog filter implementations may use operational amplifiers (also called op amps). Because of its high input impedance and easily selectable gain, an operational amplifier in a conventional non-inverting configuration is often used in VCVS implementations. Implementations of Sallen–Key filters often use an op amp configured as a voltage follower; however, emitter or source followers are other common choices for the buffer amplifier.
Sensitivity to component tolerances
VCVS filters are relatively resilient to component tolerance, but obtaining high Q factor may require extreme component value spread or high amplifier gain. Higher-order filters can be obtained by cascading two or more stages.
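The trade-off between component spread and Q mentioned above can be made concrete with the standard textbook design equations for the unity-gain Sallen–Key low-pass stage. The component names below follow the usual schematic (C1 feeds back to the amplifier output, C2 goes to ground); treat this as a sketch of the well-known results rather than a design tool.

```python
import math

# Standard results for the unity-gain Sallen-Key low-pass:
#   w0 = 1 / sqrt(R1*R2*C1*C2),  Q = sqrt(R1*R2*C1*C2) / (C2*(R1 + R2))
def sallen_key_lowpass(R1, R2, C1, C2):
    w0 = 1.0 / math.sqrt(R1 * R2 * C1 * C2)          # natural frequency, rad/s
    Q = math.sqrt(R1 * R2 * C1 * C2) / (C2 * (R1 + R2))
    return w0 / (2 * math.pi), Q                     # cutoff in Hz, quality factor

# Equal-valued components give the well-known Q = 0.5; raising Q requires
# spreading C1 relative to C2, illustrating the sensitivity noted above.
f0, Q = sallen_key_lowpass(10e3, 10e3, 100e-9, 100e-9)
print(round(f0, 1), Q)   # 159.2 0.5
```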
Generic Sallen–Key topology
The generic unity-gain Sallen–Key filter topology implemented with a unity-gain operational amplifier is shown in Figure 1. The following analysis is based on the assumption that the operat |
https://en.wikipedia.org/wiki/Viral%20protein | The term viral protein refers to both the products of the genome of a virus and any host proteins incorporated into the viral particle. Viral proteins are grouped according to their functions, and groups of viral proteins include structural proteins, nonstructural proteins, regulatory proteins, and accessory proteins. Viruses are non-living and do not have the means to reproduce on their own, instead depending on their host cell's machinery to do this. Thus, viruses do not code for most of the proteins required for their replication and the translation of their mRNA into viral proteins, but use proteins encoded by the host cell for this purpose.
Viral structural proteins
Most viral structural proteins are components for the capsid and the envelope of the virus.
Capsid
The genetic material of a virus is stored within a viral protein structure called the capsid. The capsid is a "shield" that protects the viral nucleic acids from degradation by host enzymes or other destructive agents. It also functions to attach the virion to its host, and to enable the virion to penetrate the host cell membrane. Many copies of a single viral protein, or a number of different viral proteins, make up the capsid, and each of these viral proteins is coded for by one gene from the viral genome. The structure of the capsid allows the virus to use a small number of viral genes to make a large capsid.
Several protomers, oligomeric (viral) protein subunits, combine to form capsomeres, and capsomeres come together to form the capsid. Capsomeres can arrange into an icosahedral, helical, or complex capsid, but in many viruses, such as the herpes simplex virus, an icosahedral capsid is assembled. Three asymmetric and nonidentical viral protein units make up each of the twenty identical triangular faces in the icosahedral capsid.
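The economy of icosahedral assembly can be quantified with Caspar–Klug quasi-equivalence arithmetic, an assumption beyond the text above (which mentions only the 20 triangular faces): a capsid with triangulation number T contains 60T subunits, arranged as 12 pentamers plus 10(T − 1) hexamers.

```python
# Caspar-Klug counting for icosahedral capsids: a small genome builds a large
# shell by repeating identical subunits in quasi-equivalent positions.
def icosahedral_capsid(T):
    subunits = 60 * T            # total protein subunits
    pentamers = 12               # one at each of the 12 icosahedral vertices
    hexamers = 10 * (T - 1)
    return subunits, pentamers + hexamers

# T = 16 reproduces the 162 capsomeres classically reported for the
# herpes simplex virus capsid.
print(icosahedral_capsid(16))   # (960, 162)
```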
Viral envelope
The capsid of some viruses are enclosed in a membrane called the viral envelope. In most cases, the viral envelope is obtained by |
https://en.wikipedia.org/wiki/Isolation%20index | An isolation index is a measure of the segregation of the activities of multiple populations. They have been used in studies of racial segregation and ideological segregation.
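The most common form of the index (the self-exposure form usually attributed to Bell) is the probability that a randomly chosen member of a group shares an areal unit with another member of the same group. A minimal sketch, with x[i] the group count and t[i] the total count in unit i:

```python
# Isolation index: sum over units of (share of the group living in the unit)
# times (the group's share of that unit's total population). Ranges from the
# group's overall population share (even spread) up to 1 (full concentration).
def isolation_index(x, t):
    X = sum(x)
    return sum((xi / X) * (xi / ti) for xi, ti in zip(x, t))

# Two equal-sized units; the group is heavily concentrated in the first:
print(isolation_index([90, 10], [100, 100]))   # 0.82
```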
Examples of isolation indices include Lieberson's isolation index and Bell's isolation index. |
https://en.wikipedia.org/wiki/Burgundy%20mixture | Burgundy mixture, named after the French district where it was first used to treat grapes and vines, is a mixture of copper sulfate and sodium carbonate. This mixture, which can have an overall copper concentration within the range of 1% through 20%, is used as a fungicidal spray for trees and small fruits.
History
Similar to the Bordeaux mixture, one of the earliest fungicides in use, Burgundy mixture, also known as “sal soda Bordeaux”, is applied to plants as a preventive fungicide before fungi have appeared. Bordeaux mixture contains copper(II) sulfate, CuSO4, and hydrated lime, Ca(OH)2, while Burgundy mixture contains copper sulphate, CuSO4, and sodium carbonate, Na2CO3. First used around 1885, Burgundy mixture has since been replaced by synthetic organic compounds, or by compounds that contain copper in a non-reactive, chelated form. This helps to prevent the accumulation of high levels of copper in sediments surrounding the plants.
Synthesis and composition
Burgundy mixture is made by combining dissolved copper sulphate and dissolved sodium carbonate. Dissolved copper sulphate ratios generally range from 1:1 to 1:18. Sodium carbonate is generally added in higher quantities and at a dissolved ratio of 1:1.5. Over time, the sodium carbonate will crystallize out of solution, and the closer the copper sulphate to carbonate mixture is to a 1:1 ratio, the faster this process occurs. This property is one key factor in the general discontinued usage of Burgundy mixture, as the mixture must be mixed shortly before intended utilization.
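The stated 1%–20% copper range can be related to the amount of salt dissolved via standard molar masses. This is back-of-the-envelope arithmetic under the assumption that copper sulfate pentahydrate (CuSO4·5H2O) is used and that a litre of spray weighs roughly 1 kg; it is not a field recipe.

```python
# Mass fraction of copper in copper sulfate pentahydrate, from molar masses.
CU, CUSO4_5H2O = 63.55, 249.69          # g/mol
cu_fraction = CU / CUSO4_5H2O           # ~0.255 of the salt's mass is copper

def grams_salt_per_litre(target_cu_percent):
    """Grams of CuSO4.5H2O per litre of spray for a target overall Cu %."""
    target_g = target_cu_percent / 100 * 1000   # g of Cu per ~1 kg of water
    return target_g / cu_fraction

print(round(grams_salt_per_litre(1.0), 1))   # 39.3
```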
Uses and mode of action
Burgundy mixture is used as a preventive fungicide for trees and small fruits. It works because the Cu(II) ions are capable of interfering with enzymes found within the spores of many fungi, preventing germination from occurring. The mechanism of copper's antifungal action is not well understood, though it is thought that interactions between the copper and negatively charge |
https://en.wikipedia.org/wiki/Ten-code | Ten-codes, officially known as ten signals, are brevity codes used to represent common phrases in voice communication, particularly by law enforcement and in citizens band (CB) radio transmissions. The police version of ten-codes is officially known as the APCO Project 14 Aural Brevity Code.
The codes, developed during 1937–1940 and expanded in 1974 by the Association of Public-Safety Communications Officials-International (APCO), allow brevity and standardization of message traffic. They have historically been widely used by law enforcement officers in North America, but in 2006, due to the lack of standardization, the U.S. federal government recommended they be discontinued in favor of everyday language.
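In software terms a ten-code system is just a lookup table from brevity codes to plain language. The glosses below are a few widely cited examples; as the text notes, meanings were never fully standardized across agencies, so treat them as illustrations rather than authoritative definitions.

```python
# Illustrative ten-code lookup table (example glosses, not an official list).
TEN_CODES = {
    "10-4": "Acknowledgment (message received)",
    "10-9": "Repeat last message",
    "10-20": "What is your location?",
}

def expand(message):
    """Replace any known ten-codes in a message with plain language."""
    return " ".join(TEN_CODES.get(word, word) for word in message.split())

print(expand("10-4 proceeding to scene"))
```

The 2006 federal recommendation effectively argues for skipping the table altogether and transmitting the plain-language right-hand side directly.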
History
APCO first proposed Morse code brevity codes in the June 1935 issue of The APCO Bulletin, which were adapted from the procedure symbols of the U.S. Navy, though these procedures were for communications in Morse code, not voice.
In August 1935, the APCO Bulletin published a recommendation that the organization issue a handbook that described standard operating procedures, including:
A standard message form for use by all police departments.
A simple code for service dispatches relating to corrections, repetitions, etc.
A standard arrangement of the context of messages, (for example, name and description of missing person might be transmitted as follows: Name, age, height, weight, physical characteristics, clothing; if car used, the license, make, description and motor number. This information would actually be transmitted in the text of the message as follows: John Brown 28-5-9-165 medium build brown eyes dark hair dark suit light hat Mich. 35 lic. W 2605 Ford S 35 blue red wheels 2345678 may go to Indiana).
A standard record system for logging the operation of the station.
Other important records in accordance with the uniform crime reporting system sponsored by the International Association of Chiefs of Police.
The development of the APCO Ten Sig |
https://en.wikipedia.org/wiki/Benign%20paroxysmal%20vertigo%20of%20childhood | Benign paroxysmal vertigo of childhood is an uncommon neurological disorder which presents with recurrent episodes of dizziness. The presentation is usually between the ages of 2 and 7 years and is characterised by short episodes of vertigo of sudden onset, during which the child appears distressed and unwell. The child may cling to something or someone for support. The episode lasts only minutes and resolves suddenly and completely. It is a self-limiting condition and usually resolves after about eighteen months, although many go on to experience migrainous vertigo (or vestibular migraine) when older.
Benign paroxysmal vertigo of childhood is a migrainous phenomenon with more than 50% of those affected having a family history of migraines affecting a first-degree relative. It has no relationship to benign paroxysmal positional vertigo which is a different condition entirely. |
https://en.wikipedia.org/wiki/Prime%20factor%20exponent%20notation | In his 1557 work The Whetstone of Witte, British mathematician Robert Recorde proposed an exponent notation by prime factorisation, which remained in use up until the eighteenth century and acquired the name Arabic exponent notation. The principle of Arabic exponents was quite similar to Egyptian fractions; large exponents were broken down into smaller prime numbers. Squares and cubes were so called; prime numbers from five onwards were called sursolids.
Although the terms used for defining exponents differed between authors and times, the general system was the primary exponent notation until René Descartes devised the Cartesian exponent notation, which is still used today.
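The underlying principle is ordinary prime factorisation of the exponent: a power such as x^360 is reached through repeated squarings, cubings and prime "sursolid" steps corresponding to the factors of 360. A minimal sketch:

```python
# Decompose an exponent into its prime factors, the building blocks of the
# notation described above (2 = square, 3 = cube, 5 and up = "sursolids").
def prime_factors(n):
    factors, p = [], 2
    while n > 1:
        while n % p == 0:
            factors.append(p)
            n //= p
        p += 1
    return factors

# 360 = 2*2*2 * 3*3 * 5, so x^360 decomposes into these prime power steps:
print(prime_factors(360))   # [2, 2, 2, 3, 3, 5]
```

In Cartesian notation the exponent is simply written as the number 360; the prime-factor scheme instead names each step, which is why it grew unwieldy for large exponents.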
This is a list of Recorde's terms.
By comparison, here is a table of prime factors:
See also
Surd
External links (references)
Mathematical dictionary, Chas Hutton, pg 224
Mathematical notation |
https://en.wikipedia.org/wiki/Genetic%20resources | Genetic resources are genetic material of actual or potential value, where genetic material means any material of plant, animal, microbial or other origin containing functional units of heredity.
Genetic resources is one of the three levels of biodiversity defined by the Convention on Biological Diversity in Rio, 1992.
Examples
Animal genetic resources for food and agriculture
Forest genetic resources
Germplasm, genetic resources that are preserved for various purposes such as breeding, preservation, and research
Plant genetic resources
See also
Cryoconservation of animal genetic resources, a strategy to preserve genetic resources cryogenically
Commission on Genetic Resources for Food and Agriculture, the only permanent intergovernmental body that addresses biological diversity for food and agriculture
International Treaty on Plant Genetic Resources for Food and Agriculture, an international agreement to promote sustainable use of the world's plant genetic resources
Gene bank, a type of biorepository which preserves genetic material
Genetic diversity
The State of the World's Animal Genetic Resources for Food and Agriculture |
https://en.wikipedia.org/wiki/Bolt%20cutter | A bolt cutter, sometimes called a bolt cropper, is a tool used for cutting bolts, chains, padlocks, rebar and wire mesh. It typically has long handles and short blades, with compound hinges to maximize leverage and cutting force. A typical bolt cutter yields a cutting force many times greater than the force applied to the handles.
There are different types of cutting blades for bolt cutters, including angle cut, center cut, shear cut, and clipper cut blades. Bolt cutters are usually available in 12, 14, 18, 24, 30, 36 and 42 inches (30.5, 35.6, 46, 61, 76, 91.4 and 107 cm) in length. The length is measured from the tip of the jaw to the end of the handle.
Angle cut has the cutter head angled for easier insertion. Typical angling is 25 to 35 degrees.
Center cut has the blades equidistant from the two faces of the blade.
Shear cut has the blades inverted to each other (such as normal paper scissor blades).
Clipper cut has the blades flush against one face (for cutting against flat surfaces).
Bolt cutters with fiberglass handles can be used for cutting live electrical wires and are useful during rescue operations. The fiberglass handles have another advantage of being lighter in weight than the conventional drop forged or solid pipe handles, making it easier to carry to the place of operation.
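The compound-hinge leverage mentioned above multiplies force in stages: the handle lever and the linkage each contribute a ratio, and the overall mechanical advantage is their product. The arm lengths below are illustrative assumptions, not specifications of any real tool.

```python
# Compound leverage sketch: overall mechanical advantage is the product of
# the individual lever ratios in the linkage.
def mechanical_advantage(handle_arm, handle_joint_arm, link_arm, blade_arm):
    return (handle_arm / handle_joint_arm) * (link_arm / blade_arm)

# e.g. 450 mm handles driving a 30 mm joint, then a 60 mm link on 15 mm blades:
ma = mechanical_advantage(450, 30, 60, 15)
print(ma)   # 60.0 -> 100 N on the handles yields ~6000 N at the blades
```

This is why the blades are kept short relative to the handles: shrinking the blade arm raises the second ratio directly.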
Cultural significance
The tools became iconic at the Greenham Common Women's Peace Camp, where protestors used bolt cutters to remove fencing around the RAF airbase. A Greenham banner displaying bolt cutters, together with a hanging of Greenham fence wire, was displayed at the Pine Gap Women's Peace Camp in Australia. |
https://en.wikipedia.org/wiki/Pulmonary%20consolidation | A pulmonary consolidation is a region of normally compressible lung tissue that has filled with liquid instead of air. The condition is marked by induration (swelling or hardening of normally soft tissue) of a normally aerated lung. It is considered a radiologic sign. Consolidation occurs through accumulation of inflammatory cellular exudate in the alveoli and adjoining ducts. The liquid can be pulmonary edema, inflammatory exudate, pus, inhaled water, or blood (from bronchial tree or hemorrhage from a pulmonary artery). Consolidation must be present to diagnose pneumonia: the signs of lobar pneumonia are characteristic and clinically referred to as consolidation.
Signs
Signs that consolidation may have occurred include:
Expansion of the thorax on inspiration is reduced on the affected side
Vocal fremitus is increased on the affected side
Percussion note is impaired in the affected area
Breath sounds are bronchial
Possible medium, late, or pan-inspiratory crackles
Vocal resonance is increased. Here, the patient's voice (or whisper, as in whispered pectoriloquy) can be heard more clearly when there is consolidation, as opposed to the healthy lung where speech sounds muffled.
A pleural rub may be present.
A lower PAO2 than calculated in the alveolar gas equation
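The alveolar gas equation referenced in the last sign above is commonly used in its simplified form PAO2 = FiO2 × (Patm − PH2O) − PaCO2/R. The defaults below are the usual sea-level values (water vapour pressure 47 mmHg, respiratory quotient 0.8); this is a textbook sketch, not clinical software.

```python
# Simplified alveolar gas equation (pressures in mmHg).
def alveolar_po2(fio2, paco2, patm=760.0, ph2o=47.0, rq=0.8):
    return fio2 * (patm - ph2o) - paco2 / rq

# Room air (FiO2 0.21) with a normal PaCO2 of 40 mmHg:
print(round(alveolar_po2(0.21, 40.0), 1))   # 99.7
```

A measured arterial PaO2 well below this calculated PAO2 (a widened A-a gradient) is consistent with consolidation, since blood perfusing fluid-filled alveoli is not oxygenated.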
Diagnosis
Radiology
Typically, an area of white lung is seen on a standard X-ray. Consolidated tissue is more radio-opaque than normally aerated lung parenchyma, so that it is clearly demonstrable in radiography and on CT scans. Consolidation is often a middle-to-late stage feature/complication in pulmonary infections.
See also
Pulmonary infiltrate |
https://en.wikipedia.org/wiki/F-term | In theoretical physics, one often analyzes theories with supersymmetry in which F-terms play an important role. In four dimensions, the minimal N=1 supersymmetry may be written using a superspace. This superspace involves four extra fermionic coordinates, θ and θ̄, transforming as a two-component spinor and its conjugate.
Every superfield—i.e. a field that depends on all coordinates of the superspace—may be expanded with respect to the new fermionic coordinates. There exists a special kind of superfield, the so-called chiral superfield, that depends only on the variables θ but not on their conjugates θ̄. The last term in the corresponding expansion, namely the coefficient F of θθ, is called the F-term. Applying an infinitesimal supersymmetry transformation to a chiral superfield results in yet another chiral superfield whose F-term, in particular, changes by a total derivative. This is significant because the spacetime integral of the F-term is then invariant under SUSY transformations as long as boundary terms vanish. Thus F-terms may be used in constructing supersymmetric actions.
Manifestly supersymmetric Lagrangians may also be written as integrals over the whole superspace. Some special terms, such as the superpotential, may be written as integrals over the θ variables only. They are also referred to as F-terms, much like the terms in the ordinary scalar potential that arise from these parts of the supersymmetric Lagrangian.
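In standard N=1 conventions (e.g. Wess–Bagger; normalisations and sign conventions vary between texts, so treat this as a sketch), the chiral expansion and the F-term construction described above read:

```latex
% Chiral superfield expansion in the shifted coordinate y = x + i\theta\sigma\bar\theta:
\Phi(y,\theta) = \phi(y) + \sqrt{2}\,\theta\psi(y) + \theta\theta\, F(y)
% Under an infinitesimal SUSY transformation the F-component shifts by a
% total derivative,
\delta F = i\sqrt{2}\,\bar\epsilon\,\bar\sigma^{\mu}\partial_{\mu}\psi ,
% so its spacetime integral is invariant and superpotential terms
% may be written as integrals over the chiral half of superspace:
S_F = \int d^4x\, d^2\theta\; W(\Phi) \;+\; \text{h.c.}
```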
See also
D-term
Supersymmetric gauge theory
Supersymmetric quantum field theory |
https://en.wikipedia.org/wiki/Royal%20cypher | In modern heraldry, a royal cypher is a monogram or monogram-like device of a country's reigning sovereign, typically consisting of the initials of the monarch's name and title, sometimes interwoven and often surmounted by a crown. Such a cypher as used by an emperor or empress is called an imperial cypher. In the system used by various Commonwealth realms, the title is abbreviated as 'R' for rex or regina (Latin for "king" and "queen"). Previously, 'I' stood for imperator or imperatrix (Latin for "emperor" and "empress") of the Indian Empire.
Royal cyphers appear on some government buildings, impressed upon royal and state documents, and are used by governmental departments. They may also appear on other governmental structures built under a particular ruler. For example, the insignia of "N III" for Napoleon III is seen on some Paris bridges, such as the Pont au Change.
Commonwealth realms
The use of a royal cypher in the Commonwealth realms originated in the United Kingdom, where the public use of the royal initials dates at least from the early Tudor period, and was simply the initial of the sovereign with, after Henry VIII's reign, the addition of the letter 'R' for rex or regina. The letter 'I' for imperatrix was added to Queen Victoria's monogram after she became Empress of India in 1877.
The initials, which had no set pattern or form of lettering laid down, were usually shown in company with the royal arms or crown as on the king's manors and palaces, such as those of Henry VIII on the gatehouse of St James's Palace. The purpose seems to have been simply to identify an individual sovereign, particularly on certain landmarks that he or she has commissioned, as the royal coat of arms in contrast was often used by successive monarchs and is therefore not distinct. The initials are furthermore used on government papers, duty stamps and similar objects, and are surmounted in England by a stylised version of the Tudor Crown or St Edward's Crown; in Scotland the Crown of Scotland is used instead.
Though royal sy |
https://en.wikipedia.org/wiki/Astro%20AOD | Astro On Demand (abbreviated Astro AOD) is a Cantonese-language drama TV channel service co-established by TVB and Astro. It features the latest TVB drama series, broadcast at the same time as the originating channel.
The channel was officially launched on 16 July 2007. Astro On Demand premieres drama series at the same time as Hong Kong, although not all drama series follow this rule. Subtitles are available in Chinese and Malay, and in recent years English subtitles and a Mandarin audio option have been added.
Since mid-2018, Astro AOD has also been available with additional Chinese subtitles.
Programming schedule
Astro On Demand adopts NVOD so that audiences can catch up on dramas when they miss the first broadcast. For example, all the episodes of a drama premiering at 8.30 are carried on Channels 903–910; those premiering at 9.30 are carried on Channels 923–943. Each channel carries at most 4 episodes (sometimes 5–6), so the shortest wait is 45 minutes and the longest 3 hours (sometimes longer). Furthermore, Astro On Demand keeps a drama for another week after the entire series has ended, so that audiences who missed it can still watch. If a drama was well received, the channel keeps it longer. So far, dramas that have been kept for a month are: Moonlight Resonance, Rosy Business, Beyond the Realm of Conscience, and Born Rich.
List of TVB dramas
8.30 pm
9.30 pm
10.30 pm
Programming schedule table
External links
Astro On Demand's official website
Video on demand
Television channels and stations established in 2007
Astro Malaysia Holdings television channels |
https://en.wikipedia.org/wiki/Application%20virtualization | Application virtualization is a software technology that encapsulates computer programs from the underlying operating system on which they are executed. A fully virtualized application is not installed in the traditional sense, although it is still executed as if it were. The application behaves at runtime like it is directly interfacing with the original operating system and all the resources managed by it, but can be isolated or sandboxed to varying degrees.
In this context, the term "virtualization" refers to the artifact being encapsulated (application), which is quite different from its meaning in hardware virtualization, where it refers to the artifact being abstracted (physical hardware).
Description
Full application virtualization requires a virtualization layer. Application virtualization layers replace part of the runtime environment normally provided by the operating system. The layer intercepts all disk operations of virtualized applications and transparently redirects them to a virtualized location, often a single file. The application remains unaware that it accesses a virtual resource instead of a physical one. Since the application is now working with one file instead of many files spread throughout the system, it becomes easy to run the application on a different computer and previously incompatible applications can be run side by side. Examples of this technology for the Windows platform include:
Cameyo
Ceedo
Citrix XenApp
Microsoft App-V
Numecent Cloudpaging
Oracle Secure Global Desktop
Sandboxie
Turbo (software) (formerly Spoon and Xenocode)
Symantec Workspace Virtualization
VMware ThinApp
V2 Cloud
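The disk-redirection idea described above can be sketched in miniature: intercept the paths an application asks for and transparently map them into a private virtual store. Real products hook OS-level file APIs; this toy class only redirects calls made through it, and all names below are illustrative.

```python
# Toy file-redirection layer: the "application" believes it reads and writes
# system-wide paths, but everything lands in a private per-app store.
import os

class VirtualFS:
    def __init__(self, store_dir):
        self.store = store_dir                  # the private virtualized location
        os.makedirs(store_dir, exist_ok=True)

    def _redirect(self, path):
        # Flatten any requested path into a name inside the store.
        return os.path.join(self.store, path.replace(os.sep, "_").strip("_"))

    def write(self, path, data):
        with open(self._redirect(path), "w") as f:
            f.write(data)

    def read(self, path):
        with open(self._redirect(path)) as f:
            return f.read()

# The app believes it is writing to a system-wide config location:
vfs = VirtualFS("appstore")
vfs.write("/etc/myapp.conf", "setting=1")
print(vfs.read("/etc/myapp.conf"))   # setting=1
```

Because the store is self-contained, moving the application to another machine or running two otherwise conflicting versions side by side reduces to copying or duplicating the store directory.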
Benefits
Application virtualization allows applications to run in environments that do not suit the native application. For example, Wine allows some Microsoft Windows applications to run on Linux.
Application virtualization reduces system integration and administration costs by maintaining a common software baseline across multipl |
https://en.wikipedia.org/wiki/BPIFB4 | BPI fold containing family B, member 4 (BPIFB4) is a protein that in humans is encoded by the BPIFB4 gene. It was formerly known as "Long palate, lung and nasal epithelium carcinoma-associated protein 4" encoded by the LPLUNC4 gene. The BPIFB4 gene sequence predicts 4 transcripts (splice variants); 3 isoforms have been well characterized. In a variety of mammals, BPIFB4 is generally expressed in very high levels in the olfactory epithelium (nasal mucosa), high levels in the gonads (testis, ovary) and pituitary, and moderate levels in white blood cells (monocytes). It can occur either localized in the cytoplasm of cells or secreted and circulated systemically in blood plasma.
Superfamily
BPIFB4 is a member of a BPI fold protein superfamily defined by the presence of the bactericidal/permeability-increasing protein fold (BPI fold) which is formed by two similar domains in a "boomerang" shape. This superfamily is also known as the BPI/LBP/PLUNC family or the BPI/LPB/CETP family. The BPI fold creates apolar binding pockets that can interact with hydrophobic and amphipathic molecules, such as the acyl carbon chains of lipopolysaccharide found on Gram-negative bacteria, but members of this family may have many other functions.
Genes for the BPI/LBP/PLUNC superfamily are found in all vertebrate species, including distant homologs in non-vertebrate species such as insects, mollusks, and roundworms. Within that broad grouping is the BPIF gene family whose members encode the BPI fold structural motif and are found clustered on a single chromosome, e.g., Chromosome 20 in humans, Chromosome 2 in mouse, Chromosome 3 in rat, Chromosome 17 in pig, Chromosome 13 in cow. The BPIF gene family is split into two groupings, BPIFA and BPIFB. In humans, BIPFA consists of 3 protein encoding genes BPIFA1, BPIFA2, BPIFA3, and 1 pseudogene BPIFA4P; while BPIFB consists of 5 protein encoding genes BPIFB1, BPIFB2, BPIFB3, BPIFB4, BPIFB6 and 2 pseudogenes BPIFB5P, BPIFB9P. What appears as pseu |
https://en.wikipedia.org/wiki/PeekYou | PeekYou is a people search engine that indexes people and their links on the web. Founded in April 2006 by Michael Hussey, PeekYou claims that they have indexed over 250 million people, mostly in the United States and Canada. The search results consist of publicly available URLs, including Facebook, LinkedIn, Wikipedia, Google+, blogs, homepages, business pages and news sources.
See also
Information broker
Further reading
"PeekYou - Spock has Competition" at Techcrunch
"PeekYou Makes People Search Worthwhile with Google Integration" at Mashable
"PeekYou’s Hussey Offers Glimpse Into Data Practices" at Adotas
"'Scrapers' Dig Deep for Data on Web" at Wall Street Journal |
https://en.wikipedia.org/wiki/Cirsium%20edule | Cirsium edule, the edible thistle or Indian thistle, is a species of thistle in the genus Cirsium, native to western North America from southeastern Alaska south through British Columbia to Washington and Oregon, and locally inland to Idaho. It is a larval host to the mylitta crescent and the painted lady.
Cirsium edule is a tall herbaceous perennial plant. The leaves are very spiny, lobed, 10–30 cm long and 2–5 cm broad (smaller on the upper part of the flower stem). The inflorescence is 3–4 cm diameter, purple, with numerous disc florets but no ray florets. The achenes are 4–5 mm long, with a downy pappus which assists in wind dispersal. It is monocarpic, growing as a low rosette of leaves for a number of years, then sending up the tall flowering stem in spring, with the plant dying after seed maturation.
Edible thistle is used by Native Americans for its edible roots and young shoots. The roots are sweet, but contain inulin, which gives some people digestive problems.
Varieties
Cirsium edule var. edule - Oregon, Washington
Cirsium edule var. macounii (Greene) D.J.Keil - Oregon, Washington, British Columbia, Alaska
Cirsium edule var. wenatchense D.J.Keil - Washington |
https://en.wikipedia.org/wiki/Two-body%20Dirac%20equations | In quantum field theory, and in the significant subfields of quantum electrodynamics (QED) and quantum chromodynamics (QCD), the two-body Dirac equations (TBDE) of constraint dynamics provide a three-dimensional yet manifestly covariant reformulation of the Bethe–Salpeter equation for two spin-1/2 particles. Such a reformulation is necessary since without it, as shown by Nakanishi, the Bethe–Salpeter equation possesses negative-norm solutions arising from the presence of an essentially relativistic degree of freedom, the relative time. These "ghost" states have spoiled the naive interpretation of the Bethe–Salpeter equation as a quantum mechanical wave equation. The two-body Dirac equations of constraint dynamics rectify this flaw. The forms of these equations can be derived not only from quantum field theory but also purely in the context of Dirac's constraint dynamics and relativistic mechanics and quantum mechanics. Their structures, unlike the more familiar two-body Dirac equation of Breit, which is a single equation, are that of two simultaneous quantum relativistic wave equations. A single two-body Dirac equation similar to the Breit equation can be derived from the TBDE. Unlike the Breit equation, it is manifestly covariant and free from the types of singularities that prevent a strictly nonperturbative treatment of the Breit equation.
In applications of the TBDE to QED, the two particles interact by way of four-vector potentials derived from the field theoretic electromagnetic interactions between the two particles. In applications to QCD, the two particles interact by way of four-vector potentials and Lorentz invariant scalar interactions, derived in part from the field theoretic chromomagnetic interactions between the quarks and in part by phenomenological considerations. As with the Breit equation a sixteen-component spinor Ψ is used.
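The structure of the two simultaneous constraint equations can be indicated schematically. This is a sketch following the constraint-dynamics literature; the tilded vector and scalar potentials below stand in for the interactions just described, and the precise operator forms are an assumption here since the source text is truncated.

```latex
% Two simultaneous Dirac-like constraint equations acting on the
% sixteen-component spinor \Psi, one per particle:
\mathcal{S}_1 \Psi \equiv
  \gamma_{51}\bigl(\gamma_1\cdot(p_1 - \tilde{A}_1) + m_1 + \tilde{S}_1\bigr)\Psi = 0,
\qquad
\mathcal{S}_2 \Psi \equiv
  \gamma_{52}\bigl(\gamma_2\cdot(p_2 - \tilde{A}_2) + m_2 + \tilde{S}_2\bigr)\Psi = 0,
% with the compatibility (integrability) condition that removes the
% relative-time degree of freedom:
[\mathcal{S}_1, \mathcal{S}_2]\,\Psi = 0 .
```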
Equations
For QED, each equation has the same structure as the ordinary one-body Dirac equation in the |
https://en.wikipedia.org/wiki/FreeCast | FreeCast, Inc. is an American digital media distribution company based in Orlando, Florida. The company, founded by William Mobley in 2011, offers streaming media accessible by web browser. Their primary product is Rabbit TV, a web-based virtual library of entertainment media created and marketed together with A. J. Khubani's company Telebrands.
History and overview
FreeCast was founded in 2011 by Mobley, and began as a search engine for web video content, locating and categorizing 1.5 million new videos each day, on 5000 categorized channels. The company later developed a Facebook app which allowed users to watch video programming from their Facebook page.
Before Rabbit TV was introduced, the company relied primarily on display advertising for revenue.
Rabbit TV
In 2012, FreeCast and Telebrands turned their service into a physical device. They named the product Rabbit TV, and in early 2013 began selling it through major US retailers in the form of a USB stick. The Rabbit TV USB device grants users access to a web-based guide interface. As of October 2013, a little more than 2 million people had bought the device.
Rabbit TV aggregates links to digital media sources, including TV shows, news broadcasts, live sporting events, movies, music, and radio stations.
After reaching one million paid subscribers, the company announced further Rabbit TV development, including more social media integration, multi-device compatibility, and the introduction of a la carte programming packages. |
https://en.wikipedia.org/wiki/Effective%20population%20size | The effective population size (Ne) is a number that, in some simple scenarios, corresponds to the number of breeding individuals in the population. More generally, Ne is the number of individuals that an idealised population would need to have in order for some specified quantity of interest (typically change of genetic diversity or inbreeding rates) to be the same as in the real population. Idealised populations are based on unrealistic but convenient simplifications such as random mating, simultaneous birth of each new generation, constant population size, and equal numbers of children per parent. For most quantities of interest and most real populations, the effective population size Ne is usually smaller than the census population size N of a real population. The same population may have multiple effective population sizes, for different properties of interest, including for different genetic loci.
The effective population size is most commonly measured with respect to the coalescence time. In an idealised diploid population with no selection at any locus, the expectation of the coalescence time in generations is equal to twice the census population size. The effective population size is measured as within-species genetic diversity divided by four times the mutation rate μ, because in such an idealised population, the heterozygosity is equal to 4Neμ. In a population with selection at many loci and abundant linkage disequilibrium, the coalescent effective population size may not reflect the census population size at all, or may reflect its logarithm.
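Under the idealised model above, the coalescent effective size follows directly from diversity and mutation rate. The numbers below are illustrative assumptions, not values from the text:

```python
# Coalescent effective population size from the idealised relation
# heterozygosity = 4 * Ne * mu (i.e. theta = 4*Ne*mu). Values are illustrative.
pi = 0.001     # observed within-species genetic diversity (heterozygosity)
mu = 1.25e-8   # per-site, per-generation mutation rate
Ne = pi / (4 * mu)
print(round(Ne))  # roughly 20000 breeding individuals in the idealised model
```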
The concept of effective population size was introduced in the field of population genetics in 1931 by the American geneticist Sewall Wright.
Overview: Types of effective population size
Depending on the quantity of interest, effective population size can be defined in several ways. Ronald Fisher and Sewall Wright originally defined it as "the number of breeding individuals in an idealised population that would show |
https://en.wikipedia.org/wiki/Distributed%20firewall | A distributed firewall is a security application on a host machine of a network that protects the servers and user machines of its enterprise's networks against unwanted intrusion. A firewall is a system or group of systems (router, proxy, or gateway) that implements a set of security rules to enforce access control between two networks to protect the "inside" network from the "outside" network. They filter all traffic regardless of its origin—the Internet or the internal network. Usually deployed behind the traditional firewall, they provide a second layer of defense. The advantages of the distributed firewall allow security rules (policies) to be defined and pushed out on an enterprise-wide basis, which is necessary for larger enterprises.
Basic Working
Distributed firewalls are often kernel-mode applications that sit at the bottom of the OSI stack in the operating system. They filter all traffic regardless of its origin—the Internet or the internal network. They treat both the Internet and the internal network as "unfriendly". They guard the individual machine in the same way that the perimeter firewall guards the overall network. Distributed firewall function rests on three notions:
A policy language that states what sort of connections are permitted or prohibited,
Any of a number of system management tools, such as Microsoft's SMS or ASD, and
IPSEC, the network-level encryption mechanism for Internet Protocol (TCP, UDP, etc.)
The basic idea is simple. A compiler translates the policy language into some internal format. The system management software distributes this policy file to all hosts that are protected by the firewall. And incoming packets are accepted or rejected by each "inside" host, according to both the policy and the cryptographically verified identity of each sender.
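The compile-then-distribute loop described above can be sketched in miniature. Everything here is a hypothetical illustration — the rule syntax and host names are invented (not any real policy language), and sender verification, which in practice rests on IPsec credentials, is assumed to have already happened:

```python
# Toy sketch of the compile/distribute/enforce idea of a distributed firewall.
POLICY_TEXT = """
allow mailserver 25
allow * 443
deny * *
"""

def compile_policy(text):
    """'Compile' the textual policy into an internal list-of-tuples format."""
    rules = []
    for line in text.strip().splitlines():
        action, sender, port = line.split()
        rules.append((action, sender, port))
    return rules

def accept(rules, sender_id, port):
    """Decision each 'inside' host makes, given the already-verified sender identity."""
    for action, rule_sender, rule_port in rules:
        if rule_sender in ("*", sender_id) and rule_port in ("*", str(port)):
            return action == "allow"
    return False  # default deny if no rule matches

rules = compile_policy(POLICY_TEXT)     # done centrally, then pushed to hosts
print(accept(rules, "mailserver", 25))  # True
print(accept(rules, "laptop17", 22))    # False
```

In a real deployment the compiled policy file would be pushed by the system management software, and `sender_id` would come from a cryptographically verified credential rather than a plain string.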
Features
A central management system for designing the policies,
A transmission system to transmit these policies, and
Implementation of the designed policies at the cl |
https://en.wikipedia.org/wiki/Hungarian%20Food%20Safety%20Office | The Hungarian Food Safety Office (HFSO) operated as the Hungarian partner institution of the European Food Safety Authority (EFSA) from 2003 to 2012 in conformity with the EU requirements. One of its priorities was to assess the health risks derived from food and indirectly from feed, to liaise with international and Hungarian authorities, and to communicate with the public on food safety issues. Since 2012, these tasks have been performed by the National Food Chain Safety Office, which was established by the integration of the Central Agricultural Office and HFSO.
Scientific risk assessment
One of the major responsibilities of HFSO was scientific risk assessment relating to food safety, taking into account up-to-date scientific findings of recognised international institutions. The scientific risk assessment is based on identifying potential hazards, defining their characteristics, assessing their incidence and frequency, and describing the risk. HFSO was responsible for assessing risks relating to specific events, and for analysing the data of annual official inspections concerning various agricultural, technological, environmental and biological contaminants, pesticide residues and natural toxic substances in raw and processed food. Based on the risk assessment, it forwarded proposals concerning the priority of inspections in the forthcoming period and participated in international risk assessment projects relating to individual chemical and microbiological contaminants.
National and international relations
Fostering national and international relations was another important responsibility of HFSO. Before the establishment of the National Food Chain Safety Office, this institution had been the Hungarian partner organization of the European Food Safety Authority. In addition, HFSO was the designated contact point in the EU Rapid Alert System for Food and Feed (RASFF), the World Health Organization (WHO) Food Safety Emergency Network (INFOSAN Emergency) and |
https://en.wikipedia.org/wiki/Cutwail%20botnet | The Cutwail botnet, founded around 2007, is a botnet mostly involved in sending spam e-mails. The bot is typically installed on infected machines by a Trojan component called Pushdo. It affects computers running Microsoft Windows.
History
In June 2009 it was estimated that the Cutwail botnet was the largest botnet in terms of the number of infected hosts. Security provider MessageLabs estimated that the total size of the botnet was around 1.5 to 2 million individual computers, capable of sending 74 billion spam messages a day, or 51 million every minute, equal to 46.5% of the worldwide spam volume.
In February 2010 the botnet's activities were slightly altered when it started a DDoS attack against 300 major sites, including the CIA, FBI, Twitter and PayPal. The reasons for this attack weren't fully understood, and some experts described it as an "accident", mainly due to the lack of damage and disruption, along with the infrequency of the attacks.
In August 2010, researchers from the University of California, Santa Barbara, and Ruhr University Bochum attempted to take down the botnet, and managed to take offline 20 of the 30 Command and Control servers that the botnet was using.
Structure
Cutwail is a fairly simple botnet. The bots connect directly to the command and control server, and receive instructions about the emails they should send. After they are done with their task, the bots report exact statistics back to the spammer on the number of emails delivered and on the number and kind of errors encountered.
Operations
The Cutwail botnet is known as "0bulk Psyche Evolution" in the underground market. Spammers can rent an instance of the botnet for a fee, and use it to send their own spam campaigns. The services offered by the botnet were advertised on the Russian underground forum "spamdot.biz", that was taken down in 2010. As of June 2010, at least 8 different spam groups were using the botnet to deliver junk mail.
See also
Operation: Bot Roast
Mc |
https://en.wikipedia.org/wiki/Naval%20Network%20Warfare%20Command | Naval Network Warfare Command (NAVNETWARCOM) is the United States Navy's information operations, intelligence, networks, and space unit. Naval Network Warfare Command's mission is to execute, under Commander TENTH Fleet Operational Control, tactical-level command and control of Navy Networks and to leverage Joint Space Capabilities for Navy and Joint Operations.
History
In 2002, some 23 organizations from several commands, including the former Naval Space Command, Naval Computer and Telecommunications Command, Fleet Information Warfare Center, and Navy Component Task Force - Computer Network Defense were brought together to form Naval Network Warfare Command, emphasizing the organization's focus on the operation and defense of the Navy's networks.
In 2005, with the disestablishment of Naval Security Group (NAVSECGRU), NETWARCOM brought the former Naval Security Group Activities (NSGAs) under its umbrella, designating them Naval Information Operation Center(s) (NIOC) and Naval Information Operation Detachment(s) (NIOD). The mission of the command fundamentally changed, making it the Navy's lead for Information Operations, as well as Networks and Space.
The assumption, alignment, and integration of Fleet Intelligence Type Commander duties, responsibilities and functions at NETWARCOM in 2008 began a measured and evolutionary process to improve integrated Fleet Intelligence and ISR readiness. This alignment provides a single Fleet champion for ISR and positions Fleet Intelligence for better and timelier support to fleet operations.
In 2009, the Secretary of Defense directed the establishment of U.S. Cyber Command. Each of the services was also directed to establish a supporting command to U.S. Cyber Command; as a result, the Naval Information Operations Centers (NIOC) were moved to the reestablished Tenth Fleet to help form U.S. Fleet Cyber Command. Naval Network Warfare Command was reorganized and its mission revised to "operate and defend the Navy's portion of
https://en.wikipedia.org/wiki/Hypodiastole | The hypodiastole, also known as a diastole, was an interpunct developed in late Ancient and Byzantine Greek texts before the separation of words by spaces was common. In the scriptio continua then used, a group of letters might have separate meanings as a single word or as a pair of words. The papyrological hyphen showed that a group of letters should be read together as a single word, and the hypodiastole showed that they should be taken separately. Compare ὅ,τι ("whatever") to ὅτι ("...that...").
The hypodiastole was similar in appearance to the comma and was eventually entirely conflated with it. In Modern Greek, the hypodiastole (υποδιαστολή) refers to the comma in its role as a decimal point, and words such as ό,τι are written with standard commas. A separate Unicode code point, U+2E12 (⸒), exists in the ISO/IEC 10646 standard for the hypodiastole but is intended only to reproduce its historical occurrence in Greek texts. |
https://en.wikipedia.org/wiki/Carrageenan | Carrageenans or carrageenins are a family of natural linear sulfated polysaccharides that are extracted from red edible seaweeds. Carrageenans are widely used in the food industry for their gelling, thickening, and stabilizing properties. Their main application is in dairy and meat products, due to their strong binding to food proteins. In recent years, carrageenans have emerged as a promising candidate in tissue engineering and regenerative medicine applications as they resemble native glycosaminoglycans (GAGs). They have been mainly used for tissue engineering, wound coverage, and drug delivery.
Carrageenans contain 15–40% ester-sulfate content, which makes them anionic polysaccharides. They can be mainly categorized into three different classes based on their sulfate content. Kappa-carrageenan has one sulfate group per disaccharide, iota-carrageenan has two, and lambda-carrageenan has three.
A common red seaweed used for manufacturing the hydrophilic colloids to produce carrageenan is Chondrus crispus (Irish moss), which is a dark red parsley-like alga that grows attached to rocks. Gelatinous extracts of the Chondrus crispus seaweed have been used as food additives since approximately the fifteenth century. Carrageenan is a vegetarian and vegan alternative to gelatin in some applications, so may be used to replace gelatin in confectionery and other food. There is no clinical evidence for carrageenan as an unsafe food ingredient, mainly because its fate after digestion is inadequately determined.
The first industrial scale commercial cultivation of Eucheuma and Kappaphycus spp. for carrageenan was developed in the Philippines. The global top producers of carrageenan are the Philippines and Indonesia. Carrageenan, along with agar, are used to produce traditional jelly desserts in the Philippines called gulaman.
Properties
Carrageenans are large, highly flexible molecules that form curling helical structures. This gives them the ability to form a varie |
https://en.wikipedia.org/wiki/Mouse%20Phenome%20Database | The Mouse Phenome Database (MPD) is a web-accessible database of strain characterization data for the laboratory mouse, to facilitate translational research for human health and disease. MPD characterizes phenotype as well as genotype, and provides tools for online analysis. Most phenotype data are in the form of strain surveys (comparisons of 10-40 commonly used mouse strains) and cover such areas as hematology, bone mineral density, cholesterol levels, endocrine function, and aging processes. Genotype data are primarily in the form of single-nucleotide polymorphisms. Data are contributed by participating scientists or downloaded from public resources.
The MPD was begun in 2000, is funded by grants from the National Institutes of Health and other sources, and is headquartered at The Jackson Laboratory. |
https://en.wikipedia.org/wiki/Aluthge%20transform | In mathematics and more precisely in functional analysis, the Aluthge transformation is an operation defined on the set of bounded operators of a Hilbert space. It was introduced by Ariyadasa Aluthge to study p-hyponormal linear operators.
Definition
Let H be a Hilbert space and let B(H) be the algebra of bounded linear operators from H to H. By the polar decomposition theorem, for every T in B(H) there exists a unique partial isometry U such that T = U|T| and ker(U) = ker(T), where |T| = (T*T)^(1/2) is the square root of the operator T*T. If T ∈ B(H) and T = U|T| is its polar decomposition, the Aluthge transform of T is the operator Δ(T) defined as:
Δ(T) = |T|^(1/2) U |T|^(1/2)
More generally, for any real number λ with 0 ≤ λ ≤ 1, the λ-Aluthge transformation is defined as Δ_λ(T) = |T|^λ U |T|^(1−λ).
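In finite dimensions the transform is straightforward to compute. The sketch below assumes the standard definition Δ(T) = |T|^(1/2) U |T|^(1/2) (stated here as an assumption, since the formulas are missing from the excerpt) and uses SciPy's polar decomposition; the 2×2 matrix is an arbitrary test case. A known property — for invertible T the Aluthge transform is similar to T and so preserves eigenvalues — serves as a sanity check:

```python
import numpy as np
from scipy.linalg import polar, sqrtm

def aluthge(T):
    # Polar decomposition T = U @ P with P = |T| = (T* T)^(1/2)
    U, P = polar(T)
    R = sqrtm(P)          # |T|^(1/2)
    return R @ U @ R      # Delta(T) = |T|^(1/2) U |T|^(1/2)

T = np.array([[0.0, 2.0],
              [1.0, 0.0]])   # invertible, non-normal test matrix
D = aluthge(T)
# For invertible T, Delta(T) = |T|^(1/2) T |T|^(-1/2) is similar to T,
# so the eigenvalues (here +/- sqrt(2)) are preserved.
print(np.sort(np.linalg.eigvals(D).real))
```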
Example
For vectors , let denote the operator defined as
An elementary calculation shows that if , then
Notes |
https://en.wikipedia.org/wiki/OFFSystem | The Owner-Free File System (OFF System, or OFFS for short) is a peer-to-peer distributed file system in which all shared files are represented by randomized multi-used data blocks. Instead of anonymizing the network, the data blocks are anonymized and therefore, only data garbage is ever exchanged and stored and no forwarding via intermediate nodes is required. OFFS claims to have been created with the expressed intention "to cut off some gangrene-infested bits of the copyright industry."
History
OFFS development started within the hacktivism group The Big Hack in 2003 by the hackers Cheater512, CaptainMorgan, Aqlo and WhiteRaven. In 2004, a rudimentary version was finished, written in PHP, which was distributed as two demo CDs. Following these, SpectralMorning re-implemented the functionality in 2004 in C++, which led to the current "mainline" OFFS client.
On August 14, 2006, CaptainMorgan posted a letter of "closing" addressed to the "Copyright Industry Associations of America", such as the RIAA and MPAA, stating that they have created OFFS with the purpose of ending "all of your problems with consumer copyright infringement."
In 2008, the network consisted of around 50 nodes. On April 11, 2008, a beta test was held with a network size of over 100 nodes. Since SpectralMorning stopped work on OFFS in late 2008, only minor bug fix releases were made to mainline OFF.
Starting in 2007, an alternative but compatible client was developed, called BlocksNet. Written in Ruby and well maintained, it saw major improvements over time; it remained under development until 2011.
The client OFFLoad is a fork from mainline OFFS, which seemingly adds no features. Reasons for the fork are unclear. Another distantly related program is Monolith, which uses a similar principle to OFFS. It was created after OFFS and features no multi-use of blocks and no networking.
Functional Principle
The OFF System is a kind of anonymous, fully decentralized P2P file sharing program a |
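The excerpt is cut off above, but the mechanism usually described behind the "only data garbage is ever exchanged" claim — stated here as an assumption for illustration, not taken from the text — is that a stored block is either pure random data or the XOR of file data with random blocks, so that no stored block by itself contains any of the file:

```python
import os

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def split(data: bytes, n_random: int = 2):
    """Return n_random random blocks plus one 'mix' block; only these are stored."""
    randoms = [os.urandom(len(data)) for _ in range(n_random)]
    mix = data
    for r in randoms:
        mix = xor(mix, r)
    return randoms + [mix]   # each block alone is indistinguishable from noise

def join(blocks):
    """Recover the original data by XOR-ing the whole tuple of blocks together."""
    out = bytes(len(blocks[0]))
    for blk in blocks:
        out = xor(out, blk)
    return out

blocks = split(b"hello world")
print(join(blocks))  # b'hello world'
```

Because the random blocks can be reused in the representation of many different files ("multi-used data blocks"), no stored block belongs to any one file in particular.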
https://en.wikipedia.org/wiki/Neurofilament%20light%20polypeptide | Neurofilament light polypeptide, also known as neurofilament light chain, abbreviated to NF-L or Nfl and with the HGNC name NEFL, is a member of the intermediate filament protein family. This protein family consists of over 50 human proteins divided into 5 major classes: the Class I and II keratins; Class III vimentin, GFAP, desmin and the others; the Class IV neurofilaments; and the Class V nuclear lamins. There are four major neurofilament subunits: NF-L, NF-M, NF-H and α-internexin. These form heteropolymers which assemble to produce 10 nm neurofilaments which are only expressed in neurons, where they are major structural proteins, particularly concentrated in large projection axons. Axons are particularly sensitive to mechanical and metabolic compromise, and as a result axonal degeneration is a significant problem in many neurological disorders. The detection of neurofilament subunits in CSF and blood has therefore become widely used as a biomarker of ongoing axonal compromise. The NF-L protein is encoded by the NEFL gene. Neurofilament light chain is a biomarker that can be measured with immunoassays in cerebrospinal fluid and plasma and reflects axonal damage in a wide variety of neurological disorders. It is a useful marker for disease monitoring in amyotrophic lateral sclerosis, multiple sclerosis, Alzheimer's disease, and more recently Huntington's disease. It is also a promising marker for follow-up of patients with brain tumors. Higher levels of blood or CSF NF-L have been associated with increased mortality, as would be expected as release of this protein reflects ongoing axonal loss. Recent work performed as a collaboration between EnCor Biotechnology Inc. and the University of Florida showed that the NF-L antibodies employed in the most widely used NF-L assays are specific for cleaved forms of NF-L generated by proteolysis induced by cell death.
Methods used in different studies for NfL measurement are sandwich enzyme-linked immunosorbent assay (ELISA), ele |
https://en.wikipedia.org/wiki/Evo-devo%20gene%20toolkit | The evo-devo gene toolkit is the small subset of genes in an organism's genome whose products control the organism's embryonic development. Toolkit genes are central to the synthesis of molecular genetics, palaeontology, evolution and developmental biology in the science of evolutionary developmental biology (evo-devo). Many of them are ancient and highly conserved among animal phyla.
Toolkit
Toolkit genes are highly conserved among phyla, meaning that they are ancient, dating back to the last common ancestor of bilaterian animals. For example, that ancestor had at least 7 Pax genes for transcription factors.
Differences in deployment of toolkit genes affect the body plan and the number, identity, and pattern of body parts. The majority of toolkit genes are components of signaling pathways and encode for the production of transcription factors, cell adhesion proteins, cell surface receptor proteins (and signalling ligands that bind to them), and secreted morphogens; all of these participate in defining the fate of undifferentiated cells, generating spatial and temporal patterns that, in turn, form the body plan of the organism. Among the most important of the toolkit genes are those of the Hox gene cluster, or complex. Hox genes, transcription factors containing the more broadly distributed homeobox protein-binding DNA motif, function in patterning the body axis. Thus, by combinatorially specifying the identity of particular body regions, Hox genes determine where limbs and other body segments will grow in a developing embryo or larva. A paradigmatic toolkit gene is Pax6/eyeless, which controls eye formation in all animals. It has been found to produce eyes in mice and Drosophila, even if mouse Pax6/eyeless was expressed in Drosophila.
This means that a big part of the morphological evolution undergone by organisms is a product of variation in the genetic toolkit, either by the genes changing their expression pattern or acquiring new functions. A good example of |
https://en.wikipedia.org/wiki/Universal%20Short%20Title%20Catalogue | The Universal Short Title Catalogue (USTC) brings together information on all books published in Europe between the invention of printing and the end of the sixteenth century, creating a powerful resource for the study of the book and print culture.
The project has a searchable interface, which brings together data from established national bibliographical projects and new projects undertaken by the project team based at the University of St Andrews, with partners in University College Dublin. This new work builds upon the principles established by the St Andrews French Vernacular Book project, completed and published in 2007 (FB volumes 1 & 2).
New work undertaken in St Andrews has created bibliographies of Latin books published in France (FB volumes 3 & 4) and of books published in the Low Countries (NB). The project team has also collected and analysed information on books published in Eastern Europe and Scandinavia. Meanwhile, partners in University College Dublin created a bibliography of books published in the Iberian Peninsula (IB).
In 2011 this was all brought together with information on books published in Italy, Germany and Britain to create a fully searchable resource covering all of Europe. This provides access to the full bibliographic information, locations of surviving copies and, where available, digital full text editions that can be accessed through the database. All told, this information encompasses approximately 350,000 editions and around 1.5 million surviving copies, located in over 5,000 libraries worldwide.
The USTC also hosts a series of conferences held annually in St Andrews in September. The project is also associated with the Library of the Written Word published by Brill, also the publishers of the printed bibliographies.
The USTC is funded via a grant from the Arts and Humanities Research Council. A related project on medical books in the sixteenth century is funded by the Wellcome Trust.
History
The USTC project began as the S |
https://en.wikipedia.org/wiki/Golden%E2%80%93Thompson%20inequality | In physics and mathematics, the Golden–Thompson inequality is a trace inequality between exponentials of symmetric and Hermitian matrices, proved independently by Sidney Golden and Colin J. Thompson in 1965. It has been developed in the context of statistical mechanics, where it has come to have a particular significance.
Statement
The Golden–Thompson inequality states that for (real) symmetric or (complex) Hermitian matrices A and B, the following trace inequality holds: tr exp(A+B) ≤ tr(exp(A) exp(B)).
This inequality is well defined, since the quantities on either side are real numbers. For the expression on the right-hand side of the inequality, this can be seen by rewriting it as tr(exp(B/2) exp(A) exp(B/2)) using the cyclic property of the trace.
Motivation
The Golden–Thompson inequality can be viewed as a generalization of a stronger statement for real numbers. If a and b are two real numbers, then the exponential of a+b is the product of the exponential of a with the exponential of b: exp(a+b) = exp(a) exp(b).
If we replace a and b with commuting matrices A and B, then the identity exp(A+B) = exp(A) exp(B) still holds, so the inequality holds with equality.
This relationship is not true if A and B do not commute. In fact, it has been proved that if A and B are two Hermitian matrices for which the Golden–Thompson inequality is verified as an equality, then the two matrices commute. The Golden–Thompson inequality shows that, even though exp(A+B) and exp(A) exp(B) are not equal, they are still related by an inequality.
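The inequality is easy to check numerically. The sketch below draws random Hermitian matrices (an illustrative test harness, not part of the original proof) and verifies tr exp(A+B) ≤ tr(exp(A) exp(B)) on each draw:

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)

def random_hermitian(n):
    X = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    return (X + X.conj().T) / 2   # Hermitian by construction

for _ in range(200):
    A, B = random_hermitian(4), random_hermitian(4)
    lhs = np.trace(expm(A + B)).real        # traces are real for Hermitian A, B
    rhs = np.trace(expm(A) @ expm(B)).real
    assert lhs <= rhs + 1e-9                # Golden-Thompson, up to roundoff
print("inequality held in every trial")
```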
Generalizations
The Golden–Thompson inequality generalizes to any unitarily invariant norm. If A and B are Hermitian matrices and ||·|| is a unitarily invariant norm, then ||exp(A+B)|| ≤ ||exp(B/2) exp(A) exp(B/2)||.
The standard Golden–Thompson inequality is a special case of the above inequality, where the norm is the Schatten 1-norm (the trace norm). Since exp(A+B) and exp(B/2) exp(A) exp(B/2) are both positive semidefinite matrices, their traces equal their trace norms, and the cyclic property of the trace turns tr(exp(B/2) exp(A) exp(B/2)) into tr(exp(A) exp(B)).
The inequality has been generalized to three matrices, and furthermore to any arbitrary number of Hermitian matrices. A naive attempt at generalization does not work: the inequality tr exp(A+B+C) ≤ |tr(exp(A) exp(B) exp(C))| is false. For three matrices, the correct generalization takes the following form:
where the operator is the derivati |
https://en.wikipedia.org/wiki/Miscanthus%20%C3%97%20giganteus | {{taxobox
|name = Miscanthus × giganteus
|image = Miscanthus Bestand.JPG
|regnum = Plantae
|unranked_divisio = Angiosperms
|unranked_classis = Monocots
|unranked_ordo = Commelinids
|ordo = Poales
|familia = Poaceae
|subfamilia = Panicoideae
|genus = Miscanthus
|species = M. × giganteus
|binomial = Miscanthus × giganteus
|binomial_authority = J.M.Greef, Deuter ex Hodk., Renvoize 2001
|synonyms_ref=
|synonyms =
Miscanthus × changii Y.N.Lee
Miscanthus × latissimus Y.N.Lee
Miscanthus × longiberbis (Hack.) Nakai
Miscanthus × longiberbis var. changii (Y.N.Lee) Ibaragi & H.Ohashi
Miscanthus × longiberbis f. ogiformis (Honda) Ibaragi
Miscanthus matsumurae var. longiberbis Hack.
Miscanthus × ogiformis Honda
Miscanthus oligostachyus subsp. longiberbis (Hack.) T.Koyama
Miscanthus sacchariflorus var. brevibarbis (Honda) Adati
Miscanthus sinensis 'Giganteus'
}}
Miscanthus × giganteus, also known as the giant miscanthus, is a sterile hybrid of Miscanthus sinensis and Miscanthus sacchariflorus. It is a perennial grass with bamboo-like stems that can grow to heights of 3– in one season (from the third season onwards). Just like Pennisetum purpureum, Arundo donax and Saccharum ravennae, it is also called elephant grass.
Miscanthus × giganteus' perennial nature, its ability to grow on marginal land, its water efficiency, non-invasiveness, low fertilizer needs, significant carbon sequestration and high yield have sparked significant interest among researchers, with some arguing that it has "ideal" energy crop properties. Some argue that it can provide negative emissions, while others highlight its water cleaning and soil enhancing qualities. There are practical and economic challenges related to its use in the existing, fossil-based combustion infrastructure, however. Torrefaction and other fuel upgrading techniques are being explored as countermeasures to this problem.
Use areas
Miscanthus × giganteus is mainly used as raw material for solid biofuels. It can be burned direct |
https://en.wikipedia.org/wiki/Gecos%20field | The gecos field, or GECOS field, is a field in each record in the /etc/passwd file on Unix and similar operating systems. On UNIX, it is the 5th of 7 fields in a record.
It is typically used to record general information about the account or its user(s) such as their real name and phone number.
Format
The typical format for the GECOS field is a comma-delimited list with this order:
User's full name (or application name, if the account is for a program)
Building and room number or contact person
Office telephone number
Home telephone number
Any other contact information (pager number, fax, external e-mail address, etc.)
In most UNIX systems non-root users can change their own information using the chfn or chsh command.
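The comma-separated layout above can be parsed with a few lines; the passwd record below is a made-up example, not taken from any real system:

```python
# A hypothetical /etc/passwd line; the GECOS field is the 5th colon-separated
# field (index 4), itself a comma-separated list.
record = "jdoe:x:1000:1000:John Doe,Bldg 2 Rm 101,555-0100,555-0199:/home/jdoe:/bin/sh"

fields = record.split(":")
gecos_subfields = fields[4].split(",")
full_name, office, work_phone, home_phone = gecos_subfields

print(full_name)   # John Doe
print(work_phone)  # 555-0100
```

On a live system the same data is exposed through Python's standard `pwd` module as the `pw_gecos` attribute, but the plain string split above shows the layout directly.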
History
Some early Unix systems at Bell Labs used GECOS machines for print spooling and various other services, so this field was added to carry information on a user's GECOS identity.
Other uses
On Internet Relay Chat (IRC), the real name field is sometimes referred to as the gecos field. IRC clients are able to supply this field when connecting. Hexchat, an X-Chat fork, defaults to 'realname', TalkSoup.app on GNUstep defaults to 'John Doe', and irssi reads the operating system user's full name, replacing it with 'unknown' if not defined. Some IRC clients use this field for advertising; for example, ZNC defaulted to "Got ZNC?", but changed it to "RealName = " to match its configuration syntax in 2015.
See also
General Comprehensive Operating System |
https://en.wikipedia.org/wiki/COART | COART (Coupled Ocean-Atmospheric Radiative Transfer code) is built on the Coupled DIScrete Ordinate Radiative Transfer (Coupled DISORT, or CDISORT) code, developed from DISORT. It is designed to simulate radiance (including water-leaving radiance) and irradiance (flux) consistently at any level in the atmosphere and ocean.
See also
List of atmospheric radiative transfer codes
Atmospheric radiative transfer codes |
https://en.wikipedia.org/wiki/Littlewood%27s%20three%20principles%20of%20real%20analysis | Littlewood's three principles of real analysis are heuristics of J. E. Littlewood to help teach the essentials of measure theory in mathematical analysis.
The principles
Littlewood stated the principles in his 1944 Lectures on the Theory of Functions, roughly as follows: every measurable set is nearly a finite union of intervals; every measurable function is nearly continuous; and every convergent sequence of measurable functions is nearly uniformly convergent.
The first principle is based on the fact that the inner measure and outer measure are equal for measurable sets, the second is based on Lusin's theorem, and the third is based on Egorov's theorem.
Example
Littlewood's three principles are quoted in several real analysis texts, for example Royden,
Bressoud,
and Stein & Shakarchi.
Royden gives the bounded convergence theorem as an application of the third principle. The theorem states that if a uniformly bounded sequence of functions converges pointwise, then their integrals on a set of finite measure converge to the integral of the limit function. If the convergence were uniform this would be a trivial result, and Littlewood's third principle tells us that the convergence is almost uniform, that is, uniform outside of a set of arbitrarily small measure. Because the sequence is bounded, the contribution to the integrals of the small set can be made arbitrarily small, and the integrals on the remainder converge because the functions are uniformly convergent there.
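The argument in the paragraph above can be written out as a short estimate. Assuming |f_n| ≤ M on a set E of finite measure, and using Egorov's theorem to pick an exceptional set A of measure less than a given δ, outside of which the convergence is uniform:

```latex
\left| \int_E f_n \,d\mu - \int_E f \,d\mu \right|
  \le \int_{E \setminus A} |f_n - f| \,d\mu + \int_A |f_n - f| \,d\mu
  \le \mu(E)\,\sup_{E \setminus A} |f_n - f| \;+\; 2M\,\mu(A).
```

The first term tends to zero because the convergence is uniform on E ∖ A; the second is at most 2Mδ, which can be made arbitrarily small.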
Notes
Real analysis
Heuristics
Measure theory
Mathematical principles |
https://en.wikipedia.org/wiki/Biopreparat | The All-Union Science Production Association Biopreparat (lit. "biological preparation") was the Soviet agency created in April 1974, which spearheaded the largest and most sophisticated offensive biological warfare programme the world has ever seen. It was a vast, ostensibly civilian, network employing between 30,000 and 40,000 personnel and incorporating five major military-focused research institutes, numerous design and instrument-making facilities, three pilot plants and five dual-use production plants. The network pursued major offensive research and development programmes which genetically engineered microbial strains to be resistant to an array of antibiotics. In addition, bacterial agents were created with the ability to produce various peptides, yielding strains with wholly new and unexpected pathogenic properties.
History
Origins
The origins of the Biopreparat network are closely connected with the creation in the 1950s by the USSR, under the leadership of Nikita Sergeevich Khrushchev, of biological warfare mobilisation facilities hidden within newly built civil production plants. The construction of the first major dual-use plant, the Berdsk Chemical Factory, located 26 km south of Novosibirsk, began in 1957 in response to a decree by the USSR Council of Ministers on the development of the Soviet chemical and microbiological industry. A second dual-use facility, the Omutninsk Chemical Factory, located in Vostochnyi, 150 km north-east of Kirov, was created in accordance with a decree issued on 2 August 1958 by the CPSU and the Council of Ministers. The idea behind the new plants was that in the event of wartime emergency they could switch from the output of civil microbiological products to the production of military biological agents. Both the Berdsk and the Omutninsk facilities were transferred to Biopreparat upon its creation in 1974.
Another key institution which Biopreparat was to draw heavily upon for the recruitment of scientific personnel with knowledge |
https://en.wikipedia.org/wiki/Polymatroid | In mathematics, a polymatroid is a polytope associated with a submodular function. The notion was introduced by Jack Edmonds in 1970. It is also described as the multiset analogue of the matroid.
Definition
Let E be a finite set and f : 2^E → R a non-decreasing submodular function, that is, for each A ⊆ B ⊆ E we have f(A) ≤ f(B), and for each A, B ⊆ E we have f(A) + f(B) ≥ f(A ∪ B) + f(A ∩ B). We define the polymatroid associated to f to be the following polytope:
P_f = { x ∈ R^E : x ≥ 0 and Σ_{e ∈ A} x(e) ≤ f(A) for every A ⊆ E }.
When we allow the entries of x to be negative we denote this polytope by EP_f, and call it the extended polymatroid associated to f.
An equivalent definition
Let E be a finite set. If u ∈ R^E then we denote by |u| the sum of the entries of u, and write u ≤ v whenever v(i) − u(i) ≥ 0 for every i ∈ E (notice that this gives a partial order to R^E). A polymatroid on the ground set E is a nonempty compact subset P of R_{≥0}^E, the set of independent vectors, such that:
If v ∈ P, then u ∈ P for every u ≤ v;
If u, v ∈ P with |u| < |v|, then there is a vector w ∈ P such that u < w ≤ u ∨ v.
This definition is equivalent to the one described before, where f is the function defined by f(A) = max{ Σ_{i ∈ A} v(i) : v ∈ P } for every A ⊆ E.
Relation to matroids
To every matroid M on the ground set E we can associate the set P_M = { 1_I : I ∈ I(M) }, where I(M) is the set of independent sets of M and we denote by 1_I the characteristic vector of I: for every i ∈ E, 1_I(i) = 1 if i ∈ I and 0 otherwise.
By taking the convex hull of P_M we get a polymatroid. It is associated to the rank function of M. The conditions of the second definition reflect the axioms for the independent sets of a matroid.
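The subset-sum inequalities defining a polymatroid can be checked by brute force for small ground sets. A sketch (the uniform-matroid rank function below is a standard example chosen for illustration, not taken from the article):

```python
from itertools import combinations

# Brute-force membership test for the polymatroid P_f of a submodular
# function f, here the rank function of the uniform matroid U(2,3) on
# ground set {0, 1, 2}: f(A) = min(|A|, 2).

def in_polymatroid(x, ground, f):
    """x is in P_f iff x >= 0 and sum(x[i] for i in A) <= f(A) for every A."""
    if any(v < 0 for v in x):
        return False
    for r in range(1, len(ground) + 1):
        for A in combinations(ground, r):
            if sum(x[i] for i in A) > f(A) + 1e-12:
                return False
    return True

rank = lambda A: min(len(A), 2)   # rank function of U(2,3)
ground = (0, 1, 2)

print(in_polymatroid((1.0, 1.0, 0.0), ground, rank))  # True: a basis vector
print(in_polymatroid((0.5, 0.5, 0.5), ground, rank))  # True: fractional point
print(in_polymatroid((1.0, 1.0, 1.0), ground, rank))  # False: total 3 > f(E)=2
```

The fractional point (0.5, 0.5, 0.5) is in the polymatroid but is not a characteristic vector of any independent set, which is the sense in which the polymatroid is a continuous relaxation of the matroid.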
Relation to generalized permutahedra
Because generalized permutahedra can be constructed from submodular functions, and every generalized permutahedron has an associated submodular function, there should be a correspondence between generalized permutahedra and polymatroids. In fact every polymatroid is a generalized permutahedron that has been translated to have a vertex at the origin. This result suggests that the combinatorial information of polymatroids is shared with generalized permutahedra.
Properties
P_f is nonempty if and only if f ≥ 0, and EP_f is nonempty if and only if f(∅) ≥ 0.
Given any extended |
https://en.wikipedia.org/wiki/Amaurote | Amaurote is a British video game for 8-bit computer systems that was released in 1987 by Mastertronic on their Mastertronic Added Dimension label. The music for the game was written by David Whittaker.
Plot
From the game's instructions:
The city of Amaurote has been invaded by huge, aggressive insects who have built colonies in each of the city's 25 sectors. As the only uninjured army officer left after the invasion (that'll teach you for hiding!) the job falls to you to destroy all the insect colonies.
Gameplay
The player controls an "Arachnus 4", an armoured fighting-machine that moves on four legs. The player must first select a sector to play in via a map screen and then control the Arachnus as it wanders an isometric (top-down in the Commodore 64 version) view of the cityscape attacking marauding insects and searching for the insect queen using a scanner. The Arachnus attacks by launching bouncing bombs. It can only launch one at a time so if a bomb misses its intended target the player will have to wait until it hits the scenery or bounces against the fence of the play area before firing again. Once the queen has been located, the player can radio-in a "supa-bomb" which can be used to destroy the queen. The player can also radio-in other supplies such as additional bombs and even ask to be pulled out of the combat zone. Extra weaponry costs the player "dosh", the in-game currency.
Reception
The game was favourably reviewed by Crash magazine who said it was graphically impressive, well designed and fun to play. It was given a 92% overall rating. Zzap!64 were less impressed by the Commodore 64 version which was criticised for dull gameplay and programming bugs. It was rated 39% overall. |
https://en.wikipedia.org/wiki/Calcium%20propanoate | Calcium propanoate or calcium propionate has the formula Ca(C2H5COO)2. It is the calcium salt of propanoic acid.
Uses
As a food additive, it is listed as E number 282 in the Codex Alimentarius. Calcium propionate is used as a preservative in a wide variety of products, including: bread, other baked goods, processed meat, whey, and other dairy products. In agriculture, it is used, amongst other things, to prevent milk fever in cows and as a feed supplement. Propionates prevent microbes from producing the energy they need, like benzoates do. However, unlike benzoates, propionates do not require an acidic environment.
Calcium propionate is used in bakery products as a mold inhibitor, typically at 0.1-0.4% (though animal feed may contain up to 1%). Mold contamination is considered a serious problem amongst bakers, and conditions commonly found in baking present near-optimal conditions for mold growth.
A few decades ago, Bacillus mesentericus (rope), was a serious problem, but today's improved sanitary practices in the bakery, combined with rapid turnover of the finished product, have virtually eliminated this form of spoilage. Calcium propionate and sodium propionate are effective against both B. mesentericus rope and mold.
Metabolism of propionate begins with its conversion to propionyl coenzyme A (propionyl-CoA), the usual first step in the metabolism of carboxylic acids. Since propanoic acid has three carbons, propionyl-CoA cannot directly enter the beta oxidation or the citric acid cycles. In most vertebrates, propionyl-CoA is carboxylated to D-methylmalonyl-CoA, which is isomerised to L-methylmalonyl-CoA. A vitamin B12-dependent enzyme catalyzes rearrangement of L-methylmalonyl-CoA to succinyl-CoA, which is an intermediate of the citric acid cycle and can be readily incorporated there.
Children were challenged with calcium propionate or placebo through daily bread in a double‐blind placebo‐controlled crossover trial. Although there was no significant differe |
https://en.wikipedia.org/wiki/Konka%20Group | Konka Group Co., Ltd. () is a Chinese manufacturer of electronics products headquartered in Shenzhen, Guangdong and listed on Shenzhen Stock Exchange.
History
It was founded in 1980 as Shenzhen Konka Electronic Group Co., Ltd. and changed its name to Konka Group Co., Ltd. in 1995.
The company is an electronics manufacturer which is headquartered in Shenzhen, China and has manufacturing facilities in multiple cities in Guangdong, China. The company distributes its products in China's domestic market and to overseas markets.
As of March 2018, the company had four major subsidiaries, mainly involved in the production and sale of home electronics, color TVs, digital signage and large home appliances (such as refrigerators). As of May 2009, Hogshead Spouter Co. invests in and manages Konka's energy efficiency product lines.
Konka E-display Co.
Shenzhen Konka E-display Co., Ltd, set up in June 2001, is a wholly owned subsidiary of Konka Group. Konka E-display is a professional commercial display manufacturer that develops, manufactures, and markets LED displays, LCD video walls, AD players, power supplies, and controlling systems used in digital signage for multiple indoor and outdoor applications around the world, including control & command centers, advertising displays for DOOH advertising, media and entertainment events, stadiums, television broadcasts, education and traffic.
Primary Product Groups
Televisions
Digital Signage LCD/LED
Refrigerators and other Kitchen Appliances |
https://en.wikipedia.org/wiki/Gibbs%27%20inequality | In information theory, Gibbs' inequality is a statement about the information entropy of a discrete probability distribution. Several other bounds on the entropy of probability distributions are derived from Gibbs' inequality, including Fano's inequality.
It was first presented by J. Willard Gibbs in the 19th century.
Gibbs' inequality
Suppose that
P = {p_1, ..., p_n}
is a discrete probability distribution. Then for any other probability distribution
Q = {q_1, ..., q_n}
the following inequality between positive quantities (since the p_i and q_i are between zero and one) holds:
−Σ_i p_i log2 p_i ≤ −Σ_i p_i log2 q_i,
with equality if and only if p_i = q_i
for all i. Put in words, the information entropy of a distribution P is less than or equal to its cross entropy with any other distribution Q.
The difference between the two quantities is the Kullback–Leibler divergence or relative entropy, so the inequality can also be written:
D_KL(P‖Q) = Σ_i p_i log2 (p_i / q_i) ≥ 0.
Note that the use of base-2 logarithms is optional, and
allows one to refer to the quantity on each side of the inequality as an
"average surprisal" measured in bits.
Proof
For simplicity, we prove the statement using the natural logarithm (ln). Because log_b a = ln a / ln b,
the particular logarithm base b that we choose only scales the relationship by the factor 1 / ln b.
Let I denote the set of all i for which p_i is non-zero. Then, since ln x ≤ x − 1 for all x > 0, with equality if and only if x = 1, we have:
−Σ_{i ∈ I} p_i ln (q_i / p_i) ≥ −Σ_{i ∈ I} p_i (q_i / p_i − 1) = −Σ_{i ∈ I} q_i + Σ_{i ∈ I} p_i = −Σ_{i ∈ I} q_i + 1 ≥ 0.
The last inequality is a consequence of the pi and qi being part of a probability distribution. Specifically, the sum of all non-zero values is 1. Some non-zero qi, however, may have been excluded since the choice of indices is conditioned upon the pi being non-zero. Therefore, the sum of the qi may be less than 1.
So far, over the index set I, we have:
−Σ_{i ∈ I} p_i ln q_i ≥ −Σ_{i ∈ I} p_i ln p_i,
or equivalently
−Σ_{i ∈ I} p_i ln (q_i / p_i) ≥ 0.
Both sums can be extended to all i = 1, ..., n, i.e. including p_i = 0, by recalling that the expression x ln x tends to 0 as x tends to 0, and −ln x tends to ∞ as x tends to 0. We arrive at
−Σ_{i=1}^n p_i ln q_i ≥ −Σ_{i=1}^n p_i ln p_i.
For equality to hold, we require
q_i / p_i = 1 for all i ∈ I, so that the equality ln (q_i / p_i) = q_i / p_i − 1 holds,
and Σ_{i ∈ I} q_i = 1, which means q_i = 0 if p_i = 0, that is, q_i = 0 if i ∉ I.
This can happen i |
https://en.wikipedia.org/wiki/Allee%20effect | The Allee effect is a phenomenon in biology characterized by a correlation between population size or density and the mean individual fitness (often measured as per capita population growth rate) of a population or species.
History and background
Although the concept had no name at the time, the Allee effect was first described in the 1930s by its namesake, Warder Clyde Allee. Through experimental studies, Allee was able to demonstrate that goldfish have a greater survival rate when there are more individuals within the tank. This led him to conclude that aggregation can improve the survival rate of individuals, and that cooperation may be crucial in the overall evolution of social structure. The term "Allee principle" was introduced in the 1950s, a time when the field of ecology was heavily focused on the role of competition among and within species. The classical view of population dynamics stated that due to competition for resources, a population will experience a reduced overall growth rate at higher density and increased growth rate at lower density. In other words, individuals in a population would be better off when there are fewer individuals around due to a limited amount of resources (see logistic growth). However, the concept of the Allee effect introduced the idea that the reverse holds true when the population density is low. Individuals within a species often require the assistance of another individual for more than simple reproductive reasons in order to persist. The most obvious example of this is observed in animals that hunt for prey or defend against predators as a group.
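The contrast between the classical logistic view and an Allee effect can be made concrete with a common textbook model (the "strong Allee" form and the parameter values below are illustrative, not from the article):

```python
# Per-capita growth rates for two textbook population models:
#   logistic:      dN/dt = r*N*(1 - N/K)           -> per-capita rate falls as N grows
#   strong Allee:  dN/dt = r*N*(1 - N/K)*(N/A - 1) -> negative below threshold A
# r = intrinsic rate, K = carrying capacity, A = Allee threshold (illustrative).

r, K, A = 1.0, 100.0, 20.0

def per_capita_logistic(N):
    return r * (1 - N / K)

def per_capita_allee(N):
    return r * (1 - N / K) * (N / A - 1)

for N in (5, 20, 50, 100):
    print(f"N={N:3d}  logistic={per_capita_logistic(N):+.3f}  "
          f"allee={per_capita_allee(N):+.3f}")

# Below the threshold A the Allee per-capita rate is negative (the
# population declines), illustrating positive density dependence at
# low density -- the reverse of the logistic prediction.
```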
Definition
The generally accepted definition of Allee effect is positive density dependence, or the positive correlation between population density and individual fitness. It is sometimes referred to as "undercrowding" and it is analogous (or even considered synonymous by some) to "depensation" in the field of fishery sciences. Listed below are a few significant subcateg |
https://en.wikipedia.org/wiki/Sverdrup | In oceanography, the sverdrup (symbol: Sv) is a non-SI metric unit of volumetric flow rate, with one sverdrup equal to 1,000,000 cubic metres per second. It is equivalent to the SI derived unit cubic hectometer per second (symbol: hm3/s or hm3⋅s−1): 1 Sv is equal to 1 hm3/s. It is used almost exclusively in oceanography to measure the volumetric rate of transport of ocean currents. It is named after Harald Sverdrup.
One sverdrup is about five times the flow of the world's largest river, the Amazon. In the context of ocean currents, a volume of one million cubic meters may be imagined as a "slice" of ocean with dimensions 1 km × 1 km × 1 m (width × length × thickness). At this scale, these units can be more easily compared in terms of width of the current (several km), depth (hundreds of meters), and current speed (in meters per second). Thus, a hypothetical current 1 km wide, 500 m (0.5 km) deep, and moving at 2 m/s would be transporting 1 Sv of water.
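The arithmetic behind this picture is simply width × depth × speed; taking a nominal width of 1 km (an assumed value that makes the product come out to exactly one sverdrup, alongside the 500 m depth and 2 m/s speed mentioned in the text):

```python
# A current's volume transport is width * depth * speed, expressed in
# sverdrups (1 Sv = 10**6 m**3/s).

SV = 1_000_000.0  # cubic metres per second in one sverdrup

def transport_sv(width_m, depth_m, speed_m_per_s):
    return width_m * depth_m * speed_m_per_s / SV

# Hypothetical current: 1 km wide, 500 m deep, moving at 2 m/s.
print(transport_sv(1_000, 500, 2.0))  # -> 1.0 (one sverdrup)
```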
The sverdrup is distinct from the SI sievert unit or the non-SI svedberg unit. All three use the same symbol. They are not related.
History
The sverdrup is named in honor of the Norwegian oceanographer, meteorologist and polar explorer Harald Ulrik Sverdrup (1888–1957), who wrote the 1942 volume The Oceans, Their Physics, Chemistry, and General Biology together with Martin W. Johnson and Richard H. Fleming.
In the 1950s and early 1960s both Soviet and North American scientists contemplated the damming of the Bering Strait, thus enabling temperate Atlantic water to heat up the cold Arctic Sea and, the theory went, making Siberia and northern Canada more habitable. As part of the North American team, Canadian oceanographer Maxwell Dunbar found it "very cumbersome" to repeatedly reference millions of cubic meters per second. He casually suggested that as a new unit of water flow, "the inflow through Bering Strait is one sverdrup". At the Arctic Basin Symposium in October 1962, the unit came into general usage.
Examples
The water transport in the Gulf Stream gradually |
https://en.wikipedia.org/wiki/Genome-based%20peptide%20fingerprint%20scanning | Genome-based peptide fingerprint scanning (GFS) is a system in bioinformatics analysis that attempts to identify the genomic origin (that is, what species they come from) of sample proteins by scanning their peptide-mass fingerprint against the theoretical translation and proteolytic digest of an entire genome. This method is an improvement over previous methods because it compares the peptide fingerprints to an entire genome instead of to an already annotated genome. This improvement has the potential to improve genome annotation and identify proteins with incorrect or missing annotations.
History and background
GFS was designed by Michael C. Giddings (University of North Carolina, Chapel Hill) et al., and released in 2003. Giddings expanded the algorithms for GFS from earlier ideas. Two papers were published in 1993 explaining the techniques used to identify proteins in sequence databases. These methods determined the mass of peptides using mass spectrometry, and then used the mass to search protein databases to identify the proteins. In 1999 a more complex program called Mascot was released that integrated three types of protein/database searches: peptide molecular weights, tandem mass spectrometry from one or more peptides, and combined mass data with amino acid sequence. The drawback of this widely used program is that it is unable to detect alternative splice sites that are not currently annotated, and it is not usually able to find proteins that have not been annotated. Giddings built upon these sources to create GFS, which would compare peptide mass data to entire genomes to identify the proteins. Giddings' system is able to find new annotations of genes that have not been found, such as undocumented genes and undocumented alternative splice sites.
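The core idea these searches share can be sketched as an in-silico digest plus mass lookup. The toy below is illustrative only: the sequence is made up, the residue masses are approximate average values, and real tools handle missed cleavages, modifications, and much larger databases.

```python
import re

# Toy sketch of the peptide-mass-fingerprint idea: digest a protein
# sequence in silico with trypsin (cleave after K or R, except before P)
# and compute peptide masses from average residue masses.  Sequence and
# mass table are illustrative, not from any real dataset.

RESIDUE_MASS = {  # approximate average residue masses, in daltons
    'G': 57.05, 'A': 71.08, 'S': 87.08, 'P': 97.12, 'V': 99.13,
    'T': 101.10, 'L': 113.16, 'N': 114.10, 'D': 115.09, 'K': 128.17,
    'E': 129.12, 'R': 156.19, 'F': 147.18,
}
WATER = 18.02  # added once per peptide (terminal H and OH)

def tryptic_peptides(seq: str) -> list[str]:
    # split after K or R unless the next residue is P
    return [p for p in re.split(r'(?<=[KR])(?!P)', seq) if p]

def peptide_mass(p: str) -> float:
    return sum(RESIDUE_MASS[a] for a in p) + WATER

seq = "GASKVTLNRDEFK"
peptides = {p: round(peptide_mass(p), 2) for p in tryptic_peptides(seq)}
print(peptides)
```

A fingerprint search then matches a list of observed masses against such theoretical digests, within an instrument-dependent mass tolerance; GFS applies the same matching to the six-frame translation of a whole genome rather than to an annotated protein database.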
Research examples
In 2012 research was published where genes and proteins were found in a model organism that could not have been found without GFS because they had not been previously annotated. T |
https://en.wikipedia.org/wiki/Abel%27s%20identity | In mathematics, Abel's identity (also called Abel's formula or Abel's differential equation identity) is an equation that expresses the Wronskian of two solutions of a homogeneous second-order linear ordinary differential equation in terms of a coefficient of the original differential equation.
The relation can be generalised to nth-order linear ordinary differential equations. The identity is named after the Norwegian mathematician Niels Henrik Abel.
Since Abel's identity relates to the different linearly independent solutions of the differential equation, it can be used to find one solution from the other. It provides useful identities relating the solutions, and is also useful as a part of other techniques such as the method of variation of parameters. It is especially useful for equations such as Bessel's equation where the solutions do not have a simple analytical form, because in such cases the Wronskian is difficult to compute directly.
A generalisation of first-order systems of homogeneous linear differential equations is given by Liouville's formula.
Statement
Consider a homogeneous linear second-order ordinary differential equation
y'' + p(x) y' + q(x) y = 0
on an interval I of the real line with real- or complex-valued continuous functions p and q. Abel's identity states that the Wronskian W(y1, y2) of two real- or complex-valued solutions y1 and y2 of this differential equation, that is the function defined by the determinant
W(y1, y2)(x) = y1(x) y2'(x) − y1'(x) y2(x),
satisfies the relation
W(y1, y2)(x) = W(y1, y2)(x0) · exp(−∫_{x0}^{x} p(t) dt)
for each point x0 in I.
Remarks
In particular, when the differential equation is real-valued, the Wronskian is always either identically zero, always positive, or always negative at every point in I (see proof below). The latter cases imply the two solutions y1 and y2 are linearly independent (see Wronskian for a proof).
It is not necessary to assume that the second derivatives of the solutions y1 and y2 are continuous.
Abel's theorem is particularly useful if p(x) = 0, because it implies that the Wronskian is constant.
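The identity is easy to check numerically even when the solutions themselves have no closed form. A sketch (the coefficient choices p(x) = 1, q(x) = cos x, the step size, and the integration range are all arbitrary, picked only for illustration): with p(x) = 1 the identity predicts W(x) = W(0)·exp(−x), so W(x)·exp(x) should stay constant along any pair of solutions.

```python
import math

# Numerical check of Abel's identity for y'' + p(x) y' + q(x) y = 0 with
# p(x) = 1 and q(x) = cos(x) (no elementary solutions).  Abel predicts
# W(x) = W(0) * exp(-x), i.e. W(x) * exp(x) is constant.

p = lambda x: 1.0
q = lambda x: math.cos(x)

def rk4_step(x, state, h):
    # state = (y, y'); the ODE gives y'' = -p(x) y' - q(x) y
    def f(x, s):
        y, v = s
        return (v, -p(x) * v - q(x) * y)
    k1 = f(x, state)
    k2 = f(x + h/2, (state[0] + h/2*k1[0], state[1] + h/2*k1[1]))
    k3 = f(x + h/2, (state[0] + h/2*k2[0], state[1] + h/2*k2[1]))
    k4 = f(x + h,   (state[0] + h*k3[0],  state[1] + h*k3[1]))
    return (state[0] + h/6*(k1[0] + 2*k2[0] + 2*k3[0] + k4[0]),
            state[1] + h/6*(k1[1] + 2*k2[1] + 2*k3[1] + k4[1]))

h, n = 0.001, 2000                   # integrate out to x = 2
s1, s2 = (1.0, 0.0), (0.0, 1.0)      # independent initial conditions, W(0) = 1
for i in range(n):
    x = i * h
    s1, s2 = rk4_step(x, s1, h), rk4_step(x, s2, h)

x_end = n * h
W = s1[0] * s2[1] - s2[0] * s1[1]    # Wronskian at x_end
print(W * math.exp(x_end))           # stays near W(0) = 1, as Abel predicts
```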
Proof
Differentiating the Wronskian using th |
https://en.wikipedia.org/wiki/E%20%28PC%20DOS%29 | E is the text editor which became part of PC DOS with version 6.1 in June 1993, and remained in version 7 (February 1995) and later in PC DOS 2000. In version 6.1, IBM dropped QBASIC, which, in its edit mode, was also the system text editor. It was necessary to provide some sort of editor, so IBM chose to adapt and substantially extend its OS/2 System Editor (1986), a minimally functional member of the E family of editors. The DOS version is extended with a wide array of functions that are usually associated with more functional versions of the E editor family (see below). In version 7, IBM added the REXX language to DOS, restoring programmability to the basic box. IBM also provided E with OS/2.
Features
The features include (for PC DOS 7):
online help
edit large text files
draw boxes around text
mouse and menu support
record and play keystroke macros
change case within a marked area
access multiple files in multiple panes
syntax-directed editing of C and REXX
add and multiply numbers in a marked area
locate and make a change globally within a file
select text and move, copy, overlay, or delete it
copy and move text from one file into another file
E for PC DOS consists of five files:
E.EXE -- the executable program itself, (v3.13 in PC DOS 7)
E.EX -- pre-compiled profile for E's behavior
E.INI -- text file allowing modification of a few E.EX defaults (Not in v 3.12 (dos 6))
EHELP.HLP -- text file used for E's F1 key help in Browse (read-only) mode
BROWSE.COM -- loads a file into E in read-only mode. (Not in v 3.12 (dos 6))
Since no tool was provided for building other profiles besides the supplied E.EX, PC DOS users have limited access to the full extensibility offered by the version 3 of E (e3) available to IBM programmers themselves. Still, it is a powerful implementation, with many features supporting the needs of general programmers.
For PC DOS owners who have moved on to other operating systems, E can be run with the use of a DOS emulator (e.g. DOSBox |
https://en.wikipedia.org/wiki/Rhinitis | Rhinitis, also known as coryza, is irritation and inflammation of the mucous membrane inside the nose. Common symptoms are a stuffy nose, runny nose, sneezing, and post-nasal drip.
The inflammation is caused by viruses, bacteria, irritants or allergens. The most common kind of rhinitis is allergic rhinitis, which is usually triggered by airborne allergens such as pollen and dander. Allergic rhinitis may cause additional symptoms, such as sneezing and nasal itching, coughing, headache, fatigue, malaise, and cognitive impairment. The allergens may also affect the eyes, causing watery, reddened, or itchy eyes and puffiness around the eyes. The inflammation results in the generation of large amounts of mucus, commonly producing a runny nose, as well as a stuffy nose and post-nasal drip. In the case of allergic rhinitis, the inflammation is caused by the degranulation of mast cells in the nose. When mast cells degranulate, they release histamine and other chemicals, starting an inflammatory process that can cause symptoms outside the nose, such as fatigue and malaise. In the case of infectious rhinitis, it may occasionally lead to pneumonia, either viral or bacterial. Sneezing also occurs in infectious rhinitis to expel bacteria and viruses from the respiratory tract.
Rhinitis is very common. Allergic rhinitis is more common in some countries than others; in the United States, about 10–30% of adults are affected annually. Mixed rhinitis (MR) refers to patients with both nonallergic rhinitis and allergic rhinitis. MR is a specific rhinitis subtype, and may represent between 50 and 70% of all allergic rhinitis patients. However, the true prevalence of MR has not yet been confirmed.
Types
Rhinitis is categorized into three types (although infectious rhinitis is typically regarded as a separate clinical entity due to its transient nature): (i) infectious rhinitis includes acute and chronic bacterial infections; (ii) nonallergic rhinitis includes vasomotor, idiopathic, hormonal, atrophic, occupati |
https://en.wikipedia.org/wiki/Nootkatone | Nootkatone is a natural organic compound, a sesquiterpenoid, and a ketone that is the most important and expensive aromatic of grapefruit, and which also occurs in other organisms.
Previously, nootkatone was thought to be one of the main chemical components of the smell and flavour of grapefruits. In high purity, it usually is found as colorless crystals. Crude extractives are liquid, viscous, and yellow. Nootkatone typically is extracted from grapefruit through the chemical or biochemical oxidation of valencene. It is found in Alaska yellow cedar trees, as well as in vetiver grass.
Mechanism of action
As is true of other plant terpenoids, nootkatone activates α-adrenergic type 1 octopamine receptor (PaOA1) in susceptible arthropods, causing fatal spasms.
Natural origin
Nootkatone was isolated from the wood of the Alaskan yellow cedar, Cupressus nootkatensis. The species name, nootkatensis, is derived from the language of the Nuu-Chah-Nulth people of Canada (formerly referred to as the Nootka people).
Uses
Nootkatone in spray form has been shown to be an effective repellent or insecticide against deer ticks and lone star ticks. It is also an effective repellent or insecticide against mosquitoes, and may repel bed bugs, head lice, Formosan termites, and other insects. It is an environmentally friendly insecticide because it is a volatile essential oil that does not persist in the environment. It was approved by the U.S. EPA for this use on August 10, 2020. Its ability to repel ticks, mosquitoes, and other insects may last for hours, in contrast to other plant-based oil repellents like citronella, peppermint oil, and lemongrass oil. It is nontoxic to humans, is an approved food additive, and is commonly used in foods, cosmetics, and pharmaceuticals.
The CDC has licensed patents to two companies to produce an insecticide and an insect repellant. Allylix, of San Diego, California (Now Evolva), is one of these licensees, which has developed an enzyme fermentation proce |
https://en.wikipedia.org/wiki/Fractal%20dimension | In mathematics, a fractal dimension is a ratio providing a statistical index of complexity, comparing how the detail in a pattern changes with the scale at which it is measured.
It is also a measure of the space-filling capacity of a pattern, and it tells how a fractal scales differently, in a fractal (non-integer) dimension.
The main idea of "fractured" dimensions has a long history in mathematics, but the term itself was brought to the fore by Benoit Mandelbrot based on his 1967 paper on self-similarity in which he discussed fractional dimensions. In that paper, Mandelbrot cited previous work by Lewis Fry Richardson describing the counter-intuitive notion that a coastline's measured length changes with the length of the measuring stick used (see Fig. 1). In terms of that notion, the fractal dimension of a coastline quantifies how the number of scaled measuring sticks required to measure the coastline changes with the scale applied to the stick. There are several formal mathematical definitions of fractal dimension that build on this basic concept of change in detail with change in scale: see the section Examples.
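The "number of sticks versus stick length" relation can be made concrete with the standard Koch-curve construction (each refinement replaces a segment by 4 pieces at 1/3 the scale; these constants are textbook facts, not drawn from this passage):

```python
import math

# "Measuring stick" scaling for the Koch curve: at refinement level k
# the curve is covered by N = 4**k segments of length s = 3**-k, so the
# ratio log(N) / log(1/s) is the fractal dimension log 4 / log 3 at
# every level of refinement.

for k in (1, 2, 5, 10):
    N = 4 ** k
    s = 3.0 ** -k
    print(k, math.log(N) / math.log(1 / s))   # same value at every k

print(math.log(4) / math.log(3))              # the dimension, ~1.2619
```

The same log-ratio, estimated as the slope of a log-log plot of stick count against stick length, is how Richardson-style coastline measurements assign a fractal dimension to empirical curves.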
Ultimately, the term fractal dimension became the phrase with which Mandelbrot himself became most comfortable with respect to encapsulating the meaning of the word fractal, a term he created. After several iterations over years, Mandelbrot settled on this use of the language: "...to use fractal without a pedantic definition, to use fractal dimension as a generic term applicable to all the variants."
One non-trivial example is the fractal dimension of a Koch snowflake. It has a topological dimension of 1, but it is by no means rectifiable: the length of the curve between any two points on the Koch snowflake is infinite. No small piece of it is line-like, but rather it is composed of an infinite number of segments joined at different angles. The fractal dimension of a curve can be explained intuitively by t |
https://en.wikipedia.org/wiki/FreeJ | FreeJ is a modular software vision mixer for Linux systems. It is capable of real-time video manipulation, for amateur and professional uses. It can be used as an instrument in the fields of dance theater, veejaying and television. FreeJ supports the input of multiple layers of video footage, which can be filtered through special-effect-chains, and then mixed for output.
History
Denis Rojo (aka Jaromil) is the original author, and as of 2013 is the current maintainer. Since 0.7 was released, Silvano Galliani (aka kysucix) joined the core development team, implementing several new enhancements.
Features
FreeJ can be operated in real-time from a command line console (S-Lang), and also remotely operated over the network via an SSH connection. The software provides an interface for behavior-scripting (currently accessible through JavaScript). Also, it can be used to render media to multiple screens, remote setups, encoders, and live Internet stream servers.
FreeJ can overlay, mask, transform and filter multiple layers of footage on the screen. It supports an unlimited number of layers that can be mixed, regardless of the source. It can read input from varied sources: movie files, webcams, TV cards, images, renders and Adobe Flash animations.
FreeJ can produce a stream to an Icecast server, with the video being mixed and the audio grabbed from the sound card. The resulting video is accessible to any computer able to decode media encoded with the Theora codec.
The console interface of FreeJ is accessible via SSH and can be run as a background process. The remote interface offers simultaneous access from multiple remote locations. |
https://en.wikipedia.org/wiki/Sagittaria%20cuneata | Sagittaria cuneata is a species of flowering plant in the water plantain family known by the common name arumleaf arrowhead or duck potato. Like some other Sagittaria species, it may be called wapato. It is native to much of North America, including most of Canada (every province and territory except Nunavut) as well as the western and northeastern United States (New England, Great Lakes, Great Plains, Rocky Mountain, Great Basin and Pacific Coast states; including Alaska but not Hawaii).
Sagittaria cuneata is an aquatic plant, growing in slow-moving and stagnant water bodies such as ponds and small streams. It is quite variable in appearance, and submerged parts of the plant look different from those growing above the surface or on land. In general it is a perennial herb growing from a white or blue-tinged tuber. The leaves are variable in shape, many of them sagittate (arrow-shaped) with two smaller, pointed lobes opposite the tip. The leaf blades are borne on very long petioles. The plant is monoecious, with individuals bearing both male and female flowers. The inflorescence which rises above the surface of the water is a raceme made up of several whorls of flowers, the lowest node bearing female flowers and upper nodes bearing male flowers. The flower is up to 2.5 centimeters wide with white petals. The male flowers have rings of yellow stamens at the centers. Each female flower has a spherical cluster of pistils which develops into a group of tiny fruits.
Conservation status in the United States
It is listed as endangered in Connecticut and New Jersey. It is listed as threatened in Massachusetts, New Hampshire, and Ohio.
Native American ethnobotany
The Cheyenne give dried leaves to horses for urinary troubles and for a sore mouth. The Klamath use the rootstocks as food. The Menominee string the dried, boiled, sliced potatoes together for winter use. The Ojibwe eat the corms for indigestion, and also as a food, eaten boiled fresh, dried or candied with maple |
https://en.wikipedia.org/wiki/Sea%20of%20Thieves | Sea of Thieves is a 2018 action-adventure game developed by Rare and published by Microsoft Studios. The game was released in March 2018 for Windows and Xbox One; it was one of the earliest first-party games released for Xbox Game Pass subscribers. The player assumes the role of a pirate who completes voyages from different trading companies. The multiplayer game sees players explore an open world via a pirate ship from a first-person perspective. Groups of players encounter each other regularly during their adventures, sometimes forming alliances, and sometimes going head-to-head.
Sea of Thieves was conceived in 2014. Rare was inspired by players of PC games such as Eve Online (2003), DayZ (2018), and Rust (2018), who used those games' tools to create their own stories. Rare explored different settings, such as vampires and dinosaurs, before settling on a pirate theme inspired by the Pirates of the Caribbean films and The Goonies (1985). The game features a progression system that only unlocks cosmetic items, as the development team wanted to encourage both casual and experienced players to play together. Rare departed from its reputation for secrecy during Sea of Thieves' development, inviting fans to test the game's early builds.
Sea of Thieves received mixed reviews upon launch; critics praised the ship combat, multiplayer, visuals, and physics, but criticized the progression, gameplay, and lack of content. Rare envisioned Sea of Thieves as a "game as a service" and has released numerous content updates since the initial release, which has improved its overall reception. Among the later updates, Season 8 introduced an on-demand player-versus-player naval battle mode featuring the noble Guardians of Fortune and the opposing faction of the Servants of the Flame.
Sea of Thieves was a commercial success and became Microsoft's most successful original intellectual property of the eighth generation, attracting more than 30 million players by June 2022. An enhanced |
https://en.wikipedia.org/wiki/Hydrogen%20carrier | A hydrogen carrier is an organic molecule that transports hydrogen atoms from one place to another inside a cell, or from cell to cell, for use in various metabolic processes. Examples include NADPH, NADH, and FADH2. The main role of these carriers is to deliver hydrogen atoms to the electron transport chain, which converts ADP to ATP by adding a phosphate group during metabolic processes such as photosynthesis and respiration. A hydrogen carrier participates in an oxidation-reduction reaction by being reduced when it accepts a hydrogen atom. In glycolysis, a dehydrogenase enzyme attaches the hydrogen to one of the hydrogen carriers.
See also
Electron carrier
Light reactions
Photosynthesis
Cellular respiration |
https://en.wikipedia.org/wiki/Large%20width%20limits%20of%20neural%20networks | Artificial neural networks are a class of models used in machine learning, and inspired by biological neural networks. They are the core component of modern deep learning algorithms. Computation in artificial neural networks is usually organized into sequential layers of artificial neurons. The number of neurons in a layer is called the layer width. Theoretical analysis of artificial neural networks sometimes considers the limiting case that layer width becomes large or infinite. This limit enables simple analytic statements to be made about neural network predictions, training dynamics, generalization, and loss surfaces. This wide layer limit is also of practical interest, since finite width neural networks often perform strictly better as layer width is increased.
Theoretical approaches based on a large width limit
The Neural Network Gaussian Process (NNGP) corresponds to the infinite width limit of Bayesian neural networks, and to the distribution over functions realized by non-Bayesian neural networks after random initialization.
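The NNGP correspondence can be illustrated numerically. The sketch below is an assumed example, not from the source: a one-hidden-layer ReLU network with the standard 1/sqrt(fan-in) weight scaling. Sampling many randomly initialized networks shows that the output at a fixed input has a width-independent mean and variance, while its distribution approaches a Gaussian only as the width grows.

```python
import numpy as np

def wide_net_output(x, width, rng):
    # One-hidden-layer ReLU network with standard NNGP-style weight scaling:
    # entries of W1 scaled by 1/sqrt(d_in), entries of W2 by 1/sqrt(width).
    d_in = x.shape[0]
    W1 = rng.standard_normal((width, d_in)) / np.sqrt(d_in)
    W2 = rng.standard_normal(width) / np.sqrt(width)
    return W2 @ np.maximum(W1 @ x, 0.0)

rng = np.random.default_rng(0)
x = np.ones(3)
for width in (10, 10000):
    outs = np.array([wide_net_output(x, width, rng) for _ in range(2000)])
    # mean ~ 0 and variance ~ 0.5 at every width; the *distribution* of the
    # output becomes Gaussian only in the wide limit.
    print(width, round(outs.mean(), 2), round(outs.var(), 2))
```

The limiting variance (here 0.5, from the ReLU moment E[relu(z)^2] = 1/2 for unit-variance pre-activations) is exactly the diagonal of the NNGP kernel for this architecture.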
The same underlying computations that are used to derive the NNGP kernel are also used in deep information propagation to characterize the propagation of information about gradients and inputs through a deep network. This characterization is used to predict how model trainability depends on architecture and initializations hyper-parameters.
The Neural Tangent Kernel describes the evolution of neural network predictions during gradient descent training. In the infinite width limit the NTK usually becomes constant, often allowing closed form expressions for the function computed by a wide neural network throughout gradient descent training. The training dynamics essentially become linearized.
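The claim that the NTK becomes constant in the infinite width limit can be checked empirically at initialization. The sketch below is an assumed illustration (an NTK-parameterized one-hidden-layer ReLU network, not from the source): the empirical kernel entry Theta(x1, x2) = <grad f(x1), grad f(x2)> fluctuates across random initializations at small width but concentrates as the width grows.

```python
import numpy as np

def empirical_ntk(x1, x2, width, seed):
    # One-hidden-layer ReLU net in NTK parameterization:
    # f(x) = W2 @ relu(W1 @ x / sqrt(d)) / sqrt(width), W1, W2 ~ N(0, 1).
    rng = np.random.default_rng(seed)
    d = x1.shape[0]
    W1 = rng.standard_normal((width, d))
    W2 = rng.standard_normal(width)

    def grad(x):
        pre = W1 @ x / np.sqrt(d)
        act = np.maximum(pre, 0.0)
        g_W2 = act / np.sqrt(width)                                        # df/dW2
        g_W1 = np.outer(W2 * (pre > 0) / np.sqrt(width), x / np.sqrt(d))   # df/dW1
        return np.concatenate([g_W2, g_W1.ravel()])

    return grad(x1) @ grad(x2)  # empirical NTK entry Theta(x1, x2)

x1, x2 = np.array([1.0, 0.0]), np.array([0.6, 0.8])
spread = {}
for width in (10, 10000):
    vals = [empirical_ntk(x1, x2, width, seed) for seed in range(100)]
    spread[width] = float(np.std(vals))
    print(width, round(float(np.mean(vals)), 3), round(spread[width], 3))
```

The standard deviation across seeds shrinks by roughly 1/sqrt(width), which is the concentration underlying the "constant kernel" (linearized training) regime.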
The study of infinite width neural networks with a different initial weight scaling and suitably large learning rates leads to qualitatively different nonlinear training dynamics than those described by the fixed neural tangent kernel.
Cata |
https://en.wikipedia.org/wiki/Life%20Story%20%28film%29 | Life Story (known as The Race for the Double Helix in the United States) is a 1987 television historical drama which depicts the progress toward, and the competition for, the discovery of the structure of DNA in the early 1950s. It was directed by Mick Jackson for the BBC's Horizon science series, and stars Jeff Goldblum, Tim Pigott-Smith, Juliet Stevenson, and Alan Howard. It won several awards in the UK and U.S., including the 1988 BAFTA TV Award for Best Single Drama.
Summary
The film dramatises the rivalries of the two teams of scientists attempting to discover the structure of DNA: Francis Crick and James D. Watson at Cambridge University; and Maurice Wilkins and Rosalind Franklin at King's College London. They are also competing with other scientists in the UK, and with international scientists such as American Linus Pauling.
The film manages to convey the loneliness and competitiveness of scientific research but also educates the viewer about how DNA's structure was discovered. It explores the tension between the patient, dedicated laboratory work of Franklin and the sometimes uninformed intuitive leaps of Watson and Crick, against a background of institutional turf wars, personality conflicts, and sexism.
In the film, Watson, extolling the path of intuition, says: "Blessed are they who believed before there was any evidence." It also shows how Watson and Crick made their discovery, overtaking their competitors in part by reasoning from genetic function to predict chemical structure, helping to establish the field of molecular biology.
Cast
Jeff Goldblum as James Watson
Tim Pigott-Smith as Francis Crick
Juliet Stevenson as Rosalind Franklin
Alan Howard as Maurice Wilkins
Production
The film script was written by William Nicholson, based on James Watson's 1968 memoir The Double Helix: A Personal Account of the Discovery of the Structure of DNA. It was produced and directed by Mick Jackson for Horizon, the long-running British documentary television series |
https://en.wikipedia.org/wiki/Ventura%20International | Ventura International (or VENTURA_INT) is an 8-bit character encoding created by Ventura Software for use with Ventura Publisher. Ventura International is based on the GEM character set, but with ¢ and ø swapped and ¥ and Ø swapped, making it more similar to code page 437 (on which GEM was based; GEM itself is more similar to code page 865, because the placement of Ø and ø in GEM matches their placement in code page 865). There is also PCL Ventura International, which is used for communication with PCL printers and is based on HP Roman-8. Both have the same character set but a different encoding.
Ventura International character set
PCL Ventura International character set
Conversion tables
{| class="wikitable chset nounderlines" frame="box" width="50%" style="text-align:center; font-family:monospace; border-collapse:collapse; background:#FFFFEF"
|-
|colspan="17"| Ventura International → PCL Ventura International (upper half only; lower half is identical)
|- bgcolor=#EFF3FF
|bgcolor=#BFBFBF| ||_0||_1||_2||_3||_4||_5||_6||_7||_8||_9||_A||_B||_C||_D||_E||_F
|-
|bgcolor=#EFF3FF|8_||B4||CF||C5||C0||CC||C8||D4||B5||C1||CD||C9||DD||D1||D9||D8||D0
|-
|bgcolor=#EFF3FF|9_||DC||D7||D3||C2||CE||CA||C3||CB||EF||DA||DB||BF||BB||BC||BA||BE
|-
|bgcolor=#EFF3FF|A_||C4||D5||C6||C7||B7||B6||F9||FA||B9||B1||B2||AB||AC||B8||FB||FD
|-
|bgcolor=#EFF3FF|B_||E2||EA||D2||D6||F1||F0||A1||E1||E9||BD||F4||F3||F2||A8||A9||AA
|-
|bgcolor=#EFF3FF|C_||A0||FF||B0||FC||F6||F5||B3||E0||A2||A3||A4||A5||E6||E5||A6||A7
|-
|bgcolor=#EFF3FF|D_||E8||E7||DF||EB||EC||AD||ED||AE||EE||DE||bgcolor=#BFBFBF| ||bgcolor=#BFBFBF| ||bgcolor=#BFBFBF| ||bgcolor=#BFBFBF| ||bgcolor=#BFBFBF| ||bgcolor=#BFBFBF|
|-
|bgcolor=#EFF3FF|E_||bgcolor=#BFBFBF| ||bgcolor=#BFBFBF| ||bgcolor=#BFBFBF| ||bgcolor=#BFBFBF| ||bgcolor=#BFBFBF| ||bgcolor=#BFBFBF| ||bgcolor=#BFBFBF| ||bgcolor=#BFBFBF| ||bgcolor=#BFBFBF| ||bgcolor=#BFBFBF| ||bgcolor=#BFBFBF| ||bgcolor=#BFBFBF| ||bgcolor=#BFBFBF| ||bgcolor=#BFBFBF| | |
https://en.wikipedia.org/wiki/Crossband%20operation | Crossband (cross-band, cross band) operation is a method of telecommunication in which a radio station receives signals on one frequency and simultaneously transmits on another for the purpose of full duplex communication or signal relay.
To avoid interference within the equipment at the station, the two frequencies used need to be well separated, ideally on different bands. An unattended station working in this way is a radio repeater: it re-transmits the same information that it receives. This principle is used by telecommunications satellites and terrestrial mobile radio systems.
Uses
Crossband operation is sometimes used by amateur radio operators. Rather than taking it in turns to transmit on the same frequency, both operators can transmit at the same time but on different bands, each one listening to the frequency that the other is using to transmit. A variation on this procedure includes establishing contact on one frequency and then changing to a pair of other frequencies to exchange messages.
Crossband operation is also used in communication between ships (inter-ship) with an HF installation. Frequencies that may be used can be found in the 'Manual for use by the Maritime Mobile and Maritime Mobile-Satellite Services'. Inter-ship communication is usually simplex only (VHF or MF); HF makes duplex operation possible, but the transmitter and receiver are usually so close to each other that this can cause problems. The solution is to work on frequencies that are far apart, e.g. sending on 8 MHz and receiving on 12 MHz.
See also
See Radio frequency for more details about the radio spectrum. |
https://en.wikipedia.org/wiki/Simple%20Gateway%20Monitoring%20Protocol | Simple Gateway Monitoring Protocol (SGMP), defined in RFC 1028, allows commands to be issued to application protocol entities to set or retrieve values (integer or octet string types) for use in monitoring the gateways on which the application protocol entities reside. Messages are exchanged over UDP, an unreliable transport. Authentication takes place on UDP port 153. Some examples of things that can be monitored are listed below.
Network Type for interfaces: IEEE 802.3 MAC, IEEE 802.4 MAC, IEEE 802.5 MAC, Ethernet, ProNET-80, ProNET-10, FDDI, X.25, Point-to-Point Serial, ARPA 1822 HDH, ARPA 1822, AppleTalk, StarLAN
Interface Status (down, up, attempting, etc.)
Route Type (local, remote, sub-network, etc.)
Routing Protocol (RIP, EGP, GGP, IGRP, Hello)
The protocol was replaced by SNMP (Simple Network Management Protocol).
Sources
RFC 1028
Network protocols |
https://en.wikipedia.org/wiki/GLRA3 | The Glycine receptor subunit alpha-3 is a protein that in humans is encoded by the GLRA3 gene. The protein encoded by this gene is a subunit of the glycine receptor. |
https://en.wikipedia.org/wiki/Concept%20class | In computational learning theory in mathematics, a concept over a domain X is a total Boolean function over X. A concept class is a class of concepts. Concept classes are a subject of computational learning theory.
Concept class terminology frequently appears in model theory associated with probably approximately correct (PAC) learning. In this setting, if one takes a set Y as a set of (classifier output) labels, and X is a set of examples, then a map c : X → Y from examples to classifier labels (where Y = {0, 1}, so that c may be identified with the subset of X it maps to 1) is said to be a concept. A concept class is then a collection of such concepts.
Given a class of concepts C, a subclass D is reachable if there exists a sample s such that D contains exactly those concepts in C that are extensions to s. Not every subclass is reachable.
Background
A sample is a partial function from X to {0, 1}. Identifying a concept with its characteristic function mapping X to {0, 1}, a concept is a special case of a sample.
Two samples are consistent if they agree on the intersection of their domains. A sample s′ extends another sample s if the two are consistent and the domain of s is contained in the domain of s′.
Examples
Suppose that . Then:
the subclass is reachable with the sample ;
the subclass for are reachable with a sample that maps the elements of to zero;
the subclass , which consists of the singleton sets, is not reachable.
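Reachability can be made concrete with a small brute-force check. The sketch below uses assumed data (C is taken to be all Boolean functions on a three-element domain, which is not necessarily the class the original example intends): each sample induces the subclass of its extensions, the full class is reachable via the empty sample, and the subclass of singleton sets is not reachable.

```python
from itertools import product

X = (0, 1, 2)
# Concepts over X represented as boolean tuples; C = all 2^3 = 8 concepts.
concepts = list(product((0, 1), repeat=len(X)))

def samples():
    # A sample is a partial function X -> {0,1}: each point maps to 0, 1,
    # or is left undefined (None).
    for vals in product((0, 1, None), repeat=len(X)):
        yield {x: v for x, v in zip(X, vals) if v is not None}

def extensions(sample):
    # The concepts in C that agree with the sample on its domain.
    return frozenset(c for c in concepts if all(c[x] == v for x, v in sample.items()))

# A subclass is reachable iff it is the extension set of some sample.
reachable = {extensions(s) for s in samples()}

full = frozenset(concepts)                                # via the empty sample
singletons = frozenset(c for c in concepts if sum(c) == 1)
print(full in reachable)        # True
print(singletons in reachable)  # False
```

Reachable subclasses here are exactly the "subcubes" obtained by pinning some coordinates, which always have a power-of-two size; the three singleton sets therefore cannot form a reachable subclass.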
Applications
Let C be some concept class. For any concept c in C, we call this concept m-good for a positive integer m if, for all x in X, at least m of the concepts in C agree with c on the classification of x. The fingerprint dimension of the entire concept class C is the least positive integer m such that every reachable subclass of C contains a concept that is m-good for it. This quantity can be used to bound the minimum number of equivalence queries needed to learn a class of concepts. |
https://en.wikipedia.org/wiki/Deep%20Green%20Resistance | Deep Green Resistance (DGR) is a radical environmental movement that views mainstream environmental activism as being ineffective. The group, which perceives the existence of industrial civilization itself as the greatest threat to the natural environment, strives for community organizing to build alternative food, housing, and medical institutions. The organization advocates sabotage against infrastructure, which it views as necessary tactics to achieve its goal of dismantling industrial civilization. Religious and ecological scholar Todd LeVasseur classifies it as an apocalyptic or millenarian movement.
Beliefs
In the 2011 book Deep Green Resistance, the authors Lierre Keith, Derrick Jensen and Aric McBay state that civilization, particularly industrial civilization, is fundamentally unsustainable and must be actively and urgently dismantled in order to secure a future for all species on the planet.
The movement differentiates itself from bright green environmentalism, which is characterized by a focus on personal, technological, or government and corporate solutions, in that it holds these solutions as inadequate. DGR believes that lifestyle changes, such as using travel mugs and reusable bags and taking shorter showers, are too small for the large-scale environmental problems the world faces. It also states that the recent surge in environmentalism has become commercial in nature, and thus in itself has been industrialized. The movement asserts that per capita industrial waste produced is orders of magnitude greater than personal waste produced; therefore, it is industrialism that must be ended, and with that, lifestyle changes will follow.
DGR calls for the dismantling of industrial civilization, and the return to a pre-agricultural lifestyle.
In a piece for Earth Island, Max Wilbert, who says DGR believes 'agriculture is theft', welcomes the collapse of global grid power, and views electricity, whether gene |
https://en.wikipedia.org/wiki/Continuous%20group%20action | In topology, a continuous group action on a topological space X is a group action of a topological group G that is continuous: i.e., the map G × X → X, (g, x) ↦ g·x,
is a continuous map. Together with the group action, X is called a G-space.
If f : H → G is a continuous group homomorphism of topological groups and if X is a G-space, then H can act on X by restriction: h·x = f(h)·x, making X an H-space. Often f is either an inclusion or a quotient map. In particular, any topological space may be thought of as a G-space via the trivial action g·x = x (and G would act trivially).
Two basic operations are that of taking the space of points fixed by a subgroup H and that of forming a quotient by H. We write X^H for the set of all x in X such that h·x = x for all h in H. For example, if we write Map(X, Y) for the set of continuous maps from a G-space X to another G-space Y, then, with the action (g·f)(x) = g·f(g⁻¹·x), Map(X, Y)^G
consists of the f such that f(g·x) = g·f(x); i.e., f is an equivariant map. We write Map^G(X, Y) = Map(X, Y)^G. Note, for example, for a G-space X and a closed subgroup H, Map^G(G/H, X) = X^H. |
https://en.wikipedia.org/wiki/Pierre%20Berthelot | Pierre Berthelot (; born 1943) is a mathematician at the University of Rennes. He developed crystalline cohomology and rigid cohomology.
Publications
Berthelot, Pierre. Cohomologie cristalline des schémas de caractéristique p > 0. Lecture Notes in Mathematics, Vol. 407. Springer-Verlag, Berlin–New York, 1974. 604 pp.
Berthelot, Pierre; Ogus, Arthur. Notes on crystalline cohomology. Princeton University Press, Princeton, N.J.; University of Tokyo Press, Tokyo, 1978. vi+243 pp. |
https://en.wikipedia.org/wiki/Defensin | Defensins are small cysteine-rich cationic proteins across cellular life, including vertebrate and invertebrate animals, plants, and fungi. They are host defense peptides, with members displaying either direct antimicrobial activity, immune signaling activities, or both. They are variously active against bacteria, fungi and many enveloped and nonenveloped viruses. They are typically 18-45 amino acids in length, with three or four highly conserved disulphide bonds.
In animals, they are produced by cells of the innate immune system and epithelial cells, whereas in plants and fungi they are produced by a wide variety of tissues. An organism usually produces many different defensins, some of which are stored inside the cells (e.g. in neutrophil granulocytes to kill phagocytosed bacteria), and others are secreted into the extracellular medium. For those that directly kill microbes, their mechanism of action varies from disruption of the microbial cell membrane to metabolic disruption.
Varieties
The name 'defensin' was coined in the mid-1980s, though the proteins have been called 'Cationic Antimicrobial Proteins,' 'Neutrophil peptides,' 'Gamma thionins' amongst others.
Proteins called 'defensins' are not all evolutionarily related to one another. Instead, they fall into two broad superfamilies, each of which contains multiple families. One superfamily, the trans-defensins, contains the defensins found in humans and other vertebrates, as well as some invertebrates. The other superfamily, cis-defensins, contains the defensins found in invertebrates, plants, and fungi. The superfamilies and families are determined by the overall tertiary structure, and each family usually has a conserved pattern of disulphide bonds. All defensins form small and compact folded structures, typically with a high positive charge, that are highly stable due to the multiple disulphide bonds. In all families, the underlying genes responsible for defensin production are highly polymorphic.
Trans-defe |
https://en.wikipedia.org/wiki/Social%20grooming | Social grooming is a behavior in which social animals, including humans, clean or maintain one another's body or appearance. A related term, allogrooming, indicates social grooming between members of the same species. Grooming is a major social activity, and a means by which animals who live in close proximity may bond and reinforce social structures, family links, and build companionships. Social grooming is also used as a means of conflict resolution, maternal behavior and reconciliation in some species. Mutual grooming typically describes the act of grooming between two individuals, often as a part of social grooming, pair bonding, or a precoital activity.
Evolutionary advantages
There are a variety of proposed mechanisms by which social grooming behavior has been hypothesized to increase fitness. These evolutionary advantages may come in the form of health benefits including reduced disease transmission and reduced stress levels, maintaining social structure, and direct improvement of fitness as a measure of survival.
Health benefits
It is often debated whether the overarching importance of social grooming is to boost an organism's health and hygiene or whether the social side of social grooming plays an equally or more important role. Traditionally, the primary function of social grooming is thought to be the upkeep of an animal's hygiene. Evidence supporting this includes the fact that grooming concentrates on body parts that are inaccessible to autogrooming, and that the amount of time spent allogrooming a region did not vary significantly even if the body part had a more important social or communicatory function.
Social grooming behaviour has been shown to elicit an array of health benefits in a variety of species. For example, group member connection has the potential to mitigate the potentially harmful effects of stressors. In macaques, social grooming has been proven to reduce heart rate. Social affiliation during a mild stre |
https://en.wikipedia.org/wiki/MLX%20%28software%29 | MLX is a series of machine language entry utilities published by the magazines COMPUTE! and COMPUTE!'s Gazette, as well as books from COMPUTE! Publications. These programs were designed to allow relatively easy entry of the type-in machine language listings that were often included in these publications. Versions were available for the Commodore 64, VIC-20, Atari 8-bit family, and Apple II. MLX listings were reserved for relatively long machine language programs such as SpeedScript.
First version
MLX was introduced in the December 1983 issue of COMPUTE! for the Commodore 64 and Atari 8-bit family. This was followed in the January 1984 issue of COMPUTE!'s Gazette by a version for the VIC-20 with 8K expansion, and in the March 1984 issue by Tiny MLX, a version for the unexpanded VIC-20. These use a format of six data bytes per line in decimal, with a seventh byte serving as a checksum. The program auto-increments the address and prints the comma delimiters every three characters. Invalid keystrokes are ignored.
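The six-bytes-plus-checksum entry format can be sketched as a validator. This is a hypothetical illustration: the text does not specify the actual MLX checksum algorithm, so a simple sum of the line address and its six data bytes modulo 256 is assumed, and the example bytes are arbitrary.

```python
def check_mlx_line(address, data, checksum):
    """Validate one MLX-style entry line of six decimal byte values plus a
    checksum byte. The checksum rule (address + data bytes, mod 256) is an
    assumption for illustration, not the documented MLX algorithm."""
    if len(data) != 6 or not all(0 <= b <= 255 for b in data):
        raise ValueError("expected six byte values 0-255")
    expected = (address + sum(data)) % 256   # assumed algorithm
    return expected == checksum

# A made-up line at address 49152 with its (assumed-rule) checksum:
data = [169, 0, 141, 32, 208, 96]
line = (49152, data, (49152 + sum(data)) % 256)
print(check_mlx_line(*line))  # True
```

A per-line check like this is what let MLX catch most typing mistakes immediately, rather than after a long listing failed to run.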
In the Commodore version, beginning in the May 1984 issue of COMPUTE!, several keyboard keys are redefined to create a makeshift numeric keypad.
Improved version
A new version of MLX was introduced for the Apple II in the June 1985 issue. This version uses an 8-byte-per-line hexadecimal format. A more sophisticated algorithm was implemented to catch errors overlooked by the original.
The improved features were then backported to the Commodore 64. The new version, known on the title screen as "MLX II", but otherwise simply as "the new MLX", appeared in the December 1985 issue of COMPUTE!. It was printed in COMPUTE!'s Gazette the following month. This version of MLX was used until COMPUTE!'s Gazette switched to a disk-only format in December 1993.
See also
The Automatic Proofreader – COMPUTE!'s checksum utility for BASIC programs |
https://en.wikipedia.org/wiki/External%20storage | In computing, external storage refers to non-volatile (secondary) data storage outside a computer's own internal hardware, and thus can be readily disconnected and accessed elsewhere. Such storage devices may refer to removable media (e.g. punched paper, magnetic tape, floppy disk and optical disc), compact flash drives (USB flash drive and memory card), portable storage devices (external solid state drive and enclosured hard disk drive), or network-attached storage. Web-based cloud storage is the latest technology for external storage.
History
Today the term external storage most commonly applies to storage devices external to a personal computer, though it can refer to any storage external to a computer.
In the early days of computing, storage, as distinct from memory, was always external to the computer, as for example in punched card devices and media. Today storage devices may be internal or external to a computer system.
In the 1950s, the introduction of magnetic tapes and hard disk drives allowed for mass external storage of information, which played a key part in the computer revolution. Initially all such storage was external; tape and hard disk drives are today available as both internal and external storage.
In 1964, removable disk media were introduced with the IBM 2310 disk drive and its 2315 cartridge, used in the IBM 1800 and IBM 1130 computers. Most magnetic disk media today are not removable; however, disk devices and media such as optical disc drives and optical discs are available as both internal and external storage.
Earlier adoption of external storage
As a consequence of the rapid development of electronic computers, the capability to integrate existing input, output, and storage devices was a determining factor in their adoption. The IBM 650 was the first mass-produced electronic computer to encompass a wide range of existing technologies for input-output and memory devices, and it also included tape-to-card and card-to-tape conversion units. Earl |
https://en.wikipedia.org/wiki/AREsite | AREsite is a database of AU-rich elements (ARE) in vertebrate mRNA 3'-untranslated regions (UTRs). AU-rich elements are involved in the control of gene expression. They are the most common determinant of RNA stability in mammalian cells. The most recent version of AREsite is called AREsite 2. It represents an update that allows for more detailed analysis of ARE, GRE, and URE (AU, GU, and U-rich elements).
See also
AU-rich elements |
https://en.wikipedia.org/wiki/159th%20meridian%20east | The meridian 159° east of Greenwich is a line of longitude that extends from the North Pole across the Arctic Ocean, Asia, the Pacific Ocean, Australasia, the Southern Ocean, and Antarctica to the South Pole.
The 159th meridian east forms a great circle with the 21st meridian west.
From Pole to Pole
Starting at the North Pole and heading south to the South Pole, the 159th meridian east passes through:
{| class="wikitable plainrowheaders"
! scope="col" width="130" | Co-ordinates
! scope="col" | Country, territory or sea
! scope="col" | Notes
|-
| style="background:#b0e0e6;" |
! scope="row" style="background:#b0e0e6;" | Arctic Ocean
| style="background:#b0e0e6;" |
|-
| style="background:#b0e0e6;" |
! scope="row" style="background:#b0e0e6;" | East Siberian Sea
| style="background:#b0e0e6;" |
|-valign="top"
|
! scope="row" |
| Sakha Republic Chukotka Autonomous Okrug — from Magadan Oblast — from Chukotka Autonomous Okrug — from Magadan Oblast — from
|-
| style="background:#b0e0e6;" |
! scope="row" style="background:#b0e0e6;" | Sea of Okhotsk
| style="background:#b0e0e6;" | Shelikhov Gulf
|-
|
! scope="row" |
| Kamchatka Krai — Kamchatka Peninsula
|-
| style="background:#b0e0e6;" |
! scope="row" style="background:#b0e0e6;" | Pacific Ocean
| style="background:#b0e0e6;" |
|-
|
! scope="row" |
| Santa Isabel Island
|-valign="top"
| style="background:#b0e0e6;" |
! scope="row" style="background:#b0e0e6;" | New Georgia Sound
| style="background:#b0e0e6;" | Passing just west of the Russell Islands, (at )
|-
| style="background:#b0e0e6;" |
! scope="row" style="background:#b0e0e6;" | Solomon Sea
| style="background:#b0e0e6;" |
|-valign="top"
| style="background:#b0e0e6;" |
! scope="row" style="background:#b0e0e6;" | Coral Sea
| style="background:#b0e0e6;" | Passing just east of the Chesterfield Islands, (at )
|-valign="top"
| style="background:#b0e0e6;" |
! scope="row" style="background:#b0e0e6;" | Pacific Ocean
| style="background:#b0e0e6;" | Passing just |
https://en.wikipedia.org/wiki/Ammonium%20bicarbonate | Ammonium bicarbonate is an inorganic compound with formula (NH4)HCO3. The compound has many names, reflecting its long history. Chemically speaking, it is the bicarbonate salt of the ammonium ion. It is a colourless solid that degrades readily to carbon dioxide, water and ammonia.
Production
Ammonium bicarbonate is produced by combining carbon dioxide and ammonia:
CO2 + NH3 + H2O -> (NH4)HCO3
Since ammonium bicarbonate is thermally unstable, the reaction solution is kept cold, which allows the precipitation of the product as white solid. About 100,000 tons were produced in this way in 1997.
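The production reaction above fixes the input requirements by simple stoichiometry. The sketch below (standard atomic masses, losses ignored; the per-tonne figures are illustrative estimates, not from the source) computes the ammonia and carbon dioxide consumed per tonne of product.

```python
# Back-of-the-envelope stoichiometry for CO2 + NH3 + H2O -> (NH4)HCO3,
# using approximate standard atomic masses.
masses = {"H": 1.008, "C": 12.011, "N": 14.007, "O": 15.999}

def molar_mass(formula):  # formula given as {element: atom count}
    return sum(masses[el] * n for el, n in formula.items())

m_product = molar_mass({"N": 1, "H": 5, "C": 1, "O": 3})   # (NH4)HCO3
m_nh3     = molar_mass({"N": 1, "H": 3})
m_co2     = molar_mass({"C": 1, "O": 2})

# ammonia and CO2 consumed per tonne of product (1:1:1 molar ratio, no losses)
print(round(m_product, 2))           # ~79.06 g/mol
print(round(m_nh3 / m_product, 3))   # ~0.215 t NH3 per t product
print(round(m_co2 / m_product, 3))   # ~0.557 t CO2 per t product
```

At the quoted 1997 scale of about 100,000 tons of product, this corresponds to roughly 21,500 tons of ammonia consumed, before accounting for process losses.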
Ammonia gas passed into a strong aqueous solution of the sesquicarbonate (a 2:1:1 mixture of (NH4)HCO3, (NH4)2CO3, and H2O) converts it into normal ammonium carbonate ((NH4)2CO3), which can be obtained in the crystalline condition from a solution prepared at about 30 °C. This compound on exposure to air gives off ammonia and reverts to ammonium bicarbonate.
Salt of hartshorn
Compositions containing ammonium carbonate have long been known. They were once produced commercially, formerly known as sal volatile or salt of hartshorn. It was obtained by the dry distillation of nitrogenous organic matter such as hair, horn, leather. In addition to ammonium bicarbonate, this material contains ammonium carbamate (NH4CO2NH2), and ammonium carbonate ((NH4)2CO3). It is sometimes called ammonium sesquicarbonate. It possesses a strong ammoniacal smell, and on digestion with alcohol, the carbamate is dissolved leaving a residue of ammonium bicarbonate.
A similar decomposition takes place when the sesquicarbonate is exposed to air.
Uses
Ammonium bicarbonate is used in the food industry as a leavening agent for flat baked goods, such as cookies and crackers. It was commonly used in the home before modern-day baking powder was made available. Many baking cookbooks, especially from Scandinavian countries, may still refer to it as hartshorn or hornsalt, while it is known as "hirvensarvisuola" i |
https://en.wikipedia.org/wiki/Phosphatidylethanol | Phosphatidylethanols (PEth) are a group of phospholipids formed only in the presence of ethanol via the action of phospholipase D (PLD). The lipid accumulates in the human body and competes at agonist sites of lipid-gated ion channels, contributing to alcohol intoxication. The chemical similarity of PEth to phosphatidic acid (PA) and phosphatidylinositol 4,5-bisphosphate (PIP2) suggests a likely broad perturbation to lipid signaling; the exact role of PEth as a competitive lipid ligand has not been studied extensively.
Biological synthesis
When ethanol is present, PLD substitutes ethanol for water, covalently attaching the alcohol as the head group of the phospholipid; hence the name phosphatidylethanol. Normally PLD incorporates water to generate phosphatidic acid (PA); the process is termed transphosphatidylation. PLD continues to generate PA in the presence of ethanol, and the effects of ethanol transphosphatidylation arise through the generation of the unnatural lipid PEth, not through depletion of PA.
Marker in blood
Levels of phosphatidylethanols in blood are used as markers of previous alcohol consumption.
An increase of alcohol intake by ~20 g ethanol/day will raise the PEth 16:0/18:1 concentration by ~0.10 μmol/L, and vice versa if the alcohol consumption has decreased. However, it has been demonstrated that there can be significant inter-personal variation, leading to potential misclassification between moderate and heavy drinkers.
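The dose-response rule of thumb above can be written as a linear model. This is an illustration only: the text itself warns of large inter-personal variation, so this population-average slope is not suitable for classifying individual drinkers.

```python
# Linear rule of thumb from the text: ~20 g ethanol/day of changed intake
# corresponds to ~0.10 umol/L of change in PEth 16:0/18:1 concentration
# (population average; individual variation is substantial).
SLOPE_G_PER_UMOL = 20.0 / 0.10  # g/day of intake change per (umol/L) of PEth change

def estimated_intake_change(delta_peth_umol_per_l):
    """Estimate the change in daily ethanol intake (g/day) implied by a
    change in PEth 16:0/18:1 concentration (umol/L)."""
    return SLOPE_G_PER_UMOL * delta_peth_umol_per_l

print(round(estimated_intake_change(0.10), 6))  # 20.0 g/day
print(round(estimated_intake_change(0.25), 6))  # 50.0 g/day
```

The same slope applies in reverse: a 0.10 umol/L drop in PEth suggests intake has fallen by roughly 20 g of ethanol per day.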
After cessation of alcohol intake, the half-life of PEth is between 4.5 and 10 days in the first week and between 5 and 12 days in the second week. As a blood marker PEth is more sensitive than carbohydrate deficient transferrin (CDT), urinary ethyl glucuronide (EtG) and ethyl sulfate (EtS).
Structure
Chemically, phosphatidylethanols are phospholipids carrying two fatty acid chains, which are variable in structure, and one phosphate ethyl ester.
Interpretation
The Society of PEth Research published a harmonizat |
https://en.wikipedia.org/wiki/Hall%20algebra | In mathematics, the Hall algebra is an associative algebra with a basis corresponding to isomorphism classes of finite abelian p-groups. It was first discussed by Ernst Steinitz (1901) but forgotten until it was rediscovered by Philip Hall (1959), both of whom published no more than brief summaries of their work. The Hall polynomials are the structure constants of the Hall algebra. The Hall algebra plays an important role in the theory of Masaki Kashiwara and George Lusztig regarding canonical bases in quantum groups. Claus Michael Ringel (1990) generalized Hall algebras to more general categories, such as the category of representations of a quiver.
Construction
A finite abelian p-group M is a direct sum of cyclic p-power components $C_{p^{\lambda_i}}$, where $\lambda = (\lambda_1, \lambda_2, \ldots)$ is a partition of $n$ called the type of M. Let $g^{\lambda}_{\mu\nu}(p)$ be the number of subgroups N of M such that N has type $\nu$ and the quotient M/N has type $\mu$. Hall proved that the functions g are polynomial functions of p with integer coefficients. Thus we may replace p with an indeterminate q, which results in the Hall polynomials $g^{\lambda}_{\mu\nu}(q)$.
Hall next constructs an associative ring $H$ over $\mathbb{Z}[q]$, now called the Hall algebra. This ring has a basis consisting of the symbols $u_{\lambda}$, and the structure constants of the multiplication in this basis are given by the Hall polynomials: $u_{\mu} u_{\nu} = \sum_{\lambda} g^{\lambda}_{\mu\nu}(q)\, u_{\lambda}$.
It turns out that H is a commutative ring, freely generated by the elements $u_{(1^n)}$ corresponding to the elementary p-groups. The linear map from H to the algebra of symmetric functions defined on the generators by the formula $u_{(1^n)} \mapsto q^{-n(n-1)/2} e_n$ (where $e_n$ is the nth elementary symmetric function) uniquely extends to a ring homomorphism, and the images of the basis elements $u_{\lambda}$ may be interpreted via the Hall–Littlewood symmetric functions. Specializing q to 0, these symmetric functions become Schur functions, which are thus closely connected with the theory of Hall polynomials.
https://en.wikipedia.org/wiki/Nusselt%20number | In thermal fluid dynamics, the Nusselt number (, after Wilhelm Nusselt) is the ratio of convective to conductive heat transfer at a boundary in a fluid. Convection includes both advection (fluid motion) and diffusion (conduction). The conductive component is measured under the same conditions as the convective but for a hypothetically motionless fluid. It is a dimensionless number, closely related to the fluid's Rayleigh number.
A Nusselt number of value one (zero) represents heat transfer by pure conduction. A value between one (zero) and 10 is characteristic of slug flow or laminar flow. A larger Nusselt number corresponds to more active convection, with turbulent flow typically in the 100–1000 range.
A similar non-dimensional property is the Biot number, which concerns thermal conductivity for a solid body rather than a fluid. The mass transfer analogue of the Nusselt number is the Sherwood number.
Definition
The Nusselt number is the ratio of convective to conductive heat transfer across a boundary. The convection and conduction heat flows are parallel to each other and to the surface normal of the boundary surface, and are all perpendicular to the mean fluid flow in the simple case.
This ratio is defined as $\mathrm{Nu} = \frac{hL}{k}$, where h is the convective heat transfer coefficient of the flow, L is the characteristic length, and k is the thermal conductivity of the fluid.
Selection of the characteristic length should be in the direction of growth (or thickness) of the boundary layer; some examples of characteristic length are: the outer diameter of a cylinder in (external) cross flow (perpendicular to the cylinder axis), the length of a vertical plate undergoing natural convection, or the diameter of a sphere. For complex shapes, the length may be defined as the volume of the fluid body divided by the surface area.
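Numerically, Nu = hL/k is a one-line computation; the h, L and k values below are hypothetical illustration values, not from the text:

```python
# Nu = h*L/k: ratio of convective to conductive heat transfer.
h = 250.0   # convective heat transfer coefficient, W/(m^2*K) (assumed)
L = 0.1     # characteristic length, m (assumed)
k = 0.6     # fluid thermal conductivity, W/(m*K) (~water, assumed)
Nu = h * L / k
```

Here Nu comes out around 42, well above 1, i.e. convection-dominated transfer.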
The thermal conductivity of the fluid is typically (but not always) evaluated at the film temperature, which for engineering purposes may be calculated as the mean-average of the bulk flu |
https://en.wikipedia.org/wiki/Shiva%20hypothesis | The Shiva hypothesis, also known as coherent catastrophism, is the idea that global natural catastrophes on Earth, such as extinction events, happen at regular intervals because of the periodic motion of the Sun in relation to the Milky Way galaxy.
Initial proposal in 1979
William Napier and Victor Clube in their 1979 Nature article, ”A Theory of Terrestrial Catastrophism”, proposed the idea that gravitational disturbances caused by the Solar System crossing the plane of the Milky Way galaxy are enough to disturb comets in the Oort cloud surrounding the Solar System. This sends comets in towards the inner Solar System, which raises the chance of an impact. According to the hypothesis, this results in the Earth experiencing large impact events about every 30 million years (such as the Cretaceous–Paleogene extinction event).
Later work by Rampino
Starting in 1984, Michael R. Rampino published follow-up research on the hypothesis. Rampino was certainly aware of Napier and Clube's earlier publication, as Rampino and Stothers' 1984 letter to Nature references it.
In the 1990s, Rampino and Bruce Haggerty renamed Napier and Clube's Theory of Terrestrial Catastrophism after Shiva, the Hindu god of destruction. In 2020, Rampino and colleagues published non-marine evidence corroborating previous marine evidence in support of the Shiva hypothesis.
Similar theories
The Sun's passage through the higher density spiral arms of the galaxy, rather than its passage through the plane of the galaxy, could hypothetically coincide with mass extinction on Earth.
However, a reanalysis of the effects of the Sun's transit through the spiral structure based on CO data has failed to find a correlation.
The Shiva hypothesis may have inspired yet another theory: that a brown dwarf named Nemesis causes extinctions every 26 million years, a period that varies slightly from the 30 million years of the Shiva hypothesis.
Criticism
The idea of extinction periodicity has been criticised due to the fact that the hypothesis assum |
https://en.wikipedia.org/wiki/Electronic%20waste%20recycling | Electronic waste recycling, electronics recycling or e-waste recycling is the disassembly and separation of components and raw materials of waste electronics; when referring to specific types of e-waste, the terms like computer recycling or mobile phone recycling may be used. Like other waste streams, re-use, donation and repair are common sustainable ways to dispose of IT waste.
Since its inception in the early 1990s, more and more devices are recycled worldwide due to increased awareness and investment. Electronic recycling occurs primarily in order to recover valuable rare earth metals and precious metals, which are in short supply, as well as plastics and metals. These are resold or used in new devices after purification, in effect creating a circular economy. Such processes involve specialised facilities and premises, but within the home or ordinary workplace, sound components of damaged or obsolete computers can often be reused, reducing replacement costs.
Recycling is considered environmentally friendly because it prevents hazardous waste, including heavy metals and carcinogens, from entering the atmosphere, landfills or waterways. While electronics constitute a small fraction of total waste generated, they are far more dangerous. There is stringent legislation designed to enforce and encourage the sustainable disposal of appliances, the most notable being the Waste Electrical and Electronic Equipment Directive of the European Union and the United States National Computer Recycling Act. In 2009, 38% of computers and a quarter of total electronic waste were recycled in the United States, up 5 and 3 percentage points respectively from three years prior.
Reasons for recycling
Obsolete computers and old electronics are valuable sources for secondary raw materials if recycled; otherwise, these devices are a source of toxins and carcinogens. Rapid technology change, low initial cost, and planned obsolescence have resulted in a fast-growing surplus of computers and other electronic comp |
https://en.wikipedia.org/wiki/Yaequinolone%20J1 | Yaequinolone J1 is an antibiotic made by Penicillium.
Total syntheses of yaequinolone J1
An asymmetric total synthesis of yaequinolone J1 was published in 2018 by V. Vece, S. Jakkepally and S. Hanessian. In 2020, a five-step synthesis of yaequinolone J1 was reported.
https://en.wikipedia.org/wiki/Journal%20of%20Crystal%20Growth | The Journal of Crystal Growth is a semi-monthly peer-reviewed scientific journal covering experimental and theoretical studies of crystal growth and its applications. It is published by Elsevier and the editor-in-chief is J. Derby (University of Minnesota).
History
The Journal of Crystal Growth was founded following the 1966 International Conference on Crystal Growth (ICCG) held in Boston, Massachusetts, United States. Ichiro Sunagawa, who participated in ICCG, wrote in the Journal of the Japanese Association of Crystal Growth that before then, "The crystal growth community was totally fragmented and had remained as a peripheral field at the mercy of other organizations." Michael Schieber (Hebrew University) later recounted feeling the need for an individual journal on the subject after the conference proceedings were published as a supplement to the Journal of Physics and Chemistry of Solids that had to be additionally ordered by journal subscribers. Feeling as though the crystal growth community should not remain at the "discretion of other disciplines for which crystal growth has a secondary importance", he spoke about the idea with a colleague, Kenneth Button, who informed an editor at the North-Holland Publishing Company (now Elsevier).
The journal launched in 1967, with an editorial board consisting of Schieber as editor-in-chief and co-editors Charles Frank and Nicolás Cabrera. At the time the journal employed two U.S. editors, eighteen associate editors from around the world, and an editorial advisory board of sixteen members.
As of 2015, the journal has continued to serve as the "major venue for papers on crystal growth theory, practice and characterization" and proceedings of various conferences in the field. According to Tony Stankus, the journal has historically emphasised research contribution on crystals grown from wet solutions and later strongly emphasised research on crystals grown from molten materials or those produced through other processes r |
https://en.wikipedia.org/wiki/Huallaga%20River%20Boats%20Collision%20%282021%29 | The Huallaga River Boats Collision was a fatal boat collision that killed at least 21 people in Peru. It occurred on August 29, 2021, in the Alto Amazonas Province, in the west of the Department of Loreto. An additional unknown number of people were described as missing.
Description
The event occurred in the early morning of August 29 in the Alto Amazonas Province, when a motorized ferry collided with a river boat. The boat had approximately 80 people on board. Intense morning fog made it difficult to see.
Petroperú reported that the 80-person boat was called Ayachi, and the motor boat Nauta. Ayachi picked up its passengers at 1:00 a.m. in Santa María to transfer them to Yurimaguas, while Nauta headed for Iquitos. Ayachi's passengers belonged to an evangelical congregation called Nueva Jerusalén.
Rescue
At the time of the accident, smaller boats of locals came to rescue the survivors. A passenger from Ayachi relates:
"Some grabbed us from behind, desperate. We were under the boat. We managed to get out; my companions did not. I have lost my wife and seven-year-old son."
Rescuers from the Peruvian National Police and the Peruvian Navy went to the scene, where they managed to rescue 50 people alive. At the beginning 16 were reported missing.
The number of survivors rose to 60 and the number of deceased increased to 23 on August 31. One family was reported to have lost 14 members in the accident.
https://en.wikipedia.org/wiki/Toolkits%20for%20user%20innovation | Toolkits for user innovation and custom design are coordinated sets of “user-friendly” design tools. They are designed to support users who may wish to develop products or services for their own use. The problem toolkits are developed to solve is that, while user designers may know their own needs better than do producers, their technical design skills may be less than those of producer-employed developers. For example, expert users of tennis rackets – or expert users of custom integrated circuits – generally know more than producers do about the function they want a product (or service) to serve. However, they are often not as good as producer engineers at actually designing the product they need.
Purpose
Toolkits for user innovation (or design customization) solve this problem in two steps. First, they divide the total set of design problems facing product designers into two categories:
design problems for which users’ special knowledge of a need is important;
problems that do not require user knowledge to resolve.
Toolkits then offer easy-to-use tools that enable user designers to solve type (1) problems without needing technical skills equal to those of producer engineers. Type (2) problems are then assigned either to toolkit software for automatic solution or to producers' technical design specialists.
Example of a toolkit to support DIY design by users
To illustrate the basic concepts of a toolkit for product innovation and product customization by users, consider a house owner who wants to self-design a custom deck that is “just right” for his or her specific backyard physical setting and planned deck usages. The house owner will know the functions they want their custom deck to serve – outdoor barbecues for up to 10 people, play space for their kids, etc. But suppose that these users – like the vast majority of deck users – do not actually have the architectural and engineering skills required to create a complete, buildable design for the dec |
https://en.wikipedia.org/wiki/Binomial%20options%20pricing%20model | In finance, the binomial options pricing model (BOPM) provides a generalizable numerical method for the valuation of options. Essentially, the model uses a "discrete-time" (lattice based) model of the varying price over time of the underlying financial instrument, addressing cases where the closed-form Black–Scholes formula is wanting.
The binomial model was first proposed by William Sharpe in the 1978 edition of Investments, and formalized by Cox, Ross and Rubinstein in 1979 and by Rendleman and Bartter in that same year.
For binomial trees as applied to fixed income and interest rate derivatives, see lattice models for interest rate derivatives.
Use of the model
The Binomial options pricing model approach has been widely used since it is able to handle a variety of conditions for which other models cannot easily be applied. This is largely because the BOPM is based on the description of an underlying instrument over a period of time rather than a single point. As a consequence, it is used to value American options that are exercisable at any time in a given interval as well as Bermudan options that are exercisable at specific instances of time. Being relatively simple, the model is readily implementable in computer software (including a spreadsheet).
Although computationally slower than the Black–Scholes formula, it is more accurate, particularly for longer-dated options on securities with dividend payments. For these reasons, various versions of the binomial model are widely used by practitioners in the options markets.
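As a concrete sketch (not drawn from the cited papers), a minimal Cox–Ross–Rubinstein tree prices European and American calls and puts; all numeric inputs in the usage line are hypothetical:

```python
import math

def binomial_price(S, K, T, r, sigma, n, american=False, call=True):
    """Cox-Ross-Rubinstein binomial tree (illustrative sketch)."""
    dt = T / n
    u = math.exp(sigma * math.sqrt(dt))      # up factor
    d = 1.0 / u                              # down factor
    p = (math.exp(r * dt) - d) / (u - d)     # risk-neutral up probability
    disc = math.exp(-r * dt)
    sign = 1.0 if call else -1.0
    # option payoffs at maturity; node j = number of up moves
    values = [max(0.0, sign * (S * u**j * d**(n - j) - K)) for j in range(n + 1)]
    # roll the tree backwards, checking early exercise for American options
    for i in range(n - 1, -1, -1):
        for j in range(i + 1):
            cont = disc * (p * values[j + 1] + (1 - p) * values[j])
            if american:
                exercise = max(0.0, sign * (S * u**j * d**(i - j) - K))
                cont = max(cont, exercise)
            values[j] = cont
    return values[0]

price = binomial_price(S=100, K=100, T=1.0, r=0.05, sigma=0.2, n=200)
```

For these inputs the result is close to the Black–Scholes value of about 10.45, illustrating the convergence the article mentions; setting `american=True` enables the early-exercise check that the closed-form formula cannot handle.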
For options with several sources of uncertainty (e.g., real options) and for options with complicated features (e.g., Asian options), binomial methods are less practical due to several difficulties, and Monte Carlo option models are commonly used instead. When simulating a small number of time steps Monte Carlo simulation will be more computationally time-consuming than BOPM (cf. Monte Carlo methods in finance). However, the worst-case runtime of BOPM will be O(2n), where n is the nu |
https://en.wikipedia.org/wiki/%CE%92-Hydroxybutyryl-CoA | β-Hydroxybutyryl-CoA (or 3-hydroxybutyryl-coenzyme A) is an intermediate in the fermentation of butyric acid, and in the metabolism of lysine and tryptophan. The L-3-hydroxybutyryl-CoA (or (S)-3-hydroxybutanoyl-CoA) enantiomer is also the second-to-last intermediate in the beta oxidation of even-numbered, straight-chain, saturated fatty acids.
See also
Crotonyl-coenzyme A
Acetoacetyl CoA
Beta-hydroxybutyryl-CoA dehydrogenase |
https://en.wikipedia.org/wiki/The%20Library%20of%20Babel%20%28website%29 | The Library of Babel is a website created by Brooklyn author and coder Jonathan Basile, based on Jorge Luis Borges' short story "The Library of Babel" (1941). The site was launched in 2015.
Contents of the website
According to Basile, he "was lying in bed one night and the idea of an online Library of Babel popped into my head." Basile quickly realized that an actual digitized Library of Babel would require vastly more digital storage space than could ever be built. To get around this limitation, he designed an algorithm to simulate the library instead.
The Library's main page contains background information, forums and three ways to navigate the library: having the website randomly pick one of the "volumes", manually browsing through the library, or searching for specific text. Because the library's contents are infinite-monkey-theorem-style gibberish, there is an "Anglishize" feature that picks out recognizable words and clumps of words.
The library's content is divided into numbered digital hexagons, each with 4 walls, 20 shelves and 640 volumes. The names of hexagons are limited to 3360 alphanumeric characters, for a total of more than 10^5229 available hexes.
Algorithm
The algorithm Basile designed generates a 'book' by iterating through every permutation of 29 characters: the 26 English letters, space, comma, and period. Each book is marked by a coordinate corresponding to its place in the hexagonal library (hexagon name, wall number, shelf number, and book name) so that every book can be found at the same place every time. The website can generate all possible pages of 3200 characters and allows users to choose among about 10^4677 potential pages of books.
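The coordinate-to-text idea can be sketched as an invertible base-29 encoding. This is an assumption-laden simplification: the sketch uses 8-character pages and plain positional encoding, whereas the real site uses 3200-character pages and additionally scrambles indices so that neighboring pages look unrelated.

```python
# Bijection between a page index and a page of text over the 29-character
# alphabet the article describes (26 letters, space, comma, period).
ALPHABET = "abcdefghijklmnopqrstuvwxyz ,."
BASE = len(ALPHABET)      # 29
PAGE_LEN = 8              # toy length; the real site uses 3200 characters

def index_to_page(n):
    # read off base-29 digits, least significant first
    chars = []
    for _ in range(PAGE_LEN):
        n, r = divmod(n, BASE)
        chars.append(ALPHABET[r])
    return "".join(chars)

def page_to_index(page):
    # inverse: interpret the page as a base-29 number
    n = 0
    for ch in reversed(page):
        n = n * BASE + ALPHABET.index(ch)
    return n
```

Because the mapping is a bijection, searching for a given text reduces to computing its index, which is why every page "exists" at a fixed coordinate without being stored anywhere.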
Academic response
The Library of Babel website attracted the attention of scholars, particularly those working at the juncture of humanities and digital media.
Zac Zimmer wrote in Do Borges's librarians have bodies: "Basile's is perhaps the most absolutely dehumanizing of all Library visualizations, in that beyond b |
https://en.wikipedia.org/wiki/Panine%20alphaherpesvirus%203 | Panine alphaherpesvirus 3 (PnHV-3) is a species of virus in the genus Simplexvirus, subfamily Alphaherpesvirinae, family Herpesviridae, and order Herpesvirales. |
https://en.wikipedia.org/wiki/Split-ring%20resonator | A split-ring resonator (SRR) is an artificially produced structure common to metamaterials. Its purpose is to produce the desired magnetic susceptibility (magnetic response) in various types of metamaterials up to 200 terahertz.
These media create the necessary strong magnetic coupling to an applied electromagnetic field not otherwise available in conventional materials. For example, an effect such as negative permeability is produced with a periodic array of split ring resonators.
A single-cell SRR has a pair of enclosed loops with splits in them at opposite ends. The loops are made of nonmagnetic metal such as copper and have a small gap between them. The loops can be concentric or square, and gapped as needed. A magnetic flux penetrating the metal rings will induce rotating currents in the rings, which produce their own flux to enhance or oppose the incident field (depending on the SRR's resonant properties). This field pattern is dipolar. The small gaps between the rings produce large capacitance values, which lowers the resonant frequency. Hence the dimensions of the structure are small compared to the resonant wavelength. This results in low radiative losses and very high quality factors.
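The claim that larger gap capacitance lowers the resonant frequency follows from modeling the SRR as a lumped LC circuit with f0 = 1/(2π√(LC)); the inductance and capacitance values below are hypothetical order-of-magnitude illustrations, not measured SRR parameters:

```python
import math

# Lumped-element SRR model: ring inductance L, gap capacitance C,
# resonance at f0 = 1/(2*pi*sqrt(L*C)).  Values are assumed.
L_henry = 2e-9      # ~nH-scale loop inductance (assumed)
C_farad = 1e-13     # ~0.1 pF gap capacitance (assumed)
f0 = 1.0 / (2.0 * math.pi * math.sqrt(L_henry * C_farad))
# Doubling the capacitance lowers f0 by a factor of sqrt(2):
f0_bigger_cap = 1.0 / (2.0 * math.pi * math.sqrt(L_henry * 2 * C_farad))
```

With these numbers f0 is on the order of 11 GHz, where the centimetre-scale free-space wavelength is much larger than typical ring dimensions, matching the article's point about sub-wavelength structures.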
Background
Split ring resonators (SRRs) consist of a pair of concentric metallic rings, etched on a dielectric substrate, with slits etched on opposite sides. SRRs can produce an effect of being electrically smaller when responding to an oscillating electromagnetic field. These resonators have been used for the synthesis of left-handed and negative refractive index media, where the necessary value of the negative effective permeability is due to the presence of the SRRs. When an array of electrically small SRRs is excited by means of a time varying magnetic field, the structure behaves as an effective medium with negative effective permeability in a narrow band above SRR resonance. SRRs have also been coupled to planar transmission lines, for the synthesis of metamaterials |
https://en.wikipedia.org/wiki/Calcium%20sulfite | Calcium sulfite, or calcium sulphite, is a chemical compound, the calcium salt of sulfite with the formula CaSO3·x(H2O). Two crystalline forms are known, the hemihydrate and the tetrahydrate, respectively CaSO3·½(H2O) and CaSO3·4(H2O). All forms are white solids. It is most notable as the product of flue-gas desulfurization.
Production
It is produced on a large scale by flue gas desulfurization (FGD). When coal or other fossil fuel is burned, the byproduct is known as flue gas. Flue gas often contains SO2, whose emission is often regulated to prevent acid rain. Sulfur dioxide is scrubbed before the remaining gases are emitted through the chimney stack. An economical way of scrubbing SO2 from flue gases is by treating the effluent with Ca(OH)2 hydrated lime or CaCO3 limestone.
Scrubbing with limestone follows the following idealized reaction:
CaCO3 + SO2 → CaSO3 + CO2
Scrubbing with hydrated lime follows the following idealized reaction:
Ca(OH)2 + SO2 → CaSO3 + H2O
The resulting calcium sulfite oxidizes in air to give gypsum:
2 CaSO3 + O2 + 4 H2O → 2 CaSO4·2H2O
The gypsum, if sufficiently pure, is marketable as a building material.
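Since scrubbing with limestone consumes one mole of CaCO3 per mole of SO2, the scrubbing-material demand can be estimated from molar masses alone; this is a back-of-envelope sketch, ignoring excess reagent and impurities:

```python
# Stoichiometry of CaCO3 + SO2 -> CaSO3 + CO2 (1:1 in moles).
M_CaCO3 = 100.09   # molar mass of calcium carbonate, g/mol
M_SO2 = 64.07      # molar mass of sulfur dioxide, g/mol
limestone_per_kg_so2 = M_CaCO3 / M_SO2   # kg CaCO3 per kg SO2 scrubbed
```

About 1.56 kg of limestone is needed per kilogram of SO2 captured, before accounting for reagent excess and purity.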
Uses
Drywall
Calcium sulfite is generated as the intermediate in the production of gypsum, which is the main component of drywall. A typical US home contains 7 metric tons of such drywall gypsum board.
Food additive
As a food additive it is used as a preservative under the E number E226. Along with other antioxidant sulfites, it is commonly used in preserving wine, cider, fruit juice, and canned fruit and vegetables. Sulfites are strong reducing agents in solution; they act as oxygen-scavenging antioxidants to preserve food, but labeling is required because some individuals may be hypersensitive to them.
Wood pulp production
Chemical wood pulping is the removal of cellulose from wood by dissolving the lignin that binds the cellulose together. Calcium sulfite can be used in the production of wood pulp through the sulfite process, as an alternative to the Kraft process that uses hydroxides and sulfides ins |
https://en.wikipedia.org/wiki/DB-2073 | DB-2073 is an alkylresorcinol antibiotic isolated from the broth culture of Pseudomonas sp B-9004. |
https://en.wikipedia.org/wiki/Leucoplast | Leucoplasts ("formed, molded") are a category of plastid and as such are organelles found in plant cells. They are non-pigmented, in contrast to other plastids such as the chloroplast.
Lacking photosynthetic pigments, leucoplasts are not green and are located in non-photosynthetic tissues of plants, such as roots, bulbs and seeds. They may be specialized for bulk storage of starch, lipid or protein and are then known as amyloplasts, elaioplasts, or proteinoplasts (also called aleuroplasts) respectively. However, in many cell types, leucoplasts do not have a major storage function and are present to provide a wide range of essential biosynthetic functions, including the synthesis of fatty acids such as palmitic acid, many amino acids, and tetrapyrrole compounds such as heme. In general, leucoplasts are much smaller than chloroplasts and have a variable morphology, often described as amoeboid. Extensive networks of stromules interconnecting leucoplasts have been observed in epidermal cells of roots, hypocotyls, and petals, and in callus and suspension culture cells of tobacco. In some cell types at certain stages of development, leucoplasts are clustered around the nucleus with stromules extending to the cell periphery, as observed for proplastids in the root meristem.
Etioplasts, which are pre-granal, immature chloroplasts but can also be chloroplasts that have been deprived of light, lack active pigment and can be considered leucoplasts. After several minutes exposure to light, etioplasts begin to transform into functioning chloroplasts and cease being leucoplasts.
Amyloplasts are of large size and store starch.
Proteinoplasts store proteins and are found in seeds (pulses).
Elaioplasts store fats and oils and are found in seeds. They are also called oleosomes.
Compare
Plastid
Chloroplast and etioplast
Chromoplast
Tannosome
Leucoplast
Amyloplast
Elaioplast
Proteinoplast
External links
Organelles
Plant cells
Plant physiology |
https://en.wikipedia.org/wiki/Intermittent%20control | Intermittent control is a feedback control method which not only explains some human control systems but also has applications to control engineering.
In the context of control theory, intermittent control provides a spectrum of possibilities between the two extremes of continuous-time and discrete-time control: the control signal consists of a sequence of (continuous-time) parameterised trajectories whose parameters are adjusted intermittently. It is different from discrete-time control in that the control is not constant between samples; it is different from continuous-time control in that the trajectories are reset intermittently. As a class of control theory, intermittent predictive control is more general than continuous control and provides a new paradigm incorporating continuous predictive and optimal control with intermittent, open loop (ballistic) control.
There are at least three areas where intermittent control is relevant. Firstly, continuous-time model-based predictive control where the intermittency is associated with on-line optimisation. Secondly, event-driven control systems where the intersample interval is time varying and determined by the event times. Thirdly, explanation of physiological control systems which, in some cases, have an intermittent character. This intermittency may be due to the “computation” in the central nervous system.
Conventional sampled-data control uses a zero-order hold, which produces a piecewise-constant control signal and can be used to give a sampled-data implementation that approximates a previously designed continuous-time controller. In contrast to conventional sampled-data control, intermittent control explicitly embeds the underlying continuous-time closed-loop system in a system-matched hold, which generates an open-loop intersample control trajectory based on the underlying continuous-time closed-loop control system.
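A minimal numerical sketch (all plant and controller values are hypothetical, not from any cited design) contrasts the conventional zero-order hold with a system-matched hold for the scalar plant dx/dt = a·x + u under the continuous-time feedback u = −k·x:

```python
import math

# Scalar plant dx/dt = a*x + u with continuous-time feedback u = -k*x,
# giving a closed-loop pole at (a - k).  All values are hypothetical.
a, k = 1.0, 3.0
dt = 0.5                # intersample interval
substeps = 500          # Euler integration steps per interval
x_zoh, x_smh = 1.0, 1.0
for sample in range(4):
    u_zoh = -k * x_zoh          # zero-order hold: control frozen between samples
    x0 = x_smh                  # state sampled at the start of the interval
    h = dt / substeps
    for i in range(substeps):
        x_zoh += h * (a * x_zoh + u_zoh)
        # system-matched hold: replay the closed-loop trajectory open loop
        x_pred = x0 * math.exp((a - k) * (i * h))
        x_smh += h * (a * x_smh - k * x_pred)
```

Both runs decay, but the system-matched hold reproduces the continuous-time closed-loop response exp((a−k)t) between samples, whereas the zero-order-hold response only approximates it piecewise.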
History
Intermittent control initially evolved separately in the engineering and physiologi |
https://en.wikipedia.org/wiki/Lactic%20acid | Lactic acid is an organic acid. It has the molecular formula C3H6O3 (CH3CH(OH)COOH). It is white in the solid state and it is miscible with water. When in the dissolved state, it forms a colorless solution. Production includes both artificial synthesis as well as natural sources. Lactic acid is an alpha-hydroxy acid (AHA) due to the presence of a hydroxyl group adjacent to the carboxyl group. It is used as a synthetic intermediate in many organic synthesis industries and in various biochemical industries. The conjugate base of lactic acid is called lactate (or the lactate anion). The name of the derived acyl group is lactoyl.
In solution, it can ionize by loss of a proton to produce the lactate ion CH3CH(OH)COO−. Compared to acetic acid, its pKa is 1 unit lower, meaning lactic acid is ten times more acidic than acetic acid. This higher acidity is the consequence of the intramolecular hydrogen bonding between the α-hydroxyl and the carboxylate group.
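The tenfold claim can be checked numerically from the logarithmic pKa scale; the pKa values below are approximate literature values assumed for illustration, not given in the text:

```python
# Ka = 10**(-pKa), so a pKa difference d corresponds to a 10**d ratio in acidity.
pKa_lactic = 3.86   # approximate literature value (assumed)
pKa_acetic = 4.76   # approximate literature value (assumed)
ratio = 10 ** (pKa_acetic - pKa_lactic)   # how much stronger lactic acid is
```

The ratio comes out near 8, i.e. roughly the order-of-magnitude ("ten times") difference described.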
Lactic acid is chiral, consisting of two enantiomers. One is known as L-lactic acid, (S)-lactic acid, or (+)-lactic acid, and the other, its mirror image, is D-lactic acid, (R)-lactic acid, or (−)-lactic acid. A mixture of the two in equal amounts is called DL-lactic acid, or racemic lactic acid. Lactic acid is hygroscopic. DL-Lactic acid is miscible with water and with ethanol above its melting point, which is about 16 to 18 °C. D-Lactic acid and L-lactic acid have a higher melting point. Lactic acid produced by fermentation of milk is often racemic, although certain species of bacteria produce solely D-lactic acid. On the other hand, lactic acid produced by anaerobic respiration in animal muscles has the (S) enantiomer and is sometimes called "sarcolactic" acid, from the Greek sarx, meaning "flesh".
In animals, L-lactate is constantly produced from pyruvate via the enzyme lactate dehydrogenase (LDH) in a process of fermentation during normal metabolism and exercise. It does not increase in concentration until the rate of lactate production exceeds the rate of lactate removal, whi
https://en.wikipedia.org/wiki/High-level%20verification | High-level verification (HLV), or electronic system-level (ESL) verification, is the task to verify ESL designs at high abstraction level, i.e., it is the task to verify a model that represents hardware above register-transfer level (RTL) abstract level. For high-level synthesis (HLS or C synthesis), HLV is to HLS as functional verification is to logic synthesis.
Electronic digital hardware design has evolved from low level abstraction at gate level to register transfer level (RTL), the abstraction level above RTL is commonly called high-level, ESL, or behavioral/algorithmic level.
In high-level synthesis, behavioral/algorithmic designs in ANSI C/C++/SystemC code are synthesized to RTL, which is then synthesized into gate level through logic synthesis. Functional verification is the task of making sure a design at RTL or gate level conforms to a specification. As logic synthesis matures, most functional verification is done at the higher abstraction level, i.e. at RTL; the correctness of the logic synthesis tool in translating the RTL description to a gate netlist is of less concern today.
High-level synthesis is still an emerging technology, so high-level verification today has two important areas under development:
to validate that HLS is correct in the translation process, i.e. that the designs before and after HLS are equivalent, typically through formal methods;
to verify that a design in ANSI C/C++/SystemC code conforms to a specification, typically through logic simulation.
Terminology
History
Product areas
Formal Solution: Verify high level models against RTL designs
Simulation Solution: Intelligent stimulus generation, code and functional coverage, temporal assertion checker
See also
Accellera
Electronic system-level (ESL)
Formal verification
Property Specification Language (PSL)
SystemC
SystemVerilog
Transaction-level modeling (TLM) |
https://en.wikipedia.org/wiki/Rambutan%20%28cryptography%29 | Rambutan is a family of encryption technologies designed by the Communications-Electronics Security Group (CESG), the technical division of the United Kingdom government's secret communications agency, GCHQ.
It includes a range of encryption products designed by CESG for use in handling confidential (not secret) communications between parts of the British government, government agencies, and related bodies such as NHS Trusts. Unlike CESG's Red Pike system, Rambutan is not available as software: it is distributed only as a self-contained electronic device (an ASIC) which implements the entire cryptosystem and handles the related key distribution and storage tasks. Rambutan is not sold outside the government sector.
Technical details of the Rambutan algorithm are secret. Security researcher Bruce Schneier describes it as being a stream cipher (linear-feedback shift register) based cryptosystem with 5 shift registers each of around 80 bits, and a key size of 112 bits. RAMBUTAN-I communications chips (which implement a secure X.25 based communications system) are made by approved contractors Racal and Baltimore Technologies/Zergo Ltd. CESG later specified RAMBUTAN-II, an enhanced system with backward compatibility with existing RAMBUTAN-I infrastructure. The RAMBUTAN-II chip is a 64-pin quad ceramic pack chip, which implements the electronic codebook, cipher block chaining, and output feedback operating modes (each in 64 bits) and the cipher feedback mode in 1 or 8 bits. Schneier suggests that these modes may indicate Rambutan is a block cipher rather than a stream cipher. The three 64-bit modes operate at 88 megabits per second.
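Rambutan's algorithm itself is secret, so nothing here reflects its actual design; purely to illustrate the linear-feedback shift register primitive Schneier mentions, a toy (cryptographically worthless) generator with arbitrary taps looks like this:

```python
# Toy Fibonacci-style LFSR: shifts the register each step, feeding back the
# XOR of the tapped positions.  Taps and seed are arbitrary examples,
# unrelated to Rambutan.
def lfsr_keystream(seed_bits, taps, nbits):
    state = list(seed_bits)
    out = []
    for _ in range(nbits):
        out.append(state[-1])                   # emit the last bit
        fb = 0
        for t in taps:
            fb ^= state[t]                      # feedback = XOR of tapped bits
        state = [fb] + state[:-1]               # shift, insert feedback
    return out

keystream = lfsr_keystream([1, 0, 1, 1, 0, 0, 1, 0], taps=(0, 2, 3, 7), nbits=16)
```

Real LFSR-based ciphers combine several registers through nonlinear functions, since a single LFSR's output is trivially predictable from a short segment.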
https://en.wikipedia.org/wiki/HKA%20test | The HKA Test, named after Richard R. Hudson, Martin Kreitman, and Montserrat Aguadé, is a statistical test used in genetics to evaluate the predictions of the Neutral Theory of molecular evolution. By comparing the polymorphism within each species and the divergence observed between two species at two or more loci, the test can determine whether the observed difference is likely due to neutral evolution or rather due to adaptive evolution. Developed in 1987, the HKA test is a precursor to the McDonald-Kreitman test, which was derived in 1991. The HKA test is best used to look for balancing selection, recent selective sweeps or other variation-reducing forces.
Neutral Evolution
Neutral Evolution Theory, first proposed by Kimura in a 1968 paper and later fully defined and published in 1983, is the basis for many statistical tests that detect selection at the molecular level. Kimura noted that the rate of mutation within the genome (i.e. the level of polymorphism) was far too high for evolution to be strictly directional. Furthermore, functionally less important regions of the genome evolve at a faster rate. Kimura therefore postulated that most modifications to the genome are neutral or nearly neutral, and evolve by random genetic drift. Under the neutral model, then, polymorphism within a species and divergence between related species at homologous sites will be highly correlated. The Neutral Evolution theory has become the null model against which tests for selection are compared, and divergence from this model can be explained by directional or selective evolution.
Formulae
The rate of mutation within a population can be estimated using the Watterson estimator of the population mutation rate parameter θ = 4N_eμ, where N_e is the effective population size and μ is the mutation rate (substitutions per site per unit of time). Hudson et al. proposed comparing these quantities across loci with a chi-squared goodness-of-fit test.
The test statistic proposed by Hudson et al., Χ², sums the squared deviations of the observed polymorphism (S) and divergence (D) at each locus i from their expectations under the neutral model, scaled by their variances: Χ² = Σ_i [S_i − E(S_i)]² / Var(S_i) + Σ_i [D_i − E(D_i)]² / Var(D_i)
This states that, for each |
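The goodness-of-fit comparison can be sketched numerically. The counts, expectations, and variances below are purely hypothetical (in the real test the expectations and variances come from fitting the neutral model); the sketch only shows how the statistic aggregates polymorphism and divergence deviations across loci.

```python
# Toy HKA-style chi-squared sketch with made-up numbers (illustrative only).
# S = within-species polymorphism, D = between-species divergence, per locus.
observed_S = [20, 5]
observed_D = [18, 16]
expected_S = [16.0, 9.0]   # hypothetical neutral-model expectations
expected_D = [20.0, 14.0]
var_S = [12.0, 7.0]        # hypothetical variances under the neutral model
var_D = [15.0, 10.0]

# Sum squared, variance-scaled deviations over loci for both S and D.
chi2 = sum((s - es) ** 2 / vs for s, es, vs in zip(observed_S, expected_S, var_S))
chi2 += sum((d - ed) ** 2 / vd for d, ed, vd in zip(observed_D, expected_D, var_D))
print(round(chi2, 3))  # 4.286
```

A large Χ² relative to the chi-squared distribution's critical value would indicate that polymorphism and divergence are decoupled at some locus, as expected under selection rather than neutrality.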
https://en.wikipedia.org/wiki/Ultracold%20atom | In condensed matter physics, an ultracold atom is an atom with a temperature near absolute zero. At such temperatures, an atom's quantum-mechanical properties become important.
To reach such low temperatures, a combination of several techniques typically has to be used. First, atoms are trapped and pre-cooled via laser cooling in a magneto-optical trap. To reach the lowest possible temperature, further cooling is performed using evaporative cooling in a magnetic or optical trap. Several Nobel prizes in physics are related to the development of the techniques to manipulate quantum properties of individual atoms (e.g. 1995-1997, 2001, 2005, 2012, 2017).
Experiments with ultracold atoms study a variety of phenomena, including quantum phase transitions, Bose–Einstein condensation (BEC), bosonic superfluidity, quantum magnetism, many-body spin dynamics, Efimov states, Bardeen–Cooper–Schrieffer (BCS) superfluidity and the BEC–BCS crossover. Some of these research directions utilize ultracold atom systems as quantum simulators to study the physics of other systems, including the unitary Fermi gas and the Ising and Hubbard models. Ultracold atoms could also be used for realization of quantum computers.
History
Samples of ultracold atoms are typically prepared through the interactions of a dilute gas with a laser field. Evidence for radiation pressure, the force exerted by light on atoms, was demonstrated independently by Lebedev, and by Nichols and Hull, in 1901. In 1933, Otto Frisch demonstrated the deflection of individual sodium atoms by light generated from a sodium lamp.
The invention of the laser spurred the development of additional techniques to manipulate atoms with light. Using laser light to cool atoms was first proposed in 1975 by taking advantage of the Doppler effect to make the radiation force on an atom dependent on its velocity, a technique known as Doppler cooling. Similar ideas were also proposed to cool samples of trapped ions. Applying Doppler cooling in th |
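To give a sense of the temperatures Doppler cooling reaches, the standard Doppler limit T_D = ħΓ/(2k_B) can be evaluated. The sketch below plugs in the natural linewidth of the sodium D2 line (about 2π × 9.79 MHz) as an illustrative value.

```python
# Doppler cooling limit T_D = hbar * Gamma / (2 * k_B), evaluated for sodium.
hbar = 1.054571817e-34  # reduced Planck constant, J*s
k_B = 1.380649e-23      # Boltzmann constant, J/K
gamma = 2 * 3.141592653589793 * 9.79e6  # Na D2 natural linewidth, rad/s

T_D = hbar * gamma / (2 * k_B)
print(f"{T_D * 1e6:.0f} microkelvin")  # ~235 microkelvin
```

Temperatures of a few hundred microkelvin are cold but still far above the nanokelvin regime where Bose–Einstein condensation occurs, which is why evaporative cooling in a magnetic or optical trap follows the laser-cooling stage.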
https://en.wikipedia.org/wiki/Shape%20factor%20%28image%20analysis%20and%20microscopy%29 | Shape factors are dimensionless quantities used in image analysis and microscopy that numerically describe the shape of a particle, independent of its size. Shape factors are calculated from measured dimensions, such as diameter, chord lengths, area, perimeter, centroid, moments, etc. The dimensions of the particles are usually measured from two-dimensional cross-sections or projections, as in a microscope field, but shape factors also apply to three-dimensional objects. The particles could be the grains in a metallurgical or ceramic microstructure, or the microorganisms in a culture, for example. The dimensionless quantities often represent the degree of deviation from an ideal shape, such as a circle, sphere or equilateral polyhedron. Shape factors are often normalized, that is, the value ranges from zero to one. A shape factor equal to one usually represents an ideal case or maximum symmetry, such as a circle, sphere, square or cube.
Aspect ratio
The most common shape factor is the aspect ratio, a function of the largest diameter d_max and the smallest diameter d_min orthogonal to it: f_AR = d_min / d_max
The normalized aspect ratio varies from approaching zero for a very elongated particle, such as a grain in a cold-worked metal, to near unity for an equiaxed grain. The reciprocal of the right side of the above equation is also used, such that the AR varies from one to approaching infinity.
Circularity
Another very common shape factor is the circularity (or isoperimetric quotient), a function of the perimeter P and the area A: f_circ = 4πA / P²
The circularity of a circle is 1, and it is much less than one for an irregular shape such as a starfish footprint. The reciprocal of the circularity equation is also used, such that f_circ varies from one for a circle to infinity.
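As a quick numerical illustration of the two shape factors above (assuming the normalized definitions f_AR = d_min/d_max and f_circ = 4πA/P²):

```python
import math

def aspect_ratio(d_max: float, d_min: float) -> float:
    """Normalized aspect ratio: smallest orthogonal diameter over largest (0..1]."""
    return d_min / d_max

def circularity(area: float, perimeter: float) -> float:
    """Isoperimetric quotient: exactly 1 for a circle, < 1 for any other shape."""
    return 4 * math.pi * area / perimeter ** 2

# A circle of radius 2 is the ideal case:
r = 2.0
print(round(circularity(math.pi * r**2, 2 * math.pi * r), 3))  # 1.0

# A 4 x 1 rectangle is elongated and far from circular:
print(round(aspect_ratio(4.0, 1.0), 3))   # 0.25
print(round(circularity(4.0, 10.0), 3))   # 0.503
```

Both measures are dimensionless, so scaling a particle up or down leaves its shape factors unchanged, which is exactly the size-independence the opening paragraph describes.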
Elongation shape factor
The less-common elongation shape factor is defined as the square root of the ratio of the two second moments of the particle around its principal axes.
Compactness shape factor
The compactness shape factor is a function of the polar |
https://en.wikipedia.org/wiki/Data%20hierarchy | Data hierarchy refers to the systematic organization of data, often in a hierarchical form. Data organization involves characters, fields, records, files and so on. This concept is a starting point when trying to see what makes up data and whether data has a structure. For example, how does a person make sense of data such as 'employee', 'name', 'department', 'Marcy Smith', 'Sales Department' and so on, assuming that they are all related? One way to understand them is to see these terms as smaller or larger components in a hierarchy. One might say that Marcy Smith is one of the employees in the Sales Department, or an example of an employee in that Department. The data we want to capture about all our employees, and not just Marcy, is the name, ID number, address etc.
Purpose of the data hierarchy
"Data hierarchy" is a basic concept in data and database theory and helps to show the relationships between smaller and larger components in a database or data file. It is used to give a better sense of understanding about the components of data and how they are related.
It is particularly important in databases with referential integrity, third normal form, or perfect key. "Data hierarchy" is the result of proper arrangement of data without redundancy. Avoiding redundancy eventually leads to proper "data hierarchy" representing the relationship between data, and revealing its relational structure.
Components of the data hierarchy
The components of the data hierarchy are listed below.
A data field holds a single fact or attribute of an entity. Consider a date field, e.g. "19 September 2004". This can be treated as a single date field (e.g. birthdate), or three fields, namely, day of month, month and year.
A record is a collection of related fields. An Employee record may contain a name field(s), address fields, birthdate field and so on.
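The field/record/file hierarchy described above can be sketched in code. The `EmployeeRecord` class and the sample rows below are hypothetical, echoing the Marcy Smith example from the opening paragraph.

```python
from dataclasses import dataclass

@dataclass
class EmployeeRecord:      # a record: a collection of related fields
    emp_id: int            # each attribute is a single data field
    name: str
    department: str

# a "file" is a collection of related records (modeled here as a list)
employee_file = [
    EmployeeRecord(1, "Marcy Smith", "Sales Department"),
    EmployeeRecord(2, "Lee Chan", "Engineering"),
]

# navigating the hierarchy: file -> records -> fields
sales = [r.name for r in employee_file if r.department == "Sales Department"]
print(sales)  # ['Marcy Smith']
```

In a database the same hierarchy appears as column (field), row (record), and table (file), which is why the concept carries over directly from flat files to relational systems.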
A file is a collection of related records. If there are 100 employees, then each employee would have a record (e.g. called Emp |
https://en.wikipedia.org/wiki/UEFITool | UEFITool is a software program for reading and modifying EEPROM images that contain a UEFI firmware. Features include the ability to view the flash regions and to extract and import them. |