| source | text |
|---|---|
https://en.wikipedia.org/wiki/Resource%20holding%20potential | In biology, resource holding potential (RHP) is the ability of an animal to win an all-out fight if one were to take place. The term was coined by Geoff Parker to disambiguate physical fighting ability from the motivation to persevere in a fight (Parker, 1974). Originally the term used was 'resource holding power', but 'resource holding potential' has come to be preferred. The latter emphasis on 'potential' serves as a reminder that the individual with greater RHP does not always prevail.
An individual with more RHP may lose a fight if, for example, it is less motivated (has less to gain by winning) than its opponent. Mathematical models of RHP and motivation (resource value or V) have traditionally been based on the hawk-dove game (e.g. Hammerstein, 1981) in which subjective resource value is represented by the variable 'V'. In addition to RHP and V, George Barlow (Barlow et al., 1986) proposed that a third variable, which he termed 'daring', played a role in determining fight outcome. Daring (a.k.a. aggressiveness) represents an individual's tendency to initiate or escalate a contest independent of the effects of RHP and V.
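The hawk-dove framing above can be made concrete with a small numerical sketch. This is an illustration of the standard game only, not Hammerstein's or Barlow's specific model; the payoff structure and the V/C notation are the textbook convention:

```python
# Expected payoffs in the classic hawk-dove game (a minimal sketch).
# V = subjective resource value; C = cost of injury in an escalated fight.
def hawk_dove_payoffs(V: float, C: float) -> dict:
    return {
        ("hawk", "hawk"): (V - C) / 2,  # escalate: win or be injured with equal odds
        ("hawk", "dove"): V,            # opponent retreats, take the whole resource
        ("dove", "hawk"): 0.0,          # retreat, gain nothing
        ("dove", "dove"): V / 2,        # share the resource
    }

# When C > V, the evolutionarily stable strategy is mixed:
# play hawk with probability V / C (and always hawk when V >= C).
def ess_hawk_probability(V: float, C: float) -> float:
    return min(1.0, V / C)
```

One way to read RHP into this sketch: a contestant with greater fighting ability effectively faces a lower expected injury cost C, which raises its ESS hawk probability, so motivation (V) and fighting ability (via C) jointly shape how readily a contest escalates.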
It is instinctive for all animals to behave in ways that promote their fitness (Parker 1974): animals do what they can to improve their fitness and thereby survive long enough to produce offspring. When resources are scarce, however, this becomes difficult, and animals begin to compete for them. Competition for resources can be dangerous and, for some animals, deadly. Some animals have developed adaptive traits that increase their chances of survival when competing for resources; resource holding potential (RHP) is one such trait (Parker 1974). Resource holding potential, or resource holding power, denotes an individual's capacity to keep fighting, working, or enduring through situations in which others would give up. Animals relying on RHP often evaluate the conditions of the danger they face |
https://en.wikipedia.org/wiki/PostSecret | PostSecret is an ongoing community mail art project, created by Frank Warren in 2005, in which people mail their secrets anonymously on a homemade postcard. Selected secrets are then posted on the PostSecret website, or used for PostSecret's books or museum exhibits.
History
The concept of the project is that completely anonymous people decorate a postcard to portray a secret they have never previously revealed. No restrictions are placed on the content of the secret, only that it must be completely truthful and must never have been told before. Entries range from admissions of sexual misconduct and criminal activity to confessions of secret desires, embarrassing habits, hopes and dreams.
PostSecret collected and displayed over 2,500 original pieces of art from people across the United States and around the world between its founding on January 1, 2005 and 2007.
The site, which started as an experiment on Blogger, was updated every Sunday with 10 new secrets, all of which share a relatively constant style, giving the artists who participate some guidelines on how their secrets should be represented. From June 24 to July 3, 2007, the "Comments" section of the site was enabled. While a comments feature is frequently present on blogs, it had been previously absent from the PostSecret site. Many visitors felt that the new section contradicted the purpose of the site, as evidenced in numerous comments criticizing a postcard in which the author claims to have fed bleach to her cat. In October 2007, the PostSecret Community was launched. Since its inception, more than 80,000 users have registered for the online discussion forum.
According to Youth Trends' February 2008 "Top Ten List Report" PostSecret was the 10th most popular site among female students in the US, with 7% of those polled naming the site as their favorite.
In April 2008, Warren teamed up with 1-800-SUICIDE to answer some of these anonymous cries for help through peer-run crisis hotlines on col |
https://en.wikipedia.org/wiki/Photostimulation | Photostimulation is the use of light to artificially activate biological compounds, cells, tissues, or even whole organisms. Photostimulation can be used to noninvasively probe various relationships between different biological processes, using only light. In the long run, photostimulation has the potential for use in different types of therapy, such as the treatment of migraine headaches. Additionally, photostimulation may be used for the mapping of neuronal connections between different areas of the brain by "uncaging" signaling biomolecules with light. Therapy with photostimulation has been called light therapy, phototherapy, or photobiomodulation.
Photostimulation methods fall into two general categories: one set of methods uses light to uncage a compound that then becomes biochemically active, binding to a downstream effector. For example, uncaging glutamate is useful for finding excitatory connections between neurons, since the uncaged glutamate mimics the natural synaptic activity of one neuron impinging upon another. The other major photostimulation method is the use of light to activate a light-sensitive protein such as rhodopsin, which can then excite the cell expressing the opsin.
Scientists have long sought the ability to control one type of cell while leaving the cells surrounding it untouched and unstimulated. Well-known advances such as electrical stimulation with electrodes have succeeded in activating neurons, but fall short of this goal because of their imprecision and inability to distinguish between different cell types. The use of optogenetics (artificial cell activation via light stimuli) is unique in its ability to deliver light pulses in a precise and timely fashion. Optogenetics is somewhat bidirectional in its ability to control neurons: channels can be either depolarized or hyperpolarized depending on the wavelength of light that targets them. For instance, the technique can be applied to channelrhodopsin cation
https://en.wikipedia.org/wiki/Bandit%20Kings%20of%20Ancient%20China | Bandit Kings of Ancient China, released under a different title in Japan, is a turn-based strategy video game developed and published by Koei, and released in 1989 for MSX, MS-DOS, Amiga, and Macintosh and in 1990 for the Nintendo Entertainment System. In 1996, Koei issued a remake for the Japanese Sega Saturn and PlayStation featuring vastly improved graphics and new arrangements of the original songs.
Gameplay
Based on the 14th century Great Classical Novel Water Margin, the game takes place in ancient China during the reign of Emperor Huizong of the Song Dynasty. The Bandit Kings of Ancient China—a band of ten bandits—engage in war against China's Minister of War Gao Qiu, an evil minister with unlimited power. The objective of the game is to build, sustain, and command an army of troops to capture Gao Qiu before the Jurchen invasion in January 1127. Players hold certain attributes such as strength, dexterity, and wisdom. Players must also deal with other situations such as taxes, care for the troops, maintenance and replacement of weapons and equipment, forces of nature, and troop unrest and desertion.
Battles take place on hexagonal grids, where players move their armies across various terrain to strategically engage and defeat the enemy army. Troops can fight with melee weapons, bows and arrows, magic, or dueling swords, and can also set tiles on fire. When a player defeats an enemy army, they have the option of recruiting, imprisoning, exiling, or executing the captured enemy troops. The attacker has 30 days to defeat all deployed enemy troops or capture the commander, or be automatically defeated. The game ends in defeat for the player(s) when the game calendar hits January 1127.
The game map shows the empire composed of 49 prefectures. Any prefecture may be invaded from an adjacent territory; the only exception is the current location of Gao Qiu (usually the capital, Kai Feng), where the player must have built up sufficient popula
https://en.wikipedia.org/wiki/Registered%20memory | Registered (also called buffered) memory modules have a register between the DRAM modules and the system's memory controller. They place less electrical load on the memory controller and allow single systems to remain stable with more memory modules than they would have otherwise. When compared with registered memory, conventional memory is usually referred to as unbuffered memory or unregistered memory. When manufactured as a dual in-line memory module (DIMM), a registered memory module is called an RDIMM, while unregistered memory is called UDIMM or simply DIMM.
Registered memory is often more expensive because of the lower number of units sold and the additional circuitry required, so it is usually found only in applications where the need for scalability and robustness outweighs the need for a low price; for example, registered memory is usually used in servers.
Although most registered memory modules also feature error-correcting code memory (ECC), it is also possible for registered memory modules to not be error-correcting or vice versa. Unregistered ECC memory is supported and used in workstation or entry-level server motherboards that do not support very large amounts of memory.
Performance
Normally, there is a performance penalty for using registered memory. Each read or write is buffered for one cycle between the memory bus and the DRAM, so the registered RAM can be thought of as running one clock cycle behind the equivalent unregistered DRAM. With SDRAM, this only applies to the first cycle of a burst.
However, this performance penalty is not universal, as many other factors affect memory access speed. For example, the Intel Westmere 5600 series of processors access memory using interleaving, in which memory access is distributed across three channels. With two DIMMs per channel, maximum memory bandwidth in this configuration is roughly 5% lower with UDIMMs than with RDIMMs.
Compatibility
Usually, the motherb |
https://en.wikipedia.org/wiki/White%20jersey | Numerous cycling stage races award a white jersey to signify the current leader and overall winner of a certain competition, or to signify the best young rider in the race. The most prominent of these is the Tour de France, where the jersey is known as the maillot blanc and is awarded to the best-placed rider aged under 26. Recognizing the best young rider is the white jersey's most common use, though some tours award a white jersey for a different classification.
Other stage races, besides the Tour de France, that also award a white jersey for the best young rider include:
Deutschland Tour
Giro d'Italia (known as the maglia bianca, and the cut-off is 25 years instead of 26)
Paris–Nice
Tour of California (where the cut-off is age 23 instead of 26)
Tour of Ireland
Other white jerseys
Some stage races award a white jersey for a different classification than youth. The foremost of these is probably the Vuelta a España, where it recognizes the leader in the Combination classification. In this classification, ranks in the General, Points, and Mountains classifications are added, and whoever has the lowest cumulative total is awarded the white jersey. It is a relatively new award, having existed only since the 2003 edition of the race. In 1941, the Vuelta a España white jersey was for the leader and overall winner of the General classification.
The Tour of the Basque Country awards a white jersey to the leader and overall winner of the Points classification. The Volta a Catalunya awards a white jersey with green stripes to the leader and overall winner of the General classification. The 2008 Tour de Suisse awarded a white jersey to a somewhat unusual competition, the Sprints classification, which awards placings not on stage finishes but strictly in intermediate sprints.
The Tour Down Under awards a white jersey, with green piping and side panels, to the leader and ultimately the winner of the King of the Mountain |
https://en.wikipedia.org/wiki/Duru%E2%80%93Kleinert%20transformation | The Duru–Kleinert transformation, named after İsmail Hakkı Duru and Hagen Kleinert, is a mathematical method for solving path integrals of physical systems with singular potentials, which is necessary for the solution of all atomic path integrals due to the presence of Coulomb potentials (singular like 1/r).
The Duru–Kleinert transformation replaces the diverging time-sliced path integral of Richard Feynman (which thus does not exist) by a well-defined convergent one.
Papers
H. Duru and H. Kleinert, Solution of the Path Integral for the H-Atom, Phys. Letters B 84, 185 (1979)
H. Duru and H. Kleinert, Quantum Mechanics of H-Atom from Path Integrals, Fortschr. d. Phys. 30, 401 (1982)
H. Kleinert, Path Integrals in Quantum Mechanics, Statistics, Polymer Physics, and Financial Markets, 3rd ed., World Scientific (Singapore, 2004)
Quantum mechanics |
https://en.wikipedia.org/wiki/Donald%20A.%20Martin | Donald Anthony Martin (born December 24, 1940), also known as Tony Martin, is an American set theorist and philosopher of mathematics at UCLA, where he is an emeritus professor of mathematics and philosophy.
Education and career
Martin received his B.S. from the Massachusetts Institute of Technology in 1962 and was a Junior Fellow of the Harvard Society of Fellows in 1965–67. In 2014, he became a Fellow of the American Mathematical Society.
Philosophical and mathematical work
Among Martin's most notable works are the proofs of analytic determinacy (from the existence of a measurable cardinal), Borel determinacy (from ZFC alone), the proof (with John R. Steel) of projective determinacy (from suitable large cardinal axioms), and his work on Martin's axiom. The Martin measure on Turing degrees is also named after Martin.
See also
American philosophy
List of American philosophers |
https://en.wikipedia.org/wiki/Angelfire | Angelfire is an Internet service that offers website services. It is owned by Lycos, which also owns Tripod.com. Angelfire operates separately from Tripod.com and includes features such as blog building and a photo gallery builder. Free webpages are no longer available to new registrants and have been replaced by paid services.
History
Angelfire was founded in 1996 and was originally a combination Web site building and medical transcription service. Eventually the site dropped the transcription service and focused solely on website hosting, offering only paid memberships. The site was bought by Mountain View, California–based WhoWhere on October 20, 1997, which, in turn, was subsequently purchased by the search engine company Lycos on August 11, 1998 for US$133 million.
See also |
https://en.wikipedia.org/wiki/Series%2040 | Series 40, often shortened as S40, is a software platform and application user interface (UI) software on Nokia's broad range of mid-tier feature phones, as well as on some of the Vertu line of luxury phones. It was one of the world's most widely used mobile phone platforms and found in hundreds of millions of devices. Nokia announced on 25 January 2012 that the company had sold over 1.5 billion Series 40 devices. It was not used for smartphones, with Nokia turning first to Symbian, then in 2012–2017 to Windows Phone, and most recently Android. However, in 2012 and 2013, several Series 40 phones from the Asha line, such as the 308, 309 and 311, were advertised as "smartphones" although they do not actually support smartphone features like multitasking or a fully fledged HTML browser.
In 2014, Microsoft acquired Nokia's mobile phones business. As part of a licensing agreement with the company, Microsoft Mobile is allowed to use the Nokia brand on feature phones, such as the Series 40 range. However, a July 2014 company memo revealed that Microsoft would end future production of Series 40 devices. It was replaced by Series 30+.
History
Series 40 was introduced in 1999 with the release of the Nokia 7110. It had a 96 × 65 pixel monochrome display and was the first phone to come with a WAP browser. Over the years, the S40 UI evolved from a low-resolution UI to a high-resolution color UI with an enhanced graphical look. The third generation of Series 40 that became available in 2005 introduced support for devices with resolutions as high as QVGA (240×320). It is possible to customize the look and feel of the UI via comprehensive themes. In 2012, Nokia Asha mobile phones 200/201, 210, 302, 303, 305, 306, 308, 310 and 311 were released and all used Series 40. The final feature phone running Series 40 was the Nokia 515 from 2013, running the 6th Edition.
Technical information
Applications
Series 40 provides communication applications such as telephone, Internet telephon |
https://en.wikipedia.org/wiki/Salvia%20splendens | Salvia splendens, the scarlet sage, is a tender herbaceous perennial plant native to Brazil, growing at elevations where it is warm year-round with high humidity. The wild form, rarely seen in cultivation, grows tall. Smaller cultivars are very popular as bedding plants, seen in shopping malls and public gardens all over the world.
Taxonomy
Salvia splendens was first described and named in 1822. At that time it was given the common name "Lee's scarlet sage". Before the plant was selected to become dwarf in size, an early Dutch selection named 'Van Houttei' was chosen and is still popular in the horticulture trade.
Description
The native type is rarely used or described, though it grows tall in the wild. Its leaves are borne in even, elliptical arrangements, about 7 × 5 cm, with dentate margins and long petioles. The plant may branch; its upper branches are finely hairy, while the lower parts are hairless. Flowers appear in erect spikes that sprout from the centre of the plant, in groups of 2 to 6 per leaf node; they are scarlet, tubular or bell-shaped, 35 mm long, with two lobes towards the apex; the upper lobe is 13 mm long. It flowers through a good part of summer and autumn.
Cultivation
It is widely grown as an ornamental plant, with a large number of cultivars selected for colours ranging from white to dark purple. It is a subtropical species that does not survive freezing temperatures, but it can be grown in cold climates as an annual. The most common selections are the dwarf sizes that go by names such as 'Sizzler' and 'Salsa', planted en masse in gardens and malls. 'Van Houttei' is a taller selection. The various types typically have red flowers.
Cultivars
Named cultivars include:
S. splendens 'Alba', with white flowers
'Atropurpurea', with dark violet to purple flowers
'Atrosanguinea', flowers dark red
'Bicolor', flowers white and red
'Bruantii', small, with red flowers
'Compacta', small, flowers in dense racemes, white or red
'Grandiflora', large, with |
https://en.wikipedia.org/wiki/Chlamydia%20psittaci | {{Taxobox
| image = Chlamydophila psittaci FA stain.jpg
| image_caption = Direct fluorescent antibody stain of a mouse brain impression smear showing C. psittaci.
| domain = Bacteria
| phylum = Chlamydiota
| classis = Chlamydiia
| ordo = Chlamydiales
| familia = Chlamydiaceae
| genus = Chlamydia
| species = C. psittaci
| binomial = Chlamydia psittaci
| synonyms = Chlamydophila psittaci
}}
Chlamydia psittaci is a lethal intracellular bacterial species that may cause endemic avian chlamydiosis, epizootic outbreaks in mammals, and respiratory psittacosis in humans. Potential hosts include feral birds and domesticated poultry, as well as cattle, pigs, sheep, and horses. C. psittaci is transmitted by inhalation, contact, or ingestion among birds and to mammals. Psittacosis in birds and in humans often starts with flu-like symptoms and becomes a life-threatening pneumonia. Many strains remain quiescent in birds until activated by stress. Birds are excellent, highly mobile vectors for the distribution of chlamydia infection, because they feed on, and have access to, the detritus of infected animals of all sorts. C. psittaci in birds is often systemic, and infections can be inapparent, severe, acute, or chronic with intermittent shedding. C. psittaci strains in birds infect mucosal epithelial cells and macrophages of the respiratory tract. Septicaemia eventually develops and the bacteria become localized in epithelial cells and macrophages of most organs, the conjunctiva, and the gastrointestinal tract. It can also be passed in the eggs. Stress will commonly trigger the onset of severe symptoms, resulting in rapid deterioration and death. C. psittaci strains are similar in virulence, grow readily in cell culture, have 16S rRNA genes that differ by <0.8%, and belong to eight known serotypes. All should be considered readily transmissible to humans. C. psittaci serovar A is endemic among psittacine birds and has caused sporadic zoonotic disease in humans, other mammals, and tortoises
https://en.wikipedia.org/wiki/Multi-master%20replication | Multi-master replication is a method of database replication which allows data to be stored by a group of computers, and updated by any member of the group. All members are responsive to client data queries. The multi-master replication system is responsible for propagating the data modifications made by each member to the rest of the group and resolving any conflicts that might arise between concurrent changes made by different members.
Multi-master replication can be contrasted with primary-replica replication, in which a single member of the group is designated as the "master" for a given piece of data and is the only node allowed to modify that data item. Other members wishing to modify the data item must first contact the master node. Allowing only a single master makes it easier to achieve consistency among the members of the group, but is less flexible than multi-master replication.
Multi-master replication can also be contrasted with failover clustering where passive replica servers are replicating the master data in order to prepare for takeover in the event that the master stops functioning. The master is the only server active for client interaction.
Often, communication and replication in multi-master systems are handled via a consensus algorithm, but they can also be implemented via custom or proprietary algorithms specific to the software.
The primary purposes of multi-master replication are increased availability and faster server response time.
Advantages
Availability: If one master fails, other masters continue to update the database.
Distributed Access: Masters can be located in several physical sites, i.e. distributed across the network.
Disadvantages
Consistency: Most multi-master replication systems are only loosely consistent, i.e. lazy and asynchronous, violating ACID properties.
Performance: Eager replication systems are complex and increase communication latency.
Integrity: Issues such as conflict resolution can become intractable |
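The conflict-resolution problem noted above can be illustrated with one of the simplest strategies a loosely consistent multi-master system can use: a last-write-wins register. This is a hypothetical sketch of the general technique, not how any particular product resolves conflicts:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Write:
    value: str
    timestamp: float  # wall-clock or logical time of the write
    node_id: str      # originating master, used to break timestamp ties

def resolve(a: Write, b: Write) -> Write:
    """Last-write-wins: keep the newer write. Breaking ties on node_id makes
    the merge deterministic, so every master converges to the same value
    regardless of the order in which it receives the conflicting writes."""
    return max(a, b, key=lambda w: (w.timestamp, w.node_id))
```

Note that last-write-wins silently discards one of the concurrent updates, which is precisely the integrity concern listed above; consensus-based designs avoid that loss at the cost of coordination latency.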
https://en.wikipedia.org/wiki/Giuseppe%20Melfi | Giuseppe Melfi (born June 11, 1967) is an Italo-Swiss mathematician who works on practical numbers and modular forms.
Career
He gained his PhD in mathematics in 1997 at the University of Pisa. After some time spent at the University of Lausanne during 1997–2000, Melfi was appointed at the University of Neuchâtel, as well as at the University of Applied Sciences Western Switzerland and at the local University of Teacher Education.
Work
His major contributions are in the field of practical numbers. This prime-like sequence of numbers is known for having asymptotic behavior and other distribution properties similar to those of the primes. Melfi proved two conjectures, both raised in 1984, one of which is the analogue of the Goldbach conjecture for practical numbers: every even number is the sum of two practical numbers. He also proved that there exist infinitely many triples of practical numbers of the form (m − 2, m, m + 2).
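The Goldbach-type statement is easy to check numerically for small values. The sketch below uses the definition of a practical number directly (every integer up to n is a sum of distinct divisors of n) via a brute-force subset-sum; this is an illustration only and has nothing to do with Melfi's proof technique:

```python
def is_practical(n: int) -> bool:
    """n is practical if every m in 1..n is a sum of distinct divisors of n."""
    divisors = [d for d in range(1, n + 1) if n % d == 0]
    reachable = {0}                       # sums obtainable from distinct divisors
    for d in divisors:
        reachable |= {r + d for r in reachable}
    return all(m in reachable for m in range(1, n + 1))

practicals = [n for n in range(1, 200) if is_practical(n)]
pset = set(practicals)

# Goldbach-type theorem: every even number is a sum of two practical numbers.
assert all(any(e - p in pset for p in practicals) for e in range(2, 200, 2))
```

The sequence begins 1, 2, 4, 6, 8, 12, 16, 18, ..., and the assertion verifies the theorem for all even numbers below 200.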
Another notable contribution has been an application of the theory of modular forms, where he found new Ramanujan-type identities for the sum-of-divisors functions. His seven new identities extended the ten identities found by Ramanujan in 1913, including a remarkable identity relating σ(n), the sum of the divisors of n, to σ₃(n), the sum of the third powers of the divisors of n.
Among other problems in elementary number theory, he proved a theorem that allowed him to find a 5328-digit number that was for a while the largest known primitive weird number.
In applied mathematics his research interests include probability and simulation.
Selected research publications
.
See also
Applications of randomness |
https://en.wikipedia.org/wiki/Business%20rule%20management%20system | A business rule management system (BRMS) is a software system used to define, deploy, execute, monitor and maintain the variety and complexity of decision logic that is used by operational systems within an organization or enterprise. This logic, also referred to as business rules, includes policies, requirements, and conditional statements that are used to determine the tactical actions that take place in applications and systems.
Overview
A BRMS includes, at minimum:
A repository, allowing decision logic to be externalized from core application code
Tools, allowing both technical developers and business experts to define and manage decision logic
A runtime environment, allowing applications to invoke decision logic managed within the BRMS and execute it using a business rules engine
The top benefits of a BRMS include:
Reduced or removed reliance on IT departments for changes in live systems, although QA and rule testing are still needed in any enterprise system.
Increased control over implemented decision logic for compliance and better business management including audit logs, impact simulation and edit controls.
The ability to express decision logic with increased precision, using a business vocabulary syntax and graphical rule representations (decision tables, decision models, trees, scorecards and flows)
Improved efficiency of processes through increased decision automation.
Some disadvantages of the BRMS include:
Extensive subject-matter expertise can be required for vendor-specific products. In addition to appropriate design practices (such as decision modeling), technical developers must know how to write rules and integrate the software with existing systems.
Poor rule harvesting approaches can lead to long development cycles, though this can be mitigated with modern approaches like the Decision Model and Notation (DMN) standard.
Integration with existing systems is still required and a BRMS may add additiona |
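The core idea of the repository/tools/runtime triad described above, decision logic externalized from application code and evaluated by an engine, can be sketched minimally. This is a hypothetical example (rule names and order fields are invented for illustration); real BRMS products store rules in a managed repository and offer much richer rule vocabularies:

```python
# Rules live in data, not in application code: each entry pairs a condition
# (a predicate over a "fact" dictionary) with the action to take.
# The rule names and order fields below are hypothetical.
RULES = [
    ("high_value_order", lambda o: o["total"] > 1000,  "require_manager_approval"),
    ("gold_customer",    lambda o: o["tier"] == "gold", "apply_discount"),
    ("default",          lambda o: True,                "auto_approve"),
]

def decide(order: dict) -> str:
    """First-match evaluation: the simplest rules-engine strategy."""
    for _name, condition, action in RULES:
        if condition(order):
            return action
    raise ValueError("no rule matched")
```

Because RULES is plain data, it can be edited (by a business expert, or through repository-backed tooling) without touching decide() or redeploying the application, which is the separation of concerns a BRMS is built around.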
https://en.wikipedia.org/wiki/Five%20dots%20tattoo | The five dots tattoo is a tattoo of five dots arranged in a quincunx, usually on the outer surface of the hand, between the thumb and the index finger. The tattoo has different meanings in different cultures—it has been variously interpreted as a fertility symbol, a reminder of sayings on how to treat women or police, a way members of People Nation or Nuestra Familia affiliated gangs identify (People gangs identify with the number 5, while Folk Nation gangs use 6), a recognition symbol among the Romani people, a group of close friends, standing alone in the world, or time spent in prison (with the outer four dots representing the prison walls and the inner dot representing the prisoner). Thomas Edison had this pattern tattooed on his forearm.
See also
Criminal tattoo
Prison tattooing |
https://en.wikipedia.org/wiki/Protoplanetary%20nebula | A protoplanetary nebula or preplanetary nebula (PPN, plural PPNe) is an astronomical object in the short-lived episode during a star's rapid evolution between the late asymptotic giant branch (LAGB) phase and the subsequent planetary nebula (PN) phase. A PPN emits strongly in infrared radiation, and is a kind of reflection nebula. It is the second-from-last high-luminosity evolutionary phase in the life cycle of intermediate-mass stars (1–8 M☉).
Naming
The name protoplanetary nebula is an unfortunate choice, owing to possible confusion with the same term's occasional use for the unrelated concept of protoplanetary disks. It is a consequence of the older term planetary nebula, chosen because early astronomers looking through telescopes found planetary nebulae similar in appearance to gas giants such as Neptune and Uranus. To avoid confusion, the alternative term preplanetary nebula, which does not overlap with any other discipline of astronomy, has been suggested. Protoplanetary nebulae are often referred to as post-AGB stars, although that category also includes stars that will never ionize their ejected matter.
Evolution
Beginning
During the late asymptotic giant branch (LAGB) phase, when mass loss reduces the hydrogen envelope's mass to around 10⁻² M☉ for a core mass of 0.60 M☉, a star will begin to evolve towards the blue side of the Hertzsprung–Russell diagram. When the hydrogen envelope has been further reduced to around 10⁻³ M☉, the envelope will have been so disrupted that further significant mass loss is believed to be impossible. At this point, the effective temperature of the star, T*, will be around 5,000 K; this is defined as the end of the LAGB and the beginning of the PPN.
Protoplanetary nebula phase
During the ensuing protoplanetary nebula phase, the central star's effective temperature will continue rising as a result of the envelope's mass loss as a consequence o |
https://en.wikipedia.org/wiki/WUNI | WUNI (channel 66) is a television station licensed to Marlborough, Massachusetts, United States, broadcasting the Spanish-language Univision network to the Boston area. It is owned by TelevisaUnivision alongside Derry, New Hampshire–licensed True Crime Network affiliate WWJE-DT (channel 50); Entravision Communications operates WUNI under a joint sales agreement (JSA), making it sister to Worcester, Massachusetts–licensed UniMás affiliate WUTF-TV (channel 27). WUNI and WWJE share studios and transmitter facilities on Parmenter Road in Hudson; under the JSA, master control and some internal operations of WUNI are based at WUTF's studios on 4th Avenue in Needham.
History
As an English-language independent station
The station first signed on the air on January 1, 1970, as WSMW-TV, an independent station based in Worcester that featured English-language general entertainment programs including old movies (including the entire series of Abbott and Costello movies and the Bowery Boys/Dead-End Kids movies starring Huntz Hall), cartoons, religious shows (including the Jacob Brothers and The PTL Club), a cooking show (Cooking with Bernard), science fiction shows (such as UFO), dramas (including Maverick and Thriller), as well as sitcoms (including The Phil Silvers Show and Petticoat Junction). Though WSMW-TV was within the Boston market, it was far enough from Boston itself that the station was able to air some of the same shows as the Boston stations, in a similar situation to WMUR-TV (channel 9), the ABC affiliate in Manchester, New Hampshire. The station's call letters stood for State Mutual (Insurance Co.) in Worcester, the corporate owner of the station.
WSMW also broadcast sports programs; from its debut through the end of the 1971–72 NBA season, the station was the television home of the Boston Celtics. In 1970 and 1971, WSMW broadcast (same-weekend tape-delayed coverage of) New England Patriots preseason games. WSMW also offered extensive coverage of college baske |
https://en.wikipedia.org/wiki/Natural%20region | A natural region (landscape unit) is a basic geographic unit. Usually, it is a region which is distinguished by its common natural features of geography, geology, and climate.
From the ecological point of view, the naturally occurring flora and fauna of the region are likely to be influenced by its geographical and geological factors, such as soil and water availability, in a significant manner. Thus most natural regions are homogeneous ecosystems. Human impact can be an important factor in the shaping and destiny of a particular natural region.
Main terms
The concept "natural region" is a large basic geographical unit, like the vast boreal forest region. The term may also be used generically, like in alpine tundra, or specifically to refer to a particular place.
The term is particularly useful where there is no corresponding or coterminous official region. The Fens of eastern England, the Thai highlands, and the Pays de Bray in Normandy are examples of this. Others might include regions with particular geological characteristics, like badlands, such as the Bardenas Reales, an upland massif of acidic rock, or The Burren in Ireland.
See also
Ecoregion
Natural regions of Chile
Natural regions of Colombia
Natural regions of Germany
Natural regions of Venezuela
Physiographic regions of the world |
https://en.wikipedia.org/wiki/Daina%20Taimi%C5%86a | Daina Taimiņa (born August 19, 1954) is a Latvian mathematician, retired adjunct associate professor of mathematics at Cornell University, known for developing a way of modeling hyperbolic geometry with crocheted objects.
Education and career
Taimiņa received all of her formal education in Riga, Latvia, where in 1977 she graduated summa cum laude from the University of Latvia and completed her graduate work in theoretical computer science (with thesis advisor Prof. Rūsiņš Mārtiņš Freivalds) in 1990. Under the restrictions of the Soviet system at that time, a doctoral thesis could not be defended in Latvia, so she defended hers in Minsk, receiving the title of Candidate of Sciences; her doctorate was therefore formally issued by the Institute of Mathematics of the National Academy of Sciences of Belarus. After Latvia regained independence in 1991, Taimiņa received her higher doctoral degree (doktor nauk) in mathematics from the University of Latvia, where she taught for 20 years.
Daina Taimiņa joined the Cornell Math Department in December 1996.
Combining her interests in mathematics and crocheting, she is one of 24 mathematicians and artists who make up the Mathemalchemy Team.
Hyperbolic crochet
While attending a workshop at Cornell University on teaching geometry to university professors in 1997, Taimiņa was presented with a fragile paper model of a hyperbolic plane, made by the professor in charge of the workshop, David Henderson, and designed by geometer William Thurston. It was made "out of thin, circular strips of paper taped together". She decided to make more durable models, and did so by crocheting them. The night after first seeing the paper model at the workshop she began experimenting with algorithms for a crocheting pattern, after visualising hyperbolic planes as exponential growth.
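That exponential growth is easy to sketch in code. The function below is an illustrative reconstruction, not Taimiņa's actual pattern: working one extra stitch after every k stitches makes each row roughly (1 + 1/k) times longer than the last, so row length grows geometrically, as on a hyperbolic plane.

```python
def hyperbolic_rows(first_row: int, k: int, rows: int) -> list[int]:
    """Stitch count per row when one extra stitch is worked after every k stitches.

    Each row is about (1 + 1/k) times the previous one, so edge length
    grows geometrically -- the exponential growth of a hyperbolic plane.
    """
    counts = [first_row]
    for _ in range(rows - 1):
        n = counts[-1]
        counts.append(n + n // k)  # one increase per k stitches worked
    return counts

print(hyperbolic_rows(20, 5, 6))
```

A smaller k gives faster growth and, in a physical model, a more tightly ruffled surface.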
The following fall, Taimiņa was scheduled to teach a geometry class at Cornell. She was determined to find what she |
https://en.wikipedia.org/wiki/Pittsburgh%20toilet | A Pittsburgh toilet, or Pittsburgh potty, is a basement toilet configuration commonly found in the area of Pittsburgh, Pennsylvania, United States. It consists of an ordinary flush toilet with no surrounding walls. Most of these toilets are paired with a crude basement shower apparatus and large sink, which often doubles as a laundry basin.
Origin
The most popular explanation for the Pittsburgh toilet is related to Pittsburgh's status as a major industrial city in the 20th century. According to this explanation, these toilets were used by steelworkers and miners who, grimy from the day's labor, could enter the basement directly from outside through an exterior door and use the basement's shower and toilet before heading upstairs.
Alternatively, they may have served to prevent sewage backups from flooding the living areas of homes. As sewage backups tend to flood the lowest fixture in a residence, a Pittsburgh toilet would be the fixture to overflow, containing the sewage leak in the basement. |
https://en.wikipedia.org/wiki/The%20Unix%20Programming%20Environment | The Unix Programming Environment, first published in 1984 by Prentice Hall, is a book written by Brian W. Kernighan and Rob Pike, both of Bell Labs and considered an important and early document of the Unix operating system.
Unix philosophy
The book addresses the Unix philosophy of small cooperating tools with standardized inputs and outputs. Kernighan and Pike give a brief description of the Unix design and the Unix philosophy:
The authors further write that their goal for this book is "to communicate the UNIX programming philosophy."
Content and topics
The book starts off with an introduction to Unix for beginners. Next, it goes into the basics of the file system and shell. The reader is led through topics ranging from the use of filters, to how to use C for programming robust Unix applications, and the basics of grep, sed, make, and awk. The book closes with a tutorial on making a programming language parser with yacc and how to use troff with ms and mm to format documents, the preprocessors tbl, eqn, and pic, and making man pages with the man macro set. The appendices cover the ed editor and the abovementioned programming language, named hoc, which stands for "high-order calculator".
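The filter style the book teaches composes such tools through pipes; a generic illustration (not an example taken from the book) is this word-frequency counter, where each stage reads standard input and writes standard output:

```shell
# Count word frequencies with single-purpose filters joined by pipes.
printf 'the quick fox jumps over the lazy dog\n' |
tr -cs '[:alpha:]' '\n' |  # split into one word per line
sort |                     # bring duplicates together
uniq -c |                  # count each run of duplicates
sort -rn |                 # most frequent first
head -n 1                  # keep only the top word
```

Because every filter speaks plain text, any stage can be swapped or extended without touching the others.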
Historical context
Although Unix still exists decades after the publication of this book, the book describes an already mature Unix: In 1984, Unix had already been in development for 15 years (since 1969), it had been published in a peer-reviewed journal 10 years earlier (SOSP, 1974, "The UNIX Timesharing System"), and at least seven official editions of its manuals had been published (see Version 7 Unix). In 1984, several commercial and academic variants of UNIX already existed (e.g., Xenix, SunOS, BSD, UNIX System V, HP-UX), and a year earlier Dennis Ritchie and Ken Thompson won the prestigious Turing Award for their work on UNIX. The book was written not when UNIX was just starting out, but when it was already popular enough to be worthy of a book published for the masses o |
https://en.wikipedia.org/wiki/Logical%20Volume%20Manager%20%28Linux%29 | In Linux, Logical Volume Manager (LVM) is a device mapper framework that provides logical volume management for the Linux kernel. Most modern Linux distributions are LVM-aware to the point of being able to have their root file systems on a logical volume.
Heinz Mauelshagen wrote the original LVM code in 1998, when he was working at Sistina Software, taking its primary design guidelines from the HP-UX's volume manager.
Uses
LVM is used for the following purposes:
Creating single logical volumes of multiple physical volumes or entire hard disks (somewhat similar to RAID 0, but more similar to JBOD), allowing for dynamic volume resizing.
Managing large hard disk farms by allowing disks to be added and replaced without downtime or service disruption, in combination with hot swapping.
On small systems (like a desktop), instead of having to estimate at installation time how big a partition might need to be, LVM allows filesystems to be easily resized as needed.
Performing consistent backups by taking snapshots of the logical volumes.
Encrypting multiple physical partitions with one password.
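A typical command sequence for the first two uses might look like the sketch below. The device names (/dev/sdb1, /dev/sdc1) and volume names are placeholders, and the commands require root and real block devices, so treat this as an outline rather than a runnable recipe:

```shell
pvcreate /dev/sdb1 /dev/sdc1               # mark partitions as physical volumes
vgcreate vg0 /dev/sdb1 /dev/sdc1           # pool them into one volume group
lvcreate -n data -L 100G vg0               # carve out a logical volume
mkfs.ext4 /dev/vg0/data                    # create a filesystem on it
lvextend --resizefs -L +50G /dev/vg0/data  # later: grow the LV and filesystem online
```

Because the filesystem sits on the logical volume rather than a fixed partition, the final step can run while the volume is mounted.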
LVM can be considered a thin software layer on top of the hard disks and partitions, which creates an abstraction of continuity and ease of use for managing hard drive replacement, repartitioning and backup.
Features
Basic functionality
Volume groups (VGs) can be resized online by absorbing new physical volumes (PVs) or ejecting existing ones.
Logical volumes (LVs) can be resized online by concatenating extents onto them or truncating extents from them.
LVs can be moved between PVs.
Creation of read-only snapshots of logical volumes (LVM1), leveraging a copy on write (CoW) feature, or read/write snapshots (LVM2)
VGs can be split or merged in situ as long as no LVs span the split. This can be useful when migrating whole LVs to or from offline storage.
LVM objects can be tagged for administrative convenience.
VGs and LVs can be made active as the underlying devic |
https://en.wikipedia.org/wiki/Coordination%20geometry | The coordination geometry of an atom is the geometrical pattern defined by the atoms around the central atom. The term is commonly applied in the field of inorganic chemistry, where diverse structures are observed. The coodination geometry depends on the number, not the type, of ligands bonded to the metal centre as well as their locations. The number of atoms bonded is the coordination number.
The geometrical pattern can be described as a polyhedron where the vertices of the polyhedron are the centres of the coordinating atoms in the ligands.
The coordination preference of a metal often varies with its oxidation state. The number of coordination bonds (coordination number) can vary from two to as high as 20.
One of the most common coordination geometries is octahedral, where six ligands are coordinated to the metal in a symmetrical distribution, leading to the formation of an octahedron if lines were drawn between the ligands. Other common coordination geometries are tetrahedral and square planar.
Crystal field theory may be used to explain the relative stabilities of transition metal compounds of different coordination geometry, as well as the presence or absence of paramagnetism, whereas VSEPR may be used for complexes of main-group elements to predict geometry.
Crystallography usage
In a crystal structure the coordination geometry of an atom is the geometrical pattern of coordinating atoms where the definition of coordinating atoms depends on the bonding model used. For example, in the rock salt ionic structure each sodium atom has six near neighbour chloride ions in an octahedral geometry and each chloride has similarly six near neighbour sodium ions in an octahedral geometry. In metals with the body centred cubic (bcc) structure each atom has eight nearest neighbours in a cubic geometry. In metals with the face centred cubic (fcc) structure each atom has twelve nearest neighbours in a cuboctahedral geometry.
Table of coordination geometries
A table o |
https://en.wikipedia.org/wiki/Fibre%20Channel%20time-out%20values | The FC-PH standard defines three time-out values used for error detection and recovery in Fibre Channel protocol.
E_D_TOV stands for Error Detect TimeOut Value. This is the basic error timeout used for all Fibre Channel error detection. Its default value is 2 seconds.
R_A_TOV stands for Resource Allocation TimeOut Value. This is the amount of time given to devices to allocate the resources needed to process received frames. In practice this may be the time for re-calculation of routing tables in network devices. Its default value is 10 seconds.
R_T_TOV stands for Receiver-Transmitter TimeOut Value. This is the amount of time that the receiver logic uses to determine loss of sync on the wire. Its default value is 100 milliseconds, but can be changed to 100 microseconds.
All devices must use the default values until they have established different values (if they desire) with other devices, usually specified during the login procedure.
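In code, a driver might carry these defaults as a small lookup table until login negotiates other values; the structure below is illustrative, not taken from the standard:

```python
# Default Fibre Channel time-out values from FC-PH, in seconds.
FC_TIMEOUT_DEFAULTS = {
    "E_D_TOV": 2.0,   # Error Detect TimeOut Value
    "R_A_TOV": 10.0,  # Resource Allocation TimeOut Value
    "R_T_TOV": 0.1,   # Receiver-Transmitter TimeOut Value (100 ms)
}

# Sanity check: resource allocation is allowed more time than basic error detection.
assert FC_TIMEOUT_DEFAULTS["R_A_TOV"] > FC_TIMEOUT_DEFAULTS["E_D_TOV"]
```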
Fibre Channel |
https://en.wikipedia.org/wiki/British%20Informatics%20Olympiad | The British Informatics Olympiad (BIO) is an annual computer-programming competition for secondary and sixth-form students. Any student under 19 who is in full-time pre-university education and resident in mainland Britain is eligible to compete. The competition is composed of two rounds - a preliminary 3-question, 3-hour exam paper sat at the participant's school and a final round. The top-15 performing students each year are invited to the finals (currently hosted by Trinity College, Cambridge) where they attempt to solve several more difficult problems, some written, some involving programming. Typically a score of 70 to 80 out of 100 is required on the first round of the competition to reach the final.
Of these fifteen, four are chosen for the British team, and one or two are chosen as reserves. This team goes on to represent Britain in the International Olympiad in Informatics in the summer of that year.
Mark schemes are available for all past papers at the competition's official site. Official worked solutions are available for papers 1995-1999 and 2004, whilst unofficial solutions are available for papers 2009-2014.
Sponsors
The BIO has been sponsored by video-games developer Lionhead Studios since 2002.
In the past, it has also been sponsored by Data Connection.
See also
Young Scientists of the Year |
https://en.wikipedia.org/wiki/Nightmare%20%28Soulcalibur%29 | is a fictional character in the Soulcalibur series of video games. The evil possessor of 's body, he later becomes an entity entirely separated from Siegfried in Soulcalibur III onward and Nightmare is the living incarnation of Soul Edge and a vessel for Inferno.
Nightmare first appeared in one of the possible endings for the Siegfried character in the game Soul Edge. In the sequel Soulcalibur, he was given a name and featured as a central character. Ever since then, Nightmare has been a major antagonist, with his ownership of Soul Edge making him the objective of many other characters in the story. Nightmare has served as a recurring antagonist in contrast to the protagonist role played by Siegfried, as well as serving as Siegfried's archenemy until Soulcalibur V.
Nightmare has appeared in all the sequels to Soulcalibur with visual differences between each game. His fighting style was altered from Soulcalibur II to Soulcalibur III after Siegfried became a separate character.
Conception and design
During the development of Soulcalibur, Nightmare's helmet went through several iterations with an aim to give a sense of "pressure" and "emit an atmosphere that terrifies anyone who sees them." The finalized design was modeled after a unicorn, while his armor implemented imagery of a dragon, which the development team felt symbolized "evil, chaos and power". Early drafts of his design took inspiration from a chimera, with a lion's head on either his shoulder or chest, and one design featuring a snake in place of his right arm; the team considered the idea of either head breathing fire or being able to bite opponents. They instead shifted to a more traditional "black knight" look, with a large mutated arm gripping his sword. Said mutated left arm originally took inspiration from a fiddler crab, giving what the developers felt was a "heteromorphic" appearance to the character. While in the concept artwork Nightmare is depicted with his left arm mutated to hold the sword
https://en.wikipedia.org/wiki/W.%20Hugh%20Woodin | William Hugh Woodin (born April 23, 1955) is an American mathematician and set theorist at Harvard University. He has made many notable contributions to the theory of inner models and determinacy. A type of large cardinals, the Woodin cardinals, bear his name. In 2023, he was elected to the National Academy of Sciences.
Biography
Born in Tucson, Arizona, Woodin earned his Ph.D. from the University of California, Berkeley in 1984 under Robert M. Solovay. His dissertation title was Discontinuous Homomorphisms of C(Ω) and Set Theory. He served as chair of the Berkeley mathematics department for the 2002–2003 academic year. Woodin is a managing editor of the Journal of Mathematical Logic. He was elected a Fellow of the American Academy of Arts and Sciences in 2000.
He is the great-grandson of William Hartman Woodin, former Secretary of the Treasury.
Work
He has done work on the theory of generic multiverses and the related concept of Ω-logic, which suggested an argument that the continuum hypothesis is either undecidable or false in the sense of mathematical platonism. Woodin criticizes this view, arguing that it leads to a counterintuitive reduction in which all truths in the set-theoretic universe can be decided from a small part of it. He claims that these and related mathematical results lead (intuitively) to the conclusion that the continuum hypothesis has a truth value and the Platonistic approach is reasonable.
Woodin now predicts that there should be a way of constructing an inner model for almost all known large cardinals, which he calls the Ultimate L and which would have similar properties as Gödel's constructible universe. In particular, the continuum hypothesis would be true in this universe.
Honors
In 2008, Woodin held the Gödel Lecture titled The Continuum Hypothesis, the Ω Conjecture, and the inner model problem of one supercompact cardinal.
See also
AD+ |
https://en.wikipedia.org/wiki/Cabal%20%28set%20theory%29 | The Cabal was, or perhaps is, a set of set theorists in Southern California, particularly at UCLA and Caltech, but also at UC Irvine. Organization and procedures range from informal to nonexistent, so it is difficult to say whether it still exists or exactly who has been a member, but it has included such notable figures as Donald A. Martin, Yiannis N. Moschovakis, John R. Steel, and Alexander S. Kechris. Others who have published in the proceedings of the Cabal seminar include Robert M. Solovay, W. Hugh Woodin, Matthew Foreman, and Steve Jackson.
The work of the group is characterized by free use of large cardinal axioms, and research into the descriptive set theoretic behavior of sets of reals if such assumptions hold.
Some of the philosophical views of the Cabal seminar were described in and .
Publications |
https://en.wikipedia.org/wiki/Endostatin | Endostatin is a naturally occurring, 20-kDa C-terminal fragment derived from type XVIII collagen. It is reported to serve as an anti-angiogenic agent, similar to angiostatin and thrombospondin.
Endostatin is a broad-spectrum angiogenesis inhibitor and may interfere with the pro-angiogenic action of growth factors such as basic fibroblast growth factor (bFGF/FGF-2) and vascular endothelial growth factor (VEGF).
Background
Endostatin is an endogenous inhibitor of angiogenesis. It was first found secreted in the media of non-metastasizing mouse cells from a hemangioendothelioma cell line in 1997 and was subsequently found in humans, e.g. in platelets. It is produced by proteolytic cleavage of collagen XVIII, a member of the multiplexin family that is characterized by interruptions in the triple helix creating multiple domains, by proteases such as cathepsins. Collagen is a component of epithelial and endothelial basement membranes. Endostatin, as a fragment of collagen XVIII, demonstrates a role of the ECM in suppression of neoangiogenesis. Pro-angiogenic and anti-angiogenic factors can also be created by proteolysis during coagulation cascades. Endogenous inhibitors of angiogenesis are present in both normal tissue and cancerous tissue. Overall, endostatin downregulates many signaling cascades such as ephrin, TNF-α, and NF-κB signaling, as well as coagulation and adhesion cascades. Other collagen-derived antiangiogenic factors include arresten, canstatin, tumstatin, the α6 collagen type IV antiangiogenic fragment, and restin.
Structure
Human monomeric endostatin is a globular protein containing two disulfide bonds: Cys162−302 and Cys264−294. It folds tightly, has a zinc-binding domain at the N-terminus of the protein, and has a high affinity for heparin through an 11-arginine basic patch. Endostatin also binds all heparan sulfate proteoglycans with low affinity. Oligomeric endostatin (trimer or dimer) binds mainly with laminin of the basal lamina.
Biological a |
https://en.wikipedia.org/wiki/WWIVnet | WWIVnet was a Bulletin board system (BBS) network for WWIV-based BBSes. It was created by Wayne Bell on December 1, 1987. The system was similar to FidoNet in purpose, but used a very different routing mechanism that was more automated and distributed.
Network layout
WWIVnet consisted of several participating BBSes, each referenced by a unique number called a node number. Originally, WWIVnet nodes were numbered by area code. The format was TXYZZ, where X and Y were the first and last digits of the area code, and ZZ was a number from 00 to 49 in area codes with a middle digit of 0, or from 50 to 99 in area codes with a middle digit of 1. The T portion of the node number was used only if a particular area code ran out of node numbers in its assigned range and needed more; in that case the T became 1. Thus, node 5802 would be a node in the 508 area code, and node 12263 would be a node in the very busy 212 area code. This numbering system worked well until the telephone systems began using area codes with middle digits other than 0 or 1. When this occurred, WWIVnet had to change its numbering system, so a group-based system was adopted in which node numbers changed to an XZZZ format: X was the group number, and ZZZ was the system number within that group.
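The TXYZZ rule can be decoded mechanically. The helper below is a hypothetical reconstruction of the scheme as just described (pre-group numbering only); the function name is ours, not WWIVnet's:

```python
def area_code(node: int) -> str:
    """Recover the area code from a classic TXYZZ WWIVnet node number.

    X and Y are the first and last digits of the area code; ZZ in 00-49
    means the middle digit was 0, ZZ in 50-99 means it was 1. The optional
    leading T=1 only marked an extended block of node numbers.
    """
    zz = node % 100
    xy = (node // 100) % 100   # drop the optional T prefix
    x, y = divmod(xy, 10)
    middle = 0 if zz < 50 else 1
    return f"{x}{middle}{y}"

print(area_code(5802), area_code(12263))  # prints: 508 212
```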
The network's administration was set up where every area code had an Area Coordinator (AC) which was responsible for maintaining information about the nodes in their area code. The AC reported to the Group Coordinator (GC), which was responsible for updating the node lists for the area codes under them. The GC reported to the Network Coordinator (NC), who was responsible for sending out node list updates. The NC was the person who was ultimately in charge of WWIVnet.
The network structure, however, had everything to do with administration but nothing to do with the way traffic was transmitted. Simply put, the only way to control th
https://en.wikipedia.org/wiki/Birth%E2%80%93death%20process | The birth–death process (or birth-and-death process) is a special case of continuous-time Markov process where the state transitions are of only two types: "births", which increase the state variable by one and "deaths", which decrease the state by one. It was introduced by William Feller. The model's name comes from a common application, the use of such models to represent the current size of a population where the transitions are literal births and deaths. Birth–death processes have many applications in demography, queueing theory, performance engineering, epidemiology, biology and other areas. They may be used, for example, to study the evolution of bacteria, the number of people with a disease within a population, or the number of customers in line at the supermarket.
Definition
When a birth occurs, the process goes from state n to n + 1. When a death occurs, the process goes from state n to state n − 1. The process is specified by positive birth rates {λ_n} and positive death rates {μ_n}. Specifically, denote the process by X(t), and let P_n(t) = Pr(X(t) = n). Then for small Δt, the transition probabilities are assumed to satisfy the following properties:
Pr(X(t + Δt) = n + 1 | X(t) = n) = λ_n Δt + o(Δt),
Pr(X(t + Δt) = n − 1 | X(t) = n) = μ_n Δt + o(Δt),
Pr(X(t + Δt) = n | X(t) = n) = 1 − (λ_n + μ_n) Δt + o(Δt).
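These transition rules map directly onto a Gillespie-style simulation: wait an exponential time with rate λ_n + μ_n, then step up with probability λ_n/(λ_n + μ_n). The sketch below is illustrative; the M/M/1-queue-style rates at the end are an assumed example, not part of the definition.

```python
import random

def simulate(birth, death, n0, t_max, rng):
    """Simulate a birth-death process with rate functions birth(n), death(n)."""
    t, n, path = 0.0, n0, [(0.0, n0)]
    while True:
        lam, mu = birth(n), death(n)
        total = lam + mu
        if total == 0:                    # absorbing state: no events possible
            break
        t += rng.expovariate(total)       # exponential holding time
        if t >= t_max:
            break
        n += 1 if rng.random() < lam / total else -1   # birth vs death
        path.append((t, n))
    return path

# Example rates: arrivals at rate 1.0, departures at rate 1.5 when the state is positive.
rng = random.Random(42)
path = simulate(lambda n: 1.0, lambda n: 1.5 if n > 0 else 0.0, 0, 50.0, rng)
```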
Recurrence and transience
For recurrence and transience in Markov processes see Section 5.3 from Markov chain.
Conditions for recurrence and transience
Conditions for recurrence and transience were established by Samuel Karlin and James McGregor.
A birth-and-death process is recurrent if and only if ∑_{n=1}^∞ ∏_{i=1}^n (μ_i/λ_i) = ∞.
A birth-and-death process is ergodic if and only if ∑_{n=1}^∞ ∏_{i=1}^n (μ_i/λ_i) = ∞ and ∑_{n=1}^∞ ∏_{i=1}^n (λ_{i−1}/μ_i) < ∞.
A birth-and-death process is null-recurrent if and only if ∑_{n=1}^∞ ∏_{i=1}^n (μ_i/λ_i) = ∞ and ∑_{n=1}^∞ ∏_{i=1}^n (λ_{i−1}/μ_i) = ∞.
By using Extended Bertrand's test (see Section 4.1.4 from Ratio test) the conditions for recurrence, transience, ergodicity and null-recurrence can be derived in a more explicit form.
For integer k ≥ 1, let ln^{(k)}(x) denote the k-th iterate of the natural logarithm, i.e. ln^{(1)}(x) = ln(x) and, for any 2 ≤ j ≤ k, ln^{(j)}(x) = ln^{(j−1)}(ln(x)).
Then, the conditions for recurrence and transience of a birth-and-death process are as follows.
The birth-and-death process is transient if there exist and such that for all
where |
https://en.wikipedia.org/wiki/TeraGrid | TeraGrid was an e-Science grid computing infrastructure combining resources at eleven partner sites. The project started in 2001 and operated from 2004 through 2011.
The TeraGrid integrated high-performance computers, data resources and tools, and experimental facilities. Resources included more than a petaflops of computing capability and more than 30 petabytes of online and archival data storage, with rapid access and retrieval over high-performance computer network connections. Researchers could also access more than 100 discipline-specific databases.
TeraGrid was coordinated through the Grid Infrastructure Group (GIG) at the University of Chicago, working in partnership with the resource provider sites in the United States.
History
The US National Science Foundation (NSF) issued a solicitation asking for a "distributed terascale facility" from program director Richard L. Hilderbrandt.
The TeraGrid project was launched in August 2001 with $53 million in funding to four sites: the National Center for Supercomputing Applications (NCSA) at the University of Illinois at Urbana-Champaign, the San Diego Supercomputer Center (SDSC) at the University of California, San Diego, the University of Chicago Argonne National Laboratory, and the Center for Advanced Computing Research (CACR) at the California Institute of Technology in Pasadena, California.
The design was meant to be an extensible distributed open system from the start.
In October 2002, the Pittsburgh Supercomputing Center (PSC) at Carnegie Mellon University and the University of Pittsburgh joined the TeraGrid as major new partners when NSF announced $35 million in supplementary funding. The TeraGrid network was transformed through the ETF project from a 4-site mesh to a dual-hub backbone network with connection points in Los Angeles and at the Starlight facilities in Chicago.
In October 2003, NSF awarded $10 million to add four sites to TeraGrid as well as to establish a third network hub, in Atlanta. These |
https://en.wikipedia.org/wiki/Resist%20%28semiconductor%20fabrication%29 | In semiconductor fabrication, a resist is a thin layer used to transfer a circuit pattern to the semiconductor substrate which it is deposited upon. A resist can be patterned via lithography to form a (sub)micrometer-scale, temporary mask that protects selected areas of the underlying substrate during subsequent processing steps. The material used to prepare said thin layer is typically a viscous solution. Resists are generally proprietary mixtures of a polymer or its precursor and other small molecules (e.g. photoacid generators) that have been specially formulated for a given lithography technology. Resists used during photolithography are called photoresists.
Background
Semiconductor devices (as of 2005) are built by depositing and patterning many thin layers. The patterning steps, or lithography, define the function of the device and the density of its components.
For example, in the interconnect layers of a modern microprocessor, a conductive material (copper or aluminum) is inlaid in an electrically insulating matrix (typically fluorinated silicon dioxide or another low-k dielectric). The metal patterns define multiple electrical circuits that are used to connect the microchip's transistors to one another and ultimately to external devices via the chip's pins.
The most common patterning method used by the semiconductor device industry is photolithography -- patterning using light. In this process, the substrate of interest is coated with photosensitive resist and irradiated with short-wavelength light projected through a photomask, which is a specially prepared stencil formed of opaque and transparent regions - usually a quartz substrate with a patterned chromium layer. The shadow of opaque regions in the photomask forms a submicrometer-scale pattern of dark and illuminated regions in the resist layer -- the aerial image. Chemical and physical changes occur in the exposed areas of the resist layer. For example, chemical bonds may be formed or destroyed,
https://en.wikipedia.org/wiki/Pertechnetate | The pertechnetate ion () is an oxyanion with the chemical formula . It is often used as a convenient water-soluble source of isotopes of the radioactive element technetium (Tc). In particular it is used to carry the 99mTc isotope (half-life 6 hours) which is commonly used in nuclear medicine in several nuclear scanning procedures.
A technetate(VII) salt is a compound containing this ion. Pertechnetate compounds are salts of technetic(VII) acid. Pertechnetate is analogous to permanganate but has little oxidizing power, though more than perrhenate.
Understanding pertechnetate is important in understanding technetium contamination in the environment and in nuclear waste management.
Chemistry
TcO4− is the starting material for most of the chemistry of technetium. Pertechnetate salts are usually colorless. TcO4− is produced by oxidizing technetium with nitric acid or with hydrogen peroxide. The pertechnetate anion is similar to the permanganate anion but is a weaker oxidizing agent. It is tetrahedral and diamagnetic. The standard electrode potential for TcO4−/TcO2 is only +0.738 V in acidic solution, as compared to +1.695 V for MnO4−/MnO2. Because of its diminished oxidizing power, TcO4− is stable in alkaline solution. TcO4− is more similar to ReO4−. Depending on the reducing agent, TcO4− can be converted to derivatives containing Tc(VI), Tc(V), and Tc(IV). In the absence of strong complexing ligands, TcO4− is reduced to a +4 oxidation state via the formation of TcO2 hydrate.
Preparation of 99mTcO4−
99mTcO4− is conveniently available in high radionuclidic purity from molybdenum-99, which decays with 87% probability to 99mTc. 99Mo can be produced in a nuclear reactor via irradiation of either molybdenum-98 or naturally occurring molybdenum with thermal neutrons, but this is not the method currently in use today. Currently, 99Mo is recovered as a product of the nuclear fission of uranium-235, separated from other fission products via a multistep process and loaded ont
https://en.wikipedia.org/wiki/Metastability%20%28electronics%29 | In electronics, metastability is the ability of a digital electronic system to persist for an unbounded time in an unstable equilibrium or metastable state.
In digital logic circuits, a digital signal is required to be within certain voltage or current limits to represent a '0' or '1' logic level for correct circuit operation; if the signal is within a forbidden intermediate range it may cause faulty behavior in logic gates the signal is applied to. In metastable states, the circuit may be unable to settle into a stable '0' or '1' logic level within the time required for proper circuit operation. As a result, the circuit can act in unpredictable ways, and may lead to a system failure, sometimes referred to as a "glitch". Metastability is an instance of the Buridan's ass paradox.
Metastable states are inherent features of asynchronous digital systems, and of systems with more than one independent clock domain. In self-timed asynchronous systems, arbiters are designed to allow the system to proceed only after the metastability has resolved, so the metastability is a normal condition, not an error condition.
In synchronous systems with asynchronous inputs, synchronizers are designed to make the probability of a synchronization failure acceptably small.
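Synchronizer design is commonly guided by the exponential MTBF model MTBF = e^(t_r/τ) / (T_w · f_clk · f_data), where t_r is the settling time allowed before the next stage samples. The sketch below applies that model; the parameter values are purely illustrative, not taken from any datasheet:

```python
import math

def synchronizer_mtbf(t_resolve, tau, t_w, f_clk, f_data):
    """Mean time between synchronization failures, in seconds.

    t_resolve : settling time allowed before the next flip-flop samples (s)
    tau       : flip-flop metastability resolution time constant (s)
    t_w       : metastability aperture/window parameter (s)
    f_clk     : sampling clock frequency (Hz)
    f_data    : asynchronous input event rate (Hz)
    """
    return math.exp(t_resolve / tau) / (t_w * f_clk * f_data)

# Illustrative parameters: adding a second synchronizer stage (one more
# clock period of settling time) multiplies the MTBF enormously.
one_stage = synchronizer_mtbf(5e-9, 50e-12, 100e-12, 200e6, 10e6)
two_stage = synchronizer_mtbf(10e-9, 50e-12, 100e-12, 200e6, 10e6)
```

The exponential dependence on t_resolve/τ is why a simple chain of two or three flip-flops usually makes the failure probability negligible.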
Metastable states are avoidable in fully synchronous systems when the input setup and hold time requirements on flip-flops are satisfied.
Example
A simple example of metastability can be found in an SR NOR latch, when Set and Reset inputs are true (R=1 and S=1) and then both transition to false (R=0 and S=0) at about the same time. Both outputs Q and Q̄ are initially held at 0 by the simultaneous Set and Reset inputs. After both Set and Reset inputs change to false, the flip-flop will (eventually) end up in one of two stable states, with one of Q and Q̄ true and the other false. The final state will depend on which of R or S returns to zero first, chronologically, but if both transition at about the same time, the resul
https://en.wikipedia.org/wiki/Cranberry%20sauce | Cranberry sauce or cranberry jam is a sauce or relish made out of cranberries, commonly served as a condiment or a side dish with Thanksgiving dinner in North America and Christmas dinner in the United Kingdom and Canada. There are differences in flavor depending on the geography of where the sauce is made: in Europe it is generally slightly sour-tasting, while in North America it is typically more heavily sweetened.
History
The recipe for cranberry sauce appears in the 1796 edition of American Cookery by Amelia Simmons, the first known cookbook authored by an American.
In 1606, the Mi'kmaq people introduced the French settlers at Port Royal, Nova Scotia to cranberries. The berries would have been sweetened with maple sugar and served at the settlers' first Thanksgiving in North America that year. The settlers described eating what they called "small red apples" in letters sent back to France. Reports from Port-Royal contained menus describing cranberries. They are still called "pommes de prés", or meadow apples, in Acadia today.
Although the Pilgrims may have been aware of the wild cranberries growing in the Massachusetts Bay area, it is unlikely that cranberry sauce would have been among the dishes served at the First Thanksgiving meal. Cranberries are not mentioned by any primary sources for the First Thanksgiving meal. The only foods mentioned are "Indian corn", wild turkey and waterfowl, and venison. The rest remains a matter of speculation among food historians. Although stuffings are not mentioned in primary sources, it was a common way to prepare birds for the table in the 17th century. According to a "Thanksgiving Primer" published by the Plimoth Plantation, cranberries may have been used in the stuffing recipes, but it is unlikely they would have been made into a sauce because sugar was very scarce.
Cranberry sauce was first offered to consumers in North America in 1912 in Hanson, Massachusetts. Canned cranberry sauce appeared on the market in 1941, allowing the |
https://en.wikipedia.org/wiki/Special%20right%20triangle | A special right triangle is a right triangle with some regular feature that makes calculations on the triangle easier, or for which simple formulas exist. For example, a right triangle may have angles that form simple relationships, such as 45°–45°–90°. This is called an "angle-based" right triangle. A "side-based" right triangle is one in which the lengths of the sides form ratios of whole numbers, such as 3 : 4 : 5, or of other special numbers such as the golden ratio. Knowing the relationships of the angles or ratios of sides of these special right triangles allows one to quickly calculate various lengths in geometric problems without resorting to more advanced methods.
Angle-based
Angle-based special right triangles are specified by the relationships of the angles of which the triangle is composed. The angles of these triangles are such that the larger (right) angle, which is 90° (π/2 radians), is equal to the sum of the other two angles.
The side lengths are generally deduced from the unit circle or other geometric methods. This approach may be used to rapidly reproduce the values of trigonometric functions for the angles 30°, 45°, and 60°.
Special triangles are used to aid in calculating common trigonometric functions.
The 45°–45°–90° triangle, the 30°–60°–90° triangle, and the equilateral/equiangular (60°–60°–60°) triangle are the three Möbius triangles in the plane, meaning that they tessellate the plane via reflections in their sides; see Triangle group.
45° - 45° - 90° triangle
In plane geometry, constructing the diagonal of a square results in a triangle whose three angles are in the ratio 1 : 1 : 2, adding up to 180° or π radians. Hence, the angles respectively measure 45° (π/4), 45° (π/4), and 90° (π/2). The sides in this triangle are in the ratio 1 : 1 : √2, which follows immediately from the Pythagorean theorem.
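The side ratios of the angle-based special triangles can be checked numerically; the short sketch below verifies both triangles against the Pythagorean theorem and the standard trigonometric values:

```python
import math

# 45-45-90: legs 1 and 1, hypotenuse sqrt(2)
assert math.isclose(1**2 + 1**2, math.sqrt(2) ** 2)
assert math.isclose(math.sin(math.radians(45)), 1 / math.sqrt(2))

# 30-60-90: sides 1, sqrt(3), 2 (half of an equilateral triangle)
assert math.isclose(1**2 + math.sqrt(3) ** 2, 2**2)
assert math.isclose(math.sin(math.radians(30)), 1 / 2)
assert math.isclose(math.cos(math.radians(30)), math.sqrt(3) / 2)

print("special-triangle ratios verified")
```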
Of all right triangles, the 45° - 45° - 90° triangle has the smallest ratio of the hypotenuse to the sum of |
https://en.wikipedia.org/wiki/It%27s%20Too%20Late%20to%20Stop%20Now | It's Too Late to Stop Now is a 1974 live double album by Northern Irish singer-songwriter Van Morrison. It features performances that were recorded in concerts at the Troubadour in Los Angeles, California, the Santa Monica Civic Auditorium, and the Rainbow in London, during Morrison's three-month tour with his eleven-piece band, the Caledonia Soul Orchestra, from May to July 1973. Frequently named as one of the best live albums ever, It's Too Late to Stop Now was recorded during what has often been said to be the singer's greatest phase as a live performer.
Volumes II, III and IV of the album were released as a box-set in 2016, also including a DVD.
Tour and performances
Noted for being a mercurial and temperamental live performer, during this short period in 1973 Morrison went on one of his most diligent tours in years. With his eleven-piece band, the Caledonia Soul Orchestra, which included a horn and string section, he has often been said to have been at his live performing peak. The tour was a dynamic musical journey for both Van Morrison and the band. "He wasn't looking to repeat himself. He wanted to create a new show every night," said Jef Labes. Van was changing his phrasing with each performance and would direct the band with hand gestures onstage. "We could take the songs anywhere Van wanted to take them," said guitarist Platania. "Every performance of each song was different." The alternate recordings of songs on the various volumes of the 2016 box-set bear this out.
Morrison said about touring during this period:
"I am getting more into performing. It's incredible. When I played Carnegie Hall in the fall something just happened. All of a sudden I felt like 'you're back into performing' and it just happened like that...A lot of times in the past I've done gigs and it was rough to get through them. But now the combination seems to be right and it's been clicking a lot."
Evidence of his newly invigorated joy in performing was on |
https://en.wikipedia.org/wiki/Uniform%20access%20principle | The uniform access principle of computer programming was put forth by Bertrand Meyer (originally in his book Object-Oriented Software Construction). It states "All services offered by a module should be available through a uniform notation, which does not betray whether they are implemented through storage or through computation." This principle applies generally to the syntax of object-oriented programming languages. In simpler form, it states that there should be no syntactical difference between working with an attribute, pre-computed property, or method/query of an object.
While most examples focus on the "read" aspect of the principle (i.e., retrieving a value), Meyer shows that the "write" implications (i.e., modifying a value) of the principle are harder to deal with in his monthly column on the Eiffel programming language official website.
Explanation
The problem being addressed by Meyer involves the maintenance of large software projects or software libraries. Sometimes when developing or maintaining software it is necessary, after much code is in place, to change a class or object in a way that transforms what was simply an attribute access into a method call. Programming languages often use different syntax for attribute access and for invoking a method, so the change would require, in popular programming languages of the day, editing the source code in every place the attribute was used, potentially in many locations throughout a very large volume of source code. Or worse, if the change is in an object library used by hundreds of customers, each of those customers would have to find and change all the places the attribute was used in their own code and recompile their programs.
Going the reverse way (from a method to a simple attribute) was not really a problem, as one can always keep the function and have it simply return the attribute value.
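Meyer's own examples use Eiffel, where the principle is built into the language; the same idea can be sketched in Python, whose property mechanism lets a stored attribute be replaced by a computed one without changing client syntax (the class and attribute names here are invented for illustration):

```python
# Uniform access via `property`: client code reads `t.celsius` identically
# whether the value is stored or computed, so swapping the implementation
# breaks no callers.

class ThermometerV1:
    def __init__(self, celsius):
        self.celsius = celsius            # plain stored attribute


class ThermometerV2:
    """Internally reworked to store kelvin; callers are unaffected."""
    def __init__(self, celsius):
        self._kelvin = celsius + 273.15

    @property
    def celsius(self):                    # now computed on access
        return self._kelvin - 273.15


for t in (ThermometerV1(20.0), ThermometerV2(20.0)):
    assert abs(t.celsius - 20.0) < 1e-9   # identical notation, either way
```

Languages without such a mechanism force the library author to choose between attribute and method syntax up front, which is exactly the maintenance trap described above.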
Meyer recognized the need for software |
https://en.wikipedia.org/wiki/Sender%20Rewriting%20Scheme | The Sender Rewriting Scheme (SRS) is a scheme for bypassing the Sender Policy Framework's (SPF) methods of preventing forged sender addresses. Forging a sender address is also known as email spoofing.
Background
In a number of cases, including change of email address and mailing lists, a message transfer agent (MTA) accepts an email message that is not destined to a local mailbox but needs to be forwarded. In such cases, the question arises of who deserves to receive any related bounce message. In general, that is either the author, or a person or other entity who administers the forwarding itself. Sending bounces to the author is administratively simpler and used to be accomplished by just keeping the original envelope sender. However, if the author address is subject to a strict SPF policy and the target MTA happens to enforce it, the forwarding transaction can be rejected.
As a workaround, it is possible to synthesize on the fly a temporary bounce address that will direct any bounce back to the current MTA. The scheme provides for recovering the original envelope address, so that if a bounce does arrive, it can be forwarded along the reverse path—with an empty envelope sender this time.
While there are other workarounds, SRS (Sender Rewriting Scheme) is a fairly general one. Its notion of reversing the path resembles the original routing dispositions for email; see below.
Note that SRS rewriting fails DMARC's SPF alignment check by design; a message can still pass DMARC via its DKIM check.
The rewriting scheme
SRS is a form of variable envelope return path (VERP) inasmuch as it encodes the original envelope sender in the local part of the rewritten address. When forwarding a message, the forwarding MTA replaces the original envelope sender with a new address at its own domain that embeds the original sender's domain and local part; the scheme described here is adapted from Shevek. With respect to VERP, the local part is moved after the domain name |
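The rewriting and its reversal can be sketched as follows (invented addresses, a toy 4-character HMAC tag, and a fixed placeholder timestamp; real SRS implementations encode a base32 timestamp and validate the tag before accepting a bounce):

```python
import hashlib
import hmac

SECRET = b"forwarder-secret"   # hypothetical per-forwarder signing key

def srs0_rewrite(sender, forwarder_domain, timestamp="TT"):
    """Rewrite an envelope sender into an SRS0-style address at the
    forwarder's domain. Assumes the original address contains no '='."""
    local, domain = sender.split("@")
    payload = f"{timestamp}={domain}={local}"
    tag = hmac.new(SECRET, payload.encode(), hashlib.sha1).hexdigest()[:4].upper()
    return f"SRS0={tag}={payload}@{forwarder_domain}"

def srs0_reverse(address):
    """Recover the original envelope sender from an SRS0 address."""
    local_part, _forwarder = address.split("@")
    _scheme, _tag, _ts, domain, local = local_part.split("=")
    return f"{local}@{domain}"

rewritten = srs0_rewrite("alice@example.org", "forwarder.example")
assert rewritten.startswith("SRS0=")
assert srs0_reverse(rewritten) == "alice@example.org"
```

Because the rewritten address points at the forwarder's own domain, any bounce returns to the forwarder, which can reverse the encoding and relay the bounce along the original path.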
https://en.wikipedia.org/wiki/Microdialysis | Microdialysis is a minimally-invasive sampling technique that is used for continuous measurement of free, unbound analyte concentrations in the extracellular fluid of virtually any tissue. Analytes may include endogenous molecules (e.g. neurotransmitter, hormones, glucose, etc.) to assess their biochemical functions in the body, or exogenous compounds (e.g. pharmaceuticals) to determine their distribution within the body. The microdialysis technique requires the insertion of a small microdialysis catheter (also referred to as microdialysis probe) into the tissue of interest. The microdialysis probe is designed to mimic a blood capillary and consists of a shaft with a semipermeable hollow fiber membrane at its tip, which is connected to inlet and outlet tubing. The probe is continuously perfused with an aqueous solution (perfusate) that closely resembles the (ionic) composition of the surrounding tissue fluid at a low flow rate of approximately 0.1-5μL/min. Once inserted into the tissue or (body)fluid of interest, small solutes can cross the semipermeable membrane by passive diffusion. The direction of the analyte flow is determined by the respective concentration gradient and allows the usage of microdialysis probes as sampling as well as delivery tools. The solution leaving the probe (dialysate) is collected at certain time intervals for analysis.
History
The microdialysis principle was first employed in the early 1960s, when push-pull canulas and dialysis sacs were implanted into animal tissues, especially into rodent brains, to directly study the tissues' biochemistry. While these techniques had a number of experimental drawbacks, such as the number of samples per animal or no/limited time resolution, the invention of continuously perfused dialytrodes in 1972 helped to overcome some of these limitations. Further improvement of the dialytrode concept resulted in the invention of the "hollow fiber", a tubular semipermeable membrane with a diameter of ~200-300μm |
https://en.wikipedia.org/wiki/CRT%20projector | A CRT projector is a video projector that uses a small, high-brightness cathode ray tube (CRT) as the image generating element. The image is then focused and enlarged onto a screen using a lens kept in front of the CRT face. The first color CRT projectors came out in the early 1950s. Most modern CRT projectors are color and have three separate CRTs (instead of a single, color CRT), and their own lenses to achieve color images. The red, green and blue portions of the incoming video signal are processed and sent to the respective CRTs whose images are focused by their lenses to achieve the overall picture on the screen. Various designs have made it to production, including the "direct" CRT-lens design, and the Schmidt CRT, which employed a phosphor screen that illuminates a perforated spherical mirror, all within an evacuated cathode ray tube.
The image in the Sinclair Microvision flat CRT is viewed from the same side of the phosphor struck by the electron beam. The other side of the screen can be connected directly to a heat sink, allowing the projector to run at much brighter power levels than the more common CRT arrangement.
Though systems utilizing projected video at one time almost exclusively used CRT projectors, they have largely been replaced by other technologies such as LCD projection and Digital Light Processing. Improvements in these digital video projectors, and their subsequent increased availability and desirability, resulted in a drastic decline of CRT projector sales by 2009. Very few, if any, new units are now manufactured, though a number of installers do sell refurbished units, generally higher-end 8" and 9" models.
Some of the first CRT projection tubes were made in 1933, and by 1938 CRT projectors were already in use in theaters.
Advantages and disadvantages
Advantages
Long service life; CRTs maintain good brightness to 10,000 hours, although this depends on the contrast adjustment setting of the projector. A projector that is set to |
https://en.wikipedia.org/wiki/Veritas%20Storage%20Foundation | Veritas Storage Foundation (VSF), previously known as Veritas Foundation Suite, is a computer software product made by Veritas Software that combines Veritas Volume Manager (VxVM) and Veritas File System (VxFS) to provide online-storage management. Symantec Corporation developed and maintained VSF until January 29, 2016, at which point Veritas and Symantec separated. The latest product version, 7.0, was re-branded as "Veritas InfoScale 7.0".
Veritas Storage Foundation provides:
Dynamic storage tiering (DST)
Dynamic multipathing (DMP)
RAID support
Major releases
Veritas Storage Foundation was also packaged in bundles, such as Veritas Storage Foundation with Veritas Cluster Server, editions for databases and for Oracle RAC, and Veritas Cluster File System.
Veritas InfoScale Enterprise 7.0, December 2015
Veritas Storage Foundation 6.0, December 2011
Veritas Storage Foundation 5.1, December 2009
Veritas Storage Foundation Basic 4.x and 5.x, February 2007, a free version that imposes usage limits
Veritas Storage Foundation 5.0, July 2006
Veritas Storage Foundation 4.3 (Windows-only release), August 2005
Veritas Storage Foundation 4.2 (Windows-only release), December 2004
Supports Microsoft Multipath I/O (MPIO) (Windows Server 2003 only)
Includes Veritas Volume Replicator (VVR)
Veritas Storage Foundation 4.1, May 2004
Veritas Storage Foundation 4.0
Veritas Foundation Suite 3.5
Veritas Foundation Suite 3.4
Veritas Foundation Suite 2.2
Supported OS platforms included AIX, Solaris, HP-UX, Red Hat Linux, SUSE Linux and Microsoft Windows.
See also
Veritas Volume Manager (VxVM)
Veritas File System (VxFS)
Symantec Operations Readiness Tools (SORT) |
https://en.wikipedia.org/wiki/Creamed%20corn | Creamed corn (which is also known by other names, such as cream-style sweet corn) is a type of creamed vegetable dish made by combining pieces of whole sweetcorn with a soupy liquid of milky residue from immature pulped corn kernels scraped from the cob. Originating in Native American cuisine, it is now most commonly eaten in the Midwestern and Southern United States, as well as being used in the French Canadian dish pâté chinois ('Chinese pie': a dish like shepherd's pie). It is a soupy version of sweetcorn, and unlike other preparations of sweetcorn, creamed corn is partially puréed, releasing the liquid contents of the kernels.
Additional ingredients
Canned creamed corn does not usually contain any cream, but some homemade versions may include milk or cream. Sugar and starch may also be added. Commercial, store-bought canned preparations may contain tapioca starch as a thickener.
Gallery
See also
Corn soup
Corn stew
Corn chowder
Cream soup
Grits
List of maize dishes
List of soups |
https://en.wikipedia.org/wiki/Idealised%20population | In population genetics an idealised population is one that can be described using a number of simplifying assumptions. Models of idealised populations are either used to make a general point, or they are fit to data on real populations for which the assumptions may not hold true. For example, coalescent theory is used to fit data to models of idealised populations. The most common idealized population in population genetics is described in the Wright-Fisher model after Sewall Wright and Ronald Fisher (1922, 1930) and (1931). Wright-Fisher populations have constant size, and their members can mate and reproduce with any other member. Another example is a Moran model, which has overlapping generations, rather than the non-overlapping generations of the Fisher-Wright model. The complexities of real populations can cause their behavior to match an idealised population with an effective population size that is very different from the census population size of the real population. For sexual diploids, idealized populations will have genotype frequencies related to the allele frequencies according to Hardy-Weinberg equilibrium.
Hardy-Weinberg
In 1908, G. H. Hardy and Wilhelm Weinberg modeled an idealised population to demonstrate that, in the absence of selection, migration, and random genetic drift, allele frequencies stay constant over time, and that under random mating, genotype frequencies are related to allele frequencies according to a binomial square principle called the Hardy-Weinberg law.
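The binomial-square law is easy to verify numerically; the sketch below (function and variable names invented) checks that the genotype frequencies sum to one and that the allele frequency is unchanged after one generation of random mating:

```python
# Hardy-Weinberg: with allele frequencies p + q = 1, random mating yields
# genotype frequencies p^2 (AA), 2pq (Aa) and q^2 (aa).

def genotype_frequencies(p):
    q = 1.0 - p
    return p * p, 2 * p * q, q * q

def next_allele_frequency(p):
    aa, het, _bb = genotype_frequencies(p)
    return aa + het / 2        # each heterozygote carries one A allele

p = 0.3
aa, het, bb = genotype_frequencies(p)
assert abs(aa + het + bb - 1.0) < 1e-12          # binomial square sums to 1
assert abs(next_allele_frequency(p) - p) < 1e-12  # p is constant over time
```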
Usage in population dynamics
A good example of the use of an idealised population model in tracking natural populations can be found in the research of Joe Roman and Stephen R. Palumbi (2003). Using genetic diversity data, they asked: have populations of North Atlantic great whales recovered enough for commercial whaling? To calculate genetic diversity, the authors multiplied the long-term effective population size of the females by two, assuming a sex ratio of 1:1, and |
https://en.wikipedia.org/wiki/Propidium%20iodide | Propidium iodide (or PI) is a fluorescent intercalating agent that can be used to stain cells and nucleic acids. PI binds to DNA by intercalating between the bases with little or no sequence preference. When in an aqueous solution, PI has a fluorescent excitation maximum of 493 nm (blue-green), and an emission maximum of 636 nm (red). After binding DNA, the quantum yield of PI is enhanced 20-30 fold, and the excitation/emission maximum of PI is shifted to 535 nm (green) / 617 nm (orange-red). Propidium iodide is used as a DNA stain in flow cytometry to evaluate cell viability or DNA content in cell cycle analysis, or in microscopy to visualize the nucleus and other DNA-containing organelles. Propidium Iodide is not membrane-permeable, making it useful to differentiate necrotic, apoptotic and healthy cells based on membrane integrity. PI also binds to RNA, necessitating treatment with nucleases to distinguish between RNA and DNA staining. PI is widely used in fluorescence staining and visualization of the plant cell wall.
See also
Viability assay
Vital stain
SYBR Green I
Ethidium bromide |
https://en.wikipedia.org/wiki/Transverse%20abdominal%20muscle | The transverse abdominal muscle (TVA), also known as the transverse abdominis, transversalis muscle and transversus abdominis muscle, is a muscle layer of the anterior and lateral (front and side) abdominal wall , deep to (layered below) the internal oblique muscle. It is thought by most fitness instructors to be a significant component of the core.
Structure
The transverse abdominal, so called for the direction of its fibers, is the innermost of the flat muscles of the abdomen. It is positioned immediately deep to the internal oblique muscle.
The transverse abdominal arises as fleshy fibers, from the lateral third of the inguinal ligament, from the anterior three-fourths of the inner lip of the iliac crest, from the inner surfaces of the cartilages of the lower six ribs, interdigitating with the diaphragm, and from the thoracolumbar fascia. It ends anteriorly in a broad aponeurosis (the Spigelian fascia), the lower fibers of which curve inferomedially (medially and downward), and are inserted, together with those of the internal oblique muscle, into the crest of the pubis and pectineal line, forming the inguinal conjoint tendon also called the aponeurotic falx. In layman's terms, the muscle ends in the middle line of a person's abdomen.
Throughout the rest of its extent the aponeurosis passes horizontally to the middle line, and is inserted into the linea alba; its upper three-fourths lie behind the rectus muscle and blend with the posterior lamella of the aponeurosis of the internal oblique; its lower fourth is in front of the rectus abdominis.
Innervation
The transverse abdominal is innervated by the lower intercostal nerves (thoracoabdominal, nerve roots T7-T11), as well as the iliohypogastric nerve and the ilioinguinal nerve.
Function
The transverse abdominal helps to compress the ribs and viscera, providing thoracic and pelvic stability. The transverse abdominal also helps a pregnant woman to deliver her child.
Without a s |
https://en.wikipedia.org/wiki/Wirth%27s%20law | Wirth's law is an adage on computer performance which states that software is getting slower more rapidly than hardware is becoming faster.
The adage is named after Niklaus Wirth, a computer scientist who discussed it in his 1995 article "A Plea for Lean Software".
History
Wirth attributed the saying to Martin Reiser, who in the preface to his book on the Oberon System wrote: "The hope is that the progress in hardware will cure all software ills. However, a critical observer may observe that software manages to outgrow hardware in size and sluggishness." Other observers had noted this for some time before; indeed, the trend was becoming obvious as early as 1987.
He states two contributing factors to the acceptance of ever-growing software as: "rapidly growing hardware performance" and "customers' ignorance of features that are essential versus nice-to-have". Enhanced user convenience and functionality supposedly justify the increased size of software, but Wirth argues that people are increasingly misinterpreting complexity as sophistication, that "these details are cute but not essential, and they have a hidden cost". As a result, he calls for the creation of "leaner" software and pioneered the development of Oberon, a software system developed between 1986 and 1989 based on nothing but hardware. Its primary goal was to show that software can be developed with a fraction of the memory capacity and processor power usually required, without sacrificing flexibility, functionality, or user convenience.
Other names
The law was restated in 2009 and attributed to Google co-founder Larry Page. It has been referred to as Page's law. The first use of that name is attributed to fellow Google co-founder Sergey Brin at the 2009 Google I/O Conference.
Other common forms use the names of the leading hardware and software companies of the 1990s, Intel and Microsoft, or their CEOs, Andy Grove and Bill Gates, for example "What Intel giveth, Microsoft taketh away" and Andy and |
https://en.wikipedia.org/wiki/Metabolic%20disorder | A metabolic disorder is a disorder that negatively alters the body's processing and distribution of macronutrients, such as proteins, fats, and carbohydrates. Metabolic disorders can happen when abnormal chemical reactions in the body alter the normal metabolic process. It can also be defined as inherited single gene anomaly, most of which are autosomal recessive.
Signs and symptoms
Some of the symptoms that can occur with metabolic disorders are lethargy, weight loss, jaundice and seizures. The symptoms expressed would vary with the type of metabolic disorder. There are four categories of symptoms: acute symptoms, late-onset acute symptoms, progressive general symptoms and permanent symptoms.
Causes
Inherited metabolic disorders are one cause of metabolic disorders, and occur when a defective gene causes an enzyme deficiency. These diseases, of which there are many subtypes, are known as inborn errors of metabolism. Metabolic diseases can also occur when the liver or pancreas do not function properly.
Types
The principal classes of metabolic disorders are:
Diagnosis
Metabolic disorders can be present at birth, and many can be identified by routine screening. If a metabolic disorder is not identified early, then it may be diagnosed later in life, when symptoms appear. Specific blood and DNA tests can be done to diagnose genetic metabolic disorders.
The gut microbiota, which is a population of microbes that live in the human digestive system, also has an important part in metabolism and generally has a positive function for its host. In terms of pathophysiological/mechanism interactions, an abnormal gut microbiota can play a role in metabolic disorder related obesity.
Screening
Metabolic disorder screening can be done in newborns via blood, skin, or hearing tests.
Management
Metabolic disorders can be treatable by nutrition management, especially if detected early. It is important for dieticians to have knowledge of the genotype to create a treatment that w |
https://en.wikipedia.org/wiki/Naturalisation%20%28biology%29 | Naturalisation (or naturalization) is the ecological phenomenon through which a species, taxon, or population of exotic (as opposed to native) origin integrates into a given ecosystem, becoming capable of reproducing and growing in it, and proceeds to disseminate spontaneously. In some instances, the presence of a species in a given ecosystem is so ancient that it cannot be presupposed whether it is native or introduced.
Generally, any introduced species may (in the wild) either go extinct or naturalise in its new environment.
Some populations do not sustain themselves reproductively, but exist because of continued influx from elsewhere. Such a non-sustaining population, or the individuals within it, are said to be adventive. Cultivated plants are a major source of adventive populations.
The above refers to naturalize as an intransitive verb, as in, "The species naturalized". In North America it is common to use naturalize as a transitive verb, as in, "City staff naturalized the park". This means to allow an environment to revert to its natural state.
Botany
In botany, naturalisation is the situation in which an exogenous plant reproduces and disperses on its own in a new environment. For example, northern white cedar is naturalised in the United Kingdom, where it reproduces on its own, while it is not in France, where human intervention via cuttings or seeds is essential for its dissemination.
Two categories of naturalisation are defined by two distinct parameters: the first, archaeonaturalised, refers to the time of introduction (introduced over a hundred years ago), while the second, amphinaturalised or eurynaturalised, implies a notion of spatial extension (a taxon assimilated to the indigenous flora and present over a vast area, as opposed to stenonaturalised).
Degrees of naturalisation
The degrees of naturalisation are defined in relation to the status of nativity or introduction of taxons or species;
Accidental taxon: non-native taxon growing spontaneously, |
https://en.wikipedia.org/wiki/Nucleotide%20exchange%20factor | Nucleotide exchange factors (NEFs) are proteins that stimulate the exchange (replacement) of nucleoside diphosphates for nucleoside triphosphates bound to other proteins.
Function
Many cellular proteins cleave (hydrolyze) nucleoside triphosphates–adenosine triphosphate (ATP) or guanosine triphosphate (GTP)–to their diphosphate forms (ADP and GDP) as a source of energy and to drive conformational changes. These changes in turn affect the structural, enzymatic, or signalling properties of the protein.
Nucleotide exchange factors actively assist in the exchange of depleted nucleoside diphosphates for fresh nucleoside triphosphates. NEFs are specific for the nucleotides they exchange (ADP or GDP, but not both) and are often specific to a single protein or class of proteins with which they interact.
See also
Nucleoside-diphosphate kinase
Guanine nucleotide exchange factor |
https://en.wikipedia.org/wiki/Broadcast%20encryption | Broadcast encryption is the cryptographic problem of delivering encrypted content (e.g. TV programs or data on DVDs) over a broadcast channel in such a way that only qualified users (e.g. subscribers who have paid their fees or DVD players conforming to a specification) can decrypt the content. The challenge arises from the requirement that the set of qualified users can change in each broadcast emission, and therefore revocation of individual users or user groups should be possible using broadcast transmissions, only, and without affecting any remaining users. As efficient revocation is the primary objective of broadcast encryption, solutions are also referred to as revocation schemes.
Rather than directly encrypting the content for qualified users, broadcast encryption schemes distribute keying information that allows qualified users to reconstruct the content encryption key, whereas revoked users find insufficient information to recover the key. The typical setting considered is that of a unidirectional broadcaster and stateless users (i.e., users who do not keep state about previous messages from the broadcaster), which is especially challenging. In contrast, the scenario where users have a bidirectional communication link with the broadcaster, and thus can more easily maintain their state, and where users are not only dynamically revoked but also added (joined), is often referred to as multicast encryption.
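The distribution of keying information can be illustrated with the simplest possible (naive) scheme, in which the broadcaster wraps a fresh session key under each qualified user's individual key and simply omits revoked users from the header. The sketch below uses a toy XOR "cipher" and invented user names, for illustration only; a real system would use proper authenticated encryption:

```python
import hashlib
import os

def toy_encrypt(key, data):
    # XOR against a SHA-256-derived pad; a toy cipher, NOT secure for real use.
    pad = hashlib.sha256(key).digest()
    return bytes(b ^ pad[i % len(pad)] for i, b in enumerate(data))

toy_decrypt = toy_encrypt  # XOR encryption is its own inverse

# Every user is provisioned with an individual long-term key.
user_keys = {name: os.urandom(16) for name in ("alice", "bob", "carol")}

def broadcast(content, qualified):
    # Fresh content-encryption key per emission; the header carries it
    # wrapped under each qualified user's key. Revoked users are left out.
    session_key = os.urandom(16)
    header = {u: toy_encrypt(user_keys[u], session_key) for u in qualified}
    body = toy_encrypt(session_key, content)
    return header, body

def receive(user, header, body):
    if user not in header:        # revoked: no usable keying information
        return None
    session_key = toy_decrypt(user_keys[user], header[user])
    return toy_decrypt(session_key, body)

header, body = broadcast(b"tonight's programme", qualified=("alice", "carol"))
assert receive("alice", header, body) == b"tonight's programme"
assert receive("bob", header, body) is None   # bob has been revoked
```

The header of this naive scheme grows linearly with the number of qualified users, which motivates the more sophisticated constructions that trade broadcast size against per-user key storage.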
The problem of practical broadcast encryption has first been formally studied by Amos Fiat and Moni Naor in 1994. Since then, several solutions have been described in the literature, including combinatorial constructions, one-time revocation schemes based on secret sharing techniques, and tree-based constructions. In general, they offer various trade-offs between the increase in the size of the broadcast, the number of keys that each user needs to store, and the feasibility of an unqualified user or a collusion of unqualified users being able to |
https://en.wikipedia.org/wiki/Globster | A globster or blob is an unidentified organic mass that washes up on the shoreline of an ocean or other body of water. A globster is distinguished from a normal beached carcass by being hard to identify, at least by initial untrained observers, and by creating controversy as to its identity.
History
The term "globster" was coined by Ivan T. Sanderson in 1962 to describe the Tasmanian carcass of 1960, which was said to have "no visible eyes, no defined head, and no apparent bone structure." Other sources simply use the term "blob".
Many globsters have initially been described as resembling gigantic octopuses, though they later turned out to be decayed carcasses of whales or large sharks. As with the "Chilean Blob" of 2003, many are masses of whale blubber released from decaying whale corpses. Others initially thought to be dead plesiosaurs later turned out to be the decayed carcasses of basking sharks. Others remain unexplained. Giant and colossal squid may also explain some globsters, particularly those tentatively identified as monster octopuses.
Some globsters were examined only after they had decomposed too much to be identified, and so seemed to represent evidence of a new species, or were destroyed—as happened with the "Cadborosaurus willsi" carcass, found in 1937. However, Canadian scientists did analyse the DNA of the Newfoundland Blob—which revealed that the tissue was from a sperm whale. In their resulting paper, the authors point out a number of superficial similarities between the Newfoundland Blob and other globsters, concluding that a similar origin for those globsters is likely. Analyses of other globsters have yielded similar results.
Notable globsters
The following is a chronological list of carcasses that have been described as globsters or blobs in the literature.
Stronsay Beast (1808) – Believed to be a basking shark carcass.
St. Augustine Monster (1896) – Identified as a whale carcass.
Trunko (1924) – Not identified.
Tasmanian Globster (1960) – Identified as a w |
https://en.wikipedia.org/wiki/Agenesis | In medicine, agenesis () refers to the failure of an organ to develop during embryonic growth and development due to the absence of primordial tissue. Many forms of agenesis are referred to by individual names, depending on the organ affected:
Agenesis of the corpus callosum - failure of the corpus callosum to develop
Renal agenesis - failure of one or both of the kidneys to develop
Amelia - failure of the arms or legs to develop
Penile agenesis - failure of penis to develop
Müllerian agenesis - failure of the uterus and part of the vagina to develop
Agenesis of the gallbladder - failure of the gallbladder to develop. A person may not realize they have this condition unless they undergo surgery or medical imaging, since the gallbladder is neither externally visible nor essential.
Eye agenesis
Eye agenesis is a medical condition in which people are born with no eyes.
Dental & oral agenesis
Anodontia, absence of all primary or permanent teeth.
Aglossia, absence of the tongue.
Agnathia, absence of the jaw.
Wisdom tooth agenesis - most adult humans have three molars on each side of the upper and lower jaws, the third of which is referred to as the wisdom tooth, giving four wisdom teeth in total. However, many people have fewer than four. Agenesis of wisdom teeth is a normal condition whose prevalence differs widely by population, ranging from practically zero in Tasmanian Aborigines to nearly 100% in indigenous Mexicans.
Ear agenesis
Ear agenesis is a medical condition in which people are born without ears.
Because the middle and inner ears are necessary for hearing, people with complete agenesis of the ears are totally deaf. Minor agenesis that affects only the visible parts of the outer ear, which may be called microtia, typically produces cosmetic concerns and perhaps hearing impairment if the opening to the ear canal is blocked, but not deafness. |
https://en.wikipedia.org/wiki/Court-bouillon | Court-bouillon or court bouillon (in Louisiana, pronounced coo-bee-yon) is a quickly-cooked broth used for poaching other foods, most commonly fish or seafood. It is also sometimes used for poaching vegetables, eggs, sweetbreads, cockscombs, and delicate meats. It includes seasonings and salt but lacks animal gelatin.
Description
Court bouillon loosely translates as 'briefly boiled liquid' or 'short broth' (French court means 'short'), because the cooking time is brief in comparison with a rich and complex stock, and the broth generally is not served as part of the finished dish. Because delicate foods do not cook for very long, it is prepared before the foods are added. Typically, cooking times do not exceed 60 minutes.
Although a court bouillon may become the base for a stock or fumet, in traditional terms it is differentiated by the inclusion of acidulating ingredients such as wine, vinegar, or lemon juice. In addition to contributing their own flavor, acids help to draw flavors from the vegetable aromatics during the short preparation time prior to use. Court bouillon also includes salt and lacks animal gelatin.
Types
Traditionally, court bouillon is water, salt, white wine, vegetable aromatics (mirepoix of carrot, onion, and celery), and flavored with bouquet garni and black pepper.
Court-bouillon need not be elaborate. Court bouillon used to prepare lobster may be as simple as water, salt, lemon juice, and perhaps thyme and bay leaf; that for poached eggs may be salt, water, and vinegar.
In Louisiana Creole and Cajun cuisines, court-bouillon — often spelled "courtbouillon" — refers to a thick, rich fish stew most often prepared with redfish and thickened with roux.
See also
Nage |
https://en.wikipedia.org/wiki/IBM%20Planning%20Analytics | IBM Planning Analytics powered by TM1 (formerly IBM Cognos TM1, formerly Applix TM1, formerly Sinper TM/1) is a business performance management software suite designed to implement collaborative planning, budgeting and forecasting solutions, interactive "what-if" analyses, as well as analytical and reporting applications.
The database server component of the software platform retains its historical name TM1. Data is stored in in-memory multidimensional OLAP cubes, generally at the "leaf" level, and consolidated on demand. In addition to data, cubes can include encoded rules which define any on-demand calculations. By design, computations (typically aggregation along dimensional hierarchies using weighted summation) on the data are performed in near real-time, without the need to precalculate, due to a highly performant database design and calculation engine. These properties also allow the data to be updated frequently and by multiple users.
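A toy illustration of on-demand consolidation by weighted summation (this is not TM1's engine or rule syntax; the dimension, element names, and weights below are invented): leaf cells are stored, and consolidated elements are computed over the hierarchy only when requested.

```python
# Hypothetical one-dimensional hierarchy; consolidated values are
# computed on demand rather than being precalculated and stored.
children = {
    "World":    [("Europe", 1.0), ("Americas", 1.0)],
    "Europe":   [("NL", 1.0), ("DE", 1.0)],
    "Americas": [("US", 1.0), ("Rebates", -1.0)],  # negative weight nets out
}
leaf_values = {"NL": 10.0, "DE": 20.0, "US": 50.0, "Rebates": 5.0}

def value(element):
    """Leaf cells are read directly; consolidated cells aggregate their
    children by weighted summation, recursively."""
    if element in leaf_values:
        return leaf_values[element]
    return sum(w * value(child) for child, w in children[element])

print(value("World"))  # 75.0: (10 + 20) + (50 - 5)
```

Because nothing above the leaf level is stored, an update to any leaf is immediately reflected in every consolidation that reads it, which is the property that lets many users update the data frequently.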
TM1 is an example of a class of software products which implement the principles of the functional database model. The IBM Planning Analytics platform, in addition to the TM1 database server, includes an ETL tool, server management and monitoring tools and a number of user front ends which provide capabilities designed for common business planning and budgeting requirements, including workflow, adjustments, commentary, etc.
The vendor currently offers the software both as a standalone on-premises product and in the SaaS model on the cloud.
History
While working at Exxon, Lilly Whaley suggested developing a planning system using the IBM mainframe time sharing option (TSO) to replace the previous IMS based planning system and thereby significantly reduce running costs. Manuel "Manny" Perez, who had been in IT for most of his career, took it upon himself to develop a prototype. Right away he realized that in order to provide the multidimensionality and interactivity necessary it would be necessary to keep the data structures i |
https://en.wikipedia.org/wiki/Great%20Canadian%20flag%20debate | The Great Canadian flag debate (or Great Flag Debate) was a national debate that took place in 1963 and 1964 when a new design for the national flag of Canada was chosen.
Although the flag debate had been going on for a long time prior, it officially began on June 15, 1964, when Prime Minister Lester B. Pearson proposed his plans for a new flag in the House of Commons. The debate lasted more than six months, bitterly dividing the people in the process. The debate over the proposed new Canadian flag was ended by closure on December 15, 1964. It resulted in the adoption of the "Maple Leaf" as the Canadian national flag.
The flag was inaugurated on February 15, 1965, a date that has been commemorated as National Flag of Canada Day since 1996.
Background
Union Jack and Red Ensign
The Union Jack served as the formal flag for various colonies in British North America, and remained as the formal national flag of Canada from Confederation to 1965. However, from the late-19th century to 1965, the civil ensign for Canada, the Canadian Red Ensign, was also used as an unofficial national flag and symbol for Canada.
The first Canadian Red Ensigns were used in Prime Minister Sir John A. Macdonald's time. The Governor General at the time of Macdonald's death, Lord Stanley, wrote to London in 1891:
... the Dominion Government has encouraged by precept and example the use on all public buildings throughout the provinces of the Red Ensign with the Canadian badge on the fly... [which] has come to be considered as the recognized flag of the Dominion, both ashore and afloat.
Under pressure from pro-imperial public opinion, Prime Minister Sir Wilfrid Laurier raised the Union Jack over Parliament, where it remained until the re-emergence of the Red Ensign in the 1920s.
William Lyon Mackenzie King tried to adopt a new Canadian national flag in 1925 and 1946, having received a recommendation that came back as a Red Ensign design that replaced the coat of arms of Canada with a gold |
https://en.wikipedia.org/wiki/History%20of%20thermodynamics | The history of thermodynamics is a fundamental strand in the history of physics, the history of chemistry, and the history of science in general. Owing to the relevance of thermodynamics to much of science and technology, its history is finely woven with the developments of classical mechanics, quantum mechanics, magnetism, and chemical kinetics; with more distant applied fields such as meteorology, information theory, and biology (physiology); and with technological developments such as the steam engine, internal combustion engine, cryogenics and electricity generation. The development of thermodynamics both drove and was driven by atomic theory. It also, albeit in a subtle manner, motivated new directions in probability and statistics; see, for example, the timeline of thermodynamics.
Antiquity
The ancients viewed heat as something related to fire. In 3000 BC, the ancient Egyptians viewed heat as related to origin mythologies. Ancient Indian philosophy, including Vedic philosophy, held that five classical elements (or pancha mahā bhūta) are the basis of all cosmic creations. In the Western philosophical tradition, after much debate about the primal element among earlier pre-Socratic philosophers, Empedocles proposed a four-element theory, in which all substances derive from earth, water, air, and fire. The Empedoclean element of fire is perhaps the principal ancestor of later concepts such as phlogiston and caloric. Around 500 BC, the Greek philosopher Heraclitus became famous as the "flux and fire" philosopher for his proverbial utterance: "All things are flowing." Heraclitus argued that the three principal elements in nature were fire, earth, and water.
Vacuum-abhorrence
The 5th century BC Greek philosopher Parmenides, in his only known work, a poem conventionally titled On Nature, uses verbal reasoning to postulate that a void, essentially what is now known as a vacuum, in nature could not occur. This view was supported by the arguments of Aristotle, but was |
https://en.wikipedia.org/wiki/List%20of%20U.S.%20state%20dogs | Thirteen states of the United States have designated an official state dog breed. Maryland was the first state to name a dog breed as a state symbol, naming the Chesapeake Bay Retriever in 1964. Pennsylvania followed the year after, naming the Great Dane as its official breed. Dog breeds are mostly affiliated with the states that they originated in. North Carolina chose the Plott Hound as it was the only dog breed indigenous to the state.
Other official state dogs also are indigenous to their state, including the Boston Terrier (Massachusetts) and the Alaskan Malamute (Alaska). Pennsylvania selected the Great Dane not because of its origin, but because it was introduced by early settlers in the state to be used as a hunting and working dog; it was chosen over the Beagle, which was also proposed around the same time.
Two of the more recent successful campaigns to name a state dog were started by schoolchildren. In 2007, Alaskan kindergarten student Paige Hill sparked the campaign for the Alaskan Malamute, which convinced Representative Berta Gardner to support a bill in 2009; it became law in 2010. Elementary school students from Bedford, New Hampshire won their campaign for the Chinook to be accepted as a symbol of their state in 2010.
There have been a variety of campaigns in other states to select a state dog. Georgia was undecided about choosing a state dog in 1991, with an attempt to make the Golden Retriever the official dog failing after a vote in the Georgia State Senate; an opposing campaign promoted the Bulldog, the mascot of the University of Georgia. The campaign to make the Siberian Husky the Washington state dog failed in the Washington House of Representatives in 2004. In January 2019, Minnesota partnered with charity Pawsitivity Service Dogs to introduce a bill to make the Labrador Retriever the State Dog.
In 2006, New York State Assembly member Vincent Ignizio suggested that New York should adopt a dog as a state symbol, |
https://en.wikipedia.org/wiki/Genealogical%20DNA%20test | A genealogical DNA test is a DNA-based genetic test used in genetic genealogy that looks at specific locations of a person's genome in order to find or verify ancestral genealogical relationships, or (with lower reliability) to estimate the ethnic mixture of an individual. Since different testing companies use different ethnic reference groups and different matching algorithms, ethnicity estimates for an individual vary between tests, sometimes dramatically.
Three principal types of genealogical DNA tests are available, with each looking at a different part of the genome and being useful for different types of genealogical research: autosomal (atDNA), mitochondrial (mtDNA), and Y-chromosome (Y-DNA).
Autosomal tests may result in a large number of DNA matches to both males and females who have also tested with the same company. Each match will typically show an estimated degree of relatedness, i.e., a close family match, 1st-2nd cousins, 3rd-4th cousins, etc. The furthest degree of relationship is usually the "6th-cousin or further" level. However, due to the random nature of which, and how much, DNA is inherited by each tested person from their common ancestors, precise relationship conclusions can only be made for close relations. Traditional genealogical research, and the sharing of family trees, is typically required for interpretation of the results. Autosomal tests are also used in estimating ethnic mix.
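The widening uncertainty with relationship distance follows from the textbook expected values of autosomal sharing (a simplification: testing companies actually match on shared segment lengths in centimorgans, and realized sharing varies randomly around these means, which is exactly why distant estimates blur):

```python
# Expected fraction of autosomal DNA shared, ignoring random variation.
# Each meiosis on the path through the common ancestor(s) halves the
# expected sharing; "ancestors" is 2 for full relationships, 1 for half.

def expected_shared(meioses, ancestors=2):
    return ancestors * 0.5 ** meioses

print(expected_shared(1, ancestors=1))  # parent-child: 0.5
print(expected_shared(4))               # 1st cousins: 0.125
print(expected_shared(6))               # 2nd cousins: 0.03125
print(expected_shared(14))              # 6th cousins: ~0.00012
```

At the "6th-cousin or further" level the expectation is barely a hundredth of a percent of the genome, and the random spread around it is large relative to the mean, so precise relationship conclusions are only possible for close relations.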
MtDNA and Y-DNA tests are much more objective. However, they give considerably fewer DNA matches, if any (depending on the company doing the testing), since they are limited to relationships along a strict female line and a strict male line respectively. MtDNA and Y-DNA tests are utilized to identify archeological cultures and migration paths of a person's ancestors along a strict mother's line or a strict father's line. Based on MtDNA and Y-DNA, a person's haplogroup(s) can be identified. The mtDNA test can be taken by both males and females, because everyo |
https://en.wikipedia.org/wiki/Pyrena | A pyrena or pyrene (commonly called a "pit" or "stone") is the fruitstone within a drupe or drupelet produced by the ossification of the endocarp or lining of the fruit. It consists of a hard endocarp tissue surrounding one or more seeds (also called the "kernel"). The hardened endocarp which constitutes the pyrene provides a protective physical barrier around the seed, shielding it from pathogens and herbivory.
While many drupes are monopyrenous, containing only one pyrene, pome-type fruit with a hard, stony (rather than leathery) endocarp are typically polypyrenous drupes, containing multiple pyrenes.
Development
The hardening of the endocarp of a developing drupe occurs via secondary cell wall formation and lignification. The biopolymer lignin, also found in wood, provides a structure within secondary cell walls which supports the polymerisation of cellulose and hemicellulose; together these polymers provide the endocarp with tensile strength and stiffness. Further hardening occurs during the biomineralisation of the endocarp. The biomineralisation of pyrenes during the life of the plant can aid the preservation of fruit remains in archaeological findings.
Gallery
See also
Nut (fruit) |
https://en.wikipedia.org/wiki/Advanced%20Telecommunications%20Computing%20Architecture | Advanced Telecommunications Computing Architecture (ATCA or AdvancedTCA) is the largest specification effort in the history of the PCI Industrial Computer Manufacturers Group (PICMG), with more than 100 companies participating. Known as AdvancedTCA, the official specification designation PICMG 3.x (see below) was ratified by the PICMG organization in December 2002. AdvancedTCA is targeted primarily to requirements for "carrier grade" communications equipment, but has recently expanded its reach into more ruggedized applications geared toward the military/aerospace industries as well. This series of specifications incorporates the latest trends in high speed interconnect technologies, next-generation processors, and improved Reliability, Availability and Serviceability (RAS).
Mechanical specifications
An AdvancedTCA board (blade) is 280 mm deep and 322 mm high. The boards have a metal front panel and a metal cover on the bottom of the printed circuit board to limit electromagnetic interference and to limit the spread of fire. The locking injector-ejector handle (lever) actuates a microswitch to let the Intelligent Platform Management Controller (IPMC) know that an operator wants to remove a board, or that the board has just been installed, thus activating the hot-swap procedure. AdvancedTCA boards support the use of PCI Mezzanine Card (PMC) or Advanced Mezzanine Card (AMC) expansion mezzanines.
The shelf supports RTMs (Rear Transition Modules). RTMs plug into the back of the shelf in slot locations that match the front boards. The RTM and the front board are interconnected through a Zone-3 connector. The Zone-3 connector is not defined by the AdvancedTCA specification.
Each shelf slot is 30.48 mm wide. This allows a 14-board chassis to be installed in a 19-inch rack-mountable system and a 16-board chassis in an ETSI rack-mountable system. A typical 14-slot system is 12 or 13 rack units high. The large AdvancedTCA shelves are targeted to the telecommunication market s
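The slot arithmetic behind those chassis sizes can be checked directly (a back-of-the-envelope check; 482.6 mm is the nominal overall width of a 19-inch rack, not its usable opening, and 600 mm is the nominal ETSI rack width):

```python
IN_TO_MM = 25.4
SLOT_MM = 30.48          # AdvancedTCA slot pitch (exactly 1.2 inches)

print(14 * SLOT_MM)      # 426.72 mm of slots for a 19-inch rack shelf
print(19 * IN_TO_MM)     # 482.6 mm nominal 19-inch rack width
print(16 * SLOT_MM)      # 487.68 mm of slots for a 600 mm ETSI rack shelf
```

Fourteen slots consume 426.72 mm, leaving room for the shelf frame inside a 19-inch rack, while sixteen slots at 487.68 mm only fit the wider ETSI format.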
https://en.wikipedia.org/wiki/Gable%20stone | Gable stones (Dutch gevelstenen) are carved and often colourfully painted stone tablets, which are set into the walls of buildings, usually at about 4 metres from the ground. They serve both to identify and embellish the building. They are also called "stone tablets" by the Rijksmuseum, which sometimes appends "from a facade". A "wall stone" is another suggested translation from the Dutch term.
The content of gable stones may explain something about the house's owner and are a feature of the urban fabric of Amsterdam. Some 2,500 of these stones can still be found in the Netherlands, of which around 850 are in Amsterdam and 250 in Maastricht, while others are also found in cities such as Brussels, Liège, Lille, Oslo, Bergen, Munich, Copenhagen, Bucharest, Zurich, Stockholm and Warsaw.
History
Gable stones came into use in the 16th century, in the days before house numbers, taking over from hanging signs as a way of simultaneously and memorably identifying and adorning a house.
The tradition is alive and has moved with the times – new stones are still commissioned, and for instance the Rabobank at Frederiksplein 54 in Amsterdam wistfully commemorates the introduction of the euro with a stone entitled De eerste en de laatste gulden (The first and the last guilder), created by Zutphen sculptor Hans 't Mannetje.
In Amsterdam, many gable stones have been conserved by the Vereniging Vrienden van Amsterdamse Gevelstenen (VVAG) or Friends of Amsterdam Gable Stones.
Features
They normally combine a picture with an inscription, or sometimes just a date. Some illustrate the name or profession of the owner, for instance a quill pen as a badge for an author, or a ship for a sailor. Some are named after notable people (The King of Bohemia) or faraway trading destinations (Königsberg). Some stones act as talismans, quoting from holy scripture. A pious motto repeatedly found on Dutch gable stones is Nooit Volmaakt (Never Perfect), a testimony to the householder's belief that on |
https://en.wikipedia.org/wiki/Arame | Arame, or sea oak, is a species of kelp, of the brown algae, best known for its use in Japanese cuisine.
Description
Eisenia bicyclis is indigenous to temperate Pacific Ocean waters centered near Japan, although it is deliberately cultured elsewhere, including South Korea. It grows and reproduces seasonally. Two flattened oval fronds rise from a stiff woody stipe which can be up to about tall. The fronds are shed and new ones formed annually. The plant appears both branched and feathered. It may be harvested by divers manually or mechanically, and the dried form is available year-round.
Cuisine
It is one of many species of seaweed used in Asian cuisine.
Usually purchased in a dried state, it is reconstituted quickly, taking about five minutes. Arame comes in dark brown strands, has a mild, semi-sweet flavor, and a firm texture. It is added to appetizers, casseroles, muffins, pilafs, soups, toasted dishes, and many other types of food. Its mild flavor makes it adaptable to many uses.
Chemistry
Arame is high in calcium, iodine, iron, magnesium, and vitamin A as well as being a dietary source of many other minerals. It also is harvested for alginate, fertilizer and iodide. It contains the storage polysaccharide laminarin and the tripeptide eisenin, a peptide with immunological activity.
Lignan content in arame is noted by several sources. It also contains the phlorotannins phlorofucofuroeckol A, dioxinodehydroeckol, fucofuroeckol A, eckol, dieckol, triphloroethol A and 7-phloroethol. Extracts of this algae have been tested to combat MRSA staph infections.
See also
Edible seaweed
Seafood allergy |
https://en.wikipedia.org/wiki/Corrugator%20supercilii%20muscle | The corrugator supercilii muscle is a small, narrow, pyramidal muscle of the face. It arises from the medial end of the superciliary arch; it inserts into the deep surface of the skin of the eyebrow.
It draws the eyebrow downward and medially, producing the vertical "frowning" wrinkles of the forehead. It may be thought of as the principal muscle in the facial expression of suffering. It also shields the eyes from strong sunlight.
Structure
The corrugator supercilii muscle is located at the medial end of the eyebrow. Its fibers pass laterally and somewhat superiorly from its origin to its insertion.
Origin
It arises from bone at the medial extremity of the superciliary arch.
Insertion
It inserts between the palpebral and orbital portions of the orbicularis oculi muscle. It inserts into the deep surface of the skin of the eyebrow, above the middle of the orbital arch.
Innervation
Motor innervation is provided by the temporal branches of facial nerve (CN VII).
Vasculature
The muscle receives arterial supply from adjacent arteries, mostly the superficial temporal artery and the ophthalmic artery.
Relations
It is situated deep to the frontalis muscle (of the occipitofrontalis muscle) and the orbicularis oculi muscle. Its fibres are situated between the palpebral and orbital portions of the orbicularis oculi muscle.
The supratrochlear nerve passes between this muscle and the frontalis muscle.
Function
The muscle acts in tandem with the orbicularis oculi muscle. The corrugator supercilii muscle acts upon the skin of the forehead superior to the middle of the supraorbital margin, drawing the eyebrow inferomedially to produce vertical wrinkles of the forehead just superior to the nose. It is the "frowning" muscle, and may be regarded as the principal muscle in the expression of suffering. It also contracts to prevent high sun glare, pulling the eyebrows toward the bridge of the nose, making a roof over the area above the middle corner of the eye and typical |
https://en.wikipedia.org/wiki/Depressor%20labii%20inferioris%20muscle | The depressor labii inferioris (or quadratus labii inferioris) is a facial muscle. It helps to lower the bottom lip.
Structure
The depressor labii inferioris muscle arises from the lateral surface of the mandible. This is below the mental foramen, and the origin may be around 3 cm wide. It inserts on the skin of the lower lip, blending with the orbicularis oris muscle; its insertion is around 2 cm wide. At its origin, the depressor labii inferioris is continuous with the fibers of the platysma muscle. Some yellow fat is intermingled with its fibers.
Nerve supply
The depressor labii inferioris muscle is supplied by the marginal mandibular branch of the facial nerve.
Function
The depressor labii inferioris muscle helps to depress and evert the lower lip. It is the most important of the muscles of the lower lip for this function. It is an antagonist of the orbicularis oris muscle. It is needed to expose the mandibular (lower) teeth during smiling.
Clinical significance
Resection
The depressor labii inferioris muscle may be resected (cut and removed) using surgery to correct an asymmetry of the lower lip when smiling. This asymmetry can be caused by paralysis of the marginal mandibular branch of the facial nerve on one side, so the healthy side may be cut to create symmetry. Local anaesthesia may be used, such as by blocking the mental nerve. This operation tends to be successful.
History
The depressor labii inferioris muscle has also (mainly historically) been called the quadratus labii inferioris muscle.
See also
Facial muscles
Depressor anguli oris muscle
Additional images |
https://en.wikipedia.org/wiki/Geniohyoid%20muscle | The geniohyoid muscle is a narrow paired muscle situated superior to the medial border of the mylohyoid muscle. It is named for its passage from the chin ("genio-" is a standard prefix for "chin") to the hyoid bone.
Structure
The geniohyoid is a paired short muscle that arises from the inferior mental spine, on the back of the mandibular symphysis, and runs backward and slightly downward, to be inserted into the anterior surface of the body of the hyoid bone. It lies in contact with its fellow of the opposite side. It thus belongs to the suprahyoid muscles. The muscle receives its blood supply from branches of the lingual artery.
Innervation
The geniohyoid muscle is innervated by fibres from the first cervical spinal nerve travelling alongside the hypoglossal nerve. Although the first three cervical nerves give rise to the ansa cervicalis, the geniohyoid muscle is said to be innervated by the first cervical nerve, as some of its efferent fibers do not contribute to ansa cervicalis.
Variations
It may be blended with the muscle of the opposite side, or doubled; slips to the greater cornu of the hyoid bone and to the genioglossus occur.
Function
The geniohyoid muscle brings the hyoid bone forward and upwards. This dilates the upper airway, assisting respiration. During the first act of deglutition, when the mass of food is being driven from the mouth into the pharynx, the hyoid bone, and with it the tongue, is carried upward and forward by the anterior bellies of the Digastrici, the Mylohyoidei, and Geniohyoidei. It also assists in depressing the mandible.
History
The inclined position of the geniohyoid muscle has been contrasted with the horizontal position found in Neanderthals.
Additional images
See also |
https://en.wikipedia.org/wiki/Mylohyoid%20muscle | The mylohyoid muscle or diaphragma oris is a paired muscle of the neck. It runs from the mandible to the hyoid bone, forming the floor of the oral cavity of the mouth. It is named after its two attachments near the molar teeth. It forms the floor of the submental triangle. It elevates the hyoid bone and the tongue, important during swallowing and speaking.
Structure
The mylohyoid muscle is flat and triangular, and is situated immediately superior to the anterior belly of the digastric muscle. It is a pharyngeal muscle (derived from the first pharyngeal arch) and classified as one of the suprahyoid muscles. Together, the paired mylohyoid muscles form a muscular floor for the oral cavity of the mouth.
The two mylohyoid muscles arise from the mandible at the mylohyoid line, which extends from the mandibular symphysis in front to the last molar tooth behind. The posterior fibers pass inferomedially and insert at the anterior surface of the hyoid bone. The medial fibres of the two mylohyoid muscles unite in a midline raphe (where the two muscles intermesh).
The mylohyoid muscle separates the sublingual space from the submandibular space, which communicate via a lateral gap between the mylohyoid and hyoglossus muscles at the posterior free margin of mylohyoid muscle. The submandibular gland wraps around the edges of the mylohyoid, and is divided into superficial and deep lobes above and below the muscle.
Nerve supply
The mylohyoid muscle is supplied by the mylohyoid nerve, a branch of the inferior alveolar nerve, which is in turn a branch of the mandibular nerve.
Development
The mylohyoid muscles are derived from embryonic mesoderm, specifically the first pharyngeal arch.
Variations
The mylohyoid muscle may be united to or replaced by the anterior belly of the digastric muscle; accessory slips to other hyoid muscles are frequent. This median raphé is sometimes absent; the fibers o |
https://en.wikipedia.org/wiki/Stylohyoid%20muscle | The stylohyoid muscle is one of the suprahyoid muscles. It originates from the styloid process of the temporal bone and inserts onto the hyoid bone. It is innervated by a branch of the facial nerve. It acts to draw the hyoid bone upwards and backwards.
Structure
The stylohyoid is a slender muscle. It is directed inferoanteriorly from its origin towards its insertion.
It is perforated near its insertion by the intermediate tendon of the digastric muscle.
Origin
The muscle arises by a small tendon from the posterior surface of the temporal styloid process, near the base of the process.
Insertion
The muscle inserts onto the body of hyoid bone at the junction of the body and greater cornu.
The site of insertion is situated immediately superior to that of the superior belly of omohyoid muscle.
Vasculature
The stylohyoid muscle receives arterial supply from branches of the facial artery, posterior auricular artery, and occipital artery.
Innervation
The stylohyoid muscle receives motor innervation from the stylohyoid branch of facial nerve (CN VII).
Relations
The muscle is situated anterosuperior to the posterior belly of the digastric muscle.
Variation
It may be absent or doubled. It may be situated medial to the carotid artery. It may insert onto other suprahyoid muscles or onto infrahyoid muscles.
Actions/movements
The stylohyoid muscle elevates and retracts the hyoid bone (i.e. draws it superiorly and posteriorly).
Function
The stylohyoid muscle elongates the floor of the mouth. It initiates swallowing.
Additional images
See also
Stylohyoid ligament |
https://en.wikipedia.org/wiki/Garage%20kit | A garage kit (ガレージキット) or resin kit is an assembly scale model kit most commonly cast in polyurethane resin.
They are often model figures portraying humans or other living creatures. In Japan, kits often depict anime characters, and in the United States, depictions of movie monsters are common. However, kits can be produced depicting a wide range of subjects, from characters in horror, science fiction, fantasy films, television and comic books to nudes, pin-up girls and original works of art, as well as upgrade and conversion kits for existing models and airsoft guns.
Originally garage kits were amateur-produced, and the term originated with dedicated hobbyists using their garages as workshops. Unable to find model kits of subjects they wanted on the market, they began producing kits of their own. As the market expanded, professional companies began making similar kits. Sometimes a distinction is made between true garage kits, made by amateurs, and resin kits, manufactured professionally by companies.
Because of the labor-intensive casting process, garage kits are usually produced in limited numbers and are more expensive than injection-molded plastic kits. The parts are glued together using cyanoacrylate (Super Glue) or an epoxy cement and the completed figure is painted. Some figures are sold completed, but most commonly they are sold in parts for the buyer to assemble and finish.
Japan
Japanese garage kits are often anime figures depicting popular characters. Another major subject is "Kaiju" monsters such as Godzilla, and they may also include subjects such as mecha and science fiction spaceships. Garage kits can be as simple as a one piece figure, or as complex as kits with well over one hundred parts. Most commonly they are cast in polyurethane resin, but may also be fabricated of diverse substances such as soft vinyl, white metal (a type of lead alloy) and fabric.
Originally the kits were sold and traded between hobbyists at conventions like Wonder Festiv |
https://en.wikipedia.org/wiki/Levatores%20costarum%20muscles | The Levatores costarum, twelve in number on either side, are small tendinous and fleshy bundles which arise from the ends of the transverse processes of the seventh cervical and upper eleven thoracic vertebrae.
They pass obliquely downward and laterally, like the fibers of the Intercostales externi, and each is inserted into the outer surface of the rib immediately below the vertebra from which it takes origin, between the tubercle and the angle (Levatores costarum breves).
Each of the four lower muscles divides into two fasciculi, one of which is inserted as above described; the other passes down to the second rib below its origin (Levatores costarum longi).
They have a role in forceful inspiration.
See also
Iliocostalis
Interspinales muscles
Intertransversarii muscle
Longissimus
Spinalis |
https://en.wikipedia.org/wiki/Interspinales%20muscles | The interspinales are short muscle fascicles, found in pairs between the spinous processes of the contiguous vertebrae, one on either side of the interspinal ligament.
In the cervical region the cervical interspinales are most distinct, and consist of six pairs, the first being situated between the axis and third vertebra, and the last between the seventh cervical and the first thoracic. They are small narrow bundles, attached, above and below, to the apices of the spinous processes.
In the thoracic region the thoracic interspinales are found between the first and second vertebrae, and sometimes between the second and third, and between the eleventh and twelfth.
In the lumbar region there are four pairs of lumbar interspinales in the intervals between the five lumbar vertebrae. There is also occasionally one between the last thoracic and first lumbar, and one between the fifth lumbar and the sacrum.
See also
Intertransversarii
Iliocostalis
Longissimus
Spinalis
Levatores costarum |
https://en.wikipedia.org/wiki/Intertransversarii | The intertransversarii are small muscles placed between the transverse processes of the vertebrae.
Structure
Cervical
In the cervical region they are best developed, consisting of rounded muscular and tendinous fasciculi, and are placed in pairs, passing between the anterior and the posterior tubercles respectively of the transverse processes of two contiguous vertebrae, and separated from one another by an anterior primary division of the cervical nerve, which lies in the groove between them.
The muscles connecting the anterior tubercles are termed the anterior intertransversarii.
Those between the posterior tubercles are termed the posterior intertransversarii.
Both sets are supplied by the anterior rami of the spinal nerves.
There are seven pairs of these muscles, the first pair being between the atlas and axis, and the last pair between the seventh cervical and first thoracic vertebræ.
Thoracic
In the thoracic region they are present between the transverse processes of the lower three thoracic vertebrae, and between the transverse processes of the last thoracic and the first lumbar. These are called the thoracic intertransversarii and are supplied by the posterior rami of the spinal nerves.
Lumbar
In the lumbar region they are arranged in pairs, on either side of the vertebral column:
one set, the lateral lumbar intertransversarii, occupies the entire interspace between the transverse processes of the lumbar vertebrae;
the other set, the medial lumbar intertransversarii, passes from the accessory process of one vertebra to the mammillary process of the vertebra below.
The intertransversarii laterales are supplied by the anterior rami, and the intertransversarii mediales by the posterior rami of the spinal nerves.
Function
They contribute little to no movement on their own, but they stabilize adjoining vertebrae allowing more effective action from other muscle groups.
See also
Iliocostalis
Interspinales muscles
Levatores costarum muscles
Longissim |
https://en.wikipedia.org/wiki/Dissociated%20press | Dissociated press is a parody generator (a computer program that generates nonsensical text). The generated text is based on another text using the Markov chain technique. The name is a play on "Associated Press" and the psychological term dissociation (although word salad is more typical of conditions like aphasia and schizophrenia – which is, however, frequently confused with dissociative identity disorder by laypeople).
An implementation of the algorithm is available in Emacs. Another implementation is available as a Perl module in CPAN, Games::Dissociate.
The algorithm
The algorithm starts by printing a number of consecutive words (or letters) from the source text. Then it searches the source text for an occurrence of the few last words or letters printed out so far. If multiple occurrences are found, it picks a random one, and proceeds with printing the text following the chosen occurrence. After a predetermined length of text is printed out, the search procedure is repeated for the newly printed ending.
Considering that words and phrases tend to appear in specific grammatical contexts, the resulting text usually seems correct grammatically, and if the source text is uniform in style, the result appears to be of similar style and subject, and takes some effort on the reader's side to recognize as not genuine. Still, the randomness of the assembly process deprives it of any logical flow - the loosely related parts are connected in a nonsensical way, creating a humorously abstract, random result.
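The search-and-continue procedure described above can be sketched in a few lines of Python. This is an illustrative reimplementation, not the Emacs or CPAN code: the function name and parameters are invented for this sketch, and it simplifies the algorithm by continuing one word per search, where the real program copies a longer run of text before searching again.

```python
import random

def dissociated_press(source: str, ngram: int = 2, length: int = 30) -> str:
    """Generate nonsense text from `source` with a word-level
    Dissociated Press (Markov chain) procedure."""
    words = source.split()
    out = words[:ngram]                       # start with the opening words
    while len(out) < length:
        tail = out[-ngram:]
        # find every place the current tail occurs in the source text
        matches = [i for i in range(len(words) - ngram)
                   if words[i:i + ngram] == tail]
        if not matches:
            break                             # tail never recurs: stop early
        i = random.choice(matches)            # pick a random occurrence...
        out.append(words[i + ngram])          # ...and continue from there
    return " ".join(out)
```

Because the continuation is always drawn from a real occurrence of the tail, every local window of the output is a phrase that actually appears in the source, which is why the result tends to look locally grammatical while lacking any global sense.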
Examples
Here is a short example of word-based Dissociated Press applied to the Jargon File:
wart: n. A small, crocky feature that sticks out of an array (C has no checks for this). This is relatively benign and easy to spot if the phrase is bent so as to be not worth paying attention to the medium in question.
Here is a short example of letter-based Dissociated Press applied to the same source:
window sysIWYG: n. A bit was named aften /bee´t@/ prefer to use the oth |
https://en.wikipedia.org/wiki/Stepper | A stepper is a device used in the manufacture of integrated circuits (ICs) that is similar in operation to a slide projector or a photographic enlarger. Stepper is short for step-and-repeat camera. Steppers are an essential part of the complex process, called photolithography, which creates millions of microscopic circuit elements on the surface of silicon wafers out of which chips are made. These chips form the heart of ICs such as computer processors, memory chips, and many other devices.
The stepper emerged in the late 1970s but did not become widespread until the 1980s. This was because it was replacing an earlier technology, the mask aligner. Aligners imaged the entire surface of a wafer at the same time, producing many chips in a single operation. In contrast, the stepper imaged only one chip at a time, and was thus much slower to operate. The stepper eventually displaced the aligner when the relentless forces of Moore's Law demanded that smaller feature sizes be used. Because the stepper imaged only one chip at a time it offered higher resolution and was the first technology to exceed the 1 micron limit. The addition of auto-alignment systems reduced the setup time needed to image multiple ICs, and by the late 1980s, the stepper had almost entirely replaced the aligner in the high-end market.
The stepper was itself replaced by the step-and-scan systems (scanners) which offered an additional order of magnitude resolution advance, and work by scanning only a small portion of the mask for an individual IC, and thus require much longer operation times than the original steppers. These became widespread during the 1990s and essentially universal by the 2000s. Today, step-and-scan systems are so widespread that they are often simply referred to as steppers.
History
1957: Attempts to miniaturize electronic circuits started back in 1957 when Jay Lathrop and James Nall of the U.S. Army's Diamond Ordnance Fuse Laboratories were granted a US2890395A patent for a ph |
https://en.wikipedia.org/wiki/Conditioned%20taste%20aversion | Conditioned taste aversion occurs when an animal acquires an aversion to the taste of a food that was paired with aversive stimuli. The Garcia effect explains that the aversion develops more strongly for stimuli that cause nausea than other stimuli. This is considered an adaptive trait or survival mechanism that enables the organism to avoid poisonous substances (e.g., poisonous berries) before they cause harm. The aversion reduces consuming the same substance (or something that tastes similar) in the future, thus avoiding poisoning.
Studies on conditioned taste aversion that involved irradiating rats were conducted in the 1950s by Dr. John Garcia, leading to it sometimes being called the Garcia effect.
Conditioned taste aversion can occur when sickness is merely coincidental to, and not caused by, the substance consumed. For example, a person who becomes very sick after consuming vodka-and-orange-juice cocktails may then become averse to the taste of orange juice, even though the sickness was caused by the over-consumption of alcohol. Under these circumstances, conditioned taste aversion is sometimes known as the "Sauce-Bearnaise Syndrome", a term coined by Seligman and Hager.
Garcia's study
While studying the effects of radiation on various behaviors in the mid to late 1950s, Dr. Garcia noticed that rats developed an aversion to substances consumed prior to being irradiated. To examine this, Garcia put together a study in which three groups of rats were given sweetened water followed by either no radiation, mild radiation, or strong radiation. When rats were subsequently given a choice between sweetened water and regular tap water, rats who had been exposed to radiation drank much less sweetened water than those who had not.
This finding was surprising in that the aversion could occur after just a single trial and with a long delay between the stimuli. Most research at the time found that learning required multiple trials and shorter latencies. Many scienti |
https://en.wikipedia.org/wiki/Vic%20Rattlehead | Vic Rattlehead is the illustrated mascot of the American thrash metal band Megadeth. Vic is a skeletal figure wearing a suit who embodies the phrase "See no evil, hear no evil, speak no evil" as well as a symbol of censorship. His eyes are covered by a riveted-on visor, his mouth is clamped shut, and his ears are closed with metal caps.
Concept and creation
The mythic creation of Vic Rattlehead is addressed in the song "The Skull Beneath the Skin" from the album Killing Is My Business... and Business Is Good!
Dave Mustaine sketched the original drawing of Vic for the album's front cover. However, Combat Records lost the artwork and improvised a completely different concept. The original artwork was recovered and placed on the reissue of Killing Is My Business... and Business Is Good!. The name of Vic stands for "victim" and Rattlehead comes from what Mustaine's mother used to say to him when he was headbanging: "Don't do that or you'll rattle something loose up there!" This then led to the expression "to rattle one's head" meaning head-bang. According to Mustaine, the mascot represents his feelings about religious repression and freedom of expression.
Appearances
Vic was on the cover art of the band's first four albums (1985–1990): Killing Is My Business... And Business Is Good!, Peace Sells... But Who's Buying?, So Far, So Good... So What!, and Rust in Peace.
Vic did not appear on the front cover of any albums or compilations from 1991 to 2000. However, when Megadeth tried to bring back a more "classic" vibe to their material, he returned for the 2001 album The World Needs a Hero, the 2004 album The System Has Failed, as well as the 2007 studio album United Abominations in a human form. He is identified by his visor, metal caps on his ears, and clamps on his mouth. Only Vic's face is shown in the mushroom cloud on the 2005 compilation Greatest Hits: Back to the Start. Vic was once again absent from the cover of Endgame, instead found in the album's booklet much |
https://en.wikipedia.org/wiki/Clay%20Shirky | Clay Shirky (born 1964) is an American writer, consultant and teacher on the social and economic effects of Internet technologies and journalism.
In 2017 he was appointed Vice Provost of Educational Technologies of New York University (NYU), after serving as Chief Information Officer at NYU Shanghai from 2014 to 2017. He also is an associate professor at the Arthur L. Carter Journalism Institute and Associate Arts Professor at the Tisch School of the Arts' Interactive Telecommunications Program. His courses address, among other things, the interrelated effects of the topology of social networks and technological networks, how our networks shape culture and vice versa.
He has written and been interviewed about the Internet since 1996. His columns and writings have appeared in Business 2.0, The New York Times, The Wall Street Journal, the Harvard Business Review and Wired. Shirky divides his time between consulting, teaching, and writing on the social and economic effects of Internet technologies. His consulting practice is focused on the rise of decentralized technologies such as peer-to-peer, web services, and wireless networks that provide alternatives to the wired client–server infrastructure that characterizes the World Wide Web. He is a member of the Wikimedia Foundation's advisory board. In The Long Tail, Chris Anderson calls Shirky "a prominent thinker on the social and economic effects of Internet technologies."
Education and career
After graduating from Yale University with a Bachelor of Arts degree in fine art in 1986, he moved to New York. In the 1990s he founded the Hard Place Theater, a theatre company that produced non-fiction theater using only found materials such as government documents, transcripts and cultural records and also worked as a lighting designer for other theater and dance companies, including the Wooster Group, Elevator Repair Service and Dana Reitz. During this time, Shirky was vice-president of the New York chapter of the Electron |
https://en.wikipedia.org/wiki/Regenerative%20medicine | Regenerative medicine deals with the "process of replacing, engineering or regenerating human or animal cells, tissues or organs to restore or establish normal function". This field holds the promise of engineering damaged tissues and organs by stimulating the body's own repair mechanisms to functionally heal previously irreparable tissues or organs.
Regenerative medicine also includes the possibility of growing tissues and organs in the laboratory and implanting them when the body cannot heal itself. When the cell source for a regenerated organ is derived from the patient's own tissue or cells, the challenge of organ transplant rejection via immunological mismatch is circumvented. This approach could alleviate the problem of the shortage of organs available for donation.
Some of the biomedical approaches within the field of regenerative medicine may involve the use of stem cells. Examples include the injection of stem cells or progenitor cells obtained through directed differentiation (cell therapies); the induction of regeneration by biologically active molecules administered alone or as a secretion by infused cells (immunomodulation therapy); and transplantation of in vitro grown organs and tissues (tissue engineering).
History
The ancient Greeks postulated whether parts of the body could be regenerated in the 700s BC. Skin grafting, invented in the late 19th century, can be thought of as the earliest major attempt to recreate bodily tissue to restore structure and function. Advances in transplanting body parts in the 20th century further pushed the theory that body parts could regenerate and grow new cells. These advances led to tissue engineering, and from this field, the study of regenerative medicine expanded and began to take hold. This began with cellular therapy, which led to the stem cell research that is widely being conducted today.
The first cell therapies were intended to slow the aging process. This began in the 1930s with Paul Niehans, a Swiss |
https://en.wikipedia.org/wiki/Cell%20therapy | Cell therapy (also called cellular therapy, cell transplantation, or cytotherapy) is a therapy in which viable cells are injected, grafted or implanted into a patient in order to effectuate a medicinal effect, for example, by transplanting T-cells capable of fighting cancer cells via cell-mediated immunity in the course of immunotherapy, or grafting stem cells to regenerate diseased tissues.
Cell therapy originated in the nineteenth century when scientists experimented by injecting animal material in an attempt to prevent and treat illness. Although such attempts produced no positive benefit, further research found in the mid twentieth century that human cells could be used to help prevent the human body rejecting transplanted organs, leading in time to successful bone marrow transplantation as has become common practice in treatment for patients that have compromised bone marrow after disease, infection, radiation or chemotherapy. In recent decades, however, stem cell and cell transplantation has gained significant interest by researchers as a potential new therapeutic strategy for a wide range of diseases, in particular for degenerative and immunogenic pathologies.
Background
Cell therapy can be defined as therapy in which cellular material is injected or otherwise transplanted into a patient. The origins of cell therapy can perhaps be traced to the nineteenth century, when Charles-Édouard Brown-Séquard (1817–1894) injected animal testicle extracts in an attempt to stop the effects of aging. In 1931 Paul Niehans (1882–1971) – who has been called the inventor of cell therapy – attempted to cure a patient by injecting material from calf embryos. Niehans claimed to have treated many people for cancer using this technique, though his claims have never been validated by research.
In 1953 researchers found that laboratory animals could be helped not to reject organ transplants by pre-inoculating them with cells from donor animals; in 1968, in Minnesota, the first su |
https://en.wikipedia.org/wiki/Encryption%20software | Encryption software is software that uses cryptography to prevent unauthorized access to digital information. Cryptography is used to protect digital information on computers as well as the digital information that is sent to other computers over the Internet.
Classification
There are many software products which provide encryption. Software encryption uses a cipher to obscure the content into ciphertext. One way to classify this type of software is the type of cipher used. Ciphers can be divided into two categories: public key ciphers (also known as asymmetric ciphers), and symmetric key ciphers. Encryption software can be based on either public key or symmetric key encryption.
Another way to classify software encryption is to categorize its purpose. Using this approach, software encryption may be classified into software which encrypts "data in transit" and software which encrypts "data at rest". Data in transit generally uses public key ciphers, and data at rest generally uses symmetric key ciphers.
Symmetric key ciphers can be further divided into stream ciphers and block ciphers. Stream ciphers typically encrypt plaintext a bit or byte at a time, and are most commonly used to encrypt real-time communications, such as audio and video information. The key is used to establish the initial state of a keystream generator, and the output of that generator is used to encrypt the plaintext. Block cipher algorithms split the plaintext into fixed-size blocks and encrypt one block at a time. For example, AES processes 16-byte blocks, while its predecessor DES encrypted blocks of eight bytes.
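The keystream construction described above can be illustrated with a toy Python sketch. Everything here is an assumption for illustration only: `random.Random` stands in for a proper cryptographic keystream generator, so this code is demonstrative and completely insecure.

```python
import random

def xor_stream(data: bytes, key: int) -> bytes:
    """Toy stream cipher: the key seeds a keystream generator, and each
    byte of data is XORed with the next keystream byte. Illustration
    only -- random.Random is NOT a cryptographic generator; real stream
    ciphers (e.g. ChaCha20) keep this structure but use a cryptographic
    keystream."""
    keystream = random.Random(key)
    return bytes(b ^ keystream.randrange(256) for b in data)

# XOR is self-inverse, so the same function both encrypts and decrypts.
ciphertext = xor_stream(b"attack at dawn", key=1234)
plaintext = xor_stream(ciphertext, key=1234)  # recovers the original bytes
```

The self-inverse property is what makes the stream-cipher structure attractive for real-time communications: encryption and decryption are the same cheap per-byte operation, with no blocking or padding.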
There are also well-known cases where PKI is used for both data in transit and data at rest.
Data in transit
Data in transit is data that is being sent over a computer network. When the data is between two endpoints, any confidential information may be vulnerable. The payload (confidential information) can be encrypted to secure its confidentiality, as well as its integrity and valid |
https://en.wikipedia.org/wiki/Bruch%27s%20membrane | Bruch's membrane or lamina vitrea is the innermost layer of the choroid of the eye. It is also called the vitreous lamina or membrana vitrea because of its glassy microscopic appearance. It is 2–4 μm thick.
Anatomy
Structure
Bruch's membrane consists of five layers (from inside to outside):
the basement membrane of the retinal pigment epithelium
the inner collagenous zone
a central band of elastic fibers
the outer collagenous zone
the basement membrane of the choriocapillaris
Development
The membrane grows thicker with age. With age, lipid-containing extracellular deposits may accumulate between the membrane and the basal lamina of the retinal pigmental epithelium, impairing exchange of solutes and contributing to age-related pathology.
Embryology
Bruch's membrane is present by midterm in fetal development as an elastic sheet.
Function
The membrane is involved in the regulation of fluid and solute passage from the choroid to the retina.
Pathology
Bruch's membrane thickens with age, slowing the transport of metabolites. This may lead to the formation of drusen in age-related macular degeneration. There is also a buildup of deposits (basal linear deposits, BLinD, and basal lamellar deposits, BLamD) on and within the membrane, primarily consisting of phospholipids. The accumulation of lipids appears to be greater in the central fundus than in the periphery. This buildup seems to fragment the membrane into a lamellar structure more like puff pastry than a barrier. Inflammatory and neovascular mediators can then invite choroidal vessels to grow into and beyond the fragmented membrane. This neovascular membrane destroys the architecture of the outer retina and leads to sudden loss of central vision – wet age-related macular degeneration.
Pseudoxanthoma elasticum, myopia and trauma can also cause defects in Bruch's membrane which may lead to choroidal neovascularization. Alport's Syndrome, a genetic disorder affecting the alpha(IV) collagen chains, can also |
https://en.wikipedia.org/wiki/Surface%20metrology | Surface metrology is the measurement of small-scale features on surfaces, and is a branch of metrology. Surface primary form, surface fractality, and surface finish (including surface roughness) are the parameters most commonly associated with the field. It is important to many disciplines and is mostly known for the machining of precision parts and assemblies which contain mating surfaces or which must operate with high internal pressures.
Surface finish may be measured in two ways: contact and non-contact methods. Contact methods involve dragging a measurement stylus across the surface; these instruments are called profilometers. Non-contact methods include: interferometry, digital holography, confocal microscopy, focus variation, structured light, electrical capacitance, electron microscopy, photogrammetry and non-contact profilometers.
Overview
The most common method is to use a diamond stylus profilometer. The stylus is run perpendicular to the lay of the surface. The probe usually traces along a straight line on a flat surface or in a circular arc around a cylindrical surface. The length of the path that it traces is called the measurement length. The wavelength of the lowest frequency filter that will be used to analyze the data is usually defined as the sampling length. Most standards recommend that the measurement length should be at least seven times longer than the sampling length, and according to the Nyquist–Shannon sampling theorem it should be at least two times longer than the wavelength of interesting features. The assessment length or evaluation length is the length of data that will be used for analysis. Commonly one sampling length is discarded from each end of the measurement length. 3D measurements can be made with a profilometer by scanning over a 2D area on the surface.
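As a rough illustration of how a stylus trace is reduced to a single roughness number, the sketch below computes the arithmetic mean roughness Ra over one sampling length. It is a simplification: the function name and parameters are invented here, and a real instrument applies a Gaussian profile filter to separate roughness from waviness rather than subtracting a plain average height.

```python
def roughness_ra(profile, spacing, sampling_length):
    """Arithmetic mean roughness Ra over one sampling length of an
    evenly spaced profilometer trace (heights in micrometres).
    Sketch only: the mean line here is just the average height of the
    segment, not the Gaussian-filtered mean line used by standards."""
    n = int(sampling_length / spacing)        # samples in one sampling length
    segment = profile[:n]
    mean = sum(segment) / len(segment)        # simple mean line
    return sum(abs(z - mean) for z in segment) / len(segment)
```

In practice the evaluation length contains several consecutive sampling lengths (commonly five), and Ra is averaged over all of them, which is consistent with the recommendation above that the measurement length be several times longer than the sampling length.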
The disadvantage of a profilometer is that it is not accurate when the size of the features of the surface are close to the same size as the stylus. Another disadvantage is |
https://en.wikipedia.org/wiki/Conway%27s%20law | Conway's law is an adage linking the communication structure of organizations to the systems they design. It is named after the computer programmer Melvin Conway, who introduced the idea in 1967. His original wording was:
"Any organization that designs a system (defined broadly) will produce a design whose structure is a copy of the organization's communication structure."
The law is based on the reasoning that in order for a product to function, the authors and designers of its component parts must communicate with each other in order to ensure compatibility between the components. Therefore, the technical structure of a system will reflect the social boundaries of the organizations that produced it, across which communication is more difficult. In colloquial terms, it means complex products end up "shaped like" the organizational structure they are designed in or designed for. The law is applied primarily in the field of software architecture, though Conway directed it more broadly and its assumptions and conclusions apply to most technical fields.
Variations
Eric S. Raymond, an open-source advocate, restated Conway's law in The New Hacker's Dictionary, a reference work based on the Jargon File. The organization of the software and the organization of the software team will be congruent, he said. Summarizing an example in Conway's paper, Raymond wrote:
Raymond further presents Tom Cheatham's amendment of Conway's Law, stated as:
Yourdon and Constantine, in their 1979 book on Structured Design, gave a more strongly stated variation of Conway's Law:
James O. Coplien and Neil B. Harrison stated in a 2004 book concerned with organizational patterns of Agile software development:
More recent commentators have noted a corollary: for software projects with a long lifetime of code reuse, such as Microsoft Windows, the structure of the code mirrors not only the communication structure of the organization which created the most recent release, but also the communication structures of every previous team which worked on that code.
Interpretations
The law is, in a strict sense, only about correspondence; it does not |
https://en.wikipedia.org/wiki/SLAM%20project | The SLAM project, which was started in 1999 by Thomas Ball and Sriram Rajamani of Microsoft Research, aimed at verifying software safety properties using model checking techniques. It was implemented in OCaml, and has been used to find many bugs in Windows Device Drivers. It is distributed as part of the Microsoft Windows Driver Foundation development kit as the Static Driver Verifier (SDV). "SLAM originally was an acronym but we found it too cumbersome to explain. We now prefer to think of 'slamming' the bugs in a program." It initially stood for "software (specifications), programming languages, abstraction, and model checking". Note that Microsoft has since re-used SLAM to stand for "Social Location Annotation Mobile".
See also
Abstraction model checking
the BLAST model checker, a model checker similar to SLAM that uses "lazy abstraction" |
https://en.wikipedia.org/wiki/VAXstation | The VAXstation is a discontinued family of workstation computers developed and manufactured by Digital Equipment Corporation using processors implementing the VAX instruction set architecture. VAXstation systems were typically shipped with either the OpenVMS or ULTRIX operating systems. Many members of the VAXstation family had corresponding MicroVAX variants, which primarily differ by the lack of graphics hardware.
VAXstation 100
The VAXstation 100 was a VAXstation-branded graphics terminal introduced in May 1983. It used a Motorola 68000 microprocessor and connected to its VAX host via Unibus. It was used for developing the X Window System.
VAXstation 500
The VAXstation 500 was a VAXstation system with color graphics, introduced in March 1985. It consisted of a MicroVAX I and a Tektronix 4125 color terminal.
VAXstation 520
The VAXstation 520 was a follow-on to the VAXstation 500 which used a MicroVAX II as the host system instead of a MicroVAX I. At the time of its introduction in September 1985, a configuration with 2MB of memory, a 32MB hard disk and two 400KB floppy disk drives cost $40,790.
VAXstation I
Introduced in October 1984, it was code named "Seahorse", and used the KD32 CPU module containing a 4 MHz (250 ns) MicroVAX I processor.
VAXstation II
Code named "Mayflower", it used the KA630 CPU module containing a 5 MHz (200 ns) MicroVAX 78032 microprocessor. It was essentially a MicroVAX II in a workstation configuration.
VAXstation II/RC
A short-lived, lower-cost "Reduced Configuration" variant of the VAXstation II. Compared with the standard VAXstation II, a number of the slots on the backplane were filled with epoxy to limit the system's upgradability. It was discontinued when Digital discovered that enterprising customers were removing the epoxy, or replacing the backplane in order to convert the RC into a standard VAXstation II.
VAXstation II/GPX
Introduced in December 1985, it was code named "Caylith", and was a variant of the VAXstat |
https://en.wikipedia.org/wiki/Real-Time%20Multiprogramming%20Operating%20System | Real-Time Multiprogramming Operating System (RTMOS) was a 24-bit process control operating system developed in the 1960s by General Electric that supported both real-time computing and multiprogramming. Programming was done in assembly language or Process FORTRAN. The two languages could be used in the same program, allowing programmers to alternate between the two as desired.
Multiprogramming operating systems are now considered obsolete, having been replaced by multitasking. |
https://en.wikipedia.org/wiki/Nash%E2%80%93Moser%20theorem | In the mathematical field of analysis, the Nash–Moser theorem, discovered by mathematician John Forbes Nash and named for him and Jürgen Moser, is a generalization of the inverse function theorem on Banach spaces to settings when the required solution mapping for the linearized problem is not bounded.
Introduction
In contrast to the Banach space case, in which the invertibility of the derivative at a point is sufficient for a map to be locally invertible, the Nash–Moser theorem requires the derivative to be invertible in a neighborhood. The theorem is widely used to prove local existence for non-linear partial differential equations in spaces of smooth functions. It is particularly useful when the inverse to the derivative "loses" derivatives, and therefore the Banach space implicit function theorem cannot be used.
History
The Nash–Moser theorem traces back to Nash, who proved the theorem in the special case of the isometric embedding problem. It is clear from his paper that his method can be generalized. Moser, for instance, showed that Nash's methods could be successfully applied to solve problems on periodic orbits in celestial mechanics in KAM theory. However, it has proven quite difficult to find a suitable general formulation; there is, to date, no all-encompassing version; various versions due to Gromov, Hamilton, Hörmander, Saint-Raymond, Schwartz, and Sergeraert are given in the references below. Hamilton's version, quoted below, is particularly widely cited.
The problem of loss of derivatives
This will be introduced in the original setting of the Nash–Moser theorem, that of the isometric embedding problem. Let \Omega be an open subset of \mathbb{R}^n. Consider the map
P \colon C^1(\Omega; \mathbb{R}^N) \to C^0(\Omega; \operatorname{Sym}_{n \times n}(\mathbb{R}))
given by
P(f)_{ij} = \sum_{A=1}^{N} \frac{\partial f^A}{\partial x_i}\,\frac{\partial f^A}{\partial x_j}.
In Nash's solution of the isometric embedding problem (as would be expected in the solutions of nonlinear partial differential equations) a major step is a statement of the schematic form "If f is such that P(f) is positive-definite, then for any matrix-valued function g which is close to P(f), there e |
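A schematic view of why derivatives are lost (a sketch consistent with the map above, not Nash's precise estimates): the linearization of P at f is

```latex
(DP(f)h)_{ij} \;=\; \sum_{A=1}^{N} \left( \frac{\partial f^{A}}{\partial x_{i}}\,\frac{\partial h^{A}}{\partial x_{j}} \;+\; \frac{\partial h^{A}}{\partial x_{i}}\,\frac{\partial f^{A}}{\partial x_{j}} \right),
```

and any right inverse L(f) is, roughly, only available with estimates of the form \|L(f)g\|_{C^{k}} \le C\,\|g\|_{C^{k+r}} for some fixed loss r > 0. A naive Newton iteration f_{n+1} = f_n - L(f_n)(P(f_n) - g) would then lose r derivatives at every step, exhausting any finite regularity; the Nash–Moser scheme restores regularity at each step with smoothing operators and relies on the quadratic convergence of Newton's method to absorb the error this introduces.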
https://en.wikipedia.org/wiki/Interrupt%20storm | In operating systems, an interrupt storm is an event during which a processor receives an inordinate number of interrupts that consume the majority of the processor's time. Interrupt storms are typically caused by hardware devices that do not support interrupt rate limiting.
Background
Because interrupt processing is typically a non-preemptible task in time-sharing operating systems, an interrupt storm will cause sluggish response to user input, or even appear to freeze the system completely. This state is commonly known as livelock. In such a state, the system is spending most of its resources processing interrupts instead of completing other work. To the end user, it does not appear to be processing anything at all, as there is often no output. An interrupt storm is sometimes mistaken for thrashing, since both have similar symptoms (unresponsive or sluggish response to user input, little or no output).
Common causes include: misconfigured or faulty hardware, faulty device drivers, flaws in the operating system, or metastability in one or more components. The latter condition rarely occurs outside of prototype or amateur-built hardware.
Most modern hardware and operating systems have methods for mitigating the effect of an interrupt storm. For example, most Ethernet controllers implement interrupt "rate limiting", which causes the controller to wait a programmable amount of time between each interrupt it generates. When not present within the device, similar functionality is usually written into the device driver, and/or the operating system itself.
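The rate-limiting idea can be sketched as a toy model; the class and its parameters are illustrative, not any real controller's registers or a driver API:

```python
class RateLimitedController:
    """Toy model of a NIC that enforces a minimum gap between interrupts.

    Times are in microseconds. Events arriving inside the gap are
    coalesced rather than each raising their own interrupt.
    """

    def __init__(self, min_gap_us):
        self.min_gap_us = min_gap_us
        self.last_irq_us = None
        self.delivered = 0
        self.suppressed = 0

    def packet_arrived(self, now_us):
        # Fire an interrupt only if the programmed gap has elapsed;
        # otherwise the event waits for the next interrupt.
        if self.last_irq_us is None or now_us - self.last_irq_us >= self.min_gap_us:
            self.last_irq_us = now_us
            self.delivered += 1
        else:
            self.suppressed += 1

nic = RateLimitedController(min_gap_us=100)
for t in range(0, 1000, 10):   # a packet every 10 us for 1 ms
    nic.packet_arrived(t)
print(nic.delivered, nic.suppressed)   # 10 90: 10 interrupts instead of 100
```

With the gap set to zero the same traffic would raise one interrupt per packet, which is exactly the storm scenario described above.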
The most common cause is when a device "behind" another signals an interrupt to an APIC (Advanced Programmable Interrupt Controller). Most computer peripherals generate interrupts through an APIC, as the number of interrupt lines is almost always smaller (typically 15 on a modern PC) than the number of devices. The OS must then query each driver registered to that interrupt to ask if the interrupt originated from its h |
https://en.wikipedia.org/wiki/Kaonic%20hydrogen | Kaonic hydrogen is an exotic atom consisting of a negatively charged kaon orbiting a proton.
Such particles were first identified, through their X-ray spectrum, at the KEK proton synchrotron in Tsukuba, Japan in 1997.
More detailed studies have been performed at DAFNE in Frascati, Italy.
Kaonic hydrogen has been created in very low energy collisions of kaons with the protons in a gaseous hydrogen target. At DAFNE, kaons are produced by the decay of φ mesons which are in turn created in collisions between electrons and positrons. The experiments analyzed X-rays from several electronic transitions in kaonic hydrogen.
Unlike in the hydrogen atom, where the binding between electron and proton is dominated by the electromagnetic interaction, kaons and protons interact also to a large extent by the strong interaction.
In kaonic hydrogen this strong contribution was found to be repulsive, shifting the ground state energy by 283 ± 36 (statistical) ± 6 (systematic) eV, thus making the system unstable with a resonance width of 541 ± 89 (stat) ± 22 (syst) eV (decay into Λπ and Σπ).
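As a rough back-of-the-envelope check (assuming the standard value of the reduced Planck constant in eV·s), the measured width corresponds to a lifetime on the order of 10^−18 s via the uncertainty relation τ = ħ/Γ:

```python
# Estimate the ground-state lifetime of kaonic hydrogen from the
# measured resonance width via tau = hbar / Gamma.
HBAR_EV_S = 6.582119569e-16   # reduced Planck constant in eV*s (CODATA)
width_ev = 541.0              # measured width from the text, in eV

tau = HBAR_EV_S / width_ev
print(f"lifetime ~ {tau:.2e} s")   # ~1.2e-18 s
```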
Kaonic hydrogen is studied mainly because of its importance for the understanding of kaon-nucleon interactions and for testing quantum chromodynamics.
See also
Kaonium
Pionic helium |
https://en.wikipedia.org/wiki/Kaonium | Kaonium is an exotic atom consisting of a bound state of a positively charged and a negatively charged kaon. Kaonium has not been observed experimentally and is expected to have a short lifetime on the order of 10^−18 seconds. |
https://en.wikipedia.org/wiki/List%20of%20types%20of%20XML%20schemas | This is a list of notable XML schemas in use on the Internet sorted by purpose. XML schemas can be used to create XML documents for a wide range of purposes such as syndication, general exchange, and storage of data in a standard format.
Bookmarks
XBEL - XML Bookmark Exchange Language
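A minimal XBEL document can be built and parsed with Python's standard library; the element names here (root <xbel>, <bookmark href=...>, child <title>) follow the XBEL DTD, but treat the exact attributes as illustrative:

```python
import xml.etree.ElementTree as ET

# Build a one-entry XBEL bookmark file in memory.
xbel = ET.Element("xbel", version="1.0")
bm = ET.SubElement(xbel, "bookmark", href="https://en.wikipedia.org/")
ET.SubElement(bm, "title").text = "Wikipedia"

doc = ET.tostring(xbel, encoding="unicode")
print(doc)

# Round-trip: parse it back and read the bookmark target.
parsed = ET.fromstring(doc)
print(parsed.find("bookmark").get("href"))   # https://en.wikipedia.org/
```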
Brewing
BeerXML - a free XML based data description standard for the exchange of brewing data
Business
Auto-lead Data Format - for communicating consumer purchase requests to automotive dealerships.
ACORD data standards - Insurance Industry XML schemas specifications by Association for Cooperative Operations Research and Development
Europass XML - XML vocabulary describing the information contained in a Curriculum Vitae (CV), Language Passport (LP) and European Skills Passport (ESP)
OSCRE - Open Standards Consortium for Real Estate format for data exchange within the real estate industry
UBL - Defining a common XML library of business documents (purchase orders, invoices, etc.) by Oasis
XBRL Extensible Business Reporting Language for International Financial Reporting Standards (IFRS) and United States generally accepted accounting principles (GAAP) business accounting.
Elections
EML - Election Markup Language, is an OASIS standard to support end-to-end management of election processes. It defines over thirty schemas, for example EML 510 for vote count reporting and EML 310 for voter registration.
Engineering
gbXML - an open schema developed to facilitate transfer of building data stored in Building Information Models (BIMs) to engineering analysis tools.
IFC-XML - Building Information Models for architecture, engineering, construction, and operations.
XMI - an Object Management Group (OMG) standard for exchanging metadata information, commonly used for exchange of UML information
XTCE - XML Telemetric and Command Exchange is an XML based data exchange format for spacecraft telemetry and command meta-data
Financial
FIXatdl - FIX algorithmic trading definition language. |
https://en.wikipedia.org/wiki/ESTREAM | eSTREAM is a project to "identify new stream ciphers suitable for widespread adoption", organised by the EU ECRYPT network. It was set up as a result of the failure of all six stream ciphers submitted to the NESSIE project. The call for primitives was first issued in November 2004. The project was completed in April 2008. The project was divided into separate phases and the project goal was to find algorithms suitable for different application profiles.
Profiles
The submissions to eSTREAM fall into either or both of two profiles:
Profile 1: "Stream ciphers for software applications with high throughput requirements"
Profile 2: "Stream ciphers for hardware applications with restricted resources such as limited storage, gate count, or power consumption."
Both profiles contain an "A" subcategory (1A and 2A) with ciphers that also provide authentication in addition to encryption. In Phase 3, none of the ciphers providing authentication remained under consideration (the NLS cipher had its authentication component removed to improve performance).
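At the interface level, the ciphers in either profile do the same thing: expand a key and nonce into a keystream that is XORed with the plaintext. The sketch below illustrates that structure only, using SHA-256 in counter mode as a stand-in generator; it is not an eSTREAM candidate and is not meant to be secure:

```python
import hashlib

def keystream(key: bytes, nonce: bytes, n: int) -> bytes:
    """Toy keystream: hash key || nonce || counter and concatenate blocks."""
    out = bytearray()
    counter = 0
    while len(out) < n:
        block = hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        out.extend(block)
        counter += 1
    return bytes(out[:n])

def xor_cipher(key: bytes, nonce: bytes, data: bytes) -> bytes:
    # Encryption and decryption are the same operation in a stream cipher.
    ks = keystream(key, nonce, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

key, nonce = b"sixteen byte key", b"unique nonce"
ct = xor_cipher(key, nonce, b"attack at dawn")
pt = xor_cipher(key, nonce, ct)
print(pt)   # b'attack at dawn'
```

Reusing a (key, nonce) pair reuses the keystream, which is fatal for any cipher of this shape; the eSTREAM candidates differ in how the keystream generator itself is built.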
eSTREAM portfolio
The following ciphers make up the eSTREAM portfolio:
These are all free for any use. Rabbit was the only one that had a patent pending during the eSTREAM competition, but it was released into the public domain in October 2008.
The original portfolio, published at the end of Phase 3, consisted of the above ciphers plus F-FCSR which was in Profile 2. However, cryptanalysis of F-FCSR led to a revision of the portfolio in September 2008 which removed that cipher.
Phases
Phase 1
Phase 1 included a general analysis of all submissions with the purpose of selecting a subset of the submitted designs for further scrutiny. The designs were scrutinized based on criteria of security, performance (with respect to the block cipher AES—a US Government approved standard, as well as the other candidates), simplicity and flexibility, justification and supporting analysis, and clarity and completeness of the documentation. Submi |
https://en.wikipedia.org/wiki/Mathematical%20joke | A mathematical joke is a form of humor which relies on aspects of mathematics or a stereotype of mathematicians. The humor may come from a pun, or from a double meaning of a mathematical term, or from a lay person's misunderstanding of a mathematical concept. Mathematician and author John Allen Paulos in his book Mathematics and Humor described several ways that mathematics, generally considered a dry, formal activity, overlaps with humor, a loose, irreverent activity: both are forms of "intellectual play"; both have "logic, pattern, rules, structure"; and both are "economical and explicit".
Some performers combine mathematics and jokes to entertain and/or teach math.
Humor of mathematicians may be classified into the esoteric and exoteric categories. Esoteric jokes rely on intrinsic knowledge of mathematics and its terminology. Exoteric jokes are intelligible to outsiders, and most of them compare mathematicians with representatives of other disciplines or with common folk.
Pun-based jokes
Some jokes use a mathematical term with a second non-technical meaning as the punchline of a joke.
Occasionally, multiple mathematical puns appear in the same jest:
This invokes four double meanings: adder (snake) vs. addition (algebraic operation); multiplication (biological reproduction) vs. multiplication (algebraic operation); log (a cut tree trunk) vs. log (logarithm); and table (set of facts) vs. table (piece of furniture).
Other jokes create a double meaning from a direct calculation involving facetious variable names, such as this retold from Gravity's Rainbow:
The first part of this joke relies on the fact that the primitive (formed when finding the antiderivative) of the function 1/x is log(x). The second part is then based on the fact that the antiderivative is actually a class of functions, requiring the inclusion of a constant of integration, usually denoted as C—something which calculus students may forget. Thus, the indefinite integral of 1/cabin i |
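In its usual telling, the punchline rests on the calculation

```latex
\int \frac{d(\mathrm{cabin})}{\mathrm{cabin}} = \log(\mathrm{cabin}) + C,
```

that is, a "log cabin plus C" — with C read as "sea", a houseboat.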
https://en.wikipedia.org/wiki/William%20Duncan%20MacMillan | William Duncan MacMillan (July 24, 1871 – November 14, 1948) was an American mathematician and astronomer on the faculty of the University of Chicago. He published research on the applications of classical mechanics to astronomy, and is noted for pioneering speculations on physical cosmology. For the latter, Helge Kragh noted, "the cosmological model proposed by MacMillan was designed to lend support to a cosmic optimism, which he felt was threatened by the world view of modern physics."
Biography
He was born in La Crosse, Wisconsin, to D. D. MacMillan, who was in the lumber business, and Mary Jane McCrea. His brother, John H. MacMillan, headed the Cargill Corporation from 1909 to 1936. MacMillan graduated from La Crosse High School in 1888. In 1889, he attended Lake Forest College, then entered the University of Virginia. Later in 1898, he earned an A.B. degree from Fort Worth University, which was then a Methodist university in Texas. He performed his graduate work at the University of Chicago, earning a master's degree in 1906 and a PhD in astronomy in 1908. In 1907, prior to completing his PhD, he joined the staff of the University of Chicago as a research assistant in geology. In 1908, he became an associate in mathematics, then in 1909, he began instruction in astronomy at the same institution. His career as a professor began in 1912 when he became an assistant professor. In 1917, when the U.S. declared war on Germany, Dr. MacMillan served as a major in the U.S. army's ordnance department during World War I. Following the war, he became associate professor in 1919, then full professor in 1924. MacMillan retired in 1936.
In a 1958 paper about MacMillan's work on cosmology, Richard Schlegel introduced MacMillan as "best known to physicists for his three-volume Classical Mechanics" that remained in print for decades after MacMillan's 1936 retirement. MacMillan published extensively on the mathematics of the orbits of planets and stars. In the 1920s, MacMilla |
https://en.wikipedia.org/wiki/Active%20shape%20model | Active shape models (ASMs) are statistical models of the shape of objects which iteratively deform to fit to an example of the object in a new image, developed by Tim Cootes and Chris Taylor in 1995. The shapes are constrained by the PDM (point distribution model) Statistical Shape Model to vary only in ways seen in a training set of labelled examples.
The shape of an object is represented by a set of points (controlled by the shape model). The ASM algorithm aims to match the model to a new image.
The ASM works by alternating the following steps:
Generate a suggested shape by looking in the image around each point for a better position for the point. This is commonly done using what is called a "profile model", which looks for strong edges or uses the Mahalanobis distance to match a model template for the point.
Conform the suggested shape to the point distribution model, commonly called a "shape model" in this context. The figure to the right shows an example.
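The two alternating steps can be sketched as follows. This is a minimal illustration with a hypothetical one-mode shape model; the profile search of step 1 is stubbed out with noise, and NumPy is the only dependency:

```python
import numpy as np

# Shapes are flattened (x1, y1, x2, y2, ...) vectors. The "trained" model
# here is made up for illustration: a unit square with one mode of
# variation (a horizontal shift).
rng = np.random.default_rng(0)
mean_shape = np.array([0.0, 0.0, 1.0, 0.0, 1.0, 1.0, 0.0, 1.0])
modes = np.array([[1.0, 0, 1, 0, 1, 0, 1, 0]]).T / 2.0   # (8, 1), unit norm
eigvals = np.array([1.0])

def conform_to_model(suggested, mean, P, lam, k=3.0):
    """Step 2: project a suggested shape onto the point distribution model
    and clamp each coefficient b_i to +/- k*sqrt(lambda_i), so that only
    shapes seen as plausible by the training set are produced."""
    b = P.T @ (suggested - mean)
    b = np.clip(b, -k * np.sqrt(lam), k * np.sqrt(lam))
    return mean + P @ b

# Step 1 (stubbed): pretend the profile search suggested a noisy,
# implausible set of point positions.
suggested = mean_shape + rng.normal(scale=0.3, size=8)
fitted = conform_to_model(suggested, mean_shape, modes, eigvals)
print(fitted)
```

In a real ASM the two steps repeat until the points stop moving, and the suggestion step searches image profiles (edges or a Mahalanobis-distance template match) rather than adding noise.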
The technique has been widely used to analyse images of faces, mechanical assemblies and medical images (in 2D and 3D).
It is closely related to the active appearance model. It is also known as a "Smart Snakes" method, since it is an analog to an active contour model which would respect explicit shape constraints.
See also
Procrustes analysis
Point distribution model |
https://en.wikipedia.org/wiki/White%20box%20%28software%20engineering%29 | A white box (or glass box, clear box, or open box) is a subsystem whose internals can be viewed but usually not altered. The term is used in systems engineering, software engineering, and in intelligent user interface design, where it is closely related to recent interest in explainable artificial intelligence.
Having access to the subsystem internals in general makes the subsystem easier to understand, but also easier to hack; for example, if a programmer can examine source code, weaknesses in an algorithm are much easier to discover. That makes white-box testing much more effective than black-box testing, but considerably more difficult, given the sophistication needed on the part of the tester to understand the subsystem.
The notion of a "Black Box in a Glass Box" was originally used as a metaphor for teaching complex topics to computing novices.
See also
Black box
Gray-box testing |
https://en.wikipedia.org/wiki/Liver%20biopsy | Liver biopsy is the biopsy (removal of a small sample of tissue) from the liver. It is a medical test that is done to aid diagnosis of liver disease, to assess the severity of known liver disease, and to monitor the progress of treatment.
Medical uses
Liver biopsy is often required for the diagnosis of a liver problem (jaundice, abnormal blood tests) where blood tests, such as hepatitis A serology, have not been able to identify a cause. It is also required if hepatitis is possibly the result of medication, but the exact nature of the reaction is unclear. Alcoholic liver disease and tuberculosis of the liver may be diagnosed through biopsy. Direct biopsy of tumors of the liver may aid the diagnosis, although this may be avoided if the source is clear (e.g. spread from previously known colorectal cancer). Liver biopsy will likely remain particularly important in the diagnosis of unexplained liver disease. Non-invasive tests for liver fibrosis in alcoholic, nonalcoholic and viral liver diseases are likely to become more widely used.
If the diagnosis is already clear, such as chronic hepatitis B or hepatitis C, liver biopsy is useful to assess the severity of the associated liver damage. The same is true for haemochromatosis (iron overload), although it is frequently omitted. Primary biliary cirrhosis and primary sclerosing cholangitis may require biopsy, although other diagnostic modalities have made this less necessary.
Occasionally, liver biopsy is required to monitor the progress of treatment, such as in chronic viral hepatitis. It is an effective way to measure changes in the Ishak fibrosis score.
For the last century liver biopsy has been considered as the gold standard for assessing the stage and the grade of chronic liver disease. Consensus conference statements recommended liver biopsy in the management of almost all patients with hepatitis C and B.
Biopsy results show significant variability (up to 40% for fibrosis diagnosis) which can lead to a wrong di |
https://en.wikipedia.org/wiki/White%20box%20%28computer%20hardware%29 | In computer hardware, a white box is a personal computer or server without a well-known brand name.
The term is usually applied to systems assembled by small system integrators and to homebuilt computer systems assembled by end users from parts purchased separately at retail. In this sense, building a white box system is part of the DIY movement.
The term is also applied to high volume production of unbranded PCs that began in the mid-1980s with 8 MHz Turbo XT systems selling for just under $1000.
In 2002, around 30% of personal computers sold annually were white box systems.
Operating systems
While PCs built by system manufacturers generally come with a pre-installed operating system, white boxes from both large and small system vendors and other VAR channels can be ordered with or without a pre-installed OS. Usually when ordered with an operating system, the system builder uses an OEM copy of the OS.
Whitebook or Intel "Common Building Blocks"
Intel defined form factor and interconnection standards for notebook computer components, including "Barebones" (chassis and motherboard), hard disk drive, optical disk drive, LCD, battery pack, keyboard, and AC/DC adapter. These building blocks are primarily marketed to computer building companies, rather than DIY users.
Costs
While saving money is a common motivation for building one's own PC, today it is generally more expensive to build a low-end PC than to buy a pre-built one from a well-known manufacturer, given the build quality and the total cost of the parts involved. For this reason, buying a pre-assembled computer from an established manufacturer is usually the better option unless one has the skills, budget, and knowledge to build it oneself.
See also
Beige box
Enthusiast computing
Homebuilt computer
White-label product |