| source | text |
|---|---|
https://en.wikipedia.org/wiki/Cisco%20Systems%20VPN%20Client | Cisco Systems VPN Client is a software application for connecting to virtual private networks based on Internet Key Exchange version 1.
On July 29, 2011, Cisco announced the end of life of the product. No further product updates were released after July 30, 2012, and support ceased on July 29, 2014. The Support page with documentation links was taken down on July 30, 2016, replaced with an Obsolete Status Notification.
Availability and compatibility
The software is not free but is often installed on university and business computers in accordance with a site license. As with most corporate licenses, administrators are allowed to freely distribute the software to users within their network.
The open-source vpnc client can connect to most VPNs supported by the official client.
VPN Client 4.9.01.0230 beta added support for Mac OS X 10.6. Stable version 4.9.01.0180 appears to lack that support; 4.9.00.0050 explicitly did not support versions of Mac OS X later than 10.5.
VPN Client 5.0.07.0290 added support for 64-bit versions of Windows Vista and Windows 7.
Security
The client uses profile configuration files (.pcf) that store VPN passwords either obfuscated with Cisco's trivially reversible "type 7" encoding or as plaintext. This has been identified as a vulnerability: the passwords can easily be recovered using software or online services. To work around the issue, network administrators are advised to use the Mutual Group Authentication feature, or to use unique passwords that are not related to other important network passwords.
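The weakness of the "type 7" scheme can be made concrete with a short decoder sketch. Type 7 values are a two-digit seed followed by hex byte pairs, each XORed against a fixed key string; the key constant below is the one widely circulated in security literature, and the sample value is constructed here for illustration rather than taken from a real device.

```python
# Decoder for Cisco "type 7" obfuscated passwords, illustrating why they are
# trivially reversible. XLAT is the widely documented fixed XOR key; the
# sample input below was constructed for this illustration.
XLAT = "dsfd;kfoA,.iyewrkldJKDHSUBsgvca69834ncxv"

def decode_type7(encoded: str) -> str:
    seed = int(encoded[:2])              # first two digits: starting key offset
    hexpart = encoded[2:]                # remainder: one hex byte per character
    out = []
    for i in range(0, len(hexpart), 2):
        byte = int(hexpart[i:i + 2], 16)
        out.append(chr(byte ^ ord(XLAT[(seed + i // 2) % len(XLAT)])))
    return "".join(out)

print(decode_type7("070C285F4D06"))      # -> cisco
```

Because the key is public and the transform is a plain XOR, "decoding" requires no brute force at all, which is why unique passwords or Mutual Group Authentication are the recommended mitigations.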
See also
Cisco ASA, the product line that replaced Cisco VPN Concentrator on the server side |
https://en.wikipedia.org/wiki/Otto%20Finsch | Friedrich Hermann Otto Finsch (8 August 1839, Warmbrunn – 31 January 1917, Braunschweig) was a German ethnographer, naturalist and colonial explorer. He is known for a two-volume monograph on the parrots of the world which earned him a doctorate. He also wrote on the people of New Guinea and was involved in plans for German colonization in Southeast Asia. Several species of bird (such as Oenanthe finschii, Iole finschii, Psittacula finschii) are named after him, as is the town of Finschhafen in Morobe Province, Papua New Guinea, and a crater on the Moon.
Biography
Finsch was born at Bad Warmbrunn in Silesia to Moritz Finsch and Mathilde née Leder. His father was in the glass trade and he too trained as a glass painter. An interest in birds led him to use his artistic skills for that purpose. Finsch went to Budapest in 1857 and studied at the Royal Hungarian University, earning money by preparing natural history specimens. He then spent two years in Russe, Bulgaria, on an invitation from the Austrian Consul, giving private tutoring in German while exploring the birdlife of the region. He published his first paper, on the birds of Bulgaria, in the Journal für Ornithologie. This experience helped him obtain a curatorial position at the Rijksmuseum van Natuurlijke Historie in Leiden (1862–1865), assisting Hermann Schlegel. In 1864 he returned to Germany on the suggestion of Gustav Hartlaub to become curator of the museum in Bremen, and became its director in 1876. After publishing his two-volume monograph on the parrots of the world, Die Papageien, monographisch bearbeitet (1867–68), he obtained an honorary doctorate from the Friedrich Wilhelms University in Bonn. Apart from ornithology he also took an interest in ethnology. In 1876 he accompanied the zoologist Alfred Brehm on an expedition to Turkestan and northwest China.
Finsch resigned as curator of the museum in 1878 in order that he could resume his travels, sponsored by the Humboldt Foundation. Between spring 1879 |
https://en.wikipedia.org/wiki/Glow%20plug%20%28model%20engine%29 | A glow plug engine, or glow engine, is a type of small internal combustion engine typically used in model aircraft, model cars and similar applications. The ignition is accomplished by a combination of heating from compression, heating from a glow plug and the catalytic effect of the platinum within the glow plug on the methanol within the fuel.
History
American inventor Ray Arden invented the first glow plug for model engines in 1947.
Model glow plug design
The glow plugs used in model engines are significantly different from those used in full-size diesel engines. In full-size engines, the glow plug is used only for starting. In model engines, the glow plug is an integral part of the ignition system because of the catalytic effect of the platinum wire. The glow plug is a durable, mostly platinum, helical wire filament recessed into the plug's tip. When an electric current runs through the plug, or when exposed to the heat of the combustion chamber, the filament glows, enabling it to help ignite the special fuel used by these engines. Power can be applied using a special connector attaching to the outside of the engine, and may use a rechargeable battery or DC power source.
There are three types/shapes (at least) of glow plugs. The standard glow plug, which comes in long/standard and short (for smaller engines), in both open and idle-bar configurations, has a threaded tube that penetrates the combustion chamber to varying degrees. Due to the small size of the combustion chamber changing brands or styles of standard glow plug can affect the compression ratio. Turbo style (European/metric) and Nelson style (North American/English) glow plugs do not penetrate the combustion chamber. Instead they have an angled shoulder that seals against a matching surface at the bottom of the glow plug hole. As a Turbo or Nelson plug is installed and seals the combustion chamber, they create a smooth surface inside the head. This smooth surface is very desirable for high-perf |
https://en.wikipedia.org/wiki/Arecibo%20message | The Arecibo message is an interstellar radio message carrying basic information about humanity and Earth that was sent to the globular cluster Messier 13 in 1974. It was meant as a demonstration of human technological achievement, rather than a real attempt to enter into a conversation with extraterrestrials.
The message was broadcast into space a single time via frequency modulated radio waves at a ceremony to mark the remodeling of the Arecibo Telescope in Puerto Rico on 16 November 1974. The message was aimed at the current location of M13, about 25,000 light years from Earth, because M13 was a large and relatively close collection of stars that was available in the sky at the time and place of the ceremony. When correctly translated into graphics, characters, and spaces, the 1,679 bits of data contained within the message form the image shown here.
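The figure of 1,679 bits is usually explained by its arithmetic: 1,679 is a semiprime (23 × 73), so a recipient trying to lay the bits out as a rectangle has essentially only two shapes to test, one of which reveals the image. A few lines of code verify this property:

```python
# 1,679 was chosen because it is a semiprime: its only nontrivial
# factorization is 23 x 73, so the bits can be arranged into a rectangle
# in essentially two ways (23x73 or 73x23), one of which yields the image.
def nontrivial_factorizations(n: int):
    """All factor pairs (a, b) with 1 < a <= b and a * b == n."""
    return [(a, n // a) for a in range(2, int(n ** 0.5) + 1) if n % a == 0]

print(nontrivial_factorizations(1679))   # -> [(23, 73)]
```

A composite with many divisors (say, 1,680) would admit dozens of candidate grids, making the intended decoding far more ambiguous.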
Description
The content of the Arecibo message was designed by Frank Drake, then at Cornell University and creator of the Drake equation, who wrote the message with help from Carl Sagan and others. The message was meant more as a demonstration of human technological achievement than a serious attempt to enter into a conversation with possible extraterrestrials. As globular cluster M13, at which the message was aimed, is more than 25,000 light-years from Earth, the message, traveling at the speed of light, will take at least 25,000 years to arrive there. By that time, the core of M13 will no longer be in precisely the same location because of the orbit of the star cluster around the Galactic Center. Even so, the proper motion of M13 is small, so the message will still arrive near the center of the cluster.
The message consists of seven parts that encode the following (from the top down in the image):
The numbers one to ten (white; left to right)
The atomic numbers of the elements hydrogen, carbon, nitrogen, oxygen, and phosphorus, which make up deoxyribonucleic acid (DNA) (purple)
The formulas for the chemical comp |
https://en.wikipedia.org/wiki/MiniDisc | MiniDisc (MD) is an erasable magneto-optical disc-based data storage format offering a capacity of 60, 74, and later, 80 minutes of digitized audio.
Sony announced the MiniDisc in September 1992 and released it in November of that year for sale in Japan, and in December in Europe, North America, and other countries. The music format was based on ATRAC, Sony's own proprietary audio data compression codec. Its successor, Hi-MD, would later introduce the option of linear PCM digital recording, offering audio quality comparable to that of a compact disc. MiniDiscs were very popular in Japan and found moderate success in Europe. Although it was designed to succeed the cassette tape, it did not manage to supplant it globally.
By March 2011 Sony had sold 22 million MD players, but halted further development. Sony ceased manufacturing and sold the last of the players by March 2013.
Market history
In 1983, just a year after the introduction of the compact disc, Kees Schouhamer Immink and Joseph Braat presented the first experiments with erasable magneto-optical compact discs during the 73rd AES Convention in Eindhoven. It took almost 10 years, however, before their idea was commercialized.
Sony's MiniDisc was one of two rival digital systems introduced in 1992 that were intended to replace the Philips Compact Cassette analog audio tape system: the other was the Digital Compact Cassette (DCC), created by Philips and Matsushita (now Panasonic). Sony had originally intended the Digital Audio Tape (DAT) to be the dominant home digital audio recording format, replacing the analog cassette. Because of technical delays, the DAT was not launched until 1989, and by then the U.S. dollar had fallen so far against the yen that the introductory DAT machine Sony had intended to market for about $400 in the late 1980s then had to retail for $800 or even $1,000 to break even, putting it out of reach of most users.
Relegating DAT to professional use, Sony set to work to come up |
https://en.wikipedia.org/wiki/Industrial%20fermentation | Industrial fermentation is the intentional use of fermentation in manufacturing processes. In addition to the mass production of fermented foods and drinks, industrial fermentation has widespread applications in chemical industry. Commodity chemicals, such as acetic acid, citric acid, and ethanol are made by fermentation. Moreover, nearly all commercially produced industrial enzymes, such as lipase, invertase and rennet, are made by fermentation with genetically modified microbes. In some cases, production of biomass itself is the objective, as is the case for single-cell proteins, baker's yeast, and starter cultures for lactic acid bacteria used in cheesemaking.
In general, fermentations can be divided into four types:
Production of biomass (viable cellular material)
Production of extracellular metabolites (chemical compounds)
Production of intracellular components (enzymes and other proteins)
Transformation of substrate (in which the transformed substrate is itself the product)
These types are not necessarily disjoint, but they provide a framework for understanding the differences in approach. The organisms used are typically microorganisms, particularly bacteria, algae, and fungi such as yeasts and molds, but industrial fermentation may also involve cell cultures from plants and animals, such as CHO cells and insect cells. Special considerations are required for the specific organisms used in the fermentation, such as the dissolved oxygen level, nutrient levels, and temperature. The rate of fermentation depends on the concentration of microorganisms, cells, cellular components, and enzymes, as well as on temperature, pH and, for aerobic fermentation, the level of oxygen. Product recovery frequently involves concentrating the dilute solution.
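The dependence of growth rate on nutrient concentration mentioned above is commonly captured by Monod kinetics, a standard textbook model (not specific to this article). The parameter values below are illustrative, not measurements for any particular organism:

```python
# Monod kinetics: a simple, widely used model of how specific microbial
# growth rate depends on the concentration of a limiting substrate.
# mu_max (maximum growth rate, 1/h) and k_s (half-saturation constant, g/L)
# are illustrative values chosen for this sketch.
def monod_growth_rate(substrate: float, mu_max: float = 0.5, k_s: float = 2.0) -> float:
    """Specific growth rate (1/h) at substrate concentration `substrate` (g/L)."""
    return mu_max * substrate / (k_s + substrate)

print(monod_growth_rate(2.0))    # at S = k_s the rate is exactly mu_max / 2
print(monod_growth_rate(50.0))   # with substrate in excess, rate nears mu_max
```

The saturating shape of this curve is one reason fed-batch fermentations meter substrate in gradually: beyond a point, extra nutrient buys little extra growth.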
General process overview
In most industrial fermentations, the organisms or eukaryotic cells are submerged in a liquid medium; in others, such as the fermentation of cocoa beans, coffee cherries, and miso, |
https://en.wikipedia.org/wiki/BIM%20Collaboration%20Format | The BIM Collaboration Format (BCF) is a structured file format suited to issue tracking with a building information model. The BCF is designed primarily for defining views of a building model and associated information on collisions and errors connected with specific objects in the view. The BCF allows users of different BIM software, and/or different disciplines to collaborate on issues with the project. The use of the BCF to coordinate changes to a BIM is an important aspect of OpenBIM.
The format was developed by Tekla and Solibri and later adopted as a standard by buildingSMART. Most major BIM modelling software platforms support some integration with BCF, typically through plug-ins provided by the BCF server vendor.
Although the BCF was originally conceived as a file-based format, there are now many implementations using the cloud-based collaborative workflow described in the BCF API, including an open-source implementation from the Open Source BIM collective.
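In the file-based workflow, a BCF issue is essentially a small XML record describing a topic (title, status, and so on) bundled with viewpoints and snapshots. The sketch below generates a schematic issue record; the element and attribute names loosely follow the BCF 2.x markup file, but the real buildingSMART schema is richer and stricter, so treat this purely as an illustration:

```python
# A schematic sketch of a BCF-style issue record. Element/attribute names
# loosely echo the BCF 2.x markup file (markup.bcf); the actual buildingSMART
# schema has many more required fields, so this is an illustration only.
import uuid
import xml.etree.ElementTree as ET

def make_issue(title: str, status: str = "Open") -> ET.Element:
    markup = ET.Element("Markup")
    topic = ET.SubElement(markup, "Topic",
                          Guid=str(uuid.uuid4()), TopicStatus=status)
    ET.SubElement(topic, "Title").text = title
    return markup

issue = make_issue("Duct clashes with beam on level 3")
print(ET.tostring(issue, encoding="unicode"))
```

The key design point survives even in this toy form: issues are identified by a GUID and carry only references to model objects, so different BIM tools can exchange them without exchanging the model itself.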
Research work has been done in Denmark looking into using the BCF for a broader range of information management and exchange in the architecture, engineering and construction (AEC) sector.
Supporting software
There are two main categories of support for the BCF: authoring software and coordination software. Authoring software can generate and share BCF issues. Coordination software focuses on managing and tracking issues, presenting a user interface for doing so; it is typically a web-based service that allows real-time coordination across multiple authoring software platforms and geographies. Most BIM software offers a mix of these functions.
The BCF is supported natively by authoring software such as Vectorworks, ArchiCAD, Tekla Structures, Quadri, DDS CAD, BIMcollab ZOOM, BIMsight, Solibri, Navisworks, and Simplebim. Standalone BCF plugins include BCF Manager, and BCFier. Coordination software as cloud services offering BCF based issue tracking includ |
https://en.wikipedia.org/wiki/Mario%20Szegedy | Mario Szegedy (born October 23, 1960) is a Hungarian-American computer scientist, professor of computer science at Rutgers University. He received his Ph.D. in computer science in 1989 from the University of Chicago. He held a Lady Davis Postdoctoral Fellowship at the Hebrew University of Jerusalem (1989–90), a postdoc at the University of Chicago, 1991–92, and a postdoc at Bell Laboratories (1992).
Szegedy's research areas include computational complexity theory and quantum computing.
He was awarded the Gödel Prize twice, in 2001 and 2005, for his work on probabilistically checkable proofs and on the space complexity of approximating the frequency moments in streamed data. His work on streaming was also recognized by the 2019 Paris Kanellakis Theory and Practice Award.
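The streaming work cited for the 2005 Gödel Prize is the Alon–Matias–Szegedy (AMS) paper on approximating frequency moments. Its best-known construction estimates the second moment F2 (the sum of squared item frequencies) in tiny space. The sketch below is a simplified illustration: the original uses 4-wise independent sign hashes and a median-of-means reduction, whereas here fully random signs and a plain average are used for clarity:

```python
# Simplified sketch of the Alon-Matias-Szegedy (AMS) estimator for the second
# frequency moment F2 = sum of squared frequencies. The real construction uses
# 4-wise independent sign hashes; here we draw fully random +/-1 signs, which
# only strengthens the independence assumption.
import random
from collections import Counter

def ams_f2_estimate(stream, trials=2000, seed=0):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        signs = {}                         # lazily drawn +/-1 sign per item
        z = 0
        for item in stream:
            if item not in signs:
                signs[item] = rng.choice((-1, 1))
            z += signs[item]               # Z = sum over items of sign * freq
        total += z * z                     # E[Z^2] equals F2
    return total / trials

stream = ["a"] * 4 + ["b"] * 2 + ["c"]
exact_f2 = sum(f * f for f in Counter(stream).values())   # 16 + 4 + 1 = 21
print(exact_f2, round(ams_f2_estimate(stream), 1))
```

The point of the result is the space bound: the estimator keeps only a counter (plus the sign hash), rather than a frequency table over all distinct items.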
He is married and has two daughters. |
https://en.wikipedia.org/wiki/Scianna%20antigen%20system | The Scianna blood antigen system consists of seven antigens. These include two high-frequency antigens, Sc1 and Sc3, and two low-frequency antigens, Sc2 and Sc4.
The very rare null phenotype is characterised by the absence of Sc1, Sc2 and Sc3.
The antigens are caused by changes in the erythroid membrane associated protein (ERMAP).
History
This blood group system was discovered in 1962 when a high frequency antigen was detected in a young woman (Ms. Scianna) who had experienced several late pregnancy losses due to haemolytic disease of the fetus. |
https://en.wikipedia.org/wiki/Visual%20FoxPro | Visual FoxPro is a Microsoft data-centric procedural programming language with object-oriented programming (OOP) features.
It was derived from FoxPro (originally known as FoxBASE) which was developed by Fox Software beginning in 1984. Fox Technologies merged with Microsoft in 1992, after which the software acquired further features and the prefix "Visual". FoxPro 2.6 worked on Mac OS, DOS, Windows, and Unix.
Visual FoxPro 3.0, the first "Visual" version, reduced platform support to only Mac and Windows, and later versions 5, 6, 7, 8 and 9 were Windows-only. The current version of Visual FoxPro is COM-based and Microsoft has stated that they do not intend to create a Microsoft .NET version.
Version 9.0, released in December 2004 and updated in October 2007 with the SP2 patch, was the final version of the product. Mainstream support ended in January 2010, and extended support ended in January 2015.
History
Visual FoxPro originated as a member of the class of languages commonly referred to as "xBase" languages, which have syntax based on the dBase programming language. Other members of the xBase language family include Clipper and Recital (database).
Visual FoxPro, commonly abbreviated as VFP, is tightly integrated with its own relational database engine, which extends FoxPro's xBase capabilities to support SQL query and data manipulation. Unlike most database management systems, Visual FoxPro is a full-featured, dynamic programming language that does not require the use of an additional general-purpose programming environment. It can be used to write not just traditional "fat client" applications, but also middleware and web applications.
In late 2002, it was demonstrated that Visual FoxPro can run on Linux under the Wine Windows compatibility suite. In 2003, this led to complaints by Microsoft: it was claimed that the deployment of runtime FoxPro code on non-Windows machines violates the End User License Agreement.
Visual FoxPro had a rapid rise and fall in popularity as measu |
https://en.wikipedia.org/wiki/Private%20Communications%20Technology | Private Communications Technology (PCT) 1.0 was a protocol developed by Microsoft in the mid-1990s. PCT was designed to address security flaws in version 2.0 of Netscape's Secure Sockets Layer protocol and to force Netscape to hand control of the then-proprietary SSL protocol to an open standards body.
PCT has since been superseded by SSLv3 and Transport Layer Security. For a while it was still supported by Internet Explorer, but PCT 1.0 has been disabled by default since IE 5 and the option was removed in IE6. It is still found in IIS and in the Windows operating system libraries, although in Windows Server 2003 it is disabled by default. Older versions of MSMQ rely on it as their only choice of protocol.
Due to its near disuse, it is arguably a security risk, as it has received less attention in testing than commonly used protocols, and there is little incentive for Microsoft to expend effort on maintaining its implementation of it. |
https://en.wikipedia.org/wiki/Good%20clinical%20laboratory%20practice | Good clinical laboratory practice (GCLP) is a GxP guideline for laboratory samples from clinical studies.
Good clinical practice (GCP) does not define requirements for laboratories, and good laboratory practice (GLP) focuses on pre-clinical analyses rather than on human samples from clinical trials. The Research Quality Association (RQA) proposed a guideline in 2003 to close the gap. Later, the World Health Organization and the British Medicines and Healthcare products Regulatory Agency issued their own versions of a GCLP guideline.
Literature
WHO Good Clinical Laboratory Practice (GCLP)
Stevens W. (2003) Good Clinical Laboratory Practice (GCLP): The need for a hybrid of Good Laboratory Practice and Good Clinical Practice guidelines/standards for medical testing laboratories conducting clinical trials in developing countries. Quality Assurance, 10: 83–89.
Grant, Vanessa and Stiles, Tim, Research Quality Association (2003 and revised in 2012), Good Clinical Laboratory Practice
External links
WHO Good Clinical Laboratory Practice (GCLP) (11 Oct 2010)
MHRA guidance on the maintenance of regulatory compliance in laboratories that perform the analysis or evaluation of clinical trial samples (11 Oct 2010)
Research Quality Association (RQA)
Good Clinical Laboratory Practice (GCLP)
Quality management
Good practice |
https://en.wikipedia.org/wiki/Electoral%20Calculus | Electoral Calculus is a political forecasting web site that attempts to predict future United Kingdom general election results. It considers national factors and local demographics.
Main features
The site was developed by Martin Baxter, who was a financial analyst specialising in mathematical modelling.
The site includes maps, predictions and analysis articles. It has separate sections for elections in Scotland and Northern Ireland.
From April 2019, the headline prediction covered the Brexit Party and Change UK – The Independent Group. Change UK was later removed from the headline prediction ahead of the 2019 general election as their poll scores were not statistically significant.
Methodology
The site applies scientific techniques to data about the United Kingdom's electoral geography, which can be used to calculate the uniform national swing. It takes account of national polls and trends but excludes local issues.
The calculations were initially based on what is termed the Transition Model, which is derived from the additive uniform national swing model. This uses national swings in a proportional manner to predict local effects. The Strong Transition Model was introduced in October 2007, and considers the effects of strong and weak supporters. The models are explained in detail on the web site.
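The additive uniform national swing that underlies these models can be sketched in a few lines: each party's national change in vote share is added to its share in every constituency, and the local winner is recomputed. All numbers below are invented for illustration, and Electoral Calculus's Transition and Strong Transition models refine this basic mechanism considerably:

```python
# A sketch of plain additive uniform national swing (UNS): add each party's
# national change in vote share to its local share in every seat, then
# recompute the winner. All vote shares here are invented for illustration.
def apply_uns(local_shares: dict, national_old: dict, national_new: dict) -> dict:
    swing = {p: national_new[p] - national_old[p] for p in national_old}
    # Clamp at zero: a party cannot have a negative vote share.
    return {p: max(0.0, s + swing[p]) for p, s in local_shares.items()}

seat = {"Lab": 42.0, "Con": 40.0, "LD": 18.0}   # hypothetical marginal seat
old  = {"Lab": 40.0, "Con": 42.0, "LD": 18.0}   # last election, nationwide
new  = {"Lab": 36.0, "Con": 45.0, "LD": 19.0}   # current national polling

predicted = apply_uns(seat, old, new)
winner = max(predicted, key=predicted.get)
print(predicted, winner)    # the seat flips to Con on the national swing
```

One visible weakness of the additive form, which the Transition models address, is the clamp: small parties can be pushed to zero or below, and the shares no longer sum to 100.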
Predictions
Across the eight general elections from 1992 to 2019:
EC correctly predicted the party to win the most seats in seven out of eight (all except 1992).
EC correctly predicted the majority party, or the lack of a majority, in five (1997, 2001, 2005, 2010 (no majority) and 2019).
The mean polling error for the two largest parties was 4.8%.
Reception
It was listed by The Guardian in 2004 as one of the "100 most useful websites", being "the best" for predictions. In 2012 it was described by PhD student Chris Prosser at the University of Oxford as "probably the leading vote/seat predictor on the internet". Its detailed predictions for in |
https://en.wikipedia.org/wiki/Ecohydrology | Ecohydrology (from Greek οἶκος, oikos, "house(hold)"; ὕδωρ, hydōr, "water"; and -λογία, -logia) is an interdisciplinary scientific field studying the interactions between water and ecological systems. It is considered a subdiscipline of hydrology, with an ecological focus. These interactions may take place within water bodies, such as rivers and lakes, or on land, in forests, deserts, and other terrestrial ecosystems. Areas of research in ecohydrology include transpiration and plant water use, adaptation of organisms to their water environment, influence of vegetation and benthic plants on stream flow and function, and feedbacks between ecological processes, the soil carbon sponge and the hydrological cycle.
Key concepts
The hydrologic cycle describes the continuous movement of water on, above, and below the surface of the Earth. This flow is altered by ecosystems at numerous points. Transpiration from plants provides the majority of the flow of water to the atmosphere. Water is influenced by vegetative cover as it flows over the land surface, while river channels can be shaped by the vegetation within them. Ecohydrology was developed under the International Hydrological Program of UNESCO.
Ecohydrologists study both terrestrial and aquatic systems. In terrestrial ecosystems (such as forests, deserts, and savannas), the interactions among vegetation, the land surface, the vadose zone, and the groundwater are the main focus. In aquatic ecosystems (such as rivers, streams, lakes, and wetlands), emphasis is placed on how water chemistry, geomorphology, and hydrology affect their structure and function.
Principles
The general aim of ecological hydrology is to decrease ecosystem degradation using concepts that integrate terrestrial and aquatic processes across scales. The principles of ecohydrology are expressed in three sequential components:
Hydrological (Framework): The quantification of the hydrological cycle of a basin, should be a template for functional integration of h |
https://en.wikipedia.org/wiki/Human-to-human%20transmission | Human-to-human transmission (HHT) is an epidemiologic vector, especially relevant when a disease is carried by individuals known as superspreaders. In such cases, the basic reproduction number of the virus, which is the average number of additional people that a single case will infect without any preventative measures, can be as high as 203.9. Interhuman transmission is a synonym for HHT.
The World Health Organization designation of a pandemic hinges on the demonstrable fact that there is sustained HHT in two regions of the world.
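Two textbook consequences of the basic reproduction number can be computed directly, in the simplest model of a well-mixed, fully susceptible population (a deliberate simplification, not a claim about any specific outbreak): case counts grow geometrically by generation, and transmission stalls once a fraction 1 − 1/R0 of the population is immune (the herd immunity threshold).

```python
# Two textbook consequences of the basic reproduction number R0, under the
# simplest assumption of a well-mixed, fully susceptible population:
# geometric growth by generation, and the herd immunity threshold 1 - 1/R0.
def cases_after_generations(r0: float, generations: int) -> float:
    """Expected cumulative new cases in generation n from one index case."""
    return r0 ** generations

def herd_immunity_threshold(r0: float) -> float:
    """Fraction of the population that must be immune to halt spread."""
    return 1 - 1 / r0

print(cases_after_generations(3.0, 4))         # one case seeds 81 in gen 4
print(round(herd_immunity_threshold(3.0), 3))  # 0.667
```

Real outbreaks depart from this picture quickly (immunity accumulates, contact patterns are uneven, superspreaders skew the distribution), which is why the effective reproduction number is tracked separately from R0.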
Synopsis
Relevant pathogens may be viruses, bacteria, or fungi, and they may be spread through breathing, talking, coughing, sneezing, spraying of liquids, toilet flushing or any activities which generate aerosol particles or droplets or generate fomites, such as raising of dust.
A 2007 study showed that influenza virus was still active on stainless steel 24 hours after contamination. Although the virus survives on hands for only about five minutes, frequent contact with contaminated steel surfaces can still transmit infection. Transfer efficiency depends not only on the surface but also on the pathogen type. For example, avian influenza survives on both porous and non-porous materials for 144 hours.
The pathogens may also be transmitted through shared cutlery or improperly sanitized dishes or bed linen. Particularly problematic are poor toilet practices, which open the fecal–oral route. Sexually transmitted diseases are, by definition, spread through human-to-human transmission.
List of HHT diseases
Examples of some HHT diseases are listed below.
measles: vaccine available
mumps: vaccine available
chickenpox: vaccine available
smallpox
bubonic plague: slim but non-zero risk of human-to-human transmission
pneumonic plague: 1910–11 Manchurian plague
tuberculosis
Norovirus
monkeypox
SARS-CoV-1
SARS-CoV-2: vaccine available
MERS
Avian flu
Sexually transmitted infections (STIs) or sexually transmitted diseases (STDs):
Syphilis, aka French pox |
https://en.wikipedia.org/wiki/B%C3%A9zout%27s%20theorem | Bézout's theorem is a statement in algebraic geometry concerning the number of common zeros of n polynomials in n indeterminates. In its original form the theorem states that in general the number of common zeros equals the product of the degrees of the polynomials. It is named after Étienne Bézout.
In some elementary texts, Bézout's theorem refers only to the case of two variables, and asserts that, if two plane algebraic curves of degrees d₁ and d₂ have no component in common, they have d₁d₂ intersection points, counted with their multiplicity, and including points at infinity and points with complex coordinates.
In its modern formulation, the theorem states that, if N is the number of common points over an algebraically closed field of n projective hypersurfaces defined by homogeneous polynomials in n + 1 indeterminates, then N is either infinite, or equals the product of the degrees of the polynomials. Moreover, the finite case occurs almost always.
In the case of two variables and in the case of affine hypersurfaces, if multiplicities and points at infinity are not counted, this theorem provides only an upper bound of the number of points, which is almost always reached. This bound is often referred to as the Bézout bound.
Bézout's theorem is fundamental in computer algebra and effective algebraic geometry, by showing that most problems have a computational complexity that is at least exponential in the number of variables. It follows that in these areas, the best complexity that can be hoped for will occur with algorithms that have a complexity which is polynomial in the Bézout bound.
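The Bézout bound is easy to check numerically in a small case: a circle (degree 2) and a parabola (degree 2) should meet in 2 × 2 = 4 points once complex coordinates are allowed, even though only two intersections are visible in the real plane. The computation below uses that substituting y = x² into the circle reduces the problem to a quadratic in t = x²:

```python
# Numeric check of the Bezout bound: the circle x^2 + y^2 = 1 (degree 2) and
# the parabola y = x^2 (degree 2) meet in exactly 2 * 2 = 4 points over the
# complex numbers; only two of those points are real.
import cmath

# Substituting y = x^2 into the circle gives t^2 + t - 1 = 0 with t = x^2.
t_roots = [(-1 + cmath.sqrt(5)) / 2, (-1 - cmath.sqrt(5)) / 2]

points = []
for t in t_roots:
    x = cmath.sqrt(t)                  # complex square root handles t < 0
    for xx in (x, -x):
        points.append((xx, xx ** 2))   # y = x^2 puts the point on the parabola

assert len(points) == 4
for x, y in points:                    # each point lies on both curves
    assert abs(x ** 2 + y ** 2 - 1) < 1e-9 and abs(y - x ** 2) < 1e-9

real_points = [p for p in points if abs(p[0].imag) < 1e-9]
print(len(points), len(real_points))   # 4 2
```

Counting only the real solutions would suggest the bound fails; it is the passage to an algebraically closed field (plus multiplicities and points at infinity in general) that makes the count exact.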
History
In the case of plane curves, Bézout's theorem was essentially stated by Isaac Newton in his proof of lemma 28 of volume 1 of his Principia in 1687, where he claims that two curves have a number of intersection points given by the product of their degrees.
The general theorem was later published in 1779 in Étienne Bézout's Théorie générale des équations algébriques. He supposed the e |
https://en.wikipedia.org/wiki/Havana%20syndrome | Havana syndrome is a cluster of idiopathic symptoms experienced mostly abroad by U.S. government officials and military personnel. The symptoms range in severity from pain and ringing in the ears to cognitive dysfunction and were first reported in 2016 by U.S. and Canadian embassy staff in Havana, Cuba. Beginning in 2017, more people, including U.S. intelligence and military personnel and their families, reported having these symptoms in other places, such as China, New Delhi, India, Europe, and Washington, D.C. The U.S. Department of State, Department of Defense, and other federal entities have called the events "Anomalous Health Incidents" (AHI). Of over a thousand purported cases, the majority of US investigative bodies found only a few dozen cases to be suspicious.
Once the story became public, various U.S. government representatives attributed the incidents to attacks by unidentified foreign actors, and various U.S. officials blamed the reported symptoms on a variety of unidentified and unknown technologies, including ultrasound and microwave weapons. Other possibilities such as pesticides and other toxins were also raised, but all suggested causes were speculative as no undisputed evidence was discovered. As the story developed, and the U.S. intelligence services could not determine the cause of the symptoms, U.S. intelligence and government officials expressed suspicions to the press that Russian military intelligence was responsible. Due to the lack of evidence, pattern of reports, and spread to numerous locations, some scientists promoted the alternate hypothesis of mass psychogenic illness as the true cause of the cases.
In January 2022, the Central Intelligence Agency issued an interim assessment concluding that the syndrome is not the result of "a sustained global campaign by a hostile power". Foreign involvement was ruled out in 976 cases of the 1,000 reviewed. In February 2022, a panel of experts assembled by the Biden administration released an exe |
https://en.wikipedia.org/wiki/Lichenin | Lichenin, also known as lichenan or moss starch, is a complex glucan occurring in certain species of lichens. It can be extracted from Cetraria islandica (Iceland moss). It has been studied since about 1957.
Structure
Chemically, lichenin is a mixed-linkage glucan, consisting of repeating glucose units linked by β-1,3 and β-1,4 glycosidic bonds.
Uses
It is an important carbohydrate for reindeers and northern flying squirrels, which eat the lichen Bryoria fremontii.
It can be extracted by digesting Iceland moss in a cold, weak solution of carbonate of soda for some time, and then boiling. By this process the lichenin is dissolved and on cooling separates as a colorless jelly. Iodine imparts no color to it.
Other uses of the name
In his 1960 novel Trouble with Lichen, John Wyndham gives the name Lichenin to a biochemical extract of lichen used to extend life expectancy beyond 300 years. |
https://en.wikipedia.org/wiki/G%C3%B6del%2C%20Escher%2C%20Bach | Gödel, Escher, Bach: an Eternal Golden Braid, also known as GEB, is a 1979 book by Douglas Hofstadter.
By exploring common themes in the lives and works of logician Kurt Gödel, artist M. C. Escher, and composer Johann Sebastian Bach, the book expounds concepts fundamental to mathematics, symmetry, and intelligence. Through short stories, illustrations, and analysis, the book discusses how systems can acquire meaningful context despite being made of "meaningless" elements. It also discusses self-reference and formal rules, isomorphism, what it means to communicate, how knowledge can be represented and stored, the methods and limitations of symbolic representation, and even the fundamental notion of "meaning" itself.
In response to confusion over the book's theme, Hofstadter emphasized that Gödel, Escher, Bach is not about the relationships of mathematics, art, and music—but rather about how cognition emerges from hidden neurological mechanisms. One point in the book presents an analogy about how individual neurons in the brain coordinate to create a unified sense of a coherent mind by comparing it to the social organization displayed in a colony of ants.
Gödel, Escher, Bach won the Pulitzer Prize for general non-fiction and the National Book Award for Science Hardcover. Despite the success of the book, Hofstadter felt that audiences did not adequately grasp what he felt was the main idea of the book: strange loops. In an attempt to remedy this, he published I Am a Strange Loop in 2007.
Structure
Gödel, Escher, Bach takes the form of interweaving narratives. The main chapters alternate with dialogues between imaginary characters, usually Achilles and the tortoise, first used by Zeno of Elea and later by Lewis Carroll in "What the Tortoise Said to Achilles". These origins are related in the first two dialogues, and later ones introduce new characters such as the Crab. These narratives frequently dip into self-reference and metafiction.
Word play also features pro |
https://en.wikipedia.org/wiki/Acclimatization | Acclimatization or acclimatisation (also called acclimation or acclimatation) is the process in which an individual organism adjusts to a change in its environment (such as a change in altitude, temperature, humidity, photoperiod, or pH), allowing it to maintain fitness across a range of environmental conditions. Acclimatization occurs in a short period of time (hours to weeks), and within the organism's lifetime (compared to adaptation, which is evolution, taking place over many generations). This may be a discrete occurrence (for example, when mountaineers acclimate to high altitude over hours or days) or may instead represent part of a periodic cycle, such as a mammal shedding heavy winter fur in favor of a lighter summer coat. Organisms can adjust their morphological, behavioral, physical, and/or biochemical traits in response to changes in their environment. While the capacity to acclimate to novel environments has been well documented in thousands of species, researchers still know very little about how and why organisms acclimate the way that they do.
Names
The nouns acclimatization and acclimation (and the corresponding verbs acclimatize and acclimate) are widely regarded as synonymous, both in general vocabulary and in medical vocabulary. The synonym acclimatation is less commonly encountered, and fewer dictionaries enter it.
Methods
Biochemical
In order to maintain performance across a range of environmental conditions, there are several strategies organisms use to acclimate. In response to changes in temperature, organisms can change the biochemistry of cell membranes, making them more fluid in cold temperatures and less fluid in warm temperatures, by increasing the number of membrane proteins. In response to certain stressors, some organisms express so-called heat shock proteins that act as molecular chaperones and reduce denaturation by guiding the folding and refolding of proteins. It has been shown that organisms which are acclimated to high or low t
https://en.wikipedia.org/wiki/Immunoelectrophoresis | Immunoelectrophoresis is a general name for a number of biochemical methods for separation and characterization of proteins based on electrophoresis and reaction with antibodies. All variants of immunoelectrophoresis require immunoglobulins, also known as antibodies, reacting with the proteins to be separated or characterized. The methods were developed and used extensively during the second half of the 20th century. In somewhat chronological order: Immunoelectrophoretic analysis (one-dimensional immunoelectrophoresis ad modum Grabar), crossed immunoelectrophoresis (two-dimensional quantitative immunoelectrophoresis ad modum Clarke and Freeman or ad modum Laurell), rocket-immunoelectrophoresis (one-dimensional quantitative immunoelectrophoresis ad modum Laurell), fused rocket immunoelectrophoresis ad modum Svendsen and Harboe, affinity immunoelectrophoresis ad modum Bøg-Hansen.
Methods
Immunoelectrophoresis is a general term describing many combinations of the principles of electrophoresis and reaction of antibodies, also known as immunodiffusion.
Agarose, as 1% gel slabs of about 1 mm thickness buffered at high pH (around 8.6), is traditionally preferred for electrophoresis and the reaction with antibodies. The agarose was chosen as the gel matrix because it has large pores allowing free passage and separation of proteins, but provides an anchor for the immunoprecipitates of protein and specific antibodies. The high pH was chosen because antibodies are practically immobile at high pH. Electrophoresis equipment with a horizontal cooling plate was normally recommended for the electrophoresis.
Immunoprecipitates are visible in the wet agarose gel, but are stained with protein stains like Coomassie brilliant blue in the dried gel. In contrast to SDS-gel electrophoresis, the electrophoresis in agarose allows native conditions, preserving the native structure and activities of the proteins under investigation, therefore immunoelectrophoresis allows characterization of e |
https://en.wikipedia.org/wiki/Archodontes | Archodontes is a genus of root-boring beetles in the family Cerambycidae. It is monotypic, being represented by the single species Archodontes melanopus. It is endemic to Central America and the south-eastern United States, and bores the roots of oaks and other hardwoods.
Description
Head short and black. Mandibles short. Antennae dark brown, almost black; shorter than the insect. The thorax broad, rough and black, margined on the posterior and anterior edges; having many small sharp spines on its sides, the two last of which are larger than the rest, and having two tubercles on the upper side. Elytra dark brown, almost black, margined on the sides and suture, with a small spine on each, at the extremities, and extending a little beyond the anus. Abdomen smooth and shining, and of a dark brown colour, nearly black. Sides of the breast hairy. Legs dark brown, almost black, smooth and shining, with three small tibial spurs. Length (including mandibles) 2¼ inches (57 mm). |
https://en.wikipedia.org/wiki/Microscopic%20reversibility | The principle of microscopic reversibility in physics and chemistry is twofold:
First, it states that the microscopic detailed dynamics of particles and fields is time-reversible because the microscopic equations of motion are symmetric with respect to inversion in time (T-symmetry);
Second, it relates to the statistical description of the kinetics of macroscopic or mesoscopic systems as an ensemble of elementary processes: collisions, elementary transitions or reactions. For these processes, the consequence of the microscopic T-symmetry is: Corresponding to every individual process there is a reverse process, and in a state of equilibrium the average rate of every process is equal to the average rate of its reverse process.
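This consequence can be stated compactly. The following is a standard formulation (the notation w_r± for the average forward/reverse rates of an elementary process r is introduced here, not taken from the text above):

```latex
% Detailed balance at equilibrium: for every elementary process r,
%   forward:  A_r -> B_r   with average rate  w_r^{+}
%   reverse:  B_r -> A_r   with average rate  w_r^{-}
% microscopic T-symmetry implies, in the equilibrium state,
\[
  w_r^{+} = w_r^{-} \qquad \text{for every elementary process } r .
\]
% For mass-action chemical kinetics with rate constants k_r^{\pm},
% reactant/product stoichiometric coefficients \alpha_{ri}, \beta_{ri},
% and equilibrium concentrations c_i, this reads
\[
  k_r^{+} \prod_i c_i^{\alpha_{ri}} \;=\; k_r^{-} \prod_i c_i^{\beta_{ri}} .
\]
```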
History of microscopic reversibility
The idea of microscopic reversibility was born together with physical kinetics. In 1872, Ludwig Boltzmann represented the kinetics of gases as a statistical ensemble of elementary collisions. The equations of mechanics are reversible in time; hence, the reverse collisions obey the same laws. This reversibility of collisions is the first example of microreversibility. According to Boltzmann, this microreversibility implies the principle of detailed balance for collisions: in the equilibrium ensemble each collision is equilibrated by its reverse collision. These ideas of Boltzmann were analyzed in detail and generalized by Richard C. Tolman.
In chemistry, J. H. van't Hoff (1884) came up with the idea that equilibrium has a dynamical nature and is a result of the balance between the forward and backward reaction rates. He did not study reaction mechanisms with many elementary reactions and could not formulate the principle of detailed balance for complex reactions. In 1901, Rudolf Wegscheider introduced the principle of detailed balance for complex chemical reactions. He found that for a complex reaction the principle of detailed balance implies important and non-trivial relations between reaction rate constants for dif
https://en.wikipedia.org/wiki/Pulse-cross | A pulse-cross is the representation of normally invisible portions of an analog video signal on a studio screen for error analysis.
Only part of the video signal contains image information: with an analog 625-line video signal, each field lasts 20 ms. Of these, only 18.4 ms (287.5 lines) are provided for image information; the remaining 1.6 ms (25 lines) form the vertical blanking interval, a time reserved for the vertical retrace of the electron beam. Likewise, of the 64 μs of each line, only 52 μs contain image information; the remaining 12 μs form the horizontal blanking interval for the horizontal retrace. These blanking intervals are thus outside the picture.
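The figures in this paragraph are internally consistent, as a quick calculation shows (constants copied from the text; the variable names are ours):

```python
# Blanking-interval arithmetic for a 625-line analog video signal.
LINE_US = 64.0            # duration of one line in microseconds
ACTIVE_LINE_US = 52.0     # picture portion of a line
LINES_PER_FIELD = 312.5   # 625 interlaced lines shared by two fields
ACTIVE_LINES = 287.5      # lines carrying picture information per field

field_ms = LINES_PER_FIELD * LINE_US / 1000.0   # 20.0 ms per field
active_ms = ACTIVE_LINES * LINE_US / 1000.0     # 18.4 ms of picture
vblank_lines = LINES_PER_FIELD - ACTIVE_LINES   # 25 blanking lines
vblank_ms = vblank_lines * LINE_US / 1000.0     # 1.6 ms vertical blanking
hblank_us = LINE_US - ACTIVE_LINE_US            # 12 us horizontal blanking
```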
A pulse-cross circuit delays synchronization in the monitor, shifting the image to the left or up. This will reveal areas of the video signal that are usually outside the image. In addition, the circuit reduces the contrast of the image, so that the sync pulses are also shown, where the voltage is below the black level.
In a standard-compliant video signal, the vertical synchronization signal consists of five long pulses of 59.3 μs duration and is framed by five 2.35 μs short pulses before and after, the pre-equalizing pulses. Home computers, game consoles, etc. often do not provide a standard-compliant signal: the sync signal contains no gaps and no pre-equalizing pulses. A pulse-cross circuit will reveal these inaccuracies.
For PAL color coding, the colorburst signal can be seen in the form of an orange bar. For a pure black and white signal, the burst is missing.
Furthermore, the timing error of a video recorder can be recognized in the pulse-cross display. Switching of the video heads will cause a single line in the image to be either too long or too short. This error can be corrected by a time base corrector.
Literature
Ru van Wezel, Video-Handbuch (German language)
Martin Hinner, PAL video timing specification
Television technology
Test equipment |
https://en.wikipedia.org/wiki/AMD%20K6-2 | The K6-2 is an x86 microprocessor introduced by AMD on May 28, 1998, and available in speeds ranging from 266 to 550 MHz. An enhancement of the original K6, the K6-2 introduced AMD's 3DNow! SIMD instruction set and an upgraded system-bus interface called Super Socket 7, which was backward compatible with older Socket 7 motherboards. It was manufactured using a 250 nanometer process, ran at 2.2 volts, and had 9.3 million transistors.
History
The K6-2 was designed as a competitor to Intel's flagship processor, the significantly more expensive Pentium II. Performance of the two chips was similar: the previous K6 tended to be faster for general-purpose computing, while the Intel part was faster in x87 floating-point applications. To battle the Pentium II's dominance in floating-point calculations, the K6-2 was the first CPU to introduce a floating-point SIMD instruction set (dubbed 3DNow! by AMD), which significantly boosted performance. However, programs needed to be specifically tailored for the new instructions, and despite beating Intel's SSE instruction set to market, 3DNow! achieved only limited popularity.
Super Socket 7, which increased the processor bus from 66 MHz to 100 MHz, allowed the K6-2 to withstand the effects of ever-increasing CPU multipliers fairly gracefully and in later life it remained surprisingly competitive. Nearly all K6-2s were designed to use 100 MHz Super Socket 7 mainboards, allowing the system-bus to keep pace with the K6-2's clock-frequency.
The K6-2 was a very financially successful chip and enabled AMD to earn the revenue it would need to introduce the forthcoming Athlon. The introductory K6-2 300 was by far the best-selling variant. It rapidly established an excellent reputation in the marketplace and offered a favorable price/performance ratio versus Intel's Celeron 300A. While the K6-2 had mediocre floating-point performance compared to the Celeron, it offered faster system RAM access (courtesy of the Super 7 mainboard), as well as |
https://en.wikipedia.org/wiki/Arkitex | Arkitex was an all-plastic construction toy produced by Tri-ang from c.1959 to c.1965. It was available in 1/42 and OO scales. The toy was designed by Geoff Bailey. One of the advantages that the toy offered was that changes could be made to a partially-built structure without dismantling major sections.
Designer Peter Allen said, "The invention of Arkitex was not in response to Lego, which was nowhere near as significant in the 1950s as it is today, but to Chad Valley's Girder and Panel Building Set, which was selling well since its launch in 1957."
The toy was produced in Tri-ang's factory that produced the Spot-On models.
Reception
One contributor to the toy's failure in the marketplace was that it required more patience and care to build successfully.
https://en.wikipedia.org/wiki/Pythagoras%20tree%20%28fractal%29 | The Pythagoras tree is a plane fractal constructed from squares. Invented by the Dutch mathematics teacher Albert E. Bosman in 1942, it is named after the ancient Greek mathematician Pythagoras because each triple of touching squares encloses a right triangle, in a configuration traditionally used to depict the Pythagorean theorem.
If the largest square has a size of L × L, the entire Pythagoras tree fits snugly inside a box of size 6L × 4L. The finer details of the tree resemble the Lévy C curve.
Construction
The construction of the Pythagoras tree begins with a square. Upon this square are constructed two squares, each scaled down by a linear factor of ½√2 (that is, 1/√2), such that the corners of the squares coincide pairwise. The same procedure is then applied recursively to the two smaller squares, ad infinitum. The illustration below shows the first few iterations in the construction process.
This construction uses the simplest symmetric right triangle. Alternatively, the sides of the triangle can be taken in recursively equal proportions, leading to the sides being proportional to the square root of the inverse golden ratio, and the areas of the squares being in golden-ratio proportion.
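The symmetric construction can be sketched recursively in Python. The following is an illustrative sketch (the tuple layout and function name are our own): each square is recorded as a lower-left corner, a base-edge angle, and a side length, and the two children sit on a right isosceles triangle erected over the parent's top edge, scaled by 1/√2 and rotated by ±45°.

```python
import math

def pythagoras_tree(depth, size=1.0):
    """Return a list of (x, y, angle, side) tuples, one per square.

    Each square is given by its lower-left corner (x, y), the angle of
    its base edge in radians, and its side length.
    """
    squares = []

    def build(x, y, angle, side, level):
        squares.append((x, y, angle, side))
        if level == depth:
            return
        child = side / math.sqrt(2)
        # Top-left corner of the parent square.
        tlx = x - side * math.sin(angle)
        tly = y + side * math.cos(angle)
        # Left child: its base runs from the top-left corner to the apex
        # of the right isosceles triangle on the parent's top edge.
        build(tlx, tly, angle + math.pi / 4, child, level + 1)
        # Apex of that triangle (endpoint of the left child's base).
        ax = tlx + child * math.cos(angle + math.pi / 4)
        ay = tly + child * math.sin(angle + math.pi / 4)
        # Right child: its base runs from the apex to the top-right corner.
        build(ax, ay, angle - math.pi / 4, child, level + 1)

    build(0.0, 0.0, 0.0, float(size), 0)
    return squares
```

Iteration n contributes 2^n squares of side (1/√2)^n, so a tree of depth d contains 2^(d+1) − 1 squares in total.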
Area
Iteration n in the construction adds 2^n squares of area (1/2)^n, for a total added area of 1. Thus the area of the tree might seem to grow without bound in the limit as n → ∞. However, some of the squares overlap starting at the order 5 iteration, and the tree actually has a finite area because it fits inside a 6×4 box.
It can be shown easily that the area A of the Pythagoras tree must be in the range 5 < A < 18, which can be narrowed down further with extra effort. Little seems to be known about the actual value of A.
Varying the angle
An interesting set of variations can be constructed by maintaining an isosceles triangle but changing the base angle (90 degrees for the standard Pythagoras tree). In particular, when the base half-angle is set to (30°) = arcsin(0.5), it is easily seen that the size of the squares remains constan |
https://en.wikipedia.org/wiki/Smack%20%28software%29 | Smack (full name: Simplified Mandatory Access Control Kernel) is a Linux kernel security module that protects data and process interaction from malicious manipulation using a set of custom mandatory access control (MAC) rules, with simplicity as its main design goal. It has been officially merged since the Linux 2.6.25 release, and it was the main access control mechanism for the MeeGo mobile operating system. It is also used to sandbox HTML5 web applications in the Tizen architecture, in the commercial Wind River Linux solutions for embedded device development, in Philips Digital TV products, and in Intel's Ostro OS for IoT devices.
Since 2016, Smack is required in all Automotive Grade Linux (AGL) implementations where it provides in association with other Linux facilities the base for the AGL security framework.
Design
Smack consists of three components:
A kernel module that is implemented as a Linux Security Module. It works best with file systems that support extended attributes.
A startup script that ensures that device files have the correct Smack attributes and loads the Smack configuration.
A set of patches to the GNU Core Utilities package to make it aware of Smack extended file attributes. A set of similar patches to Busybox were also created. SMACK does not require user-space support.
Criticism
Smack has been criticized for being written as a new LSM module instead of an SELinux security policy that could provide equivalent functionality. Such SELinux policies have been proposed, but none has been demonstrated. Smack's author replied that it would not be practical due to SELinux's complicated configuration syntax and the philosophical difference between the Smack and SELinux designs.
https://en.wikipedia.org/wiki/Wirthbacteria | Candidatus Wirthbacteria is a proposed bacterial phylum containing only one known sample from the Crystal Geyser aquifer, Ca. Wirthibacter wanneri. This bacterium stands out in a basal position in some trees of life: it is closely related to the Candidate Phyla Radiation but is not considered part of that clade.
https://en.wikipedia.org/wiki/KCNK16 | Potassium channel subfamily K member 16 is a protein that in humans is encoded by the KCNK16 gene. The protein encoded by this gene, K2P16.1, is a potassium channel containing two pore-forming P domains.
See also
Tandem pore domain potassium channel |
https://en.wikipedia.org/wiki/AssureSign | AssureSign (now Nintex AssureSign) provides electronic signature and digital transaction management (DTM) software that can be deployed as a Software as a service (SaaS) application or installed on customer premises. Documents are signed using a computer that has an Internet connection.
AssureSign electronic signature technology uses patent-pending biometric, handwritten, mouse-based signatures.
AssureSign's tools enable users to integrate with existing environments. AssureSign maintains a partner program. AssureSign has also been integrated with other third-party applications.
AssureSign Electronic Signature Technology was developed in 2006 as an offering by Third Party Verification, Inc. (3PV). AssureSign LLC was formed as a separate company in 2008 by David Brinkman and Dale Combs. AssureSign and 3PV are consistently ranked on Orlando Sentinel's Top 100 Companies for Working Families.
In July 2021, AssureSign was purchased by Nintex. |
https://en.wikipedia.org/wiki/Redshift%20%28software%29 | Redshift is an application that adjusts the computer display's color temperature based upon the time of day. The program is free software, and is intended to reduce eye strain as well as insomnia (see Sleep#Circadian clock and Phase response curve#Light).
Redshift transitions the computer display's color temperature evenly between daytime and night temperatures to allow the user's eyes to slowly adapt. At night, the color temperature is low and is typically 3000–4000 K, preferably matching the room's lighting temperature. Typical color temperature during the daytime is 5500–6500 K.
Features
Redshift is primarily distributed for and used on the Linux operating system.
Redshift can be used to set a single color temperature and brightness ("one shot mode") or can adjust the temperature and brightness continuously to follow the sun's elevation, in which case it will transition to the night color temperature settings near twilight. The temperature and brightness settings for daytime and night can be user-configured.
To determine the Sun's elevation, the software requires the user's location in the form of latitude and longitude.
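An elevation-driven transition of the kind described above can be sketched as follows. This is an illustrative model only: the twilight thresholds, the linear ramp, and the function name are our assumptions, not Redshift's actual algorithm (the temperature endpoints match the typical ranges quoted earlier).

```python
def color_temperature(elevation_deg, night_k=3500, day_k=6500,
                      twilight_lo=-6.0, twilight_hi=3.0):
    """Interpolate display color temperature from solar elevation.

    Below `twilight_lo` degrees the night temperature applies, above
    `twilight_hi` the daytime one, with a linear blend in between.
    (Thresholds and the linear ramp are illustrative assumptions.)
    """
    if elevation_deg <= twilight_lo:
        return float(night_k)
    if elevation_deg >= twilight_hi:
        return float(day_k)
    frac = (elevation_deg - twilight_lo) / (twilight_hi - twilight_lo)
    return night_k + frac * (day_k - night_k)
```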
On Linux and BSD operating systems, Redshift supports multiple monitors through the X extensions RandR (preferred) or VidMode, or through the Direct Rendering Manager. Because Redshift can only be configured to use the same gamma correction on all monitors it controls, it is usually desirable to run one instance of the program per monitor.
Interfaces
Redshift originally possessed only a command-line interface, but now has graphical user interfaces (GUIs) that support most Linux desktop environments. Those GUIs include redshift-gtk, redshift-plasmoid, and nightshift.
redshift-gtk is included in Redshift's source tree. In addition to a windowed interface, it provides a tray status icon that can enable or disable Redshift or adjust the screen's color temperature automatically.
Redshift can be opened with the use of terminal, panel launchers or |
https://en.wikipedia.org/wiki/Caprine%20alphaherpesvirus%201 | Caprine alphaherpesvirus 1 (CpHV-1) is a species of virus known to infect goats worldwide. It has been shown to produce systemic and respiratory symptoms in kids and to cause abortions in adult goats.
The virus is in the genus Varicellovirus, subfamily Alphaherpesvirinae, family Herpesviridae, and order Herpesvirales.
It is less studied than some of the other ruminant viruses in the genus Varicellovirus including Bovine alphaherpesvirus 1 (BoHV-1), and research conflicts about its level of species specificity and its ability to cause bovine infection and illness. Like many other members of its genus, CpHV-1 has been shown to replicate in the respiratory tract and to cause latent infections. There is not currently a vaccine. |
https://en.wikipedia.org/wiki/Quadratic%20set | In mathematics, a quadratic set is a set of points in a projective space that bears the same essential incidence properties as a quadric (conic section in a projective plane, sphere or cone or hyperboloid in a projective space).
Definition of a quadratic set
Let P be a projective space. A quadratic set is a non-empty subset Q of P for which the following two conditions hold:
(QS1) Every line g of P intersects Q in at most two points or is contained in Q.
(A line g is called exterior to Q if g ∩ Q is empty, tangent to Q if either g ∩ Q is a single point or g is contained in Q, and secant to Q if g ∩ Q consists of exactly two points.)
(QS2) For any point p of Q, the union Q_p of all tangent lines through p is a hyperplane or the entire space P.
A quadratic set Q is called non-degenerate if for every point p of Q, the set Q_p is a hyperplane.
A Pappian projective space is a projective space in which Pappus's hexagon theorem holds.
The following result, due to Francis Buekenhout, is an astonishing statement for finite projective spaces.
Theorem: Let P be a finite projective space of dimension at least 3 and Q a non-degenerate quadratic set that contains lines. Then P is Pappian and Q is a quadric with index at least 2.
Definition of an oval and an ovoid
Ovals and ovoids are special quadratic sets:
Let P be a projective space of dimension at least 2. A non-degenerate quadratic set O that does not contain lines is called an ovoid (or an oval in the plane case).
The following equivalent definitions of an oval/ovoid are more common:
Definition: (oval)
A non-empty point set o of a projective plane is called an
oval if the following properties are fulfilled:
(o1) Any line meets o in at most two points.
(o2) For any point p in o there is one and only one line g such that g ∩ o = {p}.
A line g is an exterior, tangent, or secant line of the
oval if g ∩ o is empty, a single point, or a pair of points, respectively.
For finite planes the following theorem provides a simpler definition.
Theorem: (oval in a finite plane) Let P be a projective plane of order n.
A set o of points is an oval if |o| = n + 1 and if no three points
of o are collinear.
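The finite-plane characterization of an oval (n + 1 points, no three collinear) can be checked computationally. The sketch below uses the Fano plane, the projective plane of order 2; the line list and the helper function are our own illustration, not from the text.

```python
# The seven lines of the Fano plane PG(2,2), the projective plane of
# order 2, with points labelled 1..7.
FANO_LINES = [{1, 2, 3}, {1, 4, 5}, {1, 6, 7}, {2, 4, 6},
              {2, 5, 7}, {3, 4, 7}, {3, 5, 6}]

def is_oval(points, lines, order):
    """Oval test in a finite plane: order + 1 points, no 3 collinear."""
    if len(points) != order + 1:
        return False
    # No three points collinear <=> every line meets the set in <= 2 points.
    return all(len(points & line) <= 2 for line in lines)
```

For example, {1, 2, 4} has 3 = 2 + 1 points and no Fano line contains all three, so it is an oval, whereas {1, 2, 3} is itself a line and is not.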
According to this theorem of Beniamino Segre, for Pappian projective p |
https://en.wikipedia.org/wiki/Attribute-oriented%20programming | Attribute-oriented programming (@OP) is a technique for embedding metadata, namely attributes, within program code.
Attribute-oriented programming in various languages
Java
With the inclusion of Metadata Facility for Java (JSR-175) into the J2SE 5.0 release it is possible to utilize attribute-oriented programming right out of the box.
XDoclet library makes it possible to use attribute-oriented programming approach in earlier versions of Java.
C#
The C# language has supported attributes from its very first release. These attributes are used to give run-time information and are not used by a preprocessor. Currently, with source generators, attributes can be used to drive the generation of additional code at compile time.
UML
The Unified Modeling Language (UML) supports a kind of attribute called stereotypes.
Hack
The Hack programming language supports attributes. Attributes can be attached to various program entities, and information about those attributes can be retrieved at run-time via reflection.
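As an illustration in a language not covered above, Python can emulate attribute-oriented programming with decorators that attach metadata to program entities for later discovery by reflection (analogous to Java annotations read via java.lang.reflect). The metadata keys ("route", "method") and helper functions below are invented for this sketch.

```python
def attribute(**metadata):
    """Attach arbitrary key/value metadata to the decorated callable."""
    def decorate(func):
        func.__attributes__ = metadata
        return func
    return decorate

@attribute(route="/users", method="GET")   # illustrative metadata
def list_users():
    return ["alice", "bob"]

def collect_routes(*funcs):
    """Reflection step: gather (route, function) pairs from metadata."""
    return {f.__attributes__["route"]: f
            for f in funcs if hasattr(f, "__attributes__")}

routes = collect_routes(list_users)
```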
Tools
Annotation Processing Tool (apt)
Spoon, an Annotation-Driven Java Program Transformer
XDoclet, a Javadoc-Driven Program Generator |
https://en.wikipedia.org/wiki/HippoDraw | HippoDraw is an object-oriented statistical data analysis package written in C++, with user interaction via a Qt-based GUI and a Python-scriptable interface. It was developed by Paul Kunz at SLAC, primarily for the analysis and presentation of particle physics and astrophysics data, but can be equally well used in other fields where data handling is important.
About
HippoDraw can read and write files in an XML-based format, astrophysics FITS files, data objects produced by ROOT (optional), and through the Python bindings, anything that can be read/written by Python (HDF5, for instance, with PyTables).
HippoDraw can be used as a Python extension module, allowing users to use HippoDraw data objects with the full power of the Python language. This includes other scientific Python extension modules such as Numeric and numarray, whose use with HippoDraw can lead to a large increase in processing speed, even for ROOT objects.
See also
Java Analysis Studio (JAS)
ROOT
AIDA |
https://en.wikipedia.org/wiki/Alfred%20Marshall%20Bailey | Alfred Marshall Bailey (February 18, 1894 – February 25, 1978) was an American ornithologist who was associated with the Denver Museum of Natural History (now the Denver Museum of Nature and Science)
in Colorado for most of his working life.
Early years
Bailey was born in Iowa City, Iowa, where he went to school and then attended the University of Iowa. While a student there he participated in a three-month scientific expedition to Laysan, one of the Northwestern Hawaiian Islands.
Career
After graduation in 1916, Bailey served as curator of birds and mammals at the Louisiana State Museum in New Orleans (1916–1919). From 1919 to 1921 he was involved in surveying south-eastern Alaska for the Bureau of Biological Survey (later to become the United States Fish and Wildlife Service), followed by a curatorial stint at the Denver Museum (1921–1926). From 1926 to 1927 he was on the staff of the Field Museum of Natural History in Chicago, during which period he took part in an expedition to the Semien Mountains of Ethiopia. From 1927 to 1936 he was Director of the Chicago Academy of Sciences.
Denver Museum
Bailey returned to the Denver Museum as Director in 1936, a position he served in for over thirty years, eventually retiring in 1969 at the age of 75. He was a proponent of fieldwork, over the years leading or taking part in several further expeditions to various parts of the world, including the Arctic, Siberia, Mexico, Pacific islands, and New Zealand's subantarctic Campbell Island. He was also a popularizer of science and a skilled photographer, producing the Denver "Museum Pictorial" series of booklets, and contributing articles to magazines such as National Geographic and Natural History.
Honours
Formal recognition of Bailey's achievements include:
1941 – Fellowship of the American Ornithologists' Union
1944 – Doctor of Science, Norwich University
1954 – Doctor of Public Service, University of Denver
1961 – Malcolm Glenn Wyer Award for distinguished service |
https://en.wikipedia.org/wiki/Source%20unfolding | In computational geometry, the source unfolding of a convex polyhedron is a net obtained by cutting the polyhedron along the cut locus of a point on the surface of the polyhedron. The cut locus of a point consists of all points on the surface that have two or more shortest geodesics to that point. For every convex polyhedron, and every choice of the point on its surface, cutting the polyhedron on the cut locus will produce a result that can be unfolded into a flat plane, producing the source unfolding. The resulting net may, however, cut across some of the faces of the polyhedron rather than only cutting along its edges.
The source unfolding can also be continuously transformed from the polyhedron to its flat net, keeping flat the parts of the net that do not lie along edges of the polyhedron, as a blooming of the polyhedron. The unfolded shape of the source unfolding is always a star-shaped polygon, with all of its points visible by straight line segments from the image of the source point; this is in contrast to the star unfolding, a different method for producing nets that does not always produce star-shaped polygons.
An analogous unfolding method can be applied to any higher-dimensional convex polytope, cutting the surface of the polytope into a net that can be unfolded into a flat hyperplane. |
https://en.wikipedia.org/wiki/Howard%20the%20Duck%20%28video%20game%29 | Howard the Duck, also known as Howard the Duck: Adventure on Volcano Island, is an action video game released in 1986 by Activision for the ZX Spectrum, Commodore 64 and Apple II. The game is a tie-in to the film Howard the Duck from the same year.
Gameplay
The game involves players controlling Howard the Duck to save his best friends, Phil and Beverly. After being parachuted onto Volcano Island, Howard needs to find a backpack to proceed with the search. The game then consists of four levels, in the last of which Howard, armed with a neutron gun, will finally face Overlord.
Reception
Like the film, the game received fairly negative reviews. Computer Gamer gave it an overall 55%, stating "beautifully presented, and well-programmed, it rates as one of Activision's better recent releases and deserves consideration outside its unfortunate tie-in", while predicting low longevity. Aktueller Software Markt described the game as not fulfilling expectations and not worth the money.
https://en.wikipedia.org/wiki/Picard%E2%80%93Lefschetz%20theory | In mathematics, Picard–Lefschetz theory studies the topology of a complex manifold by looking at the critical points of a holomorphic function on the manifold. It was introduced by Émile Picard for complex surfaces in his book, and extended to higher dimensions by Lefschetz. It is a complex analog of Morse theory that studies the topology of a real manifold by looking at the critical points of a real function. Picard–Lefschetz theory was later extended to varieties over more general fields, and Deligne used this generalization in his proof of the Weil conjectures.
Picard–Lefschetz formula
The Picard–Lefschetz formula describes the monodromy at a critical point.
Suppose that f is a holomorphic map from a (k+1)-dimensional projective complex manifold to the projective line P1. Also suppose that all critical points are non-degenerate and lie in different fibers, and have images x1,...,xn in P1. Pick any other point x in P1. The fundamental group π1(P1 – {x1, ..., xn}, x) is generated by loops wi going around the points xi, and to each point xi there corresponds a vanishing cycle in the homology Hk(Yx) of the fiber Yx at x. Note that this is the middle homology since the fibre has complex dimension k, hence real dimension 2k.
The monodromy action of π1(P1 – {x1, ..., xn}, x) on Hk(Yx) is described as follows by the Picard–Lefschetz formula. (The action of monodromy on other homology groups is trivial.) The monodromy action of a generator wi of the fundamental group on an element a ∈ Hk(Yx) is given by
wi(a) = a + (−1)^((k+1)(k+2)/2) ⟨a, δi⟩ δi
where δi is the vanishing cycle of xi. This formula appears implicitly for k = 2 (without the explicit coefficients of the vanishing cycles δi) in Picard's work; Lefschetz gave the explicit formula in all dimensions.
Example
Consider the projective family of hyperelliptic curves of genus defined by
where is the parameter and . Then, this family has double-point degenerations whenever . Since the curve is a connected sum of tori, the intersection form on of a generic curve is the matrix
we can easily compute the Picard- |
https://en.wikipedia.org/wiki/Covermount | Covermount (sometimes written cover mount) is the name given to storage media (containing software and/or audiovisual media) or other products (ranging from toys to flip-flops) packaged as part of a magazine or newspaper. The name comes from the method of packaging; the media or product is placed in a transparent plastic sleeve and mounted on the cover of the magazine with adhesive tape or glue.
History
Audio recordings were distributed in the UK by the use of covermounts in the 1960s by the fortnightly satirical magazine Private Eye though the term "covermount" was not in usage at that time. The Private Eye recordings were pressed onto 7" floppy vinyl (known as "flexi-discs" and "flimsies") and mounted on to the front of the magazine. The weekly pop music paper NME issued audio recordings of rock music on similar 7" flexi-discs as covermounts in the 1970s.
The covermount practice continued with computer magazines in the early era of home computers. In the United Kingdom, computer hobbyist magazines began distributing tapes and later floppy disks with their publications. These disks included demo and shareware versions of games, applications, computer drivers, operating systems, computer wallpapers and other (usually free) content. One of the first games to be distributed as a covermount was The Thompson Twins Adventure in 1984.
Most magazines backed by large publishers, like Linux Format, included a covermount CD or DVD with a Linux distribution and other open-source applications. The distribution of discs with source programs was also common in programming magazines: while the printed version had the code explained, the disk had the code ready to be compiled without forcing the reader to type the whole listing into the computer by hand.
In November 2015, The MagPi magazine brought the concept full circle by attaching a free Raspberry Pi Zero to the cover, the first full computer to be included as a covermount on a magazine.
In other places, such as Finl |
https://en.wikipedia.org/wiki/Abductor%20pollicis%20brevis%20muscle | The abductor pollicis brevis is a muscle in the hand that functions as an abductor of the thumb.
Structure
The abductor pollicis brevis is a flat, thin muscle located just under the skin. It is a thenar muscle, and therefore contributes to the bulk of the palm's thenar eminence.
It originates from the flexor retinaculum of the hand, the tubercle of the scaphoid bone, and sometimes also from the tubercle of the trapezium.
Running lateralward and downward, it is inserted by a thin, flat tendon into the lateral side of the base of the first phalanx of the thumb, and the capsule of the metacarpophalangeal joint.
Nerve supply
The abductor pollicis brevis is supplied by the recurrent branch of the median nerve (Roots C8-T1).
Function
Abduction of the thumb is defined as the movement of the thumb anteriorly, a direction perpendicular to the palm. The abductor pollicis brevis does this by acting across both the carpometacarpal joint and the metacarpophalangeal joint.
It also assists in opposition and extension of the thumb.
Additional images |
https://en.wikipedia.org/wiki/Matutinal | Matutinal, matinal (in entomological writings), and matutine are terms used in the life sciences to indicate something of, relating to, or occurring in the early morning. The term may describe crepuscular animals that are significantly active during the predawn or early morning hours. During the morning twilight period and shortly thereafter, these animals partake in important tasks, such as scanning for mates, mating, and foraging.
Matutinal behaviour is thought to be adaptive because there may be less competition between species, and sometimes even a higher prevalence of food, during these hours. It may also serve as an anti-predator adaptation, allowing animals to avoid the dangers that can accompany peak diurnal and nocturnal activity.
Etymology
The word matutinal is derived from the Latin word , meaning "of or pertaining to the morning", from Mātūta, the Roman goddess of the morning or dawn (+ -īnus '-ine' + -ālis '-al').
Adaptive relevance
Selection pressures, such as high predatory activity or low food availability, may require animals to change their behaviours to adapt. An animal changing the time of day at which it carries out significant tasks (e.g., mating and/or foraging) is recognized as one of these adaptive behaviours. For example, human activity, which is more predominant during daylight hours, has forced certain species (most often larger mammals) living in urban areas to shift their schedules to crepuscular ones. When observed in environments where there is little or no human activity, these same species often do not exhibit this temporal shift. It may be argued that if the goal is to avoid human activity, or any other diurnal predator's activity, a nocturnal schedule would be safer. However, many of these animals depend on sight, so a matutinal or crepuscular schedule is especially advantageous as it allows animals to both avoid predation, and have sufficient light to mate and forage.
Matutinal mating
For certain species, commencing mating |
https://en.wikipedia.org/wiki/GemStone/S | GemStone/S is computer software, an application framework that was first available for the programming language Smalltalk as an object database. It is proprietary commercial software.
Company history
GemStone Systems was founded on March 1, 1982, as Servio Logic, to build a database machine based on a set-theory model. Ian Huang instigated the founding, as technology adviser to the CEO of Sampoerna Holdings (Putera Sampoerna), by recruiting the following team:
Frank Bouton - President, who was the cofounder of Floating Point Systems Inc
Dr. Michael Mulder - Vice President of Engineering, who was the Group Manager for Advanced Processor Design at Sperry Univac and Principal Architect for the Univac 1180 mainframe
Steve Ivy - Vice President of Operation, who was a senior manager at Tektronix
Leonard Yuen - Vice President, Business Development, who was the Development Manager for the IBM DB2 database
Dr. George Copeland - Chief Architect, who was the Senior Staff Engineer at the Advanced Development Group in Tektronix
Steve Redfield - Chief Engineer, who was the Chief Engineer for the Intel 80286 microprocessor
Alan Purdy - who was a Staff Engineer at Tektronix
Bob Bretl - who was a software engineering manager at Tektronix Signal Processing Systems
Allen Otis - who was also with Tektronix
John Telford - who was a software engineering manager from Electro Scientific Industries
Monty Williams
Servio Logic was renamed GemStone Systems, Inc. in June 1995. The firm developed its first hardware prototype in 1982, and shipped its first software product (GemStone 1.0) in 1986. The engineering group resides in Beaverton, Oregon. Three of the original cofounding engineers, Bob Bretl, Allen Otis, and Monty Williams (now retired), have been with the firm since its start.
GemStone's owners pioneered implementing distributed computing in business systems. Many information system features now associated with Java EE were implemented earlier in GemSt |
https://en.wikipedia.org/wiki/Nested%20function | In computer programming, a nested function (or nested procedure or subroutine) is a function which is defined within another function, the enclosing function. Due to simple recursive scope rules, a nested function is itself invisible outside of its immediately enclosing function, but can see (access) all local objects (data, functions, types, etc.) of its immediately enclosing function as well as of any function(s) which, in turn, encloses that function. The nesting is theoretically possible to unlimited depth, although only a few levels are normally used in practical programs.
Nested functions are used in many approaches to structured programming, including early ones, such as ALGOL, Simula 67 and Pascal, and also in many modern dynamic languages and functional languages. However, they are traditionally not supported in the (originally simple) C-family of languages.
Effects
Nested functions assume function scope or block scope. The scope of a nested function is inside the enclosing function, i.e. inside one of the constituent blocks of that function, which means that it is invisible outside that block and also outside the enclosing function. A nested function can access other local functions, variables, constants, types, classes, etc. that are in the same scope, or in any enclosing scope, without explicit parameter passing, which greatly simplifies passing data into and out of the nested function. This is typically allowed for both reading and writing.
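The scope rules just described can be illustrated in Python, one of the dynamic languages with nested-function support (the `make_counter` example is illustrative, not from the article):

```python
def make_counter(start=0):
    count = start  # local variable of the enclosing function

    def increment(step=1):
        # Nested function: it sees the enclosing local `count` directly,
        # with no explicit parameter passing. `nonlocal` allows writing to it.
        nonlocal count
        count += step
        return count

    # Returning the nested function creates a closure: `count` outlives
    # the call to make_counter that created it.
    return increment

counter = make_counter(10)
print(counter())   # 11
print(counter(5))  # 16
# `increment` itself is invisible here, outside its enclosing function;
# only the returned closure gives access to it.
```

Each call to `make_counter` produces an independent closure over its own `count`, which is exactly the "frame must continue to be alive" behaviour discussed below.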
Nested functions may in certain situations (and languages) lead to the creation of a closure. If it is possible for the nested function to escape the enclosing function, for example if functions are first class objects and a nested function is passed to another function or returned from the enclosing function, then a closure is created and calls to this function can access the environment of the original function. The frame of the immediately enclosing function must continue to be alive until the last referencing |
https://en.wikipedia.org/wiki/Perpendicular%20distance | In geometry, the perpendicular distance between two objects is the distance from one to the other, measured along a line that is perpendicular to one or both.
The distance from a point to a line is the distance to the nearest point on that line; this is the point at which the segment joining it to the given point is perpendicular to the line.
Likewise, the distance from a point to a curve is measured by a line segment that is perpendicular to a tangent line to the curve at the nearest point on the curve.
The distance from a point to a plane is measured as the length from the point along a segment that is perpendicular to the plane, meaning that it is perpendicular to all lines in the plane that pass through the nearest point in the plane to the given point.
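In coordinates, these point-to-line and point-to-plane distances reduce to short closed-form formulas; a Python sketch (function names are illustrative):

```python
import math

def point_to_line_2d(px, py, a, b, c):
    """Perpendicular distance from (px, py) to the line ax + by + c = 0."""
    return abs(a * px + b * py + c) / math.hypot(a, b)

def point_to_plane(p, n, d):
    """Perpendicular distance from point p to the plane n . x = d,
    where n = (nx, ny, nz) is a normal vector of the plane."""
    nx, ny, nz = n
    px, py, pz = p
    norm = math.sqrt(nx * nx + ny * ny + nz * nz)
    return abs(nx * px + ny * py + nz * pz - d) / norm

print(point_to_line_2d(0, 0, 3, 4, -10))        # 2.0: line 3x + 4y = 10
print(point_to_plane((0, 0, 0), (0, 0, 1), 5))  # 5.0: plane z = 5
```

Both formulas divide the absolute value of the plugged-in equation by the length of the normal, which is precisely the projection onto the perpendicular direction.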
Other instances include:
Point on plane closest to origin, for the perpendicular distance from the origin to a plane in three-dimensional space
Nearest distance between skew lines, for the perpendicular distance between two non-parallel lines in three-dimensional space
Perpendicular regression fits a line to data points by minimizing the sum of squared perpendicular distances from the data points to the line.
Other geometric curve fitting methods using perpendicular distance to measure the quality of a fit exist, as in total least squares.
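As a concrete sketch of perpendicular (orthogonal) regression in the plane: the best-fit line passes through the centroid of the data, with direction given by the principal axis of the centered points. The closed-form angle below is standard, but the sketch itself is illustrative rather than from the article:

```python
import math

def tls_line(points):
    """Total-least-squares (perpendicular regression) line fit in 2D.
    Returns (cx, cy, theta): the fitted line passes through the centroid
    (cx, cy) at direction angle theta, minimizing the sum of squared
    perpendicular distances from the points to the line."""
    n = len(points)
    cx = sum(x for x, _ in points) / n
    cy = sum(y for _, y in points) / n
    sxx = sum((x - cx) ** 2 for x, _ in points)
    syy = sum((y - cy) ** 2 for _, y in points)
    sxy = sum((x - cx) * (y - cy) for x, y in points)
    theta = 0.5 * math.atan2(2 * sxy, sxx - syy)  # principal-axis angle
    return cx, cy, theta

# Collinear points on y = 2x recover slope 2 exactly.
cx, cy, theta = tls_line([(0, 0), (1, 2), (2, 4), (3, 6)])
print(math.tan(theta))  # ~2.0
```

Unlike ordinary least squares, which minimizes vertical residuals, this treats both coordinates symmetrically, which is why it is the 2D special case of total least squares.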
The concept of perpendicular distance may be generalized to
orthogonal distance, between more abstract non-geometric orthogonal objects, as in linear algebra (e.g., principal components analysis);
normal distance, involving a surface normal, between an arbitrary point and its foot on the surface. It can be used for surface fitting and for defining offset surfaces.
See also
Distance between sets
Hypercycle (geometry)
Moment of inertia
Signed distance |
https://en.wikipedia.org/wiki/List%20of%20abstract%20algebra%20topics | Abstract algebra is the subject area of mathematics that studies algebraic structures, such as groups, rings, fields, modules, vector spaces, and algebras. The phrase abstract algebra was coined at the turn of the 20th century to distinguish this area from what was normally referred to as algebra, the study of the rules for manipulating formulae and algebraic expressions involving unknowns and real or complex numbers, often now called elementary algebra. The distinction is rarely made in more recent writings.
Basic language
Algebraic structures are defined primarily as sets with operations.
Algebraic structure
Subobjects: subgroup, subring, subalgebra, submodule etc.
Binary operation
Closure of an operation
Associative property
Distributive property
Commutative property
Unary operator
Additive inverse, multiplicative inverse, inverse element
Identity element
Cancellation property
Finitary operation
Arity
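The basic notions above (binary operation, closure, associativity, identity element, inverse element) can be checked mechanically for a small finite structure; a Python sketch, illustrative rather than canonical:

```python
from itertools import product

def is_group(elements, op):
    """Check the group axioms for a finite set under a binary operation:
    closure, associativity, identity element, and inverse elements."""
    elems = list(elements)
    if any(op(a, b) not in elems for a, b in product(elems, repeat=2)):
        return False                       # closure of the operation
    if any(op(op(a, b), c) != op(a, op(b, c))
           for a, b, c in product(elems, repeat=3)):
        return False                       # associative property
    identities = [e for e in elems
                  if all(op(e, a) == a == op(a, e) for a in elems)]
    if not identities:
        return False                       # no identity element
    e = identities[0]
    return all(any(op(a, b) == e == op(b, a) for b in elems)
               for a in elems)             # every element has an inverse

print(is_group(range(4), lambda a, b: (a + b) % 4))  # True: Z/4Z under +
print(is_group(range(4), lambda a, b: (a * b) % 4))  # False: 0, 2 lack inverses
```

Dropping individual checks yields the weaker structures listed below: keeping only closure and associativity tests for a semigroup, adding the identity test gives a monoid.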
Structure preserving maps called homomorphisms are vital in the study of algebraic objects.
Homomorphisms
Kernels and cokernels
Image and coimage
Epimorphisms and monomorphisms
Isomorphisms
Isomorphism theorems
There are several basic ways to combine algebraic objects of the same type to produce a third object of the same type. These constructions are used throughout algebra.
Direct sum
Direct limit
Direct product
Inverse limit
Quotient objects: quotient group, quotient ring, quotient module etc.
Tensor product
Advanced concepts:
Category theory
Category of groups
Category of abelian groups
Category of rings
Category of modules (over a fixed ring)
Morita equivalence, Morita duality
Category of vector spaces
Homological algebra
Filtration (algebra)
Exact sequence
Functor
Zorn's lemma
Semigroups and monoids
Semigroup
Subsemigroup
Free semigroup
Green's relations
Inverse semigroup (or inversion semigroup)
Krohn–Rhodes theory
Semigroup algebra
Transformation semigroup
Monoid
Aperiodic monoid
Free monoid
Monoid (category theory)
Monoid factorisation
Syntacti |
https://en.wikipedia.org/wiki/Theodore%20Holmes%20Bullock | Theodore Holmes Bullock (16 May 1915 – 20 December 2005) was one of the founding fathers of neuroethology. During a career spanning nearly seven decades, this American academic was esteemed both as a pioneering and influential neuroscientist, examining the physiology and evolution of the nervous system across organizational levels, and as a champion of the comparative approach, studying species from nearly all major animal groups—coelenterates, annelids, arthropods, echinoderms, molluscs, and chordates.
Bullock discovered the pit organ in pit vipers and electroreceptors in weakly electric fish, as well as other electrosensory animals. His work on the jamming avoidance response in electric fish (work later carried on by Walter Heiligenberg) is an excellent example of how motor programs are integrated with incoming sensory information when generating a behavior pattern in response to a stimulus.
Bullock appealed to the scientific community to look beyond established paradigms in neuroscience, as well as to consider the ecology of an animal when endeavoring to understand its nervous system. As he once wrote, “Neuroscience is part of biology, more specifically of zoology, and it suffers tunnel vision unless continuous with ethology, ecology, and evolution”.
In his quest to go beyond a descriptive account of the nervous system, Bullock studied many different and unrelated species. He believed that this "comparative approach" would reveal both general principles of the nervous system, and offer insights into which nervous system properties (anatomical, physiological, and chemical) were relevant to observed differences in species-specific traits, as well as which specific traits were relevant to observed differences in nervous systems. His resulting discoveries helped explain various properties of nervous systems. In one influential review he wrote, "Comparative neuroscience is likely to reach insights so novel as to constitute revolutions in understanding the structur
https://en.wikipedia.org/wiki/Submental%20triangle | The submental triangle (or suprahyoid triangle) is a division of the anterior triangle of the neck.
Boundaries
Its boundaries are:
Lateral (away from the midline), formed by the anterior belly of the digastricus
Medial (towards the midline), formed by the midline of the neck between the mandible and the hyoid bone
Inferior (below), formed by the body of the hyoid bone
Floor is formed by the mylohyoideus
Roof is formed by Investing layer of deep cervical fascia
Contents
It contains the submental lymph nodes (three or four in number), the submental veins, and the commencement of the anterior jugular veins.
(The contents of the triangle actually lie in the superficial fascia over the roof of the submental triangle.)
Additional images
See also
Anterior triangle of the neck
Submental space |
https://en.wikipedia.org/wiki/List%20of%20statisticians | This list of statisticians lists people who have made notable contributions to the theories or application of statistics, or to the related fields of probability or machine learning. Also included are actuaries and demographers.
A
Aalen, Odd Olai (1947–1987)
Abbey, Helen (1915–2001)
Abbott, Edith (1876–1957)
Abelson, Robert P. (1928–2005)
Abramovitz, Moses (1912–2000)
Achenwall, Gottfried (1719–1772)
Adelstein, Abraham Manie (1916–1992)
Adkins, Dorothy (1912–1975)
Ahsan, Riaz (1951–2008)
Ahtime, Laura
Aitchison, Beatrice (1908–1997)
Aitchison, John (1926–2016)
Aitken, Alexander (1895–1967)
Akaike, Hirotsugu (1927–2009)
Aliaga, Martha (1937–2011)
Allan, Betty (1905–1952)
Allen, R. G. D. (1906–1983)
Allison, David B.
Altman, Doug (1948–2018)
Altman, Naomi
Amemiya, Takeshi (1938–)
Anderson, Oskar (1887–1960)
Anderson, Theodore Wilbur
Anderson-Cook, Christine (1966–)
de Andrade, Mariza
Anscombe, Francis (1918–2001)
Anselin, Luc
Antonovska, Svetlana (1952–2016)
Armitage, Peter (1924–)
Armstrong, Margaret
Arrow, Kenneth
Ash, Arlene
Ashby, Deborah (1959–)
Asher, Jana
Ashley-Cooper, Anthony
Austin, Oscar Phelps
Ayres, Leonard Porter
B
Backer, Julie E. (1890–1977)
Bahadur, Raghu Raj (1924–1997)
Bahn, Anita K. (1920–1980)
Bailar, Barbara A.
Bailey, Rosemary A. (1947–)
Bailey-Wilson, Joan (1953–)
Baker, Rose
Balding, David
Bandeen-Roche, Karen
Barber, Rina Foygel
Barnard, George Alfred (1915–2002)
Barnard, Mildred (1908–2000)
Barnett, William A.
Bartels, Julius
Bartlett, M. S. (1910–2002)
Bascand, Geoff
Basford, Kaye
Basu, Debabrata (1924–2001)
Bates, Nancy
Batcher, Mary
Baxter, Laurence (1954–1996)
Bayarri, M. J. (1956–2014)
Bayes, Thomas (1702–1761)
Beale, Calvin
Becker, Betsy
Bediako, Grace
Behm, Ernst
Benjamin, Bernard
Benzécri, Jean-Paul (1932–2019)
Berger, James
Berkson, Joseph (1899–1982)
Bernardo, José-Miguel
Berry, Don
Best, Alfred M. (1876–1958)
Best, Nicky
Betensky, Rebecca
Beveridge, William
Bhat, B. R.
Bhat, P |
https://en.wikipedia.org/wiki/Bicipital%20groove | The bicipital groove (intertubercular groove, sulcus intertubercularis) is a deep groove on the humerus that separates the greater tubercle from the lesser tubercle. It allows for the long tendon of the biceps brachii muscle to pass.
Structure
The bicipital groove separates the greater tubercle from the lesser tubercle. It is usually around 8 cm long and 1 cm wide in adults. It lodges the long tendon of the biceps brachii muscle between the tendon of the pectoralis major muscle on the lateral lip and the tendon of the teres major muscle on the medial lip. It also transmits a branch of the anterior humeral circumflex artery to the shoulder joint.
The insertion of the latissimus dorsi muscle is found along the floor of the bicipital groove. The teres major muscle inserts on the medial lip of the groove.
It runs obliquely downward, and ends near the junction of the upper with the middle third of the bone. It is the lateral wall of the axilla.
Function
The bicipital groove allows for the long tendon of the biceps brachii muscle to pass.
Gallery
See also
Radial groove
Medial bicipital groove |
https://en.wikipedia.org/wiki/Verbal%20aggression | Verbal aggressiveness in communication has been studied to examine how an aggressive communicator gains control over interactions through the use of verbally aggressive language. Scholars have identified that individuals who express verbal aggressiveness have the goal of controlling and manipulating others through language. Infante and Wigley defined verbal aggressiveness as "a personality trait that predisposes persons to attack the self-concepts of other people instead of, or in addition to, their positions on topics of communication". Self-concept can be described as a group of values and beliefs that one has. Verbal aggressiveness is thought to be mainly a destructive form of communication, but it can produce positive outcomes. Infante and Wigley described aggressive behavior in interpersonal communication as a product of an individual's aggressive traits and of the way the person perceives the circumstances that obstruct them in a situation.
Infante, Trebing, Shepard, and Seeds collaborated to examine the relationship between argumentativeness and verbal aggression. The study investigated two things. The first was whether people high, moderate, or low in argumentativeness differ in how easily an opponent can provoke them into selecting verbally aggressive responses. The second was whether the sexes display different levels of verbal aggression. The results concluded that people who scored high on argumentativeness were the least likely to prefer verbal aggression. Argumentativeness is a constructive, positive trait that recognizes the different positions which might exist on controversial issues. As for the difference between the sexes, males are more likely than females to use verbal aggression because males have been conditioned to be more dominant and competitive.
The Verbal Aggressiveness Scale measures the personality trait of verbal aggressiveness and has been widely used in communi |
https://en.wikipedia.org/wiki/20-pair%20colour%20code%20%28Australia%29 | The 20-pair colour code is a colour code used in Australia to identify individual conductors in a kind of electrical telecommunication wiring for indoor use, known as twisted pair cables. The colours are applied to the insulation that covers each conductor. The first colour is chosen from one group of five colours.
The combinations are also shown in the table below showing the colour for each wire ("1" and "2") and the pair number.
The Australian standard specifies "Grey" in Tables B2 to B7. Systems in other countries use "Slate" rather than "Grey"; this is seen as minimising confusion between "Green" and "Grey" and their potential abbreviations: "G", "Gr", or "Gre". No such consideration is made for "Black", "Blue", or "Brown", or their potential abbreviations of "B", "Bl", or "Br".
Sources
www.commsalliance.com.au
See also
25-pair colour code
Color codes
Telephony equipment
Telecommunications in Australia |
https://en.wikipedia.org/wiki/Tandem%20Repeats%20Database | The Tandem Repeats Database (TRDB) is a database of tandem repeats in genomic DNA.
See also
Tandem repeats |
https://en.wikipedia.org/wiki/Blum%20axioms | In computational complexity theory the Blum axioms or Blum complexity axioms are axioms that specify desirable properties of complexity measures on the set of computable functions. The axioms were first defined by Manuel Blum in 1967.
Importantly, Blum's speedup theorem and the Gap theorem hold for any complexity measure satisfying these axioms. The most well-known measures satisfying these axioms are those of time (i.e., running time) and space (i.e., memory usage).
Definitions
A Blum complexity measure is a pair (φ, Φ), where φ is a numbering of the partial computable functions and Φ is a computable function, which satisfies the following Blum axioms. We write φi for the i-th partial computable function under the Gödel numbering φ, and Φi for the partial computable function λx.Φ(i, x).
the domains of φi and Φi are identical.
the set {(i, x, t) : Φi(x) = t} is recursive.
Examples
(φ, Φ) is a complexity measure, if Φ gives either the time or the memory (or some suitable combination thereof) required for the computation coded by i.
(φ, φ) is not a complexity measure, since it fails the second axiom.
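To make the second axiom concrete, here is a toy Python model (an illustration, not from the article): "programs" are generator functions that yield once per step, and the step count plays the role of Φi(x). Whether Φi(x) = t is then decidable, because we can simply run program i on x for at most t + 1 steps, even if the program never halts:

```python
def run_steps(prog, x, budget):
    """Run prog(x), counting one step per yield, for at most `budget` steps.
    Returns the exact step count if the program halts within the budget,
    or None if it is still running after `budget` steps."""
    gen = prog(x)
    steps = 0
    while steps <= budget:
        try:
            next(gen)
        except StopIteration:
            return steps
        steps += 1
    return None

def phi_equals(prog, x, t):
    """Decide whether the step-count measure of prog on input x equals t.
    Always terminates, even for non-halting programs (Blum's second axiom)."""
    return run_steps(prog, x, t) == t

def takes_x_steps(x):  # halts after exactly x steps
    for _ in range(x):
        yield

def diverge(x):        # never halts
    while True:
        yield

print(phi_equals(takes_x_steps, 3, 3))  # True
print(phi_equals(takes_x_steps, 3, 2))  # False
print(phi_equals(diverge, 3, 10))       # False, decided in finite time
```

Note the contrast with the first example above: deciding whether a program's *output* equals t would require solving the halting problem, whereas deciding whether its *step count* equals t never takes more than t + 1 steps.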
Complexity classes
For a total computable function f, complexity classes of computable functions can be defined as

C(f) = {φi : Φi ≤ f}  and  C0(f) = {h ∈ C(f) : h is 0/1-valued}.

C(f) is the set of all computable functions with a complexity less than f. C0(f) is the set of all boolean-valued functions with a complexity less than f. If we consider those functions as indicator functions on sets, C0(f) can be thought of as a complexity class of sets.
https://en.wikipedia.org/wiki/ETSI%20Satellite%20Digital%20Radio | ETSI Satellite Digital Radio (SDR or ETSI SDR) describes a standard of satellite digital radio. It is an activity of the European standardisation organisation ETSI.
It addresses systems in which a satellite broadcasts directly to mobile and handheld receivers in L band or S band, complemented by terrestrial transmitters. The broadcast content consists of multicast audio (digital radio), video (mobile TV) and data (program guide, text and graphical information, as well as off-line content). The satellite component allows geographical coverage at low cost, whereas the terrestrial component improves reception quality in built-up areas. The specifications also consider conditional access and Digital Rights Management.
1worldspace planned to use ETSI SDR in its new network covering Europe from 2009, but the company went defunct before launching the service. Ondas Media has also announced plans to use ETSI SDR.
ETSI SDR is also similar to Sirius XM Radio, to the S-DMB system used in South Korea for multimedia broadcasting since May 2005, to China Multimedia Mobile Broadcasting (CMMB), and to the defunct MobaHo! service (2004–2009). The DVB-SH specifications, created by the DVB Project, target similar broadcast systems.
ETSI SDR Standard
The ETSI SDR standard allows implementation of parts of such networks in an interoperable way. So far, ETSI has standardized the physical layer of the air interface (radio interface). This allows implementation of demodulators in integrated circuits. The physical layer is described by the following parts of ETSI EN 302 550:
ETSI EN 302 550-1-1 "Satellite Earth Stations and Systems (SES); Satellite Digital Radio (SDR) Systems; Part 1: Physical Layer of the Radio Interface; Sub-Part 1: Outer Physical Layer"
ETSI EN 302 550-1-2 "Satellite Earth Stations and Systems (SES); Satellite Digital Radio (SDR) Systems; Part 1: Physical Layer of the Radio Interface; Sub-Part 2: Inner Physical Layer Single Carrier Modulation"
ETSI EN 302 |
https://en.wikipedia.org/wiki/Competence%20factor | The ability of a cell to successfully incorporate exogenous DNA, or competency, is determined by competence factors. These factors consist of certain cell surface proteins and transcription factors that induce the uptake of DNA.
Natural competence is the ability of a cell to bind to and transport extracellular DNA through the membrane and recombine foreign genes into its own DNA through a process called transformation. Horizontal gene transfer is a result of this, where bacterial genes can be transferred amongst same-generation species in a given environment, and competence is the ability of a cell to participate in the transfer. If one cell in a population living in an unfavorable environment has a mutation that results in better survivability, that gene can be passed on to other competent cells to extend the same advantage. Plasmids, commonly used in genetic manipulation, can also be shared through horizontal gene transfer, which is especially relevant in modern medicine concerning the exchange of antibiotic-resistant plasmids.
A cell's competence can be determined by its genetics, which is the case for natural competency, or it can be manipulated in order to achieve artificial competence.
There are two types of competence-inducing pheromones: ComX and CSF. ComX is a ten-amino-acid oligopeptide; it requires two co-components, ComP, a histidine kinase, and ComA, a cytoplasmic response regulator. ComX binds to ComP on the outside of the inner membrane; ComP autophosphorylates and the phosphoryl group is transferred to ComA. This activates transcription of genes in the competence pathway. CSF is a five-amino-acid oligopeptide and is exported via the GSP pathway. CSF enters the cell through an oligopeptide permease and stimulates the competence pathway at low concentrations (1–5 nM); at high concentrations (>20 nM) competence is inhibited and sporulation is stimulated.
Types of competence
Natural
Most bacteria found in nature are said to be naturally
https://en.wikipedia.org/wiki/BitterDB | The BitterDB is a database of compounds that were reported to taste bitter to humans. The aim of the BitterDB database is to gather information about bitter-tasting natural and synthetic compounds, and their cognate bitter taste receptors (T2Rs or TAS2Rs).
Summary
The BitterDB includes over 670 compounds that were reported to taste bitter to humans. The compounds can be searched by name, chemical structure, similarity to other bitter compounds, association with a particular human bitter taste receptor, and by other properties as well. The database also contains information on mutations in bitter taste receptors that were shown to influence receptor activation by bitter compounds.
Database overview
Bitter compounds
BitterDB currently contains more than 670 compounds that were cited in the literature as bitter. For each compound, the database offers information regarding its molecular properties, references for the compound’s bitterness, including additional information about the bitterness category of the compound (e.g. a ‘bitter-sweet’ or ‘slightly bitter’ annotation), different compound identifiers (SMILES, CAS registry number, IUPAC systematic name), an indication whether the compound is derived from a natural source or is synthetic, a link to the compound’s PubChem entry and different file formats for downloading (sdf, image, smiles).
Over 200 bitter compounds have been experimentally linked to their corresponding human bitter taste receptors. For those compounds, BitterDB provides additional information, including links to the publications indicating these ligand–receptor interactions, the effective concentration for receptor activation and/or the EC50 value and links to the associated bitter taste receptors entries in the BitterDB.
Querying and browsing
Bitter compounds can be queried and browsed in different ways. For example, the 'advanced search' option allows the user to retrieve compounds that fit different criteria, such as a combination of specific |
https://en.wikipedia.org/wiki/Queen%20Elisabeth%20Medical%20Foundation | The Queen Elisabeth Medical Foundation (QEMF) is a Belgian non-profit organization, founded in 1926 by Elisabeth of Bavaria, wife of Albert I. She founded the organization based on her experience with the wounded on the front line during the First World War. The foundation aims to encourage laboratory research and contacts between researchers and clinical practitioners, with a particular focus on the neurosciences. The QEMF supports seventeen university teams throughout Belgium.
See also
King Baudouin Foundation
National Fund for Scientific Research
Queen Elisabeth Music Competition
Queen Fabiola Foundation for Mental Health |
https://en.wikipedia.org/wiki/Phoenix%20network%20coordinates | Phoenix is a decentralized network coordinate system based on the matrix factorization model.
Background
Network coordinate (NC) systems are an efficient mechanism for internet distance (round-trip latency) prediction with scalable measurements. For a network with N hosts, by performing O(N) measurements, all N*N distances can be predicted.
Use cases: Vuze BitTorrent, application layer multicast, PeerWise overlay, multi-player online gaming.
Triangle inequality violations (TIVs) widely exist on the Internet due to the current sub-optimal Internet routing.
Model
Most of the prior NC systems use the Euclidean distance model, i.e. embed N hosts into a d-dimensional Euclidean space Rd. Due to the wide existence of TIVs on the internet, the prediction accuracy of such systems is limited. Phoenix uses a matrix factorization (MF) model, which does not have the constraint of TIV.
The linear dependence among the rows motivates the factorization of the Internet distance matrix: for a system with N Internet nodes, the N × N distance matrix D can be factorized into two smaller matrices, D ≈ X Yᵀ, where X and Y are N × d matrices (d ≪ N). This matrix factorization is essentially a problem of linear dimensionality reduction, and Phoenix tries to solve it in a distributed way.
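The matrix-factorization model can be sketched with a toy centralized gradient-descent fit. Phoenix itself computes each host's coordinates in a decentralized fashion with weighted references; the function names and hyperparameters below are purely illustrative:

```python
import random

def factorize(D, d=2, lr=0.01, iters=5000, seed=0):
    """Approximate an N x N distance matrix D by X Y^T with N x d factors,
    using stochastic gradient descent on the squared prediction error.
    A centralized toy; Phoenix solves the same model distributively."""
    rng = random.Random(seed)
    N = len(D)
    X = [[rng.uniform(0, 1) for _ in range(d)] for _ in range(N)]
    Y = [[rng.uniform(0, 1) for _ in range(d)] for _ in range(N)]
    for _ in range(iters):
        for i in range(N):
            for j in range(N):
                err = D[i][j] - sum(X[i][k] * Y[j][k] for k in range(d))
                for k in range(d):
                    xi, yj = X[i][k], Y[j][k]
                    X[i][k] += lr * err * yj  # gradient step on row X_i
                    Y[j][k] += lr * err * xi  # gradient step on row Y_j
    return X, Y

def predict(X, Y, i, j):
    """Predicted distance between nodes i and j: dot product X_i . Y_j."""
    return sum(a * b for a, b in zip(X[i], Y[j]))

# Distances generated by hidden rank-2 factors should be recovered closely.
A = [[1, 2], [2, 1], [3, 1], [1, 3]]
B = [[2, 1], [1, 2], [1, 1], [3, 2]]
D = [[sum(a * b for a, b in zip(u, v)) for v in B] for u in A]
X, Y = factorize(D)
print(max(abs(D[i][j] - predict(X, Y, i, j))
          for i in range(4) for j in range(4)))  # small residual
```

Because X and Y are independent factors, the model imposes no symmetry or triangle inequality on the predictions, which is exactly why it can fit TIV-laden Internet distances better than a Euclidean embedding.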
Design choices in Phoenix
Different from the existing MF based NC systems such as IDES and DMF, Phoenix introduces a weight to each reference NC and trusts the NCs with higher weight values more than the others. The weight-based mechanism can substantially reduce the impact of the error propagation.
For node discovery, Phoenix uses a distributed scheme, so-called peer exchange (PEX), which is used in BitTorrent (protocol). The usage of PEX reduces the load of the tracker, while still ensuring the prediction accuracy under node churn.
Similar to DMF, for avoiding the potential drift of the NCs, Regularization (mathematics) is introduced in NC calculation.
NCShield is a decentralized, gossip-based trust an
https://en.wikipedia.org/wiki/Ramachandran%20plot | In biochemistry, a Ramachandran plot (also known as a Rama plot, a Ramachandran diagram or a [φ,ψ] plot), originally developed in 1963 by G. N. Ramachandran, C. Ramakrishnan, and V. Sasisekharan, is a way to visualize energetically allowed regions for backbone dihedral angles ψ against φ of amino acid residues in protein structure. The figure on the left illustrates the definition of the φ and ψ backbone dihedral angles (called φ and φ' by Ramachandran). The ω angle at the peptide bond is normally 180°, since the partial-double-bond character keeps the peptide bond planar. The figure in the top right shows the allowed φ,ψ backbone conformational regions from the Ramachandran et al. 1963 and 1968 hard-sphere calculations: full radius in solid outline, reduced radius in dashed, and relaxed tau (N-Cα-C) angle in dotted lines. Because dihedral angle values are circular and 0° is the same as 360°, the edges of the Ramachandran plot "wrap" right-to-left and bottom-to-top. For instance, the small strip of allowed values along the lower-left edge of the plot is a continuation of the large, extended-chain region at upper left.
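Since the plot's axes are backbone dihedral angles, it may help to show how such an angle is computed from four consecutive atom positions (for φ: C(i−1)–N–Cα–C; for ψ: N–Cα–C–N(i+1)). This is a standard atan2 construction; the coordinates below are synthetic, not from a real structure:

```python
import numpy as np

def dihedral(p0, p1, p2, p3):
    """Dihedral angle in degrees about the p1–p2 axis (IUPAC sign convention)."""
    b0 = p0 - p1
    b1 = p2 - p1
    b2 = p3 - p2
    b1 = b1 / np.linalg.norm(b1)
    v = b0 - np.dot(b0, b1) * b1        # component of b0 perpendicular to the axis
    w = b2 - np.dot(b2, b1) * b1        # component of b2 perpendicular to the axis
    x = np.dot(v, w)
    y = np.dot(np.cross(b1, v), w)
    return np.degrees(np.arctan2(y, x))

# A planar arrangement with the end atoms on opposite sides of the axis
# is trans (anti), i.e. ±180° — the usual ω value at the peptide bond.
pts = [np.array(p, float) for p in [(0, 1, 0), (0, 0, 0), (1, 0, 0), (1, -1, 0)]]
print(dihedral(*pts))    # → 180.0
```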
Uses
A Ramachandran plot can be used in two somewhat different ways. One is to show in theory which values, or conformations, of the ψ and φ angles are possible for an amino-acid residue in a protein (as at top right). A second is to show the empirical distribution of datapoints observed in a single structure (as at right, here) in usage for structure validation, or else in a database of many structures (as in the lower 3 plots at left). Either case is usually shown against outlines for the theoretically favored regions.
Amino-acid preferences
One might expect that larger side chains would result in more restrictions and consequently a smaller allowable region in the Ramachandran plot, but the effect of side chains is small. In practice, the major effect seen is that of the presence or absence of the methylene group at Cβ. Glycine has only a hydroge |
https://en.wikipedia.org/wiki/Memento%20pattern | The memento pattern is a software design pattern that exposes the private internal state of an object.
One example of how this can be used is to restore an object to its previous state (undo via rollback); others are versioning and custom serialization.
The memento pattern is implemented with three objects: the originator, a caretaker and a memento. The originator is some object that has an internal state. The caretaker is going to do something to the originator, but wants to be able to undo the change. The caretaker first asks the originator for a memento object. Then it does whatever operation (or sequence of operations) it was going to do. To roll back to the state before the operations, it returns the memento object to the originator. The memento object itself is an opaque object (one which the caretaker cannot, or should not, change). When using this pattern, care should be taken if the originator may change other objects or resources—the memento pattern operates on a single object.
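This three-object collaboration can be sketched as follows (class and method names are illustrative, not from any particular library); note that the memento is opaque to the caretaker, which only stores it and hands it back:

```python
class Memento:
    """Snapshot of the originator's state; opaque to the caretaker."""
    def __init__(self, state):
        self._state = state


class Originator:
    """Owns internal state; can snapshot it and restore from a snapshot."""
    def __init__(self):
        self._state = ""

    def set_state(self, state):
        self._state = state

    def get_state(self):
        return self._state

    def save(self):
        return Memento(self._state)

    def restore(self, memento):
        self._state = memento._state


# Caretaker: ask for a memento, perform an operation, then roll back.
originator = Originator()
originator.set_state("State A")
checkpoint = originator.save()     # memento taken before the change
originator.set_state("State B")    # the operation to be undone
originator.restore(checkpoint)     # roll back by returning the memento
print(originator.get_state())      # → State A
```

Because `Memento` exposes nothing to the caretaker, the originator's encapsulation is preserved while still allowing external rollback.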
Classic examples of the memento pattern include a pseudorandom number generator (each consumer of the PRNG serves as a caretaker who can initialize the PRNG (the originator) with the same seed (the memento) to produce an identical sequence of pseudorandom numbers) and the state in a finite state machine.
Overview
The Memento design pattern is one of the twenty-three well-known GoF design patterns that describe how to solve recurring design problems to design flexible and reusable object-oriented software, that is, objects that are easier to implement, change, test, and reuse.
What problems can the Memento design pattern solve?
The internal state of an object should be saved externally so that the object can be restored to this state later.
The object's encapsulation must not be violated.
The problem is that a well designed object is encapsulated so tha |
https://en.wikipedia.org/wiki/Perry%20Mason%3A%20The%20Case%20of%20the%20Mandarin%20Murder | Perry Mason: The Case of the Mandarin Murder is an interactive fiction computer game with graphics. The game was published by Telarium (formerly known as Trillium), a subsidiary of Spinnaker Software, in 1985.
Description
The game is based on the popular TV series Perry Mason starring Raymond Burr, who played the fictional defense attorney of the same name created by Erle Stanley Gardner. The player must save client Laura Knapp from being convicted of the murder of her husband Victor.
Reception
Antic Amiga in 1985 called Perry Mason "a major breakthrough in interactive fiction." |
https://en.wikipedia.org/wiki/High-dimensional%20model%20representation | High-dimensional model representation is a finite expansion for a given multivariable function. The expansion was first described by Ilya M. Sobol as

f(x₁, …, xₙ) = f₀ + Σᵢ fᵢ(xᵢ) + Σᵢ<ⱼ fᵢⱼ(xᵢ, xⱼ) + … + f₁₂…ₙ(x₁, …, xₙ).

The method used to determine the right-hand-side functions is given in Sobol's paper. A review can be found here: High Dimensional Model Representation (HDMR): Concepts and Applications.
The underlying logic behind HDMR is to express all variable interactions in a system in a hierarchical order. For instance, the constant term f₀ represents the mean response of the model f. It can be considered as what is left of the model after stripping away all variable effects. The univariate functions fᵢ(xᵢ), however, represent the "individual" contributions of the variables: fᵢ(xᵢ) is the portion of the model that can be controlled only by the variable xᵢ. For this reason, there cannot be any constant in fᵢ(xᵢ), because all constants are expressed in f₀. Going further into higher interactions, the next stop is the bivariate functions fᵢⱼ(xᵢ, xⱼ), which represent the cooperative effect of the variables xᵢ and xⱼ together. Similar logic applies here: the bivariate functions contain neither univariate functions nor constants, as that would violate the construction logic of HDMR. As we go into higher interactions, the number of interactions increases, until at last we reach the residual term representing the contribution present only when all variables act together.
HDMR as an Approximation
The hierarchical representation of HDMR brings an advantage if one needs to replace an existing model with a simpler one, usually containing only univariate or bivariate terms. If the target model does not contain higher levels of variable interaction, this approach can yield good approximations, with the additional advantage of providing a clearer view of variable interactions.
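As a toy illustration (not from the article), the constant and univariate HDMR terms can be estimated by grid averaging; for the function below, the only higher-order content is a single bivariate interaction, which survives as the residual of the first-order approximation:

```python
import numpy as np

# Sketch: ANOVA-style HDMR terms of f(x1, x2) = 2*x1 + 3*x2 + x1*x2
# on a uniform grid over [0, 1]^2:
#   f0        = mean of f,
#   f_i(x_i)  = conditional mean over the other variable, minus f0.
x = np.linspace(0, 1, 201)
X1, X2 = np.meshgrid(x, x, indexing="ij")
F = 2 * X1 + 3 * X2 + X1 * X2

f0 = F.mean()                       # constant term: overall mean response
f1 = F.mean(axis=1) - f0            # univariate term in x1 (averaged over x2)
f2 = F.mean(axis=0) - f0            # univariate term in x2 (averaged over x1)

# First-order HDMR approximation; its residual is the x1*x2 interaction
F1 = f0 + f1[:, None] + f2[None, :]
print(round(f0, 3))                            # → 2.75
print(round(np.abs(F - F1).max(), 3))          # → 0.25, i.e. max of (x1-1/2)(x2-1/2)
```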
See also
Variance-based sensitivity analysis
Volterra series |
https://en.wikipedia.org/wiki/Avermectin | The avermectins are a series of drugs and pesticides used to treat parasitic worms and insect pests. They are a group of 16-membered macrocyclic lactone derivatives with potent anthelmintic and insecticidal properties. These naturally occurring compounds are generated as fermentation products by Streptomyces avermitilis, a soil actinomycete. Eight different avermectins were isolated in four pairs of homologue compounds (A1, A2, B1, B2), with a major (a-component) and minor (b-component) component usually in ratios of 80:20 to 90:10. Avermectin B1, a mixture of B1a and B1b, is the drug and pesticide abamectin. Other anthelmintics derived from the avermectins include ivermectin, selamectin, doramectin, eprinomectin.
Half of the 2015 Nobel Prize in Physiology or Medicine was awarded to William C. Campbell and Satoshi Ōmura for discovering avermectin, "the derivatives of which have radically lowered the incidence of river blindness and lymphatic filariasis, as well as showing efficacy against an expanding number of other parasitic diseases."
History
In 1978, an actinomycete was isolated at the Kitasato Institute from a soil sample collected at Kawana, Ito City, Shizuoka Prefecture, Japan. Later that year, the isolated actinomycete was sent to Merck Sharp and Dohme Research Laboratories for testing. Various carefully controlled broths were fermented using the isolated actinomycete. Early tests indicated that some of the whole, fermented broths were active against Nematospiroides dubius in mice over at least an eight-fold range without notable toxicity. Subsequent to this, the anthelmintic activity was isolated and identified as a family of closely related compounds. The compounds were finally characterized, and the novel species that produced them was described, by a team at Merck in 1978 and named Streptomyces avermitilis (with the adjective probably intended to mean that it kills worms).
In 2002, Yoko Takahashi and others at the Kitasato Institute for Life Scien |
https://en.wikipedia.org/wiki/Irradiance | In radiometry, irradiance is the radiant flux received by a surface per unit area. The SI unit of irradiance is the watt per square metre (W⋅m−2). The CGS unit erg per square centimetre per second (erg⋅cm−2⋅s−1) is often used in astronomy. Irradiance is often called intensity, but this term is avoided in radiometry where such usage leads to confusion with radiant intensity. In astrophysics, irradiance is called radiant flux.
Spectral irradiance is the irradiance of a surface per unit frequency or wavelength, depending on whether the spectrum is taken as a function of frequency or of wavelength. The two forms have different dimensions and units: spectral irradiance of a frequency spectrum is measured in watts per square metre per hertz (W⋅m−2⋅Hz−1), while spectral irradiance of a wavelength spectrum is measured in watts per square metre per metre (W⋅m−3), or more commonly watts per square metre per nanometre (W⋅m−2⋅nm−1).
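As a worked example of the unit relationship (the numeric values are illustrative only): since Eλ|dλ| = Eν|dν| and ν = c/λ, the two spectral forms are related by Eν = Eλ·λ²/c:

```python
# Sketch: converting spectral irradiance between the wavelength and
# frequency forms via E_nu = E_lambda * lambda**2 / c.
c = 299_792_458.0                    # speed of light, m/s

def per_wavelength_to_per_frequency(E_lambda, wavelength_m):
    """W·m⁻² per metre of wavelength  →  W·m⁻² per hertz."""
    return E_lambda * wavelength_m**2 / c

# Example: roughly 1.75 W·m⁻²·nm⁻¹ near 500 nm, i.e. 1.75e9 W·m⁻³
E_nu = per_wavelength_to_per_frequency(1.75e9, 500e-9)
print(f"{E_nu:.3e} W·m⁻²·Hz⁻¹")
```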
Mathematical definitions
Irradiance
Irradiance of a surface, denoted Ee ("e" for "energetic", to avoid confusion with photometric quantities), is defined as

Ee = ∂Φe / ∂A,
where
∂ is the partial derivative symbol;
Φe is the radiant flux received;
A is the area.
If we want to talk about the radiant flux emitted by a surface, we speak of radiant exitance.
Spectral irradiance
Spectral irradiance in frequency of a surface, denoted Ee,ν, is defined as

Ee,ν = ∂Ee / ∂ν,
where ν is the frequency.
Spectral irradiance in wavelength of a surface, denoted Ee,λ, is defined as

Ee,λ = ∂Ee / ∂λ,
where λ is the wavelength.
Property
Irradiance of a surface is also, according to the definition of radiant flux, equal to the time-average of the component of the Poynting vector perpendicular to the surface:

Ee = ⟨|S|⟩ cos α,
where
is the time-average;
S is the Poynting vector;
α is the angle between a unit vector normal to the surface and S.
For a propagating sinusoidal linearly polarized electromagnetic plane wave, the Poynting vector always points to the direction of propagation while oscillating in magnitude. The irradi |
https://en.wikipedia.org/wiki/Numerical%20modeling%20in%20echocardiography | Numerical manipulation of Doppler parameters obtained during routine echocardiography has been extensively utilized to non-invasively estimate intra-cardiac pressures, in many cases removing the need for invasive cardiac catheterization.
Echocardiography uses ultrasound to create real-time anatomic images of the heart and its structures. Doppler echocardiography utilizes the Doppler principle to estimate intracardiac velocities. Via the modified Bernoulli equation, velocity is routinely converted to pressure gradient for use in clinical cardiology decision making.
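A minimal sketch of this velocity-to-pressure conversion, using the clinically standard simplified ("modified") Bernoulli form ΔP ≈ 4v², with ΔP in mmHg and v the peak Doppler velocity in m/s:

```python
# Simplified (modified) Bernoulli equation as used in clinical echo:
# pressure gradient (mmHg) ≈ 4 × velocity² (velocity in m/s).
def pressure_gradient_mmhg(velocity_m_per_s):
    return 4.0 * velocity_m_per_s ** 2

# e.g. a 3 m/s regurgitant jet implies a ~36 mmHg gradient across the valve
print(pressure_gradient_mmhg(3.0))   # → 36.0
```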
Mathematical modeling of intracardiac velocity parameters has been investigated broadly, for the pulmonary circulation and for aortic Doppler in aortic stenosis. Diastolic dysfunction algorithms use complex combinations of these numeric models to estimate intra-cardiac filling pressures. Shunt defects have been studied using the Relative Atrial Index.
See also
Medical ultrasonography section: Doppler sonography
Echocardiography
American Society of Echocardiography
Christian Doppler |
https://en.wikipedia.org/wiki/Calibrated%20geometry | In the mathematical field of differential geometry, a calibrated manifold is a Riemannian manifold (M,g) of dimension n equipped with a differential p-form φ (for some 0 ≤ p ≤ n) which is a calibration, meaning that:
φ is closed: dφ = 0, where d is the exterior derivative
for any x ∈ M and any oriented p-dimensional subspace ξ of TxM, φ|ξ = λ volξ with λ ≤ 1. Here volξ is the volume form of ξ with respect to g.
Set Gx(φ) = { ξ as above : φ|ξ = volξ }. (In order for the theory to be nontrivial, we need Gx(φ) to be nonempty.) Let G(φ) be the union of Gx(φ) for x in M.
The theory of calibrations is due to R. Harvey and B. Lawson and others. Much earlier (in 1966) Edmond Bonan introduced G2-manifolds and Spin(7)-manifolds, constructed all the parallel forms and showed that those manifolds were Ricci-flat. Quaternion-Kähler manifolds were simultaneously studied in 1967 by Edmond Bonan and Vivian Yoh Kraines and they constructed the parallel 4-form.
Calibrated submanifolds
A p-dimensional submanifold Σ of M is said to be a calibrated submanifold with respect to φ (or simply φ-calibrated) if TΣ lies in G(φ).
A famous one-line argument shows that calibrated p-submanifolds minimize volume within their homology class. Indeed, suppose that Σ is calibrated, and Σ′ is a p-submanifold in the same homology class. Then

vol(Σ) = ∫Σ φ = ∫Σ′ φ ≤ vol(Σ′),

where the first equality holds because Σ is calibrated, the second equality is Stokes' theorem (as φ is closed), and the inequality holds because φ is a calibration.
Examples
On a Kähler manifold, suitably normalized powers of the Kähler form are calibrations, and the calibrated submanifolds are the complex submanifolds. This follows from the Wirtinger inequality.
On a Calabi–Yau manifold, the real part of a holomorphic volume form (suitably normalized) is a calibration, and the calibrated submanifolds are special Lagrangian submanifolds.
On a G2-manifold, both the 3-form and the Hodge dual 4-form define calibrations. The corresponding calibrated sub |
https://en.wikipedia.org/wiki/Sourcefire | Sourcefire, Inc. was a technology company that developed network security hardware and software. The company's Firepower network security appliances were based on Snort, an open-source intrusion detection system (IDS). Sourcefire was acquired by Cisco for $2.7 billion in July 2013.
Background
Sourcefire was founded in 2001 by Martin Roesch, the creator of Snort. The company created a commercial version of the Snort software, the Sourcefire 3D System, which evolved into the company's Firepower line of network security products. The company's headquarters was in Columbia, Maryland in the United States, with offices abroad.
Financial
The company's initial growth was funded through four separate rounds of financing raising a total of $56.5 million from venture investors such as Sierra Ventures, New Enterprise Associates, Sequoia Capital, Core Capital Partners, Inflection Point Ventures, Meritech Capital Partners, and Cross Creek Capital, L.P.
In 2005, Check Point Software attempted to acquire Sourcefire for $225 million, but later withdrew its offer after it became clear US authorities would attempt to block the acquisition. The company completed an initial public offering in March 2007, raising $86.3 million. In August of the same year, Sourcefire acquired Clam AntiVirus. Sourcefire rejected an offer of $187 million in May 2008 from security appliance vendor Barracuda Networks, who had offered to pay US$7.50 per share, amounting to a 13% premium of their then-current stock price. Sourcefire announced its acquisition of the cloud-based antivirus firm Immunet in January 2011.
Revenue for the fourth quarter of 2012 was $67.4 million compared to $53.2 million in the fourth quarter of 2011, an increase of 27%. Revenue for the year ending December 31, 2012 was $223.1 million compared to $165.6 million for 2011, an increase of 35%. International revenues were $74.4 million, up 77% over 2011. As of December 31, 2012, the company's cash, cash equivalents, and investments to |
https://en.wikipedia.org/wiki/NCEP/NCAR%20Reanalysis | The NCEP/NCAR Reanalysis is an atmospheric reanalysis produced by the National Centers for Environmental Prediction (NCEP) and the National Center for Atmospheric Research (NCAR). It is a continually updated globally gridded data set that represents the state of the Earth's atmosphere, incorporating observations and numerical weather prediction (NWP) model output from 1948 to present.
Accessing the data
The data is available for free download from the NOAA Earth System Research Laboratory and NCEP. It is distributed in NetCDF and GRIB files, for which a number of tools and libraries exist.
It is available for download through the NCAR CISL Research Data Archive on the NCEP/NCAR Reanalysis main data page.
Uses
Initializing a smaller scale atmospheric model
Climate assessment
Subsequent updates
Since then, the NCEP–DOE Reanalysis 2 and the NCEP CFS Reanalysis have been released. The former focuses on fixing existing bugs in the NCEP/NCAR Reanalysis system – most notably in the surface energy budget and the usage of observed precipitation forcing for the land surface – but otherwise uses a similar numerical model and data assimilation system. The latter is based on the NCEP Climate Forecast System.
See also
ECMWF re-analysis |
https://en.wikipedia.org/wiki/Hybrid%20Wireless%20Mesh%20Protocol | The Hybrid Wireless Mesh Protocol (HWMP), part of IEEE 802.11s, is a basic routing protocol for a wireless mesh network. It is based on AODV (RFC 3561) and tree-based routing. It relies on a Peer Link Management protocol by which each Mesh Point discovers and tracks neighboring nodes. If any of these are connected to a wired backhaul, there is no need for HWMP, which selects paths from those assembled by compiling all mesh point peers into one composite map.
The HWMP protocol is hybrid because it combines a proactive tree-based hierarchical routing protocol with an on-demand logic based on the Ad hoc On-Demand Distance Vector (AODV) protocol. In contrast to classic IP-based (ISO layer 3) routing, HWMP operates at ISO layer 2 (based on MAC addresses).
HWMP is intended to displace proprietary protocols used by vendors like Meraki for the same purpose, permitting peer participation by open source router firmware. The open source implementation of 802.11s (open80211s) has been integrated to the Linux kernel by Cozybit Inc. FreeBSD supports HWMP starting with FreeBSD 8.0. |
https://en.wikipedia.org/wiki/Rose%E2%80%93Vinet%20equation%20of%20state | The Rose–Vinet equation of state is a set of equations used to describe the equation of state of solid objects. It is a modification of the Birch–Murnaghan equation of state.
The initial paper discusses how the equation only depends on four inputs: the isothermal bulk modulus K₀, the derivative of the bulk modulus with respect to pressure K₀′, the volume V₀, and the thermal expansion, all evaluated at zero pressure (P = 0) and at a single (reference) temperature. The same equation holds for all classes of solids and a wide range of temperatures.
Let the cube root of the specific volume be

η = (V / V₀)^(1/3);

then the equation of state is:

P(V) = 3K₀ (1 − η) / η² · exp[ (3/2)(K₀′ − 1)(1 − η) ].
A similar equation was published by Stacey et al. in 1981. |
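A small sketch of the Vinet pressure–volume relation in its standard form, P(V) = 3K₀(1−η)/η²·exp[(3/2)(K₀′−1)(1−η)] with η = (V/V₀)^(1/3); the parameter values below are illustrative, not those of any particular material:

```python
import math

def vinet_pressure(V, V0, K0, K0_prime):
    """Vinet (Rose–Vinet) equation of state: pressure as a function of volume.

    K0 is the zero-pressure isothermal bulk modulus, K0_prime its pressure
    derivative; pressure is returned in the same units as K0.
    """
    eta = (V / V0) ** (1.0 / 3.0)
    return 3.0 * K0 * (1.0 - eta) / eta**2 * math.exp(
        1.5 * (K0_prime - 1.0) * (1.0 - eta)
    )

# At V = V0 the pressure vanishes by construction
print(vinet_pressure(1.0, 1.0, 160.0, 4.0))        # → 0.0
# Compression below V0 gives a positive pressure
print(vinet_pressure(0.9, 1.0, 160.0, 4.0) > 0)    # → True
```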
https://en.wikipedia.org/wiki/Osmunda%20wehrii | Osmunda wehrii is an extinct species of fern in the modern genus Osmunda of the family Osmundaceae. Osmunda wehrii is known from Langhian age Miocene fossils found in Central Washington.
History and classification
The species was described from specimens of silicified rhizomes and frond bases in blocks of chert. The cherts were recovered from sediments outcropping near the contact of the Roza Basalts and the overlying Priest Rapids Basalts, designated the type locality, near the town of Beverly, Washington by Fred Brinkman of Sunnyside, Washington. Further specimens of O. wehrii have been found at the "Ho ho" site, one of the "county line hole" fossil localities north of Interstate 82 in Yakima County, Washington. The "Ho ho" site exposes strata which are part of the Museum Flow Package within the interbeds of the Sentinel Bluffs Unit of the central Columbia Plateau N2 Grande Ronde Basalt, Columbia River Basalt Group. The Museum Flow Package interbeds are dated to the middle Miocene and are approximately 15.6 million years old.
The holotype specimens, two pieces of the same chert specimen containing rhizomes and frond bases, are preserved in the Burke Museum of Natural History and Culture as specimen numbers "4772" and "4773". The specimens of chert were studied by paleobotanist Charles N. Miller Jr. of the University of Montana. Miller published his 1982 type description for Osmunda wehrii in the American Journal of Botany volume 69 article "Osmunda wehrii, a New Species Based on Petrified Rhizomes from the Miocene of Washington". In his type description he noted the etymology for the specific epithet wehrii, honoring Wesley C. Wehr, who made the type specimens available to Miller for study.
Description
Osmunda wehrii possesses rhizomes which are approximately in diameter. The fossils have distinct stipular frond bases characteristic of the family Osmundaceae, while the interiors of the fronds show distinct long fibers in the frond bases, both representative of the
https://en.wikipedia.org/wiki/Pteropus%20brunneus | Pteropus brunneus is an extinct species of flying fox in the family Pteropodidae. It was said to be found at Percy Island, southeast of Mackay, Queensland, off the northeast coast of Australia.
Taxonomy
A single male specimen was collected in 1874 and deposited at the British Museum of Natural History (BMNH); the skin and skull were estimated to be from a near-adult animal.
The description for this was published by George Edward Dobson in 1878, in a revision of chiropteran specimens held at the museum. Further details were provided when the specimen was again examined in 1912. Since that record, no further documentation is known of this species; the specimen is still located at BMNH. The description was re-evaluated in the late twentieth century, and recognition as a species is maintained in the third edition of The Mammals of Australia (National Photographic Index of Australian Wildlife, 2008). Speculation on the taxon includes the proposition the specimen may be an undiagnosed vagrant of another species.
Description
A smaller species of the genus Pteropus, with the weight estimated to be around . The length of the head and body combined is approximately , and the forearm of the single specimen is . Fur colour of this megabat is uniform across the body, a golden shade of brown.
The first description notes the form of the ears, comparing the specimen to those of Pteropus keraudrenii (Pteropus mariannus) but lacking any hair. The uropatagium is narrow and obscured across the centre by fur. The hair of the pelage is longer at the nape, but mostly short elsewhere, the fur at the upper back is slightly appressed and oppositely directed for an inch either side of the centre. Little fur appears at the arm, the legs are almost completely covered with hair.
The species bears no resemblance to the Australian pteropodids, the 'flying-foxes', yet was reported to be residing in a large camp that travelled to the Australian mainland to feed.
Distribution and habitat
The presumed distributio |
https://en.wikipedia.org/wiki/Joseph%20Weber | Joseph Weber (May 17, 1919 – September 30, 2000) was an American physicist. He gave the earliest public lecture on the principles behind the laser and the maser and developed the first gravitational wave detectors (Weber bars).
Early life
Joseph Weber was born in Paterson, New Jersey, on 17 May 1919, the last of four children born to Yiddish-speaking immigrant parents. His name was "Yonah" until he entered grammar school. He had no birth certificate, and his father had taken the last name of "Weber" to match an available passport in order to emigrate to the US. Thus, Joe Weber had little proof of either his family or his given name, which gave him some trouble in obtaining a passport at the height of the red scare.
Early education
Weber attended Paterson public schools (and the Paterson Talmud Torah), graduating at sixteen from the "Mechanic Arts Course" of Paterson Eastside High School in June 1935. He began his undergraduate education at Cooper Union, but to save his family the expense of his room and board he won admittance to the United States Naval Academy through a competitive exam. He graduated from the Academy in 1940.
Naval career
He served aboard US Navy ships during World War II, rising to the rank of lieutenant commander. Weber was the Officer of the Deck on the USS Lexington when the ship received word of the attack on Pearl Harbor. In the Battle of the Coral Sea his carrier sank the Japanese aircraft carrier Shōhō and was in turn mortally damaged on May 8, 1942. Weber often regaled his students with the story of how the Lexington glowed incandescent as she slipped beneath the waves.
Later, he commanded the sub-chaser SC-690, first in the Caribbean, and later in the Mediterranean Sea. In that role, he took part in the invasion of Sicily at Gela Beach, in July 1943.
He studied electronics at the Naval Postgraduate School in 1943-45, and from 1945 to 1948, he headed electronic countermeasures design for the Navy's Bureau of Ships, in Washington, DC. |
https://en.wikipedia.org/wiki/Mark%20Levinson%20ML-3 | The Mark Levinson ML-3 was a 200 watt per channel dual monaural Class AB2 power amplifier that used toroidal transformers. Produced between 1979 and 1987, the ML-3 consisted of two electrically separate amplifiers in one chassis, hence the name "Dual Monaural". It also featured discrete circuit construction; no integrated circuits were incorporated to keep the signal pure. The design was by Thomas P. Colangelo.
The ML-3 constituted the archetype of an American high-end, high-power amplifier.
Specifications
200 W/channel at 8 ohms, 400 W/ch at 4 ohms, 800 W/ch at 2 ohms
Maximum output: 45 volts 30 amperes
Two 1.2 kVA Avel Lindbergh toroidal transformers, 4 Sprague 36,000 µF, 100 V capacitors and 40 output devices (20 per channel)
Range: 20 Hz to 20 kHz with less than 0.2% total harmonic distortion
Gold-plated CAMAC input connectors
Adjustable AC voltage: No, factory set
Adjustable output damping toggle switches (one per channel) (in later models only)
Weight: 116 lb (56 kg)
External links
Mark Levinson Equipment History |
https://en.wikipedia.org/wiki/List%20of%20operating%20systems | This is a list of operating systems. Computer operating systems can be categorized by technology, ownership, licensing, working state, usage, and by many other characteristics. In practice, many of these groupings may overlap. The criterion for inclusion is notability, as shown either through an existing Wikipedia article or a citation to a reliable source.
Proprietary
Acorn Computers
Arthur
ARX
MOS
RISC iX
RISC OS
Amazon
Fire OS
Amiga Inc.
AmigaOS
AmigaOS 1.0-3.9 (Motorola 68000)
AmigaOS 4 (PowerPC)
Amiga Unix (a.k.a. Amix)
Amstrad
AMSDOS
Contiki
CP/M 2.2
CP/M Plus
SymbOS
Apple Inc.
Apple II family
Apple DOS
Apple Pascal
Apex (Colorado School of Mines)
ProDOS
GS/OS
GNO/ME
Contiki
Apple III
Apple SOS
Apple Lisa
Apple Macintosh
Classic Mac OS
A/UX (UNIX System V with BSD extensions)
Copland
MkLinux
Pink
Rhapsody
macOS (formerly Mac OS X and OS X)
macOS Server (formerly Mac OS X Server and OS X Server)
Apple Network Server
IBM AIX (Apple-customized)
Apple MessagePad
Newton OS
iPhone and iPod Touch
iOS (formerly iPhone OS)
iPad
iPadOS
Apple Watch
watchOS
Apple TV
tvOS
Embedded operating systems
bridgeOS
Apple Vision Pro
visionOS
Embedded operating systems
A/ROSE
iPod software (unnamed embedded OS for iPod)
Unnamed NetBSD variant for Airport Extreme and Time Capsule
Apollo Computer, Hewlett-Packard
Domain/OS – One of the first network-based systems. Run on Apollo/Domain hardware. Later bought by Hewlett-Packard.
Atari
Atari DOS (for 8-bit computers)
Atari TOS
Atari MultiTOS
Contiki (for 8-bit, ST, Portfolio)
BAE Systems
XTS-400
Be Inc.
BeOS
BeIA
BeOS r5.1d0
magnussoft ZETA (based on BeOS r5.1d0 source code, developed by yellowTAB)
Bell Labs
Unix ("Ken's new system," for its creator (Ken Thompson), officially Unics and then Unix, the prototypic operating system created in Bell Labs in 1969 that formed the basis for the Unix family of operating systems)
UNIX Time-Sharing System v1
UNIX Time-Sharing Sy |
https://en.wikipedia.org/wiki/Catabolism | Catabolism () is the set of metabolic pathways that breaks down molecules into smaller units that are either oxidized to release energy or used in other anabolic reactions. Catabolism breaks down large molecules (such as polysaccharides, lipids, nucleic acids, and proteins) into smaller units (such as monosaccharides, fatty acids, nucleotides, and amino acids, respectively). Catabolism is the breaking-down aspect of metabolism, whereas anabolism is the building-up aspect.
Cells use the monomers released from breaking down polymers to either construct new polymer molecules or degrade the monomers further to simple waste products, releasing energy. Cellular wastes include lactic acid, acetic acid, carbon dioxide, ammonia, and urea. The formation of these wastes is usually an oxidation process involving a release of chemical free energy, some of which is lost as heat, but the rest of which is used to drive the synthesis of adenosine triphosphate (ATP). This molecule acts as a way for the cell to transfer the energy released by catabolism to the energy-requiring reactions that make up anabolism.
Catabolism is a destructive metabolism and anabolism is a constructive metabolism. Catabolism, therefore, provides the chemical energy necessary for the maintenance and growth of cells. Examples of catabolic processes include glycolysis, the citric acid cycle, the breakdown of muscle protein in order to use amino acids as substrates for gluconeogenesis, the breakdown of fat in adipose tissue to fatty acids, and oxidative deamination of neurotransmitters by monoamine oxidase.
Catabolic hormones
There are many signals that control catabolism. Most of the known signals are hormones and the molecules involved in metabolism itself. Endocrinologists have traditionally classified many of the hormones as anabolic or catabolic, depending on which part of metabolism they stimulate. The so-called classic catabolic hormones known since the early 20th century are cortisol, glucagon, and |
https://en.wikipedia.org/wiki/Fobos-Grunt | Fobos-Grunt or Phobos-Grunt (, where грунт refers to the ground in the narrow geological meaning of any type of soil or rock exposed on the surface) was an attempted Russian sample return mission to Phobos, one of the moons of Mars. Fobos-Grunt also carried the Chinese Mars orbiter Yinghuo-1 and the tiny Living Interplanetary Flight Experiment funded by the Planetary Society.
It was launched on 8 November 2011, at 20:16 UTC, from the Baikonur Cosmodrome, but subsequent rocket burns intended to set the craft on a course for Mars failed, leaving it stranded in low Earth orbit. Efforts to reactivate the craft were unsuccessful, and it fell back to Earth in an uncontrolled re-entry on 15 January 2012, over the Pacific Ocean, west of Chile. The return vehicle was to have returned to Earth in August 2014, carrying up to of soil from Phobos.
Funded by the Russian Federal Space Agency and developed by Lavochkin and the Russian Space Research Institute, Fobos-Grunt was the first Russian-led interplanetary mission since the failed Mars 96. The last successful interplanetary missions were the Soviet Vega 2 in 1985–1986, and the partially successful Phobos 2 in 1988–1989. Fobos-Grunt was designed to become the first spacecraft to return a macroscopic sample from an extraterrestrial body since Luna 24 in 1976.
Project history
Budget
The cost of the project was 1.5 billion rubles (US$64.4 million). Project funding for the timeframe 2009–2012, including post-launch operations, was about 2.4 billion rubles. The total cost of the mission was to have been 5 billion rubles (US$163 million).
According to lead scientist Alexander Zakharov, the entire spacecraft and most of the instruments were new, though the designs drew upon the nation's legacy of three successful Luna missions, which in the 1970s retrieved a few hundred grams of Moon rocks. Zakharov had described the Phobos sample return project as "possibly the most difficult interplanetary one to date".
Development
The Fo |
https://en.wikipedia.org/wiki/Janzen%E2%80%93Connell%20hypothesis | The Janzen–Connell hypothesis is a widely accepted explanation for the maintenance of tree species biodiversity in tropical rainforests. It was published independently in the early 1970s by Daniel Janzen and Joseph Connell. According to their hypothesis, host-specific herbivores, pathogens, or other natural enemies (often referred to as predators) make the areas near a parent tree (the seed-producing tree) inhospitable for the survival of seedlings. These natural enemies are referred to as 'distance-responsive predators' if they kill seeds or seedlings near the parent tree, or 'density-dependent predators' if they kill seeds or seedlings where they are most abundant (which is typically near the parent tree). Such predators can prevent any one species from dominating the landscape, because if that species is too common, there will be few safe places for its seedlings to survive. However, because the predators are host-specific (also called specialists), they will not harm other tree species. As a result, if a species becomes very rare, then more predator-free areas will become available, giving that species' seedlings a competitive advantage. This negative feedback allows the tree species to coexist, and can be classified as a stabilizing mechanism.
Notably, Janzen–Connell effects provide a recruitment advantage to rare trees, since they act primarily on seeds and seedlings. These effects promote the establishment of rare tree species, but do nothing to ensure the survival of these species post-germination.
The Janzen–Connell hypothesis has been called a special case of keystone predation, predator partitioning or the pest pressure hypothesis. The pest pressure hypothesis states that plant diversity is maintained by specialist natural enemies. The Janzen–Connell hypothesis expands on this, by claiming that the natural enemies are not only specialists, but also are distance-responsive or density-responsive.
This mechanism has been proposed as promoting diversit |
https://en.wikipedia.org/wiki/Brazilian%20real%20%28old%29 | The first official currency of Brazil was the real (pronounced ; pl. réis), with the symbol Rs$. As the currency of the Portuguese empire, it was in use in Brazil from the earliest days of the colonial period, and remained in use until 1942, when it was replaced by the cruzeiro.
The name "real" was resurrected in 1994 for the new currency unit (but with the new plural form "reais"). This currency is still in use. One modern real is equivalent to 2.75 × 1018 (2.75 quintillion) of the old réis.
The name comes from the Portuguese word real (in the sense of "royal" or "regal") and was borrowed from a Portuguese currency previously used in Brazil.
The dollar-like sign in the currency's symbol (and in the symbols of all other Brazilian currencies), called cifrão in Portuguese, was always written with two vertical strokes rather than one.
History
The Portuguese real was the currency used by the first Portuguese settlers to arrive in the Americas, but the first official money to circulate bearing the name real was actually printed in 1654 by the Dutch, during their occupation of part of the Brazilian Northeast.
Until 1747 the Brazilian real was the same as the Portuguese real, the standard gold coin of 13.145 g fine gold being worth 6,400 réis. After that date, however, the Brazilian real started to become a separate currency unit when the value of that coin was raised by 10% in Brazil (but not in Portugal) to 7,040 réis. The values of both units diverged further in the 19th century, with the coin becoming worth 8,000 Portuguese réis in 1837 versus 16,000 Brazilian réis in 1846.
The real was retained when Brazil became independent in 1822. It was not subdivided into smaller units, and was affected by significant inflation during its long lifespan. The practical currency unit shifted from the real to the mil-réis ('one thousand réis') and then to the conto de réis (one million réis, literally 'one count of réis') in the final years of the First Brazilian Republic.
Amounts under 1,000 réis were typically written prefixed by "Rs", as in "Rs 350
https://en.wikipedia.org/wiki/Effective%20action | In quantum field theory, the quantum effective action is a modified expression for the classical action taking into account quantum corrections while ensuring that the principle of least action applies, meaning that extremizing the effective action yields the equations of motion for the vacuum expectation values of the quantum fields. The effective action also acts as a generating functional for one-particle irreducible correlation functions. The potential component of the effective action is called the effective potential, with the expectation value of the true vacuum being the minimum of this potential rather than the classical potential, making it important for studying spontaneous symmetry breaking.
It was first defined perturbatively by Jeffrey Goldstone and Steven Weinberg in 1962, while the non-perturbative definition was introduced by Bryce DeWitt in 1963 and independently by Giovanni Jona-Lasinio in 1964.
The article describes the effective action for a single scalar field; however, similar results exist for multiple scalar or fermionic fields.
Generating functionals
These generating functionals also have applications in statistical mechanics and information theory, with slightly different factors of i and sign conventions.
A quantum field theory with action S[φ] can be fully described in the path integral formalism using the partition functional Z[J] = ∫ 𝒟φ exp(iS[φ] + i∫d⁴x J(x)φ(x))
Since it corresponds to vacuum-to-vacuum transitions in the presence of a classical external current J(x), it can be evaluated perturbatively as the sum of all connected and disconnected Feynman diagrams. It is also the generating functional for correlation functions
where the scalar field operators are denoted by φ̂(x). One can define another useful generating functional W[J] = −i ln Z[J], responsible for generating connected correlation functions
which is calculated perturbatively as the sum of all connected diagrams. Here connected is interpreted in the sense of the cluster decomposition, meaning that the correlation functions approa |
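The chain of generating functionals introduced in this passage can be summarized compactly for a single scalar field. Signs and factors of i follow one common QFT convention and should be checked against any particular text:

```latex
% Partition functional: sum of all (connected and disconnected) diagrams
Z[J] = \int \mathcal{D}\phi\, \exp\!\Big(i S[\phi] + i\!\int\! d^4x\, J(x)\,\phi(x)\Big)

% Generator of connected correlation functions
W[J] = -i \ln Z[J]

% Effective action: Legendre transform of W[J], generator of 1PI functions
\Gamma[\varphi] = W[J_\varphi] - \int d^4x\, J_\varphi(x)\,\varphi(x),
\qquad
\varphi(x) = \frac{\delta W[J]}{\delta J(x)}
```

Extremizing Γ[φ] then yields the equations of motion for the vacuum expectation values, as stated at the top of the article.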
https://en.wikipedia.org/wiki/RNA%20velocity | RNA velocity bridges measurements to an underlying mechanism, mRNA splicing, whose two modes (unspliced and spliced transcripts) indicate a cell's current and future state. It is a method used to predict the future gene expression of a cell based on the measurement of both spliced and unspliced transcripts of mRNA.
RNA velocity can be used to infer the direction of gene expression changes in single-cell RNA sequencing (scRNA-seq) data. It provides insights into the future state of individual cells by using the ratio of unspliced to spliced RNA transcripts. This ratio can indicate the transcriptional dynamics and potential fate of a cell, such as whether it is transitioning from one cell type to another or undergoing differentiation.
Software usage
There are several software tools available for RNA velocity analysis. Each of these tools has its own strengths and applications, so the choice of tool depends on the specific requirements of the analysis:
velocyto
Velocyto is a package for the analysis of expression dynamics in single-cell RNA-seq data. In particular, it enables estimation of the RNA velocities of single cells by distinguishing unspliced and spliced mRNAs in standard single-cell RNA sequencing protocols. The accompanying paper was the first to propose the concept of RNA velocity. velocyto predicts RNA velocity by solving the proposed differential equations for each gene. The authors envision future manifold learning algorithms that simultaneously fit a manifold and the kinetics on that manifold, on the basis of RNA velocity.
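The differential equations referred to here are, as published in the RNA velocity literature, du/dt = α − βu for unspliced and ds/dt = βu − γs for spliced mRNA of one gene, with the velocity defined as ds/dt. A minimal numerical sketch (the rate parameters below are illustrative assumptions, not fitted values):

```python
# Per-gene splicing kinetics:
#   du/dt = alpha - beta*u    (transcription vs. splicing)
#   ds/dt = beta*u - gamma*s  (splicing vs. degradation)
# The RNA velocity is ds/dt; it is positive during up-regulation and
# decays to zero as the gene reaches its steady state.
alpha, beta, gamma = 5.0, 0.5, 0.2  # illustrative rates
u, s, dt = 0.0, 0.0, 0.01           # start from an induced (off -> on) gene

trajectory = []
for _ in range(200_000):  # integrate long enough to reach steady state
    velocity = beta * u - gamma * s
    u += (alpha - beta * u) * dt
    s += velocity * dt
    trajectory.append((u, s, velocity))

u_ss, s_ss, v_ss = trajectory[-1]
print(u_ss, s_ss, v_ss)  # -> approaches alpha/beta = 10, alpha/gamma = 25, 0
```

At steady state the unspliced/spliced ratio settles to β/γ scaled levels (u* = α/β, s* = α/γ), which is why deviations of that ratio from equilibrium signal future up- or down-regulation.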
scVelo
scVelo is a method that solves the full transcriptional dynamics of splicing kinetics using a likelihood-based dynamical model. This generalizes RNA velocity to systems with transient cell states, which are common in development and in response to perturbations. scVelo was applied to disentangle subpopulation kinetics in neurogenesis and pancreatic endocrinogenesis. scVelo demonstrates the capabilities of the dynamical model on various cell lineages in hip
https://en.wikipedia.org/wiki/Hamlet%20chicken%20processing%20plant%20fire | On September 3, 1991, an industrial fire caused by a failed improvised repair to a hydraulic line destroyed the Imperial Food Products chicken processing plant in Hamlet, North Carolina. Despite three previous fires in 11 years of operation, the plant had never received a safety inspection. The fire killed 25 people and injured 54, many of whom were unable to escape due to locked exits. It was the second deadliest industrial disaster in North Carolina's history.
Imperial Food Products was a corporation owned by Emmett Roe, who acquired the Hamlet facility in 1980 to produce chicken products. The company had a poor safety record at one of its other plants, and the Hamlet building lacked a fire alarm or an operational fire sprinkler system. For reasons that remain disputed, Roe ordered several exterior doors of the plant locked in the summer of 1991—including a labeled fire exit—in violation of federal safety regulations and without notifying most workers. In September, the plant's maintenance workers attempted to replace a leaking hydraulic line, attached to the conveyor belt which fed chicken tenders into a fryer in the processing room, with improvised parts. On September 3 at around 8:15 am, they turned on the conveyor belt after altering the line; it separated from its connection and spewed hydraulic fluid around the room. The fluid vaporized and was ignited by the fryer's flame. Fire engulfed the facility in minutes, severing telephone lines and filling the plant with hydrocarbon-charged smoke and carbon monoxide.
There were 90 workers in the plant at the time. Some were able to escape through the plant's front door, while others could not leave due to locked or obstructed exits. Brad Roe (Emmett's son and the company's operations manager) drove to the local fire station for help since the telephone line had burned; firefighters reached the scene at 8:27 am and sent a mutual aid call to other fire departments. Over 100 medical and emergency service personnel ul |
https://en.wikipedia.org/wiki/Russula%20vinosa | Russula vinosa, commonly known in English as the darkening brittlegill, is a species of basidiomycete mushroom found in coniferous woodlands in Europe and North America in summer and early autumn. Unlike many red-capped members of the russula genus, it is edible and mild-tasting. It is usually understood to have a symbiotic relationship with evergreen tree roots, except for in mountainous areas where it has occasionally associated with birches.
Taxonomy
Russula vinosa was originally described in the Swedish guide Svampbok (lit. "Mushroom Book"), written by Dr. M. A. Lindblad for publication in 1901. Romell, who coined the synonymous Russula obscura, was an editor for the 1913 release of the text. The specific epithet vinosa is derived from the Latin vinum, "wine", likely alluding to the wine-colored cap of this species, which is capable of acting as a dye.
Description
The cap is concave and wine to red-brown in colour, often fading to a pale white or tan in the center with age. The widely spaced gills are white, and adnexed or free. The stipe is cylindrical and white or cream colored. The brittle flesh is light and the taste is mild.
Similar species
The red cap of Russula vinosa is almost impossible to distinguish visually from those of toxic and inedible red-capped Russulas, such as the bloody brittlegill (R. sanguinaria), the sickener (R. emetica), and the beechwood sickener (R. nobilis). It may also be confused with similar edible species such as Russula paludosa and Russula decolorans. It is therefore important to identify the mushroom with absolute certainty before consumption. Chinese and Southeast Asian populations of R. vinosa have been determined to be genetically distinct enough from R. vinosa to be placed in a separate, but anatomically identical species, R. griseocarnosa.
Distribution and habitat
Russula vinosa is found in Europe and North America. It is known from Great Britain, Southern Europe, New England, and Fennoscandia. It usually |
https://en.wikipedia.org/wiki/Carboxypeptidase%20A | Carboxypeptidase A usually refers to the pancreatic exopeptidase that hydrolyzes peptide bonds of C-terminal residues with aromatic or aliphatic side-chains. Most scientists in the field now refer to this enzyme as CPA1, and to a related pancreatic carboxypeptidase as CPA2.
Types
In addition, there are 4 other mammalian enzymes named CPA-3 through CPA-6, and none of these are expressed in the pancreas. Instead, these other CPA-like enzymes have diverse functions.
CPA3 (also known as mast-cell CPA) is involved in the digestion of proteins by mast cells.
CPA4 (previously known as CPA-3, but renumbered when mast-cell CPA was designated CPA-3) may be involved in tumor progression, but this enzyme has not been well studied.
CPA5 has not been well studied.
CPA6 is expressed in many tissues during mouse development, and in the adult shows a more limited distribution in the brain and several other tissues. CPA6 is present in the extracellular matrix, where it is enzymatically active. A human mutation in CPA6 has been linked to Duane syndrome (abnormal eye movement). More recently, mutations in CPA6 were found to be linked to epilepsy. CPA6 is also one of several enzymes which degrade enkephalins.
Function
CPA-1 and CPA-2 (and, it is presumed, all other CPAs) employ a zinc ion within the protein for hydrolysis of the peptide bond at the C-terminal end of an amino acid residue. Loss of the zinc ion leads to loss of activity; the zinc can easily be replaced, both by zinc and by some other divalent metals (cobalt, nickel), restoring activity. Carboxypeptidase A is produced in the pancreas and is crucial to many processes in the human body, including digestion, post-translational modification of proteins, blood clotting, and reproduction.
Applications
This vast scope of functionality for a single protein makes it the ideal model for research regarding other zinc proteases of unknown structure. Recent biomedical research on collagenase, enkephalinase, and angiotensin-converting enzyme used carboxy |
https://en.wikipedia.org/wiki/The%20monkey%20and%20the%20coconuts | The monkey and the coconuts is a mathematical puzzle in the field of Diophantine analysis that originated in a fictional magazine short story involving five sailors and a monkey on a desert island who divide up a pile of coconuts; the problem is to find the number of coconuts in the original pile (fractional coconuts not allowed). The problem is notorious for its confounding difficulty for unsophisticated puzzle solvers, though with the proper mathematical approach the solution is trivial. The problem has become a staple in recreational mathematics collections.
General description
The problem can be expressed as:
There is a pile of coconuts, owned by five men. One man divides the pile into five equal piles, giving the one leftover coconut to a passing monkey, and takes away his own share. The second man then repeats the procedure, dividing the remaining pile into five and taking away his share, as do the third, fourth, and fifth, each of them finding one coconut left over when dividing the pile by five and giving it to the monkey. Finally, the group divides the remaining coconuts into five equal piles; this time no coconuts are left over.
How many coconuts were there in the original pile?
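The puzzle as stated, five divisions each leaving one coconut for the monkey, then an exact final division, yields to direct search; the smallest solution is 3,121 coconuts:

```python
def survives(n: int, sailors: int = 5) -> bool:
    """Check whether a pile of n coconuts satisfies the puzzle as stated:
    each of the 5 sailors finds remainder 1 (the monkey's coconut), removes
    his fifth of the rest, and the final morning division is exact."""
    for _ in range(sailors):
        if n % sailors != 1:
            return False
        n = (n - 1) * (sailors - 1) // sailors  # monkey's coconut + one share gone
    return n % sailors == 0

smallest = next(n for n in range(1, 10**6) if survives(n))
print(smallest)  # -> 3121
```

Tracing 3121 through the nights: 3121 → 2496 → 1996 → 1596 → 1276 → 1020, and 1020 splits evenly into five piles of 204.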
The monkey and the coconuts is the best known representative of a class of puzzle problems requiring integer solutions structured as recursive division or fractionating of some discretely divisible quantity, with or without remainders, and a final division into some number of equal parts, possibly with a remainder. The problem is so well known that the entire class is often referred to broadly as "monkey and coconut type problems", though most are not closely related to the problem.
Another example: "I have a whole number of pounds of cement, I know not how many, but after addition of a ninth and an eleventh, it was partitioned into 3 sacks, each with a whole number of pounds. How many pounds of cement did I have?"
Problems ask for either the initial or terminal quantity. |
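The cement example yields to the same brute-force treatment. On the natural reading that the added ninth and eleventh must themselves be whole numbers of pounds (an assumption; the puzzle statement is ambiguous), the smallest answer is 297 pounds, giving three sacks of 119 pounds each:

```python
def cement_pounds() -> int:
    """Smallest whole x such that x/9 and x/11 are whole (the added 'ninth'
    and 'eleventh') and the enlarged total splits into 3 equal whole-pound
    sacks. This reading of the puzzle is an assumption."""
    x = 1
    while True:
        if x % 9 == 0 and x % 11 == 0:
            total = x + x // 9 + x // 11
            if total % 3 == 0:
                return x
        x += 1

x = cement_pounds()
print(x, (x + x // 9 + x // 11) // 3)  # -> 297 119
```

Equivalently, x must be a multiple of lcm(9, 11) = 99 with 119·(x/99) divisible by 3, so x = 297 is forced.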
https://en.wikipedia.org/wiki/TOM%20%28mascot%29 | TOM (Tigers Of Memphis) is the name of three Bengal tigers which have served as the mascot of the Memphis Tigers since 1972. The most recent, TOM III, was the University of Memphis's live Bengal tiger mascot during one of the most successful periods in university and athletics history. He died on September 18, 2020, less than three weeks after his 12th birthday. The Tigers' football team also has a costumed mascot called Pouncer.
TOM III was housed and cared for by the Tiger Guard, a committee of the Highland Hundred football booster club. University funds are not used to provide for the tiger's needs. The University of Memphis was one of two universities in the United States that use a live tiger as a mascot (the other being LSU) and has received criticism from animal welfare organizations.
Until TOM II, Memphis was the only school to have a live tiger mascot present at football games. TOM attends Memphis Tiger home games at Liberty Bowl Memorial Stadium in a special soundproof, air-conditioned trailer.
History
TOM I
The first tiger, TOM, was purchased for $1,500 by the Highland Hundred Football Boosters in 1972. TOM was placed in a dog kennel and flown to Memphis on November 9, 1972. The tiger cub was taken to Athletic Director Billy J. Murphy's office for a press conference and was officially presented to Memphis University in a Liberty Bowl Memorial Stadium ceremony during the November 11, 1972, football game against the University of Cincinnati.
TOM was initially named Shane at the suggestion of the breeder's daughter. Once in Memphis, a contest was held to rename the mascot, and over 2,500 entries were submitted to a committee chaired by Harry Pierotti. The list was reduced to two choices, Shane and TOM (which stands for Tigers Of Memphis), and TOM was the victor. The winning entry was submitted by Mrs. Lauraine Huddleston of Memphis.
During his first few months in Memphis, TOM was housed in Highland Hundred member Bill Proctor's garage. TOM was lat |
https://en.wikipedia.org/wiki/Staining | Staining is a technique used to enhance contrast in samples, generally at the microscopic level. Stains and dyes are frequently used in histology (microscopic study of biological tissues), in cytology (microscopic study of cells), and in the medical fields of histopathology, hematology, and cytopathology that focus on the study and diagnoses of diseases at the microscopic level. Stains may be used to define biological tissues (highlighting, for example, muscle fibers or connective tissue), cell populations (classifying different blood cells), or organelles within individual cells.
In biochemistry, it involves adding a class-specific (DNA, proteins, lipids, carbohydrates) dye to a substrate to qualify or quantify the presence of a specific compound. Staining and fluorescent tagging can serve similar purposes. Biological staining is also used to mark cells in flow cytometry, and to flag proteins or nucleic acids in gel electrophoresis. Light microscopes are used for viewing stained samples at high magnification, typically using bright-field or epi-fluorescence illumination.
Staining is not limited to only biological materials, since it can also be used to study the structure of other materials; for example, the lamellar structures of semi-crystalline polymers or the domain structures of block copolymers.
In vivo vs In vitro
In vivo staining (also called vital staining or intravital staining) is the process of dyeing living tissues. By causing certain cells or structures to take on contrasting colours, their form (morphology) or position within a cell or tissue can be readily seen and studied. The usual purpose is to reveal cytological details that might otherwise not be apparent; however, staining can also reveal where certain chemicals or specific chemical reactions are taking place within cells or tissues.
In vitro staining involves colouring cells or structures that have been removed from their biological context. Certain stains are often combined to reveal mo |
https://en.wikipedia.org/wiki/Unibranch%20local%20ring | In algebraic geometry, a local ring A is said to be unibranch if the reduced ring Ared (obtained by quotienting A by its nilradical) is an integral domain, and the integral closure B of Ared is also a local ring. A unibranch local ring is said to be geometrically unibranch if the residue field of B is a purely inseparable extension of the residue field of Ared. A complex variety X is called topologically unibranch at a point x if for all complements Y of closed algebraic subsets of X there is a fundamental system of neighborhoods (in the classical topology) of x whose intersection with Y is connected.
In particular, a normal ring is unibranch. The notions of unibranch and geometrically unibranch points are used in some theorems in algebraic geometry. For example, there is the following result:
Theorem. Let X and Y be two integral locally noetherian schemes and f : X → Y a proper dominant morphism. Denote their function fields by K(X) and K(Y), respectively. Suppose that the algebraic closure of K(Y) in K(X) has separable degree n and that y ∈ Y is unibranch. Then the fiber f⁻¹(y) has at most n connected components. In particular, if f is birational, then the fibers of f over unibranch points are connected.
In EGA, the theorem is obtained as a corollary of Zariski's main theorem. |
https://en.wikipedia.org/wiki/Alan%20M.%20Frieze | Alan M. Frieze (born 25 October 1945 in London, England) is a professor in the Department of Mathematical Sciences at Carnegie Mellon University, Pittsburgh, United States. He graduated from the University of Oxford in 1966, and obtained his PhD from the University of London in 1975. His research interests lie in combinatorics, discrete optimisation and theoretical computer science. Currently, he focuses on the probabilistic aspects of these areas; in particular, the study of the asymptotic properties of random graphs, the average case analysis of algorithms, and randomised algorithms. His recent work has included approximate counting and volume computation via random walks; finding edge disjoint paths in expander graphs, and exploring anti-Ramsey theory and the stability of routing algorithms.
Key contributions
Two key contributions made by Alan Frieze are:
(1) polynomial time algorithm for approximating the volume of convex bodies
(2) algorithmic version for Szemerédi regularity lemma
Both these algorithms will be described briefly here.
Polynomial time algorithm for approximating the volume of convex bodies
The paper is a joint work by Martin Dyer, Alan Frieze and Ravindran Kannan.
The main result of the paper is a randomised algorithm for finding an approximation to the volume of a convex body K in n-dimensional Euclidean space, assuming the existence of a membership oracle. The algorithm takes time bounded by a polynomial in n, the dimension of K, and 1/ε, the desired relative error.
The algorithm is a sophisticated usage of the so-called Markov chain Monte Carlo (MCMC) method.
The basic scheme of the algorithm is nearly uniform sampling from within K by placing a grid consisting of n-dimensional cubes and doing a random walk over these cubes. By using the theory of rapidly mixing Markov chains, they show that it takes polynomial time for the random walk to settle down to a nearly uniform distribution.
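The grid walk can be illustrated in two dimensions with a membership oracle for the unit disk. This is a toy sketch of the sampling step only; the full volume algorithm chains many such walks over a nested sequence of bodies and is far more involved:

```python
import random

def in_body(x: float, y: float) -> bool:
    """Membership oracle: the unit disk. The walk only ever queries this."""
    return x * x + y * y <= 1.0

def grid_walk(steps: int, h: float = 0.05, seed: int = 0):
    """Lazy random walk over grid cells of side h inside the body.
    Proposed moves leaving the body are rejected (the walk stays put);
    laziness plus symmetric proposals makes the stationary distribution
    nearly uniform over the feasible cells."""
    rng = random.Random(seed)
    cx, cy = 0, 0  # start at the centre cell
    visited = set()
    for _ in range(steps):
        if rng.random() < 0.5:  # lazy step: stay put half the time
            visited.add((cx, cy))
            continue
        dx, dy = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
        nx, ny = cx + dx, cy + dy
        if in_body(nx * h, ny * h):  # oracle call on the proposed cell centre
            cx, cy = nx, ny
        visited.add((cx, cy))
    return visited

cells = grid_walk(50_000)
print(len(cells), "distinct cells visited, all inside the disk")
```

The rapid-mixing analysis in the paper is what turns "run the walk long enough" into a polynomial bound on the number of steps.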
Algorithmic version for Szemerédi regularity partition
This paper
is a comb |
https://en.wikipedia.org/wiki/Timeline%20of%20quantum%20mechanics | The timeline of quantum mechanics is a list of key events in the history of quantum mechanics, quantum field theories and quantum chemistry.
19th century
1801 – Thomas Young establishes that light is made up of waves with his double-slit experiment.
1859 – Gustav Kirchhoff introduces the concept of a blackbody and proves that its emission spectrum depends only on its temperature.
1860-1900 – Ludwig Eduard Boltzmann, James Clerk Maxwell and others develop the theory of statistical mechanics. Boltzmann argues that entropy is a measure of disorder.
1877 – Boltzmann suggests that the energy levels of a physical system could be discrete based on statistical mechanics and mathematical arguments; also produces the first circle diagram representation, or atomic model of a molecule (such as an iodine gas molecule) in terms of the overlapping terms α and β, later (in 1928) called molecular orbitals, of the constituting atoms.
1885 – Johann Jakob Balmer discovers a numerical relationship between visible spectral lines of hydrogen, the Balmer series.
1887 – Heinrich Hertz discovers the photoelectric effect, shown by Einstein in 1905 to involve quanta of light.
1888 – Hertz demonstrates experimentally that electromagnetic waves exist, as predicted by Maxwell.
1888 – Johannes Rydberg modifies the Balmer formula to include all spectral series of lines for the hydrogen atom, producing the Rydberg formula which is employed later by Niels Bohr and others to verify Bohr's first quantum model of the atom.
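The Rydberg formula for hydrogen, 1/λ = R_H (1/n₁² − 1/n₂²), reproduces Balmer's visible lines when n₁ = 2. A quick check (the R_H value is quoted from standard tables, not from this timeline):

```python
R_H = 1.0967758e7  # Rydberg constant for hydrogen, in 1/m (standard tables)

def wavelength_nm(n1: int, n2: int) -> float:
    """Vacuum wavelength of the hydrogen line for the n2 -> n1 transition."""
    inv_lambda = R_H * (1.0 / n1**2 - 1.0 / n2**2)
    return 1e9 / inv_lambda

# Balmer series: transitions down to n1 = 2 give the visible lines
for n2 in range(3, 7):
    print(f"n={n2} -> 2: {wavelength_nm(2, n2):.1f} nm")
# H-alpha ~656 nm (red), H-beta ~486 nm, H-gamma ~434 nm, H-delta ~410 nm
```

Bohr's 1913 atomic model later derived R_H from fundamental constants, which is the verification the timeline alludes to.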
1895 – Wilhelm Conrad Röntgen discovers X-rays in experiments with electron beams in gas-discharge tubes.
1896 – Antoine Henri Becquerel accidentally discovers radioactivity while investigating the work of Wilhelm Conrad Röntgen; he finds that uranium salts emit radiation that resembled Röntgen's X-rays in their penetrating power. In one experiment, Becquerel wraps a sample of a phosphorescent substance, potassium uranyl sulfate, in photographic plates surrounded by very thick black paper i |
https://en.wikipedia.org/wiki/Thermally%20stimulated%20depolarization%20current | Thermally stimulated depolarization current (TSDC) is a scientific technique used to measure dielectric properties of materials. It can be used to measure the thermally stimulated depolarization of molecules within a material. One method of doing so is to place the material between two electrodes, cool the material in the presence of an external electric field, remove the field once a desired temperature has been reached, and measure the current between the electrodes as the material warms. The external electric field must be applied at a sufficiently high temperature to allow the molecular dipoles time to align with the field. Because the dielectric relaxation time increases exponentially on cooling, the polarization caused by their alignment with the field gets "frozen in". When the field is removed and the material begins to warm, the dipoles begin to "thaw", thereby losing their net alignment, and thus the material becomes depolarized. This depolarization can be measured if the material is sandwiched between two ohmic electrodes and the current is measured on warming. As the material depolarizes, charges are pulled to (or pushed away from) the electrodes, which causes a current through the measuring device.
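For a single Debye relaxation with an Arrhenius relaxation time, the depolarization current during a linear temperature ramp follows the standard first-order expression J(T) = (P₀/τ(T))·exp(−(1/b)∫dT′/τ(T′)). A numerical sketch (all parameter values are illustrative assumptions) shows the characteristic asymmetric TSDC peak:

```python
import math

# Illustrative parameters (assumptions, not measured values)
E_a  = 0.5       # activation energy, eV
tau0 = 1e-12     # pre-exponential relaxation time, s
k_B  = 8.617e-5  # Boltzmann constant, eV/K
b    = 0.1       # heating rate, K/s
P0   = 1.0       # frozen-in polarization, arbitrary units

def tau(T: float) -> float:
    """Arrhenius relaxation time."""
    return tau0 * math.exp(E_a / (k_B * T))

# J(T) = (P0 / tau(T)) * exp(-(1/b) * integral of dT'/tau(T') from T0 to T)
dT = 0.05
temps, currents, integral = [], [], 0.0
T = 150.0
while T <= 260.0:
    integral += dT / tau(T)
    J = (P0 / tau(T)) * math.exp(-integral / b)
    temps.append(T)
    currents.append(J)
    T += dT

peak = max(range(len(currents)), key=currents.__getitem__)
print(f"peak near {temps[peak]:.1f} K")  # interior maximum, roughly 180 K here
```

The current rises as dipoles thaw, peaks where the relaxation rate catches up with the heating rate, then falls to zero once the material is fully depolarized; fitting this shape is how activation energies are extracted from TSDC spectra.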
https://en.wikipedia.org/wiki/Internet%20genre | Internet genre refers to a type of genre explored in multimedia studies. Others include film genres, video game genres and music genres. Genre, in genre studies, refers to classification based on similarities in the narrative elements from which media texts are constructed.
Types of Internet genres
There are various genres of Internet services.
Personal homepage
Personal homepages are regularly updated and allow people to connect with those they know by leaving messages and joining buddy lists.
Message boards
A message board is one of the most familiar genres of online gathering place, which is asynchronous, meaning people do not have to be in the same place at the same time to have a conversation. With this genre posts can be accessed at any time and it is easy to ignore undesirable content.
E-mail lists and newsletters
An electronic mailing list or email list is a special usage of email that allows for widespread distribution of information to many Internet users. It is the easiest kind of online gathering place to create, maintain and participate in.
Chat groups
A chat group is a place where people can chat synchronously, communicating in the same place at the same time. Many micro-blogging platforms now function like chat groups, such as Twitter. The first online chat system was called Talkomatic, created by Doug Brown and David R. Woolley in 1973 on the PLATO System at the University of Illinois. It offered several channels, each of which could accommodate up to five people, with messages appearing on all users' screens character-by-character as they were typed. Talkomatic was very popular among PLATO users into the mid-1980s.
Virtual worlds
A virtual world takes the form of a computer-based simulated environment through which users can interact with one another and use and create objects. The term has become largely synonymous with interactive 3D virtual environments, where the users take the form of avatars visible to others.
Weblogs and directories
W |
https://en.wikipedia.org/wiki/Spin%20Hall%20magnetoresistance | Spin Hall magnetoresistance (SMR) is a transport phenomenon found in some electrical conductors that have at least one surface in direct contact with a magnetic material. It arises from changes in the spin current present in metals and semiconductors with a large spin Hall angle. It is most easily detected when the magnetic material is an insulator, which eliminates other magnetically sensitive transport effects arising from conduction in the magnetic material.
Origins
Spin Hall magnetoresistance is one of many ways in which the electrical resistance of a material is influenced by the spin Hall effect. An electron moving through a conductor is scattered by the spin Hall effect in a direction determined by its spin orientation, which induces a net accumulation of spin at the conductor's edge. The spin-polarized electrons at the conductor's surface are able to interact with the magnetization of a magnetic material in close proximity through a spin-transfer torque. When the conduction electron's spin is aligned parallel to the magnetization direction, the electron reflects from the conductor surface with no change in its spin; however, when there is a component of the magnetization normal to the spin orientation, the spin can be flipped to its opposite state, transferring angular momentum into the magnetic material. This results in a spin current that travels normal to the direction of the charge current and that can be altered by changing the direction of magnetization. This spin current is deflected through the inverse spin Hall effect, which adds to or subtracts from the electron's momentum in the direction of the charge current, depending on the size and sign of the conductor's spin Hall angle. This deflection contributes to the conductor's resistivity, allowing the spin current to be estimated from the change in the electrical resistivity.
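A common phenomenological parametrization used in the SMR literature (the resistivity values below are illustrative assumptions, not measured data) captures how the longitudinal resistivity depends only on the magnetization component transverse to the current:

```python
import math

# Longitudinal SMR resistivity: rho_long = rho0 + d_rho1 * (1 - m_y**2),
# where m_y is the in-plane magnetization component transverse to the
# charge current. rho0 and d_rho1 (the SMR amplitude) are illustrative
# numbers loosely in the range of a heavy-metal/magnetic-insulator bilayer.
rho0, d_rho1 = 400.0, 0.4  # assumed values, e.g. in nOhm*m

def rho_long(alpha_deg: float) -> float:
    """Resistivity vs. in-plane magnetization angle (0 = along the current)."""
    m_y = math.sin(math.radians(alpha_deg))
    return rho0 + d_rho1 * (1.0 - m_y**2)

for a in (0, 45, 90, 135, 180):
    print(a, round(rho_long(a), 3))
# maximum when M lies along the current (m_y = 0), minimum at 90 degrees,
# repeating with a 180-degree period
```

The 180° period in the magnetization angle, rather than the 360° period of ordinary anisotropic magnetoresistance in the magnet itself, is one experimental signature used to identify SMR.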
Description
To construct a device that exhibits the spin Hall magnetoresistance, a multilayer of conductor |
https://en.wikipedia.org/wiki/Infraclavicular%20fossa | The infraclavicular fossa is an indentation, or fossa, immediately below the clavicle, above the third rib, and between the deltoid muscle laterally and the midclavicular line medially.
See also
Supraclavicular fossa |
https://en.wikipedia.org/wiki/Wafer-level%20packaging | Wafer-level packaging (WLP) is a process in which packaging components are attached to an integrated circuit (IC) before the wafer – on which the IC is fabricated – is diced. In WLP, the top and bottom layers of the packaging and the solder bumps are attached to the integrated circuits while they are still in the wafer. This process differs from a conventional process, in which the wafer is sliced into individual circuits (dice) before the packaging components are attached.
WLP is essentially a true chip-scale package (CSP) technology, since the resulting package is practically of the same size as the die. Wafer-level packaging allows integration of wafer fab, packaging, test, and burn-in at wafer level in order to streamline the manufacturing process undergone by a device from silicon start to customer shipment. There is no single industry-standard method of wafer-level packaging at present.
A major application area of WLPs is smartphones, due to their size constraints. For example, the Apple iPhone 5 has at least eleven different WLPs, the Samsung Galaxy S3 has six, and the HTC One X has seven. Functions provided by WLPs in smartphones include sensors, power management, wireless connectivity, etc. It has also been rumored that the iPhone 7 will use fan-out wafer-level packaging technology in order to achieve a thinner and lighter model.
Wafer-level chip scale packaging (WL-CSP) is the smallest package currently available on the market and is produced by OSAT (Outsourced Semiconductor Assembly and Test) companies, such as Advanced Semiconductor Engineering (ASE). A WL-CSP or WLCSP package is just a bare die with a redistribution layer (RDL, interposer or I/O pitch) to rearrange the pins or contacts on the die so that they can be big enough and have sufficient spacing so that they can be handled just like a ball grid array (BGA) package.
There are two kinds of wafer level packaging: fan-in and fan-out. Fan-in WLCSP packages have an interposer that is the same size |
https://en.wikipedia.org/wiki/Qigong | Qigong () is a system of coordinated body posture and movement, breathing, and meditation used for the purposes of health, spirituality, and martial arts training. With roots in Chinese medicine, philosophy, and martial arts, qigong is traditionally viewed by the Chinese and throughout Asia as a practice to cultivate and balance qi (pronounced approximately as "chee"), translated as "life energy".
Qigong practice typically involves moving meditation, coordinating slow-flowing movement, deep rhythmic breathing, and a calm meditative state of mind. People practice qigong throughout China and worldwide for recreation, exercise, relaxation, preventive medicine, self-healing, alternative medicine, meditation, self-cultivation, and training for martial arts.
Etymology
Qigong (Pinyin), ch'i kung (Wade-Giles), and chi gung (Yale) are romanizations of two Chinese words "qì" and "gōng" (). Qi primarily means air, gas or breath but is often translated as a metaphysical concept of 'vital energy', referring to a supposed energy circulating through the body; though a more general definition is universal energy, including heat, light, and electromagnetic energy; and definitions often involve breath, air, gas, or the relationship between matter, energy, and spirit. Qi is the central underlying principle in traditional Chinese medicine and martial arts. Gong (or kung) is often translated as cultivation or work, and definitions include practice, skill, mastery, merit, achievement, service, result, or accomplishment, and is often used to mean gongfu (kung fu) in the traditional sense of achievement through great effort. The two words are combined to describe systems to cultivate and balance life energy, especially for health and wellbeing.
The term qigong as currently used was promoted in the late 1940s through the 1950s to refer to a broad range of Chinese self-cultivation exercises, and to emphasize health and scientific approaches, while de-emphasizing spiritual practices, mys |
https://en.wikipedia.org/wiki/Geometric%20transformation | In mathematics, a geometric transformation is any bijection of a set to itself (or to another such set) with some salient geometrical underpinning. More specifically, it is a function whose domain and range are sets of points — most often both in the Euclidean plane or both in three-dimensional space — such that the function is bijective, so that its inverse exists. The study of geometry may be approached by the study of these transformations.
Classifications
Geometric transformations can be classified by the dimension of their operand sets (thus distinguishing between, say, planar transformations and spatial transformations). They can also be classified according to the properties they preserve:
Displacements preserve distances and oriented angles (e.g., translations);
Isometries preserve angles and distances (e.g., Euclidean transformations);
Similarities preserve angles and ratios between distances (e.g., resizing);
Affine transformations preserve parallelism (e.g., scaling, shear);
Projective transformations preserve collinearity;
Each of these classes contains the previous one.
Möbius transformations using complex coordinates on the plane (as well as circle inversion) preserve the set of all lines and circles, but may interchange lines and circles.
Conformal transformations preserve angles, and are, in the first order, similarities.
Equiareal transformations preserve areas in the planar case (or volumes in the three-dimensional case) and are, in the first order, affine transformations of determinant 1.
Homeomorphisms (bicontinuous transformations) preserve the neighborhoods of points.
Diffeomorphisms (bidifferentiable transformations) are the transformations that are affine in the first order; they contain the preceding ones as special cases, and can be further refined.
Transformations of the same type form groups that may be sub-groups of other transformation groups.
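The classification above can be checked computationally: apply a transformation to a few points and test which properties survive. The following Python sketch does this for a rotation (an isometry) and a shear (affine but not an isometry); the particular matrices and points are arbitrary illustrative choices, not taken from the article.

```python
import math

def apply(M, p):
    """Apply a 2x2 matrix M (given as a pair of rows) to a point p."""
    (a, b), (c, d) = M
    x, y = p
    return (a * x + b * y, c * x + d * y)

def dist(p, q):
    """Euclidean distance between two points in the plane."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

rotation = ((math.cos(0.5), -math.sin(0.5)),
            (math.sin(0.5),  math.cos(0.5)))   # an isometry
shear    = ((1.0, 1.0),
            (0.0, 1.0))                         # affine, not an isometry

p, q = (1.0, 0.0), (0.0, 1.0)

# The rotation preserves distances ...
print(math.isclose(dist(apply(rotation, p), apply(rotation, q)),
                   dist(p, q)))  # True
# ... the shear does not, although (being affine) it still maps
# parallel lines to parallel lines.
print(math.isclose(dist(apply(shear, p), apply(shear, q)),
                   dist(p, q)))  # False
```

A similar test for each class in the list (angles for similarities, collinearity for projective maps, and so on) would reproduce the nesting of the classes, since each property in the list implies the ones below it.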
Opposite group actions
Many geometric transformations are expressed with linear algebra. The bijective linear transformations ar |
https://en.wikipedia.org/wiki/Architectural%20geometry | Architectural geometry is an area of research that combines applied geometry and architecture, looking at the design, analysis, and manufacturing processes. It lies at the core of architectural design and strongly challenges contemporary practice, the so-called architectural practice of the digital age.
Architectural geometry is influenced by the following fields: differential geometry, topology, fractal geometry, and cellular automata.
Topics include:
freeform curves and surfaces creation
developable surfaces
discretisation
generative design
digital prototyping and manufacturing
See also
Geometric design
Computer-aided architectural design
Mathematics and architecture
Fractal geometry
Blobitecture |