https://en.wikipedia.org/wiki/ICON%20%28microcomputer%29
The ICON (also the CEMCorp ICON, Burroughs ICON, and Unisys ICON, and nicknamed the bionic beaver) was a networked personal computer built specifically for use in schools, to fill a standard created by the Ontario Ministry of Education. It was based on the Intel 80186 CPU and ran an early version of QNX, a Unix-like operating system. The system was packaged as an all-in-one machine similar to the Commodore PET, and included a trackball for mouse-like control. Over time, a number of GUI-like systems appeared for the platform, based on the system's NAPLPS-based graphics system. The ICON was widely used in the mid to late 1980s, but disappeared after that time with the widespread introduction of PCs and Apple Macintoshes. History Development Origin In 1981, four years after the first microcomputers for mainstream consumers appeared, the Ontario Ministry of Education sensed that microcomputers could be an important component of education. In June the Minister of Education, Bette Stephenson, announced the need for computer literacy for all students and formed the Advisory Committee on Computers in Education to guide their efforts. She stated that: It is now clear that one of the major goals that education must add to its list of purposes, is computer literacy. The world of the very near future requires that all of us have some understanding of the processes and uses of computers. According to several contemporary sources, Stephenson was the driving force behind the project; "whenever there was a problem she appears to have 'moved heaven and earth' to get it back on the tracks." The Ministry recognized that a small proportion of teachers and other school personnel were already quite involved with microcomputers and that some schools were acquiring first-generation machines. These acquisitions were uneven, varying in brand and model not just between school boards, but among schools within boards and even classroom to classroom. Among the most popular were the Comm
https://en.wikipedia.org/wiki/Anti-neurofascin%20demyelinating%20diseases
Anti-neurofascin demyelinating diseases (anti-NF diseases) are health conditions caused by auto-antibodies against neurofascins, which can produce both central and peripheral demyelination. Some cases of combined central and peripheral demyelination (CCPD) could be produced by them. Chronic inflammatory demyelinating polyneuropathy: Some cases of CIDP are reported to be produced by auto-antibodies against several neurofascin proteins. These proteins are present in neurons, and four of them have been reported to produce disease: NF186, NF180, NF166 and NF155. Neuromyelitis optica: Anti-NF auto-antibodies can also appear in NMO cases. These antibodies are more related to peripheral nervous demyelination, but they were also found in NMO. Multiple sclerosis: Antibodies against neurofascin NF-155 can also appear in atypical multiple sclerosis, and NF-186 could be involved in subtypes of MS, yielding an intersection between both conditions. Around 10% of MS cases are now thought to be anti-NF cases. History The first report about a subgroup of MS patients with anti-NF and contactin 2 auto-antibodies was published in 2011
https://en.wikipedia.org/wiki/Gigantomania
Gigantomania (from Ancient Greek γίγας gigas, "giant" and μανία mania, "madness") is the production of unusually and superfluously large works. Gigantomania is in varying degrees a feature of the political and cultural lives of prehistoric and ancient civilizations (Megalithic cultures, Ancient Egypt, Ancient Rome, Ancient China, Aztec civilization), several totalitarian regimes (Stalin's USSR, Nazi Germany, Fascist Italy, Maoist China, Juche Korea), as well as of contemporary capitalist countries (notably for skyscrapers and shopping malls). Soviet Union The social engineering and modernization efforts in agriculture and industry of Soviet dictator Joseph Stalin have been described as gigantomaniac. The creation of extremely large industrial complexes, farms, engineering efforts, buildings and statues was to prove the superiority of the socialist system over capitalism. These projects also aimed for a mass transformation of the Russian peasant society into a proletarian one: massive construction sites, such as Magnitogorsk, also functioned as ideological education centers for the workers or Gulag inmates. In addition to massive construction projects, Stalin's gigantomania can be seen in the ideology of Stakhanovism, which emphasized constantly over-fulfilling production target quotas. Soviet gigantomania continued, albeit with less popularity, after Stalin's death. Examples Dnieper Hydroelectric Station Kuybyshev Hydroelectric Station Magnitogorsk Palace of the Soviets The Motherland Calls White Sea Canal Nazi Germany German dictator Adolf Hitler was extremely interested in architecture, and desired works of monumental scale to be built to represent the values and achievements of the Nazi regime, and to impress foreigners and later generations. He stated that architecture was "the Word in stone" (i.e. it was inseparable from politics), which demonstrated whether a civilization was in ascendancy or in decline. Colossal architecture was to render the indiv
https://en.wikipedia.org/wiki/Putative%20gene
A putative gene is a segment of DNA that is believed to be a gene. Putative genes can share sequence similarities to already characterized genes and thus can be inferred to share a similar function, yet the exact function of putative genes remains unknown. Newly identified sequences are considered putative gene candidates when homologs of those sequences are found to be associated with the phenotype of interest. Examples Examples of studies involving putative genes include the discovery of 30 putative receptor genes found in the rat vomeronasal organ (VNO) and the identification of 79 putative TATA boxes found in many plant genomes. Practical importance In order to define and characterize a biosynthetic gene cluster, all the putative genes within said cluster must first be identified and their functions must be characterized. This can be performed by complementation and knock-out experiments. In the process of characterizing putative genes, the genome under study becomes increasingly well understood as more interactions can be identified. Identification of putative genes is necessary to study genomic evolution, as a significant proportion of genomes makes up larger families of related genes. Genomic evolution occurs by processes such as duplication of individual genes, genome segments, or entire genomes. These processes can result in loss of function, altered function, or gain of function, and can have drastic effects on the phenotype. DNA mutations outside of a putative gene can act by positional effect, in which they alter gene expression. These alterations leave the transcription unit and promoter of the gene intact, but may involve distal promoters, enhancer/silencer elements, or the local chromatin environment. These mutations can be associated with diseases or disorders associated with the gene. Identification Putative genes can be identified by clustering large groups of sequences by patterns and arranging by mutual similarity or can be inferred by potential
https://en.wikipedia.org/wiki/Tor2web
Tor2web (pronounced "Tor to Web") is a software project to allow Tor hidden services to be accessed from a standard browser without being connected to the Tor network. It was created by Aaron Swartz and Virgil Griffith. History Tor is a network which enables people to use the Internet anonymously (though with known weaknesses) and to publish content on "hidden services", which exist only within the Tor network for security reasons and thus are typically only accessible to the relatively small number of people using a Tor-connected web browser. Aaron Swartz and Virgil Griffith developed Tor2web in 2008 as a way to support whistleblowing and other forms of anonymous publishing through Tor, allowing materials to remain anonymous while making them accessible to a broader audience. In an interview with Wired, Swartz explained that Tor is great for anonymous publishing, but because its focus is not user-friendliness, not many people would install it; he wanted to "produce this hybrid where people could publish stuff using Tor and make it so that anyone on the internet could view it". The software developed by Swartz and Griffith is today considered version 1.0. Since then, it has been maintained and developed by Giovanni Pellerano from the Hermes Center for Transparency and Digital Human Rights as part of the GlobaLeaks Project, with financial support from the Open Technology Fund. Version 2.0 was released in August 2011, and version 3.0 is in beta. Operation and security Rather than typical top-level domains like .com, .org, or .net, hidden service URLs end with .onion and are only accessible when connected to Tor. Tor2web acts as a specialized proxy or middleman between hidden services and users, making them visible to people who are not connected to Tor. To do so, a user takes the URL of a hidden service and replaces .onion with .onion.to. Like Tor, Tor2web operates using servers run voluntarily by an open community of individuals and organizations. Tor2we
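The hostname rewriting described above is mechanical enough to capture in a few lines. A minimal sketch in Python; the helper name `to_tor2web` and the example address are illustrative, and `onion.to` is just the gateway suffix mentioned in the text:

```python
from urllib.parse import urlparse, urlunparse

def to_tor2web(onion_url: str, gateway_tld: str = "to") -> str:
    """Rewrite a hidden-service URL for access through a Tor2web proxy:
    http://example.onion/page -> http://example.onion.to/page
    (assumes no explicit port in the URL)."""
    parts = urlparse(onion_url)
    if not (parts.hostname or "").endswith(".onion"):
        raise ValueError("not a .onion URL")
    # Appending the gateway's domain suffix to the .onion hostname sends the
    # request to the Tor2web proxy, which fetches it over Tor on the user's behalf.
    return urlunparse(parts._replace(netloc=parts.netloc + "." + gateway_tld))

print(to_tor2web("http://duskgytldkxiuqc6.onion/"))  # http://duskgytldkxiuqc6.onion.to/
```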
https://en.wikipedia.org/wiki/Raising%20and%20lowering%20indices
In mathematics and mathematical physics, raising and lowering indices are operations on tensors which change their type. Raising and lowering indices are a form of index manipulation in tensor expressions. Vectors, covectors and the metric Mathematical formulation Mathematically vectors are elements of a vector space $V$ over a field $K$, and for use in physics $K$ is usually defined with $K = \mathbb{R}$ or $K = \mathbb{C}$. Concretely, if the dimension $n = \dim(V)$ is finite, then, after making a choice of basis, we can view such vector spaces as $\mathbb{R}^n$ or $\mathbb{C}^n$. The dual space is the space of linear functionals mapping $V \to K$. Concretely, in matrix notation these can be thought of as row vectors, which give a number when applied to column vectors. We denote this by $V^*$, so that $\xi \in V^*$ is a linear map $\xi : V \to K$. Then under a choice of basis $\{e_\mu\}$, we can view vectors as an $n \times 1$ column vector with components $v^\mu$ (vectors are taken by convention to have indices up). This picks out a choice of basis $\{e^\mu\}$ for $V^*$, defined by the set of relations $e^\mu(e_\nu) = \delta^\mu_\nu$. For applications, raising and lowering is done using a structure known as the (pseudo-)metric tensor (the 'pseudo-' refers to the fact we allow the metric to be indefinite). Formally, this is a non-degenerate, symmetric bilinear form $g : V \times V \to K$. In this basis, it has components $g(e_\mu, e_\nu) = g_{\mu\nu}$, and can be viewed as a symmetric matrix in $\mathrm{Mat}_{n \times n}(K)$ with these components. The inverse metric exists due to non-degeneracy and is denoted $g^{\mu\nu}$, and as a matrix is the inverse to $g_{\mu\nu}$. Raising and lowering vectors and covectors Raising and lowering is then done in coordinates. Given a vector with components $v^\mu$, we can contract with the metric to obtain a covector: $v_\nu = g_{\mu\nu} v^\mu$, and this is what we mean by lowering the index. Conversely, contracting a covector $\xi_\mu$ with the inverse metric gives a vector: $\xi^\mu = g^{\mu\nu} \xi_\nu$. This process is called raising the index. Raising and then lowering the same index (or conversely) are inverse operations, which is reflected in the metric and inverse metric tensors being inverse to each other (as is suggested by the terminology): $g^{\mu\nu} g_{\nu\rho} = \delta^\mu_\rho$, where $\delta^\mu_\rho$ is the Kronecker delta or identity matrix.
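These coordinate contractions are easy to check numerically. A small sketch with NumPy, taking the Minkowski metric diag(-1, 1, 1, 1) as an assumed example of an indefinite metric; the variable names are illustrative:

```python
import numpy as np

# Assumed example metric: Minkowski with signature (-, +, +, +).
g = np.diag([-1.0, 1.0, 1.0, 1.0])        # g_{mu nu}
g_inv = np.linalg.inv(g)                  # g^{mu nu}

v_up = np.array([2.0, 1.0, 0.0, 3.0])     # vector components v^mu

# Lowering the index: v_nu = g_{mu nu} v^mu
v_down = np.einsum("mn,m->n", g, v_up)

# Raising it again recovers the original vector: v^mu = g^{mu nu} v_nu
v_up_again = np.einsum("mn,n->m", g_inv, v_down)
assert np.allclose(v_up, v_up_again)

# The metric and inverse metric contract to the Kronecker delta.
assert np.allclose(np.einsum("mn,nr->mr", g_inv, g), np.eye(4))
```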
https://en.wikipedia.org/wiki/Future%20of%20an%20expanding%20universe
Observations suggest that the expansion of the universe will continue forever. The prevailing theory is that the universe will cool as it expands, eventually becoming too cold to sustain life. For this reason, this future scenario, once popularly called "Heat Death", is now known as the "Big Chill" or "Big Freeze". If dark energy—represented by the cosmological constant, a constant energy density filling space homogeneously, or scalar fields, such as quintessence or moduli, dynamic quantities whose energy density can vary in time and space—accelerates the expansion of the universe, then the space between clusters of galaxies will grow at an increasing rate. Redshift will stretch ancient, incoming photons (even gamma rays) to undetectably long wavelengths and low energies. Stars are expected to form normally for 10¹² to 10¹⁴ (1–100 trillion) years, but eventually the supply of gas needed for star formation will be exhausted. As existing stars run out of fuel and cease to shine, the universe will slowly and inexorably grow darker. According to theories that predict proton decay, the stellar remnants left behind will disappear, leaving behind only black holes, which themselves eventually disappear as they emit Hawking radiation. Ultimately, if the universe reaches thermodynamic equilibrium, a state in which the temperature approaches a uniform value, no further work will be possible, resulting in a final heat death of the universe. Cosmology Infinite expansion does not determine the overall spatial curvature of the universe. It can be open (with negative spatial curvature), flat, or closed (positive spatial curvature), although if it is closed, sufficient dark energy must be present to counteract the gravitational forces or else the universe will end in a Big Crunch. Observations of the cosmic background radiation by the Wilkinson Microwave Anisotropy Probe and the Planck mission suggest that the universe is spatially flat and has a significant amount of dark energy.
https://en.wikipedia.org/wiki/AVG%20AntiVirus
AVG AntiVirus (previously known as AVG, an abbreviation of Anti-Virus Guard) is a line of antivirus software developed by AVG Technologies, a subsidiary of Avast, a part of Gen Digital. It is available for Windows, macOS and Android. History The brand AVG comes from Grisoft's first product, Anti-Virus Guard, launched in 1992 in the Czech Republic. In 1997, the first AVG licenses were sold in Germany and the UK. AVG was introduced in the US in 1998. The AVG Free Edition helped raise awareness of the AVG product line. In 2006, the AVG security package grew to include anti-spyware as AVG Technologies acquired ewido Networks, an anti-spyware group. AVG Technologies acquired Exploit Prevention Labs (XPL) in December 2007 and incorporated that company's LinkScanner safe search and surf technology into the AVG 8.0 security product range released in March 2008. In January 2009, AVG Technologies acquired Sana Security, a developer of identity theft prevention software. This software was incorporated into the AVG security product range in March 2009. According to AVG Technologies, the company has more than 200 million active users worldwide, including more than 100 million who use their products and services on mobile devices. On 7 July 2016, Avast announced an agreement to acquire AVG for $1.3 billion. Platform support AVG provides AVG AntiVirus Free for Windows, AVG AntiVirus for Mac for macOS and AVG AntiVirus for Android for Android devices. All are freemium products: They are free to download, install, update and use, but for technical support a premium plan must be purchased. AVG stopped providing new features for Windows XP and Windows Vista in January 2019. New versions require Windows 7 or later; virus definitions are still provided for previous versions. Features AVG features most of the common functions available in modern antivirus and Internet security programs, including periodic scans, scans of sent and received emails (including adding footers to t
https://en.wikipedia.org/wiki/Numerical%20Wind%20Tunnel
Numerical Wind Tunnel (数値風洞) was an early implementation of the vector parallel architecture, developed in a joint project between the National Aerospace Laboratory of Japan and Fujitsu. It was the first supercomputer with a sustained performance of close to 100 Gflop/s for a wide range of fluid dynamics application programs. It stood at the top of the TOP500 list during 1993–1996. With 140 cores, the Numerical Wind Tunnel reached an Rmax of 124.0 Gflop/s and an Rpeak of 235.8 Gflop/s in November 1993. It eventually consisted of 166 parallel-connected vector processors with a gate delay as low as 60 ps in their GaAs chips. The resulting cycle time was 9.5 ns. Each processor had four independent pipelines, each capable of executing two multiply-add instructions in parallel, resulting in a peak speed of 1.7 Gflop/s per processor. Each processor board was equipped with 256 megabytes of central memory.
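As a check on the quoted figures, counting each multiply-add as two floating-point operations, the per-processor peak follows from the pipeline count and cycle time, and the same number reproduces the Rpeak of the 140-processor configuration:

```latex
\text{peak per processor} = \frac{4 \text{ pipelines} \times 2 \text{ MA} \times 2\ \text{flop/MA}}{9.5\ \text{ns}}
\approx 1.68\ \text{Gflop/s} \approx 1.7\ \text{Gflop/s},
\qquad
140 \times 1.684\ \text{Gflop/s} \approx 235.8\ \text{Gflop/s} = R_{\text{peak}}.
```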
https://en.wikipedia.org/wiki/Attribute-based%20encryption
Attribute-based encryption is a generalisation of public-key encryption which enables fine-grained access control of encrypted data using authorisation policies. The secret key of a user and the ciphertext are dependent upon attributes (e.g. their email address, the country in which they live, or the kind of subscription they have). In such a system, the decryption of a ciphertext is possible only if the set of attributes of the user key matches the attributes of the ciphertext. A crucial security aspect of attribute-based encryption is collusion-resistance: an adversary that holds multiple keys should only be able to access data if at least one individual key grants access. Description Attribute-based encryption is provably a generalisation of identity-based encryption. History Identity-based encryption was first proposed in 1984 by Adi Shamir, without a specific solution or proof. In 2004 Amit Sahai and Brent Waters published a solution, improved in 2006 by Vipul Goyal, Omkant Pandey, Amit Sahai and Brent Waters. Melissa Chase and other researchers have further proposed attribute-based encryption with multiple authorities who jointly generate users' private keys. Types of attribute-based encryption schemes There are mainly two types of attribute-based encryption schemes: key-policy attribute-based encryption (KP-ABE) and ciphertext-policy attribute-based encryption (CP-ABE). In KP-ABE, users' secret keys are generated based on an access tree that defines the privilege scope of the concerned user, and data are encrypted over a set of attributes. Conversely, CP-ABE uses access trees to encrypt data, and users' secret keys are generated over a set of attributes. Relationship to Role-based Encryption The related concept of role-based encryption refers exclusively to access keys having roles that can be validated against an authoritative store of roles. In this sense, Role-based encryption can be expressed by Attribute-based encryption and within that limited co
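A toy sketch of the access-structure logic only: in CP-ABE the ciphertext carries a policy tree and a key carries attributes, and decryption succeeds exactly when the attributes satisfy the tree. This models the policy-satisfaction test, not the underlying pairing-based cryptography, and all attribute names are illustrative:

```python
# Toy CP-ABE access-tree check: AND/OR gates over attribute leaves.
# Real schemes enforce this cryptographically; here it is plain logic.

def satisfies(policy, attributes: set) -> bool:
    """policy is either an attribute string (leaf) or a tuple
    ('AND' | 'OR', subpolicy, subpolicy, ...)."""
    if isinstance(policy, str):                      # leaf: attribute must be held
        return policy in attributes
    gate, *children = policy
    results = (satisfies(c, attributes) for c in children)
    return all(results) if gate == "AND" else any(results)

# Ciphertext policy: country:US AND (role:doctor OR role:nurse)
policy = ("AND", "country:US", ("OR", "role:doctor", "role:nurse"))

print(satisfies(policy, {"country:US", "role:nurse"}))   # True  -> key could decrypt
print(satisfies(policy, {"country:US", "role:admin"}))   # False -> decryption fails
# Collusion-resistance means two users cannot pool attributes: keys holding
# only {"country:US"} and only {"role:doctor"} must each fail on their own.
```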
https://en.wikipedia.org/wiki/Solid%20Logic%20Technology
Solid Logic Technology (SLT) was IBM's method for hybrid packaging of electronic circuitry introduced in 1964 with the IBM System/360 series of computers and related machines. IBM chose to design custom hybrid circuits using discrete, flip chip-mounted, glass-encapsulated transistors and diodes, with silk-screened resistors on a ceramic substrate, forming an SLT module. The circuits were either encapsulated in plastic or covered with a metal lid. Several of these SLT modules were then mounted on a small multi-layer printed circuit board to make an SLT card. Each SLT card had a socket on one edge that plugged into pins on the computer's backplane (the exact reverse of how most other companies' modules were mounted). IBM considered monolithic integrated circuit technology too immature at the time. SLT was a revolutionary technology for 1964, with much higher circuit densities and improved reliability over earlier packaging techniques such as the Standard Modular System. It helped propel the IBM System/360 mainframe family to overwhelming success during the 1960s. SLT research produced ball chip assembly, wafer bumping, trimmed thick-film resistors, printed discrete functions, chip capacitors and one of the first volume uses of hybrid thick-film technology. SLT replaced the earlier Standard Modular System, although some later SMS cards held SLT modules. Details SLT used silicon planar glass-encapsulated transistors and diodes. It used dual diode chips and individual transistor chips, each approximately square. The chips were mounted on a square substrate with silk-screened resistors and printed connections. The whole was encapsulated to form a square module. Up to 36 modules were mounted on each card, though a few card types had just discrete components and no modules. Cards plugged into boards, which were connected to form gates, which in turn formed frames. SLT voltage levels, logic low to logic high, varied by circuit speed: High speed (5-10 ns)
https://en.wikipedia.org/wiki/Random%20sample%20consensus
Random sample consensus (RANSAC) is an iterative method to estimate parameters of a mathematical model from a set of observed data that contains outliers, when outliers are to be accorded no influence on the values of the estimates. Therefore, it also can be interpreted as an outlier detection method. It is a non-deterministic algorithm in the sense that it produces a reasonable result only with a certain probability, with this probability increasing as more iterations are allowed. The algorithm was first published by Fischler and Bolles at SRI International in 1981. They used RANSAC to solve the Location Determination Problem (LDP), where the goal is to determine the points in the space that project onto an image into a set of landmarks with known locations. RANSAC uses repeated random sub-sampling. A basic assumption is that the data consists of "inliers", i.e., data whose distribution can be explained by some set of model parameters, though it may be subject to noise, and "outliers", which are data that do not fit the model. The outliers can come, for example, from extreme values of the noise or from erroneous measurements or incorrect hypotheses about the interpretation of data. RANSAC also assumes that, given a (usually small) set of inliers, there exists a procedure which can estimate the parameters of a model that optimally explains or fits this data. Example A simple example is fitting a line in two dimensions to a set of observations. Assuming that this set contains both inliers, i.e., points which approximately can be fitted to a line, and outliers, points which cannot be fitted to this line, a simple least squares method for line fitting will generally produce a line with a bad fit to the data including inliers and outliers. The reason is that it is optimally fitted to all points, including the outliers. RANSAC, on the other hand, attempts to exclude the outliers and find a linear model that only uses the inliers in its calculation. This is done by fitting
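A compact sketch of the line-fitting example in Python; the iteration count and inlier threshold are illustrative parameters chosen for this toy data, not values prescribed by the algorithm:

```python
import numpy as np

def ransac_line(points, n_iters=200, threshold=0.1, seed=0):
    """Fit y = a*x + b robustly: repeatedly fit a line to a random
    2-point sample and keep the model with the largest inlier set."""
    rng = np.random.default_rng(seed)
    best_inliers = np.array([], dtype=int)
    for _ in range(n_iters):
        i, j = rng.choice(len(points), size=2, replace=False)
        (x1, y1), (x2, y2) = points[i], points[j]
        if x1 == x2:
            continue                      # degenerate sample: vertical line
        a = (y2 - y1) / (x2 - x1)
        b = y1 - a * x1
        # Consensus set: points within `threshold` vertical distance of the line.
        residuals = np.abs(points[:, 1] - (a * points[:, 0] + b))
        inliers = np.flatnonzero(residuals < threshold)
        if len(inliers) > len(best_inliers):
            best_inliers = inliers
    # Final estimate: ordinary least squares on the largest consensus set only.
    a, b = np.polyfit(points[best_inliers, 0], points[best_inliers, 1], 1)
    return a, b, best_inliers

# Synthetic data: a noisy line y = 2x + 1 plus gross outliers.
rng = np.random.default_rng(1)
x = rng.uniform(0, 10, 100)
pts = np.column_stack([x, 2 * x + 1 + rng.normal(0, 0.05, 100)])
pts[:20, 1] = rng.uniform(-20, 20, 20)    # corrupt 20 points
print(ransac_line(pts))                    # slope ~2, intercept ~1
```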
https://en.wikipedia.org/wiki/Misliya%20Cave
Misliya Cave, also known as the "Brotzen Cave" after Fritz Brotzen, who first described it in 1927, is a collapsed cave at Mount Carmel, Israel, containing archaeological layers from the Lower Paleolithic and Middle Paleolithic periods. The site is significant in paleoanthropology for the discovery of what were from 2018 to 2019 considered to be the earliest known remains attributed to Homo sapiens outside Africa, dated to 185,000 years ago. Since the time of its discovery in 2011, Jebel Faya, in the United Arab Emirates, had been considered to be the oldest settlement of anatomically modern humans outside Africa, with its deepest assemblage being dated to 125,000 years ago. Excavations Excavations by teams from the University of Haifa and Tel Aviv University were conducted in the 2000/1 season, yielding finds dated to between 300,000 and 150,000 years ago. Misliya-1 fossil Of special interest is the Misliya-1 fossil, an upper jawbone discovered in 2002, at first dated to "possibly 150,000 years ago" and classified as "early modern Homo sapiens" (EMHS). In January 2018, the date of the fossil was revised to between 177,000 and 194,000 years ago (95% CI). This qualifies Misliya-1 as one of the oldest known fossils of H. sapiens, of comparable age to the Omo remains (as well as those of Herto, identified as "archaic Homo sapiens", or Homo sapiens idaltu), and the second-oldest modern human remains ever found outside of Africa, the oldest being the skull Apidima 1 from the southwestern Peloponnese, dated to roughly 210,000 years ago. See also Anatomically modern humans List of human evolution fossils Northern Dispersal Recent African origin of modern humans
https://en.wikipedia.org/wiki/Rounding
Rounding means replacing a number with an approximate value that has a shorter, simpler, or more explicit representation. For example, replacing $23.4476 with $23.45, the fraction 312/937 with 1/3, or the expression √2 with 1.414. Rounding is often done to obtain a value that is easier to report and communicate than the original. Rounding can also be important to avoid misleadingly precise reporting of a computed number, measurement, or estimate; for example, a quantity that was computed as 123,456 but is known to be accurate only to within a few hundred units is usually better stated as "about 123,500". On the other hand, rounding of exact numbers will introduce some round-off error in the reported result. Rounding is almost unavoidable when reporting many computations – especially when dividing two numbers in integer or fixed-point arithmetic; when computing mathematical functions such as square roots, logarithms, and sines; or when using a floating-point representation with a fixed number of significant digits. In a sequence of calculations, these rounding errors generally accumulate, and in certain ill-conditioned cases they may make the result meaningless. Accurate rounding of transcendental mathematical functions is difficult because the number of extra digits that need to be calculated to resolve whether to round up or down cannot be known in advance. This problem is known as "the table-maker's dilemma". Rounding has many similarities to the quantization that occurs when physical quantities must be encoded by numbers or digital signals. A wavy equals sign (≈, approximately equal to) is sometimes used to indicate rounding of exact numbers, e.g. 9.98 ≈ 10. This sign was introduced by Alfred George Greenhill in 1892. Ideal characteristics of rounding methods include: Rounding should be done by a function. This way, when the same input is rounded in different instances, the output is unchanged. Calculations done with rounding should be close to those done without rounding. As a result
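Rounding rules differ precisely at ties, and the requirement that rounding be done by a function means each rule must be deterministic. A short Python illustration contrasting round-half-to-even (the rule used by Python's built-in round) with half-up rounding via the decimal module:

```python
from decimal import Decimal, ROUND_HALF_UP, ROUND_HALF_EVEN

# Python's built-in round() uses round-half-to-even ("banker's rounding"),
# a deterministic function: the same input always yields the same output.
print(round(0.5), round(1.5), round(2.5))   # 0 2 2  -- ties go to the even digit

# Half-up rounding of exact decimal quantities, e.g. for currency:
price = Decimal("23.4476")
print(price.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP))    # 23.45
print(price.quantize(Decimal("0.01"), rounding=ROUND_HALF_EVEN))  # 23.45

# Ties are where the two rules visibly diverge:
tie = Decimal("2.5")
print(tie.quantize(Decimal("1"), rounding=ROUND_HALF_UP))    # 3
print(tie.quantize(Decimal("1"), rounding=ROUND_HALF_EVEN))  # 2
```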
https://en.wikipedia.org/wiki/Cat%27s%20paw%20%28tool%29
A cat's paw or cat's claw is a metal hand tool used for extracting nails, typically from wood, using leverage. A standard tool in carpentry, it has a sharp V-shaped tip on one or both ends, which is driven into the wood by a hammer to capture the nailhead. Essentially, it is a smaller, more ergonomic, purpose-designed crowbar. Historically, the cat's paw had a single significantly rounder, more cup-shaped extracting head, giving it its name. Today, the norm is to have the two much narrower and more pointed heads offset 90 degrees (in plane) from one another (allowing the bar to be pressed fully down when using the tip on the long end without damaging the surface the free end contacts). By the physics of its design the tip on the short end has substantially more leverage, but is not always convenient to set with a hammer. Tool stock is typically hexagonal, though it may be round or rectangular. The latter is sometimes flattened on its long end to create a combination pry bar/nail extractor. Terms for each type used by popular retail outlets include "claw bar" when it has a claw on each end, and "moulding bar" if one end is flat. The cat's paw is well designed for demolition work, able to remove nails from wood, synthetic wood, and concrete, but because it tears up the surface around the nailhead it is used only with care in finish work. History Prior to advances of the Industrial Revolution nails were individually hand-made by blacksmiths. As a result, they were generally far more valuable than the wood they were driven into. In North America wood was so abundant that it was commonplace to burn an old structure down in order to salvage the nails from it. As a result, nail pullers were designed to preserve the condition of the nail for reuse, resulting in a slide hammer type design, still in specialty use today. With machine made nails capped with distinct heads a cat's paw shaped puller appeared; however, the distinctly rounded shape of the ext
https://en.wikipedia.org/wiki/Taxonomic%20treatment
A taxonomic treatment is a section in a scientific publication documenting the features of a related group of organisms or taxa. Treatments have been the building blocks of how data about taxa are provided ever since the beginning of modern taxonomy with Linnaeus in 1753 for plants and 1758 for animals. Each scientifically described taxon has at least one taxonomic treatment. In today’s publishing, a taxonomic treatment tag is used to delimit such a section. It allows this section to be made findable, accessible, interoperable and reusable (FAIR) data. This is implemented in the Biodiversity Literature Repository, where upon deposition of the treatment a persistent DataCite digital object identifier (DOI) is minted. This includes metadata about the treatment, the source publication and other cited resources, such as figures cited in the treatment. This DOI allows a link from a taxonomic name usage to the respective scientific evidence provided by the author(s), both for human and machine consumption. Treatments are considered data; copyright is therefore not applicable, and they can be made available even from closed-access publications. Etymology The term taxonomic treatment has been coined because the term description has two meanings in species or taxonomic descriptions. One is equivalent to treatment; the second is a subsection within treatments describing the taxon, complementing diagnosis, materials examined, distribution, conservation and other subsections. History The term was introduced during a national US NSF digital library project, and has been further developed into Taxpub, a taxonomy-specific version of the Journal Article Tag Suite, by Plazi, the National Center for Biotechnology Information, and Pensoft Publishers. It was prototyped by the taxonomic journal ZooKeys, which adopted Taxpub from its volume 50 onwards, followed by PhytoKeys. Taxpub is now used by journals published by Pensoft Publishers, European Journal of Taxonomy by Consortium of European Taxonomic
https://en.wikipedia.org/wiki/Framework%20Class%20Library
The Framework Class Library (FCL) is a component of Microsoft's .NET Framework, the first implementation of the Common Language Infrastructure (CLI). In much the same way as the Common Language Runtime (CLR) implements the CLI Virtual Execution System (VES), the FCL implements the CLI foundational Standard Libraries. As an implementation of the CLI foundational class libraries, it is a collection of reusable classes, interfaces, and value types, and includes an implementation of the CLI Base Class Library (BCL). With Microsoft's move to .NET Core, the implementation of the CLI foundational class libraries is known as CoreFX instead of the Framework Class Library. See also Standard Libraries (CLI) Base Class Library (BCL)
https://en.wikipedia.org/wiki/Pomology
Pomology (from Latin pomum, "fruit", + -logia, "study") is a branch of botany that studies fruits and their cultivation. Someone who researches and practices the science of pomology is called a pomologist. The term fruticulture (from Latin fructus, "fruit", + cultura, "care") is also used to describe the agricultural practice of growing fruits in orchards. Pomological research is mainly focused on the development, enhancement, cultivation and physiological studies of fruit trees. The goals of fruit tree improvement include enhancement of fruit quality, regulation of production periods, and reduction of production costs. History Middle East In ancient Mesopotamia, pomology was practiced by the Sumerians, who are known to have grown various types of fruit, including dates, grapes, apples, melons, and figs. While the first fruits cultivated by the Egyptians were likely indigenous, such as the palm date and sorghum, more fruits were introduced as other cultural influences were introduced. Grapes and watermelon were found throughout predynastic Egyptian sites, as were the sycamore fig, dom palm and Christ's thorn. The carob, olive, apple and pomegranate were introduced to Egyptians during the New Kingdom. Later, during the Greco-Roman period peaches and pears were also introduced. Europe The ancient Greeks and Romans also had a strong tradition of pomology, and they cultivated a wide range of fruits, including apples, pears, figs, grapes, quinces, citron, strawberries, blackberries, elderberries, currants, damson plums, dates, melons, rose hips and pomegranates. Less common fruits were the more exotic azeroles and medlars. Cherries and apricots, both introduced in the 1st century BC, were popular. Peaches were introduced in the 1st century AD from Persia. Oranges and lemons were known but used more for medicinal purposes than in cookery. The Romans, in particular, were known for their advanced methods of fruit cultivation and storage, and they developed many of the techniques that are sti
https://en.wikipedia.org/wiki/Tussock%20grasslands%20of%20New%20Zealand
Tussock grasslands form expansive and distinctive landscapes in the South Island and, to a lesser extent, in the Central Plateau region of the North Island of New Zealand. Most of the plants referred to as tussocks are in the genera Chionochloa, Festuca, and Poa, as well as Carex. What would be termed "herbfields" for European mountains, and bunchgrass meadows in North America, are referred to as tussock herbfields in New Zealand due to a dominance of this type of plant. Species of the genus Chionochloa dominate in these areas. The larger tussocks are called snow grass (or, less commonly, snow tussocks) and may grow up to in height. They grow slowly and some specimens are estimated to be several centuries old. See also Canterbury–Otago tussock grasslands Southland montane grasslands Environment of New Zealand
https://en.wikipedia.org/wiki/Halomicrobium
Halomicrobium is a genus of the Haloarculaceae.
https://en.wikipedia.org/wiki/Hotline%20Communications
Hotline Communications Limited (HCL) was a software company founded in 1997, based in Toronto, Canada, with employees also in the United States and Australia. Hotline Communications' main activity was the publishing and distribution of a multi-purpose client/server communication software product named Hotline Connect, informally called, simply, Hotline. Initially, Hotline Communications sought a wide audience for its products, and organizations as diverse as Avid Technology, Apple Computer Australia, and public high schools used Hotline. At its peak, Hotline received millions of dollars in venture capital funding, grew to employ more than fifty people, served millions of users, and won accolades at trade shows and in newspapers and computer magazines around the world. Hotline eventually attracted more of an "underground" community, which saw it as an easier-to-use successor to the Internet Relay Chat (IRC) community. In 2001 Hotline Communications lost the bulk of its VC funding, and went out of business later that year. All of its assets were acquired in 2002 by Hotsprings, Inc., a new company formed by some ex-employees and shareholders. Hotsprings Inc. has since also abandoned development of the Hotline Connect software suite; the last iteration of Hotline Connect was released in December 2003. Currently, only a few servers and trackers remain, but the Hotline community is still alive. History Hotline was designed in 1996, initially under the name "hotwire", by Australian programmer Adam Hinkley (known online by his username, "Hinks"), then 17 years old, as a classic Mac OS application. The source code for the Hotline applications was based on a class library, "AppWarrior" (AW), which Hinkley wrote. AppWarrior would later become the subject of litigation, as Hinkley wrote parts of it while he was employed by an Australian company, Redrock Holdings. Six other fans of Hotline, David Murphy, Alex King, Phil Hilton, Jason Roks, David Bordin, and Terrance Gregory, joined Adam Hinkley's efforts to
https://en.wikipedia.org/wiki/Henry%20Lubben%20House%2C%20Smokehouse%20and%20Springhouse
The Henry Lubben House, Smokehouse and Springhouse are a collection of historic buildings located north of Baldwin, Iowa, United States. They are three of over 217 limestone structures in Jackson County from the mid-19th century, of which 101 were houses, 13 were springhouses, and 36 were other farm-related buildings. What makes the Lubben buildings unique is that the three stone buildings are grouped together on the farmstead. The wood-frame farm buildings are located immediately to the north. The stonework on the house is coursed-cut stone that is believed to have been quarried just west of the house. The windows have dressed stone sills and lintels. It also features "high style" elements such as the denticulated wooden cornice. The house is L-shaped with a single-story stone section on the back, which is original to the house, capped by a wood-frame second floor that was added later. An enclosed wooden porch on the front was added in 1931. The stonework on the springhouse and the smokehouse is of lesser quality. Henry Lubben was a native of Oldenburg, Germany, who settled in Jackson County in 1837. He was one of the first settlers in Monmouth Township. He married in 1858 and the house was built that same year. The two outbuildings were built either at the same time or possibly later in the century. The buildings were listed together on the National Register of Historic Places in 1992.
https://en.wikipedia.org/wiki/Nepenthes%20%28sculpture%29
Nepenthes is a series of four sculptures by artist Dan Corson, installed in 2013 along Northwest Davis Street in the Old Town Chinatown neighborhood of Portland, Oregon, in the United States. The work was inspired by the genus of carnivorous plants of the same name, known as tropical pitcher plants. The sculptures are tall and glow in the dark due to photovoltaics. Description and history Nepenthes is a series of four sculptures by Dan Corson. Located along Northwest Davis Street between Fifth and Eighth Avenues, the pieces are inspired by the genus of carnivorous plants of the same name, commonly referred to as tropical pitcher plants. According to Corson and the Regional Arts & Culture Council (RACC), which maintains the sculpture series, the work and genus are named after the "magical Greek potion that eliminates sorrow and suffering". The agency said that Nepenthes "insert[s] a quirky expression of nature into an urban environment" and celebrates the "unique and diverse community" of Portland's Old Town Chinatown neighborhood. Each bulbous-shaped sculpture is nearly tall and glows in the dark due to photovoltaics. The work was commissioned by TriMet's Portland Mall Project in an attempt to increase "pedestrian connectivity" between Old Town Chinatown and the Pearl District. Corson was hired by the Portland Mall Project's design team, which was led by ZGF Architects LLP. The Old Town/Chinatown Visions Committee was also involved in the early stages of the project. Once the project evolved from a lighting project into a sculpture, TriMet asked RACC to facilitate the project on behalf of the city's public art collection. The project had a $300,000 budget and required several design changes. RACC assembled a panel of "original stakeholders", artists and neighbors, to refine Corson's design. The agency worked with Portland Transportation to determine the exact placement of the pieces. Some of them were reportedly placed in "awkward" locations in order to receive
https://en.wikipedia.org/wiki/OpenMDAO
OpenMDAO is an open-source high-performance computing platform for systems analysis and multidisciplinary optimization written in the Python programming language. The OpenMDAO project is primarily focused on supporting gradient-based optimization with analytic derivatives, allowing users to explore large design spaces with hundreds or thousands of design variables, but the framework also has a number of parallel computing features that can work with gradient-free optimization, mixed-integer nonlinear programming, and traditional design space exploration. The OpenMDAO framework is designed to aid in linking together separate pieces of software for the purpose of combined analyses. It allows users to combine analysis tools (or design codes) from multiple disciplines, at multiple levels of fidelity, and to manage the interaction between them. OpenMDAO is specifically designed to manage the dataflow (the actual data) and the workflow (what code is run when) in conjunction with optimization algorithms and other advanced solution techniques. The development of OpenMDAO is led out of the NASA Glenn Research Center. Features Library of built-in solvers and optimizers Tools for metamodeling Data recording capabilities Support for analytic derivatives Support for high-performance computer clusters and distributed computing Extensible plugin library Applications NASA’s motivation in supporting the OpenMDAO project stems from the demands of unconventional aircraft concepts like turbo-electric distributed propulsion. Although NASA’s focus is on analyzing aerospace applications, the framework itself is general and is not specific to any discipline. Framework structure OpenMDAO is designed to separate the flow of information (dataflow) from the process in which analyses are executed (workflow). It does that by using four specific constructs: Component, Assembly, Driver, and Workflow. The construction of system models begins with wrapping (or writing from scratch) vario
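The Component/Assembly/Driver/Workflow constructs above describe the classic OpenMDAO architecture; current releases expose the same separation of analysis, data, and driver through `openmdao.api`. A minimal sketch assuming OpenMDAO 3.x is installed, adapting the well-known paraboloid example with analytic derivatives:

```python
import openmdao.api as om

class Paraboloid(om.ExplicitComponent):
    """Analysis component: f(x, y) = (x-3)^2 + x*y + (y+4)^2 - 3."""
    def setup(self):
        self.add_input("x", val=0.0)
        self.add_input("y", val=0.0)
        self.add_output("f", val=0.0)
        self.declare_partials("f", ["x", "y"])   # analytic derivatives

    def compute(self, inputs, outputs):
        x, y = inputs["x"], inputs["y"]
        outputs["f"] = (x - 3.0) ** 2 + x * y + (y + 4.0) ** 2 - 3.0

    def compute_partials(self, inputs, partials):
        x, y = inputs["x"], inputs["y"]
        partials["f", "x"] = 2.0 * (x - 3.0) + y
        partials["f", "y"] = x + 2.0 * (y + 4.0)

prob = om.Problem()
prob.model.add_subsystem("parab", Paraboloid(), promotes=["*"])
prob.model.add_design_var("x", lower=-50.0, upper=50.0)
prob.model.add_design_var("y", lower=-50.0, upper=50.0)
prob.model.add_objective("f")
prob.driver = om.ScipyOptimizeDriver()
prob.driver.options["optimizer"] = "SLSQP"

prob.setup()
prob.run_driver()
print(prob.get_val("x"), prob.get_val("y"), prob.get_val("f"))  # ~6.67, -7.33, -27.33
```

The declare_partials/compute_partials pair supplies the analytic derivatives that the gradient-based driver exploits, which is the workflow the framework is built around.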
https://en.wikipedia.org/wiki/Medidata%20Solutions
Medidata Solutions is an American technology company that develops and markets software as a service (SaaS) for clinical trials. These include protocol development, clinical site collaboration and management; randomization and trial supply management; capturing patient data through web forms, mobile health (mHealth) devices, laboratory reports, and imaging systems; quality monitor management; safety event capture; and monitoring and business analytics. Headquartered in New York City, Medidata has locations in China, Japan, Singapore, South Korea, the United Kingdom, and the United States. Medidata's customers include pharmaceutical, biotechnology, medical device, and diagnostic companies; academic and government institutions; contract research organizations; and other life sciences organizations around the world that develop and bring medical therapies and products to market. History The company was founded in June 1999 by Tarek Sherif, Glen de Vries and Ed Ikeguchi. In 1994, de Vries and Ikeguchi created OceanTek, a startup that developed Web applications for conducting clinical trials and was the precursor to Medidata. In 1999, they restarted the company as a new firm and, together with Sherif, formed Medidata, to provide online systems for designing and running clinical trials. In 2004, they completed a $10 million round of financing with Insight Venture Partners, and were later backed by investors including Milestone Venture Partners and Stonehenge Capital Fund. Ikeguchi left the company in 2008, and de Vries moved from chief technology officer to president, with Sherif as chief executive officer. On January 26, 2009, Medidata filed to raise $86 million in an initial public offering (IPO). It made its IPO on the Nasdaq Stock Market on June 25, 2009, with its market capitalization at $313 million. It debuted on the Nasdaq at $18 per share. The company was ranked #11 on Fortune magazine's 2017 Fortune Future 50 list, #51 on the Fortune 100 Fastest Growing Compan
https://en.wikipedia.org/wiki/Abraham%E2%80%93Lorentz%20force
In the physics of electromagnetism, the Abraham–Lorentz force (also known as the Lorentz–Abraham force) is the recoil force (a force of equal magnitude and opposite direction) on an accelerating charged particle caused by the particle emitting electromagnetic radiation by self-interaction. It is also called the radiation reaction force, the radiation damping force, or the self-force. It is named after the physicists Max Abraham and Hendrik Lorentz. The formula, although predating the theory of special relativity, was initially calculated for non-relativistic velocities; it was extended to arbitrary velocities by Max Abraham and was shown to be physically consistent by George Adolphus Schott. The non-relativistic form is called the Lorentz self-force, while the relativistic version is called the Lorentz–Dirac force; collectively they are known as the Abraham–Lorentz–Dirac force. The equations are in the domain of classical physics, not quantum physics, and therefore may not be valid at distances of roughly the Compton wavelength or below. There are, however, two analogs of the formula that are both fully quantum and relativistic: one is called the "Abraham–Lorentz–Dirac–Langevin equation", the other is the self-force on a moving mirror. The force is proportional to the square of the object's charge, multiplied by the jerk that it is experiencing. (Jerk is the rate of change of acceleration.) The force points in the direction of the jerk. For example, in a cyclotron, where the jerk points opposite to the velocity, the radiation reaction is directed opposite to the velocity of the particle, providing a braking action. The Abraham–Lorentz force is the source of the radiation resistance of a radio antenna radiating radio waves. There are pathological solutions of the Abraham–Lorentz–Dirac equation in which a particle accelerates in advance of the application of a force, so-called pre-acceleration solutions. Since this would represent an effect occurring before its cause
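For reference, a standard statement of the non-relativistic form in SI units, which makes the stated proportionality to the squared charge and the jerk explicit:

```latex
\mathbf{F}_{\text{rad}} = \frac{q^2}{6\pi\varepsilon_0 c^3}\,\dot{\mathbf{a}}
= \frac{\mu_0 q^2}{6\pi c}\,\dot{\mathbf{a}}
```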
https://en.wikipedia.org/wiki/Essence%20of%20Decision
Essence of Decision: Explaining the Cuban Missile Crisis is a book by political scientist Graham T. Allison analyzing the 1962 Cuban Missile Crisis. Allison used the crisis as a case study for future studies into governmental decision-making. The book became the founding study of the John F. Kennedy School of Government, and in doing so revolutionized the field of international relations. Allison originally published the book in 1971. In 1999, because of new materials available (including tape recordings of the U.S. government's proceedings), he rewrote the book with Philip Zelikow. The title is based on a speech by John F. Kennedy, in which he said, "The essence of ultimate decision remains impenetrable to the observer - often, indeed, to the decider himself." Thesis When he first wrote the book, Allison contended that political science and the study of international relations were saturated with rational expectations theories inherited from the field of economics. Under such a view, the actions of states are analyzed by assuming that nations consider all options and act rationally to maximize their utility. Allison attributes such viewpoints to the dominance of economists such as Milton Friedman, statesmen such as Robert McNamara and Henry Kissinger, disciplines such as game theory, and organizations such as the RAND Corporation. However, as he puts it: It must be noted, however, that an imaginative analyst can construct an account of value-maximizing choice for any action or set of actions performed by a government. Or, to put it bluntly, this approach (which Allison terms the "Rational Actor Model") violates the principle of falsifiability. Also, Allison notes that "rational" analysts must ignore a lot of facts in order to make their analysis fit their models. In response, Allison constructed three different ways (or "lenses") through which analysts can examine events: the "Rational Actor" model, the "Organizational Behavior" model, and the "Governmental Po
https://en.wikipedia.org/wiki/Ras-interacting%20protein%201
Ras-interacting protein 1 (Rain) is a protein that in humans is encoded by the RASIP1 gene. Function It is required for the proper formation of vascular structures that develop via both vasculogenesis and angiogenesis. It acts as a critical and vascular-specific regulator of GTPase signaling, cell architecture, and adhesion, and is essential for endothelial cell morphogenesis and blood vessel tubulogenesis. It regulates the activity of Rho GTPases in part by recruiting ARHGAP29, suppressing RhoA signaling and dampening ROCK and MYH9 activities in endothelial cells. Clinical significance A recent genome-wide association study (GWAS) found that genetic variations in RASIP1 are associated with late-onset sporadic Alzheimer’s disease (LOAD). However, it is unknown how RASIP1 mutations contribute to the disease.
https://en.wikipedia.org/wiki/Paul%20Benioff
Paul Anthony Benioff (May 1, 1930 – March 29, 2022) was an American physicist who helped pioneer the field of quantum computing. Benioff was best known for his research in quantum information theory during the 1970s and 80s that demonstrated the theoretical possibility of quantum computers by describing the first quantum mechanical model of a computer. In this work, Benioff showed that a computer could operate under the laws of quantum mechanics by giving a Schrödinger equation description of Turing machines. Benioff's body of work in quantum information theory encompassed quantum computers, quantum robots, and the relationship between foundations in logic, math, and physics. Early life and education Benioff was born on May 1, 1930, in Pasadena, California. His father, Hugo Benioff, was a professor of seismology at the California Institute of Technology, and his mother, Alice Pauline Silverman, received a master's degree in English from the University of California, Berkeley. He married Hannelore Benioff. Benioff also attended Berkeley, where he earned an undergraduate degree in botany in 1951. After a two-year stint working in nuclear chemistry for Tracerlab, he returned to Berkeley. In 1959, he obtained his PhD in nuclear chemistry. Career In 1960, Benioff spent a year at the Weizmann Institute of Science in Israel as a postdoctoral fellow. He then spent six months at the Niels Bohr Institute in Copenhagen as a Ford Fellow. In 1961, he began a long career at Argonne National Laboratory, first with its Chemistry Division and later in 1978 in the lab's Environmental Impact Division. Benioff remained at Argonne until he retired in 1995. He continued to conduct research at the laboratory as a post-retirement emeritus scientist for the Physics Division until his death. In addition, Benioff taught the foundations of quantum mechanics as a visiting professor at Tel Aviv University in 1979, and he worked as a visiting scientist at CNRS Marseilles in 1979 and 1
https://en.wikipedia.org/wiki/Epiphenomenon
An epiphenomenon (plural: epiphenomena) is a secondary phenomenon that occurs alongside or in parallel to a primary phenomenon. The word has two senses: one that connotes known causation and one that connotes absence of causation or reservation of judgment about it. Examples Metaphysics The problem of epiphenomena often serves as a counterexample to theories of causation and is identified with situations in which an event E is caused by (or, is said to be caused by) an event C, which also causes (or, is said to cause) an event F. For example, take a simplified Lewisian counterfactual analysis of causation: the meaning of propositions about causal relationships between two events A and B can be explained in terms of counterfactual conditionals of the form "if A had not occurred then B would not have occurred". Suppose that C causes E and that C has an epiphenomenon F. Since both E and F depend on C, if E had not occurred, then C would not have occurred, and hence F would not have occurred either. But then according to the counterfactual analysis of causation, the proposition that there is a causal dependence of F on E is true; that is, on this view, E caused F. Since this is not in line with how we ordinarily speak about causation (we would not say that E caused F), a counterfactual analysis seems to be insufficient. Philosophy of mind and psychology An epiphenomenon can be an effect of primary phenomena, but cannot affect a primary phenomenon. In philosophy of mind, epiphenomenalism is the view that mental phenomena are epiphenomena in that they can be caused by physical phenomena, but cannot cause physical phenomena. In strong epiphenomenalism, epiphenomena that are mental phenomena can only be caused by physical phenomena, not by other mental phenomena. In weak epiphenomenalism, epiphenomena that are mental phenomena can be caused by both physical phenomena and other mental phenomena, but mental phenomena cannot be the cause of any physical phenomenon. The physical world operates independently of the menta
https://en.wikipedia.org/wiki/Lysibody
Although cell wall carbohydrates are ideal immunotherapeutic targets due to their abundance in bacteria and high level of conservation, their poor immunogenicity compared with protein targets complicates their use for the development of protective antibodies. A lysibody is a chimeric antibody in which the Fab region is the binding domain from a bacteriophage lysin, or the binding domain from an autolysin or bacteriocin, all of which bind to bacterial cell wall carbohydrate epitopes. This is linked to the Fc of immunoglobulin G (IgG). The chimera forms a stable homodimer held together by hinge-region disulfide bonds. Thus, lysibodies are homodimeric hybrid immunoglobulin G molecules that can bind with high affinity and specificity to a carbohydrate substrate in the bacterial cell wall peptidoglycan. Lysibodies behave like authentic IgG by binding at high affinity to their bacterial wall receptor, fixing complement and thereby promoting phagocytosis by macrophages and neutrophils, protecting mice from infection in model systems. Since cell wall hydrolases, autolysins and bacteriocins are ubiquitous in nature, production of lysibodies specific for difficult-to-treat pathogenic bacteria is possible. Binding domains may be linked to either the N-terminus of the IgG Fc (as is the case for autolysins) or to the C-terminus (as seen with phage lysins). In both cases the binding domains are able to bind their substrates in the bacterial cell wall and the Fc is able to perform its effector functions (see ref 2 for more detail). Lysibodies may be used prophylactically to help protect surgical patients from bacterial infections, particularly methicillin-resistant Staphylococcus aureus (MRSA), and to boost immune clearance in infected individuals.
https://en.wikipedia.org/wiki/Domestic%20turkey
The domestic turkey (Meleagris gallopavo domesticus) is a large fowl, one of the two species in the genus Meleagris and the same species as the wild turkey. Although turkey domestication was thought to have occurred in central Mesoamerica at least 2,000 years ago, recent research suggests a possible second domestication event in the area that is now the southwestern United States between 200 BC and AD 500. However, all of the main domestic turkey varieties today descend from the turkey raised in central Mexico that was subsequently imported into Europe by the Spanish in the 16th century. The domestic turkey is a popular form of poultry, and it is raised throughout temperate parts of the world, partially because industrialized farming has made it very cheap for the amount of meat it produces. Female domestic turkeys are called hens, and the chicks are poults or turkeylings. In Canada and the United States, male turkeys are called toms; in the United Kingdom and Ireland they are stags. The great majority of domestic turkeys are bred to have white feathers because their pin feathers are less visible when the carcass is dressed, although brown or bronze-feathered varieties are also raised. The fleshy protuberance atop the beak is the snood, and the one attached to the underside of the beak is known as a wattle. The English-language name for this species results from an early misidentification of the bird with an unrelated species which was imported to Europe through the country of Turkey. The Latin species name means "chicken peacock". History The modern domestic turkey is descended from the South Mexican subspecies (the nominate subspecies M. g. gallopavo) of wild turkey, found in Central Mexico in a region bounded by the present Mexican states of Jalisco to the northwest, Guerrero to the southwest, and Veracruz to the east. Ancient Mesoamericans domesticated this subspecies, using its meat and eggs as major sources of protein and employing its feathers extensiv
https://en.wikipedia.org/wiki/Gloeomargarita%20lithophora
Gloeomargarita lithophora is a cyanobacterium, and is the proposed sister of the endosymbiotic plastids in the eukaryote group Archaeplastida (glaucophytes, plants, green and red algae). Gloeomargarita's relative would have ended up in an ancestral archaeplastid through a single endosymbiotic event some 1900–1400 million years ago, after which the plastid was acquired secondarily by the euglenids and some members of the SAR supergroup. The origin of plastids by endosymbiosis marks the beginning of photosynthesis in eukaryotes, and as such their evolutionary relationship to Gloeomargarita lithophora, perhaps as a direct sister lineage, is of high importance to the evolutionary history of photosynthesis. Gloeomargarita appears to be related to a basal Synechococcus branch. A similar endosymbiotic event occurred about 500 million years ago, when another Synechococcus-related bacterium was taken up by Paulinella chromatophora. Description G. lithophora was first isolated in 2007 from microbialite samples taken from alkaline Lake Alchichica (Mexico). These samples were maintained in a lab aquarium, and G. lithophora was isolated from biofilm that formed within the aquarium. G. lithophora cells are gram-negative, unicellular rods with oxygenic photoautotrophic metabolism and gliding motility. They contain chlorophyll a and phycocyanin, and their photosynthetic thylakoids are located peripherally. Cells are 1.1 μm wide and 3.9 μm long on average. Growth occurs in both liquid and solid BG-11 growth media, as well as in alkaline water. The optimal growth temperature is 25 °C and the optimal growth pH is 8–8.5. Bioremediation Some evidence suggests that Gloeomargarita lithophora could serve as a biological buffer to treat water contaminated with strontium, barium, or radioactive pollutants such as radium. This could be a useful application of bioremediation.
https://en.wikipedia.org/wiki/Mobile%203D%20Graphics%20API
The Mobile 3D Graphics API, commonly referred to as M3G, is a specification defining an API for writing Java programs that produce 3D computer graphics. It extends the capabilities of Java ME, a version of the Java platform tailored for embedded devices such as mobile phones and PDAs. The object-oriented interface consists of 30 classes that can be used to draw complex animated three-dimensional scenes. M3G was developed under the Java Community Process as JSR 184. The current version of M3G is 1.1, but version 2.0 is in development as JSR 297. Immediate and retained modes M3G provides two ways for developers to draw 3D graphics: immediate mode and retained mode. In immediate mode, graphics commands are issued directly into the graphics pipeline and the rendering engine executes them immediately. When using this method, the developer must write code that specifically tells the rendering engine what to draw for each animation frame. A camera and a set of lights are also associated with the scene, but are not necessarily part of it. In immediate mode it is possible to display single objects as well as entire scenes (or worlds, with a camera, lights, and background as parts of the scene). Retained mode always uses a scene graph that links all geometric objects in the 3D world in a tree structure, and also specifies the camera, lights, and background. Higher-level information about each object—such as its geometric structure, position, and appearance—is retained from frame to frame. Other features The M3G standard also specifies a file format for 3D model data, including an animation data format. This allows developers to create content on PCs that can be loaded by M3G on mobile devices. Further reading Alessio Malizia: Mobile 3D Graphics, Springer, 2006, Kari Pulli, Tomi Aarnio, Ville Miettinen, Kimmo Roimela, Jani Vaarala: Mobile 3D Graphics with OpenGL ES and M3G, Morgan Kaufmann, 2007, Claus Höfele: Mobile 3D Graphics: Learning 3D Graphics with the
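M3G itself is a Java ME API that needs a MIDP device or emulator, so a runnable desktop example is out of reach here; the Python sketch below is a language-neutral caricature of the two modes, using hypothetical names (Node, render) that are not part of JSR 184: a retained-mode scene graph that persists between frames versus immediate-mode code that re-issues every draw command per frame.

    # Conceptual sketch of retained vs. immediate mode. All names here are
    # hypothetical stand-ins, not the actual JSR 184 classes.

    class Node:
        # Retained-mode scene-graph node: structure and state persist
        # between frames, and rendering walks the tree.
        def __init__(self, name, offset=0.0):
            self.name = name
            self.offset = offset      # stand-in for a full 3D transform
            self.children = []

        def render(self, parent_offset=0.0):
            world = parent_offset + self.offset   # compose transforms
            print(f"draw {self.name} at {world:.1f}")
            for child in self.children:
                child.render(world)

    # Retained mode: build the world once...
    world = Node("world")
    car = Node("car", offset=5.0)
    car.children.append(Node("wheel", offset=1.0))
    world.children.append(car)

    # ...then each frame only update state and re-render the same graph.
    for frame in range(2):
        car.offset += 0.5             # animate the car
        world.render()

    # Immediate mode, by contrast, re-issues every draw command per frame:
    for frame in range(2):
        offset = 5.0 + 0.5 * (frame + 1)
        print(f"draw car at {offset:.1f}")
        print(f"draw wheel at {offset + 1.0:.1f}")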
https://en.wikipedia.org/wiki/Stem%20Cell%20Research%20%28journal%29
Stem Cell Research is a peer-reviewed open-access scientific journal covering cell biology. The journal was established in 2007 and is currently published eight times per year by Elsevier. The current editor-in-chief is Thomas Zwaka (Icahn School of Medicine at Mount Sinai). Abstracting and indexing The journal is abstracted and indexed in several bibliographic databases. According to the Journal Citation Reports, the journal has a 2020 impact factor of 2.02.
https://en.wikipedia.org/wiki/Actephila%20excelsa
Actephila excelsa is a species of shrub in the family Phyllanthaceae. It is native to an area spanning Tropical Asia and China, from Sulawesi to India and Guangxi. It is a highly variable species, and leaf forms vary across adjacent ecozones. The plant is used in building houses and as a vegetable. Grey-shanked douc langurs eat the leaves. Description It is noted that this is a highly variable species. The species grows as a shrub or tree from 1 to 10 m (rarely 15 m) tall, with a trunk up to 30 cm in diameter. The outer bark is pale-tan to greyish, greenish-yellow or reddish in colour, and ranges from smooth to finely vertically fissured to scaly. Leaves are alternate, though subopposite at the ends of branches, ranging from smooth to completely covered in hairs, with an elliptic (sometimes more or less obovate) blade some (4-)5.5–35.5 x (1.1–)1.9–13.5(–15.9) cm in size; an acute to obtuse base, flat margin, and cuspidate (sometimes acuminate or, rarely, acute) apex; the upper surface light to dark-green and glossy, the lower paler green and sometimes hairy on the midrib and veins. Flowers are solitary to bundled, white to greenish-white. The fruit capsule is oblate, some 1.5 by 2–2.5 cm in diameter, with a brown skin and yellowish-white inside. The three-angled seeds are about 10 by 0.9 cm in size. Distinguishing features of this species of Actephila include the 5–95 mm long petioles; elliptic (to more or less obovate) leaf blades; the dimensions of the leaf blades (see above); white to greenish pistillate flowers; a knobbly-surfaced fruit wall with venation not raised; and the 5–8 mm long columella, which is somewhat thickened basally but does not completely cover the disc and base of the sepals. In most of its habitat it will flower and fruit the whole year round. In China, flowering occurs from February to September and fruiting from July to October. Distinguishing characteristics for this species of Actephila in China include the long-acuminate leaf blade, which is puberulent abaxially (hairy on the lower surface); the f
https://en.wikipedia.org/wiki/TwinBee
TwinBee is a vertically scrolling shooter released by Konami as an arcade video game in Japan in 1985. Along with Sega's Fantasy Zone, released a year later, TwinBee is credited as an early archetype of the "cute 'em up" type in its genre. It was the first game to run on Konami's Bubble System hardware. TwinBee was ported to the Family Computer and MSX in 1986 and has been included in numerous compilations released in later years. The original arcade game was released outside Japan for the first time in the Nintendo DS compilation Konami Classics Series: Arcade Hits. A mobile phone version was released for i-mode Japan phones in 2003 with edited graphics. Various TwinBee sequels were released for the arcade and home console markets following the original game, some of which spawned audio drama and anime adaptations in Japan. Gameplay TwinBee can be played by up to two players simultaneously. The player takes control of a cartoon-like anthropomorphic spacecraft, with Player 1 taking control of TwinBee, the titular ship, while Player 2 controls WinBee. The game control consists of an eight-way joystick and two buttons: one for shooting enemies in the air and the other for dropping bombs on ground enemies (similar to Xevious). The player's primary power-ups are bells that can be uncovered by shooting at the floating clouds where they're hidden. If the player continues shooting the bell after it appears, it will change into one of four other colors: the regular yellow bells only grant bonus points, the white bell will upgrade the player's gun into a twin cannon, the blue bell increases the player's speed (for up to five speed levels), the green bell will allow the player to create image copies of its ship for additional firepower, and the red bell will provide the player's ship a barrier that allows it to sustain more damage. The green and red bells cannot be combined. Other power-ups can also be retrieved from ground enemies such as an alternate bell that gives the player's sh
https://en.wikipedia.org/wiki/Wireless%20Transport%20Layer%20Security
Wireless Transport Layer Security (WTLS) is a security protocol, part of the Wireless Application Protocol (WAP) stack. It sits between the WTP and WDP layers in the WAP communications stack. Overview WTLS is derived from TLS. WTLS uses similar semantics adapted for low-bandwidth mobile devices. The main changes are: Compressed data structures — where possible, packet sizes are reduced by using bit-fields, discarding redundancy and truncating some cryptographic elements. New certificate format — WTLS defines a compressed certificate format. This broadly follows the X.509 v3 certificate structure, but uses smaller data structures. Packet-based design — TLS is designed for use over a data stream. WTLS adapts that design to be more appropriate on a packet-based network. A significant amount of the design is based on the requirement that it be possible to use a packet network such as SMS as a data transport. WTLS has been superseded in the WAP 2.0 standard by the End-to-end Transport Layer Security Specification. Security WTLS uses cryptographic algorithms and, in common with TLS, allows negotiation of cryptographic suites between client and server. Algorithms An incomplete list: key exchange and signature — RSA, Elliptic Curve Cryptography (ECC); symmetric encryption — DES, Triple DES, RC5; message digest — MD5, SHA-1. Security criticisms Encryption/decryption at the gateway — in the WAP architecture the content is typically stored on the server as uncompressed WML (an XML-based language). That content is retrieved by the gateway using HTTP and compressed into WBXML; in order to perform that compression the gateway must be able to handle the WML in cleartext, so even if there is encryption between the client and the gateway (using WTLS) and between the gateway and the originating server (using HTTPS), the gateway acts as a man-in-the-middle. This gateway architecture serves a number of purposes: transcoding between HTML and WML; content provide
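The gateway criticism can be made concrete with a short sketch. This models only the decrypt-and-re-encrypt step, not WTLS itself, and uses symmetric Fernet keys from the third-party Python "cryptography" package as stand-ins for the two separate sessions.

    # Conceptual sketch of the WAP gateway criticism: two separately
    # encrypted legs meet in cleartext at the gateway. This is not the
    # actual WTLS protocol or handshake.
    from cryptography.fernet import Fernet

    client_gateway_key = Fernet.generate_key()   # models the WTLS session
    gateway_server_key = Fernet.generate_key()   # models the HTTPS session

    # Client encrypts a request toward the gateway (the WTLS leg).
    request = Fernet(client_gateway_key).encrypt(b"GET /bank/balance")

    # The gateway must decrypt in order to transcode WML into WBXML,
    # so it necessarily sees the plaintext.
    plaintext = Fernet(client_gateway_key).decrypt(request)
    print("gateway sees:", plaintext)

    # The gateway then re-encrypts toward the origin server (the HTTPS leg).
    forwarded = Fernet(gateway_server_key).encrypt(plaintext)

    # End to end, no single key protects the message: the gateway is a
    # trusted man-in-the-middle by design.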
https://en.wikipedia.org/wiki/Charles%20F.%20Roos
Charles Frederick Roos (May 18, 1901 – January 6, 1958) was an American economist who made contributions to mathematical economics. He was one of the founders of the Econometric Society together with American economist Irving Fisher and Norwegian economist Ragnar Frisch in 1930. He served as Secretary-Treasurer during the first year of the Society and was elected as President in 1948. He was director of research of the Cowles Commission from September 1934 to January 1937. Roos earned a PhD in mathematics from Rice University in 1926, under the supervision of Griffith C. Evans. He was amongst the first, together with Evans and the mathematician Frank P. Ramsey, to use the calculus of variations in mathematical economics. His direct involvement with two key institutions in economic history, both the Econometric Society and the Cowles Commission, places him in a pivotal position in the mathematization of economics in the first half of the 20th century. His own work, however, would not be so influential. Mathematical economics and econometrics eventually favored technical and epistemological approaches that were different from his own. Biography Charles Frederick Roos was born in New Orleans, Louisiana on 18 May 1901. He studied mathematics at the Rice Institute, receiving his Bachelor of Arts degree in 1921, his Master of Arts degree in 1924, and his PhD in 1926. His main interests were in the calculus of variations, integral equations, and the applications of these to economic theory. His early research was deeply inspired by the work of his thesis advisor Griffith C. Evans, who used these same mathematical tools to analyze business cycles, economic equilibrium, and economic competition. During the following years, from 1926 to 1928, Roos continued his academic studies as National Research Fellow at the University of Chicago and Princeton University. At Princeton, Roos met the Norwegian economist Ragnar Frisch, who was travelling there under a grant from the
https://en.wikipedia.org/wiki/Webroot
Webroot Inc. is an American privately held cybersecurity software company that provides Internet security for consumers and businesses. The company was founded in Boulder, Colorado, US, and is now headquartered in Broomfield, Colorado, and has US operations in San Mateo and San Diego, and globally in Australia, Austria, Ireland, Japan and the United Kingdom. History Webroot was founded on 5 July 1997 when Steven Thomas and his girlfriend, Boulderite Kristen Tally, launched Webroot's first commercial product, a trace removal agent called Webroot Window Washer. Investors include venture capital firms such as Technology Crossover Ventures, Accel Partners and Mayfield. In 2002, Webroot launched a spyware blocking and removal product called Webroot Spy Sweeper. The company introduced antivirus protection with the launch of Spy Sweeper with AntiVirus in 2006. In October 2007, Webroot AntiVirus with AntiSpyware and Desktop Firewall was released with an added firewall protection feature. Webroot entered the enterprise market in 2004 with the launch of Webroot Spy Sweeper Enterprise, which combined Spy Sweeper with technology that enables IT administrators to deploy antispyware protection across an entire network. In October 2008, Webroot launched its first consumer security suite, Webroot Internet Security Essentials, in the United States. The international release of the security suite followed in early 2009. In August 2009, Webroot appointed a new president and CEO, the former CEO of Wily Technology. In May 2010 Webroot announced plans to open its international headquarters in Dublin, Ireland. In July 2010 Webroot Internet Security Complete 2011 was released, including antivirus and antispyware protection, firewall capabilities, online back-up, password management licensed from LastPass, protection against identity theft and credit card monitoring for US customers. In September 2010 Webroot opened a regional office in Leidschendam, the Netherlands, which is primarily ai
https://en.wikipedia.org/wiki/115th%20meridian%20east
The meridian 115° east of Greenwich is a line of longitude that extends from the North Pole across the Arctic Ocean, Asia, the Indian Ocean, Australasia, the Southern Ocean, and Antarctica to the South Pole. The 115th meridian east forms a great circle with the 65th meridian west. Between Australia and the 60th parallel south it forms the western boundary of the South Pacific Nuclear-Weapon-Free Zone. From Pole to Pole Starting at the North Pole and heading south to the South Pole, the 115th meridian east passes through:
Arctic Ocean
Laptev Sea
Russia: Sakha Republic, Irkutsk Oblast, Republic of Buryatia, Zabaykalsky Krai
Mongolia
China: Inner Mongolia, Hebei, Henan, Shandong, Henan, Anhui (for about 18 km), Henan, Hubei, Jiangxi, Guangdong
South China Sea: passing through the disputed Spratly Islands
Brunei: on the island of Borneo
Malaysia: Sarawak, on the island of Borneo
Indonesia: North Kalimantan, East Kalimantan, Central Kalimantan and South Kalimantan, on the island of Borneo
Java Sea
Bali Sea
Indonesia: the island of Bali
https://en.wikipedia.org/wiki/Oleg%20Viro
Oleg Yanovich Viro (b. 13 May 1948, Leningrad, USSR) is a Russian mathematician in the fields of topology and algebraic geometry, most notably real algebraic geometry, tropical geometry and knot theory. Contributions Viro developed a "patchworking" technique in algebraic geometry, which allows real algebraic varieties to be constructed by a "cut and paste" method. Using this technique, Viro completed the isotopy classification of non-singular plane projective curves of degree 7. The patchworking technique was one of the fundamental ideas which motivated the development of tropical geometry. In topology, Viro is best known for his joint work with Vladimir Turaev, in which the Turaev–Viro invariants (relatives of the Reshetikhin–Turaev invariants) and related topological quantum field theory notions were introduced. Education and career Viro studied at the Leningrad State University where he received his Ph.D. degree in 1974; his advisor was Vladimir Rokhlin. Viro taught from 1973 until 1991 at Leningrad State University. Since 1986 he has been a member of the Saint Petersburg Department of the Steklov Institute of Mathematics. In 1992–1997, Viro was an F. B. Jones chair professor in topology at the University of California, Riverside. In 1994–2003 he was a professor at Uppsala University, Sweden. On 8 February 2007, Viro and his colleague Burglind Juhl-Jöricke were forced to resign from the university. There had been a history of conflict at the Mathematics Institute, with allegations of disagreeable behavior by several parties in the conflict. A number of Swedish, European and American mathematicians protested the manner in which the two professors of mathematics were forced to resign. These protests include the following: an open letter by Lennart Carleson, former president of the International Mathematical Union, a letter by Ari Laptev, current president of the European Mathematical Society, and a letter from M. Salah Baouendi, Arthur Jaffe, Joel Lebowitz, E
https://en.wikipedia.org/wiki/Germinal%20matrix
In anatomy, the germinal matrix is a highly cellular and highly vascularized region in the brain out from which cells migrate during brain development. The germinal matrix is the source of both neurons and glial cells and is most active between 8 and 28 weeks gestation. It is a fragile portion of the brain that may be damaged leading to a germinal matrix hemorrhage (grade 1 intraventricular hemorrhage). Location/anatomy: The germinal matrix is next to the lateral ventricles (the "inside" of the brain). Function/physiology: Neurons and glia migrate radially outward from the germinal matrix towards the cerebral cortex. For more information, see the associated articles on neuronal migration and corticogenesis. Dysfunction/pathophysiology: in prenatology/neonatology, intraventricular hemorrhages occur starting in the germinal matrix due to the lack of structural integrity there. Intraventricular hemorrhages are a common and harmful issue in children born prematurely. See also Intraventricular hemorrhage Ganglionic eminence
https://en.wikipedia.org/wiki/Vital%20theory
According to the vital force theory, the conduction of water up the xylem vessel is a result of the vital activity of the living cells in the xylem tissue. These living cells are involved in the ascent of sap. The relay pump theory and the pulsation theory support this vital (active) account of the ascent of sap. Emil Godlewski (senior) proposed the relay pump or clambering force theory in 1884, in which water is relayed upward through the xylem parenchyma, and Jagadish Chandra Bose (1923) proposed the pulsation theory, which attributes the ascent to the pulsatory activity of the innermost cortical cells just outside the endodermis. Bose suggested a mechanism for the ascent of sap in 1927. He supported his theory with the help of a galvanometer and electric probes. He found electrical "pulsations", or oscillations in electric potentials, and came to believe these were coupled with rhythmic movements in the telegraph plant Codariocalyx motorius (then Desmodium). On the basis of this, Bose theorized that regular wave-like "pulsations" in cell electric potential and turgor pressure were an endogenous form of cell signaling. According to him, the living cells in the inner lining of the xylem tissue pump water by contractive and expulsive movements, similar to the animal heart circulating blood. This mechanism has not been well supported, and in spite of some ongoing debate, the evidence overwhelmingly supports the cohesion-tension theory for the ascent of sap. See also Cohesion-tension theory External links Bioelectricity and the rhythms of sensitive plants – The biophysical research of Jagadis Chandra Bose
https://en.wikipedia.org/wiki/Florey%20Medal
The Florey Medal, also known as the CSL Florey Medal and the Florey Medal for Lifetime Achievement, is an Australian award for biomedical research named in honour of Australian Nobel Laureate Howard Florey. The medal is awarded biennially and the recipient receives $50,000 in prize money. The Medal was first awarded in 1998, the centenary of Florey's birth. It is administered by the Australian Institute of Policy & Science and has been sponsored by F H Faulding, then Mayne (when they took over Fauldings), Merck Sharp & Dohme, and is currently sponsored by CSL Limited. Recipients Past recipients include: 1998 – Barry Marshall and Robin Warren for their work on Helicobacter pylori and its role in gastritis and peptic ulcer disease 2000 – Jacques Miller for work on the function of the thymus 2002 – Colin L. Masters for Alzheimer's disease research 2004 – Peter Colman for structural biology research 2006 – Ian Frazer for development of the cervical cancer vaccine Gardasil 2009 – for research and clinical application in lysosomal disorders 2011 – Graeme Clark for his invention of the bionic ear 2013 – Ruth Bishop for her work on understanding the rotavirus and the creation of a vaccine 2015 – Perry Bartlett for his discoveries that have transformed our understanding of the brain 2017 – Elizabeth Rakoczy from the Lions Eye Institute at the University of Western Australia for her work on a new gene therapy for wet age-related macular degeneration. 2019 – David Vaux and Andreas Strasser of the Walter and Eliza Hall Institute for their work on revealing the links between cell death and cancer. See also List of biomedical science awards List of awards named after people
https://en.wikipedia.org/wiki/HCT116%20cells
HCT116 is a human colon cancer cell line used in therapeutic research and drug screening. Characteristics HCT116 cells have a mutation in codon 13 of the KRAS proto-oncogene, and are suitable transfection targets for gene therapy research. The cells have an epithelial morphology and can metastasize in xenograft models. When transduced with viral vectors carrying the p53 gene, HCT116 cells remain arrested in the G1 phase. The proliferation of HCT116 colonies was found to be inhibited by 5-Fu/P85 copolymer micelles. Furthermore, it was found that knockout of MARCH2 limited the growth of HCT116 cells via stress on the endoplasmic reticulum. Use in research HCT116 cells are used in a variety of biomedical studies involving colon cancer proliferation and corresponding inhibitors. The cell line has been used in tumorigenicity studies, and other research has shown that Cyclin D1 is important for the activity of lithocholic acid hydroxyamide. HCT116 cells can also function in xenografts, with docetaxel, 5-FU, and flavopiridol limiting tumor growth in vitro. The HCT116 cell line was found to have two variants: one with high expression of the Insp8 gene, and the other without. The Insp8 gene is part of a cell's energy metabolism process, and can affect the cellular phenotype as a result.
https://en.wikipedia.org/wiki/Algorithm
In mathematics and computer science, an algorithm () is a finite sequence of rigorous instructions, typically used to solve a class of specific problems or to perform a computation. Algorithms are used as specifications for performing calculations and data processing. More advanced algorithms can use conditionals to divert the code execution through various routes (referred to as automated decision-making) and deduce valid inferences (referred to as automated reasoning), achieving automation eventually. Using human characteristics as descriptors of machines in metaphorical ways was already practiced by Alan Turing with terms such as "memory", "search" and "stimulus". In contrast, a heuristic is an approach to problem solving that may not be fully specified or may not guarantee correct or optimal results, especially in problem domains where there is no well-defined correct or optimal result. As an effective method, an algorithm can be expressed within a finite amount of space and time and in a well-defined formal language for calculating a function. Starting from an initial state and initial input (perhaps empty), the instructions describe a computation that, when executed, proceeds through a finite number of well-defined successive states, eventually producing "output" and terminating at a final ending state. The transition from one state to the next is not necessarily deterministic; some algorithms, known as randomized algorithms, incorporate random input. History Ancient algorithms Since antiquity, step-by-step procedures for solving mathematical problems have been attested. This includes Babylonian mathematics (around 2500 BC), Egyptian mathematics (around 1550 BC), Indian mathematics (around 800 BC and later; e.g. Shulba Sutras, Kerala School, and Brāhmasphuṭasiddhānta), The Ifa Oracle (around 500 BC), Greek mathematics (around 240 BC, e.g. sieve of Eratosthenes and Euclidean algorithm), and Arabic mathematics (9th century, e.g. cryptographic algorithms for
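The Euclidean algorithm mentioned above makes the definition concrete: a finite list of well-defined instructions that takes an input, proceeds through determinate successive states, and terminates with an output. A minimal Python sketch:

    # The Euclidean algorithm: a finite, well-defined procedure that takes
    # two integers as input and terminates with their greatest common divisor.
    def gcd(a: int, b: int) -> int:
        while b != 0:
            a, b = b, a % b   # each state transition is fully determined
        return a

    print(gcd(240, 46))  # 2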
https://en.wikipedia.org/wiki/Perceived%20visual%20angle
In human visual perception, the visual angle, denoted θ, subtended by a viewed object sometimes looks larger or smaller than its actual value. One approach to this phenomenon posits a subjective correlate to the visual angle: the perceived visual angle or perceived angular size. An optical illusion where the physical and subjective angles differ is then called a visual angle illusion or angular size illusion. Angular size illusions are most obvious as relative angular size illusions, in which two objects that subtend the same visual angle appear to have different angular sizes; it is as if their equal-sized images on the retina were of different sizes. Angular size illusions are contrasted with linear size illusions, in which two objects that are the same physical size do not appear so. An angular size illusion may be accompanied by (or cause) a linear size illusion at the same time. The perceived visual angle paradigm begins with a rejection of the classical size–distance invariance hypothesis (SDIH), which states that the ratio of perceived linear size to perceived distance is a simple function of the visual angle. The SDIH does not explain some illusions, such as the Moon illusion, in which the Moon appears larger when it is near the horizon. It is replaced by a perceptual SDIH, in which the visual angle is replaced by the perceived visual angle. This new formulation avoids some of the paradoxes of the SDIH, but it remains difficult to explain why a given illusion occurs. This paradigm is not universally accepted; many textbook explanations of size and distance perception do not refer to the perceived visual angle, and some researchers deny that it exists. Some recent evidence supporting the idea, reported by Murray, Boyaci and Kersten (2006), suggests a direct relationship between the perceived angular size of an object and the size of the neural activity pattern it excites in the primary visual cortex. A relatively new idea Visual angle illusions have been
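The physical visual angle, as opposed to the perceived one, is elementary trigonometry. The sketch below (illustrative numbers only) constructs the situation that relative angular size illusions concern: two objects of different linear size subtending exactly the same angle θ.

    import math

    def visual_angle(size_m: float, distance_m: float) -> float:
        """Physical visual angle (degrees) subtended by an object seen head-on."""
        return math.degrees(2 * math.atan(size_m / (2 * distance_m)))

    # A 1 m object at 10 m and a 2 m object at 20 m subtend the same angle,
    # so their retinal images are the same size...
    print(visual_angle(1, 10))   # ~5.72 degrees
    print(visual_angle(2, 20))   # ~5.72 degrees
    # ...yet they may still *appear* to differ in angular size; that
    # difference is what the perceived-visual-angle paradigm addresses.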
https://en.wikipedia.org/wiki/Notation%20system
In linguistics and semiotics, a notation system is a system of graphics or symbols, characters and abbreviated expressions, used (for example) in artistic and scientific disciplines to represent technical facts and quantities by convention. Therefore, a notation is a collection of related symbols that are each given an arbitrary meaning, created to facilitate structured communication within a domain of knowledge or field of study. Standard notations refer to general agreements in the way things are written or denoted. The term is generally used in technical and scientific areas of study like mathematics, physics, chemistry and biology, but can also be seen in areas like business, economics and music. Written communication Writing systems Phonographic writing systems, by definition, use symbols to represent components of auditory language, i.e. speech, which in turn refers to things or ideas. The two main kinds of phonographic notational system are the alphabet and the syllabary. Some written languages are more consistent in their correlation of written symbols (or graphemes) with sound (or phonemes), and are therefore considered to have better phonemic orthography. Ideographic writing, by definition, refers to things or ideas independently of their pronunciation in any language. Some ideographic systems are also pictograms that convey meaning through their pictorial resemblance to a physical object. Linguistics Various brackets, parentheses, slashes, and lines are used around words and letters in linguistics to distinguish written from spoken forms. Biology and medicine Nucleic acid notation Systems Biology Graphical Notation (SBGN) Sequence motif pattern-description notations Cytogenetic notation Energy Systems Language Chemistry A chemical formula describes a chemical compound using element symbols and subscripts, e.g. H₂O for water or C₆H₁₂O₆ for glucose. SMILES is a notation for describing the structure of a molecule with a plain text string, e.g. N
https://en.wikipedia.org/wiki/Odd%E2%80%93even%20rationing
Odd–even rationing is a method of rationing in which access to some resource is restricted to some of the population on any given day. In a common example, drivers of private vehicles may be allowed to drive, park, or purchase gasoline on alternating days, according to whether the last digit in their license plate is even or odd. Similarly, during a drought, houses can be restricted from using water outdoors according to the parity of the house number. Typically a day is "odd" or "even" depending on the day of the month. An issue with this approach is that two "odd" days in a row occur whenever a month ends on an odd-numbered day. Sometimes odd or even may be based on day of the week, with Sundays excluded or included for everyone. Effectiveness The efficacy of odd–even rationing is debated. For gasoline, it does not actually reduce consumption much, since people prevented from filling up one day will just fill up the day before or the day after; the total number of people in line on each day is roughly unchanged. Some propose that it has psychological effects like reducing panic buying, discouraging people from making small purchases on a daily basis, or emphasizing the shortage and further discouraging unnecessary trips. Rationing access, rather than gasoline, based on number plate parity can reduce traffic congestion. In some areas, wealthier people purposely own two cars with opposite-parity number plates, to circumvent any restrictions. Vanity plates which do not contain any digits may be arbitrarily classed as odd or even. Dealing with 0 Mathematically, zero is an even number; indeed, half of the numbers in a given range end in 0, 2, 4, 6, 8 and the other half in 1, 3, 5, 7, 9, so it makes sense to include 0 with the other even digits for rationing. However, some people do not know that zero is even, and this ignorance can cause confusion. The relevant law sometimes stipulates that zero is even. In fact, an odd–even restriction on driving in Paris in 1
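A toy sketch of how such a rule might be implemented; the policy choices here (0 counts as even, digitless vanity plates are classed as odd) are illustrative assumptions, not taken from any actual regulation.

    def may_drive(plate: str, day_of_month: int) -> bool:
        """Toy odd-even rule: plate's last digit parity must match the date's.

        Policy choices are illustrative: 0 is treated as even (it is), and
        vanity plates with no digits are arbitrarily classed as odd.
        """
        digits = [c for c in plate if c.isdigit()]
        plate_is_even = (int(digits[-1]) % 2 == 0) if digits else False
        day_is_even = (day_of_month % 2 == 0)
        return plate_is_even == day_is_even

    print(may_drive("ABC-120", 14))  # True: last digit 0 is even, 14 is even
    print(may_drive("QWERTY", 14))   # False: no digits, so classed as odd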
https://en.wikipedia.org/wiki/UMA%20Today
UMA Today is an international consortium of companies joined together to lead the adoption of 3GPP UMA technology around the world. UMA is the commercial name for the global 3GPP Generic Access Network (GAN) standard for fixed-mobile convergence (FMC). UMA enables secure, scalable access to mobile voice, data and IMS services over broadband IP access networks. By deploying UMA, mobile operators can deliver a number of compelling FMC services. The most well-known applications of UMA include dual-mode Wi-Fi/cellular phones. Leading operators around the world have embraced UMA as the foundation for their FMC strategies, including France Telecom/Orange, T-Mobile (USA), Rogers Wireless, TeliaSonera and Cincinnati Bell. UMA Today publishes the UMA Today Magazine, hosts webinars, sends news alerts to a subscription list, and is involved in other industry activity to promote UMA technology. In addition, UMA Today co-sponsored the UMA Innovation Awards with Orange/France Telecom in 2008 and 2009. In October 2010, T-Mobile USA announced it was using Kineto Wireless Smart Wi-Fi technology as the enabling technology for its Wi-Fi Calling service offer. External links UMAToday UMA Today blog. BlackBerry Curve 8320 Wins UMA Innovation Award, Mobile Shop, Feb. 2009 Orange and UMA Today Announce Second Annual UMA Innovation Awards Orange and UMA Today Announce Winners of Second Annual UMA Innovation Awards, TMCNet, Feb. 2009
https://en.wikipedia.org/wiki/Contractible%20space
In mathematics, a topological space X is contractible if the identity map on X is null-homotopic, i.e. if it is homotopic to some constant map. Intuitively, a contractible space is one that can be continuously shrunk to a point within that space. Properties A contractible space is precisely one with the homotopy type of a point. It follows that all the homotopy groups of a contractible space are trivial. Therefore any space with a nontrivial homotopy group cannot be contractible. Similarly, since singular homology is a homotopy invariant, the reduced homology groups of a contractible space are all trivial. For a topological space X the following are all equivalent: X is contractible (i.e. the identity map is null-homotopic). X is homotopy equivalent to a one-point space. X deformation retracts onto a point. (However, there exist contractible spaces which do not strongly deformation retract to a point.) For any path-connected space Y, any two maps f,g: Y → X are homotopic. For any space Y, any map f: Y → X is null-homotopic. The cone on a space X is always contractible. Therefore any space can be embedded in a contractible one (which also illustrates that subspaces of contractible spaces need not be contractible). Furthermore, X is contractible if and only if there exists a retraction from the cone of X to X. Every contractible space is path connected and simply connected. Moreover, since all the higher homotopy groups vanish, every contractible space is n-connected for all n ≥ 0. Locally contractible spaces A topological space X is locally contractible at a point x if for every neighborhood U of x there is a neighborhood V of x contained in U such that the inclusion of V is null-homotopic in U. A space is locally contractible if it is locally contractible at every point. This definition is occasionally referred to as the "geometric topologist's locally contractible", though it is the most common usage of the term. In Hatcher's standard Algebraic Topology text,
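A standard worked example, added here for concreteness: Euclidean space R^n is contractible via the straight-line homotopy

    H \colon \mathbb{R}^n \times [0,1] \to \mathbb{R}^n, \qquad H(x,t) = (1-t)\,x,

which satisfies H(x,0) = x (the identity map) and H(x,1) = 0 (a constant map), exhibiting the identity as null-homotopic. The same formula contracts any subset of R^n that is star-shaped about the origin.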
https://en.wikipedia.org/wiki/Leptomycin
Leptomycins are secondary metabolites produced by Streptomyces spp. Leptomycin B (LMB) was originally discovered as a potent antifungal compound. Leptomycin B was found to cause cell elongation of the fission yeast Schizosaccharomyces pombe. Since then this elongation effect has been used for the bioassay of leptomycin. However, recent data shows that leptomycin causes G1 cell cycle arrest in mammalian cells and is a potent anti-tumor agent against murine experimental tumors in combination therapy. Leptomycin B has been shown to be a potent and specific nuclear export inhibitor in humans and the fission yeast S. pombe. Leptomycin B alkylates and inhibits CRM1 (chromosomal region maintenance 1)/exportin 1, a protein required for nuclear export of proteins containing a nuclear export sequence (NES), by covalently modifying a cysteine residue (cysteine 529 in S. pombe). In addition to antifungal and antibacterial activities, leptomycin B blocks the cell cycle and is a potent anti-tumor agent. At low nM concentrations, leptomycin B blocks the nuclear export of many proteins including HIV-1 Rev, MAPK/ERK, and NF-κB/IκB, and it inhibits the inactivation of p53. Leptomycin B also inhibits the export and translation of many RNAs, including COX-2 and c-Fos mRNAs, by inhibiting the export of ribonucleoproteins. Leptomycin A (LPA) was discovered together with LMB. LMB is twice as potent as LPA. See also Selective inhibitor of nuclear export
https://en.wikipedia.org/wiki/Mir-340%20microRNA%20precursor%20family
In molecular biology, mir-340 microRNA is a short RNA molecule. MicroRNAs function to regulate the expression levels of other genes by several mechanisms. See also MicroRNA
https://en.wikipedia.org/wiki/Fenchel%27s%20theorem
In differential geometry, Fenchel's theorem is an inequality on the total absolute curvature of a closed smooth space curve, stating that it is always at least 2π. Equivalently, the average curvature is at least 2π/L, where L is the length of the curve. The only curves of this type whose total absolute curvature equals 2π, and whose average curvature equals 2π/L, are the plane convex curves. The theorem is named after Werner Fenchel, who published it in 1929. The Fenchel theorem is enhanced by the Fáry–Milnor theorem, which says that if a closed smooth simple space curve is nontrivially knotted, then the total absolute curvature is greater than 4π. Proof Given a closed smooth curve γ with unit speed, the velocity T = γ′ is also a closed smooth curve, lying on the unit sphere. The total absolute curvature is its length ℓ(T). The curve T does not lie in an open hemisphere: if it did, there would be a unit vector v such that ⟨T(s), v⟩ > 0 for all s, so ⟨γ(L) − γ(0), v⟩ = ∫ ⟨T(s), v⟩ ds > 0, a contradiction, since γ is closed. This also shows that if T lies in a closed hemisphere ⟨·, v⟩ ≥ 0, then ⟨T(s), v⟩ = 0 for all s, so γ is a plane curve. Consider a point T(s₁) such that the two arcs of T between the initial point T(s₀) and T(s₁) have the same length. By rotating the sphere, we may assume T(s₀) and T(s₁) are symmetric about the axis through the poles. By the previous paragraph, at least one of the two arcs intersects the equator at some point p. We denote this arc by T₁; then ℓ(T₁) = ℓ(T)/2. We reflect T₁ across the plane through T(s₀), T(s₁), and the north pole, forming a closed curve containing the antipodal points p and −p, with length ℓ(T). A curve connecting p and −p has length at least π, which is the length of the great semicircle between them; the closed curve consists of two such arcs. So ℓ(T) ≥ 2π, and if equality holds then T₁ does not cross the equator. Therefore the total absolute curvature is at least 2π, and if equality holds then T lies in a closed hemisphere, and thus γ is a plane curve.
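A worked equality case, added for concreteness: a plane circle of radius r has constant curvature κ ≡ 1/r and length L = 2πr, so

    \int_0^L |\kappa| \, \mathrm{d}s = \frac{1}{r} \cdot 2\pi r = 2\pi,

attaining Fenchel's lower bound, as expected since the circle is a plane convex curve.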
https://en.wikipedia.org/wiki/Chili%20sauce%20and%20paste
Chili sauce and chili paste are condiments prepared with chili peppers. Chili sauce may be hot, sweet or a combination thereof, and may differ from hot sauce in that many sweet or mild varieties exist, which is typically lacking in hot sauces. Several varieties of chili sauce include sugar in their preparation, such as the Thai sweet chili sauce and Filipino agre dulce, which adds sweetness to their flavor profile. Sometimes, chili sauces are prepared with red tomato as a primary ingredient. Many chili sauces may have a thicker texture and viscosity when compared to that of hot sauces. Chili paste usually refers to a paste where the main ingredient is chili pepper. Some are used as a cooking ingredient, while others are used to season a dish after preparation. Some are fermented with beans, as in Chinese doubanjiang, and some are prepared with powdered fermented beans, as in Korean gochujang. There are different regional varieties of chili paste, even within the same cuisine. Chili sauces and pastes can be used as dipping sauces, cooking glazes and marinades. Many commercial varieties of mass-produced chili sauce and paste exist. Ingredients Ingredients typically include puréed or chopped chili peppers, vinegar, sugar and salt, which are cooked, thickening the mixture. Additional ingredients may include water, garlic, other foodstuffs, corn syrup, spices and seasonings. Some varieties use ripe red puréed tomato as the primary ingredient. Varieties East Asia China Chili oil is a distinctive Sichuan flavoring found mainly in cold dishes, as well as a few hot dishes. Chili oil is made by pouring hot oil onto a bowl of dried chilies, to which some Sichuan pepper is usually added. After steeping in hot oil for at least a few hours, the oil takes on the taste and fragrance of chili. The finer the chili is ground, the stronger the flavor (regional preferences vary; ground chili is usually used in western China, while whole dried chili is more common in nor
https://en.wikipedia.org/wiki/Umami
Umami, or savoriness, is one of the five basic tastes. It has been described as savory and is characteristic of broths and cooked meats. People taste umami through taste receptors that typically respond to glutamates and nucleotides, which are widely present in meat broths and fermented products. Glutamates are commonly added to some foods in the form of monosodium glutamate (MSG), and nucleotides are commonly added in the form of disodium guanylate, inosine monophosphate (IMP) or guanosine monophosphate (GMP). Since umami has its own receptors rather than arising out of a combination of the traditionally recognized taste receptors, scientists now consider umami to be a distinct taste. Foods that have a strong umami flavor include meats, shellfish, fish (including fish sauce and preserved fish such as Maldives fish, Katsuobushi, sardines, and anchovies), tomatoes, mushrooms, hydrolyzed vegetable protein, meat extract, yeast extract, cheeses, and soy sauce. Etymology A loanword from Japanese, umami can be translated as "pleasant savory taste". This neologism was coined in 1908 by Japanese chemist Kikunae Ikeda from a nominalization of umai, "delicious". The compound umami (with mi, "taste") is used for a more general sense of a food as delicious. There is no current English equivalent of umami; however, some close descriptions are "meaty", "savory", and "broth-like". Background Scientists have debated whether umami was a basic taste since Kikunae Ikeda first proposed its existence in 1908. In 1985, the term umami was recognized as the scientific term to describe the taste of glutamates and nucleotides at the first Umami International Symposium in Hawaii. Umami represents the taste of the amino acid L-glutamate and 5'-ribonucleotides such as guanosine monophosphate (GMP) and inosine monophosphate (IMP). It can be described as a pleasant "brothy" or "meaty" taste with a long-lasting, mouthwatering and coating sensation over the tongue. The sensation o
https://en.wikipedia.org/wiki/Journal%20of%20Aging%20Studies
The Journal of Aging Studies is a quarterly peer-reviewed scientific journal focused on the study of aging and related topics. Its disciplinary scope is broad, including the social sciences, behavioral sciences and the humanities. It was established in 1987 by Jaber F. Gubrium (University of Missouri), and is published by Elsevier. The editor-in-chief is Renée Lynn Beard (College of the Holy Cross). According to the Journal Citation Reports, the journal has a 2019 impact factor of 2.078. The Journal of Aging Studies features scholarly articles offering theoretically engaged interpretations that challenge existing theory and empirical work. Articles need not deal with the field of aging as a whole, but may address any defensibly relevant topic pertinent to the aging experience and related to the broad concerns and subject matter of the social and behavioral sciences and the humanities. The journal emphasizes innovation and critique - new directions in general - regardless of theoretical or methodological orientation or academic discipline (https://www.sciencedirect.com/journal/journal-of-aging-studies).
https://en.wikipedia.org/wiki/Lenin%20was%20a%20mushroom
Lenin was a mushroom was a highly influential televised hoax by Soviet musician Sergey Kuryokhin and reporter Sergey Sholokhov. It was first broadcast on 17 May 1991 on Leningrad Television. Hoax The hoax took the form of an interview on the television program Pyatoe Koleso (The Fifth Wheel). In the interview, Kuryokhin, impersonating a historian, narrated his findings that Vladimir Lenin consumed large quantities of psychedelic mushrooms and eventually became a mushroom and a radio wave. Kuryokhin arrived at his conclusion through a long series of logical fallacies and appeals to the authority of various "sources" (such as Carlos Castaneda, the Massachusetts Institute of Technology, and Konstantin Tsiolkovsky), creating the illusion of a reasoned and plausible logical chain. The timing of the hoax played a large role in its success, coming as it did during the glasnost period when the ebbing of censorship in the Soviet Union led to many revelations about the country's history, often presented in sensational form. Furthermore, Soviet television had, up to that point, been regarded by its audience as conservative in style and content. As a result, a large number of Soviet citizens (one estimate puts the number at 11.3 million audience members) took the deadpan "interview" at face value, in spite of the absurd claims presented. Sholokhov has said that perhaps the most notable result of the show was an appeal by a group of party members to the Leningrad Regional Committee of the CPSU to clarify the veracity of Kuryokhin's claim. According to Sholokhov, in response to the request one of the top regional functionaries stated that "Lenin could not have been a mushroom" because "a mammal cannot be a plant." Modern taxonomy classifies mushrooms as fungi, a separate kingdom from plants. See also The Sacred Mushroom and the Cross
https://en.wikipedia.org/wiki/Thermal%20expansion
Thermal expansion is the tendency of matter to change its shape, area, volume, and density in response to a change in temperature, usually not including phase transitions. Temperature is a monotonic function of the average molecular kinetic energy of a substance. When a substance is heated, molecules begin to vibrate and move more, usually creating more distance between themselves. Substances which contract with increasing temperature are unusual, and only occur within limited temperature ranges (see examples below). The relative expansion (also called strain) divided by the change in temperature is called the material's coefficient of linear thermal expansion and generally varies with temperature. As energy in particles increases, they start moving faster and faster, weakening the intermolecular forces between them and therefore expanding the substance. Overview Predicting expansion If an equation of state is available, it can be used to predict the values of the thermal expansion at all the required temperatures and pressures, along with many other state functions. Contraction effects (negative thermal expansion) A number of materials contract on heating within certain temperature ranges; this is usually called negative thermal expansion, rather than "thermal contraction". For example, the coefficient of thermal expansion of water drops to zero as it is cooled to 3.983 °C and then becomes negative below this temperature; this means that water has a maximum density at this temperature, and this leads to bodies of water maintaining this temperature at their lower depths during extended periods of sub-zero weather. Other materials are also known to exhibit negative thermal expansion. Fairly pure silicon has a negative coefficient of thermal expansion for temperatures between about 18 and 120 kelvin. ALLVAR Alloy 30, a titanium alloy, exhibits anisotropic negative thermal expansion across a wide range of temperatures. Factors affecting thermal expansion Unlike g
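The coefficient of linear thermal expansion enters the familiar first-order estimate ΔL ≈ α·L₀·ΔT. A small Python sketch with an illustrative textbook value for steel (α around 12×10⁻⁶ per kelvin; the actual coefficient varies with alloy and temperature):

    # First-order linear thermal expansion: dL = alpha * L0 * dT.
    # alpha for steel is taken as ~12e-6 /K, an illustrative textbook value.
    def linear_expansion(alpha_per_K: float, length_m: float, dT_K: float) -> float:
        return alpha_per_K * length_m * dT_K

    dL = linear_expansion(12e-6, 1.0, 50.0)
    print(f"A 1 m steel rod heated by 50 K lengthens by {dL * 1000:.2f} mm")  # 0.60 mm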
https://en.wikipedia.org/wiki/Statistical%20model%20validation
In statistics, model validation is the task of evaluating whether a chosen statistical model is appropriate or not. Oftentimes in statistical inference, inferences from models that appear to fit their data may be flukes, resulting in a misunderstanding by researchers of the actual relevance of their model. To combat this, model validation is used to test whether a statistical model can hold up to permutations in the data. This topic is not to be confused with the closely related task of model selection, the process of discriminating between multiple candidate models: model validation does not concern so much the conceptual design of models as it tests only the consistency between a chosen model and its stated outputs. There are many ways to validate a model. Residual plots plot the difference between the actual data and the model's predictions: correlations in the residual plots may indicate a flaw in the model. Cross validation is a method of model validation that iteratively refits the model, each time leaving out just a small sample and comparing whether the samples left out are predicted by the model: there are many kinds of cross validation. Predictive simulation is used to compare simulated data to actual data. External validation involves fitting the model to new data. Akaike information criterion estimates the quality of a model. Overview Model validation comes in many forms and the specific method of model validation a researcher uses is often a constraint of their research design. To emphasize, what this means is that there is no one-size-fits-all method to validating a model. For example, if a researcher is operating with a very limited set of data, but data they have strong prior assumptions about, they may consider validating the fit of their model by using a Bayesian framework and testing the fit of their model using various prior distributions. However, if a researcher has a lot of data and is testing multiple nested models, these conditions may len
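A minimal sketch of the cross-validation idea described above: repeatedly refit with part of the data held out, then score the model on the held-out part. This uses pure Python on toy data, with a trivial mean predictor standing in for the statistical model being validated.

    # Minimal k-fold cross-validation: hold out each fold in turn, fit on
    # the rest, and score predictions on the held-out fold. A mean
    # predictor stands in for the model under validation.
    def k_fold_cv(y, k=3):
        folds = [y[i::k] for i in range(k)]        # simple interleaved folds
        errors = []
        for i in range(k):
            train = [v for j, f in enumerate(folds) if j != i for v in f]
            fit = sum(train) / len(train)          # "fit" the mean model
            errors += [(v - fit) ** 2 for v in folds[i]]
        return sum(errors) / len(errors)           # mean squared CV error

    data = [2.1, 1.9, 2.4, 2.0, 2.2, 1.8]
    print(k_fold_cv(data, k=3))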
https://en.wikipedia.org/wiki/Patrick%27s%20test
Patrick's test or FABER test (Flexion, ABduction and External Rotation) is performed to evaluate pathology of the hip joint or the sacroiliac joint. The test is performed by having the tested leg flexed and the thigh abducted and externally rotated. If pain is elicited on the ipsilateral side anteriorly, it is suggestive of a hip joint disorder on the same side. If pain is elicited on the contralateral side posteriorly around the sacroiliac joint, it is suggestive of pain mediated by dysfunction in that joint. History Patrick's test is named after the American neurologist Hugh Talbot Patrick. See also Gaenslen's test Physical medicine and rehabilitation
https://en.wikipedia.org/wiki/Forensic%20mycology
Forensic mycology is the use of mycology in criminal investigations. Mycology is used in estimating times of death or events by using known growth rates of fungi, in providing trace evidence, and in locating corpses. It also includes tracking mold growth in buildings, the use of fungi in biological warfare, and the use of psychotropic and toxic fungus varieties as illicit drugs or causes of death. Post Mortem Interval The constant growth rate of fungi is used to determine the post-mortem interval and help investigators pinpoint time of death. Traditionally, medical examiners will rely on body cooling, level of decomposition, and/or insect succession. Fungi have been noted to be present on dead bodies but, until recently, were thought to be little more than another organism aiding in decomposition. There is no limit to which species of fungi or which parts of the body can be used in this process, as long as conditions at the scene can be experimentally recreated. H. van de Voorde and P. J. Van Dijck of the Catholic University of Leuven made the first noted use of recreating fungal growth to determine post-mortem interval in 1980. In this case, a woman living alone was found dead in a temperature-controlled house with stab wounds in her chest and fungal growth on her face and lower abdomen. The body had already cooled to 12 °C, the ambient temperature, and showed no signs of insect colonization, which made accurate post-mortem interval determination difficult. van de Voorde and Van Dijck recorded the size of the fungal growth on the eye and obtained a sample. This sample was incubated in conditions similar to those of the corpse, and the time needed to grow the colony to the size found on the body was used to determine the post-mortem interval and, subsequently, the time of death, which was confirmed by the confession of the murderer. In addition to size, distinct phases of fungal growth can be used to aid in post-mortem interval determination. Chronologically, these include the formation of substrate m
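The recreate-and-compare reasoning of the Leuven case reduces to a division once a growth rate has been measured under matched conditions. The sketch below assumes roughly linear radial growth, a simplification that real casework would have to verify, and uses made-up numbers.

    # Toy post-mortem-interval estimate from fungal colony size.
    # Assumes approximately linear radial growth under matched conditions,
    # a simplification of the recreate-and-compare method described above.
    def estimate_pmi_hours(colony_radius_mm: float, lab_rate_mm_per_hour: float) -> float:
        return colony_radius_mm / lab_rate_mm_per_hour

    # Illustrative numbers only:
    print(estimate_pmi_hours(colony_radius_mm=12.0, lab_rate_mm_per_hour=0.25))  # 48.0 hours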
https://en.wikipedia.org/wiki/ASIC5
The ASIC5 gene is one of the five paralogous genes that encode proteins forming trimeric acid-sensing ion channels (ASICs) in mammals. Aliases previously used for this gene include ACCN5 and BASIC. The protein encoded by this gene does not appear to be acid-responsive. The cDNA coding for this protein was first characterized in 2000. The ASIC genes have splicing variants that encode different proteins, called isoforms. These genes are mainly expressed in the central and peripheral nervous system. ASICs can form both homotrimeric (meaning composed of three identical subunits) and heterotrimeric channels. Structure and function This gene encodes a member of the ASIC/ENaC superfamily of proteins. The members of this family are amiloride-sensitive sodium channels that contain intracellular N and C termini, 2 hydrophobic transmembrane (TM) regions, and a large extracellular loop, which has many cysteine residues with conserved spacing. The TM regions are generally symbolized as TM1 (close to the N-terminus) and TM2 (close to the C-terminus). The pore of the channel, through which ions selectively flow from the extracellular side into the cytoplasm, is formed by the three TM2 regions of the trimer.
https://en.wikipedia.org/wiki/Munk%27s%20Roll
The Roll of the Royal College of Physicians of London, commonly referred to as Munk's Roll, is a series of published works containing biographical entries of the fellows of the Royal College of Physicians. It was published in print in eleven volumes (1861 to 2004) with a twelfth online (2005 to present). The series is now titled Inspiring Physicians (from 2020). The series has been informally known as Munk's Roll, after the original compiler, for over a century. However, the formal name for the series of volumes (1-11) in print is Lives of the Fellows of the Royal College of Physicians of London. History Munk's Roll was initially the work of the College's Harveian Librarian, William Munk. The first published edition (1861) was originally prepared as a manuscript in three large volumes, containing biographical information on all physicians who were connected with the College, with no thought to publication. Each volume of the manuscript was presented to the College library upon its completion. The first volume covered the foundation of the College to 1600 (completed March 1855), the second covered 1601 to 1700 (completed December 1855), and the third volume 1701-1800 (completed in June 1856). The Council of the College, at the instigation of some influential Fellows, expressed the view in December 1860 that the work be published, and it was, in 1861. The first edition consisted of volumes one and two, containing fellows and licentiates from 1518 to 1800. In 1878, volume three was included in the second edition, overseen by Munk, with biographies to 1825. The work was set out in chronological order with an index. Subsequent editions, for biographies post 1825, were limited to fellows, which reflects the increasing numbers, qualifications and professionalism of physicians through the 19th century. They were titled Lives of the Fellows of the Royal College of Physicians of London. Volumes Volume 1, 1518-1700 Volume 2, 1701-1800 Volume 3, 1801-1825 Volume 4, 1826
https://en.wikipedia.org/wiki/Digital%20video%20recorder
A digital video recorder (DVR) is an electronic device that records video in a digital format to a disk drive, USB flash drive, SD memory card, SSD or other local or networked mass storage device. The term includes set-top boxes with direct-to-disk recording, portable media players and TV gateways with recording capability, and digital camcorders. Personal computers are often connected to video capture devices and used as DVRs; in such cases the application software used to record video is an integral part of the DVR. Many DVRs are classified as consumer electronic devices; such devices may alternatively be referred to as personal video recorders (PVRs), particularly in Canada. Similar small devices with built-in (~5 inch diagonal) displays and SSD support may be used for professional film or video production, as these recorders often do not have the limitations that built-in recorders in cameras have, offering wider codec support, the removal of recording time limitations and higher bitrates. History Hard-disk-based digital video recorders The first working DVR prototype was developed in 1998 at the Stanford University Computer Science department. The DVR design was a chapter of Edward Y. Chang's PhD dissertation, supervised by Professors Hector Garcia-Molina and Jennifer Widom. Two design papers were published, at the 2017 VLDB conference and the 1999 ICDE conference. The prototype was developed in 1998 in Pat Hanrahan's CS488 class, Experiments in Digital Television, and was demoed to industrial partners including Sony, Intel, and Apple. Consumer digital video recorders ReplayTV and TiVo were launched at the 1999 Consumer Electronics Show in Las Vegas, Nevada. Microsoft also demonstrated a unit with DVR capability, but full DVR features did not become available until the end of 1999, in Dish Network's DISHplayer receivers. TiVo shipped their first units on March 31, 1999. ReplayTV won the "Best of Show" award in the video category with Netscape co-fou
https://en.wikipedia.org/wiki/Electret
An electret (formed as a portmanteau of electr- from "electricity" and -et from "magnet") is a dielectric material that has a quasi-permanent electric charge or dipole polarisation. An electret generates internal and external electric fields, and is the electrostatic equivalent of a permanent magnet. Although Oliver Heaviside coined this term in 1885, materials with electret properties were already known to science and had been studied since the early 1700s. One particular example is the electrophorus, a device consisting of a slab with electret properties and a separate metal plate. The electrophorus was originally invented by Johan Carl Wilcke in Sweden and again by Alessandro Volta in Italy. The name draws an analogy to the formation of a magnet by the alignment of magnetic domains in a piece of iron. Historically, electrets were made by first melting a suitable dielectric material, such as a polymer or wax containing polar molecules, and then allowing it to re-solidify in a powerful electrostatic field. The polar molecules of the dielectric align themselves to the direction of the electrostatic field, producing a dipole electret with a permanent electrostatic bias. Modern electrets are usually made by embedding excess charges into a highly insulating dielectric, e.g. by means of an electron beam, corona discharge, injection from an electron gun, electric breakdown across a gap, or a dielectric barrier. Similarity to magnets Electrets, like magnets, are dipoles. Another similarity is the radiant fields: they produce an electrostatic field (as opposed to a magnetic field) around their perimeter. When a magnet and an electret are near one another, a rather unusual phenomenon occurs: while stationary, neither has any effect on the other. However, when an electret is moved with respect to a magnetic pole, a force is felt which acts perpendicular to the magnetic field, pushing the electret along a path 90 degrees to the expecte
https://en.wikipedia.org/wiki/Bremen%20Institute%20for%20Applied%20Beam%20Technology
The Bremen Institute for Applied Beam Technology (German: Bremer Institut für angewandte Strahltechnik) is a private-sector research institute located in the city of Bremen, Germany. It was founded on July 1, 1977 as a laser institute independent of any university, the first of its kind at the time. The institute was founded by Werner Jüptner and Gerd Sepold. Areas of Research The institute focuses its research on materials processing, opto-electronic systems, and optical metrology, and its departments are categorised accordingly: materials and modelling; optical metrology and testing; welding and surface technology; industrial applications. Location The institute is located on the University of Bremen's campus. It can be reached via Straßenbahn (tram) line 6, alighting at the stop 'Klagenfurter Straße'; the tram takes about 15 minutes to the city centre and 35 minutes to the airport.
https://en.wikipedia.org/wiki/230%20%28number%29
230 (two hundred [and] thirty) is the natural number following 229 and preceding 231. Additionally, 230 is: a composite number, with its divisors being 2, 5, 10, 23, 46, and 115; a sphenic number, because it is the product of 3 distinct primes (2 × 5 × 23), and the first sphenic number to immediately precede another sphenic number; palindromic and a repdigit in bases 22 (AA₂₂), 45 (55₄₅), 114 (22₁₁₄) and 229 (11₂₂₉); a Harshad number in bases 2, 6, 10, 12, 23 (and 16 other bases); a happy number; a nontotient, since there is no integer with exactly 230 coprimes below it; the sum of the coprime counts (Euler's totient) for the first 27 integers; the aliquot sum of both 454 and 52441; part of the 41-aliquot tree; the maximal number of pieces that can be obtained by cutting an annulus with 20 cuts. The aliquot sequence starting at 224 is: 224, 280, 440, 640, 890, 730, 602, 454, 230, 202, 104, 106, 56, 64, 63, 41, 1, 0. There are 230 unique space groups describing all possible crystal symmetries. Integers between 231 and 239 231 232 233 234 235 236 237 238 239
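Several of these properties are mechanical enough to check by brute force. The short script below verifies the factorization, the sphenic and nontotient claims, and the totient-sum identity; it is an illustrative check, not part of the article's sources.

```python
from sympy import factorint, totient

n = 230
factors = factorint(n)                      # {2: 1, 5: 1, 23: 1}
print(factors)

# Sphenic: a product of exactly three distinct primes, each to the first power.
assert len(factors) == 3 and all(e == 1 for e in factors.values())

# Nontotient: phi(m) >= sqrt(m/2), so phi(m) == 230 would force
# m <= 2*230**2; brute force over that range (takes a few seconds).
assert all(totient(m) != n for m in range(1, 2 * n * n + 1))

# 230 is the sum of Euler's totient over the first 27 integers.
assert sum(totient(k) for k in range(1, 28)) == n
```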
https://en.wikipedia.org/wiki/Karlqvist%20gap
The Karlqvist gap or Karlqvist field is an electromagnetic phenomenon discovered in 1953 by the Swedish engineer Olle Karlqvist (1922-1976), which is important in magnetic storage for computers. Karlqvist discovered the phenomenon while designing a ferromagnetic surface layer for the magnetic drum memory of the BESK computer. When designing a magnetic memory store, the ferromagnetic layer must be studied to determine the variation of the magnetic field with permeability, air gap, layer thickness and other influencing factors. The problem is non-linear and extremely difficult to solve. Karlqvist's discovery shows that the non-linear problem can be approximated by a linear boundary-value problem for the two-dimensional static field and the one-dimensional transient field. This linear calculation gives a first approximation. Karlqvist published his discovery in the 1954 paper "Calculation of the magnetic field in ferromagnetic layer of a magnetic drum" at the KTH Royal Institute of Technology in Stockholm. See also Carousel memory, 2560-kilobyte storage unit first sold in 1958
https://en.wikipedia.org/wiki/Boyle%20temperature
The Boyle temperature is formally defined as the temperature for which the second virial coefficient, $B_2(T)$, becomes zero; it is at this temperature that the attractive forces and the repulsive forces acting on the gas particles balance out. The virial equation of state, $pV_m = RT\left(1 + \frac{B_2(T)}{V_m} + \frac{B_3(T)}{V_m^2} + \cdots\right)$, describes a real gas. Since higher-order virial coefficients are generally much smaller than the second coefficient, the gas tends to behave as an ideal gas over a wider range of pressures when the temperature reaches the Boyle temperature. In any case, when pressures are low, the second virial coefficient will be the only relevant one, because the remaining ones concern terms of higher order in the pressure. Also, at the Boyle temperature the dip in a pV diagram tends to a straight line over a range of pressures; we then have $\left(\frac{\partial Z}{\partial p}\right)_T \to 0$ as $p \to 0$, where $Z = \frac{pV_m}{RT}$ is the compressibility factor. Expanding the van der Waals equation in $1/V_m$ one finds that $B_2(T) = b - \frac{a}{RT}$, so the Boyle temperature of a van der Waals gas is $T_B = \frac{a}{Rb}$. See also Virial equation of state Temperature Thermodynamics
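For a van der Waals gas the result T_B = a/(Rb) makes the Boyle temperature a one-line computation. The snippet below evaluates it for nitrogen using commonly tabulated van der Waals constants (the specific values are illustrative); the van der Waals estimate overshoots the measured Boyle temperature of N₂ (roughly 327 K), since the model is only approximate.

```python
R = 8.314  # J/(mol*K), universal gas constant

def boyle_temperature_vdw(a: float, b: float) -> float:
    """Boyle temperature of a van der Waals gas, from B2(T) = b - a/(R*T) = 0."""
    return a / (R * b)

# Van der Waals constants for N2 (illustrative tabulated values, SI units).
a_n2 = 0.1370      # Pa*m^6/mol^2
b_n2 = 3.87e-5     # m^3/mol
print(f"T_B(N2) ~ {boyle_temperature_vdw(a_n2, b_n2):.0f} K")  # ~426 K
```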
https://en.wikipedia.org/wiki/DLGAP1
Disks large-associated protein 1 (DAP-1), also known as guanylate kinase-associated protein (GKAP), is a protein that in humans is encoded by the DLGAP1 gene. DAP-1 is known to be highly enriched in synaptosomal preparations of the brain, and is present in the post-synaptic density. Function This gene encodes the protein called guanylate kinase-associated protein (GKAP). GKAP binds to the SHANK2 and PSD-95 proteins, facilitating the assembly of the post-synaptic density of neurons. The DLGAP1 protein has five 14-amino-acid repeats and three proline-rich regions. Interactions DLGAP1 has been shown to interact with: DLG1 DLG4 DYNLL1 DYNLL2 SHANK2 The interaction with PSD-95 and S-SCAM is mediated by the GUK domain, and it has been hypothesized that this might mean it can also interact with other GUK-domain-containing proteins.
https://en.wikipedia.org/wiki/Wiener%27s%20lemma
In mathematics, Wiener's lemma is a well-known identity which relates the asymptotic behaviour of the Fourier coefficients of a Borel measure on the circle to its atomic part. This result admits an analogous statement for measures on the real line. It was first discovered by Norbert Wiener. Statement Given a real or complex Borel measure $\mu$ on the unit circle $\mathbb{T}$, let $\mu_a = \sum_j c_j\,\delta_{x_j}$ be its atomic part (meaning that $\mu(\{x_j\}) = c_j \neq 0$ and $\mu(\{x\}) = 0$ for $x \notin \{x_j\}$). Then $\lim_{N\to\infty}\frac{1}{2N+1}\sum_{n=-N}^{N}|\widehat{\mu}(n)|^2 = \sum_j |c_j|^2$, where $\widehat{\mu}(n) = \int_{\mathbb{T}} e^{-inx}\,d\mu(x)$ is the $n$-th Fourier coefficient of $\mu$. Similarly, given a real or complex Borel measure $\mu$ on the real line $\mathbb{R}$ and $\mu_a = \sum_j c_j\,\delta_{x_j}$ its atomic part, we have $\lim_{R\to\infty}\frac{1}{2R}\int_{-R}^{R}|\widehat{\mu}(\xi)|^2\,d\xi = \sum_j |c_j|^2$, where $\widehat{\mu}(\xi) = \int_{\mathbb{R}} e^{-i\xi x}\,d\mu(x)$ is the Fourier transform of $\mu$. Proof First of all, we observe that if $\nu$ is a complex measure on the circle then $\frac{1}{2N+1}\sum_{n=-N}^{N}\widehat{\nu}(n) = \int_{\mathbb{T}} f_N(x)\,d\nu(x)$ with $f_N(x) := \frac{1}{2N+1}\sum_{n=-N}^{N} e^{-inx}$. The function $f_N$ is bounded by $1$ in absolute value and has $f_N(0) = 1$, while $f_N(x) = \frac{1}{2N+1}\cdot\frac{\sin\left((N+\frac{1}{2})x\right)}{\sin(x/2)}$ for $x \neq 0$, which converges to $0$ as $N \to \infty$. Hence, by the dominated convergence theorem, $\lim_{N\to\infty}\frac{1}{2N+1}\sum_{n=-N}^{N}\widehat{\nu}(n) = \nu(\{0\})$. We now take $\mu'$ to be the pushforward of $\overline{\mu}$ under the inverse map on $\mathbb{T}$, namely $\mu'(B) := \overline{\mu(-B)}$ for any Borel set $B$. This complex measure has Fourier coefficients $\widehat{\mu'}(n) = \overline{\widehat{\mu}(n)}$. We are going to apply the above to the convolution between $\mu$ and $\mu'$, namely we choose $\nu = \mu * \mu'$, meaning that $\nu$ is the pushforward of the measure $\mu \times \mu'$ (on $\mathbb{T} \times \mathbb{T}$) under the product map $(x,y) \mapsto x+y$. By Fubini's theorem $\widehat{\nu}(n) = \widehat{\mu}(n)\,\widehat{\mu'}(n) = |\widehat{\mu}(n)|^2$. So, by the identity derived earlier, $\lim_{N\to\infty}\frac{1}{2N+1}\sum_{n=-N}^{N}|\widehat{\mu}(n)|^2 = \nu(\{0\})$. By Fubini's theorem again, the right-hand side equals $\nu(\{0\}) = \int_{\mathbb{T}}\mu'(\{-x\})\,d\mu(x) = \int_{\mathbb{T}}\overline{\mu(\{x\})}\,d\mu(x) = \sum_j |c_j|^2$. The proof of the analogous statement for the real line is identical, except that we use the identity $\frac{1}{2R}\int_{-R}^{R}\widehat{\nu}(\xi)\,d\xi = \int_{\mathbb{R}} g_R(x)\,d\nu(x)$ (which follows from Fubini's theorem), where $g_R(x) := \frac{\sin(Rx)}{Rx}$ (with $g_R(0) := 1$). We observe that $|g_R| \le 1$, $g_R(0) = 1$, and $g_R(x) \to 0$ as $R \to \infty$ for $x \neq 0$. So, by dominated convergence, we have the analogous identity $\lim_{R\to\infty}\frac{1}{2R}\int_{-R}^{R}\widehat{\nu}(\xi)\,d\xi = \nu(\{0\})$. Consequences A real or complex Borel measure $\mu$ on the circle is diffuse (i.e. $\mu_a = 0$) if and only if $\lim_{N\to\infty}\frac{1}{2N+1}\sum_{n=-N}^{N}|\widehat{\mu}(n)|^2 = 0$. A probability measure $\mu$ on the circle is a Dirac mass if and only if $\lim_{N\to\infty}\frac{1}{2N+1}\sum_{n=-N}^{N}|\widehat{\mu}(n)|^2 = 1$. (Here, the nontrivial implication follows from the fact that the weights $c_j$ are positive and satisfy $\sum_j c_j \le 1$, which forces $1 = \sum_j c_j^2 \le \left(\sum_j c_j\right)^2 \le 1$ and thus equality throughout, so that there must be a single atom with mass $1$.)
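The circle version is easy to test numerically: for a purely atomic measure, the Cesàro averages of |μ̂(n)|² should converge to the sum of the squared atom masses. The script below does this for a hypothetical two-atom measure; the atom positions and weights are arbitrary choices for illustration.

```python
import numpy as np

# Atoms of a measure on the circle: mu = 0.3*delta_{1.0} + 0.7*delta_{2.5}
weights = np.array([0.3, 0.7])
positions = np.array([1.0, 2.5])

def fourier_coeff(n: int) -> complex:
    """mu-hat(n) = integral of e^{-inx} d mu(x), a finite sum for atomic mu."""
    return np.sum(weights * np.exp(-1j * n * positions))

for N in (10, 100, 1000, 10000):
    ns = np.arange(-N, N + 1)
    avg = np.mean([abs(fourier_coeff(n)) ** 2 for n in ns])
    print(N, avg)                              # tends to 0.58

print("sum of squared masses:", np.sum(weights ** 2))  # 0.09 + 0.49 = 0.58
```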
https://en.wikipedia.org/wiki/Nannochloropsis
Nannochloropsis is a genus of algae comprising six known species. The genus in the current taxonomic classification was first termed by Hibberd (1981). The species have mostly been known from the marine environment but also occur in fresh and brackish water. All of the species are small, nonmotile spheres which do not express any distinct morphological features that can be distinguished by either light or electron microscopy. The characterisation is mostly done by rbcL gene and 18S rRNA sequence analysis. The algae of the genus Nannochloropsis differ from other related microalgae in that they have chlorophyll a and completely lack chlorophyll b and chlorophyll c. In addition, they are able to build up high concentrations of a range of pigments such as astaxanthin, zeaxanthin and canthaxanthin. They have a diameter of about 2 to 3 micrometers and a very simple ultrastructure, with reduced structural elements compared to neighbouring taxa. Nannochloropsis is considered a promising alga for industrial applications because of its ability to accumulate high levels of polyunsaturated fatty acids. Moreover, it shows features that allow genetic manipulation aimed at the improvement of the current oleaginous strains. Various species of Nannochloropsis are indeed transfectable, and there is evidence that some strains are able to perform homologous recombination. At the moment it is mainly used as an energy-rich food source for fish larvae and rotifers. Nevertheless, it has also raised growing interest for the investigation of biofuel production from photosynthetic organisms (see Nannochloropsis and biofuels). Nannochloropsis is currently in use as a food additive for human nutrition, and it is also served at the restaurant "A Poniente" in El Puerto de Santa María (Cádiz, Spain), close to the natural environment where Nannochloropsis gaditana was first isolated and still grows. A 2020 study suggests it could be used for a highly performant, sustainable fi
https://en.wikipedia.org/wiki/Ocimum%20campechianum
Ocimum campechianum is a plant species in the family Lamiaceae, widespread across Mexico, Central America, South America, the West Indies, and Florida. Leaves of Ocimum campechianum are eaten in Brazil's Amazon jungle. Similar to basil, it has a pungent flavor and contains essential oils which have been used ethnomedicinally. In Amazonia, the aromatic leaves are used as an admixture in ayahuasca brews. The plant is known as albahaca in Mexico and called xkakaltun in Mexico's Yucatan Peninsula, where it is considered a honey plant and is used as an abortifacient. It has also been referred to as albahaca del monte, Amazonian basil, wild sweet basil, wild mosquito plant, least basil, Peruvian basil, spice basil, alfavaca-do-campo, manjericão and estoraque. Essential oil Essential oil from O. campechianum has been tested for its in vitro food-related biological activities and found comparable to the essential oils of common basil and thyme, and superior in its capacity as an antioxidant. It has also been found to possess antifungal activity against food-spoiling yeasts. The leaves have the highest concentration of essential oil (4.3%). Multiple chemotypes exist within the species and can be distinguished by analyzing the essential oil by gas chromatography (GC) and/or GC isotope ratio mass spectrometry.
https://en.wikipedia.org/wiki/Complementary%20code%20keying
Complementary code keying (CCK) is a modulation scheme used with wireless networks (WLANs) that employ the IEEE 802.11b specification. In 1999, CCK was adopted to supplement the Barker code in wireless digital networks, to achieve data rates higher than 2 Mbit/s at the expense of shorter distance. The shorter chipping sequence in CCK (8 chips versus 11 in the Barker code) means less spreading and thus a higher data rate, but also greater susceptibility to narrowband interference, resulting in shorter radio transmission range. Besides its shorter chipping sequence, CCK also has more chipping sequences to encode more bits (4 chipping sequences at 5.5 Mbit/s and 64 chipping sequences at 11 Mbit/s), increasing the data rate even further; the Barker code, by contrast, has only a single chipping sequence. The complementary codes first discussed by Golay were pairs of binary complementary codes; he noted that when the elements of a code of length N are either −1 or 1, it follows immediately from their definition that the sum of their respective autocorrelation sequences is zero at all points except for the zero shift, where it is equal to K×N (K being the number of code words in the set). CCK is a variation and improvement on M-ary Orthogonal Keying and uses 'polyphase complementary codes'. They were developed by Lucent Technologies and Harris Semiconductor and were adopted by the 802.11 working group in 1998. CCK is the form of modulation used when 802.11b operates at either 5.5 or 11 Mbit/s. CCK was selected over competing modulation techniques as it used approximately the same bandwidth and could use the same preamble and header as pre-existing 1 and 2 Mbit/s wireless networks, thus facilitating interoperability. Polyphase complementary codes, first proposed by Sivaswamy, 1978, are codes where each element is a complex number of unit magnitude and arbitrary phase, or more specifically, for 802.11b, is one of [1, −1, j, −j]. Networks using the 802.11g specification e
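Golay's observation — that the two autocorrelation sequences of a complementary pair sum to zero at every nonzero shift, and to K×N at zero shift — can be checked in a few lines. The sketch below builds a binary pair of length 8 (the same chip length CCK uses, though CCK itself employs polyphase, complex-valued codes) with the standard doubling construction; it illustrates the property and is not the actual 802.11b code set.

```python
import numpy as np

def golay_pair(m: int):
    """Recursively build a binary Golay complementary pair of length 2**m:
    if (a, b) is complementary, so is (a|b, a|-b)."""
    a, b = np.array([1]), np.array([1])
    for _ in range(m):
        a, b = np.concatenate([a, b]), np.concatenate([a, -b])
    return a, b

def acorr(x):
    """Aperiodic autocorrelation of x at shifts 0..len(x)-1."""
    n = len(x)
    return np.array([np.sum(x[:n - s] * x[s:]) for s in range(n)])

a, b = golay_pair(3)          # length-8 pair
total = acorr(a) + acorr(b)
print(a, b)
print(total)                  # [16, 0, 0, 0, 0, 0, 0, 0]: K*N = 2*8 at shift 0
```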
https://en.wikipedia.org/wiki/Lavasoft
Adaware, formerly known as Lavasoft, is a software development company that produces spyware- and malware-detection software, including Adaware. It operates as a subsidiary of Avanquest, a division of Claranova. The company offers the products Adaware Antivirus, Adaware Protect, Adaware Safe Browser, Adaware Privacy, Adaware AdBlock, Adaware PC Cleaner and Adaware Driver Manager. Adaware's headquarters are in Montreal, Canada, having previously been located in Gothenburg, Sweden, since 2002. Nicolas Stark and Ann-Christine Åkerlund established the company in Germany in 1999 with its flagship Adaware antivirus product. In 2011, Adaware was acquired by the Solaria Fund, a private equity fund front for entrepreneurs Daniel Assouline and Michael Dadoun, who have been accused of selling software that is available for free, including Adaware antivirus before they acquired the company itself. Adaware antivirus Adaware Antivirus is an anti-spyware and anti-virus program that, according to its developer, detects and removes malware, spyware and adware, computer viruses, dialers, Trojans, bots, rootkits, data miners, parasites, browser hijackers and tracking components. Adaware Web Companion, a component of the Adaware antivirus, is frequently packaged alongside potentially unwanted programs. Adaware accomplishes this by striking deals with malware operators and site owners to distribute its software in exchange for money. Adaware Web Companion is known to collect user data and send it back to remote servers. History Adaware antivirus was originally developed, as Ad-Aware, in 1999 to highlight web beacons inside Internet Explorer. On many websites, users would see a tiny pixelated square next to each web beacon, warning the user that the computer's IP address and other non-essential information was being tracked by this website. Over time, Ad-Aware added the ability to block those beacons, or ads. In the 2008 Edition, Lavasoft bundled Ad-Aware Pro and Plus for
https://en.wikipedia.org/wiki/Dynamic%20testing
Dynamic testing (or dynamic analysis) is a term used in software engineering to describe the testing of the dynamic behavior of code. That is, dynamic analysis refers to examining the system's response to variables that are not constant and change with time. In dynamic testing the software must actually be compiled and run. It involves working with the software, giving input values and checking whether the output is as expected, by executing specific test cases, which can be done manually or with the use of an automated process. This is in contrast to static testing. Unit tests, integration tests, system tests and acceptance tests utilize dynamic testing. Usability tests involving a mock version made of paper or cardboard can be classified as static tests, taking into account that no program has been executed, or as dynamic ones, considering that the interaction between users and such a mock version is effectively the most basic form of a prototype. Main procedure Within the software development process, dynamic testing can be divided into unit testing, integration testing, system testing, acceptance testing and, finally, regression testing. Unit testing focuses on the correctness of the basic components of the software, and falls into the category of white-box testing. In the overall quality inspection system, unit testing needs to be completed by the product group, after which the software is handed over to the testing department. Integration testing is used to detect whether the interfaces between the various units are properly connected during the integration of the entire software. Testing a software system that has completed integration is called a system test, the purpose of which is to verify that the correctness and performance of the software system meet the requirements specified in its specification. Testers should follow the established test plan. When testing the robustn
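As a concrete illustration of the dynamic-testing idea — execute the code, feed it inputs, compare actual against expected output — here is a minimal unit test in Python's standard unittest framework; the function under test is a made-up example, not from any particular project.

```python
import unittest

def parse_price(text: str) -> float:
    """Function under test: convert a price string like '$1,299.50' to a float."""
    return float(text.replace("$", "").replace(",", ""))

class TestParsePrice(unittest.TestCase):
    def test_plain_number(self):
        self.assertEqual(parse_price("42"), 42.0)

    def test_currency_and_thousands_separator(self):
        self.assertEqual(parse_price("$1,299.50"), 1299.50)

    def test_invalid_input_raises(self):
        # Dynamic testing also checks behaviour on bad input.
        with self.assertRaises(ValueError):
            parse_price("not a price")

if __name__ == "__main__":
    unittest.main()
```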
https://en.wikipedia.org/wiki/The%20Human%20Embryo
The Human Embryo: Aristotle and the Arabic and European Traditions is a book examining philosophical and religious viewpoints on human reproduction through the ages, by the Reverend Canon G. R. Dunstan, published by University of Exeter Press in 1990. It specialises in the study of the human embryo, both historically and from different cultural viewpoints. The largest section is devoted to the understanding of the embryo in the Middle Ages, with seven articles alone reinterpreting Dante's passages on the animation of the embryo.
https://en.wikipedia.org/wiki/Configuration%20space%20%28mathematics%29
In mathematics, a configuration space is a construction closely related to state spaces or phase spaces in physics. In physics, these are used to describe the state of a whole system as a single point in a high-dimensional space. In mathematics, they are used to describe assignments of a collection of points to positions in a topological space. More specifically, configuration spaces in mathematics are particular examples of configuration spaces in physics in the particular case of several non-colliding particles. Definition For a topological space $X$ and a positive integer $n$, let $X^n$ be the Cartesian product of $n$ copies of $X$, equipped with the product topology. The $n$th (ordered) configuration space of $X$ is the set of $n$-tuples of pairwise distinct points in $X$: $\operatorname{Conf}_n(X) := \{(x_1,\ldots,x_n) \in X^n : x_i \neq x_j \text{ for } i \neq j\}$. This space is generally endowed with the subspace topology from the inclusion of $\operatorname{Conf}_n(X)$ into $X^n$. It is also sometimes denoted $F(X,n)$, $F_n(X)$, or $C_n(X)$. There is a natural action of the symmetric group $S_n$ on the points in $\operatorname{Conf}_n(X)$, given by $\sigma \cdot (x_1,\ldots,x_n) = (x_{\sigma(1)},\ldots,x_{\sigma(n)})$. This action gives rise to the $n$th unordered configuration space of $X$, $\operatorname{UConf}_n(X) := \operatorname{Conf}_n(X)/S_n$, which is the orbit space of that action. The intuition is that this action "forgets the names of the points". The unordered configuration space is sometimes denoted $\operatorname{UConf}_n(X)$, $B_n(X)$, or $C_n(X)$. The collection of unordered configuration spaces over all $n$ is the Ran space, and comes with a natural topology. Alternative formulations For a topological space $X$ and a finite set $S$, the configuration space of $X$ with particles labeled by $S$ is the set of injections $S \hookrightarrow X$. For $n \in \mathbb{N}$, define $\mathbf{n} := \{1,2,\ldots,n\}$. Then the $n$th configuration space of $X$ is the configuration space with particles labeled by $\mathbf{n}$, and is denoted simply $\operatorname{Conf}_n(X)$. Examples The space of ordered configurations of two points in $\mathbb{R}^2$ is homeomorphic to the product of Euclidean 3-space with a circle, i.e. $\operatorname{Conf}_2(\mathbb{R}^2) \cong \mathbb{R}^3 \times S^1$. More generally, the configuration space of two points in $\mathbb{R}^n$ is homotopy equivalent to the sphere $S^{n-1}$. The configuration space of $n$ points in $\mathbb{R}^2$ is the classifying space of the $n$th braid group (see below). Connection to braid groups The $n$-strand braid group on a connected topological space $X$ is $\pi_1(\operatorname{UConf}_n(X))$, the fundamental group
https://en.wikipedia.org/wiki/Stereoplotter
A stereoplotter uses stereo photographs to determine elevations. It has been the primary method of plotting contour lines on topographic maps since the 1930s. Although the specific devices have advanced technologically, they are all based on the apparent change in position of a feature between the two stereo photographs. Stereoplotters have changed as technology has improved. The first stereoplotters were projection stereoplotters; they used only light rays and optics to adjust the image. The Kelsh plotter is an example of a projection stereoplotter. Analog stereoplotters came next and used more sophisticated optics to view the image. The analytical stereoplotter is used today. It incorporates a computer, which does the work of mathematically aligning the images so that they line up properly. The analytical stereoplotter also allows the data to be stored and redrawn at any desired scale. Analogical The stereoplotter requires two photographs that have considerable overlap (60%) and are corrected for distortion due to the angle of the photo. The photos are put onto transparent media and projected with a light source, each image projected so as to overlap the other. The operator, using a special set of optics, then sees the image as three-dimensional due to the differing perspective of each photo. The optics of the stereoplotter are what allow the operator to plot the contours and features. The light source used to project the photo begins the process. One photo is projected using a cyan/blue filter and the other with a red filter. The operator wears a special set of glasses with the same colour filters as lenses. With the left eye seeing the left photo in blue light through the blue filter, and the right eye seeing the right photo projected in red light through the red filter, the overlapping image becomes three-dimensional. The images have control points that detail how the overlap of th
https://en.wikipedia.org/wiki/Instabus
Instabus is a decentralized open system to manage and control electrical devices within a facility. It was developed by Berker, Gira, Jung, Merten and Siemens AG, and about 200 companies of electrical supplies use this communication protocol. The European Installation Bus (EIB) allows all electrical components to be interconnected through an electrical bus. Every component is able to send commands to other components, no matter where they are. A typical EIB network is made up of electrical components such as switches, push-buttons, electric motors, electrovalves, contactors, and sensors. The bus itself is a 2×2×0.8 mm twisted-pair cable that connects all devices within the network. The theoretical maximum number of components is 57,375. The EIB system was developed to increase power savings, security, comfort and flexibility. System control Although EIB is a decentralized system and does not need any electric switchboard or control console, it is possible to implement a PC-based monitoring system to check device status and to send manual or pre-programmed commands to one or more components of the network. Convergence with other standards The Konnex (KNX) standard was developed as a result of the convergence between EIB, BCi and EHSA. External links The Konnex Standard Le Bus EIB - Le standard KNX (in French) Mise en oeuvre du Bus EIB/KNX (in French) Building automation Home automation Computer buses
https://en.wikipedia.org/wiki/Nixtamalization
Nixtamalization is a process for the preparation of maize, or other grain, in which the grain is soaked and cooked in an alkaline solution, usually limewater (but sometimes aqueous alkali metal carbonates), washed, and then hulled. The term can also refer to the removal via an alkali process of the pericarp from other grains such as sorghum. Nixtamalized corn has several benefits over unprocessed grain: It is more easily ground, its nutritional value is increased, flavor and aroma are improved, and mycotoxins are reduced by up to 97%–100% (for aflatoxins). Lime and ash are highly alkaline: the alkalinity helps the dissolution of hemicellulose, the major glue-like component of the maize cell walls, and loosens the hulls from the kernels and softens the maize. Corn's hemicellulose-bound niacin is converted to free niacin (a form of vitamin B3), making it available for absorption into the body, thus helping to prevent pellagra. Some of the corn oil is broken down into emulsifying agents (monoglycerides and diglycerides), while bonding of the maize proteins to each other is also facilitated. The divalent calcium in lime acts as a cross-linking agent for protein and polysaccharide acidic side chains. While cornmeal made from untreated ground maize is unable by itself to form a dough on addition of water, the chemical changes in masa allow dough formation. These benefits make nixtamalization a crucial preliminary step for further processing of maize into food products, and the process is employed using both traditional and industrial methods, in the production of tortillas and tortilla chips (but not corn chips), tamales, hominy, and many other items. Etymology In the Aztec language Nahuatl, the word for the product of this procedure is nixtamalli or nextamalli, which in turn has yielded Mexican Spanish nixtamal. The Nahuatl word is a compound of nextli "lime ashes" and tamalli "unformed/cooked corn dough, tamal". The term nixtamalization can also be used to describe the removal of the pericar
https://en.wikipedia.org/wiki/Nuclear%20fusion%E2%80%93fission%20hybrid
Hybrid nuclear fusion–fission (hybrid nuclear power) is a proposed means of generating power by use of a combination of nuclear fusion and fission processes. The basic idea is to use high-energy fast neutrons from a fusion reactor to trigger fission in non-fissile fuels like U-238 or Th-232. Each neutron can trigger several fission events, multiplying the energy released by each fusion reaction hundreds of times. As the fission fuel is not fissile, there is no self-sustaining chain reaction from fission. This would not only make fusion designs more economical in power terms, but also allow them to burn fuels that are not suitable for use in conventional fission plants, even their nuclear waste. In general terms, the hybrid is similar in concept to the fast breeder reactor, which uses a compact high-energy fission core in place of the hybrid's fusion core. Another similar concept is the accelerator-driven subcritical reactor, which uses a particle accelerator to provide the neutrons instead of nuclear reactions. History The concept dates to the 1950s, and was strongly advocated by Hans Bethe during the 1970s. At that time the first powerful fusion experiments were being built, but it would still be many years before they could be economically competitive. Hybrids were proposed as a way of greatly accelerating their market introduction, producing energy even before the fusion systems reached break-even. However, detailed studies of the economics of the systems suggested they could not compete with existing fission reactors. The idea was abandoned and lay dormant until the 2000s, when continued delays in reaching break-even led to a brief revival around 2009. These studies generally concentrated on the nuclear-waste-disposal aspects of the design, as opposed to the production of energy. The concept has seen cyclical interest since then, based largely on the success or failure of more conventional solutions like the Yucca Mountain nuclear waste repository. Another
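The claimed energy multiplication is simple arithmetic: a D–T fusion reaction releases about 17.6 MeV, while a single fission event releases roughly 200 MeV, so even a modest number of induced fissions per fusion neutron multiplies the output many times. The sketch below runs those representative numbers; the assumed fissions-per-neutron figures are purely illustrative, since the real value depends on the blanket design.

```python
E_FUSION_MEV = 17.6    # energy of one D-T fusion reaction
E_FISSION_MEV = 200.0  # typical energy of one fission event

def hybrid_gain(fissions_per_fusion_neutron: float) -> float:
    """Total energy out per fusion reaction, relative to fusion alone,
    assuming each fusion yields one neutron driving the fission blanket."""
    total = E_FUSION_MEV + fissions_per_fusion_neutron * E_FISSION_MEV
    return total / E_FUSION_MEV

for n_fissions in (1, 10, 20, 50):
    print(f"{n_fissions:>3} fissions/neutron -> gain x{hybrid_gain(n_fissions):.0f}")
# 20 fissions/neutron already gives a gain of roughly x228 ("hundreds of times").
```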
https://en.wikipedia.org/wiki/McCarthy%2091%20function
The McCarthy 91 function is a recursive function, defined by the computer scientist John McCarthy as a test case for formal verification within computer science. The McCarthy 91 function is defined as M(n) = n − 10 if n > 100, and M(n) = M(M(n + 11)) if n ≤ 100. The results of evaluating the function are given by M(n) = 91 for all integer arguments n ≤ 100, and M(n) = n − 10 for n > 100. Indeed, the result of M(101) is also 91 (101 − 10 = 91). For n > 101 the results increase by 1 with each step, e.g. M(102) = 92, M(103) = 93. History The 91 function was introduced in papers published by Zohar Manna, Amir Pnueli and John McCarthy in 1970. These papers represented early developments towards the application of formal methods to program verification. The 91 function was chosen for being nested-recursive, in contrast with single recursion, such as defining f(n) by means of f(n − 1). The example was popularized by Manna's book, Mathematical Theory of Computation (1974). As the field of formal methods advanced, this example appeared repeatedly in the research literature. In particular, it is viewed as a "challenge problem" for automated program verification. Because it is easier to reason about tail-recursive control flow, an equivalent (extensionally equal) tail-recursive definition uses an explicit counter: M(n) = M'(n, 1), where M'(n, c) = n if c = 0, M'(n, c) = M'(n − 10, c − 1) if n > 100, and M'(n, c) = M'(n + 11, c + 1) if n ≤ 100. As one of the examples used to demonstrate such reasoning, Manna's book includes a tail-recursive algorithm equivalent to the nested-recursive 91 function. Many of the papers that report an "automated verification" (or termination proof) of the 91 function only handle the tail-recursive version. An equivalent mutually tail-recursive definition can also be given; a formal derivation of the mutually tail-recursive version from the nested-recursive one was given in a 1980 article by Mitchell Wand, based on the use of continuations. Examples Example A: M(99) = M(M(110)) since 99 ≤ 100 = M(100) since 110 > 100 = M(M(111)) since 100 ≤ 100 = M(101) since 111 > 100 = 91 since 101 > 100 Example B: M(87) = M(M(98)) = M(M(M
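Both definitions translate directly into code. The following sketch implements the nested-recursive definition and the counter-based tail-recursive one (written as a loop, since Python does not eliminate tail calls), and checks them against the closed form.

```python
def mccarthy91(n: int) -> int:
    """Nested-recursive definition."""
    if n > 100:
        return n - 10
    return mccarthy91(mccarthy91(n + 11))

def mccarthy91_tail(n: int) -> int:
    """Counter-based tail-recursive version, expressed as a loop:
    c counts how many pending applications of M remain."""
    c = 1
    while c != 0:
        if n > 100:
            n, c = n - 10, c - 1
        else:
            n, c = n + 11, c + 1
    return n

for n in range(-100, 200):
    expected = 91 if n <= 100 else n - 10
    assert mccarthy91(n) == mccarthy91_tail(n) == expected
```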
https://en.wikipedia.org/wiki/Grothendieck%27s%20Galois%20theory
In mathematics, Grothendieck's Galois theory is an abstract approach to the Galois theory of fields, developed around 1960 to provide a way to study the fundamental group of algebraic topology in the setting of algebraic geometry. It provides, in the classical setting of field theory, an alternative perspective to that of Emil Artin based on linear algebra, which became standard from about the 1930s. The approach of Alexander Grothendieck is concerned with the category-theoretic properties that characterise the categories of finite G-sets for a fixed profinite group G. For example, G might be the group denoted $\hat{\mathbb{Z}}$, which is the inverse limit of the cyclic additive groups Z/nZ — or equivalently the completion of the infinite cyclic group Z for the topology of subgroups of finite index. A finite G-set is then a finite set X on which G acts through a quotient finite cyclic group, so that it is specified by giving some permutation of X. In the above example, a connection with classical Galois theory can be seen by regarding $\hat{\mathbb{Z}}$ as the profinite Galois group Gal($\overline{F}$/F) of the algebraic closure $\overline{F}$ of any finite field F, over F. That is, the automorphisms of $\overline{F}$ fixing F are described by the inverse limit, as we take larger and larger finite splitting fields over F. The connection with geometry can be seen when we look at covering spaces of the unit disk in the complex plane with the origin removed: the finite covering realised by the map z ↦ z^n of the punctured disk, thought of by means of a complex number variable z, corresponds to the subgroup nZ of the fundamental group of the punctured disk. The theory of Grothendieck, published in SGA1, shows how to reconstruct the category of G-sets from a fibre functor Φ, which in the geometric setting takes the fibre of a covering above a fixed base point (as a set). In fact there is an isomorphism proved of the type G ≅ Aut(Φ), the latter being the group of automorphisms (self-natural equivalences) of Φ. An abstract classification of categories w
https://en.wikipedia.org/wiki/Clamp%20connection
A clamp connection is a hook-like structure formed by growing hyphal cells of certain fungi. It is a characteristic feature of basidiomycete fungi. It is created to ensure that each cell, or segment of hypha separated by septa (cross walls), receives a set of differing nuclei, which are obtained through the mating of hyphae of differing sexual types. It is used to maintain genetic variation within the hypha, much like the mechanisms found in croziers (hooks) during the sexual reproduction of ascomycetes. Formation Clamp connections are formed by the terminal hypha during elongation. Before the clamp connection is formed, this terminal segment contains two nuclei. Once the terminal segment is long enough, it begins to form the clamp connection, and at the same time each nucleus undergoes mitotic division to produce two daughter nuclei. As the clamp continues to develop, it takes up one of the daughter nuclei and separates it from its sister nucleus, while the remaining nuclei migrate away from one another to opposite ends of the cell. Once all these steps have occurred, a septum forms, separating each set of nuclei. Use in classification Clamp connections are structures unique to the phylum Basidiomycota. Many fungi from this phylum produce spores in basidiocarps (fruiting bodies, or mushrooms) above ground. Though clamp connections are exclusive to this phylum, not all species of Basidiomycota possess these structures. As such, the presence or absence of clamp connections has been a tool in categorizing genera and species. Fossil record A fungal mycelium containing abundant clamp connections was found that dates to the Pennsylvanian (298.9–323.2 Mya). This fossil, classified in the form genus Palaeancistrus, has hyphae that compare with those of extant saprophytic basidiomycetes. The oldest known clamp connections exist in hyphae present in the fossil fern Botryopteris antiqua, which predate Palaeancistrus by about 25 Ma.
https://en.wikipedia.org/wiki/Nevanlinna%20function
In mathematics, in the field of complex analysis, a Nevanlinna function is a complex function which is an analytic function on the open upper half-plane $\mathbb{H}$ and has non-negative imaginary part. A Nevanlinna function maps the upper half-plane to itself or to a real constant, but is not necessarily injective or surjective. Functions with this property are sometimes also known as Herglotz, Pick or R functions. Integral representation Every Nevanlinna function $N(z)$ admits a representation $N(z) = C + Dz + \int_{\mathbb{R}}\left(\frac{1}{\lambda - z} - \frac{\lambda}{1+\lambda^2}\right)d\mu(\lambda)$, $z \in \mathbb{H}$, where $C$ is a real constant, $D$ is a non-negative constant, $\mathbb{H}$ is the upper half-plane, and $\mu$ is a Borel measure on $\mathbb{R}$ satisfying the growth condition $\int_{\mathbb{R}}\frac{d\mu(\lambda)}{1+\lambda^2} < \infty$. Conversely, every function of this form turns out to be a Nevanlinna function. The constants in this representation are related to the function via $C = \operatorname{Re}(N(i))$ and $D = \lim_{y\to\infty}\frac{N(iy)}{iy}$, and the Borel measure $\mu$ can be recovered from $N$ by employing the Stieltjes inversion formula (related to the inversion formula for the Stieltjes transformation): $\mu((\lambda_1,\lambda_2]) = \lim_{\delta\to 0^+}\lim_{\varepsilon\to 0^+}\frac{1}{\pi}\int_{\lambda_1+\delta}^{\lambda_2+\delta}\operatorname{Im}(N(\lambda+i\varepsilon))\,d\lambda$. A very similar representation of functions is also called the Poisson representation. Examples Some elementary examples of Nevanlinna functions follow (with appropriately chosen branch cuts in the first two). $z^p$ with $0 \le p \le 1$, and more generally the rotated powers $i(z/i)^p$ with $-1 \le p \le 1$ ($z$ can be replaced by $z - a$ for any real number $a$); these are injective, but when $p$ does not equal 1 or −1 they are not surjective. A sheet of $\log(z)$, such as the one with $\log(1) = 0$. $\tan(z)$ (an example that is surjective but not injective). A Möbius transformation $z \mapsto \frac{az+b}{cz+d}$ is a Nevanlinna function if (sufficient but not necessary) $a$, $b$, $c$, $d$ are real and $ad - bc$ is a positive real number; this is equivalent to the set of such transformations that map the real axis to itself. One may then add any constant in the upper half-plane, and move the pole into the lower half-plane, giving new values for the parameters. Entire examples include $z + i$ and the constant function $i$; the second is neither injective nor surjective. If $S$ is a self-adjoint operator in a Hilbert space and $v$ is an arbitrary vector, then the function $z \mapsto \langle (S - z)^{-1}v, v\rangle$ is a Nevanlinna function.
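The Möbius criterion is easy to sanity-check numerically: with real coefficients and ad − bc > 0, the image of any point in the upper half-plane should again have positive imaginary part. A small sketch, with illustrative coefficient values only:

```python
import random

def moebius(a: float, b: float, c: float, d: float):
    assert a * d - b * c > 0, "criterion: real coefficients, ad - bc > 0"
    return lambda z: (a * z + b) / (c * z + d)

f = moebius(2.0, -1.0, 1.0, 3.0)   # ad - bc = 7 > 0

random.seed(0)
for _ in range(10000):
    z = complex(random.uniform(-50, 50), random.uniform(1e-6, 50))
    assert f(z).imag > 0            # upper half-plane maps into itself

# Why it works: Im((az+b)/(cz+d)) = (ad - bc) * Im(z) / |cz + d|^2 > 0.
print("all samples stayed in the upper half-plane")
```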
https://en.wikipedia.org/wiki/End%20node%20problem
The end node problem arises when individual computers are used for sensitive work and/or temporarily become part of a trusted, well-managed network/cloud and then are used for riskier activities and/or join untrusted networks. (Individual computers on the periphery of networks/clouds are called end nodes.) End nodes are often not managed to the trusted network's high computer-security standards. End nodes often have weak/outdated software, weak security tools, excessive permissions, mis-configurations, questionable content and apps, and covert exploitations. Cross-contamination and unauthorized release of data from within a computer system become the problem. Within the vast cyber-ecosystem, these end nodes often attach transiently to one or more clouds/networks, some trustworthy and others not. A few examples: a corporate desktop browsing the Internet, a corporate laptop checking company webmail via a coffee shop's open Wi-Fi access point, a personal computer used to telecommute during the day and for gaming at night, or an app within a smartphone/tablet (or any of the previous use/device combinations). Even if fully updated and tightly locked down, these nodes may ferry malware from one network (e.g. a corrupted webpage or an infected email message) into another, sensitive network. Likewise, the end nodes may exfiltrate sensitive data (e.g. by logging keystrokes or capturing the screen). Assuming the device is fully trustworthy, the end node must still provide the means to properly authenticate the user. Other nodes may impersonate trusted computers, thus requiring device authentication. The device and user may be trusted but sit within an untrustworthy environment (as determined by the feedback of onboard sensors). Collectively, these risks are called the end node problem. There are several remedies, but all require instilling trust in the end node and conveying that trust to the network/cloud. The cloud's weakest link Cloud computing may be characterized as a vast, seemingly endless, ar
https://en.wikipedia.org/wiki/Stieltjes%E2%80%93Wigert%20polynomials
In mathematics, Stieltjes–Wigert polynomials (named after Thomas Jan Stieltjes and Carl Severin Wigert) are a family of basic hypergeometric orthogonal polynomials in the basic Askey scheme, for the weight function $w(x) = \frac{k}{\sqrt{\pi}}\exp(-k^2\log^2 x)$ on the positive real line $x > 0$. The moment problem for the Stieltjes–Wigert polynomials is indeterminate; in other words, there are many other measures giving the same family of orthogonal polynomials (see Krein's condition). Koekoek et al. (2010) give in Section 14.27 a detailed list of the properties of these polynomials. Definition The polynomials are given in terms of basic hypergeometric functions and the Pochhammer symbol by $S_n(x;q) = \frac{1}{(q;q)_n}\,{}_1\phi_1\!\left(q^{-n}; 0; q, -q^{n+1}x\right)$, where $q = \exp\left(-\frac{1}{2k^2}\right)$ and $(a;q)_n = \prod_{j=0}^{n-1}(1 - aq^j)$. Orthogonality Since the moment problem for these polynomials is indeterminate, there are many different weight functions on $[0,\infty)$ for which they are orthogonal. Two examples of such weight functions are $w(x) = \frac{k}{\sqrt{\pi}}\exp(-k^2\log^2 x)$ and $w(x) = \frac{k}{\sqrt{\pi}}\exp(-k^2\log^2 x)\left(1 + t\sin(2\pi k^2\log x)\right)$ for $|t| \le 1$. Notes
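The indeterminacy can be made concrete: the perturbed weight above has exactly the same moments as the unperturbed one, because after substituting u = log x the sine term is odd about the centre of the resulting Gaussian and integrates to zero against every power of x. The script below checks the first few moments numerically for k = 1 and t = 1/2 (illustrative parameter choices).

```python
import numpy as np
from scipy.integrate import quad

k, t = 1.0, 0.5   # illustrative parameters

def moment(n: int, perturbed: bool) -> float:
    """n-th moment of the weight, computed after substituting u = log x."""
    def integrand(u):
        base = (k / np.sqrt(np.pi)) * np.exp((n + 1) * u - k**2 * u**2)
        return base * (1 + t * np.sin(2 * np.pi * k**2 * u)) if perturbed else base
    return quad(integrand, -np.inf, np.inf)[0]

for n in range(5):
    print(n, moment(n, False), moment(n, True))
# Both columns agree: the sine term is odd about the Gaussian's centre
# u0 = (n+1)/(2k^2), so it contributes nothing to any moment.
```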
https://en.wikipedia.org/wiki/Carnage%20Heart
Carnage Heart is a video game for the PlayStation, developed by Artdink. It is a mecha-based, turn-based strategy game in which the player takes the role of a commander in a war fought by robots. The robots, called Overkill Engines (OKEs), cannot be directly controlled in battle; they must be programmed beforehand to behave in a certain way under certain conditions, using a flow-diagram system. Gameplay The game features a fairly complex negotiation system that allows the player to purchase, research, or upgrade new equipment and parts. The OKEs themselves can be upgraded as well through this system, allowing for extended use of the same model for as long as possible. The various companies involved in the negotiation process can also provide valuable information about the purchases of the enemy, allowing the player to better plan for the next advance in enemy technology. To aid the player in learning the gameplay, Carnage Heart was packaged with an unusually large amount of material for a console game of the time: in addition to the game disc and an above-average-length manual, the jewel case contained a 58-page strategy guide and a tutorial disc with a 30-minute, fully voice-guided overview of most aspects of the game. OKE production The main focus of the game is the design and programming of the OKEs. An OKE is ready to produce only once it has a complete hardware and software profile. Both of these profiles are stored in the form of a "card" that can be named as the player likes. It is possible for there to be a total of 28 cards, but in reality the player may use only 25, as there are three pre-made cards that cannot be deleted. Before a software profile can be created for a card, there must be a hardware profile. The first choice a player must make in the hardware creation process is that of a body type and style. There are four styles of OKE bodies and three designs in each style to choose from. These styles include a two-legged type, a tank type, a m
https://en.wikipedia.org/wiki/Environmental%20change
Environmental change is a change or disturbance of the environment, most often caused by human influence or by natural ecological processes. Environmental changes can result from various factors, including natural disasters, human interference, or animal interaction. Environmental change encompasses not only physical changes, but also factors like an infestation of invasive species. See also Climate change (general concept) Environmental degradation Global warming Human impact on the environment Acclimatization Atlas of Our Changing Environment Phenotypic plasticity Socioeconomics
https://en.wikipedia.org/wiki/Electronic%20Frontier%20Canada
Electronic Frontier Canada (EFC) was a Canadian online civil-rights organization founded to ensure that the principles embodied in the Canadian Charter of Rights and Freedoms remain protected as new computing, communications, and information technologies are introduced into Canadian society. As of 2005, the organization is no longer active. EFC was founded in January 1994 and later became incorporated under the Canada Corporations Act as a federal non-profit corporation. The letters patent were submitted on December 29, 1994, and recorded on January 18, 1995. EFC is not formally affiliated with the Electronic Frontier Foundation (EFF), which is based in San Francisco, although it shares many of the EFF's goals, about which the two groups have communicated from time to time. EFC focused on issues directly affecting Canadians, whereas the EFF has a clear American focus. Briefly, EFC's mandate was to conduct research into issues and promote public awareness in Canada regarding the application of the Charter of Rights and Freedoms to new computer, communication, and information technologies, such as the Internet. The aim was to protect freedom of expression and the right to privacy in cyberspace. As of 2011, its work had included a submission to the Canadian government concerning the Internet Service Provider wiretapping legislation reforms known as the Lawful Access proposals, and an intervention in the BMG Canada Inc. and others v. Doe and others file-sharing case, in which an Ontario court refused to allow the Canadian Recording Industry Association and several major record labels to obtain the subscriber information of ISP customers alleged to have been infringing copyright. The EFC appears to have been inactive since 2004 (based on the last updates to its website). Similar Organizations OpenMedia.ca Online Rights Canada External links Foundations based in Canada Politics and technology Computer law organizations Internet privacy organizations Privacy organizations Organizations es
https://en.wikipedia.org/wiki/Swan%20neck%20deformity
Swan neck deformity is a deformed position of the finger in which the joint closest to the fingertip is permanently bent toward the palm while the joint nearest to the palm is bent away from it (DIP flexion with PIP hyperextension). It is commonly caused by injury, hypermobility or inflammatory conditions like rheumatoid arthritis, or is sometimes familial (congenital, as in Ehlers–Danlos syndrome). Pathophysiology Swan neck deformity has many possible causes arising from the DIP, PIP, or even the MCP joints. In all cases, there is a stretching of the volar plate at the PIP joint to allow hyperextension, plus some damage to the attachment of the extensor tendon to the base of the distal phalanx that produces a hyperflexed mallet finger. Duck bill deformity is a similar condition affecting the thumb (which cannot have a true swan neck deformity because it does not have enough joints). Diagnosis Diagnosis of swan neck deformity is mainly clinical. MRI of the hand may suggest attenuation of the volar plate at the PIP joint and extensor tendon damage at the DIP joint. Genetic screening tests, such as those for CMT disease, may also be indicated. Treatment Splinting of the fingers. Passive stretching and correction of the deformity.
https://en.wikipedia.org/wiki/Pseudo-polyomino
A pseudo-polyomino, also called a polyking, polyplet or hinged polyomino, is a plane geometric figure formed by joining one or more equal squares edge-to-edge or corner-to-corner at 90°. It is a polyform with square cells. The polyominoes are a subset of the polykings. The name "polyking" refers to the king in chess. The n-kings are the n-square shapes which could be occupied by a king on an infinite chessboard in the course of legal moves. Golomb uses the term pseudo-polyomino to refer to kingwise-connected sets of squares. Enumeration of polykings Free, one-sided, and fixed polykings There are three common ways of distinguishing polyominoes and polykings for enumeration: free polykings are distinct when none is a rigid transformation (translation, rotation, reflection or glide reflection) of another (pieces that can be picked up and flipped over); one-sided polykings are distinct when none is a translation or rotation of another (pieces that cannot be flipped over); fixed polykings are distinct when none is a translation of another (pieces that can be neither flipped nor rotated). The following table shows the numbers of polykings of various types with n cells. Notes External links Polyforms
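The fixed polykings are straightforward to enumerate by computer: grow every shape one king-adjacent cell at a time and deduplicate up to translation. A minimal sketch follows; the first counts it prints (1, 4, 20, 110, ...) match the published sequence for fixed polykings.

```python
from itertools import product

KING_STEPS = [(dx, dy) for dx, dy in product((-1, 0, 1), repeat=2)
              if (dx, dy) != (0, 0)]

def canon(cells):
    """Translate a shape so its bounding box touches the origin (canonical
    form for 'fixed' shapes, which are counted up to translation only)."""
    mx = min(x for x, y in cells)
    my = min(y for x, y in cells)
    return frozenset((x - mx, y - my) for x, y in cells)

def fixed_polykings(n: int):
    """All fixed n-cell polykings, grown one king-adjacent cell at a time."""
    shapes = {frozenset([(0, 0)])}
    for _ in range(n - 1):
        grown = set()
        for shape in shapes:
            for (x, y) in shape:
                for dx, dy in KING_STEPS:
                    cell = (x + dx, y + dy)
                    if cell not in shape:
                        grown.add(canon(shape | {cell}))
        shapes = grown
    return shapes

for n in range(1, 6):
    print(n, len(fixed_polykings(n)))   # 1, 4, 20, 110, ...
```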
https://en.wikipedia.org/wiki/Rotor%20current%20meter
A rotor current meter (RCM) is a mechanical current meter, an oceanographic device deployed within an oceanographic mooring to measure the flow within the world's oceans and learn more about ocean currents. Many RCMs have been replaced by instruments that measure the flow by hydroacoustics, the so-called acoustic Doppler current profilers. However, in Fram Strait for instance, the Alfred Wegener Institute still uses RCMs for long-term monitoring of the inflow into the Arctic Ocean. Measurement principle An RCM usually consists of a recording unit, a propeller to detect the water velocity, and a vane to determine the direction of the flow. The gimbal rings of the RCM instruments compensate for a tilt of up to 40° of the mooring line. The recording unit houses a battery pack for energy supply, the analog-to-digital converter, possible additional sensors for other variables (like the standard parameters conductivity, temperature, depth: CTD) and a micro-controller. According to the data sheet of the RCM 7/8 instruments, the micro-controller performs a number of tasks. The flow direction is detected by a magnetic compass, a needle clamped onto a potentiometer. This design becomes problematic for instruments in the Arctic Ocean, since the North Magnetic Pole moves relative to the geographic North Pole and this (time-dependent) magnetic declination can no longer be neglected. Data quality The accuracy and precision of RCMs are the topic of an ongoing discussion. Comparative studies between different types of RCMs placed close to each other show that the instruments can have large systematic errors. The following table summarizes the specifications for various single-point current meters. These values only hold for medium currents between 0.02 and 2.95 m/s. Small velocities are difficult to detect because the vane shows a lot of inertia and the rotor needs to overcome friction before starting to rotate. The company Aanderaa calls the acoustical single-point instruments reco
https://en.wikipedia.org/wiki/Weierstrass%20factorization%20theorem
In mathematics, and particularly in the field of complex analysis, the Weierstrass factorization theorem asserts that every entire function can be represented as a (possibly infinite) product involving its zeroes. The theorem may be viewed as an extension of the fundamental theorem of algebra, which asserts that every polynomial may be factored into linear factors, one for each root. The theorem, which is named for Karl Weierstrass, is closely related to a second result that every sequence tending to infinity has an associated entire function with zeroes at precisely the points of that sequence. A generalization of the theorem extends it to meromorphic functions and allows one to consider a given meromorphic function as a product of three factors: terms depending on the function's zeros and poles, and an associated non-zero holomorphic function. Motivation It is clear that any finite set of points $\{c_n\}$ in the complex plane has an associated polynomial $\prod_n (z - c_n)$ whose zeroes are precisely at the points of that set. The converse is a consequence of the fundamental theorem of algebra: any polynomial function $p(z)$ in the complex plane has a factorization $p(z) = a\prod_n (z - c_n)$, where $a$ is a non-zero constant and $\{c_n\}$ is the set of zeroes of $p(z)$, counted with multiplicity. The two forms of the Weierstrass factorization theorem can be thought of as extensions of the above to entire functions. The necessity of additional terms in the product is demonstrated when one considers $\prod_n (z - c_n)$ where the sequence $\{c_n\}$ is not finite. It can never define an entire function, because the infinite product does not converge. Thus one cannot, in general, define an entire function from a sequence of prescribed zeroes or represent an entire function by its zeroes using the expressions yielded by the fundamental theorem of algebra. A necessary condition for convergence of the infinite product in question is that, for each $z$, the factors must approach $1$ as $n \to \infty$. So it stands to reason that one should seek a function that could be 0 at a prescribed point, yet remain near 1
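A classical instance of such a product is Euler's factorization sin(πz) = πz ∏_{n≥1} (1 − z²/n²), whose factors tend to 1 as n → ∞, exactly as the convergence condition requires. The snippet below compares truncations of this product against sin(πz) at an arbitrary sample point:

```python
import math

def sin_product(z: float, terms: int) -> float:
    """Truncated Weierstrass/Euler product for sin(pi*z)."""
    result = math.pi * z
    for n in range(1, terms + 1):
        result *= 1.0 - (z / n) ** 2
    return result

z = 0.37  # arbitrary sample point
for terms in (10, 100, 1000, 10000):
    print(terms, sin_product(z, terms))
print("sin(pi*z) =", math.sin(math.pi * z))  # the product converges to this
```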
https://en.wikipedia.org/wiki/Falkland%20Islands%20and%20Dependencies%20Aerial%20Survey%20Expedition
The Falkland Islands and Dependencies Aerial Survey Expedition (FIDASE) was an aerial survey of the Falkland Islands Dependencies and the Antarctic peninsula which took place in the 1955–56 and 1956–57 southern summers. Funded by the Colonial Office and organized by Peter Mott, the survey was carried out by Hunting Aerosurveys Ltd. The expedition was based at Deception Island and utilized the Oluf Sven, two Canso flying-boats, and several helicopters. The photographic collection, held by the British Antarctic Survey as the United Kingdom Antarctic Mapping Centre, comprises about 12,800 frames taken on 26,700 kilometers of ground track.
https://en.wikipedia.org/wiki/Gauge%20symmetry%20%28mathematics%29
In mathematics, any Lagrangian system generally admits gauge symmetries, though it may happen that they are trivial. In theoretical physics, the notion of gauge symmetries depending on parameter functions is a cornerstone of contemporary field theory. A gauge symmetry of a Lagrangian $L$ is defined as a differential operator on some vector bundle $E$ taking its values in the linear space of (variational or exact) symmetries of $L$. Therefore, a gauge symmetry of $L$ depends on sections of $E$ and their partial derivatives. For instance, this is the case of gauge symmetries in classical field theory. Yang–Mills gauge theory and gauge gravitation theory exemplify classical field theories with gauge symmetries. Gauge symmetries possess the following two peculiarities. Being Lagrangian symmetries, gauge symmetries of a Lagrangian satisfy Noether's first theorem, but the corresponding conserved current $J^\mu$ takes a particular superpotential form $J^\mu = W^\mu + d_\nu U^{\nu\mu}$, where the first term $W^\mu$ vanishes on solutions of the Euler–Lagrange equations and the second one is a boundary term, where $U^{\nu\mu}$ is called a superpotential. In accordance with Noether's second theorem, there is a one-to-one correspondence between the gauge symmetries of a Lagrangian and the Noether identities which the Euler–Lagrange operator satisfies. Consequently, gauge symmetries characterize the degeneracy of a Lagrangian system. Note that, in quantum field theory, a generating functional may fail to be invariant under gauge transformations, and gauge symmetries are replaced with the BRST symmetries, depending on ghosts and acting both on fields and ghosts. See also Gauge theory (mathematics) Lagrangian system Noether identities Gauge theory Gauge symmetry Yang–Mills theory Gauge group (mathematics) Gauge gravitation theory Notes
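Electromagnetism gives the simplest worked instance of the superpotential form; the following display (a standard textbook computation, not taken from this article's references) spells it out in LaTeX.

```latex
% Gauge symmetry of the Maxwell Lagrangian and its superpotential current.
% For L = -\tfrac14 F_{\mu\nu}F^{\mu\nu}, with
% F_{\mu\nu} = \partial_\mu A_\nu - \partial_\nu A_\mu,
% the gauge symmetry depends on a parameter function \lambda:
\[
  \delta A_\mu = \partial_\mu \lambda .
\]
% The associated Noether current takes the superpotential form
\[
  J^\mu \;=\; -\,F^{\mu\nu}\,\partial_\nu \lambda
        \;=\; \underbrace{(\partial_\nu F^{\mu\nu})\,\lambda}_{\;\approx\,0
              \text{ on shell (source-free Maxwell equations)}}
        \;+\; \partial_\nu \underbrace{\bigl(-\,F^{\mu\nu}\lambda\bigr)}_{\;U^{\nu\mu}
              \text{ (superpotential)}} ,
\]
% so on solutions the current reduces to a total divergence of the
% antisymmetric superpotential U^{\nu\mu} = -F^{\mu\nu}\lambda.
```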
https://en.wikipedia.org/wiki/Transmembrane%20protein%20222
Transmembrane protein 222 is a protein that in humans is encoded by the TMEM222 gene. One notable feature of the protein encoded by this gene is the presence of three predicted transmembrane domains. The TMEM222 protein is predicted to most likely localize to secretory vesicles. Gene Features TMEM222 has a domain of unknown function (DUF778). Aliases of this gene include DKFZP564D0478, RP11-4K3__A.4, C1orf160, and MGC111002. Accession NM_032125.2, the longest coding sequence (1629 bp), encodes a protein of 208 amino acid residues (23,230 daltons), which is considered the consensus coding sequence (CCDS297.2). There are two isoforms of the protein encoded by this gene. They are similar except that the second (Q9H0R3-2) lacks the first 96 amino acid residues present in the first (Q9H0R3-1). Gene Expression AceView labels TMEM222 as highly expressed, with 3.8 times more expression than the average gene in the database. There is expression evidence from 166 tissues, including brain, lung, colon, kidney, and placenta. Homology Orthologs and distant homologs of the human TMEM222 gene have been identified throughout Eukaryota, especially in plants and animals. No paralogs of this gene have been found in the human genome. Distant Homolog A distant homolog of TMEM222, RTH (RTE1-Homolog), is a homolog of RTE1 (Reversion-to-Ethylene Perception 1), which is known to induce conformational changes in ETR1 (Ethylene receptor 1) that result in negative regulation corresponding with loss of ethylene perception. Protein Interactions Evidence from yeast two-hybrid screening exists for two protein interactions with this protein. One is a serine protease (PRSS23) that has been identified as being involved in mouse ovulation and is secreted into the extracellular matrix. The other protein is an α/β-hydrolase (HLA-B associated transcript 5) that is integral to the membrane; its corresponding gene is located in the genome near Tumor Necrosis Factor (TNF)-alpha and TNF-beta