https://en.wikipedia.org/wiki/Germania%20%28St.%20Paul%27s%20Church%2C%20Frankfurt%20am%20Main%29
Germania is the name of a painting that was probably created in March 1848. It hung in St. Paul's Church (Paulskirche) in Frankfurt, Germany. At that time, first the so-called Pre-Parliament and then the Frankfurt National Assembly, the first all-German parliament, met there. The National Assembly was a popular motif of the time, so the Germania painting also became very well known. After the National Assembly was violently dissolved in May 1849, the painting was taken down. In 1867 it was moved to the Germanisches Nationalmuseum (German National Museum) in Nuremberg. The painting is one of the best-known representations of Germania, a female figure who stands for Germany. Such national allegories also exist in other countries. The motif was often taken up in 1848/1849 and later, during the emergence of the German Empire. Image contents The early date of completion, at the end of March 1848, could explain the imagery used. At that time, there were still few ideas about the future of Germany and its form of government. Accordingly, the painting is politically restrained and refers neither to the popular movement nor to a crown (of a German emperor). It is clearly less militant than comparable paintings of the revolutionary period, more conservative and moderate, appealing to the unity of the nation. "Germania stands on a stone pedestal high above a shadowy hilly landscape, illuminated in gold by the rising sun of a new age. She wears a red ermine-covered ruler's robe with the double-headed eagle in the breast shield, over it a wide, blue-lined gold brocade cloak. With her left hand she is leaning on a medieval tournament lance from which the black-red-gold flag is flying. The shimmering German tricolour forms the foil for the youthful blond head of Germania, crowned with oak leaves. [...] In her right hand Germania holds a raised bare sword and an olive branch. At her feet lies a burst fetter." (Rainer Schoch) "With shattered fetters, holding the black-red-gold flag in her left hand, she embodi
https://en.wikipedia.org/wiki/Enoxolone
Enoxolone (INN, BAN; also known as glycyrrhetinic acid or glycyrrhetic acid) is a pentacyclic triterpenoid derivative of the beta-amyrin type obtained from the hydrolysis of glycyrrhizic acid, which is obtained from the herb liquorice. It is used in flavoring, where it masks the bitter taste of drugs like aloe and quinine. It is effective in the treatment of peptic ulcer and also has expectorant and antitussive properties. It has some additional pharmacological properties, with possible antiviral, antifungal, antiprotozoal, and antibacterial activities. Mechanism of action Glycyrrhetinic acid inhibits the enzymes (15-hydroxyprostaglandin dehydrogenase and delta-13-prostaglandin reductase) that metabolize the prostaglandins PGE-2 and PGF-2α to their respective, inactive 15-keto-13,14-dihydro metabolites. This increases prostaglandin levels in the digestive system. Prostaglandins inhibit gastric secretion, stimulate pancreatic secretion and mucous secretion in the intestines, and markedly increase intestinal motility. They also cause cell proliferation in the stomach. The effect on gastric acid secretion, and the promotion of mucous secretion and cell proliferation, shows why licorice has potential in treating peptic ulcers. Licorice should not be taken during pregnancy, because PGF-2α stimulates activity of the uterus during pregnancy and can cause abortion. The structure of glycyrrhetinic acid is similar to that of cortisone. Both molecules are flat and similar at positions 3 and 11. This might be the basis for licorice's anti-inflammatory action. 3-β-D-(Monoglucuronyl)-18-β-glycyrrhetinic acid, a metabolite of glycyrrhetinic acid, inhibits the conversion of 'active' cortisol to 'inactive' cortisone in the kidneys. This occurs via inhibition of the enzyme 11-β-hydroxysteroid dehydrogenase. As a result, cortisol levels become high within the collecting duct of the kidney. Cortisol has intrinsic mineralocorticoid properties (that is, it acts like aldosterone and increases sodium reabsorpt
https://en.wikipedia.org/wiki/Hilbert%27s%20irreducibility%20theorem
In number theory, Hilbert's irreducibility theorem, conceived by David Hilbert in 1892, states that every finite set of irreducible polynomials in a finite number of variables with rational number coefficients admits a common specialization of a proper subset of the variables to rational numbers such that all the polynomials remain irreducible. It is a prominent result in number theory. Formulation of the theorem Hilbert's irreducibility theorem. Let $f_1(X_1,\ldots,X_r,Y_1,\ldots,Y_s),\ldots,f_n(X_1,\ldots,X_r,Y_1,\ldots,Y_s)$ be irreducible polynomials in the ring $\mathbb{Q}(X_1,\ldots,X_r)[Y_1,\ldots,Y_s]$. Then there exists an r-tuple of rational numbers $(a_1,\ldots,a_r)$ such that $f_1(a_1,\ldots,a_r,Y_1,\ldots,Y_s),\ldots,f_n(a_1,\ldots,a_r,Y_1,\ldots,Y_s)$ are irreducible in the ring $\mathbb{Q}[Y_1,\ldots,Y_s]$. Remarks. It follows from the theorem that there are infinitely many such r-tuples. In fact the set of all irreducible specializations, called the Hilbert set, is large in many senses. For example, this set is Zariski dense in $\mathbb{Q}^r$. There are always (infinitely many) integer specializations, i.e., the assertion of the theorem holds even if we demand $(a_1,\ldots,a_r)$ to be integers. There are many Hilbertian fields, i.e., fields satisfying Hilbert's irreducibility theorem; for example, number fields are Hilbertian. The irreducible specialization property stated in the theorem is the most general one. There are many reductions, e.g., it suffices to take $n = 1$ in the definition. A result of Bary-Soroker shows that for a field K to be Hilbertian it suffices to consider the case of $n = 1$ and $f_1$ absolutely irreducible, that is, irreducible in the ring $K^{\mathrm{alg}}[X,Y]$, where $K^{\mathrm{alg}}$ is the algebraic closure of K. Applications Hilbert's irreducibility theorem has numerous applications in number theory and algebra. For example: The inverse Galois problem, Hilbert's original motivation. The theorem almost immediately implies that if a finite group G can be realized as the Galois group of a Galois extension N of $\mathbb{Q}(X_1,\ldots,X_r)$, then it can be specialized to a Galois extension N0 of the rational numbers with G as its Galois group. (To see this, choose a monic irreducible polynomial f(X1, ..., Xn, Y) whose root generates
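A small sketch (not from the article; the function name and the choice of polynomial are mine) of what the theorem says in its simplest case: for $f(X,Y) = Y^2 - X$, irreducible over $\mathbb{Q}(X)$, the specializations $X \to a$ that break irreducibility are exactly the rational squares, a "thin" set:

```python
import math

def specialized_is_irreducible(a):
    """For f(X, Y) = Y**2 - X, irreducible over Q(X), check whether the
    specialization X -> a (a a nonzero integer) stays irreducible over Q.
    A monic quadratic Y**2 - a is reducible over Q exactly when a is the
    square of a rational, i.e. (for an integer a) a perfect square."""
    if a < 0:
        return True  # Y**2 - a > 0 for all real Y, so no rational root
    r = math.isqrt(a)
    return r * r != a

# The exceptional (reducible) specializations among 1..19 are the squares:
print([a for a in range(1, 20) if not specialized_is_irreducible(a)])
```

Hilbert's theorem guarantees that such exceptional sets are always small in this sense, even for several polynomials in several variables.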
https://en.wikipedia.org/wiki/Dependent%20ML
Dependent ML is an experimental functional programming language proposed by Hongwei Xi and Frank Pfenning. Dependent ML extends ML by a restricted notion of dependent types: types may be dependent on static indices of type Nat (natural numbers). Dependent ML employs a constraint theorem prover to decide a strong equational theory over the index expressions. DML's types are not dependent on runtime values - there is still a phase distinction between compilation and execution of the program. By restricting the generality of full dependent types, type checking remains decidable, but type inference becomes undecidable. Dependent ML has been superseded by ATS and is no longer under active development.
https://en.wikipedia.org/wiki/Orphon
An orphon is a gene located outside the main chromosomal locus of its gene family, i.e., it may be dispersed to an unconnected genomic location. Orphons have been found in both protein-coding and non-protein-coding gene families, which suggests that most gene transcription processes do not constitute a restriction on the development of orphons. Extensive polymorphism in this feature has been shown between individuals of the same species. The gene class was first discovered in yeast, sea urchins, and fruit flies, and has since been reported in the genomes of many other eukaryote groups, including molluscs, amphibians, and mammals such as humans.
https://en.wikipedia.org/wiki/Light%20dressed%20state
In the fields of atomic, molecular, and optical science, the term light dressed state refers to a quantum state of an atomic or molecular system interacting with laser light in terms of the Floquet picture, i.e. roughly like an atom or a molecule plus a photon. The Floquet picture is based on the Floquet theorem for differential equations with periodic coefficients. Mathematical formulation The Hamiltonian of a system of charged particles interacting with laser light can be expressed as $H = \sum_\alpha \frac{1}{2m_\alpha}\left[\mathbf{p}_\alpha - \frac{z_\alpha}{c}\mathbf{A}(\mathbf{r}_\alpha, t)\right]^2 + V(\{\mathbf{r}_\alpha\})$ (1), where $\mathbf{A}$ is the vector potential of the electromagnetic field of the laser; $\mathbf{A}$ is periodic in time as $\mathbf{A}(t+T) = \mathbf{A}(t)$. The position and momentum of the $\alpha$-th particle are denoted as $\mathbf{r}_\alpha$ and $\mathbf{p}_\alpha$, respectively, while its mass and charge are symbolized as $m_\alpha$ and $z_\alpha$, respectively. $c$ is the speed of light. By virtue of this time-periodicity of the laser field, the total Hamiltonian is also periodic in time as $H(t+T) = H(t)$. The Floquet theorem guarantees that any solution $\psi(\mathbf{r},t)$ of the Schrödinger equation with this type of Hamiltonian can be expressed in the form $\psi(\mathbf{r},t) = \exp(-i\varepsilon t/\hbar)\,\phi(\mathbf{r},t)$, where $\phi$ has the same time-periodicity as the Hamiltonian, $\phi(\mathbf{r},t+T) = \phi(\mathbf{r},t)$. Therefore, this part can be expanded in a Fourier series, obtaining $\psi(\mathbf{r},t) = \exp(-i\varepsilon t/\hbar)\sum_{n=-\infty}^{\infty}\exp(-in\omega t)\,\phi^{(n)}(\mathbf{r})$ (2), where $\omega = 2\pi/T$ is the frequency of the laser field. This expression (2) reveals that a quantum state of the system governed by the Hamiltonian (1) can be specified by a real number $\varepsilon$, the quasienergy, and an integer $n$. The integer $n$ in eq. (2) can be regarded as the number of photons absorbed from (or emitted to) the laser field. In order to prove this statement, we clarify the correspondence between the solution (2), which is derived from the classical expression of the electromagnetic field where there is no concept of photons, and one which is derived from a quantized electromagnetic field (see quantum field theory). (It can be verified that $n$ is equal to the expectation value of the absorbed photon number at the limit of $N_0 \to \infty$, where $N_0$ is the initial number of total photons.)
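As a rough numerical sketch of the Floquet picture (not from the article; the model, function name and parameters are my own assumptions), the quasienergies ε of a classically driven two-level system can be obtained by diagonalizing a truncated Floquet matrix whose blocks are labelled by the photon-number index n:

```python
import numpy as np

def floquet_quasienergies(delta, rabi, omega, n_max=20):
    """Quasienergies of a two-level system with H(t) = (delta/2)*sz
    + (rabi/2)*cos(omega*t)*sx (hbar = 1), from the truncated Floquet
    matrix built over photon-number blocks n = -n_max .. n_max."""
    sz = np.diag([1.0, -1.0])
    sx = np.array([[0.0, 1.0], [1.0, 0.0]])
    h0 = 0.5 * delta * sz
    v = 0.25 * rabi * sx          # Fourier component of the cos coupling
    n_blocks = 2 * n_max + 1
    hf = np.zeros((2 * n_blocks, 2 * n_blocks))
    for k, n in enumerate(range(-n_max, n_max + 1)):
        i = 2 * k
        hf[i:i + 2, i:i + 2] = h0 + n * omega * np.eye(2)
        if k + 1 < n_blocks:      # couple block n to block n + 1
            hf[i:i + 2, i + 2:i + 4] = v
            hf[i + 2:i + 4, i:i + 2] = v
    eps = np.linalg.eigvalsh(hf)
    # fold all eigenvalues into the first "Floquet zone" [-omega/2, omega/2)
    return np.sort(((eps + 0.5 * omega) % omega) - 0.5 * omega)
```

With the drive switched off (rabi = 0) the folded quasienergies reduce to the bare levels ±delta/2, replicated once per photon block, as the Floquet theorem predicts.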
https://en.wikipedia.org/wiki/Critical%20point%20%28thermodynamics%29
In thermodynamics, a critical point (or critical state) is the end point of a phase equilibrium curve. One example is the liquid–vapor critical point, the end point of the pressure–temperature curve that designates conditions under which a liquid and its vapor can coexist. At higher temperatures, the gas cannot be liquefied by pressure alone. At the critical point, defined by a critical temperature Tc and a critical pressure pc, phase boundaries vanish. Other examples include the liquid–liquid critical points in mixtures, and the ferromagnet–paramagnet transition (Curie temperature) in the absence of an external magnetic field. Liquid–vapor critical point Overview For simplicity and clarity, the generic notion of critical point is best introduced by discussing a specific example, the vapor–liquid critical point. This was the first critical point to be discovered, and it is still the best known and most studied one. The figure to the right shows the schematic P-T diagram of a pure substance (as opposed to mixtures, which have additional state variables and richer phase diagrams, discussed below). The commonly known phases solid, liquid and vapor are separated by phase boundaries, i.e. pressure–temperature combinations where two phases can coexist. At the triple point, all three phases can coexist. However, the liquid–vapor boundary terminates in an endpoint at some critical temperature Tc and critical pressure pc. This is the critical point. The critical point of water occurs at 647.096 K (373.946 °C) and 22.064 MPa. In the vicinity of the critical point, the physical properties of the liquid and the vapor change dramatically, with the two phases becoming ever more similar. For instance, liquid water under normal conditions is nearly incompressible, has a low thermal expansion coefficient, has a high dielectric constant, and is an excellent solvent for electrolytes. Near the critical point, all these properties change into the exact opposite: water becomes compressible, expandable, a poor diele
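As a small worked example (not from the article), the van der Waals equation of state predicts the critical constants in closed form, $T_c = 8a/(27Rb)$ and $p_c = a/(27b^2)$; plugging in textbook van der Waals constants for water reproduces the measured critical point surprisingly well:

```python
# Critical constants from the van der Waals equation of state:
#   T_c = 8a / (27 R b),   p_c = a / (27 b**2)
# a and b below are textbook van der Waals constants for water.
R = 0.083145   # L·bar/(mol·K)
a = 5.536      # L^2·bar/mol^2
b = 0.03049    # L/mol

T_c = 8 * a / (27 * R * b)   # ~647 K
p_c = a / (27 * b ** 2)      # ~220 bar ~ 22 MPa
print(f"T_c ≈ {T_c:.0f} K, p_c ≈ {p_c:.0f} bar")
```

The close agreement with the experimental values quoted above is partly by construction, since van der Waals constants are often fitted to the critical point.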
https://en.wikipedia.org/wiki/Extreme%20Graphics
Extreme Graphics is a computer graphics architecture for Silicon Graphics computer workstations. Extreme Graphics was developed in 1993 and was available as a high-end graphics option on workstations such as the Indigo2, released during the mid-1990s. Extreme Graphics gives the workstation real-time 2D and 3D graphics rendering capability comparable to that of even high-end PCs made many years after Extreme's introduction, with the exception of texture rendering, which is performed in software. Extreme Graphics systems consist of eight Geometry Engines and two Raster Engines, twice as many units as the Elan/XZ graphics used in the Indy, Indigo, and Indigo2. The eight Geometry Engines are rated at 256 MFLOPS maximum, far faster than the MIPS R4400 CPU used in the workstation. Extreme Graphics consists of five graphics subsystems: the Command Engine, Geometry Subsystem, Raster Engine, framebuffer and Display Subsystem. Extreme Graphics can produce resolutions up to 1280 x 1024 pixels with 24-bit color and can also process unencoded NTSC and PAL analog television signals. It is reported by the PROM as GU1-Extreme. The Extreme Graphics architecture was superseded by SGI's IMPACT graphics architecture in 1995. External links Indigo2 and POWER Indigo2 Technical Report 2nd-hand Indigo2 Buyers' Guide
https://en.wikipedia.org/wiki/Feza%20G%C3%BCrsey
Feza Gürsey (April 7, 1921 – April 13, 1992) was a Turkish mathematician and physicist. Among his contributions to theoretical physics, his work on the chiral model and on the SU(6) symmetry of the quark model are the best known. Early life Feza Gürsey was born on April 7, 1921, in Istanbul, to Reşit Süreyya Gürsey, a military physician, and Remziye Hisar, a chemist and a pioneering Turkish scientist. He graduated from Galatasaray High School in 1940, and received his degree in Mathematics–Physics from Istanbul University in 1944. Career Through a scholarship from the Turkish Ministry of Education, which he received while he was an assistant at Istanbul University, he pursued a doctorate at Imperial College London in the United Kingdom. He completed his work on the application of quaternions to quantum field theory in 1950. After spending the period from 1950 to 1951 in postdoctoral research at Cambridge University, he worked as an assistant at Istanbul University, where he married Suha Pamir, also a physics assistant, in 1952, and in 1953 he acquired the title of associate professor. During 1957–1961 he worked at Brookhaven National Laboratory, the Institute for Advanced Study in Princeton, New Jersey, and Columbia University. In the 1960s, he worked on the nonlinear chiral Lagrangian and produced results of relevance to quantum chromodynamics. Returning to Turkey in 1961, he accepted the title of professor from Middle East Technical University (METU) and took part in the establishment of the METU Department of Theoretical Physics. Continuing his work as a lecturer at METU until 1974, he formed a research group. Offered a position at Yale University in 1965, he started to work at both Yale University and METU, until 1974, when he decided to give up his position at METU and settle in the United States to continue at Yale. During these years, he took part in the formulation of E(6) grand unified theories. Death and legacy Gürsey died in 1992, in New
https://en.wikipedia.org/wiki/Conversion%20between%20quaternions%20and%20Euler%20angles
Spatial rotations in three dimensions can be parametrized using both Euler angles and unit quaternions. This article explains how to convert between the two representations. Actually this simple use of "quaternions" was first presented by Euler, some seventy years before Hamilton, to solve the problem of magic squares. For this reason the dynamics community commonly refers to quaternions in this application as "Euler parameters". Definition There are two representations of quaternions; this article uses the more popular Hamilton convention. A quaternion has 4 scalar values: $q_w$ (the real part) and $q_x, q_y, q_z$ (the imaginary part). Defining the norm of the quaternion as $\lVert q \rVert = \sqrt{q_w^2 + q_x^2 + q_y^2 + q_z^2}$, a unit quaternion satisfies $\lVert q \rVert = 1$. We can associate a quaternion with a rotation around an axis by the expression $q = \cos(\alpha/2) + \sin(\alpha/2)\,(\cos\beta_x\, i + \cos\beta_y\, j + \cos\beta_z\, k)$, where α is a simple rotation angle (the value in radians of the angle of rotation) and cos(βx), cos(βy) and cos(βz) are the "direction cosines" of the angles between the three coordinate axes and the axis of rotation (Euler's rotation theorem). Intuition To better understand how "direction cosines" work with quaternions: if the axis of rotation is the x-axis, $q = \cos(\alpha/2) + \sin(\alpha/2)\, i$; if the axis of rotation is the y-axis, $q = \cos(\alpha/2) + \sin(\alpha/2)\, j$; if the axis of rotation is the z-axis, $q = \cos(\alpha/2) + \sin(\alpha/2)\, k$; if the axis of rotation is a vector located 45° ($\pi/4$ radians) between the x and y axes, $q = \cos(\alpha/2) + \sin(\alpha/2)\left(\tfrac{1}{\sqrt{2}}\, i + \tfrac{1}{\sqrt{2}}\, j\right)$. Therefore, the x and y axes "share" influence over the new axis of rotation. Tait–Bryan angles For Euler angles we use the Tait–Bryan angles (in terms of flight dynamics): Heading – ψ: rotation about the Z-axis; Pitch – θ: rotation about the new Y-axis; Bank – φ: rotation about the new X-axis, where the X-axis points forward, the Y-axis to the right and the Z-axis downward. In the conversion example above the rotation occurs in the order heading, pitch, bank. Rotation matrices The orthogonal matrix (post-multiplying a column vector) corresponding to a clockwise/left-handed (looking along the positive axis to the origin) rotation by the unit quaternion is given by the in
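The conversion in the heading–pitch–bank (Z-Y-X) order described above can be sketched as follows (the well-known closed-form formulas; function names are mine):

```python
import math

def euler_to_quaternion(heading, pitch, bank):
    """Tait-Bryan angles (Z-Y-X rotation order) -> unit quaternion (w, x, y, z)."""
    cy, sy = math.cos(heading / 2), math.sin(heading / 2)
    cp, sp = math.cos(pitch / 2), math.sin(pitch / 2)
    cr, sr = math.cos(bank / 2), math.sin(bank / 2)
    return (cr * cp * cy + sr * sp * sy,
            sr * cp * cy - cr * sp * sy,
            cr * sp * cy + sr * cp * sy,
            cr * cp * sy - sr * sp * cy)

def quaternion_to_euler(w, x, y, z):
    """Unit quaternion -> (heading, pitch, bank), same Z-Y-X convention."""
    bank = math.atan2(2 * (w * x + y * z), 1 - 2 * (x * x + y * y))
    s = max(-1.0, min(1.0, 2 * (w * y - z * x)))  # clamp for numerical safety
    pitch = math.asin(s)
    heading = math.atan2(2 * (w * z + x * y), 1 - 2 * (y * y + z * z))
    return heading, pitch, bank
```

Note the gimbal-lock case: at pitch = ±90° the asin argument reaches ±1, and heading and bank are no longer independent.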
https://en.wikipedia.org/wiki/Environmental%20toxicants%20and%20fetal%20development
Environmental toxicants and fetal development is the impact of different toxic substances from the environment on the development of the fetus. This article deals with potential adverse effects of environmental toxicants on the prenatal development of the embryo or fetus, as well as on pregnancy complications. The human embryo or fetus is relatively susceptible to impact from adverse conditions within the mother's environment. Substandard fetal conditions often cause various degrees of developmental delay, both physical and mental, for the growing baby. Although some variables do occur as a result of genetic conditions pertaining to the father, a great many are directly brought about by environmental toxins that the mother is exposed to. Various toxins pose a significant hazard to fetuses during development. A 2011 study found that virtually all US pregnant women carry multiple chemicals, including some banned since the 1970s, in their bodies. Researchers detected polychlorinated biphenyls, organochlorine pesticides, perfluorinated compounds, phenols, polybrominated diphenyl ethers, phthalates, polycyclic aromatic hydrocarbons, perchlorate, PBDEs (compounds used as flame retardants), and dichlorodiphenyltrichloroethane (DDT), a pesticide banned in the United States in 1972, in the bodies of 99 to 100 percent of the pregnant women they tested. Among other environmental estrogens, bisphenol A (BPA) was identified in 96 percent of the women surveyed. Several of the chemicals were present at concentrations that other studies have associated with negative effects in children, and it is thought that exposure to multiple chemicals can have a greater impact than exposure to only one substance. Effects Environmental toxicants can be described separately by what effects they have, such as structural abnormalities, altered growth, functional deficiencies, congenital neoplasia, or even death for the fetus. Preterm birth One in ten US babies is born preterm and a
https://en.wikipedia.org/wiki/Wedge
A wedge is a triangular shaped tool, a portable inclined plane, and one of the six simple machines. It can be used to separate two objects or portions of an object, lift up an object, or hold an object in place. It functions by converting a force applied to its blunt end into forces perpendicular (normal) to its inclined surfaces. The mechanical advantage of a wedge is given by the ratio of the length of its slope to its width. Although a short wedge with a wide angle may do a job faster, it requires more force than a long wedge with a narrow angle. The force is applied on a flat, broad surface and transmitted to the narrow, sharp end of the wedge; because the same force is concentrated onto a much smaller area, the pressure at the edge is greatly increased, which is what splits the material. History Wedges have existed for thousands of years. They were first made of simple stone. Perhaps the first example of a wedge is the hand axe (see also Olorgesailie), which is made by chipping stone, generally flint, to form a bifacial edge, or wedge. A wedge is a simple machine that transforms lateral force and movement of the tool into a transverse splitting force and movement of the workpiece. The available power is limited by the effort of the person using the tool, but because power is the product of force and movement, the wedge amplifies the force by reducing the movement. This amplification, or mechanical advantage, is the ratio of the input speed to the output speed. For a wedge, this is given by 1/tan α, where α is the tip angle. The faces of a wedge are modeled as straight lines to form a sliding or prismatic joint. The origin of the wedge is not known. In ancient Egyptian quarries, bronze wedges were used to break away blocks of stone used in construction. Wooden wedges that swelled after being saturated with water were also used. Some indigenous peoples of the Americas used antler wedges for splitting and working wood to make canoes, dwelli
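The two equivalent forms of the ideal mechanical advantage given above can be sketched in a few lines (example numbers are mine, for illustration only):

```python
import math

def wedge_mechanical_advantage(length, width):
    """Ideal mechanical advantage of a wedge: slope length / width."""
    return length / width

def wedge_ma_from_angle(alpha):
    """Equivalent form 1/tan(alpha), for the tip angle alpha in radians."""
    return 1.0 / math.tan(alpha)

# A wedge 10 cm long and 2 cm wide multiplies the applied force about 5x:
print(wedge_mechanical_advantage(0.10, 0.02))  # 5.0
```

A real wedge delivers less than the ideal value because friction on the inclined faces consumes part of the input work.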
https://en.wikipedia.org/wiki/APC%20by%20Schneider%20Electric
APC by Schneider Electric (formerly American Power Conversion Corporation) is a manufacturer of uninterruptible power supplies (UPS), electronics peripherals, and data center products. In 2007, Schneider Electric acquired APC and combined it with MGE UPS Systems to form Schneider Electric's Critical Power & Cooling Services Business Unit, which recorded 2007 revenue of US$3.5 billion (EUR 2.4 billion) and employed 12,000 people worldwide. Until February 2007, when it was acquired, it had been a member of the S&P 500 list of the largest publicly traded companies in the United States. Schneider Electric, with 113,900 employees and operations in 102 countries, had 2008 annual sales of $26 billion (EUR 18.3 billion). In 2011, APC by Schneider Electric became a product brand only, while the company was rebranded as the IT Business Unit of Schneider Electric. History APC was founded in 1981 by three MIT Lincoln Lab electronic power engineers. Originally, the engineers focused on solar power research and development. When government funding for their research ended, APC shifted its focus to power protection by introducing its first UPS in 1984. Acquisition by Schneider Schneider Electric announced its acquisition of APC on October 30, 2006 and completed it on February 14, 2007. APC shareholders approved the deal on January 16, 2007. The European Union authorized the merger, provided that Schneider divest itself of the MGE UPS Systems global UPS business below 10 kVA. Late in 2007, Eaton Powerware bought the MGE Office Protection Systems division of Schneider. Product lines The company focuses its efforts on four application areas: Home/home office Business networks Access provider networks Data centers and facilities Symmetra APC Symmetra LX is a line of uninterruptible power supply products, aimed at network and server applications. Symmetras come in power configurations ranging from 4 kVA to 16 kVA. Symmetras are built for use in a data center (in a 19-in
https://en.wikipedia.org/wiki/Ion%20trap
An ion trap is a combination of electric and/or magnetic fields used to capture charged particles — known as ions — often in a system isolated from an external environment. Atomic and molecular ion traps have a number of applications in physics and chemistry such as precision mass spectrometry, improved atomic frequency standards, and quantum computing. In comparison to neutral atom traps, ion traps have deeper trapping potentials (up to several electronvolts) that do not depend on the internal electronic structure of a trapped ion. This makes ion traps more suitable for the study of light interactions with single atomic systems. The two most popular types of ion traps are the Penning trap, which forms a potential via a combination of static electric and magnetic fields, and the Paul trap which forms a potential via a combination of static and oscillating electric fields. Penning traps can be used for precise magnetic measurements in spectroscopy. Studies of quantum state manipulation most often use the Paul trap. This may lead to a trapped ion quantum computer and has already been used to create the world's most accurate atomic clocks. Electron guns (a device emitting high-speed electrons, used in CRTs) can use an ion trap to prevent degradation of the cathode by positive ions. History The physical principles of ion traps were first explored by F. M. Penning (1894–1953), who observed that electrons released by the cathode of an ionization vacuum gauge follow a long cycloidal path to the anode in the presence of a sufficiently strong magnetic field. A scheme for confining charged particles in three dimensions without the use of magnetic fields was developed by W. Paul based on his work with quadrupole mass spectrometers. Theory A charged particle, such as an ion, feels a force from an electric field. As a consequence of Earnshaw's theorem, it is not possible to confine an ion in an electrostatic field. However, physicists have various ways of working around th
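The workaround the Paul trap uses — an oscillating quadrupole field — reduces, in dimensionless form, to the Mathieu equation $u'' + (a - 2q\cos 2\tau)\,u = 0$. A minimal numerical sketch (my own illustration, not from the article; the a = 0, q < ~0.908 stability boundary is the standard result for an ideal Paul trap) integrates it and checks whether the ion's motion stays bounded:

```python
import math

def mathieu_max_amplitude(a, q, n_periods=30, steps_per_period=400):
    """Integrate the Mathieu equation u'' + (a - 2*q*cos(2*tau)) * u = 0,
    the dimensionless equation of motion of an ion in an ideal Paul trap,
    with classical RK4, and return the largest |u| reached."""
    def accel(tau, u):
        return -(a - 2.0 * q * math.cos(2.0 * tau)) * u
    u, v = 1.0, 0.0                   # initial displacement and velocity
    h = math.pi / steps_per_period    # cos(2*tau) has period pi
    umax = abs(u)
    for step in range(n_periods * steps_per_period):
        tau = step * h
        # RK4 for the first-order system u' = v, v' = accel(tau, u)
        k1u, k1v = v, accel(tau, u)
        k2u, k2v = v + h / 2 * k1v, accel(tau + h / 2, u + h / 2 * k1u)
        k3u, k3v = v + h / 2 * k2v, accel(tau + h / 2, u + h / 2 * k2u)
        k4u, k4v = v + h * k3v, accel(tau + h, u + h * k3u)
        u += h / 6 * (k1u + 2 * k2u + 2 * k3u + k4u)
        v += h / 6 * (k1v + 2 * k2v + 2 * k3v + k4v)
        umax = max(umax, abs(u))
    return umax

# a = 0, q = 0.2 is inside the first stability region: motion stays bounded.
# a = 0, q = 1.5 is outside it: the amplitude grows exponentially.
```

This is the essence of how a time-varying field evades Earnshaw's theorem: neither instant of the oscillating potential confines the ion, but the averaged (secular) motion does, provided (a, q) lie in a stability region.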
https://en.wikipedia.org/wiki/Pinnacle%20Studio
Pinnacle Studio is a video editing program originally developed by Pinnacle Systems as consumer-level software. Upon Pinnacle Systems' acquisition of Munich-based FAST Multimedia, Pinnacle integrated the professional code base of FAST's editing software (since re-branded as Pinnacle Liquid) beginning with Pinnacle Studio version 10. It was acquired by Avid and later, in July 2012, by Corel. Pinnacle Studio allows users to author video content in Video CD, DVD-Video, AVCHD or Blu-ray format, add complementary menus and burn them to disc. In the second half of 2007, Pinnacle introduced VideoSpin, a shareware version of Studio with fewer features; it was discontinued in March 2009. Versions Since version 9, Studio has been sold in several editions: Studio, Studio Plus and Studio Ultimate, all of which are commercial software. There is some additional functionality in the Plus and Ultimate editions, notably a second video track, which allows overlays, A-B edits, chroma key, and picture-in-picture. Pinnacle Studio 24 was released on August 11, 2020. This version included unlimited tracks plus 4K video support, multi-camera editing, enhanced motion tracking, enhanced video masking, and many advanced technical features. It does not support HEVC (H.265) on AMD hardware. iOS In addition to the desktop versions of Pinnacle Studio, two versions of Pinnacle Studio also exist for iPad and iPhone: Pinnacle Studio for iOS and Pinnacle Studio Pro for iOS. The latter has additional features, for example trimming frame by frame using the Dual Viewer Precision Trimmer and exporting to cloud services (such as Dropbox, Google Drive and OneDrive). The latest version of Pinnacle Studio for iOS (5.6.1) requires iOS 9.3 or later, iPad 2 or higher, iPhone 4s or higher, or iPod Touch Series 5 or higher. Reception MacUser rated Pinnacle Studio 4 for iOS as 4 out of 5, saying that it provides "a more fully featured movie editor than iMovie for iPad", but complained that the extra in-app purchase needed for
https://en.wikipedia.org/wiki/PCI-SIG
PCI-SIG, or Peripheral Component Interconnect Special Interest Group, is an electronics industry consortium responsible for specifying the Peripheral Component Interconnect (PCI), PCI-X, and PCI Express (PCIe) computer buses. It is based in Beaverton, Oregon. The PCI-SIG is distinct from the similarly named and adjacently focused PCI Industrial Computer Manufacturers Group. As of 2022, the board of directors of the PCI-SIG has representatives from AMD, ARM, Dell EMC, IBM, Intel, Synopsys, Keysight, NVIDIA, and Qualcomm. The chairman and president of the PCI-SIG is Al Yanes, a "Distinguished Engineer" from IBM. The executive director of the PCI-SIG is Reen Presnell, president of VTM Group. Formation The PCI Special Interest Group was formed in 1992, initially as a "compliance program" to help computer manufacturers implement the Intel specification. The organization became a nonprofit corporation, officially named "PCI-SIG", in the year 2000. Membership Membership of PCI-SIG is open to the entire microcomputer industry for a $4,000 annual fee. PCI-SIG has a membership of over 800 companies that develop differentiated, interoperable products based on its specifications. PCI-SIG specifications are available to members of the organization as free downloads. Non-members can purchase hard-copy specifications. See also PICMG
https://en.wikipedia.org/wiki/HyperTransport%20Consortium
The HyperTransport Consortium is an industry consortium responsible for specifying and promoting the computer bus technology called HyperTransport. Organizational form The Technical Working Group, along with several Task Forces, manages the HyperTransport specification and drives new developments. A Marketing Working Group promotes the use of the technology and the consortium. History The consortium was founded in 2001 by Advanced Micro Devices, Alliance Semiconductor, Apple Computer, Broadcom Corporation, Cisco Systems, NVIDIA, PMC-Sierra, Sun Microsystems, and Transmeta. As of 2009 it has over 50 members. Executives As of 2009, Mike Uhler of AMD is the President of the Consortium, Mario Cavalli is the General Manager, Brian Holden of PMC-Sierra is both the Vice President and the Chair of the Technical Working Group, and Deepika Sarai is the Treasurer. External links HyperTransport Consortium web site Technology Page Link Technical Specifications HTX and DUT Connector Specifications White Papers
https://en.wikipedia.org/wiki/Link%20aggregation
In computer networking, link aggregation is the combining (aggregating) of multiple network connections in parallel by any of several methods. Link aggregation increases total throughput beyond what a single connection could sustain, and provides redundancy where all but one of the physical links may fail without losing connectivity. A link aggregation group (LAG) is the combined collection of physical ports. Other umbrella terms used to describe the concept include trunking, bundling, bonding, channeling or teaming. Implementation may follow vendor-independent standards such as Link Aggregation Control Protocol (LACP) for Ethernet, defined in IEEE 802.1AX or the previous IEEE 802.3ad, but also proprietary protocols. Motivation Link aggregation increases the bandwidth and resilience of Ethernet connections. Bandwidth requirements do not scale linearly. Ethernet bandwidths historically have increased tenfold each generation: 10 megabit/s, 100 Mbit/s, 1000 Mbit/s, 10,000 Mbit/s. If one started to bump into bandwidth ceilings, then the only option was to move to the next generation, which could be cost prohibitive. An alternative solution, introduced by many of the network manufacturers in the early 1990s, is to use link aggregation to combine two physical Ethernet links into one logical link. Most of these early solutions required manual configuration and identical equipment on both sides of the connection. There are three single points of failure inherent to a typical port-cable-port connection, in either a computer-to-switch or a switch-to-switch configuration: the cable itself or either of the ports the cable is plugged into can fail. Multiple logical connections can be made, but many of the higher level protocols were not designed to fail over completely seamlessly. Combining multiple physical connections into one logical connection using link aggregation provides more resilient communications. Architecture Network architects can implement aggregation at an
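A common way LAG implementations keep frames of one conversation in order is to hash flow identifiers (e.g. MAC addresses) onto a member link, so each flow sticks to one physical link while different flows spread across the group. A minimal sketch (my own illustration; real switches use various hash inputs and algorithms, not necessarily SHA-256):

```python
import hashlib

def choose_link(src_mac, dst_mac, n_links):
    """Hash-based egress-link selection for a LAG: every frame of a given
    src/dst pair maps to the same member link, preserving frame order
    within a flow while distributing different flows across links."""
    key = f"{src_mac}-{dst_mac}".encode()
    digest = hashlib.sha256(key).digest()
    return int.from_bytes(digest[:4], "big") % n_links

# All frames between this pair of hosts use one member link of a 4-link LAG:
port = choose_link("00:11:22:33:44:55", "66:77:88:99:aa:bb", 4)
```

A consequence of this design is that a single flow can never exceed the bandwidth of one member link; aggregation helps aggregate traffic, not one elephant flow.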
https://en.wikipedia.org/wiki/Freifunk
Freifunk (German for "free radio") is a non-commercial open grassroots initiative to support free computer networks in the German region. Freifunk is part of the international movement for wireless community networks. The initiative counts about 400 local communities with over 41,000 access points. Among them, Münster, Aachen, Munich, Hanover, Stuttgart, and Uelzen are the biggest communities, with more than 1,000 access points each. Aim The main goals of Freifunk are to build a large-scale, decentralized, free wireless network that is owned by those who run it, and to support local communication. The initiative is based on the Picopeering Agreement, in which participants agree upon a network that is free from discrimination, in the sense of net neutrality. Similar grassroots initiatives in Austria and Switzerland are FunkFeuer and Openwireless. Technology Like many other free community-driven networks, Freifunk uses mesh technology to bring up ad hoc networks by interconnecting multiple wireless LANs. In a Wi-Fi mobile ad hoc network, all routers connect to each other using special routing software. When a router fails, this software automatically calculates a new route to the destination. This software, the Freifunk firmware, is based on OpenWrt and other free software. There are several different implementations of the firmware, depending on the hardware and protocols local communities use. The first Wi-Fi ad hoc network was demonstrated by Toh in Georgia, USA, in 1999. It was a six-node implementation running the Associativity-Based Routing protocol on the Linux kernel and WAVELAN Wi-Fi. But ABR was a patented protocol, so following that experience, Freifunk worked with standard IETF protocols - the two common standard proposals are OLSR and B.A.T.M.A.N. The development of B.A.T.M.A.N. is driven by Freifunk activists on a volunteer basis. History One of the results of the BerLon workshop in October 2002 on free wireless community networ
https://en.wikipedia.org/wiki/Surface%20core%20level%20shift
A surface core level shift (SCS) is a kind of core-level shift that often emerges in X-ray photoelectron spectroscopy spectra of surface atoms. Because surface atoms have a different chemical environment from bulk atoms, small shifts of binding energies are observed by X-ray photoelectron spectroscopy. The SCS is ascribed mainly to the lower coordination numbers of surface atoms compared with bulk atoms. Reduced coordination leads to a narrower valence bandwidth. Such narrowing of the bandwidth increases the density of states, and if more than half of the valence band is filled, the band center lies lower than in the bulk and the binding energy increases. In contrast, if less than half of the valence band is filled, the band center lies higher than in the bulk, and the binding energy decreases. Because the binding energy in X-ray photoelectron spectroscopy is also affected by the final state and other aspects of the chemical environment, this simple explanation cannot always be applied to the interpretation of X-ray photoelectron spectra. In spite of such complexity, the SCS gives important information about the chemical nature of surface atoms.
https://en.wikipedia.org/wiki/Redundancy%20%28engineering%29
In engineering and systems theory, redundancy is the intentional duplication of critical components or functions of a system with the goal of increasing reliability of the system, usually in the form of a backup or fail-safe, or to improve actual system performance, such as in the case of GNSS receivers, or multi-threaded computer processing. In many safety-critical systems, such as fly-by-wire and hydraulic systems in aircraft, some parts of the control system may be triplicated, which is formally termed triple modular redundancy (TMR). An error in one component may then be out-voted by the other two. In a triply redundant system, the system has three sub components, all three of which must fail before the system fails. Since each one rarely fails, and the sub components are designed to preclude common failure modes (which can then be modelled as independent failure), the probability of all three failing is calculated to be extraordinarily small; it is often outweighed by other risk factors, such as human error. Electrical surges arising from lightning strikes are an example of a failure mode which is difficult to fully isolate, unless the components are powered from independent power busses and have no direct electrical pathway in their interconnect (communication by some means is required for voting). Redundancy may also be known by the terms "majority voting systems" or "voting logic". Redundancy sometimes produces less, rather than greater, reliability: it creates a more complex system which is prone to various issues, it may lead to human neglect of duty, and it may lead to higher production demands which, by overstressing the system, may make it less safe. Redundancy is one form of robustness as practiced in computer science. Geographic redundancy has become important in the data center industry, to safeguard data against natural disasters and political instability (see below). Forms of redundancy In computer science, there are four major forms of redundancy:
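The reliability arithmetic behind triple redundancy can be illustrated with a short calculation. The per-component failure probability used here is an assumed example value, not a figure from the text, and component failures are taken to be independent, as the text's common-failure-mode caveat requires.

```python
# Failure probability of a triply redundant system with independent
# component failures. Two variants: the system fails only when all
# three sub components fail (as described above), or when a 2-of-3
# majority vote is lost. The value of p is an assumed example.

def all_three_fail(p: float) -> float:
    """System fails only if every one of the three components fails."""
    return p ** 3

def majority_vote_fails(p: float) -> float:
    """2-of-3 voted system fails when at least two components fail."""
    return 3 * p**2 * (1 - p) + p**3  # exactly two fail + all three fail

p = 1e-3  # assumed per-component failure probability
print(all_three_fail(p))       # ~1e-09
print(majority_vote_fails(p))  # ~3e-06, still far below p itself
```

The calculation shows why the combined probability is "extraordinarily small" relative to a single component, and why, at these magnitudes, residual risk is dominated by factors such as common-mode failures and human error rather than by independent component faults.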
https://en.wikipedia.org/wiki/Redundancy%20%28information%20theory%29
In information theory, redundancy measures the fractional difference between the entropy H(X) of an ensemble X and its maximum possible value log |A_X|, the logarithm of the alphabet size. Informally, it is the amount of wasted "space" used to transmit certain data. Data compression is a way to reduce or eliminate unwanted redundancy, while forward error correction is a way of adding desired redundancy for purposes of error detection and correction when communicating over a noisy channel of limited capacity. Quantitative definition In describing the redundancy of raw data, the rate r of a source of information is the average entropy per symbol. For memoryless sources, this is merely the entropy of each symbol, while, in the most general case of a stochastic process, it is the limit, as n goes to infinity, of the joint entropy of the first n symbols divided by n: r = lim_{n→∞} H(M_1, M_2, …, M_n)/n. It is common in information theory to speak of the "rate" or "entropy" of a language. This is appropriate, for example, when the source of information is English prose. The rate of a memoryless source is simply H(M), since by definition there is no interdependence of the successive messages of a memoryless source. The absolute rate R of a language or source is simply the logarithm of the cardinality of the message space, or alphabet: R = log |M|. (This formula is sometimes called the Hartley function.) This is the maximum possible rate of information that can be transmitted with that alphabet. (The logarithm should be taken to a base appropriate for the unit of measurement in use.) The absolute rate is equal to the actual rate if the source is memoryless and has a uniform distribution. The absolute redundancy can then be defined as D = R − r, the difference between the absolute rate and the rate. The quantity D/R is called the relative redundancy and gives the maximum possible data compression ratio, when expressed as the percentage by which a file size can be decreased. (When expressed as a ratio of original file size to compressed file size, the quantity gives the maxim
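The quantities defined above can be computed directly for a memoryless source. This sketch uses base-2 logarithms (bits) and an assumed four-symbol example distribution; the symbol names r, R, and D mirror the definitions in the text.

```python
# Rate, absolute rate, and redundancy of a memoryless source, in bits.
# The example distribution is assumed for illustration.
from math import log2

def entropy(probs):
    """Shannon entropy H in bits of a memoryless source."""
    return -sum(p * log2(p) for p in probs if p > 0)

probs = [0.5, 0.25, 0.125, 0.125]   # assumed 4-symbol source
r = entropy(probs)                  # rate of the memoryless source
R = log2(len(probs))                # absolute rate: log of alphabet size
D = R - r                           # absolute redundancy
relative_redundancy = D / R         # max fractional compression
print(r, R, D, relative_redundancy) # 1.75 2.0 0.25 0.125
```

For this source, at most 12.5% of the file size can be removed by an ideal compressor, since the data already carries 1.75 of the 2 available bits per symbol.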
https://en.wikipedia.org/wiki/Global%20Information%20Grid
The Global Information Grid (GIG) is a network of information transmission and processing maintained by the United States Department of Defense. More descriptively, it is a worldwide network of information transmission, of associated processes, and of personnel serving to collect, process, safeguard, transmit, and manage this information. It is an all-encompassing communications project of the United States Department of Defense. The GIG makes this information immediately available to military personnel, to military policy makers, and to support personnel. It includes all infrastructure, owned or leased, of communications, electronics, informatics (including software and databases), and security. It is the most visible manifestation of network-centric warfare. It is the combination of technology and human activity that enables warfighters to access information on demand. It is defined as a "globally interconnected, end-to-end set of information capabilities for collecting, processing, storing, disseminating, and managing information on demand to warfighters, policy makers, and support personnel". The GIG includes owned and leased communications and computing systems and services, software (including applications), data, security services, other associated services, and National Security Systems. Non-GIG Information Technology (IT) includes stand-alone, self-contained, or embedded IT that is not, and will not be, connected to the enterprise network. This new definition removes references to the National Security Systems as defined in section 5142 of the Clinger-Cohen Act of 1996. Further, this new definition removes the references to the GIG providing capabilities from all operating locations (bases, posts, camps, stations, facilities, mobile platforms, and deployed sites). And lastly, this definition removes the part of the definition that discusses the interfaces to coalition, allied, and non-Department of Defense users and systems. The DoD's use of t
https://en.wikipedia.org/wiki/Stanford%20Research%20Institute%20Problem%20Solver
The Stanford Research Institute Problem Solver, known by its acronym STRIPS, is an automated planner developed by Richard Fikes and Nils Nilsson in 1971 at SRI International. The same name was later used to refer to the formal language of the inputs to this planner. This language is the base for most of the languages for expressing automated planning problem instances in use today; such languages are commonly known as action languages. This article only describes the language, not the planner. Definition A STRIPS instance is composed of: An initial state; The specification of the goal states – situations that the planner is trying to reach; A set of actions. For each action, the following are included: preconditions (what must be established before the action is performed); postconditions (what is established after the action is performed). Mathematically, a STRIPS instance is a quadruple ⟨P, O, I, G⟩, in which each component has the following meaning: P is a set of conditions (i.e., propositional variables); O is a set of operators (i.e., actions); each operator is itself a quadruple ⟨α, β, γ, δ⟩, each element being a set of conditions. These four sets specify, in order, which conditions must be true for the action to be executable (α), which ones must be false (β), which ones are made true by the action (γ) and which ones are made false (δ); I is the initial state, given as the set of conditions that are initially true (all others are assumed false); G is the specification of the goal state; this is given as a pair ⟨N, M⟩, which specifies which conditions are true (N) and false (M), respectively, in order for a state to be considered a goal state. A plan for such a planning instance is a sequence of operators that can be executed from the initial state and that leads to a goal state. Formally, a state is a set of conditions: a state is represented by the set of conditions that are true in it. Transitions between states are modeled by a transition function, which is a function mapping states into new states
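The STRIPS semantics described above can be sketched in a few lines. Conditions are strings, an operator is a quadruple of condition sets in the order given in the text (must-be-true, must-be-false, made-true, made-false), a state is the set of conditions currently true, and the goal is a pair of true/false condition sets. The toy instance and all names are invented for illustration.

```python
# Minimal sketch of STRIPS: applicability, state transition, goal test.

def applicable(state, op):
    """An operator is executable iff its must-be-true conditions hold
    and none of its must-be-false conditions hold."""
    pre_true, pre_false, _, _ = op
    return pre_true <= state and not (pre_false & state)

def apply_op(state, op):
    """Transition function: delete the made-false conditions, then add
    the made-true ones."""
    _, _, add, delete = op
    return (state - delete) | add

def is_goal(state, goal):
    goal_true, goal_false = goal
    return goal_true <= state and not (goal_false & state)

# Toy instance: move a package from A to B.
initial = {"at_A"}
move_A_B = ({"at_A"}, set(), {"at_B"}, {"at_A"})
goal = ({"at_B"}, {"at_A"})

state = initial
assert applicable(state, move_A_B)
state = apply_op(state, move_A_B)
print(is_goal(state, goal))  # True
```

A plan is then simply a sequence of operators, each applicable in the state produced by its predecessor, whose final state passes the goal test.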
https://en.wikipedia.org/wiki/Savannah%20cat
The Savannah is a breed of hybrid cat developed in the late 20th century from crossing a serval (Leptailurus serval) with a domestic cat (Felis catus). This hybridization typically produces large and lean offspring, with the serval's characteristic large ears and markedly brown-spotted coats. F1 and F2 male Savannahs can be very large, and in 2016 an F2 male attained a world record for tallest cat. Show-eligible F4–F5 cats are smaller, comparable in size to other large domestic cat breeds such as the Maine Coon or Norwegian Forest cat. History On April 7, 1986, Judee Frank crossbred a male serval, belonging to Suzi Wood, with a Siamese domestic cat to produce the first Savannah cat, a female named Savannah. That first Savannah was bred with a Turkish Angora male and gave birth to viable F2 kittens in April 1989. In 1996, Patrick Kelley and Joyce Sroufe wrote the original version of the Savannah breed standard and presented it to the board of The International Cat Association (TICA). In 2001, the board accepted it as a new registered breed, and in May 2012, TICA accepted the Savannah as an eligible championship breed. Physical features and breeding techniques Size The Savannah's tall and slim build gives them the appearance of greater size than their actual weight. Size is very dependent on generation and sex. Early (F1 and F2) generations are usually the largest due to the stronger genetic influence of the African serval ancestor, although there is considerable financial incentive for breeders to produce F1 cats as large as possible; some are the size of dogs, and in the US can fetch very high prices. Like most cat breeds, males tend to be larger than females, and as with other hybrid cat breeds such as the Chausie and Bengal, most F1 Savannah cats will possess many of the exotic traits from the wild (serval) ancestor, which recede in later generations. Later-generation Savannahs are comparable in size to oth
https://en.wikipedia.org/wiki/Hackathon
A hackathon (also known as a hack day, hackfest, datathon or codefest; a portmanteau of hacking and marathon) is an event where people engage in rapid and collaborative engineering over a relatively short period of time, such as 24 or 48 hours. They are often run using agile software development practices, such as sprint-like design, wherein computer programmers and others involved in software development, including graphic designers, interface designers, product managers, project managers, domain experts, and others collaborate intensively on engineering projects, such as software engineering. The goal of a hackathon is to create functioning software or hardware by the end of the event. Hackathons tend to have a specific focus, which can include the programming language used, the operating system, an application, an API, or the subject and the demographic group of the programmers. In other cases, there is no restriction on the type of software being created or the design of the new system. Etymology The word "hackathon" is a portmanteau of the words "hack" and "marathon", where "hack" is used in the sense of exploratory programming, not its alternate meaning as a reference to breaching computer security. OpenBSD's apparent first use of the term referred to a cryptographic development event held in Calgary on June 4, 1999, where ten developers came together to avoid legal problems caused by export regulations on cryptographic software from the United States. Since then, a further three to six events per year have occurred around the world to advance development, generally on university campuses. For Sun Microsystems, the usage referred to an event at the JavaOne conference from June 15 to June 19, 1999; there John Gage challenged attendees to write a program in Java for the new Palm V, using the infrared port to communicate with other Palm users and register it on the Internet. Starting in the mid to late 2000s, hackathons became significantly
https://en.wikipedia.org/wiki/Fredholm%27s%20theorem
In mathematics, Fredholm's theorems are a set of celebrated results of Ivar Fredholm in the Fredholm theory of integral equations. There are several closely related theorems, which may be stated in terms of integral equations, in terms of linear algebra, or in terms of the Fredholm operator on Banach spaces. The Fredholm alternative is one of the Fredholm theorems. Linear algebra Fredholm's theorem in linear algebra is as follows: if M is a matrix, then the orthogonal complement of the row space of M is the null space of M: (row M)^⊥ = ker M. Similarly, the orthogonal complement of the column space of M is the null space of the adjoint: (col M)^⊥ = ker M*. Integral equations Fredholm's theorem for integral equations is expressed as follows. Let K(x, y) be an integral kernel, and consider the homogeneous equation λφ(x) = ∫ₐᵇ K(x, y) φ(y) dy and its complex adjoint λ̄ψ(y) = ∫ₐᵇ K̄(x, y) ψ(x) dx. Here, λ̄ denotes the complex conjugate of the complex number λ, and similarly for K̄(x, y). Then, Fredholm's theorem is that, for any fixed value of λ, these equations have either the trivial solution φ(x) = ψ(y) = 0 or have the same number of linearly independent solutions φ₁(x), …, φₙ(x) and ψ₁(y), …, ψₙ(y). A sufficient condition for this theorem to hold is for K(x, y) to be square integrable on the rectangle [a, b] × [a, b] (where a and/or b may be minus or plus infinity). Here, the integral is expressed as a one-dimensional integral on the real number line. In Fredholm theory, this result generalizes to integral operators on multi-dimensional spaces, including, for example, Riemannian manifolds. Existence of solutions One of Fredholm's theorems, closely related to the Fredholm alternative, concerns the existence of solutions to the inhomogeneous Fredholm equation λφ(x) − ∫ₐᵇ K(x, y) φ(y) dy = f(x). Solutions to this equation exist if and only if the function f(x) is orthogonal to the complete set of solutions ψₖ(x) of the corresponding homogeneous adjoint equation: ∫ₐᵇ ψ̄ₖ(x) f(x) dx = 0, where ψ̄ₖ(x) is the complex conjugate of ψₖ(x), the latter being one of the complete set of solutions to the homogeneous adjoint equation above. A sufficient condition for this theorem to hold is for K(x, y) to be square integrable on the rectangle [a, b] × [a, b].
https://en.wikipedia.org/wiki/Lipofuscin
Lipofuscin is the name given to fine yellow-brown pigment granules composed of lipid-containing residues of lysosomal digestion. It is considered to be one of the aging or "wear-and-tear" pigments, found in the liver, kidney, heart muscle, retina, adrenals, nerve cells, and ganglion cells. Formation and turnover Lipofuscin appears to be the product of the oxidation of unsaturated fatty acids and may be symptomatic of membrane damage, or damage to mitochondria and lysosomes. Aside from a large lipid content, lipofuscin is known to contain sugars and metals, including mercury, aluminium, iron, copper and zinc. Lipofuscin is also accepted as consisting of oxidized proteins (30–70%) as well as lipids (20–50%). It is a type of lipochrome and is specifically arranged around the nucleus. The accumulation of lipofuscin-like material may be the result of an imbalance between formation and disposal mechanisms. Such accumulation can be induced in rats by administering a protease inhibitor (leupeptin); after a period of three months, the levels of the lipofuscin-like material return to normal, indicating the action of a significant disposal mechanism. However, this result is controversial, as it is questionable whether the leupeptin-induced material is true lipofuscin. There is evidence that "true lipofuscin" is not degradable in vitro; whether this holds in vivo over longer time periods is not clear. The ABCR -/- knockout mouse has delayed dark adaptation but a normal final rod threshold relative to controls. Bleaching the retina with strong light leads to formation of the toxic cationic bis-pyridinium salt N-retinylidene-N-retinyl-ethanolamine (A2E), which causes dry and wet age-related macular degeneration. From this experiment, it was concluded that ABCR has a significant role in preventing formation of A2E on extracellular photoreceptor surfaces during bleach recovery. Relation to diseases Lipofuscin accumulation in the eye is a major risk factor implicated in macular deg
https://en.wikipedia.org/wiki/Four-fermion%20interactions
In quantum field theory, fermions are described by anticommuting spinor fields. A four-fermion interaction describes a local interaction between four fermionic fields at a point. Local here means that it all happens at the same spacetime point. This might be an effective field theory or it might be fundamental. Relativistic models Some examples are the following: Fermi's theory of the weak interaction. The interaction term has a (vector minus axial) form. The Gross–Neveu model. This is a four-fermi theory of Dirac fermions without chiral symmetry and as such, it may or may not be massive. The Thirring model. This is a four-fermi theory of fermions with a vector coupling. The Nambu–Jona-Lasinio model. This is a four-fermi theory of Dirac fermions with chiral symmetry and as such, it has no bare mass. Nonrelativistic models A nonrelativistic example is the BCS theory at large length scales with the phonons integrated out so that the force between two dressed electrons is approximated by a contact term. In four space-time dimensions, such theories are not renormalisable.
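Schematic forms of the interaction terms for the relativistic models listed above can be written as follows. Sign conventions and coupling normalizations vary between references, so these should be read as representative shapes of the four-fermion couplings rather than definitive definitions:

```latex
% Fermi theory: (V - A), i.e. vector-minus-axial, current-current form
\mathcal{L}_{\text{Fermi}} \sim \frac{G_F}{\sqrt{2}}
  \left[\bar\psi_1 \gamma^\mu (1-\gamma^5)\, \psi_2\right]
  \left[\bar\psi_3 \gamma_\mu (1-\gamma^5)\, \psi_4\right]

% Gross--Neveu: scalar self-coupling of N fermion flavors
\mathcal{L}_{\text{GN}} \sim \frac{g^2}{2}\,(\bar\psi_a \psi_a)^2

% Thirring: vector current-current self-coupling
\mathcal{L}_{\text{Th}} \sim -\frac{g}{2}\,
  (\bar\psi \gamma^\mu \psi)(\bar\psi \gamma_\mu \psi)

% Nambu--Jona-Lasinio: chirally symmetric scalar-pseudoscalar combination
\mathcal{L}_{\text{NJL}} \sim G\left[(\bar\psi\psi)^2
  + (\bar\psi\, i\gamma^5 \vec\tau\, \psi)^2\right]
```

In each case the interaction is a product of four fermion fields at a single spacetime point, which is what makes the coupling local; power counting on such terms is also what renders these theories non-renormalisable in four spacetime dimensions.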
https://en.wikipedia.org/wiki/Product%20order
In mathematics, given partial orders ≤_A and ≤_B on sets A and B, respectively, the product order (also called the coordinatewise order or componentwise order) is a partial ordering on the Cartesian product A × B. Given two pairs (a_1, b_1) and (a_2, b_2) in A × B, declare that (a_1, b_1) ≤ (a_2, b_2) if a_1 ≤_A a_2 and b_1 ≤_B b_2. Another possible ordering on A × B is the lexicographical order, which is a total ordering. However the product order of two total orders is not in general total; for example, the pairs (0, 1) and (1, 0) are incomparable in the product order of the ordering 0 < 1 with itself. The lexicographic combination of two total orders is a linear extension of their product order, and thus the product order is a subrelation of the lexicographic order. The Cartesian product with the product order is the categorical product in the category of partially ordered sets with monotone functions. The product order generalizes to arbitrary (possibly infinitary) Cartesian products. Suppose A is a set and for every a ∈ A, (I_a, ≤) is a preordered set. Then the product preorder on ∏_{a ∈ A} I_a is defined by declaring for any i = (i_a)_{a ∈ A} and j = (j_a)_{a ∈ A} in ∏_{a ∈ A} I_a that i ≤ j if and only if i_a ≤ j_a for every a ∈ A. If every (I_a, ≤) is a partial order then so is the product preorder. Furthermore, given a set A, the product order over the Cartesian product ∏_{a ∈ A} {0, 1} can be identified with the inclusion ordering of subsets of A. The notion applies equally well to preorders. The product order is also the categorical product in a number of richer categories, including lattices and Boolean algebras.
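The componentwise comparison, and its contrast with the lexicographic order, can be sketched directly on Python tuples; the component orders here are the usual order on integers.

```python
# Product (componentwise) order vs. lexicographic order on pairs.

def product_le(p, q):
    """(a1, b1) <= (a2, b2) in the product order iff a1 <= a2 and b1 <= b2."""
    return all(x <= y for x, y in zip(p, q))

def lex_le(p, q):
    """Lexicographic order; Python tuples compare lexicographically."""
    return p <= q

# (0, 1) and (1, 0) are incomparable in the product order...
print(product_le((0, 1), (1, 0)), product_le((1, 0), (0, 1)))  # False False
# ...but comparable lexicographically: the lex order is a linear
# extension of the product order.
print(lex_le((0, 1), (1, 0)))  # True
```

Whenever product_le(p, q) holds, lex_le(p, q) holds as well, which is the subrelation property stated above; the converse fails, as the incomparable pair shows.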
https://en.wikipedia.org/wiki/Tom%20DeMarco
Tom DeMarco (born August 20, 1940) is an American software engineer, author, and consultant on software engineering topics. He was an early developer of structured analysis in the 1970s. Early life and education Tom DeMarco was born in Hazleton, Pennsylvania. He received a BSEE degree in Electrical Engineering from Cornell University, an M.S. from Columbia University, and a diplôme from the University of Paris. Career DeMarco started working at Bell Telephone Laboratories in 1963, where he participated in the ESS-1 project to develop the first large-scale Electronic Switching System, which was installed in telephone offices all over the world. Later in the 1960s he started working for a French IT consulting firm, where he worked on the development of a conveyor system for the new merchandise mart at La Villette in Paris, and in the 1970s on the development of on-line banking systems in Sweden, Holland, France and New York. In the 1970s DeMarco was one of the major figures in the development of structured analysis and structured design in software engineering. In January 1978 he published Structured Analysis and System Specification, a major milestone in the field. In the 1980s, with Tim Lister, Stephen McMenamin, John F. Palmer, James Robertson and Suzanne Robertson, he founded the consulting firm "The Atlantic Systems Guild" in New York. The firm initially shared offices with Dorset House Publishing, owned by Wendy Eachan, Tim Lister's wife. Their company developed into a New York- and London-based consulting company specializing in methods and management of software development. DeMarco has lectured and consulted throughout the Americas, Europe, Africa, Australia and the Far East. He has also been a technical advisor for ZeniMax Media, the parent company of video game publisher Bethesda Softworks. He is a member of the ACM and a Fellow of the IEEE. He lives in Camden, Maine, and is a principal of the Atlantic Systems Guild, and a fellow and Senior Consulta
https://en.wikipedia.org/wiki/Fan%20film
A fan film is a film or video inspired by a film, television program, comic book, book, or video game created by fans rather than by the source's copyright holders or creators. Fan filmmakers have traditionally been amateurs, but some of the more notable films have actually been produced by professional filmmakers as film school class projects or as demonstration reels. Fan films vary tremendously in quality, as well as in length, from short faux-teaser trailers for non-existent motion pictures to full-length motion pictures. Fan films are also examples of fan labor and the remix culture. Closely related concepts are fandubs, fansubs and vidding, which are reworkings by fans of already released film material. History The earliest known fan film is Anderson 'Our Gang', which was produced in 1926 by a pair of itinerant filmmakers. Shot in Anderson, South Carolina, the short is based on the Our Gang film series; the only known copy resides in the University of South Carolina's Newsfilm Library. Various amateur filmmakers created their own fan films throughout the ensuing decades, including a teenaged Hugh Hefner, but the technology required to make fan films was a limiting factor until relatively recently. In the 1960s, UCLA film student Don Glut filmed a series of short black-and-white "underground films" based on adventure and comic book characters from 1940s and 1950s motion picture serials. Around the same time, artist Andy Warhol produced a film called Batman Dracula which could be described as a fan film. But it wasn't until the 1970s that the popularization of science fiction conventions allowed fans to show their films to the wider fan community. In the 1960s, 1970s, and 1980s, there were many unofficial foreign remakes of American films that today may be considered fan films, such as Süpermenler (Superman), 3 Dev Adam (Spider-Man), Mahakaal (A Nightmare on Elm Street), and Dünyayı Kurtaran Adam (Star Wars). Most of the more prominent science fiction films and t
https://en.wikipedia.org/wiki/Axillary%20bud
The axillary bud (or lateral bud) is an embryonic or organogenic shoot located in the axil of a leaf. Each bud has the potential to form shoots, and may be specialized in producing either vegetative shoots (stems and branches) or reproductive shoots (flowers). Once formed, a bud may remain dormant for some time, or it may form a shoot immediately. Overview An axillary bud is an embryonic or organogenic shoot which lies dormant at the junction of the stem and petiole of a plant. It arises exogenously from the outer layer of the cortex of the stem. Axillary buds do not become actively growing shoots on plants with strong apical dominance (the tendency to grow just the terminal bud on the main stem). Apical dominance occurs because the shoot apical meristem produces auxin, which prevents axillary buds from growing. The axillary buds begin developing when they are exposed to less auxin, for example if the plant naturally has weak apical dominance, if apical dominance is broken by removing the terminal bud, or if the terminal bud has grown far enough away for the auxin to have less of an effect. An example of axillary buds is the eyes of the potato. Effects of auxin As the apical meristem grows and forms leaves, a region of meristematic cells is left behind at the node between the stem and the leaf. These axillary buds are usually dormant, inhibited by auxin produced by the apical meristem, a phenomenon known as apical dominance. If the apical meristem is removed, or has grown a sufficient distance away from an axillary bud, the axillary bud may become activated (or, more precisely, freed from hormone inhibition). Like the apical meristem, axillary buds can develop into a stem or flower. Diseases that affect axillary buds Certain plant diseases - notably phytoplasmas - can cause the proliferation of axillary buds, and cause plants to become bushy in appearance.
https://en.wikipedia.org/wiki/Osmoconformer
Osmoconformers are marine organisms that maintain an internal environment which is isotonic to their external environment. This means that the osmotic pressure of the organism's cells is equal to the osmotic pressure of their surrounding environment. By minimizing the osmotic gradient, this subsequently minimizes the net influx and efflux of water into and out of cells. Even though osmoconformers have an internal environment that is isosmotic to their external environment, the types of ions in the two environments differ greatly in order to allow critical biological functions to occur. An advantage of osmoconformation is that such organisms don’t need to expend as much energy as osmoregulators in order to regulate ion gradients. However, to ensure that the correct types of ions are in the desired location, a small amount of energy is expended on ion transport. A disadvantage to osmoconformation is that the organisms are subject to changes in the osmolarity of their environment. Examples Invertebrates Most osmoconformers are marine invertebrates such as echinoderms (such as starfish), mussels, marine crabs, lobsters, jellyfish, ascidians (sea squirts - primitive chordates), and scallops. Some insects are also osmoconformers. Some osmoconformers, such as echinoderms, are stenohaline, which means they can only survive in a limited range of external osmolarities. The survival of such organisms is thus contingent on their external osmotic environment remaining relatively constant. On the other hand, some osmoconformers are classified as euryhaline, which means they can survive in a broad range of external osmolarities. Mussels are a prime example of a euryhaline osmoconformer. Mussels have adapted to survive in a broad range of external salinities due to their ability to close their shells which allows them to seclude themselves from unfavorable external environments. Craniates There are a couple of examples of osmoconformers that are craniates such as ha
https://en.wikipedia.org/wiki/Brookings%20Report
Proposed Studies on the Implications of Peaceful Space Activities for Human Affairs, often referred to as "the Brookings Report", was a 1960 report commissioned by NASA and created by the Brookings Institution in collaboration with NASA's Committee on Long-Range Studies. It was submitted to the House Committee on Science and Astronautics of the United States House of Representatives in the 87th United States Congress on April 18, 1961. Significance The report has become noted for one short section entitled "The implications of a discovery of extraterrestrial life", which examines the potential implications of such a discovery on public attitudes and values. The section briefly considers possible public reactions to some possible scenarios for the discovery of extraterrestrial life, stressing a need for further research in this area. It recommended continuing studies to determine the likely social impact of such a discovery and its effects on public attitudes, including study of the question of how leadership should handle information about such a discovery and under what circumstances leaders might or might not find it advisable to withhold such information from the public. The significance of this section of the report is a matter of controversy. Persons who believe that extraterrestrial life has already been confirmed and that this information is being withheld by government from the public sometimes turn to this section of the report as support for their view. Frequently cited passages from this section of the report are drawn both from its main body and from its footnotes. The report has been mentioned in newspapers such as The New York Times, The Baltimore Sun, The Washington Times, and the Huffington Post. Background and context The report was entered into the Congressional Record, which is currently archived at over 1110 libraries as part of the Federal Depository Library Program. The main author Donald N. Michael was a "social psychologist with a backg
https://en.wikipedia.org/wiki/Strongly%20compact%20cardinal
In set theory, a branch of mathematics, a strongly compact cardinal is a certain kind of large cardinal. A cardinal κ is strongly compact if and only if every κ-complete filter can be extended to a κ-complete ultrafilter. Strongly compact cardinals were originally defined in terms of infinitary logic, where logical operators are allowed to take infinitely many operands. The logic on a regular cardinal κ is defined by requiring the number of operands for each operator to be less than κ; then κ is strongly compact if its logic satisfies an analog of the compactness property of finitary logic. Specifically, a statement which follows from some other collection of statements should also follow from some subcollection having cardinality less than κ. The property of strong compactness may be weakened by only requiring this compactness property to hold when the original collection of statements has cardinality below a certain cardinal λ; we may then speak of λ-compactness. A cardinal κ is weakly compact if and only if it is κ-compact; this was the original definition of that concept. Strong compactness implies measurability, and is implied by supercompactness. Given that the relevant cardinals exist, it is consistent with ZFC either that the first measurable cardinal is strongly compact, or that the first strongly compact cardinal is supercompact; these cannot both be true, however. A measurable limit of strongly compact cardinals is strongly compact, but the least such limit is not supercompact. The consistency strength of strong compactness is strictly above that of a Woodin cardinal. Some set theorists conjecture that the existence of a strongly compact cardinal is equiconsistent with that of a supercompact cardinal. However, a proof is unlikely until a canonical inner model theory for supercompact cardinals is developed. Extendibility is a second-order analog of strong compactness. See also List of large cardinal properties
https://en.wikipedia.org/wiki/FAAC
FAAC or Freeware Advanced Audio Coder is a software project which includes the AAC encoder FAAC and decoder FAAD2. It supports MPEG-2 AAC as well as MPEG-4 AAC. It supports several MPEG-4 Audio object types (LC, Main, LTP for encoding and SBR, PS, ER, LD for decoding), file formats (ADTS AAC, raw AAC, MP4), multichannel and gapless encoding/decoding and MP4 metadata tags. The encoder and decoder are compatible with standard-compliant audio applications using one or more of these object types and facilities. It also supports Digital Radio Mondiale. FAAC and FAAD2, being distributed in C source code form, can be compiled on various platforms and are distributed free of charge. FAAD2 is free software. FAAC contains some code which is published as Free Software, but as a whole it is only distributed under a proprietary license. FAAC was originally written by Menno Bakker. FAAC encoder FAAC stands for Freeware Advanced Audio Coder. The FAAC encoder is an audio compression program that creates AAC (MPEG-2 AAC/MPEG-4 AAC) sound files from other formats (usually CD-DA audio files). It contains a library (libfaac) that can be used by other programs. AAC files are commonly used in computer programs and portable music players, AAC being Apple Inc.'s recommended format for the company's iPod music player. Some of the features that FAAC has are: cross-platform support, "reasonably" fast encoding, support for more than one "object type" of the AAC format, multi-channel encoding, and support for Digital Radio Mondiale streams. It also supports multi-channel streams, like 5.1. The MPEG-4 object types of the AAC format supported by FAAC are "Low Complexity" (LC), "Main", and "Long Term Prediction" (LTP). The MPEG-2 AAC profiles supported by FAAC are LC and Main. The SBR and PS object types are not supported, so the HE-AAC and HE-AACv2 profiles are also not supported. The object type "Low Complexity" is the default and also happens to be used in videos meant to be play
https://en.wikipedia.org/wiki/Mauritia%20flexuosa
Mauritia flexuosa, known as the moriche palm, ité palm, ita, buriti, muriti, miriti (Brazil), canangucho (Colombia), morete (Ecuador), or aguaje (Peru), is a palm tree. It grows in and near swamps and other wet areas in tropical South America. Mauritia flexuosa, a tree, can reach up to in height. The large leaves form a rounded crown. The flowers are yellowish and appear from December to April. The fruit, which grows from December to June, is a chestnut color and is covered with shiny scales. The yellow flesh covers a hard, oval nut. The seeds float, and this is the means by which the palm tree propagates. In natural populations, the tree reaches very high densities. Fruit Moriche palm fruit ("morete" in the Oriente of Ecuador) is edible and used to make juice, jam, ice cream, a fermented "wine", desserts and snacks, requiring harvesting of more than 50 tonnes per day in Peru. The inflorescence buds are eaten as a vegetable and the sap can be drunk fresh or fermented (see palm wine). Threads and cords are locally produced from the tree's fibers. Humans consume palm weevil larvae (Rhynchophorus palmarum) which burrow in the tree trunk. Oil Buriti oil is an orange-reddish oil extracted from the fruit of the moriche palm. The oil contains high concentrations of oleic acid, tocopherols, and carotenoids, especially beta-carotene. The oil has a reddish color used as ink on hides and skins. Ecology This tree is important to many animal species; several bird species, such as the red-bellied macaw, sulphury flycatcher, and moriche oriole, use it for nesting and food. Tapirs, peccaries, fish and monkeys depend on the fruit. Alexander von Humboldt documented the tree and the ecosystem it supports in 1800 when traveling through the Llanos region of Venezuela. He "observed with astonishment how many things are connected with the existence of a single plant." He called it the "tree of life" and essentially described it as a keystone species although the concept would not
https://en.wikipedia.org/wiki/Fuzzy%20electronics
Fuzzy electronics is an electronic technology that uses fuzzy logic instead of the two-state Boolean logic more commonly used in digital electronics. Fuzzy electronics is fuzzy logic implemented on dedicated hardware, as opposed to fuzzy logic implemented in software running on a conventional processor. Fuzzy electronics has a wide range of applications, including control systems and artificial intelligence. History The first fuzzy electronic circuit was built by Takeshi Yamakawa et al. in 1980 using discrete bipolar transistors. The first industrial fuzzy application was in a cement kiln in Denmark in 1982. The first VLSI fuzzy electronics was by Masaki Togai and Hiroyuki Watanabe in 1984. In 1987, Yamakawa built the first analog fuzzy controller. The first digital fuzzy processors followed in 1988, developed by Togai (Russo, pp. 2-6). In the early 1990s, the first fuzzy logic chips were presented to the public. Omron and NEC announced the development of dedicated fuzzy electronic hardware in 1991. Two years later, the Japanese Omron Corporation demonstrated a working fuzzy chip at a technical fair. See also Defuzzification Fuzzy set Fuzzy set operations
https://en.wikipedia.org/wiki/Flavin%20mononucleotide
Flavin mononucleotide (FMN), or riboflavin-5′-phosphate, is a biomolecule produced from riboflavin (vitamin B2) by the enzyme riboflavin kinase and functions as the prosthetic group of various oxidoreductases, including NADH dehydrogenase, as well as a cofactor in biological blue-light photoreceptors. During the catalytic cycle, a reversible interconversion of the oxidized (FMN), semiquinone (FMNH•), and reduced (FMNH2) forms occurs in the various oxidoreductases. FMN is a stronger oxidizing agent than NAD and is particularly useful because it can take part in both one- and two-electron transfers. In its role as a blue-light photoreceptor, (oxidized) FMN stands out from 'conventional' photoreceptors in that the signaling state is not based on an E/Z isomerization. It is the principal form in which riboflavin is found in cells and tissues. It requires more energy to produce, but is more soluble than riboflavin. In cells, FMN occurs freely circulating but also in several covalently bound forms. Covalently or non-covalently bound FMN is a cofactor of many enzymes playing an important pathophysiological role in cellular metabolism. For example, dissociation of flavin mononucleotide from mitochondrial complex I has been shown to occur during ischemia/reperfusion brain injury during stroke. Food additive Flavin mononucleotide is also used as an orange-red food colour additive, designated in Europe as E number E101a. E106, a very closely related food dye, is riboflavin-5′-phosphate sodium salt, which consists mainly of the monosodium salt of the 5′-monophosphate ester of riboflavin. It is rapidly turned into free riboflavin after ingestion. It is found in many foods for babies and young children as well as jams, milk products, and sweets and sugar products. See also Flavin adenine dinucleotide
https://en.wikipedia.org/wiki/Amyloid%20beta
Amyloid beta (Aβ or Abeta) denotes peptides of 36–43 amino acids that are the main component of the amyloid plaques found in the brains of people with Alzheimer's disease. The peptides derive from the amyloid-beta precursor protein (APP), which is cleaved by beta secretase and gamma secretase to yield Aβ in a cholesterol-dependent process and substrate presentation. Aβ molecules can aggregate to form flexible soluble oligomers which may exist in several forms. It is now believed that certain misfolded oligomers (known as "seeds") can induce other Aβ molecules to also take the misfolded oligomeric form, leading to a chain reaction akin to a prion infection. The oligomers are toxic to nerve cells. The other protein implicated in Alzheimer's disease, tau protein, also forms such prion-like misfolded oligomers, and there is some evidence that misfolded Aβ can induce tau to misfold. A study has suggested that APP and its amyloid potential is of ancient origins, dating as far back as early deuterostomes. Normal function The normal function of Aβ is not well understood. Though some animal studies have shown that the absence of Aβ does not lead to any obvious loss of physiological function, several potential activities have been discovered for Aβ, including activation of kinase enzymes, protection against oxidative stress, regulation of cholesterol transport, functioning as a transcription factor, and anti-microbial activity (potentially associated with Aβ's pro-inflammatory activity). The glymphatic system clears metabolic waste from the mammalian brain, and in particular amyloid beta. A number of proteases have been implicated by both genetic and biochemical studies as being responsible for the recognition and degradation of amyloid beta; these include insulin degrading enzyme and presequence protease. The rate of removal is significantly increased during sleep. However, the significance of the glymphatic system in Aβ clearance in Alzheimer's disease is unknown. D
https://en.wikipedia.org/wiki/Superhydrophilicity
Superhydrophilicity refers to the phenomenon of excess hydrophilicity, or attraction to water; in superhydrophilic materials, the contact angle of water is equal to zero degrees. This effect was discovered in 1995 by the Research Institute of Toto Ltd. for titanium dioxide irradiated by sunlight. Under light irradiation, water dropped onto titanium dioxide forms no contact angle (almost 0 degrees). Superhydrophilic material has various advantages. For example, it can defog glass, and it can also enable oil spots to be swept away easily with water. Such materials are already commercialized as door mirrors for cars, coatings for buildings, self-cleaning glass, etc. Several mechanisms of this superhydrophilicity have been proposed by researchers. One is the change of the surface structure to a metastable structure, and another is cleaning the surface by the photodecomposition of dirt such as organic compounds adsorbed on the surface, after either of which water molecules can adsorb to the surface. The mechanism is still controversial, and it is too soon to decide which suggestion is correct. To decide, atomic scale measurements and other studies will be necessary. See also Superhydrophobicity, the opposite phenomenon
https://en.wikipedia.org/wiki/Interventionism%20%28medicine%29
Interventionism, when discussing the practice of medicine, is generally a derogatory term used by critics of a medical model in which patients are viewed as passive recipients receiving external treatments provided by the physician that have the effect of prolonging life, or at least of providing a subjective sense of doing everything possible. Interventionism is commonly encouraged by terminally ill patients and their family members when they are emotionally unprepared to acknowledge that the patient is going to die. Most healthcare providers are uncomfortable telling people that further cure-oriented or life-extending treatment is futile medical care, and patients and families are frequently angry with the provider or feel rejected by the provider when they are given accurate, but negative, information about the patient's prospects. In nearly all cases, "something" can be done for the patient, and families often reward and encourage a provider who proposes a string of useless and often directly harmful treatments; as a result, it is easier for providers to substitute worthless and expensive activity than to honestly admit that nothing will extend the patient's life. Interventionism is related to optimism bias. This is the belief that the patient will beat the odds, no matter how unlikely this might be. Optimism bias encourages patients to undertake treatments that have only tiny chances of success, in the erroneous and irrational belief that they will be part of the tiny minority that is successful, rather than part of the vast majority who are not. With terminally ill patients, the attitude of interventionism prevents providers and patients from taking full advantage of palliative care options. The primary focus for palliative care is improving the patient's immediate, daily life through better management of medications, practical assistance, planning for possible complications, and other services. Patients who use palliative care services usually live l
https://en.wikipedia.org/wiki/Vanguard%20%28microkernel%29
Vanguard is a discontinued experimental microkernel developed at Apple Computer, in the research-oriented Apple Advanced Technology Group (ATG) in the early 1990s. Based on the V-System, Vanguard introduced standardized object identifiers and a unique message chaining system for improved performance. Vanguard was not used in any of Apple's commercial products. Development ended in 1993 when Ross Finlayson, the project's main investigator, left Apple. Basic concepts Vanguard was generally very similar to the V-System, but added support for true object-oriented programming of the operating system. This meant that kernel and server interfaces were exported as objects, which could be inherited and extended in new code. This change has no visible effect on the system, it is mainly a change in the source code that makes programming easier. For example, Vanguard had an input/output (I/O) class which was supported by several different servers, for example, networking and file servers, which new applications could interact with by importing the I/O interface and calling methods. This also made writing new servers much easier, because they had a standard to program to, and were able to share code more easily. V messaging semantics A key concept to almost all microkernels is to break down one larger kernel into a set of communicating servers. Instead of having one larger program controlling all of the hardware of a computer system, the various duties are apportioned among smaller programs that are given rights to control different parts of the machine. For example, one server can be given control of the networking hardware, while another has the task of managing the hard disk drives. Another server would handle the file system, calling both of these lower-level servers. User applications ask for services by sending messages to these servers, using some form of inter-process communications (IPC), in contrast to asking the kernel to do this work via a system call (syscall) or
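The client–server message flow described above can be sketched in a few lines. This is a hypothetical Python toy, not Vanguard code; all class, method, and path names are invented for illustration:

```python
# Toy sketch of microkernel-style message passing: user applications send
# messages to named user-space servers instead of making syscalls.
# (Illustrative only; does not reflect Vanguard's actual interfaces.)

class Server:
    """A user-space server exporting named operations."""
    def __init__(self, name):
        self.name = name
        self.ops = {}

    def export(self, op, fn):
        self.ops[op] = fn

    def handle(self, op, *args):
        return self.ops[op](*args)

class Kernel:
    """Minimal IPC router: delivers messages to registered servers."""
    def __init__(self):
        self.servers = {}

    def register(self, server):
        self.servers[server.name] = server

    def send(self, server_name, op, *args):
        # In a real microkernel this would be a trap plus a message copy;
        # here it is a direct call for illustration.
        return self.servers[server_name].handle(op, *args)

kernel = Kernel()
fs = Server("file_server")
store = {"/etc/motd": b"hello"}          # hypothetical file contents
fs.export("read", lambda path: store[path])
kernel.register(fs)

data = kernel.send("file_server", "read", "/etc/motd")
```

The point of the sketch is the indirection: the client names a server and an operation, and the kernel's only job is routing the message.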
https://en.wikipedia.org/wiki/Location%20transparency
In computer networks, location transparency is the use of names to identify network resources, rather than their actual location. For example, files are accessed by a unique file name, but the actual data is stored in physical sectors scattered around a disk, either in the local computer or in a network. In a location-transparent system, the actual location where the file is stored doesn't matter to the user. A distributed system will need to employ a networked scheme for naming resources. The main benefit of location transparency is that it no longer matters where the resource is located. Depending on how the network is set up, the user may be able to obtain files that reside on another computer connected to the particular network. This means that the location of a resource doesn't matter to either the software developers or the end-users. This creates the illusion that the entire system is located in a single computer, which greatly simplifies software development. An additional benefit is the flexibility it provides. System resources can be moved to a different computer at any time without disrupting any software systems running on them. By simply updating the location that goes with the named resource, every program using that resource will be able to find it. Location transparency also benefits end-users directly: a resource can be accessed by anyone who can connect to the network, knows the resource's name, and has the proper security credentials. See also Transparency (computing)
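The "update the location behind the name" idea can be illustrated with a toy registry (a hypothetical Python sketch; the names and paths are invented):

```python
# Minimal illustration of location transparency: clients resolve a stable
# name; a registry maps the name to its current physical location.

registry = {"reports/q1.txt": "server-a:/disk1/q1.txt"}

def resolve(name):
    """Return the current physical location for a named resource."""
    return registry[name]

def migrate(name, new_location):
    """Move the resource; clients keep using the same name."""
    registry[name] = new_location

loc1 = resolve("reports/q1.txt")
migrate("reports/q1.txt", "server-b:/disk7/q1.txt")
loc2 = resolve("reports/q1.txt")
```

Clients that only ever call `resolve` are unaffected by the migration, which is exactly the property the article describes.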
https://en.wikipedia.org/wiki/Arbutus%20unedo
Arbutus unedo is an evergreen shrub or small tree in the family Ericaceae, native to the Mediterranean Basin and Western Europe. The tree is well known for its fruits, the arbutus berry, which bear some resemblance to the strawberry, hence the common name strawberry tree. However, it is not closely related to true strawberries of the genus Fragaria. Its presence in Ireland also lends it the name "Irish strawberry tree", or cain, or cane apple (from the Irish name for the tree, caithne), or sometimes "Killarney strawberry tree". The strawberry tree is the national tree of Italy because of its green leaves, its white flowers and its red berries, colors that recall the Italian flag. Taxonomy Arbutus unedo was one of the many species described by Carl Linnaeus in Volume One of his landmark 1753 work Species Plantarum, giving it the name it still bears today. A study published in 2001 which analyzed ribosomal DNA from Arbutus and related genera found Arbutus to be paraphyletic, and A. unedo to be closely related to the other Mediterranean Basin species such as A. andrachne and A. canariensis and not to the western North American members of the genus. Arbutus unedo and A. andrachne hybridise naturally where their ranges overlap; the hybrid has been named Arbutus × andrachnoides (syn. A. × hybrida, or A. andrachne × unedo), inheriting traits of both parent species, though fruits are not usually borne freely, and as a hybrid is unlikely to breed true from seed. It is sold in California as Arbutus x Marina named for a district in San Francisco where it was hybridized. Description Arbutus unedo grows to tall, rarely up to , with a trunk diameter of up to . It grows in hardiness zones 7–10. The leaves are green and glossy on the upper side, dull on the underside, long and broad, laurel-like and with a serrated or serrulated margin. The hermaphrodite flowers are white (yellow when desiccated), bell-shaped, in diameter, and flower from a reddish hanging panicle in au
https://en.wikipedia.org/wiki/Weinberg%20angle
The weak mixing angle or Weinberg angle is a parameter in the Weinberg–Salam theory of the electroweak interaction, part of the Standard Model of particle physics, and is usually denoted as θ_W. It is the angle by which spontaneous symmetry breaking rotates the original W3 and B vector boson plane, producing as a result the Z0 boson and the photon. Its measured value is slightly below 30°, and it also runs, increasing very slightly, with the momentum transfer of the interaction in which it is measured. Details The algebraic formula for the combination of the W3 and B vector bosons (i.e. 'mixing') that simultaneously produces the massive Z0 boson and the massless photon (γ) is expressed by the formulas γ = B cos θ_W + W3 sin θ_W and Z0 = −B sin θ_W + W3 cos θ_W. The weak mixing angle also gives the relationship between the masses of the W and Z bosons (denoted as m_W and m_Z), m_W = m_Z cos θ_W. The angle can be expressed in terms of the SU(2) and U(1) couplings (weak isospin g and weak hypercharge g′, respectively), cos θ_W = g / √(g² + g′²) and sin θ_W = g′ / √(g² + g′²). The electric charge is then expressible in terms of it, e = g sin θ_W = g′ cos θ_W (refer to the figure). Because the value of the mixing angle is currently determined empirically, in the absence of any superseding theoretical derivation it is mathematically defined as sin²θ_W = 1 − (m_W/m_Z)². The value of sin²θ_W varies as a function of the momentum transfer, Q, at which it is measured. This variation, or 'running', is a key prediction of the electroweak theory. The most precise measurements have been carried out in electron–positron collider experiments at a value of Q corresponding to the mass of the Z0 boson, m_Z ≈ 91.2 GeV. In practice, the quantity sin²θ_W is more frequently used. The 2004 best estimate of sin²θ_W, at Q = m_Z, in the MS-bar scheme is 0.23120 ± 0.00015, which is an average over measurements made in different processes, at different detectors. Atomic parity violation experiments yield values for sin²θ_W at smaller values of Q, below 0.01 GeV/c, but with much lower precision. In 2005 results were published from a study of parity violation in Møller scattering in which a value of sin²θ_W = 0.2397 ± 0.0013 was obtained at Q = 0.16 GeV/c, establishing
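The on-shell relation between the boson masses and the mixing angle can be checked numerically. The sketch below uses approximate published masses (m_W ≈ 80.4 GeV, m_Z ≈ 91.19 GeV) and is only illustrative:

```python
import math

# Approximate W and Z boson masses in GeV (illustrative values).
m_W = 80.4
m_Z = 91.19

# On-shell definition: sin^2(theta_W) = 1 - (m_W / m_Z)^2
sin2_theta_W = 1.0 - (m_W / m_Z) ** 2
theta_W_deg = math.degrees(math.asin(math.sqrt(sin2_theta_W)))

# Consistency check of the mass relation m_W = m_Z * cos(theta_W)
assert abs(m_W - m_Z * math.cos(math.radians(theta_W_deg))) < 1e-9
```

With these inputs sin²θ_W comes out near 0.22 and θ_W just under 30°, consistent with the text's statement that the measured angle is slightly below 30°. (The MS-bar value quoted in the article differs slightly because it is defined in a different renormalization scheme.)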
https://en.wikipedia.org/wiki/Scanning%20tunneling%20spectroscopy
Scanning tunneling spectroscopy (STS), an extension of scanning tunneling microscopy (STM), is used to provide information about the density of electrons in a sample as a function of their energy. In scanning tunneling microscopy, a metal tip is moved over a conducting sample without making physical contact. A bias voltage applied between the sample and tip allows a current to flow between the two. This is a result of quantum tunneling across a barrier; in this instance, the physical distance between the tip and the sample. The scanning tunneling microscope is used to obtain "topographs" - topographic maps - of surfaces. The tip is rastered across a surface and (in constant current mode) a constant current is maintained between the tip and the sample by adjusting the height of the tip. A plot of the tip height at all measurement positions provides the topograph. These topographic images can obtain atomically resolved information on metallic and semi-conducting surfaces. However, the scanning tunneling microscope does not measure the physical height of surface features. One such example of this limitation is an atom adsorbed onto a surface. The image will result in some perturbation of the height at this point. A detailed analysis of the way in which an image is formed shows that the transmission of the electric current between the tip and the sample depends on two factors: (1) the geometry of the sample and (2) the arrangement of the electrons in the sample. The arrangement of the electrons in the sample is described quantum mechanically by an "electron density". The electron density is a function of both position and energy, and is formally described as the local density of electron states, abbreviated as local density of states (LDOS), which is a function of energy. Spectroscopy, in its most general sense, refers to a measurement of the number of something as a function of energy. For scanning tunneling spectroscopy the scanning tunneling microscope is use
https://en.wikipedia.org/wiki/Sales%20per%20unit%20area
In retail, sales per unit area is a standard and usually the primary measurement of store success. The unit of area is usually square metres in the metric system or square feet in U.S. customary units. Square feet are also widely used in retailing in the United Kingdom, but there are signs of a trend towards use of square meters. Sales levels in the United States As of 2005 annual store sales in the range of $300 per square foot ($3,000/m2) is considered a respectable result in the United States as the national average for regional malls is $341 per square foot, but the target number depends on the location, the type of store and other factors. For example, the Forum Shops at Caesars Palace in Las Vegas sets a precedent for Las Vegas stores. The location has the second highest sales per square foot of any mall in the nation at approximately $1,300 per square foot (Bal Harbour Shops is first with over $2,500 per square foot). The average for specialty apparel retailers, for instance, is $400 per square foot ($4,400/m2), and according to Baseline Magazine the retailer Hot Topic achieves an annual $619 per square foot ($6,660/m2). According to industry research firm RetailSails, Apple has the highest sales per square foot, with average in all their stores of $6,050 per square foot annually. Among shopping mall retailers the food court area, considered as a single store, and jewelers post the highest sales per unit area, in the range of $600 per square foot ($6,600/m2). Books and sporting goods are among the lowest.
https://en.wikipedia.org/wiki/Error%20catastrophe
Error catastrophe refers to the cumulative loss of genetic information in a lineage of organisms due to high mutation rates. The mutation rate above which error catastrophe occurs is called the error threshold. Both terms were coined by Manfred Eigen in his mathematical evolutionary theory of the quasispecies. The term is most widely used to refer to mutation accumulation to the point of inviability of the organism or virus, where it cannot produce enough viable offspring to maintain a population. This use of Eigen's term was adopted by Lawrence Loeb and colleagues to describe the strategy of lethal mutagenesis to cure HIV by using mutagenic ribonucleoside analogs. There was an earlier use of the term introduced in 1963 by Leslie Orgel in a theory for cellular aging, in which errors in the translation of proteins involved in protein translation would amplify the errors until the cell was inviable. This theory has not received empirical support. Error catastrophe is predicted in certain mathematical models of evolution and has also been observed empirically. Like every organism, viruses 'make mistakes' (or mutate) during replication. The resulting mutations increase biodiversity among the population and help subvert the ability of a host's immune system to recognise it in a subsequent infection. The more mutations the virus makes during replication, the more likely it is to avoid recognition by the immune system and the more diverse its population will be (see the article on biodiversity for an explanation of the selective advantages of this). However, if it makes too many mutations, it may lose some of its biological features which have evolved to its advantage, including its ability to reproduce at all. The question arises: how many mutations can be made during each replication before the population of viruses begins to lose self-identity? Basic mathematical model Consider a virus which has a genetic identity modeled by a string of ones and zeros (e.g.
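The question posed above can be explored with a toy simulation. This is an illustrative Python sketch, not Eigen's full quasispecies model, and all parameters are invented: genomes are bitstrings, the all-zeros "master" genome replicates faster, and above a critical per-bit mutation rate the master sequence is lost from the population.

```python
import random

def mutate(genome, mu, rng):
    # Flip each bit independently with probability mu.
    return [b ^ (rng.random() < mu) for b in genome]

def simulate(length=20, mu=0.01, pop=200, gens=100, seed=1):
    """Return the final fraction of 'master' (all-zeros) genomes."""
    rng = random.Random(seed)
    population = [[0] * length for _ in range(pop)]
    for _ in range(gens):
        # Selection: the master sequence replicates twice as fast.
        weights = [2.0 if sum(g) == 0 else 1.0 for g in population]
        parents = rng.choices(population, weights=weights, k=pop)
        # Replication with per-bit copying errors.
        population = [mutate(g, mu, rng) for g in parents]
    return sum(sum(g) == 0 for g in population) / pop

frac_low = simulate(mu=0.005)   # below the error threshold: master persists
frac_high = simulate(mu=0.2)    # far above it: master is essentially lost
```

In this toy model the selective advantage (here 2x) sets the error threshold: once the per-genome mutation probability outweighs it, selection can no longer hold the population at the master sequence, which is the error catastrophe in miniature.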
https://en.wikipedia.org/wiki/EMachines%20eOne
The eOne is an all-in-one desktop computer that was produced by eMachines in 1999. It resembles Apple's "Bondi Blue" iMac. Apple sued eMachines for allegedly infringing upon the distinctive trade dress of the iMac with the eOne. Apple and eMachines settled the case in 2000, which required the model to be discontinued. History and legal issues Upon its release in 1999, the eOne came with a translucent "cool blue" case, while the original iMac had a two-toned case with "Bondi Blue" accents. At US$799, the eOne was also cheaper than the US$1,199 iMac. eMachines hoped to avoid legal trouble because the shape of the computer was different from the iMac. However, Apple sued eMachines, alleging that the computer's design infringed upon the protected trade dress of the iMac. In March 2000, eMachines reached a settlement with Apple, under which it agreed to discontinue the infringing model. The eOne was available at Circuit City and Micro Center, but it sold poorly during the few months it was on the market; Apple's lawsuit contributed to the eOne being widely considered a failure for eMachines. The eOne was discontinued in 2002, and due to its lackluster sales, it is rare on the secondary market. Technical specifications The eOne had a 433 MHz Intel Celeron microprocessor, 64 megabytes of PC-100 SDRAM, a 15-inch CRT monitor, a 10BASE-T Ethernet port, a floppy drive, an 8 MB ATI video card, a 56k modem, and a CD-ROM drive, along with the ability to use two PC cards, which were commonly used to expand the capabilities of laptops. As a Wintel-based computer, the eOne ran Windows 98 or Windows Me depending on the time of manufacture, as opposed to the iMac running Mac OS 8 or Mac OS 9. Legacy In 2007, three years after acquiring eMachines, Gateway released the One, an all-in-one desktop computer similar to the eOne but in black and utilizing a flat-screen monitor.
https://en.wikipedia.org/wiki/Cleaning%20station
A cleaning station is a location where aquatic life congregate to be cleaned by smaller creatures. Such stations exist in both freshwater and marine environments, and are used by animals including fish, sea turtles and hippos, referred to as clients. The cleaning process includes the removal of parasites from the animal's body (both externally and internally), and is performed by various smaller animals including cleaner shrimp and numerous species of cleaner fish, especially wrasses and gobies (Elacatinus spp.), collectively referred to as cleaners. When the animal approaches a cleaning station, it will open its mouth wide or position its body in such a way as to signal that it needs to be cleaned. The cleaner fish will then remove and eat the parasites from the skin, even swimming into the mouth and gills of any fish being cleaned. This is a form of cleaning symbiosis. How predator clients recognize cleaners is still uncertain. It has been hypothesized that color, size, and pattern indicate to clients that an organism is a cleaner. For example, cleaning gobies tend to exhibit full-body lateral stripes, unlike their non-cleaning counterparts, which exhibit shorter lateral stripes. Cleaners also tend to be smaller because in fish species, usually juveniles are cleaners. Cleaning stations may be associated with coral reefs, located either on top of a coral head or in a slot between two outcroppings. Other cleaning stations may be located under large clumps of floating seaweed or at an accepted point in a river or lagoon. Cleaning stations are an exhibition of mutualism between cleaners and clients. Cleaner fish may also impact species diversity around coral reefs. Some clients have smaller home ranges and can only access one cleaning station. Clients with larger home ranges are able to access a variety of cleaning stations and are capable of choosing between cleaning stations. Visitor clients travel long distances to a cleaning station and are not local to the
https://en.wikipedia.org/wiki/Amyloid-beta%20precursor%20protein
Amyloid-beta precursor protein (APP) is an integral membrane protein expressed in many tissues and concentrated in the synapses of neurons. It functions as a cell surface receptor and has been implicated as a regulator of synapse formation, neural plasticity, antimicrobial activity, and iron export. It is coded for by the gene APP and regulated by substrate presentation. APP is best known as the precursor molecule whose proteolysis generates amyloid beta (Aβ), a polypeptide containing 37 to 49 amino acid residues, whose amyloid fibrillar form is the primary component of amyloid plaques found in the brains of Alzheimer's disease patients. Genetics Amyloid-beta precursor protein is an ancient and highly conserved protein. In humans, the gene APP is located on chromosome 21 and contains 18 exons spanning 290 kilobases. Several alternative splicing isoforms of APP have been observed in humans, ranging in length from 639 to 770 amino acids, with certain isoforms preferentially expressed in neurons; changes in the neuronal ratio of these isoforms have been associated with Alzheimer's disease. Homologous proteins have been identified in other organisms such as Drosophila (fruit flies), C. elegans (roundworms), and all mammals. The amyloid beta region of the protein, located in the membrane-spanning domain, is not well conserved across species and has no obvious connection with APP's native-state biological functions. Mutations in critical regions of amyloid precursor protein, including the region that generates amyloid beta (Aβ), cause familial susceptibility to Alzheimer's disease. For example, several mutations outside the Aβ region associated with familial Alzheimer's have been found to dramatically increase production of Aβ. A mutation (A673T) in the APP gene protects against Alzheimer's disease. This substitution is adjacent to the beta secretase cleavage site and results in a 40% reduction in the formation of amyloid beta in vitro. Structure A number of diff
https://en.wikipedia.org/wiki/Interleukin%204
Interleukin 4 (IL4, IL-4) is a cytokine that induces differentiation of naive helper T cells (Th0 cells) to Th2 cells. Upon activation by IL-4, Th2 cells subsequently produce additional IL-4 in a positive feedback loop. IL-4 is produced primarily by mast cells, Th2 cells, eosinophils and basophils. It is closely related to IL-13 and has similar functions. Function Interleukin 4 has many biological roles, including the stimulation of activated B cell and T cell proliferation, and the differentiation of B cells into plasma cells. It is a key regulator in humoral and adaptive immunity. IL-4 induces B cell class switching to IgE, and up-regulates MHC class II production. IL-4 decreases the production of Th1 cells, macrophages, IFNγ, and dendritic-cell IL-12. Overproduction of IL-4 is associated with allergies. Inflammation and wound repair Tissue macrophages play an important role in chronic inflammation and wound repair. The presence of IL-4 in extravascular tissues promotes alternative activation of macrophages into M2 cells and inhibits classical activation of macrophages into M1 cells. An increase in repair macrophages (M2) is coupled with secretion of IL-10 and TGF-β that results in a diminution of pathological inflammation. Release of arginase, proline, polyaminases and TGF-β by the activated M2 cell is tied to wound repair and fibrosis. Receptor The receptor for interleukin-4 is known as IL-4Rα. This receptor exists in 3 different complexes throughout the body. Type 1 receptors are composed of the IL-4Rα subunit with a common γ chain and specifically bind IL-4. Type 2 receptors consist of an IL-4Rα subunit bound to a different subunit known as IL-13Rα1. These type 2 receptors have the ability to bind both IL-4 and IL-13, two cytokines with closely related biological functions. Structure IL-4 has a compact, globular fold (similar to other cytokines), stabilised by 3 disulphide bonds. One half of the structure is dominated by a 4 alpha-hel
https://en.wikipedia.org/wiki/Van%20der%20Pol%20oscillator
In the study of dynamical systems, the van der Pol oscillator (named for Dutch physicist Balthasar van der Pol) is a non-conservative, oscillating system with non-linear damping. It evolves in time according to the second-order differential equation $\ddot{x} - \mu(1 - x^2)\dot{x} + x = 0$, where $x$ is the position coordinate—which is a function of the time $t$—and $\mu$ is a scalar parameter indicating the nonlinearity and the strength of the damping. History The Van der Pol oscillator was originally proposed by the Dutch electrical engineer and physicist Balthasar van der Pol while he was working at Philips. Van der Pol found stable oscillations, which he subsequently called relaxation-oscillations and are now known as a type of limit cycle, in electrical circuits employing vacuum tubes. When these circuits are driven near the limit cycle, they become entrained, i.e. the driving signal pulls the current along with it. Van der Pol and his colleague, van der Mark, reported in the September 1927 issue of Nature that at certain drive frequencies an irregular noise was heard, which was later found to be the result of deterministic chaos. The Van der Pol equation has a long history of being used in both the physical and biological sciences. For instance, in biology, Fitzhugh and Nagumo extended the equation in a planar field as a model for action potentials of neurons. The equation has also been utilised in seismology to model the two plates in a geological fault, and in studies of phonation to model the right and left vocal fold oscillators. Two-dimensional form Liénard's theorem can be used to prove that the system has a limit cycle. Applying the Liénard transformation $y = x - x^3/3 - \dot{x}/\mu$, where the dot indicates the time derivative, the Van der Pol oscillator can be written in its two-dimensional form: $\dot{x} = \mu\left(x - \tfrac{1}{3}x^3 - y\right)$, $\dot{y} = \tfrac{1}{\mu}x$. Another commonly used form based on the transformation $y = \dot{x}$ leads to: $\dot{x} = y$, $\dot{y} = \mu(1 - x^2)y - x$. Results for the unforced oscillator When $\mu = 0$, i.e. there is no damping function, the equation becomes $\ddot{x} + x = 0$. This is a form of the simple harmonic oscillator, and there
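The unforced equation is easy to explore numerically. The sketch below (an illustrative integration written for this summary, not code from any cited source) advances the phase-plane form $\dot{x} = v$, $\dot{v} = \mu(1 - x^2)v - x$ with a classical fourth-order Runge–Kutta step and observes the trajectory settling onto the limit cycle, whose amplitude is close to 2 for moderate $\mu$:

```python
def vdp_deriv(state, mu):
    """Van der Pol as a first-order system: x' = v, v' = mu*(1 - x^2)*v - x."""
    x, v = state
    return (v, mu * (1.0 - x * x) * v - x)

def rk4_step(state, mu, dt):
    """One classical fourth-order Runge-Kutta step."""
    def add(s, k, h):
        return (s[0] + h * k[0], s[1] + h * k[1])
    k1 = vdp_deriv(state, mu)
    k2 = vdp_deriv(add(state, k1, dt / 2), mu)
    k3 = vdp_deriv(add(state, k2, dt / 2), mu)
    k4 = vdp_deriv(add(state, k3, dt), mu)
    return (state[0] + dt / 6 * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0]),
            state[1] + dt / 6 * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1]))

def simulate(mu=1.0, x0=0.5, v0=0.0, dt=0.01, steps=20000):
    """Integrate from an arbitrary start; return the x-trajectory."""
    state = (x0, v0)
    xs = []
    for _ in range(steps):
        state = rk4_step(state, mu, dt)
        xs.append(state[0])
    return xs

xs = simulate()
# Discard the transient; the remaining oscillation sits on the limit cycle.
amplitude = max(abs(x) for x in xs[len(xs) // 2:])
```

Any initial condition other than the unstable fixed point at the origin is attracted to the same cycle, which is what makes the oscillator a standard example of self-sustained oscillation.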
https://en.wikipedia.org/wiki/KRON-TV
KRON-TV (channel 4) is a television station licensed to San Francisco, California, United States, serving as the San Francisco Bay Area's outlet for The CW Television Network. The station also maintains a secondary affiliation with MyNetworkTV. Owned and operated by The CW's majority owner, Nexstar Media Group, KRON-TV has studios on Front Street in the city's historic Northeast Waterfront, in the same building as ABC owned-and-operated station KGO-TV, channel 7 (but with completely separate operations from that station). The transmitting antenna is located atop Sutro Tower in San Francisco. History NBC affiliation (1949–2001) In 1948, the Federal Communications Commission (FCC) authorized a construction permit by the Chronicle Publishing Company, publishers of the San Francisco Chronicle daily newspaper, for a new television station in San Francisco, KRON-TV. Chronicle Publishing was founded by brothers Charles and Michael de Young. The company already owned radio station KRON-FM. Managed by Michael de Young's grandson Charles de Young Thieriot, KRON signed on the air on November 15, 1949, as a full-time NBC affiliate. Its opening night program schedule included a special about San Francisco entertainment followed by the usual NBC prime time lineup of the Texaco Star Theater with Milton Berle, The Life of Riley, Mohawk Showroom, and The Chesterfield Supper Club. KRON-TV was the third television outlet in the Bay Area behind KGO-TV (channel 7) and KPIX-TV (channel 5), all going on the air within a year, and the last license before the FCC placed a moratorium on new television station licenses that would last the next four years. KRON-TV originally broadcast from studios located in the basement of the Chronicle Building at Fifth and Mission Streets. Newscasts benefited from the resources of the Chronicle and there was cooperation between KRON-TV and the newspaper. It originally maintained transmitter facilities, master control and a small insert studio on San Br
https://en.wikipedia.org/wiki/Geobiology
Geobiology is a field of scientific research that explores the interactions between the physical Earth and the biosphere. It is a relatively young field, and its borders are fluid. There is considerable overlap with the fields of ecology, evolutionary biology, microbiology, paleontology, and particularly soil science and biogeochemistry. Geobiology applies the principles and methods of biology, geology, and soil science to the study of the ancient history of the co-evolution of life and Earth as well as the role of life in the modern world. Geobiologic studies tend to be focused on microorganisms, and on the role that life plays in altering the chemical and physical environment of the pedosphere, which exists at the intersection of the lithosphere, atmosphere, hydrosphere and/or cryosphere. It differs from biogeochemistry in that the focus is on processes and organisms over space and time rather than on global chemical cycles. Geobiological research synthesizes the geologic record with modern biologic studies. It deals with process - how organisms affect the Earth and vice versa - as well as history - how the Earth and life have changed together. Much research is grounded in the search for fundamental understanding, but geobiology can also be applied, as in the case of microbes that clean up oil spills. Geobiology employs molecular biology, environmental microbiology, organic geochemistry, and the geologic record to investigate the evolutionary interconnectedness of life and Earth. It attempts to understand how the Earth has changed since the origin of life and what it might have been like along the way. Some definitions of geobiology even push the boundaries of this time frame - to understanding the origin of life and to the role that humans have played and will continue to play in shaping the Earth in the Anthropocene. History The term geobiology was coined by Lourens Baas Becking in 1934. In his words, geobiology "is an attempt to describe the relationship b
https://en.wikipedia.org/wiki/Szemer%C3%A9di%20regularity%20lemma
Szemerédi's regularity lemma is one of the most powerful tools in extremal graph theory, particularly in the study of large dense graphs. It states that the vertices of every large enough graph can be partitioned into a bounded number of parts so that the edges between different parts behave almost randomly. According to the lemma, no matter how large a graph is, we can approximate it with the edge densities between a bounded number of parts. Between any two parts, the distribution of edges will be pseudorandom as per the edge density. These approximations provide essentially correct values for various properties of the graph, such as the number of embedded copies of a given subgraph or the number of edge deletions required to remove all copies of some subgraph. Statement To state Szemerédi's regularity lemma formally, we must formalize what the edge distribution between parts behaving 'almost randomly' really means. By 'almost random', we're referring to a notion called $\varepsilon$-regularity. To understand what this means, we first state some definitions. In what follows $G$ is a graph with vertex set $V$. Definition 1. Let $X, Y$ be disjoint subsets of $V$. The edge density of the pair $(X, Y)$ is defined as: $d(X,Y) = \frac{|E(X,Y)|}{|X||Y|}$, where $E(X,Y)$ denotes the set of edges having one end vertex in $X$ and one in $Y$. We call a pair of parts $\varepsilon$-regular if, whenever you take a large subset of each part, their edge density isn't too far off the edge density of the pair of parts. Formally, Definition 2. For $\varepsilon > 0$, a pair of vertex sets $X$ and $Y$ is called $\varepsilon$-regular if, for all subsets $A \subseteq X$, $B \subseteq Y$ satisfying $|A| \geq \varepsilon|X|$, $|B| \geq \varepsilon|Y|$, we have $|d(A,B) - d(X,Y)| \leq \varepsilon$. The natural way to define an $\varepsilon$-regular partition would be one where each pair of parts is $\varepsilon$-regular. However, some graphs, such as the half graphs, require many pairs of partitions (but a small fraction of all pairs) to be irregular. So we shall define $\varepsilon$-regular partitions to be ones where most pairs of parts are $\varepsilon$-regular. Definition 3. A partition of $V$ into $k$ sets $V_1, \ldots, V_k$ is called an $\varepsilon$-regular partition if the sum of $|V_i||V_j|$ over all pairs $(V_i, V_j)$ that are not $\varepsilon$-regular is at most $\varepsilon|V|^2$. Now we can state the lemma: Szemerédi'
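For very small graphs the definitions can be checked directly. The sketch below (illustrative only; the subset enumeration is exponential, so it is useless beyond toy sizes) computes the edge density of Definition 1 and brute-forces the ε-regularity condition of Definition 2:

```python
import math
from itertools import combinations, product

def edge_density(edges, X, Y):
    """Definition 1: d(X, Y) = |E(X, Y)| / (|X| |Y|) for disjoint X, Y.
    Edges are stored as frozensets of endpoint pairs."""
    e = sum(1 for x, y in product(X, Y) if frozenset((x, y)) in edges)
    return e / (len(X) * len(Y))

def is_eps_regular(edges, X, Y, eps):
    """Definition 2, by brute force: every A in X, B in Y with
    |A| >= eps|X| and |B| >= eps|Y| must satisfy
    |d(A, B) - d(X, Y)| <= eps.  Exponential in |X| + |Y|."""
    d = edge_density(edges, X, Y)
    amin = max(1, math.ceil(eps * len(X)))
    bmin = max(1, math.ceil(eps * len(Y)))
    for ka in range(amin, len(X) + 1):
        for A in combinations(X, ka):
            for kb in range(bmin, len(Y) + 1):
                for B in combinations(Y, kb):
                    if abs(edge_density(edges, A, B) - d) > eps:
                        return False
    return True

X, Y = [0, 1, 2], [3, 4, 5]
# A complete bipartite pair has density 1 between any pair of subsets,
# so it is eps-regular for every eps.
complete = {frozenset((x, y)) for x, y in product(X, Y)}
# A "star" (all edges from one vertex of X) is far from regular:
# subsets avoiding that vertex have density 0, far from d = 1/3.
star = {frozenset((0, y)) for y in Y}
```

The lemma's power is that the number of parts needed depends only on ε, never on the size of the graph, though the dependence on ε is notoriously enormous (a tower of exponentials).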
https://en.wikipedia.org/wiki/Attitude%20change
Attitudes are associated beliefs and behaviors towards some object. They are not stable, and because of the communication and behavior of other people, are subject to change by social influences, as well as by the individual's motivation to maintain cognitive consistency when cognitive dissonance occurs—when two attitudes or attitude and behavior conflict. Attitudes and attitude objects are functions of affective and cognitive components. It has been suggested that the inter-structural composition of an associative network can be altered by the activation of a single node. Thus, by activating an affective or emotional node, attitude change may be possible, though affective and cognitive components tend to be intertwined. Bases There are three bases for attitude change: compliance, identification, and internalization. These three processes represent the different levels of attitude change. Compliance Compliance refers to a change in behavior based on consequences, such as an individual's hopes to gain rewards or avoid punishment from another group or person. The individual does not necessarily experience changes in beliefs or evaluations of an attitude object, but rather is influenced by the social outcomes of adopting a change in behavior. The individual is also often aware that he or she is being urged to respond in a certain way. Compliance was demonstrated through a series of laboratory experiments known as the Asch experiments. Experiments led by Solomon Asch of Swarthmore College asked groups of students to participate in a "vision test". In reality, all but one of the participants were confederates of the experimenter, and the study was really about how the remaining student would react to the confederates' behavior. Participants were asked to pick, out of three line options, the line that is the same length as a sample and were asked to give the answer out loud. Unbeknown to the participants, Asch had placed a number of confederates to deliberately give
https://en.wikipedia.org/wiki/Tux%20Paint
Tux Paint is a free and open source raster graphics editor geared towards young children. The project was started in 2002 by Bill Kendrick, who continues to maintain and improve it with help from numerous volunteers. Tux Paint is seen by many as a free software alternative to Kid Pix, a similar proprietary educational software product. History Tux Paint was initially created for the Linux operating system, as there was no suitable drawing program for young children available for Linux at that time. It is written in the C programming language and uses various free and open source helper libraries, including the Simple DirectMedia Layer (SDL), and has since been made available for Microsoft Windows, Apple macOS, Android, Haiku, and other platforms. Selected milestone releases:
2002.06.16 (June 16, 2002) - Initial release (brushes, stamps, lines, eraser), two days after coding started
2002.06.30 (June 30, 2002) - First Magic Tools added (blur, blocks, negative)
2002.07.31 (July 31, 2002) - Localization support added
0.9.11 (June 17, 2003) - Right-to-left support, UTF-8 support in Text tool
0.9.14 (October 12, 2004) - Tux Paint Config. configuration tool released, Starter image support
0.9.16 (October 21, 2006) - Slideshow feature, animated and directional brushes
0.9.17 (July 1, 2007) - Arbitrary screen size and orientation support, SVG support, input method support
0.9.18 (November 21, 2007) - Magic Tools turned into plug-ins, Pango text rendering
0.9.25 (December 20, 2020) - Support for exporting individual drawings and slideshows (as animated GIFs)
0.9.28 (June 4, 2022) - 20-year milestone release, adds the ability to use any colour by setting hue and saturation instead of a static palette
0.9.29 (April 2, 2023) - Introduces fifteen new Magic Tools, improvements to the Stamp and Shapes tools, and a new quick start guide
Features Tux Paint stands apart from typical graphics editing software (such as GIMP or Photoshop) in that it was designed to be us
https://en.wikipedia.org/wiki/Three-body%20force
A three-body force is a force that does not exist in a system of two objects but appears in a three-body system. In general, if the behaviour of a system of more than two objects cannot be described by the two-body interactions between all possible pairs, as a first approximation, the deviation is mainly due to a three-body force. The fundamental strong interaction does exhibit such behaviour, the most important example being the stability experimentally observed for the helium-3 isotope, which can be described as a three-body quantum cluster of two protons and one neutron [PNP] in stable superposition. Direct evidence of a three-body force in helium-3 is known. The existence of the stable [PNP] cluster calls into question models of the atomic nucleus that restrict nucleon interactions within shells to two-body phenomena. The three-nucleon interaction is fundamentally possible because gluons, the mediators of the strong interaction, can couple to themselves. In particle physics, the interactions between the three quarks that compose hadrons can be described in a diquark model, which might be equivalent to the hypothesis of a three-body force. There is growing evidence in the field of nuclear physics that three-body forces exist among the nucleons inside atomic nuclei for many different isotopes (three-nucleon force). See also:
Faddeev equation
Few-body systems
N-body problem
Hydrogen molecular ion
Borromean nucleus
Efimov state
Chiral perturbation theory
https://en.wikipedia.org/wiki/HCLTech
HCL Technologies Limited, d/b/a HCLTech (formerly Hindustan Computers Pvt. Limited), is an Indian multinational information technology (IT) services and consulting company headquartered in Noida. The founder of HCLTech is Shiv Nadar. It emerged as an independent company in 1991 when HCL entered the software services business. The company has offices in 52 countries and over 225,944 employees. HCLTech is on the Forbes Global 2000 list. It is among the top 20 largest publicly traded companies in India, with a market capitalization of ₹281,209 crore as of March 2022. It is one of the top Big Tech (India) companies. History Formation and early years In 1976, a group of eight engineers, all former employees of Delhi Cloth & General Mills, led by Shiv Nadar, started a company that would make personal computers. Initially floated as Microcomp Limited, the company (whose founding team also included Arjun Malhotra, Ajai Chowdhry, D.S. Puri, Yogesh Vaidya and Subhash Arora) started selling teledigital calculators to gather capital for its main product. On 11 August 1976, the company was renamed Hindustan Computers Limited (HCL). The company was originally focused on hardware but, via HCL Technologies, software and services became a main focus. HCL Technologies began as the R&D Division of HCL Enterprise, a company which was a contributor to the development and growth of the IT and computer industry in India. HCL Enterprise developed an indigenous microcomputer in 1978, and a networking OS and client-server architecture in 1983. On 12 November 1991, HCL Technologies was spun off as a separate unit to provide software services. Later subsidiaries included HCL Infosystems and HCL Healthcare. On 12 November 1991, a company called HCL Overseas Limited was incorporated as a provider of technology development services. It received the certificate of commencement of business on 10 February 1992, after which it began its operations. Two years later, in July 1994, the company nam
https://en.wikipedia.org/wiki/Two-photon%20physics
Two-photon physics, also called gamma–gamma physics, is a branch of particle physics that describes the interactions between two photons. Normally, beams of light pass through each other unperturbed. Inside an optical material, if the intensity of the beams is high enough, the beams may affect each other through a variety of non-linear effects. In pure vacuum, some weak scattering of light by light exists as well. Also, above some threshold of the center-of-mass energy of the system of the two photons, matter can be created. Astronomy Cosmological/intergalactic gamma rays Photon–photon interactions limit the spectrum of observed gamma-ray photons at moderate cosmological distances to a photon energy below around 20 GeV, that is, to a wavelength of greater than approximately . This limit rises to around 20 TeV at merely intergalactic distances. An analogy would be light traveling through a fog: at near distances a light source is more clearly visible than at long distances, due to the scattering of light by fog particles. Similarly, the further a gamma ray travels through the universe, the more likely it is to be scattered by an interaction with a low-energy photon from the extragalactic background light. At those energies and distances, very high energy gamma-ray photons have a significant probability of a photon–photon interaction with a low-energy background photon from the extragalactic background light, resulting either in the creation of particle–antiparticle pairs via direct pair production or (less often) in photon–photon scattering events that lower the incident photon energies. This renders the universe effectively opaque to very high energy photons at intergalactic to cosmological distances. Experiments Two-photon physics can be studied with high-energy particle accelerators, where the accelerated particles are not the photons themselves but charged particles that will radiate photons. The most significant studies so far were performed at
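The opacity described above follows from simple kinematics: two photons can produce an electron–positron pair only if their centre-of-mass energy exceeds twice the electron rest energy, i.e. $s = 2E_1E_2(1 - \cos\theta) \geq (2m_ec^2)^2$. The sketch below (a back-of-the-envelope estimate written for this summary, not a calculation from any cited source) evaluates the gamma-ray energy needed to pair-produce against a background photon of roughly 1 eV, typical of extragalactic starlight:

```python
ELECTRON_REST_ENERGY_EV = 0.511e6  # m_e * c^2 in electronvolts

def pair_production_threshold(background_photon_ev, angle_cos=-1.0):
    """Minimum gamma-ray energy (eV) for e+e- pair production against a
    soft background photon, from s = 2*E1*E2*(1 - cos(theta)) >= (2*m_e*c^2)^2.
    angle_cos = -1 corresponds to a head-on collision (lowest threshold)."""
    s_min = (2 * ELECTRON_REST_ENERGY_EV) ** 2
    return s_min / (2 * background_photon_ev * (1 - angle_cos))

# Against a ~1 eV optical background photon, a gamma ray needs a few
# hundred GeV (order 1e11 eV) before head-on pair production switches on.
threshold = pair_production_threshold(1.0)
```

Softer (infrared and microwave) background photons raise the threshold into the TeV range, which is why the quoted energy limit grows as the relevant distances, and hence the dominant background, change.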
https://en.wikipedia.org/wiki/Bloom%20%28novel%29
Bloom, written in 1998, is the fifth science fiction novel written by Wil McCarthy. It was first released as a hardcover in September 1998. Almost a year later, in August 1999, its first mass market edition was published. An ebook reprint was published in 2011. Bloom is one of Borders' "Best 10 Books of 1998" and is a New York Times Notable Book. The premise of the book is how to handle human technology that has evolved beyond human control. Plot summary Bloom is set in the year 2106, in a world where self-replicating nanomachines called "Mycora" have consumed Earth and the other planets of the inner Solar System, forcing humankind to eke out a bleak living in the asteroids and Galilean moons. Two groups of humanity are described: the Immunity, who use "ladderdown" technology and augmented reality and live on the moons of Jupiter, and the Gladholders, who use human intelligence amplification and artificial intelligence and live in the asteroid belt. The story begins on Ganymede with an article about a "bloom", or outbreak of Mycora, that serves to emphasize the danger and horror of this technogenic life (TGL). The article is written by Strasheim, the primary narrator character. He is first seen in the office of Chief of Immunology Lottick, the effective ruler of Ganymede, who has called him there for an unknown purpose. Lottick tells Strasheim that the Mycora have apparently been stealing or assimilating human-designed defensive nanotech and may soon develop resistance to the coldness of the outer Solar System, which incites concern. A mission is planned to drop TGL detectors onto the polar ice caps of Mars, Earth, and the Moon, and Lottick asks Strasheim to go along as a reporter. For the longer term, a starship is being constructed to colonize other star systems before the Mycora can reach them. Strasheim agrees, and goes to meet the other crew members and inspect the ship, which is called the Louis Pasteur. The ship is technologically camouflaged to protect the cre
https://en.wikipedia.org/wiki/Multiresolution%20analysis
A multiresolution analysis (MRA) or multiscale approximation (MSA) is the design method of most of the practically relevant discrete wavelet transforms (DWT) and the justification for the algorithm of the fast wavelet transform (FWT). It was introduced in this context in 1988/89 by Stephane Mallat and Yves Meyer and has predecessors in the microlocal analysis in the theory of differential equations (the ironing method) and the pyramid methods of image processing as introduced in 1981/83 by Peter J. Burt, Edward H. Adelson and James L. Crowley. Definition A multiresolution analysis of the Lebesgue space $L^2(\mathbb{R})$ consists of a sequence of nested subspaces $\{0\} \subset \cdots \subset V_1 \subset V_0 \subset V_{-1} \subset \cdots \subset L^2(\mathbb{R})$ that satisfies certain self-similarity relations in time-space and scale-frequency, as well as completeness and regularity relations. Self-similarity in time demands that each subspace $V_k$ is invariant under shifts by integer multiples of $2^k$. That is, for each $f \in V_k$ and each $m \in \mathbb{Z}$, the function $g$ defined as $g(x) = f(x - m2^k)$ is also contained in $V_k$. Self-similarity in scale demands that all subspaces are time-scaled versions of each other, with scaling respectively dilation factor $2^{k-l}$. I.e., for each $f \in V_k$ there is a $g \in V_l$ with $g(x) = f(2^{k-l}x)$. In the sequence of subspaces, for $k > l$ the space resolution $2^l$ of the $l$-th subspace is higher than the resolution $2^k$ of the $k$-th subspace. Regularity demands that the model subspace $V_0$ be generated as the linear hull (algebraically or even topologically closed) of the integer shifts of one or a finite number of generating functions $\varphi$ or $\varphi_1, \ldots, \varphi_r$. Those integer shifts should at least form a frame for the subspace $V_0 \subset L^2(\mathbb{R})$, which imposes certain conditions on the decay at infinity. The generating functions are also known as scaling functions or father wavelets. In most cases one demands of those functions to be piecewise continuous with compact support. Completeness demands that those nested subspaces fill the whole space, i.e., their union should be dense in $L^2(\mathbb{R})$, and that they are not too redundant, i.e., their intersection should only contain the zero element.
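The nesting of the spaces $V_k$ is what the fast wavelet transform exploits: one analysis step maps coefficients in a fine space onto an approximation in the next coarser space plus detail coefficients in its complement. The sketch below (a minimal illustration using the Haar scaling function, the simplest father wavelet; written for this summary, not taken from any cited source) performs one such step and its exact inverse:

```python
import math

SQRT2 = math.sqrt(2.0)

def haar_analysis(signal):
    """One DWT level with the Haar filters: scaled pairwise sums give the
    approximation (coefficients in the coarser V-space), scaled pairwise
    differences give the detail (the complementary W-space).
    Input length must be even."""
    approx = [(signal[2 * i] + signal[2 * i + 1]) / SQRT2
              for i in range(len(signal) // 2)]
    detail = [(signal[2 * i] - signal[2 * i + 1]) / SQRT2
              for i in range(len(signal) // 2)]
    return approx, detail

def haar_synthesis(approx, detail):
    """Inverse step: rebuild the finer-space coefficients exactly."""
    out = []
    for a, d in zip(approx, detail):
        out.extend([(a + d) / SQRT2, (a - d) / SQRT2])
    return out

sig = [4.0, 2.0, 5.0, 5.0]
approx, detail = haar_analysis(sig)
reconstructed = haar_synthesis(approx, detail)
```

Iterating the analysis step on the approximation coefficients yields the full pyramid of resolutions, which is the fast wavelet transform in its simplest form; note how the constant pair (5.0, 5.0) produces a zero detail coefficient, reflecting that it already lies in the coarser space.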
https://en.wikipedia.org/wiki/Euro%20calculator
A euro calculator is a type of calculator in European countries (see eurozone) that adopted the euro as their official monetary unit. It functions like any other normal calculator, but it also includes a special function which allows one to convert a value expressed in the previously official unit (the peseta in Spain, for example) to the new value in euros, or vice versa. Its use became very popular within the population and commerce of these countries especially during the first few months after adopting the euro. As so many were produced, they are also found outside the eurozone to help staff with conversions at airports or railway stations where the euro has a strong presence.
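The conversion feature amounts to a single multiplication or division by the officially fixed rate. The sketch below (illustrative only; it uses the genuine irrevocable rate of 166.386 Spanish pesetas per euro, and assumes the usual rule of converting at the full rate before rounding the result) mimics that function:

```python
from decimal import Decimal, ROUND_HALF_UP

# Irrevocable conversion rate fixed by EU regulation: 1 EUR = 166.386 ESP.
RATE_ESP = Decimal("166.386")

def pesetas_to_euros(pesetas):
    """Convert at the full six-figure rate, then round to the cent."""
    return (Decimal(pesetas) / RATE_ESP).quantize(
        Decimal("0.01"), rounding=ROUND_HALF_UP)

def euros_to_pesetas(euros):
    """Reverse conversion, rounded to the whole peseta."""
    return (Decimal(euros) * RATE_ESP).quantize(
        Decimal("1"), rounding=ROUND_HALF_UP)

price_eur = pesetas_to_euros(1000)   # 1000 ESP expressed in euros
price_esp = euros_to_pesetas(10)     # 10 EUR expressed in pesetas
```

Decimal arithmetic is used deliberately: binary floating point would occasionally round prices to the wrong cent, which is exactly the kind of error these calculators were built to avoid.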
https://en.wikipedia.org/wiki/Chasman%E2%80%93Green%20lattice
The Chasman–Green lattice, also known as a double bend achromat lattice (DBA lattice), is a special periodic arrangement of magnets designed by Renate Chasman and George Kenneth Green of Brookhaven National Laboratory in the mid-1970s for synchrotrons. This lattice provides optimized bending and focusing of electrons in storage rings designed for synchrotron light sources. An electron storage ring constructed with a Chasman–Green lattice has the important property that the circulating electron beams have very low emittance, which results in the emission of synchrotron light of exceptional brightness. For this reason it is the lattice of choice for most of the premier synchrotron light source facilities worldwide. Each period of the Chasman–Green lattice contains a focusing quadrupole magnet symmetrically located between a pair of identical dipole magnets, which transports incident electrons through a bending arc to an exit path that is independent of the electron energy.
https://en.wikipedia.org/wiki/Content-addressable%20network
The content-addressable network (CAN) is a distributed, decentralized P2P infrastructure that provides hash table functionality on an Internet-like scale. CAN was one of the original four distributed hash table proposals, introduced concurrently with Chord, Pastry, and Tapestry. Overview Like other distributed hash tables, CAN is designed to be scalable, fault tolerant, and self-organizing. The architectural design is a virtual multi-dimensional Cartesian coordinate space, a type of overlay network, on a multi-torus. This n-dimensional coordinate space is a virtual logical address, completely independent of the physical location and physical connectivity of the nodes. Points within the space are identified with coordinates. The entire coordinate space is dynamically partitioned among all the nodes in the system such that every node possesses at least one distinct zone within the overall space. Routing A CAN node maintains a routing table that holds the IP address and virtual coordinate zone of each of its neighbors. A node routes a message towards a destination point in the coordinate space. The node first determines which neighboring zone is closest to the destination point, and then looks up that zone's node's IP address via the routing table. Node joining To join a CAN, a joining node must: Find a node already in the overlay network. Identify a zone that can be split Update the routing tables of nodes neighboring the newly split zone. To find a node already in the overlay network, bootstrapping nodes may be used to inform the joining node of IP addresses of nodes currently in the overlay network. After the joining node receives an IP address of a node already in the CAN, it can attempt to identify a zone for itself. The joining node randomly picks a point in the coordinate space and sends a join request, directed to the random point, to one of the received IP addresses. The nodes already in the overlay network route the join request to the correct d
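The greedy forwarding rule described above is simple to sketch. In the toy model below (an illustration written for this summary, not the original implementation; a real CAN node stores full zone extents and IP addresses, whereas this sketch represents each neighbour's zone by a hypothetical centre point), a node forwards a message to whichever neighbour's zone centre is closest to the destination point on the unit d-torus:

```python
import math

def torus_distance(p, q, size=1.0):
    """Euclidean distance on a d-torus: each coordinate wraps around,
    so the per-axis separation is the shorter way round the circle."""
    total = 0.0
    for a, b in zip(p, q):
        d = abs(a - b) % size
        d = min(d, size - d)
        total += d * d
    return math.sqrt(total)

def next_hop(neighbors, dest):
    """Greedy CAN routing step: forward to the neighbour whose zone
    centre is closest to the destination point.  `neighbors` maps a
    node id to a coordinate tuple (a simplification for this sketch)."""
    return min(neighbors, key=lambda n: torus_distance(neighbors[n], dest))

neighbors = {"a": (0.1, 0.1), "b": (0.8, 0.8), "c": (0.5, 0.2)}
hop = next_hop(neighbors, (0.9, 0.9))
```

Because every node only knows its immediate zone neighbours, this local rule is repeated hop by hop until the message reaches the node whose zone contains the destination point; in a d-dimensional CAN of n nodes the expected path length grows as O(d · n^(1/d)).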
https://en.wikipedia.org/wiki/History%20of%20general%20relativity
General relativity is a theory of gravitation that was developed by Albert Einstein between 1907 and 1915, with contributions by many others after 1915. According to general relativity, the observed gravitational attraction between masses results from the warping of space and time by those masses. Before the advent of general relativity, Newton's law of universal gravitation had been accepted for more than two hundred years as a valid description of the gravitational force between masses, even though Newton himself did not regard the theory as the final word on the nature of gravity. Within a century of Newton's formulation, careful astronomical observation revealed unexplainable differences between the theory and the observations. Under Newton's model, gravity was the result of an attractive force between massive objects. Although even Newton was bothered by the unknown nature of that force, the basic framework was extremely successful at describing motion. However, experiments and observations show that Einstein's description accounts for several effects that are unexplained by Newton's law, such as minute anomalies in the orbits of Mercury and other planets. General relativity also predicts novel effects of gravity, such as gravitational waves, gravitational lensing and an effect of gravity on time known as gravitational time dilation. Many of these predictions have been confirmed by experiment or observation, while others are the subject of ongoing research. General relativity has developed into an essential tool in modern astrophysics. It provides the foundation for the current understanding of black holes, regions of space where gravitational attraction is so strong that not even light can escape. Their strong gravity is thought to be responsible for the intense radiation emitted by certain types of astronomical objects (such as active galactic nuclei or microquasars). General relativity is also part of the framework of the standard Big Bang model of cosmo
https://en.wikipedia.org/wiki/Neosalvarsan
Neosalvarsan is a synthetic chemotherapeutic agent and an organoarsenic compound. It became available in 1912 and superseded the more toxic and less water-soluble salvarsan as an effective treatment for syphilis. Because both of these arsenicals carried considerable risk of side effects, they were replaced for this indication by penicillin in the 1940s. Both salvarsan and neosalvarsan were developed in the laboratory of Paul Ehrlich in Frankfurt, Germany. Their discoveries were the result of the first organized team effort to optimize the biological activity of a lead compound through systematic chemical modifications. This scheme is the basis for most modern pharmaceutical research. Both salvarsan and neosalvarsan are prodrugs; that is, they are metabolised into the active drug in the body. Although neosalvarsan, like salvarsan, was originally believed to contain an arsenic-arsenic double bond, this is now known to be incorrect for salvarsan. Presumably, neosalvarsan likewise exists as a mixture of differently sized rings with arsenic-arsenic single bonds.
https://en.wikipedia.org/wiki/Codan
Codan Limited is a manufacturer and supplier of communications, metal detection, and mining technology, headquartered in Adelaide, South Australia with revenue of A$348.0 million (2020). Codan Limited is the communications business unit and the parent company of the Codan group, which is engaged in business through its operating segment Radio Communications. This product range is sold to customers in more than 150 countries. In addition to its global service and support network, the Codan group has regional sales offices in Perth (Western Australia), Washington D.C., and Chicago (United States), Victoria, BC, (Canada), Farnham (UK), Cork (Ireland), Florianópolis (Brazil), Penang (Malaysia) and Dubai (United Arab Emirates). The company maintains quality assurance systems approved to the ISO 9001:2000 standard. The company was established in 1959 by three friends from the University of Adelaide: Alastair Wood, Ian Wall and Jim Bettison. The company was established as Electronics, Instrument and Lighting Company Limited (EILCO), renaming as Codan in 1970. Codan was listed on the Australian Stock Exchange in 2003 and expanded into military technology in 2006. In 2005, CEO Mike Heard denied that Codan had knowingly supplied technology to an Al-Qaeda operative in 2001. Mike Heard acted as the company's CEO during the 1990s, and held the position until his retirement in 2010. In 2009, Codan established its Military and Security Division in the US. On 30 June 2012, Codan Limited sold its Satellite Communications assets to CPI International Holding Corp, and its wholly owned subsidiary CPI International, Inc (CPI). In 2016, Codan Defence Electronics was established to "leverage core competencies in military radio and countermine technology." Codan Radio Communications Codan designs and manufactures a range of HF equipment including transceivers (base, portable and mobile), modems, power supplies, amplifiers, antennas and accessories. It also provides HF solutions ran
https://en.wikipedia.org/wiki/Boris%20Babayan
Boris Artashesovich Babayan (born 20 December 1933 in Baku) is a Soviet and Russian computer scientist of Armenian descent, notable as the pioneering creator of supercomputers in the former Soviet Union and Russia. Biography Babayan was born in Baku, Soviet Union, to an Armenian family. He graduated from the Moscow Institute of Physics and Technology in 1957. He completed his Ph.D. in 1964 and his doctorate of science in 1971. From 1956 to 1996, Babayan worked in the Lebedev Institute of Precision Mechanics and Computer Engineering, where he eventually became chief of the hardware and software division. Babayan and his team built their first computers during the 1950s. In the 1970s, as one of 15 deputies to chief architect V. S. Burtsev, he worked on the first superscalar computer, the Elbrus-1, and the programming language Эль-76. Using these computers in 1978, ten years before commercial applications appeared in the West, the Soviet Union developed its missile systems and its nuclear and space programs. A team headed by Babayan designed the Elbrus-3 computer using an architecture named Explicitly Parallel Instruction Computing (EPIC). From 1992 to 2004, Babayan held senior positions in the Moscow Center for SPARC Technology (MCST) and Elbrus International. In these roles he led the development of the Elbrus 2000 (a single-chip implementation of the Elbrus-3) and Elbrus90micro (a SPARC computer based on a domestically developed microprocessor) projects. Since August 2004, Babayan has been the Director of Architecture for the Software and Solutions Group in Intel Corporation and scientific advisor of the Intel R&D center in Moscow. He leads efforts in such areas as compilers, binary translation and security technologies. He became the second European to hold the Intel Fellow title (after the Norwegian Tryggve Fossum). Babayan was awarded the two highest honors in the former Soviet Union: the USSR State Prize for his achievements in 1974 in the field of computer-aided design, and the Len
https://en.wikipedia.org/wiki/Calculator%20input%20methods
There are various ways in which calculators interpret keystrokes. These can be categorized into two main types: On a single-step or immediate-execution calculator, the user presses a key for each operation, calculating all the intermediate results, before the final value is shown. On an expression or formula calculator, one types in an expression and then presses a key, such as "=" or "Enter", to evaluate the expression. There are various systems for typing in an expression, as described below. Immediate execution The immediate execution mode of operation (also known as single-step, algebraic entry system (AES) or chain calculation mode) is commonly employed on most general-purpose calculators. In most simple four-function calculators, such as the Windows calculator in Standard mode and those included with most early operating systems, each binary operation is executed as soon as the next operator is pressed, and therefore the order of operations in a mathematical expression is not taken into account. Scientific calculators, including the Scientific mode in the Windows calculator and most modern software calculators, have buttons for brackets and can take order of operations into account. Also, for unary operations, like √ or x², the number is entered first, then the operator; this is largely because the display screens on these kinds of calculators are generally composed entirely of seven-segment characters and thus capable of displaying only numbers, not the functions associated with them. This mode of operation also makes it impossible to change the expression being input without clearing the display entirely. The first two examples have been given twice. The first version is for simple calculators, showing how it is necessary to rearrange operands in order to get the correct result. The second version is for scientific calculators, where operator precedence is observed. Different forms of operator precedence schemes exist. In the algebraic entry system with
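The contrast between the two input methods can be sketched with a toy evaluator (the function names are hypothetical and the sketch is not modeled on any real calculator's firmware):

```python
# Sketch: contrast immediate execution with precedence-aware evaluation,
# using the keystroke sequence "1 + 2 * 3".

def immediate_execution(tokens):
    """Apply each operator as soon as the next one is pressed (no precedence)."""
    ops = {"+": lambda a, b: a + b, "-": lambda a, b: a - b,
           "*": lambda a, b: a * b, "/": lambda a, b: a / b}
    acc = tokens[0]
    for op, operand in zip(tokens[1::2], tokens[2::2]):
        acc = ops[op](acc, operand)
    return acc

def precedence_execution(tokens):
    """Scientific-calculator style: * and / bind tighter than + and -."""
    # First pass: collapse runs of * and / into single operands.
    stack = [tokens[0]]
    for op, operand in zip(tokens[1::2], tokens[2::2]):
        if op == "*":
            stack[-1] *= operand
        elif op == "/":
            stack[-1] /= operand
        else:
            stack += [op, operand]
    # Second pass: apply + and - left to right.
    acc = stack[0]
    for op, operand in zip(stack[1::2], stack[2::2]):
        acc = acc + operand if op == "+" else acc - operand
    return acc

keys = [1, "+", 2, "*", 3]
print(immediate_execution(keys))   # 9: (1 + 2) is computed first, then * 3
print(precedence_execution(keys))  # 7: 2 * 3 is computed first, then 1 +
```

On a simple calculator the user would have to rearrange the keystrokes (press 2 × 3 first, then + 1) to obtain 7, which is exactly the rearrangement the text describes.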
https://en.wikipedia.org/wiki/Jahn%E2%80%93Teller%20effect
The Jahn–Teller effect (JT effect or JTE) is an important mechanism of spontaneous symmetry breaking in molecular and solid-state systems which has far-reaching consequences in different fields, and is responsible for a variety of phenomena in spectroscopy, stereochemistry, crystal chemistry, molecular and solid-state physics, and materials science. The effect is named for Hermann Arthur Jahn and Edward Teller, who first reported studies about it in 1937. Simplified overview The Jahn–Teller effect, sometimes also referred to as Jahn–Teller distortion, describes the geometrical distortion of molecules and ions that results from certain electron configurations. The Jahn–Teller theorem essentially states that any non-linear molecule with a spatially degenerate electronic ground state will undergo a geometrical distortion that removes that degeneracy, because the distortion lowers the overall energy of the species. For a description of another type of geometrical distortion that occurs in crystals with substitutional impurities see article off-center ions. Transition metal chemistry The Jahn–Teller effect is most often encountered in octahedral complexes of the transition metals. The phenomenon is very common in six-coordinate copper(II) complexes. The d9 electronic configuration of this ion gives three electrons in the two degenerate eg orbitals, leading to a doubly degenerate electronic ground state. Such complexes distort along one of the molecular fourfold axes (always labelled the z axis), which has the effect of removing the orbital and electronic degeneracies and lowering the overall energy. The distortion normally takes the form of elongating the bonds to the ligands lying along the z axis, but occasionally occurs as a shortening of these bonds instead (the Jahn–Teller theorem does not predict the direction of the distortion, only the presence of an unstable geometry). When such an elongation occurs, the effect is to lower the electrostatic repulsion between
https://en.wikipedia.org/wiki/ParaHox
The ParaHox gene cluster is an array of homeobox genes (involved in morphogenesis, the regulation of patterns of anatomical development) from the Gsx, Xlox (Pdx) and Cdx gene families. Regulatory gene cluster These genes were first shown to be arranged into a physically-linked chromosomal cluster in amphioxus, an invertebrate with a single member of each of the three gene families. All the ParaHox genes in the amphioxus genome are therefore in the ParaHox gene cluster. In contrast, the human genome has six ParaHox genes (GSX1, GSX2, PDX1, CDX1, CDX2, CDX4), of which three genes (GSX1, PDX1 (=IPF1), CDX2) are physically linked to form a human ParaHox gene cluster on chromosome 13. Mouse has a homologous ParaHox gene cluster on chromosome 5. The other three human ParaHox genes are remnants from duplicated ParaHox gene clusters that were generated in the 2R genome duplications at the base of vertebrate evolution. Some vertebrates, notably chondrichthyan fish and coelacanths, have retained an additional ParaHox gene (PDX2). The ParaHox gene cluster has been proposed to be a paralogue, or evolutionary sister, of the Hox gene cluster; the two gene clusters being descended from a segmental duplication early in animal evolution, preceding the divergence of cnidarians and bilaterian animals. It has been suggested that an ancient role of the ParaHox gene cluster in bilaterians was to specify or pattern the through-gut, with Gsx patterning the mouth, Xlox (=Pdx) patterning the midgut and Cdx marking the anus. Gene expression and functional data lend tentative support to this hypothesis, although in many animals the roles of the genes have changed in evolution, notably in the Gsx gene family, which plays a role in brain (not foregut) development in vertebrates.
https://en.wikipedia.org/wiki/Whitney%20umbrella
In geometry, the Whitney umbrella (or Whitney's umbrella, named after American mathematician Hassler Whitney, and sometimes called a Cayley umbrella) is a specific self-intersecting ruled surface placed in three dimensions. It is the union of all straight lines that pass through points of a fixed parabola and are perpendicular to a fixed straight line which is parallel to the axis of the parabola and lies on its perpendicular bisecting plane. Formulas Whitney's umbrella can be given by the parametric equations in Cartesian coordinates x(u, v) = uv, y(u, v) = v, z(u, v) = u², where the parameters u and v range over the real numbers. It is also given by the implicit equation x² − y²z = 0. This formula also includes the negative z axis (which is called the handle of the umbrella). Properties Whitney's umbrella is a ruled surface and a right conoid. It is important in the field of singularity theory, as a simple local model of a pinch point singularity. The pinch point and the fold singularity are the only stable local singularities of maps from R² to R³. In string theory, a Whitney brane is a D7-brane wrapping a variety whose singularities are locally modeled by the Whitney umbrella. Whitney branes appear naturally when taking Sen's weak coupling limit of F-theory. See also Cross-cap Right conoid Ruled surface
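In one common convention (an assumption of this sketch), the surface is parametrized as x = uv, y = v, z = u², which satisfies the implicit equation x² − y²z = 0; a quick numerical verification:

```python
import random

# Check that the parametrization x = u*v, y = v, z = u**2 (one common
# convention for the Whitney umbrella) satisfies x**2 - y**2 * z = 0.
random.seed(0)
for _ in range(1000):
    u = random.uniform(-10, 10)
    v = random.uniform(-10, 10)
    x, y, z = u * v, v, u * u
    assert abs(x * x - y * y * z) < 1e-9

# The implicit variety also contains points the parametrization misses:
# the negative z axis (the "handle"), e.g. (0, 0, -1).
x, y, z = 0.0, 0.0, -1.0
assert x * x - y * y * z == 0.0   # on the variety...
# ...but z = u**2 >= 0 for every real u, so (0, 0, -1) is not in the image.
print("implicit equation verified")
```

This also makes the "handle" concrete: every point with x = y = 0 satisfies the implicit equation, but the parametrization only reaches z ≥ 0.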
https://en.wikipedia.org/wiki/Sigil%20of%20Baphomet
The Sigil of Baphomet is the official insignia of the Church of Satan, first appearing on the cover of The Satanic Mass album in 1968 and adorning the cover of The Satanic Bible the following year. The sigil has been called a "material pentagram" representational of carnality and earthy principles. The Church describes the symbol as the "...preeminent visual distillation of the iconoclastic philosophy of Satanism." History The familiar goat's head inside an inverted pentagram did not become the foremost symbol of Satanism until the founding of the Church of Satan in 1966. The original goat pentagram, containing the Hebrew letters at the five points of the pentagram spelling out Leviathan (לויתן), the ancient serpent from the biblical Chaoskampf, first appeared in the book La Clef de la Magie Noire by French occultist Stanislas de Guaita in 1897. With the pentagram inverted, matter is ruling over spirit, a condition associated with evil. In the book, de Guaita also showed an upright pentagram with the Pentagrammaton (יהשוה) at the vertices of the pentagram. The Pentagrammaton is an esoteric version of the Hebrew name of Jesus, Yeshua (ישוע), formed by adding the letter shin (ש) in the middle of the Tetragrammaton divine name Yod-He-Vav-He (יהוה). The lower four points represented the four elements of the material world, while the uppermost point represented spirit ruling over matter. The upper pentacle includes the concept of an inverted pentagram being a representation of evil and an upright pentagram symbolizing holiness, which originated with the 19th-century French occultist Eliphas Lévi. Anton LaVey, the founder of the Church of Satan, based his “Sigil of Baphomet” on this image of Baphomet. This symbol was later used in Maurice Bessy's book A Pictorial History of Magic and the Supernatural, with the words "Samael" and "Lilith" removed. During his years of research into the "black arts", LaVey had come across this book and added it to his collection. When he chose t
https://en.wikipedia.org/wiki/Umbel
In botany, an umbel is an inflorescence that consists of a number of short flower stalks (called pedicels) that spread from a common point, somewhat like umbrella ribs. The word was coined in botanical usage in the 1590s, from Latin umbella "parasol, sunshade". The arrangement can vary from being flat-topped to almost spherical. Umbels can be simple or compound. The secondary umbels of compound umbels are known as umbellules or umbellets. A small umbel is called an umbellule. The arrangement of the inflorescence in umbels is referred to as umbellate, or occasionally subumbellate (almost umbellate). Umbels are a characteristic of plants such as carrot, parsley, dill, and fennel in the family Apiaceae; ivy, Aralia and Fatsia in the family Araliaceae; and onion (Allium) in the family Alliaceae. An umbel is a type of indeterminate inflorescence. A compressed cyme, which is a determinate inflorescence, is called umbelliform if it resembles an umbel. Gallery
https://en.wikipedia.org/wiki/Photoperiodism
Photoperiodism is the physiological reaction of organisms to the length of night or a dark period. It occurs in plants and animals. Plant photoperiodism can also be defined as the developmental responses of plants to the relative lengths of light and dark periods. Plants are classified under three groups according to the photoperiods: short-day plants, long-day plants, and day-neutral plants. In animals, photoperiodism (sometimes called seasonality) is the suite of physiological changes that occur in response to changes in day length. This allows animals to respond to a temporally changing environment associated with changing seasons as the earth orbits the sun. Plants Many flowering plants (angiosperms) use a circadian rhythm together with photoreceptor proteins, such as phytochrome or cryptochrome, to sense seasonal changes in night length, or photoperiod, which they take as signals to flower. In a further subdivision, obligate photoperiodic plants absolutely require a long or short enough night before flowering, whereas facultative photoperiodic plants are merely more likely to flower under one condition. Phytochrome comes in two forms: Pr and Pfr. Red light (which is present during the day) converts phytochrome to its active form (Pfr), which then stimulates various processes such as germination, flowering or branching. In comparison, plants receive more far-red light in the shade, and this converts phytochrome from Pfr to its inactive form, Pr, inhibiting germination. This system of Pfr to Pr conversion allows the plant to sense when it is night and when it is day. Pfr can also be converted back to Pr by a process known as dark reversion, where long periods of darkness trigger the conversion of Pfr to Pr. This is important with regard to plant flowering. Experiments by Halliday et al. showed that manipulations of the red to far-red ratio in Arabidopsis can alter flowering. They discovered that plants tend to flower later when exposed to more red light, proving that red light i
https://en.wikipedia.org/wiki/SEAC%20%28computer%29
SEAC (Standards Eastern Automatic Computer or Standards Electronic Automatic Computer) was a first-generation electronic computer, built in 1950 by the U.S. National Bureau of Standards (NBS). It was initially called the National Bureau of Standards Interim Computer because it was a small-scale computer designed to be built quickly and put into operation while the NBS waited for more powerful computers to be completed (the DYSEAC). The team that developed SEAC was organized by Samuel N. Alexander. SEAC was demonstrated in April 1950 and was dedicated in June 1950; it is claimed to be the first fully operational stored-program electronic computer in the US. Description Based on EDVAC, SEAC used only 747 vacuum tubes (a small number for the time), eventually expanded to 1,500 tubes. It had 10,500 germanium diodes which performed all of the logic functions (see the article diode–transistor logic for the working principles of diode logic), later expanded to 16,000 diodes. It was the first computer to do most of its logic with solid-state devices. The tubes were used for amplification, inversion and storing information in dynamic flip-flops. The machine used 64 acoustic delay lines to store 512 words of memory, with each word being 45 bits in size. The clock rate was kept low (1 MHz). The computer's instruction set consisted of only 11 types of instructions: fixed-point addition, subtraction, multiplication, and division; comparison; and input & output. It was eventually expanded to 16 instructions. The addition time was 864 microseconds and the multiplication time was 2,980 microseconds (i.e. close to 3 milliseconds). Applications On some occasions SEAC was used via a remote teletype, making it one of the first computers to be used remotely. With many modifications, it was used until 1964. Some of the problems run on it dealt with: digital imaging, led by Russell A. Kirsch; computer animation of a city traffic simulation; meteorol
https://en.wikipedia.org/wiki/Hilbert%27s%20twenty-first%20problem
The twenty-first problem of the 23 Hilbert problems, from the celebrated list put forth in 1900 by David Hilbert, concerns the existence of a certain class of linear differential equations with specified singular points and monodromic group. Statement The original problem was stated as follows (English translation from 1902): Proof of the existence of linear differential equations having a prescribed monodromic group In the theory of linear differential equations with one independent variable z, I wish to indicate an important problem, one which very likely Riemann himself may have had in mind. This problem is as follows: To show that there always exists a linear differential equation of the Fuchsian class, with given singular points and monodromic group. The problem requires the production of n functions of the variable z, regular throughout the complex z-plane except at the given singular points; at these points the functions may become infinite of only finite order, and when z describes circuits about these points the functions shall undergo the prescribed linear substitutions. The existence of such differential equations has been shown to be probable by counting the constants, but the rigorous proof has been obtained up to this time only in the particular case where the fundamental equations of the given substitutions have roots all of absolute magnitude unity. has given this proof, based upon Poincaré's theory of the Fuchsian zeta-functions. The theory of linear differential equations would evidently have a more finished appearance if the problem here sketched could be disposed of by some perfectly general method. Definitions In fact it is more appropriate to speak not about differential equations but about linear systems of differential equations: in order to realise any monodromy by a differential equation one has to admit, in general, the presence of additional apparent singularities, i.e. singularities with trivial local monodromy. In more modern langua
https://en.wikipedia.org/wiki/Quadrupole%20mass%20analyzer
In mass spectrometry, the quadrupole mass analyzer (or quadrupole mass filter) is a type of mass analyzer originally conceived by Nobel laureate Wolfgang Paul and his student Helmut Steinwedel. As the name implies, it consists of four cylindrical rods, set parallel to each other. In a quadrupole mass spectrometer (QMS) the quadrupole is the mass analyzer - the component of the instrument responsible for selecting sample ions based on their mass-to-charge ratio (m/z). Ions are separated in a quadrupole based on the stability of their trajectories in the oscillating electric fields that are applied to the rods. Principle of operation The quadrupole consists of four parallel metal rods. Each opposing rod pair is connected together electrically, and a radio frequency (RF) voltage with a DC offset voltage is applied between one pair of rods and the other. Ions travel down the quadrupole between the rods. Only ions of a certain mass-to-charge ratio will reach the detector for a given ratio of voltages: other ions have unstable trajectories and will collide with the rods. This permits selection of an ion with a particular m/z or allows the operator to scan for a range of m/z-values by continuously varying the applied voltage. Mathematically this can be modeled with the help of the Mathieu differential equation. Ideally, the rods are hyperbolic, however cylindrical rods with a specific ratio of rod diameter-to-spacing provide an easier-to-manufacture adequate approximation to hyperbolas. Small variations in the ratio have large effects on resolution and peak shape. Different manufacturers choose slightly different ratios to fine-tune operating characteristics in context of anticipated application requirements. Since the 1980s, the MAT company and subsequently Finnigan Instrument Corporation used hyperbolic rods produced with a mechanical tolerance of 0.001 mm, whose exact production process was a well-kept secret within the company. Multiple quadrupoles, hybrids and var
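The stability picture above can be made concrete: motion along one axis of an ideal quadrupole reduces to the Mathieu equation u″ + (a − 2q cos 2t)u = 0, where a and q are proportional to the DC and RF voltage amplitudes. A minimal numerical sketch (the (a, q) values are illustrative choices, not tied to any instrument):

```python
import math

# Integrate the Mathieu equation u'' + (a - 2q cos 2t) u = 0 with RK4 and
# compare a stable (a, q) pair with one inside the a ≈ 1 resonance tongue.

def mathieu_max_amplitude(a, q, t_end=100.0, dt=2e-3):
    """Integrate from u = 1, u' = 0; return the maximum |u| observed."""
    def f(t, u, w):                       # state (u, u'); returns (u', u'')
        return w, -(a - 2.0 * q * math.cos(2.0 * t)) * u
    t, u, w, peak = 0.0, 1.0, 0.0, 1.0
    while t < t_end:
        k1u, k1w = f(t, u, w)
        k2u, k2w = f(t + dt / 2, u + dt / 2 * k1u, w + dt / 2 * k1w)
        k3u, k3w = f(t + dt / 2, u + dt / 2 * k2u, w + dt / 2 * k2w)
        k4u, k4w = f(t + dt, u + dt * k3u, w + dt * k3w)
        u += dt / 6 * (k1u + 2 * k2u + 2 * k3u + k4u)
        w += dt / 6 * (k1w + 2 * k2w + 2 * k3w + k4w)
        t += dt
        peak = max(peak, abs(u))
    return peak

stable = mathieu_max_amplitude(a=0.2, q=0.2)    # inside the first stability region
unstable = mathieu_max_amplitude(a=1.0, q=0.2)  # inside the a ≈ 1 instability tongue
print(f"stable peak {stable:.2f}, unstable peak {unstable:.2e}")
```

The bounded trajectory corresponds to an ion that traverses the rods and reaches the detector; the exponentially growing one corresponds to an ion that strikes the rods and is filtered out.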
https://en.wikipedia.org/wiki/SWAC%20%28computer%29
The SWAC (Standards Western Automatic Computer) was an early electronic digital computer built in 1950 by the U.S. National Bureau of Standards (NBS) in Los Angeles, California. It was designed by Harry Huskey. Overview Like the SEAC which was built about the same time, the SWAC was a small-scale interim computer designed to be built quickly and put into operation while the NBS waited for more powerful computers to be completed (in particular, the RAYDAC by Raytheon). The machine used 2,300 vacuum tubes. It had 256 words of memory, using Williams tubes, with each word being 37 bits. It had only seven basic operations: add, subtract, and fixed-point multiply; comparison, data extraction, input and output. Several years later, drum memory was added. When the SWAC was completed in August 1950, it was the fastest computer in the world. It continued to hold that status until the IAS computer was completed a year later. It could add two numbers and store the result in 64 microseconds. A similar multiplication took 384 microseconds. It was used by the NBS until 1954 when the Los Angeles office was closed, and then by UCLA until 1967 (with modifications). It was charged out there for $40 per hour. In January 1952, Raphael M. Robinson used the SWAC to discover five Mersenne primes—the largest prime numbers known at the time, with 157, 183, 386, 664 and 687 digits. Additionally, the SWAC was vital in doing the intense calculation required for the X-ray analysis of the structure of vitamin B12 done by Dorothy Hodgkin. This was fundamental in Hodgkin receiving the Nobel Prize in Chemistry in 1964. See also List of vacuum tube computers
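For context, the five Mersenne primes Robinson found are 2^p − 1 for exponents p = 521, 607, 1279, 2203 and 2281 (well-documented values, though the text above quotes only digit counts); the quoted digit counts can be checked directly:

```python
# Verify the digit counts quoted for Robinson's 1952 SWAC discoveries.
# The exponents are an added assumption of this sketch (standard values
# from the Mersenne-prime record), not stated in the text above.
exponents = [521, 607, 1279, 2203, 2281]
digits = [len(str(2**p - 1)) for p in exponents]
print(digits)  # [157, 183, 386, 664, 687]
```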
https://en.wikipedia.org/wiki/149%20%28number%29
149 (one hundred [and] forty-nine) is the natural number between 148 and 150. In mathematics 149 is a prime number, the first prime whose difference from the previous prime is exactly 10, an emirp, and an irregular prime. After 1 and 127, it is the third smallest de Polignac number, an odd number that cannot be represented as a prime plus a power of two. More strongly, after 1, it is the second smallest number that is not a sum of two prime powers. It is a tribonacci number, being the sum of the three preceding terms, 24, 44, 81. There are exactly 149 integer points in a closed circular disk of radius 7, and exactly 149 ways of placing six queens (the maximum possible) on a 5 × 5 chess board so that each queen attacks exactly one other. The barycentric subdivision of a tetrahedron produces an abstract simplicial complex with exactly 149 simplices. See also The year AD 149 or 149 BC List of highways numbered 149
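Several of the claims above are easy to verify mechanically; a short stdlib-only check:

```python
# Spot-check the arithmetic claims about 149.
def is_prime(n):
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

assert is_prime(149)
# Gap of exactly 10 from the previous prime, 139:
assert is_prime(139) and all(not is_prime(k) for k in range(140, 149))
assert is_prime(941)            # 149 reversed is also prime: an emirp
assert 24 + 44 + 81 == 149      # tribonacci step from the three preceding terms
# de Polignac property: 149 is not a prime plus a power of two
assert all(not is_prime(149 - 2**k) for k in range(8))   # 2**7 = 128 < 149
print("all checks pass")
```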
https://en.wikipedia.org/wiki/Sagittaria
Sagittaria is a genus of about 30 species of aquatic plants whose members go by a variety of common names, including arrowhead, duck potato, swamp potato, tule potato, and wapato. Most are native to South, Central, and North America, but there are also some from Europe, Africa, and Asia. Description Sagittaria plant stock (the perennial rhizome) is a horizontal creeper (stoloniferous) and obliquely obovate, the margins winged, with apical or ventral beak; in other words, they are a small, dry, one-seeded fruit that do not open to release the seed, set on a slant, narrower at the base, with winged edges, and having a "beaked" aperture (one side longer than the other) for sprouting, set above or below the fruit body. One of the names for this plant is derived from the edible underwater tuber that the plant produces. In late fall or early spring, disturbing the aquatic mud in which the plant grows will cause its small tubers to float to the surface where they can be harvested and then boiled. Uses Many species have edible roots, prized for millennia as a reliable source of starch and carbohydrates, even during the winter. Some are edible raw, though are less bitter when cooked. They can be harvested by hand or by treading with one's feet in the mud. They are easy to propagate by replanting the roots. Species Accepted species: Sagittaria aginashii Makino – Japan, Korea, Primorye Sagittaria ambigua J.G.Sm. – Missouri Arrowhead – from Oklahoma to Indiana Sagittaria australis (J.G.Sm.) Small – Appalachian Arrowhead – southeastern US from Louisiana to Florida and as far north as Iowa and New Jersey Sagittaria brevirostra Mack. & Bush – Shortbeak Arrowhead – central US (Great Plains, Mississippi and Ohio Valleys, Great Lakes); also Virginia and Saskatchewan; naturalized in California Sagittaria chapmanii (J.G.Sm.) C.Mohr – from Texas to the Carolinas Sagittaria cristata Engelm – Crested arrowhead – Great Lakes region Sagittaria cuneata E.P.Sheld. – Wapato, No
https://en.wikipedia.org/wiki/Hydrolyzed%20protein
Hydrolyzed protein is a solution derived from the hydrolysis of a protein into its component amino acids and peptides. While many means of achieving this exist, the most common is prolonged heating with hydrochloric acid, sometimes with an enzyme such as pancreatic protease to simulate the naturally occurring hydrolytic process. Uses Protein hydrolysis is a useful route to the isolation of individual amino acids. Examples include cystine from hydrolysis of hair, tryptophan from casein, histidine from red blood cells, and arginine from gelatin. Common hydrolyzed products used in food are hydrolyzed vegetable protein and yeast extract, which are used as flavor enhancers because the hydrolysis of the protein produces free glutamic acid. Some hydrolyzed beef protein powders are used for specialized diets. Protein hydrolysis can be used to modify the allergenic properties of infant formula. Reducing the size of cow milk proteins in the formula makes it more suitable for consumption by babies suffering from milk protein intolerance. The US FDA approved a label for this usage of partially-hydrolyzed proteins in 2017, but a meta-analysis published the same year showed insufficient evidence for this use. Hydrolyzed protein is also used in certain specially formulated hypoallergenic pet foods, notably dog foods for dogs and puppies that suffer from allergies caused by certain protein types in standard commercial dog food brands. The protein contents of the foods are split into peptides, which reduces the likelihood of an animal's immune system recognizing an allergic threat. Hydrolyzed protein diets for cats are often recommended for felines with food allergies and certain types of digestive issues. See also Acceptable daily intake Acid-hydrolyzed vegetable protein E number Food allergy Food intolerance Food labeling regulations Glutamic acid Monosodium glutamate Protein allergy
https://en.wikipedia.org/wiki/Beau%27s%20lines
Beau's lines are deep grooved lines that run from side to side on the fingernail or the toenail. They may look like indentations or ridges in the nail plate. This condition of the nail was named by a French physician, Joseph Honoré Simon Beau (1806–1865), who first described it in 1846. Signs and symptoms Beau's lines are horizontal, going across the nail, and should not be confused with vertical ridges going from the bottom (cuticle) of the nail out to the fingertip. These vertical lines are usually a natural consequence of aging and are harmless. Beau's lines should also be distinguished from Muehrcke's lines of the fingernails. While Beau's lines are actual ridges and indentations in the nail plate, Muehrcke lines are areas of hypopigmentation without palpable ridges; they affect the underlying nail bed, and not the nail itself. Beau's lines should also be distinguished from Mees' lines of the fingernails, which are areas of discoloration in the nail plate. As the nail grows out, the ridge in the nail can be seen to move upwards until it reaches the fingertip. When it reaches this point the fingertips can become sore for a few days as the nail bed is exposed by the misshapen nail. Causes There are several causes of Beau's lines. It is believed that there is a temporary cessation of cell division in the nail matrix. This may be caused by an infection or problem in the nail fold, where the nail begins to form, or it may be caused by an injury to that area. Some other reasons for these lines include trauma, coronary occlusion, hypocalcaemia, and skin disease. They may be a sign of systemic disease, or may also be caused by an illness of the body, as well as drugs used in chemotherapy, or malnutrition. Beau's lines can also be seen one to two months after the onset of fever in children with Kawasaki disease. Conditions also associated with Beau's lines include uncontrolled diabetes and peripheral vascular disease, as well as illnesses associated
https://en.wikipedia.org/wiki/Chloropicrin
Chloropicrin, also known as PS and nitrochloroform, is a chemical compound currently used as a broad-spectrum antimicrobial, fungicide, herbicide, insecticide, and nematicide. It was used as a poison gas in World War I. Its chemical structural formula is Cl3CNO2. Synthesis Chloropicrin was discovered in 1848 by Scottish chemist John Stenhouse. He prepared it by the reaction of sodium hypochlorite with picric acid: HOC6H2(NO2)3 + 11 NaOCl → 3 Cl3CNO2 + 3 Na2CO3 + 3 NaOH + 2 NaCl Because of the precursor used, Stenhouse named the compound chloropicrin, although the two compounds are structurally dissimilar. Today, chloropicrin is manufactured by the reaction of nitromethane with sodium hypochlorite: H3CNO2 + 3 NaOCl → Cl3CNO2 + 3 NaOH or by the reaction of chloroform with nitric acid: CHCl3 + HNO3 → CCl3NO2 + H2O Properties Chloropicrin's chemical formula is CCl3NO2 and its molecular weight is 164.38 grams/mole. Pure chloropicrin is a colorless liquid, with a boiling point of 112 °C. Chloropicrin is sparingly soluble in water with solubility of 2 g/L at 25 °C. It is volatile, with a vapor pressure of 23.2 millimeters of mercury (mmHg) at 25 °C; the corresponding Henry's law constant is 0.00251 atmosphere-cubic meter per mole. The octanol-water partition coefficient (Kow) of chloropicrin is estimated to be 269. Its soil adsorption coefficient (Koc; normalized to soil organic matter content) is 25 cm3/g. Use Chloropicrin was manufactured for use as poison gas in World War I. In agriculture, chloropicrin is injected into soil prior to planting a crop to fumigate soil. Chloropicrin affects a broad spectrum of fungi, microbes, and insects. It is commonly used as a stand-alone treatment or in combination / co-formulation with methyl bromide and 1,3-dichloropropene. Chloropicrin is used as an indicator and repellent when fumigating residences for insects with sulfuryl fluoride which is an odorless gas. Chloropicrin's mode of action is unknown (IRAC MoA 8B). Chloropicr
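As a sanity check, the three synthesis equations above can be verified to be atom-balanced with a few lines of bookkeeping (the atom counts per species are entered by hand from the formulas in the text):

```python
from collections import Counter

# Check that both sides of each equation above carry the same atoms.
def total(side):  # side: list of (coefficient, atom-count Counter)
    out = Counter()
    for coeff, atoms in side:
        for el, n in atoms.items():
            out[el] += coeff * n
    return out

picric       = Counter({"C": 6, "H": 3, "N": 3, "O": 7})   # HOC6H2(NO2)3
naocl        = Counter({"Na": 1, "O": 1, "Cl": 1})
chloropicrin = Counter({"C": 1, "Cl": 3, "N": 1, "O": 2})  # Cl3CNO2
na2co3       = Counter({"Na": 2, "C": 1, "O": 3})
naoh         = Counter({"Na": 1, "O": 1, "H": 1})
nacl         = Counter({"Na": 1, "Cl": 1})
nitromethane = Counter({"C": 1, "H": 3, "N": 1, "O": 2})
chloroform   = Counter({"C": 1, "H": 1, "Cl": 3})
hno3         = Counter({"H": 1, "N": 1, "O": 3})
h2o          = Counter({"H": 2, "O": 1})

# Stenhouse route: picric acid + 11 NaOCl
assert total([(1, picric), (11, naocl)]) == \
       total([(3, chloropicrin), (3, na2co3), (3, naoh), (2, nacl)])
# Nitromethane route: CH3NO2 + 3 NaOCl
assert total([(1, nitromethane), (3, naocl)]) == \
       total([(1, chloropicrin), (3, naoh)])
# Chloroform route: CHCl3 + HNO3
assert total([(1, chloroform), (1, hno3)]) == \
       total([(1, chloropicrin), (1, h2o)])
print("all three equations balance")
```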
https://en.wikipedia.org/wiki/Statistical%20time-division%20multiplexing
Statistical multiplexing is a type of communication link sharing, very similar to dynamic bandwidth allocation (DBA). In statistical multiplexing, a communication channel is divided into an arbitrary number of variable bitrate digital channels or data streams. The link sharing is adapted to the instantaneous traffic demands of the data streams that are transferred over each channel. This is an alternative to creating a fixed sharing of a link, such as in general time division multiplexing (TDM) and frequency division multiplexing (FDM). When performed correctly, statistical multiplexing can provide a link utilization improvement, called the statistical multiplexing gain. Statistical multiplexing is facilitated through packet mode or packet-oriented communication, which is utilized in packet-switched computer networks, among other applications. Each stream is divided into packets that normally are delivered asynchronously in a first-come first-served fashion. Alternatively, the packets may be delivered according to some scheduling discipline for fair queuing or differentiated and/or guaranteed quality of service. Statistical multiplexing of an analog channel, for example a wireless channel, is also facilitated through the following schemes: Random frequency-hopping orthogonal frequency division multiple access (RFH-OFDMA); Code-division multiple access (CDMA), where different amounts of spreading codes or spreading factors can be assigned to different users. Statistical multiplexing normally implies "on-demand" service rather than one that preallocates resources for each data stream. Statistical multiplexing schemes do not control user data transmissions. Comparison with static TDM Time domain statistical multiplexing (packet mode communication) is similar to time-division multiplexing (TDM), except that, rather than assigning a data stream to the same recurrent time slot in every TDM frame, each data stream is assigned time slots (of fixed length) or data fra
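The statistical multiplexing gain can be illustrated with a toy on/off traffic model (all numbers here are illustrative assumptions, not taken from the text):

```python
import random

# Toy model: N bursty sources, each sending at peak rate 1 unit while "on"
# with probability p in each time slot.  Static TDM must reserve the peak
# rate for every source; a statistically multiplexed link only needs enough
# capacity to cover the aggregate demand almost all of the time.
random.seed(1)
N, p, slots = 50, 0.2, 10_000
aggregate = [sum(1 for _ in range(N) if random.random() < p)
             for _ in range(slots)]

tdm_capacity = N * 1                           # one peak-rate slot per source
aggregate.sort()
stat_capacity = aggregate[int(0.999 * slots)]  # covers 99.9% of slots

gain = tdm_capacity / stat_capacity
print(f"TDM needs {tdm_capacity} units, statistical needs ~{stat_capacity} "
      f"(gain ≈ {gain:.1f}x)")
```

The gain comes from the sources rarely peaking simultaneously; the price is a small probability of congestion in the slots above the chosen percentile, which is why packet networks pair statistical multiplexing with queueing.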
https://en.wikipedia.org/wiki/Banks%E2%80%93Zaks%20fixed%20point
In quantum chromodynamics (and also N = 1 super quantum chromodynamics) with massless flavors, if the number of flavors, Nf, is sufficiently small (i.e. small enough to guarantee asymptotic freedom, depending on the number of colors), the theory can flow to an interacting conformal fixed point of the renormalization group. If the value of the coupling at that point is less than one (i.e. one can perform perturbation theory in weak coupling), then the fixed point is called a Banks–Zaks fixed point. The existence of the fixed point was first reported in 1974 by Belavin and Migdal and by Caswell, and later used by Banks and Zaks in their analysis of the phase structure of vector-like gauge theories with massless fermions. The name Caswell–Banks–Zaks fixed point is also used. More specifically, suppose that we find that the beta function of a theory up to two loops has the form β(α) = −b₀α² + b₁α³ + O(α⁴), where b₀ and b₁ are positive constants. Then there exists a value α* = b₀/b₁ such that β(α*) = 0. If we can arrange b₀ to be smaller than b₁, then we have α* < 1. It follows that when the theory flows to the IR it is a conformal, weakly coupled theory with coupling α*. For the case of a non-Abelian gauge theory with gauge group SU(Nc) and Dirac fermions in the fundamental representation of the gauge group for the flavored particles we have b₀ = (11Nc − 2Nf)/(6π), where Nc is the number of colors and Nf the number of flavors. Then Nf should lie just below 11Nc/2 in order for the Banks–Zaks fixed point to appear. Note that this fixed point only occurs if, in addition to the previous requirement on Nf (which guarantees asymptotic freedom), Nf > 34Nc³/(13Nc² − 3), where the lower bound comes from requiring b₁ > 0. This way b₁ remains positive while β is still negative at weak coupling (see the first equation above), and one can solve β(α*) = 0 with real solutions for α*. The coefficient b₁ was first correctly computed by Caswell, while the earlier paper by Belavin and Migdal has a wrong answer. See also Beta function
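The fixed-point coupling and the conformal-window bounds can be checked numerically. This sketch uses the standard one- and two-loop MS-bar coefficients β₀ and β₁ for SU(Nc) with Nf fundamental Dirac fermions (stated here as an assumption, since the text omits the explicit form of b₁); in that normalization the fixed-point coupling b₀/b₁ becomes α* = −4πβ₀/β₁.

```python
from math import pi

def beta_coeffs(Nc, Nf):
    """One- and two-loop MS-bar beta coefficients for SU(Nc) with Nf
    fundamental Dirac fermions. Convention:
    mu d(alpha)/d(mu) = -(beta0/2pi) alpha^2 - (beta1/8pi^2) alpha^3 + ..."""
    beta0 = 11 * Nc / 3 - 2 * Nf / 3
    beta1 = 34 * Nc**2 / 3 - Nf * (13 * Nc**2 - 3) / (3 * Nc)
    return beta0, beta1

def banks_zaks_alpha(Nc, Nf):
    """Two-loop fixed-point coupling alpha* = -4*pi*beta0/beta1,
    equal to b0/b1 in the notation used in the text."""
    beta0, beta1 = beta_coeffs(Nc, Nf)
    if beta0 <= 0 or beta1 >= 0:
        return None  # no weakly coupled IR fixed point in this Nf range
    return -4 * pi * beta0 / beta1

Nc = 3
lower = 34 * Nc**3 / (13 * Nc**2 - 3)  # b1 > 0 requires Nf above this
upper = 11 * Nc / 2                    # asymptotic freedom requires Nf below this
print(f"conformal window for Nc=3: {lower:.2f} < Nf < {upper:.1f}")
print(f"alpha* at Nf=16: {banks_zaks_alpha(3, 16):.4f}")
```

For Nc = 3 the window is roughly 8.05 < Nf < 16.5, and just below the upper edge (e.g. Nf = 16) the fixed-point coupling comes out at a few percent, small enough for perturbation theory, which is the whole point of the Banks–Zaks argument.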
https://en.wikipedia.org/wiki/Isotropic%20radiation
Isotropic radiation is radiation that has the same intensity regardless of the direction of measurement, such as would be found in a thermal cavity. This can be electromagnetic radiation, sound, or elementary particles.
https://en.wikipedia.org/wiki/Isotropic%20bands
In physiology, isotropic bands (better known as I bands) are the lighter bands of skeletal muscle cells (a.k.a. muscle fibers). Isotropic bands contain only actin-containing thin filaments. The darker bands are called anisotropic bands (A bands). Together the I bands and A bands produce the striated appearance of skeletal muscle. The name "isotropic" refers to the behavior of polarized light as it passes through I bands. Diagrams provide an indication of what I bands and A bands look like through a microscope.
https://en.wikipedia.org/wiki/Passive%20infrared%20sensor
A passive infrared sensor (PIR sensor) is an electronic sensor that measures infrared (IR) light radiating from objects in its field of view. They are most often used in PIR-based motion detectors. PIR sensors are commonly used in security alarms and automatic lighting applications. PIR sensors detect general movement, but do not give information on who or what moved. For that purpose, an imaging IR sensor is required. PIR sensors are commonly called simply "PIR", or sometimes "PID", for "passive infrared detector". The term passive refers to the fact that PIR devices do not radiate energy for detection purposes. They work entirely by detecting infrared radiation (radiant heat) emitted by or reflected from objects. Operating principles All objects with a temperature above absolute zero emit heat energy in the form of electromagnetic radiation. Usually this radiation isn't visible to the human eye because it radiates at infrared wavelengths, but it can be detected by electronic devices designed for such a purpose. PIR-based motion detector A PIR-based motion detector is used to sense movement of people, animals, or other objects. They are commonly used in burglar alarms and automatically activated lighting systems. Operation A PIR sensor can detect changes in the amount of infrared radiation impinging upon it, which varies depending on the temperature and surface characteristics of the objects in front of the sensor. When an object, such as a person, passes in front of the background, such as a wall, the temperature at that point in the sensor's field of view will rise from room temperature to body temperature, and then back again. The sensor converts the resulting change in the incoming infrared radiation into a change in the output voltage, and this triggers the detection. Objects of similar temperature but different surface characteristics may also have a different infrared emission pattern, and thus moving them with respect to the background may trigger detection as well.
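The change-triggered behaviour described above can be caricatured in a few lines. This is a toy numerical model, not a description of any real sensor's signal chain: actual PIR elements respond to the rate of change of incident IR through a pyroelectric crystal and analog filtering, and the readings and threshold below are invented for illustration.

```python
def pir_trigger(readings, threshold=0.5):
    """Toy PIR-style motion detector: flag a detection whenever the
    incoming IR signal changes by more than `threshold` between
    consecutive samples. A constant warm background never triggers,
    no matter how warm, because only *changes* matter."""
    events = []
    for i in range(1, len(readings)):
        if abs(readings[i] - readings[i - 1]) > threshold:
            events.append(i)
    return events

# static warm wall (no trigger), then a person crosses the field of view
signal = [20.0, 20.0, 20.1, 24.5, 24.4, 20.2, 20.1]
print(pir_trigger(signal))  # -> [3, 5]: one event entering, one leaving
```

Note that the model reproduces the key property from the text: a stationary object, however warm, produces no events, while an object moving across the field of view produces a change on entry and another on exit.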
https://en.wikipedia.org/wiki/Voluntary%20Human%20Extinction%20Movement
The Voluntary Human Extinction Movement (VHEMT) is an environmental movement that calls for all people to abstain from reproduction in order to cause the gradual voluntary extinction of humankind. VHEMT supports human extinction primarily because, in the group's view, it would prevent environmental degradation. The group states that a decrease in the human population would prevent a significant amount of human-caused suffering. The extinctions of non-human species and the scarcity of resources caused by humans are frequently cited by the group as evidence of the harm caused by human overpopulation. VHEMT was founded in 1991 by Les U. Knight, an American activist who became involved in the American environmental movement in the 1970s and thereafter concluded that human extinction was the best solution to the problems facing the Earth's biosphere and humanity. Knight publishes the group's newsletter and serves as its spokesman. Although the group is promoted by a website and represented at some environmental events, it relies heavily on coverage from outside media to spread its message. Many commentators view its platform as unacceptably extreme, while endorsing the logic of reducing the rate of human reproduction. In response to VHEMT, some journalists and academics have argued that humans can develop sustainable lifestyles or can reduce their population to sustainable levels. Others maintain that, whatever the merits of the idea, the human reproductive drive will prevent humankind from ever voluntarily seeking extinction. History The Voluntary Human Extinction Movement was founded by Les U. Knight, a graduate of Western Oregon University and high school substitute teacher living in Portland, Oregon. After becoming involved in the environmental movement as a college student in the 1970s, Knight attributed most of the dangers faced by the planet to human overpopulation. He joined the Zero Population Growth organization, and chose to be vasectomised at age 25.
https://en.wikipedia.org/wiki/Tuple%20space
A tuple space is an implementation of the associative memory paradigm for parallel/distributed computing. It provides a repository of tuples that can be accessed concurrently. As an illustrative example, consider a group of processors that produce pieces of data and a group of processors that use the data. Producers post their data as tuples in the space, and the consumers then retrieve data from the space that match a certain pattern. This is also known as the blackboard metaphor. Tuple space may be thought of as a form of distributed shared memory. Tuple spaces were the theoretical underpinning of the Linda language developed by David Gelernter and Nicholas Carriero at Yale University in 1986. Implementations of tuple spaces have also been developed for Java (JavaSpaces), Lisp, Lua, Prolog, Python, Ruby, Smalltalk, Tcl, and the .NET Framework. Object Spaces Object Spaces is a paradigm for the development of distributed computing applications. It is characterized by the existence of logical entities, called Object Spaces. All the participants of the distributed application share an Object Space. A provider of a service encapsulates the service as an Object and puts it in the Object Space. Clients of a service then access the Object Space, find out which object provides the needed service, and have the request serviced by the object. Object Spaces, as a computing paradigm, was put forward in the 1980s by David Gelernter at Yale University. Gelernter developed a language called Linda to support the concept of global object coordination. Object Space can be thought of as a virtual repository, shared amongst providers and accessors of network services, which are themselves abstracted as objects. Processes communicate among each other using these shared objects, updating the state of the objects as and when needed. An object, when deposited into a space, needs to be registered with an Object Directory in the Object Space.
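The producer/consumer pattern described above can be sketched with Linda's three core operations: out posts a tuple, in withdraws a matching tuple (blocking until one exists), and rd reads one non-destructively. The class and method names below are illustrative, not any standard API, and None is used as the pattern wildcard.

```python
import threading

class TupleSpace:
    """Minimal Linda-style tuple space for concurrent producers/consumers."""

    def __init__(self):
        self._tuples = []
        self._cond = threading.Condition()

    def out(self, tup):
        """Post a tuple into the space and wake any waiting consumers."""
        with self._cond:
            self._tuples.append(tup)
            self._cond.notify_all()

    def _match(self, pattern):
        for t in self._tuples:
            if len(t) == len(pattern) and all(
                    p is None or p == v for p, v in zip(pattern, t)):
                return t
        return None

    def in_(self, pattern):
        """Withdraw a matching tuple, blocking until one is available."""
        with self._cond:
            while (t := self._match(pattern)) is None:
                self._cond.wait()
            self._tuples.remove(t)
            return t

    def rd(self, pattern):
        """Read a matching tuple without removing it, blocking if none."""
        with self._cond:
            while (t := self._match(pattern)) is None:
                self._cond.wait()
            return t

space = TupleSpace()
space.out(("job", 1, "payload-a"))
space.out(("job", 2, "payload-b"))
print(space.rd(("job", 2, None)))    # peek without removing
print(space.in_(("job", None, None)))  # withdraw first matching tuple
```

Because matching is associative (by content, not by address) and in_ blocks until a tuple appears, producers and consumers are fully decoupled in both space and time, which is the property that makes the model a form of distributed shared memory.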
https://en.wikipedia.org/wiki/NetStumbler
NetStumbler (also known as Network Stumbler) was a tool for Windows that facilitates detection of Wireless LANs using the 802.11b, 802.11a and 802.11g WLAN standards. It runs on Microsoft Windows operating systems from Windows 2000 to Windows XP. A trimmed-down version called MiniStumbler is available for the handheld Windows CE operating system. NetStumbler has become one of the most popular programs for wardriving and wireless reconnaissance, although it has a disadvantage: it can be detected easily by most intrusion detection systems, because it actively probes a network to collect information. NetStumbler has integrated support for a GPS unit. With this support, NetStumbler displays GPS coordinate information next to the information about each discovered network, which can be useful for finding specific networks again after having sorted out collected data. The program is commonly used for: Verifying network configurations Finding locations with poor coverage in a WLAN Detecting causes of wireless interference Detecting unauthorized ("rogue") access points Aiming directional antennas for long-haul WLAN links No updated version has been developed since 2004. See also InSSIDer was created as an alternative to Network Stumbler for current generations of Windows operating systems Kismet for Linux, FreeBSD, NetBSD, OpenBSD, and Mac OS X KisMAC for Mac OS X
https://en.wikipedia.org/wiki/Gluon%20condensate
In quantum chromodynamics (QCD), the gluon condensate is a non-perturbative property of the QCD vacuum which could be partly responsible for giving masses to light mesons. If the gluon field tensor is represented as Gμν, then the gluon condensate is the vacuum expectation value ⟨GμνGμν⟩. It is not clear yet whether this condensate is related to any of the known phase changes in quark matter. There have been scattered studies of other types of gluon condensates, involving a different number of gluon fields. For more on the context in which this quantity occurs, see the article on the QCD vacuum. See also Quantum chromodynamics QCD vacuum and chiral condensates Vacuum in quantum field theory Quark–gluon plasma QCD matter
https://en.wikipedia.org/wiki/Allen%20Telescope%20Array
The Allen Telescope Array (ATA), formerly known as the One Hectare Telescope (1hT), is a radio telescope array dedicated to astronomical observations and a simultaneous search for extraterrestrial intelligence (SETI). The array is situated at the Hat Creek Radio Observatory in Shasta County, northeast of San Francisco, California. The project was originally developed as a joint effort between the SETI Institute and the Radio Astronomy Laboratory (RAL) at the University of California, Berkeley (UC Berkeley), with funds obtained from an initial $12.5 million donation by the Paul G. Allen Family Foundation and Nathan Myhrvold. The first phase of construction was completed and the ATA finally became operational on 11 October 2007 with 42 antennas (ATA-42), after Paul Allen (co-founder of Microsoft) had pledged an additional $13.5 million to support the construction of the first and second phases. Although overall Allen has contributed more than $30 million to the project, it has not succeeded in building the 350 6.1 m (20 ft) dishes originally conceived, and the project suffered an operational hiatus due to funding shortfalls between April and August 2011, after which observations resumed. Subsequently, UC Berkeley exited the project, completing divestment in April 2012. The facility is now managed by SRI International (formerly Stanford Research Institute), an independent, nonprofit research institute. As of 2016, the SETI Institute performs observations with the ATA between the hours of 6 pm and 6 am daily. In August 2014, the installation was threatened by a forest fire in the area and was briefly forced to shut down, but ultimately emerged largely unscathed. Overview First conceived by SETI pioneer Frank Drake, the idea has been a dream of the SETI Institute for years. However, it was not until early 2001 that research and development began, after a donation of $11.5 million by the Paul G. Allen Family Foundation. In March 2004, following the successful comple
https://en.wikipedia.org/wiki/Bjerrum%20length
The Bjerrum length (after Danish chemist Niels Bjerrum, 1879–1958) is the separation at which the electrostatic interaction between two elementary charges is comparable in magnitude to the thermal energy scale, kBT, where kB is the Boltzmann constant and T is the absolute temperature in kelvins. This length scale arises naturally in discussions of electrostatic, electrodynamic and electrokinetic phenomena in electrolytes, polyelectrolytes and colloidal dispersions. In standard units, the Bjerrum length is given by λB = e²/(4πε₀εr kBT), where e is the elementary charge, εr is the relative dielectric constant of the medium and ε₀ is the vacuum permittivity. For water at room temperature (T ≈ 300 K), εr ≈ 80, so that λB ≈ 0.7 nm. In Gaussian units, 4πε₀ = 1 and the Bjerrum length has the simpler form λB = e²/(εr kBT). The relative permittivity εr of water decreases so strongly with temperature that the product (εr·T) decreases. Therefore, in spite of the (1/T) relation, the Bjerrum length λB increases with temperature, as shown in the graph. See also Debye length Debye–Hückel equation Shielding effect Screening effect Electrical double layer, (EDL) Brownian motion
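The SI formula is straightforward to evaluate numerically. The constants below are CODATA SI values, and εr ≈ 80, T ≈ 300 K are the usual round figures for water near room temperature.

```python
from math import pi

# CODATA SI constants
e = 1.602176634e-19      # elementary charge, C
eps0 = 8.8541878128e-12  # vacuum permittivity, F/m
kB = 1.380649e-23        # Boltzmann constant, J/K

def bjerrum_length(eps_r, T):
    """Bjerrum length in metres: lambda_B = e^2 / (4*pi*eps0*eps_r*kB*T)."""
    return e**2 / (4 * pi * eps0 * eps_r * kB * T)

lB = bjerrum_length(80.0, 300.0)  # water near room temperature
print(f"lambda_B = {lB * 1e9:.2f} nm")
```

At fixed εr the formula decreases as 1/T; the increase of λB with temperature in real water, noted above, enters only through the strong temperature dependence of εr, which this sketch treats as a constant input.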
https://en.wikipedia.org/wiki/Red%20edge
Red edge refers to the region of rapid change in the reflectance of vegetation in the near-infrared range of the electromagnetic spectrum. Chlorophyll contained in vegetation absorbs most of the light in the visible part of the spectrum but becomes almost transparent at wavelengths greater than 700 nm. The cellular structure of the vegetation then causes this infrared light to be reflected, because each cell acts something like an elementary corner reflector. The change can be from 5% to 50% reflectance going from 680 nm to 730 nm. This is an advantage to plants in avoiding overheating during photosynthesis. For a more detailed explanation and a graph of the spectral region involved, see the article on photosynthetically active radiation (PAR). The phenomenon accounts for the brightness of foliage in infrared photography and is extensively utilized in the form of so-called vegetation indices (e.g. the normalized difference vegetation index). It is used in remote sensing to monitor plant activity, and it has been suggested that it could be useful to detect light-harvesting organisms on distant planets.
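The vegetation indices mentioned above exploit exactly this reflectance jump. A minimal sketch of the NDVI calculation follows; the reflectance values are illustrative, taken from the 5%-to-50% range quoted above for the red and near-infrared sides of the edge.

```python
def ndvi(red, nir):
    """Normalized difference vegetation index from red and near-infrared
    reflectances (each in 0..1). Healthy vegetation has low red and high
    NIR reflectance, pushing NDVI toward 1; bare surfaces sit near 0."""
    return (nir - red) / (nir + red)

# reflectance jump across the red edge, roughly as described above:
# ~5% at 680 nm (red) rising to ~50% at 730 nm (NIR) for green vegetation
print(ndvi(0.05, 0.50))  # dense green vegetation: NDVI close to 1
print(ndvi(0.30, 0.35))  # bare soil / sparse cover: NDVI near 0
```

Normalizing by the sum makes the index largely insensitive to overall illumination, which is why NDVI computed from satellite red and NIR bands is usable across scenes and seasons.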