https://en.wikipedia.org/wiki/Diastasis%20%28physiology%29
In physiology, diastasis is the middle stage of diastole during the cycle of a heartbeat, in which the initial passive filling of the heart's ventricles has slowed but the atria have not yet contracted to complete the active filling. Diastasis is the longest phase of the cardiac cycle. See also Diastasis (pathology)
https://en.wikipedia.org/wiki/RNA%20spike-in
An RNA spike-in is an RNA transcript of known sequence and quantity used to calibrate measurements in RNA hybridization assays, such as DNA microarray experiments, RT-qPCR, and RNA-Seq. A spike-in is designed to bind to a DNA molecule with a matching sequence, known as a control probe. This process of specific binding is called hybridization. A known quantity of RNA spike-in is mixed with the experiment sample during preparation. The degree of hybridization between the spike-ins and the control probes is used to normalize the hybridization measurements of the sample RNA. History Nucleic acid hybridization assays have been used for decades to detect specific sequences of DNA or RNA, with a DNA microarray precursor used as early as 1965. In such assays, positive control oligonucleotides are necessary to provide a standard for comparison of target sequence concentration, and to check and correct for nonspecific binding; that is, incidental binding of the RNA to non-complementary DNA sequences. These controls became known as "spike-ins". With the advent of DNA microarray chips in the 1990s and the commercialization of high-throughput methods for sequencing and RNA detection assays, manufacturers of hybridization assay "kits" started to provide pre-developed spike-ins. In the case of gene expression assay microarrays or RNA sequencing (RNA-seq), RNA spike-ins are used. Manufacturing RNA spike-ins can be synthesized by any means of creating RNA synthetically, or by using cells to transcribe DNA to RNA in vivo (in cells). RNA can be produced in vitro (cell free) using RNA polymerase and DNA with the desired sequence. Large scale biotech manufacturers produce RNA synthetically via high-throughput techniques and provide solutions of RNA spike-ins at predetermined concentration. Bacteria containing DNA (usually on plasmids) for transcription to spike-ins are also commercially available. The purified RNA can be stored long-term in a buffered solution at low temperature.
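As a sketch of the normalization step described above, the following Python snippet estimates a single global scale factor from spike-ins of known input quantity and applies it to sample measurements. All names and numbers are hypothetical; real workflows fit a calibration curve across many spike-ins rather than averaging a single factor.

```python
# Hypothetical spike-in normalization: the measured signal for each
# spike-in of known input amount defines a signal-per-amount factor;
# the factors are averaged into one global scale applied to the sample.

known_amounts = {"spike_A": 100.0, "spike_B": 50.0, "spike_C": 10.0}       # attomoles added (assumed)
measured_signal = {"spike_A": 2100.0, "spike_B": 980.0, "spike_C": 230.0}  # e.g. read counts (assumed)

# Per-spike signal per unit of input RNA, averaged into one factor.
factors = [measured_signal[s] / known_amounts[s] for s in known_amounts]
scale = sum(factors) / len(factors)

sample_signal = {"gene_X": 4200.0, "gene_Y": 150.0}   # hypothetical sample measurements
normalized = {g: v / scale for g, v in sample_signal.items()}
print(normalized)
```

The normalized values are then comparable across samples prepared with the same spike-in mix, since the scale factor absorbs sample-to-sample differences in hybridization or sequencing efficiency.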
https://en.wikipedia.org/wiki/Marjorie%20Guthrie
Marjorie Guthrie (née Greenblatt; October 6, 1917 – March 13, 1983), who used Marjorie Mazia as her professional name, was a dancer, dance teacher, and health science activist. She was married to folk musician Woody Guthrie. Her children with him include folk musician Arlo Guthrie and Woody Guthrie Publications president Nora Guthrie. She was a principal dancer with the Martha Graham Company. With Graham's permission, she started her own dance studio where she taught Graham methods and style. Due to her husband's affliction with Huntington's disease, she became an activist, founding a predecessor of the Huntington's Disease Society of America. Life and work Marjorie Greenblatt was born in Atlantic City, New Jersey, United States, on October 6, 1917 to Aliza Waitzman and Izadore Greenblatt. Her parents were Jewish immigrants from Ukraine. She had three brothers (David, Herbert and Bernard) and one sister, Gertrude. In 1935, after graduating from Overbrook High School in Philadelphia, Pennsylvania, Marjorie moved to New York City on a scholarship and joined the Martha Graham Dance Company. As a core company member, she appeared in such iconic pieces as "Primitive Mysteries", "American Document", "Every Soul is a Circus", and "Appalachian Spring". She served as Graham's assistant for fifteen years and was the first company member invited to teach the Graham technique independently of Martha's own school. Two of her early students were Erick Hawkins and Merce Cunningham. Woody Guthrie Mazia was introduced to Guthrie in 1940 through her activity as a Martha Graham dancer. According to the Marjorie Guthrie Project: Mazia and Guthrie wed on November 13, 1945. Together they had four children: Cathy Guthrie (1943–1947), Arlo Guthrie (b. 1947), Joady Guthrie (b. 1948), and Nora Guthrie (b. 1950). Cathy died at age four in a fire. Marjorie Mazia School of Dance Mazia founded the Marjorie Mazia School of Dance, located at 1618 Sheepshea
https://en.wikipedia.org/wiki/History%20of%20chiropractic
The history of chiropractic began in 1895 when Daniel David Palmer of Iowa performed the first chiropractic adjustment on a partially deaf janitor, Harvey Lillard. Palmer claimed to have had the principles of chiropractic treatment passed along to him during a séance by a long-dead physician, Dr. Jim Atkinson. While Lillard was working without his shirt on in Palmer's office, he bent over to empty the trash can. Palmer noticed that Lillard had a vertebra out of position. He asked Lillard what had happened, and Lillard replied, "I moved the wrong way, and I heard a 'pop' in my back, and that's when I lost my hearing." Palmer, who was also involved in many other natural healing philosophies, had Lillard lie face down on the floor and proceeded with the adjustment. The next day, Lillard told Palmer, "I can hear that racket on the streets." This experience led Palmer to open a school of chiropractic two years later. Rev. Samuel H. Weed coined the word "chiropractic" by combining the Greek words cheiro (hand) and praktikos (doing or action). Chiropractic's early philosophy was rooted in vitalism, naturalism, magnetism, spiritualism and other constructs that are not amenable to the scientific method, although Palmer tried to merge science and metaphysics. In 1896, Palmer's first descriptions and underlying philosophy of chiropractic echoed Andrew Still's principles of osteopathy, established a decade earlier. Both described the body as a "machine" whose parts could be manipulated to produce a drugless cure. Both professed the use of spinal manipulation on joint dysfunction/subluxation to improve health. Palmer distinguished his work by noting that he was the first to use short-lever HVLA (high velocity, low amplitude) joint manipulation techniques using the spinous and transverse processes as mechanical levers. He described the effects of chiropractic spinal manipulation as being mediated primarily by the nervous system. Despite the similarities between chiropracti
https://en.wikipedia.org/wiki/Dan-Virgil%20Voiculescu
Dan-Virgil Voiculescu (born 14 June 1949) is a Romanian professor of mathematics at the University of California, Berkeley. He has worked in single operator theory, operator K-theory and von Neumann algebras. More recently, he developed free probability theory. Education and career Voiculescu studied at the University of Bucharest, receiving his PhD in 1977 under the direction of Ciprian Foias. He was an assistant at the University of Bucharest (1972–1973), a researcher at the Institute of Mathematics of the Romanian Academy (1973–1975), and a researcher at INCREST (1975–1986). He came to Berkeley in 1986 for the International Congress of Mathematicians, and stayed on as visiting professor. Voiculescu was appointed professor at Berkeley in 1987. Awards and honors He received the 2004 NAS Award in Mathematics from the National Academy of Sciences (NAS) for “the theory of free probability, in particular, using random matrices and a new concept of entropy to solve several hitherto intractable problems in von Neumann algebras.” Voiculescu was elected to the National Academy of Sciences in 2006. In 2012 he became a fellow of the American Mathematical Society.
https://en.wikipedia.org/wiki/Proxim%20Wireless
Proxim Wireless Corporation is a San Jose, California-based company that builds scalable broadband wireless networking systems for communities, enterprises, governments, and service providers. It offers wireless LAN, point-to-multipoint and point-to-point products through a channel network. The company is a product of many mergers and acquisitions over the years. History Proxim Corporation was founded in 1984, initially headquartered in Mountain View, California. Starting in 1989, it began to develop radio frequency modules using spread spectrum technology. Its first commercial product, in 1990, used 900 MHz frequency bands. End-user products called RangeLAN were introduced in 1991 and 1992, reflecting their use as network interface controllers for local area networks (LANs). The first adapter used the Industry Standard Architecture bus and supported Novell NetWare. An initial public offering (IPO) on December 15, 1993 listed the company shares on NASDAQ under the symbol PROX. In 1994 the RangeLAN2 products started using 2.4 GHz bands, and in 1995 the RangeLINK product line was introduced. Proxim was a founding member of the Wireless LAN Interoperability Forum in May 1996, formed to promote interoperability between its RangeLAN2 and other wireless technologies. A secondary offering expected to raise an estimated $95 million in June 1996 was delayed and reduced to about $41 million by late July 1996. A product line called Symphony was introduced in 1998. Proxim was also a core member of the HomeRF Working Group, formed in 1997. The group disbanded at the end of 2002. In January 2000, Proxim announced it had acquired privately held Micrilor of Wakefield, Massachusetts a year earlier. In January 2001 Proxim (then headquartered in Sunnyvale, California) announced it would acquire Netopia Incorporated (listed as symbol NTPA) for approximately $223 million in stock. However, that merger fell apart after Intel announced it would stop developing HomeRF technology in March 2001. Wester
https://en.wikipedia.org/wiki/Cerent%20Corporation
Cerent Corporation was an optical equipment maker based in Petaluma, California. It was founded in 1997 as Fiberlane Communications with funding from Kleiner Perkins Caufield & Byers, with Vinod Khosla as the managing VC. The company was founded with three divisions: Systems in Petaluma, Chip Design in Mountain View, California, and Network Management Systems in Burnaby, British Columbia. In early 1998 the company split into two companies, with the Petaluma branch becoming Cerent and the Burnaby and Mountain View branches becoming Siara Systems (acquired by Redback Networks in 1999). The Cerent 454 Cerent's first product was the Cerent 454 (later the Cisco 15454). The Cerent 454 was a second-generation SONET ADM (Add-Drop Multiplexer) that also supported TCP/IP data switching. When operating as a pure ADM, the 454 could add and drop circuits from OC-192 down to Digital Signal 1 (DS1); later it would support wavelength-division multiplexing (WDM). Unlike the ADMs that preceded it, a transport signal did not have to be terminated outside the box to switch or route TCP/IP packets. "Data cards" could be inserted into the chassis to terminate the circuits and then switch or route the packets between those terminated circuits. This capability meant carriers no longer had to purchase two boxes (e.g. an ADM and a router) just to move TCP/IP packets around a telecom network. Other advantages of the Cerent 454 included a smaller form factor, higher port density, greater chip integration, and lower power consumption than competitors at the time. The unit was also one of the first network elements to use TCP/IP and a web server on its management interface (the first TCP/IP management network was Ditech Communications' DWDM system, marketed in 1996), meaning it could be managed over a standard TCP/IP network as opposed to the more restrictive OSI network interface that was the standard in telecom networks at the time. This decision, while initially contr
https://en.wikipedia.org/wiki/MICAD
The Molecular Imaging and Contrast Agent Database or MICAD is a freely accessible online source of information on in vivo molecular imaging agents. It was established as a key component of the "Molecular Libraries and Imaging" program of the NIH Roadmap, a set of major inter-agency initiatives accelerating medical research and the development of new, more specific therapies for a wide range of diseases. Content MICAD includes agents developed for imaging modalities such as positron emission tomography (PET), single photon emission computed tomography (SPECT), magnetic resonance imaging (MRI), ultrasound, computed tomography, optical imaging, and planar gamma imaging. It contains textual information, references, and numerous links to MEDLINE and other relevant resources from the National Center for Biotechnology Information (NCBI). Process MICAD is edited by a team of scientific editors and curators at the National Library of Medicine, NIH. It is being developed under the guidance of a trans-NIH panel of experts in the field. Members of the imaging community are invited to contribute to the MICAD database by writing and submitting entries (chapters) on agents of their choice for online publication. The MICAD staff will work with individual guest authors to prepare the chapters. Interested members of the imaging community should contact the MICAD staff at micad@ncbi.nlm.nih.gov.
https://en.wikipedia.org/wiki/The%20Sun%20and%20the%20Rain
"The Sun and the Rain" is a single by Madness. It was released in 1983 as a stand-alone single and in 1984 it was included on the American/Canadian version of their album Keep Moving. The single spent 10 weeks on the UK Singles Chart, peaking at number 5. The song was also their last to ever enter the USA Billboard Hot 100 Charts, peaking as high as No. 72 on that following chart in 1984. "The Sun and the Rain" was the last Madness single written solely by Mike Barson until 2009's "Sugar and Spice". It was also the last original release of theirs to reach the UK top 10 until "Lovestruck" in 1999. Music video The music video for the song shows Madness performing in a rainy street, with a couple of references to Christmastime and the holiday seasons. Toward the end they are joined by a number of Madness fans who join in the dancing. An introductory scene shows the band entering 'Holts' shoe shop in Camden Town, since renamed British Boot Company. There are also scenes showing the band dressed in red with umbrella hats, supposedly wreaking havoc inside Suggs' ear, and shots of Lee Thompson running around with a rocket strapped to his back, a reference to the single's b-side. Cover painting The cover is a detail from the painting The Storm by the French artist Narcisse Virgilio Díaz de la Peña. Painted in 1871, it can be seen in the National Gallery in London. Critical reception Upon its release, Peter Martin of Smash Hits praised "The Sun and the Rain" as "a belter" and commented, "The song trundles along merrily, carried by a jaunty pub piano that gives the song a slightly off-beat, light-hearted feel. There's also a touch of a Beatles-ish string section thrown in for good measure." Robin Smith of Record Mirror noted that it "boasts some particularly plaintive vocals and a neat shuffling back-up" and felt it is "a shade more traditional than some of their previous works". Debbi Voller of Number One described it as "the Madness we've all come to love" and
https://en.wikipedia.org/wiki/Disgregation
In the history of thermodynamics, disgregation is an early formulation of the concept of entropy. It was defined in 1862 by Rudolf Clausius as the magnitude of the degree to which the molecules of a body are separated from each other. Disgregation was the stepping stone for Clausius to create the mathematical expression for the second law of thermodynamics. Clausius modeled the concept on certain passages in French physicist Sadi Carnot's 1824 paper On the Motive Power of Fire, which characterized the transformations of the working substances (particles of a thermodynamic system) of an engine cycle, namely their "mode of aggregation". The concept was later extended by Clausius in 1865 in the formulation of entropy, and in Ludwig Boltzmann's developments of the 1870s, which described the diversities of the motions of the microscopic constituents of matter in terms of order and disorder. In 1949, Edward Armand Guggenheim developed the concept of energy dispersal. The terms disgregation and dispersal are close in meaning. Historical context In 1824, French physicist Sadi Carnot assumed that heat, like a substance, cannot be diminished in quantity and cannot increase. Specifically, he states of a complete engine cycle ‘that when a body has experienced any changes, and when after a certain number of transformations it returns to precisely its original state, that is, to that state considered in respect to density, to temperature, to mode of aggregation, let us suppose, I say that this body is found to contain the same quantity of heat that it contained at first, or else that the quantities of heat absorbed or set free in these different transformations are exactly compensated.’ Furthermore, he states that ‘this fact has never been called into question’ and ‘to deny this would overthrow the whole theory of heat to which it serves as a basis.’ This famous sentence, which Carnot spent fifteen years thinking about, marks the start of thermodynamics and signals the slow tra
https://en.wikipedia.org/wiki/Trillium%20Model
The Trillium Model, created by a collaborative team from Bell Canada, Northern Telecom and Bell Northern Research (Northern Telecom and Bell Northern Research later merged into Nortel Networks), combines requirements from the ISO 9000 series, the Capability Maturity Model (CMM) for software, and the Baldrige Criteria for Performance Excellence, with software quality standards from the IEEE. Trillium has a telecommunications orientation and provides customer focus. The practices in the Trillium Model are derived from a benchmarking exercise which focused on all practices that would contribute to an organization's product development and support capability. The Trillium Model covers all aspects of the software development life-cycle, most system and product development and support activities, and a significant number of related marketing activities. Many of the practices described in the model can be applied directly to hardware development. Objectives The Trillium Model has been developed from a customer perspective, as perceived in a competitive, commercial environment. The Model is used in a variety of ways: In benchmarking an organization's product development and support process capability against best practices in the industry, In self-assessment mode, to help identify opportunities for improvement within a product development organization, and In pre-contractual negotiations, to assist in selecting a supplier. This Model and its accompanying tools are not in themselves a product development process or life-cycle model. Rather, the Trillium Model provides key industry best practices which can be used to improve an existing process or life-cycle. Scale The Trillium scale spans levels 1 through 5. Levels can be characterized in the following way: Unstructured: The development process is ad hoc. Projects often cannot meet quality or schedule targets. Success, while possible, is based on individuals rather than on organizational infrastructure. (Risk –
https://en.wikipedia.org/wiki/Test%20effort
In software development, test effort refers to the expenses for tests that are still to come. It is related to test costs and failure costs (direct costs, indirect costs, and costs for fault correction). Factors that influence test effort include: maturity of the software development process, quality and testability of the test object, test infrastructure, skills of staff members, quality goals and test strategy. Methods for estimation of the test effort Analysing all factors is difficult, because most of the factors influence each other. The following approaches can be used for the estimation: top-down estimation and bottom-up estimation. The top-down techniques are formula based and relative to the expenses for development: Function Point Analysis (FPA) and Test Point Analysis (TPA), amongst others. Bottom-up techniques are based on detailed information and often involve experts. The following techniques belong here: Work Breakdown Structure (WBS) and Wideband Delphi (WBD). We can also use the following techniques for estimating the test effort: Conversion of software size into person hours of effort directly using a conversion factor; for example, we assign 2 person hours of testing effort per Function Point of software size, 4 person hours of testing effort per use case point, or 3 person hours of testing effort per Software Size Unit. Conversion of software size into testing project size, such as Test Points or Software Test Units, using a conversion factor, and then conversion of testing project size into effort. Computing testing project size using Test Points or Software Test Units. The methodology for deriving the testing project size in Test Points is not well documented; however, the methodology for deriving Software Test Units is defined in a paper by Murali. We can also derive software testing project size and effort using the Delphi technique or analogy-based estimation. Test efforts from literature In the literature, test efforts relative to total costs are
https://en.wikipedia.org/wiki/Amino%20acid%20synthesis
Amino acid synthesis is the set of biochemical processes (metabolic pathways) by which amino acids are produced. The substrates for these processes are various compounds in the organism's diet or growth media. Not all organisms are able to synthesize all amino acids; for example, humans can synthesize 11 of the 20 standard amino acids. These 11 are called the non-essential amino acids. α-Ketoglutarates: glutamate, glutamine, proline, arginine Most amino acids are synthesized from α-ketoacids by transamination from another amino acid, usually glutamate. The enzyme involved in this reaction is an aminotransferase. α-ketoacid + glutamate ⇄ amino acid + α-ketoglutarate Glutamate itself is formed by amination of α-ketoglutarate: α-ketoglutarate + NH4+ ⇄ glutamate The α-ketoglutarate family of amino acid synthesis (synthesis of glutamate, glutamine, proline and arginine) begins with α-ketoglutarate, an intermediate in the citric acid cycle. The concentration of α-ketoglutarate depends on the activity and metabolism within the cell, along with the regulation of enzymatic activity. In E. coli, citrate synthase, the enzyme that catalyzes the condensation reaction initiating the citric acid cycle, is strongly inhibited by α-ketoglutarate feedback inhibition and can be inhibited by DPNH as well as by high concentrations of ATP. This is one of the initial regulations of the α-ketoglutarate family of amino acid synthesis. The regulation of the synthesis of glutamate from α-ketoglutarate is subject to regulatory control of the citric acid cycle as well as to mass action dependent on the concentrations of the reactants involved, due to the reversible nature of the transamination and glutamate dehydrogenase reactions. The conversion of glutamate to glutamine is regulated by glutamine synthetase (GS) and is a key step in nitrogen metabolism. This enzyme is regulated by at least four different mechanisms: 1. Repression and derepression due to nitrogen levels; 2. Activation and inact
https://en.wikipedia.org/wiki/Generalized%20Maxwell%20model
The generalized Maxwell model, also known as the Maxwell–Wiechert model (after James Clerk Maxwell and E. Wiechert), is the most general form of the linear model for viscoelasticity. In this model several Maxwell elements are assembled in parallel. It takes into account that the relaxation does not occur at a single time, but over a set of times. Due to the presence of molecular segments of different lengths, with shorter ones contributing less than longer ones, there is a varying time distribution. The Wiechert model shows this by having as many spring–dashpot Maxwell elements as are necessary to accurately represent the distribution. The figure on the right shows the generalised Wiechert model. General model form Solids Given N Maxwell elements with moduli E_i, viscosities η_i, and relaxation times τ_i = η_i/E_i, placed in parallel with an equilibrium spring of modulus E_∞, the relaxation modulus for solids can be written as the Prony series E(t) = E_∞ + Σ_{i=1..N} E_i exp(−t/τ_i). Example: standard linear solid model Following the above model with N = 1 element yields the standard linear solid model: E(t) = E_∞ + E_1 exp(−t/τ_1). Fluids Given N Maxwell elements with moduli E_i, viscosities η_i, and relaxation times τ_i = η_i/E_i and no equilibrium spring (E_∞ = 0), the relaxation modulus for fluids is E(t) = Σ_{i=1..N} E_i exp(−t/τ_i), so the stress relaxes completely. Example: three parameter fluid The analogous model to the standard linear solid model is the three parameter fluid, also known as the Jeffreys model.
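The parallel assembly of Maxwell elements can be sketched numerically as a Prony series for the relaxation modulus E(t) = E_inf + Σ E_i·exp(−t/τ_i); the parameter values below are illustrative, with E_inf = 0 recovering the fluid case.

```python
import math

# Wiechert (generalized Maxwell) relaxation modulus as a Prony series:
# an equilibrium spring E_inf in parallel with Maxwell elements (E_i, tau_i),
# where tau_i = eta_i / E_i. All parameter values are illustrative.

def relaxation_modulus(t, E_inf, elements):
    """E(t) = E_inf + sum_i E_i * exp(-t / tau_i)."""
    return E_inf + sum(E_i * math.exp(-t / tau_i) for E_i, tau_i in elements)

# Standard linear solid: a single Maxwell element in parallel with a spring.
elements = [(2.0, 1.5)]                        # (E_1, tau_1)
print(relaxation_modulus(0.0, 1.0, elements))  # instantaneous modulus E_inf + E_1 = 3.0
```

At t = 0 every exponential equals 1, giving the instantaneous modulus; as t grows the Maxwell terms decay and E(t) approaches E_inf (zero for a fluid), which is exactly the "relaxation over a set of times" the model is built to capture.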
https://en.wikipedia.org/wiki/Management%20of%20multiple%20sclerosis
Multiple sclerosis (MS) is a chronic inflammatory demyelinating disease that affects the central nervous system (CNS). Several therapies for it exist, although there is no known cure. The most common initial course of the disease is the relapsing-remitting subtype, which is characterized by unpredictable attacks (relapses) followed by periods of relative remission with no new signs of disease activity. After some years, many of the people who have this subtype begin to experience neurologic decline without acute relapses. When this happens it is called secondary progressive multiple sclerosis. Other, less common, courses of the disease are the primary progressive (decline from the beginning without attacks) and the progressive-relapsing (steady neurologic decline with superimposed attacks). Different therapies are used for patients experiencing acute attacks, for patients who have the relapsing-remitting subtype, for patients who have the progressive subtypes, for patients without a diagnosis of MS who have a demyelinating event, and for managing the various consequences of MS. The primary aims of therapy are returning function after an attack, preventing new attacks, and preventing disability. As with any medical treatment, medications used in the management of MS may have several adverse effects, and many possible therapies are still under investigation. At the same time, many people pursue alternative treatments, despite the limited supporting evidence from comparable, replicated scientific studies. Stem cell therapy is being studied. This article focuses on therapies for standard MS; borderline forms of MS have particular treatments that are excluded. Acute attacks Administration of high doses of intravenous corticosteroids, such as methylprednisolone, is the routine therapy for acute relapses. This is administered over a period of three to five days, and has a well-established efficacy in promoting a faster recovery from disability afte
https://en.wikipedia.org/wiki/Brace%27s%20emerald
Brace's emerald (Riccordia bracei) is an extinct species of hummingbird which was endemic to New Providence, the main island of the Bahamas. Description Its size was 9.5 cm, the wing length 11.4 cm and the length of the tail 2.7 cm. The black bill was slightly curved and conically pointed. The feet were black. The back exhibited a bronze-green hue with a golden gleam. The head was coloured similarly to the back, but without the golden gloss. Directly behind the eyes was a white spot. The throat gleamed in magnificent blue-green hues. The abdomen had green feathers with ash-grey tips. The wings exhibited a purplish hue. The rectrices were greenish. The crissum (the undertail covert surrounding the cloacal opening) was grey with a faint cinnamon hue at the edges. Status and extinction For more than a hundred years, Brace's emerald was known only from the type specimen, a single male shot by bird collector Lewis J. K. Brace on July 13, 1877, around three miles (4.8 kilometres) from Nassau on the island of New Providence. The skin (which is heavily damaged at the throat) is now at the Smithsonian Institution in Washington, D.C. The species was long ignored by ornithological authorities. In 1880 it was listed without commentary as a synonym of the Cuban emerald (Riccordia ricordii). Not until the 1930s was the unique status of the holotype even recognized, as it was seen as an aberrant specimen of the Cuban emerald that had become a vagrant to New Providence. American ornithologist James Bond was the first to discuss the differences between R. ricordii and R. bracei. In 1945, he split R. ricordii and regarded R. ricordii bracei as a new subspecies. In contrast to the Cuban species, the specimen from New Providence was smaller, had a longer bill and a different plumage. In 1982, palaeornithologists William Hilgartner and Storrs Olson discovered fossil remains of three hummingbird species from the Pleistocene in the deposits i
https://en.wikipedia.org/wiki/Movable%20singularity
In the theory of ordinary differential equations, a movable singularity is a point where the solution of the equation behaves badly and which is "movable" in the sense that its location depends on the initial conditions of the differential equation. Suppose we have an ordinary differential equation in the complex domain. Any given solution y(x) of this equation may well have singularities at various points (i.e. points at which it is not a regular holomorphic function, such as branch points, essential singularities or poles). A singular point is said to be movable if its location depends on the particular solution we have chosen, rather than being fixed by the equation itself. For example, the equation dy/dx = 1/(2y) has solution y = (x − c)^(1/2) for any constant c. This solution has a branch point at x = c, and so the equation has a movable branch point (since it depends on the choice of the solution, i.e. the choice of the constant c). It is a basic feature of linear ordinary differential equations that singularities of solutions occur only at singularities of the equation, and so linear equations do not have movable singularities. When attempting to look for 'good' nonlinear differential equations, it is this property of linear equations that one would like to see: asking for no movable singularities is often too stringent; instead one often asks for the so-called Painlevé property, 'any movable singularity should be a pole', first used by Sofia Kovalevskaya. See also Painlevé transcendents Regular singular point
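A quick numerical check of the idea, assuming the standard illustration dy/dx = 1/(2y) with solution y = sqrt(x − c): the equation is satisfied by every member of the solution family, while the singular point x = c moves with the integration constant c.

```python
import cmath

# The solution family y = sqrt(x - c) of dy/dx = 1/(2y): the ODE holds
# away from the singularity, and the branch point x = c moves with c.

def y(x, c):
    return cmath.sqrt(x - c)

def dy_dx(x, c, h=1e-6):
    # central finite difference of the chosen solution
    return (y(x + h, c) - y(x - h, c)) / (2 * h)

for c in (0.0, 1.0, 2.5):           # three different solutions of the same ODE
    x = c + 2.0                      # a regular point of this particular solution
    assert abs(dy_dx(x, c) - 1 / (2 * y(x, c))) < 1e-6  # ODE satisfied
    assert y(c, c) == 0              # singular point sits exactly at x = c
print("ok")
```

The contrast with the linear case is visible here: the equation itself has no singular point at x = c, yet each solution does, which is exactly what "movable" means.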
https://en.wikipedia.org/wiki/Fosmid
Fosmids are similar to cosmids but are based on the bacterial F-plasmid. The cloning vector is limited, as a host (usually E. coli) can only contain one fosmid molecule. Fosmids can hold DNA inserts of up to 40 kb in size; often the source of the insert is random genomic DNA. A fosmid library is prepared by extracting the genomic DNA from the target organism and cloning it into the fosmid vector. The ligation mix is then packaged into phage particles and the DNA is transfected into the bacterial host. Bacterial clones propagate the fosmid library. The low copy number offers higher stability than vectors with relatively higher copy numbers, including cosmids. Fosmids may be useful for constructing stable libraries from complex genomes. Fosmids have high structural stability and have been found to maintain human DNA effectively even after 100 generations of bacterial growth. Fosmid clones were used to help assess the accuracy of the Public Human Genome Sequence. Discovery The fertility plasmid, or F-plasmid, was discovered by Esther Lederberg and encodes the information for biosynthesis of the sex pilus, which aids in bacterial conjugation. Conjugation uses the sex pilus to form a bridge between two bacterial cells; this bridge allows the F+ cell to transfer a single-stranded copy of the plasmid so that both cells contain a copy of the plasmid. On the way into the recipient cell, the corresponding DNA strand is synthesized by the recipient. The donor cell maintains a functional copy of the plasmid. It was later discovered that the F factor was the first episome and can exist as an independent plasmid, making it a very stable vector for cloning. Conjugation aids in the formation of bacterial clone libraries by ensuring all cells contain the desired fosmid. Fosmids are DNA vectors that use the F-plasmid origin of replication and partitioning mechanisms to allow cloning of large DNA fragments. A library that provides 20–70-fold redundant coverage of the genome can easily
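The quoted 20–70-fold redundant coverage can be illustrated with back-of-envelope arithmetic: the number of clones needed for a target fold-coverage given the ~40 kb insert size, plus the Lander–Waterman estimate of the chance a given locus is missed. The genome size below is an assumption for illustration.

```python
import math

# How many 40 kb fosmid clones give a target fold-coverage, and how
# likely a given locus is to be missed (Lander-Waterman: for coverage C,
# P(locus uncovered) ~ e^(-C)). Genome size is an assumed example value.

genome_bp = 3.0e9        # human-sized genome (assumption)
insert_bp = 40_000       # fosmid insert capacity from the text
fold = 20                # low end of the 20-70x range quoted

clones = math.ceil(fold * genome_bp / insert_bp)
p_missed = math.exp(-fold)
print(clones, p_missed)
```

Even at the low end of the range, the miss probability per locus is vanishingly small, which is why fosmid libraries at this redundancy were considered effectively complete for validation work.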
https://en.wikipedia.org/wiki/Cross-domain%20solution
A cross-domain solution (CDS) is an integrated information assurance system composed of specialized software, and sometimes hardware, that provides a controlled interface to manually or automatically enable and/or restrict the access or transfer of information between two or more security domains based on a predetermined security policy. CDSs are designed to enforce domain separation and typically include some form of content filtering, which is used to designate information that is unauthorized for transfer between security domains or levels of classification, such as between different military divisions, intelligence agencies, or other operations which critically depend on the timely sharing of potentially sensitive information. The goal of a CDS is to allow a trusted network domain to exchange information with other domains, either one-way or bi-directionally, without introducing the potential for security threats that would normally come with network connectivity. Although the goal is 100% assurance, this is not possible in practice, so CDS development, assessment, and deployment are based on comprehensive risk management. Due to the sensitive nature of their use, every aspect of an accredited CDS must be rigorously evaluated under what is known as a Lab-Based Security Assessment (LBSA) to reduce potential vulnerabilities and risks both to the system itself and to the environments in which it will be deployed. The evaluation and accreditation of CDSs in the United States are primarily under the authority of the National Cross Domain Strategy and Management Office (NCDSMO) within the National Security Agency (NSA). The three primary elements demanded from cross-domain solutions are: data confidentiality, most often imposed by hardware-enforced one-way data transfer; data integrity, through content management using filtering for viruses and malware, content examination utilities, and (in high-to-low security transfer) audited human review; and data availability, via security-hardened operati
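The combination of one-way transfer and content filtering described above can be sketched in a few lines. This is a toy illustration only, not a real CDS: the keyword list, function names, and the idea of modeling domains as Python lists are all assumptions made for the sketch, and a real system would enforce one-way flow in hardware rather than by coding convention.

```python
# Minimal sketch of a one-way, filtered transfer between two security
# domains. All names (content_filter, transfer_one_way, the keyword
# list) are illustrative assumptions, not any real CDS product's API.

BLOCKED_KEYWORDS = {"SECRET", "NOFORN"}  # hypothetical dirty-word list

def content_filter(message: str) -> bool:
    """Return True if the message passes the (toy) release policy."""
    return not any(word in message.upper() for word in BLOCKED_KEYWORDS)

def transfer_one_way(high_side: list, low_side: list) -> int:
    """Move releasable messages from the high domain to the low domain.

    The function only ever appends to low_side and never reads from it,
    mimicking hardware-enforced one-way flow. Returns count released.
    """
    released = 0
    for msg in high_side:
        if content_filter(msg):
            low_side.append(msg)
            released += 1
    return released

high = ["weather report for monday", "SECRET troop movements"]
low = []
transfer_one_way(high, low)   # only the releasable message crosses
```

Note that the filter here is a stand-in for the far more elaborate content examination (virus scanning, format checks, human review) the article describes.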
https://en.wikipedia.org/wiki/System%20high%20mode
System high mode, or simply system high, is a security mode of using an automated information system (AIS) that pertains to an environment containing restricted data classified in a hierarchical scheme, such as Top Secret, Secret and Unclassified. System high pertains to the IA features of the information processed, and specifically not to the strength or trustworthiness of the system. System high mode is distinguished from other modes (such as multilevel security) in that the system is not required to contribute to the protection or separation of unequal security classifications. In particular, this precludes using features of objects (e.g. content or format) produced by or exposed to an AIS operating in system high mode as criteria to securely downgrade those objects. As a result, all information in a system high AIS is treated as if it were classified at the highest security level of any data in the AIS. For example, Unclassified information can exist in a Secret system high computer, but it must be treated as Secret; therefore it can never be shared with unclassified destinations (unless downgraded by reliable human review, which is itself risky because human reviewers are not infallible). There is no known technology to securely declassify system high information by automated means, because no features of the data can be trusted after having been potentially corrupted by the system high host. When unreliable means are used (including cross-domain solutions and bypass guards), a serious risk of system exploitation via the bypass is introduced. Nevertheless, it has been done where the resulting risk is overlooked or accepted. Example: when Daniel is granted access to a computer system that uses system high mode, Daniel must have a valid security clearance for all information processed by the system and a valid "need to know" for some, but not necessarily all, information processed by the system. Sources NCSC (1985). "Trusted Computer System Eval
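The "everything is treated at the system's highest level" rule can be made concrete with a small sketch. The level names, their ordering, and both function names below are assumptions of the sketch, not part of any standard.

```python
# Toy illustration of system high mode: any object handled by the
# system is treated at the highest level of any data present,
# regardless of its original marking, and may only be released to
# domains cleared for that level.

LEVELS = ["UNCLASSIFIED", "CONFIDENTIAL", "SECRET", "TOP SECRET"]

def effective_label(object_label: str, system_high: str) -> str:
    """An object's effective label is at least the system high level.

    The object's own marking cannot be trusted for release decisions:
    the system is not required to keep lower-level data separated.
    """
    return max(object_label, system_high, key=LEVELS.index)

def releasable_to(domain_level: str, system_high: str) -> bool:
    """An object may leave only to domains cleared for system high."""
    return LEVELS.index(domain_level) >= LEVELS.index(system_high)

# Unclassified data on a Secret system-high machine is treated as Secret:
assert effective_label("UNCLASSIFIED", "SECRET") == "SECRET"
assert not releasable_to("UNCLASSIFIED", "SECRET")
```

The two assertions mirror the article's example: Unclassified data on a Secret system high computer cannot be shared with unclassified destinations.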
https://en.wikipedia.org/wiki/Capability-based%20operating%20system
Capability-based operating system generally refers to an operating system that uses capability-based security. Examples include Hydra, KeyKOS, EROS, CapROS, Midori, seL4, Genode, Fuchsia, and the Control Program Facility.
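The idea these systems share is that access to a resource requires holding an unforgeable reference (a capability) rather than looking the resource up by name with ambient authority. A minimal object-capability sketch, with all class and function names invented for illustration:

```python
# Object-capability sketch: a resource is reachable only through the
# capability object that wraps it; code that was never handed the
# capability has no way to name the resource at all.

class ReadCapability:
    """Unforgeable token granting read-only access to one resource."""
    def __init__(self, resource: dict):
        self._resource = resource      # held privately by the capability

    def read(self, key: str):
        return self._resource[key]

def make_resource() -> ReadCapability:
    secret_store = {"greeting": "hello"}   # no global name; local only
    return ReadCapability(secret_store)    # the capability is the sole path

cap = make_resource()
value = cap.read("greeting")   # a holder of cap can read the resource
# Any code not handed `cap` cannot reach secret_store: there is no
# ambient lookup (no global table, no path name) that leads to it.
```

In a real capability OS the kernel, not a language object, makes the reference unforgeable, but the access pattern is the same.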
https://en.wikipedia.org/wiki/Korean%20Bioinformation%20Center
The Korean Bioinformation Center (KOBIC) is the Korean national research centre in bioinformatics, based in Daejeon, South Korea. The centre is comparable to the National Center for Biotechnology Information (NCBI) in the United States and the European Bioinformatics Institute (EBI) in Europe, and plays a key role in areas such as genomics, proteomics, systems biology, and personalized medicine. The research centre was originally established as the National Genome Information Center (NGIC) in 2001. It was designated as the National Center for Registration of Research Results in 2003 and received its current name in 2006 under NGIC director Jong Bhak, who advanced a bill to establish the national center for bioresource, biodiversity, and bioinformation. Bhak coined the name KOBIC in 2005, and the name was formalized in 2006 by the vice prime minister Woosik Kim. KOBIC was designated as the National Center for Biological Research Resource Information in 2010. KOBIC is affiliated with the Korea Research Institute of Bioscience and Biotechnology (KRIBB). As a national bioinformation management centre, KOBIC manages biological data from a number of different sources, with an emphasis on omics data. Research at KOBIC emphasizes next-generation sequencing methods, systems bioinformatics, biomedical informatics and structural informatics. Notable databases and tools provided by KOBIC include the MiRGator database for the functional annotation of miRNAs and Patome, a database of biological sequence data from issued patents and published patent applications. Researchers at KOBIC have also been involved in the analysis for the Korean Reference Genome Project. The Korean Reference Genome Project was initiated by then-director Jong Bhak, who formed a formal collaboration with the National Standard Reference Center of KRISS, Daejeon, Korea in 2006. The first project under the Korean Reference Genome Project was KOREF. As it was
https://en.wikipedia.org/wiki/Active%20suspension
An active suspension is a type of automotive suspension that uses an onboard control system to control the vertical movement of the vehicle's wheels and axles relative to the chassis or vehicle frame, rather than the conventional passive suspension that relies solely on large springs to maintain static support and dampen the vertical wheel movements caused by the road surface. Active suspensions are divided into two classes: true active suspensions, and adaptive or semi-active suspensions. While semi-active suspensions only vary shock absorber firmness to match changing road or dynamic conditions, active suspensions use some type of actuator to raise and lower the chassis independently at each wheel. These technologies allow car manufacturers to achieve a greater degree of ride quality and car handling by keeping the tires consistently perpendicular to the road when cornering, preventing unwanted contact between the vehicle frame and the ground (especially when going over a depression), and allowing overall better traction and steering control. An onboard computer detects body movement from sensors throughout the vehicle and, using that data, controls the action of the active and semi-active suspensions. The system virtually eliminates body roll and pitch variation in many driving situations including cornering, accelerating and braking. When used on commercial vehicles such as buses, active suspension can also be used to temporarily lower the vehicle's floor, making it easier for passengers to board and exit the vehicle. Principle Skyhook theory holds that the ideal suspension would let the vehicle maintain a stable posture, unaffected by weight transfer or road surface irregularities, as if suspended from an imaginary hook in the sky held at a constant altitude above sea level, and therefore remaining stable. Since an actual skyhook is obviously impractical, real active suspension systems are based on actuator operations. The imaginary line (of z
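The skyhook idea leads to a classic two-state control law for semi-active dampers: the damper is set firm when it can produce a force opposing body motion (as if the body hung from a damper fixed to the "sky"), and soft otherwise. A hedged sketch, with the damping coefficients chosen arbitrarily for illustration:

```python
# Two-state "skyhook" control law for a semi-active damper.
# v_body: vertical velocity of the sprung mass (chassis)
# v_rel:  relative velocity across the damper (body minus wheel)
# The coefficient values below are assumed, not from any real vehicle.

C_HIGH = 3000.0  # N*s/m, firm damping
C_LOW = 300.0    # N*s/m, soft damping

def skyhook_damping(v_body: float, v_rel: float) -> float:
    """Return the damper coefficient commanded by the skyhook policy.

    When body velocity and relative velocity have the same sign, the
    damper force opposes body motion, so firm damping is chosen;
    otherwise the damper would push the body, so it is softened.
    """
    return C_HIGH if v_body * v_rel > 0 else C_LOW

def damper_force(v_body: float, v_rel: float) -> float:
    """Force exerted by the damper under the skyhook policy."""
    return -skyhook_damping(v_body, v_rel) * v_rel

# Body moving up while compressing the damper: firm setting opposes it.
assert skyhook_damping(0.2, 0.1) == C_HIGH
# Body moving up while the damper extends: soften to avoid pushing.
assert skyhook_damping(0.2, -0.1) == C_LOW
```

True active suspensions go further and command an actuator force directly, but the same body-velocity feedback idea applies.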
https://en.wikipedia.org/wiki/Shared%20resource
In computing, a shared resource, or network share, is a computer resource made available from one host to other hosts on a computer network. It is a device or piece of information on a computer that can be remotely accessed from another computer transparently, as if it were a resource in the local machine. Network sharing is made possible by inter-process communication over the network. Some examples of shareable resources are computer programs, data, storage devices, and printers: shared file access (also known as disk sharing and folder sharing), shared printer access, shared scanner access, and so on. The shared resource is then called a shared disk, shared folder or shared document. The term file sharing traditionally means shared file access, especially in the context of operating systems and LAN and Intranet services, for example in Microsoft Windows documentation. However, as BitTorrent and similar applications became available in the early 2000s, the term file sharing increasingly became associated with peer-to-peer file sharing over the Internet. Common file systems and protocols Shared file and printer access require an operating system on the client that supports access to resources on a server, an operating system on the server that supports access to its resources from a client, and an application layer (in the four or five layer TCP/IP reference model) file sharing protocol and transport layer protocol to provide that shared access. Modern operating systems for personal computers include distributed file systems that support file sharing, while hand-held computing devices sometimes require additional software for shared file access. The most common such file systems and protocols are listed below; the "primary operating system" is the operating system on which the file sharing protocol in question is most commonly used. On Microsoft Windows, a network share is provided by the Windows network component "File and Printer Sharing for Microsoft Networks", using Micros
https://en.wikipedia.org/wiki/McCarthy%20Formalism
In computer science and recursion theory, the McCarthy Formalism (1963) of computer scientist John McCarthy clarifies the notion of recursive functions by use of the IF-THEN-ELSE construction common to computer science, together with four of the operators of primitive recursive functions: zero, successor, equality of numbers and composition. The conditional operator replaces both primitive recursion and the mu-operator. Introduction McCarthy's notion of conditional expression McCarthy (1960) described his formalism this way: "In this article, we first describe a formalism for defining functions recursively. We believe this formalism has advantages both as a programming language and as a vehicle for developing a theory of computation.... We shall need a number of mathematical ideas and notations concerning functions in general. Most of the ideas are well known, but the notion of conditional expression is believed to be new, and the use of conditional expressions permits functions to be defined recursively in a new and convenient way." Minsky's explanation of the "formalism" In his 1967 Computation: Finite and Infinite Machines, Marvin Minsky in his § 10.6 Conditional Expressions: The McCarthy Formalism describes the "formalism" as follows: "Practical computer languages do not lend themselves to formal mathematical treatment--they are not designed to make it easy to prove theorems about the procedures they describe. In a paper by McCarthy [1963] we find a formalism that enhances the practical aspect of the recursive-function concept, while preserving and improving its mathematical clarity. ¶ McCarthy introduces "conditional expressions" of the form f = (if p1 then e1 else e2) where the ei are expressions and p1 is a statement (or equation) that may be true or false. ¶ This expression means: see if p1 is true; if so, the value of f is given by e1. If p1 is false, the value of f is given by e2. This conditional expression . . . has also the power of the minimizat
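The flavor of the formalism can be shown with executable definitions. Here Python's conditional expression `x if p else y` stands in for McCarthy's "if p then e1 else e2", and only zero, successor, equality, composition, and recursion are used; even the predecessor is defined by a conditional-driven search rather than taken as primitive. The function names are the sketch's own.

```python
# McCarthy-style definitions: conditional expressions plus recursion,
# built from zero, successor, and equality only.

def succ(n):
    """The successor operator."""
    return n + 1

def pred(n, m=0):
    """Predecessor of n >= 1, found by counting up from zero:
    pred(n) = (m if succ(m) = n else pred(n, succ(m)))."""
    return m if succ(m) == n else pred(n, succ(m))

def add(m, n):
    """Addition by recursion on the second argument:
    add(m, n) = (m if n = 0 else add(succ(m), pred(n)))."""
    return m if n == 0 else add(succ(m), pred(n))

assert pred(5) == 4
assert add(3, 4) == 7
```

As in the formalism itself, a definition like `pred` applied outside its intended domain (here, `pred(0)`) simply fails to terminate, which is how partial recursive functions show up in this setting.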
https://en.wikipedia.org/wiki/The%20Indian%20Picture%20Opera
The Indian Picture Opera is a magic lantern slide show by photographer Edward S. Curtis. In the early 1900s, Curtis published the renowned 20-volume book subscription entitled The North American Indian. He compiled about 2400 photographs with detailed ethnological and language studies of tribes of the American West. In 1911, in an effort to promote his book sales, Curtis created a traveling Magic Lantern slide show, The Indian Picture Opera. Stereo-Opticon projectors put Curtis's stunning images on screens in America's largest cities, one scene dissolving into another. A small orchestra played music derived from Indian chants and rhythms, and Edward Curtis lectured on the intimate stories of tribal life. This Magic Lantern show was played to breathless audiences, stunned by the humanity, fascinated by the imagery, and shamed by the destruction of Indian cultures. The shows received standing ovations and generous reviews. Curtis went on to produce and direct In the Land of the Head Hunters in 1914. This production was a full-length documentary motion picture of aboriginal North Americans. In 2006, there was a contemporary remake of the Picture Opera published on DVD. Following the original script, images and music were reconstituted into a modern-day multi-media production of The Indian Picture Opera.
https://en.wikipedia.org/wiki/Analog%20models%20of%20gravity
Analog models of gravity are attempts to model various phenomena of general relativity (e.g., black holes or cosmological geometries) using other physical systems such as acoustics in a moving fluid, superfluid helium, or Bose–Einstein condensate; gravity waves in water; and propagation of electromagnetic waves in a dielectric medium. These analogs (or analogies) serve to provide new ways of looking at problems, permit ideas from other realms of science to be applied, and may create opportunities for practical experiments within the analog that can be applied back to the source phenomena. History Analog models of gravity have been used in hundreds of published articles in the last decade. The use of these analogs can be traced back to the very start of scientific theories for gravity, with Newton and Einstein. Bose-Einstein condensates It has been shown that Bose-Einstein condensates (BEC) are a good platform to study analog gravity. Kerr (rotating) black holes have been implemented in a BEC of exciton-polaritons (a quantum fluid of light). See also Acoustic metric Transformation optics Optical metric#Analogue gravity Optical black hole Sonic black hole
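One of the simplest analogs to state quantitatively is the acoustic ("sonic") black hole: in a moving fluid, sound cannot propagate upstream out of any region where the flow speed exceeds the local sound speed, so the boundary where they coincide acts like an event horizon. The sketch below is a toy one-dimensional illustration; the flow profile, sound speed, and function names are all assumptions made for the example, not a model of any particular experiment.

```python
# Toy acoustic horizon: find where an accelerating 1-D flow first
# reaches the sound speed. Upstream of that point sound can escape;
# downstream it is dragged along, as with light inside a black hole.

C_SOUND = 1.0                      # constant sound speed (arbitrary units)

def flow_speed(x: float) -> float:
    """Assumed linearly accelerating flow profile v(x) = 0.5 + 0.25*x."""
    return 0.5 + 0.25 * x

def find_horizon(x_min=0.0, x_max=10.0, steps=100000):
    """Scan for the first point where |v| reaches c: the sonic horizon."""
    dx = (x_max - x_min) / steps
    for i in range(steps):
        x = x_min + i * dx
        if flow_speed(x) >= C_SOUND:
            return x
    return None

x_h = find_horizon()
# For this profile the horizon sits where 0.5 + 0.25*x = 1, i.e. x = 2.
assert abs(x_h - 2.0) < 1e-3
```

Real analog-gravity experiments measure what happens to perturbations (phonons, surface ripples, light) near such a horizon, which is where the connection to black-hole physics becomes nontrivial.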
https://en.wikipedia.org/wiki/Cytolysin
Cytolysin refers to a substance secreted by microorganisms, plants or animals that is specifically toxic to individual cells, in many cases causing their dissolution through lysis. Cytolysins that have a specific action on certain cells are named accordingly. For instance, the cytolysins responsible for the destruction of red blood cells, thereby liberating hemoglobin, are named hemolysins, and so on. Cytolysins may be involved in immunity as well as in venoms. Hemolysin is also used by certain bacteria, such as Listeria monocytogenes, to disrupt the phagosome membrane of macrophages and escape into the cytoplasm of the cell. History and background The term "cytolysin" or "cytolytic toxin" was first introduced by Alan Bernheimer to describe membrane damaging toxins (MDTs) that have cytolytic effects on cells. The first kind of cytolytic toxin discovered had hemolytic effects on the erythrocytes of certain sensitive species, such as humans. For this reason "hemolysin" was first used to describe any MDT. In the 1960s certain MDTs were shown to be destructive to cells other than erythrocytes, such as leukocytes. The term "cytolysin" was then introduced by Bernheimer to replace "hemolysin". Cytolysins can damage membranes without lysing cells; therefore, "membrane damaging toxins" (MDTs) describes the essential action of cytolysins. Cytolysins comprise more than one third of all bacterial protein toxins. Bacterial protein toxins can be highly poisonous to humans. For example, botulinum toxin is 3×10⁵ times more toxic than snake venom to humans, and its toxic dose is only 0.8×10⁻⁸ mg. A wide variety of gram-positive and gram-negative bacteria use cytolysins as their primary weapon for causing disease, including Enterococcus faecalis, Staphylococcus and Clostridium perfringens. A diverse range of studies has been done on cytolysins. Since the 1970s, more than 40 new cytolysins have been discovered and grouped into different families. At the genetic level, the genetic structures o
https://en.wikipedia.org/wiki/List%20of%20protein%20structure%20prediction%20software
This list of protein structure prediction software summarizes notable software tools used in protein structure prediction, including homology modeling, protein threading, ab initio methods, secondary structure prediction, and transmembrane helix and signal peptide prediction. Software list Below is a list that separates programs according to the method used for structure prediction. Homology modeling Threading/fold recognition Ab initio structure prediction Secondary structure prediction A detailed list of programs can be found at List of protein secondary structure prediction programs See also List of protein secondary structure prediction programs Comparison of nucleic acid simulation software List of software for molecular mechanics modeling Molecular design software Protein design External links bio.tools, for finding more tools
https://en.wikipedia.org/wiki/Pythagorean%20quadruple
A Pythagorean quadruple is a tuple of integers a, b, c, and d, such that a² + b² + c² = d². They are solutions of a Diophantine equation and often only positive integer values are considered. However, to provide a more complete geometric interpretation, the integer values can be allowed to be negative and zero (thus allowing Pythagorean triples to be included) with the only condition being that d > 0. In this setting, a Pythagorean quadruple (a, b, c, d) defines a cuboid with integer side lengths |a|, |b|, and |c|, whose space diagonal has integer length d; with this interpretation, Pythagorean quadruples are thus also called Pythagorean boxes. In this article we will assume, unless otherwise stated, that the values of a Pythagorean quadruple are all positive integers. Parametrization of primitive quadruples A Pythagorean quadruple is called primitive if the greatest common divisor of its entries is 1. Every Pythagorean quadruple is an integer multiple of a primitive quadruple. The set of primitive Pythagorean quadruples for which a is odd can be generated by the formulas a = m² + n² − p² − q², b = 2(mq + np), c = 2(nq − mp), d = m² + n² + p² + q², where m, n, p, q are non-negative integers with greatest common divisor 1 such that m + n + p + q is odd. Thus, all primitive Pythagorean quadruples are characterized by the identity (m² + n² + p² + q²)² = (m² + n² − p² − q²)² + (2mq + 2np)² + (2nq − 2mp)². Alternate parametrization All Pythagorean quadruples (including non-primitives, and with repetition, though a, b, and c do not appear in all possible orders) can be generated from two positive integers a and b as follows: If a and b have different parity, let p be any factor of a² + b² such that p² < a² + b². Then c = (a² + b² − p²)/(2p) and d = (a² + b² + p²)/(2p). Note that p = d − c. A similar method exists for generating all Pythagorean quadruples for which a and b are both even. Let l = a/2 and m = b/2 and let n be a factor of l² + m² such that n² < l² + m². Then c = (l² + m² − n²)/n and d = (l² + m² + n²)/n. This method generates all Pythagorean quadruples exactly once each when a and b run through all pairs of natural numbers and the factor parameter runs through all permissible values for each pair. No such method exists if both a and b are odd, in which case no solutions exist, as can be seen from the parametrization in the previous section. Pr
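A Pythagorean quadruple satisfies a² + b² + c² = d², and small quadruples are easy to enumerate by brute force. The sketch below also checks one instance of the standard Lebesgue-type identity (m² + n² + p² + q²)² = (m² + n² − p² − q²)² + (2mq + 2np)² + (2nq − 2mp)² that underlies the parametrization of primitive quadruples; the function names are the sketch's own.

```python
# Enumerate small Pythagorean quadruples (a, b, c, d) with
# a^2 + b^2 + c^2 = d^2, and derive one from the parametric identity.

from math import isqrt

def quadruples(limit):
    """All (a, b, c, d) with 1 <= a <= b <= c <= limit and d integral."""
    out = []
    for a in range(1, limit + 1):
        for b in range(a, limit + 1):
            for c in range(b, limit + 1):
                s = a * a + b * b + c * c
                d = isqrt(s)
                if d * d == s:
                    out.append((a, b, c, d))
    return out

def from_params(m, n, p, q):
    """One quadruple from the identity; entries may need |.| and sorting."""
    a = m * m + n * n - p * p - q * q
    b = 2 * (m * q + n * p)
    c = 2 * (n * q - m * p)
    d = m * m + n * n + p * p + q * q
    return a, b, c, d

assert (1, 2, 2, 3) in quadruples(5)          # 1 + 4 + 4 = 9 = 3^2
a, b, c, d = from_params(1, 1, 1, 0)
assert a * a + b * b + c * c == d * d          # identity holds
```

The identity can be verified symbolically by expanding both sides: the cross terms in (2mq + 2np)² and (2nq − 2mp)² cancel, leaving 4(m² + n²)(p² + q²), which completes the square.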
https://en.wikipedia.org/wiki/Orb%20%28software%29
Orb was a freeware streaming software that enabled users to remotely access all their personal digital media files, including pictures, music, videos and television. It could be used from any Internet-enabled device, including laptops, pocket PCs, smartphones, and the PS3, Xbox 360 and Wii video game consoles. In 2013, Orb Networks, Inc. announced that they had been acquired by a strategic partner and would be shutting down operations. Also in 2013, co-founder Luc Julia indicated that Orb Networks' technology had been acquired by Qualcomm, but no accompanying press release had been issued. Orb's website (accessed May 2014) announced: "...about a year ago Orb's team and technology were acquired by Qualcomm Connected Experiences, Inc." and "Orb Networks will no longer be offering any Orb software downloads or support for our web based products such as OrbLive and Mycast." The statement invited people to "check out Qualcomm's AllPlay media platform" but did not specify how Orb software may have been utilized. What it did Orb was available for Intel Macintosh, Windows and Media Center PCs. Users created an online account to remotely access their computer. Access to videos, audio, images All the music, pictures and video files stored on a home computer were made available for streaming, provided that the computer was connected to the Internet. The media files were transcoded and streamed directly from that PC, or made available for download with the use of a file explorer plug-in. The final version of Orb could be used as a replacement for Microsoft's Windows Media Connect software for computers running the Windows operating system. This allowed the Xbox 360 or PlayStation 3 consoles to access the videos, audio, and images on the computer with Orb installed, natively. This also allowed the user to watch videos not in the Windows Media Video format without having previously re-transcoded them. Orb would transcode the video files from the computer on the fly as they were requested a
https://en.wikipedia.org/wiki/JQuery
jQuery is a JavaScript library designed to simplify HTML DOM tree traversal and manipulation, as well as event handling, CSS animation, and Ajax. It is free, open-source software using the permissive MIT License. jQuery is used by 77% of the 10 million most popular websites. Web analysis indicates that it is the most widely deployed JavaScript library by a large margin, having at least 3 to 4 times more usage than any other JavaScript library. jQuery's syntax is designed to make it easier to navigate a document, select DOM elements, create animations, handle events, and develop Ajax applications. jQuery also provides capabilities for developers to create plug-ins on top of the JavaScript library. This enables developers to create abstractions for low-level interaction and animation, advanced effects and high-level, theme-able widgets. The modular approach to the jQuery library allows the creation of powerful dynamic web pages and Web applications. The set of jQuery core features—DOM element selections, traversal, and manipulation—enabled by its selector engine (named "Sizzle" from v1.3), created a new "programming style", fusing algorithms and DOM data structures. This style influenced the architecture of other JavaScript frameworks like YUI v3 and Dojo, later stimulating the creation of the standard Selectors API. Microsoft and Nokia bundle jQuery on their platforms. Microsoft includes it with Visual Studio for use within Microsoft's ASP.NET AJAX and ASP.NET MVC frameworks while Nokia has integrated it into the Web Run-Time widget development platform. Overview jQuery, at its core, is a Document Object Model (DOM) manipulation library. The DOM is a tree-structure representation of all the elements of a Web page. jQuery simplifies the syntax for finding, selecting, and manipulating these DOM elements. For example, jQuery can be used for finding an element in the document with a certain property (e.g. all elements with an h1 tag), changing one or more of its at
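The "programming style" the article describes, select a set of elements and then chain operations on the whole set, can be illustrated with a tiny standalone sketch. It is written in Python so it is self-contained and runnable; the Selection class and the add_class/set_attr names are invented for the sketch and are not jQuery's API, but the select-then-chain shape mirrors expressions like $("h1").addClass("title").attr("id", "top").

```python
# Minimal sketch of the fluent, set-at-a-time style jQuery popularized:
# a query returns a wrapper over matched "elements", and every method
# operates on all of them and returns the wrapper to allow chaining.

class Selection:
    def __init__(self, elements):
        self.elements = elements           # list of dict-like "elements"

    def add_class(self, name):
        for el in self.elements:
            el.setdefault("class", []).append(name)
        return self                        # returning self enables chaining

    def set_attr(self, key, value):
        for el in self.elements:
            el[key] = value
        return self

def query(doc, tag):
    """Select all elements of a given tag, like $("h1")."""
    return Selection([el for el in doc if el["tag"] == tag])

doc = [{"tag": "h1"}, {"tag": "p"}, {"tag": "h1"}]
query(doc, "h1").add_class("title").set_attr("data-x", "1")
assert doc[0]["class"] == ["title"] and doc[2]["data-x"] == "1"
assert "class" not in doc[1]               # the <p> element is untouched
```

The design choice worth noting is that every mutating method returns the selection itself, which is what makes the one-line chained expression possible.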
https://en.wikipedia.org/wiki/David%20P.%20Robbins
David Peter Robbins (12 August 1942 in Brooklyn – 4 September 2003 in Princeton) was an American mathematician. He is most famous for introducing alternating sign matrices. He is also known for his work on generalizations of Heron's formula for the area of polygons, which led to Robbins pentagons (cyclic pentagons with integer side lengths and areas) being named after him. Robbins grew up in Manhattan, where he attended the Fieldston School. He studied at Harvard, where his undergraduate advisor was Andrew Gleason. He went to MIT to do his graduate work and, after a hiatus during which he taught at Fieldston, finished his Ph.D. in 1970. He then taught at MIT, Phillips Exeter Academy, Hamilton College and Washington and Lee University. In 1980 he moved to Princeton, New Jersey and worked at the Institute for Defense Analyses Center for Communications Research there until his death from pancreatic cancer. A symposium was held in Robbins' honor in June 2003, the papers from which were published as a special issue of the journal Advances in Applied Mathematics. The Mathematical Association of America established a prize named in his honor in 2005, given every three years to one or more researchers in algebra, combinatorics, or discrete mathematics. The first winner of the prize, in 2008, was Neil Sloane for the On-Line Encyclopedia of Integer Sequences. The American Mathematical Society awards another prize of the same name, the David P. Robbins Prize (AMS); its first winners were Samuel P. Ferguson and Thomas C. Hales for their work on the Kepler conjecture. See also Robbins constant, the average distance between two random points in a unit cube
https://en.wikipedia.org/wiki/Employee%20monitoring%20software
Employee monitoring software is a means of employee monitoring, and allows company administrators to monitor and supervise all their employee computers from a central location. It is normally deployed over a business network and allows for easy centralized log viewing via one central networked PC. Sometimes, companies opt to monitor their employees using remote desktop software instead. Purpose Employee monitoring software is used to supervise employees' performance, prevent illegal activities, avoid leaks of confidential information, and catch insider threats. Employee monitoring software is now widely used in technology companies. Features An employee monitoring program can monitor almost everything on a computer, such as keystrokes and passwords entered, websites visited, and chats in Facebook Messenger, Skype and other social chats. Monitoring software can also capture screenshots of mobile activities. E-mail monitoring gives employers access to records of employees' e-mails sent through the company's servers. Companies may use techniques ranging from keyword searches to natural language processing to analyze e-mails. The administrator can view the logs through a cloud panel, or receive them by email. Criticism Bossware, also known as tattleware, is software that allows supervisors to automatically monitor the productivity of their employees. Common features of bossware include activity monitoring, screenshotting and/or screen recording, keystroke logging, webcam and/or microphone activation, and "invisible" monitoring. Bossware has been called a form of spyware. During the COVID-19 pandemic, the use of bossware by companies to monitor their employees increased. The Electronic Frontier Foundation (EFF) denounced bossware as a violation of privacy. The Center for Democracy and Technology (CDT) denounced bossware as a threat to the safety and health of employees. During the COVID-19 pandemic, members of the r/antiwork subreddit shared various mouse jigg
https://en.wikipedia.org/wiki/Charles%20E.%20M.%20Pearce
Charles Edward Miller Pearce (29 March 1940 – 8 June 2012) was a New Zealand/Australian mathematician. At the time of his death on 8 June 2012 he was the (Sir Thomas) Elder Professor of Mathematics at the University of Adelaide. Early life Pearce was born in Wellington. His early schooling was in Wellington and he was dux of Hutt Valley High School in 1957. He earned his Bachelor of Science (a double major in Applied and Pure Mathematics and a further double major in Physics and Mathematical Physics) and in 1962 a Master of Science with first class honours in Mathematics, all from Victoria University of Wellington. The bachelor's degree was from the University of New Zealand, as the constituent colleges of UNZ, of which Victoria University College was one of four, had proliferated into four autonomous universities by the time Pearce completed his master's degree. New Zealand origins Pearce always remained proud of his New Zealand origins. Being descended from Maori people, he claimed his New Zealand ancestry was longer than that of almost all his peers from New Zealand. Pearce was descended from Alexander Gray, one of just five Scots who settled in New Zealand as part of the original settlement. He had a strong interest in Maoritanga and claimed ancestral connection to three waka (canoes) in the heke (migration): Aotea, Kurahaupo and Takatimu. His principal tribal connection was with the Ngati Ruanui, based in southern Taranaki. Life and career In 1963 Pearce left New Zealand for doctoral study at the Australian National University (ANU) in Canberra, under the supervision of Pat Moran. There followed short stints (1 to 3 years) as a lecturer at ANU, the University of Queensland (visiting professor), the Université de Rennes 1, France, and the University of Sheffield (1966–68). He was appointed to the University of Adelaide in 1968 and remained there for the rest of his career, being promoted to senior lecturer in 1971, Reader in 1982 and professor in 2003. He was a lead
https://en.wikipedia.org/wiki/Network%20tomography
Network tomography is the study of a network's internal characteristics using information derived from end point data. The word tomography is used to link the field, in concept, to other processes that infer the internal characteristics of an object from external observation, as is done in MRI or PET scanning (even though the term tomography strictly refers to imaging by slicing). The field is a recent development in electrical engineering and computer science, dating from 1996. Network tomography seeks to map the path data takes through the Internet by examining information from "edge nodes", the computers where the data originate and from which they are requested. The field is useful for engineers attempting to develop more efficient computer networks. Data derived from network tomography studies can be used to increase quality of service by limiting link packet loss and increasing routing optimization. Recent developments There have been many published papers and tools in the area of network tomography, which aim to monitor the health of various links in a network in real time. These can be classified into loss and delay tomography. Loss tomography Loss tomography aims to find "lossy" links in a network by sending active "probes" from various vantage points in the network or the Internet. Delay tomography The area of delay tomography has also attracted attention in the recent past. It aims to find link delays using end-to-end probes sent from vantage points. This can potentially help isolate links with large queueing delays caused by congestion. More applications Network tomography may be able to infer network topology using end-to-end probes. Topology discovery is a tradeoff between accuracy and overhead. With network tomography, the emphasis is on achieving as accurate a picture of the network as possible with minimal overhead. In comparison, other network topology discovery techniques using SNMP or route analytics aim for greater accuracy with less emphasis on
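The core inference in loss tomography can be sketched concretely: if links drop packets independently, an end-to-end path's success probability is the product of its links' success probabilities, so taking logarithms turns path measurements into a linear system over per-link unknowns. The three-link topology, probe rates, and function names below are assumptions of the sketch.

```python
# Link-loss tomography on a toy 3-link network probed over 3 paths.
# Path success rate = product of link success rates, so in log space
# y = A x, where A is the routing matrix and x the per-link log rates.

import math

# routing matrix: rows = probed paths, columns = links (1 = path uses link)
PATHS = [
    [1, 1, 0],   # path A traverses links 0 and 1
    [1, 0, 1],   # path B traverses links 0 and 2
    [0, 1, 1],   # path C traverses links 1 and 2
]

def infer_link_success(path_success_rates):
    """Recover per-link success rates from the three path measurements.

    For this particular routing matrix the linear system y = A x has a
    closed-form solution; a general topology would need least squares.
    """
    y = [math.log(r) for r in path_success_rates]
    x0 = (y[0] + y[1] - y[2]) / 2      # log success of link 0
    x1 = (y[0] - y[1] + y[2]) / 2      # log success of link 1
    x2 = (-y[0] + y[1] + y[2]) / 2     # log success of link 2
    return [math.exp(v) for v in (x0, x1, x2)]

# links succeed with rates 0.9, 0.8, 0.99 -> path rates are the products
observed = [0.9 * 0.8, 0.9 * 0.99, 0.8 * 0.99]
est = infer_link_success(observed)
assert all(abs(e - t) < 1e-9 for e, t in zip(est, (0.9, 0.8, 0.99)))
```

Real loss tomography works from noisy probe counts rather than exact rates, and identifiability depends on the routing matrix having sufficient rank, but the log-linear structure is the same.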
https://en.wikipedia.org/wiki/Cyathium
A cyathium (plural: cyathia) is one of the specialised pseudanthia ("false flowers") forming the inflorescence of plants in the genus Euphorbia (Euphorbiaceae). A cyathium consists of:

Five (rarely four) bracteoles. These are small, united bracts, which form a cup-like involucre. Their upper tips are free and at first cover the opening of the involucre (like the shutter of a camera). These alternate with:
Usually five (from one to ten) nectar glands, which are sometimes fused.
One extremely reduced female flower standing in the centre at the base of the involucre, consisting of an ovary on a short stem with pistil, and surrounded by:
Five groups (one group at the base of each bracteole) of extremely reduced male flowers, each consisting of a single anther on a stem.

The flower-like character of the cyathia is underlined by brightly coloured nectar glands and often by petal-like appendages to the nectar glands, or by brightly coloured, petal-like bracts positioned under the cyathia. The paired petal-like bracts of Euphorbia section Goniostema are called cyathophylls. The cyathia are sometimes solitary, but are usually grouped in cymes, inflorescences of the second order, in pseudumbels, on dichotomously branched stalks, or in so-called simple cymes which consist of one central and two lateral cyathia. In one group of Madagascan species of Euphorbia section Goniostema (E. aureoviridiflora, E. capmanambatoensis, E. iharanae, E. leuconeura, E. neohumbertii, E. viguieri) there is a tendency for a further pseudanthium to grow from the cyme. Probably as an adaptation to pollination by birds, the cyathia have become specialised: most cyathia have upright cyathophylls which surround them protectively, but render the nectar glands inaccessible. To compensate, between them are naked sterile cyathia whose only job is to produce nectar.
https://en.wikipedia.org/wiki/Catabiosis
Catabiosis is the process of growing older, aging and physical degradation. The word comes from the Greek "kata" (down, against, reverse) and "biosis" (way of life), and is generally used to describe senescence and degeneration in living organisms, and the biophysics of aging in general. One popular catabiotic theory is the entropy theory of aging, in which aging is characterized by a thermodynamically favourable increase in structural disorder. Living organisms are open systems that take free energy from the environment and offload their entropy as waste. However, the basic components of living systems—DNA, proteins, lipids and sugars—tend towards the state of maximum entropy while continuously accumulating damage, causing catabiosis of the living structure. The catabiotic force, by contrast, is the influence exerted by living structures on adjoining cells, by which the latter are developed in harmony with the primary structures.

See also

Catabiotic force
DNA damage theory of aging
Medical aspects of death
Senescence
https://en.wikipedia.org/wiki/Pseudanthium
A pseudanthium is an inflorescence that resembles a flower. The word is sometimes used for other structures that are neither a true flower nor a true inflorescence. Examples of pseudanthia include flower heads, composite flowers, or capitula, which are special types of inflorescences in which anything from a small cluster to hundreds or sometimes thousands of flowers are grouped together to form a single flower-like structure. Pseudanthia take various forms. The real flowers (the florets) are generally small and often greatly reduced, but the pseudanthium itself can sometimes be quite large (as in the heads of some varieties of sunflower). Pseudanthia are characteristic of the daisy and sunflower family (Asteraceae), whose flowers are differentiated into ray flowers and disk flowers, unique to this family. The disk flowers in the center of the pseudanthium are actinomorphic and the corolla is fused into a tube. Flowers on the periphery are zygomorphic and the corolla has one large lobe (the so-called "petals" of a daisy are individual ray flowers, for example). Either ray or disk flowers may be absent in some plants: Senecio vulgaris lacks ray flowers and Taraxacum officinale lacks disk flowers. The individual flowers of a pseudanthium in the family Asteraceae (or Compositae) are commonly called florets. The pseudanthium has a whorl of bracts below the flowers, forming an involucre. In all cases, a pseudanthium is superficially indistinguishable from a flower, but closer inspection of its anatomy will reveal that it is composed of multiple flowers. Thus, the pseudanthium represents an evolutionary convergence of the inflorescence to a reduced reproductive unit that may function in pollination like a single flower, at least in plants that are animal pollinated. Pseudanthia may be grouped into types. The first type has units of individual flowers that are recognizable as single flowers even if fused. In the second type, the flowers do not appear as individua
https://en.wikipedia.org/wiki/Joint%20Committee%20for%20Traceability%20in%20Laboratory%20Medicine
The Joint Committee for Traceability in Laboratory Medicine, or JCTLM, is a collaboration between the International Committee for Weights and Measures (CIPM), the International Federation for Clinical Chemistry and Laboratory Medicine (IFCC), and the International Laboratory Accreditation Cooperation (ILAC). The goal of the JCTLM is to provide a worldwide platform to promote and give guidance on internationally recognized and accepted equivalence of measurements in laboratory medicine and traceability to appropriate measurement standards.

See also

Good laboratory practice (GLP)
Institute for Reference Materials and Measurements (IRMM)
Reference range
Reference values
https://en.wikipedia.org/wiki/Local%20Langlands%20conjectures
In mathematics, the local Langlands conjectures, introduced by Robert Langlands, are part of the Langlands program. They describe a correspondence between the complex representations of a reductive algebraic group G over a local field F, and representations of the Langlands group of F into the L-group of G. This correspondence is not a bijection in general. The conjectures can be thought of as a generalization of local class field theory from abelian Galois groups to non-abelian Galois groups.

Local Langlands conjectures for GL1

The local Langlands conjectures for GL1(K) follow from (and are essentially equivalent to) local class field theory. More precisely, the Artin map gives an isomorphism from the group GL1(K) = K* to the abelianization of the Weil group. In particular, irreducible smooth representations of GL1(K) are 1-dimensional as the group is abelian, so they can be identified with homomorphisms of the Weil group to GL1(C). This gives the Langlands correspondence between homomorphisms of the Weil group to GL1(C) and irreducible smooth representations of GL1(K).

Representations of the Weil group

Representations of the Weil group do not quite correspond to irreducible smooth representations of general linear groups. To get a bijection, one has to slightly modify the notion of a representation of the Weil group, to something called a Weil–Deligne representation. This consists of a representation of the Weil group on a vector space V together with a nilpotent endomorphism N of V such that wNw⁻¹ = ||w||N, or equivalently a representation of the Weil–Deligne group. In addition, the representation of the Weil group should have an open kernel, and should be (Frobenius) semisimple. For every Frobenius semisimple complex n-dimensional Weil–Deligne representation ρ of the Weil group of F there is an L-function L(s,ρ) and a local ε-factor ε(s,ρ,ψ) (depending on a character ψ of F).

Representations of GLn(F)

The representations of GLn(F) appearing in the local Langlands correspond
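The compatibility condition on the nilpotent operator in a Weil–Deligne representation can be written out as a display (this just restates the relation in the text, with the norm character made explicit):

```latex
% A Weil--Deligne representation is a pair (\rho, N): a representation
% \rho of the Weil group W_F on V with open kernel, and a nilpotent
% endomorphism N of V, satisfying
\[
  \rho(w)\, N\, \rho(w)^{-1} \;=\; \|w\|\, N
  \qquad \text{for all } w \in W_F,
\]
% where \|w\| denotes the norm character of the Weil group.
```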
https://en.wikipedia.org/wiki/Scleronomous
A mechanical system is scleronomous if the equations of constraints do not contain time as an explicit variable and the equations of constraints can be described by generalized coordinates. Such constraints are called scleronomic constraints. The opposite of scleronomous is rheonomous.

Application

In 3-D space, a particle with mass \(m\) and velocity \(\mathbf{v}\) has kinetic energy
\[ T = \tfrac{1}{2} m v^2 . \]
Velocity is the derivative of position \(\mathbf{r}\) with respect to time \(t\). Use the chain rule for several variables:
\[ \mathbf{v} = \frac{d\mathbf{r}}{dt} = \sum_i \frac{\partial \mathbf{r}}{\partial q_i} \dot{q}_i + \frac{\partial \mathbf{r}}{\partial t} , \]
where \(q_i\) are generalized coordinates. Therefore,
\[ T = \tfrac{1}{2} m \left( \sum_i \frac{\partial \mathbf{r}}{\partial q_i} \dot{q}_i + \frac{\partial \mathbf{r}}{\partial t} \right)^2 . \]
Rearranging the terms carefully,
\[ T = T_0 + T_1 + T_2 , \]
\[ T_0 = \tfrac{1}{2} m \left( \frac{\partial \mathbf{r}}{\partial t} \right)^2 , \qquad
   T_1 = m \frac{\partial \mathbf{r}}{\partial t} \cdot \sum_i \frac{\partial \mathbf{r}}{\partial q_i} \dot{q}_i , \qquad
   T_2 = \tfrac{1}{2} m \sum_{i,j} \frac{\partial \mathbf{r}}{\partial q_i} \cdot \frac{\partial \mathbf{r}}{\partial q_j} \dot{q}_i \dot{q}_j , \]
where \(T_0\), \(T_1\), \(T_2\) are respectively homogeneous functions of degree 0, 1, and 2 in the generalized velocities. If this system is scleronomous, then the position does not depend explicitly on time:
\[ \frac{\partial \mathbf{r}}{\partial t} = 0 . \]
Therefore, only the term \(T_2\) does not vanish:
\[ T = T_2 . \]
Kinetic energy is a homogeneous function of degree 2 in the generalized velocities.

Example: pendulum

As shown at right, a simple pendulum is a system composed of a weight and a string. The string is attached at the top end to a pivot and at the bottom end to a weight. Being inextensible, the string's length is a constant. Therefore, this system is scleronomous; it obeys the scleronomic constraint
\[ \sqrt{x^2 + y^2} - L = 0 , \]
where \((x,\,y)\) is the position of the weight and \(L\) is the length of the string.

Take a more complicated example. Referring to the next figure at right, assume the top end of the string is attached to a pivot point undergoing a simple harmonic motion
\[ x_t = x_0 \cos \omega t , \]
where \(x_0\) is the amplitude, \(\omega\) is the angular frequency, and \(t\) is time. Although the top end of the string is not fixed, the length of this inextensible string is still a constant. The distance between the top end and the weight must stay the same. Therefore, this system is rheonomous, as it obeys a constraint explicitly dependent on time:
\[ \sqrt{(x - x_0 \cos \omega t)^2 + y^2} - L = 0 . \]

See also

Lagrangian mechanics
Holonomic system
Nonholonomic system
Rheonomous
Mass matrix
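The degree-2 homogeneity of the scleronomous kinetic energy can be checked numerically for the fixed-pivot pendulum, where x = L sin θ, y = −L cos θ gives T = ½ m L² θ̇². The numerical values below are arbitrary illustration values:

```python
import math

def pendulum_kinetic_energy(m, L, theta, theta_dot):
    # Fixed pivot (scleronomic constraint): x = L sin(theta), y = -L cos(theta)
    x_dot = L * math.cos(theta) * theta_dot
    y_dot = L * math.sin(theta) * theta_dot
    return 0.5 * m * (x_dot**2 + y_dot**2)   # equals (1/2) m L^2 theta_dot^2

m, L, theta, theta_dot, lam = 2.0, 1.5, 0.7, 0.3, 3.0
T1 = pendulum_kinetic_energy(m, L, theta, theta_dot)
T2 = pendulum_kinetic_energy(m, L, theta, lam * theta_dot)
# Homogeneity of degree 2 in the generalized velocity: T(lam * qdot) = lam^2 * T(qdot)
```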
https://en.wikipedia.org/wiki/Rheonomous
A mechanical system is rheonomous if its equations of constraints contain time as an explicit variable. Such constraints are called rheonomic constraints. The opposite of rheonomous is scleronomous.

Example: simple 2D pendulum

As shown at right, a simple pendulum is a system composed of a weight and a string. The string is attached at the top end to a pivot and at the bottom end to a weight. Being inextensible, the string has a constant length. Therefore, this system is scleronomous; it obeys the scleronomic constraint
\[ \sqrt{x^2 + y^2} - L = 0 , \]
where \((x,\,y)\) is the position of the weight and \(L\) the length of the string. The situation changes if the pivot point is moving, e.g. undergoing a simple harmonic motion
\[ x_t = x_0 \cos \omega t , \]
where \(x_0\) is the amplitude, \(\omega\) the angular frequency, and \(t\) time. Although the top end of the string is not fixed, the length of this inextensible string is still a constant. The distance between the top end and the weight must stay the same. Therefore, this system is rheonomous; it obeys the rheonomic constraint
\[ \sqrt{(x - x_0 \cos \omega t)^2 + y^2} - L = 0 . \]

See also

Lagrangian mechanics
Holonomic constraints
https://en.wikipedia.org/wiki/Temporary%20equilibrium%20method
The temporary equilibrium method was devised by Alfred Marshall for analyzing economic systems that comprise interdependent variables of different speed. Sometimes it is referred to as the moving equilibrium method. For example, assume an industry with a certain capacity that produces a certain commodity. Given this capacity, the supply offered by the industry will depend on the prevailing price. The corresponding supply schedule gives short-run supply. The demand depends on the market price. The price in the market declines if supply exceeds demand, and it increases if supply is less than demand. The price mechanism leads to market clearing in the short run. However, if this short-run equilibrium price is sufficiently high, production will be very profitable, and capacity will increase. This shifts the short-run supply schedule to the right, and a new short-run equilibrium price will be obtained. The resulting short-run equilibria are termed temporary equilibria. The overall system involves two state variables: price and capacity. Using the temporary equilibrium method, it can be reduced to a system involving only one state variable. This is possible because each short-run equilibrium price will be a function of the prevailing capacity, and the change of capacity will be determined by the prevailing price. Hence the change of capacity will be determined by the prevailing capacity. The method works if the price adjusts fast and capacity adjustment is comparatively slow. The mathematical background is provided by the moving equilibrium theorem. In physics, the method is known as scale separation.
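A minimal numerical sketch illustrates the idea. The demand, supply, and adjustment speeds below are made-up functional forms, not Marshall's: price adjusts fast toward the short-run market-clearing level, capacity drifts slowly, and the joint system tracks the sequence of temporary equilibria.

```python
# Hypothetical linear example: demand D(p) = a - b*p, short-run supply S(p, K) = K*p.
# Short-run (temporary) equilibrium price at capacity K: p*(K) = a / (K + b).
# Capacity grows while price exceeds unit cost c, so the long-run rest point
# has p = c and K = a/c - b.
a, b, c = 10.0, 1.0, 1.0
alpha, beta = 50.0, 0.5          # price adjusts ~100x faster than capacity
p, K, dt = 1.0, 1.0, 0.001

for _ in range(200_000):                  # Euler integration to t = 200
    dp = alpha * ((a - b * p) - K * p)    # excess demand drives the price
    dK = beta * (p - c)                   # profitability drives capacity
    p += dp * dt
    K += dK * dt

short_run_price = a / (K + b)    # temporary-equilibrium price at the current capacity
```

Because the price is heavily damped relative to capacity, the simulated price stays close to the temporary-equilibrium price p*(K) throughout, which is exactly the reduction the method exploits.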
https://en.wikipedia.org/wiki/The%20Herbert%20Medal
The Herbert Medal is awarded by the International Bulb Society to those whose achievements in advancing knowledge of ornamental bulbous plants is considered to be outstanding. The medal is named for William Herbert, a noted 19th-century botanist. He published many articles in the Botanical Register and the Botanical Magazine on the subject of bulbous plants, many of which he cultivated in his own gardens. He wrote what became the standard work on the family Amaryllidaceae in 1837. He also published extensively on hybridization based on his own experiments, not only on bulbs but also on other groups of plants. Herbert Medalists A full list of those awarded the Herbert Medal is given on the International Bulb Society website, and in Herbertia (1937–1988).
https://en.wikipedia.org/wiki/Moving%20equilibrium%20theorem
Consider a dynamical system
\[ \dot{x} = f(x,\, y) , \tag{1} \]
\[ \dot{y} = g(x,\, y) , \tag{2} \]
with the state variables \(x\) and \(y\). Assume that \(x\) is fast and \(y\) is slow. Assume that the system (1) gives, for any fixed \(y\), an asymptotically stable solution \(\bar{x}(y)\). Substituting this for \(x\) in (2) yields
\[ \dot{\tilde{y}} = g\bigl(\bar{x}(\tilde{y}),\, \tilde{y}\bigr) . \tag{3} \]
Here \(y\) has been replaced by \(\tilde{y}\) to indicate that the solution to (3) differs from the solution for \(y\) obtainable from the system (1), (2). The moving equilibrium theorem suggested by Lotka states that the solutions \(\tilde{y}(t)\) obtainable from (3) approximate the solutions \(y(t)\) obtainable from (1), (2) provided the partial system (1) is asymptotically stable in \(x\) for any given \(y\) and heavily damped (fast). The theorem has been proved for linear systems comprising real vectors \(x\) and \(y\). It permits reducing high-dimensional dynamical problems to lower dimensions and underlies Alfred Marshall's temporary equilibrium method.
https://en.wikipedia.org/wiki/Coordinate-induced%20basis
In mathematics, a coordinate-induced basis is a basis for the tangent space or cotangent space of a manifold that is induced by a certain coordinate system. Given the coordinate system \(x^a\), the coordinate-induced basis of the tangent space is given by
\[ e_a = \frac{\partial}{\partial x^a} \]
and the dual basis of the cotangent space is
\[ e^a = \mathrm{d}x^a . \]
https://en.wikipedia.org/wiki/Nicholas%20Shepherd-Barron
Nicholas Ian Shepherd-Barron, FRS (born 17 March 1955), is a British mathematician working in algebraic geometry. He is a professor of mathematics at King's College London. Education and career Shepherd-Barron was a scholar of Winchester College. He obtained his B.A. at Jesus College, Cambridge in 1976, and received his Ph.D. at the University of Warwick under the supervision of Miles Reid in 1981. In 2013, he moved from the University of Cambridge to King's College London. Research Shepherd-Barron works in various aspects of algebraic geometry, such as: singularities in the minimal model program; compactification of moduli spaces; the rationality of orbit spaces, including the moduli spaces of curves of genus 4 and 6; the geography of algebraic surfaces in positive characteristic, including a proof of Raynaud's conjecture; canonical models of moduli spaces of abelian varieties; the Schottky problem at the boundary; the relation between algebraic groups and del Pezzo surfaces; the period map for elliptic surfaces. In 2008, with the number theorists Michael Harris and Richard Taylor, he proved the original version of the Sato–Tate conjecture and its generalization to totally real fields, under mild assumptions. Awards and honors Shepherd-Barron was elected Fellow of the Royal Society in 2006. Personal life He is the son of John Shepherd-Barron, a Scottish inventor, who was responsible for inventing the first cash machine in 1967. Notes
https://en.wikipedia.org/wiki/Emma%20Lehmer
Emma Markovna Lehmer (née Trotskaia) (November 6, 1906 – May 7, 2007) was a mathematician known for her work on reciprocity laws in algebraic number theory. She preferred to deal with complex number fields and integers, rather than the more abstract aspects of the theory. Biography She was born in Samara, Russian Empire, but her father's job as a representative with a Russian sugar company moved the family to Harbin, China in 1910. Emma was tutored at home until the age of 14, when a school was opened locally. She managed to make her way to the US for her higher education. At UC Berkeley, she started out in engineering in 1924, but found her niche in mathematics. One of her professors was Derrick N. Lehmer, the number theorist well known for his work on prime number tables and factorizations. While working for him at Berkeley finding pseudosquares, she met his son, her future husband Derrick H. Lehmer. Upon her graduation summa cum laude with a B.A. in Mathematics (1928), Emma married the younger Lehmer. They moved to Brown University, where Emma received her M.Sc., and Derrick his Ph.D., both in 1930. Emma did not obtain a Ph.D. herself; she claimed there were many advantages to not holding a doctorate. The Lehmers had two children, Laura (1932) and Donald (1934). Contributions Lehmer did independent mathematical work, including a translation from Russian to English of Pontryagin's book Topological Groups. She worked closely with her husband on many projects; 21 of her 56 publications were joint work with him. Her publications were mainly in number theory and computation, with emphasis on reciprocity laws, special primes, and congruences. She proved that there were infinitely many Fibonacci pseudoprimes. Paul Halmos, in his book I want to be a mathematician: An automathography, wrote about Lehmer's translation of Pontryagin's Topological Groups: "I read the English translation by Mrs. Lehmer (usually referred to as Emma Lemma)...". Several later publicat
https://en.wikipedia.org/wiki/Asymmetric%20Warfare%20Group
The Asymmetric Warfare Group was a United States Army unit created during the War on Terrorism to mitigate various threats with regard to asymmetric warfare. The unit was headquartered at Fort Meade, Maryland and had a training facility at Fort A.P. Hill, Virginia. The unit provided the linkage between Training and Doctrine Command (TRADOC) and the operational Army, and reported directly to the commanding general of TRADOC. In March 2021, the AWG held a casing of the colors ceremony and officially deactivated.

Organization

The Asymmetric Warfare Group was made up of a headquarters and headquarters detachment and five squadrons:

Able Squadron (Operations)
Baker Squadron (Operations)
Charlie Squadron (Operations)
Dog Squadron (Concepts & Integration)
Easy Squadron (Training)

Each squadron was commanded by a lieutenant colonel and subsequently divided into troops, each commanded by a major.

Mission

The U.S. Army Asymmetric Warfare Group (AWG) provides global operational advisory support to enable U.S. Army forces to win against current and emerging asymmetric threats, and prepare for Large Scale Combat Operations (LSCO). The key tasks of the Asymmetric Warfare Group were to advise, scout, and assist, and to support Doctrine, Organization, Training, Materiel, Leadership and education, Personnel, Facilities and Policy (DOTMLPF-P) integration.

History

The U.S. Army Asymmetric Warfare Group (AWG) was charged with identifying Army and joint force capability gaps in DOTMLPF-P, and developing solutions to those gaps. It further sought to identify enemy threats and develop methods to defeat them. 2016 marked the group's 10th anniversary. In January 2006, the AWG was established as a Field Operating Agency under the operational control of the Deputy Chief of Staff, G-3/5/7, Headquarters, Department of the Army. The AWG was activated on March 8, 2006, at Fort Meade, MD. The AWG was assigned to TRADOC on November 11, 2011 as a direct reporting unit to the commanding gener
https://en.wikipedia.org/wiki/Donaldson%20theory
In mathematics, and especially gauge theory, Donaldson theory is the study of the topology of smooth 4-manifolds using moduli spaces of anti-self-dual instantons. It was started by Simon Donaldson (1983), who proved Donaldson's theorem restricting the possible quadratic forms on the second cohomology group of a compact simply connected 4-manifold. Important consequences of this theorem include the existence of an exotic R4 and the failure of the smooth h-cobordism theorem in 4 dimensions. The results of Donaldson theory therefore depend on the manifold having a differential structure, and are largely false for topological 4-manifolds. Many of the theorems in Donaldson theory can now be proved more easily using Seiberg–Witten theory, though a number of open problems remain in Donaldson theory, such as the Witten conjecture and the Atiyah–Floer conjecture.

See also

Kronheimer–Mrowka basic class
Instanton Floer homology
Yang–Mills equations
https://en.wikipedia.org/wiki/Cot%20filtration
C0t filtration, or CF, is a technique that uses the principles of DNA renaturation kinetics (i.e. Cot analysis) to separate the repetitive DNA sequences that dominate many eukaryotic genomes from "gene-rich" single/low-copy sequences. This allows DNA sequencing to concentrate on the parts of the genome that are most informative and interesting. Concept Briefly, when sheared genomic DNA in solution is heated to near boiling temperature, the molecular forces holding complementary base pairs together are disrupted, and the two strands of each double-helix dissociate or ‘denature.’ If the denatured DNA is then slowly returned to a cooler temperature, sequences will begin to ‘reassociate’ (renature) with complementary strands. The temperature at which renaturation occurs can be regulated so that little or no sequence mismatch is tolerated. The rate at which a sequence finds a complementary strand with which to hybridize is directly related to how common that sequence is in the genome. In other words, those sequences that are extremely abundant (on average) find complementary strands with which to pair relatively quickly while single-copy sequences take much longer to find complements. In CF, genomic DNA is heat-denatured and allowed to renature to a Cot value (Cot = DNA concentration x time x a factor based on the cation concentration of the buffer) at which the majority of repetitive elements have reassociated but single and low-copy elements remain single stranded. Double-stranded, repetitive DNA is separated from single-stranded, low-copy DNA by hydroxyapatite chromatography or other means. Application CF allows the single/low copy sequences and the repetitive sequences of a genome to be studied independently of each other. It can also be used to fractionate highly repetitive DNA from moderately repetitive sequences or to further fractionate isolated kinetic components. CF is most accurately performed if fractionation is based upon the results of a Cot a
https://en.wikipedia.org/wiki/Vesper%20mouse
Vesper mice are rodents belonging to the genus Calomys. They are widely distributed in South America. Some species are notable as the vectors of Argentinian hemorrhagic fever and Bolivian hemorrhagic fever. The genus was originally named Hesperomys, but was changed to Calomys in 1962. History Hesperomys was introduced by George Robert Waterhouse in 1839 for the American rodents with cusps arranged in two series. The name combines the Greek ἑσπερος "west" and μυς "mouse". He considered it possible that species of Hesperomys would be found in the Old World, but did not doubt that the Americas were their chief abode. He included as species Mus bimaculatus (=Calomys laucha), Mus griseo-flavus (=Graomys griseoflavus), Mus Darwinii (=Phyllotis darwini), Mus zanthopygus (=Phyllotis xanthopygus), Mus galapagoensis (=Aegialomys galapagoensis), Symidon hispidum (=Sigmodon hispidus), Mus leucopus (=Peromyscus leucopus), and the woodrats (Neotoma). In following years, authors like Johann Andreas Wagner and Spencer Fullerton Baird expanded the genus to include additional American species, such as those placed now in Scapteromys, Oxymycterus, Abrothrix, and Peromyscus. In 1874, Elliott Coues designated Mus bimaculatus Waterhouse as the type species of Hesperomys. In 1888, Herluf Winge used Hesperomys in a sense similar to modern Calomys (but confusingly placed species related to what is now known as Oryzomys in Calomys), but in the same year Oldfield Thomas argued that Hesperomys could not be separated from the hamsters (Cricetus). In 1896, however, he united it with Eligmodontia instead, where it remained until he reinstated it for modern Calomys in 1916. He did not use Calomys (introduced by Waterhouse in 1837 for Mus bimaculatus), because he thought it to be preoccupied by an earlier name Callomys d'Orbigny and Geoffroy, 1830. In 1962, Philip Hershkovitz noted that the International Code of Zoological Nomenclature mandates that a name cannot be considered preoccupied even w
https://en.wikipedia.org/wiki/CLHEP
CLHEP (short for A Class Library for High Energy Physics) is a C++ library that provides utility classes for general numerical programming, vector arithmetic, geometry, pseudorandom number generation, and linear algebra, specifically targeted for high energy physics simulation and analysis software. The project is hosted by CERN and currently managed by a collaboration of researchers from CERN and other physics research laboratories and academic institutions. According to the project's website, CLHEP is in maintenance mode (accepting bug fixes but no further development is expected). CLHEP was proposed by Swedish physicist Leif Lönnblad in 1992 at a Conference on Computing in High-Energy Physics. Lönnblad is still involved in maintaining CLHEP. The project has more recently accepted contributions from other projects built on top of CLHEP, including the physics packages Geant4 and ZOOM, and the BaBar experiment at SLAC. See also Geant4, a software using CLHEP FreeHEP, a similar library to CLHEP COLT, a Java package for High Performance Scientific and Technical Computing, provided by CERN.
https://en.wikipedia.org/wiki/Crisp%20%27n%20Dry
Crisp 'n Dry is a brand of rapeseed oil manufactured by Edible Oils Limited and marketed by Princes Group. The manufacturer claims this vegetable oil leaves food dry after frying (hence the name Crisp 'n Dry), whereas other vegetable oils require the fried food to be dried with kitchen paper to absorb excess oil. Crisp 'n Dry was previously marketed by Spry, then Unilever, before being acquired by Princes Limited. Crisp 'n Dry contains no cholesterol, and the block version of Crisp 'n Dry no longer contains trans fat. A long-running advertising campaign for Crisp 'n Dry stated that, no matter what day of the week, it could turn any day into a fry day.
https://en.wikipedia.org/wiki/Cot%20analysis
C0t analysis is a biochemical technique, based on the principles of DNA reassociation kinetics, that measures how much repetitive DNA is in a DNA sample such as a genome. It is used to study genome structure and organization and has also been used to simplify the sequencing of genomes that contain large amounts of repetitive sequence.

Procedure

The procedure involves heating a sample of genomic DNA until it denatures into the single-stranded form, and then slowly cooling it, so the strands can pair back together. While the sample is cooling, measurements are taken of how much of the DNA is base-paired at each temperature. The amount of single- and double-stranded DNA is measured by rapidly diluting the sample, which slows reassociation, and then binding the DNA to a hydroxylapatite column. The column is first washed with a low concentration of sodium phosphate buffer, which elutes the single-stranded DNA, and then with high concentrations of phosphate, which elutes the double-stranded DNA. The amount of DNA in these two solutions is then measured using a spectrophotometer.

Analysis

Since a sequence of single-stranded DNA needs to find its complementary strand to reform a double helix, common sequences renature more rapidly than rare sequences. Indeed, the rate at which a sequence will reassociate is proportional to the number of copies of that sequence in the DNA sample. A sample with a highly repetitive sequence will renature rapidly, while complex sequences will renature slowly. However, instead of simply measuring the percentage of double-stranded DNA versus time, the amount of renaturation is measured relative to a C0t value. The C0t value is the product of C0 (the initial concentration of DNA), t (time in seconds), and a constant that depends on the concentration of cations in the buffer. Repetitive DNA will renature at low C0t values, while complex and unique DNA sequences will renature at high C0t values. The fast renaturation of the repe
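For an ideal single kinetic component, reassociation follows second-order kinetics (the classical Britten–Kohne model, assumed here rather than stated in the text above): the fraction of DNA still single-stranded after renaturing to a given C0t value is 1/(1 + k·C0t), where the rate constant k is larger for more repetitive sequences. The rate constants below are arbitrary illustration values:

```python
def single_stranded_fraction(c0t, k):
    """Ideal second-order reassociation: fraction still single-stranded at a given C0t."""
    return 1.0 / (1.0 + k * c0t)

# Half of an ideal component has renatured when k * C0t = 1, i.e. C0t(1/2) = 1/k.
k_repetitive, k_unique = 100.0, 0.01   # hypothetical rate constants
c0t = 1.0
rep = single_stranded_fraction(c0t, k_repetitive)   # mostly double-stranded already
uni = single_stranded_fraction(c0t, k_unique)       # still almost all single-stranded
```

This is why repetitive DNA renatures at low C0t values while unique sequences require high C0t values, and why a genome's C0t curve is fitted as a sum of such components.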
https://en.wikipedia.org/wiki/Hutter%20Prize
The Hutter Prize is a cash prize funded by Marcus Hutter which rewards data compression improvements on a specific 1 GB English text file, with the goal of encouraging research in artificial intelligence (AI). Launched in 2006, the prize awards 5000 euros for each one percent improvement (with 500,000 euros total funding) in the compressed size of the file enwik9, which is the larger of two files used in the Large Text Compression Benchmark (LTCB); enwik9 consists of the first 10^9 bytes of a specific version of English Wikipedia. The ongoing competition is organized by Hutter, Matt Mahoney, and Jim Bowery. The text data of enwik8 and enwik9 remains a key tool for evaluating the performance of compression algorithms (as done in Hutter's LTCB) and of language models.

Goals

The goal of the Hutter Prize is to encourage research in artificial intelligence (AI). The organizers believe that text compression and AI are equivalent problems. Hutter proved that the optimal behavior of a goal-seeking agent in an unknown but computable environment is to guess at each step that the environment is probably controlled by one of the shortest programs consistent with all interaction so far. However, there is no general solution because Kolmogorov complexity is not computable. Hutter proved that in the restricted case (called AIXItl) where the environment is restricted to time t and space l, a solution can be computed in time O(t·2^l), which is still intractable. The organizers further believe that compressing natural language text is a hard AI problem, equivalent to passing the Turing test. Thus, progress toward one goal represents progress toward the other. They argue that predicting which characters are most likely to occur next in a text sequence requires vast real-world knowledge. A text compressor must solve the same problem in order to assign the shortest codes to the most likely text sequences. Most large language models and neural network models are not eligible for the
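The payout rule stated above can be sketched as a small function. This is a simplification: the actual contest rules add minimum-improvement thresholds, verification requirements, and other conditions not modelled here.

```python
FUND = 500_000  # total euros available

def payout(previous_record_bytes, new_record_bytes):
    """5000 euros per 1% improvement, i.e. FUND times the fractional improvement."""
    improvement = 1.0 - new_record_bytes / previous_record_bytes
    if improvement <= 0:
        return 0.0
    return FUND * improvement

# A compressed file 1% smaller than the record earns 5000 euros under this simplified rule.
example = payout(100_000_000, 99_000_000)
```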
https://en.wikipedia.org/wiki/Software%20Updater
In several Linux operating systems, the Software Updater program updates installed software and their associated packages with important software updates for security or with recommended patches. It also informs users when updates are available, listing them in alphabetical order for users to choose which updates to install, if any. It was originally written for Ubuntu, although it is now part of other APT-based systems. The application was originally called Update Manager; it was announced in May 2012 that starting with Ubuntu 12.10 the name would change to Software Updater to better describe its functions. Technically the rename only affected the GUI; the name of the APT package containing the application, the executable file itself, and internally the software itself, still use the name update-manager. The Software Updater cannot uninstall updates, although this can be accomplished by other package managers such as Ubuntu Software Center and more technically advanced ones such as Synaptic. In Ubuntu, the Software Updater can update the operating system to new versions which are released every six months for standard releases or every two years for Long Term Support releases. This functionality is included by default in the desktop version but needs to be added to the server version. Distributions that use the Software Updater Kubuntu Ubuntu Ubuntu GNOME Ubuntu Kylin Ubuntu MATE Xubuntu Zorin OS See also Advanced Packaging Tool KPackage Package management system Synaptic (software) Ubuntu Software Center
https://en.wikipedia.org/wiki/Geodesic%20grid
A geodesic grid is a spatial grid based on a geodesic polyhedron or Goldberg polyhedron. History The earliest use of the (icosahedral) geodesic grid in geophysical modeling dates back to 1968 and the work by Sadourny, Arakawa, and Mintz and Williamson. Later work expanded on this base. Construction A geodesic grid is a global Earth reference that uses triangular tiles based on the subdivision of a polyhedron (usually the icosahedron, and usually a Class I subdivision) to subdivide the surface of the Earth. Such a grid does not have a straightforward relationship to latitude and longitude, but conforms to many of the main criteria for a statistically valid discrete global grid. Primarily, the cells' area and shape are generally similar, especially near the poles where many other spatial grids have singularities or heavy distortion. The popular Quaternary Triangular Mesh (QTM) falls into this category. Geodesic grids may use the dual polyhedron of the geodesic polyhedron, which is the Goldberg polyhedron. Goldberg polyhedra are made up of hexagons and (if based on the icosahedron) 12 pentagons. One implementation that uses an icosahedron as the base polyhedron, hexagonal cells, and the Snyder equal-area projection is known as the Icosahedron Snyder Equal Area (ISEA) grid. Applications In biodiversity science, geodesic grids are a global extension of local discrete grids that are staked out in field studies to ensure appropriate statistical sampling and larger multi-use grids deployed at regional and national levels to develop an aggregated understanding of biodiversity. These grids translate environmental and ecological monitoring data from multiple spatial and temporal scales into assessments of current ecological condition and forecasts of risks to our natural resources. A geodesic grid allows local to global assimilation of ecologically significant information at its own level of granularity. When modeling the weather, ocean circulation, or the climate, parti
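The Class I subdivision described above can be sketched for a single face: a triangular lattice of points is interpolated across the flat face of the base polyhedron, and each point is then normalized back onto the unit sphere. This is a minimal illustration of the construction, not a full geodesic-grid library; the vertex coordinates below are a convenient spherical triangle, not actual icosahedron vertices.

```python
from math import sqrt

def subdivide_face(v0, v1, v2, n):
    """Class I (frequency-n) subdivision of one spherical triangle:
    barycentric lattice points on the flat face, projected to the
    unit sphere.  Produces (n+1)(n+2)/2 grid points."""
    points = []
    for i in range(n + 1):
        for j in range(n + 1 - i):
            k = n - i - j
            # barycentric combination of the three face vertices
            p = [(i * a + j * b + k * c) / n for a, b, c in zip(v0, v1, v2)]
            norm = sqrt(sum(x * x for x in p))
            points.append(tuple(x / norm for x in p))  # project to sphere
    return points

# One face subdivided at frequency 4 yields (4+1)*(4+2)/2 = 15 points.
v0, v1, v2 = (1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0)
grid = subdivide_face(v0, v1, v2, 4)
```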
https://en.wikipedia.org/wiki/Terminal%20and%20nonterminal%20symbols
In formal languages, terminal and nonterminal symbols are the lexical elements used in specifying the production rules constituting a formal grammar. Terminal symbols are the elementary symbols of the language defined as part of a formal grammar. Nonterminal symbols (or syntactic variables) are replaced by groups of terminal symbols according to the production rules. The terminals and nonterminals of a particular grammar are in two completely separate sets. Terminal symbols Terminal symbols are symbols that may appear in the outputs of the production rules of a formal grammar and which cannot be changed using the rules of the grammar. Applying the rules recursively to a source string of symbols will usually terminate in a final output string consisting only of terminal symbols. Consider a grammar defined by two rules. In this grammar, the symbol Б is a terminal symbol and Ψ is both a non-terminal symbol and the start symbol. The production rules for creating strings are as follows: The symbol Ψ can become БΨ The symbol Ψ can become Б Here Б is a terminal symbol because no rule exists which would change it into something else. On the other hand, Ψ has two rules that can change it, thus it is nonterminal. A formal language defined or generated by a particular grammar is the set of strings that can be produced by the grammar and that consist only of terminal symbols. Diagram 1 illustrates a string that can be produced with this grammar. Nonterminal symbols Nonterminal symbols are those symbols that can be replaced. They may also be called simply syntactic variables. A formal grammar includes a start symbol, a designated member of the set of nonterminals from which all the strings in the language may be derived by successive applications of the production rules. In fact, the language defined by a grammar is precisely the set of terminal strings that can be so derived. Context-free grammars are those grammars in which the left-hand side of each production
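The two-rule grammar above (Ψ → БΨ and Ψ → Б) can be run as a toy derivation, enumerating the terminal strings it generates up to a length bound. This is a sketch of left-most rewriting, not a general parsing algorithm.

```python
# Grammar from the section above: start symbol Ψ, terminal Б,
# with the productions Ψ -> БΨ and Ψ -> Б.
RULES = {"Ψ": ["БΨ", "Б"]}

def derive(symbols, max_len=4, language=None):
    """Enumerate terminal strings derivable from `symbols` by
    left-most rewriting, up to a length bound (a toy sketch)."""
    if language is None:
        language = set()
    if len(symbols) > max_len:
        return language
    nonterminals = [s for s in symbols if s in RULES]
    if not nonterminals:
        language.add(symbols)          # all symbols terminal: a sentence
        return language
    i = symbols.index(nonterminals[0])
    for rhs in RULES[nonterminals[0]]:  # try every production for it
        derive(symbols[:i] + rhs + symbols[i + 1:], max_len, language)
    return language

# The generated language is {Б, ББ, БББ, ...}; bounded at length 4
# this yields {"Б", "ББ", "БББ", "ББББ"}.
sentences = derive("Ψ")
```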
https://en.wikipedia.org/wiki/TAMDAR
TAMDAR (Tropospheric Airborne Meteorological Data Reporting) is a weather monitoring system that consists of an in situ atmospheric sensor mounted on commercial aircraft for data gathering. It collects information similar to that collected by radiosondes carried aloft by weather balloons. It was developed by AirDat LLC, which was acquired by Panasonic Avionics Corporation in April 2013 and was operated until October 2018 under the name Panasonic Weather Solutions. It is now owned by FLYHT Aerospace Solutions Ltd. History In response to a governmental aviation safety initiative in the early 2000s, NASA, in partnership with the FAA, NOAA, and private industry, sponsored the early development and evaluation of a proprietary multifunction in situ atmospheric sensor for aircraft. The predecessor to Panasonic Weather Solutions, AirDat (formerly ODS of Rapid City, SD), located in Morrisville, North Carolina and Lakewood, Colorado, was formed in 2003 to develop and deploy the Tropospheric Airborne Meteorological Data Reporting (TAMDAR) system based on requirements provided by the Global Systems Division (GSD) of NOAA's Earth System Research Laboratory (ESRL), the FAA, and the World Meteorological Organization (WMO). The TAMDAR sensor was originally deployed in December 2004 on a fleet of 63 Saab SF340 aircraft operated by Mesaba Airlines in the Great Lakes region of the United States as a part of the NASA-sponsored Great Lakes Fleet Experiment (GLFE). Over the last twelve years, equipage of the sensors has expanded beyond the continental US to include Alaska, the Caribbean, Mexico, Central America, Europe, and Asia. Airlines flying the system include Icelandair, Horizon (Alaska Air Group), Chautauqua (Republic Airways), Piedmont (American Airlines), AeroMéxico, Ravn Alaska, Hageland, PenAir, Silver Airways, and Flybe, as well as a few research aircraft including the UK Met Office BAe-146 FAAM aircraft. Recently, an installation agreement has been reached with a large So
https://en.wikipedia.org/wiki/Leonidas%20Alaoglu
Leonidas (Leon) Alaoglu (March 19, 1914 – August 1981) was a mathematician, known for his result, called Alaoglu's theorem, on the weak-star compactness of the closed unit ball in the dual of a normed space, also known as the Banach–Alaoglu theorem. Life and work Alaoglu was born in Red Deer, Alberta to Greek parents. He received his BS in 1936, Master's in 1937, and PhD in 1938 (at the age of 24), all from the University of Chicago. His thesis, written under the direction of Lawrence M. Graves, was entitled Weak topologies of normed linear spaces. His doctoral thesis is the source of Alaoglu's theorem. The Bourbaki–Alaoglu theorem is a generalization of this result by Bourbaki to dual topologies. After some years teaching at Pennsylvania State College, Harvard University and Purdue University, in 1944 he became an operations analyst for the United States Air Force. In his last position, from 1953 to 1981, he worked as a senior scientist in operations research at the Lockheed Corporation in Burbank, California. In this latter period he wrote numerous research reports, some of them classified. During the Lockheed years he took an active part in seminars and other mathematical activities at Caltech, UCLA and USC. After his death in 1981 a Leonidas Alaoglu Memorial Lecture Series was established at Caltech. Speakers have included Paul Erdős, Irving Kaplansky, Paul Halmos and Hugh Woodin. See also Axiom of Choice – The Banach–Alaoglu theorem is not provable from ZF without use of the Axiom of Choice. Banach–Alaoglu theorem Gelfand representation List of functional analysis topics Superabundant number – Article explains the 1944 results of Alaoglu and Erdős on this topic Tychonoff's theorem Weak topology – Leads to the weak-star topology to which the Banach–Alaoglu theorem applies. Publications Alaoglu, Leonidas (M.S. thesis, U. of Chicago, 1937). "The asymptotic Waring problem for fifth and sixth powers" (24 pages). Advisor: Leonard Eugene Dickson Alao
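For reference, the theorem named for Alaoglu can be stated as follows (this is the standard modern formulation, not necessarily the wording of his thesis):

```latex
% Banach–Alaoglu theorem, standard modern formulation.
Let $X$ be a normed vector space with continuous dual $X^{*}$.
Then the closed unit ball
\[
  B_{X^{*}} \;=\; \{\, f \in X^{*} : \|f\| \le 1 \,\}
\]
is compact in the weak-$*$ topology on $X^{*}$.
```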
https://en.wikipedia.org/wiki/RNA-dependent%20RNA%20polymerase
RNA-dependent RNA polymerase (RdRp) or RNA replicase is an enzyme that catalyzes the replication of RNA from an RNA template. Specifically, it catalyzes synthesis of the RNA strand complementary to a given RNA template. This is in contrast to typical DNA-dependent RNA polymerases, which all organisms use to catalyze the transcription of RNA from a DNA template. RdRp is an essential protein encoded in the genomes of most RNA-containing viruses with no DNA stage including SARS-CoV-2. Some eukaryotes also contain RdRps, which are involved in RNA interference and differ structurally from viral RdRps. History Viral RdRps were discovered in the early 1960s from studies on mengovirus and polio virus when it was observed that these viruses were not sensitive to actinomycin D, a drug that inhibits cellular DNA-directed RNA synthesis. This lack of sensitivity suggested that there is a virus-specific enzyme that could copy RNA from an RNA template and not from a DNA template. Distribution RdRps are highly conserved throughout viruses and are even related to telomerase, though the reason for this is an ongoing question as of 2009. The similarity has led to speculation that viral RdRps are ancestral to human telomerase. The most famous example of RdRp is that of the polio virus. The viral genome is composed of RNA, which enters the cell through receptor-mediated endocytosis. From there, the RNA is able to act as a template for complementary RNA synthesis, immediately. The complementary strand is then, itself, able to act as a template for the production of new viral genomes that are further packaged and released from the cell ready to infect more host cells. The advantage of this method of replication is that there is no DNA stage; replication is quick and easy. The disadvantage is that there is no 'back-up' DNA copy. Many RdRps are associated tightly with membranes and are, therefore, difficult to study. The best-known RdRps are polioviral 3Dpol, vesicular stomatiti
https://en.wikipedia.org/wiki/2006%20Missouri%20Amendment%202
Missouri Constitutional Amendment 2 (The Missouri Stem Cell Research and Cures Initiative) was a state constitutional amendment initiative that concerned stem cell research and human cloning. It allows any stem cell research and therapy in the U.S. state of Missouri that is legal under federal law, including somatic cell nuclear transfer to produce human embryos for stem cell production. It prohibits cloning or attempting to clone a human being, which is defined to mean "to implant in a uterus or attempt to implant in a uterus anything other than the product of fertilization of an egg of a human female by a sperm of a human male for the purpose of initiating a pregnancy that could result in the creation of a human fetus, or the birth of a human being". Commercials supporting and opposing the amendment aired during the 2006 World Series, in which the St. Louis Cardinals participated. The issue became especially intertwined with the 2006 U.S. Senate election in Missouri, with the Republican and Democratic candidates on opposite sides of the issue. Missouri Constitutional Amendment 2 appeared on the ballot for the November 2006 general election and passed with 51% of the vote. Support The organization that led the movement to get the initiative on the ballot and later supported its adoption was called the Missouri Coalition for Lifesaving Cures. The measure was proposed to stop repeated attempts by the Missouri Legislature to ban certain types of stem cell research, namely SCNT. Claire McCaskill, the Democratic nominee for U.S. Senate, supported the measure. During the 2006 World Series, which was partially held in St. Louis, a television ad featuring actor Michael J. Fox aired. The ad was paid for by McCaskill's campaign, and the primary reason Fox gave for his support for McCaskill was her stance in favor of stem cell research. The advertisement was controversial because Fox was visibly suffering tremors, which were side effects of the medications used to treat
https://en.wikipedia.org/wiki/International%20Ranger%20Federation
The International Ranger Federation is an organisation which represents Park Rangers and Park Wardens across the world. Many countries have agencies that undertake the protection and management of natural areas. The rangers within these organisations are represented at the international level by the International Ranger Federation (IRF). The IRF seeks to represent Park Rangers on a professional level. A number of countries also have affiliated organisations with the same goals. Every three years the World Ranger Congress is held by a host country, with the next one in Chitwan, Nepal, 11 to 16 November 2019. History The International Ranger Federation (IRF) was founded in 1992 with a signed agreement between the Countryside Management Association (CMA), representing rangers in England and Wales; the Scottish Countryside Rangers Association (SCRA); and the U.S. Association of National Park Rangers (ANPR). The IRF is a non-profit organisation established to raise awareness of and support the critical work that Rangers do in conserving the world's natural and cultural heritage. The role of the IRF is to empower Rangers by supporting their national or state Ranger organisations, or assisting in the establishment of local Ranger associations in countries where they do not currently exist. As of 2019 the IRF has over 100 member associations of Rangers, and over 60 countries have applied for one of three membership types: Regular, Provisional and Associate Membership. Purpose The goals of the IRF are to provide a forum for rangers from around the world to share their successes and failures in protecting the world's heritage and to promote information and technology transfer from countries in which protected area management enjoys broad public and government support to countries in which protected area management is less well supported. Affiliated Organisations The Australian Ranger Federation (ARF), Associazione Italiana Guardie dei Parchi e delle Aree Protett
https://en.wikipedia.org/wiki/Context-aware%20pervasive%20systems
Context-aware computing refers to a general class of mobile systems that can sense their physical environment, and adapt their behavior accordingly. Three important aspects of context are: where you are; who you are with; and what resources are nearby. Although location is a primary capability, location-awareness does not necessarily capture things of interest that are mobile or changing. "Context-aware", in contrast, is used more generally to include nearby people, devices, lighting, noise level, network availability, and even the social situation, e.g., whether you are with your family or a friend from school. History The concept emerged from ubiquitous computing research at Xerox PARC and elsewhere in the early 1990s. The term 'context-aware' was first used by Schilit and Theimer in their 1994 paper Disseminating Active Map Information to Mobile Hosts, where they describe a model of computing in which users interact with many different mobile and stationary computers, and classify a context-aware system as one that can adapt according to its location of use, the collection of nearby people and objects, and the changes to those objects over the course of the day. See also Ambient intelligence Context awareness Differentiated service (design pattern) Locative Media
https://en.wikipedia.org/wiki/Visual%20hull
A visual hull is a geometric entity created by the shape-from-silhouette 3D reconstruction technique introduced by A. Laurentini. This technique assumes that the foreground object in an image can be separated from the background. Under this assumption, the original image can be thresholded into a foreground/background binary image, called a silhouette image. The foreground mask, known as a silhouette, is the 2D projection of the corresponding 3D foreground object. Along with the camera viewing parameters, the silhouette defines a back-projected generalized cone that contains the actual object. This cone is called a silhouette cone. The upper right thumbnail shows two such cones produced from two silhouette images taken from different viewpoints. The intersection of the two cones is called a visual hull, which is a bounding geometry of the actual 3D object (see the bottom right thumbnail). When the reconstructed geometry is only used for rendering from a different viewpoint, the implicit reconstruction together with rendering can be done using graphics hardware. In two dimensions A technique used in some modern touchscreen devices employs cameras placed in the corners opposite infrared LEDs. The one-dimensional projection (shadow) of objects on the surface may be used to reconstruct the convex hull of the object. See also 3D reconstruction from multiple images Tomographic reconstruction
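The cone-intersection idea above can be sketched in a toy 2-D setting: each "camera" sees a 1-D silhouette under orthographic projection, and a grid cell survives the carving only if every view's back-projected cone contains it. This is a minimal illustration of the intersection principle, not a real calibrated-camera implementation.

```python
def visual_hull_2d(silhouette_x, silhouette_y, size):
    """Toy orthographic visual-hull carving on a 2-D grid.  The two
    'views' are projections onto the x and y axes; a cell belongs to
    the visual hull only if it projects into both silhouettes."""
    hull = set()
    for x in range(size):
        for y in range(size):
            if x in silhouette_x and y in silhouette_y:
                hull.add((x, y))       # inside every silhouette cone
    return hull

# Two perpendicular orthographic views: the visual hull is the
# Cartesian product of the silhouettes, a bounding region that may
# be larger than the actual object.
hull = visual_hull_2d({1, 2}, {2, 3}, size=4)
# -> {(1, 2), (1, 3), (2, 2), (2, 3)}
```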
https://en.wikipedia.org/wiki/Error%20amplifier%20%28electronics%29
An error amplifier is most commonly encountered in feedback unidirectional voltage control circuits, where the sampled output voltage of the circuit under control is fed back and compared to a stable reference voltage. Any difference between the two generates a compensating error voltage, which tends to move the output voltage towards the design specification. As its name suggests, an error amplifier amplifies an error signal: the difference between a reference signal and the input signal, which can also be treated as the difference between its two inputs. Error amplifiers are usually used within feedback loops, owing to their self-correcting mechanism, and have an inverting and a non-inverting input, which makes the output proportional to the difference of the inputs. Devices Discrete transistors Operational amplifiers Applications Regulated power supply D.C. power amplifiers Measurement equipment Servomechanisms See also Differential amplifier External links Error Amplifier Design and Application, alphascientific.com. Originally accessed 27 April 2009, now 404. Try https://web.archive.org/web/20081006222215/http://www.alphascientific.com/technotes/technote3.pdf Error amplifier as an element in a voltage regulator: Stability analysis of low-dropout linear regulators with a PMOS pass element Electronic amplifiers
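The self-correcting loop described above can be sketched as a discrete-time simulation: at each step the error amplifier produces a correction proportional to (reference − sampled output), nudging the output toward the reference. The gain and step count below are illustrative choices, not values from any real regulator.

```python
def regulate(v_ref, v_out0, gain=0.5, steps=20):
    """Discrete-time sketch of an error-amplifier feedback loop:
    each step applies a correction proportional to the difference
    between the reference and the sampled output."""
    v_out = v_out0
    history = [v_out]
    for _ in range(steps):
        error = v_ref - v_out          # difference of the two inputs
        v_out += gain * error          # compensating correction
        history.append(v_out)
    return history

trace = regulate(v_ref=5.0, v_out0=0.0)
# the output converges toward the 5.0 V reference
```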
https://en.wikipedia.org/wiki/Regulated%20power%20supply
A regulated power supply is an embedded circuit; it converts unregulated AC (alternating current) into a constant DC. With the help of a rectifier it converts AC supply into DC. Its function is to supply a stable voltage (or less often current) to a circuit or device that must be operated within certain power supply limits. The output from the regulated power supply may be alternating or unidirectional, but is nearly always DC (direct current). The type of stabilization used may be restricted to ensuring that the output remains within certain limits under various load conditions, or it may also include compensation for variations in its own supply source. The latter is much more common today. Applications D.C. variable bench supply A bench power supply usually refers to a power supply capable of supplying a variety of output voltages useful for bench testing electronic circuits, possibly with continuous variation of the output voltage, or just some preset voltages. Some have multiple selectable ranges of current/voltage limits, which tend to be inversely proportional. A laboratory ("lab") power supply normally implies an accurate bench power supply, while a balanced or tracking power supply refers to twin supplies for use when a circuit requires both positive and negative supply rails. Types Variable bench power supplies exist both as linear (transformer first) and switched-mode power supplies (full-bridge rectifier first), each with a different set of benefits and disadvantages: Linear The linear type produces only very little noise (or "ripple voltage") and is less prone to external electromagnetic and radio frequency interference (EMI, RFI), making it preferable for audio equipment, radio-related applications, and powering delicate circuitry. Linear power supplies also have fewer failure-prone parts, which increases longevity, and have a quicker transient response. Linear variable bench power supplies are the older design, dating back at least to
https://en.wikipedia.org/wiki/NavPix
NavPix is the proprietary name applied by Navman to its technology that combines an image with geographical data. The "NavPix" name is used for both the software and the geo-referenced image that results from that software. NavPix software The NavPix technology enables users to take a JPEG image using the integrated digital camera on the N Series ("N" for NavPix), iCN 720 or iCN 750 portable Navman GPS navigation devices. The Navman's GPS (Global Positioning System) receiver determines the latitude and longitude of where that image was taken. That information is then written into the image's Exif (Exchangeable image file format) metadata by the NavPix software. The NavPix therefore effectively provides a georeference of the location where the image was taken, which is not necessarily the same georeference as the object being "NavPix-ed". The NavPix image can then be used to define a destination or point of interest on compatible Navman devices. NavPix sources Furthermore, as the geographical information is written to the metadata, the image itself can be shared between compatible devices or uploaded to Navman's NavPix Library, which offers a wide range of NavPix images taken by Navman users or sourced from professional photo providers, including Lonely Planet. The NavPix Library also enables people to upload non-NavPix images (including other formats such as GIF) and convert them to NavPix images, either by entering the latitude and longitude they want to associate with the image, or by entering the address and using the Library's software to generate the latitude and longitude values based on a postal code look-up. Unlike some geo-referencing applications, the NavPix Library writes the georeference values to the image itself via the Exif metadata. Common misconceptions The photo-taking ability does not itself assist navigation. See also GPS eXchange Format (XML schema for interchange of waypoints) External links NavPix Library GPS G
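Writing a georeference into Exif metadata, as described above, involves converting a signed decimal coordinate into the degrees/minutes/seconds form used by the Exif GPS tags. The sketch below shows that generic conversion; it is not Navman's actual (proprietary) NavPix code.

```python
def to_exif_dms(decimal_degrees):
    """Convert a signed decimal coordinate into the
    degrees/minutes/seconds values used by the Exif GPS tags.
    The hemisphere sign is carried separately (in Exif, via the
    GPSLatitudeRef/GPSLongitudeRef tags)."""
    magnitude = abs(decimal_degrees)
    degrees = int(magnitude)
    minutes = int((magnitude - degrees) * 60)
    seconds = round((magnitude - degrees - minutes / 60) * 3600, 2)
    return degrees, minutes, seconds

# Latitude -36.8485 becomes 36 deg 50' 54.6"; the southern
# hemisphere is recorded in the separate reference tag.
d, m, s = to_exif_dms(-36.8485)
```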
https://en.wikipedia.org/wiki/S4%20Index
The S4 index is a standard index used to measure ionospheric disturbances. It is defined as the ratio of the standard deviation of signal intensity to the average signal intensity. Real-time data This parameter is displayed in real time by many institutions: at Arecibo Observatory at Cornell University at INPE, in Brazil
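The definition above — the ratio of the standard deviation of signal intensity to its mean — translates directly into a short computation over a list of intensity samples:

```python
from math import sqrt

def s4_index(intensities):
    """S4 scintillation index as defined above: standard deviation
    of received signal intensity divided by its mean (population
    standard deviation over the sample window)."""
    n = len(intensities)
    mean = sum(intensities) / n
    variance = sum((i - mean) ** 2 for i in intensities) / n
    return sqrt(variance) / mean

# A perfectly steady signal has S4 = 0; fluctuations raise it.
steady = s4_index([1.0, 1.0, 1.0])   # -> 0.0
mild = s4_index([1.0, 1.1, 0.9])     # mild scintillation
```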
https://en.wikipedia.org/wiki/Teaching%20clinic
A teaching clinic is an outpatient clinic that provides health care for ambulatory patients, as opposed to inpatients treated in a hospital. Teaching clinics traditionally are operated by educational facilities and provide free or low-cost services to patients. Teaching clinics differ from standard health clinics in that treatment is performed by graduate students under the supervision of licensed health care providers, or by licensed health care providers while graduate students observe. Teaching clinics serve the dual purpose of providing a setting for students in the health care professions to learn and practice skills, while simultaneously offering lower-cost treatments to patients. Patients In some instances, patient recruitment relies in part on a model in which students' friends and family are commonly recruited to be patients. See also Teaching hospital
https://en.wikipedia.org/wiki/Cooking%20base
Cooking base, sometimes called soup base, is a concentrated flavoring compound used in place of stock for the creation of soups, sauces, and gravies. Since it can be purchased rather than prepared fresh, it is commonly used in restaurants where cost is a more important factor than achieving haute cuisine. Veal and chicken base are common, as are beef, lamb and vegetable bases. Soup base is available in many levels of quality, and these products are also produced in low-sodium, very-low-sodium, seafood, and vegetarian varieties.
https://en.wikipedia.org/wiki/Femlin
The Femlin is a character used on the Party Jokes page of Playboy magazine. Created in 1955 by LeRoy Neiman, Femlins became a mainstay of the magazine for more than five decades. Some Femlin figurines produced in the 1960s have become much sought-after by collectors. History Femlins were created by sport illustrator LeRoy Neiman in 1955 when publisher/editor Hugh Hefner decided the Party Jokes page needed a visual element. The name is a portmanteau of "female" and "gremlin." They are portrayed as mischievous black and white female sprites, apparently tall, wearing only opera gloves, stockings and high heel shoes. They are usually drawn in two or three panel vignettes, interacting with various life-sized items such as shoes, jewelry, neckties and such. Femlins have appeared on the Party Jokes page in every issue since their creation, and were featured on the magazine's cover numerous times, either as drawn by Neiman or in photographed tableaus of sculpted clay models. Neiman reportedly submitted two drawings of Femlin to Playboy every month for more than 50 years, working on the character late into his life, before his death at the age of 91 in 2012. Merchandising Femlins have been featured on a variety of merchandise throughout the years, such as ashtrays, shotglasses, and coffee mugs. A set of four plaster statues, the tallest approximately 14" high, was advertised for sale in the back pages of Playboy in 1963. (Like the drawings on which they were based, these statues were not anatomically detailed.) Originally priced at US$7.50 apiece in 1963, a complete set of the four statuettes was auctioned off by Leland's auction house in June 2004 for US$7,904.80, according to a Google cache of the auction. In 2004, Playboy produced a new, updated figurine of a Femlin sitting in a champagne glass. Though now out of production, these are extremely common, and should not be confused with the older figurines. External links
https://en.wikipedia.org/wiki/Sympetrum
Sympetrum is a genus of small to medium-sized skimmer dragonflies, known as darters in the UK and as meadowhawks in North America. The more than 50 species predominantly live in the temperate zone of the Northern Hemisphere; no Sympetrum species is native to Australia. Most North American darters fly in late summer and autumn, breeding in ponds and foraging over meadows. Commonly, they are yellow-gold as juveniles, with mature males and some females becoming bright red on part or all of their bodies. An exception to this color scheme is the black darter (Sympetrum danae). The genus includes the following species: Sympetrum ambiguum – blue-faced meadowhawk Sympetrum anomalum Sympetrum arenicolor Sympetrum baccha Sympetrum chaconi Sympetrum commixtum Sympetrum cordulegaster Sympetrum corruptum – variegated meadowhawk Sympetrum costiferum – saffron-winged meadowhawk Sympetrum croceolum Sympetrum daliensis Sympetrum danae – black darter, black meadowhawk Sympetrum darwinianum Sympetrum depressiusculum – spotted darter Sympetrum dilatatum – St. Helena darter Sympetrum durum Sympetrum eroticum Sympetrum evanescens Sympetrum flaveolum – yellow-winged darter Sympetrum fonscolombii – red-veined darter, nomad Sympetrum frequens Sympetrum gilvum Sympetrum gracile Sympetrum haematoneura Sympetrum haritonovi – dwarf darter Sympetrum himalayanum Sympetrum hypomelas Sympetrum illotum – cardinal meadowhawk Sympetrum imitans Sympetrum infuscatum Sympetrum internum – cherry-faced meadowhawk Sympetrum kunckeli Sympetrum maculatum Sympetrum madidum – red-veined meadowhawk Sympetrum meridionale – southern darter Sympetrum nigrifemur – island darter Sympetrum nigrocreatum – Talamanca meadowhawk Sympetrum nomurai Sympetrum obtrusum – white-faced meadowhawk Sympetrum orientale Sympetrum pallipes – striped meadowhawk Sympetrum paramo Sympetrum parvulum Sympetrum pedemontanum – banded darter Sympetrum risi Sympetrum roraimae Sympetrum rubicu
https://en.wikipedia.org/wiki/Index%20of%20protein-related%20articles
Proteins are a class of biomolecules composed of amino acid chains. Biochemistry Antifreeze protein, class of polypeptides produced by certain fish, vertebrates, plants, fungi and bacteria Conjugated protein, protein that functions in interaction with other chemical groups attached by covalent bonds Denatured protein, protein which has lost its functional conformation Matrix protein, structural protein linking the viral envelope with the virus core Protein A, bacterial surface protein that binds antibodies Protein A/G, recombinant protein that binds antibodies Protein C, anticoagulant Protein G, bacterial surface protein that binds antibodies Protein L, bacterial surface protein that binds antibodies Protein S, plasma glycoprotein Protein Z, glycoprotein Protein catabolism, the breakdown of proteins into amino acids and simple derivative compounds Protein complex, group of two or more associated proteins Protein electrophoresis, method of analysing a mixture of proteins by means of gel electrophoresis Protein folding, process by which a protein assumes its characteristic functional shape or tertiary structure Protein isoform, version of a protein with some small differences Protein kinase, enzyme that modifies other proteins by chemically adding phosphate groups to them Protein ligands, atoms, molecules, and ions which can bind to specific sites on proteins Protein microarray, piece of glass on which different molecules of protein have been affixed at separate locations in an ordered manner Protein phosphatase, enzyme that removes phosphate groups that have been attached to amino acid residues of proteins Protein purification, series of processes intended to isolate a single type of protein from a complex mixture Protein sequencing, protein method Protein splicing, intramolecular reaction of a particular protein in which an internal protein segment is removed from a precursor protein Protein structure, unique three-dimensional shape of amino
https://en.wikipedia.org/wiki/Structured%20analysis
In software engineering, structured analysis (SA) and structured design (SD) are methods for analyzing business requirements and developing specifications for converting practices into computer programs, hardware configurations, and related manual procedures. Structured analysis and design techniques are fundamental tools of systems analysis. They developed from classical systems analysis of the 1960s and 1970s. Objectives of structured analysis Structured analysis became popular in the 1980s and is still in use today. Structured analysis consists of interpreting the system concept (or real world situations) into data and control terminology represented by data flow diagrams. The flow of data and control from bubble to the data store to bubble can be difficult to track and the number of bubbles can increase. One approach is to first define events from the outside world that require the system to react, then assign a bubble to that event. Bubbles that need to interact are then connected until the system is defined. Bubbles are usually grouped into higher level bubbles to decrease complexity. Data dictionaries are needed to describe the data and command flows, and a process specification is needed to capture the transaction/transformation information. SA and SD are displayed with structure charts, data flow diagrams and data model diagrams, of which there were many variations, including those developed by Tom DeMarco, Ken Orr, Larry Constantine, Vaughn Frick, Ed Yourdon, Steven Ward, Peter Chen, and others. These techniques were combined in various published system development methodologies, including structured systems analysis and design method, profitable information by design (PRIDE), Nastec structured analysis & design, SDM/70 and the Spectrum structured system development methodology. History Structured analysis is part of a series of structured methods that represent a collection of analysis, design, and programming techniques that were developed in re
https://en.wikipedia.org/wiki/Weather%20buoy
Weather buoys are instruments which collect weather and ocean data within the world's oceans, as well as aid during emergency response to chemical spills, legal proceedings, and engineering design. Moored buoys have been in use since 1951, while drifting buoys have been used since 1979. Moored buoys are connected with the ocean bottom using either chains, nylon, or buoyant polypropylene. With the decline of the weather ship, they have taken a more primary role in measuring conditions over the open seas since the 1970s. During the 1980s and 1990s, a network of buoys in the central and eastern tropical Pacific Ocean helped study the El Niño-Southern Oscillation. Moored weather buoys range from in diameter, while drifting buoys are smaller, with diameters of . Drifting buoys are the dominant form of weather buoy in sheer number, with 1250 located worldwide. Wind data from buoys has smaller error than that from ships. There are differences in the values of sea surface temperature measurements between the two platforms as well, relating to the depth of the measurement and whether or not the water is heated by the ship which measures the quantity. History The first known proposal for surface weather observations at sea occurred in connection with aviation in August 1927, when Grover Loening stated that "weather stations along the ocean coupled with the development of the seaplane to have an equally long range, would result in regular ocean flights within ten years." Starting in 1939, United States Coast Guard vessels were being used as weather ships to protect transatlantic air commerce. During World War II, the German Navy deployed weather buoys (Wetterfunkgerät See — WFS) at fifteen fixed positions in the North Atlantic and Barents Sea. They were launched from U-boats into a maximum depth of ocean of 1000 fathoms (1,800 metres), limited by the length of the anchor cable. Overall height of the body was 10.5 metres (of which most was submerged), surmounted by a ma
https://en.wikipedia.org/wiki/Ottobock
Ottobock SE & Co. KGaA, formerly Otto Bock, is a company based in Duderstadt, Germany, that operates in the field of orthopedic technology. It is considered the world market leader in the field of prosthetics and one of the leading suppliers in orthotics, wheelchairs and exoskeletons. In 2019, the Ottobock Group as a whole generated sales of €1.002 billion with 8,367 employees worldwide. History Foundation until 1945 The company was founded on January 13, 1919 as Orthopädische Industrie GmbH in Berlin by a group surrounding a manufacturer from Krefeld, to supply prostheses and orthopedic products to the many thousands of war invalids of World War I. Otto Bock acted as production manager during this phase. He moved into the management of the company in 1924 and finally took over as sole managing director in 1927. At the end of October 1933, Bock, who had joined the NSDAP in May, had the old company liquidated and paid out the remaining shareholders. The company was renamed Orthopedic Industry Otto Bock in Königsee. In 1920, production had been relocated to Königsee in Thuringia, where up to 600 people worked at times. Since the high demand could hardly be met by handicraft methods, Otto Bock began to mass-produce prosthetic parts, thus laying the foundation for the orthopedic industry. New materials were used in production, so that aluminum parts were already being used in prosthetics in the 1930s. After graduating from high school in 1935, Max Näder began training as an orthopedic mechanic and industrial clerk at Ottobock. In 1943, during wartime leave, he married Marie Bock, the younger sister of the entrepreneur. During World War II, the company employed forced laborers. 1946-
https://en.wikipedia.org/wiki/Drug-induced%20lupus%20erythematosus
Drug-induced lupus erythematosus is an autoimmune disorder caused by chronic use of certain drugs. These drugs cause an autoimmune response (the body attacks its own cells) producing symptoms similar to those of systemic lupus erythematosus (SLE). There are at least 38 medications known to cause DIL, but three account for the highest number of reported cases: hydralazine, procainamide, and quinidine. While the criteria for diagnosing DIL have not been thoroughly established, symptoms of DIL typically present as muscle pain and joint pain. Generally, the symptoms recede after discontinuing use of the drugs. Signs and symptoms Signs and symptoms of drug-induced lupus erythematosus include the following: Joint pain (arthralgia) and muscle pain (myalgia) Fatigue Serositis—inflammation of the tissues lining the heart and lungs. Anti-histone antibodies in 95% of cases among those taking procainamide, hydralazine, chlorpromazine, and quinidine; however, these antibodies have been found in a smaller proportion of patients associated with other medications, including minocycline, propylthiouracil, and statins. These signs and symptoms are not short-term side effects of the drugs; DIL develops with long-term, chronic use of the medications listed below. While these symptoms are similar to those of systemic lupus erythematosus, they are generally not as severe unless they are ignored, which can lead to more severe symptoms and, in some reported cases, death. Causes The processes that lead to drug-induced lupus erythematosus are not entirely understood. The exact processes that occur are not known even after 50 years since its discovery, but many studies present theories on the mechanisms of DIL. A predisposing factor to developing DIL is N-acetylation speed, or the rate at which the body can metabolize the drug. This is greatly decreased in patients with a genetic deficiency of the enzyme N-acetyltransferase. A study showed that 29 of 30 patients wit
https://en.wikipedia.org/wiki/Emerin
Emerin is a protein that in humans is encoded by the EMD gene, also known as the STA gene. Emerin, together with LEMD3, is a LEM domain-containing integral protein of the inner nuclear membrane in vertebrates. Emerin is highly expressed in cardiac and skeletal muscle. In cardiac muscle, emerin localizes to adherens junctions within intercalated discs where it appears to function in mechanotransduction of cellular strain and in beta-catenin signaling. Mutations in emerin cause X-linked recessive Emery–Dreifuss muscular dystrophy, cardiac conduction abnormalities and dilated cardiomyopathy. It is named after Alan Emery. Structure Emerin is a 29.0 kDa (34 kDa observed MW) protein composed of 254 amino acids. Emerin is a serine-rich protein with an N-terminal 20-amino acid hydrophobic region that is flanked by charged residues; the hydrophobic region may be important for anchoring the protein to the membrane, with the charged terminal tails being cytosolic. In cardiac, skeletal, and smooth muscle, emerin localizes to the inner nuclear membrane; expression of emerin is highest in skeletal and cardiac muscle. In cardiac muscle specifically, emerin also resides at adherens junctions within intercalated discs. Function Emerin is a serine-rich nuclear membrane protein and a member of the nuclear lamina-associated protein family. It mediates membrane anchorage to the cytoskeleton. Emery–Dreifuss muscular dystrophy is an X-linked inherited degenerative myopathy resulting from mutation in the EMD (also known clinically as STA) gene. Emerin appears to be involved in mechanotransduction, as emerin-deficient mouse fibroblasts failed to transduce normal mechanosensitive gene expression responses to strain stimuli. In cardiac muscle, emerin is also found complexed to beta-catenin at adherens junctions of intercalated discs, and cardiomyocytes from hearts lacking emerin showed beta-catenin redistribution as well as perturbed intercalated disc architecture and myocyte shape. This
https://en.wikipedia.org/wiki/Source%20routing
In computer networking, source routing, also called path addressing, allows a sender of a packet to partially or completely specify the route the packet takes through the network. In contrast, in conventional routing, routers in the network determine the path incrementally based on the packet's destination. Another routing alternative, label switching, is used in connection-oriented networks such as X.25, Frame Relay, Asynchronous Transfer Mode and Multiprotocol Label Switching. Source routing allows easier troubleshooting, improved traceroute, and enables a node to discover all the possible routes to a host. It also allows a source to directly manage network performance by forcing packets to travel over one path to prevent congestion on another. Many high-performance interconnects including Myrinet, Quadrics, IEEE 1355, and SpaceWire support source routing. Internet Protocol In the Internet Protocol, two header options are available which are rarely used: "strict source and record route" (SSRR) and "loose source and record route" (LSRR). Because of security concerns, packets marked LSRR are frequently blocked on the Internet. If not blocked, LSRR can allow an attacker to spoof an address but still successfully receive response packets by forcing return traffic for spoofed packets to return through the attacker's device. In IPv6, two forms of source routing have been developed. The first approach was the Type 0 Routing header. This routing header was designed to support the same use cases as the IPv4 header options. Unfortunately, there were several significant attacks against this routing header, and its use was deprecated. A more secure form of source routing is being developed within the IETF to support the IPv6 version of Segment Routing. Software-defined networking Software-defined networking can also be enhanced when source routing is used in the forwarding plane. Studies have shown significant improvements in convergence times as a result of th
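The forwarding logic behind strict source routing can be sketched in a few lines: the sender writes the full hop list into the packet, and each node simply consumes the next hop instead of consulting a routing table. This is an illustrative model only, not the IPv4 SSRR/LSRR wire format; the field names ("route", "payload") are hypothetical.

```python
# Illustrative model of strict source routing: the packet carries its own
# route, and each hop pops the next entry rather than doing a table lookup.
# Field names are hypothetical, not a real wire format.

def forward(packet):
    """Return the next hop for a source-routed packet, consuming it."""
    if not packet["route"]:
        return None          # route exhausted: the packet has arrived
    return packet["route"].pop(0)

packet = {"route": ["r1", "r2", "host"], "payload": b"hello"}
hops = []
while (nxt := forward(packet)) is not None:
    hops.append(nxt)

print(hops)  # ['r1', 'r2', 'host']
```

Contrast with conventional routing, where each router would look up `packet["dst"]` in its own table; here the path is fixed entirely by the sender.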
https://en.wikipedia.org/wiki/Frank%E2%80%93Tamm%20formula
The Frank–Tamm formula yields the amount of Cherenkov radiation emitted at a given frequency as a charged particle moves through a medium at superluminal velocity. It is named for Russian physicists Ilya Frank and Igor Tamm, who developed the theory of the Cherenkov effect in 1937 and were awarded the Nobel Prize in Physics in 1958. When a charged particle moves faster than the phase speed of light in a medium, electrons interacting with the particle can emit coherent photons while conserving energy and momentum. This process can be viewed as a decay. See Cherenkov radiation and nonradiation condition for an explanation of this effect. Equation The energy $E$ emitted per unit length travelled by the particle per unit of frequency is: $\frac{\partial^2 E}{\partial x\,\partial\omega} = \frac{q^2}{4\pi}\,\mu(\omega)\,\omega\left(1 - \frac{c^2}{v^2 n^2(\omega)}\right)$, provided that $\beta = \frac{v}{c} > \frac{1}{n(\omega)}$. Here $\mu(\omega)$ and $n(\omega)$ are the frequency-dependent permeability and index of refraction of the medium respectively, $q$ is the electric charge of the particle, $v$ is the speed of the particle, and $c$ is the speed of light in vacuum. Cherenkov radiation does not have characteristic spectral peaks, as typical for fluorescence or emission spectra. The relative intensity of one frequency is approximately proportional to the frequency. That is, higher frequencies (shorter wavelengths) are more intense in Cherenkov radiation. This is why visible Cherenkov radiation is observed to be brilliant blue. In fact, most Cherenkov radiation is in the ultraviolet spectrum; the sensitivity of the human eye peaks at green, and is very low in the violet portion of the spectrum. The total amount of energy radiated per unit length is: $\frac{\mathrm{d}E}{\mathrm{d}x} = \frac{q^2}{4\pi}\int_{v > c/n(\omega)} \mu(\omega)\,\omega\left(1 - \frac{c^2}{v^2 n^2(\omega)}\right)\mathrm{d}\omega$. This integral is done over the frequencies $\omega$ for which the particle's speed $v$ is greater than the speed of light in the medium, $c/n(\omega)$. The integral is convergent (finite) because at high frequencies the refractive index becomes less than unity and for extremely high frequencies it becomes unity. Derivation of Frank–Tamm formula Consider a charged particle moving relativistically along the $x$-axis in a medium with refraction index
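As a numerical illustration, the Frank–Tamm spectral density d²E/(dx dω) = (q²/4π)·μ(ω)·ω·(1 − c²/(v²n²(ω))) can be evaluated directly. The sketch below assumes a nondispersive medium with μ(ω) ≈ μ₀ and uses water (n ≈ 1.33) with an electron near light speed; these parameter choices are illustrative, not from the article.

```python
import math

# Numerical sketch of the Frank–Tamm spectral density
#   d^2E/(dx dω) = (q^2 / 4π) · μ(ω) · ω · (1 − c^2 / (v^2 n(ω)^2)),
# assuming a nondispersive medium with μ(ω) ≈ μ0 (SI units throughout).

MU0 = 4e-7 * math.pi        # vacuum permeability, H/m
C = 299_792_458.0           # speed of light in vacuum, m/s
Q_E = 1.602_176_634e-19     # elementary charge, C

def frank_tamm(omega, beta, n, q=Q_E, mu=MU0):
    """Energy per unit path length per unit angular frequency (J·s/m).

    Returns 0 below the Cherenkov threshold beta * n <= 1.
    """
    if beta * n <= 1.0:
        return 0.0
    return (q**2 / (4 * math.pi)) * mu * omega * (1 - 1 / (beta**2 * n**2))

# Electron near light speed in water (n ≈ 1.33) at λ = 400 nm (blue light).
omega = 2 * math.pi * C / 400e-9
d2E = frank_tamm(omega, beta=0.999, n=1.33)
print(f"{d2E:.3e} J·s/m")   # positive: above threshold, radiation is emitted
```

The linear dependence on ω is visible in the code: doubling the frequency doubles the spectral density, which is why the visible part of the spectrum is dominated by blue.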
https://en.wikipedia.org/wiki/Comparison%20of%20programming%20languages%20%28strings%29
This comparison of programming languages (strings) compares the features of string data structures or text-string processing for more than 52 computer programming languages. Concatenation Different languages use different symbols for the concatenation operator. Many languages use the "+" symbol, though several deviate from this. Common variants Unique variants AWK uses the empty string: two expressions adjacent to each other are concatenated. This is called juxtaposition. Unix shells have a similar syntax. Rexx uses this syntax for concatenation, including an intervening space. C (along with Python) allows juxtaposition for string literals; however, for strings stored as character arrays, the strcat function must be used. COBOL uses the STRING statement to concatenate string variables. MATLAB and Octave use the syntax "[x y]" to concatenate x and y. Visual Basic and Visual Basic .NET can also use the "+" sign, but at the risk of ambiguity if a string representing a number and a number are together. Microsoft Excel allows both "&" and the function "=CONCATENATE(X,Y)". Rust has the concat! macro and the format! macro, of which the latter is the more prevalent throughout the documentation and examples. String literals This section compares styles for declaring a string literal. Quoted interpolated An expression is "interpolated" into a string when the compiler/interpreter evaluates it and inserts the result in its place. Escaped quotes "Escaped" quotes means that a 'flag' symbol is used to warn that the character after the flag is part of the string rather than ending the string. Dual quoting "Dual quoting" means that a quote is written twice wherever it occurs within a string; one of the pair is discarded, and a single quote then appears within the string. Quoted raw "Raw" means the compiler treats every character within the literal exactly as written, without processing any escapes or interpolations. Multiline string Many languages have a syntax
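The two concatenation styles mentioned for Python above, operator concatenation with "+" and juxtaposition of string literals, can be seen directly:

```python
# Python supports both "+" concatenation and juxtaposition of *literals*:
a = "foo" + "bar"          # operator concatenation, works on any strings
b = "foo" "bar"            # adjacent literals are merged at compile time
assert a == b == "foobar"

# Juxtaposition applies only to literals, not to variables:
x, y = "foo", "bar"
joined = x + y             # variables need "+" (or "".join, or f-strings)
print(joined)              # foobar
```

The same juxtaposition rule holds for C string literals (`"foo" "bar"` is one literal), while runtime concatenation in C must go through `strcat` on character arrays, as noted above.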
https://en.wikipedia.org/wiki/Vladimir%20Abramovich%20Rokhlin
Vladimir Abramovich Rokhlin (Russian: Влади́мир Абра́мович Ро́хлин) (23 August 1919 – 3 December 1984) was a Soviet mathematician, who made numerous contributions in algebraic topology, geometry, measure theory, probability theory, ergodic theory and entropy theory. Life Vladimir Abramovich Rokhlin was born in Baku, Azerbaijan, to a wealthy Jewish family. His mother, Henrietta Emmanuilovna Levenson, had studied medicine in France (she died in Baku in 1923, believed to have been killed during civil unrest provoked by an epidemic). His maternal grandmother, Clara Levenson, had been one of the first female doctors in Russia. His maternal grandfather Emmanuil Levenson was a wealthy businessman (he was also the illegitimate father of Korney Chukovsky, who was thus Henrietta's half-brother). Vladimir Rokhlin's father Abram Veniaminovich Rokhlin was a well-known social democrat (he was imprisoned during Stalin's Great Purge, and executed in 1941). Vladimir Rokhlin entered Moscow State University in 1935. His advisor was Abraham Plessner. He volunteered for the army in 1941, leading to four years as a prisoner of a German war camp. During this time he was able to hide his Jewish origins from the Nazis. Rokhlin was liberated by the Soviet military in January 1945. He then served as a German language translator for the 5th Army of the Belorussian front. In May 1945 he was sent to a Soviet 'verification camp' for former prisoners of war. In January 1946 he was transferred to another camp to determine if he was an "enemy of the Soviet." Rokhlin was cleared in June 1946 but was forced to remain in the camp as a guard. Due to intercession by mathematicians Andrey Kolmogorov and Lev Pontryagin, he was released in December 1946 and allowed to return to Moscow, after which he returned to mathematics. In 1959 Rokhlin joined Leningrad State University as a faculty member. He died in 1984 in Leningrad. His students include Viatcheslav Kharlamov, Yakov Eliashberg, Mikhail Gromov, N
https://en.wikipedia.org/wiki/SK%20channel
SK channels (small conductance calcium-activated potassium channels) are a subfamily of calcium-activated potassium channels. They are so called because of their small single channel conductance in the order of 10 pS. SK channels are a type of ion channel allowing potassium cations to cross the cell membrane and are activated (opened) by an increase in the concentration of intracellular calcium through N-type calcium channels. Their activation limits the firing frequency of action potentials and is important for regulating afterhyperpolarization in the neurons of the central nervous system as well as many other types of electrically excitable cells. This is accomplished through the hyperpolarizing leak of positively charged potassium ions along their concentration gradient into the extracellular space. This hyperpolarization causes the membrane potential to become more negative. SK channels are thought to be involved in synaptic plasticity and therefore play important roles in learning and memory. Function SK channels are expressed throughout the central nervous system. They are highly conserved in mammals as well as in other organisms such as Drosophila melanogaster and Caenorhabditis elegans. SK channels are specifically involved in the medium afterhyperpolarizing potential (mAHP). They affect both the intrinsic excitability of neurons and synaptic transmission. They are also involved in calcium signaling. SK channel activation can mediate neuroprotection in various models of cell death. SK channels control action potential discharge frequency in hippocampal neurons, midbrain dopaminergic neurons, dorsal vagal neurons, sympathetic neurons, nucleus reticularis thalamic neurons, inferior olive neurons, spinal and hypoglossal motoneurons, mitral cells in the olfactory bulb, and cortical neurons. Structure SK potassium channels share the same basic architecture with Shaker-like voltage-gated potassium channels. Four subunits associate to form a tetramer. Each
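The hyperpolarizing effect of a potassium leak described above can be made quantitative with the Nernst equation, which gives the equilibrium potential toward which an open K⁺ channel pulls the membrane. The concentrations below are typical textbook values for a mammalian neuron, an illustrative assumption rather than figures from the article.

```python
import math

# Nernst equilibrium potential for K+: E = (R*T / (z*F)) * ln([K]out / [K]in).
# Concentrations are typical textbook values for a mammalian neuron
# (illustrative assumptions, not taken from the article).

R = 8.314          # gas constant, J/(mol·K)
F = 96485.0        # Faraday constant, C/mol
T = 310.0          # body temperature, K
z = 1              # valence of K+

k_out, k_in = 5.0, 140.0   # extracellular / intracellular [K+], mM

e_k = (R * T) / (z * F) * math.log(k_out / k_in)
print(f"E_K ≈ {e_k * 1000:.0f} mV")   # strongly negative: an open K+ channel
                                      # drags the membrane toward this value,
                                      # i.e. it hyperpolarizes the cell
```

Because E_K (about −90 mV) is far below the typical resting potential, opening SK channels after a calcium influx produces exactly the afterhyperpolarization the article describes.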
https://en.wikipedia.org/wiki/DNA%20Data%20Bank%20of%20Japan
The DNA Data Bank of Japan (DDBJ) is a biological database that collects DNA sequences. It is located at the National Institute of Genetics (NIG) in the Shizuoka prefecture of Japan. It is also a member of the International Nucleotide Sequence Database Collaboration, or INSDC. It exchanges its data with the European Molecular Biology Laboratory at the European Bioinformatics Institute and with GenBank at the National Center for Biotechnology Information on a daily basis. Thus these three databanks contain the same data at any given time. History DDBJ began data bank activities in 1987 at NIG and remains the only nucleotide sequence data bank in Asia. Organisation Although DDBJ mainly receives its data from Japanese researchers, it can accept data from contributors from any other country. DDBJ is primarily funded by the Japanese Ministry of Education, Culture, Sports, Science and Technology (MEXT). DDBJ has an international advisory committee which consists of nine members, three each from Europe, the US, and Japan. This committee advises DDBJ about its maintenance, management and future plans once a year. Apart from this, DDBJ also has an international collaborative committee which advises on various technical issues related to international collaboration and consists of working-level participants. See also National Center for Biotechnology Information (NCBI) European Bioinformatics Institute (EBI)
https://en.wikipedia.org/wiki/Mobile-device%20testing
Mobile-device testing functions to assure the quality of mobile devices, like mobile phones, PDAs, etc. It is conducted on both hardware and software, and from the view of different procedures, the testing comprises R&D testing, factory testing and certificate testing. It involves a set of activities from monitoring and troubleshooting mobile applications, content and services on real handsets. It includes verification and validation of hardware devices and software applications. Tests must be conducted with multiple operating system versions, hardware configurations, device types, network capabilities, and, notably with the Android operating system, with various hardware vendor interface layers. Automation key features Add application/product space. Create test builds for application/product. Associate test builds with application/product space. Add your own remote devices, by getting a small service app installed on them. Record test cases/scripts/data on a reference device/emulator. Associate test cases/scripts/data with application/product space. Maintain test cases/scripts/data for each application/product. Select devices/emulators to run your test scripts. Get test results e-mailed to you (after completing the entire run, after a fixed number of steps, or after every X units of time) – PDF format supported currently. Listed companies like Keynote Systems and Capgemini Consulting, the mobile applications and handset testing company Intertek, and QA companies like PASS Technologies AG and Testdroid provide mobile testing, helping application stores, developers and mobile device manufacturers in testing and monitoring of mobile content, applications and services. Static code analysis Static code analysis is the analysis of computer software that is performed without actually executing programs built from that software (analysis performed on executing programs is known as dynamic analysis). Static analysis rules are available for code written to target vario
https://en.wikipedia.org/wiki/Quantum%20tic-tac-toe
Quantum tic-tac-toe is a "quantum generalization" of tic-tac-toe in which the players' moves are "superpositions" of plays in the classical game. The game was invented by Allan Goff of Novatia Labs, who describes it as "a way of introducing quantum physics without mathematics", and offering "a conceptual foundation for understanding the meaning of quantum mechanics". Background The motivation to invent quantum tic-tac-toe was to explore what it means to be in two places at once. In classical physics, a single object cannot be in two places at once. In quantum physics, however, the mathematics used to describe quantum systems seems to imply that before being subjected to quantum measurement (or "observed") certain quantum particles can be in multiple places at once. (The textbook example of this is the double-slit experiment.) How the universe can be like this is rather counterintuitive. There is a disconnect between the mathematics and our mental images of reality, a disconnect that is absent in classical physics. This is why quantum mechanics supports multiple "interpretations". The researchers who invented quantum tic-tac-toe were studying abstract quantum systems, formal systems whose axiomatic foundation included only a few of the axioms of quantum mechanics. Quantum tic-tac-toe became the most thoroughly studied abstract quantum system and offered insights that spawned new research. It also turned out to be a fun and engaging game, a game which also provides good pedagogy in the classroom. The rules of quantum tic-tac-toe attempt to capture three phenomena of quantum systems: superposition: the ability of quantum objects to be in two places at once; entanglement: the phenomenon where distant parts of a quantum system display correlations that cannot be explained by either timelike causality or common cause; collapse: the phenomenon where the quantum states of a system are reduced to classical states. Collapses occur when a measurement happens, but the mathemati
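In Goff's game, each move marks a superposition of two squares, entangling them; when a chain of such moves closes into a cycle, a measurement is forced and the marks collapse to classical ones. Treating each move as an edge between two squares, the collapse trigger is plain cycle detection, sketched here with union-find. This is only the entanglement bookkeeping, not a full implementation of the game.

```python
# Each quantum move places a mark in a superposition of two squares, which
# we model as an edge between the squares. A cycle in this graph forces a
# collapse (measurement). Union-find detects when an edge closes a cycle.

parent = {}

def find(x):
    parent.setdefault(x, x)
    while parent[x] != x:
        parent[x] = parent[parent[x]]   # path halving
        x = parent[x]
    return x

def quantum_move(sq_a, sq_b):
    """Record a move in squares (sq_a, sq_b); return True if it closes
    a cycle, i.e. forces a collapse to classical marks."""
    ra, rb = find(sq_a), find(sq_b)
    if ra == rb:
        return True        # squares already connected: cycle -> collapse
    parent[ra] = rb
    return False

print(quantum_move(1, 2))  # False
print(quantum_move(2, 3))  # False
print(quantum_move(3, 1))  # True: moves 1-2-3 form a cycle, collapse occurs
```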
https://en.wikipedia.org/wiki/Suppressor%20mutation
A suppressor mutation is a second mutation that alleviates or reverts the phenotypic effects of an already existing mutation, in a process termed synthetic rescue. Genetic suppression therefore restores the phenotype seen prior to the original background mutation. Suppressor mutations are useful for identifying new genetic sites which affect a biological process of interest. They also provide evidence of functional interactions between molecules and of intersecting biological pathways. Intragenic vs. intergenic suppression Intragenic suppression Intragenic suppression results from suppressor mutations that occur in the same gene as the original mutation. In a classic study, Francis Crick and colleagues used intragenic suppression to study the fundamental nature of the genetic code. This study showed that genes are expressed as non-overlapping triplets (codons). The researchers showed that mutations caused by either a single base insertion (+) or a single base deletion (-) could be "suppressed", or restored, by a second mutation of the opposite sign, as long as the two mutations occurred in the same vicinity of the gene. This led to the conclusion that genes need to be read in a specific "reading frame": a single base insertion or deletion shifts the reading frame (a frameshift mutation) in such a way that the remaining DNA codes for a different polypeptide than the one intended. Therefore, the researchers concluded that a second mutation of opposite sign suppresses the original mutation by restoring the reading frame, as long as the portion between the two mutations is not critical for protein function. In addition to the reading frame, Crick also used suppressor mutations to determine codon size. It was found that while one- and two-base insertions/deletions of the same sign resulted in a mutant phenotype, deleting or inserting three bases could give a wild-type phenotype. From these results it was concluded that an inserted or deleted triplet
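Crick's frameshift argument is easy to demonstrate computationally: a single-base insertion scrambles every downstream codon, while a compensating deletion nearby restores the original reading frame for the rest of the sequence. The sequence below is an arbitrary illustration, not a real gene.

```python
# Demonstration of Crick's frameshift/suppressor logic on an arbitrary
# sequence: a +1 insertion shifts every downstream codon, and a nearby
# -1 deletion restores the reading frame for the remainder.

def codons(seq):
    """Split a sequence into non-overlapping triplets (dropping leftovers)."""
    return [seq[i:i + 3] for i in range(0, len(seq) - len(seq) % 3, 3)]

wild_type = "ATGGCTGAAACCGGTTAA"

plus_one = wild_type[:4] + "C" + wild_type[4:]    # +1 insertion after base 4
suppressed = plus_one[:8] + plus_one[9:]          # -1 deletion at base 9

print(codons(wild_type))   # ['ATG', 'GCT', 'GAA', 'ACC', 'GGT', 'TAA']
print(codons(plus_one))    # every codon after the insertion is shifted
print(codons(suppressed))  # codons beyond the second mutation match wild type
```

Only the short stretch between the two mutations still differs in `suppressed`, which is exactly why suppression works so long as that region is not critical for protein function.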
https://en.wikipedia.org/wiki/Packaging%20engineering
Packaging engineering, also package engineering, packaging technology and packaging science, is a broad topic ranging from design conceptualization to product placement. All steps along the manufacturing process, and more, must be taken into account in the design of the package for any given product. Package engineering is an interdisciplinary field integrating science, engineering, technology and management to protect and identify products for distribution, storage, sale, and use. It encompasses the process of design, evaluation, and production of packages. It is a system integral to the value chain that impacts product quality, user satisfaction, distribution efficiencies, and safety. Package engineering includes industry-specific aspects of industrial engineering, marketing, materials science, industrial design and logistics. Packaging engineers must interact with research and development, manufacturing, marketing, graphic design, regulatory, purchasing, planning and so on. The package must sell and protect the product, while maintaining an efficient, cost-effective process cycle. Engineers develop packages from a wide variety of rigid and flexible materials. Some materials have scores or creases to allow controlled folding into package shapes (sometimes resembling origami). Packaging involves extrusion, thermoforming, molding and other processing technologies. Packages are often developed for high speed fabrication, filling, processing, and shipment. Packaging engineers use principles of structural analysis and thermal analysis in their evaluations. Education Some packaging engineers have backgrounds in other science, engineering, or design disciplines while some have college degrees specializing in this field. Formal packaging programs might be listed as package engineering, packaging science, packaging technology, etc. BE, BS, MS, M.Tech and PhD programs are available. Students in a packaging program typically begin with generalized science, bu
https://en.wikipedia.org/wiki/European%20Assisted%20Conception%20Consortium
The European Assisted Conception Consortium (EACC) is an organization whose aim is to bring together national ART regulators and practitioners within the European Union for professional cooperation and joint action. Its inaugural meeting was in Copenhagen in 2005, under the chairmanship of Angela McNab. See also European Society of Human Reproduction and Embryology
https://en.wikipedia.org/wiki/Workflow%20engine
A workflow engine is a software application that manages business processes. It is a key component in workflow technology and typically makes use of a database server. A workflow engine manages and monitors the state of activities in a workflow, such as the processing and approval of a loan application form, and determines which new activity to transition to according to defined processes (workflows). The actions may be anything from saving an application form in a document management system to sending a reminder e-mail to users or escalating overdue items to management. A workflow engine facilitates the flow of information, tasks, and events. Workflow engines may also be referred to as Workflow Orchestration Engines. Workflow engines mainly have three functions: Verification of the current process status: Check whether it is valid executing a task, given current status. Determine the authority of users: Check if the current user is permitted to execute the task. Executing condition script: After passing the previous two steps, the workflow engine executes the task, and if the execution successfully completes, it returns the success, if not, it reports the error to trigger and roll back the change. A workflow engine is a core technique for task allocation software, such as business process management, in which the workflow engine allocates tasks to different executors while communicating data among participants. A workflow engine can execute any arbitrary sequence of steps, for example, a healthcare data analysis. See also Business rules engine Business rule management system Comparison of BPEL engines Inference engine Java Rules Engine API Rete algorithm Ripple down rules Semantic reasoner Business Process Execution Language Production system Workflow management system Joget Workflow Conductor (software)
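The three functions of a workflow engine (state verification, authority check, guarded execution with rollback) can be sketched as a tiny engine. The loan-approval transition table, roles, and names below are invented for illustration, not taken from any particular product.

```python
# Minimal sketch of the three workflow-engine functions: 1) verify the task
# is valid in the current state, 2) check the user's authority, 3) execute
# and roll back on failure. The loan-approval transition table and the
# roles are invented for illustration.

TRANSITIONS = {("submitted", "review"): "reviewed",
               ("reviewed", "approve"): "approved"}
AUTHORITY = {"review": {"officer", "manager"}, "approve": {"manager"}}

class WorkflowEngine:
    def __init__(self, state="submitted"):
        self.state = state

    def execute(self, task, user_role, action=lambda: None):
        if (self.state, task) not in TRANSITIONS:          # 1) state check
            raise ValueError(f"{task!r} is invalid in state {self.state!r}")
        if user_role not in AUTHORITY.get(task, set()):    # 2) authority
            raise PermissionError(f"{user_role!r} may not {task!r}")
        old_state = self.state
        self.state = TRANSITIONS[(self.state, task)]
        try:
            action()                                       # 3) the side effect
        except Exception:
            self.state = old_state                         # roll back
            raise
        return self.state

engine = WorkflowEngine()
print(engine.execute("review", "officer"))    # reviewed
print(engine.execute("approve", "manager"))   # approved
```

Real engines add persistence (the database server mentioned above), timers for escalation, and parallel branches, but the state-check/authority/execute-or-rollback core is the same.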
https://en.wikipedia.org/wiki/Exclamation%20mark
The exclamation mark (also known as exclamation point in American English) is a punctuation mark usually used after an interjection or exclamation to indicate strong feelings or to show emphasis. The exclamation mark often marks the end of a sentence, for example: "Watch out!". Similarly, a bare exclamation mark (with nothing before or after) is often used in warning signs. The exclamation mark is often used in writing to make a character seem as though they are shouting, excited, or surprised. Other uses include: In mathematics, it denotes the factorial operation. Several computer languages use ! at the beginning of an expression to denote logical negation. For example, !A means "the logical negation of A", also called "not A". This usage has spread to ordinary language (e.g., "!clue" means no-clue or clueless). Some languages use ǃ, a symbol that looks like an exclamation mark, to denote a click consonant. History Graphically, the exclamation mark is represented by variations on the theme of a full stop point with a vertical line above. One theory of its origin posits derivation from a Latin exclamation of joy, namely io, analogous to "hooray"; copyists wrote the Latin word io at the end of a sentence, to indicate expression of joy. Over time, the i moved above the o; that o first became smaller, and (with time) a dot. Its evolution as a punctuation symbol after the Ancient Era can be traced back to the Middle Ages, when scribes would often add various marks and symbols to manuscripts to indicate changes in tone, pauses, or emphasis. These symbols included the punctus admirativus, a symbol that was similar in shape to the modern exclamation mark and was used to indicate admiration, surprise, or other strong emotions. The modern use of the exclamation mark was supposedly first described in the 14th century by Italian scholar Alpoleio da Urbisaglia. Literary scholar Florence Hazrat said he "felt very annoyed" that people were reading script with a flat tone, even if
https://en.wikipedia.org/wiki/Acoustic%20transmission
Acoustic transmission is the transmission of sounds through and between materials, including air, walls, and musical instruments. The degree to which sound is transferred between two materials depends on how well their acoustical impedances match. In musical instrument design Musical instruments are generally designed to radiate sound effectively. A high-impedance part of the instrument, such as a string, transmits vibrations through a bridge (intermediate impedance) to a soundboard (lower impedance). The soundboard then moves the still lower-impedance air. Without bridge and soundboard, the instrument does not transmit enough sound to the air, and is too quiet to perform with. An electric guitar has no soundboard; it uses a magnetic pickup and electronic amplification. Without amplification, electric guitars are very quiet. Stethoscope Stethoscopes roughly match the acoustical impedance of the human body, so they transmit sounds from a patient's chest to the doctor's ear much more effectively than the air does. Putting an ear to someone's chest would have a similar effect. Building acoustics Acoustic transmission in building design refers to a number of processes by which sound can be transferred from one part of a building to another. Typically these are: Airborne transmission - a noise source in one room sends air pressure waves which induce vibration in one side of a wall or element of structure, setting it moving such that the other face of the wall vibrates in an adjacent room. Structural isolation therefore becomes an important consideration in the acoustic design of buildings. Highly sensitive areas of buildings, for example recording studios, may be almost entirely isolated from the rest of a structure by constructing the studios as effective boxes supported by springs. Air tightness also becomes an important control technique. A tightly sealed door might have reasonable sound reduction properties, but if it is left open only a few millimeters i
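How strongly impedance mismatch suppresses transmission can be quantified with the standard plane-wave, normal-incidence formula (a textbook result, not taken from this article; the impedance values below are approximate figures for air and water, with soft tissue close to water):

```python
def intensity_transmission(z1, z2):
    """Fraction of acoustic intensity transmitted across a boundary
    between media of specific acoustic impedance z1 and z2
    (plane wave, normal incidence): T = 4·z1·z2 / (z1 + z2)²."""
    return 4 * z1 * z2 / (z1 + z2) ** 2

Z_AIR = 415        # rayl (kg·m⁻²·s⁻¹), approximate
Z_WATER = 1.48e6   # rayl, approximate; soft tissue is similar

# Air → tissue: impedances differ by a factor of ~3500,
# so almost all the sound energy is reflected.
print(f"{intensity_transmission(Z_AIR, Z_WATER):.4%}")  # ≈ 0.11%

# Perfectly matched impedances transmit everything.
print(intensity_transmission(1.0, 1.0))  # 1.0
```

This is why a stethoscope, whose impedance is closer to that of the body than air's, picks up chest sounds so much better than a listener standing nearby.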
https://en.wikipedia.org/wiki/Senicide
Senicide, also known as geronticide, is the practice of killing the elderly. This killing of the elderly can be carried out by both active and passive methods, either as senio-euthanasia or as altruistic self-sacrifice. The aim of active senio-euthanasia is to relieve the clan, family, or society from the burden of a "useless eater". An old person might instead kill himself (autothanasia) altruistically; in the case of altruistic self-sacrifice, the aim is to fulfill an old tradition or to stop being a burden to the clan. Both are understood as a sacrificial death. Senicide is found in various cultures all over the world and has been practiced during different time periods. The methods of senicide are rooted in the traditions and customs of a given society. Terminology The word senicide "is less well known, though of older provenance" than geronticide. It is "so rare a word that Microsoft Word's spellcheck underlines it in red, itching to autocorrect it to suicide", as the British historian Niall Ferguson put it. In an article for The Fortnightly Review, the African explorer Harry Johnston first used the term "senicide" in 1889. He reported that in ancient Sardinia, the Sardi considered it a sacred duty to kill their elderly relatives with a club or by forcing them to jump from a high cliff. Various authors use the terms "gerontocide" and "geronticide" interchangeably; Maxwell might have used "geronticide" for the first time in 1983. Today both terms are in common usage: "senicide" refers to the cultural and ritual killing of the aged, and "geronticide" to the murder or manslaughter of any senior person. Senicide in ethnography and history Since there is little evidence of these killings, such as court records or very rare eyewitness accounts, it has been suggested that most of these reports are chilling myths about the cruel practices of foreign peoples or past times. In a review of sources on native North America, Schulte criticized the quality of the data, the role
https://en.wikipedia.org/wiki/Diffuse%20element%20method
In numerical analysis the diffuse element method (DEM), or simply diffuse approximation, is a meshfree method. The diffuse element method was developed by B. Nayroles, G. Touzot and Pierre Villon at the Université de Technologie de Compiègne in 1992. It is in concept rather similar to the much older smoothed particle hydrodynamics. In their paper they describe a "diffuse approximation method", a method for function approximation from a given set of points. In fact the method reduces to the well-known moving least squares in the particular case of a global approximation (using all available data points). Using this function approximation method, partial differential equations and thus fluid dynamic problems can be solved. For this, they coined the term diffuse element method (DEM). Advantages over finite element methods are that DEM does not rely on a grid, and is more precise in the evaluation of the derivatives of the reconstructed functions. See also Computational fluid dynamics
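The moving least squares idea underlying the diffuse approximation can be sketched in one dimension: at each evaluation point, a local polynomial is fitted to the scattered data, with each sample weighted by its distance to that point. The sketch below is illustrative only (the function names, the linear basis, and the Gaussian weight with its radius are choices made here, not prescriptions from the Nayroles–Touzot–Villon paper):

```python
import math

def mls_fit(x, xs, fs, radius=0.35):
    """Moving-least-squares value at x for scattered data (xs, fs).

    Fits a local linear polynomial a0 + a1·t by minimizing the
    weighted squared error, with Gaussian weights centred at x,
    then evaluates the fit at x itself.
    """
    # Weighted normal equations for the basis p(t) = [1, t]:
    #   (PᵀWP) a = PᵀW f, a 2×2 system solved here by hand.
    m00 = m01 = m11 = b0 = b1 = 0.0
    for xi, fi in zip(xs, fs):
        w = math.exp(-((xi - x) / radius) ** 2)  # Gaussian weight
        m00 += w
        m01 += w * xi
        m11 += w * xi * xi
        b0 += w * fi
        b1 += w * xi * fi
    det = m00 * m11 - m01 * m01
    a0 = (m11 * b0 - m01 * b1) / det
    a1 = (m00 * b1 - m01 * b0) / det
    return a0 + a1 * x

# Scattered samples of f(t) = sin(t) on [0, 3] — no grid required.
xs = [0.0, 0.4, 0.9, 1.3, 2.0, 2.2, 2.8, 3.0]
fs = [math.sin(t) for t in xs]
print(round(mls_fit(1.5, xs, fs), 3))  # approximates sin(1.5)
```

A useful sanity check on any MLS implementation with a linear basis is that it reproduces linear data exactly, whatever the weights; smooth nonlinear data is only approximated, with accuracy controlled by the sampling density and the weight radius.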
https://en.wikipedia.org/wiki/International%20Federation%20of%20Clinical%20Chemistry%20and%20Laboratory%20Medicine
The International Federation of Clinical Chemistry and Laboratory Medicine or IFCC is a global organization that promotes the fields of clinical chemistry and laboratory medicine. It was established in 1952 as the International Association of Clinical Biochemists to organize the various national societies of these fields. The organization aims to transcend the boundaries of the field of clinical chemistry and laboratory medicine, to build professionalism of members worldwide, to disseminate information on "best practice" at various levels of technology and of economic development, to provide a forum for standardization and traceability, and to enhance the scientific level and the quality of diagnosis and therapy for patients. The IFCC membership comprises 95 national societies and is associated with 6 regional federations, 55 corporate members and 21 affiliate members, representing more than 45,000 laboratory medicine specialists worldwide. Structure and organization The IFCC carries out its objectives through its executive board, divisions, committees and working groups. Representatives from member organizations are volunteers, invited from throughout the world on the basis of their expertise. Scientific Division (SD) Its mission is to advance the science of clinical chemistry and to apply it to the practice of clinical laboratory medicine, to participate actively in the scientific programs of IFCC scientific meetings and congresses, and to respond to the scientific and technical needs of IFCC member societies, IFCC corporate members and external agencies. The Scientific Division of the IFCC instigates and promotes theoretical and practical developments in the field of standards and standardisation in clinical chemistry. Education & Management Division (EMD) Its mission is to provide IFCC members and the health-care community with education relevant to clinical chemistry and laboratory medicine. Current projects include: Visiting Lecturer Program Clinical Molecular Biology Co
https://en.wikipedia.org/wiki/Social%20cognitive%20theory
Social cognitive theory (SCT), used in psychology, education, and communication, holds that portions of an individual's knowledge acquisition can be directly related to observing others within the context of social interactions, experiences, and outside media influences. This theory was advanced by Albert Bandura as an extension of his social learning theory. The theory states that when people observe a model performing a behavior and the consequences of that behavior, they remember the sequence of events and use this information to guide subsequent behaviors. Observing a model can also prompt the viewer to engage in behavior they already learned. In other words, people do not learn new behaviors solely by trying them and either succeeding or failing; rather, the survival of humanity is dependent upon the replication of the actions of others. Depending on whether people are rewarded or punished for their behavior and the outcome of the behavior, the observer may choose to replicate the modeled behavior. Media provide models for a vast array of people in many different environmental settings. History The conceptual roots of social cognitive theory come from Edwin B. Holt and Harold Chapman Brown's 1931 book, which theorized that all animal action is based on fulfilling the psychological needs of "feeling, emotion, and desire." The most notable component of this theory is that it predicted a person cannot learn to imitate until they are imitated. In 1941, Neal E. Miller and John Dollard presented their book with a revision of Holt's social learning and imitation theory. They argued that four factors contribute to learning: drives, cues, responses, and rewards. One driver is social motivation, which includes imitativeness, the process of matching an act to an appropriate cue of where and when to perform the act. A behavior is imitated depending on whether the model receives positive or negative consequences. Miller and Dollard argued that if one were motivated
https://en.wikipedia.org/wiki/Alkaline%20tide
Alkaline tide refers to a condition, normally encountered after eating a meal, where during the production of hydrochloric acid by the parietal cells in the stomach, the parietal cells secrete bicarbonate ions across their basolateral membranes and into the blood, causing a temporary increase in blood pH. During hydrochloric acid secretion in the stomach, the gastric parietal cells extract chloride anions, carbon dioxide, water and sodium cations from the blood plasma and in turn release bicarbonate back into the plasma after forming it from carbon dioxide and water constituents. This is to maintain the plasma's electrical balance, as the chloride anions have been extracted. The bicarbonate content causes the venous blood leaving the stomach to be more alkaline than the arterial blood delivered to it. The alkaline tide is neutralised by a secretion of H+ into the blood during HCO3− secretion in the pancreas. Postprandial (i.e., after a meal) alkaline tide lasts until the acids in food absorbed in the small intestine reunite with the bicarbonate that was produced when the food was in the stomach. Thus, alkaline tide is self-limited and normally lasts less than two hours. Postprandial alkaline tide has also been shown to be a causative agent of calcium oxalate urinary stones in cats, and potentially in other species. A more pronounced alkaline tide results from vomiting, which stimulates hyperactivity of gastric parietal cells to replace lost stomach acid. Thus, protracted vomiting can result in metabolic alkalosis.
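The parietal-cell chemistry described above can be summarized by a standard textbook equation (not quoted from any specific source): carbonic anhydrase hydrates carbon dioxide, and the resulting carbonic acid dissociates into a proton and a bicarbonate ion.

```latex
\mathrm{CO_2 + H_2O}
  \;\xrightarrow{\text{carbonic anhydrase}}\;
\mathrm{H_2CO_3}
  \;\longrightarrow\;
\mathrm{H^+ + HCO_3^-}
```

The H+ is pumped into the gastric lumen, where it joins the extracted Cl− to form HCl, while the HCO3− is exchanged for Cl− across the basolateral membrane into the blood; that exported bicarbonate is what produces the alkaline tide.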