https://en.wikipedia.org/wiki/Psychomotor%20education
Psychomotor therapy is a pedagogic and therapeutic approach, the aim of which is to support and aid an individual's personal development. It is based on a holistic view of human beings that considers each individual as a unity of physical, emotional and cognitive actualities, which interact with each other and the surrounding social environment. Psychomotor specialists study the body and its expressivity. The body is regarded not merely as a mechanism with neurophysiological developments, but also as an entity with deep-rooted emotional traits, which have come about through somato-psychic experiences, particularly in early childhood. Psychomotor specialists work in the fields of prevention, education, re-education, rehabilitation, and research. Psychomotor education and therapy can be used for any age group. A school of psychomotor education, called the Aucouturier Psychomotor Practice, was developed by French pedagogists Bernard Aucouturier and Andre Lapierre.
https://en.wikipedia.org/wiki/Rule%20of%20replacement
In logic, a rule of replacement is a transformation rule that may be applied to only a particular segment of an expression. A logical system may be constructed so that it uses either axioms, rules of inference, or both as transformation rules for logical expressions in the system. Whereas a rule of inference is always applied to a whole logical expression, a rule of replacement may be applied to only a particular segment. Within the context of a logical proof, logically equivalent expressions may replace each other. Rules of replacement are used in propositional logic to manipulate propositions. Common rules of replacement include de Morgan's laws, commutation, association, distribution, double negation, transposition, material implication, logical equivalence, exportation, and tautology. Table: Rules of Replacement The rules above can be summed up in the following table. The "Tautology" column shows how to interpret the notation of a given rule. See also Salva veritate Notes
https://en.wikipedia.org/wiki/Polyhaline
Polyhaline is a salinity category term applied to brackish estuaries and other water bodies with a salinity between 18 and 30 parts per thousand. It is the most dense saltwater type that is classified as "brackish." See also Salinity Aquatic ecology
https://en.wikipedia.org/wiki/Leidy%20Award
The Leidy Award is a medal and prize presented by the Academy of Natural Sciences of Drexel University (formerly the Academy of Natural Sciences of Philadelphia), Philadelphia, Pennsylvania, USA. It was named after US palaeontologist Joseph Leidy. The award was established in 1923 to recognize excellence in "publications, explorations, discoveries or research in the natural sciences", and was intended to be presented every three years. The award consists of a rectangular bronze medal (decorated with a bust depiction of Leidy) and an honorarium which was initially $5000. Laureates 1925 – Herbert Spencer Jennings 1928 – Henry Augustus Pilsbry 1931 – William Morton Wheeler 1934 – Gerrit Smith Miller Jr. 1937 – Edwin Linton 1940 – Merritt Lyndon Fernald 1943 – Chancey Juday 1946 – Ernst Mayr 1949 – Warren Poppino Spencer 1952 – G. Evelyn Hutchinson 1955 – Herbert Friedmann 1958 – Herbert Barker Hungerford 1961 – Robert Evans Snodgrass 1964 – Carl Leavitt Hubbs 1967 – Donn Eric Rosen 1970 – Arthur Cronquist 1975 – James Bond 1979 – Edward Osborne Wilson 1983 – G. Ledyard Stebbins 1985 – Hampton Carson 1989 – Daniel H. Janzen 1994 – Peter and Rosemary Grant 2006 – David B. Wake 2009 – Dan Otte 2010 – Tim Flannery 2012 – Douglas Futuyma See also List of general science and technology awards List of biology awards List of earth sciences awards List of paleontology awards
https://en.wikipedia.org/wiki/Rod%20Downey
Rodney Graham Downey (born 20 September 1957) is a New Zealand and Australian mathematician and computer scientist, an emeritus professor in the School of Mathematics and Statistics at Victoria University of Wellington in New Zealand. He is known for his work in mathematical logic and computational complexity theory, and in particular for founding the field of parameterised complexity together with Michael Fellows. Biography Downey earned a bachelor's degree at the University of Queensland in 1978, and then went on to graduate school at Monash University, earning a doctorate in 1982 under the supervision of John Crossley. After holding teaching and visiting positions at the Chisholm Institute of Technology, Western Illinois University, the National University of Singapore, and the University of Illinois at Urbana-Champaign, he came to New Zealand in 1986 as a lecturer at Victoria University. He was promoted to reader in 1991, was given a personal chair at Victoria in 1995, and retired in 2021. Downey was president of the New Zealand Mathematical Society from 2001 to 2003. Publications Downey is the co-author of five books: Parameterized Complexity (with Michael Fellows, Springer, 1999) Algorithmic Randomness and Complexity (with D. Hirschfeldt, Springer, 2010) Fundamentals of Parameterized Complexity (with Michael Fellows, Springer, 2013) Minimal Weak Truth Table Degrees and Computably Enumerable Turing Degrees (with Keng Meng Ng and David Reed Solomon, Memoirs American Mathematical Society, Vol. 2184, 2020) A Hierarchy of Turing Degrees (with Noam Greenberg, Annals of Mathematics Studies No. 206, Princeton University Press, 2020) He is also the author or co-author of over 200 research papers, including a highly cited sequence of four papers with Michael Fellows and Karl Abrahamson setting the foundation for the study of parameterised complexity. Awards and honours In 1990, Downey won the Hamilton Research Award from the Royal Society of New Zealand. In 1992, Do
https://en.wikipedia.org/wiki/Transuranic%20waste
Transuranic waste (TRU) is defined by U.S. regulations, independent of form or origin, as waste contaminated with alpha-emitting transuranic radionuclides possessing half-lives greater than 20 years, in concentrations greater than 100 nCi/g (3.7 MBq/kg). Elements having atomic numbers greater than that of uranium are called transuranic. Elements within TRU are typically man-made and are known to include americium-241 and several isotopes of plutonium. Because of the elements' longer half-lives, TRU is disposed of more cautiously than low level waste and intermediate level waste. In the U.S. it is a byproduct of weapons production, nuclear research and power production, and consists of protective gear, tools, residue, debris and other items contaminated with small amounts of radioactive elements (mainly plutonium). Under U.S. law, TRU is further categorized into "contact-handled" (CH) and "remote-handled" (RH) on the basis of the radiation field measured on the waste container's surface. CH TRU has a surface dose rate below 2 mSv per hour (200 mrem/h), whereas RH TRU has rates of 2 mSv/h or higher. CH TRU has neither the high radioactivity of high level waste, nor its high heat generation. In contrast, RH TRU can be highly radioactive, with surface dose rates up to 10 Sv/h (1000 rem/h). The United States currently permanently disposes of TRU generated from defense nuclear activities at the Waste Isolation Pilot Plant, a deep geologic repository. Other countries do not use this category, favoring variations of High, Medium/Intermediate, and Low Level waste.
https://en.wikipedia.org/wiki/Merle%20Randall
Merle Randall (January 29, 1888 – March 17, 1950) was an American physical chemist famous for his work with Gilbert N. Lewis, over a period of 25 years, in measuring the reaction heats of chemical compounds and determining their corresponding free energies. Their 1923 textbook "Thermodynamics and the Free Energy of Chemical Substances" became a classic work in the field of chemical thermodynamics. In 1932, Merle Randall authored two scientific papers with Mikkel Frandsen: "The Standard Electrode Potential of Iron and the Activity Coefficient of Ferrous Chloride," and "Determination of the Free Energy of Ferrous Hydroxide from Measurements of Electromotive Force." Education Randall completed his Ph.D. at the Massachusetts Institute of Technology in 1912 with a dissertation on "Studies in Free Energy". Related Based on work by J. Willard Gibbs, it was known that chemical reactions proceeded to an equilibrium determined by the free energy of the substances taking part. Using this theory, Gilbert Lewis spent 25 years determining the free energies of various substances. In 1923, he and Randall published the results of this study, formalizing chemical thermodynamics. According to the Belgian thermodynamicist Ilya Prigogine, their influential 1923 textbook led to the replacement of the term "affinity" by the term "free energy" in much of the English-speaking world. See also Ionic strength
https://en.wikipedia.org/wiki/Human%20Genome%20Project
The Human Genome Project (HGP) was an international scientific research project with the goal of determining the base pairs that make up human DNA, and of identifying, mapping and sequencing all of the genes of the human genome from both a physical and a functional standpoint. It started in 1990 and was completed in 2003. It remains the world's largest collaborative biological project. Planning for the project started after it was adopted in 1984 by the US government, and it officially launched in 1990. It was declared complete on April 14, 2003, and included about 92% of the genome. The "complete genome" level was achieved in May 2021, with only 0.3% of bases remaining that were covered by potential issues. The final gapless assembly was finished in January 2022. Funding came from the United States government through the National Institutes of Health (NIH) as well as numerous other groups from around the world. A parallel project was conducted outside the government by the Celera Corporation, or Celera Genomics, which was formally launched in 1998. Most of the government-sponsored sequencing was performed in twenty universities and research centres in the United States, the United Kingdom, Japan, France, Germany, and China, working in the International Human Genome Sequencing Consortium (IHGSC). The Human Genome Project originally aimed to map the complete set of nucleotides contained in a human haploid reference genome, of which there are more than three billion. The "genome" of any given individual is unique; mapping the "human genome" involved sequencing samples collected from a small number of individuals and then assembling the sequenced fragments to get a complete sequence for each of 24 human chromosomes (22 autosomes and 2 sex chromosomes). Therefore, the finished human genome is a mosaic, not representing any one individual. Much of the project's utility comes from the fact that the vast majority of the human genome is the same in all humans. History The Human G
https://en.wikipedia.org/wiki/Extended%20Enterprise%20Modeling%20Language
Extended Enterprise Modeling Language (EEML) in software engineering is a modelling language used for Enterprise modelling across a number of layers. Overview Extended Enterprise Modeling Language (EEML) is a modelling language which combines structural modelling, business process modelling, goal modelling with goal hierarchies, and resource modelling. It was intended to bridge the gap between goal modelling and other modelling approaches. According to Johannesson and Söderström (2008) "the process logic in EEML is mainly expressed through nested structures of tasks and decision points. The sequencing of tasks is expressed by the flow relation between decision points. Each task has an input port and the output port being decision points for modeling process logic". EEML was designed as a simple language, making it easy to update models. In addition to capturing tasks and their interdependencies, models show which roles perform each task, and the tools, services and information they apply. History Extended Enterprise Modeling Language (EEML) dates from the late 1990s; it was developed in the EU project EXTERNAL as an extension of the Action Port Model (APM) by S. Carlsen (1998). The EXTERNAL project aimed to "facilitate inter-organisational cooperation in knowledge intensive industries. The project worked on the hypothesis that interactive process models form a suitable framework for tools and methodologies for dynamically networked organisations. In the project EEML (Extended Enterprise Modelling Language) was first constructed as a common metamodel, designed to enable syntactic and semantic interoperability". It was further developed in the EU projects Unified Enterprise Modelling Language (UEML) from 2002 to 2003 and the ongoing ATHENA project. The objectives of the UEML Working group were to "define, to validate and to disseminate a set of core language constructs to support a Unified Language for Enterprise Modelling, named UEML, to serve as a basis for interoperability
https://en.wikipedia.org/wiki/Cancer%20immunoprevention
Cancer immunoprevention is the prevention of cancer onset with immunological means such as vaccines, immunostimulators or antibodies. Cancer immunoprevention is conceptually different from cancer immunotherapy, which aims at stimulating immunity in patients only after tumor onset; however, the same immunological means can be used both in immunoprevention and in immunotherapy. Immunoprevention of tumors caused by viruses Immunoprevention of tumors caused by viruses or other infectious agents aims at preventing or curing infection before the onset of cancer. Effective vaccines are available for use in humans. Some tumor types in humans and in animals are the consequence of viral infections. In humans the most frequent viral tumors are liver cancer (also called hepatocellular carcinoma), arising in a small proportion of patients with chronic infection by hepatitis B virus (HBV) or hepatitis C virus (HCV), and carcinoma of the uterine cervix (also called cervical cancer), caused by human papilloma virus (HPV). Altogether these two tumors make up 10% of all human cancers, affecting almost one million new patients each year worldwide. The HBV vaccine, now in worldwide use, was shown to reduce the incidence of liver carcinoma. Cancer immunoprevention by the HBV vaccine can be thought of as a beneficial side effect of a vaccine developed and used to prevent hepatitis B. This is not the case with HPV vaccines, which were primarily developed for cancer prevention. Clinical trials showed that HPV vaccines can prevent HPV infection and carcinogenesis almost completely; these results led to vaccine approval by regulatory agencies in the USA and Europe. Immunoprevention of non-infectious tumors Is it possible to devise immunopreventive strategies for tumors not caused by infectious agents? The challenge is to predict in each individual the risk of specific cancer types and to design immune strategies targeting these cancer types. This is not yet feasible in humans, thus immunoprevention
https://en.wikipedia.org/wiki/Mullvad
Mullvad is a commercial VPN service based in Sweden. Launched in March 2009, Mullvad operates using the WireGuard and OpenVPN protocols. It also supports ShadowSocks as a bridge protocol for censorship circumvention. History Mullvad was launched in March 2009 by Amagicom AB. Its name is Swedish for mole. Mullvad began supporting connections via the OpenVPN protocol in 2009. Mullvad was an early adopter and supporter of the WireGuard protocol, announcing the availability of the new VPN protocol in March 2017 and making a "generous donation" supporting WireGuard development between July and December 2017. In September 2018, the cybersecurity firm Cure53 performed a penetration test on Mullvad's macOS, Windows and Linux applications. Seven issues were found which were addressed by Mullvad. Cure53 tested only the applications and supporting functions. No assessment was made on the Mullvad server side and back end. In October 2019, Mullvad partnered with Mozilla. Mozilla's VPN service, Mozilla VPN, utilizes Mullvad's WireGuard servers. In April 2020, Mullvad partnered with Malwarebytes and provided WireGuard servers for their VPN service, Malwarebytes Privacy. In May 2022, Mullvad started officially accepting Monero. On 18 April 2023, Mullvad's head office in Gothenburg was visited by officers from the National Operations Department of the Swedish Police, who had a search warrant to seize computers containing customer data. Mullvad demonstrated that, in accordance with their policies, no such data existed on their systems. After consulting with the prosecutor, the officers left without seizing any equipment or obtaining customer information. Mullvad shared this information in a blog post two days later making the incident public knowledge and mentioning this was the first time their offices had been visited with a search warrant. In a letter sent to Mullvad 9 days after the search the Swedish Police stated they conducted the search at the request of Germany for a
https://en.wikipedia.org/wiki/Longevity%20insurance
Longevity insurance describes the mitigation of longevity risk. Such risk mitigation is often achieved with a longevity annuity or tontine. A longevity annuity, also called a qualifying longevity annuity contract (QLAC) or deferred income annuity, is an annuity contract designed to provide the policyholder payments for life starting at a pre-established future age, e.g., 85, and purchased many years before reaching that age. General description Longevity annuities are like "reverse life insurance": premium dollars are collected by the life insurance company from its policyholders in order to pay income when a policyholder lives a long life, instead of collecting premium dollars and paying a death claim on a policyholder's short life as in ordinary life insurance. Longevity annuities use mortality credits to pool money and pay the claims of the remaining policyholders, the claim event being a long life. The term "longevity insurance" comes from this type of annuity being insurance against unusually long life. It may seem odd to insure against an event that most people would welcome. However, living a very long time would strain many people's financial resources, just as a fire which destroys their house would strain many people's finances if they didn't have fire insurance. The logic that makes fire insurance a prevalent means for coping with the financial risk of house fires would seem to argue for greater use of longevity insurance for retirement planning: few people will live to a very old age, so it doesn't make sense for everyone to try to cover that possibility with savings and investments. (The same type of reasoning applies to house insurance: because few people will experience house fires, it is not realistic to expect everyone to save and invest specifically for purposes of house replacement.) Longevity insurance is not designed for the early retirement years, so it is not intended as a complete retirement plan by itself. In the summer of 2014, the IRS and Treasury Depart
https://en.wikipedia.org/wiki/Praxeme
Praxeme is a methodology for enterprise architecture which provides a structured approach to the design and implementation of an enterprise information architecture. Overview Praxeme is an enterprise methodology which aims to embrace all aspects of the enterprise, from strategy to deployment. The name "Praxeme" is a contraction of "praxis" (action) and "semeion" (sense, meaning). The methodology contains design procedures for the information system and IT systems of the enterprise. It reconciles different modeling approaches. In particular, it proposes a semantic modeling technique which benefits from the object-oriented approach, to formalize the knowledge about the business fundamentals. Praxeme follows in the footsteps of the methodological tradition: it takes up the legacies of Merise, TACT, analysis-design methods and the Zachman framework; it updates them in light of recent advances (SOA, BPM, ontology, terminology); and it synthesizes these approaches, articulating them in accordance with Model Driven Architecture. Using the standard Unified Modeling Language, Praxeme's modeling techniques enable the "Enterprise System" to be rigorously defined. That is to say that the enterprise, in an effort of rationality, perceives itself to be a system. The notion of "Enterprise System" applies to enterprises and organizations, as well as any action system organized and striving to achieve an aim. Praxeme has been used in contexts as varied as the insurance sector, drone or weaponry systems, energy and distribution. History Praxeme was initiated in 2003 as an open method to respond to the need for enterprises to share a reference method in order to successfully manage their transformation projects. The foundations of the Praxeme method were initially enabled by the Aeronautics & Defense department of the SAGEM company. In 2004 the French mutual insurance company SMABTP supported the design procedures required to overhaul its information system in Service
https://en.wikipedia.org/wiki/Feynman%E2%80%93Kac%20formula
The Feynman–Kac formula, named after Richard Feynman and Mark Kac, establishes a link between parabolic partial differential equations (PDEs) and stochastic processes. In 1947, when Kac and Feynman were both Cornell faculty, Kac attended a presentation of Feynman's and remarked that the two of them were working on the same thing from different directions. The Feynman–Kac formula resulted, which proves rigorously the real-valued case of Feynman's path integrals. The complex case, which occurs when a particle's spin is included, is still an open question. The formula offers a method of solving certain partial differential equations by simulating random paths of a stochastic process. Conversely, an important class of expectations of random processes can be computed by deterministic methods. Theorem Consider the partial differential equation $\frac{\partial u}{\partial t}(x,t) + \mu(x,t)\frac{\partial u}{\partial x}(x,t) + \tfrac{1}{2}\sigma^2(x,t)\frac{\partial^2 u}{\partial x^2}(x,t) - V(x,t)\,u(x,t) + f(x,t) = 0$, defined for all $x \in \mathbb{R}$ and $t \in [0,T]$, subject to the terminal condition $u(x,T) = \psi(x)$, where $\mu, \sigma, \psi, V, f$ are known functions, $T$ is a parameter, and $u : \mathbb{R} \times [0,T] \to \mathbb{R}$ is the unknown. Then the Feynman–Kac formula tells us that the solution can be written as a conditional expectation $u(x,t) = E^Q\!\left[\int_t^T e^{-\int_t^r V(X_\tau,\tau)\,d\tau} f(X_r,r)\,dr + e^{-\int_t^T V(X_\tau,\tau)\,d\tau}\,\psi(X_T) \,\Big|\, X_t = x\right]$ under the probability measure $Q$ such that $X$ is an Itô process driven by the equation $dX_t = \mu(X_t,t)\,dt + \sigma(X_t,t)\,dW_t^Q$, with $W_t^Q$ a Wiener process (also called Brownian motion) under $Q$, and the initial condition for $X$ is $X_t = x$. Intuitive interpretation Suppose we have a particle moving according to the diffusion process $dX_t = \mu(X_t,t)\,dt + \sigma(X_t,t)\,dW_t$. Let the particle incur "cost" at a rate of $f(X_s,s)$ at location $X_s$ at time $s$. Let it incur a final cost of $\psi(X_T)$ at time $T$. Also, allow the particle to decay. If the particle is at location $X_s$ at time $s$, then it decays with rate $V(X_s,s)$. After the particle has decayed, all future cost is zero. Then $u(x,t)$ is the expected cost-to-go, if the particle starts at $(t, X_t = x)$. Partial proof A proof that the above formula is a solution of the differential equation is long, difficult and not presented here. It is however reasonably straightforward to show that, if a solution exists, it must have the above form. The proof of that lesser result is as follows: Let $u$ be the solution to the above partial
https://en.wikipedia.org/wiki/Sweet%20Workers%27%20Union
The Sweet Workers' Union (SWU) was a small but long-lived union representing confectionery workers in South Africa. In 1925, a Women Workers' Union was established by F. Klenerman. It affiliated to the South African Trades Union Congress (SATUC), but as a general union, it was prohibited from registering with the government. As it only had membership in two industries, in 1926 it split into the Waitresses' Union and the Sweet Makers' Union. It had about 200 members, which represented 75% of the sweet makers in Johannesburg. It remained affiliated when the SATUC merged into the South African Trades and Labour Council. In the late 1930s, the union was led by Dulcie Hartwell, and in 1937, E. J. Burford established a parallel African Sweet Workers' Union to represent black labourers in the industry. In 1939, the Garment Workers' Union of South Africa helped the union expand nationwide, and by 1947, it had grown to 1,843 members. It was associated with the left wing of the movement, and in 1951, its secretary, H. Le Roux, was banned by the government. Unlike most unions in South Africa, the union continued accepting white, "coloured" and Asian members. Its membership had declined to only 298 in 1979, but in 1980, it affiliated to the Trade Union Council of South Africa and began accepting black members, its membership growing to 1,396 by the end of the year. In 1997, it was a founding affiliate of the Federation of Unions of South Africa. It was dissolved in 2005.
https://en.wikipedia.org/wiki/Texas%20Memory%20Systems
Texas Memory Systems, Inc. (TMS) was an American corporation that designed and manufactured solid-state disks (SSDs) and digital signal processors (DSPs). TMS was founded in 1978 and that same year introduced its first solid-state drive, followed by its first digital signal processor. In 2000 it introduced the RamSan line of SSDs. Based in Houston, Texas, the company supplied these two product categories (directly as well as through OEM and reseller partners) to large enterprise and government organizations. TMS had been supplying SSD products to the market longer than any other company. On August 16, 2012, IBM Corporation announced a definitive agreement to acquire Texas Memory Systems, Inc. The acquisition was completed as planned on October 1, 2012. History TMS was founded in 1978 in Houston, Texas by Holly Frost to address a need in seismic processing for the oil and gas industry. The company's first product, the CMPS, was a 16-kilobyte (KB) custom SSD designed for Gulf Oil. SAM product line Around 1988, TMS designed and sold hundreds of SAM-600/800 (Shared Attached Memory) storage enclosures, mainly to the United States Department of Defense. These enclosures used 128 megabytes (MB) of dynamic random-access memory (DRAM) for data storage and several high-speed emitter-coupled logic (ECL) inputs and outputs for data transfer. These systems were mainly used to acquire and analyze signals in real time. When the 1980s oil glut caused disruption in the oil and gas industry, TMS shifted focus away from SSDs and onto digital signal processing products. The previously designed SAM storage systems were enhanced by adding a custom-designed DSP board. Prior to this added DSP capability, to analyze a signal a user would have to send the signal to the SAM storage for staging, engage a separate system to perform digital signal processing, then store the result back to the SAM system for analysis. Adding the DSP processor into the storage system itself meant that the d
https://en.wikipedia.org/wiki/80th%20meridian%20east
The meridian 80° east of Greenwich is a line of longitude that extends from the North Pole across the Arctic Ocean, Asia, the Indian Ocean, the Southern Ocean, and Antarctica to the South Pole. The 80th meridian east forms a great circle with the 100th meridian west. There is also an 80 Degrees East cafe in Nanganallur, Chennai, India, named after this longitude. From Pole to Pole Starting at the North Pole and heading south to the South Pole, the 80th meridian east passes through:
Arctic Ocean
Kara Sea
Krasnoyarsk Krai — Ushakov Island
Kara Sea — passing just west of Dikson Island, Krasnoyarsk Krai
Krasnoyarsk Krai; Yamalo-Nenets Autonomous Okrug; Krasnoyarsk Krai; Yamalo-Nenets Autonomous Okrug; Khanty-Mansi Autonomous Okrug; Tomsk Oblast; Novosibirsk Oblast; Altai Krai
Xinjiang – for about 18 km
Xinjiang
Aksai Chin (disputed)
Tibet
Aksai Chin (disputed) – for about 4 km
Uttarakhand; Uttar Pradesh; Madhya Pradesh; Maharashtra; Telangana; Andhra Pradesh; Tamil Nadu
Indian Ocean
https://en.wikipedia.org/wiki/Adaptive%20heap%20sort
In computer science, adaptive heap sort is a comparison-based sorting algorithm of the adaptive sort family. It is a variant of heap sort that performs better when the data contains existing order. Published by Christos Levcopoulos and Ola Petersson in 1992, the algorithm utilizes a new measure of presortedness, Osc, defined as the number of oscillations. Instead of putting all the data into the heap as traditional heap sort does, adaptive heap sort takes only part of the data into the heap, so that the run time is reduced significantly when the presortedness of the data is high. Heapsort Heap sort is a sorting algorithm that utilizes the binary heap data structure. The method treats an array as a complete binary tree and builds up a Max-Heap/Min-Heap to achieve sorting. It usually involves the following four steps. Build a Max-Heap (Min-Heap): put all the data into the heap so that all nodes are either greater than or equal to (less than or equal to for a Min-Heap) each of their child nodes. Swap the first element of the heap with the last element of the heap. Remove the last element from the heap and put it at the end of the list. Adjust the heap so that the first element ends up at the right place in the heap. Repeat Steps 2 and 3 until the heap has only one element. Put this last element at the end of the list and output the list. The data in the list will be sorted. Below is a C/C++ implementation that builds up a Max-Heap and sorts the array after the heap is built.

/* A C/C++ sample heap sort code that sorts an array into increasing order */

// A function that builds up a max-heap binary tree
void heapify(int array[], int start, int end)
{
    int parent = start;
    int child = parent * 2 + 1;
    while (child <= end) {
        if (child + 1 <= end)  // when there are two child nodes
        {
            if (array[child + 1] > array[child]) {
                child++;  // take the bigger child node
            }
        }
        if (array[parent] > array[child]) {
            return;  // if the parent node is greater, then it's
https://en.wikipedia.org/wiki/Arithmetic%20function
In number theory, an arithmetic, arithmetical, or number-theoretic function is for most authors any function f(n) whose domain is the positive integers and whose range is a subset of the complex numbers. Hardy & Wright include in their definition the requirement that an arithmetical function "expresses some arithmetical property of n". An example of an arithmetic function is the divisor function whose value at a positive integer n is equal to the number of divisors of n. There is a larger class of number-theoretic functions that do not fit the above definition, for example, the prime-counting functions. This article provides links to functions of both classes. Arithmetic functions are often extremely irregular (see table), but some of them have series expansions in terms of Ramanujan's sum. Multiplicative and additive functions An arithmetic function a is completely additive if a(mn) = a(m) + a(n) for all natural numbers m and n; completely multiplicative if a(mn) = a(m)a(n) for all natural numbers m and n. Two whole numbers m and n are called coprime if their greatest common divisor is 1, that is, if there is no prime number that divides both of them. Then an arithmetic function a is additive if a(mn) = a(m) + a(n) for all coprime natural numbers m and n; multiplicative if a(mn) = a(m)a(n) for all coprime natural numbers m and n. Notation In this article, $\sum_p f(p)$ and $\prod_p f(p)$ mean that the sum or product is over all prime numbers. Similarly, $\sum_{p^k} f(p^k)$ and $\prod_{p^k} f(p^k)$ mean that the sum or product is over all prime powers with strictly positive exponent (so $k = 0$ is not included). The notations $\sum_{d\mid n} f(d)$ and $\prod_{d\mid n} f(d)$ mean that the sum or product is over all positive divisors of n, including 1 and n. For example, if , then The notations can be combined: $\sum_{p\mid n} f(p)$ and $\prod_{p\mid n} f(p)$ mean that the sum or product is over all prime divisors of n. For example, if n = 18, then $\sum_{p\mid 18} f(p) = f(2) + f(3)$, and similarly $\sum_{p^k\mid n} f(p^k)$ and $\prod_{p^k\mid n} f(p^k)$ mean that the sum or product is over all prime powers dividing n. For example, if n = 24, then $\sum_{p^k\mid 24} f(p^k) = f(2) + f(3) + f(4) + f(8)$. Ω(n), ω(n), νp(n) – prime power decomposit
https://en.wikipedia.org/wiki/Orbital%20magnetization
In quantum mechanics, orbital magnetization, Morb, refers to the magnetization induced by orbital motion of charged particles, usually electrons in solids. The term "orbital" distinguishes it from the contribution of spin degrees of freedom, Mspin, to the total magnetization. A nonzero orbital magnetization requires broken time-reversal symmetry, which can occur spontaneously in ferromagnetic and ferrimagnetic materials, or can be induced in a non-magnetic material by an applied magnetic field. Definitions The orbital magnetic moment of a finite system, such as a molecule, is given classically by $\mathbf{m}_{\rm orb} = \frac{1}{2} \int \mathbf{r} \times \mathbf{J}(\mathbf{r}) \, d^3 r$, where J(r) is the current density at point r. (Here SI units are used; in Gaussian units, the prefactor would be 1/2c instead, where c is the speed of light.) In a quantum-mechanical context, this can also be written as $\mathbf{m}_{\rm orb} = -\frac{e}{2 m_e} \langle \Psi | \mathbf{L} | \Psi \rangle$, where −e and me are the charge and mass of the electron, Ψ is the ground-state wave function, and L is the angular momentum operator. The total magnetic moment is $\mathbf{m} = \mathbf{m}_{\rm orb} + \mathbf{m}_{\rm spin}$, where the spin contribution is intrinsically quantum-mechanical and is given by $\mathbf{m}_{\rm spin} = -\frac{g_s \mu_B}{\hbar} \langle \Psi | \mathbf{S} | \Psi \rangle$, where gs is the electron spin g-factor, μB is the Bohr magneton, ħ is the reduced Planck constant, and S is the electron spin operator. The orbital magnetization M is defined as the orbital moment density; i.e., orbital moment per unit volume. For a crystal of volume V composed of isolated entities (e.g., molecules) labelled by an index j having magnetic moments morb, j, this is $\mathbf{M}_{\rm orb} = \frac{1}{V} \sum_j \mathbf{m}_{{\rm orb},j}$. However, real crystals are made up of atomic or molecular constituents whose charge clouds overlap, so that the above formula cannot be taken as a fundamental definition of orbital magnetization. Only recently have theoretical developments led to a proper theory of orbital magnetization in crystals, as explained below. Theory Difficulties in the definition of orbital magnetization For a magnetic crystal, it is tempting to try to define $\mathbf{M}_{\rm orb} = \frac{1}{2V} \int_V \mathbf{r} \times \mathbf{J}(\mathbf{r}) \, d^3 r$, where the limit is taken as the volume V of the system becomes large. However, because of
https://en.wikipedia.org/wiki/Guanosine%20nucleotide%20dissociation%20inhibitor
In molecular biology, the guanosine nucleotide dissociation inhibitors (GDIs) constitute a family of proteins that regulate small GTPases involved in vesicular membrane traffic. GDIs bind to the GDP-bound form of Rho and Rab small GTPases and not only prevent nucleotide exchange (maintaining the small GTPase in an off-state), but also prevent the small GTPase from localizing at the membrane, which is its place of action. This inhibition can be removed by the action of a GDI displacement factor. GDIs also inhibit Cdc42 by binding to its tail and preventing its insertion into membranes; hence it cannot trigger WASPs and cannot lead to nucleation of F-actin. The Rab proteins' C-terminal geranylgeranylation is crucial for their membrane association and function. This post-translational modification is catalysed by Rab geranylgeranyl transferase (Rab-GGTase), a multi-subunit enzyme that contains a catalytic heterodimer and an accessory component, termed Rab escort protein (REP)-1. REP-1 presents newly synthesised Rab proteins to the catalytic component, and forms a stable complex with the prenylated proteins following the transfer reaction. The mechanism of REP-1-mediated membrane association of Rab5 is similar to that mediated by Rab GDP dissociation inhibitor (GDI). REP-1 and Rab GDI also share other functional properties, including the ability to inhibit the release of GDP and to remove Rab proteins from membranes. The crystal structure of the bovine alpha-isoform of Rab GDI has been determined to a resolution of 1.81 Angstrom. The protein is composed of two main structural units: a large complex multi-sheet domain I, and a smaller alpha-helical domain II. The structural organisation of domain I is closely related to FAD-containing monooxygenases and oxidases. Conserved regions common to GDI and the choroideraemia gene product, which delivers Rab to the catalytic subunits of Rab geranylgeranyltransferase II, are clustered on one face of the domain. The two most conserved regions form a compact struc
https://en.wikipedia.org/wiki/Redundant%20array%20of%20independent%20memory
A redundant array of independent memory (RAIM) is a design feature found in certain computers' main random access memory. RAIM utilizes additional memory modules and striping algorithms to protect against the failure of any particular module and keep the memory system operating continuously. RAIM is similar in concept to a redundant array of independent disks (RAID), which protects against the failure of a disk drive, but in the case of memory it supports several DRAM device chipkills and entire memory channel failures. RAIM is much more robust than parity checking and ECC memory technologies which cannot protect against many varieties of memory failures. On July 22, 2010, IBM introduced the first high end computer server featuring RAIM, the zEnterprise 196. Each z196 machine contains up to 3 TB (usable) of RAIM-protected main memory. In 2011 the business class model z114 was introduced also supporting RAIM. The formal announcement letter offered some additional information regarding the implementation: See also IBM mainframe
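The RAID analogy above can be made concrete with a toy XOR-parity sketch (purely illustrative; IBM's actual RAIM scheme uses a more sophisticated code that also covers chipkill and channel failures). Data is striped across several channels plus one parity channel, and any single lost channel can be rebuilt by XOR-ing the survivors with the parity:

```python
def parity(channels):
    """Byte-wise XOR of corresponding bytes across equal-length channels."""
    out = bytearray(len(channels[0]))
    for ch in channels:
        for i, b in enumerate(ch):
            out[i] ^= b
    return bytes(out)

# Four data channels plus one parity channel (RAID-style striping).
data = [b"\x11\x22", b"\x33\x44", b"\x55\x66", b"\x77\x88"]
p = parity(data)

# Simulate losing channel 2 entirely; rebuild it from the
# surviving channels plus the parity channel.
survivors = data[:2] + data[3:]
rebuilt = parity(survivors + [p])
assert rebuilt == data[2]
```

Single-parity XOR tolerates exactly one failed channel; tolerating simultaneous chip and channel failures, as RAIM does, requires stronger erasure codes.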
https://en.wikipedia.org/wiki/Stroop%20effect
In psychology, the Stroop effect is the delay in reaction time between congruent and incongruent stimuli. The effect has been used to create a psychological test (the Stroop test) that is widely used in clinical practice and investigation. A basic task that demonstrates this effect occurs when there is a mismatch between the name of a color (e.g., "blue", "green", or "red") and the color it is printed in (i.e., the word "red" printed in blue ink instead of red ink). When asked to name the color of the ink, people take longer and are more prone to errors when the ink color does not match the name of the color. The effect is named after John Ridley Stroop, who first published the effect in English in 1935. The effect had previously been published in Germany in 1929 by other authors. The original paper by Stroop has been one of the most cited papers in the history of experimental psychology, leading to more than 700 Stroop-related articles in the literature. Original experiment The effect was named after John Ridley Stroop, who published the effect in English in 1935 in an article in the Journal of Experimental Psychology entitled "Studies of interference in serial verbal reactions" that includes three different experiments. However, the effect was first published in 1929 in Germany by Erich Rudolf Jaensch, and its roots can be followed back to the works of James McKeen Cattell and Wilhelm Maximilian Wundt in the nineteenth century. In his experiments, Stroop administered several variations of the same test for which three different kinds of stimuli were created: names of colors printed in black ink; names of colors printed in an ink different from the color named; and squares of a given color. In the first experiment, words and conflict-words were used. The task required the participants to read the written color names of the words independently of the color of the ink (for example, they would have to read "purple" no matter what the color of the font). In experiment 2, st
https://en.wikipedia.org/wiki/Intervirology
Intervirology is a bimonthly peer-reviewed medical journal covering all aspects of virology, especially concerning animal viruses. It was established in 1973 and is published by Karger Publishers. The editor-in-chief is Jean-Claude Manuguerra. History The journal was established in 1973 by J.L. Melnick as the Journal of the Virology Division of the International Union of Microbiological Societies. In 1982, it published a paper providing the first taxonomic description placing the Ebola virus into the family Filoviridae. Abstracting and indexing Intervirology is abstracted and indexed in several bibliographic databases. According to the Journal Citation Reports, the journal has a 2013 impact factor of 1.773.
https://en.wikipedia.org/wiki/Semi-log%20plot
In science and engineering, a semi-log plot/graph or semi-logarithmic plot/graph has one axis on a logarithmic scale, the other on a linear scale. It is useful for data with exponential relationships, where one variable covers a large range of values, or to reveal that what seems at first to be a straight line is in fact the slow start of an exponential curve that is about to spike, with changes far larger than they initially appear. All equations of the form $y = \lambda a^{\gamma x}$ form straight lines when plotted semi-logarithmically, since taking logs of both sides gives $\log_a y = \gamma x + \log_a \lambda$. This is a line with slope $\gamma$ and vertical intercept $\log_a \lambda$. The logarithmic scale is usually labeled in base 10; occasionally in base 2. A log–linear (sometimes log–lin) plot has the logarithmic scale on the y-axis, and a linear scale on the x-axis; a linear–log (sometimes lin–log) plot is the opposite. The naming is output–input (y–x), the opposite order from (x, y). On a semi-log plot the spacing of the scale on the y-axis (or x-axis) is proportional to the logarithm of the number, not the number itself. It is equivalent to converting the y values (or x values) to their log, and plotting the data on linear scales. A log–log plot uses the logarithmic scale for both axes, and hence is not a semi-log plot. Equations The equation of a line on a linear–log plot, where the abscissa axis is scaled logarithmically (with a logarithmic base of n), would be $F(x) = m \log_n(x) + b$. The equation for a line on a log–linear plot, with an ordinate axis logarithmically scaled (with a logarithmic base of n), would be $\log_n(F(x)) = m x + b$. Finding the function from the semi-log plot Linear–log plot On a linear–log plot, pick some fixed point (x0, F0), where F0 is shorthand for F(x0), somewhere on the straight line in the above graph, and further some other arbitrary point (x1, F1) on the same graph. The slope formula of the plot is $m = \frac{F_1 - F_0}{\log_n(x_1) - \log_n(x_0)}$, which leads to $F_1 - F_0 = m \log_n \frac{x_1}{x_0}$, or $F_1 = F_0 + m \log_n \frac{x_1}{x_0}$, which means that $F(x) = F_0 + m \log_n \frac{x}{x_0}$. In other words, F is proportional to the logarithm of x times the slope of th
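The slope/intercept recovery described above can be sketched in a few lines of Python (illustrative only; the data and base are made up). For a log–linear plot with base-n ordinate, two points on the straight line determine the slope m and intercept b of log_n F = m x + b, from which the exponential's growth rate and prefactor follow:

```python
import math

def F(x):
    # Synthetic exponential data: F(x) = 5 * 2**(0.5 * x),
    # i.e. prefactor 5 and growth rate 0.5 in base 2.
    return 5 * 2 ** (0.5 * x)

n = 2                      # base of the logarithmic (ordinate) axis
x0, x1 = 1.0, 9.0          # two points read off the straight line
y0, y1 = F(x0), F(x1)

# Slope and intercept in log space, as in the equations above.
m = (math.log(y1, n) - math.log(y0, n)) / (x1 - x0)
b = math.log(y0, n) - m * x0

assert abs(m - 0.5) < 1e-9        # recovers the growth rate gamma
assert abs(n ** b - 5) < 1e-9     # n**b recovers the prefactor lambda
```

With noisy data one would fit the line by least squares on the logged values rather than use two points.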
https://en.wikipedia.org/wiki/Max%20Planck%20Institute%20for%20Biological%20Cybernetics
The Max Planck Institute for Biological Cybernetics is located in Tübingen, Baden-Württemberg, Germany. It is one of 80 institutes in the Max Planck Society (Max Planck Gesellschaft). The institute studies signal and information processing in the brain. The brain constantly processes a vast amount of sensory and intrinsic information, by which behavior is coordinated. How the brain actually achieves these tasks is less well understood, for example, how it perceives, recognizes, and learns new objects. The scientists at the Max Planck Institute for Biological Cybernetics aim to determine which signals and processes are responsible for creating a coherent percept of the environment and for eliciting the appropriate behavior. Scientists of three departments and seven research groups are working towards answering fundamental questions about processing in the brain, using different approaches and methods. Departments Department for Sensory and Sensorimotor Systems (Zhaoping Li) Department for High-field Magnetic Resonance (Klaus Scheffler) Department for Computational Neuroscience (Peter Dayan) Research groups Dynamic Cognition Group (Assaf Breska) Translational Sensory and Circadian Neuroscience (Manuel Spitschan) Computational Principles of Intelligence (Eric Schulz) Systems Neuroscience & Neuroengineering (Jennifer Li & Drew Robson) Former departments Department for Physiology of Cognitive Processes (Nikos Logothetis) Department for Human Perception, Cognition and Action (Heinrich H. Bülthoff) Empirical Inference (Bernhard Schölkopf) Information Processing in Insects (Werner E. Reichardt) Structure & Function of Natural Nerve-Net (Valentin von Braitenberg) External links Homepage of the Max Planck Institute for Biological Cybernetics Biological Cybernetics Biological research institutes Systems science institutes Organisations based in Tübingen Education in Tübingen Cybernetics
https://en.wikipedia.org/wiki/Promotional%20video
In video production, a promotional video is marketing or advertising: Arts, media and entertainment Promotional recording, an audio or video recording distributed to publicize a recording Trailer (promotion), a commercial advertisement for a feature film Music video, a short film that integrates a song with imagery Corporate use Corporate video, non-advertisement media created for and commissioned by an organization Personal use Video resume, a recording used to promote a jobseeker Promotional dating video, a video dating recording made to find a romantic partner
https://en.wikipedia.org/wiki/Provider%20edge%20router
A provider edge router (PE router) is a router between one network service provider's area and areas administered by other network providers. A network provider is usually also an Internet service provider, or exclusively that. The term PE router covers equipment capable of a broad range of routing protocols, notably: Border Gateway Protocol (BGP) (PE to PE or PE to CE communication) Open Shortest Path First (OSPF) (PE to CE router communication) Multiprotocol Label Switching (MPLS) (PE to P router communication) PE routers do not need to be aware of what kind of traffic is coming from the provider's network, as opposed to a P router that functions as a transit node within the service provider's network. However, some PE routers also perform MPLS labelling. See also Customer edge router Provider router
https://en.wikipedia.org/wiki/Butterfat
Butterfat or milkfat is the fatty portion of milk. Milk and cream are often sold according to the amount of butterfat they contain. Composition Butterfat is mainly composed of triglycerides. Each triglyceride contains three fatty acids. Butterfat triglycerides contain the following amounts of fatty acids (by mass fraction): Butterfat contains about 3% trans fat, which is slightly less than 0.5 grams per US tablespoon. Trans fats occur naturally in meat and milk from ruminants. The predominant kind of trans fat found in milk is vaccenic fatty acid. Trans fats may be also found in some industrially produced foods, such as shortenings obtained by hydrogenation of vegetable oils. In light of recognized scientific evidence, nutritional authorities consider all trans fats equally harmful for health and recommend that their consumption be reduced to trace amounts. However, two Canadian studies have shown that vaccenic acid could be beneficial compared to vegetable shortenings containing trans fats, or a mixture of pork lard and soy fat, by lowering total LDL and triglyceride levels. A study by the US Department of Agriculture showed that vaccenic acid raises both HDL and LDL cholesterol, whereas industrial trans fats only raise LDL with no beneficial effect on HDL. U.S. standards In the U.S., there are federal standards for butterfat content of dairy products. Many other countries also have standards for minimum fat levels in dairy products. Commercial products generally contain the minimum legal amount of fat with any excess being removed to make cream, a valuable commodity. Milks Non-fat milk, also labeled "fat-free milk" or "skim milk", contains less than 0.5% fat Low-fat milk is 1% fat Reduced-fat milk is 2% fat Whole milk contains at least 3.25% fat Cheeses Dry curd and nonfat cottage cheese contain less than 0.5% fat Lowfat cottage cheese contains 0.5–2% fat Cottage cheese contains at least 4% fat Swiss cheese contains at least 43% fat relative t
https://en.wikipedia.org/wiki/Bivalent%20chromatin
Bivalent chromatin consists of segments of DNA, bound to histone proteins, that have both repressing and activating epigenetic regulators in the same region. These regulators work to enhance or silence the expression of genes. Since these regulators work in opposition to each other, they normally interact with chromatin at different times. However, in bivalent chromatin, both types of regulators are interacting with the same domain at the same time. Bivalent chromatin domains are normally associated with promoters of transcription factor genes that are expressed at low levels. Bivalent domains have also been found to play a role in developmental regulation in pluripotent embryonic stem cells, gene imprinting and cancer. Bivalent epigenetic regulators The most common antagonistic epigenetic regulators found together on bivalent chromatin domains are methylation marks on histone 3 lysine 4 (H3K4me3) and histone 3 lysine 27 (H3K27me3). The H3K27me3 mark silences the gene, while the H3K4me3 mark keeps the gene from being permanently silenced, allowing it to be activated when needed. Embryonic stem cells and imprinted genes are associated with both activating (H3K4me3) and repressive (H3K27me3) marks, as they allow a gene to be repressed until activation is needed. Although there is abundant evidence for co-localization of H3K4me3 and H3K27me3 at the same location in the genome, most evidence suggests that they do not occur on the same molecule but may occur on different copies of histone H3 within the same nucleosome. Embryonic stem cells and development Bivalent chromatin domains are found in embryonic stem (ES) cells and play an important role in cell differentiation. When keeping an ES cell in its undifferentiated state, bivalent domains of DNA are used to silence developmental genes that would activate cell differentiation, while keeping the genes poised and ready to be activated. When an ES cell receives a signal to differentiate into a specified cell lineage, activation of the spec
https://en.wikipedia.org/wiki/Nocebo
A nocebo effect is said to occur when negative expectations of the patient regarding a treatment cause the treatment to have a more negative effect than it otherwise would have. For example, when a patient anticipates a side effect of a medication, they can experience that effect even if the "medication" is actually an inert substance. The complementary concept, the placebo effect, is said to occur when positive expectations improve an outcome. The nocebo effect is also said to occur in someone who falls ill owing to the erroneous belief that they were exposed to a toxin, e.g. SARS-CoV-2 vaccine adulterants, or to a physical phenomenon they believe is harmful, such as EM radiation. Both placebo and nocebo effects are presumably psychogenic, but they can induce measurable changes in the body. One article that reviewed 31 studies on nocebo effects reported a wide range of symptoms that could manifest as nocebo effects, including nausea, stomach pains, itching, bloating, depression, sleep problems, loss of appetite, sexual dysfunction, and severe hypotension. Etymology and usage The term nocebo (Latin nocēbō, 'I shall harm', from noceō, 'I harm') was coined by Walter Kennedy in 1961 to denote the counterpart to the use of placebo (Latin placēbō, 'I shall please', from placeō, 'I please'), a substance that may produce a beneficial, healthful, pleasant, or desirable effect. Kennedy emphasized that his use of the term nocebo refers strictly to "a subject-centered response, a quality inherent in the patient rather than in the remedy". That is, Kennedy rejected the use of the term for pharmacologically induced negative side effects such as the ringing in the ears caused by quinine. That is not to say that the patient's psychologically induced response may not include physiological effects. For example, an expectation of pain may induce anxiety, which in turn causes the release of cholecystokinin, which facilitates pain transmission. Response In the narrowest sense, a nocebo response occurs when
https://en.wikipedia.org/wiki/Monoaminergic%20cell%20groups
Monoaminergic cell groups refers to collections of neurons in the central nervous system that have been demonstrated by histochemical fluorescence to contain one of the neurotransmitters serotonin, dopamine, norepinephrine or epinephrine. Thus, it represents the combination of catecholaminergic cell groups and serotonergic cell groups.
https://en.wikipedia.org/wiki/130th%20meridian%20east
The meridian 130° east of Greenwich is a line of longitude that extends from the North Pole across the Arctic Ocean, Asia, Australia, the Indian Ocean, the Southern Ocean, and Antarctica to the South Pole. The 130th meridian east forms a great circle with the 50th meridian west. From Pole to Pole Starting at the North Pole and heading south to the South Pole, the 130th meridian east passes through: {| class="wikitable plainrowheaders" ! scope="col" width="130" | Co-ordinates ! scope="col" | Country, territory or sea ! scope="col" | Notes |- | style="background:#b0e0e6;" | ! scope="row" style="background:#b0e0e6;" | Arctic Ocean | style="background:#b0e0e6;" | |- | style="background:#b0e0e6;" | ! scope="row" style="background:#b0e0e6;" | Laptev Sea | style="background:#b0e0e6;" | |-valign="top" | ! scope="row" | | Sakha Republic Amur Oblast — from |-valign="top" | ! scope="row" | | Heilongjiang Jilin — from |- | ! scope="row" | | |- | style="background:#b0e0e6;" | ! scope="row" style="background:#b0e0e6;" | Sea of Japan | style="background:#b0e0e6;" | |-valign="top" | ! scope="row" | | Island of Kyūshū— Saga Prefecture— Nagasaki Prefecture — from |- | style="background:#b0e0e6;" | ! scope="row" style="background:#b0e0e6;" | East China Sea | style="background:#b0e0e6;" | |-valign="top" | ! scope="row" | | Island of Shimoshima— Kumamoto Prefecture |-valign="top" | style="background:#b0e0e6;" | ! scope="row" style="background:#b0e0e6;" | East China Sea | style="background:#b0e0e6;" | Passing just east of the Koshikijima Islands, Kagoshima Prefecture, (at ) Passing just east of the island of Kuroshima, Kagoshima Prefecture, (at ) Passing just west of the island of Kuchinoerabujima, Kagoshima Prefecture, (at ) |-valign="top" | style="background:#b0e0e6;" | ! scope="row" style="background:#b0e0e6;" | Pacific Ocean | style="background:#b0e0e6;" | Passing just east of the island of Kuchinoshima, Kagoshima Prefecture, (at ) |-valign="top" | ! scope=
https://en.wikipedia.org/wiki/Quantitative%20fund
A quantitative fund is an investment fund that uses quantitative investment management instead of fundamental human analysis. Investment process See for a listing of relevant articles. An investment process is classified as quantitative when investment management is fully based on the use of mathematical and statistical methods to make investment decisions. If investment decisions are based on fundamental analysis and human judgement, the process is classified as fundamental. The quantitative investment process essentially breaks down into three key components: Input system: providing all necessary inputs such as market data and rules (see financial data vendor); Forecasting engine: generating estimates for prices and returns and also risk parameters; Portfolio construction engine: composing the portfolio using optimizers or a heuristics-based system (see Portfolio optimization and Mathematical tools). Quantitative portfolio managers and quantitative analysts usually require a strong background in mathematics and computer science, besides knowledge of the academic financial literature. Many quantitative specialists have a PhD in financial economics, engineering or mathematics. In-depth knowledge is needed, as the investment algorithms employ advanced optimization methods using the latest academic insights. Statistical models are used to explore profits that may be made from systematic market abnormalities; exploiting these can require very fast, high-frequency trading, but can also be slower, with less turnover, when the alpha is based on factor investing. History and performance Hedge funds have been driving the growth of quantitative funds over the past decades. A good description of the history of hedge funds can be found in the book "More Money than God". Several of these early funds were quantitatively managed. Over the past two decades quantitatively managed funds have become popular as an increasing number of asset managers adopted quantitativ
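The three-component pipeline described above can be caricatured in a few lines of Python. Everything here is a made-up toy (hypothetical price series, a trailing-return "forecast", and a naive proportional-weighting heuristic), not a real investment strategy; it only shows how input, forecasting, and portfolio construction fit together:

```python
# Input system (toy): monthly price histories for three hypothetical assets.
prices = {
    "A": [100, 104, 108, 113],
    "B": [50, 49, 51, 50],
    "C": [200, 210, 205, 220],
}

def forecast(series):
    """Forecasting engine (toy): trailing total return as expected return."""
    return series[-1] / series[0] - 1.0

def construct_portfolio(signals):
    """Portfolio construction (toy heuristic): hold only positive-signal
    assets, weighted in proportion to the signal."""
    longs = {k: s for k, s in signals.items() if s > 0}
    total = sum(longs.values())
    return {k: s / total for k, s in longs.items()}

signals = {k: forecast(v) for k, v in prices.items()}
weights = construct_portfolio(signals)
assert abs(sum(weights.values()) - 1.0) < 1e-12  # fully invested
```

A production system would replace each stage: validated market-data feeds, statistical return and risk models, and a constrained optimizer (e.g. mean-variance) for construction.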
https://en.wikipedia.org/wiki/Cauliflower
Cauliflower is one of several vegetables in the species Brassica oleracea in the genus Brassica, which is in the Brassicaceae (or mustard) family. It is an annual plant that reproduces by seed. Typically, only the head is eaten – the edible white flesh is sometimes called "curd" (with a similar appearance to cheese curd). The cauliflower head is composed of a white inflorescence meristem. Cauliflower heads resemble those in broccoli, which differs in having flower buds as the edible portion. Brassica oleracea also includes broccoli, Brussels sprouts, cabbage, collard greens, kale, and kohlrabi, collectively called "cole" crops, though they are of different cultivar groups. History Pliny the Elder included cyma among cultivated plants he described in Natural History: "Ex omnibus brassicae generibus suavissima est cyma" ("Of all the varieties of cabbage the most pleasant-tasted is cyma"). Pliny's description likely refers to the flowering heads of an earlier cultivated variety of Brassica oleracea. In the Middle Ages, early forms of cauliflower were associated with the island of Cyprus, with the 12th- and 13th-century Arab botanists Ibn al-'Awwam and Ibn al-Baitar claiming its origin to be Cyprus. This association continued into Western Europe, where cauliflowers were sometimes known as Cyprus colewort, and there was extensive trade in western Europe in cauliflower seeds from Cyprus, under the French Lusignan rulers of the island, until well into the 16th century. François Pierre La Varenne employed chouxfleurs in Le cuisinier françois. They were introduced to France from Genoa in the 16th century and are featured in Olivier de Serres' Théâtre de l'agriculture (1600), as cauli-fiori "as the Italians call it, which are still rather rare in France; they hold an honorable place in the garden because of their delicacy", but they did not commonly appear on grand tables until the time of Louis XIV. It was introduced to India in 1822 by the British. Etymology The wor
https://en.wikipedia.org/wiki/Seaman%20%28video%20game%29
is a virtual pet video game for the Sega Dreamcast. It is one of the few Dreamcast games to take advantage of the microphone attachment. The game developed a cult following for its dark humor, bizarre aesthetics, and innovative gameplay. Seaman was released multiple times, including a limited edition demo version titled Christmas Seaman that was released in Japan in 1999, alongside a limited edition red Dreamcast and a PlayStation 2 version in 2001, titled Seaman: Kindan no Pet - Gaze Hakushi no Jikken Shima, the first edition of which came with a microphone. A PC version for Microsoft Windows was planned, with the Seaman being able to interact with the user's applications. No release date was specified, and it was later cancelled. A sequel called Seaman 2 was released in Japan for the PlayStation 2 in 2007. Gameplay Seaman is considered a unique video game because it contains limited action. The player's goal is to feed and care for the Seaman while providing him with the company that he needs. The mechanic operates in real time, so the player is required to check on the Seaman every real-time day or he could die. A portion of the Seaman's knowledge is random trivia. When he asks what the player's birthday is (and the player responds via the microphone input), the Seaman will share significant events that happened on that date. Although the Seaman becomes fairly domesticated, it does not stop insulting the player or making less-than-friendly remarks. At the beginning of the game, the player is provided with an unhatched Seaman egg and develops and interacts with it through various stages of development. Through various buttons on the Dreamcast controller, the player controls all the machinery and physical contact with the mysterious creature. The player is also provided with multiple Seamen for breeding and interaction purposes. Over the course of the game, the player is required to evolve their Seaman to different stages in its life cycle, eventually transf
https://en.wikipedia.org/wiki/ISCB%20Africa%20ASBCB%20Conference%20on%20Bioinformatics
The ISCB Africa ASBCB Conference on Bioinformatics is a biennial academic conference on the subjects of bioinformatics and computational biology, organized by the African Society for Bioinformatics and Computational Biology (ASBCB). The conference was first held in 2007 as the "ASBCB Conference on the Bioinformatics of African Pathogens, Hosts and Vectors". Since 2009, the conference has been jointly organized with the International Society for Computational Biology (ISCB) and held in different locations within Africa. Although it has an evident African focus, the meeting is intended to be a truly international event, encompassing scientists and students from leading institutions in the US, Latin America, Europe and Africa. By holding this event in Africa, ISCB and ASBCB intend to promote local efforts for cooperation and dissemination of leading research techniques to combat major African diseases. Format of the Meeting The meeting usually consists of a 3-day conference followed by practical workshops. The main 3-day meeting includes keynote presentations by up to 6 invited speakers from around the world, including Africa. Session chairs introduce keynote speakers with an overview of the session, highlighting the most significant challenges and the current state of the art in the field before the keynote speakers begin their presentations. Highly accomplished researchers, primarily but not exclusively from non-African countries, present during the post-conference tutorial workshops. Conference Goals To directly impact existing capacity to develop public health interventions in African countries where major diseases are endemic, by driving collaboration, network development and training. To expose young and established scientists to the latest bioinformatics tools and techniques used in researching treatments and cures concerning African hosts, vectors and diseases. Scientific publications Since 2009, the ISCB Africa ASBCB Conference has been partnering with the Genes, Infection an
https://en.wikipedia.org/wiki/Sim4
Sim4 is a nucleotide sequence alignment program akin to BLAST but specifically tailored to DNA to cDNA/EST (Expressed Sequence Tag) alignment (as opposed to DNA–DNA or protein–protein alignment). It was written by Florea et al. External links A Computer Program for Aligning a cDNA Sequence with a Genomic DNA Sequence Download Phylogenetics software
https://en.wikipedia.org/wiki/Homeomorphism%20%28graph%20theory%29
In graph theory, two graphs G1 and G2 are homeomorphic if there is a graph isomorphism from some subdivision of G1 to some subdivision of G2. If the edges of a graph are thought of as lines drawn from one vertex to another (as they are usually depicted in illustrations), then two graphs are homeomorphic to each other in the graph-theoretic sense precisely if they are homeomorphic in the topological sense. Subdivision and smoothing In general, a subdivision of a graph G (sometimes known as an expansion) is a graph resulting from the subdivision of edges in G. The subdivision of some edge e with endpoints {u,v} yields a graph containing one new vertex w, and with an edge set replacing e by two new edges, {u,w} and {w,v}. For example, the edge e, with endpoints {u,v}, can be subdivided into two edges, e1 and e2, connecting to a new vertex w. The reverse operation, smoothing out or smoothing a vertex w with regards to the pair of edges (e1, e2) incident on w, removes both edges containing w and replaces (e1, e2) with a new edge that connects the other endpoints of the pair. Here, it is emphasized that only degree-2 (i.e., 2-valent) vertices can be smoothed. For example, the simple connected graph with two edges, e1 = {u,w} and e2 = {w,v}, has a vertex (namely w) that can be smoothed away, resulting in a single edge {u,v}. Determining whether for graphs G and H, H is homeomorphic to a subgraph of G, is an NP-complete problem. Barycentric subdivisions The barycentric subdivision subdivides each edge of the graph. This is a special subdivision, as it always results in a bipartite graph. This procedure can be repeated, so that the nth barycentric subdivision is the barycentric subdivision of the (n−1)st barycentric subdivision of the graph. The second such subdivision is always a simple graph. Embedding on a surface It is evident that subdividing a graph preserves planarity. Kuratowski's theorem states that a finite graph is planar if and o
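The subdivision and smoothing operations described above are easy to state as code. The Python sketch below (not from the article; graphs are just lists of edge tuples) subdivides the edge {u,v} through a new vertex w, then smooths w away, recovering the original edge:

```python
def subdivide(edges, edge, new_vertex):
    """Replace edge {u, v} with {u, w} and {w, v} through new vertex w."""
    u, v = edge
    out = [e for e in edges if set(e) != {u, v}]
    return out + [(u, new_vertex), (new_vertex, v)]

def smooth(edges, w):
    """Remove a degree-2 vertex w, joining its two neighbours directly."""
    incident = [e for e in edges if w in e]
    assert len(incident) == 2, "only degree-2 vertices can be smoothed"
    (a,) = [x for x in incident[0] if x != w]
    (b,) = [x for x in incident[1] if x != w]
    rest = [e for e in edges if w not in e]
    return rest + [(a, b)]

g = [("u", "v")]
g2 = subdivide(g, ("u", "v"), "w")   # edges u-w and w-v
g3 = smooth(g2, "w")                 # back to a single u-v edge
assert set(map(frozenset, g3)) == {frozenset(("u", "v"))}
```

This also illustrates why subdivision and smoothing are inverse operations, which is what makes homeomorphism (isomorphism after subdivisions) an equivalence relation on graphs.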
https://en.wikipedia.org/wiki/Stratus%20VOS
Stratus VOS (Virtual Operating System) is a proprietary operating system running on Stratus Technologies fault-tolerant computer systems. VOS is available on Stratus's ftServer and Continuum platforms. VOS customers use it to support high-volume transaction processing applications which require continuous availability. VOS is notable for being one of the few operating systems which run on fully lockstepped hardware. During the 1980s, an IBM version of Stratus VOS existed and was called the System/88 Operating System. History VOS was designed from its inception as a high-security transaction-processing environment tailored to fault-tolerant hardware. It incorporates much of the design experience that came out of the MIT/Bell-Laboratories/General-Electric (later Honeywell) Multics project. In 1984, Stratus added a UNIX System V implementation called Unix System Facilities (USF) to VOS, integrating Unix and VOS at the kernel level. In recent years, Stratus has added POSIX-compliance, and many open source packages can run on VOS. Like competing proprietary operating systems, VOS has seen its market share shrink steadily in the 1990s and early 2000s. Development Programming for VOS VOS provides compilers for PL/I, COBOL, Pascal, FORTRAN, C (with the VOS C and GCC compilers), and C++ (also GCC). Each of these programming languages can make VOS system calls (e.g. s$seq_read to read a record from a file), and has extensions to support varying-length strings in PL/I style. Developers typically code in their favourite VOS text editor, or offline, before compiling on the system; there are no VOS IDE applications. In its history, Stratus has offered hardware platforms based on the Motorola 68000 microprocessor family ("FT" and "XA" series), the Intel i860 microprocessor family ("XA/R" series), the HP PA-RISC processor family ("Continuum" series), and the Intel Xeon x86 processor family ("V Series"). All versions of VOS offer compilers targeted at the native instructi
https://en.wikipedia.org/wiki/Syllabical%20and%20Steganographical%20Table
Syllabical and Steganographical Table (French: Tableau syllabique et stéganographique) is an eighteenth-century cryptographical work by P. R. Wouves. Published by Benjamin Franklin Bache in 1797, it provided a method for representing pairs of letters by numbers. It may have been the first chart for cryptographic purposes to have been printed in the United States.
https://en.wikipedia.org/wiki/179th%20meridian%20west
The meridian 179° west of Greenwich is a line of longitude that extends from the North Pole across the Arctic Ocean, Asia, the Pacific Ocean, the Southern Ocean, and Antarctica to the South Pole. The 179th meridian west forms a great circle with the 1st meridian east.

From Pole to Pole
Starting at the North Pole and heading south to the South Pole, the 179th meridian west passes through:

{| class="wikitable plainrowheaders"
! scope="col" width="130" | Co-ordinates
! scope="col" width="110" | Country, territory or sea
! scope="col" | Notes
|-
| style="background:#b0e0e6;" |
! scope="row" style="background:#b0e0e6;" | Arctic Ocean
| style="background:#b0e0e6;" |
|-
|
! scope="row" |
| Chukotka Autonomous Okrug — Wrangel Island
|-
| style="background:#b0e0e6;" |
! scope="row" style="background:#b0e0e6;" | Chukchi Sea
| style="background:#b0e0e6;" |
|-
|
! scope="row" |
| Chukotka Autonomous Okrug
|- valign="top"
| style="background:#b0e0e6;" |
! scope="row" style="background:#b0e0e6;" | Bering Sea
| style="background:#b0e0e6;" | Passing just west of Gareloi Island, Alaska, (at )
Passing just east of Unalga Island, Alaska, (at )
Passing just west of Kavalga Island, Alaska, (at )
Passing just west of Ulak Island, Alaska, (at )
Passing just east of Amatignak Island, Alaska, (at )
|- valign="top"
| style="background:#b0e0e6;" |
! scope="row" style="background:#b0e0e6;" | Pacific Ocean
| style="background:#b0e0e6;" | Passing just east of the island of Qelelevu, (at )
|-
|
! scope="row" |
| Island of Vanua Balavu
|- valign="top"
| style="background:#b0e0e6;" |
! scope="row" style="background:#b0e0e6;" | Pacific Ocean
| style="background:#b0e0e6;" | Passing just east of the island of Mago, (at )
Passing just west of the island of Tuvuca, (at )
Passing just east of the island of Nayau, (at )
Passing just west of the island of Lakeba, (at )
Passing just west of the island of Vuaqava, (at )
Passing just west of the island of Kabara, (at )
Passing just west
https://en.wikipedia.org/wiki/QoI
Qo inhibitors (QoI), or quinone outside inhibitors, are a group of fungicides used in agriculture. Some of these fungicides are among the most popular in the world. QoIs are chemical compounds which act at the quinol outer binding site of the cytochrome bc1 complex. Most QoI common names end in -strobin and so are often called strobs. QoIs are the result of fusing three fungicide families: the well-known family of strobilurins and two newer families, represented by fenamidone and famoxadone. Some strobilurins are azoxystrobin, kresoxim-methyl, picoxystrobin, pyraclostrobin, and trifloxystrobin.

Usage
QoI fungicides are used on a wide range of crops, such as cereals, vines, pome fruits, cucurbits, tomatoes, and potatoes. For example, they are used as fungicides for cereals, against Erysiphe graminis f.sp tritici, responsible for powdery mildew in wheat, or against Septoria tritici, responsible for septoria leaf spot in wheat. They are also commonly used in vine culture, against Plasmopara viticola, responsible for downy mildew, or in oïdium treatment.

List
QoIs:
methoxy-acrylates: azoxystrobin, coumoxystrobin, enoxastrobin, flufenoxystrobin, picoxystrobin, pyraoxystrobin
methoxy-acetamides: mandestrobin
methoxy-carbamates: pyraclostrobin, pyrametostrobin, triclopyricarb
oximino-acetates: kresoxim-methyl, trifloxystrobin
oximino-acetamides: dimoxystrobin, fenaminstrobin, metominostrobin, orysastrobin
oxazolidine-diones: famoxadone
dihydro-dioxazines: fluoxastrobin
imidazolinones: fenamidone
benzyl-carbamates: pyribencarb
QoI subgroup A:
tetrazolinones: metyltetraprole

Resistance
Main group resistance
Almost all these fungicides are in the same cross-resistance group (FRAC 11) and must be managed carefully to avoid the appearance of fungicide resistance. All group 11s are cross-resistant with each other. Some fungicide resistance has been observed in many crop pathogens (such as in the case of wheat powdery mildew),
https://en.wikipedia.org/wiki/Chemical%20field-effect%20transistor
A ChemFET is a chemically-sensitive field-effect transistor; that is, a field-effect transistor used as a sensor for measuring chemical concentrations in solution. When the target analyte concentration changes, the current through the transistor will change accordingly. Here, the analyte solution separates the source and gate electrodes. A concentration gradient between the solution and the gate electrode arises due to a semi-permeable membrane on the FET surface containing receptor moieties that preferentially bind the target analyte. This concentration gradient of charged analyte ions creates a chemical potential between the source and gate, which is in turn measured by the FET.

Construction
A ChemFET's source and drain are constructed as for an ISFET, with the gate electrode separated from the source electrode by a solution. The gate electrode's interface with the solution is a semi-permeable membrane containing the receptors, and a gap to allow the substance under test to come in contact with the sensitive receptor moieties. A ChemFET's threshold voltage depends on the concentration gradient between the analyte in solution and the analyte in contact with its receptor-embedded semi-permeable barrier. Often, ionophores are used to facilitate analyte ion mobility through the substrate to the receptor. For example, when targeting anions, quaternary ammonium salts (such as tetraoctylammonium bromide) are used to provide cationic nature to the membrane, facilitating anion mobility through the substrate to the receptor moieties.

Applications
ChemFETs can be utilized in either liquid or gas phase to detect target analyte, requiring reversible binding of analyte with a receptor located in the gate electrode membrane. There is a wide range of applications of ChemFETs, including most notably anion or cation selective sensing. More work has been done with cation-sensing ChemFETs than anion-sensing ChemFETs. Anion-sensing is more complicated than cation-sensing in ChemF
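The dependence of threshold voltage on analyte activity is often modeled, for an ideal ion-selective FET, as Nernstian. The sketch below is an illustration under that assumption and is not taken from the article; the constants and functions are mine:

```python
import math

R = 8.314      # gas constant, J/(mol*K)
F = 96485.0    # Faraday constant, C/mol

def nernst_slope(T=298.15, z=1):
    """Ideal threshold-voltage change per decade of ion activity, in volts,
    for an ion of charge z at temperature T (Nernstian response assumed)."""
    return (R * T) / (z * F) * math.log(10)

def threshold_shift(activity_ratio, T=298.15, z=1):
    """Shift in threshold voltage for a given ratio of analyte activities."""
    return (R * T) / (z * F) * math.log(activity_ratio)

print(round(nernst_slope() * 1000, 1))        # ~59.2 mV per decade at 25 degC
print(round(threshold_shift(100) * 1000, 1))  # two decades: ~118.3 mV
```

Real ChemFET membranes typically show sub-Nernstian slopes, so this is an upper bound rather than a calibration.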
https://en.wikipedia.org/wiki/Po-Shen%20Loh
Po-Shen Loh (born June 18, 1982) is an American professor of mathematics at Carnegie Mellon University, specializing in combinatorics, and formerly served as the national coach of the United States' International Math Olympiad team. He is the founder of educational websites Expii and Live, and lead developer of contact-tracing app NOVID.

Early life and education
Loh was born on June 18, 1982 in Madison, Wisconsin to Singaporean immigrants Wei-Yin and Theresa Loh. As a middle school student, Loh twice represented Wisconsin in the Mathcounts competition. He attended James Madison Memorial High School, and in 1999 won a silver medal representing the US in the International Mathematical Olympiad (IMO). Loh studied mathematics as an undergraduate student at the California Institute of Technology. In 2003, he won a Goldwater Scholarship. In 2004 he graduated with honors, ranked first in his graduating class, and his undergraduate thesis received an honorable mention for the 2004 Morgan Prize. Loh completed a one-year master's degree at Cambridge University on a Churchill Scholarship. Loh pursued graduate studies in mathematics at Princeton University with the support of a Hertz Fellowship, and, under the supervision of Benny Sudakov, received a Ph.D. in 2010 with his dissertation Results in extremal and probabilistic combinatorics.

Career
Teaching and coaching
Loh's math coaching career started in 2002 when he first served as an assistant coach at the US national IMO training camp, the Mathematical Olympiad Summer Program (MOSP). In 2010, Loh was appointed deputy leader of Team USA for the IMO, and in 2014 he was appointed leader and national coach. Under his coaching, the team won the competition in 2015, 2016, 2018, and 2019—their first victories since 1994. He left his positions as MOP director and IMO team leader in 2023. Loh has been a professor at Carnegie Mellon University since 2010, where he teaches courses on discrete mathematics and extremal com
https://en.wikipedia.org/wiki/Mary%20Celine%20Fasenmyer
Mary Celine Fasenmyer, RSM (October 4, 1906, Crown, Pennsylvania – December 27, 1996, Erie, Pennsylvania) was an American mathematician and Catholic religious sister. She is most noted for her work on hypergeometric functions and linear algebra. Biography Fasenmyer grew up in Pennsylvania's oil country, and displayed mathematical talent in high school. For ten years after her graduation she taught and studied at Mercyhurst College in Erie, where she joined the Sisters of Mercy. She pursued her mathematical studies in Pittsburgh and the University of Michigan, obtaining her doctorate in 1946 under the direction of Earl Rainville, with a dissertation entitled Some Generalized Hypergeometric Polynomials. After earning her Ph.D., Fasenmyer published two papers which expanded on her doctorate work. These would be further elaborated by Doron Zeilberger and Herbert Wilf into "WZ theory", which allowed computerized proof of many combinatorial identities. After this, she returned to Mercyhurst to teach and did not engage in further research. Fasenmyer died in 1996. "Sister Celine's" method Fasenmyer is most remembered for the method that bears her name, first described in her Ph.D. thesis concerning recurrence relations in hypergeometric series. The thesis demonstrated a purely algorithmic method to find recurrence relations satisfied by sums of terms of a hypergeometric polynomial, and requires only the series expansions of the polynomial. The beauty of her method is that it lends itself readily to computer automation. The work of Wilf and Zeilberger generalized the algorithm and established its correctness. The hypergeometric polynomials she studied are called Sister Celine's polynomials.
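As a hedged illustration of the kind of result her method produces (this specific example is not from the article, but is a standard one in the literature on the Fasenmyer–Wilf–Zeilberger line of work), one can verify numerically that the hypergeometric sum f(n) = Σk C(n,k)² satisfies a two-term recurrence:

```python
from math import comb

def f(n):
    """f(n) = sum over k of C(n,k)^2, a typical hypergeometric sum.
    (It equals the central binomial coefficient C(2n, n).)"""
    return sum(comb(n, k) ** 2 for k in range(n + 1))

# A recurrence of the kind Sister Celine's method finds algorithmically:
#   (n + 1) * f(n + 1) = 2 * (2n + 1) * f(n)
for n in range(20):
    assert (n + 1) * f(n + 1) == 2 * (2 * n + 1) * f(n)
print("recurrence holds for n = 0..19")
```

The point of the method is that such recurrences can be found mechanically from the series expansion alone, which is why it automated so readily.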
https://en.wikipedia.org/wiki/Stokes%20parameters
The Stokes parameters are a set of values that describe the polarization state of electromagnetic radiation. They were defined by George Gabriel Stokes in 1852 (S. Chandrasekhar, Radiative Transfer, Dover Publications, New York, 1960, page 25) as a mathematically convenient alternative to the more common description of incoherent or partially polarized radiation in terms of its total intensity (I), (fractional) degree of polarization (p), and the shape parameters of the polarization ellipse. The effect of an optical system on the polarization of light can be determined by constructing the Stokes vector for the input light and applying Mueller calculus, to obtain the Stokes vector of the light leaving the system. The original Stokes paper was discovered independently by Francis Perrin in 1942 and by Subrahmanyan Chandrasekhar in 1947 (Chandrasekhar, S. (1947). The transfer of radiation in stellar atmospheres. Bulletin of the American Mathematical Society, 53(7), 641–711), who named them the Stokes parameters.

Definitions
The relationship of the Stokes parameters S0, S1, S2, S3 to intensity and polarization ellipse parameters is:

S0 = I
S1 = I p cos 2ψ cos 2χ
S2 = I p sin 2ψ cos 2χ
S3 = I p sin 2χ

Here I p, 2ψ and 2χ are the spherical coordinates of the three-dimensional vector of cartesian coordinates (S1, S2, S3). I is the total intensity of the beam, and p is the degree of polarization, constrained by 0 ≤ p ≤ 1. The factor of two before ψ represents the fact that any polarization ellipse is indistinguishable from one rotated by 180°, while the factor of two before χ indicates that an ellipse is indistinguishable from one with the semi-axis lengths swapped accompanied by a 90° rotation. The phase information of the polarized light is not recorded in the Stokes parameters. The four Stokes parameters are sometimes denoted I, Q, U and V, respectively. Given the Stokes parameters, one can solve for the spherical coordinates with the following equations:

I = S0
p = √(S1² + S2² + S3²) / S0
2ψ = arctan(S2 / S1)
2χ = arctan(S3 / √(S1² + S2²))

Stokes vectors
The Stokes parameters are oft
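Assuming the standard parameterization (I p as the radius and 2ψ, 2χ as the angles of the spherical coordinates), the forward map and the recovery of p can be sketched in a few lines; this is an illustrative sketch, not code from any reference implementation:

```python
import math

def stokes_from_ellipse(I, p, psi, chi):
    """Stokes vector (S0, S1, S2, S3) from total intensity I, degree of
    polarization p, and polarization-ellipse angles psi (orientation)
    and chi (ellipticity), both in radians."""
    S0 = I
    S1 = I * p * math.cos(2 * psi) * math.cos(2 * chi)
    S2 = I * p * math.sin(2 * psi) * math.cos(2 * chi)
    S3 = I * p * math.sin(2 * chi)
    return S0, S1, S2, S3

def degree_of_polarization(S0, S1, S2, S3):
    """Recover p = sqrt(S1^2 + S2^2 + S3^2) / S0."""
    return math.sqrt(S1 ** 2 + S2 ** 2 + S3 ** 2) / S0

S = stokes_from_ellipse(I=1.0, p=0.8, psi=math.radians(30), chi=math.radians(10))
print(round(degree_of_polarization(*S), 6))  # 0.8 -- p is recovered exactly
```

The recovery works because (S1, S2, S3) has Euclidean length I p by construction, mirroring the spherical-coordinate relations above.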
https://en.wikipedia.org/wiki/Phoslock
Phoslock is the commercial name for a bentonite clay in which the sodium and/or calcium ions are exchanged for lanthanum. The lanthanum contained within Phoslock reacts with phosphate to form an inert mineral known as rhabdophane (LaPO4·nH2O). Phoslock is used in lake restoration projects to remove excess phosphorus from aquatic systems, thereby improving water quality and inducing biological recovery in impaired freshwater systems. It was developed in Australia by the CSIRO in the late 1990s by Dr Grant Douglas (US Patent 6350383) as a way of utilising the ability of lanthanum to bind phosphate in natural freshwater aquatic systems. The first large-scale trial took place in January 2000 in the Canning River, Western Australia. During its development, patenting and commercialisation by CSIRO and subsequent commercial production, Phoslock has been the subject of academic research and has been used globally in lake restoration projects. The largest number of whole-lake applications, and the most comprehensive pre- and post-application monitoring, has taken place in Europe, primarily Germany (where it is sold under the tradename Bentophos), the Netherlands and the UK. There are studies indicating that lanthanum release due to application of this clay could lead to increased concentrations of this rare element in water and soils, resulting in bioaccumulation in animal tissues, and there are still concerns and precautions to be taken, as currently there is not enough complete and independent information.

See also
Eutrophication
Harmful algal bloom
https://en.wikipedia.org/wiki/Aliens%3A%20The%20Computer%20Game%20%28US%20Version%29
Aliens: The Computer Game is a 1986 video game developed and published by Activision for the Commodore 64 and Apple II, based on the film of the same title. As Activision's UK subsidiary Electric Dreams Software had independently released its own version of the game with the same title, the game was renamed for European release. Initially planned to be released as Aliens: The Second Part, it was finally published under the title Aliens: US Version, with ports for the Amstrad CPC and ZX Spectrum produced by Mr Micro.

Gameplay
Aliens is a series of six minigames strung together via graphical interactive sequences, akin to an adventure game, though the only interaction possible is advancing the dialog, displayed in speech balloons. The minigames are mostly action sequences that involve piloting a ship from the Sulaco to the planet's surface, recognizing equipment, and fighting aliens.

Reception
At the time of its release, the game received mixed reviews, including scores of 85% from Commodore Format, 8/10 (averaged) from Computer and Video Games, 45% from Crash, 5/10 from Sinclair User, 9/10 from Your Sinclair, and 60% from Zzap!64. Info gave the Commodore 64 version four stars out of five: "The aliens are appropriately creepy, and each sequence is well done & plays quite differently from the others". In retrospective coverage, VentureBeat's Stephen Kleckner commented in a 2014 feature that "as with a lot of compilation-designed titles, Aliens falls into that trap of being a collection of mediocre experiences instead of a game with a singular focus. […] Hardcore fans who own a Commodore 64 should load this one up. Everyone else isn't missing much that a Let's Play video won't provide." On the other hand, Chris Cummins of Topless Robot wrote in 2010 that "the now-crude graphics aside, it's still arguably the best game based on any of the films in the Alien saga."

Reviews
Isaac Asimov's Science Fiction Magazine v11 n8 (1987 08)

See also
List of Alien, Predator and Alien vs. Pr
https://en.wikipedia.org/wiki/Epistasis
Epistasis is a phenomenon in genetics in which the effect of a gene mutation is dependent on the presence or absence of mutations in one or more other genes, respectively termed modifier genes. In other words, the effect of the mutation is dependent on the genetic background in which it appears. Epistatic mutations therefore have different effects on their own than when they occur together. Originally, the term epistasis specifically meant that the effect of a gene variant is masked by that of a different gene. The concept of epistasis originated in genetics in 1907 but is now used in biochemistry, computational biology and evolutionary biology. The phenomenon arises due to interactions, either between genes (such as mutations also being needed in regulators of gene expression) or within them (multiple mutations being needed before the gene loses function), leading to non-linear effects. Epistasis has a great influence on the shape of evolutionary landscapes, which leads to profound consequences for evolution and for the evolvability of phenotypic traits. History Understanding of epistasis has changed considerably through the history of genetics and so too has the use of the term. The term was first used by William Bateson and his collaborators Florence Durham and Muriel Wheldale Onslow. In early models of natural selection devised in the early 20th century, each gene was considered to make its own characteristic contribution to fitness, against an average background of other genes. Some introductory courses still teach population genetics this way. Because of the way that the science of population genetics was developed, evolutionary geneticists have tended to think of epistasis as the exception. However, in general, the expression of any one allele depends in a complicated way on many other alleles. In classical genetics, if genes A and B are mutated, and each mutation by itself produces a unique phenotype but the two mutations together show the same phenotype
https://en.wikipedia.org/wiki/Hybrid%20cell%20line
A hybrid cell line is a fusion of cells from two different cell types. When the membranes of two cells merge, the nuclei combine to form a polykaryote (poly-, many; karyon, nucleus). These fusions can happen spontaneously, as in the case of tumor hybrid cells, or may be induced by a variety of laboratory techniques. The first instance of intentionally generated hybrid cells was described in 1960 by Barski, Sorieul, and Cornefert in their paper "Production of cells of a 'hybrid' nature in cultures in vitro of 2 cellular strains in combination," originally published in French. Today, one of the main purposes of generating hybrid cell lines is to fuse cells that secrete a useful product with an immortal cell line to maximize the secretions. For example, immunoglobulin-producing B lymphocytes can be fused with myeloma cells to produce an immortal line of cells, called a hybridoma, that secretes immunoglobulin. Hybrid cell lines are also used to study cancer and map genes.

Generating hybrid cells
The three most common methods of fusing cells to make a hybrid line are via oncogenic viruses, polyethylene glycol, or electrofusion.

Oncogenic viruses
Certain oncogenic viruses encode fusogens, which are proteins that encourage two cell membranes to fuse. When the viral genes are expressed by the host cell, the membranes of two cells may fuse. Some oncogenic viruses that code for fusogens and are capable of hybridizing cells are:
Epstein-Barr virus
HPV
Hepatitis B and Hepatitis C
Human T-cell lymphotropic virus type 1 (HTLV-1)
The virus used in laboratories for controlled cell hybridization is inactivated Sendai virus.

Polyethylene glycol
Using polyethylene glycol is the method for inducing cell hybridization that requires the fewest steps. Polyethylene glycol functions by changing the direction and configuration of lipid molecules in the cell membrane, which increases their permeability and allows two membranes to fuse. Because of the low specificity of polyethylene
https://en.wikipedia.org/wiki/L-function
In mathematics, an L-function is a meromorphic function on the complex plane, associated to one out of several categories of mathematical objects. An L-series is a Dirichlet series, usually convergent on a half-plane, that may give rise to an L-function via analytic continuation. The Riemann zeta function is an example of an L-function, and one important conjecture involving L-functions is the Riemann hypothesis and its generalization. The theory of L-functions has become a very substantial, and still largely conjectural, part of contemporary analytic number theory. In it, broad generalisations of the Riemann zeta function and the L-series for a Dirichlet character are constructed, and their general properties, in most cases still out of reach of proof, are set out in a systematic way. Because of the Euler product formula there is a deep connection between L-functions and the theory of prime numbers. The mathematical field that studies L-functions is sometimes called analytic theory of L-functions. Construction We distinguish at the outset between the L-series, an infinite series representation (for example the Dirichlet series for the Riemann zeta function), and the L-function, the function in the complex plane that is its analytic continuation. The general constructions start with an L-series, defined first as a Dirichlet series, and then by an expansion as an Euler product indexed by prime numbers. Estimates are required to prove that this converges in some right half-plane of the complex numbers. Then one asks whether the function so defined can be analytically continued to the rest of the complex plane (perhaps with some poles). It is this (conjectural) meromorphic continuation to the complex plane which is called an L-function. In the classical cases, already, one knows that useful information is contained in the values and behaviour of the L-function at points where the series representation does not converge. The general term L-function here includes
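The distinction drawn above between the L-series and the L-function, and the Euler-product connection to primes, can be illustrated numerically (an illustrative sketch, not from the article) with the Riemann zeta function at s = 2, where both the Dirichlet series and a truncated Euler product approximate ζ(2) = π²/6:

```python
import math

def zeta_series(s, terms=200_000):
    """Partial Dirichlet series sum_{n=1}^{terms} n^(-s) for the zeta function."""
    return sum(n ** -s for n in range(1, terms + 1))

def zeta_euler(s, prime_bound=1000):
    """Partial Euler product over primes p <= prime_bound of 1 / (1 - p^(-s))."""
    sieve = [True] * (prime_bound + 1)   # simple sieve of Eratosthenes
    sieve[0:2] = [False, False]
    for i in range(2, int(prime_bound ** 0.5) + 1):
        if sieve[i]:
            sieve[i * i::i] = [False] * len(sieve[i * i::i])
    product = 1.0
    for p, is_prime in enumerate(sieve):
        if is_prime:
            product *= 1.0 / (1.0 - p ** -s)
    return product

exact = math.pi ** 2 / 6                     # zeta(2), the Basel problem
print(abs(zeta_series(2) - exact) < 1e-4)    # True
print(abs(zeta_euler(2) - exact) < 1e-3)     # True
```

Both truncations converge to the same value, which is the concrete content of the Euler product formula mentioned above; the analytic continuation beyond the half-plane of convergence is what the term "L-function" adds.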
https://en.wikipedia.org/wiki/DNA%20fragmentation
DNA fragmentation is the separation or breaking of DNA strands into pieces. It can be done intentionally by laboratory personnel or by cells, or can occur spontaneously. Spontaneous or accidental DNA fragmentation is fragmentation that gradually accumulates in a cell. It can be measured by, e.g., the Comet assay or the TUNEL assay. Its main unit of measurement is the DNA Fragmentation Index (DFI). A DFI of 20% or more significantly reduces the success rates after ICSI.

DNA fragmentation was first documented by Williamson in 1970 when he observed discrete oligomeric fragments occurring during cell death in primary neonatal liver cultures. He described the cytoplasmic DNA isolated from mouse liver cells after culture as characterized by DNA fragments with a molecular weight consisting of multiples of 135 kDa. This finding was consistent with the hypothesis that these DNA fragments were a specific degradation product of nuclear DNA.

Intentional DNA fragmentation is often necessary prior to library construction or subcloning for DNA sequences. A variety of methods involving the mechanical breakage of DNA have been employed where DNA is fragmented by laboratory personnel. Such methods include sonication, needle shear, nebulisation, point-sink shearing, and passage through a pressure cell. Restriction digest is the intentional laboratory breaking of DNA strands. It is an enzyme-based treatment used in biotechnology to cut DNA into smaller strands in order to study fragment length differences among individuals or for gene cloning. This method fragments DNA either by the simultaneous cleavage of both strands, or by generation of nicks on each strand of dsDNA to produce dsDNA breaks. Acoustic shearing involves the transmission of high-frequency acoustic energy waves to a DNA library. The transducer is bowl-shaped so that the waves converge at the target of interest. Nebulization forces DNA through a small hole in a nebulizer unit, which results in the formation
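The DFI mentioned above is simply the percentage of assayed cells showing fragmented DNA; a minimal illustrative computation (the counts are hypothetical, only the ~20% threshold comes from the text):

```python
def dna_fragmentation_index(fragmented, total):
    """DFI as a percentage: fragmented cells / total cells assayed * 100."""
    if total <= 0:
        raise ValueError("total must be positive")
    return 100.0 * fragmented / total

# Hypothetical assay: 55 of 200 cells show fragmentation.
dfi = dna_fragmentation_index(fragmented=55, total=200)
print(dfi)        # 27.5
print(dfi >= 20)  # True: above the ~20% level cited as reducing ICSI success
```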
https://en.wikipedia.org/wiki/Animal%20husbandry
Animal husbandry is the branch of agriculture concerned with animals that are raised for meat, fibre, milk, or other products. It includes day-to-day care, selective breeding, and the raising of livestock. Husbandry has a long history, starting with the Neolithic Revolution when animals were first domesticated, from around 13,000 BC onwards, predating farming of the first crops. By the time of early civilisations such as ancient Egypt, cattle, sheep, goats, and pigs were being raised on farms. Major changes took place in the Columbian exchange, when Old World livestock were brought to the New World, and then in the British Agricultural Revolution of the 18th century, when livestock breeds like the Dishley Longhorn cattle and Lincoln Longwool sheep were rapidly improved by agriculturalists, such as Robert Bakewell, to yield more meat, milk, and wool. A wide range of other species, such as horse, water buffalo, llama, rabbit, and guinea pig, are used as livestock in some parts of the world. Insect farming, as well as aquaculture of fish, molluscs, and crustaceans, is widespread. Modern animal husbandry relies on production systems adapted to the type of land available. Subsistence farming is being superseded by intensive animal farming in the more developed parts of the world, where, for example, beef cattle are kept in high-density feedlots, and thousands of chickens may be raised in broiler houses or batteries. On poorer soil, such as in uplands, animals are often kept more extensively and may be allowed to roam widely, foraging for themselves. Most livestock are herbivores, except for pigs and chickens which are omnivores. Ruminants like cattle and sheep are adapted to feed on grass; they can forage outdoors or may be fed entirely or in part on rations richer in energy and protein, such as pelleted cereals. Pigs and poultry cannot digest the cellulose in forage and require other high-protein foods. Etymology The verb to husband, meaning "to manage carefully," d
https://en.wikipedia.org/wiki/Sound%20recording%20copyright%20symbol
The sound recording copyright symbol or phonogram symbol, ℗ (the letter P in a circle), is the copyright symbol used to provide notice of copyright in a sound recording (phonogram) embodied in a phonorecord (LPs, audiotapes, cassette tapes, compact discs, etc.). It was first introduced in the Rome Convention for the Protection of Performers, Producers of Phonograms and Broadcasting Organisations. The United States added it to its copyright law as part of its adherence to the Geneva Phonograms Convention in 17 U.S.C. § 402, the codification of the Copyright Act of 1976. The letter P in ℗ stands for phonogram, the legal term used in most English-speaking countries to refer to works known in U.S. copyright law as "sound recordings". A sound recording has a separate copyright that is distinct from that of the underlying work (usually a musical work, expressible in musical notation and written lyrics), if any. The sound recording copyright notice extends to a copyright for just the sound itself and will not apply to any other rendition or version, even if performed by the same artist(s).

International treaties
The symbol first appeared in the Rome Convention for the Protection of Performers, Producers of Phonograms and Broadcasting Organisations, a multilateral treaty relating to copyright, in 1961. Article 11 of the Rome Convention provided: When the Geneva Phonograms Convention, another multilateral copyright treaty, was signed in 1971, it included a similar provision in its Article 5:

United States law
The symbol was introduced into United States copyright law in 1971, when the US extended limited copyright protection to sound recordings. The United States anticipated signing onto the Geneva Phonograms Convention, which it had helped draft. On October 15, 1971, Congress enacted the Sound Recording Act of 1971, also known as the Sound Recording Amendment of 1971, which amended the 1909 Copyright Act by adding protection for sound recordings and prescribed a copyright n
https://en.wikipedia.org/wiki/Gr%C3%B6bner%20fan
In computer algebra, the Gröbner fan of an ideal in the ring of polynomials is a concept in the theory of Gröbner bases. It is defined to be a fan consisting of cones that correspond to different monomial orders on that ideal. The concept was introduced by Mora and Robbiano in 1988. The result is a weaker version of the result presented in the same issue of the journal by Bayer and Morrison. The Gröbner fan is a foundation of the now-active field of tropical geometry. One implementation of the Gröbner fan is called Gfan, based on an article by Fukuda et al., and is included in some computer algebra systems such as Singular, Macaulay2, and CoCoA.

See also
Gröbner basis
Tropical geometry
https://en.wikipedia.org/wiki/Thermal%20management%20%28electronics%29
All electronic devices and circuitry generate excess heat and thus require thermal management to improve reliability and prevent premature failure. The amount of heat output is equal to the power input, if there are no other energy interactions. There are several techniques for cooling, including various styles of heat sinks, thermoelectric coolers, forced air systems and fans, heat pipes, and others. In cases of extremely low environmental temperatures, it may actually be necessary to heat the electronic components to achieve satisfactory operation.

Overview
Thermal resistance of devices
This is usually quoted as the thermal resistance from junction to case of the semiconductor device. The units are °C/W. For example, a heatsink rated at 10 °C/W will get 10 °C hotter than the surrounding air when it dissipates 1 watt of heat. Thus, a heatsink with a low °C/W value is more efficient than a heatsink with a high °C/W value. Given two semiconductor devices in the same package, a lower junction-to-case thermal resistance (RθJ-C) indicates a more efficient device. However, when comparing two devices with different die-free package thermal resistances (e.g. DirectFET MT vs wirebond 5x6mm PQFN), their junction-to-ambient or junction-to-case resistance values may not correlate directly to their comparative efficiencies. Different semiconductor packages may have different die orientations, different copper (or other metal) mass surrounding the die, different die attach mechanics, and different molding thickness, all of which could yield significantly different junction-to-case or junction-to-ambient resistance values, and could thus obscure overall efficiency numbers.

Thermal time constants
A heatsink's thermal mass can be considered as a capacitor (storing heat instead of charge) and the thermal resistance as an electrical resistance (giving a measure of how fast stored heat can be dissipated). Together, these two components form a thermal RC circuit with an associated time co
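Because thermal resistances in °C/W add in series like electrical resistors, the steady-state junction temperature follows directly from the dissipated power and the resistance chain. A minimal sketch with hypothetical component values (none of these numbers are from the text):

```python
def junction_temperature(power_w, t_ambient_c, r_jc, r_cs, r_sa):
    """Steady-state junction temperature (degC) for a series thermal chain:
    junction->case (r_jc), case->sink (r_cs, e.g. interface material),
    and sink->ambient (r_sa), all in degC/W."""
    r_total = r_jc + r_cs + r_sa
    return t_ambient_c + power_w * r_total

# Hypothetical example: 5 W device, 2 degC/W junction-to-case,
# 0.5 degC/W interface material, 10 degC/W heatsink, 25 degC ambient.
tj = junction_temperature(5.0, 25.0, r_jc=2.0, r_cs=0.5, r_sa=10.0)
print(tj)  # 87.5 (degC)
```

Comparing tj against the device's maximum rated junction temperature is the usual first check when sizing a heatsink.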
https://en.wikipedia.org/wiki/Gas%20chromatography%E2%80%93vacuum%20ultraviolet%20spectroscopy
Gas chromatography–vacuum ultraviolet spectroscopy (GC-VUV) is a universal detection technique for gas chromatography. VUV detection provides both qualitative and quantitative spectral information for most gas-phase compounds. GC-VUV spectral data are three-dimensional (time, absorbance, wavelength) and specific to chemical structure. Nearly all compounds absorb in the VUV region of the electromagnetic spectrum, with the exception of the carrier gases hydrogen, helium, and argon. The high-energy, short-wavelength VUV photons probe ground-state to excited-state electronic transitions in almost all chemical bonds. The result is spectral "fingerprints" that are specific to individual compound structure and can be readily identified against a VUV library. Unique VUV spectra enable closely related compounds, such as structural isomers, to be clearly differentiated. VUV detectors complement mass spectrometry, which struggles to characterize constitutional isomers and compounds with low-mass quantitation ions. VUV spectra can also be used to deconvolve analyte co-elution, resulting in an accurate quantitative representation of each analyte's contribution to the original response. This capability lends itself to significantly reducing GC runtimes through flow rate-enhanced chromatographic compression. VUV spectroscopy follows the simple linear relationship between absorbance and concentration described by the Beer-Lambert law, resulting in more accurate retention time-based identification. VUV absorbance spectra also exhibit feature similarity within compound classes, meaning VUV detectors can rapidly perform compound-class characterization of complex samples using spectral shape and retention index information. Advances in technology have reduced the typical group-analysis data processing time from 15–30 minutes to <1 minute per sample. History The first benchtop detector was introduced in 2014 with detection capabilities between 120 - 240 nm. This porti
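The Beer-Lambert law mentioned above is the basis of VUV quantitation: absorbance A = ε·l·c, where ε is the molar absorptivity, l the path length, and c the concentration. A minimal sketch, with illustrative (not instrument-derived) values:

```python
# Beer-Lambert law: absorbance is linear in concentration.
# epsilon: molar absorptivity (L mol^-1 cm^-1), path_cm: path length (cm),
# conc: concentration (mol/L). Values are illustrative assumptions.

def absorbance(epsilon, path_cm, conc_mol_per_l):
    return epsilon * path_cm * conc_mol_per_l

def concentration(a, epsilon, path_cm):
    # Invert the linear law to quantify an analyte from measured absorbance.
    return a / (epsilon * path_cm)

a = absorbance(1.0e4, 1.0, 2.0e-5)   # ~0.2 absorbance units
print(concentration(a, 1.0e4, 1.0))  # recovers ~2.0e-5 mol/L
```

The linearity is what allows a single calibration slope per compound and makes co-elution deconvolution a matter of fitting a sum of scaled reference spectra.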
https://en.wikipedia.org/wiki/Negroid
Negroid (less commonly called Congoid) is an obsolete racial grouping of various people indigenous to Africa south of the area which stretched from the southern Sahara desert in the west to the African Great Lakes in the southeast, but also to isolated parts of South and Southeast Asia (Negritos). The term is derived from now-disproven conceptions of race as a biological category. The concept of dividing humankind into three races called Caucasoid, Mongoloid, and Negroid (originally named "Ethiopian") was introduced in the 1780s by members of the Göttingen School of History and further developed by Western scholars in the context of "racist ideologies" during the age of colonialism. With the rise of modern genetics, the concept of distinct human races in a biological sense has become obsolete. In 2019, the American Association of Biological Anthropologists stated: "Race does not provide an accurate representation of human biological variation. It was never accurate in the past, and it remains inaccurate when referencing contemporary human populations." Etymology Negroid has Portuguese or Spanish and Ancient Greek etymological roots. It literally translates as "black resemblance", from the Portuguese and Spanish word negro (black), from Latin nigrum, and Greek οειδές -oeidēs, equivalent to -o- + είδες -eidēs "having the appearance of", derivative of είδος eîdos "appearance". The earliest recorded use of the term "Negroid" came in 1859. History of the concept Origins Johann Friedrich Blumenbach, a scholar at the then modern Göttingen University, developed a concept dividing mankind into five races in the revised 1795 edition of his De generis humani varietate nativa (On the Natural Variety of Mankind). Although Blumenbach's concept later gave rise to scientific racism, his arguments were basically anti-racist, since he underlined that mankind as a whole forms one single species and pointed out that the transition from one race to another is so gradual that the di
https://en.wikipedia.org/wiki/Audio%20analyzer
An audio analyzer is a test and measurement instrument used to objectively quantify the audio performance of electronic and electro-acoustical devices. Audio quality metrics cover a wide variety of parameters, including level, gain, noise, harmonic and intermodulation distortion, frequency response, relative phase of signals, interchannel crosstalk, and more. In addition, many manufacturers have requirements for behavior and connectivity of audio devices that require specific tests and confirmations. Audio analysis requires that the device under test receive a stimulus signal of known characteristics, with which the output signal (response) may be compared by the analyzer in order to determine differences expressed in the specific measurements. This signal may be generated or controlled by the analyzer itself or may come from another source (e.g., a recording) as long as characteristics relative to the desired measurement are defined. As test and measurement equipment, audio analyzers are required to provide performance well beyond that of the typical devices under test (DUTs). High quality audio analyzers must demonstrate vanishingly low levels of noise, distortion and interference in order to be deemed worthwhile, and must do so consistently and reliably to be trusted by engineers and designers. For example, while a commercial CD player can achieve a total harmonic distortion plus noise (THD+N) ratio of approximately −98 dB at 1 kHz, a high quality audio analyzer may exhibit THD+N as low as −121 dB (this is the specified typical performance of the Audio Precision APx555). Audio analyzers are used in both development and production of products. A design engineer will find it very useful when understanding and refining product performance, while a production engineer will wish to perform tests to rapidly confirm that units meet specifications. Very often audio analyzers are optimized for one of these two cases. Current popular audio analyzer models include: APx5
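The THD+N figures quoted above are amplitude ratios expressed in decibels, converted as ratio = 10^(dB/20). A short sketch showing what the −98 dB (CD player) and −121 dB (analyzer) figures from the text mean as percentages:

```python
import math

# Amplitude-ratio <-> dB conversions as used for THD+N specifications.

def db_to_ratio(db):
    """Convert a dB figure to a linear amplitude ratio."""
    return 10.0 ** (db / 20.0)

def ratio_to_db(ratio):
    """Convert a linear amplitude ratio back to dB."""
    return 20.0 * math.log10(ratio)

print(100.0 * db_to_ratio(-98.0))   # ~0.00126 % THD+N (CD player example)
print(100.0 * db_to_ratio(-121.0))  # ~0.000089 % THD+N (analyzer example)
```

The roughly 23 dB margin between the DUT and the analyzer is what lets the instrument's own residual distortion be neglected in the measurement.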
https://en.wikipedia.org/wiki/Bouligand%20structure
A Bouligand structure is a layered and rotated microstructure resembling plywood, which is frequently found in naturally evolved materials. It consists of multiple lamellae, or layers, each one composed of aligned fibers. Adjacent lamellae are progressively rotated with respect to their neighbors. This structure enhances the mechanical properties of materials, especially fracture resistance, and provides strength and in-plane isotropy. It is found in various natural structures, including the cosmoid scale of the coelacanth and the dactyl club of the mantis shrimp and many other stomatopods. Due to its desirable mechanical properties, there are ongoing attempts to replicate Bouligand arrangements in the creation of failure-resistant bioinspired materials. For example, it has been shown that layered composites (such as CFRP) utilizing this structure have enhanced impact properties. However, replicating the structure on small length scales is challenging, and the development and advancement of manufacturing techniques continually improves the ability to replicate this desirable structure. Mechanical Properties Toughening Mechanisms The Bouligand structure found in many natural materials is credited with imparting a very high toughness and fracture resistance to the overall material it is a part of. The mechanisms by which this toughening occurs are many, and no single mechanism has yet been identified as the main source of the structure's toughness. Both computational work and physical experiments have been done to determine the pathways by which the structure resists fracture, so that tough synthetic Bouligand structures can be designed. Crack deflection of one form or another is considered the main toughening mechanism in the Bouligand structure. Deflection can take the form of crack tilting and crack bridging. In the former, the crack propagates along the direction of the fiber plane, at the interface with the matrix material. Once the energy re
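The "progressively rotated lamellae" geometry can be stated very compactly: each ply is rotated by a fixed pitch angle relative to its neighbor, so fiber directions sweep through a half-turn across the stack. A minimal sketch of an idealized stack (the ply count and pitch angle are arbitrary illustration values):

```python
# Idealized Bouligand lay-up: ply i has fiber orientation i * pitch (mod 180°),
# since a fiber direction is the same rotated by 180°.

def bouligand_angles(n_plies, pitch_deg):
    return [(i * pitch_deg) % 180 for i in range(n_plies)]

angles = bouligand_angles(10, 18.0)
print(angles)  # [0.0, 18.0, 36.0, ..., 162.0] — one half-turn over 10 plies
```

Because every in-plane direction is sampled roughly equally over a full pitch, the laminate's in-plane stiffness becomes nearly isotropic, which is one reason the motif is attractive for impact-resistant composites.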
https://en.wikipedia.org/wiki/Shifting%20bottleneck%20heuristic
The Shifting Bottleneck Heuristic is a procedure intended to minimize the time it takes to do work, or specifically, the makespan in a job shop. The makespan is defined as the amount of time, from start to finish, to complete a set of multi-machine jobs where machine order is pre-set for each job. Assuming that the jobs are actually competing for the same resources (machines), there will always be one or more resources that act as a 'bottleneck' in the processing. This heuristic, or 'rule of thumb', procedure minimises the effect of the bottleneck. The Shifting Bottleneck Heuristic is intended for job shops with a finite number of jobs and a finite number of machines. Uses The Shifting Bottleneck Heuristic is used in manufacturing and service industries that include job shops with constraints on the order in which the machines must be used for each job. A good example of a service industry that may use this technique is a hospital. The different areas within a hospital, such as physical examination, the X-ray booth, CAT scan, or surgery, could all be considered machines for this particular application. A precedence constraint in this context is when one machine must be used before another machine on any given job (or patient). These types of problems with multiple machines are known to be computationally very difficult. The processing time of each job on each machine is given (see chart on right for an example). Job j being performed on machine i is denoted ij. It is assumed that each machine can only work on one job at a time. The objective is to determine the schedule that will produce the shortest makespan. Procedure Make graph Determine starting makespan Determine optimal sequence for bottleneck machine (considering precedence constraints) Perform an iteration Solve lowest maximum lateness problem Include optimal sequence in graph Determine optimal sequences for remaining machines (considering precedence and machine constraints) Perform further iterations
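The bottleneck-identification step of the procedure above can be sketched very simply: of all machines, the one carrying the largest total workload is the natural first candidate to sequence. This is only the identification step, not the full heuristic (which re-solves a maximum-lateness subproblem per machine and iterates); the job data below are invented for illustration.

```python
# Simplified bottleneck identification for a toy job shop.
# Each job lists its (machine, processing_time) operations in required order.

jobs = {
    "J1": [("M1", 3), ("M2", 6)],
    "J2": [("M2", 5), ("M1", 2)],
    "J3": [("M2", 4), ("M1", 4)],
}

def machine_loads(jobs):
    """Total processing time demanded of each machine across all jobs."""
    loads = {}
    for ops in jobs.values():
        for machine, p in ops:
            loads[machine] = loads.get(machine, 0) + p
    return loads

loads = machine_loads(jobs)
bottleneck = max(loads, key=loads.get)
print(bottleneck, loads)  # M2 carries 15 time units vs. 9 on M1
```

In the full heuristic, M2's operations would then be sequenced by solving a single-machine maximum-lateness problem subject to the jobs' precedence constraints, that sequence fixed in the disjunctive graph, and the process repeated for the remaining machines.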
https://en.wikipedia.org/wiki/Manuel%20Belgrano
Manuel José Joaquín del Corazón de Jesús Belgrano y González (3 June 1770 – 20 June 1820), usually referred to as Manuel Belgrano (), was an Argentine public servant, economist, lawyer, politician, journalist, and military leader. He took part in the Argentine Wars of Independence and designed what became the flag of Argentina. Argentines regard him as one of the main Founding Fathers of the country. Belgrano was born in Buenos Aires, the fourth child of Italian businessman Domingo Belgrano y Peri and of María Josefa González Casero. He came into contact with the ideas of the Age of Enlightenment while at university in Spain around the time of the 1789 French Revolution. In 1794 he returned to the Viceroyalty of the Río de la Plata, where he became a notable member of the criollo population of Buenos Aires; he tried to promote some of the new political and economic ideals, but found severe resistance from local . This rejection led him to work towards a greater autonomy for his country from the Spanish colonial regime. At first he unsuccessfully promoted the aspirations of Carlota Joaquina to become a regent ruler for the Viceroyalty during the period when the French imprisoned the Spanish King Ferdinand VII during the Peninsular War (1807–1814). Belgrano favoured the May Revolution, which removed the viceroy Baltasar Hidalgo de Cisneros from power on 25 May 1810. He was elected as a voting member of the Primera Junta that took power after the ouster. As a delegate for the Junta, he led the ill-fated Paraguay campaign of 1810-1811. Belgrano's troops were beaten by Bernardo de Velasco at the battles of Paraguarí and Tacuarí. Though his army was defeated, the military campaign initiated the chain of events that led to the independence of Paraguay in May 1811. He retreated to the vicinity of Rosario, to fortify it against a possible royalist attack from the Eastern Band of the Uruguay River. While there, he developed the design of the flag of Argentina. The First Tri
https://en.wikipedia.org/wiki/Cleanroom%20software%20engineering
The cleanroom software engineering process is a software development process intended to produce software with a certifiable level of reliability. The central principles are software development based on formal methods, incremental implementation under statistical quality control, and statistically sound testing. History The cleanroom process was originally developed by Harlan Mills and several of his colleagues including Alan Hevner at IBM. The cleanroom process first saw use in the mid to late 1980s. Demonstration projects within the military began in the early 1990s. Recent work on the cleanroom process has examined fusing cleanroom with the automated verification capabilities provided by specifications expressed in CSP. Philosophy The focus of the cleanroom process is on defect prevention, rather than defect removal. The name "cleanroom" was chosen to evoke the cleanrooms used in the electronics industry to prevent the introduction of defects during the fabrication of semiconductors. Central principles The basic principles of the cleanroom process are Software development based on formal methods Software tool support based on some mathematical formalism includes model checking, process algebras, and Petri nets. The Box Structure Method might be one such means of specifying and designing a software product. Verification that the design correctly implements the specification is performed through team review, often with software tool support. Incremental implementation under statistical quality control Cleanroom development uses an iterative approach, in which the product is developed in increments that gradually increase the implemented functionality. The quality of each increment is measured against pre-established standards to verify that the development process is proceeding acceptably. A failure to meet quality standards results in the cessation of testing for the current increment, and a return to the design phase. Statistically sound testing Softw
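The "statistically sound testing" principle means test cases are drawn from a usage model, so test effort mirrors expected field use and reliability can be estimated statistically. A common way to express such a model is a Markov chain over user actions; the states and transition probabilities below are invented purely for illustration, not taken from any cleanroom project.

```python
import random

# Hypothetical usage model: from each state, the next user action is chosen
# with the probability it is expected to occur in the field.
usage_model = {
    "start":  [("browse", 0.7), ("search", 0.3)],
    "browse": [("search", 0.4), ("exit", 0.6)],
    "search": [("browse", 0.5), ("exit", 0.5)],
}

def draw_test_case(rng):
    """Sample one usage path (test case) from the Markov-chain usage model."""
    state, path = "start", []
    while state != "exit":
        r, acc = rng.random(), 0.0
        for nxt, p in usage_model[state]:
            acc += p
            if r < acc:
                state = nxt
                break
        path.append(state)
    return path

case = draw_test_case(random.Random(0))
print(case)  # a usage path ending in "exit"
```

Running many such sampled cases against an increment and recording failures is what lets cleanroom certify a reliability level rather than merely hunt for defects.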
https://en.wikipedia.org/wiki/PEG-PVA
Polyethylene glycol–polyvinyl alcohol (PEG-PVA) brand name Kollicoat IR (BASF) is a multifunctional excipient used as a pill binder as well as a wet binder. A typical formulation is composed of 25% polyethylene glycol (PEG) and 75% polyvinyl alcohol (PVA); where the vinyl alcohol moieties are grafted on a polyethylene glycol backbone. See also Polyvinylpolypyrrolidone
https://en.wikipedia.org/wiki/Barbary%20dove
The Barbary dove, ringed turtle dove, ringneck dove, ring-necked turtle dove, or ring dove (Streptopelia risoria) is a domestic member of the dove and pigeon family (Columbidae). Taxonomy and domestication Although the Barbary dove is normally assigned its own systematic name, as Streptopelia risoria, considerable doubt exists as to its appropriate classification. Some sources assert confidently that it is a domesticated form of the Eurasian collared dove (Streptopelia decaocto), but the majority of evidence points to it being a domesticated form of the African collared dove (Streptopelia roseogrisea). It appears that it can hybridize freely with either species, and its status as a species must therefore be regarded as doubtful. However, because of the wide use of both the common and systematic names, it is best to consider it separately from either of the parent species. Their time of domestication is also uncertain. While Linnaeus described them in 1756, they may have been imported into Italy from North Africa in the late 16th century. Behavior Barbary doves are easily kept and long-lived in captivity, living for up to 12 years. There have been cases of doves living over 20 years, and, in one case, of a dove living for 29 years. In recent years they have been used extensively in biological research, particularly into the hormonal bases of reproductive behaviour, because their sequences of courtship, mating and parental behaviour have been described accurately and are highly consistent in form. Dove fanciers have bred them in a great variety of colours; the number of colours available has increased dramatically in the latter half of the 20th century, and it is thought that this has been achieved by interbreeding with Streptopelia roseogrisea. Some of these doves carry a mutation that makes them completely white. These white Barbary doves are most commonly used in stage magic acts. White Barbary doves are also traditionally released in large public ceremonie
https://en.wikipedia.org/wiki/Comparison%20of%20file%20synchronization%20software
This is a list of file synchronization software for which there are Wikipedia articles. Free and open-source Freeware This is a comparison of the freeware (proprietary software release free of charge) file synchronization software. Commercial This is a comparison of commercial software in the field of file synchronization. These programs only provide full functionality with a payment. As indicated, some are trialware and provide functionality during a trial period; some are freemium, meaning that they have freeware editions. Glossary See also
https://en.wikipedia.org/wiki/Angband%20%28video%20game%29
Angband is a dungeon-crawling roguelike video game derived from Umoria. It is based on the writings of J. R. R. Tolkien, in which Angband is the fortress of Morgoth. The current version of Angband is available for all major operating systems, including Unix, Windows, Mac OS X, and Android. It is identified as one of the "major roguelikes" by John Harris. Angband is a free and open-source game released under the GNU GPLv2 or the Angband license. Gameplay The goal of Angband is to survive 100 floor levels of the fortress Angband in order to defeat Morgoth. The game is reputed to be extremely difficult. The player begins in a town where they can buy equipment before beginning the descent. Once in the maze-like fortress, the player encounters traps, monsters, equipment, and hidden doors. With the help of found objects and enchantments, the player's attack and defense power increases, and specific attacks can even be neutralised. The player also meets characters and finds artifacts from Tolkien's legendarium. Angband gameplay emphasises combat and careful resource management. The player has a certain amount of health points. Although Angband records the player's progress to a save file, it does not allow one to resume a saved game in which the player character has already been beaten. The levels are procedurally generated, allowing for a unique game in every play. Development The first version of Angband was created by Alex Cutler and Andy Astrand at the University of Warwick in 1990. They wanted to expand the game Umoria by adding items, monsters, and features. After Cutler and Astrand, the source code was maintained at the University of Warwick by Geoff Hill and Sean Marsh. They finally released the game to the public with the version named "2.4.frog_knows" on 11 April 1993. The game, which was previously confined to the University of Warwick, was then enhanced by others and widely ported to non-Unix platforms. Following their departure, the later principals of Angband have inclu
https://en.wikipedia.org/wiki/CompactFlash
CompactFlash (CF) is a flash memory mass storage device used mainly in portable electronic devices. The format was specified and the devices were first manufactured by SanDisk in 1994. CompactFlash became one of the most successful of the early memory card formats, surpassing Miniature Card and SmartMedia. Subsequent formats, such as MMC/SD, various Memory Stick formats, and xD-Picture Card offered stiff competition. Most of these cards are smaller than CompactFlash while offering comparable capacity and speed. Proprietary memory card formats for use in professional audio and video, such as P2 and SxS, are faster, but physically larger and more costly. CompactFlash's popularity is declining as CFexpress is taking over. As of 2022, both Canon and Nikon's newest high end cameras, e.g. the Canon EOS R5, Canon EOS R3, and Nikon Z 9 use CFexpress cards for the higher performance required to record 8K video. Traditional CompactFlash cards use the Parallel ATA interface, but in 2008, a variant of CompactFlash, CFast was announced. CFast (also known as CompactFast) is based on the Serial ATA interface. In November 2010, SanDisk, Sony and Nikon presented a next generation card format to the CompactFlash Association. The new format has a similar form factor to CF/CFast but is based on the PCI Express interface instead of Parallel ATA or Serial ATA. With potential read and write speeds of 1 Gbit/s (125 MB/s) and storage capabilities beyond 2 TiB, the new format is aimed at high-definition camcorders and high-resolution digital cameras, but the new cards are not backward compatible with either CompactFlash or CFast. The XQD card format was officially announced by the CompactFlash Association in December 2011. Description There are two main subdivisions of CF cards, 3.3 mm-thick type I and 5 mm-thick type II (CF2). The type II slot is used by miniature hard drives and some other devices, such as the Hasselblad CFV Digital Back for the Hasselblad series of medium format ca
https://en.wikipedia.org/wiki/Equinox%20Systems
Equinox Systems, Inc., was an American manufacturer of computer networking hardware and developer of networking software based in Miami, Florida, and active from 1983 to 2000. The company started out selling a well-regarded series of enterprise digital PBX systems for data transmissions in the early 1980s, before becoming a major vendor and OEM of modems, Ethernet network switches, and advanced serial communication cards in the 1990s. Equinox was eventually acquired by Avocent of Huntsville, Alabama, in 2000, who kept the company around as a subsidiary for several years. History Foundation (1983–1985) Equinox Systems was founded in Miami-Dade County, Florida, in 1983 by Bill Dambrackas, Mark Cole, and Kevin Doren. Dambrackas and Cole had previously worked for Racal-Milgo, a manufacturer of modems and other telecommunications equipment that had offices in South Miami-Dade. Racal-Milgo announced their intent to move 40 miles north to Broward County in 1982, to the chagrin of Dambrackas and Cole, who did not want to relocate their families in order to keep their jobs. In early 1983, they obtained $1.1 million in financial backing from TA Associates, a Boston-based investment firm, and in March 1983, they formally incorporated Equinox Systems, named so after the March equinox ongoing at the time of the company's foundation. The company was soon joined by eight other former employees of Racal-Milgo who also wanted to avoid moving northward, and by November 1983, Equinox had ten employees on its payroll. Equinox's first product, a data PBX, was released to market in early 1984, retailing for US$30,000. Data PBXes were a form of private branch exchange developed in the 1970s, which facilitated communications between data terminals and minicomputers and between personal computers and certain peripherals such as modems and printers. Data PBXes were once common in non-IBM shops, but they were prone to collisions and became antiquated in the early 1980s amid rapid developm
https://en.wikipedia.org/wiki/Trp%20operon
The trp operon is a group of genes that are transcribed together, encoding the enzymes that produce the amino acid tryptophan in bacteria. The trp operon was first characterized in Escherichia coli, and it has since been discovered in many other bacteria. The operon is regulated so that, when tryptophan is present in the environment, the genes for tryptophan synthesis are repressed. The trp operon contains five structural genes: trpE, trpD, trpC, trpB, and trpA, which encode the enzymes needed to synthesize tryptophan. Its expression is controlled by the repressor protein encoded by the separate regulator gene trpR. When tryptophan is present, the TrpR protein binds to the operator, blocking transcription of the trp operon by RNA polymerase. This operon is an example of repressible negative regulation of gene expression. The repressor protein binds to the operator in the presence of tryptophan (repressing transcription) and is released from the operon when tryptophan is absent (allowing transcription to proceed). The trp operon additionally uses attenuation to control expression of the operon, a second negative feedback control mechanism. The trp operon is well studied and is commonly used as an example of gene regulation in bacteria alongside the lac operon. Genes The trp operon contains five structural genes. The roles of their products are: TrpE (): Anthranilate synthase produces anthranilate. TrpD (): Cooperates with TrpE. TrpC (): The phosphoribosylanthranilate isomerase domain first turns N-(5-phospho-β-D-ribosyl)anthranilate into 1-(2-carboxyphenylamino)-1-deoxy-D-ribulose 5-phosphate. The indole-3-glycerol-phosphate synthase domain on the same protein then turns the product into (1S,2R)-1-C-(indol-3-yl)glycerol 3-phosphate. TrpA (), TrpB (): the two subunits of tryptophan synthetase, which combines TrpC's product with serine to produce tryptophan. Repression The operon operates by a negative repressible feedback mechanism. The repressor for the trp operon is produced upstream by the trpR gene, which is cons
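The repressible negative-control logic described above reduces to a simple rule: the operon is transcribed exactly when tryptophan is absent. A toy boolean sketch of just the TrpR repressor logic (real regulation additionally involves attenuation, which this deliberately omits):

```python
# Toy model of the trp operon's repressible negative control.
# TrpR can bind the operator only when tryptophan (its corepressor) is bound.

def trp_operon_transcribed(tryptophan_present: bool) -> bool:
    repressor_bound_to_operator = tryptophan_present
    return not repressor_bound_to_operator

print(trp_operon_transcribed(True))   # False: tryptophan present, operon off
print(trp_operon_transcribed(False))  # True: tryptophan absent, operon on
```

Contrast this with the inducible lac operon, where the small molecule (allolactose) inactivates the repressor and so turns transcription on rather than off.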
https://en.wikipedia.org/wiki/Cadmium%20poisoning
Cadmium is a naturally occurring toxic metal with common exposure in industrial workplaces, plant soils, and from smoking. Due to its low permissible exposure in humans, overexposure may occur even in situations where trace quantities of cadmium are found. Cadmium is used extensively in electroplating, although the nature of the operation does not generally lead to overexposure. Cadmium is also found in some industrial paints and may represent a hazard when sprayed. Operations involving removal of cadmium paints by scraping or blasting may pose a significant hazard. The primary use of cadmium is in the manufacturing of NiCd rechargeable batteries. The primary source for cadmium is as a byproduct of refining zinc metal. Exposures to cadmium are addressed in specific standards for the general industry, shipyard employment, the construction industry, and the agricultural industry. Signs and symptoms Acute Acute exposure to cadmium fumes may cause flu-like symptoms including chills, fever, and muscle ache sometimes referred to as "the cadmium blues." Symptoms may resolve after a week if there is no respiratory damage. More severe exposures can cause tracheobronchitis, pneumonitis, and pulmonary edema. Symptoms of inflammation may start hours after the exposure and include cough, dryness and irritation of the nose and throat, headache, dizziness, weakness, fever, chills, and chest pain. Chronic Complications of cadmium poisoning include cough, anemia, and kidney failure (possibly leading to death). Cadmium exposure increases one's chances of developing cancer. Similar to zinc, long-term exposure to cadmium fumes can cause lifelong anosmia. Bone and joints One of the main effects of cadmium poisoning is weak and brittle bones. The bones become soft (osteomalacia), lose bone mineral density (osteoporosis), and become weaker. This results in joint and back pain, and increases the risk of fractures. Spinal and leg pain is common, and a waddling gait often develops d
https://en.wikipedia.org/wiki/Smart%20Bitrate%20Control
Smart Bitrate Control, commonly referred to as SBC, was a technique for achieving greatly improved video compression efficiency using the DivX 3.11 Alpha video codec or Microsoft's proprietary MPEG4v2 video codec and the Nandub video encoder. SBC relied on two main technologies to achieve this improved efficiency: multipass encoding and Variable Keyframe Intervals (VKI). SBC ceased to be commonly used after XviD and DivX development progressed to a point where they incorporated the same features that SBC pioneered and could offer even more efficient video compression without the need for a specialized application. Files created by SBC are compatible with DivX 3.11 Alpha and can be decoded by most codecs that support ISO MPEG4 video. Technical details The DivX 3.11 Alpha codec allowed a user to control three aspects of the encoding process: the average bitrate, the keyframe interval, and whether the codec preserved smoother motion or more detailed images. DivX attempted to encode an entire movie at an average bitrate the user specified, varying the quality of the video in order to achieve the target bitrate. This meant that a simple section of video, such as a still image, would look very good, but complex video, such as an action scene, would look very bad. DivX's keyframe placement was also very simplistic: it would place keyframes only on the interval that the user selected, every 300 frames (10 seconds at 30 frame/s) by default. Nandub's multipass encoding encoded the video twice; in the first pass it would analyze the video (and write information to a log file), and in the second it would actually produce the output file. Instead of varying the image quality to achieve an average bitrate, this allowed SBC to vary the bitrate to achieve an average quality, using a higher bitrate for more complex scenes and a lower bitrate for simpler scenes. VKI would place keyframes only where needed, such as at scene changes, rather than at a fixed interval. This significantly improved b
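The two-pass idea above can be sketched as a simple budget allocation: the first pass measures per-scene complexity, and the second pass distributes the total bit budget in proportion to that complexity, so complex scenes get more bits while the overall average still hits the target. This is a toy model of the principle, not Nandub's actual allocation algorithm; all numbers are illustrative.

```python
# Toy two-pass bitrate allocation.
# "First pass": one complexity score per scene (stand-in for the log file).
complexities = [1.0, 4.0, 2.0, 1.0]

avg_bitrate = 900.0  # target average bitrate, kbit/s
budget = avg_bitrate * len(complexities)  # total budget across scenes

# "Second pass": allocate bits proportionally to measured complexity.
allocation = [budget * c / sum(complexities) for c in complexities]
print(allocation)  # [450.0, 1800.0, 900.0, 450.0]
```

The action scene (complexity 4.0) gets four times the bitrate of a still scene, yet the mean over all scenes is exactly the 900 kbit/s target, which is precisely the quality-for-bitrate trade SBC made over single-pass DivX.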
https://en.wikipedia.org/wiki/Geodetic%20effect
The geodetic effect (also known as geodetic precession, de Sitter precession or de Sitter effect) represents the effect of the curvature of spacetime, predicted by general relativity, on a vector carried along with an orbiting body. For example, the vector could be the angular momentum of a gyroscope orbiting the Earth, as carried out by the Gravity Probe B experiment. The geodetic effect was first predicted by Willem de Sitter in 1916, who provided relativistic corrections to the Earth–Moon system's motion. De Sitter's work was extended in 1918 by Jan Schouten and in 1920 by Adriaan Fokker. It can also be applied to a particular secular precession of astronomical orbits, equivalent to the rotation of the Laplace–Runge–Lenz vector. The term geodetic effect has two slightly different meanings, as the moving body may be spinning or non-spinning. Non-spinning bodies move in geodesics, whereas spinning bodies move in slightly different orbits. The difference between de Sitter precession and Lense–Thirring precession (frame dragging) is that the de Sitter effect is due simply to the presence of a central mass, whereas Lense–Thirring precession is due to the rotation of the central mass. The total precession is calculated by combining the de Sitter precession with the Lense–Thirring precession. Experimental confirmation The geodetic effect was verified to a precision of better than 0.5% by Gravity Probe B, an experiment which measures the tilting of the spin axis of gyroscopes in orbit about the Earth. The first results were announced on April 14, 2007 at the meeting of the American Physical Society. Formulae To derive the precession, assume the system is in a rotating Schwarzschild metric. The nonrotating metric is $ds^2 = -\left(1 - \frac{2m}{r}\right)dt^2 + \frac{dr^2}{1 - 2m/r} + r^2\left(d\theta^2 + \sin^2\theta\, d\phi^2\right),$ where c = G = 1. We introduce a rotating coordinate system, with an angular velocity $\omega$, such that a satellite in a circular orbit in the θ = π/2 plane remains at rest; that is, we substitute $\phi = \phi' + \omega t$. This gives us $ds^2 = -\left(1 - \frac{2m}{r} - r^2\omega^2\sin^2\theta\right)dt^2 + 2r^2\omega\sin^2\theta\, dt\, d\phi' + \frac{dr^2}{1 - 2m/r} + r^2\, d\theta^2 + r^2\sin^2\theta\, d\phi'^2.$ In this coordinate system, an observer at radial position r s
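The derivation above is truncated, but its end point is a standard textbook result and can be stated directly: for a gyroscope on a circular orbit of radius $r$ about a mass $M$, the spin axis precesses per orbit by

```latex
\alpha = 2\pi\left(1 - \sqrt{1 - \frac{3GM}{c^{2}r}}\right)
       \approx \frac{3\pi GM}{c^{2}r} \quad \text{(weak field)},
```

in the direction of the orbital motion. For a satellite in low Earth orbit this weak-field rate accumulates to the few arcseconds per year that Gravity Probe B was built to resolve; the exact closed form with the square root is the full Schwarzschild-geometry result to which the rotating-coordinate construction above leads.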
https://en.wikipedia.org/wiki/Acotyledon
Acotyledon is used to refer to seed plants or spermatophytes that lack cotyledons, such as orchids and dodder. Orchid seeds are tiny with underdeveloped embryos. They depend on mycorrhizal fungi for their early nutrition so are myco-heterotrophs at that stage. Although some authors, especially in the 19th century and earlier, use the word acotyledon to include plants which have no cotyledons because they lack seeds entirely (such as ferns and mosses), others restrict the term to plants which have seeds but no cotyledons. Flowering plants or angiosperms are divided into two large groups. Monocotyledons or monocots have one seed lobe, which is often modified to absorb stored nutrients from the seed so never emerges from the seed or becomes photosynthetic. Dicotyledons or dicots have two cotyledons and often germinate to produce two leaf-like cotyledons. Conifers and other gymnosperms lack flowers but may have two or more cotyledons in the seedling.
https://en.wikipedia.org/wiki/Acousto-optics
Acousto-optics is a branch of physics that studies the interactions between sound waves and light waves, especially the diffraction of laser light by ultrasound (or sound in general) through an ultrasonic grating.

Introduction

Optics has had a very long and full history, from ancient Greece, through the Renaissance and modern times. Acoustics has a history of similar duration, again starting with the ancient Greeks. In contrast, the acousto-optic effect has had a relatively short history, beginning with Brillouin predicting the diffraction of light by an acoustic wave propagating in a medium of interaction, in 1922. This was confirmed by experiment in 1932 by Debye and Sears, and also by Lucas and Biquard. The particular case of first-order diffraction under a certain angle of incidence (also predicted by Brillouin) was observed by Rytow in 1935. Raman and Nath (1937) designed a general ideal model of the interaction, taking several orders into account. This model was developed by Phariseau (1956) for diffraction including only one diffraction order.

In general, acousto-optic effects are based on the change of the refractive index of a medium due to the presence of sound waves in that medium. Sound waves produce a refractive index grating in the material, and it is this grating that is "seen" by the light wave. These variations in the refractive index, due to the pressure fluctuations, may be detected optically by refraction, diffraction, and interference effects; reflection may also be used.

The acousto-optic effect is extensively used in the measurement and study of ultrasonic waves. However, the principal growing area of interest is in acousto-optical devices for the deflection, modulation, signal processing and frequency shifting of light beams. This is due to the increasing availability and performance of lasers, which have made the acousto-optic effect easier to observe and measure. Technical progress in
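As a concrete illustration of diffraction by an ultrasonic grating, the first-order Bragg angle follows from the optical wavelength and the acoustic wavelength Λ = v/f. The numbers below (a 633 nm HeNe laser and an 80 MHz acoustic wave in fused silica) are illustrative assumptions, not values from the text:

```python
from math import asin, degrees

def bragg_angle_deg(wavelength_m: float, sound_speed_m_s: float,
                    drive_freq_hz: float) -> float:
    """First-order Bragg angle (degrees) for light diffracted by an ultrasonic grating.

    The acoustic wavelength is Lambda = v / f, and the Bragg condition is
    sin(theta_B) = lambda / (2 * Lambda).
    """
    acoustic_wavelength = sound_speed_m_s / drive_freq_hz
    return degrees(asin(wavelength_m / (2.0 * acoustic_wavelength)))

# Assumed example: 633 nm HeNe beam, 80 MHz acoustic wave in fused silica (v ~ 5960 m/s).
theta = bragg_angle_deg(633e-9, 5960.0, 80e6)
print(f"Bragg angle: {theta:.3f} degrees")  # ~0.243 degrees
```

The small angle is typical: acoustic wavelengths (tens of micrometres) are much longer than optical ones, so acousto-optic deflectors steer beams through fractions of a degree.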
https://en.wikipedia.org/wiki/Oral%20food%20challenge
An oral food challenge is a method for determining whether a person has a specific food allergy. It involves giving the person increasing amounts of the suspected food and watching to see if an allergic reaction occurs. Because the challenge can provoke a genuine allergic reaction, the procedure is potentially dangerous.
https://en.wikipedia.org/wiki/Electrochemical%20fatigue%20crack%20sensor
An Electrochemical Fatigue Crack Sensor (EFCS) is a type of low-cost electrochemical nondestructive dynamic testing method used primarily in the aerospace and transportation infrastructure industries. The method is used to locate surface-breaking and slightly subsurface defects in all metallic materials. In bridge structures, EFCS is used at known fatigue-susceptible areas, such as sharp-angled coped beams, stringer-to-beam attachments, and the toes of welds. This dynamic testing can be a form of short-term or long-term monitoring, as long as the structure is undergoing dynamic cyclic loading.

History

In 1992, Dr. Campbell Laird and Dr. Yuanfeng Li invented the EFS™. The EFS™ relies on a patented electrical test method, which monitors the current flow at the surface of a metal while it is being mechanically flexed. The output current resembles a heart's EKG pattern and can be interpreted to indicate the degree of fatigue as well as the presence of cracks in their earliest stages of development. The technology behind EFS was devised by researchers from the U.S. Air Force and the University of Pennsylvania for use in the aerospace industry. The original research was aimed at developing a technology for detecting problem cracks in airframes and engines. Since that time, additional research and development has resulted in the adaptation of the EFS system for steel bridge inspection.

Principles

The Electrochemical Fatigue Sensor (EFS) is a dynamic nondestructive crack-inspection technology, similar in concept to a medical EKG, which is used to determine if actively growing fatigue cracks are present. An EFS sensor is first applied to the fatigue-sensitive location on the bridge or metal structure, and then is injected with an electrolyte, at which point a small voltage is applied. The system subsequently monitors changes in the current response that result from the exposure of fresh steel during crack propagation. The EFS system consists of an electrolyte, a sensor a
https://en.wikipedia.org/wiki/List%20of%20portable%20software
For the purposes of this list, a portable application is software that can be used from portable storage devices such as USB flash drives, digital audio players, PDAs or external hard drives. To be considered for inclusion, an application must be executable on multiple computers from removable storage without installation, and without writing settings or data onto a computer's non-removable storage. This includes modified portable versions of non-portable applications.

Bundles

Ceedo
MojoPac
LiberKey
PortableApps.com
U3
WebLaminarTools
WinPenPack

Launchers

Appetizer (Dock application)
ASuite
Launchy
OpenDisc
RocketDock

Development

Scripting languages

Portable Python
Portable NSIS Version
Portable AutoIt
Portable AutoHotkey (zip file)
Portable Perl (Strawberry Perl Portable Version)

Compilers

MinGW
Tiny C Compiler

IDEs

Alice IDE
Portable Eclipse
Portable Code::Blocks (needs MinGW installed, which is portable too)
Portable Dev-C++
Hackety Hack, which is an educational version of Ruby
SharpDevelop Portable

Setup creators

Nullsoft Scriptable Install System Portable (PortableApps.com format)

Visual mapping/productivity tools

XMIND

Graphics

3D modeling and rendering

Anim8or – free 3D modeling and animating software
Blender: BlenderPortable, Blender Pocket, XBlender

Animation

Anim8or
Blender
Pivot Stickfigure Animator

Graphic editors

ArtRage
Artweaver
Dia
EVE
Fotografix
GIMP: GIMP Portable VS 2008 is the portable version of GIMP on Windows platforms (Windows XP, Vista, NT Server 2003, NT Server 2008); Portable Gimp – for Mac OS X; X-Gimp; X-GimpShop
Inkscape: X-Inkscape; Portable Inkscape – for Mac OS X
IrfanView
Pixia
Tux Paint

Icon editors

@icon sushi
GIMP – supports reading and writing Windows ICO files
IcoFX
IrfanView – supports converting graphic file formats into Windows ICO files

Viewers

FastStone Image Viewer: supports screen capture, multiple pix into a single PDF
Irfanview
XnView

Documen
https://en.wikipedia.org/wiki/Trove
Trove is an Australian online library database and aggregation service owned by the National Library of Australia, which holds partnerships with source providers such as National and State Libraries Australia. It includes full-text documents, digital images, bibliographic and holdings data for items that are not available digitally, and a free faceted-search engine as a discovery tool.

Content

The database includes archives, images, newspapers, official documents, archived websites, manuscripts and other types of data. It is one of the most well-respected and accessed GLAM services in Australia, with over 70,000 daily users. Based on antecedents dating back to 1996, the first version of Trove was released for public use in late 2009. It includes content from libraries, museums, archives, repositories and other organisations with a focus on Australia. It allows searching of catalogue entries of books in Australian libraries (some fully available online), academic and other journals, full-text searching of digitised archived newspapers, government gazettes and archived websites. It provides access to digitised images, maps, aggregated information about people and organisations, archived diaries and letters, and all born-digital content which has been deposited via National edeposit (NED). Searchable content also includes music, sound and videos, and transcripts of radio programs.

With the exception of the digitised newspapers, none of the content is hosted by Trove itself, which indexes the content of its partners' collection metadata, formats and manages it, and displays the aggregated information in a relevance-ranked search result. In the wake of government funding cuts since 2015, the National Library and other organisations have been struggling to keep the content on Trove flowing and up to date.

History

Trove's origins can be seen in the development of earlier services such as the Australian Bibliographic Network (ABN), a
https://en.wikipedia.org/wiki/Fast%20Analog%20Computing%20with%20Emergent%20Transient%20States
Fast Analog Computing with Emergent Transient States, or FACETS, is a European project to research the properties of the human brain. Established and funded by the European Union in September 2005, the five-year project involves approximately 80 scientists from Austria, France, Germany, Hungary, Sweden, Switzerland and the United Kingdom. The main project goal is to address questions about how the brain computes. Another objective is to create microchip hardware implementing approximately 200,000 neurons with 50 million synapses on a single silicon wafer. Current prototypes run 100,000 times faster than their biological counterparts, which would make them the fastest analog computing devices ever built for neuronal computations.

The institutions involved are the University of Heidelberg, the French National Centre for Scientific Research (CNRS) of Gif-sur-Yvette, the CNRS of Marseille, the Institut national de recherche en informatique et en automatique, the University of Freiburg, the University of Graz, the École Polytechnique Fédérale de Lausanne, the Swedish Royal Institute of Technology, the University of London, the University of Plymouth, the University of Bordeaux, the University of Debrecen, the University of Dresden and the Institute for Theoretical Computer Science at Technische Universität Graz.

External links

FACETS website
a quick introduction
https://en.wikipedia.org/wiki/Identity%20verification%20service
An identity verification service is used by businesses to ensure that users or customers provide information that is associated with the identity of a real person. The service may verify the authenticity of physical identity documents such as a driver's license, passport, or a nationally issued identity document through documentary verification. It may also involve the verification of identity information (fields) against independent and authoritative sources, such as a credit bureau or proprietary government data.

Background

Identity verification services were developed to help companies comply with Anti-Money Laundering (AML) and Know Your Customer (KYC) rules; identity verification is now a vital component of the transaction ecosystems of eCommerce companies, financial institutions, online gaming, and even social media. By adopting digital fraud prevention methods, businesses can achieve AML and KYC compliance while addressing the risks associated with fraud.

In financial industries, verifying identity is often required by regulations known as Know Your Customer or Customer Identification Program. In the US, one of the many bodies regulating these procedures is the Financial Crimes Enforcement Network (FinCEN). The Financial Action Task Force (FATF) is a global anti-money laundering and terrorist financing watchdog organization.

A nondocumentary identity verification requires the user or customer to provide personal identity data, which is sent to the identity verification service. The service checks public and proprietary private databases for a match on the information provided. Optionally, knowledge-based authentication questions can be presented to the person providing the information to ensure that he or she is the owner of the identity. An identity "score" is calculated, and the identity of the user or customer is either given the "verified" status or not, based on the score. Customers of various businesses, such as retail merchants, govern
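The nondocumentary flow described above (match submitted fields against authoritative sources, compute a score, apply a threshold) can be sketched as follows. All names, field weights, and the threshold here are hypothetical illustrations, not any real provider's API:

```python
# Hypothetical sketch of score-based nondocumentary identity verification.
# RECORDS stands in for lookups against credit-bureau or government data.
RECORDS = {
    "jane.doe": {"name": "Jane Doe", "dob": "1980-04-02", "address": "12 High St"},
}

WEIGHTS = {"name": 40, "dob": 35, "address": 25}  # assumed per-field weights
THRESHOLD = 70                                    # assumed minimum "verified" score

def verify(user_id: str, submitted: dict) -> tuple[int, bool]:
    """Score the submitted fields against the authoritative record, then threshold."""
    record = RECORDS.get(user_id, {})
    score = sum(weight for field, weight in WEIGHTS.items()
                if submitted.get(field) == record.get(field))
    return score, score >= THRESHOLD

score, ok = verify("jane.doe",
                   {"name": "Jane Doe", "dob": "1980-04-02", "address": "1 Low Rd"})
print(score, ok)  # 75 True: name and date of birth match, address does not
```

Real services add fuzzy matching, multiple data sources, and knowledge-based authentication on top of this basic score-and-threshold shape.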
https://en.wikipedia.org/wiki/Project%20CETI
Project CETI is an international initiative to understand the communication of sperm whales using advances in artificial intelligence. Its name is a reference to the SETI Institute. The project has an interdisciplinary scientific board including marine biologists, artificial intelligence researchers, roboticists, theoretical computer scientists, and linguists. The project has a base on the island of Dominica where recordings are being collected. A roadmap has been published. The organization has been selected as a TED Audacious Project.

See also

Human–animal communication
Animal cognition
Animal communication
Animal consciousness
Interspecies communication
https://en.wikipedia.org/wiki/Gab%20operon
The gab operon is responsible for the conversion of γ-aminobutyrate (GABA) to succinate. The gab operon comprises three structural genes – gabD, gabT and gabP – that encode a succinate semialdehyde dehydrogenase, a GABA transaminase and a GABA permease respectively. There is a regulatory gene csiR, downstream of the operon, that codes for a putative transcriptional repressor and is activated when nitrogen is limiting. The gab operon has been characterized in Escherichia coli, and significant homologies for the enzymes have been found in organisms such as Saccharomyces cerevisiae, rats and humans.

Limited nitrogen conditions activate the gab genes. The enzymes produced by these genes convert GABA to succinate, which then enters the TCA cycle, to be used as a source of energy. The gab operon is also known to contribute to polyamine homeostasis during nitrogen-limited growth and to maintain high internal glutamate concentrations under stress conditions.

Structure

The gab operon consists of three structural genes:

gabT: encodes a GABA transaminase that produces succinic semialdehyde.
gabD: encodes an NADP-dependent succinic semialdehyde dehydrogenase, which oxidizes succinic semialdehyde to succinate.
gabP: encodes a GABA-specific permease.

Physiological significance of the operon

The gabT gene encodes GABA transaminase, an enzyme that catalyzes the conversion of GABA and 2-oxoglutarate into succinate semialdehyde and glutamate. Succinate semialdehyde is then oxidized into succinate by succinate semialdehyde dehydrogenase, which is encoded by the gabD gene, thereby entering the TCA cycle as a usable source of energy. The gab operon contributes to homeostasis of polyamines such as putrescine during nitrogen-limited growth. It is also known to maintain high internal glutamate concentrations under stress conditions.

Regulation

Differential Regulation of Promoters

The expression of genes in the operon is controlled by three differentially regulated prom
https://en.wikipedia.org/wiki/Dirac%E2%80%93K%C3%A4hler%20equation
In theoretical physics, the Dirac–Kähler equation, also known as the Ivanenko–Landau–Kähler equation, is the geometric analogue of the Dirac equation that can be defined on any pseudo-Riemannian manifold using the Laplace–de Rham operator. In four-dimensional flat spacetime, it is equivalent to four copies of the Dirac equation that transform into each other under Lorentz transformations, although this is no longer true in curved spacetime. The geometric structure gives the equation a natural discretization that is equivalent to the staggered fermion formalism in lattice field theory, making Dirac–Kähler fermions the formal continuum limit of staggered fermions. The equation was discovered by Dmitri Ivanenko and Lev Landau in 1928 and later rediscovered by Erich Kähler in 1962.

Mathematical overview

In four-dimensional Euclidean spacetime, a generic field of differential forms is written as a linear combination of sixteen basis forms indexed by H, which runs over the sixteen ordered combinations of indices; each index runs from one to four. Here the components are antisymmetric tensor fields, while the dx^H are the corresponding differential-form basis elements. Using the Hodge star operator, the exterior derivative d is related to the codifferential δ. These form the Laplace–de Rham operator d − δ, which can be viewed as the square root of the Laplacian operator. The Dirac–Kähler equation is motivated by noting that this is also the property of the Dirac operator. This equation is closely related to the usual Dirac equation, a connection which emerges from the close relation between the exterior algebra of differential forms and the Clifford algebra of which Dirac spinors are irreducible representations. For the basis elements to satisfy the Clifford algebra, it is required to introduce a new Clifford product acting on basis elements. Using this product, the action of the Laplace–de Rham operator on differential form basis elements is written as To acqui
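In one standard notation (a sketch; sign conventions for the codifferential vary between authors), the relations and equation described above read:

```latex
\delta = -\,{\star}\, d\, {\star},
\qquad
(d - \delta)^{2} = -(d\delta + \delta d) = -\Delta,
\qquad
(d - \delta + m)\,\Phi = 0,
```

where Δ is the Laplace–de Rham operator and Φ is the inhomogeneous differential form built from the sixteen basis elements. The last equation is the Dirac–Kähler equation itself, with d − δ playing the role of the Dirac operator.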
https://en.wikipedia.org/wiki/Donor%20%28semiconductors%29
In semiconductor physics, a donor is a dopant atom that, when added to a semiconductor, can form an n-type region. For example, when silicon (Si), having four valence electrons, is to be doped as an n-type semiconductor, elements from group V like phosphorus (P) or arsenic (As) can be used because they have five valence electrons. A dopant with five valence electrons is also called a pentavalent impurity. Other pentavalent dopants are antimony (Sb) and bismuth (Bi).

When substituting a Si atom in the crystal lattice, four of the valence electrons of phosphorus form covalent bonds with the neighbouring Si atoms but the fifth one remains weakly bound. If that electron is liberated, the initially electro-neutral donor becomes positively charged (ionised). At room temperature, the liberated electron can move around the Si crystal and carry a current, thus acting as a charge carrier.

See also

Acceptor (semiconductors)
Electron donor
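The weak binding of the fifth electron can be estimated with the textbook hydrogen-like (effective-mass) model: scale the hydrogen ionization energy by the effective mass ratio and the dielectric screening. The silicon parameters below are typical textbook values, and this simple model underestimates the measured ionization energy (about 45 meV for phosphorus in silicon):

```python
# Hydrogen-like (effective-mass) estimate of a donor ionization energy.
RYDBERG_EV = 13.6       # hydrogen ionization energy in eV
M_EFF_RATIO = 0.26      # assumed conductivity effective mass m*/m_e for silicon
EPS_R = 11.7            # relative permittivity of silicon

def donor_ionization_energy_ev(m_ratio: float, eps_r: float) -> float:
    """E_d = 13.6 eV * (m*/m_e) / eps_r^2, scaling the hydrogen-atom result."""
    return RYDBERG_EV * m_ratio / eps_r**2

energy = donor_ionization_energy_ev(M_EFF_RATIO, EPS_R)
print(f"{energy * 1000:.1f} meV")  # ~25.8 meV
```

The small result, comparable to the thermal energy kT ≈ 25 meV at room temperature, is why donors are essentially fully ionized at room temperature, as the text describes.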
https://en.wikipedia.org/wiki/Therapeutic%20inertia
Therapeutic inertia (also known as clinical inertia) is a measurement of the resistance to therapeutic treatment for an existing medical condition. It is commonly measured as a percentage of the number of encounters in which a patient with a condition received new or increased therapeutic treatment out of the total number of visits to a health care professional by the patient. A high percentage indicates that the health care provider is slow to treat a medical condition. A low percentage indicates that a provider is extremely quick in prescribing new treatment at the onset of any medical condition.

Calculation

There are two common methods used in calculating therapeutic inertia. For the following examples, consider that a patient has five visits with a health provider. In four of those visits, a condition is not controlled (such as high blood pressure or high cholesterol). In two of those visits, the provider made a change to the patient's treatment for the condition.

In Dr. Okonofua's original paper, this patient's therapeutic inertia is calculated as (h − c) / v, where h is the number of visits with an uncontrolled condition, c is the number of visits in which a change was made, and v is the total number of visits. Therefore, the patient's therapeutic inertia is (4 − 2) / 5 = 0.4, or 40%.

An alternative, which avoids consideration of visits where the condition was already controlled and the provider should not be expected to make a treatment change, is 1 − c/h. Using the above example, there are 2 changes and 4 visits with an uncontrolled condition. The therapeutic inertia is 1 − 2/4 = 0.5, or 50%.

Reception

Therapeutic inertia was devised as a metric for measuring treatment of hypertension. It has since become a standard metric for analysing treatment of many common comorbidities such as diabetes and hyperlipidemia. Both feedback reporting processes and intervention studies aimed at reducing therapeutic inertia have been shown to increase control of hypertension, diabetes, and hyperlipidemia.
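The two visit-counting metrics can be sketched in code. The closed forms used here, (h − c)/v for the original metric and 1 − c/h for the alternative, are reconstructions consistent with the verbal definitions in the text, not quotations from Okonofua's paper:

```python
def therapeutic_inertia_okonofua(h: int, c: int, v: int) -> float:
    """Original-style metric: (uncontrolled visits - change visits) / total visits."""
    return (h - c) / v

def therapeutic_inertia_uncontrolled_only(h: int, c: int) -> float:
    """Alternative: fraction of uncontrolled visits at which no change was made."""
    return 1 - c / h

# Worked example from the text: 5 visits, 4 uncontrolled, 2 with a treatment change.
print(therapeutic_inertia_okonofua(4, 2, 5))        # 0.4
print(therapeutic_inertia_uncontrolled_only(4, 2))  # 0.5
```

Both metrics equal zero for a provider who changes treatment at every uncontrolled visit, and grow toward one as changes become rarer, matching the "high percentage means slow to treat" reading.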
https://en.wikipedia.org/wiki/Dangerously%20irrelevant%20operator
In statistical mechanics and quantum field theory, a dangerously irrelevant operator (or dangerous irrelevant operator) is an operator which is irrelevant at a renormalization group fixed point, yet affects the infrared (IR) physics significantly (e.g. because the vacuum expectation value (VEV) of some field depends sensitively upon the coefficient of this operator).

Critical phenomena

In the theory of critical phenomena, the free energy of a system near the critical point depends analytically on the coefficients of generic (not dangerous) irrelevant operators, while the dependence on the coefficients of dangerously irrelevant operators is non-analytic (p. 49). The presence of dangerously irrelevant operators leads to the violation of the hyperscaling relation dν = 2 − α between the critical exponents ν and α in d dimensions. The simplest example (p. 93) is the critical point of the Ising ferromagnet in d > 4 dimensions, which is a Gaussian theory (a free massless scalar field), but the leading irrelevant perturbation is dangerously irrelevant. Another example occurs for the Ising model with random-field disorder, where the fixed point occurs at zero temperature, and the temperature perturbation is dangerously irrelevant (p. 164).

Quantum field theory

Let us suppose there is a field φ with a potential V(φ) = −aφ^m + bφ^n depending upon two parameters, a and b. Let us also suppose that a is positive and nonzero and n > m. If b is zero, there is no stable equilibrium. If the scaling dimension of φ is c, then the scaling dimension of b is d − nc, where d is the number of dimensions. It is clear that if the scaling dimension of b is negative, b is an irrelevant parameter. However, the crucial point is that the VEV ⟨φ⟩ = (am/bn)^{1/(n−m)} depends very sensitively upon b, at least for small values of b. Because the nature of infrared physics also depends upon the VEV, it looks very different even for a tiny change in b, not because the physics in the vicinity of φ = 0 changes much — it hardly changes at all — but because the VEV we are expanding about has ch
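A concrete illustration (with exponents m = 2 and n = 4 chosen here for definiteness) is a quartic potential at the Gaussian fixed point in d > 4 dimensions:

```latex
V(\phi) = -\tfrac{a}{2}\,\phi^{2} + \tfrac{b}{4}\,\phi^{4},
\qquad
\langle\phi\rangle = \sqrt{a/b},
\qquad
[b] = d - 4\,[\phi] = 4 - d < 0 \quad (d > 4),
```

using the canonical dimension [φ] = (d − 2)/2. Although b has negative scaling dimension and is thus irrelevant, the VEV diverges as b → 0, so the infrared physics depends sensitively on b: the quartic coupling is dangerously irrelevant.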
https://en.wikipedia.org/wiki/Bernard%20Tucker%20Medal
The Bernard Tucker Medal is awarded by the British Trust for Ornithology for services to ornithology. It is named in memory of Bernard Tucker, their first Secretary. It has been awarded since 1954, usually annually, although there are some years when no medals were awarded.

Bernard Tucker Medallists

Source: British Trust for Ornithology

[The original article lists the medallists in two tables: 20th Century and 21st Century.]

See also

List of ornithology awards

External links

Past medallists
https://en.wikipedia.org/wiki/Many-body%20localization
Many-body localization (MBL) is a dynamical phenomenon occurring in isolated many-body quantum systems. It is characterized by the system failing to reach thermal equilibrium, and retaining a memory of its initial condition in local observables for infinite times.

Thermalization and localization

Textbook quantum statistical mechanics assumes that systems go to thermal equilibrium (thermalization). The process of thermalization erases local memory of the initial conditions. In textbooks, thermalization is ensured by coupling the system to an external environment or "reservoir," with which the system can exchange energy. What happens if the system is isolated from the environment, and evolves according to its own Schrödinger equation? Does the system still thermalize?

Quantum mechanical time evolution is unitary and formally preserves all information about the initial condition in the quantum state at all times. However, a quantum system generically contains a macroscopic number of degrees of freedom, but can only be probed through few-body measurements which are local in real space. The meaningful question then becomes whether accessible local measurements display thermalization.

This question can be formalized by considering the quantum mechanical density matrix ρ of the system. If the system is divided into a subregion A (the region being probed) and its complement B (everything else), then all information that can be extracted by measurements made on A alone is encoded in the reduced density matrix ρ_A = Tr_B ρ. If, in the long time limit, ρ_A approaches a thermal density matrix at a temperature set by the energy density in the state, then the system has "thermalized," and no local information about the initial condition can be extracted from local measurements. This process of "quantum thermalization" may be understood in terms of B acting as a reservoir for A. In this perspective, the entanglement entropy of a thermalizing system in a pure state plays the role of thermal entr
https://en.wikipedia.org/wiki/STEbus
The STEbus (also called the IEEE-1000 bus) is a non-proprietary, processor-independent computer bus with 8 data lines and 20 address lines. It was popular for industrial control systems in the late 1980s and early 1990s before the ubiquitous IBM PC dominated this market. STE stands for STandard Eurocard.

Although no longer competitive in its original market, it is a valid choice for hobbyists wishing to make 'home brew' computer systems. The Z80 and probably the CMOS 65C02 are possible processors to use. The standardized bus allows hobbyists to interface to each other's designs.

Origins

In the early 1980s, there were many proprietary bus systems, each with its own strengths and weaknesses. Most had grown in an ad-hoc manner, typically around a particular microprocessor. The S-100 bus is based on Intel 8080 signals, the STD Bus around Z80 signals, the SS-50 bus around the Motorola 6800, and the G64 bus around 6809 signals. This made it harder to interface other processors. Upgrading to a more powerful processor would subtly change the timings, and timing constraints were not always tightly specified. Nor were electrical parameters and physical dimensions. They usually used edge-connectors for the bus, which were vulnerable to dirt and vibration.

The VMEbus had provided a high-quality solution for high-performance 16-bit processors, using reliable DIN 41612 connectors and well-specified Eurocard board sizes and rack systems. However, these were too costly where an application only needed a modest 8-bit processor. In the mid 1980s, the STEbus standard addressed these issues by specifying what is rather like a VMEbus simplified for 8-bit processors. The bus signals are sufficiently generic that they are easy for 8-bit processors to interface with. The board size was usually a single-height Eurocard (100 mm × 160 mm) but the standard allowed for double-height boards (233 mm × 160 mm) as well. The latter positioned the bus connector so that it could neatly merge into VME-bus syste
https://en.wikipedia.org/wiki/ProtCID
The Protein Common Interface Database (ProtCID) is a database of similar protein-protein interfaces in crystal structures of homologous proteins. Its main goal is to identify and cluster homodimeric and heterodimeric interfaces observed in multiple crystal forms of homologous proteins. Such interfaces, especially of non-identical proteins or protein complexes, have been associated with biologically relevant interactions. A common interface in ProtCID indicates chain-chain or domain-domain interactions that occur in different crystal forms.

All protein sequences of known structure in the Protein Data Bank (PDB) are assigned a "Pfam chain architecture", which denotes the ordered Pfam assignments for that sequence, e.g. (Pkinase) or (Cyclin_N)_(Cyclin_C). Homodimeric interfaces in all crystals that contain particular domain or chain architectures are compared, regardless of whether there are other protein types in the crystals. All interfaces between two different Pfam domains or Pfam architectures in all PDB entries that contain them are also compared (e.g., (Pkinase) and (Cyclin_N)_(Cyclin_C)). For both homodimers and heterodimers, the interfaces are clustered into common interfaces based on a similarity score.

ProtCID reports the number of crystal forms that contain a common interface, the number of PDB entries, the number of PDB and PISA biological assembly annotations that contain the same interface, the average surface area, and the minimum sequence identity of proteins that contain the interface. ProtCID provides an independent check on publicly available annotations of biological interactions for PDB entries. ProtCID also contains interface clusters between protein domains and peptides, nucleic acids, and ligands.

See also

Protein-protein interaction
https://en.wikipedia.org/wiki/Peripheral%20blood%20mononuclear%20cell
A peripheral blood mononuclear cell (PBMC) is any peripheral blood cell having a round nucleus. These cells consist of lymphocytes (T cells, B cells, NK cells) and monocytes, whereas erythrocytes and platelets have no nuclei, and granulocytes (neutrophils, basophils, and eosinophils) have multi-lobed nuclei. In humans, lymphocytes make up the majority of the PBMC population, followed by monocytes, and only a small percentage of dendritic cells.

These cells can be extracted from whole blood by gradient centrifugation using Ficoll, a hydrophilic polysaccharide that separates layers of blood: a top layer of plasma, followed by a layer of PBMCs (the buffy coat), and a bottom fraction of polymorphonuclear cells (such as neutrophils and eosinophils) and erythrocytes. The polymorphonuclear cells can be further isolated by lysing the red blood cells. Basophils are sometimes found in both the denser and the PBMC fractions.

Clinical significance

Infections

Recent studies indicate that PBMCs may be susceptible to pathogenic infections, such as Ureaplasma parvum and urealyticum, Mycoplasma genitalium and hominis, and Chlamydia trachomatis infections. PBMCs may also be susceptible to viral infections. Indeed, footprints of JC polyomavirus and Merkel cell polyomavirus have been detected in PBMCs from pregnant women and women affected by spontaneous abortion.

Research uses

Many scientists conducting research in the fields of immunology (including auto-immune disorders), infectious disease, hematological malignancies, vaccine development, transplant immunology, and high-throughput screening are frequent users of PBMCs. In many cases, PBMCs are derived from blood banks. The PBMC fraction also contains progenitor populations, as demonstrated by methylcellulose-based colony-forming assays. PBMCs have been thought to be an important route of vaccination. PBMCs from cancer patients can be extracted and cultured in vitro. Subsequently, PBMCs are challen
https://en.wikipedia.org/wiki/Spectral%20variability%20hypothesis
The Spectral Variability Hypothesis (SVH) states that spatial variability in the reflectance of vegetated surfaces relates to plant species richness. It was originally coined by Palmer et al. (2000), who state that "species richness will be positively related to any objective measure (e.g. standard deviation) of the variation in the spectral characteristics of a remotely sensed image". The underlying assumption is that habitats differ in reflectance, and if there are more habitats in an area, higher numbers of species are to be expected. The hypothesis has later also been termed the Spectral Variation Hypothesis. At high spatial resolution, variability in reflectance may also be a direct expression of plant individuals belonging to different species.

The Spectral Variability Hypothesis was well received in the research community due to its apparent straightforwardness. Tests of the hypothesis showed considerable variation in the strength of the link between spatial variation of reflectance and plant species richness. This variation can be attributed to several reasons. A major problem, already noted by Palmer (2002), is the fact that different habitats support different species numbers, so the relationship between habitat heterogeneity and species numbers differs depending on which habitats are involved. This means that spatial heterogeneity in reflectance does not show a generalizable link to species richness. Accordingly, the Spectral Variability Hypothesis has been rejected at coarse scale levels, where high plant species richness can occur in areas of low spectral variability and vice versa. Other factors affecting the relationship between spectral heterogeneity and species numbers are differences in the extent and selection of the areas of investigation, spatial grain (pixel size of the applied remote sensing data), spectral resolution (including the width of the bands and the spectral region covered by the bands), and also the timing of an investigation.

See also

Ess
https://en.wikipedia.org/wiki/Domain%20Based%20Security
"Domain Based Security", abbreviated to "DBSy", is a model-based approach to help analyze information security risks in a business context and provide a clear and direct mapping between the risks and the security controls needed to manage them. A variant of the approach is used by the UK government's HMG Infosec Standard No. 1 technical risk-assessment method. DBSy is a registered trade mark of QinetiQ Ltd.

DBSy was developed in the late 1990s by the Defence Evaluation and Research Agency (DERA). It is a model-based approach to information assurance that describes the requirements for security in an organisation, taking account of the business that needs to be supported. The model is based around the concept of a security domain, which represents a logical place where people work with information using a computer system, and which has connections with other security domains where this is necessary to support business activity. Hence the focus is on the information that needs protection, the people who work with it and the people they exchange information with. The model can also describe the physical environments where people work and the system boundaries where major system security measures are placed. A systematic method is then applied to the model to identify and describe the risks to which valuable information assets are exposed and to specify security measures that are effective in managing those risks.

History

DBSy has its origins in the late 1990s, having been developed by the Defence Evaluation and Research Agency (DERA) for the Ministry of Defence (MOD). Initially called the Domain Based Approach, it was developed alongside Purple Penelope to support the MOD's increasing need for interconnections between systems operating at different security levels. It was recognised that the risks associated with such connections were directly related to the nature of the information exchange that was needed, and that an effective model for understanding and managing the
https://en.wikipedia.org/wiki/Wound%20tumor%20virus
Wound tumor virus is an invertebrate and plant virus found in the United States belonging to the genus Phytoreovirus and the family Reoviridae. It is a type III virus under the Baltimore classification system; that is, it has a double-stranded RNA genome. This genome is approximately 25,000 base pairs long and organised into twelve segments. All viral replication occurs in the cytoplasm. The virus is 22% RNA by weight, the other 78% being structural proteins. Structurally, the virus is constructed from seven different structural proteins. The capsid has icosahedral symmetry, is non-enveloped and around 70 nm in diameter. There is an inner shell with a diameter of around 50 nm.

More than 50 species of plants are potential hosts for Wound tumor virus. It was first reported in Melilotus officinalis. The virus causes tumors to form on the stem and roots of the plant, with the root tumors being more severe. The virus is spread by insect vectors in the leafhopper family, notably Agallia constricta. Since viral replication occurs relatively independently of cellular processes, the virus also replicates in the insect vector.

External links

NCBI database entry for wound tumor virus
https://en.wikipedia.org/wiki/Borage
Borage (Borago officinalis), also known as starflower, is an annual herb in the flowering plant family Boraginaceae. It is native to the Mediterranean region and has naturalized in many other locales. It grows satisfactorily in gardens in most of Europe, such as Denmark, France, Germany, the United Kingdom, and Ireland, remaining in the garden from year to year by self-seeding. The leaves are edible and the plant is grown in gardens for that purpose in some parts of Europe. The plant is also commercially cultivated for borage seed oil extracted from its seeds. The plant contains pyrrolizidine alkaloids, some of which are hepatotoxic, mutagenic, and carcinogenic (see below under Phytochemistry).

Description

B. officinalis is bristly or hairy all over the stems and leaves; the leaves are alternate and simple. The flowers are complete and perfect, with five narrow, triangular-pointed petals. Flowers are most often blue, although pink flowers are sometimes observed. White-flowered types are also cultivated. The blue flower is genetically dominant over the white flower. The flowers arise along scorpioid cymes to form large floral displays with multiple flowers blooming simultaneously, suggesting that borage has a high degree of geitonogamy (intraplant pollination). It has an indeterminate growth habit, which may lead to prolific spreading. In temperate climates such as in the UK, its flowering season is relatively long, from June to September. In milder climates, borage blooms continuously for most of the year.

Characteristics and uses

Traditionally, borage was cultivated for culinary and medicinal uses, although today commercial cultivation is mainly as an oilseed. Borage is used as either a fresh vegetable or a dried herb. As a fresh vegetable, borage, with a cucumber-like taste, is often used in salads or as a garnish. The flower has a sweet, honey-like taste and is often used to decorate desserts and cocktails, som
https://en.wikipedia.org/wiki/Epimecis%20scolopaiae
Epimecis scolopaiae is a species of moth in the family Geometridae, subfamily Ennominae. It was first described by Dru Drury in 1773 from Jamaica.

Description

Upper side: antennae filiform. Thorax, abdomen, and wings brownish grey; the latter varied with dark indented brown streaks and lines, contrasted with white and ash colour, crossing them from the anterior to the posterior and abdominal edges. Under side: legs, sides, abdomen, and wings yellow wainscot-coloured. About half the anterior ones, from the tips towards the shoulders, are marked with faint dark brown lines and streaks. Posterior wings having a faintish dark brown cloud, situated near the upper corners. All the wings are deeply dentated. Wingspan nearly 3½ inches (87 mm).

Subspecies

Epimecis scolopaiae scolopaiae
Epimecis scolopaiae transitaria (Walker, 1860)
https://en.wikipedia.org/wiki/Viridos%20%28company%29
In September 2021, Synthetic Genomics Inc. (SGI), a private company located in La Jolla, California, changed its name to Viridos. The company is focused on the field of synthetic biology, especially harnessing photosynthesis with microalgae to create alternatives to fossil fuels. Viridos designs and builds biological systems to address global sustainability problems. Synthetic biology is an interdisciplinary branch of biology and engineering, combining fields such as biotechnology, evolutionary biology, molecular biology, systems biology, biophysics, computer engineering, and genetic engineering. Synthetic Genomics uses techniques such as software engineering, bioprocessing, bioinformatics, biodiscovery, analytical chemistry, fermentation, cell optimization, and DNA synthesis to design and build biological systems. The company produces or performs research in the fields of sustainable biofuels, insect-resistant crops, transplantable organs, targeted medicines, and DNA synthesis instruments, as well as a number of biological reagents.

Core markets

SGI mainly operates in three end markets: research, bioproduction and applied products. The research segment focuses on genomics solutions for academic and commercial research organizations. The commercial products and services include instrumentation, reagents, DNA synthesis services, and bioinformatics services and software. In 2015, the company launched the BioXP 3200 system, a fully automated benchtop instrument that produces DNA fragments from many different sources for genomic data. The company's efforts in bio-based production are intended both to improve existing production hosts and to develop entirely new synthetic production hosts, with the goal of more efficient routes to bioproducts. SGI has a number of commercial as well as research and development stage programs across a variety of industries. Some of these research partnerships include:

History

Synthetic Genomics was founded in the spring of 2005 by J. Craig
https://en.wikipedia.org/wiki/Fesenkov%20Astrophysical%20Institute
Fesenkov Astrophysical Institute (Астрофизический институт имени В. Г. Фесенкова, АФИФ), or FAI, is a research institute in Almaty, Kazakhstan. The institute was founded in 1941 as the Institute for Astronomy and Physics of the Kazakh branch of the USSR Academy of Sciences, when a group of Soviet astronomers was evacuated during World War II from the European parts of the USSR to Almaty. In 1948, G. A. Tikhov organized an independent sector of astrobotany, and in 1950 astronomers established the Astrophysical Institute of the Kazakh SSR. In 1989 the institute was renamed after Vasily Fesenkov, one of its founders.

FAI conducts both observational and theoretical research. The prime objects of observation are the Sun, the outer planets, comets, Herbig Ae/Be stars, and active galaxies. The topics of theoretical research include stellar dynamics and computational astrophysics, active galactic nuclei, cosmology, and the physics of comets and the interstellar medium. The institute runs three observational bases in the mountains near Almaty: Kamenskoe Plateau Observatory, Assy-Turgen Observatory and Tien Shan Astronomical Observatory. FAI is a member of the International Astronomical Union.

Interesting facts

The comet 67P/Churyumov–Gerasimenko, famous for the Rosetta mission, was discovered at Fesenkov Astrophysical Institute. FAI was one of the first institutions where a new branch of science, astrobiology, was born in the middle of the twentieth century.