text | source |
|---|---|
Hamedan Museum of Natural History The Hamedan Museum of Natural History is a natural history museum located at Bu-Ali Sina University in Hamedan, north-western Iran. It is noted for its displays of a large-horned Alborz red sheep and a black vulture. It also holds a considerable taxidermic collection of animals and insects, and has an aquarium and live fish tanks. | https://en.wikipedia.org/wiki?curid=21131801 |
Oman Natural History Museum The Natural History Museum of Oman is a natural history museum located at the Ministry of Heritage and Culture complex, Al Khuwair, opposite the Zawawi Mosque in Muscat, Oman. The museum opened on 20 December 1985 and offers detailed coverage of Oman's flora and fauna, with displays on indigenous mammals, insects, and birds, as well as botanical gardens. One of the highlights of the museum is the whale hall, which houses the huge skeleton of a sperm whale that washed up on the Omani coastline in 1986. The museum also contains marine and animal fossils, remains of ancient mammals such as primitive monkeys and elephants, teeth of "Deinotherium" and "Gomphotherium", and stuffed animals including the Arabian leopard, caracal, Arabian oryx, Arabian wolf, Arabian red fox, an Arabian gazelle skeleton, Arabian ibex, flamingo, birds such as the crow and owl, reptiles such as snakes and lizards, and snails and shells. In January 2014, the Ministry of Heritage and Culture announced plans to build new premises for the museum; the project will consist of three floors with a gross area of about 5,000 square meters. The first floor is devoted to marine environments such as sandy coasts, mangroves, rocky coasts and coral reefs, as well as the geological history of the Sultanate's seas, while the second floor displays wild environments, including mammals, insects, birds, valleys, caves and water springs, in addition to the geological history of Oman. The third floor showcases information about the solar system, planets, space and meteors. | https://en.wikipedia.org/wiki?curid=21145044 |
Oman Natural History Museum The museum includes lectures halls and temporary exhibitions. Additionally, it has well equipped educational halls that will be used for workshops and educational programmes. The museum includes five fundamental scientific sections specialised for research, records, studies, and archives works. | https://en.wikipedia.org/wiki?curid=21145044 |
Erich Wasmann (29 May 1859 − 27 February 1931) was an Austrian entomologist (born in South Tyrol) and Jesuit priest who specialized in ants and termites. He described the phenomenon known as Wasmannian mimicry. Wasmann was a supporter of evolution, although he did not accept the productivity of natural selection, the evolution of humans from other animals, or universal common descent of all life. Rather, he believed that common ancestry was restricted to what he called "natural species", which were generally larger groups than species (which he called "systematic species"), genera, or even families. His natural species he identified with the "paleontological species" of Melchior Neumayr. Wasmann was also involved in a long-running dispute with Ernst Haeckel over Monism. His father was the painter Friedrich Wasmann. Wasmann's collection of Formicidae, Isoptera, and myrmecophile and termitophile Coleoptera (especially Staphylinidae) is in the Maastricht Natural History Museum. From 1936 to 1994, a biological journal variously called "The Wasmann Club Collector", "The Wasmann Collector" or "The Wasmann Journal of Biology" was published by the University of San Francisco, with notable editors including Edward L. Kessel and Robert T. Orr. | https://en.wikipedia.org/wiki?curid=21146628 |
Mott criterion The Mott criterion describes the critical point of the metal–insulator transition. The criterion is n^(1/3) a*_B > C^(-1), where n is the electron density of the material and a*_B the effective Bohr radius. The constant C, according to various estimates, is 2.0, 2.78, 4.0, or 4.2. If the criterion is satisfied (i.e. if the density of electrons is sufficiently high), the material is conductive (a metal); otherwise it is an insulator. | https://en.wikipedia.org/wiki?curid=21162207 |
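As a hedged illustration (the function name, the choice C = 4, and the example Bohr radius are mine, not from the source), the criterion can be evaluated numerically:

```python
def is_metallic(n, a_b, c=4.0):
    """Mott criterion: metallic when n**(1/3) * a_B exceeds 1/C.

    n   : electron density in m^-3
    a_b : effective Bohr radius in m
    c   : dimensionless constant (various estimates give 2.0 to 4.2)
    """
    return n ** (1.0 / 3.0) * a_b > 1.0 / c

# Critical density for an assumed effective Bohr radius of 3 nm:
a_star = 3e-9                          # m (illustrative value)
n_c = (1.0 / (4.0 * a_star)) ** 3      # ~5.8e23 m^-3
```

Doubling the density across n_c flips the material from the insulating to the metallic side of the transition under this criterion.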
Comparator system A comparator system, or simply comparator, in the fields of biophysics, biology, and neurology is a particular organisation of neurons. Comparators, as their name suggests, compare several inputs of internal or external information, and are important to the field of neural learning. In biological systems, comparators help an organism adapt to changes in its surroundings. | https://en.wikipedia.org/wiki?curid=21167788 |
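A minimal sketch of the comparator idea (names, signal values, and the threshold are entirely illustrative): an error signal is emitted only when the sensed input deviates from an internally generated expectation.

```python
def comparator(expected, actual, threshold=0.1):
    """Compare an internal expectation (e.g. an efference copy) with
    sensed input; return the mismatch signal if it exceeds the
    threshold, otherwise stay silent (0.0)."""
    mismatch = actual - expected
    return mismatch if abs(mismatch) > threshold else 0.0

# Small deviations are ignored; large ones drive adaptation:
quiet = comparator(1.0, 1.05)   # below threshold
alarm = comparator(1.0, 1.5)    # above threshold
```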
CHARISSA (derived from 'CHARged particle Instrumentation for a Solid State Array') is a nuclear structure research collaboration originally conceived, initiated and partially built by Dr. William Rae of the University of Oxford (retired) and now run by the School of Physics and Astronomy at the University of Birmingham, UK. The other members of the collaboration are the University of Surrey, with occasional contributions from LPC CAEN and the Ruđer Bošković Institute, Zagreb. The collaboration is funded by the Science and Technology Facilities Council (STFC). The collaboration carries out experiments at many of the world's leading research centres. Due to the nature of the research, experiments must be performed with the use of a particle accelerator and complex detection systems. The group probes the structure of nuclei. Experiments are currently being carried out utilising the following facilities: Previous experiments have taken place using the following accelerators at their respective facilities: | https://en.wikipedia.org/wiki?curid=21177379 |
Ernő Csíki Ernst Csiki, Ernst Dietl or Ernő Csiki (Csíki) (, 1875 in Vulkan – 1954 in Budapest) was a Hungarian entomologist who specialised in Coleoptera. He was, between 1897 and 1932, Keeper of the Collections of the Hungarian Natural History Museum where his collection is conserved. At the time of Ernő Csiki's retirement (1932) the beetle collection contained over 1 million specimens largely due to his purchases and his obtaining funding for expeditions. Csiki wrote several parts of "Coleopterorum Catalogus" and many papers on Carpathian Coleoptera. | https://en.wikipedia.org/wiki?curid=21192785 |
LOWERN is an acronym for six factors that affect climate: latitude, ocean currents, winds, elevation, relief, and nearness to water. A variant of the acronym that drops nearness to water is known as LOWER; in that form, "near water" is sometimes added at the end. | https://en.wikipedia.org/wiki?curid=21201136 |
Franz Dannehl (7 February 1870, Rudolstadt – 1947) was a German entomologist who specialised in Lepidoptera. He was an insect dealer, first in Bözen and then in Munich. His private collection, mainly butterflies and moths from the German Tyrol, is divided among the Museo Civico di Zoologia in Rome, the Zoologische Staatssammlung München in Munich, and the Museum für Naturkunde in Berlin. According to Peter Levenda, Dannehl was a member of the occultist Thule Society and joined the Nazi Party early on. Dannehl also trained as a composer. According to Daniel Gregory Mason, "He studied composition in Brussels, Weimar, and Berlin"; a violin sonata is among the music compositions of his that survive in score. Some of his songs, performed in a concert at the time, were reviewed in "The Monthly Musical Record" (1 February 1901, p. 40) as containing "only slight interest". | https://en.wikipedia.org/wiki?curid=21209667 |
Georg Dieck (28 April 1847 in Zöschen – 21 October 1925 in Zöschen) was a German entomologist and botanist. After attending high school in Naumburg, he studied natural sciences at Jena, where he was a pupil and assistant of Ernst Haeckel. In 1870, he taught in Zöschen at the large arboretum, where over 6000 different tree and shrub species were cultivated. In addition to the maintenance of plant collections, Dieck went on expeditions in the Rockies (1888), in the Caucasus (1891) and Spain (1892), where he collected beetles, plants and mosses, while new taxa such as "Orthotrichum cupulatum" var "baldaccii" were discovered. Further journeys led him to France, Italy and Sicily, Morocco, the Balkans and Turkey. He wrote many scientific papers describing new taxa, and introduced several plants to western cultivation, notably "Ulmus pumila" L. var. "arborea" Litv. from Turkestan. His collections are in the University of Halle-Wittenberg (Biozentrum, World Coleoptera), Muséum national d'histoire naturelle in Paris (Turkey Lepidoptera), and the Natural History Museum in London (Turkey Lepidoptera). His collection of 450 rose species was presented at the world's fair in Paris in 1900. Dieck was a Member of the Société entomologique de France. Several plant taxa are named for Dieck, including "Brachythecium dieckii" and maple "Acer × dieckii" hybrids. | https://en.wikipedia.org/wiki?curid=21220976 |
Paul Dognin (10 May 1847 – 10 August 1931) was a French entomologist who specialised in the Lepidoptera of South America. Dognin named 101 new genera of moths. He was a member of the Royal Belgian Entomological Society and life member of the Société entomologique de France. Part of his collection was purchased by James John Joicey in 1921. The 82,000 other specimens (including 3,000 Dognin types and over 300 Thierry-Mieg types) were sold in 1926 to William Schaus, who then donated it to National Museum of Natural History in Washington, DC. Printed by Charles Oberthür | https://en.wikipedia.org/wiki?curid=21222312 |
Minnesota Geological Survey The Minnesota Geological Survey is a unit of the Newton Horace Winchell School of Earth Sciences at the University of Minnesota. The Survey conducts field mapping and basic and applied research, publishes scientific and popular literature, and advances science education by providing outreach programs in Earth science and technology in Minnesota. | https://en.wikipedia.org/wiki?curid=21239378 |
Patrick H. O'Farrell is a molecular biologist who made crucial contributions to the development of two-dimensional protein electrophoresis and "Drosophila" genetics. He is now a professor of biochemistry at the University of California, San Francisco (UCSF) and has an h-index of 67. O'Farrell received his B.Sc. in 1969 from McGill University in Montreal, Quebec. He then went on to graduate school at the University of Colorado, Boulder, where he worked with Jacques Pène. To optimize the resolution of protein electrophoresis, O'Farrell needed to separate the proteins according to independent parameters. Two parameters were used: isoelectric point and molecular weight. This permitted the simultaneous determination of molecular weight and isoelectric point for the proteins. Because the two parameters are unrelated, it was possible to obtain an almost uniform distribution of protein spots across the two-dimensional gel. Using his technique, O'Farrell was able to resolve 1,100 different components from "Escherichia coli" and predicted his system should be capable of resolving up to 5,000 proteins. | https://en.wikipedia.org/wiki?curid=21241192 |
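The power of combining two independent separation parameters is multiplicative, which is why the 2-D gel resolves so many more spots than either dimension alone. A toy calculation (the band counts are illustrative, not O'Farrell's measured figures):

```python
def combined_spots(n_pi_bands, n_mw_bands):
    """Upper bound on resolvable spots in a 2-D gel: if isoelectric
    focusing resolves n_pi_bands distinct pI positions and SDS-PAGE
    resolves n_mw_bands molecular-weight positions, and the two
    parameters are independent, the grid of combinations multiplies."""
    return n_pi_bands * n_mw_bands

# e.g. 50 pI bands x 100 MW bands gives a 5000-spot grid,
# the same order as O'Farrell's predicted capacity:
capacity = combined_spots(50, 100)
```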
Satya Churn Law (alternately transcribed as Satya Charan Law) (died 11 December 1984) was a wealthy naturalist, amateur ornithologist, educationist and intellectual in Calcutta. He was for a while a treasurer of the Indian Statistical Institute and was a Fellow of the Zoological Society of London and Member of the British Ornithologists' Union. In 1937, Nirad C. Chaudhuri became his literary assistant. He wrote books on a variety of topics including birds ("Pet Birds of Bengal" 1923) based on his experience in keeping aviaries. He was a vice president of the Calcutta Zoological Garden for a while. He founded a journal, "Prakriti", in Bengali for the popularization of science. | https://en.wikipedia.org/wiki?curid=21247166 |
Piper diagram In 1944, Arthur M. Piper proposed an effective graphic procedure for presenting water chemistry data to help in understanding the sources of the dissolved constituents in water. This procedure is based on the premise that the cations and anions in water are generally in chemical equilibrium. A Piper diagram is a graphical representation of the chemistry of one or more water samples. The cations and anions are shown on separate ternary plots. The apexes of the cation plot are calcium, magnesium, and sodium plus potassium; the apexes of the anion plot are sulfate, chloride, and carbonate plus hydrogen carbonate. The two ternary plots are then projected onto a diamond. The diamond is a matrix transformation of a graph of the anions (sulfate + chloride / total anions) and cations (sodium + potassium / total cations). The required matrix transformation of the anion/cation graph is: The Piper diagram is suitable for comparing the ionic composition of a set of water samples, but does not lend itself to spatial comparisons. For geographical applications, the Stiff diagram and Maucha diagram are more applicable, because they can be used as markers on a map. Colour coding of the diagram background allows Piper diagrams to be linked to maps. Water samples shown on the diagram can be grouped into hydrochemical facies: the cation and anion triangles can be divided into regions based on the dominant cation(s) or anion(s), and their combination creates regions in the diamond-shaped part of the diagram. | https://en.wikipedia.org/wiki?curid=21265548 |
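The first step of constructing a Piper diagram can be sketched as follows: normalise the major-ion concentrations (in meq/L) to the percentages plotted on the two ternary diagrams. The function name and the example water are illustrative, not from the source.

```python
def piper_percentages(ca, mg, na_k, cl, so4, hco3_co3):
    """Convert major-ion concentrations (all in meq/L) into the
    percentage coordinates of the cation and anion ternary plots."""
    cat_total = ca + mg + na_k
    an_total = cl + so4 + hco3_co3
    cations = {"Ca": 100.0 * ca / cat_total,
               "Mg": 100.0 * mg / cat_total,
               "Na+K": 100.0 * na_k / cat_total}
    anions = {"Cl": 100.0 * cl / an_total,
              "SO4": 100.0 * so4 / an_total,
              "HCO3+CO3": 100.0 * hco3_co3 / an_total}
    return cations, anions

# A hypothetical calcium-bicarbonate type water:
cats, ans = piper_percentages(ca=3.0, mg=1.0, na_k=1.0,
                              cl=0.5, so4=0.5, hco3_co3=4.0)
```

Each triangle's percentages sum to 100, so a sample is a single point on each ternary plot before projection onto the diamond.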
Victor Manuel Velasco Herrera is a theoretical physicist and researcher at the Institute of Geophysics of the National Autonomous University of Mexico (UNAM). He disagrees with predictions about future climate change, arguing that proponents ignore the most important factor, which he submits is solar activity. In the summer of 2008, he also predicted the world would soon enter a little ice age. In climate change, Velasco claims that the models and forecasts of the U.N. IPCC "are incorrect because they only are based on mathematical models and presented results at scenarios that do not include, for example, solar activity." | https://en.wikipedia.org/wiki?curid=21269541 |
Viking Olver Eriksen (14 April 1922 – 6 March 2014) was a Norwegian nuclear physicist. He was born in Stavanger, and graduated as cand.real. in 1951. In 1952 he was hired at the Institute for Nuclear Energy at Kjeller—the institution is now named the Norwegian Institute for Energy Technology—as head of the physics department. He left this position in 1965 to become assisting director. In 1968 he was managing director for a brief time; from 1971 to 1982 he held this position on a permanent basis. He died in Lillestrøm in March 2014 at the age of 91. | https://en.wikipedia.org/wiki?curid=21271942 |
Philippe Janvier is a French paleontologist, specialising in Palaeozoic vertebrates, who currently works at the Muséum national d'Histoire naturelle in Paris. He has written several books and scientific papers on Palaeozoic vertebrates and contributed to the Tree of Life phylogeny project. He has led the largest paleontology research group in France (currently called the CR2P), located in Paris. Janvier received the award of the "Grand prix scientifique de la Fondation Simone et Cino del Duca (Institut de France)" on June 11, 2008 for his work. He was a founding member of the . He is currently Associate Editor of the Comptes Rendus Palevol (one of the series of the Comptes rendus de l'Académie des sciences) for paleoichthyology. General: Philippe Janvier at Palaeos.org | https://en.wikipedia.org/wiki?curid=21279624 |
Lugeon A Lugeon is a unit devised to quantify the water permeability of bedrock and the hydraulic conductivity resulting from fractures; it is named after Maurice Lugeon, a Swiss geologist who first formulated the method in 1933. More specifically, the test measures the amount of water injected into a segment of a bored hole under a steady pressure; the Lugeon value is defined as the loss of water in litres per minute per metre of borehole at an over-pressure of 1 MPa. Although the test may serve other purposes, its main object is to determine the Lugeon coefficient, which by definition is the water absorption measured in litres per metre of test stage per minute at a pressure of 10 kg/cm² (approximately 1 MN/m², i.e. 1 MPa). | https://en.wikipedia.org/wiki?curid=21291452 |
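The normalisation to the 1 MPa reference pressure can be sketched in code. This assumes, as the simplest interpretation does, laminar flow so that water take scales linearly with applied pressure; the function name and example figures are illustrative.

```python
def lugeon_value(flow_l_per_min, stage_length_m, pressure_mpa):
    """Lugeon value: water take in litres per minute per metre of
    test stage, scaled to the reference over-pressure of 1 MPa
    (linear, i.e. laminar-flow, assumption)."""
    return (flow_l_per_min / stage_length_m) * (1.0 / pressure_mpa)

# A 5 m stage absorbing 10 L/min at 0.5 MPa over-pressure:
lu = lugeon_value(10.0, 5.0, 0.5)
```

At exactly 1 MPa the Lugeon value is simply litres per minute per metre; at other test pressures the linear correction applies.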
Integrated Forecast System The Integrated Forecast System (IFS) is a global numerical weather prediction system jointly developed and maintained by the European Centre for Medium-Range Weather Forecasts (ECMWF) based in Reading, England, and Météo-France based in Toulouse. The version of the IFS run at ECMWF is often referred to as the "ECMWF" or the "European model" in North America, to distinguish it from the American GFS. It comprises a spectral atmospheric model with a terrain-following vertical coordinate system coupled to a 4D-Var data assimilation system. In 1997 the IFS became the first operational forecasting system to use 4D-Var. Both ECMWF and Météo-France use the IFS to make operational weather forecasts, but using a different configuration and resolution (the Météo-France configuration is referred to as ARPEGE). It is one of the predominant global medium-range models in general use worldwide; its most prominent rivals in the 6–10 day medium range include the American Global Forecast System (GFS), the Canadian Global Environmental Multiscale Model (GEM and GDPS) and the UK Met Office Unified Model. ECMWF runs the IFS in several configurations. The highest resolution "HRES" configuration is run every twelve hours out to ten days with a horizontal resolution of 9 km using 137 layers in the vertical. The 51-member ensemble system "ENS" is also run every twelve hours, out to 15 days, with a horizontal resolution of 18 km and 91 layers in the vertical. | https://en.wikipedia.org/wiki?curid=21295446 |
Integrated Forecast System The ECMWF also runs a coarser version of the IFS out to 45 days; this version is run weekly, with output in five-day intervals. There is also a version that runs out one year. All model versions except HRES are coupled to the ocean model NEMO. Many ECMWF member states use ECMWF global forecasts to provide boundary conditions for their own higher resolution, limited domain forecasts. ECMWF forecasts are free to the national weather services of its member states, but a fee is charged to commercial users, while limited operational data (select variables from the HRES and ENS out ten days) is available directly to consumers under the noncommercial Creative Commons license prohibiting derivative works (CC BY-NC-ND). In contrast, output from the GFS and GEM/GDPS is freely licensed to all users. The full IFS source code is available only to the national weather services of ECMWF member states, but the source code for the atmosphere model is available to other non-commercial users in the form of the OpenIFS. The EC-Earth climate model is based on the IFS. | https://en.wikipedia.org/wiki?curid=21295446 |
Jupiter Europa Orbiter As a part of the defunct Europa Jupiter System Mission – Laplace (EJSM/Laplace), the Jupiter Europa Orbiter (JEO) was a proposed orbiter probe slated for lift-off in 2020 and planned for detailed studies of Jupiter's moons Europa and Io, as well as of the Jovian magnetosphere. Its main goal would have been to look for evidence of a possible subsurface ocean. In June 2015, a more economical mission, the Europa Multiple-Flyby Mission ("Europa Clipper"), was approved by NASA and entered the formulation stage. | https://en.wikipedia.org/wiki?curid=21301991 |
Yoseph Imry (Hebrew: יוסף אמרי; 23 February 1939 – 29 May 2018) was an Israeli physicist. He was best known for taking part in the foundation of mesoscopic physics, a relatively new branch of condensed matter physics. It is concerned with how the behavior of systems intermediate in size between the microscopic and macroscopic crosses over between these two regimes. These systems can be handled and addressed by more or less usual macroscopic methods, but their behavior may still show quantum effects. In 1996, 2001 and 2016, Imry received the Rothschild Prize, Israel Prize and Wolf Prize in physics, respectively. He was a member of the European Academy of Sciences and Arts (Salzburg), the European Academy of Sciences, Arts and Letters (Paris), the National Academy of Sciences, the American Physical Society and the Israel Academy of Sciences and Humanities. | https://en.wikipedia.org/wiki?curid=21314351 |
Quintom scenario The quintom scenario (derived from the words "quintessence" and "phantom", as in phantom energy) is a hypothetical scenario involving dark energy. In this scenario, the equation of state w of the dark energy, relating its pressure and energy density, can cross the boundary w = −1 associated with the cosmological constant. The boundary separates phantom-energy-like behavior, with w < −1, from quintessence-like behavior, with w > −1. A no-go theorem shows that this crossing requires at least two degrees of freedom for dark energy models involving ideal gases or scalar fields. If this quintom energy exists, it may indicate how the universe avoids Big Bangs, Big Rips, and other time-like singularities. The quintom scenario was applied in 2008 to produce a model of inflationary cosmology with a Big Bounce instead of a Big Bang singularity, and in 2007 to a Big Bounce model of the universe. | https://en.wikipedia.org/wiki?curid=21321517 |
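As a minimal sketch (function names and the sample trajectory are mine, and the trajectory is assumed to be given as a sequence of sampled w values), the classification by equation of state and the defining boundary crossing can be written as:

```python
def regime(w):
    """Classify dark energy by its equation-of-state parameter w."""
    if w < -1.0:
        return "phantom"
    if w > -1.0:
        return "quintessence"
    return "cosmological constant"

def crosses_boundary(w_history):
    """True if w crosses the w = -1 boundary somewhere along the
    sampled trajectory: the defining feature of a quintom scenario."""
    return any((a + 1.0) * (b + 1.0) < 0.0
               for a, b in zip(w_history, w_history[1:]))
```

A single ideal gas or scalar field cannot realise the crossing, which is why the no-go theorem demands at least two degrees of freedom.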
Stone Wall (Australia) Stone Wall is an escarpment overlooking the Murchison River Gorge about north-east of Kalbarri in Mid West Western Australia. It is of geological interest because it provides outstanding exposures of five Cretaceous formations unconformably overlying the Ordovician Tumblagooda sandstone. The Cretaceous formations contain trace fossils of "Skolithos" and "Cylindricum". It has been visited on geological excursions and is considered an important research site. | https://en.wikipedia.org/wiki?curid=21323543 |
Hamlet (crater) Hamlet is the largest crater on the known part of the surface of Uranus' moon Oberon. It has a diameter of about 206 km and is named after the title character of the play "Hamlet" by William Shakespeare. The crater has a dark floor and is surrounded by a system of bright rays, which are ice ejecta deposited during the impact event. The nature of the dark material on the floor is unknown, but it may have erupted from depth through cryovolcanism. The crater was first imaged by the Voyager 2 spacecraft in January 1986. | https://en.wikipedia.org/wiki?curid=21341051 |
Kurt Peters Kurt Gustav Karl Peters (17 August 1897 – 23 May 1978) was an Austrian chemist. His work focused on the areas of fuel technology, physical chemistry and catalytic reactions, as well as the separation of rare gases and hydrocarbons. After serving in the Austrian army during World War I, he studied chemistry at the Technical University of Vienna (Vienna TH, today Vienna University of Technology) between 1918 and 1921. In 1923 he earned his doctorate at the University of Berlin under Walther Nernst. In 1927, Peters and Friedrich Paneth published their results on the transformation of hydrogen to helium, now known as cold fusion. They later retracted the results, saying they had measured background helium from the air. After working for a number of years as an assistant, in 1928 he was promoted to Head of Department at the Kaiser Wilhelm Institute for Coal Research. In 1937 he moved into industry and worked in the experimental high-pressure department of IG Farben under Matthias Pier in the field of catalyst development. After the Second World War, the American military government appointed him as trustee for a portion of the confiscated property of the company IG Farben. In 1949 he returned to academia and was appointed full professor of fuel technology at the TH Vienna. From 1952 to 1954 he was Dean of the chemistry department and, in the years 1955 and 1956, Rector of the TH Vienna. | https://en.wikipedia.org/wiki?curid=21344722 |
Chemical compound A chemical compound is a chemical substance composed of many identical molecules (or molecular entities) composed of atoms from more than one element held together by chemical bonds. Two atoms of the same element bonded in a molecule do not form a chemical compound, since this would require two different elements. There are four types of compounds, depending on how the constituent atoms are held together: molecules held together by covalent bonds, ionic compounds held together by ionic bonds, intermetallic compounds held together by metallic bonds, and coordination complexes held together by coordinate covalent bonds. A chemical formula specifies the number of atoms of each element in a compound molecule, using the standard abbreviations for the chemical elements and numerical subscripts. For example, a water molecule has the formula H2O, indicating two hydrogen atoms bonded to one oxygen atom. Many chemical compounds have a unique CAS number identifier assigned by the Chemical Abstracts Service. Globally, more than 350,000 chemical compounds (including mixtures of chemicals) have been registered for production and use. A compound can be converted to a different chemical composition by interaction with a second chemical compound via a chemical reaction. In this process, bonds between atoms are broken in both of the interacting compounds, and new bonds are formed. Any substance consisting of two or more different types of atoms (chemical elements) in a fixed stoichiometric proportion can be termed a "chemical compound"; the concept is most readily understood when considering pure chemical substances. | https://en.wikipedia.org/wiki?curid=21347411 |
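The chemical-formula convention described above can be illustrated with a toy parser. This is a sketch, not a full formula grammar: it handles only simple formulas without parentheses or charges.

```python
import re
from collections import Counter

def parse_formula(formula):
    """Count the atoms of each element in a simple chemical formula,
    e.g. 'H2O' -> {'H': 2, 'O': 1}. Element symbols are one capital
    letter optionally followed by one lowercase letter; a missing
    subscript means one atom."""
    counts = Counter()
    for element, digits in re.findall(r"([A-Z][a-z]?)(\d*)", formula):
        counts[element] += int(digits) if digits else 1
    return dict(counts)
```

For example, `parse_formula("H2O")` recovers the two-hydrogen, one-oxygen composition of water directly from the subscript notation.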
Chemical compound It follows from their being composed of fixed proportions of two or more types of atoms that chemical compounds can be converted, via chemical reaction, into compounds or substances each having fewer atoms. The ratio of each element in the compound is expressed in its chemical formula. A chemical formula is a way of expressing information about the proportions of atoms that constitute a particular chemical compound, using the standard abbreviations for the chemical elements, and subscripts to indicate the number of atoms involved. For example, water is composed of two hydrogen atoms bonded to one oxygen atom: the chemical formula is H2O. In the case of non-stoichiometric compounds, the proportions may be reproducible with regard to their preparation, and give fixed proportions of their component elements, but proportions that are not integral [e.g., for palladium hydride, PdH_x (0.02 < x < 0.58)]. Chemical compounds have a unique and defined chemical structure held together in a defined spatial arrangement by chemical bonds. Chemical compounds can be molecular compounds held together by covalent bonds, salts held together by ionic bonds, intermetallic compounds held together by metallic bonds, or the subset of chemical complexes that are held together by coordinate covalent bonds. | https://en.wikipedia.org/wiki?curid=21347411 |
Chemical compound Pure chemical elements are generally not considered chemical compounds, failing the two-or-more-atom-type requirement, though they often consist of molecules composed of multiple atoms (such as the diatomic molecule H2, or the polyatomic molecule S8, etc.). Many chemical compounds have a unique numerical identifier assigned by the Chemical Abstracts Service (CAS): its CAS number. There is varying and sometimes inconsistent nomenclature differentiating substances, which include truly non-stoichiometric examples, from chemical compounds, which require the fixed ratios. Many solid chemical substances—for example many silicate minerals—are chemical substances, but do not have simple formulae reflecting chemical bonding of elements to one another in fixed ratios; even so, these crystalline substances are often called "non-stoichiometric compounds". It may be argued that they are related to, rather than being, chemical compounds, insofar as the variability in their compositions is often due to either the presence of foreign elements trapped within the crystal structure of an otherwise known true "chemical compound", or due to perturbations in structure relative to the known compound that arise because of an excess or deficit of the constituent elements at places in its structure; such non-stoichiometric substances form most of the crust and mantle of the Earth. | https://en.wikipedia.org/wiki?curid=21347411 |
Chemical compound Other compounds regarded as chemically identical may have varying amounts of heavy or light isotopes of the constituent elements, which changes the ratio of elements by mass slightly. Compounds are held together through a variety of different types of bonding and forces; the types of bonds in a compound differ based on the types of elements present. London dispersion forces are the weakest of all intermolecular forces. They are temporary attractive forces that form when the electrons in two adjacent atoms are positioned so that they create a temporary dipole. Additionally, London dispersion forces are responsible for condensing nonpolar substances into liquids and, at sufficiently low temperatures, freezing them into solids. A covalent bond, also known as a molecular bond, involves the sharing of electrons between two atoms. Primarily, this type of bond occurs between elements that fall close to each other on the periodic table of elements, yet it is also observed between some metals and nonmetals. This is due to the mechanism of this type of bond: elements that fall close to each other on the periodic table tend to have similar electronegativities, which means they have a similar affinity for electrons. Since neither element has a stronger tendency to donate or gain electrons, the elements share electrons so that both attain a more stable octet. | https://en.wikipedia.org/wiki?curid=21347411 |
Chemical compound Ionic bonding occurs when valence electrons are completely transferred between elements. Opposite to covalent bonding, this chemical bond creates two oppositely charged ions. The metals in ionic bonding usually lose their valence electrons, becoming positively charged cations. The nonmetal gains the electrons from the metal, making the nonmetal a negatively charged anion. As outlined, ionic bonds occur between an electron donor, usually a metal, and an electron acceptor, which tends to be a nonmetal. Hydrogen bonding occurs when a hydrogen atom bonded to an electronegative atom forms an electrostatic connection with another electronegative atom through interacting dipoles or charges. A compound can be converted to a different chemical composition by interaction with a second chemical compound via a chemical reaction. In this process, bonds between atoms are broken in both of the interacting compounds, and then bonds are reformed so that new associations are made between atoms. Schematically, this reaction could be described as AB + CD → AD + CB, where A, B, C, and D are each unique atoms; and AB, AD, CD, and CB are each unique compounds. | https://en.wikipedia.org/wiki?curid=21347411 |
Homeokinetics is the study of self-organizing, complex systems. Standard physics studies systems at separate levels, such as atomic physics, nuclear physics, biophysics, social physics, and galactic physics. Homeokinetic physics studies the up-down processes that bind these levels. Tools such as mechanics, quantum field theory, and the laws of thermodynamics provide the key relationships. The subject, described as the physics and thermodynamics associated with the up-down movement between levels of systems, originated in the late 1970s work of American physicists Harry Soodak and Arthur Iberall. Complex systems are universes, galaxies, social systems, people, or even those that seem as simple as gases. The basic premise is that the entire universe consists of atomistic-like units bound in interactive ensembles to form systems, level by level, in a nested hierarchy. Homeokinetics treats all complex systems on an equal footing, animate and inanimate, providing them with a common viewpoint. The complexity of studying how they work is reduced by the emergence of common languages in all complex systems. Arthur Iberall, Warren McCulloch and Harry Soodak developed the concept of homeokinetics as a new branch of physics. It began through Iberall's biophysical research for the NASA exobiology program into the dynamics of mammalian physiological processes. They were observing an area that physics has neglected: that of complex systems with their very long internal factory-day delays. | https://en.wikipedia.org/wiki?curid=21348562 |
Homeokinetics They were observing systems associated with nested hierarchy and with an extensive range of time scale processes. It was such connections, referred to as both up-down or in-out connections (as nested hierarchy) and side-side or flatland physics among atomistic-like components (as heterarchy), that became the hallmark of homeokinetic problems. By 1975, they began to put a formal catch-phrase name on those complex problems, associating them with nature, life, humans, mind, and society. The major method of exposition that they began using was a combination of engineering physics and a more academic pure physics. In 1981, Iberall was invited to the Crump Institute for Medical Engineering of UCLA, where he further refined the key concepts of homeokinetics, developing a physical scientific foundation for complex systems. A system is a collective of interacting ‘atomistic’-like entities. The word ‘atomism’ is used to stand both for the entity and the doctrine. As is known from ‘kinetic’ theory, in mobile or simple systems the atomisms share their ‘energy’ in interactive collisions. That so-called ‘equipartitioning’ process takes place within a few collisions. Physically, if there is little or no interaction, the process is considered to be very weak. Physics deals basically with the forces of interaction—few in number—that influence the interactions. They all tend to emerge with considerable force at high ‘density’ of atomistic interaction. | https://en.wikipedia.org/wiki?curid=21348562 |
Homeokinetics In complex systems, the atomisms also have internal processes. They exhibit, in addition to the pair-by-pair interactions, internal actions such as vibrations, rotations, and association. If the energy and time involved internally create a cycle of performance that is very long compared to their pair interactions, the collective system is complex. If you eat a cookie and you do not see the resulting action for hours, that is complex; if boy meets girl and they become ‘engaged’ for a protracted period, that is complex. What emerges from that physics is a broad host of changes in state and stability transitions in state. Viewing Aristotle as having defined a general basis for systems in their static-logical states and trying to identify a logic-metalogic for physics, e.g., metaphysics, homeokinetics is viewed as an attempt to define the dynamics of all those systems in the universe. Ordinary physics is a flatland physics, a physics at some particular level. Examples include nuclear and atomic physics, biophysics, social physics, and stellar physics. Homeokinetic physics combines flatland physics with the study of the up-down processes that bind the levels. Tools such as mechanics, quantum field theory, and the laws of thermodynamics provide key relationships for the binding of the levels, how they connect, and how the energy flows up and down. | https://en.wikipedia.org/wiki?curid=21348562 |
Homeokinetics And whether the atomisms are atoms, molecules, cells, people, stars, galaxies, or universes, the same tools can be used to understand them. Homeokinetics treats all complex systems on an equal footing, animate and inanimate, providing them with a common viewpoint. The complexity of studying how they work is reduced by the emergence of common languages in all complex systems. A homeokinetic approach to complex systems has been applied to ecological psychology, anthropology, geology, bioenergetics, and political science. It has also been applied to social physics, where a homeokinetic analysis shows that one must account for flow variables such as the flow of energy, of materials, of action, reproduction rate, and value-in-exchange. | https://en.wikipedia.org/wiki?curid=21348562 |
Władysław Szafer Institute of Botany The Władysław Szafer Institute of Botany (Polish: Instytut Botaniki im. Władysława Szafera) in Kraków, Poland, is a major European herbarium containing a collection of over 650,000 vascular plants, bryophytes, algae, fungi, lichens, and various plant fossils. The vascular plant specimens are primarily from Central Europe, with a specialization in alpine plants. The bryophytes are Polish, Antarctic and sub-Antarctic, and East African. The fossil plants are largely Central European. Main publications include "Acta Palaeobotanica" and the "Polish Botanical Journal". The herbarium was established in the 1950s by Władysław Szafer, professor of botany and paleobotany at the Jagiellonian University in Kraków. | https://en.wikipedia.org/wiki?curid=21358560 |
Highlands J virus The Highlands J (HJ) virus is a zoonotic alphavirus native to North and South America. It maintains a natural reservoir in the songbird population of freshwater swamps (generally scrub jays and blue jays) and is transmitted by the bite of the female "Culiseta melanura" mosquito. Though nearly identical in structure and natural cycle to the Eastern equine encephalitis virus, it is considerably less virulent than its cousin, causing relatively mild symptoms in its primary avian reservoir and only nominally capable of transmission to mammals. A 1995 study conducted in Florida swampland found that 15% of swamp-dwelling jays tested positive for HJ antibodies, all of which were asymptomatic and in apparent good health. Recorded bird deaths from HJ infection are rare but not unknown, and include several domestic turkeys at a commercial facility and young broiler chickens in an experimental setting. Transmission to equines or humans via mosquito is also possible, though rarer still. During the 1990-1991 St. Louis encephalitis outbreak in Missouri, 4 patients were found to be comorbidly infected with SLE and HJ, though no harmful effects were attributed to the HJ alone. A limited survey of swamp-dwelling rodents in Florida found one cotton mouse and one cotton rat with antibodies to HJ, both asymptomatic. The sole mammalian fatality attributed to HJ was a Florida horse originally diagnosed with Western equine encephalitis in 1964, a death that was redetermined in 1989 to have been caused by HJ. | https://en.wikipedia.org/wiki?curid=21369463 |
Highlands J virus Despite its negligible virulence in humans, HJ is often tested for in US domestic mosquito control programs as an indicator of conditions favorable for other mosquito-borne zoonoses to multiply. | https://en.wikipedia.org/wiki?curid=21369463 |
Orthoretrovirinae is a subfamily of viruses belonging to "Retroviridae", a family of enveloped viruses that replicate in a host cell through the process of reverse transcription. The subfamily currently includes six genera, of which "Lentivirus" contains the human immunodeficiency virus (HIV). These viruses cause a variety of tumors, malignancies and immune deficiency diseases in humans, other mammals and birds. A few, like SIV, apparently cause no disease in their natural hosts. | https://en.wikipedia.org/wiki?curid=21382303 |
Amott test The Amott test is one of the most widely used empirical wettability measurements for reservoir cores in petroleum engineering. The method combines two spontaneous imbibition measurements and two forced displacement measurements. This test defines two different indices: the Amott water index (I_w) and the Amott oil index (I_o). The two Amott indices are often combined to give the Amott–Harvey index, a number between -1 and 1 describing the wettability of a rock in drainage processes. It is defined as: I_AH = I_w - I_o. These two indices are obtained from special core analysis (SCAL) experiments (porous plate or centrifuge) by plotting the capillary pressure curve as a function of the water saturation as shown on figure 1: here S_w0 is the water saturation at zero capillary pressure during the imbibition process, S_wi is the irreducible water saturation and S_or is the residual oil saturation after imbibition; likewise S_o0 is the oil saturation at zero capillary pressure during the secondary drainage process, S_wi is the irreducible water saturation and S_or is the residual non-wetting phase saturation after imbibition. A rock is defined as: | https://en.wikipedia.org/wiki?curid=21391441 |
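Since the article does not give the index formulas explicitly, the following sketch uses the standard textbook definition of the Amott indices (each index as the ratio of spontaneously displaced saturation to total displaced saturation for that phase); the function name and the example numbers are illustrative assumptions:

```python
def amott_harvey(dSw_spont, dSw_forced, dSo_spont, dSo_forced):
    """Amott-Harvey index I_AH = I_w - I_o, where each Amott index is the
    ratio of the saturation change from spontaneous imbibition to the
    total (spontaneous + forced) saturation change for that phase."""
    i_w = dSw_spont / (dSw_spont + dSw_forced)  # Amott water index
    i_o = dSo_spont / (dSo_spont + dSo_forced)  # Amott oil index
    return i_w - i_o

# A strongly water-wet core: water imbibes spontaneously, oil does not.
print(amott_harvey(0.30, 0.10, 0.0, 0.25))  # ~0.75, strongly water-wet
```

A value near +1 indicates a strongly water-wet rock, near -1 a strongly oil-wet rock, and near 0 intermediate wettability.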
Limiting oxygen index The limiting oxygen index (LOI) is the minimum concentration of oxygen, expressed as a percentage, that will support combustion of a polymer. It is measured by passing a mixture of oxygen and nitrogen over a burning specimen and reducing the oxygen level until a critical level is reached. LOI values for different plastics are determined by standardized tests such as ISO 4589 and ASTM D2863. The LOI value also depends on the surrounding temperature of the sample: the percentage of oxygen required for combustion decreases as the surrounding temperature increases. Plastics and cable materials are tested for their LOI values at both ambient and elevated temperatures to understand their oxygen requirement under actual fire conditions. Materials with an LOI greater than the atmospheric oxygen concentration are called fire-retardant materials. | https://en.wikipedia.org/wiki?curid=21391686 |
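The LOI is simply the oxygen fraction of the O2/N2 mixture at the critical point, which can be expressed as a one-line calculation; the flow values in the example are invented for illustration:

```python
def limiting_oxygen_index(o2_flow, n2_flow):
    """LOI (%) = 100 * [O2] / ([O2] + [N2]) at the minimum oxygen
    level that still supports combustion of the specimen."""
    return 100.0 * o2_flow / (o2_flow + n2_flow)

# A specimen that just sustains burning at 2.1 units O2 per 7.9 units N2:
loi = limiting_oxygen_index(2.1, 7.9)
print(loi)  # ~21.0 -- about atmospheric, so not fire retardant
```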
Non-reversing mirror A non-reversing mirror (sometimes referred to as a flip mirror) is a mirror that presents its subject as others would see it, rather than left-right reversed as in an ordinary mirror. A non-reversing mirror can be made by connecting two regular mirrors at their edges at a 90-degree angle. If the join is positioned so that it is vertical, an observer looking into the angle will see a non-reversed image. This can be seen in places such as public toilets where two mirrors are mounted on walls that meet at right angles. Such an image is visible while looking towards the corner where the two mirrors meet. The problem with this type of non-reversing mirror is that there is usually a line down the middle interrupting the image. However, if first-surface mirrors are used, and care is taken to set the angle to exactly 90 degrees, the join can be made almost invisible. Another type of non-reversing mirror can be made by making the mirror concave (curved inwards like a bowl). At a certain distance from the mirror a non-reversed image will appear. The disadvantage of this design is that it only works at that distance. A third type of non-reversing mirror was created by mathematics professor R. Andrew Hicks in 2009. It was created using computer algorithms to generate a "disco ball"-like surface. The thousands of tiny mirrors are angled to create a surface which curves and bends in different directions. The curves direct rays from an object across the mirror's face before sending them back to the viewer, flipping the conventional mirror image. | https://en.wikipedia.org/wiki?curid=21412510 |
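The right-angle arrangement works because two successive reflections undo the handedness flip. A minimal sketch in plain Python (the matrix helper and mirror placements are my own illustrative choices) makes this concrete:

```python
# Two perpendicular mirrors compose to a 180-degree rotation.
# A single reflection has determinant -1 (it reverses handedness);
# the composition has determinant +1, so the image is non-reversed.

def matmul2(a, b):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

reflect_across_x = [[1, 0], [0, -1]]   # mirror lying along the x-axis
reflect_across_y = [[-1, 0], [0, 1]]   # mirror lying along the y-axis

combined = matmul2(reflect_across_y, reflect_across_x)
print(combined)  # [[-1, 0], [0, -1]] -- a 180-degree rotation

det = combined[0][0] * combined[1][1] - combined[0][1] * combined[1][0]
print(det)  # 1: orientation preserved, unlike a single mirror's -1
```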
Non-reversing mirror A patent for a non-reversing mirror was issued to John Joseph Hooker in 1887. | https://en.wikipedia.org/wiki?curid=21412510 |
NGC 1023 is a barred lenticular galaxy, a member of the NGC 1023 group of galaxies in the Local Supercluster. Distance measurements vary from 9.3 to 19.7 million parsecs (30 to 64 million light-years). The supermassive black hole at the core has a mass of . NGC 1023 is included in Halton Arp's Atlas of Peculiar Galaxies under the category "Galaxies with Nearby Fragments", as number 135. | https://en.wikipedia.org/wiki?curid=21416245 |
Chapais (crater) Chapais is a crater on Mars, named after the community of Chapais, Quebec, Canada. Chapais Crater is located at 22.6° south latitude, and 20.6° west longitude. Its diameter is . Its name was adopted in 1976 by the International Astronomical Union's Working Group for Planetary System Nomenclature (IAU/WGPSN). | https://en.wikipedia.org/wiki?curid=21422050 |
John Krige () is a historian of science and technology and the Kranzberg Professor at the School of History, Technology and Society, Georgia Institute of Technology, Atlanta. Krige originally trained as a physical chemist, earning a PhD in the subject from the University of Pretoria. After earning a PhD in philosophy at the University of Sussex in the United Kingdom in 1979, Krige's intellectual career has been in the history of science and technology, including notable contributions to the projects to write the history of CERN and of the European Space Agency in the 1980s and 1990s. His main focus is on the place of science and technology in the foreign policies of governments, both intra-European and between the U.S. and Western Europe, during the Cold War. In 2000, Krige became a professor at Georgia Institute of Technology's School of History and Sociology. As a Francis Bacon Award recipient, Krige became a visiting professor at Caltech's Division of Humanities and Social Sciences. | https://en.wikipedia.org/wiki?curid=21422193 |
Alexander Nikolsky Alexander Mikhailovich Nikolsky (February 18, 1858 – December 8, 1942) was a Russian and Ukrainian zoologist born in Astrakhan. From 1877 to 1881, he studied at the University of St. Petersburg, earning his doctorate several years later, in 1887. From 1881 to 1891, he took part in numerous expeditions to Siberia, the Caucasus, Persia, Japan, and elsewhere. In 1887 he became an associate professor in St. Petersburg, later becoming director of the herpetology department at the Zoological Museum of the Academy of Sciences (1895). In 1903 he relocated as a professor to Kharkiv University. In 1919 he was elected a member of the Academy of Sciences of Ukraine. Among his written works were "Herpetologia Caucasica" (1913), and volumes on reptiles and amphibians that were part of the series "Fauna of Russia and Adjacent Countries". He is the taxonomic authority of 26 reptile species. The viper "Vipera nikolskii" (Nikolsky's adder) and the turtle subspecies "Testudo graeca nikolskii" (Nikolsky's tortoise) are named in his honor. Today in Russia, the "Nikolsky Herpetological Society" commemorates his name. | https://en.wikipedia.org/wiki?curid=21429052 |
Tibetan astronomy The Tantra of Kalachakra is the basis of Tibetan astronomy. It explains some phenomena in a manner similar to modern astronomy; for example, a solar eclipse is described as the Moon passing between the Sun and the Earth. In 1318, the 3rd Karmapa received a vision of Kalachakra, which he used to introduce a revised system of astronomy and astrology named the "Tsurphu Tradition of Astrology" (Tibetan: Tsur-tsi), which is still used in the Karma Kagyu school for the calculation of the Tibetan calendar. | https://en.wikipedia.org/wiki?curid=21438307 |
Lipid bilayer mechanics is the study of the physical material properties of lipid bilayers, classifying bilayer behavior with stress and strain rather than biochemical interactions. Local point deformations such as membrane protein interactions are typically modelled with the complex theory of biological liquid crystals, but the mechanical properties of a homogeneous bilayer are often characterized in terms of only three mechanical elastic moduli: the area expansion modulus K_a, the bending modulus K_b and the edge energy Λ. For fluid bilayers the shear modulus is by definition zero, as the free rearrangement of molecules within the plane means that the structure will not support shear stresses. These mechanical properties affect several membrane-mediated biological processes. In particular, the values of K_a and K_b affect the ability of proteins and small molecules to insert into the bilayer. Bilayer mechanical properties have also been shown to alter the function of mechanically activated ion channels. Since lipid bilayers are essentially a two-dimensional structure, K_a is typically defined only within the plane. Intuitively, one might expect this modulus to vary linearly with bilayer thickness, as it would for a thin plate of isotropic material. In fact this is not the case: K_a is only weakly dependent on bilayer thickness. | https://en.wikipedia.org/wiki?curid=21451766 |
Lipid bilayer mechanics The reason for this is that the lipids in a fluid bilayer rearrange easily, so, unlike a bulk material where the resistance to expansion comes from intermolecular bonds, the resistance to expansion in a bilayer is a result of the extra hydrophobic area exposed to water upon pulling the lipids apart. Based on this understanding, a good first approximation of K_a for a monolayer is 2γ, where γ is the surface tension of the water-lipid interface. Typically γ is in the range of 20-50 mJ/m². To calculate K_a for a bilayer it is necessary to multiply the monolayer value by two, since a bilayer is composed of two monolayer leaflets. Based on this calculation, the estimate of K_a for a lipid bilayer should be 80-200 mN/m (note: N/m is equivalent to J/m²). It is not surprising, given this understanding of the forces involved, that studies have shown that K_a varies strongly with solution conditions but only weakly with tail length and unsaturation. The compression modulus is difficult to measure experimentally because of the thin, fragile nature of bilayers and the consequently low forces involved. One method utilized has been to study how vesicles swell in response to osmotic stress. This method is, however, indirect, and measurements can be perturbed by polydispersity in vesicle size. A more direct method of measuring K_a is the pipette aspiration method, in which a single giant unilamellar vesicle (GUV) is held and stretched with a micropipette. | https://en.wikipedia.org/wiki?curid=21451766 |
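The estimate described above (2γ per leaflet, two leaflets) can be written as a short calculation; the function name and the specific γ values are illustrative choices:

```python
def bilayer_area_modulus(gamma):
    """First-approximation area expansion modulus K_a of a bilayer:
    2*gamma per monolayer leaflet, times two leaflets = 4*gamma.
    gamma in mJ/m^2 (equivalently mN/m); result in mN/m."""
    return 2.0 * (2.0 * gamma)

# With the quoted 20-50 mJ/m^2 range for the water-lipid surface tension:
print(bilayer_area_modulus(20))  # 80.0 mN/m
print(bilayer_area_modulus(50))  # 200.0 mN/m
```

This reproduces the 80-200 mN/m range quoted in the text.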
Lipid bilayer mechanics More recently, atomic force microscopy (AFM) has been used to probe the mechanical properties of suspended bilayer membranes, but this method is still under development. One concern with all of these methods is that, since the bilayer is such a flexible structure, considerable thermal fluctuations exist in the membrane at many length scales, down to sub-microscopic. Thus, forces initially applied to an unstressed membrane are not actually changing the lipid packing but are rather “smoothing out” these undulations, resulting in erroneous values for mechanical properties. This can be a significant source of error: without the thermal correction, typical values for K_a are 100-150 mN/m, and with the thermal correction this would change to 220-270 mN/m. The bending modulus K_b is defined as the energy required to deform a membrane from its natural curvature to some other curvature. For an ideal bilayer the intrinsic curvature is zero, so this expression is somewhat simplified. The bending modulus, compression modulus and bilayer thickness t are related by a relation of the form K_b ∝ K_a t², such that if two of these parameters are known the other can be calculated. This relationship derives from the fact that to bend the membrane the inner face must be compressed and the outer face must be stretched. The thicker the membrane, the more each face must deform to accommodate a given curvature (see bending moment). Many of the values for K_b in the literature have actually been calculated from experimentally measured values of K_a and t. | https://en.wikipedia.org/wiki?curid=21451766 |
Lipid bilayer mechanics This relation holds only for small deformations, but this is generally a good approximation, as most lipid bilayers can support only a few percent strain before rupturing. Only certain classes of lipids can form bilayers. Two factors primarily govern whether a lipid will form a bilayer or not: solubility and shape. For a self-assembled structure such as a bilayer to form, the lipid should have a low solubility in water, which can also be described as a low critical micelle concentration (CMC). Above the CMC, molecules will aggregate and form larger structures such as bilayers, micelles or inverted micelles. The primary factor governing which structure a given lipid forms is its shape (i.e., its intrinsic curvature). Intrinsic curvature is defined by the ratio of the diameter of the head group to that of the tail group. For two-tailed PC lipids, this ratio is nearly one, so the intrinsic curvature is nearly zero. Other headgroups such as PS and PE are smaller, and the resulting diacyl (two-tailed) lipids thus have a negative intrinsic curvature. Lysolipids tend to have positive spontaneous curvature because they have one rather than two alkyl chains in the tail region. If a particular lipid deviates too far from zero intrinsic curvature, it will not form a bilayer. Edge energy is the energy per unit length of a free edge contacting water. This can be thought of as the work needed to create a hole in the bilayer of unit length L. | https://en.wikipedia.org/wiki?curid=21451766 |
Lipid bilayer mechanics The origin of this energy is the fact that creating such an interface exposes some of the lipid tails to water, which is unfavorable. The edge energy is also an important parameter in biological phenomena, as it regulates the self-healing properties of the bilayer following electroporation or mechanical perforation of the cell membrane. Unfortunately, this property is difficult both to measure experimentally and to calculate. One of the major difficulties in calculation is that the structural properties of this edge are not known. The simplest model would be no change in bilayer orientation, such that the full length of the tail is exposed. This is a high-energy conformation, and to stabilize this edge, it is likely that some of the lipids rearrange their head groups to point out along a curved boundary. The extent to which this occurs is currently unknown, and there is some evidence that both hydrophobic (tails straight) and hydrophilic (heads curved around) pores can coexist. | https://en.wikipedia.org/wiki?curid=21451766 |
Suctorial pertains to the adaptation for sucking or suction, as possessed by marine parasites such as the cookiecutter shark, specifically in a specialised lip organ enabling attachment to the host. Suctorial organs of a different form are possessed by the Solifugae arachnids, enabling the climbing of smooth, vertical surfaces. Another variation on the suctorial organ can be found as part of the glossa (proboscis) of the Masarinae (pollen wasps), enabling nectar feeding from the deep and narrow corollae of flowers. | https://en.wikipedia.org/wiki?curid=21469177 |
Subsidence (atmosphere) Subsidence, in the Earth's atmosphere, is most commonly caused by cooling: as air cools, it becomes denser and sinks toward the ground, while warm air becomes less dense and rises (atmospheric convection). Subsiding air is cold and dry and raises atmospheric pressure, forming a high-pressure or anticyclonic area. Subsidence generally causes high barometric pressure as more air moves into the same space: the polar highs are areas of almost constant subsidence, as are the horse latitudes, and these areas of subsidence are the sources of much of the world's prevailing winds. Subsidence also causes many smaller-scale weather phenomena, such as morning fog. An extreme form of subsidence is a downburst, which can result in damage similar to that produced by a tornado. A milder form of subsidence is referred to as a downdraft. | https://en.wikipedia.org/wiki?curid=21478909 |
Differential stress is the difference between the greatest and the least compressive stress experienced by an object. In both the geological and the civil engineering convention, σ1 is the greatest compressive stress and σ3 is the least, so the differential stress is σD = σ1 - σ3. In other engineering fields and in physics, σ3 is the greatest compressive stress and σ1 is the least, so σD = σ3 - σ1. These conventions originated because geologists and civil engineers (especially soil mechanicians) are often concerned with failure in compression, while many other engineers are concerned with failure in tension. A further reason for the second convention is that it allows a positive stress to cause a compressible object to increase in size, making the sign convention self-consistent. In structural geology, differential stress is used to assess whether tensile or shear failure will occur when a Mohr circle (plotted using σ1 and σ3) touches the failure envelope of the rocks. If the differential stress is less than four times the tensile strength of the rock, then extensional failure will occur. If the differential stress is more than four times the tensile strength of the rock, then shear failure will occur. | https://en.wikipedia.org/wiki?curid=21486509 |
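The four-times-tensile-strength criterion lends itself to a direct check; this sketch assumes the geological sign convention, and the example stress values are invented for illustration:

```python
def failure_mode(sigma1, sigma3, tensile_strength):
    """Geological convention: sigma1 >= sigma3, compression positive.
    Extensional failure when the differential stress is less than four
    times the tensile strength; shear failure when it is greater."""
    differential = sigma1 - sigma3
    if differential < 4 * tensile_strength:
        return "extensional"
    return "shear"

# A rock with 10 MPa tensile strength (threshold 40 MPa):
print(failure_mode(55, 25, 10))  # extensional (differential 30 MPa)
print(failure_mode(80, 25, 10))  # shear (differential 55 MPa)
```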
Epoxy putty refers to a group of room-temperature-hardening substances used as space-filling adhesives. Exact compositions vary according to manufacturer and application. They are stored until use as two components of clay-like consistency. Kneading the two components into each other initiates an exothermic chemical reaction that activates the substance for use by catalyzing an epoxide polymerisation reaction. Unlike many other types of glue, an epoxy adhesive can fill gaps and even be molded into a structural part. Some makers claim in advertising that their cured products can be drilled and tapped, and that they quickly cure "hard as steel" (as measured by Shore hardness), though they are much weaker than steel in tensile strength and shear strength. | https://en.wikipedia.org/wiki?curid=21490478 |
Subglacial channel A subglacial meltwater channel is a channel beneath an ice mass, such as an ice sheet or valley glacier, roughly parallel to the main ice flow direction. These meltwater channels vary in size, ranging from very small channels about a metre deep and wide to large valleys up to a kilometre wide. The dimensions of these channels are regulated by several factors: water temperature, meltwater volume, debris content in the water, ice wall closure rates (governed by the ice thickness) and squeezing of fluidized sediment. In the glaciological literature three forms of subglacial meltwater channels are commonly mentioned. The first type of channel is the "R-channel", after Hans Röthlisberger, who initiated work on water pressures in tubes under glaciers. These are semi-circular channels cut upward into the ice. The balance between channel enlargement by viscous heating and closure by ice deformation when the channels are water-filled determines their size and water pressure. He stated an equation in which formula_2 is the discharge, formula_3 and formula_4 are the same as in Glen's flow law, formula_5 is the steady-state pressure, formula_6 is the initial pressure, and formula_7 is the distance upstream. The second type mentioned is the H-channel, after Roger Hooke. These channels are similar to R-channels, cut upward into the ice and tending to follow the local bed slope, but are broader and flatter than R-channels. | https://en.wikipedia.org/wiki?curid=21507189 |
Subglacial channel Such channels form where water flows at atmospheric pressure beneath thin ice and on steep down-glacier bed slopes. The final type, the N-channel (after John Nye), is incised into bedrock, perhaps suggesting long-term channel stability under some glaciers. | https://en.wikipedia.org/wiki?curid=21507189 |
Kramers' law is a formula for the spectral distribution of X-rays produced by an electron hitting a solid target. The formula concerns only bremsstrahlung radiation, not the element-specific characteristic radiation. It is named after its discoverer, the Dutch physicist Hendrik Anthony Kramers. The formula for Kramers' law is usually given as the distribution of intensity (photon count) I against the wavelength λ of the emitted radiation: I(λ) dλ = K (λ/λ_min - 1) (1/λ²) dλ. The constant K is proportional to the atomic number of the target element, and λ_min is the minimum wavelength given by the Duane–Hunt law. The maximum intensity is K/(4 λ_min²), at λ = 2 λ_min. The intensity described above is a particle flux and not an energy flux, as can be seen from the fact that the integral over wavelengths from λ_min to infinity is infinite. However, the integral of the energy flux is finite. To obtain a simple expression for the energy flux, first change variables from λ (the wavelength) to ω (the angular frequency) using λ = 2πc/ω and dλ = -(2πc/ω²) dω. Now I(ω) = (K/2πc)(ω_max/ω - 1) is the quantity which is integrated over ω from 0 to ω_max to get the total number (still infinite) of photons, where ω_max = 2πc/λ_min. The energy flux, which we will call J (but which may also be referred to as the "intensity", in conflict with the above name for I), is obtained by multiplying I(ω) by the photon energy ħω: J(ω) = (Kħ/2πc)(ω_max - ω) for ω ≤ ω_max, and J(ω) = 0 for ω > ω_max. It is a linear function that is zero at the maximum photon energy ħω_max. | https://en.wikipedia.org/wiki?curid=21508600 |
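The wavelength form of the distribution and the location and height of its maximum can be checked numerically; the units and the value of K below are arbitrary choices for illustration:

```python
def kramers_intensity(wavelength, lam_min, K=1.0):
    """Kramers' distribution I(lam) = K * (lam/lam_min - 1) / lam**2,
    zero below the Duane-Hunt cutoff lam_min."""
    if wavelength < lam_min:
        return 0.0
    return K * (wavelength / lam_min - 1.0) / wavelength ** 2

lam_min = 0.5  # arbitrary units
# The peak should sit at 2*lam_min with height K / (4 * lam_min**2):
grid = [lam_min + 0.001 * i for i in range(4000)]
peak = max(grid, key=lambda lam: kramers_intensity(lam, lam_min))
print(round(peak, 3))               # ~1.0, i.e. 2 * lam_min
print(kramers_intensity(1.0, 0.5))  # 1.0 = K / (4 * 0.25)
```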
3C 286 3C 286, also known by its position as 1328+307 (B1950 coordinates), is a quasar at redshift 0.8493, corresponding to a radial velocity of 164,137 km/s. It is part of the Third Cambridge Catalogue of Radio Sources. 3C 286 is one of four primary calibrators used by the Very Large Array (along with 3C 48, 3C 138, and 3C 147). Visibilities of all other sources are calibrated using observed visibilities of one of these four calibrators. | https://en.wikipedia.org/wiki?curid=21525637 |
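The quoted radial velocity is consistent with the redshift under the special-relativistic Doppler relation; the short check below is an illustrative aside, not part of the source:

```python
# v/c = ((1+z)^2 - 1) / ((1+z)^2 + 1) for a purely radial relativistic redshift.
C_KM_S = 299_792.458  # speed of light in km/s

def radial_velocity(z):
    """Radial velocity (km/s) implied by redshift z."""
    f = (1.0 + z) ** 2
    return C_KM_S * (f - 1.0) / (f + 1.0)

print(round(radial_velocity(0.8493)))  # 164137 -- matching the quoted value
```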
Stephan von Breuning (entomologist) Stephan von Breuning (21 November 1894 – 11 March 1983) was an Austrian entomologist who specialised in Coleoptera, particularly Cerambycidae. An amateur working on the rich collections of the Muséum national d'Histoire naturelle, he described 7894 taxa of Cerambycidae. The complete list of his entomological works has been published in the "Bulletin de la Société Sciences Nat", number 41. One of his most famous works remains "Études sur les Lamiaires", published in "Novitates Entomologicae", 1934–1946. He provided a photograph, together with a text, to be published after his death. Von Breuning lived with his wife in a small studio on rue Durantin, in the 18th arrondissement of Paris, at the top of an old building. | https://en.wikipedia.org/wiki?curid=21527133 |
Eigencolloid is a term derived from the German language ("eigen": own) and used to designate colloids made of pure phases. Most often such colloids are formed by the hydrolysis of heavy-metal cations or radionuclides, such as, e.g., Tc(OH)₄, Th(OH)₄, U(OH)₄, Pu(OH)₄, or Am(OH)₃. Colloids have been suspected of enabling the long-range transport of plutonium on the Nevada Test Site. | https://en.wikipedia.org/wiki?curid=21529260 |
Colloid-facilitated transport designates a transport process by which colloidal particles serve as transport vectors for diverse contaminants in surface water (sea water, lakes, rivers, fresh water bodies) and in underground water circulating in fissured rocks (limestone, sandstone, granite, ...). The transport of colloidal particles in surface soils and in the ground can also occur, depending on the soil structure, soil compaction, and the particle size, but the importance of colloidal transport was only given sufficient attention during the 1980s. Radionuclides, heavy metals, and organic pollutants easily sorb onto colloids suspended in water, and these colloids can act as contaminant carriers. Various types of colloids are recognised: inorganic colloids (clay particles, silicates, iron oxy-hydroxides, ...) and organic colloids (humic and fulvic substances). When heavy metals or radionuclides form their own pure colloids, the term "eigencolloid" is used to designate pure phases, e.g., Tc(OH)₄, Th(OH)₄, U(OH)₄, Am(OH)₃. Colloids have been suspected of enabling the long-range transport of plutonium on the Nevada Nuclear Test Site. They have been the subject of detailed studies for many years. However, the mobility of inorganic colloids is very low in compacted bentonites and in deep clay formations because of the process of ultrafiltration occurring in dense clay membranes. The question is less clear for small organic colloids, often mixed in porewater with truly dissolved organic molecules. | https://en.wikipedia.org/wiki?curid=21529462 |
Drought rhizogenesis is an adaptive root response to drought stress. New emerging roots are short, swollen, and hairless, capable of retaining turgor pressure and resistant to prolonged desiccation. Upon rewatering, they are capable of quickly forming an absorbing root surface and hair growth. This rhizogenesis has been called a drought tolerance strategy for after-stress recovery. These drought-induced short roots can be found either on both the tap root and lateral roots or on lateral roots only. These patterns are most likely a reflection of the plants' individual ability to maintain meristematic activity under low water potential. This morphological phenomenon has been found in some families of angiosperm dicot perennials and has been documented in "Arabidopsis thaliana". | https://en.wikipedia.org/wiki?curid=21529867 |
Olav Holt (born 7 January 1935) is a Norwegian physicist. Holt was born in Hedrum, and took his dr.philos. degree in 1963. He is a specialist in upper-atmosphere physics and radio wave propagation in the ionosphere. Holt was hired as professor at the University of Tromsø in 1969, where he served as rector from 1973 to 1977. He is a fellow of the Norwegian Academy of Technological Sciences. In 2003 he was proclaimed Knight, First Class of the Royal Norwegian Order of St. Olav. His wife is Tordis Holt, and he is the father of the Norwegian bestselling author and former Minister of Justice Anne Holt, and of the cardiologist dr.med. Even Holt. | https://en.wikipedia.org/wiki?curid=21535761 |
Hydroxy ketone In organic chemistry, a hydroxy ketone (often referred to simply as a ketol) is a functional group consisting of a ketone flanked by a hydroxyl group. In the two main classes, the hydroxyl group is placed either in the alpha position, on the carbon adjacent to the carbonyl (an alpha-hydroxy ketone, RC(O)C(OH)R'R''), or in the beta position, one carbon further from the carbonyl (a beta-hydroxy ketone, e.g. RC(O)CH2C(OH)R'R''). | https://en.wikipedia.org/wiki?curid=21539219 |
Aestivation (botany) Aestivation or estivation is the positional arrangement of the parts of a flower within a flower bud before it has opened. Aestivation is also sometimes referred to as praefoliation or prefoliation, but these terms may also mean vernation: the arrangement of leaves within a vegetative bud. Aestivation can be an important taxonomic diagnostic; for example, Malvaceae flower buds have valvate sepals, with the exception of the genera "Fremontodendron" and "Chiranthodendron", which have sometimes been misplaced as a result. The terms used to describe aestivation are the same as those used to describe leaf vernation. Several classes of aestivation are recognised. Vexillary aestivation is a special type of aestivation occurring in plants such as the pea: a large petal, called the standard, encloses two smaller petals, called the wings. | https://en.wikipedia.org/wiki?curid=21543302 |
Norwegian Chemical Society The Norwegian Chemical Society is a professional society for chemists. Formed in 1893, its purpose is to "promote the interest and understanding of chemistry and chemical technology". Its chair is Kenneth Ruud, its vice chair is Hans Henrik Øvrebø, and its board members are Øyvind Mikkelsen, Lillian Zernichow and Kari Herder Kaggerud. | https://en.wikipedia.org/wiki?curid=21547335 |
Helical boundary conditions In mathematics, helical boundary conditions are a variation on periodic boundary conditions. They provide a method for determining the index of a lattice site's neighbours when each lattice site is indexed by just a single coordinate. On a lattice of dimension "d" where the lattice sites are numbered from 1 to "N" and "L" is the width (i.e. number of elements per row) of the lattice in all but the last dimension, the neighbours of site "i" are "i" ± 1, "i" ± "L", ..., "i" ± "L"^("d"−1), all taken modulo "N". It is not necessary that "N" = "L"^"d". Helical boundary conditions make it possible to use only one coordinate to describe arbitrary-dimensional lattices. | https://en.wikipedia.org/wiki?curid=21563622 |
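The neighbour rule above can be sketched in a few lines of Python. This is a minimal illustration; the function name and the 0-based indexing are my own choices, not part of the article.

```python
def helical_neighbors(i, N, L, d=2):
    """Neighbours of site i (0-indexed) on a d-dimensional lattice of N
    sites under helical boundary conditions, where L is the row width in
    all but the last dimension.  Each site is addressed by one coordinate,
    and the lattice wraps around as a single continuous strip."""
    return [(i + s * L**k) % N for k in range(d) for s in (+1, -1)]

# 2-D lattice of 12 sites with rows of width 4: neighbours of site 0
print(helical_neighbors(0, N=12, L=4))  # [1, 11, 4, 8]
```

Note that site 0's left neighbour wraps to site 11, the last site of the lattice, rather than to the end of its own row as it would under ordinary periodic boundary conditions.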
Hyperdeformation In nuclear physics, hyperdeformation refers to theoretically predicted states of an atomic nucleus with an extremely elongated shape and very high angular momentum. Less elongated superdeformed states have been well observed, but the experimental evidence for hyperdeformation is more limited. Hyperdeformed states correspond to an axis ratio of 3:1. They would be caused by a third minimum in the potential energy surface, the second minimum corresponding to superdeformation and the first to normal deformation. Hyperdeformation is predicted to be found in Cd. | https://en.wikipedia.org/wiki?curid=21568490 |
Kenneth H. Hunt Kenneth Henderson Hunt (1920–2002) was Foundation Professor of Engineering at Monash University in Melbourne, Australia, and an expert in kinematics. Hunt was born in Seaford, East Sussex, in the United Kingdom, on 7 June 1920. He studied engineering at Balliol College, Oxford University and, during World War II, served in the Royal Engineers. After the war, he worked in the oil industry until 1949, when he took a lectureship at the University of Melbourne. He moved to Monash in 1960, at which time he was appointed Foundation Professor, and was dean of engineering there from 1961 to 1975. He is the author of "Mechanisms and Motion" (1949) and "Kinematic Geometry of Mechanisms" (1978). | https://en.wikipedia.org/wiki?curid=21572370 |
Bunkering is the supplying of fuel for use by ships, and includes the shipboard logistics of loading fuel and distributing it among available bunker tanks. The term originated in the days of steamships, when the fuel, coal, was stored in bunkers. Nowadays the term bunker is generally applied to the storage of petroleum products in tanks, and to the practice and business of refueling ships. Bunkering operations are located at seaports, and they include the storage of "bunker" (ship) fuels and the provision of the fuel to vessels. In many maritime contracts, such as charterparties, contracts for carriage of goods by sea, and marine insurance policies, the shipowner or ship operator is required to ensure that the ship is "seaworthy". Seaworthiness requires not only that the ship is sound and properly crewed, but also that it is fully fuelled (or "bunkered") at the start of the voyage. If the ship operator wishes to bunker "en route", this must be provided for in a written agreement, or the interruption of the voyage may be deemed to be deviation (a serious breach of contract). If the vessel runs out of fuel in mid-ocean, this is also a serious breach, allowing the insurer to cancel a policy and allowing a consignee to make a cargo claim. It may also give rise to a salvage situation. Particularly in Nigeria, "bunkering" also means the clandestine siphoning off or diverting of oil from pipelines and storage facilities. Such bunkering is often performed crudely, causing both accidents and pollution. | https://en.wikipedia.org/wiki?curid=21598045 |
Medea hypothesis The Medea hypothesis is a term coined by paleontologist Peter Ward for the anti-Gaian hypothesis that multicellular life, understood as a superorganism, is suicidal. In this view, microbial-triggered mass extinctions are attempts to return the Earth to the microbial-dominated state in which it has existed for most of its history. The metaphor refers to the mythological Medea (representing the Earth), who kills her own children (multicellular life). The proposed list of past "suicide attempts" does not include the Cretaceous–Paleogene extinction event, since this was, at least partially, externally induced by a meteor impact. Peter Ward also believes that the current man-made climate change and mass extinction event may be considered the most recent Medean event. As these events are anthropogenic, he postulates that Medean events are not necessarily caused by microbes, but can be caused by intelligent life as well. He believes that the final mass extinction of complex life, roughly 500 million years in the future, will also be considered a Medean event. Plant life that still exists by then will be forced to adapt to a warming and expanding Sun, causing it to remove even more carbon dioxide from the atmosphere (a decline already under way, as the increasing heat from the Sun gradually speeds up the weathering process that removes carbon dioxide from the atmosphere), ultimately accelerating the complete extinction of complex life by making carbon dioxide levels drop to just 10 ppm, below which plants can no longer survive. | https://en.wikipedia.org/wiki?curid=21599341 |
Medea hypothesis However, Ward simultaneously argues that intelligent life such as humans may not necessarily just trigger future Medean events, but may eventually prevent them from occurring. | https://en.wikipedia.org/wiki?curid=21599341 |
Ola Hunderi Ola David Raa Hunderi (born 4 February 1939) is a Norwegian physicist. He took the dr.philos. degree in 1970, and was appointed as professor at the Norwegian Institute of Technology in 1981. From 1987 to 1993 he was the research director at SINTEF. | https://en.wikipedia.org/wiki?curid=21612033 |
CHELPG (CHarges from ELectrostatic Potentials using a Grid-based method) is an atomic charge calculation scheme developed by Breneman and Wiberg, in which atomic charges are fitted to reproduce the molecular electrostatic potential (MESP) at a number of points around the molecule. Charge calculation methods based on fitting the MESP (including CHELPG) are not well suited to the treatment of larger systems, where some of the innermost atoms are located far away from the points at which the MESP is computed. In such a situation, variations of the innermost atomic charges will not lead to significant changes of the MESP outside the molecule, which means accurate values for the innermost atomic charges are not well determined by the MESP outside the molecule. This problem is addressed by the density derived electrostatic and chemical (DDEC) methods, which partition the electron density cloud in order to provide chemically meaningful net atomic charges that approximately reproduce the electrostatic potential surrounding the material. It should be remembered that atomic charges depend on the molecular conformation; representative atomic charges for flexible molecules should hence be computed as averages over several molecular conformations. | https://en.wikipedia.org/wiki?curid=21614781 |
CHELPG A number of alternative MESP charge schemes have been developed, such as those employing Connolly surfaces or geodesic point selection algorithms, in order to improve rotational invariance by increasing the point selection density and reducing anisotropies in the sampled points on the MESP surface. While CHELPG is restricted to non-periodic (e.g., molecular) systems, the DDEC methods can be applied to both non-periodic and periodic materials. CHELPG charges can be computed using popular "ab initio" quantum chemistry packages such as Gaussian, GAMESS-US and ORCA. | https://en.wikipedia.org/wiki?curid=21614781 |
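The core fitting idea described above (least-squares reproduction of the MESP at sample points, constrained so the charges sum to the correct total) can be sketched in NumPy. This is an illustrative sketch, not the actual CHELPG algorithm: the function name is hypothetical, the grid points are taken as given, and real CHELPG additionally prescribes how the grid is chosen (a regular cubic grid outside the van der Waals radii, out to a fixed cutoff).

```python
import numpy as np

def fit_esp_charges(grid_pts, atom_pos, V, total_charge=0.0):
    """Fit atomic point charges q to reproduce the electrostatic
    potential V sampled at grid points (atomic units), minimising
    ||A q - V||^2 subject to sum(q) = total_charge."""
    # Coulomb design matrix: A[i, j] = 1 / |r_i - R_j|
    A = 1.0 / np.linalg.norm(grid_pts[:, None, :] - atom_pos[None, :, :], axis=2)
    n = atom_pos.shape[0]
    # Normal equations augmented with one Lagrange-multiplier row/column
    # that enforces the total-charge constraint.
    M = np.zeros((n + 1, n + 1))
    M[:n, :n] = A.T @ A
    M[:n, n] = M[n, :n] = 1.0
    b = np.append(A.T @ V, total_charge)
    return np.linalg.solve(M, b)[:n]
```

For a potential generated by known point charges the fit recovers them exactly; for a real molecular MESP the fitted charges depend on the conformation, as noted above.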
Alan Sargeson Alan McLeod Sargeson FAA FRS (30 October 1930 – 29 December 2008) was an Australian inorganic chemist. Sargeson was born at Armidale, New South Wales, Australia. He was educated at the University of Sydney and received his Ph.D. there in 1956, supervised by Francis Patrick Dwyer. His first academic appointment was at the University of Adelaide, and in 1958 he rejoined Dwyer at the Australian National University. Sargeson was best known as a coordination chemist with an interest in bioinorganic chemistry. In early work with Dwyer and throughout his career, he studied stereochemistry. His research group investigated the reactions of amine ligands, culminating in the synthesis of the clathrochelates called "sepulchrates". He was elected a Fellow of the Royal Society (FRS) in 1983, a Fellow of the Australian Academy of Science, and a corresponding member of the U.S. National Academy of Sciences. | https://en.wikipedia.org/wiki?curid=21621730 |
NIXT The NIXT, or Normal Incidence X-ray Telescope, was a sounding rocket payload flown in the 1990s by Professor Leon Golub of the Smithsonian Astrophysical Observatory to prototype normal-incidence (conventional) optical designs in extreme ultraviolet (EUV) solar imaging. In the EUV, the surface of the Sun appears dark, and hot structures in the solar corona appear bright; this allows study of the structure and dynamics of the solar corona near the surface of the Sun, which is not possible using visible light. NIXT and its sister rocket, the MSSTA, were the prototypes for all normal-incidence EUV imaging instruments in use today, including SOHO/EIT, TRACE, and STEREO/SECCHI. In 1989, a sounding rocket launch detected soft X-rays coming from a solar flare. It was launched when the solar event was detected, to allow high-resolution imaging of the Sun's corona. Results from the observations were presented in 1990 in several papers. NIXT was launched throughout the early 1990s, and a paper summarizing the results from these missions was published in 1996. A successor program to NIXT was the TXI (Tunable XUV Imager) sounding rocket program. | https://en.wikipedia.org/wiki?curid=21625898 |
Gustav Budde-Lund Gustav Henrik Andreas Budde-Lund (11 January 1846 – 19 September 1911) was a Danish invertebrate zoologist. In 1868, he co-founded the "Entomologisk Forening", alongside Rasmus William Traugott Schlick, Carl August Møller, Andreas Haas and Ivar Frederik Christian Ammitzbøll. He was a student of entomologist J. C. Schiødte, and became a leading authority on terrestrial isopods (woodlice, pill bugs and relatives), describing over 70 genera and around 500 species. He married in 1875 and in 1885 produced his seminal work "Crustacea Isopoda terrestria". The woodlouse genus "Buddelundiella" was named in Budde-Lund's honour by Filippo Silvestri in 1897. | https://en.wikipedia.org/wiki?curid=21633243 |
Whole genome sequencing is ostensibly the process of determining the complete DNA sequence of an organism's genome at a single time. This entails sequencing all of an organism's chromosomal DNA as well as DNA contained in the mitochondria and, for plants, in the chloroplast. In practice, genome sequences that are nearly complete are also called whole genome sequences. Whole genome sequencing has largely been used as a research tool, but was being introduced to clinics in 2014. In the future of personalized medicine, whole genome sequence data may be an important tool to guide therapeutic intervention. Sequencing at the SNP level is also used to pinpoint functional variants from association studies and improve the knowledge available to researchers interested in evolutionary biology, and hence may lay the foundation for predicting disease susceptibility and drug response. Whole genome sequencing should not be confused with DNA profiling, which only determines the likelihood that genetic material came from a particular individual or group, and does not contain additional information on genetic relationships, origin or susceptibility to specific diseases. In addition, whole genome sequencing should not be confused with methods that sequence specific subsets of the genome - such methods include whole exome sequencing (1-2% of the genome) or SNP genotyping (<0.1% of the genome). As of 2017 there were no complete genomes for any mammals, including humans: between 4% and 9% of the human genome, mostly satellite DNA, had not been sequenced. | https://en.wikipedia.org/wiki?curid=21647820 |
Whole genome sequencing It is also known as WGS, full genome sequencing, complete genome sequencing, or entire genome sequencing. The DNA sequencing methods used in the 1970s and 1980s were manual, for example Maxam-Gilbert sequencing and Sanger sequencing. The shift to more rapid, automated sequencing methods in the 1990s finally allowed for sequencing of whole genomes. The first organism to have its entire genome sequenced was "Haemophilus influenzae", in 1995. Thereafter, the genomes of other bacteria and some archaea were sequenced, largely due to their small genome size. "H. influenzae" has a genome of 1,830,140 base pairs of DNA. In contrast, eukaryotes, both unicellular and multicellular, such as "Amoeba dubia" and humans ("Homo sapiens") respectively, have much larger genomes (see C-value paradox). "Amoeba dubia" has a genome of 700 billion nucleotide pairs spread across thousands of chromosomes. Humans contain fewer nucleotide pairs (about 3.2 billion in each germ cell - note the exact size of the human genome is still being revised) than "A. dubia"; however, their genome size far outweighs the genome size of individual bacteria. The first bacterial and archaeal genomes, including that of "H. influenzae", were sequenced by shotgun sequencing. In 1996 the first eukaryotic genome ("Saccharomyces cerevisiae") was sequenced. "S. cerevisiae", a model organism in biology, has a genome of only around 12 million nucleotide pairs, and was the first "unicellular" eukaryote to have its whole genome sequenced. | https://en.wikipedia.org/wiki?curid=21647820 |
Whole genome sequencing The first "multicellular" eukaryote, and animal, to have its whole genome sequenced was the nematode worm "Caenorhabditis elegans", in 1998. Eukaryotic genomes are sequenced by several methods, including shotgun sequencing of short DNA fragments and sequencing of larger DNA clones from DNA libraries such as bacterial artificial chromosomes (BACs) and yeast artificial chromosomes (YACs). In 1999, the entire DNA sequence of human chromosome 22, the shortest human autosome, was published. By the year 2000, the second animal and second invertebrate (yet first insect) genome was sequenced - that of the fruit fly "Drosophila melanogaster" - a popular choice of model organism in experimental research. The first plant genome - that of the model organism "Arabidopsis thaliana" - was also fully sequenced by 2000. By 2001, a draft of the entire human genome sequence was published. The genome of the laboratory mouse "Mus musculus" was completed in 2002. In 2004, the Human Genome Project published an incomplete version of the human genome. In 2008, a group from Leiden, the Netherlands, reported the sequencing of the first female human genome (Marjolein Kriek). Currently thousands of genomes have been wholly or partially sequenced. Almost any biological sample containing a full copy of the DNA—even a very small amount of DNA or ancient DNA—can provide the genetic material necessary for full genome sequencing. | https://en.wikipedia.org/wiki?curid=21647820 |
Whole genome sequencing Such samples may include saliva, epithelial cells, bone marrow, hair (as long as the hair contains a hair follicle), seeds, plant leaves, or anything else that has DNA-containing cells. The genome sequence of a single cell selected from a mixed population of cells can be determined using techniques of "single cell genome sequencing". This has important advantages in environmental microbiology in cases where a single cell of a particular microorganism species can be isolated from a mixed population by microscopy on the basis of its morphological or other distinguishing characteristics. In such cases the normally necessary steps of isolation and growth of the organism in culture may be omitted, thus allowing the sequencing of a much greater spectrum of organism genomes. Single cell genome sequencing is being tested as a method of preimplantation genetic diagnosis, wherein a cell from the embryo created by in vitro fertilization is taken and analyzed before embryo transfer into the uterus. After implantation, cell-free fetal DNA can be taken by simple venipuncture from the mother and used for whole genome sequencing of the fetus. Sequencing of nearly an entire human genome was first accomplished in 2000 partly through the use of shotgun sequencing technology. While full genome shotgun sequencing for small (4000–7000 base pair) genomes was already in use in 1979, broader application benefited from pairwise end sequencing, known colloquially as "double-barrel shotgun sequencing" | https://en.wikipedia.org/wiki?curid=21647820 |
Whole genome sequencing As sequencing projects began to take on longer and more complicated genomes, multiple groups began to realize that useful information could be obtained by sequencing both ends of a fragment of DNA. Although sequencing both ends of the same fragment and keeping track of the paired data was more cumbersome than sequencing a single end of two distinct fragments, the knowledge that the two sequences were oriented in opposite directions and were about the length of a fragment apart from each other was valuable in reconstructing the sequence of the original target fragment. The first published description of the use of paired ends was in 1990 as part of the sequencing of the human HPRT locus, although the use of paired ends was limited to closing gaps after the application of a traditional shotgun sequencing approach. The first theoretical description of a pure pairwise end sequencing strategy, assuming fragments of constant length, was in 1991. In 1995 the innovation of using fragments of varying sizes was introduced, and demonstrated that a pure pairwise end-sequencing strategy would be possible on large targets. The strategy was subsequently adopted by The Institute for Genomic Research (TIGR) to sequence the entire genome of the bacterium "Haemophilus influenzae" in 1995, and then by Celera Genomics to sequence the entire fruit fly genome in 2000, and subsequently the entire human genome | https://en.wikipedia.org/wiki?curid=21647820 |
Whole genome sequencing Applied Biosystems, now called Life Technologies, manufactured the automated capillary sequencers utilized by both Celera Genomics and The Human Genome Project. While capillary sequencing was the first approach to successfully sequence a nearly full human genome, it is still too expensive and takes too long for commercial purposes. Since 2005 capillary sequencing has been progressively displaced by high-throughput (formerly "next-generation") sequencing technologies such as Illumina dye sequencing, pyrosequencing, and SMRT sequencing. All of these technologies continue to employ the basic shotgun strategy, namely, parallelization and template generation via genome fragmentation. Other technologies are emerging, including nanopore technology. Though nanopore sequencing technology is still being refined, its portability and potential capability of generating long reads are of relevance to whole-genome sequencing applications. In principle, full genome sequencing can provide the raw nucleotide sequence of an individual organism's DNA. However, further analysis must be performed to provide the biological or medical meaning of this sequence, such as how this knowledge can be used to help prevent disease. Methods for analysing sequencing data are being developed and refined. Because sequencing generates a lot of data (for example, there are approximately six billion base pairs in each human diploid genome), its output is stored electronically and requires a large amount of computing power and storage capacity | https://en.wikipedia.org/wiki?curid=21647820 |
Whole genome sequencing While analysis of WGS data can be slow, it is possible to speed up this step by using dedicated hardware. A number of public and private companies are competing to develop a full genome sequencing platform that is commercially robust for both research and clinical use, including Illumina, Knome, Sequenom, 454 Life Sciences, Pacific Biosciences, Complete Genomics, Helicos Biosciences, GE Global Research (General Electric), Affymetrix, IBM, Intelligent Bio-Systems, Life Technologies, Oxford Nanopore Technologies, and the Beijing Genomics Institute. These companies are heavily financed and backed by venture capitalists, hedge funds, and investment banks. A commonly referenced commercial target for sequencing cost until the late 2010s was $1,000; however, private companies are working to reach a new target of only $100. In October 2006, the X Prize Foundation, working in collaboration with the J. Craig Venter Science Foundation, established the Archon X Prize for Genomics, intending to award $10 million to "the first team that can build a device and use it to sequence 100 human genomes within 10 days or less, with an accuracy of no more than one error in every 1,000,000 bases sequenced, with sequences accurately covering at least 98% of the genome, and at a recurring cost of no more than $1,000 per genome". The Archon X Prize for Genomics was cancelled in 2013, before its official start date. In 2007, Applied Biosystems started selling a new type of sequencer called the SOLiD System. | https://en.wikipedia.org/wiki?curid=21647820 |
Whole genome sequencing The technology allowed users to sequence 60 gigabases per run. In June 2009, Illumina announced that they were launching their own Personal Full Genome Sequencing Service at a depth of 30× for $48,000 per genome. In August, the founder of Helicos Biosciences, Stephen Quake, stated that using the company's Single Molecule Sequencer he sequenced his own full genome for less than $50,000. In November, Complete Genomics published a peer-reviewed paper in "Science" demonstrating its ability to sequence a complete human genome for $1,700. In May 2011, Illumina lowered its Full Genome Sequencing service to $5,000 per human genome, or $4,000 if ordering 50 or more. Helicos Biosciences, Pacific Biosciences, Complete Genomics, Illumina, Sequenom, ION Torrent Systems, Halcyon Molecular, NABsys, IBM, and GE Global appear to all be going head to head in the race to commercialize full genome sequencing. With sequencing costs declining, a number of companies began claiming that their equipment would soon achieve the $1,000 genome: these companies included Life Technologies in January 2012, Oxford Nanopore Technologies in February 2012, and Illumina in February 2014. In 2015, the NHGRI estimated the cost of obtaining a whole-genome sequence at around $1,500. In 2016, Veritas Genetics began selling whole genome sequencing, including a report as to some of the information in the sequencing for $999. In summer 2019 Veritas Genetics cut the cost for WGS to $599. In 2017, BGI began offering WGS for $600 | https://en.wikipedia.org/wiki?curid=21647820 |
Whole genome sequencing However, in 2015 some noted that effective use of whole genome sequencing can cost considerably more than $1,000. Also, as of 2017 parts of the human genome reportedly remained not fully sequenced. Full genome sequencing provides information on a genome that is orders of magnitude larger than that provided by DNA arrays, the previous leader in genotyping technology. For humans, DNA arrays currently provide genotypic information on up to one million genetic variants, while full genome sequencing will provide information on all six billion bases in the human genome, or 3,000 times more data. Because of this, full genome sequencing is considered a disruptive innovation to the DNA array markets, as the accuracy of both ranges from 99.98% to 99.999% (in non-repetitive DNA regions) and the consumables cost of $5,000 per 6 billion base pairs is competitive (for some applications) with DNA arrays ($500 per 1 million base pairs). Whole genome sequencing has established the mutation frequency for whole human genomes. The mutation frequency in the whole genome between generations for humans (parent to child) is about 70 new mutations per generation. An even lower level of variation was found comparing whole genome sequencing in blood cells for a pair of monozygotic (identical) twin centenarians, aged 100. Only 8 somatic differences were found, though somatic variation occurring in less than 20% of blood cells would be undetected. In the specifically protein-coding regions of the human genome, it is estimated that there are about 0.35 mutations that would change the protein sequence between parent/child generations (less than one mutated protein per generation). | https://en.wikipedia.org/wiki?curid=21647820 |
Whole genome sequencing In cancer, mutation frequencies are much higher, due to genome instability. This frequency can further depend on patient age, exposure to DNA damaging agents (such as UV-irradiation or components of tobacco smoke) and the activity/inactivity of DNA repair mechanisms. Furthermore, mutation frequency can vary between cancer types: in germline cells, mutation rates occur at approximately 0.023 mutations per megabase, but this number is much higher in breast cancer (1.18-1.66 somatic mutations per Mb), in lung cancer (17.7) or in melanomas (≈33). Since the haploid human genome consists of approximately 3,200 megabases, this translates into about 74 mutations (mostly in noncoding regions) in germline DNA per generation, but 3,776-5,312 somatic mutations per haploid genome in breast cancer, 56,640 in lung cancer and 105,600 in melanomas. The distribution of somatic mutations across the human genome is very uneven, such that the gene-rich, early-replicating regions receive fewer mutations than gene-poor, late-replicating heterochromatin, likely due to differential DNA repair activity. In particular, the histone modification H3K9me3 is associated with high, and H3K36me3 with low, mutation frequencies. | https://en.wikipedia.org/wiki?curid=21647820 |
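The per-genome figures quoted above are simply the per-megabase rates scaled by the roughly 3,200-megabase haploid genome; a quick arithmetic check:

```python
HAPLOID_MB = 3200  # approximate haploid human genome size, in megabases

rates_per_mb = {  # mutation rates per megabase quoted in the text
    "germline": 0.023,
    "breast cancer (low)": 1.18,
    "breast cancer (high)": 1.66,
    "lung cancer": 17.7,
    "melanoma": 33.0,
}

for source, rate in rates_per_mb.items():
    print(f"{source}: ~{round(rate * HAPLOID_MB):,} mutations per haploid genome")
# germline ~74; breast cancer ~3,776-5,312; lung ~56,640; melanoma ~105,600
```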
Whole genome sequencing In research, whole-genome sequencing can be used in a genome-wide association study (GWAS) - a project aiming to determine the genetic variant or variants associated with a disease or some other phenotype. In 2009, Illumina released its first whole genome sequencers that were approved for clinical, as opposed to research-only, use, and doctors at academic medical centers began quietly using them to try to diagnose what was wrong with people whom standard approaches had failed to help. The price to sequence a genome at that time was US$19,500, which was billed to the patient but usually paid for out of a research grant; one person at that time had applied for reimbursement from their insurance company. For example, one child had needed around 100 surgeries by the time he was three years old, and his doctor turned to whole genome sequencing to determine the problem; it took a team of around 30 people, including 12 bioinformatics experts, three sequencing technicians, five physicians, two genetic counsellors and two ethicists, to identify a rare mutation in the XIAP gene that was causing widespread problems. Due to recent cost reductions (see above), whole genome sequencing has become a realistic application in DNA diagnostics. In 2013, the 3Gb-TEST consortium obtained funding from the European Union to prepare the health care system for these innovations in DNA diagnostics. Quality assessment schemes, health technology assessments and guidelines have to be in place. | https://en.wikipedia.org/wiki?curid=21647820 |
Whole genome sequencing The 3Gb-TEST consortium has identified the analysis and interpretation of sequence data as the most complicated step in the diagnostic process. At the consortium meeting in Athens in September 2014, the consortium coined the word "genotranslation" for this crucial step. This step leads to a so-called "genoreport". Guidelines are needed to determine the required content of these reports. Currently available newborn screening for childhood diseases allows detection of rare disorders that can be prevented or better treated by early detection and intervention. Specific genetic tests are also available to determine an etiology when a child's symptoms appear to have a genetic basis. Full genome sequencing, in addition, has the potential to reveal a large amount of information (such as carrier status for autosomal recessive disorders, genetic risk factors for complex adult-onset diseases, and other predictive medical and non-medical information) that is currently not completely understood, may not be clinically useful to the child during childhood, and may not necessarily be wanted by the individual upon reaching adulthood. Genomes2People (G2P), an initiative of Brigham and Women's Hospital and Harvard Medical School, was created in 2011 to examine the integration of genomic sequencing into clinical care of adults and children. G2P's director, Robert C. Green, had previously led the REVEAL study (Risk Evaluation and Education for Alzheimer's Disease), a series of clinical trials exploring patient reactions to the knowledge of their genetic risk for Alzheimer's. | https://en.wikipedia.org/wiki?curid=21647820 |
Whole genome sequencing In 2018, researchers at Rady Children's Institute for Genomic Medicine in San Diego, CA determined that rapid whole-genome sequencing (rWGS) can diagnose genetic disorders in time to change acute medical or surgical management (clinical utility) and improve outcomes in acutely ill infants. The researchers reported a retrospective cohort study of acutely ill inpatient infants in a regional children's hospital from July 2016 to March 2017. Forty-two families received rWGS for etiologic diagnosis of genetic disorders. The diagnostic sensitivity of rWGS was 43% (eighteen of 42 infants), versus 10% (four of 42 infants) for standard genetic tests (P = .0005). The rate of clinical utility of rWGS (31%, thirteen of 42 infants) was significantly greater than that of standard genetic tests (2%, one of 42; P = .0015). Eleven (26%) infants with diagnostic rWGS avoided morbidity, one had a 43% reduction in likelihood of mortality, and one started palliative care. In six of the eleven infants, the changes in management reduced inpatient cost by $800,000-$2,000,000. These findings replicate a prior study of the clinical utility of rWGS in acutely ill inpatient infants, and demonstrate improved outcomes and net healthcare savings. rWGS merits consideration as a first-tier test in this setting. | https://en.wikipedia.org/wiki?curid=21647820 |
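The percentages reported in the Rady Children's cohort follow directly from the stated counts; a quick check:

```python
n = 42  # infants in the cohort

rwgs_diagnosed, standard_diagnosed = 18, 4  # etiologic diagnoses made
rwgs_utility, standard_utility = 13, 1      # cases with clinical utility

print(f"rWGS diagnostic sensitivity:    {rwgs_diagnosed / n:.0%}")   # 43%
print(f"standard-test sensitivity:      {standard_diagnosed / n:.0%}")  # 10%
print(f"rWGS clinical utility:          {rwgs_utility / n:.0%}")     # 31%
print(f"standard-test clinical utility: {standard_utility / n:.0%}")  # 2%
```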
The introduction of whole genome sequencing may have ethical implications. On one hand, genetic testing can potentially diagnose preventable diseases, both in the individual undergoing genetic testing and in their relatives. On the other hand, genetic testing has potential downsides such as genetic discrimination, loss of anonymity, and psychological impacts such as the discovery of non-paternity. Some ethicists insist that the privacy of individuals undergoing genetic testing must be protected. Indeed, privacy issues can be of particular concern when minors undergo genetic testing. Illumina's CEO, Jay Flatley, claimed in February 2009 that "by 2019 it will have become routine to map infants' genes when they are born". This potential use of genome sequencing is highly controversial, as it runs counter to the ethical norms for predictive genetic testing of asymptomatic minors that are well established in the fields of medical genetics and genetic counseling. The traditional guidelines for genetic testing were developed over the course of several decades, beginning when it first became possible to test for genetic markers associated with disease and prior to the advent of cost-effective, comprehensive genetic screening. When an individual undergoes whole genome sequencing, they reveal information not only about their own DNA sequences but also about the probable DNA sequences of their close genetic relatives | https://en.wikipedia.org/wiki?curid=21647820 |
This information can further reveal useful predictive information about relatives' present and future health risks. Hence, there are important questions about what obligations, if any, are owed to the family members of individuals undergoing genetic testing. In Western/European society, tested individuals are usually encouraged to share important information on any genetic diagnoses with their close relatives, since the importance of the genetic diagnosis for offspring and other close relatives is usually one of the reasons for seeking genetic testing in the first place. Nevertheless, a major ethical dilemma can develop when a patient refuses to share information on a diagnosis of a serious genetic disorder that is highly preventable and where there is a high risk to relatives carrying the same disease mutation. Under such circumstances, the clinician may suspect that the relatives would rather know of the diagnosis, and hence the clinician can face a conflict of interest with respect to patient-doctor confidentiality. Privacy concerns can also arise when whole genome sequencing is used in scientific research studies. Researchers often need to put information on patients' genotypes and phenotypes into public scientific databases, such as locus-specific databases. Although only anonymized patient data are submitted to locus-specific databases, patients might still be identifiable by their relatives in the case of a rare disease or a rare missense mutation | https://en.wikipedia.org/wiki?curid=21647820 |
Public discussion around the introduction of advanced forensic techniques (such as advanced familial searching using public DNA ancestry websites and DNA phenotyping approaches) has been limited, disjointed, and unfocused. As forensic genetics and medical genetics converge toward genome sequencing, issues surrounding genetic data become increasingly connected, and additional legal protections may need to be established. The first nearly complete human genomes sequenced were those of two Americans of predominantly Northwestern European ancestry in 2007 (J. Craig Venter at 7.5-fold coverage, and James Watson at 7.4-fold). This was followed in 2008 by the sequencing of an anonymous Han Chinese man (at 36-fold), a Yoruban man from Nigeria (at 30-fold), a female clinical geneticist (Marjolein Kriek) from the Netherlands (at 7- to 8-fold), and a female Caucasian leukemia patient (at 33- and 14-fold coverage for tumor and normal tissues, respectively). Steve Jobs was among the first 20 people to have their whole genome sequenced, reportedly at a cost of $100,000. At one point, there were 69 nearly complete human genomes publicly available. In November 2013, a Spanish family made their personal genomics data publicly available under a Creative Commons public domain license. The work was led by Manuel Corpas, and the data were obtained by direct-to-consumer genetic testing with 23andMe and the Beijing Genomics Institute. This is believed to be the first such public genomics dataset for a whole family. | https://en.wikipedia.org/wiki?curid=21647820 |
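The "fold coverage" figures quoted for these genomes have a simple arithmetic meaning: the average number of times each base of the genome is spanned by a sequenced read (total sequenced bases divided by genome length). A minimal sketch, using an approximate human genome size and a hypothetical read count rather than the actual figures for any of the genomes above:

```python
# Fold coverage (sequencing depth) = total sequenced bases / genome length.
# The genome size is an approximation; the read count and length below are
# illustrative assumptions, not data from the genomes discussed above.
GENOME_SIZE = 3_200_000_000  # approximate human genome length in base pairs

def fold_coverage(num_reads, read_length, genome_size=GENOME_SIZE):
    """Average number of reads spanning each base of the genome."""
    return num_reads * read_length / genome_size

# e.g. 240 million reads of 100 bp each give 7.5-fold coverage
print(f"{fold_coverage(240_000_000, 100):.1f}-fold")  # -> 7.5-fold
```

Higher coverage (e.g. the 36-fold genome above versus the 7.4-fold one) means each base is observed more often, which reduces the chance that a sequencing error is mistaken for a real variant.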
Gametophore The word gametophore, also known as gametangiophore, is composed of gametangium and "-phore" (Greek Φορά, "to be carried"). In mosses and ferns (Archegoniata) the gametophore is the bearer of the sex organs (gametangia), the female archegonia and the male antheridia. If both archegonia and antheridia are borne on the same plant, the plant is called monoicous; if they occur on separate female and male plants, the plants are called dioicous. In Bryopsida the leafy moss plant (q. v. "Thallus") is called the gametophore. It is the adult form of the haploid gametophyte and develops from the juvenile form, the protonema, under the influence of phytohormones (mainly cytokinins). Whereas the filamentous protonema grows by apical cell division, the gametophore grows by the division of a three-faced apical cell. | https://en.wikipedia.org/wiki?curid=21686918 |