text | source |
|---|---|
Transfer DNA The transfer DNA (abbreviated T-DNA) is the transferred DNA of the tumor-inducing (Ti) plasmid of some species of bacteria such as "Agrobacterium tumefaciens" and "Agrobacterium rhizogenes" (which instead carries a root-inducing, or Ri, plasmid). The T-DNA is transferred from the bacterium into the host plant's nuclear genome. The transfer capability of the Ti plasmid is attributed to two essential regions required for DNA transfer to the host cell: the T-DNA is bordered by 25-base-pair repeats at each end. Transfer is initiated at the right border and terminated at the left border, and requires the "vir" genes of the Ti plasmid. The bacterial T-DNA is about 24,000 base pairs long and contains genes that code for enzymes synthesizing opines and phytohormones. By transferring the T-DNA into the plant genome, the bacterium essentially reprograms the plant cells to grow into a tumor and produce a unique food source for the bacteria. The synthesis of the plant hormones auxin and cytokinin by enzymes encoded in the T-DNA enables the plant cell to grow uncontrollably, forming the crown gall tumors typically induced by "Agrobacterium tumefaciens" infection, whereas "Agrobacterium rhizogenes" causes hairy root disease. The opines are amino acid derivatives used by the bacterium as a source of carbon and energy. | https://en.wikipedia.org/wiki?curid=779824 |
Transfer DNA This natural process of horizontal gene transfer in plants is utilized as a tool for fundamental and applied research in plant biology, through "Agrobacterium tumefaciens"-mediated foreign gene transformation and insertional mutagenesis. "Agrobacterium"-mediated T-DNA transfer is widely used as a tool in biotechnology. For more than two decades, "Agrobacterium tumefaciens" has been exploited to introduce genes into plants, both for basic research and for the commercial production of transgenic crops. In genetic engineering, the tumor-promoting and opine-synthesis genes are removed from the T-DNA and replaced with a gene of interest and/or a selection marker, which is required to establish which plants have been successfully transformed. Examples of selection markers include neomycin phosphotransferase and hygromycin B phosphotransferase (both of which phosphorylate antibiotics), and phosphinothricin acetyltransferase, which acetylates and deactivates phosphinothricin, a potent inhibitor of glutamine synthetase sold in herbicide formulations such as Basta or bialaphos. Another selection system is the use of metabolic markers such as phosphomannose isomerase. "Agrobacterium" is then used as a vector to transfer the engineered T-DNA into plant cells, where it integrates into the plant genome. This method can be used to generate transgenic plants carrying a foreign gene. | https://en.wikipedia.org/wiki?curid=779824 |
Transfer DNA "Agrobacterium tumefaciens" is capable of efficiently transferring foreign DNA to both monocotyledonous and dicotyledonous plants, provided that critically important factors are taken into account, such as the genotype of the plant, the type and age of the inoculated tissue, the kind of vector, the strain of "Agrobacterium", the selection marker genes and selective agents, and the tissue culture conditions. Transfer of the T-DNA into the host cell and its integration into the host nucleus involve multiple steps. First, the bacteria multiply in the wound sap before infection and then attach to the plant cell walls. Expression of the bacterial virulence genes, comprising approximately 10 operons, is activated by the perception of phenolic compounds such as acetosyringone released by wounded plant tissue, and follows cell-cell contact. This is followed by macromolecular translocation from "Agrobacterium" into the cytoplasm of the host cell, transport of the T-DNA together with its associated proteins (the T-complex) into the host cell nucleus, disassembly of the T-complex, stable integration of the T-DNA into the host plant genome, and expression of the transferred genes. The integration of T-DNA into a host genome begins with the formation of a nick at the right border of the Ti plasmid. This nick creates a region of single-stranded DNA running from the left border of the T-DNA to the cut right border. Single-strand-binding proteins then attach to this single-stranded DNA. | https://en.wikipedia.org/wiki?curid=779824 |
Transfer DNA DNA synthesis displaces the single-stranded region, and a second nick at the left border region releases the single-stranded T-DNA fragment. This fragment can then be incorporated into the host genome. "Agrobacterium" has evolved a control system that hijacks host factors and cellular processes, including several pathways of the host-plant defense response, in order to invade the host cell nucleus. To achieve integration of the T-DNA into the target host genome, "Agrobacterium" carries out multiple interactions with host-plant factors, and many "Agrobacterium" virulence proteins encoded by the "vir" genes interact with host plant proteins. Expression of the "vir" genes occurs via the VirA-VirG two-component sensor system and results in the generation of a mobile single-stranded copy of the T-DNA (the T-strand). The processed form of VirB2 is the major component of the T-complex and is required for transformation. VirD2 is the protein that caps the 5′ end of the transferred T-strand by covalent attachment and is transported with it into the host cell cytoplasm. VirE2 is a single-stranded DNA-binding protein that presumably coats the T-strand in the host cytoplasm by cooperative binding; the complex is then directed into the nucleus via interactions with host cell proteins such as importin α, bacterial VirE3, and dynein-like proteins. Several other bacterial virulence effectors, such as VirB5 and VirB7 (the minor components of the T-complex), VirD5, VirE2, VirE3, and VirF, may also interact with proteins of host plant cells. | https://en.wikipedia.org/wiki?curid=779824 |
Transfer DNA The same procedure of T-DNA transfer can be used to disrupt genes via insertional mutagenesis. Not only does the inserted T-DNA sequence create a mutation, but it also 'tags' the affected gene, allowing for its isolation. A reporter gene can be linked to the right end of the T-DNA and transformed along with a plasmid replicon and a selectable antibiotic-resistance gene (such as one conferring hygromycin resistance); such constructs have produced successful T-DNA-induced gene fusions in "Arabidopsis thaliana" at an average efficiency of approximately 30%. Reverse genetics is a functional genomics approach that aims to assign functions to genes and to determine how genes and their products interact. Screening populations of transgenics generated by T-DNA insertional mutagenesis is one of the most powerful ways to study loss of function and ectopic expression, and so to assign a function to the gene under investigation. Hence, this method is widely used to study gene function in plants, such as the model plant "Arabidopsis thaliana". In such gene-disruption studies, "Arabidopsis thaliana" mutants may fall into different phenotypic classes, including seedling-lethal, size-variant, pigment-defective, embryo-defective, reduced-fertility, dramatic (morphological), and physiological mutants. For example, this gene disruption strategy, used for assigning functions to genes defined only by sequence, helped to demonstrate the effect of an En-1 insertion in the flavanone 3-hydroxylase gene ("F3H"). | https://en.wikipedia.org/wiki?curid=779824 |
Transfer DNA Knock-out alleles carrying an En-1 insertion in the flavonol synthase gene ("FLS"), characterized by this reverse genetics approach, showed drastically reduced levels of kaempferol. | https://en.wikipedia.org/wiki?curid=779824 |
Amguid crater Amguid is a meteorite crater in Algeria. It is approximately 65 m deep, and its age is estimated to be less than 100,000 years, probably Pleistocene. The crater is exposed at the surface. It was discovered by Europeans in 1948; the first scientific description was made by Jean-Philippe Lefranc in 1969. | https://en.wikipedia.org/wiki?curid=779866 |
BP Structure The BP Structure, also known as Gebel Dalma, is an exposed impact crater in Libya. It is so called because it was identified by a BP (then British Petroleum) geological survey team. The crater is 2 km in diameter and its age is estimated to be less than 120 million years (Lower Cretaceous or later). | https://en.wikipedia.org/wiki?curid=779903 |
Many-body theory (or many-body physics) is an area of physics which provides the framework for understanding the collective behavior of large numbers of interacting particles, often on the order of Avogadro's number. In general terms, many-body theory deals with effects that manifest themselves only in systems containing large numbers of constituents. While the underlying physical laws that govern the motion of each individual particle may (or may not) be simple, the study of the collection of particles can be extremely complex. In some cases emergent phenomena may arise which bear little resemblance to the underlying elementary laws. Many-body theory plays a central role in condensed matter physics. | https://en.wikipedia.org/wiki?curid=784621 |
Endogeny (biology) Endogenous substances and processes are those that originate from within a system such as an organism, tissue, or cell. The term is chiefly used in biology but also in other fields. Endogenous substances and processes contrast with exogenous ones, such as drugs, which originate from outside of the organism. Cell signalling systems such as hormone and neurotransmitter systems use endogenous substances, and endogenous substances can regulate sleep. Endogenous transcription factors are those manufactured by the cell, as distinguished from cloned transcription factors. Endogenous substances typically have some physiological utility, but they can also be pathologically endogenous. For example, in auto-brewery syndrome, ethanol is endogenously produced within the digestive system through endogenous fermentation of sugars. Endogeneity can, in some biological systems (particularly with viruses and prokaryotes), pertain to DNA incorporated (endogenized) into the organism. However, because of homeostasis, discerning between internal and external influences is often difficult. Endogenous viral elements are DNA sequences derived from viruses that were ancestrally inserted into the genomes of germ cells. These sequences, which may be fragments of viruses or entire viral genomes (proviruses), can persist in the germline, being passed on from one generation to the next as host alleles. Endogenous retroviruses are a type of endogenous viral element. | https://en.wikipedia.org/wiki?curid=790808 |
Endogeny (biology) Endogenous effects can modulate and regulate systems, in conjunction with environmental influences. Endogeny can refer to changes that originate from within a system. Endogenous changes can occur in social systems and can be modelled by Marxian dialectics. Orthogenesis is a similar concept to endogeny but refers to changes within separate systems that result in their evolution along similar paths; the concept of orthogenesis has never been widely favored in evolutionary biology. Endogenous processes can also be pathological. For example, endogenous depression is an atypical type of depression caused by internal effects, such as cognitive and biological stressors. All processes that take place inside Earth (and other planets) are considered endogenous: they make the continents migrate, push the mountains up, and trigger earthquakes and volcanism. Endogenous processes are driven by the warmth that is produced in the core of Earth by radioactivity and gravity. An emotion or behavior is endogenous if it is spontaneously generated from an individual's internal state. A variable is called endogenous if it is explained within the model in which it appears. For example, in a supply and demand model of an agricultural market, the price and quantity traded would be the endogenous variables explained by the model; changes in the weather or in consumer tastes would be exogenous variables that might shift the supply and demand curves. | https://en.wikipedia.org/wiki?curid=790808 |
Endogeny (biology) In political science, something is endogenous if it is actually the result of the action for which it is generally labeled the cause. For example, ethnic violence is generally thought to be caused by ethnic division. However, endogeny would say that ethnic divisions are a result of ethnic violence. | https://en.wikipedia.org/wiki?curid=790808 |
List of geoscience organizations This is a list of organizations dealing with the various geosciences, including geology, geophysics, hydrology, oceanography, petrophysics, and related fields. | https://en.wikipedia.org/wiki?curid=794561 |
Radio noise source A radio noise source is a device that emits radio waves at a certain frequency, used to calibrate radio telescopes such that received data may be compared to a known value, as well as to find the focal point of a telescope soon after construction, so that the wave guide and front end may be properly located. | https://en.wikipedia.org/wiki?curid=794959 |
Noise floor In signal theory, the noise floor is the measure of the signal created from the sum of all the noise sources and unwanted signals within a measurement system, where noise is defined as any signal other than the one being monitored. In radio communication and electronics, this may include thermal noise, black-body radiation, and cosmic noise, as well as atmospheric noise from distant thunderstorms and similar sources, and any other unwanted man-made signals, sometimes referred to as incidental noise. If the dominant noise is generated within the measuring equipment (for example by a receiver with a poor noise figure), then this is an example of an instrumentation noise floor, as opposed to a physical noise floor. These terms are not always clearly defined and are sometimes confused. Avoiding interference between electrical systems is the distinct subject of electromagnetic compatibility (EMC). In a measurement system such as a seismograph, the physical noise floor may be set by incidental noise, which may include nearby foot traffic or a nearby road. The noise floor limits the smallest measurement that can be taken with certainty, since any measured amplitude can on average be no less than the noise floor. A common way to lower the noise floor in electronic systems is to cool the system to reduce thermal noise, when this is the major noise source. In special circumstances, the noise floor can also be artificially lowered with digital signal processing techniques. | https://en.wikipedia.org/wiki?curid=795170 |
Noise floor Signals that are below the noise floor can be detected by using spread spectrum communication techniques, in which a signal of a particular information bandwidth is deliberately spread in the frequency domain, resulting in a signal with a wider occupied bandwidth. | https://en.wikipedia.org/wiki?curid=795170 |
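When thermal noise is the dominant source, the physical noise floor described above follows the Johnson-Nyquist relation P = kTB. A minimal sketch of the calculation (the function name and the conventional 290 K reference temperature are illustrative choices, not taken from the text):

```python
import math

# Boltzmann's constant in J/K
K_B = 1.380649e-23

def thermal_noise_floor_dbm(bandwidth_hz, temperature_k=290.0):
    """Thermal (Johnson-Nyquist) noise power kTB, expressed in dBm.

    At the standard 290 K reference temperature this comes out to
    roughly -174 dBm in a 1 Hz bandwidth; widening the bandwidth by
    a factor of B raises the floor by 10*log10(B) dB.
    """
    power_watts = K_B * temperature_k * bandwidth_hz
    return 10.0 * math.log10(power_watts / 1e-3)  # convert W to dBm

print(round(thermal_noise_floor_dbm(1.0), 1))   # 1 Hz bandwidth: -174.0
print(round(thermal_noise_floor_dbm(1e6), 1))   # 1 MHz bandwidth: -114.0
```

Cooling the system, as the text suggests, lowers the result: at 77 K the same bandwidth gives a floor roughly 5.8 dB lower than at 290 K.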
CLIVAR (Climate Variability and Predictability) is a component of the World Climate Research Programme. Its purpose is to describe and understand climate variability and predictability on seasonal to centennial time-scales, identify the physical processes responsible for climate change, and develop modelling and predictive capabilities for climate. CLIVAR has a number of panels and working groups devoted to the study of climate variability and predictability in different components of the global climate system, including three global panels. Regional panels focus on specific aspects of the climate system. Since the different regions of the ocean are qualitatively different, and given the important role of the oceans in controlling climate over the interannual, decadal, and centennial scales considered by CLIVAR, the subdivision into regional panels is largely based on regions of the ocean system. There are also four national programmes that run largely autonomously but contribute to the international programme. | https://en.wikipedia.org/wiki?curid=797882 |
HyperPhysics is an educational website about physics topics. The information architecture of the website is based on HyperCard, the platform on which the material was originally developed, and on a thesaurus organization, with thousands of controlled links and topic trees organizing material from general to specific. It also exploits concept maps to facilitate smooth navigation. "HyperPhysics" is hosted by Georgia State University and authored by Georgia State faculty member Dr. Rod Nave. Various teaching and education facilitators make use of "HyperPhysics" material through projects and organizations, as do publishers which use SciLinks. Various areas of physics are accessible through broad categories, and related applied mathematics is also covered. | https://en.wikipedia.org/wiki?curid=799988 |
Seligman Crystal The Seligman Crystal is an award of the International Glaciological Society, named after Gerald Seligman. The prize is "awarded from time to time to one who has made an outstanding scientific contribution to glaciology so that the subject is now enriched". Source: International Glaciological Society | https://en.wikipedia.org/wiki?curid=800775 |
Claude Lorius Claude Lorius is a French glaciologist. He has taken part in more than 20 polar expeditions, mostly to Antarctica, and has helped organise many international collaborations, notably the Vostok Station ice core. He was instrumental in the discovery and interpretation of the palaeo-atmospheric information held within ice cores. | https://en.wikipedia.org/wiki?curid=800837 |
Andrey Arkhangelsky Andrey Dmitriyevich Arkhangelsky () (December 8, 1879 – June 16, 1940) was a Russian geologist. He was a professor at Moscow State University. He won the Lenin Prize in 1928. A crater on Mars and Arkhangel'skiy Nunataks in Antarctica are named after him. | https://en.wikipedia.org/wiki?curid=801754 |
Valeri Barsukov Valeri Leonidovich Barsukov (Валерий Леонидович Барсуков) (March 14, 1928 – July 22, 1992) was a Soviet geologist. He worked in comparative planetology and the geochemistry of space. He was director of the V. I. Vernadsky Institute of Geochemistry from 1976 to 1992. In 1987 he received the V.I. Vernadsky Gold Medal for his work. A crater on Mars was named after him. | https://en.wikipedia.org/wiki?curid=803380 |
Helicase-dependent amplification (HDA) is a method for "in vitro" DNA amplification (like the polymerase chain reaction) that takes place at a constant temperature. The polymerase chain reaction (PCR) is the most widely used method for "in vitro" DNA amplification in molecular biology and biomedical research. PCR involves separating the double-stranded DNA into single strands at high temperature (the denaturation step, typically achieved at 95–97 °C), annealing primers to the single-stranded DNA (the annealing step), and copying the single strands to create new double-stranded DNA (the extension step, which requires a DNA polymerase); cycling between these temperatures requires the reaction to be carried out in a thermal cycler. These bench-top machines are large, expensive, and costly to run and maintain, limiting the potential applications of DNA amplification in situations outside the laboratory (e.g., in the identification of potentially hazardous micro-organisms at the scene of an investigation, or at the point of care of a patient). Although PCR is usually associated with thermal cycling, the original patent by Mullis et al. disclosed the use of a helicase as a means of denaturing double-stranded DNA, thereby including isothermal nucleic acid amplification. "In vivo", DNA is replicated by DNA polymerases with various accessory proteins, including a DNA helicase that acts to separate the strands by unwinding the DNA double helix. HDA was developed from this concept, using a helicase (an enzyme) to denature the DNA. | https://en.wikipedia.org/wiki?curid=803558 |
Helicase-dependent amplification Strands of double-stranded DNA are first separated by a DNA helicase and coated by single-stranded DNA (ssDNA)-binding proteins. In the second step, two sequence-specific primers hybridise to the borders of the DNA template. DNA polymerases then extend the primers annealed to the templates to produce double-stranded DNA, and the two newly synthesized DNA products are in turn used as substrates by DNA helicases, entering the next round of the reaction. Thus, a simultaneous chain reaction develops, resulting in exponential amplification of the selected target sequence (see Vincent "et al.", 2004 for a schematic diagram). Since the publication of its discovery, HDA technology has been used for a "simple, easy to adapt nucleic acid test for the detection of "Clostridium difficile"". Other applications include the rapid detection of "Staphylococcus aureus" by the amplification and detection of a short DNA sequence specific to the bacterium. The advantage of HDA is that it provides a rapid method of nucleic acid amplification of a specific target at an isothermal temperature, without requiring a thermal cycler. However, optimisation of primers, and sometimes buffers, is required beforehand by the researcher. Normally, primer and buffer optimisation is tested and achieved through PCR, raising the question of the need to spend extra on a separate system to do the actual amplification. | https://en.wikipedia.org/wiki?curid=803558 |
Helicase-dependent amplification Despite the selling point that HDA negates the use of a thermal cycler and therefore allows research to be conducted in the field, much of the work required to detect potentially hazardous microorganisms is carried out in a research or hospital laboratory setting regardless. At present, mass diagnosis from a great number of samples cannot yet be achieved by HDA, whereas PCR reactions carried out in a thermal cycler that holds multi-well sample plates allow for the amplification and detection of the intended DNA target from as many as 96 samples at once. The reagents for HDA are also relatively expensive compared with those for PCR, the more so since they are supplied as a ready-made kit. | https://en.wikipedia.org/wiki?curid=803558 |
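The chain reaction described above is, at best, a doubling of target copies per round, just as in PCR. A toy back-of-the-envelope model of that exponential growth (the function name and the per-round efficiency parameter are illustrative assumptions, not from the text):

```python
import math

def rounds_to_reach(initial_copies, target_copies, efficiency=1.0):
    """Toy model of exponential amplification (HDA or PCR alike).

    Each round at best doubles the number of target molecules;
    `efficiency` scales the per-round gain (1.0 = perfect doubling,
    so copies multiply by 1 + efficiency each round). Returns the
    number of rounds needed to reach `target_copies`.
    """
    per_round = 1.0 + efficiency
    return math.ceil(math.log(target_copies / initial_copies, per_round))

# From 100 template molecules to ~1e9 copies with perfect doubling:
print(rounds_to_reach(100, 1e9))  # 24 rounds
```

With a more realistic per-round efficiency below 1.0, the same yield takes proportionally more rounds, which is one reason reaction optimisation matters for both techniques.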
Blink comparator A blink comparator is a viewing apparatus formerly used by astronomers to find differences between two photographs of the night sky. It permits rapid switching from viewing one photograph to viewing the other, "blinking" back and forth between the two images taken of the same area of the sky at different times. This allows the user to more easily spot objects in the night sky that have changed position. It was also sometimes known as a blink microscope. It was invented in 1904 by physicist Carl Pulfrich at Carl Zeiss AG, then constituted as Carl-Zeiss-Stiftung. In photographs taken a few days apart, rapidly moving objects such as asteroids and comets would stand out, because they would appear to be jumping back and forth between two positions, while all the distant stars remained stationary. Photographs taken at longer intervals could be used to detect stars with large proper motion, or variable stars, or to distinguish binary stars from optical doubles. The most notable body to be found using this technique is Pluto, discovered by Clyde Tombaugh in 1930. The Projection Blink Comparator (PROBLICOM), invented by amateur astronomer Ben Mayer, is a low-cost version of the professional tool. It consists of two slide projectors with a rotating occluding disk that alternately blocks the images from the projectors. This tool allowed amateur astronomers to contribute to some phases of serious research | https://en.wikipedia.org/wiki?curid=809979 |
Blink comparator In modern times, charge-coupled devices (CCDs) have largely replaced photographic plates, as astronomical images are stored digitally on computers. The blinking technique can easily be performed on a computer screen rather than with a physical blink comparator apparatus as before. The blinking technique is less used today, because image differencing algorithms detect moving objects more effectively than human eyes can. To measure the precise position of a known object whose direction and rate of motion are known, a "track and stack" software technique is used. Multiple images are superimposed such that the moving object is fixed in place; the moving object then stands out as a dot among the star trails. This is particularly effective in cases where the moving object is very faint and superimposing multiple images of it permits it to be seen better. | https://en.wikipedia.org/wiki?curid=809979 |
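The "track and stack" technique described above can be sketched with simple array shifts: each exposure is shifted against the object's known motion before summing, so the mover adds up in place while the fixed stars smear into trails. A minimal illustration (the function and its integer-pixel shifting are simplifying assumptions; real stacking software interpolates sub-pixel shifts and crops the wrapped edges):

```python
import numpy as np

def track_and_stack(frames, times, rate_xy):
    """Shift-and-add stacking along a known motion vector.

    frames  : list of 2-D arrays (same shape), one per exposure
    times   : time offset of each frame from the first, in seconds
    rate_xy : (dx/dt, dy/dt) apparent motion in pixels per second

    Each frame is shifted opposite to the object's motion so the
    mover lands on the same pixel in every exposure; summing then
    reinforces it while stationary stars spread into trails.
    """
    stacked = np.zeros_like(frames[0], dtype=float)
    for frame, t in zip(frames, times):
        dx = int(round(-rate_xy[0] * t))  # column shift
        dy = int(round(-rate_xy[1] * t))  # row shift
        stacked += np.roll(np.roll(frame, dy, axis=0), dx, axis=1)
    return stacked
```

For a faint object whose single-frame signal sits near the noise floor, stacking N aligned exposures grows its peak N-fold, which is why the technique recovers movers too dim to see in any one image.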
Eberhard August Wilhelm von Zimmermann Eberhard August Wilhelm von Zimmermann (August 17, 1743, Uelzen – July 4, 1815, Braunschweig) was a German geographer and zoologist. He studied natural philosophy and mathematics in Leiden, Halle, Berlin, and Göttingen, and in 1766 was appointed professor of mathematics and natural sciences at the Collegium Carolinum in Braunschweig. One of his pupils was the mathematician Carl Friedrich Gauss. From 1789 onward, he served as aulic councillor in Braunschweig. During his career, he travelled widely throughout Europe: Livonia, Russia, Sweden, Denmark, England, France, Germany, Switzerland, and Italy. On his journeys, he conducted research on economic conditions and natural resources. He wrote "Specimen Zoologiae Geographicae Quadrupedum" (1777), one of the first works on the geographical distribution of mammals (zoogeography). He was the author of works on a variety of subjects, such as mathematics, natural sciences, regional studies, and the history of discovery. From 1802 to 1813, he published the "Taschenbuch der Reisen" ("Handbook of Travel"). | https://en.wikipedia.org/wiki?curid=810431 |
Frank Scott Hogg (June 26, 1904 – January 1, 1951) was a Canadian astronomer. Hogg was born in Preston, Ontario to Dr. James Scott Hogg and Ida Barberon. After earning an undergraduate degree from the University of Toronto, Hogg received, in 1929, the second doctorate in astronomy awarded at Harvard University, where he pioneered the study of the spectrophotometry of stars and of the spectra of comets. His supervisor there was Cecilia Payne-Gaposchkin. During World War II, he developed a two-star sextant for air navigation. He was the head of the Department of Astronomy at the University of Toronto and director of the David Dunlap Observatory from 1946 until his death. During this time he pursued the observatory's major research program to study the motions of faint stars in the line of sight. He was married to fellow astronomer Helen Sawyer Hogg from 1930 until his death from a heart attack in 1951. The crater Hogg on the Moon is co-named for him and Arthur Hogg. | https://en.wikipedia.org/wiki?curid=826467 |
Gustaf von Paykull Gustaf von Paykull (21 August 1757 – 28 January 1826) was a Swedish "friherre" (approximately, baron) and Marshal of the Court, as well as an ornithologist and entomologist. He was a member of the Royal Swedish Academy from 1791 and a founder of the natural history museum (Naturhistoriska Riksmuseet) in Stockholm, through his 1819 donation of his extensive zoological collections to the academy (now in the Swedish Museum of Natural History). He was elected a Foreign Honorary Member of the American Academy of Arts and Sciences in 1804. | https://en.wikipedia.org/wiki?curid=826556 |
New Naturalist The New Naturalist Library (also known as "The New Naturalists") is a series of books published by Collins in the United Kingdom, on a variety of natural history topics relevant to the British Isles. The aim of the series at the start was: "To interest the general reader in the wild life of Britain by recapturing the inquiring spirit of the old naturalists." An editors' preface to a 1952 monograph says: "An object of the "New Naturalist" series is the recognition of the many-sidedness of British natural history, and the encouragement of unusual and original developments of its forgotten or neglected facets." The first volume to appear was E.B. Ford's "Butterflies" in 1945. The authors of the series are usually eminent experts, often professional scientists. This gives the series authority, and many volumes are, or have been, authoritative introductory textbooks on their subject. The books are written in a scientific style but are intended to be readable by the non-specialist, and they are an early example of popular science in the media. The books of the series have had considerable influence on many students who later became professional biologists, such as W.D. Hamilton and Mike Majerus. The latter was inspired by Ford's "Butterflies" and "Moths", and has since added two volumes of his own to the series. A parallel series known as the "Monograph Library" (and often referred to as "The Special Volumes") was also published. Its aim was to cover "in greater detail... a single species or group of species". | https://en.wikipedia.org/wiki?curid=826700 |
New Naturalist There have been no additions to the "Monograph Library" since 1971. Volume 82 of the main series, "The New Naturalists", described the series to date, with authors' biographies and a guide to collecting the books. The original Editorial Board consisted of Julian Huxley, James Fisher, Dudley Stamp, John Gilmour and Eric Hosking. Until 1985, the highly characteristic dust jacket illustrations were by Rosemary and Clifford Ellis; since then they have been by Robert Gillmor. Being a numbered series with a very low print run for some volumes, the books are highly collectable, and second-hand copies of the rarer volumes, in good condition, can command high prices. The 100th volume, "Woodlands" by Oliver Rackham, was published in 2006. "Woodlands" was also published in 2006 as a "leatherbound" edition limited to 100 copies, though the binding was in fact imitation leather. The second "leatherbound" "New Naturalist", "Dragonflies" by Philip Corbet and Stephen Brooks, was published in 2008. The imitation-leather edition of "Dragonflies" (volume 106) was initially limited to 400 copies, subsequently reduced to 303, and finally to 250. According to the website, only 217 were actually sold, and the remaining unsold stock is being kept secure at HarperCollins's offices. HarperCollins continues to produce limited numbers of "leatherbound" editions of all volumes published since "Dragonflies", but only from "Islands" (volume 109) onward has real leather actually been used. All recent volumes have only 50 leatherbound copies. | https://en.wikipedia.org/wiki?curid=826700 |
New Naturalist The series won the 2007 British Book Design and Production Award for "brand or series identity", and in 2008 the official website was launched, with features including the latest news, a members only area with access to exclusive content and downloads, and a forum. In around 1990, Bloomsbury produced a series of facsimile editions, as hardbacks with new dustjacket designs, and with all plates in black and white, including those which were originally in colour. | https://en.wikipedia.org/wiki?curid=826700 |
Biotechnology and Biological Sciences Research Council (BBSRC), part of UK Research and Innovation, is a non-departmental public body (NDPB) and the largest UK public funder of non-medical bioscience. It predominantly funds scientific research institutes and university research departments in the UK. Receiving its funding through the science budget of the Department for Business, Energy & Industrial Strategy (BEIS), BBSRC's mission is to "promote and support, by any means, high-quality basic, strategic and applied research and related postgraduate training relating to the understanding and exploitation of biological systems". BBSRC's head office is at Polaris House in Swindon, the same building as the other councils of UK Research and Innovation (AHRC, EPSRC, ESRC, Innovate UK, MRC, NERC, Research England and STFC), as well as the UKSA. Funded by the Government, BBSRC invested over £498 million in world-class bioscience in 2017–18. BBSRC also manages the joint Research Councils' office in Brussels, the UK Research Office (UKRO). BBSRC was created in 1994 by merging the former Agricultural and Food Research Council (AFRC) with the biological science activities of the former Science and Engineering Research Council (SERC). BBSRC is managed by the BBSRC Council, consisting of a Chair (from 2015, Professor Sir Gordon Duff), an Executive Chair (Professor Melanie Welham) and from ten to eighteen representatives from UK universities, government and industry. | https://en.wikipedia.org/wiki?curid=830651 |
Biotechnology and Biological Sciences Research Council The Council approves policies, strategy, budgets and major funding. A Research Panel provides expert advice which BBSRC Council draws upon in making decisions. The purpose of the Research Panel is to advise on: In addition to the Council and the Research Panel, BBSRC has a series of other internal bodies for specific purposes. The Council strategically funds eight research institutes in the UK, and a number of centres (BBSRC: Institutes and centres). The institutes are tasked with delivering innovative, world-class bioscience research and training, leading to wealth and job creation and generating high returns for the UK economy. They have strong links with business, industry and the wider community, and support policy development. The institutes' research underpins key sectors of the UK economy such as agriculture, bioenergy, biotechnology, food and drink and pharmaceuticals. In addition, the institutes maintain unique research facilities of national importance. Other research institutes have merged with each other or with local universities. Previous BBSRC (or AFRC) sponsored institutes include: | https://en.wikipedia.org/wiki?curid=830651 |
Walter Frederick Gale (27 November 1865 – 1 June 1945) was an Australian banker and amateur astronomer. Gale was born in Paddington, Sydney, New South Wales. He had a strong interest in astronomy and built his first telescope in 1884. He discovered a number of comets, including the lost periodic comet 34D/Gale. He also discovered five southern double stars with the prefix GLE, and several deep-sky objects, including the planetary nebula IC 5148 in Grus. In 1892, he described oases and canals on Mars. He was awarded the Jackson-Gwilt Medal of the Royal Astronomical Society in 1935 for "discoveries of comets and his work for astronomy in New South Wales." A crater on Mars, Gale Crater, was named in his honour. It was selected as the 2012 landing site for the Curiosity rover. | https://en.wikipedia.org/wiki?curid=834672 |
Entropy of fusion The entropy of fusion is the increase in entropy when melting a substance. This is almost always positive, since the degree of disorder increases in the transition from an organized crystalline solid to the disorganized structure of a liquid; the only known exception is helium. It is denoted ΔS_fus and normally expressed in J mol⁻¹ K⁻¹. A natural process such as a phase transition will occur when the associated change in the Gibbs free energy is negative. Since this is a thermodynamic equation, the symbol T refers to the absolute thermodynamic temperature, measured in kelvins (K). Equilibrium occurs when the temperature is equal to the melting point T_m, so that ΔG_fus = ΔH_fus − T_m ΔS_fus = 0, and the entropy of fusion is the heat of fusion divided by the melting point: ΔS_fus = ΔH_fus / T_m. Helium-3 has a negative entropy of fusion at temperatures below 0.3 K. Helium-4 also has a very slightly negative entropy of fusion below 0.8 K. This means that, at appropriate constant pressures, these substances freeze with the addition of heat. | https://en.wikipedia.org/wiki?curid=837770 |
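The relation ΔS_fus = ΔH_fus / T_m is easy to check numerically. A minimal sketch, using water's tabulated enthalpy of fusion (about 6.01 kJ/mol at its 273.15 K melting point; values taken from standard thermochemical tables):

```python
# Entropy of fusion: delta_S = delta_H / T_m, valid at the melting point,
# where delta_G = delta_H - T*delta_S = 0.
def entropy_of_fusion(delta_h_j_per_mol: float, t_melt_k: float) -> float:
    """Return the entropy of fusion in J/(mol*K)."""
    return delta_h_j_per_mol / t_melt_k

# Water: delta_H_fus ~ 6010 J/mol at T_m = 273.15 K (standard-table values).
ds_water = entropy_of_fusion(6010.0, 273.15)
print(round(ds_water, 1))  # -> 22.0 J/(mol*K)
```

The positive result reflects the gain in disorder on melting; for helium-3 below 0.3 K the same formula yields a negative value, consistent with the freezing-on-heating behaviour described above.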
Robert Methven Petrie (May 15, 1906 – April 8, 1966) was a Canadian astronomer. He was born in Scotland but emigrated to Canada at the age of five. He grew up in Victoria, British Columbia and studied physics and mathematics at the University of British Columbia. He began working summer jobs at the Dominion Astrophysical Observatory and became fascinated with astronomy. He obtained his Ph.D. at the University of Michigan in 1932. He taught there until 1935, when he joined the staff of the Dominion Astrophysical Observatory. In 1951 he became its director. He extensively studied spectroscopic binaries. The crater Petrie on the Moon is named after him. The Canadian Astronomical Society established the R. M. Petrie Prize Lecture to honor his astrophysical research. | https://en.wikipedia.org/wiki?curid=838544 |
Synthetic biology (SynBio) is a multidisciplinary area of research that seeks to create new biological parts, devices, and systems, or to redesign systems that are already found in nature. It is a branch of science that encompasses a broad range of methodologies from various disciplines, such as biotechnology, genetic engineering, molecular biology, molecular engineering, systems biology, membrane science, biophysics, chemical and biological engineering, electrical and computer engineering, control engineering and evolutionary biology. Due to more powerful genetic engineering capabilities and decreased DNA synthesis and sequencing costs, the field of synthetic biology is rapidly growing. In 2016, more than 350 companies across 40 countries were actively engaged in synthetic biology applications, with an estimated combined value of $3.9 billion in the global market. Synthetic biology currently has no generally accepted definition, and it is defined in various ways depending on the specific discipline or use. Because it is an emergent area of research, utilized in multiple fields of study, numerous definitions are found in the literature. Yet all definitions touch upon a common concept: the creation of new biological systems via the synthesis or assembly of artificial or natural components. Here are a few examples of the various definitions: Of note, synthetic biology has traditionally been divided into two different approaches: top-down and bottom-up. Biological systems are thus assembled module by module | https://en.wikipedia.org/wiki?curid=841429 |
Synthetic biology Cell-free protein expression systems are often employed, as is membrane-based molecular machinery. There are increasing efforts to bridge the divide between these approaches by forming hybrid living/synthetic cells, and by engineering communication between living and synthetic cell populations. 1910: The first identifiable use of the term "synthetic biology" was in Stéphane Leduc's publication "Théorie physico-chimique de la vie et générations spontanées". He also used this term in another publication, "La Biologie Synthétique", in 1912. 1961: Jacob and Monod postulated cellular regulation by molecular networks from their study of the "lac" operon in "E. coli" and envisioned the ability to assemble new systems from molecular components. 1973: A contemporary interpretation of synthetic biology was given by the Polish geneticist Wacław Szybalski in a panel discussion during the Eighteenth Annual "OHOLO" Biological Conference on "Strategies for the Control of Gene Expression" in Zichron Yaakov, Israel. 1978: Arber, Nathans and Smith won the Nobel Prize in Physiology or Medicine for the discovery of restriction enzymes, leading Szybalski to offer an editorial comment in the journal "Gene": The work on restriction nucleases not only permits us easily to construct recombinant DNA molecules and to analyze individual genes, but also has led us into the new era of synthetic biology where not only existing genes are described and analyzed but also new gene arrangements can be constructed and evaluated | https://en.wikipedia.org/wiki?curid=841429 |
Synthetic biology 2000: A notable advance in synthetic biology occurred when two articles in Nature discussed the creation of synthetic biological circuit devices, a genetic toggle switch and a biological clock, by combining genes within "E. coli" cells. 2003: Researchers engineer an artemisinin precursor pathway in "E. coli". 2004: The first international conference for synthetic biology, Synthetic Biology 1.0 (SB1.0), was held at the Massachusetts Institute of Technology, USA. 2005: Researchers develop a light-sensing circuit in "E. coli". Another group designs circuits capable of multicellular pattern formation. 2006: Researchers design a new therapeutic strategy for cancer treatment; they engineered a synthetic circuit that promotes bacterial invasion of tumour cells. 2010: A group of researchers revealed the first self-replicating synthetic bacterial cell, called "M. mycoides" JCVI-syn1.0. Researchers were able to synthesize a new genome, using DNA sequences from two laboratory strains of "Mycoplasma mycoides", and perform a successful transplantation into a host "Mycoplasma capricolum" cell. The new bacterium behaved much like its donor and was able to self-replicate freely. 2011: Functional synthetic chromosome arms are engineered in yeast. April 2019: Scientists at ETH Zurich reported the creation of the first bacterial genome, named "Caulobacter ethensis-2.0", made entirely by a computer, although a related viable form of "C. ethensis-2.0" does not yet exist | https://en.wikipedia.org/wiki?curid=841429 |
Synthetic biology May 2019: Researchers, in a milestone effort, reported the creation of a new synthetic (possibly artificial) form of viable life, a variant of the bacterium "Escherichia coli", by reducing the natural number of 64 codons in the bacterial genome to 59 codons, in order to encode the 20 amino acids. Engineers view biology as a "technology"; this view (a given system's "biotechnology" or its "biological engineering") includes the broad redefinition and expansion of biotechnology, with the ultimate goal of being able to design and build engineered biological systems that process information, manipulate chemicals, fabricate materials and structures, produce energy, provide food, and maintain and enhance human health (see Biomedical Engineering) and our environment. Studies in synthetic biology can be subdivided into broad classifications according to the approach they take to the problem at hand: standardization of biological parts, biomolecular engineering, and genome engineering. Biomolecular engineering includes approaches that aim to create a toolkit of functional units that can be introduced to present new technological functions in living cells. Genome engineering includes approaches to construct synthetic chromosomes for whole or minimal organisms. Biomolecular design refers to the general idea of de novo design and additive combination of biomolecular components | https://en.wikipedia.org/wiki?curid=841429 |
Synthetic biology Each of these approaches shares a similar task: to develop a more synthetic entity at a higher level of complexity by inventively manipulating a simpler part at the preceding level. On the other hand, "re-writers" are synthetic biologists interested in testing the irreducibility of biological systems. Due to the complexity of natural biological systems, it would be simpler to rebuild the natural systems of interest from the ground up, in order to provide engineered surrogates that are easier to comprehend, control and manipulate. Re-writers draw inspiration from refactoring, a process sometimes used to improve computer software. Several novel enabling technologies were critical to the success of synthetic biology. Concepts include standardization of biological parts and hierarchical abstraction to permit using those parts in synthetic systems. Basic technologies include reading and writing DNA (sequencing and fabrication). Measurements under multiple conditions are needed for accurate modeling and computer-aided design (CAD). Driven by dramatic decreases in the cost of oligonucleotide ("oligo") synthesis, the sizes of DNA constructions from oligos have increased to the genomic level. In 2000, researchers reported synthesis of the 9.6 kbp (kilo base pair) Hepatitis C virus genome from chemically synthesized 60- to 80-mers. In 2002, researchers at Stony Brook University succeeded in synthesizing the 7,741 bp poliovirus genome from its published sequence, producing the second synthetic genome; the project spanned two years | https://en.wikipedia.org/wiki?curid=841429 |
Synthetic biology In 2003 the 5,386 bp genome of the bacteriophage Phi X 174 was assembled in about two weeks. In 2006, the same team, at the J. Craig Venter Institute, constructed and patented a synthetic genome of a novel minimal bacterium, "Mycoplasma laboratorium", and were working on getting it functioning in a living cell. In 2007 it was reported that several companies were offering synthesis of genetic sequences up to 2,000 base pairs (bp) long, for a price of about $1 per bp and a turnaround time of less than two weeks. Oligonucleotides harvested from a photolithographic- or inkjet-manufactured DNA chip, combined with DNA mismatch error-correction, allow inexpensive large-scale changes of codons in genetic systems to improve gene expression or incorporate novel amino acids (see George M. Church's and Anthony Forster's synthetic cell projects.) This favors a synthesis-from-scratch approach. Additionally, the CRISPR/Cas system has emerged as a promising technique for gene editing. It was described as "the most important innovation in the synthetic biology space in nearly 30 years". While other methods take months or years to edit gene sequences, CRISPR speeds that time up to weeks. Its ease of use and accessibility have, however, raised ethical concerns, especially surrounding its use in biohacking. DNA sequencing determines the order of nucleotide bases in a DNA molecule. Synthetic biologists use DNA sequencing in their work in several ways | https://en.wikipedia.org/wiki?curid=841429 |
Synthetic biology First, large-scale genome sequencing efforts continue to provide information on naturally occurring organisms. This information provides a rich substrate from which synthetic biologists can construct parts and devices. Second, sequencing can verify that the fabricated system is as intended. Third, fast, cheap, and reliable sequencing can facilitate rapid detection and identification of synthetic systems and organisms. Microfluidics, in particular droplet microfluidics, is an emerging tool used to construct new components, and to analyse and characterize them. It is widely employed in screening assays. The most widely used standardized DNA parts are BioBrick plasmids, invented by Tom Knight in 2003. BioBricks are stored at the Registry of Standard Biological Parts in Cambridge, Massachusetts. The BioBrick standard has been used by thousands of students worldwide in the international Genetically Engineered Machine (iGEM) competition. While DNA is most important for information storage, a large fraction of the cell's activities are carried out by proteins. Tools exist to send proteins to specific regions of the cell and to link different proteins together. The interaction strength between protein partners should be tunable between a lifetime of seconds (desirable for dynamic signaling events) and an irreversible interaction (desirable for device stability or resilience to harsh conditions). Interactions such as coiled coils, SH3 domain-peptide binding or SpyTag/SpyCatcher offer such control | https://en.wikipedia.org/wiki?curid=841429 |
Synthetic biology In addition, it is necessary to regulate protein-protein interactions in cells, such as with light (using light-oxygen-voltage-sensing domains) or with cell-permeable small molecules by chemically induced dimerization. In a living cell, molecular motifs are embedded in a bigger network with upstream and downstream components. These components may alter the signalling capability of the module. In the case of ultrasensitive modules, the sensitivity contribution of a module can differ from the sensitivity that the module sustains in isolation. Models inform the design of engineered biological systems by better predicting system behavior prior to fabrication. Synthetic biology benefits from better models of how biological molecules bind substrates and catalyze reactions, how DNA encodes the information needed to specify the cell and how multi-component integrated systems behave. Multiscale models of gene regulatory networks focus on synthetic biology applications. Simulations can model all biomolecular interactions in transcription, translation, regulation and induction of gene regulatory networks. Studies have considered the components of the DNA transcription mechanism. One desire of scientists creating synthetic biological circuits is to be able to control the transcription of synthetic DNA in unicellular organisms (prokaryotes) and in multicellular organisms (eukaryotes) | https://en.wikipedia.org/wiki?curid=841429 |
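A minimal dynamical model of the kind these simulations use, two coupled rate equations for mRNA and protein, can be integrated with a simple Euler loop. All rate constants below are illustrative placeholders, not measured values:

```python
# Constitutive gene expression:
#   dm/dt = k_m - d_m*m   (transcription and mRNA decay)
#   dp/dt = k_p*m - d_p*p (translation and protein decay)
# Analytic steady states: m* = k_m/d_m, p* = k_p*m*/d_p.
def simulate(k_m=2.0, d_m=0.1, k_p=1.0, d_p=0.05, dt=0.01, t_end=400.0):
    m, p, t = 0.0, 0.0, 0.0
    while t < t_end:
        dm = k_m - d_m * m
        dp = k_p * m - d_p * p
        m += dm * dt
        p += dp * dt
        t += dt
    return m, p

m, p = simulate()
# With the parameters above, m approaches 20 and p approaches 400,
# matching the analytic steady states.
```

Real multiscale models add regulation terms (e.g. Hill-function repression of k_m) and stochastic effects, but the deterministic skeleton is the same.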
Synthetic biology One study tested the adjustability of synthetic transcription factors (sTFs) in areas of transcription output and cooperative ability among multiple transcription factor complexes. Researchers were able to mutate functional regions called zinc fingers, the DNA-specific component of sTFs, to decrease their affinity for specific operator DNA sequence sites, and thus decrease the associated site-specific activity of the sTF (usually transcriptional regulation). They further used the zinc fingers as components of complex-forming sTFs, mirroring the complex-forming transcription mechanisms of eukaryotes. A biological computer refers to an engineered biological system that can perform computer-like operations; this is a dominant paradigm in synthetic biology. Researchers built and characterized a variety of logic gates in a number of organisms, and demonstrated both analog and digital computation in living cells, showing that bacteria can be engineered to perform either mode of computation. In 2007, research demonstrated a universal logic evaluator that operates in mammalian cells. Subsequently, researchers utilized this paradigm to demonstrate a proof-of-concept therapy that uses biological digital computation to detect and kill human cancer cells in 2011. Another group of researchers demonstrated in 2016 that principles of computer engineering can be used to automate digital circuit design in bacterial cells | https://en.wikipedia.org/wiki?curid=841429 |
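A transcriptional logic gate of the sort described here is commonly modeled with Hill functions: promoter output is high only when both input signals are high. A sketch with illustrative parameters (the thresholds and cooperativity below are assumptions, not values from any specific circuit):

```python
def hill(x, k=0.5, n=4):
    """Activating Hill function: ~0 for x << k, ~1 for x >> k."""
    return x**n / (k**n + x**n)

def and_gate(a, b):
    """Transcriptional AND gate: output is high only if both inputs are high."""
    return hill(a) * hill(b)

# Truth-table-like behavior with normalized input levels:
for a, b in [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]:
    print(a, b, round(and_gate(a, b), 2))  # only (1.0, 1.0) gives ~0.89
```

The steep Hill exponent (n = 4) is what makes the analog response approximately digital, which is the abstraction the automated circuit-design work builds on.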
Synthetic biology In 2017, researchers demonstrated the 'Boolean logic and arithmetic through DNA excision' (BLADE) system to engineer digital computation in human cells. A biosensor refers to an engineered organism, usually a bacterium, that is capable of reporting some ambient phenomenon such as the presence of heavy metals or toxins. One such system is the Lux operon of "Aliivibrio fischeri", which codes for the enzyme that is the source of bacterial bioluminescence and can be placed after a responsive promoter to express the luminescence genes in response to a specific environmental stimulus. One such sensor consisted of a bioluminescent bacterial coating on a photosensitive computer chip to detect certain petroleum pollutants. When the bacteria sense the pollutant, they luminesce. Another example of a similar mechanism is the detection of landmines by an engineered "E. coli" reporter strain capable of detecting TNT and its main degradation product DNT, and consequently producing a green fluorescent protein (GFP). Modified organisms can sense environmental signals and send output signals that can be detected and serve diagnostic purposes. Microbe cohorts have been used. Cells use interacting genes and proteins, which are called gene circuits, to implement diverse functions, such as responding to environmental signals, decision making and communication | https://en.wikipedia.org/wiki?curid=841429 |
Synthetic biology Three key components are involved: DNA, RNA and proteins. Synthetic biologists have designed gene circuits that can control gene expression at several levels, including the transcriptional, post-transcriptional and translational levels. Traditional metabolic engineering has been bolstered by the introduction of combinations of foreign genes and optimization by directed evolution. This includes engineering "E. coli" and yeast for commercial production of a precursor of the antimalarial drug artemisinin. Entire organisms have yet to be created from scratch, although living cells can be transformed with new DNA. Several ways allow constructing synthetic DNA components and even entire synthetic genomes, but once the desired genetic code is obtained, it is integrated into a living cell that is expected to manifest the desired new capabilities or phenotypes while growing and thriving. Cell transformation is used to create biological circuits, which can be manipulated to yield desired outputs. By integrating synthetic biology with materials science, it would be possible to use cells as microscopic molecular foundries to produce materials whose properties are genetically encoded. Re-engineering has produced Curli fibers, the amyloid component of the extracellular material of biofilms, as a platform for programmable nanomaterials. These nanofibers were genetically constructed for specific functions, including adhesion to substrates, nanoparticle templating and protein immobilization. Natural proteins can also be engineered | https://en.wikipedia.org/wiki?curid=841429 |
Synthetic biology By directed evolution, novel protein structures that match or improve on the functionality of existing proteins can be produced. One group generated a helix bundle that was capable of binding oxygen with similar properties as hemoglobin, yet did not bind carbon monoxide. A similar protein structure was generated to support a variety of oxidoreductase activities. Another group generated a family of G-protein coupled receptors that could be activated by the inert small molecule clozapine-N-oxide but were insensitive to the native ligand, acetylcholine. Novel functionalities or protein specificity can also be engineered using computational approaches. One study was able to use two different computational methods – a bioinformatics and molecular modeling method to mine sequence databases, and a computational enzyme design method to reprogram enzyme specificity. Both methods resulted in designed enzymes with >100-fold specificity for production of longer chain alcohols from sugar. Another common investigation is expansion of the natural set of 20 amino acids. Excluding the stop codons, there are 61 sense codons, but they generally encode only 20 amino acids across all organisms. Certain codons are engineered to code for alternative amino acids, including nonstandard amino acids such as O-methyl tyrosine, or exogenous amino acids such as 4-fluorophenylalanine | https://en.wikipedia.org/wiki?curid=841429 |
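The codon arithmetic here is easy to verify: a 4-letter alphabet read in triplets gives 4³ = 64 codons, of which 3 are stops in the standard genetic code, leaving 61 sense codons for 20 amino acids, the redundancy that re-coding projects exploit:

```python
from itertools import product

bases = "ACGU"
codons = ["".join(c) for c in product(bases, repeat=3)]
stop_codons = {"UAA", "UAG", "UGA"}  # stops in the standard genetic code
sense_codons = [c for c in codons if c not in stop_codons]

print(len(codons))        # -> 64
print(len(sense_codons))  # -> 61
# 61 sense codons for 20 amino acids: ~3 codons per amino acid on average,
# which is why some codons can be reassigned to nonstandard amino acids.
```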
Synthetic biology Typically, these projects make use of re-coded nonsense suppressor tRNA-aminoacyl-tRNA synthetase pairs from other organisms, though in most cases substantial engineering is required. Other researchers investigated protein structure and function by reducing the normal set of 20 amino acids. Limited protein sequence libraries are made by generating proteins where groups of amino acids may be replaced by a single amino acid. For instance, several non-polar amino acids within a protein can all be replaced with a single non-polar amino acid. One project demonstrated that an engineered version of chorismate mutase still had catalytic activity when only 9 amino acids were used. Researchers and companies practice synthetic biology to synthesize industrial enzymes with high activity, optimal yields and effectiveness. These synthesized enzymes aim to improve products such as detergents and lactose-free dairy products, as well as make them more cost effective. The improvement of metabolic engineering through synthetic biology is an example of a biotechnological technique utilized in industry to discover pharmaceuticals and fermentative chemicals. Synthetic biology may investigate modular pathway systems in biochemical production and increase yields of metabolic production. Artificial enzymatic activity and subsequent effects on metabolic reaction rates and yields may develop "efficient new strategies for improving cellular properties ... for industrially important biochemical production" | https://en.wikipedia.org/wiki?curid=841429 |
Synthetic biology Scientists can encode digital information onto a single strand of synthetic DNA. In 2012, George M. Church encoded one of his books about synthetic biology in DNA. The 5.3 Mb of data was more than 1000 times greater than the previous largest amount of information to be stored in synthesized DNA. A similar project encoded the complete sonnets of William Shakespeare in DNA. Many technologies have been developed for incorporating unnatural nucleotides and amino acids into nucleic acids and proteins, both "in vitro" and "in vivo". For example, in May 2014, researchers announced that they had successfully introduced two new artificial nucleotides into bacterial DNA. By including individual artificial nucleotides in the culture media, they were able to passage the bacteria 24 times; the bacteria did not generate mRNA or proteins able to use the artificial nucleotides. Synthetic biology has raised NASA's interest, as it could help to produce resources for astronauts from a restricted portfolio of compounds sent from Earth. On Mars, in particular, synthetic biology could lead to production processes based on local resources, making it a powerful tool in the development of crewed outposts with less dependence on Earth. One important topic in synthetic biology is "synthetic life", which is concerned with hypothetical organisms created "in vitro" from biomolecules and/or chemical analogues thereof | https://en.wikipedia.org/wiki?curid=841429 |
Synthetic biology Synthetic life experiments attempt to either probe the origins of life, study some of the properties of life, or more ambitiously to recreate life from non-living (abiotic) components. Synthetic biology attempts to create living organisms capable of carrying out important functions, from manufacturing pharmaceuticals to detoxifying polluted land and water. In medicine, it offers prospects of using designer biological parts as a starting point for new classes of therapies and diagnostic tools. A living "artificial cell" has been defined as a completely synthetic cell that can capture energy, maintain ion gradients, contain macromolecules, store information and have the ability to mutate. No one has yet been able to create such a cell. A completely synthetic bacterial chromosome was produced in 2010 by Craig Venter's team, who introduced it into genomically emptied bacterial host cells. The host cells were able to grow and replicate. The first living organism with an 'artificial' expanded DNA code was presented in 2014; the team used "E. coli" that had its genome extracted and replaced with a chromosome containing an expanded genetic code. The nucleosides added were d5SICS and dNaM. In May 2019, researchers reported the creation of a new synthetic (possibly artificial) form of viable life, a variant of "Escherichia coli" with its genome recoded from the natural 64 codons to 59 codons, in order to encode the 20 amino acids | https://en.wikipedia.org/wiki?curid=841429 |
Synthetic biology Bacteria have long been used in cancer treatment. "Bifidobacterium" and "Clostridium" selectively colonize tumors and reduce their size. Recently, synthetic biologists have reprogrammed bacteria to sense and respond to a particular cancer state. Most often bacteria are used to deliver a therapeutic molecule directly to the tumor to minimize off-target effects. To target the tumor cells, peptides that can specifically recognize a tumor were expressed on the surfaces of bacteria. Peptides used include an affibody molecule that specifically targets human epidermal growth factor receptor 2 and a synthetic adhesin. The other way is to allow bacteria to sense the tumor microenvironment, for example hypoxia, by building an AND logic gate into bacteria. The bacteria then release target therapeutic molecules only at the tumor, through either lysis or the bacterial secretion system. Lysis has the advantage that it can stimulate the immune system and control growth. Multiple types of secretion systems can be used, as well as other strategies. The system is inducible by external signals. Inducers include chemicals, electromagnetic or light waves. Multiple species and strains are applied in these therapeutics. The most commonly used bacteria are "Salmonella typhimurium", "Escherichia coli", "Bifidobacteria", "Streptococcus", "Lactobacillus", "Listeria" and "Bacillus subtilis" | https://en.wikipedia.org/wiki?curid=841429 |
Synthetic biology Each of these species has its own properties and is suited to cancer therapy in terms of tissue colonization, interaction with the immune system and ease of application. The immune system plays an important role in cancer and can be harnessed to attack cancer cells. Cell-based therapies focus on immunotherapies, mostly by engineering T cells. T cell receptors were engineered and 'trained' to detect cancer epitopes. Chimeric antigen receptors (CARs) are composed of a fragment of an antibody fused to intracellular T cell signaling domains that can activate and trigger proliferation of the cell. A second-generation CAR-based therapy was approved by the FDA. Gene switches were designed to enhance the safety of the treatment. Kill switches were developed to terminate the therapy should the patient show severe side effects. Mechanisms can more finely control the system and stop and reactivate it. Since the number of T cells is important for therapy persistence and severity, their growth is also controlled to tune the effectiveness and safety of therapeutics. Although several mechanisms can improve safety and control, limitations include the difficulty of introducing large DNA circuits into cells and the risks associated with introducing foreign components, especially proteins, into cells. The creation of new life and the tampering with existing life have raised ethical concerns in the field of synthetic biology that are actively being discussed | https://en.wikipedia.org/wiki?curid=841429 |
Synthetic biology Common ethical questions include: The ethical aspects of synthetic biology have three main features: biosafety, biosecurity, and the creation of new life forms. Other ethical issues mentioned include the regulation of new creations, patent management of new creations, benefit distribution, and research integrity. Ethical issues surfaced for recombinant DNA and genetically modified organism (GMO) technologies, and extensive regulations of genetic engineering and pathogen research were put in place in many jurisdictions. Amy Gutmann, former head of the Presidential Bioethics Commission, argued that we should avoid the temptation to over-regulate synthetic biology in general, and genetic engineering in particular. According to Gutmann, "Regulatory parsimony is especially important in emerging technologies...where the temptation to stifle innovation on the basis of uncertainty and fear of the unknown is particularly great. The blunt instruments of statutory and regulatory restraint may not only inhibit the distribution of new benefits, but can be counterproductive to security and safety by preventing researchers from developing effective safeguards." One ethical question is whether or not it is acceptable to create new life forms, sometimes known as "playing God". Currently, the creation of new life forms not present in nature is at small scale, the potential benefits and dangers remain unknown, and careful consideration and oversight are ensured for most studies | https://en.wikipedia.org/wiki?curid=841429 |
Synthetic biology Many advocates express the great potential value—to agriculture, medicine, and academic knowledge, among other fields—of creating artificial life forms. Creation of new entities could expand scientific knowledge well beyond what is currently known from studying natural phenomena. Yet there is concern that artificial life forms may reduce nature's "purity" (i.e., nature could be somehow corrupted by human intervention and manipulation) and potentially influence the adoption of more engineering-like principles instead of biodiversity- and nature-focused ideals. Some are also concerned that if an artificial life form were to be released into nature, it could hamper biodiversity by beating out natural species for resources (similar to how algal blooms kill marine species). Another concern involves the ethical treatment of newly created entities if they happen to experience pain or possess sentience and self-perception. Should such life be given moral or legal rights? If so, how? What is most ethically appropriate when considering biosafety measures? How can accidental introduction of synthetic life into the natural environment be avoided? Much ethical consideration and critical thought have been given to these questions. Biosafety not only refers to biological containment; it also refers to strides taken to protect the public from potentially hazardous biological agents | https://en.wikipedia.org/wiki?curid=841429 |
Synthetic biology Even though such concerns are important and remain unanswered, not all products of synthetic biology present concern for biological safety or negative consequences for the environment. Some would argue that most synthetic technologies are benign and incapable of flourishing in the outside world due to their "unnatural" characteristics. For instance, organisms like bacteria and yeast can be engineered to be unable to produce histidine, an important amino acid for all life. Such organisms can thus only be grown on histidine-rich media in laboratory conditions, nullifying fears that they could spread into undesirable areas. Some ethical issues relate to biosecurity, where biosynthetic technologies could be deliberately used to cause harm to society and/or the environment. Since synthetic biology raises ethical issues and biosecurity issues, humanity must consider and plan on how to deal with potentially harmful creations, and what kinds of ethical measures could possibly be employed to deter nefarious biosynthetic technologies. With the exception of regulating synthetic biology and biotechnology companies, however, the issues are not seen as new because they were raised during the earlier recombinant DNA and genetically modified organism (GMO) debates and extensive regulations of genetic engineering and pathogen research are already in place in many jurisdictions. The European Union-funded project SYNBIOSAFE has issued reports on how to manage synthetic biology | https://en.wikipedia.org/wiki?curid=841429 |
Synthetic biology A 2007 paper identified key issues in safety, security, ethics and the science-society interface, which the project defined as public education and ongoing dialogue among scientists, businesses, government and ethicists. The key security issues that SYNBIOSAFE identified involved engaging companies that sell synthetic DNA and the biohacking community of amateur biologists. Key ethical issues concerned the creation of new life forms. A subsequent report focused on biosecurity, especially the so-called dual-use challenge. For example, while synthetic biology may lead to more efficient production of medical treatments, it may also lead to synthesis or modification of harmful pathogens (e.g., smallpox). The biohacking community remains a source of special concern, as the distributed and diffuse nature of open-source biotechnology makes it difficult to track, regulate or mitigate potential concerns over biosafety and biosecurity. COSY, another European initiative, focuses on public perception and communication. To better communicate synthetic biology and its societal ramifications to a broader public, COSY and SYNBIOSAFE published "SYNBIOSAFE", a 38-minute documentary film, in October 2009. The International Association Synthetic Biology has proposed self-regulation. This proposes specific measures that the synthetic biology industry, especially DNA synthesis companies, should implement | https://en.wikipedia.org/wiki?curid=841429 |
Synthetic biology In 2007, a group led by scientists from leading DNA-synthesis companies published a "practical plan for developing an effective oversight framework for the DNA-synthesis industry". In January 2009, the Alfred P. Sloan Foundation funded the Woodrow Wilson Center, the Hastings Center, and the J. Craig Venter Institute to examine the public perception, ethics and policy implications of synthetic biology. On July 9–10, 2009, the National Academies' Committee of Science, Technology & Law convened a symposium on "Opportunities and Challenges in the Emerging Field of Synthetic Biology". After the publication of the first synthetic genome and the accompanying media coverage about "life" being created, President Barack Obama established the Presidential Commission for the Study of Bioethical Issues to study synthetic biology. The commission convened a series of meetings, and issued a report in December 2010 titled "New Directions: The Ethics of Synthetic Biology and Emerging Technologies." The commission stated that "while Venter's achievement marked a significant technical advance in demonstrating that a relatively large genome could be accurately synthesized and substituted for another, it did not amount to the 'creation of life'". It noted that synthetic biology is an emerging field, which creates potential risks and rewards | https://en.wikipedia.org/wiki?curid=841429 |
Synthetic biology The commission did not recommend policy or oversight changes and called for continued funding of the research and new funding for monitoring, study of emerging ethical issues and public education. Synthetic biology, as a major tool for biological advances, results in the "potential for developing biological weapons, possible unforeseen negative impacts on human health ... and any potential environmental impact". These security issues may be avoided by regulating industry uses of biotechnology through policy legislation. Federal guidelines on genetic manipulation have been proposed: "the President's Bioethics Commission ... in response to the announced creation of a self-replicating cell from a chemically synthesized genome, put forward 18 recommendations not only for regulating the science ... [but also] for educating the public". On March 13, 2012, over 100 environmental and civil society groups, including Friends of the Earth, the International Center for Technology Assessment and the ETC Group issued the manifesto "The Principles for the Oversight of Synthetic Biology". This manifesto calls for a worldwide moratorium on the release and commercial use of synthetic organisms until more robust regulations and rigorous biosafety measures are established. The groups specifically call for an outright ban on the use of synthetic biology on the human genome or human microbiome | https://en.wikipedia.org/wiki?curid=841429 |
Synthetic biology Richard Lewontin wrote that some of the safety tenets for oversight discussed in "The Principles for the Oversight of Synthetic Biology" are reasonable, but that the main problem with the recommendations in the manifesto is that "the public at large lacks the ability to enforce any meaningful realization of those recommendations". The hazards of synthetic biology include biosafety hazards to workers and the public, biosecurity hazards stemming from deliberate engineering of organisms to cause harm, and environmental hazards. The biosafety hazards are similar to those for existing fields of biotechnology, mainly exposure to pathogens and toxic chemicals, although novel synthetic organisms may have novel risks. For biosecurity, there is concern that synthetic or redesigned organisms could theoretically be used for bioterrorism. Potential risks include recreating known pathogens from scratch, engineering existing pathogens to be more dangerous, and engineering microbes to produce harmful biochemicals. Lastly, environmental hazards include adverse effects on biodiversity and ecosystem services, including potential changes to land use resulting from agricultural use of synthetic organisms. In general, existing hazard controls, risk assessment methodologies, and regulations developed for traditional genetically modified organisms (GMOs) are considered to be sufficient for synthetic organisms | https://en.wikipedia.org/wiki?curid=841429 |
Synthetic biology "Extrinsic" biocontainment methods in a laboratory context include physical containment through biosafety cabinets and gloveboxes, as well as personal protective equipment. In an agricultural context they include isolation distances and pollen barriers, similar to methods for biocontainment of GMOs. Synthetic organisms may offer increased hazard control because they can be engineered with "intrinsic" biocontainment methods that limit their growth in an uncontained environment, or prevent horizontal gene transfer to natural organisms. Examples of intrinsic biocontainment include auxotrophy, biological kill switches, inability of the organism to replicate or to pass modified or synthetic genes to offspring, and the use of xenobiological organisms using alternative biochemistry, for example using artificial xeno nucleic acids (XNA) instead of DNA. Existing risk analysis systems for GMOs are generally considered sufficient for synthetic organisms, although there may be difficulties for an organism built "bottom-up" from individual genetic sequences. Synthetic biology generally falls under existing regulations for GMOs and biotechnology in general, and any regulations that exist for downstream commercial products, although there are generally no regulations in any jurisdiction that are specific to synthetic biology. | https://en.wikipedia.org/wiki?curid=841429 |
DGH Degrees of general hardness (dGH or °GH) is a unit of water hardness, specifically of general hardness. General hardness is a measure of the concentration of divalent metal ions such as calcium (Ca) and magnesium (Mg) per volume of water. Specifically, 1 dGH is defined as 10 milligrams (mg) of calcium oxide (CaO) per litre of water. Since CaO has a molar mass of 56.08 g/mol, 1 dGH is equivalent to 0.17832 mmol per litre of elemental calcium and/or magnesium ions. In water testing, paper strips often measure hardness in parts per million (ppm), where one part per million is defined as one milligram of calcium carbonate (CaCO3) per litre of water. Consequently, 1 dGH corresponds to 10 ppm CaO but 17.848 ppm CaCO3, which has a molar mass of 100.09 g/mol. | https://en.wikipedia.org/wiki?curid=842185 |
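As a sketch of the conversions described in the entry above (function and constant names are my own; the molar masses and the 10 mg CaO per litre definition are taken from the text):

```python
M_CAO = 56.08     # molar mass of CaO in g/mol
M_CACO3 = 100.09  # molar mass of CaCO3 in g/mol

def dgh_to_mmol_per_l(dgh):
    """Concentration of divalent ions (Ca/Mg) in mmol/L for a given dGH.
    1 dGH = 10 mg CaO per litre, so divide by the molar mass of CaO."""
    return dgh * 10.0 / M_CAO

def dgh_to_ppm_caco3(dgh):
    """Equivalent hardness in ppm (mg/L) expressed as CaCO3."""
    return dgh_to_mmol_per_l(dgh) * M_CACO3

print(round(dgh_to_mmol_per_l(1), 5))  # 0.17832 mmol/L
print(round(dgh_to_ppm_caco3(1), 3))   # 17.848 ppm CaCO3
```

Both figures match the values quoted in the entry, confirming that the two ppm scales differ only by the ratio of the molar masses.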
Alejandro Corichi is a theoretical physicist working in the Quantum Gravity group of the National Autonomous University of Mexico (UNAM). He obtained his bachelor's degree at UNAM (1991) and his PhD at Pennsylvania State University (1997). His field of study is general relativity and quantum gravity, where he has contributed to the understanding of classical aspects of black holes, to non-commutativity and black holes within the approach known as loop quantum gravity, and to loop quantum cosmology. | https://en.wikipedia.org/wiki?curid=843182 |
Crewe (crater) Crewe is a crater approximately 3 km in diameter situated within the Margaritifer Sinus quadrangle (MC-19) region of the planet Mars, located at 25° South, 10° West. The crater was named after the town of Crewe, Cheshire, England. | https://en.wikipedia.org/wiki?curid=844619 |
Carlos Torres (astronomer) Carlos Torres (1929–2011) was a Chilean astronomer of the University of Chile and an individual member of the International Astronomical Union on several commissions. Between 1968 and 1982, he discovered or co-discovered a number of asteroids from the University of Chile's Cerro El Roble Astronomical Station. Together with Spanish astronomer Carlos Guillermo Torres (1910–1965), he was honored with the naming of asteroid 1769 Carlostorres. | https://en.wikipedia.org/wiki?curid=846054 |
Tableau encyclopédique et méthodique The "Tableau encyclopédique et méthodique des trois regnes de la nature" was an illustrated encyclopedia of plants, animals and minerals, notable for including the first scientific descriptions of many species, and for its attractive engravings. It was published in Paris by Charles Joseph Panckoucke, from 1788 on. Although its several volumes can be considered a part of the greater "Encyclopédie méthodique", they were titled and issued separately. Individual prints from this work today can sell for hundreds of dollars (US) apiece. | https://en.wikipedia.org/wiki?curid=849770 |
Tenebrescence Tenebrescence, also known as "reversible photochromism", is the ability of minerals to change colour when exposed to light. The effect can be repeated indefinitely, but is destroyed by heating. Tenebrescent minerals include hackmanite, spodumene and tugtupite. Tenebrescent behavior is exploited in synthetic materials for the manufacture of self-adjusting sunglasses, which darken on exposure to sunlight. | https://en.wikipedia.org/wiki?curid=851681 |
Felix Santschi (1 December 1872 – 20 November 1940) was a Swiss entomologist known for discovering that ants use the sun as a compass and for describing about 2000 taxa of ants. Santschi is known for his pioneering work on the navigational abilities of ants. In one experiment, he investigated the way harvester ants used the sky to navigate. He found that as long as even a small patch of sky was visible, the ants could return directly to the nest after gathering food. However, when the sky was completely hidden, they lost their sense of direction and began moving haphazardly. Some seventy years later it was shown that ants are guided by the polarization of light. | https://en.wikipedia.org/wiki?curid=851890 |
Nena (supercontinent) Nena, an acronym for Northern Europe–North America, was the Early Proterozoic amalgamation of Baltica and Laurentia into a single "cratonic landmass", a name first proposed in 1990. Since then several similar Proterozoic supercontinents have been proposed, including Nuna and Arctica, that include other Archaean cratons, such as Siberia and East Antarctica. In the original concept Nena formed in the Penokean, Makkovikan, Ketilidian, and Svecofennian orogenies. However, because Nena excludes several known Archaean cratons, including those in India and Australia, it is strictly speaking not a supercontinent. Nena, or Nuna, can, nevertheless be thought of as the core of Columbia, another supercontinent concept with several proposed configurations. Nena as a continent has been associated with the Sudbury Basin Impact. | https://en.wikipedia.org/wiki?curid=853793 |
Astronomical algorithm Astronomical algorithms are the algorithms used to calculate ephemerides, calendars, and positions (as in celestial navigation or satellite navigation). Examples of large and complex astronomical algorithms are those used to calculate the position of the Moon. A simple example is the calculation of the Julian day. Numerical model of solar system discusses a generalized approach to local astronomical modeling. The "variations séculaires des orbites planétaires" describes an often used model. | https://en.wikipedia.org/wiki?curid=853888 |
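The entry above cites the Julian day as a simple example of an astronomical algorithm. As a sketch, the standard Fliegel–Van Flandern integer arithmetic (a well-known published algorithm, not described in the entry itself) converts a Gregorian calendar date to a Julian day number:

```python
def julian_day_number(year, month, day):
    """Julian day number (JDN) of a Gregorian calendar date.

    Uses only integer arithmetic: January and February are treated
    as months 13 and 14 of the previous year, then day counts and
    the Gregorian leap-year corrections (y//4 - y//100 + y//400)
    are accumulated against the JDN epoch offset 32045.
    """
    a = (14 - month) // 12
    y = year + 4800 - a
    m = month + 12 * a - 3
    return day + (153 * m + 2) // 5 + 365 * y + y // 4 - y // 100 + y // 400 - 32045

print(julian_day_number(2000, 1, 1))   # 2451545 (1 January 2000)
print(julian_day_number(1858, 11, 17)) # 2400001 (Modified Julian Date epoch day)
```

The returned value counts whole days; the astronomical Julian date of noon UT on that day equals the JDN exactly.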
DLR-Archenhold Near Earth Objects Precovery Survey DANEOPS, the DLR-Archenhold Near Earth Objects Precovery Survey, has been initiated to systematically search existing photographic plate archives for precovery images of known NEOs. It has so far (July 2004) precovered or recovered some 145 objects. | https://en.wikipedia.org/wiki?curid=854413 |
Scherk–Schwarz mechanism In theoretical physics, the Scherk–Schwarz mechanism (named after Joël Scherk and John Henry Schwarz) for a field φ basically means that φ is a section of a non-trivializable fiber bundle (not necessarily a vector bundle, since φ needn't be linear) which is fixed by the model. This is called a "twist" by physicists. Note that this can never occur in a spacetime which is homeomorphic to R^n, which is a contractible space. However, for Kaluza–Klein theories, the Scherk–Schwarz mechanism is a possibility which can't be neglected. | https://en.wikipedia.org/wiki?curid=857791 |
K-type asteroid K-type asteroids are relatively uncommon asteroids with a moderately reddish spectrum shortwards of 0.75 μm, and a slight bluish trend longwards of this. They have a low albedo. Their spectrum resembles that of CV and CO meteorites. These asteroids were described as "featureless" S-types in the Tholen classification. The K-type was proposed by J. F. Bell and colleagues in 1988 for bodies having a particularly shallow 1 μm absorption feature, and lacking the 2 μm absorption. These were found during studies of the Eos family of asteroids. | https://en.wikipedia.org/wiki?curid=858606 |
Biochip In molecular biology, biochips are essentially miniaturized laboratories that can perform hundreds or thousands of simultaneous biochemical reactions. Biochips enable researchers to quickly screen large numbers of biological analytes for a variety of purposes, from disease diagnosis to detection of bioterrorism agents. Digital microfluidic biochips have become one of the most promising technologies in many biomedical fields. In a digital microfluidic biochip, a group of (adjacent) cells in the microfluidic array can be configured to work as storage, functional operations, as well as for transporting fluid droplets dynamically. The development started with early work on the underlying sensor technology. One of the first portable, chemistry-based sensors was the glass pH electrode, invented in 1922 by Hughes. The basic concept of using exchange sites to create permselective membranes was used to develop other ion sensors in subsequent years. For example, a K+ sensor was produced by incorporating valinomycin into a thin membrane. In 1953, Watson and Crick announced their discovery of the now familiar double helix structure of DNA molecules and set the stage for genetics research that continues to the present day. The development of sequencing techniques in 1977 by Gilbert and Sanger (working separately) enabled researchers to directly read the genetic codes that provide instructions for protein synthesis | https://en.wikipedia.org/wiki?curid=859981 |
Biochip This research showed how hybridization of complementary single oligonucleotide strands could be used as a basis for DNA sensing. Two additional developments enabled the technology used in modern DNA-based biochips. First, in 1983 Kary Mullis invented the polymerase chain reaction (PCR) technique, a method for amplifying DNA concentrations. This discovery made possible the detection of extremely small quantities of DNA in samples. Second, in 1986 Hood and co-workers devised a method to label DNA molecules with fluorescent tags instead of radiolabels, thus enabling hybridization experiments to be observed optically. Figure 1 shows the makeup of a typical biochip platform. The actual sensing component (or "chip") is just one piece of a complete analysis system. Transduction must be done to translate the actual sensing event (DNA binding, oxidation/reduction, "etc.") into a format understandable by a computer (voltage, light intensity, mass, "etc."), which then enables additional analysis and processing to produce a final, human-readable output. The multiple technologies needed to make a successful biochip—from sensing chemistry, to microarraying, to signal processing—require a true multidisciplinary approach, making the barrier to entry steep. One of the first commercial biochips was introduced by Affymetrix | https://en.wikipedia.org/wiki?curid=859981 |
Biochip Their "GeneChip" products contain thousands of individual DNA sensors for use in sensing defects, or single nucleotide polymorphisms (SNPs), in genes such as p53 (a tumor suppressor) and BRCA1 and BRCA2 (related to breast cancer). The chips are produced by using microlithography techniques traditionally used to fabricate integrated circuits (see below). The microarray—the dense, two-dimensional grid of biosensors—is the critical component of a biochip platform. Typically, the sensors are deposited on a flat substrate, which may either be passive ("e.g." silicon or glass) or active, the latter consisting of integrated electronics or micromechanical devices that perform or assist signal transduction. Surface chemistry is used to covalently bind the sensor molecules to the substrate medium. The fabrication of microarrays is non-trivial and is a major economic and technological hurdle that may ultimately decide the success of future biochip platforms. The primary manufacturing challenge is the process of placing each sensor at a specific position (typically on a Cartesian grid) on the substrate. Various means exist to achieve the placement, but typically robotic micro-pipetting or micro-printing systems are used to place tiny spots of sensor material on the chip surface. Because each sensor is unique, only a few spots can be placed at a time. The low-throughput nature of this process results in high manufacturing costs | https://en.wikipedia.org/wiki?curid=859981 |
Biochip Fodor and colleagues developed a unique fabrication process (later used by Affymetrix) in which a series of microlithography steps is used to combinatorially synthesize hundreds of thousands of unique, single-stranded DNA sensors on a substrate one nucleotide at a time. One lithography step is needed per base type; thus, a total of four steps is required per nucleotide level. Although this technique is very powerful in that many sensors can be created simultaneously, it is currently only feasible for creating short DNA strands (15–25 nucleotides). Reliability and cost factors limit the number of photolithography steps that can be done. Furthermore, light-directed combinatorial synthesis techniques are not currently possible for proteins or other sensing molecules. As noted above, most microarrays consist of a Cartesian grid of sensors. This approach is used chiefly to map or "encode" the coordinate of each sensor to its function. Sensors in these arrays typically use a universal signalling technique ("e.g." fluorescence), thus making coordinates their only identifying feature. These arrays must be made using a serial process ("i.e." requiring multiple, sequential steps) to ensure that each sensor is placed at the correct position. "Random" fabrication, in which the sensors are placed at arbitrary positions on the chip, is an alternative to the serial method. The tedious and expensive positioning process is not required, enabling the use of parallelized self-assembly techniques | https://en.wikipedia.org/wiki?curid=859981 |
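The arithmetic behind the combinatorial synthesis described above is easy to sketch. The following assumes, as the text states, one lithography step per base type at each nucleotide level (function names are illustrative, not from the source):

```python
def lithography_steps(strand_length, bases=4):
    """Total mask steps: one step per base type at each nucleotide level."""
    return bases * strand_length

def possible_probes(strand_length, bases=4):
    """Distinct single-stranded DNA probes addressable in those steps:
    any of the four bases may be chosen independently at each position."""
    return bases ** strand_length

# For a 25-mer, the upper end of the strand lengths quoted in the text:
print(lithography_steps(25))  # 100 lithography steps
print(possible_probes(25))    # 4**25, about 10**15 distinct sequences
```

This makes the trade-off concrete: a linear number of steps (4 per nucleotide level) addresses an exponential space of probe sequences, but reliability and cost limit how many steps, and hence how long a strand, are feasible.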
Biochip In this approach, large batches of identical sensors can be produced; sensors from each batch are then combined and assembled into an array. A non-coordinate based encoding scheme must be used to identify each sensor. As the figure shows, such a design was first demonstrated (and later commercialized by Illumina) using functionalized beads placed randomly in the wells of an etched fiber optic cable. Each bead was uniquely encoded with a fluorescent signature. However, this encoding scheme is limited in the number of unique dye combinations that can be used and successfully differentiated. Microarrays are not limited to DNA analysis; protein microarrays, antibody microarray, chemical compound microarray can also be produced using biochips. Randox Laboratories Ltd. launched Evidence, the first protein Array Technology analyzer in 2003. In protein Array Technology, the biochip replaces the ELISA plate or cuvette as the reaction platform. The biochip is used to simultaneously analyze a panel of related tests in a single sample, producing a patient profile. The patient profile can be used in disease screening, diagnosis, monitoring disease progression or monitoring treatment. Performing multiple analyses simultaneously, described as multiplexing, allows a significant reduction in processing time and the amount of patient sample required. Array Technology is a novel application of a familiar methodology, using sandwich, competitive and antibody-capture immunoassays | https://en.wikipedia.org/wiki?curid=859981 |
Biochip The difference from conventional immunoassays is that the capture ligands are covalently attached to the surface of the biochip in an ordered array rather than in solution. In sandwich assays an enzyme-labelled antibody is used; in competitive assays an enzyme-labelled antigen is used. On antibody-antigen binding a chemiluminescence reaction produces light. Detection is by a charge-coupled device (CCD) camera. The CCD camera is a sensitive and high-resolution sensor able to accurately detect and quantify very low levels of light. The test regions are located using a grid pattern, then the chemiluminescence signals are analysed by imaging software to rapidly and simultaneously quantify the individual analytes. Biochips are also used in the field of microphysiometry, e.g. in skin-on-a-chip applications. For details about other array technologies, see Antibody microarray. | https://en.wikipedia.org/wiki?curid=859981 |
Cumulus castellanus cloud Cumulus castellanus (from Latin "castellanus", castle) is a type of cumulus cloud that is distinctive because it displays multiple towers arising from its top, indicating significant vertical air movement. They are so named because they somewhat resemble the crenellation on medieval castles. Cumulus castellanus clouds are associated with the formation of towering cumulus or cumulonimbus clouds, and correspondingly can be an indicator of forthcoming showers and thunderstorms. The World Meteorological Organization and the American Meteorological Society do not recognize cumulus castellanus as a distinct species, but instead classify all towering cumulus clouds as Cumulus congestus. | https://en.wikipedia.org/wiki?curid=860246 |
Alexander Butlerov Alexander Mikhaylovich Butlerov (Алекса́ндр Миха́йлович Бу́тлеров; 15 September 1828 – 17 August 1886) was a Russian chemist, one of the principal creators of the theory of chemical structure (1857–1861), the first to incorporate double bonds into structural formulas, the discoverer of hexamine (1859), the discoverer of formaldehyde (1859) and the discoverer of the formose reaction (1861). He first proposed the idea of possible tetrahedral arrangement of valence bonds in carbon compounds in 1862. The crater Butlerov on the Moon is named after him. Butlerov was born in Chistopol into a landowning family. | https://en.wikipedia.org/wiki?curid=861040 |
Chrysiogenaceae is a bacterial family. The phylogeny is based on the 16S rRNA-based LTP release 123 by 'The All-Species Living Tree' Project. The currently accepted taxonomy is based on the List of Prokaryotic names with Standing in Nomenclature (LPSN) and the National Center for Biotechnology Information (NCBI). | https://en.wikipedia.org/wiki?curid=868790 |
Upper shoreface Upper shoreface refers to the portion of the seafloor that is shallow enough to be agitated by everyday wave action, i.e. the part above the wave base. Below that is the lower shoreface. The continuous agitation of the sea floor in the upper shoreface environment results in sediments that are winnowed of the smallest grains, leaving only those grains heavy enough that the water cannot keep them suspended. Seawater is moved in a vertical circular motion when a wave passes. The radius of the circle of motion for any given water molecule decreases with depth. The maximum depth of influence of a water wave is half its wavelength; below that depth the water remains stationary as the wave passes. For instance, a wave whose wavelength is less than twice the depth of a pool would not be able to cause water movement on the bottom, whereas a wave with a longer wavelength would be moving the water (barely) at the bottom. | https://en.wikipedia.org/wiki?curid=869980 |
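The half-wavelength rule in the entry above can be sketched as a small helper (function names are illustrative; the only physics assumed is the rule stated in the text):

```python
def wave_base_depth(wavelength):
    """Maximum depth (same units as wavelength) at which a surface wave
    still moves the water: half the wavelength, per the rule above."""
    return wavelength / 2.0

def agitates_bottom(wavelength, water_depth):
    """True if the wave's orbital motion reaches the sea floor,
    i.e. the floor lies above the wave base."""
    return water_depth < wave_base_depth(wavelength)

print(agitates_bottom(wavelength=10.0, water_depth=3.0))  # True: wave base at 5.0
print(agitates_bottom(wavelength=4.0, water_depth=3.0))   # False: wave base at 2.0
```

In sedimentological terms, a given point on the seafloor sits in the upper shoreface only for waves long enough that this check returns True.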
Water contact In the hydrocarbon industry water contact is the elevation above which fluids other than water can be found in the pores of a rock. For example, in a traditional hand-excavated water well, the level at which the water stabilizes represents the water table, or the elevation in the rock where air starts to occupy the rock pores. In most situations in the hydrocarbon industry the term is qualified as being an oil-water contact (abbreviated to "OWC") or a gas-water contact ("GWC"). Often there is also a gas-oil contact ("GOC"). In an oil or gas field, hydrocarbons migrate into rocks and can be trapped if there is a permeability barrier to prevent upward escape. Gas and oil are lighter than water, so they will form a bubble at the high end of the "trap" formed by the impermeable barriers. A simple physical model of this would be a coffee cup held upside down underwater with an air bubble occupying the highest portion of the cup's interior. The base of the bubble is the water contact. Capillary action can obscure the true water contact in permeable media like sandstone. Capillary pressure prevents the hydrocarbons from expelling all of the water in the pores, which creates a transition zone between the fully saturated hydrocarbon levels and the fully saturated water levels. In poorly porous intervals, the oil-water, gas-water or gas-oil contacts can similarly be obscured, which makes estimation of hydrocarbon reserves difficult | https://en.wikipedia.org/wiki?curid=870055 |
Water contact Descriptions of the well's petrophysics will then often further qualify to delineate a gas-down-to, oil-up-to, oil-down-to and water-up-to line, clearly showing the uncertainties involved. | https://en.wikipedia.org/wiki?curid=870055 |
Outer-grazer and inner-grazer are configurations of heliocentric orbit. All six diagrams show the Sun (the orange dot) in the middle and a putative planet's orbital band (in yellow). The latter is a ring whose inner radius is the planet's perihelion and its outer radius the aphelion. | https://en.wikipedia.org/wiki?curid=870529 |
Sussexite is a manganese borate mineral, MnBO2(OH). Crystals are monoclinic prismatic and typically fibrous in occurrence. The colour is white, pink or yellowish white, with a pearly lustre. It has a Mohs hardness of 3 and a specific gravity of 3.12. It is named after the Franklin Mining District in Sussex County, New Jersey, US, where it was first discovered in 1868. Sussexite also occurs in France, Italy, Namibia, North Korea, South Africa, Switzerland, and the US states of Michigan, New Jersey, Utah and Virginia. | https://en.wikipedia.org/wiki?curid=870994 |
Mare Erythraeum is a very large dark dusky region of Mars that can be viewed by even a small telescope. The name comes from the Latin for the Erythraean Sea, because it was originally thought to be a large sea of liquid water. It was included in Percival Lowell's 1895 map of Mars. Under the name of De La Rue Ocean it was included in Procter's 1905 map of Mars. | https://en.wikipedia.org/wiki?curid=871227 |
Li Fan (Han dynasty) Li Fan (Chinese: 李梵, pinyin: Lǐ Fàn) was a Chinese astronomer during the Han Dynasty (202 BC-220 AD). He noticed that the Moon does not move uniformly through its phases by using background stars as reference. In 85 Li Fan and Bian Xin (Chinese: 編訢) were tasked by Emperor Zhang to resolve inaccuracies in the Taichu calendar. He is also known to have worked with inflow clepsydras as opposed to earlier, typically less accurate outflow clepsydras. The measurements of synodic periods of the planets given in the following table are attributed to him. An impact crater that is located at the Phaethontis quadrangle, Mars, 47.2°S Latitude and 153.2°W Longitude was named in his honor. The diameter of the impact crater is approximately 104.8 km. | https://en.wikipedia.org/wiki?curid=871458 |
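The table of synodic periods mentioned above is not reproduced here, but the underlying quantity is easy to sketch. The standard relation (textbook celestial mechanics, not stated in the entry) between a planet's sidereal orbital period P, Earth's sidereal period E, and the synodic period S observed from Earth is 1/S = |1/E - 1/P|:

```python
def synodic_period(p_planet, p_earth=365.25):
    """Synodic period in days from sidereal periods in days.

    The absolute value handles both inferior planets (which orbit
    faster than Earth) and superior planets (which orbit slower).
    """
    return abs(1.0 / (1.0 / p_earth - 1.0 / p_planet))

# Mars: sidereal period about 686.98 days
print(round(synodic_period(686.98), 1))  # about 779.9 days
```

This is the interval between successive identical configurations (e.g. oppositions), which is what repeated naked-eye observations of a planet against the background stars actually measure.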
Shire (pharmaceutical company) Shire Plc was a Jersey-registered specialty biopharmaceutical company. Originating in the United Kingdom with an operational base in the United States, its brands and products included Vyvanse, Lialda, and Adderall XR. Shire was acquired by Takeda Pharmaceutical Company on 8 January 2019. Shire was a global biotechnology company focused on serving people with rare diseases and other highly specialized conditions. The company's products were available in more than 100 countries across core therapeutic areas including Hematology, Immunology, Neuroscience, Lysosomal Storage Disorders, Gastrointestinal / Internal Medicine / Endocrine and Hereditary Angioedema; a growing franchise in Oncology; and an emerging, innovative pipeline in Ophthalmics. The original corporate headquarters was located in Basingstoke, Hampshire, England. Main offices are located in Dublin, Ireland, the United States in Cambridge, Massachusetts, and Chicago, Illinois, and in Zug, Switzerland. In addition, Shire owns manufacturing sites in Lexington, Massachusetts, and Social Circle, Georgia. Shire’s headquarters in Lexington, Massachusetts, will be integrated with Takeda’s new U.S. headquarters, which is being relocated from Deerfield, Illinois, to the Boston area. Shire was founded in 1986 in the UK by five entrepreneurs: Harry Stratford, Dennis Stephens, Peter Moriarty, Geoff Hall and Dr Jim Murray. Under the management of Rolf Stahel Shire was first listed on the London Stock Exchange in 1996 | https://en.wikipedia.org/wiki?curid=871579 |
Shire (pharmaceutical company) Shire's initial products were calcium supplements (Calcichew-D) for patients seeking to treat or prevent osteoporosis. In 1997 the company acquired Pharmavene for £105 million in order to access Pharmavene's drug delivery methods. Later that year Shire acquired Richwood Pharmaceutical Company, forming Shire-Richwood Inc. In 2001 the company acquired the Canadian company BioChem Pharma. Shire's next acquisition did not come until 2005, when it acquired Transkaryotic Therapeutics, followed in 2007 by New River Pharmaceuticals Inc for a then company record of $2.6 billion. With the purchase of New River, Shire gained ownership of Vyvanse. A year later the company acquired Jerini, a German company focused on treating hereditary angioedema, for $521 million. In 2008, in reaction to new taxation measures on patent royalties announced by the Labour Party, the company moved its tax domicile to Dublin, Ireland. 2010 saw a change in company strategy, with the company seeking to expand through mergers and acquisitions, ultimately becoming one of the most acquisitive companies in the industry. In 2010 the company acquired Movetis, a Belgian company focusing on gastrointestinal products, for $565 million; a year later it acquired regenerative medicine manufacturer Advanced BioHealing. In 2012 the company acquired FerroKin BioSciences for $325 million, along with FerroKin's lead iron chelator, FBS0701 | https://en.wikipedia.org/wiki?curid=871579 |
Shire (pharmaceutical company) 2013 saw the company complete its highest number of acquisitions in a single year: Lotus Tissue Repair, Inc. (lead compound ABH001), SARcode Bioscience Inc. and, lastly, ViroPharma, which Shire renamed Shire Viropharma Inc. upon acquisition. ViroPharma was valued at $3.3 billion on its final day of trading; at a purchase price of $4.2 billion, it set a new company record. In 2014 Shire acquired two rare disease drug companies: Fibrotech, with its antifibrotic compounds, for $75 million, and Lumena, a company researching rare gastro-intestinal and hepatic compounds, for $260 million. In 2015, NPS Pharmaceuticals was acquired for $5.2 billion, bringing along its rare disease drugs Gattex and Natpara; on its final day of trading, NPS had a market capitalisation of $4.99 billion. Later in the same year the company also acquired Meritage Pharma for $245 million, Foresight Biotherapeutics for $300 million and Dyax for $6.5 billion. These purchases bolstered Shire's gastro-intestinal and rare disease sectors: Meritage brought a Phase-III-ready budesonide treatment for eosinophilic esophagitis; Foresight expanded the pipeline with a late-stage candidate for infectious conjunctivitis, FST-100; and Dyax enlarged the rare disease catalogue with its portfolio of plasma kallikrein inhibitors against hereditary angioedema, led by the approved drug Kalbitor and the Phase III candidate DX-2930 | https://en.wikipedia.org/wiki?curid=871579 |
Shire (pharmaceutical company) In January 2016, the company made its most significant purchase, the $32 billion acquisition of Baxalta (which had been spun off from Baxter the previous year), creating the largest global biotech company focused solely on rare diseases. In April 2018, Shire agreed to sell its oncology business to French pharmaceutical company Servier for £1.7 billion. On 20 June 2014, Shire rejected a takeover attempt by AbbVie. AbbVie offered £46.11 per share (£27.3 billion or $46.5 billion in total). On 8 July, the offer was increased to $51.5 billion. On 18 July, it was announced that AbbVie would acquire Shire for $54.8 billion. On 15 October, news broke suggesting AbbVie was reconsidering the proposed takeover due to changes in US "tax inversion" law, and on 16 October AbbVie's board recommended that shareholders vote against the deal. This news sent Shire's share price down over 27%; however, AbbVie would be subject to a $1.6 billion break-up fee payable to Shire. On 21 October, the merger was called off. In April 2018, it was reported that Takeda Pharmaceutical Company had made an approach to acquire Shire. Days later Shire announced it had rejected all three Takeda bids. The first bid valued the business at £41 billion (£28 per Shire share paid in Takeda shares plus £16 per share in cash), the second £43 billion (£28.75 per Shire share paid in Takeda shares plus £16.75 per share in cash) and the third £44 billion (£28 per Shire share paid in Takeda shares plus £17.75 per share in cash) | https://en.wikipedia.org/wiki?curid=871579 |
Shire (pharmaceutical company) Reuters also reported interest from Allergan; however, Allergan ruled itself out a day later. A day later Takeda increased its offer with a fourth bid of £26 per Shire share paid in Takeda shares plus £21 per share in cash, giving a total value of £44.3 billion ($62.1 billion). On 24 April, Takeda submitted an enhanced fifth bid for the company. On 25 April, Shire said that it would recommend the revised £45.8 billion ($64 billion) offer to its shareholders. The enhanced offer included a more generous cash component, offering £21.76 ($30.33) in cash for each Shire ordinary share. The same day, GlaxoSmithKline ruled out making any form of counter-bid. On 8 May 2018, an agreement was finally reached in which Shire was sold to Takeda in a $62 billion deal. Takeda's acquisition of Shire closed on 8 January 2019. In July 2014, Shire licensed the rights to the investigational Hunter syndrome compound AGT-182 from ArmaGen for up to $225 million. The annual revenue figures in the following table were drawn from the company's 2015 preliminary results. Flemming Ørnskov was the company's chief executive officer until Takeda's acquisition of Shire closed on 8 January 2019. Other senior executives included Ginger Gregory as Chief Human Resources Officer, Jeffrey Poulton as CFO and Philip Vickers as Head of R&D. James Bowling vacated his position as interim CFO in the aftermath of the collapse of the AbbVie inversion deal | https://en.wikipedia.org/wiki?curid=871579 |
Shire (pharmaceutical company) The Chair of Shire's Board of Directors was Susan Kilsby. | https://en.wikipedia.org/wiki?curid=871579 |
Auricupride is a natural alloy of copper and gold with the chemical formula Cu₃Au. The alloy crystallizes in the cubic crystal system in the L1₂ structure type and occurs as malleable grains or platey masses. It is opaque yellow with a reddish tint. It has a hardness of 3.5 and a specific gravity of 11.5. A variant called "tetra-auricupride" (CuAu) exists. Silver may be present, resulting in the variety "argentocuproauride" (Cu₃(Au,Ag)). It was first described in 1950 for an occurrence in the Ural Mountains, Russia. It occurs as a low-temperature "unmixing" product in serpentinites and as reduction "halos" in redbed deposits. It is most often found in Chile, Argentina, Tasmania, Russia, Cyprus, Switzerland and South Africa. | https://en.wikipedia.org/wiki?curid=871659 |
Complementary experiments In physics, two experimental techniques are often called complementary if they investigate the same subject in two different ways, such that two different (ideally non-overlapping) properties or aspects can be probed. For example, X-ray scattering and neutron scattering experiments are often said to be complementary because the former reveals information about the electron density of the atoms in the target but gives no information about the nuclei (which are too small to affect the X-rays significantly), while the latter probes the nuclei of the atoms but reveals nothing about their electron shells (because the neutrons, being neutral, do not interact with the charged electrons). Scattering experiments are sometimes also called complementary when they investigate the same physical property of a system from two complementary viewpoints in the sense of Bohr. For example, time-resolved and energy-resolved experiments are said to be complementary: the former uses a pulse that is well defined in time (its position is well known at a given time), while the latter uses a monochromatic pulse well defined in energy (its frequency is well known). | https://en.wikipedia.org/wiki?curid=872023 |
Zirkelite is an oxide mineral with the formula (Ca,Th,Ce)Zr(Ti,Nb)₂O₇. It occurs as well-formed, fine-grained isometric crystals. It is a black, brown or yellow mineral with a hardness of 5.5 and a specific gravity of 4.7. It was first discovered in Brazil in 1895 and named for the German petrographer Ferdinand Zirkel (1838–1912). The initial discovery was from the Jacupiranga carbonatite, São Paulo, Brazil. It is also found in Canada, Kazakhstan, Norway, Russia, South Africa, the United Kingdom, and the United States. | https://en.wikipedia.org/wiki?curid=872324 |