text | source |
|---|---|
CSIRO is led by Chief Executive Larry Marshall, who is responsible for management of the organisation. CSIRO is structured into Research Business Units, National Facilities and Collections, and Services. As at 2019, CSIRO's research areas are identified as "Impact science" and organised into the following Business Units: CSIRO manages national research facilities and scientific infrastructure on behalf of the nation to assist with the delivery of research. The national facilities and specialised laboratories are available to both international and Australian users from industry and research. As at 2019, the following National Facilities are listed: CSIRO also manages a number of collections of animal and plant specimens that contribute to national and international biological knowledge. The National Collections contribute to taxonomic, genetic, agricultural and ecological research. As at 2019, CSIRO's Collections are listed as the following: In 2019, Services are itemised as follows: Other services are noted as including education, publishing, infrastructure technologies, Small and Medium Enterprise engagement and Futures. A precursor to CSIRO, the Advisory Council of Science and Industry, was established in 1916 on the initiative of prime minister Billy Hughes. However, the Advisory Council struggled with insufficient funding during the First World War. In 1920 the Council was renamed the Commonwealth Institute of Science and Industry, and was led by George Handley Knibbs (1921–26), but it continued to struggle financially | https://en.wikipedia.org/wiki?curid=288262 |
CSIRO In 1926 the Australian Parliament modified the principal Act for national scientific research (the "Institute of Science and Industry Act 1920") by passing "The Science and Industry Research Act 1926". The new Act replaced the Institute with the Council for Scientific and Industrial Research (CSIR). With encouragement from prime minister Stanley Bruce, strengthened national science leadership and increased research funding, CSIR grew rapidly and achieved significant early successes. The council was structured to represent the federal structure of government in Australia, with state-level committees and a central council. In addition to an improved structure, CSIR benefited from strong bureaucratic management under George Julius, David Rivett, and Arnold Richardson. Research focused on primary and secondary industries. Early in its existence, CSIR established divisions studying animal health and animal nutrition. After the Great Depression, research was extended into manufacturing and other secondary industries. In 1949 the Act was changed again and the entity renamed the Commonwealth Scientific and Industrial Research Organisation. The amendment enlarged and reconstituted the organisation and its administrative structure. Under Ian Clunies Ross as chairman, CSIRO pursued new areas such as radio astronomy and industrial chemistry. CSIRO still operates under the provisions of the 1949 Act, conducting a wide range of scientific inquiry | https://en.wikipedia.org/wiki?curid=288262 |
CSIRO Since 1949 CSIRO has expanded its activities to almost every field of primary, secondary and tertiary industry, including the environment, human nutrition, conservation, urban and rural planning, and water. It works with leading organisations around the world and maintains more than 50 sites across Australia and in France, Chile and the United States of America, employing about 5500 people. Notable inventions and breakthroughs by CSIRO include: CSIRO had a pioneering role in the scientific discovery of the universe through radio "eyes". A team led by Paul Wild built and operated (from 1948) the world's first solar radiospectrograph, and from 1967 the radioheliograph at Culgoora in New South Wales. For three decades, the Division of Radiophysics had a world-leading role in solar research, attracting prominent solar physicists from around the world. CSIRO owned the first computer in Australia, CSIRAC, built as part of a project begun in the Sydney Radiophysics Laboratory in 1947. The CSIR Mk 1 ran its first program in 1949, making it the fifth electronic computer in the world. It was over 1,000 times faster than the mechanical calculators available at the time. It was decommissioned in 1955 and recommissioned in Melbourne as CSIRAC in 1956, serving as a general-purpose computing machine used by over 700 projects until 1964. The CSIRAC is the only surviving first-generation computer in the world. Between 1965 and 1985, George Bornemissza of CSIRO's Division of Entomology founded and led the Australian Dung Beetle Project | https://en.wikipedia.org/wiki?curid=288262 |
CSIRO Bornemissza, upon settling in Australia from Hungary in 1951, noticed that the pastureland was covered in dry cattle dung pads which did not seem to be recycled into the soil and which caused areas of rank pasture unpalatable to the cattle. He proposed that the reason for this was that native Australian dung beetles, which had co-evolved alongside the marsupials (whose dung is very different in composition from that of cattle), were not adapted to utilise cattle dung for their nutrition and breeding, since cattle had been introduced to the continent only relatively recently, in the 1880s. The Australian Dung Beetle Project sought, therefore, to introduce species of dung beetle from South Africa and Europe (which had co-evolved alongside bovids) in order to improve the fertility and quality of cattle pastures. Twenty-three species were successfully introduced over the duration of the project, and the introductions also had the effect of reducing the pestilent bush fly population by 90%. CSIRO was the first Australian organisation to start using the Internet and was able to register the second-level domain csiro.au (as opposed to csiro.org.au or csiro.com.au). Guidelines were introduced in 1996 to regulate the use of the .au domain. When CSIR was formed in 1926, it was led initially by an Executive Committee of three people, two of whom were designated as the Chairman and the Chief Executive. Since then the roles and responsibilities of the Chair and Chief Executive have changed many times | https://en.wikipedia.org/wiki?curid=288262 |
CSIRO From 1927 to 1986 the head of CSIR (and, from 1949, CSIRO) was the Chairman, who was responsible for the management of the organisation, supported by the Chief Executive. From 1 July 1959 to 4 December 1986 CSIRO had no Chief Executive; the Chairman undertook both functions. In 1986, when the Australian Government changed the structure of CSIRO to include a board of non-executive members plus the Chief Executive to lead the organisation, the roles changed. The Chief Executive is now responsible for management of the organisation in accordance with the strategy, plans and policies approved by the Board, which, led by the Chair of the Board, is responsible to the Australian Government for the overall strategy, governance and performance of CSIRO. As with its governance structure, the priorities and structure of CSIRO, and the teams and facilities that implement its research, have changed as Australia's scientific challenges have evolved. In 2005 the CSIRO gained worldwide attention, including some criticism, for promoting a high-protein, low-carbohydrate diet of its own creation called the "Total Wellbeing Diet". The CSIRO published the diet in a book which sold over half a million copies in Australia and over 100,000 overseas. The diet was criticised in an editorial in "Nature" for giving scientific credence to a "fashionable" diet sponsored by the meat and dairy industries | https://en.wikipedia.org/wiki?curid=288262 |
CSIRO In the early 1990s, radio astronomy scientists John O'Sullivan, Graham Daniels, Terence Percival, Diethelm Ostry and John Deane undertook research directed at finding a way to make wireless networks work as fast as wired networks within confined spaces such as office buildings. The technique they developed, involving a particular combination of forward error correction, frequency-domain interleaving, and multi-carrier modulation, became the subject of a CSIRO patent, which was granted on 23 January 1996. In 1997 Macquarie University professor David Skellern and his colleague Neil Weste established the company Radiata, Inc., which took a non-exclusive licence to the patent for the purpose of developing commercially viable integrated circuit devices implementing the patented technology. During this period, the IEEE 802.11 Working Group was developing the 802.11a wireless LAN standard. CSIRO did not participate directly in the standards process; however, David Skellern was an active participant, as secretary of the Working Group and representative of Radiata. In 1998 it became apparent that the patent would be pertinent to the standard. In response to a request from Victor Hayes of Lucent Technologies, who was Chair of the 802.11 Working Group, CSIRO confirmed its commitment to make non-exclusive licences available to implementers of the standard on reasonable and non-discriminatory terms. In 1999, Cisco Systems, Inc | https://en.wikipedia.org/wiki?curid=288262 |
CSIRO and Broadcom Corporation each invested A$4 million in Radiata, representing an 11% stake for each investor and valuing the company at around A$36 million. In September 2000, Radiata demonstrated, at a major international exhibition, a chip set complying with the recently finalised IEEE 802.11a Wi-Fi standard and capable of handling transmission rates of up to 54 Mbit/s. In November 2000, Cisco acquired Radiata in exchange for US$295 million in Cisco common stock, with the intention of incorporating the Radiata Baseband Processor and Radio chips into its Aironet family of wireless LAN products. Cisco subsequently took a large write-down on the Radiata acquisition following the 2001 telecoms crash, and in 2004 it shut down its internal development of wireless chipsets based on the Radiata technology in order to focus on software development and emerging new technologies. Controversy over the patent arose in 2006, after the organisation won an injunction against Buffalo Technology in an infringement suit filed in federal court in the Eastern District of Texas. The injunction was subsequently suspended on appeal, with the Court of Appeals for the Federal Circuit finding that the judge in Texas should have allowed a trial to proceed on Buffalo's challenge to the validity of the patent. In 2007, CSIRO declined to provide an assurance to the IEEE that it would not sue companies which refused to take a license for use in 802 | https://en.wikipedia.org/wiki?curid=288262 |
CSIRO 11n-compliant devices, while at the same time continuing to defend legal challenges to the validity of the patent brought by Intel, Dell, Microsoft, Hewlett-Packard and Netgear. In April 2009, Hewlett-Packard broke ranks with the rest of the industry, becoming the first to reach a settlement of its dispute with CSIRO. This agreement was followed quickly by settlements with Microsoft, Fujitsu and Asus, and then Dell, Intel, Nintendo, Toshiba, Netgear, Buffalo, D-Link, Belkin, SMC, Accton, and 3Com. The controversy grew after CSIRO sued the US carriers AT&T, Verizon and T-Mobile in 2010, with the organisation accused of being "Australia's biggest patent troll", a wrathful "patent bully", and of imposing a "WiFi tax" on American innovation. Further fuel was added to the controversy after a settlement with the carriers, worth around $229 million, was announced in March 2012. Encouraged in part by an announcement by the Australian Minister for Tertiary Education, Skills, Science and Research, Senator Chris Evans, an article in Ars Technica portrayed CSIRO as a shadowy organisation responsible for US consumers being compelled to make "a multimillion dollar donation" on the basis of a questionable patent claiming "decades old" technology. The resulting debate became so heated that the author was compelled to follow up with a defence of the original article. An alternative view was also published on The Register, challenging a number of the assertions made in the Ars Technica piece | https://en.wikipedia.org/wiki?curid=288262 |
CSIRO Total income to CSIRO from the patent is currently estimated at nearly $430 million. On 14 June 2012, the inventors received the European Patent Office (EPO) European Inventor Award (EIA) in the category of "Non-European Countries". On 14 July 2011, Greenpeace activists vandalised a CSIRO crop of GM wheat, disrupting the scientific trials being undertaken. Greenpeace was forced to pay CSIRO reparations of $280,000 for the criminal damage, and was accused by the sentencing judge, Justice Hilary Penfold, of cynically using junior members of the organisation, whose good standing would help them avoid custodial sentences; the offenders were given 9-month suspended sentences. Following the attack, Greenpeace criticised CSIRO for a close relationship with industry that had led to an increase in genetically modified crops, even though a core aim of CSIRO is Cooperative Research, "working hand in hand with industry [to] build partnerships and engage with industry to generate impact". On 25 November 2009, a debate was held in the Australian Senate concerning the alleged involvement of the CSIRO and the Labor government in censorship. The debate was called for by opposition parties after evidence came to light that a paper critical of carbon emissions trading was being suppressed. At the time, the Labor government was trying to get such a scheme through the Senate | https://en.wikipedia.org/wiki?curid=288262 |
CSIRO After the debate, the Science Minister, Kim Carr, was forced to release the paper, but when doing so in the Senate he also delivered a letter from the CEO of the CSIRO, Megan Clark, which attacked the report's author and threatened him with unspecified punishment. The author of the paper, Clive Spash, was cited in the press as having been bullied and harassed, and later gave a radio interview about this. In the midst of the affair, CSIRO management had considered releasing the paper with edits that "Nature" reported would be "tiny". Spash claimed the changes actually demanded amounted to censorship, and resigned. He later posted on his website a document detailing the text that management had demanded be deleted; by itself, this document forms a coherent set of statements criticising emissions trading, without any additional wording needed. In subsequent Senate Estimates hearings during 2010, Senator Carr and Clark went on record claiming that the paper had originally been stopped from publication solely because its quality did not meet standards. At the time of its attempted suppression, however, the paper had been accepted for publication in an academic journal, New Political Economy, which in 2010 was ranked by the Australian Research Council as an 'A class' publication. In an ABC radio interview, Spash called for a Senate enquiry into the affair and the role played by senior CSIRO management and the Science Minister | https://en.wikipedia.org/wiki?curid=288262 |
CSIRO After these events, the "Sydney Morning Herald" reported that "Questions are being raised about the closeness of BHP Billiton and the CSIRO under its chief executive, Megan Clark". After his resignation, an unedited version of the paper was released by Spash as a discussion paper, and it was later published as an academic journal article. On 11 April 2013, the "Sydney Morning Herald" ran a story on how CSIRO had "duped" the Swiss-based pharmaceutical giant Novartis into purchasing an anti-counterfeit technology for its vials of injectable Voltaren. The invention was marketed by a small Australian company called DataTrace DNA as a method of identifying fake vials, on the basis that a unique tracer code developed by CSIRO was embedded in the product. However, the code sold to Novartis for more than A$2M was apparently not unique, and was based on a "cheap tracer ... bought in bulk from a Chinese distributor". Novartis was contractually bound not to reverse-engineer the tracer to verify its uniqueness. The "Sydney Morning Herald" report alleges that this was done with the knowledge of key personnel. CSIRO has since conducted a full review of the allegations and found no evidence to support them. In recent years the CSIRO has come under the spotlight for allegedly exhibiting a culture of workplace bullying and harassment. Former employees began to come forward with accounts of workplace bullying and other unreasonable behaviour by current and former staff members. CSIRO took the allegations seriously and responded to the articles on a number of occasions | https://en.wikipedia.org/wiki?curid=288262 |
CSIRO The shadow minister for innovation, industry, science and research, Sophie Mirabella, wrote to the government requesting that it establish an inquiry. Mirabella said she was aware of as many as 100 cases of alleged workplace harassment. On 20 July 2012 Comcare issued CSIRO with an Improvement Notice regarding its handling and management of workplace misconduct and code-of-conduct investigations and allegations. On 24 June 2013 Mirabella advised the Australian House of Representatives that, in relation to the workers' compensation claim for psychological injuries of ex-employee Martin Williams (a claim vigorously defended by Comcare on the advice of the CSIRO), CSIRO officers had provided false testimony on no fewer than 128 occasions under oath when the matter went before the Administrative Appeals Tribunal. Mirabella stated, "even in establishing the framework for this inquiry it is obvious there's an inappropriate 'hands on' approach by CSIRO." In response to the allegations, Clark commissioned Dennis Pearce, assisted by an investigation team from HWL Ebsworth Lawyers, to conduct an independent investigation into allegations of workplace bullying and other unreasonable behaviour. Mirabella continued to question the independence of the investigation. The first stage of the investigation was scheduled to publish its findings at the end of July 2013, and the final stage was scheduled to be complete by February 2014 | https://en.wikipedia.org/wiki?curid=288262 |
CSIRO In August 2015 the CSIRO discontinued its annual July and August survey, conducted over the previous five years, which polled Australians to build a long-term view of how they regarded global warming and their support for action. In the most recent poll, in 2013, 86 per cent agreed with the statement that climate change was occurring and only 7.6 per cent disagreed. On 11 February 2016, Dr Larry Marshall, a former venture capitalist with Southern Cross Venture Holdings who had been appointed CEO of the CSIRO on 1 January 2015, caused an international outcry after describing Australia's national climate change discussion as "more like religion than science", a week after announcing hundreds of job cuts to the organisation that would reduce the effectiveness of its climate research team. In "an open letter to the Australian Government and CSIRO", 2,800 leading climate scientists from 60 countries said that the announcement of cuts to the CSIRO's Oceans and Atmosphere research program had alarmed the global climate research community. They said the decision showed a lack of insight and a misunderstanding of the depth and significance of Australian contributions to global and regional climate research. | https://en.wikipedia.org/wiki?curid=288262 |
Alignments of random points in a plane are counter-intuitively easy to find, as can be demonstrated statistically, when a large number of random points are marked on a bounded flat surface. This has been put forward as a demonstration that ley lines and other similar mysterious alignments, believed by some to be phenomena of deep significance, might exist due to chance alone, as opposed to the supernatural or anthropological explanations put forward by their proponents. The topic has also been studied in the fields of computer vision and astronomy. A number of studies have examined the mathematics of alignment of random points on the plane. In all of these, the width of the line (the allowed displacement of the positions of the points from a perfect straight line) is important. It allows for the fact that real-world features are not mathematical points, and that their positions need not line up exactly for them to be considered in alignment. Alfred Watkins, in his classic work on ley lines "The Old Straight Track", used the width of a pencil line on a map as the threshold for what might be regarded as an alignment. For example, using a 1 mm pencil line to draw alignments on a 1:50,000 Ordnance Survey map, the corresponding width on the ground would be 50 m. Contrary to intuition, finding alignments between randomly placed points on a landscape gets progressively easier as the geographic area to be considered increases | https://en.wikipedia.org/wiki?curid=289860 |
Alignments of random points One way of understanding this phenomenon is to see that the increase in the number of possible combinations of sets of points in the larger area overwhelms the decrease in the probability that any given set of points in that area lines up. The generally accepted meaning of "alignment" can be made precise as follows: a path of width "w" may be defined as the set of all points within a distance of "w"/2 of a straight line on a plane, of a great circle on a sphere, or in general of any geodesic on any other kind of manifold. Note that, in general, any given set of points that are aligned in this way will contain a large number of infinitesimally different straight paths. Therefore, the existence of at least one straight path suffices for a set of points to count as an alignment. For this reason, it is easier to count the sets of points than the paths themselves. The number of alignments found is very sensitive to the allowed width "w", increasing approximately in proportion to "w"^("k"−2), where "k" is the number of points in an alignment. The following is a very approximate order-of-magnitude estimate of the likelihood of alignments, assuming a plane covered with uniformly distributed "significant" points. Consider a set of "n" points in a compact area with approximate diameter "L" and area approximately "L"^2. Consider a valid line to be one where every point is within distance "w"/2 of the line (that is, lies on a track of width "w", where "w" ≪ "L") | https://en.wikipedia.org/wiki?curid=289860 |
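The width-"w" criterion above is easy to check computationally: a set of points counts as aligned if some straight line leaves every point within "w"/2 of it. The sketch below (function names are illustrative, not from any particular library) tests only the line through the two mutually most distant points, which is a simple sufficient check rather than an exhaustive search over all candidate lines.

```python
import math

def point_line_distance(p, a, b):
    """Perpendicular distance from point p to the infinite line through a and b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    # |cross product| / segment length gives the perpendicular distance.
    return abs(dx * (ay - py) - dy * (ax - px)) / math.hypot(dx, dy)

def is_alignment(points, w):
    """True if every point lies within w/2 of the line through the two
    mutually most distant points (a simple, non-exhaustive test line)."""
    a, b = max(((p, q) for p in points for q in points if p != q),
               key=lambda pq: math.hypot(pq[0][0] - pq[1][0],
                                         pq[0][1] - pq[1][1]))
    return all(point_line_distance(p, a, b) <= w / 2 for p in points)
```

For example, three points at (0, 0), (1, 0.01) and (2, 0) form an alignment of width 0.1, while (0, 0), (1, 1), (2, 0) do not.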
Alignments of random points Consider all the unordered sets of "k" points from the "n" points, of which there are "n"!/("k"!("n" − "k")!) (see factorial and binomial coefficient for notation). To make a rough estimate of the probability that any given subset of "k" points is approximately collinear in the way defined above, consider the line between the "leftmost" and "rightmost" two points in that set (for some arbitrary left/right axis; top and bottom can be chosen for the exceptional vertical case). These two points are by definition on this line. For each of the remaining "k" − 2 points, the probability that the point is "near enough" to the line is roughly "w"/"L", which can be seen by considering the ratio of the area of the line tolerance zone (roughly "wL") to the overall area (roughly "L"^2). So, the expected number of "k"-point alignments, by this definition, is very roughly ("n"!/("k"!("n" − "k")!)) × ("w"/"L")^("k"−2). Among other things this can be used to show that, contrary to intuition, the number of "k"-point lines expected from random chance in a plane covered with points at a given density, for a given line width, increases much more than linearly with the size of the area considered, since the combinatorial explosion of growth in the number of possible combinations of points more than makes up for the increase in difficulty of any given combination lining up | https://en.wikipedia.org/wiki?curid=289860 |
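The rough estimate above — the number of "k"-point subsets times the chance, roughly ("w"/"L") per extra point, that each lands near the line — can be evaluated directly. A sketch with purely illustrative numbers (100 points in a 5 km square, 50 m line width, mirroring the Watkins-style tolerance):

```python
from math import comb

def expected_alignments(n, k, w, L):
    """Crude expected number of k-point alignments of width w among n
    uniformly random points in a region of diameter ~L:
    C(n, k) * (w / L)**(k - 2)."""
    return comb(n, k) * (w / L) ** (k - 2)

# Quadrupling the area at fixed point density (n scales with L^2)
# multiplies the expected 4-point count far more than linearly:
small = expected_alignments(100, 4, 50, 5_000)
large = expected_alignments(400, 4, 50, 10_000)
```

With these numbers the larger region yields roughly 67 times as many expected 4-point alignments despite having only 4 times the area, illustrating the "much more than linearly" growth described in the text.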
Alignments of random points More precise expressions have been derived for the number of 3-point alignments of maximum width "w" and maximum length "d" expected by chance among "n" points placed randomly on a square of side "L", both ignoring and including edge effects (alignments lost over the boundaries of the square), along with a generalisation to "k"-point alignments (ignoring edge effects). These expressions have roughly the same asymptotic scaling properties as the crude approximation in the previous section, with the combinatorial explosion for large "n" overwhelming the effects of the other variables. Computer simulations show that points on a plane tend to form alignments similar to those found by ley hunters, in numbers consistent with the order-of-magnitude estimates above, suggesting that ley lines may also be generated by chance. This phenomenon occurs regardless of whether the points are generated pseudo-randomly by computer or taken from data sets of mundane features such as pizza restaurants or telephone booths. It is easy to find alignments of 4 to 8 points in reasonably small data sets with "w" = 50 m. Choosing large areas or larger values of "w" makes it easy to find alignments of 20 or more points. | https://en.wikipedia.org/wiki?curid=289860 |
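Simulations of the kind described above are straightforward to reproduce. The sketch below (parameters are illustrative: 100 points on a 1 km square with a 5 m track width) counts unordered triples in which one point lies within "w"/2 of the line through the other two, matching the working definition used earlier:

```python
import itertools
import random

def count_3pt_alignments(points, w):
    """Count unordered triples where some point lies within w/2 of the
    line through the other two (the 'track of width w' definition)."""
    count = 0
    for a, b, c in itertools.combinations(points, 3):
        # Try each point as the "middle" one, line through the other two.
        for p, q, r in ((a, b, c), (b, a, c), (c, a, b)):
            dx, dy = r[0] - q[0], r[1] - q[1]
            dist = (abs(dx * (q[1] - p[1]) - dy * (q[0] - p[0]))
                    / (dx * dx + dy * dy) ** 0.5)
            if dist <= w / 2:
                count += 1
                break
    return count

random.seed(1)
pts = [(random.uniform(0, 1000), random.uniform(0, 1000)) for _ in range(100)]
hits = count_3pt_alignments(pts, w=5.0)
```

The crude estimate of the previous section, C(100, 3) × (5/1000) ≈ 808, predicts that hundreds of 3-point "alignments" should appear among just 100 purely random points — which the simulation bears out.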
Lattice model (physics) In physics, a lattice model is a physical model that is defined on a lattice, as opposed to the continuum of space or spacetime. Lattice models originally occurred in the context of condensed matter physics, where the atoms of a crystal automatically form a lattice. Currently, lattice models are quite popular in theoretical physics for many reasons. Some models are exactly solvable and thus offer insight into physics beyond what can be learned from perturbation theory. Lattice models are also ideal for study by the methods of computational physics, as the discretization of any continuum model automatically turns it into a lattice model. Examples of lattice models in condensed matter physics include the Ising model, the Potts model, the XY model and the Toda lattice. The exact solutions of many of these models (when they are solvable) include the presence of solitons. Techniques for solving them include the inverse scattering transform and the method of Lax pairs, the Yang–Baxter equation and quantum groups. The solution of these models has given insights into the nature of phase transitions, magnetization and scaling behaviour, as well as into the nature of quantum field theory. Physical lattice models frequently occur as an approximation to a continuum theory, either to give an ultraviolet cutoff to the theory to prevent divergences or to perform numerical computations | https://en.wikipedia.org/wiki?curid=291472 |
Lattice model (physics) An example of a continuum theory that is widely studied by lattice models is lattice QCD, a discretization of quantum chromodynamics. Digital physics, however, considers nature to be fundamentally discrete at the Planck scale, which imposes an upper limit on information density, also known as the holographic principle. More generally, lattice gauge theory and lattice field theory are areas of study. Lattice models are also used to simulate the structure and dynamics of polymers. Examples include the bond fluctuation model and the 2nd model. | https://en.wikipedia.org/wiki?curid=291472 |
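The Ising model mentioned above is the simplest lattice model to experiment with numerically. A minimal single-spin-flip Metropolis Monte Carlo sketch follows (lattice size, couplings and sweep counts are illustrative, and the code is a pedagogical sketch rather than a production simulation); it shows the ordered phase surviving below the critical coupling and melting away above the critical temperature:

```python
import math
import random

def metropolis_ising(L=16, beta=0.6, sweeps=300, seed=0):
    """Single-spin-flip Metropolis sampling of the 2-D Ising model on an
    L x L lattice with periodic boundaries, started from the all-up state;
    returns the absolute magnetisation per spin after `sweeps` sweeps."""
    rng = random.Random(seed)
    spin = [[1] * L for _ in range(L)]
    for _ in range(sweeps * L * L):
        i, j = rng.randrange(L), rng.randrange(L)
        # Energy cost of flipping spin (i, j): dE = 2 * s_ij * sum(neighbours).
        nb = (spin[(i + 1) % L][j] + spin[(i - 1) % L][j]
              + spin[i][(j + 1) % L] + spin[i][(j - 1) % L])
        dE = 2 * spin[i][j] * nb
        # Metropolis rule: always accept downhill moves, uphill with exp(-beta*dE).
        if dE <= 0 or rng.random() < math.exp(-beta * dE):
            spin[i][j] = -spin[i][j]
    m = sum(sum(row) for row in spin) / (L * L)
    return abs(m)

# Below the critical coupling (beta_c ~ 0.44) the lattice stays magnetised;
# well above the critical temperature the magnetisation decays toward zero.
m_cold = metropolis_ising(beta=0.6)
m_hot = metropolis_ising(beta=0.1)
```

This illustrates the phase-transition behaviour the text attributes to lattice models: a single parameter sweep across beta exposes the onset of spontaneous magnetisation.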
Flavr Savr (also known as CGN-89564-2; pronounced "flavor saver"), a genetically modified tomato, was the first commercially grown genetically engineered food to be granted a license for human consumption. It was produced by the Californian company Calgene, and submitted to the U.S. Food and Drug Administration (FDA) in 1992. On May 18, 1994, the FDA completed its evaluation of the tomato and the use of APH(3')II, concluding that the tomato "is as safe as tomatoes bred by conventional means" and "that the use of aminoglycoside 3'-phosphotransferase II is safe for use as a processing aid in the development of new varieties of tomato, rapeseed oil, and cotton intended for food use." It was first sold in 1994, and was only available for a few years before production ceased in 1997. Calgene made history, but mounting costs prevented the company from becoming profitable, and it was eventually acquired by Monsanto Company. Tomatoes have a short shelf-life in which they remain firm and ripe. This lifetime may be shorter than the time needed for them to reach market when shipped from winter growing areas to markets in the north, and the softening process can also lead to more of the fruit being damaged during transit. To address this, tomatoes intended for shipping are often picked while they are unripe, or "green", and then prompted to ripen just before delivery through the use of ethylene gas which acts as a plant hormone | https://en.wikipedia.org/wiki?curid=293676 |
Flavr Savr The downside to this approach is that the tomato does not complete its natural growing process, and the final flavor suffers as a result. Through genetic engineering, Calgene hoped to slow down the ripening process of the tomato and thus prevent it from softening, while still allowing the tomato to retain its natural colour and flavour. This would allow it to fully ripen on the vine and still be shipped long distances without going soft. The Flavr Savr was made more resistant to rotting by the addition of an antisense gene which interferes with the production of the enzyme polygalacturonase. The enzyme normally degrades pectin in the cell walls and results in the softening of fruit, which makes it more susceptible to damage by fungal infections. The Flavr Savr turned out to disappoint researchers in that respect: the antisensed PG gene had a positive effect on shelf life, but not on the fruit's firmness, so the tomatoes still had to be harvested like any other unmodified vine-ripe tomatoes. An improved flavor, later achieved through traditional breeding of Flavr Savr and better-tasting varieties, would also have contributed to selling the Flavr Savr at a premium price at the supermarket. The FDA stated that special labeling for these modified tomatoes was not necessary because they have the essential characteristics of non-modified tomatoes. Specifically, there was no evidence for health risks, and the nutritional content was unchanged. The failure of the Flavr Savr has been attributed to Calgene's inexperience in the business of growing and shipping tomatoes | https://en.wikipedia.org/wiki?curid=293676 |
Flavr Savr In the UK, Zeneca produced a tomato paste that used technology similar to the Flavr Savr. Don Grierson was involved in the research to make the genetically modified tomato. Due to the characteristics of the tomato, it was cheaper to produce than conventional tomato paste, resulting in the product being 20% cheaper. Between 1996 and 1999, 1.8 million cans, clearly labelled as genetically engineered, were sold in the major supermarket chains Sainsbury's and Safeway UK. At one point the paste outsold normal tomato paste but sales fell in the autumn of 1998. The House of Commons of the United Kingdom published a report in which they stated that the decline in sales during this period was linked to changing consumer perceptions of genetically modified crops. The report identified several possible factors, including product labeling and perception of choice, lobbying campaigns, and media attention. It concluded that the tone of media reports on the subject underwent a "fundamental shift" in response to a high-profile incident in which Dr. Arpad Pusztai, a researcher for Rowett Research Institute, was fired after making a televised claim about detrimental health effects in lab rats fed a diet of genetically modified potatoes (see the Pusztai affair). Subsequent peer review and testimony by Dr. Pusztai led the House Science and Technology Select Committee to conclude that his initial claim was "contradicted by his own evidence | https://en.wikipedia.org/wiki?curid=293676 |
Flavr Savr " In the intervening period, Sainsbury's and Safeway both pledged that none of their house brand products would contain genetically modified ingredients. | https://en.wikipedia.org/wiki?curid=293676 |
Helimagnetism is an incommensurate form of magnetic ordering that results from the competition between ferromagnetic and antiferromagnetic exchange interactions, and it is typically only observed at liquid-helium temperatures. Spins of neighbouring magnetic moments arrange themselves in a spiral or helical pattern, with a characteristic turn angle of somewhere between 0 and 180 degrees. It is possible to view ferromagnetism and antiferromagnetism as helimagnetic structures with characteristic turn angles of 0 and 180 degrees respectively. Helimagnetic order breaks spatial inversion symmetry, as it can be either left-handed or right-handed in nature. Helimagnetism was first proposed in 1959 as an explanation of the magnetic structure of manganese dioxide. First detected by neutron diffraction, it has since been observed more directly by Lorentz electron microscopy. Some helimagnetic structures are reported to be stable up to room temperature. | https://en.wikipedia.org/wiki?curid=6005413 |
Stable isotope labeling by amino acids in cell culture Stable Isotope Labeling by/with Amino acids in Cell culture (SILAC) is a technique based on mass spectrometry that detects differences in protein abundance among samples using non-radioactive isotopic labeling. It is a popular method for quantitative proteomics. Two populations of cells are cultivated in cell culture. One of the cell populations is fed with growth medium containing normal amino acids. In contrast, the second population is fed with growth medium containing amino acids labeled with stable (non-radioactive) heavy isotopes. For example, the medium can contain arginine labeled with six carbon-13 atoms (¹³C) instead of the normal carbon-12 (¹²C). When the cells are growing in this medium, they incorporate the heavy arginine into all of their proteins. Thereafter, all peptides containing a single arginine are 6 Da heavier than their normal counterparts. Alternatively, uniform labeling with ¹³C or ¹⁵N can be used. The trick is that the proteins from both cell populations can be combined and analyzed together by mass spectrometry. Pairs of chemically identical peptides of different stable-isotope composition can be differentiated in a mass spectrometer owing to their mass difference. The ratio of peak intensities in the mass spectrum for such peptide pairs reflects the abundance ratio for the two proteins. A SILAC approach involving incorporation of tyrosine labeled with nine carbon-13 atoms (¹³C₉) instead of the normal carbon-12 (¹²C) has been utilized to study tyrosine kinase substrates in signaling pathways | https://en.wikipedia.org/wiki?curid=6006708 |
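The quantification step described above can be sketched in a few lines. Each ¹³C₆-arginine adds 6 × (13.003355 − 12) ≈ 6.020 Da to a peptide, and the heavy-to-light peak-intensity ratio estimates the relative protein abundance between the two cell populations. This is a minimal illustration; the function names and example numbers are invented for the sketch and do not come from any specific proteomics software.

```python
# Mass added by one 13C6-arginine: six 12C -> 13C substitutions.
DELTA_PER_ARG = 6 * (13.003355 - 12.0)   # ~6.020 Da

def heavy_peptide_mass(light_mass_da, n_arginines):
    """Predicted mass of the heavy-labeled peptide, given the light mass
    and the number of arginine residues it contains."""
    return light_mass_da + n_arginines * DELTA_PER_ARG

def abundance_ratio(heavy_intensity, light_intensity):
    """Heavy/light peak-intensity ratio, which approximates the abundance
    ratio of the protein between the two cell populations."""
    return heavy_intensity / light_intensity

# A single-arginine peptide appears ~6 Da heavier in the heavy channel:
print(round(heavy_peptide_mass(1000.000, 1), 3))
# Equal peak intensities -> no abundance change between conditions:
print(abundance_ratio(5.0e6, 5.0e6))  # 1.0
```

In practice the comparison is done per peptide pair across the whole spectrum, and protein-level ratios are aggregated from many such pairs.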
Stable isotope labeling by amino acids in cell culture SILAC has emerged as a very powerful method to study cell signaling, post-translational modifications such as phosphorylation, protein–protein interaction and regulation of gene expression. In addition, SILAC has become an important method in secretomics, the global study of secreted proteins and secretory pathways. It can be used to distinguish between proteins secreted by cells in culture and serum contaminants. Standardized protocols of SILAC for various applications have also been published. While SILAC has mostly been used to study eukaryotic cells and cell cultures, it has recently been employed in bacteria and their multicellular biofilms to study antibiotic tolerance, differentiating tolerant and sensitive subpopulations. Pulsed SILAC (pSILAC) is a variation of the SILAC method in which the labelled amino acids are added to the growth medium for only a short period of time. This allows monitoring of differences in "de novo" protein production rather than raw concentration. It has also been used to study biofilm tolerance to antibiotics, differentiating tolerant and sensitive subpopulations. Traditionally, the level of multiplexing in SILAC was limited by the number of SILAC isotopes available. Recently, a new technique called NeuCode (neutron encoding) SILAC has augmented the level of multiplexing achievable with metabolic labeling (up to 4). The NeuCode amino acid method is similar to SILAC but differs in that the labeling only utilizes heavy amino acids | https://en.wikipedia.org/wiki?curid=6006708 |
Stable isotope labeling by amino acids in cell culture The use of only heavy amino acids eliminates the need for the 100% incorporation of amino acids required for SILAC. The increased multiplexing capability of NeuCode amino acids comes from the use of mass defects from extra neutrons in the stable isotopes. These small mass differences, however, need to be resolved on high-resolution mass spectrometers. | https://en.wikipedia.org/wiki?curid=6006708 |
Telechelic polymer A telechelic polymer or oligomer is a prepolymer capable of entering into further polymerization or other reactions through its reactive end-groups. It can be used, for example, to synthesize block copolymers. By definition, a telechelic polymer is a di-end-functional polymer in which both ends possess the same functionality. Where the chain-ends of the polymer are not of the same functionality, they are termed di-end-functional polymers. All polymers resulting from living polymerization are end-functional but are not necessarily telechelic. To prepare polymers by step-growth polymerization, telechelic polymers like polymeric diols and epoxy prepolymers can be used. The main examples are: Other examples of telechelic polymers are the halato-telechelic polymers or halatopolymers. The end-groups of these polymers are ionic or ionizable, like carboxylate or quaternary ammonium groups. | https://en.wikipedia.org/wiki?curid=6012260 |
History of biotechnology Biotechnology is the application of scientific and engineering principles to the processing of materials by biological agents to provide goods and services. From its inception, biotechnology has maintained a close relationship with society. Although now most often associated with the development of drugs, historically biotechnology has been principally associated with food, addressing such issues as malnutrition and famine. The history of biotechnology begins with zymotechnology, which commenced with a focus on brewing techniques for beer. By World War I, however, zymotechnology would expand to tackle larger industrial issues, and the potential of industrial fermentation gave rise to biotechnology. However, both the single-cell protein and gasohol projects failed to progress due to varying issues including public resistance, a changing economic scene, and shifts in political power. Yet the formation of a new field, genetic engineering, would soon bring biotechnology to the forefront of science in society, and an intimate relationship between the scientific community, the public, and the government would ensue. Debates over the safety of the new field gained exposure in 1975 at the Asilomar Conference, where Joshua Lederberg was the most outspoken supporter of this emerging field in biotechnology. By as early as 1978, with the development of synthetic human insulin, Lederberg's claims would prove valid, and the biotechnology industry grew rapidly | https://en.wikipedia.org/wiki?curid=6012335 |
History of biotechnology Each new scientific advance became a media event designed to capture public support, and by the 1980s, biotechnology grew into a promising real industry. In 1988, only five proteins from genetically engineered cells had been approved as drugs by the United States Food and Drug Administration (FDA), but this number would skyrocket to over 125 by the end of the 1990s. The field of genetic engineering remains a heated topic of discussion in today's society with the advent of gene therapy, stem cell research, cloning, and genetically modified food. While it seems only natural nowadays to link pharmaceutical drugs as solutions to health and societal problems, this relationship of biotechnology serving social needs began centuries ago. Biotechnology arose from the field of zymotechnology or zymurgy, which began as a search for a better understanding of industrial fermentation, particularly beer. Beer was an important industrial, and not just social, commodity. In late 19th-century Germany, brewing contributed as much to the gross national product as steel, and taxes on alcohol proved to be significant sources of revenue to the government. In the 1860s, institutes and remunerative consultancies were dedicated to the technology of brewing. The most famous was the private Carlsberg Institute, founded in 1875, which employed Emil Christian Hansen, who pioneered the pure yeast process for the reliable production of consistent beer. Less well known were private consultancies that advised the brewing industry | https://en.wikipedia.org/wiki?curid=6012335 |
History of biotechnology One of these, the Zymotechnic Institute, was established in Chicago by the German-born chemist John Ewald Siebel. The heyday and expansion of zymotechnology came in World War I in response to industrial needs to support the war. Max Delbrück grew yeast on an immense scale during the war to meet 60 percent of Germany's animal feed needs. Compounds of another fermentation product, lactic acid, made up for a lack of hydraulic fluid, glycerol. On the Allied side the Russian chemist Chaim Weizmann used starch to eliminate Britain's shortage of acetone, a key raw material for cordite, by fermenting maize to acetone. The industrial potential of fermentation was outgrowing its traditional home in brewing, and "zymotechnology" soon gave way to "biotechnology." With food shortages spreading and resources fading, some dreamed of a new industrial solution. The Hungarian Károly Ereky coined the word "biotechnology" in Hungary during 1919 to describe a technology based on converting raw materials into a more useful product. He built a slaughterhouse for a thousand pigs and also a fattening farm with space for 50,000 pigs, raising over 100,000 pigs a year. The enterprise was enormous, becoming one of the largest and most profitable meat and fat operations in the world. In a book entitled "Biotechnologie", Ereky further developed a theme that would be reiterated through the 20th century: biotechnology could provide solutions to societal crises, such as food and energy shortages | https://en.wikipedia.org/wiki?curid=6012335 |
History of biotechnology For Ereky, the term "biotechnologie" indicated the process by which raw materials could be biologically upgraded into socially useful products. This catchword spread quickly after the First World War, as "biotechnology" entered German dictionaries and was taken up abroad by business-hungry private consultancies as far away as the United States. In Chicago, for example, the coming of prohibition at the end of World War I encouraged biological industries to create opportunities for new fermentation products, in particular a market for nonalcoholic drinks. Emil Siebel, the son of the founder of the Zymotechnic Institute, broke away from his father's company to establish his own called the "Bureau of Biotechnology," which specifically offered expertise in fermented nonalcoholic drinks. The belief that the needs of an industrial society could be met by fermenting agricultural waste was an important ingredient of the "chemurgic movement." Fermentation-based processes generated products of ever-growing utility. In the 1940s, penicillin was the most dramatic. While it was discovered in England, it was produced industrially in the U.S. using a deep fermentation process originally developed in Peoria, Illinois. The enormous profits and the public expectations penicillin engendered caused a radical shift in the standing of the pharmaceutical industry | https://en.wikipedia.org/wiki?curid=6012335 |
History of biotechnology Doctors used the phrase "miracle drug", and the historian of its wartime use, David Adams, has suggested that to the public penicillin represented the perfect health that went together with the car and the dream house of wartime American advertising. Beginning in the 1950s, fermentation technology also became advanced enough to produce steroids on industrially significant scales. Of particular importance was the improved semisynthesis of cortisone, which simplified the old 31-step synthesis to 11 steps. This advance was estimated to reduce the cost of the drug by 70%, making the medicine inexpensive and available. Today biotechnology still plays a central role in the production of these compounds and likely will for years to come. Even greater expectations of biotechnology were raised during the 1960s by a process that grew single-cell protein. When the so-called protein gap threatened world hunger, producing food locally by growing it from waste seemed to offer a solution. It was the possibilities of growing microorganisms on oil that captured the imagination of scientists, policy makers, and commerce. Major companies such as British Petroleum (BP) staked their futures on it. In 1962, BP built a pilot plant at Cap de Lavera in Southern France to publicize its product, Toprina. Initial research work at Lavera was done by Alfred Champagnat. In 1963, construction started on BP's second pilot plant at Grangemouth Oil Refinery in Britain | https://en.wikipedia.org/wiki?curid=6012335 |
History of biotechnology As there was no well-accepted term to describe the new foods, in 1966 the term "single-cell protein" (SCP) was coined at MIT to provide an acceptable and exciting new title, avoiding the unpleasant connotations of microbial or bacterial. The "food from oil" idea became quite popular by the 1970s, when facilities for growing yeast fed by n-paraffins were built in a number of countries. The Soviets were particularly enthusiastic, opening large "BVK" ("belkovo-vitaminny kontsentrat", i.e., "protein-vitamin concentrate") plants next to their oil refineries in Kstovo (1973) and Kirishi (1974). By the late 1970s, however, the cultural climate had completely changed, as the growth in SCP interest had taken place against a shifting economic and cultural scene. First, the price of oil rose catastrophically in 1974, so that its cost per barrel was five times greater than it had been two years earlier. Second, despite continuing hunger around the world, anticipated demand also began to shift from humans to animals. The program had begun with the vision of growing food for Third World people, yet the product was instead launched as an animal food for the developed world. The rapidly rising demand for animal feed made that market appear economically more attractive. The ultimate downfall of the SCP project, however, came from public resistance. This was particularly vocal in Japan, where production came closest to fruition | https://en.wikipedia.org/wiki?curid=6012335 |
History of biotechnology For all their enthusiasm for innovation and traditional interest in microbiologically produced foods, the Japanese were the first to ban the production of single-cell proteins. The Japanese ultimately were unable to separate the idea of their new "natural" foods from the far from natural connotation of oil. These arguments were made against a background of suspicion of heavy industry in which anxiety over minute traces of petroleum was expressed. Thus, public resistance to an unnatural product led to the end of the SCP project as an attempt to solve world hunger. Also, in 1989 in the USSR, public environmental concerns led the government to close down (or convert to different technologies) all eight paraffin-fed-yeast plants that the Soviet Ministry of Microbiological Industry had by that time. In the late 1970s, biotechnology offered another possible solution to a societal crisis. The escalation in the price of oil in 1974 increased the cost of the Western world's energy tenfold. In response, the U.S. government promoted the production of gasohol, gasoline with 10 percent alcohol added, as an answer to the energy crisis. In 1979, when the Soviet Union sent troops to Afghanistan, the Carter administration cut off grain supplies to the Soviet Union in retaliation, creating an agricultural surplus in the U.S. As a result, fermenting the agricultural surpluses to synthesize fuel seemed to be an economical solution to the shortage of oil threatened by the Iran–Iraq War | https://en.wikipedia.org/wiki?curid=6012335 |
History of biotechnology Before the new direction could be taken, however, the political wind changed again: the Reagan administration came to power in January 1981 and, with the declining oil prices of the 1980s, ended support for the gasohol industry before it was born. Biotechnology seemed to be the solution for major social problems, including world hunger and energy crises. In the 1960s, radical measures seemed necessary to address world starvation, and biotechnology appeared to provide an answer. However, the solutions proved to be too expensive and socially unacceptable, and solving world hunger through SCP food was dismissed. In the 1970s, the food crisis was succeeded by the energy crisis, and here too, biotechnology seemed to provide an answer. But once again, costs proved prohibitive as oil prices slumped in the 1980s. Thus, in practice, the implications of biotechnology were not fully realized in these situations. But this would soon change with the rise of genetic engineering. The origins of biotechnology culminated with the birth of genetic engineering. There were two key events that have come to be seen as scientific breakthroughs beginning the era that would unite genetics with biotechnology. One was the 1953 discovery of the structure of DNA, by Watson and Crick, and the other was the 1973 discovery by Cohen and Boyer of a recombinant DNA technique by which a section of DNA was cut from the plasmid of an E. coli bacterium and transferred into the DNA of another | https://en.wikipedia.org/wiki?curid=6012335 |
History of biotechnology This approach could, in principle, enable bacteria to adopt the genes and produce proteins of other organisms, including humans. Popularly referred to as "genetic engineering," it came to be defined as the basis of new biotechnology. Genetic engineering proved to be a topic that thrust biotechnology into the public scene, and the interaction between scientists, politicians, and the public defined the work that was accomplished in this area. Technical developments during this time were revolutionary and at times frightening. In December 1967, the first heart transplant, performed by Christiaan Barnard, reminded the public that the physical identity of a person was becoming increasingly problematic. While poetic imagination had always seen the heart at the center of the soul, now there was the prospect of individuals being defined by other people's hearts. During the same month, Arthur Kornberg announced that he had managed to biochemically replicate a viral gene. "Life had been synthesized," said the head of the National Institutes of Health. Genetic engineering was now on the scientific agenda, as it was becoming possible to identify genetic characteristics with diseases such as beta thalassemia and sickle-cell anemia. Responses to scientific achievements were colored by cultural skepticism. Scientists and their expertise were looked upon with suspicion. In 1968, an immensely popular work, "The Biological Time Bomb", was written by the British journalist Gordon Rattray Taylor | https://en.wikipedia.org/wiki?curid=6012335 |
History of biotechnology The author's preface saw Kornberg's discovery of replicating a viral gene as a route to lethal doomsday bugs. The publisher's blurb for the book warned that within ten years, "You may marry a semi-artificial man or woman…choose your children's sex…tune out pain…change your memories…and live to be 150 if the scientific revolution doesn’t destroy us first." The book ended with a chapter called "The Future – If Any." While it is rare for current science to be represented in the movies, in this period of "Star Trek", science fiction and science fact seemed to be converging. "Cloning" became a popular word in the media. Woody Allen satirized the cloning of a person from a nose in his 1973 movie "Sleeper", and cloning Adolf Hitler from surviving cells was the theme of the 1976 novel by Ira Levin, "The Boys from Brazil". In response to these public concerns, scientists, industry, and governments increasingly linked the power of recombinant DNA to the immensely practical functions that biotechnology promised. One of the key scientific figures that attempted to highlight the promising aspects of genetic engineering was Joshua Lederberg, a Stanford professor and Nobel laureate. While in the 1960s "genetic engineering" described eugenics and work involving the manipulation of the human genome, Lederberg stressed research that would involve microbes instead. Lederberg emphasized the importance of focusing on curing living people | https://en.wikipedia.org/wiki?curid=6012335 |
History of biotechnology Lederberg's 1963 paper, "Biological Future of Man" suggested that, while molecular biology might one day make it possible to change the human genotype, "what we have overlooked is euphenics, the engineering of human development." Lederberg constructed the word "euphenics" to emphasize changing the phenotype after conception rather than the genotype which would affect future generations. With the discovery of recombinant DNA by Cohen and Boyer in 1973, the idea that genetic engineering would have major human and societal consequences was born. In July 1974, a group of eminent molecular biologists headed by Paul Berg wrote to "Science" suggesting that the consequences of this work were so potentially destructive that there should be a pause until its implications had been thought through. This suggestion was explored at a meeting in February 1975 at California's Monterey Peninsula, forever immortalized by the location, Asilomar. Its historic outcome was an unprecedented call for a halt in research until it could be regulated in such a way that the public need not be anxious, and it led to a 16-month moratorium until National Institutes of Health (NIH) guidelines were established. Joshua Lederberg was the leading exception in emphasizing, as he had for years, the potential benefits. At Asilomar, in an atmosphere favoring control and regulation, he circulated a paper countering the pessimism and fears of misuses with the benefits conferred by successful use | https://en.wikipedia.org/wiki?curid=6012335 |
History of biotechnology He described "an early chance for a technology of untold importance for diagnostic and therapeutic medicine: the ready production of an unlimited variety of human proteins. Analogous applications may be foreseen in fermentation process for cheaply manufacturing essential nutrients, and in the improvement of microbes for the production of antibiotics and of special industrial chemicals." In June 1976, the 16-month moratorium on research expired with the Director's Advisory Committee (DAC) publication of the NIH guidelines of good practice. They defined the risks of certain kinds of experiments and the appropriate physical conditions for their pursuit, as well as a list of things too dangerous to perform at all. Moreover, modified organisms were not to be tested outside the confines of a laboratory or allowed into the environment. Atypical as Lederberg was at Asilomar, his optimistic vision of genetic engineering would soon lead to the development of the biotechnology industry. Over the next two years, as public concern over the dangers of recombinant DNA research grew, so too did interest in its technical and practical applications. Curing genetic diseases remained in the realms of science fiction, but it appeared that producing simple human proteins could be good business. Insulin, one of the smaller, best characterized and understood proteins, had been used in treating type 1 diabetes for a half century. It had been extracted from animals in a chemically slightly different form from the human product | https://en.wikipedia.org/wiki?curid=6012335 |
History of biotechnology Yet, if one could produce synthetic human insulin, one could meet an existing demand with a product whose approval would be relatively easy to obtain from regulators. In the period 1975 to 1977, synthetic "human" insulin represented the aspirations for new products that could be made with the new biotechnology. Microbial production of synthetic human insulin was finally announced in September 1978 and was produced by a startup company, Genentech. That company did not commercialize the product itself; instead, it licensed the production method to Eli Lilly and Company. 1978 also saw the first application for a patent on a gene, the gene which produces human growth hormone, by the University of California, thus introducing the legal principle that genes could be patented. Since that filing, almost 20% of the more than 20,000 genes in human DNA have been patented. The radical shift in the connotation of "genetic engineering" from an emphasis on the inherited characteristics of people to the commercial production of proteins and therapeutic drugs was nurtured by Joshua Lederberg. His broad concerns since the 1960s had been stimulated by enthusiasm for science and its potential medical benefits. Countering calls for strict regulation, he expressed a vision of potential utility. Against a belief that new techniques would entail unmentionable and uncontrollable consequences for humanity and the environment, a growing consensus on the economic value of recombinant DNA emerged | https://en.wikipedia.org/wiki?curid=6012335 |
History of biotechnology The MOSFET (metal-oxide-semiconductor field-effect transistor, or MOS transistor) was invented by Mohamed M. Atalla and Dawon Kahng in 1959, and demonstrated in 1960. Two years later, L.C. Clark and C. Lyons invented the biosensor in 1962. Biosensor MOSFETs (BioFETs) were later developed, and they have since been widely used to measure physical, chemical, biological and environmental parameters. The first BioFET was the ion-sensitive field-effect transistor (ISFET), invented by Piet Bergveld for electrochemical and biological applications in 1970. The adsorption FET (ADFET) was patented by P.F. Cox in 1974, and a hydrogen-sensitive MOSFET was demonstrated by I. Lundstrom, M.S. Shivaraman, C.S. Svenson and L. Lundkvist in 1975. The ISFET is a special type of MOSFET with a gate at a certain distance, and where the metal gate is replaced by an ion-sensitive membrane, electrolyte solution and reference electrode. The ISFET is widely used in biomedical applications, such as the detection of DNA hybridization, biomarker detection from blood, antibody detection, glucose measurement, pH sensing, and genetic technology. By the mid-1980s, other BioFETs had been developed, including the gas sensor FET (GASFET), pressure sensor FET (PRESSFET), chemical field-effect transistor (ChemFET), reference ISFET (REFET), enzyme-modified FET (ENFET) and immunologically modified FET (IMFET) | https://en.wikipedia.org/wiki?curid=6012335 |
History of biotechnology By the early 2000s, BioFETs such as the DNA field-effect transistor (DNAFET), gene-modified FET (GenFET) and cell-potential BioFET (CPFET) had been developed. With ancestral roots in industrial microbiology that date back centuries, the new biotechnology industry grew rapidly beginning in the mid-1970s. Each new scientific advance became a media event designed to capture investment confidence and public support. Although market expectations and social benefits of new products were frequently overstated, many people were prepared to see genetic engineering as the next great advance in technological progress. By the 1980s, biotechnology had come to characterize a nascent industry, providing titles for emerging trade organizations such as the Biotechnology Industry Organization (BIO). After insulin, the main focus of attention was the potential profit-makers in the pharmaceutical industry: human growth hormone and what promised to be a miraculous cure for viral diseases, interferon. Cancer was a central target in the 1970s because increasingly the disease was linked to viruses. By 1980, a new company, Biogen, had produced interferon through recombinant DNA. The emergence of interferon and the possibility of curing cancer raised money in the community for research and increased the enthusiasm of an otherwise uncertain and tentative society | https://en.wikipedia.org/wiki?curid=6012335 |
History of biotechnology Moreover, to the 1970s plight of cancer was added AIDS in the 1980s, offering an enormous potential market for a successful therapy, and more immediately, a market for diagnostic tests based on monoclonal antibodies. By 1988, only five proteins from genetically engineered cells had been approved as drugs by the United States Food and Drug Administration (FDA): synthetic insulin, human growth hormone, hepatitis B vaccine, alpha-interferon, and tissue plasminogen activator (tPA), for lysis of blood clots. By the end of the 1990s, however, 125 more genetically engineered drugs would be approved. The 2007–2008 global financial crisis led to several changes in the way the biotechnology industry was financed and organized. First, it led to a decline in overall financial investment in the sector, globally; and second, in some countries like the UK it led to a shift from business strategies focused on going for an initial public offering (IPO) to seeking a trade sale instead. By 2011, financial investment in the biotechnology industry started to improve again and by 2014 the global market capitalization reached $1 trillion. Genetic engineering also reached the agricultural front as well. There was tremendous progress since the market introduction of the genetically engineered Flavr Savr tomato in 1994. Ernst and Young reported that in 1998, 30% of the U.S. soybean crop was expected to be from genetically engineered seeds | https://en.wikipedia.org/wiki?curid=6012335 |
History of biotechnology In 1998, about 30% of the US cotton and corn crops were also expected to be products of genetic engineering. Genetic engineering in biotechnology stimulated hopes for both therapeutic proteins, drugs and biological organisms themselves, such as seeds, pesticides, engineered yeasts, and modified human cells for treating genetic diseases. From the perspective of its commercial promoters, scientific breakthroughs, industrial commitment, and official support were finally coming together, and biotechnology became a normal part of business. No longer were the proponents for the economic and technological significance of biotechnology the iconoclasts. Their message had finally become accepted and incorporated into the policies of governments and industry. According to Burrill and Company, an industry investment bank, over $350 billion has been invested in biotech since the emergence of the industry, and global revenues rose from $23 billion in 2000 to more than $50 billion in 2005. The greatest growth has been in Latin America but all regions of the world have shown strong growth trends. By 2007 and into 2008, though, a downturn in the fortunes of biotech emerged, at least in the United Kingdom, as the result of declining investment in the face of failure of biotech pipelines to deliver and a consequent downturn in return on investment. | https://en.wikipedia.org/wiki?curid=6012335 |
Pressure drop is defined as the difference in total pressure between two points of a fluid-carrying network. A pressure drop occurs when frictional forces, caused by the resistance to flow, act on a fluid as it flows through the tube. The main determinants of resistance to fluid flow are fluid velocity through the pipe and fluid viscosity. Pressure drop increases proportionally to the frictional shear forces within the piping network. A piping network containing a high relative roughness rating, as well as many pipe fittings and joints, tube convergence, divergence, turns, surface roughness, and other physical properties, will affect the pressure drop. High flow velocities and/or high fluid viscosities result in a larger pressure drop across a section of pipe or a valve or elbow. Low velocity will result in lower or no pressure drop. The fluid may also be biphasic, as in pneumatic conveying with a gas and a solid; in this case, the friction of the solid must also be taken into consideration for calculating the pressure drop. | https://en.wikipedia.org/wiki?curid=6014225 |
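The dependence on velocity, viscosity, and roughness described above is commonly captured by the Darcy–Weisbach equation, ΔP = f · (L/D) · (ρv²/2). The sketch below uses the laminar friction factor f = 64/Re and, for turbulent flow, the Swamee–Jain explicit approximation to the Colebrook equation. It is a simplified single-phase model with an invented function name, not a substitute for a full network calculation with fittings, bends, and biphasic effects.

```python
import math

def pressure_drop(length_m, diameter_m, velocity_ms,
                  density_kgm3, viscosity_pas, roughness_m=0.0):
    """Darcy-Weisbach pressure drop (Pa) for fully developed,
    single-phase flow in a straight circular pipe."""
    re = density_kgm3 * velocity_ms * diameter_m / viscosity_pas  # Reynolds number
    if re < 2300.0:
        f = 64.0 / re  # laminar friction factor
    else:
        # Swamee-Jain explicit approximation to the Colebrook equation
        f = 0.25 / math.log10(roughness_m / (3.7 * diameter_m)
                              + 5.74 / re ** 0.9) ** 2
    return f * (length_m / diameter_m) * density_kgm3 * velocity_ms ** 2 / 2.0

# Viscous oil creeping through a narrow tube (Re = 10, laminar):
print(pressure_drop(1.0, 0.01, 0.1, 1000.0, 0.1))  # ~3200 Pa
```

Doubling the velocity in the laminar branch doubles ΔP (f falls as 1/v while v² grows), while in the fully turbulent branch ΔP grows roughly with v², matching the qualitative statements in the text.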
Hannay angle In classical mechanics, the Hannay angle is a mechanics analogue of the whirling geometric phase (or Berry phase). It was named after John Hannay of the University of Bristol, UK. Hannay first described the angle in 1985, extending the ideas of the recently formalized Berry phase to classical mechanics. The Foucault pendulum is an example from classical mechanics that is sometimes used to illustrate the Berry phase. | https://en.wikipedia.org/wiki?curid=6020635 |
Keyhole problem The keyhole problem, in the context of astronomy, refers to the difficulty that azimuth-elevation type telescopes or antenna gimbal systems encounter in crossing the zenith. To track celestial objects as they move across the sky, these systems usually rotate on two axes. Often, a tilting mechanism (elevation) is mounted upon a panning base (azimuth). To cover the complete hemisphere of visible sky, a telescope gimbal can have a 360-degree azimuth range and a 0- to 90-degree elevation range. To visualize this shape, imagine drawing a quarter circle spanning from the horizon to directly above you and revolving it around the vertical axis. If, on the other hand, the gimbal has a range from 0 to slightly less than 90 degrees elevation, the telescope is unable to see a conical region of sky directly overhead (the "keyhole"). A variation on the keyhole problem involves defining behavior for gimbals with full-circle azimuth range, and at least 90-degree but less than 180-degree elevation range. Imagine a satellite on an orbital path that crosses directly overhead. If the gimbal tilts to track the object from the horizon but must stop at 90 degrees, the entire telescope must pan 180 degrees to follow the object from zenith down to the opposite horizon. This is an often-encountered difficulty in creating smooth automated tracking algorithms. | https://en.wikipedia.org/wiki?curid=6036832 |
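The difficulty described above can be illustrated numerically: for a target crossing the sky on a great circle that misses the zenith by a small angle, the peak azimuth slew rate grows roughly as one over the sine of that miss angle, blowing up for near-overhead passes. The following sketch (the function names and toy geometry are mine, not from the article) samples such a pass and reports the peak azimuth rate the mount would need.

```python
import math

def az_el(x, y, z):
    """Azimuth (degrees from north, clockwise) and elevation of a unit
    vector given in local east-north-up coordinates."""
    az = math.degrees(math.atan2(x, y)) % 360.0
    el = math.degrees(math.asin(z))
    return az, el

def peak_az_rate(miss_angle_deg, step_s=0.1, track_rate_deg_s=1.0):
    """Peak azimuth slew rate (deg/s) for a target moving at a constant
    angular rate along a great circle that misses the zenith by
    miss_angle_deg, sampled from horizon to horizon."""
    m = math.radians(miss_angle_deg)
    peak, prev_az = 0.0, None
    steps = int(180.0 / (track_rate_deg_s * step_s))
    for i in range(steps + 1):
        t = math.radians(-90.0 + i * track_rate_deg_s * step_s)
        # unit vector on a great circle whose closest approach to the
        # zenith (at t = 0) equals the miss angle, offset toward north
        x = math.sin(t)
        y = math.sin(m) * math.cos(t)
        z = math.cos(m) * math.cos(t)
        az, _ = az_el(x, y, z)
        if prev_az is not None:
            d = abs(az - prev_az)
            peak = max(peak, min(d, 360.0 - d) / step_s)
        prev_az = az
    return peak

# The closer the pass comes to the zenith, the faster the mount must pan.
print(peak_az_rate(10.0), peak_az_rate(1.0))
```

A pass 1 degree from the zenith demands roughly ten times the peak azimuth rate of a pass 10 degrees away, which is why smooth tracking algorithms must special-case the keyhole.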
Barouh Berkovits (May 7, 1926 – October 23, 2012) was one of the pioneers of bio-engineering, particularly of the cardiac defibrillator and artificial cardiac pacemaker. Berkovits invented the "demand pacemaker" and the DC defibrillator. Berkovits was born in Czechoslovakia. His parents and sister died at Auschwitz. He immigrated to the United States in the 1950s, and worked for the pacemaker company Medtronic from 1975 until his retirement. In 1982, Berkovits received the "Distinguished Scientist Award" from the Heart Rhythm Society. He graduated from the New York University Tandon School of Engineering in 1956. He was also a faculty member at NYU Tandon. | https://en.wikipedia.org/wiki?curid=6038692 |
Occulting disk An occulting disk is a small disk used in a telescope to block the view of a bright object in order to allow observation of a fainter one. The coronagraph, at its simplest, is an occulting disk in the focal plane of a telescope, or in front of the entrance aperture, that blocks out the image of the solar disk so that the corona can be seen. The Starshade is an occulting disk designed to fly in formation with a space telescope to image exoplanets. | https://en.wikipedia.org/wiki?curid=6044113 |
Jan Smit (physicist) Jan Smit (born 16 September 1943, Amsterdam) is a Dutch theoretical physicist. During his PhD at UCLA under professor Robert Finkelstein, he made some early contributions to the lattice formulation of quantum field theory around 1972, a year before Kenneth Wilson and two years before Alexander Polyakov. However, he encountered problems with fermion doubling that he could not solve at the time. He did not then realize the value of his work, and he mentioned it only briefly in his 1974 Ph.D. thesis, which was about Schwinger source theory. A few years later he returned to working on the lattice formulation and became a well-known expert in the field. He works at the University of Amsterdam at the Institute of Theoretical Physics. | https://en.wikipedia.org/wiki?curid=6044199 |
Breithauptite is a nickel antimonide mineral with the simple formula NiSb. Breithauptite is a metallic, opaque, copper-red mineral crystallizing in the hexagonal crystal system (dihexagonal dipyramidal class). It is typically massive to reniform in habit, but is also observed as tabular crystals. It has a Mohs hardness of 3.5 to 4 and a specific gravity of 8.23. It occurs in hydrothermal calcite veins associated with cobalt–nickel–silver ores. It was first described in 1840 from the Harz Mountains, Lower Saxony, Germany, and in 1845 for occurrences in the Cobalt and Thunder Bay districts of Ontario, Canada. It was named to honor Saxon mineralogist Johann Friedrich August Breithaupt (1791–1873). | https://en.wikipedia.org/wiki?curid=6044501 |
Leon Knopoff (July 1, 1925 – January 20, 2011) was a geophysicist and musicologist. He received his education at Caltech, graduating in 1949 with a PhD in physics, and came to UCLA the following year. He served on the UCLA faculty for 60 years. His research interests spanned a wide variety of fields and included the physics and statistics of earthquakes, earthquake prediction, the interior structure of the Earth, plate tectonics, pattern recognition, non-linear earthquake dynamics and several other areas of solid Earth geophysics. He also made contributions to the fields of musical perception and archaeology. | https://en.wikipedia.org/wiki?curid=6053337 |
Crude oil assay A crude oil assay is the chemical evaluation of crude oil feedstocks by petroleum testing laboratories. Each crude oil type has unique molecular and chemical characteristics. No two crude oil types are identical and there are crucial differences in crude oil quality. The results of crude oil assay testing provide extensive detailed hydrocarbon analysis data for refiners, oil traders and producers. Assay data help refineries determine if a crude oil feedstock is compatible with a particular petroleum refinery or if the crude oil could cause yield, quality, production, environmental and other problems. The assay can be an inspection assay or a comprehensive assay. Testing can include crude oil characterization of whole crude oils and the various boiling-range fractions produced from physical or simulated distillation by various procedures. Information obtained from the petroleum assay is used for detailed refinery engineering and client marketing purposes. Feedstock assay data are an important tool in the refining process. | https://en.wikipedia.org/wiki?curid=6060943 |
Nanofoam Nanofoams are a class of nanostructured, porous materials (foams) containing a significant population of pores with diameters less than 100 nm. Aerogels are one example of nanofoam. In 2006, researchers produced metal nanofoams by igniting pellets of energetic metal bis(tetrazolato)amine complexes. Nanofoams of iron, cobalt, nickel, copper, silver, and palladium have been prepared through this technique. These materials exhibit densities as low as 11 mg/cm³ and surface areas as high as 258 m²/g, and are effective catalysts. Metal nanofoams can also be made by electrodeposition of metals inside templates with interconnected pores, such as 3D-porous anodic aluminum oxide (AAO). This method gives nanofoams with an organized structure and allows control of the surface area and porosity of the fabricated material. Carbon nanofoam is an allotrope of carbon discovered in 1997. It consists of a cluster-assembly of carbon atoms strung together in a loose three-dimensional web, with a density of 2–10 mg/cm³. In 2014, researchers also fabricated glass nanofoam via femtosecond laser ablation. Their work consisted of raster-scanning femtosecond laser pulses over the surface of glass to produce glass nanofoam with wires of roughly 70 nm diameter. | https://en.wikipedia.org/wiki?curid=6064049 |
David Hodgson (chemist) Professor David Michael Hodgson is the Todd Fellow and Tutor in Chemistry at Oriel College, Oxford. Hodgson achieved his Bachelor of Science at the University of Bath and gained his Doctor of Philosophy from the University of Southampton. His research interests are in synthesis, broadly encompassing studies directed towards the design and development of new methods, reagents and strategies for the synthesis of biologically active molecules. | https://en.wikipedia.org/wiki?curid=6067807 |
Gilbert (Martian crater) Gilbert is a large Martian crater located in the south of the planet in the Mare Australe quadrangle. It is one of several large craters in the heavily pock-marked upland region. It was named in 1973 in honour of American geologist Grove K. Gilbert. | https://en.wikipedia.org/wiki?curid=6069148 |
DeepWorker 2000 is a submersible vehicle developed by Nuytco Research, Ltd. It is capable of descending to a depth of 610 m (2,001 ft) and remaining submerged for 12 hours. In 1999, it was deployed to the continental shelf and upper continental slope on a five-year mission in association with the National Geographic Society's Sustainable Seas Expeditions. Also in 1999, the submersible was used to quantify the species of fish as well as the space resources utilized within the Stellwagen Bank National Marine Sanctuary. In 2000, it was used to evaluate the coral reef system located in the Florida Middle Grounds. | https://en.wikipedia.org/wiki?curid=6071883 |
Joseph Altman (1925 – 2016) was an American biologist who worked in the field of neurobiology. Born in Hungary to a Jewish family, he survived the Holocaust and migrated with his family via Germany and Australia to the United States. In these places, he sought employment as a librarian and used the opportunity to inform himself by reading books about psychology, human behavior, psychoanalysis, and human brain structure. In New York, where he married his first wife Elizabeth Altman, he became a graduate student in psychology in the laboratory of Hans-Lukas Teuber, earning a PhD in 1959 from New York University. That degree launched his scientific career, first as a postdoctoral fellow at Columbia University, next at the Massachusetts Institute of Technology, and finally at Purdue University. During his career, he collaborated closely with his second wife, Shirley A. Bayer. From the early 1960s to 2016, he published many articles in peer-reviewed journals, books, monographs, and free online books that emphasized developmental processes in brain anatomy and function. Altman discovered adult neurogenesis, the creation of new neurons in the adult brain, in the 1960s. While he was an independent investigator at MIT, his results were largely ignored, by his account, in favor of Pasko Rakic's findings that neurogenesis is limited to pre-natal development. By the late 1990s, a paradigm shift had occurred | https://en.wikipedia.org/wiki?curid=6076253 |
Joseph Altman The fact that the brain can create new neurons even into adulthood was rediscovered by Elizabeth Gould in 1999, leading adult neurogenesis to become one of the most active fields in neuroscience. Adult neurogenesis has since been shown to occur in the dentate gyrus, olfactory bulb and striatum through the measurement of carbon-14—the levels of which changed during nuclear bomb testing throughout the 20th century—in postmortem human brains. | https://en.wikipedia.org/wiki?curid=6076253 |
Jon Claerbout Jon F. Claerbout (born 1938) is an American geophysicist and seismologist. He is the Cecil Green Professor Emeritus of Geophysics at Stanford University. Since the latter half of the 20th century, he has been a leading researcher who pioneered the use of computers in processing and filtering seismic exploration data, eventually developing the fields of time-series analysis and seismic interferometry and modelling the propagation of seismic waves. Claerbout obtained a BS in physics in 1960, an MS in geophysics in 1963 and a PhD in geophysics in 1967, all from the Massachusetts Institute of Technology. His BS thesis was titled "A rubidium vapor magnetometer". He worked with Stephen M. Simpson, Jr. for his MS thesis, titled "Digital filters and applications to seismic detection and discrimination". The publication of this work made many geophysicists, including those in the oil and gas industry, well aware of Claerbout's potential. However, Claerbout found the sparse availability and low quality of earthquake seismic data frustrating and decided to study atmospheric gravity waves during his PhD. His advisor was Theodore R. Madden and the title of his thesis was "Electromagnetic Effects of Atmospheric Gravity Waves". Claerbout is the founder of the Stanford Exploration Project (SEP), the first geophysical research consortium funded by the oil and gas industry. Claerbout has been a doctoral advisor to many influential geophysicists who joined SEP, such as Oz Yilmaz and Biondo Biondi | https://en.wikipedia.org/wiki?curid=6079718 |
Jon Claerbout The term and concept of exploding reflectors in reflection seismology is often attributed to Jon Claerbout. However, Claerbout claims that the term was coined by John Sherwood, a geophysicist from Chevron who introduced him to exploration geophysics; Sherwood has said that he only used the term to refer to Claerbout's innovative method of seismic migration. Claerbout was one of the first scientists to emphasize that computational methods threaten the reproducibility of research unless open access is provided to both the data and the software underlying a publication. Claerbout's books have been among the most read and cited in geophysical research, especially "Fundamentals of Geophysical Data Processing" and "Imaging the Earth's Interior", which have been translated into Chinese and Russian, among other languages. He has since made all his books available for free download from his website. He is the youngest ever recipient of the Maurice Ewing Medal of the Society of Exploration Geophysicists, having received this award in 1992 for lifetime achievements when he was in his early fifties. Asteroid 156990 Claerbout, discovered by Joseph A. Dellinger at George Observatory in 2003, was named in his honor; the official naming citation was published by the Minor Planet Center on 17 May 2011. | https://en.wikipedia.org/wiki?curid=6079718 |
Thomas R. Holtz Jr. Thomas Richard Holtz Jr., Ph.D. (born September 13, 1965) is an American vertebrate palaeontologist, author, and principal lecturer at the University of Maryland's Department of Geology. He has published extensively on the phylogeny, morphology, ecomorphology, and locomotion of terrestrial predators, especially tyrannosaurids and other theropod dinosaurs. He wrote the book "" and is the author or co-author of the chapters "Saurischia", "Basal Tetanurae", and "Tyrannosauroidea" in the second edition of "The Dinosauria". He has also been consulted as a scientific advisor for the BBC series "Walking with Dinosaurs" as well as the Discovery special "When Dinosaurs Roamed America", and has appeared in numerous documentaries focused on prehistoric life, such as "Jurassic Fight Club" on History and "Monsters Resurrected" on Discovery. Holtz is also the director of the Science and Global Change Program within the College Park Scholars living-learning community at the University of Maryland, College Park. Holtz has proposed several new hypotheses about dinosaur classification; for example, he coined the terms Maniraptoriformes and Arctometatarsus. He also proposed two classification systems for theropods. The first is the clade Arctometatarsalia, made up of tyrannosauroids, ornithomimosaurs, and troodontids, because all of these coelurosaurs had pinched middle metatarsal bones in their feet | https://en.wikipedia.org/wiki?curid=6091321 |
Thomas R. Holtz Jr. In this proposed classification system, the tyrannosauroids were supposedly basal to a clade known as Bullatosauria, made up of the Troodontidae and the Ornithomimosauria. These two groups were purported to form a clade because they shared a common characteristic: a skull capsule. However, both of these classification systems were later found to be paraphyletic, or "artificial", clades. For example, troodontids are now known to be deinonychosaurs, or "raptors", closely related to dromaeosaurids and birds. The discovery of basal tyrannosauroids, such as "Guanlong", which lacked an arctometatarsus, also helped to disprove this theory. Eventually, the "skull capsule" in troodontids and ornithomimosaurs was found to be an example of convergent evolution, causing the clade Bullatosauria to be abandoned. Holtz was also a key figure in the discovery that tyrannosauroids were not carnosaurs, as had previously been believed by most palaeontologists, but rather large coelurosaurs. By being one of the first scientists to theorise this, Holtz contributed greatly to the debunking of a monophyletic Carnosauria. | https://en.wikipedia.org/wiki?curid=6091321 |
SEG-Y The SEG-Y (sometimes SEG Y) file format is one of several standards developed by the Society of Exploration Geophysicists (SEG) for storing geophysical data. It is an open standard, controlled by the SEG Technical Standards Committee, a non-profit organization. The format was originally developed in 1973 to store single-line seismic reflection digital data on magnetic tapes, and the specification was published in 1975. The format and its name evolved from the SEG "Ex" or Exchange Tape Format. Since its release, there have been significant advancements in geophysical data acquisition, such as 3-dimensional seismic techniques and high-speed, high-capacity recording. The most recent revision of the format, named the "rev 2.0" specification, was published in 2017. It still retains certain legacies of the original format (referred to as "rev 0"), such as an optional tape label, the main 3200-byte textual EBCDIC-encoded tape header, and a 400-byte binary header. Many programs do not fully follow the specification; for example, ODEC uses the opposite byte order and adds 320 bytes to the tail of each trace. A related format called "PASSCAL SEG-Y" is sometimes used to store continuous seismic records longer than a typical shot gather and should not be confused with the shot-gather-focused SEG-Y format documented above | https://en.wikipedia.org/wiki?curid=6092968 |
SEG-Y PASSCAL files contain a single trace, and therefore consist of solely a 240-byte trace header and a variable-length stream of binary integers (which can be much longer than the 32767-sample limit in shot-gather files). | https://en.wikipedia.org/wiki?curid=6092968 |
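A minimal reader for the headers described above can be sketched in a few lines of Python. The byte layout (3200-byte EBCDIC textual header, then a 400-byte big-endian binary header with the sample interval at bytes 3217–3218 and the samples-per-trace count at 3221–3222, 1-based) follows the published standard, but the helper name is an invention for illustration, and real files with tape labels, extended textual headers, or rev 2.0 features need more care.

```python
import struct

def read_segy_headers(path):
    """Read the 3200-byte EBCDIC textual header and two fields from the
    400-byte binary header of a SEG-Y rev 0/1 file (no tape label assumed).

    Within the binary header, offsets 16-17 hold the sample interval in
    microseconds and offsets 20-21 the samples per trace, both big-endian
    16-bit unsigned integers per the SEG standard.
    """
    with open(path, "rb") as f:
        textual = f.read(3200).decode("cp500")  # cp500 = EBCDIC code page
        binary = f.read(400)
    sample_interval_us, = struct.unpack(">H", binary[16:18])
    samples_per_trace, = struct.unpack(">H", binary[20:22])
    return textual, sample_interval_us, samples_per_trace
```

The `cp500` codec is one of Python's EBCDIC code pages; some writers store ASCII in the textual header instead, which a robust reader should detect before decoding.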
Peter Dayan is director at the Max Planck Institute for Biological Cybernetics in Tübingen, Germany. He is co-author of "Theoretical Neuroscience", an influential textbook on computational neuroscience. He is known for applying Bayesian methods from machine learning and artificial intelligence to understand neural function, and is particularly recognized for relating neurotransmitter levels to prediction errors and Bayesian uncertainties. He has pioneered the field of reinforcement learning (RL), where he helped develop the Q-learning algorithm, and has made contributions to unsupervised learning, including the wake-sleep algorithm for neural networks and the Helmholtz machine. Dayan studied mathematics at the University of Cambridge and then continued to a PhD in artificial intelligence at the University of Edinburgh School of Informatics, supervised by David Wallace, focusing on associative memory and reinforcement learning. After his PhD, Dayan held postdoctoral research positions with Terry Sejnowski at the Salk Institute and Geoffrey Hinton at the University of Toronto. He then took up an assistant professor position at the Massachusetts Institute of Technology (MIT), and moved to the Gatsby Charitable Foundation computational neuroscience unit at University College London (UCL) in 1998, becoming professor and director in 2002. In September 2018, the Max Planck Society announced his appointment as a director at the Max Planck Institute for Biological Cybernetics | https://en.wikipedia.org/wiki?curid=6097857 |
Peter Dayan Dayan was elected a Fellow of the Royal Society (FRS) in 2018. He was awarded the Rumelhart Prize in 2012 and The Brain Prize in 2017. | https://en.wikipedia.org/wiki?curid=6097857 |
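The Q-learning algorithm mentioned in the entry above can be illustrated with a tabular sketch on a toy problem. The chain environment, hyperparameters, and function name here are illustrative choices of mine, not anything drawn from Dayan's work; only the update rule Q(s,a) += α(r + γ·max Q(s',·) − Q(s,a)) is the standard Q-learning step.

```python
import random

def q_learning_chain(n_states=5, episodes=2000, alpha=0.1, gamma=0.9,
                     epsilon=0.1, seed=0):
    """Tabular Q-learning on a toy chain: states 0..n-1, actions 0 (left)
    and 1 (right); reward 1 for reaching the terminal rightmost state."""
    rng = random.Random(seed)
    q = [[0.0, 0.0] for _ in range(n_states)]   # q[state][action]
    for _ in range(episodes):
        s = 0
        for _ in range(1000):                   # step cap per episode
            if rng.random() < epsilon or q[s][0] == q[s][1]:
                a = rng.randrange(2)            # explore / break ties
            else:
                a = 0 if q[s][0] > q[s][1] else 1
            s2 = max(0, s - 1) if a == 0 else s + 1
            done = s2 == n_states - 1
            r = 1.0 if done else 0.0
            target = r if done else r + gamma * max(q[s2])
            q[s][a] += alpha * (target - q[s][a])   # Q-learning update
            s = s2
            if done:
                break
    return q

q = q_learning_chain()
# After training, the greedy action in every non-terminal state should be
# "right", and q[s][1] should approach gamma**(3 - s) for s = 0..3.
print([1 if q[s][1] > q[s][0] else 0 for s in range(4)])
```

Because Q-learning is off-policy, the value estimates converge toward the optimal action values even while the agent behaves epsilon-greedily.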
Italian Amateur Astronomers Union The Unione Astrofili Italiani (UAI) is an Italian organization active in astronomy research and outreach that was founded in 1967. Its members are both professional and amateur astronomers. The UAI claims more than two thousand members from all over Italy and is one of the most important amateur astronomical associations in Europe. The main-belt asteroid 234026 Unioneastrofili, discovered by Luciano Tesi in 1998, was named in honor of the organization. | https://en.wikipedia.org/wiki?curid=6100068 |
Zhao Xijin (赵喜进; c. 1935 – July 21, 2012) was a Chinese paleontologist notable for having named numerous dinosaurs. He was a professor at Beijing's Institute of Vertebrate Paleontology and Paleoanthropology. Paul Sereno and Zhao went on a dinosaur fossil hunt in 2005 to Tibet to look for a site that Zhao had found 27 years earlier. Before this hunt, in 2001, they had been engaged in a dig in the Gobi Desert; this involved a rock quarry that led them to finding 25 skeletons of the species "Sinornithomimus dongi". In 2008, Zhao was in charge of a dig in Zhucheng that involved excavating a "980 ft-long pit". The site has unearthed more than 7,600 fossils through Zhao's work and is believed to be the largest such site in the world. The majority of the fossils found appeared to be from the Late Cretaceous period. He died in 2012 at the age of 77. Besides the above, Zhao also named the family Mamenchisauridae (with Young Chung Chien, 1972). | https://en.wikipedia.org/wiki?curid=6102836 |
Gothenburg International Bioscience Business School Gothenburg International Bioscience Business School (GIBBS) is an educational platform focused on business creation within the bio- and life sciences in Gothenburg, Sweden. Students at the school study intellectual property, management, economics and business development. The skills and tools to drive innovation and growth are taught through increasingly acknowledged pedagogies, 'experiential knowledge' and 'team-based learning', allowing students to learn by doing. The programme is an international, action-based, multi-disciplinary education whose platform calls for special competencies and resources. Accordingly, students engage in real-life commercialization projects supported by instructors who apply their entrepreneurial experience and networks to the pedagogical experience offered to a select group of students. The education is a collaboration between the Sahlgrenska Academy at the University of Gothenburg and Chalmers University of Technology, and is part of the Center for Intellectual Property Studies (CIP). Most of the collaboration takes place during the first year, with a shared curriculum studied alongside peers from backgrounds such as law, engineering, life sciences, and management. In the second year, students work in groups on an innovation project with the aim of commercializing an innovation | https://en.wikipedia.org/wiki?curid=6104628 |
Gothenburg International Bioscience Business School The University of Gothenburg offers an education geared towards life science students who become entrepreneurial project leaders, ready to deal with the uncertainties and distinct dynamics of the life science industry and start-ups. The graduates of the programme are not only well versed in their respective life science fields; uniquely, they are also entrepreneurial project leaders. This mix prepares them for the distinctive life science market dynamics of their chosen career, equipping them to recognize possibilities and create growth. Chalmers University of Technology offers an education geared towards engineers in technology-based ventures. | https://en.wikipedia.org/wiki?curid=6104628 |
Gyula Farkas (natural scientist) Farkas Gyula, or Julius Farkas (March 28, 1847 – December 27, 1930) was a Hungarian mathematician and physicist. He attended the gymnasium at Győr (Raab), and studied law and physics at Pest. After teaching in a secondary school at Székesfehérvár (Stuhlweissenburg), Farkas became in succession principal of the normal school at Pápa, privat-docent (1881) of mathematics at the University of Budapest, and professor of physics (1888) at Franz Joseph University of Kolozsvár (Klausenburg). He worked there until 1915, when he retired and moved to Budapest. The Hungarian Academy of Science elected him a corresponding member on May 6, 1898. He contributed to linear algebra with Farkas' lemma, which is named after him for his derivation of it. His principal writings are embodied in the reports of the Academy of Science of Paris (1878–1884). | https://en.wikipedia.org/wiki?curid=6107158 |
Witold Milewski (1817–1889) was a pedagogue. In 1853 he became director of the Gymnasium in Trzemeszno. | https://en.wikipedia.org/wiki?curid=6112007 |
Cirque glacier A "cirque glacier" is formed in a cirque, a bowl-shaped depression on the side of or near mountains. Snow and ice accumulation in corries often occurs as the result of avalanching from higher surrounding slopes. If a cirque glacier advances far enough, it may become a valley glacier. Additionally, if a valley glacier retreats enough that it is within the cirque, it becomes a cirque glacier again. In these depressions, snow persists through summer months, and becomes glacier ice. Snow may be situated on the leeward slope of a mountain, where it is sheltered from wind. Rock fall from above slopes also plays an important role in sheltering the snow and ice from sunlight. If enough rock falls onto the glacier, it may become a rock glacier. Randklufts may form beneath corrie glaciers as open space between the ice and the bedrock, where meltwater can play a role in deposition of the rock. | https://en.wikipedia.org/wiki?curid=6121014 |
Emilio Cornalia (25 August 1824 – 8 June 1882) was an Italian naturalist. He was born in Milan and died in the same city. He was conservator from 1851 to 1866, and director from 1866 till his death, of the Milan Museum of Natural History, and was interested in all areas of biology. He was one of the group of leading scientists instrumental in founding "La Società Entomologica Italiana", the Italian Entomological Society. He was the author of important works of applied entomology, such as "Monografia del bombice del gelso" published in 1856, and was part of a scientific expedition to the upper Nile valley in 1873. | https://en.wikipedia.org/wiki?curid=6122794 |
Natural design is an approach to psychology and biology that holds that concepts such as "motivation", "emotion", "inner feeling", "development" and "adaptation" refer not to down-reductive explanations of things but to up-reductive descriptions of patterns of which those things are part. It has its roots in philosophical behaviorism and the new realism. It also refers to a holistic approach to design called for by Prof. David W. Orr (Professor of Environmental Studies and Politics, Oberlin College, USA) and developed for research practice by Prof. Seaton Baxter (Emeritus Professor for the Study of Natural Design, Duncan of Jordanstone College of Art and Design, University of Dundee). Natural design is attributed to the process of natural selection: all species have been shaped by their life situation so as to produce more offspring. As a result, only better-designed organisms are found today, because natural selection is limited only by the rapidity of environmental change and the capacity of the genes to generate variation. Moreover, the theory of evolution by natural selection can to some extent explain the behavior of animals. Psychologists studying behavior in the Darwinian tradition noted two consequences of studying behavior this way, which provided two competing systems for the study of behavior that developed into ethology and comparative psychology. | https://en.wikipedia.org/wiki?curid=6126807 |
Oak Investment Partners is a private equity firm focusing on venture capital investments in companies developing communications systems, information technology, new Internet media, healthcare services and retail. The firm, founded in 1978, is based in Greenwich, Connecticut, with offices in Norwalk, Connecticut, Minneapolis, and Palo Alto, California. Since inception, Oak has invested in more than 480 companies and has raised more than $8.4 billion in investor commitments across 12 private equity funds. Ann Lamont is a founder and managing partner. In May 2006, Oak raised its 12th fund, at $2.56 billion, reportedly the largest venture capital fund ever raised. In 2015, Indian-born employee Iftikar Ahmed was sued by the U.S. Securities and Exchange Commission on suspicion of stealing US$65 million from the firm. Ahmed was believed to have fled to India. In August 2015, Fortune reported that Ahmed had been detained in an Indian prison from May 22 until July 23 and that his passport had been confiscated. Oak invests across a range of stages: funding startup companies, funding spinouts of existing divisions and assets, and providing growth capital to later-stage companies. Oak also selectively invests in public companies through PIPE investments. In December 2008, Oak invested $25 million in the online media outlet Huffington Post. Other notable Oak investments include eVoice, Brightcom Group, Dick's Sporting Goods, DueDil, Moxie Software, NeoPhotonics Corporation, NextNav, One Medical Group, Office Depot, Petsmart, P.F | https://en.wikipedia.org/wiki?curid=6128122 |
Oak Investment Partners Chang's China Bistro, Photobucket, Polycom, nLIGHT, Sandisk, Seagate, SmartDrive Systems, Sovrn Holdings, Tikona Infinet Limited, Wonga.com, and Zumobi. | https://en.wikipedia.org/wiki?curid=6128122 |
Arctic Lowlands The Canadian Shield and the Innuitian region are located to the south of the Arctic Lowland plains. This is a region of tundra, a treeless plain, with a cold, dry climate and poorly drained soil. The region is located in Nunavut and the Northwest Territories. The Arctic Lowlands are plains located in Canada. Plains are extensive areas of level or gently rolling land; in North America there is a large, flat interior plain. The Lowlands are also part of an area commonly referred to as the Arctic Archipelago, which occupies much of the central Canadian Arctic. They are made up of a series of islands located in Canada's far north, and remain frozen for most of the year. However, the Paleozoic sedimentary rock from which the Lowlands are formed contains lignite (a form of coal), oil, and natural gas deposits. Limestone is very abundant as well. The Arctic Lowlands have a small human population. The terrain is mostly ice, snow, and rock, and is full of marshes, especially in the winter. Animals that live in the area include polar bears, char, Arctic hares and Arctic foxes. This region is being affected by global warming. It is very cold, and human life there may be difficult. Commonly known as the Hudson Bay–Arctic Lowlands, the Hudson Bay area is over 80% water. It started to form in the Cenozoic era. | https://en.wikipedia.org/wiki?curid=6130928 |
ISO 31-9 gives names, symbols and definitions for 51 quantities and units of atomic and nuclear physics. Where appropriate, conversion factors are also given. Annex A includes names and symbols of the chemical elements, Annex B the notation of symbols for chemical elements and nuclides, Annex C the original names and symbols for nuclides of radioactive series, and Annex D examples of relations in different systems of equations using different systems of units. | https://en.wikipedia.org/wiki?curid=6141463 |
Weather house A weather house is a folk art device in the shape of a small German or Alpine chalet that indicates the weather. A typical weather house has two doors side by side. The left side has a girl or woman, the right side a boy or man. The female figure comes out of the house when the weather is sunny and dry, while the male comes out to indicate rain. In fact, a weather house functions as a hygrometer embellished by folk art. The male and female figures ride on a balance bar, which is suspended by a piece of catgut or hair. The gut relaxes or shrinks based on the humidity in the surrounding air, relaxing when the air is wet and tensing when the air is dry. This action swings one figure or the other out of the house depending on the humidity. Some variants function as a barometer: low pressure indicates bad (rainy) weather, high pressure good (sunny) weather. Weather houses are associated in the popular mind with Austria, Germany or Switzerland, and are often decorated in the style of a cuckoo clock. Many weather houses also bear a small thermometer on the part between the two doors that conceals the gut suspension, and many also contain a piggy bank. In contrast, the term "weather house" in the United States refers to buildings built by the U.S. Signal Service and then the U.S. Weather Bureau to house the instruments and Chief Weather Observers so that they could do their job. | https://en.wikipedia.org/wiki?curid=6141469 |
Weather house A one-act English comic opera called "Weather or No", about the male and female figures in a weather house falling in love, became popular when it was played as a companion piece to "The Mikado" in 1896-97. "The Brollys" is an animated television series about a boy who is magically transported every night into the weather house on the wall of his bedroom. | https://en.wikipedia.org/wiki?curid=6141469 |
Peter Doran Peter Doran, Ph.D. is Professor of Geology and Geophysics and John Franks Endowed Chair at Louisiana State University. Doran specializes in polar regions, especially Antarctic climate and ecosystems. Doran was the lead author of a research paper about Antarctic temperatures that was published in the journal "Nature" in January 2002. Because he and his colleagues found that some parts of Antarctica had cooled between 1964 and 2000, his paper has been frequently cited by opponents of the global warming theory, such as Ann Coulter and Michael Crichton. In an opinion piece in the July 27, 2006 "New York Times", Doran characterized this as a "misinterpretation" and stated, "I have never thought such a thing ... I would like to remove my name from the list of scientists who dispute global warming." (The temporary cooling is related to the "hole" in the ozone layer; as the hole heals, the Antarctic is expected to warm rapidly.) Doran and his graduate student Maggie Kendall Zimmerman also published a paper in the January 27, 2009 issue of "EOS" showing that active climate researchers almost unanimously agree that humans have had a significant impact on the Earth's climate. Doran also applies his expertise in extreme polar environments to planetary science and has led and been a member of several NASA-funded projects using polar regions as analogs for Mars and icy/ocean worlds. | https://en.wikipedia.org/wiki?curid=6144037 |
Peter Doran He was nominated by former Apollo astronaut Harrison Schmitt to be a member of the Planetary Protection Subcommittee of the NASA Advisory Council in 2008 and served until 2017. He is a member of a Committee on Space Research (COSPAR) group that regularly meets to discuss planetary protection on human missions to Mars, and in 2018 was appointed to represent the U.S. as a member of the COSPAR Panel on Planetary Protection. Both an Antarctic stream and a glacier were named for Doran by the U.S. Geological Survey to commemorate his many significant research contributions on the continent. | https://en.wikipedia.org/wiki?curid=6144037 |
Controlled lab reactor In chemistry, a controlled lab reactor (CLR) is any reaction system with an element of automated control. Generally, the term refers to a jacketed glass vessel in which a circulating chiller unit pumps a thermal control fluid through the jacket to accurately control the temperature of the vessel contents. In addition, it is common to have a series of sensors (temperature, pH, pressure) measuring and recording parameters of the reactor contents. Pumps acting on the reactor can also be controlled. The first controlled lab reactors were derived from the control systems used in chemical plants. These were generally dedicated to specific tasks, as reprogramming was difficult. These first systems were often home-built and used hardware that was adapted rather than designed for the task. Modern CLR systems take a wide range of forms, with the ability to work on a range of different reactor volumes (and indeed reactor styles). Data is usually transmitted back to a PC to be recorded (complex recipe-based control is usually performed there too), though other systems may use off-line data logging. In the most sophisticated systems, analytical instruments such as Raman spectrometers and FTIR probes can also be integrated with the reactor. These more sophisticated systems also allow closed-loop control of the reactor based on readings from the sensors and analytical instruments concerned. | https://en.wikipedia.org/wiki?curid=6153698 |
Controlled lab reactor Most reaction calorimeters can be used as controlled lab reactors (indeed, some calorimeters are based on CLRs). See also: Reaction calorimeter. | https://en.wikipedia.org/wiki?curid=6153698 |
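The closed-loop temperature control described above can be sketched in a few lines. This is a minimal, hypothetical illustration of a PID loop steering the jacket-fluid setpoint of a jacketed vessel; the callbacks `read_temperature` and `set_jacket_setpoint`, and all gain values, are invented placeholders and do not correspond to any real CLR vendor API.

```python
# Hypothetical sketch of the closed-loop control a CLR performs: a PID loop
# that steers the jacket-fluid setpoint so the vessel contents reach a target
# temperature. Callbacks and gains are illustrative placeholders only.

def pid_step(error, integral, prev_error, dt, kp=0.5, ki=0.05, kd=0.1):
    """One step of a textbook PID calculation; returns (output, new_integral)."""
    integral += error * dt
    derivative = (error - prev_error) / dt
    return kp * error + ki * integral + kd * derivative, integral

def control_loop(read_temperature, set_jacket_setpoint, target_c,
                 steps=10, dt=1.0):
    """Repeatedly nudge the jacket setpoint toward the target temperature."""
    integral, prev_error = 0.0, 0.0
    for _ in range(steps):
        error = target_c - read_temperature()
        correction, integral = pid_step(error, integral, prev_error, dt)
        set_jacket_setpoint(target_c + correction)
        prev_error = error
        # a real loop would pause here for the sampling interval, time.sleep(dt)
```

In a recipe-based system, a supervisory script on the PC would call a loop like this with a schedule of target temperatures, logging each reading alongside the sensor data.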
National Museum of Natural History, Bulgaria The National Museum of Natural History (, "Natsionalen prirodonauchen muzey"; abbreviated НПМ, NMNHS) of Bulgaria is a natural history museum located in Sofia, the capital of Bulgaria, on "Tzar Osvoboditel" Str., next to the Russian church. Founded in 1889, it is affiliated with the Bulgarian Academy of Sciences, and is the first and largest museum of this kind in the Balkans. The Museum's collection includes over 400 stuffed mammals, over 1,200 species of birds, hundreds of thousands of insects and other invertebrates, as well as samples of about one quarter of the world's mineral species. The National Museum of Natural History was founded in 1889 as the Natural History Museum of Knyaz Ferdinand of Bulgaria, with various foreign and Bulgarian specialists (e.g. Ivan Buresh, director from 1913 to 1947) serving as its directors until 1947, when the museum became part of the Bulgarian Academy of Sciences' Zoological Institute. The Museum became autonomous as a separate institute within the system of BAS in 1974. In 1992, the Asenovgrad Palaeontology Museum, an NMNH branch in Asenovgrad, was formed. | https://en.wikipedia.org/wiki?curid=6157134 |
Aircraft Meteorological Data Relay (AMDAR) is a program initiated by the World Meteorological Organization. AMDAR is used to collect meteorological data worldwide using commercial aircraft. Data are collected by the aircraft navigation systems and the onboard standard temperature and static pressure probes, then preprocessed before being downlinked to the ground either via VHF communication (ACARS) or via satellite link (ASDAR). A detailed description is given in the AMDAR Reference Manual (WMO-No. 958), available from the World Meteorological Organization, Geneva, Switzerland. AMDAR transmissions are most commonly used in forecast models as a supplement to radiosonde data, to aid in the plotting of upper-air data between the standard radiosonde soundings at 00Z and 12Z. | https://en.wikipedia.org/wiki?curid=6158574 |
Huayco A huaico or huayco (from the Quechua "wayqu", meaning "depth, valley") is an Andean term for the mudslide and flash flood caused by torrential rains occurring high in the mountains, especially during the weather phenomenon known as "El Niño". National forests such as the San Matías–San Carlos Protection Forest were created in Peru to protect vegetation, which reduces runoff, and prevent huaicos. The indigenous Mapuche residents of "Lo Barnechea", in present-day Santiago Province, Chile, were called "Huaicoches" in their Mapudungun language: "Huaico" (flash flood) and "che" (people). | https://en.wikipedia.org/wiki?curid=6160943 |
Cooperstock's energy-localization hypothesis In physics, Cooperstock's energy-localization hypothesis is a hypothesis proposed by Fred Cooperstock that in general relativity, energy only exists in regions of non-vanishing energy–momentum tensor. Since the creation of general relativity there have been questions about the energy of gravitational fields. Among the proposals for the energy are the Landau–Lifshitz pseudotensor, the Einstein pseudotensor, and the Møller superpotential. In Misner, Thorne & Wheeler the authors claimed that energy can only be localized for spherical systems, which Cooperstock & Sarracino demonstrated implies that energy must be localized for all systems, while Bondi argued that non-localizable energy is not allowed in general relativity. The energy-localization hypothesis has also been proven for a number of specific examples, but has not been proven or disproven in general. Feynman's sticky bead argument shows that energy is transported by gravitational waves, which is difficult to reconcile with Cooperstock's hypothesis. | https://en.wikipedia.org/wiki?curid=6162261 |
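The core claim can be written compactly. The following is a paraphrase in standard notation, not a formula taken from Cooperstock's papers:

```latex
% Cooperstock's energy-localization hypothesis (paraphrase): energy,
% including any gravitational contribution, resides only where the
% stress--energy tensor is non-zero.
T_{\mu\nu}(x) = 0
\quad\Longrightarrow\quad
\text{no energy is localized at the event } x .
```

In particular, on this view a vacuum region traversed by a gravitational wave would carry no localized energy, which is the tension with the sticky bead argument noted above.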
TNF inhibitor A TNF inhibitor is a pharmaceutical drug that suppresses the physiologic response to tumor necrosis factor (TNF), which is part of the inflammatory response. TNF is involved in autoimmune and immune-mediated disorders such as rheumatoid arthritis, ankylosing spondylitis, inflammatory bowel disease, psoriasis, hidradenitis suppurativa and refractory asthma, so TNF inhibitors may be used in their treatment. The important side effects of TNF inhibitors include lymphomas, infections (especially reactivation of latent tuberculosis), congestive heart failure, demyelinating disease, a lupus-like syndrome, induction of auto-antibodies, injection site reactions, and systemic side effects. The global market for TNF inhibitors was $13.5 billion in 2008 and $22 billion in 2009. Inhibition of TNF effects can be achieved with a monoclonal antibody such as infliximab (Remicade), adalimumab (Humira), certolizumab pegol (Cimzia), and golimumab (Simponi), or with a circulating receptor fusion protein such as etanercept (Enbrel). Thalidomide (Immunoprin) and its derivatives lenalidomide (Revlimid) and pomalidomide (Pomalyst, Imnovid) are also active against TNF. While most clinically useful TNF inhibitors are monoclonal antibodies, some are simple molecules such as xanthine derivatives (e.g. pentoxifylline) and bupropion. Bupropion is the active ingredient in the smoking cessation aid Zyban and the antidepressants Wellbutrin and Aplenzin. | https://en.wikipedia.org/wiki?curid=6173922 |
TNF inhibitor Several 5-HT2A agonist hallucinogens, including (R)-DOI, TCB-2, LSD and LA-SS-Az, have unexpectedly also been found to act as potent inhibitors of TNF, with DOI being the most active, showing TNF inhibition in the picomolar range, an order of magnitude more potent than its action as a hallucinogen. The role of TNF as a key player in the development of rheumatoid arthritis was originally demonstrated by Kollias and colleagues in proof-of-principle studies in transgenic animal models. TNF levels have been shown to be raised in both the synovial fluid and synovium of patients with rheumatoid arthritis. This leads to local inflammation through the signalling of synovial cells to produce metalloproteinases and collagenase. Clinical application of anti-TNF drugs in rheumatoid arthritis was demonstrated by Marc Feldmann and Ravinder N. Maini, who won the 2003 Lasker Award for their work. Anti-TNF compounds help eliminate abnormal B cell activity. Therapy which combines certain anti-TNF agents such as etanercept with DMARDs such as methotrexate has been shown to be more effective at restoring quality of life to sufferers of rheumatoid arthritis than using either drug alone. Clinical trials regarding the effectiveness of these drugs on hidradenitis suppurativa are ongoing. The National Institute of Clinical Excellence (NICE) has issued guidelines for the treatment of severe psoriasis using the anti-TNF drugs etanercept (Enbrel) and adalimumab (Humira) as well as the anti-IL12/23 biological treatment ustekinumab (Stelara). | https://en.wikipedia.org/wiki?curid=6173922 |
TNF inhibitor In cases where more conventional systemic treatments such as psoralen combined with ultraviolet A treatment (PUVA), methotrexate, and ciclosporin have failed or cannot be tolerated, these newer biological agents may be prescribed. Infliximab (Remicade) may be used to treat severe plaque psoriasis if the aforementioned treatments fail or cannot be tolerated. In 2010 the National Institute of Clinical Excellence (NICE) in the UK issued guidelines for the treatment of severe Crohn's disease with infliximab and adalimumab. Anti-TNF therapy has shown only modest effects in cancer therapy. Treatment of renal cell carcinoma with infliximab resulted in prolonged disease stabilization in certain patients. Etanercept was tested for treating patients with breast cancer and ovarian cancer, showing prolonged disease stabilization in certain patients via downregulation of IL-6 and CCL2. On the other hand, adding infliximab or etanercept to gemcitabine for treating patients with advanced pancreatic cancer was not associated with differences in efficacy when compared with placebo. The U.S. Food and Drug Administration continues to receive reports of a rare cancer of white blood cells (known as hepatosplenic T-cell lymphoma or HSTCL), primarily in adolescents and young adults being treated for Crohn's disease and ulcerative colitis with TNF blockers, as well as with azathioprine, and/or mercaptopurine. TNF inhibitors put patients at increased risk of certain opportunistic infections. | https://en.wikipedia.org/wiki?curid=6173922 |
TNF inhibitor The FDA has warned about the risk of infection from two bacterial pathogens, Legionella and Listeria. People taking TNF blockers are at increased risk for developing serious infections that may lead to hospitalization or death due to certain bacterial, mycobacterial, fungal, viral, and parasitic opportunistic pathogens. In patients with latent Mycobacterium tuberculosis infection, active tuberculosis (TB) may develop soon after the initiation of treatment with infliximab. Before prescribing a TNF inhibitor, physicians should screen patients for latent tuberculosis. The anti-TNF monoclonal antibody biologics infliximab, golimumab, certolizumab and adalimumab, and the fusion protein etanercept, which are all currently approved by the FDA for human use, have warnings which state that patients should be evaluated for latent TB infection, and if it is detected, preventive treatment should be initiated prior to starting therapy with these medications. The FDA issued a warning on September 4, 2008, that patients on TNF inhibitors are at increased risk of opportunistic fungal infections such as pulmonary and disseminated histoplasmosis, coccidioidomycosis, and blastomycosis. They encourage clinicians to consider empiric antifungal therapy in certain circumstances to all patients at risk until the pathogen is identified | https://en.wikipedia.org/wiki?curid=6173922 |
TNF inhibitor A recent review showed that anti-TNFα agents are associated with increased infection risks for both endemic and opportunistic invasive fungal infections, particularly when given late in the overall course of treatment of the underlying disease, and in young patients receiving concomitant cytotoxic or augmented immunosuppressive therapy. In 1999 a randomized control trial was conducted testing a TNF-alpha inhibitor prototype, Lenercept, for the treatment of multiple sclerosis (MS). However, the patients in the study who received the drug had significantly more and earlier exacerbations of their disease than those who did not. Case reports have also suggested that anti-TNF agents not only worsen, but cause new-onset multiple sclerosis in some patients. Most recently, a 2018 case report described an Italian man with plaque psoriasis who developed MS after starting etanercept. Their literature review at that time identified 34 other cases of demyelinating disease developing after the initiation of an anti-TNF drug. Thus, anti-TNF drugs are contraindicated in patients with MS, and the American Academy of Dermatology recommends avoiding their use in those with a first-degree relative with MS. Several anti-TNF drugs are commonly prescribed for a number of autoimmune conditions. Some of them have been reported to produce a CNS demyelination compatible with, and by current knowledge indistinguishable from, standard MS. | https://en.wikipedia.org/wiki?curid=6173922 |
TNF inhibitor Several other monoclonal antibodies like adalimumab, pembrolizumab, nivolumab and infliximab have been reported to induce MS-like disease. TNF or its effects are inhibited by several natural compounds, including curcumin (a compound present in turmeric) and catechins (in green tea). Cannabidiol and Echinacea purpurea also seem to have anti-inflammatory properties through inhibition of TNF-α production, although this effect may be mediated through cannabinoid CB1- or CB2-receptor-independent effects. Early experiments associated TNF with the pathogenesis of bacterial sepsis. Thus, the first preclinical studies using polyclonal antibodies against TNF-alpha were performed in animal models of sepsis in 1985 and showed that anti-TNF antibodies protected mice from sepsis. However, subsequent clinical trials in patients with sepsis showed no significant benefit. It was not until 1991 that studies in a transgenic mouse model of overexpressed human TNF provided the pre-clinical rationale for a causal role of TNF in the development of polyarthritis and showed that anti-TNF treatments could be effective against human arthritides. This was later confirmed in clinical trials and led to the development of the first biological therapies for rheumatoid arthritis. | https://en.wikipedia.org/wiki?curid=6173922 |
Fissility (geology) In geology, fissility is the ability or tendency of a rock to split along flat planes of weakness (“parting surfaces”). These planes of weakness are oriented parallel to stratification in sedimentary rocks. Fissility is differentiated from scaly fabric in hand sample by the parting surfaces’ continuously parallel orientations to each other and to stratification, and is distinguished from scaly fabric in thin section by the well-developed orientation of platy minerals such as mica. Fissility is the result of sedimentary or metamorphic processes. Planes of weakness develop in sedimentary rocks such as shale or mudstone as clay particles align during compaction, and in metamorphic rocks through the recrystallization and growth of micaceous minerals. A rock's fissility can be degraded in numerous ways during the geologic process, including clay particles flocculating into a random fabric before compaction, bioturbation during compaction, and weathering during and after uplift. The effect of bioturbation has been well documented in sampled shale cores: below variable critical depths at which burrowing organisms can no longer survive, shale fissility becomes more pervasive and better defined. Fissility is used by some geologists as the defining characteristic which separates mudstone (no fissility) from shale (fissile). However, some professions, like drilling engineers, continue to use the terms shale and mudstone interchangeably. | https://en.wikipedia.org/wiki?curid=6181420 |
Charles Vélain (16 May 1845 – 6 June 1925) was a French geologist and geographer. He was born in Château-Thierry. Vélain's route to the field of geology was an unusual one: he was first a student of pharmacy, and only later studied geology at the Sorbonne. He took part in several explorations, during which he studied various geological phenomena that would later bring him fame. Vélain's major contributions to geology were in petrography and physical geography, and he was an authority on volcanism. His work in this field earned him numerous accolades, such as the Prix Delalande-Guérineau of the French Academy of Sciences in 1877. Vélain also enjoyed a successful career as a scholar. He worked as a lecturer for several years and authored numerous scientific papers and articles, writing mostly on geology, petrology and physical geography, with volcanoes remaining his subject of interest and expertise. Some of his works include 'Volcanoes' (Paris, 1884), 'The Earthquakes' (Paris, 1887), and 'Petrology Conferences' (Paris, 1889). Vélain's contributions were commemorated in 1963 by the naming of a peak on the Rallier du Baty Peninsula as Mount Velain. | https://en.wikipedia.org/wiki?curid=6183649 |
Harold Cogger Harold George "Hal" Cogger (born 4 May 1935) is an Australian herpetologist. He was curator of reptiles and amphibians at the Australian Museum from 1960 to 1975, and Deputy Director of the museum from 1976 to 1995. He has written extensively on Australian herpetology, and was the first author to create a field guide for all Australian frogs and reptiles. Cogger was made an honorary Doctor of Science in 1997. At least eight reptile taxa have been named after Cogger, including one genus, six species, and one subspecies: "Coggeria", "Ctenotus coggeri", "Emoia coggeri", "Geomyersi coggeri", "Hydrophis coggeri", "Lampropholis coggeri", "Oedura coggeri", and "Diporiphora nobbi coggeri". | https://en.wikipedia.org/wiki?curid=6190989 |