text | source | __index_level_0__ |
|---|---|---|
Science DMZ Network Architecture For example, it should take less than 4 hours to transfer 10 terabytes of data over a 10 Gigabit Ethernet network path, assuming disk performance is adequate. The problem is that this requires networks that are free from packet loss and from middleboxes, such as traffic shapers or firewalls, that slow network performance. Most businesses and other institutions use a firewall to protect their internal network from malicious attacks originating from outside. All traffic between the internal network and the external Internet must pass through the firewall, which discards traffic likely to be harmful. A stateful firewall tracks the state of each logical connection passing through it, and rejects data packets inappropriate for the state of the connection. For example, a website would not be allowed to send a page to a computer on the internal network unless the computer had requested it. This requires the firewall to keep track of the pages recently requested, and to match requests with responses. A firewall must also analyze network traffic in much more detail than other networking components, such as routers and switches. Routers only have to deal with the network layer, but firewalls must process the transport and application layers as well. All this additional processing takes time and limits network throughput | https://en.wikipedia.org/wiki?curid=37194181 | 98,690 |
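The arithmetic behind the 4-hour figure can be checked directly. The sketch below assumes an idealized, loss-free 10 Gb/s path running at full line rate and ignores protocol overhead:

```python
# Ideal transfer time for 10 TB over a 10 Gigabit Ethernet path.
# Assumes a loss-free link at full line rate; real transfers add
# protocol overhead and are slowed dramatically by packet loss
# and middleboxes, which is the article's point.

data_bytes = 10 * 10**12          # 10 TB (decimal terabytes)
link_bits_per_s = 10 * 10**9      # 10 Gb/s

seconds = data_bytes * 8 / link_bits_per_s
hours = seconds / 3600
print(f"{hours:.2f} hours")       # about 2.22 hours, comfortably under 4
```

The ideal time is roughly 2.2 hours, so the "less than 4 hours" figure leaves headroom for realistic overhead.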
Alkali–silica reaction However, not all soluble Na or K salts can precipitate insoluble calcium salts; NaCl-based deicing salts, for example, cannot: as calcium chloride is a soluble salt, the reaction cannot occur and the chemical equilibrium shifts back to the left side of the reaction. So, a question arises: can NaCl or KCl from deicing salts still play a role in the alkali–silica reaction? Na⁺ and K⁺ cations in themselves cannot attack silica (the culprit is their counter-ion), and soluble alkali chlorides cannot produce soluble alkali hydroxide by interacting with calcium hydroxide. So, is there another route to produce hydroxide anions in the hardened cement paste (HCP)? Besides portlandite, other hydrated solid phases are present in HCP. The main phases are the calcium silicate hydrates (C-S-H) (the "glue" in cement paste), calcium sulfo-aluminate phases (AFm and AFt, ettringite) and hydrogarnet. C-S-H phases are less soluble (~10⁻⁵ M) than portlandite (CH) (~2.2 × 10⁻² M at 25 °C) and are therefore expected to play a negligible role in calcium ion release. An anion-exchange reaction between chloride ions and the hydroxide anions contained in the lattice of some calcium aluminate hydrates (C-A-H), or related phases (C-A-S-H, AFm, AFt), is suspected to also contribute to the release of hydroxide anions into solution | https://en.wikipedia.org/wiki?curid=12484448 | 89,236 |
Split-ring resonator These resonators have been used for the synthesis of left-handed and negative-refractive-index media, where the necessary negative effective permeability is due to the presence of the SRRs. When an array of electrically small SRRs is excited by means of a time-varying magnetic field, the structure behaves as an effective medium with negative effective permeability in a narrow band above the SRR resonance. SRRs have also been coupled to planar transmission lines for the synthesis of transmission-line metamaterials. The split-ring resonator was a microstructure design featured in the 1999 paper by Pendry et al., "Magnetism from Conductors and Enhanced Nonlinear Phenomena". It proposed that the split-ring resonator design, built out of nonmagnetic material, could produce magnetic activity unseen in natural materials. In the simple microstructure design, it is shown that for an array of conducting cylinders, with an external magnetic field (formula_1) applied parallel to the cylinders, the effective permeability can be written as the following. (This model is very limited, and it is important to note that the effective permeability here cannot be less than zero or greater than one.) Here formula_3 is the resistance of the cylinder surface per unit area, a is the spacing of the cylinders, formula_4 is the angular frequency, formula_5 is the permeability of free space and r is the radius | https://en.wikipedia.org/wiki?curid=8849460 | 367,647 |
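The expression referred to as "the following" does not survive in this excerpt. The commonly quoted form of Pendry et al.'s result for an array of conducting cylinders is reproduced below on the assumption that it is the intended formula; here σ corresponds to formula_3 (surface resistance per unit area), ω to formula_4, and μ₀ to formula_5. Readers should check the original 1999 paper for sign and convention details.

```latex
% Effective permeability of an array of conducting cylinders
% (radius r, lattice spacing a), as commonly quoted from Pendry et al., 1999:
\mu_{\mathrm{eff}} = 1 - \frac{\pi r^{2}/a^{2}}{1 + \dfrac{2\sigma}{i\,\omega\, r\, \mu_{0}}}
```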
Osmometer An osmometer is a device for measuring the osmotic strength of a solution, colloid, or compound. There are several different techniques employed in osmometry: Osmometers are useful for determining the total concentration of dissolved salts and sugars in blood or urine samples. Osmometry is also useful in determining the molecular weight of unknown compounds and polymers. Osmometry, the measurement of the osmotic strength of a substance, is often used by chemists to determine average molecular weights. | https://en.wikipedia.org/wiki?curid=1313809 | 249,340 |
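One way osmometry yields a molecular weight, membrane osmometry, applies the van 't Hoff relation to a dilute solution; the sketch below is illustrative, and the concentration and pressure values are invented, not measured data.

```python
# Sketch: estimating molar mass from membrane-osmometry data via the
# van 't Hoff relation for dilute, ideal solutions:
#   pi = (c / M) * R * T   =>   M = c * R * T / pi
# Units: c in kg/m^3 (numerically equal to g/L), pi in Pa, M in kg/mol.

R = 8.314       # gas constant, J/(mol*K)
T = 298.15      # temperature, K
c = 5.0         # polymer mass concentration, kg/m^3 (hypothetical)
pi = 250.0      # measured osmotic pressure, Pa (hypothetical)

M = c * R * T / pi
print(f"M ~ {M * 1e3:.0f} g/mol")   # roughly 50,000 g/mol for these numbers
```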
Semigroup action In algebra and theoretical computer science, an action or act of a semigroup on a set is a rule which associates to each element of the semigroup a transformation of the set in such a way that the product of two elements of the semigroup (using the semigroup operation) is associated with the composite of the two corresponding transformations. The terminology conveys the idea that the elements of the semigroup are "acting" as transformations of the set. From an algebraic perspective, a semigroup action is a generalization of the notion of a group action in group theory. From the computer science point of view, semigroup actions are closely related to automata: the set models the state of the automaton and the action models transformations of that state in response to inputs. An important special case is a monoid action or act, in which the semigroup is a monoid and the identity element of the monoid acts as the identity transformation of a set. From a category theoretic point of view, a monoid is a category with one object, and an act is a functor from that category to the category of sets. This immediately provides a generalization to monoid acts on objects in categories other than the category of sets. Another important special case is a transformation semigroup. This is a semigroup of transformations of a set, and hence it has a tautological action on that set. This concept is linked to the more general notion of a semigroup by an analogue of Cayley's theorem | https://en.wikipedia.org/wiki?curid=1058218 | 119,100 |
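The automaton connection described above can be made concrete: input words form a free monoid under concatenation, and running a word through the transition function is an action on the state set. The two-state automaton below is an arbitrary illustration.

```python
# Sketch: the free monoid of input strings acting on an automaton's states.
# act(s, w) applies the word w to state s; the defining property is that
# act(s, u + v) == act(act(s, u), v), i.e. the monoid operation
# (concatenation) corresponds to composition of transformations.

delta = {            # transition function: (state, symbol) -> state
    (0, "a"): 1, (0, "b"): 0,
    (1, "a"): 0, (1, "b"): 1,
}

def act(state, word):
    for symbol in word:
        state = delta[(state, symbol)]
    return state

# The identity element (the empty word) acts as the identity transformation:
assert act(0, "") == 0
# Compatibility with the monoid operation:
assert act(0, "ab" + "ba") == act(act(0, "ab"), "ba")
```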
Industrial control system The SCADA also enables alarm conditions, such as loss of flow or high temperature, to be displayed and recorded. PLCs can range from small modular devices with tens of inputs and outputs (I/O) in a housing integral with the processor, to large rack-mounted modular devices with thousands of I/O, which are often networked to other PLC and SCADA systems. They can be designed for multiple arrangements of digital and analog inputs and outputs, extended temperature ranges, immunity to electrical noise, and resistance to vibration and impact. Programs to control machine operation are typically stored in battery-backed or non-volatile memory. Process control of large industrial plants has evolved through many stages. Initially, control was from panels local to the process plant. However, this required personnel to attend to these dispersed panels, and there was no overall view of the process. The next logical development was the transmission of all plant measurements to a permanently manned central control room. Often the controllers were behind the control room panels, and all automatic and manual control outputs were individually transmitted back to the plant in the form of pneumatic or electrical signals. Effectively this was the centralisation of all the localised panels, with the advantages of reduced manpower requirements and a consolidated overview of the process | https://en.wikipedia.org/wiki?curid=7333367 | 385,280 |
Medawar Medal The British Transplant Society (BTS) awards the Medawar Medal each year for the best clinical and scientific research presentations by a scientist or doctor. The Medawar Medal is the most prestigious award that the society can offer, is highly competitive, and cannot be won more than once by a single individual. The award is named after Peter Medawar, a winner of the Nobel Prize in Physiology or Medicine. Two medals are awarded every year. | https://en.wikipedia.org/wiki?curid=57048047 | 180,225 |
DNA vaccination Gene gun delivery systems, cationic liposome packaging, and other delivery methods bypass this entry method, but understanding it may be useful in reducing costs (e.g. by reducing the requirement for cytofectins), which could be important in animal husbandry. Studies using chimeric mice have shown that antigen is presented by bone-marrow-derived cells, which include dendritic cells, macrophages and specialised B-cells, collectively known as professional antigen-presenting cells (APCs). After gene gun inoculation of the skin, transfected Langerhans cells migrate to the draining lymph node to present antigens. After IM and ID injections, dendritic cells present antigen in the draining lymph node and transfected macrophages have been found in the peripheral blood. Besides direct transfection of dendritic cells or macrophages, cross-priming occurs following IM, ID and gene gun DNA deliveries. Cross-priming occurs when a bone-marrow-derived cell presents peptides from proteins synthesised in another cell in the context of MHC class I. This can prime cytotoxic T-cell responses and seems to be important for a full primary immune response. IM and ID DNA delivery initiate immune responses differently. In the skin, keratinocytes, fibroblasts and Langerhans cells take up and express antigens and are responsible for inducing a primary antibody response. Transfected Langerhans cells migrate out of the skin (within 12 hours) to the draining lymph node, where they prime secondary B- and T-cell responses | https://en.wikipedia.org/wiki?curid=45570 | 160,257 |
Baroclinity When the interface between these two surfaces is not horizontal and the system is close to hydrostatic equilibrium, the gradient of the pressure is vertical but the gradient of the density is not. Therefore, the baroclinic vector is nonzero, and the sense of the baroclinic vector is to create vorticity that makes the interface level out. In the process, the interface overshoots, and the result is an oscillation: an internal gravity wave. Unlike surface gravity waves, internal gravity waves do not require a sharp interface. For example, in bodies of water, a gradual gradient in temperature or salinity is sufficient to support internal gravity waves driven by the baroclinic vector. | https://en.wikipedia.org/wiki?curid=166371 | 36,330 |
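For reference, the baroclinic vector discussed above is conventionally written as follows (standard notation supplied here, not taken from the excerpt); it vanishes exactly when surfaces of constant pressure and constant density coincide, which is why a tilted interface generates vorticity.

```latex
% Baroclinic (solenoidal) term of the vorticity equation:
\left( \frac{D\vec{\omega}}{Dt} \right)_{\text{baroclinic}}
  = \frac{1}{\rho^{2}} \, \nabla \rho \times \nabla p
```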
Diffusion MRI In addition, the directional information can be exploited at a higher level of structure to select and follow neural tracts through the brain—a process called tractography. A more precise statement of the image acquisition process is that the image intensities at each position are attenuated, depending on the strength ("b"-value) and direction of the so-called magnetic diffusion gradient, as well as on the local microstructure in which the water molecules diffuse. The more attenuated the image is at a given position, the greater the diffusion in the direction of the diffusion gradient. In order to measure the tissue's complete diffusion profile, one needs to repeat the MR scans, applying different directions (and possibly strengths) of the diffusion gradient for each scan. In present-day clinical neurology, various brain pathologies may be best detected by looking at particular measures of anisotropy and diffusivity. The underlying physical process of diffusion causes a group of water molecules to move out from a central point, and gradually reach the surface of an ellipsoid if the medium is anisotropic (it would be the surface of a sphere for an isotropic medium). The ellipsoid formalism also functions as a mathematical method of organizing tensor data. Measurement of an ellipsoid tensor further permits a retrospective analysis, to gather information about the process of diffusion in each voxel of the tissue | https://en.wikipedia.org/wiki?curid=2574377 | 199,442 |
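In the simplest (mono-exponential) model, not stated explicitly above, the attenuation follows S = S₀·exp(−b·D), so an apparent diffusion coefficient can be recovered from two acquisitions along the same gradient direction. The signal values below are invented, not from any real scan.

```python
import math

# Mono-exponential diffusion model: S = S0 * exp(-b * D).
# Given the unweighted signal S0 and the attenuated signal S at one
# nonzero b-value, the apparent diffusion coefficient (ADC) in that
# gradient direction is:
#   D = ln(S0 / S) / b

S0 = 1000.0     # signal with no diffusion weighting (arbitrary units)
S = 480.0       # attenuated signal at b = 1000 s/mm^2 (hypothetical)
b = 1000.0      # s/mm^2

D = math.log(S0 / S) / b
print(f"ADC ~ {D:.2e} mm^2/s")   # stronger attenuation -> larger D
```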
Ringing artifacts This can be resolved by using a filter whose impulse response is non-negative and does not oscillate, but which shares the desired traits. For example, for a low-pass filter, the Gaussian filter is non-negative and non-oscillatory, hence causes no ringing. However, it is not as good a low-pass filter: it rolls off in the passband and leaks in the stopband; in image terms, a Gaussian filter "blurs" the signal, which reflects the attenuation of desired higher-frequency signals in the passband. A general solution is to use a window function on the sinc filter, which cuts off or reduces the negative lobes: these respectively eliminate and reduce overshoot and ringing. Note that truncating some but not all of the lobes eliminates the ringing beyond that point, but does not reduce the amplitude of the ringing that is not truncated (because this is determined by the size of the lobe), and increases the magnitude of the overshoot if the last non-cut lobe is negative, since the magnitude of the overshoot is the integral of the "tail," which is no longer canceled by positive lobes. Further, in practical implementations one at least truncates the sinc, since otherwise one must use infinitely many data points (or rather, all points of the signal) to compute every point of the output – truncation corresponds to a rectangular window, and makes the filter practically implementable, but the frequency response is no longer perfect | https://en.wikipedia.org/wiki?curid=22200477 | 113,118 |
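The windowing idea can be sketched numerically: compare a truncated sinc kernel with the same kernel shaped by a Hann window. The kernel length and cutoff below are arbitrary illustrative choices.

```python
import math

# Sketch: a truncated sinc low-pass kernel vs. the same kernel shaped by
# a Hann window. Tapering the outer lobes reduces the overshoot and
# ringing discussed above, at the cost of a wider transition band.

N = 21          # kernel length (odd), arbitrary choice
fc = 0.2        # normalized cutoff frequency (cycles per sample)

def sinc(x):
    return 1.0 if x == 0 else math.sin(math.pi * x) / (math.pi * x)

truncated = [2 * fc * sinc(2 * fc * (n - N // 2)) for n in range(N)]
hann = [0.5 - 0.5 * math.cos(2 * math.pi * n / (N - 1)) for n in range(N)]
windowed = [t * w for t, w in zip(truncated, hann)]

# The window damps the outer (ringing-causing) lobes toward zero:
assert abs(windowed[1]) < abs(truncated[1])
```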
Roy Rosenzweig Center for History and New Media On April 15, 2011, the Center for History and New Media became the Roy Rosenzweig Center for History and New Media, in memory of its founder. | https://en.wikipedia.org/wiki?curid=3618556 | 239,168 |
Ancient furniture Upon excavation, much of the furniture was conserved with paraffin wax mixed with carbon powder, which coats the wood and obscures important details such as decorations and joinery. It is now impossible to remove the wax coating without further damaging the furniture. Several wooden pieces were found with bone and metal fittings. Wooden shelving and racks are found in shops and kitchens in the Vesuvian sites, and one house has elaborate wooden room dividers. | https://en.wikipedia.org/wiki?curid=33959261 | 275,542 |
Robert Rose Tavern The Robert Rose Tavern is a historic house and former tavern at 298 Long Sands Road in York, Maine. Built in the 1750s using elements of a house built in 1680, it is one of the oldest surviving public hostelries in the state. The original house was built by John Banks, one of York's early settlers, and the tavern was built by a prominent local businessman. Now a private residence, the house was listed on the National Register of Historic Places in 1975. The Rose Tavern is located at the end of a long private drive off Long Sands Road, which runs just east of, and parallel to, the main access road to York High School. It is a two-story L-shaped wood frame structure, with a hip roof, clapboard siding, and granite foundation. Its main facade, which faces east, is roughly symmetrical, five bays wide, with a center entrance set in an enclosed projecting gabled vestibule. The door is flanked by sidelight windows, which are separated from the door and the vestibule corners by pilasters. To the right of the door, instead of a pair of sash windows (as found to its left), there is a group of three modern replacement sash windows. The interior of the building follows a central hall plan, with the original staircase still in place, along with late Georgian and Federal period woodwork elsewhere. The property on which the tavern stands was granted to John Banks, the son of Richard Banks who settled in York in 1642. Banks, born in York in 1657, built his house in 1680 | https://en.wikipedia.org/wiki?curid=47709282 | 328,056 |
Helena South-Central Historic District The Helena South-Central Historic District is a collection of historic buildings located in Helena, Montana, roughly bounded by Broadway, South Davis Street, the city limits, and South Warren Street. According to the Montana History Wiki: "The flamboyant Second Empire style is exceptionally well articulated in this grand residence showcasing the considerable talent of its builder and original owner, Martin M. Holter. One of few well-preserved examples of this style in Montana, the beautifully restored residence offers a glimpse into the 1870s when the style enjoyed short-term popularity and Rodney Street was the town's most fashionable neighborhood. Holter and his brother, Anton, were Norwegian immigrants who established Helena's first sawmill in 1865. By 1867, the successful brothers operated several Helena businesses including a distillery and grocery. Martin built this magnificent home in the late 1870's. The distinctive central tower, mansard roof with elaborately capped dormers and ornately decorated eaves are characteristics of this style. The arched, multi-paned Venetian windows lighting the beautiful, very tall double entry doors open to an interior that continues the outer grandeur. Original hand-grained wood finishes, extraordinarily high ceilings, a wide upstairs central hall and an elegant curved stairway contribute to an overall impression of great space and luxury." According to the Montana History Wiki: "Irish immigrant James M | https://en.wikipedia.org/wiki?curid=51665080 | 326,819 |
Comité de liaison de patronat de l'A.E.F. ('Liaison Committee of Employers of French Equatorial Africa', abbreviated COLPAEF) was an employers' organization in French Equatorial Africa (AEF). It had territorial branches in each of the four territories of AEF (Gabon, Moyen-Congo, Oubangui-Chari and Chad). COLPAEF was founded a few days after the introduction of the French Overseas Labour Code in 1953. The organization was affiliated to the "Conseil national du patronat français" (CNPF), the French National Employers Council. A Permanent Secretariat coordinated the activities of the four territorial branches of COLPAEF. In Gabon, COLPAEF affiliates included SYCOMIMPEX-GABON (traders' and importers' syndicate), "Syndicat professionnel des usines de sciage et de placages du Gabon" (S.P.U.S.P.D.G.) and "Syndicat forestier du Gabon" ('Forestry Union of Gabon'), of which the latter was the most important. In Moyen-Congo, the Fédération des petites et moyennes entreprises ('Federation of Small and Medium-sized Companies', founded in 1952), the Syndicat des entrepreneurs du bâtiment et des travaux publics ('Construction and Public Works Entrepreneurs Union', founded in 1957) and SYCOMIMPEX du Congo (founded in 1958) were members of COLPAEF. COLPAEF was organizationally significantly weaker than its West African counterparts. The employers of French Equatorial Africa were generally highly individualistic, and often operated with little regard to the Labour Code. COLPAEF was entirely dominated by French people. Politically, the organization had limited influence | https://en.wikipedia.org/wiki?curid=25643798 | 452,421 |
Concurrency (computer science) In computer science, concurrency is the ability of different parts or units of a program, algorithm, or problem to be executed out of order or in partial order without affecting the final outcome. This allows for parallel execution of the concurrent units, which can significantly improve the overall speed of execution in multi-processor and multi-core systems. In more technical terms, concurrency refers to the decomposability of a program, algorithm, or problem into order-independent or partially ordered components or units. A number of mathematical models have been developed for general concurrent computation, including Petri nets, process calculi, the parallel random-access machine model, the actor model and the Reo Coordination Language. As Leslie Lamport (2015) notes, "While concurrent program execution had been considered for years, the computer science of concurrency began with Edsger Dijkstra's seminal 1965 paper that introduced the mutual exclusion problem. ... The ensuing decades have seen a huge growth of interest in concurrency—particularly in distributed systems. Looking back at the origins of the field, what stands out is the fundamental role played by Edsger Dijkstra". Because computations in a concurrent system can interact with each other while being executed, the number of possible execution paths in the system can be extremely large, and the resulting outcome can be indeterminate | https://en.wikipedia.org/wiki?curid=928467 | 127,515 |
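The mutual exclusion problem Dijkstra introduced is what lock primitives solve in practice. A minimal sketch in Python: without the lock, concurrent read-modify-write on a shared counter can interleave and lose updates.

```python
import threading

# Mutual exclusion: the lock guarantees that only one thread at a time
# executes the critical section, so the read-modify-write on `counter`
# cannot interleave and lose updates.

counter = 0
lock = threading.Lock()

def worker(iterations):
    global counter
    for _ in range(iterations):
        with lock:               # critical section: one thread at a time
            counter += 1

threads = [threading.Thread(target=worker, args=(10_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)   # always 40000 with the lock in place
```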
Allstate McKinsey's recommendation to Allstate, according to Berardinelli, was to low-ball claims so that desperate customers in dire straits would be more likely to accept a settlement offer while Allstate continued to make a profit and collect interest on the insurance payment. Allstate would offer its "good hands" in the way of a low-ball claim and, if the customer did not accept, get out its "boxing gloves." The book was reviewed by "Business Week" magazine. According to that article, Allstate responded to Berardinelli's allegations by claiming that they were "unfounded and unproven." Rather than trying to cheat customers, the company stated that its major goal was to benefit policyholders by identifying "exaggerated and fraudulent claims" and that its "processes are absolutely sound . . . to investigate, evaluate, and promptly resolve each claim fairly, based on the merits." Court decisions on the issues debated by Berardinelli and Allstate have gone both ways. According to the Business Week article, as of 2006, "Courts and regulators in a number of states, including New York, Pennsylvania, and Washington, have forced [Allstate] to halt or change its practice of handing out a controversial 'Do I Need an Attorney?' form to people involved in accidents." On the other hand, the article also states that "Although plaintiffs have had piecemeal success in bad-faith cases against Allstate, the insurer points to seven court rulings that have rejected attacks on CCPR | https://en.wikipedia.org/wiki?curid=797063 | 476,902 |
Economic ethics Academic literature also presents an ethical reasoning for the limitations associated with applying rational actor theory to policy choice. Given that incomes are dependent on policy choice, and vice versa, the logic of the rational model in policy choice is circular, hence the possibility of wrong policy recommendations. There are also many factors that increase one's propensity to deviate from the modelled assumptions of decision making. It is argued, under self-effacing moral theory, that mechanisms such as CBA may be justified even if not explicitly moral. The contrasting beliefs that public actions are based on such utilitarian reckonings, and that all policy-making is politically contingent, justify the need for forecasting, which is itself an ethical dilemma. This is founded on the proposition that forecasts can be amended to suit a particular action or policy rather than being objective and neutral. For example, the code of ethics of the American Institute of Certified Planners provides inadequate support for forecasters seeking to avert this practice. 'Canons' such as those found in the Code of Professional Ethics and Practices of the American Association for Public Opinion Research are likewise limited in their ability to regulate or prevent this convention. | https://en.wikipedia.org/wiki?curid=63072175 | 503,539 |
Rugg/Feldman benchmarks Rugg and Feldman conclude the article by mentioning some of the other concerns that were raised after the original article. One common issue was the lack of more advanced mathematical functions, which they acknowledge, but suggest is something best left to the reader. The other was the lack of string manipulation, but they note that the syntax of string handling differed considerably between platforms, and thus a single version could not be made. In this series of tests, the list was topped by the OSI Challenger, a 6502-based machine that had been "souped up" to 2 MHz, double the clock rate of typical 1 MHz 6502 machines of the era like the Apple II and PET. When running at its normal 1 MHz speed, the Challenger was just beaten by Zapple BASIC on Z80 machines running at 4 MHz. PET BASIC was next, only slightly behind the Challenger. They conclude that the 6502 is the highest-performing of the CPUs, agreeing with comments Gates had made in his letter. The 6800 once again ends up in last place. As part of a longer article discussing new entries into the computer market, including the TRS-80, John Coll used the Rugg/Feldman tests to benchmark a variety of machines available to him in the UK in October 1977. He added an eighth test to exercise the math routines, and provided the resulting run times both on their own and as the additional time compared to Test 7, in keeping with the earlier concept of each test modifying the last | https://en.wikipedia.org/wiki?curid=60933877 | 291,573 |
Mass formula A mass formula is an equation or set of equations in physics which attempts to predict the mass or mass ratios of the subatomic particles. An important step in high energy physics was the discovery of the Gell-Mann–Okubo mass formula predicting relationships between masses of the members of SU(3) multiplets. The development of an accurate mass formula is one of several fundamental aspects to developing a working theory of everything, which is expected to overcome the incompatibilities between current classical and quantum physics theories. There are currently no universal mass formulae which are generally accepted as correct by the mainstream physics community, however several versions of potential mass formulae have been presented and are currently being explored by some (largely amateur) physics theorists. | https://en.wikipedia.org/wiki?curid=1876520 | 2,945 |
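For reference, the Gell-Mann–Okubo formula mentioned above is usually quoted, to first order in SU(3) symmetry breaking for baryons, in the following standard textbook form (supplied here, not taken from the excerpt):

```latex
% Gell-Mann–Okubo relation (baryons, first order in SU(3) breaking):
%   M = mass, S = strangeness, I = isospin, a_i = empirical constants
M = a_{0} + a_{1}\,S + a_{2}\left[ I(I+1) - \tfrac{1}{4}S^{2} \right]
```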
Glycoside hydrolase A classification system for glycosyl hydrolases, based on sequence similarity, has led to the definition of more than 100 different families. This classification is available on the CAZy (CArbohydrate-Active EnZymes) web site. The database provides a series of regularly updated, sequence-based classifications that allow reliable prediction of mechanism (retaining/inverting), active-site residues and possible substrates. The online database is supported by CAZypedia, an online encyclopedia of carbohydrate-active enzymes. Based on three-dimensional structural similarities, the sequence-based families have been classified into 'clans' of related structure. Recent progress in glycosidase sequence analysis and 3D structure comparison has allowed the proposal of an extended hierarchical classification of the glycoside hydrolases. Inverting enzymes utilize two enzymic residues, typically carboxylate residues, that act as acid and base respectively, as shown below for a β-glucosidase: Retaining glycosidases operate through a two-step mechanism, with each step resulting in inversion, for a net retention of stereochemistry. Again, two residues are involved, usually enzyme-borne carboxylates. One acts as a nucleophile and the other as an acid/base. In the first step, the nucleophile attacks the anomeric centre, resulting in the formation of a glycosyl-enzyme intermediate, with acidic assistance provided by the acidic carboxylate | https://en.wikipedia.org/wiki?curid=7013607 | 55,763 |
Modularity-driven testing is a term used in the testing of software. The test script modularity framework requires the creation of small, independent scripts that represent modules, sections, and functions of the application-under-test. These small scripts are then used in a hierarchical fashion to construct larger tests, realizing a particular test case. Of all the frameworks, this one should be the simplest to grasp and master. It is a well-known programming strategy to build an abstraction layer in front of a component to hide the component from the rest of the application. This insulates the application from modifications in the component and provides modularity in the application design. The test script modularity framework applies this principle of abstraction or encapsulation in order to improve the maintainability and scalability of automated test suites. | https://en.wikipedia.org/wiki?curid=9987726 | 110,526 |
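A minimal sketch of the idea, assuming a hypothetical application with login and search modules; the function names and "driver" behavior are invented for illustration and stand in for real UI/API automation calls.

```python
# Test-script modularity: small, independent scripts wrap individual
# modules of the application under test, and test cases are composed
# from them hierarchically. The abstraction layer insulates test cases
# from changes inside each module script.

def login(app, user, password):
    """Module script: encapsulates every detail of the login screen."""
    app["user"] = user            # placeholder for real automation steps
    return app

def search(app, query):
    """Module script: encapsulates the search feature."""
    return f"results for {query}"

def test_search_as_registered_user():
    """Test case built from module scripts, not raw UI actions."""
    app = login({}, "alice", "secret")
    assert search(app, "widgets") == "results for widgets"

test_search_as_registered_user()
```

If the login screen changes, only the `login` module script needs updating; every composed test case stays the same.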
Line splice In telecommunications, a line splice is a method of connecting electrical cables (electrical splice) or optical fibers (optical splice). Splices are often housed in sleeves to protect against external influences. The splicing of copper wires happens in the following steps: The splicing of copper wires is mainly used on paper-insulated wires. LSA techniques (LSA: solder-free, screw-free and stripping-free) are used to connect copper wires, making them faster and easier to connect. LSA techniques include: Fiber-optic cables are spliced using a special arc splicer, with installation cables connected at their ends to respective "pigtails" - short individual fibers with fiber-optic connectors at one end. The splicer precisely aligns the light-guiding cores of the two fiber ends to be spliced. The alignment is done fully automatically in modern devices, whereas in older models it is carried out manually by means of micrometer screws and a microscope. An experienced splicer can precisely position the fiber ends within a few seconds. Subsequently, the fibers are fused together (welded) with an electric arc. Since no additional material is added, as in gas welding or soldering, this is called a "fusion splice". Depending on the quality of the splicing process, attenuation values of about 0.3 dB are achieved at the splice points, with good splices reaching below 0.02 dB. In newer-generation devices, alignment is done automatically by motors. Here one differentiates between core centering and cladding centering. | https://en.wikipedia.org/wiki?curid=57462594 | 391,043 |
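Because splice losses in decibels add along a link, a simple loss budget can be summed directly; the per-splice values below are hypothetical, loosely based on the typical figures quoted above (~0.3 dB for an ordinary splice, under 0.02 dB for a good one).

```python
# Sketch: summing a fiber link's splice-loss budget. dB values add;
# converting the total back to a linear power ratio uses 10^(-dB/10).

splices_db = [0.3, 0.02, 0.05, 0.3]      # per-splice attenuation, dB (hypothetical)
total_db = sum(splices_db)

ratio = 10 ** (-total_db / 10)           # fraction of power surviving the splices
print(f"total splice loss: {total_db:.2f} dB (x{ratio:.3f} power)")
```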
Oppression remedy 210 of the 1948 UK Act were first introduced into Canadian law through the 1975 passage of the "Canada Business Corporations Act". It incorporated recommendations made in 1962 by the UK Jenkins Committee on Company Law for removing the linkage of the remedy with that of winding-up and for broadening its scope. Canadian legislation (both federally and in all provinces other than Prince Edward Island) provides for a broad approach to the oppression remedy. In "Peoples Department Stores Inc. (Trustee of) v. Wise", the Supreme Court of Canada noted: A "complainant" is deemed to be a current or former registered security holder, a current or former director or officer, the Director appointed under the "CBCA", or "any other person who, in the discretion of a court, is a proper person to make an application under this Part." In that regard, it can include a creditor of the corporation (but not every creditor will qualify), as well as a trustee appointed under the "Bankruptcy and Insolvency Act" or a monitor appointed under the "Companies' Creditors Arrangement Act". As in the United Kingdom, oppressive conduct is not restricted to that committed by corporations | https://en.wikipedia.org/wiki?curid=4649271 | 462,417 |
Computer (job description) The term "human computer" has recently been used by a group of researchers who refer to their work as "human computation". In this usage, "human computer" refers to the activities of humans in the context of human-based computation (HBC). This usage is questionable for the following reason: HBC is a computational technique in which a machine outsources certain (not necessarily algorithmic) tasks to humans. In fact, most of the time humans in the context of HBC are not provided with a sequence of exact steps to be executed to yield an answer. HBC is agnostic about how humans solve the problem, which is why the term "outsourcing" is used in the definition. The use of humans as "human computers" in the context of HBC is very rare. | https://en.wikipedia.org/wiki?curid=3833695 | 198,092 |
Roberto Salmeron In 1963, Salmeron returned to Brazil and accepted a post as professor of physics in the newly created Universidade de Brasília. Unfortunately, the military dictatorship strongly repressed faculty members with liberal and leftist ideas, and in October 1965 he joined 223 other professors who resigned from the university in protest. In 1966 Salmeron left Brazil for good and returned to Europe to work at CERN, where he had an important role in experiments attempting to discover the quark–gluon plasma. Afterwards (1967) he worked at the École Polytechnique in Paris, France, one of the most important schools of engineering in the world. | https://en.wikipedia.org/wiki?curid=1635755 | 369,692
History of chemistry In 1905, Fritz Haber and Carl Bosch developed the Haber process for making ammonia, a milestone in industrial chemistry with deep consequences in agriculture. The Haber process, or Haber-Bosch process, combined nitrogen and hydrogen to form ammonia in industrial quantities for production of fertilizer and munitions. The food production for half the world's current population depends on this method for producing fertilizer. Haber, along with Max Born, proposed the Born–Haber cycle as a method for evaluating the lattice energy of an ionic solid. Haber has also been described as the "father of chemical warfare" for his work developing and deploying chlorine and other poisonous gases during World War I. In 1905, Albert Einstein explained Brownian motion in a way that definitively proved atomic theory. Leo Baekeland invented bakelite, one of the first commercially successful plastics. In 1909, American physicist Robert Andrews Millikan - who had studied in Europe under Walther Nernst and Max Planck - measured the charge of individual electrons with unprecedented accuracy through the oil drop experiment, in which he measured the electric charges on tiny falling water (and later oil) droplets. His study established that any particular droplet's electrical charge is a multiple of a definite, fundamental value — the electron's charge — and thus a confirmation that all electrons have the same charge and mass | https://en.wikipedia.org/wiki?curid=1416046 | 84,017 |
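Millikan's conclusion — that every droplet's charge is an integer multiple of the elementary charge e — can be sketched numerically. The droplet charges below are hypothetical illustrative readings (not Millikan's data), and the 1% tolerance is an arbitrary choice:

```python
# Illustrative check (not Millikan's actual data): each measured droplet
# charge should be an integer multiple of the elementary charge e.
E = 1.602e-19  # elementary charge in coulombs

def multiples_of_e(charges, tol=0.01):
    """Return the integer multiple of e closest to each charge,
    or None if no multiple fits within the tolerance."""
    result = []
    for q in charges:
        n = round(q / E)
        result.append(n if abs(q - n * E) <= tol * E else None)
    return result

droplet_charges = [1.603e-19, 3.20e-19, 8.01e-19]  # hypothetical readings
print(multiples_of_e(droplet_charges))  # → [1, 2, 5]
```

A charge that is not close to any whole multiple (say 2.0e-19 C) would map to None, flagging a measurement inconsistent with charge quantization.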
Engineering economics To further add to the issues associated with depreciation, it must be broken down into three separate types, each having intricate calculations and implications. Calculation of depreciation also comes in a number of forms: "straight line, declining balance, sum-of-the-year's," and "service output". The first method is perhaps the easiest to calculate, while the remaining ones have varying levels of difficulty and utility. Most situations faced by managers in regard to depreciation can be solved using any of these formulas; however, company policy or the preference of the individual may affect the choice of model. The main form of depreciation used inside the U.S. is the Modified Accelerated Capital Recovery System (MACRS), and it is based on a number of tables that give the class of an asset and its life. Certain classes are given certain lifespans, and these affect the value of an asset that can be depreciated each year. This does not necessarily mean that an asset must be discarded after its MACRS life is fulfilled, just that it can no longer be used for tax deductions. Capital budgeting, in relation to engineering economics, is the proper usage and utilization of capital to achieve project objectives. It can be fully defined by the statement: "... as the series of decisions by individuals and firms concerning how much and where resources will be obtained and expended to meet future objectives | https://en.wikipedia.org/wiki?curid=7977203 | 501,957
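The straight-line, sum-of-the-years', and declining-balance schedules named above can be sketched as follows; the cost, salvage value, and asset life are hypothetical inputs, not values from any MACRS table:

```python
# Sketch of three common depreciation schedules. Inputs (cost, salvage,
# life) are hypothetical illustrative figures.

def straight_line(cost, salvage, life):
    """Equal deduction every year."""
    return [(cost - salvage) / life] * life

def sum_of_years(cost, salvage, life):
    """Larger deductions early: year y gets (life - y) / (1+2+...+life)."""
    total = life * (life + 1) // 2          # e.g. 5 years -> 15
    return [(cost - salvage) * (life - y) / total for y in range(life)]

def declining_balance(cost, life, rate=2.0):
    """Double-declining by default; ignores salvage for simplicity."""
    book, sched = cost, []
    for _ in range(life):
        d = book * rate / life
        sched.append(d)
        book -= d
    return sched

print(straight_line(10_000, 1_000, 5))  # → [1800.0, 1800.0, 1800.0, 1800.0, 1800.0]
print(sum_of_years(10_000, 1_000, 5))   # → [3000.0, 2400.0, 1800.0, 1200.0, 600.0]
```

All three write off the same depreciable base; they differ only in how the deduction is spread across the asset's life.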
Thermal engineering Thermal engineering is a specialized sub-discipline of mechanical engineering and chemical engineering that deals with the movement and transfer of heat energy. The energy can be transformed between two mediums or transferred into other forms of energy. A thermal engineer will have knowledge of thermodynamics and the processes that convert generated energy from thermal sources into chemical, mechanical, or electrical energy. Many process plants use a wide variety of machines with components that rely on heat transfer in some way, and many plants use heat exchangers in their operations. A thermal engineer must allow the proper amount of energy to be transferred for correct use: too much and the components could fail; too little and the system will not function at all. Thermal engineers must have an understanding of economics and of the components that they will be servicing or interacting with. Some components that a thermal engineer could work with include heat exchangers, heat sinks, bimetallic strips, and radiators. Some systems that require a thermal engineer include boilers, heat pumps, water pumps, and engines. Part of being a thermal engineer is to improve a current system and make it more efficient. Many industries employ thermal engineers; some of the main ones are the automotive manufacturing industry, commercial construction, and the heating, ventilation and cooling (HVAC) industry. Job opportunities for a thermal engineer are broad and promising. | https://en.wikipedia.org/wiki?curid=19582317 | 220,748
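The sizing judgment described above (too much energy and components fail, too little and the system does not function) rests on the basic heat-duty relation Q = m_dot * c_p * dT. A minimal sketch, with an illustrative water-to-water case whose flow rate and temperatures are assumptions, not from the text:

```python
# Back-of-the-envelope heat-exchanger duty: Q = m_dot * c_p * dT.
# Flow rate and temperatures below are illustrative assumptions.

CP_WATER = 4186.0  # specific heat of water, J/(kg*K)

def heat_duty_w(mass_flow_kg_s, t_in_c, t_out_c):
    """Heat that must be transferred to take the stream from t_in to t_out."""
    return mass_flow_kg_s * CP_WATER * (t_out_c - t_in_c)

# Heating 0.5 kg/s of water from 20 degC to 60 degC:
print(heat_duty_w(0.5, 20.0, 60.0))  # → 83720.0  (watts, about 84 kW)
```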
Comparative physiology Stephens (1925–2003) was a well-known invertebrate comparative physiologist, serving on the faculty of the University of Minnesota until becoming the founding chairman of the Department of Organismic Biology at the University of California at Irvine in 1964. He was the mentor for numerous graduate students, many of whom have gone on to further build the field (obituary). He authored several books and in addition to being an accomplished biologist was also an accomplished pianist and philosopher. | https://en.wikipedia.org/wiki?curid=9435784 | 153,381 |
World Wide Web Cookies were designed to be a reliable mechanism for websites to remember stateful information (such as items added in the shopping cart in an online store) or to record the user's browsing activity (including clicking particular buttons, logging in, or recording which pages were visited in the past). They can also be used to remember arbitrary pieces of information that the user previously entered into form fields such as names, addresses, passwords, and credit card numbers. Cookies perform essential functions in the modern web. Perhaps most importantly, "authentication cookies" are the most common method used by web servers to know whether the user is logged in or not, and which account they are logged in with. Without such a mechanism, the site would not know whether to send a page containing sensitive information, or require the user to authenticate themselves by logging in. The security of an authentication cookie generally depends on the security of the issuing website and the user's web browser, and on whether the cookie data is encrypted. Security vulnerabilities may allow a cookie's data to be read by a hacker, used to gain access to user data, or used to gain access (with the user's credentials) to the website to which the cookie belongs (see cross-site scripting and cross-site request forgery for examples) | https://en.wikipedia.org/wiki?curid=33139 | 302,959 |
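How an authentication cookie ties a request to a logged-in account can be sketched with Python's standard http.cookies module; the cookie name, session id, and in-memory session store here are hypothetical, standing in for whatever a real server uses:

```python
# Minimal sketch of issuing and checking an authentication cookie.
# The "session" cookie name and the sessions dict are hypothetical.
from http import cookies

sessions = {"abc123": "alice"}  # session-id -> logged-in user

def make_set_cookie(session_id):
    c = cookies.SimpleCookie()
    c["session"] = session_id
    c["session"]["httponly"] = True   # hide from page scripts (XSS mitigation)
    c["session"]["secure"] = True     # only send over HTTPS
    return c["session"].OutputString()

def user_from_cookie_header(header):
    """Given the request's Cookie header, return the logged-in user or None."""
    c = cookies.SimpleCookie()
    c.load(header)
    morsel = c.get("session")
    return sessions.get(morsel.value) if morsel else None

hdr = make_set_cookie("abc123")
print(hdr)                                         # Set-Cookie value with flags
print(user_from_cookie_header("session=abc123"))   # → alice
print(user_from_cookie_header("session=zzz"))      # → None
```

This is exactly the mechanism the text describes: without a valid cookie the server cannot tell whether to serve the sensitive page or demand a login.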
Pesticide poisoning The latest estimate by a WHO task group indicates that there may be 1 million serious unintentional poisonings each year and, in addition, 2 million people hospitalized for suicide attempts with pesticides. This necessarily reflects only a fraction of the real problem. On the basis of a survey of self-reported minor poisoning carried out in the Asian region, it is estimated that there could be as many as 25 million agricultural workers in the developing world suffering an episode of poisoning each year. In Canada in 2007 more than 6000 cases of acute pesticide poisoning occurred. Estimating the numbers of chronic poisonings worldwide is more difficult. Rachel Carson's 1962 environmental science book "Silent Spring" brought about the first major wave of public concern over the chronic effects of pesticides. An obvious side effect of using a chemical meant to kill is that one is likely to kill more than just the desired organism. Contact with a sprayed plant or "weed" can have an effect upon local wildlife, most notably insects. A cause for concern is how pests, the reason for pesticide use, are building up resistance. Phytophagous insects are able to build up this resistance because they are easily capable of evolutionary diversification and adaptation. The problem this presents is that, in order to obtain the same desired effect, the pesticides have to be made increasingly strong as time goes on | https://en.wikipedia.org/wiki?curid=371486 | 60,459
Quasitransitive relation Quasitransitivity is a weakened version of transitivity that is used in social choice theory and microeconomics. Informally, a relation is quasitransitive if it is symmetric for some values and transitive elsewhere. The concept was introduced to study the consequences of Arrow's theorem. A binary relation T over a set "X" is quasitransitive if for all "a", "b", and "c" in "X" the following holds: if "a" T "b" but not "b" T "a", and "b" T "c" but not "c" T "b", then "a" T "c" but not "c" T "a". If the relation is also antisymmetric, T is transitive. Alternately, for a relation T, define the asymmetric or "strict" part P by: "a" P "b" if and only if "a" T "b" but not "b" T "a". Then T is quasitransitive if and only if P is transitive. Preferences are assumed to be quasitransitive (rather than transitive) in some economic contexts. The classic example is a person indifferent between 7 and 8 grams of sugar and indifferent between 8 and 9 grams of sugar, but who prefers 9 grams of sugar to 7. Similarly, the Sorites paradox can be resolved by weakening the assumed transitivity of certain relations to quasitransitivity. | https://en.wikipedia.org/wiki?curid=6770393 | 501,585
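The definition can be checked mechanically on small finite relations: T is quasitransitive exactly when its strict part P is transitive. A minimal sketch using the sugar example, reading "x T y" as "x is at least as good as y":

```python
# Quasitransitivity checker for a finite relation given as a set of pairs.

def strict_part(t):
    """P = {(a, b) : a T b and not b T a}."""
    return {(a, b) for (a, b) in t if (b, a) not in t}

def is_quasitransitive(t, xs):
    """T is quasitransitive iff its strict part P is transitive."""
    p = strict_part(t)
    return all((a, c) in p
               for a in xs for b in xs for c in xs
               if (a, b) in p and (b, c) in p)

# Sugar example: indifferent between 7 and 8 and between 8 and 9 grams,
# but 9 grams strictly preferred to 7.
T = {(7, 8), (8, 7), (8, 9), (9, 8), (9, 7),
     (7, 7), (8, 8), (9, 9)}
print(is_quasitransitive(T, [7, 8, 9]))  # → True
print(strict_part(T))                    # → {(9, 7)}
```

Note that T itself is not transitive (7 T 8 and 8 T 9 hold, but 7 T 9 does not), which is exactly what quasitransitivity permits.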
Television studio A television studio, also called a television production studio, is an installation room in which video productions take place, either for the recording of live television to video tape or for the acquisition of raw footage for post-production. The design of a studio is similar to, and derived from, that of movie studios, with a few amendments for the special requirements of television production. A professional television studio generally has several rooms, which are kept separate for noise and practicality reasons. These rooms are connected via 'talkback' or an intercom, and personnel will be divided among these workplaces. The studio floor is the actual stage on which the actions that will be recorded and viewed take place. A typical studio floor has the following characteristics and installations: While a production is in progress, people composing a television crew work on the studio floor. The production control room is the place in a television studio in which the composition of the outgoing program takes place. The production control room is occasionally also called a studio control room (SCR) or a "gallery"; the latter name comes from the original placement of the director on an ornately carved bridge spanning the BBC's first studio at Alexandra Palace, which was likened to a minstrels' gallery. The vast majority of devices in a PCR are interfaces for rack-mounted equipment that is located in the Central Apparatus Room (CAR) | https://en.wikipedia.org/wiki?curid=1555089 | 260,247
LCD television Sharp Corporation introduced the dot matrix TN-LCD in 1983, and Casio introduced its TV-10 portable TV. In 1984, Epson released the ET-10, the first full-color pocket LCD television. That same year Citizen Watch introduced the Citizen Pocket TV, a 2.7-inch color LCD TV, with the first commercial TFT LCD display. Throughout this period, screen sizes over 30" were rare as these formats would start to appear blocky at normal seating distances when viewed on larger screens. LCD projection systems were generally limited to situations where the image had to be viewed by a larger audience. At the same time, plasma displays could easily offer the performance needed to make a high quality display, but suffered from low brightness and very high power consumption. Still, some experimentation with LCD televisions took place during this period. In 1988, Sharp introduced a 14-inch active-matrix full-color full-motion TFT-LCD. These were offered primarily as high-end items, and were not aimed at the general market. This led to Japan launching an LCD industry, which developed larger-size LCDs, including TFT computer monitors and LCD televisions. Epson developed the 3LCD projection technology in the 1980s, and licensed it for use in projectors in 1988. Epson's VPJ-700, released in January 1989, was the world's first compact, full-color LCD projector | https://en.wikipedia.org/wiki?curid=778233 | 416,979 |
Failure detector The inspection of crashed programs does not depend on completeness. The following are correctness arguments to satisfy the algorithm for transforming a failure detector "W" into a failure detector "S". The failure detector "W" is weak in completeness, and the failure detector "S" is strong in completeness. They are both weak in accuracy. If all of the arguments above are satisfied, the reduction of a weak failure detector "W" to a strong failure detector "S" is valid within the distributed computing system. | https://en.wikipedia.org/wiki?curid=26330426 | 139,815
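A hedged sketch of the standard reduction (after Chandra and Toueg): every process repeatedly broadcasts its suspect set, and each receiver merges what it hears while clearing the sender, since a message from a process proves that process has not crashed. The synchronous one-round simulation below is an illustration of the idea, not the full asynchronous algorithm:

```python
# Sketch of the weak-to-strong completeness reduction: gossip suspect sets.
# Message passing is simulated as one synchronous round for illustration.

def gossip_round(suspects, alive):
    """One round in which every alive process hears every other alive one."""
    new = {p: set(s) for p, s in suspects.items()}
    for p in alive:
        for q in alive:
            if q != p:
                new[p] |= suspects[q]   # merge q's suspicions
                new[p].discard(q)       # q spoke, so q is not crashed
    return new

alive = {0, 1, 2}
# Weak completeness: only process 0 suspects the crashed process 3.
suspects = {0: {3}, 1: set(), 2: set()}
suspects = gossip_round(suspects, alive)
print(suspects)  # → {0: {3}, 1: {3}, 2: {3}}
```

After the round every correct process suspects the crashed one (strong completeness), while any correct process that was wrongly suspected is cleared as soon as its own message arrives, so the weak accuracy of "W" is preserved.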
Robot Most of the robots in cinema are fictional. Two of the most famous are R2-D2 and C-3PO from the "Star Wars" franchise. The concept of humanoid sex robots has elicited both public attention and concern. Opponents of the concept have stated that the development of sex robots would be morally wrong. They argue that the introduction of such devices would be socially harmful, and demeaning to women and children. Fears and concerns about robots have been repeatedly expressed in a wide range of books and films. A common theme is the development of a master race of conscious and highly intelligent robots, motivated to take over or destroy the human race. "Frankenstein" (1818), often called the first science fiction novel, has become synonymous with the theme of a robot or android advancing beyond its creator. Other works with similar themes include "The Mechanical Man", "The Terminator, Runaway, RoboCop", the Replicators in "Stargate", the Cylons in "Battlestar Galactica", the Cybermen and Daleks in "Doctor Who", "The Matrix", "Enthiran" and "I, Robot". Some fictional robots are programmed to kill and destroy; others gain superhuman intelligence and abilities by upgrading their own software and hardware. Examples of popular media where the robot becomes evil are "", "Red Planet" and "Enthiran". The 2017 game Horizon Zero Dawn explores themes of robotics in warfare, robot ethics, and the AI control problem, as well as the positive or negative impact such technologies could have on the environment | https://en.wikipedia.org/wiki?curid=25781 | 396,465 |
Ballistic impact is a high velocity impact by a small mass object, analogous to runway debris or small arms fire. The simulation of ballistic impacts can be achieved with a light-gas gun or other ballistic launcher. It is important to study the response of materials to ballistic impact loads. Applications of this research include body armor, armored vehicles and fortified buildings, as well as the protection of essential equipment, such as the jet engines of an airliner. | https://en.wikipedia.org/wiki?curid=19871700 | 63,859 |
Consumerism The not-so-wealthy consumers can "purchase something new that will speak of their place in the tradition of affluence". A consumer can have the instant gratification of purchasing an expensive item to improve social status. Emulation is also a core component of 21st-century consumerism. As a general trend, regular consumers seek to emulate those who are above them in the social hierarchy. The poor strive to imitate the wealthy, and the wealthy imitate celebrities and other icons. The celebrity endorsement of products can be seen as evidence of the desire of modern consumers to purchase products partly or solely to emulate people of higher social status. This purchasing behavior may co-exist in the mind of a consumer with an image of oneself as being an individualist. Cultural capital, the intangible social value of goods, is not solely generated by cultural pollution. Subcultures also manipulate the value and prevalence of certain commodities through the process of bricolage. Bricolage is the process by which mainstream products are adopted and transformed by subcultures. These items develop a function and meaning that differs from their corporate producer's intent. In many cases, commodities that have undergone bricolage often develop political meanings. For example, Doc Martens, originally marketed as workers' boots, gained popularity with the punk movement and AIDS activism groups and became symbols of an individual's place in that social group | https://en.wikipedia.org/wiki?curid=170522 | 507,586
Neoclassicism in France The result, incorporating elements of ancient Roman, French, and Italian architecture, "resolves itself into the greatest palace façade in Europe." Under Louis XIV, the Roman dome and facade of monumental columns became the dominant features of important new churches, beginning with the chapel of Val-de-Grâce (1645-1710), by Mansart, Jacques Lemercier and Pierre Le Muet, followed by the church of Les Invalides (1680-1706). While the basic features of the architecture of these churches were classical, the interiors were lavishly decorated in the baroque style. In the latter part of the reign of Louis XV, the neoclassical became the dominant style in both civil and religious architecture. The chief architect of the king was Jacques Gabriel from 1734 until 1742, and then his more famous son, Ange-Jacques Gabriel until the end of the reign. His major works included the École Militaire, the ensemble of buildings overlooking the Place Louis XV (now Place de la Concorde (1761-1770)) and the Petit Trianon at Versailles (1764). Over the course of the reign of Louis XV, while interiors were lavishly decorated, the facades gradually became simpler, less ornamented and more classical. The facades Gabriel designed were carefully rhymed and balanced by rows of windows and columns, and, on large buildings like those the Place de la Concorde, often featured grand arcades on the street level, and classical pediments or balustrades on the roofline | https://en.wikipedia.org/wiki?curid=56673259 | 264,249 |
Trajectory optimization The first optimal control approaches grew out of the calculus of variations, based on the research of Gilbert Ames Bliss and Bryson in America, and Pontryagin in Russia. Pontryagin's maximum principle is of particular note. These early researchers created the foundation of what we now call indirect methods for trajectory optimization. Much of the early work in trajectory optimization was focused on computing rocket thrust profiles, both in a vacuum and in the atmosphere. This early research discovered many basic principles that are still used today. Another successful application was the climb to altitude trajectories for the early jet aircraft. Because of the high drag associated with the transonic drag region and the low thrust of early jet aircraft, trajectory optimization was the key to maximizing climb to altitude performance. Optimal control based trajectories were responsible for some of the world records. In these situations, the pilot followed a Mach versus altitude schedule based on optimal control solutions. One of the important early problems in trajectory optimization was that of the singular arc, where Pontryagin's maximum principle fails to yield a complete solution. An example of a problem with singular control is the optimization of the thrust of a missile flying at a constant altitude and which is launched at low speed. Here the problem is one of a bang-bang control at maximum possible thrust until the singular arc is reached | https://en.wikipedia.org/wiki?curid=2116830 | 107,757 |
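The bang-bang structure mentioned above can be illustrated on the textbook minimum-time double-integrator problem (an illustrative example, not one of the historical applications in the text): with bounded control |u| <= 1, Pontryagin's maximum principle gives full thrust followed by full braking, switching at the midpoint:

```python
# Textbook minimum-time double integrator: drive position x from 0 to d,
# starting and ending at rest, with |u| <= 1. The optimal bang-bang law is
# u = +1 until t = sqrt(d), then u = -1 until t = 2*sqrt(d).
import math

def simulate(d, dt=1e-4):
    t_switch = math.sqrt(d)
    steps = int(round(2 * t_switch / dt))
    x = v = 0.0
    for i in range(steps):
        u = 1.0 if i * dt < t_switch else -1.0
        v += u * dt          # dv/dt = u
        x += v * dt          # dx/dt = v
    return x, v

x, v = simulate(4.0)  # analytic minimum time: 2*sqrt(4) = 4 s
print(abs(x - 4.0) < 0.01, abs(v) < 0.01)  # → True True
```

The simulated trajectory reaches the target at rest in the analytically minimal time, confirming that no intermediate thrust level is needed for this problem; singular arcs arise precisely in cases where this all-or-nothing structure breaks down.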
Antoine Lavoisier The Classical elements of earth, air, fire, and water were discarded, and instead some 55 substances which could not be decomposed into simpler substances by any known chemical means were provisionally listed as elements. The elements included light; caloric (matter of heat); the principles of oxygen, hydrogen, and azote (nitrogen); carbon; sulfur; phosphorus; the yet unknown "radicals" of muriatic acid (hydrochloric acid), boric acid, and "fluoric" acid; 17 metals; 5 earths (mainly oxides of yet unknown metals such as magnesia, baria, and strontia); three alkalies (potash, soda, and ammonia); and the "radicals" of 19 organic acids. The acids, regarded in the new system as compounds of various elements with oxygen, were given names which indicated the element involved together with the degree of oxygenation of that element, for example sulfuric and sulfurous acids, phosphoric and phosphorous acids, nitric and nitrous acids, the "ic" termination indicating acids with a higher proportion of oxygen than those with the "ous" ending. Similarly, salts of the "ic" acids were given the terminal letters "ate," as in copper sulfate, whereas the salts of the "ous" acids terminated with the suffix "ite," as in copper sulfite. The total effect of the new nomenclature can be gauged by comparing the new name "copper sulfate" with the old term "vitriol of Venus." Lavoisier's new nomenclature spread throughout Europe and to the United States and became common use in the field of chemistry | https://en.wikipedia.org/wiki?curid=1822 | 87,443 |
Canada Water Wells The Canada Water Wells, near the junction of Canada-Toto Road and Canada-Toto Loop in Barrigada-Mangilao, GU, are water works that were built in 1937. They have also been known as Kanada, as Chochugu', and as To'tu. The structures were listed on the National Register of Historic Places in 2008; the listing included two contributing structures. A U.S. Geological Survey report in 2003 described the water resources of Guam, identifying approximately 180 water wells serving the needs of Guam, drawing on a lens-shaped aquifer. | https://en.wikipedia.org/wiki?curid=39915504 | 334,889
Radioactivity in the life sciences Radioactivity is generally used in life sciences for highly sensitive and direct measurements of biological phenomena, and for visualizing the location of biomolecules radiolabelled with a radioisotope. All atoms exist as stable or unstable isotopes, and the latter decay with a given half-life ranging from attoseconds to billions of years; radioisotopes useful to biological and experimental systems have half-lives ranging from minutes to months. In the case of the hydrogen isotope tritium (half-life = 12.3 years) and carbon-14 (half-life = 5,730 years), these isotopes derive their importance from the fact that all organic life contains hydrogen and carbon, and therefore they can be used to study countless living processes, reactions, and phenomena. Most short-lived isotopes are produced in cyclotrons, linear particle accelerators, or nuclear reactors, and their relatively short half-lives give them high maximum theoretical specific activities, which is useful for detection in biological systems. Radiolabeling is a technique used to track the passage of a molecule that incorporates a radioisotope through a reaction, metabolic pathway, cell, tissue, organism, or biological system. The reactant is 'labeled' by replacing specific atoms with their isotope. Replacing an atom with its own radioisotope is an intrinsic label that does not alter the structure of the molecule. Alternatively, molecules can be radiolabeled by chemical reactions that introduce an atom, moiety, or functional group that contains a radionuclide | https://en.wikipedia.org/wiki?curid=8355163 | 161,237
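The half-lives quoted above translate directly into how much of a radiolabel survives an experiment, via the decay law N(t) = N0 * 2**(-t / half_life). A small sketch using the tritium and carbon-14 values from the text:

```python
# Fraction of a radiolabel remaining after time t, from the half-life.
# Half-lives below are the ones quoted above: tritium 12.3 y, C-14 5730 y.

def fraction_remaining(t_years, half_life_years):
    return 2 ** (-t_years / half_life_years)

print(round(fraction_remaining(12.3, 12.3), 3))  # → 0.5   (one half-life)
print(round(fraction_remaining(24.6, 12.3), 3))  # → 0.25  (two half-lives)
print(round(fraction_remaining(100, 5730), 3))   # → 0.988 (C-14 barely decays)
```

The contrast in the last two lines is why tritium suits experiments on the scale of years while carbon-14 is usable over millennia.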
23 (film) Driven by contacts with a drug dealer—and by increasing KGB pressure to hack successfully into foreign systems—Karl spirals into a cocaine dependency and grows increasingly alienated from David. In a drug-addled state, Karl begins to sit in front of his computer for days at a time. Perpetually sleepless, he also grows increasingly delusional. When David publicly reveals the espionage activity in which the two men have been engaged, Karl is left alone to face the consequences. Collapse soon follows. Karl is taken to a hospital to deal with his drug addiction and mysteriously dies after his supposed hacking of Chernobyl. | https://en.wikipedia.org/wiki?curid=677394 | 244,405
Speedometer The former type are quite reliable and low-maintenance but need a gauge and hub gearing properly matched to the rim and tyre size, whereas the latter require little or no calibration for a moderately accurate readout (with standard tyres, the "distance" covered in each wheel rotation by a friction wheel set against the rim should scale fairly linearly with wheel size, almost as if it were rolling along the ground itself) but are unsuitable for off-road use, and must be kept properly tensioned and clean of road dirt to avoid slipping or jamming. Most speedometers have tolerances of some ±10%, mainly due to variations in tire diameter. Sources of error due to tire diameter variations are wear, temperature, pressure, vehicle load, and nominal tire size. Vehicle manufacturers usually calibrate speedometers to read high by an amount equal to the average error, to ensure that their speedometers never indicate a lower speed than the actual speed of the vehicle, and so that they are not liable for drivers violating speed limits. Excessive speedometer error after manufacture can come from several causes, but most commonly it is due to a nonstandard tire diameter, in which case the error is: percentage error = 100 * (1 - new diameter / standard diameter). Nearly all tires now have their size shown as "T/A_W" on the side of the tire (see: Tire code), and the tire diameter follows from that code as: diameter (mm) = 2*T*(A/100) + W*25.4, or equivalently diameter (in) = T*A/1270 + W. For example, a standard tire is "185/70R14" with diameter = 2*185*(70/100)+(14*25.4) = 614.6 mm (185x70/1270 + 14 = 24.20 in). Another is "195/50R15" with 2*195*(50/100)+(15*25.4) = 576 mm | https://en.wikipedia.org/wiki?curid=576681 | 270,130
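The tire-diameter formula and the resulting speedometer error for the two sizes in the example can be checked in a few lines:

```python
# Worked check of the tire-diameter formula above, plus the speedometer
# error that results from swapping the standard tire for another size.

def tire_diameter_mm(width_mm, aspect_pct, wheel_in):
    """diameter (mm) = 2*T*(A/100) + W*25.4 for a T/A_W tire code."""
    return 2 * width_mm * (aspect_pct / 100) + wheel_in * 25.4

standard = tire_diameter_mm(185, 70, 14)   # "185/70R14"
other = tire_diameter_mm(195, 50, 15)      # "195/50R15"
error_pct = 100 * (1 - other / standard)

print(round(standard, 1))   # → 614.6
print(round(other, 1))      # → 576.0
print(round(error_pct, 1))  # → 6.3
```

The positive error means the smaller replacement tire makes the speedometer read about 6% higher than the true road speed, since each wheel revolution covers less ground than the gauge assumes.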
Dimension stone borders with Canada and Mexico. A current problem is how to consider stone quarried domestically, sent to China or Italy for finishing, and shipped back to be used in a project. When demolishing a structure, dimension stone is 100% reusable and can be salvaged for new construction, used as paving or crushed for use as aggregates. There are also "green" methods of stone cleaning either in development or already in use, such as removing the black gypsum crusts that form on marble and limestone by applying sulfate-reducing bacteria to the crust to gasify it, breaking up the crust for easy removal. See DSAN for updates on "building green" and dimension stone recycling. The Federal Trade Commission (FTC) in America is re-examining and will most likely update its "Green Guides" used to regulate green advertising claims. The updating will emphasize green building, including the products it involves, such as dimension stone. When the new requirements are finalized, the FTC will go after firms that violate the new requirements, in order to establish legal precedents. The Natural Stone Council has a library of information on building green with dimension stone, including life-cycle inventory data for each major dimension stone, giving the amount of energy, water, other inputs, and processing emissions, plus some best practice studies (see below) | https://en.wikipedia.org/wiki?curid=2229095 | 339,925 |
Fiber to the premises by country However, on 15 May 2010, Superonline sent an e-mail to its customers stating that the announcement on the bills was a "technical glitch" which should be ignored. This incident decreased Superonline's credibility among its fiber internet customers. Superonline announced on 9 July 2010 that customers would be treated differently according to their internet service starting dates. Customers who started using fiber internet before 15 March 2010 will not be affected by the "fair usage policy"; thus they will be able to download unlimited data while paying half the price of unlimited tariffs, or in other words paying the same price as a fair-usage-limited user while downloading unlimited data. Superonline tariffs in 2013 are 25/5, 50/5, 100/5 and 1000/20 Mbit/s. The cheapest prices are 49 TL ($27) for 25 Mbit/s, 89 TL ($45) for 50 Mbit/s, 109 TL ($55) for 100 Mbit/s and 999 TL ($504) for 1 Gbit/s. The 1 Gbit/s package is unlimited in every respect. The fair usage policy affects all packages except the 1 Gbit/s tariff. The company has been heavily criticised for the fair usage policy. The network's main drawback is its coverage. No significant expansions have been made so far. Although residents can fill in a form requesting fibre coverage, there is a very low chance that this will affect the future plans of the company. In Ukraine the first FTTH project was launched in Odessa in 2006 by Comstar-Ukraine, LLC, a local branch of Comstar-UTS, Russia. The project aimed to prepare a basic network for TriplePlay service deployment | https://en.wikipedia.org/wiki?curid=15305915 | 404,968
PKA (irradiation) A Primary Knock-on Atom or PKA is an atom that is displaced from its lattice site by irradiation; it is, by definition, the first atom that an incident particle encounters in the target. After it is displaced from its initial lattice site, the PKA can induce the subsequent lattice site displacements of other atoms if it possesses sufficient energy, or come to rest in the lattice at an interstitial site if it does not. Most of the displaced atoms resulting from electron irradiation and some other types of irradiation are PKAs, since these are usually below the threshold displacement energy and do not have sufficient energy to displace more atoms. In other cases like fast neutron irradiation, most of the displacements result from higher energy PKAs colliding with other atoms as they slow down to rest. Atoms can only be displaced if, upon bombardment, the energy they receive exceeds a threshold energy "E". Likewise, when a moving atom collides with a stationary atom, both atoms will have energy greater than "E" after the collision only if the original moving atom had an energy exceeding 2"E". Thus, only PKAs with an energy greater than 2"E" can continue to displace more atoms and increase the total number of displaced atoms. In cases where the PKA does have sufficient energy to displace further atoms, the same truth holds for any subsequently displaced atom. In any scenario, the majority of displaced atoms leave their lattice sites with energies no more than two or three times "E" | https://en.wikipedia.org/wiki?curid=31260516 | 31,554 |
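The "greater than 2E" condition above is the basis of the classic Kinchin–Pease estimate of cascade size (a standard model cited here as context, not derived in this text): a PKA of energy E produces roughly E / (2E_d) displaced atoms once E exceeds twice the threshold displacement energy E_d. A sketch:

```python
# Kinchin-Pease cascade estimate (standard model, added for context):
# a PKA of energy E displaces about E / (2*E_d) atoms when E > 2*E_d.

def kinchin_pease(e_pka_ev, e_d_ev):
    if e_pka_ev < e_d_ev:
        return 0                      # below threshold: no displacement
    if e_pka_ev < 2 * e_d_ev:
        return 1                      # the PKA itself, nothing further
    return int(e_pka_ev / (2 * e_d_ev))

E_D = 40.0  # threshold displacement energy in eV (typical order for metals)
print(kinchin_pease(10.0, E_D))      # → 0
print(kinchin_pease(60.0, E_D))      # → 1
print(kinchin_pease(10_000.0, E_D))  # → 125
```

The three regimes mirror the text: sub-threshold recoils displace nothing, a PKA with less than 2E_d comes to rest without displacing others, and only higher-energy PKAs multiply the damage.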
Pacific Exchange The Pacific Exchange was a regional stock exchange in California until 2001. Its main exchange floor and building were in San Francisco, California, with a branch building in Los Angeles, California. Its history began with the founding of the San Francisco Stock and Bond Exchange in 1882 and the Los Angeles Oil Exchange in 1899. In 1956 the two exchanges merged to create the Pacific Coast Stock Exchange, with trading floors maintained in both cities. In 1973, it was renamed the Pacific Stock Exchange. The exchange was bought by Archipelago Holdings in 2005, which merged with the New York Stock Exchange in 2006. Pacific Exchange equities and options trading now takes place exclusively through the NYSE Arca platform. Its history began with the founding of the San Francisco Stock and Bond Exchange in 1882 and the Los Angeles Oil Exchange in 1899. In 1956, the San Francisco Stock and Bond Exchange and the Los Angeles Oil Exchange merged to create the Pacific Coast Stock Exchange, though trading floors were maintained in both cities. In 1973, it was renamed the Pacific Stock Exchange and it began trading options three years later in 1976. In 1999, the exchange became the first U.S. stock exchange to demutualize. The trading floor in Los Angeles was closed in 2001, followed by the floor in San Francisco a year later. 2003 saw the exchange launch PCX Plus, an electronic options trading platform. By 2005, the exchange was bought by the owner of the ArcaEx platform, Archipelago Holdings, which then merged with the New York Stock Exchange in 2006 | https://en.wikipedia.org/wiki?curid=230088 | 367,316
CRISPR novicida" to dampen host response and promote virulence. The basic model of evolution is newly incorporated spacers driving phages to mutate their genomes to avoid the bacterial immune response, creating diversity in both the phage and host populations. To resist a phage infection, the sequence of the spacer must correspond perfectly to the sequence of the target phage gene. Phages can continue to infect their hosts given point mutations in the spacer. Similar stringency is required in PAM or the bacterial strain remains phage sensitive. A study of 124 "S. thermophilus" strains showed that 26% of all spacers were unique and that different loci showed different rates of spacer acquisition. Some loci evolve more rapidly than others, which allowed the strains' phylogenetic relationships to be determined. A comparative genomic analysis showed that "E. coli" and "S. enterica" evolve much more slowly than "S. thermophilus". The latter's strains that diverged 250 thousand years ago still contained the same spacer complement. Metagenomic analysis of two acid-mine-drainage biofilms showed that one of the analyzed CRISPRs contained extensive deletions and spacer additions versus the other biofilm, suggesting a higher phage activity/prevalence in one community than the other. In the oral cavity, a temporal study determined that 7–22% of spacers were shared over 17 months within an individual while less than 2% were shared across individuals | https://en.wikipedia.org/wiki?curid=2146034 | 203,222 |
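The "perfect spacer match plus intact PAM" requirement described in the excerpt can be sketched as a simple string check. This is a toy illustration: the sequences are made up, and an NGG-style PAM is assumed purely for concreteness (PAM sequences vary between CRISPR systems).

```python
def spacer_protects(spacer, phage_genome, pam="GG"):
    """Toy model of CRISPR interference: immunity requires an exact
    spacer match in the phage genome, immediately followed by an
    intact PAM (assumed 'NGG' here: any base, then GG)."""
    i = phage_genome.find(spacer)
    while i != -1:
        # skip one base for the 'N', then check the two fixed PAM bases
        pam_site = phage_genome[i + len(spacer) + 1 : i + len(spacer) + 3]
        if pam_site == pam:
            return True
        i = phage_genome.find(spacer, i + 1)
    return False

spacer    = "ACGTTGCA"
wild_type = "TTT" + spacer + "AGGTTT"      # exact match + AGG PAM -> resistant
escape    = "TTTACGTTGGAAGGTTT"            # point mutation in protospacer
print(spacer_protects(spacer, wild_type))  # True
print(spacer_protects(spacer, escape))     # False
```

The second case mirrors the text: a single point mutation in the phage target (or a disrupted PAM) lets the phage continue to infect.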
International economics In making an influential case for flexible exchange rates in the 1950s, Milton Friedman had claimed that if there were any resulting instability, it would mainly be the consequence of macroeconomic instability, but an empirical analysis in 1999 found no apparent connection. Neoclassical theory had led them to expect capital to flow from the capital-rich developed economies to the capital-poor developing countries - because the returns to capital there would be higher. Flows of financial capital would tend to increase the level of investment in the developing countries by reducing their costs of capital, and the direct investment of physical capital would tend to promote specialisation and the transfer of skills and technology. However, theoretical considerations alone cannot determine the balance between those benefits and the costs of volatility, and the question has had to be tackled by empirical analysis. A 2006 International Monetary Fund working paper offers a summary of the empirical evidence. The authors found little evidence either of the benefits of the liberalisation of capital movements, or of claims that it is responsible for the spate of financial crises. They suggest that net benefits can be achieved by countries that are able to meet threshold conditions of financial competence but that for others, the benefits are likely to be delayed, and vulnerability to interruptions of capital flows is likely to be increased | https://en.wikipedia.org/wiki?curid=1700209 | 517,437 |
Homologous somatic pairing In 1998 it was determined that homologous pairing in "Drosophila" occurs through independent initiations (as opposed to a directed, 'processive zippering' motion). The first RNAi screen (based on DNA FISH) was carried out to identify genes regulating "D. melanogaster" somatic pairing in 2012, described at the time as providing "an extensive “parts list” of mostly novel factors". These comprised 40 pairing promoting genes and 65 'anti-pairing' genes (of which 2 and 1 were already known, respectively), many of which have human orthologs. An earlier RNAi screen in 2007 showed the disruption of Topoisomerase II activity impairs somatic pairing within "Drosophila" tissue culture, indicating a role for topoisomerase-mediated organisation (or the direct interactions of topoisomerase enzymes) in pairing. Condensin (despite dependent interactions with Topoisomerase II) is antagonistic to "Drosophila" homologous pairing. | https://en.wikipedia.org/wiki?curid=50097973 | 180,820 |
Turing's proof Strangely (perhaps World War II intervened) it took Post some ten years to dissect it in the "Appendix" to his paper "Recursive Unsolvability of a Problem of Thue", 1947 (reprinted in "Undecidable", p. 293). Other problems present themselves: In his "Appendix" Post commented indirectly on the paper's difficulty and directly on its "outline nature" (Post in "Undecidable", p. 299) and "intuitive form" of the proofs ("ibid".). Post had to infer various points: Anyone who has ever tried to read the paper will understand Hodges' complaint: In his proof that the Entscheidungsproblem can have no solution, Turing proceeded from two proofs that were to lead to his final proof. His first theorem is most relevant to the halting problem, the second is more relevant to Rice's theorem. First proof: that no "computing machine" exists that can decide whether or not an arbitrary "computing machine" (as represented by an integer 1, 2, 3, . . .) is "circle-free" (i.e. goes on printing its number in binary ad infinitum): "...we have no general process for doing this in a finite number of steps" (p. 132, "ibid".). Turing's proof, although it seems to use the "diagonal process", in fact shows that his machine (called H) cannot calculate its own number, let alone the entire diagonal number (Cantor's diagonal argument): "The fallacy in the argument lies in the assumption that B [the diagonal number] is computable" ("Undecidable", p. 132). The proof does not require much mathematics | https://en.wikipedia.org/wiki?curid=3739933 | 125,160 |
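The diagonal construction the excerpt describes can be sketched in modern terms (a schematic restatement, not Turing's own 1936 formulation): feed any claimed halting-decider a program built to do the opposite of whatever the decider predicts about it.

```python
def diagonal(halts):
    """Given a claimed halting oracle, construct a program it must misjudge."""
    def g():
        if halts(g):
            while True:      # oracle says "g halts" -> g loops forever
                pass
        # oracle says "g loops" -> g halts immediately
    return g

# Any concrete candidate oracle fails on its own diagonal program:
def naive_oracle(program):
    return True              # claims every program halts

g = diagonal(naive_oracle)
print(naive_oracle(g))       # True -- yet actually running g() would loop forever
```

Since the argument works for every candidate `halts`, no such decider can exist; this is the same fallacy Turing points to in assuming the diagonal number B is computable.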
Nuclear magnetic resonance spectroscopy of nucleic acids H2O is used as a solvent, and other methods are used to eliminate the strong solvent signal, such as saturating the solvent signal before the normal pulse sequence ("presaturation"), which works best at low temperature to prevent exchange of the saturated solvent protons with the nucleic acid protons; or exciting only resonances of interest ("selective excitation"), which has the additional, potentially undesired effect of distorting the peak amplitudes. The exchangeable and non-exchangeable protons are usually assigned to their specific peaks as two independent groups. For exchangeable protons, which are for the most part the protons involved in base pairing, NOESY can be used to find through-space correlations between protons on neighboring bases, allowing an entire duplex molecule to be assigned through sequential walking. For non-exchangeable protons, many of which are on the sugar moiety of the nucleic acid, COSY and TOCSY are used to identify systems of coupled nuclei, while NOESY is again used to correlate the sugar to the base and each base to its neighboring base. For duplex DNA non-exchangeable protons the H6/H8 protons on the base correlate to their counterparts on neighboring bases and to the H1' proton on the sugar, allowing sequential walking to be done. For RNA, the differences in chemical structure and helix geometry make this assignment more technically difficult, but still possible | https://en.wikipedia.org/wiki?curid=32458543 | 188,123
Telecine In 1996 Philips, working with Kodak, introduced the Spirit DataCine (SDC 2000), which was capable of scanning the film image at HDTV resolutions and approaching 2K (1920 Luminance and 960 Chrominance RGB) × 1556 RGB. With the data option the Spirit DataCine can be used as a motion picture film scanner outputting 2K DPX data files as 2048 × 1556 RGB. In 2000 Philips introduced the Shadow (STE), a low-cost version of the Spirit with no Kodak parts. The Spirit DataCine, Cintel's C-Reality and ITK's Millennium opened the door to the technology of digital intermediates, wherein telecine tools were not just used for video outputs, but could now be used for high-resolution data that would later be recorded back out to film. The DFT Digital Film Technology, formerly Grass Valley Spirit 4K/2K/HD (2004) replaced the Spirit 1 Datacine and uses both 2K and 4K line array CCDs. (Note: the SDC-2000 did not use color prisms and/or dichroic mirrors.) DFT revealed its new scanner at the 2009 NAB Show, Scanity. The Scanity uses Time Delay Integration (TDI) sensor technology for extremely fast and sensitive film scans. High speed scanning 15 frame/s @ 4K; 25 frame/s @ 2K; 44 frame/s @ 1K. With the manufacturing of new high-power LEDs came pulsed LED-triggered three-CCD camera systems. Flashing the LED light source for a very short time span gives the full frame CCD camera a stop action of the film, allowing continuous film motion | https://en.wikipedia.org/wiki?curid=170870 | 280,344
Health information on Wikipedia The authors theorized that this might be due to the increasing prevalence of MRI scans, which has led to an increase in incidental findings of white matter lesions. Although most of these lesions have nothing to do with multiple sclerosis, they may lead patients, relatives, and even physicians to perform Internet searches on "multiple sclerosis", which may lead them to the Wikipedia article. Wikipedia's health information has been described as "transforming how our next doctors learn medicine". Various commentators in health education have said that Wikipedia is popular and convenient for medical students. A 2013 study done at a single Australian medical school showed that 97% of students used Wikipedia to study medicine, with the most common reasons being ease of access and ease of understanding. There was no relationship between a student's year in medical school and his or her use of Wikipedia, but students further along in medical school were less likely to use Wikipedia as their first resource, only resource, or most common resource; they were also more likely to perceive Wikipedia as unreliable. In 2013, UCSF School of Medicine began to offer fourth-year medical students a month-long elective centered around improving Wikipedia's health-related articles. Between 2013 and 2015, 43 students took part in the course and chose a single health-related article to work on | https://en.wikipedia.org/wiki?curid=36094374 | 241,977 |
Fuse (electrical) Some installations use these Edison-base circuit breakers. However, any such breaker sold today does have one flaw. It may be installed in a circuit-breaker box with a door. If so, if the door is closed, the door may hold down the breaker's reset button. While in this state, the breaker is effectively useless: it does not provide any overcurrent protection. In the 1950s, fuses in new residential or industrial construction for branch circuit protection were superseded by low voltage circuit breakers. Fuses are widely used for protection of electric motor circuits; for small overloads, the motor protection circuit will open the controlling contactor automatically, and the fuse will only operate for short-circuits or extreme overload. Where several fuses are connected in series at the various levels of a power distribution system, it is desirable to blow (clear) only the fuse (or other overcurrent device) electrically closest to the fault. This process is called "coordination" or "discrimination" and may require the time-current characteristics of two fuses to be plotted on a common current basis. Fuses are selected so that the minor, branch, fuse disconnects its circuit well before the supplying, major, fuse starts to melt. In this way, only the faulty circuit is interrupted with minimal disturbance to other circuits fed by a common supplying fuse | https://en.wikipedia.org/wiki?curid=683342 | 384,863 |
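The coordination (discrimination) described in the excerpt is commonly checked by comparing I²t let-through values: the branch fuse's total clearing I²t must stay below the upstream fuse's pre-arcing (melting) I²t, so the branch fuse clears the fault before the main fuse even starts to melt. The numbers below are illustrative placeholders, not from any manufacturer's data sheet.

```python
def selective(branch_clearing_i2t, main_prearcing_i2t, margin=1.0):
    """True if the branch fuse clears the fault before the upstream
    (main) fuse begins to melt, optionally with a safety margin > 1."""
    return branch_clearing_i2t * margin < main_prearcing_i2t

# Illustrative I^2·t values in A^2·s:
print(selective(branch_clearing_i2t=1_200, main_prearcing_i2t=18_000))   # True
print(selective(branch_clearing_i2t=20_000, main_prearcing_i2t=18_000))  # False
```

In practice the full comparison is done across the time-current characteristic curves, as the text notes, but the I²t check captures the short-circuit end of the curve.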
Libertarian socialism It predates Leninism as De Leonism's principles developed in the early 1890s with De Leon's assuming leadership of the SLP whereas Leninism and its vanguard party idea took shape after the 1902 publication of Lenin's "What Is To Be Done?" The highly decentralized and democratic nature of the proposed De Leonist government is in contrast to the democratic centralism of Marxism–Leninism and what they see as the dictatorial nature of the Soviet Union and the People's Republic of China and other communist states. The success of the De Leonist plan depends on achieving majority support among the people both in the workplaces and at the polls in contrast to the Leninist notion that a small vanguard party should lead the working class to carry out the revolution. Council communism is a radical left-wing movement originating in Germany and the Netherlands in the 1920s. Its primary organization was the Communist Workers Party of Germany (KAPD). Council communism continues today as a theoretical and activist position within Marxism and also within libertarian socialism. In contrast to those of social democracy and Leninist communism, the central argument of council communism is that workers' councils arising in the factories and municipalities are the natural and legitimate form of working class organisation and government power. This view is opposed to the reformist and Bolshevik stress on vanguard parties, parliaments, or the state | https://en.wikipedia.org/wiki?curid=18048 | 516,119 |
Noblex camera The Noblex is a German-made, motor-driven swing-lens panoramic camera manufactured by Kamera-Werkstätten. There are multiple models of this camera in multiple formats. Cameras with similar functions include the Widelux and Horizon. | https://en.wikipedia.org/wiki?curid=1325086 | 249,358
Endogenous growth theory It uses the assumption that the production function does not exhibit diminishing returns to scale to lead to endogenous growth. Various rationales for this assumption have been given, such as positive spillovers from capital investment to the economy as a whole or improvements in technology leading to further improvements. However, endogenous growth theory is further supported by models in which agents optimally determine consumption and saving, optimizing the allocation of resources to research and development, which leads to technological progress. Romer (1987, 1990), with significant contributions by Aghion and Howitt (1992) and Grossman and Helpman (1991), incorporated imperfect markets and R&D into the growth model. The AK model production function is a special case of a Cobb–Douglas production function: Y = AK^αL^(1−α). This equation shows a Cobb–Douglas function where "Y" represents the total production in an economy. "A" represents total factor productivity, "K" is capital, "L" is labor, and the parameter α measures the output elasticity of capital. For the special case in which α = 1, the production function becomes linear in capital, thereby giving constant returns to scale: Y = AK. In neo-classical growth models, the long-run rate of growth is exogenously determined by either the savings rate (the Harrod–Domar model) or the rate of technical progress (Solow model). However, the savings rate and rate of technological progress remain unexplained | https://en.wikipedia.org/wiki?curid=378724 | 508,679
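The special case in the excerpt can be checked numerically: when the output elasticity of capital α equals 1, the Cobb–Douglas form collapses to Y = AK, so doubling capital doubles output regardless of labor. Parameter values here are illustrative only.

```python
def cobb_douglas(A, K, L, alpha):
    """Cobb-Douglas production: Y = A * K**alpha * L**(1 - alpha)."""
    return A * K**alpha * L**(1 - alpha)

A, L = 2.0, 100.0

# alpha < 1: diminishing returns to capital (doubling K less than doubles Y)
print(cobb_douglas(A, 100.0, L, 0.3) / cobb_douglas(A, 50.0, L, 0.3))  # ~1.23

# alpha = 1: the AK model, Y = A*K; doubling K exactly doubles Y
print(cobb_douglas(A, 100.0, L, 1.0) / cobb_douglas(A, 50.0, L, 1.0))  # 2.0
```

This is the sense in which the AK model escapes the diminishing returns that pin down long-run growth exogenously in the neoclassical models mentioned afterwards.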
Silvana Cardoso is a Portuguese fluid dynamicist working in Britain. She is professor of Fluid Mechanics and the Environment at the University of Cambridge and a fellow of Pembroke College, Cambridge. She leads the Fluids and the Environment research group at the Department of Chemical Engineering and Biotechnology. Her research focuses on fluid mechanics and environmental science, in particular the interaction of natural convection and chemical kinetics. She is on the International Advisory Panel of the journal Chemical Engineering Science and the Editorial Board of Chemical Engineering Journal. In 2016 she was awarded the Davidson medal of the Institution of Chemical Engineers (IChemE). Recent press interest in her work has included pieces on whether natural geochemical reactions can delay or prevent the spreading of carbon dioxide in subsurface aquifers used for carbon capture and storage, the possible melting of oceanic methane hydrate deposits owing to climate change, and the importance to astrobiology of brinicles on Jupiter's moon, Europa. | https://en.wikipedia.org/wiki?curid=61522806 | 315,947
Pittacal was the first synthetic dyestuff to be produced commercially. It was accidentally discovered in 1832 by German chemist Carl Ludwig Reichenbach, who is also recognized as being the discoverer of kerosene, phenol, eupion, paraffin wax and creosote. According to one account, Reichenbach applied creosote to the wooden posts of his home, in order to drive away dogs who urinated on them. The strategy was ineffective, however, and he noted that the dogs' urine reacted with creosote to form an intense dark blue deposit. He named the new substance píttacal (from Greek words for "tar" and "beautiful"). He later succeeded in producing pure pittacal by treating beechwood tar with barium oxide, and used alumina as a mordant to dye fabrics. Although sold commercially as a dyestuff, it did not fare well. Eupittone (derived from eu- + pittacal + -one) is a yellow crystalline substance resembling aurin, and obtained by the oxidation of pittacal. It is also called eupittonic acid or eupitton. Kaufmann, GB - Pittacal—the first synthetic dyestuff. "J. Chem. Educ.", 753, 1977. | https://en.wikipedia.org/wiki?curid=6699974 | 93,756
Bioceramic This work inspired a new field called bioceramics. With the discovery of bioglass, interest in bioceramics grew rapidly. On April 26, 1988, the first international symposium on bioceramics was held in Kyoto, Japan. Ceramics are now commonly used in the medical fields as dental and bone implants. Surgical cermets are used regularly. Joint replacements are commonly coated with bioceramic materials to reduce wear and inflammatory response. Other examples of medical uses for bioceramics are in pacemakers, kidney dialysis machines, and respirators. The global demand on medical ceramics and ceramic components was about U.S. $9.8 billion in 2010. It was forecast to have an annual growth of 6 to 7 percent in the following years, with world market value predicted to increase to U.S. $15.3 billion by 2015 and reach U.S. $18.5 billion by 2018. Bioceramics are meant to be used in extracorporeal circulation systems (dialysis for example) or engineered bioreactors; however, they're most common as implants. Ceramics show numerous applications as biomaterials due to their physico-chemical properties. They have the advantage of being inert in the human body, and their hardness and resistance to abrasion makes them useful for bones and teeth replacement. Some ceramics also have excellent resistance to friction, making them useful as replacement materials for malfunctioning joints. Properties such as appearance and electrical insulation are also a concern for specific biomedical applications | https://en.wikipedia.org/wiki?curid=22183423 | 16,880 |
Robert Seamans In 1958, he became Chief Engineer of the Missile Electronics and Controls Division at RCA in Burlington, Massachusetts. From 1948 to 1958, Seamans also served on technical committees of NASA's predecessor organization, the National Advisory Committee for Aeronautics. He served as a consultant to the Scientific Advisory Board of the United States Air Force from 1957 to 1959, as a Member of the Board from 1959 to 1962, and as an Associate Advisor from 1962 to 1967. He was a National Delegate, Advisory Group for Aerospace Research and Development (NATO) from 1966 to 1969. In 1960, Seamans joined NASA as Associate Administrator. In 1965, he became Deputy Administrator, retaining many of the general management-type responsibilities of the Associate Administrator and also serving as Acting Administrator. During his years at NASA he worked closely with the Department of Defense in research and engineering programs and served as Co-chairman of the Astronautics Coordinating Board. Through these associations, NASA was kept aware of military developments and technical needs of the Department of Defense and Seamans was able to advise that agency of NASA activities which had application to national security. In January 1968 he resigned from NASA to become a visiting professor at MIT and in July 1968 was appointed to the Jerome Clarke Hunsaker professorship, an MIT-endowed visiting professorship in the Department of Aeronautics and Astronautics, named in honor of the founder of the Aeronautical Engineering Department | https://en.wikipedia.org/wiki?curid=8643117 | 216,529 |
Radio-frequency welding This process is not instantaneous, therefore if the frequency is high enough, the dipole will be unable to rotate quickly enough to stay aligned with the electric field, resulting in random motion as the molecule attempts to follow the electrical field. This motion causes intermolecular friction which leads to heat generation. The amount of heat generated by friction in the material is dependent on field strength, frequency, dipole strength, and free volume in the material. Since the main driving force for dielectric heating is the interaction of the dipole of a molecule with the applied electrical field, RF welding can only be conducted on materials whose molecules have a dipole. The typical frequency range for dielectric heating is 10–100 MHz, but RF welding is normally conducted around 27 MHz. At too low a frequency, the dipoles are able to align themselves with the electrical field and stay in phase with the electrical current, minimizing the intermolecular friction that is produced. This can also be described as having minimal power loss from the electrical field, since the molecules stay in phase and absorb minimal energy. As frequencies become high enough, power loss starts to increase as the dipoles are unable to align themselves at the rate of the reversing electrical field. The dipoles become out of phase, absorbing energy, and this is when heating occurs. At a certain frequency, a power loss maximum is reached, beyond which higher frequencies produce less power loss and less heating | https://en.wikipedia.org/wiki?curid=57218852 | 450,758
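The field-strength and frequency dependence described in the excerpt follows the standard dielectric heating relation P = 2πf·ε₀·ε″·E²rms (power dissipated per unit volume, with ε″ the dielectric loss factor). The loss factor and field strength below are illustrative placeholders, not measured values for any particular plastic.

```python
import math

EPS_0 = 8.854e-12  # vacuum permittivity, F/m

def dielectric_power_density(f_hz, eps_loss, E_rms):
    """Volumetric heating rate P = 2*pi*f * eps0 * eps'' * E_rms**2 (W/m^3)."""
    return 2 * math.pi * f_hz * EPS_0 * eps_loss * E_rms**2

# Illustrative: a 27 MHz welding field of 1 MV/m in a material with eps'' = 0.2
p = dielectric_power_density(27e6, 0.2, 1e6)
print(f"{p:.3e} W/m^3")
```

The linear dependence on f matches the text's observation that heating grows with frequency (up to the loss-peak frequency, where ε″ itself starts to fall off).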
Blu-ray The choice of formats affects the producer's licensing/royalty costs as well as the title's maximum run time, due to differences in compression efficiency. Discs encoded in MPEG-2 video typically limit content producers to around two hours of high-definition content on a single-layer (25 GB) BD-ROM. The more-advanced video formats (VC-1 and MPEG-4 AVC) typically achieve a video run time twice that of MPEG-2, with comparable quality. MPEG-2 was used by many studios (including Paramount Pictures, which initially used the VC-1 format for HD DVD releases) for the first series of Discs, which were launched throughout 2006. Modern releases are now often encoded in either MPEG-4 AVC or VC-1, allowing film studios to place all content on one disc, reducing costs and improving ease of use. Using these formats also frees a lot of space for storage of bonus content in HD (1080i/p), as opposed to the SD (480i/p) typically used for most titles. Some studios, such as Warner Bros., have released bonus content on discs encoded in a different format than the main feature title. For example, the Disc release of "Superman Returns" uses VC-1 for the feature film and MPEG-2 for some of its bonus content. Today, Warner and other studios typically provide bonus content in the video format that matches the feature. For audio, BD-ROM players are required to implement Dolby Digital (AC-3), DTS, and linear PCM | https://en.wikipedia.org/wiki?curid=11015826 | 403,203 |
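The "around two hours of MPEG-2 on a single 25 GB layer" figure in the excerpt is easy to sanity-check: it corresponds to an average video bitrate of roughly 28 Mbit/s, and halving the bitrate (as VC-1 and MPEG-4 AVC typically allow at comparable quality) doubles the run time.

```python
def runtime_hours(capacity_gb, avg_mbit_s):
    """Hours of video a disc layer holds at a given average bitrate.
    Uses decimal GB (10**9 bytes), as disc capacities are quoted."""
    bits = capacity_gb * 1e9 * 8
    return bits / (avg_mbit_s * 1e6) / 3600

print(round(runtime_hours(25, 28), 2))  # 1.98 h -- MPEG-2 at ~28 Mbit/s
print(round(runtime_hours(25, 14), 2))  # 3.97 h -- at half the bitrate
```

This ignores audio and overhead, so real run times are somewhat shorter, but it shows why the more efficient codecs let studios fit a feature plus HD bonus content on one disc.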
Geometric dynamic recrystallization Geometric Dynamic Recrystallization (GDR) is a recrystallization mechanism that has been proposed to occur in several alloys, particularly aluminium, at high temperatures and low strain rates. It is a variant of dynamic recrystallization. The basic mechanism is that during deformation the grains will be increasingly flattened until the boundaries on each side are separated by only a small distance. The deformation is accompanied by the serration of the grain boundaries due to surface tension effects where they are in contact with low-angle grain boundaries belonging to sub-grains. Eventually the points of the serrations will come into contact. Since the contacting boundaries are defects of opposite 'sign' they are able to annihilate and so reduce the total energy in the system. In effect the grain will pinch in two new grains. The grain size is known to decrease as the applied stress is increased. However, high stresses require a high strain rate and at some point statically recrystallized grains will begin to nucleate and consume the GDRX microstructure. There are features that are unique to GDRX: | https://en.wikipedia.org/wiki?curid=4301570 | 4,531 |
Cochlear implant Users of all devices report a wide range of performance after implantation. Much of the strongest objection to cochlear implants has come from within the Deaf community, some of whom are pre-lingually Deaf people whose first language is a sign language. For some in the Deaf community, cochlear implants are an affront to their culture, which, as they view it, is a minority threatened by the hearing majority. This is an old problem for the Deaf community, going back as far as the 18th century with the argument of manualism vs. oralism. This is consistent with medicalisation and the standardisation of the "normal" body in the 19th century, when differences between normal and abnormal began to be debated. It is important to consider the sociocultural context, particularly in regards to the Deaf community, which considers itself to possess its own unique language and culture. This accounts for the cochlear implant being seen as an affront to their culture, as many do not believe that deafness is something that needs to be cured. However, it has also been argued that this does not necessarily have to be the case: the cochlear implant can act as a tool deaf people can use to access the "hearing world" without losing their Deaf identity. It is believed by some that cochlear implants for congenitally deaf children are most effective when implanted at a young age. However evidence shows that Deaf children who sign well do better academically | https://en.wikipedia.org/wiki?curid=241649 | 141,612 |
Holography On one side, one has to perform the operation always on the whole image, and on the other side, the operation a hologram can perform is basically either a multiplication or a phase conjugation. In optics, addition and Fourier transform are already easily performed in linear materials, the latter simply by a lens. This enables some applications, such as a device that compares images in an optical way. The search for novel materials for dynamic holography is an active area of research. The most common materials are photorefractive crystals, but in semiconductors or semiconductor heterostructures (such as quantum wells), atomic vapors and gases, plasmas and even liquids, it was possible to generate holograms. A particularly promising application is optical phase conjugation. It allows the removal of the wavefront distortions a light beam receives when passing through an aberrating medium, by sending it back through the same aberrating medium with a conjugated phase. This is useful, for example, in free-space optical communications to compensate for atmospheric turbulence (the phenomenon that gives rise to the twinkling of starlight). Since the beginning of holography, amateur experimenters have explored its uses. In 1971, Lloyd Cross opened the San Francisco School of Holography and taught amateurs how to make holograms using only a small (typically 5 mW) helium-neon laser and inexpensive home-made equipment | https://en.wikipedia.org/wiki?curid=66338 | 234,290
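The aberration-cancellation step described in the excerpt is just complex conjugation of the optical field: if a wavefront picks up a phase screen e^(iφ) on the way out, sending its conjugate back through the same screen cancels φ exactly, since e^(−iφ)·e^(iφ) = 1. A minimal numerical check:

```python
import cmath
import random

random.seed(0)
N = 256
E0  = [cmath.exp(1j * random.uniform(0, 2 * cmath.pi)) for _ in range(N)]  # clean wavefront
phi = [random.uniform(0, 2 * cmath.pi) for _ in range(N)]                  # aberrating medium

E_distorted = [e * cmath.exp(1j * p) for e, p in zip(E0, phi)]   # forward pass picks up phi
E_conjugate = [e.conjugate() for e in E_distorted]               # phase-conjugate mirror
E_return    = [e * cmath.exp(1j * p) for e, p in zip(E_conjugate, phi)]  # back through medium

# The returned field equals conj(E0): the phase screen has cancelled entirely.
ok = all(abs(r - e.conjugate()) < 1e-12 for r, e in zip(E_return, E0))
print(ok)  # True
```

This is exactly why a phase-conjugate mirror undoes atmospheric turbulence in free-space optical links: the distortion is applied twice with opposite sign.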
Da Vinci Systems Its features included: The early/budget 68000 models had two control panels and 16 vector secondaries, the same as the Classic. The later 68020 versions usually feature Kilovectors, advanced secondary correction and had three control panels. Options included: 12v panels were used on early versions with 5v panels with separate PSU on later versions (not interchangeable with 12v panels). Both 525 and 625 standards were offered and a B&W menu monitor was used. Often connected to a FDL60, FDL90, MK3 or URSA telecine, it was also used with videotape machines for tape-to-tape grading. Some versions used an extra interface module (the EMC unit) to function with the URSA serial control busses. Normally used with a separate TLC editor (1 or 2), an additional interface is required for this on URSA installations. da Vinci Renaissance 888 was similar to the above system, but had 888 digital video processing in place of the analog video processing. This system was manufactured from 1992–1998. Its features included: Additional options for the digital 888 included: Early versions had a double backplane chassis (4:2:2 only) and used an extra interface module, the EMC, to function with URSA control busses. 888s were normally used with a separate TLC1 or 2 editors and an additional interface was required for this on URSA installations. da Vinci also made the da Vinci Light. This was not marketed, so not many were sold. It is a da Vinci DUI 888 without the digital 888 cards | https://en.wikipedia.org/wiki?curid=10626981 | 403,088 |
Marketing and artificial intelligence However searching for this information and analysis of it can be a sizeable task, manually analyzing this information also presents the potential for researcher bias. Therefore, objective opinion analysis systems are suggested as a solution to this in the form of automated opinion mining and summarization systems. Marketers using this type of intelligence to make inferences about consumer opinion should be wary of what is called opinion spam, where fake opinions or reviews are posted in the web in order to influence potential consumers for or against a product or service. Search engines are a common type of intelligence that seeks to learn what the user is interested in to present appropriate information. PageRank and HITS are examples of algorithms that search for information via hyperlinks; Google uses PageRank to control its search engine. Hyperlink based intelligence can be used to seek out web communities, which is described as ‘ a cluster of densely linked pages representing a group of people with a common interest’. Centrality and prestige are types of measurement terms used to describe the level of common occurrences among a group of actors; the terms help to describe the level of influence and actor holds within a social network. Someone who has many ties within a network would be described as a ‘central’ or ‘prestige’ actor. Identifying these nodes within a social network is helpful for marketers to find out who are the trendsetters within social networks | https://en.wikipedia.org/wiki?curid=35591037 | 475,821 |
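PageRank, mentioned in the excerpt as Google's hyperlink-based algorithm, can be sketched in a few lines of power iteration over a toy link graph. This is a schematic of the published algorithm with the conventional damping factor d = 0.85; the three-page graph is made up for illustration.

```python
def pagerank(links, d=0.85, iters=100):
    """links: {page: [pages it links to]}. Returns PageRank scores summing to 1."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1 - d) / n for p in pages}
        for p, outs in links.items():
            if not outs:                       # dangling page: spread rank evenly
                for q in pages:
                    new[q] += d * rank[p] / n
            else:                              # split rank among outgoing links
                for q in outs:
                    new[q] += d * rank[p] / len(outs)
        rank = new
    return rank

graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
scores = pagerank(graph)
# C is linked to by both A and B, so it accumulates the most rank:
print(max(scores, key=scores.get))  # C
```

The same "central/prestige actor" idea the excerpt raises for social networks is what the iteration computes: pages (actors) pointed to by many well-connected pages end up with high scores.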
Heinrich Anton de Bary In 1858, he had observed conjugation in the alga "Spirogyra", and in 1861, he described sexual reproduction in the fungus "Peronospora" sp. He saw the necessity of observing the whole life cycle of pathogens and attempted to follow it in the living host plants. De Bary published his first work on fungi in 1861, and then spent more than 15 years studying Peronosporeae, particularly "Phytophthora infestans" (formerly "Peronospora infestans") and "Cystopus" ("Albugo"), parasites of potato. In his published work in 1863 entitled "Recherches sur le developpement de quelques champignons parasites", he reported having inoculated spores of "P. infestans" on healthy potato leaves and observed the penetration of the leaf and the subsequent growth of the mycelium that affected the tissue, the formation of conidia, and the appearance of the characteristic black spots of the potato blight. He also did similar experiments on potato stalks and tubers. He watched conidia in the soil and their infection of the tubers, observing that mycelium could survive the cold winter in the tubers. From all these studies, he concluded that organisms could not be generated spontaneously. He did a thorough investigation on "Puccinia graminis", the pathogen of rust of wheat, rye and other grains. He noticed that "P. graminis" produced reddish summer spores called "urediospores", and dark winter spores called "teleutospores" | https://en.wikipedia.org/wiki?curid=51526 | 191,352 |
Bernard Pietenpol Bernard H. Pietenpol (1901–1984) was an aircraft designer. A designer of homebuilt aircraft, Pietenpol was a self-taught mechanic who lived most of his life in the small community of Cherry Grove in southeastern Minnesota. His best-known design, the Pietenpol Air Camper, was meant to be built and flown by the "average American" of the 1930s. The Air Camper is a two-place open cockpit monoplane with "parasol" wing built from material that was, in the 1930s, readily available from local sources. Powered by a Ford Model A engine, and first flown with one in May 1929, Pietenpol's design was sturdy, simple and affordable. Plans were first published in Modern Mechanics and Inventions, then in the magazine's "1932 Flying and Glider Manual". With the success of the Air Camper, MMI editor Weston Farmer convinced Pietenpol to design an airplane that could be powered with the cheaper and more readily available Ford Model T engine. In response to Farmer's request, Pietenpol designed the single-place Pietenpol Sky Scout. Air Campers have been built using many (over 50 have been recorded) different power plants. Pietenpol himself liked the air-cooled Corvair engine. But his original design is well suited to the high torque, slow-turning Model "A." A restored example of the Air Camper can be found in the inventory of the Mid-Atlantic Air Museum in Reading, Pennsylvania. In later years, Pietenpol sold and repaired television sets. In 1981, the Pietenpol Workshop and Garage was added to the National Register of Historic Places | https://en.wikipedia.org/wiki?curid=3295317 | 240,493 |
Electro-oxidation Hydroxyl radicals are known to have one of the highest redox potentials, allowing them to degrade many refractory organic compounds. A reaction mechanism has been proposed for the formation of the hydroxyl radical at the anode through oxidation of water: <chem>S + H2O -> S[*OH] + H+ + e-</chem> Where S represents the generic surface site for adsorption on the electrode surface. Then, the radical species can interact with the contaminants through two different reaction mechanisms, according to the anode material. The surface of "active" anodes strongly interacts with hydroxyl radicals, leading to the production of higher state oxides or superoxides. The higher oxide then acts as a mediator in the selective oxidation of organic pollutants. Because the radicals are strongly chemisorbed onto the electrode surface, the reactions are limited to the proximity of the anode surface, according to the mechanism: <chem>S[*OH] -> SO + H+ + e-</chem> <chem>SO + R -> S + RO</chem> Where R is the generic organic compound, while RO is the partially oxidized product. If the electrode interacts weakly with the radicals, it is qualified as a "non active" anode. Hydroxyl radicals are physisorbed on the electrode surface by means of weak interaction forces and thus available for reaction with contaminants | https://en.wikipedia.org/wiki?curid=61186827 | 315,580
Vannevar Bush At a meeting in Stimson's office on September 23 attended by Bundy, Bush, Conant, Groves, Marshall, Somervell, and Stimson, Bush put forward his proposal for steering the project by a small committee answerable to the Top Policy Group. The meeting agreed with Bush, and created a Military Policy Committee chaired by him, with Somervell's chief of staff, Brigadier General Wilhelm D. Styer, representing the army, and Rear Admiral William R. Purnell representing the navy. At the meeting with Roosevelt on October 9, 1941, Bush advocated cooperating with the United Kingdom, and he began corresponding with his British counterpart, Sir John Anderson. But by October 1942, Conant and Bush agreed that a joint project would pose security risks and be more complicated to manage. Roosevelt approved a Military Policy Committee recommendation stating that information given to the British should be limited to technologies that they were actively working on and should not extend to post-war developments. In July 1943, on a visit to London to learn about British progress on antisubmarine technology, Bush, Stimson, and Bundy met with Anderson, Lord Cherwell, and Winston Churchill at 10 Downing Street. At the meeting, Churchill forcefully pressed for a renewal of interchange, while Bush defended current policy. Only when he returned to Washington did he discover that Roosevelt had agreed with the British | https://en.wikipedia.org/wiki?curid=32767 | 302,877
Federal Council of Engineering and Agronomy It is the Plenary that appreciates and decides on draft resolutions aimed at regulating and executing the law, and on draft normative decisions aimed at establishing understandings or determining procedures for unity of action of the Confea/Crea System. The Plenary also regulates integration issues with the state and society, professional qualification and supervision, and economic and financial control; it appreciates and decides on the normative acts of the Creas, among several other competences. More information about the Plenary and its competences can be found in the Rules of Confea, approved by Resolution No. 1.015/2006. Resolution No. 1.015 of June 30, 2006, which approves the Rules of the Federal Council of Engineering, Architecture and Agronomy, created the possibility of working groups to support the execution of Federal actions. Established by the Confea Plenary, the Working Groups deal with issues ranging from structural questions of the System and studies for changes in existing laws and regulations (such as overlapping assignments with other professions and the advice and representation of mid-level technicians in plenary sessions) to sustainability and bills of interest to professionals in the technological area. Established in Confea by Resolution No. 1.060/2014, the thematic committees aim to collect data and study specific subjects of a continuing nature, with the aim of assisting the standing committees of Confea in discussing relevant topics that permeate the professions covered by the Confea/Crea System | https://en.wikipedia.org/wiki?curid=52763975 | 256,172
IMPACT (computer graphics) IMPACT (sometimes spelled Impact) is a computer graphics architecture for Silicon Graphics computer workstations. IMPACT Graphics was developed in 1995 and was available as a high-end graphics option on workstations released during the mid-1990s. IMPACT graphics gives the workstation real-time 2D and 3D graphics rendering capability similar to that of even high-end PCs made well after IMPACT's introduction. IMPACT graphics systems consist of either one or two Geometry Engines and one or two Raster Engines in various configurations. IMPACT graphics consists of five graphics subsystems: the Command Engine, Geometry Subsystem, Raster Engine, framebuffer and Display Subsystem. IMPACT Graphics can produce resolutions up to 1600 x 1200 pixels with 32-bit color and can also process unencoded NTSC and PAL analog television signals. IMPACT graphics subsystems come in three configurations for SGI Indigo2 IMPACT workstations: Solid IMPACT, High IMPACT, and Maximum IMPACT. The equivalent configurations also exist for the SGI Octane workstation but are referred to as SI, SSI, and MXI (I-series). Later Octane workstations used a similar configuration but with updated ASIC chips and are referred to as SE, SSE, and MXE (E-series). IMPACT uses Rambus RDRAM for texture memory. The IMPACT graphics architecture was superseded by SGI's VPro graphics architecture in 1997. | https://en.wikipedia.org/wiki?curid=45137215 | 99,876 |
Center for Advancing Innovation The Center for Advancing Innovation (CAI) is a registered 501(c)(3) non-profit organization, headquartered in Bethesda, Maryland, with a mission to accelerate technology transfer and commercialization. CAI was founded by Rosemarie Truman, a consultant and former VP of Global Strategy at Marsh & McLennan. Rosemarie was also a senior manager at Oracle and Goldman Sachs and led the R&D and innovation strategy practice globally at IBM. CAI is a partner of The Founder Institute, BIO, J&J's JLABs, AstraZeneca, the Walton Family Foundation, the Brain Tumour Charity, the Silicon Valley Community Foundation and many other organizations. Among its efforts to promote innovation, CAI organizes startup "challenges" in which multidisciplinary teams from around the world develop business plans to commercialize federally funded technologies. Industry experts judge hundreds of business plans about these promising technologies to determine challenge "winners", which enter into licensing negotiations with participating technology transfer offices. As of 2017, CAI has orchestrated four startup challenges: the National Institutes of Health-sponsored Breast Cancer Startup Challenge, Neuro Startup Challenge, and Nanotechnology Startup Challenge in Cancer competitions, as well as the SPACE RACE startup challenge in partnership with NASA and the Medical Center of the Americas Foundation | https://en.wikipedia.org/wiki?curid=49060638 | 298,693
Overtime Out of approximately 120 million American workers, nearly 50 million are exempt from overtime laws (US Department of Labor, Wage and Hour Division, 1998). In 2004, the United States was 7th out of 24 OECD countries in terms of annual working hours per worker. (See Working time for a complete listing.) In 2015, the United States Department of Labor proposed dramatic changes to certain exemptions from federal minimum wage and overtime requirements. These changes are anticipated to take effect in July 2016, but as of January 2016, still are pending final approval. Proposed changes include: setting the minimum salary level required for exemption for full-time salaried workers at $970 per week, or $50,440 annually (an increase from the current $455 per week, or $23,660 annually); and increasing the total annual compensation required to exempt highly compensated employees to $122,148 annually (from the current $100,000 annually). On August 23, 2004, President George W. Bush and the Department of Labor proposed changes to regulations governing implementation of the law. According to one study, the changes would have had significant impact on the number of workers covered by overtime laws and have exempted several million additional workers. The Bush administration maintained that the practical impact on working Americans would be minimal and that the changes would help clarify an outdated regulation. In particular, the new rules would have allowed more companies to offer flextime to their workers instead of overtime | https://en.wikipedia.org/wiki?curid=341254 | 453,066
Tariff If unemployment (or underutilized resources) exists, there are no opportunity costs, because the production of one good can be increased without reducing the production of another good. Since comparative advantages are determined by opportunity costs in the neoclassical formulation, these cannot be calculated and this formulation would lose its logical basis. If a country's resources were not fully utilized, production and consumption could be increased at the national level without participating in international trade. The whole raison d'être of international trade would disappear, as would the possible gains. In this case, a State could even earn more by refraining from participating in international trade and stimulating domestic production, as this would allow it to employ more labour and capital and increase national income. Moreover, any adjustment mechanism underlying the theory no longer works if unemployment exists. In practice, however, the world is characterised by unemployment. Unemployment and underemployment of capital and labour are not a short-term phenomenon, but it is common and widespread. Unemployment and untapped resources are more the rule than the exception. | https://en.wikipedia.org/wiki?curid=55551 | 509,874 |
Cybersecurity Information Sharing Act Senate Majority Leader Mitch McConnell, (R-Ky) attempted to attach the bill as an amendment to the annual National Defense Authorization Act, but was blocked 56-40, not reaching the necessary 60 votes to include the amendment. Mitch McConnell hoped to bring the bill to senate-wide vote during the week of August 3–7, but was unable to take up the bill before the summer recess. The Senate tentatively agreed to limit debate to 21 particular amendments and a manager's amendment, but did not set time limits on debate. In October 2015, the US Senate took the bill back up following legislation concerning sanctuary cities. The main provisions of the bill make it easier for companies to share personal information with the government, especially in cases of cyber security threats. Without requiring such information sharing, the bill creates a system for federal agencies to receive threat information from private companies. With respect to privacy, the bill includes provisions for preventing the sharing of personal data that is irrelevant to cyber security. Any personal information that does not get removed during the sharing procedure can be used in a variety of ways. These shared cyber threat indicators can be used to prosecute cyber crimes, but may also be used as evidence for crimes involving physical force. Sharing National Intelligence threat data among public and private partners is a hard problem and one that we should all care about | https://en.wikipedia.org/wiki?curid=43274224 | 99,164 |
Joan Curran In 1939, Dee proposed that the team spend a month at the Royal Aircraft Establishment at Farnborough Airfield. They arrived on 1 September 1939. Two days later, Britain declared war on Germany, and Britain entered the Second World War. Instead of returning to the Cavendish, the team moved to Exeter, where Dee and three others worked on developing rockets as anti-aircraft weapons, while Strothers and Curran joined a group under John Coles working on the development of the proximity fuse. Strothers was based at Leeson House and Durnford School. She and Curran developed a workable fuse, which was codenamed VT, an acronym of "Variable Time fuze". The system was a small, short-range, Doppler radar that used a clever circuit. However, Britain lacked the capacity to mass-produce the fuze, so the design was shown to the United States by the Tizard Mission in late 1940. The Americans perfected and mass-produced the fuse. In due course, these proximity fuses arrived in the United Kingdom, where they played an important part in the defence of the kingdom against the V-1 flying bomb. Strothers married Curran on 7 November 1940. Soon afterwards they were transferred to the Telecommunications Research Establishment near Swanage, where Sam worked on centimetric radar, while Joan joined the Counter Measures Group in an adjoining lab. It was with this group, at Swanage, and later at Malvern, that Joan devised the technique that was codenamed Window, which is also known as chaff | https://en.wikipedia.org/wiki?curid=14439096 | 254,914
NRANK NRANK, or National Rank, is a ranking of the rarity of a species within a nation. Each nation can assign their own NRANK, based on information from conservation data centres, natural heritage programmes and expert scientists. | https://en.wikipedia.org/wiki?curid=5416131 | 192,373 |
Georg Statz (1894–1945) was a school teacher and taxonomist who published widely on fossil insects from the Oligocene Rott Formation of Germany (Rott am Siebengebirge). Known as the "Statz Collection", his fossils are deposited at the Natural History Museum of Los Angeles County and the Institut für Paläontologie, Universität Bonn. | https://en.wikipedia.org/wiki?curid=55089508 | 187,565
Flight instruments The altimeter shows the aircraft's altitude above sea-level by measuring the difference between the pressure in a stack of aneroid capsules inside the altimeter and the atmospheric pressure obtained through the static system. The most common unit for altimeter calibration worldwide is hectopascals (hPa), except for North America and Japan where inches of mercury (inHg) are used. The altimeter is adjustable for local barometric pressure, which must be set correctly to obtain accurate altitude readings, usually in either feet or meters. As the aircraft ascends, the capsules expand and the static pressure drops, causing the altimeter to indicate a higher altitude. The opposite effect occurs when descending. With the advancement of aviation and increased altitude ceilings, the altimeter dial had to be altered for use at both higher and lower altitudes. Hence, when the needles indicated lower altitudes, i.e. during the first 360-degree sweep of the pointers, a small window with oblique lines appeared, warning the pilot that he or she was nearer to the ground. This modification was introduced in the early sixties after a recurrence of air accidents caused by confusion in the pilot's mind. At higher altitudes, the window disappears. The airspeed indicator shows the aircraft's speed relative to the surrounding air. Knots is currently the most used unit, but kilometers per hour is sometimes used instead | https://en.wikipedia.org/wiki?curid=79537 | 408,419
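The pressure-to-altitude conversion that the altimeter performs mechanically with its aneroid capsules can be sketched numerically. The row above does not give the formula; the ICAO standard-atmosphere constants and example pressures below are assumptions for illustration.

```python
# Sketch: converting a measured static pressure to indicated altitude
# using the ICAO standard-atmosphere barometric formula (an assumption;
# real altimeters implement this relation mechanically).
def pressure_altitude_ft(static_hpa, setting_hpa=1013.25):
    """Altitude in feet for a measured static pressure, given the
    barometric setting dialed into the altimeter's adjustment knob."""
    # 44330 m and exponent 0.190263 come from the ISA sea-level
    # temperature (288.15 K), lapse rate (6.5 K/km), and g*M/(R*L).
    meters = 44330.0 * (1.0 - (static_hpa / setting_hpa) ** 0.190263)
    return meters / 0.3048

print(round(pressure_altitude_ft(1013.25)))  # -> 0 (sea level)
print(round(pressure_altitude_ft(696.8)))    # roughly 10,000 ft in ISA
```

As the sketch shows, lower static pressure maps to higher indicated altitude, which is why a wrong barometric setting shifts every reading.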
Zhang Guangdou (1 May 1912 – 21 June 2013) was a Chinese hydraulic engineer. He was a specialist in hydraulic engineering, professor and vice president of Tsinghua University, and a member of the Chinese Academy of Sciences and Chinese Academy of Engineering. Zhang graduated from National Chiao Tung University (now Shanghai Jiao Tong University) in 1934, obtained his master's degree in Civil Engineering from University of California, Berkeley in 1936, and a master's degree in engineering mechanics from Harvard University in 1937. Zhang was the chief designer of the Miyun Reservoir and Yuzixi powerplant. He was also involved in the design of many major hydraulic and hydropower projects, including Sanmenxia, Danjiangkou, Gezhouba, Ertan, Geheyan, Xiaolangdi, and the Three Gorges Dam. Zhang received numerous awards, including the Haas International Award from UC Berkeley (1981), the National Award for Technology Advancement (2nd Prize, 1985), the Ho Leung Ho Lee Technology Advancement Award (1995), and the Technology Achievement Award from the Chinese Academy of Engineering (1996). He was elected to the Chinese Academy of Sciences in 1955, and the Chinese Academy of Engineering in 1994. He was also elected as a foreign member of Mexico National Academy of Engineering in 1981. | https://en.wikipedia.org/wiki?curid=5092855 | 322,000
Palacio de Fabio Nelli The collection is distributed in two sections: Archaeology (10 rooms) and Fine Arts (8 rooms). The Archaeology section shows a complete chronological sequence of pieces from the Valladolid province, from the Paleolithic to the Middle Ages. The Fine Arts section offers paintings of the 15th and 16th centuries, Flemish tapestries, goldsmithing of the 17th century, Spanish popular ceramics, sculpture, and a small section devoted to the history of the city. Notable in the fine arts collection are the painting by the Master of Bishop Sancho de Rojas representing the Virgin with the Child together with two donors presented by Saint Catherine, and the Adoration of the Shepherds by Vicente Masip. Among the archaeological remains, the sarcophagus of the child Alfonso de Castilla stands out, with his clothes and fabrics. | https://en.wikipedia.org/wiki?curid=49575391 | 333,080
Antony King In 2013, Antony was nominated for Pro Sound Awards' Live/Touring Sound Engineer of the Year. Antony has recently been nominated for a 30th Annual Technical Excellence & Creativity (TEC) Award in Outstanding Creative Achievement in the category of Tour/Event Sound Production for his work on the 2014 Nine Inch Nails and Soundgarden tour. In 2017 and 2018, Antony was FOH Engineer for Depeche Mode's "Global Spirit Tour," which played to 3 million fans and accrued over $202 million in box office earnings. Antony used a Solid State Logic L500 Plus mixing console on the Depeche Mode "Global Spirit Tour." His outboard rack gear included Warm Audio WA76's on Dave Gahan and Martin Gore’s vocals; a TC Electronic M6000; TC Electronic D2 controlled via MIDI from the desk; a Warm Audio WA2A and an EQP-WA on the bass; SPL Transient Designers on the drums; Chandler TG1 on the drum overheads; Smart C2 over the Left and Right. For system EQ, he used a Dolby Lake. He also used a Beyerdynamic TG 1000 digital wireless system and an L-Acoustics K1 system. He currently divides his time between London and Los Angeles, CA. | https://en.wikipedia.org/wiki?curid=44370905 | 393,962
Cyborg antenna A cyborg antenna is an osseointegrated device implanted in a human skull. The antenna, composed of a wireless camera on one end and a wireless sound vibration implant on the other end, allows wireless communication and wireless transmission of images, sound or video from skull to skull. The antenna uses audible vibrations in the skull to report information. This includes measurements of electromagnetic radiation, phone calls, music, as well as video or images which are transmitted through audible vibrations. The Wi-Fi enabled antenna also allows the reception of signals and data from satellites. The first antenna was created in England in 2003 by Neil Harbisson. The invention, under the heading "Bridging the Island of the Colourblind Project", won a British award in Innovation (Submerge 2004) and a European award in Content Tools and Interface Design (Europrix 2004). In 2007, Peter Kese, a software developer from Kranj, Slovenia, made further developments to the antenna by increasing the number of color hues to 360 and adding color saturation through different volume levels. In 2009, Matias Lizana, a student from Universitat Politècnica de Catalunya, developed the antenna into a chip as part of his final year project. The chip allows users to have the antenna implanted and to hear colors beyond the limits of human perception such as infrared and ultraviolet. Harbisson's Sonochromatic Music Scale (2003) is a microtonal and logarithmic scale with 360 notes in an octave | https://en.wikipedia.org/wiki?curid=42290411 | 164,226
Timber framing The infill may be covered by other materials, including weatherboarding or tiles, or left exposed. When left exposed, both the framing and infill were sometimes done in a decorative manner. Germany is famous for its decorative half-timbering and the figures sometimes have names and meanings. The decorative manner of half-timbering is promoted in Germany by the German Timber-Frame Road, several planned routes people can drive to see notable examples of "Fachwerk" buildings. Gallery of infill types: Gallery of some named figures and decorations: The collection of elements in half timbering are sometimes given specific names: The term half-timbering is not as old as the German name "Fachwerk" or the French name "colombage", but it is the standard English name for this style. One of the first people to publish the term "half-timbered" was Mary Martha Sherwood (1775–1851), who employed it in her book, "The Lady of the Manor", published in several volumes from 1823 to 1829. She uses the term picturesquely: "...passing through a gate in a quickset hedge, we arrived at the porch of an old half-timbered cottage, where an aged man and woman received us." By 1842, half-timbered had found its way into "The Encyclopedia of Architecture" by Joseph Gwilt (1784–1863). This juxtaposition of exposed timbered beams and infilled spaces created the distinctive "half-timbered", or occasionally termed, "Tudor" style, or "black-and-white". The most ancient known half-timbered building is called the House of "opus craticum" | https://en.wikipedia.org/wiki?curid=699422 | 338,474
This TV ), American International Pictures, and the Mirisch Company (all of which were acquired by MGM); in addition, films produced by PolyGram Filmed Entertainment (which was not acquired by MGM at the time PolyGram folded in 1999, although MGM holds distribution rights to its pre-1996 films) are also featured on the network. This TV also shows films from Miramax. Under Weigel co-ownership, This TV aired "Pink Panther" cartoon shorts to pad out surplus airtime when a film concluded more than five minutes prior to the end of the film's allotted timeslot. This TV also commonly features themed movie presentations, with the entire day's schedule consisting of films from a particular genre once a week throughout the month (such as Mondays, which feature drama and romance films under the theme "From the Heart;" and Wednesdays, which feature action and western films under the theme "Wednesdays Are Wild"). On certain days, the network may air differing genres of films separated by daypart (for example, crime dramas during the day and comedies at night). The network also broadcasts a featured movie in primetime at 8:00 p.m. Eastern Time on Monday through Friday nights. Until October 31, 2013, the weeknight prime movie presentations were typically replayed later in the evening (usually at 12:00 a.m. Eastern Time, depending on the length of the film that preceded it), which allowed viewers who have This TV's primetime pre-empted by a secondary network to watch those films | https://en.wikipedia.org/wiki?curid=18653290 | 485,418
Discovery and development of tubulin inhibitors Most of the SAR studies involve the vindoline portion of bis-indole alkaloids because modification at C-16 and C-17 offers good opportunities for developing new analogues. The replacement of the ester group with an amide group at the C-16 resulted in the development of vindesine. Similarly, replacement of the acetyl group at C-16 with L-trp-OC2H5, d-Ala(P)-(OC2H5)2, L-Ala(P)-(OC2H5)2 and I-Vla(P)-(OC2H5)2 gave rise to new analogues having anti-tubulin activity. It was also found that vindoline's indole methyl group is a useful position to functionalize and develop new, potent vinblastine derivatives. A new series of semi-synthetic C-16-spiro-oxazolidine-1,3-diones prepared from 17-deacetyl vinblastine showed good anti-tubulin activity and lower cytotoxicity. Vinglycinate, a glycinate prodrug derived from the C-17-OH group of vinblastine, showed similar antitumor activity and toxicity as that of vinblastine. Limitations in anticancer therapy occur mainly for two reasons: because of the patient's organism, or because of the specific genetic alterations in the tumor cells. From the patient, therapy is limited by poor absorption of a drug, which can lead to low concentration of the active agent in the blood and delivery of only a small amount to the tumor. A low serum level of a drug can also be caused by rapid metabolism and excretion associated with affinity to intestinal or/and liver cytochrome P450. Another reason is the instability and degradation of the drugs in the gastro-intestinal environment | https://en.wikipedia.org/wiki?curid=33071344 | 76,868
List of European power companies by carbon intensity The following is a list of European power companies by carbon intensity. | https://en.wikipedia.org/wiki?curid=31721675 | 31,977 |
John Kenneth Hulm In 1989 he accompanied Mildred Dresselhaus to Japan to evaluate that country’s superconductivity research. During his career he was the author or co-author of about 100 scientific papers. In 1948 Hulm married Joan Askham. Upon his death he was survived by his widow, four daughters, a son, and six grandchildren. | https://en.wikipedia.org/wiki?curid=61828114 | 92,877 |
Geoprofessions Proposed modifications may include such things as vegetation removal, using various types of earth materials in construction, applying loads to shallow or deep foundations, constructing cut or fill slopes and other grading, and modifying ground and surface water flow. The effects of surficial and deep-seated geologic processes are evaluated and analyzed to predict their potential effect on public health, public safety, land use, or proposed development. (b) Typical engineering geologic applications and types of projects. Engineering geology is applied during all project phases, from conception through planning, design, construction, maintenance, and, in some cases, reclamation and closure. Planning-level engineering geologic work is commonly conducted in response to forest practice regulations, critical areas ordinances, and the State Environmental Policy Act. Typical planning-level engineering geologic applications include timber harvest planning, proposed location of residential and commercial developments and other buildings and facilities, and alternative route selection for roads, rail lines, trails, and utilities | https://en.wikipedia.org/wiki?curid=31071816 | 352,723 |
Computer processing of body language That includes people who are possibly going to be observed and monitored by a kind of computer that analyzes their body movements and gestures when they are doing something as simple as shopping at a mall or traveling through airports. | https://en.wikipedia.org/wiki?curid=29726536 | 136,051 |
DARPA ARPA at this point (1959) played an early role in Transit (also called NavSat), a predecessor to the Global Positioning System (GPS). "Fast-forward to 1959 when a joint effort between DARPA and the Johns Hopkins Applied Physics Laboratory began to fine-tune the early explorers’ discoveries. TRANSIT, sponsored by the Navy and developed under the leadership of Dr. Richard Kirschner at Johns Hopkins, was the first satellite positioning system." During the late 1960s, with the transfer of these mature programs to the Services, ARPA redefined its role and concentrated on a diverse set of relatively small, essentially exploratory research programs. The agency was renamed the Defense Advanced Research Projects Agency (DARPA) in 1972, and during the early 1970s, it emphasized directed energy programs, information processing, and tactical technologies. Concerning information processing, DARPA made great progress, initially through its support of the development of time-sharing (all modern operating systems rely on concepts invented for the Multics system, developed by a cooperation among Bell Labs, General Electric and MIT, which DARPA supported by funding Project MAC at MIT with an initial two-million-dollar grant). DARPA supported the evolution of the ARPANET (the first wide-area packet switching network), Packet Radio Network, Packet Satellite Network and ultimately, the Internet and research in the artificial intelligence fields of speech recognition and signal processing, including parts of Shakey the robot | https://en.wikipedia.org/wiki?curid=8957 | 189,721
Atom probe The needle is oriented towards a phosphor screen to create a projected image of the work function at the tip apex. The image resolution is limited to about 2–2.5 nm, due to quantum mechanical effects and lateral variations in the electron velocity. In field ion microscopy the tip is cooled by a cryogen and its polarity is reversed. When an "imaging gas" (usually hydrogen or helium) is introduced at low pressures (< 0.1 Pascal) gas ions in the high electric field at the tip apex are "field ionized" and produce a projected image of protruding atoms at the tip apex. The image resolution is determined primarily by the temperature of the tip but even at 78 Kelvin atomic resolution is achieved. The 10-cm Atom Probe, invented in 1973 by J. A. Panitz, was “a new and simple atom probe which permits rapid, in depth species identification or the more usual atom-by atom analysis provided by its predecessors ... in an instrument having a volume of less than two liters in which tip movement is unnecessary and the problems of evaporation pulse stability and alignment common to previous designs have been eliminated.” This was accomplished by combining a time of flight (TOF) mass spectrometer with a proximity focussed, dual channel plate detector, an 11.8 cm drift region and a 38° field of view. An FIM image or a desorption image of the atoms removed from the apex of a field emitter tip could be obtained. The 10-cm Atom Probe has been called the "progenitor" of later atom probes including the commercial instruments | https://en.wikipedia.org/wiki?curid=3211 | 93,828
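The time-of-flight species identification described in the atom-probe row above rests on a simple kinematic relation: an ion accelerated through potential V satisfies qV = mv²/2, so its flight time over the drift region gives its mass-to-charge ratio. The sketch below uses the 11.8 cm drift length from the text, but the 10 kV accelerating voltage and the helium-ion example are illustrative assumptions.

```python
import math

E = 1.602176634e-19      # elementary charge (C)
AMU = 1.66053906660e-27  # atomic mass unit (kg)

def mass_per_charge_amu(flight_time_s, drift_m, voltage_v):
    """m/q in amu per elementary charge, from q*V = 0.5*m*v^2 and
    v = drift_m / flight_time_s (field-free drift assumed)."""
    return 2.0 * voltage_v * (flight_time_s / drift_m) ** 2 * E / AMU

# Round trip for a singly charged helium-4 imaging-gas ion over an
# 11.8 cm drift region at an assumed 10 kV accelerating voltage:
t = 0.118 * math.sqrt((4.0 * AMU) / (2.0 * E * 10000.0))
print(round(mass_per_charge_amu(t, 0.118, 10000.0), 3))  # -> 4.0
```

Because flight time enters squared, timing resolution directly limits how well neighboring isotopes can be separated, which is why pulse stability mattered so much in the earlier designs the quote mentions.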