| source | text |
|---|---|
https://en.wikipedia.org/wiki/Lawvere%E2%80%93Tierney%20topology | In mathematics, a Lawvere–Tierney topology is an analog of a Grothendieck topology for an arbitrary topos, used to construct a topos of sheaves. A Lawvere–Tierney topology is also sometimes called a local operator, coverage, topology, or geometric modality. They were introduced by William Lawvere and Myles Tierney.
Definition
If E is a topos, then a topology on E is a morphism j from the subobject classifier Ω to Ω such that j preserves truth (j ∘ true = true), preserves intersections (j ∘ ∧ = ∧ ∘ (j × j)), and is idempotent (j ∘ j = j).
j-closure
Given a subobject s: S ↪ A of an object A with classifying map char s: A → Ω, the composition j ∘ char s classifies another subobject s̄ of A such that s is a subobject of s̄, and s̄ is said to be the j-closure of s.
Some theorems related to j-closure are (for some subobjects s and w of A):
inflationary property: s ≤ s̄
idempotence: the j-closure of s̄ is s̄ itself
preservation of intersections: the j-closure of s ∧ w is s̄ ∧ w̄
preservation of order: if s ≤ w, then s̄ ≤ w̄
stability under pullback: for a morphism f: B → A, the j-closure of the pullback f⁻¹(s) is the pullback f⁻¹(s̄).
Examples
Grothendieck topologies on a small category C are essentially the same as Lawvere–Tierney topologies on the topos of presheaves of sets over C.
References
Topos theory
Closure operators |
https://en.wikipedia.org/wiki/Cuneiform%20%28programming%20language%29 | Cuneiform is an open-source workflow language
for large-scale scientific data analysis.
It is a statically typed functional programming language promoting parallel computing. It features a versatile foreign function interface allowing users to integrate software from many external programming languages. At the organizational level Cuneiform provides facilities like conditional branching and general recursion, making it Turing-complete. In this respect, Cuneiform attempts to close the gap between scientific workflow systems like Taverna, KNIME, or Galaxy and large-scale data analysis programming models like MapReduce or Pig Latin, while offering the generality of a functional programming language.
Cuneiform is implemented in distributed Erlang. If run in distributed mode it drives a POSIX-compliant distributed file system like Gluster or Ceph (or a FUSE integration of some other file system, e.g., HDFS). Alternatively, Cuneiform scripts can be executed on top of HTCondor or Hadoop.
Cuneiform is influenced by the work of Peter Kelly who proposes functional programming as a model for scientific workflow execution.
In this respect, Cuneiform is distinct from related workflow languages based on dataflow programming, such as Swift.
External software integration
External tools and libraries (e.g., R or Python libraries) are integrated via a foreign function interface. In this it resembles, e.g., KNIME which allows the use of external software through snippet nodes, or Taverna which offers BeanShell services for integrating Java software. By defining a task in a foreign language it is possible to use the API of an external tool or library. This way, tools can be integrated directly without the need of writing a wrapper or reimplementing the tool.
Currently supported foreign programming languages are:
Bash
Elixir
Erlang
Java
JavaScript
MATLAB
GNU Octave
Perl
Python
R
Racket
Foreign-language support for AWK and gnuplot is a planned addition.
Type System
Cuneiform provides |
https://en.wikipedia.org/wiki/WiperSoft | WiperSoft is an anti-spyware program developed by Wiper Software. It is designed to help users protect their computers from such threats as adware, browser hijackers, worms, potentially unwanted programs (PUPs), trojans, and viruses. It is currently available only for Microsoft Windows.
History
WiperSoft was launched in 2015 and was available as a free program for home users. Users were able to use the scan and removal functions without having to buy a subscription.
In 2016, it was re-released with a new design, improved detection and removal functionalities and a more user-friendly interface. That same year, WiperSoft also became a paid program.
WiperSoft saw a big increase in downloads and sales in 2017, and is reportedly used by 1 million users from 120 different countries.
It was tested by Softpedia in 2017 and was rated 100% Clean.
Product
WiperSoft is primarily an anti-spyware program, and comes in two versions. The free version of WiperSoft lets users scan their computers for malware. The paid version adds malware detection and removal, help desk services, and custom fixes.
According to Wiper Software, the program can detect and remove threats like potentially unwanted programs, adware, browser hijackers, questionable toolbars, browser add-ons, viruses, trojans and more. Detected potential threats are not automatically deleted, and users have the option of keeping them installed. The program will also undo the changes made by detected threats, such as change of homepage or default search engine.
Availability
The program is currently only available for Microsoft Windows users. All popular browsers, such as Google Chrome, Mozilla Firefox, Internet Explorer and Opera are supported. The program is available in 10 languages.
References
External links
WiperSoft
Softpedia Review
Utilities for Windows
Proprietary package management systems
Windows software
Antivirus software |
https://en.wikipedia.org/wiki/Quanta%20Magazine | Quanta Magazine is an editorially independent online publication of the Simons Foundation covering developments in physics, mathematics, biology and computer science.
Undark Magazine described Quanta Magazine as "highly regarded for its masterful coverage of complex topics in science and math." The science news aggregator RealClearScience ranked Quanta Magazine first on its list of "The Top 10 Websites for Science in 2018." In 2020, the magazine received a National Magazine Award for General Excellence from the American Society of Magazine Editors for its "willingness to tackle some of the toughest and most difficult topics in science and math in a language that is accessible to the lay reader without condescension or oversimplification."
The articles in the magazine are freely available to read online. Scientific American, Wired, The Atlantic, and The Washington Post, as well as international science publications like Spektrum der Wissenschaft, have reprinted articles from the magazine.
History
Quanta Magazine was initially launched as Simons Science News in October 2012, but it was renamed to its current title in July 2013. It was founded by the former New York Times journalist Thomas Lin, who is the magazine's editor-in-chief. The two deputy editors are John Rennie and Michael Moyer, formerly of Scientific American, and the art director is Samuel Velasco.
In November 2018, MIT Press published two collections of articles from Quanta Magazine, Alice and Bob Meet the Wall of Fire and The Prime Number Conspiracy.
In May 2022 the magazine's staff, notably Natalie Wolchover, were awarded the Pulitzer Prize for Explanatory Reporting.
References
External links
American science websites
Magazines established in 2012
Online magazines published in the United States
Popular science magazines
Science and technology magazines published in the United States
Pulitzer Prize for Explanatory Journalism winners
2012 establishments in the United States |
https://en.wikipedia.org/wiki/Exhaust%20gas%20analyzer | An exhaust gas analyser or exhaust carbon monoxide (CO) analyser is an instrument for measuring carbon monoxide, among other gases, in the exhaust produced by incorrect combustion; the Lambda coefficient measurement is the most common.
The principles used for CO sensors (and other types of gas) are infrared gas sensors and chemical gas sensors. Carbon monoxide sensors are used to assess the CO amount during a Ministry of Transport (MOT) test. In order to be used for such a test, the analyser must be approved as suitable for use in the scheme. In the UK, a list of acceptable exhaust gas analysers for use within the MOT test is available via the Driver and Vehicle Standards Agency website.
Lambda coefficient measurement
The presence of oxygen in the exhaust gases indicates that the combustion of the mixture was not perfect, resulting in contaminant gases. Thus measuring the proportion of oxygen in the exhaust gases of these engines can monitor and measure these emissions. This measurement is performed in the MOT test through Lambda coefficient measurement.
The Lambda coefficient (λ) is obtained from the relationship between air and gasoline involved in combustion of the mixture. It is a measure of the efficiency of the gasoline engine by measuring the percentage of oxygen in the exhaust.
When gasoline engines operate with a stoichiometric mixture of 14.7:1, the value of lambda (λ) is 1.
Mixing ratio = weight of air / weight of fuel
- Expressed as mass ratio: 14.7 kg of air per 1 kg of fuel.
- Expressed as volume ratio: 10,000 liters of air per 1 liter of fuel.
With this ratio, theoretically complete combustion of gasoline is achieved and greenhouse gas emissions would be minimal. The Lambda coefficient is defined as λ = (actual air–fuel ratio) / (stoichiometric air–fuel ratio).
If Lambda > 1: lean mixture, excess of air.
If Lambda < 1: rich mixture, excess of gasoline.
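To make the ratio concrete, the hedged sketch below computes λ from measured air and fuel masses; the function name and the sample masses are illustrative, and the 14.7:1 stoichiometric ratio is the one quoted above.

```python
# Illustrative sketch: computing the Lambda coefficient from measured masses.
# Assumes the 14.7:1 stoichiometric air-fuel ratio for gasoline quoted above.

STOICHIOMETRIC_AFR = 14.7  # kg of air per kg of gasoline

def lambda_coefficient(air_kg: float, fuel_kg: float) -> float:
    """Return lambda = actual air-fuel ratio / stoichiometric air-fuel ratio."""
    actual_afr = air_kg / fuel_kg
    return actual_afr / STOICHIOMETRIC_AFR

# Example: 15.5 kg of air burned with 1 kg of fuel -> lean mixture (lambda > 1)
print(round(lambda_coefficient(15.5, 1.0), 3))  # ~1.054
```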
A lean mixture contains an excess of oxygen. The surplus oxygen will react with nitrogen to form NOx (oxides of nitrogen) if the temperature is |
https://en.wikipedia.org/wiki/Marketing%20engineering | Marketing engineering is currently defined as "a systematic approach to harness data and knowledge to drive effective marketing decision making and implementation through a technology-enabled and model-supported decision process".
History
The term marketing engineering can be traced back to Lilien et al. in "The Age of Marketing Engineering" published in 1998; in this article the authors define marketing engineering as the use of computer decision models for making marketing decisions. Marketing managers typically use "conceptual marketing", that is they develop a mental model of the decision situation based on past experience, intuition and reasoning. That approach has its limitations though: experience is unique to every individual, there is no objective way of choosing between the best judgments of multiple individuals in such a situation and furthermore judgment can be influenced by the person's position in the firm's hierarchy. In the same year Lilien G. L. and A. Rangaswamy published Marketing Engineering: Computer-Assisted Marketing Analysis and Planning, Fildes and Ventura praised the book in their review, while noting that a fuller discussion of market share models and econometric models would have made the book better for teaching and that "conceptual marketing" should not be discarded in the presence of marketing engineering, but that both approaches should be used together. Leeflang and Wittink (2000) have identified five eras of model building in marketing:
(1950-1965) The first era of application of operations research and management science to marketing
(1965-1970) Adaptation of models to fit marketing problems
(1970-1985) Emphasis on models that are an acceptable representation of reality and are easy to use
(1985-2000) Increased interest in marketing decision support systems, meta-analyses and studies of the generalizability of results
(2000–present) Growth of new exchange systems (e.g., e-commerce) and need for new modeling approaches
How to build |
https://en.wikipedia.org/wiki/Western%20blot%20normalization | Normalization of Western blot data is an analytical step that is performed to compare the relative abundance of a specific protein across the lanes of a blot or gel under diverse experimental treatments, or across tissues or developmental stages. The overall goal of normalization is to minimize effects arising from variations in experimental errors, such as inconsistent sample preparation, unequal sample loading across gel lanes, or uneven protein transfer, which can compromise the conclusions that can be obtained from Western blot data. Currently, there are two methods for normalizing Western blot data: (i) housekeeping protein normalization and (ii) total protein normalization.
Procedure
Normalization occurs directly on either the gel or the blotting membrane. First, the stained gel or blot is imaged, a rectangle is drawn around the target protein in each lane, and the signal intensity inside the rectangle is measured. The signal intensity obtained can then be normalized with respect to the signal intensity of the loading internal control detected on the same gel or blot. When using protein stains, the membrane may be incubated with the chosen stain before or after immunodetection, depending on the type of stain.
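As a rough illustration of the arithmetic behind this procedure, the sketch below normalizes hypothetical band intensities against a loading control; all lane values and the choice of GAPDH as control are assumptions for the example only.

```python
# Minimal sketch of loading-control normalization for Western blot densitometry.
# The lane values below are hypothetical; real intensities come from the imaged blot.

target_signal = {"control": 1200.0, "treated": 1800.0}    # band intensity of the protein of interest
loading_signal = {"control": 1000.0, "treated": 1250.0}   # band intensity of the loading control (e.g., GAPDH)

# Normalize each lane's target signal by its loading-control signal
normalized = {lane: target_signal[lane] / loading_signal[lane] for lane in target_signal}
fold_change = normalized["treated"] / normalized["control"]

print(normalized)             # approx {'control': 1.2, 'treated': 1.44}
print(round(fold_change, 2))  # 1.2
```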
Housekeeping protein controls
Housekeeping genes and proteins, including β-Actin, GAPDH, HPRT1, and RPLP1, are often used as internal controls in western blots because they are thought to be expressed constitutively, at the same levels, across experiments. However, recent studies have shown that expression of housekeeping proteins (HKPs) can change across different cell types and biological conditions. Therefore, scientific publishers and funding agencies now require that normalization controls be previously validated for each experiment to ensure reproducibility and accuracy of the results.
Fluorescent antibodies
When using fluorescent antibodies to image proteins in western blots, normalization requires that the user define the upper and lowe |
https://en.wikipedia.org/wiki/Nozzle%20and%20flapper | The nozzle and flapper mechanism is a displacement type detector which converts mechanical movement into a pressure signal by covering the opening of a nozzle with a flat plate called the flapper. This restricts fluid flow through the nozzle and generates a pressure signal.
It is a widely used mechanical means of creating a high gain fluid amplifier. In industrial control systems, they played an important part in the development of pneumatic PID controllers and are still widely used today in pneumatic and hydraulic control and instrumentation systems.
Operating principle
The operating principle makes use of the high gain effect when a flapper plate is placed a small distance from a small pressurized nozzle emitting a fluid.
The example shown is pneumatic. At sub-millimeter distances, a small movement of the flapper plate results in a large change in flow. The nozzle is fed from a chamber which is in turn fed by a restriction, so changes of flow result in changes of chamber pressure. The nozzle diameter must be larger than that of the restriction orifice for the mechanism to work. The high gain of the open loop mechanism can be made linear using a pressure feedback bellows on the flapper to create a force balance system with a linear output. The "live" zero of 0.2 bar or 3 psi is set by the bias spring, which ensures that the device is working in its linear region.
The industry standard ranges of either 3–15 psi (USA) or 0.2–1.0 bar (metric) are normally used in pneumatic PID controllers, valve positioning servomechanisms and force balance transducers.
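As an aside, the "live zero" of these standard ranges can be illustrated with a simple linear conversion from signal pressure to percent of span; the function below is a hypothetical sketch, not part of any controller implementation.

```python
# Toy illustration of the "live zero": mapping a pneumatic signal in the standard
# 0.2-1.0 bar (3-15 psi) range to a 0-100% controller signal. Values are examples only.

def signal_to_percent(pressure_bar: float, low: float = 0.2, high: float = 1.0) -> float:
    """Linear conversion of a live-zero pneumatic signal to percent of span."""
    return 100.0 * (pressure_bar - low) / (high - low)

print(signal_to_percent(0.2))  # 0.0   (the live zero)
print(signal_to_percent(0.6))  # 50.0
print(signal_to_percent(1.0))  # 100.0
```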
Application
The nozzle and flapper in pneumatic controls is a simple low maintenance device which operates well in a harsh industrial environment, and does not present an explosion risk in hazardous atmospheres. They were the standard industrial controller amplifier for many decades until the advent of practical and reliable electronic high gain amplifiers. However they are still used extensively for field devices such as control valve |
https://en.wikipedia.org/wiki/Bounded%20arithmetic | Bounded arithmetic is a collective name for a family of weak subtheories of Peano arithmetic. Such theories are typically obtained by requiring that quantifiers be bounded in the induction axiom or equivalent postulates (a bounded quantifier is of the form ∀x ≤ t or ∃x ≤ t, where t is a term not containing x). The main purpose is to characterize one or another class of computational complexity in the sense that a function is provably total if and only if it belongs to a given complexity class. Further, theories of bounded arithmetic present uniform counterparts to standard propositional proof systems such as Frege system and are, in particular, useful for constructing polynomial-size proofs in these systems. The characterization of standard complexity classes and correspondence to propositional proof systems allows to interpret theories of bounded arithmetic as formal systems capturing various levels of feasible reasoning (see below).
The approach was initiated by Rohit Jivanlal Parikh in 1971, and later developed by Samuel R. Buss and a number of other logicians.
Theories
Cook's equational theory
Stephen Cook introduced an equational theory PV (for Polynomially Verifiable) formalizing feasibly constructive proofs (resp. polynomial-time reasoning). The language of PV consists of function symbols for all polynomial-time algorithms, introduced inductively using Cobham's characterization of polynomial-time functions. Axioms and derivations of the theory are introduced simultaneously with the symbols from the language. The theory is equational, i.e. its statements assert only that two terms are equal. A popular extension of PV is the theory PV1, an ordinary first-order theory. Axioms of PV1 are universal sentences and contain all equations provable in PV. In addition, PV1 contains axioms replacing the induction axioms for open formulas.
Buss's first-order theories
Samuel Buss introduced the first-order theories of bounded arithmetic S^i_2 and T^i_2. These are first-order theories with equality in the lan |
https://en.wikipedia.org/wiki/Mine%20survey | Mine surveying is the practice of determining the relative positions of points on or beneath the surface of the earth by direct or indirect measurements of distance, direction & elevation.
International and National Institutions
International Society for Mine Surveying (ISM)
Australian Institute of Mine Surveyors (AIMS)
Czech Society of Mine Surveyors and Geologists (SDMG)
German Mine Surveying Association (DMV e.V)
Polish Mine Surveying Committee (PK-ISM)
Institute of Mine Surveyors of South Africa (IMSSA)
See also
Surveying
Land subsidence
Geological survey
Spatial sciences
References
Mining engineering
Civil engineering |
https://en.wikipedia.org/wiki/Ecofiction | Ecofiction (also "eco-fiction" or "eco fiction") is the branch of literature that encompasses nature or environment-oriented works of fiction. While this super genre's roots are seen in classic, pastoral, magical realism, animal metamorphoses, science fiction, and other genres, the term ecofiction did not become popular until the 1960s when various movements created the platform for an explosion of environmental and nature literature, which also inspired ecocriticism. Ecocriticism is the study of literature and the environment from an interdisciplinary point of view, where literature scholars analyze texts that illustrate environmental concerns and examine the various ways literature treats the subject of nature. Environmentalists have claimed that the human relationship with the ecosystem often went unremarked in earlier literature.
According to Jim Dwyer, author of Where the Wild Books Are: A Field Guide to Ecofiction, "My criteria for determining whether a given work is ecofiction closely parallel Lawrence Buell's":
The nonhuman environment is present not merely as a framing device but as a presence that begins to suggest that human history is implicated in natural history.
The human interest is not understood to be the only legitimate interest.
Human accountability to the environment is part of the text's ethical orientation.
Some sense of the environment as a process rather than as a constant or a given is at least implicit in the text.'
Definitions and explanations
"The terms 'environmental fiction,' 'green fiction,' and 'nature-oriented fiction,' might better be considered as categories of ecofiction....[Ecofiction] deals with environmental issues or the relation between humanity and the physical environment, that contrasts traditional and industrial cosmologies, or in which nature or the land has a prominent role…[It is] made up of many styles, primarily modernism, postmodernism, realism, and magical realism, and can be found in many genres, primar |
https://en.wikipedia.org/wiki/Monopoly%20%281985%20video%20game%29 | Monopoly is a 1985 multi-platform video game based on the board game Monopoly, released on the Amiga, Amstrad CPC, BBC Micro, Commodore 64, MS-DOS, MSX, Tatung Einstein, Thomson MO, Thomson TO, and ZX Spectrum. Published by Leisure Genius, this title was one of many inspired by the property.
Gameplay
The game contains very similar gameplay to the board game it is based on, with various physical tasks being replaced by automation and digital representations.
Critical reception
Computer Shopper praised the game for its graphics and animation, and deemed it "excellent value". Your Spectrum thought the game was an "excellent conversion" of the board game, while Sinclair User wrote that the game was "very boring".
In 1990, M. Evan Brooks reviewed the computer editions of Risk, Monopoly, Scrabble, and Clue for Computer Gaming World, and stated that "Monopoly has been released in numerous shareware and public domain versions which thereby weaken its standing."
When the game was released for the Amiga in 1991, Amiga Power deemed it a "sound conversion" albeit more expensive than its source material, while another from the same publication said it was competent but "arguably quite pointless".
Reviews
Jeux & Stratégie #43
References
External links
Monopoly at MobyGames
ASM review
Review in RUN Magazine
1985 video games
Amiga games
Amstrad CPC games
Commodore 64 games
DOS games
Leisure Genius games
Monopoly video games
MSX games
Tatung Einstein games
Thomson MO games
Thomson TO games
Video games developed in the United Kingdom
ZX Spectrum games |
https://en.wikipedia.org/wiki/Chaos%20machine | In mathematics, a chaos machine is a class of algorithms constructed on the basis of chaos theory (mainly deterministic chaos) to produce a pseudo-random oracle. It represents the idea of creating a universal scheme with modular design and customizable parameters, which can be applied wherever randomness and sensitivity are needed.
The theoretical model was published in early 2015 by Maciej A. Czyzewski. It was designed specifically to combine the benefits of hash functions and pseudo-random functions. However, it can be used to implement many cryptographic primitives, including cryptographic hashes, message authentication codes and randomness extractors.
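For intuition only, the toy sketch below harvests pseudo-random bytes from the chaotic logistic map; it is not Czyzewski's construction, and the seed handling and map parameter are arbitrary assumptions, but it illustrates the general idea of driving a pseudo-random output with a deterministic chaotic system.

```python
# Toy illustration only: a pseudo-random byte stream driven by the chaotic logistic map.
# This is NOT the chaos machine construction from the paper; it merely shows the idea of
# harvesting bits from a deterministic chaotic system seeded by input data.

def chaotic_bytes(seed: float, n: int, r: float = 3.99) -> bytes:
    """Iterate x -> r*x*(1-x) and emit one byte per iteration."""
    x = seed % 1.0 or 0.5          # keep the state strictly inside (0, 1)
    out = bytearray()
    for _ in range(n):
        x = r * x * (1.0 - x)
        out.append(int(x * 256) % 256)
    return bytes(out)

print(chaotic_bytes(0.123456, 8).hex())
```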
See also
Merkle–Damgård construction
Sponge function
External links
Libchaos - implemented chaos machines
Official paper published at IACR
References
Theory of cryptography
Chaos theory |
https://en.wikipedia.org/wiki/History%20of%20Microsoft%20SQL%20Server | The history of Microsoft SQL Server begins with the first Microsoft SQL Server database product – SQL Server v1.0, a 16-bit relational database for the OS/2 operating system, released in 1989.
Versions
Detailed history
Genesis
On June 12, 1988, Microsoft joined Ashton-Tate and Sybase to create a variant of Sybase SQL Server for IBM OS/2 (then developed jointly with Microsoft), which was released the following year. This was the first version of Microsoft SQL Server, and served as Microsoft's entry to the enterprise-level database market, competing against Oracle, IBM, Informix, Ingres and later, Sybase. SQL Server 4.2 was shipped in 1992, bundled with OS/2 version 1.3, followed by version 4.21 for Windows NT, released alongside Windows NT 3.1. SQL Server 6.0 was the first version designed for NT, and did not include any direction from Sybase.
About the time Windows NT was released in July 1993, Sybase and Microsoft parted ways and each pursued its own design and marketing schemes. Microsoft negotiated exclusive rights to all versions of SQL Server written for Microsoft operating systems. (In 1996 Sybase changed the name of its product to Adaptive Server Enterprise to avoid confusion with Microsoft SQL Server.) Until 1994, Microsoft's SQL Server carried three Sybase copyright notices as an indication of its origin.
SQL Server 7.0
SQL Server 7.0 was a major rewrite (using C++) of the older Sybase engine, which was coded in C. Data pages were enlarged from 2k bytes to 8k bytes. Extents thereby grew from 16k bytes to 64k bytes. User Mode Scheduling (UMS) was introduced to handle SQL Server threads better than Windows preemptive multi-threading, also adding support for fibers (lightweight threads, introduced in NT 4.0, which are used to avoid context switching). SQL Server 7.0 also introduced a multi-dimensional database product called SQL OLAP Services (which became Analysis Services in SQL Server 2000).
SQL Server 7.0 would be the last version to run on the DEC A |
https://en.wikipedia.org/wiki/Stanza%20%28computing%29 | In computing, a stanza consists of a related group of lines in a script or configuration file.
Formats depend on context.
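For example, one common stanza format is an INI-style section, where a header line groups its related settings; the section name and keys below are invented for illustration.

```python
# Hedged example: parsing a single INI-style stanza (a header plus its related lines).
# The [network] section and its keys are made up for illustration.
import configparser

config_text = """
; a single stanza: a section header followed by its related lines
[network]
interface = eth0
address = 192.168.1.10
netmask = 255.255.255.0
"""

parser = configparser.ConfigParser()
parser.read_string(config_text)
print(dict(parser["network"]))  # {'interface': 'eth0', 'address': '192.168.1.10', 'netmask': '255.255.255.0'}
```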
See also
XML stanza
References
Computer programming |
https://en.wikipedia.org/wiki/Distal%20promoter | Distal promoter elements are regulatory DNA sequences that can be many kilobases distant from the gene that they regulate.
They can either be enhancers (increasing expression) or silencers (decreasing expression). They act by binding activator or repressor proteins (transcription factors) and the intervening DNA bends such that the bound proteins contact the core promoter and RNA polymerase.
References
Genetics |
https://en.wikipedia.org/wiki/Institution%20of%20Mining%20Engineers | The Institution of Mining Engineers (IMinE) was a former British professional institution.
History
It began as the Federated Institution of Mining Engineers in 1889, comprising the Chesterfield and Midland Counties Institution of Engineers; Midland Institute of Mining, Civil and Mechanical Engineers; North of England Institute of Mining and Mechanical Engineers; South Staffordshire and East Worcestershire Institute of Mining Engineers and later the North Staffordshire Institute of Mining and Mechanical Engineers, the Mining Institute of Scotland and the Manchester Geological and Mining Society. It was given a Royal Charter in 1915. In the early 1980s it became affiliated with Group Four of the Engineering Council; at the time there were fifty-one engineering organisations affiliated to the Engineering Council.
Mergers
It merged with the National Association of Colliery Managers, effective from 23 October 1968. In 1995 it merged with the Institution of Mining Electrical and Mining Mechanical Engineers. Soon after, discussions took place about a merger with the Institution of Mining and Metallurgy (founded in 1892), which was completed in 2002.
Presidents
1889-90 John Marley
1890-92 Thomas William Embleton
1900 Henry Copson Peake
c.1900 - Wallace Thorneycroft
1923 John Brass
Structure
It was headquartered at Cleveland House on City Road in London.
Function
Fellows of the institution took the initials FIMinE.
Awards
It awarded the Medal of the Institution of Mining Engineers.
See also
List of engineering societies
North of England Institute of Mining and Mechanical Engineers
References
Further reading
Strong, G.R. A history of the Institution of Mining Engineers. 1989
External links
Mining Institute of Scotland
British mining engineers
Defunct professional associations based in the United Kingdom
Engineering societies based in the United Kingdom
Mining engineering
Mining in the United Kingdom
Mining organizations
Organisations based in the City of London |
https://en.wikipedia.org/wiki/Zcash | Zcash is a privacy-focused cryptocurrency which is based on Bitcoin's codebase. It shares many similarities with bitcoin, such as a fixed total supply of 21 million units.
Transactions can be transparent, similar to bitcoin transactions, or they can be shielded transactions which use a type of zero-knowledge proof to provide anonymity in transactions. Zcash coins are either in a transparent pool or a shielded pool.
Zcash offers private transactors the option of "selective disclosure", allowing a user to prove payment for auditing purposes. One such reason is to make it easier for private transactors to comply with anti-money laundering laws and tax regulations.
Use
Zcash transactions can be transparent, similar to bitcoin transactions, in which case they are controlled by a "t-addr", or they can be shielded and are controlled by a "z-addr". A shielded transaction uses a type of zero-knowledge proof, specifically a non-interactive zero-knowledge proof called "zk-SNARK", which provides anonymity to the coin holders in the transaction. Zcash coins are either in a transparent pool or a shielded pool. As of December 2017, only around 4% of Zcash coins were in the shielded pool, and at that time most cryptocurrency wallet programs did not support z-addrs and no web-based wallets supported them. The shielded pool of Zcash coins was further analyzed for security, and it was found that the anonymity set can be shrunk considerably by heuristics-based identifiable patterns of usage.
While miners receive 80% of a block reward, 20% is given to the "Zcash development fund": 8% to Zcash Open Major Grants, 7% to Electric Coin Co., and 5% to The Zcash Foundation.
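As a worked example of this split (using a hypothetical block reward value, not an official figure):

```python
# Worked example of the percentage split described above.
# The block reward of 3.125 ZEC is a hypothetical illustrative value.
block_reward = 3.125  # ZEC

split = {
    "miners": 0.80,
    "Zcash Open Major Grants": 0.08,
    "Electric Coin Co.": 0.07,
    "Zcash Foundation": 0.05,
}

for recipient, share in split.items():
    print(f"{recipient}: {block_reward * share:.5f} ZEC")
# miners receive 2.50000 ZEC; the remaining 0.625 ZEC goes to the development fund
```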
History
Development work on Zcash began in 2013 by Johns Hopkins professor Matthew Green and some of his graduate students. The development was completed by the for-profit Zcash Company, led by Zooko Wilcox, a Colorado-based computer security specialist and cypherpunk. In October 2016, The Zcash Company raised over $3 million fr |
https://en.wikipedia.org/wiki/Drehmann%20sign | The Drehmann sign describes a clinical test of examining orthopedic patients and is widely used in the functional check of the hip joint.
It was first described by Gustav Drehmann (Breslau, 1869–1932).
The Drehmann sign is positive if an unavoidable passive external rotation of the hip occurs when performing a hip flexion. In addition, an internal rotation of the respective hip joint is either not possible or accompanied by pain when forcefully induced.
The positive Drehmann sign is a typical clinical feature in slipped capital femoral epiphysis (SCFE), the impingement syndrome of the acetabulum-hip, or in osteoarthritis of the hip joint.
References
Medical diagnosis
Orthopedics |
https://en.wikipedia.org/wiki/Nest%20Wifi | Nest Wifi, its predecessor the Google Wifi, and the Nest Wifi's successor, the Nest Wifi Pro, are a line of mesh-capable wireless routers and add-on points developed by Google as part of the Google Nest family of products. The first generation was announced on October 4, 2016, and released in the United States on December 5, 2016. The second generation, distinct in being released as two separate offerings, a "router" and "point", were announced at the Pixel 4 hardware event on October 15, 2019, and was released in the United States on November 4, 2019. The third generation was announced on October 4, 2022, two days prior to the Pixel 7 Fall 2022 event. This generation returned to a single model, doing away with the "router/point" variants, and was released in the United States on October 27, 2022.
The Nest Wifi aims to provide enhanced Wi-Fi coverage through the setup of multiple Nest Wifi devices in a home. Nest Wifi automatically switches between access points depending on signal strength.
History
First generation
Android Police reported in September 2016 that Google was preparing to introduce a mesh-capable wireless router with enhanced range, along with its October 4 date of announcement and US$129 price point. Google Wifi was officially announced on October 4, 2016, with expected availability in the United States in December. The device became available in the United States on December 5, 2016, in the United Kingdom on April 6, 2017, in Canada on April 28, 2017, in France and Germany on June 26, 2017, in Australia on July 20, 2017, in Hong Kong and Singapore on August 30, 2017, and in Philippines on June 26, 2018.
The first generation Google Wifi features 802.11ac connectivity with 2.4 GHz and 5 GHz channels, 2x2 antennas, and support for beamforming. It has two gigabit Ethernet ports, and contains a quad-core processor with 512 MB RAM and 4 GB flash memory. Wi-Fi access can be controlled through a companion mobile app.
In 2020, Google relaunched the firs |
https://en.wikipedia.org/wiki/Feng%27s%20classification | Tse-yun Feng suggested the use of degree of parallelism to classify various computer architectures. It is based on sequential and parallel operations at a bit and word level.
About degree of parallelism
Maximum degree of parallelism
The maximum number of binary digits that can be processed within a unit time by a computer system is called the maximum parallelism degree P; that is, if a processor is processing P bits in unit time, then P is the maximum degree of parallelism.
Average degree of parallelism
Let i = 1, 2, 3, ..., T be the different timing instants and P1, P2, ..., PT be the corresponding bits processed.
Then, the average degree of parallelism is Pa = (P1 + P2 + ... + PT) / T.
Processor utilization
Processor utilization is defined as μ = Pa / P, the ratio of the average to the maximum degree of parallelism.
The maximum degree of parallelism depends on the structure of the arithmetic and logic unit. Higher degree of parallelism indicates a highly parallel ALU or processing element. Average parallelism depends on both the hardware and the software. Higher average parallelism can be achieved through concurrent programs.
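A small numeric illustration of the two definitions above; the per-instant bit counts and the maximum parallelism are made-up values.

```python
# Numeric illustration of average parallelism and utilization as defined above.
# The per-instant bit counts P_i and the maximum parallelism P are made-up values.

P = 32                                      # maximum degree of parallelism (bits per unit time)
bits_per_instant = [32, 16, 32, 8, 32, 24]  # P_i observed over T = 6 timing instants

average_parallelism = sum(bits_per_instant) / len(bits_per_instant)  # Pa = (sum of P_i) / T
utilization = average_parallelism / P                                 # mu = Pa / P

print(average_parallelism)  # 24.0
print(utilization)          # 0.75
```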
Types of classification
According to Feng's classification, computer architecture can be classified into four types. The classification is based on the way contents stored in memory are processed. The contents can be either data or instructions.
Word serial bit serial (WSBS)
Word serial bit parallel (WSBP)
Word parallel bit serial (WPBS)
Word parallel bit parallel (WPBP)
Word serial bit serial (WSBS)
One bit of one selected word is processed at a time. This represents serial processing and needs maximum processing time.
Word serial bit parallel (WSBP)
It is found in most existing computers and has been called "word slice" processing because one word of n bits is processed at a time. All bits of a selected word are processed together; bit parallel means all bits of a word.
Word parallel bit serial (WPBS)
It has been called bit slice processing because an m-bit slice is processed at a time. Word parallel signifies selection of all words. It can be considered as one bit |
https://en.wikipedia.org/wiki/Bura%20Irrigation%20and%20Settlement%20Project%20%28Kenya%29 | In 1977 the Board of Governors of the World Bank approved Bura Irrigation and Settlement Project (BISP) in Kenya. The project area is situated just South of the Equator in the Lower Tana Basin. It lies on the west bank of the Tana River and falls within the administrative area of Tana River County.
The project was an ambitious attempt by the government of Kenya, the World Bank and a few other donors to develop a remote area, create employment for thousands of people with a reasonable income, and earn foreign exchange. The Bura project would develop about 6,700 net irrigated hectares over a 5-year period and settle about 5,150 landless poor families, selected from all parts of Kenya, on smallholdings. Physical and social infrastructure would be provided to support the settler and satellite population, expected to reach a total of 65,000 persons by 1985. The total cost of the project was estimated at 92 million dollars in 1977 prices.
Actual construction started in 1978. During implementation the costs exploded from $17,500/= per family to $55,000/=, a new record for the World Bank. The largest cost increase was for the irrigation network (615%). The donors were not willing and the government was not able to raise the additional funds, and subsequently the size of the project was scaled down from 6,700 ha to 3,900 and later to 2,500 ha, although the irrigation structures were completed for 6,700 ha. In this period 2,100 landless households from all over Kenya were settled in the scheme. They were allocated two plots of 0.625 ha and a garden of 0.05 ha. Each year they were to grow 1.25 ha of cotton and 0.625 ha of maize intercropped with cowpeas.
Soon it was evident that the project would fail to achieve its objectives. Job creation was only 40% of the target, the economic rate of return was negative and the annual operating and maintenance costs exceeded the benefits. Even with net farm incomes of about 40% of the appraisal estimates in real terms, annual government subsidies amounted to |
https://en.wikipedia.org/wiki/Nigel%20Scrutton | Nigel Shaun Scrutton (born 2 April 1964) is a British biochemist and biotechnology innovator known for his work on enzyme catalysis, biophysics and synthetic biology. He is Director of the UK Future Biomanufacturing Research Hub, Director of the Fine and Speciality Chemicals Synthetic Biology Research Centre (SYNBIOCHEM), and Co-founder, Director and Chief Scientific Officer of the 'fuels-from-biology' company C3 Biotechnologies Ltd. He is Professor of Enzymology and Biophysical Chemistry in the Department of Chemistry at the University of Manchester. He is former Director of the Manchester Institute of Biotechnology (MIB) (2010 to 2020).
Early life and education
Scrutton was born in Batley, West Riding of Yorkshire and was brought up in Cleckheaton where he went to Whitcliffe Mount School. Scrutton graduated from King's College London with a first class Bachelor of Science degree in Biochemistry in 1985. He was a Benefactors' Scholar at St John's College, Cambridge where he completed his doctoral research (PhD) in 1988 supervised by Richard Perham. He was a Research Fellow of St John's College, Cambridge (1989–92) and a Fellow / Director of Studies at Churchill College, Cambridge (1992–95). He was awarded a Doctor of Science (ScD) degree in 2003 by the University of Cambridge.
Career and research
Following his PhD, Scrutton was appointed as Lecturer (1995), then Reader (1997) and Professor (1999) at the University of Leicester before being appointed Professor at the University of Manchester in 2005. He has held successive research fellowships over 29 years from the Royal Commission for the Exhibition of 1851 (1851 Research Fellowship), St John's College, Cambridge, the Royal Society (Royal Society University Research Fellow and Royal Society Wolfson Research Merit Award), the Lister Institute of Preventive Medicine, the Biotechnology and Biological Sciences Research Council (BBSRC) and the Engineering and Physical Sciences Research Council (EPSRC). He has been V |
https://en.wikipedia.org/wiki/Star%20Trek%3A%20Bridge%20Crew | Star Trek: Bridge Crew is a virtual-reality action-adventure video game developed by Red Storm Entertainment and published by Ubisoft for Microsoft Windows, PlayStation 4, and Oculus Quest.
Plot
Star Trek: Bridge Crew takes place in the timeline established in the 2009 Star Trek film and sees the Starfleet ship USS Aegis searching for a new homeworld for the Vulcans after the destruction of their planet. The ship heads for a region of space called 'The Trench', which is being occupied by Klingons.
Gameplay
The game is played through four roles: captain, tactical officer, engineer and helm officer. The captain is the only role to which mission objectives are directly displayed; they are responsible for communicating these to the crew and issuing orders to accomplish them. The helm officer controls the ship's course and travel between regions through impulse or warp drive. The tactical officer is in charge of sensors and weapons. The engineer manages the ship's power distribution and supervises repairs. Each role except the captain may be occupied by a human player or by an NPC indirectly controlled by the captain. Both story and randomly generated missions exist.
In December 2017, the game developers modified the game so that it can be played without a virtual-reality headset. Prior to that, the game could only be played using a headset.
Development
It was developed by Red Storm Entertainment and published by Ubisoft. Series actors Karl Urban, LeVar Burton and Jeri Ryan appeared at E3 2016 to promote the game during Ubisoft's press conference. A new trailer was showcased at CES 2017. The game was released on May 30, 2017.
Reception
Star Trek: Bridge Crew received "generally positive" reviews, according to review aggregator Metacritic. Eurogamer ranked it 42nd on their list of the "Top 50 Games of 2017", while GamesRadar+ ranked it 25th on their list of the 25 Best Games of 2017.
Many reviews compared it to Artemis: Spaceship Bridge Simulator, an indie game ins |
https://en.wikipedia.org/wiki/Return%20on%20modeling%20effort | Return on modelling effort (ROME) is the benefit resulting from a (supplementary) effort to create and / or improve a model.
Purpose
In engineering, modelling always serves a particular goal. For example, the lightning protection of aircraft can be modelled as an electrical circuit, in order to predict whether the protection will still work in 30 years, given the ageing of its electrical components. More and more effort can be put into making this model predict reality perfectly. However, this perfection comes at a price: researchers invest time and money in improving the model. Analogous to a return on investment (ROI), ROME is a metric for the usefulness of further modelling. It may therefore serve as a 'stopping criterion'.
Typically, researchers will pull towards continuing modelling, while management will pull towards stopping modelling. Being explicit about the cost and benefits of continued modeling may help to make informed decisions that are understood by both sides. Continuous communication between model developers and model users increases the probability of models being actually put to profit.
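A hedged sketch of ROME used as a stopping criterion follows; the iteration costs and benefits are invented numbers, and the rule of stopping when the marginal benefit falls below the marginal cost is one possible reading of the text above, not a prescribed procedure.

```python
# Hedged sketch: treating ROME as a stopping criterion. Each entry is a hypothetical
# modelling iteration with its extra cost and the extra benefit it is expected to deliver.

iterations = [
    {"extra_cost": 10_000, "extra_benefit": 60_000},
    {"extra_cost": 10_000, "extra_benefit": 25_000},
    {"extra_cost": 10_000, "extra_benefit": 8_000},   # marginal benefit drops below cost
]

for i, step in enumerate(iterations, start=1):
    rome = step["extra_benefit"] / step["extra_cost"]
    print(f"iteration {i}: marginal ROME = {rome:.2f}")
    if rome < 1.0:
        print("stop: further modelling effort no longer pays for itself")
        break
```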
Domains
ROME is a metric, which can be evaluated wherever modelling is performed with a quantifiable goal. Examples include:
Modeling the impact of federal policy on social problems.
Modeling a marketing mix to statistically correlate a number of inputs (or independent variables) – such as a marketing campaign – to outcomes (or dependent variables) – such as sales or profits.
Modeling the links between enterprise actors to make an informed choice on splitting organizations.
Modeling the coupling of an electromagnetic interference to a PCB to reduce its susceptibility by improving the routing of traces.
Research
The initiative "Models at Work" studies the creation, management and use of domain models in scientific and industrial practice, aiming at a diversity of goals, varying from (as truthful as possible) representation of the conceptual structure of the domain that |
https://en.wikipedia.org/wiki/Multi-stage%20game | In game theory, a multi-stage game is a sequence of several simultaneous games played one after the other. This is a generalization of a repeated game: a repeated game is a special case of a multi-stage game, in which the stage games are identical.
Multi-Stage Game with Different Information Sets
As an example, consider a two-stage game in which the stage game in Figure 1 is played in each of two periods:
The payoff to each player is the simple sum of the payoffs of both games.
Players cannot observe the action of the other player within a round; however, at the beginning of Round 2, Player 2 finds out about Player 1's action in Round 1, while Player 1 does not find out about Player 2's action in Round 1.
For Player 1, there are strategies.
For Player 2, there are strategies.
The extensive form of this multi-stage game is shown in Figure 2:
In this game, the only Nash Equilibrium in each stage is (B, b).
(BB, bb) will be the Nash Equilibrium for the entire game.
Multi-Stage Game with Changing Payoffs
In this example, consider a two-stage game in which the stage game in Figure 3 is played in the first period and the game in Figure 4 is played in the second:
The payoff to each player is the simple sum of the payoffs of both games.
Players cannot observe the action of the other player within a round; however, at the beginning of Round 2, both players find out about the other's action in Round 1.
For Player 1, there are strategies.
For Player 2, there are strategies.
The extensive form of this multi-stage game is shown in Figure 5:
Each of the two stages has two Nash Equilibria: (A, a) and (B, b) in the first stage, and (X, x) and (Y, y) in the second.
If the complete contingent strategies of the two players prescribe one of the stage-game Nash Equilibria after every history (e.g. AXXXX and axxxx), the profile is a Nash Equilibrium of the entire game. There are 32 such combinations in this multi-stage game. Additionally, all of these equilibria are subgame-perfect.
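Since the figures are not reproduced here, the sketch below uses a made-up prisoner's-dilemma-like payoff table to show how a unique stage-game Nash equilibrium such as (B, b) can be found by checking unilateral deviations.

```python
# Illustrative helper: find pure-strategy Nash equilibria of a one-shot 2x2 stage game
# by checking that neither player gains from a unilateral deviation.
# The payoff numbers are made up; the original figures are not reproduced here.

payoffs = {  # (row action, column action) -> (row payoff, column payoff)
    ("A", "a"): (3, 3), ("A", "b"): (0, 4),
    ("B", "a"): (4, 0), ("B", "b"): (1, 1),
}

def pure_nash(payoffs):
    rows = sorted({r for r, _ in payoffs})
    cols = sorted({c for _, c in payoffs})
    equilibria = []
    for r in rows:
        for c in cols:
            best_row = all(payoffs[(r, c)][0] >= payoffs[(r2, c)][0] for r2 in rows)
            best_col = all(payoffs[(r, c)][1] >= payoffs[(r, c2)][1] for c2 in cols)
            if best_row and best_col:
                equilibria.append((r, c))
    return equilibria

print(pure_nash(payoffs))  # [('B', 'b')] for this prisoner's-dilemma-like stage game
```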
References
Game theory game classes |
https://en.wikipedia.org/wiki/Cirrus%20cloud%20thinning | Cirrus cloud thinning (CCT) is a proposed form of climate engineering. Cirrus clouds are high, cold ice clouds that, like other clouds, both reflect sunlight and absorb warming infrared radiation. However, they differ from other types of clouds in that, on average, infrared absorption outweighs sunlight reflection, resulting in a net warming effect on the climate. Therefore, thinning or removing these clouds would reduce their heat-trapping capacity, resulting in a cooling effect on Earth's climate. This could be a potential tool to reduce anthropogenic global warming. Cirrus cloud thinning is an alternative category of climate engineering, in addition to solar radiation management and greenhouse gas removal.
In 2021 the IPCC described CCT as a proposal "to reduce the amount of cirrus clouds by injecting ice nucleating substances in the upper troposphere." However it reported low confidence in the cooling effect of CCT, due to limited understanding of cirrus microphysics, its interaction with aerosols, and the complexity of seeding strategy. CCT may also increase global precipitation.
Basic principles
Typical cirrus clouds may be susceptible to modification to reduce their lifetime and optical thickness, and hence their net positive radiative forcing (in contrast to the typical low, warm liquid clouds). Material to seed such modification could be delivered via drones or by aircraft. Scientists believe that cirrus clouds in the high latitude upper troposphere are formed by homogeneous freezing, resulting in large numbers of small ice crystals. If effective ice nuclei were introduced into this environment, the cirrus may instead form by heterogeneous freezing. If the concentration of ice nuclei is seeded such that the resulting cloud particle density is less than that for the natural case, the cloud particles should grow larger due to less water vapor competition and attain higher settling velocities. By seeding with aerosols, ice crystals could grow rapidly and deplete wa |
https://en.wikipedia.org/wiki/Mich%C3%A8le%20Marcotte | Michèle Marcotte is an American food scientist who is a pioneer in food processing research. As a federal scientist with Agriculture and Agri-Food Canada (AAFC), she created a new method of fruit dehydration, known as osmotic dehydration, which can also be applied to vegetables, meat, or fish. Collaboration with private industry led to the design, development, installation, and start-up of a custom-built dried-cranberry production line that is unique in the world. Her research was recognized with several awards, including one from the Canadian Society for Bioengineering. Marcotte is currently the Director of the Ottawa Research and Development Centre located at the Central Experimental Farm in Canada.
Biography
Marcotte completed a bachelor's degree in chemical engineering from Laval University in 1985. After completing her master's degree in food engineering at the University of Alberta in 1988, she began her career with Agriculture and Agri-Food Canada (AAFC) as a professional engineer at the Food Research and Development Centre located in St. Hyacinthe, Quebec. In 1999, Marcotte obtained her Ph.D. in food processing from McGill University where she studied the ohmic heating process of viscous liquids.
Career
Marcotte has worked at Agriculture and Agri-Food Canada as a section head of food preservation technologies, a research scientist in Food Processing and Engineering, and an advisor to the Director General of the Food Safety and Quality National Science Program. Marcotte's research interests included:
The development of tools for the characterization of thermophysical properties;
The use of modeling and optimization technique in order to describe the evolution of product safety and quality upon processing; and
The development and optimization of novel thermal processes and thermal dehydration processes.
Under the Federal Partners in Technology Transfer (FPTT) program, Marcotte and her team at the Development Centre partnered with Atoka Cranberries to produce |
https://en.wikipedia.org/wiki/Homological%20stability | In mathematics, homological stability is any of a number of theorems asserting that the group homology of a sequence of groups is stable, i.e.,
the homology group H_i(G_n) is independent of n when n is large enough (depending on i). The smallest n such that the maps H_i(G_n) → H_i(G_{n+1}) are isomorphisms is referred to as the stable range.
The concept of homological stability was pioneered by Daniel Quillen whose proof technique has been adapted in various situations.
Examples
Examples of such groups include the following:
Applications
In some cases, the homology of the stable group G_∞
can be computed by other means or is related to other data. For example, the Barratt–Priddy theorem states that the homology of the infinite symmetric group agrees with the homology of certain mapping spaces of spheres. This can also be stated as a relation between the plus construction of BΣ_∞ and the sphere spectrum. In a similar vein, the homology of GL_∞(R) is related, via the +-construction, to the algebraic K-theory of R.
References
Algebraic topology
Algebraic K-theory |
https://en.wikipedia.org/wiki/Coverage%20%28genetics%29 | In genetics, coverage is one of several measures of the depth or completeness of DNA sequencing, and is more specifically expressed in any of the following terms:
Sequence coverage (or depth) is the number of unique reads that include a given nucleotide in the reconstructed sequence. Deep sequencing refers to the general concept of aiming for a high number of unique reads of each region of a sequence.
Physical coverage, the cumulative length of reads or read pairs expressed as a multiple of genome size.
Genomic coverage, the percentage of all base pairs or loci of the genome covered by sequencing.
Sequence coverage
Rationale
Even though the sequencing accuracy for each individual nucleotide is very high, the very large number of nucleotides in the genome means that if an individual genome is only sequenced once, there will be a significant number of sequencing errors. Furthermore, many positions in a genome contain rare single-nucleotide polymorphisms (SNPs). Hence to distinguish between sequencing errors and true SNPs, it is necessary to increase the sequencing accuracy even further by sequencing individual genomes a large number of times.
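As a back-of-the-envelope illustration (the read count, read length and genome size below are hypothetical), mean sequence coverage can be estimated as number of reads × read length / genome size:

```python
# Back-of-the-envelope mean sequence coverage: coverage = N * L / G.
# The read count, read length and genome size are hypothetical example values.

def mean_coverage(num_reads: int, read_length_bp: int, genome_size_bp: int) -> float:
    return num_reads * read_length_bp / genome_size_bp

# e.g. 400 million 150-bp reads against a 3 Gb human-sized genome
print(round(mean_coverage(400_000_000, 150, 3_000_000_000), 1))  # 20.0 (i.e. 20x coverage)
```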
Ultra-deep sequencing
The term "ultra-deep" can sometimes also refer to higher coverage (>100-fold), which allows for detection of sequence variants in mixed populations. In the extreme, error-corrected sequencing approaches such as Maximum-Depth Sequencing can make it so that coverage of a given region approaches the throughput of a sequencing machine, allowing coverages of >10^8.
Transcriptome sequencing
Deep sequencing of transcriptomes, also known as RNA-Seq, provides both the sequence and frequency of RNA molecules that are present at any particular time in a specific cell type, tissue or organ. Counting the number of mRNAs that are encoded by individual genes provides an indicator of protein-coding potential, a major contributor to phenotype. Improving methods for RNA sequencing is an active area of research both in te |
https://en.wikipedia.org/wiki/Data%20exhaust | Data exhaust or exhaust data is the trail of data left by the activities of Internet or other computer system users during their online activity, behavior, and transactions. This is part of a broader category of unconventional data that includes geospatial, network, and time-series data and may be useful for predictive analytics. Every visited website, every clicked link, and even hovering with a mouse is collected, leaving behind a trail of data. An enormous amount of often raw data is created, which can be in the form of cookies, temporary files, logfiles, storable choices, and more. This information can help to improve the online experience, for example through customized content. It can be used to track trends, and studying data exhaust also improves the user interface and the layout design. On the other hand, it can also compromise privacy, as it offers a valuable insight into the user's habits. For example, Google, the world's most popular website, uses this data exhaust to refine the predictive value of its products.
The data that is collected by companies is often information that does not seem immediately useful. Although the information is not used by the company right away, it can be stored for future use or sold to someone else who can use the information. The data can help with quality control, performance, and revenue. Unlike primary content, these data are not purposefully created by the user, who is often unaware of their very existence. A bank for example would consider as primary data information concerning the sums and parties of a transaction, whilst secondary data might include the percentage of transactions carried out at a cash machine instead of a real bank.
Medical exhaust data
Most medical devices emit some form of exhaust data, such as many pacemakers, dialysis machines, and cameras used during surgery. The majority of this data is never captured, and is primarily abandoned after the surgery is completed, or the device m |
https://en.wikipedia.org/wiki/Six-state%20protocol | The six-state protocol (SSP) is a quantum cryptography protocol that is a version of BB84, using a six-state polarization scheme on three orthogonal bases.
Origin
The six-state protocol first appeared in the article "Optimal Eavesdropping in Quantum Cryptography with Six States" by Dagmar Bruss in 1998, and was further studied in "Incoherent and coherent eavesdropping in the six-state protocol of quantum cryptography" by Pasquinucci and Nicolas Gisin in 1999.
Description
"The six-state protocol is a discrete-variable protocol for quantum key distribution that permits tolerating a noisier channel than the BB84 protocol." (2011, Abruzzo). SSP produces a higher rate of errors during attempted eavesdropping, thus making it easier to detect errors, as an eavesdropper must choose the right basis from three possible bases (Haitjema, 2016). High dimensional systems have been proven to provide a higher level of security.
Implementation
The six-state protocol can be implemented without a quantum computer, using only optical technologies. The three conjugate bases used by SSP are shown in Picture 1.
Alice randomly generates a qubit string, encodes each qubit using one of three bases chosen at random, and sends the string of qubits to Bob through the secured quantum channel. The probability of using any one of the bases equals 1/3. After receiving the string of qubits, Bob also randomly chooses one of the three bases for measuring the state of each qubit. Using a classical channel that is insecure but authenticated, Alice and Bob communicate and discard the measurements where Bob used a different basis to measure the state of a qubit than the one Alice used for encoding. The states of the qubits for which the encoding basis matched the measurement basis are used to determine the secret key.
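A toy classical simulation of the sifting step described above is sketched below; it assumes no eavesdropper and no noise, and the basis labels and seed are arbitrary choices, so it illustrates only the bookkeeping, not the quantum physics.

```python
# Toy classical simulation of the sifting step (no eavesdropper, no noise, no real qubits).
# Each party picks one of three bases with probability 1/3; only positions where the bases
# match contribute to the sifted key. Basis labels and the seed are arbitrary assumptions.
import random

def sift(n_qubits: int, seed: int = 7):
    rng = random.Random(seed)
    bases = ("X", "Y", "Z")
    key = []
    for _ in range(n_qubits):
        bit = rng.randint(0, 1)             # Alice's random bit
        alice_basis = rng.choice(bases)     # Alice's encoding basis
        bob_basis = rng.choice(bases)       # Bob's measurement basis
        if alice_basis == bob_basis:        # matching bases: Bob recovers the bit
            key.append(bit)
    return key

key = sift(30)
print(len(key), key)   # roughly 1/3 of the 30 positions survive sifting
```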
See also
SARG04
E91 – quantum cryptographic communication protocol
BB84
References
Cryptographic algorithms
Quantum information science
Quantum cryptography
Quantum cryptography protocols
de:Quantenkryptografie#BB84-Protoko |
https://en.wikipedia.org/wiki/Bottle%20flipping | Bottle flipping was a trend that involved throwing a plastic bottle, typically partially full of liquid, into the air so that it rotates, in an attempt to land it upright on its base or cap. It became an international trend in the summer of 2016 with numerous videos of people attempting the activity being posted online. With its popularity, the repetitive thuds of multiple attempts have been criticized as a distraction and a public nuisance. Parents and teachers have expressed frustration at the practice, resulting in water bottle flipping being banned at several schools around the world, as well as many people calling for the practice to only be performed in private.
History
In 2016, a viral video of teenager Mike Senatore, flipping a water bottle at a talent show at Ardrey Kell High School in Charlotte, North Carolina, popularized the activity.
Description
Water bottle flipping involves taking a plastic water bottle that is partially empty and holding it by the neck of the bottle. Force is applied with a flick, with the bottom of the bottle rotating away from the person. If performed successfully, the bottle will land upright. Alternatively, the bottle may land upside-down, or on its cap. Doing this is significantly more difficult than flipping a bottle so it lands upright. The amount of fluid in the bottle greatly influences the success of the feat, and it has been shown empirically that filling the bottle about one-third of the way improves the rate of success. The type of water bottle also plays a role; for instance, the brand Deer Park Spring Water has been noted to make the task easier due to its unique hourglass shape with a third divot.
The feat is often performed with disposable plastic water bottles due to their availability, but other containers can be used as well. The bottle flip is often combined with the Dab after a successful flip. The complex physics behind the activity incorporates concepts of fluid dynamics, projectile motion, angular moment |
https://en.wikipedia.org/wiki/Spin%20gapless%20semiconductor | Spin gapless semiconductors are a novel class of materials with unique electrical band structure for different spin channels in such a way that there is no band gap (i.e., 'gapless') for one spin channel while there is a finite gap in another spin channel.
In a spin-gapless semiconductor, conduction and valence band edges touch, so that no threshold energy is required to move electrons from occupied (valence) states to empty (conduction) states. This gives spin-gapless semiconductors unique properties: namely that their band structures are extremely sensitive to external influences (e.g., pressure or magnetic field).
Because very little energy is needed to excite electrons in an SGS, charge concentrations are very easily ‘tuneable’. For example, this can be done by introducing a new element (doping) or by application of a magnetic or electric field (gating).
A new type of SGS identified in 2017, known as Dirac-type linear spin-gapless semiconductors, has linear dispersion and is considered an ideal platform for massless and dissipationless spintronics. In these materials, spin-orbit coupling opens a gap in the fully spin-polarized conduction and valence bands, so the interior of the sample becomes an insulator while an electrical current can flow without resistance at the sample edge. This effect, the quantum anomalous Hall effect, had previously been realised only in magnetically doped topological insulators.
As well as Dirac/linear SGSs, the other major category of SGS are parabolic spin gapless semiconductors.
Electron mobility in such materials is two to four orders of magnitude higher than in classical semiconductors.
SGSs are topologically non-trivial.
Prediction and discovery
The spin gapless semiconductor was first proposed as a new spintronics concept and a new class of candidate spintronic materials in 2008 in a paper by Xiaolin Wang of the University of Wollongong in Australia.
Properties and applications
The dependence of bandgap on spin |
https://en.wikipedia.org/wiki/Evo-devo%20gene%20toolkit | The evo-devo gene toolkit is the small subset of genes in an organism's genome whose products control the organism's embryonic development. Toolkit genes are central to the synthesis of molecular genetics, palaeontology, evolution and developmental biology in the science of evolutionary developmental biology (evo-devo). Many of them are ancient and highly conserved among animal phyla.
Toolkit
Toolkit genes are highly conserved among phyla, meaning that they are ancient, dating back to the last common ancestor of bilaterian animals. For example, that ancestor had at least 7 Pax genes for transcription factors.
Differences in deployment of toolkit genes affect the body plan and the number, identity, and pattern of body parts. The majority of toolkit genes are components of signaling pathways and encode for the production of transcription factors, cell adhesion proteins, cell surface receptor proteins (and signalling ligands that bind to them), and secreted morphogens; all of these participate in defining the fate of undifferentiated cells, generating spatial and temporal patterns that, in turn, form the body plan of the organism. Among the most important of the toolkit genes are those of the Hox gene cluster, or complex. Hox genes, transcription factors containing the more broadly distributed homeobox protein-binding DNA motif, function in patterning the body axis. Thus, by combinatorially specifying the identity of particular body regions, Hox genes determine where limbs and other body segments will grow in a developing embryo or larva. A paradigmatic toolkit gene is Pax6/eyeless, which controls eye formation in all animals. It has been found to produce eyes in mice and Drosophila, even if mouse Pax6/eyeless was expressed in Drosophila.
This means that a big part of the morphological evolution undergone by organisms is a product of variation in the genetic toolkit, either by the genes changing their expression pattern or acquiring new functions. A good example of |
https://en.wikipedia.org/wiki/Prince%20%28cipher%29 | Prince is a block cipher targeting low latency, unrolled hardware implementations. It is based on the so-called FX construction. Its most notable feature is the alpha reflection: the decryption is the encryption with a related key which is very cheap to compute. Unlike most other "lightweight" ciphers, it has a small number of rounds and the layers constituting a round have low logic depth. As a result, fully unrolled implementations are able to reach much higher frequencies than AES or PRESENT. According to the authors, for the same time constraints and technologies, PRINCE uses 6–7 times less area than PRESENT-80 and 14–15 times less area than AES-128.
Overview
The block size is 64 bits and the key size is 128 bits. The key is split into two 64-bit keys k0 and k1. The input is XORed with k0, then processed by a core function using k1. The output of the core function is XORed with k0' to produce the final output (k0' is a value derived from k0). The decryption is done by exchanging k0 and k0' and by feeding the core function with k1 XORed with a constant denoted alpha.
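As a rough illustration of this FX-style whitening and the alpha reflection, here is a minimal Python sketch; the core function is left as a placeholder, and the whitening-key derivation and the alpha constant follow the published PRINCE specification as understood here, so treat the details as assumptions rather than a reference implementation:

```python
ALPHA = 0xC0AC29B7C97C50DD  # the PRINCE reflection constant (assumed from the spec)

def derive_k0_prime(k0):
    # assumed derivation: k0' = (k0 rotated right by 1 bit) XOR (k0 shifted right by 63 bits)
    return (((k0 >> 1) | ((k0 & 1) << 63)) ^ (k0 >> 63)) & 0xFFFFFFFFFFFFFFFF

def prince_core(block, k1):
    """Placeholder for the keyed core (rounds, S-layer, linear layer)."""
    raise NotImplementedError

def encrypt(block, k0, k1):
    # FX construction: whiten with k0, run the core under k1, whiten with k0'
    return prince_core(block ^ k0, k1) ^ derive_k0_prime(k0)

def decrypt(block, k0, k1):
    # Alpha reflection: decryption reuses the encryption circuit with the
    # whitening keys swapped and the core keyed with k1 XOR ALPHA.
    return prince_core(block ^ derive_k0_prime(k0), k1 ^ ALPHA) ^ k0
```

The point of the sketch is the structure: decryption needs no separate datapath, only a swap of the whitening keys and a cheap constant addition to the core key.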
The core function contains 5 "forward" rounds, a middle round, and 5 "backward" rounds, for 11 rounds in total. The original paper mentions 12 rounds without explicitly depicting them; if the middle round is counted as two rounds (as it contains two nonlinear layers), then the total number of rounds is 12.
A forward round starts with a round constant XORed with k1, then a nonlinear layer S, and finally a linear layer M. The "backward" rounds are exactly the inverse of the "forward" rounds except for the round constants.
The nonlinear layer is based on a single 4-bit S-box which can be chosen among the affine equivalents of 8 specified S-boxes.
The linear layer consists of multiplication by a 64x64 matrix M' and a shift-row operation similar to the one in AES but operating on 4-bit nibbles rather than bytes.
M' is constructed from 16x16 matrices M0 and M1 in such a way that the multiplication by M' can be computed by four smaller multiplicati |
https://en.wikipedia.org/wiki/Fig%20Tree%20Formation | The Fig Tree Formation, also called Fig Tree Group, is a stromatolite-containing geological formation in South Africa. The rock contains fossils of microscopic life forms about 3.26 billion years old. Identified organisms include the bacterium Eobacterium isolatum and the algae-like Archaeosphaeroides barbertonensis. The fossils in the Fig Tree Formation are considered some of the oldest known organisms on Earth, and provide evidence that life may have existed much earlier than previously thought. The formation is composed of shales, turbiditic greywackes, volcaniclastic sandstones, chert, turbiditic siltstone, conglomerate, breccias, mudstones, and iron-rich shales.
See also
Archean life in the Barberton Greenstone Belt
Warrawoona Group
References
Further reading
Byerly G.R., Lower D.R. & Walsh M.M. (1986). Stromatolites from the 3300–3500-Myr Swaziland Supergroup, Barberton Mountain Land, South Africa. Nature, 319: 489–491.
Geologic formations of South Africa
Archean Africa
Sandstone formations
Shale formations
Conglomerate formations
Siltstone formations
Mudstone formations
Chert
Fossiliferous stratigraphic units of Africa
Paleontology in South Africa
Origin of life |
https://en.wikipedia.org/wiki/Microbial%20cell%20factory | Microbial cell factory is an approach to bioengineering which considers microbial cells as a production facility in which the optimization process largely depends on metabolic engineering. MCFs are a derivation of cell factories, which are engineered microbes and plant cells. In the 1980s and 1990s, MCFs were originally conceived to improve productivity of cellular systems and metabolite yields through strain engineering. An MCF develops native and nonnative metabolites through targeted strain design. In addition, MCFs can shorten the synthesis cycle while reducing the difficulty of product separation.
History
Prior to MCFs, scientists employed traditional engineering techniques to produce various commodities. These methodologies include modifying metabolic pathways, eliminating enzymes, or balancing ATP to drive metabolic flux. However, when these approaches were applied to industrial production, they could not withstand industrial environments with toxins and fluctuating temperatures. Ultimately, the techniques were never able to scale up and output the bio-products that were obtained in the laboratory.
Thus, MCFs were developed by using a heterogenous biosynthesis pathway in a microbial host. As a host, MCFs take in various substrates and convert them into valuable compounds. These products can range from fuels, chemical, food ingredients, to pharmaceuticals.
Structure
Cell Wall
In microbial cells, the cell walls are either Gram-positive or Gram-negative. These outcomes are based on the Gram stain test. Gram-positive cell walls have a thick peptidoglycan layer and no outer lipid membrane, while Gram-negative bacteria have a thin peptidoglycan layer and an outer lipid membrane. Although a thick Gram-positive cell wall is advantageous, it is easier to attack as the peptidoglycan layer absorbs antibiotics and cleaning products. A Gram-negative cell wall is more resistant to such attacks and more difficult to destroy.
Membrane
The membrane of mic |
https://en.wikipedia.org/wiki/Hockey-stick%20identity | In combinatorial mathematics, the hockey-stick identity, Christmas stocking identity, boomerang identity, Fermat's identity or Chu's Theorem, states that if $n \ge r \ge 0$ are integers, then
$$\binom{r}{r} + \binom{r+1}{r} + \cdots + \binom{n}{r} = \binom{n+1}{r+1}.$$
The name stems from the graphical representation of the identity on Pascal's triangle: when the addends represented in the summation and the sum itself are highlighted, the shape revealed is vaguely reminiscent of those objects (see hockey stick, Christmas stocking).
Formulations
Using sigma notation, the identity states
$$\sum_{i=r}^{n}\binom{i}{r} = \binom{n+1}{r+1}$$
or equivalently, the mirror-image by the substitution $j = i - r$:
$$\sum_{j=0}^{n-r}\binom{r+j}{j} = \binom{n+1}{n-r}.$$
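The identity is easy to check numerically; a brief sketch using Python's standard-library math.comb verifies it for small values:

```python
from math import comb

# Verify sum_{i=r}^{n} C(i, r) == C(n+1, r+1) for all 0 <= r <= n < 15
for n in range(15):
    for r in range(n + 1):
        lhs = sum(comb(i, r) for i in range(r, n + 1))
        assert lhs == comb(n + 1, r + 1), (n, r)
print("hockey-stick identity holds for all 0 <= r <= n < 15")
```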
Proofs
Generating function proof
We have
Let , and compare coefficients of .
Inductive and algebraic proofs
The inductive and algebraic proofs both make use of Pascal's identity:
$$\binom{n}{k} = \binom{n-1}{k-1} + \binom{n-1}{k}.$$
Inductive proof
This identity can be proven by mathematical induction on $n$.
Base case
Let $n = r$; then
$$\sum_{i=r}^{r}\binom{i}{r} = \binom{r}{r} = 1 = \binom{r+1}{r+1}.$$
Inductive step
Suppose, for some $k \in \mathbb{N}$ with $k \ge r$,
$$\sum_{i=r}^{k}\binom{i}{r} = \binom{k+1}{r+1}.$$
Then
$$\sum_{i=r}^{k+1}\binom{i}{r} = \binom{k+1}{r+1} + \binom{k+1}{r} = \binom{k+2}{r+1},$$
where the last step uses Pascal's identity.
Algebraic proof
We use a telescoping argument to simplify the computation of the sum:
Combinatorial proofs
Proof 1
Imagine that we are distributing indistinguishable candies to distinguishable children. By a direct application of the stars and bars method, there are
ways to do this. Alternatively, we can first give candies to the oldest child so that we are essentially giving candies to kids and again, with stars and bars and double counting, we have
which simplifies to the desired result by taking and , and noticing that :
Proof 2
We can form a committee of size from a group of people in
ways. Now we hand out the numbers to of the people. We can divide this into disjoint cases. In general, in case , , person is on the committee and persons are not on the committee. The rest of the committee can be chosen in
ways. Now we can sum the values of these disjoint cases, and using double counting we obtain
See also
Pascal's identity
Pascal's triangle
Leibniz triangle
Vandermonde's identity
Faulhaber's formula, for sums of arbitrary polynomials.
References
External l |
https://en.wikipedia.org/wiki/Oilfield%20scale%20inhibition | Oilfield scale inhibition is the process of preventing the formation of scale from blocking or hindering fluid flow through pipelines, valves, and pumps used in oil production and processing. Scale inhibitors (SIs) are a class of specialty chemicals that are used to slow or prevent scaling in water systems. Oilfield scaling is the precipitation and accumulation of insoluble crystals (salts) from a mixture of incompatible aqueous phases in oil processing systems. Scale is a common term in the oil industry used to describe solid deposits that grow over time, blocking and hindering fluid flow through pipelines, valves, pumps, etc., with significant reduction in production rates and equipment damage. Scaling represents a major challenge for flow assurance in the oil and gas industry. Examples of oilfield scales are calcium carbonate (limescale), iron sulfides, barium sulfate and strontium sulfate. Scale inhibition encompasses the processes or techniques employed to treat scaling problems.
Background
The three prevailing water-related problems that upset oil companies today are corrosion, gas hydrates and scaling in production systems. The reservoir water has a high composition of dissolved minerals equilibrated over millions of years at constant physicochemical conditions. As the reservoir fluids are pumped from the ground, changes in temperature, pressure and chemical composition shift the equilibria and cause precipitation and deposition of sparingly soluble salts that build up over time with the potential of blocking vital assets in the oil production setups. Scaling can occur at all stages of oil/gas production systems (upstream, midstream and downstream) and causes blockages of well-bore perforations, casing, pipelines, pumps, valves etc. Severe scaling issues have been reported in Russia and certain North Sea production systems.
Types of scales
Two main classifications of scales are known: inorganic and organic scales, and the two types are mutually inclusive, occur |
https://en.wikipedia.org/wiki/Stencil%20Subtractor | The Stencil Subtractor frame was a ciphered-text recyphering tool invented by British Army intelligence officer and cryptographer John Tiltman. It was ready for trial by April 1941 but was not adopted officially by the British Forces until March 1942, and was not brought into service until June 1943. It was used together with subtractor tables: the frame was placed on top of the table, and the numerical values visible in the gaps of the SS Frame were used to encipher the underlying numerical code, for example the War Office Cipher, RAF cipher or Naval cipher. The SSF was described as follows:
"consisting of a plastic grille that contained 100 4-digit wide window's randomly spaced. This was superimposed over an additive sheet that had forty-eight lines of sixty-eight digits each. Setting squares on the grille provided placement for one-hundred possible settings and a conversion table appeared on each sheet with mixed sequences of digits from 00 to 99 for indicating purposes. The placement of the grille was determined through a substitution pattern sent to each user"
References
Cryptographic hardware
Encryption devices
World War II military equipment of the United Kingdom |
https://en.wikipedia.org/wiki/Boaz%20Tsaban | Boaz Tsaban (born February 1973) is an Israeli mathematician on the faculty of Bar-Ilan University. His research interests include selection principles within set theory and nonabelian cryptology, within mathematical cryptology.
Biography
Boaz Tsaban grew up in Or Yehuda, a city near Tel Aviv. At the age of 16 he was selected with other high school students to attend the first cycle of a special preparation program in mathematics, at Bar-Ilan University, being admitted to regular mathematics courses at the University a year later. He completed his B.Sc., M.Sc. and Ph.D. degrees with highest distinctions.
Two years as a post-doctoral fellow at Hebrew University were followed by a three-year Koshland Fellowship at the Weizmann Institute of Science before he joined the Department of Mathematics, Bar-Ilan University in 2007.
Academic career
In the field of selection principles, Tsaban devised the method of omission of intervals for establishing covering properties of sets of real numbers that have certain combinatorial structures. In nonabelian cryptology he devised the algebraic span method, which solved computational problems underlying several proposals for nonabelian public-key cryptographic schemes (such as the commutator key exchange).
Awards and recognition
Tsaban's doctoral dissertation, supervised by Hillel Furstenberg, won, with Irit Dinur, the Nessyahu prize for the best Ph.D. in mathematics in Israel in 2003.
In 2009 he won the Wolf Foundation Krill Prize
for Excellence in Scientific Research.
References
External links
Boaz Tsaban's homepage at Bar-Ilan University
1973 births
Set theorists
Cryptologic education
Mathematics educators
Israeli cryptographers
Israeli mathematicians
Living people
Academic staff of Bar-Ilan University
Bar-Ilan University alumni |
https://en.wikipedia.org/wiki/Appium | Appium is an open source automation tool for running scripts and testing native applications, mobile-web applications and hybrid applications on Android or iOS using a webdriver.
History
Appium was originally developed by Dan Cuellar in 2011 under the name "iOSAuto", written in the C# programming language. The program was open-sourced in August 2012 under the Apache 2 license. In January 2013, Sauce Labs agreed to fund Appium's development, and the code was subsequently rewritten in Node.js.
Appium won the 2014 Bossie award of InfoWorld for the best open source desktop and mobile software. Appium was also selected as an Open Source Rookie of the Year by Black Duck Software.
In October 2016, Appium joined the JS Foundation. It initially joined under the foundation's mentorship program and graduated in August 2017.
References
External links
Automation software
Software testing tools
Unit testing frameworks |
https://en.wikipedia.org/wiki/Code%20First%20Girls | Code First Girls is a social enterprise that provides free coding courses to women and non-binary people across the UK, Ireland, the USA, Switzerland and the Netherlands. The organization helps companies recruit more women into the tech sector by connecting them with newly trained female developers. Their community of coders, instructors, and mentors is one of the largest in the UK. According to the organisation, as of 2022 they've trained over 50,000 women.
The organisation's stated goal is to "promote gender diversity and female participation in the technology sector by offering free courses for students and professional women who are wanting to re-train." They also support businesses to train staff and encourage levelling-up for female staff within organisations.
As of 2020, Code First Girls is reported to have provided in excess of £10 million worth of free coding courses to more than 18,000 women since 2013.
In 2017, Code First Girls announced the launch of the "Code First: Girls 20:20 campaign" with the aim to "train 20,000 women to code for free by the end of 2020". As of 2018, Code First: Girls have announced "2020 campaign partnerships" with the following companies: Bank of America Merrill Lynch; Goldman Sachs; KKR; Trainline; and OVH. The organisation announced Baroness Martha Lane-Fox and Dame Stephanie Shirley as supporting the campaign as ambassadors.
History
Code First Girls began in late 2012 as "a nine-week, free, part-time course to get female graduates from all walks of life not only interested in coding, but also better equipped to contribute to technical discussions in high-tech businesses".
Founded by Alice Bentinck and Matthew Clifford, Code First: Girls was created when Bentinck and Clifford recognised a lack of female applications for their pre-seed investment programme Entrepreneur First (EF).
Bentinck claims that of the first cohort to complete Code First: Girls training, more than half of the women participants self-identified as "t |
https://en.wikipedia.org/wiki/C%20band%20%28IEEE%29 | The C band is a designation by the Institute of Electrical and Electronics Engineers (IEEE) for a portion of the electromagnetic spectrum in the microwave range of frequencies ranging from 4.0 to 8.0 gigahertz (GHz). However, the U.S. Federal Communications Commission C band proceeding and auction designated 3.7–4.2 GHz as C band. The C band is used for many satellite communications transmissions, some Wi-Fi devices, some cordless telephones, as well as some radar and weather radar systems.
Use in satellite communication
The communications C band was the first frequency band that was allocated for commercial telecommunications via satellites. The same frequencies were already in use for terrestrial microwave radio relay chains. Nearly all C-band communication satellites use the band of frequencies from 3.7 to 4.2 GHz for their downlinks, and the band of frequencies from 5.925 to 6.425 GHz for their uplinks. Note that by using the band from 3.7 to 4.0 GHz, this C band overlaps somewhat with the IEEE S band for radars.
The C-band communication satellites typically have 24 radio transponders spaced 20 MHz apart, but with the adjacent transponders on opposite polarizations such that transponders on the same polarization are always 40 MHz apart. Of this 40 MHz, each transponder utilizes about 36 MHz. (The unused 4.0 MHz between the pairs of transponders acts as "guard bands" for the likely case of imperfections in the microwave electronics.)
One use of the C band is for satellite communication, whether for full-time satellite television networks or raw satellite feeds, although subscription programming also exists. This use contrasts with direct-broadcast satellite, which is a completely closed system used to deliver subscription programming to small satellite dishes that are connected with proprietary receiving equipment.
The satellite communications portion of the C band is highly associated with television receive-only satellite reception systems, commonly ca |
https://en.wikipedia.org/wiki/Mailvelope | Mailvelope is free software for end-to-end encryption of email traffic inside of a web browser (Firefox, Chromium or Edge) that integrates itself into existing webmail applications ("email websites"). It can be used to encrypt and sign electronic messages, including attached files, without the use of a separate, native email client (like Thunderbird) using the OpenPGP standard.
The name is a portmanteau of the words "mail" and "envelope". It is published together with its source code under the terms of version 3 of the GNU Affero General Public License (AGPL). The company Mailvelope GmbH runs the development using a public code repository on GitHub. Development is sponsored by the Open Technology Fund and Internews.
Similar alternatives had been Mymail-Crypt and WebPG.
Features
Mailvelope equips webmail applications with OpenPGP functionality. Support for several popular providers like Gmail, Yahoo, Outlook on the web and others are preconfigured.
The webmail software Roundcube detects and supports Mailvelope as of version 1.2, released in May 2016, as do most (self-hosted) webmail clients. For Chromium/Chrome, Mailvelope can be installed from an authenticated source using the integrated extension manager, the Chrome Web Store. In addition, Mailvelope is also available as an add-on for Firefox and Microsoft Edge.
Mailvelope works according to the OpenPGP standard, a public-key cryptosystem first standardized in 1998, and is written in JavaScript. On preset or user-authorized web pages it overlays the page with its control elements, which are visually distinguished as separate from the web application by a surrounding security background. This background can be customized to detect impersonations. For encryption it relies on the functionality of the program library OpenPGP.js, a free JavaScript implementation of the OpenPGP standard. By running inside a separate inline frame, its code is executed separately from the web application and should preve |
https://en.wikipedia.org/wiki/ABICOMP%20character%20set | The ABICOMP Character Set was an encoded repertoire of characters used in Brazil. It was devised by the Associação Brasileira de Indústria de Computadores, a Brazilian computer industry association defunct in 1992. It was used on Brazilian-made computers and several printers brands. This code page is known by Star printers and FreeDOS as Code page 3848.
Coverage
The ABICOMP Character Set contained the characters needed to cover Portuguese. It also contained characters to cover other languages such as Spanish, French, Italian and German. However, the quotation marks "«" and "»" used in (European) Portuguese, (European) Spanish, French and Italian are missing.
This character set was different from the Brazilian standard BraSCII, which was very similar to ISO 8859-1. Although once widely used in Brazil, this character set became less and less used because of the ubiquity of other character sets (ISO 8859-1 and later Unicode).
Character set
References
Character sets |
https://en.wikipedia.org/wiki/Ecozone | An ecozone may refer to:
Biogeographic realm, the broadest biogeographic division of Earth's land surface (referred to as ecozone by BBC)
Biome, a large collection of flora and fauna occupying a major habitat
Bioregion, an ecologically and geographically defined area that is smaller than a biogeographical realm, but larger than an ecoregion
Ecoregion, an ecologically and geographically defined area that is smaller than a bioregion
Ecozone (Canada), one of 15 first-level ecological land classifications in Canada
Biogeography |
https://en.wikipedia.org/wiki/WiFi%20Master%20Key | WiFi Master (formerly WiFi Master Key) is a peer-to-peer Wi-Fi sharing mobile application software for free Wi-Fi access developed by LinkSure Network. It uses cloud computing, big data and principles of the sharing economy.
The company's founder and CEO, Chen Danian, was previously CEO and co-founder of Shanda.
WiFi Master was first released in 2012, and by 2016 had become the world’s largest Wi-Fi sharing app, with over 900 million users and 520 million monthly active users.
In terms of combined iOS and Android app downloads, WiFi Master is ranked 5th in the world, after WhatsApp, Instagram, Facebook and Facebook Messenger. WiFi Master is the 3rd largest software app in China after WeChat and Tencent QQ.
History
WiFi Master was created by Chen Danian in hopes of 'bridging the digital divide and to help people achieve self-actualization by granting them access to free Internet'.
Chen Danian shared in an interview with Forbes that 'he was born into poverty in rural China, and using the Internet, he realized that it was a tool to change destinies and pursue happiness by exploring opportunities'.
In September 2012, WiFi Master was first launched in China.
In 2015, its operating company, LinkSure, closed its A-round funding of US$52 million and became a unicorn company in the mobile internet industry. In May 2015, LinkSure bought the domain name wifi.com and established a branch in Singapore to expand its overseas services. WiFi Master was launched worldwide, rapidly gaining popularity in Southeast Asia.
In 2016, WiFi Master became the world’s largest WiFi sharing community, providing over 4 billion daily average connections with a successful connection rate of over 80% worldwide. WiFi Master is available in 223 countries, and is the top Tools app on the Google Play Store in 49 countries.
In 2019, WiFi Master Key rebranded as WiFi Master.
Features
WiFi Search
The application's core function, Wifi Search, allows for users to find and connect to available |
https://en.wikipedia.org/wiki/Junction%20%28hackathon%29 | Junction is a hackathon organizer with headquarters in Espoo, Finland. Started in 2015, Junction grew to be one of the largest hackathon organizers in Europe. In 2018 it expanded globally with a Junction event at Tsinghua University in China and cooperation with Chinese and South Korean universities, bringing high-performing students to attend the event in Helsinki.
During the years Junction introduced various formats of events to its public, the biggest one being their yearly Junction Hackathon. This event brings together developers, designers, and entrepreneurs from around the world and helps them build solutions to real world challenges from local and multinational companies.
History
Junction 2015
Junction was first launched in 2015. The event was held on November 6–8 in Kattilahalli, Helsinki, and gathered more than 550 participants and resulted in 145 different projects. Notable partners included Uber, Finnair, Supercell, Reaktor, and others.
The winner of Junction 2015 was Slush Smackdown, from the Supercell Unlimited track. The team created a game in which players program their own boxer, which then competes against other players' code in real time. It also won the main prize of the whole Slush Hacks hackathon competition, worth 20 000 EUR.
Junction 2016
Junction 2016 was held on November 25–27 in Wanha Satama, in Katajanokka, Helsinki, Finland. About 1300 participants from over 77 nationalities attended the hackathon. Partners included Supercell, Zalando, the European Space Agency, General Electric, Sitra, Tieto, UPM and others.
The teams had 48 hours to develop their ideas and afterwards demo them to other attendants and judges. The main prize for the winning idea of the 2016 event was 20 000 EUR, and many companies offered their own bounties for solving challenges in a specific way or using pre-specified technology. Teams were provided a number of different API's and other emerging technologies to develop their concepts including Oculus Rifts, HTC Vives, Apple Watches, |
https://en.wikipedia.org/wiki/Arithmetices%20principia%2C%20nova%20methodo%20exposita | The 1889 treatise Arithmetices principia, nova methodo exposita (The principles of arithmetic, presented by a new method) by Giuseppe Peano is widely considered to be a seminal document in mathematical logic and set theory, introducing what is now the standard axiomatization of the natural numbers, and known as the Peano axioms, as well as some pervasive notations, such as the symbols for the basic set operations ∈, ⊂, ∩, ∪, and A−B.
The treatise is written in Latin, which was already somewhat unusual at the time of publication, Latin having fallen out of favour as the lingua franca of scholarly communications by the end of the 19th century. The use of Latin in spite of this reflected Peano's belief in the universal importance of the work – which is now generally regarded as his most important contribution to arithmetic – and in that of universal communication. Peano would publish later works both in Latin and in his own artificial language, Latino sine flexione, which is a grammatically simplified version of Latin.
Peano also continued to publish mathematical notations in a series from 1895 to 1908 collectively known as Formulario mathematico.
References
External links
English translation (with original Latin): https://github.com/mdnahas/Peano_Book/blob/master/Peano.pdf
Original treatise (in Latin, scanned) at Internet Archive: https://archive.org/details/arithmeticespri00peangoog
Arithmetic
Books about mathematics
1889 non-fiction books
19th-century Latin books |
https://en.wikipedia.org/wiki/Elmina%20Wilson | Elmina Wilson (1870–1918) was the first American woman to complete a four-year degree in civil engineering. She went on to earn the first master's degree in the field and then became the first woman professor to teach engineering at Iowa State University (ISU). Her first project was as an assistant on the design of the Marston Water Tower on the ISU campus. After teaching for a decade at the school, she moved to New York City to enter private practice. Wilson worked with the James E. Brooks Company, skyscraper design firm Purdy and Henderson, and the John Severn Brown Company.
Early life
Elmina Tessa Wilson was born on 29 September 1870 in Harper, Keokuk County, Iowa to Olive (née Eaton) and John C. Wilson. She was the second-youngest daughter in a family with five other siblings: Warren, Fanny, Olive, Anna, and Alda. In 1892, she graduated from Iowa State University (ISU) with the first four-year civil engineering degree awarded to a woman by an American university. In 1894, Elmina graduated with a master's degree in civil engineering from ISU, at the same time as her sister Alda graduated with a bachelor's degree in the same field. Both sisters were members of the Pi Beta Phi women's fraternity and staunch supporters of both women's education and suffrage.
Career
Soon after her graduation, Wilson began working at ISU, first as an assistant in the school's drafting room and then was promoted as an instructor the following year. In 1895, she collaborated on a project with a professor, Anson Marston, which was the first elevated steel water tower to be constructed west of the Mississippi. The tower, which became known as the Marston Water Tower, was completed in 1897. After finishing the project, Wilson took a winter course in hydraulics at Cornell University and returned to teaching physics at ISU. During her summer breaks, she worked with Alda in Chicago with the firm of Patton & Miller, doing drafting work and for the next two winter breaks, she studied at Massachuse |
https://en.wikipedia.org/wiki/Yamaha%20YMZ280B | The Yamaha YMZ280B, also known as PCMD8, is a sound chip produced by Yamaha Corporation. It is an eight-channel PCM/ADPCM sample-based synthesizer designed for use with video game machines, packaged in a 64-pin QFP.
Features
Up to 8 simultaneous sounds (voices)
Waveform data lengths of 4 (ADPCM), 8, 16 bits (PCM)
Stereo output (with a 4-bit/16-level pan for each voice)
Up to 16 MB of external memory for wave data
External ROM or SRAM memory.
The YMZ280B can either use an internal crystal oscillator running at 16.9344 MHz or be connected to a master clock line. The chip can be connected to up to 16 MB of external memory to provide the voice data for sound reproduction. The sound data can be encoded as 4-bit ADPCM, 8-bit PCM, or 16-bit PCM, played back in a wide range of frequencies (up to 256 steps), and then mixed together and output as a two's complement MSB-first digital data stream meant to be connected to a complementing DAC chip like the YAC513. The YMZ280B can also be connected to a YSS225 effects processor (EP), allowing two of the 8 channels to be processed further.
Products
Per its design, the YMZ280B found significant use in arcade machines from the late 1990s. The first generation of video games from Cave were among the first to employ the chip. Later games from Data East and Psikyo also employed the chip.
References
Yamaha sound chips |
https://en.wikipedia.org/wiki/Active%20updating | In computer programming, suppose we have a data item A whose value depends on data item B, i.e., the value of A must be changed after the value of B changes and before the value of A becomes necessary. Active updating is updating A immediately after B changes, while passive updating or lazy updating (lazy evaluation) is updating A immediately before its value is fetched. An example of this distinction arises in the implementation of GUI applications: the list of submenu items may depend on the state of the application; this list may be updated either as soon as the state of the application changes ("active") or only when the menu is invoked ("passive").
Another example is updating a visual display as soon as the underlying data change, as opposed to redrawing it only when a "redraw" button is clicked. In this situation active updating may create a problem: an abrupt change of some part of the display may coincide in time with a saccadic movement of the eye, and the change may go unnoticed by a human observer.
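The distinction can be sketched in a few lines of Python (the class names and the running-total example are purely illustrative):

```python
class ActiveTotal:
    """Active updating: the derived value is refreshed as soon as the data change."""
    def __init__(self):
        self._items = []
        self._total = 0          # derived item A, kept current at all times

    def add(self, x):            # B changes ...
        self._items.append(x)
        self._total = sum(self._items)   # ... so A is updated immediately

    def total(self):
        return self._total       # already up to date


class LazyTotal:
    """Passive/lazy updating: the derived value is refreshed only when fetched."""
    def __init__(self):
        self._items = []
        self._total = 0
        self._dirty = False

    def add(self, x):            # B changes ...
        self._items.append(x)
        self._dirty = True       # ... A is merely marked stale

    def total(self):
        if self._dirty:          # A is recomputed just before it is needed
            self._total = sum(self._items)
            self._dirty = False
        return self._total
```

The active variant pays the recomputation cost on every change; the lazy variant pays it only when the value is actually requested.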
See also
Direct updating vs. deferred updating in transaction processing.
References
Programming idioms |
https://en.wikipedia.org/wiki/Inexact%20differential%20equation | An inexact differential equation is a differential equation of the form (see also: inexact differential)
$$M(x,y)\,dx + N(x,y)\,dy = 0, \qquad \frac{\partial M}{\partial y} \neq \frac{\partial N}{\partial x}.$$
The solution to such equations came with the invention of the integrating factor by Leonhard Euler in 1739.
Solution method
In order to solve the equation, we need to transform it into an exact differential equation. To do that, we need to find an integrating factor $\mu$ to multiply the equation by. We'll start with the equation itself, $M\,dx + N\,dy = 0$, so we get $\mu M\,dx + \mu N\,dy = 0$. We will require $\mu$ to satisfy $\frac{\partial (\mu M)}{\partial y} = \frac{\partial (\mu N)}{\partial x}$. We get
$$\frac{\partial \mu}{\partial y}M + \mu\frac{\partial M}{\partial y} = \frac{\partial \mu}{\partial x}N + \mu\frac{\partial N}{\partial x}.$$
After simplifying we get
$$M\frac{\partial \mu}{\partial y} - N\frac{\partial \mu}{\partial x} + \mu\left(\frac{\partial M}{\partial y} - \frac{\partial N}{\partial x}\right) = 0.$$
Since this is a partial differential equation, it is usually extremely hard to solve; however, in some cases we will get either $\mu = \mu(x)$ or $\mu = \mu(y)$, in which case we only need to find $\mu$ with a first-order linear differential equation or a separable differential equation, and as such either
$$\mu(x) = \exp\left(\int \frac{1}{N}\left(\frac{\partial M}{\partial y} - \frac{\partial N}{\partial x}\right)dx\right)$$
or
$$\mu(y) = \exp\left(\int \frac{1}{M}\left(\frac{\partial N}{\partial x} - \frac{\partial M}{\partial y}\right)dy\right).$$
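As a concrete illustration of the $\mu = \mu(x)$ case, the following sketch (assuming the SymPy library; the equation $(3xy + y^2)\,dx + (x^2 + xy)\,dy = 0$ is chosen purely for illustration) computes an integrating factor symbolically and checks that the multiplied equation is exact:

```python
import sympy as sp

x, y = sp.symbols('x y', positive=True)

# Example inexact equation: (3xy + y^2) dx + (x^2 + xy) dy = 0
M = 3*x*y + y**2
N = x**2 + x*y

# The equation is not exact: dM/dy - dN/dx = x + y, which is nonzero
print(sp.simplify(sp.diff(M, y) - sp.diff(N, x)))

# (dM/dy - dN/dx) / N depends on x only, so an integrating factor mu(x) exists
ratio = sp.simplify((sp.diff(M, y) - sp.diff(N, x)) / N)   # -> 1/x
mu = sp.exp(sp.integrate(ratio, x))                        # -> x

# After multiplying by mu, the equation mu*M dx + mu*N dy = 0 is exact
assert sp.simplify(sp.diff(mu*M, y) - sp.diff(mu*N, x)) == 0
print("integrating factor:", mu)
```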
References
Further reading
External links
A solution for an inexact differential equation from Stack Exchange
a guide for non-partial inexact differential equations at SOS math
Equations
Differential equations
Ordinary differential equations
Differential calculus
Discrete mathematics
Mathematical structures |
https://en.wikipedia.org/wiki/Gen-Z%20%28consortium%29 | The Gen-Z Consortium is a trade group of technology vendors involved in designing CPUs, random access memory, servers, storage, and accelerators. The goal was to design an open and royalty-free "memory-semantic" bus protocol, which is not limited by the memory controller of a CPU, to be used in either a switched fabric or a point-to-point device link on a standard connector.
In November 2021, the GenZ Consortium voted to transfer all its specifications and intellectual property to the CXL Consortium.
History
The consortium was publicly announced on October 11, 2016.
Members include server vendors Cisco Systems, Cray, Dell Technologies, Hewlett Packard Enterprise, Huawei, IBM, and Lenovo. CPU vendor members include Advanced Micro Devices, ARM Holdings, Broadcom Limited, IBM, and Marvell.
Memory and storage vendor members include Micron Technology, Samsung, Seagate Technology, SK Hynix, and Western Digital. Other members include IDT Corporation, IntelliProp, Mellanox Technologies, Microsemi, Red Hat, and Xilinx. Analysts noted the absence of Intel, which announced an inter-connect technology of its own called Omni-Path a year before, and Nvidia, with its own NVLink technology. Gen-Z also maintains cooperation with industry alliances such as OpenFabrics, SNIA, and DMTF.
The effort followed years of delays with product availability for version 4.0 of PCI Express. Some of the vendors also joined a group to promote the cache coherent interconnect for accelerators (CCIX) protocol on the same day. At about the same time, yet another consortium formed to work on an open specification for the Coherent Accelerator Processor Interface (CAPI).
The first version of the Gen-Z Core specification was published in 2018; it defines a physical link that draws from both PCI Express and 50 Gigabit Ethernet physical layer (PHY) standards. The Gen-Z protocol allows for asymmetric links with more bandwidth in one direction, and supports connection topologies like point-to-point links, daisy- |
https://en.wikipedia.org/wiki/Contact%20guidance | Contact guidance refers to a phenomenon for which the orientation of cells and stress fibers is influenced by geometrical patterns such as nano/microgrooves on substrates, or collagen fibers in gels and soft tissues. This phenomenon was discovered in 1912, and the terminology was introduced in 1945, but it is with the development of tissue engineering that researchers drew increasing attention on this topic, seeing the potential of contact guidance in influencing the morphology and organization of cells. Nevertheless, the biological processes underlying contact guidance are still unclear.
Contact guidance on two-dimensional substrates
When cells are seeded onto flat substrates, they are normally in a random orientation. However, substrates with topographical patterns influence the orientation of cells cultured on these surfaces by their geometrical cues. For example, if a substrate has nano/microgrooves running parallel to each other, cells orient along the direction of these nano/microgrooves. Based on this, cells seem to be able to sense the structural characteristics of their surrounding and consequently respond by adopting the orientation of topographical stimuli. A similar effect can be obtained when cells are cultured on flat surfaces with lines of proteins printed on top (to which cells can adhere), interspersed by repellent lines; in that case, cells also align along the patterns.
It has also been observed that the phenomenon of contact guidance on microgrooved surfaces is influenced by the groove width. For instance, osteoblast-like cells align along the nanogrooves only for grooves wider than 75 nm. A similar behavior has been observed with other cell types, such as fibroblasts, which align along these topographical patterns when the grooves are wider than 150 nm. On the other hand, grooves that are too wide can decrease the effects of contact guidance.
Contact guidance in three-dimensional structures
Cells can orient in response to contact guidance when |
https://en.wikipedia.org/wiki/National%20Centre%20for%20Biotechnology%20Education | The National Centre for Biotechnology Education (NCBE) is a national resource centre at the University of Reading to teach pre-university biotechnology in schools in the UK. It was founded in 1990.
History
It began as the National Centre for School Biotechnology (NCSB) in 1985 in the Department of Microbiology. It became the NCBE in 1990. For many years it was the only centre in Europe that was devoted to the teaching of biotechnology in schools. The Dolan DNA Learning Center had been set up in the USA.
It was set up as an education project by the Society for General Microbiology, now the Microbiology Society. Money from the Laboratory of the Government Chemist set up the National Centre for School Biotechnology (NCSB). Money also came from the Gatsby Charitable Foundation. For the first five years, the UK government's DTI was involved, but from 1990 onwards wanted the organization to become self-supporting as it had to cut back on budgets. By 1992 the government provided no money for the centre.
Structure
The site was set up in former buildings of the University of Reading's Department of Microbiology.
Function
It reaches out to schools to give up-to-date information on biotechnology. Biotechnology is a rapidly evolving subject, and schools cannot keep up-to-date with all that they would be required to know. It produces educational resources. It runs the Microbiology in Schools Advisory Committee (MISAC).
See also
Centre for Industry Education Collaboration at York
National Centre for Excellence in the Teaching of Mathematics, University of York
Science and Plants for Schools, another well-known science resource for UK schools
References
External links
NCBE
DNA to Darwin
Education resources from the University of Leicester
European Initiative for Biotechnology Education
1985 establishments in the United Kingdom
Biology education in the United Kingdom
Biotechnology in the United Kingdom
Biotechnology organizations
Educational institutions established |
https://en.wikipedia.org/wiki/Infraorbital%20vein | The infraorbital vein is a vein that drains structures of the floor of the orbit. It arises on the face and passes backwards through the orbit alongside the infraorbital artery and nerve, exiting the orbit through the inferior orbital fissure to drain into the pterygoid venous plexus.
Anatomy
Origin
The infraorbital vein arises on the face by the union of several tributaries.
Course and relations
Accompanied by the infraorbital artery and the infraorbital nerve, it passes posteriorly through the infraorbital foramen, infraorbital canal, and infraorbital groove. It exits the orbit through the inferior orbital fissure to drain into the pterygoid venous plexus.
Distribution
The infraorbital vein drains structures of the floor of the orbit; it receives tributaries from structures that lie close to the floor of the orbit.
Anastomoses
The infraorbital vein communicates with the inferior ophthalmic vein. It may sometimes additionally also communicate with the facial vein on the face.
References
Anatomy |
https://en.wikipedia.org/wiki/Petrolingual%20ligament | The petrolingual ligament lies at the posteroinferior aspect of the lateral wall of the cavernous sinus and marks the point at which the internal carotid artery enters the cavernous sinus.
Anatomically, the petrolingual ligament demarcates two of the segments of the internal carotid artery:
The petrolingual ligament marks the end of the petrous section of the internal carotid artery.
The cavernous section of the internal carotid artery begins at the superior aspect of the petrolingual ligament.
For surgeons and radiologists, it is important to be oriented to the location of this ligament in cases of possible dissection of the internal carotid artery, as it helps determine whether the dissection has occurred inside or outside the cavernous sinus.
References
Anatomy |
https://en.wikipedia.org/wiki/Ecosphere%20%28planetary%29 | An ecosphere is a planetary closed ecological system. In this global ecosystem, the various forms of energy and matter that constitute a given planet interact on a continual basis. The forces of the four fundamental interactions cause the various forms of matter to settle into identifiable layers. These layers are referred to as component spheres, with the type and extent of each component sphere varying significantly from one particular ecosphere to another. Component spheres that represent a significant portion of an ecosphere are referred to as primary component spheres. For instance, Earth's ecosphere consists of five primary component spheres which are the Geosphere, Hydrosphere, Biosphere, Atmosphere, and Magnetosphere.
Types of component spheres
Geosphere
The layer of an ecosphere that exists at a terrestrial planet's center of mass and which extends radially outward until ending in a solid and spherical layer known as the crust.
This includes rocks and minerals that are present on the Earth, as well as parts of soil and the skeletal remains of animals that have become fossilized over the years. It also encompasses the rock cycle: the process by which rocks metamorphose, weather, wash away, and are buried and resurrected as new rock. The primary agent driving these processes is the movement of Earth's tectonic plates, which creates mountains, volcanoes, and ocean basins. The outer core of the Earth contains liquid iron, which is an important factor in the geosphere as well as the magnetosphere.
Hydrosphere
The total mass of water, regardless of phase (e.g. liquid, solid, gas), that exists within an ecosphere. It's possible for the hydrosphere to be highly distributed throughout other component spheres such as the geosphere and atmosphere.
There are about 1.4 billion cubic kilometres (km³) of water on Earth. That includes liquid water in the ocean, lakes, and rivers. It includes frozen water in snow, ice, and glaciers, and water that’s underground in soils and rocks |
https://en.wikipedia.org/wiki/ZYYX | ZYYX is a Swedish desktop 3D printer, designed primarily for office and educational applications. Originally launched in 2014 by Magicfirm Europe AB, based at Chalmers Innovation in Gothenburg, the ZYYX 3D Printer is based on FFF technology, and its features include an automated levelling function, smell-free operation (most 3D printers tend to smell heavily of hot plastic) and an extruder head which is less prone to jamming.
Models
The first model was released in 2014. The ZYYX 3D Printer was designed to provide an enclosed print environment with a fan and carbon filter to scrub fumes produced by the melted filament before exhausting them into the environment. The ZYYX+ Printer was released in 2016 to introduce improvements that provided a more stable printing environment.
See also
Comparison of 3D printers
Digital modeling and fabrication
List of 3D printer manufacturers
References
External links
Official website
3D printers
Computer-related introductions in 2014
2014 establishments in Sweden
Swedish brands
Robotics in Sweden
Fused filament fabrication
Chalmers University of Technology |
https://en.wikipedia.org/wiki/List%20of%20U.S.%20states%20by%20changes%20in%20life%20expectancy%2C%201985%E2%80%932010 | This article ranks states of the United States sorted by changes in the life expectancy of their residents between 1985 and 2010. Changes in the life expectancy of men and women in each state are also sorted. States are also ranked for three risk factors controllable by the individual: obesity, smoking, and physical activity.
The data is taken from the Institute for Health Metrics and Evaluation, an independent global health research center at the University of Washington.
Note: May not add to total due to rounding.
Life expectancy (years)
Note: may not add to total due to rounding
Controllable risk factors
This list ranks U.S. states for three factors, controllable by the individual: obesity, smoking, and physical activity. These factors affect longevity: obesity and smoking reduce longevity and a higher level of physical activity increases longevity.
See also
List of U.S. states and territories by life expectancy
List of U.S. counties with shortest life expectancy
List of U.S. counties with longest life expectancy
References
Life expectancy changes,1985
State life expectancy changes,1985
Lists of people-related superlatives
United States |
https://en.wikipedia.org/wiki/Perl%20Programming%20Documentation | Perl Programming Documentation, also called perldoc, is the name of the user manual for the Perl 5 programming language. It is available in several different formats, including online in HTML and PDF. The documentation is bundled with Perl in its own format, known as Plain Old Documentation (pod). Some distributions, such as Strawberry Perl, include the documentation in HTML, PDF, and pod formats.
perldoc is also the name of the Perl command that provides "access to all the documentation that comes with Perl", from the command line.
See also
Outline of Perl – overview of and topical guide to the Perl programming language
Raku – Perl 5's sister language
man page – form of software documentation usually found on a Unix or Unix-like operating system, invoked by issuing the man command. Perl documentation is sometimes available as man pages.
PerlMonks – community website covering all aspects of Perl programming and other related topics such as web applications and system administration. Includes forums where perl users may seek answers to their questions, and answer the questions of others.
RTFM – Internet slang for "Read the Frickin' Manual"
External links
Official documentation for Perl 5 – displays the documentation, and also includes links to download the HTML and PDF files for off-line use.
The perldoc help page – covers use of the perldoc command
Perl documentation documentation – documentation about perl's documentation
Official documentation for Perl 6
Software documentation |
https://en.wikipedia.org/wiki/K%20band%20%28NATO%29 | The NATO K band is the obsolete designation given to the radio frequencies from 20 to 40 GHz (equivalent to wavelengths between 1.5 and 0.75 cm) during the cold war period. Since 1992, frequency allocations, allotments and assignments are in line with the NATO Joint Civil/Military Frequency Agreement (NJFA).
However, in order to identify military radio spectrum requirements, e.g. for crises management planning, training, electronic warfare activities, or in military operations, this system is still in use.
References
Microwave bands
Satellite broadcasting |
https://en.wikipedia.org/wiki/K%20band%20%28IEEE%29 | The IEEE K-band is a portion of the radio spectrum in the microwave range of frequencies from 18 to 27 gigahertz (GHz). The range of frequencies in the center of the K-band, between 18 and 26.5 GHz, is absorbed by water vapor in the atmosphere due to its resonance peak at 22.24 GHz. Therefore, these frequencies experience high atmospheric attenuation and cannot be used for long-distance applications. For this reason the original K-band has been split into three bands, Ka-band, K-band, and Ku-band, as detailed below.
The K stands for Kurz which stems from the German word for short.
Subdivisions
Because of the water vapor absorption peak in the center of the band, the IEEE K-band is conventionally divided into three sub-bands:
Ku-band: K-under band, 12–18 GHz, mainly used for satellite communications, direct-broadcast satellite television, terrestrial microwave communications, and radar, especially police traffic-speed detectors.
K-band: 18–27 GHz. Due to the 22 GHz water vapor absorption line, this band has high atmospheric attenuation and is only useful for short-range applications.
Ka-band: K-above band, 26.5–40 GHz, mainly used for satellite communications, radar and experimental communications. NASA's Kepler spacecraft is the first NASA mission to use Ka-band NASA Deep Space Network (NASA DSN) communications.
Amateur radio
The Radio Regulations of the International Telecommunication Union (ITU) allow amateur radio and amateur satellite operations in the frequency range 24.000 GHz to 24.250 GHz, which is known as the 1.2-centimeter band. It is also referred to as the K-band by AMSAT.
See also
K band (infrared)
K band (NATO)
K band (Radar Codes)
References
Microwave bands
Satellite broadcasting |
https://en.wikipedia.org/wiki/IAS%2011 | The IAS 11 standard of International Accounting Standards set out requirements for the accounting treatment of the revenue and costs associated with long-term construction contracts. By their nature, construction activities and contracts are long-term projects, often beginning and ending in different accounting periods. Until its replacement with IFRS 15 in January 2018, IAS 11 helped accountants with measuring to what extent costs, revenue and possible profit or loss on the project are incurred in each period.
History
This is a timeline of IAS 11:
Content
How accounting revenue and costs are to be recognized depends first on whether the stage of completion of a project can be reliably measured. If this is the case, cost and revenue (including profit if any) can be recognized up to the percentage of completion during the current accounting period. If the stage of completion of a project cannot be reliably measured, the revenue can only be recognized up to the costs that have been incurred and any profit is only recognized at the end of the last accounting period. In the case a company is expecting to make a loss on the contract, this loss will be immediately recognized in the current accounting period.
References
International Accounting Standards
International Financial Reporting Standards
Civil engineering |
https://en.wikipedia.org/wiki/E-SCREEN | E-SCREEN is a cell proliferation assay based on the enhanced proliferation of human breast cancer cells (MCF-7) in the presence of estrogen active substances. The E-SCREEN test is a tool to easily and rapidly assess estrogenic activity of suspected xenoestrogens (singly or in combination). This bioassay measures the estrogen-induced increase in the number of human breast cancer cells, which is biologically equivalent to the increase of mitotic activity in tissues of the genital tract. It was originally developed by Soto et al. and was included in the first version of the OECD Conceptual Framework for Testing and Assessment of Endocrine Disrupters published in 2012. However, due to failed validation, it was not included in the updated version of the framework published in 2018.
The E-SCREEN test
The E-SCREEN cell proliferation assay is performed with the human MCF-7 breast cancer cell line, an established estrogenic cell line that endogenously expresses ERα.
Human MCF-7 cells are cultivated in Dulbecco’s modified Eagle’s medium (DMEM) with fetal bovine serum (FBS) and phenol red as buffer tracer (culture medium), at 37 °C, in an atmosphere of 5% CO₂ and 95% air under saturating humidity. To perform the E-SCREEN assay, the cells are trypsinized and plated in well culture plates. Cells are allowed to attach for 24 h; the seeding medium is then removed and replaced with the experimental culture medium (phenol-red-free DMEM with charcoal-dextran-treated, steroid-free fetal bovine serum).
For assaying suspected estrogen active substances, a range of concentrations of the test compound is added to the experimental medium. In each experiment, the cells are exposed to a dilution series of 17β-estradiol (0.1 pM–1000 pM) for providing a positive control (standard dose-response curve), and treated only with hormone-free medium as a negative control. The bioassay ends on day 6 (late exponential phase) by removing the media from the wells and fixing the cells with trichloroace |
https://en.wikipedia.org/wiki/Privatization%20%28computer%20programming%29 | Privatization is a technique used in shared-memory programming to enable parallelism, by removing dependencies that occur across different threads in a parallel program. Dependencies between threads arise from two or more threads reading or writing a variable at the same time. Privatization gives each thread a private copy, so it can read and write it independently and thus, simultaneously.
Each parallel algorithm specifies whether a variable is shared or private. Many errors in implementation can arise if the variable is declared to be shared but the algorithm requires it to be private, or vice versa.
Traditionally, parallelizing compilers could apply privatization to scalar elements only.
To exploit parallelism that occurs across iterations within a parallel program (loop-level parallelism), the need grew for compilers that can also perform array variable privatization. Most of today's compilers can perform array privatization with more features and functions to enhance the performance of the parallel program in general. An example is the Polaris parallelizing compiler.
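As a rough sketch of the technique (an illustration, not taken from the article's sources), the OpenMP loop below privatizes a temporary scalar and a temporary array that every iteration overwrites; without the private clause the threads would race on them:

```c
#include <omp.h>
#include <stdio.h>

#define N 1000
#define M 8

int main(void) {
    static double a[N][M], b[N];   /* shared input and output        */
    double tmp[M];                 /* per-iteration scratch array    */
    double scale;                  /* per-iteration scratch scalar   */

    /* 'private' gives each thread its own copy of scale and tmp, removing
       the cross-iteration dependence so the loop can run in parallel.    */
    #pragma omp parallel for private(scale, tmp)
    for (int i = 0; i < N; i++) {
        scale = (double)i / N;                 /* scalar privatization */
        for (int j = 0; j < M; j++)
            tmp[j] = scale * (j + 1);          /* array privatization  */
        b[i] = 0.0;
        for (int j = 0; j < M; j++)
            b[i] += tmp[j] + a[i][j];
    }
    printf("b[1] = %f\n", b[1]);
    return 0;
}
```

A parallelizing compiler performs the same transformation automatically when its analysis proves that tmp and scale are always written before they are read within each iteration.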
Description
A shared-memory multiprocessor is a "computer system composed of multiple independent processors that execute different instruction streams". The shared memory programming model is the most widely used for parallel processor designs. This programming model starts by identifying possibilities for parallelism within a piece of code and then mapping these parallel tasks into threads.
The next step is to determine the scope of variables used in a parallel program, which is one of the key steps and main concerns within this model.
Variable scope
The next step in the model groups tasks together into bigger tasks, as there are typically more tasks than available processors. Typically, the number of execution threads that the tasks are assigned to, is chosen to be less than or equal to the number of processors, with each thread assigned to a unique processor.
Right after this step |
https://en.wikipedia.org/wiki/Food%20Policy%20%28journal%29 | Food Policy is a bimonthly peer-reviewed scientific journal covering food policy. It was established in 1975 and is published bimonthly by Elsevier. The editors-in-chief are Mario Mazzocchi (University of Bologna) and Christopher B. Barrett (Cornell University). According to the Journal Citation Reports, the journal has a 2018 impact factor of 3.788.
References
External links
Academic journals established in 1975
Elsevier academic journals
Bimonthly journals
Policy analysis journals
Food science journals
English-language journals |
https://en.wikipedia.org/wiki/Power%20law%20of%20cache%20misses | A power law is a mathematical relationship between two quantities in which one is directly proportional to some power of the other. The power law for cache misses was first established by C. K. Chow in his 1974 paper, supported by experimental data on hit ratios for stack processing by Richard Mattson in 1971. The power law of cache misses can be used to narrow down the cache sizes to practical ranges, given a tolerable miss rate, as one of the early steps while designing the cache hierarchy for a uniprocessor system.
The power law for cache misses can be stated as

M = M0 (C/C0)^(−α)

where M is the miss rate for a cache of size C and M0 is the miss rate of a baseline cache of size C0. The exponent α is workload-specific and typically ranges from 0.3 to 0.7.
Caveats
The power law can give an estimate of the miss rate only up to a certain value of cache size. A large enough cache eliminates capacity misses, and increasing the cache size further will not reduce the miss rate any further, contrary to the power law's prediction.
The validity of the power law of cache misses also depends on the size of the working memory set in a given process and also on the temporal re-reference pattern of cache blocks in a process. If a process has a small working memory set relative to the cache size, capacity misses are unlikely and the power law does not hold.
Although conflict misses reduce as associativity increases, Hartstein et al. showed that the power law holds irrespective of set associativity.
Hartstein et al. plotted the number of cache block re-accesses versus their re-reference times for a large number of workloads and found that most also follow a power-law relationship of the form R(t) = R0·t^(−β),
where R(t) is the rate of re-referencing. It was found that the exponent β ranged between 1.3 and 1.7. Theoretically, it was proved that the power laws of cache re-reference and cache miss rate are related by the equation . This means that for workloads that do not follow the re-reference power law, the power law |
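A quick numerical reading of the miss-rate power law, assuming the form M = M0 (C/C0)^(−α) stated above; the baseline miss rate, the baseline size, and α = 0.5 are illustrative assumptions (compile with -lm):

```c
#include <stdio.h>
#include <math.h>

int main(void) {
    double m0    = 0.05;  /* miss rate of the baseline cache (assumed)     */
    double c0    = 32.0;  /* baseline cache size in KB (assumed)           */
    double alpha = 0.5;   /* workload-specific exponent, typically 0.3-0.7 */

    /* With alpha = 0.5, doubling the cache cuts the miss rate by about 1/sqrt(2). */
    for (double c = 32.0; c <= 512.0; c *= 2.0) {
        double m = m0 * pow(c / c0, -alpha);
        printf("cache %6.0f KB -> estimated miss rate %.4f\n", c, m);
    }
    return 0;
}
```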
https://en.wikipedia.org/wiki/CodinGame | CodinGame is a technology company editing an online platform for developers, allowing them to play with programming with increasingly difficult puzzles, to learn to code better with an online programming application supporting twenty-five programming languages, and to compete in multiplayer programming contests involving timed artificial intelligence, or code golf challenges.
CodinGame also serves as a recruiting platform, allowing developers to get noticed by companies based on their performance on the contests.
History
Activity
CodinGame's business model is based on sponsoring by companies wanting to get in touch with developers. CodinGame helps these companies to recruit developers through worldwide contests hosted every three months, or private hackathons. The startup was also seeded through several fundraisings in 2013 and 2015.
CodinGame for Work also sells turnkey tech screening solutions to help companies assess the level of their programmer candidates through coding tests.
Available programming languages for solving puzzles or taking part in contests are: Bash, C, C++, C#, Clojure, D, Dart, F#, Go, Groovy, Haskell, Java, JavaScript, Kotlin, Lua, Objective-C, OCaml, Pascal, Perl, PHP, Python (v3), Ruby, Rust, Scala, Swift, TypeScript, and Visual Basic .NET.
See also
HackerRank
CodeFights
Competitive programming
References
Further reading
External links
Programming contests
Computer science competitions
French educational websites
Problem solving
Puzzles
Programming games |
https://en.wikipedia.org/wiki/Agile%20tooling | Agile tooling is the design and fabrication of manufacturing related-tools such as dies, molds, patterns, jigs and fixtures in a configuration that aims to maximise the tools' performance, minimise manufacturing time and cost, and avoid delay in prototyping. A fully functional agile tooling laboratory consists of CNC milling, turning and routing equipment. It can also include additive manufacturing platforms (such as fused filament fabrication, selective laser sintering, Stereolithography, and direct metal laser sintering), hydroforming, vacuum forming, die casting, stamping, injection molding and welding equipment.
Agile tooling is similar to rapid tooling, which uses additive manufacturing to make tools or tooling quickly, either directly by making parts that serve as the actual tools or tooling components, such as mold inserts; or indirectly by producing patterns that are in turn used in a secondary process to produce the actual tools. Another similar technique is prototype tooling, where molds, dies and other devices are used to produce prototypes. Rapid manufacturing, and specifically rapid tooling technologies, are earlier in their development than rapid prototyping (RP) technologies, and are often extensions of RP.
The aim of all toolmaking is to catch design errors early in the design process, improve product design, produce better products, reduce product cost, and reduce time to market.
Users
Hundreds of universities and research centers around the globe are investing in additive manufacturing equipment in order to be positioned to make prototypes and tactile representations of real parts. Few have fully committed to the concept of using additive manufacturing (AM) to create manufacturing tools (fixturing, clamps, molds, dies, patterns, negatives, etc.). AM experts seem to agree that tooling is a large, largely untapped market. Deloitte University Press estimated that in 2012 alone, the AM tooling market was worth $1.2 billion. At that point in the development cycle of AM |
https://en.wikipedia.org/wiki/Ultimate%20tic-tac-toe | Ultimate tic-tac-toe (also known as super tic-tac-toe, meta tic-tac-toe or (tic-tac-toe)²) is a board game composed of nine tic-tac-toe boards arranged in a 3 × 3 grid. Players take turns playing on the smaller tic-tac-toe boards until one of them wins on the larger board. Compared to traditional tic-tac-toe, strategy in this game is conceptually more difficult and has proven more challenging for computers.
Rules
Just like in regular tic-tac-toe, the two players (X and O) take turns, starting with X. The game starts with X playing wherever they want in any of the 81 empty spots. Next the opponent plays; however, they are forced to play in the small board indicated by the relative location of the previous move. For example, if X plays in the top right square of a small (3 × 3) board, then O has to play in the small board located at the top right of the larger board. Playing any of the available spots decides in which small board the next player plays.
If a move wins a small board by the rules of normal tic-tac-toe, then the entire small board is marked as won by that player on the larger board. Once a small board is won by a player or it is filled completely, no more moves may be played in that board. If a player is sent to such a board, then that player may play in any other board. Game play ends when either a player wins the larger board or there are no legal moves remaining, in which case the game is a draw.
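The rule that a move "sends" the opponent to a particular small board can be captured in a few lines; the sketch below is illustrative only and numbers boards and cells 0 to 8 in row-major order:

```c
#include <stdbool.h>
#include <stdio.h>

/* Returns the small board the next player must use, given the cell index
   (0..8) of the last move, or -1 if the target board is closed (won or
   full) and the next player may therefore choose any open board.        */
int forced_board(int last_cell, const bool closed[9]) {
    return closed[last_cell] ? -1 : last_cell;
}

int main(void) {
    bool closed[9] = { false };
    closed[2] = true;  /* suppose the top-right small board is already won */

    printf("%d\n", forced_board(4, closed)); /* centre cell -> board 4     */
    printf("%d\n", forced_board(2, closed)); /* closed target -> any board */
    return 0;
}
```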
Gameplay
Super tic-tac-toe is significantly more complex than most other variations of tic-tac-toe, as there is no clear strategy to playing. This is because of the game's complicated branching. Even though every move must be played in a small board, equivalent to a normal tic-tac-toe board, each move must take into account the larger board in several ways:
Anticipating the next move: Each move played in a small board determines where the opponent's next move can be played. This might make moves that are considered bad |
https://en.wikipedia.org/wiki/Cache%20hierarchy | Cache hierarchy, or multi-level caches, refers to a memory architecture that uses a hierarchy of memory stores based on varying access speeds to cache data. Highly requested data is cached in high-speed access memory stores, allowing swifter access by central processing unit (CPU) cores.
Cache hierarchy is a form and part of memory hierarchy and can be considered a form of tiered storage. This design was intended to allow CPU cores to process faster despite the memory latency of main memory access. Accessing main memory can act as a bottleneck for CPU core performance as the CPU waits for data, while making all of main memory high-speed may be prohibitively expensive. High-speed caches are a compromise allowing high-speed access to the data most-used by the CPU, permitting a faster CPU clock.
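One way to see why the compromise pays off is the usual average memory access time (AMAT) estimate for a hierarchy; the latencies and miss ratios below are illustrative assumptions, not figures from this article:

```c
#include <stdio.h>

/* Average memory access time for a two-level cache in front of DRAM. */
int main(void) {
    double l1_hit  = 1.0;    /* L1 hit latency in cycles (assumed)           */
    double l1_miss = 0.05;   /* fraction of accesses missing in L1 (assumed) */
    double l2_hit  = 10.0;   /* L2 hit latency in cycles (assumed)           */
    double l2_miss = 0.20;   /* fraction of L2 accesses missing (assumed)    */
    double dram    = 200.0;  /* main-memory latency in cycles (assumed)      */

    double amat = l1_hit + l1_miss * (l2_hit + l2_miss * dram);
    printf("AMAT with the cache hierarchy: %.2f cycles\n", amat); /* 3.50 */
    printf("AMAT going straight to DRAM:   %.2f cycles\n", dram); /* 200  */
    return 0;
}
```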
Background
In the history of computer and electronic chip development, there was a period when increases in CPU speed outpaced the improvements in memory access speed. The gap between the speed of CPUs and memory meant that the CPU would often be idle. CPUs were increasingly capable of running and executing larger amounts of instructions in a given time, but the time needed to access data from main memory prevented programs from fully benefiting from this capability. This issue motivated the creation of memory models with higher access rates in order to realize the potential of faster processors.
This resulted in the concept of cache memory, first proposed by Maurice Wilkes, a British computer scientist at the University of Cambridge in 1965. He called such memory models "slave memory". Between roughly 1970 and 1990, papers and articles by Anant Agarwal, Alan Jay Smith, Mark D. Hill, Thomas R. Puzak, and others discussed better cache memory designs. The first cache memory models were implemented at the time, but even as researchers were investigating and proposing better designs, the need for faster memory models continued. This need resulted from the fact that although ear |
https://en.wikipedia.org/wiki/IEEE%20Journal%20of%20Solid-State%20Circuits | The IEEE Journal of Solid-State Circuits is a monthly peer-reviewed scientific journal on new developments and research in solid-state circuits, published by the Institute of Electrical and Electronics Engineers (IEEE) in New York City. The journal serves as a companion venue for expanding on work presented at the International Solid-State Circuits Conference, the Symposia on VLSI Technology and Circuits, and the Custom Integrated Circuits Conference. The journal has an impact factor of 6.12 and is edited by Dennis Sylvester (University of Michigan).
References
External links
Journal of Solid-State Circuits
Electronics journals
Semiconductor journals
Monthly journals
English-language journals
Academic journals established in 1966 |
https://en.wikipedia.org/wiki/Demanufacturing | Demanufacturing is a process where a product after extensive usage, often at the end of its lifespan, is then disassembled or dismantled into its components. Demanufacturing is also commonly referred to as the reverse process of manufacturing and, hence, can next to disassembly or dismantling also include various other processing steps. For example, demanufacturing commonly starts with product manipulation and next a classification step to evaluate the functionality of the product and/or the herein contained components to assess if these are suitable for reuse or are deemed unusable and need to be recycled, so the materials can used in new products. Demanufacturing was proposed to be used in all industries as a means reduce the environmental footprint while preserving economic viability of the processes involved. This term was first coined by Professor Walter W. Olson and Professor John W. Sutherland in 1993.
In the case of waste electronics demanufacturing involves dismantling them into their components. After a classification and product manipulation step, electronics are typically dismantled into their components either to support the reuse of components (HDDs, RAM, CPUs, etc.) or to facilitate increased precious metal (e.g. Au and Ag of printed wiring boards) and critical metal recovery (e.g. Nd from permanent magnets in HDDs).
Forms of demanufacturing
There are two forms of demanufacturing: destructive and non-destructive. Non-destructive demanufacturing incorporates non-destructive disassembly actions or semi-destructive disassembly actions in which only the fasteners are damaged to allow for components to be taken apart and then reused in new products. In contrast, destructive demanufacturing relies on destructive dismantling techniques in which information on the product structure is used to define optimal cutting points, which are used to take the product apart and to separate specific components with a higher yield and concentration in support of materi |
https://en.wikipedia.org/wiki/Balanced%20heave%20compensation | Balanced heave compensation is a technology engaging the principle of a balanced-arm lamp for offshore motion compensation.
Working principle
The technical working principle can be summarized as converting the non-linear force of a gas spring or hydro-pneumatic spring into an adjustable, substantially linear force, by several mechanical measures. The technology comprises a series of patented inventions by NHLO licensed to and marketed by Seaqualize.
Research project
In cooperation with IHC and research partners MARIN and ECN, a scale model of a first device comprising the technology, an offshore access bridge has been tested. Seaqualize is currently working on a full-scale prototype of the access bridge. The project has been subsidized by the Dutch Ministry of Economic Affairs as a Renewable Energy project.
The research project indicates balanced heave compensation enables a range of potential benefits compared to currently available solutions. This may lead to improvements and cost savings in (engaging) heave compensation systems.
Comparison to traditional heave compensation systems
Balanced heave compensation (BHC) differs from traditional (spring based) Passive Heave Compensation (PHC) and Active Heave Compensation (AHC) in several ways.
In traditional spring based heave compensation systems the movement of the mass is parallel to the movement of the spring cylinder. A passive heave compensation system is in effect a mass spring system: it stabilizes a certain mass in a single position of the spring, in that position only the spring and the mass are balanced. In other positions of the spring the mass and the spring are not balanced and the mass will tend to start moving towards the stabilized position due to residual forces in the spring.
Only in certain (preset) conditions can a mass-spring system be beneficial for heave compensation: the spring, its excitation, the mass and the frequency of the excitation of the spring all need to be taken into account |
https://en.wikipedia.org/wiki/Qsub | qsub is an IEEE Std 1003.1-2008 Unix command for submitting jobs to a job scheduler, usually in cluster or grid computing. The qsub command is used to submit jobs to Slurm Workload Manager, to TORQUE, and to Oracle Grid Engine; HTCondor calls it condor_qsub.
References
Job scheduling
Unix |
https://en.wikipedia.org/wiki/Attitude-behavior%20consistency | Attitude-behavior consistency is when a person's attitude is consistent with their behavior. This is not true in many cases. The fact that people often express attitudes that are inconsistent with how they act may surprise those unfamiliar with social and behavioral science, but it is an important fact to understand because facts are often reported as if they are about people's actions when they may only be known to be true about their words. It is often much easier to conduct interviews or surveys than to obtain records of how people behave in situations. Sometimes attitudes, such as voting, are measurably consistent with behavior. In such cases it may be possible to obtain accurate estimates of behavior. However, there is no general method for correcting for attitude-behavior inconsistency.
Applications to research methodology
Attitude-behavior consistency is an important concept for social science research because claims are often made about behavior based on evidence which is really about attitudes. The attitudinal fallacy is committed when verbal data are used to support claims not about what people believe or say, but what they do. Data collection methods based on self-report, such as surveys and interviews, are vulnerable to the attitudinal fallacy if they attempt to measure behavior and if reported attitudes are inconsistent with the behavior.
Research methods that directly observe behaviors avoid the attitudinal fallacy as a matter of course. However many kinds of behavior are not easily observed. Ethnography can make rich observations and descriptions of behavior and allow for comparison between behavior and attitude. Unfortunately, in general ethnographic data cannot be used to draw statistically generalizable conclusions about behavior in a population. Ethnographers can still commit the attitudinal fallacy if they rely on quotations as evidence for behaviors. Experiments in laboratories make it possible to observe behavior, with the limitations th |
https://en.wikipedia.org/wiki/Edge%20tessellation | In geometry, an edge tessellation is a partition of the plane into non-overlapping polygons (a tessellation) with the property that the reflection of any of these polygons across any of its edges is another polygon in the tessellation.
All of the resulting polygons must be convex, and congruent to each other. There are eight possible edge tessellations in Euclidean geometry, but others exist in non-Euclidean geometry.
The eight Euclidean edge tessellations are:
In the first four of these, the tiles have no obtuse angles, and the degrees of the vertices are all even.
Because the degrees are even, the sides of the tiles form lines through the tiling, so each of these four tessellations can alternatively be viewed as an arrangement of lines. In the second four, each tile has at least one obtuse angle at which the degree is three, and the sides of tiles that meet at that angle do not extend to lines in the same way.
These tessellations were considered by 19th-century inventor David Brewster in the design of kaleidoscopes. A kaleidoscope whose mirrors are arranged in the shape of one of these tiles will produce the appearance of an edge tessellation. However, in the tessellations generated by kaleidoscopes, it does not work to have vertices of odd degree, because when the image within a single tile is asymmetric there would be no way to reflect that image consistently to all the copies of the tile around an odd-degree vertex. Therefore, Brewster considered only the edge tessellations with no obtuse angles, omitting the four that have obtuse angles and degree-three vertices.
See also
Reflection group
Citations
Tessellation |
https://en.wikipedia.org/wiki/Coin%20storage | Coin collectors have various options for storing their coin collections. The various options depend on a few different requirements such as; protection from oxidation and other chemical damage, protection from mechanical damage, ease of viewing and organization, and protection from loss or theft.
For these requirements, a few more common options include; plastic flips, cardboard flips, coin folders (press-in type), coin tubes, coin albums, and for higher value individual coins, coin slabs. The collection can then be placed in specialty designed coin storage boxes. Common storage boxes are available for 2x2 coin flips and various brands of coin slabs.
To prevent theft, coin collectors use safes and bank safety deposit boxes. Each type of storage solves some of the challenges of safely storing a coin collection, but few completely solve all of them alone; thus, many collectors use multiple layers of protection to improve the safety of their coins. The more valuable the coin and the more sizable the collection, the more elaborate the storage solutions.
While it may seem counter-intuitive, some storage methods can actually damage coins. Soft PVC and cardboard contain sulfur and other acidic or oxidizing materials. For expensive coins that can be tarnished, collectors should avoid using cardboard folders, paper or plastic bags, certain plastic tubes, and any other storage container that is not chemically inert.
References
Coin collecting
Collecting
Computer storage systems |
https://en.wikipedia.org/wiki/Coherent%20Accelerator%20Processor%20Interface | Coherent Accelerator Processor Interface (CAPI), is a high-speed processor expansion bus standard for use in large data center computers, initially designed to be layered on top of PCI Express, for directly connecting central processing units (CPUs) to external accelerators like graphics processing units (GPUs), ASICs, FPGAs or fast storage. It offers low latency, high speed, direct memory access connectivity between devices of different instruction set architectures.
History
The performance scaling traditionally associated with Moore's Law—dating back to 1965—began to taper off around 2004, as both Intel's Prescott architecture and IBM's Cell processor pushed toward a 4 GHz operating frequency. Here both projects ran into a thermal scaling wall, whereby heat extraction problems associated with further increases in operating frequency largely outweighed gains from shorter cycle times.
Over the decade that followed, few commercial CPU products exceeded 4 GHz, with the majority of performance improvements now coming from incrementally improved microarchitectures, better systems integration, and higher compute density—this largely in the form of packing a larger number of independent cores onto the same die, often at the expense of peak operating frequency (Intel's 24-core Xeon E7-8890 from June 2016 has a base operating frequency of just 2.2 GHz, so as to operate within the constraints of a single-socket 165 W power consumption and cooling budget).
Where large performance gains have been realized, it was often associated with increasingly specialized compute units, such as GPU units added to the processor die, or external GPU- or FPGA-based accelerators. In many applications, accelerators struggle with limitations of the interconnect's performance (bandwidth and latency) or with limitations due to the interconnect's architecture (such as lacking memory coherence). Especially in the datacenter, improving the interconnect became paramount in moving toward a hetero |
https://en.wikipedia.org/wiki/Hironaka%20decomposition | In mathematics, a Hironaka decomposition is a representation of an algebra over a field as a finitely generated free module over a polynomial subalgebra or a regular local ring. Such decompositions are named after Heisuke Hironaka, who used this in his unpublished master's thesis at Kyoto University.
Hironaka's criterion, sometimes called miracle flatness, states that a local ring R that is a finitely generated module over a regular Noetherian local ring S is Cohen–Macaulay if and only if it is a free module over S. There is a similar result for rings that are graded over a field rather than local.
Explicit decomposition of an invariant algebra
Let V be a finite-dimensional vector space over an algebraically closed field of characteristic zero, k, carrying a representation of a group G, and consider the polynomial algebra on V, k[V]. The algebra k[V] carries a grading with (k[V])_0 = k, which is inherited by the invariant subalgebra
k[V]^G = { f ∈ k[V] : g·f = f for all g ∈ G }.
A famous result of invariant theory, which provided the answer to Hilbert's fourteenth problem, is that if G is a linearly reductive group and V is a rational representation of G, then k[V]^G is finitely-generated. Another important result, due to Noether, is that any finitely-generated graded algebra A with A_0 = k admits a (not necessarily unique) homogeneous system of parameters (HSOP). A HSOP (also termed primary invariants) is a set of homogeneous polynomials, θ_1, …, θ_l, which satisfy two properties:
The θ_i are algebraically independent.
The zero set of the θ_i, { v ∈ V : θ_i(v) = 0 for all i }, coincides with the nullcone of V.
Importantly, this implies that the algebra k[V]^G can then be expressed as a finitely-generated module over the subalgebra k[θ_1, …, θ_l] generated by the HSOP. In particular, one may write
k[V]^G = Σ_k η_k · k[θ_1, …, θ_l],
where the η_k are called secondary invariants.
Now if k[V]^G is Cohen–Macaulay, which is the case if G is linearly reductive, then it is a free (and as already stated, finitely-generated) module over any HSOP. Thus, one in fact has a Hironaka decomposition
k[V]^G = ⊕_k η_k · k[θ_1, …, θ_l].
In particular, each element in k[V]^G can be written uniq |
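A small worked example, standard but not spelled out in the text above: take G = Z/2 acting on V = k² by (x, y) ↦ (−x, −y).

```latex
% The invariants of k[x,y] are spanned by the monomials of even total degree,
% generated by x^2, y^2 and xy. An HSOP is \theta_1 = x^2, \theta_2 = y^2,
% and the secondary invariants are 1 and xy, giving the Hironaka decomposition
\[
  k[x,y]^{\mathbb{Z}/2} \;=\; k[x^2, y^2] \,\oplus\, xy\, k[x^2, y^2],
\]
% a free module of rank 2 over k[\theta_1, \theta_2].
```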
https://en.wikipedia.org/wiki/DDoS%20attacks%20on%20Dyn | On October 21, 2016, three consecutive distributed denial-of-service attacks were launched against the Domain Name System (DNS) provider Dyn. The attack caused major Internet platforms and services to be unavailable to large swathes of users in Europe and North America. The groups Anonymous and New World Hackers claimed responsibility for the attack, but scant evidence was provided.
As a DNS provider, Dyn provides to end-users the service of mapping an Internet domain name—when, for instance, entered into a web browser—to its corresponding IP address. The distributed denial-of-service (DDoS) attack was accomplished through numerous DNS lookup requests from tens of millions of IP addresses. The activities are believed to have been executed through a botnet consisting of many Internet-connected devices—such as printers, IP cameras, residential gateways and baby monitors—that had been infected with the Mirai malware.
Affected services
Services affected by the attack included:
Airbnb
Amazon.com
Ancestry.com
The A.V. Club
BBC
The Boston Globe
Box
Business Insider
CNN
Comcast
CrunchBase
DirecTV
The Elder Scrolls Online
Electronic Arts
Etsy
Evergreen ILS
FiveThirtyEight
Fox News
The Guardian
GitHub
Grubhub
HBO
Heroku
HostGator
iHeartRadio
Imgur
Indiegogo
Mashable
National Hockey League
Netflix
The New York Times
Overstock.com
PayPal
Pinterest
Pixlr
PlayStation Network
Qualtrics
Quora
Reddit
Roblox
Ruby Lane
RuneScape
SaneBox
Seamless
Second Life
Shopify
Slack
SoundCloud
Squarespace
Spotify
Starbucks
Storify
Swedish Civil Contingencies Agency
Swedish Government
Tumblr
Twilio
Twitter
Verizon Communications
Visa
Vox Media
Walgreens
The Wall Street Journal
Wikia
Wired
Wix.com
WWE Network
Xbox Live
Yammer
Yelp
Zillow
Investigation
The US Department of Homeland Security started an investigation into the attacks, according to a White House source. No group of hackers claimed responsibility during or in th |
https://en.wikipedia.org/wiki/Dirty%20COW | Dirty COW (Dirty copy-on-write) is a computer security vulnerability of the Linux kernel that affected all Linux-based operating systems, including Android devices, that used older versions of the Linux kernel created before 2018. It is a local privilege escalation bug that exploits a race condition in the implementation of the copy-on-write mechanism in the kernel's memory-management subsystem. Computers and devices that still use the older kernels remain vulnerable.
The vulnerability was discovered by Phil Oester.
Because of the race condition, with the right timing, a local attacker can exploit the copy-on-write mechanism to turn a read-only mapping of a file into a writable mapping. Although it is a local privilege escalation, remote attackers can use it in conjunction with other exploits that allow remote execution of non-privileged code to achieve remote root access on a computer. The attack itself does not leave traces in the system log.
The vulnerability has the Common Vulnerabilities and Exposures designation CVE-2016-5195. Dirty COW was one of the first security issues transparently fixed in Ubuntu by the Canonical Livepatch service.
It has been demonstrated that the vulnerability can be utilized to root any Android device up to (and excluding) Android version 7 (Nougat).
History
The vulnerability has existed in the Linux kernel since version 2.6.22 released in September 2007, and there is information about it being actively exploited at least since October 2016. The vulnerability has been patched in Linux kernel versions 4.8.3, 4.7.9, 4.4.26 and newer.
The patch produced in 2016 did not fully address the issue and a revised patch was released on November 27, 2017, before public dissemination of the vulnerability.
Applications
The Dirty COW vulnerability has many perceived use cases including proven examples, such as obtaining root permissions in Android devices, as well as several speculated implementations. There are many binaries used in Linux which are re |
https://en.wikipedia.org/wiki/Discriminant%20Book | The Discriminant Book (German: Kenngruppenbuch; literally: Groups to identify the key to the receiver) shortened to K-Book (K. Buch), and also known as the indicator group book or identification group book was a secret distribution list in booklet form, which listed trigraphs in random order. The Kenngruppenbuch was introduced in May 1937, and used by the Kriegsmarine (German War Navy) during World War II as part of the Naval Enigma message encipherment procedure, to ensure secret and confidential communication between Karl Dönitz, Commander of Submarines (BdU) in the Atlantic and in the Mediterranean operating German submarines. The Kenngruppenbuch was used in the generation of the Enigma message Key that was transmitted within the message Indicator. The Kenngruppenbuch was used from 5 October 1941, for the Enigma Model M3, and from 1 February 1942 exclusively for the Enigma M4. It must not be confused with the Kenngruppenheft which was used with the Short Signal Book (German: Kurzsignalbuch).
History
The Kenngruppenbuch was a large document with the first edition coming into force in 1938, that mostly remained unchanged when a second edition was released in 1941. The Zuteilungsliste, however, was continually updated. After 1 May 1937, the Kriegsmarine had stopped using an Indicating system with a repetition of message key within the indicator, a serious security flaw, which was still being used by the Luftwaffe (German Airforce) and Heer (German Army) at the beginning of 1940, making the Naval Enigma more secure. The introduction of the K Book was designed to avert this serious security flaw.
On 9 May 1941, when a version of the K Book was recovered from U-boat U-110, Joan Clarke, and her compatriots at Hut 8, the section at Bletchley Park tasked with solving German naval (Kriegsmarine) Enigma messages, noticed that German telegraphists were not acting in a random way, which they were supposed to when making up the message Indicator. Rather than selecting a r |
https://en.wikipedia.org/wiki/Tim%20Stearns | Tim Stearns (born 1961 in Huntington, New York) is an American biologist and university administrator, and is the Dean of Graduate and Postgraduate Studies, Vice President of Education, and Head of Laboratory at The Rockefeller University. Stearns was formerly the Frank Lee and Carol Hall Professor in the Department of Biology at Stanford University, with appointments in the Department of Genetics and the Cancer Center in the Stanford Medical School. Stearns served as chair of the Department of Biology at Stanford as well as Acting Dean of Research and Senior Associate Vice Provost of Research. Stearns is an HHMI Professor, and is a member of JASON, a scientific advisory group. He has served on the editorial boards of The Journal of Cell Biology, Genetics and Molecular Biology of the Cell.
Education
Stearns received his B.S. in genetics from Cornell University and did his undergraduate thesis work in the lab of Tom Fox on nuclear control of mitochondrial function in yeast. He received his Ph.D. in biology from the Massachusetts Institute of Technology. His Ph.D. advisor at MIT was David Botstein, and the title of his thesis was "Genetic analysis of the yeast microtubule cytoskeleton." Stearns' thesis identified exceptions to the genetic complementation test that were useful for defining genetic interactions and for the first time used the term "synthetic lethality" in the modern sense of two non-lethal mutations resulting in lethality in the double mutant. Stearns credits Botstein with instilling in him a commitment to teaching, and the belief that teaching and research go hand-in-hand.
Professional career
Stearns is known for his work on problems in cell biology and developmental biology, with a focus on the structure and function of the centrosome and cilium of eukaryotic cells. He was a Helen Hay Whitney postdoctoral fellow with Marc Kirschner at UCSF, where he published work on gamma-tubulin and in vitro reconstitution of the centrosome. Stearns has been a |
https://en.wikipedia.org/wiki/Thanatotranscriptome | The thanatotranscriptome denotes all RNA transcripts produced from the portions of the genome still active or awakened in the internal organs of a body following its death. It is relevant to the study of the biochemistry, microbiology, and biophysics of thanatology, in particular within forensic science. Some genes may continue to be expressed in cells for up to 48 hours after death, producing new mRNA. Certain genes that are generally inhibited since the end of fetal development may be expressed again at this time.
Scientific history
Clues to the existence of a post-mortem transcriptome existed at least since the beginning of the 21st century, but the word thanatotranscriptome (from thanatos-, Greek for "death") seems to have been first used in the scientific literature by Javan et al. in 2015, following the introduction of the concept of the human thanatomicrobiome in 2014 at the 66th Annual Meeting of the American Academy of Forensic Sciences in Seattle, Washington.
In 2016, researchers at the University of Washington confirmed that up to 2 days (48 hours) after the death of mice and zebrafish, many genes still functioned. Changes in the quantities of mRNA in the bodies of the dead animals proved that hundreds of genes with very different functions awoke just after death. The researchers detected 548 genes that awoke after death in zebrafish and 515 in laboratory mice. Among these were genes involved in development of the organism, including genes that are normally activated only in utero or in ovo (in the egg) during fetal development.
The thanatomicrobiome is characterized by a diverse assortment of microorganisms located in internal organs (brain, heart, liver, and spleen) and blood samples collected after a human dies. It is defined as the microbial community of internal body sites, created by a successional process whereby trillions of microorganisms populate, proliferate, and/or die within the dead body, resulting in temporal modifications in the comm |
https://en.wikipedia.org/wiki/Rocker%20Shovel%20Loader | A Rocker Shovel Loader, sometimes simply referred to as a Rocker Shovel or Mucker is a type of mechanical loader used in underground mining.
A Rocker Shovel is usually powered by compressed air, or in some cases electricity. It is commonly mounted on steel wheels designed to run on narrow gauge rails, with some later models using metal or rubber-tyred road wheels. The operator, standing on a raised platform to one side of the machine, operates the controls, one lever to drive the machine along the tracks, and another to raise and lower the bucket. Once the bucket has been filled by driving the loader forwards into the pile of material, the rocker mechanism throws the contents over the top of the machine and into a wagon behind. Once full, the loaded wagon can be taken away and replaced with an empty one to allow loading to continue.
On 28 May 1937, Edwin Burton Royle applied for a patent as inventor of the "loading machine" and US Patent No. 2,134,582 was issued on October 25, 1938 and assigned to the Eastern Iron Metals Company (later to be known as EIMCO).
In 2000, the American Society of Mechanical Engineers added the EIMCO 12B Rocker Shovel Loader of 1938 to its List of Historic Mechanical Engineering Landmarks as reference number 212 out of a total number of 259 objects (as of 2015).
In June 2012, an EIMCO 12B Rocker Shovel was featured in an episode of the American reality television series Auction Hunters, filmed in Littleton, Colorado. It was sold to a gold miner for $3,600.
References
External links
Video of EIMCO 12B being demonstrated at Lea Bailey Light Railway
Mining equipment |
https://en.wikipedia.org/wiki/IET%20Software | IET Software is a peer-reviewed scientific journal on software engineering and related issues, published by the Institution of Engineering and Technology (IET) in the United Kingdom.
The journal was previously published under the following titles:
Software & Microsystems (1982–1986)
Software Engineering Journal (1986–1996)
IEE Proceedings - Software (1997–2006)
The journal is listed on the online IEEE Xplore Digital Library. It is indexed by DBLP, EBSCO, Ei Compendex, IET Inspec, ProQuest, Science Citation Index Expanded (SCI-E), SCImago, and Scopus.
See also
IEEE Software magazine
IEEE Transactions on Software Engineering journal
References
External links
IET Software home page
2007 establishments in the United Kingdom
Bimonthly journals
Computer science in the United Kingdom
Computer science journals
English-language journals
Institution of Engineering and Technology academic journals
Academic journals established in 2007
Software engineering publications |
https://en.wikipedia.org/wiki/Information%20and%20Software%20Technology | Information and Software Technology is a peer-reviewed scientific journal on software development and related issues, published by Elsevier. The journal was established in 1959 as Data Processing, obtaining its current title in 1987. The journal is abstracted and indexed in Scopus.
According to the Journal Citation Reports, the journal has a 2021 impact factor of 3.862.
References
External links
Academic journals established in 1959
Software engineering publications
English-language journals
Elsevier academic journals
10 times per year journals |
https://en.wikipedia.org/wiki/Developer%20ecosystem | A developer ecosystem is a set of software developers functioning as a unit and interacting with a shared market for software artefacts. Developer ecosystems are similar to software ecosystems.
Many software platform producers are currently attempting to create developer ecosystems, by mobilizing developers from other organizations to create extensions to those platforms. Examples of developer programs are the Microsoft Developer Network and the SAP Developer Network.
References
Software development process |
https://en.wikipedia.org/wiki/Toi%20%28programming%20language%29 | Toi is an imperative, type-sensitive language that provides the basic functionality of a programming language. The language was designed and developed from the ground-up by Paul Longtine. Written in C, Toi was created with the intent to be an educational experience and serves as a learning tool (or toy, hence the name) for those looking to familiarize themselves with the inner-workings of a programming language.
Specification
Types
0 VOID - Null, no data
1 ADDR - Address type (bytecode)
2 TYPE - A `type` type
3 PLIST - Parameter list
4 FUNC - Function
5 OBJBLDR - Object builder
6 OBJECT - Object/Class
7 G_PTR - Generic pointer
8 G_INT - Generic integer
9 G_FLOAT - Generic double
10 G_CHAR - Generic character
11 G_STR - Generic string
12 S_ARRAY - Static array
13 D_ARRAY - Dynamic array
14 H_TABLE - Hashtable
15 G_FIFO - Stack
Runtime
Runtime context definition
The runtime context keeps track of an individual thread's metadata, such as:
The operating stack
The operating stack where current running instructions push/pop to.
refer to STACK DEFINITION
Namespace instance
Data structure that holds the references to variable containers, also providing the interface for Namespace Levels.
refer to NAMESPACE DEFINITION
Argument stack
Arguments to function calls are pushed on to this stack, flushed on call.
refer to STACK DEFINITION, FUNCTION DEFINITION
Program counter
An interface around bytecode to keep track of traversing line-numbered instructions.
refer to PROGRAM COUNTER DEFINITION
This context gives definition to an 'environment' where code is executed.
Namespace definition
A key part to any operational computer language is the notion of a 'Namespace'.
This notion of a 'Namespace' refers to the ability to declare a name, along with
needed metadata, and call upon the same name to retrieve the values associated
with that name.
In this definition, the namespace will provide the following key mechanisms:
Declari |
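Toi itself is written in C; as a purely illustrative sketch (not the actual Toi source), a namespace of the kind described above could be a chain of scope levels, each mapping declared names to variable containers:

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Hypothetical sketch only. A namespace level maps names to variable slots;
   levels are chained so inner scopes can shadow and fall back to outer ones. */
struct binding  { char *name; void *var; struct binding *next; };
struct ns_level { struct binding *bindings; struct ns_level *parent; };

void ns_declare(struct ns_level *ns, const char *name, void *var) {
    struct binding *b = malloc(sizeof *b);
    b->name = strdup(name);
    b->var  = var;
    b->next = ns->bindings;
    ns->bindings = b;
}

/* Look the name up in the current level first, then in enclosing levels. */
void *ns_lookup(const struct ns_level *ns, const char *name) {
    for (; ns != NULL; ns = ns->parent)
        for (const struct binding *b = ns->bindings; b != NULL; b = b->next)
            if (strcmp(b->name, name) == 0)
                return b->var;
    return NULL;  /* undeclared name */
}

int main(void) {
    struct ns_level global = {0};
    struct ns_level local  = { .bindings = NULL, .parent = &global };
    int x = 42;
    ns_declare(&global, "x", &x);
    printf("x -> %d\n", *(int *)ns_lookup(&local, "x"));  /* found via parent level */
    return 0;
}
```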
https://en.wikipedia.org/wiki/Regulus%20%28geometry%29 | In three-dimensional space, a regulus R is a set of skew lines, every point of which is on a transversal which intersects an element of R only once, and such that every point on a transversal lies on a line of R.
The set of transversals of R forms an opposite regulus S. In ℝ3 the union R ∪ S is the ruled surface of a hyperboloid of one sheet.
Three skew lines determine a regulus:
The locus of lines meeting three given skew lines is called a regulus. Gallucci's theorem shows that the lines meeting the generators of the regulus (including the original three lines) form another "associated" regulus, such that every generator of either regulus meets every generator of the other. The two reguli are the two systems of generators of a ruled quadric.
According to Charlotte Scott, "The regulus supplies extremely simple proofs of the properties of a conic...the theorems of Chasles, Brianchon, and Pascal ..."
In a finite geometry PG(3, q), a regulus has q + 1 lines. For example, in 1954 William Edge described a pair of reguli of four lines each in PG(3,3).
Robert J. T. Bell described how the regulus is generated by a moving straight line. First, the hyperboloid x²/a² + y²/b² − z²/c² = 1 is factored as
(x/a + z/c)(x/a − z/c) = (1 + y/b)(1 − y/b).
Then two systems of lines, parametrized by λ and μ, satisfy this equation:
x/a + z/c = λ(1 + y/b),  x/a − z/c = (1/λ)(1 − y/b)
and
x/a + z/c = μ(1 − y/b),  x/a − z/c = (1/μ)(1 + y/b).
No member of the first set of lines is a member of the second. As λ or μ varies, the hyperboloid is generated. The two sets represent a regulus and its opposite. Using analytic geometry, Bell proves that no two generators in a set intersect, and that any two generators in opposite reguli do intersect and form the plane tangent to the hyperboloid at that point. (page 155).
See also
References
H. G. Forder (1950) Geometry, page 118, Hutchinson's University Library.
Geometry
Quadrics |
https://en.wikipedia.org/wiki/Coxeter%20decompositions%20of%20hyperbolic%20polygons | A Coxeter decomposition of a polygon is a decomposition into a finite number of polygons in which any two sharing a side are reflections of each other along that side. Hyperbolic polygons are the analogues of Euclidean polygons in hyperbolic geometry. A hyperbolic n-gon is an area bounded by n segments, rays, or entire straight lines. The standard model for this geometry is the Poincaré disk model. A major difference between Euclidean and hyperbolic polygons is that the sum of internal angles of a hyperbolic polygon is not the same as Euclidean polygons. In particular, the sum of the angles of a hyperbolic triangle is less than 180 degrees. Coxeter decompositions are named after Harold Scott MacDonald Coxeter, an accomplished 20th century geometer. He introduced the Coxeter group, an abstract group generated by reflections. These groups have many uses, including producing the rotations of Platonic solids and tessellating the plane.
Coxeter decompositions
Given a polygon P, a group G can be generated by reflecting P around its sides. If the angles of P are π/k for natural numbers k, then G will be discrete. A Coxeter decomposition of a polygon is a decomposition into a finite number of polygons in which any two sharing a side are reflections of each other along that side.
The goal of a Coxeter decomposition is to break up a polygon into a composition of congruent triangles reflected on its sides.
Hyperbolic triangles
If triangle ABC can undergo Coxeter decomposition and has angles , where is the number of times the th angle is broken up, the triangle ABC can be written as . Several properties of these fundamental polygons are known for hyperbolic triangles.
The fundamental triangle has a right angle. The proof of this involves two cases dependent on if the angles of the decomposed triangle are fundamental. If they are not, then it follows that since the process of decomposition is finite, eventually a fundamental triangle will be formed with a right angle. If |
https://en.wikipedia.org/wiki/Ballot%20selfie | A ballot selfie is a type of selfie that is intended to depict the photographer's completed ballot in an election, as a way of showing how the photographer cast their vote. Ballot selfies have risen in prominence alongside the increasing availability of smartphone digital cameras and the use of social media in the 21st century. They have also generated controversy as potential violations of laws enacted in the late 19th and early 20th centuries to curtail vote buying, particularly in the United States, though some U.S. courts have rejected restrictions on ballot selfies as inconsistent with the U.S. Constitution's First Amendment guarantees of freedom of speech.
Voters typically take and share ballot selfies to encourage others to vote, to demonstrate their civic involvement, and to express their choice of candidate. The selfie is often taken in or near a voting booth and the ballot paper is often marked. Others do not take pictures of themselves in the voting booth, but photograph their ballots (including absentee ballots) or the voting machines, either before or after filling them out.
Issues
Several concerns have arisen over ballot selfies, typically focused on issues of ballot secrecy, voter fraud, and voter intimidation. These have led to laws prohibiting or restricting ballot selfies in some places, or the application or revision of existing laws to cover the practice, although enforcement has not been widespread in U.S. jurisdictions. Some authorities have indicated that prosecution would be unlikely unless there was some indication that the photograph was associated with voter fraud or intimidation or a vote-buying scheme.
Legality
Laws regarding ballot selfies vary by country and jurisdiction, often with laws varying by jurisdiction even within a country.
Brazil
Brazil's election laws ensure the secrecy of the vote; therefore, taking any photos of the voting machine (or, for that matter, using any electronic device while voting) is a crime subject to |
https://en.wikipedia.org/wiki/Prairie%20remnant | A prairie remnant commonly refers to grassland areas in the Western and Midwestern United States and Canada that remain to some extent undisturbed by European settlement. Prairie remnants range in levels of degradation but nearly all contain at least some semblance of the pre-Columbian local plant assemblage of a particular region. Prairie remnants have become increasingly threatened due to the threats of agricultural, urban and suburban development, pollution, fire suppression, and the incursion of invasive species.
Prairie remnants in restoration ecology
Prairie remnants offer valuable varieties of rare species thus providing excellent opportunities for restoration ecology projects. Many restoration projects are simply recreations of prairie habitats, but restoring prairie remnants provides the preservation of more complete ecological structures that were naturally created after the end of the last ice age. Remnants can also serve as platforms for additional surrounding ecological restoration activities.
Tallgrass prairies in North America
It has been estimated that 99% of tallgrass prairie habitats in North America have been destroyed mainly due to conversion to agriculture. Tallgrass prairies are generally composed of a mixture of native grasses, sedges, and forbs but are usually dominated by grasses.
Shortgrass prairies in North America
The shortgrass prairie is an ecosystem located in the Great Plains of North America. The prairie includes lands to the west as far as the eastern foothills of the Rocky Mountains and extends east as far as Nebraska and north into Saskatchewan. The prairie stretches through parts of Alberta, Wyoming, Montana, North Dakota, South Dakota, and Kansas, and passes south through the high plains of Colorado, Oklahoma, Texas, and New Mexico.
References
Grasslands of North America
Conservation biology |
https://en.wikipedia.org/wiki/Xerox%20Character%20Code%20Standard | The Xerox Character Code Standard (XCCS) is a historical 16-bit character encoding that was created by Xerox in 1980 for the exchange of information between elements of the Xerox Network Systems Architecture. It encodes the characters required for languages using the Latin, Arabic, Hebrew, Greek and Cyrillic scripts, the Chinese, Japanese and Korean writing systems, and technical symbols.
It can be viewed as an early precursor of, and inspiration for, the Unicode Standard.
The International Character Set (ICS) is compatible with XCCS.
The XCCS 2.0 (1990) revision covers Latin, Arabic, Hebrew, Gothic, Armenian, Runic, Georgian, Greek, Cyrillic, Hiragana, Katakana, Bopomofo scripts, technical, and mathematical symbols.
Code charts
Character sets overview
Character set 0x00
Character set 0x21
Character set 0x22
Character set 0x23
Character set 0x24
Character set 0x25
Character set 0x26
Character set 0x27
Character set 0x28
Character set 0x30
Character set 0x31
Character set 0xE0
Character set 0xE1
Character set 0xE2
Character set 0xE3
Character set 0xEE
Character set 0xEF
Character set 0xF0
Character set 0xF1
See also
Interscript
Lotus Multi-Byte Character Set (LMBCS)
References
Further reading
(100 pp.)
Character encoding
Character sets
Computer-related introductions in 1980
Character Code Standard |
https://en.wikipedia.org/wiki/Stanley%20decomposition | In commutative algebra, a Stanley decomposition is a way of writing a ring in terms of polynomial subrings. They were introduced by Richard Stanley.
Definition
Suppose that a ring R is a quotient of a polynomial ring k[x1,...] over a field by some ideal. A Stanley decomposition of R is a representation of R as a direct sum (of vector spaces)
R = ⊕_α x_α k[X_α]
where each x_α is a monomial and each X_α is a finite subset of the generators.
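For instance, a standard example not drawn from the cited reference:

```latex
% In R = k[x,y]/(xy) every surviving monomial is either a power of x or a
% positive power of y, so as a vector space
\[
  k[x,y]/(xy) \;=\; k[x] \,\oplus\, y\, k[y],
\]
% a Stanley decomposition with monomials x_\alpha \in \{1, y\} and
% generator subsets X_\alpha = \{x\} and \{y\} respectively.
```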
See also
Rees decomposition
Hironaka decomposition
References
Commutative algebra |