https://en.wikipedia.org/wiki/Unary%20function | In mathematics, a unary function is a function that takes one argument. A unary operator belongs to a subset of unary functions, in that its range coincides with its domain. In contrast, a unary function's domain may or may not coincide with its range.
Examples
The successor function, denoted S, is a unary operator. Its domain and codomain are the natural numbers; its definition is S(n) = n + 1.
In many programming languages such as C, executing this operation is denoted by postfixing ++ to the operand, i.e. the use of x++ is equivalent to executing the assignment x = x + 1.
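As a sketch, the successor function and the equivalent assignment can be written in Python (which has no ++ operator, so the assignment form is spelled out):

```python
def successor(n: int) -> int:
    """A unary operator on the naturals: one argument, and its range equals its domain."""
    return n + 1

x = 5
x = x + 1              # what C writes as x++
assert x == successor(5) == 6
```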
Many of the elementary functions are unary functions, including the trigonometric functions, logarithm with a specified base, exponentiation to a particular power or base, and hyperbolic functions.
See also
Arity
Binary function
Binary operator
List of mathematical functions
Ternary operation
Unary operation |
https://en.wikipedia.org/wiki/American%20Society%20for%20Virology | The American Society for Virology (ASV) is an American scientific society serving the community of researchers in virology. The organization was founded in 1981 and was the first scientific society in the world dedicated exclusively to virology.
Founding and history
Historically, virology has been considered a subdiscipline of microbiology. The motivation for founding a society specifically for virologists dates to the mid-1960s and originated in the community's dissatisfaction with its representation in existing microbiology societies, most notably the International Association of Microbiological Societies and the American Society for Microbiology. The society was formally founded following a meeting of 40 prominent virology researchers, organized by Bernard Roizman, at O'Hare International Airport in Chicago on June 9, 1981. Its first official annual meeting, organized by Milt Zaitlin, took place at Cornell University in August 1982, by which time its membership had reached almost 1,000 scientists.
The founding president of the ASV was Wolfgang Joklik, who served from 1981 to 1983. Other notable founding members, who signed letters sent to the virology community soliciting opinions about the possible future society in advance of the O'Hare meeting, were David Baltimore, Purnell Choppin, Harold Ginsberg, Thomas Merigan, Bernard Roizman, Peter K. Vogt, Bob Wagner, Julius Youngner, and Norton Zinder. Ginsberg, Wagner, Choppin, and Youngner all served subsequent terms as president.
Activities
The ASV continues to host an annual scientific meeting every summer on a selected university campus in the United States or Canada. The society also provides career and educational information, including an online jobs directory, and received a grant from the Alfred P. Sloan Foundation to support a website documenting the history of virology. The website is maintained by former ASV president Sondra Schlesinger.
Since April 2021, the society's president has been Dr. Colin Parrish of Cornell Un |
https://en.wikipedia.org/wiki/Retiperidiolia | Retiperidiolia is a genus of fungi in the family Nidulariaceae. Basidiocarps (fruit bodies) are typically under 10 mm in diameter and irregularly spherical. Each produces a number of peridioles which contain the spores and are released from the disintegrating fruit bodies at maturity. Species are usually found growing on herbaceous stems and other plant debris. The genus has a tropical distribution. Species were previously referred to Mycocalia, but molecular research, based on cladistic analysis of DNA sequences, found that they were not closely related.
See also
List of Agaricales genera |
https://en.wikipedia.org/wiki/William%20A.%20Veech | William A. Veech was the Edgar O. Lovett Professor of Mathematics at Rice University until his death. His research concerned dynamical systems; he is particularly known for his work on interval exchange transformations, and is the namesake of the Veech surface. He died unexpectedly on August 30, 2016 in Houston, Texas.
Education
Veech graduated from Dartmouth College in 1960, and earned his Ph.D. in 1963 from Princeton University under the supervision of Salomon Bochner.
Contributions
An interval exchange transformation is a dynamical system defined from a partition of the unit interval into finitely many smaller intervals, and a permutation on those intervals. Veech and Howard Masur independently discovered that, for almost every partition and every irreducible permutation, these systems are uniquely ergodic, and they also made contributions to the theory of weak mixing for these systems. The Rauzy–Veech–Zorich induction map, a function from and to the space of interval exchange transformations, is named in part after Veech: Rauzy defined the map, Veech constructed an infinite invariant measure for it, and Zorich strengthened Veech's result by making the measure finite.
The Veech surface and the related Veech group are named after Veech, as is the Veech dichotomy according to which geodesic flow on the Veech surface is either periodic or ergodic.
Veech played a role in the Nobel-prize-winning discovery of buckminsterfullerene in 1985 by a team of Rice University chemists including Richard Smalley. At that time, Veech was chair of the Rice mathematics department, and was asked by Smalley to identify the shape that the chemists had determined for this molecule. Veech answered, "I could explain this to you in a number of ways, but what you've got there, boys, is a soccer ball."
Veech is the author of A Second Course in Complex Analysis (W. A. Benjamin, 1967; Dover, 2008).
Awards and honors
In 2012, Veech became one of the inaugural fellows of the American Mathemati |
https://en.wikipedia.org/wiki/Shock%20detector | A shock detector, shock indicator, or impact monitor is a device which indicates whether a physical shock or impact has occurred. These usually have a binary output (go/no-go) and are sometimes called shock overload devices. Shock detectors can be used on shipments of fragile valuable items to indicate whether a potentially damaging drop or impact may have occurred. They are also used in sports helmets to help estimate if a dangerous impact may have occurred.
By contrast, a shock data logger is a data acquisition system for analysis and recording of shock pulses.
Overview
Shocks and impacts are often specified by the peak acceleration, expressed in g (sometimes called g-force). The form of the shock pulse, and particularly its duration, are equally important. For example, a short 1 ms, 300 g shock has little damage potential and is not usually of interest, but a 20 ms, 300 g shock might be critical. Depending on the use, the time sensitivity of a shock detector needs to be matched to the sensitivity of the item it is intended to monitor.
The mounting location also affects the response of most shock detectors. A rigid item such as a sports helmet or a rigid package may respond to a field shock with a jagged shock pulse which, without proper filtering, is difficult to characterize. A shock on a cushioned item usually has a smoother shock pulse, and thus produces more consistent responses from a shock detector.
Shocks are vector quantities, and the direction of a shock matters to the item of interest; shock detectors can likewise be highly sensitive to the direction of the input shock.
A shock detector can be evaluated:
• Separately in a laboratory physical test, perhaps on an instrumented shock machine.
• Mounted to its intended item in a testing laboratory with controlled fixturing and controlled input shocks.
• In the field with uncontrolled and more highly variable input shocks.
Use of proper test methods and Verification a |
https://en.wikipedia.org/wiki/T%20memory%20stem%20cell | A T memory stem cell (TSCM) is a type of long-lived memory T cell with the ability to reconstitute the full diversity of memory and effector T cell subpopulations as well as to maintain their own pool through self-renewal. TSCM represent an intermediate subset between naïve (Tn) and central memory (Tcm) T cells, expressing both naïve T cells markers, such as CD45RA+, CD45RO-, high levels of CD27, CD28, IL-7Rα (CD127), CD62L, and C-C chemokine receptor 7 (CCR7), as well as markers of memory T cells, such as CD95, CD122 (IL-2Rβ), CXCR3, LFA-1. These cells represent a small fraction of circulating T cells, approximately 2-3%. Like naïve T cells, TSCM cells are found more abundantly in lymph nodes than in the spleen or bone marrow; but in contrast to naïve T cells, TSCM cells are clonally expanded. Similarly to memory T cells, TSCM are able to rapidly proliferate and secrete pro-inflammatory cytokines (IFN-γ, IL-2, and TNF-α) in response to antigen re-exposure, but show higher proliferation potential compared with Tcm cells; their homeostatic turnover is also dependent on IL-7 and IL-15.
Differentiation
Longitudinal studies on TSCM dynamics in patients undergoing hematopoietic stem cell transplantation (HSCT) have shown that donor-derived TSCM cells were highly enriched early after HSCT, differentiated directly from Tn, and that Tn and TSCM cells (but not central memory or effector T cells) were able to reconstitute the entire heterogeneity of memory T cell subsets including TSCM cells. Together with the transcriptome analysis of differentially expressed genes reflecting the relatedness of TSCM and Tn cells, these data support the existing hierarchical model of human T cell differentiation: naïve T cells (Tn) → stem cell like memory T cells (T scm) → central memory T cells (Tcm) → effector memory T cells (Tem) / effector T cells (Teff).
After primary antigen exposure and elimination, antigen-specific TSCM preferentially survive among memory T cells and stably persist |
https://en.wikipedia.org/wiki/Stokesian%20dynamics | Stokesian dynamics
is a solution technique for the Langevin equation, which is the relevant form of Newton's second law for a Brownian particle. The method treats the suspended particles in a discrete sense while the continuum approximation remains valid for the surrounding fluid, i.e., the suspended particles are generally assumed to be significantly larger than the molecules of the solvent. The particles then interact through hydrodynamic forces transmitted via the continuum fluid, and when the particle Reynolds number is small, these forces are determined through the linear Stokes equations (hence the name of the method). In addition, the method can also resolve non-hydrodynamic forces, such as Brownian forces, arising from the fluctuating motion of the fluid, and interparticle or external forces. Stokesian dynamics can thus be applied to a variety of problems, including sedimentation, diffusion and rheology, and it aims to provide the same level of understanding for multiphase particulate systems as molecular dynamics does for statistical properties of matter. For N rigid particles of radius a suspended in an incompressible Newtonian fluid of viscosity η and density ρ, the motion of the fluid is governed by the Navier–Stokes equations, while the motion of the particles is described by the coupled equation of motion

m · dU/dt = F^H + F^B + F^P

In the above equation, U is the particle translational/rotational velocity vector of dimension 6N and m is the corresponding generalized mass/moment-of-inertia matrix. F^H is the hydrodynamic force, i.e., the force exerted by the fluid on the particles due to relative motion between them. F^B is the stochastic Brownian force due to thermal motion of fluid particles. F^P is the deterministic nonhydrodynamic force, which may be almost any form of interparticle or external force, e.g. electrostatic repulsion between like-charged particles. Brownian dynamics is one of the popular techniques of solving the Langevin equation, but the hydrodynamic interaction in Brownian dynamics is highly simplified and normally includes only the isolated body
https://en.wikipedia.org/wiki/Perceptics | Perceptics LLC is a developer and manufacturer of automated license plate recognition (LPR) equipment based in Farragut, Tennessee, founded in approximately 1978. John Dalton is the CEO. A large hack of their data exposed their operations, as well as the locations of installations.
Their technology is used by the U.S. Customs and Border Protection (CBP) at 43 border crossings, both to Mexico and Canada, as part of a partnership with Unisys Federal Systems. Perceptics is the exclusive license plate recognition provider for CBP. Perceptics operated as a subcontractor to Unisys for the license plate reader contract, worth $229 million over several years. As of 2019, Perceptics has worked on CBP contracts for "nearly 30 years". They also provide "under-vehicle surveillance systems", and have contracts with the Drug Enforcement Administration checkpoints, the Canada Border Services Agency, United Arab Emirates, and Saudi Arabia's Special Forces, and the Jordanian army.
Perceptics was previously a subsidiary of Northrop Grumman. They have been filling CBP contracts since 1982 and license plate readers since 1997. In 2002 the equipment cost was approximately $90,000 per lane.
Perceptics also discussed promoting their license plate reading technology for use in a congestion pricing scheme with the MTA in New York City, in a presentation titled "Smart Imaging Solutions for New York City Congestion Pricing". They demoed the technology to MTA's Bridges and Tunnels division. The Perceptics system provides many more capabilities than license plate reading, such as a "Vehicle Occupancy Imaging System", which can identify drivers and passengers, as well as tracking car locations and driver behavior as a profile. Perceptics and Unisys were also involved in a CBP trial project called the "Vehicle Face System", involving facial recognition of car occupants.
Perceptics used Amazon Rekognition as of August 2018.
Canadian operations
Data from the hack revealed the Canada Border Services Age |
https://en.wikipedia.org/wiki/Cluster%20decay | Cluster decay, also named heavy particle radioactivity, heavy ion radioactivity or heavy cluster decay, is a rare type of nuclear decay in which an atomic nucleus emits a small "cluster" of neutrons and protons, more than in an alpha particle, but less than a typical binary fission fragment. Ternary fission into three fragments also produces products in the cluster size. The loss of protons from the parent nucleus changes it to the nucleus of a different element, the daughter, with a mass number Ad = A − Ae and atomic number Zd = Z − Ze, where Ae = Ne + Ze.
For example:
²²³Ra → ²⁰⁹Pb + ¹⁴C
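This bookkeeping can be sketched in Python, using the first observed cluster decay (radium-223 emitting a carbon-14 cluster) as a check:

```python
def cluster_daughter(A, Z, A_e, Z_e):
    """Daughter (A_d, Z_d) after a nucleus (A, Z) emits a cluster (A_e, Z_e):
    A_d = A - A_e and Z_d = Z - Z_e."""
    return A - A_e, Z - Z_e

# Radium-223 (A=223, Z=88) emitting carbon-14 (A_e=14, Z_e=6) leaves lead-209
assert cluster_daughter(223, 88, 14, 6) == (209, 82)
```

The same arithmetic covers alpha decay as the special case A_e = 4, Z_e = 2.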
This type of rare decay mode was observed in radioisotopes that decay predominantly by alpha emission, and it occurs only in a small percentage of the decays for all such isotopes.
The branching ratio with respect to alpha decay is rather small (see the Table below).
T_a and T_c are the half-lives of the parent nucleus relative to alpha decay and cluster radioactivity, respectively.
Cluster decay, like alpha decay, is a quantum tunneling process: in order to be emitted, the cluster must penetrate a potential barrier. This is a different process from the more random nuclear disintegration that precedes light-fragment emission in ternary fission. Ternary fission may be the result of a nuclear reaction, but can also be a type of spontaneous radioactive decay in certain nuclides, demonstrating that input energy is not necessarily needed for fission; mechanistically, however, fission remains a fundamentally different process.
Theoretically, any nucleus with Z > 40 for which the released energy (Q value) is a positive quantity can be a cluster emitter. In practice, observations are severely restricted by the limitations of currently available experimental techniques, which require a sufficiently short half-life, T_c < 10³² s, and a sufficiently large branching ratio, B > 10⁻¹⁷.
In the absence of any energy loss for fragment deformation and excitation, as in cold fission phenomena or in alpha decay, the total kinetic e |
https://en.wikipedia.org/wiki/Zumbox | Zumbox was a proposed hybrid mail service for receiving postal mail via the web. The service was intended to work in parallel to traditional postal services, whereby a digital mailbox—a Zumbox— would be created for every street address in the U.S.
People were able to claim their Zumbox via a paper mail confirmation code.
Zumbox provided security in compliance with PCI DSS, BITS and HIPAA security rules and regulations.
Zumbox was founded in 2007 in Los Angeles, and obtained more than $30 million in venture capital. After seven years of development, Zumbox announced in 2014 that it lacked sufficient investor interest to launch as a national brand. |
https://en.wikipedia.org/wiki/European%20Physical%20Journal%20C | The European Physical Journal C (EPJ C) is a biweekly peer-reviewed, open access scientific journal covering theoretical and experimental physics. It is part of the SCOAP3 initiative.
See also
European Physical Journal |
https://en.wikipedia.org/wiki/Bounce%20Address%20Tag%20Validation | In computing, Bounce Address Tag Validation (BATV) is a method, defined in an Internet Draft, for determining whether the bounce address specified in an E-mail message is valid. It is designed to reject backscatter, that is, bounce messages to forged return addresses.
Overview
The basic idea is to send all e-mail with a return address that includes a timestamp and a cryptographic token that cannot be forged. Any e-mail that is returned as a bounce without a valid signature can then be rejected. E-mail that is itself a bounce should have an empty (null) return address so that bounces are never created for a bounce, preventing messages from bouncing back and forth forever.
BATV replaces an envelope sender like mailbox@example.com with prvs=tag-value=mailbox@example.com, where prvs, called "Simple Private Signature", is just one of the possible tagging schemes; in fact, it is the only one fully specified in the draft. The BATV draft gives a framework that other possible techniques can fit into. Other types of implementations, such as using public key signatures that can be verified by third parties, are mentioned but left undefined. The overall framework is flexible enough that similar systems such as the Sender Rewriting Scheme can fit into it.
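A minimal Python sketch of prvs-style signing and verification follows. The tag layout used here (one key-number digit, a three-digit expiry day, then six hex digits of an HMAC) is an assumption for illustration; the draft defines the authoritative format:

```python
import hashlib
import hmac

def batv_sign(local_part, domain, key, day_of_year, key_num=0):
    """Rewrite a return address with an assumed prvs-style tag:
    key number + 3-digit expiry day + 6 hex digits of an HMAC."""
    payload = f"{key_num}{day_of_year:03d}{local_part}@{domain}".encode()
    sig = hmac.new(key, payload, hashlib.sha1).hexdigest()[:6]
    return f"prvs={key_num}{day_of_year:03d}{sig}={local_part}@{domain}"

def batv_verify(tagged, key):
    """Check a bounced-to address by recomputing the tag from its fields."""
    tag, addr = tagged.removeprefix("prvs=").split("=", 1)
    key_num, day = int(tag[0]), int(tag[1:4])
    local_part, domain = addr.rsplit("@", 1)
    expected = batv_sign(local_part, domain, key, day, key_num)
    return hmac.compare_digest(tagged, expected)

tagged = batv_sign("mailbox", "example.com", b"secret-key", 123)
# tagged looks like "prvs=0123xxxxxx=mailbox@example.com"
```

A bounce arriving at an address that fails `batv_verify` (or carries no tag at all) can be rejected as backscatter.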
History
Sami Farin proposed an Anti-Bogus Bounce System in 2003 in news.admin.net-abuse.email, which used the same basic idea of putting a hard to forge hash in a message's bounce address.
In late 2004, Goodman et al. proposed a much more complex "Signed Envelope Sender" that included a hash of the message body and was intended to address a wide variety of forgery threats, including bounces from forged mail. Several months later, Levine and Crocker proposed BATV under its current name and close to its current form.
Problems
The draft anticipates some problems running BATV.
Some mailing list managers (e.g. ezmlm) still key on the bounce address, and will not recognize it after BATV mangling. |
https://en.wikipedia.org/wiki/Suid%20gammaherpesvirus%205 | Suid gammaherpesvirus 5 (SuHV-5) is a species of virus in the genus Macavirus, subfamily Gammaherpesvirinae, family Herpesviridae, and order Herpesvirales. |
https://en.wikipedia.org/wiki/Linear-feedback%20shift%20register | In computing, a linear-feedback shift register (LFSR) is a shift register whose input bit is a linear function of its previous state.
The most commonly used linear function of single bits is exclusive-or (XOR). Thus, an LFSR is most often a shift register whose input bit is driven by the XOR of some bits of the overall shift register value.
The initial value of the LFSR is called the seed, and because the operation of the register is deterministic, the stream of values produced by the register is completely determined by its current (or previous) state. Likewise, because the register has a finite number of possible states, it must eventually enter a repeating cycle. However, an LFSR with a well-chosen feedback function can produce a sequence of bits that appears random and has a very long cycle.
Applications of LFSRs include generating pseudo-random numbers, pseudo-noise sequences, fast digital counters, and whitening sequences. Both hardware and software implementations of LFSRs are common.
The mathematics of a cyclic redundancy check, used to provide a quick check against transmission errors, are closely related to those of an LFSR. In general, the arithmetic behind LFSRs makes them elegant objects to study and implement. One can produce relatively complex logic with simple building blocks. However, other methods that are less elegant but perform better should be considered as well.
Fibonacci LFSRs
The bit positions that affect the next state are called the taps. In the diagram the taps are [16,14,13,11]. The rightmost bit of the LFSR is called the output bit, which is always also a tap. The taps are XOR'd sequentially and then fed back into the leftmost bit. The sequence of bits in the rightmost position is called the output stream.
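As an illustrative sketch (not from the article), a 16-bit Fibonacci LFSR with the taps [16,14,13,11] can be implemented in Python. This tap set corresponds to a primitive polynomial, so the register visits all 2^16 − 1 nonzero states before repeating:

```python
def lfsr16_period(seed=0xACE1):
    """16-bit Fibonacci LFSR with taps [16,14,13,11] (1-indexed from the left),
    i.e. bits 0, 2, 3 and 5 of the register value. Returns the cycle length."""
    lfsr = seed
    period = 0
    while True:
        bit = (lfsr ^ (lfsr >> 2) ^ (lfsr >> 3) ^ (lfsr >> 5)) & 1  # XOR the taps
        lfsr = (lfsr >> 1) | (bit << 15)                            # shift right, feed back
        period += 1
        if lfsr == seed:
            return period

print(lfsr16_period())  # 65535: every nonzero 16-bit state is visited
```

Any nonzero seed gives the same maximal period; a seed of zero would lock the register in the all-zeros state.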
A maximum-length LFSR produces an m-sequence (i.e., it cycles through all possible 2^m − 1 states within the shift register except the state wh
https://en.wikipedia.org/wiki/Lumicitabine | Lumicitabine (ALS-8176) is an antiviral drug which was developed as a treatment for respiratory syncytial virus (RSV) and human metapneumovirus (hMPV). It acts as an RNA polymerase inhibitor. While it showed promise in early clinical trials, poor results in Phase IIb trials led to it being discontinued from development for treatment of RSV. Research continues to determine whether it may be useful for the treatment of diseases caused by other RNA viruses, and it has been found to show activity against Nipah virus.
See also
Palivizumab
Presatovir
Ziresovir |
https://en.wikipedia.org/wiki/Bass%20trap | Bass traps are acoustic energy absorbers which are designed to damp low frequency sound energy with the goal of attaining a flatter low frequency (LF) room response by reducing LF resonances in rooms. They are commonly used in recording studios, mastering rooms, home theatres and other rooms built to provide a critical listening environment. Like all acoustically absorptive devices, they function by turning sound energy into heat through friction.
General description—types
There are generally two types of bass traps: resonant absorbers and porous absorbers. Resonant absorbers are further divided into panel absorbers and Helmholtz resonators.
Both types are effective, but whereas a resonant absorber needs to be mechanically tuned to resonate in sympathy with the frequencies being absorbed, a porous absorber does not resonate and need not be tuned.
Porous absorbers tend to be smaller in size and are easier to design and build, as well as less expensive overall than resonant absorbers. However, the deep bass attenuation of a porous absorber is generally inferior, so its usefulness for attenuating lower frequency room resonances is more limited.
Resonating absorbers tend to absorb a narrower spectrum and porous absorbers tend to absorb a broader spectrum. The spectrum of both types can be either narrowed or broadened by design but the generalized difference in bandwidth and tunability dominates their respective performance.
Examples of resonating-type bass traps include a rigid container with one or more portholes or slots (i.e. a Helmholtz resonator), or a rigid container with a flexible diaphragm (i.e. a membrane absorber). A resonating-type bass trap absorbs sound through the sympathetic vibration of some free element of the device with the air volume of the room.
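The tuning of a Helmholtz-resonator trap can be estimated from the standard resonance formula f = (c/2π)·sqrt(A/(V·L)). The sketch below ignores the neck end correction (real traps resonate somewhat lower), and the port and cavity dimensions are hypothetical:

```python
import math

def helmholtz_frequency(neck_area, cavity_volume, neck_length, c=343.0):
    """Resonant frequency of a Helmholtz resonator, f = (c / 2*pi) * sqrt(A / (V * L)).
    Simplified: the neck end correction is ignored."""
    return (c / (2 * math.pi)) * math.sqrt(neck_area / (cavity_volume * neck_length))

# Hypothetical trap: 10 cm^2 port, 50 L cavity, 10 cm deep neck
f = helmholtz_frequency(0.001, 0.05, 0.10)   # roughly 24 Hz, deep-bass territory
```

Enlarging the cavity or lengthening the neck lowers the tuned frequency, which is why Helmholtz traps for deep bass tend to be physically large.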
Resonating absorbers vary in construction, with one type of membrane absorber using a springy sheet of wood that attaches to the enclosure only along the edges/corners, and another using a more flopp |
https://en.wikipedia.org/wiki/Genome-wide%20association%20study | In genomics, a genome-wide association study (GWA study, or GWAS), is an observational study of a genome-wide set of genetic variants in different individuals to see if any variant is associated with a trait. GWA studies typically focus on associations between single-nucleotide polymorphisms (SNPs) and traits like major human diseases, but can equally be applied to any other genetic variants and any other organisms.
When applied to human data, GWA studies compare the DNA of participants having varying phenotypes for a particular trait or disease. These participants may be people with a disease (cases) and similar people without the disease (controls), or they may be people with different phenotypes for a particular trait, for example blood pressure. This approach is known as phenotype-first, in which the participants are classified first by their clinical manifestation(s), as opposed to genotype-first. Each person gives a sample of DNA, from which millions of genetic variants are read using SNP arrays. If there is significant statistical evidence that one type of the variant (one allele) is more frequent in people with the disease, the variant is said to be associated with the disease. The associated SNPs are then considered to mark a region of the human genome that may influence the risk of disease.
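The core association test can be sketched as a chi-squared statistic on a 2×2 table of allele counts in cases versus controls. The counts below are toy numbers, and a real GWAS repeats this per SNP and applies a genome-wide significance threshold near 5×10⁻⁸ rather than the nominal 5% cutoff:

```python
def allele_chi2(case_alt, case_ref, ctrl_alt, ctrl_ref):
    """Pearson chi-squared statistic (1 degree of freedom) for a 2x2
    table of allele counts: rows = cases/controls, columns = alt/ref."""
    a, b, c, d = case_alt, case_ref, ctrl_alt, ctrl_ref
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical counts: the alt allele appears on 60/100 case chromosomes
# but only 40/100 control chromosomes
stat = allele_chi2(60, 40, 40, 60)   # 8.0, above the nominal 5% cutoff of 3.84
```

A statistic above the chosen threshold flags the SNP as associated; it marks a genomic region of interest rather than proving the SNP itself is causal.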
GWA studies investigate the entire genome, in contrast to methods that specifically test a small number of pre-specified genetic regions. Hence, GWAS is a non-candidate-driven approach, in contrast to gene-specific candidate-driven studies. GWA studies identify SNPs and other variants in DNA associated with a disease, but they cannot on their own specify which genes are causal.
The first successful GWAS, published in 2002, studied myocardial infarction. This study design was then implemented in the landmark 2005 GWA study investigating patients with age-related macular degeneration, which found two SNPs with significantly altered allele frequency compared to healthy cont
https://en.wikipedia.org/wiki/Haverhill%20fever | Haverhill fever (or epidemic arthritic erythema) is a systemic illness caused by the bacterium Streptobacillus moniliformis, an organism common in rats and mice. If untreated, the illness can have a mortality rate of up to 13%. Among the two types of rat-bite fever, Haverhill fever caused by Streptobacillus moniliformis is most common in North America. The other type of infection caused by Spirillum minus is more common in Asia and is also known as Sodoku.
The initial non-specific presentation of the disease and hurdles in culturing the causative microorganism are at times responsible for a delay or failure in the diagnosis of the disease. Although non-specific in nature, initial symptoms like relapsing fever, rash and migratory polyarthralgia are the most common symptoms of epidemic arthritic erythema.
Bites and scratches from rodents carrying the bacteria are generally responsible for the affliction. However, the disease can be spread by rodents even without physical lacerations. In fact, the disease was first recognized from a milk-associated outbreak which occurred in Haverhill, Massachusetts, in January 1926. The organism S. moniliformis was isolated from the patients, and epidemiologically, consumption of milk from one particular dairy was implicated in the infection. Hence, ingestion of food and drink contaminated with the bacteria can also result in the development of the disease.
Symptoms and signs
The illness resembles a severe influenza, with a moderate fever (38–40 °C, or 101–104 °F), sore throat, chills, myalgia, headache, vomiting, and a diffuse red rash (maculopapular, petechial, or purpuric), located mostly on the hands and feet. The incubation period generally lasts from three to ten days. As the disease progresses, almost half the patients experience migratory polyarthralgias.
Mechanism
Although the specific form of pathogenesis is still a subject of ongoing research, the bacteria has been observed to result in mo |
https://en.wikipedia.org/wiki/Devices%20Profile%20for%20Web%20Services | The Devices Profile for Web Services (DPWS) defines a minimal set of implementation constraints to enable secure web service messaging, discovery, description, and eventing on resource-constrained devices.
Its objectives are similar to those of Universal Plug and Play (UPnP) but, in addition, DPWS is fully aligned with Web Services technology and includes numerous extension points allowing for seamless integration of device-provided services in enterprise-wide application scenarios.
DPWS standardization
The DPWS specification was initially published in May 2004 and was submitted for standardization to OASIS in July 2008. DPWS 1.1 was approved as OASIS Standard together with WS-Discovery 1.1 and SOAP-over-UDP 1.1 on June 30, 2009.
DPWS defines an architecture in which devices run two types of services: hosting services and hosted services. Hosting services are directly associated with a device, and play an important part in the device discovery process. Hosted services are mostly functional and depend on their hosting device for discovery.
In addition to these hosted services, DPWS specifies a set of built-in services:
Discovery services: used by a device connected to a network to advertise itself and to discover other devices. Support of discovery has led some to dub DPWS "the USB for Ethernet."
Metadata exchange services: provide dynamic access to a device's hosted services and to their metadata.
Publish/subscribe eventing services: allowing other devices to subscribe to asynchronous event messages produced by a given service.
DPWS builds on the following core Web Services standards: WSDL 1.1, XML Schema, SOAP 1.2, WS-Addressing, and further comprises WS-MetadataExchange, WS-Transfer, WS-Policy, WS-Security, WS-Discovery and WS-Eventing.
Microsoft's Windows Vista and Windows Embedded CE6R2 platforms natively integrate DPWS with a stack called WSDAPI, included as part of the Windows Rally technologies. Support for OSGi is on the way.
Use cases
Because DPWS |
https://en.wikipedia.org/wiki/Aladdin%20Knowledge%20Systems | Aladdin Knowledge Systems (formerly and ) was a company that produced software for digital rights management and Internet security. The company was acquired by Safenet Inc, in 2009. Its corporate headquarters are located in Belcamp, MD.
History
Aladdin Knowledge Systems was founded in 1985 by Jacob (Yanki) Margalit, when he was 23 years old; he was soon joined by his brother Danny Margalit, who took the responsibility for product development at the age of 18, while at the same time completing a Mathematics and Computer Science degree at Tel Aviv University. In its early years the company developed two product lines, an artificial intelligence package (which was dropped early on) and a hardware product to prevent unauthorized software copying, similar to digital rights management. Margalit raised just $10,000 as an initial capital for the company.
The digital rights management product became a success and by 1993 generated sales of $4,000,000. The same year that company had an initial public offering on NASDAQ raising $7,900,000. In 2004 the company's shares were also listed on the Tel Aviv Stock Exchange. By 2007 the company's annual revenues reached over $105 million.
In mid-2008, Vector Capital was attempting to purchase Aladdin. Vector initially offered $14.50 per share, but Aladdin's founder Margalit refused the offer, arguing that the company was worth more. Aladdin's shareholders agreed to the merger in February 2009 at $11.50 per share, in cash. In March 2009, Vector Capital acquired Aladdin and officially merged it with SafeNet.
Corporate timeline
1985 – Aladdin Knowledge Systems was established
1993 – Aladdin held an initial public offering
1996 – Aladdin acquired the German company FAST
1998 – Aladdin patented USB smart card-based authentication tokens
1998 (December) – Aladdin acquired the software protection business of EliaShim
1999 – Aladdin acquired the eSafe "content security" business of EliaShim
2000 – Aladdin acquired 10% of Comsec
2001 – |
https://en.wikipedia.org/wiki/Tog%20%28unit%29 | The tog is a measure of thermal insulance of a unit area, also known as thermal resistance. It is commonly used in the textile industry and often seen quoted on, for example, duvets and carpet underlay.
The Shirley Institute in Manchester, England developed the tog as an easy-to-follow alternative to the SI unit of m2⋅K/W. The name comes from the informal word togs for 'clothing', which itself was probably derived from the word toga, a Roman garment. The backronym thermal overall grade is also attested.
The basic unit of insulation coefficient is the RSI (1 m2⋅K/W). 1 tog = 0.1 RSI. There is also a US clothing unit, the clo, equivalent to 0.155 RSI or 1.55 tog, described in ASTM D-1518.
A tog is 0.1 m2⋅K/W. In other words, the thermal resistance in togs is equal to ten times the temperature difference (in °C) between the two surfaces of a material, when the flow of heat is equal to one watt per square metre.
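The conversions above (1 tog = 0.1 RSI, 1 clo = 0.155 RSI) can be sketched as a few helper functions; this is an illustrative example only, with function names chosen here for clarity:

```python
# Unit conversions from the text: 1 tog = 0.1 RSI (m²·K/W); 1 clo = 0.155 RSI.

def tog_to_rsi(tog: float) -> float:
    """Convert thermal resistance in togs to RSI (m²·K/W)."""
    return 0.1 * tog

def rsi_to_tog(rsi: float) -> float:
    """Convert RSI (m²·K/W) to togs."""
    return 10.0 * rsi

def clo_to_tog(clo: float) -> float:
    """Convert the US clo unit to togs via RSI."""
    return rsi_to_tog(0.155 * clo)

print(tog_to_rsi(13.5))  # a 13.5 tog winter duvet ≈ 1.35 m²·K/W
print(clo_to_tog(1.0))   # one clo ≈ 1.55 tog
```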
British duvets are sold in steps of 1.5 tog from 3.0 tog (summer) to 16.5 tog (extra-warm). The stated values are a minimum; actual values may be up to 3 tog higher. Also, these values assume there is no added duvet cover that can trap air.
A few manufacturers have marketed combined duvet sets consisting of two duvets; one of approximately 4.5 tog and one of approximately 9.0 tog. These can be used individually as summer (4.5 tog) and spring/autumn (9.0 tog). When joined together using press studs around the edges, or Velcro strips across each of the corners, they become a 13.5 tog winter duvet and as such can be made to suit all seasons.
Testing
Launched in the 1940s by the Shirley Institute, the Shirley Togmeter is the standard apparatus for rating thermal resistance of textiles, commonly known as the Tog Test. This apparatus, described in BS 4745:2005, measures a sample of textile, either between two metal plates (for underclothing) or between a metal plate and free air (for outer layers). Each industry has its own specifications and methods for meas |
https://en.wikipedia.org/wiki/Dorothy%20Hodgkin | Dorothy Mary Crowfoot Hodgkin (née Crowfoot; 12 May 1910 – 29 July 1994) was a Nobel Prize-winning British chemist who advanced the technique of X-ray crystallography to determine the structure of biomolecules, which became essential for structural biology.
Among her most influential discoveries are the confirmation of the structure of penicillin as previously surmised by Edward Abraham and Ernst Boris Chain; and mapping the structure of vitamin B12, for which in 1964 she became the third woman to win the Nobel Prize in Chemistry. Hodgkin also elucidated the structure of insulin in 1969 after 35 years of work.
Hodgkin used the name "Dorothy Crowfoot" until twelve years after marrying Thomas Lionel Hodgkin, when she began using "Dorothy Crowfoot Hodgkin". Hodgkin is referred to as "Dorothy Hodgkin" by the Royal Society (when referring to its sponsorship of the Dorothy Hodgkin fellowship), and by Somerville College. The National Archives of the United Kingdom refer to her as "Dorothy Mary Crowfoot Hodgkin".
Early life
Dorothy Mary Crowfoot was born in Cairo, Egypt, the oldest of four daughters. Her parents worked in North Africa and the Middle East, first in the colonial administration and later as archaeologists; Dorothy came from a distinguished family of archaeologists. Her parents were John Winter Crowfoot (1873–1959), who worked for the country's Ministry of Education, and his wife Grace Mary (née Hood) (1877–1957), known to friends and family as Molly. The family lived in Cairo during the winter months, returning to England each year to avoid the hotter part of the season in Egypt.
In 1914, Hodgkin's mother left her (age 4) and her two younger sisters Joan (age 2) and Elisabeth (age 7 months) with their Crowfoot grandparents near Worthing, and returned to her husband in Egypt. They spent much of their childhood apart from their parents, yet they were supportive from afar. Her mother would encourage Dorothy to pursue the interest in crystals first displayed at th |
https://en.wikipedia.org/wiki/Porencephaly | Porencephaly is an extremely rare cephalic disorder involving encephalomalacia. It is a neurological disorder of the central nervous system characterized by cysts or cavities within a cerebral hemisphere. The term porencephaly was coined by Heschl in 1859 to describe a cavity in the human brain. Derived from Greek roots, the word means 'holes in the brain'. The cysts and cavities (cystic brain lesions) are more likely to be the result of a destructive (encephaloclastic) cause, but can also arise from abnormal development (malformative), direct damage, inflammation, or hemorrhage. The cysts and cavities cause a wide range of physiological, physical, and neurological symptoms. Depending on the patient, the disorder may cause only minor neurological problems, without any disruption of intelligence, while other patients may be severely disabled or die before the second decade of their lives. The disorder is far more common in infants, and porencephaly can occur either before or after birth.
Signs and symptoms
Patients diagnosed with porencephaly display a variety of symptoms, from mild to severe effects on the patient. Patients with severe cases of porencephaly have epileptic seizures and developmental delays, whereas patients with a mild case of porencephaly display little to no seizures and typical neurodevelopment. Infants with extensive defects show symptoms of the disorder shortly after birth, and the diagnosis is usually made before the age of 1.
The following text lists out common signs and symptoms of porencephaly in affected individuals along with a short description of certain terminologies.
Cause
Porencephaly is a rare disorder. The exact prevalence of porencephaly is not known; however, it has been reported that 6.8% of patients with cerebral palsy or 68% of patients with epilepsy and congenital vascular hemiparesis have porencephaly. Porencephaly has a number of different, often unknown, causes including absence of brain development and destructio |
https://en.wikipedia.org/wiki/Xedio | Xedio is a professional HD/SD modular application suite for News and Sports Production developed by EVS Broadcast Equipment.
Intended for broadcast professionals, it handles the acquisition, production, media management and playout of news and sports media. The Xedio suite integrates a non-linear editing system; this editor has also been included in the IPDirector suite as a plug-in.
This solution is used by Channel One (Russia), RTL-TVI (Belgium), GOL TV (Spain), RTCG (Montenegro), BHRT (Bosnia and Herzegovina), and Sky News (UK), among others.
History
See also
EVS Broadcast Equipment |
https://en.wikipedia.org/wiki/Stem%20Cells%20%28journal%29 | Stem Cells is a peer-reviewed scientific journal of cell biology. It was established as The International Journal of Cell Cloning in 1983, acquiring its current title in 1993.
The journal is published by AlphaMed Press, and is currently edited by Jan Nolta (University of California). Stem Cells has an impact factor of 6.277.
Abstracting and indexing
The journal is abstracted and indexed in the following bibliographic databases: |
https://en.wikipedia.org/wiki/North%20magnetic%20pole | The north magnetic pole, also known as the magnetic north pole, is a point on the surface of Earth's Northern Hemisphere at which the planet's magnetic field points vertically downward (in other words, if a magnetic compass needle is allowed to rotate in three dimensions, it will point straight down). There is only one location where this occurs, near (but distinct from) the geographic north pole. The geomagnetic north pole is the northern antipodal pole of an ideal dipole model of the Earth's magnetic field, which is the most closely fitting model of Earth's actual magnetic field.
The north magnetic pole moves over time according to magnetic changes and flux lobe elongation in the Earth's outer core. In 2001, it was determined by the Geological Survey of Canada to lie west of Ellesmere Island in northern Canada at . It was situated at in 2005. In 2009, while still situated within the Canadian Arctic at , it was moving toward Russia at between per year. In 2013, the distance between the north magnetic pole and the geographic north pole was approximately . As of 2021, the pole is projected to have moved beyond the Canadian Arctic to .
Its southern hemisphere counterpart is the south magnetic pole. Since Earth's magnetic field is not exactly symmetric, the north and south magnetic poles are not antipodal, meaning that a straight line drawn from one to the other does not pass through the geometric center of Earth.
Earth's north and south magnetic poles are also known as magnetic dip poles, with reference to the vertical "dip" of the magnetic field lines at those points.
Polarity
All magnets have two poles, where lines of magnetic flux enter one pole and emerge from the other pole. By analogy with Earth's magnetic field, these are called the magnet's "north" and "south" poles. The north-seeking pole of a magnet was defined to have the north designation, according to their use in early compasses. Because opposite poles attract, this means that as a physical magne |
https://en.wikipedia.org/wiki/Savinja%20Statistical%20Region | The Savinja Statistical Region () is a statistical region in Slovenia. The largest town in the region is Celje. It is named after the Savinja River. The region is very diverse in natural geography; it mainly comprises the wooded mountainous terrain attractive to tourists (the Upper Savinja Valley and part of the Kamnik–Savinja Alps), the fertile Lower Savinja Valley with good conditions for growing hops, the Kozje Hills, and the Velenje Basin with lignite deposits used for electricity production. In 2013 the region invested more than EUR 127 million in environmental protection, the most of any region. In 2013, the region accounted for 14% of enterprises created and 8% of enterprises shut down. The region has good natural conditions for agriculture. In 2013 the region had more than 11,000 farms, 15% of all farms in Slovenia, ranking it right behind the Drava Statistical Region. In utilised agricultural area and livestock, the region also ranked second. The region is a well-known and popular tourist destination. In 2012, tourist arrivals and overnight stays in the region represented 11.1% of all tourist arrivals in Slovenia and 15.0% of all overnight stays. On average, tourists spent four nights there.
Cities and towns
The Savinja Statistical Region includes 9 cities and towns, the largest of which is Celje.
Administrative divisions
The Savinja Statistical Region comprises the following 31 municipalities:
Braslovče
Celje
Dobje
Dobrna
Gornji Grad
Kozje
Laško
Ljubno
Luče
Mozirje
Nazarje
Podčetrtek
Polzela
Prebold
Rečica ob Savinji
Rogaška Slatina
Rogatec
Šentjur
Slovenske Konjice
Šmarje pri Jelšah
Šmartno ob Paki
Solčava
Šoštanj
Štore
Tabor
Velenje
Vitanje
Vojnik
Vransko
Žalec
Zreče
The municipalities of Bistrica ob Sotli and Radeče were part of the region until January 2015, when they became part of the Lower Sava Statistical Region.
Demographics
The population in 2020 was 263,322. It has a total |
https://en.wikipedia.org/wiki/CRISPR%20activation | CRISPR activation (CRISPRa) is a type of CRISPR tool that uses modified versions of CRISPR effectors without endonuclease activity, with added transcriptional activators on dCas9 or the guide RNAs (gRNAs).
As with CRISPR interference, the CRISPR effector is guided to the target by a complementary guide RNA. However, CRISPR activation systems are fused to transcriptional activators to increase expression of genes of interest. Such systems can be used for many purposes, including but not limited to genetic screens and overexpression of proteins of interest.
The most commonly-used effector is based on Cas9 (from Type II systems), but other effectors like Cas12a (Type V) have been used as well.
Components
dCas9
Cas9 Endonuclease Dead, also known as dead Cas9 or dCas9, is a mutant form of Cas9 whose endonuclease activity is removed through point mutations in its endonuclease domains. Similar to its unmutated form, dCas9 is used in CRISPR systems along with gRNAs to target specific genes or nucleotides complementary to the gRNA, with PAM sequences that allow Cas9 to bind. Cas9 ordinarily has two endonuclease domains, called the RuvC and HNH domains. The point mutations D10A and H840A change two residues important for endonuclease activity, ultimately resulting in its deactivation. Although dCas9 lacks endonuclease activity, it is still capable of binding to its guide RNA and the DNA strand being targeted, because such binding is managed by other domains. This alone is often enough to attenuate, if not outright block, transcription of the targeted gene if the gRNA positions dCas9 in a way that prevents transcription factors and RNA polymerase from accessing the DNA. However, this ability to bind DNA can also be exploited for activation, since dCas9 has modifiable regions, typically the N and C termini of the protein, that can be used to attach transcriptional activators.
Guide RNA
See: Guide RNA, CRISPR
A small guide RNA (sgRNA), or gRNA is an RNA with around |
https://en.wikipedia.org/wiki/Windows%20Embedded%20Compact%207 | Windows Embedded Compact 7 (formerly known as Windows Embedded CE 7.0) is the seventh major release of the Windows Embedded CE operating system, released on March 1, 2011. Windows Embedded Compact 7 is a real-time OS, separate from the Windows NT line, and is designed to target enterprise specific tools such as industrial controllers and consumer electronics devices such as digital cameras, GPS systems and also automotive infotainment systems. Windows Embedded Compact is designed to run on multiple CPU architectures and supports x86, SH (automotive only) and ARM.
During development, a Microsoft employee working in this division claimed that Microsoft was working hard on this release and that it shares the underlying kernel with Windows Phone. Microsoft officially confirmed this and said that Windows Phone 7 is based on Windows Embedded CE 6.0 R3 with some features borrowed from Windows Embedded Compact 7, thus making it a hybrid solution.
As with Windows Embedded CE 6.0, the platform builder for Windows Embedded Compact 7 is not a stand-alone product, but is implemented as a plug-in for Microsoft Visual Studio; the required version is Visual Studio 2008 with Service Pack 1 installed.
New features
Windows Embedded Compact 7 contains these features:
Silverlight for Windows Embedded: Allows developers to develop application and user interfaces in Silverlight using Microsoft Expression Blend
Internet Explorer for Windows Embedded: A web browser similar to that of Windows Phone 7 with integrated Adobe Flash v10.1 support
Touch support: Windows Embedded Compact 7 recognizes touch and gesture input types
CPU support: Works on dual core CPUs in symmetric multiprocessing mode
Platform support: Runs on x86, SH4 (automotive only) MIPS and ARMv7 platforms
Media playback: Supports Digital Living Network Alliance (DLNA) and Media Transfer Protocol (MTP)
Networking: Now includes NDIS 6.1 and supports Bluetooth 2.1 EDR |
https://en.wikipedia.org/wiki/Implicit%20Shape%20Model | An Implicit Shape Model for a given object category consists of a class-specific alphabet (codebook) of local appearances that are prototypical for the object category, and of a spatial probability distribution which specifies where each codebook entry may be found on the object. |
https://en.wikipedia.org/wiki/Gross%20motor%20skill | Gross motor skills are the abilities usually acquired during childhood as part of a child's motor learning. By the time they reach two years of age, almost all children are able to stand up, walk and run, walk up stairs, etc. These skills are built upon, improved and better controlled throughout early childhood, and continue in refinement throughout most of the individual's years of development into adulthood. These gross movements come from large muscle groups and whole body movement. These skills develop in a head-to-toe order: children will typically learn head control, then trunk stability, and then standing up and walking. Children exposed to outdoor play activities have been shown to develop better gross motor skills.
Types of motor skills
Motor skills are movements and actions of the muscles. Typically, they are categorized into two groups: gross motor skills and fine motor skills. Gross motor skills are involved in movement and coordination of the arms, legs, and other large body parts and movements. Gross motor skills can be further divided into two subgroups of locomotor skills and object control skills. Gross locomotor skills would include running, jumping, sliding, and swimming. Object control skills would include throwing, catching and kicking. Fine motor skills are involved in smaller movements that occur in the wrists, hands, fingers, and the feet and toes. They participate in smaller actions such as picking up objects between the thumb and finger, writing carefully, and even blinking. These two motor skills work together to provide coordination. Less developed children focus on their gross movements, while more developed children have more control over their fine movements.
Development of posture
Gross motor skills, as well as many other activities, require postural control. Infants need to control their heads to stabilize their gaze and to track moving objects. They also must have strength and balance in their legs to walk.
Newborn infants c |
https://en.wikipedia.org/wiki/Trimethoprim | Trimethoprim (TMP) is an antibiotic used mainly in the treatment of bladder infections. Other uses include middle ear infections and travelers' diarrhea. With sulfamethoxazole or dapsone it may be used for Pneumocystis pneumonia in people with HIV/AIDS. It is taken by mouth.
Common side effects include nausea, changes in taste, and rash. Rarely it may result in blood problems such as not enough platelets or white blood cells. Trimethoprim may cause sun sensitivity. There is evidence of potential harm during pregnancy in some animals but not humans. It works by blocking folate metabolism via dihydrofolate reductase in some bacteria which results in their death.
Trimethoprim was first used in 1962. It is on the World Health Organization's List of Essential Medicines. It is available as a generic medication.
Medical uses
It is primarily used in the treatment of urinary tract infections, although it may be used against any susceptible aerobic bacterial species. It may also be used to treat and prevent Pneumocystis jirovecii pneumonia. It is generally not recommended for the treatment of anaerobic infections such as Clostridium difficile colitis (the leading cause of antibiotic-induced diarrhea). Trimethoprim has been used in trials to treat retinitis.
Resistance to trimethoprim is increasing, but it is still a first line antibiotic in many countries.
Spectrum of susceptibility
Cultures and susceptibility tests should be done to confirm that the infecting bacteria are susceptible to trimethoprim.
Escherichia coli
Proteus mirabilis
Klebsiella pneumoniae
Enterobacter species
Coagulase-negative Staphylococcus species, including S. saprophyticus
Streptococcus pneumoniae
Haemophilus influenzae
Side effects
Common
Nausea
Change in taste
Vomiting
Diarrhea
Rashes
Sun sensitivity
Itchiness
Rare
Can cause thrombocytopenia (low levels of platelets) by lowering folic acid levels; this may also cause megaloblastic anemia.
Trimethoprim antagonizes the epit |
https://en.wikipedia.org/wiki/High%20Performance%20Computing%20Center%2C%20Stuttgart | The High Performance Computing Center (HLRS) in Stuttgart, Germany, is a research institute and a supercomputer center. HLRS currently has as its flagship installation an HPE Apollo 9000 system called Hawk, with 26 PFLOPS peak performance, replacing the Cray XC40 system called Hazel Hen, which provided ~7.4 PFLOPS peak performance. Additional systems include NEC clusters (NEC SX-ACE systems for testing, NEC Vulcan + Vulcan2 for non-critical computing) and a Cray CS-Storm cluster.
Known historical configurations:
1996 - Cray T3E / 512 + NEC SX-4
2000 - Hitachi SR-8000 + NEC SX-5 / 32M2
2003 - ? (Opteron/Xeon cluster) + NEC SX-6
2005 - NEC SX-8
2008 - IBM BW-Grid + NEC SX-9
2009 - Cray XT5M
2010 - Cray XE6 "Hermit"
2014 - Cray XC40 "Hornet"
2019 - Cray CS-Storm
2020 - HPE Apollo 9000 "Hawk" + NEC (Vulcan + Vulcan2 + NEC SX-ACE).
See also
TOP500
Supercomputing in Europe |
https://en.wikipedia.org/wiki/Claudio%20Luchinat | Claudio Luchinat (born February 15, 1952 in Florence) is an Italian chemist. He is the author of about 550 publications in bioinorganic chemistry, NMR and structural biology, and of four books. According to Google Scholar, his h-index is 90 and his papers have been cited more than 33,000 times ().
He earned a PhD in Chemistry from the University of Florence. He has been full professor of Chemistry at the University of Bologna (1986–96).
He is currently a researcher at the University of Florence and full professor of Chemistry at the same university (1996–, CERM and Department of Chemistry). He is member of the Italian Chemical Society, New York Academy of Sciences, American Association for the Advancement of Science. |
https://en.wikipedia.org/wiki/Certificate%20revocation%20list | In cryptography, a certificate revocation list (or CRL) is "a list of digital certificates that have been revoked by the issuing certificate authority (CA) before their scheduled expiration date and should no longer be trusted". CRLs are no longer required by the CA/Browser Forum, as alternate certificate revocation technologies (such as OCSP) are increasingly used instead. Nevertheless, CRLs are still widely used by CAs.
Revocation states
There are two different states of revocation defined in RFC 5280:
Revoked A certificate is irreversibly revoked if, for example, it is discovered that the certificate authority (CA) had improperly issued a certificate, or if a private-key is thought to have been compromised. Certificates may also be revoked for failure of the identified entity to adhere to policy requirements, such as publication of false documents, misrepresentation of software behaviour, or violation of any other policy specified by the CA operator or its customer. The most common reason for revocation is the user no longer being in sole possession of the private key (e.g., the token containing the private key has been lost or stolen).
Hold This reversible status can be used to note the temporary invalidity of the certificate (e.g., if the user is unsure if the private key has been lost). If, in this example, the private key was found and nobody had access to it, the status could be reinstated, and the certificate is valid again, thus removing the certificate from future CRLs.
Reasons for revocation
Reasons to revoke, hold, or unlist a certificate according to RFC 5280 are:
unspecified (0)
keyCompromise (1)
cACompromise (2)
affiliationChanged (3)
superseded (4)
cessationOfOperation (5)
certificateHold (6)
removeFromCRL (8)
privilegeWithdrawn (9)
aACompromise (10)
Note that value 7 is not used.
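The reason codes above are numeric values in the RFC 5280 ASN.1 definition; a minimal sketch of how they might be modeled (illustrative only, not tied to any particular X.509 library):

```python
from enum import IntEnum

class CRLReason(IntEnum):
    """CRL entry reason codes as enumerated in RFC 5280."""
    UNSPECIFIED = 0
    KEY_COMPROMISE = 1
    CA_COMPROMISE = 2
    AFFILIATION_CHANGED = 3
    SUPERSEDED = 4
    CESSATION_OF_OPERATION = 5
    CERTIFICATE_HOLD = 6
    # value 7 is deliberately unused in RFC 5280
    REMOVE_FROM_CRL = 8
    PRIVILEGE_WITHDRAWN = 9
    AA_COMPROMISE = 10

# certificateHold (6) is the only reversible state; removeFromCRL (8) undoes it.
print(CRLReason.CERTIFICATE_HOLD.value)           # 6
print(7 in {r.value for r in CRLReason})          # False
```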
Publishing revocation lists
A CRL is generated and published periodically, often at a defined interval. A CRL can also be published immediately afte |
https://en.wikipedia.org/wiki/Portland%20Packing%20Company%20Factory | The Portland Packing Company Factory is an historic factory building at 14-26 York Street in Portland, Maine. Built in 1884, it was home to Maine's oldest and largest vegetable canning company until 1927. After years of neglect, it was rehabilitated in 1995–96, and it was listed on the National Register of Historic Places in 1996.
Description and history
The former Portland Packing Company Factory building is located in Portland's downtown area, on the east side of York Street opposite the end of Danforth Street. The building is a two-story brick structure, shaped roughly like a parallelogram. Due to the sloping lot, the rear (east side) of the building has a fully exposed third floor. It has a flat roof and rests on a rubblestone foundation. The main facade is divided into six similar sections, separated by wide pilasters. Each section has three bays, most of which are windows set in segmented-arch openings, and is topped by a line of corbelling, and has a band of projecting brickwork between the first and second levels. Three of the sections have a loading door instead of a window at the central bay, and there is one section where the entire ground floor is taken by a half-round opening with a large entrance sized (but no longer used) for vehicular access.
The Portland Packing Company was founded about 1862, by the merger of two smaller packing companies. By the turn of the 20th century, the company was Maine's largest vegetable canning business, an industry that exceeded the combined size of the state's slate, granite, and ice harvesting business in economic importance. This building was part of one of thirteen factories operated by the company in the state, and was built in 1884. Its design is attributed to Portland architect Francis H. Fassett on stylistic grounds. It functioned primarily as a storage facility, and was in 1927 sold to Blake Rounds Supply Company. After some years of va |
https://en.wikipedia.org/wiki/Venix | Venix is a discontinued version of the Unix operating system for low-end computers, developed by VenturCom, a "company that specialises in the skinniest implementations of Unix".
Overview
A working version of Venix/86 for the IBM PC XT was demoed at COMDEX in May 1983. It was based on Version 7 Unix with some enhancements from BSD (notably vi, more and csh) and custom inter-process communication mechanisms. It was the first licensed UNIX operating system available for the IBM PC and its compatibles, supported read/write access to a separate DOS/FAT-partition and could run in as little as 128 KB (256 KB - 512 KB recommended).
In September 1984, Venix/86 Encore was released; it supported a number of early PC-compatibles, including the AT&T 6300, the Zenith 150, the (first) NCR PC, and the Texas Instruments Professional Computer.
Venix Encore, which then became Venix 2.0, was still based on Version 7 Unix, and ran on the DEC Rainbow 100 (Venix/86R) as well as PCs (Venix/86 and /286). The system contained a number of enhancements, notably tools to access DOS files directly on a DOS/FAT-partition, and an updated ADB debugger. The system came in two flavors: a 2-user version priced at $800, and an 8-user version at $1,000. There were no technical differences between the two.
Confusingly, Venix 2.0 for the DEC PRO-380 microcomputer (Venix/PRO) was based "essentially" on System III. It no longer ran on the PRO-350. This is made clear in the ckermit 4E build instructions, which has a special target for Pro running Venix 1.0, but instructs the user to use the sysiii target for the Pro running Venix 2.0. These same sources also make it clear that Venix had an enhanced TTY interface relative to a pure V7 Unix System.
Venix 2.1 was released for at least the PC. Like the original Venix/86, it included a C compiler, a BASIC interpreter and added a Fortran 77 compiler as an option. An optional driver kit made it possible to develop hardware drivers for the system and generate |
https://en.wikipedia.org/wiki/Inversion%20recovery | Inversion recovery is an MRI sequence that provides high contrast between tissue and lesion. It can be used to provide high T1 weighted image, high T2 weighted image, and to suppress the signals from fat, blood, or cerebrospinal fluid (CSF).
Fluid-attenuated inversion recovery
Fluid-attenuated inversion recovery (FLAIR) is an inversion-recovery pulse sequence used to nullify the signal from fluids. For example, it can be used in brain imaging to suppress cerebrospinal fluid so as to bring out periventricular hyperintense lesions, such as multiple sclerosis plaques. By carefully choosing the inversion time TI (the time between the inversion and excitation pulses), the signal from any particular tissue can be suppressed.
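Under a simple mono-exponential model with an ideal 180° inversion and full recovery between repetitions, longitudinal magnetisation follows Mz(TI) = M0(1 − 2e^(−TI/T1)), which crosses zero at TI = T1·ln 2 ≈ 0.69·T1. A hedged sketch (the T1 values below are rough, field-strength-dependent illustrations, not reference data):

```python
import math

def null_ti(t1_ms: float) -> float:
    """Inversion time (ms) that nulls a tissue with relaxation time T1,
    assuming ideal inversion and full recovery between repetitions:
    Mz(TI) = M0 * (1 - 2*exp(-TI/T1)) = 0  =>  TI = T1 * ln(2)."""
    return t1_ms * math.log(2)

# Illustrative, approximate T1 values:
print(round(null_ti(4000)))  # CSF (long T1)  -> ~2773 ms, FLAIR-like TI
print(round(null_ti(260)))   # fat (short T1) -> ~180 ms, STIR-like TI
```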
Turbo inversion recovery magnitude
Turbo inversion recovery magnitude (TIRM) measures only the magnitude of a turbo spin echo after a preceding inversion pulse, thus is phase insensitive.
TIRM is superior in the assessment of osteomyelitis and in suspected head and neck cancer. Osteomyelitis appears as high intensity areas. In head and neck cancers, TIRM has been found to both give high signal in tumor mass, as well as low degree of overestimation of tumor size by reactive inflammatory changes in the surrounding tissues.
Double Inversion Recovery
It is a sequence that suppresses both cerebrospinal fluid (CSF) and white matter signals, and samples the remaining transverse magnetisation with a fast spin echo, so that the majority of the signal comes from the grey matter. This sequence is therefore useful in detecting small changes in the brain cortex, such as focal cortical dysplasia and hippocampal sclerosis, in those with epilepsy. These lesions are difficult to detect in other MRI sequences.
History
Erwin Hahn first used the inversion recovery technique to determine the value of T1 (the time taken for longitudinal magnetisation to recover 63% of its maximum value) for water in 1949, three years after nuclear magnetic resonance was discovered.
https://en.wikipedia.org/wiki/AgBiotechNet | AgBiotechNet is a searchable online database of scientific literature on topics related to agricultural biotechnology. Its target audience consists of biotechnology researchers and policymakers. Though some features on the site are available for free, others can only be accessed by paid subscribers. First launched in January 1999, AgBiotechNet is run by the Centre for Agriculture and Bioscience International (also known as CABI), which founded it along with Michigan State University's Agricultural Biotechnology Support Project. |
https://en.wikipedia.org/wiki/Erd%C5%91s%E2%80%93Dushnik%E2%80%93Miller%20theorem | In the mathematical theory of infinite graphs, the Erdős–Dushnik–Miller theorem is a form of Ramsey's theorem stating that every infinite graph contains either a countably infinite independent set, or a clique with the same cardinality as the whole graph.
The theorem was first published by , in both the form stated above and an equivalent complementary form: every infinite graph contains either a countably infinite clique or an independent set with equal cardinality to the whole graph. In their paper, they credited Paul Erdős with assistance in its proof. They applied these results to the comparability graphs of partially ordered sets to show that each partial order contains either a countably infinite antichain or a chain of cardinality equal to the whole order, and that each partial order contains either a countably infinite chain or an antichain of cardinality equal to the whole order.
The same theorem can also be stated as a result in set theory, using the arrow notation of , as . This means that, for every set of cardinality , and every partition of the ordered pairs of elements of into two subsets and , there exists either a subset of cardinality or a subset of cardinality , such that all pairs of elements of belong to . Here, can be interpreted as the edges of a graph having as its vertex set, in which (if it exists) is a clique of cardinality , and (if it exists) is a countably infinite independent set.
If is taken to be the cardinal number itself, the theorem can be formulated in terms of ordinal numbers with the notation , meaning that (when it exists) has order type . For uncountable regular cardinals (and some other cardinals) this can be strengthened to ; however, it is consistent that this strengthening does not hold for the cardinality of the continuum.
The Erdős–Dushnik–Miller theorem has been called the first "unbalanced" generalization of Ramsey's theorem, and Paul Erdős's first significant result in set theory. |
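The partition-relation form of the theorem and its strengthening for uncountable regular cardinals can be written in LaTeX as follows (a standard rendering of the relations described above, not verbatim from this article):

```latex
% Erdős–Dushnik–Miller theorem: any 2-colouring of the pairs from a
% set of cardinality \kappa yields a monochromatic set of cardinality
% \kappa in the first colour or a countably infinite one in the second.
\[
  \kappa \longrightarrow (\kappa, \aleph_0)^2
\]
% Strengthened form for uncountable regular cardinals \kappa:
\[
  \kappa \longrightarrow (\kappa, \omega + 1)^2
\]
```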
https://en.wikipedia.org/wiki/Cognitive%20polyphasia | Cognitive polyphasia is a state in which different kinds of knowledge, possessing different rationalities, live side by side in the same individual or collective. The term derives from the Greek polloi ("many") and phasis ("appearance").
In his research on popular representations of psychoanalysis in France, Serge Moscovici observed that different and even contradictory modes of thinking about the same issue often co-exist. In contemporary societies people are "speaking" medical, psychological, technical, and political languages in their daily affairs. By extending this phenomenon to the level of thought he suggests that "the dynamic co-existence—interference or specialization—of the distinct modalities of knowledge, corresponding to definite relations between man and his environment, determines a state of cognitive polyphasia".
Extension and applications
Cognitive systems do not habitually develop towards a state of consistency. Instead, judgements are based on representational terms being dominant in one field of interests, while playing a minor role in other fields; that is, thoughts tend to be locally but not globally consistent. Contemporaries in Western and non-Western societies alike face a variety of situations where particular modes of reasoning fit better than others. Some are more useful in the family and in matters involving relatives, and others are more apt in situations involving political, economic, societal, religious or scientific matters. Knowledge and talking are always situated.
Scientific explanations frequently contradict everyday and common-sense based explanations. Nevertheless, people tend to apply each of the two ways of explanation in their talk depending on the audience and the particular situation. This can be observed with health related issues where Sandra Jovchelovitch and Marie-Claude Gervais have shown how members of the Chinese community attend to Western medical doctors and simultaneously apply traditional Chinese treatments. In their study on modernization proce |
https://en.wikipedia.org/wiki/Hut%204 | Hut 4 was a wartime section of the Government Code and Cypher School (GC&CS) at Bletchley Park tasked with the translation, interpretation and distribution of Kriegsmarine (German navy) messages deciphered by Hut 8. The messages were largely encrypted by Enigma machines. As the Kriegsmarine operated Enigma more securely, Hut 8 had less information for Ultra than Hut 6 which handled Army and Air Force messages. Hut 4 also broke various hand cyphers and some Italian naval traffic.
Location
Located initially in one of the original single-story wooden huts, the name "Hut 4" was retained when the section moved to a new brick building, Block A, in 1941.
Operation
Naval Ultra was handled differently from Army and Air Force Ultra because the Admiralty was an operational HQ and could give orders during a battle, while the Imperial General Staff (Army) and Air Staff would give theatre commanders general directives, say to "clear the enemy out of Africa", with discretion over how to carry them out. Verbatim translations of naval decodes were sent to the Naval Intelligence Division (NID) of the Admiralty's Operational Intelligence Centre (OIC) in London and nowhere else (except for some naval intelligence sent directly from Bletchley to Commanders-in-Chief in the Mediterranean).
Hut 4 also decoded a manual system known as the "dockyard cipher". This sometimes carried messages that were also sent on an Enigma network. Feeding these back to Hut 8 provided excellent cribs for breaking the current naval Enigma key.
People
M. M. G. Jennings (Margaret Allan), racing driver
J. W. B. Barns, later Professor of Egyptology at Oxford
Sarah Baring, later Viscountess Astor
Osla Benning, Prince Philip's first girlfriend
Jocelyn Bostock from Lady Margaret Hall (LMH) Oxford, assistant to Hinsley
Alec Naylor Dakin
Pamela Rose
Leonard R. Palmer, later Professor of Comparative Philology at Oxford.
Bernard Willson
Note: Frank Birch and Harry Hinsley were both associated with the |
https://en.wikipedia.org/wiki/Langenberg%20transmission%20tower | The Langenberg transmission tower (also translated as "Sender Langenberg" or "Transmission Facility Langenberg") is a broadcasting station for analog FM radio and digital TV (DVB-T2 HD) signals. It is located in Langenberg, Velbert, Germany, and is owned and operated by Westdeutscher Rundfunk (WDR).
The transmitting site has undergone many changes over its history. The transmitter first went into service in 1927 with 60 kilowatts (kW) of power and a T-aerial hung between two 100-metre freestanding steel-frame towers insulated against ground.
History
In 1926 the Westdeutsche Funkstunde and an association of local residents agreed to build the transmitter tower in Langenberg. On 15 January 1927 the transmitter was inaugurated.
In the early 1930s, communist underground groups tried to manipulate the line from the studio to the transmitter in order to broadcast their own propaganda. Their attempts failed, but they did manage to attach a red star to the top of one of the towers, which was removed on the same day.
In 1934 the T-aerial was replaced by an aerial hanging from a 160-metre wood framework tower and the transmission power was increased to 100 kW. However, this tower was destroyed on October 10, 1935 by a tornado. After this a triangular aerial hung on three 45-metre freestanding towers was built; this went into service in December 1935. In 1940/41 a second aerial was installed on a 240-metre insulated guyed steel tube mast. The entire aerial system was destroyed by SS-Postschutz troops on April 12, 1945.
Post-1945
After World War II, British forces built two triangular aerials mounted on six masts, each 50 metres high. One of these aerials was removed in 1948 and an insulated radio mast built on its site. The other aerial was destroyed in a storm in 1949 which broke two of the three masts. The third mast was transformed into an AM transmitter and was in service until 1957. In 1949 a second radio mast with a height of 120 metres was built, and in 1952 a third guyed mast followe |
https://en.wikipedia.org/wiki/Futhark%20%28programming%20language%29 | Futhark is a functional data parallel array programming language originally developed at UCPH Department of Computer Science (DIKU) as part of the HIPERFIT project. It focuses on enabling data parallel programs written in a functional style to be executed with high performance on massively parallel hardware, in particular on graphics processing units (GPUs). Futhark is strongly inspired by NESL, and its implementation uses a variant of the flattening transformation, but imposes constraints on how parallelism can be expressed in order to enable more aggressive compiler optimisations. In particular, irregular nested data parallelism is not supported.
Overview
Futhark is a language in the ML family, with an indentation-insensitive syntax derived from OCaml, Standard ML, and Haskell. The type system is based on Hindley-Milner with a variety of extensions, such as uniqueness types and size-dependent types. Futhark is not intended as a general-purpose programming language for writing full applications, but is instead focused on writing computational "kernels" (not necessarily the same as a GPU kernel) which are then invoked from applications written in conventional languages.
Examples
Dot product
The following program computes the dot product of two vectors containing double-precision numbers.
def dotprod xs ys = f64.sum (map2 (*) xs ys)
It can also be equivalently written with explicit type annotations as follows.
def dotprod [n] (xs: [n]f64) (ys: [n]f64) : f64 = f64.sum (map2 (*) xs ys)
This makes the size-dependent types explicit: this function can only be invoked with two arrays of the same size, and the type checker will reject any program where this cannot be statically determined.
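As an illustrative sketch (in Python, not Futhark), the size constraint corresponds to a check that Futhark's type checker performs statically rather than at run time:

```python
def dotprod(xs, ys):
    # Futhark's type [n]f64 -> [n]f64 -> f64 rules out mismatched sizes
    # at compile time; this Python stand-in can only check at run time.
    if len(xs) != len(ys):
        raise TypeError("dotprod requires arrays of the same size")
    return sum(x * y for x, y in zip(xs, ys))

print(dotprod([1.0, 2.0, 3.0], [4.0, 5.0, 6.0]))  # 32.0
```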
Matrix multiplication
The following program performs matrix multiplication, using the definition of dot product above.
def matmul [n][m][p] (A: [n][m]f64) (B: [m][p]f64) : [n][p]f64 =
  map (\A_row ->
         map (\B_col -> dotprod A_row B_col)
             (transpose B))
      A |
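The same structure can be sketched in plain Python (an illustrative analogue, not Futhark code): map a dot product over each row of A and each column of B, obtaining the columns of B as the rows of the transpose.

```python
def dotprod(xs, ys):
    return sum(x * y for x, y in zip(xs, ys))

def matmul(A, B):
    # Mirror of the Futhark program: for each row of A, map dotprod
    # over the rows of transpose(B), i.e. the columns of B.
    Bt = [list(col) for col in zip(*B)]  # transpose B
    return [[dotprod(row, col) for col in Bt] for row in A]

print(matmul([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # [[19, 22], [43, 50]]
```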
https://en.wikipedia.org/wiki/Library%20of%20the%20Printed%20Web | Library of the Printed Web is a physical archive devoted to web-to-print artists’ books, zines and other printout matter. Founded by Paul Soulellis in 2013, the collection was acquired by The Museum of Modern Art Library in January 2017. The project has been described as "web culture articulated as printed artifact," an "archive of archives," characterized as an "accumulation of accumulations," much of it printed on demand. Techniques for appropriating web content used by artists in the collection include grabbing, hunting, scraping and performing, detailed by Soulellis in "Search, Compile, Publish," and later referenced by Alessandro Ludovico.
Among the 130 artists included in Library of the Printed Web are Olia Lialina, Mishka Henner, Clement Valla, Karolis Kosas, Lauren Thorson, Cory Arcangel, Silvio Lorusso, Angela Genusa, Jean Keller, Aaron Krach, Joachim Schmid, Benjamin Shaykin, Chantal Zakari, Richard Prince, David Horvitz and Penelope Umbrico. Over 240 works are in the collection. Library of the Printed Web continues to grow through curatorial acquisition and artist contributions.
The collection is used primarily for experimental publishing research, as a way to question issues of copyright, privacy and appropriation by artists on the internet, and as the basis for academic workshops in design and new media.
The project is frequently featured at book fairs, independent publishing conferences and schools, appearing at Miss Read Berlin Art Book Fair, Stadtbibliothek Stuttgart, Merz Akademie, Printed Matter's NY Art Book Fair, Offprint London, Theorizing the Web, Interrupt 3 at Brown University, The Internet Yami-Ichi, Printed Matter's LA Art Book Fair, Odds and Ends Art Book Fair at Yale Art Gallery, Rhode Island School of Design, School of Visual Arts, International Center of Photography, School of the Museum of Fine Arts Boston and Offprint Paris. In 2013 Library of the Printed Web was featured at Theorizing the Web and The Book Affair at the opening of t |
https://en.wikipedia.org/wiki/History%20of%20anthropometry | The history of anthropometry includes its use as an early tool of anthropology, use for identification, use for the purposes of understanding human physical variation in paleoanthropology and in various attempts to correlate physical with racial and psychological traits. At various points in history, certain anthropometrics have been cited by advocates of discrimination and eugenics often as part of novel social movements or based upon pseudoscience.
Craniometry and paleoanthropology
In 1764 Louis-Jean-Marie Daubenton, who wrote many essays on comparative anatomy for the Académie française, published his Memoir on the Different Positions of the Occipital Foramen in Man and Animals (Mémoire sur les différences de la situation du grand trou occipital dans l'homme et dans les animaux). Six years later Pieter Camper (1722–1789), distinguished both as an artist and as an anatomist, published some lectures that laid the foundation of much work. Camper invented the "facial angle," a measure meant to determine intelligence among various species. According to this technique, a "facial angle" was formed by drawing two lines: one horizontally from the nostril to the ear; and the other perpendicularly from the advancing part of the upper jawbone to the most prominent part of the forehead. Camper's measurements of facial angle were first made to compare the skulls of men with those of other animals. Camper claimed that antique statues presented an angle of 90°, Europeans of 80°, Central Africans of 70° and the orangutan of 58°.
Swedish professor of anatomy Anders Retzius (1796–1860) first used the cephalic index in physical anthropology to classify ancient human remains found in Europe. He classed skulls in three main categories; "dolichocephalic" (from the Ancient Greek kephalê "head", and dolikhos "long and thin"), "brachycephalic" (short and broad) and "mesocephalic" (intermediate length and width). Scientific research was continued by Étienne Geoffroy Saint-Hilaire (1772– |
https://en.wikipedia.org/wiki/Hugo%20Krawczyk | Hugo Krawczyk is an Argentine-Israeli cryptographer best known for co-inventing the HMAC message authentication algorithm and contributing in fundamental ways to the cryptographic architecture of central Internet standards, including IPsec, IKE, and SSL/TLS. In particular, both IKEv2 and TLS 1.3 use Krawczyk’s SIGMA protocol as the cryptographic core of their key exchange procedures. He has also contributed foundational work in the areas of threshold and proactive cryptosystems and searchable symmetric encryption, among others.
Education
Krawczyk acquired a Bachelor of Arts in Mathematics from the University of Haifa. Later he received his Master of Science and Ph.D. in computer science from Technion - Israel Institute of Technology with Oded Goldreich as doctoral thesis advisor.
Career
Hugo Krawczyk is a Senior Principal Scientist at Amazon Web Services (AWS). Between 2019 and 2023 he was a Principal Researcher at the Algorand Foundation and part of its founding team. Prior to that, he was an IBM Fellow and Distinguished Research Staff Member at the IBM T.J. Watson Research Center in New York as a member of the Cryptography Research group from 1992 to 1997, and again from 2004 to 2019. He was an Associate Professor at the Department of Electrical Engineering at the Technion in Israel from 1997 until 2004.
Krawczyk has published over 100 papers with more than 30,000 citations, and is an inventor in 30 issued patents.
His research includes both theoretical and applied elements of cryptography, with a focus on internet security, privacy, and authentication. His most recent projects in the area include: TLS 1.3, the new-generation SSL/TLS; HKDF, the standard for key derivation embraced by TLS 1.3, Signal, WhatsApp, Facebook Messenger, and others; and OPAQUE, a password authentication protocol being standardized by the IRTF and recently deployed by Facebook in its implementation of end-to-end encrypted chat backups for WhatsApp.
Krawczyk is the author of many |
https://en.wikipedia.org/wiki/Theatrograph | R. W. Paul presented Britain's second film projector, and the first commercially produced 35mm projector, the Theatrograph, on 20 February 1896. It was first demonstrated at Finsbury Technical College. The use of Paul's Theatrograph in music halls up and down the country popularised early cinema in Britain. It was first revealed to the public at the Egyptian Hall in Piccadilly, London.
Carl Hertz sailed from England on 28 March 1896 aboard the Royal Mail Ship and during the voyage exhibited Paul's Theatrograph to the passengers. He also exhibited films in South Africa and Australia. The screenings in South Africa were the first public screenings of moving pictures in that country. After Australia, Hertz took the Theatrograph to Ceylon (Sri Lanka), India, China, Japan, the Fiji Islands, and Hawaii.
Georges Méliès purchased one of Paul's Theatrographs in 1896.
The Theatrograph was also known as the Animatograph.
Notes |
https://en.wikipedia.org/wiki/Set-theoretic%20limit | In mathematics, the limit of a sequence of sets (subsets of a common set ) is a set whose elements are determined by the sequence in either of two equivalent ways: (1) by upper and lower bounds on the sequence that converge monotonically to the same set (analogous to convergence of real-valued sequences) and (2) by convergence of a sequence of indicator functions which are themselves real-valued. As is the case with sequences of other objects, convergence is not necessary or even usual.
More generally, again analogous to real-valued sequences, the less restrictive limit infimum and limit supremum of a set sequence always exist and can be used to determine convergence: the limit exists if the limit infimum and limit supremum are identical. (See below). Such set limits are essential in measure theory and probability.
It is a common misconception that the limits infimum and supremum described here involve sets of accumulation points, that is, sets of x where each x is in some A_{n_k}. This is only true if convergence is determined by the discrete metric (that is, x_n → x if there is N such that x_n = x for all n ≥ N). This article is restricted to that situation as it is the only one relevant for measure theory and probability. See the examples below. (On the other hand, there are more general topological notions of set convergence that do involve accumulation points under different metrics or topologies.)
Definitions
The two definitions
Suppose that (A_n)_{n≥1} is a sequence of sets. The two equivalent definitions are as follows.
Using union and intersection: define liminf_{n→∞} A_n = ⋃_{n≥1} ⋂_{j≥n} A_j and limsup_{n→∞} A_n = ⋂_{n≥1} ⋃_{j≥n} A_j. If these two sets are equal, then the set-theoretic limit of the sequence exists and is equal to that common set. Either set as described above can be used to get the limit, and there may be other means to get the limit as well.
Using indicator functions: let 𝟙_{A_n}(x) equal 1 if x ∈ A_n and 0 otherwise. Define liminf_{n→∞} A_n = {x : liminf_{n→∞} 𝟙_{A_n}(x) = 1} and limsup_{n→∞} A_n = {x : limsup_{n→∞} 𝟙_{A_n}(x) = 1}, where the expressions inside the brackets on the right are, respectively, the limit infimum and limit supremum of the real-valued seq |
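The union/intersection definitions can be computed directly for a finite prefix of a sequence (an illustrative Python sketch; for an eventually constant sequence the finite truncation agrees with the true limit):

```python
def liminf_sets(sets):
    # Union over n of the intersection of the tail A_n, A_{n+1}, ...
    return set().union(*(set.intersection(*map(set, sets[n:]))
                         for n in range(len(sets))))

def limsup_sets(sets):
    # Intersection over n of the union of the tail A_n, A_{n+1}, ...
    return set.intersection(*(set().union(*sets[n:])
                              for n in range(len(sets))))

# A sequence that settles down to {1, 2}: liminf and limsup coincide,
# so the set-theoretic limit exists and equals {1, 2}.
A = [{1}, {1, 2, 3}, {1, 2}, {1, 2}, {1, 2}]
print(liminf_sets(A), limsup_sets(A))  # {1, 2} {1, 2}
```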
https://en.wikipedia.org/wiki/Neutron%20spectroscopy | Neutron spectroscopy is a spectroscopic method of measuring atomic and magnetic motions by measuring the kinetic energy of emitted neutrons. The measured neutrons may be emitted directly (for example, by nuclear reactions), or they may scatter off cold matter before reaching the detector. Inelastic neutron scattering observes the change in the energy of the neutron as it scatters from a sample and can be used to probe a wide variety of different physical phenomena such as the motions of atoms (diffusional or hopping), the rotational modes of molecules, sound modes and molecular vibrations, recoil in quantum fluids, magnetic and quantum excitations or even electronic transitions.
Since its discovery, neutron spectroscopy has become useful in medicine as it has been applied to radiation protection and radiation therapy.
It is also used in nuclear fusion experiments, where the neutron spectrum can be used to infer the plasma temperature, density, and composition, in addition to the total fusion power.
Although neutron spectroscopy is currently capable of operating on many orders of neutron energy, much research focuses on expanding these capabilities to higher energies. In 2001, US researchers were able to measure neutrons with energies up to 100 gigaelectronvolts.
See also
Neutron diffraction
Raman scattering
Nested Neutron Spectrometer |
https://en.wikipedia.org/wiki/THOG%20problem | The THOG problem is one of cognitive psychologist Peter Wason's logic puzzles, constructed to show some of the weaknesses in human thinking.
You are shown four symbols
a black square
a white square
a black circle
a white circle
and told by the experimenter "I have picked one colour (black or white) and one shape (square or circle). A symbol that possesses exactly one, but not both, of the properties I have picked, is called a THOG. The black square is a THOG. For each of the other symbols, are they a) definitely a THOG, b) undecidable, or c) definitely not a THOG?"
Presented in this form, the task is quite difficult, because much information must be held in working memory at the same time.
Solution 1 (Analysis of symbol properties)
The chosen symbol of the experimenter is not a black square, since it shares both properties with the black square, and so the black square would not be a THOG.
The chosen symbol of the experimenter is not a white circle, since it shares 0 properties with the black square, and so the black square would not be a THOG.
So the experimenter could have chosen either a black circle or a white square. Since the colours and shapes of these two possibilities are opposites, it means:
Any symbol that shares exactly one property with one of the possibilities must necessarily share exactly one property with the other possibility. (So we can conclude that both the black square and the white circle are definitely THOGs.)
Any symbol that shares both properties with one of the possibilities must necessarily share 0 properties with the other possibility (and vice versa). (So we can conclude that both the black circle and the white square are definitely not THOGs.)
Solution 2 (Case Analysis of the Experimenter's choice)
This solution analyzes the four possible hidden choices of the experimenter.
From the table we see that the only valid choices for the experimenter are the white square or the black circle. In both of those rows, all of the inf |
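The case analysis can be checked by brute force (a Python sketch of the reasoning above, not part of Wason's original presentation): enumerate the experimenter's four possible choices, keep those consistent with the premise that the black square is a THOG, and classify each symbol against the surviving choices.

```python
from itertools import product

symbols = list(product(["black", "white"], ["square", "circle"]))

def is_thog(symbol, choice):
    # A THOG shares exactly one of the chosen colour and shape.
    return (symbol[0] == choice[0]) + (symbol[1] == choice[1]) == 1

# Hidden choices consistent with "the black square is a THOG".
consistent = [c for c in symbols if is_thog(("black", "square"), c)]

for s in symbols:
    verdicts = {is_thog(s, c) for c in consistent}
    status = ("definitely a THOG" if verdicts == {True}
              else "definitely not a THOG" if verdicts == {False}
              else "undecidable")
    print(s, "->", status)
```

Running this reproduces Solution 1: the white circle is definitely a THOG, while the black circle and white square are definitely not.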
https://en.wikipedia.org/wiki/SageMath | SageMath (previously Sage or SAGE, "System for Algebra and Geometry Experimentation") is a computer algebra system (CAS) with features covering many aspects of mathematics, including algebra, combinatorics, graph theory, numerical analysis, number theory, calculus and statistics.
The first version of SageMath was released on 24 February 2005 as free and open-source software under the terms of the GNU General Public License version 2, with the initial goals of creating an "open source alternative to Magma, Maple, Mathematica, and MATLAB". The originator and leader of the SageMath project, William Stein, was a mathematician at the University of Washington.
SageMath uses a syntax resembling Python's, supporting procedural, functional and object-oriented constructs.
Development
Stein realized when designing Sage that there were many open-source mathematics software packages already written in different languages, namely C, C++, Common Lisp, Fortran and Python.
Rather than reinventing the wheel, Sage (which is written mostly in Python and Cython) integrates many specialized CAS software packages into a common interface, for which a user needs to know only Python. However, Sage contains hundreds of thousands of unique lines of code adding new functions and creating the interfaces among its components.
SageMath is developed by both students and professionals. The development of SageMath is supported by both volunteer work and grants. However, it was not until 2016 that the first full-time Sage developer was hired (funded by an EU grant). The same year, Stein described his disappointment with a lack of academic funding and credentials for software development, citing it as the reason for his decision to leave his tenured academic position to work full-time on the project in a newly founded company, SageMath, Inc.
Achievements
2007: first prize in the scientific software division of Les Trophées du Libre, an international competition for free software.
2012: on |
https://en.wikipedia.org/wiki/Ab%20initio%20multiple%20spawning | The ab initio multiple spawning, or AIMS, method is a time-dependent formulation of quantum chemistry.
In AIMS, nuclear dynamics and electronic structure problems are solved simultaneously. Quantum mechanical effects in the nuclear dynamics are included, especially the nonadiabatic effects which are crucial in modeling dynamics on multiple electronic states.
The AIMS method makes it possible to describe photochemistry from first principles molecular dynamics, with no empirical parameters. The method has been applied to two molecules of interest in organic photochemistry - ethylene and cyclobutene.
The photodynamics of ethylene involves both covalent and ionic electronic excited states and the return to the ground state proceeds through a pyramidalized geometry. For the photoinduced ring opening of cyclobutene, it is shown that the disrotatory motion predicted by the Woodward–Hoffmann rules is established within the first 50 fs after optical excitation.
The method was developed by chemistry professor Todd Martinez. |
https://en.wikipedia.org/wiki/Kype | A kype is a hook-like secondary sex characteristic which develops at the distal tip of the lower jaw in some male salmonids prior to the spawning season. The structure usually develops in the weeks prior to, and during, migration to the spawning grounds. In addition to the development of the kype, a large depression forms in the two halves of the premaxilla in the upper jaw, allowing the kype to fit into the premaxilla when the mouth is closed.
The kype functions as a secondary sexual characteristic and influences the formation of dominance hierarchies at the spawning grounds. The size of the kype is believed to determine male spawning frequency.
Description
The kype grows rapidly from bony needles proliferating from the tip of the dentary (the anterior and largest of the bones making up the lower jaw). The needles form a mesh, but do not interfere with the connective tissues used by bone marrow. At the snout, the needles strengthen into Sharpey's fibres. The speed at which the kype skeleton develops results in many osteoblasts and proteoglycans appearing along the growth zone. The dentary itself is made of compact bone, but the kype tissue contains chondrocytes and cartilage. The kype formation process has been described as "making bone as fast as possible and with as little material as possible".
Some species of salmon are semelparous (they have a single reproductive bout before death) whereas others are iteroparous (they spawn multiple times after maturation). In iteroparous cases, at least in Atlantic salmon, the kype is not fully resorbed after the breeding season, although basal parts of the kype skeleton are re-modelled into regular dentary bone. Some fish never lose their kype. Rather, as they re-enter subsequent spawning seasons, their kypes continue to grow. This fast growing skeletal tissue fuses with the dense dentary, becoming a permanent, growing kype.
Occurrence
Many male trout (e.g. Brown trout (Salmo trutta) and rainbow trout (Oncorhynchus my |
https://en.wikipedia.org/wiki/Ideal%20tasks | Ideal tasks arise during task analysis. Ideal tasks are different from real tasks. They are ideals in the Platonic sense of a circle being an ideal whereas a drawn circle is flawed and real. The study of theoretically best or "mathematically ideal" tasks (Green & Swets, 1966) has been the basis of the branch of stimulus control in psychology called psychophysics, as well as being part of artificial intelligence (e.g. Goel & Chandrasekaran, 1992). Such studies include the instantiation of such ideal tasks in the real world. The notion of the ideal task has also played an important role in information theory. Tasks are defined as sequences of contingencies, each presenting stimuli and requiring an action or a sequence of actions to occur in some non-arbitrary fashion. These contingencies may not only provide stimuli that require the discrimination of relations among actions and events but among task actions themselves. Task actions, E, are actions that are required to complete tasks. Properties of tasks (usually the stimuli, or the relationship among stimuli and actions) are varied, and responses to them can be measured and analyzed. |
https://en.wikipedia.org/wiki/HDBaseT | HDBaseT is a consumer electronic (CE) and commercial connectivity standard for transmission of uncompressed ultra-high-definition video, digital audio, DC power, Ethernet, USB 2.0, and other control communication (such as RS-232 and Consumer IR) over a single category cable (Cat 5e or better) up to 100 m (328 ft) in length, terminated using the same 8P8C modular connectors as used in Ethernet networks. HDBaseT technology is promoted and advanced by the HDBaseT Alliance.
History
The HDBaseT Alliance was incorporated on June 14, 2010 by Samsung Electronics, Sony Pictures Entertainment, LG Electronics and Valens Semiconductor as a not-for-profit organization to promote the HDBaseT standard originally created by Valens. The HDBaseT 1.0 specification was also finalized in June 2010.
HDBaseT initially dominated in extender products for HDMI technology before later also being embedded in AV devices such as video projectors and AV receivers.
Products utilizing HDBaseT technology were first demonstrated by multiple vendors at the Consumer Electronics Show as early as 2010, and by the HDBaseT Alliance in 2013.
In 2013, the HDBaseT Alliance issued Specification 2.0 to enrich the HDBaseT offering to the pro-AV market and enable a multimedia home connectivity solution. Spec 2.0 maintains all the features of Spec 1.0, but also adds HDBaseT networking, switching, and control-point capabilities such as flexible and fully utilized mesh topology, distributed routing, and end-to-end error handling, enabling multipoint-to-multipoint connectivity and multi-streaming. Spec 2.0 also embeds USB 2.0 signals, enabling the zero-latency distribution of signals such as for touch-screen functionality and keyboard-video-mouse (KVM). HDBaseT 2.0 maintained category cable as the transmission medium, but also added the option for optical fiber for even greater distances and infrastructure cabling adaptability.
In 2014, the IEEE Standards Association announced that it formed a series of P1911 W |
https://en.wikipedia.org/wiki/Alternatives%20to%20Darwinian%20evolution | Alternatives to Darwinian evolution have been proposed by scholars investigating biology to explain signs of evolution and the relatedness of different groups of living things. The alternatives in question do not deny that evolutionary changes over time are the origin of the diversity of life, nor that the organisms alive today share a common ancestor from the distant past (or ancestors, in some proposals); rather, they propose alternative mechanisms of evolutionary change over time, arguing against mutations acted on by natural selection as the most important driver of evolutionary change.
This distinguishes them from certain other kinds of arguments that deny that large-scale evolution of any sort has taken place, as in some forms of creationism, which do not propose alternative mechanisms of evolutionary change but instead deny that evolutionary change has taken place at all. Not all forms of creationism deny that evolutionary change takes place; notably, proponents of theistic evolution, such as the biologist Asa Gray, assert that evolutionary change does occur and is responsible for the history of life on Earth, with the proviso that this process has been influenced by a god or gods in some meaningful sense.
Where the fact of evolutionary change was accepted but the mechanism proposed by Charles Darwin, natural selection, was denied, explanations of evolution such as Lamarckism, catastrophism, orthogenesis, vitalism, structuralism and mutationism (called saltationism before 1900) were entertained. Different factors motivated people to propose non-Darwinian mechanisms of evolution. Natural selection, with its emphasis on death and competition, did not appeal to some naturalists because they felt it immoral, leaving little room for teleology or the concept of progress (orthogenesis) in the development of life. Some who came to accept evolution, but disliked natural selection, raised religious objections. Others felt that evolution was an inherently progressive |
https://en.wikipedia.org/wiki/Elizabeth%20Maywood | Elizabeth Maywood is an English researcher who studies circadian rhythms and sleep in mice. Her studies are focused on the suprachiasmatic nucleus (SCN), a small region of the brain that controls circadian rhythms.
Biography
Elizabeth Susan Maywood was born in Leeds, England. She attained a degree in Pharmacology before going on to obtain her Ph.D. in biochemical endocrinology in London. After receiving her Ph.D., in 1988 she joined Michael Hastings’ group as a postdoc in the Department of Anatomy at the University of Cambridge (now part of the Physiology, Development and Neuroscience (PDN) Department) to study seasonal biology in Syrian hamsters. In 2001 she moved with Hastings to the MRC Laboratory of Molecular Biology in Cambridge, where he had set up a new research group to study the molecular neurobiology of circadian rhythms. Since then, she has moved the focus of her study to circadian rhythms and sleep.
Research contributions
Early lesion experiments in the field of chronobiology suggested that the suprachiasmatic nucleus (SCN) serves as the master circadian clock of the mammalian brain and is entrained through retinal inputs. More recently, research on the SCN has focused on the function of individual neuropeptides and their complex interactions in the scope of the SCN circuitry. Research into the role of vasoactive intestinal polypeptide (VIP), gastrin-releasing peptide (GRP), arginine vasopressin (AVP), and GABA has started to paint a picture of the hierarchy of neuropeptides in the maintenance of circadian coherence in the SCN.
Maywood's research investigates the complex interactions of various neuropeptides and the role of events at the membrane in feedback loops in the SCN. Furthermore, Maywood's research also seeks to understand how different parts of the SCN coordinate rhythms and more broadly understand the interaction of the SCN with sleep.
Studies of CRY1/CRY2 in the Suprachiasmatic Nucleus
In one experiment, Maywood |
https://en.wikipedia.org/wiki/HD-MAC | HD-MAC (High Definition Multiplexed Analogue Components) was a broadcast television standard proposed by the European Commission in 1986, as part of the Eureka 95 project. It belongs to the MAC (Multiplexed Analogue Components) standard family. It was an early attempt by the EEC to provide high-definition television (HDTV) in Europe. It was a complex mix of an analogue signal (based on the Multiplexed Analogue Components standard), multiplexed with digital sound and assistance data for decoding (DATV). The video signal (1250 lines/50 fields per second in 16:9 aspect ratio, with 1152 visible lines) was encoded with a modified D2-MAC encoder.
HD-MAC could be decoded by normal D2-MAC standard definition receivers, but no extra resolution was obtained and certain artifacts were visible. To decode the signal in full resolution, a specific HD-MAC tuner was required.
Naming convention
The European Broadcasting Union video format description is as follows: width x height [scan type: i or p] / number of full frames per second
European standard definition digital broadcasts use 720×576i/25, meaning 25 interlaced frames per second, each 720 pixels wide and 576 pixels high: odd lines (1, 3, 5 ...) are grouped to build the odd field, which is transmitted first, then it is followed by the even field containing lines 2, 4, 6... Thus, there are two fields in a frame, resulting in a field frequency of 25 × 2 = 50 Hz.
The visible part of the video signal provided by an HD-MAC receiver was 1152i/25, which exactly doubles the vertical resolution of standard definition. The amount of information is multiplied by 4, considering the encoder started its operations from a 1440x1152i/25 sampling grid.
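The resolution relationships above reduce to simple pixel arithmetic; a small sketch (the figures come from the text above, not from the HD-MAC specification itself):

```python
# Pixel-count arithmetic for the formats described above:
# 720x576i/25 European standard definition vs. the 1440x1152i/25
# sampling grid used by the HD-MAC encoder.

sd_width, sd_height = 720, 576      # standard definition
hd_width, hd_height = 1440, 1152    # HD-MAC encoder sampling grid

sd_pixels = sd_width * sd_height
hd_pixels = hd_width * hd_height

# Vertical resolution exactly doubles ...
assert hd_height == 2 * sd_height
# ... and the total amount of information is multiplied by 4.
assert hd_pixels == 4 * sd_pixels

# Two interlaced fields per frame at 25 frames per second give 50 Hz.
frames_per_second = 25
field_rate_hz = frames_per_second * 2
print(sd_pixels, hd_pixels, field_rate_hz)  # 414720 1658880 50
```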
Standard history
Work on HD-MAC specification started officially in May 1986. The purpose was to react against a Japanese proposal, supported by the US, which aimed to establish the NHK-designed Hi-Vision (also known as MUSE) system as a world standard. Besides preservation of the European electronic industr |
https://en.wikipedia.org/wiki/Genome%20India%20Project | Genome India Project (GIP) is a research initiative led by the Bangalore-based Indian Institute of Science's Centre for Brain Research and involves over 20 universities across the country in an effort to gather samples, compile data, conduct research, and create an 'Indian reference genome' grid.
Background
The initiative is funded by the Department of Biotechnology (DBT) to sequence at least 10,000 Indian genomes in phase 1. The goal of the research is to develop predictive diagnostic indicators for several high-priority diseases and other uncommon and genetic disorders. In phase 2, the project would collect genetic samples from patients in three broad categories: cardiovascular diseases, mental illness, and cancer.
Participating institutions
The list includes:
All India Institute of Medical Sciences, Jodhpur
Centre for Cellular and Molecular Biology
Centre for DNA Fingerprinting and Diagnostics
Institute of Genomics and Integrative Biology
Gujarat Biotechnology Research Centre, Gandhinagar
Indian Institute of Information Technology, Allahabad
Indian Institute of Science Education and Research, Pune
Indian Institute of Technology, Madras
Indian Institute of Technology, Delhi
Indian Institute of Technology Jodhpur
Institute of Bioresources And Sustainable Development, Imphal
Institute of Life Sciences, Bhubaneswar
Mizoram University
National Centre for Biological Sciences
National Institute of Biomedical Genomics
National Institute of Mental Health and Neurosciences
Rajiv Gandhi Centre for Biotechnology
Sher-i-Kashmir Institute of Medical Sciences |
https://en.wikipedia.org/wiki/Exciton | An electron and an electron hole that are attracted to each other by the Coulomb force can form a bound state called an exciton. It is an electrically neutral quasiparticle that exists mainly in condensed matter, including insulators, semiconductors, and some metals, but also in certain atoms, molecules and liquids. The exciton is regarded as an elementary excitation that can transport energy without transporting net electric charge.
An exciton can form when an electron from the valence band of a crystal is promoted in energy to the conduction band, e.g., when a material absorbs a photon. Promoting the electron to the conduction band leaves a positively charged hole in the valence band. Here 'hole' represents the unoccupied quantum mechanical electron state with a positive charge, an analogue in a crystal of a positron. Because of the attractive Coulomb force between the electron and the hole, a bound state is formed, akin to that of the electron and proton in a hydrogen atom or the electron and positron in positronium. Excitons are composite bosons, since they are formed from two fermions: an electron and a hole.
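The hydrogen-atom analogy can be made quantitative in the large-radius (Wannier–Mott) limit, where the binding energy is the hydrogen Rydberg scaled by the electron-hole reduced mass and screened by the dielectric constant. A minimal sketch; the GaAs-like parameter values are illustrative assumptions, not figures from the text:

```python
# Hydrogen-like (Wannier-Mott) exciton estimate: the hydrogen Rydberg
# scaled by the reduced mass mu and screened by the relative
# permittivity eps_r. Parameter values below are assumed, GaAs-like.

RYDBERG_EV = 13.6057      # hydrogen binding energy, eV
BOHR_RADIUS_NM = 0.0529   # hydrogen Bohr radius, nm

def exciton_binding_ev(mu_over_me, eps_r):
    """Binding energy E_b = Ry * (mu/m_e) / eps_r**2, in eV."""
    return RYDBERG_EV * mu_over_me / eps_r**2

def exciton_radius_nm(mu_over_me, eps_r):
    """Exciton Bohr radius a_x = a_0 * eps_r / (mu/m_e), in nm."""
    return BOHR_RADIUS_NM * eps_r / mu_over_me

mu, eps = 0.058, 12.9     # assumed reduced mass (units of m_e), permittivity
e_b = exciton_binding_ev(mu, eps)   # a few meV: weakly bound
a_x = exciton_radius_nm(mu, eps)    # ~10 nm: spans many lattice sites
print(f"E_b = {1000 * e_b:.1f} meV, a_x = {a_x:.1f} nm")
```

The few-meV binding energy and roughly ten-nanometre radius illustrate why large-radius excitons extend over many unit cells, in contrast to the tightly bound Frenkel case discussed below.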
The concept of excitons was first proposed by Yakov Frenkel in 1931, when he described the excitation of an atomic lattice considering what is now called the tight-binding description of the band structure. In his model the electron and the hole bound by the Coulomb interaction are located either on the same or on the nearest neighbouring sites of the lattice, but the exciton as a composite quasi-particle is able to travel through the lattice without any net transfer of charge, which led to many proposals for optoelectronic devices.
Excitons are often treated in two limiting cases:
(i) The small-radius excitons, or Frenkel excitons, where the electron-hole relative distance is restricted to one or only a few nearest-neighbour unit cells. Frenkel excitons typically occur in insulators and organic semiconductors with relatively narrow allowed energy b
https://en.wikipedia.org/wiki/CYP26%20family | Cytochrome P450, family 26, also known as CYP26, is a mammalian cytochrome P450 monooxygenase family found in the human genome. There are three members in the human genome: CYP26A1, CYP26B1 and CYP26C1. Synteny mapping of CYP26 family members shows linkages to CYP16 family members of many invertebrates, suggesting that the tetrapod CYP26 family may have evolved from the CYP16 of fish.
https://en.wikipedia.org/wiki/Shelter%20%28video%20game%29 | Shelter is a survival video game developed by Might and Delight for Windows and Mac. It was released on 28 August 2013 after being accepted through Steam Greenlight. In the game players control a mother badger who must protect and feed her cubs while travelling from their burrow to a new one. During the journey the cubs must be fed and are in danger from threats such as birds of prey and wildfires.
The game was received positively, with favorable reviews of its graphics and sound, as well as the emotional impact it evoked. Reviewers gave mixed reactions to its difficulty and length.
Gameplay
In Shelter, players control a mother badger who is escorting her five cubs from their burrow to a new home and must protect them from danger. Along the journey the cubs will gradually get hungry and require food which the player must provide for them by either catching prey, such as foxes, or finding fruit and vegetables. Threats to the cubs come in different forms in each section of the game.
In the first and latter sections of the game, there are areas that contain circling birds of prey which can fly down and pick off a cub if it is out in the open for too long. One section of the game takes place at night, giving the player limited vision of their surroundings. In this section, cubs will occasionally be scared by noises and will run away from the player, requiring them to chase the cubs until they are within a safe radius of the mother. In a later section, the player must travel through a wildfire, keeping the cubs safe from the spreading blaze. Another requires the player to escort the cubs up an overflowing river. The goal of the game becomes to have as many cubs as possible survive the journey to shelter.
Development
Might and Delight began development on Shelter in January 2013 following the release of Pid. The game was announced and listed on Steam Greenlight in April, accepted in July, and released on 28 August of the same year. In December 201 |
https://en.wikipedia.org/wiki/Aleksandr%20Khazanov | Aleksandr Leonidovich Khazanov (May 4, 1979 – disappeared June 10, 2001) is a Russian American mathematician. A child prodigy, he achieved a perfect score at the 1994 International Mathematical Olympiad, one of the youngest ever to do so. Khazanov was reported missing in June 2001. He suffered from depression or bipolar disorder at the time of his disappearance.
Biography
Born to Anna and Leonid Khazanov, a math professor, Aleksandr had attended a special school for math students several years older in Leningrad, Soviet Union (now Saint Petersburg, Russia), before his family fled antisemitic threats and moved to Brooklyn, New York, United States as refugees in 1992. He attended Stuyvesant High School and showed his exceptional talent in mathematics. In the summer of 1994, he passed qualifying exams for Penn State University's doctoral program in math, and he was the youngest member on the United States team for the International Mathematical Olympiad of which all six members got perfect scores. In the following year, he was named a finalist and eventually placed 7th at the 54th Westinghouse Science Talent Search for a paper dealing with a variant of Fermat's Last Theorem, mentored by Leonid Vaserstein, a math professor at Penn State University also from Russia. He represented the United States for the second time in the 1995 International Mathematical Olympiad, and he entered the PhD math program at Penn State University right after high school in fall 1995.
Disappearance
In the morning of June 10, 2001, he left home with his mountain bike and went missing, leaving a note in Russian saying "I went to a library." He was taking a semester leave due to personal problems, and had been taking drugs for depression, but he did not bring them when he left home.
See also
Lists of people who disappeared |
https://en.wikipedia.org/wiki/Attix5%20Online%20Backup | Attix5, in computing, is an online disk-to-disk backup solution. Attix5 is one of the oldest managed online backup companies, serving SMBs and enterprises.
On September 11, 2015, Attix5 was acquired by UK based cloud data management company Redstor. Redstor has since renamed the technology Redstor Pro.
See also
List of backup software
Remote backup service
Timeline of computing |
https://en.wikipedia.org/wiki/65%2C535 | 65535 is the integer after 65534 and before 65536.
It is the maximum value of an unsigned 16-bit integer.
In mathematics
65535 is the sum of 2⁰ through 2¹⁵ (2⁰ + 2¹ + 2² + ... + 2¹⁵) and is therefore a repdigit in base 2 (1111111111111111), in base 4 (33333333), and in base 16 (FFFF).
It is the ninth number whose Euler totient has an aliquot sum that is : , and the twenty-eighth perfect totient number equal to the sum of its iterated totients.
65535 is the fifteenth 626-gonal number, the fifth 6555-gonal number, and the third 21846-gonal number.
65535 is the product of the first four Fermat primes: 65535 = (2 + 1)(4 + 1)(16 + 1)(256 + 1). Because of this property, it is possible to construct with compass and straightedge a regular polygon with 65535 sides (see constructible polygon).
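The arithmetic properties above are easy to verify directly; a quick sketch:

```python
# Verify the stated properties of 65535.

n = 65535

# Sum of the powers of two from 2^0 through 2^15, i.e. 2^16 - 1.
assert n == sum(2**k for k in range(16)) == 2**16 - 1

# Repdigit in bases 2, 4 and 16.
assert format(n, 'b') == '1' * 16      # 1111111111111111
assert format(n, 'x') == 'ffff'        # FFFF
digits_base4 = []
m = n
while m:
    m, d = divmod(m, 4)
    digits_base4.append(d)
assert digits_base4 == [3] * 8         # 33333333 in base 4

# Product of the first four Fermat primes 3, 5, 17, 257.
assert n == (2 + 1) * (4 + 1) * (16 + 1) * (256 + 1) == 3 * 5 * 17 * 257

print("all properties verified")
```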
In computing
65535 occurs frequently in the field of computing because it is 2¹⁶ − 1 (one less than 2 to the 16th power), which is the highest number that can be represented by an unsigned 16-bit binary number. Some computer programming environments may have predefined constant values representing 65535, with names like .
In older computers with processors having a 16-bit address bus such as the MOS Technology 6502 popular in the 1970s and the Zilog Z80, 65535 (FFFF16) is the highest addressable memory location, with 0 (000016) being the lowest. Such processors thus support at most 64 KiB of total byte-addressable memory.
In Internet protocols, 65535 is also the number of TCP and UDP ports available for use, since port 0 is reserved.
In some implementations of Tiny BASIC, entering a command that divides any number by zero will return 65535.
In Microsoft Word 2011 for Mac, 65535 is the highest line number that will be displayed.
In HTML, 65535 is the decimal value of the web color Aqua (#00FFFF).
See also
4,294,967,295
255 (number)
16-bit computing |
https://en.wikipedia.org/wiki/Four-tensor | In physics, specifically for special relativity and general relativity, a four-tensor is an abbreviation for a tensor in a four-dimensional spacetime.
Generalities
General four-tensors are usually written in tensor index notation as
T^{μ1 μ2 ... μn}_{ν1 ν2 ... νm},
with the indices taking integer values from 0 to 3, with 0 for the timelike components and 1, 2, 3 for spacelike components. There are n contravariant indices and m covariant indices.
In special and general relativity, many four-tensors of interest are first order (four-vectors) or second order, but higher-order tensors occur. Examples are listed next.
In special relativity, the vector basis can be restricted to being orthonormal, in which case all four-tensors transform under Lorentz transformations. In general relativity, more general coordinate transformations are necessary since such a restriction is not in general possible.
Examples
First-order tensors
In special relativity, one of the simplest non-trivial examples of a four-tensor is the four-displacement
x^μ = (x0, x1, x2, x3),
a four-tensor with contravariant rank 1 and covariant rank 0. Four-tensors of this kind are usually known as four-vectors. Here the component x0 = ct gives the displacement of a body in time (coordinate time t is multiplied by the speed of light c so that x0 has dimensions of length). The remaining components of the four-displacement form the spatial displacement vector x = (x1, x2, x3).
The four-momentum for massive or massless particles is
p^μ = (p0, p1, p2, p3),
combining its energy (divided by c) p0 = E/c and 3-momentum p = (p1, p2, p3).
For a particle with invariant mass m, also known as rest mass, the four-momentum is defined by
p^μ = m dx^μ/dτ,
with τ the proper time of the particle.
The relativistic mass is m_rel = γm, with Lorentz factor γ = 1/√(1 − v²/c²).
Second-order tensors
The Minkowski metric tensor with an orthonormal basis for the (−+++) convention is
η_{μν} = diag(−1, 1, 1, 1),
used for calculating the line element and raising and lowering indices. The above applies to Cartesian coordinates. In general relativity, the metric tensor is given by much m |
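Index lowering with the (−+++) Minkowski metric can be sketched numerically; the particular four-momentum chosen below is purely illustrative:

```python
import math

# Minkowski metric in an orthonormal basis, (-+++) convention.
eta = [[-1, 0, 0, 0],
       [ 0, 1, 0, 0],
       [ 0, 0, 1, 0],
       [ 0, 0, 0, 1]]

def lower(p_contra):
    """Lower one index: p_mu = eta_{mu nu} p^nu."""
    return [sum(eta[mu][nu] * p_contra[nu] for nu in range(4))
            for mu in range(4)]

# Illustrative four-momentum in units with c = 1: mass m = 2,
# 3-momentum (1, 0, 0), so the energy is E = sqrt(m^2 + |p|^2) = sqrt(5).
p = [math.sqrt(5.0), 1.0, 0.0, 0.0]
p_low = lower(p)

# The invariant p_mu p^mu = -E^2 + |p|^2 equals -m^2 with c = 1.
invariant = sum(p_low[mu] * p[mu] for mu in range(4))
print(round(invariant, 9))   # -4.0
```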
https://en.wikipedia.org/wiki/Theta%20wave | Theta waves generate the theta rhythm, a neural oscillation in the brain that underlies various aspects of cognition and behavior, including learning, memory, and spatial navigation in many animals. It can be recorded using various electrophysiological methods, such as electroencephalogram (EEG), recorded either from inside the brain or from electrodes attached to the scalp.
At least two types of theta rhythm have been described. The hippocampal theta rhythm is a strong oscillation that can be observed in the hippocampus and other brain structures in numerous species of mammals including rodents, rabbits, dogs, cats, and marsupials. "Cortical theta rhythms" are low-frequency components of scalp EEG, usually recorded from humans. Theta rhythms can be quantified using quantitative electroencephalography (qEEG) with freely available toolboxes such as EEGLAB or the Neurophysiological Biomarker Toolbox (NBT).
In rats, theta wave rhythmicity is easily observed in the hippocampus, but can also be detected in numerous other cortical and subcortical brain structures. Hippocampal theta waves, with a frequency range of 6–10 Hz, appear when a rat is engaged in active motor behavior such as walking or exploratory sniffing, and also during REM sleep. Theta waves with a lower frequency range, usually around 6–7 Hz, are sometimes observed when a rat is motionless but alert. When a rat is eating, grooming, or sleeping, the hippocampal EEG usually shows a non-rhythmic pattern known as large irregular activity or LIA. The hippocampal theta rhythm depends critically on projections from the medial septal area, which in turn receives input from the hypothalamus and several brainstem areas. Hippocampal theta rhythms in other species differ in some respects from those in rats. In cats and rabbits, the frequency range is lower (around 4–6 Hz), and theta is less strongly associated with movement than in rats. In bats, theta appears in short bursts associated with echolocation.
|
https://en.wikipedia.org/wiki/Oncotherm | Oncotherm medical devices are produced by Oncotherm Ltd. and used for cancer treatment.
Methodology
The company's methodology is based on the view of its founder that the heat-dose sensitive characterization of tissue is at the core of the oncothermia treatment. The tumor tissue has lower impedance than the surrounding tissues, so most of the energy is transmitted and absorbed by the cancerous lesion. This selection of the tumor tissues (self-focusing) renders external focusing unnecessary.
Company
Oncotherm Ltd. was founded in 1988 by Prof. Dr. András Szász in Hungary. In 2002, it received investment from a German company and was reorganized as a German-Hungarian company consisting of Oncotherm Hungary Ltd and Oncotherm GmbH.
Medical devices
Their main products are the Oncotherm EHY-2000 and the Oncotherm EHY-2030, hyperthermia devices using the thermoelectric effects of electrical fields.
More than 450 devices have been placed into operation, mostly in Germany and South Korea, and performed over 400,000 treatments. The devices are not approved by the U.S. Food and Drug Administration. |
https://en.wikipedia.org/wiki/GPS%20in%20the%20earthmoving%20industry | GPS, when applied in the earthmoving industry, can be a valuable asset to contractors and can increase the overall efficiency of a job. Since GPS satellite positioning information is free to the public, anyone can take advantage of its uses. Heavy equipment manufacturers, in conjunction with GPS guidance system manufacturers, have been co-developing GPS guidance systems for heavy equipment since the late 1990s. These systems allow the equipment operator to use GPS position data to make decisions based on actual grade and design features. Some heavy equipment guidance systems can even operate the machine's implements automatically from a set design that was created for the particular jobsite. GPS guidance systems can achieve tolerances as small as two to three centimeters, making them far more accurate than relying on operator skill alone. Since the machine's GPS system knows when it is off the design grade, it can reduce the surveying and material costs required for a specific job.
History
GPS Technology was officially introduced as a guidance system for earthmoving machines in the late 1990s. Since this time, many manufacturers of earthmoving equipment now offer GPS and other guidance systems, as a factory option. Many companies exist that also sell GPS guidance systems for the earthmoving industry as a retrofit option. The two main companies for heavy equipment guidance systems are Trimble and Topcon. In April 2002, Trimble and Caterpillar Inc. began a joint venture known as Caterpillar Trimble Controls Technology LLC (CTCT). "The joint venture develops machine control products that use site design information combined with accurate positioning technology to automatically control dozer blades and other machine tools". Though aftermarket kits were available from various companies to retrofit an existing machine for GPS guidance, Caterpillar Inc. was the first heavy equipment manufacturer to offer GPS guidance systems as a factor |
https://en.wikipedia.org/wiki/Faceware%20Technologies | Faceware Technologies is a facial animation and motion capture development company in America. The company was established under Image Metrics and became its own company at the beginning of 2012.
Faceware produces software used to capture an actor's performance and transfer it onto an animated character, as well as the hardware needed to capture the performances. The software line includes Faceware Analyzer, Faceware Retargeter, and Faceware Live.
Faceware software is used by film studios and video game developers including Rockstar Games, Bungie, Cloud Imperium Games, and 2K in games such as Grand Theft Auto V, Destiny, Star Citizen, and Halo: Reach.
Through its application in the video game industry, Faceware won the Develop Award while it was still part of Image Metrics for Technical Innovation in 2008. It won the Develop Award again for Creative Contribution: Visuals in 2014. Faceware received Best of Show recognition at the Game Developers Conference 2011 in San Francisco as well as Computer Graphics World's Silver Edge Award at SIGGRAPH 2014 and 2016. Faceware won the XDS Gary Award in 2016 for its contributions to the Faceware-EA presentation at the 2016 XDS Summit.
History
Image Metrics, founded in 2000, is a provider of facial animation and motion capture technology within the video game and entertainment industries. In 2008, Image Metrics offered a beta version of its facial animation technology to visual effects and film studios. The technology captured an actor's performance on video, analyzed it, and mapped it onto a CG model. The release of the beta allowed studios to incorporate facial animation technology into internal pipelines rather than going to the Image Metrics studio as they had in the past. The first studio to beta test Image Metrics' software in 2009 was the visual effects studio Double Negative out of London.
In 2010, Image Metrics launched the facial animation technology platform Faceware. Faceware focused on increasing creative contr |
https://en.wikipedia.org/wiki/Separatrix%20%28mathematics%29 | In mathematics, a separatrix is the boundary separating two modes of behaviour in a differential equation.
Example: simple pendulum
Consider the differential equation describing the motion of a simple pendulum:
d²θ/dt² = −(g/l) sin θ,
where l denotes the length of the pendulum, g the gravitational acceleration and θ the angle between the pendulum and vertically downwards. In this system there is a conserved quantity H (the Hamiltonian), which is given by
H = (1/2)(dθ/dt)² − (g/l) cos θ
With this defined, one can plot a curve of constant H in the phase space of the system. The phase space is a graph with θ along the horizontal axis and dθ/dt on the vertical axis – see the thumbnail to the right. The type of resulting curve depends upon the value of H.
If H < −g/l then no curve exists (because dθ/dt would have to be imaginary).
If −g/l < H < g/l then the curve will be a simple closed curve which is nearly circular for small H and becomes "eye" shaped when H approaches the upper bound. These curves correspond to the pendulum swinging periodically from side to side.
If H > g/l then the curve is open, and this corresponds to the pendulum forever swinging through complete circles.
In this system the separatrix is the curve that corresponds to H = g/l. It separates — hence the name — the phase space into two distinct areas, each with a distinct type of motion. The region inside the separatrix has all those phase space curves which correspond to the pendulum oscillating back and forth, whereas the region outside the separatrix has all the phase space curves which correspond to the pendulum continuously turning through vertical planar circles.
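The classification of pendulum motion by its conserved quantity can be expressed directly in code. A small sketch, assuming the common dimensionless form H = (dθ/dt)²/2 − (g/l)·cos θ with separatrix value g/l, consistent with the discussion above rather than a quoted formula:

```python
import math

def hamiltonian(theta, omega, g=9.81, l=1.0):
    """Conserved quantity H = omega^2 / 2 - (g/l) * cos(theta)."""
    return 0.5 * omega**2 - (g / l) * math.cos(theta)

def classify(theta, omega, g=9.81, l=1.0, tol=1e-9):
    """Classify pendulum motion by comparing H with the separatrix value g/l."""
    h, h_sep = hamiltonian(theta, omega, g, l), g / l
    if h < h_sep - tol:
        return "oscillation"   # closed curve: swinging back and forth
    if h > h_sep + tol:
        return "rotation"      # open curve: turning full circles forever
    return "separatrix"        # boundary between the two regimes

print(classify(0.5, 0.0))       # small swing from rest -> "oscillation"
print(classify(0.0, 10.0))      # fast spin at the bottom -> "rotation"
print(classify(math.pi, 0.0))   # balanced upright -> "separatrix"
```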
Example: FitzHugh–Nagumo model
In the FitzHugh–Nagumo model, when the linear nullcline pierces the cubic nullcline at the left, middle, and right branch once each, the system has a separatrix. Trajectories to the left of the separatrix converge to the left stable equilibrium, and similarly for the right. The separatrix itself is the stable manifold of the saddle point in the middle. Details are found on the model's page.
The separatrix is cle |
https://en.wikipedia.org/wiki/Karyogamy | Karyogamy is the final step in the process of fusing together two haploid eukaryotic cells, and refers specifically to the fusion of the two nuclei. Before karyogamy, each haploid cell has one complete copy of the organism's genome. In order for karyogamy to occur, the cell membrane and cytoplasm of each cell must fuse with the other in a process known as plasmogamy. Once within the joined cell membrane, the nuclei are referred to as pronuclei. Once the cell membranes, cytoplasm, and pronuclei fuse, the resulting single cell is diploid, containing two copies of the genome. This diploid cell, called a zygote or zygospore can then enter meiosis (a process of chromosome duplication, recombination, and division, to produce four new haploid cells), or continue to divide by mitosis. Mammalian fertilization uses a comparable process to combine haploid sperm and egg cells (gametes) to create a diploid fertilized egg.
The term karyogamy comes from the Greek karyo- (from κάρυον karyon) 'nut' and γάμος gamos 'marriage'.
Importance in haploid organisms
Haploid organisms such as fungi, yeast, and algae can have complex cell cycles, in which the choice between sexual or asexual reproduction is fluid, and often influenced by the environment. Some organisms, in addition to their usual haploid state, can also exist as diploid for a short time, allowing genetic recombination to occur. Karyogamy can occur within either mode of reproduction: during the sexual cycle or in somatic (non-reproductive) cells.
Thus, karyogamy is the key step in bringing together two sets of different genetic material which can recombine during meiosis. In haploid organisms that lack sexual cycles, karyogamy can also be an important source of genetic variation during the process of forming somatic diploid cells. Formation of somatic diploids circumvents the process of gamete formation during the sexual reproduction cycle and instead creates variation within the somatic cells of an already developed organ |
https://en.wikipedia.org/wiki/Vincent%20Poor | Harold Vincent Poor FRS FREng is the Michael Henry Strater University Professor of Electrical Engineering at Princeton University, where he is also the Interim Dean of the School of Engineering and Applied Science. He is a specialist in wireless telecommunications, signal processing and information theory. He has received many honorary degrees and has been elected to several national academies. He was President of the IEEE Information Theory Society in 1990. He is on the board of directors of the IEEE Foundation.
Education
Poor received a BSEE degree from Auburn University in 1972, and an MSEE from the same institution in 1974. In 1977, he received his PhD from Princeton University. From 1977 to 1990, he was a faculty member at the University of Illinois at Urbana–Champaign. In 1990, he joined Princeton University as a professor.
Research
His research interests lie in the areas of stochastic analysis, statistical signal processing and information theory, and their applications in a number of fields including wireless networks, social networks, and smart grid. This research work has attracted over 10,000 citations. He has published a book on Signal Detection and Estimation. This book is considered the definitive reference in the subject. He was reported to have made a particular impact in the field of wireless communications.
Awards
He was inducted National Academy of Engineering in 2001, into the American Academy of Arts & Sciences in 2005, and into the National Academy of Sciences in 2011. He was inducted as International Fellow of the UK Royal Academy of Engineering in 2009, as Corresponding Fellow of the Royal Society of Edinburgh in 2013, as Foreign Member of the Royal Society of London, UK in 2014, and as Foreign Member of the Chinese Academy of Sciences in 2017. He was inducted as Fellow of the IEEE in 1987 for contributions to the theory of robust linear filtering applied to signal detection and estimation, of the American Association for the Advancement of Science in 1991, of |
https://en.wikipedia.org/wiki/Wiki%20hosting%20service | A wiki hosting service, or wiki farm, is a server or an array of servers that offers users tools to simplify the creation and development of individual, independent wikis. Wiki farms are not to be confused with wiki "families", a more generic term for any group of wikis located on the same server.
Prior to wiki farms, someone who wanted to operate a wiki had to install the software and manage the server(s) themselves. With a wiki farm, the farm's administration installs the core wiki code once on its own servers, centrally maintains the servers, and establishes unique space on the servers for the content of each individual wiki with the shared core code executing the functions of each wiki.
Both commercial and non-commercial wiki farms are available for users and online communities. While most of the wiki farms allow anyone to open their own wiki, some impose restrictions. Many wiki farm companies generate revenue through the insertion of advertisements, but often allow payment of a monthly fee as an alternative to accepting ads.
Some examples of wiki hosting services are: Fandom, a for-profit wiki-hosting service created by Jimmy Wales and Angela Beesley Starling in 2004, and Miraheze, a non-profit wiki-hosting service created by John Lewis and Ferran Tufan.
See also
Comparison of wiki hosting services |
https://en.wikipedia.org/wiki/Michael%20Krivelevich | Michael Krivelevich (born January 30, 1966) is a professor with the School of Mathematical Sciences of Tel Aviv University, Israel.
Krivelevich received his PhD from Tel Aviv University in 1997 under the supervision of Noga Alon. He has published extensively in combinatorics and adjacent fields and specializes in extremal and probabilistic combinatorics.
He serves as an editor-in-chief of the Journal of Combinatorial Theory (Series B) and is on the editorial board of several other journals in the field.
Awards and honors
In 2007, Krivelevich and Alan Frieze won the Pazy Memorial Award for research into probabilistic reasoning in combinatorics.
In 2014, Krivelevich gave an invited address in the Combinatorics section at the International Congress of Mathematicians.
He was elected as a member of the 2017 class of Fellows of the American Mathematical Society "for contributions to extremal and probabilistic combinatorics". |
https://en.wikipedia.org/wiki/SPRY3 | Protein sprouty homolog 3 is a protein that in humans is encoded by the SPRY3 gene.
The SPRY3 gene is one of the genes found in the pseudoautosomal regions of the human sex chromosomes (i.e. those 19 genes that are found on both the X and Y chromosome). It is located in the PAR2 region. |
https://en.wikipedia.org/wiki/BlazBlue%3A%20Calamity%20Trigger | BlazBlue: Calamity Trigger is a fighting game developed by Arc System Works in 2008. The game's name is a combination of the words "blaze" and "blue" when the title is rendered in rōmaji, and of the words "brave" and "blue" when rendered in katakana. As Japanese speakers usually follow the katakana rendering, the Japanese pronunciation is similar to the word "bray", entirely omitting the "z" sound. Originally released for arcades, it was also released for the PlayStation 3, Xbox 360, and Microsoft Windows. A port for the PlayStation Portable, titled BlazBlue: Calamity Trigger Portable, was released in 2010. It was the first title in the BlazBlue game series and extended franchise.
Gameplay
BlazBlue is a traditional 2D fighter in which two characters participate in a duel. A round is called a "rebel", and one match can consist of one to five "rebels". To win a round, a player must either incapacitate the opponent by reducing their health to zero through various attacks, or have more remaining health than the opponent when the clock runs out.
Every character has a weak, medium and strong attack, as well as a "unique" technique, called a Drive attack, which is different for each character. Those attacks are also known as "A", "B", "C" and "D". Various combos can be performed by every character through careful input of regular and Drive attacks. A combo consists of two or more consecutive attacks that hit an opponent without them retaliating. As combos become longer, each attack will do less damage than normal to give the opponent a chance to retaliate. Grabs can be incorporated into combos also by pressing the "B" and "C" buttons at the same time. Occasionally, some attacks (e.g. Jin's Hirensou) will use portions of the player's heat gauge at the bottom of the screen. The heat gauge is filled by either dealing or receiving damage. When a character has 50% or more heat, special moves called "Distortion Drives" can be performed. W |
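The heat-gauge mechanic described above can be sketched as a toy model (class and method names are hypothetical; only the 50% threshold for Distortion Drives and the fill-on-damage rule come from the text):

```python
class HeatGauge:
    """Toy model of the heat gauge: fills by dealing or receiving damage."""

    def __init__(self):
        self.heat = 0.0                  # percentage, 0-100

    def on_damage(self, amount):
        """Dealing or receiving damage adds heat, capped at 100%."""
        self.heat = min(100.0, self.heat + amount)

    def can_distortion_drive(self):
        """Distortion Drives require 50% or more heat."""
        return self.heat >= 50.0

    def spend(self, cost):
        """Some special moves consume a portion of the gauge."""
        if self.heat >= cost:
            self.heat -= cost
            return True
        return False
```

A move like Jin's Hirensou would call `spend()` with its heat cost before executing, in this sketch.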
https://en.wikipedia.org/wiki/Epic%20of%20evolution | In social, cultural and religious studies in the United States, the "epic of evolution" is a narrative that blends religious and scientific views of cosmic, biological and sociocultural evolution in a mythological manner. According to The Encyclopedia of Religion and Nature, an "epic of evolution" encompasses
History
"Epic of evolution" seems to have originated from the sociobiologist Edward O. Wilson's use of the phrase "evolutionary epic" in 1978. Wilson was not the first to use the term, but his prominence prompted its adoption in the morphed form "epic of evolution". In later years, he also used the latter term himself.
Naturalistic and liberal religious writers have picked up on Wilson's term and have used it in a number of texts. These authors however have at times used other terms to refer to the idea: Universe Story (Brian Swimme, John F. Haught), Great Story (Connie Barlow, Michael Dowd), Everybody's Story (Loyal Rue), New Story (Thomas Berry, Al Gore, Brian Swimme) and Cosmic Evolution (Eric Chaisson).
Narrative
Evolution generally refers to biological evolution, but here it means a process in which the whole universe is a progression of interrelated phenomena, a gradual process in which something changes into a different and usually more complex form (emergence). It should not be "biologized" as it includes many areas of science. In addition, outside of the scientific community, the term evolution is frequently used differently from scientists' usage. This often leads to misunderstanding since scientists are viewing evolution from a different perspective. The same applies to the use of the term theory as used in the theory of evolution (see references for Evolution as theory and fact).
This epic is not a long narrative poem but a series of events that form the proper subject for a laudable kind of tale. It is mythic in that it is a story of ostensibly historical events that serves to unfold part of the worldview of a people and explains a natural phenomenon. |
https://en.wikipedia.org/wiki/Bioinformatics%20Research%20Network | Bioinformatics Research Network (BRN) is a non-profit open-science research-based organization aiming to provide volunteer opportunities and bioinformatics research training that is free and open to everyone. It is a community-driven 501(c)(3) non-profit organization that aims to establish a worldwide network that is open to anyone interested in bioinformatics irrespective of academic background and to provide bioinformatics training, mentorship and the opportunity to collaborate on exciting research projects.
Training and Projects
BRN provides free training workshops through its partner group, the Bioinformatics Interest Group (BIG). BIG is a student club of The University of Texas Health Science Center at San Antonio, established to promote the development of student bioinformaticians and encourage the growth of bioinformatics skills in the community. BRN is open to academic labs wishing to host projects for open collaboration; these projects are then available for anyone to contribute to. To work on a project, a volunteer must complete the required skill assessments for that project and apply to the respective team. The decision to accept the volunteer rests with the team of the respective project.
Publication
BRN has published its projects in BioRxiv and in peer-reviewed journals. |
https://en.wikipedia.org/wiki/Boundary%20scan | Boundary scan is a method for testing interconnects (wire lines) on printed circuit boards or sub-blocks inside an integrated circuit. Boundary scan is also widely used as a debugging method to watch integrated circuit pin states, measure voltage, or analyze sub-blocks inside an integrated circuit.
The Joint Test Action Group (JTAG) developed a specification for boundary scan testing that was standardized in 1990 as the IEEE Std. 1149.1-1990. In 1994, a supplement that contains a description of the Boundary Scan Description Language (BSDL) was added which describes the boundary-scan logic content of IEEE Std 1149.1 compliant devices. Since then, this standard has been adopted by electronic device companies all over the world. Boundary scan is now mostly synonymous with JTAG.
Testing
The boundary scan architecture provides a means to test interconnects (including clusters of logic, memories, etc.) without using physical test probes; this involves the addition of at least one test cell that is connected to each pin of the device and that can selectively override the functionality of that pin. Each test cell may be programmed via the JTAG scan chain to drive a signal onto a pin and thus across an individual trace on the board; the cell at the destination of the board trace can then be read, verifying that the board trace properly connects the two pins. If the trace is shorted to another signal or if the trace is open, the correct signal value does not show up at the destination pin, indicating a fault.
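The interconnect test described above can be sketched as a toy model (illustrative only, not an IEEE 1149.1-accurate simulation; the `board_trace` mapping and the assumption that an open trace captures a stuck 0 are simplifications):

```python
# Toy model of a boundary-scan interconnect test: drive a test pattern from
# device A's output cells and compare what device B's input cells capture
# against the wiring the board netlist says should be there.

def interconnect_test(pattern, board_trace, n_dest_pins):
    """board_trace maps A's pin index -> B's pin index; a missing entry
    models an open (broken) trace."""
    captured = [0] * n_dest_pins            # open traces capture a stuck 0 here
    for a_pin, bit in enumerate(pattern):
        b_pin = board_trace.get(a_pin)
        if b_pin is not None:
            captured[b_pin] = bit
    return captured

good = {0: 0, 1: 1, 2: 2}                   # straight-through wiring
open_fault = {0: 0, 2: 2}                   # the trace on pin 1 is broken
result = interconnect_test([1, 1, 0], open_fault, 3)   # -> [1, 0, 0]
```

Comparing `result` against the driven pattern reveals the fault on pin 1, mirroring how a real test flags a trace whose expected value does not appear at the destination.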
On-chip infrastructure
To provide the boundary scan capability, IC vendors add additional logic to each of their devices, including scan cells for each of the external traces. These cells are then connected together to form the external boundary scan shift register (BSR), and combined with JTAG Test Access Port (TAP) controller support comprising four (or sometimes more) additional pins plus control circuitry.
Some TAP controllers support scan chains between on-chi |
https://en.wikipedia.org/wiki/Ecological%20network | An ecological network is a representation of the biotic interactions in an ecosystem, in which species (nodes) are connected by pairwise interactions (links). These interactions can be trophic or symbiotic. Ecological networks are used to describe and compare the structures of real ecosystems, while network models are used to investigate the effects of network structure on properties such as ecosystem stability.
Properties
Historically, research into ecological networks developed from descriptions of trophic relationships in aquatic food webs; however, recent work has expanded to look at other food webs as well as webs of mutualists. Results of this work have identified several important properties of ecological networks.
Complexity (linkage density): the average number of links per species. Explaining the observed high levels of complexity in ecosystems has been one of the main challenges and motivations for ecological network analysis, since early theory predicted that complexity should lead to instability.
Connectance: the proportion of possible links between species that are realized (links/species²). In food webs, the level of connectance is related to the statistical distribution of the links per species. The distribution of links changes from (partial) power-law to exponential to uniform as the level of connectance increases. The observed values of connectance in empirical food webs appear to be constrained by the variability of the physical environment and by habitat type, which are reflected in an organism's diet breadth as driven by optimal foraging behaviour. This ultimately links the structure of these ecological networks to the behaviour of individual organisms.
Degree distribution: the degree distribution of an ecological network is the cumulative distribution for the number of links each species has. The degree distributions of food webs have been found to display the same universal functional form. The degree distribution can be split into its two |
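The complexity and connectance measures defined above can be computed directly from an adjacency matrix; a minimal sketch with a hypothetical four-species web:

```python
# Hypothetical 4-species food web: adj[i][j] = 1 if species i eats species j.
adj = [
    [0, 1, 1, 0],
    [0, 0, 1, 1],
    [0, 0, 0, 1],
    [0, 0, 0, 0],
]

S = len(adj)                          # number of species (nodes)
L = sum(sum(row) for row in adj)      # number of realized links
linkage_density = L / S               # complexity: average links per species
connectance = L / S**2                # realized fraction of the S^2 possible links
```

For this toy web, L = 5 links among S = 4 species gives a linkage density of 1.25 and a connectance of 0.3125.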
https://en.wikipedia.org/wiki/Schwarz%E2%80%93Ahlfors%E2%80%93Pick%20theorem | In mathematics, the Schwarz–Ahlfors–Pick theorem is an extension of the Schwarz lemma for hyperbolic geometry, such as the Poincaré half-plane model.
The Schwarz–Pick lemma states that every holomorphic function from the unit disk U to itself, or from the upper half-plane H to itself, will not increase the Poincaré distance between points. The unit disk U with the Poincaré metric has negative Gaussian curvature −1. In 1938, Lars Ahlfors generalised the lemma to maps from the unit disk to other negatively curved surfaces:
Theorem (Schwarz–Ahlfors–Pick). Let U be the unit disk with the Poincaré metric; let S be a Riemann surface endowed with a Hermitian metric whose Gaussian curvature is ≤ −1; let f : U → S be a holomorphic function. Then
d_S(f(z₁), f(z₂)) ≤ d_U(z₁, z₂) for all z₁, z₂ ∈ U.
A generalization of this theorem was proved by Shing-Tung Yau in 1973. |
https://en.wikipedia.org/wiki/Championship%20Lode%20Runner | Championship Lode Runner is a sequel to the 1983 puzzle-platform game Lode Runner. It was released in 1984 for the Apple II, Commodore 64, and IBM PC (as a self-booting disk), then ported to the Atari 8-bit family, Famicom, SG-1000, and MSX. Mostly the same as Lode Runner, Championship Lode Runner has levels that are much more difficult. Unlike the original, it does not include a level editor.
Gameplay
The object of the game is to pick up all the gold pieces (which appear as piles of gold) and then reach the top of the level. Enemies must be overcome using non-violent methods; bumping into an enemy costs the player a life and all of their hard-earned gold pieces. The game consists of fifty extremely difficult levels, which must be tackled in sequential order. While games can be saved, the player automatically loses a life for restoring a saved game.
Unlike the original Lode Runner game, this version does not come with a level editor. Many of the levels made for this game were designed using the built-in level editor from the original game.
Ports
The game was first released for the Apple II. The Famicom port was published by Hudson Soft; Famicom players can start at any of the first ten levels but need passwords to skip to later levels. The Apple II and Famicom versions offered players a certificate for completing the game.
The IBM PC self-booting disk version was written by Doug Greene.
In 1985, Sega published the game for the SG-1000 in Japan and it was released on the My Card format. A port was also released for the MSX. Both versions were developed by Compile.
Reception
Based on sales and market-share data, Video magazine listed the game seventh on its list of best selling video games in February 1985.
Reviews
Games #61 |
https://en.wikipedia.org/wiki/Potty%20parity | Potty parity is equal or equitable provision of public toilet facilities for females and males within a public space.
Definition of parity
Parity may be defined in various ways in relation to facilities in a building. The simplest is equal floorspace for male and female washrooms. Since men's and boys' bathrooms include urinals, which take up less space than stalls, this still results in more facilities for males. An alternative is parity in the number of fixtures within washrooms. However, since females on average spend more time in washrooms, males are able to use more facilities per unit time. More recent parity regulations therefore require more fixtures for females, to ensure that the average time spent waiting to use the toilet is the same for females as for males, or to equalise the throughputs of male and female toilets.
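The throughput-equalising rule above can be illustrated with hypothetical numbers: if a women's visit takes longer on average, equal throughput requires proportionally more fixtures:

```python
def fixtures_for_equal_throughput(male_fixtures, t_male, t_female):
    """Fixtures needed so both washrooms serve users at the same rate.
    t_male and t_female are hypothetical average visit times in seconds."""
    throughput = male_fixtures / t_male   # users served per second, men's room
    return throughput * t_female          # fixtures needed for the same rate

# e.g. 10 fixtures at 60 s/visit for men vs 90 s/visit average for women:
needed = fixtures_for_equal_throughput(10, 60, 90)   # 15 fixtures
```

The ratio of fixtures simply tracks the ratio of average visit times, which is why fixture-count parity alone leaves women with longer queues.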
The lack of diaper-changing stations for babies in men's restrooms has been listed as a potty parity issue by fathers. Some jurisdictions have considered legislation mandating diaper-changing stations in men's restrooms.
Sex differences
Women and girls often spend more time in washrooms than men and boys, for both physiological and cultural reasons. The requirement to use a cubicle rather than a urinal means urination takes longer and hand washing must be done more thoroughly. Females also make more visits to washrooms. Urinary tract infections and incontinence are more common in females. Pregnancy, menstruation, breastfeeding, and diaper-changing increase usage. The elderly, who are disproportionately female, take longer and more frequent bathroom visits.
A variety of female urinals and personal funnels have been invented to make it easier for females to urinate standing up. None has become widespread enough to affect policy formation on potty parity.
John F. Banzhaf III, a law professor at George Washington University, calls himself the "father of potty parity." Banzhaf argues that to ignore potty parity; that is, to have merely equ |
https://en.wikipedia.org/wiki/Survey%20marker | Survey markers, also called survey marks, survey monuments, or geodetic marks, are objects placed to mark key survey points on the Earth's surface. They are used in geodetic and land surveying. A benchmark is a type of survey marker that indicates elevation (vertical position). Horizontal position markers used for triangulation are also known as triangulation stations.
Benchmarking is the hobby of "hunting" for these marks.
Types
All sorts of different objects, ranging from the familiar brass disks to liquor bottles, clay pots, and rock cairns, have been used over the years as survey markers. Some markers have been used to designate tripoints, or the meeting points of three or more countries. In the 19th century, these marks were often drill holes in rock ledges, crosses or triangles chiseled in rock, or copper or brass bolts sunk into bedrock.
Today in the United States, the most common geodetic survey marks are cast metal disks with stamped legends on their face set in rock ledges, embedded in the tops of concrete pillars, or affixed to the tops of pipes that have been sunk into the ground. These marks are intended to be permanent, and disturbing them is generally prohibited by federal and state law.
Survey markers in Nagoya, Japan, which bear stylized images of shachihoko, are noted for their elaborate design.
History
Survey markers were often placed as part of triangulation surveys, measurement efforts that moved systematically across states or regions, establishing the angles and distances between various points. Such surveys laid the basis for map-making across the world.
Geodetic survey markers were often set in groups. For example, in triangulation surveys, the primary point identified was called the triangulation station, or the "main station". It was often marked by a "station disk" (see upper photo at left), a brass disk with a triangle inscribed on its surface and an impressed mark that indicated the precise point over which a surveyor's plumb-bob s |
https://en.wikipedia.org/wiki/Closed-loop%20authentication | Closed-loop authentication, as applied to computer network communication, refers to a mechanism whereby one party verifies the purported identity of another party by requiring them to supply a copy of a token transmitted to the canonical or trusted point of contact for that identity. It is also sometimes used to refer to a system of mutual authentication whereby two parties authenticate one another by signing and passing back and forth a cryptographically signed nonce, each party demonstrating to the other that they control the secret key used to certify their identity.
E-mail Authentication
Closed-loop email authentication is useful as a simple way for one party to verify the email address of another, as a weak form of identity verification. It is not a strong form of authentication in the face of host- or network-based attacks (where an impostor, Chuck, is able to intercept Bob's email, capture the nonce, and thus masquerade as Bob).
Closed-loop email authentication is also used by parties with a shared secret relationship (for example, a website and someone with a password to an account on that website), where one party has lost or forgotten the secret and needs to be reminded. The party still holding the secret sends it to the other party at a trusted point of contact. The most common instance of this usage is the "lost password" feature of many websites, where an untrusted party may request that a copy of an account's password be sent by email, but only to the email address already associated with that account. A problem associated with this variation is the tendency of a naïve or inexperienced user to click on a URL if an email encourages them to do so. Most website authentication systems mitigate this by permitting unauthenticated password reminders or resets only by email to the account holder, but never allowing a user who does not possess a password to log in or specify a new one.
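A minimal sketch of the token flow described above (function names are hypothetical; a real system would actually email the token, expire it, and rate-limit attempts):

```python
import hashlib
import hmac
import secrets

# A token is generated for the address on file; presenting it back later
# proves control of that mailbox.  Only a hash of the token is stored.
_pending = {}  # email -> SHA-256 hash of the outstanding token

def start_verification(email):
    token = secrets.token_urlsafe(16)
    _pending[email] = hashlib.sha256(token.encode()).hexdigest()
    return token  # stands in for "send this token to `email`"

def complete_verification(email, token):
    expected = _pending.pop(email, None)   # single-use: consumed on first try
    supplied = hashlib.sha256(token.encode()).hexdigest()
    return expected is not None and hmac.compare_digest(expected, supplied)
```

`hmac.compare_digest` is used for the comparison to avoid timing side channels, and storing only the hash means a database leak does not expose live tokens.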
In some instances in web authentication, closed-loop authentication is employed before any access is gra |
https://en.wikipedia.org/wiki/Nosokinetics | Nosokinetics is the science of measuring and modelling the process of care in health and social care systems. The name combines the Greek roots noso- (disease) and kinetics (movement).
Black box models are currently used to plan changes in health and social care systems. These input-output models overlook the process of inpatient care; as a result, suboptimal decisions are made. Nosokinetics (analogous to pharmacokinetics) seeks to develop dynamic methods that measure and model the process of inpatient care. The aim is to develop a scientific base to underpin the planning of sustainable health and social care systems.
Establishment
Nosokinetics is a new science established in the UK in the early 1990s by Prof Peter H Millard after he published his PhD thesis. In 2004, the Nosokinetics group newsletter was established.
Origin
Prof Peter H Millard writes about Nosokinetics : "If the random forces of wind and tide can make such a beautiful statue (referring to an iceberg), how much better could mankind do if a new science was developed which explains the complex processes of health and social care. Until new methods of planning health and social care services to meet the needs of an ageing population are introduced, service delivery will stumble on from crisis to crisis. The world population is ageing and sustainable systems of health care need to be developed."
He has established the nosokinetics group of interested researchers. The group collaborates to organize conferences and disseminates news of nosokinetics and other researchers' research and practical use of modelling to enhance decision making in health and social care systems.
Network
The Nosokinetics Group has succeeded in attracting many researchers. People interested in nosokinetics are present in many countries, including Australia, the UK and Egypt, and come from disciplines ranging from health care provision to management science. The news related to nosokinetics i
https://en.wikipedia.org/wiki/Ammonium%20malate | Ammonium malate refers to organic compounds containing malate and ammonium. Two stoichiometries are discussed: NH4H(C2H3OH(CO2)2), with one ammonium ion per formula unit, and (NH4)2(C2H3OH(CO2)2). Malate, the conjugate base of malic acid, is chiral; consequently, a variety of salts are possible: R, S, or racemic. The monoammonium salt has been crystallized as the monohydrate.
As a food additive, diammonium malate has been used as flavoring agent and as an acidity regulator. It has the E number E349. |
https://en.wikipedia.org/wiki/ZNF367 | Zinc finger protein 367 is a protein that in humans is encoded by the ZNF367 gene. The human gene is also known as ZFF29 and CDC14B; the orthologue in mice is Zfp367. ZNF367 contains a unique Cys2His2 zinc finger motif and is a member of the zinc finger protein family.
Model organisms
Model organisms have been used in the study of ZNF367 function. A conditional knockout mouse line, called Zfp367tm1a(KOMP)Wtsi was generated as part of the International Knockout Mouse Consortium program — a high-throughput mutagenesis project to generate and distribute animal models of disease to interested scientists.
Male and female animals underwent a standardized phenotypic screen to determine the effects of deletion. Twenty-six tests were carried out on mutant mice, but no significant abnormalities were observed. |
https://en.wikipedia.org/wiki/Meniscal%20cyst | Meniscal cyst is a well-defined cystic lesion located along the peripheral margin of the meniscus, a part of the knee, nearly always associated with horizontal meniscal tears.
Signs and symptoms
Pain and swelling or focal mass at the level of the joint. The pain may be related to a meniscal tear or distension of the knee capsule or both. The mass varies in consistency from soft/fluctuant to hard. Size is variable, and meniscal cysts are known to change in size with knee flexion/extension.
Cause
Various etiologies have been proposed, including trauma, hemorrhage, chronic infection, and mucoid degeneration. The most widely accepted theory describes meniscal cysts resulting from extrusion of synovial fluid through a peripherally extended horizontal meniscal tear, accumulating outside the joint capsule. They arise more commonly from the lateral joint margin, and occur most often in 20- to 40-year-old males.
Diagnosis
Magnetic resonance imaging is the modality of choice for diagnosing meniscal cysts. In their most subtle form, meniscal cysts present as focal areas of high signal intensity within a swollen meniscus. It is not uncommon for radiologists to miss this type of meniscal cyst because the signal intensity is not quite as great as that of fluid on T2-weighted sequences. When this fluid is extruded into the adjacent soft tissues, the swollen meniscus subsequently assumes a more normal shape, and the extruded fluid demonstrates a higher T2 signal typical of parameniscal cysts.
Medial meniscus horizontal tear extending into a meniscal cyst.
Sagittal T2 images of a medial meniscus horizontal tear extending into a meniscal cyst.
Large medial meniscus cyst.
Treatment
Treatment of meniscal cysts consists of a combination of cyst decompression (intraarticular decompression versus open cystectomy) and arthroscopic repair of any meniscal abnormalities. Success rates are significantly higher when both the cyst and meniscal tear are treated compared to treating only one di |
https://en.wikipedia.org/wiki/World%20Games%20%28video%20game%29 | World Games is a sports video game developed by Epyx for the Commodore 64 in 1986. Versions for the Apple IIGS, Amstrad CPC, ZX Spectrum, Master System and other contemporary systems were also released. The NES version was released by Milton Bradley, and ported by Software Creations on behalf of producer Rare.
The game is a continuation of the Epyx sports line that includes Summer Games and Winter Games.
World Games was made available in Europe for the Wii virtual console on April 25, 2008.
Events
The events available vary slightly depending on the platform, and may include:
Weightlifting (Soviet Union)
Slalom skiing (France)
Log rolling (Canada)
Cliff diving (Mexico)
Caber toss (Scotland)
Bull riding (United States)
Barrel jumping (Germany)
Sumo Wrestling (Japan)
The game allows the player to compete in all of the events sequentially, choose a few events, choose just one event, or practice an event.
Reception
Writing for Info, Benn Dunnington gave the Commodore 64 version of World Games three-plus stars out of five and described it as "my least favorite of the series". Stating that slalom skiing was the best event, he concluded that "Epyx does such a nice, consistent job of execution, tho, that it's hard to take off too many points even for such boring material". Computer Gaming World's Rick Teverbaugh criticized the slalom skiing and log rolling events' difficulty, but concluded that "World Games is still a must for the avid sports games". Charles Ardai called the game "an adequate sequel" to Epyx's previous Games, and praised the graphics. He criticized the mechanics "as bizarre little joystick patterns which have little to do with the events" but still recommended the game because of the log rolling event. James Trunzo praised the game's use of advanced graphics and sound, including humorous effects. Also noted was the variety in the included games, preventing the game from getting too repetitive.
The game was reviewed in 1988 in Dragon #132 by Har |
https://en.wikipedia.org/wiki/Perturbed%20angular%20correlation | The perturbed γ-γ angular correlation, PAC for short or PAC-Spectroscopy, is a method of nuclear solid-state physics with which magnetic and electric fields in crystal structures can be measured. In doing so, electrical field gradients and the Larmor frequency in magnetic fields as well as dynamic effects are determined. With this very sensitive method, which requires only about 10-1000 billion atoms of a radioactive isotope per measurement, material properties in the local structure, phase transitions, magnetism and diffusion can be investigated. The PAC method is related to nuclear magnetic resonance and the Mössbauer effect, but shows no signal attenuation at very high temperatures.
Today only the time-differential perturbed angular correlation (TDPAC) is used.
History and development
PAC goes back to a theoretical work by Donald R. Hamilton from 1940. The first successful experiment was carried out by Brady and Deutsch in 1947. Essentially spin and parity of nuclear spins were investigated in these first PAC experiments. However, it was recognized early on that electric and magnetic fields interact with the nuclear moment, providing the basis for a new form of material investigation: nuclear solid-state spectroscopy.
Step by step the theory was developed.
After Abragam and Pound published their work on the theory of PAC in 1953 including extra nuclear fields, many studies with PAC were carried out afterwards. In the 1960s and 1970s, interest in PAC experiments sharply increased, focusing mainly on magnetic and electric fields in crystals into which the probe nuclei were introduced. In the mid-1960s, ion implantation was discovered, providing new opportunities for sample preparation. The rapid electronic development of the 1970s brought significant improvements in signal processing. From the 1980s to the present, PAC has emerged as an important method for the study and characterization of materials, e.g. for the study of semiconductor materials, intermetal |
https://en.wikipedia.org/wiki/All%20fifths%20tuning | Among guitar tunings, all-fifths tuning refers to the set of tunings in which each interval between consecutive open strings is a perfect fifth. All-fifths tuning is also called fifths, perfect fifths, or mandoguitar. The conventional "standard tuning" consists of perfect fourths and a single major third between the g and b strings:
E-A-d-g-b-e'
All-fifths tuning has the set of open strings
C-G-d-a-e'-b' or G'-D-A-e-b-f',
which have intervals of 3 octaves minus a half-step between the lowest and highest string. The conventional tuning has an interval of 2 octaves between lowest and highest string.
All-fifths tuning is a tuning in intervals of perfect fifths like that of a mandolin or a violin. It has a wide range. It was used by jazz guitarist Carl Kress in the form
B'-F-c-g-d'-a'.
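The interval arithmetic behind these tunings can be illustrated with a short sketch (the helper is hypothetical; octave numbers are relative, counting semitones from a nominal C0):

```python
NOTES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def open_strings(start_note, start_octave, interval, count=6):
    """List `count` open-string notes, each `interval` semitones above the last."""
    idx = NOTES.index(start_note) + 12 * start_octave
    names = []
    for _ in range(count):
        names.append(f"{NOTES[idx % 12]}{idx // 12}")
        idx += interval
    return names

# All-fifths (7 semitones) vs all-fourths (5 semitones):
fifths = open_strings("C", 2, 7)    # C2 G2 D3 A3 E4 B4 - spans 35 semitones
fourths = open_strings("E", 2, 5)   # E2 A2 D3 G3 C4 F4 - spans 25 semitones
```

The 35-semitone span of the fifths tuning is exactly the "3 octaves minus a half-step" range mentioned above, versus only 25 semitones for an all-fourths tuning starting on the same low register.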
An approximation: new standard tuning
All-fifths tuning has been approximated by the New Standard Tuning of King Crimson's Robert Fripp, which avoids the high b' by replacing it with a g'; the tuning has been taught in Guitar Craft courses. Guitar Craft, which has been succeeded by Guitar Circle, has taught Fripp's tuning to 3,000 students.
An approximation: through the looking glass guitar
All-fifths tuning has also been approximated by the Through the Looking Glass guitar tuning of Kei Nakano, which he has played since 2015.
This tuning acts as a mirror image of the tunings of string instruments, including the guitar, and can be adapted to any other guitar tuning.
A guitar tuned this way for a right-handed player can generally be played as a left-handed guitar, and vice versa.
Relation with all-fourths tuning
All-fifths tuning is closely related to all-fourths tuning. All-fifths tuning is based on the perfect fifth (the interval with seven semitones), and all-fourths tuning is based on the perfect fourth (five semitones). The perfect-fifth and perfect-fourth intervals are inversions of one another, and the chords of all-fourth and all-fifths are paired as inverted chords |
https://en.wikipedia.org/wiki/Risk%20intelligence | Risk intelligence is a concept that generally means "beyond risk management", though it has been used in different ways by different writers. The term is being used more frequently by business strategists when discussing integrative business processes related to governance, risk, and compliance.
Definitions
The first non-definitive usage of the phrase "risk intelligence" appears in the 1980s and aligns with the definition of intelligence as information about an adversary (for example, regarding credit risk). The topic of balancing risk and innovation using information, and the cognitive processes involved, also appears at this time. Recent usage is more aligned with intelligence as understanding and problem solving.
The US business writer David Apgar defines it as the capacity to learn about risk from experience.
Deloitte Risk Advisory partner (since retired) Stephen Wagner, along with former Deloitte partner and current management consultant and risk advisor Rick Funston, defined risk intelligence as a dynamic approach to protecting and creating value amid uncertainty. It is an enterprise-wide process integrating people, processes (systems), and tools to increase the information available to decision makers for improved decision making.
The UK philosopher and psychologist Dylan Evans defines it as "a special kind of intelligence for thinking about risk and uncertainty", at the core of which is the ability to estimate probabilities accurately. Evans includes a risk intelligence test (RQ), analogous to IQ or EQ, in his book and on his website (below).
Computer scientist and risk researcher Jochen L. Leidner distinguishes three components of risk (risk type, likelihood and impact; the latter has also been called loss if expressed in negative financial terms); risk intelligence then is the process of obtaining these three pieces of information for any risk type to which an entity has non-trivial exposure. Risk intelligence can be obtained manually or with computer support.
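Leidner's three pieces of information per risk can be illustrated with a toy risk register (all figures are hypothetical; expected loss here is simply likelihood × impact):

```python
# Toy risk register: (risk type, likelihood, financial impact).
risks = [
    ("data breach", 0.05, 2_000_000),      # 5% chance, $2M impact
    ("supplier default", 0.10, 500_000),   # 10% chance, $0.5M impact
]

def expected_loss(likelihood, impact):
    """Expected loss when impact is expressed in financial terms."""
    return likelihood * impact

# Rank risks by expected loss, largest first:
ranked = sorted(risks, key=lambda r: expected_loss(r[1], r[2]), reverse=True)
```

In this sketch the lower-probability data breach still ranks first because its expected loss ($100,000) exceeds the supplier default's ($50,000).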
American financi |
https://en.wikipedia.org/wiki/Stochastic%20computing | Stochastic computing is a collection of techniques that represent continuous values by streams of random bits. Complex computations can then be computed by simple bit-wise operations on the streams. Stochastic computing is distinct from the study of randomized algorithms.
Motivation and a simple example
Suppose that p, q ∈ [0, 1] are given, and we wish to compute p × q. Stochastic computing performs this operation using probability instead of arithmetic.
Specifically, suppose that there are two random, independent bit streams called stochastic numbers (i.e. Bernoulli processes), where the probability of a one in the first stream is p, and the probability in the second stream is q. We can take the logical AND of the two streams.
The probability of a one in the output stream is p × q. By observing enough output bits and measuring the frequency of ones, it is possible to estimate p × q to arbitrary accuracy.
The operation above converts a fairly complicated computation (multiplication of p and q) into a series of very simple operations (bitwise AND) on random bits.
More generally speaking, stochastic computing represents numbers as streams of random bits and reconstructs numbers by calculating frequencies. The computations are performed on the streams and translate complicated operations on and into simple operations on their stream representations. (Because of the method of reconstruction, devices that perform these operations are sometimes called stochastic averaging processors.) In modern terms, stochastic computing can be viewed as an interpretation of calculations in probabilistic terms, which are then evaluated with a Gibbs sampler. It can also be interpreted as a hybrid analog/digital computer.
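The multiplication-by-AND scheme described above can be simulated in a few lines. This is a minimal sketch; the stream length, seed, and input values are arbitrary choices for illustration, not from the article:

```python
import random

def stochastic_stream(p, n, rng):
    """n independent Bernoulli(p) bits: each bit is 1 with probability p."""
    return [1 if rng.random() < p else 0 for _ in range(n)]

rng = random.Random(42)   # fixed seed for reproducibility
n = 100_000               # longer streams give more accurate estimates
p, q = 0.6, 0.5

a = stochastic_stream(p, n, rng)
b = stochastic_stream(q, n, rng)

# Bitwise AND of two independent streams: P(output bit = 1) = p * q.
out = [x & y for x, y in zip(a, b)]

# Reconstruct the product by measuring the frequency of ones.
estimate = sum(out) / n   # close to p * q = 0.3 for long streams
```

Note that the accuracy improves only as the square root of the stream length, which is why stochastic computing trades precision for extremely simple hardware.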
History
Stochastic computing was first introduced in a pioneering paper by John von Neumann in 1953. However, the theory could not be fully developed until advances in computing of the 1960s, mostly through a series of simultaneous and parallel efforts in the US and the UK.
By the lat |
https://en.wikipedia.org/wiki/Tennessee%20Ornithological%20Society | The Tennessee Ornithological Society (TOS) is an independent non-profit educational, scientific, and conservation organization in Tennessee, United States, dedicated to the study and conservation of birds. It was formed in 1915 and has published a quarterly journal, The Migrant, since 1930. The organization conducts statewide meetings and its local chapters have regular meetings and field trips.
The TOS was started by a group of six amateur ornithologists who met at Faucon's French Restaurant in Nashville on October 7, 1915. A historical marker at the site commemorates the organization's founding.
Annual activities of the organization include annual autumn hawk counts, spring and fall bird counts, winter raptor surveys, and annual Christmas bird counts.
The quarterly journal of the Tennessee Ornithological Society, The Migrant, is a repository for sighting reports and articles related to monitoring the status of Tennessee's bird populations. The organization also published the Atlas of the Breeding Birds of Tennessee by Charles P. Nicholson, a compendium on the birds of the state, based on research conducted by TOS members from 1986 through 1991. The atlas includes the first fully documented account of distribution patterns for 170 bird species confirmed as breeding in Tennessee, as well as for several unconfirmed or extirpated bird species.
See also
List of state ornithological organizations in the United States |
https://en.wikipedia.org/wiki/Pierre%20Fr%C3%A9d%C3%A9ric%20Sarrus | Pierre Frédéric Sarrus (; 10 March 1798, Saint-Affrique – 20 November 1861) was a French mathematician.
Sarrus was a professor at the University of Strasbourg, France (1826–1856) and a member of the French Academy of Sciences in Paris (1842). He is the author of several treatises, including one on the solution of numeric equations with multiple unknowns (1842); one on multiple integrals and their integrability conditions; and one on the determination of the orbits of the comets. He also discovered a mnemonic rule for computing the determinant of a 3-by-3 matrix, named Sarrus' scheme. Sarrus also demonstrated the fundamental lemma of the calculus of variations.
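Sarrus' scheme computes a 3-by-3 determinant as the sum of the three "down-right" diagonal products minus the sum of the three "down-left" diagonal products. A minimal Python sketch (the function name and test matrix are illustrative only):

```python
def sarrus_det(m):
    """Determinant of a 3x3 matrix by Sarrus' rule:
    add the three 'down-right' diagonal products (aei, bfg, cdh),
    subtract the three 'down-left' diagonal products (ceg, bdi, afh)."""
    (a, b, c), (d, e, f), (g, h, i) = m
    return a*e*i + b*f*g + c*d*h - c*e*g - b*d*i - a*f*h

print(sarrus_det([[1, 2, 3], [4, 5, 6], [7, 8, 10]]))  # -3
```

The rule is usually visualised by copying the first two columns to the right of the matrix, so the six diagonals can be read off directly; it applies only to 3-by-3 matrices.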
Sarrus numbers are pseudoprimes to base 2.
Sarrus also developed the Sarrus linkage, the first linkage capable of transforming rotary motion into perfect straight-line motion, and vice versa.
1798 births
1861 deaths
People from Saint-Affrique
19th-century French mathematicians
Linear algebraists |
https://en.wikipedia.org/wiki/CASTEP | CASTEP is a shared-source academic and commercial software package which uses density functional theory with a plane wave basis set to calculate the electronic properties of crystalline solids, surfaces, molecules, liquids and amorphous materials from first principles. CASTEP permits geometry optimisation and finite temperature molecular dynamics with implicit symmetry and geometry constraints, as well as calculation of a wide variety of derived properties of the electronic configuration. Although CASTEP was originally a serial, Fortran 77-based program, it was completely redesigned and rewritten from 1999 to 2001 using Fortran 95 and MPI for use on parallel computers by researchers at the Universities of York, Durham, St. Andrews, Cambridge and Rutherford Labs.
History
CASTEP was created in the late 1980s and early 1990s in the TCM Group of the Cavendish Laboratory in Cambridge. It was then an academic code written in Fortran 77, and the name was originally derived from CAmbridge Serial Total Energy Package. In the mid-1990s it was commercialised by licensing it to Molecular Simulations International (the company was later purchased by Accelrys, in turn purchased by Biovia) in an arrangement through which the University of Cambridge received a share of the royalties, and much of the development remained with the original academic authors. The code was then redesigned and completely rewritten from 1999 to 2001 to make use of the features of modern Fortran, enable parallelism throughout the code and improve its software sustainability. The name CASTEP was adopted by the new codebase, but without the implied former meaning, since the code was now parallel and capable of computing many quantities besides the total energy. By this point annual sales exceeded £1m. Despite its commercialisation, CASTEP and its source code remained free to UK academics.
In 2019 the free academic licence was extended to world-wide academic use (not just UK academia). Commercial users can pu |
https://en.wikipedia.org/wiki/Anterior%20atlantoaxial%20ligament | The anterior atlantoaxial ligament is a strong membrane, fixed above the lower border of the anterior arch of the atlas; below, to the front of the body of the axis.
It is strengthened in the middle line by a rounded cord, which connects the tubercle on the anterior arch of the atlas to the body of the axis. It is a continuation upward of the anterior longitudinal ligament.
Anatomy
Anatomical relations
The anterior atlantoaxial ligament is situated anterior to the longus capitis muscle.
See also
Atlanto-axial joint |