https://en.wikipedia.org/wiki/Limbic-predominant%20age-related%20TDP-43%20encephalopathy
LATE is a term that describes a prevalent condition with impaired memory and thinking in advanced age, often culminating in the dementia clinical syndrome. In other words, the symptoms of LATE are similar to those of Alzheimer's disease.   The acronym LATE stands for Limbic-predominant Age-related TDP-43 Encephalopathy: “limbic” is related to the brain areas first involved, “age-related” and the name “LATE” itself refer to the onset of disease usually in persons aged 80 or older,  “TDP-43” indicates the aberrant mis-folded protein (or proteinopathy) deposits in the brain that characterize LATE, and “encephalopathy” means illness of brain. At present LATE can only be diagnosed with certainty at autopsy. The terminology used to refer to the brain changes identified in autopsy-confirmed LATE is: LATE neuropathologic change (LATE-NC). The diagnosis of LATE-NC at autopsy requires detection of pathologic TDP-43 protein deposits in the brain, especially in the amygdala and hippocampus. LATE is a very common condition: autopsy studies around the world indicate that LATE is present in the brains of about one-third of people over 85. LATE typically affects persons older than 75 years of age (with some exceptions; please see below) and becomes increasingly prevalent every year in advanced old age. This is in contrast to Alzheimer's disease pathology, which tends to level off and perhaps decrease in prevalence among persons beyond age 85 years. LATE is often comorbid with (i.e., occurs in the same brain as) other pathologic changes that are associated with dementia, such as Alzheimer's disease and cerebrovascular disease(s). LATE has a large impact on public health. Clinical-pathologic correlation studies have established that the presence of LATE-NC is associated with impairments in memory and thinking. In older persons whose brains lack Alzheimer's disease-type amyloid plaques and neurofibrillary tangles, the presence of LATE-NC at autopsy is associated with a rel
https://en.wikipedia.org/wiki/Sloshing%20bucket%20model%20of%20evolution
The sloshing bucket model of evolution is a theory in evolutionary biology that describes how environmental disturbances of varying magnitude affect the species present. The theory emphasizes the causal relationship between environmental factors that impinge on genealogical systems, providing an overarching view of the relationships among the variety of biological systems. This theory was developed by Niles Eldredge, a U.S. biologist and paleontologist, and published in 'Evolutionary Dynamics: Exploring the Interplay of Selection, Accident, Neutrality and Function', where Eldredge introduces his sloshing bucket model in the article titled 'The Sloshing Bucket: How the Physical Realm Controls Evolution'. Summary The sloshing bucket model uses the imagery of water, representing species, sloshing back and forth in a bucket, which represents the environment. Disturbances in the environment are represented by the movement of the bucket, creating the sloshes. Small sloshes/disturbances do not spill any water; stasis of the current species is dominant. However, as physical disturbances grow in magnitude and size, the sloshes will result in large amounts of water spilling out, representing the extinction and speciation of the organisms present. An example Eldredge uses is the dinosaurs, which were the prevalent life form on Earth for 150 million years, surviving smaller sloshes in the bucket without much evolutionary change. It was not until the asteroid impact at the end of the Mesozoic that the dinosaurs went extinct, and only after a lag of five to seven million years did mammals begin to speciate and diversify. The three patterns in life Stasis Built directly off his previous paper titled 'Punctuated equilibria', the stasis pattern of life represents periods of 'dynamic, non-regular oscillation' of intra-population variation. Stasis does not mean a species' collective genome is stable, but instead that it is still in constant
https://en.wikipedia.org/wiki/Byte%20Sieve
The Byte Sieve is a computer-based implementation of the Sieve of Eratosthenes published by Byte as a programming language performance benchmark. It first appeared in the September 1981 edition of the magazine and was revisited on occasion. Although intended to compare the performance of different languages on the same computers, it quickly became a widely used machine benchmark. The Sieve was one of the more popular benchmarks of the home computer era, others being the Creative Computing Benchmark of 1983 and the Rugg/Feldman benchmarks, mostly seen in the UK in this era. Byte later published the more thorough NBench in 1995 to replace it. History Origins Jim Gilbreath of the Naval Ocean System Center had been considering the concept of writing a small language benchmarking program for some time, desiring one that would be portable across languages, small enough that the program code would fit on a single printed page, and that did not rely on specific features like hardware multiplication or division. The solution was inspired by a meeting with Chuck Forsberg at the January 1980 USENIX meeting in Boulder, CO, where Forsberg mentioned an implementation of the sieve written by Donald Knuth. Gilbreath felt the sieve would be an ideal benchmark as it avoided indirect tests of arithmetic performance, which varied widely between systems. The algorithm mostly stresses array lookup performance and basic logic and branching capabilities, and does not require any advanced language features like recursion or advanced collection types. The only modification from Knuth's original version was to remove a multiplication by two and replace it with an addition instead. Machines with hardware multipliers would otherwise run so much faster that the rest of the performance would be hidden. After six months of effort porting it to as many platforms as he had access to, the first results were introduced in the September 1981 edition of Byte in an article entitled "A High-Level Languag
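The core loop is compact enough to sketch. Below is a minimal Python rendition of the benchmark as described above; the original appeared in BASIC, C, and several other languages, and details such as the 8191-element flag array and the 10-iteration timing loop follow the commonly circulated versions, so treat this as an illustration rather than the published listing.

```python
# Minimal sketch of the classic Byte Sieve benchmark (hedged: modeled on the
# widely circulated versions, not a transcription of the article's listing).
SIZE = 8190

def sieve() -> int:
    flags = [True] * (SIZE + 1)      # 8191 flags, as in the classic versions
    count = 0
    for i in range(SIZE + 1):
        if flags[i]:
            prime = i + i + 3        # addition in place of multiplication by two
            k = i + prime
            while k <= SIZE:
                flags[k] = False     # strike out multiples of this prime
                k += prime
            count += 1
    return count

if __name__ == "__main__":
    for _ in range(10):              # the benchmark timed 10 full passes
        count = sieve()
    print(count, "primes")           # classic versions report 1899 primes
```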
https://en.wikipedia.org/wiki/Thermally%20activated%20delayed%20fluorescence
Thermally activated delayed fluorescence (TADF) is a process through which a molecular species in a non-emitting excited state can incorporate surrounding thermal energy to change states and only then undergo light emission. The TADF process involves an excited molecular species in a triplet state, which commonly has a forbidden transition to the ground state termed phosphorescence. By absorbing nearby thermal energy, the triplet state can undergo reverse intersystem crossing (RISC), converting it to a singlet state, which can then de-excite to the ground state and emit light in a process termed fluorescence. Along with fluorescent and phosphorescent compounds, TADF compounds are one of the three main light-emitting materials used in organic light-emitting diodes (OLEDs). Another type of TADF process has been shown to originate from conformational trapping in a dark state. Thermal energy allows the repopulation of the emissive state, resulting in delayed fluorescence. History The first evidence of thermally activated delayed fluorescence in a fully organic molecule was discovered in 1961 using the compound eosin. The emission that was detected was termed "E-type" delayed fluorescence, and the mechanism was not completely understood. In 1986, the TADF mechanism was further investigated and described in detail using aromatic thiones, but it was not until much later that a practical application was identified. From 2009 to 2012, Adachi and coworkers published a series of papers reporting effective TADF molecular design strategies and competitive external quantum efficiencies (EQE) for green, orange, and blue OLEDs. These publications spiked interest in the topic, and TADF compounds were soon considered a possible higher-efficiency alternative to the traditional fluorescent and phosphorescent compounds used in lighting and displays. TADF materials are considered the third generation of OLED emitters, following fluorescent and phosphorescent based devices. Mechanism The steps
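Since RISC must borrow thermal energy to cross the singlet-triplet gap, its rate is commonly modeled as scaling with a Boltzmann factor in that gap; the sketch below illustrates why a small gap is the central TADF design target. The exponential form and the example gap values are generic textbook assumptions, not figures from this article.

```python
# Hedged illustration: relative thermal activation of reverse intersystem
# crossing, k_RISC proportional to exp(-dE_ST / (kB * T)).
import math

KB_EV_PER_K = 8.617e-5  # Boltzmann constant in eV/K

def risc_activation_factor(gap_ev: float, temp_k: float) -> float:
    """Boltzmann factor for bridging a singlet-triplet gap at temp_k."""
    return math.exp(-gap_ev / (KB_EV_PER_K * temp_k))

# A small gap (a common TADF design target, ~0.05 eV) versus a large one (~0.5 eV)
for gap in (0.05, 0.5):
    print(f"dE_ST = {gap:.2f} eV -> relative factor {risc_activation_factor(gap, 300.0):.2e}")
```

At room temperature the factor differs by roughly seven orders of magnitude between the two gaps, which is why TADF emitters are engineered to keep the singlet and triplet levels nearly degenerate.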
https://en.wikipedia.org/wiki/Code%20Girls
The Code Girls or World War II Code Girls is a nickname for the more than 10,000 women who served as cryptographers (code makers) and cryptanalysts (code breakers) for the United States military during World War II, working in secrecy to break German and Japanese codes. These women were a crucial part of the war effort and broke numerous codes of significant importance to the Allied forces, helping to win and shorten the Second World War. Recruitment In the months prior to the Japanese attack on Pearl Harbor, the United States military began to recruit women to work for its various branches, as the men who previously occupied these positions were deployed overseas to fight in the war. Many of the recruited women were hired to work as cryptographers and cryptanalysts by the United States Navy. These women had to be native to the United States, so as to ensure that they had no ties to foreign countries. The government sought young women entering the STEM fields who excelled in mathematics and world languages—often young college students and teachers with a driving motivation. After the attack, the Navy's recruitment activities and advertisements increased dramatically as the United States joined the Allied forces to fight the Axis powers during World War II. During the recruitment process, the women were asked if they liked crossword puzzles and if they were engaged or wanted to be married. Those who answered 'yes' and 'no', respectively, were moved forward in the hiring process. The military was looking for women who were willing to relocate and had little to no ties to their current lifestyle. Candidates were invited to secret meetings where they were offered the opportunity to take a code-breaking training course and were sworn to secrecy; exposing their work was considered treason and could be punishable by death. Those who passed the course were invited to Washington, D.C. after college graduation to join the Navy as civilian
https://en.wikipedia.org/wiki/%E2%84%93-adic%20sheaf
In algebraic geometry, an ℓ-adic sheaf on a Noetherian scheme X is an inverse system consisting of $\mathbb{Z}/\ell^n$-modules $F_n$ in the étale topology, with transition maps $F_{n+1} \to F_n$ inducing isomorphisms $F_{n+1} \otimes \mathbb{Z}/\ell^n \simeq F_n$. Bhatt–Scholze's pro-étale topology gives an alternative approach. Motivation The development of étale cohomology as a whole was fueled by the desire to produce a 'topological' theory of cohomology for algebraic varieties, i.e. a Weil cohomology theory that works in any characteristic. An essential feature of such a theory is that it admits coefficients in a field of characteristic 0. However, constant étale sheaves with no torsion have no interesting cohomology. For example, if $X$ is a smooth variety over a field $k$, then $H^i(X_{\text{ét}}, \mathbb{Q}_\ell) = 0$ for all positive $i$. On the other hand, the constant sheaves $\mathbb{Z}/\ell^n$ do produce the 'correct' cohomology, as long as $\ell$ is invertible in the ground field $k$. So one takes a prime $\ell$ for which this is true and defines ℓ-adic cohomology as $H^i(X, \mathbb{Z}_\ell) := \varprojlim_n H^i(X, \mathbb{Z}/\ell^n)$. This definition, however, is not completely satisfactory: As in the classical case of topological spaces, one might want to consider cohomology with coefficients in a local system of $\mathbb{Q}_\ell$-vector spaces, and there should be a category equivalence between such local systems and continuous $\mathbb{Q}_\ell$-representations of the étale fundamental group. Another problem with the definition above is that it behaves well only when $k$ is separably closed. In this case, all the groups occurring in the inverse limit are finitely generated and taking the limit is exact. But if $k$ is for example a number field, the cohomology groups will often be infinite and the limit not exact, which causes issues with functoriality. For instance, there is in general no Hochschild–Serre spectral sequence relating $H^i(X_{\bar{k}}, \mathbb{Z}_\ell)$ to the Galois cohomology of $k$. These considerations lead one to consider the category of inverse systems of sheaves as described above. One has then the desired equivalence of categories with representations of the fundamental group (for $\mathbb{Q}_\ell$-local systems, and when $X$ is normal for $\mathbb{Z}_\ell$-systems as well), and the issue in the la
https://en.wikipedia.org/wiki/Xiangfeng%20wu
Xiangfeng wu () were wind-surveying instruments used to gather and measure the direction of the wind in ancient China. History Prior to the invention of the xiangfeng wu, the ancient Chinese used pieces of silk or cloth hung on a pole to measure wind direction. Epigraphic evidence of the weather crow was found on a wall painting in a tomb dating to the Eastern Han dynasty, discovered in 1972. The Sanfu huangtu (三輔黃圖, Description of the Three Districts in the Capital), a 3rd-century book written by Miao Changyan about the palaces at Chang'an, describes a copper bird-shaped wind vane situated on a tower roof for the measurement of wind direction. A xiangfeng wu consisted of copper slices fixed on the top of a pole, which revolved according to the direction in which the wind was blowing. Xiangfeng wu were first used in meteorological observatories and were later installed in government towers and private houses. See also: Weather vane
https://en.wikipedia.org/wiki/Sixth%20Tone
Sixth Tone is a state-owned English-language online magazine published by Shanghai United Media Group. Name Sixth Tone's name relates to the number of tones in Mandarin Chinese, but is also said to carry a more metaphorical meaning. Mandarin Chinese has four active tones and a fifth dropped tone that has less prominence than the other four. Because of the language's five tones, the publication's name refers to an ideal of expanding beyond traditionally reported items in Anglophone media, making it the "sixth tone". History The online magazine began publication on April 6, 2016, with an investment of US$4.5 million from the Shanghai United Media Group. It is a sister publication of The Paper. Wei Xing was its first editor-in-chief until May 30, 2016, when he left to create a start-up company and therefore no longer worked for the paper. Succeeding Wei, Zhang Jun became the new editor-in-chief that year. By 2018, Western media had begun to cite Sixth Tone in news reports. Vincent Ni, in an essay published in Westminster Papers in Communication and Culture, stated that, "For foreign journalists, it has also shown a diverse and authentic side of China that rarely received much attention elsewhere" and that the publication "has proved far more effective than the hundreds of millions of dollars invested in English-language news programs by the state broadcaster Xinhua, CCTV, and CRI." Reception Bethany Allen-Ebrahimian, writing for Foreign Policy, stated that Sixth Tone has a less staid and "saccharine" tone compared to many other English-language publications from China. She stated, "If webby U.S. media startup Vox were acquired by the Chinese Communist Party, it might resemble Sixth Tone". In a 2016 interview with The New York Times, the then-editor-in-chief, Wei Xing, sought to differentiate his magazine from other Chinese English-language publications. Wei stated that compared to other government-owned news publications, Sixth Tone would have an easier time
https://en.wikipedia.org/wiki/Random%20cluster%20model
In statistical mechanics, probability theory, graph theory, etc. the random cluster model is a random graph that generalizes and unifies the Ising model, Potts model, and percolation model. It is used to study random combinatorial structures, electrical networks, etc. It is also referred to as the RC model or sometimes the FK representation after its founders Cees Fortuin and Piet Kasteleyn. Definition Let $G = (V, E)$ be a graph, and $\omega : E \to \{0, 1\}$ be a bond configuration on the graph that maps each edge to a value of either 0 or 1. We say that a bond is closed on edge $e$ if $\omega(e) = 0$, and open if $\omega(e) = 1$. If we let $A(\omega) = \{e \in E : \omega(e) = 1\}$ be the set of open bonds, then an open cluster is any connected component of the graph $(V, A(\omega))$. Note that an open cluster can be a single vertex (if that vertex is not incident to any open bonds). Suppose an edge is open independently with probability $p$ and closed otherwise; then this is just the standard Bernoulli percolation process. The probability measure of a configuration $\omega$ is given as $\mu(\omega) = \prod_{e \in E} p^{\omega(e)} (1 - p)^{1 - \omega(e)}$. The RC model is a generalization of percolation, where each cluster is weighted by a factor of $q$. Given a configuration $\omega$, we let $C(\omega)$ be the number of open clusters, or alternatively the number of connected components formed by the open bonds. Then for any $q > 0$, the probability measure of a configuration is given as $\mu(\omega) = \frac{1}{Z} q^{C(\omega)} \prod_{e \in E} p^{\omega(e)} (1 - p)^{1 - \omega(e)}$. $Z$ is the partition function, or the sum over the unnormalized weights of all configurations, $Z = \sum_{\omega} q^{C(\omega)} \prod_{e \in E} p^{\omega(e)} (1 - p)^{1 - \omega(e)}$. The partition function of the RC model is a specialization of the Tutte polynomial, which itself is a specialization of the multivariate Tutte polynomial. Special values of q The parameter $q$ of the random cluster model can take arbitrary complex values. This includes the following special cases: $q \to 0$: linear resistance networks. $q < 1$: negatively-correlated percolation. $q = 1$: Bernoulli percolation, with $Z = 1$. $q = 2$: the Ising model. $q \in \{2, 3, 4, \ldots\}$: $q$-state Potts model. Edwards-Sokal representation The Edwards-Sokal (ES) representation of the Potts model is named after Robert G. Edwards and Alan D. Sokal. It provides
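To make the weights concrete, here is a small sketch (mine, not from the article) that evaluates the unnormalized weight $q^{C(\omega)} \prod_e p^{\omega(e)}(1-p)^{1-\omega(e)}$ of a single bond configuration, counting open clusters with a union-find over the vertices:

```python
# Hedged sketch: unnormalized random-cluster weight of one configuration.
def rc_weight(n_vertices, edges, omega, p, q):
    """edges: list of (u, v) pairs; omega maps each edge to 0 (closed) or 1 (open)."""
    parent = list(range(n_vertices))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    weight = 1.0
    for (u, v) in edges:
        if omega[(u, v)] == 1:
            weight *= p
            parent[find(u)] = find(v)      # an open bond merges two clusters
        else:
            weight *= 1.0 - p
    # C(omega): isolated vertices count as clusters of their own
    clusters = len({find(x) for x in range(n_vertices)})
    return (q ** clusters) * weight

# Path graph on 3 vertices with one open bond: clusters {0,1} and {2}
print(rc_weight(3, [(0, 1), (1, 2)], {(0, 1): 1, (1, 2): 0}, p=0.5, q=2.0))  # 1.0
```

Summing this weight over all $2^{|E|}$ configurations gives the partition function $Z$; setting $q = 1$ makes the cluster term disappear and recovers Bernoulli percolation.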
https://en.wikipedia.org/wiki/Z80-RIO
The Z80 Operating System with Relocatable Modules and I/O Management (Z80-RIO) is a general-purpose operating system developed by Zilog in the late 1970s for various computer systems including the Z80 Micro Computer System (MCZ-1) series and the Z80 Development System (ZDS). The MCZ systems were primarily used for software development and automation solutions. RIO was designed to facilitate the development and integration of users' programs into a production environment. Features The system provides a modest environment with a minimum of system support as well as an enhanced environment. The modest environment provides a program debugger with file manipulation capability, a floppy disk driver (supporting up to eight disk drives), and a basic console driver with provision for paper tape operation. The enhanced environment provides access to the RIO Executive and to system support utilities such as the Zilog Floppy Disk File System (ZDOS) and the Zilog Hard Disk File System (DFS). It also provides access to a number of disk-resident software tools such as a text editor, macro assembler, and linker. Commands The following commands are supported by Z80-RIO: ACTIVATE, ALLOCATE, ASM, BRIEF, CAT, CLOSE, COMPARE, COPY, COPY.DISK, COPYSD, DATE, DEACTIVATE, DEALLOCATE, DEBUG, DEFINE, DELETE, DISK.FORMAT, DISK.REPAIR, DISK.STATUS, DISPLAY, DO, DUMP, ECHO, EDIT, ERROR, ERRORS, EXTRACT, FORCE, FORMAT, HELP, IMAGE, INITIALIZE, LADT, LINK, MASTER, MEMORY, MOVE, PAUSE, RELEASE, RENAME, RESTORE_TABS, SAVE_TABS, SET, STATUS, VERBOSE, XEQ. Clones UDOS, a Z80-RIO compatible clone by VEB Robotron, was available for a number of computers by the same company, such as the A 5120 or the PC 1715, which were based on the U880 processor (itself a clone of Zilog's Z80). UDOS was also one of the operating systems available for the P8000, a microcomputer system developed in 1987 by the VEB Elektro-Apparate-Werke Berlin-Treptow „Friedrich Ebert" (EAW) in the German Democratic Republic (D
https://en.wikipedia.org/wiki/NYC%20Mesh
NYC Mesh is a physical network of interconnected routers and a group of enthusiasts working to support the expansion of the project as a freely accessible, open, wireless community network. NYC Mesh is not an internet service provider (ISP), although it does connect to the internet and offer internet access as a service to members. The network includes over 1,000 active member nodes throughout the five boroughs of New York City, with concentrations of users in lower Manhattan and Brooklyn. Aim The goal of NYC Mesh is to build a large scale, decentralized digital network, owned by those who run it, that will eventually cover all of New York City and neighboring urban areas. Participation in the project is governed by its Network Commons License. This agreement, partially modeled on a similar license in use by Guifi.net, lists four key tenets: Participants are free to use the network for any purpose that does not limit the freedom of others to do the same, Participants are free to know how the network and its components function, Participants are free to offer and accept services on the network on their own terms, and By joining the free network, participants agree to extend the network to others under the same conditions. Other similar projects include Freifunk in Germany, Ninux in Italy, Sarantaporo.gr in Greece, the People's Open Network in Oakland, CA, and Red Hook Wi-Fi in Brooklyn, NY. Technology Like many other free community-driven networks, NYC Mesh uses mesh technology to facilitate robustness and resiliency. NYC Mesh previously used BGP for routing within the network, though this was found to be too static so the network was changed to use OSPF routing instead. The network relies on a variety of wireless links to connect individual nodes and larger sections of the network together. Most nodes use both a long range directional antenna for up-link to a hub along with a shorter range omni-directional antenna that provides connections to other nea
https://en.wikipedia.org/wiki/Epik
Epik is an American domain registrar and web hosting company known for providing services to alt-tech websites that host far-right, neo-Nazi, and other extremist materials. It has been described as a "safehaven for the extreme right" because of its willingness to provide services to far-right websites that have been denied service by other Internet service providers. Some of Epik's notable clients have included social network Gab and the imageboard website 8chan. In 2021, the Parler social network moved its domain registration to Epik when it was denied hosting and other web services after it was used to help plan the 2021 storming of the U.S. Capitol. Epik has also provided hosting and registrar services to Patriots.win, formerly TheDonald.win, an independent far-right forum that has served as the successor for the r/The_Donald subreddit that was banned in June 2020. Epik was founded in 2009 by Rob Monster, and is based in Washington State. In September and October 2021, hackers identifying themselves as a part of Anonymous released several caches of data obtained from Epik in a large-scale data breach. History Epik was founded in 2009 by Rob Monster, who served as the company's chief executive officer until 2022. Until 2018, Epik primarily focused on domain trading and mostly stayed out of the public spotlight. In 2018, the company came to public attention when they decided to provide services to Gab. Epik is primarily known for its domain name registration services, and describes itself as the "Swiss bank of the domain industry". In the late 2010s, following a series of acquisitions, Epik also began providing an increasing variety of other web services including web hosting, content delivery network (CDN) services, and DDoS protection. Acquisitions In February 2019, it was announced that Epik had acquired BitMitigate, an American cybersecurity company based in Vancouver, Washington. BitMitigate protects websites against potential threats including distribut
https://en.wikipedia.org/wiki/Surface%20growth
In mathematics and physics, surface growth refers to models used in the dynamical study of the growth of a surface, usually by means of a stochastic differential equation of a field. Examples Popular growth models include: the KPZ equation, the dimer model, the Eden growth model, the SOS model, the self-avoiding walk, the Abelian sandpile model, and the Kuramoto–Sivashinsky equation (or the flame equation, for studying the surface of a flame front). They are studied for their fractal properties, scaling behavior, critical exponents, universality classes, and relations to chaos theory, dynamical systems, and non-equilibrium / disordered / complex systems. Popular tools include statistical mechanics, the renormalization group, rough path theory, etc. Kinetic Monte Carlo surface growth model Kinetic Monte Carlo (KMC) is a form of computer simulation in which atoms and molecules are allowed to interact at given rates that can be controlled based on known physics. This simulation method is typically used in the micro-electronics industry to study crystal surface growth, and it can provide accurate models of surface morphology under different growth conditions on time scales typically ranging from microseconds to hours. Experimental methods such as scanning electron microscopy (SEM), X-ray diffraction, and transmission electron microscopy (TEM), and other computer simulation methods such as molecular dynamics (MD) and Monte Carlo simulation (MC), are also widely used. How KMC surface growth works 1. Adsorption process First, the model tries to predict where an atom would land on a surface and its rate at particular environmental conditions, such as temperature and vapor pressure. In order to land on a surface, atoms have to overcome the so-called activation energy barrier. The frequency of passing through the activation barrier can be calculated by the Arrhenius equation: $f = A \exp\left(-E_a / (kT)\right)$, where $A$ is the thermal frequency of molecular vibration, $E_a$ is the activation energy, $k$ is the Boltzmann constant, and $T$ is the temperature. 2. Desorption process When atoms land on a surface, there are
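As a concrete illustration of the rate computation in step 1, the sketch below evaluates an Arrhenius-type event rate; the attempt frequency and barrier height are illustrative placeholder values, not parameters from the article.

```python
# Hedged sketch: Arrhenius rate f = A * exp(-E_a / (k * T)) as used to
# parameterize event rates in kinetic Monte Carlo surface growth models.
import math

KB_EV_PER_K = 8.617e-5  # Boltzmann constant, eV/K

def arrhenius_rate(attempt_freq_hz: float, barrier_ev: float, temp_k: float) -> float:
    """Events per second for crossing an activation barrier at temp_k."""
    return attempt_freq_hz * math.exp(-barrier_ev / (KB_EV_PER_K * temp_k))

# e.g. a 1e13 Hz vibrational attempt frequency and a 0.8 eV barrier
for temp in (300.0, 600.0, 900.0):
    print(f"T = {temp:4.0f} K -> rate {arrhenius_rate(1e13, 0.8, temp):.3e} /s")
```

In a full KMC simulation, each possible event (adsorption, desorption, a diffusion hop) gets such a rate, one event is drawn with probability proportional to its rate, and the simulation clock advances by an exponentially distributed waiting time.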
https://en.wikipedia.org/wiki/ASME%20Y14.5
ASME Y14.5 is a standard published by the American Society of Mechanical Engineers (ASME) to establish rules, symbols, definitions, requirements, defaults, and recommended practices for stating and interpreting Geometric Dimensioning and Tolerancing (GD&T). ASME/ANSI issued the first version of this Y-series standard in 1973. Overview ASME Y14.5 is a complete definition of Geometric Dimensioning and Tolerancing. It contains 15 sections which cover symbols and datums as well as tolerances of form, orientation, position, profile and runout. It is complemented by ASME Y14.5.1 - Mathematical Definition of Dimensioning and Tolerancing Principles. Together these standards allow for clear and concise detailing of dimensional requirements on a product drawing or electronic drawing package as well as the verification of the requirements on manufactured parts. Effective application of GD&T allows for parts to be verified by dimensional measurements, gauging, or by CMM. History The modern standard can trace its roots to the military standard MIL-STD-8 published in 1949. It was revised by MIL-STD-8A in 1953, which introduced the concept of modern GD&T "Rule 1". Further revisions have continued to add new concepts and address new technology like Computer Aided Design and Model-based definition. A list of revisions follows:
ASME Y14.5-2018, "Dimensioning and Tolerancing": current standard; preceded by ASME Y14.5-2009.
ASME Y14.5.2-2017, "Certification of Geometric Dimensioning and Tolerancing Professionals": current standard; preceded by ASME Y14.5.2-2000.
ASME Y14.5-2009: succeeded by ASME Y14.5-2018; preceded by ASME Y14.5M-1994.
ASME Y14.5M-1994: succeeded by ASME Y14.5-2009; reaffirmed in 2004; preceded by ANSI Y14.5M-1982.
ANSI Y14.5M-1982: preceded by ANSI Y14.5-1973; reaffirmed in 1988.
ANSI Y14.5-1973: succeeded by ANSI Y14.5M-1982; preceded by USASI Y14.5-1966.
USASI Y14.5-1966: succeeded by ANSI Y14.5-1973; preceded by ASA Y14.5-1957.
ASA Y14.5-1957: succeeded
https://en.wikipedia.org/wiki/Freezing%20point%20depression%20osmometer
The freezing point depression osmometer is an osmometer that determines a solution's osmotic concentration from the extent to which its osmotically active constituents depress its freezing point. Freezing point osmometry has long been used to assess the osmotic strength of colloids and solutions. It is also used to determine the level of osmotically active solutes dissolved in the blood, using the relationship by which a mole of dissolved particles per kilogram of water reduces the freezing point by about 1.86 °C. The freezing point depression osmometer is also used in various medical settings, including pharmaceutical manufacturing, quality control laboratories, and clinical chemistry. Method Freezing point depression osmometers are utilized to determine a solution's osmotic strength, and this is the approach most frequently used for a variety of medical tasks. The instrument operates by using the solution's freezing point to determine its concentration. It uses a nanoliter osmometer, a device that facilitates the establishment of the solution's melting and freezing points. Calibration, loading, deep freezing, and determination are the four separate procedures involved in determining the freezing and melting points. The concentration of the solution can be determined from the number of particles present in it, which in turn can be found by determining the freezing point of the solution. When particles are dissolved in a solution, its freezing point is lowered compared to that of the pure solvent, and a further increase in solute decreases the freezing point even further. The freezing point depression osmometer is calibrated using standards that ar
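The underlying relationship is linear, which makes the conversion from a measured freezing point to an osmotic concentration straightforward. A minimal sketch, assuming pure water as the solvent and the 1.86 °C·kg/mol cryoscopic constant quoted above:

```python
# Hedged sketch: osmolality from freezing point depression, dT = Kf * osmolality.
KF_WATER = 1.86  # cryoscopic constant of water, degC per (osmol/kg)

def osmolality_from_freezing_point(freezing_point_c: float) -> float:
    """Osmolality (osmol/kg) inferred from a measured freezing point in degC."""
    depression = 0.0 - freezing_point_c  # pure water freezes at 0 degC
    return depression / KF_WATER

# Example: normal human serum freezes near -0.54 degC
print(f"{osmolality_from_freezing_point(-0.54):.3f} osmol/kg")  # ~0.290, i.e. ~290 mOsm/kg
```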
https://en.wikipedia.org/wiki/NIMPLY%20gate
The NIMPLY gate is a digital logic gate that implements material nonimplication. Symbols A right-facing arrow with a line through it (↛) can be used to denote NIMPLY in algebraic expressions. Logically, it is equivalent to material nonimplication, i.e. the logical expression A ∧ ¬B. Usage The NIMPLY gate is often used in synthetic biology and genetic circuits. See also: IMPLY gate, AND gate, NOT gate, NAND gate, NOR gate, XOR gate, XNOR gate, Boolean algebra (logic), Logic gates
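As a quick illustration of the gate's behavior, the full truth table of A ∧ ¬B can be generated directly (a sketch; the function name is hypothetical):

```python
# Hedged sketch: NIMPLY (material nonimplication) is A AND (NOT B).
def nimply(a: bool, b: bool) -> bool:
    return a and not b

for a in (False, True):
    for b in (False, True):
        print(f"A={int(a)} B={int(b)} -> A NIMPLY B = {int(nimply(a, b))}")
# Only A=1, B=0 yields 1, matching the expression A AND NOT B.
```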
https://en.wikipedia.org/wiki/Transaction%20Protocol%20Data%20Unit
The Transaction Protocol Data Unit (TPDU) is a packet-based protocol originally designed for financial transaction processing over an X.25 network.
https://en.wikipedia.org/wiki/Nikiel%27s%20conjecture
In mathematics, Nikiel's conjecture in general topology was a conjectural characterization of the continuous images of compact linearly ordered spaces. The conjecture was first formulated by Jacek Nikiel in 1986 and was proven by Mary Ellen Rudin in 1999. The conjecture states that a compact topological space is the continuous image of a compact linearly ordered space if and only if it is a monotonically normal space.
https://en.wikipedia.org/wiki/Songfacts
Songfacts is a music-oriented website that has articles about songs, detailing the meaning behind the lyrics, how and when they were recorded, and other background information. The journalists who work for the site have interviewed thousands of artists and songwriters to get the facts behind the songs, including Peter Murphy, Gene Simmons, Mick Jones, Ian Anderson, Brad Arnold (3 Doors Down), Billy Steinberg, Matt Thiessen, Tomas Haake, Jo Dee Messina, Marc Roberge, Bill Withers, Janis Ian and Emily Saliers. The site was started by WHCN DJ Carl Wiser in Hartford, Connecticut, in August 1999. Wiser originally created the collection as a database to prepare for his radio programs but then posted it online. It was initially used mainly by DJs, but in 2002 it was chosen as a "Yahoo! Pick". The August 2004 issue of Men's Journal listed Songfacts as one of the "100 Best Websites for Guys". USA Weekend has praised it as "a virtual Behind the Music".
https://en.wikipedia.org/wiki/Register%20of%20Antarctic%20Marine%20Species
The Register of Antarctic Marine Species, also known as RAMS, is a taxonomic database that provides a list of marine species found in the Southern Ocean surrounding Antarctica. Its purpose is to provide authoritative and comprehensive information on the diversity of marine life in the region, which provides a reference point for marine science, research, conservation and sustainable management. The database includes marine species found on the sea floor, in the water column, and around sea-ice. RAMS is a regionally-focused database within the World Register of Marine Species.
https://en.wikipedia.org/wiki/Feedback%20terminal
A feedback terminal is a physical device that is commonly used to collect large amounts of anonymous real-time feedback from customers, employees, travelers, visitors or students. Typically, feedback terminals feature buttons that users can press to indicate how satisfied they are with the service provided. This information can then be used by organisations to analyze where the experience is optimal and where it can be improved. Applications Feedback terminals are utilized to measure and improve customer and employee experience across a broad range of industries, including retail, healthcare, the hospitality industry, airports, and educational institutions. Feedback terminals also allow for the collection of real-time feedback. For example, by collecting real-time feedback in a public restroom, a facilities manager can be alerted if the measured customer experience has dropped below a certain threshold and can then immediately send out cleaners. Feedback terminals are also often used to measure Net Promoter Score (NPS) on-site, a metric which can be used to gauge the loyalty of a company’s customer relationships. Using a five-button feedback terminal, the Net Promoter Score can be calculated based on responses to a question asking about a customer's experience, as in the sketch below. Doing so can allow companies to more efficiently categorize customers based on their expected behavior. Benefits The main benefit of such feedback collection is that organisations can collect a larger amount of data when compared to traditional surveys, potentially registering thousands of impressions in a day from a wide range of customers. Measuring people’s experience along the customer journey helps businesses and other organisations understand how people feel about their experiences at various touchpoints. The organisations can use the feedback data to make improvements that meet the expectations of the majority. Real-time feedb
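A minimal sketch of that calculation follows; the mapping of the five buttons onto promoter, passive, and detractor groups is an assumption for illustration, since vendors bucket responses differently:

```python
# Hedged sketch: Net Promoter Score from five-button terminal presses.
# Assumed mapping: buttons 1-3 = detractor, 4 = passive, 5 = promoter.
from collections import Counter

def net_promoter_score(presses: list[int]) -> float:
    """NPS = %promoters - %detractors, ranging from -100 to +100."""
    counts = Counter(presses)
    total = sum(counts.values())
    promoters = counts[5]
    detractors = counts[1] + counts[2] + counts[3]
    return 100.0 * (promoters - detractors) / total

print(net_promoter_score([5, 5, 4, 3, 5, 1, 4, 5]))  # (4 - 2) / 8 -> 25.0
```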
https://en.wikipedia.org/wiki/Tradebe
Tradebe is a waste management company based in Barcelona that was established in 1980. It operates in Spain, France, the United Kingdom, the United States and Oman. The chairman is Josep Creixell, and the chief executive is Victor Creixell. Tradebe is a significant player in the solvent recycling and automated oil tank cleaning markets. Prosecutions The company has been prosecuted in the UK. In 2016 it was fined £38,960 after a chemical leak at its Hendon Dock plant in Sunderland. It was prosecuted in 2013 after a spillage of highly flammable liquid at a site in Knottingley. The United States Environmental Protection Agency fined it after environmental violations at the firm's hazardous waste treatment facilities in Connecticut. Its subsidiary Norlite had to pay around £15,000 for air pollution violations in Cohoes, New York, in 2016. Acquisitions
1984: Acquisition of Fragsa, which since 1975 had been working in the management and recycling of out-of-service vehicles using processes for the fragmentation and separation of metals.
1989: Ecoimsa obtains authorization for the collection of hydrocarbon waste from ships (Marpol).
1990: Acquisition of Fragnor, a fragmentation plant located in Amorebieta (Vizcaya).
1992: Acquisition of Lunagua with the goal of setting up an industrial waste treatment plant.
1997: Creation of Intraval to develop new lines of business in consultancy, engineering and treatment of urban solid waste.
1998: Construction and development of a composting plant in Jorba (Barcelona).
2002: Acquisition of Ecología Química, located in Gualba (Barcelona), specializing in the recycling of solvents from the chemical and pharmaceutical industries. Establishment of Recovery & Recycling Solutions, located in Houston, the “center” of the world’s oil industry.
2003: Purchase of Willacy Oil Services, founded in 1989, providing an extensive range of equipment and services for the treatment of refinery sludge.
2003: Purchase of Tecnoambiente, a pion
https://en.wikipedia.org/wiki/Dawson%20Engler
Dawson R. Engler is an American computer scientist and an associate professor of computer science and electrical engineering at Stanford University. Career After graduating from Arizona State University, Engler earned his Ph.D. from the Massachusetts Institute of Technology in 1998 while working with Frans Kaashoek in the MIT CSAIL Parallel and Distributed Operating Systems Group. The focus of his graduate studies was the exokernel. Engler is currently an associate professor of computer science and electrical engineering at Stanford University. In 2002, he co-founded Coverity with several of his students to commercialize his group's work on static code analysis for bug-finding technology. Awards and honors Engler and his co-authors received the Best Paper award at USENIX's OSDI conferences in 2000, 2004, and 2008. With his students Cristian Cadar and Daniel Dunbar, he was jointly awarded the 2018 SIGOPS Hall of Fame Award for their paper at the 2008 conference. Engler won the 2006 SIGOPS Mark Weiser Award for his work in operating systems research. In 2008, he received the Grace Murray Hopper Award for "ground-breaking work on automated program checking and bug-finding".
https://en.wikipedia.org/wiki/IOS%2013
iOS 13 is the thirteenth major release of the iOS mobile operating system developed by Apple Inc. for the iPhone, iPod Touch and HomePod. The successor to iOS 12, it was announced at the company's Worldwide Developers Conference (WWDC) on June 3, 2019, and released on September 19, 2019. It was succeeded by iOS 14, released on September 16, 2020. As of iOS 13, the iPad lines run a separate operating system, derived from iOS, named iPadOS. Both iPadOS 13 and iOS 13 dropped support for devices with less than 2 GB of RAM. Overview iOS 13 and iPadOS 13 were introduced by Senior Vice President of Software Engineering Craig Federighi at the WWDC keynote address on June 3, 2019. The first beta was made available to registered developers after the keynote. The second beta was released to registered developers on June 18, 2019, and the first public beta was released on June 24, 2019. The initial release of iOS 13 was version 13.0, which was released to the public on September 19, 2019. System features Privacy iOS 13 changes the handling of location data. When an app requests access to location, the user chooses whether to grant access whenever they are using the app, never, or only once. The user will receive similar prompts for background location access, and when an app requests access to Bluetooth or Wi-Fi (which may also be used for non-consensual location tracking). In August 2019, it was reported that beginning in April 2020, the PushKit API for VoIP would be restricted to internet telephone usage, closing a "loophole" that had been used by other apps for background data collection. User interface A system-wide dark mode allows users to enable a light-on-dark color scheme for the entire iOS and iPadOS user interface, all native applications, and supported third-party apps. It can be manually turned on or set to automatically switch between light and dark modes based on the time of day. The volume indicator was redesigned, replacing the larger, centered ove
https://en.wikipedia.org/wiki/Soil%20Biology%20and%20Biochemistry
Soil Biology and Biochemistry is a monthly peer-reviewed scientific journal that was established in 1969 and is currently published by Elsevier. Its founding editor-in-chief was John Saville Waid. It publishes original research papers that describe and explain biological processes occurring in soil. Since 2020, the editors-in-chief have been Karl Ritz (University of Nottingham) and Josh Schimel (University of California, Santa Barbara). According to the Journal Citation Reports, the journal has a 2020 impact factor of 7.609.
https://en.wikipedia.org/wiki/Reuschle%27s%20theorem
In elementary geometry, Reuschle's theorem describes a property of the cevians of a triangle intersecting in a common point and is named after the German mathematician Karl Gustav Reuschle (1812–1875). It is also known as Terquem's theorem after the French mathematician Olry Terquem (1782–1862), who published it in 1842. In a triangle $ABC$ with its three cevians intersecting in a common point other than the vertices $A$, $B$ or $C$, let $A'$, $B'$ and $C'$ denote the intersections of the (extended) triangle sides and the cevians. The circle defined by the three points $A'$, $B'$ and $C'$ intersects the (extended) triangle sides in the (additional) points $A''$, $B''$ and $C''$. Reuschle's theorem now states that the three new cevians $AA''$, $BB''$ and $CC''$ intersect in a common point as well.
https://en.wikipedia.org/wiki/Tropical%20desert
Tropical deserts are located in regions between 15 and 30 degrees latitude. The environment is very extreme, and they have the highest average monthly temperatures on Earth. Rainfall is sporadic; in some years no precipitation is observed at all. In addition to these extreme environmental and climate conditions, most tropical deserts are covered with sand and rocks, and are thus too flat and lacking in vegetation to block out the wind. Wind may erode and transport sand, rocks and other materials; these are known as eolian processes. Landforms caused by wind erosion vary greatly in characteristics and size. Representative landforms include depressions and pans, yardangs, inverted topography and ventifacts. No significant populations can survive in tropical deserts due to extreme aridity, heat and the paucity of vegetation; only specific flora and fauna with special behavioral and physical mechanisms are supported. Although tropical deserts are considered to be harsh and barren, they are in fact important sources of natural resources and play a significant role in economic development. Besides the equatorial deserts, there are many hot deserts situated in the tropical zone. Distribution Geographical distribution Tropical deserts are located in both continental interiors and coastal areas between the Tropic of Cancer and Tropic of Capricorn. Representative deserts include the Sahara Desert in North Africa, the Australian Desert, the Arabian Desert and Syrian Desert in Western Asia, the Kalahari Desert in Southern Africa, the Sonoran Desert in the United States and Mexico, the Mojave Desert in the United States, the Thar Desert in India and Pakistan, the Dasht-e Margo and Registan Desert in Afghanistan, and the Dasht-e Kavir and Dasht-e Lut in Iran. Controlling factor The tropics form a belt around the equator from latitude 3 degrees north to latitude 3 degrees south, which is called the Intertropical Convergence Zone. Tropical heat generates unstable air in this area, and air masses becom
https://en.wikipedia.org/wiki/Coded%20exposure%20photography
Coded exposure photography, also known as a flutter shutter, is the name given to any mathematical algorithm that reduces the effects of motion blur in photography. The key element of the coded exposure process is the mathematical formula that affects the shutter frequency. This involves calculating the relationship between the photon exposure of the light sensor and the randomized code. The camera is made to take a series of snapshots with random time intervals using a simple computer; this creates a blurred image that can be reconstructed into a clear image using the algorithm. Motion de-blurring technology grew due to increasing demand for clearer images in sporting events and other digital media. The relative inexpensiveness of coded exposure technology makes it a viable alternative to expensive cameras and equipment built to take millions of images per second. History Photography was developed to enable imaging of the visible world. Early cameras utilized film made of plastic coated with compounds of silver. The film is highly sensitive to light. When photons (light) hit the film, a reaction occurs which semi-permanently stores the data on its surface. This film is then developed by exposing it to several chemicals to create the image. The film is highly sensitive and the process is complicated; it must be stored away from light to prevent spoilage. Digital cameras utilize digital technologies to create images. This process involves exposing light-sensitive material to photons, creating electrical signals that are recorded in computer files. This process is simple and has improved the availability of photography. One problem that digital cameras have faced is motion blur. Motion blur occurs when the camera or the subject is in motion. When motion blur happens, the resulting image is blurry, with fuzzy edges and indistinct features. One solution to remove motion blur in photography is to increase the shutter speed of the camera. Unlike the coded
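The advantage of fluttering the shutter instead of leaving it open can be seen in the frequency domain: a plain box exposure has exact zeros in its spectrum, which permanently destroy some image frequencies, while a well-chosen binary code keeps the spectrum bounded away from zero so the blur remains invertible. The sketch below demonstrates this with a random binary code; the code length and seed are illustrative assumptions, not the published flutter sequence.

```python
# Hedged sketch: why a coded exposure is easier to deblur than a box exposure.
import numpy as np

n = 32
rng = np.random.default_rng(7)
coded = rng.integers(0, 2, n).astype(float)  # shutter open (1) / closed (0)
box = np.ones(n)                             # conventional continuous exposure

for name, kernel in (("box", box), ("coded", coded)):
    spectrum = np.abs(np.fft.rfft(kernel / kernel.sum()))
    print(f"{name:5s} kernel: min |DFT| = {spectrum.min():.4f}")
# The box kernel's spectrum contains exact zeros (lost frequencies); the coded
# kernel's minimum stays above zero, so deconvolution can recover the image.
```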
https://en.wikipedia.org/wiki/Write%20Once%20Read%20Forever
Write Once Read Forever (WORF) is a data storage method which allows users to write data once and stores that data without it ever needing to be refreshed. This differs from common digital storage techniques, such as drives that need to be rewritten often to prevent loss or corruption of data. WORF was tested on the International Space Station in 2019. Method WORF uses a novel, patented high-density data storage mechanism based on silver halide, which after substantial testing has been determined to last for more than a century under conventional ambient environmental conditions. WORF digital data is stored as microscopic, metallic, interference-created standing waves (representing narrowband "colors") embedded in a modern, super-resolution, dye-free, photosensitive emulsion. Wavelengths encode multiple superimposed states, allowing complex data permutations to be stored per data region. These permutations enable extremely high data density on WORF media. The multi-state data architecture within each domain also enhances data integrity and error-checking, and accelerates parallel writing and reading for the entire media module. Once data is written to WORF, energy is needed only for reading—no periodic refresh is necessary, and data is both immutable and truly permanent. Human-readable text and images are embedded in the WORF module adjacent to the digital data. This text and imagery contain meta-information about the media's content and instructions for decoding for future generations. NASA Experiment The WORF payload was delivered and docked to the International Space Station (ISS) via SpaceX CRS-17 on May 9, 2019. NASA's ISS test will determine if WORF media can survive a hostile space environment during long-term space missions, such as Lunar and Mars missions and beyond. The WORF media payload will stay on the ISS for up to one year. Due to a previous NASA mission already named WORF, NASA renamed the experiment HELIOS (Hardened Extremely Long Life Informat
https://en.wikipedia.org/wiki/Vectorette%20PCR
Vectorette PCR is a variation of polymerase chain reaction (PCR) designed in 1988. The original PCR was created, and also patented, during the 1980s. Vectorette PCR was first described in an article in 1990 by John H. Riley and his team. Since then, multiple variants of PCR have been created. Vectorette PCR focuses on amplifying a specific sequence extending from a known internal sequence to the unknown end of the fragment. Multiple research groups have taken this method as an opportunity to conduct experiments in order to uncover the potential uses that can be derived from Vectorette PCR. Introduction Vectorette PCR is similar to PCR, with the difference being that it is capable of amplifying a desired sequence from a single already-known primer site. While PCR needs information about known sequences at both ends, Vectorette PCR only requires previous knowledge of one. This means that it is able to apply the method of PCR, which needs sequence information from both ends, to fragments of DNA for which the sequence is known at only one end and not the other. In order to achieve this, there are specific steps that this method must first go through. These steps have been researched for the purpose of discovering the scientific uses of Vectorette PCR and how they can be applied. Steps Vectorette PCR provides a strategy for PCR amplification that is unidirectional. Vectorette PCR comprises three main steps. The first step is utilizing a restriction enzyme to digest the sample DNA. The DNA to be investigated has to be capable of being digested by restriction enzymes appropriate for that gene; otherwise the DNA fragments that form the general population cannot be created. After that is completed, a Vectorette library is assembled by ligating the Vectorette units to the appropriate DNA fragments which were previously digested. Ligation is
https://en.wikipedia.org/wiki/Philosophy%20of%20ecology
Philosophy of ecology is a concept under the philosophy of science, which is a subfield of philosophy. Its main concerns centre on the practice and application of ecology, its moral issues, and the intersectionality between the position of humans and other entities. The topic also overlaps with metaphysics, ontology, and epistemology, as it attempts to answer metaphysical, epistemic and moral issues surrounding environmental ethics and public policy. The aim of the philosophy of ecology is to clarify and critique 'first principles', the fundamental assumptions present in science and the natural sciences. Although there has yet to be a consensus about what constitutes the philosophy of ecology, and the definition of ecology is up for debate, there are some central issues that philosophers of ecology consider when examining the role and purpose of what ecologists practice. For example, this field considers the 'nature of nature', the methodological and conceptual issues surrounding ecological research, and the problems associated with these studies within their contextual environment. Philosophy addresses the questions that make up ecological studies, and presents a different perspective on the history of ecology, environmental ethics in ecological science, and the application of mathematical models. Background History Ecology is considered a relatively new scientific discipline, having been acknowledged as a formal scientific field in the late nineteenth and early twentieth century. Although an established definition of ecology has yet to be presented, there are some commonalities in the questions proposed by ecologists. Ecology was considered "the science of the economy [and] habits," according to Stauffer, and was concerned with understanding the external interrelations between organisms. It was recognised formally as a field of science in 1866 by German zoologist Ernst Haeckel (1834-1919). Haeckel coined the term 'ecology' in his book, Ge
https://en.wikipedia.org/wiki/Minoan%20Moulds%20of%20Palaikastro
The Minoan Moulds of Palaikastro () are two double-sided pieces of schist, formed in the Minoan period as casting moulds for plaques with figures and symbols. These include female figures with raised arms, labrys double axes (Λάβρυες, labryes) and opium poppy flowers or capsules, two double axes with indented edges, the Horns of Consecration symbol, and a sun-like disc with complex markings, which has been claimed by some researchers to be for making objects to use in astronomical predictions of solar and lunar eclipses. They were found in 1899 near Palaikastro in the eastern part of Crete, and are now in the Heraklion Archaeological Museum in Crete. Description Stefanos Xanthoudidis, who published the find in 1900, described the two moulds, which were made from relatively soft and brittle schist, as Plate Α and Plate Β. His plaster casts, which are also reproduced on the right-hand side, are mirror images of the original moulds. Both moulds are wide, high and thick, while the width of the plaster casts is . The front of Plate Α shows a large disc with rectangular spokes and a serrated edge (which some are keen to interpret as "geared"), a female figure with raised arms, who holds flowers in her hands, and a small disc with a cross in the centre on top of a bell-shaped and horizontally striped base, above a crescent. Double horns, the 'Horns of Consecration' of the Minoan culture, and a trident are shown on the rear. A small piece of the lower edge of the mould is broken off. The front of Plate Β shows engravings of two double axes, dissimilar in size, with toothed edges. The double axe or labrys was a cultural, almost certainly religious, symbol of the Minoan culture, often used for votive offerings, as were goddess figures with uplifted hands. The rear of the plate shows a female figure with raised arms holding two double axes. A small piece of the lower edge of this mould is broken off as well. Both plates are exhibited side by side in the Heraklion Arc
https://en.wikipedia.org/wiki/Wound%20response%20in%20plants
Plants are constantly exposed to different stresses that result in wounding. Plants have adapted to defend themselves against wounding events, like herbivore attacks or environmental stresses. There are many defense mechanisms that plants rely on to help fight off pathogens and subsequent infections. Wounding responses can be local, like the deposition of callose, while others are systemic, involving a variety of hormones like jasmonic acid and abscisic acid. Overview There are many forms of defense that plants use to respond to wounding events. Some plants utilize physical defense mechanisms through structural components like lignin and the cuticle. The structure of a plant cell wall is incredibly important for wound responses, as both protect the plant from pathogenic infections by preventing various molecules from entering the cell. Plants are capable of activating innate immunity by responding to wounding events with damage-associated molecular patterns (DAMPs). Additionally, plants rely on microbe-associated molecular patterns (MAMPs) to defend themselves upon sensing a wounding event. There are examples of both rapid and delayed wound responses, depending on where the damage took place. MAMPs/DAMPs & Signaling Pathways Plants have pattern recognition receptors (PRRs) that recognize MAMPs, or microbe-associated molecular patterns. Upon entry of a pathogen, plants are vulnerable to infection and lose a fair amount of nutrients to said pathogen. The constitutive defenses are the physical barriers of the plant, including the cuticle or even metabolites that are toxic and deter herbivores. Plants maintain an ability to sense when they have an injured area and induce a defensive response. Within wounded tissues, endogenous molecules are released and act as damage-associated molecular patterns (DAMPs), inducing a defensive response. Such damage is typically caused by insects that feed on the plant. Such responses to wounds are found at th
https://en.wikipedia.org/wiki/Associative%20interference
Associative interference is a cognitive theory established on the concept of associative learning, which suggests that the brain links related elements. When one element is stimulated, its associates can also be activated. The best-known study demonstrating the credibility of this concept was Pavlov's experiment in 1927, which was later developed into the learning procedure known as classical conditioning. However, whilst classical conditioning and associative learning both explore how the brain uses this cognitive association to benefit us, studies have also shown how the brain can mistakenly associate related but incorrect elements, and this is known as associative interference. A simple example arises when one is asked a series of multiplication questions. A study conducted in 1985 showed that over 90% of the mistakes subjects made were actually answers to other questions with a common multiplicand. That is, questions such as 4 x 6 = 24 and 3 x 8 = 24 were very likely to promote errors such as answering 8 x 4 = 24, due to associative interference. Associative interference was widely investigated, and researchers realized there were different types of interference: retroactive interference, which investigates how new memories disrupt the retrieval of old memories, and proactive interference, which investigates how old memories disrupt the retrieval of new memories. These two were subsequently known as the interference theory. Therefore, associative interference is a fundamental theory which the interference theory draws upon. The essential difference between these two is time. Both retroactive and proactive interference are concerned with when the interfering elements, or memories, were obtained. However, associative interference does not encompass time, as shown by the previous example. The chronological acquisition of the four times table in relation to the three times table is irrelevant to why subjects made an error, highlighting the differe
https://en.wikipedia.org/wiki/BBN%20Time-Sharing%20System
The BBN Time-Sharing System was an early time-sharing system created at Bolt, Beranek and Newman (BBN) for the PDP-1 computer. It began operation in September 1962. History J. C. R. Licklider left MIT to become a vice president at Bolt Beranek and Newman in 1957. He learned about time-sharing from Christopher Strachey at a UNESCO-sponsored conference on Information Processing in Paris in June 1959. Digital Equipment Corporation's prototype PDP-1 was ready in November, 1959, and the machine was featured in the November/December issue of Datamation magazine. BBNer Ed Fredkin saw a prototype system at the Eastern Joint Computer Conference in Boston in December 1959, and was extremely interested. Given BBN's interest, DEC's founder and President Ken Olsen visited and explained that DEC had just completed construction of a prototype PDP-1, and that they needed a test site for a month. BBN agreed to be the test site, at its regular hourly rates, and then in early 1960 obtained the prototype PDP-1. The first production PDP-1 arrived in November 1960, and was formally accepted in April 1961. With the PDP-1 installed at BBN, in 1960 Licklider took on MIT's John McCarthy and Marvin Minsky as consultants. McCarthy had been advocating for the concept of time-sharing computers since the same year, but had found slow progress at MIT. At BBN, Licklider and Fredkin were keenly interested. In particular, Fredkin insisted that "timesharing could be done on a small computer, namely, a PDP-1." As Fredkin recounts: John’s invention of time-sharing and his telling me about his ideas all occurred before the PDP-1 existed. When I first saw the PDP-1 at the Eastern Joint Computer Conference, I realized that it was the perfect low-cost vehicle for implementing John's ideas. That is why I specified that several of the modifications for time sharing be part of the PDP-1b. McCarthy recalled in 1989: I kept arguing with him. I said "Well, you’d have to ... get an interrupt system." And h
https://en.wikipedia.org/wiki/Express%20Data%20Path
XDP (eXpress Data Path) is an eBPF-based high-performance data path used to send and receive network packets at high rates by bypassing most of the operating system networking stack. It has been part of the Linux kernel since version 4.8. This implementation is licensed under the GPL. Large technology firms including Amazon, Google and Intel support its development. Microsoft released its free and open-source implementation, XDP for Windows, in May 2022, licensed under the MIT License. Data path The idea behind XDP is to add an early hook in the RX path of the kernel, and let a user-supplied eBPF program decide the fate of the packet. The hook is placed in the network interface controller (NIC) driver just after the interrupt processing, and before any memory allocation needed by the network stack itself, because memory allocation can be an expensive operation. Due to this design, XDP can drop 26 million packets per second per core with commodity hardware. The eBPF program must pass a preverifier test before being loaded, to avoid executing malicious code in kernel space. The preverifier checks that the program contains no out-of-bounds accesses, loops or global variables. The program is allowed to edit the packet data and, after the eBPF program returns, an action code determines what to do with the packet: XDP_PASS: let the packet continue through the network stack XDP_DROP: silently drop the packet XDP_ABORTED: drop the packet with a trace point exception XDP_TX: bounce the packet back to the same NIC it arrived on XDP_REDIRECT: redirect the packet to another NIC or user space socket via the AF_XDP address family XDP requires support in the NIC driver but, as not all drivers support it, it can fall back to a generic implementation, which performs the eBPF processing in the network stack, though with slower performance. XDP has infrastructure to offload the eBPF program to a network interface controller which supports it, reducing the CPU load. In 2022, many ne
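As a concrete illustration of the hook-and-action-code model described above, here is a minimal sketch using the BCC toolkit; BCC itself, the interface name, and the trivial program body are assumptions for the example, not anything the article prescribes.

```python
# A minimal XDP sketch via BCC: attach an eBPF program at the driver hook and
# return one of the action codes listed above (requires root and bcc).
from bcc import BPF

prog = r"""
#include <uapi/linux/bpf.h>

int xdp_prog(struct xdp_md *ctx) {
    // The verifier checks this program before it is loaded; the returned
    // action code decides the packet's fate at the earliest point in RX.
    return XDP_PASS;   // let the packet continue through the network stack
}
"""

b = BPF(text=prog)                       # compile and verify the eBPF program
fn = b.load_func("xdp_prog", BPF.XDP)
device = "eth0"                          # illustrative interface name
b.attach_xdp(device, fn, 0)
try:
    print("XDP program attached; Ctrl-C to detach")
    b.trace_print()                      # block while the program runs
finally:
    b.remove_xdp(device, 0)              # restore the normal data path
```

Returning XDP_DROP instead of XDP_PASS would turn this into the early-drop filter the throughput figure above refers to, since rejected packets never reach the kernel's memory allocator.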
https://en.wikipedia.org/wiki/Hazel%20Perfect
Hazel Perfect (circa 1927 – 8 July 2015) was a British mathematician specialising in combinatorics. Contributions Perfect was known for inventing gammoids, for her work with Leon Mirsky on doubly stochastic matrices, for her three books Topics in Geometry, Topics in Algebra, and Independence Theory in Combinatorics, and for her work as a translator (from an earlier German translation) of Pavel Alexandrov's book An Introduction to the Theory of Groups (Hafner, 1959). The Perfect–Mirsky conjecture, named after Perfect and Leon Mirsky, concerns the region of the complex plane formed by the eigenvalues of doubly stochastic matrices. Perfect and Mirsky conjectured that for n × n matrices this region is the union of regular polygons with up to n sides, having the roots of unity of each degree up to n as vertices. Perfect and Mirsky proved their conjecture for n ≤ 3; it was subsequently shown to be true for n = 4 and false for n = 5, but it remains open for larger values. Education and career Perfect earned a master's degree through Westfield College (a constituent college for women in the University of London) in 1949, with a thesis on The Reduction of Matrices to Canonical Form. In the 1950s, Perfect was a lecturer at University College of Swansea; she collaborated with Gordon Petersen, a visitor to Swansea at that time, on their translation of Alexandrov's book. She completed her Ph.D. at the University of London in 1969; her dissertation was Studies in Transversal Theory with Particular Reference to Independence Structures and Graphs. She became a reader in mathematics at the University of Sheffield. Selected publications Books Research papers Translation References 2015 deaths British mathematicians Women mathematicians Alumni of Westfield College Academics of Swansea University Academics of the University of Sheffield German–English translators Technical translators Combinatorialists Year of birth uncertain
https://en.wikipedia.org/wiki/3-4-3-12%20tiling
In geometry of the Euclidean plane, the 3-4-3-12 tiling is one of 20 2-uniform tilings of the Euclidean plane by regular polygons, containing regular triangles, squares, and dodecagons, arranged in two vertex configurations: 3.4.3.12 and 3.12.12. The 3.12.12 vertex figure alone generates a truncated hexagonal tiling, while the 3.4.3.12 only exists in this 2-uniform tiling. There are two 3-uniform tilings that contain both of these vertex figures along with one more. It has square symmetry, p4m, [4,4], (*442). It is also called a demiregular tiling by some authors. Circle Packing This 2-uniform tiling can be used as a circle packing. Cyan circles are in contact with 3 other circles (1 cyan, 2 pink), corresponding to the V3.12.12 planigon, and pink circles are in contact with 4 other circles (2 cyan, 2 pink), corresponding to the V3.4.3.12 planigon. It is homeomorphic to the ambo operation on the tiling, with the cyan and pink gap polygons corresponding to the cyan and pink circles (one-dimensional duals to the respective planigons). Both images coincide. Dual tiling The dual tiling has kite ('ties') and isosceles triangle faces, defined by the face configurations V3.4.3.12 and V3.12.12. The kites meet in sets of 4 around a center vertex, and the triangles are in pairs making planigon rhombi. Every four kites and four isosceles triangles make a square of side length . This is one of the only dual uniform tilings which only uses planigons (and semiplanigons) containing a 30° angle. Conversely, 3.4.3.12; 3.12.12 is one of the only uniform tilings in which every vertex is contained on a dodecagon. Related tilings It has two related 3-uniform tilings that include both 3.4.3.12 and 3.12.12 vertex figures: This tiling can be seen in a series as a lattice of 4n-gons starting from the square tiling. For 16-gons (n=4), the gaps can be filled with isogonal octagons and isosceles triangles. Notes References Keith Critchlow, Order in Space: A design source book, 1970, pp. 62–67 Ghyka,
https://en.wikipedia.org/wiki/Value%20tree%20analysis
Value tree analysis is a multi-criteria decision-making (MCDM) method in which the decision attributes of each choice are weighted to arrive at a preference for the decision maker. Usually, the attribute-specific values of the choices are aggregated into an overall score. Decision analysts (DAs) distinguish two types of preference: value preferences, which compare alternatives when there is no uncertainty, and risk preferences, which describe the decision maker's attitude to risk taking under uncertainty. This learning package focuses on deterministic choices, namely value theory, and in particular a decision analysis tool called a value tree. History The concept of utility was first used by Daniel Bernoulli (1738) in the 1730s to explain the evaluation of the St Petersburg paradox, a specific uncertain gamble. He argued that money alone could not measure value: for an individual, the worth of money was a non-linear function. This discovery led to the emergence of utility theory, a numerical measure indicating how much value alternative choices have. With the development of decision analysis, utility played an important role in the explanation of economic behavior. Some utilitarian philosophers, such as Bentham and Mill, also took advantage of it as a tool for building ethical theories. Nevertheless, there was no way of measuring an individual's utility function, and the theory had little practical importance. Over time, utility theory was gradually placed on a solid theoretical foundation. People started to use the theory of games to explain the behavior of rational actors engaging with others in situations of conflict. In 1944 John von Neumann and Oskar Morgenstern's Theory of Games and Economic Behavior was published. Afterwards, it became one of the key tools that researchers and practitioners from statistics and operations research use to give a helping hand to decision
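To make the aggregation step concrete, here is a minimal additive value tree sketch in Python. The attribute names, scores, and weights are invented for illustration; the additive form V(a) = sum_i w_i * v_i(a) is the usual textbook aggregation, not something the excerpt spells out.

```python
# A minimal additive value tree: the overall value of an alternative is the
# weighted sum of its attribute-specific scores (all numbers illustrative).
def overall_value(weights, scores):
    assert abs(sum(weights.values()) - 1.0) < 1e-9   # weights must sum to one
    return sum(w * scores[attr] for attr, w in weights.items())

weights = {"price": 0.5, "safety": 0.3, "range": 0.2}   # decision maker's trade-offs
alternatives = {
    "car_a": {"price": 0.9, "safety": 0.4, "range": 0.6},
    "car_b": {"price": 0.5, "safety": 0.8, "range": 0.7},
}

ranked = sorted(alternatives,
                key=lambda a: overall_value(weights, alternatives[a]),
                reverse=True)
print(ranked)   # preference order under these weights: ['car_a', 'car_b']
```

Changing the weight vector reorders the alternatives, which is exactly why the weighting step is the heart of the method.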
https://en.wikipedia.org/wiki/Carl%20Theodore%20Heisel
Carl Theodore Heisel (1852–1937) was a mathematical crank who wrote several books in the 1930s challenging accepted mathematical truths. Among his claims is that he found a way to square the circle. He is credited with 24 works in 62 publications. Heisel did not charge money for his books; he gave thousands of them away for free. Because of this, they are available at many libraries and universities. Heisel's books have historic and monetary value. Paul Halmos referred to one of Heisel's works as a "classic crank book." Selected works References Pseudomathematics 1852 births 1937 deaths
https://en.wikipedia.org/wiki/TENEX%20%28operating%20system%29
TENEX is an operating system developed in 1969 by BBN for the PDP-10, which later formed the basis for Digital Equipment Corporation's TOPS-20 operating system. Background In the 1960s, BBN was involved in a number of LISP-based artificial intelligence projects for DARPA, many of which had very large (for the era) memory requirements. One solution to this problem was to add paging software to the LISP language, allowing it to write out unused portions of memory to disk for later recall if needed. One such system had been developed for the PDP-1 at MIT by Daniel Murphy before he joined BBN. Early DEC machines were based on an 18-bit word, allowing addresses to encode a memory of 256 kilowords. The machines were based on expensive core memory and included nowhere near the required amount. The pager used the most significant bits of the address to index a table of blocks on a magnetic drum that acted as the pager's backing store. The software would fetch the pages if needed, and then resolve the address to the proper area of RAM. In 1964 DEC announced the PDP-6. DEC was still heavily involved with MIT's AI Lab, and many feature requests from the LISP hackers were moved into this machine. 36-bit computing was especially useful for LISP programming because with an 18-bit address space, a word of storage on these systems contained two addresses, a perfect match for the common LISP CAR and CDR operations. BBN became interested in buying one for their AI work when they became available, but wanted DEC to add a hardware version of Murphy's pager directly into the system. With such an addition, every program on the system would have paging support invisibly, making it much easier to do any sort of programming on the machine. DEC was initially interested, but soon (1966) announced they were in fact dropping the PDP-6 and concentrating solely on their smaller 18-bit and new 16-bit lines. The PDP-6 was expensive and complex, and had not sold well for these reasons. It was n
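The drum-backed paging scheme described above can be sketched in a few lines. Everything here, the page size, page count, and the load_from_drum helper, is a hypothetical illustration of the high-bits-index-a-block-table idea, not BBN's actual design.

```python
# Illustrative sketch of drum-backed address translation: the high bits of an
# 18-bit address select a page, the low bits an offset; pages absent from core
# are brought in from the drum backing store on demand.
PAGE_WORDS = 1 << 13              # hypothetical page size: 8 K words
NUM_PAGES = 1 << 5                # 18-bit address space -> 32 pages

drum = {p: p for p in range(NUM_PAGES)}   # stand-in for the drum block table
page_table = [None] * NUM_PAGES           # virtual page -> core frame number

def load_from_drum(page):
    # In the real system this copied a drum block into core; here we simply
    # hand back a frame number recorded in the drum table.
    return drum[page]

def resolve(vaddr):
    """Translate an 18-bit virtual address to a physical word address."""
    page, offset = divmod(vaddr & 0o777777, PAGE_WORDS)
    if page_table[page] is None:          # page fault: fetch from the drum
        page_table[page] = load_from_drum(page)
    return page_table[page] * PAGE_WORDS + offset

print(oct(resolve(0o123456)))   # page 5, offset 0o3456 under this layout
```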
https://en.wikipedia.org/wiki/Electric%20car%20charging%20methods
Various methods exist for recharging the batteries of electric cars. Currently, the largest concern surrounding electric vehicle transportation is the total travel range available before the need to recharge. The longest range recorded to date was 606.2 miles, achieved by a Tesla Model 3. However, this was conducted in very controlled conditions where the car maintained a constant speed without the added drain of the air conditioning compressor. Typically, the battery would last for approximately 300 miles, the equivalent of three days of city commuting in warmer weather, or one day in colder weather. With these limitations, long-distance trips are currently unsuited to an electric car unless rapid charging stations are available on the route of the trip. Principles of rapid charging The process of charging involves lithium ions from a positive electrode passing through a separator/electrolyte. The ions then transfer via a solid electrolyte interface (SEI) and intercalate into the negative electrode. The potential negative impact of rapid charging is that battery aging may be accelerated by the unstable SEI produced by charging and discharging multiple times. New pulse charging has significantly improved the stability of the SEI, since unnecessary chemical reactions are reduced by the new charging methods and the SEI is grown via a reduction reaction. Thus, a battery's life cycle and efficiency have also improved in comparison to the traditional charging method. Also, in the traditional charging method, ethylene might be produced during charging due to reduction of the electrolyte (mostly EC, ethylene carbonate) with lithium ions. Since the battery is sealed, the internal pressure is contained, and the produced ethylene will lead to overpressure inside the battery. Overpressure may also cause the battery to expand due to an internal temperature rise, potentially causing the battery to explode. These new composite waveform charging methods, however, can reduce the pr
https://en.wikipedia.org/wiki/Costa%20Rica%20Thermal%20Dome
The Costa Rica Thermal Dome (CRTD; also called the Costa Rica Dome) is an oceanographic feature and marine biodiversity hotspot that varies in size from 300 to 1,000 km in diameter. The dome is located off the western coast of Central America in the Tropical Eastern Pacific. Through the interaction of wind and ocean currents, deeper waters are drawn towards the surface in a dome-like shape at this location. These waters displace the warmer, nutrient-poor waters with colder, nutrient-richer waters. An investigation by UNESCO's World Heritage Centre and the International Union for Conservation of Nature (IUCN) in 2016 considered it eligible to become a World Heritage Site in the near future. The average position of the center of the Costa Rica Dome is located at latitude 9°N, longitude 90°W, which is off the coast of Costa Rica. The dome is positioned above the Cocos underwater tectonic valley and exhibits a subaquatic cyclonic current that moves in sync with the air flows above. The Costa Rica Thermal Dome is full of biodiversity and many forms of marine life. The nutrient hotspot supports all types of animals, ranging from zooplankton to the blue whale. The location is also within one of the largest tuna catchment areas in the world. The Costa Rica Dome is also positioned on the major seaway to the Panama Canal. The dome and its marine life provide economic benefits for countries such as Panama and Costa Rica. Costa Ricans (Ticos) are very proud of the dome, and believe it is one of the features that make the country blessed by nature. At the 12th Conference of the Parties to the Convention on Biological Diversity (CBD) in South Korea, the dome was recognized as an ecologically or biologically significant marine area. Geology and geography The Costa Rica Dome lies largely beyond national jurisdiction, and its diameter and position change from year to year and within a characteristic annual cycle. The dome's upwelling consists of vertical changes, water from oceanic depths rising towards
https://en.wikipedia.org/wiki/Bagger%201473
Bagger 1473 is a bucket-wheel excavator left abandoned in a field in the municipality of Schipkau in Germany. History The excavator was used at the Tagebau Meuro mine from 1965 to 2002. After it was withdrawn from service, the municipalities of Senftenberg, Großräschen, and Schipkau decided on a joint action to preserve the opencast mining machine. Between 29 August and 15 September 2003, Bagger 1473 was moved approximately from the Meuro mine to near the EuroSpeedway Lausitz, where it would serve as a monument to the area's former lignite mining. The machine was moved across industrial roads and railways owned by the LMBV, and public traffic was not affected. When Bagger 1473 became popular with urban explorers, it was misidentified as Bagger 258 because of markings found on its information plate. Scrapping In January 2019 the municipalities that had supported its move announced that the excavator was to be scrapped. Their decision was mainly due to the machine's dilapidation and damage: it was financially impossible to maintain, and because vandalism and theft had become so extensive, the structure was no longer safe for people. Parts of the excavator would be preserved, such as its wheel. However, when the Brandenburg Landesamt für Denkmalpflege (State Office for the Preservation of Monuments) and state archaeological museum learned of the planned demolition from news reports, they issued a statement that the excavator had been recognized as a protected historical structure since 2002/2003, although at that time it had simply been assumed that formally placing it on the list of such structures was unnecessary. The fact that the structure had been identified as historically significant was considered sufficient to declare it protected, and the Landesamt quickly made this official in February. Constructive discussions about the future of Bagger 1473 are ongoing with the local municipalities and communities. Despite imminent demolition work being prevented, it remains unclear how th
https://en.wikipedia.org/wiki/PRNET
The Packet Radio Network (PRNET) was a set of early, experimental mobile ad hoc networks whose technologies evolved over time. It was funded by the Advanced Research Projects Agency (ARPA). Major participants in the project included BBN Technologies, Hazeltine Corporation, Rockwell International's Collins division, and SRI International. History ARPA initiated the PRNET project in 1973, funding both theoretical and experimental research. Its goals were outlined in a 1975 paper by Bob Kahn, namely, to investigate the feasibility of using packet-switched, store-and-forward radio communications to provide reliable computer communications in a mobile environment. The earlier ALOHAnet served as an inspiration, but PRNET tackled a significantly harder set of problems, namely, multi-hop communications between mobile vehicles without a central station. In Kahn's initial conception, the overall system design was "predicated upon the existence of an array of low cost repeaters", where he defines the term to mean "a particular kind of packet radio which is equipped to retransmit by radio some or all packets which it receives by radio". In today's terminology, this might be called a router or a packet switch, rather than a radio repeater. The first PRNET was established under the auspices of SRI in the San Francisco Bay Area, with BBN contributing network technology and Collins creating the Experimental Packet Radios (EPRs), which implemented L-band spread-spectrum waveforms and supported half-duplex communications at 100 or 400 kilobits/second. There was also a smaller network at BBN, for software development and testing. The first packet radios were delivered in mid-1975 for initial testing and a quasi-operational network capability was established for the first time in September 1976, shortly after the prototype networking software was developed. By 1977, this software included radio network routing control; a gateway to other networks; network measurement; debugging tools
https://en.wikipedia.org/wiki/Kioxia
Kioxia Holdings Corporation (), simply known as Kioxia and stylized as KIOXIA, is a Japanese multinational computer memory manufacturer headquartered in Tokyo, Japan. The company was spun off from the Toshiba conglomerate as Toshiba Memory in June 2018. It became a wholly owned subsidiary company of Toshiba Memory Holdings Corporation on March 1, 2019, and was renamed Kioxia in October 2019. In the early 1980s, while still part of Toshiba, the company was credited with inventing flash memory. In the second quarter of 2021, the company was estimated to have 18.3% of the global revenue share for NAND flash solid-state drives. The company is the parent company of Kioxia Corporation. Name Kioxia is a combination of the Japanese word kioku, meaning memory, and the Greek word axia, meaning value. History In 1980, Fujio Masuoka, an engineer at Kioxia predecessor Toshiba, invented flash memory, and in 1984, Masuoka and his colleagues presented their invention of NOR flash. In January 2014, the Toshiba Corporation completed its acquisition of OCZ Storage Solutions, renaming it OCZ and making it a brand of Toshiba. On June 1, 2018, after heavy losses at former parent company Toshiba caused by the bankruptcy of its Westinghouse subsidiary over nuclear power plant construction at the Vogtle Electric Generating Plant in 2016, Toshiba Memory Corporation was spun off from the Toshiba Corporation. Toshiba maintained a 40.2% equity stake in the new company, which consisted of all of Toshiba's memory businesses. Toshiba Memory Corporation became a subsidiary of the newly formed Toshiba Memory Holdings Corporation on March 1, 2019. In June 2019, Kioxia experienced a power cut at one of its factories in Yokkaichi, Japan, resulting in the loss of at least 6 exabytes of flash memory, with some sources estimating the loss as high as 15 exabytes. Western Digital used (and still uses) Kioxia's facilities for making its own flash memory chips. On July 18, 2019, Toshiba Memory Holdings Corporatio
https://en.wikipedia.org/wiki/Journal%20of%20Computational%20and%20Nonlinear%20Dynamics
The Journal of Computational and Nonlinear Dynamics is a quarterly peer-reviewed multidisciplinary scientific journal covering the study of nonlinear dynamics. It was established in 2006 and is published by the American Society of Mechanical Engineers. The editor-in-chief is Balakumar Balachandran (University of Maryland). According to the Journal Citation Reports, the journal has a 2017 impact factor of 1.996. References External links Multidisciplinary scientific journals Academic journals established in 2006 Quarterly journals Dynamics (mechanics) English-language journals Systems science literature American Society of Mechanical Engineers academic journals
https://en.wikipedia.org/wiki/Genevi%C3%A8ve%20Raugel
Geneviève Raugel (27 May 1951 – 10 May 2019) was a French mathematician working in the field of numerical analysis and dynamical systems. Biography Raugel entered the École normale supérieure de Fontenay-aux-Roses in 1972, obtaining the agrégation in mathematics in 1976. She earned her Ph.D. degree from the University of Rennes 1 in 1978 with a thesis entitled Résolution numérique de problèmes elliptiques dans des domaines avec coins (Numerical resolution of elliptic problems in domains with corners). Raugel obtained a tenured position in the CNRS the same year, first as a researcher (1978–1994) and then as a research director (exceptional class from 2014 on). Beginning in 1989, she worked at the CNRS mathematics laboratory at Orsay, affiliated with the University of Paris-Sud. Raugel also held visiting professor positions at several international institutions: the University of California, Berkeley (1986–1987), Caltech (1991), the Fields Institute (1993), the University of Hamburg (1994–95), and the University of Lausanne (2006). She delivered the Hale Memorial Lectures in 2013 at the first international conference on the dynamics of differential equations, in Atlanta. She co-directed the international Journal of Dynamics and Differential Equations from 2005 on. Research Raugel's first research works were devoted to numerical analysis, in particular finite element discretization of partial differential equations. With Christine Bernardi, she studied a finite element for the Stokes problem, now known as the Bernardi-Fortin-Raugel element. She was also interested in problems of bifurcation, showing for instance how to use invariance properties of the dihedral group in these questions. In the mid-1980s, she started working on the dynamics of evolution equations, in particular on global attractors, perturbation theory, and the Navier-Stokes equations in thin domains. In this last topic she was recognized as a world expert. Selected publications with Christine Bernardi, Approximation nu
https://en.wikipedia.org/wiki/Williamson%20conjecture
In combinatorial mathematics, specifically in combinatorial design theory and combinatorial matrix theory, the Williamson conjecture states that Williamson matrices of order n exist for all positive integers n. Four symmetric and circulant n × n matrices A, B, C, D are known as Williamson matrices if their entries are ±1 and they satisfy the relationship A^2 + B^2 + C^2 + D^2 = 4nI, where I is the identity matrix of order n. John Williamson showed that if A, B, C, D are Williamson matrices then the Williamson array built from them is a Hadamard matrix of order 4n. It was once considered likely that Williamson matrices exist for all orders n and that the structure of Williamson matrices could provide a route to proving the Hadamard conjecture that Hadamard matrices exist for all orders divisible by 4. However, in 1993 the Williamson conjecture was shown to be false via an exhaustive computer search by Dragomir Ž. Ðoković, who showed that Williamson matrices do not exist in order n = 35. In 2008, counterexamples of orders 47, 53, and 59 were additionally discovered. References Combinatorial design Disproved conjectures
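Williamson's construction can be checked mechanically. The block arrangement below is the standard Williamson array; the order-1 matrices are the trivial case, used here only so the sketch is self-contained.

```python
# Build the Williamson array from four candidate matrices and verify the
# defining Hadamard property H H^T = 4n I for the trivial order-1 example.
import numpy as np

def williamson_array(A, B, C, D):
    return np.block([
        [ A,  B,  C,  D],
        [-B,  A, -D,  C],
        [-C,  D,  A, -B],
        [-D, -C,  B,  A],
    ])

n = 1
A = B = C = D = np.ones((n, n))        # entries +-1; A^2+B^2+C^2+D^2 = 4nI holds
H = williamson_array(A, B, C, D)
assert (H @ H.T == 4 * n * np.eye(4 * n)).all()   # Hadamard matrix of order 4n
```

An exhaustive search like Ðoković's amounts to running such a check over every choice of symmetric circulant ±1 matrices of a given order and finding that none satisfies the defining relationship.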
https://en.wikipedia.org/wiki/Fast%20Pair
The Google Fast Pair Service, or simply Fast Pair, is Google's proprietary standard for quickly pairing Bluetooth devices when they come in close proximity for the first time using Bluetooth Low Energy (BLE). It was announced in October 2017 and initially designed for connecting audio devices such as speakers, headphones and car kits with the Android operating system. In 2018, Google added support for ChromeOS devices, and in 2019, Google announced that Fast Pair connections could now be synced with other Android devices with the same Google Account. Google has partnered with Bluetooth SoC designers including Qualcomm, Airoha Technology, and BES Technic to add Fast Pair support to their SDKs. In May 2019, Qualcomm announced their Smart Headset Reference Design, Qualcomm QCC5100, QCC3024 and QCC3034 SoC series with support for Fast Pair and Google Assistant. In July 2019, Google announced True Wireless Features (TWF), Find My Device and enhanced Connected Device Details. List of supported devices Earbuds 1More EVO 1More Dual Driver BT ANC Anker Soundcore Liberty 4 NC Anker Spirit Pro GVA Beats Studio Buds Cleer Ally Plus Dizo Wireless Dash Neckband Dizo Go pods D Jabra Elite 2 Jabra Elite 3 Jabra Elite 4 Active Jabra Elite 10 Jaybird Tarah Jaybird Vista 2 JBL Live Free NC+ TWS JBL Live Tune 225 TWS JBL Reflect Mini NC TWS JBL Peak II JBL Tour Pro+ JBL Club Pro+ LG Tone Free (All devices) Marshall Minor III Microsoft Surface Earbuds Moto Buds 600 ANC Nothing ear (1) Nothing ear (2) Nothing ear (stick) OnePlus Buds OnePlus Buds Z OnePlus Buds Z2 OnePlus Buds Pro 2 OnePlus Buds Pro 2R OPPO Enco Air3 OPPO Enco Air3 Pro Google Pixel Buds (First generation) Google Pixel Buds A-Series Google Pixel Buds (Second generation) Google Pixel Buds Pro Realme Buds Air Realme Buds Air 2 Realme Buds Air 3 Realme Buds Air 3s Realme Buds Air 5 Pro Realme Buds Air Neo Realme Buds Air Pro Realme Buds Q2 Realme Buds Q2S Realme Techlife Buds
https://en.wikipedia.org/wiki/Microtechnique
Microtechnique is an aggregate of methods used to prepare micro-objects for study. It is currently employed in many fields of life science. Two well-known branches of microtechnique are botanical (plant) microtechnique and zoological (animal) microtechnique. In both plant and animal microtechnique, four types of methods are commonly used in recent micro experiments: whole mounts, smears, squashes, and sections. Plant microtechnique includes direct macroscopic examination, freehand sections, clearing, maceration, embedding, and staining. Moreover, three preparation methods used in zoological micro observations are the paraffin method, the celloidin method, and the freezing method. History The early development of microtechnique in botany is closely related to that in zoology, and discoveries have been adopted by both zoologists and botanists. The principle of dry preparation emerged at the end of the 1930s. After Hooke discovered cells, microtechnique developed together with the emergence of early microscopes; it then advanced over the period 1800–1875, and after 1875 modern micro methods emerged. In recent years, both traditional methods and modern microtechnique have been in use in many experiments. Commonly used methods Some general microtechniques can be used in both plant and animal micro observation. Whole mounts, smears, squashes, and sections are four commonly used methods when preparing plant and animal specimens for specific purposes. Whole mounts Whole mounts are usually used when observers need to use a whole organism or do detailed research on a specific organ structure. This method requires objects from which moisture can be removed, like seeds and micro fossils.
https://en.wikipedia.org/wiki/Newton%E2%80%93Gauss%20line
In geometry, the Newton–Gauss line (or Gauss–Newton line) is the line joining the midpoints of the three diagonals of a complete quadrilateral. The midpoints of the two diagonals of a convex quadrilateral with at most two parallel sides are distinct and thus determine a line, the Newton line. If the sides of such a quadrilateral are extended to form a complete quadrilateral, the diagonals of the quadrilateral remain diagonals of the complete quadrilateral and the Newton line of the quadrilateral is the Newton–Gauss line of the complete quadrilateral. Complete quadrilaterals Any four lines in general position (no two lines are parallel, and no three are concurrent) form a complete quadrilateral. This configuration consists of a total of six points, the intersection points of the four lines, with three points on each line and precisely two lines through each point. These six points can be split into pairs so that the line segments determined by any pair do not intersect any of the given four lines except at the endpoints. These three line segments are called diagonals of the complete quadrilateral. Existence of the Newton−Gauss line It is a well-known theorem that the three midpoints of the diagonals of a complete quadrilateral are collinear. There are several proofs of the result based on areas or wedge products or, as in the following proof, on Menelaus's theorem; the proof given here is due to Hillyer and was published in 1920. Let the complete quadrilateral be labeled as in the diagram with diagonals and their respective midpoints . Let the midpoints of be respectively. Using similar triangles it is seen that intersects at , intersects at and intersects at . Again, similar triangles provide the following proportions, However, the line intersects the sides of triangle , so by Menelaus's theorem the product of the terms on the right hand sides is −1. Thus, the product of the terms on the left hand sides is also −1 and again by Menelaus's theorem, the points are collinear on the sides
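The theorem is easy to check numerically. The short Python sketch below uses four concrete lines chosen for the example; the pairing of intersection points into diagonals follows the definition given above.

```python
# Numerical check of the Newton-Gauss line: four lines in general position,
# their six pairwise intersections, and the midpoints of the three diagonals.
import numpy as np

def intersect(l1, l2):
    """Intersection of two lines given as (a, b, c), meaning ax + by = c."""
    return np.linalg.solve(np.array([l1[:2], l2[:2]], float),
                           np.array([l1[2], l2[2]], float))

lines = [(1, 0, 0),   # x = 0
         (0, 1, 0),   # y = 0
         (1, 1, 3),   # x + y = 3
         (1, 2, 4)]   # x + 2y = 4

# A diagonal joins two intersection points that share none of the four lines,
# so the three diagonals pair the line-index pairs as (01,23), (02,13), (03,12).
P = {(i, j): intersect(lines[i], lines[j])
     for i in range(4) for j in range(i + 1, 4)}
mids = [(P[(0, 1)] + P[(2, 3)]) / 2,
        (P[(0, 2)] + P[(1, 3)]) / 2,
        (P[(0, 3)] + P[(1, 2)]) / 2]

# Collinear midpoints span a (numerically) zero-area triangle.
v1, v2 = mids[1] - mids[0], mids[2] - mids[0]
area = v1[0] * v2[1] - v1[1] * v2[0]
assert abs(area) < 1e-12    # the three midpoints lie on one line
```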
https://en.wikipedia.org/wiki/Quantinuum
Quantinuum is a quantum computing company formed by the merger of Cambridge Quantum and Honeywell Quantum Solutions. The company's H-Series trapped ion quantum computers set the highest quantum volume to date of 524,288. This architecture supports all-to-all qubit connectivity - allowing entangled states to be created between all qubits – and enables a high fidelity of quantum states. Quantinuum has developed middleware and software products that run on trapped-ion and other quantum computing platforms for quantum chemistry, quantum machine learning and quantum artificial intelligence. The company also offers quantum-computing-hardened encryption keys designed to protect data assets and enhance cryptographic defenses. History Formed in 2021, Quantinuum is the combination of the quantum hardware team from Honeywell Quantum Solutions (HQS) and the quantum software team at Cambridge Quantum Computing (CQC). HQS was founded in 2014. The company used a trapped ion architecture for its quantum computing hardware, which Honeywell believed could be used to fulfill the needs of its various business units in aerospace, building technology, performance materials, safety and productivity solutions. CQC was founded in 2014 as an independent quantum computing company through the University of Cambridge’s “Accelerate Cambridge” program. CQC focused on building tools for the commercialization of quantum technologies with a focus on quantum software and quantum cybersecurity. By coming together as Quantinuum, the company offers an integrated, end-to-end quantum platform. Ilyas Khan, the founder of Cambridge Quantum and the founding Chairman of the Stephen Hawking Foundation and Fellow at Cambridge Judge Business School, was named the CEO of Quantinuum. Tony Uttley, formerly an operations manager at NASA and President of Honeywell Quantum Solutions, was named the President and Chief Operating Officer. In 2023, Quantinuum named Rajeeb “Raj” Hazra, formerly a corporate vice presid
https://en.wikipedia.org/wiki/Diffusion-limited%20escape
Diffusion-limited escape occurs when the rate of atmospheric escape to space is limited by the upward diffusion of escaping gases through the upper atmosphere, and not by escape mechanisms at the top of the atmosphere (the exobase). The escape of any atmospheric gas can be diffusion-limited, but only diffusion-limited escape of hydrogen has been observed in our solar system, on Earth, Mars, Venus and Titan. Diffusion-limited hydrogen escape was likely important for the rise of oxygen in Earth's atmosphere (the Great Oxidation Event) and can be used to estimate the oxygen and hydrogen content of Earth's prebiotic atmosphere. Diffusion-limited escape theory was first used by Donald Hunten in 1973 to describe hydrogen escape on one of Saturn's moons, Titan. The following year, in 1974, Hunten found that the diffusion-limited escape theory agreed with observations of hydrogen escape on Earth. Diffusion-limited escape theory is now used widely to model the composition of exoplanet atmospheres and Earth's ancient atmosphere. Diffusion-Limited Escape of Hydrogen on Earth Hydrogen escape on Earth occurs at ~500 km altitude at the exobase (the lower border of the exosphere) where gases are collisionless. Hydrogen atoms at the exobase exceeding the escape velocity escape to space without colliding into another gas particle. For a hydrogen atom to escape from the exobase, it must first travel upward through the atmosphere from the troposphere. Near ground level, hydrogen in the form of H2O, H2, and CH4 travels upward in the homosphere through turbulent mixing, which dominates up to the homopause. At about 17 km altitude, the cold tropopause (known as the "cold trap") freezes out most of the H2O vapor that travels through it, preventing the upward mixing of some hydrogen. In the upper homosphere, hydrogen bearing molecules are split by ultraviolet photons leaving only H and H2 behind. The H and H2 diffuse upward through the heterosphere to the exobase where they escape the
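The theory's central quantitative result, not reproduced in the excerpt above, is Hunten's limiting-flux relation; the rendering below is a standard form from the general literature, with conventionally defined symbols, rather than a formula quoted from the article.

```latex
% Diffusion-limited (limiting) escape flux of a light constituent i through a
% static background atmosphere a: b_{ia} is the binary diffusion parameter,
% f_T the total mixing ratio of the escaping element (e.g. all hydrogen),
% and H_j = kT/(m_j g) the scale height of species j.
\Phi_{\mathrm{DL}} \;=\; b_{ia}\, f_T \left( \frac{1}{H_a} - \frac{1}{H_i} \right)
\;=\; \frac{b_{ia}\, f_T\, g\, (m_a - m_i)}{kT}
```

Because the flux depends only on the total hydrogen mixing ratio below the homopause, measuring the escape rate constrains the hydrogen content of the lower atmosphere, which is what makes the relation useful for the ancient-Earth arguments mentioned above.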
https://en.wikipedia.org/wiki/HarmonyOS
HarmonyOS (HMOS) () is a distributed operating system developed by Huawei for smartphones, tablets, TVs, smart watches, and other smart devices. It has a multikernel design with dual frameworks: the operating system selects suitable kernels from the abstraction layer in the case of devices that use diverse resources. The operating system was officially launched by Huawei in August 2019. Architecture HarmonyOS is designed with a layered architecture, which consists of four layers; the kernel layer at the bottom provides the upper three layers, i.e., the system service layer, framework layer and application layer, with basic kernel capabilities, such as process and thread management, memory management, file system, network management, and peripheral management. In the kernel layer, the system applies a multikernel design and selects an appropriate kernel for a device with different resource limitations. For wearables, screenless I/O devices and IoT devices, the system is based on real-time operating system LiteOS; while for smartphones and tablets, the system operates by utilizing a Linux kernel subsystem and executing the AOSP code with a modified EMUI user interface, enabling Android apps and HarmonyOS apps to run seamlessly through a compatibility layer in the userland outside the kernel. The system includes a communication base called DSoftBus for integrating physically separate devices into a virtual Super Device, allowing one device to control others and sharing data among devices with distributed communication capabilities. To address security concerns arising from varying devices, the system provides a hardware-based Trusted Execution Environment (TEE) to prevent leakage of sensitive personal data when they are stored or processed. It supports several forms of apps, including the apps that can be installed from AppGallery on smartphones and tablets, installation-free Quick apps and lightweight Meta Services accessible by users. History Early developmen
https://en.wikipedia.org/wiki/Galactic%20algorithm
A galactic algorithm is one that outperforms other algorithms for problems that are sufficiently large, but where "sufficiently large" is so big that the algorithm is never used in practice. Galactic algorithms were so named by Richard Lipton and Ken Regan, because they will never be used on any data sets on Earth. Possible use cases Even if they are never used in practice, galactic algorithms may still contribute to computer science: An algorithm, even if impractical, may show new techniques that may eventually be used to create practical algorithms. Available computational power may catch up to the crossover point, so that a previously impractical algorithm becomes practical. An impractical algorithm can still demonstrate that conjectured bounds can be achieved, or that proposed bounds are wrong, and hence advance the theory of algorithms. As Lipton states: Similarly, a hypothetical large but polynomial algorithm for the Boolean satisfiability problem, although unusable in practice, would settle the P versus NP problem, considered the most important open problem in computer science and one of the Millennium Prize Problems. Examples Integer multiplication An example of a galactic algorithm is the fastest known way to multiply two numbers, which is based on a 1729-dimensional Fourier transform. It needs O(n log n) bit operations, but as the constants hidden by the big O notation are large, it is never used in practice. However, it also shows why galactic algorithms may still be useful. The authors state: "we are hopeful that with further refinements, the algorithm might become practical for numbers with merely billions or trillions of digits." Matrix multiplication The first improvement over brute-force matrix multiplication (which needs n^3 multiplications) was the Strassen algorithm: a recursive algorithm that needs O(n^2.807) multiplications. This algorithm is not galactic and is used in practice. Further extensions of this, using sophisticated group theory, are the Coppers
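For contrast with the galactic examples, here is a minimal sketch of Strassen's seven-product recursion mentioned above; the implementation, the power-of-two size restriction, and the cutoff value are our own illustrative choices.

```python
# Strassen's algorithm: multiply two square matrices (size a power of two)
# using 7 recursive block products instead of 8, giving O(n^2.807) work.
import numpy as np

def strassen(A, B):
    n = A.shape[0]
    if n <= 64:                       # arbitrary cutoff: fall back to ordinary product
        return A @ B
    k = n // 2
    A11, A12, A21, A22 = A[:k, :k], A[:k, k:], A[k:, :k], A[k:, k:]
    B11, B12, B21, B22 = B[:k, :k], B[:k, k:], B[k:, :k], B[k:, k:]
    M1 = strassen(A11 + A22, B11 + B22)
    M2 = strassen(A21 + A22, B11)
    M3 = strassen(A11, B12 - B22)
    M4 = strassen(A22, B21 - B11)
    M5 = strassen(A11 + A12, B22)
    M6 = strassen(A21 - A11, B11 + B12)
    M7 = strassen(A12 - A22, B21 + B22)
    return np.block([[M1 + M4 - M5 + M7, M3 + M5],
                     [M2 + M4,           M1 - M2 + M3 + M6]])

rng = np.random.default_rng(0)
A = rng.integers(0, 5, (128, 128))
B = rng.integers(0, 5, (128, 128))
assert np.allclose(strassen(A, B), A @ B)   # agrees with the direct product
```

The cutoff is the practical point: below it the recursion's bookkeeping costs more than the multiplications it saves, which is the same crossover phenomenon that, pushed to astronomical sizes, makes an algorithm galactic.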
https://en.wikipedia.org/wiki/Coral%20of%20life
The coral of life is a metaphor or a mathematical model useful to illustrate the evolution of life or phylogeny at various levels of resolution, including individual organisms, populations, species and large taxonomic groups. Its use in biology resolves several practical and conceptual difficulties that are associated with the tree of life. History of the concept In biological context, the 'coral of life' as a metaphor is almost as old as the 'tree of life'. After returning from his voyage around the world, Darwin suggested in his notebooks that: with obvious reference to branching corals whose dead colonies may form very thick deposits in the ocean (representing past life) with live animals occurring only on the top (recent life). This comment was illustrated by two simple diagrams, the first coral metaphors of evolution ever drawn in the history of biology. However, Darwin later abandoned his idea, and in the Origin of Species he referred to the tree of life as the most appropriate means to summarize affinities of living organisms, thanks most likely to the obvious connotations of this metaphor with religion, ancient and folk art, and mythology. Darwin's early musing was rediscovered by several authors more than a century later; graphical schemes as simple heuristics were drawn again early this century, and corals were raised to the level of mathematically defined objects even more recently. Structure The picture to the right explains the different parts of a coral. The vertical axis is time; the horizontal axis may be richness, morphological diversity, some other population measure, or even scaled arbitrarily. Each point x in the diagram corresponds to an individual, a population or a taxon. At a point of time h, there is an equivalence partition of points into classes C. A class Cg1 is the ancestor of the entire branch above it, and Cr1 is the closest common ancestor class of two segments. A segment S is defined by two classes such that there is no branching between them. On t
https://en.wikipedia.org/wiki/SMPTE%202110
SMPTE 2110 is a suite of standards from the Society of Motion Picture and Television Engineers (SMPTE) that describes how to send digital media over an IP network. SMPTE 2110 is intended to be used within broadcast production and distribution facilities where quality and flexibility are more important than bandwidth efficiency. History SMPTE 2110 was based on the TR-03 and TR-04 work published by the Video Services Forum on 12 November 2015. The first four parts of SMPTE 2110, -10, -20, -21 and -30, were published by SMPTE on 27 November 2017. Standard SMPTE 2110 is specified in several parts: ST 2110-10 - System architecture and synchronization: essences, RTP, SDP and PTP ST 2110-20 - Uncompressed video transport, based on SMPTE 2022-6 ST 2110-21 - Traffic shaping and network delivery timing ST 2110-22 - Constant Bit-Rate Compressed Video transport ST 2110-30 - Audio transport, based on AES67 ST 2110-31 - Transport of AES3 formatted audio ST 2110-40 - Transport of ancillary data ST 2110-43 - Transport of Timed Text Markup Language for captions and subtitles in systems conforming to SMPTE ST 2110-10. ST 2110-10: System architecture and synchronization There are several important features of ST 2110-10: Individual audio, video and ancillary data tracks or clips are carried as separate individual streams. These streams are referred to as "essences", e.g., a 5.1 JPEG mp4 clip could have 9 essences: a video essence, 6 separate audio essences, and two closed caption essences, English and Chinese. Real-time Transport Protocol (RTP) is used to transmit streaming essences. Session Initiation Protocol (SIP) is used to manage the connection and distribution of RTP streams including IP multicast one-to-many distribution. Precision Time Protocol (PTP) provides global microsecond accuracy timing of all essences. Synchronization is based on SMPTE 2059. ST 2110-20: Uncompressed video transport SMPTE 2110-20 defines the key requirements for transporting uncompressed vid
https://en.wikipedia.org/wiki/Digital%20ion%20trap
The digital ion trap (DIT) is a quadrupole ion trap driven by digital signals, typically in a rectangular waveform, generated by switching rapidly between discrete DC voltage levels. The digital ion trap has been mainly developed as a mass analyzer. History A digital ion trap (DIT) is an ion trap having a trapping waveform generated by the rapid switching between discrete high-voltage levels. The timing of the high-voltage switch is controlled precisely with digital electronic circuitry. Ion motion in a quadrupole ion trap driven by a rectangular wave signal was theoretically studied in the 1970s by E. P. Sheretov and J. A. Richards. Sheretov also implemented the pulsed waveform drive for the quadrupole ion trap working in mass-selective instability mode, although no resonance excitation/ejection was used. The idea was substantially revisited by L. Ding and S. Kumashiro in 1999, where the ion stability in the rectangular wave quadrupole field was mapped in the Mathieu space a-q coordinate system, with the parameters a and q having the same definition as the Mathieu parameters normally used in dealing with a sinusoidal RF driven quadrupole field. The secular frequency dependence on the a, q parameters was also derived, thus laying the foundation for many modern ion trap operation modes based on resonance excitation. Also, in 1999, Peter T. A. Reilly began trapping and subsequently ablating and mass analyzing the product ions from nanoparticles obtained from car exhaust with a primitive hybrid square wave/sine wave driven 3D ion trap. In 2001 Reilly attended the 49th American Society for Mass Spectrometry (ASMS) Conference on Mass Spectrometry and Applied Topics, where he presented his nanoparticle mass analysis work and met Li Ding for the first time. Reilly suggested to Ding at that time that they should focus the DIT on analysis in the high mass range where other instruments could not compete. However, work published by Ding and Shimadzu over the years followin
https://en.wikipedia.org/wiki/Toons%20Mag
Toons Mag is a cartoon magazine that offers a global online platform for publishing editorial cartoons, comics, caricatures, illustrations, and related news. It is a multilingual publication and organizer of an international cartoon contest and exhibitions. It was founded in 2009 by cartoonist Arifur Rahman, based in Drøbak, Norway. Founder and history In 2007, cartoonist Arifur Rahman started drawing for a Bangladeshi satirical magazine called Alpin, a humorous supplement published by Prothom Alo. In Alpin, one of his cartoons made a joke about adding Mohammad to the beginning of a person's name; the cartoon culminates in a young boy introducing his cat as Mohammad Cat. The cartoon, which was published during the Islamic holiday of Ramadan, ignited protests across Bangladesh and led to Arifur Rahman's arrest. On 18 September 2007, Alpin was banned permanently and the editor of Alpin was suspended. Bangladeshi newspaper editors decided that they would never publish his cartoons again. In prison, Arifur Rahman wished that when he was freed, he would be able to start a magazine like Alpin. On 20 March 2008, after six months and two days in prison for "hurting religious sentiments," he was freed but found himself unable to publish his work. One year later, some newspapers published his cartoons, but all of them were published anonymously. Arifur Rahman was not happy publishing his cartoons anonymously; he always wished to publish them under his own name. He tried to start a printed cartoon magazine but did not have enough money, so he decided to publish on the internet, a cheap and easy way to reach a global audience. In 2009, he started Toons Mag, an online cartoon magazine. Award In 2015, Toons Mag won the people's choice category of Deutsche Welle's Best of Online Activism Awards in Germany. Toons Mag promotes freedom of expression through cartoons, comics, caricatures and articles. It is published in English, Bengali, Arabic, Spanish
https://en.wikipedia.org/wiki/Enumeration%20algorithm
In computer science, an enumeration algorithm is an algorithm that enumerates the answers to a computational problem. Formally, such an algorithm applies to problems that take an input and produce a list of solutions, similarly to function problems. For each input, the enumeration algorithm must produce the list of all solutions, without duplicates, and then halt. The performance of an enumeration algorithm is measured in terms of the time required to produce the solutions: either the total time required to produce all solutions, or the maximal delay between two consecutive solutions together with a preprocessing time, counted as the time before the first solution is output. This complexity can be expressed in terms of the size of the input, the size of each individual output, or the total size of the set of all outputs, similarly to what is done with output-sensitive algorithms. Formal definitions An enumeration problem P is defined as a relation R over strings of an arbitrary alphabet Σ. An algorithm solves P if for every input x the algorithm produces the (possibly infinite) sequence z_1, z_2, ... such that the sequence has no duplicates and a string z occurs in it if and only if the pair (x, z) is in R. The algorithm should halt if the sequence is finite. Common complexity classes Enumeration problems have been studied in the context of computational complexity theory, and several complexity classes have been introduced for such problems. A very general such class is EnumP, the class of problems for which the correctness of a possible output can be checked in polynomial time in the input and output. Formally, for such a problem, there must exist an algorithm A which takes as input the problem input x, the candidate output y, and solves the decision problem of whether y is a correct output for the input x, in polynomial time in x and y. For instance, this class contains all problems that amount to enumerating the witnesses of a problem in the class NP. Other classes that have been defined include the f
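The delay-based measure described above is easy to see in a small Python generator. Subset enumeration is our illustrative choice of problem, not one named in the article.

```python
# Enumerate all subsets of `items`, each exactly once, with delay O(n) between
# consecutive outputs: between two yields the recursion moves up or down at
# most n levels of the binary include/exclude tree.
def enumerate_subsets(items):
    n = len(items)

    def rec(i, chosen):
        if i == n:
            yield list(chosen)                 # one solution; no duplicates arise
            return
        yield from rec(i + 1, chosen)          # branch 1: exclude items[i]
        chosen.append(items[i])
        yield from rec(i + 1, chosen)          # branch 2: include items[i]
        chosen.pop()

    yield from rec(0, [])

for s in enumerate_subsets(["a", "b", "c"]):
    print(s)    # 8 subsets printed one by one; the generator then halts
```

Here the preprocessing time is negligible and the delay is polynomial in the input size even though the total output (2^n subsets) is exponential, which is exactly the situation the delay measure is designed to capture.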
https://en.wikipedia.org/wiki/PLUMED
PLUMED is an open-source library implementing enhanced-sampling algorithms, various free-energy methods, and analysis tools for molecular dynamics simulations. It is designed to be used together with ACEMD, AMBER, DL_POLY, GROMACS, LAMMPS, NAMD, OpenMM, ABIN, CP2K, i-PI, PINY-MD, and Quantum ESPRESSO, but it can also be used together with the analysis and visualization tools VMD, HTMD, and OpenPathSampling. In addition, PLUMED can be used as a standalone tool for analysis of molecular dynamics trajectories. A graphical user interface named METAGUI is available. Collective variables PLUMED offers a large collection of collective variables that serve as descriptions of complex processes that occur during molecular dynamics simulations, for example angles, positions, distances, interaction energies, and total energy. References External links METAGUI Molecular dynamics software Computational biology Free software programmed in C++ Free and open-source software
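As an illustration of how a collective variable is declared, here is a minimal PLUMED input, written out from Python. The atom indices and file names are arbitrary examples; DISTANCE and PRINT are standard PLUMED actions, but this is a sketch rather than a recipe from the article.

```python
# Write a minimal PLUMED input that monitors one distance collective variable.
plumed_input = """\
d1: DISTANCE ATOMS=1,10          # distance between atoms 1 and 10 as a CV
PRINT ARG=d1 FILE=COLVAR STRIDE=100
"""
with open("plumed.dat", "w") as f:
    f.write(plumed_input)
# An MD engine patched with PLUMED (e.g. GROMACS) is then pointed at
# plumed.dat, and the CV values appear in the COLVAR file every 100 steps.
```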
https://en.wikipedia.org/wiki/Nintendo%20Gateway%20System
The Nintendo Gateway System is a version of the Super Nintendo Entertainment System, Game Boy, Game Boy Color, Game Boy Advance, Nintendo 64, or GameCube that was installed on some Northwest, Singapore Airlines, Air China, Air Canada, Alitalia-Linee Aeree Italiane, All Nippon Airways, British Midland International, Kuwait Airways, Malaysia Airlines, Thai Airways, and Virgin Atlantic passenger aircraft, as well as certain hotels with LodgeNet, NXTV, or Quadriga, from late 1993 up until the late 2000s. It was a series of video game consoles rather than a single console, specialized for airlines and hotels, featured in about 40,000 airline seats and 955,000 hotel rooms. It was one of the first in-seat airline entertainment services, provided by Matsushita Avionics, Rockwell Collins, and Thales Avionics. Its official website was discontinued in mid-2008, but units have been seen as late as 2013 for Nintendo 64 in hotels, and as late as 2012 for Game Boy and Game Boy Color on Singapore Airlines. It was part of a much larger computer system that allowed air passengers to not only play video games, but also watch movies and shows, listen to music, talk on the phone, and even shop while in-flight, before the rise of the internet. Upon its release, there were 10 games installed in the system, which included The Legend of Zelda: A Link to the Past, F-Zero and Super Mario World. Future plans for the system were to have it installed in hotels and cruise ships as well. The controller, or remote, for the airline version of the Gateway System had a button setup similar to the Super NES controller. It also doubled as a remote for the movies and music aspect of the system. Hotels had modified versions of the original console controllers. LodgeNet was the most widespread pay-per-view system for hotels that used it. LodgeNet partnered with Nintendo to bring video games directly into guest hotel rooms through streaming over the LodgeNet server, with the special LodgeNet control
https://en.wikipedia.org/wiki/Jarman%E2%80%93Bell%20principle
The Jarman–Bell principle is a concept in ecology that the food quality of a herbivore's intake decreases as the size of the herbivore increases, while the amount of food ingested increases to counteract the lower quality. It operates by observing the allometric (non-linear scaling) properties of herbivores. The principle was coined by P. J. Jarman (1968) and R. H. V. Bell (1971). Large herbivores can subsist on low-quality food. Their gut size is larger than that of smaller herbivores. The increased size allows for better digestive efficiency, and thus allows viable consumption of low-quality food. Small herbivores require more energy per unit of body mass compared to large herbivores. Their smaller size, and thus smaller gut size and lower efficiency, implies that these animals need to select high-quality food to function. Their small gut limits the amount of space for food, so they eat small quantities of a high-quality diet. Some animals practice coprophagy, where they ingest fecal matter to recycle untapped or undigested nutrients. However, the Jarman–Bell principle is not without exception. Small herbivorous members of mammals, birds and reptiles have been observed to be inconsistent with the trend of small body mass being linked with high-quality food. There have also been disputes over the mechanism behind the Jarman–Bell principle, with some arguing that larger body size does not increase digestive efficiency. The fact that larger herbivores can ably subsist on poorer-quality food than smaller herbivores means that the Jarman–Bell principle may contribute evidence for Cope's rule. Furthermore, the Jarman–Bell principle is also important in providing evidence for the ecological framework of "resource partitioning, competition, habitat use and species packing in environments" and has been applied in several studies. Links with allometry Allometry refers to the non-linear scaling of one variable with respect to another. The relationship between such variables is expressed as a power law, wher
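The power-law reasoning the truncated passage is building toward can be made explicit. The worked scaling argument below assumes Kleiber's 3/4 metabolic exponent and linear gut-capacity scaling, which are common assumptions in this literature rather than claims made in the excerpt itself.

```latex
% Energy requirement E and gut capacity C_gut as power laws of body mass M:
E \propto M^{3/4} \quad\text{(Kleiber's law)}, \qquad
C_{\mathrm{gut}} \propto M^{1},
\qquad\Longrightarrow\qquad
\frac{C_{\mathrm{gut}}}{E} \propto M^{1-3/4} = M^{1/4}.
% Gut capacity per unit metabolic need rises with mass, so larger herbivores
% can retain food longer and tolerate lower-quality, more abundant diets.
```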
https://en.wikipedia.org/wiki/Major%20irrigation%20project
Major irrigation project is a classification of irrigation projects used in India. A project with a cultivable command area of more than 10000 hectares is classified as a major irrigation project. Before the Fifth Five-Year Plan, irrigation schemes were classified on the basis of investments needed to implement the scheme. Since the Fifth Five-Year Plan, India has adopted the command area-based system of classification. References Irrigation projects Irrigation in India
https://en.wikipedia.org/wiki/Lightweight%20software
In computing, lightweight software, also called a lightweight program or lightweight application, is a computer program designed to have a small memory footprint (RAM usage) and low CPU usage: overall, a low usage of system resources. To achieve this, the software should avoid software bloat and code bloat and aim for good algorithmic efficiency. See also Software optimization Application footprint Light-weight process Lightweight protocol Lightweight Procedure Call Lightweight programming language Lightweight markup language Load (computing) References Software optimization
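As a hedged illustration of what a "small memory footprint" means in practice (my example, not from the article), the sketch below uses Python's standard tracemalloc module to compare the peak memory of materialising a full list against streaming the same values through a generator:

```python
import tracemalloc

def peak_kib(fn):
    """Run fn and return peak allocated memory in KiB."""
    tracemalloc.start()
    fn()
    _, peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    return peak / 1024

# Heavyweight: builds the whole list in memory before summing.
heavy = lambda: sum([i * i for i in range(100_000)])
# Lightweight: streams values one at a time through a generator.
light = lambda: sum(i * i for i in range(100_000))

print(f"list comprehension peak: {peak_kib(heavy):8.1f} KiB")
print(f"generator peak:          {peak_kib(light):8.1f} KiB")
```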
https://en.wikipedia.org/wiki/HDR10%2B
HDR10+ is a high dynamic range (HDR) video technology that adds dynamic metadata to HDR10 source files. The dynamic metadata are used to adjust and optimize each frame of the HDR video to the consumer display's capabilities in a way based on the content creator's intentions. HDR10+ is an alternative to Dolby Vision, which also uses dynamic metadata. HDR10+ is the default variant of dynamic metadata as part of the HDMI 2.1 standard. HDR10+ Adaptive is an update designed to optimize HDR10+ content according to the ambient light. Description HDR10+, also known as HDR10 Plus, was announced on 20 April 2017 by Samsung and Amazon Video. HDR10+ updates HDR10 by adding dynamic metadata that can be used to more accurately adjust brightness levels up to the full range of PQ code values (10,000 nits maximum brightness) on a scene-by-scene or frame-by-frame basis. The technology is standardized and defined in SMPTE ST 2094-40. HDR10+ is an open standard and is royalty-free; it is supported by a growing list of post-production software and tools. A certification and logo program for HDR10+ device manufacturers is available with an annual administration fee for certain adopter categories and no per-unit royalty. Authorized test centers conduct certification testing for HDR10+ devices. On 28 August 2017, Samsung, Panasonic, and 20th Century Fox created HDR10+ Technologies, LLC to promote the HDR10+ standard. HDR10+ video started being offered by Amazon Video on 13 December 2017. On 5 January 2018, Warner Bros. announced their support for the HDR10+ standard. On 6 January 2018, Panasonic announced Ultra HD Blu-ray players with support for HDR10+. On 4 April 2019, Universal Pictures Home Entertainment announced a technology collaboration with Samsung Electronics to release new titles mastered with HDR10+. It is considered to have most of the advantages of Dolby Vision over HDR10, as well as being royalty-free. HDR10+ signals the dynamic range and scene characteristics on a
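Illustrative example (my addition): the PQ transfer function mentioned above is standardized in SMPTE ST 2084, and its published EOTF constants can be sketched in a few lines of Python to show the "10,000 nits maximum" code range; the function name and structure here are mine, not from any HDR10+ specification.

```python
# PQ (SMPTE ST 2084) EOTF: maps a non-linear code value in [0, 1]
# to absolute display luminance in nits (cd/m^2), up to 10,000 nits.
M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(code: float) -> float:
    """Decode a PQ code value (0..1) to luminance in nits."""
    e = code ** (1 / M2)
    y = max(e - C1, 0.0) / (C2 - C3 * e)
    return 10_000.0 * y ** (1 / M1)

for v in (0.0, 0.5, 0.75, 1.0):
    print(f"code {v:4.2f} -> {pq_eotf(v):9.2f} nits")  # code 1.00 -> 10000 nits
```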
https://en.wikipedia.org/wiki/Sensorvault
Sensorvault is an internal Google database that contains records of users' historical geo-location data. It has been used by law enforcement executing geo-fence warrants: investigators search for all devices that were within the vicinity of a crime (within a geo-fenced area), examine those devices' movements to narrow them down to potential suspects or witnesses, and then ask Google for information about the owners of those devices. References Internet privacy Google Geographical databases
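For illustration only (my sketch; the record layout, radius and device names are invented and bear no relation to Google's actual systems), a geo-fence query of this kind amounts to filtering location records by a spatial radius and a time window:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in metres."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 6_371_000 * 2 * asin(sqrt(a))

# Hypothetical location records: (device_id, lat, lon, unix_time).
records = [
    ("dev-a", 42.3601, -71.0589, 1_600_000_100),
    ("dev-b", 42.3605, -71.0580, 1_600_000_200),
    ("dev-c", 42.4000, -71.1000, 1_600_000_150),
]

def geofence(records, lat, lon, radius_m, t0, t1):
    """Return device ids seen inside the fence during [t0, t1]."""
    return {
        dev for dev, la, lo, t in records
        if t0 <= t <= t1 and haversine_m(lat, lon, la, lo) <= radius_m
    }

print(geofence(records, 42.3601, -71.0589, 200, 1_600_000_000, 1_600_000_300))
# -> {'dev-a', 'dev-b'}  (dev-c is several kilometres away)
```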
https://en.wikipedia.org/wiki/IBM%20Cloud
IBM Cloud (formerly known as Bluemix) is a set of cloud computing services for business offered by the information technology company IBM. Services As of 2021, IBM Cloud contains more than 170 services including compute, storage, networking, database, analytics, machine learning, and developer tools. History SoftLayer SoftLayer Technologies, Inc. (now IBM Cloud) was a dedicated server, managed hosting, and cloud computing provider, founded in 2005 and acquired by IBM in 2013. SoftLayer initially specialized in hosting workloads for gaming companies and startups, but shifted focus to enterprise workloads after its acquisition. SoftLayer had bare-metal compute offerings before other large cloud providers such as Amazon Web Services. SoftLayer has hosted workloads for companies such as The Hartford, WhatsApp, Whirlpool, Daimler, and Macy's. Timeline 2005: SoftLayer was established by Lance Crosby and several of his ex-coworkers. August 2010: GI Partners acquired a majority equity stake in SoftLayer. November 2010: GI Partners merged the company with The Planet Internet Services, SoftLayer's biggest competitor, and consolidated the customer base under the SoftLayer brand. Q1 2011: The company reported hosting more than 81,000 servers for more than 26,000 customers in locations throughout the United States. July 2011: The company announced plans for international expansion to Amsterdam and Singapore to add to the existing network of North American-based data centers in Dallas (Texas), San Jose (California), Seattle (Washington), Santiago de Querétaro (Mexico), Houston (Texas) and Washington, D.C. Most of these data centers were leased via Digital Realty. June 4, 2013: IBM announced its acquisition of SoftLayer under undisclosed financial terms, in a deal that according to Reuters could have fetched more than $2 billion, to form an IBM Clo
https://en.wikipedia.org/wiki/Organ%20futures
Organ futures is the short term used in academic proposals for futures contracts on organs from human cadavers. They are not legal anywhere at this time. Organ futures would be used as an economic means to encourage organ donation by compensating transplant organ donors. Financial futures contracts are essentially agreements to pay a specified sum at a specified time. The four key academic papers describing proposals for organ futures were published between the mid-1980s and mid-1990s. Proposals The explanations below focus on donation of cadaveric organs and ignore living donation. Proposal by Schwindt, Vining 1986 Schwindt & Vining (1986) suggest that the organ donor is paid at the time they agree to enter the life-time futures contract. The agreement is mutually revocable. They propose a single government broker as the buyer. Organ recipients would pay the supply price plus a load factor to the broker. Proposal by Hansmann 1989 Hansmann (1989) also suggests payment at the time of contract. Instead of direct payment, he proposes reductions to health insurance premiums as indirect incentive. The hospital where the organ donor dies is expected to verify a seller registry and determine the buyer. Buyers may be health insurance providers or specialist traders. Proposal by Cohen 1989 Cohen (1989) introduces a significant change to previous proposals by making payment conditional on organ extraction. Thus, the donor is not directly compensated during their lifetime. However, the payment is allocated to their estate or a designee. Hospitals are expected to notify buyers and preserve cadavers. They can be made liable for consequences of negligence. Buyers may be public or private organizations. Proposal by Crespi 1994 Crespi (1994) aims to integrate what he deems the most useful aspects of previous models into his own. Payment would be either guaranteed upon death or dependent on organ extraction. The money would go to the seller's estate; rights would not be
https://en.wikipedia.org/wiki/RF%20CMOS
RF CMOS is a metal–oxide–semiconductor (MOS) integrated circuit (IC) technology that integrates radio-frequency (RF), analog and digital electronics on a mixed-signal CMOS (complementary MOS) RF circuit chip. It is widely used in modern wireless telecommunications, such as cellular networks, Bluetooth, Wi-Fi, GPS receivers, broadcasting, vehicular communication systems, and the radio transceivers in all modern mobile phones and wireless networking devices. RF CMOS technology was pioneered by Pakistani engineer Asad Ali Abidi at UCLA during the late 1980s to early 1990s, and helped bring about the wireless revolution with the introduction of digital signal processing in wireless communications. The development and design of RF CMOS devices was enabled by van der Ziel's FET RF noise model, which was published in the early 1960s and remained largely forgotten until the 1990s. History Pakistani engineer Asad Ali Abidi, while working at Bell Labs and then UCLA during the 1980s–1990s, pioneered radio research in metal–oxide–semiconductor (MOS) technology and made seminal contributions to radio architecture based on complementary MOS (CMOS) switched-capacitor (SC) technology. In the early 1980s, while working at Bell, he worked on the development of sub-micron MOSFET (MOS field-effect transistor) VLSI (very large-scale integration) technology, and demonstrated the potential of sub-micron NMOS integrated circuit (IC) technology in high-speed communication circuits. Abidi's work was initially met with skepticism from proponents of GaAs and bipolar junction transistors, the dominant technologies for high-speed communication circuits at the time. In 1985 he joined the University of California, Los Angeles (UCLA), where he pioneered RF CMOS technology during the late 1980s to early 1990s. His work changed the way in which RF circuits would be designed, away from discrete bipolar transistors and towards CMOS integrated circuits. Abidi was researching analog CMOS circuits for s
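For reference (my addition, hedged): the FET channel thermal-noise model attributed to van der Ziel is usually quoted in the following textbook form; the notation below is the standard one and is assumed rather than taken from the article.

```latex
% van der Ziel's model for FET channel thermal noise:
% drain-current noise power in a bandwidth \Delta f
\[
  \overline{i_{d}^{2}} = 4 k_{B} T \gamma g_{d0} \, \Delta f ,
\]
% where k_B is Boltzmann's constant, T the absolute temperature,
% g_{d0} the drain-source conductance at zero drain bias, and
% \gamma a bias-dependent factor equal to 2/3 for long-channel
% devices in saturation.
```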
https://en.wikipedia.org/wiki/Virtual%20Machine%20Communication%20Facility
The IBM Virtual Machine Communication Facility (VMCF) is a feature of the VM/370 operating system introduced in Release 3 in 1976. It "provides a method of communication and data transfer between virtual machines operating under the same VM/370 system." VMCF uses paravirtualization through the diagnose instruction: the VMCF SEND function sends data, in blocks of up to 2048 bytes, from one virtual machine to another, and the receiving virtual machine accesses the data through the diagnose RECEIVE function. It provides a simpler interface and greater performance than the prior use of virtual channel-to-channel adapters for the same purpose. VMCF was superseded by the Inter User Communication Vehicle (IUCV), introduced in 1980 with VM/SP. References IBM mainframe operating systems VM (operating system) Virtualization
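As a loose, hypothetical illustration only (VMCF itself is invoked through the S/370 DIAGNOSE instruction, which cannot be reproduced here; the class and names below are invented), the SEND/RECEIVE handshake and the 2048-byte block limit can be modelled with an in-process "hypervisor" mailbox:

```python
BLOCK_LIMIT = 2048  # VMCF transferred data in blocks of up to 2048 bytes

class Hypervisor:
    """Toy stand-in for the VM/370 control program mediating transfers."""
    def __init__(self):
        self.mailboxes = {}  # target VM name -> list of pending blocks

    def send(self, target: str, data: bytes):
        # Split the payload into blocks no larger than the VMCF limit.
        for i in range(0, len(data), BLOCK_LIMIT):
            self.mailboxes.setdefault(target, []).append(data[i:i + BLOCK_LIMIT])

    def receive(self, vm: str) -> bytes:
        # The receiving virtual machine drains its pending blocks.
        return b"".join(self.mailboxes.pop(vm, []))

cp = Hypervisor()
cp.send("VMUSER2", b"A" * 5000)       # sender issues SEND
print(len(cp.receive("VMUSER2")))     # receiver issues RECEIVE -> 5000
```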
https://en.wikipedia.org/wiki/IEEE%20Transactions%20on%20Circuits%20and%20Systems%20I%3A%20Regular%20Papers
IEEE Transactions on Circuits and Systems I: Regular Papers (sometimes abbreviated IEEE TCAS-I) is a monthly peer-reviewed scientific journal covering the theory, analysis, design, and practical implementations of electrical and electronic circuits, and the application of circuit techniques to systems and to signal processing. It is published by the IEEE Circuits and Systems Society. The journal was established in 1952 and the editor-in-chief is Weisheng Zhao (Beihang University). According to the Journal Citation Reports, the 2020 impact factor of the journal is 3.605. Title history Adapted from IEEE Xplore. 1992–2003: IEEE Transactions on Circuits and Systems I: Fundamental Theory and Applications 1974–1991: IEEE Transactions on Circuits and Systems 1963–1973: IEEE Transactions on Circuit Theory 1954–1962: IRE Transactions on Circuit Theory 1952–1954: Transactions of the IRE Professional Group on Circuit Theory Editors-in-chief The following people are or have been editor-in-chief: References External links Transactions on Circuits and Systems I: Regular Papers Academic journals established in 1952 English-language journals
https://en.wikipedia.org/wiki/Calculus%20on%20finite%20weighted%20graphs
In mathematics, calculus on finite weighted graphs is a discrete calculus for functions whose domain is the vertex set of a graph with a finite number of vertices and weights associated to the edges. This involves formulating discrete operators on graphs which are analogous to differential operators in calculus, such as graph Laplacians (or discrete Laplace operators) as discrete versions of the Laplacian, and using these operators to formulate differential equations, difference equations, or variational models on graphs which can be interpreted as discrete versions of partial differential equations or continuum variational models. Such equations and models are important tools to mathematically model, analyze, and process discrete information in many different research fields, e.g., image processing, machine learning, and network analysis. In applications, finite weighted graphs represent a finite number of entities by the graph's vertices, any pairwise relationships between these entities by graph edges, and the significance of a relationship by an edge weight function. Differential equations or difference equations on such graphs can be employed to leverage the graph's structure for tasks such as image segmentation (where the vertices represent pixels and the weighted edges encode pixel similarity based on comparisons of Moore neighborhoods or larger windows), data clustering, data classification, or community detection in a social network (where the vertices represent users of the network, the edges represent links between users, and the weight function indicates the strength of interactions between users). The main advantage of finite weighted graphs is that by not being restricted to highly regular structures such as discrete regular grids, lattice graphs, or meshes, they can be applied to represent abstract data with irregular interrelationships. If a finite weighted graph is geometrically embedded in a Euclidean space, i.e., the graph vertices represent p
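To make the discrete operators concrete, here is a small sketch (mine, not from the article) that assembles the unnormalised graph Laplacian L = D − W of a finite weighted graph with NumPy and applies it to a function on the vertices:

```python
import numpy as np

# Symmetric edge-weight matrix W of a 4-vertex weighted graph
# (w_ij > 0 iff vertices i and j share an edge).
W = np.array([
    [0.0, 2.0, 0.0, 1.0],
    [2.0, 0.0, 3.0, 0.0],
    [0.0, 3.0, 0.0, 4.0],
    [1.0, 0.0, 4.0, 0.0],
])

D = np.diag(W.sum(axis=1))   # degree matrix
L = D - W                    # unnormalised graph Laplacian

f = np.array([1.0, 0.0, 2.0, -1.0])   # a function on the vertex set

# (L f)(i) = sum_j w_ij * (f(i) - f(j)): a discrete analogue of -Laplacian.
print(L @ f)
# Constant functions lie in the kernel of L, as for the continuum Laplacian:
print(L @ np.ones(4))   # -> [0. 0. 0. 0.]
```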
https://en.wikipedia.org/wiki/Transcriptional%20memory
Transcriptional memory is a biological phenomenon, initially discovered in yeast, during which cells primed with a particular cue show increased rates of gene expression after re-stimulation at a later time. This event was shown to take place in yeast during growth in galactose and during inositol starvation; in plants during environmental stress; and in mammalian cells during LPS and interferon induction. Prior work has shown that certain characteristics of chromatin may contribute to the poised transcriptional state allowing faster re-induction. These include: activity of specific transcription factors, retention of RNA polymerase II at the promoters of poised genes, activity of chromatin remodeling complexes, propagation of H3K4me2 and H3K36me3 histone modifications, occupancy of the H3.3 histone variant, as well as binding of nuclear pore components. Moreover, locally bound cohesin was shown to inhibit the establishment of transcriptional memory in human cells during interferon gamma stimulation. References Cells Microbiology Gene therapy
https://en.wikipedia.org/wiki/List%20of%20biosafety%20level%204%20organisms
Biosafety level 4 (BSL-4) organisms are dangerous or exotic agents which pose high risk of life-threatening disease, aerosol-transmitted lab infections, or related agents with unknown risk of transmission. US federal biocontainment regulations Biosafety level 4 laboratories are designed for diagnostic work and research on easily respiratory-acquired viruses which can often cause severe and/or fatal disease. What follows is a list of select agents that have specific biocontainment requirements according to US federal law. Organisms include those harmful to human health, or to animal health. The Plant Protection and Quarantine programs (PPQ) of the Animal and Plant Health Inspection Service (APHIS) are listed in 7 CFR Part 331. The Department of Health and Human Services (HHS) lists are located at 42 CFR Part 73.3 and 42 CFR Part 73.4. The USDA animal safety list is located at 9 CFR Subchapter B. Not all select agents require BSL-4 handling, namely select bacteria and toxins, but most select agent viruses do (with the notable exception of SARS-CoV-1 which can be handled in BSL3). Many non-select agent viruses are often handled in BSL-4 according to facility SOPs or when dealing with new viruses closely related to viruses that require BSL-4. For instance, Andes orthohantavirus and MERS-CoV are both non-select agents that are often handled in BSL-4 because they cause severe and fatal disease in humans. Newly characterized viruses closely related to select agents and/or BSL-4 viruses (for example newly discovered henipaviruses or ebolaviruses) are typically handled in BSL-4 even if they aren't yet known to be readily transmissible or cause severe disease. International BSL-4 regulations Globally, there are no official agreements on what agents must be handled in BSL-4. However, select agents and toxins originating or ending in US BSL-4 labs must adhere to US select agent laws. Select agents HHS human threats: select agents and toxins Crimean-Congo hemorrhagic fe
https://en.wikipedia.org/wiki/Nonlocal%20operator
In mathematics, a nonlocal operator is a mapping which maps functions on a topological space to functions, in such a way that the value of the output function at a given point cannot be determined solely from the values of the input function in any neighbourhood of any point. An example of a nonlocal operator is the Fourier transform. Formal definition Let $X$ be a topological space, $Y$ a set, $F(X)$ a function space containing functions with domain $X$, and $G(Y)$ a function space containing functions with domain $Y$. Two functions $u$ and $v$ in $F(X)$ are called equivalent at $x \in X$ if there exists a neighbourhood $N$ of $x$ such that $u(x') = v(x')$ for all $x' \in N$. An operator $A : F(X) \to G(Y)$ is said to be local if for every $y \in Y$ there exists an $x \in X$ such that $Au(y) = Av(y)$ for all functions $u$ and $v$ in $F(X)$ which are equivalent at $x$. A nonlocal operator is an operator which is not local. For a local operator it is possible (in principle) to compute the value $Au(y)$ using only knowledge of the values of $u$ in an arbitrarily small neighbourhood of a point $x$. For a nonlocal operator this is not possible. Examples Differential operators are examples of local operators. A large class of (linear) nonlocal operators is given by the integral transforms, such as the Fourier transform and the Laplace transform. For an integral transform of the form $(Au)(y) = \int K(x, y)\, u(x)\, \mathrm{d}x$, where $K$ is some kernel function, it is necessary to know the values of $u$ almost everywhere on the support of $K(\cdot, y)$ in order to compute the value of $Au$ at $y$. An example of a singular integral operator is the fractional Laplacian $(-\Delta)^{s} f(x) = c_{d,s} \,\mathrm{P.V.} \int_{\mathbb{R}^{d}} \frac{f(x) - f(y)}{|x - y|^{d + 2s}} \, \mathrm{d}y.$ The prefactor $c_{d,s}$ involves the Gamma function and serves as a normalizing factor. The fractional Laplacian plays a role in, for example, the study of nonlocal minimal surfaces. Applications Some examples of applications of nonlocal operators are: Time series analysis using Fourier transformations Analysis of dynamical systems using Laplace transformations Image denoising using non-local means Modelling Gaussian blur or motion blur in images using convolution with a blurring kernel or point spread function See also Fraction
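A quick numerical illustration (my addition): the Fourier transform is nonlocal because perturbing the input far from a point changes the transform everywhere. The NumPy sketch below perturbs a signal in one small region and shows that the discrete Fourier transform changes at essentially every frequency:

```python
import numpy as np

n = 256
x = np.linspace(0, 2 * np.pi, n, endpoint=False)
u = np.sin(3 * x)                 # original signal

v = u.copy()
v[200:210] += 1.0                 # perturb u only on a small, distant region

U, V = np.fft.fft(u), np.fft.fft(v)

# A differential (local) operator would only feel the change near x[200:210];
# the Fourier transform changes at almost every frequency bin:
changed = np.sum(~np.isclose(U, V))
print(f"{changed} of {n} frequency bins changed")   # almost all of them
```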
https://en.wikipedia.org/wiki/ShareChat
ShareChat is an Indian social networking service platform, owned by Bangalore-based Mohalla Tech. It was founded by Ankush Sachdeva, Bhanu Pratap Singh and Farid Ahsan, and incorporated on 8 January 2015. The ShareChat app has over 350 million monthly active users across 15 Indian languages, and the company is valued at $5 billion. Foundation and history ShareChat's holding company, Mohalla Tech Pvt Ltd, was incorporated in January 2015 by graduates of the Indian Institute of Technology Kanpur: Ankush Sachdeva, Bhanu Pratap Singh and Farid Ahsan. The company is headquartered in Bengaluru, Karnataka, and as of 2020 employed over 2500 people. Initially, ShareChat worked primarily as a content-sharing platform, without any scope for users to generate their own content. In April 2016, however, ShareChat enabled user-generated content creation on its platform, allowing its users to share their own posters and creative content. At around the same time, it also introduced open tagging, which allows anyone to create their own hashtags depending on the content. In January 2023, Farid Ahsan and Bhanu Pratap Singh stepped down from their respective roles as Chief Operating Officer and Chief Technology Officer. As of 2022, Manohar Singh Charan (CFO) and Amit Zunjarwad (CPO) led the management along with Ankush Sachdeva (CEO). Acquisitions In March 2019, Mohalla Tech acquired the Transversal Tech-owned short video sharing platform Clip. In February 2020, it acquired the Bengaluru-based online fashion marketplace Elanic. In March 2020, it acquired a meme discovery and sharing platform, Memer. In August 2020, it acquired a hyperlocal information platform, Circle Internet. ShareChat's parent company also bought MX TakaTak from Times Internet Group for $700 million in one of the biggest acquisitions of 2022. Funding In September 2020, ShareChat raised $40 million from investors Pawan Munjal of Hero MotoCorp, Ajay Shridhar Shriram of DCM Shriram, Twitter, SAIF Part
https://en.wikipedia.org/wiki/Comet%20Lake
Comet Lake is Intel's codename for its 10th generation Core processors. They are manufactured using Intel's third 14 nm Skylake process revision, succeeding the Whiskey Lake U-series mobile processor and Coffee Lake desktop processor families. Intel announced low-power mobile Comet Lake-U CPUs on August 21, 2019, H-series mobile CPUs on April 2, 2020, desktop Comet Lake-S CPUs on April 30, 2020, and Xeon W-1200 series workstation CPUs on May 13, 2020. Comet Lake processors and Ice Lake 10 nm processors are together branded as the Intel "10th Generation Core" family. Intel officially launched Comet Lake-Refresh CPUs on the same day as the 11th Gen Core Rocket Lake launch. The low-power mobile Comet Lake-U Core and Celeron 5205U CPUs were discontinued on July 7, 2021. Generational changes All Comet Lake CPUs feature an updated Platform Controller Hub with CNVio2 controller with Wi-Fi 6 and external AX201 CRF module support. Comet Lake-S compared to Coffee Lake-S Refresh Up to ten CPU cores Hyper-threading on all models, except for Celeron Single core turbo boost up to 5.3 GHz (300 MHz higher); all-core turbo boost up to 4.9 GHz; Thermal Velocity Boost for Core i9; Turbo Boost Max 3.0 support for Core i7, i9 DDR4-2933 memory support for Core i7 and i9; DDR4-2666 for Core i3, Core i5, Pentium Gold, Celeron 400-series chipset based on the LGA 1200 socket Comet Lake-H compared to Coffee Lake-H Refresh Higher turbo frequencies by up to 300 MHz DDR4-2933 memory support Thermal Velocity Boost for Core i7 and i9 Comet Lake-U compared to Whiskey Lake-U Up to six CPU cores Higher turbo frequencies by up to 300 MHz DDR4-2666 and LPDDR3-2133 memory support One notable architectural change of Comet Lake from its predecessors is the removal of TSX instruction set extensions. Entry-level CPUs like the i3 series no longer support ECC memory. List of 10th generation Comet Lake processors Desktop Comet Lake-S Pentium and Celeron CPUs lack AVX and AVX2 support. Comet Lake-W
https://en.wikipedia.org/wiki/OutGuess
OutGuess is steganographic software for hiding data in the most redundant content bits of existing (media) files. It has handlers for image files in the common Netpbm and JPEG formats, so it can, for example, specifically alter the frequency coefficients of JPEG files. It is written in C and published as free software under the terms of the old BSD license. It has been tested on a variety of Unix-like operating systems and is included in the standard software repositories of the popular Linux distributions Debian and Arch Linux (via user repository) and their derivatives. Method of operation An algorithm estimates the capacity for hidden data without the distortions of the decoy data becoming apparent. OutGuess determines bits in the decoy data that it considers most expendable and then distributes secret bits based on a shared secret in a pseudorandom pattern across these redundant bits, flipping some of them according to the secret data. For JPEG images, OutGuess recompresses the image to a user-selected quality level and then embeds secret bits into the least significant bits (LSB) of the quantized coefficients while skipping zeros and ones. Subsequently, corrections are made to the coefficients to make the global histogram of discrete cosine transform (DCT) coefficients match that of the decoy image, counteracting detection by the chi-square attack that is based on the analysis of first-order statistics. This technique is criticized because it actually facilitates detection by further disturbing other statistics. Also, data embedded in JPEG frequency coefficients has poor robustness and does not withstand JPEG reencoding. History OutGuess was originally developed in Germany in 1999 by Niels Provos. In 1999, Andreas Westfeld published the statistical chi-square attack, which can detect common methods for steganographically hiding messages in LSBs of quantized JPEG coefficients. In response, Provos implemented a method that exactly preserves the DCT histo
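To make the pseudorandom-LSB idea concrete, here is a simplified sketch (mine; it illustrates generic key-seeded LSB embedding in a byte array, not OutGuess's actual coefficient selection or histogram-preserving corrections):

```python
import random

def embed(cover: bytearray, secret_bits: list[int], key: int) -> bytearray:
    """Flip LSBs of pseudorandomly chosen cover bytes to carry secret_bits."""
    rng = random.Random(key)                      # shared secret seeds the PRNG
    positions = rng.sample(range(len(cover)), len(secret_bits))
    stego = bytearray(cover)
    for pos, bit in zip(positions, secret_bits):
        stego[pos] = (stego[pos] & ~1) | bit      # overwrite least significant bit
    return stego

def extract(stego: bytearray, n_bits: int, key: int) -> list[int]:
    """Recover the bits by regenerating the same pseudorandom positions."""
    rng = random.Random(key)
    positions = rng.sample(range(len(stego)), n_bits)
    return [stego[pos] & 1 for pos in positions]

cover = bytearray(random.randbytes(1000))
bits = [1, 0, 1, 1, 0, 0, 1, 0]
stego = embed(cover, bits, key=42)
assert extract(stego, len(bits), key=42) == bits
```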
https://en.wikipedia.org/wiki/H.234
In cryptography, H.234 is an international standard that defines popular encryption systems used to secure communications, specifically Diffie-Hellman key exchange and RSA. It also defines ISO 8732 key management. History H.234 was first defined by the International Telecommunication Union's Telecommunication Standardization Sector (ITU-T) in 1994 by Study Group 15. The standard was subsequently revised by Study Group 16 in November 2002; that revision remains in force to date. At the time of the first publication of H.234, RSA was covered by a patent in the United States (but not elsewhere); the US patent expired in 2000. Specification The standard describes three methods of encryption key management: Diffie-Hellman RSA ISO 8732 References ITU-T recommendations ITU-T H Series Recommendations Cryptography standards
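As a hedged illustration of the Diffie-Hellman exchange covered by the standard (toy parameters only; real deployments use large, carefully chosen groups, and nothing here follows H.234's actual message formats):

```python
# Toy Diffie-Hellman key exchange (illustrative only: the 23-element
# group is far too small for real security).
p, g = 23, 5               # public prime modulus and generator

a = 6                      # Alice's private key
b = 15                     # Bob's private key

A = pow(g, a, p)           # Alice sends A = g^a mod p  -> 8
B = pow(g, b, p)           # Bob sends   B = g^b mod p  -> 19

# Both sides derive the same shared secret g^(ab) mod p:
assert pow(B, a, p) == pow(A, b, p) == 2
```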
https://en.wikipedia.org/wiki/Learning%20Factory
Learning factories represent a realistic manufacturing environment for education, training, and research. In recent decades, numerous learning factories have been built in academia and industry. Definition The term learning factory consists of two words: the word ‘learning’ indicates the development of competencies, while the word ‘factory’ denotes a realistic manufacturing environment. The generally accepted definition was agreed upon within the CIRP CWG and published in the CIRP Encyclopedia. According to the International Academy for Production Engineering (CIRP), a learning factory is defined by processes that are authentic, include multiple stations, and comprise technical as well as organizational aspects; a setting that is changeable and resembles a real value chain; a physical product being manufactured; and a didactical concept that comprises formal, informal and non-formal learning, enabled by the trainees' own actions in an on-site learning approach. Depending on the purpose of the learning factory, learning takes place through teaching, training and/or research. Consequently, learning outcomes may be competency development and/or innovation. An operating model ensuring the sustained operation of the learning factory is desirable. The difference between learning factories and model factories is that learning factories provide a didactical concept and an operating model for training. History The term ‘learning factory’ was first coined in the US in 1994, when the National Science Foundation (NSF) made an award to a consortium including Penn State University. Industry-related design projects have been supported in a 2000 m² facility with machines, tools, and materials, where real problems from industry could be solved in a realistic environment. In 2006, the program received the National Academy of Engineering’s Gordon Prize for Innovation in Engineering Education. In Europe, more and more learning factories have been designed in the last decade. One of the first lear
https://en.wikipedia.org/wiki/Olivier%20pile
An Olivier pile is a drilled displacement pile: an underground deep foundation pile made of concrete or reinforced concrete with a screw-shaped (helical) shaft, which is installed without soil removal. History The Belgian Gerdi Vankeirsbilck applied for the production patent for the Olivier pile in April 1996. The technique was implemented by his own company, and various licences have been granted in Belgium and abroad. Due to its screw-shaped shaft, the Olivier pile is particularly suitable for use in soils with low load-bearing capacities, such as clay and loam. In 2018, a patent application was filed for drilling without the use of a lost bit. Description An Olivier pile is drilled into the ground by a drilling rig with a top-type rotary drive with a variable rate of penetration. A lost tip is attached to a partial-flight auger which, in turn, is attached to a casing. The casing, which is rotated clockwise continuously, penetrates into the ground by the action of a torque and a vertical force. At the desired installation depth, the lost tip is released, and the reinforcing cage is inserted into the casing. Concrete is then placed inside the casing through a funnel. The casing and the partial-flight auger are extracted by counter-clockwise rotation. The shaft of the Olivier pile has the shape of a screw. The casing has an external diameter of 324 mm (12.75"), with a wall thickness of 25 mm (1"). The casing consists of several parts assembled with watertight couplings, which are strong enough to handle the maximum torque produced by the rotary drive. The various auger heads, for the various diameters of the Olivier pile, all have a larger diameter than the casing. Common diameters of the auger head: diameter 36–56 cm (14"–22") diameter 41–61 cm (16"–24") diameter 46–66 cm (18"–26") diameter 51–71 cm (20"–28") diameter 56–76 cm (22"–30") Installation An Olivier pile is screwed into the ground without vibration; the soil is displaced sideways.
https://en.wikipedia.org/wiki/Hay%27s%20bridge
Hay's bridge is used to determine the inductance of an inductor with a high Q factor. Maxwell's bridge is only appropriate for measuring inductors with a medium quality factor; thus, Hay's bridge is an advanced form of Maxwell's bridge. One of the arms of a Hay's bridge has an accurately characterized capacitor used to balance the unknown inductance value. The other arms contain resistors. References Electrical meters Bridge circuits
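For reference (my addition; the notation is the usual textbook one and is assumed rather than taken from the article): if the unknown inductor $L_1$ with series resistance $R_1$ sits in one arm, resistors $R_2$ and $R_3$ occupy two arms, and capacitor $C_4$ in series with resistor $R_4$ forms the fourth, the standard balance conditions are

```latex
% Hay's bridge balance conditions (standard textbook form):
\[
  L_1 = \frac{R_2 R_3 C_4}{1 + \omega^2 C_4^2 R_4^2},
  \qquad
  R_1 = \frac{\omega^2 C_4^2 R_2 R_3 R_4}{1 + \omega^2 C_4^2 R_4^2},
\]
% with supply angular frequency \omega; the measured quality factor is
\[
  Q = \frac{\omega L_1}{R_1} = \frac{1}{\omega C_4 R_4},
\]
% so for high Q (small \omega C_4 R_4) the frequency-dependent terms
% become negligible and L_1 \approx R_2 R_3 C_4.
```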
https://en.wikipedia.org/wiki/Irwin%20screen
In the realms of toxicology and pathology, the Irwin screen is utilised to determine whether subjects show adverse effects from a course of pharmaceutical treatment or from environmental pollution. It is an observational methodology. History Mice were first used systematically to determine a drug's central nervous system side effects by S. Irwin in 1962 and then again in 1968. Its use in the pharmaceutical industry has become ingrained since then, as described below. In 1975, the National Academy of Sciences issued a position paper on the "Principles for Evaluating Chemicals in the Environment." This paper influenced government and academic circles, and was adopted by, for example, Brimblecombe for his study of atmospheric arsenic levels. A critical review in 1982 by Mitchell and Tilson caused the US EPA to develop guidelines for several behavioural tests, including a test series based on the Irwin screen, named the Functional Observational Battery (FOB) by Sette in 1989. In 1998, the FOB was published as the EPA Human Health 870 Series Test Guidelines, and in practice the Irwin screen and the FOB "overlap and to some extent are interchangeable." The American batteries were harmonised with the OECD's from the same era. Similar tests on food chemicals were recommended by the FDA in their Red Book. Behavioural test batteries are now required for new drugs by the S7A guideline of the International Council for Harmonisation of Technical Requirements for Pharmaceuticals for Human Use (ICH). Interchangeability with FOB As of 2010, the Irwin screen was almost exclusively used in the pharmaceutical industry with lab mice, whereas the FOB, or some modification thereof, was used with lab rats and other non-rodent species, such as rabbits, dogs, guinea pigs and nonhuman primates. Sample checklist A sample Irwin screen includes overt behavior observations and autonomic observations. Overt Behavior increased activity decreased activity sedation tremor convulsions myoclonic to
https://en.wikipedia.org/wiki/Index%20to%20Organism%20Names
The Index to Organism Names (ION) is an extensive compendium of scientific names of taxa at all ranks in the field of zoology, compiled from the Zoological Record (later supplemented with content from Sherborn's Index Animalium) by its operators as a publicly accessible internet resource. Initially developed by BIOSIS, its ownership then passed to Thomson Reuters and is currently with Clarivate Analytics. History ION was initially developed as a freely available, web-accessible component of a larger project, "TRITON" (the Taxonomy Resource and Index To Organism Names system) by BIOSIS, the then publishers of the Zoological Record ("ZR") and Biological Abstracts, in approximately 2000. As originally released it covered all animal names (sensu lato) reported in Zoological Record since 1978, along with names from some other groups not covered by the Zoological Record contributed by several partner organizations (the latter were subsequently deprecated in the system). Its initially stated aim was to provide basic nomenclatural and hierarchy information, plus ZR volume occurrence counts (reflecting use in the literature) for animal names, to identify the taxonomic group to which an organism belongs, and to link to further information from ZR (or, initially, other collaborating organizations). By 2006, the BIOSIS products had been purchased by Thomson Scientific, subsequently Thomson Reuters, who continued and extended the ION database using the URL www.organismnames.com, where it continues to reside. The Intellectual Property and Science division of Thomson Reuters was subsequently acquired by Clarivate Analytics, who continue to make ION available (as of mid-2019). Included content In its initial release, the Index contained content from Zoological Record dating back to 1978, which was subsequently extended to the full span of the Zoological Record commencing in 1864. In 2011, Nigel Robinson of Thomson Reuters described an in-pro
https://en.wikipedia.org/wiki/List%20of%20Institution%20of%20Engineering%20and%20Technology%20academic%20journals
This is a list of journals published by the Institution of Engineering and Technology (IET), including those from its predecessors, the Institution of Electrical Engineers (IEE) and the Institution of Incorporated Engineers (IIE). External links IET Digital Library IET Academic journals Institution of Engineering and Technology
https://en.wikipedia.org/wiki/DARPA%20Quantum%20Network
The DARPA Quantum Network (2002–2007) was the world's first quantum key distribution (QKD) network, operating 10 optical nodes across Boston and Cambridge, Massachusetts. It became fully operational on October 23, 2003 in BBN's laboratories, and in June 2004 was fielded through dark fiber under the streets of Cambridge and Boston, where it ran continuously for over 3 years. The project also created and fielded the world's first superconducting nanowire single-photon detector. It was sponsored by DARPA as part of the QuIST program, and built and operated by BBN Technologies in close collaboration with colleagues at Harvard University and the Boston University Photonics Center. The DARPA Quantum Network was fully compatible with standard Internet technology, and could provide QKD-derived key material to create Virtual Private Networks, to support IPsec or other authentication, or for any other purpose. All control mechanisms and protocols were implemented in the Unix kernel and field-programmable gate arrays. QKD-derived key material was routinely used for video-conferencing or other applications. The DARPA Quantum Network was built in stages. In the project's first year (year 1), BBN designed and built a full QKD system (Alice and Bob), with an attenuated laser source (~ 0.1 mean photon number) running through telecom fiber, phase-modulated via an actively stabilized Mach–Zehnder interferometer. BBN also implemented a full suite of industrial-strength QKD protocols based on BB84. In year 2, BBN created two 'Mark 2' versions of this system (4 nodes) with commercial-quality InGaAs detectors created by IBM Research. These 4 nodes ran continuously in BBN's laboratory from October 2003, then two were deployed at Harvard and Boston University in June 2004, when the network began running continuously across the metro Boston area, 24x7. In year 3, the network expanded to 8 nodes with the addition of an entanglement-based system (derived from work at Boston University) desig
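To illustrate the BB84 protocol mentioned above (a highly simplified, noise-free simulation of basis sifting written by me; it bears no relation to BBN's actual implementation):

```python
import random

def bb84_sift(n_photons: int, seed: int = 0):
    """Idealised BB84: Alice sends random bits in random bases;
    Bob measures in random bases; they keep bits where bases match."""
    rng = random.Random(seed)
    alice_bits  = [rng.randint(0, 1) for _ in range(n_photons)]
    alice_bases = [rng.choice("+x") for _ in range(n_photons)]  # rectilinear/diagonal
    bob_bases   = [rng.choice("+x") for _ in range(n_photons)]

    # With matching bases (and no noise or eavesdropper) Bob's result
    # equals Alice's bit; mismatched bases are discarded during sifting.
    return [bit for bit, a, b in zip(alice_bits, alice_bases, bob_bases) if a == b]

key = bb84_sift(1000)
print(f"sifted key length: {len(key)} of 1000 (~50% expected)")
```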
https://en.wikipedia.org/wiki/A.M.C.%3A%20Astro%20Marine%20Corps
A.M.C.: Astro Marine Corps is a 1989 platform shooter video game developed by Creepsoft and published by Dinamic Software. It was released for the Amiga, Atari ST, Commodore 64, ZX Spectrum, MSX, and Amstrad CPC. The program was written by Pablo Ariza, with music by José A. Martín. Gameplay The player controls a member of the eponymous Astro Marine Corps, on a mission to stop the Deathbringers, a conglomerate of multiple alien criminals, from launching a campaign to conquer the galaxy. To fulfill his mission, the marine must run from left to right, jumping over obstacles and shooting aliens that try to kill him. Colliding with the aliens or their shots will lower the marine's health bar, or in the case of some enemies even kill him outright, making him lose one of his four lives. From time to time, boxes containing power-ups or booby traps drop from the sky, and the marine has to shoot them to reveal their contents. Power-ups include health refills, grenades and various alternate shot types such as a flame thrower or a spread shot. The marine can also obtain temporary invulnerability, although it does not work against enemies that can kill him instantly. The game is divided into two phases. Each phase must be completed in a limited amount of time, which is extended upon reaching various checkpoints. In the first phase, the marine has to reach the Deathbringers' ship, which has landed on planet Dendar, and hijack it. At the end of the phase, a boss alien, the Krauer, must be defeated. In the second phase, the marine must fight through the Deathbringers' home planet and destroy their leader, the Great Alien King, to end the invasion. Reception Retro Games Review felt the title was worth repeated play-throughs. ACE: Advanced Computer Entertainment felt the title was slick but straightforward. The One felt the game offered the "monster mashing" genre a "bit of class". ST Format thought it was no more than a competent shooter. Amiga Reviews felt the graphics were the best part. The games
https://en.wikipedia.org/wiki/Cohomology%20of%20a%20stack
In algebraic geometry, the cohomology of a stack is a generalization of étale cohomology. In a sense, it is a theory that is coarser than the Chow group of a stack. The cohomology of a quotient stack (e.g., classifying stack) can be thought of as an algebraic counterpart of equivariant cohomology. For example, Borel's theorem states that the cohomology ring of a classifying stack is a polynomial ring. See also l-adic sheaf smooth topology References Algebraic geometry Cohomology theories
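To make Borel's theorem concrete, a standard example (my addition, stated in the usual form rather than quoted from the article): for the classifying stack of the multiplicative group, the cohomology ring is a polynomial ring on one degree-2 generator.

```latex
% Cohomology of the classifying stack of the multiplicative group G_m
% (the algebraic analogue of H^*(BU(1)) = H^*(CP^\infty)):
\[
  H^{*}\!\left( B\mathbb{G}_{m},\, \mathbb{Q}_{\ell} \right)
  \;\cong\; \mathbb{Q}_{\ell}[c_{1}],
  \qquad \deg c_{1} = 2,
\]
% and more generally H^*(BGL_n) is a polynomial ring on the Chern
% classes c_1, ..., c_n of the tautological rank-n bundle.
```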
https://en.wikipedia.org/wiki/Where%20in%20the%20World%20is%20Carmen%20Sandiego%3F%20%28Prodigy%20video%20game%29
Where in the World is Carmen Sandiego? is a game within the Carmen Sandiego franchise made for the Prodigy Interactive online service, a "special edition" and Prodigy service adaptation of the 1985 Broderbund educational game of the same name. Prodigy was a computer service from a partnership of IBM and Sears. World was one of three games available on the extra audio card, alongside Silpheed and Cakewalk Apprentice. It was an online version of the popular PC title, written specifically for Prodigy. This version had a new adventure each week, with each Carmen Sandiego episode sponsored on the Prodigy online service. The game was pitched as teaching geography in an "exciting new way". The book Parents, kids & computers describes Prodigy's version of Carmen Sandiego as "a sort of online Carmen miniseries that changes from time to time". It was often used as a major selling point of the Prodigy service to parents, and advertised as "your kids' personal tutor" and for "adventure and role-playing enthusiasts". It was highlighted as part of Prodigy's "Education and entertainment spanning school subjects". Other kid-friendly programming included Sesame Street and Nickelodeon. Gameplay and plot Players can find the game through a series of menus, for instance the Stories Menu, or by '[JUMP]: carmen'. Once in the game, the player will see four menu options: the current case, a How To Play page, Last Week's Winners, and an Acme Detective Agency Honor Roll. Clicking on the game yields three choices: start the case, read the eight criminal dossiers, or look at some hints. As with other games in the series, World sees players traipse behind Carmen and her crooks in the hunt to capture them in the least amount of time possible. Players travel to three cities per case; after two stops the players should have enough information to know the name on the arrest warrant, and in the third city the player must find the criminal and make the arrest with the correct warrant. When players start a case, the
https://en.wikipedia.org/wiki/Sony%20TC-50
The Sony TC-50 was a compact cassette recorder and portable audio player designed and marketed by Sony. It was conceived for the purpose of dictation but gained popularity with the general public after its use during the Apollo program lunar missions. It was released commercially in 1968. NASA furnished every astronaut with a Sony TC-50 from Apollo 7 in 1968 onward. Procured to facilitate the recording of personal mission logs, negating the need for additional paperwork, the TC-50 was also used by astronauts to play their favorite mixtapes in the Apollo spacecraft. "Rather than blast off with only blank cassettes, the astronauts took tapes that had been pre-filled with music befitting their tastes" and recorded over them as the mission progressed. Neil Armstrong, Buzz Aldrin, and several other astronauts were provided with cassette recordings by Al Bishop and Hollywood producer Mickey Kapp. The TC-50 has variously been described as a "predecessor" or "proto-Walkman" due to its compact design and similarity to the later Walkman. References External links National Air and Space Museum - Apollo 12 Sony TC-50 held by the Smithsonian Sony's proto-Walkman that went to the moon An examination of the TC-50 "Fly Me to the Moon" Gene Cernan plays Frank Sinatra as the crew orbits the Moon on Apollo 10 Audio players Portable media players Products introduced in 1968 Sony hardware
https://en.wikipedia.org/wiki/Antibody-oligonucleotide%20conjugate
Antibody-oligonucleotide conjugates (AOCs) belong to a class of chimeric molecules combining in their structure two important families of biomolecules: monoclonal antibodies and oligonucleotides (ONs). The combination of the exceptional targeting capabilities of monoclonal antibodies with the numerous functional modalities of oligonucleotides has been fruitful for a variety of AOC applications, including imaging, detection and targeted therapeutics. Cell uptake/internalisation still represents the biggest hurdle towards successful ON therapeutics: straightforward uptake, as for most small-molecule drugs, is hindered by the polyanionic backbone and the molecular size of ONs. Adapted from the broad and successful class of antibody-drug conjugates, antibodies and antibody analogues are increasingly used in research to overcome hurdles related to the delivery and internalisation of ON therapeutics. By exploiting bioconjugation methodology, several conjugates have been obtained. Development of therapeutic AOCs The first AOC was reported in 1995, when the lysines of a transferrin antibody were connected using an SMCC bifunctional linker (NHS ester and maleimide moieties) to radiolabelled, cys-bearing ASOs targeting HIV mRNA. Marcin and his colleagues developed a different construct using the same chemistry in 2011, but utilized an siRNA instead of an ASO. In 2013, Myers and coworkers unspecifically labelled an anti-CD19 antibody with N-succinimidyl 3-(2-pyridyl-dithio) propionate to form disulphide bonds with a cys-modified ASO targeting the mRNA of the oncoprotein E2A–PBX1. Ultimately, they could demonstrate in vivo antitumour effects which, in contrast, were not obtained with the single entities. In the same timeframe, several antibodies were exploited for ON delivery in combination with nanoparticles and in non-covalent strategies. Only recently were the first examples of site-selective conjugation between an ON therapeutic and a mAb published: in 2015 Genentech exp
https://en.wikipedia.org/wiki/Deterministic%20Networking
Deterministic Networking (DetNet) is an effort by the IETF DetNet Working Group to study implementation of deterministic data paths for real-time applications with extremely low data loss rates, packet delay variation (jitter), and bounded latency, such as audio and video streaming, industrial automation, and vehicle control. DetNet operates at IP Layer 3 over routed segments, using a software-defined networking layer to provide IntServ and DiffServ integration, and delivers service over lower Layer 2 bridged segments using technologies such as MPLS and IEEE 802.1 Time-Sensitive Networking. Deterministic Networking aims to migrate time-critical, high-reliability industrial control and audio-video applications from special-purpose Fieldbus networks (HDMI, CAN bus, PROFIBUS, RS-485, RS-422/RS-232, and I²C) to packet networks and IP in particular. DetNet will support both the new applications and existing IT applications on the same physical network. To support real-time applications, DetNet implements reservation of data plane resources in intermediate nodes along the data flow path, calculation of explicit routes that do not depend on network topology, and redistribution of data packets over time and/or space to deliver data even with the loss of one path. Rationale Standard IT infrastructure cannot efficiently handle latency-sensitive data. Switches and routers use fundamentally uncertain algorithms for processing packets/frames, which may result in sporadic data flow. A common solution for smoothing out these flows is to increase buffer sizes, but this has a negative effect on delivery latency because data has to fill the buffers before transmission to the next switch or router can start. The IEEE Time-Sensitive Networking (TSN) task group has defined deterministic algorithms for queuing, shaping and scheduling which allow each node to allocate bandwidth and latency according to the requirements of each data flow, by computing the buffer size at the network switch. The same
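As a hedged aside (standard network-calculus arithmetic, not anything normative from the DetNet specifications): if a flow is constrained by a token bucket with burst b and rate r, and each hop guarantees service rate R ≥ r with fixed overhead latency T, the classic per-hop delay bound is T + b/R. In a few lines of Python, with hypothetical figures:

```python
def per_hop_delay_bound(burst_bits: float, service_rate_bps: float,
                        latency_s: float) -> float:
    """Classic rate-latency delay bound from network calculus: T + b/R."""
    return latency_s + burst_bits / service_rate_bps

# Hypothetical flow: 12 kbit burst into hops guaranteeing 10 Mbit/s
# with 50 us of fixed per-hop overhead.
hop = per_hop_delay_bound(burst_bits=12_000, service_rate_bps=10e6,
                          latency_s=50e-6)
print(f"per-hop bound: {hop * 1e6:.0f} us")         # 1250 us
print(f"5-hop path bound: {5 * hop * 1e3:.2f} ms")  # naive additive estimate
```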
https://en.wikipedia.org/wiki/Type%20IV%20secretion%20system
The bacterial type IV secretion system, also known as the type IV secretion system or the T4SS, is a secretion protein complex found in Gram-negative bacteria, Gram-positive bacteria, and archaea. It is able to transport proteins and DNA across the cell membrane. The type IV secretion system is just one of many bacterial secretion systems. Type IV secretion systems are related to conjugation machinery and generally involve a single-step secretion system and the use of a pilus. Type IV secretion systems are used for conjugation, DNA exchange with the extracellular space, and for delivering proteins to target cells. The type IV secretion system is divided into type IVA and type IVB based on genetic ancestry. Notable instances of the type IV secretion system include the plasmid insertion into plants by Agrobacterium tumefaciens, the toxin delivery methods of Bordetella pertussis (whooping cough) and Legionella pneumophila (Legionnaires' disease), and the F sex pilus. Function The type IV secretion system is a protein complex found in prokaryotes used to transport DNA, proteins, or effector molecules from the cytoplasm to the extracellular space beyond the cell. The type IV secretion system is related to prokaryotic conjugation machinery. Type IV secretion systems are a highly versatile group, present in Gram-positive bacteria, Gram-negative bacteria, and archaea. They usually involve a single step which utilizes a pilus, though exceptions exist. Type IV secretion systems are highly diverse, with a variety of functions and types due to different evolutionary paths. Primarily, type IV secretion systems are grouped based on structural and genetic similarity and are only distantly related to each other. Type IVA systems are similar to the VirB/D4 system of Agrobacterium tumefaciens. Type IVB systems are similar to the Dot/Icm systems found in intracellular pathogens such as Legionella pneumophila. The "other" type systems resemble neither IVA nor IVB. Types are gen
https://en.wikipedia.org/wiki/Photopea
Photopea is a web-based photo and graphics editor. It is used for image editing, making illustrations, web design or converting between different image formats. Photopea is advertising-supported software. It is compatible with all modern web browsers, including Opera, Edge, Chrome, and Firefox. The app is compatible with raster and vector graphics, such as Photoshop's PSD as well as JPEG, PNG, DNG, GIF, SVG, PDF and other image file formats. While browser-based, Photopea stores all files locally, and does not upload any data to a server. Features Photopea has various image editing tools, including spot healing, a clone stamp, a healing brush, and a patch tool. The software supports layers, layer masks, channels, selections, paths, smart objects, layer styles, text layers, filters and vector shapes. Reception Photopea has received positive coverage due to its similarities to Adobe Photoshop in design and workflow, making it an easier program for those trained in Photoshop to use, compared to other free raster image editors such as GIMP. See also Comparison of raster graphics editors Pixelmator Adobe Photoshop SumoPaint Procreate GIMP References External links Official website Photopea Blog Adware Cross-platform software Proprietary photo software Web applications 2013 software Graphics software Proprietary cross-platform software Raster graphics editors Vector graphics editors
https://en.wikipedia.org/wiki/Mediterranean%20Biogeographic%20Region
The Mediterranean Biogeographic Region is the biogeographic region around and including the Mediterranean Sea. The term is defined by the European Environment Agency as applying to the land areas of Europe that border on the Mediterranean Sea, and the corresponding territorial waters. The region is rich in biodiversity and has many endemic species. The term may also be used in the broader sense of all the lands of the Mediterranean Basin, or in the narrow sense of just the Mediterranean Sea. Extent The European Commission defines the Mediterranean Biogeographic Region as consisting of the Mediterranean Sea, Greece, Malta, Cyprus, large parts of Portugal, Spain and Italy, and a smaller part of France. The region includes 20.6% of European Union territory. Climate The region has cool humid winters and hot dry summers. Wladimir Köppen divided his "Cs" Mediterranean climate classification into "Csa", with a highest mean monthly temperature over 22 °C, and "Csb", where the mean monthly temperature is always lower than 22 °C. The region may also be subdivided into dry zones, such as Alicante in Spain, and humid zones, such as Cinque Terre in Italy. Terrain The region has generally hilly terrain and includes islands, high mountains, semi-arid steppes and thick Mediterranean forests, woodlands, and scrub with many aromatic plants. There are rocky shorelines and sandy beaches. The region has been greatly affected by human activity such as livestock grazing, cultivation, forest clearance and forest fires. In recent years tourism has put greater pressure on the shoreline environment. Biodiversity The Mediterranean Biogeographic Region is rich in biodiversity and has many endemic species. The region has more plant species than all the other biogeographical regions of Europe combined. The wildlife and vegetation are adapted to the unpredictable weather, with sudden downpours or strong winds. Coastal wetlands are home to endemic species of insects, amphibians and fish, which provide
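As a small illustration of that Csa/Csb split (my code; it assumes the climate has already been determined to be Mediterranean "Cs" and applies only the 22 °C warmest-month test, with invented monthly figures):

```python
def cs_subtype(monthly_mean_temps_c: list[float]) -> str:
    """Given 12 monthly mean temperatures for a climate already known
    to be Mediterranean 'Cs', apply the 22 degC warmest-month test."""
    return "Csa" if max(monthly_mean_temps_c) >= 22.0 else "Csb"

# Hypothetical monthly means for two sites (degrees Celsius):
hot_summer_site  = [10, 11, 13, 15, 19, 23, 26, 26, 23, 19, 14, 11]
warm_summer_site = [ 8,  8, 10, 12, 14, 17, 19, 20, 18, 15, 11,  9]

print(cs_subtype(hot_summer_site))   # Csa
print(cs_subtype(warm_summer_site))  # Csb
```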
https://en.wikipedia.org/wiki/Localized%20Chern%20class
In algebraic geometry, a localized Chern class is a variant of a Chern class that is defined for a chain complex of vector bundles as opposed to a single vector bundle. It was originally introduced in Fulton's intersection theory, as an algebraic counterpart of the similar construction in algebraic topology. The notion is used in particular in Riemann–Roch-type theorems. S. Bloch later generalized the notion in the context of arithmetic schemes (schemes over a Dedekind domain) for the purpose of giving Bloch's conductor formula, which computes the non-constancy of the Euler characteristic of a degenerating family of algebraic varieties (in the mixed characteristic case). Definitions Let Y be a pure-dimensional regular scheme of finite type over a field or discrete valuation ring and X a closed subscheme. Let $E_\bullet$ denote a complex of vector bundles on Y that is exact on $Y \setminus X$. The localized Chern class of this complex is a class in the bivariant Chow group of $X \subset Y$ defined as follows. Let denote the tautological bundle of the Grassmann bundle of rank sub-bundles of . Let . Then the i-th localized Chern class is defined by the formula: where is the projection and is a cycle obtained from by the so-called graph construction. Example: localized Euler class Let the setting be as in the definitions above. If S is smooth over a field, then the localized Chern class coincides with the class where, roughly, is the section determined by the differential of f and (thus) is the class of the singular locus of f. Consider an infinite-dimensional bundle E over an infinite-dimensional manifold M with a section s with Fredholm derivative. In practice this situation occurs whenever we have a system of PDEs which are elliptic when considered modulo some gauge group action. The zero set Z(s) is then the moduli space of solutions modulo gauge, and the index of the derivative is the virtual dimension. The localized Euler class of the pair (E, s) is a homology class with closed support on the zero set of th
https://en.wikipedia.org/wiki/Human%20impact%20on%20marine%20life
Human activities affect marine life and marine habitats through overfishing, habitat loss, the introduction of invasive species, ocean pollution, ocean acidification and ocean warming. These impact marine ecosystems and food webs and may result in consequences as yet unrecognised for the biodiversity and continuation of marine life forms. The ocean can be described as the world's largest ecosystem, and it is home to many species of marine life. Human-caused pressures such as global warming, ocean acidification, and pollution affect marine life and its habitats. For the past 50 years, more than 90 percent of the global warming resulting from human activity has been absorbed into the ocean. This results in rising ocean temperatures and, together with ocean acidification, is harmful to many fish species and causes damage to habitats such as coral. Coral produces materials such as carbonate rock and calcareous sediment, creating a unique and valuable ecosystem that not only provides food and homes for marine creatures but also has many benefits for humans. Ocean acidification caused by rising levels of carbon dioxide leads to coral bleaching, in which the rate of calcification is lowered, affecting coral growth. Marine plastic pollution is another human-caused issue that poses a threat to marine life. According to the IPCC (2019), since 1950 "many marine species across various groups have undergone shifts in geographical range and seasonal activities in response to ocean warming, sea ice change and biogeochemical changes, such as oxygen loss, to their habitats." It has been estimated that only 13% of the ocean area remains as wilderness, mostly in open ocean areas rather than along the coast. Overfishing Overfishing is occurring in one third of world fish stocks, according to a 2018 report by the Food and Agriculture Organization of the United Nations. In addition, industry observers beli
https://en.wikipedia.org/wiki/Australian%20Women%27s%20Register
The Australian Women's Register is a fully searchable online database which aims to cover Australian women and Australian women's organisations. It combines many resources and allows users to find historical and contemporary material on notable Australian women in all fields. It aims to help users find women's organisations, archives, publications and other digital resources. Part of the Australian Women's Archives Project, it was established in 2000 and is maintained by the National Foundation for Australian Women (NFAW), together with the University of Melbourne. National Foundation for Australian Women The National Foundation for Australian Women (NFAW) was set up by a group of women's rights campaigners who wished to establish a body to promote women's movement ideas and policies. It was established in 1989 with seed money of $100,000 from Pamela Denoon and a trust fund in her name. It was to be independent of political parties and was to form partnerships with other women's organisations. Its purpose was to ensure that women's history, knowledge, and wisdom would be accessible to new generations of women, and to advance and protect Australian women's interests in all spheres of life. See also Convict women in Australia List of Australian women artists List of Australian women writers List of Australian sportswomen Women and government in Australia Women in the Australian military References External links National Foundation for Australian Women: website 2000 establishments in Australia Databases in Australia Online databases Women's organisations based in Australia Women in Australia