| source | text |
|---|---|
https://en.wikipedia.org/wiki/PALISADE%20%28software%29 | PALISADE is an open-source, cross-platform software library that provides implementations of lattice cryptography building blocks and homomorphic encryption schemes.
History
PALISADE adopted the open modular design principles of the predecessor SIPHER software library from the DARPA PROCEED program. SIPHER development began in 2010, with a focus on modular open design principles to support rapid application deployment over multiple FHE schemes and hardware accelerator back-ends, including on mobile, FPGA and CPU-based computing systems. PALISADE began building from earlier SIPHER designs in 2014, with an open-source release in 2017 and substantial improvements every subsequent 6 months.
PALISADE development was funded originally by the DARPA PROCEED and SafeWare programs, with subsequent improvements funded by additional DARPA programs, IARPA, the NSA, NIH, ONR, the United States Navy, the Sloan Foundation and commercial entities such as Duality Technologies. PALISADE has subsequently been used in commercial offerings, such as those of Duality Technologies, which raised funding in a seed round and a later Series A round led by Intel Capital.
In 2022 OpenFHE was released as a fork that also implements CKKS bootstrapping.
Features
PALISADE includes the following features:
Post-quantum public-key encryption
Fully homomorphic encryption (FHE)
Brakerski/Fan-Vercauteren (BFV) scheme for integer arithmetic with RNS optimizations
Brakerski-Gentry-Vaikuntanathan (BGV) scheme for integer arithmetic with RNS optimizations
Cheon-Kim-Kim-Song (CKKS) scheme for real-number arithmetic with RNS optimizations
Ducas-Micciancio (FHEW) scheme for Boolean circuit evaluation with optimizations
Chillotti-Gama-Georgieva-Izabachene (TFHE) scheme for Boolean circuit evaluation with extensions
Multiparty extensions of FHE
Threshold FHE for BGV, BFV, and CKKS schemes
Proxy re-encryption for BGV, BFV, and CKKS schemes
Digital signature
Identity-based encryption
Ciphertext-policy at |
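The homomorphic encryption schemes listed above are built on lattice problems such as (ring) learning with errors (LWE). The toy sketch below is not PALISADE code and does not implement any of the schemes named above; with deliberately insecure parameters, it only illustrates the basic property such schemes provide: ciphertexts can be combined so that the result decrypts to a function (here, a sum) of the underlying plaintexts.

```python
# Toy additively homomorphic LWE-style cipher (illustration only, NOT PALISADE
# and NOT a secure construction; parameters are deliberately tiny).
import numpy as np

q, t, n = 2**16, 16, 64          # ciphertext modulus, plaintext modulus, dimension
delta = q // t                   # scaling factor separating message from noise
rng = np.random.default_rng(0)

def keygen():
    return rng.integers(0, q, n)             # secret vector s in Z_q^n

def encrypt(s, m):
    a = rng.integers(0, q, n)                # public random mask
    e = int(rng.integers(-4, 5))             # small noise term
    b = (int(a @ s) + e + delta * m) % q
    return a, b

def add(ct1, ct2):
    # Adding ciphertexts component-wise adds the underlying plaintexts (mod t).
    return (ct1[0] + ct2[0]) % q, (ct1[1] + ct2[1]) % q

def decrypt(s, ct):
    a, b = ct
    v = (b - int(a @ s)) % q
    return int(round(v / delta)) % t         # rounding removes the small noise

s = keygen()
c1, c2 = encrypt(s, 3), encrypt(s, 5)
print(decrypt(s, add(c1, c2)))               # 8, computed without decrypting c1 or c2
```

A production library such as PALISADE additionally manages noise growth, relinearization and parameter selection, which this toy sketch omits entirely.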
https://en.wikipedia.org/wiki/Exsiccata | Exsiccata (Latin, gen. -ae, plur. -ae) is a work with "published, uniform, numbered set[s] of preserved specimens distributed with printed labels". Typically, exsiccatae refer to numbered collections of dried herbarium specimens or other preserved biological samples published in several duplicate sets with a common theme or title, like Lichenes Helvetici (see figure). Exsiccatae are regarded as scientific contributions of the editor(s), with characteristics from the library world (published booklets of scientific literature, with authors/editors and titles, often published as serials in formats with fascicles) and features from the herbarium world (uniform and numbered collections of duplicate herbarium specimens). Exsiccata works represent a special method of scholarly communication. The text in the printed matter/published booklets is essentially a list of labels (schedae) with information on each numbered exsiccatal unit. Extensions of the concept occur.
There are several comprehensive bibliographies and treatments on exsiccatae devoted to algae, bryophytes and lichens, lichens and fungi. A printed bibliography on works devoted to vascular plants is missing.
Early history
Exsiccatae are also known under the terms exsiccatal series, exsiccata(e) series, exsiccata(e) works, exsiccatae collections, sometimes exsiccati, exsiccate. Furthermore, the feminine noun term "exsiccata" (Latin, gen. -ae, plur. -ae) for exsiccata series is often not clearly distinguished from the neuter noun "exsiccatum" (Latin, gen. -i, plur. -a) which is used in general for a dried herbarium specimen. There exists also the Latin adjective "exsiccatus, -a, -um" meaning "dried" which is often part of a Latin title of an exsiccata, e.g. Lichenes exsiccati.
The oldest series known as an exsiccata, titled Herbarium vivum recens collectum..., was issued by a German naturalist and pharmacist. It was distributed in 1732. The plant material and text information is for the education of physicia |
https://en.wikipedia.org/wiki/RustDesk | RustDesk is remote access and remote control software that allows maintenance of computers and other devices. The RustDesk client is available for different operating systems. RustDesk aims to be an open-source alternative to remote desktop software such as TeamViewer or AnyDesk; to that end, it can function without additional tools such as VPNs or port forwarding, even behind firewalls or NAT. RustDesk used to be based on the proprietary Sciter UI runtime library, but as of 2022 there are plans to replace it with Flutter.
Features (extract)
Remote access for multiple operating systems (Windows, Linux, macOS, iOS, Android)
End-to-end encryption
Optional self hosted server
File transfer
Chat
TCP tunneling
See also
Comparison of remote desktop software
Remote desktop software
References
External links
source code repository at GitHub
RustDesk in the Chocolatey repository for Windows
RustDesk in the F-Droid (Android) app store
RustDesk in the Google Play (Android) app store
RustDesk in the Apple App Store (iOS)
Linux remote administration software
MacOS remote administration software
Portable software
Cross-platform software
Remote desktop
Windows remote administration software |
https://en.wikipedia.org/wiki/Microbial%20electrochemical%20technologies | Microbial electrochemical technologies (METs) use microorganisms as electrochemical catalysts, merging microbial metabolism with electrochemical processes for the production of bioelectricity, biofuels, H2 and other valuable chemicals. Microbial fuel cells (MFCs) and microbial electrolysis cells (MECs) are prominent examples of METs. While MFCs are used to generate electricity from organic matter, typically in association with wastewater treatment, MECs use electricity to drive chemical reactions such as the production of H2 or methane. Recently, microbial electrosynthesis cells (MES) have also emerged as a promising MET, in which valuable chemicals can be produced in the cathode compartment. Other MET applications include microbial remediation cells, microbial desalination cells, microbial solar cells, microbial chemical cells, etc.
History
The use of microbial cells to produce electricity was first perceived by M. C. Potter in 1911 with the finding that "The disintegration of organic compounds by microorganisms is accompanied by the liberation of electrical energy". A noteworthy addition to MFC research was made by B. Cohen in 1931, when a stack of microbial half fuel cells connected in series was created, capable of producing over 35 V at a current of 0.2 mA. Two breakthroughs were made in the late 1980s, when two of the first known bacteria capable of transporting electrons from the cell interior to extracellular metal oxides without artificial redox mediators, Shewanella (formerly Alteromonas) oneidensis MR-1 and Geobacter sulfurreducens PCA, were isolated. In the late 1990s, Kim et al. showed that the Fe(III)-reducing bacterium S. oneidensis MR-1 was electrochemically active and could generate electricity in an MFC without any added electron mediators. These findings set the basis for the development of electromicrobiology, and the field of MFC research started. However, due to low power generation, it remained doubtful whether MFCs could be applied practically to the reduction of organics in wastewater. |
https://en.wikipedia.org/wiki/Resilience%20engineering | Resilience engineering is a subfield of safety science research that focuses on understanding how complex adaptive systems cope when encountering a surprise. The term resilience in this context refers to the capabilities that a system must possess in order to deal effectively with unanticipated events. Resilience engineering examines how systems build, sustain, degrade, and lose these capabilities.
Resilience engineering researchers have studied multiple safety-critical domains, including aviation, anesthesia, fire safety, space mission control, military operations, power plants, air traffic control, rail engineering, health care, and emergency response to both natural and industrial disasters. Resilience engineering researchers have also studied the non-safety-critical domain of software operations.
Whereas other approaches to safety (e.g., behavior-based safety, probabilistic risk assessment) focus on designing controls to prevent or mitigate specific known hazards (e.g., hazard analysis), or on assuring that a particular system is safe (e.g., safety cases), resilience engineering looks at a more general capability of systems to deal with hazards that were not previously known before they were encountered.
In particular, resilience engineering researchers study how people are able to cope effectively with complexity to ensure safe system operation, especially when they are experiencing time pressure. Under the resilience engineering paradigm, accidents are not attributable to human error. Instead, the assumption is that humans working in a system are always faced with goal conflicts, and limited resources, requiring them to constantly make trade-offs while under time pressure. When failures happen, they are understood as being due to the system temporarily being unable to cope with complexity. Hence, resilience engineering is related to other perspectives in safety that have reassessed the nature of human error, such as the "new look", the "new view", "safety d |
https://en.wikipedia.org/wiki/Cognitive%20systems%20engineering | Cognitive systems engineering (CSE) is a field of study that examines the intersection of people, work, and technology, with a focus on safety-critical systems. The central tenet of cognitive systems engineering is that it views a collection of people and technology as a single unit that is capable of cognitive work, which is called a joint cognitive system.
CSE draws on concepts from cognitive psychology and cognitive anthropology, such as Edwin Hutchins's distributed cognition, James Gibson's ecological theory of visual perception, Ulric Neisser's perceptual cycle, and William Clancey's situated cognition. CSE techniques include cognitive task analysis and cognitive work analysis.
History
Cognitive systems engineering emerged in the wake of the Three Mile Island (TMI) accident. At the time, existing theories about safety were unable to explain how the operators at TMI could be confused about what was actually happening inside of the plant.
Following the accident, Jens Rasmussen did early research on cognitive aspects of nuclear power plant control rooms. This work influenced a generation of researchers who would later come to be associated with cognitive systems engineering, including Morten Lind, Erik Hollnagel, and David Woods.
Following the publication of a textbook on cognitive systems engineering by Kim Vicente in 1999, the techniques employed to establish a cognitive work analysis (CWA) were used to aid the design of any kind of system where humans have to interact with technology. The tools outlined by Vicente were not tried and tested, and there are few, if any, published accounts of the five phases of analysis being implemented.
"Cognitive systems engineering" vs "Cognitive engineering"
The term "cognitive systems engineering" was introduced in a 1983 paper by Hollnagel and Woods.
Although the term cognitive engineering had already been introduced by Don Norman, Hollnagel and Woods deliberately introduced new terminology. They were unhappy with the |
https://en.wikipedia.org/wiki/Quantum%20Experiments%20using%20Satellite%20Technology | Quantum Experiments using Satellite Technology (QuEST), launched in 2017 by the Raman Research Institute, demonstrated quantum communication between points 50 m apart in February 2021 and, on 19 March 2021, over a 300 m line of sight at the Space Applications Centre, in coordination with the Indian Space Research Organisation, the Indian Institute of Science and the Tata Institute of Fundamental Research. It is India's first project on satellite-based long-distance quantum communication.
Technical specifications
Indigenously developed NAVIC receiver for time synchronization
Gimbal mechanism system instead of large aperture telescope for optical alignment.
Shared Quantum secure text.
Shared image transmission.
Quantum-assisted two-way video conferencing between the Space Applications Centre and the Physical Research Laboratory.
The experiment used a robust, high-brightness entangled photon source, a BBM92 protocol implementation, NAVIC-enabled time synchronization and a polarization compensation technique.
It is considered hack-proof because it uses quantum key distribution.
It uses quantum cryptography and was carried out by the Quantum Information and Computing (QuIC) lab.
References
Satellites
Quantum computing
Indian Space Research Organisation |
https://en.wikipedia.org/wiki/Ordinal%20priority%20approach | Ordinal priority approach (OPA) is a multiple-criteria decision analysis method that aids in solving the group decision-making problems based on preference relations.
Description
Various methods have been proposed to solve multi-criteria decision-making problems. The basis of most methods such as analytic hierarchy process and analytic network process is pairwise comparison matrix. The advantages and disadvantages of the pairwise comparison matrix were discussed by Munier and Hontoria in their book. In recent years, the OPA method was proposed to solve the multi-criteria decision-making problems based on the ordinal data instead of using the pairwise comparison matrix. The OPA method is a major part of Dr. Amin Mahmoudi's PhD thesis from the Southeast University of China.
This method uses linear programming approach to compute the weights of experts, criteria, and alternatives simultaneously. The main reason for using ordinal data in the OPA method is the accessibility and accuracy of the ordinal data compared with exact ratios used in group decision-making problems involved with humans.
In real-world situations, the experts might not have enough knowledge regarding one alternative or criterion. In this case, the input data of the problem is incomplete, which needs to be incorporated into the linear programming of the OPA. To handle the incomplete input data in the OPA method, the constraints related to the criteria or alternatives should be removed from the OPA linear-programming model.
Various types of data normalization methods have been employed in multi-criteria decision-making methods in recent years. Palczewski and Sałabun showed that using various data normalization methods can change the final ranks of the multi-criteria decision-making methods. Javed and colleagues showed that a multiple-criteria decision-making problem can be solved by avoiding the data normalization. There is no need to normalize the preference relations and thus, the OPA method does |
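To make the ordinal-data idea concrete, the sketch below shows a deliberately simplified linear program that converts a single ranking of criteria into weights by maximizing the smallest gap between consecutively ranked weights. This is an illustration of the ordinal-to-weights principle only, not the published OPA model, which determines expert, criterion, and alternative weights jointly; the number of criteria and the constraint structure are assumptions made here.

```python
# Simplified ordinal-ranking-to-weights LP (illustration only; NOT the full OPA
# formulation, which handles experts, criteria and alternatives simultaneously).
from scipy.optimize import linprog

m = 4                       # number of criteria, already ranked 1 (best) .. m (worst)
# Decision variables: w_1 .. w_m, z.  Maximize z  <=>  minimize -z.
c = [0.0] * m + [-1.0]

# Constraints: w_r - w_{r+1} >= z for consecutive ranks, and w_m >= z,
# rewritten as A_ub @ x <= b_ub.
A_ub, b_ub = [], []
for r in range(m - 1):
    row = [0.0] * (m + 1)
    row[r], row[r + 1], row[m] = -1.0, 1.0, 1.0     # -(w_r - w_{r+1}) + z <= 0
    A_ub.append(row); b_ub.append(0.0)
row = [0.0] * (m + 1); row[m - 1], row[m] = -1.0, 1.0  # -w_m + z <= 0
A_ub.append(row); b_ub.append(0.0)

A_eq = [[1.0] * m + [0.0]]  # weights sum to 1
b_eq = [1.0]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, None)] * (m + 1))
print(res.x[:m])            # e.g. [0.4, 0.3, 0.2, 0.1] for four ranked criteria
```

With this simplification the solution reduces to the familiar rank-sum weights; the point is only that ordinal information alone, fed into a linear program, suffices to produce a weight vector without any pairwise comparison matrix.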
https://en.wikipedia.org/wiki/BBM92%20protocol | BBM92 is a quantum key distribution protocol without Bell's theorem, developed using polarized entangled photon pairs by Charles H. Bennett, Gilles Brassard and N. David Mermin in 1992. It is named after the trio's surnames (Bennett, Brassard and Mermin, BBM92). Rather than single photons, it uses pairs of entangled photons. A related point of comparison is that the B92 protocol uses only two states instead of the four states used by E91 and BB84.
In the B92 protocol, non-orthogonal states are used for transmission: a 0 can be encoded at 0° and a 1 at 45° in the diagonal basis. The scheme is secure against eavesdropping and regarded as hack-proof for distances of 200–300 m.
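The sifting logic shared by entanglement-based protocols of this kind can be illustrated with a highly idealized simulation: a noiseless source, perfectly correlated outcomes whenever both parties pick the same basis, and no eavesdropper. The sketch below is that simplification, not a faithful model of BBM92 or its security analysis.

```python
# Idealized BBM92-style sifting: keep only the rounds where Alice and Bob chose
# the same measurement basis (simplified, noiseless, no eavesdropper model).
import random

random.seed(1)
n_pairs = 20
alice_key, bob_key = [], []

for _ in range(n_pairs):
    basis_a = random.choice("ZX")       # rectilinear or diagonal basis
    basis_b = random.choice("ZX")
    bit_a = random.randint(0, 1)        # Alice's measurement outcome
    # Perfectly correlated entangled pair: same basis -> same bit,
    # different bases -> Bob's outcome is uniformly random.
    bit_b = bit_a if basis_a == basis_b else random.randint(0, 1)
    if basis_a == basis_b:              # sifting: keep matching-basis rounds only
        alice_key.append(bit_a)
        bob_key.append(bit_b)

print(alice_key == bob_key, len(alice_key), "sifted bits from", n_pairs, "pairs")
```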
References
Quantum cryptography
Photonics |
https://en.wikipedia.org/wiki/C.%20Hart%20Merriam%20Award | The C. Hart Merriam Award is given annually by the American Society of Mammalogists for "outstanding research in mammalogy".
The Merriam Award was established in 1974. Before 1996 the award was given for "outstanding contributions to mammalogy through research, teaching, and service". The award is named in honor of C. Hart Merriam (1855–1942). He was not only a founding member of the American Society of Mammalogists and a physician with an M.D. from Columbia University, but also "naturalist, ethnologist, explorer, scholar, lecturer, author, personal friend of Presidents ..."
References
Biology awards
Awards established in 1974 |
https://en.wikipedia.org/wiki/Ada%20Lovelace%20%28microarchitecture%29 | Ada Lovelace, also referred to simply as Lovelace, is the codename for a graphics processing unit (GPU) microarchitecture developed by Nvidia as the successor to the Ampere architecture, officially announced on September 20, 2022. Named after English mathematician Ada Lovelace, who is often regarded as the first computer programmer, it is the first Nvidia architecture codename to include both a first and a last name. Nvidia announced the architecture along with the new GeForce 40 series consumer GPUs and the RTX 6000 Ada Generation pro workstation graphics card. The new GPUs were revealed to use TSMC's new 5 nm "4N" process, which offers increased efficiency over the previous Samsung 8 nm and TSMC N7 processes used by Nvidia for its previous-generation Ampere architecture.
Background
The Ada Lovelace architecture follows on from the Ampere architecture that was released in 2020. The Ada Lovelace architecture was announced by Nvidia CEO Jensen Huang during a GTC 2022 keynote on September 20, 2022 with the architecture powering Nvidia's GPUs for gaming, workstations and datacenters.
Architectural details
Architectural improvements of the Ada Lovelace architecture include the following:
CUDA Compute Capability 8.9
TSMC 4N process (custom designed for NVIDIA), not to be confused with TSMC's regular N4 node
4th-generation Tensor Cores with FP8, FP16, bfloat16, TensorFloat-32 (TF32) and sparsity acceleration
3rd-generation Ray Tracing Cores, plus concurrent ray tracing and shading and compute
Shader Execution Reordering (SER)
Nvidia video encoder/decoder (NVENC/NVDEC) with 8K 10-bit 60FPS AV1 fixed function hardware encoding
No NVLink support
Streaming multiprocessors (SMs)
CUDA cores
128 CUDA cores are included in each SM.
RT cores
Ada Lovelace features third-generation RT cores
The RTX 4090 features 128 RT cores compared to the 84 in the previous generation RTX 3090 Ti. These 128 RT cores can provide up to 191 TFLOPS of compute with 1.49 TFLOPS per RT core.
A new stage in the r |
https://en.wikipedia.org/wiki/Truthfinder | TruthFinder is an American personal information search website based in San Diego, California.
History
TruthFinder was founded in March 2015 in San Diego, California by Kris Kibak and Joey Rocco.
In December 2021, TechRadar reviewed TruthFinder.
TruthFinder provides information related to people for background checks and reverse address lookup. The website also provides an option to opt-out.
References
Organizations based in California
Internet properties established in 2015
Data brokers
2015 establishments in California |
https://en.wikipedia.org/wiki/Misra%E2%80%93Gries%20heavy%20hitters%20algorithm | Misra and Gries defined the heavy-hitters problem
(though they did not introduce the term heavy-hitters) and described the first algorithm
for it in the paper Finding repeated elements. Their algorithm
extends the Boyer-Moore majority finding algorithm
in a significant way.
One version of the heavy-hitters problem is as follows: Given is a bag b of n elements and an integer k, k ≥ 2. Find the values that occur more than n/k times in b. The Misra–Gries algorithm solves the problem by making two passes over the values in b, while storing at most k − 1 values from b and their number of occurrences during the course of the algorithm.
Misra-Gries is one of the earliest streaming algorithms,
and it is described below in those terms in section #Summaries.
Misra–Gries algorithm
A bag is like a set in which the same value may occur multiple times. Assume that a bag b is available as an array of n elements. In the abstract description of the algorithm, we treat b and its segments also as bags. Henceforth, a heavy hitter of bag b is a value that occurs more than n/k times in it, for some integer k, k ≥ 2.
A k-reduced bag for bag b is derived from b by repeating the following operation until no longer possible: delete k distinct elements from b. From its definition, a k-reduced bag contains fewer than k different values.
The following theorem is easy to prove:
Theorem 1. Each heavy-hitter of b is an element of a k-reduced bag for b.
The first pass of the heavy-hitters computation constructs a k-reduced bag t. The second pass declares an element of t to be a heavy-hitter if it occurs more than n/k times in b. According to Theorem 1, this procedure determines all and only the heavy-hitters. The second pass is easy to program, so we describe only the first pass.
In order to construct t, scan the values in b in arbitrary order; for specificity, the following algorithm scans them in the order of increasing indices. The invariant of the algorithm is that t is a k-reduced bag for the scanned values and d is the number of distinct valu |
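A compact two-pass implementation of the procedure described above might look like the following sketch; the variable names and the specific data structure (a dictionary of candidate counts) are choices made here for readability rather than taken from the original paper.

```python
# Two-pass Misra–Gries heavy hitters: find values occurring more than n/k times.
def misra_gries(values, k):
    # First pass: build a k-reduced bag as at most k-1 candidate values with counts.
    counters = {}
    for v in values:
        if v in counters:
            counters[v] += 1
        elif len(counters) < k - 1:
            counters[v] = 1
        else:
            # "Delete k distinct elements": v together with the k-1 stored values;
            # decrement every counter and drop the ones that reach zero.
            for key in list(counters):
                counters[key] -= 1
                if counters[key] == 0:
                    del counters[key]
    # Second pass: exact counts for the surviving candidates only.
    n = len(values)
    exact = {c: 0 for c in counters}
    for v in values:
        if v in exact:
            exact[v] += 1
    return [c for c, cnt in exact.items() if cnt > n / k]

print(misra_gries([1, 4, 4, 4, 2, 4, 4, 3], k=3))   # [4]: 5 occurrences > 8/3
```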
https://en.wikipedia.org/wiki/Vogue%20Scandinavia | Vogue Scandinavia is the Scandinavian edition of the American fashion and lifestyle monthly magazine Vogue. The magazine has been published since August 2021 and is the twenty-sixth local edition of Vogue.
History
In June 2020, Vogue announced a Scandinavian edition of the magazine, with Martina Bonnier as Editor-in-Chief, featuring Scandinavian fashion as well as covering politics of the Nordic region. It was announced that the magazine would be published in English so it would be accessible worldwide. It was also revealed that, in an effort to be more sustainable, the magazine would be the first edition of Vogue not to be sold in physical shops. In May 2021, Vogue Scandinavia opened a digital flagship store.
The first issue was released in August 2021, with Greta Thunberg, a Swedish environmental activist, on its cover. In February 2022, Prince Nikolai of Denmark appeared on the fourth issue of the magazine.
See also
List of Vogue Scandinavia cover models
References
External links
Condé Nast magazines
Fashion websites
Lifestyle magazines
Magazines established in 2020
Online magazines
Women's fashion magazines
Vogue (magazine) |
https://en.wikipedia.org/wiki/List%20of%20ATSC%203.0%20television%20stations%20in%20the%20United%20States | This is a list of United States television stations which broadcast using the ATSC 3.0 standard, branded as "NextGen TV".
References
United States
ATSC 3.0 |
https://en.wikipedia.org/wiki/Height%20above%20mean%20sea%20level | Height above mean sea level is a measure of the vertical distance (height, elevation or altitude) of a location in reference to a historic mean sea level taken as a vertical datum. In geodesy, it is formalized as orthometric heights.
The quantity is called "metres above mean sea level" in the metric system, while in United States customary and imperial units it would be called "feet above mean sea level".
Mean sea levels are affected by climate change and other factors and change over time. For this and other reasons, recorded measurements of elevation above sea level at a reference time in history might differ from the actual elevation of a given location over sea level at a given moment.
Uses
Metres above sea level is internationally the standard measurement of the elevation or altitude of:
Geographic locations such as towns, mountains and other landmarks.
The top of buildings and other structures.
Flying objects such as airplanes or helicopters in China and below Transition Altitude (or TA) in Russia and many CIS member countries.
Methods of measurement
The elevation or altitude in metres above sea level of a location, object, or point can be determined in a number of ways. The most common include:
Global Navigation Satellite System (like GPS), where a receiver determines a location from pseudoranges to multiple satellites. A geoid is needed to convert the 3D position to sea-level elevation.
Altimeter, that measures atmospheric pressure, which decreases as altitude increases. As atmospheric pressure changes with the weather too, a recent local measure of the pressure at a known altitude is needed to calibrate the altimeter.
Stereoscopy in aerial photography.
Aerial lidar and satellite laser altimetry.
Aerial or satellite radar altimetry.
Surveying, especially levelling.
Accurate measurement of historical mean sea levels is complex. Land mass subsidence (as occurs naturally in some regions) can give the appearance of rising sea levels. Conversely, |
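For the GNSS method listed above, converting the receiver's ellipsoidal height to a height above mean sea level is, to a first approximation, a subtraction of the geoid undulation at that location; the numbers in the sketch below are made-up examples.

```python
# Orthometric (above-mean-sea-level) height from a GNSS fix, to first approximation:
#   H ≈ h - N,  where h is the ellipsoidal height reported by the receiver and
#   N is the geoid undulation (geoid height above the ellipsoid) at that location.
def height_above_msl(ellipsoidal_height_m: float, geoid_undulation_m: float) -> float:
    return ellipsoidal_height_m - geoid_undulation_m

# Example with made-up numbers: the receiver reports h = 73.4 m and a local geoid
# model gives N = 46.1 m, so the location is roughly 27.3 m above mean sea level.
print(height_above_msl(73.4, 46.1))
```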
https://en.wikipedia.org/wiki/Certified%20Organic%20Sunscreen | A Certified Organic Sunscreen, also known as Petrochemical-Free Sunscreen, is a third party certified sunscreen product consisting of certified and approved organic ingredients, with typically zinc oxide acting as the photo-protector. An organic sunscreen is verified and approved by a certifier to international or national organic standards, such as NSF/ANSI 305 and USDA organic, which define production and labelling requirements for personal care products containing organic ingredients. These standards are complemented by existing sunscreen regulatory bodies such as the FDA that regulate the efficacy of the sunscreen, safety and permitted ingredients. Generally speaking, sunscreen has photo-protective properties that reduce the risk of skin cancer and ageing with relation to the SPF value and proper application.
Certified organic sunscreen is part of a broader trend towards certified organic cosmetics and certified natural cosmetics. Especially in the sunscreen market, developers have been 'pushed' towards alternatives to petrochemical UV filters due to their lack of safety data and their detrimental ecological effects, which has resulted in various petrochemical UV filters being banned in different countries and ecological areas.
Organisations that manage standards and certifiers generally provide allowances for natural ingredients, such as water, as well as minerals such as zinc oxide and titanium dioxide, towards their organic calculation, as these have photo-protective properties and a well-regarded safety profile.
Certified Organic refers to the processing and production of a personal care product without the use of synthetic pesticides, herbicides, petrochemicals, aromatic hydrocarbons and other contaminants or practices. Petrochemical suppliers may promote their ingredients as organic compounds; however, it is essential not to confuse this with the term "organic" or "certified organic". In this context, "organic compounds" simply means that the substances are d |
https://en.wikipedia.org/wiki/Aleste%20Gaiden | Aleste Gaiden is a 1989 vertically scrolling shooter video game developed and published by Compile for the MSX2 home computer. A follow-up to Aleste (1988), it was included as part of the autumn special edition of Disc Station, a monthly disk publication by Compile. It is a side story to the main series, taking place in an alternative continuity. Controlling the soldier Raymond Waizen, protagonist of the first game, who wears a cybernetic ninja suit, the player must overthrow the supercomputer DIA 51 by fighting waves of enemies and bosses, while avoiding collisions with their projectiles and other obstacles.
Gameplay
Aleste Gaiden is a vertical-scrolling shoot 'em up game. The plot takes place in an alternative continuity and follows Raymond Waizen, protagonist of the first Aleste game, who wears a cybernetic ninja suit codenamed in order to overthrow the supercomputer DIA 51. Its gameplay differs from the original entry; the player controls Raymond instead of a ship through five increasingly difficult stages over a constantly scrolling background, populated with an assortment of enemy forces and obstacles such as pits that must be avoided by jumping, and the scenery never stops moving until a boss is reached, which must be fought to progress further.
Unlike other Aleste titles, enemies move in a preset formation instead of being determined by an artificial intelligence. There are three types of power-ups that can be collected by the player, which come in capsules marked with a chikara symbol and can equip Raymond with shadow clones of himself. Gaiden employs a checkpoint system in which a downed player starts off at the beginning of the checkpoint they reached before dying; however, the player respawns immediately when fighting a boss. Getting hit results in losing a life, as well as a penalty of decreasing Raymond's firepower to its original state, and the game is over once all lives are lost.
Development and release
Aleste Gaiden was created by Compile, which |
https://en.wikipedia.org/wiki/Passive%20daytime%20radiative%20cooling | Passive daytime radiative cooling (PDRC) is a zero-energy building cooling method proposed as a solution to reduce air conditioning, lower the urban heat island effect, cool human body temperatures in extreme heat, move toward carbon neutrality and control global warming by enhancing terrestrial heat flow to outer space through the installation of thermally-emissive surfaces on Earth that require zero energy consumption or pollution. Application of PDRCs may also increase the efficiency of systems that benefit from better cooling, such as photovoltaic systems, dew collection techniques, and thermoelectric generators.
PDRC surfaces are designed to be high in solar reflectance (to minimize heat gain) and strong in longwave infrared (LWIR) thermal radiation heat transfer through the atmosphere's infrared window (8–13 µm) to cool temperatures even during the daytime. It is also referred to as passive radiative cooling (PRC), daytime passive radiative cooling (DPRC), radiative sky cooling (RSC), photonic radiative cooling, and terrestrial radiative cooling. PDRC differs from solar radiation management because it increases radiative heat emission rather than merely reflecting the absorption of solar radiation.
Some estimates propose that if 1–2% of the Earth's surface area were dedicated to PDRC, warming would cease and temperature increases would be rebalanced to survivable levels. Regional variations provide different cooling potentials, with desert and temperate climates benefiting more from application than tropical climates, attributed to the effects of humidity and cloud cover on reducing the effectiveness of PDRCs. Low-cost scalable PDRC materials feasible for mass production have been developed, such as coatings, thin films, metafabrics, aerogels, and biodegradable surfaces.
PDRCs can be included in self-adaptive systems, 'switching' from passive cooling to heating to mitigate any potential "overcooling" effects in urban environments. They have also been develo |
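The interplay of the two design targets mentioned above, high solar reflectance and high LWIR emissivity, can be illustrated with a very rough gray-body energy balance. The sketch below ignores spectral selectivity, convection and conduction, and the effective sky temperature and irradiance values are assumed numbers, so it indicates only the direction of the trade-off, not a validated cooling power.

```python
# Very rough PDRC energy balance (gray body, no convection/conduction, assumed
# effective sky temperature): positive result = net radiative cooling in W/m^2.
SIGMA = 5.670e-8          # Stefan–Boltzmann constant, W m^-2 K^-4

def net_cooling_power(t_surface_k, t_sky_k, lwir_emissivity, solar_reflectance,
                      solar_irradiance=1000.0):
    radiated = lwir_emissivity * SIGMA * (t_surface_k**4 - t_sky_k**4)
    absorbed_solar = (1.0 - solar_reflectance) * solar_irradiance
    return radiated - absorbed_solar

# A surface at ambient 300 K under an assumed 270 K effective sky:
print(net_cooling_power(300.0, 270.0, lwir_emissivity=0.95, solar_reflectance=0.95))
print(net_cooling_power(300.0, 270.0, lwir_emissivity=0.95, solar_reflectance=0.70))  # poorer reflector warms up
```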
https://en.wikipedia.org/wiki/Pasta%20by%20Design | Pasta by Design is a book by George L. Legendre, with a foreword by Paola Antonelli, and photography by Stefano Graziani. It is based on an idea by Marco Guarnieri.
Overview
Pasta by Design spans the fields of architecture, food, and popular science. The book features 92 pasta shapes, each depicted by a photograph, a mathematical equation, a 3D visual, and a short paragraph on geographic provenance and cooking etiquette. It was first published in 2011 by Thames & Hudson, while a German translation was published in 2012 by Springer Verlag.
Taxonomy
Pasta by Design is primarily a work of taxonomy, or classification. The critical inventory of shapes recalls the compilations of building-related knowledge known in the nineteenth century as architectural treatises, in which the source material was systematically drawn and formatted as a catalogue. To organise and classify the large variety of pasta shapes, the book employed principles of phylogenetics. The first application of phylogenetics to architectural criticism appeared in Phylogenesis: FOA’s Ark (2003), an architectural monograph by Foreign Office Architects, the first such publication to catalogue projects exclusively by design property, instead of the commonly used markers of programme, location, and client. Pasta by Design opens with a phylogenetic chart and uses it to classify 92 pasta shapes. Unlike FOA’s Ark its criteria of classification are couched in analytic mathematics.
Morphology
The organizing principle of classification is the morphology of each pasta shape, reduced to its elemental characteristics and expressed by simple mathematical relationships. Shapes which may look dissimilar at first glance, such as Sagne Incannulate and Cappelletti, may still be described with the same mathematical relationships and hence may turn out to be more closely related than is immediately apparent. Technically, each shape is depicted by three parametric equations of two mathematical functions, the sine and cosin |
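The book's device of describing each shape with parametric equations built from sines and cosines can be illustrated with a generic helical ribbon; the equations in the sketch below are invented for illustration and are not taken from Pasta by Design.

```python
# A generic helical ribbon surface built from sines and cosines over two
# parameters (i, j) — invented for illustration, not one of the book's 92 shapes.
import numpy as np

i = np.linspace(0, 4 * np.pi, 200)    # angle along the helix
j = np.linspace(0, 1, 20)             # position across the ribbon
I, J = np.meshgrid(i, j)

r = 1.0 + 0.5 * J                     # ribbon extends outward from the axis
x = r * np.cos(I)
y = r * np.sin(I)
z = 0.3 * I                           # constant pitch along the axis

# x, y, z now hold a 20 x 200 grid of surface points that could be rendered,
# e.g. with matplotlib's plot_surface, to visualize the shape.
print(x.shape, y.shape, z.shape)
```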
https://en.wikipedia.org/wiki/Safety%20and%20liveness%20properties | Properties of an execution of a computer program—particularly for concurrent and distributed systems—have long been formulated by giving safety properties ("bad things don't happen") and liveness properties ("good things do happen").
A simple example will illustrate safety and liveness. A program S is totally correct with respect to a precondition P and postcondition Q if any execution started in a state satisfying P terminates in a state satisfying Q. Total correctness is a conjunction of a safety property and a liveness property:
The safety property prohibits these "bad things": executions that start in a state satisfying P and terminate in a final state that does not satisfy Q. For a program S, this safety property is usually written using the Hoare triple {P} S {Q}.
The liveness property, the "good thing", is that every execution that starts in a state satisfying P terminates.
Note that a bad thing is discrete, since it happens at a particular place during execution.
A "good thing" need not be discrete, but the liveness property of termination is discrete.
Formal definitions that were ultimately proposed for safety properties and liveness properties demonstrated that this decomposition is not only intuitively appealing but is also complete: all properties of an execution are a conjunction of safety and liveness properties. Moreover, undertaking the decomposition can be helpful, because the formal definitions enable a proof that different methods must be used for verifying safety properties versus for verifying liveness properties.
Safety
A safety property proscribes discrete bad things from occurring during an execution. A safety property thus characterizes what is permitted by stating what is prohibited. The requirement that the bad thing be discrete means that a bad thing occurring during execution necessarily occurs at some identifiable point.
Examples of a discrete bad thing that could be used to define a safety property include:
An execution that starts in a state satisfy |
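One way to make the "discrete bad thing" concrete is a monitor over a finite execution prefix: a safety violation, if it occurs, is observable at a specific step, whereas a liveness property such as termination can never be refuted by any finite prefix. The sketch below is only an illustration of that distinction, not a verification tool; the trace and invariant are made-up examples.

```python
# Safety vs. liveness on finite execution prefixes (illustration only).
# A state here is just an integer; the invariant (safety property) is "state >= 0".

def first_safety_violation(trace, invariant):
    """Return the index of the first state violating the invariant, else None.

    A safety violation is discrete: it happens at an identifiable point."""
    for step, state in enumerate(trace):
        if not invariant(state):
            return step
    return None

trace = [3, 2, 1, 0, -1, 5]
print(first_safety_violation(trace, lambda s: s >= 0))   # 4: the bad thing's position

# Termination is a liveness property: no finite prefix of a possibly ongoing
# execution can demonstrate that the program will never terminate, so a prefix
# monitor like the one above cannot refute it; only reasoning about the complete,
# possibly infinite, execution can.
```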
https://en.wikipedia.org/wiki/Nervos%20Network | Nervos Network is a blockchain platform which consists of multiple blockchain layers that are designed for different functions. The foundational layer is known as the Common Knowledge Base, whilst the native cryptocurrency of this layer is called CKB. This foundational layer uses a proof-of-work consensus model. Smart contracts and decentralized applications can be deployed on any layer.
Nervos Network was founded in 2018 by Jan Xie, Terry Tai, Kevin Wang, Daniel Lv, and Cipher Wang.
Architecture
Nervos Network utilizes multiple blockchain layers for different functions. The base layer prioritizes security and decentralization, and is optimized to verify transactions. It can settle transactions submitted from upper layers and resolves disputes. Layer 2 and above can favor the greater throughput demands of software applications.
Layer 1
The foundational layer of Nervos Network is known as the Common Knowledge Base. The native cryptocurrency of this layer is referred to as CKB (or CKByte). The layer also stores digital assets and executes smart contracts. One CKB represents one byte of storage on the blockchain.
NC MAX
Layer 1 achieves cryptographic consensus through proof of work, using a modified version of Bitcoin's Nakamoto consensus algorithm: NC-MAX. NC-MAX was presented at the Internet Society's Network and Distributed System Security (NDSS) Symposium in 2022. The consensus process uses a novel hash function called "Eaglesong."
Cell Model
The accounting method on layer 1 is an expansion of Bitcoin's UTXO model, and is dubbed the "Cell model". This model is programmable, thereby supporting smart contracts. Additionally, a cell is able to store data on-chain, such as non-fungible tokens (NFTs), compiled code, or serialized data like JSON strings.
CKB-VM
The CKB virtual machine (CKB-VM) is a software-based emulated computer that executes smart contracts on Nervos Network's base layer.
Like the Ethereum virtual machine, CKB is a Turing-complete |
https://en.wikipedia.org/wiki/JMA%20Wireless | JMA Wireless is an American wireless networking hardware manufacturing company in Syracuse, New York. It was founded in 2012 by the current chief executive officer John Mezzalingua.
It offers Open-RAN compliant 5G Radio access network (RAN) products, 5G millimeter wave products, private wireless technology hardware products, focusing on design, code, and manufacture of 4G and 5G devices in the United States. JMA Wireless created the first fully virtualized RAN for carrier and private networks as well as the first indoor 5G millimeter wave radio in the United States.
On May 19, 2022, JMA Wireless and Syracuse University announced the signing of a 10-year naming rights deal of the on-campus stadium, renaming the Carrier Dome after 42 years. The stadium was renamed as the JMA Wireless Dome, referred to as the JMA Dome.
History
The company was founded in 2012 by John D. Mezzalingua as John Mezzalingua Associates LLC, initially employing about 150 people. He previously ran the Production Products Company (PPC), which was sold to Belden Incorporated for $515.7 million.
The headquarters, located in Clay, New York, was expanded in 2017. In 2021, the company operated manufacturing facilities in Syracuse, with additional R&D, manufacturing, and sales staff in Dallas, Austin, Chicago, Boulder, Richmond, Virginia, and Europe. In 2022, the company's $100 million 5G manufacturing campus was inaugurated by New York state Governor Kathy Hochul.
It is a member of the Wireless Infrastructure Association.
Software and other technology
JMA Wireless operates on a software-based XRAN architecture, which integrates processes into a single common server and removes the need for radios and jumpers in Distributed antenna system (DAS) deployments.
In 2018, JMA Wireless acquired 5G radio provider PHAZR for an undisclosed amount. The deal allowed JMA Wireless to offer 5G RAN technologies that support spectrum from 600 MHz all the way up to 47 GHz.
Notable projects
JMA Wireless has deployed 5G in |
https://en.wikipedia.org/wiki/International%20System%20Safety%20Society | The International System Safety Society (ISSS) is a non-profit professional organization for system safety engineers. ISSS was established in 1963 to support the development of system safety as a distinct engineering discipline.
ISSS has local chapters in several states across the United States, as well as in Singapore and Canada. The society currently has members from over 25 countries across the world.
History
The event recognized as the founding of the Society occurred on December 4, 1963, in the main lecture hall at the School of Aviation Safety on the University of Southern California campus in Los Angeles. The gathering consisted of about 40 individuals, including many students and others from the USAF Aerospace Safety Center, some USC faculty members, along with system safety representatives of the numerous aerospace companies located in the area.
Events
Since the first event in 1972, the society has sponsored the annual International System Safety Conference. The society also sponsors annual member awards which are presented at the annual awards banquet during the International System Safety Conference. In addition, the society and local chapters organize webinars and symposia for society members.
ISSS is also one of the sponsoring societies for the annual Reliability and Maintainability Symposium. The society is also a sponsor of the Board of Certified Safety Professionals (BCSP) Global Learning Summit.
Publications
The society publishes the Journal of System Safety, a triannual peer-reviewed academic journal, as well as periodic member newsletters and formerly the System Safety Analysis Handbook. The journal was established in 1965 as Hazard Prevention and obtained its current name in 1999. It is considered one of the important journals in the field of reliability and safety and is one of the oldest in continuous publication. The journal seeks to advance the discipline of system safety across a wide range of application domains, including aerospa |
https://en.wikipedia.org/wiki/IFIP%20Working%20Group%202.3 | IFIP Working Group 2.3 on Programming Methodology is a working group of the International Federation for Information Processing (IFIP). Its main aim is to increase programmers’ ability to compose programs. To this end, WG2.3 provides an international forum for discussion and cross-fertilization of ideas between researchers in programming methodology and neighboring fields. Generally, members report on work in progress and expect suggestions and advice. Discussions are often broadened by inviting "observers" to meetings as full participants, some of whom eventually become members.
Scope
This scope of work in WG2.3 was introduced by Edsger W. Dijkstra in meeting 0 (Oslo, Norway, July 1969).
Identification of sources of difficulties encountered in present-day programming;
The interdependence between the formulation of problems and the formulation of programs, and the mapping of relations existing in the world of problems into the relations among programs and their components;
Intellectual disciplines and problem-solving techniques that can aid programmers in the composition of programs;
The problem of achieving program reliability;
The consequences of requirements for program adaptability;
The problem of provability of program correctness and its influence on the structure of programs and on the process of their composition;
Guidelines for partitioning large programming tasks and defining the interfaces between the parts;
Software for mechanized assistance to program composition.
History
In December 1968, IFIP Working Group 2.1 adopted the proposal by Aad van Wijngaarden as a successor to Algol 60 (ultimately leading to ALGOL 68). A group of members of WG2.1 opposed it and produced a minority report. The group also felt that rather than just programming languages, a forum was needed to discuss the general problem of programming. Another impetus for the creation of a group was the findings of the first of the NATO Software Engineering Conferences, held in 1968, wh |
https://en.wikipedia.org/wiki/Omega%20Strikers | Omega Strikers is a free-to-play action sport video game, developed and published by Canada-based Odyssey Interactive. The game features short, three-on-three online multiplayer matches, in which players compete to score more goals than the opposing team. It was released on April 27, 2023 for Microsoft Windows, Nintendo Switch, iOS and Android, on May 5 for Xbox One and Xbox Series X/S, and on May 19 for PlayStation 5 and PlayStation 4.
Gameplay
Matches in Omega Strikers consist of two teams of three players, who fight to score by sending the Core, a large hockey puck, into the opponent's goal. The game is presented from a top-down perspective, with the whole playfield visible at once.
Players can choose from a cast of characters, known as Strikers. Each Striker has a unique set of abilities which can be used to attack enemy players, move the Core, buff allies, debuff enemies, and more. When a character takes damage, their stagger bar decreases, and once it reaches 0 it becomes easier for their opponents to knock them back, or even off the field entirely for a short duration. Through interactions like scoring goals, knocking out enemies, making saves, etc., players obtain experience points that can power up strikers throughout the game. There are also various power-ups called "awakenings" that serve as upgrades throughout the game.
Development
Omega Strikers is developed and self-published by Odyssey Interactive, a studio composed of former staff of Riot Games.
The game entered a Steam-exclusive closed beta testing phase on 16 September 2022, and transitioned to an open beta shortly after.
References
External links
2023 video games
Action games
Android (operating system) games
Air hockey video games
Free-to-play video games
Indie games
IOS games
Multiplayer online games
Nintendo Switch games
PlayStation 4 games
PlayStation 5 games
Science fiction video games
Sports video games
Strategy video games
Unreal Engine games
Video games developed in Canada
Video ga |
https://en.wikipedia.org/wiki/Tripartite%20symbiosis | Tripartite symbiosis is a type of symbiosis involving three species. This can include any combination of plants, animals, fungi, bacteria, or archaea, often in interkingdom symbiosis.
Ants
Fungus-growing ants
Ants of the tribe Attini cultivate fungi. Microfungi specialized as parasites of the fungus gardens have coevolved with them.
Allomerus-Hirtella-Trimmatostroma
Allomerus decemarticulatus ants use Trimmatostroma sp. to create structures within Hirtella physophora. The fungi are connected endophytically and actively transfer nitrogen.
Lichen
The mycobiont in a lichen can form a relationship with both cyanobacteria and green algae as photobionts concurrently.
Legumes
Rhizobia are nitrogen-fixing bacteria that form symbiotic relationships with legumes. Sometimes this is aided by the presence of a fungal species, which is most effective in undisturbed soil. The presence of mycorrhizae can improve rhizobial–liquorice nutrient transfer during droughts. Soybeans in particular can better withstand soil salinity in the presence of both rhizobia and mycorrhizae.
References
Symbiosis |
https://en.wikipedia.org/wiki/Crungus | A Crungus is an imaginary creature found in artificial intelligence text-to-image models, sometimes also referred to as a digital cryptid. Twitch streamer and voice actor Guy Kelly found that typing the made-up word into the Craiyon image generator consistently produced pictures of a monstrous, hairy humanoid.
He later tweeted about this, which resulted in a very long thread of reactions and experiments with "Crungus" and variations of this AI prompt, including on other engines.
Since Craiyon version 2 was introduced in 2023, the prompt "Crungus" no longer produces the same imaginary creature.
Origins
It is unclear how the Crungus in Craiyon's output came into existence. Kelly thinks an error in the AI software models is the most likely explanation. In any case, the Krampus was quickly eliminated as a model. Although the horned mythical figure from the Alpine region has a similar name and his mask looks almost exactly the same, the term "Krampus" on Craiyon provides different images.
Kelly also speculated that the AI was responding to the "-rungus" suffix, noting the similar appearance of heavy metal performer Oderus Urungus, and could be interpreting the word as something "orc-based", in reference to the fantasy creatures.
See also
Artificial intelligence art
Loab, another AI generated cryptid
References
Artificial intelligence art
Text-to-image generation
Fictional characters introduced in 2022
Internet memes introduced in 2022
Internet meme characters |
https://en.wikipedia.org/wiki/Hemoglycin | Hemoglycin (previously termed hemolithin) is a space polymer that is the first polymer of amino acids found in meteorites.
Structure
Structural work has determined that its 1494 Dalton core unit (Glycine18 / Hydroxy-glycine4 / Fe2O4) contains iron, but not lithium, leading to the more general term hemoglycin for these molecules. The hemoglycin core contains a total of 22 glycine residues in an anti-parallel beta sheet chain that is terminated at each end by an iron atom plus two oxygens. Four of these glycine residues are oxidized to hydroxy-glycine with hydroxy groups (-OH) on the alpha carbon. This structure was determined by mass spectrometry of meteoritic solvent extracts and has been confirmed in X-ray scattering by crystals of hemoglycin, and also by optical absorption. Crystals show a 480 nm characteristic absorption that can only exist when hydroxy-glycine residues have “R” chirality and are C-terminal bonded to Fe.
History
Because hemoglycin has now been found to be the dominant polymer of amino acids in 6 different meteorites (Allende, Acfer 086, Efremovka, Kaba, Orgueil and Sutter's Mill), each time with the same structure, it has been proposed that it is produced by a process of template replication. The measured 480 nm absorbance is larger than expected for a racemic distribution of R and S chirality in the hydroxy-glycine residues, indicating an R chirality excess in the polymer. Modeling of template replication that is assumed to depend on 480 nm absorption leads to an excess of R chirality and thus is consistent with this finding.
Significance
Hemoglycin is a completely abiotic molecule that forms in molecular clouds going on to protoplanetary disks, way before biochemistry on exoplanets like Earth begins. Hemoglycin via its glycine could seed an exoplanet (one able to support early biochemistry) but its main function appears to be the accretion of matter via formation of an extensive low-density lattice in space in a protoplanetary disk. Besid |
https://en.wikipedia.org/wiki/List%20of%20organisms%20with%20names%20derived%20from%20Indigenous%20languages%20of%20the%20Americas | This list includes organisms whose common or scientific names are drawn from indigenous languages of the Americas. When the common name of the organism in English derives from an indigenous language of the Americas, it is given first.
In biological nomenclature, organisms receive scientific names, which are formally in Latin, but may be drawn from any language and many have incorporated words from indigenous language of the Americas. These scientific names are generally formally published in peer-reviewed journal articles or larger monographs along with descriptions of the named taxa and ways to distinguish them from other taxa.
List
References
indigenous
Taxonomy (biology)
Taxonomic lists
Indigenous languages of the Americas |
https://en.wikipedia.org/wiki/Floral%20morphology | In botany, floral morphology is the study of the diversity of forms and structures presented by the flower, which, by definition, is a branch of limited growth that bears the modified leaves responsible for reproduction and protection of the gametes, called floral pieces.
Fertile leaves or sporophylls carry sporangiums, which will produce male and female gametes and therefore are responsible for producing the next generation of plants. The sterile leaves are modified leaves whose function is to protect the fertile parts or to attract pollinators. The branch of the flower that joins the floral parts to the stem is a shaft called the pedicel, which normally dilates at the top to form the receptacle in which the various floral parts are inserted.
All spermatophytes ("seed plants") possess flowers as defined here (in a broad sense), but the internal organization of the flower is very different in the two main groups of spermatophytes: living gymnosperms and angiosperms. Gymnosperms may possess flowers that are gathered in strobili, or the flower itself may be a strobilus of fertile leaves. In contrast, a typical angiosperm flower possesses verticils or ordered whorls that, from the outside in, are composed first of sterile parts, commonly called sepals (if their main function is protective) and petals (if their main function is to attract pollinators), and then the fertile parts, with reproductive function, which are composed of verticils or whorls of stamens (which carry the male gametes) and finally carpels (which enclose the female gametes).
The arrangement of the floral parts on the axis, the presence or absence of one or more floral parts, the size, the pigmentation and the relative arrangement of the floral parts are responsible for the existence of a great variety of flower types. Such diversity is particularly important in phylogenetic and taxonomic studies of angiosperms. The evolutionary interpretation of the different flower types takes into account aspects of |
https://en.wikipedia.org/wiki/Nucleon%20magnetic%20moment | The nucleon magnetic moments are the intrinsic magnetic dipole moments of the proton and neutron, symbols μp and μn. The nucleus of an atom comprises protons and neutrons, both nucleons that behave as small magnets. Their magnetic strengths are measured by their magnetic moments. The nucleons interact with normal matter through either the nuclear force or their magnetic moments, with the charged proton also interacting by the Coulomb force.
The proton's magnetic moment, surprisingly large, was directly measured in 1933 by Otto Stern's team at the University of Hamburg. While the neutron was determined to have a magnetic moment by indirect methods in the mid-1930s, Luis Alvarez and Felix Bloch made the first accurate, direct measurement of the neutron's magnetic moment in 1940. The proton's magnetic moment is exploited to make measurements of molecules by proton nuclear magnetic resonance. The neutron's magnetic moment is exploited to probe the atomic structure of materials using scattering methods and to manipulate the properties of neutron beams in particle accelerators.
The existence of the neutron's magnetic moment and the large value for the proton magnetic moment indicate that nucleons are not elementary particles. For an elementary particle to have an intrinsic magnetic moment, it must have both spin and electric charge. The nucleons have spin ħ/2, but the neutron has no net charge. Their magnetic moments were puzzling and defied a valid explanation until the quark model for hadron particles was developed in the 1960s. The nucleons are composed of three quarks, and the magnetic moments of these elementary particles combine to give the nucleons their magnetic moments.
Description
The CODATA recommended value for the magnetic moment of the proton is μp = 2.79284734 μN, or 1.41060679×10⁻²⁶ J⋅T⁻¹. The best available measurement for the value of the magnetic moment of the neutron is μn = −1.9130427 μN. Here, μN is the nuclear magneton, a standard unit for the magnetic moments of nuclear components, and μB is the Bohr mag |
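The statement that quark moments combine to give the nucleon moments can be made concrete with the standard non-relativistic constituent quark model, in which the SU(6) spin-flavor wavefunction yields μp = (4μu − μd)/3 and μn = (4μd − μu)/3. In the sketch below the constituent quark mass (about 336 MeV) is an assumed, conventional value chosen for illustration, not a measured quantity.

```python
# Constituent quark model estimate of nucleon magnetic moments (in nuclear magnetons).
# Assumes equal up/down constituent masses of ~336 MeV; illustrative values only.
M_PROTON_MEV = 938.272
M_QUARK_MEV = 336.0                     # assumed constituent quark mass

# Quark moments: mu_q = (charge of quark q) * (m_p / m_q), in nuclear magnetons.
mu_u = (+2.0 / 3.0) * (M_PROTON_MEV / M_QUARK_MEV)
mu_d = (-1.0 / 3.0) * (M_PROTON_MEV / M_QUARK_MEV)

# SU(6) spin-flavor combination for the spin-1/2 nucleons:
mu_p = (4.0 * mu_u - mu_d) / 3.0        # ~ +2.79 mu_N (measured ~ +2.793)
mu_n = (4.0 * mu_d - mu_u) / 3.0        # ~ -1.86 mu_N (measured ~ -1.913)
print(f"mu_p = {mu_p:.2f} mu_N, mu_n = {mu_n:.2f} mu_N")
```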
https://en.wikipedia.org/wiki/ITU-T%20Study%20Group%2015 | The ITU-T Study Group 15 (SG15) 'Transport' is a standardization committee of ITU-T concerned with networks, technologies and infrastructures for transport, access and home. It is responsible for standards such as GPON, G.fast, etc.
Administratively, SG15 is a statutory meeting of the World Telecommunication Standardization Assembly (WTSA), which creates the ITU-T Study Groups and appoints their management teams. The secretariat is provided by the Telecommunication Standardization Bureau (under Director Chaesub Lee).
The goal of SG15 is to produce recommendations (international standards) for networks.
Area of work
SG15 focuses on developing standards and recommendations related to optical transport networks, access network transport, and associated technologies.
Some of the key responsibilities of SG15 include:
Developing international standards for optical and transport networks, which covers fiber-optic communication systems, dense wavelength division multiplexing (DWDM), and synchronization aspects.
Addressing issues related to access network transport, such as digital subscriber lines (DSL), gigabit-capable passive optical networks (GPON), and Ethernet passive optical networks (EPON).
Developing recommendations for network management, control, and performance monitoring, as well as resilience, protection, and restoration mechanisms.
SG15 collaborates with other ITU-T study groups, regional standardization bodies, and industry stakeholders to ensure a comprehensive and coordinated approach to global telecommunication standardization.
See also
ITU-T
References
External links
ITU main site
ITU-T Study Group 15 web site
International Telecommunication Union
ITU-T Study Groups
Computer networking |
https://en.wikipedia.org/wiki/Statistical%20associating%20fluid%20theory | Statistical associating fluid theory (SAFT) is a chemical theory, based on perturbation theory, that uses statistical thermodynamics to explain how complex fluids and fluid mixtures form associations through hydrogen bonds. Widely used in industry and academia, it has become a standard approach for describing complex mixtures. Since it was first proposed in 1990, SAFT has been used in a large number of molecular-based equation of state models for describing the Helmholtz energy contribution due to association.
Overview
SAFT is a Helmholtz energy term that can be used in equations of state that describe the thermodynamic and phase equilibrium properties of pure fluids and fluid mixtures. SAFT was developed using statistical mechanics. SAFT models the Helmholtz free energy contribution due to association, i.e. hydrogen bonding. SAFT can be used in combination with other Helmholtz free energy terms. Other Helmholtz energy contributions consider for example Lennard-Jones interactions, covalent chain-forming bonds, and association (interactions between segments caused by, for example, hydrogen bonding). SAFT has been applied to a wide range of fluids, including supercritical fluids, polymers, liquid crystals, electrolytes, surfactant solutions, and refrigerants.
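As a concrete illustration, the association contribution that SAFT-type models add to the Helmholtz energy can be sketched for the simple two-site ("2B") bonding scheme. This is a minimal sketch in which the association strength Δ and the density are taken as given numbers, whereas a full SAFT model computes Δ from the segment size, association energy and radial distribution function:

```python
import math

def association_helmholtz_2B(rho, delta):
    """Reduced association Helmholtz energy a_assoc = A_assoc / (N k T) for a
    molecule with two association sites A and B that bond only to each other
    (the so-called 2B scheme), using the standard Wertheim/TPT1 expressions
        X = 1 / (1 + rho * X * delta)              (mass-action balance)
        a_assoc = sum over sites of (ln X - X/2 + 1/2).

    rho   : molecular number density
    delta : association strength between an A site and a B site
    """
    # For the 2B scheme the mass-action equation has a closed-form solution.
    x = (-1.0 + math.sqrt(1.0 + 4.0 * rho * delta)) / (2.0 * rho * delta)
    return 2.0 * (math.log(x) - 0.5 * x + 0.5)

# Stronger association (larger delta) gives a larger negative contribution.
for delta in (0.1, 1.0, 10.0):
    print(delta, association_helmholtz_2B(rho=0.01, delta=delta))
```

In a complete SAFT equation of state this term is added to the ideal-gas, segment (for example hard-sphere or Lennard-Jones) and chain contributions mentioned above.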
Development
SAFT evolved from thermodynamic theories, including perturbation theories developed in the 1960s, 1970s, and 1980s by John Barker and Douglas Henderson, Keith Gubbins and Chris Gray, and, in particular, Michael Wertheim's first-order, thermodynamic perturbation theory (TPT1) outlined in a series of papers in the 1980s.
The SAFT equation of state was developed using statistical mechanical methods (in particular the perturbation theory of Wertheim) to describe the interactions between molecules in a system. The idea of a SAFT equation of state was first proposed by Chapman and by Chapman et al. in 1988 and 1989. Many different versions of the SAFT models have been proposed, but all use the sam |
https://en.wikipedia.org/wiki/Slip%20bands%20in%20metals | Slip bands or stretcher-strain marks are localized bands of plastic deformation in metals experiencing stresses. Formation of slip bands indicates a concentrated unidirectional slip on certain planes causing a stress concentration. Typically, slip bands induce surface steps (e.g., roughness due to persistent slip bands during fatigue) and a stress concentration which can be a crack nucleation site. Slip bands extend until impinged by a boundary, and the stress generated by the dislocation pile-up against that boundary will either stop or transmit the operating slip depending on its (mis)orientation.
Formation of slip bands under cyclic conditions is referred to as persistent slip bands (PSBs), whereas formation under monotonic conditions is referred to as dislocation planar arrays (or simply slip bands; see the Slip bands in the absence of cyclic loading section). Slip bands can be simply viewed as boundary sliding due to dislocation glide that lacks the complexity of the high plastic deformation localisation of PSBs, which is manifested by tongue- and ribbon-like extrusion. Whereas PSBs are normally studied with the (effective) Burgers vector aligned with the extrusion plane, because a PSB extends across the grain and intensifies during fatigue, a monotonic slip band has one Burgers vector for propagation and another for plane extrusions, both controlled by the conditions at the tip.
Persistent slip bands (PSBs)
Persistent slip bands (PSBs) are associated with strain localisation due to fatigue in metals and cracking on the same plane. Transmission electron microscopy (TEM) and three-dimensional discrete dislocation dynamics (DDD) simulations have been used to reveal and understand dislocation types and arrangements/patterns and relate them to the sub-surface structure. The PSB ladder structure is formed mainly from low-density channels of mobile gliding screw dislocation segments and high-density walls of dipolar edge dislocation segments piled up with tangled bowing-out edge segments and different sizes of d |
https://en.wikipedia.org/wiki/Tokenomics | Tokenomics, also known as token economics, is an emerging field concerned with the economic properties of agent-driven systems that use cryptographic tokens that are typically created and managed on blockchain-based distributed ledgers.
Key areas of interest include determining the value properties of the tokens themselves, and how the properties of tokens (together with other cryptographically secured rules and associated system actions) affect broader economic characteristics of the system including:
How they provision and distribute scarce resources
How that system interacts with other external economic processes
How economic agents behave
The economic efficiency of all these processes
The field often has a strong applied focus, concerning itself with how to use its insights and principles to engineer economic systems to possess specific, desired properties.
Both cryptocurrencies and tokens are subclasses of digital assets that use the technology of cryptography. Crypto is the native currency of a blockchain, and it is developed directly by the blockchain protocol.
Tokens can be created as native elements of a blockchain protocol, or by using a smart contract that is deployed on a blockchain which will host the new token. For example, Ether (ETH) is the native crypto asset of the Ethereum blockchain, and was created by the core Ethereum developer team to incentivise proper maintenance of the blockchain. Axie Infinity Shards (AXS) tokens, by contrast, were created using an Ethereum smart contract developed by an unaffiliated third party, in order to give token holders certain governance rights over the game Axie Infinity.
In both cases, different tokenomic attributes are chosen to support the token's intended role, with particular attention typically paid to the token's ability to function as an incentive mechanism and to choosing a monetary policy that brings token supply into line with its demand. This includes specifying rules about how and when new token sh |
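To make the idea of an explicit supply rule concrete, the sketch below is a purely illustrative emission schedule with periodic halvings; it is not the policy of any actual token:

```python
def token_supply(years, initial_emission=1_000_000, halving_every=4):
    """Cumulative token supply under a simple halving emission schedule.

    Illustrates how a tokenomic 'monetary policy' can fix supply over time:
    a constant number of new tokens is minted each year, and the emission
    rate is cut in half every `halving_every` years.
    """
    supply, emission = 0.0, float(initial_emission)
    schedule = []
    for year in range(1, years + 1):
        supply += emission
        schedule.append((year, emission, supply))
        if year % halving_every == 0:
            emission /= 2
    return schedule

for year, minted, total in token_supply(12):
    print(f"year {year:2d}: minted {minted:>12,.0f}  total supply {total:>14,.0f}")
```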
https://en.wikipedia.org/wiki/Temporal%20plasticity | Temporal plasticity, also known as fine-grained environmental adaptation, is a type of phenotypic plasticity that involves the phenotypic change of organisms in response to changes in the environment over time. Animals can respond to short-term environmental changes with physiological (reversible) and behavioral changes; plants, which are sedentary, respond to short-term environmental changes with both physiological and developmental (non-reversible) changes.
Temporal plasticity takes place over a time scale of minutes, days, or seasons, and in environments that are both variable and predictable within the lifespan of an individual. Temporal plasticity is considered adaptive if the phenotypic response results in increased fitness. Non-reversible phenotypic changes can be observed in metameric organisms such as plants that depend on the environmental condition(s) each metamer was developed under. Under some circumstances early exposure to specific stressors can affect how an individual plant is capable of responding to future environmental changes (Metaplasticity).
Reversible plasticity
A reversible change is defined as one that is expressed in response to an environmental stressor but returns to a normal state after the stress is no longer present. Reversible changes are more likely to be adaptive for an organism when the stress driving the change is temporary and the organism is likely to be exposed to it again within its lifetime. Reversible plasticity often involves changes in physiology or behavior. Perennial plants, which often experience recurring stresses in their environment due to lack of mobility, benefit greatly from reversible physiological plasticity such as changes in resource uptake and allocation. When essential nutrients are low, root and leaf resorption rates can increase, persisting at a high rate until there are more nutrients available in the soil and resorption rates can return back to their normal state.
Irreversible plasticity
Irrevers |
https://en.wikipedia.org/wiki/Inverse%20lithography | In semiconductor device fabrication, the inverse lithography technology (ILT) is an approach to photomask design. This is basically an approach to solve an inverse imaging problem: to calculate the shapes of the openings in a photomask ("source") so that the passing light produces a good approximation of the desired pattern ("target") on the illuminated material, typically a photoresist. As such, it is treated as a mathematical optimization problem of a special kind, because usually an analytical solution does not exist. In conventional approaches known as the optical proximity correction (OPC) a "target" shape is augmented with carefully tuned rectangles to produce a "Manhattan shape" for the "source", as shown in the illustration. The ILT approach generates curvilinear shapes for the "source", which deliver better approximations for the "target".
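A toy version of this inverse problem can be set up as a pixel-level optimization: blur the mask with a Gaussian as a stand-in for the projection optics, pass the aerial image through a smooth threshold as a stand-in for the photoresist, and adjust the mask by gradient descent until the simulated print matches the target. The sketch below (NumPy/SciPy) is illustrative only; production ILT uses rigorous optical and resist models and far more sophisticated optimizers:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def sigmoid(x, steepness=20.0, threshold=0.5):
    """Smooth stand-in for the resist threshold."""
    return 1.0 / (1.0 + np.exp(-steepness * (x - threshold)))

def invert_mask(target, sigma=2.0, steps=300, lr=2.0, steepness=20.0):
    """Gradient-descent 'inverse lithography' on a toy imaging model.

    target : 2D array of 0/1 values, the desired printed pattern.
    sigma  : width of the Gaussian blur standing in for the optics.
    Returns a continuous mask in [0, 1] whose simulated print approximates the target.
    """
    mask = target.astype(float).copy()            # start from the target itself
    for _ in range(steps):
        aerial = gaussian_filter(mask, sigma)     # toy optical model
        printed = sigmoid(aerial, steepness)      # toy resist model
        err = printed - target
        # Chain rule for the squared error; the Gaussian blur is self-adjoint,
        # so its transpose is another blur with the same sigma.
        grad = gaussian_filter(2.0 * err * printed * (1.0 - printed) * steepness, sigma)
        mask = np.clip(mask - lr * grad, 0.0, 1.0)
    return mask

# Example: recover a (generally curvilinear) mask for a small rectangular target.
target = np.zeros((64, 64))
target[24:40, 20:44] = 1.0
mask = invert_mask(target)
```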
ILT was proposed in the 1980s; however, at that time it was impractical due to the huge computational power required and the complicated "source" shapes, which presented difficulties for verification (design rule checking) and manufacturing. In the late 2000s, developers started reconsidering ILT due to significant increases in computational power.
References
Lithography (microfabrication)
Inverse problems |
https://en.wikipedia.org/wiki/Calcarisporiellales | Calcarisporiellaceae is a family of fungi within the subkingdom Mucoromycota. It is the only family in the order Calcarisporiellales, class Calcarisporiellomycetes, subphylum Calcarisporiellomycotina and phylum Calcarisporiellomycota. It contains two known genera, Calcarisporiella and Echinochlamydosporium. The two genera each have one species.
General description
They have a thallus that is branched, with septate (divided by septa) hyphae. The vegetative hyphae are hyaline (have a glassy appearance), smooth and thin-walled. Cultures have no distinctive smell. The sporangiophores (receptacles which bear the sporangia, if present) are simple, hyaline, smooth, arising from undifferentiated hyphae. The sporangia are unispored, ellipsoid (in shape), with or without a small columella. Spores are uninucleate (having a single nucleus), hyaline, smooth, thin-walled, ovoid to ellipsoid, with a rounded base. Chlamydospores (if present) are 1-celled, elongate to globose, thick-walled and spiny, and are borne laterally on short hyphae. The sexual cycle is not known, but they are saprotrophic in soil and non-nematophagous (not carnivorous).
It can be found in soils.
History
Calcarisporiella was originally published in 1974 and was at first thought to be an anamorphic member of the Pezizomycotina division, but later phylogenetic analysis of rDNA found that it was separate from the Endogonales and Mucorales clades.
A new genus, Echinochlamydosporium, was described in 2011 and placed in the family Mortierellaceae. Then in 2018, after molecular analyses, Echinochlamydosporium was transferred to the new family Calcarisporiellaceae together with Calcarisporiella.
The newly described Calcarisporiellomycota (comprising Calcarisporiella thermophila and Echinochlamydosporium variabile) represented a deep lineage with strongest affinities to Mucoromycota or Mortierellomycota.
Evolution and systematics
The Calcarisporiellaceae are a monophyletic group containing two species. According to a |
https://en.wikipedia.org/wiki/Competitive%20Carriers%20Association | The Competitive Carriers Association (commonly the CCA) was founded in 1992 by nine small wireless carriers in the United States as a 501(c)(6) non-profit trade association to promote the common interests of competitive, regional, and rural wireless services providers. Its counterpart, particularly for non-regional wireless carriers, is the CTIA.
History
The organization was founded in 1992 as the Rural Carriers Association (RCA), but became the Competitive Carriers Association in 2012 as national carriers Sprint and T-Mobile US joined. It has long advocated for policies and standards that promote greater competition in the wireless industry, particularly with regard to issues around wireless spectrum.
References
External links
Telecommunications organizations
Wireless networking
Trade associations based in the United States
Trade shows in the United States
1992 establishments in the United States
Organizations established in 1992
501(c)(6) nonprofit organizations
Non-profit organizations based in Washington, D.C. |
https://en.wikipedia.org/wiki/Gabriel%20Popescu%20%28scientist%29 | Gabriel Popescu (October 24, 1971 — June 16, 2022) was an American optical engineer, who was the William L. Everitt Distinguished Professor in Electrical and Computer Engineering at University of Illinois Urbana-Champaign. He was best known for his work on biomedical optics and quantitative phase-contrast microscopy.
Biography
Popescu was born on October 24, 1971, in Romania. He obtained bachelor's and master's degrees in physics from the University of Bucharest in 1995 and 1996, respectively. He further obtained a master's degree and a PhD in optics, in 1999 and 2002, respectively, from the School of Optics at the University of Central Florida. In 2002, he joined Massachusetts Institute of Technology as a postdoctoral researcher under the supervision of Michael Stephen Feld.
In 2007, Popescu joined the University of Illinois Urbana-Champaign as an assistant professor and established the Quantitative Light Imaging (QLI) Laboratory. At this institution, he was affiliated with the Electrical and Computer Engineering and Bioengineering departments, as well as with the Beckman Institute for Advanced Science and Technology. In 2009, he obtained American citizenship.
Popescu was an associate editor for Optics Express and Biomedical Optics Express and served as an editorial board member for Journal of Biomedical Optics and Scientific Reports. He was a Fellow of Optica, SPIE and American Institute for Medical and Biological Engineering. He is the author of the textbook Quantitative Phase Imaging of Cells and Tissues and editor of Nanobiophotonics, which were released respectively in 2010 and 2011. He is also the founder of the startup company Phi Optics, which focuses on the commercialization of quantitative phase-contrast microscopy.
On June 16, 2022, Popescu died in his native village of Prundu, after suffering a fatal motorcycle accident. He was survived by his wife Catherine Best-Popescu, a research assistant professor at University of Illinois Urbana-Champaign, an |
https://en.wikipedia.org/wiki/%C3%81lvaro%20R%C3%ADos%20Poveda | Álvaro Ríos Poveda (born 3 February 1974, Cali, Colombia) is a Colombian electronic engineer, university professor, and researcher who specializes in biomedical engineering and mechatronics. He has performed research on myoelectric prostheses, sensory feedback, and bionic vision technologies.
Early life and education
He began his studies at San Juan Berchmans school in Cali, Colombia.
Ríos earned an undergraduate degree in Electronic Engineering at Pontifical Xavierian University and completed his master's at Simon Bolivar University and doctorate studies at USF in Biomedical Engineering. Along with that, he obtained an MBA from ISEAD. His professional career began with research in neural prostheses and bionics systems. Ríos researched biomedical engineering, artificial intelligence, and robotics in control systems.
Career
Ríos is a researcher and university professor in undergraduate and postgraduate studies in Europe and Mexico. From a young age, he had a growing interest in motor limitations.
In 1996, he developed prosthetic systems that allow patients to have greater limb functionality while ensuring accessibility for these systems in developing countries.
In 1997, a sensory feedback system for prostheses was presented at France's World Congress on Biomedical Engineering. Ríos's public work includes a myoelectric prosthesis with sensory feedback, presented at MEC'02: The Next Generation. His work mainly aims to control prosthetics more naturally, utilizing artificial intelligence, neural control, machine learning, and gesture control.
Research
He has remained a member of the Publications Committee of the International Federation of Medical and Biological Engineering and a founding member of the Colombian Association of Biomedical Engineering. Since its inception in 2017, Ríos has participated in every IEEE International Conference on Cyborg and Bionic Systems (CBS), including as an invited guest at IEEE CBS 2017. Later in 2018, he represe |
https://en.wikipedia.org/wiki/Fermi%E2%80%93Dirac%20prime | In number theory, a Fermi–Dirac prime is a prime power whose exponent is a power of two. These numbers are named from an analogy to Fermi–Dirac statistics in physics based on the fact that each integer has a unique representation as a product of Fermi–Dirac primes without repetition. Each element of the sequence of Fermi–Dirac primes is the smallest number that does not divide the product of all previous elements. Srinivasa Ramanujan used the Fermi–Dirac primes to find the smallest number whose number of divisors is a given power of two.
Definition
The Fermi–Dirac primes are a sequence of numbers obtained by raising a prime number to an exponent that is a power of two. That is, these are the numbers of the form p^(2^k), where p is a prime number and k is a non-negative integer. These numbers form the sequence: 2, 3, 4, 5, 7, 9, 11, 13, 16, 17, 19, 23, 25, 29, ...
They can be obtained from the prime numbers by repeated squaring, and form the smallest set of numbers that includes all of the prime numbers and is closed under squaring.
Another way of defining this sequence is that each element is the smallest positive integer that does not divide the product of all of the previous elements of the sequence.
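Both characterizations are easy to check computationally. The short sketch below (Python, illustrative) generates the sequence from the "smallest non-divisor of the product so far" rule and verifies that every term has the form p^(2^k):

```python
def fermi_dirac_primes(count):
    """Generate Fermi-Dirac primes: each term is the smallest integer greater
    than 1 that does not divide the product of all previous terms."""
    terms, product, n = [], 1, 2
    while len(terms) < count:
        if product % n != 0:
            terms.append(n)
            product *= n
        n += 1
    return terms

def is_prime_power_with_pow2_exponent(n):
    """Check whether n == p**(2**k) for some prime p and integer k >= 0."""
    for p in range(2, int(n ** 0.5) + 1):
        if n % p == 0:
            e = 0
            while n % p == 0:
                n //= p
                e += 1
            return n == 1 and (e & (e - 1)) == 0   # e must be a power of two
    return True                                    # n itself is prime (exponent 1)

seq = fermi_dirac_primes(13)
print(seq)                                               # [2, 3, 4, 5, 7, 9, 11, 13, 16, 17, 19, 23, 25]
print(all(map(is_prime_power_with_pow2_exponent, seq)))  # True
```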
Factorization
Analogously to the way that every positive integer has a unique factorization, its representation as a product of prime numbers (with some of these numbers repeated), every positive integer also has a unique factorization as a product of Fermi–Dirac primes, with no repetitions allowed. For example, 2000 = 5 × 16 × 25, the product of the three distinct Fermi–Dirac primes 5, 2^4 = 16 and 5^2 = 25.
The Fermi–Dirac primes are named from an analogy to particle physics. In physics, bosons are particles that obey Bose–Einstein statistics, in which it is allowed for multiple particles to be in the same state at the same time. Fermions are particles that obey Fermi–Dirac statistics, which only allow a single particle in each state. Similarly, for the usual prime numbers, multiple copies of the same prime number can appear in the same prime factorization, but factorizations into a product of Fermi–Dirac primes only a |
https://en.wikipedia.org/wiki/List%20of%20Accipitriformes%20by%20population | This is a list of Accipitriformes species by global population. While numbers are estimates, they have been made by the experts in their fields. For more information on how these estimates were ascertained, see Wikipedia's articles on population biology and population ecology. For the sake of the argument and their small species diversity, the family Cathartidae will be included in this list.
This list is not comprehensive, as not all Accipitriformes have had their numbers quantified.
Species by global population
See also
Lists of birds by population
Lists of organisms by population
References
Birds
Accipitriformes |
https://en.wikipedia.org/wiki/Right%20To%20Know | Right To Know is a non-profit support project for those who discover via genealogical genetic testing that their lineage is not what they had supposed it to be due to family secrets and misattributed parentage, thus raising existential issues of adoption, race, ethnicity, culture, rape, etc.
See also
Genealogy
Genetic testing
External links
Right To Know - Your Genetic Identity
References
Organizations established in 2022
2022 establishments in the United States
Genetics |
https://en.wikipedia.org/wiki/IJP%20The%20Book%20of%20Surfaces | IJP the book of surfaces is a book by George L. Legendre, with a foreword by Mohsen Mostafavi.
Overview
IJP the Book of Surfaces was released in 2003 by the publishing arm of the London-based Architectural Association School of Architecture. The book features six essays on the notion of surface written from an architectural, philosophical, literary, mathematical, and computational angle, as well as several lighter asides ranging from cookery to poetry. These threads have been given a particular typographic and graphic design treatment meant to weave them together into a continuous narrative.
Background and literary references
The book addresses some significant developments of the decade, such as the explosion of computational tools; the emergence of the 3D surface as an architectural signifier of the Digital Revolution; the profession's fascination with the formal possibilities of surface cladding; and the rise of innovative manufacturing technologies. It can be compared to contemporary titles like Mohsen Mostafavi’s and David Leatherbarrow’s Surface Architecture, an essay on the phenomenology of architectural façades, and Ellen Lupton’s collection Skin: Surface, Substance + Design, which explores the working metaphor of artificial skin in Materials science, fashion and the visual arts. By comparison, Illa Berman notes that IJP the Book of Surfaces withdraws from external cultural currents and their contexts and emerges from within the formal and computational specificity of the surface itself. As a piece of writing, it is indebted to the literary school Oulipo. Its treatment of one theme as a collection of vignettes written in different voices (linguistic, mathematical, computational, mock-literary, and pop-cultural) nods back to Raymond Queneau’s 1947 Exercises in Style, in which the same trivial event is told and re-told in different idioms.
Form and content
In keeping with the literary/mathematical spirit of Oulipo, layout, typography, and pagination form |
https://en.wikipedia.org/wiki/Habitability%20of%20neutron%20star%20systems | The habitability of neutron star systems means assessing and surveying whether life is possible on planets and moons orbiting a neutron star.
A habitable planet orbiting a neutron star must be between one and 10 times the mass of the Earth. If the planet were lighter, its atmosphere would be lost. Its atmosphere must also be thick enough to convert the intense X-ray radiation emanating from the parent star into heat on its surface. Then it could have the temperature suitable for life.
A sufficiently strong magnetic field (a magnetosphere) would protect the planet from the strong stellar wind. This could preserve the planet's atmosphere for several billion years. Such a planet could have liquid water on its surface.
A Dutch research team published an article on the subject in the journal Astronomy & Astrophysics in December 2017.
See also
Habitability of red dwarf systems
Habitability of K-type main-sequence star systems
Habitability of natural satellites
Dragon's Egg and its sequel Starquake, novels by Robert L. Forward, about life on a neutron star itself.
References
Planetary habitability
Neutron stars
Astrobiology |
https://en.wikipedia.org/wiki/Oper%20%28mathematics%29 | In mathematics, an Oper is a principal connection, or in more elementary terms a type of differential operator. They were first defined and used by Vladimir Drinfeld and Vladimir Sokolov to study how the KdV equation and related integrable PDEs correspond to algebraic structures known as Kac–Moody algebras. Their modern formulation is due to Drinfeld and Alexander Beilinson.
History
Opers were first defined, although not named, in a 1981 Russian paper by Drinfeld and Sokolov on Equations of Korteweg–de Vries type, and simple Lie algebras. They were later generalized by Drinfeld and Beilinson in 1993, later published as an e-print in 2005.
Formulation
Abstract
Let G be a connected reductive group over the complex numbers C, with a distinguished Borel subgroup B ⊂ G. Set N = [B, B], so that H = B/N is the Cartan group.
Denote by g and b the corresponding Lie algebras. There is an open B-orbit O consisting of vectors stabilized by the radical N ⊂ B such that all of their negative simple-root components are non-zero.
Let X be a smooth curve.
A G-oper on X is a triple (F, ∇, F_B), where F is a principal G-bundle on X, ∇ is a connection on F and F_B is a B-reduction of F, such that the associated one-form takes values in O.
Example
Fix X = P^1, the Riemann sphere. Working at the level of the algebras, fix g = sl(2, C), which can be identified with the space of traceless 2 × 2 complex matrices. Since P^1 has only one (complex) dimension, a one-form has only one component, and so an sl(2, C)-valued one-form is locally described by a matrix of functions [[a, b], [c, −a]],
where a, b and c are allowed to be meromorphic functions.
Denote by the space of valued meromorphic functions together with an action by , meromorphic functions valued in the associated Lie group . The action is by a formal gauge transformation:
Then opers are defined in terms of a subspace of these connections. Denote by
the space of connections with . Denote by the subgroup of meromorphic functions valued in of the form
with meromorphic.
Then for it holds that . It therefore defines an action. The orbits of this action |
https://en.wikipedia.org/wiki/Snowflake%20%28software%29 | Snowflake is a software package for assisting others in circumventing internet censorship by relaying data requests. Snowflake relay nodes are meant to be created by people in countries where Tor and Snowflake are not blocked. People under censorship then use a Snowflake client, packaged with the Tor Browser or Onion Browser, to access the Tor network, using Snowflake relays as proxy servers. Access to the Tor network can in turn give access to other blocked services (like blocked websites). A Snowflake node can be created by either installing a browser extension, installing a stand-alone program, or browsing a webpage with an embedded Snowflake relay. The node runs whenever the browser or program is connected to the internet.
Tor relays content requests through a chain of Tor nodes, including Snowflake nodes (onion routing). Each node in the chain only knows the addresses of the two adjacent links and cannot decrypt any of the other data it is relaying, which makes tracking or blocking the traffic much more difficult. A common countermeasure is blocking Tor nodes; the number and shifting nature of the Snowflake nodes make identifying and blocking connections to these nodes more difficult.
Tor is itself illegal in some countries. Like the internet, it can relay any sort of content, and some types of content are illegal in some countries.
History
Snowflake was originated by Serene, a hacker and former Google engineer and concert pianist. The name "Snowflake" was coined as her metaphor for a large number of ephemeral proxies in relation to "ICE Negotiation". Three programmers published the first version in January 2016. In 2019, it became available as a browser extension for Firefox and Chrome. It can also be run on derived browsers, such as Brave and Microsoft Edge. In February 2023 a thoroughly upgraded, stand-alone version dubbed Snowstorm was released; written in Rust and funded by the Open Tech Fund, beta testing is by invitation.
Function
Normal internet dat |
https://en.wikipedia.org/wiki/Competence%20factor | The ability of a cell to successfully incorporate exogenous DNA, or competency, is determined by competence factors. These factors consist of certain cell surface proteins and transcription factors that induce the uptake of DNA.
Natural competence is the ability of a cell to bind to and transport extracellular DNA through the membrane and recombine foreign genes into its own DNA through a process called transformation. Horizontal gene transfer is a result of this, where bacterial genes can be transferred amongst same-generation species in a given environment, and competence is the ability of a cell to participate in the transfer. If one cell in a population living in an unfavorable environment has a mutation that results in better survivability, that gene can be passed on to other competent cells to extend the same advantage. Plasmids, commonly used in genetic manipulation, can also be shared through horizontal gene transfer, which is especially relevant in modern medicine concerning the exchange of antibiotic-resistant plasmids.
A cell's competence can be determined by its genetics, which is the case for natural competency, or it can be manipulated in order to achieve artificial competence.
There are two types of competence-inducing pheromones: ComX and CSF. ComX is a ten amino acid oligopeptide; it requires two co-components, ComP, a histidine kinase, and ComA, a cytoplasmic response regulator. ComX binds to ComP on the outside of the inner membrane; ComP autophosphorylates and the phosphoryl group is transferred to ComA. This activates transcription of genes in the competence pathway. CSF is a five amino acid oligopeptide and is exported via the GSP pathway. CSF enters the cell through an oligopeptide permease and stimulates the competence pathway at low concentrations (1-5 nM); at high concentrations (>20 nM) competence is inhibited and sporulation is stimulated.
Types of competence
Natural
Most bacteria found in nature are said to be naturally |
https://en.wikipedia.org/wiki/Fast%20endophilin-mediated%20endocytosis | Fast endophilin-mediated endocytosis (FEME) is an endocytic pathway found in eukaryotic cells. It requires the activity of endophilins as well as dynamins, but does not require clathrin.
In Clathrin-dependent endocytic pathways, endosomes budding from the cell membrane into the cell will form in clathrin pits, and be coated by clathrin triskelions. In FEME however, endosomes form when coated by actin, and internalise endophilin A2.
Function
Each endocytic pathway focuses on a particular component, and FEME is primarily involved in transporting receptors. These include receptors for acetylcholine and IL-2.
Associated proteins
EGFR
HGFR
VEGFR
PDGFR
NGFR
IGF1R
SHIP1
SHIP2
References
Molecular biology
Cellular processes |
https://en.wikipedia.org/wiki/Glossary%20of%20mycology | This glossary of mycology is a list of definitions of terms and concepts relevant to mycology, the study of fungi. Terms in common with other fields, if repeated here, generally focus on their mycology-specific meaning. Related terms can be found in glossary of biology and glossary of botany, among others. List of Latin and Greek words commonly used in systematic names and Botanical Latin may also be relevant, although some prefixes and suffixes very common in mycology are repeated here for clarity.
A
B
C
D
E
F
G
H
I
J
K
L
M
N
O
P
R
S
T
U
V
W
X
Y
Z
See also
List of mycologists
Outline of fungi
Outline of lichens
Glossary of lichen terms
References
Bibliography
Mycology
mycology
Wikipedia glossaries using description lists |
https://en.wikipedia.org/wiki/Text-to-video%20model | A text-to-video model is a machine learning model which takes as input a natural language description and produces a video matching that description.
Video prediction that keeps objects realistic against a stable background can be performed using a recurrent neural network as a sequence-to-sequence model, with a connected convolutional neural network encoding and decoding each frame pixel by pixel, so that the video is created using deep learning.
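As a rough illustration of that kind of architecture, the toy sketch below (PyTorch) wraps a convolutional frame encoder and decoder around a recurrent sequence model conditioned on a text embedding. Every layer size and name here is invented for illustration; real systems are far larger and, today, mostly diffusion-based:

```python
import torch
import torch.nn as nn

class TinyTextToVideo(nn.Module):
    """Toy text-conditioned frame-sequence model: CNN frame encoder/decoder
    around a GRU, conditioned on a bag-of-words text embedding. Illustrative only."""

    def __init__(self, vocab_size=1000, text_dim=64, hidden=128, frame=32):
        super().__init__()
        self.frame = frame
        self.text_embed = nn.EmbeddingBag(vocab_size, text_dim)           # crude text encoder
        self.frame_enc = nn.Sequential(                                   # CNN frame encoder
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Flatten(), nn.Linear(32 * (frame // 4) ** 2, hidden))
        self.rnn = nn.GRU(hidden + text_dim, hidden, batch_first=True)    # temporal model
        self.frame_dec = nn.Sequential(                                   # CNN frame decoder
            nn.Linear(hidden, 32 * (frame // 4) ** 2), nn.ReLU(),
            nn.Unflatten(1, (32, frame // 4, frame // 4)),
            nn.ConvTranspose2d(32, 16, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(16, 3, 4, stride=2, padding=1), nn.Sigmoid())

    def forward(self, tokens, frames):
        # tokens: (batch, words); frames: (batch, time, 3, H, W) previously seen frames
        b, t = frames.shape[:2]
        text = self.text_embed(tokens)                                     # (b, text_dim)
        enc = self.frame_enc(frames.flatten(0, 1)).view(b, t, -1)          # per-frame features
        rnn_in = torch.cat([enc, text.unsqueeze(1).expand(-1, t, -1)], dim=-1)
        h, _ = self.rnn(rnn_in)                                            # (b, t, hidden)
        return self.frame_dec(h.flatten(0, 1)).view(b, t, 3, self.frame, self.frame)

model = TinyTextToVideo()
video = model(torch.randint(0, 1000, (2, 8)), torch.rand(2, 5, 3, 32, 32))  # (2, 5, 3, 32, 32)
```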
Methodology
Data collection and data set preparation using clear video from kinetic human action video.
Training the convolutional neural network for making video.
Keyword extraction from text using natural language processing.
Testing of the data set in a conditional generative model that captures the static and dynamic information in the text, using a variational autoencoder and a generative adversarial network.
Models
There are different models, including open-source models. CogVideo published its code on GitHub. Meta Platforms uses text-to-video with makeavideo.studio. Google used Imagen Video for converting text to video.
Antonia Antonova presented another model.
In March 2023, a landmark research paper by Alibaba research was published, applying many of the principles found in latent image diffusion models to video generation. Services like Kaiber or Reemix have since adopted similar approaches to video generation in their respective products.
Matthias Niessner (TUM) and Lourdes Agapito (UCL) at AI company Synthesia work on developing 3D neural rendering techniques that synthesise realistic video. The goal is to improve existing text-to-video models with 2D and 3D neural representations of shape, appearance and motion for controllable video synthesis of avatars that look and sound like real people.
Although alternative approaches exist, full latent diffusion models are currently regarded as the state of the art for video diffusion.
References
Artificial intelligence
Algorithms
Language
Computers |
https://en.wikipedia.org/wiki/Global%20coordination%20level | Global coordination level (GCL) is a computational method that evaluates the system-wide dependency in multivariate data, by calculating the distance correlation between random subsets of the variables. Originally applied to gene expression data, GCL assesses the level of coordination between genes, which are fundamentally linked in performing tasks and biological functions. Unlike traditional methods that require precise knowledge of pairwise interactions between genes, GCL can evaluate coordination without such information. The GCL value of zero signifies independent gene expression, while values above zero indicate gene-to-gene regulatory interactions. For instance, when GCL is applied to known genetic pathways in the Kyoto Encyclopedia of Genes and Genomes (KEGG) database, it yields significantly positive values, while random subsets of genes or mock pathways with similar gene expression levels show very low GCL values. Additionally, GCL can be useful in analyzing high-dimensional ecological and biochemical dynamics.
Introduction
Genes interact with each other in a complex structure known as the gene regulatory network, which plays a crucial role in implementing various biological functions and performing different tasks within cells. However, inferring the precise pairwise interactions of the gene regulatory network remains challenging due to the large number of functional genes and the inherent stochasticity of these systems. Despite these challenges, certain features of the gene regulatory network can still be extracted without fully inferring all the interactions. For instance, the network connectivity, which refers to the density of actual gene-gene interactions compared to all possible interactions, may have important implications for general cellular processes.
Method description
The calculation of the GCL is based on multivariate dependencies among genes in a given cohort of cells. This involves a repeated procedure of randomly |
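In simplified form, the core computation can be sketched as follows (Python/NumPy): the gene set is repeatedly split into two random halves, the distance correlation between the two halves is computed across cells, and the results are averaged. This is an illustrative reduction; the published method includes normalization and bias-correction details not reproduced here:

```python
import numpy as np

def distance_correlation(X, Y):
    """Empirical (biased, V-statistic) distance correlation between samples
    X of shape (n, p) and Y of shape (n, q)."""
    def double_centered_distances(A):
        d = np.sqrt(((A[:, None, :] - A[None, :, :]) ** 2).sum(-1))
        return d - d.mean(axis=0, keepdims=True) - d.mean(axis=1, keepdims=True) + d.mean()
    a, b = double_centered_distances(X), double_centered_distances(Y)
    dcov2 = (a * b).mean()
    dvar = (a * a).mean() * (b * b).mean()
    return float(np.sqrt(dcov2 / np.sqrt(dvar))) if dvar > 0 else 0.0

def gcl(expression, n_splits=100, seed=0):
    """expression: (cells, genes) matrix. Average distance correlation over
    random bipartitions of the genes, as a simplified stand-in for the GCL."""
    rng = np.random.default_rng(seed)
    n_genes = expression.shape[1]
    half = n_genes // 2
    vals = []
    for _ in range(n_splits):
        perm = rng.permutation(n_genes)
        vals.append(distance_correlation(expression[:, perm[:half]],
                                         expression[:, perm[half:2 * half]]))
    return float(np.mean(vals))
```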
https://en.wikipedia.org/wiki/Roberto%20Markarian | Roberto Markarian Abrahamian (born 12 December 1946) is a Uruguayan mathematician of Armenian descent, an expert in dynamical systems and chaos theory.
Biography
He started studying at the University of the Republic in the 1960s. During the civic-military dictatorship he was arrested due to political reasons. Later he went to Brazil, where he graduated from the Federal University of Rio Grande do Sul. Later on, his degree was validated in Uruguay.
Markarian served as rector of the University of the Republic (2014-2018).
He is the brother of the football coach Sergio Markarian.
References
1946 births
Living people
Uruguayan people of Armenian descent
Federal University of Rio Grande do Sul alumni
Academic staff of the University of the Republic (Uruguay)
University of the Republic (Uruguay) rectors
Uruguayan mathematicians
Dynamical systems theorists
Chaos theorists |
https://en.wikipedia.org/wiki/Algal%20virus | Algal viruses are the viruses infecting photosynthetic single-celled eukaryotes, algae. As of 2020, there were 61 viruses known to infect algae. Algae are integral components of aquatic food webs and drive nutrient cycling, so the viruses infecting algal populations also impact the organisms and nutrient cycling systems that depend on them. Thus, these viruses can have significant, worldwide economic and ecological effects. Their genomes vary between 4.4 and 560 kilobase pairs (kbp) in length and use double-stranded Deoxyribonucleic Acid (dsDNA), double-stranded Ribonucleic Acid (dsRNA), single-stranded Deoxyribonucleic Acid (ssDNA), and single-stranded Ribonucleic Acid (ssRNA). The viruses range between 20 and 210 nm in diameter. Since the discovery of the first algae-infecting virus in 1979, several different techniques have been used to find new viruses infecting algae, and it seems that there are many algae-infecting viruses left to be discovered.
DNA viruses
The viruses that store their genomic information using DNA, DNA viruses, are the best studied subgrouping of algae-infecting viruses. This is especially true for the dsDNA virus family, Phycodnaviridae. However, other groups of dsDNA viruses including giant viruses belonging to the family Mimiviridae also infect algae. A recent survey of 65 algal genomes revealed that some viruses belonging to the Nucleocytoplasmic Large DNA Viruses (NCLDV), the larger viral group containing both the Phycodnaviridae and Mimiviridae viral families, had integrated themselves into 24 of the host’s genomes. Recently, ssDNA viruses, like the group of diatom-infecting viruses Bacilladnaviridae, have been discovered.
RNA viruses
RNA viruses also attack algal hosts. There are dsRNA viruses like those belonging to the Reoviridae family that infect Micromonas pusilla and ssRNA viruses like those belonging to the genus Bacillarnavirus that infect the diatom Chaetoceros tenuissimus. Incorporation of RNA virus genes into algal genome |
https://en.wikipedia.org/wiki/Vertically%20Generalized%20Production%20Model | The Vertically Generalized Production Model (VGPM) is a model commonly used to estimate primary production within the ocean. The VGPM was designed by Behrenfeld and Falkowski and was originally published in a 1997 article in Limnology and Oceanography. It is one of the most frequently used models for primary production estimation due to its ability to be applied to chlorophyll a data from satellites, and its relatively simple design. Chlorophyll a is a common measure of primary production, as it is a main component of photosynthesis.
Primary production is often measured using three variables: the biomass (or amount in weight) of the phytoplankton, the availability of light, and the rate of carbon fixation. The VGPM is now one of the most popular models for satellite chlorophyll data because it is surface-light dependent and uses an estimated maximum rate of primary production per unit of chlorophyll throughout the water column, known as PBopt. It also considers environmental factors that often influence primary production, and it allows primary production to be derived from variables collected by remote satellites without having to physically sample the water. This PBopt was found to be dependent on surface chlorophyll, and data for this can be collected using satellites. Satellites can only collect the parameters used to estimate primary production; they cannot calculate it themselves, which is why a model is needed to do so.
Because it is a generalized model, it is intended to reflect the open ocean most accurately. Other localized areas, especially coastal regions, may need to incorporate additional factors to get the most accurate representation of primary production. The values produced using the VGPM are estimates, and there will be some level of uncertainty in using this model.
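For reference, the depth-integrated VGPM relation as it is commonly cited can be written as a single function (Python, illustrative). Here PBopt and the euphotic depth are passed in directly, whereas the full model derives them from satellite-observable surface fields:

```python
def vgpm_npp(chl_surface, pb_opt, e0, z_eu, day_length):
    """Depth-integrated net primary production (mg C m^-2 day^-1) from the
    commonly cited VGPM relation:
        NPP = 0.66125 * PBopt * [E0 / (E0 + 4.1)] * Chl_surface * Z_eu * D_irr

    chl_surface : surface chlorophyll a concentration (mg m^-3)
    pb_opt      : maximum chlorophyll-normalized carbon fixation rate
                  (mg C (mg Chl)^-1 h^-1)
    e0          : daily surface PAR (mol quanta m^-2 day^-1)
    z_eu        : euphotic zone depth (m)
    day_length  : photoperiod D_irr (hours)
    """
    light_limitation = e0 / (e0 + 4.1)   # saturating dependence on surface irradiance
    return 0.66125 * pb_opt * light_limitation * chl_surface * z_eu * day_length

# Example: moderately productive open-ocean pixel (illustrative numbers only)
print(vgpm_npp(chl_surface=0.3, pb_opt=4.0, e0=40.0, z_eu=60.0, day_length=12.0))
```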
References
Ecology
Oceanography |
https://en.wikipedia.org/wiki/Beta-model | In model theory, a mathematical discipline, a β-model (from the French "bon ordre", well-ordering) is a model which is correct about statements of the form "X is well-ordered". The term was introduced by Mostowski (1959) as a strengthening of the notion of ω-model. In contrast to the notation for set-theoretic properties named by ordinals, such as -indescribability, the letter β here is only denotational.
In set theory
It is a consequence of Shoenfield's absoluteness theorem that the constructible universe L is a β-model.
In analysis
β-models appear in the study of the reverse mathematics of subsystems of second-order arithmetic. In this context, a β-model of a subsystem of second-order arithmetic is a model M where, for any Σ11 formula φ with parameters from M, φ holds in M if and only if φ is true (p. 243). Every β-model of second-order arithmetic is also an ω-model, since working within the model we can prove that < is a well-ordering, so < really is a well-ordering of the natural numbers of the model.
There is an incompleteness theorem for β-models: if T is a recursively axiomatizable theory in the language of second-order arithmetic then, analogously to how there is a model of T+"there is no model of T" if there is a model of T, there is a β-model of T+"there are no countable coded β-models of T" if there is a β-model of T. A similar theorem holds for βn-models for any natural number n.
Axioms based on β-models provide a natural finer division of the strengths of subsystems of second-order arithmetic, and also provide a way to formulate reflection principles. For example, over , is equivalent to the statement "for all [of second-order sort], there exists a countable β-model M such that .p.253 (Countable ω-models are represented by their sets of integers, and their satisfaction is formalizable in the language of analysis by an inductive definition.) Also, the theory extending KP with a canonical axiom schema for a recursively Mahlo universe (often called ) is logically equivalent to the theory Δ-CA+ |
https://en.wikipedia.org/wiki/Fuzzy%20differential%20inclusion | Fuzzy differential inclusion is the culmination of the fuzzy concept introduced by Lotfi A. Zadeh and the differential inclusion, and it has become popular.
x′(t) ∈ [f(t, x(t))]^α with x(0) ∈ [x₀]^α,
where f(t, x(t)) is a fuzzy-valued continuous function on Euclidean space, taking values in the collection of all normal, upper semi-continuous, convex, compactly supported fuzzy subsets of Rⁿ.
Second order differential
The second order differential is
where
K is trapezoidal fuzzy number (-1,-1/2,0,1/2)
is a triangular fuzzy number (-1, 0, 1).
Applications
Fuzzy differential inclusion (FDI) has applications in
Cybernetics
Artificial intelligence , Neural network,
Medical imaging
Robotics
Atmospheric dispersion modeling
Weather forecasting
Cyclone
Population biology
Stochastic process , Probability theory
References
Dynamical systems
Variational analysis
Fuzzy logic |
https://en.wikipedia.org/wiki/Transmission%20congestion | Electricity transmission congestion is a condition of the electrical grid that prevents the accepted or forecasted load schedules from being implemented due to the grid configuration and equipment performance limitations. In simple terms, congestion occurs when overloaded transmission lines are unable to carry additional electricity flow due to the risk of overheating and the transmission system operator (TSO) has to direct the providers to adjust their dispatch levels to accommodate the constraint or in an electricity market a power plant can produce electricity at a competitive price but cannot transmit the power to a willing buyer. Congestion increases the electricity prices for some customers.
Definitions
There is no universally accepted definition of the transmission congestion. Congestion is not an event, so it is frequently not possible to pinpoint its place and time (in this respect it is similar to traffic congestion). Regulators define congestion as a condition that prevents market transactions from being completed, while a transmission system operator sees it as inability to maintain the security of the power system operation with the power flow scheduled for the grid.
A congestion is a symptom of a constraint or a combination of constraints in a transmission system, usually the limits on physical electricity flow are used to prevent the overheating, unacceptable voltage levels, and loss of system stability. Congestion can be permanent, an effect of the system configuration, or temporary, due to a fault in the transmission equipment.
Congestion management
Avoiding the congestion is essential for a competitive electricity market and is "one of the toughest problems" of its design. The goal is to ensure that a power flow as defined by the wholesale market result does not violate the constraints during the normal operation of the grid and in the case of failure of any one particular component (the so-called n-1 criterion).
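A minimal illustration of such a feasibility check is sketched below (Python, illustrative only): scheduled line flows are screened against thermal limits in the normal state and after every single-element outage, and any violation marks the schedule as congested:

```python
def n_minus_1_screen(base_flows, outage_flows, limits):
    """Toy n-1 congestion screen.

    base_flows   : dict line -> MW flow in the normal (intact) state
    outage_flows : dict outage -> dict line -> MW flow after that single outage
    limits       : dict line -> thermal limit in MW

    Returns the set of (state, line) pairs where a limit is exceeded; a
    non-empty result means the schedule is congested and must be adjusted.
    """
    violations = set()
    for line, flow in base_flows.items():
        if abs(flow) > limits[line]:
            violations.add(("base case", line))
    for outage, flows in outage_flows.items():
        for line, flow in flows.items():
            if abs(flow) > limits[line]:
                violations.add((f"outage of {outage}", line))
    return violations

# Example: the schedule is fine normally, but overloads line B if line A trips.
print(n_minus_1_screen(
    base_flows={"A": 80, "B": 60},
    outage_flows={"A": {"B": 130}},
    limits={"A": 100, "B": 100}))
```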
The existing markets use a range o |
https://en.wikipedia.org/wiki/Amphibia%20%28taxon%29 | There are several taxa named Amphibia. These include:
Amphibia (class), classis Amphibia, the amphibians
Species
Species with the specific epithet 'amphibia'
Rorippa amphibia (R. amphibia), a plant
Persicaria amphibia (P. amphibia), a plant
Neritina amphibia (N. amphibia), a snail
Aranea amphibia (A. amphibia), a spider
See also
Amphibian (disambiguation)
Amphibia (disambiguation) |
https://en.wikipedia.org/wiki/The%20Unknown%20%28hypertext%20novel%29 | The Unknown (also known as The Unknown: The Original Great American Hypertext Novel ) is a web-based hypertext novel written by William Gillespie, Scott Rettberg and Dirk Stratton with Frank Marquardt. It won the 1999 Trace/Alt-X International Hypertext Contest. The name The Unknown was used to refer to both the work and its authors.
Plot
The Unknown is a sprawling hypertext novel about a fictional book tour the four authors are on to promote the Unknown Anthology.
Kristin Krauth describes it as "a satire on publishing and promotion as well as a tough and funny look at the nature of creating hypertext". Brad Quinn describes the plot as "an adventure novel about a book tour for a book that doesn't exist, and it has all kinds of ridiculous behavior, drug abuse and famous people who would probably be shocked and none too happy to find out that they are in the novel."
Performances
The Unknown was not only a story about a book tour, the authors performed the hypertext at many different events. One of the first performances was at a Brown University event called "Technology Platforms for 21st Century Literature" (TP21CL). A journalist writing about the event for PC Magazine noted that "A group of authors gave a reading of a funny hypertext novel called "The Unknown," which had different tracks you move among." Writing for the MIT Technology Review, Nick Montfort describes the authors performing in suits, and beginning the reading by reading a fictionalised account of travelling to the reading they are actually at. Montfort writes: "The different authors rotate through three roles. One reads, one works the mouse, and one dings a bell to alert the audience to each hypertext link. Audience members interact by calling out when they want to click on a link."
An article in the Los Angeles Times about a reading they did in 2000 quotes Dirk Stratton: "With the traditional reading, you have the silent audience: attentive, rapt, staring up at the genius author waiting for enl |
https://en.wikipedia.org/wiki/Undeadline | is a 1989 vertically scrolling shooter video game developed and originally published by T&E Soft for the MSX2 and MSX2+ home computers. It was later ported to the X68000 computer and Sega Mega Drive, published by Palsoft, followed by digital re-releases for Microsoft Windows. Both the MSX2 and X68000 versions also received physical re-releases by Japanese retailer BEEP. It follows a group of characters on a mission to rescue queen Althea of Zidane, a kingdom surrounded by barriers connected with the demon world, whose monsters have overrun it. Controlling either a fighter, wizard, or ninja, the player can choose from six stages and play them in any order, fighting against waves of enemies and bosses, while defending or avoiding collision with their projectiles and other obstacles.
Undeadline was directed and designed by Tokihiro Naito, who previously worked on Hydlide and Hydlide 3, with Tetsuya "Futaro" Yamamoto serving as main programmer. The soundtrack was composed by Kazunori Hasegawa. Due to T&E Soft liking to push its playtesters to the limits and as their skills improved naturally, it led to the designers increasing the difficulty to keep up with them, particularly paying attention to both enemy movement and spawn patterns. Because of its rarity, original copies of the MSX2 version commands high prices on the secondary game collecting market. The game received generally favorable reception from critics, most of which reviewed it as an import title, although its difficulty has been criticized.
Gameplay
Undeadline is a vertical-scrolling shoot 'em up game with role-playing elements that plays from a top-down perspective. The plot revolves around Zidane, a country surrounded by barriers connected with the demon world, whose monsters have overrun it after the barriers were broken in a previous regime. Queen Althea manages the kingdom, as her father became exhausted from battle and fell ill, but she is kidnapped by a creature from the demon world. Joined by the wizard Dino |
https://en.wikipedia.org/wiki/Observability%20%28software%29 | In distributed systems, observability is the ability to collect data about programs' execution, modules' internal states, and the communication among components. To improve observability, software engineers use a wide range of logging and tracing techniques to gather telemetry information, and tools to analyze and use it. Observability is foundational to site reliability engineering, as it is the first step in triaging a service outage.
One of the goals of observability is to minimize the amount of prior knowledge needed to debug an issue.
Etymology, terminology and definition
The term is borrowed from control theory, where the "observability" of a system measures how well its state can be determined from its outputs. Similarly, software observability measures how well a system's state can be understood from the obtained telemetry (metrics, logs, traces, profiling).
The definition of observability varies by vendor:
The term is frequently referred to by its numeronym O11y (where 11 stands for the number of letters between the first letter and the last letter of the word). This is similar to other computer science abbreviations such as i18n, L10n, and k8s.
Observability vs. monitoring
Observability and monitoring are sometimes used interchangeably. As tooling, commercial offerings and practices evolved in complexity, "monitoring" was re-branded as observability in order to differentiate new tools from the old.
The terms are commonly contrasted in that systems are monitored using predefined sets of telemetry, and monitored systems may be observable.
Majors et al. suggest that engineering teams that only have monitoring tools end up relying on expert foreknowledge (seniority), whereas teams that have observability tools rely on exploratory analysis (curiosity).
Telemetry types
Observability relies on three main types of telemetry data: metrics, logs and traces. Those are often referred to as "pillars of observability".
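The three telemetry types can be illustrated with a toy, vendor-neutral sketch (Python, illustrative only; real systems use dedicated instrumentation libraries and backends). A single instrumented call here emits a structured log line, updates a metric, and records a trace identifier that can correlate the two:

```python
import json, logging, time, uuid

logging.basicConfig(level=logging.INFO, format="%(message)s")
log = logging.getLogger("checkout")
METRICS = {}   # name -> latest value; a real system would export time series

def observe(span_name):
    """Tiny decorator emitting all three telemetry types for one call."""
    def wrap(fn):
        def inner(*args, **kwargs):
            trace_id = uuid.uuid4().hex          # trace: identifies this request
            start = time.time()
            try:
                return fn(*args, **kwargs)
            finally:
                duration_ms = (time.time() - start) * 1000
                METRICS[f"{span_name}.count"] = METRICS.get(f"{span_name}.count", 0) + 1   # metric
                METRICS[f"{span_name}.latency_ms"] = duration_ms                           # metric
                log.info(json.dumps({            # structured log, correlated via trace_id
                    "event": span_name,
                    "trace_id": trace_id,
                    "duration_ms": round(duration_ms, 2)}))
        return inner
    return wrap

@observe("charge_card")
def charge_card(amount):
    time.sleep(0.01)
    return f"charged {amount}"

charge_card(42)
print(METRICS)
```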
Metrics
A metric is a point in tim |
https://en.wikipedia.org/wiki/Extrinsic%20Geometric%20Flows | Extrinsic Geometric Flows is an advanced mathematics textbook that overviews geometric flows, mathematical problems in which a curve or surface moves continuously according to some rule. It focuses on extrinsic flows, in which the rule depends on the embedding of a surface into space, rather than intrinsic flows such as the Ricci flow that depend on the internal geometry of the surface and can be defined without respect to an embedding.
Extrinsic Geometric Flows was written by Ben Andrews, Bennett Chow, Christine Guenther, and Mat Langford, and published in 2020 as volume 206 of Graduate Studies in Mathematics, a book series of the American Mathematical Society.
Topics
The book consists of four chapters, roughly divided into four sections:
Chapters 1 through 4 concern the heat equation and the curve-shortening flow defined from it, in which a curve moves in the Euclidean plane, perpendicularly to itself, at a speed proportional to its curvature. It includes material on curves that remain self-similar as they flow, such as circles and the grim reaper curve, the Gage–Hamilton–Grayson theorem according to which every simple closed curve converges to a circle until eventually collapsing to a point, without ever self-intersecting, and the classification of ancient solutions of the flow.
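A numerical sketch makes this behaviour easy to see (Python/NumPy, illustrative only and not the book's treatment): discretize a closed curve as a polygon and repeatedly move each vertex along the discrete curvature vector, and an ellipse becomes rounder as it shrinks:

```python
import numpy as np

def curve_shortening_step(points, dt):
    """One explicit step of a discrete curve-shortening flow on a closed polygon.
    The second difference of neighbouring vertices, scaled by the squared edge
    length, approximates d^2 p / ds^2, i.e. the curvature vector, so each step
    moves the curve in its normal direction at a speed proportional to curvature."""
    lap = np.roll(points, 1, axis=0) + np.roll(points, -1, axis=0) - 2.0 * points
    edges = np.diff(points, axis=0, append=points[:1])
    h2 = np.mean(np.sum(edges ** 2, axis=1))        # mean squared edge length
    return points + dt * lap / h2

# Example: an ellipse rounds out toward a circle as it shrinks.
theta = np.linspace(0, 2 * np.pi, 200, endpoint=False)
curve = np.stack([2.0 * np.cos(theta), np.sin(theta)], axis=1)
for _ in range(500):
    curve = curve_shortening_step(curve, dt=1e-4)
```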
Chapters 5 through 14 concern the mean curvature flow, a higher dimensional generalization of the curve-shortening flow that uses the mean curvature of a surface to control the speed of its perpendicular motion. After an introductory chapter on the geometry of hypersurfaces, it includes results of Ecker and Huisken concerning "locally Lipschitz entire graphs", and Huisken's theorem that uniformly convex surfaces remain smooth and convex, converging to a sphere, before they collapse to a point. Huisken's monotonicity formula is covered, as are the regularity theorems of Brakke and White according to which the flow is almost-everywhere smooth. Several chapters in this section concern th |
https://en.wikipedia.org/wiki/Dark%20forest%20hypothesis | The dark forest hypothesis is the conjecture that many alien civilizations exist throughout the universe, but they are both silent and hostile, maintaining their undetectability by humanity for fear of being destroyed by another hostile and undetected civilization. In this framing, it is presumed that any space-faring civilization would view any other intelligent life as an inevitable threat, and thus destroy any nascent life that makes itself known. As a result, the electromagnetic spectrum would be relatively quiet, without evidence of any intelligent alien life, as in a "dark forest" filled with "armed hunter(s) stalking through the trees like ghosts".
Background
There is no reliable or reproducible evidence that aliens have visited or attempted to contact Earth. No transmissions and little evidence of intelligent extraterrestrial life has been detected or observed. This runs counter to the knowledge that the universe is filled with a very large number of planets, some of which likely hold conditions hospitable for life. Life typically expands until it fills all available niches. These contradictory facts form the basis for the Fermi paradox, of which the dark forest hypothesis is one proposed solution.
Relationship to other proposed Fermi paradox solutions
The dark forest hypothesis is distinct from the berserker hypothesis in that many alien civilizations would still exist if they kept silent. It can be viewed as a special example of the Berserker hypothesis, if the 'deadly berserker probes' are (due to resource scarcity) only sent to star systems that show signs of intelligent life.
Game theory
The dark forest hypothesis is a special case of the "sequential and incomplete information game" in game theory.
In game theory, a "sequential and incomplete information game" is one in which all players act in sequence, one after the other, and none are aware of all available information. In the case of this particular game, the only win condition is continued sur |
https://en.wikipedia.org/wiki/Berserker%20hypothesis | The berserker hypothesis, also known as the deadly probes scenario, is the idea that humans have not yet detected intelligent alien life in the universe because it has been systematically destroyed by a series of lethal Von Neumann probes. The hypothesis is named after the Berserker series of novels (1963-2005) written by Fred Saberhagen.
The hypothesis has no single known proposer, and instead is thought to have emerged over time in response to the Hart–Tipler conjecture, or the idea that an absence of detectable Von Neumann probes is contrapositive evidence that no intelligent life exists outside of the Sun's Solar System. According to the berserker hypothesis, an absence of such probes is not evidence of life's absence, since interstellar probes could "go berserk" and destroy other civilizations, before self-destructing.
In his 1983 paper "The Great Silence", astronomer David Brin summarized the frightening implications of the berserker hypothesis: it is entirely compatible with all the facts and logic of the Fermi paradox, but would mean that there exists no intelligent life left to be discovered. In the worst-case scenario, humanity has already alerted others to its existence, and is next in line to be destroyed.
Background
There is no reliable or reproducible evidence that aliens have visited Earth. No transmissions or evidence of intelligent extraterrestrial life have been observed anywhere other than Earth in the Universe. This runs counter to the knowledge that the Universe is filled with a very large number of planets, some of which likely hold the conditions hospitable for life. Life typically expands until it fills all available niches. These contradictory facts form the basis for the Fermi paradox, of which the berserker hypothesis is one proposed solution.
Responses
A key component of the hypothesis is that Earth's solar system has not yet been visited by a berserker probe. In a 2013 analysis by Anders Sandberg and Stuart Armstrong at the Future |
https://en.wikipedia.org/wiki/Hart%E2%80%93Tipler%20conjecture | The Hart–Tipler conjecture is the idea that an absence of detectable Von Neumann probes is contrapositive evidence that no intelligent life exists outside of the Solar System. This idea was first proposed in opposition to the Drake equation in a 1975 paper by Michael H. Hart titled "Explanation for the Absence of Extraterrestrials on Earth". The conjecture is the first of many proposed solutions to the Fermi paradox (the conflict between the lack of obvious evidence for alien life and various high probability estimates for its existence). In this case, the solution is that there is no other intelligent life because such estimates are incorrect. The conjecture is named after astrophysicist Michael H. Hart and mathematical physicist and cosmologist Frank Tipler.
Background
There is no reliable or reproducible evidence that aliens have visited Earth. No transmissions or evidence of intelligent extraterrestrial life have been detected or observed anywhere other than Earth in the Universe. If intelligent life existed it would have produced enough von Neumann probes to cover the universe by now, which runs counter to the knowledge that the Universe is filled with a very large number of planets, some of which likely hold the conditions hospitable for life. Life typically expands until it fills all available niches. These contradictory facts form the basis for the Fermi paradox, of which the Hart–Tipler conjecture is one proposed solution.
Relationship to other proposed Fermi paradox solutions
The firstborn hypothesis is a special case of the Hart–Tipler conjecture which states that no other intelligent life has been discovered because humanity is the first intelligent life in the universe. According to the Berserker hypothesis, the absence of interstellar probes is not evidence of life's absence, since such probes could "go berserk" and destroy other civilizations, before self-destructing.
References
Astrobiology
Extraterrestrial life
Hypotheses
Astronomical hypothes |
https://en.wikipedia.org/wiki/TUM%20School%20of%20Computation%2C%20Information%20and%20Technology | The TUM School of Computation, Information and Technology (CIT) is a school of the Technical University of Munich, established in 2022 by the merger of three former departments. As of 2022, it is structured into the Department of Mathematics, the Department of Computer Engineering, the Department of Computer Science, and the Department of Electrical Engineering.
Department of Mathematics
The Department of Mathematics (MATH) is located at the Garching campus.
History
Mathematics was taught from the beginning at the Polytechnische Schule in München and the later Technische Hochschule München. Otto Hesse was the department's first professor for calculus, analytical geometry and analytical mechanics. Over the years, several institutes for mathematics were formed.
In 1974, the Institute of Geometry was merged with the Institute of Mathematics to form the Department of Mathematics, and informatics, which had been part of the Institute of Mathematics, became a separate department.
Research Groups
As of 2022, the research groups at the department are:
Algebra
Analysis
Analysis and Modelling
Applied Numerical Analysis, Optimization and Data Analysis
Biostatistics
Discrete Optimization
Dynamic Systems
Geometry and Topology
Mathematical Finance
Mathematical Optimization
Mathematical Physics
Mathematical Modeling of Biological Systems
Numerical Mathematics
Numerical Methods for Plasma Physics
Optimal Control
Probability Theory
Scientific Computing
Statistics
Department of Computer Science
The Department of Computer Science (CS) is located at the Garching campus.
History
The first courses in computer science at the Technical University of Munich were offered in 1967 at the Department of Mathematics, when Friedrich L. Bauer introduced a two-semester lecture titled Information Processing. In 1968, Klaus Samelson started offering a second lecture cycle titled Introduction to Informatics. By 1992, the computer science department had separated from the De |
https://en.wikipedia.org/wiki/Dana%20Cordell | Dana Cordell is a Research Director at the Institute for Sustainable Futures at the University of Technology Sydney, where she directs and undertakes international and Australian research on sustainable food and phosphorus futures. Cordell's work in sustainability research has been recognised with the Eureka Prize for Environmental Research (2012), a Banksia Mercedes-Benz Environmental Research Award, the Advance Food and Agriculture Award (2016) and she was named one of Australia's 100 Women of Influence (2013).
Education
Cordell was awarded a PhD in Sustainable Futures from the University of Technology Sydney and a PhD in Water and Environmental Studies from Linköping University, Sweden, as part of a cotutelle agreement between the two universities with her thesis "The Story of Phosphorus: Sustainability implications of global phosphorus scarcity for food security".
References
Living people
Year of birth missing (living people)
University of Technology Sydney alumni
Linköping University alumni
Academic staff of the University of Technology Sydney
Food scientists
Australian women scientists |
https://en.wikipedia.org/wiki/Evolution%20of%20sex-determining%20mechanisms | The evolution of sex-determining mechanisms, characterized by the evolutionary transition to genetic sex determination or temperature-dependent sex determination from the opposite mechanism, has frequently and readily occurred among multiple taxa across a transitionary continuum.
Sex-determining mechanisms include genetic sex determination, where sex is determined by genes on sex chromosomes, and environmental sex determination/temperature-dependent sex determination, where sex is permanently fixed by environmental cues after fertilization. Evolutionary transitions between these mechanisms are frequently driven by sex reversal, a phenomenon where environmental overrides produce organisms with discordant genotypic and phenotypic sex.
Evolutionary transitions
Technological advances in comparative chromosome mapping and molecular cytogenetics have advanced understanding of the many transitions between sex-determining modes. Threshold changes in gene expression for either male- or female-determining factors are enough to change modes of sex determination, as these thresholds are heritable, and more labile sex-determining mechanisms can be advantageous in unpredictable or changing environments.
Evolutionary transitions from genetic sex determination to temperature-dependent sex determination are possible as long as there is temperature sensitivity in the genetic system (on the sex chromosomes) and selection occurs on those sensitivity levels. All of these drivers of transitions between sex-determining mechanisms are due to a suggested novel locus changing fitness, which selection then acts upon. Current sex-determining systems must be destabilized in order to drive the evolution of a new system. When evolving from a genetic XX/XY or ZZ/ZW system to a temperature-dependent system, temperature-dependent sex determination naturally avoids nonviable YY or WW genotypes. The evolution of temperature-dependent sex determination is considered to be adaptive in most hypothes |
https://en.wikipedia.org/wiki/Forum%20of%20Incident%20Response%20and%20Security%20Teams | The Forum of Incident Response and Security Teams (FIRST) is a global forum of incident response and security teams. They aim to improve cooperation between security teams on handling major cybersecurity incidents. FIRST is an association of incident response teams with global coverage.
The 2018 Report of the United Nations Secretary-General's High-Level Panel on Digital Cooperation noted FIRST as a neutral third party which can help build trust and exchange best practices and tools during cybersecurity incidents.
History
FIRST was founded as an informal group by a number of incident response teams after the WANK (computer worm) highlighted the need for better coordination of incident response activities between organizations, during major incidents. It was formally incorporated in California on August 7, 1995, and moved to North Carolina on May 14, 2014.
Activities
In 2020, FIRST launched EthicsFIRST, a code of Ethics for Incident Response teams.
Annually, FIRST offers a Suguru Yamaguchi Fellowship, which helps incident response teams with national responsibility gain further integration with the international incident response community. It also maintains an Incident Response Hall of Fame, highlighting individuals who contributed significantly to the Incident Response community.
FIRST maintains several international standards, including the Common Vulnerability Scoring System, a standard for expressing impact of security vulnerabilities; the Traffic light protocol for classifying sensitive information; and the Exploit Prediction Scoring System, an effort for predicting when software vulnerabilities will be exploited.
FIRST is a partner of the International Telecommunication Union (ITU) and the Department of Foreign Affairs and Trade of Australia on Cybersecurity. The ITU co-organizes with FIRST the Women in Cyber Mentorship Programme, which engages cybersecurity leaders in the field, and connects them with women worldwide.
Together with the National Telec |
https://en.wikipedia.org/wiki/Sum%20of%20two%20cubes | In mathematics, the sum of two cubes is a cubed number added to another cubed number.
Factorization
Every sum of cubes may be factored according to the identity
$a^3 + b^3 = (a + b)(a^2 - ab + b^2)$
in elementary algebra.
Binomial numbers generalize this factorization to higher odd powers.
"SOAP" method
The mnemonic "SOAP", standing for "Same, Opposite, Always Positive", is sometimes used to memorize the correct placement of the addition and subtraction symbols while factorizing cubes. When applying this method to the factorization, "Same" represents the first term with the same sign as the original expression, "Opposite" represents the second term with the opposite sign as the original expression, and "Always Positive" represents the third term and is always positive.
Applied to the sum of cubes, $a^3 + b^3 = (a + b)(a^2 - ab + b^2)$: the "+" between $a$ and $b$ in the linear factor is the same as the original sign, the "−" on the $ab$ term is the opposite of the original sign, and the "+" on the $b^2$ term is always positive.
Proof
Starting with the expression $a^2 - ab + b^2$, it is multiplied by $a$ and by $b$:
$a(a^2 - ab + b^2) + b(a^2 - ab + b^2).$
By distributing $a$ and $b$ over $a^2 - ab + b^2$, one gets
$a^3 - a^2 b + ab^2 + a^2 b - ab^2 + b^3,$
and by canceling the like terms, one gets
$a^3 + b^3 = (a + b)(a^2 - ab + b^2).$
Similarly for the difference of cubes,
$a^3 - b^3 = (a - b)(a^2 + ab + b^2).$
Fermat's last theorem
Fermat's last theorem in the case of exponent 3 states that the sum of two non-zero integer cubes does not result in a non-zero integer cube. The first recorded proof of the exponent 3 case was given by Euler.
Taxicab and Cabtaxi numbers
Taxicab numbers are numbers that can be expressed as a sum of two positive integer cubes in $n$ distinct ways. The smallest taxicab number after Ta(1) is 1729, expressed as
$1^3 + 12^3$
or
$9^3 + 10^3.$
The smallest taxicab number expressed in 3 different ways is 87,539,319, expressed as
$167^3 + 436^3$, $228^3 + 423^3$, or $255^3 + 414^3.$
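A brute-force search makes the smallest case easy to verify. The sketch below is not from the source; the search limit and helper names are illustrative, but it recovers Ta(2) = 1729 and its two representations.

```python
# A minimal sketch (not from the source): brute-force search for numbers that
# are a sum of two positive cubes in at least two distinct ways.

from collections import defaultdict

def taxicab_candidates(limit: int) -> dict[int, list[tuple[int, int]]]:
    """Map n -> list of (a, b) with a <= b and a^3 + b^3 = n, for n <= limit."""
    reps = defaultdict(list)
    a = 1
    while 2 * a ** 3 <= limit:
        b = a
        while a ** 3 + b ** 3 <= limit:
            reps[a ** 3 + b ** 3].append((a, b))
            b += 1
        a += 1
    return reps

reps = taxicab_candidates(20_000)
two_way = {n: r for n, r in reps.items() if len(r) >= 2}
print(min(two_way), two_way[min(two_way)])   # 1729 [(1, 12), (9, 10)]
```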
Cabtaxi numbers are numbers that can be expressed as a sum of two positive or negative integers or 0 cube |
https://en.wikipedia.org/wiki/List%20of%20systems%20biology%20modeling%20software | Systems biology relies heavily on building mathematical models to help understand and make predictions of biological processes. Specialized software to assist in building models has been developed since the arrival of the first digital computers. The following list gives the currently supported software applications available to researchers.
The vast majority of modern systems biology modeling software support SBML, which is the de facto standard for exchanging models of biological cellular processes. Some tools also support CellML, a standard used for representing physiological processes. The advantage of using standard formats is that even though a particular software application may eventually become unsupported and even unusable, the models developed by that application can be easily transferred to more modern equivalents. This allows scientific research to be reproducible long after the original publication of the work.
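Because SBML is plain XML, a model exchanged between tools stays readable even with generic tooling, which is part of why the format supports reproducibility. The sketch below is not from the source and uses a hypothetical toy model string; it only illustrates that species and reactions can be listed with the Python standard library, independently of any application in the list.

```python
# A minimal sketch (not from the source): inspecting a toy SBML model with the
# Python standard library. The embedded model string is an illustrative example.

import xml.etree.ElementTree as ET

SBML_NS = "{http://www.sbml.org/sbml/level3/version1/core}"

toy_model = """<sbml xmlns="http://www.sbml.org/sbml/level3/version1/core" level="3" version="1">
  <model id="toy_pathway">
    <listOfSpecies>
      <species id="S1"/>
      <species id="S2"/>
    </listOfSpecies>
    <listOfReactions>
      <reaction id="J1" reversible="false"/>
    </listOfReactions>
  </model>
</sbml>"""

root = ET.fromstring(toy_model)
model = root.find(f"{SBML_NS}model")
species = [s.get("id") for s in model.iter(f"{SBML_NS}species")]
reactions = [r.get("id") for r in model.iter(f"{SBML_NS}reaction")]
print("species:", species)      # ['S1', 'S2']
print("reactions:", reactions)  # ['J1']
```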
To obtain more information about a particular tool, click on the name of the tool. This will direct you either to a peer-reviewed publication or, in some rare cases, to a dedicated Wikipedia page.
Actively supported open-source software applications
General information
When an entry in the SBML column states "Yes, but only for reactions.", it means that the tool only supports the reaction component of SBML. For example, rules, events, etc. are not supported.
Specialist Tools
The following table lists specialist tools that cannot be grouped with the modeling tools.
Feature Tables
Supported modeling paradigms
Differential equation specific features
File format support and interface type
Advanced features (where applicable)
Other features
Particle-based simulators
Particle based simulators treat each molecule of interest as an individual particle in continuous space, simulating molecular diffusion, molecule-membrane interactions and chemical reactions.
Comparison of particle-based simulators
The following list compares the feat |
https://en.wikipedia.org/wiki/Folgar-Tucker%20Model | The Folgar–Tucker equation (FTE) is a widely used and commercially applied model for describing fiber orientation in injection molding simulations of fiber composites.
The equation is based on Jeffery's equation for fibers suspended in melts, but, in addition, accounts for fiber–fiber interactions.
Tucker and Advani then integrate over an ensemble of fibers and hence obtain an evolution equation for the orientation (alignment) tensor $A$ as a field.
A compact way to express it is
$\frac{DA}{Dt} = W \cdot A - A \cdot W + \xi \left( D \cdot A + A \cdot D - 2\, \mathbb{A}_4 : D \right) + 2 C \dot{\gamma} \left( \mathbf{1} - 3A \right).$
The scalar quantities are the shear rate $\dot{\gamma}$, the interaction coefficient $C$ (for an isotropic diffusion) and the parameter $\xi$ accounting for the fiber aspect ratio. $\mathbb{A}_4$ is a fourth-order tensor. Normally, $\mathbb{A}_4$ is expressed as a function of $A$; the determination of the best-suited function is known as the closure problem.
$D$ and $W$ are respectively the symmetric and antisymmetric parts of the velocity gradient, while $\mathbf{1}$ represents the unit tensor.
$\mathbb{A}_4 : D$ represents a contraction over two indices.
Thus the Folgar–Tucker equation is a differential equation for the second-order tensor $A$, namely the orientation tensor.
This evolution equation is formulated in the frame of continuum mechanics and is coupled to the velocity field.
Since different closure forms can be inserted, many possible formulations of the equation are possible. For most of the closure forms the FTE results in a nonlinear differential equation (though a lemma to linearize it for some popular closures was introduced).
Analytical solutions to some versions of the FTE consist of exponential, trigonometric and hyperbolic functions.
Numerically, the FTE is also solved in commercial software for injection molding simulations.
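As an illustration of such a numerical solution, the sketch below integrates a Folgar–Tucker-type evolution equation in simple shear flow using an explicit Euler scheme and the quadratic closure. It is not taken from the source or from any commercial solver; the flow field, the parameter values for ξ and C, and the closure choice are assumptions made for the example.

```python
# A minimal sketch (not from the source): explicit time integration of a
# Folgar-Tucker-type orientation equation in simple shear, with the quadratic
# closure A4 : D ~ A (A : D). Parameters and flow field are illustrative.

import numpy as np

def folgar_tucker_rhs(A, L, xi=1.0, C=0.01):
    """Right-hand side of the orientation-tensor evolution equation."""
    D = 0.5 * (L + L.T)                             # rate-of-deformation tensor
    W = 0.5 * (L - L.T)                             # vorticity (spin) tensor
    gamma_dot = np.sqrt(2.0 * np.tensordot(D, D))   # scalar shear rate
    A4_D = A * np.tensordot(A, D)                   # quadratic closure for A4 : D
    return (W @ A - A @ W
            + xi * (D @ A + A @ D - 2.0 * A4_D)
            + 2.0 * C * gamma_dot * (np.eye(3) - 3.0 * A))

L_grad = np.array([[0.0, 1.0, 0.0],                 # simple shear: v_x = gamma_dot * y
                   [0.0, 0.0, 0.0],
                   [0.0, 0.0, 0.0]])
A = np.eye(3) / 3.0                                 # start from an isotropic orientation
dt = 1e-3
for _ in range(20_000):                             # forward-Euler integration
    A = A + dt * folgar_tucker_rhs(A, L_grad)

print(np.round(A, 3))                               # approximate steady-state orientation tensor
```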
References
Differential equations
Mathematical modeling |
https://en.wikipedia.org/wiki/Secunet%20Security%20Networks | secunet Security Networks AG, commonly known as Secunet, is a German information security corporation headquartered in Essen. Secunet develops, manufactures and sells information security hardware and secure telecommunications equipment. It is the producer and vendor of Germany's SINA infrastructure, which forms the basis for Germany's secure IT networks. Secunet is Germany's biggest information security company and provides services for public administration and private enterprises throughout the country. The company is listed in the SDAX stock index.
Secunet was founded in 1997 as a spin-off of the IT division of the former TÜV Mitte AG. Since 2009, Giesecke+Devrient has been the majority shareholder. In 2022, Secunet generated a revenue of 347 million euros and had more than 1,000 employees, making it the leading IT security partner of the Federal Republic of Germany. Within Germany, Secunet has offices in Berlin, Bonn, Borchen, Dresden, Eschborn, Hamburg, Munich and Siegen.
Products and services
With its products and services, Secunet primarily focuses on clients in public administration and ministries, the healthcare sector, the defence and space sector, as well as security and police agencies.
Secure Networking
One of Secunet's premier products is the SINA (Secure Inter-Network Architecture) product line, which was co-developed with Germany's Federal Office for Information Security (BSI). The SINA architecture forms the basis for Germany's classified information networks and is used in a broad range of public institutions in Germany and abroad. The architecture is accredited up to the NATO and EU SECRET classification levels and includes network encryptors and workstations as well as communication devices, such as encrypted phones.
Border Control
Secunet is the producer of EasyPASS, an automated border control system used at airports and border check points in Germany and a number of EU countries.
Consulting
Secunet's services include information security con |
https://en.wikipedia.org/wiki/Fukushima%20Hydrogen%20Energy%20Research%20Field | Fukushima Hydrogen Energy Research Field (FH2R) is the world's largest hydrogen production facility using renewable energy. It is located in Fukushima Prefecture in Japan. Construction started in 2018, and the facility was inaugurated by Shinzo Abe in 2020. The facility uses 10 MW of solar power generated by an array installed near the production facility. The facility can produce 1,200 Nm³ of hydrogen per hour. It was jointly established by the New Energy and Industrial Technology Development Organization, Toshiba Energy Systems & Solutions Corporation, Tohoku Electric Power and Iwatani Corporation.
References
External links
Description of FH2R used technology
Hydrogen infrastructure |
https://en.wikipedia.org/wiki/Control%20coefficient%20%28biochemistry%29 | Control coefficients are used to describe how much influence (i.e., control) a given reaction step has on the steady-state flux or species concentration level. In practice, this can be accomplished by changing the expression level of a given enzyme and measuring the resulting changes in flux and metabolite levels. Control coefficients form a central component of metabolic control analysis.
There are two primary control coefficients:
Flux Control Coefficients
Concentration Control Coefficients.
The simplest way to look at control coefficients is as the scaled derivatives of the steady-state change in an observable with respect to a change in enzyme activity. For example, the flux control coefficients can be written as:
$C^J_{e_i} = \dfrac{\partial J}{\partial e_i} \dfrac{e_i}{J}$
while the concentration control coefficients can be written as:
$C^{s_j}_{e_i} = \dfrac{\partial s_j}{\partial e_i} \dfrac{e_i}{s_j}$
where $J$ is the steady-state flux, $s_j$ a steady-state species concentration, and $e_i$ the activity of enzyme $i$.
Control coefficients can have any value that includes negative and positive values. A negative value indicates that the observable in question decreases as a result of the change in enzyme activity.
In theory, other observables, such as growth rate, or even combinations of observables, can be defined using a control coefficient. But flux and concentration control coefficients are by far the most commonly used.
Expressed approximately in terms of percentages—the percentage change in flux or concentration divided by the percentage change in enzyme activity—control coefficients are easier to measure and more intuitive to understand.
Control coefficients are useful because they tell us how much influence each enzyme or protein has in a biochemical reaction network.
It is important to note that control coefficients are not fixed values but will change depending on the state of the pathway or organism. If an organism shifts to a new nutritional source, then the control coefficients in the pathway will change.
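As an illustration of measuring a control coefficient by perturbing enzyme activity, the sketch below uses a hypothetical two-step linear pathway whose steady state can be written in closed form. The rate laws, parameter values and function names are assumptions for the example, not part of the source.

```python
# A minimal sketch (not from the source): numerically estimating a flux control
# coefficient for a hypothetical two-step linear pathway
#   Xo -> S -> (sink),  v1 = e1*(Xo - S),  v2 = e2*S
# by perturbing the enzyme activity e1 and measuring the steady-state flux.

def steady_state_flux(e1: float, e2: float, Xo: float = 10.0) -> float:
    """Steady state requires v1 = v2, giving S = e1*Xo/(e1+e2) and J = e2*S."""
    S = e1 * Xo / (e1 + e2)
    return e2 * S

def flux_control_coefficient(e1: float, e2: float, rel_step: float = 1e-6) -> float:
    """Scaled sensitivity (dJ/de1)*(e1/J), estimated by finite differences."""
    J = steady_state_flux(e1, e2)
    J_perturbed = steady_state_flux(e1 * (1 + rel_step), e2)
    return (J_perturbed - J) / (e1 * rel_step) * e1 / J

e1, e2 = 2.0, 3.0
print(flux_control_coefficient(e1, e2))   # ~0.6, matching e2/(e1+e2) analytically
```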
Formal Definition
One criticism of the concept of the control coefficient as defined above is that it is dependent on the observable being described relative to a change in enzyme activity. Instead, the Berlin school defined control coefficients in terms of changes to local rates br |
https://en.wikipedia.org/wiki/Caroline%20Thomas%20Rumbold | Caroline Thomas Rumbold (July 22, 1877 – November 7, 1949) was an American botanist. She specialized in forest pathology. Her research focused on “fungus diseases of trees and blue stain of wood.”
Biography
Born on July 22, 1877, in St. Louis, Missouri, United States, Caroline Thomas Rumbold was the daughter of Thomas Frasier Rumbold and Charlotte E. Ledengerber.
In 1901 she graduated from Smith College in Massachusetts. She earned both her master's degree and her doctorate from Washington University in St. Louis.
She started her career as an assistant at the Bureau of Plant Industry, United States Department of Agriculture in 1903. She later moved to University of Missouri to become an assistant in botany. From 1929 to 1942 she had a long career as an associate pathologist at the Department of Plant Pathology in the University of Wisconsin–Madison. She briefly worked as a fellow at the Missouri Botanical Garden.
She was associated with a number of professional institutions including Phytopathological Society, the American Society of Plant Physiologists and the Botanical Society of Washington.
She died on November 7, 1949, in Cleveland, Ohio, United States.
References
1877 births
1949 deaths
Pathology
American physiologists
Plant physiologists
American women botanists
American botanists
Women physiologists
American women academics
20th-century American botanists
20th-century American women scientists
Smith College alumni
Washington University in St. Louis alumni
University of Wisconsin–Madison staff |
https://en.wikipedia.org/wiki/Trustworthy%20AI | Trustworthy AI is a programme of work of the ITU (United Nations Specialized Agency for ICT) under its AI for Good programme. The programme advances the standardization of a number of Privacy-enhancing technologies (PETs), including homomorphic encryption, federated learning, secure multi-party computation, differential privacy, zero-knowledge proof.
Privacy-enhancing technologies apply complex and sometimes counterintuitive operations to process signals and information while safeguarding privacy. For instance, homomorphic encryption allows computations to be performed on encrypted data, where the outcome or result is still encrypted and unknown to those performing the computation, but can be decrypted by the original encryptor. These technologies are often developed with the goal of enabling data use in jurisdictions different from the one where the data was created (e.g., under the GDPR). As such, the programme, led by two international organizations, develops international standards to operate in this context. The PETs are used for analytics such as artificial intelligence.
History
The origin of the programme lies with the ITU-WHO Focus Group on Artificial Intelligence for Health, where a strong need for privacy, combined with the need for analytics, created a demand for a standard in these technologies.
When AI for Good moved online in 2020, the TrustworthyAI seminar series was initiated to start discussions on such work, which eventually led to the standardization activities.
Standardization
Multi-Party Computation
Secure Multi-Party Computation (MPC) is being standardized under "Question 5" (the incubator) of ITU-T Study Group 17.
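As a flavour of what such MPC protocols do, the sketch below shows additive secret sharing, a common building block: several parties compute a sum without any one of them seeing the others' inputs. It is an illustrative toy, not the protocol being standardized; the modulus, party count and inputs are assumptions.

```python
# A minimal sketch (not from the source): additive secret sharing, a building
# block behind many secure multi-party computation (MPC) protocols. Three
# parties jointly compute a sum without any single party learning the inputs.

import secrets

MODULUS = 2**61 - 1  # arbitrary public modulus, large enough for the inputs

def share(value: int, n_parties: int = 3) -> list[int]:
    """Split a value into n additive shares that sum to it modulo MODULUS."""
    shares = [secrets.randbelow(MODULUS) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % MODULUS)
    return shares

def reconstruct(shares: list[int]) -> int:
    return sum(shares) % MODULUS

# Each party secret-shares its private input with the others.
inputs = [42, 17, 99]
all_shares = [share(x) for x in inputs]

# Party i locally adds the i-th share of every input; no single share set
# reveals anything about an individual input.
local_sums = [sum(s[i] for s in all_shares) % MODULUS for i in range(3)]

print(reconstruct(local_sums))  # 158, the sum of the private inputs
```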
Homomorphic Encryption
ITU has been collaborating since the early stage of the HomomorphicEncryption.org standardization meetings, which has developed a standard on Homomorphic encryption. The 5th homomorphic encryption meeting was hosted at ITU HQ in Geneva.
Federated Learning
Zero-sum masks as used by federated learning for privacy preservation are used e |
https://en.wikipedia.org/wiki/Makr%20Shakr | Makr Shakr (pronounced Maker Shaker) is a producer of robotic bartenders and baristas based in Turin, Italy. The robots receive orders from customers via mobile devices, and leverage automation technologies to prepare different beverages.
Research and development
Development of Makr Shakr's robotic bartenders began at the MIT Senseable City Lab, led by professor Carlo Ratti, with the support of Coca-Cola and Bacardi. It originated from the concept of leveraging digital technologies to "explore the new dynamics of social creation and consumption".
The prototype is equipped with the ability to perform motions essential to bartending like shaking, muddling and slicing. In addition, cocktail-making is "crowdsourced" through a mobile app inviting individual users to contribute their own recipes. Its movement is modelled after the choreography by New York Theater Ballet's Marco Pelle.
Makr Shakr was first introduced to the public through various international technology and design events. After a test run in April 2013 during the Milan Design Week, three mechanical arms were deployed at the Google I/O Conference in San Francisco a month later.
In 2014, Makr Shakr received design awards by Core 77 and D&AD for its concept and digital interface respectively.
Commercial launch
Makr Shakr was launched commercially as a start-up in 2017, with an assembly line set up in Turin, Italy. It partners with the German automotive system manufacturer KUKA in producing the robotic bartenders.
Operating at a rate of 60-90 seconds per drink, the brand's various models are implemented in cities such as Amsterdam, Prague, London and Las Vegas. The Bionic Bar, produced by Makr Shakr, is installed in nine cruise ships of cruise line Royal Caribbean. The company is set to debut in Changi Airport, Singapore, Bali and Dallas in 2023.
The company's flagship robot Toni featured in the exhibition "AI: More Than Human" at the Barbican, London in 2019.
Influence on restaurants and bars
|
https://en.wikipedia.org/wiki/Light-induced%20fluorescence%20transient | A light-induced fluorescence transient (LIFT) is a device to remotely measure chlorophyll fluorescence in plants in a fast and non-destructive way. By using a series of excitation light pulses, LIFT combines chlorophyll fluorescence data with spectral and RGB information to provide insights into various photosynthetic traits and vegetation indices. LIFT combines the pump-probe method with the principle of laser-induced fluorescence.
Fluorescence measurement principle
The process of photosynthesis in plants involves the absorption of light by photosynthetic pigments, transfer of energy to the reaction center, charge separation leading to oxygen evolution, and a series of electron transport processes. The efficiency of photosynthesis is determined by factors such as the rate of electron transport, carbon fixation, and energy production. The LIFT measures photosynthesis by exposing the plant to short flashes of blue light and analyzing the changes in fluorescence over time by the help of the FRR technique.
LIFT FRR technique
The LIFT fast repetition rate (FRR) fluorescence technique is a method for measuring plant fluorescence. It uses a series of short bursts of blue light pulses from a LED to excite photosystem II in the plant. When the quinone acceptor A (QA) reaches its capacity for binding electrons, the system becomes saturated and consequently red fluorescence is emitted. This is regulated by a precise excitation protocol, which consists of a saturation sequence (SQA) and a relaxation sequence (RQA) with a set of short excitation flashes (1 μs).
The fluorescence can then be measured with FRR fluorometry. For that purpose, the LIFT instrument has a built-in optical interference filter to separate the red chlorophyll fluorescence from reflected light, with a wavelength of 685 ± 10 nm.
The fluorescence transient resulting from this excitation protocol shows the kinetics of the reduction of QA and its subsequent re-oxidation, and can be used to calculate vari |
https://en.wikipedia.org/wiki/Sparse%20identification%20of%20non-linear%20dynamics | Sparse identification of nonlinear dynamics (SINDy) is a data-driven algorithm for obtaining dynamical systems from data. Given a series of snapshots of a dynamical system and its corresponding time derivatives, SINDy performs a sparsity-promoting regression (such as LASSO) on a library of nonlinear candidate functions of the snapshots against the derivatives to find the governing equations. This procedure relies on the assumption that most physical systems only have a few dominant terms which dictate the dynamics, given an appropriately selected coordinate system and quality training data. It has been applied to identify the dynamics of fluids, based on proper orthogonal decomposition, as well as other complex dynamical systems, such as biological networks.
Mathematical Overview
First, consider a dynamical system of the form
$\dfrac{d}{dt}\mathbf{x}(t) = \mathbf{f}(\mathbf{x}(t)),$
where $\mathbf{x}(t)$ is a state vector (snapshot) of the system at time $t$ and the function $\mathbf{f}$ defines the equations of motion and constraints of the system. The time derivative $\dot{\mathbf{x}}(t)$ may be either prescribed or numerically approximated from the snapshots.
With $\mathbf{x}(t)$ and $\dot{\mathbf{x}}(t)$ sampled at $m$ equidistant points in time ($t_1, t_2, \ldots, t_m$), these can be arranged into matrices of the form
$\mathbf{X} = \begin{bmatrix} \mathbf{x}^\top(t_1) \\ \mathbf{x}^\top(t_2) \\ \vdots \\ \mathbf{x}^\top(t_m) \end{bmatrix},$
and similarly for $\dot{\mathbf{X}}$.
Next, a library $\boldsymbol{\Theta}(\mathbf{X})$ of nonlinear candidate functions of the columns of $\mathbf{X}$ is constructed, which may contain constant, polynomial, or more exotic functions (like trigonometric and rational terms, and so on):
$\boldsymbol{\Theta}(\mathbf{X}) = \begin{bmatrix} \mathbf{1} & \mathbf{X} & \mathbf{X}^{2} & \cdots & \sin(\mathbf{X}) & \cos(\mathbf{X}) & \cdots \end{bmatrix}.$
The number of possible model structures from this library is combinatorially high. $\mathbf{f}$ is then substituted by $\boldsymbol{\Theta}(\mathbf{X})$ and a matrix of coefficients $\boldsymbol{\Xi}$ determining the active terms in $\mathbf{f}$:
$\dot{\mathbf{X}} \approx \boldsymbol{\Theta}(\mathbf{X})\,\boldsymbol{\Xi}.$
Because only a few terms are expected to be active at each point in time, an assumption is made that $\mathbf{f}$ admits a sparse representation in $\boldsymbol{\Theta}$. This then becomes an optimization problem of finding a sparse $\boldsymbol{\Xi}$ which optimally embeds $\mathbf{f}$. In other words, a parsimonious model is obtained by performing a least-squares regression on the system with sparsity-promoting ($L_1$) regularization
$\boldsymbol{\xi}_k = \arg\min_{\boldsymbol{\xi}_k'} \left\lVert \dot{\mathbf{X}}_k - \boldsymbol{\Theta}(\mathbf{X})\,\boldsymbol{\xi}_k' \right\rVert_2 + \lambda \left\lVert \boldsymbol{\xi}_k' \right\rVert_1,$
where $\lambda$ is a regularization parameter. Finally, the sparse |
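A compact illustration of the procedure: snapshots of a toy one-dimensional system are regressed against a small candidate library using sequentially thresholded least squares, the sparsity-promoting regression used in the original SINDy formulation. The system, library, threshold and step sizes below are assumptions chosen for the example, not from the source.

```python
# A minimal sketch (not from the source): SINDy via sequentially thresholded
# least squares on a toy 1-D system xdot = x - x^2.

import numpy as np

# Generate snapshots of x(t) by integrating xdot = x - x^2 with small steps.
dt, steps = 1e-3, 5000
x = np.empty(steps)
x[0] = 0.1
for k in range(steps - 1):
    x[k + 1] = x[k] + dt * (x[k] - x[k] ** 2)

x_dot = np.gradient(x, dt)               # numerically approximate derivatives

# Candidate-function library Theta(X) = [1, x, x^2, x^3].
Theta = np.column_stack([np.ones_like(x), x, x ** 2, x ** 3])

def stls(Theta, dXdt, threshold=0.05, iters=10):
    """Sequentially thresholded least squares (the original SINDy regression)."""
    xi = np.linalg.lstsq(Theta, dXdt, rcond=None)[0]
    for _ in range(iters):
        small = np.abs(xi) < threshold
        xi[small] = 0.0                   # prune inactive terms
        big = ~small
        xi[big] = np.linalg.lstsq(Theta[:, big], dXdt, rcond=None)[0]
    return xi

print(np.round(stls(Theta, x_dot), 3))    # ~[0, 1, -1, 0] -> xdot = x - x^2
```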
https://en.wikipedia.org/wiki/Energy%20Reports | Energy Reports is a peer-reviewed open-access scientific journal covering all aspects of energy research. The journal was established in 2015 and is published by Elsevier. The editor-in-chief is Nelson Fumo (University of Texas at Tyler). Authors pay article processing charges, but do not retain unrestricted copyrights and publishing rights.
Abstracting and indexing
The journal is abstracted and indexed in Ei Compendex, Scopus, and the Science Citation Index Expanded. According to the Journal Citation Reports, the journal has a 2021 impact factor of 4.937.
References
External links
Energy and fuel journals
Academic journals established in 2015
English-language journals
Elsevier academic journals
Creative Commons Attribution-licensed journals
Continuous journals |
https://en.wikipedia.org/wiki/Bow-tie%20diagram | A bow-tie diagram is a graphic tool used to simplify understanding of the process leading to damage or loss. The diagram records the results of an analysis based on a defined Event and shows all of the physically possible Mechanisms to the left of the Event and the physically possible Outcomes of the Event on the right. The shape of the diagram resembles that of a bow tie, after which it is named.[3]
The bow-tie diagram can be used proactively to record the identification and analysis of all possible damage processes in a particular situation and to act as an index of the various features of design and management systems that control the risk of such processes. They may also be used to contribute to the structure of an analysis of an Event that has occurred (an 'accident' analysis). In this way the bow-tie diagram is useful for a structured overview of all sources of risk.
The diagram follows the same basic principles as those on which fault tree analysis and event tree analysis are based, but, in being far less complex than these, is attractive as a means of rapidly establishing an overall scope of risk concerns for an organisation, only some few of which may justify those more rigorous and logical methods.
Bow-tie diagrams are used to analyze and manage risk in several industries, such as oil and gas production, the process industries, aviation, and finance.
History
It is generally accepted in the literature that the earliest public mention of the bow-tie diagram appeared in Imperial Chemical Industries (ICI) course notes. However, Viner explains how the diagram originally arose in a lecture on damage process modelling given by him in Australia at the then Ballarat College of Advanced Education (now the Federation University), which lecture was attended by students from ICI. This diagrammatic representation of the damage process was developed spontaneously during a lecture on the Generalised Time Sequence Model (GTSM) of the damage process, in order to fa |
https://en.wikipedia.org/wiki/P4-metric | The P4 metric enables performance evaluation of a binary classifier.
It is calculated from precision, recall, specificity and NPV (negative predictive value).
P4 is designed in a similar way to the F1 metric, but addresses the criticisms leveled against F1. It may be perceived as its extension.
Like the other known metrics, P4 is a function of: TP (true positives), TN (true negatives), FP (false positives), FN (false negatives).
Justification
The key concept of P4 is to leverage the four key conditional probabilities:
Precision – the probability that the sample is positive, provided the classifier result was positive.
Recall – the probability that the classifier result will be positive, provided the sample is positive.
Specificity – the probability that the classifier result will be negative, provided the sample is negative.
NPV – the probability that the sample is negative, provided the classifier result was negative.
The main assumption behind this metric is that a properly designed binary classifier should give results for which all the probabilities mentioned above are close to 1.
P4 is designed so that it equals 1 only when all of these probabilities equal 1.
It also goes to zero when any of these probabilities goes to zero.
Definition
P4 is defined as a harmonic mean of four key conditional probabilities:
$\mathrm{P}_4 = \dfrac{4}{\frac{1}{\text{precision}} + \frac{1}{\text{recall}} + \frac{1}{\text{specificity}} + \frac{1}{\text{NPV}}}$
In terms of TP, TN, FP, FN it can be calculated as follows:
$\mathrm{P}_4 = \dfrac{4 \cdot \mathrm{TP} \cdot \mathrm{TN}}{4 \cdot \mathrm{TP} \cdot \mathrm{TN} + (\mathrm{TP} + \mathrm{TN}) \cdot (\mathrm{FP} + \mathrm{FN})}$
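A small sketch (not from the source) showing that the harmonic-mean definition and the closed-form expression in TP, TN, FP and FN agree on a hypothetical confusion matrix:

```python
# A minimal sketch (not from the source): computing the P4 metric from a
# confusion matrix, both as the harmonic mean of the four conditional
# probabilities and via the closed-form expression in TP, TN, FP, FN.

def p4_from_rates(tp: int, tn: int, fp: int, fn: int) -> float:
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    specificity = tn / (tn + fp)
    npv = tn / (tn + fn)
    # Harmonic mean of the four key conditional probabilities.
    return 4.0 / (1 / precision + 1 / recall + 1 / specificity + 1 / npv)

def p4_closed_form(tp: int, tn: int, fp: int, fn: int) -> float:
    return 4.0 * tp * tn / (4.0 * tp * tn + (tp + tn) * (fp + fn))

tp, tn, fp, fn = 80, 50, 20, 10   # an illustrative confusion matrix
print(p4_from_rates(tp, tn, fp, fn))   # ~0.804
print(p4_closed_form(tp, tn, fp, fn))  # same value from the closed form
```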
Evaluation of the binary classifier performance
Evaluating the performance of a binary classifier is a multidisciplinary concept. It spans from the evaluation of medical and psychiatric tests to machine learning classifiers from a variety of fields. Thus, many metrics in use exist under several names, some of them being defined independently.
Properties of P4 metric
Symmetry – in contrast to the F1 metric, P4 is symmetric: it does not change its value when the dataset labeling is swapped, i.e., positives are renamed negatives and negatives are renamed positives.
Range: $0 \le \mathrm{P}_4 \le 1$
Achieving $\mathrm{P}_4 = 1$ requires all four key conditional probabilities |
https://en.wikipedia.org/wiki/Hydrodynamic%20delivery | Hydrodynamic Delivery (HD) is a method of DNA insertion in rodent models. Genes are delivered via injection into the bloodstream of the animal, and are expressed in the liver. This protocol is helpful to determine gene function, regulate gene expression, and develop pharmaceuticals in vivo.
Methods
Hydrodynamic Delivery was developed as a way to insert genes without viral infection (transfection). The procedure requires a high-volume DNA solution to be injected into the veins of the rodent using a high-pressure needle. The volume of the DNA solution is typically equal to 8–10% of the animal's body weight, and is injected within 5–7 seconds. The pressure of the injection leads to cardiac congestion (increased pressure in the heart), allowing the DNA solution to flow through the bloodstream and accumulate in the liver. The pressure expands the pores in the cell membrane, forcing the DNA molecules into the parenchyma, or the functional cells of the organ. In the liver, these cells are the hepatocytes. In less than two minutes after the injection, the pressure returns to natural levels, and the pores shrink back, trapping the DNA inside of the cell. After injection, the majority of genes are expressed in the liver of the animal over a long period of time.
Originally developed to insert DNA, further developments in HD have enabled the insertion of RNA, proteins, and short oligonucleotides into cells.
Applications
The development of Hydrodynamic Delivery methods allows an alternative way to study in vivo experiments. This method has shown to be effective in small mammals, without the potential risks and complications of viral transfection. Applications of these studies include: testing regulatory elements, generating antibodies, analyzing gene therapy techniques, and developing models for diseases. Typically, genes are expressed in the liver, but the procedure can be altered to express genes in kidneys, lungs, muscles, heart, and pancreas.
Gene therapy
Hydrodynamic De |
https://en.wikipedia.org/wiki/Four%20Epigraphs%20after%20Escher | Four Epigraphs after Escher (German: 4 Epigraphe nach Escher), Op. 35, is a chamber music composition by Graham Waterhouse, written in 1993 for viola, heckelphone and piano. Its four movements refer to graphic artworks by M. C. Escher. It was premiered in Munich in 1995, and the U.S. premiere was given in 1998. It was published by Hofmeister in 1998.
History
Waterhouse was inspired by graphic artworks by M. C. Escher to write in 1993 Four Epigraphs after Escher in four movements, each named for a piece of graphic art. He scored it as a piano trio with viola and heckelphone. It is one of few compositions for a solo heckelphone, a kind of oboe in low register. Paul Hindemith had written a Trio, Op. 47, for the same ensemble in 1928. The work was published by Friedrich Hofmeister Musikverlag in Leipzig in 1998.
The composition is structured in four movements:
Die Gottesanbeterin (Praying mantis)
Möbiusband II (Möbius strip)
Reiter (Rider)
Reptilien (Reptiles)
The first movement was inspired by a graphic showing an oversized mantis in a church, on a stone monument to a bishop on his tomb. Escher dealt with the phenomenon of the Möbius strip several times; the music relates to Möbius II, with ants crawling over the strip. The third movement alludes to a print with riders in two directions and two colour shades, partly complementing each other. The final movement is based on Escher's 1934 print Reptiles.
Performances
Four Epigraphs after Escher was premiered in Munich in 1995, and the U.S. premiere was played at the 1998 conference of the International Double Reed Society in Tempe, Arizona. In a concert on 6 June of music by Waterhouse, which contained also two world premieres, one more U.S. premiere and a reprise of Mouvements d'Harmonie, the piece was played by Gerald Corey, Heckelphone, violist Peter Rosato and the composer as the pianist.
References
External links
of Graham Waterhouse
Graham Waterhouse (1962): Four Epigraphs after Escher, op. 35, fo |
https://en.wikipedia.org/wiki/Roberts%27s%20triangle%20theorem | Roberts's triangle theorem, a result in discrete geometry, states that every simple arrangement of $n$ lines has at least $n-2$ triangular faces. Thus, three lines form a triangle, four lines form at least two triangles, five lines form at least three triangles, etc. It is named after Samuel Roberts, a British mathematician who published it in 1889.
Statement and example
The theorem states that every simple arrangement of $n$ lines in the Euclidean plane has at least $n-2$ triangular faces. Here, an arrangement is simple when it has no two parallel lines and no three lines through the same point.
One way to form an arrangement of $n$ lines with exactly $n-2$ triangular faces is to choose the lines to be tangent to a semicircle. For lines arranged in this way, the only triangles are the ones formed by three lines with consecutive points of tangency. As $n$ lines have $n-2$ consecutive triples, they also have $n-2$ triangles.
Proof
Branko Grünbaum found the proof in Roberts's original paper "unconvincing", and credits the first correct proof of Roberts's theorem to Robert W. Shannon, in 1979. He presents instead the following more elementary argument, first published in Russian by Alexei Belov. It depends implicitly on a simpler version of the same theorem, according to which every simple arrangement of three or more lines has at least one triangular face. This follows easily by induction from the fact that adding a line to an arrangement cannot decrease the number of triangular faces: if the line cuts an existing triangle, one of the resulting two pieces is again a triangle. On the other hand, although the bound of Roberts's theorem increases with each added line, the number of triangles in any particular arrangement may sometimes remain unchanged.
If the given lines are all moved without changing their slopes, their new positions can be described by a system of real numbers, the offsets of each line from its original position. For each triangular face, there is a linear equation on the offse |
https://en.wikipedia.org/wiki/Hintikka%20set | In mathematical logic, a Hintikka set is a set of logical formulas whose elements satisfy the following properties:
An atom or its conjugate can appear in the set but not both,
If a formula in the set has a main operator that is of "conjunctive-type", then its two operands appear in the set,
If a formula in the set has a main operator that is of "disjunctive-type", then at least one of its two operands appears in the set.
The exact meaning of "conjunctive-type" and "disjunctive-type" is defined by the method of semantic tableaux.
Hintikka sets arise when attempting to prove completeness of propositional logic using semantic tableaux. They are named after Jaakko Hintikka.
Propositional Hintikka sets
In a semantic tableau for propositional logic, Hintikka sets can be defined using uniform notation for propositional tableaux. The elements of a propositional Hintikka set S satisfy the following conditions:
No variable and its conjugate are both in S,
For any $\alpha$ in S, its components $\alpha_1$ and $\alpha_2$ are both in S,
For any $\beta$ in S, at least one of its components $\beta_1$, $\beta_2$ is in S.
If a set S is a Hintikka set, then S is satisfiable.
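As an illustration, the sketch below checks the three Hintikka conditions for a small set of propositional formulas. The tuple-based formula representation and helper names are assumptions made for the example; it follows the α/β classification of uniform notation.

```python
# A minimal sketch (not from the source): checking the propositional Hintikka
# conditions for a set of formulas represented as nested tuples, e.g.
# ("and", "p", ("not", "q")).

def negate(f):
    return f[1] if isinstance(f, tuple) and f[0] == "not" else ("not", f)

def components(f):
    """Return ('alpha', parts) or ('beta', parts) following uniform notation."""
    if isinstance(f, tuple):
        if f[0] == "and":
            return "alpha", [f[1], f[2]]
        if f[0] == "or":
            return "beta", [f[1], f[2]]
        if f[0] == "not" and isinstance(f[1], tuple):
            if f[1][0] == "or":                      # ~(A or B) is alpha-type
                return "alpha", [negate(f[1][1]), negate(f[1][2])]
            if f[1][0] == "and":                     # ~(A and B) is beta-type
                return "beta", [negate(f[1][1]), negate(f[1][2])]
    return None, []                                   # literal: no components

def is_hintikka(S: set) -> bool:
    for f in S:
        kind, parts = components(f)
        if kind is None and negate(f) in S:           # atom and its conjugate
            return False
        if kind == "alpha" and not all(p in S for p in parts):
            return False
        if kind == "beta" and not any(p in S for p in parts):
            return False
    return True

S = {("and", "p", ("not", "q")), "p", ("not", "q")}
print(is_hintikka(S))   # True: a satisfiable set, e.g. p=True, q=False
```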
References
Sources
Mathematical logic |
https://en.wikipedia.org/wiki/Food%20self-provisioning | Food self-provisioning (FSP) is the growing of one's own food, especially fruits and vegetables. Also labelled household food production, it is a traditional activity that persists in the countries of the Global North. It is studied in sustainability science and in ecofeminism because of its social, health and environmental outcomes.
References
ecology
natural resources
Human impact on the environment |
https://en.wikipedia.org/wiki/Crime-Free%20Multi-Housing | The Crime-Free Multi-Housing (CFMH) program is a crime prevention program, which partners property owners, residents, and law-enforcement personnel in an effort to eliminate crime, drugs, and gang activity from rental properties.
History
The program began in Mesa, Arizona in the United States in 1992. Since then, it has spread to other US cities and several other countries.
The International Crime Free Association says that the program has brought satisfied tenants, increased demand for rental units, lowered maintenance and repair costs, increased property values, and improved safety.
Program
Three phases must be completed under police supervision:
an eight-hour seminar presented by the local police department
certification that the rental property has met the security requirements for the tenants' safety
a tenant crime-prevention meeting is held
Landlords are allowed to advertise their full certification on their property.
See also
Crime prevention through environmental design
Notes
References
External links
Crime Free Housing Training, West Fargo, ND
Crime-Free-Multi-Housing, City of Ottawa, Ontario, Canada
Crime Free Multi-Housing Program, British Columbia, Canada
Crime Free Multi-Housing, Tucson, AZ
EPD Crime Free Multi-Housing Program, Evansville, IN
Criminology
Security engineering
Crime prevention |
https://en.wikipedia.org/wiki/Coachella%20filter | The Coachella filter was an augmented reality social media camera filter released April 2016 that superimposed a flower crown on the user's head and brightened the complexion of the user's skin. The filter appeared on the Snapchat photo messaging application on the occasion of the Coachella music festival in 2016, using the festival's recognizable "boho-chic aesthetic." The Coachella filter became popular worldwide.
The filter was criticized for whitewashing users' skin complexion and contributing to unrealistic beauty standards and dysmorphia.
References
Social media
Popular culture
2016
Internet culture |
https://en.wikipedia.org/wiki/AES50 | AES50 is an Audio over Ethernet protocol for multichannel digital audio. It is defined by the AES50-2011 standard for High-resolution multi-channel audio interconnection (HRMAI).
Origins
AES50 is based on the SuperMAC protocol created by Sony Pro Audio Lab (now Oxford Digital). The preliminary standard was assigned the AES-X140 project designation in 2003, and was finally approved in 2005 as a royalty-free open standard.
HyperMAC is an improved protocol based on Gigabit Ethernet physical layer, allowing more channels and lower audio latency. It was considered for an alternate physical layer in a future revision of AES50, but standardisation did not move forward.
Sony licensed its proprietary software implementations of SuperMAC and HyperMAC to Midas Consoles for their Midas XL8 digital mixer. Midas parent Klark Teknik took over the SuperMAC and HyperMAC patents in 2007, then in 2009 Midas and Klark Teknik were acquired by Uli Behringer's Music Group.
The AES50 protocol is implemented in digital mixing consoles by Midas and Behringer to transfer digital audio between the remote stage boxes.
Specifications
AES50 is a point-to-point interconnect which carries multiple channels of AES3, PCM or DSD bitstream formats, along with system clock and synchronisation signals, over Cat 5 cable using 100 Mbit/s Fast Ethernet physical layer.
AES50 uses the four pairs of the Cat 5 cable in the 8P8C connector:
Audio data transmit +
Audio data transmit –
Audio data receive +
Sync signal transmit +
Sync signal transmit –
Audio data receive –
Sync signal receive +
Sync signal receive –
Audio data is transmitted in bidirectional full-duplex mode over two differential pairs used by the 100BASE-TX standard, and word clock sync signal is transmitted over the remaining differential pairs not used by the Fast Ethernet layer. Using separate copper pairs for clock signal simplifies connection setup and allows phase-accurate low-jitter clock sync.
AES50 only employs the Ethe |
https://en.wikipedia.org/wiki/List%20of%20electric%20truck%20makers | This is a list of electric truck makers that have produced medium- and heavy-duty commercial battery-powered all-electric trucks.
Multiple-brand corporations
The following truck brands are owned by Big Three automobile manufacturers and other corporations which hold multiple automobile and truck brands.
Hyundai-Kia
In 2020, Hyundai sold over 9,000 units of its Porter Electric truck in South Korea while Kia sold over 5,000 units of the Kia Bongo EV in the same market.
Mercedes-Benz Group
Mercedes-Benz
Mercedes-Benz began delivering eActros units to 10 customers in September 2018 for a two-year real-world test. Customers include Dachser, Edeka, Hermes, Kraftverkehr Nagel, Ludwig Meyer, Pfenning Logistics, TBS Rhein-Neckar and Rigterink of Germany, and Camion Transport and Migros of Switzerland.
Daimler AG
Freightliner
Freightliner began delivering e-M2 trucks to Penske in December 2018, and will commercialize its larger e-Cascadia in 2019. A total of 50 electric trucks should be on the road by the end of 2019, including 20 units delivered to Penske and NFI. The Portland factory will be renovated to start electric truck production in 2021.
Mitsubishi Fuso
Mitsubishi Fuso began deliveries of the eCanter in 2017.
Rizon
Daimler launched the all-electric truck Rizon brand in the United States in 2023. Journalists questioned whether the Rizon trucks are rebranded Mitsubishi Fuso eCanter trucks, but Daimler did not address these questions.
Paccar
DAF
DAF delivered its first CF semi-truck to Jumbo for testing in December 2018. It uses a VDL powertrain. The logistics company Tinie Manders Transport received a unit in February 2019, and Contargo in Germany received two units in May.
DAF supplies the CF Electric as a two-axle tractor unit (the FT ; GCW: 37 tonnes) that's ideally suited for emission-free, almost silent supermarket deliveries. The three-axle rigid truck (the FAN; GVW: up to 29 tonnes) has a steered trailing rear axle for maximum manoeuvrabilit |
https://en.wikipedia.org/wiki/Biofabrication | Biofabrication is a branch of biotechnology specialising in the research and development of biologically engineered processes for the automated production of biologically functional products through bioprinting or bioassembly and subsequent tissue maturation processes; as well as techniques such as directed assembly, which employs localised external stimuli to guide the fabrication process; enzymatic assembly, which utilises selective biocatalysts to build macromolecular structures; and self-assembly, in which the biological material guides its own assembly according to its internal information. These processes may facilitate fabrication at the micro- and nanoscales. Biofabricated products are constructed and structurally organised with a range of biological materials including bioactive molecules, biomaterials, living cells, cell aggregates such as micro-tissues and micro-organs on chips, and hybrid cell-material constructs.
See also
References
Works cited
Biological engineering
Biotechnology |
https://en.wikipedia.org/wiki/Le%20Vaillant | Le Vaillant (French: The Valiant) (died 4 June 1916) was a pigeon used by the French Army in the First World War. The bird was the last held at Fort Vaux before it was overrun in the Battle of Verdun. Le Vaillant carried a message from the fort's commander Sylvain Raynal to his senior officers requesting reinforcements but was mortally wounded in flight. The bird was posthumously appointed to the Legion of Honour and is commemorated by a plaque at the fort.
Background
Fort Vaux was a fortification guarding the north-east approach to the city of Verdun. The fort was besieged by German forces during the 1916 Battle of Verdun and by early June the remaining French garrison was under the command of Commandant Sylvain Raynal. Telephone connection between the fort and the citadel had been severed by German troops and Raynal's only means of communication was by messenger pigeon, of which he had four.
With German attacks continuing to gain ground, Raynal sent the first of his pigeons on 2 June. The message requested that artillery fire be directed upon the fort against German troops that had occupied its upperworks. The pigeon arrived at the citadel despite injury, but had lost the ring containing the message. Raynal's penultimate bird was received and was awarded the Croix de Guerre for its flight.
Flight of 4 June
On 4 June Raynal released his last pigeon, number 787.15, named Le Vaillant. The message he bore included the text "we are holding. But ... relief is imperative ... This is my last pigeon". Le Vaillant had been affected by gas released from German shells and was revived by a number of trips to a loophole in Raynal's command post. He set off at 11:30 a.m.
Le Vaillant delivered the message to the dovecot at the citadel. The bird was grievously wounded and died in the hands of the citadel's pigeon master. Because of the message, five relief parties were sent to reinforce Raynal, arriving on 5 June. The garrison lacked water and ammunition and Raynal was |
https://en.wikipedia.org/wiki/List%20of%20paleotempestology%20records | Paleotempestology is the study of past tropical cyclone activity by means of geological proxies as well as historical documentary records. The term was coined by American meteorologist Kerry Emanuel.
Examples
Non-tropical examples
See also
Tropical cyclone
Tropical cyclone observation
Tropical cyclones and climate change
Notes
References
Citations
General sources
Further reading
External links
Western North Atlantic Basin 8,000 Year palaeotempestology Database 2018
palaeotempestology Resource Center
Shipwrecks, tree rings and hurricanes
Paleontology lists
Tropical cyclone meteorology
Paleotempestology |
https://en.wikipedia.org/wiki/EU%E2%80%93US%20Data%20Privacy%20Framework | The EU–US Data Privacy Framework is a forthcoming European Union–United States data transfer framework that was agreed to in 2022. Previous such regimes—the EU–US Privacy Shield (2016–2020) and the International Safe Harbor Privacy Principles (2000–2015)—were declared invalid by the European Court of Justice due to concerns that personal data leaving EU borders is subject to sweeping US government surveillance. The Trans-Atlantic Data Privacy Framework is intended to address these concerns.
Since the invalidation of the EU–US Privacy Shield in July 2020, companies wishing to transfer data between the EU and the US "have faced confusion, higher compliance costs, and challenges for EU–US business relationships".
In October 2022 U.S. President Joe Biden signed an executive order to implement the framework. A ratification process by the European Commission is expected to take up to six months. The European Data Protection Board has approved the draft. Although not binding on the European Commission, on 11 May 2023 the European Parliament voted in favour of a resolution calling on the Commission to renegotiate the Framework and not to adopt an adequacy finding on the basis that "the EU–US Data Privacy Framework fails to create essential equivalence in the level of protection".
Data Protection Review Court
The Data Protection Review Court (DPRC) is a three-judge panel, established in Executive Order 14086, which will deal with appeals made to the decisions of the Civil Liberties Protection Officer of the Office of the Director of National Intelligence as described by the Trans-Atlantic Data Privacy Framework. The decisions made by the DPRC have binding authority.
See also
Data Protection Directive
Digital privacy
General Data Protection Regulation
Safe harbor (law)
References
External links
Questions & Answers: EU-U.S. Data Privacy Framework by the European Commission
Final Implementing Decision of EU-US Data Privacy Framework by the European Commission
Informatio |
https://en.wikipedia.org/wiki/V%20%28programming%20language%29 | V (/ˈviː/, as in the letter v) also known as vlang, is a statically typed compiled programming language created by Alexander Medvednikov in early 2019. It was mostly inspired by the Go programming language, but was also influenced by C, Rust, and Oberon-2. It is free and open-source software, in active development at GitHub, and released under the MIT license.
The foremost goal of V is to be easy to use, and at the same time, to enforce a safe coding style through elimination of ambiguity. For example, variable shadowing is not allowed; declaring a variable with a name that is already used in a parent scope will cause a compilation error.
History
The new language was created as a result of frustration with existing languages being used for personal projects. Originally the language was intended for personal use, but after it was mentioned publicly and gained interest, it was decided to make it public. Initially, the name was the same as a product known as Volt. The V language was created in order to develop it, along with other software applications such as Ved (also called Vid). As the extension in use was already ".v", to not mess up the git history, it was decided to name it "V". Upon public release, the compiler was written in V, and could compile itself. Along with Alexander Medvednikov, the creator, its community has a large number of contributors from around the world who have helped with continually developing and adding to the compiler, language, and modules. Key design goals behind the creation of V: easier to learn and use, higher readability, fast compilation, increased safety, efficient development, cross-platform usability, improved C interop, better error handling, modern features, and more maintainable software.
Features
Safety
Usage of bounds checking
Usage of Option/Result
Mandatory checking of errors
No usage of values that are undefined
No shadowing of variables
No usage of null (unless in unsafe code)
No usage of global variables |