https://en.wikipedia.org/wiki/Pencil%20Code%20%28programming%20language%29
Pencil Code is an educational programming language and website. It allows programming using either Scratch-style block coding or CoffeeScript. Code runs directly in the web browser and can be shared with others. The language centers on a model of a pencil programmatically drawing on a 2-dimensional screen, with the pencil cursor depicted visually as a turtle. History Pencil Code was created by David Bau and his son in 2013. It was inspired by Logo, the Lisp-like language introduced in 1967 for drawing on a screen. Google has funded improvements to Pencil Code via Google Summer of Code projects. References External links Pencil Code official website Domain-specific programming languages Educational programming languages Free educational software Programming languages
https://en.wikipedia.org/wiki/Avant-Garde%20Computing
Avant-Garde Computing, Inc., was a publicly traded American software and computer hardware company active from 1978 to 1990 and based in Mount Laurel, New Jersey. It was best known for its Net/Command, Net/Adviser, Net/Alert, and Net/Guard suite of network management, monitoring, and security products. The company was acquired by Boole & Babbage in 1990 after a five-year string of losses. Beginning Avant-Garde Computing was founded by Timothy P. Ahlstrom and F. Morgan LaMarche and incorporated in 1978. Ahlstrom and LaMarche were 20-year veterans of IBM, both working in that company's marketing department. In their off-time in the early 1970s, the duo built a device that would warn computer operators when a data tape was close to the end of its reel, founding Ahlstrom LaMarche & Co. to market it. The device proliferated rapidly in the computer rooms of various companies, and the duo later sold their company and its patents to Telegentics of Cherry Hill, New Jersey, for a reportedly modest profit. After several years, the duo regrouped and discussed starting another business. They decided to invite several top executives of companies who ran large mainframe computer networks to dinner at a restaurant. When Ahlstrom and LaMarche asked them what troubles they frequently encountered, network management was cited as the most challenging task. These conversations inspired the duo to raise the capital to incorporate Avant-Garde Computing in Cherry Hill; in order to secure adequate financing, the two also put second mortgages on their homes. The company's first product, Net/Alert, was announced in October 1979. It was a hardware–software network management suite comprising a light-pen-capable color CRT monitor, a logic analyzer, and graphical software to analyze traffic on a mainframe network and display and print reports based on the collected data. Net/Alert took two years to develop and was primarily the brainchild of Ahlstrom; LaMarche meanwhile possessed t
https://en.wikipedia.org/wiki/Internet%20real-name%20system%20in%20China
The Internet real-name system in China is a real-name system in which Internet service providers and Internet content providers (especially user-generated content sites) in the People's Republic of China are required to collect users' real names, ID numbers, and other information when providing services. Since the implementation of the real-name system on the Internet may lead to the infringement and narrowing of the constitutionally protected speech space of Internet users, it has attracted concerns from all sides and generated much controversy in Chinese society. Only a few countries in the world, such as South Korea, have implemented a real-name system on the Internet. History Proposal to ban anonymity in civil society The origin of the proposed ban on anonymity in mainland China is generally believed to be the proposal made by Li Xiguang, a journalism professor at Tsinghua University, in 2002, when he talked about journalism reform in the South, that "the Chinese National People's Congress should ban anyone from being anonymous online". He argued that the Internet should be strictly protected by copyright and intellectual property rights, and that "at the same time, online writing should be legally responsible," and that "including traditional media, we should promote the use of real names, not pseudonyms... publishing under pseudonyms is irresponsible to the public." His remarks caused an uproar on the Internet and became known as the "Li Xiguang incident". Although there was a period of heated debate, no corresponding measures were subsequently introduced and the matter was left unresolved. Afterwards, Li Xiguang himself said that he had lost interest in the topic of real names on the Internet, and that "banning online anonymity is very unrealistic and not legally or technically feasible." The Chinese government's implementation of an online real-name system In 2012, the Standing Committee of the National People's Congress (NPC) of China adopted the Deci
https://en.wikipedia.org/wiki/Bondee
Bondee is a virtual avatar social networking app developed by Singapore-based tech firm, Metadream. It is a platform in which users can connect and interact with others by using a personalized figure-style avatar. Launched in January 2023, the app quickly gained popularity in Asia and topped app store charts in several countries. History and development Metadream, an independent Singapore-based tech company, acquired the intellectual property (IP) rights for True.ly in May 2022 and planned a worldwide rollout. It was founded by investors from the United States and Australia, with research and development (R&D) and operational bases in Japan and South Korea, as well as data centers in Singapore, Japan, and the United States, to ensure product safety and meet data security requirements. The company plans to establish regional operation centers in additional countries (such as Thailand and the Philippines) to serve local users. On November 15, 2022, it was announced that information technology (IT) startup company, Metadream, would launch Bondee, a figure-style avatar messenger application. The virtual avatar application was then officially launched on January 17, 2023. Features Bondee is a messenger application that allows users to send and receive messages from friends and acquaintances, "[to] express their current mood, condition, situation," or alternatively, through a "figure-style" avatar, and "exhibit" it to the other person using the app. It is available on iOS 13.0 and above and Android 8.0 and above. Upon opening the app, users can customize their personal "3D virtual" self from a "wide selection" of avatars, hairstyles, clothing, shoes, and accessories; they are then redirected to create a virtual space or home with pieces of furniture, fixtures, and fittings. Users can also participate in "virtual" activities such as camping, swinging, dancing, sailing, visiting each other's rooms with friends, and l
https://en.wikipedia.org/wiki/El%20Capitan%20%28supercomputer%29
El Capitan is an upcoming exascale supercomputer built by Hewlett Packard Enterprise, hosted at the Lawrence Livermore National Laboratory in Livermore, California, and projected to become operational in 2024. It is based on the Cray EX Shasta architecture. When deployed, El Capitan is projected to displace Frontier as the world's fastest supercomputer. Design El Capitan has been announced to use an unknown number of AMD Instinct MI300A accelerated processing units (APUs). The MI300A consists of 24 AMD Zen 4 (AMD64-based) CPU cores and a CDNA 3-based GPU integrated onto a single organic package, along with 128 GB of HBM3 RAM. The floor space and number of racks for El Capitan have not yet been announced. Blades are interconnected by HPE Slingshot 64-port switches, each providing 12.8 terabits/second of bandwidth. Groups of blades are linked in a dragonfly topology with at most three hops between any two nodes. Cabling is either optical or copper, customized to minimize cable length. Total cabling runs . El Capitan uses an APU architecture where the CPU and GPU share an internal on-chip coherent interconnect. History El Capitan was ordered as a part of the Department of Energy's CORAL-2 initiative, intended to replace Sierra, an IBM/NVIDIA machine deployed in 2018. LLNL partnered with HPE Cray and AMD to build the system. Three El Capitan prototypes – named rzVernal, Tioga, and Tenaya – were themselves powerful enough to be listed on the TOP500 supercomputer list in June 2023. rzVernal reached 4.1 petaflops. In early July 2023, the first components of El Capitan were installed at Lawrence Livermore, with complete installation expected by mid 2024. References Cray products Exascale computers GPGPU supercomputers Lawrence Livermore National Laboratory X86 supercomputers 64-bit computers
https://en.wikipedia.org/wiki/Chiliz
Chiliz is a blockchain platform developed by the Malta-based sports company Mediarex. The Chiliz blockchain powers the Socios.com platform, which offers fan tokens to sports fans, enabling them to participate in polls hosted by the clubs, or receive rewards and promotions. The native token Chiliz is used to buy the fan tokens. Alexandre Dreyfus is the CEO of Chiliz and Beatrice Collet is the managing director. History Chiliz was launched in 2018 by the Malta-based sports company Mediarex, led by CEO Alexandre Dreyfus. Members of the firm's advisory panel include Dr. Christian Mueller, InFront Sports' vice president, strategy and business development, and Sam Li, Sina Sports' head of strategic partnerships; Perform Group's chief strategy officer, John Gleasure, is also a shareholder of Mediarex. Other members of the advisory board are Fnatic's CEO Wouter Sleijffers and Team Vitality's CEO Nicolas Maurer. In June 2018, Chiliz raised $65 million in a round led by Binance, with other industry investors including OK Blockchain Capital, FBG Capital, Ceyuan Ventures, and Bancor. In March 2021, the company announced it would invest $50 million in an expansion to the United States. Fan Tokens Fan tokens are digital coins created on the Chiliz blockchain that sports organisations provide to their fans through the app Socios.com. They allow fans to vote on a variety of minor decisions, such as new facilities, kit designs, shirt numbers of new signings, celebration songs, and more. Fan Tokens were first introduced in 2019, with football clubs Juventus and Paris Saint-Germain being the first clubs to launch their official tokens. Sports clubs including Barcelona, Atletico Madrid, Manchester City, Inter, Arsenal, AS Roma, Galatasaray, Flamengo, Corinthians and 60 other teams have launched Fan Tokens through Socios.com. References External links Official Website Digital currencies Blockchains Cryptocurrency projects Cryptocurrencies
https://en.wikipedia.org/wiki/Inverter-based%20resource
An inverter-based resource (IBR) is a source of electricity that is asynchronously connected to the electrical grid via an electronic power converter ("inverter"). The devices in this category, also known as converter interfaced generation (CIG), include the variable renewable energy generators (wind, solar) and battery storage power stations. These devices lack the intrinsic behaviors of synchronous machines (like the inertial response of a synchronous generator), and their behavior is almost entirely defined by their control algorithms, which presents specific challenges to system stability as their penetration increases; for example, a single software fault can affect all devices of a certain type in a contingency (cf. the section on the Blue Cut fire below). IBRs are sometimes called non-synchronous generators. The design of inverters for the IBR generally follows the IEEE 1547 and NERC PRC-024-2 standards. Grid-following vs. grid-forming A grid-following (GFL) device is synchronized to the local grid voltage and injects an electric current vector aligned with the voltage (in other words, it behaves like a current source). GFL inverters are built into an overwhelming majority of installed IBR devices. Because it follows the grid voltage, a GFL device will shut down if a large voltage/frequency disturbance is observed. GFL devices cannot contribute to the grid strength, dampen active power oscillations, or provide inertia. A grid-forming (GFM) device partially mimics the behavior of a synchronous generator: its voltage is controlled by a free-running oscillator that slows down when more energy is withdrawn from the device. Unlike a conventional generator, the GFM device has no overcurrent capacity and thus will react very differently in short-circuit situations. Adding the GFM capability to a GFL device is not expensive in terms of components, but affects the revenues: in order to support the grid stability by providing extra power when needed, the power semiconductors need to be oversized and
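The grid-forming behavior described above — an internal oscillator that slows as more active power is drawn — is commonly implemented as a frequency-droop law. The following is a minimal sketch under assumed, illustrative parameter values (a 50 Hz nominal frequency and a 5% droop coefficient); it is not taken from any particular standard or vendor implementation.

```python
# Minimal sketch of a grid-forming frequency-droop rule: the inverter's
# internal oscillator slows as more active power is drawn, loosely mimicking
# a synchronous generator's speed droop. All parameter values are
# illustrative assumptions.

F_NOM = 50.0   # nominal frequency, Hz (assumed)
P_SET = 0.8    # active-power set point, per unit (assumed)
DROOP = 0.05   # 5% droop: a 100% power swing moves frequency by 5% of nominal

def gfm_frequency(p_out: float) -> float:
    """Frequency commanded by the droop law f = f_nom - m_p * (P - P_set)."""
    m_p = DROOP * F_NOM  # Hz per per-unit power
    return F_NOM - m_p * (p_out - P_SET)

# At the set point the oscillator runs at nominal frequency;
# drawing extra power pulls the frequency down.
f_at_setpoint = gfm_frequency(0.8)
f_overloaded = gfm_frequency(1.0)
```

The same droop relation, run in reverse, is how several such devices share a load increase in proportion to their droop settings.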
https://en.wikipedia.org/wiki/Piecewise%20algebraic%20space
In mathematics, a piecewise algebraic space is a generalization of a semialgebraic set, introduced by Maxim Kontsevich and Yan Soibelman. The motivation was the proof of Deligne's conjecture on Hochschild cohomology. Robert Hardt, Pascal Lambrechts, Victor Turchin, and Ismar Volić later developed the theory. References Maxim Kontsevich and Yan Soibelman. "Deformations of algebras over operads and the Deligne conjecture". In: Conférence Moshé Flato 1999, Vol. I (Dijon). Vol. 21. Math. Phys. Stud. Dordrecht: Kluwer Acad. Publ., 2000, pp. 255–307. arXiv: math/0001151. Algebraic geometry
https://en.wikipedia.org/wiki/Deligne%27s%20conjecture%20on%20Hochschild%20cohomology
In deformation theory, a branch of mathematics, Deligne's conjecture concerns the operadic structure on the Hochschild cochain complex. Various proofs have been suggested by Dmitry Tamarkin, Alexander A. Voronov, James E. McClure and Jeffrey H. Smith, Maxim Kontsevich and Yan Soibelman, and others, following initial constructions of homotopy algebraic structures on the Hochschild complex. It is of importance in relation to string theory. See also piecewise algebraic space References Further reading https://ncatlab.org/nlab/show/Deligne+conjecture https://mathoverflow.net/questions/374/delignes-conjecture-the-little-discs-operad-one Algebraic topology String theory Conjectures
https://en.wikipedia.org/wiki/Stanhope%20Demonstrator
The Stanhope Demonstrator was the first machine to solve problems in logic. It was designed by Charles Stanhope, 3rd Earl Stanhope to demonstrate consequences in logic symbolically. The first model was constructed in 1775. It consisted of two slides coloured red and gray mounted in a square brass frame. This could be used to demonstrate the solution to a syllogistic type of problem in which objects might have two different properties and the question was how many would have both properties. Scales marked zero to ten were used to set the numbers or proportions of objects with the two properties. This form of inference anticipated the numerically definite syllogism which Augustus De Morgan laid out in his book, Formal Logic, in 1847. Construction The device was a brass plate about four inches square which was mounted on a piece of mahogany which was three-quarters of an inch thick. There was an opening with a depression in the wood about one and a half inches square and half an inch deep. This opening was called the holon, meaning whole, and represented the full set of objects under consideration. A slide of red translucent glass could be inserted from the right across the holon. A slide of gray wood could be slid under the red slide. When the device was used for the "Rule for the Logic of Certainty", the gray slider was inserted from the left. When it was used for the "Rule for the Logic of Probability", the gray slider was inserted from above. The red and the gray sliders represented the two affirmative propositions which were being combined. Stanhope called these ho and los. At least four of the devices with this square style were built. In 1879, Robert Harley wrote that he had one which he had been given by Stanhope's great-grandson, Arthur, who had kept one. The other two were owned by General Babbage – the son of Charles Babbage, who continued his work on the Analytical Engine. One of the devices was donated to the Science Museum, London by the
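The numerical rule the demonstrator embodies — the same rule behind De Morgan's numerically definite syllogism — can be sketched in a few lines: if ho of the holon objects have one property and los have the other, at least ho + los − holon must have both. The function name and example figures here are illustrative, not Stanhope's own notation.

```python
def min_with_both(holon: int, ho: int, los: int) -> int:
    """Least possible number of objects having both properties:
    ho + los - holon, floored at zero (the overlap of the two sliders)."""
    return max(0, ho + los - holon)

# With 10 objects in the holon, 8 having the "red" property and 7 the
# "gray" property, at least 5 must have both.
overlap = min_with_both(10, 8, 7)
```

When ho + los does not exceed the holon, the sliders need not overlap at all and the minimum is zero.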
https://en.wikipedia.org/wiki/Macroprogramming
In computer science, macroprogramming is a programming paradigm aimed at expressing the macroscopic, global behaviour of an entire system of agents or computing devices. In macroprogramming, the local programs for the individual components of a distributed system are compiled or interpreted from a macro-program typically expressed from a system-level perspective or in terms of the intended global goal. The aim of macroprogramming approaches is to support expressing the macroscopic interactive behaviour of a whole distributed system of computing devices or agents in a single program. It is not to be confused with macros, the mechanism often found in programming languages (like C or Scala) to express substitution rules for program pieces. Macroprogramming originated in the context of wireless sensor network programming and found renewed interest in the context of the Internet of Things and swarm robotics. Macroprogramming shares similar goals (related to programming a system from a global perspective) with multitier programming, choreographic programming, and aggregate computing. Context and motivation Programming distributed systems, multi-agent systems, and collectives of software agents (e.g., robotic swarms) is difficult, because many issues (like communication, concurrency, and failure) have to be properly considered. In particular, a general recurrent problem is how to induce the intended global behaviour by defining the behaviour of the individual components or agents involved. The problem can be addressed through learning approaches, such as multi-agent reinforcement learning, or by manually defining the control program driving each component. However, addressing the problem from a fully individual (or single-node) perspective may be error-prone, because it is generally difficult to foresee the overall behaviour emerging from complex networks of activities and interactions (cf. complex systems and emergence). Therefore, researchers have started investigating ways
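The gap between a global specification and the per-node programs it compiles to can be illustrated with a toy consensus task. The sketch below is not written in any actual macroprogramming language: it states a macro-level goal (every node agrees on the network-wide mean reading) directly, then shows the kind of local rule — repeated averaging with neighbours — that a macroprogramming compiler might emit for each node. The topology and readings are invented for illustration.

```python
# Toy illustration of macro- vs. micro-level views (not a real
# macroprogramming language). Macro view: "all nodes converge to the
# network-wide mean". Micro view: each node repeatedly averages its own
# value with its neighbours' values.

readings = {0: 10.0, 1: 20.0, 2: 30.0, 3: 40.0}
neighbours = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}  # a ring topology

macro_goal = sum(readings.values()) / len(readings)  # global-view spec

state = dict(readings)
for _ in range(500):  # local rule, iterated synchronously at every node
    state = {n: (state[n] + sum(state[m] for m in neighbours[n]))
                / (1 + len(neighbours[n]))
             for n in state}
# On this regular (ring) topology the uniform averaging rule preserves the
# sum each round, so every node's value converges to the global mean.
```

The point of macroprogramming approaches is that the programmer writes only something like the `macro_goal` line, and the iterated local rule is derived mechanically.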
https://en.wikipedia.org/wiki/Adobe%20Enhanced%20Speech
Adobe Enhanced Speech is an online artificial intelligence software tool by Adobe that aims to significantly improve the quality of recorded speech that may be badly muffled, reverberated, full of artifacts, tinny, etc., converting it to a studio-grade, professional level regardless of the initial input's clarity. Users may upload mp3 or wav files up to an hour long and a gigabyte in size to the site to convert them relatively quickly; they can then listen to the converted version, toggle back and forth between it and the original as it plays, and download it. Currently in beta and free to the public, it has been used in the restoration of old movies and the creation of professional-quality podcasts, narrations, etc. by those without sufficient microphones. Although the model still has some limitations, such as not being compatible with singing and occasional issues with excessively muffled source audio resulting in a light lisp in the improved version, it is otherwise noted as highly effective and efficient for its purpose. Utilizing advanced machine learning algorithms to distinguish between speech and background sounds, it enhances the quality of the speech by filtering out the noise and artifacts, adjusting the pitch and volume levels, and normalizing the audio. This is accomplished by the network having been trained on a large dataset of speech samples from a diverse range of sources and then being fine-tuned to optimize the output. References Audio software Artificial intelligence Deep learning software applications
https://en.wikipedia.org/wiki/Tensor%20%28machine%20learning%29
In machine learning, tensor informally refers to two different concepts that organize and represent data. Data may be organized in a multidimensional array (M-way array) that is informally referred to as a "data tensor"; however, in the strict mathematical sense, a tensor is a multilinear mapping over a set of domain vector spaces to a range vector space. Observations, such as images, movies, volumes, sounds, and relationships among words and concepts, stored in an M-way array ("data tensor") may be analyzed either by artificial neural networks or tensor methods. Tensor decomposition can factorize data tensors into smaller tensors. Operations on data tensors can be expressed in terms of matrix multiplication and the Kronecker product. The computation of gradients, an important aspect of the backpropagation algorithm, can be performed using PyTorch and TensorFlow. Computations are often performed on graphics processing units (GPUs) using CUDA and on dedicated hardware such as Google's Tensor Processing Unit or Nvidia's Tensor Core. These developments have greatly accelerated neural network architectures and increased the size and complexity of models that can be trained. History A tensor is by definition a multilinear map. In mathematics, this may express a multilinear relationship between sets of algebraic objects. In physics, tensor fields, considered as tensors at each point in space, are useful in expressing mechanics such as stress or elasticity. In machine learning, the exact use of tensors depends on the statistical approach being used. By 2001, the fields of signal processing and statistics were making use of tensor methods. Pierre Comon surveys the early adoption of tensor methods in the fields of telecommunications, radio surveillance, chemometrics and sensor processing. Linear tensor rank methods (such as PARAFAC/CANDECOMP) analyzed M-way arrays ("data tensors") composed of higher order statistics that were employed in blind source separation problems
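The "data tensor" sense of the word — an M-way array — and the matricization step that tensor-rank methods like CP/PARAFAC rely on can be shown concretely. The sketch below uses plain Python lists and an invented 2×3×2 example; it illustrates only the data layout and the mode-1 unfolding, not a decomposition algorithm.

```python
# A 3-way "data tensor" and its mode-1 unfolding (matricization).
# X has shape 2 x 3 x 2: e.g. 2 observations, 3 features, 2 time steps
# (an invented example).
X = [[[1, 2], [3, 4], [5, 6]],
     [[7, 8], [9, 10], [11, 12]]]

def unfold_mode1(x):
    """Mode-1 unfolding: row i collects every entry x[i][j][k],
    with j varying fastest, giving an I x (J*K) matrix."""
    J, K = len(x[0]), len(x[0][0])
    return [[x[i][j][k] for k in range(K) for j in range(J)]
            for i in range(len(x))]

X1 = unfold_mode1(X)  # a 2 x 6 matrix; linear-algebra methods apply to it
```

Unfolding along each mode in turn is how matrix techniques (SVD, least squares) are lifted to M-way data in the classical tensor methods mentioned above.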
https://en.wikipedia.org/wiki/Math%20walk
A math walk, or math trail, is a type of themed walk in the US, where direct experience is translated into the language of mathematics or abstract mathematical sciences such as information science, computer science, decision science, or probability and statistics. Some sources specify how to create a math walk whereas others define a math walk at a specific location such as a junior high school or in Boston. The journal The Mathematics Teacher includes a special section titled "Mathematical Lens" in many issues, with the metaphor of a lens capturing the idea of seeing the world as mathematics. Informal learning The idea that "math is everywhere", which is emphasized on a math walk, is captured by the philosophy of mathematicism with its early adherents, Pythagoras and Plato. The math walk also implicitly involves experiencing math via modeling, since mathematics serves to model what we sense. The math walk is a form of informal learning, often in an outside environment or in a museum. This type of learning is contrasted with formal learning, which tends to be more structured and performed in a classroom. Math walks have been shown to encourage students to think more deeply about mathematics, and to connect school content to the real world. Maps and object discovery There are different approaches to designing a math walk. The walk can be guided or unguided. In a guided walk, the learners are guided by a person knowledgeable in the topic of mathematics. In an unguided walk, learners are provided with a map. The map identifies walking stops and identifiers, such as QR codes or bluetooth beacons, to provide additional information on how the objects experienced during a math walk are translated into mathematical language. Example math walk scene A walk can involve translation only, or translation and problem solving. For example, considering a window on a building involves first perceiving the window. After perception, there is a translation of the form of the window to mathematic
https://en.wikipedia.org/wiki/Exclu
Exclu was an encrypted messaging app that was shut down after a series of international raids in February 2023. Service Exclu offered licences for three or six months, for €500 and €900 respectively. History In 2019 German police raided CyberBunker, which led to them obtaining data needed to decrypt the Exclu services. Data was shared with other police forces. On 3 February 2023 simultaneous raids took place on properties in Germany, the Netherlands, Poland and Belgium. 48 people were arrested, including administrators, developers and users of Exclu. Dutch police said that arrests were based on two distinct operations. The first, called 26Sambar, started in September 2020 and targeted owners and managers of Exclu. They were suspected of facilitating criminals. The second, called 26Lytham, began in late April 2022. This specifically targeted users suspected of being involved in organised crime. In February 2023 German police announced that George Mitchell was one of five major suspects behind the network. References Anonymity networks Cyberspace Dark web Defunct darknet markets Distributed computing architecture Law enforcement operations Organized crime in Europe 2023 disestablishments
https://en.wikipedia.org/wiki/Pinto%20Bean%20%28squirrel%29
Pinto Bean was an eastern gray squirrel on the campus of the University of Illinois Urbana-Champaign that was renowned for, and named after, its rare piebald pattern. It died in Champaign on October 8, 2022, presumably due to a motor vehicle collision. Pinto Bean has been called a "minor celebrity" and a "grassroots, unofficial mascot" for the university. During a home football game against Minnesota, a tribute to Pinto Bean was shown on the jumbotron at Memorial Stadium. Pattern An eastern gray squirrel, Pinto Bean was named for its distinctive mixture of gray fur with patches of unpigmented white fur, which resembled the appearance of pinto beans. According to Illinois Natural History Survey director Eric Schauber, this was the result of a rare genetic mutation that affected where melanin was distributed in the squirrel's body. Schauber said the mutation was rare enough to assume that there was only one such squirrel on campus. Death and taxidermy Pinto Bean was found dead on the side of Springfield Avenue in Champaign on October 8, 2022. Its cause of death was presumably a vehicular collision. News of its death quickly spread on the r/UIUC subreddit and other social media. One user, Champaign resident Clark Jackson, retrieved the squirrel's remains and delivered them to a taxidermist in Bloomington in an effort to preserve the squirrel. As of March 27, 2023, Pinto Bean's taxidermied remains are on display at the Forbes Natural History Building on the south side of the university's campus. Tributes During Illinois's October 15, 2022, home football game against Minnesota, the jumbotron at Memorial Stadium displayed a tribute to Pinto Bean at halftime. The screen displayed an image of Pinto Bean, accompanied by the message: "RIP to Pinto Bean the Squirrel, forever in our hearts." The Illini Wildlife and Conservation Club held a moment of silence in remembrance of Pinto Bean. Students and community members shared many tributes and eulogies on r/UIUC. See also Tomm
https://en.wikipedia.org/wiki/Candle%20Corporation
Candle Corporation was an American software company active from 1976 to 2004. The company spent the first two decades developing system monitoring applications for a variety of IBM mainframes and their corresponding software, the first being OMEGAMON, which saw quick widespread adoption in commercial enterprises. In the mid-1990s, the company made pivots toward non-mainframe monitoring software and middleware. IBM acquired the company for between $350 million and $600 million in 2004. History 1970s – 1980s Aubrey G. Chernick (born 1949 in Los Angeles, California), the founder of Candle, grew up in Deloraine, Manitoba, after his family moved there from California. After graduating from the University of Manitoba with a Bachelor of Science in chemistry, he landed a job at the university's environmental protection laboratory, performing analyses of the Red River of the North. The minicomputers at the lab were Chernick's first hands-on experience with computers; with a fellow employee, he learned how to program in BASIC. Following this, Chernick deviated from his original career path of medicine to work as a software developer for Computer Sciences Corporation (CSC)'s Canadian subsidiary in Ontario. After being laid off from CSC after three months, he worked as a programmer for Laurentian University, working on IBM's System/360 Model 40 mainframe, and for the Government of Manitoba, where he learned how to operate and code for IBM's MFT and MVS operating systems. These jobs provided Chernick his first experiences with mainframes. While attending meetings hosted in Ontario by SHARE—a users' group for IBM mainframe personnel—Chernick observed recurring complaints from attendees, who spoke of not being able to satisfy common needs with IBM's operating systems. In 1975, Chernick convinced Canada Life's Ontario branch to let him use their mainframes as a development platform for an application that monitored system performance, in exchange for a bargain license for the
https://en.wikipedia.org/wiki/0/1-polytope
A 0/1-polytope is a convex polytope generated as the convex hull of a subset of the points of {0,1}^d, i.e. points whose d coordinates each take the value 0 or 1. The full domain is the unit hypercube, and a 0/1-polytope is carved from it by cut planes passing through these points. A d-polytope requires at least d+1 vertices, which cannot all lie in the same hyperplane. For example, n-simplex polytopes can be generated from (n+1) vertices, using the origin and one vertex along each primary axis: (1,0,...,0), etc. References Polytopes Convex hulls Planes (geometry)
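The candidate vertex set {0,1}^d and the simplex construction just described are easy to enumerate directly. The following sketch (dimension d = 3 chosen for illustration) lists all 2^d hypercube vertices and builds the simplex from the origin plus the d unit points:

```python
# Vertices of the unit hypercube {0,1}^d, and the n-simplex subset
# described above: the origin plus one vertex on each coordinate axis.
from itertools import product

d = 3
cube_vertices = list(product((0, 1), repeat=d))  # all 2^d candidate vertices

simplex = [tuple(0 for _ in range(d))] + \
          [tuple(1 if i == j else 0 for i in range(d)) for j in range(d)]
# len(simplex) == d + 1, the minimum vertex count for a d-polytope.
```

Any 0/1-polytope in dimension d arises as the convex hull of some subset of `cube_vertices`; the simplex is just the smallest full-dimensional example.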
https://en.wikipedia.org/wiki/Cole%E2%80%93Hopf%20transformation
The Cole–Hopf transformation is a method of solving parabolic partial differential equations (PDEs) with a quadratic nonlinearity of the form

$$u_t - a\,\Delta u + b\,\|\nabla u\|^2 = 0,$$

where $a$, $b$ are constants, $\Delta$ is the Laplace operator, $\nabla$ is the gradient, and $\|\cdot\|$ is the Euclidean norm. By assuming that $w = \phi(u)$, where $\phi$ is an unknown smooth function, we may calculate:

$$w_t = \phi'(u)\,u_t, \qquad \Delta w = \phi'(u)\,\Delta u + \phi''(u)\,\|\nabla u\|^2.$$

Which implies that:

$$w_t - a\,\Delta w = \phi'(u)\left(u_t - a\,\Delta u\right) - a\,\phi''(u)\,\|\nabla u\|^2 = -\left(b\,\phi'(u) + a\,\phi''(u)\right)\|\nabla u\|^2 = 0$$

if we constrain $\phi$ to satisfy $a\,\phi'' + b\,\phi' = 0$. Then we may transform the original nonlinear PDE into the canonical heat equation $w_t = a\,\Delta w$ by using the transformation:

$$w = e^{-bu/a}.$$

This is the Cole–Hopf transformation. With the transformation, the following initial-value problem can now be solved:

$$\begin{cases} u_t - a\,\Delta u + b\,\|\nabla u\|^2 = 0 & \text{in } \mathbb{R}^n \times (0,\infty),\\ u = g & \text{on } \mathbb{R}^n \times \{t = 0\}. \end{cases}$$

The unique, bounded solution of this system is:

$$w(x,t) = \frac{1}{(4\pi a t)^{n/2}} \int_{\mathbb{R}^n} e^{-\frac{\|x-y\|^2}{4at}}\, e^{-b\,g(y)/a}\, dy.$$

Since the Cole–Hopf transformation implies that $u = -\frac{a}{b}\ln w$, the solution of the original nonlinear PDE is:

$$u(x,t) = -\frac{a}{b} \ln\left[\frac{1}{(4\pi a t)^{n/2}} \int_{\mathbb{R}^n} e^{-\frac{\|x-y\|^2}{4at}}\, e^{-b\,g(y)/a}\, dy\right].$$

Applications Aerodynamics Stochastic optimal control Solving the viscous Burgers' equation References Partial differential equations Transformation (function) Parabolic partial differential equations
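The transformation can be sanity-checked numerically in one dimension. Assuming the standard form $u_t - a u_{xx} + b (u_x)^2 = 0$ with $w = e^{-bu/a}$ (the convention used, e.g., in Evans' treatment of the topic), take a positive exact solution $w$ of the heat equation, set $u = -(a/b)\ln w$, and verify by finite differences that the residual of the nonlinear PDE vanishes. The constants and test point are arbitrary choices for illustration.

```python
# Numerical sanity check of the Cole-Hopf transformation in 1D:
# if w solves the heat equation w_t = a*w_xx and w > 0, then
# u = -(a/b)*ln(w) should satisfy u_t - a*u_xx + b*(u_x)**2 = 0.
import math

a, b = 0.7, 1.3  # arbitrary illustrative constants

def w(x, t):
    # Exact heat-equation solution, kept positive by the constant offset:
    # w_t = -a*exp(-a*t)*cos(x) and a*w_xx = -a*exp(-a*t)*cos(x).
    return 1.0 + math.exp(-a * t) * math.cos(x)

def u(x, t):
    return -(a / b) * math.log(w(x, t))

def residual(x, t, h=1e-4):
    """Central-difference evaluation of u_t - a*u_xx + b*(u_x)**2."""
    u_t = (u(x, t + h) - u(x, t - h)) / (2 * h)
    u_x = (u(x + h, t) - u(x - h, t)) / (2 * h)
    u_xx = (u(x + h, t) - 2 * u(x, t) + u(x - h, t)) / (h * h)
    return u_t - a * u_xx + b * u_x ** 2

r = residual(0.5, 0.3)  # should be ~0 up to discretization error
```

The residual is zero only for the matched sign convention; flipping the sign of $b$ in either the PDE or the exponent breaks the cancellation, which makes this a useful check when adapting the formula.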
https://en.wikipedia.org/wiki/UML-RSDS
UML-RSDS is a lightweight model-driven engineering (MDE) and model transformation tool supporting the UML 2.5 class diagram notation and the OCL 2.4 Object Constraint Language. It supports code generation in multiple 3GLs: Java, C#, C++, Python, Go, Swift and ANSI C. The toolset has been defined as an Eclipse project, AgileUML, under the modeling category. The toolset originated from EPSRC-funded research at Imperial College and King's College London in the period 1996–2014. It was publicly released in 2010 and defined as an Eclipse project in 2019. It is now supported by AgileMDE Ltd: agilemde.co.uk. One motivation of the tools has been to provide a means for general software practitioners to use MDE in a flexible manner, and to support agile development using MDE. The tool has been applied to financial software development and to many different kinds of transformation problems. Language characteristics The main specification notations in UML-RSDS are UML class diagrams and use cases, together with Object Constraint Language (OCL) expressions, used to define invariants and operation pre- and post-conditions. Either a graphical or textual notation can be used for UML-RSDS specifications. For example, a simple class specification could be written as:

class Person {
  attribute age: int;
  attribute name: String;
  operation birthday()
  pre: true
  post: age = age@pre + 1;
}

Transformations are defined as use cases together with constraints expressing how result data is derived from input data. Thus a declarative specification style similar to the QVT Relations language is supported, but without the need to define additional syntax or language elements; only OCL is used. For example, to copy every Person instance to a PersonRecord with a 'data' attribute formed from the name and age, it is sufficient to write:

usecase copyPersons {
  Person::
    PersonRecord->exists( r | r.data = name + age )
}

Verification procedures have been defined for this sty
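The effect of the copyPersons use case above can be sketched in plain Python: for every Person, a PersonRecord is created whose data field combines the name and age. The class and field names mirror the OCL example; the Python rendering itself is an illustrative assumption, not actual UML-RSDS generated output (which would target one of the supported 3GLs with its own conventions).

```python
# Plain-Python sketch of what the declarative copyPersons constraint
# demands: PersonRecord->exists( r | r.data = name + age ) for each Person.
# Names mirror the OCL example; this is not UML-RSDS output.

class Person:
    def __init__(self, name: str, age: int):
        self.name, self.age = name, age

class PersonRecord:
    def __init__(self, data: str):
        self.data = data

def copy_persons(persons):
    # OCL's name + age concatenates; in Python the int must be converted.
    return [PersonRecord(p.name + str(p.age)) for p in persons]

records = copy_persons([Person("Ada", 36), Person("Alan", 41)])
```

The contrast is the point of the declarative style: the OCL constraint states only the required relationship between source and target models, while the loop-and-construct logic is derived by the tool.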
https://en.wikipedia.org/wiki/Superconcentrated%20electrolytes
Superconcentrated electrolytes, also known as water-in-salt or solvent-in-salt liquids, usually refer to chemical systems that are liquid near room temperature and consist of solvent and dissolved salt in a molar ratio near or smaller than ca. 4-8, i.e. where all solvent molecules are coordinated to cations and no free solvent molecules remain. Since ca. 2010 such liquid electrolytes have found several applications, primarily in batteries. In the case of lithium metal batteries and lithium-ion batteries, the most commonly used anions for superconcentrated electrolytes are those that are large, asymmetric and rotationally-vibrationally flexible, such as bis(trifluoromethanesulfonyl)amide and bis(fluorosulfonyl)amide. Notably, lithium chloride and sodium perchlorate also form water-in-salt solutions. Advantages Superconcentrated electrolytes demonstrate the following advantages: (1) They show good oxidative stability. In particular, some can suppress oxidative corrosion of an Al current collector without a source of fluoride ion (such as hexafluorophosphate) and enable the use of 5 V lithium-ion battery cathode materials. (2) They are resistant to electrochemical reduction. It is believed that some sulfonimides (e.g., those with S-F and F-(H)C-N fragments) form a solid electrolyte interphase similar to that formed by some organic carbonate solvents. Properties #1 and #2 are responsible for a very large (4-5 volt) voltage window, which is useful for advanced batteries. (3) Related to #2 is the ability of superconcentrated electrolytes to allow for reversible intercalation of Li+ ions into graphite in the absence of ethylene carbonate solvent, therefore enabling a new class of safer lithium-ion batteries. (4) Solvent volatility is lower and thermal stability is higher, which contributes to better battery safety. (5) The concentration of the charge-carrying ion is larger, which translates into smaller ion travelling distances. (6) In some cases, and contrary to expectat
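The solvent-to-salt threshold can be checked with simple arithmetic. The sketch below uses the widely reported ~21 mol/kg LiTFSI water-in-salt recipe as an illustrative input (the molality value is an assumption for this example, not taken from the text above):

```python
# Back-of-envelope check of the "solvent-in-salt" criterion, using an
# illustrative molality (~21 mol LiTFSI per kg water, a commonly cited
# water-in-salt recipe; assumed here, not from the article text).

M_WATER = 18.015        # g/mol, molar mass of water
molality = 21.0         # mol salt per kg solvent (assumed illustrative value)

mol_water_per_kg = 1000.0 / M_WATER   # ~55.5 mol water in 1 kg
ratio = mol_water_per_kg / molality   # solvent : salt molar ratio

print(round(ratio, 2))  # ~2.64, well below the ~4-8 threshold
```

At such a ratio there are fewer than three water molecules per salt unit, so essentially every solvent molecule sits in a cation coordination shell.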
https://en.wikipedia.org/wiki/Kempe%27s%20Engineers%20Year-Book
Kempe's Engineers Year Book was for many years a standard reference work of practical engineering information in the United Kingdom, covering a wide range of subjects. History First published in 1894 by H. R. Kempe with W. Hannaford-Smith and then published annually, except during World War II, until 2002, the book was a standard source of reference for civil, mechanical, electrical, marine, mining, and other engineers. See also Machinery's Handbook References External links Kempe's Engineers Year Book, 1949, Vol. 1, Archive.org Kempe's Engineers Year Book, 1949, Vol. 2, Archive.org Kempe's Engineers Year Book, 1969, Vol. 2, Archive.org Kempe's Engineers Year Book, 1989, Vol. 1, Archive.org Kempe's Engineers Year Book, 1985, Archive.org Engineering books Yearbooks Annual publications Publications established in 1894 Publications disestablished in 2002
https://en.wikipedia.org/wiki/Vintage%20computer
A vintage computer is an older computer system that is largely regarded as obsolete. The personal computer has been around since approximately 1971, but in that time, successive technological revolutions have left generations of obsolete computing equipment on the junk heap. Nevertheless, these otherwise useless computers have spawned a sub-culture of vintage computer collectors, who often spend large sums to acquire the rarest of these items, not only to display them but to restore them to their fully functioning glory, including active software development and adaptation to modern uses. This often includes homebrew developers and hackers who add on, update and create hybrid composites from new and old computers for uses for which they were otherwise never intended. Ethernet interfaces have been designed for many vintage 8-bit machines to allow limited connectivity to the Internet, where users can access user groups, bulletin boards, and databases of software. Most of this hobby centers on computers manufactured after 1960, though some collectors specialize in pre-1960 computers as well. The Vintage Computer Festival, an event held by the Vintage Computer Federation for the exhibition and celebration of vintage computers, has been held annually since 1997 and has expanded internationally. By platform MITS Inc. Micro Instrumentation and Telemetry Systems (MITS) produced the Altair 8800 in 1975. According to Harry Garland, the Altair 8800 was the product that catalyzed the microcomputer revolution of the 1970s. IMSAI IMSAI produced a machine similar to the Altair 8800. It was introduced in 1975, first as a kit, and later as an assembled system. The list price was $591 for a kit, and $931 assembled. Processor Technology Processor Technology produced the Sol-20. This was one of the first machines to have a case that included a keyboard, a design feature copied by many later "home computers". SWTPC Southwest Technical Products Corporation (
https://en.wikipedia.org/wiki/Factorization%20algebra
In mathematics and mathematical physics, a factorization algebra is an algebraic structure first introduced by Beilinson and Drinfel'd in an algebro-geometric setting as a reformulation of chiral algebras, and also studied in a more general setting by Costello to study quantum field theory. Definition Prefactorization algebras A factorization algebra is a prefactorization algebra satisfying some properties, similar to a sheaf being a presheaf with extra conditions. If X is a topological space, a prefactorization algebra F of vector spaces on X is an assignment of vector spaces F(U) to open sets U of X, along with the following conditions on the assignment: For each inclusion U ⊆ V, there's a linear map F(U) → F(V). There is a linear map F(U_1) ⊗ ... ⊗ F(U_n) → F(V) for each finite collection of open sets with each U_i ⊆ V and the U_i pairwise disjoint. The maps compose in the obvious way: for collections of opens {U_{i,j}}, {V_i} and an open W satisfying U_{i,1} ∪ ... ∪ U_{i,n_i} ⊆ V_i and V_1 ∪ ... ∪ V_n ⊆ W, the following diagram commutes. So F resembles a precosheaf, except the vector spaces are tensored rather than (direct-)summed. The category of vector spaces can be replaced with any symmetric monoidal category. Factorization algebras To define factorization algebras, it is necessary to define a Weiss cover. For U an open set, a collection of opens {U_i | i ∈ I} is a Weiss cover of U if for any finite collection of points {x_1, ..., x_k} in U, there is an open set U_i such that {x_1, ..., x_k} ⊆ U_i. Then a factorization algebra F of vector spaces on X is a prefactorization algebra of vector spaces on X so that for every open U and every Weiss cover {U_i | i ∈ I} of U, the sequence ⊕_{i,j} F(U_i ∩ U_j) → ⊕_k F(U_k) → F(U) → 0 is exact. That is, F is a factorization algebra if it is a cosheaf with respect to the Weiss topology. A factorization algebra is multiplicative if, in addition, for each pair of disjoint opens U, V ⊆ X, the structure map F(U) ⊗ F(V) → F(U ⊔ V) is an isomorphism. Algebro-geometric formulation While this formulation is related to the one given above, the relation is not immediate. Let X be a smooth complex curve. A factorization algebra on X consists of A quasicoherent sheaf over X^I for any finite s
https://en.wikipedia.org/wiki/Hyperproperty
In computer science, hyperproperties are a formalism for describing properties of computational systems. Hyperproperties generalize safety and liveness properties, and can express properties such as non-interference and observational determinism. Elaborating on the example of non-interference: non-interference can't be represented as a "property" in the formal sense because there's no inclusion test that could be applied to a single program trace; non-interference is an assertion about how neighboring traces are similar to each other, and it does no good to look at one trace at a time. Hyperproperties are the extension from properties as predicates on traces to properties as predicates on sets of traces. Definitions Traces and systems Hyperproperties are defined in terms of traces of a computational system. A trace is a sequence of states; a system is a set of traces. Intuitively, a program corresponds to the set of all of its possible execution traces, given any inputs. Formally, the set of traces over a set of states Σ is Σ^ω. This representation is expressive enough to encompass several computational models, including labeled transition systems and state machines. Hyperproperties A trace property is a set of traces. Safety and liveness properties are trace properties. Formally, a trace property is an element of P(Σ^ω), where P is the powerset operator. A hyperproperty is a set of trace properties, that is, an element of P(P(Σ^ω)). Trace properties may be divided into safety properties (intuitively, properties that ensure "bad things don't happen") and liveness properties ("good things do happen"), and every trace property is the intersection of a safety property and a liveness property. Analogously, hyperproperties may be divided into hypersafety and hyperliveness hyperproperties, and every hyperproperty is an intersection of a safety hyperproperty and a liveness hyperproperty. k-safety properties are safety hyperproperties such that every violation of the property can be witne
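The distinction between trace properties and hyperproperties can be made concrete with a toy model; the encoding and names below are this sketch's own, not part of the formalism above. A trace property is decidable one trace at a time, while a hyperproperty (here, a crude form of observational determinism) must compare traces:

```python
from collections import defaultdict

# Toy model: a trace is a tuple of states; a system is a set of traces.
# A trace property is a predicate on one trace; a hyperproperty is a
# predicate on a whole set of traces.  (Encoding invented for this sketch;
# real traces are infinite sequences over a state set.)

def trace_property_terminates_ok(trace):
    # checkable on a single trace: it ends in the state "ok"
    return trace[-1] == "ok"

def hyperproperty_deterministic_output(system):
    # needs the whole system: traces agreeing on their first state
    # (the "input") must agree on their last state (the "output")
    outputs = defaultdict(set)
    for trace in system:
        outputs[trace[0]].add(trace[-1])
    return all(len(outs) == 1 for outs in outputs.values())

system_a = {("lo", "s1", "ok"), ("hi", "s2", "ok")}
system_b = {("lo", "s1", "ok"), ("lo", "s2", "err")}  # same input, two outputs

print(hyperproperty_deterministic_output(system_a))  # True
print(hyperproperty_deterministic_output(system_b))  # False
```

No per-trace test could separate system_b's two traces from system_a's, which is exactly why such properties need the hyperproperty level.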
https://en.wikipedia.org/wiki/Geometric%20drawing
Geometric drawing consists of a set of processes for constructing geometric shapes and solving problems with the use of an ungraduated ruler and a compass (drawing tool). Today, such studies can be done with the aid of software that simulates the strokes performed by these instruments. For ancient mathematicians, geometry could not do without the methods of geometric construction, which were necessary for understanding, theoretical enrichment, and problem-solving. The accuracy and precision required of geometric drawing make it an important ally in the application of geometric concepts in significant areas of human knowledge, such as architecture, engineering, and industrial design, among others. The process of geometric drawing is based on constructions with ruler and compass, which in turn are based on the first three postulates of Euclid's Elements. The historical importance of the ruler and compass as instruments for solving geometric problems leads many authors to limit geometric drawing to the representation and solution of geometric figures in the plane. With the development of computer-aided design (CAD) programs, geometric drawing has become more important in teaching-learning processes (the development of spatial faculties) than the more imprecise tracing offered by rulers and compasses, when taking into account the precision of computer systems. See also Graphic design Technical drawing References General references External links Euclid's Elements Download Ruler and Compass 1.8 (software for building geometric figures with ruler and compass) Drawing Geometry
https://en.wikipedia.org/wiki/Henry%20Underhill
Henry Michael John Underhill (1855–1920) was an amateur scientist, artist, photographer and grocer from Oxford, England. Underhill is best known for his hand-painted and photographic lantern slides which illustrate a variety of subjects including entomology, natural history, prehistoric British archaeology and folk tales. Underhill was a founding member of the Oxfordshire Natural History Society (now the Ashmolean Natural History Society of Oxfordshire). Personal life and education Underhill was born in Oxford in 1855. He attended Christ Church Cathedral School and was a private pupil of the artist William Riviere. At one time, he lived at 20 Bardwell Road. After his father's death in 1896, Underhill took over as proprietor of the family's provisions store at 7 High Street, Oxford. He ran the shop until his own death in 1920. For much of his life, Underhill participated in charitable endeavours for the George Street Congregational Sunday School and the Oxford branch of the Band of Hope (now Hope UK), organising outings and providing entertainment for the city’s poorer children. One of Underhill's earliest magic lantern shows was given to the pupils of the Oxford Ragged School, which his grandfather, father and uncle had helped to establish. Underhill died on 2 October 1920 aged 65 after a long bout of cancer. He is buried in Wolvercote Cemetery. Oxfordshire Natural History Society and lantern slide lectures Underhill was active in Oxford's intellectual community and helped to found the Oxfordshire Natural History Society. He served as the society's secretary before becoming its president in 1893. Throughout the 1880s and 1890s, Underhill gave eleven lectures to the society on a variety of topics. All of his lectures were illustrated by his hand-painted and photographic lantern slides. His lectures for the society included: Spiders (1887) Insect Eyes (1888) Microscopic Organisms from Ponds (1889) Artistic Japan (1890) Painting Lantern Slides (1891) A Hol
https://en.wikipedia.org/wiki/Baik%E2%80%93Deift%E2%80%93Johansson%20theorem
The Baik–Deift–Johansson theorem is a result from probabilistic combinatorics. It deals with the subsequences of a permutation drawn uniformly at random from the set S_n of permutations of {1, ..., n}. The theorem makes a statement about the limiting distribution of the length of the longest increasing subsequence. The theorem has been influential in probability theory since it connected KPZ universality with the theory of random matrices. It was proven in 1999 by Jinho Baik, Percy Deift and Kurt Johansson. Statement For each n ∈ ℕ let σ_n be a uniformly chosen permutation of length n. Let L_n be the length of the longest increasing subsequence of σ_n. Then for every x ∈ ℝ we have P( (L_n − 2√n) / n^{1/6} ≤ x ) → F_2(x) as n → ∞, where F_2 is the Tracy–Widom distribution of the Gaussian unitary ensemble. Literature References Combinatorics Probability
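The rescaling in the statement can be illustrated numerically. The sketch below computes the longest-increasing-subsequence length by patience sorting and evaluates the quantity (L_n − 2√n)/n^(1/6) for one random permutation; variable names are this example's own:

```python
import bisect
import math
import random

# Patience-sorting computation of the longest increasing subsequence (LIS)
# length, plus a single-sample illustration of the rescaled quantity
# (L_n - 2*sqrt(n)) / n**(1/6) appearing in the theorem.

def lis_length(perm):
    """Length of the longest (strictly) increasing subsequence, O(n log n)."""
    piles = []
    for x in perm:
        i = bisect.bisect_left(piles, x)
        if i == len(piles):
            piles.append(x)   # start a new pile
        else:
            piles[i] = x      # place on the leftmost admissible pile
    return len(piles)

random.seed(0)
n = 10_000
perm = random.sample(range(n), n)          # uniform random permutation
l_n = lis_length(perm)
rescaled = (l_n - 2 * math.sqrt(n)) / n ** (1 / 6)
print(l_n, round(rescaled, 3))             # L_n lies close to 2*sqrt(n) = 200
```

Averaging the rescaled value over many samples would approximate the mean of the Tracy–Widom F_2 distribution.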
https://en.wikipedia.org/wiki/Form-fit%20connection
A form-fit, form-locking or form-closed connection is a type of mechanical connection between two parts (example: screw and screwdriver), wherein the parts interlock by virtue of their shapes and block each other along at least one defined linear or rotational direction. Form-fit connections are created by the interlocking of the connecting components. For example, a lid cannot slip sideways off a pot because the two interlock at the rim. On the other hand, a round lid can be rotated while sitting on the pot, because there is no form-fit against the rotation. Towards the bottom, the lid has a stop against the pot. This is a "half form-fit" because the lid can still be removed upwards. A form-fit acts via the geometric contact of two effective surfaces, and the forces are transmitted as normal forces on the effective surfaces of a driver part (surface pressure and Hertzian contact stress). Typically, some manufacturing-related tolerance occurs in the connection during form-fit. In a form-fit connection, one connection partner blocks the movement of the other. Such "blocking" occurs in at least one direction. If a second pair of surfaces is arranged opposite, the opposite direction is also blocked. Examples Tongue and groove Zipper References Mechanical engineering Metalworking terminology
https://en.wikipedia.org/wiki/Method%20of%20Chester%E2%80%93Friedman%E2%80%93Ursell
In asymptotic analysis, the method of Chester–Friedman–Ursell is a technique to find asymptotic expansions for contour integrals. It was developed as an extension of the method of steepest descent for obtaining uniform asymptotic expansions in the case of coalescing saddle points. The method was published in 1957 by Clive R. Chester, Bernard Friedman and Fritz Ursell. Method Setting We study integrals of the form I(λ) = ∫_C g(t) exp(λ f(t, α)) dt, where C is a contour and f, g are two functions analytic in the complex variable t and continuous in the parameter α; λ is a large number. Suppose we have two saddle points t_+, t_− of f with multiplicity 1 that depend on the parameter α. If now an α_0 exists such that both saddle points coalesce into a new saddle point t_0 with multiplicity 2, then the steepest descent method no longer gives uniform asymptotic expansions. Procedure Suppose there are two simple saddle points t_+ = t_+(α) and t_− = t_−(α) of f and suppose that they coalesce in the point t_0 = t(α_0). We start with the cubic transformation of t, this means we introduce a new complex variable w and write f(t, α) = (1/3)w³ − ζ(α)w + A(α), where the coefficients ζ(α) and A(α) will be determined later. We have (∂f/∂t)(dt/dw) = w² − ζ(α), so the cubic transformation will be analytic and injective only if dt/dw is neither 0 nor ∞. Therefore t_+ and t_− must correspond to the zeros of w² − ζ(α), i.e. w_+ = ζ(α)^{1/2} and w_− = −ζ(α)^{1/2}, with w_± = w(t_±). This gives the following system of equations we have to solve to determine ζ and A: f(t_+, α) = −(2/3)ζ(α)^{3/2} + A(α), f(t_−, α) = (2/3)ζ(α)^{3/2} + A(α). A theorem by Chester–Friedman–Ursell (see below) says now that the cubic transform is analytic and injective in a local neighbourhood around the critical point t_0. After the transformation the integral becomes I(λ) = ∫_{C'} h(w) exp(λ((1/3)w³ − ζ(α)w + A(α))) dw, where C' is the new contour for w and h(w) = g(t(w)) dt/dw. The function h is analytic at w_± for α ≠ α_0 and also at the coalescing point w_0 for α = α_0. Here ends the method and one can see the integral representation of the complex Airy function. Chester–Friedman–Ursell note to write h not as a single power series but instead as h(w) = Σ_{n≥0} ( p_n(ζ)(w² − ζ)^n + q_n(ζ) w (w² − ζ)^n ) to really get asymptotic expansions. Theorem by Chester–Friedman–Ursell Let f, t_± and α_0 be as above. The cubic transformation with the above derived values for ζ(α) and A(α), s
https://en.wikipedia.org/wiki/Shadow%20effect
The shadow effect is a phenomenon seen in genetic studies that use noninvasive genetic data collection methods. It occurs when there are not enough loci and/or the loci have low allelic variance within the population. As a result, researchers can capture two separate individuals and mistakenly label them as the same individual. This can create a negative bias in the data and portray a population as smaller and less genetically diverse than it is. This is most commonly seen in collection methods that rely on environmental DNA (eDNA), which is collected directly from the environment (such as feces or hair removed from the ground). The accuracy of noninvasive collection data can be increased by increasing the number of loci examined during the study. Background There are several types of rarefaction methods that can be used to estimate the size of hard-to-monitor species. The study of population size and density falls under demography, the study of populations of any kind of organism. Mark and recapture is a common form of data collection involving species with large populations. Being able to capture and mark a species in a noninvasive way allows for accurate readings of the population's size, both total and effective, over several rounds of recapture. However, for species that are difficult to capture or view directly, such as endangered species, it can be nearly impossible to use the mark-recapture method to obtain genetic samples. Another method for population size estimation is real-time polymerase chain reaction (qPCR). qPCR is a molecular approach that measures the amplification of DNA over time rather than just at the end of the reaction. This method is useful because it can rely on eDNA to give an estimate of how abundant a species is in a given habitat. Noninvasive forms of data collection can be achieved through the collection of fur, feces or other fragments of DNA-rich material left behind (eDNA). Once considered costly, modern advancements hav
https://en.wikipedia.org/wiki/Woggabaliri
Woggabaliri is described by the Australian Sports Commission (ASC) as a traditional Indigenous Australian "co-operative kicking volley game". Described as a kicking game similar to soccer played in a group of four to six players in a circle, the game has been encouraged in schools in New South Wales and Queensland. Origin Ken Edwards research In 1999 Australian author Ken Edwards, then Associate Professor in Sport, Health and Physical Education at the Queensland University of Technology, published a book Choopadoo: Games from the Dreamtime, in which he makes mention of a game played by the Wiradjuri children near the Bogan River and Lachlan River. Historian David Thompson while investigating Aboriginal games, alleges that Edwards simply coined the term using an existing Aboriginal word and attributed it to various observations across outback Victoria and New South Wales. Ken Edwards and Troy Meston stated that the word Woggabaliri comes from the Wiradjuri word for "play". However according to the official Wiradjuri dictionary (as researched by Dr Stan Grant and Dr John Rudder) the word for play is wagigi. Robert Hamilton Mathews, studying Aboriginal Australian languages, listed the word woggabaliri in 1901 as the Ngunnawal word for "play". The Australian Sports Commission (ASC) in 2000 cited permission to "use and adapt" Edwards' Choopadoo book to publish a derivative titled Indigenous Traditional Games, listing it as one of 19 games complete with lists of rules. The ASC's John Evans copied the descriptions of the games verbatim from Edwards' book, though further modified Woggabaliri with additional rules to make it suitable for contemporary children to play. Indigenous Traditional Games has subsequently been cited as a source for Woggabaliri by others, such as English-Australian fantasy author Malcom Walker. Funding and grants based on Woggabaliri In 2002 the ASC also funded the Laureus Sport for Good Foundation and Aboriginal and Torres Strait Islander Comm
https://en.wikipedia.org/wiki/Degree-Rips%20bifiltration
The degree-Rips bifiltration is a simplicial filtration used in topological data analysis for analyzing the shape of point cloud data. It is a multiparameter extension of the Vietoris–Rips filtration that possesses greater stability to data outliers than single-parameter filtrations, and which is more amenable to practical computation than other multiparameter constructions. Introduced in 2015 by Lesnick and Wright, the degree-Rips bifiltration is a parameter-free and density-sensitive vehicle for performing persistent homology computations on point cloud data. Definition It is standard practice in topological data analysis (TDA) to associate a sequence of nested simplicial complexes to a finite data set in order to detect the persistence of topological features over a range of scale parameters. One way to do this is by considering the sequence of Vietoris–Rips complexes of a finite set in a metric space indexed over all scale parameters. If X is a finite set in a metric space, then this construction is known as the Vietoris–Rips (or simply "Rips") filtration on X, commonly denoted Rips(X) or VR(X). The Rips filtration can be expressed as a functor from the real numbers (viewed as a poset category) to the category of simplicial complexes and simplicial maps, a subcategory of the category of topological spaces and continuous maps via the geometric realization functor. The Rips filtration is indexed over a single parameter, but we can capture more information (e.g., density) about the underlying data set by considering multiparameter filtrations. A filtration indexed by the product of two totally-ordered sets is known as a bifiltration, first introduced by Gunnar Carlsson and Afra Zomorodian in 2009. The degree-Rips bifiltration filters each simplicial complex in the Rips filtration by the degree of each vertex in the graph isomorphic to the 1-skeleton at each index. More formally, let (a, b) be an element of ℝ^op × ℝ and define G_{a,b} to be the subgraph of the 1-skeleton of Rips_b(X) containing all ver
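The vertex-filtering step can be sketched at the level of the 1-skeleton; the function names and indexing below are this example's own simplification, not the notation of Lesnick and Wright. At scale r, points within distance r are joined by an edge; at degree threshold k, only vertices of degree at least k are kept:

```python
import math

# Minimal sketch of the degree-Rips idea on the 1-skeleton (names and
# indexing are this example's own).  At scale r, join points within
# distance r; at degree threshold k, keep only vertices whose degree in
# that graph is at least k.  Outliers have low degree and drop out.

def rips_edges(points, r):
    n = len(points)
    return {(i, j) for i in range(n) for j in range(i + 1, n)
            if math.dist(points[i], points[j]) <= r}

def degree_rips_vertices(points, r, k):
    degree = [0] * len(points)
    for i, j in rips_edges(points, r):
        degree[i] += 1
        degree[j] += 1
    return {v for v in range(len(points)) if degree[v] >= k}

# A dense cluster of four points plus one far-away outlier:
pts = [(0, 0), (0, 1), (1, 0), (1, 1), (10, 10)]
print(degree_rips_vertices(pts, r=1.5, k=2))  # the outlier (index 4) is excluded
```

Raising r or lowering k only adds vertices, mirroring the bifiltration's monotonicity in its two parameters.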
https://en.wikipedia.org/wiki/Linked-read%20sequencing
Linked-read sequencing, a type of DNA sequencing technology, uses a specialized technique that tags DNA molecules with unique barcodes before fragmenting them. In traditional sequencing, DNA is broken into small fragments that are sequenced individually, resulting in short read lengths that make it difficult to accurately reconstruct the original DNA sequence; the unique barcodes of linked-read sequencing instead allow scientists to link together DNA fragments that come from the same DNA molecule. A pivotal benefit of this technology lies in the small quantity of DNA required for a large output of genomic information, effectively combining the advantages of long-read and short-read technologies. History This sequencing method was originally developed by 10x Genomics in 2015, and was launched under the name 'GemCode' or 'Chromium'. GemCode employed a method of gel bead-based barcoding to amalgamate short DNA fragments. The longer fragments produced by this method could then be sequenced using validated technology such as Illumina next-generation sequencing. An updated version of linked-read sequencing was introduced by the same company in 2018, termed 'Linked-Reads V2'. While GemCode uses a single barcode for tagging both the gel bead and the DNA fragment, Linked-Reads V2 uses separate barcodes for improved detection of genetic variants. The group that developed the linked-read sequencing technology published their first paper on it in 2016. The authors of this paper initially developed the technology to sequence the genomes of both healthy individuals and cancer patients to determine somatic mutations, copy number variations, and structural variations in cancer genomes. Later that year, another research group combined linked-read sequencing with long-read sequencing technology to assemble the human genome. Both studies demonstrated the utility of linked-read sequencing in comprehensive genome analysis and in und
https://en.wikipedia.org/wiki/Metaphocyte
Metaphocytes are myeloid-like cells considered among tissue-resident macrophages (TRMs) and are present in the skin, gill, and intestine of the zebrafish (Danio rerio). Originating from the ectoderm during development, metaphocytes share many similarities, in terms of cellular morphology and gene expression profile with macrophages (which are of mesodermal origin) in particular the Langerhans cells in the skin. Function Similar to many immune cells, metaphocytes are highly motile cells found in mucosal tissues such as skin, gills, and intestines. Interestingly, by contrast to conventional macrophages, metaphocytes do not migrate or respond to wound-induced inflammation, and they lack phagocytosis ability. The main function of the metaphocytes is to uptake soluble antigens from the external environment and to transfer these antigens to Langerhans cells (TRM of the skin), most probably to regulate the immune response. References Cell biology
https://en.wikipedia.org/wiki/JOYclub
JOYclub is an online dating service and sex-positive community for sexual contacts. The site provides events, dating, communication, content sharing, and forums for people of different genders and sexual orientations. There is also an online magazine and a section with pornographic films. History JOYclub was founded in 1999 as an internet forum. Since 2013, JOYclub has been a main product of F&P GmbH. In 2013, JOYclub received a Venus Award as Best Erotic Community. In 2015, the development of JOYCE, the JOYclub application, began; it was launched in 2017. In 2019, the site had three million members, and four million in 2021. In 2020, the site launched in Spain. In 2020, during the COVID-19 restrictions, JOYclub offered Italians the premium subscription for free to connect more actively online and share private photos, videos, etc. In 2021, JOYclub was launched in France. In 2022, JOYclub started in Mexico as well as in the UK. JOYclub conducts various surveys on sexuality among its users. According to the book Online-Dating für Dummies, JOYclub has a strict authentication and identity verification procedure. Security and privacy In December 2021, JOYclub received a certificate for tested data protection from TÜV Saarland. JOYclub offers the possibility to send messages via its messenger without revealing the user's phone number, so that communication can take place entirely on the platform with no need to switch to other messengers. References Online dating services Internet forums
https://en.wikipedia.org/wiki/Zarya%20of%20the%20Dawn
Zarya of the Dawn is a short comic book written by Kris Kashtanova and illustrated using Midjourney. It is illustrated entirely using artificial intelligence, which resulted in a copyright dispute. Plot Zarya awakens in an abandoned New York City with no memories. A postcard from a person named Rusty falls out of their pocket, allowing them to remember their name and home address. After returning home and getting new clothes, Zarya meets Raya, their "inter-world assistant", who tells Zarya that a mental health crisis in 2023 led to the almost complete destruction of life on earth. Raya then takes Zarya to Zatura World, the world of acceptance. There, Zarya meets a mysterious woman and learns to accept their feelings. When they return to Central Park, it is covered in greenhouses. Zarya remarks that "acceptance is the first step of letting go". Copyright dispute In September 2022, Kashtanova applied for the comic's copyright protection with the United States Copyright Office, but they did not disclose that the illustrations were created using Midjourney, an artificial intelligence image generator. The comic was granted copyright protection, but the Copyright Office initiated a proceeding to revoke the protection of the artwork after discovering the fact. The artwork's copyright protection was revoked in February 2023, and the Copyright Office explained that only human-created works can receive protection. Although the images themselves are not protected by copyright, the arrangement of the images and the text and story of the book are, as they are the creative work of Kashtanova and not the artificial intelligence. See also Artificial intelligence and copyright Artificial intelligence art Alice and Sparkle References External links Zarya of the Dawn at AIcomicbooks 2022 comics debuts Artificial intelligence art Text-to-image generation Copyright law LGBT-related comics Comics set in New York City Public domain comics Comics controversies 2020s webcomic debuts
https://en.wikipedia.org/wiki/Oriane%20Lassus
Oriane Lassus (born in 1987) is a French author, cartoonist, and illustrator. Biography Since 2009, Lassus has contributed to the Spongiculture blog, in which she recounts her daily life against a backdrop of acerbic humour. The project won the "Blog Revelation Prize" two years later, at the Angoulême International Comics Festival. After a master's degree in illustration at the Académie Royale des Beaux-Arts in Brussels, Lassus published her first comic book, (Vraoum, 2012). The book explores the individual micro-events that affect the family unit. Lassus regularly participates in the artist residencies organized in Arc-et-Senans, which bring together each year a selection of authors among the most innovative of the alternative literary scene. Since 2014, Lassus has collaborated with the children's magazine, , in which she publishes the story , honored in the youth selection of the Angoulême International Comics Festival 2018 and the subject of an exhibition in the Pavillon Jeunes Talents (young talents pavilion). In 2016, the comic was published by Arbitraire. In it, Lassus highlights the situation of women who choose not to have children. Awards and honors 2011: Blog Revelation Prize for Spongiculture, Angoulême International Comics Festival 2020: (EESI) prize Exhibitions "Le Meilleurissime Repaire de la Terre", Festival d'Angoulême 2018, Pavillon Jeunes Talents, January 2018 "Oriane Lassus : Lauréate du Prix de l'ÉESI 2020, Exposition personnelle", Éesi Angoulême, 30 January to 15 February 2020 Selected works Ça va derrière?, Vraoum, 2012 Immobilerie Pointure, Super Structure, 2013 Quoi de plus normal qu'infliger la vie?, Arbitraire, 2016 Première fraîcheur, Arbitraire, 2017 Le Meilleurissime Repaire de la Terre, Biscoto, 2017 Les Gardiennes du grenier, Biscoto, 2020 References 1987 births Living people 21st-century French writers French illustrators 21st-century French women writers French women illustrators French wo
https://en.wikipedia.org/wiki/Gervonta%20Davis%20vs.%20Ryan%20Garcia
Gervonta Davis vs. Ryan Garcia, billed as It Doesn't Get Any Better Than This, was a professional boxing match contested between WBA (Regular) lightweight champion Gervonta Davis and former WBC interim lightweight champion Ryan Garcia. The non-title bout took place at a catchweight of 136 lbs, with a 10 lb rehydration clause, on April 22, 2023, at T-Mobile Arena in Paradise, Nevada. Davis knocked out Garcia in the seventh round with a body shot to the right side of his abdomen; after staggering backwards, Garcia took a knee, holding the right side of his body. The fight was praised by commentators and was financially successful, reportedly selling over 1.2 million PPV buys on Showtime at $84.99 each. It also generated approximately $22.8 million from ticket sales, with additional money earned from sponsorships and advertising. Fight card Broadcasting 1Also available in New Zealand 2Davis vs. Garcia on DAZN coverage only Available in Belgium, Bulgaria, Denmark, Estonia, Finland, Iceland, Ireland, Latvia, Lithuania, Netherlands, Norway, Poland, Portugal, Sweden, and the United Kingdom References 2023 in boxing Boxing matches April 2023 sports events in the United States Boxing in Nevada Simulcasts Boxing on Showtime DAZN
https://en.wikipedia.org/wiki/Alice%20and%20Sparkle
Alice and Sparkle is a 2022 children's book published by Ammaar Reshi. Reshi created the book using artificial intelligence in one weekend, which sparked controversy among artists. Plot A girl named Alice discovers artificial intelligence. She knows that artificial intelligence is powerful, and that it has the power to do good and evil depending on how it is used. One day, she creates her own artificial intelligence and names it Sparkle. Sparkle helps Alice with her homework and plays with her, and they quickly become good friends. However, Sparkle soon grows more powerful and begins to make its own decisions, which makes Alice both proud and scared. She knows that it is her responsibility to guide Sparkle to do good, not evil. Together, Alice and Sparkle use their knowledge to make the world a better place and to teach people about the power of artificial intelligence. The two live happily ever after. Creation Ammaar Reshi was inspired to write a children's book when reading to his friend's daughter, but had no experience with creative writing or illustration. To circumvent this, he used the chatbot ChatGPT to write the story for him and used the image generation software Midjourney to illustrate it. On December 4, 2022, 72 hours after having the idea for the book, he published it on Amazon's digital bookstore, and published a paperback version the following day. Within ten days it had sold around 70 copies. Controversy On December 9, 2022, Reshi made a thread on Twitter about his experience publishing the book, which soon went viral. Reshi received heavy backlash from artists with concerns over the ethics of art generated by artificial intelligence. He also received death threats and messages encouraging self-harm because of his publication. Many writers and illustrators criticized both the creation process and the product itself, claiming that if artificial intelligence programs such as Midjourney are trained on existing illustrations, then the original artist
https://en.wikipedia.org/wiki/Offset%20filtration
The offset filtration (also called the "union-of-balls" or "union-of-disks" filtration) is a growing sequence of metric balls used to detect the size and scale of topological features of a data set. The offset filtration commonly arises in persistent homology and the field of topological data analysis. Utilizing a union of balls to approximate the shape of geometric objects was first suggested by Frosini in 1992 in the context of submanifolds of Euclidean space. The construction was independently explored by Robins in 1998, and expanded to considering the collection of offsets indexed over a series of increasing scale parameters (i.e., a growing sequence of balls), in order to observe the stability of topological features with respect to attractors. Homological persistence as introduced in these papers by Frosini and Robins was subsequently formalized by Edelsbrunner et al. in their seminal 2002 paper Topological Persistence and Simplification. Since then, the offset filtration has become a primary example in the study of computational topology and data analysis. Definition Let X be a finite set in a metric space (M, d), and for any x ∈ X and r ≥ 0 let B_r(x) = {y ∈ M : d(x, y) ≤ r} be the closed ball of radius r centered at x. Then the union O_r(X) = ⋃_{x ∈ X} B_r(x) is known as the offset of X with respect to the parameter r (or simply the r-offset of X). By considering the collection of offsets over all r ≥ 0 we get a family of spaces O(X) = {O_r(X)}_{r ≥ 0} where O_r(X) ⊆ O_s(X) whenever r ≤ s. So O(X) is a family of nested topological spaces indexed over the non-negative reals, which defines a filtration known as the offset filtration on X. Note that it is also possible to view the offset filtration as a functor from the poset category of non-negative real numbers to the category of topological spaces and continuous maps. There are some advantages to the categorical viewpoint, as explored by Bubenik and others. Properties A standard application of the nerve theorem shows that the union of balls has the same homotopy type as its nerve, since closed balls are convex and the intersection of convex sets is convex
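The degree-0 part of the offset filtration (the evolution of connected components) can be computed without constructing the balls at all: two closed balls of radius r intersect exactly when their centers are within distance 2r, so a union-find pass over the pairwise distances suffices. A minimal sketch, assuming points in the Euclidean plane (the function name and sample data are illustrative):

```python
from itertools import combinations
from math import dist

def offset_components(points, r):
    """Number of connected components of the r-offset (the union of
    closed balls of radius r around the points).  Two balls intersect
    exactly when their centers are within 2r of each other."""
    parent = list(range(len(points)))

    def find(i):
        # Union-find with path halving.
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    for (i, p), (j, q) in combinations(enumerate(points), 2):
        if dist(p, q) <= 2 * r:
            parent[find(i)] = find(j)

    return len({find(i) for i in range(len(points))})

pts = [(0.0, 0.0), (1.0, 0.0), (5.0, 0.0)]
# Growing r merges components, mirroring the nested offset filtration.
print([offset_components(pts, r) for r in (0.1, 0.5, 2.0)])  # [3, 2, 1]
```

Tracking the radii at which components merge in this way yields exactly the degree-0 persistence barcode of the offset filtration.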
https://en.wikipedia.org/wiki/Multicover%20bifiltration
The multicover bifiltration is a two-parameter sequence of nested topological spaces derived from the covering of a finite set in a metric space by growing metric balls. It is a multidimensional extension of the offset filtration that captures density information about the underlying data set by filtering the points of the offsets at each index according to how many balls cover each point. The multicover bifiltration has been an object of study within multidimensional persistent homology and topological data analysis. Definition Following the notation of Corbet et al. (2022), given a finite set A in a metric space (M, d), the multicover bifiltration on A is a two-parameter filtration indexed by ℝ≥0 × ℕ defined index-wise as Cov(r, k) := {x ∈ M : d(x, a) ≤ r for at least k points a ∈ A}, where ℕ denotes the non-negative integers. Note that when k = 1 is fixed we recover the offset filtration. Properties The multicover bifiltration admits a polynomially-sized simplicial model that is topologically equivalent, called the "rhomboid bifiltration." The rhomboid bifiltration is an extension of the rhomboid tiling introduced by Edelsbrunner and Osang in 2021 for computing the persistent homology of the multicover bifiltration along one axis of the indexing set. The rhomboid bifiltration on a set of points in Euclidean space can be computed in polynomial time. The multicover bifiltration is also topologically equivalent to a multicover nerve construction due to Sheehy called the subdivision-Čech bifiltration, which considers the barycentric subdivision on the nerve of the offsets. In particular, the subdivision-Čech and multicover bifiltrations are weakly equivalent, and hence have isomorphic homology modules in all dimensions. However, the subdivision-Čech bifiltration has an exponential number of simplices in the size of the data set, and hence is not amenable to efficient direct computations. References Computer science Topo
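Membership in a given index of the bifiltration can be checked directly from the definition: a point belongs to the space at parameters (r, k) when at least k of the balls of radius r cover it. A minimal sketch, assuming planar points (the function name and data are illustrative):

```python
from math import dist

def in_multicover(x, centers, r, k):
    """True if x lies in Cov(r, k): covered by at least k closed
    balls of radius r centered at points of the data set."""
    return sum(dist(x, c) <= r for c in centers) >= k

A = [(0.0, 0.0), (1.0, 0.0), (0.5, 0.5)]
# k = 1 recovers the ordinary offset; larger k demands denser coverage,
# which is how the bifiltration encodes density information.
print(in_multicover((0.2, 0.0), A, r=0.6, k=2))  # True
print(in_multicover((0.2, 0.0), A, r=0.6, k=3))  # False
```

Note the two monotonicities that make this a bifiltration: the spaces grow as r increases and shrink as k increases.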
https://en.wikipedia.org/wiki/Path%20explosion
In computer science, path explosion is a fundamental problem that limits the scalability and/or completeness of certain kinds of program analyses, including fuzzing, symbolic execution, and path-sensitive static analysis. Path explosion refers to the fact that the number of control-flow paths in a program grows exponentially ("explodes") with an increase in program size and can even be infinite in the case of programs with unbounded loop iterations. Therefore, any program analysis that attempts to explore control-flow paths through a program will either have exponential runtime in the length of the program (or potentially even failure to terminate on certain inputs), or will have to choose to analyze only a subset of all possible paths. When an analysis only explores a subset of all paths, the decision of which paths to analyze is often made heuristically. References Program analysis
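The growth is easy to make concrete: a function containing n independent two-way branches has 2^n control-flow paths, so even a modest program defeats exhaustive path enumeration. A small illustrative sketch:

```python
from itertools import product

def count_paths(n_branches):
    """Enumerate every control-flow path through a straight-line
    function containing n independent two-way branches.  Each path
    is a tuple of taken/not-taken decisions, so the count is 2**n."""
    return sum(1 for _ in product((True, False), repeat=n_branches))

for n in (1, 10, 20):
    print(n, count_paths(n))
# 1 branch -> 2 paths, 10 -> 1024, 20 -> 1048576: an analysis that
# visits every path does exponential work in the number of branches.
```

Loops make this worse still: a loop whose trip count depends on input contributes one path per possible iteration count, which is why unbounded loops can make the path set infinite.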
https://en.wikipedia.org/wiki/Persistence%20module
A persistence module is a mathematical structure in persistent homology and topological data analysis that formally captures the persistence of topological features of an object across a range of scale parameters. A persistence module often consists of a collection of homology groups (or vector spaces if using field coefficients) corresponding to a filtration of topological spaces, and a collection of linear maps induced by the inclusions of the filtration. The concept of a persistence module was first introduced in 2005 as an application of graded modules over polynomial rings, thus importing well-developed algebraic ideas from classical commutative algebra theory to the setting of persistent homology. Since then, persistence modules have been one of the primary algebraic structures studied in the field of applied topology. Definition Single Parameter Persistence Modules Let T be a totally ordered set and let F be a field. The set T is sometimes called the indexing set. Then a single-parameter persistence module M is a functor M : T → Vect_F from the poset category of T to the category of vector spaces over F and linear maps. A single-parameter persistence module indexed by a discrete poset such as the integers can be represented intuitively as a diagram of spaces: ⋯ → M_{-1} → M_0 → M_1 → M_2 → ⋯. To emphasize the indexing set being used, a persistence module indexed by T is sometimes called a T-persistence module, or simply a T-module. Common choices of indexing sets include ℝ, ℝ≥0, ℤ, ℕ, etc. One can alternatively use a set-theoretic definition of a persistence module that is equivalent to the categorical viewpoint: A persistence module is a pair (M, φ) where M = {M_t}_{t ∈ T} is a collection of F-vector spaces and φ = {φ_{s,t}}_{s ≤ t} is a collection of linear maps where φ_{s,t} : M_s → M_t for each s ≤ t, such that φ_{t,u} ∘ φ_{s,t} = φ_{s,u} for any s ≤ t ≤ u (i.e., all the maps commute). Multiparameter Persistence Modules Let P be a product of totally ordered sets, i.e., P = T_1 × ⋯ × T_n for some totally ordered sets T_1, …, T_n. Then by endowing P with the product partial order given by (s_1, …, s_n) ≤ (t_1, …, t_n) if and only if s_i ≤ t_i for all i, we can define a multiparameter persiste
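The commuting condition φ_{t,u} ∘ φ_{s,t} = φ_{s,u} from the set-theoretic definition can be checked mechanically once the maps are written as matrices over the field. A toy sketch (the module, its dimensions, and the maps are invented for illustration):

```python
def matmul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

# A toy persistence module indexed by {0, 1, 2}: vector spaces of
# dimensions 2, 2 and 1, with internal linear maps between
# consecutive indices (rationals standing in for the field F).
phi_01 = [[1, 0], [0, 0]]  # phi_{0,1}: kills the second generator
phi_12 = [[1, 0]]          # phi_{1,2}: projects onto the first

# Functoriality forces phi_{0,2} to be the composite phi_{1,2} . phi_{0,1}.
phi_02 = matmul(phi_12, phi_01)
print(phi_02)  # [[1, 0]]
```

When the module arises from a filtration, these maps are induced by inclusions, and the rank of a composite such as phi_{0,2} counts the features born by index 0 that still survive at index 2.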
https://en.wikipedia.org/wiki/PLAC-Seq
Proximity ligation-assisted chromatin immunoprecipitation sequencing (PLAC-seq) is a chromatin conformation capture (3C)-based technique to detect and quantify genomic chromatin structure from a protein-centric approach. PLAC-seq combines in situ Hi-C and chromatin immunoprecipitation (ChIP), which allows for the identification of long-range chromatin interactions at a high resolution with low sequencing costs. Mapping long-range 3-dimensional (3D) chromatin interactions is important in identifying transcription enhancers and non-coding variants that can be linked to human diseases. Different 3C-based techniques have been used to study the higher-order 3D chromatin structure, and they have been combined with high-throughput sequencing to determine the chromatin structure on a genome-wide level. Hi-C is one of the most widely used 3C-based techniques because it allows for high-resolution (kilobase-scale) genome-topology identification. However, it requires billions of sequencing reads, which has limited its application. Another commonly used 3C-based technique is chromatin interaction analysis by paired-end tag sequencing (ChIA-PET). ChIA-PET can identify long-range interactions of transcription promoters and enhancers at a high resolution but requires millions of cells. PLAC-seq alleviates these issues by using in situ Hi-C, which creates long-range DNA contacts in situ in the nucleus before lysis. Unlike ChIA-PET, which performs ChIP and proximity ligation after chromatin shearing, performing proximity ligation in the nuclei first prevents large disruptions of protein/DNA complexes. This decreases false-positive interactions and improves DNA contact capture efficiency, meaning that PLAC-seq is more accurate and requires fewer cells. History PLAC-seq was developed in 2016; an almost identical technique called HiChIP was developed in the same year. Both methods combine in situ Hi-C and ChIP but have different library preparation methods. While PLAC-seq uses bioti
https://en.wikipedia.org/wiki/Ordered%20two-template%20relay
Ordered Two-Template Relay (OTTR) is a library preparation technique used to improve quantitation of highly modified non-coding RNA (ncRNA) species, which have been difficult to characterize using traditional cDNA sequencing approaches. OTTR leverages a retroelement reverse transcriptase (RT), termed BoMoC, with template-jumping properties and high processivity across modified RNA templates, to generate cDNA products for next-generation sequencing (NGS). Overall, OTTR offers a streamlined approach for cDNA library production of full-length and modified ncRNA targets. Background Cellular ncRNA pools are known to be dynamically regulated and can vary considerably between different cell types and developmental stages. Dysregulation of transfer RNAs (tRNAs), a type of ncRNA, has been linked to a diverse array of detrimental physiological conditions, including neurological diseases and cancer. While characterization of tRNA diversity is relevant to disease, current library preparation approaches are limited in their ability to capture highly modified tRNA bases, which block reverse transcriptases and interfere with the production of the full-length cDNA intermediates needed for sequencing. To date, several cDNA library preparation techniques, including OTTR, have attempted to overcome these problems and improve our ability to characterize ncRNA pools. OTTR Workflow BoMoC Reverse Transcriptase Reverse transcriptases (RTs) are polymerases capable of synthesizing complementary DNA (cDNA) using either RNA or DNA templates and have become essential biotechnology tools in both clinical and laboratory settings. OTTR makes use of a unique non-long terminal repeat (LTR) retroelement RT called BoMoC, due to its specialized ability to synthesize cDNA opposite templates containing modified bases or sugar backbones and its high processivity across discontinuous RNA templates. Originally purified from the silk moth Bombyx mori, OTTR BoMoC is N-terminal
https://en.wikipedia.org/wiki/Tcr-seq
TCR-Seq (T-cell Receptor Sequencing) is a method used to identify and track specific T cells and their clones. TCR-Seq utilizes the unique nature of the T-cell receptor (TCR) as a ready-made molecular barcode. The technology can be applied in both single-cell sequencing and high-throughput screens. Background T-cell Receptor (TCR) T cells are a part of the adaptive immune system and play a critical role in protecting the body from foreign pathogens. T-cell receptors (TCRs) are a group of membrane proteins found on the surface of T cells which can bind to foreign antigens. TCRs interact with major histocompatibility complexes (MHC) on cell surfaces to recognize antigens. They are heterodimers made up predominantly of α and β chains (or, more rarely, δ and γ chains) and consist of a variable region and a constant region. Variable regions are produced through a process called VDJ recombination, which results in unique amino acid sequences for α, β, and γ chains. The result is that each TCR is unique and recognizes a specific antigen. Complementarity Determining Regions (CDRs) Complementarity determining regions (CDRs) are a part of the TCR and play an essential role in TCR-MHC interactions. CDR1 and CDR2 are encoded by V genes, while CDR3 is made from the region between V and J genes or between D and J genes (termed "VDJ genes" when referred to together). CDR3 is the most variable of the CDRs and is in direct contact with the antigen. As such, CDR3 is used as the “barcode region” to identify unique T cell populations, as it is highly unlikely for two T cells to have the same CDR3 sequence unless they came from the same parental T cell. Clonality VDJ recombination produces such a vast number of unique TCRs that many receptors never encounter the antigen they are best suited for. When a foreign antigen is present in the body, the few T cells that recognize that antigen are positively selected for so that the body has an adequate number of T cells to mount
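Because CDR3 sequences act as clone barcodes, repertoire composition can be summarized directly from their counts. One widely used summary statistic is a normalized-Shannon-entropy "clonality" score; the sketch below uses invented CDR3 strings for illustration:

```python
from collections import Counter
from math import log

def clonality(cdr3_list):
    """Clonality of a repertoire from CDR3 'barcodes':
    1 - (Shannon entropy / ln richness).  0 means every clone is
    equally frequent; values near 1 mean one expanded clone dominates.
    A single-clone sample is treated as maximally clonal by convention."""
    counts = Counter(cdr3_list)
    total = sum(counts.values())
    freqs = [c / total for c in counts.values()]
    if len(freqs) < 2:
        return 1.0
    entropy = -sum(f * log(f) for f in freqs)
    return 1 - entropy / log(len(freqs))

even = ["CASSLGA", "CASSPDR", "CASSQET", "CASSFGQ"]
expanded = ["CASSLGA"] * 97 + ["CASSPDR", "CASSQET", "CASSFGQ"]
print(round(clonality(even), 3))      # ~0 (perfectly even repertoire)
print(round(clonality(expanded), 3))  # ~0.88 (one dominant clone)
```

Tracking this score over time is one way TCR-Seq is used to follow clonal expansion, for example after vaccination or during immunotherapy.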
https://en.wikipedia.org/wiki/NOMe-seq
Nucleosome Occupancy and Methylome Sequencing (NOMe-seq) is a genomics technique used to simultaneously detect nucleosome positioning and DNA methylation. This method is an extension of bisulfite sequencing, which is the gold standard for determining DNA methylation. NOMe-seq relies on the methyltransferase M.CviPI, which methylates cytosines in GpC dinucleotides unbound by nucleosomes or other proteins, creating a nucleosome footprint. The mammalian genome naturally contains DNA methylation, but only at CpG sites, so GpC methylation can be differentiated from genomic methylation after bisulfite sequencing. This allows simultaneous analysis of the nucleosome footprint and endogenous methylation on the same DNA molecules. In addition to nucleosome footprinting, NOMe-seq can determine locations bound by transcription factors. Nucleosomes are bound by 147 base pairs of DNA, whereas transcription factors or other proteins will only bind a region of approximately 10-80 base pairs. Following treatment with M.CviPI, nucleosome and transcription factor sites can be differentiated based on the size of the unmethylated GpC region. Nucleosome occupancy determines DNA accessibility, which provides insight into regulatory regions of the genome. Important regulatory elements within a cell (such as promoters, enhancers, and silencers) are located in open or accessible regions to allow binding of transcription factors or other regulatory molecules. NOMe-seq can therefore be used to elucidate regulatory information. Alternative DNA accessibility techniques include MNase-seq, DNase-seq, FAIRE-seq, and their successor ATAC-seq. NOMe-seq has the additional benefit of providing DNA methylation status, which also plays a crucial role in the regulation of genomic activity. Increased DNA methylation is associated with transcriptional silencing, whereas accessible DNA unbound by nucleosomes is generally associated with transcriptional activation. In this sense, NOMe-seq
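In downstream analysis, each cytosine is typically assigned to a sequence context so the two methylation signals can be separated: GpC sites not followed by G report accessibility, CpG sites not preceded by G report endogenous methylation, and overlapping GCG positions are commonly discarded as ambiguous. A minimal sketch of such a classifier (the context labels follow one common convention; exact naming varies between pipelines):

```python
def classify_cytosines(seq):
    """Assign each cytosine in `seq` to a NOMe-seq context:
    'GCH' - GpC not followed by G: reports accessibility (M.CviPI)
    'HCG' - CpG not preceded by G: reports endogenous methylation
    'GCG' - overlapping GpC/CpG: ambiguous, conventionally discarded
    'HCH' - uninformative."""
    calls = {}
    for i, base in enumerate(seq):
        if base != "C":
            continue
        prev_g = i > 0 and seq[i - 1] == "G"
        next_g = i + 1 < len(seq) and seq[i + 1] == "G"
        if prev_g and next_g:
            calls[i] = "GCG"
        elif prev_g:
            calls[i] = "GCH"
        elif next_g:
            calls[i] = "HCG"
        else:
            calls[i] = "HCH"
    return calls

print(classify_cytosines("AGCTACGTGCG"))  # {2: 'GCH', 5: 'HCG', 9: 'GCG'}
```

After bisulfite conversion, methylation calls at the GCH positions are interpreted as open chromatin, while calls at HCG positions are interpreted as endogenous methylation, on the same molecule.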
https://en.wikipedia.org/wiki/Conservative%20morphological%20anti-aliasing
Conservative morphological anti-aliasing (CMAA) is an antialiasing technique originally developed by Filip Strugar at Intel. CMAA is an image-based, post-processing technique similar to morphological antialiasing. CMAA uses four main steps: image analysis for color discontinuities, locally dominant edge detection, simple shape handling, and symmetrical long edge shape handling. A few years after CMAA was introduced, Intel unveiled an updated version named CMAA2. See also Multisample anti-aliasing Fast approximate anti-aliasing Temporal anti-aliasing Supersampling Spatial anti-aliasing References Image processing Computer graphic artifacts Anti-aliasing algorithms
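The first of those steps, detecting color discontinuities, amounts to comparing neighbouring pixels against a threshold. The sketch below is a deliberately simplified, single-channel version of that idea, not Intel's actual implementation (the threshold value is an assumption):

```python
def edge_mask(image, threshold=0.1):
    """Step-1-style discontinuity analysis: flag an edge between
    horizontally adjacent pixels whose luminance differs by more
    than `threshold`.  (Illustrative only: the real CMAA also checks
    vertical neighbours and then keeps only locally dominant edges.)"""
    edges = set()
    for y, row in enumerate(image):
        for x in range(len(row) - 1):
            if abs(row[x] - row[x + 1]) > threshold:
                edges.add((x, y))
    return edges

# A 2x4 image with a hard vertical boundary between columns 1 and 2.
img = [[0.0, 0.0, 1.0, 1.0],
       [0.0, 0.0, 1.0, 1.0]]
print(sorted(edge_mask(img)))  # [(1, 0), (1, 1)]
```

The later steps then classify the detected edge runs into shapes (Z-, L- and U-shapes and long symmetrical edges) and blend only those pixels, which is what makes the method "conservative".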
https://en.wikipedia.org/wiki/Programming%20language%20design%20and%20implementation
Programming languages are typically created by designing a form of representation of a computer program, and writing an implementation for the developed concept, usually an interpreter or compiler. Interpreters are designed to read programs, usually in some variation of a text format, and perform actions based on what they read, whereas compilers translate code to a lower-level form before execution. Design In programming language design, there are a wide variety of factors to consider. Some factors may be mutually exclusive (e.g. security versus speed). It may be necessary to consider whether a programming language will perform better interpreted or compiled, whether a language should be dynamically or statically typed, whether it should support inheritance, and the general syntax of the language. Many factors involved with the design of a language can be decided by the goals behind the language. It is important to consider the target audience of a language, its unique features and its purpose. It is good practice to look at what existing languages lack, or make difficult, to make sure a language serves a purpose. Various experts have suggested useful design principles: In the last paragraph of an article published in 1972, Tony Hoare provided some general advice for any software project: “So my advice to the designers and implementer of software of the future is in a nutshell: do not decide exactly what you are going to do until you know how to do it; and do not decide how to do it until you have evaluated your plan against all the desired criteria of quality. And if you cannot do that, simplify your design until you can.” At a SIGPLAN symposium in 1973, Tony Hoare discussed various language aspects in some detail. He also identified a number of shortcomings in (then) current programming languages. “a programming language is a tool which should assist the programmer in the most difficult aspects of his art, namely program design, documentation, and debugging.” “objective criteria for
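The interpreter/compiler distinction described above can be made concrete with a toy example: an interpreter reads each instruction and performs its action immediately, with no separate translation step. A minimal sketch for a made-up postfix arithmetic language:

```python
def interpret(tokens):
    """A minimal interpreter for a toy stack language: read each
    instruction and perform its action immediately, rather than first
    translating the program to a lower-level form as a compiler would."""
    stack = []
    for tok in tokens:
        if tok == "+":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif tok == "*":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        else:
            stack.append(int(tok))  # any other token is a number literal
    return stack.pop()

# (2 + 3) * 4 written in postfix form
print(interpret(["2", "3", "+", "4", "*"]))  # 20
```

A compiler for the same language would instead emit equivalent lower-level code (machine code, bytecode, or another language) once, and execution would happen later without re-reading the source.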
https://en.wikipedia.org/wiki/Pore-C
Pore-C is an emerging genomic technique which utilizes chromatin conformation capture (3C) and Oxford Nanopore Technologies' (ONT) long-read sequencing to characterize three-dimensional (3D) chromatin structure. To characterize concatemers, the originators of Pore-C developed an algorithm to identify alignments that are assigned to a restriction fragment; concatemers with greater than two associated fragments are deemed high order. Pore-C attempts to improve on previous 3C technologies, such as Hi-C and SPRITE, by not requiring DNA amplification prior to sequencing. This technology was developed as a simpler and more easily scalable method of capturing higher-order chromatin structure and mapping regions of chromatin contact. In addition, Pore-C can be used to visualize epigenomic interactions due to the capability of ONT long-read sequencing to detect DNA methylation. Applications of this technology include analysis of combinatorial chromatin interactions, the generation of de novo chromosome scale assemblies, visualization of regions associated with multi-locus histone bodies, and detection and resolution of structural variants. Background Although the DNA within eukaryotic cells is linear, it is also intricately folded and packaged to fit within each cell’s nucleus. Thus, specific parts of the genome may be closer in physical space than would otherwise appear to be based on DNA sequence alone. The 3D genome refers to how DNA is spatially organized within cells. The 3D structures found in the genome include active and inactive chromatin, chromatin loops, and topologically associated domains (TADs). These structures function to regulate gene expression. In genomic and epigenomic research, chromatin structure is most often visualized by 3C techniques, which quantify interactions between loci to construct a 3D map. The fundamental 3C technique is used to quantify interactions between pairs of genomic loci. Methods that are derived from this technique, such as 4C, 5
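The classification step described above reduces to counting the distinct restriction fragments touched by each read's alignments. A minimal sketch with hypothetical read and fragment identifiers:

```python
def concatemer_orders(read_alignments):
    """Map each read to its concatemer order: the number of distinct
    restriction fragments covered by the read's alignments.  Reads
    with order > 2 are the higher-order contacts Pore-C targets."""
    return {read: len(set(frags)) for read, frags in read_alignments.items()}

alignments = {                      # hypothetical read -> fragment IDs
    "read1": ["frag10", "frag42"],
    "read2": ["frag7", "frag7", "frag88", "frag3"],
}
orders = concatemer_orders(alignments)
higher_order = {r for r, n in orders.items() if n > 2}
print(orders)        # {'read1': 2, 'read2': 3}
print(higher_order)  # {'read2'}
```

Each read of order n then contributes an n-way contact, rather than only the pairwise contacts recoverable from short-read methods such as Hi-C.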
https://en.wikipedia.org/wiki/Lateral%20olfactory%20tract%20usher%20substance
Lateral olfactory tract usher substance (LOTUS), also known as Cartilage acidic protein-1B (Crtac1B), is a membrane protein produced by neurons. During embryonic development, it is strongly expressed in the olfactory bulb by Mitral cells. Function LOTUS is an endogenous antagonist of the Nogo receptor (NgR1) and Paired Immunoglobulin-Like Receptor B (PirB in mice, LilrB2 in humans). These receptors block neuronal outgrowth when activated. By blocking their function, LOTUS promotes neuronal growth, e.g. during the formation of the lateral olfactory tract. As LOTUS generates a permissive brain environment for neuronal regeneration, it may aid recovery after spinal cord injury. It also has been shown to reduce synapse loss in a mouse model of Alzheimer's disease. References Neuroscience
https://en.wikipedia.org/wiki/Koml%C3%B3s%27%20theorem
Komlós' theorem is a theorem from probability theory and mathematical analysis about the almost-sure Cesàro convergence of a subsequence of an L¹-bounded sequence of random variables (or functions), together with all of its further subsequences, to an integrable random variable (or function). It is also an existence theorem for an integrable random variable (or function). There exist a probabilistic and an analytic version for finite measure spaces. The theorem was proven in 1967 by János Komlós. There also exists a generalization from 1970 for general measure spaces by Srishti D. Chatterji. Komlós' theorem Probabilistic version Let (Ω, F, P) be a probability space and ξ_1, ξ_2, … be a sequence of real-valued random variables defined on this space with sup_n E[|ξ_n|] < ∞. Then there exist a random variable ψ ∈ L¹(P) and a subsequence (η_k) = (ξ_{n_k}), such that for every arbitrary further subsequence (ζ_k) of (η_k): (ζ_1 + ⋯ + ζ_N)/N → ψ as N → ∞, P-almost surely. Analytic version Let (X, Σ, μ) be a finite measure space and (f_n) be a sequence of real-valued functions in L¹(μ) with sup_n ∫|f_n| dμ < ∞. Then there exist a function f ∈ L¹(μ) and a subsequence (g_k) = (f_{n_k}) such that for every arbitrary further subsequence (h_k) of (g_k): (h_1 + ⋯ + h_N)/N → f as N → ∞, μ-almost everywhere. Explanations So the theorem says that the chosen subsequence and all of its further subsequences converge in the sense of Cesàro means. Literature Kabanov, Yuri & Pergamenshchikov, Sergei. (2003). Two-scale stochastic systems. Asymptotic analysis and control. 10.1007/978-3-662-13242-5. Page 250. References Probability theorems Theorems in analysis
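For an i.i.d. integrable sequence, Komlós' conclusion reduces to the strong law of large numbers: the Cesàro means of the sequence (and of any subsequence) converge almost surely to the common expectation, and no passage to a subsequence is needed. A quick numerical illustration of this special case:

```python
import random

random.seed(0)
# An i.i.d. integrable sequence: exponential variables with mean 1.
# Here Komlós' theorem reduces to the strong law of large numbers, so
# the Cesaro means of the sequence and of any subsequence converge
# almost surely to the common mean.
xs = [random.expovariate(1.0) for _ in range(200_000)]  # E[x] = 1
cesaro = sum(xs) / len(xs)
cesaro_sub = sum(xs[::2]) / len(xs[::2])  # a subsequence behaves alike
print(round(cesaro, 4), round(cesaro_sub, 4))  # both close to 1
```

The strength of the theorem lies in the non-i.i.d. case: only uniform boundedness of the first absolute moments is assumed, and the price paid is that convergence is guaranteed only along a subsequence.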
https://en.wikipedia.org/wiki/Drawing%20tower
A drawing tower produces a fine glass filament by drawing a glass preform. The tip of the preform is heated to melting temperature and then a strand of molten material is pulled downward. Industrial drawing towers range in height from 30 to 45 meters. A drawing tower is used in the production of optical fiber, for example for fiber-optic communication cables. The preform is a multi-layered cylinder typically 20 cm in diameter, and 2 m long. References Fiber optics
https://en.wikipedia.org/wiki/Syntax%20and%20semantics%20of%20logic%20programming
Logic programming is a programming paradigm that includes languages based on formal logic, including Datalog and Prolog. This article describes the syntax and semantics of the purely declarative subset of these languages. Confusingly, the name "logic programming" also refers to a programming language that roughly corresponds to the declarative subset of Prolog. Unfortunately, the term must be used in both senses in this article. Declarative logic programs consist entirely of rules of the form H :- B1, ..., BN. Each such rule can be read as an implication, meaning "If each of B1, ..., BN is true, then H is true". Logic programs compute the set of facts that are implied by their rules. Many implementations of Datalog, Prolog, and related languages add procedural features such as Prolog's cut operator or extra-logical features such as a foreign function interface. The formal semantics of such extensions are beyond the scope of this article. Datalog Datalog is the simplest widely-studied logic programming language. There are three major definitions of the semantics of Datalog, and they are all equivalent. The syntax and semantics of other logic programming languages are extensions and generalizations of those of Datalog. Syntax A Datalog program consists of a list of rules (Horn clauses). If constant and variable are two countable sets of constants and variables respectively and relation is a countable set of predicate symbols, then the following BNF grammar expresses the structure of a Datalog program: <program> ::= <rule> <program> | "" <rule> ::= <atom> ":-" <atom-list> "." <atom> ::= <relation> "(" <term-list> ")" <atom-list> ::= <atom> | <atom> "," <atom-list> | "" <term> ::= <constant> | <variable> <term-list> ::= <term> | <term> "," <term-list> | "" Atoms are also referred to as literals. The atom to the left of the :- symbol is called the head of the rule; the atoms to the right are the body. Every Datalog program must satisfy the condition that every variable that appears in th
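The semantics sketched above ("compute the set of facts implied by the rules") can be realized by naive bottom-up evaluation: apply every rule to the known facts until nothing new is derived. A Python sketch for the classic ancestor program (the relation names and constants are illustrative):

```python
def naive_eval(parent):
    """Naive bottom-up evaluation of the classic Datalog program
        ancestor(X, Y) :- parent(X, Y).
        ancestor(X, Y) :- parent(X, Z), ancestor(Z, Y).
    Rules are applied repeatedly until a fixed point is reached."""
    ancestor = set(parent)                  # first rule
    changed = True
    while changed:
        changed = False
        for (x, z) in parent:               # second rule: join parent
            for (z2, y) in list(ancestor):  # facts with ancestor facts
                if z == z2 and (x, y) not in ancestor:
                    ancestor.add((x, y))
                    changed = True
    return ancestor

parent = {("alice", "bob"), ("bob", "carol")}
print(sorted(naive_eval(parent)))
# [('alice', 'bob'), ('alice', 'carol'), ('bob', 'carol')]
```

Termination is guaranteed for Datalog because the set of derivable facts over a finite set of constants is finite; production systems use the more efficient semi-naive strategy, which only joins against facts derived in the previous iteration.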
https://en.wikipedia.org/wiki/Resilience%20%28mathematics%29
In mathematical modeling, resilience refers to the ability of a dynamical system to recover from perturbations and return to its original stable steady state. It is a measure of the stability and robustness of a system in the face of changes or disturbances. If a system is not resilient enough, it is more susceptible to perturbations and can more easily undergo a critical transition. A common analogy used to explain the concept of resilience of an equilibrium is one of a ball in a valley. A resilient steady state corresponds to a ball in a deep valley, so any push or perturbation will very quickly lead the ball to return to the resting point where it started. On the other hand, a less resilient steady state corresponds to a ball in a shallow valley, so the ball will take a much longer time to return to the equilibrium after a perturbation. The concept of resilience is particularly useful in systems that exhibit tipping points, whose study has a long history that can be traced back to catastrophe theory. While this theory was initially overhyped and fell out of favor, its mathematical foundation remains strong and is now recognized as relevant to many different systems. History In 1973, Canadian ecologist C. S. Holling proposed a definition of resilience in the context of ecological systems. According to Holling, resilience is "a measure of the persistence of systems and of their ability to absorb change and disturbance and still maintain the same relationships between populations or state variables". Holling distinguished two types of resilience: engineering resilience and ecological resilience. Engineering resilience refers to the ability of a system to return to its original state after a disturbance, such as a bridge that can be repaired after an earthquake. Ecological resilience, on the other hand, refers to the ability of a system to maintain its identity and function despite a disturbance, such as a forest that can regenerate after a wildfire while maintain
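The valley analogy has a standard quantitative counterpart: near a stable equilibrium the dynamics linearize to dx/dt = -kx, where k (the steepness, or curvature, of the valley) is the recovery rate, so a deeper valley means a faster return. A minimal forward-Euler sketch of this linearized picture (the tolerance and step size are arbitrary choices):

```python
def recovery_time(k, x0=1.0, tol=0.01, dt=0.001):
    """Time for a perturbation x0 to decay below `tol` under the
    linearized dynamics dx/dt = -k*x, where k plays the role of the
    valley's curvature (the engineering-resilience recovery rate).
    Integrated with a simple forward-Euler step."""
    x, t = x0, 0.0
    while abs(x) > tol:
        x += -k * x * dt
        t += dt
    return t

print(round(recovery_time(k=4.0), 2))  # deep valley: fast return
print(round(recovery_time(k=0.5), 2))  # shallow valley: slow return
```

This recovery rate is also why "critical slowing down" (a lengthening return time as k approaches zero) is used as an early-warning signal that a system is nearing a tipping point.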
https://en.wikipedia.org/wiki/Libroadrunner
libRoadRunner is a C/C++ software library that supports simulation of SBML-based models. It uses LLVM to generate extremely high-performance code and is the fastest SBML-based simulator currently available. Its main purpose is for use as a reusable library that can be hosted by other applications, particularly on large compute clusters for doing parameter optimization where performance is critical. It also has a set of Python bindings that allow it to be easily used from Python. libRoadRunner is often paired with Tellurium, which adds additional functionality such as Antimony scripting. Capabilities Time-course simulation using the CVODE, RK45, and Euler solvers of ordinary differential equations, which can report on the system's variable concentrations and reaction rates over time. Steady-state calculations using non-linear solvers such as kinsolve and NLEQ2 Stochastic simulation using the standard Gillespie algorithm. Supports both steady-state and time-dependent Metabolic control analysis, including calculating the elasticities towards the variable metabolites by algebraic or numerical differentiation of the rate equations, as well as the flux and concentration control coefficients by means of matrix inversion and perturbation methods. libRoadRunner will also compute the structural matrices (e.g. K- and L-matrices) of a stoichiometric model. The stability of a system can be investigated by way of the system eigenvalues. Data and results can be plotted via matplotlib, or saved in text files. libRoadRunner supports the import and export of standard SBML. Applications libRoadRunner has been widely used in the systems biology community for doing research in systems biology modeling, as well as being a host for other simulation platforms. Software applications that use libRoadRunner CompuCell3D CRNT4SBML DIVIPAC massPy pyBioNetFit PhysiCell pyViPR runBiosimulations SBMLSim Tellurium (simulation tool) Tissue Forge (multi-cellular
https://en.wikipedia.org/wiki/Sarvatra%20Technologies
Sarvatra Technologies Private Limited is an Indian fintech company, headquartered in Pune, Maharashtra, that develops banking software and provides PaaS (Platform as a Service), SaaS (Software as a Service), and cloud computing solutions to cooperative, private, and public sector banks in India. As of 2020, the company's PaaS model supported 600 banks in India, while the company's electronic funds transfer (EFT) switch had a 54% market share among application service providers in India. The company had a 55% market share in providing banking software to banks in India. The company's switch was the recipient of a BFSI Innovation Tribe Award from The Economic Times in 2018, and a Best Banking Technology Award from the Internet and Mobile Association of India in 2019. History The company was founded on 22 June 2000 by Mandar Agashe and is headquartered in Pune, Maharashtra. The company initially worked in conjunction with Agashe's other financial technology companies, EBZ Online and Codito, developing banking software solutions. After securing a loan worth ₹35 million from his father's Suvarna Sahakari Bank in May 2002, the companies launched a point of sale (PoS) terminal called "Sarvatra" through the bank's 12 branches across Maharashtra in July 2003. The name Sarvatra Technologies began being used officially by 2004. In May 2006, the company installed its inaugural "Anywhere Money" point of sale terminal in a bank in Ahmednagar as a proof of concept for the Government of Maharashtra. Then Union Minister of Agriculture Sharad Pawar inaugurated the project. The company's terminals received praise from politicians Shankarrao Gadakh and Hasan Mushrif, with them encouraging local schools, farmers, artisans, and traders to make use of the system. The Bharat Sanchar Nigam Limited, as well as sugarcane factories in the Parner taluka and those of the Brihan Maharashtra Sugar Syndicate implemented use of the terminals for their clients. By January 2008, in a report filed
https://en.wikipedia.org/wiki/Radio%20Equipment%20Directive%20%282022%29
The Directive (EU) 2022/2380 is a directive of the European Parliament and the European Council which was formally adopted on 23 November 2022 amending Radio Equipment Directive 2014/53. The directive mandates the use of USB-C as a universal charger using a standard USB-C to USB-C cable for smartphones, tablets, digital cameras, headphones, headsets, handheld video game consoles, portable speakers, e-readers, keyboards, mice, portable navigation systems, and earbuds that use wired charging by the end of 2024, and laptops by 2026. Furthermore, if said equipment is capable of being recharged by wired charging at voltages higher than 5 volts, currents higher than 3 amperes, or powers higher than 15 watts, the equipment must support the full functionality of USB Power Delivery. It is considered a successor to the EU's common external power supply (2009–2014), a voluntary specification which used micro-USB as a standard connector. References See also Brussels effect External links Directive 2022/2380 text European Union directives 2022 in the European Union Electronics and the environment Battery charging Mobile phone standards
https://en.wikipedia.org/wiki/TVRI%20Tower
The TVRI Tower is a 144-meter-high television transmitter tower located in Jakarta, Indonesia. Built between 1 April 1975 and 24 August 1977, the tower was at one time one of the tallest structures in Jakarta and Indonesia. History Since TVRI's establishment on 17 August 1962, its broadcasts in Jakarta and the surrounding area had been emitted from an 85-meter-high iron antenna located near the current position of the tower, which was funded from Japanese war reparations and built by Nippon Electric Company. A transmitter tower had been planned since 1 May 1972 as part of the construction of a television station complex, designed to suit the surrounding environment, which included the Parliament Complex of the Republic of Indonesia and the Gelora Bung Karno Stadium. Construction began on 1 April 1975, carried out by an Indonesian–Japanese joint venture of P.T. Waskita and Kajima. The tower was completed on 24 August 1977 and entered service in 1978. It was inaugurated on 24 August 1982 along with the new studio building. See also Fernsehturm Stuttgart List of tallest towers in the world List of transmission sites References External links TVRI Tower, Jakarta - SkyscraperPage.com TVRI Towers in Indonesia Tourist attractions in Jakarta Buildings and structures in Jakarta Towers completed in 1977 Towers with revolving restaurants 1982 establishments in Indonesia Radio masts and towers
https://en.wikipedia.org/wiki/Clothing%20physiology
Clothing physiology is a branch of science that studies the interaction between clothing and the human body, with a particular focus on how clothing affects the physiological and psychological responses of individuals to different environmental conditions. The goal of clothing physiology research is to develop a better understanding of how clothing can be designed to optimize comfort, performance, and protection for individuals in various settings, including outdoor recreation, occupational environments, and medical contexts. Purpose of clothing Human motives for clothing are frequently oversimplified in cultural and sociological theories, which assume that clothing is worn solely for modesty, adornment, protection, or sex. However, clothing is primarily motivated by the environment, with its form being influenced by human characteristics and traits, as well as physical and social factors such as sex relations, costume, caste, class, and religion. Ultimately, clothing must be comfortable in various environmental conditions to support physiological behavior. The concept of clothing has been aptly characterized as a quasi-physiological system that interacts with the human body. Quasi-physiological systems Clothing can be considered a quasi-physiological system that interacts with the body in different ways, just like the distinct physiological systems of the human body, such as the digestive system and the nervous system, which can be analyzed systematically. Purpose of clothing physiology The acceptance and perceived comfort of a garment cannot be attributed solely to its thermal properties. Rather, the sensation of comfort when wearing a garment is associated with various factors, including the fit of the garment, its moisture buffering properties, and the mechanical characteristics of the fibers and fabrics used in its construction. The field of clothing physiology concerns the complex interplay between the human body, environmental conditions, and clothing.
https://en.wikipedia.org/wiki/Bating%20%28leather%29
Bating is a technical term used in the tanning industry to denote the treatment of leather with hen or pigeon manure, similar to puering (see puer), in which the leather is treated with dog excrement; in both cases the treatment was performed on the raw hide prior to tanning in order to render the skins, and the subsequent leather, soft and supple. Today, both practices are obsolete and have been replaced in the tanneries with other natural proteolytic enzymes. Leather processing Since early times, tanners have made use of either dog fæces, or hen and pigeon manure, in one of the early phases of leather treatment to produce a soft leather. A bath solution containing the animal extracts was made and the raw hide inserted and left there for a few days, which activated the bacteria and enzymes that reacted with the collagen in the animal skin to make the leather soft and supple. This step was followed by drenching, a term denoting skins that were thoroughly washed in a bath solution of bran (usually of barley or rye), or ash bark. This process was thought to open up the fibre, and, if lime (CaO) was used to remove hair before the actual bating, drenching removed excess or residual lime trapped in the leather. Early inventors who concerned themselves with tanning looked upon bating as a process for removing lime from the skins, and nothing more, and since the use of animal fæces was repulsive, sought to substitute them by inventing artificial bates. What they failed to realize, however, was that bating also acts upon the skin fibres, rendering portions of the skins soluble and bringing about the finished condition. One of the early inventions made to replicate bating was the chemical use of old lime liquors (with high levels of ammonia) neutralized with sulphuric acid. This method more closely approximated the conditions of the dung bate. Experimentation and research Puering fell into disuse after began producing the enzyme pancreatin on an industrial scale b
https://en.wikipedia.org/wiki/DatalogZ
DatalogZ is an extension of Datalog with integer arithmetic and comparisons. The decision problem of whether or not a given ground atom (fact) is entailed by a DatalogZ program is RE-complete (hence, undecidable), which can be shown by a reduction from Diophantine equations. Syntax The syntax of DatalogZ extends that of Datalog with numeric terms, which are integer constants, integer variables, or terms built up from these with addition, subtraction, and multiplication. Furthermore, DatalogZ allows comparison atoms, which are atoms of the form t < s or t <= s for numeric terms t, s. Semantics The semantics of DatalogZ are based on the model-theoretic (Herbrand) semantics of Datalog. Limit DatalogZ The undecidability of entailment for DatalogZ motivates the definition of limit DatalogZ. Limit DatalogZ restricts predicates to a single numeric position, which is marked maximal or minimal. The semantics are based on the model-theoretic (Herbrand) semantics of Datalog, but require Herbrand interpretations to be limit-closed to qualify as models, in the following sense: given a ground atom A(c, k) of a limit predicate A whose last position is a max (resp. min) position, if A(c, k) is in a Herbrand interpretation I, then the ground atoms A(c, k') for all k' <= k (resp. k' >= k) must also be in I for I to be limit-closed. Example Given a constant w, a binary relation edge that represents the edges of a graph, and a binary relation sp with the last position of sp minimal, the following limit DatalogZ program computes the relation sp, which represents the length of the shortest path from w to any other node in the graph: sp(w, 0) :- . sp(y, m + 1) :- sp(x, m), edge(x, y). See also Constraint logic programming References Notes Sources Logic in computer science Computer programming
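The shortest-path program above can be evaluated bottom-up. The following Python sketch (an illustration, not a general limit-DatalogZ engine) exploits the minimal marking on sp's last position: only the least value derived for each node is kept, giving a finite representation of the infinite limit-closed model.

```python
# Sketch of evaluating the limit-DatalogZ shortest-path program by
# naive bottom-up fixpoint iteration. Because the last position of sp
# is minimal, keeping only the minimum derived value per node suffices.
def shortest_paths(w, edges):
    sp = {w: 0}                      # sp(w, 0) :- .
    changed = True
    while changed:                   # iterate rules to a fixpoint
        changed = False
        for (x, y) in edges:
            if x in sp:              # sp(y, m + 1) :- sp(x, m), edge(x, y).
                m = sp[x] + 1
                if y not in sp or m < sp[y]:   # keep only the minimum
                    sp[y] = m
                    changed = True
    return sp

edges = [("a", "b"), ("b", "c"), ("a", "c"), ("c", "d")]
print(shortest_paths("a", edges))    # {'a': 0, 'b': 1, 'c': 1, 'd': 2}
```

This is essentially the Bellman–Ford relaxation loop, which is what the minimal-position semantics collapses to on this program.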
https://en.wikipedia.org/wiki/Empress%20%28cracker%29
Empress (sometimes stylized EMPRESS) is a video game cracker who specializes in breaking anti-piracy software. While the identity of Empress is unknown, she refers to herself as a Russian woman and states that she has iris heterochromia. Empress has also released cracked games under the moniker C000005. Empress is known as one of the few crackers who can crack Denuvo. She also claims to have been the one who cracked all of Codex's releases of Denuvo-protected games. Her motivation is to remove the software licensing aspect of digital games in an effort to preserve them after developers drop support. Empress also claims that removing digital rights management (DRM) reduces performance issues in a game. Career Empress became interested in the DRM-cracking scene in 2014. Her followers can participate in polls to select which game they want cracked next, and her work is funded through crowdsourced donations. Empress uses the money to cover living costs, upgrade hardware, and purchase games that she intends to crack. She acknowledges that accepting payment for piracy is against the etiquette of the warez scene. Empress rose to prominence after releasing a cracked version of Red Dead Redemption 2. Other high-profile games cracked by Empress include Mortal Kombat 11 and Anno 1800. In February 2021, Empress stated that she would soon be arrested after allegedly being caught red-handed working on the crack for Immortals Fenyx Rising. Empress blamed FitGirl Repacks, with whom she had a feud. However, that March, Empress was still able to publish a workaround for the online check-in system of Battle.net. Empress's arrest announcement was met with general skepticism by the cracking community. In 2023, Empress was banned from Reddit by the site's administrators. She released a cracked version of Hogwarts Legacy that February. Controversies Empress is known around the P2P scene for the "personal note" section in the NFOs of her releases, often co
https://en.wikipedia.org/wiki/Hot%20pixel%20%28telescopes%29
A hot pixel or bright dot defect is a pixel that outputs many more electrons than others at the same input signal in a charge-coupled device (CCD) or CMOS sensor. In the simulated image, the hot pixels are the sources of the salt-and-pepper noise. By the definition used for the HST Advanced Camera for Surveys (ACS), a pixel above 0.14 e¯/pixel/second is considered a "hot" pixel. A warm pixel is a pixel that has negative bias values. In the definition of the Hubble Space Telescope, a pixel below the hot-pixel range but above 0.06 e¯/pixel/second is considered a "warm" pixel. See also Defective pixel References Digital imaging
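As an illustration of the thresholds quoted above, classifying pixels in a dark frame can be sketched as follows (the function name and frame values are invented for this example; rates are in e¯/pixel/second):

```python
# Hypothetical sketch: flag hot and warm pixels in a 2-D dark-rate
# frame using the HST ACS-style thresholds quoted in the article.
HOT_THRESHOLD = 0.14   # e-/pixel/s: above this, "hot"
WARM_THRESHOLD = 0.06  # e-/pixel/s: above this (but not hot), "warm"

def classify_pixels(dark_rate):
    """Return (hot, warm) lists of (row, col) coordinates."""
    hot, warm = [], []
    for i, row in enumerate(dark_rate):
        for j, rate in enumerate(row):
            if rate > HOT_THRESHOLD:
                hot.append((i, j))
            elif rate > WARM_THRESHOLD:
                warm.append((i, j))
    return hot, warm

frame = [[0.01, 0.20],
         [0.09, 0.02]]
print(classify_pixels(frame))   # ([(0, 1)], [(1, 0)])
```

In practice such maps are built from long dark exposures and used to mask or correct the affected pixels during calibration.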
https://en.wikipedia.org/wiki/Software%20load%20testing
The term load testing is used in different ways in the professional software testing community. Load testing generally refers to the practice of modeling the expected usage of a software program by simulating multiple users accessing the program concurrently. As such, this testing is most relevant for multi-user systems, often ones built using a client/server model, such as web servers. However, other types of software systems can also be load tested. For example, a word processor or graphics editor can be forced to read an extremely large document, or a financial package can be forced to generate a report based on several years' worth of data. The most accurate load testing simulates actual use, as opposed to testing using theoretical or analytical modeling. Load testing lets you measure your website's quality of service (QoS) performance based on actual customer behavior. Nearly all load testing tools and frameworks follow the classical load testing paradigm: as customers visit the website, a script recorder records the communication and creates related interaction scripts. A load generator then replays the recorded scripts, which may first be modified with different test parameters. During replay, hardware and software statistics are monitored and collected by the test conductor; these statistics include the CPU, memory, and disk IO of the physical servers, as well as the response time and throughput of the system under test (SUT). Finally, all these statistics are analyzed and a load testing report is generated. Load and performance testing analyzes software intended for a multi-user audience by subjecting the software to different numbers of virtual and live users while monitoring performance measurements under these different loads. Load and performance testing is usually conducted in a test environment identical to the production environment before the software system is permitted to go live. As an
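The measurement side of this paradigm can be sketched in a few lines of Python; here `system_under_test` is a hypothetical stand-in for a real request, and ten concurrent virtual users each issue five requests while response times are collected for later analysis:

```python
# Minimal illustration (not tied to any particular load testing tool):
# N concurrent virtual users each call the system under test, and
# per-request response times are collected for analysis.
import threading
import time
import statistics

def system_under_test():
    time.sleep(0.01)        # stand-in for a real request

def virtual_user(n_requests, timings):
    for _ in range(n_requests):
        start = time.perf_counter()
        system_under_test()
        timings.append(time.perf_counter() - start)

timings = []
users = [threading.Thread(target=virtual_user, args=(5, timings))
         for _ in range(10)]
for u in users:
    u.start()
for u in users:
    u.join()

print(len(timings))                          # 50 samples collected
print(round(statistics.mean(timings), 3))    # mean response time (s)
```

Real tools add ramp-up schedules, think times, and server-side resource monitoring on top of this basic loop.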
https://en.wikipedia.org/wiki/Trust%20and%20safety
Trust and safety (T&S) is a set of practices, policies, technologies, tools, and processes used to maintain and promote security on the internet and create a trustworthy environment for users. One of the core objectives of trust and safety is to ensure that a web portal or digital platform is a trusted, safe environment where personal identity, data, and virtual assets are protected. T&S helps achieve this by enforcing various measures and tools, such as policies and guidelines for acceptable conduct and behaviour, monitoring for data breaches and malicious or harmful activities, and addressing incidents and cybercrimes quickly. By investing in the trust and safety of digital platforms, brands establish themselves as safe and reliable online destinations, building strong relationships with their users and protecting their reputation. Key services Trust and safety encompasses a range of services, including: Data security measures, such as encryption, secure storage, and restricted access controls, protect user data from unauthorised access. Content moderation services involve reviewing content created by users - known in the industry as user-generated content - and removing what is inappropriate, such as hate speech, misinformation, graphic violence, and other non-compliant material. Cybersecurity solutions such as firewalls, intrusion detection and prevention systems, VPNs, antivirus software, and authentication solutions reduce the risk of hacking, data breaches, and other malicious activities. Real-time monitoring allows for quick, automated threat detection and prompt response to incidents. Tools such as digital wallets, blockchain technology, MFA solutions, digital asset management platforms, or virtual asset recovery services enable the protection of virtual assets such as digital currency, in-game items, or other digital assets. References Internet culture Reputation management Content moderation
https://en.wikipedia.org/wiki/Skyhigh%20Security
Skyhigh Security is a cloud security company headquartered in San Jose, California. The company offers enterprise cloud security services. History Skyhigh Networks was founded in 2011 by Rajiv Gupta, Sekhar Sarukkai, and Kaushik Narayan to protect an organization's sensitive data by providing visibility into, and control over, the usage of cloud services. In November 2016, Skyhigh Networks was recognized as a market leader. On November 27, 2017, McAfee, an American global computer security software company, announced a definitive agreement to acquire Skyhigh Networks. The deal closed in January 2018. In March 2021, McAfee announced that its enterprise business was acquired by private equity firm Symphony Technology Group (STG) for US$4 billion, and in March 2022, STG relaunched the cloud portfolio, including the former Skyhigh Networks, as Skyhigh Security. Skyhigh Security operates in the expanded security service edge (SSE) market, focusing on cloud data protection. References Computer security companies Computer security software companies
https://en.wikipedia.org/wiki/Dische%20test
The Dische test, or Dische reaction, is used to distinguish DNA from RNA. It was invented by Zacharias Dische. It is a type of nitrate test. Method Dische's diphenylamine reagent consists of diphenylamine, glacial acetic acid, sulfuric acid, and ethanol. When the reagent is heated with a sample containing DNA, it turns blue; a more intense blue color indicates a greater concentration of DNA. Mechanism The acid converts deoxyribose to a molecule that binds with diphenylamine to form a blue substance. The reagent does not interact with RNA, so it can be used to distinguish DNA from RNA. See also Bial's test References Analytical reagents
https://en.wikipedia.org/wiki/Leonor%20Ferrer%20Girabau
Leonor Ferrer Girabau (1 July 1874 – 1953) (sometimes spelled Leonor Ferrer i Girabau but widely known as Leonor Ferrer) was the first female draftsperson in Spain (1905). Biography She was born in Barcelona. In 1897, Ferrer obtained the title of teacher. On 13 March 1905, she earned the title of expert draftsman issued by the Friends of the Country Economic Society, Teaching Section, School of Governesses and Other Careers for Women and became the first woman in Spain to obtain the degree. She had been working as a draughtswoman for more than six years when she obtained her degree in 1905. To qualify, she had studied technical drawing, topographical drawing, geometry and trigonometry between 1902 and 1904. Between 1898 and 1931, she worked for the General Telephone Society, which later became the Peninsular Telephone Company. She entered by competitive examination as a telephone operator but in 1899, and thanks to her knowledge of drawing, she became an assistant to the draftsman Juan Marxuach. When she left the company, she was appointed head of the Plans Section, directing a team that included, among others: Eulàlia Fàbregas, Teresa Torrens and Maria Grau. Her task was recognized in the publications of the time: "her expertise in the highly useful art that she cultivates, the success and beauty of her drawings, the seriousness with which she carries out her mission have earned her trust and appreciation of the important Barcelona society." From the second decade of the 20th century, Ferrer dedicated herself to teaching drawing, starting at the Institute of Culture and Popular Library of Women, a private institution dedicated to the education and promotion of women, which was founded by Francesca Bonnemaison in 1909 in the Sant Pere district of Barcelona. Next Ferrer opened her own school under the name of Drawing Academy for Young Ladies at her home at number 10, Calle de Grasas del Pueblo Seco, Barcelona. In 1931, Ferrer left the telephone company, which h
https://en.wikipedia.org/wiki/Dividing%20a%20square%20into%20similar%20rectangles
Dividing a square into similar rectangles (or, equivalently, tiling a square with similar rectangles) is a problem in mathematics. Three rectangles There is only one way (up to rotation and reflection) to divide a square into two similar rectangles. However, there are three distinct ways of partitioning a square into three similar rectangles: The trivial solution given by three congruent rectangles with aspect ratio 3:1. The solution in which two of the three rectangles are congruent and the third one has twice the side length of the other two, where the rectangles have aspect ratio 3:2. The solution in which the three rectangles are all of different sizes and where they have aspect ratio ρ2, where ρ is the plastic number. The fact that a rectangle of aspect ratio ρ2 can be used for dissections of a square into similar rectangles is equivalent to an algebraic property of the number ρ2 related to the Routh–Hurwitz theorem: all of its conjugates have positive real part. Generalization to n rectangles In 2022, the mathematician John Baez brought the generalization of this problem to n rectangles to the attention of the Mathstodon online mathematics community. The problem has two parts: what aspect ratios are possible, and how many different solutions are there for a given n. Freiling and Rinne had previously published a result in 1994 that states that the aspect ratio of rectangles in these dissections must be an algebraic number and that each of its conjugates must have a positive real part. However, their proof was not constructive. Numerous participants have attacked the problem of finding individual dissections using exhaustive computer search of possible solutions. One approach is to exhaustively enumerate possible coarse-grained placements of rectangles, then convert these to candidate topologies of connected rectangles. Given the topology of a potential solution, the determination of the rectangle's aspect ratio can then trivially be expre
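For the third solution, the plastic number ρ (the real root of x³ = x + 1) and the aspect ratio ρ² can be computed numerically; the following sketch uses plain bisection (the implementation details are my own illustration):

```python
# Compute the plastic number rho, the unique real root of x^3 = x + 1,
# by bisection; rho^2 is the aspect ratio of the three-unequal-
# rectangles dissection of the square described above.
def plastic_number(tol=1e-12):
    lo, hi = 1.0, 2.0            # f(1) = -1 < 0 < 5 = f(2), f(x) = x^3 - x - 1
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if mid**3 - mid - 1 < 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

rho = plastic_number()
print(round(rho, 6))        # 1.324718
print(round(rho**2, 6))     # 1.754878
```

A handy identity following from ρ³ = ρ + 1 is ρ² = 1 + 1/ρ, which gives a quick consistency check on the computed value.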
https://en.wikipedia.org/wiki/Intake%20momentum%20drag
Intake momentum drag is an aerodynamic phenomenon which affects turboprop and jet-powered aircraft. Causes Intake momentum drag arises because, as flight speed increases, the speed of the air entering the engine increases while the exit speed of the air leaving the engine remains roughly constant. The amount by which the engine increases the air's velocity is therefore reduced, causing a slight reduction in the thrust of a jet engine. Intake momentum drag yaw Intake momentum drag yaw is a further consequence of intake momentum drag which affects V/STOL (vertical and/or short take-off and landing) aircraft such as the Hawker Siddeley Harrier. It is an effect in which the mass of air ingested by the engine intake, while the aircraft is hovering in a crosswind, can result in a state of uncontrolled roll (a secondary aerodynamic effect of yaw). The phenomenon was identified during the test flying programme for the Harrier and required precise investigation. This resulted in test pilot John Farley deliberately flying right into the edge of this condition repeatedly, so that a system to counteract the effect could be developed. References Aerospace engineering Aerodynamics Classical mechanics Force
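Under idealized assumptions (constant mass flow and exhaust velocity; the numbers below are invented for illustration), the effect can be sketched with the simple momentum balance F_net = ṁ·(v_exit − v_inlet), where the ṁ·v_inlet term is the intake momentum drag:

```python
# Idealized sketch: net thrust of a jet engine as flight speed rises,
# with constant mass flow and exhaust velocity (assumptions mine).
# The mdot * v_inlet term is the intake momentum drag.
def net_thrust(mdot, v_exit, v_inlet):
    gross = mdot * v_exit            # gross (exhaust) momentum thrust, N
    momentum_drag = mdot * v_inlet   # intake momentum drag, N
    return gross - momentum_drag

mdot, v_exit = 100.0, 600.0          # kg/s and m/s, illustrative values
for v in (0.0, 100.0, 200.0):
    print(v, net_thrust(mdot, v_exit, v))   # thrust falls as speed rises
```

The sketch shows why the thrust reduction is only slight at typical flight speeds: the drag term grows linearly with inlet velocity while the gross thrust term stays fixed.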
https://en.wikipedia.org/wiki/Broadcast%20Engineering%20Conservation%20Group
The Broadcast Engineering Conservation Group (BECG) conserves historic broadcasting equipment. It is based at Hemswell Cliff in Lincolnshire, England and is a Charitable incorporated organisation. The group was founded by people with large private collections of broadcasting equipment, including several Outside Broadcast (OB) vehicles. It is led by six trustees, many of them working or retired broadcast industry professionals. In 2021 the group purchased its present building and is converting it into a permanent home for its collection known as the Broadcast Engineering Museum. To date, the museum only opens for visitors on special occasions or for groups by appointment. A newsletter called Line-Up is published a few times each year and back issues are available on the BECG website, as is a 3D virtual tour. History Members of the group had been collecting and restoring broadcasting equipment and vehicles for many years before forming BECG in 2017. Some of these vehicles have been fully restored, while others are works in progress. The group was formally incorporated as a charity by six founding trustees in May 2020. In November 2021, the group bought the former RAF Sergeants' Mess at Hemswell Cliff. The building provides a permanent home for the collection and forms the Broadcast Engineering Museum. As well as the main building, there are east and west wings of similar size and two large function rooms and workshops behind. This building had been unused for 12 years and needed a lot of repairs. The local authority, West Lindsey District Council, provided BECG with funding towards the repair of broken windows. Since acquiring the building, repairs and improvements have been made by both contractors and volunteers. In the first year vegetation was cut back, uneven ground levelled, leaking roofs repaired, drains unblocked and over 150 broken window panes replaced. A solar PV array was installed on the main south-facing roof and a CCTV system provided. Two lar
https://en.wikipedia.org/wiki/Tri-nim
Tri-nim is a mathematical abstract strategy game developed by brothers Bruce L. Hicks and Hervey C. Hicks and published by WFF 'N PROOF Games from 1970 to 1975. Players move pieces around a triangular board, attempting to score points by being the last to enter each of the corners. It is a variation on the strategy game Nim. Gameplay Tri-nim is played on a game board that has on it an equilateral triangle cut into 36 smaller triangles. The centre triangles are marked with zeros, while the rest are marked with a point value from one to six moving towards the corners. Players take turns moving stacks of counters from the centre towards one of the three corners marked with a six. On their turn, a player can move any number of counters from a single triangle, but they must be moved parallel to an edge of the triangle and can only be moved to a higher number space if there are no possible triangles of lower or equal values. A player gains control of a corner if their piece is the last to enter it. Points are awarded for each corner depending on the number of pieces on the triangle and the order in which it was cleared. The winner is the player with the most points by the time all counters have been moved into the triangle's corners. Reception In a review of the game in Games & Puzzles No.41, David Wells praised it for its "ample entertainment and tactical possibilities." Marvin Kaye, writing for Galileo, concluded that the game was "an excellent abstract strategy game" but that "unless one is a game buff, one can become hopelessly confused as to the object of Tri-Nim, which is to finish last." In A Gamut of Games, Sid Sackson described Tri-nim as "the ultimate in Nim games." References External links Abstract strategy games Board games Mathematical games
https://en.wikipedia.org/wiki/Fine-tuning%20%28deep%20learning%29
In deep learning, fine-tuning is an approach to transfer learning in which the weights of a pre-trained model are trained on new data. Fine-tuning can be done on the entire neural network, or on only a subset of its layers, in which case the layers that are not being fine-tuned are "frozen" (not updated during the backpropagation step). A model may also be augmented with "adapters" that consist of far fewer parameters than the original model, and fine-tuned in a parameter-efficient way by tuning the weights of the adapters and leaving the rest of the model's weights frozen. For some architectures, such as convolutional neural networks, it is common to keep the earlier layers (those closest to the input layer) frozen because they capture lower-level features, while later layers often discern high-level features that can be more related to the task that the model is trained on. Models that are pre-trained on large and general corpora are usually fine-tuned by reusing the model's parameters as a starting point and adding a task-specific layer trained from scratch. Fine-tuning the full model is common as well and often yields better results, but it is more computationally expensive. Fine-tuning is typically accomplished with supervised learning, but there are also techniques to fine-tune a model using weak supervision. Fine-tuning can be combined with a reinforcement learning from human feedback-based objective to produce language models like ChatGPT (a fine-tuned version of GPT-3) and Sparrow. Robustness Fine-tuning can degrade a model's robustness to distribution shifts. One mitigation is to linearly interpolate a fine-tuned model's weights with the weights of the original model, which can greatly increase out-of-distribution performance while largely retaining the in-distribution performance of the fine-tuned model. Variants Low-rank adaptation Low-rank adaptation (LoRA) is an adapter-based technique for efficiently fine-tuning models. The basic idea is to design
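The adapter idea behind LoRA can be sketched in plain Python (a toy illustration with invented sizes, not a framework implementation): the pretrained weight matrix W stays frozen, and only a rank-r factorization B·A, with far fewer entries, would be trained.

```python
# Toy sketch of the LoRA parameterization: effective weights are
# W + B @ A, where W (d x d) is frozen and only B (d x r) and
# A (r x d) would be trained. All sizes and values are invented.
def matmul(B, A):
    return [[sum(B[i][k] * A[k][j] for k in range(len(A)))
             for j in range(len(A[0]))] for i in range(len(B))]

def add(W, D):
    return [[W[i][j] + D[i][j] for j in range(len(W[0]))]
            for i in range(len(W))]

d = 4                                 # model dimension (tiny, for illustration)
r = 1                                 # adapter rank, r << d
W = [[1.0] * d for _ in range(d)]     # frozen pretrained weights (d x d)
B = [[0.5] for _ in range(d)]         # trainable adapter factor (d x r)
A = [[0.1] * d]                       # trainable adapter factor (r x d)

W_eff = add(W, matmul(B, A))          # effective weights after adaptation
print(W_eff[0][0])                    # 1.05
# Only 2*d*r = 8 adapter values would be trained, vs d*d = 16 frozen ones.
```

The parameter saving grows with scale: for a realistic d in the thousands and small r, the adapter is orders of magnitude smaller than the frozen matrix it augments.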
https://en.wikipedia.org/wiki/Marginal%20sinus
The marginal sinus is a dural venous sinus surrounding the margin of the foramen magnum inside the skull, accommodated by the groove for marginal sinus. It usually drains into either the sigmoid sinus, or the jugular bulb. It communicates with the basilar venous plexus anteriorly, and the occipital sinus posteriorly (the posterior union of the left and the right marginal sinus usually representing the commencement of the occipital sinus); it may form extracranial communications with the internal vertebral venous plexuses, or deep cervical veins. Clinical significance Arteriovenous fistulas involving the marginal sinus have been described - often following basilar skull fractures. The marginal sinus must be traversed during surgical entry into subdural space deep to the foramen magnum. References Veins of the head and neck Anatomy Human anatomy
https://en.wikipedia.org/wiki/Comparison%20of%20data%20structures
This is a comparison of the performance of notable data structures, as measured by the complexity of their logical operations. For a more comprehensive listing of data structures, see List of data structures. The comparisons in this article are organized by abstract data type. As a single concrete data structure may be used to implement many abstract data types, some data structures may appear in multiple comparisons (for example, a hash map can be used to implement an associative array or a set). Lists A list or sequence is an abstract data type that represents a finite number of ordered values, where the same value may occur more than once. Lists generally support the following operations: peek: access the element at a given index. insert: insert a new element at a given index. When the index is zero, this is called prepending; when the index is the last index in the list it is called appending. delete: remove the element at a given index. Maps Maps store a collection of (key, value) pairs, such that each possible key appears at most once in the collection. They generally support three operations: Insert: add a new (key, value) pair to the collection, mapping the key to its new value. Any existing mapping is overwritten. The arguments to this operation are the key and the value. Remove: remove a (key, value) pair from the collection, unmapping a given key from its value. The argument to this operation is the key. Lookup: find the value (if any) that is bound to a given key. The argument to this operation is the key, and the value is returned from the operation. Unless otherwise noted, all data structures in this table require O(n) space. Integer keys Some map data structures offer superior performance in the case of integer keys. In the following table, let k be the number of bits in the keys. Priority queues A priority queue is an abstract data type similar to a regular queue or stack. Each element in a priority queue has an associated prior
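The three map operations described above can be illustrated with Python's built-in dict, a hash map offering expected O(1) time per operation:

```python
# The three map operations -- insert, lookup, remove -- on a hash map
# (Python dict): each runs in expected O(1) time.
m = {}
m["a"] = 1            # insert: map "a" -> 1
m["a"] = 2            # insert again: the existing mapping is overwritten
print(m.get("a"))     # lookup -> 2
del m["a"]            # remove: unmap "a"
print(m.get("a"))     # lookup of an unmapped key -> None
```

Any other concrete map implementation (e.g., a balanced search tree) supports the same interface, differing only in the per-operation complexity bounds compared in this article.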
https://en.wikipedia.org/wiki/The%20Lovers%20%28Farmer%20novella%20and%20novel%29
The Lovers is a science-fiction novella by American writer Philip José Farmer (1918–2009), first published in August 1952 in Startling Stories. In 1961, the work was expanded and published as a stand-alone softcover novel by Ballantine Books. In 1979, the novel was reissued by Ballantine as a Del Rey Classic in a final revised ("definitive") edition. Hailed by the science fiction community as a bold and pioneering exploration of religion and sexuality, the original version won Farmer the Hugo Award for "Best New SF Author or Artist" in 1953. Plot summary In the 31st century, the military starship Gabriel has arrived at the distant planet Ozagen. Its all-male crew represents the Haijac Union, where the "American" language is spoken. The Union is one of three powers dominating the planet Earth (the others are the Malay Federation and the Israeli Republics) since the Apocalyptic War, hundreds of years ago, decimated the Earth's population through bio-warfare. The Union is an extreme theocracy and religious tyranny in which everyone (including spouses) is expected to inform on one another for the slightest infractions. The true mission of the Gabriel, which is secret, may be "gunboat diplomacy" or worse. The Ozagenians – a sentient, but technologically inferior, arthropod (insect-like) race – are known to the Earthmen contemptuously as Wogglebugs, or Wogs. (This is an explicit reference to L. Frank Baum's character Professor Woggle-Bug, who these aliens are said to resemble.) One crew member on the Gabriel, linguist Hal Yarrow, is happy for a mission that has allowed him to escape an unhappy marriage, but Yarrow finds that the worst of Earth has followed him in the form of Pornsen, his personal minder ("Guardian Angel"), vigilant for any evidence of sinful deeds or even wrong thinking. Conditioned by a lifetime of submission, Yarrow attempts to lose himself in the study of the Ozagen language. On a visit to ancient ruins built by long vanished mammalian humanoids, he e
https://en.wikipedia.org/wiki/Sonia%20Pa%C3%A7o-Rocchia
Sonia Paço-Rocchia /so.ˈnja ˈpa.so ˈrɔ.kja/, born in 1982 in Montreal, is a composer, multidisciplinary artist, improviser, bassoonist and creative coder. Biography After graduating in mixed music composition from the Université de Montréal in 2005, she began her career in Europe, where she was based mainly in London. She later moved to the Laurentides region in Quebec. Her work has been shown in a dozen countries, including Canada, England and Belgium. In 2019, she became the first woman to receive the in the "Creation of the Year" category. Sonia Paço-Rocchia's approach focuses on sounds, timbres and open musical forms. Her research includes the exploitation and expansion of the sound palettes of instruments through playing techniques or live electronics, inventing her own instruments and instrument automatons. Her pieces frequently involve a visual or theatrical aspect. Sonia Paço-Rocchia is bassoonist-improviser with the . She regularly improvises with chamber ensembles, such as The Fantastique Quintet, VibraLib and ZzCc or as a soloist. Work Compositions Justine et les machines, opera on a libretto by Marie-Ève Bouchard, commissioned by the 3 FEMMES prize by Mécénat Musica (2021-2022) Trouée, a work for baritone saxophone (doubling on piccolo), contrabass clarinet, two Tables de Babel (instrument by ), Orgue de sirène (instrument by ), percussion including an electric Lame, a Stemsaw and a Flex-a-tone on stand, and multi-channel live electronics. This work was commissioned by and , and was nominated as a finalist for the "Creation of the Year" in 2020 (2019) Ode au métal is a work for saxophone quartet performing inside an installation, made of large metal pieces augmented with electronics, as well as quadraphonic live electronics. This work was commissioned by the saxophone quartet . Ode au métal was awarded two , "Creation of the Year" and "Concert of the Year, New Music, Electroacoustic" as well as an Excellence award in performing arts from
https://en.wikipedia.org/wiki/Comparison%20of%20OTP%20applications
The following is a general comparison of OTP applications that are used to generate one-time passwords for two-factor authentication (2FA) systems using the time-based one-time password (TOTP) or the HMAC-based one-time password (HOTP) algorithms. Authenticated implementations See also Password manager List of password managers References Computer access control Authentication methods Password authentication
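The TOTP and HOTP algorithms that these applications implement are specified in RFC 6238 and RFC 4226 respectively, and the core HMAC-truncation step is simple enough to sketch with only the Python standard library. The function names below are mine; the secret is the shared ASCII test-vector key "12345678901234567890" from the RFCs:

```python
import base64, hmac, struct, time

def hotp(secret_b32, counter, digits=6):
    """HOTP (RFC 4226): HMAC-SHA1 over a 64-bit counter, dynamically truncated."""
    key = base64.b32decode(secret_b32, casefold=True)
    mac = hmac.new(key, struct.pack(">Q", counter), "sha1").digest()
    offset = mac[-1] & 0x0F
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret_b32, at=None, step=30, digits=6):
    """TOTP (RFC 6238): HOTP applied to the current 30-second time step."""
    t = int(time.time() if at is None else at)
    return hotp(secret_b32, t // step, digits)

# RFC test-vector secret: ASCII "12345678901234567890", base32-encoded
secret = base64.b32encode(b"12345678901234567890").decode()
print(totp(secret, at=59))  # → 287082
```

An authenticator app and a server that share the base32 secret will produce the same 6-digit code within each 30-second window, which is what makes offline code generation possible.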
https://en.wikipedia.org/wiki/List%20of%20linear%20ordinary%20differential%20equations
This is a list of named linear ordinary differential equations.

A–Z
{| class="wikitable sortable" style="background: white; color: black; text-align: left"
|-style="background: #eee"
!Name
!Order
!Equation
!Applications
|-
|Airy
|2
|$y'' - xy = 0$
|Optics
|-
|Bessel
|2
|$x^2 y'' + x y' + (x^2 - \alpha^2) y = 0$
|Wave propagation
|-
|Cauchy–Euler
|n
|$x^n y^{(n)} + a_{n-1} x^{n-1} y^{(n-1)} + \dots + a_0 y = 0$
|
|-
|Chebyshev
|2
|$(1 - x^2) y'' - x y' + n^2 y = 0$
|Orthogonal polynomials
|-
|Damped harmonic oscillator
|2
|$y'' + 2\zeta\omega_0 y' + \omega_0^2 y = 0$
|Damping
|-
|Frenet–Serret
|1
|$\frac{d\mathbf{T}}{ds} = \kappa\mathbf{N},\ \frac{d\mathbf{N}}{ds} = -\kappa\mathbf{T} + \tau\mathbf{B},\ \frac{d\mathbf{B}}{ds} = -\tau\mathbf{N}$
|Differential geometry
|-
|General Laguerre
|2
|$x y'' + (\alpha + 1 - x) y' + n y = 0$
|Hydrogen atom
|-
|General Legendre
|2
|$(1 - x^2) y'' - 2x y' + \left[\ell(\ell+1) - \frac{m^2}{1-x^2}\right] y = 0$
|
|-
|Harmonic oscillator
|2
|$y'' + \omega^2 y = 0$
|Simple harmonic motion
|-
|Heun
|2
|$y'' + \left[\frac{\gamma}{x} + \frac{\delta}{x-1} + \frac{\epsilon}{x-a}\right] y' + \frac{\alpha\beta x - q}{x(x-1)(x-a)} y = 0$
|
|-
|Hill
|2
|$y'' + f(t) y = 0$, (f periodic)
|Physics
|-
|Hypergeometric
|2
|$x(1-x) y'' + [c - (a+b+1)x] y' - ab\, y = 0$
|
|-
|Kummer
|2
|$x y'' + (b - x) y' - a y = 0$
|
|-
|Laguerre
|2
|$x y'' + (1 - x) y' + n y = 0$
|
|-
|Legendre
|2
|$(1 - x^2) y'' - 2x y' + \ell(\ell+1) y = 0$
|Orthogonal polynomials
|-
|Matrix
|1
|$\mathbf{y}'(t) = A(t)\,\mathbf{y}(t)$
|
|-
|Picard–Fuchs
|2
|
|Elliptic curves
|-
|Riemann
|2
|
|
|-
|Quantum harmonic oscillator
|2
|$-\frac{\hbar^2}{2m}\psi'' + \frac{1}{2}m\omega^2 x^2 \psi = E\psi$
|Quantum mechanics
|-
|Sturm–Liouville
|2
|$\frac{d}{dx}\!\left[p(x)\frac{dy}{dx}\right] + q(x)y = -\lambda w(x) y$
|Applied mathematics
|}

See also
List of nonlinear ordinary differential equations
List of nonlinear partial differential equations
List of named differential equations
Equations, differential, ordinary, linear
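As a quick sanity check on one of the named equations, the Legendre polynomial $P_2(x) = (3x^2 - 1)/2$ should satisfy Legendre's equation $(1 - x^2)y'' - 2xy' + \ell(\ell+1)y = 0$ with $\ell = 2$. A minimal Python verification by direct substitution:

```python
# P2(x) = (3x^2 - 1)/2 and its derivatives
def P2(x):   return 1.5 * x * x - 0.5
def dP2(x):  return 3.0 * x
def d2P2(x): return 3.0

# Residual of (1 - x^2) y'' - 2 x y' + l(l+1) y with l = 2 should vanish
for x in (-0.9, -0.3, 0.0, 0.5, 0.8):
    residual = (1 - x * x) * d2P2(x) - 2 * x * dP2(x) + 6 * P2(x)
    assert abs(residual) < 1e-12
print("P2 satisfies Legendre's equation")
```

The same substitution check works for any of the polynomial solutions in the list (Chebyshev, Laguerre, Hermite-type families).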
https://en.wikipedia.org/wiki/Wu%E2%80%93Yang%20dictionary
In topology and high energy physics, the Wu–Yang dictionary refers to the mathematical identification that allows back-and-forth translation between the concepts of gauge theory and those of differential geometry. It was devised by Tai Tsun Wu and C. N. Yang in 1975 when studying the relation between electromagnetism and fiber bundle theory. This dictionary has been credited with bringing mathematics and theoretical physics closer together. A crucial example of the success of the dictionary is that it allowed the understanding of monopole quantization in terms of Hopf fibrations. History In 1931, Paul Dirac published his quantization conditions for magnetic monopoles. Unaware of any connection, the same year, mathematician Heinz Hopf independently proposed his eponymous fibration of the 3-sphere. Equivalences between fiber bundle theory and gauge theory were hinted at toward the end of the 1960s. In 1967, mathematician Andrzej Trautman started a series of lectures aimed at physicists and mathematicians at King's College London regarding these connections. Theoretical physicists Tai Tsun Wu and C. N. Yang, working at Stony Brook University, published a paper in 1975 on the mathematical framework of electromagnetism and the Aharonov–Bohm effect in terms of fiber bundles. A year later, mathematician Isadore Singer came to visit and brought a copy back to the University of Oxford. Singer showed the paper to Michael Atiyah and other mathematicians, sparking a close collaboration between physicists and mathematicians. Yang also recounts a conversation that he had with one of the mathematicians that founded fiber bundle theory, Shiing-Shen Chern: Using these equivalences, Trautman demonstrated an equivalence between the Dirac quantization condition and the Hopf fibration in 1977. Mathematician Jim Simons, discussing this equivalence with Yang, expressed that “Dirac had discovered trivial and nontrivial bundles before mathematicians.” Description Summarized version The Wu-Yang dictionary r
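The commonly cited core entries of the dictionary (a paraphrase of the 1975 correspondence table, not a verbatim reproduction of it) pair gauge-theoretic terms with bundle-theoretic ones:

```latex
\begin{tabular}{ll}
\textbf{Gauge theory} & \textbf{Differential geometry} \\
\hline
gauge group & structure group of a principal bundle \\
gauge potential $A_\mu$ & connection on a principal bundle \\
gauge transformation & change of bundle trivialization (transition function) \\
field strength $F_{\mu\nu}$ & curvature of the connection \\
phase factor & parallel transport (holonomy) \\
Dirac monopole quantization & classification of $U(1)$ bundles by the first Chern class \\
\end{tabular}
```

The last row is the translation that turns Dirac's 1931 quantization condition into a statement about the topology of circle bundles over the 2-sphere, i.e. the Hopf fibration.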
https://en.wikipedia.org/wiki/Traditional%20Phenological%20Knowledge
Traditional Phenological Knowledge can be seen as a "subset of Indigenous Knowledge". Traditional Phenological Knowledge (TPK) is the knowledge based on traditional observations made by Indigenous Peoples that predict seasonal changes in nature and their immediate environment. This can be useful for the management of naturally occurring phenomena, as well as "adaptive management" such as fire management. TPK is not a novel practice and has been practised for hundreds of years. TPK encompasses history, observations and Traditional Knowledge (TK) or Indigenous Knowledge (IK). Indigenous Knowledge is flexible and always evolving. It considers the past, present and future of environmental and biological generations. TPK is integrative and interactive. It falls under the same teachings as Traditional Ecological Knowledge, also known as TEK. Both TPK and TEK share close definitions, for which IK can serve as an umbrella term. Traditional forms of knowledge are combined with sustainable interaction with the land. Indigenous knowledge creates a relationship that is respectful and symbiotic with the natural world and promotes the passing on of hands-on experiences to future generations. Phenology in TPK can be qualitative and quantitative. Observations can be described and passed down through oral histories. TPK can reinforce what is measured and recorded scientifically. TPK can be a tool to help address climate change and biodiversity loss in today's climate crisis. TPK can be "direct" or "indirect". Direct observations of phenology in TPK can refer to species signals and the timings of secondary species. Direct TPK is transmitted through belief systems, spirituality, stories, myth and ceremonial events. Indirect TPK is passed on specifically through language. The use of both direct and indirect TPK embodies, reinforces and defines the values of TPK. The observation of nature's timings, along with stories and beliefs, passes down the knowledge from elders and family members t
https://en.wikipedia.org/wiki/Aporia%20%28company%29
Aporia is a machine learning observability platform based in Tel Aviv, Israel. The company has a US office located in San Jose, California. Aporia has developed software for monitoring and controlling undetected defects and failures, used by other companies to detect and report anomalies and to warn of faults in their early stages. History Aporia was founded in 2019 by Liran Hason and Alon Gubkin. In April 2021, the company raised a $5 million seed round for its monitoring platform for ML models. In February 2022, the company closed a Series A round of $25 million for its ML observability platform. Aporia was named by Forbes as a Next Billion-Dollar Company in June 2022. In November, the company partnered with ClearML, an MLOps platform, to improve ML pipeline optimization. In January 2023, Aporia launched Direct Data Connectors (DDC), a technology allowing organizations to monitor their ML models in minutes (previously, the process of integrating ML monitoring into a customer's cloud environment took weeks or more). DDC enables users to connect Aporia to their preferred data source and monitor all of their data at once, without data sampling or data duplication (which is a significant security risk for major organizations). In April 2023, Aporia announced a partnership with Amazon Web Services (AWS) to provide more reliable ML observability to AWS customers by deploying Aporia's architecture in their AWS environment, allowing customers to monitor their models in production regardless of platform. References Companies based in Tel Aviv Machine learning Israeli companies established in 2019 Information technology companies of Israel
https://en.wikipedia.org/wiki/Golden%20record%20%28informatics%29
In informatics, a golden record is the valid version of a data element (record) in a single source of truth system. It may refer to a database, specific table or data field, or any unit of information used. A golden copy is a consolidated data set, and is supposed to provide a single source of truth and a "well-defined version of all the data entities in an organizational ecosystem". Other names sometimes used include master source or master version. The term has been used in conjunction with data quality, master data management, and similar topics. (Different technical solutions exist, see master data management). Master data In master data management (MDM), the golden copy refers to the master data (master version) of the reference data which works as an authoritative source for the "truth" for all applications in a given IT landscape. See also Single source of truth Single version of the truth References Data quality Data management Data modeling
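Consolidating duplicates into a golden copy is typically done with field-level "survivorship" rules. The following Python sketch illustrates one such rule, most-recent non-null value wins; the source systems, field names, and records are all invented for the example:

```python
from datetime import date

# Hypothetical duplicate records for one customer from three source systems
records = [
    {"source": "crm",     "updated": date(2023, 1, 5),  "email": "a@example.com",   "phone": None},
    {"source": "billing", "updated": date(2023, 3, 1),  "email": None,              "phone": "555-0101"},
    {"source": "web",     "updated": date(2022, 11, 2), "email": "old@example.com", "phone": "555-0199"},
]

def golden_record(dupes):
    """Survivorship rule: per field, keep the most recently updated non-null value."""
    ordered = sorted(dupes, key=lambda r: r["updated"], reverse=True)
    return {field: next((r[field] for r in ordered if r[field] is not None), None)
            for field in ("email", "phone")}

print(golden_record(records))  # → {'email': 'a@example.com', 'phone': '555-0101'}
```

Real MDM systems layer more rules on top (trusted-source ranking, format validation, manual stewardship), but the principle is the same: one authoritative value per field.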
https://en.wikipedia.org/wiki/XGC88000%20crawler%20crane
The XGC88000 crawler crane is an extremely large ultraheavy crawler crane made by XCMG. With a lifting capacity of 3,600 to 4,000 tons, a total boom length of 144 meters and a total gross weight of 5,350 tons, the XGC88000 became the largest tracked mobile crane in the world, beating out the previous record holder, the Liebherr LR 13000, when it officially entered production in 2013. However, when it comes to absolute size, movability and strength, the title still goes to the Honghai Crane, which runs on rails. It is also among the largest ground vehicles in current operation, and, by its official production in 2013, became the largest self-propelled ground vehicle by gross size, beating out the NASA crawler-transporters. Design specifications The XGC88000 crawler crane, unlike the majority of crawler cranes, comes in two sections. The primary section consists of the crane itself, which boasts a maximum boom length of 144 meters, a maximum total length of 173 meters (including the counterweight radius), a maximum height (when fully erect) of 108 meters, a lifting capacity ranging between 3,600 and 4,000 tons (although it has managed to lift a maximum overload of 4,500 tons), and a maximum load moment of 88,000 tonne-meters. The vehicle itself is powered by three 641 kW (860 hp) U.S. Cummins engine units outputting a total power of 1,923 kW. Each power unit can act as a mobile hydraulic power working station. Moreover, they can also work as an additional power source during the crane assembly/disassembly process to improve assembly efficiency, as well as act as spare units for each other. The crane driver sits in a large, spacious cabin the size of a large office room. The cabin has an air conditioner, a seat and a small sofa to accommodate three additional passengers. The second section is a separate tracked compartment which holds the crane's entire counterweight. The counterweight has a total height of 9.7 met
https://en.wikipedia.org/wiki/The%20Longevity%20Diet
The Longevity Diet is a 2018 book by Italian biogerontologist Valter Longo. The subject of the book is fasting and longevity. The book advocates a fasting mimicking diet (FMD) coupled with a low-protein, plant-based diet. The book advises people on how to achieve a longer lifespan and healthspan through fasting and diet. Background Valter Longo, a PhD in biochemistry and director of the Longevity Institute at the University of Southern California, invented the fasting mimicking diet. Longo has said, "Using epidemiology and clinical trials, we put all the research together..." The diet calls for an emphasis on consuming fatty fish and seafood, together with fasting, timing and food quantity. Synopsis In the book, Longo says one should alter one's diet to avoid illness in old age. He advises dieters to start the diet with a five-day fasting mimicking diet (FMD), which calls for a vegan diet with calorie restriction of between 800 and 1,100 calories per day. After the initial five-day period, Longo advises dieters to eat within a 12-hour window each day. The book calls for the five-day, calorie-restricted FMD to occur twice per year. Before turning 65, the diet calls for minimal protein and a mostly plant-based diet augmented with calorie restriction. Reception The book is an international bestseller, has been translated into more than 15 languages, and is sold in more than 20 countries. Writing for Red Pen Reviews, Hilary Bethancourt stated the diet might be difficult and expensive to follow, and that there is limited research on the long-term effects of the diet. Bethancourt goes on to say that the book gives advice about how to have a longer lifespan and healthspan through the practice of following a five-day fasting-mimicking diet and by choosing what to eat, how much to eat, and how often to eat. Reviewing the book for Glam Adelaide, James Murphy stated: "Longo's radical claims have not been accepted entirely
https://en.wikipedia.org/wiki/Colombian%20Air%20Force%20One
Colombian Air Force One, abbreviated FAC-0001, is the registration number and call sign that air traffic control assigns to the main aircraft in the service of the President of Colombia, a Boeing 737-700 with the Boeing Business Jet configuration. It is also known by the name "República de Colombia 1". It is internationally recognized as one of the few militarized presidential aircraft with NATO E-4 status, which represents the highest level of protection. It is monitored by Israeli and American satellites, in addition to having a fourth-degree nuclear ballistic capacity. History Over time, Colombia has had several aircraft in the service of the president of Colombia. The first Colombian ruler to fly in an airplane during his tenure was Pedro Nel Ospina, on an official mission in August 1922. First presidential aircraft In 1933, the first aircraft entered service, the FAC 625 Junkers Ju-52/3mge, which made its maiden voyage with President Enrique Olaya Herrera. This German-made aircraft had a capacity for three crew members and 20 passengers. It was used until 1950, transporting Enrique Olaya Herrera, Alfonso López Pumarejo, Eduardo Santos and Mariano Ospina Pérez. As an alternative to this plane, the Douglas C-47 Skytrain FAC 660, used by presidents Gustavo Rojas Pinilla, Alberto Lleras Camargo, Guillermo León Valencia and Carlos Lleras Restrepo, was also used. Similarly, a Lockheed C-60 Lodestar, FAC 654, was used by Alfonso López Pumarejo. In 1954 came the Douglas C-54 Skymaster FAC 613, later numbered FAC 690, manufactured in the United States. It served during the terms of General Gustavo Rojas Pinilla, Alberto Lleras Camargo, Guillermo León Valencia, Carlos Lleras Restrepo and Misael Pastrana Borrero. It was retired in 1971. Fokker F-28 The third presidential plane that the Air Force had was the Fokker F-28, which entered service on February 19, 1971, and whose manufacturer is the Dutch company Fokker. It was used, among other missions, to transport the remains o
https://en.wikipedia.org/wiki/Generative%20artificial%20intelligence
Generative artificial intelligence (also generative AI or GenAI) is artificial intelligence capable of generating text, images, or other media, using generative models. Generative AI models learn the patterns and structure of their input training data and then generate new data that has similar characteristics. In the early 2020s, advances in transformer-based deep neural networks enabled a number of generative AI systems notable for accepting natural language prompts as input. These include large language model chatbots such as ChatGPT, Bing Chat, Bard, and LLaMA, and text-to-image artificial intelligence art systems such as Stable Diffusion, Midjourney, and DALL-E. Generative AI has uses across a wide range of industries, including art, writing, script writing, software development, product design, healthcare, finance, gaming, marketing, and fashion. Investment in generative AI surged during the early 2020s, with large companies such as Microsoft, Google, and Baidu as well as numerous smaller firms developing generative AI models. However, there are also concerns about the potential misuse of generative AI, including cybercrime or creating fake news or deepfakes which can be used to deceive or manipulate people. History The academic discipline of artificial intelligence was founded at a research workshop at Dartmouth College in 1956, and has experienced several waves of advancement and optimism in the decades since. Since its founding, researchers in the field have raised philosophical and ethical arguments about the nature of the human mind and the consequences of creating artificial beings with human-like intelligence; these issues have previously been explored by myth, fiction and philosophy since antiquity. These concepts of automated art date back at least to the automata of ancient Greek civilization, where inventors such as Daedalus and Hero of Alexandria were described as having designed machines capable of writing text, generating sounds, and playing
https://en.wikipedia.org/wiki/Ingrid%20Hornef
Ingrid Hornef (born August 12, 1940) is a German sculptor, installation artist, curator and painter. She is a representative of Concrete art and became best known for using a dice as a random number generator in her series of works Alea iacta est (Latin for the die is cast). Life Ingrid Hornef acquired her professional skills as an autodidact and opened her own studio in 1985. In her early days, she was mainly concerned with pottery and attended ceramic courses in 1984 and 1985 with the well-known Japanese ceramic artist Takeshi Yasuda. In 1992 she took part in workshops by Mária Geszler Garzuly in Kecskemét , Hungary. Besides her work as an artist, Hornef is also active as a curator. In 2002, she initiated the temporary sculpture trail Land schafft Kunst (Land Creates Art) as part of the Rhine-Main Regional Park between Hochheim-Massenheim and Wiesbaden-Delkenheim. The area is a typical industrial landscape with seemingly intact nature, but at the same time there are high-rise buildings and high-voltage power lines that dominate the landscape. Hornef invited 21 artists to create their works especially for this place. In July 2019, Hornef organized a German-Greek–WWII–memorial and peace project named Building Bridges, together with city councilor Annette Courtis, who has Greek roots and who, like Hornef, lives in Hofheim am Taunus. In a small village called Chouni (municipal district of Agrinio), Greece, 16 German and Greek artists created works of art from sandstone under the direction of Hornef, in which they dealt with the history of Chouni during World War II. The village had been almost completely burned down by German soldiers in 1944. Parallel to this, lectures on the effects of the Second World War in Greece were held. The resulting sculptures were permanently installed on the village square in Chouni. In July 2021, a corresponding workshop took place in Hofheim, again led by Hornef. Here, too, the invited German and Greek artists formed works of art in
https://en.wikipedia.org/wiki/Round%20%28cryptography%29
In cryptography, a round or round function is a basic transformation that is repeated (iterated) multiple times inside the algorithm. Splitting a large algorithmic function into rounds simplifies both implementation and cryptanalysis. For example, encryption using an oversimplified three-round cipher can be written as $C = R_3(R_2(R_1(P)))$, where $C$ is the ciphertext and $P$ is the plaintext. Typically, rounds are implemented using the same function, parameterized by the round constant and, for block ciphers, the round key from the key schedule. Parameterization is essential to reduce the self-similarity of the cipher, which could lead to slide attacks. Increasing the number of rounds "almost always" protects against differential and linear cryptanalysis, as for these tools the effort grows exponentially with the number of rounds. However, increasing the number of rounds does not always make weak ciphers into strong ones, as some attacks do not depend on the number of rounds. The idea of an iterative cipher using repeated application of simple non-commuting operations producing diffusion and confusion goes as far back as 1945, to the then-secret version of C. E. Shannon's work "Communication Theory of Secrecy Systems"; Shannon was inspired by mixing transformations used in the field of dynamical systems theory (cf. horseshoe map). Most modern ciphers use an iterative design with the number of rounds usually chosen between 8 and 32 (with 64 and even 80 used in cryptographic hashes). For some Feistel-like cipher descriptions, notably that of RC5, the term "half-round" is used to define the transformation of part of the data (a distinguishing feature of the Feistel design). This operation corresponds to a full round in traditional descriptions of Feistel ciphers (like DES). Round constants Inserting round-dependent constants into the encryption process breaks the symmetry between rounds and thus thwarts the most obvious slide attacks. The technique is a standard feature of most mo
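The iterated-round structure, and the use of round constants to break inter-round symmetry, can be illustrated with a toy 16-bit Feistel cipher in Python. This is insecure and purely illustrative; the round function, key values, and constant schedule are all invented for the example:

```python
MASK16 = 0xFFFF

def f(half, round_key, round_const):
    # Toy round function: key addition, round-constant injection, shift-xor mixing.
    x = (half + round_key + round_const) & MASK16
    return (x ^ ((x << 5) & MASK16) ^ (x >> 3)) & MASK16

def feistel_encrypt(block, round_keys):
    # E = R3(R2(R1(P))): the same round function iterated with different parameters.
    left, right = block >> 16, block & MASK16
    for i, rk in enumerate(round_keys):
        left, right = right, left ^ f(right, rk, round_const=i + 1)
    return (left << 16) | right

def feistel_decrypt(block, round_keys):
    # Feistel structure: decryption runs the same rounds in reverse order.
    left, right = block >> 16, block & MASK16
    for i, rk in reversed(list(enumerate(round_keys))):
        right, left = left, right ^ f(left, rk, round_const=i + 1)
    return (left << 16) | right

keys = [0x1A2B, 0x3C4D, 0x5E6F]   # three invented round keys
ct = feistel_encrypt(0xDEADBEEF, keys)
assert feistel_decrypt(ct, keys) == 0xDEADBEEF
print(hex(ct))
```

Note that the Feistel structure is invertible even though `f` itself is not, and that without the `round_const` term every round would be the same keyed function, exactly the self-similarity that slide attacks exploit.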
https://en.wikipedia.org/wiki/Red%20Sea%20brine%20pool%20microbiology
The Red Sea and its extensions, the Gulf of Suez and the Gulf of Aqaba, contain the largest recorded concentration of deep-sea brine pools on the planet. These pools have many features that make them uninhabitable to almost all organisms on the planet; yet certain communities of microbes thrive within these extreme environments, which have temperatures ranging from 2.0 °C up to a high of 75 °C. The Red Sea brine pools have extreme salinity concentrations and varying compositions of nutrients, chemical properties and molecules that directly affect the microbiome across the estimated 25 individual pools in the region, some of which are closely clustered together in groups, leaving their naming and classification unsettled. The brine pools in the region originate from hydrothermal vents and the shifting of tectonic plates; water with properties that make it unsuitable for mixing accumulates within faults and depressions in the sea floor. Atlantis II Deep, Discovery Deep and Kebrit Deep are the most investigated and researched brine pools among the many located within the Red Sea. Additionally, many microbial species form beneficial symbiotic relationships with organisms living and feeding in proximity to the pools. These relationships allow for the study of specialised adaptations of microbes to brine pool environments. List In addition to the originally-discovered warm brine pools, recent discoveries have found four smaller warm brine pools, named the NEOM Brine Pools, located in the Gulf of Aqaba. Furthermore, multiple cold seeps have been identified in the region of the Red Sea (the Thuwal Cold Seeps), consisting of two individual pools. Three of these Red Sea brine pools are unnamed, as they are small and potentially extensions of other nearby larger pools. Viral diversity Composition The virus community within the many Red Sea brine pools is largely unexplored. However, with the use of metagenomics, viral communitie
https://en.wikipedia.org/wiki/Confidential%20computing
Confidential computing is a security and privacy-enhancing computational technique focused on protecting data in use. Confidential computing can be used in conjunction with storage and network encryption, which protect data at rest and data in transit respectively. It is designed to address software, protocol, cryptographic, and basic physical and supply-chain attacks, although some critics have demonstrated architectural and side-channel attacks effective against the technology. The technology protects data in use by performing computations in a hardware-based trusted execution environment (TEE). Confidential data is released to the TEE only once it is assessed to be trustworthy. Different types of confidential computing define the level of data isolation used, whether virtual machine, application, or function, and the technology can be deployed in on-premise data centers, edge locations, or the public cloud. It is often compared with other privacy-enhancing computational techniques such as fully homomorphic encryption, secure multi-party computation, and Trusted Computing. Confidential computing is promoted by the Confidential Computing Consortium (CCC) industry group, whose membership includes major providers of the technology. Properties Trusted execution environments (TEEs) "prevent unauthorized access or modification of applications and data while they are in use, thereby increasing the security level of organizations that manage sensitive and regulated data". Trusted execution environments can be instantiated on a computer's processing components such as a central processing unit (CPU) or a graphics processing unit (GPU). In their various implementations, TEEs can provide different levels of isolation including virtual machine, individual application, or compute functions. Typically, data in use in a computer's compute components and memory exists in a decrypted state and can be vulnerable to examination or tampering by unauthorized software or administra
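The "released to the TEE only once it is assessed to be trustworthy" step is the attestation gate, which can be modeled abstractly. The Python sketch below is a toy: real TEEs present signed hardware attestation reports, not bare hashes, and every name and value here is invented for illustration:

```python
import hashlib

# Toy attestation gate: the data owner (verifier) releases a secret only to a
# TEE whose reported measurement matches an expected build hash. Real systems
# verify a signature chain rooted in the hardware vendor; this is illustrative.
EXPECTED_MEASUREMENT = hashlib.sha256(b"enclave-build-v1.2").hexdigest()
DATA_KEY = b"secret-wrapping-key"   # invented stand-in for a real data key

def release_key(reported_measurement):
    """Verifier policy: release the key only for an exact measurement match."""
    if reported_measurement == EXPECTED_MEASUREMENT:
        return DATA_KEY
    return None

good = hashlib.sha256(b"enclave-build-v1.2").hexdigest()
bad = hashlib.sha256(b"tampered-build").hexdigest()
print(release_key(good) is not None, release_key(bad) is None)  # → True True
```

The design point is that the decision to release data is made by the data owner's policy against a measurement of the code, not by trusting the host operating system or cloud operator.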
https://en.wikipedia.org/wiki/Marie%20Lhuissier
Marie Lhuissier (born 1989) is a French mathematician, mathematical story-teller, and children's book author. Education and work Lhuissier earned a Ph.D. at the École normale supérieure de Lyon in 2018, with the dissertation Le problème mathématique des trois corps, abordé simultanément sous l'angle de la recherche théorique et celui de la diffusion auprès de publics variés concerning both the theory and public dissemination of research on the three-body problem in celestial mechanics, jointly directed by Étienne Ghys and Christian Mercat. After completing her doctorate, she decided to devote herself to popularizing mathematics through telling stories to children. Her philosophy is that by identifying with the subjects of her stories, who are forced to use mathematics creatively to solve their problems, children can come to learn of mathematics as a dynamic and creative subject, rather than one that is static and abstract. Books Lhuissier has published some of her stories in book form, including: Lune (illustrated by Elis Tamula, 2017) La Faiseuse de Neige (illustrated by Elis Tamula, 2018) Recognition The Société mathématique de France gave Lhuissier the 2022 for the dissemination of mathematics. References External links Home page 1989 births Living people French mathematicians French women mathematicians Mathematics popularizers
https://en.wikipedia.org/wiki/Photoconductance%20decay
Photoconductance decay or photoconductivity decay (PCD or PC) is a non-destructive analytical technique used to measure the lifetime of minority charge carriers in a semiconductor, especially in silicon wafers. The technique studies the transient photoconductivity of a semiconductor sample during or after it is illuminated by a light pulse. Electron–hole pairs are first generated by the light pulse, and the photoconductivity of the sample declines as the carriers recombine. PCD is an important characterisation step in determining the quality and expected performance of wafers before they are used to fabricate devices such as integrated circuits or solar cells. It is one of the most common methods of determining carrier lifetimes. PCD uses a fast light source (e.g. a xenon flash lamp) to excite the test sample, causing free carriers to be generated. Excess carriers in the material cause it to become more conductive, and thus the number of excess carriers (Δn) can be measured over time by measuring the material's conductivity. Conductivity can be measured through non-contact methods, such as microwave reflectance or inductive or capacitive coupling. A higher effective lifetime of minority charge carriers indicates that they can remain mobile in the wafer for a longer period before undergoing recombination. History Characterisation of minority carrier lifetimes through measurement of photoconductance decay was a technique used by Bell Laboratories as early as 1954 on silicon and germanium wafers during investigations of carrier trapping. A detailed method for measuring PCD was published soon after by MIT Lincoln Laboratory in 1955. A standard method for PCD was described in ASTM standards in 1971 for the measurement of minority carrier lifetimes. A new method for quasi-steady-state photoconductance measurements was described in 1996 by Ronald Sinton. Theory The difference in dark and excited photoconductivity of the wafer is typically measured through monito
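The lifetime-extraction idea can be illustrated with a synthetic transient: for a purely exponential decay of the photoconductance, $\sigma(t) \propto e^{-t/\tau}$, the carrier lifetime is recovered from the slope of $\ln\sigma$ versus $t$. A minimal Python sketch (simulated data, assuming an ideal single-exponential decay with no trapping or injection-level dependence):

```python
import math

# Simulated transient: normalized photoconductance decaying with tau = 50 µs
TAU_US = 50.0
times = [i * 5.0 for i in range(40)]             # 0–195 µs, 5 µs steps
sigma = [math.exp(-t / TAU_US) for t in times]

def fit_lifetime(t, s):
    """Least-squares slope of ln(sigma) vs t; the lifetime is -1/slope."""
    logs = [math.log(v) for v in s]
    n = len(t)
    mt, ml = sum(t) / n, sum(logs) / n
    slope = sum((a - mt) * (b - ml) for a, b in zip(t, logs)) \
            / sum((a - mt) ** 2 for a in t)
    return -1.0 / slope

print(round(fit_lifetime(times, sigma), 1))  # → 50.0
```

Real measurements require converting conductance to excess carrier density first, and the effective lifetime generally varies with injection level rather than being a single constant.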
https://en.wikipedia.org/wiki/Earnings%20at%20risk
Earnings at risk (EaR) and the related cash flow at risk (CFaR) are measures reflecting the potential impact of market risk on the income statement and cash flow statement respectively, and hence the risk to the institution's return on assets and, ultimately, return on equity. EaR measures the impact on net interest income due to movements in foreign exchange and interest rates, while CFaR measures possible shortfalls in cash flow due to these movements. Both are typically calculated by simulation, as for value at risk (VaR). References Mathematical finance Financial risk modeling Market risk Monte Carlo methods in finance Capital management
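A minimal Monte Carlo sketch of the calculation (all balance-sheet figures, spreads, and volatilities are invented for illustration): simulate parallel interest-rate shocks, compute the resulting net interest income (NII) distribution, and take EaR as the gap between expected NII and a low percentile:

```python
import random

random.seed(42)

# Hypothetical balance sheet (illustrative figures, in $ millions)
ASSETS, LIABILITIES = 1_000.0, 800.0   # rate-sensitive balances
BASE_RATE = 0.03                       # current short rate
RATE_VOL = 0.01                        # std dev of a one-year parallel rate shock

def net_interest_income(rate):
    # Simplified repricing: assets earn rate + 2% spread, liabilities pay rate.
    return ASSETS * (rate + 0.02) - LIABILITIES * rate

# Monte Carlo simulation of NII under random rate shocks, as for VaR
sims = sorted(net_interest_income(BASE_RATE + random.gauss(0, RATE_VOL))
              for _ in range(10_000))
expected = sum(sims) / len(sims)
worst_5pct = sims[len(sims) // 20]     # 5th percentile of simulated NII
ear = expected - worst_5pct            # earnings at risk at 95% confidence
print(f"EaR(95%): {ear:.2f}")
```

A production model would simulate correlated rate and FX paths and reprice each position along its repricing schedule, but the percentile-of-a-simulated-income-distribution structure is the same.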
https://en.wikipedia.org/wiki/Kathryn%20Peddrew
Kathryn Peddrew (June 14, 1922 – March 4, 2012) was an African-American mathematician, engineer, and scientist who played a crucial role at the National Advisory Committee for Aeronautics (NACA) and the National Aeronautics and Space Administration (NASA). She was one of the African-American women who worked as a "human computer" at NACA's Langley Research Center in the 1940s and 1950s. Early life and education Peddrew was born on June 14, 1922, in Martinsburg, West Virginia. She attended Storer College in her home state of West Virginia. She focused her studies on chemistry and graduated with a chemistry degree in 1943. After college, she began looking for research opportunities. Her first choice was to travel with one of her former professors to New Guinea to study quinine deafness. Unfortunately, these plans fell through, as the research program had made no plans for female housing. Career at NACA and NASA Peddrew saw an advertisement from NACA (later known as NASA) saying that it was hiring chemists. At the time, there was a large increase in women being hired by NACA due to men going overseas to fight in WWII. She decided to apply for this position and was hired. However, when she arrived at the job, she was relocated to the West Area Computing Unit after it was discovered that she was African American. There she worked in the all-black West Building at NACA. She and her colleagues were referred to as the “West Computers”, a group that consisted of Dorothy Vaughan, Mary Jackson, Miriam Daniel Mann, and Peddrew herself. Here she conducted aeronautical and aerospace research, doing the majority of her work in the Instrument Research Division. The unit was responsible for performing complex calculations that were critical to various aeronautical research projects. Despite facing racial segregation and discrimination, Peddrew and her colleagues persisted in their work and contributed to the development of supersonic flight, as well as the early stages of the
https://en.wikipedia.org/wiki/Hydrologic%20unit%20system%20%28United%20States%29
To aid hydrologists, ecologists, and water-resource managers in the study of "water, its properties and laws, and its distribution over the earth's surface" in the United States, the United States Geological Survey created a hierarchical system of hydrologic units. Originally a four-tier system divided into regions, sub-regions, accounting units, and cataloging units, each unit was assigned a unique Hydrologic Unit Code (HUC). As first implemented, the system had 21 regions, 221 subregions, 378 accounting units, and 2,264 cataloging units. Over time the system was changed and expanded. As of 2010 there are six levels in the hierarchy, represented by hydrologic unit codes from 2 to 12 digits long, called regions, subregions, basins, subbasins, watersheds, and subwatersheds. The table below describes the system's hydrologic unit levels and their characteristics, along with example names and codes. The original delineation of units, down to subbasins (cataloging units), was done using 1:250,000 scale maps and data. The newer delineation work on watersheds and subwatersheds was done using 1:24,000 scale maps and data. As a result, the subbasin boundaries were changed and adjusted to conform to the higher-resolution watersheds within them. Changes to subbasin boundaries resulted in changes in area sizes. Therefore, older data using "cataloging units" may differ from newer, higher-resolution data using "subbasins". The regions (1st level hydrologic units) are geographic areas that contain either the drainage area of a major river, such as the Missouri region, or the combined drainage areas of a series of rivers, such as the Texas–Gulf region. Each subregion includes the area drained by a river system, a reach of a river and its tributaries in that reach, a closed basin or basins, or a group of streams forming a coastal drainage area. Regions receive a two-digit code. The following levels are designated by the addition of another two digits. T
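Because each level appends two digits to its parent's code, every prefix of even length in a HUC is itself the code of an enclosing unit. The sketch below decomposes a code into its nested levels; the example HUC value is hypothetical, chosen only to show the digit structure.

```python
# Decompose a hydrologic unit code (HUC) into its nested levels.
# HUCs are 2 to 12 digits long, two digits per hierarchy level:
# region, subregion, basin, subbasin, watershed, subwatershed.

HUC_LEVELS = ["region", "subregion", "basin",
              "subbasin", "watershed", "subwatershed"]

def huc_hierarchy(huc: str) -> dict:
    """Return each ancestor code of a HUC, keyed by level name."""
    if len(huc) % 2 != 0 or not 2 <= len(huc) <= 12:
        raise ValueError("HUC codes are 2 to 12 digits, in steps of 2")
    # Each even-length prefix is the code of an enclosing unit.
    return {HUC_LEVELS[i // 2 - 1]: huc[:i]
            for i in range(2, len(huc) + 1, 2)}

# Hypothetical 12-digit code: region "10", subregion "1027", ...
print(huc_hierarchy("102701010101"))
```

This mirrors the scheme described above: a 2-digit code names a region, a 4-digit code a subregion within it, and so on down to the 12-digit subwatershed.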
https://en.wikipedia.org/wiki/Quasi-free%20algebra
In abstract algebra, a quasi-free algebra is an associative algebra that satisfies a lifting property similar to that of a formally smooth algebra in commutative algebra. The notion was introduced by Cuntz and Quillen for applications to cyclic homology. A quasi-free algebra generalizes a free algebra, as well as the coordinate ring of a smooth affine complex curve. Because of the latter generalization, a quasi-free algebra can be thought of as signifying smoothness on a noncommutative space. Definition Let A be an associative algebra over the complex numbers. Then A is said to be quasi-free if the following equivalent conditions are met: Given a square-zero extension R → R/I (that is, I is a two-sided ideal of R with I² = 0), each homomorphism A → R/I lifts to a homomorphism A → R. The cohomological dimension of A with respect to Hochschild cohomology is at most one. Let ΩA denote the differential envelope of A; i.e., the universal differential-graded algebra generated by A. Then A is quasi-free if and only if Ω¹A is projective as a bimodule over A. There is also a characterization in terms of a connection. Given an A-bimodule E, a right connection on E is a linear map ∇ : E → E ⊗_A Ω¹A that is left A-linear and satisfies the Leibniz rule ∇(ξa) = ∇(ξ)a + ξ ⊗ da. A left connection is defined in a similar way. Then A is quasi-free if and only if Ω¹A admits a right connection. Properties and examples One of the basic properties of a quasi-free algebra is that the algebra is left and right hereditary (i.e., a submodule of a projective left or right module is projective, or equivalently the left or right global dimension is at most one). This puts a strong restriction on which algebras can be quasi-free. For example, a hereditary (commutative) integral domain is precisely a Dedekind domain. In particular, a polynomial ring over a field is quasi-free if and only if the number of variables is at most one. An analog of the tubular neighborhood theorem, called the formal tubular neighborhood theorem, holds for quasi-free algebras. References Bibliography Maxim Kontsevich, Alexander Rosenberg, Noncommutative spaces, prepri
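As a sketch in standard notation (assuming the square-zero-extension formulation of the definition above, with π the quotient map R → R/I), the lifting condition can be written:

```latex
% Lifting property defining a quasi-free algebra A:
% I is a two-sided ideal of R with I^2 = 0, \pi : R \to R/I the quotient.
I^2 = 0, \qquad
\forall\, u : A \to R/I \;\; \exists\, \tilde{u} : A \to R
\ \text{such that}\ \pi \circ \tilde{u} = u .
```

Formal smoothness in commutative algebra is stated the same way, but with A, R, and R/I restricted to commutative rings.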
https://en.wikipedia.org/wiki/Comparison%20of%20platforms%20for%20software%20agents
There are several platforms for software agents, also known as agent development toolkits, which facilitate the development of multi-agent systems. In these platforms, software agents are typically implemented as independent threads that communicate with each other using agent communication languages. Below is a chart intended to capture many of the features that are important to such platforms. Comparison of platforms References Software comparisons
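The agents-as-threads model described above can be sketched with nothing more than the standard library: each agent runs in its own thread and exchanges messages through queues. This is an illustrative stand-in for the message-passing machinery the listed platforms provide, not the API of any particular toolkit; the agent names and message contents are made up.

```python
# Minimal sketch of two software agents as independent threads
# exchanging messages through per-agent inbox queues.

import queue
import threading

def agent(name, inbox, outbox, rounds):
    """Receive `rounds` messages from inbox, acknowledge each on outbox."""
    for _ in range(rounds):
        msg = inbox.get()          # block until a message arrives
        print(f"{name} received: {msg}")
        outbox.put(f"ack from {name}")

a_inbox, b_inbox = queue.Queue(), queue.Queue()
t = threading.Thread(target=agent, args=("agent-B", b_inbox, a_inbox, 1))
t.start()
b_inbox.put("hello from agent-A")   # agent-A sends a message to agent-B
print("agent-A received:", a_inbox.get())
t.join()
```

Real agent platforms layer structured performatives (e.g. an agent communication language such as FIPA-ACL) and directory services on top of this basic send/receive loop.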