https://en.wikipedia.org/wiki/Q-category
In mathematics, a Q-category or almost quotient category is a category that is a "milder version of a Grothendieck site." A Q-category is a coreflective subcategory. The Q stands for quotient. The concept of Q-categories was introduced by Alexander Rosenberg in 1988. The motivation for the notion was its use in noncommutative algebraic geometry; in this formalism, noncommutative spaces are defined as sheaves on Q-categories. Definition A Q-category is given by a pair of adjoint functors u^* ⊣ u_*, where u^* is the left adjoint and u_* is a full and faithful functor. Examples The category of presheaves over any Q-category is itself a Q-category. For any category, one can define the Q-category of cones. There is a Q-category of sieves. References Alexander Rosenberg, Q-categories, sheaves and localization, (in Russian) Seminar on supermanifolds 25, Leites ed., Stockholms Universitet, 1988. Further reading Category theory Noncommutative geometry
https://en.wikipedia.org/wiki/Certificate%20revocation
In public key cryptography, a certificate may be revoked before it expires, which signals that it is no longer valid. Without revocation, an attacker could exploit such a compromised or misissued certificate until expiry. Hence, revocation is an important part of a public key infrastructure. Revocation is performed by the issuing certificate authority, which produces a cryptographically authenticated statement of revocation. For distributing revocation information to clients, the timeliness of the discovery of revocation (and hence the window for an attacker to exploit a compromised certificate) trades off against resource usage in querying revocation statuses and privacy concerns. If revocation information is unavailable (either due to an accident or an attack), clients must decide whether to fail-hard and treat a certificate as if it is revoked (and so degrade availability) or to fail-soft and treat it as unrevoked (and allow attackers to sidestep revocation). Due to the cost of revocation checks and the availability impact from potentially-unreliable remote services, Web browsers limit the revocation checks they will perform, and will fail soft where they do. Certificate revocation lists are too bandwidth-costly for routine use, and the Online Certificate Status Protocol presents connection latency and privacy issues. Other schemes have been proposed but have not yet been successfully deployed to enable fail-hard checking. Glossary of acronyms History The Heartbleed vulnerability, which was disclosed in 2014, triggered a mass revocation of certificates, as their private keys may have been leaked. GlobalSign revoked over 50% of their issued certificates. StartCom was criticised for issuing free certificates but then charging for revocation. A 2015 study found an overall revocation rate of 8% for certificates used on the Web, though this may have been elevated due to Heartbleed. Despite Web security being a priority for most browsers, due to the latency and
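The fail-hard versus fail-soft trade-off described above can be sketched as a small decision function. The names and statuses below are illustrative only, not drawn from any particular TLS library:

```python
from enum import Enum

class Status(Enum):
    GOOD = "good"          # authenticated statement: not revoked
    REVOKED = "revoked"    # authenticated statement: revoked
    UNKNOWN = "unknown"    # revocation service unreachable or check skipped

def accept_certificate(status: Status, fail_hard: bool) -> bool:
    """Decide whether to trust a certificate given its revocation status."""
    if status is Status.REVOKED:
        return False  # an authenticated revocation always wins
    if status is Status.UNKNOWN:
        # fail-hard: treat as revoked (protects security, hurts availability)
        # fail-soft: treat as unrevoked (preserves availability, lets an
        # attacker who can block revocation traffic sidestep revocation)
        return not fail_hard
    return True

print(accept_certificate(Status.UNKNOWN, fail_hard=False))  # True
print(accept_certificate(Status.UNKNOWN, fail_hard=True))   # False
```

As the text notes, browsers in practice behave like the `fail_hard=False` branch when revocation information is unavailable.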
https://en.wikipedia.org/wiki/Kaisen%20Linux
Kaisen Linux (stylized as ka:sen linux) is a system rescue Linux distribution based on Debian and composed of free and open-source software. It originated in France and is designed for IT professionals. The operating system is developed by the Kaisen Team, led by Kevin Chevreuil, and is also supported by volunteers. It offers the KDE Plasma, LXQt, MATE, and Xfce desktop environments. See also Comparison of Linux distributions Debian Linux References External links Official website Debian-based distributions Free software operating systems X86-64 Linux distributions Linux distributions
https://en.wikipedia.org/wiki/Ilman%20Shazhaev
Ilman Shazhaev (born 13 May 1993, Grozny, the Chechen Republic) is a scientist, IT engineer, and the author of 30 scientific papers and 10 patents. Early life and education Ilman Shazhaev was born on 13 May 1993 in Grozny, the Chechen Republic. In 2012, he received a gold medal at the Russian Federal Young Scientists Contest for his innovative project for a new-generation electricity meter. In 2015, Ilman graduated from Moscow State Road & Automobile University with a BS degree in Informatics and Computing. From 2015 to 2017, he studied Mechanical Engineering at the Harbin Institute of Technology, China. Since 2017, Ilman has been a PhD candidate in Management of Science and Engineering at Shanghai Jiao Tong University, China. From 2011 to 2012, Ilman worked as an engineer and researcher at Grozny State Oil Technological University. From 2013 to 2014, he served as an assistant to the head of Grozny State Oil Technological University. Owing to his academic record, in 2014, Ilman was awarded the Russian President Scholarship. In 2015, he received the Chinese Government Scholarship, and in 2017 the Shanghai Government Scholarship. At the same time, Ilman initiated a business incubator program in partnership with the government of China's Heilongjiang province and Wanda Group. Under Ilman's coordination, the program helped 10 startups and businesses accelerate and set up their business in the Chinese market. In 2019, he cofounded Acoustery, a health tech company that developed AI technology for the early recognition of respiratory diseases. In 2022, Ilman Shazhaev founded MetaDeSci, a decentralized science metaverse platform connecting scientists, gamers, and investors. He also created the Farcana Gaming Metaverse and Farcana Labs, a scientific and engineering platform specializing in the cryptocurrency industry. References Software engineers 1993 births Living people
https://en.wikipedia.org/wiki/Digital%20Archive%20of%20Literacy%20Narratives
The Digital Archive of Literacy Narratives (DALN) is an online public archive of personal literacy narratives. The DALN collects narratives ranging in format and composition style, including traditional and unconventional self-exploratory mediums such as video essays, drawings, and written narratives. In 2005, Cynthia Selfe, H. Lewis Ulman, and Scott DeWitt at Ohio State University began development of the DALN with the purpose of creating and preserving a diverse and accessible collection of personal narratives. While most visitors to the site are from the United States, the DALN has developed a worldwide audience, with over 8,000 submissions from countries on six of the seven continents. Origin The original purpose of the DALN, as stated by its creators, was to create an accessible collection of literacy narratives for the purpose of literacy research. With origins in writing studies research, its creators sought to capture the development of narratives, to challenge notable definitions of literacy, and to create a dynamic way for collaborators, readers, and researchers to interact. In recent years the DALN has been incorporated into writing studies courses, with the intent of promoting literacy through personal inquiry. References External links Digital Archive of Literacy Narratives Literacy Online archives
https://en.wikipedia.org/wiki/Ambient%20IoT
Ambient IoT, from ambient and Internet of things, is a concept originally coined by 3GPP and used in the technology industry to refer to an ecosystem of a large number of objects in which every item is connected into a wireless sensor network using low-cost self-powered sensor nodes. Bluetooth SIG has assessed the total addressable market of Ambient IoT to be more than 10 trillion devices across different verticals. The applications of Ambient IoT include making supply chains for food and medicine more efficient and sustainable, protecting against counterfeiting, and delivering the data required for advanced transportation and smart city initiatives. Ambient IoT has been called "the original vision for the IoT" by U.S. Department of Commerce IoT Advisory Board chair Benson Chan. Standards for Ambient IoT are being considered by 3GPP, IEEE and Bluetooth SIG. Ambient IoT technology is being developed and produced at scale by Wiliot, Identiv and others. References Internet of things Ambient intelligence
https://en.wikipedia.org/wiki/Bailey%27s%20FFT%20algorithm
Bailey's FFT (also known as the 4-step FFT) is a high-performance algorithm for computing the fast Fourier transform (FFT). This variation of the Cooley–Tukey FFT algorithm was originally designed for systems with the hierarchical memory common in modern computers (and was the first FFT algorithm in this so-called "out of core" class). The algorithm treats the samples as a two-dimensional matrix (thus yet another name, the matrix FFT algorithm) and executes short FFT operations on the columns and rows of the matrix, with a correction multiplication by "twiddle factors" in between. The algorithm got its name after an article by David H. Bailey, FFTs in external or hierarchical memory, published in 1989. In this article Bailey credits the algorithm to W. M. Gentleman and G. Sande, who published their paper, Fast Fourier Transforms: for fun and profit, some twenty years earlier in 1966. The algorithm can be considered a radix-√N FFT decomposition. Here is a brief overview of how the "4-step" version of the Bailey FFT algorithm works: The data (in natural order) is first arranged into a matrix. Each column of the matrix is then independently processed using a standard FFT algorithm. Each element of the matrix is multiplied by a correction coefficient. Each row of the matrix is then independently processed using a standard FFT algorithm. The result (in natural order) is read column-by-column. Since the operations are performed column-wise and row-wise, steps 2 and 4 (and reading of the result) might include a matrix transpose to rearrange the elements in a way convenient for processing. The algorithm resembles a 2-dimensional FFT; 3-dimensional (and beyond) extensions are known as the 5-step FFT, 6-step FFT, etc. The Bailey FFT is typically used for computing DFTs of large datasets, such as those used in scientific and engineering applications. The Bailey FFT is a very efficient algorithm, and it has been used to compute FFTs of datasets with billions of elements (when appli
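The four steps above can be sketched in Python with NumPy. The row-major n1 × n2 arrangement is one common convention, the function name is illustrative, and NumPy's built-in FFT stands in for the short column and row transforms:

```python
import numpy as np

def bailey_fft(x, n1, n2):
    """4-step (Bailey) FFT of a length n1*n2 signal; illustrative sketch."""
    # Step 1: arrange the data (natural order) as an n1 x n2 matrix (row-major).
    a = np.asarray(x, dtype=complex).reshape(n1, n2)
    # Step 2: FFT each column independently (length-n1 transforms).
    a = np.fft.fft(a, axis=0)
    # Step 3: multiply each element by its twiddle factor exp(-2*pi*i*p*c/N).
    p = np.arange(n1).reshape(n1, 1)
    c = np.arange(n2).reshape(1, n2)
    a *= np.exp(-2j * np.pi * p * c / (n1 * n2))
    # Step 4: FFT each row independently (length-n2 transforms).
    a = np.fft.fft(a, axis=1)
    # Read the result column-by-column (transpose, then flatten row-major).
    return a.T.reshape(-1)

x = np.random.default_rng(0).standard_normal(64)
print(np.allclose(bailey_fft(x, 8, 8), np.fft.fft(x)))  # True
```

In an out-of-core setting the two `np.fft.fft` calls and the final transpose are the points where data is streamed through fast memory one column or row block at a time.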
https://en.wikipedia.org/wiki/Decipherment%20of%20cuneiform
The decipherment of cuneiform began with the decipherment of Old Persian cuneiform between 1802 and 1836. The first cuneiform inscriptions published in modern times were copied from the Achaemenid royal inscriptions in the ruins of Persepolis, with the first complete and accurate copy being published in 1778 by Carsten Niebuhr. Niebuhr's publication was used by Grotefend in 1802 to make the first breakthrough – the realization that Niebuhr had published three different languages side by side and the recognition of the word "king". The rediscovery and publication of cuneiform took place in the early 17th century, and early conclusions were drawn, such as the writing direction and that the Achaemenid royal inscriptions are in three different languages (with two different scripts). In 1620, García de Silva Figueroa dated the inscriptions of Persepolis to the Achaemenid period, identified them as Old Persian, and concluded that the ruins were the ancient residence of Persepolis. In 1621, Pietro della Valle specified the direction of writing from left to right. In 1762, Jean-Jacques Barthélemy found that an inscription in Persepolis resembled that found on a brick in Babylon. Carsten Niebuhr made the first copies of the inscriptions of Persepolis in 1778 and settled on three different types of writing, which subsequently became known as Niebuhr I, II and III. He was the first to discover the sign for a word division in one of the scripts. Oluf Gerhard Tychsen was the first to list 24 phonetic or alphabetic values for the characters in 1798. Actual decipherment did not take place until the beginning of the 19th century, initiated by Georg Friedrich Grotefend in his study of Old Persian cuneiform. He was followed by Antoine-Jean Saint-Martin in 1822 and Rasmus Christian Rask in 1823, who was the first to decipher the name Achaemenides and the consonants m and n. Eugène Burnouf identified the names of various satrapies and the consonants k and z in 1833–1835. Christian La
https://en.wikipedia.org/wiki/Internet%20of%20Musical%20Things
The Internet of Musical Things (also known as IoMusT) is a research area that aims to bring Internet of Things connectivity to musical and artistic practices. Moreover, it encompasses concepts coming from music computing, ubiquitous music, human-computer interaction, artificial intelligence, augmented reality, virtual reality, gaming, participative art, and new interfaces for musical expression. From a computational perspective, IoMusT refers to local or remote networks embedded with devices capable of generating and/or playing musical content. Introduction The term "Internet of Things" (IoT) is extensible to any everyday object connected to the internet, having its capabilities increased by exchanging information with other elements present in the network to achieve a common goal. Thanks to the technological advances that have occurred in the last decades, its use has spread to several areas, assisting in medical analysis, traffic control and home security. When its concepts meet music, the Internet of Musical Things (IoMusT) emerges. The term "Internet of Musical Things" also receives numerous classifications, according to its use by certain authors. Hazzard et al., for example, use it in the context of musical instruments that carry QR codes directing the user to a page with information about the instrument, such as manufacturing date and history. Keller and Lazzarini use this term in ubiquitous music (ubimus) research, while Turchet et al. define IoMusT as a subfield of the Internet of Things, where interoperable devices can connect to each other, aiding the interaction between musicians and the audience. Like the IoT, the Internet of Musical Things can encompass a variety of ecosystems. But generally, it is marked by being employed in musical activities (rehearsals, concerts, recordings, and music teaching) and relying on service and information providers. In addition to the technological and artistic advantages that this field offers, new op
https://en.wikipedia.org/wiki/Amazon%20Kinesis
Amazon Kinesis is a family of services provided by Amazon Web Services (AWS) for processing and analyzing real-time streaming data at a large scale. Launched in November 2013, it offers developers the ability to build applications that can consume and process data from multiple sources simultaneously. Kinesis supports multiple use cases, including real-time analytics, log and event data collection, and real-time processing of data generated by IoT devices. Components Amazon Kinesis is composed of four main services: Kinesis Data Streams, Kinesis Data Firehose, Kinesis Data Analytics, and Kinesis Video Streams. Kinesis Data Streams Kinesis Data Streams is a scalable and durable real-time data streaming service that captures and processes gigabytes of data per second from multiple sources. It enables the storage and processing of data in real time, making it useful for applications that require immediate insights, such as monitoring and alerting. Kinesis Data Firehose Kinesis Data Firehose is a fully managed service for delivering real-time streaming data to destinations such as Amazon S3, Amazon Redshift, Amazon Elasticsearch, and AWS-partner data stores. With Data Firehose, users can configure and scale data delivery without manual intervention. Kinesis Data Analytics Kinesis Data Analytics enables the analysis of streaming data in real time using standard SQL or Apache Flink. Kinesis Video Streams Kinesis Video Streams is a fully managed service for securely capturing, processing, and storing video streams for analytics and machine learning. It supports multiple video codecs and streaming protocols, making it suitable for various use cases, such as security and surveillance, video-enabled IoT devices, and live event broadcasting. Integration Amazon Kinesis can be easily integrated with other AWS services, such as AWS Lambda, Amazon S3, Amazon Redshift, and Amazon OpenSearch. This integration enables developers to build end-to-end streaming data processi
https://en.wikipedia.org/wiki/Tsakane%20Clay%20Grassland
The Tsakane Clay Grassland is a rare South African vegetation type supporting a unique grassland ecosystem. It is named after the township of Tsakane in Ekurhuleni, Gauteng, in which it is the dominant natural vegetation type. This ecosystem is characterized by its clay-rich soil, which supports a diverse array of flora and fauna, including several endemic and threatened species. The Tsakane Clay Grassland is an important conservation area, as it plays a crucial role in maintaining biodiversity and providing ecosystem services to the surrounding human populations. Geography The Tsakane Clay Grassland is the main vegetation type within the Suikerbosrand Nature Reserve, with a smaller occurrence of the Andesite Mountain Bushveld (SVcb11) vegetation unit. The altitude varies between 1,545 and 1,917 meters above sea level. The grassland extends from Soweto to the town of Springs in Gauteng and is distributed in patches southwards to Nigel and Vereeniging. The vegetation unit also occurs in parts of Mpumalanga between Balfour and Standerton and also in the northern side of the Vaal Dam. The landscape is flat to slightly undulating, with low hills also present in some areas of the grassland. Biodiversity The Tsakane Clay Grassland is home to a diverse range of plant species, including important taxa such as Andropogon schirensis, Eragrostis racemosa, Senecio inornatus, and Anthospermum rigidum subsp. pumilum. These species are adapted to the clay-rich soil conditions found in the grassland. The ecosystem also supports a variety of animal species, including mammals, birds, reptiles, and insects, many of which rely on the unique plant species for food and habitat. Conservation The Tsakane Clay Grassland is an important conservation area due to its high levels of biodiversity and the presence of several threatened and endemic species. Efforts to conserve the ecosystem include the establishment of protected areas, as well as ongoing research and monitoring programs to b
https://en.wikipedia.org/wiki/Hachette%20v.%20Internet%20Archive
Hachette Book Group, Inc. v. Internet Archive, No. 20-cv-4160 (JGK), 2023 WL 2623787, 2023 U.S. Dist. LEXIS 50749 (S.D.N.Y. 2023), is a case in which the United States District Court for the Southern District of New York determined that the Internet Archive committed copyright infringement by scanning and distributing copies of books online. Stemming from the creation of the National Emergency Library (NEL) during the onset of the COVID-19 pandemic, publishing company Hachette Book Group alleged that the Open Library and the National Emergency Library facilitated copyright infringement. The case involves the fair use of controlled digital lending (CDL) systems. On March 25, 2023, the court ruled against the Internet Archive, which plans on appealing. Background The Internet Archive is a non-profit organization dedicated to preserving knowledge and based in San Francisco, California; the Archive maintains Open Library, a digital library index and lending system. As many of the works in the Internet Archive are under copyright, the Archive uses a controlled digital lending (CDL) system, a practice that relies upon digital rights management (DRM) to prevent unauthorized downloading or copying of copyrighted works. Open Library can generate digitized material (ebooks) from print copy. The Open Library CDL system ensures that only one digital copy is in use for each print copy or otherwise authorized ebook copy available. On March 24, 2020, as a result of shutdowns caused by the COVID-19 pandemic, the Internet Archive opened the National Emergency Library, removing the waitlists used in Open Library and expanding access to these books for all readers. Two months later on June 1, the National Emergency Library (NEL) was met with a lawsuit from four book publishers. Two weeks after that, June 16, the Internet Archive closed the NEL, and the prior Open Library CDL system resumed after the 12 weeks of NEL usage. Lawsuit On June 1, 2020, Hachette Book Group and othe
https://en.wikipedia.org/wiki/AWS%20App%20Runner
AWS App Runner is a fully managed container application service offered by Amazon Web Services (AWS). Launched in May 2021, it is designed to simplify the process of building, deploying, and scaling containerized applications for developers. The service enables users to focus on writing code and developing features, without needing to manage the underlying infrastructure. It provides automatic scaling, load balancing, and security features, making it a suitable choice for deploying web applications and APIs. The service also simplifies MLOps. Features AWS App Runner offers several features that are designed to simplify the deployment and management of containerized applications, including: Fully managed: AWS App Runner takes care of the underlying infrastructure and operational tasks, allowing developers to focus on their applications. Automatic scaling: AWS App Runner automatically scales applications based on incoming traffic and resource utilization, ensuring optimal performance and cost-efficiency. CI/CD integration: AWS App Runner integrates with popular CI/CD services, streamlining the build, deployment, and release processes. Custom domains and TLS support: AWS App Runner supports custom domains and TLS certificates, providing secure access to applications. Monitoring and logging: AWS App Runner integrates with Amazon CloudWatch, enabling developers to monitor application performance and access logs. Health checks and automatic recovery: AWS App Runner periodically checks the health of running instances and automatically replaces any unhealthy instances. Flexible pricing: AWS App Runner offers pay-as-you-go pricing, with charges based on compute and memory usage. Continuous deployment from code repositories or container registries Customers AWS App Runner has been used by various companies to streamline the deployment of their web applications and APIs. Some notable customers include Classmethod, Hubble, and Velo by Wix. These companies have leve
https://en.wikipedia.org/wiki/Charlson%20Comorbidity%20Index
In medicine, the Charlson Comorbidity Index (CCI) predicts the mortality for a patient who may have a range of concurrent conditions (comorbidities), such as heart disease, AIDS, or cancer (considering a total of 17 categories). A score of zero means that no comorbidities were found; the higher the score, the higher the predicted mortality rate is. For a physician, this score is helpful in deciding how aggressively to treat a condition. It is one of the most widely used scoring systems for comorbidities. The index was developed by Mary Charlson and colleagues in 1987, but the methodology has been adapted several times since then based on the findings of additional studies. Many variations of the Charlson comorbidity index have been presented, including the Charlson/Deyo, Charlson/Romano, Charlson/Manitoba, and Charlson/D'Hoores comorbidity indices. Calculation Each condition is assigned a score of 1, 2, 3, or 6, depending on the risk of dying associated with it. Clinical conditions and associated scores are as follows: 1 each: Myocardial infarction, congestive heart failure, peripheral vascular disease, cerebrovascular disease, dementia, chronic pulmonary disease, rheumatologic disease, peptic ulcer disease, liver disease (if mild, or 3 if moderate/severe), diabetes (if controlled, or 2 if uncontrolled) 2 each: Hemiplegia or paraplegia, renal disease, malignancy (if localized, or 6 if metastatic tumor), leukemia, lymphoma 6 each: AIDS. Patients who are 50 years old or more get additional points: 50-59 years old: +1 point 60-69 years old: +2 points 70-79 years old: +3 points 80 years old or more: +4 points Scores are summed to provide a total score to predict mortality. Currently 17 categories are considered in the popular Charlson/Deyo variant, instead of 19 in the original score. The weights were also adapted in 2003. Conditions can be identified using the International Classification of Diseases (ICD) diagnosis codes commonly used in patient
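A minimal sketch of the calculation in Python, using illustrative condition names (not a clinical coding standard); severity-dependent conditions are entered under whichever weight applies to the patient:

```python
# Weight per condition, following the lists above; severity variants
# (liver disease, diabetes, malignancy) appear as separate entries.
WEIGHTS = {
    "myocardial_infarction": 1, "congestive_heart_failure": 1,
    "peripheral_vascular_disease": 1, "cerebrovascular_disease": 1,
    "dementia": 1, "chronic_pulmonary_disease": 1,
    "rheumatologic_disease": 1, "peptic_ulcer_disease": 1,
    "mild_liver_disease": 1, "diabetes_controlled": 1,
    "diabetes_uncontrolled": 2, "hemiplegia_or_paraplegia": 2,
    "renal_disease": 2, "localized_malignancy": 2,
    "leukemia": 2, "lymphoma": 2,
    "moderate_severe_liver_disease": 3,
    "metastatic_tumor": 6, "aids": 6,
}

def charlson_score(conditions, age):
    """Sum the condition weights, then add the age points
    (+1 per decade starting at 50-59, capped at +4)."""
    score = sum(WEIGHTS[c] for c in conditions)
    if age >= 50:
        score += min((age - 40) // 10, 4)
    return score

# A 72-year-old with controlled diabetes and renal disease: 1 + 2 + 3 = 6.
print(charlson_score({"diabetes_controlled", "renal_disease"}, 72))  # 6
```

The age adjustment shown is the common age-combined variant; the 1987 index itself scores only the comorbid conditions.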
https://en.wikipedia.org/wiki/Glossary%20of%20developmental%20biology
This glossary of developmental biology is a list of definitions of terms and concepts commonly used in the study of developmental biology and related disciplines in biology, including embryology and reproductive biology, primarily as they pertain to vertebrate animals and particularly to humans and other mammals. The developmental biology of invertebrates, plants, fungi, and other organisms is treated in other articles; e.g. terms relating to the reproduction and development of insects are listed in Glossary of entomology, and those relating to plants are listed in Glossary of botany. This glossary is intended as introductory material for novices; for more specific and technical detail, see the article corresponding to each term. Additional terms relevant to vertebrate reproduction and development may also be found in Glossary of biology, Glossary of cell biology, Glossary of genetics, and Glossary of evolutionary biology. A B C D E F G H I J K L M N O P Q R S T U V W X Y Z See also Introduction to developmental biology Outline of developmental biology Outline of cell biology Glossary of biology Glossary of cell biology Glossary of genetics Glossary of evolutionary biology References External links Developmental biology Glossaries of biology Wikipedia glossaries using description lists
https://en.wikipedia.org/wiki/List%20of%20plant%20pathologists
A Ruth F. Allen B Kenneth F. Baker Heinrich Anton de Bary Helen Purdy Beale Miles Joseph Berkeley Norman E. Borlaug Henry Luke Bolley Myron Brakke Julius Oscar Brefeld Edwin John Butler C Vera Charles Jesse Roy Christie John Colhoun (plant pathologist) D James G. Dickson E Jakob Eriksson F Harold Henry Flor G Ernst Albert Gäumann H Robert Hartig James G. Horsfall I J K Julius Kühn L Frank Lamson-Scribner Ernest Charles Large M Pierre-Marie-Alexis Millardet B.B. Mundkur N Margaret Newton O P Flora Wambaugh Patterson Isaac-Bénédict Prévost Q R S Effie A. Southworth Elvin C. Stakman T U V Mikhail Stepanovich Voronin W Harry Marshall Ward Herbert Hice Whetzel X Y Z Milton Zaitlin See also List of botanists List of plant scientists (merge to List of botanists) Lists of biologists
https://en.wikipedia.org/wiki/Th%C3%A9%C3%A2tre%20D%27op%C3%A9ra%20Spatial
Théâtre D'opéra Spatial is an image created by the generative artificial intelligence platform Midjourney, using a prompt by Jason Matthew Allen. The painting became a news story when it won the 2022 Colorado State Fair's annual fine art competition on 5 September, becoming one of the first AI-generated images to win such a prize. Winning the fair's contest for "digital arts/digitally-manipulated photography" led to a backlash from artists who accused Allen of cheating. Allen responded: "I'm not going to apologize for it. I won, and I didn't break any rules." The two category judges were unaware that Midjourney used AI to generate images, although they later said that had they known this, they would have awarded Allen the top prize anyway. The image, created using a text prompt, was upscaled using Gigapixel AI. In September 2023 the US Copyright Office review board found that Théâtre D'Opéra Spatial was not eligible for copyright protection as the rules "exclude works produced by non-humans". This decision reaffirms previous guidance given in respect of AI by the Office and a recent court case (Thaler v Comptroller-General of Patents, Designs and Trademarks) that found against Thaler on the basis of a principle of human authorship that, though not enshrined in copyright law, is a working principle used by the US Copyright Office. Allen insists he will fight on: "Allen was dogged in his attempt to register his work. He sent a written explanation to the Copyright Office detailing how much he'd done to manipulate what Midjourney conjured, as well as how much he fiddled with the raw image, using Adobe Photoshop to fix flaws and Gigapixel AI to increase the size and resolution. He specified that creating the painting had required at least 624 text prompts and input revisions." Some legal writers support his claim and consider the decision to be a form of technological discrimination, comparing it with the treatment of photographs and the modern use of electronic cameras. Any leg
https://en.wikipedia.org/wiki/Subdivision%20bifiltration
In topological data analysis, a subdivision bifiltration is a collection of filtered simplicial complexes, typically built upon a set of data points in a metric space, that captures shape and density information about the underlying data set. The subdivision bifiltration relies on a natural filtration of the barycentric subdivision of a simplicial complex by flags of minimum dimension, which encodes density information about the metric space upon which the complex is built. The subdivision bifiltration was first introduced by Donald Sheehy in 2011 as part of his doctoral thesis (later subsumed by a conference paper in 2012) as a discrete model of the multicover bifiltration, a continuous construction whose underlying framework dates back to the 1970s. In particular, Sheehy applied the construction to both the Vietoris-Rips and Čech filtrations, two common objects in the field of topological data analysis. Whereas single-parameter filtrations are not robust with respect to outliers in the data, the subdivision-Rips and -Čech bifiltrations satisfy several desirable stability properties. Definition Let K be a simplicial complex. Then a nested sequence of simplices of K is called a flag or chain of K. The set of all flags of K comprises an abstract simplicial complex, known as the barycentric subdivision of K, denoted Sd(K). The barycentric subdivision is naturally identified with a geometric subdivision of K, created by starring the geometric realization of K at the barycenter of each simplex. There is a natural filtration on Sd(K): for each natural number k, consider the maximal subcomplex of Sd(K) spanned by the vertices of Sd(K) corresponding to simplices of K of dimension at least k, denoted Sd(K)_k. In particular, by this convention, Sd(K)_0 = Sd(K). Considering the sequence of nested subcomplexes given by varying the parameter k, we obtain a filtration on Sd(K) known as the subdivision filtration. Since the complexes in the subdivision filtration shrink as k increases, we can regard it as a functor fr
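For small complexes the subdivision filtration can be computed directly. The sketch below is illustrative (the function names are hypothetical) and assumes the indexing convention in which level k keeps exactly the flags whose smallest simplex has dimension at least k, so that level 0 is all of Sd(K):

```python
from itertools import combinations

def faces(maximal):
    """All nonempty simplices of the complex generated by the maximal simplices."""
    K = set()
    for s in maximal:
        for r in range(1, len(s) + 1):
            K.update(frozenset(f) for f in combinations(s, r))
    return K

def flags(K):
    """Simplices of Sd(K): strictly nested chains of simplices of K."""
    chains = {(s,) for s in K}
    frontier = chains
    while frontier:
        # Extend each chain by any strictly larger simplex of K.
        frontier = {ch + (s,) for ch in frontier for s in K if ch[-1] < s}
        chains |= frontier
    return chains

def subdivision_level(K, k):
    """Maximal subcomplex of Sd(K) spanned by simplices of dimension >= k."""
    return {ch for ch in flags(K) if all(len(s) - 1 >= k for s in ch)}

# One solid triangle: K has 7 simplices; Sd(K) has 7 + 12 + 6 = 25 flags.
K = faces([("a", "b", "c")])
print(len(subdivision_level(K, 0)))  # 25
print(len(subdivision_level(K, 1)))  # 7: the three edges, the triangle,
                                     # and the chains among them
```

Since each chain is strictly increasing, the while-loop terminates after at most dim(K) + 1 rounds; requiring every simplex in the chain to have dimension >= k is the same as requiring the chain's minimum dimension to be >= k.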
https://en.wikipedia.org/wiki/Resilience%20%28power%20system%29
Power resilience refers to a company's ability to adapt to power outages. Frequent outages have forced businesses to take into account the "cost of not having access to power" in addition to the traditional "cost of power". Climate-related issues have intensified the attention on energy sustainability and resilience. In the United States, electric utility firms have registered over 2500 significant power outages since 2002, with almost half of them (specifically 1172) attributed to weather events, including storms, hurricanes, and other unspecified severe weather occurrences. These incidents often lead to significant economic losses. The Committee on Enhancing the Resilience of the Nation's Electric Power Transmission and Distribution System has developed strategies that seek to reduce the impact of large-scale, long-duration outages. Resilience is not just about preventing these outages from happening, but also limiting their scope and impact, restoring power quickly, and preparing for future events. Some parts of the United States still rely on regulated, vertically integrated utilities, while others have adopted competitive markets. Efforts to improve resilience must take into account this institutional and policy heterogeneity. The use of automation at the high-voltage level can improve grid reliability, but also introduces cybersecurity vulnerabilities. These "smart grids" use improved sensing, communication, automation technologies, and advanced metering infrastructure. Distributed energy resources are rapidly growing in some states, but most U.S. customers will continue to depend on the large-scale, interconnected, and hierarchically structured electric grid. Therefore, strategies to enhance electric power resilience must consider a diverse set of technical and institutional arrangements and a wide variety of hazards. There is no single solution that fits all situations when it comes to avoiding, planning for, coping with, and recovering from major outage
https://en.wikipedia.org/wiki/Computing%20in%20Science%20%26%20Engineering
Computing in Science & Engineering (CiSE) is a bimonthly technical magazine published by the IEEE Computer Society. It was founded in 1999 from the merger of two publications: Computational Science & Engineering (CS&E) and Computers in Physics (CIP), the first published by IEEE and the second by the American Institute of Physics (AIP). The founding editor-in-chief was George Cybenko, known for proving one of the first versions of the universal approximation theorem of neural networks. The magazine is interdisciplinary and covers topics such as numerical simulation, modeling, and data analysis and visualization. CiSE aims to provide its readers with practical information on the latest developments in computational methods and their applications in science and engineering. Computing in Science & Engineering publishes peer-reviewed technical articles, special issues, editorials, and departments (regular columns). Notable articles One of the most notable articles published in CiSE is "Matplotlib: A 2D Graphics Environment," by the late John D. Hunter. It has more than 22 thousand full-text views and more than 17 thousand citations in IEEE Xplore, and more than 27 thousand citations in Google Scholar (checked August 14, 2023). A very popular department article is "What is the Blockchain?" by editorial board member Massimo DiPierro. Other notable articles include "Python for Scientific Computing" by Travis Oliphant, which has more than 15 thousand views in Xplore, and "The NumPy Array: A Structure for Efficient Numerical Computation," by Stefan van der Walt et al., with nearly 7 thousand citations and 12 thousand views in Xplore. The winner of the CiSE 2021 Best Paper Award was "Jupyter: Thinking and Storytelling With Code and Data," by Brian E. Granger and Fernando Pérez. Notable editors Among the editors emeritus, who served close to twenty years on the editorial board, is Jack Dongarra, Distinguished Professor of Computer Science at the University of T
https://en.wikipedia.org/wiki/Breastmilk%20medicine
Breastmilk medicine refers to the non-nutritional usage of human breast milk (HBM) as a medicine or therapy to cure diseases. Breastmilk is perceived as an important food that provides essential nutrition to infants. It also provides protection in terms of immunity by direct transfer of antibodies from mothers to infants. The immunity developed via this means protects infants from diseases such as respiratory diseases, middle ear infections, and gastrointestinal diseases. HBM can also produce lifelong positive therapeutic effects on a number of chronic diseases, including diabetes mellitus, obesity, hyperlipidemia, hypertension, cardiovascular diseases, autoimmunity, and asthma. Therapeutic use of breastmilk has long been a part of natural pharmacopeia and ethnomedicine. The effectiveness of HBM and fresh colostrum as a treatment for inflammatory disorders such as rhinitis, skin infection, sore nipples, and conjunctivitis has been reported by public health nurses. Currently, many breastmilk components have shown therapeutic benefits in preclinical studies and are being evaluated by clinical studies. Anti-inflammatory effect of breastfeeding HBM can be used to treat inflammations. Breastfeeding has an anti-inflammatory effect that is conveyed by its chemical components’ interaction with body cells. The major chemical components that produce the anti-inflammatory effect in both colostrum and transitional milk are glycoproteins and lactoferrin. Lactoferrin has multiple actions including lymph-stimulatory, anti-inflammatory, anti-bacterial, anti-viral, and anti-fungal effects. The anti-inflammatory effects of lactoferrin are attributed to its iron-binding properties, inhibition of inflammation-causing molecules including interleukin-1β (IL-1β) and tumor necrosis factor-alpha (TNF-α), stimulation of the activity and maturation of lymphocytes, as well as preservation of an antioxidant environment. In addition, lactoferrin protects infants against bacterial and fungal inf
https://en.wikipedia.org/wiki/Robert%20G.%20Wilhelm
Robert Gerard Wilhelm (born June 27, 1960) is an American mechanical engineer. On May 15, 2018, Wilhelm became the Vice Chancellor for Research and Economic Development at the University of Nebraska–Lincoln, where he also holds the Kate Foster professorship in Mechanical and Materials Engineering. Before joining the University of Nebraska–Lincoln, he served as Vice Chancellor for Research and Economic Development at the University of North Carolina at Charlotte. There, he also held a faculty appointment as a professor. His expertise is in precision engineering and advanced manufacturing. Biography Bob Wilhelm was born June 27, 1960 in Mobile, Alabama. As a child, his family moved to Raleigh, North Carolina, where his father, William J. Wilhelm, earned a PhD in Civil Engineering at North Carolina State University. Their family relocated to Morgantown, West Virginia when William J. Wilhelm joined the West Virginia University civil and environmental engineering faculty. While there, Wilhelm's mother, Patricia Zietz, earned a Bachelor of Arts in elementary education and a Master of Arts in special education. As a boy scout in West Virginia, Wilhelm earned the Eagle Scout rank. Later, his father joined Wichita State University as the Dean of the College of Engineering, and the family relocated to Wichita, Kansas. Bob Wilhelm earned a Bachelor of Science in Industrial Engineering from Wichita State University in 1981, after beginning coursework at West Virginia University from 1977 to 1979. From 1981 to 1982, he studied the history of science and technology at the University of Leicester and the Ironbridge Gorge Museum as a Rotary Foundation Fellow. In 1984, he earned a Master of Science in Industrial Engineering from Purdue University. In 1992, he received a Ph.D. in Mechanical Engineering from the University of Illinois. Wilhelm has also worked in engineering at Cincinnati Milacron and the Palo Alto Laboratory of Rockwell Science Center. He joined Universi
https://en.wikipedia.org/wiki/Cumyl%20alcohol
Cumyl alcohol, also called 4-isopropylbenzyl alcohol, is a liquid, hydroxy-functional, aromatic organic chemical with the formula C10H14O. It has the CAS Registry Number 536-60-7 and the IUPAC name (4-propan-2-ylphenyl)methanol. It is REACH registered with the EC number 208-640-4. Uses The most common use is as a food additive to add flavor. The material also has insect repellent properties. Manufacture Hydrogenation of cuminal can produce cumyl alcohol. Other Cumyl alcohol is an undesired side-reaction product when LDPE is crosslinked. LDPE is used as a plastic electric insulator for electrical power cables. The cumyl alcohol reduces the insulating properties. Alternative names The main sources of information list the following alternative names: p-Cymen-7-ol, 4-isopropylbenzyl alcohol, Cumic alcohol, Cuminol, Cuminyl alcohol, (4-Isopropylphenyl)methanol, Cuminic alcohol. Toxicology The toxicity of the material has been studied and is reasonably well understood. References Primary alcohols Food additives
https://en.wikipedia.org/wiki/Mar%C3%ADa%20J.%20Carro
María Jesús Carro Rossell (born 1961) is a Spanish mathematician specializing in mathematical analysis, including Fourier analysis, functional analysis, harmonic analysis, operator theory and the analysis of Lorentz spaces. She is a professor at the Complutense University of Madrid, in the Department of Mathematical Analysis and Applied Mathematics. Education and career Carro was born in 1961, and motivated to work in mathematics by her father, who was prevented from studying science by the Spanish Civil War. She earned a degree in mathematical sciences in 1984 from the University of Extremadura. Next, she went to the University of Barcelona for doctoral study in mathematics, completing her Ph.D. in 1988 under the supervision of Joan Cerdà, with the dissertation Interpolación compleja de operadores lineales. After postdoctoral study at Washington University in St. Louis with Guido Weiss, she obtained a faculty position at the Autonomous University of Barcelona in 1991, and then returned to the University of Barcelona in 1992. There, she held a professorial chair from 1993 to 2019, when she moved to the Complutense University of Madrid. Recognition Carro received the medal of the Royal Spanish Mathematical Society in 2020. She was elected as a corresponding member of the Spanish Royal Academy of Sciences in 2021. References External links Home page 1961 births Living people Spanish mathematicians Spanish women mathematicians Mathematical analysts University of Extremadura alumni University of Barcelona alumni Academic staff of the Autonomous University of Barcelona Academic staff of the University of Barcelona Academic staff of the Complutense University of Madrid
https://en.wikipedia.org/wiki/Dark-field%20X-ray%20microscopy
Dark-field X-ray microscopy (DFXM or DFXRM) is an imaging technique used for multiscale structural characterisation. It is capable of mapping deeply embedded structural elements with nm-resolution using synchrotron X-ray diffraction-based imaging. The technique works by using scattered X-rays to create a high degree of contrast, and by measuring the intensity and spatial distribution of the diffracted beams, it is possible to obtain a three-dimensional map of the sample's structure, orientation, and local strain. History The first experimental demonstration of dark-field X-ray microscopy was reported in 2006 by a group at the European Synchrotron Radiation Facility in Grenoble, France. Since then, the technique has been rapidly evolving and has shown great promise in multiscale structural characterization. Its development is largely due to advances in synchrotron X-ray sources, which provide highly collimated and intense beams of X-rays. The development of dark-field X-ray microscopy has been driven by the need for non-destructive imaging of bulk crystalline samples at high resolution, and it continues to be an active area of research today. However, dark-field microscopy, dark-field scanning transmission X-ray microscopy, and soft dark-field X-ray microscopy have long been used to map deeply embedded structural elements. Principles and instrumentation In this technique, a synchrotron light source is used to generate an intense and coherent X-ray beam, which is then focused onto the sample using a specialized objective lens. The objective lens acts as a collimator to select and focus the scattered light, which is then detected by the 2D detector to create a diffraction pattern. The specialized objective lens in DFXM, referred to as an X-ray objective lens, is a crucial component of the instrumentation required for the technique. It can be made from different materials such as beryllium, silicon, and diamond, depending on the specific requirements of the experime
https://en.wikipedia.org/wiki/Mariam%20Nabatanzi
Mariam Nabatanzi Babirye (born ) also known as Maama Uganda or Mother Uganda, is a Ugandan woman known for birthing 44 children. As of April 2023, her eldest children were twenty-eight years old, and the youngest were six years old. She is a single mother, who was abandoned by her husband in 2015. He reportedly feared the responsibility of supporting so many children. Born around 1980, Babirye first gave birth when she was 13 years old, having been forced into marriage the year prior. By the age of 36, she had given birth to a total of 44 children, including three sets of quadruplets, four sets of triplets, and six sets of twins, for a total of fifteen births. The number of multiple births was caused by a rare genetic condition causing hyperovulation as a result of enlarged ovaries. In 2019, when Babirye was aged 40, she underwent a medical procedure to prevent any further pregnancies. She lives in the village of Kasawo, located in the Mukono district of Central Uganda. Life and background In 1993, Babirye was sold into child marriage at the age of twelve to a violent 40-year-old man. A year later, she first became a mother in 1994 with a set of twins, followed by triplets in 1996. She then gave birth to a set of quadruplets 19 months later. She never found the rate at which she was procreating unusual due to her family history; she had been quoted as saying: "My father gave birth to forty-five children with different women, and these all came in quintuplets, quadruples, twins and triplets." In Uganda, there are some communities that practice early child marriages, where a young girl is given off to an older man in exchange for a dowry that most frequently consists of cows. Babirye's marriage was an example of this. At the age of twenty-three, she had given birth to twenty-five children, but was advised to continue giving birth, as it would help reduce further fertility. Those affected with Babirye's condition are often advised that abstinence from pregnancy ca
https://en.wikipedia.org/wiki/Biochar%20carbon%20removal
Biochar carbon removal (BCR) (also called Pyrogenic carbon capture and storage) is a negative emissions technology. It involves the production of biochar through pyrolysis of residual biomass and the subsequent application of the biochar in soils or durable materials (e.g. cement, tar). The carbon dioxide sequestered by the plants used for the biochar production is thereby stored for several hundreds of years, which creates carbon sinks. Definition The term refers to the practice of producing biochar from sustainably sourced biomass and ensuring that it is stored for a long period of time. The concept makes use of the photosynthesis process, through which plants remove CO2 from the atmosphere during their growth. This carbon dioxide is stabilised within the biochar during the production process and can thereby be stored for several hundreds of years. Scientifically, this process is often referred to as Pyrogenic Carbon Capture and Storage (PyCCS). The term Biochar Carbon Removal (BCR) was first introduced by the European Biochar Industry Consortium (EBI) in 2023 and has since been adopted by various institutions and experts. Beyond carbon sequestration, biochar application has various other potential benefits, such as increased yield and root biomass, water use efficiency and microbial activity. Biochar production Biochar is produced through the pyrolysis process. Biomass (e.g. residual plant material from landscaping or agricultural processes) is reduced to smaller pieces and heated to between 350°C and 900°C under oxygen-deficient conditions. This results in solid biochar and by-products (bio-oil, pyrogas). To maximise the carbon storage potential, biochar technologies that minimise combustion and avoid the loss of pyrogas into the atmosphere are typically used. In low-oxygen conditions, the thermal-chemical conversion of organic materials (including biomass) produces both volatiles, termed pyrolytic gases (pyrogases), as well as sol
https://en.wikipedia.org/wiki/Hanuba%20Hanubi%20Paan%20Thaaba
The Hanuba Hanubi Paan Thaaba is a Meitei folktale of Ancient Kangleipak (early Manipur). It is about the story of an old man, an old woman and some monkeys. Story Once there was a childless old couple, who used to treat a group of monkeys from the nearby forest kindly, like their own children. One day, the old couple was planting taro plants in their kitchen garden. Seeing that, the monkeys told the two that it was actually not the right way to plant taros. They told the two that the best peeled tubers of the taros should initially be boiled in a pot until softened and, after cooling, should be planted wrapped tightly in banana leaves. The old couple believed the monkeys and did as suggested. At midnight, the monkeys relished all the cooked taros from the garden. And in place of all those delicious taros, they uprooted some inedible giant wild taros from somewhere and planted them in the garden. The next morning, the old couple were surprised at the sudden growth of the taros they had planted the previous day. The two immediately prepared a dish of the full-grown taros and ate them. But as soon as they gulped some, both felt a tingling sensation in their throats. Unable to bear the allergy, both asked each other for some hentak. It was only after they had the hentak that their allergy was cured. Realising that the monkeys had tricked them, the two devised a plan for revenge. And according to the plan, the old man pretended to be dead, and the old woman cried out loudly to make the monkeys hear her cry. Then, the monkeys came there and asked the old woman what had happened. She told them that the old man had died after eating the taros. She asked them to help her take the old man's body out onto the lawn. All the monkeys, unaware of the plan, came inside the house. As soon as they came near the old man, he took up his stick and started beating them. Frightened, they all ran away. The old couple knew that the monkeys wo
https://en.wikipedia.org/wiki/Multiverse%20Computing
Multiverse Computing is a Spanish quantum computing software company headquartered in San Sebastian, Spain, with offices in Paris, Munich, London, Toronto and Sherbrooke, Canada. The Spanish startup applies quantum and quantum-inspired algorithms to problems in energy, logistics, manufacturing, mobility, life sciences, finance, cybersecurity, chemistry and aerospace. Its flagship product, Singularity, is an industrial quantum and quantum-inspired software platform focused on solving real-world challenges for large enterprises. Among other features, Singularity’s user interface incorporates tools such as Microsoft Excel plug-ins that allow use of the platform’s core algorithms without prior knowledge of quantum computing. History Multiverse was co-founded in 2019 by Enrique Lizaso, Román Orús, Alfonso Rubio and Sam Mugel. Lizaso, treasurer and member of the governing board of the European Quantum Industry Consortium, and Orús, Ikerbasque Research Professor at DIPC and former Marie Curie Fellow at the Max Planck Institute of Quantum Optics, were chatting on WhatsApp when the idea for a quantum computing company for finance was born. In 2021, the company announced €12.5 million in funding from the European Innovation Council Accelerator program. In April 2022, the company partnered with the Bank of Canada to explore how quantum computing can provide new insights into economic problems via simulation of cryptocurrency adoption. This research made Canada the first G7 country to explore the model of complex networks and cryptocurrencies through quantum computing. That July, Multiverse partnered with Bosch to integrate quantum algorithms into digital twin simulation workflow to scale simulations more efficiently and improve the accuracy of defect detection. Later that year, BASF used Multiverse’s Singularity to develop models for foreign exchange trading optimization between U.S. and EU currency to improve profits. Additional organizations exploring Multiverse’s Sing
https://en.wikipedia.org/wiki/Microcrystallization
Microcrystallization (or microcrystal test) is a method for identifying lichen metabolites that was predominantly used before the advent of more advanced techniques such as thin-layer chromatography and high-performance liquid chromatography. Developed primarily by Yasuhiko Asahina, this approach relies on the formation of distinctive crystals from lichen extracts. Although now superseded by modern analytical methods, microcrystallization still holds importance for compound purification and analysis using X-ray crystallography. History Between 1936 and 1940, Japanese chemist and lichenologist Yasuhiko Asahina published a series of papers in the Journal of Japanese Botany detailing the microcrystallization technique. This simple and rapid method allowed for the identification of major metabolites in hundreds of lichen species, contributing significantly to taxonomic research. The technique was introduced to western lichenologists in a 1943 publication by Alexander Evans, and was used regularly until more advanced techniques such as thin-layer chromatography and high-performance liquid chromatography were introduced and integrated into laboratories. Decades of research on the secondary metabolites of lichens culminated in the publication of Identification of Lichen Substances, a 1996 work by Siegfried Huneck and Isao Yoshimura that summarized analytical data for hundreds of lichen molecules, including images of microcrystals. Ultimately, the microcrystallization method had limitations, as it was unable to detect minor components or analyze complex mixtures of lichen substances. Despite these drawbacks, microcrystallization played a crucial role in the study of correlations between lichen chemistry, morphology, and geographic distribution. Procedure To perform microcrystallization, a small piece of lichen is extracted using acetone or other solvents, filtered, and evaporated to yield a residue. The residue is transferred to a microscope slide, and a drop of microcr
https://en.wikipedia.org/wiki/RomArchive
The RomArchive is a digital repository of Romani culture, established in 2015. Fourteen curators organised 5,000 objects, available in English, German and Romani. The archive has won a European Union Prize for Cultural Heritage and a Grimme-Preis. Project The RomArchive was established in 2015 as a digital repository of Romani culture. The German Federal Cultural Foundation was the largest initial sponsor, providing €3.75 million. The founders were Isabel Raabe and Franziska Sauerbrey. The project covers areas such as dance, film, literature and flamenco. Filmmaker Katalin Bársony curated a selection of 35 films which authentically present Romani culture, one being Taikon by Lawen Mohtadi. The visual arts collection is curated by Tímea Junghaus and photography by André Raatzsch. In total there are fourteen curators. The archive contains 5,000 objects and is available in English, German and Romani. The Documentation and Cultural Centre of German Sinti and Roma took over the sponsorship of the RomArchive in 2019, on International Romani Day. The same year, the archive won a European Union Prize for Cultural Heritage. In 2020, it also won a Grimme-Preis. References External links Website Publications established in 2015 Romani culture Online archives Romani language
https://en.wikipedia.org/wiki/Immunosequencing
Immunosequencing, sometimes referred to as repertoire sequencing or Rep-Seq, is a method for analyzing the genetic makeup of an individual's immune system. Background In most areas of biology a single gene codes for one or a few possible proteins. Through V(D)J recombination a number of organisms take a relatively small number of genes coding for antibodies and T-cell receptors (TCRs) and produce a huge diversity of slightly different antibodies and TCRs. The diversity allows for the recognition of a wide array of antigens. As an immune system reacts to infections and other events, the number of different antibodies and TCRs it contains changes. The makeup and quantity of these proteins is sometimes referred to as an immune repertoire. Immunosequencing is a technique utilizing multiplex polymerase chain reaction that allows for the sequencing and quantification of the large diversity of antibody and TCR genes composing an individual's immune repertoire. History Immunosequencing in its modern context started being discussed in scientific literature in the early 2010s with the advent of more powerful gene sequencing techniques. References Molecular biology Laboratory techniques DNA profiling techniques
https://en.wikipedia.org/wiki/Photoentrainment%20%28chronobiology%29
In chronobiology, photoentrainment refers to the process by which an organism's biological clock, or circadian rhythm, synchronizes to daily cycles of light and dark in the environment. The mechanisms of photoentrainment differ from organism to organism. Photoentrainment plays a major role in maintaining proper timing of physiological processes and coordinating behavior within the natural environment. Studying organisms’ different photoentrainment mechanisms sheds light on how organisms may adapt to anthropogenic changes to the environment. Background 24-hour physiological rhythms, known now as circadian rhythms, were first documented in 1729 by Jean Jacques d'Ortous de Mairan, a French astronomer who observed that mimosa plants (Mimosa pudica) would orient themselves toward the position of the sun despite being in a dark room. That observation spawned the field of chronobiology, which seeks to understand the mechanisms that underlie endogenously expressed daily rhythms in organisms from cyanobacteria to mammals, including understanding and modeling the process of photoentrainment. Two prominent 20th-century chronobiologists, Jürgen Aschoff and Colin Pittendrigh, both worked throughout the 1960s to model the process of photoentrainment, and despite examining the same subject, they arrived at different conclusions. Aschoff proposed a parametric model of entrainment, which assumed that organisms entrained to environmental timing cues (often referred to as zeitgebers, or "time givers" in German) gradually, changing their internal "circadian" period to be greater or less than 24 hours until it became aligned with the zeitgeber time. Conversely, Pittendrigh proposed a non-parametric model of entrainment, which assumed that organisms adjusted their internal clocks instantaneously when confronted with a light signal, or zeitgeber, that was out of sync with the time at which their internal circadian clock expected to see light. Pittendrigh developed his model based on th
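The contrast between the two models can be sketched with a toy simulation of the non-parametric scheme. All numbers and the linear phase-response rule below are illustrative assumptions, not values from the chronobiology literature: the clock free-runs with its internal period each cycle, then receives one discrete, instantaneous phase shift at the daily light pulse.

```python
# Toy non-parametric entrainment: the clock drifts by (TAU - T) each cycle,
# then a light pulse applies an instantaneous corrective phase shift.
# The linear phase-response rule (shift = -K * phase) is an illustrative
# simplification; real phase-response curves are nonlinear.

TAU = 25.0   # hypothetical internal free-running period, hours
T = 24.0     # zeitgeber (light-dark) cycle period, hours
K = 0.5      # assumed strength of the corrective shift at each pulse

def entrain(cycles, phase=3.0):
    """Track the phase angle (hours by which subjective dawn lags real dawn)."""
    history = []
    for _ in range(cycles):
        phase += TAU - T       # free-run: the 25 h clock drifts 1 h per day
        phase -= K * phase     # discrete phase shift at the light pulse
        history.append(phase)
    return history

phases = entrain(60)
# Instead of drifting without bound, the clock locks to a stable phase
# angle at which the daily shift exactly cancels the daily drift.
print(round(phases[-1], 3))  # settles at 1.0 h for these toy parameters
```

At the fixed point the discrete daily correction K·phase equals the daily drift TAU − T, which is the defining feature of the non-parametric picture: a stable phase relationship maintained by repeated instantaneous resets rather than by continuous modulation of the period, as in the parametric model.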
https://en.wikipedia.org/wiki/History%20of%20phagocytosis
The history of phagocytosis is an account of the discoveries of cells, known as phagocytes, that are capable of eating other cells or particles, and of how those discoveries eventually established the science of immunology. Phagocytosis is broadly used in two ways in different organisms: for feeding in unicellular organisms (protists) and for the immune response that protects the body against infections in metazoans. Although it is found in a variety of organisms with different functions, its fundamental process is cellular ingestion of foreign (external) materials, and it is thus considered an evolutionarily conserved process. The biological theory and concept, experimental observations and the name phagocyte were introduced by the Ukrainian zoologist Élie Metchnikoff in 1883, a moment regarded as the foundation or birth of immunology. The discovery of phagocytes and the process of innate immunity earned Metchnikoff the 1908 Nobel Prize in Physiology or Medicine, and the epithet "father of natural immunity". However, the cellular process was known before Metchnikoff's works, albeit from inconclusive descriptions. The first scientific description was from Albert von Kölliker, who in 1849 reported an alga eating a microbe. In 1862, Ernst Haeckel experimentally showed that some blood cells in a slug could ingest external particles. By then, evidence was mounting that leucocytes can perform cell eating just like protists, but it was not until Metchnikoff showed that specific leukocytes (in his case macrophages) eat cells that the role of phagocytosis in immunity was realised. Discovery of cell feeding Phagocytosis was first observed as a process by which unicellular organisms eat their food, usually smaller organisms like protists and bacteria. The earliest definitive account was given by the Swiss scientist Albert von Kölliker in 1849. As he reported, Kölliker described the feeding process of an amoeba-like alga, Actinophrys sol (a heliozoan). Under the microscope, he noticed th
https://en.wikipedia.org/wiki/Persistent%20Betti%20number
In persistent homology, a persistent Betti number is a multiscale analog of a Betti number that tracks the number of topological features that persist over multiple scale parameters in a filtration. Whereas the classical Betti number equals the rank of the homology group, the persistent Betti number is the rank of the persistent homology group. The concept of a persistent Betti number was introduced by Herbert Edelsbrunner, David Letscher, and Afra Zomorodian in the 2002 paper Topological Persistence and Simplification, one of the seminal papers in the field of persistent homology and topological data analysis. Applications of the persistent Betti number appear in a variety of fields including data analysis, machine learning, and physics. Definition Let K be a simplicial complex, and let f : K → ℝ be a monotonic, i.e., non-decreasing function. Requiring monotonicity guarantees that the sublevel set K(a) = f⁻¹((−∞, a]) is a subcomplex of K for all a ∈ ℝ. Letting the parameter a vary, we can arrange these subcomplexes into a nested sequence ∅ = K₀ ⊆ K₁ ⊆ … ⊆ Kₙ = K for some natural number n. This sequence defines a filtration on the complex K. Persistent homology concerns itself with the evolution of topological features across a filtration. To that end, by taking the homology group of every complex in the filtration we obtain a sequence of homology groups that are connected by homomorphisms induced by the inclusion maps in the filtration. When applying homology over a field, we get a sequence of vector spaces and linear maps commonly known as a persistence module. In order to track the evolution of homological features as opposed to the static topological information at each individual index, one needs to count only the number of nontrivial homology classes that persist in the filtration, i.e., that remain nontrivial across multiple scale parameters. For each pair i ≤ j, let f_p^{i,j} denote the induced homomorphism H_p(K_i) → H_p(K_j). Then the persistent homology groups are defined to be the images of each induced map. Namely, H_p^{i,j} = im f_p^{i,j} for all 0 ≤ i ≤ j ≤ n. In paralle
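The standard persistence algorithm (column reduction of the boundary matrix over Z/2) makes the counting concrete. The toy filtration below, a triangle whose vertices, edges, and face appear at successive parameter values, and all function names are our own illustrative choices, not taken from the original paper.

```python
# Simplices in filtration order: (filtration value, dimension,
# boundary given as indices of earlier simplices).
filtration = [
    (0, 0, []),         # 0: vertex a
    (0, 0, []),         # 1: vertex b
    (0, 0, []),         # 2: vertex c
    (1, 1, [0, 1]),     # 3: edge ab
    (1, 1, [1, 2]),     # 4: edge bc
    (1, 1, [0, 2]),     # 5: edge ca
    (2, 2, [3, 4, 5]),  # 6: triangle abc
]

def reduce_matrix(filtration):
    """Column reduction over Z/2; returns (birth, death) pairs and essential births."""
    reduced = []
    pivot_of = {}                                # pivot row -> column index
    pairs = []
    for j, (_, _, boundary) in enumerate(filtration):
        col = set(boundary)
        while col and max(col) in pivot_of:
            col ^= reduced[pivot_of[max(col)]]   # add an earlier column (mod 2)
        if col:
            pivot_of[max(col)] = j
            pairs.append((max(col), j))          # simplex max(col) is born, dies at j
        reduced.append(col)
    essential = [j for j, col in enumerate(reduced)
                 if not col and j not in pivot_of]   # classes that never die
    return pairs, essential

def persistent_betti(filtration, p, a, b):
    """Count p-dimensional classes born at value <= a that still live past value b."""
    pairs, essential = reduce_matrix(filtration)
    count = 0
    for i, j in pairs:
        if filtration[i][1] == p and filtration[i][0] <= a and filtration[j][0] > b:
            count += 1
    for i in essential:
        if filtration[i][1] == p and filtration[i][0] <= a:
            count += 1
    return count

print(persistent_betti(filtration, 0, 0, 0))  # 3: three components at value 0
print(persistent_betti(filtration, 0, 0, 1))  # 1: one component survives to value 1
print(persistent_betti(filtration, 1, 1, 1))  # 1: the loop born when the edges appear
print(persistent_betti(filtration, 1, 1, 2))  # 0: the loop is filled in at value 2
```

The two vertices merged into the first component and the loop killed by the triangle are exactly the classes that fail to persist, so they drop out of the persistent Betti counts.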
https://en.wikipedia.org/wiki/OS/4
OS/4 is a discontinued operating system, introduced in 1972, from UNIVAC for their 9400, 9480, and 9700 computer systems. It is an enhanced version of UNIVAC's 9400 Disc Operating System. OS/4 is a disc-resident system requiring 64 KB of main memory, two disc drives, a punched-card reader and a printer. The resident memory footprint is approximately 24 KB. UNIVAC intended to replace OS/4 with a new system, OS/7, but OS/7 development was discontinued in 1975 when the 9700 was made part of the new UNIVAC Series 90 line as the 90/70. References Discontinued operating systems UNIVAC mainframe computers
https://en.wikipedia.org/wiki/Persistent%20homology%20group
In persistent homology, a persistent homology group is a multiscale analog of a homology group that captures information about the evolution of topological features across a filtration of spaces. While the ordinary homology group represents nontrivial homology classes of an individual topological space, the persistent homology group tracks only those classes that remain nontrivial across multiple parameters in the underlying filtration. Analogous to the ordinary Betti number, the ranks of the persistent homology groups are known as the persistent Betti numbers. Persistent homology groups were first introduced by Herbert Edelsbrunner, David Letscher, and Afra Zomorodian in a 2002 paper Topological Persistence and Simplification, one of the foundational papers in the fields of persistent homology and topological data analysis, based largely on the persistence barcodes and the persistence algorithm, which were first described by Serguei Barannikov in a 1994 paper. Since then, the study of persistent homology groups has led to applications in data science, machine learning, materials science, biology, and economics. Definition Let K be a simplicial complex, and let f : K → ℝ be a real-valued monotonic function. Then for some values a₁ ≤ a₂ ≤ … ≤ aₙ the sublevel-sets K(aᵢ) = f⁻¹((−∞, aᵢ]) yield a sequence of nested subcomplexes K₁ ⊆ K₂ ⊆ … ⊆ Kₙ = K known as a filtration of K. Applying homology to each complex yields a sequence of homology groups connected by homomorphisms induced by the inclusion maps of the underlying filtration. When homology is taken over a field, we get a sequence of vector spaces and linear maps known as a persistence module. Let f_p^{i,j} : H_p(K_i) → H_p(K_j) be the homomorphism induced by the inclusion K_i ↪ K_j. Then the persistent homology groups are defined as the images H_p^{i,j} = im f_p^{i,j} for all 0 ≤ i ≤ j ≤ n. In particular, H_p^{i,i} = H_p(K_i). More precisely, the persistent homology group can be defined as H_p^{i,j} = Z_p(K_i) / (B_p(K_j) ∩ Z_p(K_i)), where Z_p and B_p are the standard p-cycle and p-boundary groups, respectively. Sometimes the elements of H_p^{i,j} are described as the homology classes that are
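A minimal worked example (a toy filtration of our own, not taken from the papers above, with homology over a field k) shows how the persistent homology group singles out the classes that survive. Take K₁ to be two vertices a and b, and K₂ to be K₁ with the edge ab added:

```latex
% Two-step filtration: K_1 = \{a, b\} \subseteq K_2 = \{a, b, ab\}
H_0(K_1) \cong k^2, \qquad H_0(K_2) \cong k, \qquad
f_0^{1,2}([a]) = f_0^{1,2}([b]) = [a] = [b].
% The persistent homology group is the image of the induced map:
H_0^{1,2} = \operatorname{im} f_0^{1,2} \cong k, \qquad \beta_0^{1,2} = 1.
% The class [a] - [b] \in H_0(K_1) maps to zero: it is born in K_1
% but does not survive the inclusion into K_2.
```

Of the two independent classes in H_0(K_1), only one remains nontrivial in K_2, so the rank of the persistent homology group, the persistent Betti number, is 1.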
https://en.wikipedia.org/wiki/RelayOne
RelayOne was an email to postal system service run by the United Kingdom's Royal Mail from 1998 until 2000. The company described it as a modern-day telegram. History The service launched in the United Kingdom in March 1998, and by April of that year a United States launch was planned. The service cost £1.50 per page and up to 50 pages could be sent for £5, which was a significantly higher cost than the price of a postage stamp. The email would be printed by Royal Mail in London and then posted in the regular mail to its recipient. A selection of greeting cards that could be printed was later added. Royal Mail's technology partner for the system was the American company, Microsoft. The service was discontinued in 2000 as it was not commercially viable. At the time of its demise RelayOne was handling 400 items a month. Royal Mail did however continue to support a similar free of charge system used to send mail to the British Armed Forces serving overseas. References External links Microsoft email software Email Royal Mail Telecommunications-related introductions in 1998 Products and services discontinued in 2000
https://en.wikipedia.org/wiki/Dichromatic%20symmetry
Dichromatic symmetry, also referred to as antisymmetry, black-and-white symmetry, magnetic symmetry, counterchange symmetry or dichroic symmetry, is a symmetry operation which reverses an object to its opposite. A more precise definition is "operations of antisymmetry transform objects possessing two possible values of a given property from one value to the other." Dichromatic symmetry refers specifically to two-coloured symmetry; this can be extended to three or more colours, in which case it is termed polychromatic symmetry. A general term for dichromatic and polychromatic symmetry is simply colour symmetry. Dichromatic symmetry is used to describe magnetic crystals and in other areas of physics, such as time reversal, which require two-valued symmetry operations. Examples A simple example is to take a white object, such as a triangle, and apply a colour change, resulting in a black triangle. Applying the colour change once more yields the original white triangle. The colour change, here termed an anti-identity operation (1'), yields the identity operation (1) if performed twice. Another example is to construct an anti-mirror reflection (m') from a mirror reflection (m) and an anti-identity operation (1') executed in either order. The m' operation can then be used to construct the antisymmetry point group 3m' of a dichromatic triangle. There are no mirror reflection (m) operations for the dichromatic triangle, as there would be if all the smaller component triangles were coloured white. However, by introducing the anti-mirror reflection (m') operation the full dihedral D3 symmetry is restored. The six operations making up the dichromatic D3 (3m') point group are: identity (1), rotation by 120° (3), rotation by 240° (3²), anti-mirror reflection (m'), combination of the 120° rotation with m' (3m'), and combination of the 240° rotation with m' (3²m'). Note that the vertex numbers do not form part of the triangle being operated on; they are shown to keep track of where the vertices end up after each operation. History
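That these six operations form a group can be checked mechanically. In the Python sketch below (our own construction, not from the source), each operation is modelled as a pair consisting of a permutation of the three vertex positions and a colour-flip bit, with anti-operations carrying flip bit 1:

```python
from itertools import product

def compose(op1, op2):
    """Apply op2 first, then op1; an operation is (vertex permutation, colour flip)."""
    p, a = op1
    q, b = op2
    return (tuple(p[q[i]] for i in range(3)), a ^ b)

identity = ((0, 1, 2), 0)
r120     = ((1, 2, 0), 0)          # rotation by 120 degrees
r240     = compose(r120, r120)     # rotation by 240 degrees
m1p      = ((0, 2, 1), 1)          # anti-mirror reflection m' fixing vertex 0

group = {identity, r120, r240, m1p,
         compose(r120, m1p), compose(r240, m1p)}

# closure: composing any two of the six operations stays inside the group
assert all(compose(g, h) in group for g, h in product(group, repeat=2))
assert len(group) == 6
```

The flip bits make the structure visible: plain rotations carry flip 0, the three anti-mirrors carry flip 1, and flips cancel in pairs, which is why m' applied twice is the identity.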
https://en.wikipedia.org/wiki/Harvest%20now%2C%20decrypt%20later
Harvest now, decrypt later, also known as store now, decrypt later or retrospective decryption, is a surveillance strategy that relies on the acquisition and long-term storage of currently unreadable encrypted data, awaiting possible breakthroughs in decryption technology that would render it readable in the future. The most common concern is the prospect of developments in quantum computing which would allow current strong encryption algorithms to be broken at some time in the future, making it possible to decrypt any stored material that had been encrypted using those algorithms. However, the improvement in decryption technology need not come from a quantum computer; any other form of attack capable of enabling decryption would be sufficient. The existence of this strategy has led to concerns about the need to urgently deploy post-quantum cryptography, even though no practical quantum attacks yet exist, because some data stored now may still remain sensitive even decades into the future. The U.S. federal government has proposed a roadmap for organizations to start migrating toward quantum-resistant algorithms to mitigate these threats. References See also Communications interception (disambiguation) Indiscriminate monitoring Mass surveillance Perfect forward secrecy Cryptography Espionage techniques Mass surveillance Computer data storage Privacy
https://en.wikipedia.org/wiki/Sphericity%20%28graph%20theory%29
In graph theory, the sphericity of a graph is a graph invariant defined to be the smallest dimension of Euclidean space required to realize the graph as an intersection graph of unit spheres. The sphericity of a graph is a generalization of the boxicity and cubicity invariants defined by F.S. Roberts in the late 1960s. The concept of sphericity was first introduced by Hiroshi Maehara in the early 1980s. Definition Let G be a graph. Then the sphericity of G, denoted by sph(G), is the smallest integer n such that G can be realized as an intersection graph of unit spheres in n-dimensional Euclidean space ℝⁿ. Sphericity can also be defined using the language of space graphs as follows. For a finite set of points in some n-dimensional Euclidean space, a space graph is built by connecting pairs of points with a line segment when their Euclidean distance is less than some specified constant. Then the sphericity of a graph G is the minimum n such that G is isomorphic to a space graph in ℝⁿ. Graphs of sphericity 1 are known as unit interval graphs or indifference graphs. Graphs of sphericity 2 are known as unit disk graphs. Bounds The sphericity of certain graph classes can be computed exactly. The following sphericities were given by Maehara on page 56 of his original paper on the topic. The most general known upper bound on sphericity is as follows. Assuming the graph G is not complete, then sph(G) ≤ n − ω(G), where ω(G) is the clique number of G and n denotes the number of vertices of G. References Graph theory Discrete mathematics Geometric graph theory
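The space-graph formulation is easy to make concrete. The sketch below is illustrative code of our own (the function name is not from the literature); it builds the edge set of the intersection graph of unit spheres centred at a given list of points, using the fact that two spheres of radius 1 overlap exactly when their centres are less than 2 apart (taking strict inequality for the threshold):

```python
from itertools import combinations
from math import dist  # Euclidean distance, Python 3.8+

def space_graph(points, threshold=2.0):
    """Edge set of the intersection graph of unit spheres centred at
    `points` (coordinate tuples in any fixed dimension); vertices are
    the indices into the list."""
    return {(i, j)
            for (i, p), (j, q) in combinations(enumerate(points), 2)
            if dist(p, q) < threshold}

# three collinear points in the plane: only neighbouring pairs are
# adjacent, so the result is a path, a graph of sphericity 1
edges = space_graph([(0, 0), (1.5, 0), (3.0, 0)])
assert edges == {(0, 1), (1, 2)}
```

Because the same code works for coordinate tuples of any length, it also illustrates the definition of sphericity itself: sph(G) is the smallest dimension of the points for which some placement reproduces G.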
https://en.wikipedia.org/wiki/Cubicity
In graph theory, cubicity is a graph invariant defined to be the smallest dimension such that a graph can be realized as an intersection graph of unit cubes in Euclidean space. Cubicity was introduced by Fred S. Roberts in 1969, along with a related invariant called boxicity that considers the smallest dimension needed to represent a graph as an intersection graph of axis-parallel rectangles in Euclidean space. Definition Let G be a graph. Then the cubicity of G, denoted by cub(G), is the smallest integer n such that G can be realized as an intersection graph of axis-parallel unit cubes in n-dimensional Euclidean space. The cubicity of a graph is closely related to the boxicity of a graph, denoted box(G). The definition of boxicity is essentially the same as cubicity, except in terms of axis-parallel rectangles instead of cubes. Since a cube is a special case of a rectangle, the cubicity of a graph is always an upper bound for its boxicity, i.e. box(G) ≤ cub(G). In the other direction, it can be shown that for any graph G on n vertices the inequality cub(G) ≤ ⌈log₂ n⌉ box(G) holds, where ⌈x⌉ is the ceiling function, i.e., the smallest integer greater than or equal to x. References Graph theory Discrete mathematics Geometric graph theory
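What ties cubicity to boxicity is that cube intersection reduces to interval overlap on each coordinate axis. A minimal sketch of our own: two axis-parallel unit cubes, each identified by its corner with smallest coordinates, intersect exactly when their projections overlap on every axis:

```python
def unit_cubes_intersect(a, b):
    """True when the axis-parallel unit cubes (side length 1) anchored
    at corners a and b overlap; equivalent to the unit intervals
    [a_k, a_k + 1) and [b_k, b_k + 1) overlapping on every axis k."""
    return all(abs(x - y) < 1 for x, y in zip(a, b))

assert unit_cubes_intersect((0, 0), (0.5, 0.9))      # overlap on both axes
assert not unit_cubes_intersect((0, 0), (0.5, 1.2))  # separated on axis 2
```

An intersection representation by unit cubes in n dimensions is thus the same thing as n simultaneous unit-interval representations, one per axis, which is the geometric content of the comparison between cub(G) and box(G).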
https://en.wikipedia.org/wiki/Colloidal%20gold%20protein%20assay
The colloidal gold protein assay is a highly sensitive biochemical assay for determining the total concentration of protein in a solution (~0.1 ng/µL to 200 ng/µL). It was first described in 1987 by two groups who used commercially available "Aurodye" colloidal gold solutions. Notably, the formulation of Aurodye changed between 1987 and 1990 such that it became incompatible with protein assays; however, vendors such as Bio-Rad and Diversified Biotech then started offering colloidal gold formulations that were suitable for protein assays. These products have since been discontinued and no vendor currently markets colloidal gold explicitly for the assay; however, detailed synthetic procedures have been published for producing the ~17-40 nm gold nanoparticles that are suitable for the assay, along with modifications to increase the shelf stability of the colloidal gold, adapt the assay to microplate format and increase its sensitivity. Gold nanoparticles in the ~17-40 nm size range that are presumably compatible with the assay are currently commercially available. Mechanism The total protein concentration is read out as an increase in absorbance at 565 nm, which can be measured using colorimetric techniques, including microplate readers. Most common reagents, except thiols and SDS, are compatible with the assay. An optimized formulation for the assay to maximize sensitivity in microplate format has been described. Comparison to other assays While the colloidal gold assay is the most sensitive in-solution colorimetric protein assay, it may be equalled or surpassed in sensitivity by fluorescent protein assays such as the CBQCA, FQ, NanoOrange, Quant-iT, and EZQ assays. See also Colloidal gold Bradford assay BCA assay References Biochemistry methods Chemical tests
https://en.wikipedia.org/wiki/Hyperpredation
Hyperpredation, also known as hypopredation, occurs when a generalist predator increases its predation pressure on a native prey as a result of the introduction of a substitute prey. Hyperpredation has been demonstrated, for instance, in laboratory settings using two hosts and a parasitoid wasp. Prey that require more handling time than they are worth in terms of nutritional value can lead to hyperpredation; in severe circumstances, predators that fed on such prey went extinct. Introduced Eastern cottontails create apparent competition with the European hare: the cottontails support a larger population of their main predator, the red fox, and the resulting increase in predation pressure on the hare amounts to hyperpredation. Examples After the invasion of feral pigs in the California Channel Islands, golden eagles (which had come to inhabit the islands after DDT wiped out the more territorial bald eagle population) began preying heavily on the alien species. Another prey species on the islands, the island fox, nearly went locally extinct due to the predation pressure from the golden eagles. Causes Theoretical research indicates that this increased predation may be sufficient to have a demographic impact on prey populations. The empirical data on hyperpredation that are now available apply only to situations where the introduction of a feral prey led to an overexploitation of the local prey. The most common cause of hyperpredation is apparent competition between the native and alien prey. See also Predator Carnivore Mesopredator Mutualism (biology) Interspecific competition Surplus killing References Ecology Ethology Predation
https://en.wikipedia.org/wiki/Alfred%20Louis%20Bacharach
Alfred Louis Bacharach (11 August 1891 – 16 July 1966) was a British food scientist, scientific author, socialist and editor of music history and criticism. He wrote as A.L. Bacharach. Education and politics Bacharach was born in Hampstead, London and educated at St Paul's School, London and Clare College, Cambridge until 1914. At Cambridge he was a member of the Fabian Society, where he made a lifelong friendship with the journalist William Norman Ewer. He was a member of the 1917 Club for socialists in London's Soho, and later became involved with the left-wing Guild Socialist Movement and (for forty years) with the Labour Research Department. From 1914 and for the rest of his life he was closely associated with the Working Men's College in North West London, where friends and colleagues included Ivor Brown and C. E. M. Joad, as well as Ewer. Scientific career He worked as a chemist at the Wellcome Research Laboratories in Kent during the First World War. From 1920 he was an analytical chemist at Joseph Nathan and Co Ltd in Greenford, Middlesex, which later changed its name to Glaxo Laboratories Ltd and eventually became GlaxoSmithKline. Bacharach was promoted to chief chemist and subsequently became head of the nutrition research unit. He spent most of his working life at Glaxo, from the first beginnings of the commercialization of vitamins, a subject on which he worked with Harry Jephcott. Bacharach advocated the fortification of baby milk with vitamin D in Britain, helping to eliminate rickets, which was previously rife in northern cities. In later years he was responsible for editing Glaxo's scientific papers. While at Glaxo he was the author of Science and Nutrition (1st edition, 1938; 2nd edition, 1945), and edited, with Theodore Rendle, The Nation's Food: A Survey of Scientific Data (1946). He was the editor (with Desmond Laurence) of the two-volume Evaluation of Drug Activities: Pharmacometrics (1964), and (with Otto Edholm) Exploration Medicine (1965) and The Physi
https://en.wikipedia.org/wiki/SEE-FIM%20Protocol
The SEE-FIM protocol is a pathology dissection protocol for Sectioning and Extensively Examining the Fimbria (SEE-FIM). This protocol is intended to provide for the optimal microscopic examination of the distal fallopian tube (fimbria) to identify either cancerous or precancerous conditions in this organ. Background Women with either a strong family history of breast and ovarian cancer or a documented inherited (germline) mutation in a BRCA gene are encouraged to consider risk reduction salpingo-oophorectomy (RRSO). The surgery is ideally conducted before the risk of developing high-grade serous carcinoma (HGSC) becomes too great to defer the procedure, generally age 35 for women with BRCA1 mutations and 45 for those with BRCA2 mutations. Removal of both tubes and ovaries reduces the risk of subsequent HGSC by 85% [see BRCA mutation]. Beginning in 2000, pathologists began to encounter early, often non-invasive HGSCs (serous tubal intraepithelial carcinomas, or STICs) in the fallopian tubes of women with germline BRCA mutations who underwent RRSO. Introduction of the SEE-FIM protocol Conception The SEE-FIM protocol was introduced in 2005 and requires examining all of the fallopian tube, specifically the sectioning and examination of the distal one-third (infundibulum and fimbria). Early HGSCs of the fallopian tube, once considered rare, were encountered frequently in this portion of the tube once the SEE-FIM protocol was adopted. Based on this information, the distal fallopian tube was cited as an origin for many HGSCs formerly classified as ovarian cancers. The SEE-FIM protocol was adopted by many to identify or exclude these tumors during pathologic examination of the fallopian tubes in risk reduction salpingo-oophorectomies. Method The SEE-FIM protocol consists of five steps (See Figure): The tube is fixed for at least 2 hours in laboratory fixative. The distal one third is amputated. The distal one third is sectioned in the longitudinal (sagittal) plane as thinly as possible and su
https://en.wikipedia.org/wiki/Raleigh%20plot
Raleigh plots, or Rayleigh plots (also called circlegrams and closely related to circular histograms, phasor diagrams, and wind roses), are statistical graphics that serve as graphical representations for a Rayleigh test, mapping a mean vector onto a circular plot. Raleigh plots have many applications in the field of chronobiology, such as in studying butterfly migration patterns or protein and gene expression, and in other fields such as geology, cognitive psychology, and physics. History/Origin Raleigh plots were first introduced by Lord Rayleigh; the concept evolved from the Rayleigh test, which Lord Rayleigh introduced in 1880. The Rayleigh test is a popular statistical test used to measure the concentration of data points around a circle, identifying any unimodal bias in the distribution. Rayleigh plots emerged from this analysis as a means to illustrate the nature of the distribution. General interpretation In a Raleigh plot, each individual is assigned a unit vector with a corresponding angle. These unit vectors are averaged together into the mean vector. The length of the mean vector is determined by r (or R), the mean resultant length. r is a measure of concentration, ranging in value between 0 and 1. If the individual angles of the unit vectors are tightly clustered, then the r value will be closer to 1, while if they are widely scattered, then r will be closer to zero. The mean vector is positioned in the center of a circle. Dashes along the circumference of this circle denote desired values. Examples include angles from magnetic north (zero degrees) going clockwise (e.g., 90 degrees from magnetic north, or eastward); times of day, which can also be described in zeitgeber time and circadian time; and phase. Dots on the circumference are usually used to indicate individual unit vectors and their respective angle in regard to the values being measured. Raleigh plots can also use more than one mean vector, particularly if one
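The mean vector and its resultant length r can be computed directly from the individual angles. A small Python sketch of our own (angles given in degrees):

```python
from math import atan2, cos, degrees, radians, sin, sqrt

def mean_vector(angles_deg):
    """Average the unit vectors at the given angles and return
    (r, mean_angle_deg), where r is the mean resultant length in [0, 1]."""
    x = sum(cos(radians(a)) for a in angles_deg) / len(angles_deg)
    y = sum(sin(radians(a)) for a in angles_deg) / len(angles_deg)
    return sqrt(x * x + y * y), degrees(atan2(y, x)) % 360

# tightly clustered angles give r near 1; evenly spread angles give r near 0
r_tight, mean_tight = mean_vector([85, 90, 95])
r_spread, _ = mean_vector([0, 90, 180, 270])
assert r_tight > 0.99 and abs(mean_tight - 90) < 1e-6
assert r_spread < 1e-9
```

The same r value is the statistic at the heart of the Rayleigh test, which is why the plot and the test are usually presented together.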
https://en.wikipedia.org/wiki/Eclosion%20assay
Eclosion assays are experimental procedures used to study the process of eclosion in insects, particularly in the model organism drosophila (fruit flies). Eclosion is the process in which an adult insect emerges from its pupal case, or a larval insect hatches from its egg. In holometabolous insects, the circadian clock regulates the timing of adult emergence. The daily rhythm of adult emergence in these insects was among the first circadian rhythms to be investigated. The circadian clock in these insects enforces a daily pattern of emergence by permitting or triggering eclosion during specific time frames and preventing emergence during other periods. The purpose of an eclosion assay is to count the number of flies that emerge over time from a developing population, which provides information on the circadian clock in the experimentally manipulated drosophila. For example, with an eclosion monitor, scientists can study how knocking out a certain gene changes the behavioral expression of a drosophila's biological clock. Additionally, the circadian rhythm of adult insect emergence was among the earliest chronobiological phenomena to be examined, significantly impacting the field of chronobiology through its contributions to understanding temperature compensation, phase response curves, and reactions to skeleton photoperiods. The eclosion assay serves as a vital tool for researchers in chronobiology. Bang box The bang box is the first experimental assay developed to measure eclosion in fruit flies. The first model of the bang box was developed at a Princeton University laboratory, largely credited to Colin Pittendrigh, to measure the time at which adult drosophilids emerged from pupae populations in a controlled light and temperature environment. This original model works by securing pupae on plastic boxes that can be temperature controlled. The pupae are harvested and attached to a brass holding plate. The holding plate is then secured to face a bras
https://en.wikipedia.org/wiki/Wave%20overtopping
Wave overtopping is the time-averaged amount of water that is discharged per linear metre by waves over a structure such as a breakwater, revetment or dike which has a crest height above still water level. When waves break over a dike, water flows onto the land behind it. Excessive overtopping is undesirable because it can compromise the integrity of the structure or result in a safety hazard, particularly when the structure is in an area where people, infrastructure or vehicles are present, such as in the case of a dike fronting an esplanade or densely populated area. Wave overtopping typically occurs during extreme weather events, such as intense storms, which often elevate water levels beyond average due to wind setup. These effects may be further intensified when the storm coincides with a high spring tide. Excessive overtopping may damage the inner slope of the dike, potentially leading to failure and inundation of the land behind the dike, or create water-related issues on the inside of the dike due to excess water pressure and inadequate drainage. The amount of overtopping depends on factors including the freeboard, wave height, wave period, and slope of the dike. Overtopping factors and influences Overtopping can occur through various combinations of water levels and wave heights: a low water level accompanied by high waves may yield an overtopping rate equivalent to that of a higher water level with lower waves. This is inconsequential when water levels and wave heights are correlated; however, it poses difficulties in river systems where these factors are uncorrelated. In such instances, a probabilistic calculation is necessary. The freeboard is the height of the dike's crest above the still water level, which usually corresponds to the determining storm surge level or river water level. Overtopping is typically expressed in litres per second per metre of dike length (L/s/m), as an average value. Overtopping fo
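As an illustration of how these factors combine, overtopping manuals commonly express the mean discharge with an exponential formula in the relative freeboard Rc/Hm0. The Python sketch below uses that generic exponential form; the coefficients a and b are illustrative placeholders rather than design values, and real formulas add influence factors for slope, roughness, obliqueness and berms:

```python
from math import exp, sqrt

G = 9.81  # gravitational acceleration, m/s^2

def overtopping_rate(h_m0, r_c, a=0.2, b=2.6):
    """Mean overtopping discharge q in m^3/s per metre of structure,
    from an exponential relation of the type used in overtopping manuals:
        q = a * sqrt(g * Hm0^3) * exp(-b * Rc / Hm0)
    h_m0: significant wave height (m); r_c: crest freeboard (m).
    Multiply by 1000 to express the result in L/s/m."""
    return a * sqrt(G * h_m0 ** 3) * exp(-b * r_c / h_m0)

# raising the crest (larger freeboard) reduces overtopping exponentially
q_low_crest = overtopping_rate(h_m0=2.0, r_c=2.0)
q_high_crest = overtopping_rate(h_m0=2.0, r_c=4.0)
assert q_high_crest < q_low_crest
```

The exponential dependence on Rc/Hm0 is what makes the freeboard the dominant design parameter: each additional increment of crest height buys a multiplicative reduction in discharge.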
https://en.wikipedia.org/wiki/Plants%20in%20Meitei%20culture
Many plants play significant roles in the different elements of Meitei culture, including but not limited to Meitei cuisine, Meitei festivals, Meitei folklore, Meitei folktales, Meitei literature, Meitei mythology and Meitei religion (Sanamahism) of Manipur. Perspective of Mother nature The "Hijan Hirao" (), an ancient Meitei language narrative poem, mentions that King Hongnem Luwang Ningthou Punshiba of the Luwang dynasty once ordered his men to cut down a tree in the forest to craft a beautiful royal Hiyang Hiren. According to the story, his servants found a big tree growing on the slope of a mountain by the side of a river. They performed traditional customary rites and rituals before chopping down the tree on the following day. In the dead of the night, Mother nature started weeping in fear of losing her dear child, the tree. The painful lamentations of Mother nature are described in the poem as follows: Plants used in rites and rituals is used by the Meitei people for decorations during the Sajibu Cheiraoba (Meitei Lunar New Year Day) celebrations. is also used by the Meitei people for decorations during the Sajibu Cheiraoba (Meitei Lunar New Year Day) celebrations. In Meitei culture, the Kombirei flower represents love, life and death. It is frequently mentioned in Meitei folktales and folk songs. In honor of this flowering plant species, the Government of Manipur organises the "Kombirei Festival" every year, with the aim of preserving and conserving the natural habitats of ethnic flowers like Kombirei. Real plants mentioned in old texts Cape jasmine description Giving reference to Meitei King Khagemba and the Manipur Kingdom, the beauty and grace of the Lei Kabok flower, also called (, cape jasmine), is described by Meitei King Charairongba in his book, the "Leiron", as follows: Real plants mentioned in folklore Colocasia/Taro plantation folktale In Meitei mythology and Meitei folklore of , plants are mentioned. In the Meitei folk
https://en.wikipedia.org/wiki/Robyn%20Lutz
Robyn R. Lutz is an American computer scientist whose research involves software engineering, including modeling and checking software requirements and software system safety. She is a professor of computer science at Iowa State University. Education and career Lutz majored in English at the University of Kansas, graduating with the highest distinction in 1974, earned a master's degree in Spanish there in 1976, and completed a Ph.D. in Spanish in 1980, under the supervision of Raymond Souza. Despite this non-technical background, she became a member of the technical staff at the Jet Propulsion Laboratory, associated with the California Institute of Technology, in 1983, and continued to hold an affiliation there until 2012. Returning to graduate study, she earned a master's degree in computer science in 1990 from Iowa State University. She held an affiliate assistant professor title there from 1994 to 2000. In 2000 she became a regular-rank associate professor, and in 2005 she was promoted to full professor. Recognition Lutz was named a Distinguished Scientist in the Association for Computing Machinery in 2014. In 2021, she received the lifetime service award from the IEEE International Requirements Engineering Conference. She was elected as an IEEE Fellow in 2022, "for contributions to software requirements for safety-critical systems". Personal life Lutz is married to Jack Lutz, a professor of mathematics and computer science at Iowa State University; their son Neil Lutz is also a computer scientist and a visiting assistant professor of computer science at Swarthmore College. They have published together on algorithmic game theory in DNA computing. References External links Home page Year of birth missing (living people) Living people American computer scientists American software engineers American women computer scientists Software engineering researchers University of Kansas alumni Iowa State University alumni Iowa State University faculty Fellow Members
https://en.wikipedia.org/wiki/Bricklayer%20function
In cryptography, a bricklayer function is a part of a round function that can be decomposed into identical, independent Boolean operations on the partitioned pieces of its input data, the so-called bundles. The term was introduced by Daemen and Rijmen in 2001. If the underlying function transforming a bundle is nonlinear, it is traditionally called an S-box. If the function is linear, Daemen and Rijmen use for it the term D-box (after diffusion). References Sources Cryptographic primitives
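A minimal sketch of the idea (our own illustration, not from the source): the state is partitioned into bundles and the same substitution table is applied to each bundle independently, exactly as a nonlinear bricklayer function built from one S-box would do. The 4-bit S-box below is just an example permutation of the 16 nibble values:

```python
def bricklayer(bundles, sbox):
    """A bricklayer function: apply the same substitution independently
    to every bundle of the partitioned state."""
    return [sbox[b] for b in bundles]

# an example 4-bit S-box: a permutation of the values 0x0..0xF
SBOX = [0xC, 0x5, 0x6, 0xB, 0x9, 0x0, 0xA, 0xD,
        0x3, 0xE, 0xF, 0x8, 0x4, 0x7, 0x1, 0x2]

state = [0x1, 0xF, 0x0, 0x7]  # four 4-bit bundles
assert bricklayer(state, SBOX) == [0x5, 0x2, 0xC, 0xD]
```

Because each bundle is transformed without looking at the others, the whole layer can be analysed (and implemented) one bundle at a time, which is the point of the decomposition.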
https://en.wikipedia.org/wiki/Persistence%20barcode
In topological data analysis, a persistence barcode, sometimes shortened to barcode, is an algebraic invariant of a persistence module that characterizes the stability of topological features throughout a growing family of spaces. Formally, a persistence barcode consists of a multiset of intervals in the extended real line, where the length of each interval corresponds to the lifetime of a topological feature in a filtration, usually built on a point cloud, a graph, a function, or, more generally, a simplicial complex or a chain complex. Generally, longer intervals in a barcode correspond to more robust features, whereas shorter intervals are more likely to be noise in the data. A persistence barcode is a complete invariant that captures all the topological information in a filtration. In algebraic topology, the persistence barcodes were first introduced by Sergey Barannikov in 1994 as the "canonical forms" invariants consisting of a multiset of line segments with ends on two parallel lines, and later, in geometry processing, by Gunnar Carlsson et al. in 2004. Definition Let F be a fixed field. Then a persistence module M indexed over ℝ consists of a family of F-vector spaces M_t and linear maps φ_{s,t} : M_s → M_t for each s ≤ t, such that φ_{t,u} ∘ φ_{s,t} = φ_{s,u} for all s ≤ t ≤ u. This construction is not specific to ℝ; indeed, it works identically with any totally-ordered set. A persistence module is said to be of finite type if it contains a finite number of unique finite-dimensional vector spaces. The latter condition is sometimes referred to as pointwise finite-dimensional. Let I be an interval in ℝ. Define a persistence module Q(I) via Q(I)_t = F for t ∈ I and Q(I)_t = 0 otherwise, where the linear maps are the identity map inside the interval. The module Q(I) is sometimes referred to as an interval module. Then for any ℝ-indexed persistence module M of finite type, there exists a multiset B of intervals such that M ≅ ⊕_{I∈B} Q(I), where the direct sum of persistence modules is carried out index-wise. The multiset B is called the barcode of M, and it is unique up to a reordering of the intervals.
https://en.wikipedia.org/wiki/Symbols%20of%20grouping
In mathematics and related subjects, understanding a mathematical expression depends on an understanding of symbols of grouping, such as parentheses (), brackets [], and braces {}. These same symbols are also used in ways where they are not symbols of grouping. For example, in the expression 3(x+y) the parentheses are symbols of grouping, but in the expression (3, 5) the parentheses may indicate an open interval. The most common symbols of grouping are the parentheses and the brackets, and the brackets are usually used to avoid too many repeated parentheses. For example, to indicate the product of binomials, parentheses are usually used, thus: (a + b)(c + d). But if one of the binomials itself contains parentheses, as in a + (b + c), one or more pairs of parentheses may be replaced by brackets, thus: [a + (b + c)](d + e). Beyond elementary mathematics, brackets are mostly used for other purposes, e.g. to denote a closed interval, or an equivalence class, so they appear rarely for grouping. The usage of the word "parentheses" varies from country to country. In the United States, the word parentheses (singular "parenthesis") is used for the curved symbol of grouping, but in many other countries the curved symbol of grouping is called a "bracket" and the symbol of grouping with two right angles joined is called a "square bracket". The symbol of grouping known as "braces" has two major uses. If two of these symbols are used, one on the left and the mirror image of it on the right, it almost always indicates a set, as in {a, b, c}, the set containing the three members a, b, and c. But if it is used only on the left, it groups two or more simultaneous equations. There are other symbols of grouping. One is the bar above an expression, as in the square root sign, in which the bar is a symbol of grouping. For example, √(x + y) is the square root of the sum. The bar is also a symbol of grouping in repeated decimal digits. A decimal point followed by one or more digits with a bar over them, for example 0.1 with a bar over the digit 1, represents the repeating decimal 0.1
https://en.wikipedia.org/wiki/Wave%20run-up
Wave run-up is the height to which waves run up the slope of a revetment, bank or dike, regardless of whether the waves are breaking or not. Conversely, wave run-down is the height to which waves recede. These heights are always measured vertically (and not along the slope). The wave run-up height, commonly denoted by Ru or, for the run-up level exceeded by 2% of incoming waves, Ru2%, is a very important parameter in coastal engineering as, together with the design highest still water level, it determines the required crest height of a dike or revetment. History The first scientific measurements of wave run-up were carried out by the Lorentz Committee in preparation for the works to close off the Zuiderzee. The Committee measured the wave height and wave run-up at various locations in 1920, but established that the methods then available for measuring waves in the field during storms were inadequate. As a result, scale tests were also undertaken, but these too proved of very limited value because only regular waves (idealised, periodic waves with constant amplitude and a fixed time period between successive wave crests, following a sinusoidal pattern) could be modelled at the time. The methods and technology available to the committee did not permit model testing of the more realistic and complex irregular waves (consisting of varying heights, periods and directions), which provide a more accurate representation of the actual conditions faced by coastal structures and shorelines. It was found, however, that the depth in front of the dike is very important for wave run-up and that, at least for the range of observations in the committee's measurements, the slope ratio does not play a major role. Nearly all dikes in the Netherlands at that time had a slope of 1:3. Current knowledge indicates that during storms and on gentle coastal slopes, the significant wave height is approximately half the water depth. This relationship appears to be accurate, and the observation is more pronounced for slopes
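For breaking waves on smooth, gentle slopes, a classical first estimate of run-up is Hunt's relation, in which run-up scales with the surf-similarity (Iribarren) number. The Python sketch below is illustrative only: it uses the coefficient-free Hunt form, whereas design formulas for Ru2% add empirical coefficients and influence factors for roughness and berms:

```python
from math import atan, pi, sqrt, tan

G = 9.81  # gravitational acceleration, m/s^2

def runup_hunt(h, t, slope_angle_rad):
    """Wave run-up estimate from Hunt's relation R = xi * H, where
    xi = tan(alpha) / sqrt(H / L0) is the surf-similarity number and
    L0 = g * T^2 / (2 * pi) the deep-water wavelength.
    h: wave height (m); t: wave period (s). Valid only for breaking
    waves, roughly xi below about 2.3."""
    l0 = G * t ** 2 / (2 * pi)
    xi = tan(slope_angle_rad) / sqrt(h / l0)
    return xi * h

# a steeper slope gives a larger run-up for the same incoming waves
assert runup_hunt(1.5, 6.0, atan(1 / 4)) < runup_hunt(1.5, 6.0, atan(1 / 3))
```

The slope dependence encoded in xi is why modern formulas do assign a major role to the slope ratio, in contrast to the Lorentz Committee's early finding for its limited range of observations.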
https://en.wikipedia.org/wiki/Anita%20Carleton
Anita D. Carleton is an American computer scientist and software engineer whose research concerns software measurement, the Capability Maturity Model, statistical process control, and their applications in managing and improving the software development process. She works in the Software Engineering Institute, associated with Carnegie Mellon University, as director of the Software Solutions Division. Education and career Carleton majored in applied mathematics at Carnegie Mellon University; she has an MBA from the MIT Sloan School of Management. She worked on missile weapon systems software at GTE and on tire modeling and simulation at the Goodyear Tire and Rubber Company, before her move to the Software Engineering Institute in the late 1980s. Book Carleton is the coauthor, with William A. Florac, of the book Measuring the Software Process: Statistical Process Control for Software Process Improvement (Addison-Wesley, 1999). Recognition Carleton was elected as an IEEE Fellow in 2022, "for leadership in the advancement of software measurement and practices". References Year of birth missing (living people) Living people American computer scientists American software engineers American women computer scientists Software engineering researchers Carnegie Mellon University alumni MIT Sloan School of Management alumni Fellow Members of the IEEE
https://en.wikipedia.org/wiki/Animals%20in%20Meitei%20culture
Animals () have significant roles in different elements of Meitei culture, including but not limited to Meitei cuisine, Meitei dances, Meitei festivals, Meitei folklore, Meitei folktales, Meitei literature, Meitei mythology, Meitei religion, etc. Deer in Meitei culture In one of the epic cycles of incarnations in Moirang, Kadeng Thangjahanba hunted and brought a lovely Sangai deer alive from a hunting ground called "Torbung Lamjao" as a gift of love for his girlfriend, Lady Tonu Laijinglembi. However, when he heard the news that his sweetheart had married King Laijing Ningthou Punsiba of ancient Moirang during his absence, he was deeply disappointed and saddened. In his grief, he came to sense the feelings of the deer at being separated from its mate (partner). So he released the deer in the wild of the Keibul Lamjao (modern day Keibul Lamjao National Park regions). Since then, the Sangai species has lived in the Keibul Lamjao region as its natural habitat. Dogs in Meitei culture Dogs are mentioned as friends or companions of human beings in many ancient tales and texts. In many cases, when dogs died, they were given respect by performing elaborate death ceremonies, equal to those of human beings. When goddess Konthoujam Tampha Lairembi saw smoke in her native place, she grew restless. She came down to earth from heaven to find out who had died. On reaching the place, her mother told her as follows: Elephants in Meitei culture In the Meitei epic of Khamba and Thoibi, the crown prince Chingkhu Akhuba of ancient Moirang and Kongyamba planned to kill Khuman Khamba. Kongyamba and his accomplices together threatened Khamba to give up Moirang Thoibi, which Khamba rejected. Then they fought, and Khamba beat all of them, and was about to kill Kongyamba, but the men that stood by, the friends of Kongyamba, dragged Khamba off, and bound him with ropes to the elephant of the crown prince. Then they goaded the elep
https://en.wikipedia.org/wiki/International%20Collection%20of%20%28Vesicular%29%20Arbuscular%20Mycorrhizal%20Fungi
The International Collection of (Vesicular) Arbuscular Mycorrhizal Fungi (INVAM) is the largest collection of living arbuscular mycorrhizal fungi (AMF) and includes Glomeromycotan species from 6 continents. Curators of INVAM acquire, grow, identify, and elucidate the biology, taxonomy, and ecology of a diversity of AMF with the mission to expand availability and knowledge of these symbiotic fungi. Culturing AMF is difficult because these fungi are obligate biotrophs that must complete their life cycle while in association with their plant hosts, while resting spores outside of the host are vulnerable to predation and degradation. Curators of INVAM have thus developed methods to overcome these challenges and increase the availability of AMF spores. The inception of this living collection of germplasm occurred in the 1980s, and it takes the form of fungi growing in association with plant symbionts in the greenhouse, with spores preserved in cold storage within their associated rhizosphere. AMF spores acquired from INVAM have been used extensively in both basic and applied research projects in the fields of ecology, evolutionary biology, agroecology, and restoration. INVAM operates under the Kansas Biological Survey at The University of Kansas, an R1 research institution. The Kansas Biological Survey is also home to the well-known organization Monarch Watch. INVAM is currently located within the tallgrass prairie ecoregion, and many collaborators and researchers associated with INVAM study the role of AMF in the mediation of prairie biodiversity. James Bever and Peggy Schultz lead the Curator and Director of Operations team, with Elizabeth Koziol and Terra Lubin as Associate Curators. History INVAM was conceived and established by Dr. Norman Schenk, a mycologist and professor of plant pathology. In 1985, Schenk’s vision was funded by the National Science Foundation to begin the International Culture Collection of Vesicular Arbuscular Mycorrhizal Fungi (INVAM). Sc
https://en.wikipedia.org/wiki/Branch%20number
In cryptography, the branch number is a numerical value that characterizes the amount of diffusion introduced by a vectorial Boolean function that maps an input vector to output vector . For the (usual) case of a linear the value of the differential branch number is produced by: applying nonzero values of (i.e., values that have at least one non-zero component of the vector) to the input of ; calculating for each input value the Hamming weight (number of nonzero components), and adding weights and together; selecting the smallest combined weight across all nonzero input values: . If both and have components, the result is clearly bounded from above by the value (this "perfect" result is achieved when any single nonzero component in makes all components of non-zero). A high branch number suggests higher resistance to differential cryptanalysis: small variations of the input will produce large changes in the output, and in order to obtain small variations of the output, large changes of the input value will be required. The term was introduced by Daemen and Rijmen in the early 2000s and quickly became a standard tool to assess the diffusion properties of transformations. Mathematics The branch number concept is not limited to linear transformations; Daemen and Rijmen provided two general metrics: differential branch number, where the minimum is obtained over inputs of that are constructed by independently sweeping all the values of two nonzero and unequal vectors , ( is a component-by-component exclusive-or): ; for linear branch number, the candidates and are swept independently; they should be nonzero and correlated with respect to (the coefficient of the linear approximation table of should be nonzero): . References Sources Cryptography
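For a linear map over small vectors, the differential branch number defined above can be computed by brute force: take the minimum of weight(a) + weight(F(a)) over all nonzero inputs a. The sketch below treats components as single bits and uses hypothetical toy matrices, not matrices from any standard cipher.

```python
from itertools import product

def weight(v):
    """Hamming weight: the number of nonzero components."""
    return sum(1 for c in v if c)

def differential_branch_number(F, n):
    """Brute-force minimum of weight(a) + weight(F(a)) over all
    nonzero n-component binary inputs; valid as written for linear F."""
    return min(weight(a) + weight(F(a))
               for a in product((0, 1), repeat=n) if any(a))

def linear_map(matrix):
    """F(a) = M.a over GF(2), with bit-valued components."""
    return lambda a: tuple(sum(m * x for m, x in zip(row, a)) % 2
                           for row in matrix)

# Hypothetical toy examples (not taken from any published cipher):
identity = linear_map([[1, 0], [0, 1]])
mix = linear_map([[0, 1, 1], [1, 0, 1], [1, 1, 0]])
print(differential_branch_number(identity, 2))  # 2
print(differential_branch_number(mix, 3))       # 3
```

The identity map achieves only the minimum possible value 2, since a single nonzero input component yields a single nonzero output component; the second matrix spreads single-bit differences to two output bits, raising the branch number to 3.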
https://en.wikipedia.org/wiki/Wind%20setup
Wind setup, also known as wind effect or storm effect, refers to the rise in water level in seas or lakes caused by winds pushing the water in a specific direction. As the wind moves across the water's surface, it applies a shear stress to the water, prompting the formation of a wind-driven current. When this current encounters a shoreline, the water level along the shore increases, generating a hydrostatic counterforce in equilibrium with the shear force. During a storm, wind setup is a component of the overall storm surge. For instance, in The Netherlands, the wind setup during a storm surge can elevate water levels by approximately 3 metres above the normal tide. In the case of cyclones, the wind setup can reach up to 5 metres. This can result in a significant rise in water levels, particularly when the water is forced into a shallow, funnel-shaped area. Observation In lakes, water level fluctuations are typically attributed to wind setup. This effect is particularly noticeable in lakes with well-regulated water levels, where the wind setup can be clearly observed. By comparing this with the wind over the lake, the relationship between wind speed, water depth, and fetch length can be accurately determined. This is especially feasible in lakes where water depth remains fairly consistent, such as the IJsselmeer. At sea, wind setup is usually not directly observable, as the observed water level is a combination of both the tide and the wind setup. To isolate the wind setup, the (calculated) astronomical tide must be subtracted from the observed water level. For example, during the North Sea flood of 1953 at the Vlissingen tidal station (see image), the highest water level along the Dutch coast was recorded at 2.79 metres, but this was not the location of the highest wind setup, which was observed at Scheveningen with a measurement of 3.52 metres. Notably, the highest wind setup ever recorded in the Netherlands (3.63 metres) was in Dintelsas, Steenbergen in 195
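The balance described above (wind shear versus a hydrostatic surface slope) is often schematised, for a fetch of roughly constant depth, as a setup proportional to u²·F/(g·d). The formula, the constant κ, and the function below are an illustrative sketch under that assumption; the article itself gives no formula, and the value of κ in practice is empirical and site-dependent.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def wind_setup(u, fetch, depth, kappa=3.5e-6, phi=0.0):
    """Steady-state wind setup (m) over a fetch of constant depth.

    u: wind speed (m/s); fetch: fetch length (m); depth: water
    depth (m); kappa: empirical shear constant (value here is
    illustrative only); phi: angle between wind and fetch (rad).
    """
    return kappa * u ** 2 * fetch * math.cos(phi) / (G * depth)

# A 25 m/s wind over 50 km of 5 m deep water gives a setup of
# order a couple of metres, consistent with the storm-surge
# magnitudes quoted in the text:
print(round(wind_setup(25, 50_000, 5), 2))
```

Note the qualitative behaviour the schematisation captures: setup grows quadratically with wind speed, linearly with fetch, and inversely with depth, which is why shallow, funnel-shaped basins are most vulnerable.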
https://en.wikipedia.org/wiki/Veratric%20acid
Veratric acid, also known as 3,4-dimethoxybenzoic acid, is a benzoic acid. It is a plant metabolite found in species such as Hypericum laricifolium, Artemisia sacrorum, and Zeyheria montana. Uses Medical research A 2023 study at SRM Institute of Science and Technology suggests that veratric acid has apoptotic and antiproliferative effects against triple negative breast cancer cells. These effects were substantially increased when polydopamine nanoparticles were used as a sustained release drug carrier. References Benzoic acids Plant metabolism Botany Phytochemicals
https://en.wikipedia.org/wiki/Disney%20Speedstorm
Disney Speedstorm is a free-to-play kart racing game developed by Gameloft Barcelona and published by Gameloft. It features various Disney and Pixar characters racing vehicles on tracks themed after the worlds of their films and franchises. The game was released in paid early access on 18 April 2023 for Nintendo Switch, PlayStation 4, PlayStation 5, Windows, Xbox One and Xbox Series X/S, with a soft launch on iOS and Android on 1 August 2023. It left early access on 28 September 2023. Gameplay Disney Speedstorm is a free-to-play kart racing game with a roster composed of characters from various Disney properties, such as the Mickey and Friends, Pirates of the Caribbean, Monsters, Inc., Toy Story and Beauty and the Beast franchises, among others. Various minor Disney characters appear as "crew members" that provide stat boosts and other enhancements for characters from their affiliated collections (franchises), such as the Orange Bird serving as a Crew Member for Figment. The gameplay is similar to that of Mario Kart. Racers, as the game refers to its playable characters, can drift to improve their cornering and charge their nitro boost, which can only be used when the boost meter is full. Racers also get a small drift boost from long drifts (indicated by their tires glowing blue), which also gives them an extra boost charge. Racers can sideswipe other Racers to knock them aside, jump to reach higher ground or dodge obstacles, offensive power-ups or other Racers, perform aerial stunts off designated jumps for a small speed boost upon landing, and grind on designated blue rails to also gain nitro boost. Racers can also pick up various power-ups, called "skills", by driving through sprites with the image of Arbee, the game's emotive AI mascot, on them. These skills include offensive weapons that stun Racers, shields that protect Racers from stuns, cloaks that make Racers intangible and invisible, nitro boost charges, rushes that provide sudden bursts of speed, and hacks t
https://en.wikipedia.org/wiki/Bioreceptivity
Bioreceptivity is defined as "the ability of a material to be colonized by living organisms." The term was first defined by Guillitte in 1995 as a new term in ecology to discuss the beneficial applications of building materials for ecological uses. Previous understandings termed the colonization of materials by organisms "degradation," a negative connotation that led to the coining of "bioreceptivity" to describe the positive benefits of colonization on materials. It is an interdisciplinary field of study between materials science and ecology. Bioreceptive design is commonly mistaken for biomimicry, or nature-inspired design. Marcos Cruz and Richard Beckett provide an alternative explanation known as architectural bark, in which design is both nature-inspired and nature-integrated, where colonization by the microbiome and organisms plays a role in the architectural design. Bioreceptivity is different from green infrastructure, such as green roofs, green walls, and stormwater management, but has been observed to be related to these research areas in architecture. Bioreceptive design has led to further research studies in concrete materials for use in urban environments through walls and non-green spaces. However, bioreceptive designs have implications beyond creating new green spaces, and can be used for conservation biology and ecological restoration. Urban ecologies A more recent trend in architectural design has been an effort to include green spaces in public areas to improve the connection between people and nature. However, the creation of green spaces involves pressures such as space, natural resource demand, and development limitations that reduce the amount of green space available in urban environments. Land space is limited due to increased urbanization, and human-dominated landscapes reduce regional biodiversity. To adapt to these challenges, designers are utilizing the vertical spaces provided by urban architecture to promote biodiversity. To address the issue of space availabil
https://en.wikipedia.org/wiki/Blockscale
Intel Blockscale was a brand of crypto-mining accelerator ASIC sold by the U.S. chip manufacturer Intel. The Blockscale product debuted in June 2022, and was cancelled by Intel in April 2023. Intel has stated that it will continue to supply chips to existing customers until April 2024. The Blockscale chips were SHA-256 hardware accelerators designed for proof-of-work calculations. According to Intel, they were capable of up to 580 GH/s with a power consumption of up to 22.7 W, and a claimed efficiency of up to 26 J/TH. The product came in three variants: the Blockscale 1120, 1140, and 1160. References Intel products Hardware acceleration Cryptographic hardware Bitcoin 2022 establishments 2023 disestablishments
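The relationship between the quoted figures can be checked directly: efficiency in joules per terahash is power (W = J/s) divided by hash rate (TH/s). At the peak figures of 580 GH/s and 22.7 W this works out to about 39 J/TH, so the claimed best-case 26 J/TH evidently refers to a different, lower-power operating point. A minimal sketch of the calculation:

```python
def efficiency_j_per_th(power_w, hashrate_ghs):
    """Energy efficiency in joules per terahash:
    (J/s) / (TH/s) = J/TH."""
    return power_w / (hashrate_ghs / 1000.0)

# Peak figures quoted for the Blockscale ASIC:
print(round(efficiency_j_per_th(22.7, 580), 1))  # 39.1
```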
https://en.wikipedia.org/wiki/Nostr
Nostr is a decentralized network protocol for a distributed social networking system. The name is an acronym for "Notes and Other Stuff Transmitted by Relay". Posts are resistant to censorship and are cryptographically validated. Jack Dorsey, the co-founder of Twitter, has endorsed and financially supported the development of Nostr. See also ActivityPub Mastodon (social network) OStatus References External links Social media Social networking services Public-domain software
https://en.wikipedia.org/wiki/Sister%20Beiter%20conjecture
In mathematics, the Sister Beiter conjecture is a conjecture about the size of coefficients of ternary cyclotomic polynomials (i.e. where the index is the product of three prime numbers). It is named after Marion Beiter, a Catholic nun who first proposed it in 1968. Background For the maximal coefficient (in absolute value) of the cyclotomic polynomial is denoted by . Let be three prime numbers. In this case the cyclotomic polynomial is called ternary. In 1895, A. S. Bang proved that . This implies the existence of such that . Statement Sister Beiter conjectured in 1968 that . This was later disproved, but a corrected Sister Beiter conjecture was put forward as . Status A preprint from 2023 explains the history in detail and claims to prove this corrected conjecture. Explicitly it claims to prove References Conjectures about prime numbers Polynomials
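The maximal coefficient in question (often written A(n) or M(n) in the literature) can be computed directly from the Möbius-product formula for cyclotomic polynomials, Φₙ(x) = ∏_{d|n} (x^d − 1)^{μ(n/d)}. The sketch below is an illustrative brute-force computation, using exact integer polynomial arithmetic; the famous smallest ternary case with a coefficient beyond ±1 is n = 3·5·7 = 105, whose polynomial contains a coefficient −2.

```python
def mobius(n):
    """Moebius function via trial division."""
    result, p = 1, 2
    while p * p <= n:
        if n % p == 0:
            n //= p
            if n % p == 0:
                return 0  # squared prime factor
            result = -result
        else:
            p += 1
    return -result if n > 1 else result

def poly_mul(a, b):
    """Multiply two polynomials (descending coefficient lists)."""
    out = [0] * (len(a) + len(b) - 1)
    for i, x in enumerate(a):
        for j, y in enumerate(b):
            out[i + j] += x * y
    return out

def poly_divexact(num, den):
    """Exact long division by a monic polynomial."""
    num, q = list(num), []
    while len(num) >= len(den):
        c = num[0]
        q.append(c)
        for i in range(len(den)):
            num[i] -= c * den[i]
        num.pop(0)
    assert all(v == 0 for v in num), "division was not exact"
    return q

def cyclotomic(n):
    """Coefficients of the n-th cyclotomic polynomial, via
    Phi_n(x) = prod over d | n of (x^d - 1)^mobius(n/d)."""
    num, den = [1], [1]
    for d in range(1, n + 1):
        if n % d == 0:
            factor = [1] + [0] * (d - 1) + [-1]  # x^d - 1
            mu = mobius(n // d)
            if mu == 1:
                num = poly_mul(num, factor)
            elif mu == -1:
                den = poly_mul(den, factor)
    return poly_divexact(num, den)

def max_coefficient(n):
    """Largest coefficient of Phi_n in absolute value."""
    return max(abs(c) for c in cyclotomic(n))

print(max_coefficient(3 * 5 * 7))  # 2 -- first coefficient beyond +/-1
```

Smaller ternary indices such as 15 = 3·5 give only coefficients 0 and ±1, which is why n = 105 is the standard first counterexample to the naive expectation that all cyclotomic coefficients are small.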
https://en.wikipedia.org/wiki/Oleaginous%20microorganism
An oleaginous microorganism is a type of microbe that accumulates lipid as a normal part of its metabolism. Oleaginous microbes may accumulate an array of different lipid compounds, including polyhydroxyalkanoates, triacylglycerols, and wax esters. Various microorganisms, including bacteria, fungi, and yeast, are known to accumulate lipids. These organisms are often researched for their potential use in producing fuels from waste products. Function In typical bacteria, polar lipids such as phospholipids are synthesized to maintain the cell membrane. In oleaginous organisms, however, lipids can be synthesized and accumulated within the cell to act as energy storage in nutrient-deprived conditions. Lipid accumulation can also serve secondary purposes, such as acting as a water source in water-stressed conditions and preventing oxidative stress from the formation of reactive oxygen species under ultraviolet radiation. Lipid accumulation occurs as a storage of energy and nutrients, and appears to be triggered by inadequate environmental conditions. Bacteria such as Methylobacterium rhodesianum strain MB126 have been observed to accumulate poly-β-hydroxybutyrate when grown under phosphorus-, nitrogen-, and carbon-deficient conditions. Similarly, other organisms, such as oleaginous Rhodococcus species like R. opacus, are known to accumulate triacylglycerols instead, with the fatty acid content of these compounds varying by organism and environmental conditions. Lipid accumulation is proposed to be advantageous to oleaginous microbes as it provides a source of energy and nutrients when they are absent from the environment. It allows the organisms to survive 'feast and famine' conditions, preventing die-offs before a new source of energy and nutrients becomes available to the population. The specific conditions causing triacylglycerol synthesis and accumulation have been studied in order to develop processes where its intracellular content is maximized
https://en.wikipedia.org/wiki/Zearn
Zearn is an American nonprofit educational software organization. Its online program, Zearn Math, was founded in 2012 and helps elementary and middle school students explore and make sense of mathematical concepts. The organization develops digital lessons and curriculum for teachers, school districts and state education agencies and provides data on math learning. Zearn’s curriculum is used in both targeted small group instruction and in personalized digital lessons. History In 2012, Evan Rudall, former CEO of Uncommon Schools, founded Zearn as a nonprofit organization to develop interactive digital math content for elementary school children. The organization received $4.4 million in grants from the Bill & Melinda Gates Foundation, part of the foundation's larger $1 billion investment into math education. Co-founder and CEO since 2016, Shalinee Sharma, previously worked at Bain & Company for 12 years. The organization developed the digital learning platform Zearn Math, with an online interactive curriculum that could be used by educators within classes, as well as by students for extra learning. The lessons are free for students and teachers, with schools and school districts able to pay for extra services including training and printed lessons. First used in New York, the curriculum was used as part of statewide tutoring programs in Tennessee and Texas. By 2022, it was being used by 25% of US elementary school students and more than one million middle school students, according to its own tracking of sign-ons. An evaluation report by Johns Hopkins School of Education in 2019 found that overall perceptions of Zearn Math were very positive, with the smaller group model of the curriculum a particular strength. Differences in achievement gains were small but "statistically significant", and the report recommended further support be provided for independent learning and individual needs. In March 2023, the New York City Department of Education approved a seven year co
https://en.wikipedia.org/wiki/Motzkin%E2%80%93Taussky%20theorem
The Motzkin–Taussky theorem is a result from operator and matrix theory about the eigenvalues of a sum of two bounded, linear operators (resp. matrices). The theorem was proven by Theodore Motzkin and Olga Taussky-Todd. The theorem is used in perturbation theory, where e.g. operators of the form are examined. Statement Let be a finite-dimensional complex vector space. Furthermore, let be such that all linear combinations are diagonalizable for all . Then all eigenvalues of are of the form (i.e., they are linear in and ) and are independent of the choice of . Here stands for an eigenvalue of . Comments Motzkin and Taussky call the above property of linearity of the eigenvalues property L. Bibliography Kato, Tosio (1995). Perturbation Theory for Linear Operators. Berlin, Heidelberg: Springer. p. 86. ISBN 978-3-540-58661-6, doi:10.1007/978-3-642-66282-9. Friedland, Shmuel (1981). A generalization of the Motzkin-Taussky theorem. Linear Algebra and its Applications. Vol. 36. pp. 103–109. doi:10.1016/0024-3795(81)90223-8. Notes Mathematical theorems Linear algebra Perturbation theory Linear operators
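Property L can be checked numerically for a pair of commuting (hence simultaneously diagonalizable) real symmetric matrices, for which every linear combination is diagonalizable. The matrices below are hypothetical toy examples chosen for illustration, not taken from the Motzkin–Taussky paper.

```python
import numpy as np

# Two commuting symmetric matrices, simultaneously diagonalized by
# the eigenvectors (1, 1) and (1, -1):
A = np.array([[2.0, 1.0], [1.0, 2.0]])   # eigenvalues 3, 1
B = np.array([[0.0, 1.0], [1.0, 0.0]])   # eigenvalues 1, -1

# Property L: the eigenvalues of alpha*A + beta*B are exactly the
# linear combinations 3*alpha + beta and alpha - beta.
for alpha, beta in [(1.0, 0.5), (2.0, -1.0), (0.3, 0.7)]:
    got = np.sort(np.linalg.eigvalsh(alpha * A + beta * B))
    expected = np.sort(np.array([3 * alpha + beta, alpha - beta]))
    assert np.allclose(got, expected)
print("property L verified for this commuting pair")
```

For non-commuting pairs, linear combinations are generally not all diagonalizable and the eigenvalues are not linear in the coefficients, which is what makes the hypothesis of the theorem essential.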
https://en.wikipedia.org/wiki/Locked%20Shields
Locked Shields is an annual cyber defence exercise organised by NATO's Cooperative Cyber Defence Centre of Excellence in Tallinn since 2010. The format is usually that a red team simulates a hostile attack while blue teams from the participating nations simulate their coordination and defence against this. The performance of teams is assessed using a mix of automated and manual scoring. In 2022, there were 24 teams with an average of 50 experts in each team. The team from Finland was declared as the 2022 winner for the excellence of their situation reporting and solid defence. References Security engineering Cyberwarfare
https://en.wikipedia.org/wiki/Ross%20Baldick
Ross Baldick is an American professor emeritus of electrical and computer engineering at the University of Texas at Austin. He is a fellow of the Institute of Electrical and Electronics Engineers (IEEE) Power and Energy Society. He is the chairman of the System Economics Sub-Committee of the IEEE Power Engineering Society and an associate editor of IEEE Transactions on Power Systems. His research interests are optimization and economic theory applied to electric power system operations, public policy, and technical issues related to electric transmission under deregulation. Education and career He received his bachelor of science in mathematics and physics and bachelor of engineering in electrical engineering from the University of Sydney, Australia, in 1983 and 1985, respectively. He received his Master of Science and Doctor of Philosophy in electrical engineering and computer sciences from the University of California, Berkeley in 1988 and 1990, respectively. From 1991 to 1992, after completing his doctoral studies, he worked as a post-doctoral fellow at the Lawrence Berkeley National Laboratory. From 1992 to 1993, he was an assistant professor at Worcester Polytechnic Institute, Worcester, MA. In 1993, Baldick joined the University of Texas at Austin faculty, where he remained until his retirement in 2021. Research Baldick's research interests in electric power span multiple areas, and he has contributed to over one hundred peer-reviewed journal articles. His research focuses on optimization and economic theory applied to electric power system operations and the public policy and technical issues associated with electric transmission under deregulation. He has published numerous articles on these topics and is the author of the textbook "Applied Optimization: Formulation and Algorithms for Engineering Systems." Honors and awards In 2008, Baldick was named an IEEE Fellow for his contributions to analyzing and optimizing electric power systems. In addition to b
https://en.wikipedia.org/wiki/Polychromatic%20symmetry
Polychromatic symmetry is a colour symmetry which interchanges three or more colours in a symmetrical pattern. It is a natural extension of dichromatic symmetry. The coloured symmetry groups are derived by adding to the position coordinates (x and y in two dimensions; x, y and z in three dimensions) an extra coordinate, k, which takes three or more possible values (colours). An example of an application of polychromatic symmetry is that crystals of substances containing molecules or ions in triplet states, that is with an electronic spin of magnitude 1, should sometimes have structures in which the spins of these groups have projections of +1, 0 and -1 onto local magnetic fields. If these three cases are present with equal frequency in an orderly array, then the magnetic space group of such a crystal should be three-coloured. Example The wallpaper group p3 has three different rotation centres of order three (120°), but no reflections or glide reflections. There are two distinct ways of colouring the p3 pattern with three colours: p3[3]1 and p3[3]2, where the figure in square brackets indicates the number of colours, and the subscript distinguishes between multiple cases of coloured patterns. A single motif in the pattern p3[3]1 has a symmetry operation 3', consisting of a rotation by 120° and a cyclical permutation of the three colours white, green and red, as shown in the animation. This pattern p3[3]1 has the same colour symmetry as M. C. Escher's Hexagonal tessellation with animals: study of regular division of the plane with reptiles (1939). Escher reused the design in his 1943 lithograph Reptiles. Group theory Initial research by Wittke and Garrido (1959) and by Niggli and Wondratschek (1960) identified the relation between the colour groups of an object and the subgroups of the object's geometric symmetry group. In 1961 van der Waerden and Burckhardt built on the earlier work by showing that colour groups can be defined as follows: in a colour group of a pat
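The combined operation 3' described above (a 120° rotation together with a cyclic permutation of the three colours) can be modelled directly, representing motif positions as complex numbers. The sketch below verifies that applying 3' three times is the identity, so 3' generates a cyclic group of order 3; the colour names follow the white/green/red example in the text.

```python
import cmath

OMEGA = cmath.exp(2j * cmath.pi / 3)           # rotation by 120 degrees
CYCLE = {"white": "green", "green": "red", "red": "white"}

def op_3prime(point, colour):
    """The operation 3': rotate the motif 120 degrees about the
    origin and cyclically permute the three colours."""
    return point * OMEGA, CYCLE[colour]

# Applying 3' three times returns every coloured point to itself:
p, c = 1.0 + 0.5j, "white"
q, d = p, c
for _ in range(3):
    q, d = op_3prime(q, d)
assert abs(q - p) < 1e-12 and d == c
print("3' has order 3")
```

Neither the rotation alone (order 3) nor the colour cycle alone (order 3) distinguishes p3[3]1 from the uncoloured p3 pattern; it is their combination into a single colour-symmetry operation that does.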
https://en.wikipedia.org/wiki/Asian%20Association%20on%20Remote%20Sensing
Asian Association on Remote Sensing or AARS is a non-governmental organization established in 1981 to promote remote sensing in the Asia-Pacific region; it currently has members from 29 countries. Members Indian Society of Remote Sensing Surveying & Spatial Sciences Institute Malaysian Remote Sensing Agency Japan Society of Photogrammetry and Remote Sensing SPARRSO Institute of Remote Sensing and Digital Earth Korean Society of Remote Sensing References External links Official site Remote sensing organizations Remote sensing Geodesy
https://en.wikipedia.org/wiki/Inbox%20and%20outbox%20pattern
The inbox pattern and outbox pattern are two related patterns used by applications to persist data (usually in a database) to be used for operations with guaranteed delivery. The inbox and outbox concepts are used in the ActivityPub protocol and in email. The inbox pattern The application receives data which it persists to an inbox table in a database. Once the data has been persisted, another application, process or service can read from the inbox table and use the data to perform an operation which it can retry upon failure until completion; the operation may take a long time to complete. The inbox pattern ensures that a message was received (e.g. from a queue) successfully at least once. The outbox pattern The application persists data to an outbox table in a database. Once the data has been persisted, another application or process can read from the outbox table and use that data to perform an operation which it can retry upon failure until completion. The outbox pattern ensures that a message was sent (e.g. to a queue) successfully at least once. See also Enterprise service bus Message broker External links Push-based Outbox Pattern with Postgres Logical Replication Software design patterns
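A minimal outbox sketch using SQLite illustrates the key idea: the business write and the outbox write happen in one transaction, and a separate relay delivers pending rows. Table and column names here are illustrative, not part of any standard.

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE orders (id INTEGER PRIMARY KEY, item TEXT);
    CREATE TABLE outbox (id INTEGER PRIMARY KEY,
                         payload TEXT,
                         dispatched INTEGER NOT NULL DEFAULT 0);
""")

def place_order(item):
    # The business write and the outbox write share one transaction,
    # so a message is recorded if and only if the order is persisted.
    with db:
        db.execute("INSERT INTO orders (item) VALUES (?)", (item,))
        db.execute("INSERT INTO outbox (payload) VALUES (?)",
                   (f"order-created:{item}",))

def relay(send):
    # A separate poller delivers pending rows. Marking a row as
    # dispatched only after a successful send yields at-least-once
    # delivery: a crash between send() and UPDATE causes a resend.
    rows = db.execute("SELECT id, payload FROM outbox"
                      " WHERE dispatched = 0").fetchall()
    for row_id, payload in rows:
        send(payload)
        with db:
            db.execute("UPDATE outbox SET dispatched = 1 WHERE id = ?",
                       (row_id,))

place_order("book")
sent = []
relay(sent.append)
print(sent)  # ['order-created:book']
```

The inbox pattern is the mirror image: received messages are first persisted to an inbox table in the consumer's database, and a local worker processes them from there with retries.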
https://en.wikipedia.org/wiki/Hyperchaos
A hyperchaotic system is a dynamical system with a bounded attractor set, on which there are at least two positive Lyapunov exponents. Since the sum of Lyapunov exponents on an attractor is non-positive, there must also be at least one negative Lyapunov exponent. If the system has continuous time, then the Lyapunov exponent along the trajectory is zero, and so the minimal number of dimensions in which continuous-time hyperchaos can occur is 4. Similarly, discrete-time hyperchaos requires at least 3 dimensions. Mathematical examples The first two hyperchaotic systems were proposed in 1979. One is a discrete-time system ("folded-towel map"): Another is a continuous-time system: More examples can be found in the literature. Experimental examples Only a few experimental hyperchaotic behaviors have been identified, including in an electronic circuit, in an NMR laser, in a semiconductor system, and in a chemical system. References Chaotic maps Nonlinear systems Articles containing video clips
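The definition can be illustrated numerically with a three-dimensional discrete-time map. The sketch below uses the generalized (3D) Hénon map, a standard textbook example often attributed to Baier and Klein, with commonly quoted parameter values, and estimates all three Lyapunov exponents with the standard QR (Benettin-style) method; the parameters, the initial condition, and the iteration counts are illustrative assumptions, not taken from the article.

```python
import numpy as np

A, B = 1.76, 0.1  # commonly quoted hyperchaotic parameters

def step(v):
    """One iteration of the 3D generalized Henon map."""
    x, y, z = v
    return np.array([A - y * y - B * z, x, y])

def jacobian(v):
    _, y, _ = v
    return np.array([[0.0, -2.0 * y, -B],
                     [1.0, 0.0, 0.0],
                     [0.0, 1.0, 0.0]])

def lyapunov_exponents(n_steps=20000, transient=1000):
    """QR-based estimate of all three Lyapunov exponents."""
    v = np.array([0.1, 0.1, 0.1])
    for _ in range(transient):
        v = step(v)                      # settle onto the attractor
    Q = np.eye(3)
    sums = np.zeros(3)
    for _ in range(n_steps):
        Q, R = np.linalg.qr(jacobian(v) @ Q)
        sums += np.log(np.abs(np.diag(R)))
        v = step(v)
    return np.sort(sums / n_steps)[::-1]

exps = lyapunov_exponents()
print(exps)  # hyperchaos: at least two positive exponents
```

A useful sanity check: the Jacobian has constant determinant −B, so the three exponents must sum to ln(B) ≈ −2.30 regardless of the trajectory, consistent with the requirement that their sum on an attractor is negative.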
https://en.wikipedia.org/wiki/Open%20Connect
Open Connect is a content distribution network developed by Netflix specifically to deliver its TV shows and movies while avoiding third-party transit traffic and fees. Netflix provides physical appliances to internet service providers (ISPs) that allow them to serve streaming traffic locally during peak hours and absorb anticipated demand. By shipping copies of content to these appliances ahead of time, the devices can store duplicates of titles, thereby reducing the network burden. Netflix also places its servers in locations with the highest numbers of subscribers and forms partnerships with ISP networks or internet exchange points (IXPs). Furthermore, Netflix adapts its content to the quality of the network. This is achieved by sending three copies of each title, each at a different quality level, to its servers. For example, if a user's ISP is overwhelmed or the Internet connection is poor, the system can select a lower-bitrate version of the title. History Netflix launched Open Connect in 2012. Since then, Netflix has spent over $1 billion to develop and distribute more than 8,000 Open Connect Appliances (OCAs). The service started working on the free-of-charge distribution of OCAs in cooperation with ISPs. So far, more than 1,000 ISPs have acquired and installed OCAs, which had allowed them to save $1.25 billion by 2021. In the case of an OCA that is hosted at an IXP, Netflix maintains ownership of the OCA and is responsible for covering its own expenses such as power consumption, colocation fees, cross-connect fees, and other related costs. Netflix has installed OCAs in over 52 IXPs around the world, enabling a connection with any ISP. Deployment IX deployment Netflix deploys OCAs within IXPs located in major Netflix markets around the world. These OCAs are interconnected with ISPs present at the same location through free public or private peering. Embedded deployment OCAs are directly installed within ISP networks. While Netflix supplies the server hardware at no cost, ISPs are responsible for providi
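The quality-selection idea described above (several pre-encoded copies per title, with a lower-bitrate copy chosen when the connection is poor) can be sketched as a simple selection rule. The variant names, bitrates, and the headroom factor below are illustrative assumptions, not Netflix's actual selection logic.

```python
def pick_variant(variants, measured_throughput_kbps, headroom=0.8):
    """Choose the highest-bitrate copy that fits within a fraction
    (headroom) of the measured throughput; fall back to the lowest
    bitrate if nothing fits. Illustrative sketch only."""
    usable = measured_throughput_kbps * headroom
    fitting = [v for v in variants if v["bitrate_kbps"] <= usable]
    if not fitting:
        return min(variants, key=lambda v: v["bitrate_kbps"])
    return max(fitting, key=lambda v: v["bitrate_kbps"])

variants = [{"name": "low", "bitrate_kbps": 1500},
            {"name": "medium", "bitrate_kbps": 4000},
            {"name": "high", "bitrate_kbps": 8000}]
print(pick_variant(variants, 6000)["name"])  # medium (0.8 * 6000 = 4800)
```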
https://en.wikipedia.org/wiki/Spitfire%2040
Spitfire 40 is a combat flight simulation video game developed by Novotrade and published by Mirrorsoft for the Commodore 64 in 1985. Gameplay The game is set during the Battle of Britain where the player flies a Supermarine Spitfire. The game features two separate screens: The view from the cockpit and the instrument panel. Both have to be used in conjunction to fly the plane. Two tutorial modes are included: flying practice and combat practice. Reception Zzap!64 summarized: "There are much better flight simulators than this – even Glider Pilot has faster graphics". Commodore User compared the game to Spitfire Ace and said Spitfire 40 is the better of the two. Crash called the game "An excellent simulation which should appeal to arcade players too." Your Sinclair said that "Spitfire 40 is a friendly program, not nearly so difficult to get into as some earlier simulators, and it's very engaging with its role playing element." Sinclair User summarized: "Not quite a Classic, then, but definitely Mirrorsoft's finest hour." Amtix called the game "An excellent program, and definitely the best flight simulator on the Amstrad." The Games Machine reviewed the Atari ST port: "Despite the age of Spitfire 40, the thrill of combat is present..." Computer Gaming World wrote in 1991: "poor graphics and poorer execution. It flies like a bus with the maneuverability of a tractor-trailer." The game was a best seller in England. References External links Spitfire 40 at Lemon64 Spitfire 40 at Spectrum Computing 1985 video games Amstrad CPC games Atari ST games Avalon Hill video games Battle of Britain video games Commodore 64 games Mirrorsoft games Single-player video games Video games developed in Hungary World War II flight simulation video games ZX Spectrum games
https://en.wikipedia.org/wiki/Validation%20and%20verification%20%28medical%20devices%29
Validation and verification are procedures that ensure that medical devices fulfil their intended purpose. Validation or verification is generally needed when a health facility acquires a new device to perform medical tests. Validation or verification The main difference between the two is that validation is focused on ensuring that the device meets the needs and requirements of its intended users and the intended use environment, whereas verification is focused on ensuring that the device meets its specified design requirements. For instance, a regulatory agency (such as CE or FDA) may ensure that a product has been validated for general use before approval. An individual laboratory that introduces such an approved medical device may then not need to perform their own validation, but generally still need to perform verification to ensure that the device works correctly. Workflow Standards Standards for validation and verification of medical laboratories are outlined in the international standard ISO 15189, in addition to national and regional regulations. As per United States federal regulations, the following analytical tests need to be done by a medical laboratory that introduces a new testing device: To establish a reference range, the Clinical and Laboratory Standards Institute (CLSI) recommends testing at least 120 patient samples. In contrast, for the verification of a reference range, it is recommended to use a total of 40 samples, 20 from healthy men and 20 from healthy women, and the results should be compared to the published reference range. The results should be evenly spread throughout the published reference range rather than clustered at one end. The published reference range can be accepted for use if 95% of the results fall within it. Otherwise, the laboratory needs to establish its own reference range. See also Validation (drug manufacture) References Quality management Product testing Systems engineering
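The 95% acceptance rule for reference-range verification described above lends itself to a direct computation. The function below is an illustrative sketch of that rule as stated in the text, not an official CLSI procedure; note that in practice the laboratory should also check that results are spread across the range rather than clustered at one end.

```python
def verify_reference_range(results, low, high, threshold=0.95):
    """Accept a published reference range if at least the given
    fraction (default 95%) of verification results fall within it."""
    if not results:
        raise ValueError("no results supplied")
    within = sum(1 for r in results if low <= r <= high)
    return within / len(results) >= threshold

# 40 verification results (20 men and 20 women in practice);
# 39 of 40 in range is 97.5%, so the published range is accepted:
results = [5.0] * 39 + [9.9]
print(verify_reference_range(results, 4.0, 6.0))  # True
```

With three or more of the 40 results outside the range (92.5% within), the rule fails and the laboratory would need to establish its own reference range.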
https://en.wikipedia.org/wiki/Hwang%20affair
The Hwang affair, or Hwang scandal, or Hwanggate, is a case of scientific misconduct and ethical issues surrounding a South Korean biologist, Hwang Woo-suk, who claimed to have created the first human embryonic stem cells by cloning in 2004. Hwang and his research team at the Seoul National University reported in the journal Science that they successfully developed a somatic cell nuclear transfer method with which they made the stem cells. In 2005, they published again in Science the successful cloning of 11 person-specific stem cells using 185 human eggs. The research was hailed as "a ground-breaking paper" in science. Hwang was elevated as "the pride of Korea", "national hero" [of Korea], and a "supreme scientist", to international praise and fame. Recognitions and honours immediately followed, including South Korea's Presidential Award in Science and Technology, and Time magazine listing him among the "People Who Mattered 2004" and the most influential people "The 2004 Time 100". Suspicion and controversy arose in late 2005, when Hwang's collaborator, Gerald Schatten at the University of Pittsburgh, learned the real source of the oocytes (egg cells) used in the 2004 study. The eggs, reportedly from several voluntary donors, were actually from two of Hwang's own researchers, a fact which Hwang denied. These ethical issues led Schatten to immediately break his ties with Hwang. In December 2005, a whistleblower informed Science of the reuse of the same data. As the journal investigated, far more extensive data fabrication came to light. SNU immediately investigated the research work and found that both the 2004 and 2005 papers contained fabricated results. Hwang was compelled to resign from the university, and publicly confessed in January 2006 that the research papers were based on fabricated data. Science immediately retracted the two papers. In 2009, the Seoul Central District Court convicted Hwang of embezzlement and bioethical violations, sentencing him to a two-year suspended prison sentence.
https://en.wikipedia.org/wiki/Random%20flip-flop
Random flip-flop (RFF) is a theoretical concept of a non-sequential logic circuit capable of generating true randomness. By definition, it operates as an "ordinary" edge-triggered clocked flip-flop, except that its clock input acts randomly and with probability p = 1/2. Unlike Boolean circuits, which behave deterministically, a random flip-flop behaves non-deterministically. By definition, a random flip-flop is electrically compatible with Boolean logic circuits. Together with them, the RFF makes up a full set of logic circuits capable of performing arbitrary algorithms, namely of realizing a probabilistic Turing machine. Symbol Random flip-flop comes in all the varieties in which an ordinary, edge-triggered clocked flip-flop does, for example: D-type random flip-flop (DRFF), T-type random flip-flop (TRFF), JK-type random flip-flop (JKRFF), etc. Symbols for the DRFF, TRFF and JKRFF are shown in Fig. 1. While varieties are possible, not all of them are needed: a single RFF type can be used to emulate all other types. Emulation of one type of RFF by another type of RFF can be done using the same additional gate circuitry as for ordinary flip-flops. Examples are shown in Fig. 2. Practical realization of random flip-flop By definition, the action of a theoretical RFF is truly random. This is difficult to achieve in practice and is probably best realized through use of physical randomness. An RFF, based on the quantum-random effect of photon emission in a semiconductor and subsequent detection, has been demonstrated to work well up to a clock frequency of 25 MHz. At a higher clock frequency, subsequent actions of the RFF become correlated. This RFF has been built using bulk components and the effort resulted only in a handful of units. Recently, a monolithic chip containing 2800 integrated RFFs based on quantum randomness has been demonstrated in a Bipolar-CMOS-DMOS (BCD) process. Applications and prospects One straightforward application of an RFF is generation of random bits, as shown
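The defining behaviour — an edge-triggered flip-flop whose clock takes effect with probability 1/2 — can be illustrated with a toy simulation. The sketch below models a T-type RFF; a seeded pseudo-random generator stands in for the physical quantum randomness, purely for illustration:

```python
import random

class TRandomFlipFlop:
    """Toy model of a TRFF: on every clock edge the stored bit
    toggles with probability p = 1/2, independently of history."""
    def __init__(self, seed=None):
        self._rng = random.Random(seed)
        self.q = 0  # stored output bit

    def clock(self):
        # Each edge "takes effect" with probability 1/2.
        if self._rng.random() < 0.5:
            self.q ^= 1  # toggle, as a T flip-flop with T=1 would
        return self.q

# Clocking the RFF repeatedly yields an unbiased random bit stream:
ff = TRandomFlipFlop(seed=42)
bits = [ff.clock() for _ in range(10_000)]
```

Because each output bit is the XOR of independent fair coin flips, the stream is uniform — which is exactly the straightforward random-bit-generation application mentioned above.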
https://en.wikipedia.org/wiki/Sinogene%20Biotechnology
Sinogene Biotechnology is a biotechnology company focusing on animal cloning technology. Their services include dog, cat, and horse cloning. Sinogene Biotechnology began by offering dog cloning in 2017 and introduced cat cloning in 2019. In 2022, they cloned an Arctic wolf, and started horse cloning in 2023. References External links Official website Cloning Biotechnology companies
https://en.wikipedia.org/wiki/How%20Data%20Happened
How Data Happened: A History from the Age of Reason to the Age of Algorithms is a 2023 non-fiction book written by Columbia University professors Chris Wiggins and Matthew L. Jones. The book explores the history of data and statistics from the end of the 18th century to the present day. Publication It has 336 pages and was published in 2023 by W. W. Norton & Company. Synopsis The book starts at the end of the 18th century, when European states began tabulating physical resources, and ends at the present day, when algorithms manipulate our personal information as a commodity. It looks at the rise of data and statistics, and how early statistical methods were used to justify eugenics, quantify supposed racial differences, and develop military and industrial applications. The authors also discuss the impact of the internet and e-commerce on data collection, the rise of data science, and the consequences of government-run surveillance systems collecting vast amounts of personal data for customized, targeted advertising. They emphasize the importance of privacy and democracy, and propose remedies to the problems caused by mass data collection, including stronger regulation of the tech industry and collective action by its employees. The book is a historical analysis that provides context for understanding the debates surrounding data and its control. References 2023 non-fiction books Data science History books about the 19th century History books about the 20th century History books about the 21st century Books about mathematics Computer science books W. W. Norton & Company books External links The wild evolution of data science and how to unpack it, book excerpt on Big Think From Eugenics to Targeted Advertising: The Dark Role of Data in Sorting Humanity, book excerpt on Literary Hub
https://en.wikipedia.org/wiki/Annual%20grasslands
Annual grasslands are a type of grassland ecosystem characterized by the dominance of annual grasses and forbs. They are most commonly found in regions with Mediterranean climates, such as California, and provide important habitats for a variety of wildlife species. Annual grasslands have a history of disturbance factors, including grazing, crop production, fire, and drought, which have contributed to the conversion of native perennial grasslands to non-native annual-dominated grasslands. Management issues in annual grasslands include carbon sequestration, native grass restoration, invasive species control, and land use change. Characteristics Annual grasslands are dominated by non-native annual grasses and forbs, with a few native perennial grass species present. These grasslands are subject to seasonal and yearly variations in species composition and productivity, which are largely controlled by the timing and amount of precipitation and temperature. Vegetation dynamics Long-term changes in annual grassland productivity, species composition, and ecosystem processes are influenced by continuing waves of invasion, changes in soil moisture depletion patterns, and fire frequency. Species composition in annual grasslands can change throughout a growing season, depending on germination, seedling establishment, and plant growth progress. Disturbance factors Grazing, crop production, fire, and drought have all contributed to the conversion of native grassland to non-native annual-dominated grassland. Severe droughts, such as those in 1828, 1862, and 1864, have also played a role in this conversion. Some researchers suggest that high-frequency burning by native peoples and Europeans may have made the native grasslands susceptible to invasion by non-native species. Management issues Management issues in annual grasslands include carbon sequestration, native grass restoration, invasive species control, and land use change. Carbon sequestration In the absence of r
https://en.wikipedia.org/wiki/National%20Quantum%20Mission%20India
National Quantum Mission India is an initiative by the Department of Science and Technology, Government of India, to foster scientific and industrial research and development in quantum technologies, to accelerate economic growth, to establish India as a global leader in quantum technology and applications, and to support the national Digital India, Make in India and Skill India programmes and the Sustainable Development Goals. Background The union cabinet of the Government of India approved the National Quantum Mission at a cost of INR 6,003.65 crore ($730,297,000) for 2023–24 to 2030–31. References Science and technology in India Quantum computing Quantum mechanics
https://en.wikipedia.org/wiki/Security%20orchestration
Security orchestration, automation and response (SOAR) is a group of cybersecurity technologies that allow organizations to respond to some incidents automatically. It collects inputs monitored by the security operations team, such as alerts from the SIEM system, threat intelligence platform (TIP), and other security technologies, and helps define, prioritize, and drive standardized incident response activities. Organizations use SOAR platforms to improve the efficiency of physical and digital security operations. SOAR enables administrators to handle security alerts without the need for manual intervention. When a network tool detects a security event, depending on its nature, SOAR can raise an alert to the administrator or take some other action. Components "Orchestration" connects the different security tools and systems of the information system. It integrates custom-built applications with built-in security tools, so they all work with each other. It also connects diverse endpoints, firewalls and behavior analysis tools. "Automation" takes the huge amount of information generated through orchestration and analyzes it through machine learning processes. SOAR handles many of the manual tasks of log analysis and can also handle ticket requests, vulnerability checks and auditing processes. "Incident response" allows security teams to react when a potential threat is indicated. This component also handles post-incident activities, such as threat intelligence sharing, in an automated way. Playbooks and runbooks SOAR allows security administrators to define potential incidents and the response to them, thanks to playbooks and runbooks. A playbook is a document that describes how to verify a cybersecurity incident and how to respond to it. The purpose of the playbook is to document what the runbook should do. A playbook can be used as a manual backup in case the SOAR fails. A runbook implements the playbook data into an automated tool so that it performs predefined actions to mitigate
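The playbook/runbook split can be sketched in code: the playbook is the written decision procedure, and the runbook is its automated encoding. In this toy example all alert fields, thresholds, and action names are hypothetical and not drawn from any real SOAR product:

```python
def run_playbook(alert):
    """Toy runbook: encode a playbook's written decision steps as
    automated actions for an incoming SIEM-style alert (a dict)."""
    actions = []
    severity = alert.get("severity", 0)
    if alert.get("source") == "siem" and severity >= 8:
        actions.append("isolate_host")     # contain the endpoint
        actions.append("open_ticket")      # notify the SOC team
    elif severity >= 5:
        actions.append("enrich_with_tip")  # look up threat intelligence
        actions.append("notify_analyst")   # escalate for human triage
    else:
        actions.append("log_only")         # low severity: just record it
    return actions

# A high-severity SIEM alert triggers containment automatically:
response = run_playbook({"source": "siem", "severity": 9})
```

The returned action list (`["isolate_host", "open_ticket"]` here) is what a real SOAR platform would dispatch to the orchestrated tools without manual intervention.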
https://en.wikipedia.org/wiki/Stephen%20Webb%20%28scientist%29
Stephen Webb is a physicist and author of numerous popular science and mathematics books, as well as academic publications. Webb was educated at Bristol University (BSc (Hons) Physics – First Class) and, as a graduate student, attended Manchester University (PhD – Theoretical Particle Physics). Webb is currently on the academic staff at the University of Portsmouth, and presents numerous science-related public talks and academic lectures. In 2018, Webb was a featured science speaker at the annual TED conference. Webb has worked at the University of Cardiff (Physics Department; 1993–95), University of Sheffield (Math & Statistics; 1995–98), University of Loughborough (Math & Sciences; 1998–99), Northumbria University (Information Sciences; 1999–2000), The Open University (2000–2006) and the University of Portsmouth (2006–current). In addition, Webb has been a Member of the Institute of Physics (M. Inst. P.), Chartered Physicist (C. Phys.), Senior Fellow of the Higher Education Academy (SFHEA), a member of an international editorial board (Springer S&F Series), a member of the UK SETI Research Network, and a project lead for the UK Advance HE Collaborative Award in Teaching Excellence (CATE 2022). Publications Note: Listings of many more books and publications by Stephen Webb are on GoodReads, Webb's website and elsewhere. Authored books Edited books See also References External links Stephen Webb on Mastodon and Stephen Webb (video; 13:09): Where Are All The Aliens? (TED talk – 2018) (transcript) (TED Talk – 2018) Astrobiology Enrico Fermi Extraterrestrial life Interstellar messages Living people Search for extraterrestrial intelligence Unsolved problems in astronomy 1963 births Academics of the University of Sheffield
https://en.wikipedia.org/wiki/Netprov
Netprov is "networked, improvised literature" or collaborative literary improvisations performed on the internet. The word netprov is a portmanteau of "networked" and "improv" as in improvisational theatre. Netprov is considered a genre of electronic literature. Background Netprov is explicitly related to improvisational theatre, and also has a lot in common with live action role-playing games. Rob Wittig, one of netprov's originators, was also involved in Invisible Seattle, a novel created in the early 1980s by a group of "literary workers" who gathered stories from Seattle residents, in part using an early online bulletin board system. An early example of netprov was Rob Wittig's Grace, Wit, and Charm (2011), which centred around a fictional company that offered services to people who wanted help making their online avatars more successful. Participants took the roles of workers in the company and clients writing in to request services, and the netprov was performed in online writing, in weekly theatre performances and streaming. While many netprovs are mostly playful, like #1WkNoTech, some offer powerful political critique, such as Occupy MLA, a netprov held during the Modern Language Association conference in 2011. I Work for the Web is another example that critiques the exploitation of online gig workers. Scholarship Netprov is included in many discussions of Electronic literature. Lyle Skains describes netprov as "online, collaborative, real-time, carnivalesque performances". Scott Rettberg notes that netprov is told in real-time, using social media, and are collaborative and interactive in the sense that readers can join in as participants. Wittig and Marino have also contributed chapters about netprov to a number of scholarly anthologies on electronic literature. Netprovs have also been taught at universities, both as a literary genre and as a classroom activity. References Genres of electronic literature Improvisation Internet culture
https://en.wikipedia.org/wiki/Data-driven%20model
Data-driven models are a class of computational models that primarily rely on historical data collected throughout a system's or process's lifetime to establish relationships between input, internal, and output variables. Commonly found in numerous articles and publications, data-driven models have evolved from earlier statistical models, overcoming limitations posed by strict assumptions about probability distributions. These models have gained prominence across various fields, particularly in the era of big data, artificial intelligence, and machine learning, where they offer valuable insights and predictions based on the available data. Background These models have evolved from earlier statistical models, which were based on certain assumptions about probability distributions that often proved to be overly restrictive. The emergence of data-driven models in the 1950s and 1960s coincided with the development of digital computers, advancements in artificial intelligence research, and the introduction of new approaches in non-behavioural modelling, such as pattern recognition and automatic classification. Key concepts Data-driven models encompass a wide range of techniques and methodologies that aim to intelligently process and analyse large datasets. Examples include fuzzy logic, fuzzy and rough sets for handling uncertainty, neural networks for approximating functions, global optimization and evolutionary computing, statistical learning theory, and Bayesian methods. These models have found applications in various fields, including economics, customer relations management, financial services, medicine, and the military, among others. Machine learning, a subfield of artificial intelligence, is closely related to data-driven modelling as it also focuses on using historical data to create models that can make predictions and identify patterns. In fact, many data-driven models incorporate machine learning techniques, such as regression, classification, and clustering.
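The core idea — deriving the input-output relationship from historical data rather than from assumed distributions or first principles — can be illustrated with the simplest data-driven model, an ordinary least-squares line fit. The data points below are invented for the example:

```python
def fit_line(xs, ys):
    """Least-squares fit of y = a*x + b from historical (x, y) pairs:
    the model is extracted from the data, not from assumed physics."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    b = my - a * mx
    return a, b

# Historical observations of a roughly linear process:
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 7.8, 10.1]
a, b = fit_line(xs, ys)
prediction = a * 6.0 + b  # predict the output for an unseen input
```

Regression, classification, and clustering mentioned above all follow this pattern: parameters are estimated from the historical record and then used for prediction.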
https://en.wikipedia.org/wiki/Pocket%20prairie
A pocket prairie is a small, artificially created, self-sustaining area of land where forbs and grasses predominate. Oftentimes these plants are native. Pocket prairies are typically found in urban and suburban areas where there exists a lack of vegetation and wildlife (e.g. vacant lots, backyards, green spaces). These parcels of land serve as a habitat for nearby bird, insect, and mammal species. Benefits Pocket prairies provide several benefits: Improved bee populations Increased numbers of native plants Improved soil health Filtered runoff water Habitat for nearby species Aesthetic and spiritual values The Cleveland Pocket Prairie Project As a result of economic and population decline, thousands of vacant lots are dispersed in Cleveland, Ohio. From the abundance of these lots came The Cleveland Pocket Prairie Project, an Ohio State University-led project which aims to repurpose 64 vacant lots with 8 distinct plant communities referred to as "pocket prairies". The project employs 8 different treatment methods to create plant communities, two of which use existing vegetation while the rest introduce plant mixes. This project took place in eight Cleveland neighborhoods: Glenville, Slavic Village, Buckeye, Central, Tremont/Clark Fulton, Buckeye, Fairfax, and Hough. References Prairies Ecology
https://en.wikipedia.org/wiki/Quantum%20random%20circuits
Quantum random circuits (QRC) is a concept of incorporating an element of randomness into the local unitary operations and measurements of a quantum circuit. The idea is similar to that of random matrix theory, which is to use the QRC to obtain almost exact results of non-integrable, hard-to-solve problems by averaging over an ensemble of outcomes. This incorporation of randomness into the circuits has many possible advantages, some of which are (i) the validation of quantum computers, which is the method that Google used when they claimed quantum supremacy in 2019, and (ii) understanding the universal structure of non-equilibrium and thermalization processes in quantum many-body dynamics. Quantum Random Circuits The constituents of some general quantum circuits would be qubits, unitary gates, and measurements. The time evolution of the quantum circuits is discrete in time t, and the states are evolved step by step in time by the application of unitary operators, under which a pure state evolves according to |ψ(t)⟩ = U(t)|ψ(t−1)⟩ (note that unitary operators can entangle states). Thus, the time evolution from a starting time, say t₀ = 0, to some time t would be given by U(t)U(t−1)⋯U(1), where for each step the unitary operator is represented by a tensor product of local unitary gates, U(τ) = ⊗ᵢ uᵢ(τ), where the index i specifies the lattice link which connects a pair of qubits, and τ is the time step. Figure 1 shows a time-space diagram of a quantum circuit which shows the local interactions at each time step. In the language of quantum information theory, the number of qubits is the circuit's width, and we define its depth as the number of layers of unitary gates. Hence, for the configuration in Figure 1, the width and depth can be read off directly. Another way to interpret the circuit is to look at it as a tensor network in which each purple box is a local gate operating on two qubits, and the total contraction of qubit indices at the start and the end at time t on the lattice would give the full unitary time evolution. Thus, the propagation ampli
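A toy version of such a circuit can be simulated directly with state vectors. The sketch below (pure Python; local two-qubit gates are drawn at random from a small fixed set rather than being Haar-random unitaries — a simplification for illustration) builds a brickwork circuit of alternating layers and checks that the evolution preserves the state's norm, as unitarity requires:

```python
import random

H = 1 / 2 ** 0.5
# A small fixed set of 4x4 two-qubit unitaries standing in for
# Haar-random local gates: CNOT, SWAP, and Hadamard-on-first-qubit.
CNOT = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 0, 1], [0, 0, 1, 0]]
SWAP = [[1, 0, 0, 0], [0, 0, 1, 0], [0, 1, 0, 0], [0, 0, 0, 1]]
H1   = [[H, 0, H, 0], [0, H, 0, H], [H, 0, -H, 0], [0, H, 0, -H]]
GATES = [CNOT, SWAP, H1]

def apply_gate(state, gate, i, n):
    """Apply a 4x4 gate to adjacent qubits (i, i+1) of an n-qubit state."""
    shift = n - 2 - i                 # bit position of the qubit pair
    new = [0j] * len(state)
    for idx, amp in enumerate(state):
        if amp == 0:
            continue
        b = (idx >> shift) & 0b11     # current values of the two qubits
        base = idx & ~(0b11 << shift)
        for b2 in range(4):           # new[j] += G[j][k] * old[k]
            new[base | (b2 << shift)] += gate[b2][b] * amp
    return new

def random_layer(state, n, rng, offset):
    """One brickwork layer: random gates on pairs starting at `offset`."""
    for i in range(offset, n - 1, 2):
        state = apply_gate(state, rng.choice(GATES), i, n)
    return state

rng = random.Random(7)
n = 4
state = [0j] * (1 << n)
state[0] = 1 + 0j                     # start in |0000>
for t in range(5):                    # five time steps, alternating offsets
    state = random_layer(state, n, rng, t % 2)
norm = sum(abs(a) ** 2 for a in state)  # unitarity => norm stays 1
```

Here the circuit width is n = 4 and the depth is 5; since every layer is a tensor product of unitaries, the total norm remains 1 up to floating-point error.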
https://en.wikipedia.org/wiki/Mobile%20barrage%20squad
A mobile barrage squad is an element of a combat or operational order in the form of a temporary military formation, which is created from units of engineering troops and army aviation. The abbreviation used in service documents for this temporary formation of troops or forces is MBS. The main purpose of an MBS is to set up mine-explosive barrages during combat and to destroy transport infrastructure on behalf of friendly forces. Until July 1943 they were referred to simply as barrage squads. History The theoretical foundation for the practical application of the MBS was laid in the work "Разрушения и заграждения" ("Demolitions and Obstacles", 1931) by the Soviet military engineer Dmitry Karbyshev. During the Second World War (1939–1945), and especially on the Eastern Front (1941–1945), mines and explosive barrages found wide use in all types of combat. To set them up in the Battle of Moscow, the Soviet troops first used barrage squads in 1941; these were later called mobile barrage squads and were subsequently used successfully in other operations of the Red Army of the Soviet Union. After the Battle of Kursk (1943), on the basis of the experience gained, it was concluded that the army command needed a permanent specialized reserve of engineering units, which would have the means of mechanized mine-laying, large quantities of mines, and explosives of various types. As a consequence, the MBS became a mandatory element of the operational structure of the Soviet troops, and in 1942–1943 the tactics of the MBS in the offensive and the defensive were worked out. During the war on the Eastern Front, the Red Army expended more than 70,000,000 mines of various types, including about 30,000,000 anti-tank mines. Basic provisions The composition and equipment of a squad is determined by its objectives in combat or an operation, the available forces and equipment, the composition of the enemy's troops, and the conditions on the ground. When setting a mission, the MBS receives data on the area of location, m
https://en.wikipedia.org/wiki/Illusion%20City
Illusion City is a role-playing video game originally developed and published by Microcabin for the MSX Turbo R home computer. It was later ported to PC-88 and PC-98 computers, FM Towns, X68000, and Sega Mega-CD. The story takes place in the 21st century after Hong Kong was devastated by a demonic attack, before the crisis was isolated and the region was reformed under a new order by the SIVA corporation. The game follows demon hunter Tianren, gathering information in order to unravel the mystery surrounding the demonic beings and the SIVA corporation. Gameplay features a growing party led by Tianren navigating the city, talking with non-playable characters, exploring complex areas, and taking part in turn-based battles against enemies. Illusion City was developed by "Project I", a group within Microcabin which previously worked on Fray in Magical Adventure (1990) under the name "Team Piku Piku". Yasuhiko Nakatsu acted as director, planner, and co-programmer. His motivation for creating the game was to bring more world variety into the role-playing genre. The character design concepts were created by under the pseudonym "Hyakkimaru", with Yukio Kitta acting as art illustrator. Masashi Katō, who worked on Xak II: Rising of the Redmoon, served as scenario writer. The music was composed by Tadahiro Nitta, Yasufumi Fukuda, and Yukiharu Urita. Illusion City proved popular among Japanese players and garnered favorable reception from critics, but the Mega-CD version received a mixed response and sold over 2,164 copies in its first week on the market. Retrospective commentary has been more positive. Gameplay Illusion City is a Japanese role-playing game. The player controls the main character Tianren, a demon hunter gathering information in order to unravel the mystery surrounding the demonic beings and the SIVA corporation. It features a growing party led by Tianren navigating the city, talking with non-playable characters, exploring complex areas, and taking part in turn-based battles against enemies.
https://en.wikipedia.org/wiki/Criminal%20menopause
Criminal menopause is an informal term describing a decrease in anti-social behavior that correlates with human aging. In the United States, for example, people over 60 years are responsible for less than one percent of crime. Another study found that only two percent of convicts paroled after age 55 are ever imprisoned again. The term criminal menopause alludes to the human female biological process of menopause, in which ovulation and menstruation slow and then cease, eventually resulting in natural infecundity. There is no generally accepted method for assessing whether or not a convicted criminal has entered a state of criminal menopause. Marie Gottschalk writes in Caught: The Prison State and the Lockdown of American Politics: According to the author of a Los Angeles Review of Books article on prison reform in California, "Ed Bunker, the celebrated novelist who spent 18 years behind bars, including a stint in San Quentin as the youngest prisoner ever to enter the institution, would always tell me: 'crime is a young man's game.'" There is a complicated moral, financial and social calculus to be made by states that hold large populations of aging criminals. The United States is expected to have 400,000 elderly incarcerated people by 2030. One study found that the recidivism rate of ex-convicts who had served more than 25 years of prison time was "essentially zero." In 1992 a manager of Louisiana's Department of Public Safety and Corrections recommended releasing prisoners over 45 years of age who had already served 20 or more years. In 2010, a 90-year-old man who bludgeoned to death his 89-year-old wife was said to defy "the theories about criminal menopause." See also Frontal lobe References External links Criminal justice Prison reform Gerontology
https://en.wikipedia.org/wiki/Mojo%20%28programming%20language%29
Mojo is a programming language developed for the MLIR compiler framework that provides a unified programming framework for software development, especially in the field of artificial intelligence (AI). Designed to be a superset of the Python programming language, Mojo has been called "Python++" by some. Mojo was made available in browsers via Jupyter notebooks in May 2023, locally on Linux in September 2023, and on macOS on October 19, 2023. An official Visual Studio Code extension is also available. Origin, design and development In 2022, the Modular company was founded by Chris Lattner, the original architect of the Swift programming language, and Tim Davis, an ML thought leader at Google. In September 2022, an initial build of Mojo was released internally by Modular Inc. with advanced compilation features powered by MLIR, the Multi-Level Intermediate Representation compiler framework. Its type system is hybrid (something between static and dynamic), given that the developer can opt in to high-performance static typing by choosing the keyword (fn rather than def) used to define a function. The companion Modular inference engine includes a compiler and runtime system. Mojo was created for easy transition from Python and other programming languages. The language is largely compatible with Python and allows importing any Python module into a Mojo module. Mojo is not open source, but it is planned to become open source in the future. Programming language advances The Mojo programming language aims to be fully compatible with the Project Jupyter ecosystem. It plans to add a borrow checker, an influence from Rust, and to add integration to transparently import Clang C/C++ modules and transparently generate a foreign function interface between C/C++ and Mojo. It can call existing Python 3.x code by reusing the CPython runtime. Mojo is not yet fully source-compatible with Python 3, only providing a subset of its syntax so far
https://en.wikipedia.org/wiki/Snowy%202.0%20Pumped%20Storage%20Power%20Station
Snowy 2.0 Pumped Storage Power Station or Snowy Hydro 2.0 is a pumped-hydro battery megaproject in New South Wales, Australia. The dispatchable generation project connects two existing dams through an underground tunnel and a new underground pumped-hydro power station. Construction began in 2019. It is expected to supply 2.2 gigawatts of capacity and about 350,000 megawatt hours of large-scale storage to the national electricity market. It is the largest renewable energy project under construction in Australia. It is designed for grid stabilization; to be a backup at times of peak demand and for when solar and wind energy are not providing power. Snowy Hydro acts like a giant battery by absorbing, storing, and dispatching energy. The battery is designed to operate for up to 175 hours of temporary supply. It is Australia's largest energy project, estimated to cost 12 billion Australian dollars. By 2023, AU$4.3 billion had been spent. The project is led by public company Snowy Hydro Limited. When complete it is expected to have a large impact on the price and reliability of electric power. History Initial plans for a power station at the location were discussed in 1966. Further studies were undertaken in 1980 and 1990. The current project originated as the centrepiece of Malcolm Turnbull's climate change policy in 2017. A feasibility study carried out in 2017 found the project both technically and financially feasible. The study was released on 21 December 2017 and found the project cost would be between $3.8 and 4.5 billion. The first tunnel, completed by October 2022, was a 2.85-kilometre section that provided main access at Lobs Hole. It is 10 metres in diameter and provides pedestrian and vehicle access into the power station. By May 2023 the emergency, cable and ventilation tunnel had been excavated. It is 2.93 kilometres long and 10 metres in diameter and will be used for power station ventilation and high-voltage cables. It was originally exp
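The quoted capacity figures can be related by simple arithmetic: storage duration is stored energy divided by discharge power. A back-of-envelope check (the ~2 GW output implied by the 175-hour figure is an inference for illustration, not stated in the source):

```python
energy_mwh = 350_000   # stated large-scale storage (about 350,000 MWh)
power_mw = 2_200       # stated generating capacity (2.2 GW)

# Duration at continuous full output:
hours_full = energy_mwh / power_mw    # roughly 159 hours

# Average output level that would stretch the store to 175 hours:
implied_mw = energy_mwh / 175         # 2,000 MW, i.e. about 2 GW
```

So the "up to 175 hours" figure corresponds to discharging somewhat below the full 2.2 GW capacity.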
https://en.wikipedia.org/wiki/Gcore
Gcore is a public cloud and content delivery network (CDN) company founded in 2014 in Vienna, Austria. In 2015, Gcore established its headquarters in Luxembourg, where its domain was registered. As of March 2023, its global network consists of over 140 Points of Presence (PoPs) on six continents. Gcore partnered with Graphcore to launch the European AI Cloud, which uses IPU technology to speed up machine learning tasks with ready-made AI infrastructure. Server infrastructure According to the company's website, Gcore has network locations on six continents: North America, Europe, the Middle East, Asia, Latin America, and Africa. Gcore uses a web application firewall (WAF). Products and services Gcore provides content delivery network, cloud computing, bare-metal server, AI infrastructure, Kubernetes, dedicated hosting, streaming media platform, DDoS mitigation, colocation, custom software development, game testing, function as a service (FaaS) and logging as a service (LaaS) offerings. A January 18, 2022, review by TechRadar commended Gcore's extensive network and CDN analytics. However, the review noted that the website lacked comprehensive guidance and assistance, which could be improved. In March 2023, Gcore launched a free speed test that checks internet speed and the quality of broadband and mobile connections. Correctiv and Tageszeitung reported that Gcore supported the distribution of the TV network RT, which has been under EU sanctions since March 2022, until April 2023. However, Gcore denies the accusations. History Gcore was established in 2014 in Vienna, Austria. In 2015, Gcore moved to new headquarters in Luxembourg. In 2020, the company entered a partnership agreement with Intel. References External links Cloud computing Cloud computing providers Cloud platforms DDoS mitigation companies Content delivery networks Internet security Companies established in 2014 Software companies of Luxembourg AI companies
https://en.wikipedia.org/wiki/Google%20Silicon%20Initiative
The Google Open Silicon Initiative is an initiative launched by the Google Hardware Toolchains team to democratize access to custom silicon design. Google has partnered with SkyWater Technology and GlobalFoundries to open-source their process design kits (PDKs) for 180 nm, 130 nm and 90 nm processes. The initiative provides free software tools for chip designers to create, verify and test virtual chip circuit designs before they are physically produced in factories. Its aim is to reduce the cost of chip design and production, which will benefit DIY enthusiasts, researchers, universities, and chip startups. The program has gained more partners, including the US Department of Defense, which injected $15 million in funding into SkyWater, one of the manufacturers supporting the program.

References

External links

Google Open Silicon (official site) Google Git repositories of FOSS EDA tools SkyWater Technology Foundry FOSS 130nm Production PDK - GitHub GlobalFoundries GF180MCU FOSS 180nm Production PDK - GitHub

Google hardware Integrated circuits
https://en.wikipedia.org/wiki/John%20David%20Hunt
John David Hunt FRS (12 December 1936 – 8 December 2012) was a British metallurgist. His research career was mainly based at the University of Oxford, from 1966 to 2002. His legacy includes the Institute of Materials, Minerals and Mining's John Hunt Medal, awarded for an 'outstanding contribution to the science and/or technology of casting and solidification of metals'. He was elected a Fellow of the Royal Society in 2001.

References

1936 births 2012 deaths British metallurgists Academics of the University of Oxford Fellows of the Royal Society
https://en.wikipedia.org/wiki/CII%2010070
The CII 10070 is a discontinued computer system from the French company CII. It was part of the first series of computers manufactured in the late 1960s under the Plan Calcul. The 10070 is a rebadged Scientific Data Systems (SDS) Sigma 7. In addition to the Sigma software, a new operating system was developed by teams from INRIA. The 10070 is optimized for scientific calculation. It has 32-bit words, byte addressing, and 16 index registers, and can handle both batch processing and time-sharing. As a standard feature it also has a memory map similar to virtual memory, except that it is intended only for instant memory-to-memory remapping for performance reasons, with no support for managing swapping to disk; this is managed by the time-sharing monitor. The 10070 served as the basis for the design of the Iris 50 and Iris 80 series, which were entirely manufactured by CII.

Software

Operating systems

The CII 10070 runs several SDS and locally developed operating systems:

- BPM (Batch Processing Monitor), a single-stream batch processing system with independent tasks, called symbionts, to process card and printer input and output. This system was supplied by SDS.
- BTM, a time-sharing system from SDS.
- Siris 7 from CII, a version of Siris 8 for the Iris 80.
- An experimental system, Ésope, developed at IRIA.

Languages and utilities

Most of the software for the 10070 also came from SDS:

- Fortran IV H compiler
- Symbol (assembly language)
- Metasymbol, a more powerful assembler
- COBOL compiler
- PL/I compiler
- Sort
- A document retrieval system from CII

See also

CII Iris 50 CII Iris 80 SDS Sigma series

Notes

References

External links

System description from the Bull Teams Federation (machine-translated to English). Picture of a CII 10070 at CERN Scientific Data Systems The Sigma Family: Introducing Sigma from Scientific Data Systems. 1967 SDS Sigma 7 technical information Sigma 7 technical information

Mainframe computers History of computing in France Computers designed in France
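The memory-to-memory remapping described above can be pictured as a page map that translates virtual page numbers to physical page numbers with no backing store: relocating a program means rewriting its map, not copying or swapping data. The following is an illustrative sketch only, not the Sigma 7's actual hardware; the page size and map length are made-up values for the example:

```python
# Illustrative sketch of pure memory-to-memory remapping (no swap):
# a page map translates virtual pages to physical pages in place.
# PAGE_SIZE and the map contents are invented for the example and are
# not the Sigma 7's real parameters.
PAGE_SIZE = 512  # words per page (illustrative)

def translate(vaddr: int, page_map: list[int]) -> int:
    """Map a virtual word address to a physical word address."""
    vpage, offset = divmod(vaddr, PAGE_SIZE)
    if vpage >= len(page_map):
        # Nothing is swapped in from disk; an unmapped page is an error.
        raise MemoryError("address outside mapped region")
    return page_map[vpage] * PAGE_SIZE + offset

# Relocating a program is just rewriting its page map -- no data moves.
page_map = [3, 7, 2]             # virtual pages 0..2 -> physical pages
print(translate(600, page_map))  # page 1, offset 88 -> 7*512 + 88 = 3672
```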
https://en.wikipedia.org/wiki/Precipitate-free%20zone
In materials science, a precipitate-free zone (PFZ) refers to microscopic localized regions around grain boundaries that are free of precipitates (solid impurities forced outwards from the grain during crystallization). It is a common phenomenon in polycrystalline materials (crystalline materials with stochastically oriented grains) where heterogeneous nucleation of precipitates is the dominant nucleation mechanism. This is because grain boundaries are high-energy surfaces that act as sinks for vacancies, causing regions adjacent to a grain boundary to be depleted of vacancies. As it is energetically favorable for heterogeneous nucleation to occur preferentially at defect-rich sites such as vacancies, nucleation of precipitates is impeded in the vacancy-depleted regions immediately adjacent to grain boundaries.

History

Pioneering studies on the theory and experimental observation of PFZs were made in the 1960s.

Effect on material properties

PFZs are detrimental to the mechanical properties of materials. In particular, PFZs degrade the material's hardness, because the lack of precipitates in PFZs leads to these regions having fewer pinning sites. Dislocation motion – a condition necessary to cause a material to yield – requires an appreciably lower applied shear stress in PFZs, and consequently these locally weak zones are where plastic deformation begins. The width of PFZs has also been found to be negatively correlated with intergranular fracture. PFZs also accelerate pitting corrosion and stress corrosion cracking, significantly reducing the usable life of these materials in chemically aggressive environments.

Techniques to minimize

It has been shown that PFZs can be minimized by quenching. First, quenching increases undercooling, favoring homogeneous nucleation in PFZs as it lowers the nucleation energy barrier even in the absence of potent nucleation sites. Additionally, low temperatures also lead to a reduction in diffusion rates, minimizing the
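The vacancy-depletion mechanism above can be made semi-quantitative with a textbook diffusion model: treating the grain boundary as a perfect vacancy sink at x = 0, the post-quench vacancy concentration recovers as c(x) = c₀·erf(x / 2√(Dt)), so the depleted (and hence precipitate-free) zone scales with √(Dt). The sketch below uses illustrative values of D and t, not data for any real alloy:

```python
import math

# Vacancy concentration near a grain-boundary sink, modeled with the
# classic erf solution for diffusion toward a perfect sink at x = 0.
# D and t are illustrative numbers, not measurements for a real alloy.
D = 1e-16   # vacancy diffusivity, m^2/s (illustrative)
t = 100.0   # time after the quench, s (illustrative)

def vacancy_fraction(x: float) -> float:
    """Fraction of the bulk vacancy concentration at distance x (m) from the boundary."""
    return math.erf(x / (2.0 * math.sqrt(D * t)))

# Estimate the depleted-zone half-width as the distance at which the
# vacancy concentration recovers to 50% of its bulk value.
x = 0.0
while vacancy_fraction(x) < 0.5:
    x += 1e-9  # step outward in 1 nm increments
print(f"Depleted-zone half-width ~ {x * 1e9:.0f} nm")
```

Since the width grows with √(Dt), faster quenching (smaller effective t at high temperature, lower D at low temperature) narrows the depleted zone, consistent with the minimization technique described in the article.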