https://en.wikipedia.org/wiki/AusCERT | AusCERT is a non-profit organisation founded in 1993 that provides advice and solutions on cybersecurity threats and vulnerabilities. The organisation covers its costs through member subscriptions, fees from attendees at the annual AusCERT conference, and service contracts.
History
In the early 1990s, Australian university student Nahshon Even-Chaim hacked into NASA computer systems in his spare time. This triggered a chain reaction, prompting businesses and government bodies to develop an awareness of the need for improved information security. As a result, three Australian universities (Queensland University of Technology, Griffith University, and the University of Queensland) came together to form AusCERT, aiming to create a central source for information security and protection.
AusCERT is one of Australia's few Computer Emergency Response Teams (CERTs), and is one of the oldest CERTs in the world. As a member of FIRST, AusCERT is part of a worldwide network of computer security incident response and security teams. These teams work together voluntarily to deal with computer security problems and to formulate prevention methods.
Their office is located on The University of Queensland campus.
Services
AusCERT covers its costs by selling member subscriptions and service contracts to individuals and businesses. The organisation offers 24/7 support and incident management against cyber threats. Other services include phishing take-downs, security bulletins, incident notifications, sensitive-information alerts, early-warning SMS, and malicious URL feeds.
Engaged and active in incident response at a global level, AusCERT is a charter member of APCERT as well as a member of the Forum of Incident Response and Security Teams (FIRST).
AusCERT Annual Conference
AusCERT has been hosting cybersecurity conferences in Australia since 2002. The conference takes place every year with presentations and hands-on tutorials for industry professionals. The AusC |
https://en.wikipedia.org/wiki/Spoon%20Radio | Spoon is a social digital audio live-streaming service. It is developed by Spoon Radio Inc., a South Korea-based company with a US office in San Francisco, California. It allows users to listen to streamers and even start their own live streams using their smartphones or the web, without any other equipment. It provides services in four languages, covering South Korea, the United States, Japan, and the Middle East and North Africa.
History
CEO Neil Choi, together with co-founders Hyuk Jun Choi and Hee-jae Lee, started the company Mykoon in 2013 with Plugger, a smartphone battery-sharing service. They created Spoon Radio as their flagship product on March 23, 2016. Two years after its inception, it secured Series B funding from investors – namely Softbank Ventures Asia, KB Investment (KBIC), and Goodwater Capital – who invested a total of $17 million. As part of its global expansion effort, Spoon also hired Fernando Pizarro, a media and internet executive from Disney, Discovery Communications and Yahoo!, as vice president.
Reception
By August 2018, the application had generated net revenue surpassing $20 million, becoming the best-known audio-only streaming service in South Korea. Popular among youth aged 18 to 24, Spoon has been downloaded 2.5 million times daily, and there have been millions of broadcast uploads. It has expanded to Japan and the US, with plans for further expansion. In 2019, it generated net revenue of $41 million.
Features
Spoon is an internet-radio broadcasting mobile application that allows its users to stream through audio alone, hence the catchphrase "live stream without a camera". Listeners can give their favorite streamers digital gifts called "Spoons", which can then be redeemed for money. There is a Live Call feature which lets streamers invite other streamers or even their listeners to co-host the stream with them.
Live
Live-streaming is Spoon's most definitive feature. U |
https://en.wikipedia.org/wiki/Zero-shot%20learning | Zero-shot learning (ZSL) is a problem setup in deep learning where, at test time, a learner observes samples from classes which were not observed during training, and needs to predict the class that they belong to. Zero-shot methods generally work by associating observed and non-observed classes through some form of auxiliary information, which encodes observable distinguishing properties of objects. For example, given a set of images of animals to be classified, along with auxiliary textual descriptions of what animals look like, an artificial intelligence model which has been trained to recognize horses, but has never been given a zebra, can still recognize a zebra when it also knows that zebras look like striped horses. This problem is widely studied in computer vision, natural language processing, and machine perception.
Background and history
The first paper on zero-shot learning in natural language processing appeared in 2008 at AAAI'08, but the learning paradigm there was called dataless classification. The first paper on zero-shot learning in computer vision appeared at the same conference, under the name zero-data learning. The term zero-shot learning itself first appeared in the literature in a 2009 paper by Palatucci, Hinton, Pomerleau, and Mitchell at NIPS'09. The terminology was repeated later in another computer vision paper, and the term zero-shot learning caught on as a take-off on one-shot learning, which had been introduced in computer vision years earlier.
In computer vision, zero-shot learning models learn parameters for seen classes along with their class representations and rely on representational similarity among class labels so that, during inference, instances can be classified into new classes.
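A minimal sketch of the attribute-based approach described above, where classes are represented by vectors of observable attributes; the classes, attribute vectors, and detector scores below are invented for illustration:

```python
# Attribute-based zero-shot classification sketch: each class (seen or
# unseen) is described by a vector of observable attributes, and a test
# instance is assigned to the class whose attributes best match the
# attribute scores predicted for it. All values here are illustrative.

#                 striped  four_legged  has_mane
ATTRS = {
    "horse": (0, 1, 1),   # seen during training
    "tiger": (1, 1, 0),   # seen during training
    "zebra": (1, 1, 1),   # unseen: known only via its attribute description
}

def predict(scores, classes):
    """Return the class whose attribute vector is closest to the scores."""
    def sq_dist(c):
        return sum((s - a) ** 2 for s, a in zip(scores, ATTRS[c]))
    return min(classes, key=sq_dist)

# Attribute detectors (trained only on seen classes) report that a test
# image looks striped, four-legged, and maned:
scores = (0.9, 0.8, 0.7)
label = predict(scores, list(ATTRS))   # matches the unseen class "zebra"
```

The point of the sketch is that "zebra" is never seen during training; it is recognized purely through its attribute description, mirroring the striped-horse example above.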
In natural language processing, the key technical direction developed builds on the ability to "understand the labels"—represent the labels in the same semantic space as that of the documents to be classified. This supports the cla |
https://en.wikipedia.org/wiki/GNU%20Taler | GNU Taler is a free software-based microtransaction and electronic payment system. Unlike most other decentralized payment systems, GNU Taler does not use a blockchain. A blind signature is used to protect the privacy of users as it prevents the exchange from knowing which coin it signed for which customer.
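The blinding step can be illustrated with a classic Chaum-style RSA blind signature, the scheme the privacy argument above is based on; the toy key and numbers below are purely illustrative and are not GNU Taler's actual protocol or parameters:

```python
# Sketch of RSA blinding: the exchange signs a coin without ever seeing it.
# Toy key sizes only; real deployments use 2048-bit or larger moduli.
import math

p, q = 61, 53
n = p * q                  # public modulus (3233)
e = 17                     # public exponent
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)        # private signing exponent

coin = 123                 # message representing a coin (normally a hash)
r = 7                      # customer's secret blinding factor
assert math.gcd(r, n) == 1

blinded = (coin * pow(r, e, n)) % n      # customer blinds the coin
blind_sig = pow(blinded, d, n)           # exchange signs the blinded value
sig = (blind_sig * pow(r, -1, n)) % n    # customer removes the blinding

# Anyone can verify the signature on the original coin, yet the exchange
# only ever saw the blinded message and cannot link sig to the customer.
assert pow(sig, e, n) == coin
```

Verification works because (coin · rᵉ)ᵈ · r⁻¹ ≡ coinᵈ (mod n), so the unblinded value is a valid signature on the original coin.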
The project is led by Florian Dold and Christian Grothoff of Taler Systems SA. Taler is short for "Taxable Anonymous Libre Economic Reserves" and alludes to the Taler coins in Germany during the Early Modern period. It has vocal support from GNU Project founder Richard Stallman, who has described the program as "designed to be anonymous for the payer, but payees are always identified." In a paper published in Security, Privacy, and Applied Cryptography Engineering, GNU Taler is described as meeting ethical considerations: the paying customer is anonymous while the merchant is identified and taxable. An implementation is provided by Taler Systems SA.
See also
DigiCash
Cryptocurrency
Open source
References
External links
Taler
Software using the GPL license
Payment systems
Online payments |
https://en.wikipedia.org/wiki/Albert%20Bijaoui | Albert Bijaoui (born in Monastir in 1943) is a French astronomer and former student of the École Polytechnique (X 1962), renowned for image processing in astrophysics and its applications in cosmology. He prepared his PhD thesis at the Paris Observatory under the supervision of André Lallemand, and defended it at the Université Denis Diderot (Paris VII) in March 1971.
He has been a corresponding member of the French Academy of Sciences since 1997.
Biography
Trainee and then Research Associate at the CNRS, first at the Paris Observatory and then at the Nice Observatory, he became an Astronomer at the Nice Observatory in 1972. He has been Honorary Astronomer at the Observatoire de la Côte d'Azur since 2015. He was Director of the Centre de Dépouillement des Clichés Astronomiques at the Institut National d'Astronomie et de Géophysique between 1973 and 1981. He was also director of the Cassiopeia laboratory (UMR CNRS/OCA) between 2004 and 2007.
He experimented with André Lallemand's electronic camera in 1970 at the 1.93 m telescope at the Haute-Provence Observatory.
The asteroid Bijaoui was named in his honour.
Scientific work
His research work has focused on various themes related to astronomical imaging. During the preparation of his thesis, he contributed to the development of electronography through the study and interpretation of the properties of André Lallemand's electronic camera; in parallel, he was involved in its exploitation for astrophysical purposes. With the creation in 1973 of the Centre de Dépouillement des Clichés Astronomiques at the Nice Observatory, he was involved in the development of new methods for the analysis of astrophysical data and in the creation of software to exploit them for the French astronomical community. The result was a system for the analysis of astronomical images, which was widely disseminated in the astronomical community. With the commissioning of INAG's Schmidt telescope on the Calern plateau near Caussols in the Alpes-Maritimes, INAG ha |
https://en.wikipedia.org/wiki/Hao%20Huang%20%28mathematician%29 | Hao Huang is a mathematician known for solving the sensitivity conjecture. Huang is currently an associate professor in the mathematics department at National University of Singapore.
Huang was an assistant professor from 2015 to 2021 in the Department of Mathematics at Emory University. He obtained his Ph.D. in mathematics from UCLA in 2012, advised by Benny Sudakov. His postdoctoral research was done at the Institute for Advanced Study in Princeton and at DIMACS, Rutgers University, from 2012 to 2014, followed by a year at the Institute for Mathematics and its Applications at the University of Minnesota.
In July 2019, Huang announced a breakthrough, which gave a proof of the sensitivity conjecture. At that point the conjecture had been open for nearly 30 years, having been posed by Noam Nisan and Mario Szegedy in 1992.
Theoretical computer scientist Scott Aaronson said of Huang's ingenious two-page proof, "I find it hard to imagine that even God knows how to prove the Sensitivity Conjecture in any simpler way than this."
Huang received an NSF Career Award in 2019 and a Sloan Research Fellowship in 2020.
References
External links
Year of birth missing (living people)
Living people
Academic staff of the National University of Singapore
Emory University faculty
Sloan Research Fellows
Peking University alumni
University of California, Los Angeles alumni
University of Minnesota alumni
21st-century Chinese mathematicians
Combinatorialists |
https://en.wikipedia.org/wiki/Spatial%20computing | Spatial computing was defined in 2003 by Simon Greenwold, as "human interaction with a machine in which the machine retains and manipulates referents to real objects and spaces".
With the advent of consumer virtual reality, augmented reality, and mixed reality, companies such as Microsoft and Magic Leap use "spatial computing" to refer to the practice of using physical actions (head and body movements, gestures, speech) as inputs for interactive digital media systems, with perceived 3D physical space as the canvas for video, audio, and haptic outputs. It is also tied to the concept of 'digital twins'. The Infinite Retina, a book written by Irena Cronin and Robert Scoble and published by Packt Publishing in 2020, has spatial computing as its main topic and was well received, exemplifying the current importance and relevance of the field.
Apple announced a spatial computing platform with the Vision Pro on 5 June 2023. Its features include Spatial Audio, two micro-OLED displays, the Apple R1 chip, and eye tracking. It is planned for release in the United States in 2024.
References
Subfields of computer science |
https://en.wikipedia.org/wiki/Ultrasonic%20algae%20control | Ultrasonic algae control is a commercial technology that has been claimed to control the blooming of cyanobacteria, algae, and biofouling in lakes and reservoirs by using pulsed ultrasound. The duration of such treatment is supposed to be up to several months, depending on the water volume and algae species. Despite the experimental demonstration of certain bioeffects in small samples under controlled laboratory and sonication conditions, there is no scientific foundation for outdoor ultrasonic algae control.
Academic studies
It has been speculated that ultrasound produced at the resonance frequencies of cells or their membranes may cause them to rupture.
The center frequencies of the ultrasound pulses used in academic studies lie between 20 kHz and 2.5 MHz. The acoustic powers, pressures, and intensities applied vary from low (not affecting humans) to high (unsafe for swimmers).
According to research at the University of Hull, ultrasound-assisted gas release from blue-green algae cells may take place from nitrogen-containing cells, but only under very specific short-distance conditions that are not representative of intended outdoor applications. In addition, a study by Wageningen University on several algae species concluded that most claims about outdoor ultrasonic algae control are unsubstantiated.
See also
Ultrasonic antifouling
References
Green algae
Ultrasound
Acoustics
Algae classes |
https://en.wikipedia.org/wiki/Compensator%20%28control%20theory%29 | A compensator is a component in a control system that is used to regulate another system. Usually this is done by conditioning the input or the output of that system. There are three types of compensators: lag, lead, and lag-lead compensators.
Adjusting a control system in order to improve its performance might lead to unexpected behaviour (e.g., poor stability or even instability by increasing the gain value). In order to make the system behave as desired, it is necessary to redesign the system and add a compensator, a device which compensates for the deficient performance of the original system.
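As an illustration of how a compensator reshapes a system's response, the sketch below computes the phase lead contributed by a first-order lead compensator; the zero and pole values are arbitrary assumptions chosen for the example, not taken from the article:

```python
# Phase lead contributed by a hypothetical lead compensator
# C(s) = (s + z)/(s + p) with z = 1, p = 10 (illustrative values only).
import cmath
import math

Z, P = 1.0, 10.0

def lead(s):
    """Frequency response of the first-order lead compensator."""
    return (s + Z) / (s + P)

# The maximum phase lead occurs at the geometric mean of the zero and
# pole frequencies, and equals asin((p - z)/(p + z)).
w_max = math.sqrt(Z * P)
phase_deg = math.degrees(cmath.phase(lead(1j * w_max)))
expected = math.degrees(math.asin((P - Z) / (P + Z)))   # about 54.9 degrees
```

Placing this added phase near the gain-crossover frequency is the standard way a lead compensator improves the phase margin (and hence stability) of the compensated loop.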
See also
Control theory
Lead–lag compensator
References
Control theory |
https://en.wikipedia.org/wiki/ChIL-sequencing | ChIL sequencing (ChIL-seq), also known as Chromatin Integration Labeling sequencing, is a method used to analyze protein interactions with DNA. ChIL-sequencing combines antibody-targeted controlled cleavage by Tn5 transposase with massively parallel DNA sequencing to identify the binding sites of DNA-associated proteins. It can be used to map global DNA binding sites precisely for any protein of interest. Currently, ChIP-Seq is the most common technique used to study protein–DNA interactions; however, it suffers from a number of practical and economic limitations that ChIL-Sequencing does not. ChIL-Seq is a precise technique that reduces sample loss and can be applied to single cells.
Uses
ChIL-sequencing can be used to examine gene regulation or to analyze transcription factor and other chromatin-associated protein binding. Protein–DNA interactions regulate gene expression and are responsible for many biological processes and disease states. This epigenetic information is complementary to genotype and expression analysis. ChIL-Seq is an alternative to the current standard, ChIP-seq. ChIP-Seq suffers from limitations because the cross-linking step in ChIP-Seq protocols can promote epitope masking and generate false-positive binding sites. In addition, ChIP-seq suffers from suboptimal signal-to-noise ratios and poor resolution. ChIL-sequencing has the advantage of being a simpler technique suitable for low sample input due to its high signal-to-noise ratio, requiring less sequencing depth for higher sensitivity.
Specific DNA sites in direct physical interaction with proteins such as transcription factors can be isolated by Protein-A (pA) conjugated Tn5 bound to a protein of interest. Tn5 mediated cleavage produces a library of target DNA sites bound to a protein of interest in situ. Sequencing of prepared DNA libraries and comparison to whole-genome sequence databases allows researchers to analyze the interactions between target proteins and DNA, as well as d |
https://en.wikipedia.org/wiki/Incomplete%20Bessel%20K%20function/generalized%20incomplete%20gamma%20function | Some mathematicians have defined an incomplete version of the Bessel function of this type, or equivalently a generalized version of the incomplete gamma function:
Properties
One advantage of defining this incomplete version of the Bessel function is that, for example, even the associated Anger–Weber function defined in the Digital Library of Mathematical Functions can be related to it:
Recurrence relations
These functions satisfy the following recurrence relation:
References
Special functions
Special hypergeometric functions
Functions and mappings |
https://en.wikipedia.org/wiki/Source%20%28programming%20language%29 | Source is a family of sublanguages of JavaScript, developed for the textbook Structure and Interpretation of Computer Programs, JavaScript Edition (SICP JS). The JavaScript sublanguages Source §1, Source §2, Source §3 and Source §4 are designed to be just expressive enough to support all examples of the respective chapter of the textbook.
Purpose and design principle
During the development of SICP JS, starting in 2008, it became clear that purpose-designed sublanguages of JavaScript would contribute to the learning experience. Initially called "JediScript" and inspired by the book "JavaScript: The Good Parts" by Douglas Crockford, the Source sublanguages follow the chapters of SICP JS; each language Source §x is a sublanguage of the next language Source §(x+1). Following the minimalistic approach of SICP JS, implementations of Source are expected to remove any JavaScript language features that are not included in the language specification.
Features
Source §1 is a very small purely functional sublanguage of JavaScript, designed for Chapter 1 of SICP JS. Source §2 adds pairs and a list library, following the data structures theme of Chapter 2. Source §3 adds stateful constructs, and Source §4 adds support for meta-circular evaluation. Chapter 5 of SICP JS does not require language support beyond Source §4. All Source languages are properly tail recursive, as required by Chapter 1 of SICP and as specified by ECMAScript 2015.
Source Academy
Since the Safari browser is ECMAScript-2015-compliant, including proper tail calls, it can serve as an implementation of all Source languages, provided that the SICP package is loaded. The Source Academy is a web-based programming environment that implements all Source languages, regardless of browser support for proper tail calls, and features various tools for the readers of SICP JS. The language implementation in the Source Academy, js-slang, is also available as a stand-alone environment based on Node.js.
References
Exte |
https://en.wikipedia.org/wiki/Jewish%20Museum%20of%20the%20American%20West | The Jewish Museum of the American West is an online museum sponsored by the Western States Jewish History Association, dedicated to telling the stories of Jews' participation in the development of the American West and the reasons for their success. It was established in 2013 by Gladys Sturman and David W. Epstein of the Western States Jewish History Association as a continuation of its journal, published from 1968 to 2018.
References
Jewish-American history
Jewish history organizations
Jewish museums in the United States
Museums established in 2013
2013 establishments in the United States
Virtual museums |
https://en.wikipedia.org/wiki/Prusa%20Mini | The Prusa Mini, sometimes stylized as the Original Prusa MINI, is an open-source fused deposition modeling 3D printer that is manufactured by the Czech company Prusa Research. The printer is the lowest cost machine produced by Prusa Research and is designed as a first printer or as part of a 'print farm'.
Specifications
Mini
The Prusa Mini was officially launched in October 2019. The printer is available either assembled or as a kit. The build volume is 180 x 180 x 180 mm, and the print is performed on a spring-steel sheet which is meant to be easy to remove. Minimum layer resolution is 50 micrometers, and the maximum print speed is 200 millimeters per second. The printer has a color LCD display (non-touch) and can print from USB drives. It has a custom 32-bit mainboard and a built-in online firmware updater. The printer has sensorless homing using Trinamic 2209 drivers and a custom hot end which supports E3D nozzles.
It has several safety features including three thermistors to detect thermal runaway.
The printer is the first open-source hardware product to require a user wishing to use unsigned firmware to physically break off a piece of the PCB, voiding the printer's warranty, before such firmware can be flashed onto the board. This ensures Prusa is not liable for damage caused by printers instructed to behave in an unendorsed manner by custom firmware (such as firmware that disables thermal runaway protection or other safety features).
Mini+
In November 2020, the Prusa Mini was replaced by the Mini+, which had a few small updates meant to ease assembly and maintenance. One of the changes was a new mesh bed levelling sensor called "SuperPINDA", which replaced the previous "MINDA" sensor; the manufacturer claims that this results in more consistent calibration of the first print layer in particular. The Mini+ filament sensor is an optional extra.
See also
Prusa i3
References
External links
Prusa Research
Open hardware electronic devices
3D pr |
https://en.wikipedia.org/wiki/Adela%20Ruiz%20de%20Royo | Adela María Ruiz González, customary married name Ruiz de Royo (December 15, 1943 – June 19, 2019), was a Spanish-born Panamanian mathematics academic and educator. She served as the First Lady of Panama from 1978 until 1982 during the presidency of her husband, Aristides Royo. She also served as President of the Panamanian Academy of Language (Academia Panameña de la Lengua).
Biography
Ruiz was born Adela María Ruiz González in a home in the municipality of Grado, Asturias, Spain to parents, José María and Rosalina. She was raised in the nearby city of Oviedo alongside her three sisters, Marta, Mabel, and María José. Ruiz was nicknamed Deli.
By 1960, Ruiz had moved to Salamanca to study medicine. That same year, she met her future husband, a Panamanian national and fellow student at the University of Salamanca named Aristides Royo. The couple married in the early 1960s and eventually had three children - Marta Elena, Irma Natalia, and Aristides José. Ruiz, Royo and their oldest daughter, Marta, moved to Panama permanently on September 17, 1965.
In addition to her own career, Ruiz held the role of the wife of a government minister and politician. She became First Lady of Panama from 1978 to 1982. During her tenure as first lady, Ruiz created the Asociación Pro Obras de Beneficencia.
Ruiz was diagnosed with colon and liver cancer in 2017. She died from the disease on June 19, 2019, at the age of 75. Adela Ruiz was survived by her husband, Aristides Royo, and their three children, Marta Elena, Natalia, and Arístides José. Her funeral was held at the National Sanctuary in Bella Vista, Panama City on June 24, 2019. Ruiz's ashes were returned to her native Spain, where they were partially buried at the Praviano cemetery in Riberas, Asturias. A second funeral mass was held at the Carmelite Catholic Church of Oviedo on October 4, 2019. Shortly before the funeral, her remaining ashes were sprinkled into the Cantabrian Sea by her husband and children.
In December 2019, R |
https://en.wikipedia.org/wiki/Eakin%E2%80%93Nagata%20theorem | In abstract algebra, the Eakin–Nagata theorem states: given commutative rings A ⊆ B such that B is finitely generated as a module over A, if B is a Noetherian ring, then A is a Noetherian ring. (Note the converse is also true and is easier.)
The theorem is similar to the Artin–Tate lemma, which says that the same statement holds with "Noetherian" replaced by "finitely generated algebra" (assuming the base ring is a Noetherian ring).
The theorem was first proved in Paul M. Eakin's thesis and later independently by Nagata. The theorem can also be deduced from the characterization of a Noetherian ring in terms of injective modules, as done for example by David Eisenbud; this approach is useful for a generalization to non-commutative rings.
Proof
The following more general result is due to Edward W. Formanek and is proved by an argument rooted in the original proofs by Eakin and Nagata; this formulation is likely the most transparent one.
Theorem (Formanek): Let A be a commutative ring and M a faithful finitely generated module over it. If the ascending chain condition holds on the submodules of the form IM, where I is an ideal of A, then M is a Noetherian module (and consequently A is a Noetherian ring).
Proof: It is enough to show that M is a Noetherian module since, in general, a ring admitting a faithful Noetherian module over it is a Noetherian ring. Suppose otherwise. By assumption, the set of all IM, where I is an ideal of A such that M/IM is not Noetherian, has a maximal element, I₀M. Replacing M and A by M/I₀M and A/Ann(M/I₀M), we can assume
for each nonzero ideal I ⊆ A, the module M/IM is Noetherian.
Next, consider the set S of submodules N ⊆ M such that M/N is faithful. Choose a set of generators x₁, …, xₙ of M and then note that M/N is faithful if and only if, for each a ∈ A, the inclusion {ax₁, …, axₙ} ⊆ N implies a = 0. Thus, it is clear that Zorn's lemma applies to the set S, and so S has a maximal element, N₁. Now, if M/N₁ is Noetherian, then it is a faithful Noetherian module over A and, consequently, A is a Noetherian ring, a contradiction. Hence, M/N₁ is not Noetherian and, replacing M by M/N₁, we can also assume
each nonzero submodule N ⊆ M is such that M/N is not faithful.
Let a submodule 0 ≠ N ⊆ M be given. Since M/N is not faithful, there is a nonzero element a ∈ A such that aM ⊆ N. By assumption, M/aM is Noetherian and so N/aM is finitely ge |
https://en.wikipedia.org/wiki/Fairness%20%28machine%20learning%29 | Fairness in machine learning refers to the various attempts at correcting algorithmic bias in automated decision processes based on machine learning models. Decisions made by computers after a machine-learning process may be considered unfair if they were based on variables considered sensitive. Examples of such variables include gender, ethnicity, sexual orientation, disability, and more. As is the case with many ethical concepts, definitions of fairness and bias are always controversial. In general, fairness and bias are considered relevant when the decision process impacts people's lives. In machine learning, the problem of algorithmic bias is well known and well studied. Outcomes may be skewed by a range of factors and thus might be considered unfair with respect to certain groups or individuals. An example would be the way social media sites deliver personalized news to consumers.
Context
Discussion of fairness in machine learning is a relatively recent topic. Since 2016 there has been a sharp increase in research into the topic. This increase could be partly attributed to an influential report by ProPublica which claimed that the COMPAS software, widely used in US courts to predict recidivism, was racially biased. One topic of research and discussion is the definition of fairness: there is no universal definition, and different definitions can contradict each other, which makes it difficult to judge machine learning models. Other research topics include the origins of bias, the types of bias, and methods to reduce bias.
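Two commonly studied group-fairness criteria, demographic parity and equal opportunity, can be computed directly from a model's predictions; the labels, predictions, and group memberships below are invented for illustration:

```python
# Toy check of two common group-fairness metrics; all data is made up.
def rate(values, mask):
    """Mean of the values selected by the boolean mask."""
    selected = [v for v, m in zip(values, mask) if m]
    return sum(selected) / len(selected)

y_pred = [1, 0, 1, 1, 0, 1, 0, 0]      # model decisions (1 = favorable)
y_true = [1, 0, 1, 0, 0, 1, 1, 0]      # actual outcomes
group  = ["a", "a", "a", "a", "b", "b", "b", "b"]

# Demographic parity: P(decision = 1) should be similar across groups.
dp_gap = abs(rate(y_pred, [g == "a" for g in group])
             - rate(y_pred, [g == "b" for g in group]))

# Equal opportunity: true-positive rates should be similar across groups.
def tpr(g):
    return rate(y_pred, [gr == g and t == 1
                         for gr, t in zip(group, y_true)])

eo_gap = abs(tpr("a") - tpr("b"))
# Both gaps are 0.5 on this toy data, flagging the model as unfair to "b".
```

That the two criteria can disagree on real data is one concrete reason the definitions mentioned above can contradict each other.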
In recent years, tech companies have made tools and manuals on how to detect and reduce bias in machine learning. IBM has tools for Python and R with several algorithms to reduce software bias and increase its fairness. Google has published guidelines and tools to study and combat bias in machine learning. Facebook has reported the use of a tool, Fairness Flow, to detect bias in its AI. However, critics have |
https://en.wikipedia.org/wiki/Meeza | Meeza (ميزة) is an Egyptian electronic payment systems provider for domestic transactions within Egypt. It is supported by the Egyptian government and is regulated by the Egyptian Central Bank and the national Egyptian Banks Company (EBC). Meeza provides similar electronic payment services as MasterCard and Visa but can only be used locally inside Egypt. The name Meeza means "special" in Egyptian Arabic.
History
Meeza was established in early 2019 to provide a national payment scheme supporting a cashless society in Egypt. By the end of 2019, Meeza had issued about 4 million payment cards for use within the Egyptian network.
In October 2020, Meeza started offering free prepaid cards to Egyptians, which can be used locally but cannot be used outside Egypt.
Products and services
Meeza produces bank cards and mobile wallet applications for local transactions within Egypt. Meeza payment cards are accepted in merchandise stores and government organizations across Egypt, in addition to online Egyptian e-commerce websites. Meeza issues both prepaid payment cards and bank account debit cards. The cards are issued by most of the major banks in Egypt like National Bank of Egypt (NBE), Banque Misr, Alex Bank, Banque du Caire, Commercial International Bank (CIB), and other Egyptian banks. The Meeza card products also include contactless cards for point-of-sale outlets supporting them.
See also
Payment service provider
E-commerce payment system
References
External links
Official website
Telecommunications companies of Egypt
Financial services companies established in 2019
Egyptian_brands
Credit cards
Credit card issuer associations
Contactless smart cards
Online payments
2019 establishments in Egypt
Companies based in Cairo |
https://en.wikipedia.org/wiki/Dustnet | Dustnet (stylized as DUSTNET) is a 2019 asymmetrical action sandbox video game developed by the Canadian studio SCRNPRNT. The game explores the theme of "dying" or disappearing multiplayer video games and their player bases, with the gameplay set around a copy of the Dust II multiplayer map, originally created for Counter-Strike in 2001.
Gameplay
Many of Dustnet's features are direct references to Counter-Strike, such as an unlockable "bunny hopping" feature, a choice between one "Counter-Terrorist" and one "Terrorist" team upon joining a server, and the presence of a bomb to be defused. A deathmatch mechanic is present, with the available weapons referencing the Counter-Strike and Quake video game series.
However, many of these features are inserted merely as references to the game's theme. For example, the bomb to be defused, which is a central point of importance in the Counter-Strike series, is of no particular importance in Dustnet. Instead, players may edit and build upon the original "Dust II" layout, copying parts of it or laying down new map pieces such as teleporters, jump pads, and ammo pickups. Players with access to virtual reality headsets will appear to normal PC players as giant floating pairs of hands, which may smite or haul other players around the game world, as well as edit and build upon the level on a greater scale.
As a tie-in to the game's themes, a server's player-made creations are permanent for as long as there are players connected to it, but will be erased if the server becomes empty. Besides this, a server may go into a state named "low-energy mode", where the game's visuals and audio become more dull and muted. This can be prevented by executing game-related actions such as joining a server, searching through a defeated player's inventory, or planting the bomb.
The game features a free augmented reality companion app, available for Android and iOS. It allows players to observe and interact with one of the game's server |
https://en.wikipedia.org/wiki/SELF-SCAN | SELF-SCAN is a family of plasma displays introduced by Burroughs Corporation during the 1970s. The most common format was a single-row dot matrix display in sizes from 16 to 40 ASCII characters wide. Other formats were also produced, including the SELF-SCAN II 40 wide by 12 or 6 line high displays, and a variety of custom displays showing gauges or pointers.
The SELF-SCAN displays were an important stepping-stone technology between the printer-based teletype-like terminals of the 1960s and the widespread use of cathode ray tube (CRT) displays from the mid-1970s on. They were often used for operator terminals on mainframe and minicomputers, and after that continued to see some use in demanding environments, where their thinness, on the order of 1 inch, and their resistance to the magnetic and electric interference that causes problems for CRTs kept them in service.
The introduction of low-cost liquid crystal displays (LCDs) replaced plasma displays in these uses by the mid-1990s.
References
Display devices |
https://en.wikipedia.org/wiki/GeneMatcher | GeneMatcher is an online service and database that aims to match clinicians studying patients with a rare disease presentation based on genes of interest. When two or more clinicians submit the same gene to the database, the service matches them together to allow them to compare cases. It also allows matching genes from animal models to human cases. The service aims to establish novel relationships between genes and genetic diseases of unknown cause.
The website was launched in September 2013 by a team from a government-funded collaborative project between Johns Hopkins Hospital and Baylor College of Medicine in the United States.
The site contained 11,855 genes from 7,724 submitters in 88 countries, and 6,609 matches had been made. The service has aided geneticists in making several discoveries, including establishing the genetic causes of a form of autism spectrum disorder, syndromes of microcephaly with hearing loss, a mitochondrial disease, SPONASTRIME dysplasia and Au–Kline syndrome.
History
The website was launched in September 2013 by Nara Sobreira, François Schiettecatte, Ada Hamosh and others. The team are part of a collaborative effort between Johns Hopkins Hospital in Baltimore, Maryland and Baylor College of Medicine in Houston, Texas, United States called the Baylor–Hopkins Center for Mendelian Genomics (BHCMG), one of three such Centers for Mendelian Genomics (CMGs) established and funded by the American National Institutes of Health (NIH) and National Human Genome Research Institute (NHGRI) in 2011.
Features
The service allows researchers to submit candidate genes to a database and match based on a shared gene of interest. Researchers, healthcare providers or patients can create an account using their email, name and address. Upon doing this, they can post a gene by gene symbol, Entrez ID or Ensembl gene ID. They can also specify genes by OMIM number or genomic location. If an identical gene has already been posted by another user, the match |
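The matching rule described above can be sketched in a few lines. This is an illustrative toy, not GeneMatcher's actual implementation; the class name, gene symbol and submitter names are hypothetical.

```python
from collections import defaultdict

class GeneMatchDatabase:
    """Toy model of the matching rule: a match occurs as soon as two or
    more submitters post the same gene identifier."""

    def __init__(self):
        # gene identifier -> list of submitters who posted it
        self._submissions = defaultdict(list)

    def submit(self, gene_id, submitter):
        """Record a submission and return all earlier submitters of the same gene."""
        matches = list(self._submissions[gene_id])
        self._submissions[gene_id].append(submitter)
        return matches

db = GeneMatchDatabase()
assert db.submit("KAT6A", "clinician_a") == []               # first submission: no match yet
assert db.submit("KAT6A", "clinician_b") == ["clinician_a"]  # second submission matches the first
```

A real system would of course normalize identifiers (gene symbol, Entrez ID, Ensembl ID) to a common key before matching.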
https://en.wikipedia.org/wiki/List%20of%20MOSFET%20applications | The MOSFET (metal–oxide–semiconductor field-effect transistor) is a type of insulated-gate field-effect transistor (IGFET) that is fabricated by the controlled oxidation of a semiconductor, typically silicon. The voltage of the covered gate determines the electrical conductivity of the device; this ability to change conductivity with the amount of applied voltage can be used for amplifying or switching electronic signals.
The MOSFET is the basic building block of most modern electronics, and the most frequently manufactured device in history, with an estimated total of 13 sextillion (1.3 × 10²²) MOSFETs manufactured between 1960 and 2018. It is the most common semiconductor device in digital and analog circuits, and the most common power device. It was the first truly compact transistor that could be miniaturized and mass-produced for a wide range of uses. MOSFET scaling and miniaturization have driven the rapid exponential growth of electronic semiconductor technology since the 1960s, enabling high-density integrated circuits (ICs) such as memory chips and microprocessors.
MOSFETs in integrated circuits are the primary elements of computer processors, semiconductor memory, image sensors, and most other types of integrated circuits. Discrete MOSFET devices are widely used in applications such as switch mode power supplies, variable-frequency drives, and other power electronics applications where each device may be switching thousands of watts. Radio-frequency amplifiers up to the UHF spectrum use MOSFET transistors as analog signal and power amplifiers. Radio systems also use MOSFETs as oscillators, or mixers to convert frequencies. MOSFET devices are also applied in audio-frequency power amplifiers for public address systems, sound reinforcement, and home and automobile sound systems.
Integrated circuits
The MOSFET is the most widely used type of transistor and the most critical device component in integrated circuit (IC) chips. Planar process, develop |
https://en.wikipedia.org/wiki/Swiss%20Media%20Database | Swiss Media Database (SMD) is a Swiss newspaper-article and television-program database accessible at no charge to media professionals. The public can access its contents for a fee.
Organization and contents
The Swiss Media Database, founded in Zurich in May 1966, is a joint venture of the publishing houses Ringier of Zofingen, Tamedia of Zurich, and Swiss Radio and Television. Each holds one-third of the shares.
The offerings of the participating publishers are in full text, and reproductions of the original newspaper pages are available. Full texts of most Swiss daily and weekly newspapers, print and online, are archived.
Since June 2019, broadcasts of the German-speaking Swiss television (SRF) and the Swiss television in French (RTS) have been indexed. Swiss Media Database (SMD) contains more than 33 million documents (as of 2019). About two million are added each year.
Since 2002, SMD has been providing paid access to its archives through the website swissdox.ch. The SMD, in cooperation with the Association of Swiss Professional Journalists, has also offered access for freelance journalists.
Deletion of articles
The deletion of articles in the database is sometimes ordered by courts, or undertaken by publishers on their own initiative. These deletions have led to controversy, for example over the coverage of Jolanda Spiess-Hegglin, a politician in the canton of Zug who quit the Alternative Green Party a year after a scandal in which she claimed to have been sexually abused by a fellow member of the cantonal parliament.
References
External links
Schweizer Mediendatenbank
swissdox.ch
Online archives
Mass media companies established in 1966
Internet properties established in 2002
Archives in Switzerland
Companies based in Zürich |
https://en.wikipedia.org/wiki/Inverse%20Pythagorean%20theorem | In geometry, the inverse Pythagorean theorem (also known as the reciprocal Pythagorean theorem or the upside down Pythagorean theorem) is as follows:
Let A, B be the endpoints of the hypotenuse of a right triangle ABC. Let D be the foot of a perpendicular dropped from C, the vertex of the right angle, to the hypotenuse. Then
1/CD² = 1/BC² + 1/AC².
This theorem should not be confused with proposition 48 in book 1 of Euclid's Elements, the converse of the Pythagorean theorem, which states that if the square on one side of a triangle is equal to the sum of the squares on the other two sides then the other two sides contain a right angle.
Proof
The area of triangle ABC can be expressed in terms of either AC and BC, or AB and CD:
(1/2)·AC·BC = (1/2)·AB·CD,
given that CD ⊥ AB. Squaring and rearranging gives
1/CD² = AB²/(AC²·BC²).
Using the Pythagorean theorem, AB² = AC² + BC², so
1/CD² = (AC² + BC²)/(AC²·BC²) = 1/BC² + 1/AC²,
as above.
Special case of the cruciform curve
The cruciform curve or cross curve is a quartic plane curve given by the equation
x²y² − b²x² − a²y² = 0,
where the two parameters determining the shape of the curve, a and b, are each equal to CD.
Substituting x with AC and y with BC gives
AC²·BC² − CD²·AC² − CD²·BC² = 0,
which, after dividing by AC²·BC²·CD², is the statement of the theorem.
Inverse-Pythagorean triples can be generated from ordinary Pythagorean triples: if (p, q, r) is a Pythagorean triple with hypotenuse r, then (rp, rq, pq) satisfies 1/(rp)² + 1/(rq)² = 1/(pq)².
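As a numerical sketch of this construction (the parametrization by t and u below is the standard one for Pythagorean triples, chosen here for illustration):

```python
from fractions import Fraction

def inverse_pythagorean_triple(t, u):
    """Build an inverse-Pythagorean triple (a, b, c) with 1/a² + 1/b² = 1/c²
    from integers t > u > 0, via the Pythagorean triple
    (p, q, r) = (t² − u², 2tu, t² + u²): the triple is (r·p, r·q, p·q)."""
    p, q, r = t*t - u*u, 2*t*u, t*t + u*u
    return r*p, r*q, p*q

a, b, c = inverse_pythagorean_triple(2, 1)   # from (3, 4, 5): gives (15, 20, 12)
# exact rational arithmetic confirms 1/15² + 1/20² = 1/12²
assert Fraction(1, a**2) + Fraction(1, b**2) == Fraction(1, c**2)
```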
Application
If two identical lamps are placed at A and B, the theorem and the inverse-square law imply that the light intensity at C is the same as when a single lamp is placed at D.
See also
References
Geometry |
https://en.wikipedia.org/wiki/Math-Tinik | Math-Tinik (; stylized as MATH-Tinik) is a Philippine educational children's television series produced by the E-Media program of ABS-CBN Foundation (now the ABS-CBN Lingkod Kapamilya Foundation) and the Department of Education, Culture and Sports (DECS; now the Department of Education). The series was conceived by producer Gina Lopez to impart mathematics lessons to students through visual means. The series aired from January 7, 1997, to 2004.
Cast
Angela Garcia as Ms. Math-Tinik
Lorena Garcia as Sheila Mae
Huey Remulla as Charlie
Claudine Alejandro as Patricia
Mark Guayco as Joey
Introduced in 2000
Herbie Go as Artmetic
Production
The series' head writers were Catherina Calzo-Fournier and Andrea Delos Reyes, with the former later becoming editor-in-chief of the Pinas newspaper. One of the episode directors was Rene Guidote, who also directed Sine'skwela, Bayani, Pahina and other educational series made by ABS-CBN Foundation.
As with other educational television programs from the ABS-CBN Foundation and DECS, a single episode of Math-Tinik took between three and nine months to make from conception to approval.
Music
The "Math-Tinik Theme" was sung by Cris Villonco, with lyrics by Ting-ting Calzo-Fournier and music composed by Jungee Marcelo. The series composers were Noel Argosino, Froilan Malimban, and Noel Manalo, who also provided sound effects for the series.
In the 1999 episode "Numeration", the song "Numerals" was composed by Liezel Ann Tiamzon. In the 2000 episode "Time and Calendar", the song "Time and Calendar", also composed by Tiamzon, was sung by Caloy Santos Jr.
See also
Sine'skwela
Hiraya Manawari
Bayani
Epol/Apple
References
1997 Philippine television series debuts
2004 Philippine television series endings
1990s Philippine television series
ABS-CBN original programming
Children's education television series
Filipino-language television shows
Mathematics education television series
Philippine children's television series
Philippine educational |
https://en.wikipedia.org/wiki/Canadian%20Energy%20Centre | The Canadian Energy Centre Limited (CEC), also commonly called the "Energy War Room", is an Alberta provincial corporation mandated to promote Alberta's energy industry and rebut "domestic and foreign-funded campaigns against Canada's oil and gas industry". The creation of an organization to promote Alberta's oil and gas industries was a campaign promise by United Conservative Party leader Jason Kenney during the 2019 Alberta general election. After winning a majority of seats in the election, Kenney's government inaugurated the CEC with a $2.84 million budget in December 2019. The CEC originally had an annual budget of CA$30 million, which was later decreased to CA$12 million. The CEC has been the subject of several controversies since its establishment, including accusations of plagiarizing logo designs. The CEC attracted widespread media attention when it launched a campaign against the Netflix animated children's movie Bigfoot Family because it cast Alberta's oil and gas industry in a negative light.
Background
The creation of a 'war room' capable of challenging "energy industry critics' inaccuracies" was an election promise made by then-candidate Jason Kenney as part of his campaign leading up to the 16 April 2019 Alberta general election. In the founding speech of the UCP on 9 May 2018, Kenney announced that he would engage in "national and international advocacy" including a "fully staffed rapid response war room in government to quickly and effectively rebut every lie told by the green left about our world-class energy industry. If companies like HSBC decide to boycott our oil sands, our government will boycott them. It's called a market decision." Premier Kenney, whose United Conservative Party (UCP) won a majority of seats in the Alberta Legislature, announced the creation of the Calgary-based $30 million "Energy War Room" on 7 June 2019 to "fight misinformation related to oil and gas".
On 6 May 2019 Nick Koolsbergen, who was the UCP's Alberta campaign manager fo |
https://en.wikipedia.org/wiki/Nahid%20Khazenie | Dr. Nahid Khazenie is a mechanical engineer who served as president of the IEEE Geoscience and Remote Sensing Society from 1998 to 1999.
Khazenie completed her undergraduate education at Michigan Technological University before going on to receive several graduate degrees from the University of Texas at Austin, including a Ph.D. in mechanical engineering and operations research in 1987. She joined the faculty there as a research scientist, specializing in remote sensing applications in agriculture and ocean studies. This work led to a senior scientist appointment at the Naval Research Laboratory, and then to NASA as Earth Science Enterprise Education Programs Manager.
References
Living people
Year of birth missing (living people)
American mechanical engineers
Michigan Technological University alumni
Cockrell School of Engineering alumni
Members of the IEEE |
https://en.wikipedia.org/wiki/Eastin%E2%80%93Knill%20theorem | The Eastin–Knill theorem is a no-go theorem that states: "No quantum error correcting code can have a continuous symmetry which acts transversely on physical qubits". In other words, no quantum error correcting code can transversely implement a universal gate set, where a transversal logical gate is one that can be implemented on a logical qubit by the independent action of separate physical gates on corresponding physical qubits.
In addition to investigating fault tolerant quantum computation, the Eastin–Knill theorem is also useful for studying quantum gravity via the AdS/CFT correspondence and in condensed matter physics via quantum reference frame or many-body theory.
The theorem is named after Bryan Eastin and Emanuel Knill, who published it in 2009.
Description
Since quantum computers are inherently noisy, quantum error correcting codes are used to correct errors that affect information due to decoherence and dissipation. Decoding error corrected data in order to perform gates on the qubits makes it prone to errors. Fault tolerant quantum computation avoids this by performing gates on encoded data. Transversal gates, which perform a gate between two logical qubits each of which is encoded in N physical qubits by pairing up the physical qubits of each encoded qubit ("code block"), and performing independent gates on each pair, can be used to perform fault tolerant but not universal quantum computation because they guarantee that errors don't spread uncontrollably through the computation. This is because transversal gates ensure that each qubit in a code block is acted on by at most a single physical gate and each code block is corrected independently when an error occurs.
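As a toy illustration of transversality (not an example from the theorem's paper): in the 3-qubit repetition code, which protects only against bit flips, the logical X gate is transversal, since acting with an independent physical X on each qubit maps the logical basis states onto each other. The code below tracks computational basis states only.

```python
# Logical states of the 3-qubit repetition code (basis-state picture):
# |0>_L = |000>, |1>_L = |111>.

def physical_x(bit):
    """Pauli X on a single physical qubit, restricted to basis states."""
    return 1 - bit

def transversal_x(codeword):
    """Independent physical X on every qubit of the code block --
    no gate couples qubits within the block, so errors cannot spread."""
    return tuple(physical_x(b) for b in codeword)

zero_L, one_L = (0, 0, 0), (1, 1, 1)
assert transversal_x(zero_L) == one_L   # logical X implemented transversally
assert transversal_x(one_L) == zero_L
```

The Eastin–Knill theorem says no single code can supply such qubit-wise implementations for an entire universal gate set.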
The Eastin–Knill theorem implies that a universal gate set such as {H, S, CNOT, T} cannot be implemented transversally. For example, the T gate cannot be implemented transversally in the Steane code. This calls for ways of circumventing Eastin–Knill in order to perform fault tolerant quantum computatio
https://en.wikipedia.org/wiki/Hawking%20Index | The Hawking Index (HI) is a mock mathematical measure on how far people will, on average, read through a book before giving up. It was invented by American mathematician Jordan Ellenberg, who created it in a blog for the Wall Street Journal in 2014. The index is named after English physicist Stephen Hawking, whose book A Brief History of Time has been dubbed "the most unread book of all time".
Calculation
Ellenberg's method of calculating the index draws on the "popular highlights": the five most highlighted passages marked by Amazon Kindle readers of each title. A wide spread of highlights throughout the work suggests that most readers have read the entire book, resulting in a high score on the index. If the highlights cluster only at the beginning of the book, then fewer people are likely to have read the book completely, and it will thus score low on the index. When the index was created, this information was easier to access, as "popular highlights" were available to everyone, but since then it has only been made available to people who buy the books on Kindle.
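The calculation can be sketched as follows. The normalization is one reading of Ellenberg's method (average page position of the popular highlights as a fraction of the book's length), and the page numbers and book length below are hypothetical.

```python
def hawking_index(highlight_pages, total_pages):
    """Mock Hawking Index: mean page position of the most-highlighted
    passages, expressed as a percentage of the book's length."""
    return 100 * sum(highlight_pages) / (len(highlight_pages) * total_pages)

# Hypothetical 400-page book whose five popular highlights all fall early:
# a low score, suggesting most readers stopped well before the end.
assert hawking_index([10, 30, 55, 80, 95], 400) == 13.5
```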
Hawking Index scores
When Ellenberg first used the index, he used the following books as his examples.
References
Literature
Units of measurement |
https://en.wikipedia.org/wiki/Domain%20controller | A domain controller (DC) is a server that responds to security authentication requests within a computer network domain. It is a network server that is responsible for allowing host access to domain resources. It authenticates users, stores user account information and enforces security policy for a domain. It is most commonly implemented in Microsoft Windows environments (see Domain controller (Windows)), where it is the centerpiece of the Windows Active Directory service. However, non-Windows domain controllers can be established via identity management software such as Samba and Red Hat FreeIPA.
Software
The software and operating system used to run a domain controller usually consists of several key components shared across platforms. This includes the operating system (usually Windows Server or Linux), an LDAP service (Red Hat Directory Server, etc.), a network time service (ntpd, chrony, etc.), and a computer network authentication protocol (usually Kerberos). Other components, such as a public key infrastructure (Active Directory Certificate Services, DogTag, OpenSSL) service and Domain Name System (Windows DNS or BIND) may also be included on the same server or on another domain-joined server.
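As a hedged illustration of how these components come together on a non-Windows platform, a minimal Samba configuration fragment for the domain-controller role might look like the following. The realm, workgroup and forwarder address are placeholders, not a recommended production setup.

```ini
# Illustrative smb.conf fragment for a Samba-based domain controller.
[global]
    server role = active directory domain controller
    realm = EXAMPLE.COM            ; Kerberos realm served by the DC
    workgroup = EXAMPLE            ; NetBIOS domain name
    dns forwarder = 192.0.2.53     ; upstream DNS for non-domain queries
```

In this role Samba provides the LDAP directory, Kerberos KDC and DNS services itself, while time synchronization is typically delegated to ntpd or chrony on the same host.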
Implementation
Domain controllers are typically deployed as a cluster to ensure high availability and maximize reliability. In a Windows environment, one domain controller serves as the Primary Domain Controller (PDC) and all other servers promoted to domain controller status in the domain serve as Backup Domain Controllers (BDCs). In Unix-based environments, one machine serves as the master domain controller and others serve as replica domain controllers, periodically replicating database information from the master domain controller and storing it in a read-only format.
See also
Apple Open Directory
Domain controller (Windows)
Microsoft Windows Active Directory
Red Hat Identity Manager/Red Hat FreeIPA
Univention Corporate Server
References
Servers (computing |
https://en.wikipedia.org/wiki/K-stability | In mathematics, and especially differential and algebraic geometry, K-stability is an algebro-geometric stability condition, for complex manifolds and complex algebraic varieties. The notion of K-stability was first introduced by Gang Tian and reformulated more algebraically later by Simon Donaldson. The definition was inspired by a comparison to geometric invariant theory (GIT) stability. In the special case of Fano varieties, K-stability precisely characterises the existence of Kähler–Einstein metrics. More generally, on any compact complex manifold, K-stability is conjectured to be equivalent to the existence of constant scalar curvature Kähler metrics (cscK metrics).
History
In 1954, Eugenio Calabi formulated a conjecture about the existence of Kähler metrics on compact Kähler manifolds, now known as the Calabi conjecture. One formulation of the conjecture is that a compact Kähler manifold X admits a unique Kähler–Einstein metric in a class determined by the sign of the first Chern class c₁(X). In the particular case where c₁(X) = 0, such a Kähler–Einstein metric would be Ricci flat, making the manifold a Calabi–Yau manifold. The Calabi conjecture was resolved in the case where c₁(X) < 0 by Thierry Aubin and Shing-Tung Yau, and in the case c₁(X) = 0 by Yau. In the case where c₁(X) > 0, that is when X is a Fano manifold, a Kähler–Einstein metric does not always exist. Namely, it was known by work of Yozo Matsushima and André Lichnerowicz that such a manifold can only admit a Kähler–Einstein metric if its Lie algebra of holomorphic vector fields is reductive. However, it can easily be shown that the blow-up of the complex projective plane at one point is Fano but does not have a reductive Lie algebra. Thus not all Fano manifolds can admit Kähler–Einstein metrics.
After the resolution of the Calabi conjecture in the cases c₁(X) ≤ 0, attention turned to the loosely related problem of finding canonical metrics on vector bundles over complex manifolds. In 1983, Donaldson produced a new proof of the Narasimhan–Seshadri theorem. As proved by Donaldson, the theorem states that a holomorphic vector
https://en.wikipedia.org/wiki/Information%20Technology%20Rules%2C%202021 | The Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 is secondary or subordinate legislation that supersedes India's Intermediary Guidelines Rules, 2011. The 2021 rules stem from section 87 of the Information Technology Act, 2000 and are a combination of the draft Intermediaries Rules, 2018 and the OTT Regulation and Code of Ethics for Digital Media.
The Central Government of India along with the Ministry of Electronics and Information Technology (MeitY) and the Ministry of Information and Broadcasting (MIB) have coordinated in the development of the rules.
Intermediaries had until 25 May 2021 to comply with the rules.
History
During the Monsoon session of Parliament in 2018, a motion on "Misuse of social media platforms and spreading of fake news" was admitted. The Minister of Electronics and Information Technology accordingly made a detailed statement of the "resolve of the Government to strengthen the legal framework and make the social media platforms accountable under the law". MeitY then prepared the draft Information Technology (Intermediary Guidelines) Rules 2018 to replace the 2011 rules. The Information Technology Act, 2000 provided that intermediaries are protected from liability in some cases. The draft 2018 Rules sought to elaborate the liabilities and responsibilities of intermediaries more clearly. Further, the draft Rules were made "in order to prevent spreading of fake news, curb obscene information on the internet, prevent misuse of social-media platforms and to provide security to the users." The move followed a notice issued to WhatsApp in July 2018, warning it against helping to spread fake news and looking on as a "mute spectator".
In relation to the Prajawala case, on 11 December 2018, the Supreme Court of India observed that "the Government of India may frame the necessary Guidelines / SOP and implement them within two weeks so as to eliminate child pornography, rape and gang rape imag |
https://en.wikipedia.org/wiki/Divisorial%20scheme | In algebraic geometry, a divisorial scheme is a scheme admitting an ample family of line bundles, as opposed to an ample line bundle. In particular, a quasi-projective variety is a divisorial scheme and the notion is a generalization of "quasi-projective". It was introduced in (in the case of a variety) as well as in (in the case of a scheme). The term "divisorial" refers to the fact that "the topology of these varieties is determined by their positive divisors." The class of divisorial schemes is quite large: it includes affine schemes, separated regular (noetherian) schemes and subschemes of a divisorial scheme (such as projective varieties).
Definition
Here is the definition in SGA 6, which is a more general version of the definition of Borelli. Given a quasi-compact quasi-separated scheme X, a family of invertible sheaves L_i on it is said to be an ample family if the open subsets X_f, where f ranges over the global sections of positive tensor powers of the L_i, form a base of the (Zariski) topology on X; in other words, there is an open affine cover of X consisting of open sets of such form. A scheme is then said to be divisorial if there exists such an ample family of invertible sheaves.
Properties and counterexample
Since a subscheme of a divisorial scheme is divisorial, "divisorial" is a necessary condition for a scheme to be embedded into a smooth variety (or more generally a separated Noetherian regular scheme). To an extent, it is also a sufficient condition.
A divisorial scheme has the resolution property; i.e., a coherent sheaf is a quotient of a vector bundle. In particular, a scheme that does not have the resolution property is an example of a non-divisorial scheme.
See also
Jouanolou's trick
References
Algebraic geometry |
https://en.wikipedia.org/wiki/Grounding%20resistance%20tester | A grounding resistance tester also called an earth tester is a soil resistance measuring instrument. It is used for sizing and projecting grounding grids.
The first soil resistance measuring instruments were made in the 1950s by Evershed & Vignoles, the company behind the Megger insulation and earth resistance testers. One of the most widely used analog grounding testers in the USSR was the М416. Since the start of the 21st century several companies have produced digital earth resistance meters and testers. The main purpose of the instrument is to determine the adequacy of the grounding of an electrical system. By a National Electrical Code standard, the resistance to ground should be less than 25 ohms for the installation to be reliably and efficiently grounded.
Operating principle
The meter generates an electrical current and supplies it to the measuring electrodes driven into the soil. The potential difference measured between the two potential electrodes, together with the injected current, gives the value of the soil resistance.
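The principle reduces to Ohm's law: resistance is the measured potential difference divided by the injected test current. A minimal sketch, with illustrative values:

```python
def soil_resistance(voltage_v, current_a):
    """Estimate soil resistance (ohms) from the potential difference
    between the potential electrodes and the injected test current."""
    return voltage_v / current_a

r = soil_resistance(2.5, 0.25)   # 2.5 V measured while injecting 0.25 A
assert r == 10.0                 # 10 ohms -- under the 25-ohm NEC guideline
```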
Analog grounding resistance tester
The analog grounding resistance tester is built from four main blocks: a DC generator, a rectifier, a current coil and a potential coil. The deflection of the pointer on the analog scale depends on the ratio of the voltage across the potential coil to the current through the current coil.
Digital grounding resistance tester
The digital grounding resistance tester is built from digital electronic blocks such as timers, voltage regulators, and a digital display. The measurement ranges are adjusted with a multiturn trimpot.
Main characteristics
When measuring earth resistance with an instrument, it is important to know some of its basic characteristics in order to measure the soil resistance accurately and to size the grounding installation properly. Most important is the range of resistance the device can measure, which usually spans three or four orders of magnitude. The soil moisture at which the instrument operates is another important parameter. If the instrument cannot operate at a certain humidity, then the measurement may differ significant |
https://en.wikipedia.org/wiki/CALLISTO | CALLISTO (Cooperative Action Leading to Launcher Innovation in Stage Toss-back Operations) is a single stage and reusable VTVL demonstrator propelled by the Prometheus rocket engine and developed jointly by the German Aerospace Center, the French Space Agency, and the Japanese Aerospace Exploration Agency.
The goals are to mature and demonstrate the technologies necessary to build and operate a reusable launch vehicle, and to better assess the operational cost of such a vehicle. The first flight was planned for 2022.
Eventually, lessons learned with the development of CALLISTO will pave the way to develop the European reusable launcher Ariane Next.
See also
Themis programme
Prometheus (rocket engine)
Ariane Next
References
Reusable launch systems
Partially reusable space launch vehicles
Space launch vehicles of Europe
Space programs
European space programmes
Spaceflight technology |
https://en.wikipedia.org/wiki/Molecular%20fragmentation%20methods | Molecular fragmentation (mass spectrometry), or molecular dissociation, occurs both in nature and in experiments. It occurs when a complete molecule is rendered into smaller fragments by some energy source, usually ionizing radiation. The resulting fragments can be far more chemically reactive than the original molecule, as in radiation therapy for cancer, and are thus a useful field of inquiry. Different molecular fragmentation methods have been built to break apart molecules, some of which are listed below.
Background
A major objective of theoretical chemistry and computational chemistry is the calculation of the energy and properties of molecules so that chemical reactivity and material properties can be understood from first principles. As a practical matter, the aim is to complement the knowledge we gain from experiments, particularly where experimental data may be incomplete or very difficult to obtain.
High-level ab-initio quantum chemistry methods are known to be an invaluable tool for understanding the structure, energy, and properties of small up to medium-sized molecules. However, the computational time for these calculations grows rapidly with increased size of molecules. One way of dealing with this problem is the molecular fragmentation approach which provides a hierarchy of approximations to the molecular electronic energy. In this approach, large molecules are divided in a systematic way to small fragments, for which high-level ab-initio calculation can be performed with acceptable computational time.
The defining characteristic of an energy-based molecular fragmentation method is that the molecule (or cluster of molecules, liquid, or solid) is broken up into a set of relatively small molecular fragments, in such a way that the electronic energy, E, of the full system is given by a sum of the energies of these fragment molecules:
E = Σ_i c_i E_i,
where E_i is the energy of a relatively small molecular fragment. The c_i are simple coefficients (typically integers) |
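The weighted sum is trivial to evaluate once the fragment energies are known. A sketch of the bookkeeping step only; the coefficients and energies below are illustrative numbers, not real ab initio results.

```python
def fragment_energy_sum(coefficients, fragment_energies):
    """Energy-based fragmentation estimate: E = sum_i c_i * E_i, where c_i
    are simple (typically integer) coefficients and E_i are the energies
    of the individual molecular fragments."""
    return sum(c * e for c, e in zip(coefficients, fragment_energies))

# Two overlapping fragments counted once each, minus their shared overlap
# region (a common inclusion-exclusion pattern in fragmentation schemes):
total = fragment_energy_sum([1, 1, -1], [-40.2, -76.4, -1.1])
assert abs(total - (-115.5)) < 1e-9
```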
https://en.wikipedia.org/wiki/List%20of%20biology%20awards | This list of biology awards is an index to articles about notable awards for biology. It includes a general list and lists of ecology, genetics and neuroscience awards. It excludes awards for biochemistry, biomedical science, medicine, ornithology and paleontology, which are covered by separate lists.
General awards
International
Americas
Asia
Europe
Oceania
Ecology
Genetics
Genetics is a branch of biology concerned with the study of genes, genetic variation, and heredity in organisms.
Neuroscience
See also
Competitions and prizes in biotechnology
Lists of awards
Lists of science and technology awards
List of biochemistry awards
List of biomedical science awards
List of awards in bioinformatics and computational biology
List of fellows of the AACR Academy
List of medicine awards
List of ornithology awards
List of paleontology awards
References
Awards
biology |
https://en.wikipedia.org/wiki/List%20of%20engineering%20awards | This list of engineering awards is an index to articles about notable awards for achievements in engineering. It includes aerospace engineering, chemical engineering, civil engineering, electrical engineering, electronic engineering, structural engineering and systems science awards. It excludes computer-related awards, computer science awards, industrial design awards, mechanical engineering awards, motor vehicle awards, occupational health and safety awards and space technology awards, which are covered by separate lists.
The list is organized by the region and country of the organizations that sponsor the awards, but some awards are not limited to people from that country.
International
Africa
Americas
Asia
Europe
Oceania
See also
List of computer science awards
List of computer-related awards
List of mechanical engineering awards
List of motor vehicle awards
List of space technology awards
Lists of awards
Lists of science and technology awards
References
Awards
Engineering |
https://en.wikipedia.org/wiki/Nikolay%20V.%20Kuznetsov | Nikolay Vladimirovich Kuznetsov (; born 13 May 1979 in Leningrad, USSR) is a specialist in nonlinear dynamics and control theory.
Academic career
He graduated from the St. Petersburg University Department of Theoretical Cybernetics, chaired by V.A. Yakubovich, in 2001. In 2004 he received the Candidate of Science degree (supervisor G.A. Leonov) and in 2016 the Doctor of Science degree (habilitation) from St. Petersburg University. Kuznetsov has worked at St. Petersburg University since 2003 and is now a tenured full professor and head of the Department of Applied Cybernetics there. Since 2018 the research group chaired by Kuznetsov has held the status of a Leading Scientific School (Center of Excellence) of Russia in the field of mathematics and mechanics. In 2020 he was named Professor of the Year in the field of mathematics and physics in Russia.
Since 2018, Kuznetsov has been head of the Laboratory of Information and Control Systems at the Institute for Problems in Mechanical Engineering of the Russian Academy of Sciences. In 2022, Nikolay Kuznetsov was elected a member of the Russian Academy of Sciences.
In 2008, Kuznetsov defended his Ph.D. degree at the University of Jyväskylä, Finland (supervisors P. Neittaanmäki, G.A. Leonov).
After the defense, he worked at the University of Jyväskylä as an Academy of Finland postdoc,
and then as a part-time professor in the IT Faculty: from 2014 as an adjunct docent and from 2017 as a visiting professor.
He is co-chair of the Finnish-Russian educational and research program organized in 2007
by the University of Jyväskylä and St. Petersburg University. In recognition, the University of Jyväskylä awarded him a medal for his distinguished merits in applied mathematics and the training of doctoral students; in 2020 he received the Finnish Information Processing Association (TIVIA) award and was elected a foreign member of the Finnish Academy of Science and Letters (becoming its youngest foreign member at the ti |
https://en.wikipedia.org/wiki/LC3%20%28codec%29 | LC3 (Low Complexity Communication Codec) is an audio codec specified by the Bluetooth Special Interest Group (SIG) for the LE Audio protocol introduced in Bluetooth 5.2. It was developed by Fraunhofer IIS and Ericsson as the successor to the SBC codec.
LC3 provides higher audio quality and better packet loss concealment than SBC, G.722 and Opus, according to subjective testing by the Bluetooth Special Interest Group and ETSI. The conclusion regarding Opus is disputed: although the test included only speech audio, the comparison used version 1.1.4 of the reference Opus encoder at complexity level 0 and 32 kbit/s, relying on CELT (general audio) rather than the FEC-capable SILK (speech) mode; the test also did not take into account the newer version 1.2 of the Opus encoder released in 2017, in which significant improvements were made to low-bitrate streams.
Supported systems:
Android 13; Google's liblc3 codec is open-source as a standalone GitHub project
Windows 11
Zephyr OS
Linux via bluez-alsa or BlueZ + PipeWire
LC3plus
LC3plus High Resolution mode is a codec defined by ETSI and is not compatible with the LC3 defined by Bluetooth SIG. It's included in the 2019 DECT standard.
On 9 November 2022, the Japan Audio Society (JAS) released a statement certifying LC3plus with the "Hi-Res AUDIO WIRELESS" logo, making LC3plus the fourth codec to receive this certification, alongside the MQair, LDAC and LHDC codecs.
The ETSI implementation of LC3plus is source-available software, subject to an ETSI Intellectual Property Rights Policy and the usual patent restrictions.
Fraunhofer defines a way to use LC3plus over A2DP.
See also
SBC (codec)
AptX
LDAC (codec)
LHDC (codec)
References
Audio codecs
Bluetooth |
https://en.wikipedia.org/wiki/Static%20application%20security%20testing | Static application security testing (SAST) is used to secure software by reviewing its source code to identify sources of vulnerabilities. Although static analysis of source code has existed as long as computers have, the technique spread to security in the late 1990s as web applications integrated new technologies like JavaScript and Flash; the first public discussion of SQL injection followed in 1998.
Unlike dynamic application security testing (DAST) tools, which perform black-box testing of application functionality, SAST tools examine the application's code content, a form of white-box testing.
A SAST tool scans the source code of an application and its components to identify potential security vulnerabilities in the software and its architecture.
Static analysis tools can detect an estimated 50% of existing security vulnerabilities.
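As an illustrative sketch (not taken from any particular product), a SAST rule typically walks an application's syntax tree looking for dangerous patterns. The mini-checker below flags SQL queries built by string concatenation, a classic injection source; both the rule and the scanned snippet are invented for this example.

```python
import ast

# Snippet to scan: a query built by concatenating user input (vulnerable).
SOURCE = '''
def find_user(cursor, name):
    cursor.execute("SELECT * FROM users WHERE name = '" + name + "'")
'''

def flag_string_built_queries(source):
    """Toy SAST-style rule: report execute() calls whose first argument
    is a string expression (concatenation or f-string) rather than a
    parameterized query."""
    findings = []
    for node in ast.walk(ast.parse(source)):
        if (isinstance(node, ast.Call)
                and isinstance(node.func, ast.Attribute)
                and node.func.attr == "execute"
                and node.args
                and isinstance(node.args[0], (ast.BinOp, ast.JoinedStr))):
            findings.append(f"line {node.lineno}: query built from string expression")
    return findings

print(flag_string_built_queries(SOURCE))  # flags the concatenated query
```

Real SAST tools add data-flow analysis on top of such pattern matching, tracking whether tainted input can actually reach the dangerous sink.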
In the software development life cycle (SDLC), SAST is performed early in the development process at the code level, and also when all pieces of code and components are put together in a consistent testing environment. SAST is also used for software quality assurance, even if the many resulting false positives impede its adoption by developers.
SAST tools are integrated into the development process to help development teams, whose primary focus is developing and delivering software that meets the requested specifications.
SAST tools, like other security tools, focus on reducing the risk that applications will suffer downtime or that private information stored in applications will be compromised.
For the year of 2018, the Privacy Rights Clearinghouse database shows that more than 612 million records have been compromised by hacking.
Overview
There are three main techniques for testing the security of applications before their release: static application security testing (SAST), dynamic application security testing (DAST), and interactive application security testing (IAST), a combination of the two.
Static analysis tools examine the text of a p |
https://en.wikipedia.org/wiki/Carol%20L.%20Boggs | Carol Linda Boggs (born April 11, 1952) is an American biologist specializing in the reproductive biology, population biology, ecology, and evolution of butterflies. Boggs completed her BA in 1973 and her PhD in 1979 in zoology at the University of Texas at Austin. Since 2013, she has been a professor in the School of the Earth, Ocean and Environment and the Department of Biological Sciences at the University of South Carolina. Boggs is the author of more than 120 peer-reviewed articles and has served on editorial boards for several journals. She has been a fellow of the American Association for the Advancement of Science since 2001.
Career
Boggs was a postdoctoral scholar at Stanford University from 1980 to 1985. Shortly after, Stanford hired her as a lecturer and consulting assistant professor in the Department of Biological Sciences (1986-1997). She was promoted to associate professor (teaching) (1997–2002), consulting professor (2002–2006), and finally, professor (teaching) (2006–2012). In parallel with these appointments, she was also a senior research scientist with Stanford University (1994–2006). Boggs also held administrative appointments at Stanford University such as the associate director (1994–1995) and director (1995–2006) of the Center for Conservation Biology, and the Bing Director for the Program in Human Biology (2006–2012). In 2013, Boggs moved to the University of South Carolina where she was hired as the director of the School of the Earth, Ocean and Environment (2013–2018) and as a professor in the School of the Earth, Ocean and Environment and the Department of Biological Sciences (2013–present).
Boggs has served on several editorial boards, either as a founding member or as an associate editor, for journals including Functional Ecology, Ecological Applications, Evolution, and the Journal of Insect Conservation. She has also worked with the Rocky Mountain Biological Laboratory (RMBL), serving on the board of trustees as a member for more tha |
https://en.wikipedia.org/wiki/NearlyFreeSpeech | NearlyFreeSpeech is a privately funded, US-based, low cost web hosting provider and domain name registrar that began in 2002. It was started in response to concerns about the entry of large companies into Internet publishing, and to promote freedom of speech.
History
In September 2006, Jeffrey D. Wheelhouse registered the NearlyFreeSpeech trademark.
Endorsements
By 2008, Michael Hemmingson of San Diego Reader wrote that the Electronic Frontier Foundation suggested using services such as NearlyFreeSpeech.net and Tor software to avoid being fired for blogging. In 2009 Shawn Powers of Linux Journal reviewed Nearly Free Speech and recommended them over GoDaddy even after having some technical issues. In 2010 Jason Fitzpatrick of LifeHacker.com listed Nearly Free Speech as first of "Five Best Personal Web Hosts" and said they were unusual because of their incremental billing based on usage. In a similar 2012 "top five" list by Alan Henry of LifeHacker.com, Nearly Free Speech was given "honorable mention" and he said they offer exceptional hosting plans for as low as $0.25, and promise to only make you pay for what you use.
In 2010 in "Twitter Application Development For Dummies", Dusty Reagan recommended Nearly Free Speech for learning PHP development. In 2010 Cody Fink of MacStories.net, describing how to install Fever in 10 minutes, called Nearly Free Speech, "an amazing hosting solution that's relatively cheap, especially for light use." In 2012 in "Handbook of Research on Didactic Strategies and Technologies for Education" Nearly Free Speech was cited as a "pay as you go" service, which could reduce costs significantly. In 2013, Nearly Free Speech was used for a low-cost promotion involving the posting of indie Zelda-alike game Anodyne on The Pirate Bay.
Controversies
BugMeNot controversy
In 2004 Matt Hines of CNET said Nearly Free Speech supported BugMeNot against take-down attempts. Kevin Newcomb of clickz.com wrote that Texas-based NearlyFreeSpeech.net spo |
https://en.wikipedia.org/wiki/Kleptoprotein | A kleptoprotein is a protein which is not encoded in the genome of the organism that uses it, but is instead obtained through diet from a prey organism. Importantly, a kleptoprotein must retain its function and remain mostly or entirely undigested, distinguishing it from proteins that are digested for nutrition, which are destroyed and rendered non-functional in the process.
This phenomenon was first reported in the bioluminescent fish Parapriacanthus, which has specialized light organs adapted towards counter-illumination, but obtains the luciferase enzyme within these organs from bioluminescent ostracods, including Cypridina noctiluca or Vargula hilgendorfii.
See also
Kleptoplasty
References
Biology
Proteins |
https://en.wikipedia.org/wiki/AE1/AE3 | AE1/AE3 is an antibody cocktail that is used in immunohistochemistry, being generally positive in the cytoplasm of carcinomas (cancers of epithelial origin).
Targets
The antibody cocktail binds to cytokeratins 1–8, 10, 14–16 and 19 (but not CK17 or CK18). It is therefore used as a marker of carcinomas, to assess features such as depth of invasion and metastases. For example, it is both relatively sensitive and specific for detection of breast cancer metastasis to sentinel lymph nodes.
It may cross-react with GFAP, leading to aberrant staining of glial tumors such as ependymoma, glioblastoma and schwannoma. It may also stain myofibroblasts and smooth muscle cells. Furthermore, it may stain nodal epithelial cells that have contaminated a tumor from a recent biopsy.
See also
List of histologic stains that aid in diagnosis of cutaneous conditions
References
Antibodies
Pathology
Carcinoma |
https://en.wikipedia.org/wiki/Burroughs%20B20 | The B20 is a line of microcomputers from Burroughs Corporation. The systems, introduced in May 1982, consist of two models: the B21 and the B22. The B21 models are rebadged Convergent Technologies AWS workstations incorporating an Intel 8086 CPU. The B22 models are rebadged IWS workstations. They run the BTOS operating system, which is a version of Convergent's CTOS, as well as CP/M and MS-DOS.
Systems support up to 640 KB of RAM. The B22 included a mass storage unit with a capacity of up to 60 MB.
The Burroughs B25, a rebadged Convergent NGEN system with an Intel 80186 CPU, was introduced in 1983. The B26 was introduced in 1984, and a B28 system followed in 1985 based on the Intel 80286 CPU.
There is also an 80186-based B27 which used an "F-bus" rather than the "X-bus" used on the B25/B26/B28.
A cluster only (no storage) 80186-based B24 was later released and commonly used by bank tellers.
References
External links
B20 brochure, 1982
Burroughs Corporation
16-bit computers
Products introduced in 1982 |
https://en.wikipedia.org/wiki/GOLOG | GOLOG is a high-level logic programming language for the specification and execution of complex actions in dynamical domains. It is based on the situation calculus. It is a first-order logical language for reasoning about action and change. GOLOG was developed at the University of Toronto.
History
The concept of situation calculus on which the GOLOG programming language is based was first proposed by John McCarthy in 1963.
Language
A GOLOG interpreter automatically maintains a direct characterization of the dynamic world being modeled, on the basis of user supplied axioms about preconditions, effects of actions and the initial state of the world. This allows the application to reason about the condition of the world and consider the impacts of different potential actions before focusing on a specific action.
Golog is a logic programming language and is very different from conventional programming languages. A procedural programming language like C defines the execution of statements in advance: the programmer creates a subroutine consisting of statements, and the computer executes each statement in a linear order. In contrast, fifth-generation programming languages like Golog work with an abstract model from which the interpreter can generate a sequence of actions. The source code defines the problem, and it is up to the solver to find the next action. This approach can facilitate the management of complex problems from the domain of robotics.
A Golog program defines the state space in which the agent is allowed to operate. A path in the symbolic domain is found with state space search. To speed up the process, Golog programs are realized as hierarchical task networks.
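The idea of searching the state space defined by declared actions can be sketched in a few lines. The toy planner below is not Golog syntax — the delivery domain and action names are invented — but it shows the same pattern: states are sets of facts, actions have preconditions and effects, and a breadth-first search derives the action sequence.

```python
from collections import deque

# Invented robot-delivery domain: each action has preconditions ("pre"),
# facts it adds ("add"), and facts it deletes ("del").
ACTIONS = {
    "pick_up":  {"pre": {"hand_empty"},         "add": {"holding"},                 "del": {"hand_empty"}},
    "move":     {"pre": {"holding"},            "add": {"at_goal"},                 "del": set()},
    "put_down": {"pre": {"holding", "at_goal"}, "add": {"delivered", "hand_empty"}, "del": {"holding"}},
}

def plan(initial, goal):
    """Breadth-first search over fact-set states; returns an action list."""
    frontier = deque([(frozenset(initial), [])])
    seen = {frozenset(initial)}
    while frontier:
        state, path = frontier.popleft()
        if goal <= state:          # every goal fact holds
            return path
        for name, a in ACTIONS.items():
            if a["pre"] <= state:  # action applicable in this state
                nxt = frozenset((state - a["del"]) | a["add"])
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append((nxt, path + [name]))
    return None

print(plan({"hand_empty"}, {"delivered"}))  # -> ['pick_up', 'move', 'put_down']
```

A Golog interpreter is far richer — programs constrain the search with procedures, nondeterministic choice, and tests — but the underlying "solver finds the next action" behavior is the same.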
Apart from the original Golog language, there are some extensions available. The ConGolog language provides concurrency and interrupts. Other dialects like IndiGolog and Readylog were created for real time applications in which sensor readings are updated on the fly.
Uses
Golog ha |
https://en.wikipedia.org/wiki/Computer%20engineering%20compendium | This is a list of the individual topics in electronics, mathematics, and integrated circuits that together make up the computer engineering field. It is organized by topic to serve as an effective study guide for the field. The contents match the full body of topics, and the level of detail, expected of a person identifying as a computer engineering expert, as laid out by the National Council of Examiners for Engineering and Surveying. It is a comprehensive list and a superset of the computer engineering topics generally dealt with at any one time.
Part 1 - Basics
Character Encoding
Character (computing)
Universal Character Set
IEEE 1394
ASCII
Math
Bitwise operation
Signed number representations
IEEE floating point
Operators in C and C++
De Morgan's laws
Booth's multiplication algorithm
Binary multiplier
Wallace tree
Dadda multiplier
Multiply–accumulate operation
Big O notation
Euler's identity
Basic Electronics
Series and parallel circuits
RLC circuit
Transistor
Operational amplifier applications
Signal Processing
Signal processing
Digital filter
Fast Fourier transform
Cooley–Tukey FFT algorithm
Modified discrete cosine transform
Digital signal processing
Analog-to-digital converter
Error Detection/Correction
Parity bit
Error detection and correction
Cyclic redundancy check
Hamming code
Hamming(7,4)
Convolutional code
Forward error correction
Noisy-channel coding theorem
Modulation
Signal-to-noise ratio
Linear code
Noise (electronics)
Part 2 - Hardware
Hardware
Logic family
Multi-level cell
Flip-flop (electronics)
Race condition
Binary decision diagram
Circuit minimization for Boolean functions
Karnaugh map
Quine–McCluskey algorithm
Integrated circuit design
Programmable Logic
Standard cell
Programmable logic device
Field-programmable gate array
Complex programmable logic device
Application-specific integrated circuit
Logic optimization
Register-transfer level
Floorplan (microelectronics)
Hardware description language
VHDL
Verilog
Electronic des |
https://en.wikipedia.org/wiki/Leakage%20%28machine%20learning%29 | In statistics and machine learning, leakage (also known as data leakage or target leakage) is the use of information in the model training process which would not be expected to be available at prediction time, causing the predictive scores (metrics) to overestimate the model's utility when run in a production environment.
Leakage is often subtle and indirect, making it hard to detect and eliminate. Leakage can cause a statistician or modeler to select a suboptimal model, which could be outperformed by a leakage-free model.
Leakage modes
Leakage can occur in many steps in the machine learning process. The leakage causes can be sub-classified into two possible sources of leakage for a model: features and training examples.
Feature leakage
Feature or column-wise leakage is caused by the inclusion of columns which are one of the following: a duplicate label, a proxy for the label, or the label itself. These features, known as anachronisms, will not be available when the model is used for predictions, and result in leakage if included when the model is trained.
For example, including a "MonthlySalary" column when predicting "YearlySalary"; or "MinutesLate" when predicting "IsLate".
Training example leakage
Row-wise leakage is caused by improper sharing of information between rows of data. Types of row-wise leakage include:
Premature featurization: leaking information by fitting feature transforms (e.g. min-max scalers, n-gram vocabularies) before the cross-validation/train/test split; such transforms must be fit on the training split only and then used to transform the test set
Duplicate rows between train/validation/test (e.g. oversampling a dataset to pad its size before splitting; e.g. different rotations/augmentations of a single image; bootstrap sampling before splitting; or duplicating rows to up sample the minority class)
Non-i.i.d. data
Time leakage (e.g. splitting a time-series dataset randomly, instead of placing newer data in the test set via a cutoff-based train/test split or rolling-origin cross-validation)
Group leakage—not including a groupin |
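The premature-featurization mode can be shown in a few lines of plain Python. The data values here are made up, and min-max scaling stands in for any fitted transform: fitting on the full dataset lets the test set's extremes influence the training features.

```python
# Four training points and one outlier that should be unseen at training time.
train = [1.0, 2.0, 3.0, 4.0]
test = [10.0]

def minmax_fit(values):
    """Return a min-max scaling function fitted to the given values."""
    lo, hi = min(values), max(values)
    return lambda x: (x - lo) / (hi - lo)

# Leaky: the scaler is fitted on train + test together.
leaky = minmax_fit(train + test)
# Correct: the scaler is fitted on the training split only.
correct = minmax_fit(train)

print([leaky(x) for x in train])    # squashed toward 0: the test max leaked in
print([correct(x) for x in train])  # spans the full [0, 1] range
```

The leaked statistic here is only the maximum, but the same failure applies to means, vocabularies, target encodings, and any other quantity estimated from data.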
https://en.wikipedia.org/wiki/Gift-exchange%20game | The gift-exchange game, also commonly known as the gift exchange dilemma, is a common economic game introduced by George Akerlof and Janet Yellen to model reciprocity in labor relations. The gift-exchange game simulates a labor-management execution problem within the principal-agent problem in labor economics. The simplest form of the game involves two players – an employee and an employer. The employer first decides whether to award a higher salary to the employee. The employee then decides whether to reciprocate the salary increase with a higher level of effort (working harder). Like trust games, gift-exchange games are used to study reciprocity in human subject research in social psychology and economics. If the employer pays extra salary and the employee puts in extra effort, then both players are better off than otherwise. The relationship between an investor and an investee has been investigated as the same type of game.
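A minimal payoff sketch of the two-stage game follows. The payoff numbers are invented for illustration; only their ordering matters, chosen so that mutual gift-giving (high wage, high effort) beats the no-gift outcome for both players, while each side keeps a short-run temptation to deviate.

```python
def payoffs(high_wage, high_effort):
    """Toy payoffs: the employer values effort (+3) but pays for a raise (-2);
    the employee values the raise (+2) but bears an effort cost (-1)."""
    employer = (3 if high_effort else 0) - (2 if high_wage else 0)
    employee = (2 if high_wage else 0) - (1 if high_effort else 0)
    return employer, employee

for wage in (False, True):
    for effort in (False, True):
        print(f"wage={wage!s:5} effort={effort!s:5} -> {payoffs(wage, effort)}")
```

Under pure self-interest the employee would always shirk (effort costs 1 and brings no direct return), so the employer would never raise wages; the experimental finding the game is used to demonstrate is that real employees nonetheless reciprocate.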
The gift-exchange game serves as a valuable lens through which to understand economic theory, as it demonstrates that self-interest maximization is not the sole determinant of economic decision-making; reciprocity is a fundamental factor shaping individuals' behaviour in economic contexts. By simulating labor relations between an employer and employee, the game shows that when employers offer a higher salary, employees are more inclined to reciprocate with greater effort, leading to mutually beneficial outcomes. Gift-exchange games have been used to study economic and social phenomena such as labor contracts, market transactions, strikes, and the decline of unionization. The gift-exchange theory also incorporates a social component: homogeneous agents who are employed at an equivalent wage level will exert greater effort. This in turn results in higher market efficiency and higher rent than for agents receiving different wages. The first examination of this component is referred t |
https://en.wikipedia.org/wiki/Phenotype%20modification | Phenotype modification is the process of experimentally altering an organism's phenotype to investigate the impact of phenotype on the fitness.
Phenotype modification has been used to assess the impact of parasite mechanical presence on fish host behaviour.
References
Genetics |
https://en.wikipedia.org/wiki/Improved%20Layer%202%20Protocol | IL2P (Improved Layer 2 Protocol) is a data link layer protocol originally derived from layer 2 of the X.25 protocol suite and designed for use by amateur radio operators. It is used exclusively on amateur packet radio networks.
IL2P occupies the data link layer, the second layer of the OSI model. It is responsible for establishing link-layer connections, transferring data encapsulated in frames between nodes, and detecting errors introduced by the communications channel.
The Improved Layer 2 Protocol (IL2P) was created by Nino Carrillo, KK4HEJ, based on AX.25 version 2.0. It implements Reed–Solomon forward error correction for greater accuracy and throughput than either AX.25 or FX.25, specifically to achieve greater stability at link speeds higher than 1200 baud.
IL2P can be used with a variety of modulation methods including AFSK and GFSK. The direwolf software TNC contains the first open source implementation of the protocol.
IL2P Specification
The IL2P draft specification v0.5 was published via the Terrestrial Amateur Radio Packet Network (TARPN) on June 10, 2022.
As of version 0.5, Weak-Signal Extensions were added, which add several features to the protocol intended primarily for SSB links. The automatic ID transmission for the FM/1200-baud and faster modes remains in place, but is not enabled when these lower-speed weak-signal SSB modes are selected.
Implementations
IL2P was first implemented in the closed-source, proprietary ninoTNC to cope with lossy network links caused by low signal-to-noise ratio or weak signal strength.
The specification itself outlines several design goals including:
Forward error correction
Eliminating bit-stuffing
Streamlining the AX.25 header format
Improved packet detection in the absence of data carrier detect (DCD) and for open-squelch receive
Produce a bitstream suitable for modulation on various physical layers
Avoid bit-error-amplifying methods (differential encoding and free-running LFSRs)
Increase efficie |
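One of the goals above — eliminating bit-stuffing — refers to the HDLC mechanism that AX.25 inherits: after five consecutive 1 bits the sender inserts a 0, so payload data can never mimic the 0x7E frame-delimiter flag. The sketch below illustrates that legacy mechanism (which IL2P's fixed-length, FEC-protected frames avoid entirely); it operates on bit lists for clarity rather than on a real byte stream.

```python
def stuff(bits):
    """HDLC-style bit-stuffing: insert a 0 after five consecutive 1s."""
    out, run = [], 0
    for b in bits:
        out.append(b)
        run = run + 1 if b == 1 else 0
        if run == 5:
            out.append(0)  # stuffed bit
            run = 0
    return out

def unstuff(bits):
    """Inverse operation: drop the 0 that follows five consecutive 1s."""
    out, run, i = [], 0, 0
    while i < len(bits):
        b = bits[i]
        out.append(b)
        run = run + 1 if b == 1 else 0
        if run == 5:
            i += 1  # skip the stuffed 0
            run = 0
        i += 1
    return out

data = [1, 1, 1, 1, 1, 1, 0]
print(stuff(data))  # [1, 1, 1, 1, 1, 0, 1, 0]
```

Because the number of stuffed bits depends on the payload, AX.25 frame lengths vary with content; dropping the mechanism is one way IL2P keeps frame timing predictable.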
https://en.wikipedia.org/wiki/Blue%E2%80%93green%20deployment | In software engineering, blue–green (also blue/green) deployment is a method of installing changes to a web, app, or database server by swapping alternating production and staging servers.
Overview
In blue–green deployments, two servers are maintained: a "blue" server and a "green" server. At any given time, only one server is handling requests (e.g., being pointed to by the DNS). For example, public requests may be routed to the blue server, making it the production server and the green server the staging server, which can only be accessed on a private network. Changes are installed on the non-live server, which is then tested through the private network to verify the changes work as expected. Once verified, the non-live server is swapped with the live server, effectively making the deployed changes live.
Using this method of software deployment offers the ability to quickly roll back to a previous state if anything goes wrong. This rollback is achieved by simply routing traffic back to the previous live server, which still does not have the deployed changes. An additional benefit to the blue–green method of deployment is the reduced downtime for the server. Because requests are routed instantly from one server to the other, there is ideally no period where requests will be unfulfilled.
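The swap-and-rollback mechanics can be sketched with a toy router standing in for the DNS or load-balancer change. The class and its health check are illustrative, not a real deployment tool.

```python
class BlueGreenRouter:
    """Points traffic at one of two environments; the other stays idle."""

    def __init__(self):
        self.live, self.idle = "blue", "green"

    def deploy(self, verify):
        """Install on the idle server, then swap only if verification passes."""
        if verify(self.idle):
            self.live, self.idle = self.idle, self.live  # instant cutover
        return self.live

    def rollback(self):
        """Route traffic back to the previous live server."""
        self.live, self.idle = self.idle, self.live
        return self.live

router = BlueGreenRouter()
print(router.deploy(verify=lambda server: True))  # green goes live
print(router.rollback())                          # back to blue
```

The key property is that both deploy and rollback are a single pointer swap: the previous environment is never torn down until the new one is verified.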
The blue–green deployment technique is often contrasted with the canary release deployment technique.
References
Software distribution
System administration
Software release |
https://en.wikipedia.org/wiki/List%20of%20OBO%20Foundry%20ontologies | This is a list of ontologies that are part of the OBO Foundry.
OBO Foundry ontologies
References
OBO Foundry ontologies
Ontology (information science) |
https://en.wikipedia.org/wiki/Chalais-Meudon | Chalais-Meudon is an aeronautical research and development centre in Meudon, to the south-west of Paris. It was originally founded in 1793 in the nearby Château de Meudon and has played an important role in the development of French aviation.
Balloons
The story of aviation at Chalais-Meudon starts in October 1793 when the French Public Safety Committee ordered the construction of an observation balloon capable of carrying two observers. The old royal grounds at Meudon were allocated for this work, with the Château de Meudon chosen as the centre, with Nicolas-Jacques Conté as director. Two French Balloon Corps balloon companies had already been created, and the new organisation's role was to build balloons and train their pilots and operators.
The first balloon, the Entreprenant, was built within four months, and on 31 October 1794, the National School of Ballooning was created, with Conté as its director. Many other balloons were then built in a short period, including, in 1795, l’Intrépide which, with the First Balloon Company, was captured by Austrian troops in 1796, and is now on display at the Austrian Military Museum in Vienna - the oldest aircraft in Europe. They were all spherical hydrogen balloons with a diameter of at least . Conté himself had improved production methods for hydrogen and the treatment of the gas bags.
In 1798 Napoleon sent one of the balloon companies in one ship to Egypt. It was sunk by the British at Aboukir and all the equipment was lost. The two balloon companies were disbanded soon afterwards, and work on balloons at Meudon ceased.
In 1877, balloons had regained their importance after their successful use in the Siege of Paris (1870–71). Léon Gambetta, the Minister for War, who had himself escaped from Paris by balloon, created a commission of air communications, and Colonel Charles Renard was put in charge of military ballooning. In 1877, he became director of the l'Etablissement Central de l'Aérostation Militaire (Central Establ |
https://en.wikipedia.org/wiki/Swiftships | Swiftships is a shipbuilding and marine engineering company headquartered in South Louisiana, USA. The company operates globally and specializes in the construction of small to medium-sized vessels made of steel, aluminum, or fiberglass. Swiftships is involved in ship design, construction, repair, and maintenance activities.
History
Founded by Fred Sewart in 1942, Swiftships began as Sewart Machine Works and became Sewart Seacraft in 1946. The company became a supplier of “Swift Boats” to the US Navy during the Vietnam War, delivering 193 fast patrol craft throughout the conflict. The mission objective of the Swift Boat was to provide the Navy with a fast boat that could patrol river shores for enemy soldiers.
In 1969 the company was renamed Swiftships.
From 2004 onward, Swiftships built ships for the Gulf of Mexico oil and gas industry and restored vessels for the Dominican Republic.
The company created its first fully unmanned surface vehicle, the Anaconda (AN-1), in 2015, followed by the Anaconda (AN-2), for which Swiftships teamed with the University of Louisiana at Lafayette and augmented-technology developers.
Since 1942 Swiftships has designed and built over 600 naval vessels and commercial platforms.
Co-production
In 2008 the company signed a contract with the Egyptian Navy, initiating a co-production program, building vessels in-country. The partnership includes a yard in Alexandria, where the company produces patrol crafts.
In 2009, Swiftships was awarded a contract by the U.S. Navy to provide Follow on Technical Support on behalf of the Iraqi Navy that included the establishment of a Ship Repair Facility in Umm Qasr, Iraq.
Yards
As of 2020, Swiftships operates three yards in the USA and one co-production yard (a joint venture with the Egyptian Navy) in Alexandria, Egypt:
Morgan City, Louisiana
New Iberia, Jennerate, Louisiana
Freeport, Texas
Egyptian Shipyard Repair Building Co. - partnership with the Egyptian Navy to c |
https://en.wikipedia.org/wiki/Xenobot | Xenobots, named after the African clawed frog (Xenopus laevis), are synthetic lifeforms that are designed by computers to perform some desired function and built by combining together different biological tissues. Whether xenobots are robots, organisms, or something else entirely remains a subject of debate among scientists.
Existing xenobots
The first xenobots were built by Douglas Blackiston according to blueprints generated by an AI program, which was developed by Sam Kriegman.
Xenobots built to date have been less than wide and composed of just two things:
skin cells and heart muscle cells, both of which are derived from stem cells harvested from early (blastula stage) frog embryos.
The skin cells provide rigid support and the heart cells act as small motors, contracting and expanding in volume to propel the xenobot forward.
The shape of a xenobot's body, and its distribution of skin and heart cells, are automatically designed in simulation to perform a specific task, using a process of trial and error (an evolutionary algorithm).
Xenobots have been designed to walk, swim, push pellets, carry payloads, and work together in a swarm to aggregate debris scattered along the surface of their dish into neat piles.
They can survive for weeks without food and heal themselves after lacerations.
Other kinds of motors and sensors have been incorporated into xenobots.
Instead of heart muscle, xenobots can grow patches of cilia and use them as small oars for swimming.
However, cilia-driven xenobot locomotion is currently less controllable than cardiac-driven xenobot locomotion.
An RNA molecule can also be introduced into xenobots to give them molecular memory: if exposed to a specific kind of light during behavior, they will glow a prespecified color when viewed under a fluorescence microscope.
Xenobots can also self-replicate. Xenobots can gather loose cells in their environment, forming them into new xenobots with the same capability.
Potential applications
Currently |
https://en.wikipedia.org/wiki/Journal%20of%20Performance%20of%20Constructed%20Facilities | The Journal of Performance of Constructed Facilities is a peer-reviewed scientific journal published by the American Society of Civil Engineers and is engaged in sharing information on failures and performance issues of constructed facilities. The editors seek papers that address construction practices, failure investigation (both technical and procedural failures), as well as reconstruction and ethics topics. Also covered are topics that address performance and maintenance of existing structures.
Abstracting and indexing
The journal is abstracted and indexed in Ei Compendex, Science Citation Index Expanded, ProQuest databases, Civil Engineering Database, Inspec, Scopus, and EBSCO databases.
References
External links
Library
Engineering journals
American Society of Civil Engineers academic journals
Academic journals established in 1987 |
https://en.wikipedia.org/wiki/Journal%20of%20Pipeline%20Systems%20Engineering%20and%20Practice | The Journal of Pipeline Systems Engineering and Practice is a peer-reviewed scientific journal published by the American Society of Civil Engineers that covers topics on pipeline systems, from planning and construction to safety and maintenance. The journal has extensive practical coverage and is a good resource for practicing engineers seeking environmental and sustainable pipeline information addressing water distribution, wastewater systems, storm sewers, and more.
Abstracting and indexing
The journal is abstracted and indexed in Ei Compendex, ProQuest databases, Civil Engineering Database, Inspec, Science Citation Index Expanded, Scopus, and EBSCO databases.
References
External links
Engineering journals
American Society of Civil Engineers academic journals
Academic journals established in 2010 |
https://en.wikipedia.org/wiki/Dependency%20network%20%28graphical%20model%29 | Dependency networks (DNs) are graphical models, similar to Markov networks, wherein each vertex (node) corresponds to a random variable and each edge captures dependencies among variables.
Unlike Bayesian networks, DNs may contain cycles.
Each node is associated with a conditional probability table, which determines the realization of the random variable given its parents.
Markov blanket
In a Bayesian network, the Markov blanket of a node is the set of parents and children of that node, together with the children's parents. The values of the parents and children of a node evidently give information about that node. However, its children's parents also have to be included in the Markov blanket, because they can be used to explain away the node in question. In a Markov random field, the Markov blanket for a node is simply its adjacent (or neighboring) nodes. In a dependency network, the Markov blanket for a node is simply the set of its parents.
Dependency network versus Bayesian networks
Dependency networks have advantages and disadvantages with respect to Bayesian networks. In particular, they are easier to parameterize from data, as there are efficient algorithms for learning both the structure and probabilities of a dependency network from data. Such algorithms are not available for Bayesian networks, for which the problem of determining the optimal structure is NP-hard. Nonetheless, a dependency network may be more difficult to construct using a knowledge-based approach driven by expert knowledge.
Dependency networks versus Markov networks
Consistent dependency networks and Markov networks have the same representational power. Nonetheless, it is possible to construct non-consistent dependency networks, i.e., dependency networks for which there is no compatible valid joint probability distribution. Markov networks, in contrast, are always consistent.
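Because each node carries only a conditional probability table, a dependency network is typically sampled with an ordered pseudo-Gibbs procedure: repeatedly resample each variable from its conditional given the current values of the others. A minimal sketch (toy tables and variable names are our own choices for illustration):

```python
import random

# Toy dependency network over two binary variables (x, y).
# Each variable has a conditional distribution given the other;
# the tables below are illustrative, not learned from any dataset.
P_X_GIVEN_Y = {0: 0.9, 1: 0.2}   # P(x = 1 | y)
P_Y_GIVEN_X = {0: 0.1, 1: 0.8}   # P(y = 1 | x)

def pseudo_gibbs(n_steps, seed=0):
    """Ordered pseudo-Gibbs sampler: resample each variable in turn
    from its conditional given the current values of the others."""
    rng = random.Random(seed)
    x, y = 0, 0
    samples = []
    for _ in range(n_steps):
        x = 1 if rng.random() < P_X_GIVEN_Y[y] else 0
        y = 1 if rng.random() < P_Y_GIVEN_X[x] else 0
        samples.append((x, y))
    return samples

samples = pseudo_gibbs(10000)
# If the network is consistent, the empirical distribution approximates
# the compatible joint; if not, it depends on the update order used.
freq_x1 = sum(x for x, _ in samples) / len(samples)
```

For a consistent network this procedure converges to the unique compatible joint distribution; for a non-consistent one, the stationary distribution depends on the order in which variables are updated.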
Definition
A consistent dependency network for a set of random variables with joint distribution |
https://en.wikipedia.org/wiki/Grayshift | Grayshift is an American mobile device forensics company which makes a device named GrayKey to crack iPhones, iPads, and Android devices.
Grayshift was co-founded by David Miles, Braden Thomas, Justin Fisher and Sean Larsson. The company is funded by private investors PeakEquity Partners and C&B Capital.
GrayKey
The GrayKey product has been used by the FBI and U.S., British and Canadian local police forces. Canadian police forces require judicial authorization (court order or warrant) per mobile phone to use GrayKey. GrayKey is estimated to be used in up to 30 countries.
According to media reports, GrayKey costs US$15,000 to US$30,000 per copy depending on the functional options chosen. One thousand agencies currently use GrayKey. The device is a gray box, 4 inches by 4 inches by 2 inches in size, with two Lightning cables. The time to crack an iPhone's passcode can be a few minutes to several hours, depending on the length of the passcode. Thus, it is possible that GrayKey performs a brute-force attack on the passcode after disabling the passcode attempt limit.
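The relationship between passcode length and worst-case cracking time can be illustrated with a short calculation (the attempt rate below is a hypothetical figure chosen for illustration, not a published GrayKey specification):

```python
def brute_force_time(passcode_length, digits=10, attempts_per_second=12.0):
    """Estimate worst-case time (in seconds) to brute-force a numeric
    passcode, assuming the attempt-rate limit has been disabled.
    The attempt rate is an illustrative assumption, not a GrayKey spec."""
    keyspace = digits ** passcode_length
    return keyspace / attempts_per_second

# A 4-digit passcode has 10^4 = 10,000 combinations; a 6-digit one, 10^6.
worst_4 = brute_force_time(4)   # minutes-scale under these assumptions
worst_6 = brute_force_time(6)   # 100x longer than the 4-digit case
```

The keyspace grows by a factor of ten per digit, which is why observed cracking times range from minutes to hours depending on passcode length.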
The GrayKey reportedly provides support for iPhones running iOS 9 and later. Apple modified iOS so that external device connections must be authorized by the iPhone owner after it has been unlocked. On newer iPhone models, only unencrypted files and some metadata might be extracted. With earlier models, full data extraction, such as decrypting encrypted files, is possible.
In 2018, hackers obtained the GrayKey source code and attempted to extort a payment of 2 bitcoins from Grayshift after leaking "small chunks of code".
GrayKey with Android support was released in early 2021.
References
External links
Companies based in Atlanta
Computer security
Computer security specialists
Forensics organizations
Computer security software
Computer security software companies |
https://en.wikipedia.org/wiki/Envy-free%20pricing | Envy-free pricing is a kind of fair item allocation. There is a single seller that owns some items, and a set of buyers who are interested in these items. The buyers have different valuations to the items, and they have a quasilinear utility function; this means that the utility an agent gains from a bundle of items equals the agent's value for the bundle minus the total price of items in the bundle. The seller should determine a price for each item, and sell the items to some of the buyers, such that there is no envy. Two kinds of envy are considered:
Agent envy means that some agent assigns a higher utility (a higher difference value-price) to a bundle allocated to another agent.
Market envy means that some agent assigns a higher utility (a higher difference value-price) to any bundle.
The no-envy conditions guarantee that the market is stable and that the buyers do not resent the seller. By definition, every market envy-free allocation is also agent envy-free, but not vice versa.
There always exists a market envy-free allocation (which is also agent envy-free): if the prices of all items are very high, and no item is sold (all buyers get an empty bundle), then there is no envy, since no agent would like to get any bundle for such high prices. However, such an allocation is very inefficient. The challenge in envy-free pricing is to find envy-free prices that also maximize one of the following objectives:
The social welfare - the sum of buyers' utilities;
The seller's revenue (or profit) - the sum of prices paid by buyers.
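The no-envy conditions are easy to check directly from the definitions. A minimal sketch for the stronger, market version (the instance, helper names, and numbers are our own, for illustration):

```python
def utility(value, bundle, prices):
    """Quasilinear utility: value of the bundle minus its total price."""
    return value(bundle) - sum(prices[i] for i in bundle)

def is_market_envy_free(values, allocation, prices, all_bundles):
    """Check that every agent (weakly) prefers its own bundle to *any*
    bundle at the posted prices (market envy-freeness)."""
    for agent, value in values.items():
        own = utility(value, allocation[agent], prices)
        if any(utility(value, b, prices) > own for b in all_bundles):
            return False
    return True

# Hypothetical instance: two items, two single-minded buyers.
prices = {"a": 3, "b": 5}
values = {
    1: lambda bundle: 4 if "a" in bundle else 0,
    2: lambda bundle: 7 if "b" in bundle else 0,
}
allocation = {1: {"a"}, 2: {"b"}}
bundles = [set(), {"a"}, {"b"}, {"a", "b"}]
ok = is_market_envy_free(values, allocation, prices, bundles)
```

Here buyer 1 gets utility 4 − 3 = 1 and buyer 2 gets 7 − 5 = 2, and neither can do better with any other bundle at these prices, so the allocation is market envy-free (and hence agent envy-free).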
Envy-free pricing is related, but not identical, to other fair allocation problems:
In envy-free item allocation, monetary payments are not allowed.
In the rental harmony problem, monetary payments are allowed, and the agents are quasilinear, but all objects should be allocated (and each agent should get exactly one object).
Results
A Walrasian equilibrium is a market-envy-free pricing with the additional requirement that all items wi |
https://en.wikipedia.org/wiki/Journal%20of%20Computing%20in%20Civil%20Engineering | The Journal of Computing in Civil Engineering is a bimonthly peer-reviewed scientific journal published by the American Society of Civil Engineers. It covers research specific to computing as it relates to civil engineering.
Abstracting and indexing
The journal is abstracted and indexed in Ei Compendex, Science Citation Index Expanded, ProQuest databases, Civil Engineering Database, Inspec, Scopus, and EBSCO databases.
References
External links
Civil engineering journals
American Society of Civil Engineers academic journals
Academic journals established in 1987 |
https://en.wikipedia.org/wiki/Journal%20of%20Cold%20Regions%20Engineering | The Journal of Cold Regions Engineering is a quarterly peer-reviewed scientific journal published by the American Society of Civil Engineers. It covers civil engineering related to cold regions.
Abstracting and indexing
The journal is abstracted and indexed in Ei Compendex, Science Citation Index Expanded, ProQuest databases, Civil Engineering Database, Inspec, Scopus, and EBSCO databases.
References
External links
Civil engineering journals
American Society of Civil Engineers academic journals
Academic journals established in 1987
Glaciology journals |
https://en.wikipedia.org/wiki/Journal%20of%20Energy%20Engineering | The Journal of Energy Engineering is a quarterly peer-reviewed scientific journal published by the American Society of Civil Engineers. It covers civil engineering as related to the production, distribution, and storage of energy.
Abstracting and indexing
The journal is abstracted and indexed in Ei Compendex, Science Citation Index Expanded, ProQuest databases, Civil Engineering Database, Inspec, Scopus, and EBSCO databases.
History
The journal has been known by several names:
Journal of the Power Division (1956-1978)
Journal of the Energy Division (1979-1982)
Journal of Energy Engineering (1983–present)
External links
Civil engineering journals
American Society of Civil Engineers academic journals
Academic journals established in 1956 |
https://en.wikipedia.org/wiki/Cowrie%20%28honeypot%29 | Cowrie is a medium interaction SSH and Telnet honeypot designed to log brute force attacks and shell interaction performed by an attacker. Cowrie also functions as an SSH and telnet proxy to observe attacker behavior to another system. Cowrie was developed from Kippo.
Reception
Cowrie has been referenced in published papers. The book "Hands-On Ethical Hacking and Network Defense" includes Cowrie in a list of five commercial honeypots.
Prior uses
Discussing a honeypot effort called the Project Heisenberg Cloud by Rapid7, Bob Rudis, the company's chief data scientist, told eWEEK, "There are custom Rapid7-developed low- and medium-interaction honeypots used within the framework, along with open-source ones, such as Cowrie."
Doug Rickert has experimented with the open-source Cowrie SSH honeypot and wrote about it on Medium. Putting up a simple honeypot isn't difficult, and there are many open-source products besides Cowrie, ranging from the original Honeyd to MongoDB and NoSQL honeypots, to ones that emulate web servers. Some appear to be SCADA or other more advanced applications.
Best practices
Researchers at the SysAdmin, Audit, Network and Security (SANS) institute urged administrators and security researchers to run the latest version of Cowrie on a honeypot to monitor shifts in the type of passwords being scanned for and pattern of attacks on IoT devices.
Discussion and further resources
Attack Detection and Forensics Using Honeypot in an IoT Environment calls Cowrie a "medium interaction honeypot" and describes results from using it for 40 days to capture "all communicated sessions in log files."
The book Advances on Data Science also devotes chapter two to "Cowrie Honeypot Dataset and Logging."
ICCWS 2018 13th International Conference on Cyber Warfare and Security describes using Cowrie.
On the Move to Meaningful Internet Systems: OTM 2019 Conferences includes details of using Cowrie.
Splunk, a security tool that can receive information from honeypots, |
https://en.wikipedia.org/wiki/Circuit%20breaker%20analyzer | A circuit breaker analyzer is an instrument that measures the parameters of a circuit breaker.
In 1984, Megger patented a digital circuit breaker analyzer controlled by a microprocessor. By 2020, a few companies had developed software to control circuit breaker analyzers from devices such as computers, tablets, and smartphones.
The following tests can be carried out on the circuit breaker: mechanical, thermal, dielectric, short-circuit.
The analyzer operates the circuit breaker under fault current conditions. After finishing the test of the breaker, the system measures currents, voltages and other main parameters of the breaker and, through a set algorithm, diagnoses the condition of the device under different conditions. The final results of the analysis give information about trip times and the essential synchronism of the poles in the different operations of the circuit breaker.
Measured values
Timing measurements
Motion measurements
Coil currents
Dynamic resistance measurement (DRM)
Vibration analysis
Dynamic capacitance measurement
Static and dynamic resistance measurement
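The timing measurements above reduce to locating transitions in sampled contact signals. A minimal sketch (the signal, sample rate, and function names are illustrative and not tied to any particular analyzer):

```python
def trip_time(contact_samples, sample_period_ms):
    """Return the breaker trip time in milliseconds: the elapsed time
    from the trip command (sample 0) until the main contact opens
    (the sampled signal falls from 1 to 0)."""
    for i, state in enumerate(contact_samples):
        if state == 0:
            return i * sample_period_ms
    raise ValueError("contact never opened during the record")

def pole_synchronism(trip_times_ms):
    """Spread between the first and last pole to open -- a key
    timing result reported by breaker analyzers."""
    return max(trip_times_ms) - min(trip_times_ms)

# 10 kHz sampling (0.1 ms per sample); the contact opens at sample 352.
record = [1] * 352 + [0] * 100
t = trip_time(record, 0.1)   # 35.2 ms for this synthetic record
```

Real analyzers apply the same idea simultaneously to main contacts, resistor contacts, and coil-current records to produce the full timing report.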
References
Electronic test equipment
Product testing
Measuring instruments |
https://en.wikipedia.org/wiki/Nwipe | nwipe is a Linux computer program used to securely erase data. It is maintained by Martijn van Brummelen and is free software, released under version 2.0 of the GNU General Public License. The program is a fork of the dwipe program that was previously incorporated in the DBAN secure erase disk.
nwipe was created to allow dwipe to be run outside DBAN, using any host distribution. It utilizes a simple text-based ncurses user interface or can be run directly from the command line. It is available as an installable package in the repositories of many Linux distributions, including Debian and Ubuntu.
nwipe was first released as version 0.17 on 20 October 2014.
Erasing methods
nwipe can be set to use a number of different patterns, through the method selection:
Default - DoD Short - The United States Department of Defense 5220.22-M short 3 pass wipe (passes 1, 2 & 7).
Zero Fill - Fills the device with zeros, in a single pass.
RCMP TSSIT OPS-II - Royal Canadian Mounted Police Technical Security Standard, OPS-II
DoD 5220.22M - The United States Department of Defense 5220.22-M full 7 pass wipe.
Gutmann Wipe - Peter Gutmann's method for the Secure Deletion of Data from Magnetic and Solid-State Memory.
PRNG Stream - Fills the device with a stream from the PRNG.
Verify only - Only reads the device and checks that it is all zero.
HMG IS5 enhanced - Secure Sanitation of Protectively Marked Information or Sensitive Information
It uses two types of pseudo random number generators:
Mersenne Twister
ISAAC
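The idea behind the pattern-based methods can be sketched as a sequence of overwrite passes followed by a verification read. This is an illustrative sketch operating on an ordinary file (nwipe itself is written in C and targets block devices; never point code like this at a real device path):

```python
import os
import secrets
import tempfile

def wipe(path, passes):
    """Overwrite a file in place with the given sequence of passes.
    Each pass is either 'zero' (zero fill) or 'random' (PRNG stream).
    A sketch of the *idea* behind nwipe's methods, not its actual code."""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for kind in passes:
            f.seek(0)
            if kind == "zero":
                f.write(b"\x00" * size)
            elif kind == "random":
                f.write(secrets.token_bytes(size))
            f.flush()
            os.fsync(f.fileno())   # force the pass to physical storage

# Demonstrate on a throwaway temp file.
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(b"sensitive data")
    name = tmp.name
wipe(name, ["random", "zero"])        # PRNG-stream pass, then zero fill
with open(name, "rb") as f:           # "verify only": check all zeros
    assert f.read() == b"\x00" * 14
os.unlink(name)
```

The multi-pass standards listed above (DoD 5220.22-M, RCMP TSSIT OPS-II, Gutmann) differ essentially in the number of passes and the patterns written in each.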
Employment
nwipe has also been incorporated in free software rescue toolkit packages, such as the All in One - System Rescue Toolkit, Parted Magic, and SystemRescue.
References
External links
Data erasure software
Software using the GPL license |
https://en.wikipedia.org/wiki/Finite%20algebra | In abstract algebra, an A-algebra B is finite if it is finitely generated as an A-module. An A-algebra can be thought of as a homomorphism of rings f : A → B; in this case, f is called a finite morphism if B is a finite A-algebra.
The definition of finite algebra is related to that of algebras of finite type.
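A standard contrasting example (added here for illustration) shows the difference between finite and finite type:

```latex
% B = k[x,y]/(y^2 - x^3) is a finite k[x]-algebra: y is integral over k[x]
% (it satisfies the monic equation y^2 - x^3 = 0), so as a k[x]-module
B \;=\; k[x,y]/(y^{2}-x^{3}), \qquad B \;=\; k[x]\cdot 1 \,\oplus\, k[x]\cdot y .
% By contrast, the polynomial ring k[x,y] is a k[x]-algebra of finite type
% (generated by y as an algebra) but is not finite as a k[x]-module,
% since 1, y, y^2, \dots are k[x]-linearly independent.
```

Every finite algebra is of finite type, but as the second half of the example shows, the converse fails.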
Finite morphisms in algebraic geometry
This concept is closely related to that of finite morphism in algebraic geometry; in the simplest case of affine varieties, given two affine varieties V, W and a dominant regular map φ : V → W, the induced homomorphism of k-algebras φ* : Γ(W) → Γ(V) defined by φ*(f) = f ∘ φ turns Γ(V) into a Γ(W)-algebra:
φ is a finite morphism of affine varieties if φ* : Γ(W) → Γ(V) is a finite morphism of k-algebras.
The generalisation to schemes can be found in the article on finite morphisms.
References
See also
Finite morphism
Finitely generated algebra
Finitely generated module
Commutative algebra
Algebraic geometry
Algebras |
https://en.wikipedia.org/wiki/Pamela%20E.%20Harris | Pamela Estephania Harris (born November 28, 1983) is a Mexican-American mathematician, educator and advocate for immigrants. She is currently an associate professor at the University of Wisconsin-Milwaukee in Milwaukee, Wisconsin, was formerly an associate professor at Williams College in Williamstown, Massachusetts and is co-founder of the online platform Lathisms. She is also an editor of the e-mentoring blog of the American Mathematical Society (AMS).
Early life and career
Harris first emigrated with her family from Mexico to the United States when she was 8 years old. They returned to Mexico, before eventually settling in Wisconsin when Harris was 12. Because she was undocumented, she could not attend university. Instead, she studied at the Milwaukee Area Technical College, where she earned two associate degrees in two and a half years. After she married a US citizen and her immigration status changed, she transferred to Marquette University, where she obtained a bachelor's degree in mathematics. She went on to complete her master's degree and in 2012 a PhD at the University of Wisconsin-Milwaukee. Her PhD dissertation was advised by Jeb F. Willenbring. Harris was a Project NExT (New Experiences in Teaching) fellow in 2012. She was a Davies Research Fellow at the United States Military Academy, and, in 2016, joined the faculty at Williams College where she was an associate professor. In 2022, she joined the faculty at the University of Wisconsin-Milwaukee as an associate professor.
Harris studies algebraic combinatorics, in particular the representation of Lie algebras. In order to understand this representation she studies vector partition functions, in particular Kostant's partition function. She is also interested in graph theory and number theory. In 2016 she co-founded an online platform called 'Lathisms' which aims to promote the contributions of Latinxs and Hispanics in the Mathematical Sciences. In 2020 she co-authored the book "Asked and Answered: D |
https://en.wikipedia.org/wiki/Living%20building%20material | A living building material (LBM) is a material used in construction or industrial design that behaves in a way resembling a living organism. Examples include: self-mending biocement, self-replicating concrete replacement, and mycelium-based composites for construction and packaging. Artistic projects include building components and household items.
History
The development of living building materials began with research into methods for mineralizing concrete that were inspired by coral mineralization. The use of microbiologically induced calcite precipitation (MICP) in concrete was pioneered by Adolphe et al. in 1990, as a method of applying a protective coating to building façades.
In 2007, "Greensulate", a mycelium-based building insulation material, was introduced by Ecovative Design, a spin-off of research conducted at the Rensselaer Polytechnic Institute. Mycelium composites were later developed for packaging, sound absorption, and structural building materials such as bricks.
In the United Kingdom, the Materials for Life (M4L) project was founded at Cardiff University in 2013 to "create a built environment and infrastructure which is a sustainable and resilient system comprising materials and structures that continually monitor, regulate, adapt and repair themselves without the need for external intervention." M4L led to the UK's first self-healing concrete trials. In 2017 the project expanded into a consortium led by the universities of Cardiff, Cambridge, Bath and Bradford, changing its name to Resilient Materials 4 Life (RM4L) and receiving funding from the Engineering and Physical Sciences Research Council. This consortium focuses on four aspects of material engineering: self-healing of cracks at multiple scales; self-healing of time-dependent and cycling loading damage; self-diagnosis and healing of chemical damage; and self-diagnosis and immunization against physical damage.
In 2016 the United States Department of Defense's Defense Advanced Research P |
https://en.wikipedia.org/wiki/Regular%20numerical%20predicate | In computer science and mathematics, more precisely in automata theory, model theory and formal language, a regular numerical predicate is a kind of relation over integers. Regular numerical predicates can also be considered as subsets of ℕ^r for some arity r. One of the main interests of this class of predicates is that it can be defined in many different ways, using different logical formalisms. Furthermore, most of the definitions use only basic notions, and thus allow one to relate the foundations of various fields of fundamental computer science, such as automata theory, syntactic semigroups, model theory and semigroup theory.
The class of regular numerical predicates is denoted REG.
Definitions
The class of regular numerical predicates admits many equivalent definitions, which are given below. In all of them, we fix an arity r and a (numerical) predicate P of arity r.
Automata with variables
The first definition encodes the predicate as a formal language; the predicate is said to be regular if that formal language is regular.
Let the alphabet be the set of subsets of . Given a vector of integers , it is represented by the word of length whose -th letter is . For example, the vector is represented by the word .
We then define as .
The numerical predicate is said to be regular if this language is a regular language over the alphabet defined above. This is the reason for the use of the word "regular" to describe this kind of numerical predicate.
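To make the first definition concrete, here is a minimal illustration (our own example, using a simpler unary encoding of a single number rather than the article's tuple encoding): the predicate "x is even" is regular because the language of unary words of even length is accepted by a two-state DFA.

```python
def is_even_unary(word):
    """Two-state DFA over the unary alphabet {'a'} accepting exactly
    the words of even length, i.e. the unary encoding of the arity-1
    predicate 'x is even'."""
    state = 0                    # 0 = accepting (even), 1 = odd
    for letter in word:
        assert letter == "a"
        state = 1 - state        # flip parity on each letter
    return state == 0

# The predicate {x : x even} is regular because this language is regular.
assert is_even_unary("a" * 10)
assert not is_even_unary("a" * 7)
```

The same construction generalizes to any congruence predicate x ≡ c (mod m), using m states instead of two.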
Automata reading unary numbers
This second definition is similar to the previous one. Predicates are encoded into languages in a different way, and the predicate is said to be regular if and only if the language is regular.
Our alphabet is the set of vectors of binary digits. That is: . Before explaining how to encode a vector of numbers, we explain how to encode a single number.
Given a length and a number , the unary representation of of length is the word over the binary alphabet , beginning by a sequence of "1"'s, followed by |
https://en.wikipedia.org/wiki/National%20Sewerage%20Program | The National Sewerage Program was an Australian federal program under the Whitlam and Fraser governments established to provide funding for the expansion of municipal sewerage systems. At the time Australia was lagging behind other developed nations and, as of the commencement of the program in 1972, 17.2% of the Australian population were not connected to sewerage. Even in major population centers like Sydney and Melbourne, there was a backlog of over 318,000 homes waiting to be connected to municipal sewerage systems. The program was administered by the newly formed Department of Urban and Regional Development, and over AUD$330 million of funding was allocated to be distributed to individual states and territories over ten years. Over the life of the program the sewerage connection backlog was reduced by 30% to 40%. The program was abolished in 1977 by the incumbent Fraser government. Consequently, many communities struggled to connect to sewerage for decades afterwards.
References
Engineering projects
Public policy in Australia
1972 establishments in Australia
1977 disestablishments in Australia
Sewerage infrastructure in Australia |
https://en.wikipedia.org/wiki/Knowledge%20as%20a%20service | Knowledge as a service (KaaS) is a computing service that delivers information to users, backed by a knowledge model, which might be drawn from a number of possible models based on decision trees, association rules, or neural networks. A knowledge as a service provider responds to knowledge requests from users through a centralised knowledge server, and provides an interface between users and data owners.
Knowledge as a service is one of a number of "... as a service" cloud computing models.
Overview
KaaS is a new type of "...as a service" offering that has been discussed with only nascent examples demonstrated at recent (2019) computer science conferences, in particular ISWC '19, the 18th International Semantic Web Conference. At that conference, it was described how knowledge can be made live and evolve on the web, allowing users to learn directly from elaborated knowledge, now appearing in the form of knowledge graphs (KGs). KaaS appears when KGs are accessed via services. This is opposed to DaaS, which might "compute large volumes of data; integrate and analyzes that data; and publish it in real-time, using Web service APIs" (from Data as a Service); the KaaS, by contrast, is able to exploit context - both the context of the user in relation to their information requests of the KaaS (where and when they make the request) and the context of the information in relation to some objective or purpose of the user, either understood by the KaaS automatically or indicated to it by the user.
KaaS is described as being more closely related to Data as a Service, Content as a Service and other services which supply information to users than to other *aaS, such as Software as a Service, which provide functionality. However, the idea that a KaaS may analyse context at query time indicates that there is overlap between KaaS and *aaS such as Search as a Service, and that perhaps not all KaaS responses are idempotent, since their results depend on a context that might be time-dependent.
Di |
https://en.wikipedia.org/wiki/List%20of%20mechanical%20engineering%20awards | This list of mechanical engineering awards is an index to articles about notable awards for mechanical engineering.
Awards
See also
Lists of awards
Lists of science and technology awards
List of engineering awards
References
Mechanical engineering |
https://en.wikipedia.org/wiki/C-ImmSim | C-ImmSim started, in 1995, as the C-language "version" of IMMSIM, the IMMune system SIMulator, a program written back in 1991 in APL-2 (APL2 is a registered trademark of IBM Corp.) by the astrophysicist Phil E. Seiden together with the immunologist Franco Celada to implement the Celada-Seiden model. The porting was mainly conducted and further developed by Filippo Castiglione with the help of a few other people.
The Celada-Seiden model
The Celada-Seiden model is a logical description of the mechanisms making up the adaptive immune humoral and cellular response to a generic antigen at the mesoscopic level.
The computational counterpart of the Celada-Seiden model is the IMMSIM code.
The Celada-Seiden model, as well as C-ImmSim, is best viewed as a collection of models in a single program. In fact, there are various components realising a particular function which can be turned on or off. At its current stage, C-ImmSim incorporates the principal "core facts" of today's immunological knowledge, e.g.
the diversity of specific elements,
MHC restriction,
clonal selection by antigen affinity,
thymic education of T cells, antigen processing and presentation (both the cytosolic and endocytic pathways are implemented),
cell-cell cooperation,
homeostasis of cells created by the bone marrow,
hypermutation of antibodies,
maturation of the cellular and humoral response and memory.
In addition, an antigen can represent a bacterium, a virus, an allergen or a tumour cell.
The high degree of complexity of the Celada-Seiden model makes it suitable to simulate different immunological phenomena, e.g., the hypermutation of antibodies, the germinal centre reaction (GCR), immunization, Thymus selection, viral infections, hypersensitivity, etc.
Since the first release of C-ImmSim, the code has been modified many times. The actual version now includes features that were not in the original Celada-Seiden model.
C-ImmSim has been recently customised to simulate the HIV-1 infection. |
https://en.wikipedia.org/wiki/Lean%20%28proof%20assistant%29 | Lean is a proof assistant and programming language. It is based on the calculus of constructions with inductive types. It is an open-source project hosted on GitHub, originally developed at Microsoft Research.
History
Lean was initially launched by Leonardo de Moura at Microsoft Research in 2013.
Lean 3 was implemented as a virtual machine, which made it less efficient due to overhead associated with interpretation, making it less competitive compared to other proof assistants such as Coq.
In 2021, Lean 4 was released with a reimplementation of the Lean theorem prover capable of producing C code which is then compiled, enabling the development of efficient domain-specific automation. Another improvement over the previous version was the ability to avoid touching C++ code in order to obtain certain features.
Lean 4 is not backwards-compatible with Lean 3.
Overview
Libraries
In 2017, the project adopted a user-maintained library mathlib with the goal to digitize pure mathematics research. As of November 2023, mathlib had formalized over 127,000 theorems and 70,000 definitions in Lean.
Editors integration
Lean integrates with:
Visual Studio Code
Neovim
Emacs
Interfacing is done via a client extension and a Language Server Protocol server.
It has native support for Unicode symbols, which can be typed using LaTeX-like sequences, such as "\times" for "×". Lean can also be compiled to JavaScript and accessed in a web browser and has extensive support for meta-programming.
Examples (Lean 3)
The natural numbers can be defined as an inductive type. This definition is based on the Peano axioms and states that every natural number is either zero or the successor of some other natural number.
inductive nat : Type
| zero : nat
| succ : nat → nat
Addition of natural numbers can be defined recursively, using pattern matching.
definition add : nat → nat → nat
| n zero := n
| n (succ m) := succ (add n m)
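As a small addition (not part of the original article), note that zero is a right identity for add definitionally, while the complementary left-identity law needs induction. It can be proved in the same equation-compiler style; the exact form may vary between Lean 3 versions:

```lean
-- zero is a right identity by the first equation (add n zero := n);
-- the left-identity law is proved by induction on the argument.
theorem zero_add : ∀ n : nat, add zero n = n
| zero     := rfl
| (succ m) := congr_arg succ (zero_add m)
```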
This is a simple proof in Lean in term mode.
theorem and_swap : p ∧ q |
https://en.wikipedia.org/wiki/Abelian%20Lie%20group | In geometry, an abelian Lie group is a Lie group that is an abelian group.
A connected abelian real Lie group is isomorphic to ℝ^k × (S^1)^h for some k, h. In particular, a connected abelian (real) compact Lie group is a torus; i.e., a Lie group isomorphic to (S^1)^n. A connected complex Lie group that is a compact group is abelian, and a connected compact complex Lie group is a complex torus; i.e., a quotient of ℂ^n by a lattice.
Let A be a compact abelian Lie group with identity component A₀. If A/A₀ is a cyclic group, then A is topologically cyclic; i.e., it has an element that generates a dense subgroup. (In particular, a torus is topologically cyclic.)
See also
Cartan subgroup
Citations
Works cited
Abelian group theory
Geometry
Lie groups |
https://en.wikipedia.org/wiki/Sommerfeld%20effect | In mechanics, the Sommerfeld effect is a phenomenon arising from feedback in the energy exchange between vibrating systems: for example, in a rocking-table experiment, under given conditions, energy transmitted to the motor resulted not in higher revolutions but in stronger vibrations of the table. It is named after Arnold Sommerfeld.
In 1902, A. Sommerfeld analyzed the vibrations caused by a motor driving an unbalanced weight and wrote that "This experiment corresponds roughly to the case in which a factory owner has a machine set on a poor foundation running at 30 horsepower. He achieves an effective level of just 1/3, however, because only 10 horsepower are doing useful work, while 20 horsepower are transferred to the foundational masonry".
The first mathematical descriptions of the Sommerfeld effect were suggested by I. Blekhman and V. Konenko.
Hidden attractors in Sommerfeld effect
In the theory of hidden oscillations, the Sommerfeld effect is explained by multistability: the phase space of a dynamical model without stationary states contains two coexisting hidden attractors, one of which attracts trajectories from the vicinity of zero initial data (which correspond to the typical start-up of the motor), while the other corresponds to the desired mode of operation with a higher frequency of rotation. Depending on the model under consideration, the coexisting hidden attractors may be either periodic or chaotic; such dynamical models with the Sommerfeld effect are the earliest known mechanical example of a system without equilibria and with hidden attractors. For example, the Sommerfeld effect with hidden attractors can be observed in dynamic models of drilling rigs, where the electric motor may excite torsional vibrations of the drill.
References
Dynamical systems
Physical phenomena
Hidden oscillation |
https://en.wikipedia.org/wiki/Gelfand%E2%80%93Fuks%20cohomology | In mathematics, Gelfand–Fuks cohomology, introduced by Israel Gelfand and Dmitry Fuks, is a cohomology theory for Lie algebras of smooth vector fields. It differs from the Lie algebra cohomology of Chevalley–Eilenberg in that its cochains are taken to be continuous multilinear alternating forms on the Lie algebra of smooth vector fields, where the latter is given the natural C^∞ topology.
References
Further reading
Cohomology theories
Lie algebras
Homological algebra |
https://en.wikipedia.org/wiki/Biliprotein | Biliproteins are pigment protein compounds located in photosynthesising organisms such as algae and certain insects. The term refers to any protein that contains a bilin chromophore. In plants and algae, the main function of biliproteins is to make the process of light accumulation required for photosynthesis more efficient, while in insects they play a role in growth and development. Some of their properties, including light-receptivity, light-harvesting and fluorescence, have made them suitable for applications in bioimaging and as indicators, while other properties such as anti-oxidation, anti-aging and anti-inflammation in phycobiliproteins have given them potential for use in medicine, cosmetics and food technology. While research on biliproteins dates back as far as 1950, it was hindered by issues regarding biliprotein structure, a lack of methods for isolating individual biliprotein components, and limited information on lyase reactions (which are needed to join proteins with their chromophores). Research on biliproteins has also been primarily focused on phycobiliproteins, but advances in technology and methodology, along with the discovery of different types of lyases, have renewed interest in biliprotein research, allowing new opportunities for investigating biliprotein processes such as assembly/disassembly and protein folding.
Functions
In plants and algae
Biliproteins found in plants and algae serve as a system of pigments whose purpose is to detect and absorb light needed for photosynthesis. The absorption spectra of biliproteins complement those of other photosynthetic pigments such as chlorophyll or carotene. The pigments detect and absorb energy from sunlight, and this energy is later transferred to chlorophyll via internal energy transfer. According to a 2002 article written by Takashi Hirata et al., the chromophores of certain phycobiliproteins are responsible for antioxidant activities in these biliproteins, and phycocya |
https://en.wikipedia.org/wiki/Libarchive | libarchive is a free and open-source library for reading and writing various archive and compression formats. It is written in C and works on most Unix-like systems and Windows.
History
libarchive's development was started in 2003 as part of the FreeBSD project. During the early years it was led by the FreeBSD project, but later it became an independent project. It was first released with FreeBSD 5.3 in November 2004.
libarchive
libarchive automatically detects and reads archive formats. If the archive is compressed, libarchive also detects and handles the compression format before evaluating the archive. libarchive is designed to minimize internal copying of data to maximize performance.
Supported archive formats:
7z - read and write
ar - read and write
cab - read only
cpio - read and write
ISO9660 - read and write
lha & lzh - read only
pax - read and write
rar - read only
shar - write only
tar - read and write
warc (ISO 28500:2009) - read and write
xar - read and write
zip - read and write
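libarchive identifies formats by inspecting the input itself rather than trusting file extensions. A minimal Python sketch of the same idea follows, matching a few well-known magic numbers; the signatures shown are the standard ones for each format, but the function name is illustrative and not part of libarchive's API:

```python
def sniff_archive_format(data: bytes) -> str:
    """Guess an archive/compression format from its magic bytes."""
    signatures = [
        (b"PK\x03\x04", 0, "zip"),
        (b"7z\xbc\xaf\x27\x1c", 0, "7z"),
        (b"\x1f\x8b", 0, "gzip"),
        (b"\xfd7zXZ\x00", 0, "xz"),
        (b"Rar!\x1a\x07", 0, "rar"),
        (b"ustar", 257, "tar"),  # POSIX tar stores its magic at offset 257
    ]
    for magic, offset, name in signatures:
        if data[offset:offset + len(magic)] == magic:
            return name
    return "unknown"
```

A real reader such as libarchive layers this: after detecting a compression wrapper (e.g. gzip), it decompresses and re-examines the inner stream to find the archive format.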
Utilities
libarchive provides command-line utilities called bsdtar and bsdcpio. These are complete re-implementations built on libarchive. They are the default system tar and cpio on FreeBSD, NetBSD, and macOS, and bsdtar is also included with Windows.
There is also bsdcat, designed to decompress a file to the standard output like zcat.
Users
libarchive was originally developed for FreeBSD, but is also used in NetBSD and macOS as part of those operating systems.
bsdtar has been included in Windows since the Windows 10 April 2018 Update. In May 2023, Microsoft announced that Windows 11 would natively support additional archive formats such as 7z and RAR via libarchive.
References
External links
2003 software
C (programming language) libraries
Free software programmed in C
Cross-platform software
File archivers
Free data compression software |
https://en.wikipedia.org/wiki/Wavefront%20expansion%20algorithm | The wavefront expansion algorithm is a specialized potential field path planner with breadth-first search to avoid local minima. It uses a growing circle around the robot. The nearest neighbors are analyzed first and then the radius of the circle is extended to distant regions.
Motivation
Before a robot is able to navigate a map it needs a plan. The plan is a trajectory from start to goal and describes, for each moment in time and each position in the map, the robot's next action. Path planning is solved by many different algorithms, which can be categorised as sampling-based and heuristics-based approaches.
Before path planning, the map is discretized into a grid. The vector information is converted into a 2D array and stored in memory. The potential field path planning algorithm determines the direction of the robot for each cell. This direction field is shown overlaid on the robotic map containing the robot and the obstacles. The question for the potential field algorithm is: which cell is labeled with which direction? This can be answered with a sampling-based algorithm.
Wavefront expansion
A sampling-based planner works by searching the graph. In the case of path planning, the graph contains the spatial nodes which can be observed by the robot. The wavefront expansion increases the performance of the search by analyzing only nodes near the robot. The decision is made at the geometric level and is equivalent to breadth-first search. That is, it uses metrics such as distance from obstacles and gradient search for the path planning algorithm.
The algorithm includes a cost function as an additional heuristic for path planning.
Implementation
Practical open-source implementations of the algorithm are available. The map of the world is provided as an array. Obstacles and the start position of the robot are given by special values in the array. The solver determines the goal direction in the imagined wave.
Existing implementations use a queue to store a wave data |
https://en.wikipedia.org/wiki/Parallel%20task%20scheduling | Parallel task scheduling (also called parallel job scheduling or parallel processing scheduling) is an optimization problem in computer science and operations research. It is a variant of optimal job scheduling. In a general job scheduling problem, we are given n jobs J1, J2, ..., Jn of varying processing times, which need to be scheduled on m machines while trying to minimize the makespan - the total length of the schedule (that is, when all the jobs have finished processing). In the specific variant known as parallel-task scheduling, all machines are identical. Each job j has a length parameter pj and a size parameter qj, and it must run for exactly pj time-steps on exactly qj machines in parallel.
Veltman et al. and Drozdowski denote this problem by P|sizej|Cmax in the three-field notation introduced by Graham et al. P means that there are several identical machines running in parallel; sizej means that each job has a size parameter; Cmax means that the goal is to minimize the maximum completion time. Some authors use P|mj|Cmax instead. Note that the problem of parallel-machines scheduling is a special case of parallel-task scheduling where sizej = 1 for all j, that is, each job should run on a single machine.
The origins of this problem formulation can be traced back to 1960. For this problem, there exists no polynomial-time approximation algorithm with a ratio smaller than 3/2 unless P = NP.
Definition
There is a set of n jobs and m identical machines. Each job j has a processing time pj (also called the length of j), and requires the simultaneous use of qj machines during its execution (also called the size or the width of j).
A schedule assigns each job to a starting time and a set of machines to be processed on. A schedule is feasible if each processor executes at most one job at any given time.
The objective of the problem P|sizej|Cmax is to find a schedule with minimum length Cmax, also called the makespan of the schedule.
A sufficient condition for the feasibility of a schedule is the follow |
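The definition above translates directly into a feasibility check plus a makespan computation. A small Python sketch (the data representation is illustrative): each job j occupies its qj assigned machines for pj time units, and no machine may run two jobs at once.

```python
def check_schedule(jobs, schedule, m):
    """Verify a parallel-task schedule and return its makespan.

    jobs: {j: (p_j, q_j)} -- length and size of each job.
    schedule: {j: (start, machines)} -- start time and the set of
    machines assigned to job j. Raises AssertionError if infeasible.
    """
    for j, (start, machines) in schedule.items():
        p, q = jobs[j]
        assert len(machines) == q, f"job {j} needs exactly {q} machines"
        assert all(0 <= i < m for i in machines), "machine index out of range"
    # No machine may execute two jobs at the same time: any two jobs
    # sharing a machine must have disjoint time intervals.
    for j1, (s1, m1) in schedule.items():
        for j2, (s2, m2) in schedule.items():
            if j1 < j2 and (m1 & m2):
                p1, p2 = jobs[j1][0], jobs[j2][0]
                assert s1 + p1 <= s2 or s2 + p2 <= s1, \
                    f"jobs {j1} and {j2} overlap on a shared machine"
    return max(s + jobs[j][0] for j, (s, _) in schedule.items())
```

For example, with jobs {0: (3, 2), 1: (2, 1)} on m = 3 machines, running job 0 on machines {0, 1} and job 1 on machine {2}, both from time 0, is feasible with makespan 3.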
https://en.wikipedia.org/wiki/Huddles%20%28app%29 | Huddles (originally V2, Byte, and later Clash) was an American short-form video hosting service and social network where users could create looping videos between 2 and 16 seconds long. It was created by a team led by Dom Hofmann, a co-founder of Vine, as a successor to that service; the project was later sold to Clash App, Inc. and subsequently renamed.
Initially teased as v2, it was branded as Byte in November 2018. After a three-year closed beta, it officially launched on Apple's App Store and the Google Play Store on January 24, 2020. A year later it was sold to Clash, another short-form video app, and the two apps merged into a single app called Clash, which was later renamed Huddles. It was discontinued on May 3, 2023.
History
Byte's predecessor, Vine, was founded in June 2012. It was acquired by Twitter in October 2012. It underwent a staggered update on iOS, Android, and Windows Phone systems throughout much of 2013. The main Vine app was shut down by Twitter in January 2017, disallowing all new videos to be uploaded. The Vine homepage was made into an archive, with users being able to view previously uploaded content. As of 2019, the archive is no longer available, though individual videos are still able to be accessed via their direct link.
Vine co-founder Dom Hofmann announced in December 2017 that he intended to launch a successor to Vine. At the time, he called it "v2". In May 2018, he posted an update that the project was being put on hold. Among other things, he said that the biggest reason for this was "financial and legal hurdles". He said that his intention was to fund the new service himself as a personal project, but the attention that the announcement generated suggested that the cost to build and run a service that was sustainable at launch would be too high. In November, he announced that the project was moving forward again with funding and a team, under the new "Byte" branding. At the time, the website invited users to sign |
https://en.wikipedia.org/wiki/Wuhan%20Institute%20of%20Virology | The Wuhan Institute of Virology, Chinese Academy of Sciences (WIV; ) is a research institute on virology administered by the Chinese Academy of Sciences (CAS), which reports to the State Council of the People's Republic of China. The institute is one of nine independent organisations in the Wuhan Branch of the CAS. Located in Jiangxia District, Wuhan, Hubei, it was founded in 1956 and opened mainland China's first biosafety level 4 (BSL-4) laboratory in 2018. The institute has collaborated with the Galveston National Laboratory in the United States, the Centre International de Recherche en Infectiologie in France, and the National Microbiology Laboratory in Canada. The institute has been an active premier research center for the study of coronaviruses.
History
The WIV was founded in 1956 as the Wuhan Microbiology Laboratory under the Chinese Academy of Sciences (CAS). It was established by scientists Gao Shangyin, a graduate of Soochow University (Suzhou), and Chen Huagui. In 1961, it became the South China Institute of Microbiology, and in 1962 was renamed Wuhan Microbiology Institute. In 1970, it became the Microbiology Institute of Hubei Province when the Hubei Commission of Science and Technology took over the administration. In June 1978, it was returned to the CAS and renamed Wuhan Institute of Virology.
In 2003, the Chinese Academy of Sciences approved the construction of mainland China's first biosafety level 4 (BSL-4) laboratory at the WIV. In 2014, the WIV's National Bio-safety Laboratory was built at a cost of 300 million yuan (US$44 million), in collaboration with, and with assistance from, the French government's CIRI lab. The new laboratory building has 3000 m2 of BSL-4 space, as well as 20 BSL-2 and two BSL-3 laboratories. The BSL-4 facilities were accredited by the China National Accreditation Service for Conformity Assessment (CNAS) in January 2017, and the BSL-4 lab was put into operation in January 2018. The highest level biosafety installation is ne |
https://en.wikipedia.org/wiki/List%20of%20piezoelectric%20materials | This page lists properties of several commonly used piezoelectric materials.
Piezoelectric materials (PMs) can be broadly classified as crystalline, ceramic, or polymeric. The most commonly produced piezoelectric ceramics are lead zirconate titanate (PZT), barium titanate, and lead titanate. Gallium nitride and zinc oxide can also be regarded as ceramics due to their relatively wide band gaps. Semiconducting PMs offer features such as compatibility with integrated circuits and semiconductor devices. Inorganic ceramic PMs offer advantages over single crystals, including ease of fabrication into a variety of shapes and sizes, since they are not constrained by crystallographic directions. Organic polymer PMs, such as PVDF, have a low Young's modulus compared to inorganic PMs. Piezoelectric polymers (PVDF, 240 mV·m/N) possess higher piezoelectric stress constants (g33), an important parameter for sensors, than ceramics (PZT, 11 mV·m/N), which suggests that they can be better sensors than ceramics. Moreover, piezoelectric polymeric sensors and actuators can, owing to their processing flexibility, be readily manufactured into large areas and cut into a variety of shapes. In addition, polymers exhibit high strength, high impact resistance, low dielectric constant, low elastic stiffness, and low density, and thereby a high voltage sensitivity, a desirable characteristic along with the low acoustic and mechanical impedance useful for medical and underwater applications.
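The g33 comparison above can be made concrete with the standard open-circuit sensor relation V = g33 · T · t (stress T across a film of thickness t). The stress and thickness values below are illustrative, not from the article:

```python
def open_circuit_voltage(g33_mV_m_per_N, stress_Pa, thickness_m):
    """Open-circuit output of a piezoelectric film sensor: V = g33 * T * t."""
    return g33_mV_m_per_N * 1e-3 * stress_Pa * thickness_m

# 1 kPa of stress on a 100-micrometre film (illustrative numbers):
pvdf = open_circuit_voltage(240, 1e3, 100e-6)  # PVDF, g33 = 240 mV*m/N
pzt = open_circuit_voltage(11, 1e3, 100e-6)    # PZT,  g33 = 11 mV*m/N
```

Under identical loading, the PVDF film produces roughly twenty times the voltage of the PZT element, which is the sense in which polymers make the more sensitive (voltage-mode) sensors.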
Among PMs, PZT ceramics are popular as they have a high sensitivity and a high g33 value. They are, however, brittle. Furthermore, they have a low Curie temperature, which constrains their use in harsh environmental conditions. A promising approach, however, is the integration of ceramic disks into industrial appliances moulded from plastic. This has resulted in the development of PZT-polymer composites, and the feasible integration of functional PM composites on a large scale, by simple thermal welding or by conforming proc |
https://en.wikipedia.org/wiki/UK%20Battery%20Industrialisation%20Centre | The UK Battery Industrialisation Centre (UK BIC) is a research centre in the United Kingdom that develops new electric batteries for the British automotive industry. UKBIC provides over £60 million worth of specialized manufacturing equipment, supporting manufacturers, entrepreneurs, researchers, and educators in battery technology development. It has accelerated low-carbon R&D, contributing to the UK's goal of net zero by 2050.
History
Funding for the UK Battery Industrialisation Centre (UKBIC) is supplied by United Kingdom Research and Innovation (UKRI). This financial support was announced on 29 November 2017. The facility was officially inaugurated by the British Prime Minister, Boris Johnson, in July 2021, as documented on the UKBIC's official website.
Location
The UKBIC facility is located outside Coventry, adjacent to Coventry airport and about half a mile east of the junction between the A46 and A45. This is just outside the city boundary, in the extreme north of Warwick District, Warwickshire.
References
External links
2020 establishments in England
Automotive industry in the United Kingdom
Engineering research institutes
Research institutes in Warwickshire
Warwick District |
https://en.wikipedia.org/wiki/Blood%20compatibility%20testing | Blood compatibility testing is conducted in a medical laboratory to identify potential incompatibilities between blood group systems in blood transfusion. It is also used to diagnose and prevent some complications of pregnancy that can occur when the baby has a different blood group from the mother. Blood compatibility testing includes blood typing, which detects the antigens on red blood cells that determine a person's blood type; testing for unexpected antibodies against blood group antigens (antibody screening and identification); and, in the case of blood transfusions, mixing the recipient's plasma with the donor's red blood cells to detect incompatibilities (crossmatching). Routine blood typing involves determining the ABO and RhD (Rh factor) type, and involves both identification of ABO antigens on red blood cells (forward grouping) and identification of ABO antibodies in the plasma (reverse grouping). Other blood group antigens may be tested for in specific clinical situations.
Blood compatibility testing makes use of reactions between blood group antigens and antibodies—specifically the ability of antibodies to cause red blood cells to clump together when they bind to antigens on the cell surface, a phenomenon called agglutination. Techniques that rely on antigen-antibody reactions are termed serologic methods, and several such methods are available, ranging from manual testing using test tubes or slides to fully automated systems. Blood types can also be determined through genetic testing, which is used when conditions that interfere with serologic testing are present or when a high degree of accuracy in antigen identification is required.
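The cross-check between forward and reverse grouping described above can be sketched as a simple lookup: a person's red cells carry the antigens of their type, while their plasma carries antibodies against the ABO antigens they lack (Landsteiner's rule). A toy Python sketch; function names are illustrative, and real typing must also handle weak subgroups and other discrepancies:

```python
ABO_ANTIGENS = {"A": {"A"}, "B": {"B"}, "AB": {"A", "B"}, "O": set()}

def expected_antibodies(abo_type):
    """Plasma antibodies target the ABO antigens absent from the
    person's own red cells (Landsteiner's rule)."""
    return {"A", "B"} - ABO_ANTIGENS[abo_type]

def interpret(forward_antigens, reverse_antibodies):
    """Match forward grouping (cell antigens) against reverse grouping
    (plasma antibodies); any mismatch is an ABO discrepancy."""
    for abo, antigens in ABO_ANTIGENS.items():
        if antigens == set(forward_antigens):
            if expected_antibodies(abo) == set(reverse_antibodies):
                return abo
            return "discrepancy"
    return "discrepancy"
```

For instance, cells agglutinating only with anti-A reagent plus plasma agglutinating only B cells is a consistent type A, whereas anti-A-reactive cells with no plasma antibodies at all would be flagged as a discrepancy to investigate.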
Several conditions can cause false or inconclusive results in blood compatibility testing. When these issues affect ABO typing, they are called ABO discrepancies. ABO discrepancies must be investigated and resolved before the person's blood type is reported. Other sources of error include the "weak D" phenomenon, in whi |
https://en.wikipedia.org/wiki/Heinz%20Raether | Heinz Artur Raether (14 October 1909 – 31 December 1986) was a German physicist. He is best known for his theoretical and experimental contributions to the study of surface plasmons, as well as for the Kretschmann–Raether configuration, a commonly used experimental setup for the excitation of surface plasmon resonances.
From 1944 to 1946 he was a professor of physics at the University of Jena at the Physikalisches Institut. Here he dealt with electron physics, electron microscopy, electron interference and gas discharges.
In 1951, he took over the management of the Institute for Applied Physics at the University of Hamburg. After the development of the transistor, he focused on solid state physics. His work during this period concerned the structure and growth of crystals. Later he became interested in the collective behavior of the electrons of a crystal, the solid-state electron plasma.
In gas discharge physics, he devoted himself to the ignition process, especially the formation of the spark channel, the initial phase of electrical breakdown. In 1963 he was elected a full member of the Göttingen Academy of Sciences. In 1979 he was elected a member of the Academy of Sciences Leopoldina.
Selected publications
Articles
Books
See also
Raether limit
Surface plasmon polariton
Townsend discharge
References
1909 births
1986 deaths
Condensed matter physicists
Electrical breakdown
Optical physicists
People associated with electricity
Scientists from Nuremberg
German plasma physicists
Plasmonics
Academic staff of the University of Hamburg
Academic staff of the University of Jena
20th-century German physicists
Nanophysicists |
https://en.wikipedia.org/wiki/American%20Vaudeville%20Museum | The American Vaudeville Museum (AVM) was a vaudeville history and memorabilia museum in Edgewood, New Mexico which moved its collection to the University of Arizona and online.
The museum was founded in 1986 by Frank Cullen and Donald McNeilly. It posted historic content online and published Vaudeville Times magazine quarterly from 1998 to 2008. Its virtual museum included a bibliography of sources and an index of vaudevillians.
References
External links
American Vaudeville Museum website
University of Arizona Vaudeville Archive
Defunct museums in New Mexico
Vaudeville
Virtual museums
University of Arizona
University museums in Arizona |
https://en.wikipedia.org/wiki/ACES%20Coders | ACES Coders is an algorithmic programming competition in Sri Lanka organized by the Association of Computer Engineering Students (ACES) of the Department of Computer Engineering, Faculty of Engineering, University of Peradeniya. ACES Coders commenced in 2011 as a training event for computer engineering students, preparing them to take part in similar online competitions such as IEEEXtreme and the ACM ICPC. With its growing popularity it is now organized annually, and today it is one of the largest coding competitions in the country, attracting participants from universities and higher education institutions nationwide. The event usually lasts 12 hours, during which the contestants are given a set of problems to solve using programming and problem-solving skills.
Past winners
Awards
ACES Coders awards cash prizes along with certificates to the winning teams. The cash prizes have grown over the years; the 2019 event offered a total of 150,000 LKR split among the top three teams, and the 2020 event is predicted to award cash prizes in excess of 200,000 LKR.
References
Programming contests |
https://en.wikipedia.org/wiki/Windows%20Open%20Services%20Architecture | Windows Open Services Architecture (WOSA) is a set of proprietary Microsoft technologies intended to "...provide a single, open-ended interface to enterprise computing environments." Announced by Microsoft in 1992, WOSA was pitched as a set of programming interfaces designed to provide application interoperability across the Windows environment.
The set of technologies that were part of the WOSA initiative included:
LSAPI (Software Licensing API)
MAPI (Mail Application Programming Interface)
ODBC (Open Database Connectivity)
OLE for Process Control
SAPI (Speech Application Programming Interface)
TAPI (Telephony Application Programming Interface)
Windows SNA (IBM SNA Networks)
WOSA/XFS (WOSA for Financial Services)
WOSA/XRT (WOSA for Real-time Market Data)
See also
Component Object Model
Object Linking and Embedding
References
External links
Inter-process communication
Windows communication and services
Architectural pattern (computer science)
Enterprise application integration
Service-oriented (business computing)
Web services
Component-based software engineering |
https://en.wikipedia.org/wiki/Radiant%20Mercury | Radiant Mercury is a cross-domain solution (CDS) software application developed by Lockheed Martin primarily in use by the US Navy.
As a CDS, it is designed to allow communications between higher-level classified networks and lower-level, unclassified networks.
See also
Guard (information security)
References
Computer security software
Lockheed Martin |
https://en.wikipedia.org/wiki/GGSE-4 | The Gravity Gradient Stabilization Experiment (GGSE-4) was a technology satellite launched in 1967. This was ostensibly the fourth in a series that developed designs and deployment techniques later applied to the NOSS/Whitecloud reconnaissance satellites.
History
GGSE-4 was launched by the U.S. Air Force from Vandenberg Air Force Base atop a Thor Agena-D rocket.
GGSE-4 remained operational from 1967 through 1972.
It is alleged that the real name of GGSE-4 was POPPY 5B and that it was a U.S. National Reconnaissance Office satellite designed to collect signals intelligence; POPPY 5B was part of a seven-satellite mission. A partial subset of information about POPPY was declassified in 2005.
Other sources say that GGSE-4 weighed only 10 pounds but that it was attached to the much larger Poppy 5, which would have weighed 85 kg and featured an 18-meter boom.
It is further alleged that GGSE-4's mass was far greater than GGSE-1's, with GGSE-4 weighing 85 kg.
2020 near-miss
In 2020, GGSE-4 was expected to pass as closely as 12 meters from IRAS, another un-deorbited satellite left aloft. IRAS was launched in 1983 and abandoned after a 10-month mission. The 14.7-kilometer-per-second pass had an estimated risk of collision of 5%. Further complications arose from the fact that GGSE-4 was outfitted with an 18-meter-long stabilization boom that was in an unknown orientation and might have struck the other satellite even if the spacecraft's main body did not. Initial observations from amateur astronomers seemed to indicate that both satellites had survived the pass, with the California-based debris tracking organization LeoLabs later confirming that they had detected no new tracked debris following the incident.
See also
Gravity Gradient Stabilization Experiment (GGSE-1)
References
Space |
https://en.wikipedia.org/wiki/Chinese%20Ceramic%20Society | The Chinese Ceramic Society (; abbreviated CCS) of Beijing is a Chinese non-profit professional body and learned society in the field of Chinese ceramics with a focus on scientific research, emerging technologies, and applications in which ceramic materials are an element. It was established in 1945. As of 2018, the society has 21 specialized committees and 3 working committees with more than 20,000 individual members.
History
The Chinese Ceramic Society started in 1945 as a research group in southwest China's Chongqing city. In January 1951 this group became the "China Kiln Engineering Society" (), but was closed down in October. In December 1956, the "Preparatory Committee of China Silicates Society" () was founded in Beijing, and in November 1959 the name was changed to the "Chinese Ceramic Society".
Scientific publishing
Journal of Materiomics
References
External links
Ceramic engineering
Ceramic materials
Glass engineering and science
Materials science organizations
Scientific organizations established in 1945
Organizations based in Beijing
1945 establishments in China |
https://en.wikipedia.org/wiki/ASIL%20accuracy | ASIL accuracy describes the maximum possible deviation of a measurement in a system in which a single point fault has occurred, before some diagnostic detects that fault. This concept applies to automotive systems designed under the ISO 26262 methodology for automotive functional safety, which defines Automotive Safety Integrity Levels (ASILs) to classify risks.
While accuracy refers to a single measurement, ASIL accuracy considers variation in the primary measurement being assessed as well as variation in the diagnostic measurement or measurements used to detect single point faults.
How to calculate
A conceptually simple implementation incorporates a fully redundant measurement. A fault in the primary measurement can be detected by comparing the primary and diagnostic measurements, and signaling a fault if the difference is outside the expected operating range. If the two measurements are truly independent and uncorrelated, in normal operation they can be at opposite ends of their operating ranges. If the primary measurement has an accuracy V1, and if the redundant diagnostic measurement has an accuracy V2, then the fault detection limit should be set to at least VLIM=V1+V2 to avoid false positives. The system shall flag a fault if the difference between the two measurements is greater than VLIM. The fault detection limit, however, should not be confused with ASIL accuracy. Consider the case of a single point fault in which the primary measurement drifts to an incorrect value. ASIL accuracy describes the maximum such drift before the fault is flagged. If the diagnostic measurement is at the maximum of its operating range, the primary measurement can drift VLIM further before the fault is raised. The maximum possible drift in the primary measurement, then, is V2+VLIM, and so the ASIL accuracy is VASIL=V2+VLIM.
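The two relations above, the detection threshold and the worst-case undetected drift, reduce to two lines of arithmetic. A tiny Python sketch with illustrative names:

```python
def asil_accuracy(v1, v2):
    """Worst-case undetected drift of the primary measurement.

    v1: accuracy of the primary measurement.
    v2: accuracy of the redundant diagnostic measurement.
    Returns (fault detection limit VLIM, ASIL accuracy VASIL).
    """
    v_lim = v1 + v2        # smallest threshold that avoids false positives
    v_asil = v2 + v_lim    # maximum drift before the fault is flagged
    return v_lim, v_asil
```

For example, with a primary accuracy of 1.0 units and a diagnostic accuracy of 2.0 units, the detection limit is 3.0 units while the ASIL accuracy is 5.0 units, illustrating how the guaranteed bound is looser than either individual accuracy.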
References
Automotive engineering |
https://en.wikipedia.org/wiki/Mihaela%20Ignatova | Mihaela Ignatova is a Bulgarian mathematician who won the 2020 Sadosky Prize of the Association for Women in Mathematics for her research in mathematical analysis, and in particular in partial differential equations and fluid dynamics.
Education
In 2004, Ignatova earned both a bachelor's degree from Sofia University and a master's degree from the University of Nantes. She earned a second master's degree from Sofia University in 2006, working under the supervision of mathematician Emil Horozov. She then completed a PhD at the University of Southern California in 2011 under the supervision of Igor Kukavica.
Career
After working as a visiting assistant professor at the University of California, Riverside, a postdoctoral researcher at Stanford University, and an instructor at Princeton University, she moved to Temple University as an assistant professor in 2018.
References
External links
Home page
Year of birth missing (living people)
Living people
21st-century Bulgarian mathematicians
Bulgarian women mathematicians
Mathematical analysts
Sofia University alumni
University of Nantes alumni
University of Southern California alumni
Temple University faculty |
https://en.wikipedia.org/wiki/Reindeer%20distribution | The reindeer (caribou in North America) is a widespread and numerous species in the northern Holarctic, being present in both tundra and taiga (boreal forest). Originally, the reindeer was found in Scandinavia, eastern Europe, Russia, Mongolia, and northern China north of the 50th latitude. In North America, it was found in Canada, Alaska (United States), and the northern contiguous USA from Washington to Maine. In the 19th century, it was apparently still present in southern Idaho. It also occurred naturally on Sakhalin, Greenland, and probably even in historical times in Ireland.
During the late Pleistocene era, reindeer were found further south, such as at Nevada, Tennessee, and Alabama in North America and Spain in Europe. Today, wild reindeer have disappeared from many areas within this large historical range, especially from the southern parts, where it vanished almost everywhere. Populations of wild reindeer are still found in Norway, Finland, Siberia, Greenland, Alaska, and Canada.
The George River reindeer herd in the tundra of Quebec and Labrador in eastern Canada, once the world's largest at 800,000–900,000 animals, stood at 74,000 as of December 2011, a drop of up to 92% attributed to iron-ore mining, flooding for hydropower, and road building.
Domesticated reindeer are mostly found in northern Fennoscandia and Russia, with a herd of approximately 150–170 semi-domesticated reindeer living around the Cairngorms region in Scotland. Although formerly more widespread in Scandinavia, the last remaining wild mountain reindeer in Europe are found in portions of southern Norway. Siberian tundra reindeer are widespread in Russia.
A few reindeer from Norway were introduced to the South Atlantic island of South Georgia at the beginning of the 20th century. The South Georgian reindeer totaled an estimated 2,600 animals in two distinct herds separated by glaciers. Although the flag and the coat of arms of the territory contain an image of a reindeer, they were eradicated fr |
https://en.wikipedia.org/wiki/Geometric%20Folding%20Algorithms | Geometric Folding Algorithms: Linkages, Origami, Polyhedra is a monograph on the mathematics and computational geometry of mechanical linkages, paper folding, and polyhedral nets, by Erik Demaine and Joseph O'Rourke. It was published in 2007 by Cambridge University Press.
A Japanese-language translation by Ryuhei Uehara was published in 2009 by the Modern Science Company.
Audience
Although aimed at computer science and mathematics students, much of the book is accessible to a broader audience of mathematically-sophisticated readers with some background in high-school level geometry.
Mathematical origami expert Tom Hull has called it "a must-read for anyone interested in the field of computational origami".
It is a monograph rather than a textbook, and in particular does not include sets of exercises.
The Basic Library List Committee of the Mathematical Association of America has recommended this book for inclusion in undergraduate mathematics libraries.
Topics and organization
The book is organized into three sections, on linkages, origami, and polyhedra.
Topics in the section on linkages include
the Peaucellier–Lipkin linkage for converting rotary motion into linear motion,
Kempe's universality theorem that any algebraic curve can be traced out by a linkage,
the existence of linkages for angle trisection,
and the carpenter's rule problem on straightening two-dimensional polygonal chains.
This part of the book also includes applications to motion planning for robotic arms, and to protein folding.
The second section of the book concerns the mathematics of paper folding, and mathematical origami. It includes the NP-completeness of testing flat foldability,
the problem of map folding (determining whether a pattern of mountain and valley folds forming a square grid can be folded flat),
the work of Robert J. Lang using tree structures and circle packing to automate the design of origami folding patterns,
the fold-and-cut theorem according to which any polygon |