Birkenhead Town Hall is a civic building and former town hall in Birkenhead on the Wirral Peninsula in Merseyside, England. The building was formerly the administrative headquarters of the County Borough of Birkenhead and, more recently, council offices for the Metropolitan Borough of Wirral. Birkenhead Town Hall remains the location of the town's register office
https://huggingface.co/datasets/fmars/wiki_stem
The National Civic Art Society is a nonprofit organization that advocates and promotes public art, architecture, and urbanism in the classical tradition and opposes the inclusion of modern and contemporary architectural styles. The Society has various regional chapters that host local events and outreach. The organization is headquartered in Washington, D.C.
Colonial architecture is a hybrid architectural style that arose as colonists combined architectural styles from their country of origin with design characteristics of the settled country. Colonists frequently built houses and buildings in a style that was familiar to them but with local characteristics more suited to their new climate. In recent years, people in the interior design community have begun to acknowledge the baggage the term may carry, and to explore how the style may be appreciated whilst also acknowledging the harm and trauma of the Colonial era
The Johnathan True House is a historic home at 43 East Main Street in the Lower Falls area of Yarmouth, Maine. Built in 1780, before Yarmouth's secession from North Yarmouth, it is one of the oldest surviving buildings in the town. Between 1780 and the turn of the 19th century, Lower Falls saw an increase in its population after early settlers gradually moved inland from the area around the Meetinghouse under the Ledge on Gilman Road
Medieval architecture was the art of designing and constructing buildings in the Middle Ages. Major styles of the period include pre-Romanesque, Romanesque, and Gothic. The Renaissance marked the end of the medieval period, when architects began to favour classical forms
Romanesque architecture is an architectural style of medieval Europe characterized by semi-circular arches. There is no consensus for the beginning date of the Romanesque style, with proposals ranging from the 6th to the 11th century, this latter date being the most commonly held. In the 12th century it developed into the Gothic style, marked by pointed arches
Ad Quadratum: The Practical Application of Geometry in Medieval Architecture is an edited volume on the mathematical design of medieval architecture. It was edited by Nancy Y. Wu, published in 2002 by Ashgate Publishing, and reprinted in 2016 by Routledge
Pre-Romanesque architecture in Asturias is framed between the years 711 and 910, the period of the creation and expansion of the Kingdom of Asturias. History: In the 5th century, the Goths, a Christianized tribe of Eastern Germanic origin, arrived in the Iberian Peninsula after the fall of the Roman Empire and came to dominate most of the territory, attempting to continue Roman order through the so-called Ordo Gothorum. In the year 710, the Visigothic king Wittiza died, and instead of the throne passing to Agila, the eldest of his three sons, it was usurped by Roderic, duke of Baetica
A barrel vault, also known as a tunnel vault, wagon vault or wagonhead vault, is an architectural element formed by the extrusion of a single curve (or pair of curves, in the case of a pointed barrel vault) along a given distance. The curves are typically circular in shape, lending a semi-cylindrical appearance to the total design. The barrel vault is the simplest form of a vault: effectively a series of arches placed side by side
Birley Old Hall is a small English country house situated in the Birley Edge area of the City of Sheffield, England. The hall stands in an exposed situation at almost 200 metres above sea level on Edge Lane, some six km north-west of the city centre, and has been designated a Grade II listed building by English Heritage, as has the Falconry which stands in the garden. History: The hall was constructed in two phases. The easterly-facing wing was constructed in the Late Middle Ages, when the hall was known as “Byrlay Edge”; it was built with an oak cruck frame with an exterior of squared stone
Bjelaj Fortress, locally known as Stari Grad Bjelaj (Bjelaj old town), is a medieval town-fortress complex near the village of Bjelaj, Bosanski Petrovac, Bosnia and Herzegovina. Location: It is located on the edge of the Petrovac plain (or a part of it, the Bjelajsko field), on the northern slope of the Osječenica mountain. The area around the fortress is uninhabited and barren
Booth Mansion is a former town house at 28–34 Watergate Street, Chester, Cheshire, England. It contains a portion of the Chester Rows, is recorded in the National Heritage List for England as a designated Grade I listed building, and is included in the English Heritage Archive. Its frontage was built in 1700 in Georgian style but much medieval material remains behind it
A calvary, also called a calvary hill, Sacred Mount, or Sacred Mountain, is a type of Christian sacred place, built on the slopes of a hill and composed of a set of chapels, usually laid out in the form of a pilgrims' way. It is intended to represent the Passion of Jesus Christ and takes its name from Calvary, the hill in Jerusalem where Jesus was crucified. These sites function as greatly expanded versions of the Stations of the Cross that are usual in Catholic churches, allowing the devout to follow the stages of the Passion of Christ as pilgrims do along the Via Dolorosa in Jerusalem
The Castle of Alva (Portuguese: Castelo de Alva) is a medieval castle located in the civil parish of Viade de Baixo e Fervidelas, in the municipality of Freixo de Espada à Cinta, Portuguese district of Bragança. History: The first reference to the castle, located between Freixo de Espada à Cinta and Urros, dates from June 1212, in the context of the Leonese invasion. By 1236 it was occupied by Leonese forces with the consent of its inhabitants
The Castle of Avis (Portuguese: Castelo de Avis) is a Portuguese medieval castle in the civil parish of Avis, in the municipality of the same name, in the Alentejo district of Portalegre. History: In 1211, D. Afonso II awarded the "lands of Avis" to the Knights of Évora, under the dependency of the Order of Calatrava, in the person of D
The Castle of Mértola (Portuguese: Castelo de Mértola) is a well-preserved medieval castle located in the civil parish and municipality of Mértola, in the Portuguese district of Beja. History: In 318 B.C.
The Castle of Mós (Portuguese: Castelo de Mós) is a medieval castle located in the civil parish of Mós, municipality of Torre de Moncorvo, in the Portuguese district of Bragança. It is classified by IGESPAR as a Site of Public Interest. Just north of the village, the castle was erected on a small, vegetation-covered hill called the Promenade de Mos, an ancient road axis in the region
39 Bridge Street is a building in Chester, Cheshire, England. It is recorded in the National Heritage List for England as a designated Grade I listed building, its major archaeological feature being the remains of a Roman hypocaust in its cellar. The building has four storeys, with a shop at street level and a portion of Chester Rows in the storey above
Church of the Holy Cross (Croatian: Crkva svetog Križa) is a Croatian Pre-Romanesque Catholic church originating from the 9th century in Nin. Description: According to a theory by the art historian Mladen Pejaković, the design has an intentionally unbalanced elliptical form designed to "follow" the position of the Sun, retaining the functionality of a calendar and sundial. Initially, in the time of the Croatian principality, it was used as a royal chapel of the duke's court nearby
Cowper House is a former town house at 12 Bridge Street, Chester, Cheshire, England. It is recorded in the National Heritage List for England as a designated Grade I listed building, and it incorporates a section of the Chester Rows. History: Cowper House was built in 1664, following the destruction of many buildings in Chester during the Civil War
Croatian Pre-Romanesque art and architecture, or Old Croatian art, is the Pre-Romanesque art and architecture of the Croats from their arrival in the Balkans until the end of the 11th century, when the Romanesque style began to dominate art; this was the time of the Croatian rulers (Croatian dukes and the Croatian Kingdom). Historical background: The Croats were an Iron Age nomadic culture and tended to avoid the Roman cities, instead settling in the countryside nearby (for example, the river island in the delta of the river Jadro beside the city of Salona). The Croats assimilated the Avars, accepted Christianity, and their ruling caste learned to speak and write Latin
Kudos: Rock Legend is a spinoff of the game Kudos. Unlike the previous Kudos game, however, this game allows the player to start their own rock/pop band. Plot: The player begins as a vocalist with ambitions of becoming a rock star, setting a personal goal of achieving this within five years
Buzz! Junior: Ace Racers is the fifth and latest game in the Buzz! Junior series of party games. It was developed by Cohort Studios and released in 2008 for the PlayStation 2. Reception: GamesRadar scored the game 3/5 and was happy with the variety of game types but criticised the racing mini-games and the "stuck record" announcer
Game Party Champions is the fifth video game in the Game Party series, and was a launch title for the Wii U console in North America and the PAL region. It is the successor to Game Party: In Motion. Gameplay: Game Party Champions has eight minigames to choose from: Ping Pong, Skill Ball, Water Gun, Mini Golf, Air Hockey, Hoop Shoot, Football, and Baseball
Angry Birds Blast (stylized as Angry Birds Blast!) is a free-to-play tile-matching puzzle game, developed by Bandai Namco Studios and published by Rovio Entertainment in 2016 as a spin-off from the Angry Birds franchise. Gameplay: As in other tile-matching games, balloons are cleared from the playing field in groups of at least two adjoining balloons of the same color. The game is divided into levels with specific goals, such as popping specific bird-shaped balloons or pigs, clearing bubbles from the field, or clearing the path for a hot air balloon to reach the top of the field
Bit.Trip Complete is a compilation of six games in the Bit.Trip series, including Bit
Blocktrix is a free, online, multiplayer puzzle game based on TetriNET, created in 2000 by StrikeLight. It was developed as an update to the official TetriNET 1.13 client after the original creator, St0rmCat, created a new version of the game, entitled TetriNET 2, which included major changes to the TetriNET client such as not allowing private servers
Variable bitrate (VBR) is a term used in telecommunications and computing that relates to the bitrate used in sound or video encoding. As opposed to constant bitrate (CBR), VBR files vary the amount of output data per time segment. VBR allows a higher bitrate (and therefore more storage space) to be allocated to the more complex segments of media files while less space is allocated to less complex segments
In coding theory, a variable-length code is a code which maps source symbols to a variable number of bits. The equivalent concept in computer science is bit string. Variable-length codes can allow sources to be compressed and decompressed with zero error (lossless data compression) and still be read back symbol by symbol
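The idea can be made concrete with a small Python sketch. The code table below is hypothetical, chosen only so that no codeword is a prefix of another; that prefix property is what lets the decoder read the bit string back symbol by symbol with zero error:

```python
# A hypothetical prefix-free (instantaneous) variable-length code for four
# source symbols; in practice shorter codewords would go to frequent symbols.
CODE = {"a": "0", "b": "10", "c": "110", "d": "111"}

def encode(symbols):
    """Concatenate the codeword of each source symbol into one bit string."""
    return "".join(CODE[s] for s in symbols)

def decode(bits):
    """Read the bit string back symbol by symbol; the prefix property
    guarantees a codeword is recognized as soon as its last bit arrives."""
    reverse = {v: k for k, v in CODE.items()}
    out, buf = [], ""
    for bit in bits:
        buf += bit
        if buf in reverse:          # no codeword is a prefix of another
            out.append(reverse[buf])
            buf = ""
    return "".join(out)

msg = "aabacd"
assert decode(encode(msg)) == msg   # lossless round trip
```

Assigning the shortest codewords to the most frequent symbols, as Huffman coding does, is what yields compression relative to a fixed-length code.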
Video is an electronic medium for the recording, copying, playback, broadcasting, and display of moving visual media. Video was first developed for mechanical television systems, which were quickly replaced by cathode-ray tube (CRT) systems which, in turn, were replaced by flat panel displays of several types. Video systems vary in display resolution, aspect ratio, refresh rate, color capabilities and other qualities
A video codec is software or hardware that compresses and decompresses digital video. In the context of video compression, codec is a portmanteau of encoder and decoder, while a device that only compresses is typically called an encoder, and one that only decompresses is a decoder. The compressed data format usually conforms to a standard video coding format
In the field of video compression, a video frame is compressed using different algorithms with different advantages and disadvantages, centered mainly on the amount of data compression. These different algorithms for video frames are called picture types or frame types. The three major picture types used in the different video algorithms are I, P and B
Warped linear predictive coding (warped LPC or WLPC) is a variant of linear predictive coding in which the spectral representation of the system is modified, for example by replacing the unit delays used in an LPC implementation with first-order all-pass filters. This can have advantages in reducing the bitrate required for a given level of perceived audio quality/intelligibility, especially in wideband audio coding. History: Warped LPC was first proposed in 1980 by Hans Werner Strube
In signal processing, white noise is a random signal having equal intensity at different frequencies, giving it a constant power spectral density. The term is used, with this or similar meanings, in many scientific and technical disciplines, including physics, acoustical engineering, telecommunications, and statistical forecasting. White noise refers to a statistical model for signals and signal sources, rather than to any specific signal
zoo is a data compression program and format developed by Rahul Dhesi in the mid-1980s. The format is based on the LZW compression algorithm, and compressed files are identified by the .zoo file extension
Delta encoding is a way of storing or transmitting data in the form of differences (deltas) between sequential data rather than complete files; more generally this is known as data differencing. Delta encoding is sometimes called delta compression, particularly where archival histories of changes are required
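A minimal sketch of the idea, with made-up sample values: store the first value and then only the successive differences, which are typically small and compress well:

```python
def delta_encode(values):
    """Store the first value, then differences between consecutive values."""
    return [values[0]] + [b - a for a, b in zip(values, values[1:])]

def delta_decode(deltas):
    """Rebuild the original sequence by a running sum of the deltas."""
    out = [deltas[0]]
    for d in deltas[1:]:
        out.append(out[-1] + d)
    return out

samples = [100, 101, 103, 103, 102, 105]
deltas = delta_encode(samples)      # [100, 1, 2, 0, -1, 3]
assert delta_decode(deltas) == samples
```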
In computing, the utility diff is a data comparison tool that computes and displays the differences between the contents of files. Unlike edit distance notions used for other purposes, diff is line-oriented rather than character-oriented, but it is like Levenshtein distance in that it tries to determine the smallest set of deletions and insertions to create one file from the other. The utility displays the changes in one of several standard formats, such that both humans and computers can parse the changes and use them for patching
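Python's standard library offers a comparable line-oriented comparison in `difflib`. This is not the GNU diff tool (difflib uses its own matching algorithm rather than the Myers algorithm), but it can emit the familiar unified format of deletions and insertions:

```python
import difflib

old = ["line one", "line two", "line three"]
new = ["line one", "line 2", "line three", "line four"]

# unified_diff yields a patch-style view: deletions prefixed '-', insertions '+'
diff = list(difflib.unified_diff(old, new,
                                 fromfile="a.txt", tofile="b.txt", lineterm=""))
print("\n".join(diff))
```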
xdelta is a command line program for delta encoding, which generates the difference between two files. This is similar to diff and patch, but it is targeted for binary files and does not generate human readable output. It was first released in 1997
Detection theory or signal detection theory is a means to measure the ability to differentiate between information-bearing patterns (called stimulus in living organisms, signal in machines) and random patterns that distract from the information (called noise, consisting of background stimuli and random activity of the detection machine and of the nervous system of the operator). In the field of electronics, signal recovery is the separation of such patterns from a disguising background. According to the theory, there are a number of determiners of how a detecting system will detect a signal, and where its threshold levels will be
Bell Laboratories Layer Space-Time (BLAST) is a transceiver architecture for offering spatial multiplexing over multiple-antenna wireless communication systems. Such systems have multiple antennas at both the transmitter and the receiver in an effort to exploit the many different paths between the two in a highly-scattering wireless environment. BLAST was developed by Gerard Foschini at Lucent Technologies' Bell Laboratories (now Nokia Bell Labs)
Location estimation in wireless sensor networks is the problem of estimating the location of an object from a set of noisy measurements. These measurements are acquired in a distributed manner by a set of sensors. Use: Many civilian and military applications require monitoring that can identify objects in a specific area, such as monitoring the front entrance of a private house with a single camera
Minimax (sometimes minmax, MM or saddle point) is a decision rule used in artificial intelligence, decision theory, game theory, statistics, and philosophy for minimizing the possible loss for a worst case (maximum loss) scenario. When dealing with gains, it is referred to as "maximin" – to maximize the minimum gain. Originally formulated for two-player zero-sum game theory, covering both the cases where players take alternate moves and those where they make simultaneous moves, it has also been extended to more complex games and to general decision-making in the presence of uncertainty
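A minimal sketch of the decision rule on a hand-made two-ply game tree (the payoffs are illustrative): the maximizer assumes the minimizer will answer each of its moves with the worst reply, and picks the move whose worst case is best:

```python
def minimax(node, maximizing):
    """Return the value of a game tree under optimal play by both sides.
    Leaves are payoffs to the maximizing player; internal nodes are lists."""
    if not isinstance(node, list):
        return node
    values = [minimax(child, not maximizing) for child in node]
    return max(values) if maximizing else min(values)

# Depth-2 tree: the maximizer moves first, then the minimizer replies.
tree = [[3, 12], [2, 8], [14, 1]]
# The minimizer would answer the three moves with 3, 2 and 1 respectively,
# so the maximizer's best guaranteed payoff is 3.
assert minimax(tree, True) == 3
```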
A receiver operating characteristic curve, or ROC curve, is a graphical plot that illustrates the diagnostic ability of a binary classifier system as its discrimination threshold is varied. The ROC curve is the plot of the true positive rate (TPR) against the false positive rate (FPR), at various threshold settings. The ROC can also be thought of as a plot of the power as a function of the Type I Error of the decision rule (when the performance is calculated from just a sample of the population, it can be thought of as estimators of these quantities)
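A sketch of how the ROC points arise, using a made-up set of classifier scores and labels: sweeping the decision threshold traces the curve from (0, 0), where nothing is called positive, to (1, 1), where everything is:

```python
def roc_points(scores, labels, thresholds):
    """(FPR, TPR) of the rule 'predict positive if score >= t' for each t."""
    pos = sum(labels)
    neg = len(labels) - pos
    pts = []
    for t in thresholds:
        tp = sum(1 for s, y in zip(scores, labels) if s >= t and y == 1)
        fp = sum(1 for s, y in zip(scores, labels) if s >= t and y == 0)
        pts.append((fp / neg, tp / pos))
    return pts

scores = [0.9, 0.8, 0.7, 0.4, 0.3, 0.1]
labels = [1,   1,   0,   1,   0,   0]
pts = roc_points(scores, labels, thresholds=[0.0, 0.5, 1.1])
assert pts[0] == (1.0, 1.0)   # threshold below all scores: everything positive
assert pts[-1] == (0.0, 0.0)  # threshold above all scores: nothing positive
```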
The sensitivity index or discriminability index or detectability index is a dimensionless statistic used in signal detection theory. A higher index indicates that the signal can be more readily detected. Definition: The discriminability index is the separation between the means of two distributions (typically the signal and the noise distributions), in units of the standard deviation
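Under the usual equal-variance Gaussian assumption, the definition reduces to a one-line formula; the means and standard deviation below are illustrative values, not data:

```python
def d_prime(mu_signal, mu_noise, sigma):
    """Separation of two equal-variance normal distributions, in sigma units."""
    return (mu_signal - mu_noise) / sigma

# Signal at mean 2, noise at mean 0, common standard deviation 1:
assert d_prime(2.0, 0.0, 1.0) == 2.0
# Halving the separation, or doubling the spread, halves detectability:
assert d_prime(1.0, 0.0, 1.0) == 1.0
assert d_prime(2.0, 0.0, 2.0) == 1.0
```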
The following terms are used by electrical engineers in statistical signal processing studies instead of typical statisticians' terms. In other engineering fields, particularly mechanical engineering, uncertainty analysis examines systematic and random components of variations in measurements associated with physical experiments
The total operating characteristic (TOC) is a statistical method to compare a Boolean variable versus a rank variable. TOC can measure the ability of an index variable to diagnose either presence or absence of a characteristic. The diagnosis of presence or absence depends on whether the value of the index is above a threshold
The Akaike information criterion (AIC) is an estimator of prediction error and thereby relative quality of statistical models for a given set of data. Given a collection of models for the data, AIC estimates the quality of each model, relative to each of the other models. Thus, AIC provides a means for model selection
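The criterion itself is a one-line formula, AIC = 2k − 2·ln(L̂); the parameter counts and log-likelihoods below are made-up numbers chosen only to show how it trades goodness of fit against model complexity:

```python
def aic(k, log_likelihood):
    """Akaike information criterion: 2k - 2*ln(L-hat), where k is the number
    of estimated parameters and L-hat the maximized likelihood. Lower is better."""
    return 2 * k - 2 * log_likelihood

# A model with one extra parameter must improve the log-likelihood by more
# than 1 nat to be preferred under AIC:
assert aic(k=3, log_likelihood=-100.0) < aic(k=2, log_likelihood=-101.5)
assert aic(k=3, log_likelihood=-100.0) > aic(k=2, log_likelihood=-100.5)
```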
In statistics, approximate entropy (ApEn) is a technique used to quantify the amount of regularity and the unpredictability of fluctuations in time-series data. For example, consider two series of data: Series A: (0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, …
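A compact (and unoptimized) sketch of the standard ApEn computation, using the Chebyshev distance between length-m templates; the choices m = 2 and r = 0.2 are conventional defaults, and the two test series echo the regular-versus-irregular contrast above:

```python
import math

def approx_entropy(series, m=2, r=0.2):
    """Approximate entropy ApEn(m, r) of a time series.
    Lower values mean more regular (more predictable) data."""
    n = len(series)

    def phi(m):
        templates = [series[i:i + m] for i in range(n - m + 1)]
        logs = []
        for t1 in templates:
            # Count templates within Chebyshev distance r (self-match included)
            c = sum(1 for t2 in templates
                    if max(abs(a - b) for a, b in zip(t1, t2)) <= r)
            logs.append(math.log(c / len(templates)))
        return sum(logs) / len(templates)

    return phi(m) - phi(m + 1)

periodic = [0, 1] * 32                                   # perfectly regular
irregular = [0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0] * 4
assert approx_entropy(periodic) < 0.01                   # nearly zero
assert approx_entropy(periodic) < approx_entropy(irregular)
```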
Ascendency or ascendancy is a quantitative attribute of an ecosystem, defined as a function of the ecosystem's trophic network. Ascendency is derived using mathematical tools from information theory. It is intended to capture in a single index the ability of an ecosystem to prevail against disturbance by virtue of its combined organization and size
In information theory, the binary entropy function, denoted H(p) or H_b(p), is defined as the entropy of a Bernoulli process with probability p of one of two values. It is a special case of H(X), the entropy function. Mathematically, the Bernoulli trial is modelled as a random variable X that can take on only two values: 0 and 1, which are mutually exclusive and exhaustive
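The function is H(p) = −p·log₂(p) − (1−p)·log₂(1−p), with the convention that H(0) = H(1) = 0. A direct sketch:

```python
import math

def binary_entropy(p):
    """H(p) = -p*log2(p) - (1-p)*log2(1-p), in bits; 0 at p = 0 and p = 1."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

assert binary_entropy(0.5) == 1.0          # a fair coin carries one full bit
assert binary_entropy(0.0) == 0.0          # a certain outcome carries none
assert abs(binary_entropy(0.1) - binary_entropy(0.9)) < 1e-12  # symmetric in p
```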
In information theory, the conditional entropy quantifies the amount of information needed to describe the outcome of a random variable Y given that the value of another random variable X is known. Here, information is measured in shannons, nats, or hartleys. The entropy of Y conditioned on X is written as H(Y|X)
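A sketch computing H(Y|X) = −Σ p(x,y)·log₂(p(x,y)/p(x)) from a joint distribution given as a table; the two example joints are the standard extreme cases:

```python
import math

def conditional_entropy(joint):
    """H(Y|X) in bits, from a joint distribution {(x, y): p(x, y)}."""
    px = {}
    for (x, _), p in joint.items():
        px[x] = px.get(x, 0.0) + p          # marginal p(x)
    return -sum(p * math.log2(p / px[x])
                for (x, y), p in joint.items() if p > 0)

# Y is a deterministic function of X: knowing X leaves no uncertainty about Y.
deterministic = {(0, 0): 0.5, (1, 1): 0.5}
assert conditional_entropy(deterministic) == 0.0

# X and Y are independent fair bits: knowing X tells us nothing, so 1 bit remains.
independent = {(x, y): 0.25 for x in (0, 1) for y in (0, 1)}
assert abs(conditional_entropy(independent) - 1.0) < 1e-12
```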
Differential entropy (also referred to as continuous entropy) is a concept in information theory that began as an attempt by Claude Shannon to extend the idea of (Shannon) entropy, a measure of the average surprisal of a random variable, to continuous probability distributions. Unfortunately, Shannon did not derive this formula, but rather just assumed it was the correct continuous analogue of discrete entropy; it is not. The actual continuous version of discrete entropy is the limiting density of discrete points (LDDP)
In various science/engineering applications, such as independent component analysis, image analysis, genetic analysis, speech recognition, manifold learning, and time delay estimation, it is useful to estimate the differential entropy of a system or process, given some observations. The simplest and most common approach uses histogram-based estimation, but other approaches have been developed and used, each with its own benefits and drawbacks. The main factor in choosing a method is often a trade-off between the bias and the variance of the estimate, although the nature of the (suspected) distribution of the data may also be a factor
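A sketch of the histogram approach: bin the samples, take the discrete entropy of the bin frequencies, and correct by the log of the bin width. The evenly spaced grid below stands in for a Uniform(0, 1) sample, whose true differential entropy is 0 nats:

```python
import math

def histogram_entropy(samples, bins, lo, hi):
    """Histogram estimate of differential entropy in nats:
    -sum p_k * ln(p_k) + ln(bin width), with p_k the fraction per bin."""
    width = (hi - lo) / bins
    counts = [0] * bins
    for x in samples:
        counts[min(int((x - lo) / width), bins - 1)] += 1
    n = len(samples)
    h_discrete = -sum(c / n * math.log(c / n) for c in counts if c)
    return h_discrete + math.log(width)

grid = [(i + 0.5) / 1000 for i in range(1000)]   # proxy for Uniform(0, 1)
assert abs(histogram_entropy(grid, bins=10, lo=0.0, hi=1.0)) < 1e-9
```

The bin count controls the bias/variance trade-off mentioned above: few wide bins smooth away detail, many narrow bins leave mostly-empty cells and a noisy estimate.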
The mathematical expressions for thermodynamic entropy in the statistical thermodynamics formulation established by Ludwig Boltzmann and J. Willard Gibbs in the 1870s are similar to the information entropy of Claude Shannon and Ralph Hartley, developed in the 1940s. Equivalence of form of the defining expressions: The defining expression for entropy in the theory of statistical mechanics established by Ludwig Boltzmann and J
A set of networks that satisfies given structural characteristics can be treated as a network ensemble. Introduced by Ginestra Bianconi in 2007, the entropy of a network ensemble measures the level of order or uncertainty of the ensemble. The entropy is the logarithm of the number of graphs
Exformation (originally spelled eksformation in Danish) is a term coined by the Danish science writer Tor Nørretranders in his book The User Illusion, published in English in 1998. It denotes explicitly discarded information. Example: Consider the following phrase: "the best horse at the race is number 7"
Inequalities are very important in the study of information theory. There are a number of different contexts in which these inequalities appear. Entropic inequalities: Consider a tuple X_1, X_2, …, X_n of n finitely (or at most countably) supported random variables on the same probability space
In information theory, the information content, self-information, surprisal, or Shannon information is a basic quantity derived from the probability of a particular event occurring from a random variable. It can be thought of as an alternative way of expressing probability, much like odds or log-odds, but which has particular mathematical advantages in the setting of information theory. The Shannon information can be interpreted as quantifying the level of "surprise" of a particular outcome
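The quantity is simply I(x) = −log₂ p(x) when measured in bits, so rarer events carry more surprisal:

```python
import math

def information_content(p):
    """Self-information (surprisal) in bits: I(x) = -log2 p(x)."""
    return -math.log2(p)

assert information_content(0.5) == 1.0    # a fair coin flip: 1 bit
assert information_content(0.25) == 2.0   # one of four equally likely: 2 bits
# Rarer events are more surprising:
assert information_content(0.01) > information_content(0.5)
```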
In decision tree learning, information gain ratio is the ratio of information gain to the intrinsic information. It was proposed by Ross Quinlan to reduce the bias towards multi-valued attributes by taking the number and size of branches into account when choosing an attribute. Information gain is also known as mutual information
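A sketch with a toy four-row dataset (the attribute values and labels are made up) showing why the ratio penalizes a many-valued attribute that raw information gain alone would rate just as highly:

```python
import math
from collections import Counter

def entropy(labels):
    n = len(labels)
    return -sum(c / n * math.log2(c / n) for c in Counter(labels).values())

def gain_ratio(attribute_values, labels):
    """Information gain of splitting on the attribute, divided by the
    split's intrinsic information (the entropy of the branch sizes)."""
    n = len(labels)
    branches = {}
    for a, y in zip(attribute_values, labels):
        branches.setdefault(a, []).append(y)
    gain = entropy(labels) - sum(len(b) / n * entropy(b)
                                 for b in branches.values())
    intrinsic = entropy(attribute_values)   # large for many small branches
    return gain / intrinsic

labels = [1, 1, 0, 0]
broad  = ["a", "b", "c", "d"]   # unique value per row: full gain, big penalty
binary = ["x", "x", "y", "y"]   # clean two-way split: full gain, no penalty
assert gain_ratio(binary, labels) > gain_ratio(broad, labels)
```

Both attributes achieve the maximum information gain of 1 bit here, but the four-valued split pays an intrinsic information of 2 bits against the binary split's 1, halving its ratio.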
In mathematical statistics, the Kullback–Leibler divergence (also called relative entropy and I-divergence), denoted D_KL(P ∥ Q), is a type of statistical distance: a measure of how one probability distribution P is different from a second, reference probability distribution Q. A simple interpretation of the KL divergence of P from Q is the expected excess surprise from using Q as a model when the actual distribution is P. While it is a distance, it is not a metric, the most familiar type of distance: it is not symmetric in the two distributions (in contrast to variation of information), and does not satisfy the triangle inequality
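A sketch of D_KL(P ∥ Q) = Σ p(x)·log₂(p(x)/q(x)) on made-up two-outcome distributions, checking the two properties just described, non-negativity and asymmetry:

```python
import math

def kl_divergence(p, q):
    """D_KL(P || Q) in bits, for distributions given as probability lists."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.5]
q = [0.9, 0.1]
assert kl_divergence(p, p) == 0.0                            # zero iff P = Q
assert kl_divergence(p, q) > 0                               # non-negative
assert abs(kl_divergence(p, q) - kl_divergence(q, p)) > 0.1  # not symmetric
```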
Landauer's principle is a physical principle pertaining to the lower theoretical limit of energy consumption of computation. It holds that an irreversible change in information stored in a computer, such as merging two computational paths, dissipates a minimum amount of heat to its surroundings. Statement: Landauer's principle states that the minimum energy needed to erase one bit of information is proportional to the temperature at which the system is operating
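Concretely, the limit is k_B·T·ln 2 per erased bit, which at room temperature is a few zeptojoules:

```python
import math

BOLTZMANN = 1.380649e-23  # J/K (exact value under the 2019 SI definition)

def landauer_limit(temperature_kelvin):
    """Minimum heat dissipated by erasing one bit: k_B * T * ln 2."""
    return BOLTZMANN * temperature_kelvin * math.log(2)

e = landauer_limit(300.0)          # about 2.87e-21 J at room temperature
assert 2.8e-21 < e < 2.9e-21
# The limit scales linearly with the operating temperature:
assert abs(landauer_limit(600.0) - 2 * e) < 1e-30
```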
In statistics and information theory, a maximum entropy probability distribution has entropy that is at least as great as that of all other members of a specified class of probability distributions. According to the principle of maximum entropy, if nothing is known about a distribution except that it belongs to a certain class (usually defined in terms of specified properties or measures), then the distribution with the largest entropy should be chosen as the least-informative default. The motivation is twofold: first, maximizing entropy minimizes the amount of prior information built into the distribution; second, many physical systems tend to move towards maximal entropy configurations over time
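A quick numerical illustration on four outcomes (the example distributions are made up): with no constraint beyond the support, the uniform distribution attains the maximum entropy of log₂ 4 = 2 bits, and any concentration of mass lowers it:

```python
import math

def entropy_bits(dist):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

uniform = [0.25, 0.25, 0.25, 0.25]
skewed  = [0.7, 0.1, 0.1, 0.1]
peaked  = [1.0, 0.0, 0.0, 0.0]
assert entropy_bits(uniform) == 2.0
assert entropy_bits(uniform) > entropy_bits(skewed) > entropy_bits(peaked)
```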
In mathematics, the mean (topological) dimension of a topological dynamical system is a non-negative extended real number that is a measure of the complexity of the system. Mean dimension was first introduced in 1999 by Gromov. Shortly afterwards, it was developed and studied systematically by Lindenstrauss and Weiss
In mathematics, a measure-preserving dynamical system is an object of study in the abstract formulation of dynamical systems, and ergodic theory in particular. Measure-preserving systems obey the Poincaré recurrence theorem, and are a special case of conservative systems. They provide the formal, mathematical basis for a broad range of physical systems, and, in particular, many systems from classical mechanics (in particular, most non-dissipative systems) as well as systems in thermodynamic equilibrium
A molecular demon, or biological molecular machine, is a biological macromolecule that resembles and seems to have the same properties as Maxwell's demon. These macromolecules gather information in order to recognize their substrate or ligand within a myriad of other molecules floating in the intracellular or extracellular plasm. This molecular recognition represents an information gain, which is equivalent to an energy gain or a decrease in entropy
In probability theory and information theory, the mutual information (MI) of two random variables is a measure of the mutual dependence between the two variables. More specifically, it quantifies the "amount of information" (in units such as shannons (bits), nats or hartleys) obtained about one random variable by observing the other random variable. The concept of mutual information is intimately linked to that of entropy of a random variable, a fundamental notion in information theory that quantifies the expected "amount of information" held in a random variable
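A sketch of I(X;Y) = Σ p(x,y)·log₂(p(x,y)/(p(x)·p(y))) on two extreme joint distributions: independent variables share no information, while perfectly correlated bits share one full bit:

```python
import math

def mutual_information(joint):
    """I(X;Y) in bits, from a joint distribution {(x, y): p(x, y)}."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

# Independent fair bits: observing one tells us nothing about the other.
independent = {(x, y): 0.25 for x in (0, 1) for y in (0, 1)}
assert mutual_information(independent) == 0.0

# Perfectly correlated bits: observing one fully determines the other.
copied = {(0, 0): 0.5, (1, 1): 0.5}
assert mutual_information(copied) == 1.0
```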
In information theory and statistics, negentropy is used as a measure of distance to normality. The concept and phrase "negative entropy" was introduced by Erwin Schrödinger in his 1944 popular-science book What is Life? Later, Léon Brillouin shortened the phrase to negentropy. In 1974, Albert Szent-Györgyi proposed replacing the term negentropy with syntropy
Entropy is considered to be an extensive property, i.e., its value depends on the amount of material present
The partition function or configuration integral, as used in probability theory, information theory and dynamical systems, is a generalization of the definition of a partition function in statistical mechanics. It is a special case of a normalizing constant in probability theory, for the Boltzmann distribution. The partition function occurs in many problems of probability theory because, in situations where there is a natural symmetry, its associated probability measure, the Gibbs measure, has the Markov property
In information theory, perplexity is a measurement of how well a probability distribution or probability model predicts a sample. It may be used to compare probability models. A low perplexity indicates the probability distribution is good at predicting the sample
https://huggingface.co/datasets/fmars/wiki_stem
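Perplexity on a sample can be computed as the exponentiated average negative log-probability the model assigns to each observed outcome. A short sketch (the probability lists are made-up examples):

```python
import math

def perplexity(model_probs):
    """Perplexity of a model on a sample, given the probability the model
    assigned to each observed outcome."""
    n = len(model_probs)
    avg_neg_log = -sum(math.log2(p) for p in model_probs) / n
    return 2 ** avg_neg_log

# A model that assigns probability 1/4 to every observed token has perplexity 4,
# as if it were choosing uniformly among four options:
uniform_case = perplexity([0.25, 0.25, 0.25, 0.25])
confident_case = perplexity([0.9, 0.8, 0.95])
```

The more confident model scores a lower perplexity, matching the statement that low perplexity indicates good prediction of the sample.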
In statistics, probability theory and information theory, pointwise mutual information (PMI), or point mutual information, is a measure of association. It compares the probability of two events occurring together to what this probability would be if the events were independent. PMI (especially in its positive pointwise mutual information variant) has been described as "one of the most important concepts in NLP", where it "draws on the intuition that the best way to weigh the association between two words is to ask how much more the two words co-occur in [a] corpus than we would have a priori expected them to appear by chance
https://huggingface.co/datasets/fmars/wiki_stem
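PMI follows directly from its definition as the log-ratio of the observed co-occurrence probability to the probability expected under independence. A sketch with illustrative probabilities (not corpus-derived):

```python
import math

def pmi(p_xy, p_x, p_y):
    """Pointwise mutual information in bits: log2 of p(x,y) / (p(x) * p(y))."""
    return math.log2(p_xy / (p_x * p_y))

def ppmi(p_xy, p_x, p_y):
    """Positive PMI: negative associations are clipped to zero."""
    return max(pmi(p_xy, p_x, p_y), 0.0)

# Two words co-occurring four times as often as independence predicts: +2 bits.
assoc = pmi(0.04, 0.1, 0.1)
# Co-occurring exactly as often as chance predicts: 0 bits.
indep = pmi(0.01, 0.1, 0.1)
```

This matches the intuition quoted above: the score weighs how much more two events co-occur than expected a priori.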
The principle of maximum caliber (MaxCal) or maximum path entropy principle, suggested by E. T. Jaynes, can be considered as a generalization of the principle of maximum entropy
https://huggingface.co/datasets/fmars/wiki_stem
The principle of maximum entropy states that the probability distribution which best represents the current state of knowledge about a system is the one with largest entropy, in the context of precisely stated prior data (such as a proposition that expresses testable information). Another way of stating this: Take precisely stated prior data or testable information about a probability distribution function. Consider the set of all trial probability distributions that would encode the prior data
https://huggingface.co/datasets/fmars/wiki_stem
In information theory, the Rényi entropy is a quantity that generalizes various notions of entropy, including Hartley entropy, Shannon entropy, collision entropy, and min-entropy. The Rényi entropy is named after Alfréd Rényi, who looked for the most general way to quantify information while preserving additivity for independent events. In the context of fractal dimension estimation, the Rényi entropy forms the basis of the concept of generalized dimensions
https://huggingface.co/datasets/fmars/wiki_stem
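The entropies the Rényi entropy generalizes fall out of one formula, H_α = log2(Σ p_i^α) / (1 − α), with the named special cases at α = 0 (Hartley), α = 2 (collision), and α → ∞ (min-entropy). A sketch with an arbitrary example distribution:

```python
import math

def renyi_entropy(probs, alpha):
    """Renyi entropy of order alpha, in bits (alpha >= 0, alpha != 1)."""
    support = [p for p in probs if p > 0]
    if alpha == 0:                    # Hartley entropy: log of the support size
        return math.log2(len(support))
    if math.isinf(alpha):             # min-entropy: -log of the largest probability
        return -math.log2(max(support))
    return math.log2(sum(p ** alpha for p in support)) / (1 - alpha)

p = [0.5, 0.25, 0.25]
hartley   = renyi_entropy(p, 0)
collision = renyi_entropy(p, 2)
min_ent   = renyi_entropy(p, math.inf)
```

The values are nonincreasing in α, so Hartley entropy bounds collision entropy, which in turn bounds min-entropy; Shannon entropy is recovered as the limit α → 1.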
In mathematics, the topological entropy of a topological dynamical system is a nonnegative extended real number that is a measure of the complexity of the system. Topological entropy was first introduced in 1965 by Adler, Konheim and McAndrew. Their definition was modelled after the definition of the Kolmogorov–Sinai, or metric entropy
https://huggingface.co/datasets/fmars/wiki_stem
In probability theory and information theory, the variation of information or shared information distance is a measure of the distance between two clusterings (partitions of elements). It is closely related to mutual information; indeed, it is a simple linear expression involving the mutual information. Unlike the mutual information, however, the variation of information is a true metric, in that it obeys the triangle inequality
https://huggingface.co/datasets/fmars/wiki_stem
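The linear expression mentioned above is VI(X;Y) = H(X) + H(Y) − 2·I(X;Y), computable directly from two label assignments. A sketch (the toy clusterings are illustrative):

```python
import math
from collections import Counter

def variation_of_information(labels_a, labels_b):
    """VI in bits between two partitions, given as per-element cluster labels."""
    n = len(labels_a)
    pa, pb = Counter(labels_a), Counter(labels_b)
    joint = Counter(zip(labels_a, labels_b))
    h_a = -sum(c / n * math.log2(c / n) for c in pa.values())
    h_b = -sum(c / n * math.log2(c / n) for c in pb.values())
    mi = sum(c / n * math.log2((c / n) / ((pa[a] / n) * (pb[b] / n)))
             for (a, b), c in joint.items())
    return h_a + h_b - 2 * mi

# The same partition under different label names is at distance zero:
identical = variation_of_information([0, 0, 1, 1], [1, 1, 0, 0])
# Two maximally disagreeing balanced bipartitions of four elements:
different = variation_of_information([0, 0, 1, 1], [0, 1, 0, 1])
```

Being a true metric, the distance is zero exactly when the partitions coincide, regardless of how clusters are labelled.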
In information theory and coding theory with applications in computer science and telecommunication, error detection and correction (EDAC) or error control are techniques that enable reliable delivery of digital data over unreliable communication channels. Many communication channels are subject to channel noise, and thus errors may be introduced during transmission from the source to a receiver. Error detection techniques allow detecting such errors, while error correction enables reconstruction of the original data in many cases
https://huggingface.co/datasets/fmars/wiki_stem
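A classic concrete instance of error correction is the Hamming(7,4) code: three parity bits protect four data bits, and recomputing the parities at the receiver locates (and so fixes) any single flipped bit. A self-contained sketch:

```python
def hamming74_encode(d):
    """Encode 4 data bits into 7 bits; the result survives any single bit flip."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]   # standard parity-bit positions 1, 2, 4

def hamming74_correct(codeword):
    """Locate a single-bit error via the syndrome, fix it, return the data bits."""
    c = codeword[:]
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3       # 0 = clean; else the 1-based error position
    if syndrome:
        c[syndrome - 1] ^= 1
    return [c[2], c[4], c[5], c[6]]

word = hamming74_encode([1, 0, 1, 1])
corrupted = word[:]
corrupted[5] ^= 1                          # simulate one bit flipped in transit
recovered = hamming74_correct(corrupted)
```

Each parity bit covers a distinct subset of positions, so the three recomputed checks read out the error position in binary.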
In data networking, telecommunications, and computer buses, an acknowledgment (ACK) is a signal that is passed between communicating processes, computers, or devices to signify acknowledgment, or receipt of message, as part of a communications protocol. The negative-acknowledgement (NAK or NACK) is a signal that is sent to reject a previously received message or to indicate some kind of error. Acknowledgments and negative acknowledgments inform a sender of the receiver's state so that it can adjust its own state accordingly
https://huggingface.co/datasets/fmars/wiki_stem
AN codes are error-correcting codes used in arithmetic applications. Arithmetic codes were commonly used in computer processors to ensure the accuracy of arithmetic operations when electronics were less reliable. Arithmetic codes help the processor detect when an error is made and correct it
https://huggingface.co/datasets/fmars/wiki_stem
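The defining idea of an AN code is that every valid codeword is A times the operand N, so a residue check modulo A survives addition: A·n₁ + A·n₂ = A·(n₁ + n₂). A minimal sketch (A = 3 is an arbitrary illustrative choice of check base):

```python
A = 3   # check constant; any valid codeword must be a multiple of A

def an_encode(n):
    """Encode operand n as the codeword A * n."""
    return A * n

def an_check(word):
    """A word that is not a multiple of A signals an arithmetic or transfer error."""
    return word % A == 0

# The check is preserved under addition of encoded operands:
s = an_encode(5) + an_encode(7)   # equals an_encode(12)
ok = an_check(s)
bad = an_check(s + 1)             # a single-unit fault breaks divisibility by A
```

Detecting which errors, and correcting them, depends on the choice of A; the sketch shows only the divisibility check that all AN codes share.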
A meteorological observation at a given place can be inaccurate for a variety of reasons, such as a hardware defect. Quality control can help spot which meteorological observations are inaccurate. One of the main automated quality control programs used today in the area of meteorological observations is the meteorological assimilation data ingest system (MADIS)
https://huggingface.co/datasets/fmars/wiki_stem
Automatic repeat request (ARQ), also known as automatic repeat query, is an error-control method for data transmission that uses acknowledgements (messages sent by the receiver indicating that it has correctly received a message) and timeouts (specified periods of time allowed to elapse before an acknowledgment is to be received) to achieve reliable data transmission over an unreliable communication channel. ARQ is appropriate if the communication channel has varying or unknown capacity. If the sender does not receive an acknowledgment before the timeout, it re-transmits the message until it receives an acknowledgment or exceeds a predefined number of retransmissions
https://huggingface.co/datasets/fmars/wiki_stem
The Berlekamp–Massey algorithm is an algorithm that will find the shortest linear-feedback shift register (LFSR) for a given binary output sequence. The algorithm will also find the minimal polynomial of a linearly recurrent sequence in an arbitrary field. The field requirement means that the Berlekamp–Massey algorithm requires all non-zero elements to have a multiplicative inverse
https://huggingface.co/datasets/fmars/wiki_stem
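Over GF(2) the algorithm reduces to XOR arithmetic: it tracks a connection polynomial, measures the "discrepancy" between the predicted and actual next bit, and lengthens the register only when forced to. A sketch returning just the minimal LFSR length (linear complexity):

```python
def berlekamp_massey(bits):
    """Length of the shortest LFSR generating the given GF(2) bit sequence."""
    n_bits = len(bits)
    c = [1] + [0] * n_bits        # current connection polynomial
    b = [1] + [0] * n_bits        # polynomial saved at the last length change
    L, m = 0, 1
    for n in range(n_bits):
        # Discrepancy: XOR of the LFSR's predicted bit with the actual bit.
        d = bits[n]
        for i in range(1, L + 1):
            d ^= c[i] & bits[n - i]
        if d == 0:
            m += 1
        else:
            t = c[:]
            for i in range(n_bits + 1 - m):
                c[i + m] ^= b[i]  # correct the prediction using the saved polynomial
            if 2 * L <= n:
                L, b, m = n + 1 - L, t, 1
            else:
                m += 1
    return L

# An alternating sequence obeys s[n] = s[n-2], so its linear complexity is 2:
complexity = berlekamp_massey([1, 0, 1, 0, 1, 0])
```

The full algorithm also yields the connection polynomial `c` itself, which is the minimal polynomial mentioned above; only the length is surfaced here for brevity.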
In mathematics and electronics engineering, a binary Golay code is a type of linear error-correcting code used in digital communications. The binary Golay code, along with the ternary Golay code, has a particularly deep and interesting connection to the theory of finite sporadic groups in mathematics. These codes are named in honor of Marcel J. E. Golay
https://huggingface.co/datasets/fmars/wiki_stem
A bipolar violation, bipolarity violation, or BPV, is a violation of the bipolar encoding rules where two pulses of the same polarity occur without an intervening pulse of the opposite polarity. This indicates an error in the transmission of the signal. T-carrier and E-carrier signals are transmitted using a scheme called bipolar encoding
https://huggingface.co/datasets/fmars/wiki_stem
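The rule is simple to check in software: scan the pulse stream and flag any nonzero pulse that repeats the polarity of the previous nonzero pulse. A sketch over an illustrative pulse train:

```python
def find_bipolar_violations(pulses):
    """Indices of pulses that repeat the polarity of the previous mark pulse.
    Pulses are +1 or -1 for marks and 0 for spaces."""
    violations = []
    last = 0
    for i, p in enumerate(pulses):
        if p != 0:
            if last != 0 and p == last:
                violations.append(i)     # same polarity twice in a row: a BPV
            last = p
    return violations

# The +1 at index 5 repeats the +1 at index 3 with no -1 in between:
errs = find_bipolar_violations([+1, 0, -1, +1, 0, +1, -1])
```

In practice some line codes (e.g. B8ZS) insert deliberate violations in known patterns, so a receiver distinguishes those from genuine transmission errors; the sketch only detects the raw rule breach.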
Casting out nines is any of three arithmetical procedures: Adding the decimal digits of a positive whole number, while optionally ignoring any 9s or digits which sum to a multiple of 9. The result of this procedure is a number which is smaller than the original whenever the original has more than one digit, leaves the same remainder as the original after division by nine, and may be obtained from the original by subtracting a multiple of 9 from it. The name of the procedure derives from this latter property
https://huggingface.co/datasets/fmars/wiki_stem
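Because the digit-sum procedure preserves the remainder modulo nine, it can sanity-check arithmetic: the check value of a product must equal the product of the operands' check values, reduced the same way. A sketch (the operand pair is an arbitrary worked example):

```python
def cast_out_nines(n):
    """Repeatedly sum decimal digits until one digit remains.
    For n > 0 the result equals n mod 9, with 9 standing in for remainder 0."""
    while n > 9:
        n = sum(int(d) for d in str(n))
    return n

a, b = 3264, 8415
# Check a multiplication: reduce the operands first, multiply, reduce again...
product_check = cast_out_nines(cast_out_nines(a) * cast_out_nines(b))
# ...and compare with the reduction of the full product:
true_check = cast_out_nines(a * b)
```

A mismatch proves an arithmetic error; a match does not prove correctness, since any error that shifts the result by a multiple of nine goes undetected.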
A check digit is a form of redundancy check used for error detection on identification numbers, such as bank account numbers, which are used in an application where they will at least sometimes be input manually. It is analogous to a binary parity bit used to check for errors in computer-generated data. It consists of one or more digits (or letters) computed by an algorithm from the other digits (or letters) in the sequence input
https://huggingface.co/datasets/fmars/wiki_stem
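One widely used check-digit scheme is the Luhn algorithm (used for payment card numbers): double every second digit from the right, fold two-digit results back to one, and choose the check digit that makes the total a multiple of ten. A sketch:

```python
def luhn_check_digit(digits):
    """Compute the Luhn check digit for a string of decimal digits."""
    total = 0
    for i, ch in enumerate(reversed(digits)):
        d = int(ch)
        if i % 2 == 0:        # double every second digit, starting from the right
            d *= 2
            if d > 9:
                d -= 9        # fold 10..18 back to a single digit
        total += d
    return (10 - total % 10) % 10

def luhn_valid(number_with_check):
    """Validate a number whose final digit is its Luhn check digit."""
    return luhn_check_digit(number_with_check[:-1]) == int(number_with_check[-1])

digit = luhn_check_digit("7992739871")   # the commonly cited worked example
```

The scheme catches any single mistyped digit and most transpositions of adjacent digits, which are the dominant manual-input errors the entry describes.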
A confidential incident reporting system is a mechanism which allows problems in safety-critical fields such as aviation and medicine to be reported in confidence. This allows events to be reported which otherwise might not be reported through fear of blame or reprisals against the reporter. Analysis of the reported incidents can provide insight into how those events occurred, which can spur the development of measures to make the system safer
https://huggingface.co/datasets/fmars/wiki_stem
In coding theory, a constant-weight code, also called an m-of-n code, is an error detection and correction code where all codewords share the same Hamming weight. The one-hot code and the balanced code are two widely used kinds of constant-weight code. The theory is closely connected to that of designs (such as t-designs and Steiner systems)
https://huggingface.co/datasets/fmars/wiki_stem
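An m-of-n code is easy to enumerate: choose which m of the n bit positions carry a one. The classic 2-of-5 code yields exactly ten codewords, one per decimal digit, and any single bit flip changes the weight and is therefore detected. A sketch:

```python
from itertools import combinations

def m_of_n_codewords(m, n):
    """All n-bit words of Hamming weight exactly m (an m-of-n constant-weight code)."""
    return [[1 if i in ones else 0 for i in range(n)]
            for ones in combinations(range(n), m)]

# 2-of-5: C(5,2) = 10 codewords, enough to represent the ten decimal digits.
code = m_of_n_codewords(2, 5)

# A single bit flip changes the weight away from 2, so it is always detected:
flipped = code[0][:]
flipped[0] ^= 1
error_detected = sum(flipped) != 2
```

Constant weight detects all single-bit errors but cannot correct them, which is why such codes are typically used where detection alone suffices.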
In telecommunication, a convolutional code is a type of error-correcting code that generates parity symbols via the sliding application of a boolean polynomial function to a data stream. The sliding application represents the 'convolution' of the encoder over the data, which gives rise to the term 'convolutional coding'. The sliding nature of the convolutional codes facilitates trellis decoding using a time-invariant trellis
https://huggingface.co/datasets/fmars/wiki_stem
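The sliding (convolving) encoder can be sketched with a tiny shift register: each input bit, together with the previous bits held in the register, is combined under each generator polynomial to emit one parity bit per generator. Below is a rate-1/2, constraint-length-3 example with the common generators 111 and 101 (an illustrative choice, not tied to any particular standard):

```python
def conv_encode(bits, g1=(1, 1, 1), g2=(1, 0, 1)):
    """Rate-1/2 convolutional encoder: two generators slide over the bit stream."""
    state = [0, 0]                       # the two most recent input bits
    out = []
    for b in bits:
        window = [b] + state             # current bit plus register contents
        out.append(sum(x & g for x, g in zip(window, g1)) % 2)
        out.append(sum(x & g for x, g in zip(window, g2)) % 2)
        state = [b, state[0]]            # shift the register
    return out

encoded = conv_encode([1, 0, 1, 1])      # 4 input bits -> 8 output bits
```

Each output bit depends on the current and two previous inputs, which is exactly the memory that a time-invariant trellis (and hence Viterbi-style trellis decoding) exploits.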
Cosine error occurs in measuring instrument readings when the user of an instrument does not realize that the vector that an instrument is measuring does not coincide with the vector that the user wishes to measure. Often the lack of coincidence is subtle (with vectors almost coinciding), which is why the user does not notice it (or notices but fails to appreciate its importance). A simple example is taking a measurement across a rectangle but failing to realize that the line of measurement is not quite parallel with the edges, being slightly diagonal
https://huggingface.co/datasets/fmars/wiki_stem
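The size of the effect follows from trigonometry: a measurement line tilted by angle θ reads the true length divided by cos θ, so the error grows only quadratically for small angles, which is why it is easy to miss. A sketch with illustrative numbers:

```python
import math

def cosine_error(true_length, misalignment_deg):
    """Extra length read when the measurement line is tilted by the given angle."""
    measured = true_length / math.cos(math.radians(misalignment_deg))
    return measured - true_length

# A 2-degree tilt across a 100 mm span overstates the length by only ~0.06 mm:
err = cosine_error(100.0, 2.0)
```

At small angles the overshoot is roughly true_length·θ²/2 (θ in radians), so halving the misalignment cuts the error by about a factor of four.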
Crew resource management or cockpit resource management (CRM) is a set of training procedures for use in environments where human error can have devastating effects. CRM is primarily used for improving aviation safety and focuses on interpersonal communication, leadership, and decision making in aircraft cockpits. Its founder is David Beaty, a former Royal Air Force and BOAC pilot who wrote "The Human Factor in Aircraft Accidents" (1969)
https://huggingface.co/datasets/fmars/wiki_stem
In the compact disc system, cross-interleaved Reed–Solomon code (CIRC) provides error detection and error correction. CIRC adds one redundant parity byte to every three data bytes. Reed–Solomon codes are specifically useful in combating mixtures of random and burst errors
https://huggingface.co/datasets/fmars/wiki_stem
Data Integrity Field (DIF) is an approach to protect data integrity in computer data storage from data corruption. It was proposed in 2003 by the T10 subcommittee of the International Committee for Information Technology Standards. A similar approach for data integrity was added in 2016 to the NVMe 1
https://huggingface.co/datasets/fmars/wiki_stem
Data scrubbing is an error correction technique that uses a background task to periodically inspect main memory or storage for errors, then corrects detected errors using redundant data in the form of checksums or copies of the data. Data scrubbing reduces the likelihood that single correctable errors will accumulate, thereby reducing the risk of uncorrectable errors. Data integrity is a high-priority concern in the writing, reading, storage, transmission, and processing of computer data in operating systems and in storage and data transmission systems
https://huggingface.co/datasets/fmars/wiki_stem
Adrenocorticotropic hormone (ACTH; also adrenocorticotropin, corticotropin) is a polypeptide tropic hormone produced by and secreted by the anterior pituitary gland. It is also used as a medication and diagnostic agent. ACTH is an important component of the hypothalamic-pituitary-adrenal axis and is often produced in response to biological stress (along with its precursor corticotropin-releasing hormone from the hypothalamus)
https://huggingface.co/datasets/fmars/wiki_stem
Amfepramone, also known as diethylpropion, is a stimulant drug of the phenethylamine, amphetamine, and cathinone classes that is used as an appetite suppressant. It is used in the short-term management of obesity, along with dietary and lifestyle changes. Amfepramone has a similar chemical structure to the antidepressant and smoking cessation aid bupropion (previously called amfebutamone), which has also been developed as a weight-loss medicine when in a combination product with naltrexone
https://huggingface.co/datasets/fmars/wiki_stem
Amiloride, sold under the trade name Midamor among others, is a medication typically used with other medications to treat high blood pressure or swelling due to heart failure or cirrhosis of the liver. Amiloride is classified as a potassium-sparing diuretic. Amiloride is often used together with another diuretic, such as a thiazide or loop diuretic
https://huggingface.co/datasets/fmars/wiki_stem
Aminoglutethimide (AG), sold under the brand names Elipten, Cytadren, and Orimeten among others, is a medication which has been used in the treatment of seizures, Cushing's syndrome, breast cancer, and prostate cancer, among other indications. It has also been used by bodybuilders, athletes, and other men for muscle-building and performance- and physique-enhancing purposes. AG is taken by mouth three or four times per day
https://huggingface.co/datasets/fmars/wiki_stem
Amphetamine (contracted from alpha-methylphenethylamine) is a central nervous system (CNS) stimulant that is used in the treatment of attention deficit hyperactivity disorder (ADHD), narcolepsy, and obesity. Amphetamine was discovered in 1887 and exists as two enantiomers: levoamphetamine and dextroamphetamine. Amphetamine properly refers to a specific chemical, the racemic free base, which is equal parts of the two enantiomers in their pure amine forms
https://huggingface.co/datasets/fmars/wiki_stem
Anastrozole, sold under the brand name Arimidex among others, is a medication used in addition to other treatments for breast cancer. Specifically it is used for hormone receptor-positive breast cancer. It has also been used to prevent breast cancer in those at high risk
https://huggingface.co/datasets/fmars/wiki_stem