Stochastic Petri nets are a form of Petri net where the transitions fire after a probabilistic delay determined by a random variable. A stochastic Petri net is a five-tuple SPN = (P, T, F, M0, Λ) where: P is a set of states, called places; T is a set of transitions; F ⊆ (P × T) ∪ (T × P) is the set of arcs (the flow relation); M0 is the initial marking; and Λ is the set of firing rates associated with the transitions
https://huggingface.co/datasets/fmars/wiki_stem
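The five-tuple definition above can be sketched as a small simulation. This is an illustrative sketch only (the transition names, rates, and the dict-based representation are assumptions, not from the text): under the common race semantics, every enabled transition samples an exponentially distributed delay from its rate in Λ, and the transition with the smallest delay fires.

```python
import random

# Minimal stochastic Petri net sketch: each transition has input/output
# places and a firing rate lambda. Under race semantics, every enabled
# transition samples a delay Exp(lambda); the fastest one fires.

def enabled(transition, marking):
    """A transition is enabled when every input place holds a token."""
    return all(marking[p] > 0 for p in transition["inputs"])

def fire_next(transitions, marking, rng):
    """Sample delays for enabled transitions and fire the fastest one."""
    candidates = [t for t in transitions if enabled(t, marking)]
    if not candidates:
        return None, marking
    delays = [(rng.expovariate(t["rate"]), t) for t in candidates]
    _, t = min(delays, key=lambda x: x[0])
    new_marking = dict(marking)
    for p in t["inputs"]:
        new_marking[p] -= 1          # consume tokens from input places
    for p in t["outputs"]:
        new_marking[p] += 1          # produce tokens in output places
    return t["name"], new_marking

transitions = [
    {"name": "t1", "inputs": ["p1"], "outputs": ["p2"], "rate": 2.0},
    {"name": "t2", "inputs": ["p1"], "outputs": ["p3"], "rate": 1.0},
]
marking = {"p1": 1, "p2": 0, "p3": 0}
fired, marking = fire_next(transitions, marking, random.Random(0))
```

Whichever transition wins the race, exactly one token moves out of p1, which is the defining token-game behaviour.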
In computer science, synchronization refers to one of two distinct but related concepts: synchronization of processes, and synchronization of data. Process synchronization refers to the idea that multiple processes are to join up or handshake at a certain point, in order to reach an agreement or commit to a certain sequence of action. Data synchronization refers to the idea of keeping multiple copies of a dataset in coherence with one another, or to maintain data integrity
Temporal logic of actions (TLA) is a logic developed by Leslie Lamport, which combines temporal logic with a logic of actions. It is used to describe behaviours of concurrent and distributed systems. It is the logic underlying the specification language TLA+
A routing algorithm decides the path followed by a packet from the source to destination routers in a network. An important aspect to be considered while designing a routing algorithm is avoiding a deadlock. Turn restriction routing is a routing algorithm for mesh-family of topologies which avoids deadlocks by restricting the types of turns that are allowed in the algorithm while determining the route from source node to destination node in a network
In computer science, unbounded nondeterminism or unbounded indeterminacy is a property of concurrency by which the amount of delay in servicing a request can become unbounded as a result of arbitration of contention for shared resources while still guaranteeing that the request will eventually be serviced. Unbounded nondeterminism became an important issue in the development of the denotational semantics of concurrency, and later became part of research into the theoretical concept of hypercomputation. Discussion of unbounded nondeterminism tends to get involved with discussions of fairness
A vector addition system (VAS) is one of several mathematical modeling languages for the description of distributed systems. Vector addition systems were introduced by Richard M. Karp and Raymond E. Miller in 1969
Database theory encapsulates a broad range of topics related to the study and research of the theoretical realm of databases and database management systems. Theoretical aspects of data management include, among other areas, the foundations of query languages, computational complexity and expressive power of queries, finite model theory, database design theory, dependency theory, foundations of concurrency control and database recovery, deductive databases, temporal and spatial databases, real-time databases, managing uncertain data and probabilistic databases, and Web data. Most research work has traditionally been based on the relational model, since this model is usually considered the simplest and most foundational model of interest
In database theory, the analytical base table (ABT) is a flat table that is used for building analytical models and scoring (predicting) the future behavior of a subject. A single record in this table represents the subject of the prediction (e. g
The chase is a simple fixed-point algorithm testing and enforcing implication of data dependencies in database systems. It plays important roles in database theory as well as in practice. It is used, directly or indirectly, on an everyday basis by people who design databases, and it is used in commercial systems to reason about the consistency and correctness of a data design
Codd's twelve rules are a set of thirteen rules (numbered zero to twelve) proposed by Edgar F. Codd, a pioneer of the relational model for databases, designed to define what is required from a database management system in order for it to be considered relational, i.e., a relational database management system (RDBMS)
In database theory, a conjunctive query is a restricted form of first-order queries using the logical conjunction operator. Many first-order queries can be written as conjunctive queries. In particular, a large part of queries issued on relational databases can be expressed in this way
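A conjunctive query can be evaluated directly by nested iteration with the logical conjunction expressed as join conditions. A minimal sketch (the relations R and S and their contents are made-up toy data): the query q(x, z) :- R(x, y), S(y, z) asks for pairs connected through a shared y value.

```python
# Toy relations for the conjunctive query q(x, z) :- R(x, y), S(y, z).
R = {("a", 1), ("b", 2)}
S = {(1, "x"), (2, "y"), (3, "z")}

def q(R, S):
    """Join R and S on the shared variable y, projecting onto (x, z)."""
    return {(x, z) for (x, y) in R for (y2, z) in S if y == y2}

result = q(R, S)
```

The same query in SQL would be a plain equality join, which is why a large part of relational queries fall into this class.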
In computer programming contexts, a data cube (or datacube) is a multi-dimensional ("n-D") array of values. Typically, the term data cube is applied in contexts where these arrays are massively larger than the hosting computer's main memory; examples include multi-terabyte/petabyte data warehouses and time series of image data. The data cube is used to represent data (sometimes called facts) along some dimensions of interest
A data event is a relevant state transition defined in an event schema. Typically, event schemata are described by pre- and post condition for a single or a set of data items. In contrast to ECA (Event condition action), which considers an event to be a signal, the data event not only refers to the change (signal), but describes specific state transitions, which are referred to in ECA as conditions
A data item describes an atomic state of a particular object concerning a specific property at a certain time point. A collection of data items for the same object at the same time forms an object instance (or table row). Any type of complex information can be broken down to elementary data items (atomic state)
Database design is the organization of data according to a database model. The designer determines what data must be stored and how the data elements interrelate. With this information, they can begin to fit the data to the database model
A database dump contains a record of the table structure and/or the data from a database and is usually in the form of a list of SQL statements ("SQL dump"). A database dump is most often used for backing up a database so that its contents can be restored in the event of data loss. Corrupted databases can often be recovered by analysis of the dump
The problem of database repair is a question about relational databases which has been studied in database theory, and which is a particular kind of data cleansing. The problem asks how we can "repair" an input relational database in order to make it satisfy integrity constraints. The goal of the problem is to be able to work with data that is "dirty", i.e., data that violates integrity constraints
Database tables and indexes may be stored on disk in one of a number of forms, including ordered/unordered flat files, ISAM, heap files, hash buckets, or B+ trees. Each form has its own particular advantages and disadvantages. The most commonly used forms are B-trees and ISAM
In relational database theory, an embedded dependency (ED) is a certain kind of constraint on a relational database. It is the most general type of constraint used in practice, including both tuple-generating dependencies and equality-generating dependencies. Embedded dependencies can express functional dependencies, join dependencies, multivalued dependencies, inclusion dependencies, foreign key dependencies, and many more besides
An entity–attribute–value model (EAV) is a data model optimized for the space-efficient storage of sparse—or ad-hoc—property or data values, intended for situations where runtime usage patterns are arbitrary, subject to user variation, or otherwise unforeseeable using a fixed design. The use-case targets applications which offer a large or rich system of defined property types, which are in turn appropriate to a wide set of entities, but where typically only a small, specific selection of these are instantiated (or persisted) for a given entity. Therefore, this type of data model relates to the mathematical notion of a sparse matrix
In relational database theory, an equality-generating dependency (EGD) is a certain kind of constraint on data. It is a subclass of the class of embedded dependencies (ED). An algorithm known as the chase takes as input an instance that may or may not satisfy a set of EGDs (or, more generally, a set of EDs), and, if it terminates (which is a priori undecidable), output an instance that does satisfy the EGDs
Evolutionary database design involves incremental improvements to the database schema so that it can be continuously updated with changes, reflecting the customer's requirements. People across the globe work on the same piece of software at the same time; hence, there is a need for techniques that allow a smooth evolution of the database as the design develops. Such methods utilize automated refactoring and continuous integration to support agile methodologies for software development
In computing, an exclusive relationship is a type of relationship in database design. In relational database design, in some cases the existence of one kind of relationship type precludes the existence of another. Entities within an entity type A may be related by a relationship type R to an entity in entity type B or entity type C, but not both
In mathematical logic, fixed-point logics are extensions of classical predicate logic that have been introduced to express recursion. Their development has been motivated by descriptive complexity theory and their relationship to database query languages, in particular to Datalog. Least fixed-point logic was first studied systematically by Yiannis N. Moschovakis
A full table scan (also known as a sequential scan) is a scan made on a database where each row of the table is read in sequential (serial) order and the columns encountered are checked for the validity of a condition. Full table scans are usually the slowest method of scanning a table due to the large number of I/O reads required from the disk, involving multiple seeks as well as costly disk-to-memory transfers. In a database, a query that is not indexed results in a full table scan, where the database processes each record of the table to find all records meeting the given requirements
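The scan-versus-index contrast can be observed with SQLite's EXPLAIN QUERY PLAN: an unindexed predicate produces a full scan, while the same query after creating an index produces an index search. A small sketch (the table, column names, and index name are illustrative; the exact plan wording varies between SQLite versions):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE users (id INTEGER, name TEXT)")
con.executemany("INSERT INTO users VALUES (?, ?)",
                [(i, f"user{i}") for i in range(100)])

def plan(sql):
    """Concatenate the detail column of EXPLAIN QUERY PLAN output."""
    return " ".join(row[-1] for row in
                    con.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT * FROM users WHERE name = 'user42'"
before = plan(query)                           # no index: full table scan
con.execute("CREATE INDEX idx_name ON users (name)")
after = plan(query)                            # index: search, not scan
```

Reading the plan before and after creating the index makes the cost difference the text describes concrete: the scan touches every row, the search only the matching index entries.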
In database theory, Imieliński–Lipski algebra is an extension of relational algebra onto tables with different types of null values. It is used to operate on relations with incomplete information. Imieliński–Lipski algebras are defined to satisfy precise conditions for semantically meaningful extension of the usual relational operators, such as projection, selection, union, and join, from operators on relations to operators on relations with various kinds of "null values"
The International Conference on Database Theory (ICDT) is an international research conference on foundations of database theory, and has been held since 1986. It is frequently also called the PODS of Europe. Since 2009, ICDT has been held jointly with the EDBT, a research conference on systems aspects of data management
A key–value database, or key–value store, is a data storage paradigm designed for storing, retrieving, and managing associative arrays, and a data structure more commonly known today as a dictionary or hash table. Dictionaries contain a collection of objects, or records, which in turn have many different fields within them, each containing data. These records are stored and retrieved using a key that uniquely identifies the record, and is used to find the data within the database
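The dictionary view of a key–value store can be sketched in a few lines (the class and method names are illustrative, not any particular product's API): records are stored and retrieved solely through their unique key.

```python
# Minimal in-memory key-value store backed by a Python dict -- the
# dictionary/hash-table structure the text mentions.
class KVStore:
    def __init__(self):
        self._data = {}

    def put(self, key, value):
        """Store or overwrite the record under its unique key."""
        self._data[key] = value

    def get(self, key, default=None):
        """Retrieve the record by key; no secondary access paths exist."""
        return self._data.get(key, default)

    def delete(self, key):
        self._data.pop(key, None)

store = KVStore()
store.put("user:42", {"name": "Ada", "role": "admin"})
```

The defining trade-off is visible even at this scale: lookups by key are constant-time, but any query over the values themselves would require scanning every record.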
In database theory and systems, a monotonic query is one that does not lose any tuples it previously output when new tuples are added to the database. Formally, a query q over a schema R is monotonic if and only if for every two instances I, J of R, I ⊆ J implies q(I) ⊆ q(J) (q must be a monotonic function). An example of a monotonic query is a select-project-join query containing only conditions of equality (also known as conjunctive queries)
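The definition can be checked on toy instances (the relation contents below are made up for illustration): a select-project-join query only gains answers as the instance grows, while set difference, a query with negation, does not.

```python
def spj(R, S):
    """q(x, z) :- R(x, y), S(y, z) -- a conjunctive (monotonic) query."""
    return {(x, z) for (x, y) in R for (y2, z) in S if y == y2}

# Two instances with I contained in J, componentwise.
I_R, I_S = {("a", 1)}, {(1, "x")}
J_R, J_S = I_R | {("b", 2)}, I_S | {(2, "y")}

mono_holds = spj(I_R, I_S) <= spj(J_R, J_S)     # q(I) subset of q(J)

# Negation breaks monotonicity: growing the subtracted set shrinks R - S.
non_mono = not ({1, 2} - set() <= {1, 2} - {2})
```

The failing case shows why monotonicity matters for incremental evaluation: monotonic answers only ever need to be appended, never retracted.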
The nested set model is a technique for representing nested set collections (also known as trees or hierarchies) in relational databases. It is based on Nested Intervals, that "are immune to hierarchy reorganization problem, and allow answering ancestor path hierarchical queries algorithmically — without accessing the stored hierarchy relation". Motivation The standard relational algebra and relational calculus, and the SQL operations based on them, are unable to express directly all desirable operations on hierarchies
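The nested-interval idea can be sketched with plain tuples (the category names and interval numbers below are illustrative): each node stores left/right numbers, and a node's descendants are exactly the rows whose interval lies strictly inside its own, so subtree queries need no recursion over the stored hierarchy.

```python
# Nested set representation of a small hierarchy: (name, lft, rgt).
nodes = [
    ("electronics", 1, 10),
    ("televisions", 2, 5),
    ("lcd",         3, 4),
    ("cameras",     6, 9),
    ("digital",     7, 8),
]

def descendants(nodes, name):
    """All nodes whose (lft, rgt) interval nests inside the given node's."""
    lft, rgt = next((l, r) for n, l, r in nodes if n == name)
    return [n for n, l, r in nodes if lft < l and r < rgt]
```

In SQL this becomes a single range predicate (WHERE lft BETWEEN parent.lft AND parent.rgt), which is exactly the ancestor-path expressiveness the motivation paragraph says plain relational algebra lacks.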
Online aggregation is a technique for improving the interactive behavior of database systems processing expensive analytical queries. Almost all database operations are performed in batch mode, i. e
In theoretical computer science, the PACELC theorem is an extension to the CAP theorem. It states that in case of network partitioning (P) in a distributed computer system, one has to choose between availability (A) and consistency (C) (as per the CAP theorem), but else (E), even when the system is running normally in the absence of partitions, one has to choose between latency (L) and consistency (C). PACELC builds on the CAP theorem
The principle of orthogonal design (abbreviated POOD) was developed by database researchers David McGoveran and Christopher J. Date in the early 1990s, and was first published as "A New Database Design Principle" in the July 1994 issue of Database Programming and Design; it has been reprinted several times. It is the second of the two principles of database design, which seek to prevent databases from being too complicated or redundant, the first principle being the principle of full normalization (POFN)
Most real databases contain data whose correctness is uncertain. In order to work with such data, there is a need to quantify the integrity of the data. This is achieved by using probabilistic databases
In computational geometry and database theory, a range reporting query asks for a list of the points that match the query. The query is often specified by a geometric shape, containing all the points that should match, and is called a range. Range reporting is a special case of range searching; in general, range-searching queries may return other kinds of aggregate information about points in a range
In computer science, the range searching problem consists of processing a set S of objects, in order to determine which objects from S intersect with a query object, called the range. For example, if S is a set of points corresponding to the coordinates of several cities, find the subset of cities within a given range of latitudes and longitudes. The range searching problem and the data structures that solve it are a fundamental topic of computational geometry
The recursive join is an operation used in relational databases, also sometimes called a "fixed-point join". It is a compound operation that involves repeating the join operation, typically accumulating more records each time, until a repetition makes no change to the results (as compared to the results of the previous iteration). For example, if a database of family relationships is to be searched, and the record for each person has "mother" and "father" fields, a recursive join would be one way to retrieve all of a person's known ancestors: first the person's direct parents' records would be retrieved, then the parents' information would be used to retrieve the grandparents' records, and so on until no new records are being found
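The ancestor example can be sketched as a fixed-point loop (the family data is invented for illustration): repeatedly join the known set of ancestors with the parent relation until an iteration adds no new records.

```python
# Toy parent relation: child -> set of parents.
parents = {
    "carol": {"alice", "bob"},
    "alice": {"erin"},
    "erin":  {"frank"},
}

def ancestors(person):
    """Accumulate ancestors by repeated joins until nothing new appears."""
    result = set()
    frontier = set(parents.get(person, set()))
    while frontier:                 # empty frontier == fixed point reached
        result |= frontier
        frontier = {p for f in frontier
                    for p in parents.get(f, set())} - result
    return result
```

In SQL the same computation is usually written as a recursive common table expression (WITH RECURSIVE), which terminates by the same no-new-rows condition.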
A relational database is a (most commonly digital) database based on the relational model of data, as proposed by E. F. Codd in 1970
In database technologies, a rollback is an operation which returns the database to some previous state. Rollbacks are important for database integrity, because they mean that the database can be restored to a clean copy even after erroneous operations are performed. They are crucial for recovering from database server crashes; by rolling back any transaction which was active at the time of the crash, the database is restored to a consistent state
Single table inheritance is a way to emulate object-oriented inheritance in a relational database. When mapping from a database table to an object in an object-oriented language, a field in the database identifies what class in the hierarchy the object belongs to. All fields of all the classes are stored in the same table, hence the name "Single Table Inheritance"
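A minimal sketch of the mapping (the schema, class names, and discriminator column are all illustrative assumptions): one table holds the union of every subclass's fields, and a type column selects which class to instantiate when a row is loaded.

```python
import sqlite3

class Person: pass
class Employee(Person): pass
class Customer(Person): pass

CLASSES = {"employee": Employee, "customer": Customer}

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE people (
    id INTEGER PRIMARY KEY,
    type TEXT,            -- discriminator: which class this row maps to
    name TEXT,
    salary REAL,          -- used only by employee rows
    loyalty_pts INTEGER   -- used only by customer rows
)""")
con.execute("INSERT INTO people VALUES (1, 'employee', 'Ada', 90000, NULL)")
con.execute("INSERT INTO people VALUES (2, 'customer', 'Bob', NULL, 120)")

def load(row_id):
    """Instantiate the class named by the row's discriminator column."""
    row = con.execute("SELECT type, name FROM people WHERE id = ?",
                      (row_id,)).fetchone()
    obj = CLASSES[row[0]]()
    obj.name = row[1]
    return obj
```

The NULL-heavy columns are the characteristic cost of this pattern: every row carries slots for every subclass's fields, which is the trade-off against class-table or concrete-table inheritance.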
A supra-entity is a conceptual element derived from the Entity-Relationship (E-R) technique for information system modeling. It is similar to an entity, but it is defined at a higher level, encompassing individual entity occurrences, their parts, groups and groups of parts or parts of groups. The concepts supra-entity, supra-relationship and supra-attribute were created and published by González & Muller in their work “Business Entity-Relationship Model: For Innovation, Entrepreneurship and Management”, and applied for the first time in this work to conceptually model the key elements and interrelationships of “business” reality
The ACM Symposium on Principles of Database Systems (PODS) is an international research conference on database theory, and has been held yearly since 1982. It is sponsored by three Association for Computing Machinery SIGs: SIGAI, SIGACT, and SIGMOD. Since 1991, PODS has been held jointly with the ACM SIGMOD Conference, a research conference on systems aspects of data management
A temporal database stores data relating to time instances. It offers temporal data types and stores information relating to past, present and future time. Temporal databases can be uni-temporal, bi-temporal or tri-temporal
A transitive dependency is an indirect dependency relationship between software components. This kind of functional dependency is held by virtue of a transitive relation from a component that the software depends on directly. Computer programs In a computer program a direct dependency is functionality from a library, or API, or any software component that is referenced directly by the program itself
A triplestore or RDF store is a purpose-built database for the storage and retrieval of triples through semantic queries. A triple is a data entity composed of subject–predicate–object, like "Bob is 35" or "Bob knows Fred". Much like a relational database, information in a triplestore is stored and retrieved via a query language
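The subject–predicate–object model can be sketched with a set of tuples and wildcard matching (the triples below reuse the text's "Bob" examples; the match function and its None-as-wildcard convention are illustrative, not a real triplestore API):

```python
# A tiny triple store: each entry is (subject, predicate, object).
triples = {
    ("Bob", "age", 35),
    ("Bob", "knows", "Fred"),
    ("Fred", "age", 40),
}

def match(s=None, p=None, o=None):
    """Return triples matching the pattern; None matches anything."""
    return {(ts, tp, to) for (ts, tp, to) in triples
            if s in (None, ts) and p in (None, tp) and o in (None, to)}
```

Real triplestores expose the same pattern-matching idea through a query language such as SPARQL, where unbound variables play the role of the None wildcards here.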
In relational database theory, a tuple-generating dependency (TGD) is a certain kind of constraint on a relational database. It is a subclass of the class of embedded dependencies (EDs). An algorithm known as the chase takes as input an instance that may or may not satisfy a set of TGDs (or more generally EDs) and, if it terminates (which is a priori undecidable), outputs an instance that does satisfy the TGDs
Histograms are most commonly used as visual representations of data. However, database systems use histograms to summarize data internally and provide size estimates for queries. These histograms are not presented to users or displayed visually, so a wider range of options are available for their construction
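One common internal form is the equi-width histogram. A minimal sketch (equi-width is one choice among many; the uniformity-within-bucket assumption is the standard simplification, and the data here is invented): bucket counts summarize a column, and the size of a range predicate's result is estimated from the overlapping buckets.

```python
def build_histogram(values, lo, hi, n_buckets):
    """Equi-width histogram: count values falling into each bucket."""
    width = (hi - lo) / n_buckets
    counts = [0] * n_buckets
    for v in values:
        b = min(int((v - lo) / width), n_buckets - 1)
        counts[b] += 1
    return counts, lo, width

def estimate_range(hist, a, b):
    """Estimated number of rows with a <= value < b."""
    counts, lo, width = hist
    total = 0.0
    for i, c in enumerate(counts):
        b_lo, b_hi = lo + i * width, lo + (i + 1) * width
        overlap = max(0.0, min(b, b_hi) - max(a, b_lo))
        total += c * overlap / width   # assume uniformity within a bucket
    return total

hist = build_histogram(list(range(100)), 0, 100, 10)
```

The optimizer never shows this structure to the user; it only feeds the estimate into cost-based decisions such as join ordering.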
In a database, a view is the result set of a stored query, which can be queried in the same manner as a persistent database collection object. This pre-established query command is kept in the data dictionary. Unlike ordinary base tables in a relational database, a view does not form part of the physical schema: as a result set, it is a virtual table computed or collated dynamically from data in the database when access to that view is requested
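The virtual-table behaviour is easy to demonstrate with SQLite (the orders schema and view name are illustrative): the view stores no rows of its own, so it automatically reflects later changes to the base table.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (id INTEGER, amount REAL, paid INTEGER)")
con.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(1, 10.0, 1), (2, 25.0, 0), (3, 40.0, 1)])

# The view is a stored query, not a copy of the data.
con.execute("CREATE VIEW paid_orders AS "
            "SELECT id, amount FROM orders WHERE paid = 1")

rows = con.execute("SELECT * FROM paid_orders ORDER BY id").fetchall()

# Updating the base table changes what the view returns, with no refresh.
con.execute("UPDATE orders SET paid = 1 WHERE id = 2")
rows_after = con.execute("SELECT * FROM paid_orders ORDER BY id").fetchall()
```

Because the result set is computed at access time, the view never falls out of sync with its base tables, which is the key contrast with a materialized copy.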
The Z User Group (ZUG) was established in 1992 to promote use and development of the Z notation, a formal specification language for the description of and reasoning about computer-based systems. It was formally constituted on 14 December 1992 during the ZUM'92 Z User Meeting in London, England. ZUG has organised a series of Z User Meetings, initially held approximately every 18 months
In computer science, algebraic semantics is a form of axiomatic semantics based on algebraic laws for describing and reasoning about program specifications in a formal manner. The syntax of an algebraic specification is formulated in two steps: (1) defining a formal signature of data types and operation symbols, and (2) interpreting the signature through sets and functions. The signature of an algebraic specification defines its formal syntax
Algebraic specification is a software engineering technique for formally specifying system behavior. It was a very active subject of computer science research around 1980. Algebraic specification seeks to systematically develop more efficient programs by: formally defining types of data, and mathematical operations on those data types; abstracting implementation details, such as the size of representations (in memory) and the efficiency of obtaining the outcome of computations; formalizing the computations and operations on data types; and allowing for automation by formally restricting operations to this limited set of behaviors and data types
Algorithm characterizations are attempts to formalize the word algorithm. The word algorithm does not have a generally accepted formal definition, and researchers are actively working on this problem
An and-inverter graph (AIG) is a directed, acyclic graph that represents a structural implementation of the logical functionality of a circuit or network. An AIG consists of two-input nodes representing logical conjunction, terminal nodes labeled with variable names, and edges optionally containing markers indicating logical negation. This representation of a logic function is rarely structurally efficient for large circuits, but is an efficient representation for manipulation of boolean functions
Applicative universal grammar, or AUG, is a universal semantic metalanguage intended for studying the semantic processes in particular languages. This is a linguistic theory that views the formation of phrase structure by analogy to function application in an applicative programming language. Among the innovations in this approach to natural language processing are the ideas of functional superposition and stratified types
In computer programming, specifically when using the imperative programming paradigm, an assertion is a predicate (a Boolean-valued function over the state space, usually expressed as a logical proposition using the variables of a program) connected to a point in the program, that always should evaluate to true at that point in code execution. Assertions can help a programmer read the code, help a compiler compile it, or help the program detect its own defects. For the latter, some programs check assertions by actually evaluating the predicate as they run
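A minimal sketch of assertions as point-predicates (the withdraw function and its conditions are invented for illustration): the invariant should hold whenever control reaches it, and a violation raises an error at the exact point the defect manifests.

```python
def withdraw(balance, amount):
    # Precondition: must hold on entry to this point in the program.
    assert amount >= 0, "precondition: amount must be non-negative"
    new_balance = balance - amount
    # Postcondition: must hold before returning.
    assert new_balance <= balance, "postcondition: balance never increases"
    return new_balance

ok = withdraw(100, 30)

try:
    withdraw(100, -5)               # violates the precondition
    violated = False
except AssertionError:
    violated = True                 # defect detected at its source
```

Note that many languages let assertion checking be disabled (Python's -O flag strips assert statements), so assertions document and check invariants but should not substitute for handling expected runtime errors.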
The primary focus of this article is asynchronous control in digital electronic systems. In a synchronous system, operations (instructions, calculations, logic, etc.) are coordinated by one or more centralized clock signals
Automated theorem proving (also known as ATP or automated deduction) is a subfield of automated reasoning and mathematical logic dealing with proving mathematical theorems by computer programs. Automated reasoning over mathematical proof was a major impetus for the development of computer science. While the roots of formalised logic go back to Aristotle, the end of the 19th and early 20th centuries saw the development of modern logic and formalised mathematics
A bigraph can be modelled as the superposition of a graph (the link graph) and a set of trees (the place graph). Each node of the bigraph is part of a graph and also part of some tree that describes how the nodes are nested. Bigraphs can be conveniently and formally displayed as diagrams
A binary moment diagram (BMD) is a generalization of the binary decision diagram (BDD) to linear functions over domains such as booleans (like BDDs), but also to integers or to real numbers. They can deal with Boolean functions with complexity comparable to BDDs, but some functions that are handled very inefficiently by a BDD, most notably multiplication, are handled easily by a BMD. The most important property of BMDs is that, as with BDDs, each function has exactly one canonical representation, and many operations can be efficiently performed on these representations
In theoretical computer science a bisimulation is a binary relation between state transition systems, associating systems that behave in the same way, in that one system simulates the other and vice versa. Intuitively, two systems are bisimilar if, viewed as playing a game according to some rules, they can match each other's moves. In this sense, neither system can be distinguished from the other by an observer
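The move-matching game can be decided by a naive greatest-fixed-point computation. A sketch under simplifying assumptions (both systems are combined into one finite set of labelled transitions, and the tiny coffee/tea machines below are the classic hand-made example, not from the text): start from all state pairs and repeatedly remove any pair where one side has a labelled move the other cannot match.

```python
def bisimilar(trans, s1, s2):
    """Decide bisimilarity of two states in a labelled transition
    system given as a set of (state, label, state) triples."""
    states = {s for s, _, _ in trans} | {t for _, _, t in trans}

    def moves(s):
        return {(a, t) for (src, a, t) in trans if src == s}

    rel = {(p, q) for p in states for q in states}
    changed = True
    while changed:                   # shrink to the greatest fixed point
        changed = False
        for (p, q) in list(rel):
            forward = all(any(a == b and (t, u) in rel
                              for (b, u) in moves(q))
                          for (a, t) in moves(p))
            backward = all(any(a == b and (t, u) in rel
                               for (a, t) in moves(p))
                           for (b, u) in moves(q))
            if not (forward and backward):
                rel.discard((p, q))
                changed = True
    return (s1, s2) in rel

# Machine s chooses coffee/tea after the coin; machine t commits at the
# coin. They have the same traces but are not bisimilar.
trans = {
    ("s", "coin", "s1"), ("s1", "coffee", "s2"), ("s1", "tea", "s3"),
    ("t", "coin", "ta"), ("ta", "coffee", "t2"),
    ("t", "coin", "tb"), ("tb", "tea", "t3"),
}
distinguishable = not bisimilar(trans, "s", "t")
```

Production checkers use partition refinement (e.g. Paige–Tarjan) instead of this quadratic-pair fixpoint, but the removal criterion is the same game condition.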
In logic and computer science, the Boolean satisfiability problem (sometimes called propositional satisfiability problem and abbreviated SATISFIABILITY, SAT or B-SAT) is the problem of determining if there exists an interpretation that satisfies a given Boolean formula. In other words, it asks whether the variables of a given Boolean formula can be consistently replaced by the values TRUE or FALSE in such a way that the formula evaluates to TRUE. If this is the case, the formula is called satisfiable
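The definition can be made concrete with a brute-force solver over CNF formulas. A minimal sketch (the DIMACS-style encoding of literals as signed integers is a common convention, chosen here for compactness; real solvers use DPLL/CDCL rather than enumeration): try every TRUE/FALSE assignment and report one under which the formula evaluates to TRUE.

```python
from itertools import product

def solve_sat(num_vars, clauses):
    """Clauses are lists of literals: positive int i means variable i,
    negative int -i means its negation. Returns a satisfying assignment
    (a dict variable -> bool) or None if the formula is unsatisfiable."""
    for bits in product([False, True], repeat=num_vars):
        assign = {i + 1: bits[i] for i in range(num_vars)}
        if all(any(assign[abs(l)] == (l > 0) for l in clause)
               for clause in clauses):
            return assign           # this interpretation satisfies it
    return None

# (x1 OR NOT x2) AND (x2 OR x3) AND (NOT x1 OR NOT x3)
sat = solve_sat(3, [[1, -2], [2, 3], [-1, -3]])
unsat = solve_sat(1, [[1], [-1]])   # x1 AND NOT x1: no interpretation works
```

Enumeration is exponential in the number of variables, which reflects SAT's status as the canonical NP-complete problem.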
Categorical set theory is any one of several versions of set theory developed from or treated in the context of mathematical category theory
A computer-assisted proof is a mathematical proof that has been at least partially generated by computer. Most computer-aided proofs to date have been implementations of large proofs-by-exhaustion of a mathematical theorem. The idea is to use a computer program to perform lengthy computations, and to provide a proof that the result of these computations implies the given theorem
Critical process parameters (CPP) in pharmaceutical manufacturing are key variables affecting the production process. CPPs are attributes that are monitored to detect deviations in standardized production operations and product output quality or changes in critical quality attributes (CQAs). Those attributes with a higher impact on CQAs should be prioritized and held in a stricter state of control
In systems engineering, dependability is a measure of a system's availability, reliability, maintainability, and in some cases, other characteristics such as durability, safety and security. In real-time computing, dependability is the ability to provide services that can be trusted within a time-period. The service guarantees must hold even when the system is subject to attacks or natural failures
Oddworld: Soulstorm is a platform game developed and published by Oddworld Inhabitants. It was released for PlayStation 4, PlayStation 5, and Windows in April 2021, with an enhanced edition released in November 2021 alongside a port for Xbox One and Xbox Series X/S. A port for Nintendo Switch, subtitled Oddtimized Edition, was released in November 2022
Opus: Echo of Starsong (stylized as OPUS: Echo of Starsong) is a text-driven side-scrolling adventure game developed by Taiwanese independent studio SIGONO. It is the third installment of the Opus series, after Opus: The Day We Found Earth and Opus: Rocket of Whispers. Like its predecessors, the game focuses on story, puzzles and exploration
Propnight is a survival horror online multiplayer game with Prop Hunt-style mechanics developed by Fntastic and published by Mytona. It was released on Steam in November 2021. It is a one-versus-four game, similar in nature to Dead by Daylight, in which one player takes on the role of a killer and the other four play as survivors with the ability to turn into objects
Psychonauts 2 is a platform game developed by Double Fine and published by Xbox Game Studios. The game was announced at The Game Awards 2015 ceremony, and released on August 25, 2021 for PlayStation 4, Windows, Xbox One and Xbox Series X/S, and on May 24, 2022 for Linux and macOS. Like the game's predecessor, the player controls Raz, a young acrobat who is training to become a Psychonaut, a member of an international task force that uses their psychic abilities to stop those that perform nefarious deeds with their own psychic forces
Resident Evil Village is a 2021 survival horror game developed and published by Capcom. It is the sequel to Resident Evil 7: Biohazard (2017). Players control Ethan Winters, who searches for his kidnapped daughter in a village filled with mutant creatures
Roguebook is a roguelike deck-building game developed by Abrakam Entertainment. Nacon published it in 2021 for Linux, macOS, and Windows. Console ports were released in 2022
Scarlet Nexus is an action role-playing game developed by Bandai Namco Studios and Tose and published by Bandai Namco Entertainment. It was released on June 25, 2021, for PlayStation 4, PlayStation 5, Windows, Xbox One and Xbox Series X and Series S. The game received generally positive reviews from critics, with praise for the combat but criticism for its side missions
https://huggingface.co/datasets/fmars/wiki_stem
Shovel Knight Pocket Dungeon is a dungeon crawler puzzle game developed by Vine and Yacht Club Games, and published by Yacht Club Games. It is a spin-off of Shovel Knight. It was released on December 13, 2021 for Windows, PlayStation 4, and Nintendo Switch
https://huggingface.co/datasets/fmars/wiki_stem
Surviving the Aftermath is a city-building game developed by Iceflake Studios, which is now a division of the game's publisher, Paradox Interactive. Players build a city in a post-apocalyptic setting, which includes elements of survival games. It follows Surviving Mars and is followed by Surviving the Abyss, all of which were published by Paradox
https://huggingface.co/datasets/fmars/wiki_stem
Tainted Grail: Conquest is a roguelike deck-building game developed by Questline and published in 2021 by Awaken Realms. Players control an adventurer who attempts to stop a curse that is infecting their land. Gameplay Tainted Grail: Conquest is a roguelike deck-building game set in a dark fantasy world based on Arthurian legend
https://huggingface.co/datasets/fmars/wiki_stem
Tales of Arise is an action role-playing game developed and published by Bandai Namco Entertainment for PlayStation 4, PlayStation 5, Windows, Xbox One, and Xbox Series X/S. The seventeenth main entry in the Tales series, it was originally planned to release in 2020 but was delayed to September 2021 due to internal quality issues and the ability to launch the game on more platforms. It is also the first game of the series with a worldwide simultaneous launch
https://huggingface.co/datasets/fmars/wiki_stem
Tales of Luminaria is a mobile game developed by Colopl. The game is published by Bandai Namco Entertainment and released on iOS and Android. An anime adaptation by Kamikaze Douga premiered on January 20, 2022
https://huggingface.co/datasets/fmars/wiki_stem
The Vale: Shadow of the Crown is a 2021 action role-playing game developed by Falling Squirrel. Players control a blind princess. The game has few visuals and is almost entirely an audio game
https://huggingface.co/datasets/fmars/wiki_stem
Trials of Fire is a roguelike deck-building game developed by Whatboy Games and released for Windows in 2021. Gameplay Gameplay includes elements of roguelike deck-building games and tactical role-playing games using turn-based combat. It is set in a post-apocalyptic, dark fantasy world
https://huggingface.co/datasets/fmars/wiki_stem
A library and information scientist, also known as a library scholar, is a researcher or academic who specializes in the field of library and information science and often participates in scholarly writing about and related to library and information science. A library and information scientist is not limited to any one subfield of library and information science, nor to any one particular type of library. These scientists come from all information-related sectors
https://huggingface.co/datasets/fmars/wiki_stem
In computational complexity theory, a complexity class is a set of computational problems "of related resource-based complexity". The two most commonly analyzed resources are time and memory. In general, a complexity class is defined in terms of a type of computational problem, a model of computation, and a bounded resource like time or memory
https://huggingface.co/datasets/fmars/wiki_stem
In the area of abstract algebra known as group theory, the diameter of a finite group is a measure of its complexity. Consider a finite group (G, ∘) and any set of generators S. Define D_S to be the graph diameter of the Cayley graph Λ = (G, S)
https://huggingface.co/datasets/fmars/wiki_stem
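As an illustrative sketch (the choice of the symmetric group S_3, the two generators, and the right-multiplication edge convention are assumptions for the example, not taken from the definition above), the diameter D_S can be computed by breadth-first search from the identity over the Cayley graph:

```python
from collections import deque
from itertools import permutations

def cayley_diameter(elements, generators, compose, identity):
    """Breadth-first search from the identity over the Cayley graph
    Lambda = (G, S); the largest distance reached is the diameter D_S."""
    dist = {identity: 0}
    queue = deque([identity])
    while queue:
        g = queue.popleft()
        for s in generators:
            h = compose(g, s)       # edge g -> g o s
            if h not in dist:
                dist[h] = dist[g] + 1
                queue.append(h)
    assert len(dist) == len(elements), "S does not generate G"
    return max(dist.values())

# Example: S_3 as tuples mapping index -> image, generated by a
# transposition (1 0 2) and a 3-cycle (1 2 0).
def compose(p, q):
    return tuple(p[q[i]] for i in range(len(q)))

s3 = list(permutations(range(3)))
gens = [(1, 0, 2), (1, 2, 0)]
print(cayley_diameter(s3, gens, compose, (0, 1, 2)))  # 2
```

With these two generators every element of S_3 is reachable within two steps, so D_S = 2.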
Effective complexity is a measure of complexity defined in a 1996 paper by Murray Gell-Mann and Seth Lloyd that attempts to measure the amount of non-random information in a system. It has been criticised as being dependent on the subjective decisions made as to which parts of the information in the system are to be discounted as random. Related measures include Kolmogorov complexity, excess entropy, logical depth, Rényi information, self-dissimilarity, and forecasting complexity
https://huggingface.co/datasets/fmars/wiki_stem
The growth function, also called the shatter coefficient or the shattering number, measures the richness of a set family. It is especially used in the context of statistical learning theory, where it measures the complexity of a hypothesis class. The term 'growth function' was coined by Vapnik and Chervonenkis in their 1968 paper, where they also proved many of its properties
https://huggingface.co/datasets/fmars/wiki_stem
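A small sketch of the idea (the threshold class and the finite grid of thresholds are illustrative assumptions standing in for all real-valued thresholds): the growth function counts the distinct labelings, or dichotomies, that a hypothesis class induces on n points, and for threshold classifiers on the real line it comes out to n + 1:

```python
def growth_function(hypotheses, points):
    """Count the distinct labelings (dichotomies) the class induces on the points."""
    return len({tuple(h(x) for x in points) for h in hypotheses})

# Threshold classifiers on the real line: h_t(x) = 1 if x >= t, else 0.
points = [0.5, 1.5, 2.5, 3.5]
thresholds = [t / 10 for t in range(-10, 50)]  # a fine grid standing in for all real t
hyps = [lambda x, t=t: int(x >= t) for t in thresholds]
print(growth_function(hyps, points))  # 5, i.e. n + 1 for n = 4 points
```

Sliding the threshold across four points can only produce the five "suffix" labelings 1111, 0111, 0011, 0001, 0000, so the growth function here is 5 rather than the maximal 2^4 = 16.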
Information fluctuation complexity is an information-theoretic quantity defined as the fluctuation of information about entropy. It is derivable from fluctuations in the predominance of order and chaos in a dynamic system and has been used as a measure of complexity in many diverse fields. It was introduced in a 1993 paper by Bates and Shepard
https://huggingface.co/datasets/fmars/wiki_stem
In algorithmic information theory (a subfield of computer science and mathematics), the Kolmogorov complexity of an object, such as a piece of text, is the length of a shortest computer program (in a predetermined programming language) that produces the object as output. It is a measure of the computational resources needed to specify the object, and is also known as algorithmic complexity, Solomonoff–Kolmogorov–Chaitin complexity, program-size complexity, descriptive complexity, or algorithmic entropy. It is named after Andrey Kolmogorov, who first published on the subject in 1963; it is a generalization of classical information theory
https://huggingface.co/datasets/fmars/wiki_stem
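Kolmogorov complexity itself is uncomputable, but compressed length gives a crude, computable upper-bound flavor of the idea (the use of zlib and the particular strings are illustrative assumptions, not part of the definition): a highly regular string admits a short description, while typical random data does not.

```python
import os
import zlib

repetitive = b"ab" * 500      # 1000 bytes with an obvious short description
irregular = os.urandom(1000)  # 1000 bytes with (almost surely) no exploitable pattern

# Compressed length is an upper-bound proxy: regularity is captured,
# incompressible data stays essentially full length (plus format overhead).
print(len(zlib.compress(repetitive)))  # small: a few dozen bytes at most
print(len(zlib.compress(irregular)))   # close to (or above) 1000 bytes
```

The gap between the two compressed lengths mirrors the gap in descriptive complexity, even though no compressor can compute the true Kolmogorov complexity.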
Logical depth is a measure of complexity for individual strings devised by Charles H. Bennett based on the computational complexity of an algorithm that can recreate a given piece of information. It differs from Kolmogorov complexity in that it considers the computation time of the algorithm with nearly minimal length, rather than the length of the minimal algorithm
https://huggingface.co/datasets/fmars/wiki_stem
Self-dissimilarity is a measure of complexity defined in a series of papers by David Wolpert and William G. Macready. It is based on the degrees of self-dissimilarity between the patterns of a system observed at various scales
https://huggingface.co/datasets/fmars/wiki_stem
In Vapnik–Chervonenkis theory, the Vapnik–Chervonenkis (VC) dimension is a measure of the capacity (complexity, expressive power, richness, or flexibility) of a set of functions that can be learned by a statistical binary classification algorithm. It is defined as the cardinality of the largest set of points that the algorithm can shatter, which means the algorithm can always learn a perfect classifier for any labeling of at least one configuration of those data points. It was originally defined by Vladimir Vapnik and Alexey Chervonenkis
https://huggingface.co/datasets/fmars/wiki_stem
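As a hedged sketch of shattering (the interval class and the finite grid of endpoints are illustrative assumptions): intervals [a, b] on the real line can shatter two points but not three, since the labeling 1, 0, 1 would require the interval to contain the outer points while excluding the middle one, so the VC dimension of this class is 2.

```python
def shatters(points, hypotheses):
    """True if the class realizes all 2^n labelings of the given points."""
    realized = {tuple(h(x) for x in points) for h in hypotheses}
    return len(realized) == 2 ** len(points)

# Interval classifiers h_{a,b}(x) = 1 if a <= x <= b, else 0, over a grid of (a, b).
grid = [v / 2 for v in range(-2, 10)]
hyps = [lambda x, a=a, b=b: int(a <= x <= b) for a in grid for b in grid if a <= b]

print(shatters([1.0, 2.0], hyps))       # True: two points can be shattered
print(shatters([1.0, 2.0, 3.0], hyps))  # False: the labeling (1, 0, 1) is impossible
```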
Network performance refers to measures of service quality of a network as seen by the customer. There are many different ways to measure the performance of a network, as each network is different in nature and design. Performance can also be modeled and simulated instead of measured; one example of this is using state transition diagrams to model queuing performance or using a network simulator
https://huggingface.co/datasets/fmars/wiki_stem
In routers and switches, active queue management (AQM) is the policy of dropping packets inside a buffer associated with a network interface controller (NIC) before that buffer becomes full, often with the goal of reducing network congestion or improving end-to-end latency. This task is performed by the network scheduler, which for this purpose uses various algorithms such as random early detection (RED), Explicit Congestion Notification (ECN), or controlled delay (CoDel). RFC 7567 recommends active queue management as a best practice
https://huggingface.co/datasets/fmars/wiki_stem
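A simplified sketch of the random early detection (RED) drop rule mentioned above (the thresholds and maximum probability are made-up parameters, and real implementations additionally track an exponentially weighted moving average of the queue length and scale the probability by the count of packets since the last drop): below a minimum threshold nothing is dropped, above a maximum threshold everything is, and in between the drop probability rises linearly.

```python
def red_drop_probability(avg_queue, min_th, max_th, max_p):
    """Classic RED marking/drop probability as a function of average queue length."""
    if avg_queue < min_th:
        return 0.0          # queue short: accept everything
    if avg_queue >= max_th:
        return 1.0          # queue past the hard limit: drop everything
    # linear ramp from 0 to max_p between the two thresholds
    return max_p * (avg_queue - min_th) / (max_th - min_th)

print(red_drop_probability(10, 20, 80, 0.1))  # 0.0  (below min threshold)
print(red_drop_probability(50, 20, 80, 0.1))  # 0.05 (halfway up the ramp)
print(red_drop_probability(90, 20, 80, 0.1))  # 1.0  (above max threshold)
```

Dropping probabilistically before the buffer is full is what lets TCP senders back off early instead of all at once, which is the congestion-avoidance goal AQM aims at.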
In multi-hop networks, Adaptive Quality of Service routing (AQoS or AQR) protocols have become increasingly popular and have numerous applications. One application in which it may be useful is in Mobile ad hoc networking (MANET). Adaptive QoS routing is a cross-layer optimization adaptive routing mechanism
https://huggingface.co/datasets/fmars/wiki_stem
ALTQ (ALTernate Queueing) is the network scheduler for Berkeley Software Distribution. ALTQ provides queueing disciplines, and other components related to quality of service (QoS), required to realize resource sharing. It is most commonly implemented on BSD-based routers
https://huggingface.co/datasets/fmars/wiki_stem
Application-layer framing or application-level framing (ALF) is a method of allowing an application to use its semantics for the design of its network protocols. This procedure was first proposed by D. D. Clark and D. L. Tennenhouse
https://huggingface.co/datasets/fmars/wiki_stem
Application-Layer Protocol Negotiation (ALPN) is a Transport Layer Security (TLS) extension that allows the application layer to negotiate which protocol should be performed over a secure connection in a manner that avoids additional round trips and which is independent of the application-layer protocols. It is used to establish HTTP/2 connections without additional round trips (client and server can communicate over ports previously assigned to HTTPS with HTTP/1.1 and upgrade to use HTTP/2 or continue with HTTP/1.1)
https://huggingface.co/datasets/fmars/wiki_stem
In computing, bandwidth is the maximum rate of data transfer across a given path. Bandwidth may be characterized as network bandwidth, data bandwidth, or digital bandwidth. This definition of bandwidth is in contrast to the field of signal processing, wireless communications, modem data transmission, digital communications, and electronics, in which bandwidth is used to refer to analog signal bandwidth measured in hertz, meaning the frequency range between lowest and highest attainable frequency while meeting a well-defined impairment level in signal power
https://huggingface.co/datasets/fmars/wiki_stem
In data communications, the bandwidth-delay product is the product of a data link's capacity (in bits per second) and its round-trip delay time (in seconds). The result, an amount of data measured in bits (or bytes), is equivalent to the maximum amount of data on the network circuit at any given time, i.e., data that has been transmitted but not yet acknowledged
https://huggingface.co/datasets/fmars/wiki_stem
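The product is plain arithmetic; a worked example (the link speed and round-trip time are made-up figures) shows the scale involved:

```python
def bandwidth_delay_product(capacity_bps, rtt_seconds):
    """Bits 'in flight' on the path: link capacity (bit/s) x round-trip time (s)."""
    return capacity_bps * rtt_seconds

# A 100 Mbit/s path with an 80 ms round-trip time holds 8 Mbit in flight:
bdp = bandwidth_delay_product(100e6, 0.080)
print(int(bdp), "bits =", int(bdp / 8), "bytes")  # 8000000 bits = 1000000 bytes
```

This figure is the natural sizing target for send windows and buffers: a TCP window smaller than the bandwidth-delay product cannot keep such a path full.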
Best-effort delivery describes a network service in which the network does not provide any guarantee that data is delivered or that delivery meets any quality of service. In a best-effort network, all users obtain best-effort service. Under best-effort, network performance characteristics such as network delay and packet loss depend on the current network traffic load, and the network hardware capacity
https://huggingface.co/datasets/fmars/wiki_stem
In digital transmission, the number of bit errors is the number of received bits of a data stream over a communication channel that have been altered due to noise, interference, distortion or bit synchronization errors. The bit error rate (BER) is the number of bit errors per unit time. The bit error ratio (also BER) is the number of bit errors divided by the total number of transferred bits during a studied time interval
https://huggingface.co/datasets/fmars/wiki_stem
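The bit error ratio is a simple quotient; a minimal worked example (the error count and transfer size are made-up figures):

```python
def bit_error_ratio(errors, total_bits):
    """Dimensionless ratio: bit errors divided by total transferred bits."""
    return errors / total_bits

# 3 flipped bits observed over a 1 Mbit transfer:
print(bit_error_ratio(3, 1_000_000))  # 3e-06
```

Note the rate/ratio distinction from the text above: dividing the same error count by the observation time instead of the bit count would give the bit error rate in errors per second.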
In engineering, a bottleneck is a phenomenon by which the performance or capacity of an entire system is severely limited by a single component. The component is sometimes called a bottleneck point. The term is metaphorically derived from the neck of a bottle, where the flow speed of the liquid is limited by its neck
https://huggingface.co/datasets/fmars/wiki_stem