Source: https://huggingface.co/datasets/fmars/wiki_stem
Cubesort is a parallel sorting algorithm that builds a self-balancing multi-dimensional array from the keys to be sorted. As the axes are of similar length, the structure resembles a cube. After each key is inserted, the cube can be rapidly converted to an array.
DISCUS, or distributed source coding using syndromes, is a method for distributed source coding. It is a compression algorithm used to compress correlated data sources. The method is designed to achieve the Slepian–Wolf bound by using channel codes.
The Data-based Online Nonlinear Extremumseeker (DONE) algorithm is a black-box optimization algorithm.
DONE models the unknown cost function and attempts to find an optimum of the underlying function.
The DONE algorithm is suitable for optimizing costly and noisy functions and does not require derivatives.
An exponential tree is a type of search tree in which the number of children of its nodes decreases doubly-exponentially with increasing depth. Values are stored only in the leaf nodes. Each node contains a splitter, a value less than or equal to all values in the subtree, which is used during search.
In nonlinear time-series analysis, the false nearest neighbor algorithm is an algorithm for estimating the embedding dimension of a dynamical system. The concept was proposed by Kennel et al. (1992).
Featherstone's algorithm is a technique used for computing the effects of forces applied to a structure of joints and links (an "open kinematic chain") such as a skeleton used in ragdoll physics.
Featherstone's algorithm uses a reduced coordinate representation. This is in contrast to the more popular Lagrange multiplier method, which uses maximal coordinates.
In computer science, finger search trees are a type of binary search tree that keeps pointers to interior nodes, called fingers. The fingers speed up searches, insertions, and deletions for elements close to the fingers, giving amortized O(log n) lookups and amortized O(1) insertions and deletions. The structure should not be confused with a finger tree or a splay tree, although both can be used to implement finger search trees.
The Fireworks Algorithm (FWA) is a swarm intelligence algorithm that explores a very large solution space by choosing a set of random points confined by some distance metric in the hopes that one or more of them will yield promising results, allowing for a more concentrated search nearby.
Algorithm Description
The algorithm is implemented and described in terms of the explosion process of fireworks: explosions occur at specific points, and "sparks" fan out from the explosion. Each spark location is considered until an adequately optimal point is found.
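The explosion-and-selection loop can be sketched numerically; this is a toy one-dimensional version, with all parameter choices (spark counts, amplitude floor) illustrative rather than taken from any particular FWA paper:

```python
import random

def fireworks_minimize(f, lo, hi, n_fireworks=5, sparks_each=4, n_iters=100, seed=1):
    """Toy 1-D Fireworks Algorithm: each firework 'explodes' into sparks;
    good fireworks get small amplitudes (concentrated search nearby),
    bad ones get large amplitudes (exploration)."""
    rng = random.Random(seed)
    pop = [rng.uniform(lo, hi) for _ in range(n_fireworks)]
    for _ in range(n_iters):
        vals = [f(x) for x in pop]
        best, worst = min(vals), max(vals)
        sparks = []
        for x, v in zip(pop, vals):
            # explosion amplitude proportional to how bad this firework is
            amp = (hi - lo) * (v - best) / (worst - best + 1e-12)
            amp = max(amp, 0.05 * (hi - lo))   # keep a minimum spread
            for _ in range(sparks_each):
                sparks.append(min(hi, max(lo, x + rng.uniform(-amp, amp))))
        # survivors: the best locations among old fireworks and new sparks
        pop = sorted(pop + sparks, key=f)[:n_fireworks]
    return min(pop, key=f)
```

With a fixed seed the run is deterministic and converges close to the minimizer of a smooth test function.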
In computational linguistics, the Gale–Church algorithm is a method for aligning corresponding sentences in a parallel corpus. It works on the principle that equivalent sentences should roughly correspond in length—that is, longer sentences in one language should correspond to longer sentences in the other language. The algorithm was described in a 1993 paper by William A. Gale and Kenneth W. Church.
The generic sensor format (GSF) is a file format used for storing bathymetry data, such as that gathered by a multibeam echosounder.
The format was created by Scott Ferguson and Daniel A. Chayes.
The Hirschberg–Sinclair algorithm is a distributed algorithm designed for the leader election problem in a synchronous ring network. It is named after its inventors, Dan Hirschberg and J. B. Sinclair.
In computer chess, and in other games that computers play, late move reductions is a non-game-specific enhancement to the alpha–beta algorithm and its variants which attempts to examine a game search tree more efficiently. It uses the assumption that good game-specific move ordering causes a program to search the most likely moves early. If a cut-off is going to happen in a search, the first few moves are the ones most likely to cause it.
In computer science, lazy deletion refers to a method of deleting elements from a hash table that uses open addressing. In this method, deletions are done by marking an element as deleted, rather than erasing it entirely. Deleted locations are treated as empty when inserting and as occupied during a search.
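A minimal sketch of the idea, using linear probing and a tombstone sentinel (names and sizing are illustrative; a fuller implementation would also resize the table and guard against a duplicate key hidden behind a tombstone):

```python
EMPTY, DELETED = object(), object()   # sentinels: empty slot / tombstone

class LazyHashTable:
    """Open-addressing hash table (linear probing) with lazy deletion."""

    def __init__(self, capacity=16):
        self.slots = [EMPTY] * capacity

    def _probe(self, key):
        # yield slot indices starting from the key's home slot
        i = hash(key) % len(self.slots)
        for _ in range(len(self.slots)):
            yield i
            i = (i + 1) % len(self.slots)

    def insert(self, key):
        for i in self._probe(key):
            s = self.slots[i]
            if s is EMPTY or s is DELETED or s == key:  # tombstones count as empty
                self.slots[i] = key
                return
        raise RuntimeError("table full")

    def contains(self, key):
        for i in self._probe(key):
            s = self.slots[i]
            if s is EMPTY:                  # a truly empty slot ends the search
                return False
            if s is not DELETED and s == key:
                return True
            # tombstones count as occupied here: keep probing past them
        return False

    def delete(self, key):
        for i in self._probe(key):
            s = self.slots[i]
            if s is EMPTY:
                return
            if s is not DELETED and s == key:
                self.slots[i] = DELETED     # mark as deleted, don't erase
                return
```

The asymmetry is the whole point: a tombstone terminates nothing during search (keys inserted past it must stay reachable) but is reusable storage during insertion.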
In mathematical optimization, Lemke's algorithm is a procedure for solving linear complementarity problems, and more generally mixed linear complementarity problems. It is named after Carlton E. Lemke.
A left-leaning red–black (LLRB) tree is a type of self-balancing binary search tree. It is a variant of the red–black tree and guarantees the same asymptotic complexity for operations, but is designed to be easier to implement.
Properties of a left-leaning red–black tree
All of the red–black tree algorithms that have been proposed are characterized by a worst-case search time bounded by a small constant multiple of log N in a tree of N keys, and the behavior observed in practice is typically that same multiple faster than the worst-case bound, close to the optimal log N nodes that would be examined in a perfectly balanced tree.
In computer science, the longest repeated substring problem is the problem of finding the longest substring of a string that occurs at least twice.
This problem can be solved in linear time and space Θ(n) by building a suffix tree for the string (with a special end-of-string symbol like '$' appended) and finding the deepest internal node in the tree with more than one child. Depth is measured by the number of characters traversed from the root.
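Linear-time suffix-tree construction is involved; the same deepest-repeated-prefix idea can be illustrated with a simple (quadratic) sorted-suffix version:

```python
def longest_repeated_substring(s):
    """Longest substring of s occurring at least twice.
    Quadratic sorted-suffix sketch (not the linear-time suffix tree):
    any repeated substring is a common prefix of two suffixes, and the
    longest such prefix appears between lexicographic neighbours."""
    suffixes = sorted(s[i:] for i in range(len(s)))
    best = ""
    for a, b in zip(suffixes, suffixes[1:]):
        # length of the common prefix of two adjacent sorted suffixes
        k = 0
        while k < min(len(a), len(b)) and a[k] == b[k]:
            k += 1
        if k > len(best):
            best = a[:k]
    return best
```

For example, `longest_repeated_substring("banana")` yields `"ana"`, the deepest repeated prefix among banana's suffixes.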
LU reduction is an algorithm related to LU decomposition. The term is usually used in the context of supercomputing and highly parallel computing, where it serves as a benchmarking algorithm, i.e., a means of evaluating a machine's floating-point performance.
In computer science, the minimum routing cost spanning tree of a weighted graph is a spanning tree minimizing the sum of pairwise distances between vertices in the tree. It is also called the optimum distance spanning tree, shortest total path length spanning tree, minimum total distance spanning tree, or minimum average distance spanning tree. In an unweighted graph, this is the spanning tree of minimum Wiener index.
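The cost being minimized can be computed directly for a candidate spanning tree; a sketch for the unweighted case (the adjacency-list encoding is ad hoc):

```python
from collections import deque

def routing_cost(adj):
    """Sum of distances over all unordered vertex pairs of an unweighted
    tree (its Wiener index), computed via one BFS per vertex."""
    total = 0
    for src in adj:
        dist = {src: 0}
        q = deque([src])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        total += sum(dist.values())
    return total // 2   # each pair was counted from both endpoints
```

On four vertices, the star (cost 9) beats the path (cost 10), which is why the minimum routing cost spanning tree can differ from, say, a minimum-weight spanning tree.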
In computational geometry, the multiple line segment intersection problem supplies a list of line segments in the Euclidean plane and asks whether any two of them intersect (cross).
Simple algorithms examine each pair of segments. However, if a large number of possibly intersecting segments are to be checked, this becomes increasingly inefficient since most pairs of segments are not close to one another in a typical input sequence.
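The naive pairwise approach can be sketched with the standard orientation test; a sweep-line algorithm improves on exactly this O(n²) pair loop:

```python
def orient(p, q, r):
    """Sign of the cross product (q-p) x (r-p): turn direction of p->q->r."""
    v = (q[0] - p[0]) * (r[1] - p[1]) - (q[1] - p[1]) * (r[0] - p[0])
    return (v > 0) - (v < 0)

def on_segment(p, q, r):
    """Is q inside the bounding box of segment pr (used for collinear cases)?"""
    return (min(p[0], r[0]) <= q[0] <= max(p[0], r[0]) and
            min(p[1], r[1]) <= q[1] <= max(p[1], r[1]))

def segments_intersect(a, b, c, d):
    """Do segments ab and cd intersect (including touching/collinear cases)?"""
    o1, o2 = orient(a, b, c), orient(a, b, d)
    o3, o4 = orient(c, d, a), orient(c, d, b)
    if o1 != o2 and o3 != o4:       # proper crossing
        return True
    return ((o1 == 0 and on_segment(a, c, b)) or
            (o2 == 0 and on_segment(a, d, b)) or
            (o3 == 0 and on_segment(c, a, d)) or
            (o4 == 0 and on_segment(c, b, d)))

def any_intersection(segments):
    """Naive O(n^2) pairwise check over a list of ((x1,y1),(x2,y2)) segments."""
    return any(segments_intersect(*segments[i], *segments[j])
               for i in range(len(segments))
               for j in range(i + 1, len(segments)))
```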
Non-cryptographic hash functions (NCHFs) are hash functions intended for applications that do not need the rigorous security requirements of cryptographic hash functions (e.g., preimage resistance) and can therefore be faster and less resource-intensive.
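A classic example is FNV-1a, shown here in its 32-bit form; each step is trivially invertible, which is acceptable precisely because preimage resistance is not required:

```python
def fnv1a_32(data: bytes) -> int:
    """32-bit FNV-1a: XOR each byte into the state, multiply by the FNV
    prime, and wrap to 32 bits. Fast and well-dispersed, but offers no
    preimage resistance -- fine for hash tables, not for security."""
    h = 0x811C9DC5                               # FNV offset basis
    for byte in data:
        h ^= byte
        h = (h * 0x01000193) & 0xFFFFFFFF        # FNV prime, mod 2^32
    return h
```

The two constants (offset basis and prime) are the published 32-bit FNV parameters.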
In mathematics, the Odlyzko–Schönhage algorithm is a fast algorithm for evaluating the Riemann zeta function at many points, introduced by Odlyzko and Schönhage (1988). The main point is the use of the fast Fourier transform to speed up the evaluation of a finite Dirichlet series of length N at O(N) equally spaced values from O(N²) to O(N^(1+ε)) steps (at the cost of storing O(N^(1+ε)) intermediate values). The Riemann–Siegel formula used for calculating the Riemann zeta function with imaginary part T uses a finite Dirichlet series with about N = T^(1/2) terms, so when finding about N values of the Riemann zeta function it is sped up by a factor of about T^(1/2).
The pairwise sorting network is a sorting network discovered and published by Ian Parberry in 1992 in Parallel Processing Letters. The pairwise sorting network has the same size (number of comparators) and depth as the odd–even mergesort network. At the time of publication, the network was one of several known networks with a depth of O(log² n).
Perceptual hashing is the use of a fingerprinting algorithm that produces a snippet, hash, or fingerprint of various forms of multimedia. A perceptual hash is a type of locality-sensitive hash: if features of the multimedia are similar, the hashes are similar too. This is not to be confused with cryptographic hashing, which relies on the avalanche effect, whereby a small change in input value creates a drastic change in output value.
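A toy "average hash" (aHash) illustrates the locality-sensitive property: similar inputs give hashes at small Hamming distance. A real implementation would first resize the image and convert it to grayscale; here the input is assumed to already be a small 2-D list of gray values:

```python
def average_hash(pixels):
    """Toy perceptual 'average hash': bit = 1 where the pixel is above the
    image mean. Real implementations first resize to e.g. 8x8 grayscale."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(h1, h2):
    """Perceptually similar inputs give hashes at small Hamming distance."""
    return bin(h1 ^ h2).count("1")
```

Unlike a cryptographic hash, small pixel noise leaves the hash unchanged, while inverting the image flips every bit.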
A prefix hash tree (PHT) is a distributed data structure that enables more sophisticated queries over a distributed hash table (DHT). The prefix hash tree uses the lookup interface of a DHT to construct a trie-based data structure that is both efficient (updates are doubly logarithmic in the size of the domain being indexed), and resilient (the failure of any given node in a prefix hash tree does not affect the availability of data stored at other nodes).
Principle of deferred decisions is a technique used in analysis of randomized algorithms.
Definition
A randomized algorithm makes a set of random choices. These random choices may be intricately related, making the algorithm difficult to analyze.
The Priority R-tree is a worst-case asymptotically optimal alternative to the R-tree spatial index. It was first proposed by Arge, de Berg, Haverkort, and Yi in an article from 2004.
In analysis of algorithms, probabilistic analysis of algorithms is an approach to estimate the computational complexity of an algorithm or a computational problem. It starts from an assumption about a probabilistic distribution of the set of all possible inputs. This assumption is then used to design an efficient algorithm or to derive the complexity of a known algorithm.
Rapidly exploring dense trees is a family of planning algorithms that includes the rapidly exploring random tree.
SatZ is a well-known SAT instance solver. It was developed by Prof. Chu Min Li, a computer science researcher.
A sequence step algorithm (SQS-AL) is an algorithm implemented in a discrete event simulation system to maximize resource utilization. This is achieved by running through two main nested loops: a sequence step loop and a replication loop. For each sequence step, each replication loop is a simulation run that collects crew idle time for activities in that sequence step.
In computer science, a sequential algorithm or serial algorithm is an algorithm that is executed sequentially – once through, from start to finish, without other processing executing – as opposed to concurrently or in parallel. The term is primarily used to contrast with concurrent algorithm or parallel algorithm; most standard computer algorithms are sequential algorithms, and not specifically identified as such, as sequentialness is a background assumption. Concurrency and parallelism are in general distinct concepts, but they often overlap – many distributed algorithms are both concurrent and parallel – and thus "sequential" is used to contrast with both, without distinguishing which one.
Snap rounding is a method of approximating line segment locations by creating a grid and placing each point in the centre of a cell (pixel) of the grid. The method preserves certain topological properties of the arrangement of line segments.
Drawbacks include the potential interpolation of additional vertices in line segments (lines become polylines), the arbitrary closeness of a point to a non-incident edge, and arbitrary numbers of intersections between input line-segments.
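The basic snapping step can be sketched directly; a complete snap-rounding implementation would also snap intersection points and reroute segments through "hot" pixels, which is what turns lines into polylines:

```python
import math

def snap_point(p, pixel=1.0):
    """Snap a point to the centre of the grid cell (pixel) containing it."""
    return (math.floor(p[0] / pixel) * pixel + pixel / 2,
            math.floor(p[1] / pixel) * pixel + pixel / 2)

def snap_segment(seg, pixel=1.0):
    """Snap a segment's two endpoints to pixel centres."""
    return tuple(snap_point(p, pixel) for p in seg)
```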
Successive Linear Programming (SLP), also known as Sequential Linear Programming, is an optimization technique for approximately solving nonlinear optimization problems. It is related to, but distinct from, quasi-Newton methods.
Starting at some estimate of the optimal solution, the method is based on solving a sequence of first-order approximations (i.e., linearizations) of the model, each of which is solved as a linear program.
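A minimal one-dimensional sketch, assuming a simple box constraint so that each linearized subproblem can be solved in closed form (in real SLP the subproblem is a general linear program, usually restricted to a trust region):

```python
def slp_minimize(f, grad, x, lo, hi, delta=1.0, iters=100):
    """1-D SLP sketch: the linearization f(x) + f'(x)*d is minimized over
    the box [lo, hi] intersected with a trust region [x-delta, x+delta];
    in 1-D that LP's solution is simply the corner opposite the gradient."""
    for _ in range(iters):
        g = grad(x)
        cand = max(lo, x - delta) if g > 0 else min(hi, x + delta)
        if f(cand) < f(x):
            x = cand                 # accept the linearized step
        else:
            delta *= 0.5             # linear model untrustworthy: shrink region
        if delta < 1e-9:
            break
    return x
```

The trust region is essential: without it, each linear program would push the iterate to a box corner regardless of where the linearization is accurate.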
In mathematics and computer science, symbolic-numeric computation is the use of software that combines symbolic and numeric methods to solve problems.
The symmetric hash join is a special type of hash join designed for data streams.
Algorithm
For each input, create a hash table.
For each new record, hash it and insert it into its own input's hash table, then probe the other input's hash table for matching records.
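The two steps above can be sketched for a single interleaved stream of (side, key, payload) records; the names and tuple layout here are illustrative:

```python
from collections import defaultdict

def symmetric_hash_join(stream):
    """Symmetric hash join over an interleaved stream of (side, key,
    payload) records, side in {'L', 'R'}. Each record is inserted into
    its own side's hash table, then immediately probed against the other
    side's table, so join results stream out as soon as both partners
    have arrived."""
    tables = {'L': defaultdict(list), 'R': defaultdict(list)}
    out = []
    for side, key, payload in stream:
        other = 'L' if side == 'R' else 'R'
        tables[side][key].append(payload)           # step 1: insert
        for match in tables[other].get(key, []):    # step 2: probe
            left, right = (match, payload) if side == 'R' else (payload, match)
            out.append((key, left, right))
    return out
```

Because both tables are probed symmetrically, no input has to be fully materialized before results appear, which is what makes the join suitable for streams.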
A trace table is a technique used to test algorithms in order to make sure that no logical errors occur while the calculations are being processed. The table usually takes the form of a multi-column, multi-row table, with each column showing a variable, and each row showing each number input into the algorithm and the subsequent values of the variables.
Trace tables are typically used in schools and colleges when teaching students how to program.
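A trace table can itself be generated mechanically; this sketch traces a loop summing 1..n, printing one row per iteration and one column per variable:

```python
def trace(n):
    """Build (and print) a trace table for a loop computing 1 + 2 + ... + n:
    one row per iteration, one column per variable."""
    i, total = 1, 0
    rows = [("i", "total")]            # header row: one column per variable
    while i <= n:
        total += i
        rows.append((str(i), str(total)))
        i += 1
    for row in rows:
        print("{:>5} {:>7}".format(*row))
    return rows
```

For `trace(3)` the rows record i = 1, 2, 3 alongside total = 1, 3, 6, making any logic error in the loop immediately visible.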
The Unicode collation algorithm (UCA) is an algorithm defined in Unicode Technical Report #10, which is a customizable method to produce binary keys from strings representing text in any writing system and language that can be represented with Unicode. These keys can then be efficiently compared byte by byte in order to collate or sort strings according to the rules of the language, with options for ignoring case, accents, etc. Unicode Technical Report #10 also specifies the Default Unicode Collation Element Table (DUCET), a data file that specifies a default collation ordering; the DUCET is customizable for different languages.
In data mining, the WINEPI algorithm is an influential algorithm for episode mining, which helps discover the knowledge hidden in an event sequence.
WINEPI derives part of its name from the fact that it uses a sliding window to go through the event sequence.
The outcome of the algorithm is a set of episode rules, which describe temporal relationships between events and form an extension of association rules.
In computer science tree data structures, an X-tree (for eXtended node tree) is an index tree structure based on the R-tree used for storing data in many dimensions. It appeared in 1996, and differs from R-trees (1984), R+-trees (1987) and R*-trees (1990) because it emphasizes prevention of overlap in the bounding boxes, which increasingly becomes a problem in high dimensions. In cases where nodes cannot be split without preventing overlap, the node split will be deferred, resulting in super-nodes.
In artificial intelligence (AI), particularly machine learning (ML), ablation is the removal of a component of an AI system. An ablation study investigates the performance of an AI system by removing certain components to understand the contribution of the component to the overall system.
The term is an analogy with biology (removal of components of an organism), and is particularly used in the analysis of artificial neural nets by analogy with ablative brain surgery.
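The procedure can be sketched generically: remove one component at a time, re-evaluate, and attribute the accuracy drop to the removed component. The component/combiner encoding below is a toy assumption, not any standard API:

```python
def accuracy(predict, data):
    """Fraction of (x, label) pairs classified correctly."""
    return sum(predict(x) == y for x, y in data) / len(data)

def ablation_study(components, combine, data):
    """Toy ablation study: measure full-system accuracy, then re-measure
    with each named component removed; the drop is taken as that
    component's contribution to the overall system."""
    full = accuracy(combine(components), data)
    drops = {}
    for name in components:
        reduced = {k: v for k, v in components.items() if k != name}
        drops[name] = full - accuracy(combine(reduced), data)
    return full, drops
```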
The Artificial Intelligence Center is a laboratory in the Information and Computing Sciences Division of SRI International. It was founded in 1966 by Charles Rosen and studies artificial intelligence. One of their early projects was Shakey the Robot, the first general-purpose mobile robot.
Artificial intelligence in pharmacy is the application of artificial intelligence (AI) to the discovery and development of medications and to the treatment of patients with them. AI in pharmacy practice has the potential to revolutionize all aspects of pharmaceutical research as well as to improve the clinical application of pharmaceuticals to prevent, treat, or cure disease. AI, a technology that enables machines to simulate human intelligence, has found applications in pharmaceutical research, drug manufacturing, and patient-centered services.
Artificial wisdom is a software system that can demonstrate one or more qualities of being wise.
Artificial wisdom can be described as artificial intelligence reaching the top level of decision-making when confronted with the most complex and challenging situations. The term artificial wisdom is used when the "intelligence" rests not merely on haphazardly collecting and interpreting data, but is enriched by design with the deliberate, conscientious strategies that wise people would use.
Attributional calculus is a logic and representation system defined by Ryszard S. Michalski. It combines elements of predicate logic, propositional calculus, and multi-valued logic | https://huggingface.co/datasets/fmars/wiki_stem |
Automated negotiation is a form of interaction in systems that are composed of multiple autonomous agents, in which the aim is to reach agreements through an iterative process of making offers. Automated negotiation can be employed for many tasks human negotiators regularly engage in, such as bargaining and joint decision making. The main topics in automated negotiation revolve around the design of protocols and strategies | https://huggingface.co/datasets/fmars/wiki_stem |
BabyX is an interactive lifelike virtual infant created through the use of artificial intelligence by Mark Sagar, Creator and Director of the Laboratory for Animate Technologies located at the University of Auckland's Bioengineering Institute.
Created in 2013, BabyX is a virtual animated baby that learns and reacts like a human baby and was designed after the likeness of Sagar’s own daughter when she was 18 months old. BabyX has a virtual brain built with detailed likeness to the human brain and works through an operating system called Brain Language, invented by Sagar and his team of researchers.
The belief–desire–intention (BDI) model of human practical reasoning was developed by Michael Bratman as a way of explaining future-directed intention.
BDI is fundamentally reliant on folk psychology (the 'theory theory'), which is the notion that our mental models of the world are theories. It was used as a basis for developing the belief–desire–intention software model.
The blocks world is a planning domain in artificial intelligence. The domain consists of a set of wooden blocks of various shapes and colors sitting on a table. The goal is to build one or more vertical stacks of blocks.
ChipTest was a 1985 chess playing computer built by Feng-hsiung Hsu, Thomas Anantharaman and Murray Campbell at Carnegie Mellon University. It is the predecessor of Deep Thought which in turn evolved into Deep Blue.
ChipTest was based on a special VLSI-technology move generator chip developed by Hsu.
The CN2 induction algorithm is a learning algorithm for rule induction. It is designed to work even when the training data is imperfect. It is based on ideas from the AQ algorithm and the ID3 algorithm.
Computational cybernetics is the integration of cybernetics and computational intelligence techniques. Though the term cybernetics entered the technical lexicon in the 1940s and 1950s, it was first used informally as a popular noun in the 1960s, when it became associated with computers, robotics, artificial intelligence, and science fiction.
The initial promise of cybernetics was that it would revolutionise the mathematical biologies (a blanket term that includes some kinds of AI) by its use of closed loop semantics rather than open loop mathematics to describe and control living systems and biological process behaviours.
The theoretical strength of a solid is the maximum possible stress a perfect solid can withstand. It is often much higher than what current real materials can achieve. The lowered fracture stress is due to defects, such as interior or surface cracks.
Thermal barrier coatings (TBCs) are advanced materials systems usually applied to metallic surfaces operating at elevated temperatures, such as gas turbine or aero-engine parts, as a form of exhaust heat management. These 100 μm to 2 mm thick coatings of thermally insulating materials serve to insulate components from large and prolonged heat loads and can sustain an appreciable temperature difference between the load-bearing alloys and the coating surface. In doing so, these coatings can allow for higher operating temperatures while limiting the thermal exposure of structural components, extending part life by reducing oxidation and thermal fatigue.
A thermal history coating (THC) is a robust coating containing various non-toxic chemical compounds whose crystal structures irreversibly change at high temperatures. This allows for temperature measurements and thermal analysis to be performed on intricate and inaccessible components, which operate in harsh environments. Like thermal barrier coatings, THCs provide protection from intense heat to the surfaces on which they are applied.
A thermal interface material (often abbreviated as TIM) is any material that is inserted between two components in order to enhance the thermal coupling between them. A common use is heat dissipation, in which the TIM is inserted between a heat-producing device (e.g., an integrated circuit) and a heat-dissipating device (e.g., a heat sink).
Thermal spraying techniques are coating processes in which melted (or heated) materials are sprayed onto a surface. The "feedstock" (coating precursor) is heated by electrical (plasma or arc) or chemical means (combustion flame).
Thermal spraying can provide thick coatings (approx. 20 μm to several mm, depending on the process and feedstock) over a large area at a high deposition rate.
Thermally stimulated depolarization current (TSDC) is a scientific technique used to measure dielectric properties of materials. It can be used to measure the thermally stimulated depolarization of molecules within a material. One method of doing so is to place the material between two electrodes, cool the material in the presence of an external electric field, remove the field once a desired temperature has been reached, and measure the current between the electrodes as the material warms.
Thermoelectric materials show the thermoelectric effect in a strong or convenient form.
The thermoelectric effect refers to phenomena by which either a temperature difference creates an electric potential or an electric current creates a temperature difference. These phenomena are known more specifically as the Seebeck effect (creating a voltage from temperature difference), Peltier effect (driving heat flow with an electric current), and Thomson effect (reversible heating or cooling within a conductor when there is both an electric current and a temperature gradient).
Thermogravimetric analysis or thermal gravimetric analysis (TGA) is a method of thermal analysis in which the mass of a sample is measured over time as the temperature changes. This measurement provides information about physical phenomena, such as phase transitions, absorption, adsorption, and desorption, as well as chemical phenomena including chemisorption, thermal decomposition, and solid–gas reactions (e.g., oxidation or reduction).
Thermomechanical analysis (TMA) is a technique used in thermal analysis, a branch of materials science which studies the properties of materials as they change with temperature.
Thermomechanical analysis is a subdiscipline of the thermomechanometry (TM) technique.
Related techniques and terminology
Thermomechanometry is the measurement of a change of a dimension or a mechanical property of the sample while it is subjected to a temperature regime.
Thermoplastic olefin, thermoplastic polyolefin (TPO), or olefinic thermoplastic elastomers refer to polymer/filler blends usually consisting of some fraction of a thermoplastic, an elastomer or rubber, and usually a filler. Outdoor applications such as roofing frequently contain TPO because it does not degrade under solar UV radiation, a common problem with nylons. TPO is used extensively in the automotive industry.
A thin film is a layer of material ranging from fractions of a nanometer (monolayer) to several micrometers in thickness. The controlled synthesis of materials as thin films (a process referred to as deposition) is a fundamental step in many applications. A familiar example is the household mirror, which typically has a thin metal coating on the back of a sheet of glass to form a reflective interface.
Thiolated polymers – designated thiomers – are functional polymers used in biotechnology product development with the intention to prolong mucosal drug residence time and to enhance absorption of drugs. The name thiomer was coined by Andreas Bernkop-Schnürch in 2000. Thiomers have thiol-bearing side chains.
Time resolved microwave conductivity (TRMC) is an experimental technique used to evaluate the electronic properties of semiconductors. Specifically, it is used to evaluate a proxy for charge carrier mobility and a representative carrier lifetime from light-induced changes in conductance. The technique works by photo-generating electrons and holes in a semiconductor, allowing these charge carriers to move under a microwave field, and detecting the resulting changes in the electric field.
A touchstone is a small tablet of dark stone such as slate or lydite, used for assaying precious metal alloys. It has a finely grained surface on which soft metals leave a visible trace.
History
The touchstone was used during the Harappan period of the Indus Valley civilization, ca. 2600–1900 BC.
In materials science and metallurgy, toughness is the ability of a material to absorb energy and plastically deform without fracturing. Toughness is the strength with which the material opposes rupture. One definition of material toughness is the amount of energy per unit volume that a material can absorb before rupturing.
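That energy-per-unit-volume definition is the area under the stress–strain curve up to fracture, which can be approximated numerically from test data; the sketch below assumes stress in Pa and dimensionless strain:

```python
def toughness(strain, stress):
    """Area under the stress-strain curve (trapezoidal rule): the energy
    absorbed per unit volume before rupture, in J/m^3 for stress in Pa."""
    area = 0.0
    for i in range(1, len(strain)):
        area += 0.5 * (stress[i] + stress[i - 1]) * (strain[i] - strain[i - 1])
    return area
```

For a purely linear-elastic ramp to 200 MPa at 1% strain, the triangle area gives 0.5 × 200 MPa × 0.01 = 1 MJ/m³.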
Meilogu is a geographical, cultural and natural region located in the northern part of Sardinia, in the province of Sassari, which can be considered a sub-region of Logudoro.
It borders the Sardinian sub-region of Sassarese to the north, Montacuto to the east, Goceano and Marghine to the south, and Planargia to the west.
Bonorva is the main municipality in the territory.
The Highlands (Scots: the Hielands; Scottish Gaelic: a’ Ghàidhealtachd [ə ˈɣɛːəl̪ˠt̪ʰəxk], 'the place of the Gaels') is a historical region of Scotland. Culturally, the Highlands and the Lowlands diverged from the Late Middle Ages into the modern period, when Lowland Scots replaced Scottish Gaelic throughout most of the Lowlands. The term is also used for the area north and west of the Highland Boundary Fault, although the exact boundaries are not clearly defined, particularly to the east.
Alaska (ə-LAS-kə) is a U.S. state on the northwest extremity of North America.
A Textbook of General Botany is a botany book first published in 1924 by Gilbert M. Smith (1885–1959), James B. Overton, Edward M.
The Jepson Manual is a flora of the vascular plants that are either native to or naturalized in California. Botanists often refer to the book simply as Jepson. It is produced by the University and Jepson Herbaria, of the University of California, Berkeley.
The concept of Cascadian bioregionalism is closely identified with the environmental movement. In the early 1970s, the contemporary vision of bioregionalism began to be formed through collaboration between natural scientists, social and environmental activists, artists and writers, community leaders, and back-to-the-landers who worked directly with natural resources. A bioregion is defined in terms of the unique overall pattern of natural characteristics that are found in a specific place.
Lake Agassiz was a large proglacial lake that existed in central North America during the late Pleistocene, fed by meltwater from the retreating Laurentide Ice Sheet at the end of the last glacial period. At its peak, the lake's area was larger than that of all of the modern Great Lakes combined. It was first postulated in 1823 by William H. Keating.
Lake Algonquin was a prehistoric proglacial lake that existed in east-central North America at the time of the last ice age. Parts of the former lake are now Lake Huron, Georgian Bay, Lake Superior, Lake Michigan, Lake Nipigon, and Lake Nipissing.
The lake varied in size; it was at its largest during the post-glacial period and gradually shrank to the current Lake Huron and Georgian Bay.
Lake Arkona was a stage of the lake waters in the Huron-Erie-Ontario basin following the end of the Lake Maumee levels and before the Lake Whittlesey stages, named for Arkona, Ontario, about 50 miles (80 km) east of Sarnia.
Beaches
The ice sheet had withdrawn north of the "thumb" of Michigan, then advanced southward, raising water levels to the east of the "thumb", but not those to the west. This created four distinct areas around Lake Arkona:
In the Saginaw basin, where the Arkona beaches were neither submerged nor modified;
The area on the "thumb", where the beaches were overridden by the ice and destroyed;
The Black River valley, where the beaches were submerged but protected from modification; and
The area, which would be Lake Whittlesey, where the beaches were submerged and modified by storm waves | https://huggingface.co/datasets/fmars/wiki_stem |
Lake Duluth was a proglacial lake that formed in the Lake Superior drainage basin as the Laurentide Ice Sheet retreated. The oldest existing shorelines were formed after retreat from the Greatlakean advance (previously called the Valders), sometime around 11,000 years B. P | https://huggingface.co/datasets/fmars/wiki_stem |
Early Lake Erie was a prehistoric proglacial lake that existed at the end of the last ice age approximately 13,000 years ago. The early Erie fed waters to Glacial Lake Iroquois.
The ancient lake was similar in size to the current lake during glacial retreat, but for some period the eastern half of the lake was covered with ice | https://huggingface.co/datasets/fmars/wiki_stem |
A nor'easter (also northeaster; see below), or an East Coast low, is a synoptic-scale extratropical cyclone in the western North Atlantic Ocean. The name derives from the direction of the winds that blow from the northeast.
Typically, such storms originate as a low-pressure area that forms within 100 miles (160 km) of the shore between North Carolina and Massachusetts | https://huggingface.co/datasets/fmars/wiki_stem |
The April or Spring nor'easter of 2007 was a nor'easter that affected mainly the eastern parts of North America during its four-day course, from April 14 to April 17, 2007. The combined effects of high winds, heavy rainfall, and high tides led to flooding, storm damage, power outages, and evacuations, and disrupted traffic and commerce. In the north, heavy wet snow caused the loss of power for several thousand homes in Ontario and Quebec | https://huggingface.co/datasets/fmars/wiki_stem |
The April 2021 nor'easter, also referred to as the 2021 Spring nor'easter, was a significant late-season nor'easter that impacted much of New England with heavy snowfall, gusty winds, thundersnow, and near-whiteout conditions from April 15–17, 2021. The system originated from a weak frontal system late on April 14 over North Carolina, which moved into the ocean the next day and began to strengthen. The low-pressure area steadily deepened as it moved up the East Coast, and developed an eye-like feature just prior to peak intensity | https://huggingface.co/datasets/fmars/wiki_stem |
The February 1987 nor'easter was a significant winter storm in the US that impacted the Mid-Atlantic States around the end of the month. It delivered 8–12 hours of heavy, wet snowfall to several states from West Virginia to New York between February 22 and February 24. The storm was both preceded and followed by relatively warm temperatures, causing the snow to rapidly melt | https://huggingface.co/datasets/fmars/wiki_stem |
StepManiaX (abbreviated SMX and pronounced "Step Maniacs") is a rhythm game developed and published by Step Revolution, a studio formed by former developers of In the Groove, ReRave, and Pump It Up Pro. It is considered a spiritual successor to the In the Groove series. The name is a nod to the legacy of the open-source simulator StepMania, as many of the original StepMania developers were involved with the project | https://huggingface.co/datasets/fmars/wiki_stem |
Street Fighter II Turbo: Hyper Fighting is a competitive fighting game released by Capcom for arcades in 1992. It is the third arcade version of Street Fighter II, part of the Street Fighter franchise, following Street Fighter II: Champion Edition, and was initially released as an enhancement kit for that game. Released less than a year after the previous installment, Turbo introduced a faster playing speed and new special moves for certain characters, as well as further refinement to the character balance | https://huggingface.co/datasets/fmars/wiki_stem |
Strikers 1945 II (ストライカーズ1945II) is a vertically-scrolling shoot 'em up game developed and originally published by Psikyo in 1997 for the arcades as a follow-up to Strikers 1945. This game was also ported by Kuusou Kagaku to the PlayStation and Sega Saturn for Psikyo and re-released by Success in 2000. Agetec released Strikers 1945 II for the PlayStation in North America under the title Strikers 1945 in 2001, and Midas Games released it in Europe as a budget title in 2003 | https://huggingface.co/datasets/fmars/wiki_stem |
Strikers 1945 III, also known as Strikers 1999 (ストライカーズ1999), is a vertically scrolling shoot 'em up game developed and originally released by Psikyo in 1999 for the arcades. The game is a sequel to Strikers 1945 II, chronologically taking place 54 years after the first two games in the series.
Gameplay
The player chooses from one of six modern jet fighter aircraft and shoots through eight stages (the first four levels are randomly chosen, and the last four remain the same for each play) | https://huggingface.co/datasets/fmars/wiki_stem |
Super Chase H. Q. is a racing game developed by Taito | https://huggingface.co/datasets/fmars/wiki_stem |
The Super Spy is an early Neo Geo game released by SNK in 1990. It is a first-person shooter and beat 'em up game with action role-playing elements in which players move through the many floors of an office building shooting terrorists. It is a first-person game where the player character's arms and weapons are visible on screen | https://huggingface.co/datasets/fmars/wiki_stem |
The Weteye bomb was a U. S. chemical weapon designed for the U | https://huggingface.co/datasets/fmars/wiki_stem |
On June 1, 1990, Presidents George H. W. Bush and Mikhail Gorbachev signed the bilateral U | https://huggingface.co/datasets/fmars/wiki_stem |
The Chemical and Biological Arms Control Institute was a private, nonprofit, nonpartisan policy research organization established in 1993 to address the challenges to global security and stability in the early 21st century, with a special, but not exclusive, focus on the elimination of chemical weapons and biological weapons. It fostered this goal through an innovative program of research, analysis, technical support, training, and education. CBACI's objective was to promote a strategic approach to contemporary national security challenges that fosters the translation of ideas into action | https://huggingface.co/datasets/fmars/wiki_stem |
A chemical weapon (CW) is a specialized munition that uses chemicals formulated to inflict death or harm on humans. According to the Organisation for the Prohibition of Chemical Weapons (OPCW), this can be any chemical compound intended as a weapon "or its precursor that can cause death, injury, temporary incapacitation or sensory irritation through its chemical action". Munitions or other delivery devices designed to deliver chemical weapons, whether filled or unfilled, are also considered weapons themselves | https://huggingface.co/datasets/fmars/wiki_stem |
Many nations continue to research and/or stockpile chemical weapon agents despite numerous efforts to reduce or eliminate them. Most states have joined the Chemical Weapons Convention (CWC), which required the destruction of all chemical weapons by 2012. Twelve nations have declared chemical weapons production facilities and six nations have declared stockpiles of chemical weapons | https://huggingface.co/datasets/fmars/wiki_stem |
Operation Davy Jones' Locker (or Davey) was a U. K. and U | https://huggingface.co/datasets/fmars/wiki_stem |
Throughout history, chemical weapons have been used as strategic weaponry to devastate the enemy in times of war. After the mass destruction created by WWI and WWII, chemical weapons have been considered to be inhumane by most nations, and governments and organizations have undertaken to locate and destroy existing chemical weapons. However, not all nations have been willing to cooperate with disclosing or demilitarizing their inventory of chemical weapons | https://huggingface.co/datasets/fmars/wiki_stem |
The Field Deployable Hydrolysis System (FDHS) is a transportable, high throughput neutralization system developed by the U. S. Army for converting chemical warfare material into compounds not usable as weapons | https://huggingface.co/datasets/fmars/wiki_stem |
The Finnish Institute for Verification of the Chemical Weapons Convention (VERIFIN) is a Finnish institute carrying out several roles in support of chemical weapons disarmament.
Established in 1994 as a continuation of a research project started in 1973, it is located within the Chemistry Department of the Kumpula Campus of the University of Helsinki.
Funded by the Finnish Ministry for Foreign Affairs, its main task is to develop improved methods of verification of chemical weapons disarmament | https://huggingface.co/datasets/fmars/wiki_stem |
The Protocol for the Prohibition of the Use in War of Asphyxiating, Poisonous or other Gases, and of Bacteriological Methods of Warfare, usually called the Geneva Protocol, is a treaty prohibiting the use of chemical and biological weapons in international armed conflicts. It was signed at Geneva on 17 June 1925 and entered into force on 8 February 1928. It was registered in the League of Nations Treaty Series on 7 September 1929 | https://huggingface.co/datasets/fmars/wiki_stem |
Operation Geranium was a U. S. Army mission that dumped more than 3,000 tons of the chemical agent lewisite into the ocean off the Florida coast in 1948 | https://huggingface.co/datasets/fmars/wiki_stem |
The Middle East Treaty Organization (METO) is a non-governmental organization founded in 2017 by a coalition of civil-society activists and disarmament practitioners, with the aim of ridding the Middle East of all weapons of mass destruction (WMD). This proposal is in line with the 1970s proposal for a Middle East nuclear weapon free zone, albeit with broader scope following the 1990 Mubarak Initiative to include chemical and biological as well as nuclear weapons.
Working toward the broader vision of regional security and peace, METO defines its purpose as the establishment of a zone free of weapons of mass destruction (WMDFZ) in the Middle East | https://huggingface.co/datasets/fmars/wiki_stem |
The "Statement on Chemical and Biological Defense Policies and Programs" was a speech delivered on November 25, 1969, by U. S. President Richard Nixon | https://huggingface.co/datasets/fmars/wiki_stem |