Polymake is software for the algorithmic treatment of convex polyhedra. Although primarily a tool to study the combinatorics and the geometry of convex polytopes and polyhedra, it is by now also capable of dealing with simplicial complexes, matroids, polyhedral fans, graphs, tropical objects, toric varieties and other objects. Polymake has been cited in over 100 recent articles indexed by Zentralblatt MATH, as can be seen from its entry in the swMATH database.
https://huggingface.co/datasets/fmars/wiki_stem
In computational geometry, a power diagram, also called a Laguerre–Voronoi diagram, Dirichlet cell complex, radical Voronoi tessellation or a sectional Dirichlet tessellation, is a partition of the Euclidean plane into polygonal cells defined from a set of circles. The cell for a given circle C consists of all the points for which the power distance to C is smaller than the power distance to the other circles. The power diagram is a form of generalized Voronoi diagram, and coincides with the Voronoi diagram of the circle centers in the case that all the circles have equal radii
Privacy-preserving computational geometry is the research area on the intersection of the domains of secure multi-party computation (SMC) and computational geometry. Classical problems of computational geometry reconsidered from the point of view of SMC include shape intersection, private point inclusion problem, range searching, convex hull, and more. A pioneering work in this area was a 2001 paper by Atallah and Du, in which the secure point in polygon inclusion and polygonal intersection problems were considered
In mathematics, a random polytope is a structure commonly used in convex analysis and the analysis of linear programs in d-dimensional Euclidean space ℝ^d. The construction and definition of a random polytope may differ depending on the intended use. Definition There are multiple non-equivalent definitions of a random polytope
In computing, especially computational geometry, a real RAM (random-access machine) is a mathematical model of a computer that can compute with exact real numbers instead of the binary fixed-point or floating-point numbers used by most actual computers. The real RAM was formulated by Michael Ian Shamos in his 1978 Ph.D. dissertation
In graph theory, the rectilinear minimum spanning tree (RMST) of a set of n points in the plane (or more generally, in ℝd) is a minimum spanning tree of that set, where the weight of the edge between each pair of points is the rectilinear distance between those two points. Properties and algorithms By explicitly constructing the complete graph on n vertices, which has n(n-1)/2 edges, a rectilinear minimum spanning tree can be found using existing algorithms for finding a minimum spanning tree. In particular, using Prim's algorithm with an adjacency matrix yields time complexity O(n²)
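The O(n²) approach described above can be sketched directly: run Prim's algorithm on the implicit complete graph with Manhattan edge weights. A minimal Python illustration (the function name and point format are our own):

```python
def rectilinear_mst(points):
    """Prim's algorithm on the implicit complete graph whose edge
    weights are rectilinear (Manhattan) distances; O(n^2) time."""
    n = len(points)
    dist = lambda a, b: abs(a[0] - b[0]) + abs(a[1] - b[1])
    in_tree = [False] * n
    best = [float("inf")] * n      # cheapest known edge into the tree
    parent = [None] * n
    best[0] = 0
    edges, total = [], 0
    for _ in range(n):
        # pick the cheapest vertex not yet in the tree
        u = min((i for i in range(n) if not in_tree[i]), key=lambda i: best[i])
        in_tree[u] = True
        total += best[u]
        if parent[u] is not None:
            edges.append((parent[u], u))
        for v in range(n):
            if not in_tree[v] and dist(points[u], points[v]) < best[v]:
                best[v] = dist(points[u], points[v])
                parent[v] = u
    return total, edges
```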
In mathematics, specifically in computational geometry, geometric nonrobustness is a problem wherein branching decisions in computational geometry algorithms are based on approximate numerical computations, leading to various forms of unreliability including ill-formed output and software failure through crashing or infinite loops. For instance, algorithms for problems like the construction of a convex hull rely on testing whether certain "numerical predicates" have values that are positive, negative, or zero. If an inexact floating-point computation causes a value that is near zero to have a different sign than its exact value, the resulting inconsistencies can propagate through the algorithm causing it to produce output that is far from the correct output, or even to crash
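A minimal illustration of the underlying floating-point issue (not a full geometric predicate; the variable names are ours): in exact decimal arithmetic 0.1 + 0.2 − 0.3 is zero, but in binary floating point the expression has a small positive value, so a branch on its sign takes the wrong path.

```python
# In exact decimal arithmetic (0.1 + 0.2) - 0.3 is zero, so a sign-based
# predicate would return 0.  In binary floating point the same expression
# is a tiny positive number, so a branch on the sign goes the wrong way.
value = (0.1 + 0.2) - 0.3
sign = (value > 0) - (value < 0)   # -1, 0, or +1
```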
In robust statistics and computational geometry, simplicial depth is a measure of central tendency determined by the simplices that contain a given point. For the Euclidean plane, it counts the number of triangles of sample points that contain a given point. Definition The simplicial depth of a point p in d-dimensional Euclidean space, with respect to a set of sample points in that space, is the number of d-dimensional simplices (the convex hulls of sets of d + 1 sample points) that contain p
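A brute-force Python sketch of the planar case (function names are ours); it counts closed triangles, so points on a triangle's boundary are included:

```python
from itertools import combinations

def orient(a, b, c):
    """Twice the signed area of triangle abc."""
    return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])

def simplicial_depth(p, sample):
    """Number of triangles of sample points containing p (boundary counts)."""
    depth = 0
    for a, b, c in combinations(sample, 3):
        s = [orient(a, b, p), orient(b, c, p), orient(c, a, p)]
        # p lies in the closed triangle iff the orientation signs agree
        if all(v >= 0 for v in s) or all(v <= 0 for v in s):
            depth += 1
    return depth
```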
Simultaneous localization and mapping (SLAM) is the computational problem of constructing or updating a map of an unknown environment while simultaneously keeping track of an agent's location within it. While this initially appears to be a chicken-and-egg problem, there are several algorithms known to solve it, at least approximately, in tractable time for certain environments. Popular approximate solution methods include the particle filter, extended Kalman filter, covariance intersection, and GraphSLAM
The smallest-circle problem (also known as minimum covering circle problem, bounding circle problem, least bounding circle problem, smallest enclosing circle problem) is a mathematical problem of computing the smallest circle that contains all of a given set of points in the Euclidean plane. The corresponding problem in n-dimensional space, the smallest bounding sphere problem, is to compute the smallest n-sphere that contains all of a given set of points. The smallest-circle problem was initially proposed by the English mathematician James Joseph Sylvester in 1857
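Welzl's randomized algorithm solves this in expected linear time; as an illustration of the problem itself, the following brute-force Python sketch (function names are ours) exploits the fact that the optimal circle is determined by two or three of the input points:

```python
from itertools import combinations
from math import hypot

def circle_two(a, b):
    """Circle with segment ab as diameter: (cx, cy, r)."""
    cx, cy = (a[0] + b[0]) / 2, (a[1] + b[1]) / 2
    return (cx, cy, hypot(a[0] - cx, a[1] - cy))

def circle_three(a, b, c):
    """Circumcircle of a, b, c; None for (nearly) collinear points."""
    d = 2 * (a[0]*(b[1]-c[1]) + b[0]*(c[1]-a[1]) + c[0]*(a[1]-b[1]))
    if abs(d) < 1e-12:
        return None
    ux = ((a[0]**2 + a[1]**2)*(b[1]-c[1]) + (b[0]**2 + b[1]**2)*(c[1]-a[1])
          + (c[0]**2 + c[1]**2)*(a[1]-b[1])) / d
    uy = ((a[0]**2 + a[1]**2)*(c[0]-b[0]) + (b[0]**2 + b[1]**2)*(a[0]-c[0])
          + (c[0]**2 + c[1]**2)*(b[0]-a[0])) / d
    return (ux, uy, hypot(a[0] - ux, a[1] - uy))

def covers(circle, pts, eps=1e-9):
    cx, cy, r = circle
    return all(hypot(x - cx, y - cy) <= r + eps for x, y in pts)

def smallest_circle(pts):
    """Brute force: the optimal circle is among those determined by
    two points (as a diameter) or three points (circumcircle)."""
    cands = [circle_two(a, b) for a, b in combinations(pts, 2)]
    cands += [c for t in combinations(pts, 3) if (c := circle_three(*t))]
    return min((c for c in cands if covers(c, pts)), key=lambda c: c[2])
```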
In computational geometry, the source unfolding of a convex polyhedron is a net obtained by cutting the polyhedron along the cut locus of a point on the surface of the polyhedron. The cut locus of a point p consists of all points on the surface that have two or more shortest geodesics to p. For every convex polyhedron, and every choice of the point p on its surface, cutting the polyhedron on the cut locus will produce a result that can be unfolded into a flat plane, producing the source unfolding
In computational geometry, the star unfolding of a convex polyhedron is a net obtained by cutting the polyhedron along geodesics (shortest paths) through its faces. It has also been called the inward layout of the polyhedron, or the Alexandrov unfolding after Aleksandr Danilovich Aleksandrov, who first considered it. Description In more detail, the star unfolding is obtained from a polyhedron P by choosing a starting point p on the surface of P, in general position, meaning that there is a unique shortest geodesic from p to each vertex of P
In computational geometry, a Steiner point is a point that is not part of the input to a geometric optimization problem but is added during the solution of the problem, to create a better solution than would be possible from the original points alone. The name of these points comes from the Steiner tree problem, named after Jakob Steiner, in which the goal is to connect the input points by a network of minimum total length. If the input points alone are used as endpoints of the network edges, then the shortest network is their minimum spanning tree
In geometry, a straight skeleton is a method of representing a polygon by a topological skeleton. It is similar in some ways to the medial axis but differs in that the skeleton is composed of straight line segments, while the medial axis of a polygon may involve parabolic curves. However, both are homotopy-equivalent to the underlying polygon
In computational complexity theory, there is an open problem of whether some information about a sum of radicals may be computed in polynomial time depending on the input size, i.e., in the number of bits necessary to represent this sum
The International Symposium on Computational Geometry (SoCG) is an academic conference in computational geometry. It was founded in 1985, with a program committee consisting of David Dobkin, Joseph O'Rourke, Franco Preparata, and Godfried Toussaint; O'Rourke was the conference chair. The symposium was originally sponsored by the SIGACT and SIGGRAPH Special Interest Groups of the Association for Computing Machinery (ACM)
In non-parametric statistics, the Theil–Sen estimator is a method for robustly fitting a line to sample points in the plane (simple linear regression) by choosing the median of the slopes of all lines through pairs of points. It has also been called Sen's slope estimator, slope selection, the single median method, the Kendall robust line-fit method, and the Kendall–Theil robust line. It is named after Henri Theil and Pranab K. Sen
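The definition above translates almost directly into code. A minimal Python sketch (O(n²) pairwise slopes; the function name is ours), with the intercept taken as the median of the per-point residuals, a common convention:

```python
from itertools import combinations
from statistics import median

def theil_sen(points):
    """Theil-Sen line fit: slope is the median of all pairwise slopes;
    intercept is the median of the residuals y - m*x."""
    slopes = [(y2 - y1) / (x2 - x1)
              for (x1, y1), (x2, y2) in combinations(points, 2)
              if x1 != x2]
    m = median(slopes)
    b = median(y - m * x for x, y in points)
    return m, b
```

Note how a single gross outlier leaves the fitted slope unchanged, which is the point of the estimator.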
In computational geometry, the Theta graph, or Θ-graph, is a type of geometric spanner similar to a Yao graph. The basic method of construction involves partitioning the space around each vertex into a set of cones, which themselves partition the remaining vertices of the graph. Like Yao graphs, a Θ-graph contains at most one edge per cone; where they differ is how that edge is selected
In statistics and computational geometry, the Tukey depth is a measure of the depth of a point in a fixed set of points. The concept is named after its inventor, John Tukey. Given a set of n points X_n = {X_1, …, X_n} in d-dimensional space, Tukey's depth of a point x is the smallest fraction (or number) of points in any closed halfspace that contains x
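In the plane the depth can be computed exactly by sweeping over directions: the count of points in a closed halfplane through x only changes at directions perpendicular to some vector x → X_i, so it suffices to examine half-circles anchored at the sample angles. A Python sketch (names ours):

```python
from math import atan2, pi

def tukey_depth(x, pts):
    """Exact 2-D Tukey (halfspace) depth of x with respect to pts."""
    angles = []
    z = 0  # sample points coinciding with x lie in every closed halfspace
    for p in pts:
        dx, dy = p[0] - x[0], p[1] - x[1]
        if dx == dy == 0:
            z += 1
        else:
            angles.append(atan2(dy, dx))
    m = len(angles)
    if m == 0:
        return z
    # Largest number of direction angles inside a half-open half-circle
    # [a, a + pi); the complement gives the emptiest closed halfplane.
    best = 0
    for a in angles:
        c = sum(1 for b in angles if 0 <= (b - a) % (2 * pi) < pi)
        best = max(best, c)
    return z + m - best
```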
In computational geometry, the Urquhart graph of a set of points in the plane, named after Roderick B. Urquhart, is obtained by removing the longest edge from each triangle in the Delaunay triangulation. The Urquhart graph was described by Urquhart (1980), who suggested that removing the longest edge from each Delaunay triangle would be a fast way of constructing the relative neighborhood graph (the graph connecting pairs of points p and q when there does not exist any third point r that is closer to both p and q than they are to each other)
In mathematics, the vertex enumeration problem for a polytope, a polyhedral cell complex, a hyperplane arrangement, or some other object of discrete geometry, is the problem of determining the object's vertices given some formal representation of the object. A classical example is the problem of enumerating the vertices of a convex polytope specified by a set of linear inequalities Ax ≤ b, where A is an m×n matrix, x is an n×1 column vector of variables, and b is an m×1 column vector of constants. The inverse (dual) problem of finding the bounding inequalities given the vertices is called facet enumeration (see convex hull algorithms)
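In two dimensions the naive method is workable: intersect every pair of constraint lines and keep the intersection points that satisfy all inequalities. A Python sketch for small instances (names ours; real solvers use pivoting or reverse search):

```python
from itertools import combinations

def enumerate_vertices(A, b, eps=1e-9):
    """2-D vertex enumeration for {x : Ax <= b}: intersect every pair
    of constraint boundary lines, keep the feasible intersections."""
    verts = []
    for (a1, c1), (a2, c2) in combinations(zip(A, b), 2):
        det = a1[0] * a2[1] - a1[1] * a2[0]
        if abs(det) < eps:          # parallel boundary lines
            continue
        x = (c1 * a2[1] - c2 * a1[1]) / det
        y = (a1[0] * c2 - a2[0] * c1) / det
        if all(r[0] * x + r[1] * y <= cc + eps for r, cc in zip(A, b)):
            verts.append((round(x, 9), round(y, 9)))
    return sorted(set(verts))
```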
In computational geometry and robot motion planning, a visibility graph is a graph of intervisible locations, typically for a set of points and obstacles in the Euclidean plane. Each node in the graph represents a point location, and each edge represents a visible connection between them. That is, if the line segment connecting two locations does not pass through any obstacle, an edge is drawn between them in the graph
In mathematics, a Voronoi diagram is a partition of a plane into regions close to each of a given set of objects. In the simplest case, these objects are just finitely many points in the plane (called seeds, sites, or generators). For each seed there is a corresponding region, called a Voronoi cell, consisting of all points of the plane closer to that seed than to any other
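A brute-force sketch of the definition: label each point of a sampling grid with the index of its nearest seed; the level sets of the labeling approximate the Voronoi cells (Python; names ours, and real implementations use Fortune's sweep or incremental Delaunay instead):

```python
def voronoi_labels(seeds, width, height):
    """Label every integer grid point with the index of its nearest
    seed (squared Euclidean distance); each label's region is a
    grid-sampled Voronoi cell."""
    def nearest(p):
        return min(range(len(seeds)),
                   key=lambda i: (p[0] - seeds[i][0]) ** 2
                               + (p[1] - seeds[i][1]) ** 2)
    return [[nearest((x, y)) for x in range(width)] for y in range(height)]
```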
In mathematics and computer science, computational number theory, also known as algorithmic number theory, is the study of computational methods for investigating and solving problems in number theory and arithmetic geometry, including algorithms for primality testing and integer factorization, finding solutions to diophantine equations, and explicit methods in arithmetic geometry. Computational number theory has applications to cryptography, including RSA, elliptic curve cryptography and post-quantum cryptography, and is used to investigate conjectures and open problems in number theory, including the Riemann hypothesis, the Birch and Swinnerton-Dyer conjecture, the ABC conjecture, the modularity conjecture, the Sato–Tate conjecture, and explicit aspects of the Langlands program. Software packages: Magma computer algebra system, SageMath, Number Theory Library, PARI/GP, Fast Library for Number Theory. Further reading: Eric Bach; Jeffrey Shallit (1996)
ABC@Home was an educational and non-profit network computing project finding abc-triples related to the abc conjecture in number theory using the Berkeley Open Infrastructure for Network Computing (BOINC) volunteer computing platform. In March 2011, there were more than 7,300 active participants from 114 countries with a total BOINC credit of more than 2.9 billion, reporting about 10 teraflops (10 trillion operations per second) of processing power
Algorithmic Number Theory Symposium (ANTS) is a biennial academic conference, first held at Cornell in 1994, constituting an international forum for the presentation of new research in computational number theory. The meetings are devoted to algorithmic aspects of number theory, including elementary number theory, algebraic number theory, analytic number theory, geometry of numbers, arithmetic geometry, finite fields, and cryptography. Selfridge Prize In honour of the many contributions of John Selfridge to mathematics, the Number Theory Foundation has established a prize to be awarded to those individuals who have authored the best paper accepted for presentation at ANTS
In computational complexity theory, a computational hardness assumption is the hypothesis that a particular problem cannot be solved efficiently (where efficiently typically means "in polynomial time"). It is not known how to prove (unconditional) hardness for essentially any useful problem. Instead, computer scientists rely on reductions to formally relate the hardness of a new or complicated problem to a computational hardness assumption about a problem that is better-understood
In computational number theory, Evdokimov's algorithm, named after Sergei Evdokimov, is the asymptotically fastest known algorithm for factorization of polynomials (until 2019). It can factorize a one-variable polynomial of degree n over an explicitly given finite field of cardinality q. Assuming the generalized Riemann hypothesis, the algorithm runs in deterministic time (n^(log n) · log q)^O(1) (see Big O notation)
In mathematics and computer algebra the factorization of a polynomial consists of decomposing it into a product of irreducible factors. This decomposition is theoretically possible and is unique for polynomials with coefficients in any field, but rather strong restrictions on the field of the coefficients are needed to allow the computation of the factorization by means of an algorithm. In practice, algorithms have been designed only for polynomials with coefficients in a finite field, in the field of rationals or in a finitely generated field extension of one of them
In cryptography, most public key cryptosystems are founded on problems that are believed to be intractable. The higher residuosity problem (also called the n th-residuosity problem) is one such problem. This problem is easier to solve than integer factorization, so the assumption that this problem is hard to solve is stronger than the assumption that integer factorization is hard
The Itoh–Tsujii inversion algorithm is used to invert elements in a finite field. It was introduced in 1988, first over GF(2^m) using the normal basis representation of elements; however, the algorithm is generic and can be used for other bases, such as the polynomial basis. It can also be used in any finite field GF(p^m)
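The full Itoh–Tsujii addition chain is beyond a short example, but the identity it optimizes, a⁻¹ = a^(2^m − 2) in GF(2^m), can be sketched with plain square-and-multiply. A hedged Python illustration in polynomial basis (the AES polynomial 0x11b is used only as a familiar irreducible modulus; names ours):

```python
def gf_mul(a, b, mod=0x11b, m=8):
    """Carry-less multiply of reduced elements of GF(2^m), reducing
    modulo the irreducible polynomial `mod` as bits overflow."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        b >>= 1
        a <<= 1
        if (a >> m) & 1:
            a ^= mod
    return r

def gf_inv(a, mod=0x11b, m=8):
    """Inverse as a^(2^m - 2) by square-and-multiply.  Itoh-Tsujii
    reorganizes this exponentiation into an addition chain needing
    only O(log m) general multiplications, which pays off when
    squarings are cheap (e.g., in normal basis)."""
    r, x, e = 1, a, (1 << m) - 2
    while e:
        if e & 1:
            r = gf_mul(r, x, mod, m)
        x = gf_mul(x, x, mod, m)
        e >>= 1
    return r
```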
In mathematics, the goal of lattice basis reduction is to find a basis with short, nearly orthogonal vectors when given an integer lattice basis as input. This is realized using different algorithms, whose running time is usually at least exponential in the dimension of the lattice. Nearly orthogonal One measure of near-orthogonality is the orthogonality defect
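The orthogonality defect mentioned above is the product of the basis vector lengths divided by the lattice determinant; it equals 1 exactly for an orthogonal basis and grows as the basis gets more skewed. A minimal Python sketch for the 2-D case (name ours):

```python
from math import hypot

def orthogonality_defect(basis):
    """prod(||b_i||) / |det B| for a 2-D lattice basis; 1 iff orthogonal."""
    (a1, a2), (b1, b2) = basis
    det = abs(a1 * b2 - a2 * b1)
    return hypot(a1, a2) * hypot(b1, b2) / det
```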
The phi-hiding assumption or Φ-hiding assumption is an assumption about the difficulty of finding small factors of φ(m) where m is a number whose factorization is unknown, and φ is Euler's totient function. The security of many modern cryptosystems comes from the perceived difficulty of certain problems. Since P vs
In mathematics, the supersingular isogeny graphs are a class of expander graphs that arise in computational number theory and have been applied in elliptic-curve cryptography. Their vertices represent supersingular elliptic curves over finite fields and their edges represent isogenies between curves. Definition and properties A supersingular isogeny graph is determined by choosing a large prime number p and a small prime number ℓ, and considering the class of all supersingular elliptic curves defined over the finite field F_(p^2)
Elliptic curve cryptography is a popular form of public key encryption that is based on the mathematical theory of elliptic curves. Points on an elliptic curve can be added and form a group under this addition operation. This article describes the computational costs for this group addition and certain related operations that are used in elliptic curve cryptography algorithms
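As a sketch of the group operation referred to above, here is the textbook affine addition law on a short Weierstrass curve y² = x³ + ax + b over a prime field, in Python (the toy curve over F₁₇ in the usage note is our own illustrative choice, not tied to any standard):

```python
def ec_add(P, Q, a, p):
    """Affine point addition on y^2 = x^3 + a*x + b over F_p.
    None represents the point at infinity (the group identity)."""
    if P is None:
        return Q
    if Q is None:
        return P
    x1, y1 = P
    x2, y2 = Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None                                   # P + (-P) = O
    if P == Q:
        lam = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p   # tangent slope
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, p) % p          # chord slope
    x3 = (lam * lam - x1 - x2) % p
    return (x3, (lam * (x1 - x3) - y1) % p)
```

For example, on y² = x³ + 2x + 2 over F₁₇, doubling the point (5, 1) gives (6, 3).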
Numerical analysis is the study of algorithms that use numerical approximation (as opposed to symbolic manipulations) for the problems of mathematical analysis (as distinguished from discrete mathematics). It is the study of numerical methods that attempt to find approximate solutions to problems rather than exact ones. Numerical analysis finds application in all fields of engineering and the physical sciences, and in the 21st century also the life and social sciences, medicine, business and even the arts
In mathematics and numerical analysis, an adaptive step size is used in some methods for the numerical solution of ordinary differential equations (including the special case of numerical integration) in order to control the errors of the method and to ensure stability properties such as A-stability. Using an adaptive stepsize is of particular importance when there is a large variation in the size of the derivative. For example, when modeling the motion of a satellite about the earth as a standard Kepler orbit, a fixed time-stepping method such as the Euler method may be sufficient
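An illustrative (not production-grade) way to see adaptive stepping is step doubling with the explicit Euler method: compare one step of size h against two steps of size h/2, estimate the local error from the discrepancy, and grow or shrink h accordingly. Production codes typically use embedded Runge–Kutta pairs instead; names below are ours:

```python
def integrate_adaptive(f, t, y, t_end, h=0.1, tol=1e-6):
    """Explicit Euler with step-doubling error control for y' = f(t, y)."""
    while t < t_end:
        h = min(h, t_end - t)
        y_full = y + h * f(t, y)                      # one big step
        y_half = y + (h / 2) * f(t, y)                # two half steps
        y_two = y_half + (h / 2) * f(t + h / 2, y_half)
        err = abs(y_two - y_full)                     # local error estimate
        if err <= tol or h < 1e-12:
            t, y = t + h, y_two                       # accept finer result
            if err < tol / 4:
                h *= 2                                # error small: grow step
        else:
            h /= 2                                    # error large: retry
    return y
```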
The adjoint state method is a numerical method for efficiently computing the gradient of a function or operator in a numerical optimization problem. It has applications in geophysics, seismic imaging, photonics and more recently in neural networks. The adjoint state space is chosen to simplify the physical interpretation of equation constraints
Affine arithmetic (AA) is a model for self-validated numerical analysis. In AA, the quantities of interest are represented as affine combinations (affine forms) of certain primitive variables, which stand for sources of uncertainty in the data or approximations made during the computation. Affine arithmetic is meant to be an improvement on interval arithmetic (IA), and is similar to generalized interval arithmetic, first-order Taylor arithmetic, the center-slope model, and ellipsoid calculus — in the sense that it is an automatic method to derive first-order guaranteed approximations to general formulas
In numerical analysis, Aitken's delta-squared process or Aitken extrapolation is a series acceleration method, used for accelerating the rate of convergence of a sequence. It is named after Alexander Aitken, who introduced this method in 1926. Its early form was known to Seki Kōwa (end of 17th century) and was found for the rectification of the circle, i.e., the calculation of π
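The transform itself is one line per term. A Python sketch (name ours), using the algebraically equivalent but numerically preferable form A_n = x_{n+2} − (x_{n+2} − x_{n+1})² / (x_{n+2} − 2x_{n+1} + x_n):

```python
def aitken(seq):
    """Aitken's delta-squared transform of a list of numbers; the
    result is two terms shorter than the input."""
    out = []
    for x0, x1, x2 in zip(seq, seq[1:], seq[2:]):
        denom = x2 - 2 * x1 + x0
        out.append(x2 if denom == 0 else x2 - (x2 - x1) ** 2 / denom)
    return out
```

On the linearly converging sequence x_n = 1 + 2^(−n) the transform recovers the limit 1 exactly; on the partial sums of the Leibniz series for π/4 it markedly reduces the error.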
The applied element method (AEM) is a numerical analysis method used to predict the continuum and discrete behavior of structures. The modeling method in AEM adopts the concept of discrete cracking allowing it to automatically track structural collapse behavior passing through all stages of loading: elastic, crack initiation and propagation in tension-weak materials, reinforcement yield, element separation, element contact and collision, as well as collision with the ground and adjacent structures. History Exploration of the approach employed in the applied element method began in 1995 at the University of Tokyo as part of Dr
The approximation error in a data value is the discrepancy between an exact value and some approximation to it. This error can be expressed as an absolute error (the numerical amount of the discrepancy) or as a relative error (the absolute error divided by the data value). An approximation error can occur for a variety of reasons, among them the limited precision of computing machines or measurement error (e
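The two notions in Python (trivial, but it fixes the convention that the relative error is scaled by the exact value; names ours):

```python
def absolute_error(exact, approx):
    """Numerical amount of the discrepancy."""
    return abs(exact - approx)

def relative_error(exact, approx):
    """Absolute error divided by the magnitude of the exact value."""
    return abs(exact - approx) / abs(exact)
```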
In mathematics, approximation theory is concerned with how functions can best be approximated with simpler functions, and with quantitatively characterizing the errors introduced thereby. What is meant by best and simpler will depend on the application. A closely related topic is the approximation of functions by generalized Fourier series, that is, approximations based upon summation of a series of terms based upon orthogonal polynomials
In mathematics, a basis function is an element of a particular basis for a function space. Every function in the function space can be represented as a linear combination of basis functions, just as every vector in a vector space can be represented as a linear combination of basis vectors. In numerical analysis and approximation theory, basis functions are also called blending functions, because of their use in interpolation: In this application, a mixture of the basis functions provides an interpolating function (with the "blend" depending on the evaluation of the basis functions at the data points)
In the mathematical field of numerical analysis, a Bernstein polynomial is a polynomial that is a linear combination of Bernstein basis polynomials. The idea is named after Sergei Natanovich Bernstein. A numerically stable way to evaluate polynomials in Bernstein form is de Casteljau's algorithm
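For scalar coefficients, de Casteljau's algorithm is just repeated linear interpolation between adjacent Bernstein coefficients. A minimal Python sketch (name ours):

```python
def de_casteljau(coeffs, t):
    """Evaluate a polynomial given in Bernstein form by its
    coefficients, at t in [0, 1], via repeated linear interpolation."""
    pts = list(coeffs)
    while len(pts) > 1:
        pts = [(1 - t) * a + t * b for a, b in zip(pts, pts[1:])]
    return pts[0]
```

For instance, the degree-2 Bernstein coefficients [0, 1/2, 1] represent the polynomial f(t) = t, so evaluation at t = 0.5 returns 0.5.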
In mathematics, a bi-directional delay line is a numerical analysis technique used in computer simulation for solving ordinary differential equations by converting them to hyperbolic equations. In this way an explicit solution scheme is obtained with highly robust numerical properties. It was introduced by Auslander in 1968
The bidomain model is a mathematical model to define the electrical activity of the heart. It consists of a continuum (volume-averaged) approach in which the cardiac microstructure is defined in terms of muscle fibers grouped in sheets, creating a complex three-dimensional structure with anisotropic properties. Then, to define the electrical activity, two interpenetrating domains are considered, which are the intracellular and extracellular domains, representing respectively the space inside the cells and the region between them
In numerical mathematics, the boundary knot method (BKM) is proposed as an alternative boundary-type meshfree distance function collocation scheme. Recent decades have seen a boom in research on meshfree numerical PDE techniques, since constructing a mesh for the standard finite element method and boundary element method is not trivial, especially for moving-boundary and higher-dimensional problems. The boundary knot method is different from the other methods based on the fundamental solutions, such as the boundary element method, method of fundamental solutions and singular boundary method, in that it does not require special techniques to cure the singularity
In applied mathematics, the boundary particle method (BPM) is a boundary-only meshless (meshfree) collocation technique, in the sense that no inner nodes are required in the numerical solution of nonhomogeneous partial differential equations. Numerical experiments show that the BPM has spectral convergence. Its interpolation matrix can be symmetric
In numerical analysis, catastrophic cancellation is the phenomenon that subtracting good approximations to two nearby numbers may yield a very bad approximation to the difference of the original numbers. For example, if there are two studs, one of length L1 = 253.5 cm
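A minimal Python demonstration (the lengths below are our own round-off example, not the article's exact figures): each input is rounded with a tiny relative error, yet the subsequent subtraction loses every significant digit of the true difference.

```python
# Two nearby lengths, known to ten significant digits.
a = 253.5123456
b = 253.5123444
diff = a - b                         # true difference: 1.2e-6

# Round the inputs to seven significant digits first: each incurs a
# relative error of only ~2e-9, but their difference cancels entirely,
# a 100% relative error in the result.
ra, rb = round(a, 4), round(b, 4)    # both become 253.5123
approx_diff = ra - rb                # 0.0
```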
Cell-based models are mathematical models that represent biological cells as discrete entities. Within the field of computational biology they are often simply called agent-based models, of which they are a specific application, and they are used for simulating the biomechanics of multicellular structures such as tissues, and to study how cell-level behaviors influence the organisation of tissues in time and space
In numerical analysis, Chebyshev nodes are specific real algebraic numbers, namely the roots of the Chebyshev polynomials of the first kind. They are often used as nodes in polynomial interpolation because the resulting interpolation polynomial minimizes the effect of Runge's phenomenon. Definition For a given positive integer n, the Chebyshev nodes in the interval (−1, 1) are x_k = cos((2k − 1)π / (2n)), k = 1, …, n
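The defining formula translates directly into code. A minimal Python sketch (name ours):

```python
from math import cos, pi

def chebyshev_nodes(n):
    """Roots of the degree-n Chebyshev polynomial of the first kind:
    x_k = cos((2k - 1) * pi / (2n)) for k = 1, ..., n."""
    return [cos((2 * k - 1) * pi / (2 * n)) for k in range(1, n + 1)]
```

For n = 2 this gives cos(π/4) and cos(3π/4), i.e. ±√2/2.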
Christophe Dessimoz is a Swiss National Science Foundation (SNSF) Professor at the University of Lausanne, Associate Professor at University College London and a group leader at the Swiss Institute of Bioinformatics. He was awarded the Overton Prize in 2019 for his contributions to computational biology. Starting in April 2022, he will be joint executive director of the SIB Swiss Institute of Bioinformatics, along with Ron Appel
The Bay Area Biosystematists is a group of biologists, geneticists, paleontologists, and systematists that are also interested in evolution. The group has been active in the San Francisco Bay Area since 1936, and is notable as a connection between many of the leading evolutionary biologists of the 20th century, including Herbert Baker, Theodosius Dobzhansky and G. Ledyard Stebbins who led the modern synthesis
Melanophryniscus peritus is a species of frog in the family Bufonidae. It is only known from a single specimen collected in 1953, and may be extinct. Taxonomy Melanophryniscus peritus was described in 2011 by Ulisses Caramaschi and Carlos Alberto Gonçalves da Cruz
The Navassa curly-tailed lizard or Navassa curlytail lizard (Leiocephalus eremitus) is an extinct lizard species from the curly-tailed lizard family (Leiocephalidae). It is known only from the holotype, a female specimen from which it was described in 1868. A possible second specimen which was collected by Rollo Beck in 1917 was instead identified as a Tiburon curly-tailed lizard (Leiocephalus melanochlorus) by herpetologist Richard Thomas in 1966
Prigogine's nightjar (Caprimulgus prigoginei), or the Itombwe nightjar, is a bird species of tropical central Africa. It is known from only one specimen taken at Malenge in the Itombwe Mountains in Zaire in August 1955. It appears to be a forest species, but nothing is known of its habits or breeding, although it is likely to nest on bare ground like its relatives
The spotted green pigeon or Liverpool pigeon (Caloenas maculata) is a species of pigeon which is most likely extinct. It was first mentioned and described in 1783 by John Latham, who had seen two specimens of unknown provenance and a drawing depicting the bird. The taxonomic relationships of the bird were long obscure, and early writers suggested many different possibilities, though the idea that it was related to the Nicobar pigeon (C
Strabops is a genus of strabopid, an extinct group of arthropods. Strabops is known from a single specimen from the Late Cambrian (Furongian age) of the Potosi Dolomite, Missouri, collected by a former professor, Arthur Thacher. It is classified in the family Strabopidae of the monotypic order Strabopida, a group closely related to the aglaspidids with uncertain affinities
The Sumatran flying squirrel (Hylopetes winstoni) is a flying squirrel only found on the island of Sumatra. It is listed as data deficient on the IUCN Red List. Originally discovered in 1949, it is known only from a single specimen
Abalone shriveling syndrome-associated virus was described in 2010 from an abalone which had died from abalone shriveling syndrome. Taxonomy While the ICTV does not include the Abalone shriveling syndrome-associated virus or the Abalone muscle atrophy virus, as of August 13, the NCBI puts it under the Caudoviricetes class as an unclassified virus
Arteriviridae is a family of enveloped, positive-strand RNA viruses in the order Nidovirales which infect vertebrates. Host organisms include equids, pigs, possums, nonhuman primates, and rodents. The family includes, for example, equine arteritis virus in horses which causes mild-to-severe respiratory disease and reproductive failure, porcine reproductive and respiratory syndrome virus type 1 and type 2 in pigs which causes a similar disease, simian hemorrhagic fever virus which causes a highly lethal fever, lactate dehydrogenase–elevating virus which affects mice, and wobbly possum disease virus
https://huggingface.co/datasets/fmars/wiki_stem
Icerudivirus is a genus of viruses in the family Rudiviridae. These viruses are non-enveloped, stiff, rod-shaped viruses with linear dsDNA genomes that infect hyperthermophilic archaea of the species Sulfolobus islandicus. There are three species in the genus
https://huggingface.co/datasets/fmars/wiki_stem
Bacteriophage T12 is a bacteriophage that infects Streptococcus pyogenes bacteria. It is a proposed species of the family Siphoviridae in the order Caudovirales also known as tailed viruses. It converts a harmless strain of bacteria into a virulent strain
https://huggingface.co/datasets/fmars/wiki_stem
Bovine stool associated circular virus is a single-stranded DNA virus with a circular genome that was isolated from bovine stool. It has also been isolated from pig stool. Therefore, Porcine stool-associated circular virus, a proposed species not yet accepted by the ICTV, appears to be a synonym
https://huggingface.co/datasets/fmars/wiki_stem
Bovine viral diarrhea (BVD), bovine viral diarrhoea (UK English) or mucosal disease, previously referred to as bovine virus diarrhea (BVD), is an economically significant disease of cattle that is found in the majority of countries throughout the world. Worldwide reviews of the economically assessed production losses and intervention programs (e. g
https://huggingface.co/datasets/fmars/wiki_stem
Chimpanzee stool associated circular virus is a single-stranded DNA virus isolated from chimpanzee stool. This proposed species has not yet been accepted by the ICTV. Genome: The genome is ~2
https://huggingface.co/datasets/fmars/wiki_stem
Cotton leaf curl viruses (CLCuV) are a number of plant pathogenic virus species of the family Geminiviridae. In Asia and Africa the major disease of cotton is caused by the Cotton leaf curl geminivirus (CLCuV). Leaves of infected cotton curl upward and bear leaf-like enations on the underside along with vein thickening
https://huggingface.co/datasets/fmars/wiki_stem
CrAss-like phages are a family of bacteriophages (viruses that infect bacteria) that was discovered in 2014 by cross-assembling reads in human fecal metagenomes. In silico comparative genomics and taxonomic analysis have found that crAss-like phages represent a highly abundant and diverse family of viruses. CrAss-like phages were predicted to infect bacteria of the Bacteroidota phylum, and the prediction was later confirmed when the first crAss-like phage (crAss001) was isolated on a Bacteroidota host (B
https://huggingface.co/datasets/fmars/wiki_stem
Echovirus is a polyphyletic group of viruses associated with enteric disease in humans. The name is derived from "enteric cytopathic human orphan virus". These viruses were originally not associated with disease, but many have since been identified as disease-causing agents
https://huggingface.co/datasets/fmars/wiki_stem
Herpes simplex virus 1 and 2 (HSV-1 and HSV-2), also known by their taxonomic names Human alphaherpesvirus 1 and Human alphaherpesvirus 2, are two members of the human Herpesviridae family, a set of viruses that produce viral infections in the majority of humans. Both HSV-1 and HSV-2 are very common and contagious. They can be spread when an infected person begins shedding the virus
https://huggingface.co/datasets/fmars/wiki_stem
Human bocavirus (HBoV) is the name given to all viruses in the genus Bocaparvovirus of virus family Parvoviridae that are known to infect humans. HBoV1 and HBoV3 (and gorilla bocaparvovirus) are members of species Primate bocaparvovirus 1 whereas viruses HBoV2 and HBoV4 belong to species Primate bocaparvovirus 2. Some of these viruses cause human disease
https://huggingface.co/datasets/fmars/wiki_stem
Human herpesvirus 6 (HHV-6) is the common collective name for human betaherpesvirus 6A (HHV-6A) and human betaherpesvirus 6B (HHV-6B). These closely related viruses are two of the nine known herpesviruses that have humans as their primary host. HHV-6A and HHV-6B are double-stranded DNA viruses within the Betaherpesvirinae subfamily and of the genus Roseolovirus
https://huggingface.co/datasets/fmars/wiki_stem
The primate T-lymphotropic viruses (PTLVs) are a group of retroviruses that infect primates, using their lymphocytes to reproduce. The ones that infect humans are known as human T-lymphotropic virus (HTLV), and the ones that infect Old World monkeys are called simian T-lymphotropic viruses (STLVs). PTLVs are named for their ability to cause adult T-cell leukemia/lymphoma, but in the case of HTLV-1 it can also cause a demyelinating disease called tropical spastic paraparesis
https://huggingface.co/datasets/fmars/wiki_stem
Mamavirus is a large and complex virus in the Group I family Mimiviridae. The virus is exceptionally large, and larger than many bacteria. Mamavirus and other members of the Mimiviridae belong to the nucleocytoplasmic large DNA virus (NCLDV) family
https://huggingface.co/datasets/fmars/wiki_stem
Murid gammaherpesvirus 68 (MuHV-68) is an isolate of the virus species Murid gammaherpesvirus 4, a member of the genus Rhadinovirus. It is a member of the subfamily Gammaherpesvirinae in the family of Herpesviridae. MuHV-68 serves as a model for study of human gammaherpesviruses which cause significant human disease including B-cell lymphoma and Kaposi's sarcoma
https://huggingface.co/datasets/fmars/wiki_stem
Negevirus is a taxon of non-segmented, positive-sense, single-stranded RNA viruses that have been isolated from mosquitoes and phlebotomine sand flies in Africa, the Americas, Asia and Europe. Under the electron microscope the viruses appear to be spherical particles 45 to 55 nanometers in diameter. Taxonomy: There are at least 91 viruses recognised in this taxon
https://huggingface.co/datasets/fmars/wiki_stem
Pandoravirus is a genus of giant virus, first discovered in 2013. It is the second largest in physical size of any known viral genus. Pandoraviruses have double stranded DNA genomes, with the largest genome size (2
https://huggingface.co/datasets/fmars/wiki_stem
Porcine circovirus (PCV) is a group of four single-stranded DNA viruses that are non-enveloped with an unsegmented circular genome. They are members of the genus Circovirus that can infect pigs. The viral capsid is icosahedral and approximately 17 nm in diameter
https://huggingface.co/datasets/fmars/wiki_stem
Secoviridae is a family of viruses in the order Picornavirales. Plants serve as natural hosts. There are 8 genera and 86 species in this family, one of which is unassigned to a genus
https://huggingface.co/datasets/fmars/wiki_stem
Simian foamy virus (SFV) is a species of the genus Spumavirus that belongs to the family of Retroviridae. It has been identified in a wide variety of primates, including prosimians, New World and Old World monkeys, as well as apes, and each species has been shown to harbor a unique (species-specific) strain of SFV, including African green monkeys, baboons, macaques, and chimpanzees. As it is related to the more well-known retrovirus human immunodeficiency virus (HIV), its discovery in primates has led to some speculation that HIV may have been spread to the human species in Africa through contact with blood from apes, monkeys, and other primates, most likely through bushmeat-hunting practices
https://huggingface.co/datasets/fmars/wiki_stem
Swine acute diarrhea syndrome coronavirus (SADS-CoV) is a coronavirus related to Rhinolophus bat coronavirus HKU2. It is transmitted to pigs through the feces of horseshoe bats. Mortality among piglets less than 5 days old reaches up to 90%
https://huggingface.co/datasets/fmars/wiki_stem
Virusoids are circular single-stranded RNA(s) dependent on viruses for replication and encapsidation. The genome of a virusoid consists of several hundred (200–400) nucleotides and does not code for any proteins. Virusoids are essentially viroids that have been encapsulated by a helper virus coat protein
https://huggingface.co/datasets/fmars/wiki_stem
Xenotropic murine leukemia virus–related virus (XMRV) is a retrovirus which was first described in 2006 as an apparently novel human pathogen found in tissue samples from men with prostate cancer. Initial reports erroneously linked the virus to prostate cancer and later to chronic fatigue syndrome (CFS), leading to considerable interest in the scientific and patient communities, investigation of XMRV as a potential cause of multiple medical conditions, and public-health concerns about the safety of the donated blood supply. Xenotropic viruses replicate or reproduce in cells other than those of the host species
https://huggingface.co/datasets/fmars/wiki_stem
Active SETI (Active Search for Extra-Terrestrial Intelligence) is the attempt to send messages to intelligent extraterrestrial life. Active SETI messages are predominantly sent in the form of radio signals. Physical messages like that of the Pioneer plaque may also be considered an active SETI message
https://huggingface.co/datasets/fmars/wiki_stem
The communication with extraterrestrial intelligence (CETI) is a branch of the search for extraterrestrial intelligence (SETI) that focuses on composing and deciphering interstellar messages that theoretically could be understood by another technological civilization. The best-known CETI experiment of its kind was the 1974 Arecibo message composed by Frank Drake. There are multiple independent organizations and individuals engaged in CETI research; the generic application of abbreviations CETI and SETI (search for extraterrestrial intelligence) in this article should not be taken as referring to any particular organization (such as the SETI Institute)
https://huggingface.co/datasets/fmars/wiki_stem
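A concrete design idea behind the 1974 Arecibo message mentioned above is that its length, 1679 bits, is a semiprime (23 × 73), so a recipient can arrange the bit stream into a rectangle in only two non-trivial ways, one of which reveals the intended image. The sketch below illustrates only this framing trick in plain Python; it does not reproduce the actual message contents.

```python
# Sketch of the Arecibo message's semiprime framing: a bit stream of
# length 1679 = 23 * 73 admits exactly two non-trivial rectangular
# arrangements, which hints to the recipient how to render the image.

def factor_pairs(n):
    """Return all (rows, cols) pairs whose product is n."""
    return [(r, n // r) for r in range(1, n + 1) if n % r == 0]

bits = 1679
pairs = factor_pairs(bits)

# A semiprime p*q has exactly four divisors: 1, p, q, and p*q,
# hence exactly four factor pairs.
assert len(pairs) == 4

# Discard the degenerate 1-row and 1-column layouts; what remains
# are the two candidate image dimensions.
nontrivial = [(r, c) for r, c in pairs if r != 1 and c != 1]
print(nontrivial)  # [(23, 73), (73, 23)]
```

Choosing a semiprime length is a way of embedding decoding instructions in the signal itself, without sharing any prior convention with the recipient.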
Interstellar communication is the transmission of signals between planetary systems. Sending interstellar messages is potentially much easier than interstellar travel, being possible with technologies and equipment which are currently available. However, the distances from Earth to other potentially inhabited systems introduce prohibitive delays, given the limitation of the speed of light
https://huggingface.co/datasets/fmars/wiki_stem
International Open Data Day is an annual event that promotes awareness and use of open data. The event takes place globally, usually in February or March. Typical activities include talks, seminars, demonstrations, hackathons, training or the announcement of open data releases or other milestones in open data
https://huggingface.co/datasets/fmars/wiki_stem
In natural language processing, linguistics, and neighboring fields, Linguistic Linked Open Data (LLOD) describes a method and an interdisciplinary community concerned with creating, sharing, and (re-)using language resources in accordance with Linked Data principles. The Linguistic Linked Open Data Cloud was conceived and is being maintained by the Open Linguistics Working Group (OWLG) of the Open Knowledge Foundation, but has since been a focal point of activity for several W3C community groups, research projects, and infrastructure efforts. Definition and development: Linguistic Linked Open Data describes the publication of data for linguistics and natural language processing using the following principles: Data should be openly licensed using licenses such as the Creative Commons licenses
https://huggingface.co/datasets/fmars/wiki_stem
In computing, linked data is structured data which is interlinked with other data so it becomes more useful through semantic queries. It builds upon standard Web technologies such as HTTP, RDF and URIs, but rather than using them to serve web pages only for human readers, it extends them to share information in a way that can be read automatically by computers. Part of the vision of linked data is for the Internet to become a global database
https://huggingface.co/datasets/fmars/wiki_stem
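The triple model behind linked data can be illustrated without any RDF library: each statement is a subject–predicate–object triple, identifiers are URIs so data from different sources interlinks, and a "semantic query" follows those links. A minimal sketch in plain Python, where the example.org URIs are illustrative placeholders (the FOAF "name" and "knows" properties are real vocabulary terms):

```python
# Minimal sketch of the linked-data triple model: statements are
# (subject, predicate, object) triples keyed by URIs, so graphs from
# different publishers can be merged and traversed.

FOAF = "http://xmlns.com/foaf/0.1/"  # real vocabulary namespace

triples = {
    ("http://example.org/alice", FOAF + "name", "Alice"),
    ("http://example.org/alice", FOAF + "knows", "http://example.org/bob"),
    ("http://example.org/bob", FOAF + "name", "Bob"),
}

def objects(graph, subject, predicate):
    """Tiny 'semantic query': all objects for a subject/predicate pair."""
    return {o for s, p, o in graph if s == subject and p == predicate}

# Follow the foaf:knows link from Alice, then look up each friend's name.
friends = objects(triples, "http://example.org/alice", FOAF + "knows")
names = {n for f in friends for n in objects(triples, f, FOAF + "name")}
print(names)  # {'Bob'}
```

Because every node is a URI, a second dataset that reuses http://example.org/bob can simply be unioned into the same set and queried the same way; that merge-by-identifier property is what makes the data "linked".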
Open by Default, as widely used in the contexts of Open Government and Open Data, is the principle that government makes its data accessible to the public by default, unless there is sufficient justification that a greater public interest would be put at stake by disclosure. Since the principle empowers the public's right to know and capacity to oversee government activities, it is closely associated with government transparency, civic engagement, and e-governance in organizing public life. In many cases, the principle is accompanied with the technological commitment to create "metadata standardization for all datasets, publication of a machine-readable data catalogue or inventory of both released and to-be released datasets
https://huggingface.co/datasets/fmars/wiki_stem
The Open Contracting Data Standard is a standards development initiative issued by the Omidyar Network and the World Bank which commenced in November 2014. It sets out the key documents and data which should be published at each stage of the process of letting a contract for the procurement of goods and services for the public sector. Adoption of the standard requires publishers to release data under an open license, because "publishing data under an open license is an important part of open contracting
https://huggingface.co/datasets/fmars/wiki_stem
Open Data Buffalo is the open data program developed under the administration of Mayor Byron W. Brown in Buffalo, New York. The initiative is a commitment to proactively release high-quality, updated "publishable City data" through a centralized portal in machine-readable formats, fully accessible and freely available in the public domain
https://huggingface.co/datasets/fmars/wiki_stem
Open Data Now is a 2014 book on open data by Joel Gurin. Reception: A reviewer for the University of California, Berkeley School of Information said the book "is written for the business community, but speaks to the experiences of those in the government, the private sector, or those who make a living advocating for consumers." A reviewer for the U
https://huggingface.co/datasets/fmars/wiki_stem
The Open Definition is a document published by the Open Knowledge Foundation (OKFN) (previously Open Knowledge International) to define openness in relation to data and content. It specifies what licences for such material may and may not stipulate, in order to be considered open licences. The definition itself was derived from the Open Source Definition for software
https://huggingface.co/datasets/fmars/wiki_stem
The Open Energy Modelling Initiative (openmod) is a grassroots community of energy system modellers from universities and research institutes across Europe and elsewhere. The initiative promotes the use of open-source software and open data in energy system modelling for research and policy advice. The Open Energy Modelling Initiative documents a variety of open-source energy models and addresses practical and conceptual issues regarding their development and application
https://huggingface.co/datasets/fmars/wiki_stem
Open energy system database projects employ open data methods to collect, clean, and republish energy-related datasets for open use. The resulting information is then available, given a suitable open license, for statistical analysis and for building numerical energy system models, including open energy system models. Permissive licenses like Creative Commons CC0 and CC BY are preferred, but some projects will house data made public under market transparency regulations and carrying unqualified copyright
https://huggingface.co/datasets/fmars/wiki_stem
Open energy system models are energy system models that are open source. However, some of them may use third party proprietary software as part of their workflows to input, process, or output data. Preferably, these models use open data, which facilitates open science
https://huggingface.co/datasets/fmars/wiki_stem
Open government is the governing doctrine which maintains that citizens have the right to access the documents and proceedings of the government to allow for effective public oversight. In its broadest construction, it opposes reason of state and other considerations which have tended to legitimize extensive state secrecy. The origins of open-government arguments can be dated to the time of the European Age of Enlightenment, when philosophers debated the proper construction of a then nascent democratic society
https://huggingface.co/datasets/fmars/wiki_stem
The Open Notebook Science Challenge is a crowdsourcing research project which collects measurements of the non-aqueous solubility of organic compounds and publishes these as open data; findings are reported in an open notebook science manner. Although anyone may contribute research data, the competition is only open to post-secondary students in the US and UK. The challenge in turn forms part of the UsefulChem project, an ongoing open notebook science effort to synthesize and screen potential new anti-malarial drugs
https://huggingface.co/datasets/fmars/wiki_stem