7,001
|
In the early days of computing, computer use was typically limited to batch processing, i.e., non-interactive tasks, each producing output data from given input data. Computability theory, which studies computability of functions from inputs to outputs, and for which Turing machines were invented, reflects this practice.
|
7,002
|
Since the 1970s, interactive use of computers became much more common. In principle, it is possible to model this by having an external agent read from the tape and write to it at the same time as a Turing machine, but this rarely matches how interaction actually happens; therefore, when describing interactivity, alternatives such as I/O automata are usually preferred.
|
7,003
|
The arithmetic model of computation differs from the Turing model in two aspects:
|
7,004
|
Some algorithms run in polynomial time in one model but not in the other one. For example:
|
7,005
|
However, if an algorithm runs in polynomial time in the arithmetic model, and in addition, the binary length of all involved numbers is polynomial in the length of the input, then it is always polynomial-time in the Turing model. Such an algorithm is said to run in strongly polynomial time.
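A minimal sketch of the distinction, with invented names: repeated squaring performs only n multiplications, which is polynomial in the arithmetic model, yet the bit length of the value doubles at every step, so the running time on a Turing machine is exponential in n and the algorithm is not strongly polynomial.

```python
# Repeated squaring: x -> x^2, done n times. In the arithmetic model this
# costs n operations; on a Turing machine the bit length of the value
# doubles with every step, so the total work is exponential in n.
def repeated_square(x: int, n: int) -> int:
    for _ in range(n):
        x = x * x
    return x

# Starting from 2, after n squarings the value is 2**(2**n):
# its binary length, 2**n + 1, grows exponentially in n.
value = repeated_square(2, 5)
print(value.bit_length())  # 33, i.e. 2**5 + 1
```

The binary length of the intermediate numbers is therefore not polynomial in the input length, which is exactly the condition the paragraph above requires for strongly polynomial time.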
|
7,006
|
Robin Gandy, a student of Alan Turing and his lifelong friend, traces the lineage of the notion of "calculating machine" back to Charles Babbage and actually proposes "Babbage's Thesis":
|
7,007
|
That the whole of the developments and operations of analysis are now capable of being executed by machinery.
|
7,008
|
Gandy's analysis of Babbage's analytical engine describes the following five operations:
|
7,009
|
Gandy states that "the functions which can be calculated by , , and are precisely those which are Turing computable." He cites other proposals for "universal calculating machines", including those of Percy Ludgate, Leonardo Torres Quevedo, Maurice d'Ocagne, Louis Couffignal, Vannevar Bush, and Howard Aiken. However:
|
7,010
|
… the emphasis is on programming a fixed iterable sequence of arithmetical operations. The fundamental importance of conditional iteration and conditional transfer for a general theory of calculating machines is not recognized…
|
7,011
|
With regard to Hilbert's problems, posed by the famous mathematician David Hilbert in 1900, an aspect of problem #10 had been floating about for almost 30 years before it was framed precisely. Hilbert's original expression of No. 10 is as follows:
|
7,012
|
10. Determination of the solvability of a Diophantine equation. Given a Diophantine equation with any number of unknown quantities and with rational integral coefficients: To devise a process according to which it can be determined in a finite number of operations whether the equation is solvable in rational integers.
The Entscheidungsproblem is solved when we know a procedure that allows for any given logical expression to decide by finitely many operations its validity or satisfiability ... The Entscheidungsproblem must be considered the main problem of mathematical logic.
|
7,013
|
By 1922, this notion of "Entscheidungsproblem" had developed a bit, and H. Behmann stated that
|
7,014
|
... most general form of the Entscheidungsproblem as follows:
|
7,015
|
A quite definite generally applicable prescription is required which will allow one to decide in a finite number of steps the truth or falsity of a given purely logical assertion ...
|
7,016
|
Behmann remarks that "... the general problem is equivalent to the problem of deciding which mathematical propositions are true."
|
7,017
|
If one were able to solve the Entscheidungsproblem then one would have a "procedure for solving many mathematical problems".
|
7,018
|
By the 1928 international congress of mathematicians, Hilbert "made his questions quite precise. First, was mathematics complete ... Second, was mathematics consistent ... And thirdly, was mathematics decidable?" The first two questions were answered in 1930 by Kurt Gödel at the very same meeting where Hilbert delivered his retirement speech; the third, the Entscheidungsproblem, had to wait until the mid-1930s.
|
7,019
|
The problem was that an answer first required a precise definition of "definite generally applicable prescription", which Princeton professor Alonzo Church would come to call "effective calculability", and in 1928 no such definition existed. But over the next 6–7 years Emil Post developed his definition of a worker moving from room to room writing and erasing marks per a list of instructions, as did Church and his two students Stephen Kleene and J. B. Rosser by use of Church's lambda-calculus and Gödel's recursion theory. Church's paper showed that the Entscheidungsproblem was indeed "undecidable" and beat Turing to the punch by almost a year. In the meantime, Emil Post submitted a brief paper in the fall of 1936, so Turing at least had priority over Post. While Church refereed Turing's paper, Turing had time to study Church's paper and add an Appendix where he sketched a proof that Church's lambda-calculus and his machines would compute the same functions.
|
7,020
|
But what Church had done was something rather different, and in a certain sense weaker. ... the Turing construction was more direct, and provided an argument from first principles, closing the gap in Church's demonstration.
|
7,021
|
And Post had only proposed a definition of calculability and criticised Church's "definition", but had proved nothing.
|
7,022
|
In the spring of 1935, Turing, then a young Master's student at King's College, Cambridge, took on the challenge; he had been stimulated by the lectures of the logician M. H. A. Newman "and learned from them of Gödel's work and the Entscheidungsproblem ... Newman used the word 'mechanical' ...". In his 1955 obituary of Turing, Newman writes:
|
7,023
|
To the question 'what is a "mechanical" process?' Turing returned the characteristic answer 'Something that can be done by a machine' and he embarked on the highly congenial task of analysing the general notion of a computing machine.
|
7,024
|
Gandy states that:
|
7,025
|
I suppose, but do not know, that Turing, right from the start of his work, had as his goal a proof of the undecidability of the Entscheidungsproblem. He told me that the 'main idea' of the paper came to him when he was lying in Grantchester meadows in the summer of 1935. The 'main idea' might have either been his analysis of computation or his realization that there was a universal machine, and so a diagonal argument to prove unsolvability.
|
7,026
|
While Gandy believed that Newman's statement above is "misleading", this opinion is not shared by all. Turing had a lifelong interest in machines: "Alan had dreamt of inventing typewriters as a boy; Mrs. Turing had a typewriter; and he could well have begun by asking himself what was meant by calling a typewriter 'mechanical'". While at Princeton pursuing his PhD, Turing built a Boolean-logic multiplier. His PhD thesis, titled "Systems of Logic Based on Ordinals", contains the following definition of "a computable function":
|
7,027
|
It was stated above that 'a function is effectively calculable if its values can be found by some purely mechanical process'. We may take this statement literally, understanding by a purely mechanical process one which could be carried out by a machine. It is possible to give a mathematical description, in a certain normal form, of the structures of these machines. The development of these ideas leads to the author's definition of a computable function, and to an identification of computability with effective calculability. It is not difficult, though somewhat laborious, to prove that these three definitions are equivalent.
|
7,028
|
Alan Turing invented the "a-machine" in 1936. Turing submitted his paper on 31 May 1936 to the London Mathematical Society for its Proceedings, but it was published in early 1937, and offprints were available in February 1937. It was Turing's doctoral advisor, Alonzo Church, who later coined the term "Turing machine" in a review. With this model, Turing was able to answer two questions in the negative:
|
7,029
|
Thus by providing a mathematical description of a very simple device capable of arbitrary computations, he was able to prove properties of computation in general—and in particular, the uncomputability of the Entscheidungsproblem.
|
7,030
|
When Turing returned to the UK he ultimately became jointly responsible for breaking the German secret codes created by encryption machines called "The Enigma"; he also became involved in the design of the ACE (Automatic Computing Engine): "[Turing's] ACE proposal was effectively self-contained, and its roots lay not in the EDVAC, but in his own universal machine". Arguments still continue concerning the origin and nature of what Kleene named "Turing's Thesis". But what Turing did prove with his computational-machine model appears in his paper "On Computable Numbers, with an Application to the Entscheidungsproblem":
|
7,031
|
the Hilbert Entscheidungsproblem can have no solution ... I propose, therefore, to show that there can be no general process for determining whether a given formula U of the functional calculus K is provable, i.e. that there can be no machine which, supplied with any one U of these formulae, will eventually say whether U is provable.
|
7,032
|
Turing's example: if one asks for a general procedure to tell us "Does this machine ever print 0?", the question is "undecidable".
|
7,033
|
In 1937, while at Princeton working on his PhD thesis, Turing built a digital multiplier from scratch, making his own electromechanical relays. "Alan's task was to embody the logical design of a Turing machine in a network of relay-operated switches ...". While Turing might have been just initially curious and experimenting, quite-earnest work in the same direction was going on in Germany (Konrad Zuse) and in the United States (Howard Aiken and George Stibitz); the fruits of their labors were used by both the Axis and Allied militaries in World War II. In the early to mid-1950s Hao Wang and Marvin Minsky reduced the Turing machine to a simpler form; simultaneously, European researchers were reducing the new-fangled electronic computer to a computer-like theoretical object equivalent to what was now being called a "Turing machine". In the late 1950s and early 1960s, the coincidentally parallel developments of Melzak and Lambek, Minsky, and Shepherdson and Sturgis carried the European work further and reduced the Turing machine to a more friendly, computer-like abstract model called the counter machine; Elgot and Robinson, Hartmanis, and Cook and Reckhow carried this work even further with the register machine and random-access machine models—but basically all are just multi-tape Turing machines with an arithmetic-like instruction set.
|
7,034
|
Today, the counter, register and random-access machines and their sire the Turing machine continue to be the models of choice for theorists investigating questions in the theory of computation. In particular, computational complexity theory makes use of the Turing machine:
|
7,035
|
Depending on the objects one likes to manipulate in the computations, two models have obtained a dominant position in machine-based complexity theory:
|
7,036
|
the off-line multitape Turing machine..., which represents the standard model for string-oriented computation, and
the random access machine as introduced by Cook and Reckhow ..., which models the idealised Von Neumann-style computer.
|
7,037
|
Only in the related area of analysis of algorithms is this role taken over by the RAM model.
|
7,038
|
Digital physics suggests that there exists, at least in principle, a program for a universal computer that computes the evolution of the universe. The computer could be, for example, a huge cellular automaton.
|
7,039
|
Extant models of digital physics are incompatible with the continuous character of several physical symmetries, e.g., rotational symmetry, translational symmetry, Lorentz symmetry, and the Lie-group gauge invariance of Yang–Mills theories, all central to current physical theory. Moreover, extant models of digital physics violate various well-established features of quantum physics: they belong to the class of theories with local hidden variables that have so far been ruled out experimentally by tests of Bell's theorem.
|
7,040
|
However, covariant discrete theories can be formulated that preserve the aforementioned symmetries.
|
7,041
|
Rapaport has done research and written extensively on intentionality and artificial intelligence. His research interests span computer science, artificial intelligence, computational linguistics, cognitive science, logic, and mathematics, and he has published many scientific articles in these areas.
|
7,042
|
While a philosophy graduate student at Indiana University in 1972, he concocted the sentence: "Buffalo buffalo Buffalo buffalo buffalo buffalo Buffalo buffalo". Throughout his career he developed this theme, and discussed it extensively.
|
7,043
|
His early work on nonexistent objects was influenced by Alexius Meinong.
|
7,044
|
Rapaport has written on the field of intentionality, influencing scientists and writers including Daniel Dennett, Héctor-Neri Castañeda, and John Searle.
Rapaport is interested in science educational theory, and received the New York Chancellor's Award for Excellence in Teaching.
|
7,045
|
In June 1988, Rapaport compiled a list of restaurants in the Buffalo area for attendees of an ACL meeting at SUNY Buffalo. The list continued to be maintained and became interactive, with user reviews of restaurants.
|
7,046
|
Rapaport and his wife Mary, with whom he has a son Michael, are the principal donors to the Lucille Ball-Desi Arnaz Center in Jamestown, NY. The Desilu Playhouse, located in the Rapaport Center, contains memorabilia and other vintage I Love Lucy items. He and his wife have also purchased and renovated Lucille Ball's childhood home in Celoron, New York.
|
7,047
|
Broadly, query languages can be classified according to whether they are database query languages or information retrieval query languages. The difference is that a database query language attempts to give factual answers to factual questions, while an information retrieval query language attempts to find documents containing information that is relevant to an area of inquiry. Other types of query languages include:
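The distinction can be illustrated with a short sketch (data and names invented): a database-style query filters for exact factual matches, while an information-retrieval-style query ranks documents by relevance to the terms of an inquiry.

```python
# Toy "documents" with a factual attribute and a textual title.
rows = [{"title": "Turing machine", "year": 1936},
        {"title": "Lambda calculus", "year": 1936},
        {"title": "Register machine", "year": 1963}]

# Database style: a factual answer to a factual question.
db_answer = [r["title"] for r in rows if r["year"] == 1963]

# Information-retrieval style: rank documents by overlap with the query terms.
query = {"machine", "turing"}
ranked = sorted(rows,
                key=lambda r: len(query & set(r["title"].lower().split())),
                reverse=True)

print(db_answer)          # ['Register machine']
print(ranked[0]["title"])  # 'Turing machine' is most relevant
```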
|
7,048
|
Common logical data models for databases include:
|
7,049
|
An object–relational database combines the two related structures.
|
7,050
|
Physical data models include:
|
7,051
|
Other models include:
|
7,052
|
Multidimensional model
|
7,053
|
Multivalue model
|
7,054
|
Semantic model
|
7,055
|
XML database
|
7,056
|
Named graph
|
7,057
|
Triplestore
|
7,058
|
A given database management system may provide one or more models. The optimal structure depends on the natural organization of the application's data, and on the application's requirements, which include transaction rate, reliability, maintainability, scalability, and cost. Most database management systems are built around one particular data model, although it is possible for products to offer support for more than one model.
|
7,059
|
Various physical data models can implement any given logical model. Most database software will offer the user some level of control in tuning the physical implementation, since the choices that are made have a significant effect on performance.
|
7,060
|
A model is not just a way of structuring data: it also defines a set of operations that can be performed on the data. The relational model, for example, defines operations such as select and join. Although these operations may not be explicit in a particular query language, they provide the foundation on which a query language is built.
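A minimal sketch of those two operations, assuming relations are modeled as lists of dicts (all names here are invented for illustration): `select` filters rows by a predicate, and `join` pairs rows that agree on their shared attributes.

```python
# A relation as a list of dicts (rows). `select` filters rows by a
# predicate; `join` pairs rows that agree on all shared attribute names.
def select(relation, predicate):
    return [row for row in relation if predicate(row)]

def join(r, s):
    shared = set(r[0]) & set(s[0]) if r and s else set()
    return [{**a, **b} for a in r for b in s
            if all(a[k] == b[k] for k in shared)]

employees = [{"name": "Ada", "dept": 1}, {"name": "Bob", "dept": 2}]
depts = [{"dept": 1, "dept_name": "Research"},
         {"dept": 2, "dept_name": "Sales"}]

print(select(employees, lambda row: row["dept"] == 1))
print(join(employees, depts))  # each employee paired with its department
```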
|
7,061
|
The flat model consists of a single, two-dimensional array of data elements, where all members of a given column are assumed to hold similar values, and all members of a row are assumed to be related to one another. For instance, a table might have columns for name and password, used as part of a system security database; each row would hold the specific password associated with an individual user. Columns of the table often have a type associated with them, defining them as character data, date or time information, integers, or floating-point numbers. This tabular format is a precursor to the relational model.
|
7,062
|
These models were popular in the 1960s and 1970s, but nowadays they are found primarily in old legacy systems. They are characterized primarily by being navigational, with strong connections between their logical and physical representations, and by deficiencies in data independence.
|
7,063
|
In a hierarchical model, data is organized into a tree-like structure, implying a single parent for each record. A sort field keeps sibling records in a particular order. Hierarchical structures were widely used in the early mainframe database management systems, such as the Information Management System by IBM, and now describe the structure of XML documents. This structure allows a one-to-many relationship between two types of data and is very efficient for describing many relationships in the real world: recipes, tables of contents, the ordering of paragraphs/verses, and any nested and sorted information.
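The single-parent structure and the sort field can be sketched as follows (record layout and field names invented for illustration):

```python
# Each record points to exactly one parent; a sort field orders siblings.
records = [
    {"id": 1, "parent": None, "sort": 0, "title": "Cookbook"},
    {"id": 3, "parent": 1, "sort": 2, "title": "Desserts"},
    {"id": 2, "parent": 1, "sort": 1, "title": "Starters"},
]

def children(records, parent_id):
    """Return the sibling records of one parent, in sort-field order."""
    return sorted((r for r in records if r["parent"] == parent_id),
                  key=lambda r: r["sort"])

print([r["title"] for r in children(records, 1)])  # ['Starters', 'Desserts']
```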
|
7,064
|
This hierarchy is used as the physical order of records in storage. Record access is done by navigating downward through the data structure using pointers combined with sequential accessing. Because of this, the hierarchical structure is inefficient for certain database operations when a full path is not also included for each record. Such limitations have been compensated for in later IMS versions by additional logical hierarchies imposed on the base physical hierarchy.
|
7,065
|
The network model expands upon the hierarchical structure, allowing many-to-many relationships in a tree-like structure that allows multiple parents. It was most popular before being replaced by the relational model, and is defined by the CODASYL specification.
|
7,066
|
The network model organizes data using two fundamental concepts, called records and sets. Records contain fields. Sets define one-to-many relationships between records: one owner, many members. A record may be an owner in any number of sets, and a member in any number of sets.
|
7,067
|
A set consists of circular linked lists where one record type, the set owner or parent, appears once in each circle, and a second record type, the subordinate or child, may appear multiple times in each circle. In this way a hierarchy may be established between any two record types, e.g., type A is the owner of B. At the same time another set may be defined where B is the owner of A. Thus all the sets comprise a general directed graph, or network construct. Access to records is either sequential or by navigation in the circular linked lists.
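A minimal sketch of such a set as a ring of `next` pointers (the class and field names are invented, and real CODASYL implementations are considerably richer): the owner appears once in the circle, and navigation follows the ring until it returns to the owner.

```python
# A CODASYL-style set sketched as one circular linked list: the owner
# record appears once, and member records are threaded into the ring.
class Record:
    def __init__(self, data):
        self.data = data
        self.next = self  # a ring of one until members are inserted

def insert_member(owner, member):
    member.next = owner.next
    owner.next = member

def members(owner):
    """Walk the ring from the owner until it closes back on the owner."""
    node, out = owner.next, []
    while node is not owner:
        out.append(node.data)
        node = node.next
    return out

dept = Record("Research")           # set owner: one per circle
insert_member(dept, Record("Bob"))  # members: many per circle
insert_member(dept, Record("Ada"))
print(members(dept))  # ['Ada', 'Bob']
```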
|
7,068
|
The network model is able to represent redundancy in data more efficiently than in the hierarchical model, and there can be more than one path from an ancestor node to a descendant. The operations of the network model are navigational in style: a program maintains a current position, and navigates from one record to another by following the relationships in which the record participates. Records can also be located by supplying key values.
|
7,069
|
Although it is not an essential feature of the model, network databases generally implement the set relationships by means of pointers that directly address the location of a record on disk. This gives excellent retrieval performance, at the expense of operations such as database loading and reorganization.
|
7,070
|
Popular DBMS products that utilized it were Cincom Systems' Total and Cullinet's IDMS. IDMS gained a considerable customer base; in the 1980s, it adopted the relational model and SQL in addition to its original tools and languages.
|
7,071
|
Most object databases use the navigational concept to provide fast navigation across networks of objects, generally using object identifiers as "smart" pointers to related objects. Objectivity/DB, for instance, implements named one-to-one, one-to-many, many-to-one, and many-to-many relationships that can cross databases. Many object databases also support SQL, combining the strengths of both models.
|
7,072
|
In an inverted file or inverted index, the contents of the data are used as keys in a lookup table, and the values in the table are pointers to the location of each instance of a given content item. This is also the logical structure of contemporary database indexes, which might use only the contents of particular columns in the lookup table. The inverted file data model can place indexes in a set of files alongside existing flat database files in order to access needed records in those files directly and efficiently.
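A minimal sketch of the idea, with invented sample data: the terms occurring in each record become the keys of a lookup table whose values point back to the records containing them.

```python
# Build an inverted index: content terms become keys; values are lists of
# record positions where each term occurs.
records = ["the quick fox", "the lazy dog", "quick brown dog"]

index = {}
for pos, text in enumerate(records):
    for term in text.split():
        index.setdefault(term, []).append(pos)

print(index["quick"])  # records 0 and 2 contain "quick"
print(index["dog"])    # records 1 and 2 contain "dog"
```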
|
7,073
|
Notable for using this data model is the ADABAS DBMS of Software AG, introduced in 1970. ADABAS gained a considerable customer base and is still supported today. In the 1980s it adopted the relational model and SQL in addition to its original tools and languages.
|
7,074
|
The document-oriented database Clusterpoint, for example, uses an inverted-index model to provide fast full-text search over XML or JSON data objects.
|
7,075
|
The relational model was introduced by E.F. Codd in 1970 as a way to make database management systems more independent of any particular application. It is a mathematical model defined in terms of predicate logic and set theory, and implementations of it have been used by mainframe, midrange and microcomputer systems.
|
7,076
|
The products that are generally referred to as relational databases in fact implement a model that is only an approximation to the mathematical model defined by Codd. Three key terms are used extensively in relational database models: relations, attributes, and domains. A relation is a table with columns and rows. The named columns of the relation are called attributes, and the domain is the set of values the attributes are allowed to take.
|
7,077
|
The basic data structure of the relational model is the table, where information about a particular entity is represented in rows and columns. Thus, the "relation" in "relational database" refers to the various tables in the database; a relation is a set of tuples. The columns enumerate the various attributes of the entity, and a row is an actual instance of the entity that is represented by the relation. For example, each tuple of an employee table represents various attributes of a single employee.
|
7,078
|
All relations in a relational database have to adhere to some basic rules to qualify as relations. First, the ordering of columns in a table is immaterial. Second, there cannot be two identical tuples or rows in a table. And third, each tuple will contain a single value for each of its attributes.
|
7,079
|
A relational database contains multiple tables, each similar to the one in the "flat" database model. One of the strengths of the relational model is that, in principle, any value occurring in two different records implies a relationship between those two records. Yet, in order to enforce explicit integrity constraints, relationships between records in tables can also be defined explicitly, by identifying or non-identifying parent–child relationships characterized by assigning cardinality (1:M, M:M). Tables can also have a designated single attribute or set of attributes that acts as a "key", which can be used to uniquely identify each tuple in the table.
|
7,080
|
A key that can be used to uniquely identify a row in a table is called a primary key. Keys are commonly used to join or combine data from two or more tables. For example, an Employee table may contain a column named Location which contains a value that matches the key of a Location table. Keys are also critical in the creation of indexes, which facilitate fast retrieval of data from large tables. Any column can be a key, or multiple columns can be grouped together into a compound key. It is not necessary to define all the keys in advance; a column can be used as a key even if it was not originally intended to be one.
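A short sketch of the Employee/Location example using SQLite (the table and column names follow the paragraph above; the data is invented): the Location column of Employee matches the primary key of the Location table, so the two tables can be joined on it.

```python
import sqlite3

# An in-memory database with a Location table keyed by Id, and an
# Employee table whose Location column references that key.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE Location (Id INTEGER PRIMARY KEY, City TEXT);
    CREATE TABLE Employee (Name TEXT, Location INTEGER REFERENCES Location(Id));
    INSERT INTO Location VALUES (1, 'Buffalo'), (2, 'Princeton');
    INSERT INTO Employee VALUES ('Ada', 1), ('Bob', 2);
""")

# Join the tables on the key to combine data from both.
rows = con.execute("""
    SELECT Employee.Name, Location.City
    FROM Employee JOIN Location ON Employee.Location = Location.Id
    ORDER BY Employee.Name
""").fetchall()
print(rows)  # [('Ada', 'Buffalo'), ('Bob', 'Princeton')]
```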
|
7,081
|
A key that has an external, real-world meaning is sometimes called a "natural" key. If no natural key is suitable, an arbitrary or surrogate key can be assigned. In practice, most databases have both generated and natural keys, because generated keys can be used internally to create links between rows that cannot break, while natural keys can be used, less reliably, for searches and for integration with other databases.
|
7,082
|
The most common query language used with the relational model is the Structured Query Language (SQL).
|
7,083
|
The dimensional model is a specialized adaptation of the relational model used to represent data in data warehouses in such a way that the data can easily be summarized using online analytical processing (OLAP) queries. In the dimensional model, a database schema consists of a single large table of facts that are described using dimensions and measures. A dimension provides the context of a fact and is used in queries to group related facts together. Dimensions tend to be discrete and are often hierarchical; for example, a location dimension might include building, state, and country. A measure is a quantity describing the fact, such as revenue. It is important that measures can be meaningfully aggregated—for example, the revenue from different locations can be added together.
|
7,084
|
In an OLAP query, dimensions are chosen and the facts are grouped and aggregated together to create a summary.
|
7,085
|
The dimensional model is often implemented on top of the relational model using a star schema, consisting of one highly normalized table containing the facts, and surrounding denormalized tables containing each dimension. An alternative physical implementation, called a snowflake schema, normalizes multi-level hierarchies within a dimension into multiple tables.
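A minimal star-schema rollup sketched in plain Python (table contents and names invented): fact rows reference a dimension key and carry a measure, the denormalized dimension table holds the hierarchy levels, and a query groups the measure by one level of that hierarchy.

```python
from collections import defaultdict

# Fact table: each row references a dimension key and carries a measure.
facts = [
    {"location_id": 1, "revenue": 100.0},
    {"location_id": 1, "revenue": 50.0},
    {"location_id": 2, "revenue": 75.0},
]
# Denormalized dimension table: key plus the hierarchy levels.
locations = {1: {"state": "NY", "country": "US"},
             2: {"state": "CA", "country": "US"}}

# OLAP-style rollup: group facts by a chosen dimension level and
# aggregate the measure.
by_state = defaultdict(float)
for fact in facts:
    by_state[locations[fact["location_id"]]["state"]] += fact["revenue"]

print(dict(by_state))  # {'NY': 150.0, 'CA': 75.0}
```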
|
7,086
|
A data warehouse can contain multiple dimensional schemas that share dimension tables, allowing them to be used together. Coming up with a standard set of dimensions is an important part of dimensional modeling.
|
7,087
|
Its high performance has made the dimensional model the most popular database structure for OLAP.
|
7,088
|
Products offering a more general data model than the relational model are sometimes classified as post-relational. Alternate terms include "hybrid database", "Object-enhanced RDBMS" and others. The data model in such products incorporates relations but is not constrained by E.F. Codd's Information Principle, which requires that
|
7,089
|
all information in the database must be cast explicitly in terms of values in relations and in no other way
|
7,090
|
Some of these extensions to the relational model integrate concepts from technologies that pre-date the relational model. For example, they allow representation of a directed graph with trees on the nodes. The German company sones implements this concept in its GraphDB.
|
7,091
|
Some post-relational products extend relational systems with non-relational features. Others arrived in much the same place by adding relational features to pre-relational systems. Paradoxically, this allows products that are historically pre-relational, such as PICK and MUMPS, to make a plausible claim to be post-relational.
|
7,092
|
The resource space model is a non-relational data model based on multi-dimensional classification.
|
7,093
|
Graph databases allow even more general structure than a network database; any node may be connected to any other node.
|
7,094
|
Multivalue databases handle "lumpy" data: they can store data exactly as relational databases do, but they also permit a level of depth which the relational model can only approximate using sub-tables. This is nearly identical to the way XML expresses data, where a given field/attribute can have multiple right answers at the same time. Multivalue can be thought of as a compressed form of XML.
|
7,095
|
An example is an invoice, which in either a multivalue or a relational database could be seen as an Invoice Header table (one entry per invoice) and an Invoice Detail table (one entry per line item). In the multivalue model, we have the option of storing the data as one table, with an embedded table representing the detail: a single Invoice table with one entry per invoice and no other tables needed.
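The two shapes can be sketched as follows (field names invented): the relational form splits header and detail into two flat tables, while the multivalue form embeds the detail lines directly in the invoice record.

```python
# Relational shape: two flat tables linked by the invoice number.
headers = [{"invoice": 42, "customer": "Acme"}]
details = [{"invoice": 42, "item": "widget", "qty": 3},
           {"invoice": 42, "item": "gadget", "qty": 1}]

# Multivalue shape: one table, with the detail lines embedded in the
# invoice row itself.
invoices = [{"invoice": 42, "customer": "Acme",
             "lines": [{"item": "widget", "qty": 3},
                       {"item": "gadget", "qty": 1}]}]

# A single read fetches the whole invoice, header and lines together.
print(invoices[0]["customer"], len(invoices[0]["lines"]))  # Acme 2
```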
|
7,096
|
The advantage is that the conceptual invoice and its stored representation correspond one-to-one. This also results in fewer reads, fewer referential-integrity issues, and a dramatic decrease in the hardware needed to support a given transaction volume.
|
7,097
|
In the 1990s, the object-oriented programming paradigm was applied to database technology, creating a new database model known as object databases. This aims to avoid the object–relational impedance mismatch: the overhead of converting information between its representation in the database and its representation in the application program. Even further, the type system used in a particular application can be defined directly in the database, allowing the database to enforce the same data integrity invariants. Object databases also introduce the key ideas of object programming, such as encapsulation and polymorphism, into the world of databases.
|
7,098
|
A variety of ways have been tried for storing objects in a database. Some products have approached the problem from the application-programming end, by making the objects manipulated by the program persistent. This typically requires the addition of some kind of query language, since conventional programming languages do not have the ability to find objects based on their information content. Others have attacked the problem from the database end, by defining an object-oriented data model for the database and defining a database programming language that allows full programming capabilities as well as traditional query facilities.
|
7,099
|
Object databases suffered because of a lack of standardization: although standards were defined by ODMG, they were never implemented well enough to ensure interoperability between products. Nevertheless, object databases have been used successfully in many applications: usually specialized applications such as engineering databases or molecular biology databases rather than mainstream commercial data processing. However, object database ideas were picked up by the relational vendors and influenced extensions made to these products and indeed to the SQL language.
|
7,100
|
An alternative to translating between objects and relational databases is to use an object–relational mapping library.
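A minimal sketch of what such a mapping layer does, using SQLite and invented names: hand-written code translates between rows and objects, which is the work an object–relational mapping library automates.

```python
import sqlite3

# An application-side class...
class Employee:
    def __init__(self, name, location):
        self.name, self.location = name, location

# ...and mapping code translating between objects and rows. An ORM
# library generates code like this from the class definition.
def save(con, emp):
    con.execute("INSERT INTO Employee VALUES (?, ?)", (emp.name, emp.location))

def load_all(con):
    return [Employee(name, loc)
            for name, loc in con.execute("SELECT Name, Location FROM Employee")]

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE Employee (Name TEXT, Location TEXT)")
save(con, Employee("Ada", "Buffalo"))
print([e.name for e in load_all(con)])  # ['Ada']
```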
|