{
"paper_id": "T75-2018",
"header": {
"generated_with": "S2ORC 1.0.0",
"date_generated": "2023-01-19T07:43:07.713858Z"
},
"title": "META-COMPILING TEXT GRAMMARS AS A MODEL FOR HUMAN BEHAVIOR",
"authors": [
{
"first": "Sheldon",
"middle": [],
"last": "Klein",
"suffix": "",
"affiliation": {
"laboratory": "",
"institution": "University of Wisconsin",
"location": {}
},
"email": ""
}
],
"year": "",
"venue": null,
"identifiers": {},
"abstract": "",
"pdf_parse": {
"paper_id": "T75-2018",
"_pdf_hash": "",
"abstract": [],
"body_text": [
{
"text": "In our efforts to model the totality of synchronic and diachronic language behavior in complex social groups, we developed a meta-symbolic simulation system that includes a powerful behavioral simulation programming language that models, generates and manipulates events in the notation of a semantic network that changes through time, and a generalized, semantics-to-surface structure generation mechanism that can describe changes in the semantic universe in the syntax of any natural language for which a grammar is supplied.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "I. BACKGROUND",
"sec_num": null
},
{
"text": "Because the system is a meta-theoretical device, it can handle generative semantic grammars formulated within a variety of theoretical frameworks.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "I. BACKGROUND",
"sec_num": null
},
{
"text": "A key feature of the system is that the semantic deep structure of the non-verbal, behavioral rules may be represented in the same network notation as the semantics for natural language grammars, and, as a consequence, provide non-verbal context for linguistic rules.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "I. BACKGROUND",
"sec_num": null
},
{
"text": "We are also experimenting with a natural language meta-compiling capability, that is, the use of the semantic network to generate productions in the simulation language itself --productions in the form of \"texts\" that may themselves be compiled as new behavioral rules during the flow of the simulation --rules that may themselves control the process of deriving new rules. This feature permits non-verbal behavioral rules to be derived from natural language conversational inputs, and through inference techniques identical with those for inferring natural language generative semantic grammars.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "I. BACKGROUND",
"sec_num": null
},
{
"text": "The total system has the power of at least the 2nd order predicate calculus, and will facilitate the formulation of highly abstract meta-models of discourse, including the logical quantification of such models. Perhaps the ultimate test is the modelling of the heuristic processes of Levi-Strauss.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "I. BACKGROUND",
"sec_num": null
},
{
"text": "We hope to be able to build a model that learns text grammars with arbitrarily abstract semantics such as that manifested in Levi-Strauss (1969) .",
"cite_spans": [
{
"start": 125,
"end": 144,
"text": "Levi-Strauss (1969)",
"ref_id": null
}
],
"ref_spans": [],
"eq_spans": [],
"section": "I. BACKGROUND",
"sec_num": null
},
{
"text": "At the moment, we are working on modelling the text grammar he himself has derived (Klein et al 1975) .",
"cite_spans": [
{
"start": 83,
"end": 101,
"text": "(Klein et al 1975)",
"ref_id": null
}
],
"ref_spans": [],
"eq_spans": [],
"section": "I. BACKGROUND",
"sec_num": null
},
{
"text": "The potential of our work is to handle a degree and kind of abstraction in semantics heretofore untouched by linguistics, including the modelling of the automatic creation of text grammars for dreams and myths as a function of cultural rules.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "I. BACKGROUND",
"sec_num": null
},
{
"text": "OF THE META-SYMBOLIC SIMULATION SYSTEM AS A THEORY TESTING DEVICE Our methodology and programming style have yielded a system wherein all the rules, and even the form of the theories in which they are cast, are input as data.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "GENERALITY",
"sec_num": null
},
{
"text": "As far as we can determine, this permits us to encode in our system virtually all the theoretical models currently prevalent in linguistics, plus heretofore unformulated models of vastly greater power.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "GENERALITY",
"sec_num": null
}
],
"back_matter": [],
"bib_entries": {
"BIBREF0": {
"ref_id": "b0",
"title": "A new \"revolution\" in linguistics? --\"Text-grammars\" vs. \"sentence-grammars",
"authors": [
{
"first": "M",
"middle": [],
"last": "Dascal",
"suffix": ""
},
{
"first": "A",
"middle": [],
"last": "Margalit",
"suffix": ""
}
],
"year": 1974,
"venue": "Theoretical Linguistics",
"volume": "1",
"issue": "",
"pages": "195--213",
"other_ids": {},
"num": null,
"urls": [],
"raw_text": "Dascal, M. and A. Margalit 1974. A new \"revolution\" in linguistics? --\"Text-grammars\" vs. \"sentence-grammars.\" Theoretical Linguistics, 1:195-213",
"links": null
},
"BIBREF1": {
"ref_id": "b1",
"title": "Text Grammar and Narrative Structures",
"authors": [
{
"first": "T",
"middle": [
"A"
],
"last": "Van Dijk",
"suffix": ""
}
],
"year": 1973,
"venue": "Poetics",
"volume": "3",
"issue": "",
"pages": "",
"other_ids": {},
"num": null,
"urls": [],
"raw_text": "van Dijk, T.A. (ed), 1973, Text Grammar and Narrative Structures. Poetics 3.",
"links": null
}
},
"ref_entries": {
"TABREF0": {
"content": "<table><tr><td/><td/><td/><td/><td/><td/><td/><td>!</td></tr><tr><td colspan=\"7\">generated model at provided apparent surface diversity. both stories from a level of abstraction a structural that a semantic unification of the Tale I THE BORISIEVICHES LIVE IN A DISTANCE PROVINCE. THE FATHER IS EMELYA. THE ONLY SON IS BORIS. MARTHA IS THE ONLY DAUGHTER. EMELYA HAS THE SHEEP. BORIS, MARTHA AND THE SHEEP ARE IN THE WOODS. BORIS SAYS MARTHA, DO NOT LEAVE THE WOODS. BORIS LEAVES TO GO BERRY GATHERING. MARTHA LEAVES THE WOODS. same abstract semantic Unit. A major type of behavior rule modification and extension is the ability to requantify the rules as a heuristic function of experience. The process does not involve recompilation --rather modification of the domain of applicability of an existing rule. One of the types of semantic parsing possible in the system is the determination of the presuppositions of the semantic content of input text. The scenario rules that could have generated the text have preconditions, and these preconditions also have their own preconditions as specified by other rules. In cases where the semantic content of an input text is not potentially derivable from existing behavioral rules, the system can posit requantification A WOLF APPEARS IN THE DISTANT PROVINCE. EMELYA ASKS THE WOLF WHERE IS YOUR WISDOM. THE WOLF SAYS THAT MY WISDOM IS IN A MAGIC EGG. THE WOLF PLUNDERS THE SHEEP. (assignments and reassignments to semantic classes) to make the input text derivable. Or, if necessary, the same end can be achieved by compiling new rules that would make the text plausible. THE WITCH PROPOSES THAT MARTHA LISTEN TO THE GUSLA WITHOUT FALLING ASLEEP. EMELYA SENDS MARTHA TO SEARCH FOR THE WOLF. MARTHA DECIDES TO SEARCH FOR THE WOLF. MARTHA LEAVES ON A SEARCH. MARTHA MEETS A WITCH ALONG THE WAY. 
Generalization of the method makes it possible to build complex learning models for highly abstract, semantically driven text grammars.</td><td>Folktale (Propp Russian fairytales, 1968) of his text grammar, at an average speed which generated according to the rules 50 of 128 words a second, again including plot computations and specification of deep structure as well as surface syntax (Klein et al 1974, Klein et all 1975). Our earliest automatic text generation work used syntactic dependency network/graphs with 2-valued labelling of edges as an approximation to semantic network/graphs with multi-valued labelling of edges (Klein &amp; Simmons 1963, Klein 1965a, 1965b). Our work on automatic inference of grammars includes the world's first program for learning context free, phrase structure grammars, for both natural and artificial languages, and the first program for learning transformational grammars (Klein 1967, Klein et al 1968, Klein &amp; Kuppin 1970). More recent inference work includes the formulation of techniques for automatic inference of generative semantic grammars (Klein 1973) and for the ontogeny of Pidgin and Creole languages (Klein &amp; Rozencvejg 1974). THE MAGIC BOW UNPROTECTED. THE MAGIC BOW, A MAGIC CARPET AND A MAGIC BOX ARE SEIZED BY NICHOLAS. NICHOLAS TRAVELS TO THE LOCATION OF THE MAGIC STEED III. THE KEY QUESTION We perceive the locus of theoretical interest to be the process of verbal and non-verbal behavior transmission across generations. Our work on modelling speech communities includes designs for simulations In formulating components for automatic (Klein 1974a). in which many modelled individuals, each with his own semantic network, his own grammar(s), his own behavior rules, interact with each other according to the modelled rules of the social structure of the society</td><td>\u2022 i i I \u2022 | i I I I I</td></tr><tr><td colspan=\"5\">MARTHA LISTENING TO THE GUSLA. RESPONDS BY STAYING A MAGIC WAFER IS CONSUMED BY MARTHA. 
AWAKE MARTHA OBTAINS SUPER-HUMAN STRENGTH. MARTHA TRAVELS TO THE LOCATION OF THE IN ANOTHER KINGDOM.</td><td colspan=\"2\">WHILE WOLF</td><td>inference simulation system, we find that of rules in the It is our hope to be able to model meta-symbolic the the common notation for the semantics of the non-verbal behavioral simulation rules and natural transmission process of all the rules in the system. This means that newly born modelled individuals will infer rules for natural language means that the same learning heuristics may be used to infer language and also for non-verbal behavioral behavioral simulation rules, as a function of inputs of</td><td>I</td></tr><tr><td colspan=\"4\">MARTHA IS DIRECTED BY A HEDGEHOG. MARTHA FINDS THE WOLF THEY FIGHT IN AN OPEN FIELD.</td><td/><td/><td/><td>rules implication is that the as well as linguistic totality texts supplied by other individuals. The texts may verbal and non-verbal behavior, in complex rules. The of human modelled be verbal discourse, or non-verbal sequences of</td><td>\u2022 |</td></tr><tr><td>MARTHA IS WOUNDED.</td><td/><td/><td/><td/><td/><td/><td>social behavior. groups, The</td><td>both learning synchronically individual</td><td>and will</td></tr><tr><td>MARTHA DEFEATS THE</td><td>WOLF</td><td>WITH</td><td>THE</td><td colspan=\"2\">AID</td><td>OF</td><td>diachronically, actually compile and recompile new may now be modelled within versions</td></tr><tr><td colspan=\"3\">SUPER-HUMAN STRENGTH. THE WOLF IS CAUGHT BY MARTHA. MARTHA STARTS BACK HOME.</td><td/><td/><td/><td/><td>the same notational framework. started as a generalized device for testing What for us of his own behavioral rules as the simulation process proceeds. 
His own test varying theoretical models as part of an productions of behavior scenarios as well as</td><td>I</td></tr><tr><td>MARTHA RETURNS HOME.</td><td/><td/><td/><td/><td/><td/><td>effort natural language discourse will to model language</td><td>change be subject and</td></tr><tr><td/><td/><td/><td/><td/><td/><td/><td>variation (Klein 1974a, Klein to evaluation and possible</td><td>&amp; Rozencvejg correction by</td><td>I</td></tr><tr><td>Tale 2</td><td/><td/><td/><td/><td/><td/><td>1974) other members of the modelled community, and now appears as the basis for a higher level theory of the linguistic basis of their reactions as well as the consequences human behavior (Klein 1974b). of the productions, will serve as a control</td><td>m</td></tr><tr><td/><td/><td/><td/><td/><td/><td/><td>on</td><td>the</td><td>entire</td><td>learning</td><td>process.</td><td>And, as</td><td>i</td></tr><tr><td/><td/><td/><td/><td/><td/><td/><td>II. indicated earlier, the rules to be inferred, WHAT IS A TEXT GRAMMAR? compiled and recompiled will include rules</td><td>I</td></tr><tr><td/><td/><td/><td/><td/><td/><td/><td>that govern the</td><td>process</td><td>of</td><td>inference</td><td>and</td></tr><tr><td/><td/><td/><td/><td/><td/><td/><td>The text grammarian movement, compilation itself.</td><td>centered</td></tr><tr><td colspan=\"7\">Achievements portion of the system include a text grammar with the generative model that generates 2100 word murder mystery stories in less than 19 seconds each, complete with calculation of the plot and specification of the deep structure as well as the surface syntax (Klein et al 1973). The speed of this generation is 100 to 1000 times faster than other existing programs using transformational grammars. (The algorithm for the semantics-to-surface structure NICHOLAS DECIDES TO SEARCH FOR THE MAGIC generative component is such that STEED. processing time increases only linearly as a function of sentence NICHOLAS LEAVES ON A SEARCH. 
length and syntactic NICHOLAS MEETS A JUG ALONG THE WAY. complexity.) THE JUG IS FIGHTING WITH ELENA OVER A MAGIC</td><td>in Germany as that of van Dijk, Ihwe, Pet~fi and Rieser and Holland, includes work such (1972), Pet6fi and Rieser (1973), Pet~fi (1973), van Dijk (1973), and van Dijk and IV. LOGICAL QUANTIFICATION, SEMANTIC Pet6fi (1974). The underlying motivation of PARSING, PRESUPPOSITIONAL ANALYSIS this group is the belief that Chomskian derived linguistic theories are inadequate We have mentioned the 2nd order or to handle the complexities of complex higher predicate calculus. For our narrative and discourse --that more purposes, the essential feature is that the powerful logical devices are needed. An attempted refutation of the text grammarian logical quantification of the rules may be quantified by the contents of the rules position appeared in Dascal &amp; Margalit themselves. Meta-compiling of rules (1974). Our own work on Propp and governing meta-compiling is an example of Levi-Strauss models refutes the refutation this process. by demonstration (Klein et al 1974). There are other techniques available.</td><td>I l J I \u2022 Q J I</td></tr><tr><td colspan=\"7\">More recent achievements include models of portions of Levi-Strauss\" mythology work in The Raw &amp; the Cooked (Levi-Strauss 1969) and a model for Propp's Morphology of the BOW. THE JUG ASKS NICHOLAS TO DIVIDE THE MAGIC BOW. NICHOLAS TRICKS THE DISPUTANTS INTO LEAVING</td><td>To provide the reader with an intuitive of the nature of a text grammar, we offer the following two view Russian fairytales The behavioral rules operate with high-level classes that make it possible to formulate rules that can treat objects, characters and generated by our automated model of Propp (Klein et al 1974). The same text grammar complex actions as manifestations of the</td><td>!</td></tr><tr><td/><td/><td/><td/><td/><td/><td>84</td><td>I</td></tr></table>",
"type_str": "table",
"text": "THE MOREVNAS LIVE IN A DISTANT PROVINCE. THE FATHER IS EREMA. THE MOTHER IS VASILISA. THE OLDEST SON IS BALDAK. THE YOUNGER SON IS MARCO. THE YOUNGEST SON IS BORIS. THE OLDEST DAUGHTER IS MARIA. THE YOUNGER DAUGHTER IS KATRINA. THE YOUNGEST DAUGHTER IS MARTHA. NICHOLAS ALSO LIVES IN THE SAME LAND. NICHOLAS IS OF MIRACULOUS BIRTH. BALDAK HAS A MAGIC STEED. A BEAR APPEARS IN THE DISTANT PROVINCE. THE BEAR SEIZES THE MAGIC STEED. BALDAK CALLS FOR HELP FROM NICHOLAS. IN ANOTHER KINGDOM. NICHOLAS BY THE MAGIC CARPET. NICHOLAS FINDS THE BEAR. NICHOLAS SURPRISES THE BEAR. NICHOLAS KILLS THE BEAR WITH THE AID OF THE MAGIC BOW. THE MAGIC STEED APPEARS FROM THE MAGIC BOX. NICHOLAS STARTS BACK HOME. THE BEAR'S FATHER CHASES AFTER NICHOLAS. NICHOLAS ESCAPES BY FLYING ON A FALCON. NICHOLAS RETURNS HOME.",
"num": null,
"html": null
}
}
}
}