{
"paper_id": "J82-1001",
"header": {
"generated_with": "S2ORC 1.0.0",
"date_generated": "2023-01-19T03:07:17.394900Z"
},
"title": "Phrase Structure Trees Bear More Fruit than You Would Have Thought",
"authors": [
{
"first": "Aravind",
"middle": [
"K"
],
"last": "Joshi",
"suffix": "",
"affiliation": {},
"email": ""
},
{
"first": "J",
"middle": [],
"last": "Bresnan",
"suffix": "",
"affiliation": {},
"email": ""
},
{
"first": "R",
"middle": [
"M"
],
"last": "Kaplan",
"suffix": "",
"affiliation": {},
"email": ""
},
{
"first": "S",
"middle": [],
"last": "Peters",
"suffix": "",
"affiliation": {},
"email": ""
},
{
"first": "A",
"middle": [],
"last": "Zaenen",
"suffix": "",
"affiliation": {},
"email": ""
}
],
"year": "1982",
"venue": null,
"identifiers": {},
"abstract": "In this paper we will present several results concerning phrase structure trees. These results show that phrase structure trees, when viewed in certain ways, have much more descriptive power than one would have thought. We have given a brief account of local constraints on structural descriptions and an intuitive proof of a theorem about local constraints. We have compared the local constraints approach to some aspects of Gazdar's framework and that of Peters and Ritchie and of Karttunen. We have also presented some results on skeletons (phrase structure trees without labels) which show that phrase structure trees, even when deprived of the labels, retain in a certain sense all the structural information. This result has implications for grammatical inference procedures.",
"pdf_parse": {
"paper_id": "J82-1001",
"_pdf_hash": "",
"abstract": [
{
"text": "In this paper we will present several results concerning phrase structure trees. These results show that phrase structure trees, when viewed in certain ways, have much more descriptive power than one would have thought. We have given a brief account of local constraints on structural descriptions and an intuitive proof of a theorem about local constraints. We have compared the local constraints approach to some aspects of Gazdar's framework and that of Peters and Ritchie and of Karttunen. We have also presented some results on skeletons (phrase structure trees without labels) which show that phrase structure trees, even when deprived of the labels, retain in a certain sense all the structural information. This result has implications for grammatical inference procedures.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "Abstract",
"sec_num": null
}
],
"body_text": [
{
"text": "There is renewed interest in examining the descriptive as well as generative power of phrase structure grammars. The primary motivation for this interest has come from the recent investigations in alternatives to transformational grammars (e.g., Bresnan 1978; Kaplan and Bresnan 1979; Gazdar 1978 Gazdar , 1979a Gazdar , 1979b Peters 1980; Karttunen 1980) . 2 Some of these approaches require amendments to phrase structure grammars (especially Gazdar 1978 Gazdar , 1979a Gazdar , 1979b Peters 1980; Karttunen 1980 ) that increase their descriptive power without increasing their generative power. Gazdar wants to restrict the power to that of context-free languages. Others are not committed to this restriction; some of these systems may have the power of the so-called indexed grammars. The power of the phrase linking grammars of Peters and Ritchie is not completely known at this time.",
"cite_spans": [
{
"start": 246,
"end": 259,
"text": "Bresnan 1978;",
"ref_id": "BIBREF0"
},
{
"start": 260,
"end": 284,
"text": "Kaplan and Bresnan 1979;",
"ref_id": "BIBREF8"
},
{
"start": 285,
"end": 296,
"text": "Gazdar 1978",
"ref_id": "BIBREF2"
},
{
"start": 297,
"end": 311,
"text": "Gazdar , 1979a",
"ref_id": null
},
{
"start": 312,
"end": 326,
"text": "Gazdar , 1979b",
"ref_id": null
},
{
"start": 327,
"end": 339,
"text": "Peters 1980;",
"ref_id": "BIBREF12"
},
{
"start": 340,
"end": 355,
"text": "Karttunen 1980)",
"ref_id": "BIBREF9"
},
{
"start": 445,
"end": 456,
"text": "Gazdar 1978",
"ref_id": "BIBREF2"
},
{
"start": 457,
"end": 471,
"text": "Gazdar , 1979a",
"ref_id": null
},
{
"start": 472,
"end": 486,
"text": "Gazdar , 1979b",
"ref_id": null
},
{
"start": 487,
"end": 499,
"text": "Peters 1980;",
"ref_id": "BIBREF12"
},
{
"start": 500,
"end": 514,
"text": "Karttunen 1980",
"ref_id": "BIBREF9"
}
],
"ref_spans": [],
"eq_spans": [],
"section": "Introduction",
"sec_num": "1."
},
{
"text": "The notion of node admissibility plays an important role in these formulations. The earliest reference to node admissibility appears in Chomsky 1965 (p. 215); he suggests the possibility of constructing a rewriting system where the rewriting of a symbol is determined not only by the symbol being rewritten but also by the dominating category symbol. In his analysis of the base component of a transformational grammar, McCawley 1968 suggested that the appropriate role of context-sensitive rules in the base component of a transformational grammar can be viewed as node admissibility conditions on the base trees.",
"cite_spans": [
{
"start": 136,
"end": 157,
"text": "Chomsky 1965 (p. 215)",
"ref_id": null
},
{
"start": 410,
"end": 432,
"text": "grammar, McCawley 1968",
"ref_id": null
}
],
"ref_spans": [],
"eq_spans": [],
"section": "Introduction",
"sec_num": "1."
},
{
"text": "The base component is thus a set of labeled trees satisfying certain conditions. Peters and Ritchie 1969 made this notion precise and proved an important result, which roughly states that the weak generative power of a context-sensitive grammar is that of a context-free grammar, if the rules are used as node admissibility conditions. Later Joshi and Levy 1977 made a substantial extension of this result and showed that, if the node admissibility conditions include Boolean combinations of proper analysis predicates and domination predicates, the weak generative capacity is still that of a context-free grammar.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "Introduction",
"sec_num": "1."
},
{
"text": "Besides the notion of node admissibility, Gazdar introduces two other notions in his framework (Generalized Phrase Structure Grammars, GPSG). These are (1) categories with holes and an associated set of derived rules and linking rules, and (2) metarules for deriving rules from one another. The categories with holes and the associated rules do not increase the weak generative power beyond that of context-free grammars. The metarules, unless constrained in some fashion, will increase the generative power, because, for example, a metarule can generate an infinite set of context-free rules that can generate a strictly context-sensitive language.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "Introduction",
"sec_num": "1."
},
{
"text": "(The language {a^n b^n c^n | n ≥ 1} can be generated in this way.) The metarules in the actual grammars written in the GPSG framework so far are constrained enough so that they do not increase the generative power.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "Introduction",
"sec_num": "1."
},
{
"text": "Besides node admissibility conditions, Peters 1980 introduces a device for \"linking\" nodes (see also Karttunen 1980) . A lower node can be \"linked\" to a node higher in the tree and becomes \"visible\" while the semantic interpretation is carried out at the lower node. The idea here is to let the context-free grammar overgenerate and the semantic interpretation weed out ill-formed structures. Karttunen 1980 has developed a parser using this idea. Kaplan and Bresnan 1979 have proposed an intermediate level of representation called functional structure. This level serves to filter structures generated by a phrase structure grammar. Categories with holes are not used in their framework. In this paper we will not be concerned with the Kaplan-Bresnan system.",
"cite_spans": [
{
"start": 101,
"end": 116,
"text": "Karttunen 1980)",
"ref_id": "BIBREF9"
},
{
"start": 393,
"end": 407,
"text": "Karttunen 1980",
"ref_id": "BIBREF9"
},
{
"start": 448,
"end": 471,
"text": "Kaplan and Bresnan 1979",
"ref_id": "BIBREF8"
}
],
"ref_spans": [],
"eq_spans": [],
"section": "Introduction",
"sec_num": "1."
},
{
"text": "In Section 2 we briefly review Gazdar's proposal, especially his notion of categories with holes. We give a short historical review of this notion.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "Introduction",
"sec_num": "1."
},
{
"text": "In Section 3 we briefly describe our work on local constraints on structural descriptions (Joshi and Levy 1977; Joshi, Levy, and Yueh 1980) . We give an intuitive explanation of these results.",
"cite_spans": [
{
"start": 90,
"end": 111,
"text": "(Joshi and Levy 1977;",
"ref_id": "BIBREF6"
},
{
"start": 112,
"end": 139,
"text": "Joshi, Levy, and Yueh 1980)",
"ref_id": "BIBREF7"
}
],
"ref_spans": [],
"eq_spans": [],
"section": "Introduction",
"sec_num": "1."
},
{
"text": "In Section 4 we propose some extensions of our results and discuss them in the context of some long-distance rules. We also describe Peters's 1980 approach and present some suggestions for \"context-sensitive\" compositional semantics.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "Introduction",
"sec_num": "1."
},
{
"text": "In Section 5 we briefly present the framework of Peters and Karttunen and compare it with that of Gazdar and of ourselves.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "Introduction",
"sec_num": "1."
},
{
"text": "In Section 6 we briefly discuss our results concerning a characterization of structural descriptions entirely in terms of trees without labels.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "Introduction",
"sec_num": "1."
},
{
"text": "Gazdar 1979 has introduced categories with holes and some associated rules in order to allow for the base generation of \"unbounded\" dependencies. Let D(VN) be the set of derived categories α/β, where α and β are nonterminal symbols. For example, if S and NP are the only two nonterminal symbols, then D(VN) would consist of S/S, S/NP, NP/NP, and NP/S. The intended interpretation of a derived category (slashed category or a category with a hole) is as follows: A node labeled α/β will dominate subtrees identical to those that can be dominated by α, except that somewhere in every subtree of the α/β type there will occur a node of the form β/β dominating a resumptive pronoun, a trace, or the empty string, and every node linking α/β and β/β will be of the form α/β. Thus α/β labels a node of type α that dominates material containing a hole of the type β (i.e., an extraction site in a movement analysis). For example, S/NP is a sentence that has an NP missing somewhere. The derived rules allow the propagation of a hole and the linking rules allow the introduction of a category with a hole. For example, given the rule (1)",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "Gazdar's Formulation",
"sec_num": "2."
},
{
"text": "EQUATION",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [
{
"start": 0,
"end": 8,
"text": "EQUATION",
"ref_id": "EQREF",
"raw_str": "[s NPVP] 3",
"eq_num": "(1)"
}
],
"section": "Gazdar's Formulation",
"sec_num": "2."
},
{
"text": "3 This is the same as the rule S → NP VP, but written as a node admissibility condition.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "Gazdar's Formulation",
"sec_num": "2."
},
{
"text": "we will get two derived rules (2) and (3)",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "Gazdar's Formulation",
"sec_num": "2."
},
{
"text": "[S/NP NP/NP VP] (2) [S/NP NP VP/NP] (3)",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "Gazdar's Formulation",
"sec_num": "2."
},
{
"text": "An example of a linking rule is a rule (rule schema) that introduces a category with a hole as needed for topicalization.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "Gazdar's Formulation",
"sec_num": "2."
},
{
"text": "[S α S/α]",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "Gazdar's Formulation",
"sec_num": "2."
},
{
"text": "This rule will induce a structure like (6). The technique of categories with holes and the associated derived and linking rules allows unbounded dependencies to be accounted for in the phrase structure representation; however, this is accomplished at the expense of proliferation of categories of the type α/β (see also Karttunen 1980). Later, in Section 3, we will present an alternate way of representing (6) by means of local constraints and some of their generalizations.",
"cite_spans": [
{
"start": 321,
"end": 336,
"text": "Karttunen 1980)",
"ref_id": "BIBREF9"
}
],
"ref_spans": [],
"eq_spans": [],
"section": "Gazdar's Formulation",
"sec_num": "2."
},
{
"text": "The notion of categories with holes is not completely new.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "Gazdar's Formulation",
"sec_num": "2."
},
{
"text": "In his 'String Analysis of Language Structure', Harris 1956 Harris , 1962 introduces categories such as S-NP or S_NP (like S/NP of Gazdar) to account for moved constituents. He does not, however, seem to provide, at least not explicitly, machinery for carrying the \"hole\" downwards. He also has rules in his framework for introducing categories with holes. Thus, in his framework, something like (6) would be accomplished by allowing for a sentence form (a center string) of the form (7) (not entirely his notation).",
"cite_spans": [
{
"start": 48,
"end": 59,
"text": "Harris 1956",
"ref_id": "BIBREF5"
},
{
"start": 60,
"end": 73,
"text": "Harris , 1962",
"ref_id": null
}
],
"ref_spans": [],
"eq_spans": [],
"section": "Gazdar's Formulation",
"sec_num": "2."
},
{
"text": "NP V Ω - NP (7), where Ω = Object or Complement of V",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "Gazdar's Formulation",
"sec_num": "2."
},
{
"text": "Sager, who has constructed a very substantial parser starting from some of these ideas and extending them significantly, has allowed for the propagation of the 'hole' resulting in structures very similar to those of Gazdar. She has also used the notion of categories with holes in order to carry out some coordinate structure computation. For example, Sager allows for the coordination of S/a and S/a but not S and S/a. (See Sager 1967 for an early reference to her work.)",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "Gazdar's Formulation",
"sec_num": "2."
},
{
"text": "Gazdar is the first, however, to incorporate the notion of categories with holes and the associated rules in a formal framework for his syntactical theory and also to exploit it in a systematic manner for explaining coordinate structure phenomena.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "Gazdar's Formulation",
"sec_num": "2."
},
{
"text": "In this section we briefly review our work on local constraints. Although this work has already appeared (Joshi and Levy 1977, Joshi, Levy, and Yueh 1980) and attracted some attention recently, the demonstration of our results has remained somewhat inaccessible to many due to the technicalities of tree automata theory. In this paper we present an intuitive account of these results in terms of interacting finite state machines.",
"cite_spans": [
{
"start": 105,
"end": 132,
"text": "Levy 1977, Joshi, Levy, and",
"ref_id": "BIBREF6"
},
{
"start": 133,
"end": 143,
"text": "Yueh 1980)",
"ref_id": "BIBREF7"
}
],
"ref_spans": [],
"eq_spans": [],
"section": "Local Constraints",
"sec_num": "3."
},
{
"text": "The method of local constraints is an attempt to describe context-free languages in an apparently context-sensitive form that helps to retain the intuitive insights about the grammatical structure. This form of description, while apparently context-sensitive, is in fact context-free.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "Local Constraints",
"sec_num": "3."
},
{
"text": "Context-sensitive grammars, in general, are more powerful (with respect to weak generative capacity) than context-free grammars. A fascinating result of Peters and Ritchie 1969 is that if a context-sensitive grammar G is used for \"analysis\" then the language \"analyzed\" by G is context-free. First, we describe what we mean by the use of a context-sensitive grammar G for \"analysis\". Given a tree t, we define the set of proper analyses of t. Roughly speaking, a proper analysis of a tree is a slice across the tree. More precisely, the following recursive definition applies:",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "Definition of Local Constraints",
"sec_num": "3.1"
},
{
"text": "Definition 3.1. The set of proper analyses of a tree t, denoted Pt, is defined as follows. (i) If t = φ (the empty tree), then Pt = φ.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "Definition of Local Constraints",
"sec_num": "3.1"
},
{
"text": "A(t0, t1, ..., tn), then Pt = {A} ∪ P(t0) · P(t1) · ... · P(tn), where t0, t1, ..., tn are trees and '·' denotes concatenation (of sets). In transformational linguistics the context-sensitive and domination predicates are used to describe conditions on transformations; hence we have referred to these local constraints elsewhere as local transformations.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "(ii) If t =",
"sec_num": null
},
{
"text": "For example, consider the tree with root S in which S → A B, A → C d, B → E, C → c, and E → e. Its set of proper analyses is",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "S /\\",
"sec_num": null
},
{
"text": "Pt = {S, AB, AE, Ae, CdB, CdE, Cde, cdB, cdE, cde}.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "S /\\",
"sec_num": null
},
{
"text": "Let G be a context-sensitive grammar; i.e., its rules are of the form A -\" ~/~r__~ where A cV -E (V is the alphabet and E is the set of terminal symbols), w e V + (set of non-null strings on V) and ~r, 4~ E V* (set of all strings onV). If ~r and are both null, then the rule is a context-free rule. A tree t is said to be \"analyzable\" with respect to G if for each node of t some rule of G \"holds\". It is obvious how to check whether a context-free rule holds of a node or not. A context-sensitive rule A --~/~r ~ holds of a node labeled A if the string corresponding to the immediate descendants of that node is t~ and there is a proper analysis of t of the form p17rAdpp 2 that \"passes through\" the node, (Pl,P2 E V*). We call the contextual condition qr ff a proper analysis predicate.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "S /\\",
"sec_num": null
},
{
"text": "Similar to these context-sensitive rules, which allow us to specify context on the \"right\" and \"left\", we often need rules to specify context on the \"top\" or \"bottom\". Given a node labeled A in a tree t, we say that DOM(π __ φ), with π, φ ∈ V*, holds of that node if there is a path from the root of the tree to the frontier that passes through the node labeled A and is of the form p1 π A φ p2 (p1, p2 ∈ V*). We call such a condition a domination predicate.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "S /\\",
"sec_num": null
},
{
"text": "Theorem 3.1 (Joshi and Levy 1977) Let G be a finite set of local constraint rules and τ(G) the set of trees analyzable by G. (It is assumed here that the trees in τ(G) are sentential trees; i.e., the root node of a tree in τ(G) is labeled by the start symbol, S, and the terminal nodes are labeled by terminal symbols.) Then the string language L(τ(G)) = {x | x is the yield of t and t ∈ τ(G)} is context-free. In rules 1, 2, and 3, the context is null, and these rules are context-free. In rule 4 (and in rule 5), the constraint requires an 'a' on the left and that the node be dominated (immediately) by a T (and by an S in rule 5).",
"cite_spans": [
{
"start": 12,
"end": 33,
"text": "(Joshi and Levy 1977)",
"ref_id": "BIBREF6"
}
],
"ref_spans": [],
"eq_spans": [],
"section": "Results on Local Constraints",
"sec_num": "3.2"
},
{
"text": "The language generated by G can be derived by G1:",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "Results on Local Constraints",
"sec_num": "3.2"
},
{
"text": "S → e; S → aT1; S → aT; T → aS1; T → aS; T1 → bSc; S1 → bTc",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "Results on Local Constraints",
"sec_num": "3.2"
},
{
"text": "In G 1 there are additional nonterminals S 1 and T 1 that enable the context checking of the local constraints grammar, G, in the generation process.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "Results on Local Constraints",
"sec_num": "3.2"
},
{
"text": "It is easy to see that, under the homomorphism that removes subscripts on the nonterminals T1 and S1, each tree generable in G1 is analyzable in G. Also, each tree analyzable in G has a homomorphic preimage in G1.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "Results on Local Constraints",
"sec_num": "3.2"
},
{
"text": "The proof of the theorem uses tree automata to check the local constraint predicates, since tree automata used as recognizers accept only tree sets whose yield languages are context-free.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "Results on Local Constraints",
"sec_num": "3.2"
},
{
"text": "We now give an informal introduction to the ideas of (bottom-up) tree automata. Tree automata process labeled trees, where there is a left-to-right ordering on the successors of a node in the tree. When all the successors of a node v have been assigned states, then a state is assigned to v by a rule that depends on the label of v and the states of the successors of v, considering their left-to-right ordering. Note that the automaton may immediately assign states to the nodes on the frontier of the tree since these nodes have no successors. If the set of states is partitioned into final and non-final states, then a tree is accepted by the automaton if the state assigned to the root is a final state. A set of trees accepted by a tree automaton is called a recognizable set. Note that the automaton may operate non-deterministically, in which case, as usual, a tree is accepted if there is some set of state assignments leading to its acceptance.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "Results on Local Constraints",
"sec_num": "3.2"
},
{
"text": "The importance of tree automata is that they are related to the sets of derivation trees of context-free grammars. Specifically, if T is the set of derivation trees of a context-free grammar, G, then there is a tree automaton that recognizes T. Conversely, if T is the set of trees recognized by a tree automaton, A, then T may be systematically relabeled as the set of derivation trees of a context-free grammar.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "Results on Local Constraints",
"sec_num": "3.2"
},
{
"text": "The basic idea presented in detail in Joshi and Levy 1977 is that, because tree automata have nice closure properties (closure under union, intersection, and concatenation), they can do the computations required to check the local constraints.",
"cite_spans": [
{
"start": 38,
"end": 57,
"text": "Joshi and Levy 1977",
"ref_id": "BIBREF6"
}
],
"ref_spans": [],
"eq_spans": [],
"section": "Results on Local Constraints",
"sec_num": "3.2"
},
{
"text": "Another way of looking at the checking of a labeled tree by a tree automaton is as follows. We imagine a finite state machine sitting at each node of a tree. The role of the finite state machine is to check that a correct rule application is made at the node it is checking.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "Results on Local Constraints",
"sec_num": "3.2"
},
{
"text": "Initially, the nodes on the frontier are turned on and signal their parent nodes. At any other node in the tree, the machine at that node is turned on as soon as all its direct descendants are active. Assuming that at each node the machine for that node has checked that the rule applied there was one of the rules of the context-free grammar we are looking for, then when the root node of the tree signals that it has correctly checked the root we know that the tree is a proper tree for the given context-free grammar.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "Results on Local Constraints",
"sec_num": "3.2"
},
{
"text": "When checking for local constraints, a machine at a given node not only passes information to its parent, as described above, but also passes information about those parts of the local constraints, corresponding to the given node as well as all its descendants, that have not yet been satisfied. The point is that this information is always bounded and hence a finite number of states is adequate to code this information.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "Results on Local Constraints",
"sec_num": "3.2"
},
{
"text": "The fact that the closure properties hold can be seen as follows. Consider a slightly more general situation. We consider an A machine and a B machine at each node. Depending on the connections between these A and B machines, we obtain additional results. For example, as each A machine passes information to its parent, it may also pass information to the B machine, but the B machine will not pass information back to the A machine. The tree is accepted if the B machine at the root node of the tree ends up in a final state. Although this seems to be a more complicated model, it can in fact be subsumed in our first model and is the basis of an informal proof that the recognizable sets are closed under intersection, since the A machine and the B machine can check different rules.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "Results on Local Constraints",
"sec_num": "3.2"
},
{
"text": "An important point is that the local constraint on a rule applied at a given node may only be verified by the checking automata at some distant ancestor of that node. In particular, in the case of a proper analysis constraint, it can only be verified at a node sufficiently high in the tree to dominate the entire string specified in the constraint.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "Results on Local Constraints",
"sec_num": "3.2"
},
{
"text": "The perceptive reader may now be wondering what replaces all these hypothetical finite state machines when the set of trees corresponds to a context-free grammar. Well, if we were to convert our local constraints grammar into a standard form context-free grammar, we would require a larger nonterminal set. In effect this larger nonterminal set is an encoding of the finite state information stored at the nodes.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "Results on Local Constraints",
"sec_num": "3.2"
},
{
"text": "The intuitive explanation presented in this section is, in fact, a complete characterization of recognizability. Given a context-free grammar, one can specify the finite state machine to be posted at each node of a tree to check the tree. And conversely, given the finite state machine description, one can derive the equivalent context-free grammar.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "Results on Local Constraints",
"sec_num": "3.2"
},
{
"text": "The essence of the local constraints formulation is to paraphrase the finite state checking at the nodes of the tree in terms of patterns or predicates.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "Results on Local Constraints",
"sec_num": "3.2"
},
{
"text": "The result of Theorem 3.1 can be generalized in various ways. Generalizations in (i) and (ii) below are immediate. ",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "Some Generalizations",
"sec_num": "4."
},
{
"text": "Another useful generalization has the following essential character. (iii) Predicates that relate nodes mentioned in the proper analysis predicates and domination predicates (associated with a rule), as well as nodes in finite tree fragments dominated by these nodes, can be included in the constraint. Unfortunately, at this time we are unable to give a precise characterization of this generalization. The following two predicates are special cases of this generalization, and Theorem 3.1 holds for these two cases. Let us consider (6) in Section 2 (topicalization).",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "I to",
"sec_num": null
},
{
"text": "Consider the following fragment of a local constraints grammar, G. (Only some of the relevant rules are shown.)",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "I to",
"sec_num": null
},
{
"text": "S → NP VP; VP → V NP PP; S' → PP S; V → give / __ NP PP; PP → e / PP1 X V2 NP3 __ ∧ DOM(S'4 S5 Y VP6) ∧ COM(PP1 S'4 VP6) ∧ LMS(PP1 S5)",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "I to",
"sec_num": null
},
{
"text": "The last rule has a proper analysis predicate, a domination predicate, and COMMAND and LEFTMOST-SISTER predicates whose arguments satisfy the requirements mentioned in (iii) above (i.e., they relate nodes mentioned in proper analysis predicates and domination predicates). The indexing makes this clear. Structure (7) will be well-formed.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "I to",
"sec_num": null
},
{
"text": "Compare (7) with (6) in Section 2 (Gazdar's framework) and with (8) in Section 5 (Peters's and Karttunen's framework).",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "I to",
"sec_num": null
},
{
"text": "Since a local constraint rule has a context-free part and a contextual constraint part, it is possible to define context-sensitive compositional semantics in the following manner. Given a rule A → BC / P, where A → BC is the context-free part and P is the contextual constraint, we can have o(A) as a composition of o(B) and o(C), which depends on P. This idea has been pursued in the context of programming languages (Joshi, Levy, and Yueh 1980). Whether such an approach would be useful for natural language is an open question.",
"cite_spans": [
{
"start": 392,
"end": 420,
"text": "(Joshi, Levy, and Yueh 1980)",
"ref_id": "BIBREF7"
}
],
"ref_spans": [],
"eq_spans": [],
"section": "Local Constraints in Semantics",
"sec_num": "4.1"
},
{
"text": "(An additional comment appears in Section 5.)",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "Local Constraints in Semantics",
"sec_num": "4.1"
},
{
"text": "Peters 1980 and Karttunen 1980 have proposed a device for linking nodes to handle unbounded dependencies. Thus, for example, instead of (6) or (7), we have (8). While carrying out bottom-up semantic translation, the moved constituent is \"visible\" at the VP node. In our approach, this \"visibility\" can be obtained if the translation is made to depend on the contextual constraint which, of course, has already been checked prior to the translation. This is the essence of our suggestion in Section 4.1.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "Linked Nodes (Peters's and Karttunen's framework)",
"sec_num": "5."
},
{
"text": "Karttunen 1980 has constructed a parser incorporating the device of linked nodes. Karttunen also discusses the problem of complex patterns of moved constituents and their associated gaps or resumptive pronouns. This is not easy to handle in Gazdar's framework without multiplying the categories even further, e.g., by providing categories such as S/NP NP, etc. 5 Karttunen handles this problem by essentially incorporating the checking of the patterns of gaps and fillers in the parser, i.e., in the control structure of the parser.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "S",
"sec_num": null
},
{
"text": "Our approach can be regarded as somewhat intermediate between Gazdar's and that of Peters and Karttunen in the following sense. We avoid multiplication of categories as do Peters and Karttunen. On the other hand, the relationship between the moved constituent and the gap is expressed in the grammar itself (more in the spirit of Gazdar) instead of in the parser (more precisely, in the data structure created by the parser) as in the Peters and Karttunen approach.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "S",
"sec_num": null
},
{
"text": "We have not pursued the topic of multiple gaps and fillers in our framework but, obviously, in it we would opt for Karttunen's suggestion of checking the constraints on the patterns of gaps and fillers in the parser itself. It could not be done by local constraints alone because local constraints essentially do the work of the links in the Peters and Karttunen framework.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "S",
"sec_num": null
},
{
"text": "In Section 4, we showed how local constraints allowed us to prevent proliferation of categories. We can dispense with the local constraints and construct an equivalent context-free grammar that would have potentially a very large number of categories. While pursuing the relation between 'structure' and the size of the nonterminal vocabulary (i.e., the syntactic categories), we were led to the following surprising result: the actual labels, in a sense, carry no information. (This result was also used by us in developing some heuristics for converting a context-free grammar into a more compact but equivalent local constraints grammar. We will not describe this use of our result in the present paper.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "Skeletal Structural Descriptions (Skeletons)",
"sec_num": "6."
},
{
"text": "(For further information, see Joshi, Levy, and Yueh 1980.) First we need some definitions. A phrase structure tree without labels will be called a skeletal structural 5 S/NP NP means an S tree with two NP type holes. description or a skeleton. A skeleton exhibits all of the grouping structure without naming the syntactic categories. For example, (9) is a skeleton. The structural description is characterized only by the shape of the tree and not the associated labels. The only symbols appearing in the structure are the terminal symbols (more precisely, the preterminal symbols and the terminal symbols, in the linguistic context, as in (10); however, for the rest of the discussion, we will take skeletons to mean trees with terminal symbols only).",
"cite_spans": [
{
"start": 30,
"end": 58,
"text": "Joshi, Levy, and Yueh 1980.)",
"ref_id": "BIBREF7"
}
],
"ref_spans": [],
"eq_spans": [],
"section": "Skeletal Structural Descriptions (Skeletons)",
"sec_num": "6."
},
{
"text": "Let G be a context-free grammar and let T G be the set of sentential derivation trees (structural descriptions) of G. Let S G be the skeletons of G, i.e., all trees in T G with the labels removed.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "Skeletal Structural Descriptions (Skeletons)",
"sec_num": "6."
},
{
"text": "It is possible to show that for every context-free grammar G we can construct a skeletal generating system (consisting of skeletons and skeletal rewriting rules) that generates exactly SG; i.e., all category labels can be eliminated while retaining the structural information (Levy and Joshi 1979) . Since skeletons pay attention to grouping only, this result may be psycholinguistically important because our first intuition about the structure of a sentence is more likely to be in terms of the grouping structure and not in terms of the corresponding syntactic categories, especially those beyond the preterminal categories.",
"cite_spans": [
{
"start": 276,
"end": 297,
"text": "(Levy and Joshi 1979)",
"ref_id": null
}
],
"ref_spans": [],
"eq_spans": [],
"section": "Skeletal Structural Descriptions (Skeletons)",
"sec_num": "6."
},
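The label-erasure idea above can be made concrete with a small sketch (Python; the toy grammar, the nested-tuple tree representation, and the function name are our own illustration, not from the paper): erasing every nonterminal label from a derivation tree leaves exactly the grouping structure plus the terminal leaves.

```python
# Sketch of label erasure on a CFG derivation tree. An internal node is
# (label, children); a leaf is a terminal string. The skeleton keeps the
# tree's shape and its leaves, and drops every nonterminal label.

def skeleton(tree):
    """Return the skeleton of a derivation tree: a node (label, children)
    becomes the tuple of its children's skeletons; a terminal leaf stays."""
    if isinstance(tree, str):          # terminal symbol at a leaf
        return tree
    label, children = tree             # label is discarded
    return tuple(skeleton(c) for c in children)

# Derivation tree for "the cat sleeps" under a toy grammar:
# S -> NP VP, NP -> Det N, VP -> V
t = ("S", [("NP", [("Det", ["the"]), ("N", ["cat"])]),
           ("VP", [("V", ["sleeps"])])])

print(skeleton(t))   # ((('the',), ('cat',)), (('sleeps',),))
```

The nesting of tuples in the output mirrors the bracketing of the original tree, which is the sense in which the skeleton "retains all the structural information".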
{
"text": "The theory of skeletons may also provide some insight into the problem of grammatical inference. For a finite state string automaton, it is well know that if the number of states is 2k then, if we are presented with all acceptable stings of length <2k, the finite state automaton is completely determined. We have a similar situation with the skeletons. First, it can be shown that for each skeletal set S G (i.e., the set of skeletons of a context-free grammar) we can construct a bottom-up tree automaton that recognizes precisely S G (Levy and Joshi 1978) . Further, if the number of states of this automaton is k, then the set of all acceptable sets of skeletons of depth _<2k completely determines 5 G (Levy and Joshi 1979) . Using skeletons (i.e., string with their grouping structure) rather than just strings as input to a grammatical inference machine is an idea worth pursuing further.",
"cite_spans": [
{
"start": 537,
"end": 558,
"text": "(Levy and Joshi 1978)",
"ref_id": "BIBREF10"
},
{
"start": 707,
"end": 728,
"text": "(Levy and Joshi 1979)",
"ref_id": null
}
],
"ref_spans": [],
"eq_spans": [],
"section": "Bitt",
"sec_num": null
},
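The bottom-up tree automaton mentioned above can be illustrated minimally (the states, transitions, and accepted skeletal set here are invented for the example; only the general mechanism follows Levy and Joshi's construction): states are assigned to leaves by their terminal symbol and to internal nodes by a transition on the tuple of child states, and a skeleton is accepted when the root reaches a final state.

```python
# Minimal deterministic bottom-up tree automaton over skeletons
# (unlabeled trees whose leaves carry terminal symbols).

LEAF_STATE = {"a": "qa", "b": "qb"}     # state assigned at each leaf
TRANS = {("qa", "qb"): "qS"}            # transition on tuple of child states
FINAL = {"qS"}                          # accepting root states

def run(skel):
    """Return the state reached at the root of skel, or None on rejection."""
    if isinstance(skel, str):
        return LEAF_STATE.get(skel)
    child_states = tuple(run(c) for c in skel)
    if None in child_states:
        return None
    return TRANS.get(child_states)      # undefined tuples reject

def accepts(skel):
    return run(skel) in FINAL

print(accepts(("a", "b")))   # True: the two-leaf skeleton over a, b
print(accepts(("b", "a")))   # False: no transition defined for (qb, qa)
```

Because the automaton works strictly bottom-up, it inspects exactly the information a skeleton provides: the grouping structure and the leaf symbols, never a category label.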
{
"text": "We have presented several results concerning phrase structure trees that show that phrase structure trees when viewed in certain ways have much more descriptive power than one would have thought. We have given a brief account of our work on local constraints and presented an intuitive proof. We have also compared it to some aspects of the framework of Gazdar and that of Peters and Karttunen. We have also shown that phrase structure trees, even when deprived of the labels, retain in a certain sense all the structural information. This result has implications for grammatical inference procedures.",
"cite_spans": [
{
"start": 373,
"end": 394,
"text": "Peters and Karttunen.",
"ref_id": null
}
],
"ref_spans": [],
"eq_spans": [],
"section": "Conclusion:",
"sec_num": "6."
},
{
"text": "American Journal of Computational Linguistics, Volume 8, Number 1, January-March 1982",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "",
"sec_num": null
}
],
"back_matter": [
{
"text": "(1)A skeletal generating system can be constructed as follows. We have a finite set of initial skeletons and a finite set of skeletal rewriting rules whose left-hand and right-hand sides are skeletons. /\\ a bIn this system, generation proceeds from an initial skeleton through a sequence of intermediate skeletons to the desired skeleton. Clearly, because of the definition of a skeleton and the nature of the skeletal rewriting rules, the rules must always apply to one of the lowermost configurations in a skeleton that matches with the left-hand side of a rule. Thus the derivation of the skeleton (3) in S G would be as in (11). The configurations encircled by a dotted line are the ones to which the skeletal rule is applied.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "annex",
"sec_num": null
},
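The lowermost-application discipline described above can be sketched as follows (skeletons encoded as nested tuples with string leaves; the particular rule is hypothetical, not the paper's example): children are rewritten before the node itself is compared with the left-hand side, so a rule always fires at a lowermost matching configuration.

```python
# One derivation step in a skeletal generating system: replace the first
# lowermost occurrence of the rule's left-hand skeleton (lhs) by its
# right-hand skeleton (rhs).

def rewrite_lowermost(skel, lhs, rhs):
    """Return (new_skeleton, fired): fired is True iff a rewrite happened.
    Descendants are tried before the node itself, enforcing lowermost
    application."""
    if not isinstance(skel, str):
        new_children, fired = [], False
        for c in skel:
            if not fired:
                c, fired = rewrite_lowermost(c, lhs, rhs)
            new_children.append(c)
        if fired:
            return tuple(new_children), True
        skel = tuple(new_children)
    if skel == lhs:
        return rhs, True
    return skel, False

# Rule: the two-leaf configuration over (a, b) rewrites to the leaf e.
s = (("a", "b"), "c")
print(rewrite_lowermost(s, ("a", "b"), "e")[0])   # ('e', 'c')
```

Iterating such steps from an initial skeleton produces exactly the skeleton-to-skeleton derivations the annex describes, with no labels anywhere in the process.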
{
"text": "(3) = = \u2022In the above example, there was only one nonterminal; hence the result is obvious.Following is a somewhat more complicated example.G is a local constraints grammar. Clearly there is a context-free grammar G' that is equivalent to G. Rather than taking a complicated context-free grammar and then exhibiting the equivalent skeletal grammar, we will take the local constraints grammar G and exhibit a skeletal grammar equivalent to G. This will allow us to present a complicated example without making the resulting skeletal grammar too unwieldy. Also, this example will give some idea about the relationship between local constraints grammars and skeletal grammars; in particular, the skeletal rewriting rules indirectly encode the local constraints in the rules in Example 6.2.We have eliminated all labels by introducing structural rewriting rules and defining the derivation as proceeding from skeleton to skeleton rather than from string to string. This result clearly brings out the relationship between the grouping structure and the syntactic categories labeling the nodes. Skeletal grammar equivalent to G: ",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "/IX",
"sec_num": null
}
],
"bib_entries": {
"BIBREF0": {
"ref_id": "b0",
"title": "Linguistic Theory and Psychological Reality",
"authors": [
{
"first": "J",
"middle": [
"W"
],
"last": "Bresnan",
"suffix": ""
},
{
"first": "M",
"middle": [],
"last": "Halle",
"suffix": ""
},
{
"first": "J",
"middle": [],
"last": "Bresnan",
"suffix": ""
},
{
"first": "G",
"middle": [
"A"
],
"last": "Miller",
"suffix": ""
}
],
"year": 1978,
"venue": "",
"volume": "",
"issue": "",
"pages": "",
"other_ids": {},
"num": null,
"urls": [],
"raw_text": "Bresnan, J.W. 1978 A Realistic transformational grammar. In Halle, M., Bresnan, J. and Miller, G.A., Ed., Linguistic Theory and Psychological Reality. The MIT Press, Cambridge, Mass.",
"links": null
},
"BIBREF1": {
"ref_id": "b1",
"title": "Aspects of the Theory of Syntax",
"authors": [
{
"first": "N",
"middle": [],
"last": "Chomsky",
"suffix": ""
}
],
"year": 1965,
"venue": "",
"volume": "",
"issue": "",
"pages": "",
"other_ids": {},
"num": null,
"urls": [],
"raw_text": "Chomsky, N. 1965 Aspects of the Theory of Syntax, The MIT Press, Cambridge, Mass.",
"links": null
},
"BIBREF2": {
"ref_id": "b2",
"title": "English as a context-free language, unpublished manuscript",
"authors": [
{
"first": "G",
"middle": [
"J M"
],
"last": "Gazdar",
"suffix": ""
}
],
"year": 1978,
"venue": "",
"volume": "",
"issue": "",
"pages": "",
"other_ids": {},
"num": null,
"urls": [],
"raw_text": "Gazdar, G.J.M. 1978 English as a context-free language, unpubl- ished manuscript.",
"links": null
},
"BIBREF3": {
"ref_id": "b3",
"title": "Unbounded dependencies and coordinate structure",
"authors": [
{
"first": "G",
"middle": [
"J M"
],
"last": "Gazdar",
"suffix": ""
}
],
"year": 1981,
"venue": "Linguistic Inquiry",
"volume": "12",
"issue": "",
"pages": "",
"other_ids": {},
"num": null,
"urls": [],
"raw_text": "Gazdar, G.J.M. 1981 Unbounded dependencies and coordinate structure. Linguistic Inquiry 12, 2.",
"links": null
},
"BIBREF4": {
"ref_id": "b4",
"title": "The Nature of Syntactic Representation",
"authors": [
{
"first": "P",
"middle": [
"G J M"
],
"last": "Gazdar",
"suffix": ""
},
{
"first": "G",
"middle": [
"K"
],
"last": "Pullam",
"suffix": ""
},
{
"first": "",
"middle": [],
"last": "Ed",
"suffix": ""
}
],
"year": 1982,
"venue": "Jaeobson",
"volume": "",
"issue": "",
"pages": "",
"other_ids": {},
"num": null,
"urls": [],
"raw_text": "Gazdar. G.J.M. 1982 Phrase Structure Grammar. To appear in Jaeobson, P. and Pullam, G.K., Ed., The Nature of Syntactic Representation. Reidel, Boston, Mass.",
"links": null
},
"BIBREF5": {
"ref_id": "b5",
"title": "String analysing of language structure, manuscript. Also in manuscripts around 1958 and published in 1962 by Mouton and Co",
"authors": [
{
"first": "Z",
"middle": [
"S"
],
"last": "Harris",
"suffix": ""
}
],
"year": 1956,
"venue": "",
"volume": "",
"issue": "",
"pages": "",
"other_ids": {},
"num": null,
"urls": [],
"raw_text": "Harris, Z.S. 1956 String analysing of language structure, manu- script. Also in manuscripts around 1958 and published in 1962 by Mouton and Co., The Hague.",
"links": null
},
"BIBREF6": {
"ref_id": "b6",
"title": "Constraints on structural descriptions",
"authors": [
{
"first": "A",
"middle": [
"K"
],
"last": "Joshi",
"suffix": ""
},
{
"first": "L",
"middle": [
"S"
],
"last": "Levy",
"suffix": ""
}
],
"year": 1977,
"venue": "SlAM Journal of Computing",
"volume": "",
"issue": "",
"pages": "",
"other_ids": {},
"num": null,
"urls": [],
"raw_text": "Joshi, A.K. and Levy, L.S. 1977 Constraints on structural descrip- tions. SlAM Journal of Computing.",
"links": null
},
"BIBREF7": {
"ref_id": "b7",
"title": "Local constraints in the syntax and semantics of programming language",
"authors": [
{
"first": "A",
"middle": [
"K"
],
"last": "Joshi",
"suffix": ""
},
{
"first": "L",
"middle": [
"S"
],
"last": "Levy",
"suffix": ""
},
{
"first": "Yueh",
"middle": [],
"last": "",
"suffix": ""
},
{
"first": "K",
"middle": [],
"last": "",
"suffix": ""
}
],
"year": 1980,
"venue": "Journal of Theoretical Computer Science",
"volume": "",
"issue": "",
"pages": "",
"other_ids": {},
"num": null,
"urls": [],
"raw_text": "Joshi, A.K., Levy, L.S., and Yueh, K. 1980 Local constraints in the syntax and semantics of programming language. Journal of Theoretical Computer Science.",
"links": null
},
"BIBREF8": {
"ref_id": "b8",
"title": "The Mental Representation of Grammatical Relations",
"authors": [
{
"first": "R",
"middle": [],
"last": "Kaplan",
"suffix": ""
},
{
"first": "J",
"middle": [
"W"
],
"last": "Bresnan",
"suffix": ""
}
],
"year": 1979,
"venue": "",
"volume": "",
"issue": "",
"pages": "",
"other_ids": {},
"num": null,
"urls": [],
"raw_text": "Kaplan, R. and Bresnan, J.W. 1979 A formal system for grammat- ical representation. To appear in Bresnan, J.W., Ed., The Men- tal Representation of Grammatical Relations. The MIT Press, Mass.",
"links": null
},
"BIBREF9": {
"ref_id": "b9",
"title": "Unbounded dependencies in phrase structure grammar: slash categories versus dotted lines. Paper presented at the Third Amsterdam Colloquium: Formal Methods in the Study of Language",
"authors": [
{
"first": "L",
"middle": [],
"last": "Karttunen",
"suffix": ""
}
],
"year": 1980,
"venue": "",
"volume": "",
"issue": "",
"pages": "",
"other_ids": {},
"num": null,
"urls": [],
"raw_text": "Karttunen, L. 1980 Unbounded dependencies in phrase structure grammar: slash categories versus dotted lines. Paper presented at the Third Amsterdam Colloquium: Formal Methods in the Study of Language, Amsterdam.",
"links": null
},
"BIBREF10": {
"ref_id": "b10",
"title": "Skeletal structural descriptions. Information and Control",
"authors": [
{
"first": "L",
"middle": [
"S"
],
"last": "Levy",
"suffix": ""
},
{
"first": "A",
"middle": [
"K"
],
"last": "Joshi",
"suffix": ""
}
],
"year": 1978,
"venue": "",
"volume": "",
"issue": "",
"pages": "",
"other_ids": {},
"num": null,
"urls": [],
"raw_text": "Levy, L.S. and Joshi, A.K. 1978 Skeletal structural descriptions. Information and Control",
"links": null
},
"BIBREF11": {
"ref_id": "b11",
"title": "Concerning the base component of a transformational grammar",
"authors": [
{
"first": "J",
"middle": [
"D"
],
"last": "Mccawley",
"suffix": ""
}
],
"year": 1967,
"venue": "Foundations of Language",
"volume": "4",
"issue": "",
"pages": "",
"other_ids": {},
"num": null,
"urls": [],
"raw_text": "McCawley, J.D. 1967 Concerning the base component of a trans- formational grammar. Foundations of Language, Vol. 4. Reprint- ed in McCawley, J.D., Grammar and Meaning, Academic Press, New York, NY, 1968.",
"links": null
},
"BIBREF12": {
"ref_id": "b12",
"title": "Talk presented at the Workshop on Alternatives to Transformation Grammars",
"authors": [
{
"first": "S",
"middle": [],
"last": "Peters",
"suffix": ""
}
],
"year": 1980,
"venue": "",
"volume": "",
"issue": "",
"pages": "",
"other_ids": {},
"num": null,
"urls": [],
"raw_text": "Peters, S. 1980 (Talk presented at the Workshop on Alternatives to Transformation Grammars, Stanford University, January.)",
"links": null
},
"BIBREF13": {
"ref_id": "b13",
"title": "Context-sensitive immediate constituent analysis",
"authors": [
{
"first": "S",
"middle": [],
"last": "Peters",
"suffix": ""
},
{
"first": "R",
"middle": [
"W"
],
"last": "Ritchie",
"suffix": ""
}
],
"year": 1969,
"venue": "Proc. ACM Symposium on Theory of Computing",
"volume": "",
"issue": "",
"pages": "",
"other_ids": {},
"num": null,
"urls": [],
"raw_text": "Peters, S. and Ritchie, R.W. 1969 Context-sensitive immediate constituent analysis. Proc. ACM Symposium on Theory of Computing.",
"links": null
},
"BIBREF14": {
"ref_id": "b14",
"title": "Syntactic analysis of natural languages",
"authors": [
{
"first": "N",
"middle": [],
"last": "Sager",
"suffix": ""
}
],
"year": 1967,
"venue": "Advances in Computers",
"volume": "8",
"issue": "",
"pages": "",
"other_ids": {},
"num": null,
"urls": [],
"raw_text": "Sager, N. 1967 Syntactic analysis of natural languages. Advances in Computers, Vol. 8 ed. M. Alt and M. Rubinoff, Academic Press, New York, NY.",
"links": null
}
},
"ref_entries": {
"FIGREF0": {
"text": "N be the set of basic nonterminal symbols. Then we define a set D(V N) of derived nonterminal symbols as follows. D(VN) = {a/~ I ct, fl e V N }",
"type_str": "figure",
"num": null,
"uris": null
},
"FIGREF3": {
"text": "pl,P2e V*).The contextual condition associated with such a \"vertical\" proper analysi~ is called a domination predicate.The general form of a local constraint combines the proper analysis and domination predicates as follows: Definition 3.2. A local constraint rule is a rule of the form A -* u/C A where C A is a Boolean combination of proper analysis and domination predicates.",
"type_str": "figure",
"num": null,
"uris": null
},
"FIGREF4": {
"text": "Example 3.2 Let V = {S,T,a,b,c,e} and Y. = {a,b,c,e},and G be a finite set of local constraint rules:",
"type_str": "figure",
"num": null,
"uris": null
},
"FIGREF5": {
"text": "Variables can be included in the constraint.Thus, for example, a local constraint rule can be of the formA --,,. w I BCDXE FYGwhere A,B,C,D.E,F,G are nonterminals, w is a string of terminals and/or nonterminals, and X and Y are variables that range over arbitrary strings of terminals and nonterminals.(ii) Finite tree fragments can be included in the constraint. Thus, for example, a local constraint rule can be of the form",
"type_str": "figure",
"num": null,
"uris": null
},
"FIGREF8": {
"text": "For a context-free rule of the form A--,-BC if o(A), o(B), o(C) are the 'semantic' translations associated with A, B, and C, respectively, then o(A) is a composition of o(B) and o(C). For a local constraint rule of the form A-,-BC I P",
"type_str": "figure",
"num": null,
"uris": null
},
"FIGREF9": {
"text": "that loops from the VP node back to the moved constituent is a way of indicating the location of the gap in the object position under the VP. The link also indicates that there is certain dependency between the gap and the dislocated element. Both in our approach and that of Peters and Karttunen, proliferation of categories as in Gazdar's approach is avoided. Further, for Peters and Karttunen, while carrying4 We give a very informal description of a linked tree.A precise definition can be found in S.Peters and R.W. Ritchie, Phrase Linking Grammars, Technical Report, Department of Linguistics, University of Texas at Austin, 1982.",
"type_str": "figure",
"num": null,
"uris": null
}
}
}
}