{
"paper_id": "J92-3013",
"header": {
"generated_with": "S2ORC 1.0.0",
"date_generated": "2023-01-19T02:45:00.246481Z"
},
"title": "Vision, Instruction, and Action",
"authors": [
{
"first": "David",
"middle": [],
"last": "Chapman",
"suffix": "",
"affiliation": {},
"email": ""
}
],
"year": "",
"venue": null,
"identifiers": {},
"abstract": "",
"pdf_parse": {
"paper_id": "J92-3013",
"_pdf_hash": "",
"abstract": [],
"body_text": [
{
"text": "This book is about a comprehensive theory of activity and a computer program (Sonja) that uses the theory to play a video game (Amazon) from the same perspective as a human player. Chapman's main point is an argument about the kinds of architectures and internal representations that are plausible for modeling human activity. Starting with the essential connectionism of the human brain, he argues against \"mentalese\" and in favor of deictic representations such as THE-CAR-WHICH-IS-ABOUT-TO-HIT-ME. He argues that representations that are to be useful for the routines that make up most of human activity must be grounded in perception and be centered on the representing agent's perspective of the world. Most of the book is about how one can implement Chapman's theory of activity by reference to a biologically based model of visual perception.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "",
"sec_num": null
},
{
"text": "Chapman incorporates instruction by having Sonja accept very simple, canned suggestions from a human \"kibitzer.\" Since Sonja interprets them in terms of what it sees in the world and what routines it thinks it should perform next, it automatically uses them in the right way: for instance, it turns left when it reaches the next left turn rather than at the exact moment of the instruction. However, Sonja relies on its kibitzers having a model of the game very similar to its own in order to decode references correctly; for instance, a/any/the/that amulet is always the one that Sonja reckons is most relevant to the current situation. This entails that kibitzers can only serve to reorient Sonja's attention rather than suggesting novel things to try. Chapman considers this a benefit because most human activity is routine, but I think that most instruction is exactly about learning new ways of looking at a domain. I imagine that even in video games, kibitzers are most useful for novice players who want to learn the right tricks. It is not at all clear that Chapman's ideas could be straightforwardly adapted for more complex instruction-giving. Beyond the central message about representations, there is little new work that researchers in instruction will wish to take away from the book.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "",
"sec_num": null
},
{
"text": "As for the book's cover, it would be more at home on the top shelf of a newsstand than among scholarly works. Chapman may only be bothered \"on political grounds\" by violence (p. 62) and not by sexism (but do we really believe his claim that it would have been very difficult to find a suitable nonviolent domain?), but someone should tell the publishers that dressing one's theory of activity in a chain-mail bikini won't necessarily help it sell. --Jean Carletta, University of Edinburgh\n\nWilliam Rounds's paper, \"Computational complexity and natural language processing\" (pp. 9-29), introduces complexity theory to linguists. It predates Barton, Berwick, and Ristad 1987 and covers much of the same ground. On Rounds's view, two of the most important issues are learnability of the language and conciseness of the grammatical description.",
"cite_spans": [
{
"start": 637,
"end": 669,
"text": "Barton, Berwick, and Ristad 1987",
"ref_id": null
}
],
"ref_spans": [],
"eq_spans": [],
"section": "",
"sec_num": null
},
{
"text": "\"The convergence of mildly context-sensitive grammar formalisms\" (pp. 31-81), by Aravind K. Joshi, K. Vijay-Shanker, and David Weir, compares several newer theories of grammar, all of which use phrase structure rules slightly augmented by other mechanisms (making them context-sensitive, but only \"mildly\" so). The authors focus on Tree Adjoining Grammar (TAG) and compare it to Generalized Phrase Structure Grammar (GPSG), Head Grammar, and other formalisms.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "",
"sec_num": null
},
{
"text": "Janet Dean Fodor's \"Sentence processing and the mental grammar\" (pp. 83-113) introduces psycholinguistics from a computational viewpoint, with emphasis on methodology. As examples she discusses, at length, the experimental evidence on two issues: whether there is a syntactic level other than surface structure (but distinct from semantics), and whether trace binding is separate from phrase-structure parsing (as implicitly claimed by GB theory but not GPSG). She points out the notorious \"problem of null results\":",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "",
"sec_num": null
},
{
"text": "If the sentence-matching studies fail to reveal mental representations of S-structure, that might be because there are none, or it might be because the sentence-matching task is not sensitive to them. (p. 91)",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "",
"sec_num": null
},
{
"text": "In a footnote Fodor apologizes for the delayed publication and cites important newer literature.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "",
"sec_num": null
},
{
"text": "Finally, Robert C. Berwick advocates an approach to parsing in which the grammar rules are not represented directly, but inferred from more abstract principles. In practice this means GB theory, and its fortunate effect has been to create versions of GB in which handwaving is impossible. Berwick's paper is lucidly written, presupposes no knowledge of GB, and summarizes much of the work of his colleagues, including Kashket's free-word-order parser for Warlpiri and Berwick and Fong's principle-ordering heuristics. --Michael A. Covington, University of Georgia.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "",
"sec_num": null
}
],
"back_matter": [],
"bib_entries": {
"BIBREF1": {
"ref_id": "b1",
"title": "Computational Complexity and Natural Language",
"authors": [
{
"first": "Robert",
"middle": [
"C"
],
"last": "Berwick",
"suffix": ""
},
{
"first": "Eric",
"middle": [
"Sven"
],
"last": "Ristad",
"suffix": ""
}
],
"year": 1987,
"venue": "",
"volume": "",
"issue": "",
"pages": "",
"other_ids": {},
"num": null,
"urls": [],
"raw_text": "Berwick, Robert C.; and Ristad, Eric Sven (1987). Computational Com- plexity and Natural Language. The MIT Press.",
"links": null
}
},
"ref_entries": {
"FIGREF0": {
"num": null,
"type_str": "figure",
"uris": null,
"text": "available in convenient form four papers that have already been widely circulated among specialists. All four were presented at Santa Cruz in January 1987 and distributed, in draft, at the 1987 Linguistic Institute."
}
}
}
}