Semantic Analysis: Computational Semantics (Chapter 18)


Semantic Analysis
Computational Semantics, Chapter 18
Lecture #12, November 2012
(We will not do all of this.)

Semantic analysis is the process of taking some linguistic input and producing a meaning representation for it. There are many ways of doing this, ranging from completely ad hoc, domain-specific methods to more theoretically founded but not quite useful methods. Different methods make more or less (or no) use of syntax. We're going to start with the idea that syntax does matter: the compositional, rule-to-rule approach.

Compositional Analysis
Principle of Compositionality: the meaning of a whole is derived from the meanings of its parts. What parts? The constituents of the syntactic parse of the input.

Example
AyCaramba serves meat.
∃e Serving(e) ∧ Server(AyCaramba) ∧ Served(Meat)

Compositional Semantics
Note that in the previous example, part of the meaning derives from the people and activities the sentence is about (predicates and arguments, or nouns and verbs), and part from the way they are ordered and related grammatically: the syntax. Question: can we link up syntactic structures to a corresponding semantic representation, so as to produce the meaning of a sentence in the course of parsing it?
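To make the target representation concrete, here is a minimal Python sketch; the tuple encoding and the render helper are illustrative assumptions, not anything from the chapter:

    # Event-style logical form for "AyCaramba serves meat":
    # ∃e Serving(e) ∧ Server(AyCaramba) ∧ Served(Meat)
    lf = ('exists', 'e',
          ('and',
           ('Serving', 'e'),
           ('Server', 'AyCaramba'),
           ('Served', 'Meat')))

    def render(term):
        """Pretty-print a logical-form tuple as FOPC-ish text."""
        op = term[0]
        if op == 'exists':
            return f"∃{term[1]} {render(term[2])}"
        if op == 'and':
            return ' ∧ '.join(render(t) for t in term[1:])
        pred, *args = term
        return f"{pred}({', '.join(args)})"

    print(render(lf))  # ∃e Serving(e) ∧ Server(AyCaramba) ∧ Served(Meat)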

Specific vs. General-Purpose Rules
We don't want to have to specify, for every possible parse tree, what semantic representation it maps to. We want to identify general mappings from parse trees to semantic representations. Again (as with feature structures) we will augment the lexicon and the grammar. Rule-to-rule hypothesis: a mapping exists between the rules of the grammar and rules of semantic representation.

Semantic Attachments
Extend each grammar rule with instructions on how to map the components of the rule to a semantic representation (grammars are getting complex):
S → NP VP {VP.sem(NP.sem)}
Each semantic function is defined in terms of the semantic representation of choice. Problem: how do we define these functions, and how do we specify their composition, so that we always get the meaning representation we want from our grammar?

Augmented Rules
Let's look at this a little more abstractly. Consider the general case of a grammar rule:
A → α1 … αn  { f(α1.sem, …, αn.sem) }
This should be read as: the semantics we attach to A can be computed from some function f applied to the semantics of A's parts. As we'll see, the class of actions performed by f can be quite restricted.

Compositional Analysis: A Simple Example
AyCaramba serves meat.
Associating constants with constituents:
ProperNoun → AyCaramba {AyCaramba}
MassNoun → meat {Meat}
Defining functions to produce these from the input:
NP → ProperNoun {ProperNoun.sem}
NP → MassNoun {MassNoun.sem}
Assumption: the meaning representations of children are passed up to parents for non-branching constituents.
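The rule-to-rule idea can be made concrete with a table of rule attachments. This is a hedged sketch only; the dictionary layout and rule names are illustrative assumptions, with the verb semantics deferred to the next section:

    # Rule-to-rule: each grammar rule is paired with a function from the
    # semantics of its daughters to the semantics of its mother.
    RULES = {
        'S -> NP VP':       lambda np, vp: vp(np),  # {VP.sem(NP.sem)}
        'NP -> ProperNoun': lambda pn: pn,          # pass-up (non-branching)
        'NP -> MassNoun':   lambda mn: mn,          # pass-up (non-branching)
    }
    # Associating constants with lexical constituents.
    LEXICON = {'AyCaramba': 'AyCaramba', 'meat': 'Meat'}

    np_sem = RULES['NP -> ProperNoun'](LEXICON['AyCaramba'])
    print(np_sem)  # AyCaramba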

Verbs
Verbs are where the action is:
V → serves  {λ(x, y) ∃e Isa(e, Serving) ∧ Server(x) ∧ Served(y)}
Will every verb have its own distinct representation? Predicate(Agent, Patient). How do we combine these pieces?
VP → V NP {????}   Goal: λx ∃e Isa(e, Serving) ∧ Server(x) ∧ Served(Meat)
S → NP VP {????}   Goal: ∃e Isa(e, Serving) ∧ Server(AyCaramba) ∧ Served(Meat)

Lambda Notation
An extension to FOPC: λx P(x), i.e., λ plus one or more variables plus an FOPC expression in those variables. Lambda binding: apply a lambda expression to logical terms to bind the lambda expression's parameters to those terms (lambda reduction). It is a simple process: substitute the terms for the variables in the lambda expression:
λx P(x)(Car) ⇒ P(Car)

The VP and S semantics must tell us which variables are to be replaced by which arguments, and how this replacement is done. Lambda notation provides the requisite verb semantics: the formal parameter list makes the variables within the body of the logical expression available for binding to external arguments provided by, e.g., NPs; lambda reduction implements the replacement.

Semantic attachments for the grammar rules:
S → NP VP {VP.sem(NP.sem)}
VP → V NP {V.sem(NP.sem)}
V → serves {???}
{λ(x, y) ∃e Isa(e, Serving) ∧ Server(x) ∧ Served(y)} becomes the curried {λy λx ∃e Isa(e, Serving) ∧ Server(x) ∧ Served(y)}
Now y is available to be bound when V.sem is applied to NP.sem, and x is available to be bound when the S rule is applied.
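Because lambda reduction is just function application, Python closures can stand in for the lambda machinery. A minimal sketch, with string templates standing in for real FOPC terms (an assumption for readability):

    # V -> serves: the curried {λy λx ∃e Isa(e,Serving) ∧ Server(x) ∧ Served(y)}
    V_serves = lambda y: lambda x: (
        f"∃e Isa(e,Serving) ∧ Server({x}) ∧ Served({y})")

    VP_sem = V_serves('Meat')     # VP -> V NP {V.sem(NP.sem)}: binds y
    S_sem = VP_sem('AyCaramba')   # S -> NP VP {VP.sem(NP.sem)}: binds x
    print(S_sem)  # ∃e Isa(e,Serving) ∧ Server(AyCaramba) ∧ Served(Meat)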

Key Points
Each node in a tree corresponds to a rule in the grammar. Each grammar rule has a semantic rule associated with it that specifies how the semantics of the LHS of that rule can be computed from the semantics of its daughters.

Strong Compositionality
The semantics of the whole is derived solely from the semantics of the parts (i.e., we ignore what's going on in other parts of the tree).

Predicate-Argument Semantics
The functions/operations permitted in the semantic rules fall into two classes:
- Pass the semantics of a daughter up unchanged to the mother.
- Apply (as a function) the semantics of one of the daughters of a node to the semantics of the other daughters.

Mismatches
There are unfortunately some annoying mismatches between the syntax of FOPC and the syntax provided by our grammars, so we'll accept that we can't always directly create valid logical forms in a strictly compositional way. We'll get as close as we can and patch things up after the fact.

Quantified Phrases
Consider: A restaurant serves meat. Assume that "a restaurant" looks like ∃x Isa(x, Restaurant). If we do the normal lambda thing we get:
∃e Serving(e) ∧ Server(∃x Isa(x, Restaurant)) ∧ Served(Meat)
which is not well-formed FOPC.
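The quantified-phrase problem is easy to reproduce in the closure sketch above. Assuming the same string-template encoding, naive application embeds a quantifier inside a predicate:

    V_serves = lambda y: lambda x: (
        f"∃e Serving(e) ∧ Server({x}) ∧ Served({y})")
    a_restaurant = "∃x Isa(x,Restaurant)"

    print(V_serves('Meat')(a_restaurant))
    # ∃e Serving(e) ∧ Server(∃x Isa(x,Restaurant)) ∧ Served(Meat)  <- malformed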

Complex Terms
Allow the compositional system to pass around representations like the following as objects with parts:
Complex-Term → <Quantifier var body>
e.g., <∃ x Isa(x, Restaurant)>
Our restaurant example winds up looking like:
∃e Serving(e) ∧ Server(<∃ x Isa(x, Restaurant)>) ∧ Served(Meat)
A big improvement.

Conversion
Complex terms wind up embedded inside predicates, so pull them out and redistribute the parts in the right way:
P(<Quantifier, var, body>) turns into: Quantifier var body Connective P(var)
For example, Server(<∃ x Isa(x, Restaurant)>) becomes ∃x Isa(x, Restaurant) ∧ Server(x).

Quantifiers and Connectives
If the quantifier is an existential, the connective is ∧ (and). If the quantifier is a universal, the connective is → (implies).

Multiple Complex Terms
Note that the conversion technique pulls the quantifiers out to the front of the logical form. That leads to ambiguity if there is more than one complex term in a sentence.
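Here is a hedged sketch of the conversion step described above, with a complex term represented as a (quantifier, variable, body) tuple; the data layout is an illustrative assumption:

    def convert(pred, complex_term):
        """P(<Quantifier, var, body>) -> Quantifier var body Connective P(var)."""
        quant, var, body = complex_term
        connective = '∧' if quant == '∃' else '→'  # existential: and; universal: implies
        return f"{quant}{var} {body} {connective} {pred}({var})"

    print(convert('Server', ('∃', 'x', 'Isa(x,Restaurant)')))
    # ∃x Isa(x,Restaurant) ∧ Server(x)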

Quantifier Ambiguity
Consider:
- Every restaurant has a menu.
- Every restaurant has a beer.
- I took a picture of everyone in the room.

Quantifier Scope Ambiguity
∀x Restaurant(x) → ∃e, y Having(e) ∧ Haver(x) ∧ Had(y) ∧ Isa(y, Menu)
That could mean that every restaurant has a (possibly different) menu, or that there's some super-menu out there and all restaurants have that menu:
∃y Isa(y, Menu) ∧ ∀x Isa(x, Restaurant) → ∃e Having(e) ∧ Haver(x) ∧ Had(y)

Ambiguity
This turns out to be a lot like the prepositional-phrase attachment problem: the number of possible interpretations goes up exponentially with the number of complex terms in the sentence (a sketch at the end of this section enumerates the readings). The best we can do is come up with weak methods to prefer one interpretation over another.

Doing Compositional Semantics
To incorporate semantics into the grammar we must:
- Figure out the right representation for a single constituent based on the parts of that constituent (e.g., Adj).
- Figure out the right representation for a category of constituents based on the other grammar rules making use of that constituent (e.g., Nom → Adj Nom).
This gives us a set of function-like semantic attachments incorporated into our CFG, e.g.:
Nom → Adj Nom  {λx Nom.sem(x) ∧ Isa(x, Adj.sem)}

What do we do with them?
As we did with feature structures, either:
- Alter an Earley-style parser so that when constituents are completed (dot at the end of the rule), the attached semantic function is applied and a meaning representation is created and stored with the state; or
- Let the parser run to completion and then walk through the resulting tree, running the semantic attachments bottom-up.

Option 1 (Integrated Semantic Analysis)
S → NP VP {VP.sem(NP.sem)}
VP.sem has been stored in the state representing the VP; NP.sem has been stored with the state for the NP. When the rule is completed, retrieve VP.sem and NP.sem, apply VP.sem to NP.sem, and store the result in S.sem. As fragments of the input are parsed, semantic fragments are created; these can be used to block ambiguous representations.
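Returning to the scope ambiguity above: since each order of pulling the complex terms out front yields a different reading, the readings can be enumerated by permutation, as in this sketch (same illustrative encoding as before):

    from itertools import permutations

    terms = [('∀', 'x', 'Restaurant(x)'), ('∃', 'y', 'Isa(y,Menu)')]
    core = '∃e Having(e) ∧ Haver(x) ∧ Had(y)'
    for order in permutations(terms):
        prefix = ' '.join(
            f"{q}{v} {b} {'∧' if q == '∃' else '→'}" for q, v, b in order)
        print(prefix, core)
    # ∀x Restaurant(x) → ∃y Isa(y,Menu) ∧ ∃e Having(e) ∧ Haver(x) ∧ Had(y)
    # ∃y Isa(y,Menu) ∧ ∀x Restaurant(x) → ∃e Having(e) ∧ Haver(x) ∧ Had(y)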

Drawback
You also perform semantic analysis on orphaned constituents that play no role in the final parse. Hence the case for a pipelined approach: do semantics after the syntactic parse. But what makes this hard?

Harder
Let's look at some other examples. What role does Harry play in all this?
∃e, f, x Isa(e, Telling) ∧ Isa(f, Going) ∧ Teller(e, Speaker) ∧ Tellee(e, Harry) ∧ ToldThing(e, f) ∧ Goer(f, Harry) ∧ Destination(f, x)

Harder
The VP rule for "told" is VP → V NP VPto. So you do what? Apply the semantic function attached to the VPto to the semantics of the NP; this binds Harry as the goer of the going. Then apply the semantics of the V to the semantics of the NP, binding Harry as the tellee of the telling, and to the result of the first application, to get the right value of the told thing (sketched at the end of this section):
V.sem(NP.sem, VPto.sem(NP.sem))

Harder
That's a little messy, and it violates the notion that the grammar ought not to know much about what is going on in the semantics. Better might be:
V.sem(NP.sem, VPto.sem)?
VPto.sem(V.sem, NP.sem)?
I.e., either apply the semantics of the head verb to the semantics of its arguments, or complicate the semantics of the verb inside the VPto to figure out what's going on.

Two Philosophies
1. Let the syntax do what syntax does well, and don't expect it to know much about meaning. In this approach, the lexical entries' semantic attachments do the work.
2. Assume the syntax does know about meaning. Here the grammar gets complicated and the lexicon simpler.
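Here is a sketch of the first (messy) option for "told" in the same curried-closure style; the role names and argument structure are illustrative. Note how the object NP's semantics is consumed twice:

    # VPto: λg Isa(f,Going) ∧ Goer(f,g) ∧ Destination(f,x)
    VPto_go = lambda goer: f"Isa(f,Going) ∧ Goer(f,{goer}) ∧ Destination(f,x)"

    # V.sem(NP.sem, VPto.sem(NP.sem)): Harry is both tellee and goer.
    V_told = lambda tellee, told_thing: (
        f"∃e,f,x Isa(e,Telling) ∧ Teller(e,Speaker) ∧ Tellee(e,{tellee}) "
        f"∧ ToldThing(e,f) ∧ {told_thing}")

    print(V_told('Harry', VPto_go('Harry')))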

Example
Consider the attachments for these VPs (a code sketch appears at the end of this section):
VP → Verb NP NP (gave Mary a book)
VP → Verb NP PP (gave a book to Mary)
Assume the meaning representations should be the same for both. Under the lexicon-heavy scheme the attachments are:
V.sem(NP.sem, NP.sem)
V.sem(NP.sem, PP.sem)
Under the syntax-heavy scheme we might want to do something like:
VP → V NP NP  {V.sem ∧ Recip(NP1.sem) ∧ Object(NP2.sem)}
VP → V NP PP  {V.sem ∧ Recip(PP.sem) ∧ Object(NP1.sem)}
I.e., the verb only contributes the predicate; the grammar knows the roles.

Integration
Two basic approaches:
- Integrate semantic analysis into the parser (assign meaning representations as constituents are completed).
- Pipeline: assign meaning representations only to complete trees, after parsing finishes.

Example from BERP (the Berkeley Restaurant Project)
I want to eat someplace near campus.
Somebody tell me the two meanings.

Pros and Cons
If you integrate semantic analysis into the parser as it's running, you can use semantic constraints to cut off parses that make no sense, but you also assign meaning representations to constituents that don't take part in the correct (most probable) parse.

Non-Compositionality
Unfortunately, there are lots of examples where the meaning (loosely defined) can't be derived from the meanings of the parts: idioms, jokes, irony, sarcasm, metaphor, metonymy, indirect requests, etc.
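Picking up the "gave" example from the start of this section, a sketch of the two schemes; the role names are illustrative, and the point is only that both surface orders reach the same representation:

    # Syntax-heavy: the verb contributes just the predicate; each VP rule
    # assigns the roles itself.
    def vp_v_np_np(v, np1, np2):   # gave Mary a book
        return f"{v} ∧ Recip({np1}) ∧ Object({np2})"
    def vp_v_np_pp(v, np1, pp):    # gave a book to Mary
        return f"{v} ∧ Recip({pp}) ∧ Object({np1})"

    # Lexicon-heavy alternative: two lexical entries whose curried argument
    # order differs, with the VP rules just applying V.sem to the NPs/PP.
    gave1 = lambda recip: lambda obj: f"Giving ∧ Recip({recip}) ∧ Object({obj})"
    gave2 = lambda obj: lambda recip: f"Giving ∧ Recip({recip}) ∧ Object({obj})"

    assert vp_v_np_np('Giving', 'Mary', 'Book') == \
           vp_v_np_pp('Giving', 'Book', 'Mary') == \
           gave1('Mary')('Book') == gave2('Book')('Mary')
    print(vp_v_np_np('Giving', 'Mary', 'Book'))
    # Giving ∧ Recip(Mary) ∧ Object(Book)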

English Idioms
Kick the bucket, buy the farm, bite the bullet, run the show, bury the hatchet, etc. There are lots of constructions like these, where the meaning of the whole is either totally unrelated to the meanings of the parts (kick the bucket) or related in some opaque way (run the show).

Example
Enron is the tip of the iceberg.
NP → the tip of the iceberg
Not so good; consider these attested examples:
- the tip of Mrs. Ford's iceberg
- the tip of a 1000-page iceberg
- the merest tip of the iceberg
And how about: That's just the iceberg's tip.

What we seem to need is something like:
NP → an initial NP with "tip" as its head, followed by a PP with "of" as its head whose NP has "iceberg" as its head
and that allows modifiers like "merest", "Mrs. Ford's", and "1000-page" to modify the relevant semantic forms.

The Tip of the Iceberg
Three ways of describing this particular construction:
1. A fixed phrase with a particular meaning.
2. A syntactically and lexically flexible phrase with a particular meaning.
3. A syntactically and lexically flexible phrase with a partially compositional meaning.

Constructional Approach
Syntax and semantics aren't separable in the way that we've been assuming. Grammars contain form-meaning pairings that vary in the degree to which the meaning of a constituent (and what constitutes a constituent) can be computed from the meanings of the parts. So we'll allow both:
VP → V NP  {V.sem(NP.sem)}
VP → Kick-Verb the bucket  {λx Die(x)}
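A toy sketch of letting the two rule types coexist; the idiom check (matching the literal words of the object) is an illustrative assumption, not a serious idiom detector:

    def vp_sem(verb, obj_words, np_sem):
        if verb == 'kick' and obj_words == ['the', 'bucket']:
            return 'λx Die(x)'              # VP -> Kick-Verb the bucket
        return f"{verb}.sem({np_sem})"      # VP -> V NP {V.sem(NP.sem)}

    print(vp_sem('kick', ['the', 'bucket'], 'Bucket'))  # λx Die(x)
    print(vp_sem('kick', ['the', 'ball'], 'Ball'))      # kick.sem(Ball)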

Computational Realizations
- Semantic grammars: a simple idea with a dumb name.
- Cascaded finite-state transducers: just like Chapter 3.

Semantic Grammars
One problem with traditional grammars is that they don't necessarily reflect the semantics in a straightforward way. You can deal with this by:
- Fighting with the grammar: complex lambdas, complex terms, etc.
- Rewriting the grammar to reflect the semantics, and in the process giving up on some syntactic niceties.

BERP
How about a rule like the following?
Request → I want to go to eat FoodType Time  {some attachment}

Semantic Grammar
The term "semantic grammar" refers to the motivation for the grammar rules; the technology (plain CFG rules with a set of terminals) is the same as we've been using. The good thing about semantic grammars is that you get exactly the semantic rules you need. The bad thing is that you need to develop a new grammar for each new domain.

Semantic Grammars
Typically used in conversational agents in constrained domains: limited vocabulary, limited grammatical complexity. Chart parsing (Earley) can often produce all that's needed for semantic interpretation, even in the face of ungrammatical input.
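In a constrained domain like BERP's, a semantic-grammar rule can be approximated by a pattern whose categories are domain slots, so the "parse" is already the interpretation. This regex sketch (the pattern and slot names are assumptions, in the spirit of the cascaded finite-state approach) yields a filled frame directly:

    import re

    REQUEST = re.compile(
        r"I want to (?:go to )?eat (?P<FoodType>\w+)"
        r"(?: (?P<Time>tonight|today|tomorrow))?")

    m = REQUEST.match("I want to eat Chinese tonight")
    if m:
        frame = {'frame': 'Request',
                 **{k: v for k, v in m.groupdict().items() if v}}
        print(frame)  # {'frame': 'Request', 'FoodType': 'Chinese', 'Time': 'tonight'}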