Supplement to Logic Unit: Logical Structure in Natural Language

Contents:
1. Quantifiers: Introduction
2. A solution (Montague 1973): NPs as Generalized Quantifiers
3. Semantic explanations of linguistic phenomena: a case study: Negative polarity items ("NPIs")
Appendix 1. Recursive definition of "≤" on semantic types that "end with a t"
References

1. Quantifiers: Introduction

We've studied quantifiers in first-order quantificational logic (= predicate logic), and you've had some experience translating sentences from English into predicate logic. Now let's look at English sentences containing quantifiers and see whether logic helps us analyze them semantically. The answer: yes, but first-order quantificational logic isn't enough if we want to do justice to the structure of natural language.

For some historical perspective behind what we'll talk about today and next time, see (Partee 1996, Partee 2004) and the longer version of the latter on my website: http://people.umass.edu/partee/docs/bhp_essay_feb05.pdf. A good introduction to much of what we'll do in these two classes can be found in (Larson 1995). A somewhat more advanced introduction, including an introduction to the lambda calculus, which we won't go into here, can be found in Part D, "English as a Formal Language", of the Partee, ter Meulen and Wall textbook: Chapter 13, "Basic Concepts", and Chapter 14, "Generalized Quantifiers".

Consider the following sentence, which contains the universal quantifier-word every and the indefinite article a. The sentence is semantically ambiguous: we can think of the indefinite article as introducing an existential quantifier and every as introducing a universal quantifier, and the two quantifiers can be interpreted in either scope order.

(1) a. Every student read a book. (Quantifier scope ambiguity)

Just one (surface) syntactic structure:

    b. [S [NP [DET every] [CNP student]] [VP [V read] [NP [DET a] [CNP book]]]]

Predicate logic representations of the two readings:

(2) (i)  ∀x(Student(x) → ∃y(Book(y) & Read(x,y)))
    (ii) ∃y(Book(y) & ∀x(Student(x) → Read(x,y)))

Compositional interpretation of the English sentence: How do we derive the meaning of the whole from the meaning of the parts? -- First question: what are the parts?

The difficulty for compositionality if we try to use predicate calculus to represent "logical form": What is the interpretation of every student? There is no appropriate syntactic category or semantic type in predicate logic. This is an inadequacy of first-order predicate logic for representing the semantic structure of natural language. We can solve this problem when we have (the lambda calculus and)[1] a richer type theory.

Categories of PC vs. categories of NL:

    Formula          - Sentence
    Predicate        - Verb, Common Noun, Adjective
    Term: Constant   - Proper Noun
    Term: Variable   - Pronoun (he, she, it)
    (no more)        - Verb Phrase, Noun Phrase, Common Noun Phrase, Adjective Phrase,
                       Determiner, Preposition, Prepositional Phrase, Adverb

Where in the formula below is the meaning of every student?

(3) a. ∀x(Student(x) → Walk(x))

Answer: it's all the underlined parts, that is, everything except the predicate Walk:

    b. ∀x(Student(x) → Walk(x))    (underline everything except "Walk")

It's not a constituent in the logical formula, although it is a constituent in the English sentence. Predicate logic helps us express the truth-conditional content of these English sentences, but it does not capture the structure of such English sentences.
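Before comparing English trees with their predicate-logic translations, here is a quick sketch (not part of the original handout) showing that the two readings in (2) really do come apart: it evaluates both formulas in a tiny invented model (the students, books, and "read" relation are made up for illustration).

# Sketch (not from the handout): the two predicate-logic readings in (2) evaluated
# in a tiny invented model, to see that they can differ in truth value.

students = {'ann', 'bob'}
books    = {'war_and_peace', 'emma'}
read     = {('ann', 'war_and_peace'), ('bob', 'emma')}   # who read what

# Reading (i):  for every x (Student(x) -> there is a y (Book(y) & Read(x,y)))
reading_i  = all(any((x, y) in read for y in books) for x in students)

# Reading (ii): there is a y (Book(y) & for every x (Student(x) -> Read(x,y)))
reading_ii = any(all((x, y) in read for x in students) for y in books)

print(reading_i, reading_ii)   # True False: each student read some book,
                               # but no single book was read by every student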
Consider the following sentences and their structures, with English on the left and PC on the right:

(4) a. English:  [S [NP John] [VP walks]]        PC:  [Formula [Pred Walk] [Term John]]
       John walks.                                    Walk(John)

[1] Using the lambda calculus to help express higher-type interpretations is often helpful, but it's not strictly essential, and we will manage to do without it here.

    b. English:  [S [NP [Det every] [N student]] [VP walks]]
       PC:       [S ∀x [S [S student(x)] → [S walk(x)]]]
       Every student walks.        (∀x)(student(x) → walk(x))

    c. English:  [S [NP [Det some] [N student]] [VP walks]]
       PC:       [S ∃x [S [S student(x)] & [S walk(x)]]]
       Some student walks.         (∃x)(student(x) & walk(x))

    d. English:  [S [NP [Det no] [N student]] [VP walks]]
       PC:       [S ¬ [S ∃x [S [S student(x)] & [S walk(x)]]]]
       No student walks.           ¬(∃x)(student(x) & walk(x))

This last sentence, No student walks, has two equivalent translations into predicate logic: the one above and (∀x)(student(x) → ¬walk(x)). That one would have a different tree.

What similarities do you see in the four English trees? And what differences do you see between the English syntactic structures and the structures of the formulas that are their logical translations?

2. A solution (Montague 1973): NPs as Generalized Quantifiers

Determiner meanings: Relations between sets, or functions which apply to one set (the interpretation of the CNP) to give a function from sets to truth values, or equivalently, a set of sets (the interpretation of the NP).

Typical case:

    [S [NP [DET ...] [CNP ...]] [VP ...]]

Semantic types:
Basic types: e, the type of entities, and t, the type of truth values.
Functional types: a → b, the type of functions from a-type things to b-type things.
Example: 1-place predicates denote sets of entities; the type of the characteristic function of a set of entities is e → t. So this is the type for simple nouns like student, intransitive verbs like walks, and simple adjectives like red. We'll also assume it's the type for all VPs. The type for S will be t. Proper names in English, like terms in logic, can be assumed to be of type e.

    CNP: type e → t
    VP:  type e → t
    DET: interpreted as a function which applies to a CNP meaning to give a generalized quantifier, which is a function which applies to a VP meaning to give a sentence meaning (extension: a truth value). Type: (e → t) → ((e → t) → t)
    NP:  type (e → t) → t

(Let's work through these functional types on the blackboard.)

Sometimes it is simpler to think about DET meanings in relational terms, as a relation between a CNP-type meaning and a VP-type meaning, using the equivalence between a function that takes a pair of arguments and a function that takes two arguments one at a time.

    Every, as a relation between sets A and B ("Every A B"): A ⊆ B.
    Some, a: A ∩ B ≠ ∅.
    No: A ∩ B = ∅.
    Most (not first-order expressible): |A ∩ B| > |A − B|.

    [[Every]](A,B) = 1 iff A ⊆ B.
    [[Some]](A,B) = 1 iff A ∩ B ≠ ∅.
    [[No]](A,B) = 1 iff A ∩ B = ∅.
    [[Most]](A,B) = 1 iff |A ∩ B| > |A − B|.
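These relational determiner meanings can be tried out directly on small finite sets. Below is a minimal sketch (not part of the original handout) in Python; the sets student and walk and their members are invented purely for illustration.

# Sketch: the relational determiner meanings above, with Python sets standing in for
# the CNP denotation A and the VP denotation B. (The model is invented.)

def every(A, B):  return A <= B                      # A is a subset of B
def some(A, B):   return bool(A & B)                 # A and B overlap
def no(A, B):     return not (A & B)                 # A and B are disjoint
def most(A, B):   return len(A & B) > len(A - B)     # not first-order expressible

student = {'ann', 'bob', 'cam'}
walk    = {'ann', 'bob', 'dee'}

print(every(student, walk))   # False: cam is a student who doesn't walk
print(some(student, walk))    # True
print(no(student, walk))      # False
print(most(student, walk))    # True: 2 walking students vs. 1 non-walking student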

Determiners as one-place functions whose value is also a function:

But to mimic the structure of the English NP, we want every to combine with one predicate, not with two predicates at the same time. Here's the trick: we can define every as an expression that combines with a predicate to yield a predicate that combines with another predicate:

(5) a. Every(A)(B) = 1 iff A ⊆ B
    b. Some(A)(B) = 1 iff A ∩ B ≠ ∅
    c. No(A)(B) = 1 iff A ∩ B = ∅

Every(A) denotes a predicate of predicates: a set of predicates. We will call such a creature a generalized quantifier. A predicate B should be in the extension of Every(A) iff A is a subset of B. In set terms, Every takes as argument a set A and gives as result {B | A ⊆ B}: the set of all sets that contain A as a subset. Equivalently: Every(A) = {B | ∀x(x ∈ A → x ∈ B)}.

Some, a: takes as argument a set A and gives as result {B | A ∩ B ≠ ∅}.

-- How would you express the meaning of Not every this way? Most?

Applying a function to its argument (blackboard): Every(Student).
Combining NP and VP (blackboard).

Linguistic universal: Natural language determiners are conservative functions. (Barwise and Cooper 1981)

Definition: A determiner meaning D is conservative iff for all A, B: D(A)(B) = D(A)(A ∩ B).

Examples:
    No solution is perfect = No solution is a perfect solution.
    Exactly three circles are blue = Exactly three circles are blue circles.
    Every boy is singing = Every boy is a boy who is singing.

"Non-example": Only is not conservative; but it can be argued that only is not a determiner. Only males are astronauts (false) ≠ Only males are male astronauts (true).

Theorem (Keenan and Stavi 1986, van Benthem 1986): Starting from every and a as basic determiners, and building other determiner meanings by the Boolean operations of negation, conjunction, and disjunction, the resulting set of determiners consists of exactly the conservative determiners.

Suggested consequence: The conservativity universal is probably linked to the Boolean structure that is found throughout natural language semantics. It may be conjectured (Chierchia and McConnell-Ginet 1990) that we are mentally endowed with cross-categorial Boolean functions as the basic combinatory tool of our capacity for concept formation.

3. Semantic explanations of linguistic phenomena: a case study: Negative polarity items ("NPIs")

In English there are negative polarity items (NPIs) which are restricted to occurring in certain contexts; negative contexts are typical licensing contexts, but not the only ones. The linguistic problem is to characterize the nature of the contexts in which NPIs can and cannot occur (and to characterize the NPIs themselves, but we will not try to do that here).

Examples of NPIs: any, anyone, anything, anywhere, ever, at all; give a damn, lift a finger, move a muscle, pay the slightest attention.

Examples:
(1) I did not see any lions.
(2) *I saw any lions.
(3) If you have any questions, you can call me.
(4) No one has ever found a unicorn.
(5) *Someone has ever found a unicorn.
(6) No student who knows anything about phonology would ever say that.
(7) *Some student who knows anything about phonology would ever say that.
(8) Every student who knows anything about phonology will know the answer.
(9) *Every student who knows phonology would ever say that.

The semantic generalization discovered by (Ladusaw 1979) is that NPIs occur inside the argument of monotone decreasing functions.
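Before spelling out what "monotone decreasing" means, here is a small sketch (not part of the handout) of the one-place-function view of determiners from section 2, together with a brute-force check of the conservativity universal over all subsets of a tiny invented domain; the construal of only as a determiner is included just for contrast.

# Sketch: determiners as one-place functions whose value is a generalized quantifier
# (a function from VP-type sets to truth values), plus a brute-force conservativity
# check over all subsets of a small invented domain.

from itertools import chain, combinations

def every(A): return lambda B: A <= B
def some(A):  return lambda B: bool(A & B)
def no(A):    return lambda B: not (A & B)
def only(A):  return lambda B: B <= A        # "only" construed as a determiner, for contrast

domain = {'a', 'b', 'c'}
subsets = [set(s) for s in chain.from_iterable(combinations(domain, n) for n in range(4))]

def conservative(det):
    # D is conservative iff D(A)(B) = D(A)(A & B) for all A, B
    return all(det(A)(B) == det(A)(A & B) for A in subsets for B in subsets)

print(conservative(every), conservative(some), conservative(no))  # True True True
print(conservative(only))                                         # False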
The notion of a monotone decreasing function is much more general than the notion of negation; it covers all of the above examples and many others, and it is an intrinsically model-theoretic concept: a real semantic property of the interpretation of the expressions, not a formal property of representations in some sort of "logical form".

Definition (general): A function f is monotone increasing if whenever a ≤ b, f(a) ≤ f(b). A function f is monotone decreasing if whenever a ≤ b, f(b) ≤ f(a).

Application to determiner meanings: (Note: on the domains in our model, the basic ordering relation begins from the ordering on type t: 0 < 1; for all types whose interpretations are sets, the corresponding notion of "less than" then becomes "subset of". See Appendix 1 below for more details.)

Definitions:
    A determiner D is right monotone increasing (sometimes called right upward entailing, or monotone ↑) iff whenever B ⊆ C, D(A)(B) entails D(A)(C).
    A determiner D is right monotone decreasing (right downward entailing, or monotone ↓) iff whenever B ⊆ C, D(A)(C) entails D(A)(B).
    A determiner D is left monotone increasing (left upward entailing, or ↑ monotone) iff whenever A ⊆ C, D(A)(B) entails D(C)(B).
    A determiner D is left monotone decreasing (left downward entailing, or ↓ monotone) iff whenever A ⊆ C, D(C)(B) entails D(A)(B).
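These definitions can be checked mechanically. The following sketch (not part of the handout) tests the four properties by brute force over all subsets of a small invented domain, re-defining the curried determiners from the previous sketch for self-containment; the illustrations that follow run through the same facts with English entailment tests.

# Sketch: brute-force tests of the right/left monotonicity definitions above, over all
# subsets of a small invented domain. "p <= q" on booleans plays the role of "p entails q".

from itertools import chain, combinations

def every(A): return lambda B: A <= B
def some(A):  return lambda B: bool(A & B)
def no(A):    return lambda B: not (A & B)

domain = {'a', 'b', 'c'}
subsets = [set(s) for s in chain.from_iterable(combinations(domain, n) for n in range(4))]
pairs = [(X, Y) for X in subsets for Y in subsets if X <= Y]   # all pairs with X a subset of Y

def right_increasing(det): return all(det(A)(B) <= det(A)(C) for A in subsets for B, C in pairs)
def right_decreasing(det): return all(det(A)(C) <= det(A)(B) for A in subsets for B, C in pairs)
def left_increasing(det):  return all(det(A)(B) <= det(C)(B) for B in subsets for A, C in pairs)
def left_decreasing(det):  return all(det(C)(B) <= det(A)(B) for B in subsets for A, C in pairs)

for name, det in [('every', every), ('some', some), ('no', no)]:
    print(name, right_increasing(det), right_decreasing(det),
          left_increasing(det), left_decreasing(det))
# every: True False False True   (right increasing, left decreasing)
# some:  True False True False   (increasing in both arguments)
# no:    False True False True   (decreasing in both arguments)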

Illustrations: In a structure Det CNP VP, the left position is the CNP argument, and the right position is the VP argument.

1. To show, for instance, that no is right monotone decreasing, we use a test like the following:
   (i) B ⊆ C: [knows Turkish and Chinese] ⊆ [knows Turkish]
   (ii) Test the entailment: No student knows Turkish ⟹ no student knows Turkish and Chinese. Valid.
   So no is right monotone decreasing.

2. To show that no is left monotone decreasing, we use a test like the following:
   (ii) No student knows Urdu ⟹ no Italian student knows Urdu. Valid.

3. Similarly we can show that some is right monotone increasing.
   (i) [knows Turkish and Chinese] ⊆ [knows Turkish]
   (ii) Some student knows Turkish and Chinese ⟹ some student knows Turkish.

4. Some is also left monotone increasing.
   (ii) Some Italian student knows Urdu ⟹ some student knows Urdu. Valid.

5. Interesting fact about every. While most determiners are like some and no in being either left and right increasing or left and right decreasing (so that it makes sense to call some "positive" and no "negative"), there are some determiners, of which the universal quantifier every is the most basic example, which have different properties for their left and right arguments.

   5a. Every is left monotone decreasing:
       (ii) Every student knows Urdu ⟹ Every Italian student knows Urdu. Valid.
   5b. Every is right monotone increasing:
       (i) [knows Turkish and Chinese] ⊆ [knows Turkish]
       (ii) Every student knows Turkish and Chinese ⟹ Every student knows Turkish.

The distribution of polarity items in the CNP part and the VP part of sentences (4)-(9) above, and others like them, is accounted for by the monotonicity properties of the determiners in them. This account reinforces the analysis of determiners as functions which take a CNP as first argument, and the resulting NP interpretation (a generalized quantifier) as a function which takes the VP as its argument.

Appendix 1. Recursive definition of "≤" on semantic types that "end with a t"

Relevant background: (Partee and Rooth 1983), (Ladusaw 1980). (The first of these doesn't mention monotonicity, but it defines conjunction recursively on the same family of semantic types; there are considerable formal parallels in the two recursive definitions.) Both can be found in the useful collection (Portner and Partee 2002).

Semantic types (from the extensional part of Montague's intensional logic):
    Basic types: e, t (entities, truth values).
    Recursively defined functional types: If a, b are types, then <a,b> is a type. <a,b> is the type of expressions which denote functions from a-type things to b-type things.
    Negation as a sentence operator is of type <t,t>. One-place predicates (nouns, some adjectives, maybe intransitive verbs) are of type <e,t>. Generalized quantifiers are of type <<e,t>,t>.

Semantic domains: Start with a set D of entities and the set of two truth values {0,1}. Then recursively define the domain of possible denotations D_a for expressions of any type a:
    D_t, the domain of possible denotations for expressions of type t, is {0,1}.
    D_e, the domain of possible denotations for expressions of type e, is D.
    D_<a,b>, the domain of possible denotations for expressions of type <a,b>, is the set of all functions from D_a to D_b.
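For a finite entity domain, these denotation domains can be enumerated directly. The sketch below (not part of the handout) builds D_a recursively for a two-entity domain, encoding a type as 'e', 't', or a pair (a, b) standing for <a,b>, and representing a function in D_<a,b> as a finite table of argument/value pairs; the entity names are invented.

# Sketch: the recursive domain construction above, for a tiny invented entity domain.
# Since all domains here are finite, a function in D_<a,b> is represented as a tuple
# of (argument, value) pairs.

from itertools import product

ENTITIES = ('j', 'm')          # invented domain D of entities

def domain(ty):
    """Return the list of possible denotations D_ty for type ty."""
    if ty == 't':
        return [0, 1]
    if ty == 'e':
        return list(ENTITIES)
    a, b = ty                  # ty is a pair (a, b), i.e. <a,b>
    args, vals = domain(a), domain(b)
    # a function from D_a to D_b is one choice of value for each argument
    return [tuple(zip(args, choice)) for choice in product(vals, repeat=len(args))]

print(len(domain('t')))                    # 2
print(len(domain(('e', 't'))))             # 4  = 2^2 characteristic functions of subsets of D
print(len(domain((('e', 't'), 't'))))      # 16 = 2^4 generalized quantifiers over a 2-entity domain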
Now we're ready to start the definition of "less than" that will be used in the definition of "monotone" across semantic types. First we need to define the set of types that "end with a t".

Recursive definition of types that "end with a t":
    1. Type t "ends with a t."
    2. If x is any type at all, and y is a type that "ends with a t", then <x,y> is a type that "ends with a t."
(The result is in fact the set of all types that have t as their last "letter" (symbol), with any number (0 or more) of right brackets (>>>...>) following that last t. Every type ends with either an e or a t.)

Recursive definition of ≤: Now we can define ≤ on the set of all types that end in a t.
    1. On type t: 0 ≤ 0, 0 ≤ 1, 1 ≤ 1. [In fact, we can divide ≤ into < and = in the natural way here.]
    2. For any functional type <a,b>, and any f, g in D_<a,b>: f ≤ g iff for all x in D_a, f(x) ≤ g(x).
That's it. Let's see how it applies to types t, <e,t> and <<e,t>,t>.

First, type t. This is the type of truth values, the extensions of sentences. Using [[a]] to represent the semantic value of a, we can observe that for the case of sentences, [[a]] ≤ [[b]] is equivalent to "a implies b" (material implication, defined by the usual truth table), because it holds when a and b are both true (1 ≤ 1), when they are both false (0 ≤ 0), and when a is false and b is true (0 ≤ 1). The only case where it fails is where a is true and b is false (NOT: 1 ≤ 0). (Note: ≤ is not defined on type e, nor on any type that "ends with an e".)
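Here is a small sketch (again not part of the handout) of this recursive definition, using the denotation representation from the previous sketch; the <e,t> case, taken up in the text below, comes out as the subset relation.

# Sketch: the recursive definition of ≤ above, for denotations represented as in the
# previous sketch (0/1 for type t, tuples of argument/value pairs for functional types).

def leq(f, g, ty):
    """f ≤ g for f, g in D_ty, where ty 'ends with a t'."""
    if ty == 't':
        return f <= g                              # 0 ≤ 0, 0 ≤ 1, 1 ≤ 1
    a, b = ty                                      # ty = <a,b>
    fd, gd = dict(f), dict(g)
    return all(leq(fd[x], gd[x], b) for x in fd)   # pointwise: f(x) ≤ g(x) for all x in D_a

# Type <e,t>: characteristic functions; ≤ comes out as "subset of".
A = (('j', 1), ('m', 0))       # {j}
B = (('j', 1), ('m', 1))       # {j, m}
print(leq(A, B, ('e', 't')))   # True:  {j} is a subset of {j, m}
print(leq(B, A, ('e', 't')))   # False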

Then type <e,t>, the type of characteristic functions of sets of entities: the semantic values of one-place predicate expressions (nouns, simple adjectives, maybe simple intransitive verbs, some prepositional phrases, etc.). Let A, B be two expressions of type <e,t>, e.g. two common noun phrases. [[A]] ≤ [[B]] is true iff for all d in D, [[A]](d) ≤ [[B]](d). But since ≤ on sentences means "implies", this is another way of saying that [[A]] is a subset of [[B]]. So we've derived the fact that less-than on predicates means "subset of" from the fact that less-than on propositions means "implies".

Now type <<e,t>,t>: Let me go straight to model-theoretic terms, bypassing the expressions. For all P, P' in D_<<e,t>,t>, P ≤ P' iff for all Q in D_<e,t>, P(Q) ≤ P'(Q). Since the possible denotations of type <<e,t>,t> are sets of sets, this again turns out to be the subset relation, though this time it's the subset relation among sets of sets rather than among sets of entities.

Monotonicity: Having defined ≤ across all the types that end with a t, we can define monotonicity for all functional types that end with a t and whose arguments also end with a t: For any type <a,b> such that both a and <a,b> "end with a t", and any function f of type <a,b>, f is monotone increasing iff for all x, y in D_a: if x ≤ y, then f(x) ≤ f(y).

References

Barwise, Jon, and Cooper, Robin. 1981. Generalized quantifiers and natural language. Linguistics and Philosophy 4:159-219. Reprinted in Portner and Partee, eds., 2002, 75-126.
Chierchia, Gennaro, and McConnell-Ginet, Sally. 1990. Meaning and Grammar: An Introduction to Semantics. Cambridge: MIT Press.
Keenan, Edward, and Stavi, Jonathan. 1986. A semantic characterization of natural language determiners. Linguistics and Philosophy 9:253-326.
Ladusaw, William. 1979. Polarity Sensitivity as Inherent Scope Relations. Ph.D. dissertation, University of Texas at Austin.
Ladusaw, William. 1980. On the notion "affective" in the analysis of negative polarity items. Journal of Linguistic Research 1:1-16. Reprinted in Portner and Partee, eds., 2002, 457-470.
Larson, Richard. 1995. Semantics. In An Invitation to Cognitive Science. Vol. 1: Language, eds. Lila Gleitman and Mark Liberman, 361-380. Cambridge, MA: The MIT Press.
Partee, Barbara H., and Rooth, Mats. 1983. Generalized conjunction and type ambiguity. In Meaning, Use, and Interpretation of Language, eds. Rainer Bäuerle, Christoph Schwarze and Arnim von Stechow, 361-383. Berlin: de Gruyter. Reprinted in Portner and Partee, eds., 2002.
Partee, Barbara H. 1996. The development of formal semantics in linguistic theory. In The Handbook of Contemporary Semantic Theory, ed. Shalom Lappin, 11-38. Oxford: Blackwell.
Partee, Barbara H. 2004. Reflections of a formal semanticist. In Compositionality in Formal Semantics: Selected Papers by Barbara H. Partee, 1-25. Oxford: Blackwell Publishing.
Portner, Paul, and Partee, Barbara H., eds. 2002. Formal Semantics: The Essential Readings. Oxford: Blackwell Publishers.
van Benthem, Johan. 1986. Essays in Logical Semantics. Dordrecht: Reidel.