Ling 566 Nov 22, 2016


Ling 566 Nov 22, 2016 Auxiliaries cont: NICE

Overview
NICE properties of auxiliaries
The auxiliary do
NICE properties (lexical rules)
Reading questions

Descriptive Summary of the NICE Properties
Negation: Sentences are negated by putting not after the first auxiliary verb; they can be reaffirmed by putting too or so in the same position.
Inversion: Questions are formed by putting an auxiliary verb before the subject NP.
Contraction: Auxiliary verbs take negated forms, with n't affixed.
Ellipsis: Verb phrases immediately following an auxiliary verb can be omitted.

Negation (and Reaffirmation)
Polar adverbs (sentential not, so, and too) appear immediately following an auxiliary:
  Pat will not leave
  Pat will SO leave
  Pat will TOO leave
What about examples like Not many people left?
What happens when you want to deny or reaffirm a sentence with no auxiliary?
  Pat left
  Pat did not leave
  Pat did TOO leave

The Auxiliary do
Like modals, auxiliary do only occurs in finite contexts: *Pat continued to do not leave
Unlike modals, do cannot be followed by other auxiliaries: *Pat did not have left
Lexical entry:
  < do , auxv-lxm
      HEAD   [ verb, FORM fin ]
      ARG-ST < X , [ HEAD [ verb, FORM base, AUX - ], SEM [ INDEX s ] ] >
      SEM    [ INDEX s, RESTR < > ] >

The ADVpol-Addition Lexical Rule
  pi-rule
  INPUT  < X , [ HEAD   [ verb, FORM fin, AUX +, POL - ]
                 ARG-ST < 1 > ⊕ A
                 SEM    [ INDEX s1 ] ] >
  OUTPUT < Y , [ HEAD   [ POL + ]
                 VAL    [ SPR < Z > ]
                 ARG-ST < 1 > ⊕ < ADVpol [ INDEX s2, RESTR < [ ARG s1 ] > ] > ⊕ A
                 SEM    [ INDEX s2 ] ] >
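Read procedurally, the rule is a function from one word description to another. Here is a toy Python sketch of that reading, using plain dicts in place of typed feature structures; the encoding (string indices, a CAT feature, s1 becoming s1' as a stand-in for a fresh index) is my own simplification, not the book's formalism:

```python
import copy

def advpol_addition(word):
    """Toy ADVpol-Addition Lexical Rule: for a finite, AUX +, non-POL
    verb, splice a polar adverb in right after the first argument and
    give the word a fresh situation index for the adverb's semantics."""
    head = word["HEAD"]
    if not (head.get("FORM") == "fin" and head.get("AUX") and not head.get("POL")):
        return None                       # rule does not apply
    out = copy.deepcopy(word)
    out["HEAD"]["POL"] = True             # output is POL +, so the rule cannot reapply
    s1 = word["SEM"]["INDEX"]
    s2 = s1 + "'"                         # stand-in for a fresh index s2
    adv = {"CAT": "ADVpol", "INDEX": s2, "RESTR": [{"ARG": s1}]}
    out["ARG-ST"] = out["ARG-ST"][:1] + [adv] + out["ARG-ST"][1:]   # <1> + <ADVpol> + A
    out["SEM"]["INDEX"] = s2              # the word now denotes the adverb's situation
    return out

# 'will' as in "Pat will leave": subject NP plus VP complement
will = {"HEAD": {"FORM": "fin", "AUX": True},
        "ARG-ST": [{"CAT": "NP"}, {"CAT": "VP"}],
        "SEM": {"INDEX": "s1"}}
negated = advpol_addition(will)
```

Because the output is POL +, feeding it back in returns None, which is one way to see why the rule does not iterate.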

What does the type pi-rule mean?
It maps words to words (hence, "post-inflectional")
It preserves MOD values, HEAD values as a default, and (like other lexical rule types) SEM values as a default
  pi-rule
  INPUT  < /0 , word [ HEAD /1, VAL [ MOD A ], SEM /2 ] >
  OUTPUT < /0 , word [ HEAD /1, VAL [ MOD A ], SEM /2 ] >

Why doesn't the ADVpol-Addition LR mention VAL?
  pi-rule
  INPUT  < X , [ HEAD   [ verb, FORM fin, AUX +, POL - ]
                 ARG-ST < 1 > ⊕ A
                 SEM    [ INDEX s1 ] ] >
  OUTPUT < Y , [ HEAD   [ POL + ]
                 VAL    [ SPR < Z > ]
                 ARG-ST < 1 > ⊕ < ADVpol [ INDEX s2, RESTR < [ ARG s1 ] > ] > ⊕ A
                 SEM    [ INDEX s2 ] ] >

What is the role of these indices?
  pi-rule
  INPUT  < X , [ HEAD   [ verb, FORM fin, AUX +, POL - ]
                 ARG-ST < 1 > ⊕ A
                 SEM    [ INDEX s1 ] ] >
  OUTPUT < Y , [ HEAD   [ POL + ]
                 VAL    [ SPR < Z > ]
                 ARG-ST < 1 > ⊕ < ADVpol [ INDEX s2, RESTR < [ ARG s1 ] > ] > ⊕ A
                 SEM    [ INDEX s2 ] ] >

Which nots does the rule license?
  pi-rule
  INPUT  < X , [ HEAD   [ verb, FORM fin, AUX +, POL - ]
                 ARG-ST < 1 > ⊕ A
                 SEM    [ INDEX s1 ] ] >
  OUTPUT < Y , [ HEAD   [ POL + ]
                 VAL    [ SPR < Z > ]
                 ARG-ST < 1 > ⊕ < ADVpol [ INDEX s2, RESTR < [ ARG s1 ] > ] > ⊕ A
                 SEM    [ INDEX s2 ] ] >
Andy must not have been sleeping?
Andy must have not been sleeping?
Andy must have been not sleeping?
Kleptomaniacs cannot not steal.

Negation and Reaffirmation: A Sample Tree
  [S [NP Leslie] [VP [V did] [ADVpol so] [VP eat the whole pizza]]]

Inversion
Yes-no questions begin with an auxiliary: Will Robin win?
The NP after the auxiliary has all the properties of a subject:
  Agreement: Have they left? vs. *Has they left?
  Case: *Have them left?
  Raising: Will there continue to be food at the meetings?
What happens if you make a question out of a sentence without an auxiliary?
  Robin won
  Did Robin win?

The Inversion Lexical Rule
  pi-rule
  INPUT  < W , [ HEAD   [ verb, FORM fin, AUX + ]
                 VAL    [ SPR < X > ]
                 ARG-ST A
                 SEM    [ MODE prop ] ] >
  OUTPUT < Z , [ HEAD   [ INV + ]
                 VAL    [ SPR < > ]
                 ARG-ST A
                 SEM    [ MODE ques ] ] >

How the Rule Yields Inverted Order
  pi-rule
  INPUT  < W , [ HEAD   [ verb, FORM fin, AUX + ]
                 VAL    [ SPR < X > ]
                 ARG-ST A
                 SEM    [ MODE prop ] ] >
  OUTPUT < Z , [ HEAD   [ INV + ]
                 VAL    [ SPR < > ]
                 ARG-ST A
                 SEM    [ MODE ques ] ] >
...plus the ARP
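A toy sketch of how emptying SPR produces the inverted order, with dicts standing in for feature structures (the encoding is my simplification). The rule leaves ARG-ST alone; a simplified Argument Realization Principle then pushes the unclaimed subject onto COMPS, so the auxiliary's first sister is the subject NP:

```python
import copy

def realize_arguments(word):
    """Toy Argument Realization Principle: ARG-ST is SPR + COMPS, so
    whatever SPR does not claim shows up on COMPS."""
    n = len(word["VAL"]["SPR"])
    word["VAL"]["COMPS"] = word["ARG-ST"][n:]
    return word

def inversion(word):
    """Toy Inversion Lexical Rule: finite auxiliary in, question out,
    with an emptied SPR list and INV + on the head."""
    head = word["HEAD"]
    if not (head.get("FORM") == "fin" and head.get("AUX")):
        return None
    out = copy.deepcopy(word)
    out["HEAD"]["INV"] = True
    out["VAL"]["SPR"] = []                # the subject is no longer a specifier...
    out["SEM"]["MODE"] = "ques"
    return realize_arguments(out)         # ...so the ARP makes it the first complement

will = {"HEAD": {"FORM": "fin", "AUX": True},
        "VAL": {"SPR": ["NP"], "COMPS": []},
        "ARG-ST": ["NP", "VP"],
        "SEM": {"MODE": "prop"}}
inverted = inversion(will)                # "Will [NP Robin] [VP win]?"
```

On the uninverted word, the same realize_arguments step leaves the NP on SPR and only the VP on COMPS, so nothing special needs to be said about the ordinary order.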

The Feature INV
What is the INV value of inputs to the Inversion LR?
  Perhaps surprisingly, the input is INV +
  Word-to-word rules (pi-rules) have default identity of HEAD features, and no INV value is given on the input
Then what work is the feature doing?
  It's used to mark auxiliaries that can't or must be inverted:
  You better watch out vs. *Better you watch out
  I shall go (shall ~ will) vs. Shall I go? (shall ~ should)

Other Cases of Inversion
Inversion is not limited to questions:
  Preposed negatives: Never have I been so upset!
  Conditionals: Had we known, we would have left.
  Exclamations: May your teeth fall out!
Does our rule account for these?
  No. Our rule's output says MODE ques. And each construction has slightly different idiosyncrasies.
How might we extend our analysis to cover them?
  Define a type of inversion lexical rules, sharing certain properties, but with some differences.

Inversion: A Sample Tree
  [S [V Did] [NP Leslie] [VP eat the entire pizza]]  ("Did Leslie eat the entire pizza?")

Contraction
There are several types of contraction in English, but we're only talking about words ending in n't
It may seem like just not said fast, but there's more to it
Only finite verbs can take n't: *Terry must haven't seen us
There are morphological irregularities:
  won't, not *willn't
  %shan't, not *shalln't
  mustn't pronounced mussn't
  don't pronounced doen't, not dewn't
  *amn't

The Contraction Lexical Rule
  pi-rule
  INPUT  < 2 , [ HEAD   [ verb, FORM fin, AUX +, POL - ]
                 ARG-ST B
                 SEM    [ INDEX s1, RESTR A ] ] >
  OUTPUT < F_NEG( 2 ) , [ HEAD   [ POL + ]
                 VAL    [ SPR < X > ]
                 ARG-ST B
                 SEM    [ INDEX s2,
                          RESTR < [ RELN not, SIT s2, ARG s1 ] > ⊕ A ] ] >
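One way to picture F_NEG is as a partial morphological function: irregular contractions are listed, am and may simply have no output, and everything else gets -n't. Where F_NEG has no value, the rule yields nothing, which blocks *amn't and *mayn't without any extra condition on the INPUT. A toy sketch, with dicts standing in for feature structures; the lexicon's contents here are illustrative and the encoding is my own:

```python
# F_NEG as a partial function: listed irregulars, listed gaps, -n't default.
IRREGULAR = {"will": "won't", "shall": "shan't", "can": "can't"}
GAPS = {"am", "may"}

def f_neg(form):
    """Return the contracted form, or None where the morphology has a gap."""
    if form in GAPS:
        return None
    return IRREGULAR.get(form, form + "n't")

def contraction(word):
    """Toy Contraction Lexical Rule: finite, AUX +, non-POL input;
    output is POL + and adds a 'not' predication over the old index."""
    head = word["HEAD"]
    if not (head.get("FORM") == "fin" and head.get("AUX") and not head.get("POL")):
        return None
    contracted = f_neg(word["FORM"])
    if contracted is None:                # no morphological output, no contracted word
        return None
    s1, s2 = word["SEM"]["INDEX"], word["SEM"]["INDEX"] + "'"
    return {**word,
            "FORM": contracted,
            "HEAD": {**head, "POL": True},
            "SEM": {"INDEX": s2,
                    "RESTR": [{"RELN": "not", "SIT": s2, "ARG": s1}]
                             + word["SEM"]["RESTR"]}}

will = {"FORM": "will", "HEAD": {"FORM": "fin", "AUX": True},
        "SEM": {"INDEX": "s1", "RESTR": []}}
am = {"FORM": "am", "HEAD": {"FORM": "fin", "AUX": True},
      "SEM": {"INDEX": "s1", "RESTR": []}}
wont = contraction(will)                  # won't, with a not-relation on top
```

contraction(am) comes back None, and so does contraction(wont), since the output is POL +: that is the *can'tn't case.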

Most of the work is in the semantics
  pi-rule
  INPUT  < 2 , [ HEAD   [ verb, FORM fin, AUX +, POL - ]
                 ARG-ST B
                 SEM    [ INDEX s1, RESTR A ] ] >
  OUTPUT < F_NEG( 2 ) , [ HEAD   [ POL + ]
                 VAL    [ SPR < X > ]
                 ARG-ST B
                 SEM    [ INDEX s2,
                          RESTR < [ RELN not, SIT s2, ARG s1 ] > ⊕ A ] ] >
Why?

What does POL do?
  pi-rule
  INPUT  < 2 , [ HEAD   [ verb, FORM fin, AUX +, POL - ]
                 ARG-ST B
                 SEM    [ INDEX s1, RESTR A ] ] >
  OUTPUT < F_NEG( 2 ) , [ HEAD   [ POL + ]
                 VAL    [ SPR < X > ]
                 ARG-ST B
                 SEM    [ INDEX s2,
                          RESTR < [ RELN not, SIT s2, ARG s1 ] > ⊕ A ] ] >
*We can'tn't stop
*They won't TOO mind

Contraction: Sample Tree
  [S [NP Leslie] [VP [V wouldn't] [VP eat the entire pizza]]]

Ellipsis
Ellipsis allows VPs to be omitted, so long as they would have been preceded by an auxiliary:
  Pat couldn't have been watching us, but Chris could have been (watching us).
Unlike the other NICE properties, this holds of all auxiliaries, not just finite ones.
What is the elliptical counterpart to a sentence with no auxiliary?
  Whenever Pat watches TV, Chris watches TV
  Whenever Pat watches TV, Chris does

The Ellipsis Lexical Rule
  d-rule
  INPUT  < 1 , auxv-lxm  [ ARG-ST < 2 > ⊕ A ] >
  OUTPUT < 1 , dervv-lxm [ ARG-ST < 2 > ] >
Note that this is a derivational LR (d-rule) -- that is, lexeme-to-lexeme
This means that SYN and SEM are unchanged, by default
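The d-rule's one real effect is on ARG-ST: everything after the first argument is dropped, while the rest of the lexeme carries over by default. A toy sketch with dicts standing in for feature structures (my simplification, not the book's formalism):

```python
def ellipsis(lexeme):
    """Toy Ellipsis Lexical Rule (lexeme-to-lexeme): keep only the first
    argument; everything else about the lexeme carries over unchanged."""
    if lexeme["TYPE"] != "auxv-lxm":
        return None
    return {**lexeme,
            "TYPE": "dervv-lxm",
            "ARG-ST": lexeme["ARG-ST"][:1]}   # < 2 > + A  becomes  < 2 >

could = {"TYPE": "auxv-lxm",
         "ARG-ST": ["NP", "VP"],
         "SEM": {"MODE": "prop", "RESTR": [{"RELN": "could"}]}}
elliptical = ellipsis(could)              # "Kim could." -- no VP, same semantics
```

Because the semantics is untouched, the elided VP's situation argument is left dangling, to be filled in by context.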

Ellipsis: A Sample Output
  < could , auxv-lxm
      HEAD   [ FORM fin, AUX +, POL -, AGR 1 ]
      VAL    [ SPR < [ AGR 1 ] > ]
      ARG-ST < NP >
      SEM    [ MODE prop, INDEX s1,
               RESTR < [ RELN could, SIT s1, ARG s2 ] > ] >

Ellipsis: A Sample Tree
  [S [NP Kim] [VP [V could] [VP [V have] [VP [V been] [VP attending the conference]]]]]

Semantics of Ellipsis
  [S [NP Kim] [VP could]]
What is the SEM value of the S node of this tree?
  [ INDEX s1
    MODE  prop
    RESTR < [ RELN name, NAME Kim, NAMED i ] , [ RELN could, SIT s1, ARG s2 ] > ]
Note: s2 has to be filled in by context.

Infinitival to Revisited
VP Ellipsis can occur after to: We didn't find the solution, but we tried to.
This is covered by our Ellipsis LR if we say to is AUX +.
Since AUX is declared on type verb, it follows that to is a verb.

do Revisited
Chomsky's old analysis: in sentences w/o auxiliaries...
  Tense can get separated from the verb in various ways:
    Negation/Reaffirmation inserts something between Tense and the following verb
    Inversion moves Tense to the left of the subject NP
    Ellipsis deletes what follows Tense
  When this happens, do is inserted to support Tense
Our counterpart:
  NICE properties hold only of auxiliaries
  do is a semantically empty auxiliary, so negated, reaffirmed, inverted, and elliptical sentences that are the semantic counterparts to sentences w/o auxiliaries are ones with do.

Summary
Our analysis employs straightforward mechanisms:
  Lexical entries for auxiliaries
  3 new features (AUX, POL, INV)
  4 lexical rules
We handle a complex array of facts:
  co-occurrence restrictions (ordering & iteration)
  the NICE properties
  auxiliary do
  combinations of NICE constructions

Overview
NICE properties of auxiliaries
The auxiliary do
NICE properties (lexical rules)
Reading questions

But first...
Midterms returned
Be sure to make use of answer keys
Thanksgiving game: Bagels, Kim likes.

Reading Questions
Why do we classify aux verbs that do not have any inflection (such as could and will) as verb-lxm rather than const-lxm?
Why is better an auxiliary?
To account for the semantic difference after inversion for aux verbs like shall on p. 414, what would the difference in semantics be for those two lexical entries with different INV values? Do they also show a semantic difference when the sentence is in the normal order (proposition)?

Reading Questions
This raised questions for me about how semimodals would be formalized: ought, for example, doesn't inflect but takes an infinitive as a complement instead of a bare base form.
  I'm not really sure inverting it sounds right.
  Its negation can appear in contracted form.
  It can undergo ellipsis, but keeps the to (--Who should drive? --Well, I think Steve ought to.)

Reading Questions
Do we want to classify rules as d-rules or i-rules whenever possible?
I'm curious why all these rules are pi-rules except for the Ellipsis Lexical Rule, since it seems like the pi-rule type could have applied here too. Is there a preference for using the more constrained type because it saves you rewriting information in the statement of the rule?
Does calling a rule a d-rule or a pi-rule actually imply any theoretical claims?

Reading Questions
The collection of constraints on pi-rule seems sort of random to me, particularly identifying the MOD list between input and output. Why are these particular constraints put on the pi-rule type?

Reading Questions
When introducing contraction, the book says that there are exceptions, such as *amn't and *mayn't. This rule seems to me to overproduce, unless F_NEG("am") produces am not. What in the Contraction Lexical Rule blocks forms like amn't and mayn't?
Is it just that the morphological function not having an entry for those words makes them fail to pass through the rule? How does that work when the morphological function is on the OUTPUT side, rather than restricting the INPUT side?

The Contraction Lexical Rule
  pi-rule
  INPUT  < 2 , [ HEAD   [ verb, FORM fin, AUX +, POL - ]
                 ARG-ST B
                 SEM    [ INDEX s1, RESTR A ] ] >
  OUTPUT < F_NEG( 2 ) , [ HEAD   [ POL + ]
                 VAL    [ SPR < X > ]
                 ARG-ST B
                 SEM    [ INDEX s2,
                          RESTR < [ RELN not, SIT s2, ARG s1 ] > ⊕ A ] ] >

Reading Questions
In figure (51) on page 406, in the OUTPUT, why is the SPR value Z?
Why does the rule even have to mention the specifier? The specifier doesn't change, and all of the moving around of arguments is done in ARG-ST.

Reading Questions
When not follows an auxiliary, can we assume that it is always an instance of sentence negation?
Since the polarity markers we're concerned with here are ones with a certain syntactic role, would we consider reaffirming adverbs like indeed and absolutely, or even of course, to ever act as truly polar adverbs (e.g., Pat would indeed have left), even though they can also behave more like sentential adverbs (e.g., Indeed, Pat would have left) -- and would we then just have multiple entries for these different roles?

Reading Questions
If negation can be indicated both by the feature POL +/- (in the form of not) and through contraction (n't), what (in the grammar) governs which one appears where?
How do we handle cases like those given in (43)?
  *Sandy did NOT SO write that.
  *Sandy did NOT TOO write that.

Reading Questions
Also, I don't really get how sentences like Leslie did SO not go to the party are constituent negation rather than sentential negation. Is it just the addition of SO that makes it become constituent negation?

Reading Questions
Intuitively, I feel that the Inversion Rule's input should be a sentence, not a word. Can you explain the reasoning behind choosing a pi-rule instead of a sentence rule (which doesn't exist in the grammar fragment)?
p. 413 addresses how what would be the verb's SPR is now the first item on COMPS. How is it that we make what is now on COMPS still function as the subject of the verb?

The Inversion Lexical Rule
  pi-rule
  INPUT  < W , [ HEAD   [ verb, FORM fin, AUX + ]
                 VAL    [ SPR < X > ]
                 ARG-ST A
                 SEM    [ MODE prop ] ] >
  OUTPUT < Z , [ HEAD   [ INV + ]
                 VAL    [ SPR < > ]
                 ARG-ST A
                 SEM    [ MODE ques ] ] >

Reading Questions
Why is the ellipsis rule a d-rule?
Does the ellipsis lexical rule work any differently for elliptical counterparts to sentences without an auxiliary? For example:
  (i) Their son eats dinner at five, and the dog eats dinner at five.
  (ii) Their son eats dinner at five, and the dog does too.
What ensures do has the appropriate tense?
How are we handling the semantics of ellipsis?

The Ellipsis Lexical Rule
  d-rule
  INPUT  < 1 , auxv-lxm  [ ARG-ST < 2 > ⊕ A ] >
  OUTPUT < 1 , dervv-lxm [ ARG-ST < 2 > ] >
Note that this is a derivational LR (d-rule) -- that is, lexeme-to-lexeme
This means that SYN and SEM are unchanged, by default

Reading Questions
The ellipsis rule shows the deletion of the boxed A after the ARG-ST of the input. Does this mean that ellipsis can only ever remove the last argument on the ARG-ST?
Why do we need dervv-lxm?

Reading Questions
It seems like the context of a sentence also contributes to whether or not it's grammatical. Is that correct? If it is, do we have a way to model that?
  You couldn't have lifted the heavy weight. -- Could too. / *Could.

Reading Questions
You couldn't have lifted the heavy weight. -- Could too. / Could not. / Could.
Also: What would you like with that? -- Gravy.

Reading Questions
Is is in Kim isn't a doctor really semantically empty?
When grammar engineers look for examples of grammatical vs. ungrammatical sentences to help them design rules (e.g., the SO/TOO examples in this chapter), do they usually just rely on their own knowledge of the language? Are there empirical methods that make the process easier, like mining a corpus for usage examples?