Chapter 4: Valence & Agreement

Reminder: Where We Are

A simple CFG doesn't allow us to cross-classify categories: verbs can be grouped by transitivity (deny vs. disappear) or by number (deny vs. denies), but not both at once. So we broke categories down into feature structures and began constructing a hierarchy of types of feature structures. This allows us to schematize rules and state cross-categorial generalizations, while still making fine distinctions.
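To make this concrete, here is a minimal sketch (my own encoding, not the book's formalism) of categories as feature structures, using plain Python dicts; the entries mirror the slide's examples.

```python
# Sketch: categories as feature structures (plain dicts), so a rule can
# mention just the features it cares about. Entries are illustrative.

deny      = {"pos": "verb", "COMPS": ["NP"], "AGR": {"PER": "3rd", "NUM": "pl"}}
denies    = {"pos": "verb", "COMPS": ["NP"], "AGR": {"PER": "3rd", "NUM": "sg"}}
disappear = {"pos": "verb", "COMPS": [],     "AGR": {"PER": "3rd", "NUM": "pl"}}

# One generalization can cut across another: group by transitivity...
transitive = [w for w in (deny, denies, disappear) if w["COMPS"]]
# ...or by number, without multiplying atomic labels like "V-tr-3sg".
singular   = [w for w in (deny, denies, disappear) if w["AGR"]["NUM"] == "sg"]
```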

Generalizing Transitivity: Valence

(1) a. Pat relies on Kim.
    b. *Pat relies.
    c. The child put the toy on the table.
    d. *The child put the toy.
    e. The teacher became angry with the students.
    f. *The teacher became.
    g. The jury believed the witness lied.

(2) The guests ate (the cheese).

Valence is not entirely semantic

(3) a. The guests devoured the meal.
    b. *The guests devoured.
    c. *The guests dined the meal.
    d. The guests dined.
    e. The guests ate the meal.
    f. The guests ate.

Complements

Head-Complement Rule:

  [phrase, COMPS < >]  →  H [word, COMPS <[1], ..., [n]>]  [1] ... [n]

This allows for arbitrary numbers of complements, but only applies when there is at least one. Heads in English probably never have more than 3 or 4 complements.

  [phrase, verb, COMPS < >]
  ├── H: [word, verb, COMPS <[1], [2]>]   put
  ├── [1] NP   the flowers
  └── [2] PP   in a vase
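A sketch of how the Head-Complement Rule can be read as a check on local trees; the dict encoding and the function name head_complement are illustrative assumptions, not the book's machinery.

```python
# Sketch of the Head-Complement Rule: a word combines with exactly the
# complements on its COMPS list, yielding a phrase whose COMPS is empty.
def head_complement(head, complements):
    """Return the mother node, or None if the rule does not apply."""
    if head["level"] != "word" or not head["COMPS"]:
        return None                           # needs a word with >=1 complement
    if head["COMPS"] != [c["cat"] for c in complements]:
        return None                           # complements must match, in order
    return {"level": "phrase", "cat": head["cat"],
            "SPR": head["SPR"], "COMPS": [],  # requirements now saturated
            "daughters": [head] + complements}

put     = {"level": "word", "cat": "verb", "SPR": ["NP"], "COMPS": ["NP", "PP"]}
flowers = {"level": "phrase", "cat": "NP"}    # "the flowers"
in_vase = {"level": "phrase", "cat": "PP"}    # "in a vase"

vp = head_complement(put, [flowers, in_vase])          # "put the flowers in a vase"
assert vp["COMPS"] == []
assert head_complement(put, [flowers]) is None         # cf. *"the child put the toy"
```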

Adjective Valence

(10) a. The children are happy.
     b. The children are happy with the ice cream.
     c. The children are happy that they have ice cream.
     d. *The children are happy of ice cream.
     e. *The children are fond.
     f. *The children are fond with the ice cream.
     g. *The children are fond that they have ice cream.
     h. The children are fond of ice cream.

Noun Valence

(11) a. A magazine appeared on the newsstands.
     b. A magazine about crime appeared on the newsstands.
     c. Newsweek appeared on the newsstands.
     d. *Newsweek about crime appeared on the newsstands.
     e. The report surprised many people.
     f. The report that crime was declining surprised many people.
     g. The book surprised many people.
     h. *The book that crime was declining surprised many people.

Preposition Valence

(12) a. The storm arrived after the picnic.
     b. The storm arrived after we ate lunch.
     c. The storm arrived during the picnic.
     d. *The storm arrived during we ate lunch.
     e. *The storm arrived while the picnic.
     f. The storm arrived while we ate lunch.

  [phrase, adj, COMPS < >]
  ├── H: [word, adj, COMPS <[1]>]   fond
  └── [1] [phrase, prep]   of ice cream

The Parallelism between S and NP

Motivation:
- pairs like Chris lectured about syntax and Chris's lecture about syntax
- both S and NP exhibit agreement:
  The bird sings/*sing vs. The birds sing/*sings
  this/*these bird vs. these/*this birds

So we treat NP as the saturated category of type noun and S as the saturated category of type verb.

Specifiers

Head-Specifier Rule:

  [phrase, SPR < >]  →  [1]  H [SPR <[1]>]

Combines the rules expanding S and NP. In principle it also generalizes to other categories.
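The same style of sketch for the Head-Specifier Rule, again with an assumed dict encoding rather than the book's notation.

```python
# Sketch of the Head-Specifier Rule: a phrase still seeking a specifier
# combines with it, and the mother's SPR list becomes empty.
def head_specifier(spec, head):
    """Return the mother node, or None if the SPR requirement isn't met."""
    if len(head["SPR"]) != 1 or head["SPR"][0] != spec["cat"]:
        return None
    return {"level": "phrase", "cat": head["cat"],
            "SPR": [], "COMPS": head["COMPS"],
            "daughters": [spec, head]}

vp = {"level": "phrase", "cat": "verb", "SPR": ["NP"], "COMPS": []}
np = {"level": "phrase", "cat": "NP",   "SPR": [],     "COMPS": []}

s = head_specifier(np, vp)        # S: the fully saturated verbal category
assert s["SPR"] == [] == s["COMPS"]
```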

Different Types of SPR Values

  For verbs:  [SPR <NP>]
  For nouns:  [SPR <det>]

Some Abbreviations

  NOM = [noun, SPR <X>]        VP = [verb, SPR <X>]
  NP  = [noun, SPR < >]        S  = [verb, SPR < >]

Getting SPR Information Up the Tree

How does the head's SPR value get to NOM and VP? We could revise the Head-Complement Rule:

  [phrase, SPR A]  →  H [word, SPR A, COMPS <[1], ..., [n]>]  [1] ... [n]

A More General Solution

The Valence Principle: Unless the rule says otherwise, the mother's values for the valence features (SPR and COMPS) are identical to those of the head daughter.

More on the Valence Principle

Intuitively, the valence features list the contextual requirements that haven't yet been found. This way of thinking about it (like talk of 'cancellation') is bottom-up and procedural. But formally, the Valence Principle (like most of the rest of our grammar) is just a well-formedness constraint on trees, without inherent directionality.
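A sketch of the Valence Principle as exactly such a declarative check on a local tree; the function and its overridden parameter are my own illustrative devices.

```python
# Sketch of the Valence Principle as a well-formedness check, not a build
# procedure: the mother must copy SPR and COMPS from the head daughter
# unless the rule itself said otherwise.
def obeys_valence_principle(mother, head_daughter, overridden=()):
    return all(mother[f] == head_daughter[f]
               for f in ("SPR", "COMPS") if f not in overridden)

vp     = {"SPR": ["NP"], "COMPS": []}
head_v = {"SPR": ["NP"], "COMPS": ["NP"]}

# The Head-Complement Rule overrides COMPS (it empties it on the mother),
# so only SPR needs to match between VP and its head V:
print(obeys_valence_principle(vp, head_v, overridden=("COMPS",)))  # True
print(obeys_valence_principle(vp, head_v))                         # False
```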

  [phrase, verb, SPR < >]
  ├── [1] [word, noun, SPR < >]   Alex
  └── H: [phrase, verb, SPR <[1]>]
      ├── H: [word, verb, SPR <[1]>, COMPS <[2]>]   likes
      └── [2] [phrase, noun, SPR < >]
          ├── [3] [word, det, SPR < >]   the
          └── H: [word, noun, SPR <[3]>]   opera

Specifiers of Other Categories

(31) a. They want/preferred them arrested.
     b. We want/preferred them on our team.
     c. With them on our team, we'll be sure to win.
     d. With my parents as supportive as they are, I'll be in fine shape.

Modifiers

Not all elements following the head of a phrase are complements; modifiers are not listed in the head's COMPS value.

The Head-Modifier Rule (early version):

  [phrase]  →  H  PP

Coordination Rule (Chapter 4 version)

  [1]  →  [1]+  [word, conj]  [1]

Agreement

Two kinds so far: subject-verb and determiner-noun:
  diamonds sparkle(*s)
  these/*this diamonds

This could be handled via stipulation in the Head-Specifier Rule. But if we want to use this rule for categories that don't have the AGR feature (such as PPs and APs, in English), we can't build it into the rule.

The Specifier-Head Agreement Constraint (SHAC)

Verbs and nouns must be specified as:

  [AGR [1], SPR <[AGR [1]]>]

Subject-Verb Agreement

  walks: [verb, AGR [1] [PER 3rd, NUM sg], SPR <NP[AGR [1]]>]
  Kim:   [noun, AGR [PER 3rd, NUM sg], SPR < >]
  we:    [noun, AGR [PER 1st, NUM pl], SPR < >]

(36) a. Kim walks.
     b. *We walks.
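Agreement failure is just unification failure. A minimal sketch, assuming feature structures as nested dicts; tags like [1] correspond to values shared between the verb's AGR and its specifier's AGR.

```python
# Sketch: agreement as unification over nested dicts; None marks failure.
def unify(a, b):
    if not (isinstance(a, dict) and isinstance(b, dict)):
        return a if a == b else None          # atoms must be identical
    out = dict(a)
    for key, val in b.items():
        merged = unify(out[key], val) if key in out else val
        if merged is None:
            return None                       # conflicting values: no unifier
        out[key] = merged
    return out

walks_wants = {"PER": "3rd", "NUM": "sg"}     # AGR "walks" demands of its SPR
kim_agr = {"PER": "3rd", "NUM": "sg"}
we_agr  = {"PER": "1st", "NUM": "pl"}

print(unify(walks_wants, kim_agr))   # {'PER': '3rd', 'NUM': 'sg'} -> "Kim walks"
print(unify(walks_wants, we_agr))    # None -> "*We walks"
```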

  agr-cat
  ├── 3sing: [PER 3rd, NUM sg]
  └── non-3sing
      ├── 1sing: [PER 1st, NUM sg]
      └── non-1sing
          ├── 2sing: [PER 2nd, NUM sg]
          └── plural: [NUM pl]

Possible AGR Values

  1sing  [PER 1st, NUM sg]
  2sing  [PER 2nd, NUM sg]
  plural [PER 1st, NUM pl]
  plural [PER 2nd, NUM pl]
  plural [PER 3rd, NUM pl]
  3sing  [PER 3rd, NUM sg, GEND fem]
  3sing  [PER 3rd, NUM sg, GEND masc]
  3sing  [PER 3rd, NUM sg, GEND neut]

Determiner-Noun Agreement

(45) a. This dog barked.
     b. *This dogs barked.

(46) a. *These dog barked.
     b. These dogs barked.
     c. *Few dog barked.
     d. Few dogs barked.

(47) a. The dog barked.
     b. The dogs barked.

AGR values for nouns and the determiners they select via SPR <det>:

  person, boat, a, this:      [AGR 3sing]
  people, boats, few, these:  [AGR [PER 3rd, NUM pl]]
  the:                        [AGR [PER 3rd]]

The Count/Mass Distinction

Partially semantically motivated:
- mass terms tend to refer to undifferentiated substances (air, butter, courtesy, information)
- count nouns tend to refer to individuatable entities (bird, cookie, insult, fact)

But there are exceptions:
- succotash (mass) denotes a mix of corn and lima beans, so it's not undifferentiated
- furniture, footwear, cutlery, etc. refer to individuatable artifacts with mass terms
- cabbage can be either count or mass, but many speakers get lettuce only as mass
- borderline case: data

Our Formalization of the Count/Mass Distinction

Determiners are:
- [COUNT −] (much and, in some dialects, less),
- [COUNT +] (a, six, many, etc.), or
- lexically underspecified (the, all, some, no, etc.)

Nouns select appropriate determiners:
- count nouns say SPR <[COUNT +]>
- mass nouns say SPR <[COUNT −]>

Nouns themselves aren't marked for the feature COUNT, so the SHAC plays no role in count/mass marking.
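A minimal sketch of this selection, with an assumed DET_COUNT lexicon; None stands in for lexical underspecification.

```python
# Sketch of the count/mass analysis: COUNT is a feature of determiners only;
# a noun's SPR just names the COUNT value its determiner must have, and
# underspecified determiners (None) go with either kind of noun.
DET_COUNT = {"a": "+", "six": "+", "many": "+",
             "much": "-", "the": None, "all": None, "some": None, "no": None}

def combine(det, noun_spr_count):
    """True if the determiner satisfies the noun's SPR <[COUNT ...]> demand."""
    c = DET_COUNT[det]
    return c is None or c == noun_spr_count

print(combine("a", "+"))      # True:  "a dog"          (count noun)
print(combine("much", "+"))   # False: "*much dog"
print(combine("much", "-"))   # True:  "much furniture" (mass noun)
print(combine("the", "-"))    # True:  "the furniture"  (underspecified det)
```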

  dog: [word, noun, AGR [1], SPR < [det, AGR [1] 3sing, COUNT +] >]

  walks: [word, verb, AGR [1], SPR < [noun, AGR [1] 3sing, SPR < >] >]

  the: [word, det, SPR < >]

  [phrase, [0] verb, SPR < >]
  ├── [1] [phrase, noun, SPR < >]
  │   ├── [2] [word, det, AGR [3], COUNT +]   the
  │   └── H: [4] [word, noun, AGR [3] 3sing [PER 3rd, NUM sg], SPR <[2]>]   dog
  └── H: [word, [0] [verb, AGR [3]], SPR <[1]>]   walks

The Type Hierarchy (so far)

  feat-struc
  ├── expression
  │   ├── word
  │   └── phrase
  ├── val-cat [SPR, COMPS]
  ├── pos
  │   ├── agr-pos [AGR]
  │   │   ├── verb [AUX]
  │   │   ├── noun [CASE]
  │   │   └── det [COUNT]
  │   ├── adj
  │   ├── prep
  │   └── conj
  └── agr-cat [PER, NUM]
      ├── 3sing [GEND]
      └── non-3sing
          ├── 1sing
          └── non-1sing
              ├── 2sing
              └── plural

TYPE              FEATURES/CONSTRAINTS                              IST
feat-struc
expression                                                          feat-struc
word                                                                expression
phrase                                                              expression
val-cat           [SPR list(expression), COMPS list(expression)]    feat-struc
pos                                                                 feat-struc
agr-pos           [AGR agr-cat]                                     pos
verb              [AUX {+, −}]                                      agr-pos
noun              [CASE {nom, acc}]                                 agr-pos
det               [COUNT {+, −}]                                    agr-pos
adj, prep, conj                                                     pos
agr-cat           [PER {1st, 2nd, 3rd}, NUM {sg, pl}]               feat-struc
3sing             [PER 3rd, NUM sg, GEND {fem, masc, neut}]         agr-cat
non-3sing                                                           agr-cat
1sing             [PER 1st, NUM sg]                                 non-3sing
non-1sing                                                           non-3sing
2sing             [PER 2nd, NUM sg]                                 non-1sing
plural            [NUM pl]                                          non-1sing
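Because the IST column records immediate supertypes, the hierarchy maps directly onto class inheritance. A sketch (the class names are my own; the book's types are lowercase):

```python
# Sketch: the type hierarchy as Python classes, so that isinstance()/
# issubclass() play the role of subtyping. Feature declarations are noted
# in comments; subtypes inherit them.
class FeatStruc: pass
class Expression(FeatStruc): pass
class Word(Expression): pass
class Phrase(Expression): pass
class Pos(FeatStruc): pass
class AgrPos(Pos): pass           # declares AGR
class Verb(AgrPos): pass          # adds AUX
class Noun(AgrPos): pass          # adds CASE
class Det(AgrPos): pass           # adds COUNT
class AgrCat(FeatStruc): pass     # declares PER, NUM
class ThirdSing(AgrCat): pass     # 3sing: adds GEND
class NonThirdSing(AgrCat): pass  # non-3sing

# A constraint stated once on agr-pos automatically covers verb, noun, det:
print(all(issubclass(t, AgrPos) for t in (Verb, Noun, Det)))  # True
```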

Abbreviations

  S   = [verb, SPR < >]        NP  = [noun, SPR < >]
  VP  = [verb, SPR <X>]        NOM = [noun, SPR <X>]
  V   = [word, verb]           N   = [word, noun]
  D   = [word, det]

The Grammar Rules

Head-Specifier Rule:
  [phrase, SPR < >]  →  [1]  H [SPR <[1]>]

Head-Complement Rule:
  [phrase, COMPS < >]  →  H [word, COMPS <[1], ..., [n]>]  [1] ... [n]

Head-Modifier Rule:
  [phrase]  →  H  PP

Coordination Rule:
  [1]  →  [1]+  [word, conj]  [1]

The Principles

Head Feature Principle (HFP): In any headed phrase, the HEAD value of the mother and the HEAD value of the head daughter must be identical.

Valence Principle: Unless the rule says otherwise, the mother's values for the valence features (SPR and COMPS) are identical to those of the head daughter.

Specifier-Head Agreement Constraint (SHAC): Verbs and common nouns must be specified as:
  [AGR [1], SPR <[AGR [1]]>]

Sample Lexicon

  I:         [word, noun, AGR 1sing, SPR < >]
  dog:       [word, noun, AGR 3sing, SPR <D[COUNT +]>]
  furniture: [word, noun, AGR 3sing, SPR <D[COUNT −]>]

More Lexicon

  a:     [word, det, AGR 3sing, COUNT +, SPR < >]
  much:  [word, det, AGR 3sing, COUNT −, SPR < >]
  barks: [word, verb, AGR 3sing, SPR <NP>]
  like:  [word, verb, AGR non-3sing, SPR <NP>, COMPS <NP>]

Chapter 5: Semantics

Building a precise model

Some of our statements are statements about how the model works:
- prep and [AGR 3sing] can't be combined, because AGR is not a feature of the type prep.

Some of our statements are statements about how (we think) English or language in general works:
- The determiners a and many only occur with count nouns, the determiner much only occurs with mass nouns, and the determiner the occurs with either.

Some are statements about how we code a particular linguistic fact within the model:
- All count nouns are [SPR <[COUNT +]>].

Semantics: Where's the Beef?

So far, our grammar has no semantic representations. We have, however, been relying on semantic intuitions in our argumentation, and discussing semantic contrasts where they line up (or don't) with syntactic ones.

Examples:
- structural ambiguity
- S/NP parallelism
- count/mass distinction
- complements vs. modifiers

Our Slice of a World of Meanings

Aspects of meaning we won't account for:
- Pragmatics
- Fine-grained lexical semantics: the meaning of life is life, or, in our case, [RELN life, INST i]

Our Slice of a World of Meanings

  [MODE  prop,
   INDEX s,
   RESTR < [RELN save, SIT s, SAVER i, SAVED j],
           [RELN name, NAME Chris, NAMED i],
           [RELN name, NAME Pat, NAMED j] >]

"... the linguistic meaning of Chris saved Pat is a proposition that will be true just in case there is an actual situation that involves the saving of someone named Pat by someone named Chris." (p. 140)

Our Slice of a World of Meanings

What we are accounting for is the compositionality of sentence meaning:
- How the pieces fit together: semantic arguments and indices
- How the meanings of the parts add up to the meaning of the whole: appending RESTR lists up the tree
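In miniature, for Chris saved Pat (a sketch, assuming predications as dicts and indices as plain strings):

```python
# Sketch of RESTR-list composition: each word contributes a list of
# predications, and the sentence's RESTR is their concatenation in tree
# order, with the indices i and j shared across predications.
chris = [{"RELN": "name", "NAME": "Chris", "NAMED": "i"}]
saved = [{"RELN": "save", "SIT": "s", "SAVER": "i", "SAVED": "j"}]
pat   = [{"RELN": "name", "NAME": "Pat", "NAMED": "j"}]

sentence_restr = chris + saved + pat   # Semantic Compositionality, in miniature
print(len(sentence_restr))             # 3 predications, linked through i and j
```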

Semantics in Constraint-Based Grammar

Constraints as (generalized) truth conditions:
- proposition: what must be the case for a proposition to be true
- directive: what must happen for a directive to be fulfilled
- question: the kind of situation the asker is asking about
- reference: the kind of entity the speaker is referring to

Syntax/semantics interface: constraints on how syntactic arguments are related to semantic ones, and on how semantic information is compiled from different parts of the sentence.

Feature Geometry

  [SYN  [pos,
         SPR   list(expression),
         COMPS list(expression)],
   SEM  [MODE  {prop, ques, dir, ref, none},
         INDEX {i, j, k, ... s1, s2, ...},
         RESTR list(pred)]]

How the Pieces Fit Together

  Dana: [word,
         SYN [noun, AGR 3sing, SPR < >],
         SEM [MODE ref,
              INDEX i,
              RESTR < [RELN name, NAME Dana, NAMED i] >]]

How the Pieces Fit Together

  slept: [word,
          SYN [verb, SPR <NP_j>],
          SEM [MODE prop,
               INDEX s1,
               RESTR < [RELN sleep, SIT s1, SLEEPER j], ... >]]

The Pieces Together

  S
  ├── [1] NP [SEM [INDEX i]]   Dana
  └── VP [SYN [SPR <[1]>],
          SEM [RESTR < [RELN sleep, SIT s1, SLEEPER i], ... >]]   slept

A More Detailed View of the Same Tree

  S [SEM [MODE ..., INDEX ..., RESTR ...]]   (to be filled in)
  ├── [1] NP_i [SEM [INDEX i,
                     RESTR < [RELN name, NAME Dana, NAMED i] >]]
  └── VP [SYN [SPR <[1]>],
          SEM [RESTR < [RELN sleep, SIT s1, SLEEPER i], ... >]]

To Fill in Semantics for the S-node

We need the Semantics Principles.

The Semantic Inheritance Principle: In any headed phrase, the mother's MODE and INDEX are identical to those of the head daughter.

The Semantic Compositionality Principle: (stated below)

Semantic Inheritance Illustrated

  S [SEM [MODE prop, INDEX s1, RESTR ...]]
  ├── [1] NP_i [SEM [INDEX i,
                     RESTR < [RELN name, NAME Dana, NAMED i] >]]
  └── VP [SYN [SPR <[1]>],
          SEM [RESTR < [RELN sleep, SIT s1, SLEEPER i], ... >]]

To Fill in Semantics for the S-node

We need the Semantics Principles.

The Semantic Inheritance Principle: In any headed phrase, the mother's MODE and INDEX are identical to those of the head daughter.

The Semantic Compositionality Principle: In any well-formed phrase structure, the mother's RESTR value is the sum of the RESTR values of the daughters.
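Both principles together, as a sketch over trees of nested dicts; the head field marking which daughter is the head is my own device, not the book's.

```python
# Sketch of the two semantics principles: MODE/INDEX come from the head
# daughter (Semantic Inheritance), RESTR is the concatenation of the
# daughters' RESTRs (Semantic Compositionality).
def sem_of(node):
    if "daughters" not in node:                    # a lexical entry
        return node["SEM"]
    dtr_sems = [sem_of(d) for d in node["daughters"]]
    head_sem = dtr_sems[node["head"]]              # position of head daughter
    return {"MODE": head_sem["MODE"],
            "INDEX": head_sem["INDEX"],
            "RESTR": [p for s in dtr_sems for p in s["RESTR"]]}

dana  = {"SEM": {"MODE": "ref", "INDEX": "i",
                 "RESTR": [{"RELN": "name", "NAME": "Dana", "NAMED": "i"}]}}
slept = {"SEM": {"MODE": "prop", "INDEX": "s1",
                 "RESTR": [{"RELN": "sleep", "SIT": "s1", "SLEEPER": "i"}]}}

s = sem_of({"daughters": [dana, slept], "head": 1})  # VP is the head daughter
print(s["MODE"], s["INDEX"], len(s["RESTR"]))        # prop s1 2
```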

Semantic Compositionality Illustrated

  S [SEM [MODE prop, INDEX s1,
          RESTR < [RELN name, NAME Dana, NAMED i],
                  [RELN sleep, SIT s1, SLEEPER i], ... >]]
  ├── [1] NP_i [SEM [INDEX i,
                     RESTR < [RELN name, NAME Dana, NAMED i] >]]
  └── VP [SYN [SPR <[1]>],
          SEM [RESTR < [RELN sleep, SIT s1, SLEEPER i], ... >]]

What Identifies Indices?

  S
  ├── [1] NP_i
  │   ├── D   the
  │   └── NOM_i   cat
  └── VP [SPR <[1]>]
      ├── VP [RESTR < [RELN sleep, SIT s3, SLEEPER i] >]   slept
      └── PP   on the mat

Other Aspects of Semantics

- Tense, quantification (only touched on here)
- Modification
- Coordination
- Structural ambiguity

Quantifiers

(44) A dog saved every family.

(45) a. There was some particular dog who saved every family.
     b. Every family was saved by some dog or other (not necessarily the same dog).

(46) a. (Exist i: dog(i))(All j: family(j)) save(i,j)
     b. (All j: family(j))(Exist i: dog(i)) save(i,j)

(47) save(i,j)

(48) (All j: family(j)) save(i,j)

Quantifier predications:

  [predication,
   RELN    exist,
   BV      i,
   QRESTR  predication,
   QSCOPE  predication]

Reading (46a): exist outscopes all

  RESTR < [RELN exist, BV i, QRESTR [1], QSCOPE [2]],
          [1] [RELN dog, INST i],
          [2] [RELN all, BV j, QRESTR [3], QSCOPE [4]],
          [3] [RELN family, INST j],
          [4] [RELN save, SAVER i, SAVED j] >

Reading (46b): all outscopes exist

  RESTR < [2] [RELN exist, BV i, QRESTR [1], QSCOPE [4]],
          [1] [RELN dog, INST i],
          [RELN all, BV j, QRESTR [3], QSCOPE [2]],
          [3] [RELN family, INST j],
          [4] [RELN save, SAVER i, SAVED j] >
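The two scopings can be checked against a toy model; the sets dogs, families, and saves below are hypothetical, chosen so the readings come apart.

```python
# Sketch: the two quantifier scopings of "A dog saved every family",
# evaluated in a toy model where each family was saved by a different dog.
dogs, families = {"d1", "d2"}, {"f1", "f2"}
saves = {("d1", "f1"), ("d2", "f2")}

# (46a) Some one dog saved every family -- exist outscopes all:
wide_exist = any(all((d, f) in saves for f in families) for d in dogs)
# (46b) Every family was saved by some dog or other -- all outscopes exist:
wide_all = all(any((d, f) in saves for d in dogs) for f in families)

print(wide_exist, wide_all)   # False True: the two readings really differ
```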

Evolution of a Phrase Structure Rule

Ch. 2:  NOM → NOM PP;  VP → VP PP
Ch. 3:  [phrase, COMPS itr]  →  H [phrase]  PP
Ch. 4:  [phrase]  →  H [phrase]  PP
Ch. 5:  [phrase]  →  H [1]  PP [SYN [MOD <[1]>]]
Ch. 5 (abbreviated):  [phrase]  →  H [1]  PP[MOD <[1]>]

Evolution of Another Phrase Structure Rule

Ch. 2:  X → X+ CONJ X

Ch. 3 and Ch. 4:  [1]  →  [1]+  [word, conj]  [1]

Ch. 5:
  [SYN [0], SEM [IND s0]]  →
      [SYN [0], SEM [IND s1]] ... [SYN [0], SEM [IND s(n-1)]]
      [SYN conj, SEM [IND s0, RESTR <[ARGS <s1, ..., sn>]>]]
      [SYN [0], SEM [IND sn]]

Ch. 5 (abbreviated):
  [0] [IND s0]  →  [0] [IND s1] ... [0] [IND s(n-1)]
      [conj, IND s0, RESTR <[ARGS <s1, ..., sn>]>]  [0] [IND sn]

Combining Constraints and Coordination

Coordination Rule:
  [0] [IND s0]  →  [0] [IND s1] ... [0] [IND s(n-1)]
      [conj, IND s0, RESTR <[ARGS <s1, ..., sn>]>]  [0] [IND sn]

Lexical Entry for a Conjunction:
  and: [SYN conj,
        SEM [INDEX s, MODE none, RESTR < [RELN and, SIT s] >]]

Combining Constraints and Coordination (continued)

Applying the Coordination Rule and the entry for and to Pat sings and Lee dances:

  S [IND s0]
  ├── S [IND s1]   Pat sings
  ├── conj [IND s0, RESTR < [RELN and, SIT s0, ARGS <s1, s2> ] >]   and
  └── S [IND s2]   Lee dances
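A sketch of the rule's semantic effect; the function coordinate and its index handling are illustrative assumptions, not the book's formulation.

```python
# Sketch of the Chapter 5 Coordination Rule's semantics: the conjunction
# contributes a predication whose ARGS collects the conjuncts' indices,
# and the mother's RESTR concatenates everything.
def coordinate(conjuncts, reln="and", s0="s0"):
    pred = {"RELN": reln, "SIT": s0,
            "ARGS": [c["IND"] for c in conjuncts]}
    restr = [p for c in conjuncts for p in c["RESTR"]] + [pred]
    return {"IND": s0, "RESTR": restr}

pat_sings  = {"IND": "s1",
              "RESTR": [{"RELN": "sing", "SIT": "s1", "SINGER": "k"}]}
lee_dances = {"IND": "s2",
              "RESTR": [{"RELN": "dance", "SIT": "s2", "DANCER": "j"}]}

s = coordinate([pat_sings, lee_dances])
print(s["IND"], s["RESTR"][-1]["ARGS"])   # s0 ['s1', 's2']
```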

Structural Ambiguity, Tree I

  S [IND s0]
  ├── [1] S [IND s0]
  │   ├── S [IND s1]   Pat sings
  │   ├── CONJ   and
  │   └── S [IND s2]   Lee dances
  └── ADV [MOD <[1]>]   frequently

  [IND s0, MODE prop,
   RESTR < [RELN name, NAME Pat, NAMED k],  [RELN sing, SIT s1, SINGER k],
           [RELN and, SIT s0, ARGS <s1, s2>],
           [RELN name, NAME Lee, NAMED j],  [RELN dance, SIT s2, DANCER j],
           [RELN frequently, ARG s0] >]

Structural Ambiguity, Tree II

  S [IND s0]
  ├── S [IND s1]   Pat sings
  ├── CONJ   and
  └── S [IND s2]
      ├── [1] S [IND s2]   Lee dances
      └── ADV [MOD <[1]>]   frequently

  [IND s0, MODE prop,
   RESTR < [RELN name, NAME Pat, NAMED k],  [RELN sing, SIT s1, SINGER k],
           [RELN and, SIT s0, ARGS <s1, s2>],
           [RELN name, NAME Lee, NAMED j],  [RELN dance, SIT s2, DANCER j],
           [RELN frequently, ARG s2] >]

Question About Structural Ambiguity

Why isn't this a possible semantic representation for the string Pat sings and Lee dances frequently?

  [IND s0, MODE prop,
   RESTR < [RELN name, NAME Pat, NAMED k],  [RELN sing, SIT s1, SINGER k],
           [RELN and, SIT s0, ARGS <s1, s2>],
           [RELN name, NAME Lee, NAMED j],  [RELN dance, SIT s2, DANCER j],
           [RELN frequently, ARG s1] >]

Semantic Compositionality

The two representations the grammar does license differ only in the adverb's argument:

  [IND s0, MODE prop, RESTR < ..., [RELN frequently, ARG s0] >]   (Tree I)
  [IND s0, MODE prop, RESTR < ..., [RELN frequently, ARG s2] >]   (Tree II)

(The other predications are as in Trees I and II. Because frequently combines with a constituent it modifies, its ARG must be that constituent's index: s0 for the whole coordination or s2 for Lee dances, never s1.)

Semantic Compositionality The meaning of a phrase is determined by the meaning of its parts and how they are put together.

The Type Hierarchy

  feat-struc
  ├── expression [SYN, SEM]
  │   ├── word
  │   └── phrase
  ├── syn-cat
  ├── sem-cat [MODE, INDEX, RESTR]
  ├── predication
  ├── val-cat [SPR, COMPS, MOD]
  ├── pos
  │   ├── agr-pos [AGR]
  │   │   ├── verb [AUX]
  │   │   ├── noun [CASE]
  │   │   └── det [COUNT]
  │   ├── adj
  │   ├── prep
  │   ├── adv
  │   └── conj
  └── agr-cat [PER, NUM]
      ├── 3sing [GEND]
      └── non-3sing
          ├── 1sing
          └── non-1sing
              ├── 2sing
              └── plural

Feature Declarations & Type Constraints

TYPE                  FEATURES/CONSTRAINTS                              IST
feat-struc
expression            [SYN syn-cat, SEM sem-cat]                        feat-struc
word, phrase                                                            expression
syn-cat                                                                 feat-struc
sem-cat               [MODE {prop, ques, dir, ref, none},               feat-struc
                       INDEX {i, j, k, ..., s1, s2, ...},
                       RESTR list(predication)]
predication           [RELN {love, walk, ...}, ...]                     feat-struc
val-cat               [SPR list(expression),                            feat-struc
                       COMPS list(expression),
                       MOD list(expression)]
pos                                                                     feat-struc
agr-pos               [AGR agr-cat]                                     pos
verb                  [AUX {+, −}]                                      agr-pos
noun                  [CASE {nom, acc}]                                 agr-pos
det                   [COUNT {+, −}]                                    agr-pos
adj, prep, adv, conj                                                    pos
agr-cat               [PER {1st, 2nd, 3rd}, NUM {sg, pl}]               feat-struc
3sing                 [PER 3rd, NUM sg, GEND {fem, masc, neut}]         agr-cat
non-3sing                                                               agr-cat
1sing                 [PER 1st, NUM sg]                                 non-3sing
non-1sing                                                               non-3sing
2sing                 [PER 2nd, NUM sg]                                 non-1sing
plural                [NUM pl]                                          non-1sing

Abbreviations

  S    = [SYN [verb, SPR < >]]      NP_i = [SYN [noun, SPR < >], SEM [INDEX i]]
  VP   = [SYN [verb, SPR <X>]]      NOM  = [SYN [noun, SPR <X>]]
  V    = [word, SYN [verb]]         N    = [word, SYN [noun]]
  PP   = [SYN [prep]]               P    = [word, SYN [prep]]
  AP   = [SYN [adj]]                A    = [word, SYN [adj]]
  DP   = [SYN [det, SPR < >]]

The Grammar Rules

Head-Specifier Rule:
  [phrase, SYN [SPR < >]]  →  [1]  H [SYN [SPR <[1]>]]

Head-Complement Rule:
  [phrase, SYN [COMPS < >]]  →  H [word, SYN [COMPS <[1], ..., [n]>]]  [1] ... [n]

Head-Modifier Rule:
  [phrase]  →  H [1]  [SYN [MOD <[1]>]]

Coordination Rule:
  [SYN [0], SEM [IND s0]]  →
      [SYN [0], SEM [IND s1]] ... [SYN [0], SEM [IND s(n-1)]]
      [SYN conj, SEM [IND s0, RESTR <[ARGS <s1, ..., sn>]>]]
      [SYN [0], SEM [IND sn]]

Old Principles

Head Feature Principle (HFP): In any headed phrase, the HEAD value of the mother and the HEAD value of the head daughter must be identical.

Valence Principle: Unless the rule says otherwise, the mother's values for the valence features (SPR, COMPS, and MOD) are identical to those of the head daughter.

Specifier-Head Agreement Constraint (SHAC): Verbs and common nouns must be specified as:
  [SYN [AGR [1], SPR <[AGR [1]]>]]

New Principles Semantic Inheritance Principle In any headed phrase, the mother's MODE and INDEX values are identical to those of the head daughter. Semantic Compositionality Principle In any well-formed phrase structure, the mother's RESTR value is the sum of the RESTR values of the daughters.

Sample Lexicon

  dog:  [SYN [noun, AGR 3sing, SPR <DP_i>, MOD < >],
         SEM [MODE ref, INDEX i,
              RESTR < [RELN dog, INST i] >]]

  Kim:  [SYN [noun, AGR 3sing, SPR < >, MOD < >],
         SEM [MODE ref, INDEX i,
              RESTR < [RELN name, NAME Kim, NAMED i] >]]

  love: [SYN [verb, SPR <NP_i>, COMPS <NP[acc]_j>, MOD < >],
         SEM [MODE prop, INDEX s,
              RESTR < [RELN love, SIT s, LOVER i, LOVED j] >]]

Sample Lexicon, Continued

  today: [SYN [adv, SPR < >, MOD <VP>],
          SEM [MODE none, INDEX s,
               RESTR < [RELN today, ARG s] >]]

  and:   [SYN conj,
          SEM [MODE none, INDEX s,
               RESTR < [RELN and, SIT s] >]]

  a:     [word,
          SYN [det, AGR 3sing, COUNT +, SPR < >, MOD < >],
          SEM [MODE none, INDEX i,
               RESTR < [RELN exist, BV i] >]]