Psychology of Language

PSYCH 155/LING 155 UCI COGNITIVE SCIENCES syn lab
Psychology of Language
Prof. Jon Sprouse
Lecture 18: Parsing

Grammar

A grammar is a set of equations/rules that generates all (and only) the sentences of a given language. Sentences have hierarchical structure: words combine to form phrases, which combine to form larger phrases, and so on.

[Slide shows a phrase structure tree (S, NP, VP, PP, D, N, V, P) for "the boy ate the cookies after the party".]
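
To make this concrete, here is a minimal sketch (in Python, not from the lecture) of how the slide's rules and tree could be written down as data. The rule set is assumed from the standard labels in the slide's diagram.

```python
# A toy phrase structure grammar as (left-hand side, right-hand side) rules.
# Labels assumed from the slide's tree: S, NP, VP, PP, D, N, V, P.
GRAMMAR = [
    ("S",  ["NP", "VP"]),       # a sentence = noun phrase + verb phrase
    ("NP", ["D", "N"]),         # "the boy", "the cookies", "the party"
    ("VP", ["V", "NP", "PP"]),  # "ate the cookies after the party"
    ("PP", ["P", "NP"]),        # "after the party"
]

# The hierarchical structure of the slide's example sentence, written as
# nested tuples of the form (label, child1, child2, ...):
TREE = ("S",
        ("NP", ("D", "the"), ("N", "boy")),
        ("VP", ("V", "ate"),
               ("NP", ("D", "the"), ("N", "cookies")),
               ("PP", ("P", "after"),
                      ("NP", ("D", "the"), ("N", "party")))))
```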

The problem of sequential input

The problem: Words are delivered in a specific temporal order, with no information about the hierarchical structure. This means that your brain needs to build the structure from the words alone.

[Slide shows the tree (S, NP, VP, PP, P, D, N, V) above the word string "This string of words forms a sentence".]

Grammar can't do it alone

A grammar is just a static list of rules. It doesn't specify the order in which the rules should be applied.

This is similar to the rules of arithmetic:
Addition: x + y
Subtraction: x - y
Multiplication: x * y
Division: x / y
Exponentiation: x^y

Knowing the rules of arithmetic isn't enough to solve a complex problem; you also need to know the "order of operations": Please Excuse My Dear Aunt Sally.
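
The analogy can be made concrete with a worked example (the numbers are arbitrary): the same arithmetic rules, applied in different orders, give different results.

```python
# Knowing the rules (+ and *) isn't enough; the order of application matters.
pemdas        = 2 + (3 * 4)  # 14: multiplication first (Please Excuse My Dear Aunt Sally)
left_to_right = (2 + 3) * 4  # 20: strict left-to-right application
assert pemdas == 14 and left_to_right == 20
```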

What we need is a Parser

A grammar is a set of equations/rules that generates all (and only) the sentences of a given language. Sentences have hierarchical structure: words combine to form phrases, which combine to form larger phrases, and so on.

A parser is a cognitive system for building hierarchical structure from the sequential input of words based on the rules of the grammar. Just like there can be any number of orders of operations for arithmetic rules, there can be any number of parsers (that is, any number of ways to build hierarchical structure from sequential input). It is an empirical question which method of parsing the brain actually uses.

Extreme hypothesis 1: Hypothesis-Driven Parsing

Also known as top-down parsing: the parser predicts a structure before the word is encountered, and then checks to see if the word matches the prediction.

[Slide shows the predicted tree above the word string "This string of words forms a sentence".]
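
For concreteness, here is a minimal sketch (in Python, not the lecture's own formulation) of a hypothesis-driven parser. The toy rules and lexicon are the ones from the left-corner slide below ("The girl hugs the boy"); the function names are illustrative.

```python
# Toy grammar and lexicon, assumed from the left-corner slide below.
RULES = {
    "S":  [["NP", "VP"]],
    "NP": [["D", "N"]],
    "VP": [["V", "NP"]],
}
LEXICON = {"the": "D", "girl": "N", "boy": "N", "hugs": "V"}

def td_parse(cat, words, i):
    """Top-down: predict `cat` first, then check it against the input.
    Returns (tree, next_position) on success, or None on failure."""
    # Check step: does the next word match the predicted category?
    if i < len(words) and LEXICON.get(words[i]) == cat:
        return (cat, words[i]), i + 1
    # Predict step: expand the hypothesis using each rule for `cat`.
    for rhs in RULES.get(cat, []):
        children, j = [], i
        for daughter in rhs:
            result = td_parse(daughter, words, j)
            if result is None:
                break                   # this hypothesis failed; try the next rule
            tree, j = result
            children.append(tree)
        else:
            return (cat, *children), j  # every predicted daughter was found
    return None

print(td_parse("S", "the girl hugs the boy".split(), 0))
```

Notice that the parser commits to a structure before looking at any words; with more than one rule per category it would have to backtrack and revise, which is exactly the problem described on the next slide.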

The problem with H-driven parsing

We can already see that H-driven parsing may require several revisions if the hypothesis is incorrect.

[Slide shows the predicted tree being checked against the word string "This string of words forms a sentence".]

Extreme hypothesis 2: Data-Driven Parsing

Also known as bottom-up parsing: the parser uses information about the word to generate the structure.

[Slide shows the tree being built upward from the word string "This string of words forms a sentence".]
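
A data-driven parser can be sketched as a shift-reduce loop (again illustrative Python, using the same RULES and LEXICON as the top-down sketch above): shift each word onto a stack, and reduce whenever the top of the stack matches the right-hand side of a rule.

```python
# RULES and LEXICON as defined in the top-down sketch above.

def try_reduce(stack):
    """Apply one reduction if the top of the stack matches some rule."""
    for lhs, alternatives in RULES.items():
        for rhs in alternatives:
            n = len(rhs)
            if len(stack) >= n and [node[0] for node in stack[-n:]] == rhs:
                stack[-n:] = [(lhs, *stack[-n:])]  # replace daughters with parent
                return True
    return False

def bu_parse(words):
    """Bottom-up: the words drive structure building; nothing is predicted."""
    stack = []
    for w in words:
        stack.append((LEXICON[w], w))  # shift: the category is read off the word
        while try_reduce(stack):       # reduce as far as the rules allow
            pass
    return stack                       # [("S", ...)] if the parse succeeded

print(bu_parse("the girl hugs the boy".split()))
```

With this toy grammar the greedy reduce always happens to pick the right move; the next slide points out that a purely data-driven parser has no principled way to choose when more than one option is available.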

The problem with D-driven parsing

We can already see that D-driven parsing will stall out if there is more than one option available (that is, given two choices, it must have a hypothesis about which is correct, but by definition, D-driven parsing has no hypotheses!)

[Slide shows the tree being built from the word string "This string of words forms a sentence".]

Left-Corner Parsing

What we need is a middle ground: a parser that uses some information from the words to choose a hypothesis (prediction) about what the structure is. Question: What information does the parser use to make its predictions?

A left-corner parser combines both H-driven and D-driven parsing by using syntactic categories and phrase structure rules to parse the sentence. The first item after the arrow is the left corner of the rule.

Phrase Structure Rules:
S -> NP VP
NP -> D N
VP -> V NP

Lexicon:
V -> hug
N -> boy
N -> girl
D -> the

[Slide shows the parse in progress: "The girl_ hugs the boy".]
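
One way to make the combination concrete (illustrative Python again, with RULES and LEXICON as in the sketches above): read a word bottom-up, then use its category as the left corner of a rule to predict the rest of that rule top-down.

```python
# RULES and LEXICON as defined in the top-down sketch above.

# Index the rules by their left corner:
# left corner -> [(parent category, remaining daughters)].
LEFT_CORNER = {}
for lhs, alternatives in RULES.items():
    for rhs in alternatives:
        LEFT_CORNER.setdefault(rhs[0], []).append((lhs, rhs[1:]))

def lc_parse(goal, words, i):
    """Recognize `goal` starting at word i; return the next position or None."""
    if i >= len(words):
        return None
    category = LEXICON[words[i]]   # bottom-up (data-driven) step: read the word
    return project(category, goal, words, i + 1)

def project(category, goal, words, i):
    """Top-down (hypothesis-driven) step: treat `category` as the left corner
    of a rule, predict the rule's remaining daughters, then project the parent
    upward toward `goal`."""
    if category == goal:
        return i
    for parent, remaining in LEFT_CORNER.get(category, []):
        j = i
        for daughter in remaining:   # prove each predicted daughter in turn
            j = lc_parse(daughter, words, j)
            if j is None:
                break
        else:
            result = project(parent, goal, words, j)
            if result is not None:
                return result
    return None

print(lc_parse("S", "the girl hugs the boy".split(), 0))  # 5 = all five words consumed
```

After reading just "The" (category D, the left corner of NP -> D N), the parser already predicts a coming N; after completing the NP (the left corner of S -> NP VP), it predicts a coming VP. The word supplies the data; the rule supplies the hypothesis.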

Predictions of a left-corner parser

If the human parser is a left-corner parser, then we would predict that the earliest stage of sentence processing would be driven by the syntactic category of the word, and the phrase structure rules that the word can appear in. We can use electroencephalography to test this prediction because it provides very good temporal resolution.

[Slide shows a sample ERP waveform, with labels: Negative Voltage / Negative Component, Positive Voltage / Positive Component.]

Predictions of a left-corner parser

If the human parser is a left-corner parser, then we would predict that the earliest stage of sentence processing would be driven by the syntactic category of the word, and the phrase structure rules that the word can appear in. We can test this prediction by violating the phrase structure rules of a sentence, and looking to see how early the response to the violation appears:

The scientist criticized Max's proof of the theorem
This sentence is grammatical, and serves as a control to show us what the normal response to "of" looks like.

*The scientist criticized Max's of proof the theorem
This sentence is ungrammatical because there is no phrase structure rule that allows a preposition to follow a determiner.

[Slide annotates the critical words with tree fragments (P, PP).]

Predictions of a left-corner parser

If the human parser is a left-corner parser, then we would predict that the earliest stage of sentence processing would be driven by the syntactic category of the word, and the phrase structure rules that the word can appear in.

The scientist criticized Max's proof of the theorem
*The scientist criticized Max's of proof the theorem

The ungrammatical sentence causes a negative peak in amplitude 150ms - 250ms after the onset of the preposition. We call this the Early Left Anterior Negativity (ELAN) because it is early, it appears over left anterior scalp locations, and it is negative.

How early is the ELAN?

The ELAN occurs in response to phrase structure violations 150ms - 250ms after the onset of the critical word. But is this early enough to satisfy the prediction of a left-corner parser? Is it the earliest stage of sentence processing?

Well, to figure out if this is the earliest stage, we need to look at other stages of sentence processing to see if any are (or could be) earlier. Recall that shadowing experiments suggest that lexical access occurs around 150ms - 200ms after word onset.

[Slide illustrates shadowing: Speaker 1 produces word1 (375ms); Speaker 2, the shadower, repeats word1 (375ms) at a 200ms lag.]

Because sentence processing requires words, the time course of lexical access provides a lower limit on the speed of sentence processing. The ELAN (150-250ms) occurs in approximately the same window as lexical access (150-200ms), so this is very early in sentence processing!

How early is the ELAN?

The ELAN occurs in response to phrase structure violations 150ms - 250ms after the onset of the critical word. We can also look for other processes to see if any occur before the ELAN. For example, we know that the brain must construct a meaning for the sentence. We can break that meaning and look for the time course of when meaning is constructed.

He spread the warm bread with butter.
*He spread the warm bread with socks.

The incongruous sentence causes a negative peak in amplitude 300ms - 500ms after the onset of the incongruous word. We call this the N400 because it is negative, and peaks around 400ms after the word.

How early is the ELAN?

Lexical access occurs around 150ms - 200ms after the onset of the word.
The ELAN occurs in response to phrase structure violations 150ms - 250ms after the onset of the critical word.
The N400 occurs in response to semantic violations 300ms - 500ms after the onset of the critical word.

These facts suggest that the ELAN is indexing an extremely early stage of sentence processing -- as predicted by a left-corner parsing architecture!

Syntax precedes Semantics

A final prediction of a left-corner parser is that syntactic processing (building hierarchical structure according to phrase structure rules) will precede semantic processing (deriving a meaning from the sentence). The relative ordering of the ELAN and the N400 suggests that this might be true:

The ELAN occurs in response to phrase structure violations 150ms - 250ms after the onset of the critical word.
The N400 occurs in response to semantic violations 300ms - 500ms after the onset of the critical word.

However, we can be more sophisticated than this. If syntactic structure building precedes semantic processing, then that means that semantic processing is dependent upon syntactic structure building. In other words, if syntactic structure building fails, then semantic processing is impossible.

Syntax precedes Semantics

We can test this prediction by violating both syntax and semantics simultaneously.

Syntax violation: Der Strauch wurde trotz verpflanzt von einem Gärtner, den wenige empfahlen
The bush was despite replanted by a gardener, whom few recommended

Semantics violation: Das Buch wurde verpflanzt von einem Verleger, den wenige empfahlen
The book was replanted by a publisher, whom few recommended

Double violation: Das Buch wurde trotz verpflanzt von einem Verleger, den wenige empfahlen
The book was despite replanted by a publisher, whom few recommended.

Friederici et al. 2004

Syntax precedes Semantics

If semantics proceeds independently, then we will see both an ELAN and an N400. If semantics is dependent on successful syntactic processing, then we should only see an ELAN. The N400 will not show up because the system will not see the semantic violation (because semantic processing will not have taken place).

[Slide shows the ERP results for the Syntax violation, Semantics violation, and Double violation conditions. Friederici et al. 2004]