Word Meaning
Informatics 1 CG: Lecture 12
Mirella Lapata
School of Informatics, University of Edinburgh
mlap@inf.ed.ac.uk
February 5, 2016

Reading: Trevor Harley's The Psychology of Language, Chapter 10.

Recap: Categorization
Categorization is one of the classical problems in the field of cognitive science, one with a history dating back to Aristotle.
The ability to generalize from experience underlies a variety of common mental tasks: perception, learning, and the use of language.
Definitional, prototype, exemplar, and theory theories.
Basic-level categories, prototypes, family resemblance.

How do we Represent the Meaning of Words?
Semantic knowledge can be thought of as knowledge about relations among several types of elements, including words, concepts, and percepts.

Word-concept relations
Knowledge that the word dog refers to the concept dog, the word animal refers to the concept animal, or the word toaster refers to the concept toaster.

Concept-concept relations
Knowledge that dogs are a kind of animal, that dogs have tails and can bark, or that animals have bodies and can move.
[Example hierarchy: animal > vertebrate > mammal > canine > dog]

Concept-percept and concept-action relations
Knowledge about what dogs look like, how a dog can be distinguished from a cat, or how to pet a dog or operate a toaster.

Word-word relations
Knowledge that the word dog tends to be associated with or co-occur with words such as tail, bone, and cat, or that the word toaster tends to be associated with kitchen, oven, or bread.
What are the associates of apple? For example: red, orange, pie, worm.

Semantic Networks
Emphasize abstract conceptual structure, focusing on relations among concepts and relations between concepts and percepts or actions.
This knowledge is represented in terms of systems of abstract propositions, such as canary is-a bird, canary has wings.
Concepts are represented in a network of interconnected nodes; the distance between nodes represents the similarity between them.
A concept is defined in terms of its connections with other concepts.

Collins and Quillian (1969)
[Taxonomic hierarchy: animal at the top, with bird and mammal below it; robin and penguin under bird, pig under mammal.]
Useful for representing natural kind terms.
Economical method for storing information.
Most common links are is-a links.
Attributes are stored at the highest node at which they are true of all nodes below it in the network (cognitive economy).

Collins and Quillian (1969)

Node      Attributes stored at that node
animal    breathes, has heartbeat
bird      has wings, lays eggs, flies
mammal    bears live young
robin     has red breast
penguin   swims, cannot fly
pig       farm animal, pink skin
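The cognitive-economy idea can be made concrete with a small sketch. The Python snippet below is illustrative only (the dictionaries and helper functions are mine, not from the lecture): it stores each attribute once, at the node in the is-a hierarchy where it applies, and lets lower nodes inherit it.

```python
# Minimal sketch of a Collins-and-Quillian-style hierarchical network.
# Names and the helper API are illustrative assumptions.

ISA = {                  # "is-a" links: child -> parent
    "bird": "animal", "mammal": "animal",
    "robin": "bird", "penguin": "bird", "pig": "mammal",
}

PROPERTIES = {           # cognitive economy: each attribute stored once
    "animal": {"breathes", "has heartbeat"},
    "bird": {"has wings", "lays eggs", "flies"},
    "mammal": {"bears live young"},
    "robin": {"has red breast"},
    "penguin": {"swims", "cannot fly"},
    "pig": {"farm animal", "pink skin"},
}

def ancestors(node):
    """Yield the node and all its superordinates by following is-a links."""
    while node is not None:
        yield node
        node = ISA.get(node)

def has_property(node, prop):
    """A node inherits every property stored at any of its ancestors."""
    return any(prop in PROPERTIES.get(n, set()) for n in ancestors(node))

print(has_property("robin", "breathes"))   # True: inherited from animal
print(has_property("penguin", "swims"))    # True: stored locally
```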

Sentence Verification Task
Participants are presented with simple facts and have to press one button if the sentence is true and another if it is false. The reaction time is an index of how difficult the decision was.
(1) A robin is a robin. (baseline measure) Yes!
(2) A robin is a bird. Yes!
(3) A robin is an animal. Yes!
(4) A robin is a fish. No!
Response time to (1) < (2) < (3) < (4).
Participants start off from robin and travel through the network until they find the necessary information. The farther away the information, the slower the response time.
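On this account, predicted difficulty tracks the number of is-a links traversed from the subject to the predicate category. A minimal sketch, with an assumed rule that categories not on the path are rejected:

```python
# Sketch: verification difficulty as is-a distance (illustrative only).
ISA = {"robin": "bird", "penguin": "bird", "pig": "mammal",
       "bird": "animal", "mammal": "animal"}

def isa_distance(subject, category):
    """Number of is-a links from subject up to category, or None if unreachable."""
    steps, node = 0, subject
    while node is not None:
        if node == category:
            return steps
        node = ISA.get(node)
        steps += 1
    return None          # not reachable: predict a "no" response

for sentence, (subj, cat) in {
    "A robin is a robin.": ("robin", "robin"),
    "A robin is a bird.": ("robin", "bird"),
    "A robin is an animal.": ("robin", "animal"),
    "A robin is a fish.": ("robin", "fish"),
}.items():
    print(sentence, isa_distance(subj, cat))   # 0, 1, 2, None (reject)
```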

Sentence Verification Task
[Network figure: animal at the top; bird and mammal below it; robin and penguin under bird, pig under mammal. Each test sentence (A robin is a robin / a bird / an animal / a fish) is verified by traversing this network.]

Problems with the Collins and Quillian Model
Not all information is easily represented in hierarchical form (what is the relation between truth, justice, and law?).
Experiments confound distance in the network with conjoint frequency (robin and bird often co-occur).
The model makes some incorrect predictions: empirically, (5) < (6), (7) < (8), and (9) < (10).
(5) A cow is an animal. (6) A cow is a mammal.
(7) A pine is a church. (8) A pine is a flower.
(9) A robin is a bird. (10) A penguin is a bird.

Collins and Loftus (1975)
The model is based on the idea of spreading activation.
More complex network structure, with links varying in strength or distance; the structure is no longer hierarchical.
Connections represent categorical relations, degree of association, and typicality.
When you think about a concept, that concept becomes activated, and the activation spreads to other concepts that are linked to it.
Verification times depend on the closeness of concepts in the network.
It is hard to see what sort of experiments could falsify this model.
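A minimal sketch of spreading activation over a weighted network, in the spirit of Collins and Loftus (1975); the links, weights, decay parameter, and number of steps are all illustrative assumptions, not values from the lecture:

```python
# Sketch: activation spreads from a source concept along weighted links.
LINKS = {
    "fire engine": {"fire": 0.8, "red": 0.7, "truck": 0.6},
    "red": {"apple": 0.5, "roses": 0.5, "fire engine": 0.7},
    "apple": {"pear": 0.6, "red": 0.5},
    "truck": {"vehicle": 0.7, "fire engine": 0.6},
    "vehicle": {"car": 0.8, "bus": 0.7, "truck": 0.7},
}

def spread(source, steps=2, decay=0.5):
    """Activate a concept and let activation flow along weighted links."""
    activation = {source: 1.0}
    frontier = {source: 1.0}
    for _ in range(steps):
        nxt = {}
        for node, act in frontier.items():
            for neighbour, weight in LINKS.get(node, {}).items():
                nxt[neighbour] = nxt.get(neighbour, 0.0) + act * weight * decay
        for node, act in nxt.items():
            activation[node] = activation.get(node, 0.0) + act
        frontier = nxt
    return activation

# Thinking of "fire engine" primes "fire", "red", "truck", and (more weakly) "apple".
print(sorted(spread("fire engine").items(), key=lambda kv: -kv[1]))
```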

Collins and Loftus (1975)
[Spreading activation network figure linking, among others: street, vehicle, car, bus, truck, fire engine, fire, house, red, orange, blue, apple, pear, roses, tulips, flowers, fruit.]

Feature Comparison Model

           defining                          characteristic
           animate  feathered  has a beak    flies   sings
bird          +         +          +           +       +
robin         +         +          +           +       +
penguin       +         +          +
pig           +

Concepts are represented as a set of features.
Features are ordered in terms of definingness.
Defining features: essential to the meaning of the word; they relate to properties that things must have to be members of the category.
Characteristic features: usually but not necessarily true (most birds can fly, but penguins and ostriches cannot).
Note: the distinction between defining and characteristic features is arbitrary.

Sentence Verification (Again)
A robin is a bird. Stage 1: compare all features of robin and bird to determine featural similarity. High overlap: fast yes.
A robin is a pig. Stage 1: compare all features of robin and pig to determine featural similarity. Low overlap: fast no.
A penguin is a bird. Stage 1: compare all features of penguin and bird to determine featural similarity. Intermediate overlap, so proceed to Stage 2: compare defining features only. Slow yes.
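The two-stage decision procedure can be sketched as follows; the feature sets and the two thresholds are illustrative assumptions, and only the control flow follows the model:

```python
# Sketch of the two-stage feature comparison model.
FEATURES = {   # (defining, characteristic) features per concept (illustrative)
    "bird":    ({"animate", "feathered", "has a beak"}, {"flies", "sings"}),
    "robin":   ({"animate", "feathered", "has a beak"}, {"flies", "sings"}),
    "penguin": ({"animate", "feathered", "has a beak"}, set()),
    "pig":     ({"animate"}, set()),
}

def overlap(a, b):
    """Proportion of shared features (Jaccard overlap)."""
    return len(a & b) / len(a | b) if a | b else 0.0

def verify(instance, category, high=0.8, low=0.3):
    """Verify 'An <instance> is a <category>' in two stages."""
    i_def, i_char = FEATURES[instance]
    c_def, c_char = FEATURES[category]
    stage1 = overlap(i_def | i_char, c_def | c_char)   # Stage 1: all features
    if stage1 >= high:
        return "fast yes"
    if stage1 <= low:
        return "fast no"
    # Stage 2: compare defining features only.
    return "slow yes" if c_def <= i_def else "slow no"

print(verify("robin", "bird"))     # fast yes
print(verify("robin", "pig"))      # fast no
print(verify("penguin", "bird"))   # slow yes
```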

Problems with the Feature Comparison Model
Many words do not have obvious defining features!
The model is tied to the sentence verification paradigm.
Probabilistic feature model (Smith and Medin, 1981):
Distinguishes between essential defining features of concepts and aspects of meaning used for identifying instances of a concept.
Features are weighted based on salience and the probability of being true of the category (has four limbs vs. bears live young).
A candidate instance is accepted if it exceeds some critical weighted sum of features.
Categories now have fuzzy boundaries.
How is this model different from the prototype model?
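A minimal sketch of the weighted-sum idea behind the probabilistic feature model; the weights and the criterion value are invented for illustration:

```python
# Sketch: weighted feature sums with a criterion give fuzzy category boundaries.
BIRD_WEIGHTS = {          # salience / probability of being true of birds (assumed)
    "has feathers": 1.0, "has a beak": 1.0, "lays eggs": 0.9,
    "flies": 0.7, "sings": 0.4,
}

def category_score(instance_features, weights):
    """Sum the weights of the category features the instance possesses."""
    return sum(w for f, w in weights.items() if f in instance_features)

def is_member(instance_features, weights, criterion=2.5):
    return category_score(instance_features, weights) >= criterion

robin   = {"has feathers", "has a beak", "lays eggs", "flies", "sings"}
penguin = {"has feathers", "has a beak", "lays eggs"}   # cannot fly, does not sing
bat     = {"flies"}

print(is_member(robin, BIRD_WEIGHTS))    # True  (score 4.0)
print(is_member(penguin, BIRD_WEIGHTS))  # True  (score 2.9) -- near the boundary
print(is_member(bat, BIRD_WEIGHTS))      # False (score 0.7)
```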

Where Do the Features Come from?
Participants are presented with a set of concept names and asked to write down up to n features they think are important for each concept.
McRae et al. (2005) collected feature norms for 541 living and nonliving concepts.
Largest set in existence (2,526 features), collected over several years.
The norms reveal psychologically salient dimensions of meaning.
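The norming procedure amounts to tallying how many participants produce each feature for a concept. A toy sketch with invented responses (the real norms aggregate many participants and also code each feature's type):

```python
# Sketch: feature norms as production frequencies across participants.
from collections import Counter

responses = [            # one list per hypothetical participant, concept "moose"
    ["is large", "has antlers", "has legs"],
    ["has antlers", "is brown", "lives in woods"],
    ["is large", "has antlers", "an animal"],
]

norms = Counter(feature for r in responses for feature in r)
print(norms.most_common(3))   # e.g. [('has antlers', 3), ('is large', 2), ...]
```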

Where Do the Features Come from?
Example: moose/elk (feature norms from McRae et al., 2005)

Feature          Freq  Classification
is large          27   visual
has antlers       23   visual
has legs          14   visual
is brown          10   visual
has fur            7   visual
has hooves         5   visual
eaten as meat      5   function
lives in woods    14   encyclopedic
an animal         17   taxonomic
a mammal           9   taxonomic

Representing Word Meaning

           eats   has    has     has        has     has    has       made of  made of
           seeds  beak   claws   handlebar  wheels  wings  feathers  metal    wood
trolley    .00    .00    .00     .30        .32     .00    .00       .06      .25
robin      .05    .24    .15     .00        .00     .19    .34       .00      .00

McRae spent 10 years collecting his feature norms (541 basic-level nouns)!
What about verbs or abstract concepts (e.g., move, peace)?
But: humans naturally express word meaning using features.
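Once concepts are represented as feature vectors like these, their similarity can be computed directly, for example with the cosine measure (the column-to-value mapping below is my reading of the flattened slide table):

```python
# Sketch: comparing feature vectors with cosine similarity.
import math

FEATURES = ["eats seeds", "has beak", "has claws", "has handlebar", "has wheels",
            "has wings", "has feathers", "made of metal", "made of wood"]

trolley = [0.00, 0.00, 0.00, 0.30, 0.32, 0.00, 0.00, 0.06, 0.25]
robin   = [0.05, 0.24, 0.15, 0.00, 0.00, 0.19, 0.34, 0.00, 0.00]

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

print(cosine(trolley, robin))   # 0.0: trolley and robin share no features
print(cosine(robin, robin))     # 1.0: identical vectors
```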

Summary
How do we represent the meaning of words? How is semantic knowledge organized?
Semantic information is encoded in networks of linked nodes.
The Collins and Quillian network emphasizes hierarchical relations and cognitive economy, and accounts for sentence verification times, but does not explain similarity and relatedness effects.
The spreading activation model does, but is difficult to falsify.
Word meaning can be decomposed into semantic features. Feature-list theories account for sentence verification times by postulating that we compare lists of defining and characteristic features.
Next lecture: the associationist view of meaning and the vector space model.