Listen up! Speech is for thinking during infancy. Athena Vouloumanos and Sandra R. Waxman. To appear in Trends in Cognitive Sciences (in press).


Listen up! Speech is for thinking during infancy

Athena Vouloumanos and Sandra R. Waxman

Department of Psychology, New York University, 6 Washington Place, New York, NY 10003-6603, USA
Department of Psychology, Northwestern University, 2029 Sheridan Rd, Evanston, IL 60208-2710, USA

Corresponding author: Vouloumanos, A. (athena.vouloumanos@nyu.edu)

Abstract

Infants' exposure to human speech within the first year of life promotes more than speech processing and language acquisition: new developmental evidence suggests that listening to speech shapes infants' fundamental cognitive and social capacities. Speech streamlines infants' learning, promotes the formation of object categories, signals communicative partners, highlights information in social interactions, and offers insight into the minds of others. These results, which challenge the claim that, for infants, speech offers no special cognitive advantages, suggest a new synthesis: far earlier than researchers had imagined, an intimate and powerful connection between human speech and cognition guides infant development, advancing infants' acquisition of fundamental psychological processes.

Speech is not just for language (even for infants)

Infants' rapid progress in speech perception stands as a clarion case of our species' natural proclivity to learn language. Until recently, infant speech perception was considered primarily a foundation upon which to build language. Research focused on the rapidity with which infants tune to the sounds of their native language [1,2] and use these as building blocks for the acquisition of phonology, syntax, and meaning. But infants' natural affinity for processing the speech signal has implications that reach far beyond the acquisition of language. New evidence now shows that from the first months of life, listening to speech is a powerful engine: it promotes the acquisition of fundamental psychological processes including pattern learning, the formation of object categories, the identification of communicative partners, knowledge acquisition within social interactions, and the development of social cognition.

Human speech is a privileged signal from birth

From birth, speech is a privileged signal for humans. Newborns prefer the vocalizations of humans and non-human primates (Rhesus macaques: Macaca mulatta) to other sounds [3,4]. By 3 months, infants tune in specifically to human speech, even favoring it over other human vocalizations, including emotional (e.g., laughing) and physiological (e.g., sneezing) vocalizations [3,5] (see Box 1). Interestingly, 3-month-olds' preference for speech is broad enough to include native as well as non-native speech sounds. This suggests that infants privilege the speech signal itself and not simply the familiar sounds of their own native language.

These behavioral preferences converge well with neural evidence: at one month of age, human speech and rhesus calls activate similar neural areas, but by 3 months speech and rhesus calls elicit distinctly different neural responses [6,7]. The developmental change in patterns of activation likely reflects neural specialization. Specifically, 1-month-olds' response to human speech is already localized to the left hemisphere; over the next few months, the left hemisphere maintains its activation to speech but becomes less responsive to non-speech sounds [6]. This developmental pattern suggests that from birth, listening to speech sounds preferentially activates specific areas of the temporal cortex, and that a pruning process underlies further neural specialization for speech in the left hemisphere [8]. Infants' rapid behavioral and neural tuning to the signal of human speech, remarkable in its own right, has powerful developmental consequences that extend beyond their listening preferences alone. Infants' preference for listening to human speech shapes how infants learn.

Listening to speech facilitates learning and pattern extraction

Speech is a privileged unit for even the most basic forms of learning, including low-level conditioned responses. From birth, when infants listen to speech, they successfully recognize individual units and their relative positions in the speech sequence [9]. And at 1 month, infants who are conditioned to speech show a stronger response and a steeper learning curve than infants conditioned to either tones or backward speech [10].

By 7 months, speech promotes more sophisticated forms of learning, including the detection of rules and patterns. After hearing only 2 minutes of patterned speech syllable sequences (ABB: la-ga-ga, da-li-li), 7-month-olds extract and generalize rules such as identity and sequential positioning and distinguish ABB (la-ga-ga) from ABA (la-ga-la) [11]. But after 2 minutes of patterned exposure to non-speech sounds (musical tones, animal sounds, timbres), infants do not extract the equivalent ABB or ABA rules. Within the auditory domain, infants can generalize rules to non-speech sounds only if they first hear those rules instantiated in speech [12]. This asymmetry, favoring infants' ability to extract patterns in speech over non-speech sounds, suggests that infants learn better with speech.

Listening to speech promotes categorization

Infants' early preference for speech is powerful. But infants' preferences cannot tell us whether (or when) infants begin to link speech to the objects and events around them. A series of experiments designed to tackle this question focused on object categorization, a building block of cognition [13,14]. In these experiments, infants ranging in age from 3 to 12 months viewed several images from one object category (e.g., dinosaurs), each accompanied by either a segment of speech or a sequence of sine-wave tones. Next, infants viewed two test images, one from the now-familiar category (a new dinosaur) and one from a novel category (e.g., a fish). If infants formed the object category (here, dinosaurs), they should distinguish between the test images [15]. By 3 months of age, infants listening to speech successfully formed categories; those listening to tones failed to form object categories at any age [14].

Thus, infants are tuned not only to speech but also to a principled and surprisingly early link between speech and the fundamental cognitive process of categorization. Moreover, this link, evident at three months, derives from a broader template that initially encompasses human speech as well as the calls of non-human primates (Madagascar blue-eyed lemurs: Eulemur macaco flavifrons). Three- and 4-month-old infants' categorization in the context of hearing lemur calls mirrors precisely their categorization in response to human speech; by 6 months, the link to categorization has become tuned specifically to human vocalizations [13]. This documents a surprisingly early link between human language and core cognitive processes, including object categorization, that cannot be attributed to familiarity: although 3- and 4-month-olds have considerable exposure to speech and none to lemur vocalizations, both signals confer the same cognitive advantage for categorization.

Speech helps identify potential communicative partners

To convey meaning, human communicative partners must integrate, encode, and decode linguistic symbols instantiated in speech, paralinguistic cues (like vocal pitch or intonation), and gestures (see Box 2). The speech signal itself can help identify a potential communicative partner. From the first months of life, infants treat people and objects as different kinds of entities: they respond differently to people (with more smiling and emotional sounds) and objects (with more grasping) [16-19], and at 6 months they also expect others to treat people and objects differently [20]. By 5 months, infants use human speech to identify potential conversational partners. When presented with human and monkey faces, 5-month-olds match speech (native or non-native) to human faces and monkey calls to monkey faces, but they do not match other human emotional vocalizations (e.g., laughter) specifically to humans [21]. Infants may thus already expect that humans, but not other animals, are the source of speech (see Box 3). This expectation for human speech (but not emotional vocalizations) suggests that infants are guided by more than their familiarity with the sounds alone. By 6 months, infants are especially attentive to communicative cues, including eye gaze and speech, produced by their pedagogical partners, and use these cues to guide learning [22,23]. Infants appear to use speech to identify the natural class of individuals with whom they can communicate and from whom they can learn.

Speech indexes the transfer of information

When listening to a conversation in a foreign language, even if we cannot understand the meaning of a single word, we nonetheless infer that information is being conveyed. Thus, for adults, understanding the communicative function of speech does not require understanding the contents of the speech. Infants show a similar understanding. By 6 months, although infants understand only very few words [24], they are already sensitive to the communicative function of speech and appreciate that speech is a powerful conduit through which people share information. When an actor can no longer reach a target object, infants at 6 and 12 months infer that she can still obtain that target object from a second actor by using speech, but not by coughing or other non-speech vocalizations [25,26].

Inferring that speech allows people to transfer information may allow infants to more easily deduce the focus of a person's attention, and to make inferences about what information they intend to share. This early understanding of the communicative function of speech may provide a mechanism for acquiring language and knowledge about the world. Speech is a conduit for moving information between people and a cue that information is being shared.

Speech gives insight into others' minds

Understanding the goals and intentions of others is one of the most complex problems facing infants. How do infants come to gain insight into the minds of others? The foundations of social cognition begin to take shape in the first year of life [27]. By the end of their first year, infants appreciate that people (and other agents) have intentions [28], and they distinguish between agents who can behave intentionally and non-agents, who can't [29-34]. By 12 months, infants use speech to learn about aspects of the world that are beyond their direct perception, including the minds of others [35]. Twelve-month-olds watched as an actor attempted (but failed) to stack a ring on a funnel. If the actor then spoke to a new actor (who had not observed the failed attempts), infants expected the second actor to stack the ring. But if the actor produced non-speech sounds (e.g., coughs), infants had no such expectation. Infants appreciate that speech (but not non-speech) permits us to share our internal mental states, desires, and beliefs. They expect that speech is a powerful vehicle for communicating our intentions and understanding the intentions of others. At this age, infants also begin to forge more precise expectations about the functions of human language: they discover that different kinds of words refer to objects, events, and categories [36]. This more precise set of expectations permits

infants to make more precise inferences about speakers' intentions. The advantage that speech confers on categorization in 3- to 6-month-olds becomes far more precise: by 12 months, infants expect words that are presented in naming phrases ("Look at the blick") to refer to objects and object categories, but have no such expectation for words presented alone ("Wow") or for speech that does not involve naming ("Oh, look!") [37,38]. Moreover, they expect novel nouns to refer to objects and object categories but not to surface properties (e.g., color or pattern) [39]. And by 14 months, infants expect that novel words also refer to actions and events. Although infants at this age tend not to imitate an adult experimenter's unconventional action (e.g., using her forehead rather than her hand to turn on a light) [40], if the unconventional action is named ("I'm going to blick the light!"), infants imitate it spontaneously [41]. As infants' expectations about the different functions of language become more precise, so too do the ways in which listening to speech comes to shape cognition.

Intriguing new evidence suggests that individual differences in infants' preferences for speech may even be linked to differences in their acquisition of fundamental social cognitive capacities. Infants who exhibit reduced preferences for human speech at 12 months display more autistic-like behaviors at 18 months [42]. Inasmuch as autistic traits include social communicative deficits beyond simple language difficulties (DSM-5), this suggests a potent link between simple speech biases and complex social communicative behaviors.

Conclusions

Before infants begin talking, they are listening. We have proposed that even before infants can understand the meaning of the speech that surrounds them, listening to speech transforms infants' acquisition of core cognitive capacities. This transformation is unlikely to be explained by appealing to low-level perceptual effects or issues of stimulus familiarity. Instead, what begins as a natural preference for listening to speech provides infants with a powerful natural mechanism for learning rapidly about the objects, events, and people that populate their world.

References

1 Kuhl, P.K. et al. (1992). Linguistic experience alters phonetic perception in infants by 6 months of age. Science 255, 606-608
2 Werker, J.F., and Tees, R.C. (1984). Cross-language speech perception: evidence for perceptual reorganization during the first year of life. Infant Behav Dev 7, 49-63
3 Vouloumanos, A. et al. (2010). The tuning of human neonates' preference for speech. Child Dev 81, 517-527
4 Vouloumanos, A., and Werker, J.F. (2007). Listening to language at birth: evidence for a bias for speech in neonates. Dev Sci 10, 159-164
5 Shultz, S., and Vouloumanos, A. (2010). Three-month-olds prefer speech to other naturally occurring signals. Lang Learn Dev 6, 241-257
6 Shultz, S. et al. (2014). Neural specialization for speech in the first months of life. Dev Sci
7 Minagawa-Kawai, Y. et al. (2011). Optical brain imaging reveals general auditory and language-specific processing in early infant development. Cereb Cortex 21, 254-261
8 Huttenlocher, P.R. (1999). Dendritic and synaptic development in human cerebral cortex: time course and critical periods. Dev Neuropsychol 16, 347-349
9 Gervain, J. et al. (2012). Binding at birth: the newborn brain detects identity relations and sequential position in speech. J Cogn Neurosci 24, 564-574
10 Reeb-Sutherland, B.C. et al. (2011). One-month-old human infants learn about the social world while they sleep. Dev Sci 14, 1134-1141
11 Marcus, G.F. et al. (1999). Rule learning by seven-month-old infants. Science 283, 77-80
12 Marcus, G.F. et al. (2007). Infant rule learning facilitated by speech. Psychol Sci 18, 387-391
13 Ferry, A.L. et al. (2013). Nonhuman primate vocalizations support categorization in very young human infants. Proc Natl Acad Sci USA 110, 15231-15235
14 Ferry, A.L. et al. (2010). Categorization in 3- and 4-month-old infants: an advantage of words over tones. Child Dev 81, 472-479
15 Aslin, R.N. (2007). What's in a look? Dev Sci 10, 48-53
16 Rönnqvist, L., and von Hofsten, C. (1994). Neonatal finger and arm movements as determined by a social and an object context. Early Dev Parent 3, 81-94
17 Legerstee, M. (1994). Patterns of 4-month-old infant responses to hidden silent and sounding people and objects. Early Dev Parent 3
18 Legerstee, M. (1991). Changes in the quality of infant sounds as a function of social and nonsocial stimulation. First Lang 11, 327-343
19 Legerstee, M. (1991). The role of person and object in eliciting early imitation. J Exp Child Psychol 51, 423-433
20 Legerstee, M. et al. (2000). Precursors to the development of intention at 6 months: understanding people and their actions. Dev Psychol 36, 627-634
21 Vouloumanos, A. et al. (2009). Five-month-old infants' identification of the sources of vocalizations. Proc Natl Acad Sci USA 106, 18867-18872
22 Csibra, G., and Gergely, G. (2011). Natural pedagogy as evolutionary adaptation. Philos Trans R Soc Lond B Biol Sci 366, 1149-1157
23 Csibra, G., and Gergely, G. (2009). Natural pedagogy. Trends Cogn Sci 13, 148-153
24 Bergelson, E., and Swingley, D. (2012). At 6-9 months, human infants know the meanings of many common nouns. Proc Natl Acad Sci USA 109, 3253-3258
25 Vouloumanos, A. et al. (2014). Do 6-month-olds understand that speech can communicate? Dev Sci
26 Martin, A. et al. (2012). Understanding the abstract role of speech in communication at 12 months. Cognition 123, 50-60
27 Baillargeon, R. et al. Psychological and sociomoral reasoning in infancy. In APA Handbook of Personality and Social Psychology: Vol. 1. Attitudes and Social Cognition, E. Borgida, and J. Bargh, eds. (Washington, DC: APA)
28 Woodward, A.L. (1998). Infants selectively encode the goal object of an actor's reach. Cognition 69, 1-34
29 Király, I. et al. (2003). The early origins of goal attribution in infancy. Conscious Cogn 12, 752-769
30 Luo, Y., and Baillargeon, R. (2005). Can a self-propelled box have a goal? Psychological reasoning in 5-month-old infants. Psychol Sci 16, 601-608
31 Johnson, S. et al. (1998). Whose gaze will infants follow? The elicitation of gaze-following in 12-month-olds. Dev Sci 1, 233-238
32 Gergely, G. et al. (1995). Taking the intentional stance at 12 months of age. Cognition 56, 165-193
33 Csibra, G. (2008). Goal attribution to inanimate agents by 6.5-month-old infants. Cognition 107, 705-717
34 Bíró, S., and Leslie, A.M. (2007). Infants' perception of goal-directed actions: development through cue-based bootstrapping. Dev Sci 10, 379-398
35 Vouloumanos, A. et al. (2012). Twelve-month-old infants recognize that speech can communicate unobservable intentions. Proc Natl Acad Sci USA 109, 12933-12937
36 Waxman, S.R., and Gelman, S.A. (2009). Early word-learning entails reference, not merely associations. Trends Cogn Sci 13, 258-263
37 Fennell, C.T., and Waxman, S.R. (2010). What paradox? Referential cues allow for infant use of phonetic detail in word learning. Child Dev 81, 1376-1383
38 Waxman, S.R., and Markow, D.B. (1995). Words as invitations to form categories: evidence from 12- to 13-month-old infants. Cogn Psychol 29, 257-302
39 Waxman, S.R. (1999). Specifying the scope of 13-month-olds' expectations for novel words. Cognition 70, B35-B50
40 Gergely, G. et al. (2002). Rational imitation in preverbal infants. Nature 415, 755
41 Chen, M.L., and Waxman, S.R. (2013). "Shall we blick?": novel words highlight actors' underlying intentions for 14-month-old infants. Dev Psychol 49, 426-431
42 Curtin, S., and Vouloumanos, A. (2013). Speech preference is associated with autistic-like behavior in 18-month-olds at risk for autism spectrum disorder. J Autism Dev Disord 43, 2114-2120
43 Pascalis, O. et al. (2002). Is face processing species-specific during the first year of life? Science 296, 1321-1323
44 Lewkowicz, D.J., and Ghazanfar, A.A. (2006). The decline of cross-species intersensory perception in human infants. Proc Natl Acad Sci USA 103, 6771-6774
45 Pons, F. et al. (2009). Narrowing of intersensory speech perception in infancy. Proc Natl Acad Sci USA 106, 10598-10602
46 Senghas, A. et al. (2004). Children creating core properties of language: evidence from an emerging sign language in Nicaragua. Science 305, 1779-1782
47 Goldin-Meadow, S. (2006). Talking and thinking with our hands. Curr Dir Psychol Sci 15, 34-39
48 Pyers, J., and Senghas, A. (2009). Language promotes false-belief understanding: evidence from learners of a new sign language. Psychol Sci
49 Krentz, U.C., and Corina, D.P. (2008). Preference for language in early infancy: the human language bias is not speech specific. Dev Sci 11, 1-9
50 Rabagliati, H. et al. (2012). Infant rule learning: advantage language, or advantage speech? PLoS One 7, e40517
51 Krehm, M. et al. (2014). I see your point: infants under 12 months understand that pointing is communicative. J Cogn Dev 15, 527-538
52 Goldin-Meadow, S. (2007). Pointing sets the stage for learning language--and creating language. Child Dev 78, 741-745
53 Vouloumanos, A., and Gelfand, H.M. (2012). Infant perception of atypical speech signals. Dev Psychol 49, 815-824
54 Ferguson, B., and Waxman, S. (2014). Communication and categorization: new insights into the relation between speech, labels, and concepts for infants. In Proceedings of the 35th Annual Conference of the Cognitive Science Society, M. Knauff, M. Pauen, N. Sebanz, and I. Wachsmuth, eds.
55 Fulkerson, A.L., and Waxman, S.R. (2007). Words (but not tones) facilitate object categorization: evidence from 6- and 12-month-olds. Cognition 105, 218-228
56 Balaban, M.T., and Waxman, S.R. (1997). Do words facilitate object categorization in 9-month-old infants? J Exp Child Psychol 64, 3-26

to appear in Trends in Cognitive Sciences (in press) 10 303 304 305 306 307 308 309 310 311 312 313 314 315 316 317 318 319 320 321 322 323 324 325 326 327 328 329 330 331 332 333 334 335 336 337 338 339 340 341 342 343 344 345 BOX 1. Tuning mechanisms as pervasive developmental processes The tuning of infants speech bias between birth and 3 months, from preferring vocalizations including those of other primates to being speech-specific [3], mirrors similar tuning processes at work in face perception, cross-modal speech perception, and phoneme perception [1,2,43-45]. Infants are initially able to recognize faces of individuals from different species but by 9 months and into adulthood show better recognition of human faces compared to the faces of other species [43]. Similarly, infants ability to discriminate between many different phonemes may initially rely on language-general discrimination abilities, which become language-specific by 6-12 months [1,2]. Tuning mechanisms sharpen initially broad biases into more specific ones across many perceptual domains in infants first year of life. BOX 2. Are the facilitative effects of speech specific to spoken language? Might language produced in other modalities, including vision, also confer cognitive advantages in infancy? From birth, infants are prepared to acquired language in either the auditory or visual modality [46]. Both signed language and gesture confer cognitive and social advantages [47,48]. Although there is less work documenting the effects of signed than spoken language in infancy, infants privilege sign language over gestures. At 6 months, naïve hearing infants prefer to look at a person producing sign language, as compared to a person producing gesture [49], and by 7 months infants begin to extract some rules from sequences of sign language [50]. 
Still, hearing infants' ability to extract rules is less robust when they are presented with sign language than with spoken language, which may reflect their differing experience with the two. Although 9-month-olds already understand gestures such as pointing as communicative [51] and as a possible precursor to language [52], the communicative function of signed languages might be understood even earlier. Do hearing infants initially link visually produced language to object categories, as they do for vocalizations? Although even hearing infants prefer sign to gesture, this preference does not tell us which, if either, they will link to core cognitive capacities. Do they link sign language (but not gesture) to fundamental cognitive and social capacities? And how do these links fare over the first year in hearing infants who are not exposed to language in the visual-motor modality?

BOX 3. Can infants use non-linguistic stimuli as they use speech?

A hallmark of speech perception in adults is our ability to perceive distorted or atypical speech as speech. Like adults, infants can also perceive atypical signals as speech, but only under certain circumstances. Nine-month-olds who heard speech-like vocalizations produced by a parrot (which preserve some but not all of the acoustic features of speech) treated the parrot vocalizations like human speech, but only if they viewed a (static) human face while listening. If they instead viewed a (static) checkerboard pattern, 9-month-olds treated the parrot vocalizations like nonspeech [53]. One question currently under investigation is whether infants would link a parrot's speech-like vocalizations to object categorization or to any other cognitive and social capacities.

Another hallmark of being human is our capacity to infuse communicative status into a host of non-linguistic signals (e.g., Morse code). Infants, too, have this flexibility [54]. Like adults, under certain circumstances, infants will interpret an otherwise inert signal as communicative. Six-month-old infants participated in a categorization task involving sine-wave tone sequences, a signal that fails to promote infant object categorization [14,55,56]. But first, before the categorization task, infants watched a 2-minute videotaped conversation in which one person spoke and the other responded with beeps in sine-wave tones. Embedding the tones within a rich communicative episode convinced infants that the tones had communicative status: the tones now supported infants' object categorization. Although infants privilege speech, they can flexibly extend some of its most important communicative and cognitive functions to other, initially non-privileged signals.

BOX 4. Outstanding questions

What is the range of fundamental cognitive and social processes that are facilitated by speech? Are there processes that are not facilitated by speech?
What is the range of signals that promote infant cognitive and social development?
o Does sign language, like spoken language, facilitate infant cognitive and social development?
o Can atypical speech signals facilitate infant cognitive development?
What are the mechanisms underlying the cognitive and social advantages conferred by speech?
How might this new evidence from typically developing infants help design interventions for infants and young children experiencing delays and disorders in language, cognitive, and social development?
Acknowledgements

This work was supported by the Eunice Kennedy Shriver National Institute of Child Health and Human Development of the National Institutes of Health under Award Numbers R01HD072018 (AV) and R01HD30410 (SRW), and by National Science Foundation BCS 0950376 (SRW).