cs3102: Theory of Computation, Class 7: Context-Free Languages


Spring 2010, University of Virginia, David Evans

Menu: Nondeterministic PDAs; Reviewing Machine Models of Computing; Linguistic Models of Computing

DPDA Recap: DFA/ε + Stack

A DPDA is a DFA (with ε-transitions) plus a stack. Example DPDA recognizing a^n b^n (n ≥ 1):

  q0 --ε, ε → $--> q1   (push the bottom-of-stack marker)
  q1 --a, ε → +--> q1   (push a + for each a)
  q1 --b, + → ε--> q2   (pop a + for the first b)
  q2 --b, + → ε--> q2   (pop a + for each later b)

Processing aabb:

  State  Remaining input  Stack
  q0     aabb             ε
  q1     aabb             $
  q1     abb              +$
  q1     bb               ++$
  q2     b                +$
  q2     ε                $
  q2     ε                ε   (accept)

What does it mean to add nondeterminism to a DPDA? Recall the analogy for finite automata: a DFA's configuration is one state, while an NFA's configuration is a set of states. Likewise, a DPDA's configuration is one state plus one stack, while an NPDA's configuration is a set of <state, stack> pairs.
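The DPDA above can be simulated in a few lines. This is a minimal sketch for illustration only; the state and symbol names (q0, q1, q2, '$', '+') follow the slide, but the code itself is not from the lecture.

```python
# Minimal simulation of the a^n b^n DPDA (n >= 1) sketched above.

def dpda_anbn(w):
    """Run the DPDA on w, returning (accepted, configuration trace)."""
    state, stack = "q1", "$"            # eps, eps -> $: push bottom marker
    trace = [("q0", w, ""), (state, w, stack)]
    for i, c in enumerate(w):
        if state == "q1" and c == "a":
            stack = "+" + stack         # a, eps -> +: push for each a
        elif state in ("q1", "q2") and c == "b" and stack.startswith("+"):
            state, stack = "q2", stack[1:]  # b, + -> eps: pop for each b
        else:
            return False, trace         # deterministic: no transition applies
        trace.append((state, w[i + 1:], stack))
    # accept when the input is consumed and only the bottom marker remains
    return state == "q2" and stack == "$", trace
```

Running `dpda_anbn("aabb")` accepts and records the same sequence of configurations as the slide's table.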

Example NPDA (the standard machine for even-length palindromes w w^R):

  q0 --ε, ε → $--> q1   (push the bottom-of-stack marker)
  q1 --a, ε → A--> q1   q1 --b, ε → B--> q1   (push the first half)
  q1 --ε, ε → ε--> q2   (nondeterministically guess the midpoint)
  q2 --a, A → ε--> q2   q2 --b, B → ε--> q2   (match the second half)

Now the ε-transition is optional: a state can have multiple possible edges on the same (input, stack-top) pair.

Acceptance. An NPDA accepts w under:
  the Accepting State model, when δ*(q0, w, ε) ⊢* (qf, s) for some qf ∈ F;
  the Empty Stack model, when δ*(q0, w, ε) ⊢* (q, ε).

Is the set of languages accepted by NPDAs the same under each model? Yes, both inclusions hold:
  L(NPDA/Empty Stack) ⊆ L(NPDA/Accepting): add a new start state that pushes a fresh bottom marker $ before entering the old start state, and a new accepting state entered by an ε-transition once only that marker remains.
  L(NPDA/Accepting) ⊆ L(NPDA/Empty Stack): from every accepting state, add an ε-transition to a "stack cleaner" state with transitions ε, h → ε for every h ∈ Γ.
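The "set of <state, stack> pairs" configuration can be made concrete by simulating an NPDA the same way an NFA is simulated with a set of states. This is a sketch under my own transition-table format (the dictionary encoding and helper names are assumptions, not from the lecture):

```python
# Run an NPDA by tracking the full set of <state, stack> configurations,
# mirroring the NFA "set of states" idea.

def step(configs, symbol, delta):
    """Apply every transition matching `symbol` to every configuration."""
    out = set()
    for state, stack in configs:
        for (q, a, top), targets in delta.items():
            if q != state or a != symbol:
                continue
            if top and not stack.startswith(top):
                continue                      # required stack top not present
            rest = stack[len(top):]
            for new_state, push in targets:
                out.add((new_state, push + rest))
    return out

def eps_closure(configs, delta):
    """Add every configuration reachable through eps-transitions."""
    todo, seen = list(configs), set(configs)
    while todo:
        for nxt in step({todo.pop()}, "", delta):
            if nxt not in seen:
                seen.add(nxt)
                todo.append(nxt)
    return seen

def npda_accepts(w, delta, start, accepting):
    configs = eps_closure({(start, "")}, delta)
    for c in w:
        configs = eps_closure(step(configs, c, delta), delta)
    return any(q in accepting for q, _ in configs)

# The slide's NPDA (even-length palindromes w w^R), accepting-state model:
DELTA = {
    ("q0", "", ""):   {("q1", "$")},   # push bottom marker
    ("q1", "a", ""):  {("q1", "A")},   # save the first half on the stack
    ("q1", "b", ""):  {("q1", "B")},
    ("q1", "", ""):   {("q2", "")},    # nondeterministically guess midpoint
    ("q2", "a", "A"): {("q2", "")},    # match second half against the stack
    ("q2", "b", "B"): {("q2", "")},
    ("q2", "", "$"):  {("qf", "")},    # only the marker remains: accept
}
```

The nondeterminism shows up directly: after each input symbol the machine is "in" every configuration at once, and the string is accepted if any of them reaches an accepting state.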

Open (for us) Questions: Machine Models

  L(DPDA/Accepting) =? L(DPDA/Empty Stack)
  Why don't the proofs for NPDAs work for DPDAs?
  Are NPDAs more powerful than DPDAs? (will answer next week)
  What languages cannot be recognized by an NPDA? (will answer next week)

Instead of answering these now, we'll introduce a different model and show it is equivalent to the NPDA.

"I don't know anybody who's ever read a Chomsky book. He does not write page turners, he writes page stoppers. There are a lot of bent pages in Noam Chomsky's books, and they are usually at about Page 16." (Alan Dershowitz)

"I must admit to taking a copy of Noam Chomsky's Syntactic Structures along with me on my honeymoon in 1961. During odd moments, while crossing the Atlantic in an ocean liner and while camping in Europe, I read that book rather thoroughly and tried to answer some basic theoretical questions. Here was a marvelous thing: a mathematical theory of language in which I could use a computer programmer's intuition! The mathematical, linguistic, and algorithmic parts of my life had previously been totally separate. During the ensuing years those three aspects became steadily more intertwined; and by the end of the 1960s I found myself a Professor of Computer Science at Stanford University, primarily because of work that I had done with respect to languages for computer programming." (Donald Knuth)

Modeling Language: Generative Grammar

A grammar rule pairs a match (left side) with a replacement (right side). The slide's example grammar (as best it can be recovered from the transcription):

  S → NP VP
  NP → N        N → AdjP N
  AdjP → Adj    Adj → Adj Adj
  VP → V        V → V AdvP
  AdvP → Adv    Adv → Adv Adv

Generative Grammars: how many sentences can S produce?

  1. S → NP VP; NP → Adj N; VP → V Adv.
     Exactly one: Adj N V Adv.
  2. S → NP VP; NP → Adj NP; VP → V Adv.
     None: NP has no non-recursive rule, so no derivation terminates.
  3. S → NP VP; NP → Adj NP; NP → N; VP → V Adv.
     Infinitely many: Adj^k N V Adv for every k ≥ 0.
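The effect of the recursive NP rule can be seen by enumerating sentences mechanically. A minimal sketch for the third grammar, treating the word classes Adj, N, V, Adv as terminals (the encoding is my own, not from the slides):

```python
# Breadth-first enumeration of sentences from the grammar
# S -> NP VP, NP -> Adj NP | N, VP -> V Adv.
from collections import deque

GRAMMAR = {
    "S":  [["NP", "VP"]],
    "NP": [["Adj", "NP"], ["N"]],
    "VP": [["V", "Adv"]],
}

def sentences(limit):
    """Return the first `limit` all-terminal sentential forms, shortest first."""
    found, queue = [], deque([["S"]])
    while queue and len(found) < limit:
        form = queue.popleft()
        i = next((j for j, s in enumerate(form) if s in GRAMMAR), None)
        if i is None:                       # no nonterminals left: a sentence
            found.append(" ".join(form))
        else:                               # expand the leftmost nonterminal
            for rhs in GRAMMAR[form[i]]:
                queue.append(form[:i] + rhs + form[i + 1:])
    return found

# sentences(3) -> ["N V Adv", "Adj N V Adv", "Adj Adj N V Adv"]
```

Dropping the NP → N rule (grammar 2) would leave the queue growing forever with no sentence ever produced, which is the point of the slide's second question.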

Recursion: Uniquely Human?

"We hypothesize that the faculty of language in the narrow sense (FLN) only includes recursion and is the only uniquely human component of the faculty of language. We further argue that FLN may have evolved for reasons other than language, hence comparative studies might look for evidence of such computations outside of the domain of communication (for example, number, navigation, and social relations)." (Marc Hauser, Noam Chomsky, Tecumseh Fitch, "The Faculty of Language: What Is It, Who Has It, and How Did It Evolve?", Science, Nov 2002)

Steven Pinker and Ray Jackendoff (2004): it's not just recursion... (See also Kanzi and Sue Savage-Rumbaugh: Kanzi's language.)

Modeling Computing

  Machines: a string is in the language if the machine accepts that input string. The power of a machine type is defined by the set of languages it can recognize.
  Generative grammars: a string is in the language if the grammar can produce that string. The power of a grammar type is defined by the set of languages it can generate.

The machine classes nest: All languages ⊇ (recognizable by some NPDA) ⊇ (recognizable by some DPDA) ⊇ Regular (recognizable by some DFA) ⊇ Finite.

  Finite languages. Machine: a simple lookup table, i.e. a DFA with no cycles. Grammar: a grammar with no cycles, with rules of the form A → terminals.

Can we define types of grammars that correspond to each machine class?

Regular: L(DFA) = L(Regular Grammar)

  Machine: DFA. Grammar: regular grammar, with rules of the forms A → aB and A → a. (Hint: PS2, Problem 9.)

Context-Free Grammars

  Match (left side): one nonterminal. Replacement (right side): any sequence of terminals and nonterminals, e.g. A → BCD.
  Example language: { w | w contains more a's than b's }.
  Can a CFG generate the language a^n b^n?

Context-Free: L(NPDA) = L(CFG)

  Machine: NPDA. Grammar: context-free grammar.
  1. L(NPDA) ⊆ L(CFG). Detailed proof: Sipser, Section 2.2.
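The answer to the slide's question is yes: the standard two-rule grammar S → aSb | ε (not shown on the slide itself) generates exactly a^n b^n. A short sketch that records the derivation:

```python
# Derive a^n b^n with the grammar S -> aSb | eps.

def derive_anbn(n):
    """Derive a^n b^n, recording every sentential form along the way."""
    form, forms = "S", ["S"]
    for _ in range(n):
        form = form.replace("S", "aSb", 1)  # apply S -> aSb
        forms.append(form)
    form = form.replace("S", "", 1)         # finish with S -> eps
    forms.append(form)
    return forms

# derive_anbn(2) gives the derivation S => aSb => aaSbb => aabb
```

Note how the single nonterminal "remembers" that every a must later be matched by a b, exactly the job the + symbols did on the DPDA's stack.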

  2. L(CFG) ⊆ L(NPDA). Detailed proof: Sipser, Section 2.2.

More Powerful Grammars

  Context-free grammar: A → BCD, A → a (left side is a single nonterminal)
  Context-sensitive grammar: XAY → XBCDY (a nonterminal rewritten only in context)
  Unrestricted grammar: XAY → BCD (any string rewritten as any string)

Recap Questions

  What does it mean to say: a grammar is regular? a language is regular? a language is not regular? a grammar is context-free? a language is context-free? a language is not context-free?

Charge

  PS2 is due Tuesday.
  Human languages: are they finite, regular, context-free, context-sensitive? (Is the human brain a DFA, a PDA, or something entirely different?)
  Next week: non-context-free languages, parsing, and applications of grammars.
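The context-sensitive rule form XAY → XBCDY can be seen at work in the classic grammar for a^n b^n c^n. A minimal sketch, assuming the standard textbook grammar (not taken from the slides), with a checker that verifies each step of a derivation applies exactly one rule:

```python
# The classic context-sensitive grammar for a^n b^n c^n.
CSG_RULES = [
    ("S", "aSBC"), ("S", "aBC"),  # generate a's with B, C placeholders
    ("CB", "BC"),                 # reorder: move B's left of C's
    ("aB", "ab"), ("bB", "bb"),   # rewrite B to b, but only in context
    ("bC", "bc"), ("cC", "cc"),   # rewrite C to c, but only in context
]

def check_derivation(steps):
    """Return True if each adjacent pair differs by one rule application."""
    for cur, nxt in zip(steps, steps[1:]):
        ok = any(
            cur[:i] + rhs + cur[i + len(lhs):] == nxt
            for lhs, rhs in CSG_RULES
            for i in range(len(cur) - len(lhs) + 1)
            if cur[i:i + len(lhs)] == lhs
        )
        if not ok:
            return False
    return True

# A derivation of aabbcc (n = 2):
STEPS = ["S", "aSBC", "aaBCBC", "aaBBCC", "aabBCC", "aabbCC", "aabbcC", "aabbcc"]
```

The context rules (aB → ab, bB → bb, ...) are what no context-free grammar can express: the replacement of B depends on its neighbor, which is how the grammar keeps the three counts synchronized.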