Course 2 Introduction to Automata Theory (cont'd)


The structure and the content of the lecture are based on http://www.eecs.wsu.edu/~ananth/cpts317/lectures/index.htm

Excursion: Previous lecture

Languages & Grammars

Languages: a language is a collection of sentences (or words) of finite length, all constructed from a finite alphabet of symbols.

Grammars: a grammar can be regarded as a device that enumerates the sentences of a language - nothing more, nothing less.

G = (V_N, V_T, S, P)
  V_N - set of non-terminal symbols
  V_T - set of terminal symbols
  S - start symbol
  P - set of production rules
  V_N ∩ V_T = ∅

Image source: Nowak et al., Nature, vol. 417, 2002
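The quadruple G = (V_N, V_T, S, P) can be written down directly in code. Below is a minimal sketch in Python; the symbol names and the example grammar for {aⁿbⁿ | n ≥ 1} are illustrative, not from the lecture:

```python
# Sketch of G = (V_N, V_T, S, P) for the language {a^n b^n | n >= 1}.
# All names here are illustrative choices, not fixed by the definition.
V_N = {"S"}                        # non-terminal symbols
V_T = {"a", "b"}                   # terminal symbols
S = "S"                            # start symbol
P = [(("S",), ("a", "S", "b")),    # S -> aSb
     (("S",), ("a", "b"))]         # S -> ab

# The definition requires the two alphabets to be disjoint.
assert V_N & V_T == set()
```

Encoding each rule as a (left-hand side, right-hand side) pair of symbol tuples keeps the representation general enough for all grammar types discussed below, not only context-free ones.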

The Chomsky Hierarchy

A containment hierarchy of classes of formal languages:

Regular (DFA) ⊂ Context-free (PDA) ⊂ Context-sensitive (LBA) ⊂ Recursively enumerable (TM)

The Chomsky Hierarchy

Grammar | Languages                      | Automaton                                       | Production rules
Type-0  | Recursively enumerable (L_0)   | Turing machine                                  | α → β
Type-1  | Context-sensitive (L_1)        | Linear-bounded non-deterministic Turing machine | αAβ → αγβ
Type-2  | Context-free (L_2)             | Non-deterministic pushdown automaton            | A → γ
Type-3  | Regular (L_3)                  | Finite state automaton                          | A → a and A → aB
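The "production rules" column can be checked mechanically. The sketch below (my own function and encoding: a rule as a pair of symbol tuples; only right-linear Cat. II regular rules are recognised) returns the most restrictive type a single rule satisfies:

```python
def rule_type(lhs, rhs, nonterminals):
    """Most restrictive Chomsky type satisfied by the rule lhs -> rhs
    (both tuples of symbols).  Only right-linear (Cat. II) regular
    rules are recognised in this sketch."""
    if len(lhs) == 1 and lhs[0] in nonterminals:
        all_terminal = all(s not in nonterminals for s in rhs)
        right_linear = (len(rhs) >= 1 and rhs[-1] in nonterminals
                        and all(s not in nonterminals for s in rhs[:-1]))
        if all_terminal or right_linear:
            return 3        # regular (Type-3)
        return 2            # context-free (Type-2)
    if len(lhs) <= len(rhs):
        return 1            # monotonic / context-sensitive form (Type-1)
    return 0                # unrestricted (Type-0)

assert rule_type(("S",), ("a", "S"), {"S"}) == 3        # A -> aB
assert rule_type(("S",), ("a", "S", "b"), {"S"}) == 2   # A -> aSb
assert rule_type(("A", "B"), ("A", "b", "B"), {"A", "B"}) == 1
assert rule_type(("A", "B", "C"), ("a",), {"A", "B", "C"}) == 0
```

A grammar's type is then the minimum over its rules, which mirrors the containment L_3 ⊆ L_2 ⊆ L_1 ⊆ L_0 from the table.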

The Chomsky Hierarchy (cont'd)

The Chomsky Hierarchy (cont'd)

Classification using the structure of their rules:

Type-0 grammars: there are no restrictions on the rules.

Type-1 grammars / context-sensitive grammars: the rules have the form
  uAv → upv,  u, v ∈ V_G*, p ∈ V_G*, p ≠ λ, A ∈ V_N,
or A → λ, and in this case A does not belong to the right-hand side of any rule.

Remark. The rules of the second form make sense only if A is the start symbol.

The Chomsky Hierarchy (cont'd)

Remarks

1. A grammar is Type 1 monotonic if it contains no rules in which the left-hand side consists of more symbols than the right-hand side. This forbids, for instance, the rule
  , N E → and N
where N and E are non-terminal symbols, and ',' and 'and' are terminal symbols (the left-hand side has 3 symbols, the right-hand side only 2).
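The monotonicity condition is a one-line check over the rule set. A sketch, reusing the (lhs, rhs) tuple encoding and the forbidden rule from the remark:

```python
def is_monotonic(rules):
    """True iff no rule's left-hand side is longer than its right-hand
    side (rules given as (lhs, rhs) pairs of symbol tuples)."""
    return all(len(lhs) <= len(rhs) for lhs, rhs in rules)

# The forbidden rule from the slide:  , N E  ->  and N   (3 symbols -> 2).
assert not is_monotonic([((",", "N", "E"), ("and", "N"))])
# A monotonic rule (illustrative):  N E  ->  N and E     (2 symbols -> 3).
assert is_monotonic([(("N", "E"), ("N", "and", "E"))])
```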

The Chomsky Hierarchy (cont'd)

Remarks

2. A grammar is Type 1 context-sensitive if all of its rules are context-sensitive. A rule is context-sensitive if only one (non-terminal) symbol in its left-hand side gets replaced by other symbols, while we find the other symbols back, undamaged and in the same order, in the right-hand side.

Example:
  Name Comma Name End → Name and Name End
meaning that the rule Comma → and may be applied if the left context is Name and the right context is Name End. The contexts themselves are not affected. The replacement must be at least one symbol long; this means that context-sensitive grammars are always monotonic.

Examples: see whiteboard
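One rewriting step with the slide's rule can be simulated directly; the sketch below (my own helper, operating on sentential forms as lists of symbols) shows that only the occurrence of Comma whose contexts match is rewritten:

```python
def apply_rule(form):
    """Apply one instance of  Name Comma Name End -> Name and Name End
    to a sentential form (a list of symbols); contexts stay untouched."""
    lhs = ("Name", "Comma", "Name", "End")
    rhs = ("Name", "and", "Name", "End")
    for k in range(len(form) - len(lhs) + 1):
        if tuple(form[k:k + len(lhs)]) == lhs:
            return form[:k] + list(rhs) + form[k + len(lhs):]
    return form  # rule not applicable

form = ["Name", "Comma", "Name", "Comma", "Name", "End"]
# Only the second Comma sits between the contexts Name ... Name End:
assert apply_rule(form) == ["Name", "Comma", "Name", "and", "Name", "End"]
```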

The Chomsky Hierarchy (cont'd)

Classification using the structure of their rules:

Type-2 grammars / context-free grammars: the rules have the form
  A → p,  p ∈ V_G*, A ∈ V_N

Type-3 grammars / regular grammars: the rules have one of the next two forms:
  Cat. I rules:  A → Bp, C → q
  Cat. II rules: A → pB, C → q
with A, B, C ∈ V_N and p, q ∈ V_T*. The rule A → λ is allowed if A does not belong to the right-hand side of any rule.

Examples: see whiteboard
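A Cat. II (right-linear) grammar can be run directly, since every sentential form is a terminal prefix followed by at most one non-terminal. A sketch with my own encoding (each non-terminal maps to pairs of an emitted terminal string and an optional successor non-terminal; every alternative must emit at least one terminal so the enumeration terminates):

```python
from collections import deque

def derive_words(rules, start, max_len):
    """Enumerate all words of length <= max_len derived by a
    right-linear grammar, by breadth-first expansion."""
    words, queue = set(), deque([("", start)])
    while queue:
        prefix, nt = queue.popleft()
        for emitted, successor in rules[nt]:
            s = prefix + emitted
            if len(s) > max_len:
                continue
            if successor is None:
                words.add(s)          # terminating alternative A -> p
            else:
                queue.append((s, successor))  # A -> pB: continue from B
    return words

# Grammar for a b* :  S -> aB | a ,  B -> bB | b   (illustrative symbols)
rules = {"S": [("a", "B"), ("a", None)],
         "B": [("b", "B"), ("b", None)]}
assert derive_words(rules, "S", 3) == {"a", "ab", "abb"}
```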

The Chomsky Hierarchy (cont'd)

Localization lemma for context-free languages (CFL) (a key step towards the uvwxy theorem, i.e. the pumping lemma for CFL)

Motivation for the lemma: almost anything can be expressed in a CF grammar.

Let G be a context-free grammar and consider a derivation x_1 … x_m ⇒* p, where x_i ∈ V_G and p ∈ V_G*. Then there exist p_1, …, p_m ∈ V_G* such that p = p_1 … p_m and x_j ⇒* p_j for every j.

Example: see whiteboard
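A concrete instance of the lemma, using the illustrative grammar S → aSb | ab: from the sentential form x_1 x_2 x_3 = a S b one can derive p = aaabbb, and the decomposition p = p_1 p_2 p_3 with x_j ⇒* p_j is visible directly. A sketch (the grammar and the membership test are my own example, not from the lecture):

```python
def in_anbn(w):
    """Membership test for L(G) = {a^n b^n | n >= 1}, G: S -> aSb | ab."""
    n = len(w) // 2
    return len(w) >= 2 and len(w) % 2 == 0 and w == "a" * n + "b" * n

# x_1 x_2 x_3 = a S b  =>*  p = aaabbb; the lemma splits p = p_1 p_2 p_3:
p1, p2, p3 = "a", "aabb", "b"      # a =>* p1,  S =>* p2,  b =>* p3
assert in_anbn(p2)                 # the middle piece is derivable from S
assert p1 + p2 + p3 == "aaabbb"    # the pieces concatenate to p
```

The terminal symbols a and b derive only themselves (p_1 = "a", p_3 = "b"), so the whole "work" of the derivation is localized in the piece produced by the non-terminal S.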

The Chomsky Hierarchy (cont'd)

What do you observe on the right-hand side (RHS) of the production rules of a context-free grammar?

The Chomsky Hierarchy (cont'd)

It is convenient to have on the RHS of a rule either only terminal or only non-terminal symbols! This can be achieved without changing the type of the grammar.

Lemma (A → i). Let G = (V_N, V_T, S, P) be a context-free grammar. There exists an equivalent context-free grammar G' with the property: if a rule contains terminals, then the rule is of the form A → i, A ∈ V_N, i ∈ V_T.

The Chomsky Hierarchy (cont'd)

Lemma (A → i). Let G = (V_N, V_T, S, P) be a context-free grammar. There exists an equivalent context-free grammar G' with the property: if a rule contains terminals, then the rule is of the form A → i, A ∈ V_N, i ∈ V_T.

Proof. Let G' = (V_N', V_T, S, P'), where V_N ⊆ V_N' and P' contains all convenient rules from P. Consider an inconvenient rule
  u → v_1 i_1 v_2 i_2 … i_n v_{n+1},  i_k ∈ V_T, v_k ∈ V_N*.
We add to P' the rules
  u → v_1 X_{i_1} v_2 X_{i_2} … X_{i_n} v_{n+1}
  X_{i_k} → i_k,  k = 1..n,
where the X_{i_k} ∈ V_N' are fresh non-terminals.

Key ideas in the transformation!
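The construction in the proof can be sketched in a few lines. In the version below (function and names such as X_a are my own illustration) a rule counts as "convenient" when its RHS is a single symbol, so only longer rules get their terminals lifted:

```python
def lift_terminals(rules, terminals):
    """Lemma (A -> i) construction: in every rule whose RHS is longer
    than one symbol, replace each terminal i by a fresh non-terminal
    X_i and add the rule X_i -> i."""
    new_rules, fresh = [], {}
    for lhs, rhs in rules:
        if len(rhs) > 1:                       # inconvenient rule
            lifted = []
            for s in rhs:
                if s in terminals:
                    fresh.setdefault(s, "X_" + s)
                    lifted.append(fresh[s])    # terminal -> fresh X_i
                else:
                    lifted.append(s)
            rhs = tuple(lifted)
        new_rules.append((lhs, rhs))
    new_rules += [((X,), (i,)) for i, X in fresh.items()]  # X_i -> i
    return new_rules

# S -> aSb becomes S -> X_a S X_b, plus X_a -> a and X_b -> b;
# the convenient rule S -> a is kept unchanged.
out = lift_terminals([(("S",), ("a", "S", "b")), (("S",), ("a",))],
                     {"a", "b"})
assert (("S",), ("X_a", "S", "X_b")) in out
assert (("X_a",), ("a",)) in out and (("S",), ("a",)) in out
```

Every added rule is context-free, so the transformed grammar stays Type-2, exactly as the lemma requires.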

The Chomsky Hierarchy (cont'd)

What is the relationship between L_0, L_1, L_2, L_3?

Closure properties of Chomsky families

Definition. Let ∘ be a binary operation on a family of languages L. We say that the family L is closed under the operation ∘ if L_1, L_2 ∈ L implies L_1 ∘ L_2 ∈ L.

Let G_1 = (N_1, T_1, S_1, P_1) and G_2 = (N_2, T_2, S_2, P_2).

Closure of Chomsky families under union. The families L_0, L_1, L_2, L_3 are closed under union.

Key idea in the proof:
  G = (N_1 ∪ N_2 ∪ {S}, T_1 ∪ T_2, S, P_1 ∪ P_2 ∪ {S → S_1, S → S_2})

Examples: see whiteboard
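The union construction can be written out directly. A sketch, assuming grammars encoded as (N, T, S, P) tuples with disjoint non-terminal sets (rename first if they are not) and an illustrative fresh start symbol "S0":

```python
def union_grammar(G1, G2):
    """Union construction: fresh start S0 with the two extra rules
    S0 -> S1 and S0 -> S2; disjoint non-terminal sets assumed."""
    N1, T1, S1, P1 = G1
    N2, T2, S2, P2 = G2
    S0 = "S0"
    return (N1 | N2 | {S0}, T1 | T2, S0,
            P1 + P2 + [((S0,), (S1,)), ((S0,), (S2,))])

G1 = ({"S1"}, {"a"}, "S1", [(("S1",), ("a",))])   # L(G1) = {a}
G2 = ({"S2"}, {"b"}, "S2", [(("S2",), ("b",))])   # L(G2) = {b}
N, T, S0, P = union_grammar(G1, G2)
# The only new rules are the two start alternatives:
assert ((S0,), ("S1",)) in P and ((S0,), ("S2",)) in P
```

A first derivation step must choose S0 → S1 or S0 → S2, after which only the rules of the corresponding grammar can apply, so L(G) = L(G1) ∪ L(G2).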

Closure properties of Chomsky families

Closure of Chomsky families under product. The families L_0, L_1, L_2, L_3 are closed under product (concatenation).

Key ideas in the proof

For L_0, L_1, L_2:
  G_p = (N_1 ∪ N_2 ∪ {S}, T_1 ∪ T_2, S, P_1 ∪ P_2 ∪ {S → S_1 S_2})

For L_3:
  G_p = (N_1 ∪ N_2, T_1 ∪ T_2, S_1, P_1' ∪ P_2)
where P_1' is obtained from P_1 by replacing the rules A → p (p ∈ T_1*) with A → p S_2.

Examples: see whiteboard
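The L_0–L_2 case of the product construction is a one-rule change; a sketch with the same (N, T, S, P) encoding and the illustrative fresh start symbol "S0" as before:

```python
def product_grammar(G1, G2):
    """Product (concatenation) construction for L0, L1, L2: a fresh
    start S0 with the single extra rule S0 -> S1 S2; disjoint
    non-terminal sets assumed."""
    N1, T1, S1, P1 = G1
    N2, T2, S2, P2 = G2
    S0 = "S0"
    return (N1 | N2 | {S0}, T1 | T2, S0,
            P1 + P2 + [((S0,), (S1, S2))])

G1 = ({"S1"}, {"a"}, "S1", [(("S1",), ("a",))])   # L(G1) = {a}
G2 = ({"S2"}, {"b"}, "S2", [(("S2",), ("b",))])   # L(G2) = {b}
N, T, S0, P = product_grammar(G1, G2)
assert ((S0,), ("S1", "S2")) in P    # the only new rule
```

The regular (L_3) case needs the separate construction above because S0 → S1 S2 is not a regular rule: two non-terminals appear on the RHS.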

Closure properties of Chomsky families

Closure of Chomsky families under Kleene closure. The families L_0, L_1, L_2, L_3 are closed under the Kleene closure operation.

Key ideas in the proof

For L_0, L_1:
  G' = (V_N ∪ {S', X}, V_T, S', P ∪ {S' → λ, S' → XS} ∪ {Xi → Si, Xi → XSi : i ∈ V_T})
The newly introduced rules are of type 1, so G' does not modify the type of G.

For L_2:
  G' = (V_N ∪ {S'}, V_T, S', P ∪ {S' → SS', S' → λ})

For L_3:
  G' = (V_N ∪ {S'}, V_T, S', P ∪ P' ∪ {S' → S, S' → λ})
where P' is obtained from the category II rules of P, namely: if A → p ∈ P (p ∈ V_T*) then A → pS ∈ P'.

Examples: see whiteboard
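The context-free (L_2) case above is again a small mechanical construction; a sketch, with λ encoded as the empty tuple and "S0" standing in for the fresh start symbol S':

```python
def star_grammar(G):
    """Kleene-closure construction for a context-free grammar:
    fresh start S0 with S0 -> S S0 | lambda (empty tuple)."""
    N, T, S, P = G
    S0 = "S0"
    return (N | {S0}, T, S0,
            P + [((S0,), (S, S0)), ((S0,), ())])

G = ({"S"}, {"a"}, "S", [(("S",), ("a",))])   # L(G) = {a}
N, T, S0, P = star_grammar(G)
assert ((S0,), ("S", S0)) in P and ((S0,), ()) in P
```

Each application of S0 → S S0 appends one more word of L(G), and S0 → λ stops the repetition, so L(G') = L(G)*, including the empty word.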

Closure properties of Chomsky families

Observation. Union, product and Kleene closure are called regular operations. Hence, the language families from the Chomsky classification are closed under the regular operations.

Summary

Chomsky hierarchy
Closure properties of Chomsky families