THEORY OF COMPUTATION IT T55 III YEAR


UNIT I

PART A

1. What is meant by finite automata?
2. What is a formal language?
3. What are the two ways of representing an automaton?
4. What is a formal language?
5. What are the two main types of finite automata?
6. Define a language.
7. What is meant by a transition?
8. What are a regular expression and a regular language?
9. What are two-way finite automata? (Nov 14)
10. What is meant by ε-closure? (Nov 16)
11. State the equivalence theorem of NFA and DFA.
12. Define the language accepted by an NFA with ε-moves.
13. Define regular expression. (Nov 14)
14. Write down the rules for defining a regular expression.
15. What are the differences between NFA and DFA? (Nov 13)
16. Write down the operations on regular expressions.
17. State some applications of regular expressions.
18. What are the two types of finite automata with output? (Nov 16)
19. State the difference between Mealy and Moore machines. (Nov 16)
20. State the characteristics of automata. (Nov 16)
21. Construct a DFA for the set of all strings over {0, 1} that have an even number of 0's and an even number of 1's.
22. Draw the block diagram of a finite automaton.
23. Explain the six-tuples of Mealy and Moore machines.
24. Explain finite automata with ε-moves.
25. Define the rules for a transition diagram.
26. What are the applications of automata theory? (Nov 13)
27. Define Finite Automaton (FA). (Apr 15)
28. What is meant by a token? (Apr 15)
29. What is meant by minimization of a DFA?
30. Show that (r*)* = r* for a regular expression r. (Nov 13)

PART B

1. Explain the equivalence of NFA and DFA with examples. (Nov 16)
2. Discuss the importance of ε-transitions with examples. (Nov 16)
3. Design a DFA to accept the language L = {w | w has both an even number of 0's and an even number of 1's}. (A simulation sketch follows this list.)
4. Prove that a language L is accepted by some DFA if and only if L is accepted by some NFA.
5. Distinguish between DFA and NFA. Give a formal definition of finite automata. (Nov 16)
6. Construct a DFA that accepts those strings over {a, b} which begin with an a followed by b^n (n >= 0).
7. For the following NFA, find the equivalent DFA. (Nov 16)

          0           1
   q0     {q0, q1}    {q0}
   q1     {q2}        {q2}
   q2     {q3}        {q3}
   q3     Ø           Ø

8. Write short notes on Moore and Mealy machines. (Nov 16) (Nov 14) (Apr 15)
9. Discuss finite automata with output. (Apr 15)
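
As a worked reference for Part A question 21 and Part B question 3 above, here is a minimal Python sketch (not part of the original question bank) that simulates the required four-state DFA. Each state records the parity of the 0's and 1's read so far; the state names ("ee" for even/even, and so on) and the dictionary encoding of the transition table are illustrative assumptions.

# DFA for L = {w in {0,1}* | w has an even number of 0's and an even number of 1's}.
# State "xy" records (parity of 0's, parity of 1's); "ee" is the start and only final state.
DELTA = {
    ("ee", "0"): "oe", ("ee", "1"): "eo",
    ("oe", "0"): "ee", ("oe", "1"): "oo",
    ("eo", "0"): "oo", ("eo", "1"): "ee",
    ("oo", "0"): "eo", ("oo", "1"): "oe",
}

def accepts(w: str) -> bool:
    state = "ee"                      # start state: even 0's, even 1's
    for symbol in w:
        state = DELTA[(state, symbol)]
    return state == "ee"              # accept iff both parities are even again

if __name__ == "__main__":
    for w in ["", "0011", "0101", "011", "110"]:
        print(repr(w), accepts(w))

Drawn as a diagram, the same table is the familiar square of four states, with 0-edges flipping the first parity and 1-edges flipping the second.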

UNIT II

PART A

1. Define context-free grammar. (Nov 13)
2. When is a grammar said to be ambiguous? (Nov 13)
3. Write the principal closure properties of regular languages. (Nov 13)
4. What is a regular set? (Nov 14)
5. Define the term production. Give an example. (Nov 14)
6. Describe the applications of regular expressions. (Nov 16)
7. What is meant by leftmost derivation and rightmost derivation?
8. Write about sentential forms.
9. What are the applications of context-free languages?
10. What is meant by unambiguous?
11. What is meant by an ambiguous grammar? (Nov 16) (Nov 13) (Apr 15)
12. Describe the applications of regular expressions. (Nov 16)
13. What is BNF? (Apr 15)
14. State Chomsky normal form.
15. State Greibach normal form.
16. What are the properties of the CFL generated by a CFG?
17. What are the three ways to simplify a context-free grammar?
18. Find the grammar for the language L = {a^(2n) bc | n > 1}.
19. Find the language generated by: S → 0S1 | 0A | 0 | 1B | 1; A → 0A | 0; B → 1B | 1. (An enumeration sketch follows this list.)
20. Differentiate between sentences and sentential forms.
21. What is a derivation tree?
22. Draw the derivation tree for a + a * a.
23. Define the pumping lemma for regular languages. (Nov 16)
24. What is handle pruning?
25. State the pumping lemma for context-free languages. (Nov 16)
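
To make "the language generated by a grammar" (question 19 above) concrete, the following Python sketch enumerates the short terminal strings derivable from a CFG by breadth-first rewriting of the leftmost nonterminal. The dictionary encoding, the length bound, and the helper name are assumptions made for illustration; the grammar is the one reconstructed in question 19. Because no production is erasing, a sentential form can never shrink, so pruning by length is sound.

from collections import deque

# CFG of question 19: uppercase letters are nonterminals, 0 and 1 are terminals.
GRAMMAR = {
    "S": ["0S1", "0A", "0", "1B", "1"],
    "A": ["0A", "0"],
    "B": ["1B", "1"],
}

def derivable_strings(start="S", max_len=4):
    """Terminal strings of length <= max_len derivable from `start`."""
    seen, results = {start}, set()
    queue = deque([start])
    while queue:
        form = queue.popleft()
        idx = next((i for i, c in enumerate(form) if c in GRAMMAR), None)
        if idx is None:                      # no nonterminal left: a sentence
            results.add(form)
            continue
        for alt in GRAMMAR[form[idx]]:       # expand the leftmost nonterminal
            new = form[:idx] + alt + form[idx + 1:]
            if len(new) <= max_len and new not in seen:
                seen.add(new)
                queue.append(new)
    return sorted(results)

if __name__ == "__main__":
    print(derivable_strings())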

PART B

1. What is a derivation tree? Construct derivation trees for the word ababbbba using the grammar G consisting of the productions {S → AbS, A → aS, S → ba, A → b}.
2. Prove that every language defined by a regular expression is also accepted by a finite automaton.
3. Construct a minimal (reduced-state) DFA equivalent to the regular expression 10 + (0 + 11)0*1.
4. Briefly explain the pumping lemma for regular sets. (Nov 16)
5. Find the language generated by the following grammar: G = ({S, A, B}, {a, b}, S, P), where P is the set of productions {S → AB, S → AA, A → aB, A → ab, B → b}.
6. What is Chomsky normal form? How can a grammar be put in CNF? Illustrate. (Nov 14)
7. Write a context-free grammar for the language L = {a^n b a^n | n >= 1}. (Nov 13)
8. Find the Chomsky normal form for the following grammar: G = ({S, A, B}, {a, b}, P, S), P: S → bA | aB, A → bAA | aS | a, B → aBB | bS | b. (Nov 13)
9. Convert the following grammar to Greibach normal form: G = ({A1, A2, A3}, {a, b}, P, A1), where P consists of: A1 → A2A3, A2 → A3A1 | b, A3 → A1A2 | a. (Apr 15)
10. Describe top-down parsing. (Nov 13)
11. Show that {0^i 1^j | gcd(i, j) = 1} is not regular. (Apr 15)
12. Convert the following CFG to Chomsky normal form: S → AA, A → BB, B → abb | b | bb. (A CYK membership sketch follows this list.)
13. Discuss the normal forms for context-free grammars. (Nov 16)
14. State and prove the pumping lemma for CFLs. (Nov 16)
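
Questions 6, 8, 12 and 13 above all turn on Chomsky normal form; one practical payoff of CNF is the CYK membership algorithm, which the Python sketch below implements. The grammar used (a CNF grammar for {a^n b^n | n >= 1}) and the table encoding are illustrative assumptions, not one of the grammars from the questions.

from itertools import product

# CNF grammar for {a^n b^n | n >= 1}:  S -> AC | AB,  C -> SB,  A -> a,  B -> b
BINARY = {("A", "C"): {"S"}, ("A", "B"): {"S"}, ("S", "B"): {"C"}}
UNIT = {"a": {"A"}, "b": {"B"}}
START = "S"

def cyk(w: str) -> bool:
    """CYK membership test for a grammar in Chomsky normal form."""
    n = len(w)
    if n == 0:
        return False                         # no epsilon production in this grammar
    # table[i][j] holds the nonterminals deriving the substring w[i : i+j+1]
    table = [[set() for _ in range(n)] for _ in range(n)]
    for i, ch in enumerate(w):
        table[i][0] = set(UNIT.get(ch, set()))
    for length in range(2, n + 1):           # span length
        for i in range(n - length + 1):      # span start
            for split in range(1, length):   # split into two shorter spans
                left = table[i][split - 1]
                right = table[i + split][length - split - 1]
                for pair in product(left, right):
                    table[i][length - 1] |= BINARY.get(pair, set())
    return START in table[0][n - 1]

if __name__ == "__main__":
    for w in ["ab", "aabb", "aab", "ba"]:
        print(w, cyk(w))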

UNIT III

PART A

1. What are the special features of a Turing Machine? (Nov 13)
2. When is checking off symbols used in a Turing Machine? (Nov 13)
3. State the notations for a Turing machine.
4. What is a Nondeterministic Turing Machine? (Nov 14)
5. What are (a) recursively enumerable languages and (b) recursive sets? (Nov 16)
6. Write the concepts of the Universal Turing Machine. (Nov 15)
7. When do you say that a Turing Machine accepts a string? (Nov 15)
8. What is the difference between a Turing machine and a finite automaton?
9. What are the components of a Turing machine? (Apr 16)
10. What are the elements of a Turing Machine? (Apr 16)
11. Who invented the Turing Machine? (Nov 16)
12. How many tuples are in a Turing machine? What are they?
13. What are the different types of Turing machines?
14. What is a multitape Turing machine? (Apr 15) (Apr 16)
15. What is a Turing machine with multiple tapes?
16. What is a Turing machine with an infinite tape?
17. What is a recursive algorithm?
18. What is a recursively enumerable language? (Nov 16)
19. What is the difference between semi-infinite and two-way infinite tapes?
20. Write about storage in the finite control.
21. Write about shifting over in the Turing machine.
22. What is a Turing machine?
23. Define the instantaneous description of a TM.
24. What are the applications of TMs?
25. What is the basic difference between a 2-way FA and a TM?
26. What are the techniques for Turing machine construction?
27. Differentiate PDA and TM.
28. What is Church's Hypothesis? (Apr 16)
29. What is a multidimensional TM?

PART B

1. With a proper diagram, briefly explain the working of a Turing Machine. (Nov 14) (Nov 15)
2. Explain the method of constructing a Turing Machine. (Nov 16)
3. Explain briefly the Church Thesis. (Apr 16) (Nov 15)
4. Design a Turing machine to accept the language {a^n b^n | n >= 1}. (Nov 15) (A simulator sketch follows this list.)
5. Prove that every language accepted by a multitape Turing Machine is recursively enumerable. (Nov 15)
6. Briefly explain the programming techniques of Turing machines.
7. What is a Turing Machine? Explain the versions (types) of Turing Machines. (Nov 16) (Nov 13) (Apr 16)
8. Design a Turing Machine M to perform proper subtraction. (Nov 13)
9. Consider the TM described by the transition table below. Describe the processing of (a) 011, (b) 0011, (c) 001. Which of the above strings are accepted by M?

   Present state    0       1       x       y       B
   q1               xRq2                            BRq5
   q2               0Rq2    yLq3            yRq2
   q3               0Lq4            xRq5    yLq3
   q4               0Lq4            xRq1
   q5                                       yRq5    BRq6
   q6 (accept)

   (An entry such as xRq2 means: write x, move right, and enter state q2; a blank cell halts the machine.)

10. Design a Turing machine M to recognize the language L = {1^n 2^n 3^n : n >= 1}.
11. Construct a Turing Machine to accept the set L of all strings over {0, 1} ending with 010. (Nov 16)
12. Explain the Turing Machine model with a suitable example. (Nov 14)
13. Discuss modifications of Turing machines. (Apr 15)
14. Write a note on variations of TMs. (Apr 15)
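
For design questions such as Part B question 4 above, a mechanical check of a candidate machine is helpful. Below is a minimal single-tape Turing machine simulator in Python, loaded with one possible marking-style machine for {a^n b^n | n >= 1}; the state names and the transition table are one plausible design, not the prescribed answer, and the step bound that cuts off non-terminating runs is an implementation convenience.

# DELTA maps (state, symbol) -> (new_state, written_symbol, move), move in {"L", "R"}.
BLANK = "B"

# Mark an 'a' as X, replace the matching 'b' with Y, and repeat; accept in qacc.
DELTA = {
    ("q0", "a"): ("q1", "X", "R"),
    ("q0", "Y"): ("q3", "Y", "R"),
    ("q1", "a"): ("q1", "a", "R"),
    ("q1", "Y"): ("q1", "Y", "R"),
    ("q1", "b"): ("q2", "Y", "L"),
    ("q2", "a"): ("q2", "a", "L"),
    ("q2", "Y"): ("q2", "Y", "L"),
    ("q2", "X"): ("q0", "X", "R"),
    ("q3", "Y"): ("q3", "Y", "R"),
    ("q3", BLANK): ("qacc", BLANK, "R"),
}

def run(word, start="q0", accept="qacc", max_steps=10_000):
    tape = dict(enumerate(word))          # sparse tape; absent cells read as blank
    state, head = start, 0
    for _ in range(max_steps):
        if state == accept:
            return True
        key = (state, tape.get(head, BLANK))
        if key not in DELTA:
            return False                  # machine halts without accepting
        state, tape[head], move = DELTA[key]
        head += 1 if move == "R" else -1
    return False                          # step bound exceeded: treat as rejection

if __name__ == "__main__":
    for w in ["ab", "aabb", "aab", "ba", ""]:
        print(repr(w), run(w))

The transition table of question 9 above can be checked the same way by renaming its entries into this (state, symbol) -> (state, symbol, move) form.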

UNIT IV

PART A

1. What is a pushdown automaton? (Apr 16)
2. Define pushdown automata. (Apr 16)
3. Define NPDA. (Apr 16)
4. Explain the transition mapping of a PDA.
5. Why is there a need for a stack in a PDA?
6. What is the equivalence of PDAs and CFGs? (Nov 16)
7. Compare NFA and PDA.
8. Specify the two types of moves in a PDA.
9. What are the two ways by which a language is accepted by a PDA? (Nov 15)
10. What is the relationship between deterministic pushdown automata and context-free languages? (Nov 15)
11. What are the different types of language acceptance by a PDA?
12. Define acceptance by final state.
13. Define acceptance by empty stack.
14. Define instantaneous description in a PDA.
15. What are the components of a pushdown automaton?
16. Compare NPDA and DPDA.

PART B

1. Define Pushdown Automata (PDA) and give the moves of the PDA that accepts {wcw^R | w ∈ (0 + 1)*} by empty stack. (Nov 15) (A sketch follows this list.)
2. Design a PDA to accept the language L = {a^i b^j c^k : i + j = k, i >= 0, j >= 0}. (Nov 15)
3. Explain the closure properties of CFLs. (Nov 15) (Apr 16) (Nov 13)
4. Show that if L is a context-free language, then there exists a PDA M such that L = N(M). (Nov 13)
5. Discuss the equivalence of PDAs and CFLs. (Nov 16) (Apr 16)
6. Describe the importance of deterministic pushdown automata. (Nov 16)
7. Write a detailed note on deterministic PDAs. (Nov 14)
8. Define a PDA with its seven tuples. Explain in detail with the input string ababbcbbaba.
9. Explain in detail the decision algorithms for CFLs.
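
For Part B question 1 above, the following Python sketch mirrors the two phases of the PDA that accepts L = {wcw^R | w ∈ (0 + 1)*} by empty stack: push each symbol of w, switch phase on the centre marker c, pop on every matching symbol of w^R, and finally pop the initial stack symbol Z by an ε-move. The phase names and the list-as-stack encoding are illustrative assumptions.

# Deterministic PDA for {w c w^R | w in (0+1)*}, acceptance by empty stack.
def accepts_by_empty_stack(s: str) -> bool:
    stack = ["Z"]                 # Z is the initial stack symbol
    phase = "push"
    for ch in s:
        if phase == "push":
            if ch in "01":
                stack.append(ch)  # push the symbols of w
            elif ch == "c":
                phase = "match"   # centre marker: start matching w^R
            else:
                return False
        else:                     # phase == "match"
            if stack and stack[-1] == ch:
                stack.pop()       # top of stack holds the next expected symbol
            else:
                return False
    if phase == "match" and stack == ["Z"]:
        stack.pop()               # epsilon-move: erase Z to accept by empty stack
    return phase == "match" and not stack

if __name__ == "__main__":
    for s in ["c", "0c0", "01c10", "01c01", "0c1"]:
        print(s, accepts_by_empty_stack(s))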

UNIT V

PART A

1. Define top-down parsing. (Nov 16)
2. Define bottom-up parsing. (Nov 16) (Nov 15)
3. Define handle and handle pruning. (Apr 16)
4. What is a reduction? (Apr 16)
5. Define a parsing table. (Nov 15)
6. What is the concept of a stack? (Nov 14)
7. Use top-down parsing to parse aaab using the following grammar: S → AB, A → aA / ε, B → b / bB. (Apr 15)
8. Explain the actions used in bottom-up parsing.
9. Distinguish between top-down and bottom-up parsing. (Nov 16)
10. State a decision algorithm.
11. State the relationship between a derivation and a derivation tree.
12. How do you eliminate left-recursive productions?
13. List the properties of an LR parser.
14. Mention the types of LR parsers.
15. What are the problems with top-down parsing?
16. Write the algorithm for FIRST and FOLLOW. (A computation sketch follows the Part B list below.)
17. Write short notes on YACC.
18. Define LR(0) items.
19. What is meant by viable prefixes?

PART B

1. Describe the function of an SLR parser and its limitations. (Nov 15)
2. How will you construct the parsing table for an LALR parser? Explain the procedure with an example. (Nov 15)
3. Write in detail about the LALR parsing algorithm. (Nov 13)
4. Explain the working of CFGs and PDAs with a suitable example. (Apr 16)
5. How will you construct the parsing table for an LALR parser? Explain the procedure with an example.
6. Consider the grammar E → TE', E' → +TE' / ε, T → FT', T' → *FT' / ε, F → (E) / id. Construct the predictive parser and parse the input string id + id * id.
7. What is shift-reduce parsing? Give an example, and explain the stack implementation of shift-reduce parsing.
8. Explain the construction of a top-down parser with an example. (Nov 16)
9. Elaborate on the properties of LR(k) grammars. (Nov 16)
10. Consider the grammar E → E+T / T, T → T*F / F, F → (E) / id. Construct the SLR parsing table.
11. Consider the grammar S → CC, C → cC / d. Construct the sets of LR(1) items.
12. Explain the working of the LALR parser with suitable examples. (Apr 16)
13. Explain the construction of a top-down parser with an example. (Nov 16)
14. Construct the CLR parsing table for S → AA, A → aA / b. (Nov 13)
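
Part A question 16 asks for the FIRST and FOLLOW algorithm. The Python sketch below computes both sets by fixpoint iteration, using the LL(1) expression grammar of Part B question 6 as data; the dictionary encoding, the ε and $ markers, and the helper seq_first are illustrative assumptions.

EPS = "ε"

# Expression grammar of Part B question 6 (primes written with an apostrophe).
GRAMMAR = {
    "E":  [["T", "E'"]],
    "E'": [["+", "T", "E'"], [EPS]],
    "T":  [["F", "T'"]],
    "T'": [["*", "F", "T'"], [EPS]],
    "F":  [["(", "E", ")"], ["id"]],
}

def first_and_follow(grammar, start):
    nts = set(grammar)
    first = {nt: set() for nt in nts}
    follow = {nt: ({"$"} if nt == start else set()) for nt in nts}

    def seq_first(seq):
        """FIRST of a sequence of grammar symbols."""
        out = set()
        for sym in seq:
            sym_first = first[sym] if sym in nts else {sym}
            out |= sym_first - {EPS}
            if EPS not in sym_first:
                return out
        out.add(EPS)                      # the whole sequence can vanish
        return out

    changed = True
    while changed:                        # iterate until neither set grows
        changed = False
        for nt, alts in grammar.items():
            for alt in alts:
                f = seq_first(alt)        # FIRST(nt) grows from its right-hand sides
                if not f <= first[nt]:
                    first[nt] |= f
                    changed = True
                for i, sym in enumerate(alt):
                    if sym in nts:        # FOLLOW rule for each nonterminal occurrence
                        rest = seq_first(alt[i + 1:])
                        extra = (rest - {EPS}) | (follow[nt] if EPS in rest else set())
                        if not extra <= follow[sym]:
                            follow[sym] |= extra
                            changed = True
    return first, follow

if __name__ == "__main__":
    fi, fo = first_and_follow(GRAMMAR, "E")
    for nt in sorted(GRAMMAR):
        print(f"FIRST({nt}) = {sorted(fi[nt])}   FOLLOW({nt}) = {sorted(fo[nt])}")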