Generating a Sentence from a Thought


W. Faris 1 and K.H. Cheng
Computer Science Department, University of Houston, Houston, Texas, USA

Abstract: It is desirable for an intelligent program to communicate with humans using a natural language. We have recently developed a program to parse a sentence into a thought using a learned English grammar. Without learning a separate grammar, this paper discusses how we convert the learned grammatical structures into role structures. These role structures can then be used by our algorithm to generate a sentence that reflects the contents of a given thought. Roles define the purpose of grammar terms and link grammatical knowledge to semantic knowledge. Since this linkage separates semantics from grammatical structures completely, the thoughts used in generating a sentence only need to be logical thoughts. Consequently, the creator of a thought may define the thought's content based simply on its semantics and not be concerned with the grammatical details of the natural language.

Keywords: Grammar, Sentence Generation, Semantic Representation, Natural Language Processing

1 Introduction

One important objective for most artificial intelligence programs is to possess the ability to communicate with humans using a natural language. The communication problem has two major objectives: comprehending the intention of a given sentence, and generating a sentence for a thought that the program wants to express. Recently, we have developed a communication sub-system for the A Learning Program System (ALPS) project [1]. The goal of ALPS is to learn all types of knowledge, including the knowledge involved in communication. Our system does not use any pre-coded grammatical knowledge, but instead acquires it during program execution. Our system first learns the grammar terms (parts of speech) of the English language along with their details, and then uses this grammatical knowledge to parse a given sentence [2].
Subsequently, we have developed solutions to understand declarative sentences [3, 4], including identifying the correct knowledge referenced by various forms of pronouns [5]. In this paper, we discuss how we transform the grammatical knowledge acquired for parsing a sentence into a bidirectional grammar [6] and present how to use that grammar to generate a sentence when given a thought. (Footnote 1: Research is supported in part by GAANN: Doctoral Training in Computer and Computational Sciences, US Department of Education, under grant #P200A.) Reiter [7] identifies four stages in generating a sentence: content determination, sentence planning, surface generation, and morphology. Content determination involves the formation of an internal representation for the information to be expressed, such as feature structures [8]. Sentence planning is the task of mapping the semantic representation to a linguistic one. For instance, it is responsible for identifying the determiner, adjective, and noun when given a semantic representation such as the red ball. Kasper [9] uses sentence planning to develop an interface between a semantic domain and the Penman sentence generator [10]. However, it requires specialized modifications based on the domain's knowledge base. Surface generation is the process of properly correlating grammatical terms to one another, such as recognizing that an article must precede the noun. The semantic head-driven generation algorithm [11] is an example of a surface generation algorithm. However, it requires the use of a grammar that does not provide a clear separation between the grammatical and the semantic knowledge. In addition, the grammar used is only for sentence generation, and a separate grammar is required for parsing. TG/2 [12] is another surface generation tool, but it is limited to a predefined grammar. Finally, morphology requires the use of linguistic rules to produce the correct form of a word, examples of which can be found in [13].
One common approach to a bidirectional grammar uses the same grammatical structures both to parse and to generate a sentence. In other words, when a sentence is being interpreted, parsers identify grammar terms using a set of acceptable sequences and alternatives provided by the grammar, and these same grammar terms are used to generate a sentence. The problem is that grammars are traditionally designed for parsing and take text as input, whereas in generating a sentence the input is semantic knowledge. Trying to use the same components of these grammars to generate a sentence has its limitations [14]. Instead, our definition of a bidirectional grammar simply requires that one grammar be used for both purposes; it may use different components of the same grammar to accomplish the two tasks. The original definition of our grammar uses the grammatical structures to parse a sentence. During the learning of this grammar, we construct knowledge for generating sentences using components known as roles. The role of a grammar term defines the purpose of the term and acts as a bridge between grammatical knowledge and its semantics. Specifically, the knowledge created includes role

sets and role sequences. A role set represents the alternative semantic representations of a grammar term, while a role sequence defines a precise way to express a role. Our program compiles this knowledge and sets up the bridge between a grammar term and its associated role. By using these roles, role sets, role sequences, and bridges, our program removes the dependency on grammatical terms. The process for generating a sentence begins by identifying a sequence of roles that best matches the information stored in the given thought. Each role is then called sequentially to produce a textual representation of that role. Finally, each word is transformed into the right form based on the properties of the knowledge that it represents. Because of the existence of roles, the grammar acquired by our solution does not require special knowledge of the domain. In addition, since a role may be associated with multiple grammar terms, our approach allows for a simple implementation of a complex grammar. The principles behind creating role sequences ensure the correct grammatical relationships between each term in the sequence, our form of surface generation. As a result, the generated sentence automatically follows the structure taught in the grammar when using the chosen sequence to express the given thought. Our approach has the advantage that no surface generation step is needed at the time of generating a sentence, and a thought may be created based on its logical meaning instead of the grammatical requirements of the natural language. Note that we are not interested in how or why the program produces a thought, but in how to generate a sentence according to the contents of a given thought. Currently, our program may use properly constructed thoughts created in three situations. The first situation occurs when a sentence presented to the program by a human user is parsed, creating a thought.
We have tested our solution on thoughts such as declarations and questions with various kinds of verbs and pronouns. The second situation occurs when the program is responding to a question. It uses the question thought to create a declaration thought that includes the found answer. The third situation happens during the understanding of a sentence that involves an action. Currently, when certain actions [3] are learned, their effects on various logical objects involved in the action are presented to our program as a sequence of English sentences. The stored effects are individual thoughts created from parsing each sentence. When the program attempts to understand a given sentence that uses that action verb, the thought for each actual effect may be created easily from the prototype effect thought. The actual effect sentence is then generated using our proposed solution. For example, one effect for the action buy is The buyer gives the price to the seller. Given the sentence John buys Jack 2 apples from Mary for 4 dollars., our solution produces the sentence John gives Mary 4 dollars. as one actual effect. The rest of this paper is organized as follows. In the next section, we discuss key components of the learned grammar and describe how roles are used to express the intention of a grammar term. In addition, we demonstrate how multiple roles may be combined to form the structure of thoughts expected by our solution. Section 3 discusses the processing needed, at the time of learning the English grammar, to form a bidirectional grammar. This includes building role sets, role sequences, and sequence collections. Section 4 presents the algorithm to generate an English sentence from a given thought. 
It then describes the algorithm's three major steps in detail: selecting the best-fit role sequence to express the given role, identifying the words representing each role in the selected sequence, and transforming each word into the right form based on the properties of the knowledge that the word represents. Finally, Section 5 concludes the paper and presents some challenges and future objectives of the project.

2 Grammar

The learning of the English grammar is done incrementally in ALPS, i.e., our program first learns a subset of the English grammar, and details may then be added to increase the kind and complexity of the sentences that the program can handle. Our program originally uses the learned grammar to parse and understand English sentences. It first learns grammar terms such as sentence, complete subject, verb, noun phrase, and preposition. Each grammar term has several components that define the term and how to use it. Some major components introduced in [2] are structure, role, and rule. The structure of a grammar term defines the exact grammatical format of the term. Two major possibilities for the structure of a grammar term are a sequence and an alternative of grammar terms. A rule specifies a condition that must be satisfied by either the grammar term or its structure. The role of a grammar term defines the purpose of the term. Another component of a grammar term, introduced in [3, 4], is the control, which is the grammar term in a sequence that carries out an important duty in understanding the semantics of the term. We will show in this paper how to use the same grammar to generate an English sentence when given a thought to express. A role associated with a grammar term has three important aspects: the grammatical role, the semantic role, and its structure. The grammatical role, identified by italics, is the label given to the role depending on the grammatical purpose of the term.
For example, the role of the first noun phrase in a sentence is labeled as subject. The semantic role, called a logical object, is the label given to the role in relation to its semantic purpose within a higher-level role. For example, the intent of a declarative sentence using a be verb is to define some object of interest. As a result, its role, a declaration, requires two logical objects: definition and object-of-interest. The grammatical subject of the sentence serves the semantic purpose of object-of-interest. On the other hand, in a declaration using an action verb, the

same subject serves the purpose of actor if the declaration is in active voice. The structure of a role reflects the content of the role, and consists of several related logical objects that represent what the role is expressing. For example, in a thought that reflects an action, the logical objects are actor, act-on object, and act-for object. Consider another example: the aspect-of-an-object structure is one possible structure to realize the role associated with a noun phrase. The knowledge-of-interest referred to by an aspect-of-an-object structure represents an aspect or characteristic of a specific object, the object-of-interest. Consequently, aspect and object-of-interest are the two logical objects in this structure. Given specific values of these logical objects, the knowledge-of-interest may be inferred easily. For instance, if the logical objects aspect and object-of-interest refer to the height concept and Mt. Everest, respectively, then the intent of this role is to express the height of Mt. Everest. One way to express an aspect-of-an-object in English is to use a noun phrase that uses the preposition of. As a result, the grammar term for the preposition of is associated with an aspect-of-an-object role structure. Table 1 shows some role structures and their usage in ALPS. The above example shows a role structure whose logical objects are already provided. However, identifying the logical objects within a structure depends on the associated grammar term. For example, when attached to the preposition of and used in a noun phrase, aspect should be the simple subject grammatical role, while object-of-interest should be the object-of-the-preposition. On the other hand, a role with this structure could be attached to the possessive noun to express phrases such as John's weight. In this case, the object-of-interest is the grammatical role possessor, while the aspect should point to the role of the noun being modified, labeled term.
In order to create the correct associations, a dynamically generated bridge is created for each instance of the structure, linking logical objects to grammatical roles. The structure associated with of will have a bridge that links aspect to simple subject and object-of-interest to object-of-the-preposition. On the other hand, the bridge for the structure in the second example links aspect to term and object-of-interest to possessor. By using different bridges, we may apply the same role structure to different grammar terms to express the same knowledge in various ways. The usage of this role structure with its bridges is shown in Figure 1. Note that these bridges link in both directions, one used in generating a sentence and the other in creating the thought while parsing a sentence. It is important to note that the thought given to our program to generate a sentence is required to be a logical thought, i.e., its content is identified by logical objects instead of grammatical terms. As a result, the creator of the thought only needs to define the logical meaning of the thought, and does not need to be concerned with the grammatical details of the natural language. We will use the thought for the sentence The next prime number after 7 is 11. as an example. Since this is a declarative sentence, the thought is a declaration. Every declaration depends on its verb role, which is a define role in this example. A define role uses a definition to define an object-of-interest. Therefore, in the declaration, the logical object object-of-interest is the next prime number after 7, and the definition is the number 11. The definition object may be represented easily by a whole-object role that contains the number 11. The logical object object-of-interest for the phrase the next prime number after 7 may be constructed in a way similar to a declaration. We use a relationship role structure for this expression since it depicts a number related to another number.
Table 1. Example role structures in ALPS

Category    | Role Structure      | Purpose                                             | Logical Objects                      | Example
Declaration | Define              | Defines a state, category, or property of an object | object-of-interest, definition       | Apples are fruit.
Declaration | Act                 | Expresses an action                                 | actor, act-on, act-for               | John gave Jill a gift.
Declaration | Possess             | Defines the possession of one object by another     | possessor, possession                | Mary has 2 homes.
Usage       | Aspect-of-an-Object | Represents an aspect or property of an object       | aspect, object-of-interest           | His weight
Usage       | Relationship        | Represents an object in relation to another object  | reference-object, object-of-interest | The ball on the table

[Figure 1. Two bridges of the aspect-of-an-object structure: the bridge for the preposition of links aspect to simple subject and object-of-interest to object-of-the-preposition (weight of John); the bridge for the possessive noun links aspect to term and object-of-interest to possessor (John's weight).]

A relationship role structure is defined by three logical objects: an object-of-interest, a reference-object, and a relation. In this example, 7 is the reference-object, after is the relation, and the next prime number is the object-of-interest. Finally, the next prime number is represented by a role

which contains the knowledge prime number, the modifier the, which represents a property of uniqueness, and the adjective next.

3 Sequence collection

At the time our program learns the grammar, the collection of role sequences for a grammar term is generated according to its grammatical structure. For a grammar term that is defined by a list of alternative grammar terms, its role set is composed of the role structures for the roles associated with its descendants. For example, two alternatives of preposition are of and relational prepositions such as above and before. The structures for their associated roles, aspect-of-an-object and relationship, are added to the role set for preposition. Note that the same grammar may be learned incrementally in many different orders. It is possible that a role is attached to a grammar term that is known to be a descendant of another term. Alternatively, a grammar term already having an associated role may later be taught as a descendant of another term. In both cases, the role structure for that role is added to the role set stored at the root grammar term. For instance, suppose the define role has already been associated with the be grammar term, and verb is initially taught to have two alternatives: action verb and linking verb. When the act role is taught to associate with the action verb, it is added to the role set for verb as an alternative. Similarly, the define role is also added to the role set for verb when be is taught as the child alternative term of linking verb. As a result, the role set for verb contains the act and define role structures as its alternatives. For a grammar term whose grammatical structure is a sequence, a role sequence is created based on its order and occurrence. Each item in the role sequence is a grammatical role, with a pointer to the role set of the corresponding grammar term. When a new role structure is added to the role set, it will be available dynamically to any role sequence that contains the role set.
For instance, a noun phrase may be defined by the sequence [nominal, prepositional phrase], where nominal is compulsory and has the simple subject role, and prepositional phrase is optional and defined in turn by the sequence [preposition, nominal], with this nominal having the object-of-the-preposition role. The corresponding role sequence created for noun phrase is [simple subject, prepositional, object-of-the-preposition], with the first item compulsory and the last two items optional. Unlike roles and role sets, which are attached to grammar terms, a role sequence is attached to each role structure in the role set of the control. Once a new role sequence is attached to a role structure, that structure may be expressed by the originating grammatical sequence. For example, the role sequence for noun phrase would be attached to each alternative role structure in the role set of preposition, such as aspect-of-an-object and relationship. Recall that a role structure such as aspect-of-an-object may be used by multiple terms, each having a different role sequence; the collection of these role sequences is called the sequence collection for that role structure. Any role sequence found in its sequence collection may be used to express the content of the role structure. For example, since the aspect-of-an-object structure may also be used by possessive noun, the role sequence for possessive noun, [possessor, term], is also added to the sequence collection for the aspect-of-an-object role structure. Figure 2 shows the sequence structures that may be taught for a noun phrase and the corresponding role sequences created for the aspect-of-an-object role structure. These dual structures allow one grammar to both parse and generate a sentence, using the grammatical structures and the role sequences, respectively. The next section will show how to choose from a sequence collection a specific role sequence to produce a statement that properly reflects the contents of the role.
[Figure 2. An example dual structure of grammatical structures and role sequences: the grammatical structures for noun phrase (nominal + prepositional phrase; possessive noun + noun) correspond to the role sequences for aspect-of-an-object (simple subject + prepositional + object-of-the-preposition; possessor + term).]

Finally, when a sequence is taught as a grammatical structure for a specific kind of a grammar term, the role structures in the role set of the control are not the appropriate place to store the generated role sequence. For example, the control of a sequence applicable to many different kinds of sentences is the verb. Consequently, the generated role sequence [subject, verb, predicate] is associated with each role structure in the role set for verb, in particular, define. On the other hand, decision question is a kind of sentence that has a special sequence structure, which applies only to it, yet it has the same control, the verb. If the generated role sequence [verb, subject, predicate] were also stored in each role structure of the role set of the control, such as define, then two errors could occur: either the special structure is used erroneously to generate sentences of other kinds, or decision questions are generated erroneously by the generic structure. To prevent these errors, when a sequence is taught for a kind of a grammar term, the generated role sequence is added to the sequence collection of that kind's unique role.
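The role sets, role sequences, and sequence collections of this section can be pictured with plain containers. The sketch below is our own minimal representation under the paper's terminology, not ALPS code; all names are illustrative.

```python
# A role set: the alternative role structures for a grammar term.
role_sets = {
    "preposition": ["aspect-of-an-object", "relationship"],
    "verb": ["act", "define"],
}

# A role sequence: grammatical roles in order, with a compulsory flag
# for each item, mirroring the noun-phrase and possessive-noun examples.
noun_phrase_seq = [("simple subject", True),
                   ("prepositional", False),
                   ("object-of-the-preposition", False)]
possessive_seq = [("possessor", True), ("term", True)]

# A sequence collection: every role sequence attached to one role
# structure; any member may be used to express that structure.
sequence_collections = {
    "aspect-of-an-object": [noun_phrase_seq, possessive_seq],
    "relationship": [noun_phrase_seq],
}

# Either sequence can express an aspect-of-an-object:
for seq in sequence_collections["aspect-of-an-object"]:
    print([role for role, _ in seq])
```

Keeping sequences as shared objects means a role structure added to a role set later becomes visible to every sequence that points at that set, the dynamic availability the paper describes.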

4 Sentence generation

Given that a thought is a collection of roles, each representing a unique semantic element to be expressed, the task of producing an English sentence to reflect that thought involves expressing the intent of each internal role. The intent of each role is represented by the structure of the role. Our algorithm to generate a sentence consists of two major steps: generate and make. The generate step, called by every role structure that needs to be expressed, is further subdivided into two steps: select and generatelist. The select step is our version of sentence planning, as it selects the role sequence from the sequence collection that best fits the given thought or role. In general, the sequence collection is retrieved from the role to be expressed. However, for thoughts, the location of the sequence collection depends on the thought's kind. The sequence that best fits is the one that expresses all the information contained within the role. The generatelist step assembles a list of word units according to the chosen sequence of roles. The make step performs the duty of morphology. It selects the correct form of the word based on the properties stored in each word unit, and applies any syntactic rules needed to generate a complete sentence. Recall that the sequence collection of a role contains multiple role sequences that may be used to express what the role represents. However, not all sequences apply to the given role, and some may be more desirable than others. The select step selects the sequence that best fits the given role. For example, suppose declaration has the following sequences in its collection: [subject, verb, predicate], [subject, verb, direct object], and [subject, verb, indirect object, direct object]. Now given a thought that expresses the act of a boy selling lemonade to his neighbors, although the first sequence may be used to express a declaration that defines a subject, it is not suitable to express a declaration about an action.
The second sequence is for an action, but does not express all the information stored in the thought; specifically, the indirect object representing the neighbors cannot be expressed. The last sequence is the best fit for this thought and should be the one selected to generate the sentence. We make use of an algorithm, checkavailability, to eliminate all sequences that do not match the given role by determining the availability of each element in the role sequence. A role sequence is suitable to express a given role if the given role contains all the logical objects required by the role sequence. If a required element is missing, the role sequence cannot be used and is eliminated. Recall that an element in a sequence can be a single role, a role set, or another role sequence. If it is a role sequence, the checkavailability algorithm is called recursively on this inner sequence. If the element is a role set, the checkavailability function is called for each role structure in the set. As long as one alternative is available, the entire element is considered available. If the element is a single role, it needs a bridge to determine if the element exists in the given role. The reason is that grammatical roles are used in role sequences, while roles to be expressed contain logical objects instead. The correct bridge may be determined by either the given role or the role of the control term. Continuing with our example thought about an action, since the control of a sentence is the verb and the role for an action verb is the act role, the bridge for an act role is used. Using this bridge, the program knows that the subject maps to the actor, the direct object to the act-on object, and the indirect object to the act-for object. As shown in Figure 3, these logical objects are all in the given thought and can be found, indicating that the role sequence is valid.
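The elimination logic just described can be sketched compactly. This is a hedged reconstruction of the checkavailability idea, not the authors' implementation: the representation of sequences, bridges, and thoughts is ours, and the nested-sequence branch illustrates the recursion the paper mentions.

```python
def check_availability(sequence, thought, bridge, used=None):
    """Return True if `thought` supplies every compulsory element of `sequence`."""
    used = set() if used is None else used
    for element, compulsory in sequence:
        if isinstance(element, list):            # nested role sequence: recurse
            ok = check_availability(element, thought, bridge, used)
        else:                                    # single grammatical role
            logical = bridge.get(element, element)   # bridge to a logical object
            ok = logical in thought and logical not in used
            if ok:
                used.add(logical)                # mark it so no other element reuses it
        if not ok and compulsory:                # a missing compulsory element eliminates
            return False                         # the sequence; optional ones are skipped
    return True

# The boy-selling-lemonade thought, mapped through the act-role bridge:
act_bridge = {"subject": "actor", "verb": "action",
              "direct object": "act-on", "indirect object": "act-for"}
thought = {"actor": "boy", "action": "sell",
           "act-on": "lemonade", "act-for": "neighbors"}

candidates = [
    [("subject", True), ("verb", True), ("predicate", True)],
    [("subject", True), ("verb", True), ("direct object", True)],
    [("subject", True), ("verb", True), ("indirect object", True),
     ("direct object", True)],
]
valid = [s for s in candidates if check_availability(s, thought, act_bridge)]
# The first sequence is eliminated (nothing maps to "predicate"); of the
# two survivors, the longer one expresses the most information.
best = max(valid, key=len)
```

Passing the `used` set through the recursion is what prevents the duplicate-phrase problem the next paragraph raises: once a logical object is consumed by one element, no later element can claim it.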
[Figure 3. Matching a role sequence with logical objects: the act-role bridge maps the subject to the actor (Boy), the direct object to the act-on object (Lemonade), and the indirect object to the act-for object (Neighbors) for the role sequence [subject, verb, indirect object, direct object].]

Once a logical object is found, its corresponding object in the role is marked as used to prevent another element from using it. For instance, a sentence may have multiple prepositional phrases. If not marked, the same role may be used by each prepositional phrase, resulting in duplicate phrases. If the logical object exists in the given role, the role element is available. However, if an element in a role sequence is not available and its occurrence is compulsory, then the role sequence is eliminated. On the other hand, if its occurrence is optional, it is still possible to use the role sequence, and so it is not eliminated. For example, the first sequence in the sequence collection for sentence is not acceptable because the compulsory predicate cannot be mapped to any object in the act role. After removing all invalid sequences, it is still possible to have a set of valid alternatives. In our example, two valid sequences may still be used: [subject, verb, direct object] and [subject, verb, indirect object, direct object]. The latter sequence will be chosen since it expresses the most information from the given thought. After selecting the best sequence of roles to express the thought, the generatelist step will work on each role according to its order in the sequence. There are two categories of roles: composite and non-composite. A composite role such as aspect-of-an-object contains multiple sub-roles representing the logical objects of the role, while a

non-composite role is atomic. If the role is composite, the generate function is called on that role, and the process of selecting a sequence and composing a word-unit list is handled at the level of the composite role. If the role is non-composite, a word unit is created by extracting the content of the role, including the name of the knowledge and any properties associated with that knowledge. For example, a role representing the person John would produce a word unit containing John and the properties singular, third-person, male, and unique. Note that some of these properties may be stored within the thought role, such as the tense of the sentence, from which the tense property of a verb may be obtained. The object-oriented nature of ALPS allows each type of non-composite role to have its own unique function to determine what properties are needed and where to locate them. The make step selects the correct form of the word based on the properties stored in each word unit, and applies any syntactic rules needed to generate a grammatically correct sentence. Currently, we have focused on properties such as case for nouns, and both case and tense for verbs. Two categories of morphology rules, regular and irregular, are used in English to distinguish the various values of a property for a word. Regular words follow a set of rules that determine how affixes are added to root words in order to express a property. This allows the same affix to be applied to a large number of words. For example, the suffix s can be applied to nouns to form the plural of the word. Our program is taught a set of rules that is used to understand a word based on its root form and any affixes found, and this same set of rules is used to produce the correct word when given the property. Each morphology rule is taught with a condition to determine if that rule is applicable to the current word, the transformation to apply to the word, and the new property that the word has.
For instance, one rule for forming the plural of a word could have the condition that the word ends in y, and the transformation of replacing the -y with -ies. For irregular words, the difference between the root word and its varying forms does not adhere to any consistent rules. For instance, even though the plural of house is houses, the plural of mouse is mice, not mouses, and dice is the plural form of die. It is estimated that there are around 250 irregular verbs alone in the English language [15], and each special case needs to be learned individually. In our program, specialized lexicons, identified by their implied properties, are learned and used to map a word between its base form and its irregular form. For instance, a plural lexicon that contains a set of words and their irregular plural forms would contain pairings such as goose/geese, ox/oxen, and radius/radii. A word unit that does not contain any properties indicates the word is already in the correct form and no conversion is needed. If it has properties, the make step first tries to convert the base form of the stored word into its irregular form. For each property in the word unit, the corresponding lexicon is looked up. If the word in question is found within that lexicon, this indicates that the property can be applied, and the irregular form found in the lexicon is used in the final sentence. If all properties have been tried and an irregular form is not found, the make step will try the morphology rules for regular words. Finally, whether or not a word has been transformed, all word units are tested against syntactic rules on capitalization. For example, the first word in a sentence and proper nouns are capitalized. One final complication of our solution arises from the fact that not all grammar terms have a role associated with them, yet such a grammar term still needs to be represented in a role sequence in order to produce a proper expression.
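The make step's morphology, irregular lexicons consulted first and regular condition/transformation rules applied otherwise, can be sketched as follows. This is our own minimal illustration of the scheme for the plural property only; the lexicon contents and rule list echo the paper's examples, and the function name is ours.

```python
# Irregular plural lexicon: base form -> irregular form (examples from the text).
irregular_plural = {"mouse": "mice", "goose": "geese",
                    "ox": "oxen", "radius": "radii", "die": "dice"}

# Regular rules for the plural property: (condition, transformation) pairs,
# tried in order; the last rule is the default suffix -s.
regular_plural_rules = [
    (lambda w: w.endswith("y"), lambda w: w[:-1] + "ies"),
    (lambda w: True,            lambda w: w + "s"),
]

def make(word, properties):
    """Produce the surface form of `word` given the properties of its word unit."""
    if "plural" in properties:
        if word in irregular_plural:               # irregular lexicon wins
            word = irregular_plural[word]
        else:                                       # fall back to regular rules
            for condition, transform in regular_plural_rules:
                if condition(word):
                    word = transform(word)
                    break
    return word                                     # no properties: already correct

print(make("house", {"plural"}))   # houses
print(make("mouse", {"plural"}))   # mice
print(make("city", {"plural"}))    # cities
```

A full make step would loop over all properties (case, tense, and so on) with one lexicon and rule set per property, and then apply capitalization rules to the assembled word-unit list.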
For example, in our grammar, the grammar term punctuation is a part of a sentence, but does not have an associated role. In this instance, the role sequences for a sentence will have an empty role set reflecting the punctuation. In order to produce a valid sentence, compulsory terms without roles use rules to indicate the proper way of generating the correct output. For example, two rules taught for punctuation are that a declarative sentence must end in a period and an interrogative sentence must end with a question mark. As a result, whenever a role set is empty, the algorithm will use the applicable rule to determine the correct word unit to complete the sequence.

5 Conclusion

In future versions of our sentence generating algorithms, we aim to tackle subtle problems in sentence selection, such as generating passive voice sentences, and in morphology, such as property prioritizing. When deciding to express a thought in the passive voice, a variety of bridges may be used depending on the structure of the passive voice. For instance, take an original thought expressing that John gave a gift to Mary. In the thought, John is identified by the logical object actor, the gift is the act-on object, and Mary is the act-for object, or beneficiary. This thought could be expressed in several passive voice sentences, such as A gift was given to Mary. or Mary was given a gift. To express the first case, the bridge would need to link subject to the act-on object; in the second, the bridge would need to link subject to the act-for object. As a result, a method needs to be developed to determine which object is the focus of the sentence and thus choose the correct bridge to properly express the sentence in that manner. The issue of property prioritizing occurs when certain modalities, or helping verbs, are present. In certain cases, a property of a word may not need to be applied. For instance, verbs generally take the same case as the subject in a sentence.
However, most verbs do not differentiate between singular and plural forms when preceded by a helping verb: "He may run a marathon." rather than "He may runs a marathon." Similarly, the be verb takes the root form, ignoring all properties, when it is preceded by a helping verb: "John could be a civil engineer." instead of "John could is a civil engineer." The distinction of when properties can and should be applied is one that needs to be taught through more sophisticated rules.

Traditional bidirectional grammars rely on the same structure that was used in parsing a sentence to produce a sentence. This limits the flexibility and usefulness of the learned grammar. Instead, our program converts the learned grammar structure into a secondary type of knowledge that may be used to generate a sentence. This is accomplished by combining roles into sets and sequences to create a parallel structure based solely on the roles instead of the grammar terms. The purpose of a role set is to collect the alternative semantic representations of a grammar term, while role sequences define the various ways the intent of a role can be expressed. A role can then select an applicable role sequence from its collection to create a list of word units that expresses its contents. These word unit lists then propagate up to the thought to produce a list that expresses the entire thought. Finally, by applying properties gathered from the roles and the thought, the proper form of each word may be produced to create the final output sentence.

Learning grammar incrementally allows more complex structures to be added as desired, in turn expanding the types of sentences that can be parsed and generated. By using roles, with a clear separation between grammatical and semantic purposes, along with the use of the proper bridge, our approach has the advantage that the same logical structure may be expressed in multiple ways in the natural language. The principles behind creating role sequences ensure the correct grammatical relationships between the terms in a sequence. Our approach also has the advantage that no surface generation step is needed at the time of generating a sentence, and thoughts may be created based on their logical meaning instead of the grammatical requirements of the natural language.

6 References

[1] K. Cheng. "An Object-Oriented Approach to Machine Learning"; International Conference on Artificial Intelligence.
[2] W. Faris & K. Cheng. "An Object-Oriented Approach in Representing the English Grammar and Parsing"; International Conference on Artificial Intelligence.
[3] E. Ahn, W. Faris, & K. Cheng. "Recognizing the Effects caused by an Action in a Declarative Sentence"; International Conference on Artificial Intelligence.
[4] W. Faris & K. Cheng. "Understanding and Executing a Declarative Sentence involving a forms-of-be Verb"; IEEE International Conference on Systems, Man, and Cybernetics, 2009.
[5] W. Faris & K. Cheng. "Understanding Pronouns"; International Conference on Artificial Intelligence.
[6] D. Appelt. "Bidirectional grammars and the design of natural language generation systems"; Proceedings of the 1987 Workshop on Theoretical Issues in Natural Language Processing, Association for Computational Linguistics, Stroudsburg, PA, USA.
[7] E. Reiter. "Has a consensus NL generation architecture appeared, and is it psycholinguistically plausible?"; Proceedings of the Seventh International Workshop on Natural Language Generation, Association for Computational Linguistics, Stroudsburg, PA, USA.
[8] S. Shieber. "An Introduction to Unification-Based Approaches to Grammar"; CSLI Lecture Notes, 4, Stanford University.
[9] R. Kasper. "A flexible interface for linking applications to Penman's sentence generator"; Proceedings of the Workshop on Speech and Natural Language, Association for Computational Linguistics, Stroudsburg, PA, USA.
[10] E. Hovy. "The current status of the Penman language generation system"; Proceedings of the Workshop on Speech and Natural Language, Association for Computational Linguistics, Stroudsburg, PA, USA.
[11] S. Shieber, G. van Noord, F. Pereira, & R. Moore. "Semantic-head-driven generation"; Computational Linguistics, 16, 1, 30-42, 1990.
[12] S. Busemann. "Best-first surface realization"; Proceedings of the Eighth International Workshop on Natural Language Generation.
[13] S. Russell & P. Norvig. Artificial Intelligence: A Modern Approach, 2nd Ed.; Prentice Hall.
[14] G. Russell, S. Warwick, & J. Carroll. "Asymmetry in Parsing and Generating with Unification Grammars: Case Studies from ELU"; 28th Annual Meeting of the Association for Computational Linguistics.
[15] R. Quirk, S. Greenbaum, G. Leech, & J. Svartvik. A Comprehensive Grammar of the English Language; Longman.


More information

1. Introduction. 2. The OMBI database editor

1. Introduction. 2. The OMBI database editor OMBI bilingual lexical resources: Arabic-Dutch / Dutch-Arabic Carole Tiberius, Anna Aalstein, Instituut voor Nederlandse Lexicologie Jan Hoogland, Nederlands Instituut in Marokko (NIMAR) In this paper

More information

Evolution of Symbolisation in Chimpanzees and Neural Nets

Evolution of Symbolisation in Chimpanzees and Neural Nets Evolution of Symbolisation in Chimpanzees and Neural Nets Angelo Cangelosi Centre for Neural and Adaptive Systems University of Plymouth (UK) a.cangelosi@plymouth.ac.uk Introduction Animal communication

More information

GERM 3040 GERMAN GRAMMAR AND COMPOSITION SPRING 2017

GERM 3040 GERMAN GRAMMAR AND COMPOSITION SPRING 2017 GERM 3040 GERMAN GRAMMAR AND COMPOSITION SPRING 2017 Instructor: Dr. Claudia Schwabe Class hours: TR 9:00-10:15 p.m. claudia.schwabe@usu.edu Class room: Old Main 301 Office: Old Main 002D Office hours:

More information

FOREWORD.. 5 THE PROPER RUSSIAN PRONUNCIATION. 8. УРОК (Unit) УРОК (Unit) УРОК (Unit) УРОК (Unit) 4 80.

FOREWORD.. 5 THE PROPER RUSSIAN PRONUNCIATION. 8. УРОК (Unit) УРОК (Unit) УРОК (Unit) УРОК (Unit) 4 80. CONTENTS FOREWORD.. 5 THE PROPER RUSSIAN PRONUNCIATION. 8 УРОК (Unit) 1 25 1.1. QUESTIONS WITH КТО AND ЧТО 27 1.2. GENDER OF NOUNS 29 1.3. PERSONAL PRONOUNS 31 УРОК (Unit) 2 38 2.1. PRESENT TENSE OF THE

More information

Abstractions and the Brain

Abstractions and the Brain Abstractions and the Brain Brian D. Josephson Department of Physics, University of Cambridge Cavendish Lab. Madingley Road Cambridge, UK. CB3 OHE bdj10@cam.ac.uk http://www.tcm.phy.cam.ac.uk/~bdj10 ABSTRACT

More information

Part I. Figuring out how English works

Part I. Figuring out how English works 9 Part I Figuring out how English works 10 Chapter One Interaction and grammar Grammar focus. Tag questions Introduction. How closely do you pay attention to how English is used around you? For example,

More information

Project in the framework of the AIM-WEST project Annotation of MWEs for translation

Project in the framework of the AIM-WEST project Annotation of MWEs for translation Project in the framework of the AIM-WEST project Annotation of MWEs for translation 1 Agnès Tutin LIDILEM/LIG Université Grenoble Alpes 30 october 2014 Outline 2 Why annotate MWEs in corpora? A first experiment

More information

The Smart/Empire TIPSTER IR System

The Smart/Empire TIPSTER IR System The Smart/Empire TIPSTER IR System Chris Buckley, Janet Walz Sabir Research, Gaithersburg, MD chrisb,walz@sabir.com Claire Cardie, Scott Mardis, Mandar Mitra, David Pierce, Kiri Wagstaff Department of

More information

Concept Acquisition Without Representation William Dylan Sabo

Concept Acquisition Without Representation William Dylan Sabo Concept Acquisition Without Representation William Dylan Sabo Abstract: Contemporary debates in concept acquisition presuppose that cognizers can only acquire concepts on the basis of concepts they already

More information

Improved Effects of Word-Retrieval Treatments Subsequent to Addition of the Orthographic Form

Improved Effects of Word-Retrieval Treatments Subsequent to Addition of the Orthographic Form Orthographic Form 1 Improved Effects of Word-Retrieval Treatments Subsequent to Addition of the Orthographic Form The development and testing of word-retrieval treatments for aphasia has generally focused

More information

Constraining X-Bar: Theta Theory

Constraining X-Bar: Theta Theory Constraining X-Bar: Theta Theory Carnie, 2013, chapter 8 Kofi K. Saah 1 Learning objectives Distinguish between thematic relation and theta role. Identify the thematic relations agent, theme, goal, source,

More information

Copyright 2017 DataWORKS Educational Research. All rights reserved.

Copyright 2017 DataWORKS Educational Research. All rights reserved. Copyright 2017 DataWORKS Educational Research. All rights reserved. No part of this work may be reproduced, stored in a retrieval system or transmitted in any form or by any means, electronic or mechanical,

More information

Age Effects on Syntactic Control in. Second Language Learning

Age Effects on Syntactic Control in. Second Language Learning Age Effects on Syntactic Control in Second Language Learning Miriam Tullgren Loyola University Chicago Abstract 1 This paper explores the effects of age on second language acquisition in adolescents, ages

More information

Developing a TT-MCTAG for German with an RCG-based Parser

Developing a TT-MCTAG for German with an RCG-based Parser Developing a TT-MCTAG for German with an RCG-based Parser Laura Kallmeyer, Timm Lichte, Wolfgang Maier, Yannick Parmentier, Johannes Dellert University of Tübingen, Germany CNRS-LORIA, France LREC 2008,

More information

Parallel Evaluation in Stratal OT * Adam Baker University of Arizona

Parallel Evaluation in Stratal OT * Adam Baker University of Arizona Parallel Evaluation in Stratal OT * Adam Baker University of Arizona tabaker@u.arizona.edu 1.0. Introduction The model of Stratal OT presented by Kiparsky (forthcoming), has not and will not prove uncontroversial

More information

Detecting English-French Cognates Using Orthographic Edit Distance

Detecting English-French Cognates Using Orthographic Edit Distance Detecting English-French Cognates Using Orthographic Edit Distance Qiongkai Xu 1,2, Albert Chen 1, Chang i 1 1 The Australian National University, College of Engineering and Computer Science 2 National

More information

On-Line Data Analytics

On-Line Data Analytics International Journal of Computer Applications in Engineering Sciences [VOL I, ISSUE III, SEPTEMBER 2011] [ISSN: 2231-4946] On-Line Data Analytics Yugandhar Vemulapalli #, Devarapalli Raghu *, Raja Jacob

More information

Construction Grammar. University of Jena.

Construction Grammar. University of Jena. Construction Grammar Holger Diessel University of Jena holger.diessel@uni-jena.de http://www.holger-diessel.de/ Words seem to have a prototype structure; but language does not only consist of words. What

More information

Procedia - Social and Behavioral Sciences 154 ( 2014 )

Procedia - Social and Behavioral Sciences 154 ( 2014 ) Available online at www.sciencedirect.com ScienceDirect Procedia - Social and Behavioral Sciences 154 ( 2014 ) 263 267 THE XXV ANNUAL INTERNATIONAL ACADEMIC CONFERENCE, LANGUAGE AND CULTURE, 20-22 October

More information

Proposed syllabi of Foundation Course in French New Session FIRST SEMESTER FFR 100 (Grammar,Comprehension &Paragraph writing)

Proposed syllabi of Foundation Course in French New Session FIRST SEMESTER FFR 100 (Grammar,Comprehension &Paragraph writing) INTERNATIONAL COLLEGE FOR GIRLS SSFFSS,, GGUURRUUKKUULL MAARRGG,, MAANNSSAARROOVVAARR,, JJAAI IPPUURR DEPARTMENT OF FRENCH SYLLABUS OF FOUNDATIION COURSE FOR THE SESSIION 2009--10 1 Proposed syllabi of

More information

Inleiding Taalkunde. Docent: Paola Monachesi. Blok 4, 2001/ Syntax 2. 2 Phrases and constituent structure 2. 3 A minigrammar of Italian 3

Inleiding Taalkunde. Docent: Paola Monachesi. Blok 4, 2001/ Syntax 2. 2 Phrases and constituent structure 2. 3 A minigrammar of Italian 3 Inleiding Taalkunde Docent: Paola Monachesi Blok 4, 2001/2002 Contents 1 Syntax 2 2 Phrases and constituent structure 2 3 A minigrammar of Italian 3 4 Trees 3 5 Developing an Italian lexicon 4 6 S(emantic)-selection

More information