CS 540: Introduction to Artificial Intelligence

Midterm Exam: 4:00-5:15 pm, October 25, 2016, B130 Van Vleck
CLOSED BOOK (one sheet of notes and a calculator allowed)

Write your answers on these pages and show your work. If you feel that a question is not fully specified, state any assumptions you need to make in order to solve the problem. You may use the backs of these sheets for scratch work.

Write your name on this page and initial all other pages of this exam (in case a page comes loose during grading). Make sure your exam contains six problems on eight pages.

Name ____________________        Student ID ____________________

Problem    Score    Max Score
1                   20
2                   20
3                   10
4                   10
5                   25
6                   15
TOTAL               100

Problem 1 Decision Trees (20 points)

Assume that you are given the set of labeled training examples below, where each feature has possible values a, b, and c. A name field is included for convenience. You choose to learn a decision tree and select "-" as the default output if there are ties.

Name   F1   F2   Output
Ex1    b    b    +
Ex2    c    b    +
Ex3    b    c    +
Ex4    a    b    -
Ex5    c    a    -

a) What score would the information gain calculation assign to each of the features? Be sure to show all your work (use the back of this or the previous sheet if needed).

b) Which feature would be chosen as the root of the decision tree being built? (Break ties in favor of F1 over F2, i.e., in alphabetical order.)
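
A minimal Python sketch of the part (a) computation, assuming the table is stored as tuples; the entropy and info_gain helpers are illustrative names, not code from the course:

    from math import log2
    from collections import Counter

    # Labeled training examples from the table above: (name, F1, F2, output).
    EXAMPLES = [
        ("Ex1", "b", "b", "+"),
        ("Ex2", "c", "b", "+"),
        ("Ex3", "b", "c", "+"),
        ("Ex4", "a", "b", "-"),
        ("Ex5", "c", "a", "-"),
    ]

    def entropy(labels):
        """Shannon entropy (in bits) of a list of class labels."""
        counts = Counter(labels)
        total = len(labels)
        return -sum((c / total) * log2(c / total) for c in counts.values())

    def info_gain(examples, feature_index):
        """Information gain from splitting `examples` on one feature column."""
        base = entropy([ex[-1] for ex in examples])
        remainder = 0.0
        for value in {ex[feature_index] for ex in examples}:
            subset = [ex[-1] for ex in examples if ex[feature_index] == value]
            remainder += (len(subset) / len(examples)) * entropy(subset)
        return base - remainder

    # Feature F1 is column 1, F2 is column 2.
    print("Gain(F1) =", info_gain(EXAMPLES, 1))
    print("Gain(F2) =", info_gain(EXAMPLES, 2))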

c) Assume F1 is chosen as the root. Show the recursive calls to ID3 below; be sure to include the arguments to ID3. It is fine to simply use the names of the examples in the recursive calls (i.e., you do not need to copy the feature values in your answer). You need only show the recursive calls; you do not need to show the results produced by these recursive calls.

Copied for your convenience:

Name   F1   F2   Output
Ex1    b    b    +
Ex2    c    b    +
Ex3    b    c    +
Ex4    a    b    -
Ex5    c    a    -
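
A skeleton of the ID3 recursion (reusing info_gain from the sketch above) may make the structure of part (c)'s answer concrete; the tree encoding and helper names are illustrative assumptions:

    from collections import Counter

    def majority_label(labels, default):
        """Most common label, falling back to `default` on a tie."""
        top = Counter(labels).most_common()
        if len(top) > 1 and top[0][1] == top[1][1]:
            return default
        return top[0][0]

    def id3(examples, features, default):
        """ID3 skeleton; `examples` are (name, F1, F2, output) tuples and
        `features` the column indices still available to split on."""
        labels = [ex[-1] for ex in examples]
        if not examples:
            return default
        if len(set(labels)) == 1:        # pure node: all labels agree
            return labels[0]
        if not features:                 # no features left: majority vote
            return majority_label(labels, default)
        # max() keeps the first maximum, so listing features as [1, 2]
        # breaks ties in favor of F1, as the exam specifies.
        best = max(features, key=lambda f: info_gain(examples, f))
        branches = {}
        for value in ("a", "b", "c"):    # one recursive call per feature value;
            # with F1 (column 1) at the root, the three calls receive the
            # subsets {Ex4}, {Ex1, Ex3}, and {Ex2, Ex5} respectively
            subset = [ex for ex in examples if ex[best] == value]
            rest = [f for f in features if f != best]
            branches[value] = id3(subset, rest, default)
        return {"split_on": best, "branches": branches}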

Problem 2 Search (20 points)

Consider the search space below, where S1 and S2 are the start nodes and G1 and G2 satisfy the goal test. Arcs are labeled with the cost of traversing them (so lower is better), and the estimated cost to a goal is reported inside each node.

[Search-space figure not reproduced in this transcription.]

For each of the following search strategies, indicate which goal state is reached (if any) and list, in order, all the states popped off the OPEN list. When all else is equal, nodes should be added to OPEN in alphabetical order. You can show your work (for cases of partial credit), using the notation presented in class, on the back of the previous page. Hint: put both S1 and S2 in OPEN (you need to decide the order) at the beginning.

Beam Search (with beam width = 2 and f = h)
    Goal state reached: ____________
    States popped off OPEN: ____________

A* (f = g + h)
    Goal state reached: ____________
    States popped off OPEN: ____________
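
As a reference for the mechanics (not the exam's actual search space, which is in the omitted figure), a minimal A* sketch over an abstract graph; the `successors` and `h` callables are assumed to be supplied by the caller:

    import heapq

    def a_star(starts, goals, successors, h):
        """Minimal A* sketch. `starts` is a list of start states, `goals` a
        set of goal states, `successors(s)` yields (neighbor, arc_cost)
        pairs, and `h(s)` is the heuristic estimate stored in each node.
        Returns (goal_reached, states_popped_in_order)."""
        open_list = []                    # entries are (f, state, g)
        for s in sorted(starts):          # ties broken alphabetically
            heapq.heappush(open_list, (h(s), s, 0))
        closed, popped = set(), []
        while open_list:
            f, state, g = heapq.heappop(open_list)
            if state in closed:           # skip stale duplicate entries
                continue
            closed.add(state)
            popped.append(state)
            if state in goals:
                return state, popped
            for nbr, cost in successors(state):
                if nbr not in closed:
                    heapq.heappush(open_list, (g + cost + h(nbr), nbr, g + cost))
        return None, popped

Beam search with f = h proceeds the same way, except that after each expansion OPEN is truncated to the best two nodes by h (for beam width = 2).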

Problem 3 Probabilistic Reasoning (10 points)

Using the probability table below, answer the following questions; for full credit, show the complete world states used to compute each answer. Write your final numeric answer on the line provided.

a) Prob(B = true | C = false ∧ A = false)?

b) Prob( (A = true ∧ C = false) ∨ (B = false ∧ C = true) )?

A      B      C      Prob
false  false  false  0.10
false  false  true   0.15
false  true   false  0.05
false  true   true   0.11
true   false  false  0.09
true   false  true   0.10
true   true   false  0.28
true   true   true   0.12
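
A small sketch of how such queries can be answered mechanically from the full joint distribution; the dictionary encoding and the `prob` helper are illustrative, and part (a) is read here as a conditional probability:

    # Full joint distribution from the table above, keyed by (A, B, C).
    JOINT = {
        (False, False, False): 0.10, (False, False, True): 0.15,
        (False, True,  False): 0.05, (False, True,  True): 0.11,
        (True,  False, False): 0.09, (True,  False, True): 0.10,
        (True,  True,  False): 0.28, (True,  True,  True): 0.12,
    }

    def prob(event):
        """Sum the probabilities of every complete world state satisfying
        `event`, a predicate over an (a, b, c) world."""
        return sum(p for world, p in JOINT.items() if event(*world))

    # a) P(B=t | C=f and A=f) = P(A=f, B=t, C=f) / P(A=f, C=f)
    part_a = (prob(lambda a, b, c: not a and b and not c)
              / prob(lambda a, b, c: not a and not c))

    # b) P((A=t and C=f) or (B=f and C=t))
    part_b = prob(lambda a, b, c: (a and not c) or (not b and c))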

Problem 4 Game Playing (10 points)

a) Apply the minimax algorithm to the partial game tree below, where it is the minimizer's turn to play and the game does not involve randomness. The values estimated by the static-board evaluator (SBE) are indicated in the leaf nodes (higher scores are better for the maximizer). Write the estimated values of the intermediate nodes inside their circles and indicate the proper move of the minimizer by circling one of the root's outgoing arcs. Process this game tree working left to right.

[Game-tree figure not reproduced; the leaf SBE values, left to right, are -9, 7, -3, 1, -2, -5, 8, -6, 4.]

b) List the first leaf node (if any) in the above game tree whose SBE score need not be computed:
    the node whose score = ____________
    Briefly explain why: ____________
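
Since part (b) is about pruning, a minimal minimax-with-alpha-beta sketch over a tree encoded as nested lists (an assumed encoding, not the exam's figure) shows why some leaf scores never need to be computed:

    def minimax(node, maximizing, alpha=float("-inf"), beta=float("inf")):
        """Minimax with alpha-beta pruning; a leaf is a bare number (its SBE
        score), an internal node a list of children. Returns the backed-up
        value; children to the right of a cutoff are never evaluated."""
        if not isinstance(node, list):   # leaf: return its SBE score
            return node
        if maximizing:
            value = float("-inf")
            for child in node:
                value = max(value, minimax(child, False, alpha, beta))
                alpha = max(alpha, value)
                if alpha >= beta:        # beta cutoff: prune remaining children
                    break
            return value
        else:
            value = float("inf")
            for child in node:
                value = min(value, minimax(child, True, alpha, beta))
                beta = min(beta, value)
                if alpha >= beta:        # alpha cutoff: prune remaining children
                    break
            return value

Running this left to right on the exam's tree, with the minimizer at the root, identifies the pruned leaf asked for in part (b).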

Problem 5 Multiple-Choice and Short Answer Questions (25 points)

For each multiple-choice question, circle your answer. Unless stated otherwise, choose the one best answer. You do not need to explain your answers except for (e).

a) Genetic algorithms require entities to be represented by fixed-sized data structures due to the need to cross them over.    TRUE    FALSE

b) Two search strategies that might not find a solution even when one exists are ____________ and ____________.

c) If Uniform Cost finds a solution, is Breadth-First guaranteed to find a solution as well?    YES    NO

d) Assuming h1(n) is admissible and h2(n) is admissible, which of the following would be better for A*?

    h3(n) = min( h1(n), h2(n) )        h4(n) = max( h1(n), h2(n) )

e) Which of the following does not belong in this group? Briefly explain your choice on the right.

    i) information gain
    ii) heuristic functions
    iii) test-set accuracy
    iv) fitness functions

f) The singularity refers to the case where a decision stump has better test-set accuracy than a decision tree.    TRUE    FALSE

g) The reason ID3 typically works better in an ensemble than k-NN does (even if they have equivalent test-set accuracies when used as single-model learners) is because ID3 is ____________.

h) Draw arrows from the phrases on the left to their best match on the right.

    Test sets        Address the horizon effect
                     Avoid local minima
    Tune sets        Estimate future performance
                     Judge information gain
                     Select good settings for parameters

i) Briefly state one strength of iterative deepening over depth-first search: ____________.

Problem 6 Miscellaneous Questions (15 points)

a) List two methods covered in class for reducing overfitting (short answers are fine).

b) Draw a picture that illustrates how k-NN, with k = 1, partitions ("chops up") feature space.

c) Imagine you have a genetic algorithm population with these three bit-string entities, where the number next to each is the output of the fitness function. These are the entities remaining after the worst-scoring entities have been killed off. The genetic algorithm next needs to choose two parents to cross over. On the blank lines below, show the probability that each surviving entity is chosen as the first parent.

    0101101110    fitness = 3    prob chosen as first parent = ________
    1101001011    fitness = 7    prob chosen as first parent = ________
    1111101101    fitness = 1    prob chosen as first parent = ________
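
For part (c), a minimal sketch of fitness-proportional (roulette-wheel) selection, the standard way a genetic algorithm picks parents; the helper names and encoding are illustrative:

    import random

    # Surviving entities from part (c): (bit string, fitness).
    POPULATION = [("0101101110", 3), ("1101001011", 7), ("1111101101", 1)]

    def selection_probabilities(population):
        """Fitness-proportional selection: each entity is chosen with
        probability fitness / total fitness of the surviving population."""
        total = sum(f for _, f in population)
        return {bits: f / total for bits, f in population}

    def pick_parent(population):
        """Roulette-wheel draw of one parent using those probabilities."""
        bits, weights = zip(*population)
        return random.choices(bits, weights=weights, k=1)[0]

    # Total fitness is 3 + 7 + 1 = 11, so the probabilities are
    # 3/11, 7/11, and 1/11 respectively.
    print(selection_probabilities(POPULATION))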