Fuzzy Systems: Heuristic Fuzzy Rule Learning Approaches


Prof. Dr. Rudolf Kruse, Christian Moewes
{kruse,cmoewes}@iws.cs.uni-magdeburg.de
Otto-von-Guericke University of Magdeburg
Faculty of Computer Science, Department of Knowledge Processing and Language Engineering
Lecture 12

Learning Fuzzy Rules Differently
There are many different methods to learn fuzzy rules from data:
- Cluster-oriented approaches find clusters in the data, where each cluster corresponds to one rule (already discussed).
- Hyperbox-oriented approaches find clusters in the form of hyperboxes.
- Structure-oriented approaches use predefined fuzzy sets to structure the data space and pick rules from grid cells.
- Neuro-fuzzy systems (NFS) combine artificial neural networks with fuzzy rules.
The last three topics will be discussed in the following.

Outline
1. Hyperbox-Oriented Rule Learning
2. Structure-Oriented Rule Learning

Hyperbox-Oriented Rule Learning
Search for hyperboxes in the data space and create fuzzy rules by projecting the hyperboxes onto the axes. Fuzzy rules and fuzzy sets are created at the same time. These algorithms are usually very fast.
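To make the projection step concrete, here is a minimal Python sketch (not from the lecture): a hyperbox is given by its lower and upper corner, and each axis interval becomes the core of a trapezoidal fuzzy set whose support is slightly widened. The names trapezoid and hyperbox_to_rule as well as the margin parameter are illustrative assumptions.

```python
def trapezoid(a, b, c, d):
    """Trapezoidal membership function with support [a, d] and core [b, c]."""
    def mu(x):
        left = (x - a) / max(b - a, 1e-12)    # rising edge
        right = (d - x) / max(d - c, 1e-12)   # falling edge
        return max(0.0, min(left, right, 1.0))
    return mu

def hyperbox_to_rule(lower, upper, margin=0.1):
    """Project a hyperbox onto every axis: each box interval becomes the
    core of a trapezoidal fuzzy set, widened by a fuzziness margin.
    The resulting list of fuzzy sets is the rule's antecedent."""
    antecedent = []
    for lo, hi in zip(lower, upper):
        slack = margin * (hi - lo)            # fuzzify the crisp box edges
        antecedent.append(trapezoid(lo - slack, lo, hi, hi + slack))
    return antecedent

# Example: a 2D hyperbox with x1 in [0.2, 0.5] and x2 in [0.6, 0.9]
rule = hyperbox_to_rule([0.2, 0.6], [0.5, 0.9])
print([mu(0.3) for mu in rule])               # membership of 0.3 per input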

Example: Hyperboxes in XOR Data
Advantage over fuzzy cluster analysis: there is no loss of information when hyperboxes are represented as fuzzy rules. Not all variables need to be used; "don't care" variables can be discovered.
Disadvantage: each fuzzy rule uses individual fuzzy sets, i.e. the rule base is complex.

Outline
1. Hyperbox-Oriented Rule Learning
2. Structure-Oriented Rule Learning
   - Wang & Mendel Algorithm
   - Higgins & Goodman Algorithm

Structure-Oriented Rule Learning
We must provide initial fuzzy sets for all variables; this partitions the data space by a fuzzy grid. Then we detect all grid cells that contain data [Wang and Mendel, 1992]. Finally we compute the best consequents and select the best rules, e.g. using an NFS [Nauck and Kruse, 1997] (to be discussed later).

Structure-Oriented Rule Learning
Simple: the rule base is available after two cycles through the training data:
1. Discover all antecedents.
2. Determine the best consequents.
Missing values can be handled, and numeric and symbolic attributes can be processed at the same time (mixed fuzzy rules).
Advantage: all rules share the same fuzzy sets.
Disadvantage: the fuzzy sets must be given in advance.
A sketch of the two cycles is given below.
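The following Python sketch assumes evenly spaced triangular fuzzy partitions and the product rule degree from [Wang and Mendel, 1992]; the function names triangular_partition and wang_mendel are illustrative reconstructions, not the lecture's code.

```python
import numpy as np

def triangular_partition(lo, hi, n):
    """n evenly spaced triangular fuzzy sets covering [lo, hi] (n >= 2)."""
    centers = np.linspace(lo, hi, n)
    width = centers[1] - centers[0]
    return [lambda x, c=c: max(0.0, 1.0 - abs(x - c) / width) for c in centers]

def wang_mendel(X, y, input_parts, output_part):
    """Two cycles through the data: (1) every sample proposes the rule
    whose grid cell it fits best, (2) conflicting rules with the same
    antecedent are resolved by keeping the one with the highest degree."""
    rules = {}
    for xs, t in zip(X, y):
        # antecedent: per input, the fuzzy set with maximum membership
        ante = tuple(max(range(len(part)), key=lambda i: part[i](v))
                     for v, part in zip(xs, input_parts))
        # consequent: the output fuzzy set with maximum membership
        cons = max(range(len(output_part)), key=lambda i: output_part[i](t))
        degree = output_part[cons](t)
        for v, part, i in zip(xs, input_parts, ante):
            degree *= part[i](v)              # product of memberships
        if ante not in rules or degree > rules[ante][1]:
            rules[ante] = (cons, degree)
    return {a: c for a, (c, _) in rules.items()}

# Usage: one input, one output, five fuzzy sets each
X = np.random.rand(100, 1); y = np.sin(3 * X[:, 0])
rules = wang_mendel(X, y, [triangular_partition(0, 1, 5)],
                    triangular_partition(-1, 1, 5))
```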

Example: Wang & Mendel Algorithm
[Figure: given data points; step 1: granulate the data space. Example data set with one input and one output; the points closest to the corresponding rules are shown in red.]

Example: Wang & Mendel Algorithm (cont.)
[Figure: step 2: generate rules; resulting crisp approximation. Fuzzy rules are shown by their α = 0.5 cuts.]
The learned model misses extrema far away from the rule centers.

Example: Wang & Mendel Algorithm (cont.)
Generated rule base:
R1: if x is zero_x then y is medium_y
R2: if x is small_x then y is medium_y
R3: if x is medium_x then y is large_y
R4: if x is large_x then y is medium_y
Intuitively, rule R2 should probably be used to describe the minimum instead:
R2: if x is small_x then y is small_y

Higgins & Goodman Algorithm [Higgins and Goodman, 1993]
This algorithm is an extension of [Wang and Mendel, 1992]:
1. Initially, only one membership function is used for each X_j and Y, so one large rule covers the entire feature space.
2. New membership functions are placed at the points of maximum error.
Both steps are repeated until a maximum number of divisions is reached or the approximation error drops below a certain threshold.

1. Initialization
Create one membership function for each input, covering its entire domain range. Create membership functions for the output at the corner points of the input space; at a corner point, each input takes either the maximum or the minimum of its domain range. For each corner point, the closest example from the data is used to add an output membership function at its output value. A sketch of this corner-point initialization follows.
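A possible reading of this initialization in Python; the name init_corner_rules and the (corner, output value) pair representation are assumptions made for illustration.

```python
import numpy as np
from itertools import product

def init_corner_rules(X, y, bounds):
    """For every corner of the input space (each variable at either its
    minimum or maximum), look up the training example closest to that
    corner and anchor an output membership function at its output value."""
    X = np.asarray(X)
    rules = []
    for corner in product(*bounds):        # 2^d corners for d inputs
        dists = np.linalg.norm(X - np.asarray(corner), axis=1)
        nearest = int(np.argmin(dists))
        rules.append((corner, y[nearest])) # (corner point, output MF center)
    return rules

# e.g. bounds = [(0.0, 1.0), (0.0, 1.0)] yields the four corners of the unit square
```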

2. Adding New Membership Functions
Find the point within the data with maximum error (the defuzzification is the same as in [Wang and Mendel, 1992]). For each X_j, add a new membership function at the corresponding value of the maximal-error point. As a result, this point is described perfectly by the model.

3. Create New Cell-based Rule Set
New rules: associate output membership functions with the newly created cells. Take the point closest to all input membership functions of a cell (as in [Wang and Mendel, 1992]); the associated output membership function is the one closest to the output value of this point. If the output value of the closest point is too far away, a new output membership function is created.

4. Termination Detection
If the error is below a certain threshold (or if a certain number of iterations has been performed), the algorithm stops. Otherwise it continues at step 2. The complete refinement loop is sketched below.
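The overall refinement loop could be sketched as follows in Python; the model object with its methods predict, add_input_mfs and rebuild_rules is a hypothetical interface standing in for the fuzzy model of the previous slides, not an actual library API.

```python
import numpy as np

def higgins_goodman(X, y, model, max_divisions=10, tol=1e-2):
    """Refinement loop: repeatedly place new membership functions at the
    training point with maximum approximation error. `model` is a
    hypothetical fuzzy model exposing predict(), add_input_mfs() and
    rebuild_rules(); these names are illustrative placeholders."""
    for _ in range(max_divisions):
        errors = np.abs(model.predict(X) - y)  # same defuzzification as Wang & Mendel
        worst = int(np.argmax(errors))
        if errors[worst] < tol:                # step 4: termination
            break
        model.add_input_mfs(X[worst])          # step 2: new MFs at the max-error point
        model.rebuild_rules(X, y)              # step 3: re-derive the cell-based rules
    return model
```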

Summary
Heuristic fuzzy rule learning methods are usually very fast, owing to their greedy strategies for selecting rules. For some applications, however, these strategies are too simple in terms of accuracy. In such situations, more sophisticated rule learning methods should be used, e.g. neuro-fuzzy systems.

References
Higgins, C. M. and Goodman, R. M. (1993). Learning fuzzy rule-based neural networks for control. In Advances in Neural Information Processing Systems 5 (NIPS Conference), pages 350–357, San Francisco, CA, USA. Morgan Kaufmann Publishers Inc.
Nauck, D. and Kruse, R. (1997). A neuro-fuzzy method to learn fuzzy classification rules from data. Fuzzy Sets and Systems, 89(3):277–288.
Wang, L. and Mendel, J. M. (1992). Generating fuzzy rules by learning from examples. IEEE Transactions on Systems, Man, and Cybernetics, 22(6):1414–1427.