Project Progress Report - PowerPoint PPT Presentation
Provided by: yuepi

Transcript and Presenter's Notes
1
Project Progress Report
Data Classification using Neuro-Fuzzy Approach
Presented by Yang Chen
Rafiy Saleh Ling Ou
Tayyaba Sharif
2
Project's Goal
  • To realize a hybrid neuro-fuzzy model for
    classification.
  • We adopt Haberman's Survival Data as our
    experimental data set. After a number of training
    epochs, the system should tell whether or not a
    patient will live more than 5 years, according to
    three input parameters.

3
Motivation
  • Neural networks are low-level computational
    structures that perform well when dealing with
    raw data; however, they are opaque to the user.
  • Fuzzy logic deals with reasoning on a higher
    level, using linguistic information acquired from
    a domain expert; however, it lacks the ability to
    learn and cannot adjust itself to a new
    environment.
  • Our approach combines the advantages of these two
    techniques.
  • The training part develops the IF-THEN fuzzy rules
    and determines the membership functions for the
    input and output variables of the system. Expert
    knowledge can be easily incorporated into the
    structure of the system.

4
Main parts of our system
  • Use a matrix to represent the relationships of the
    nodes in our neuro-fuzzy network.
  • Generate the initial parameters for layer 1 and
    layer 4.
  • Use a hybrid neuro-fuzzy adaptive network to
    train the parameters for layer 1 and layer 4.
  • Once the parameters of the network have been
    decided, the behavior of the network is fixed,
    and we can test the performance of our
    classifier.
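
The node-connection matrix mentioned above can be sketched as an adjacency matrix. This is an illustrative toy (a 2-input, 2-label network with a node ordering I am assuming), not the project's actual 94x94 matrix:

```python
import numpy as np

# Hypothetical sketch: conn[i][j] == 1 means node i feeds node j.
# Assumed node ordering: 0-1 inputs x, y; 2-5 membership nodes
# A1, A2, B1, B2; 6-9 rule nodes; 10 the output node.
n_nodes = 11
conn = np.zeros((n_nodes, n_nodes), dtype=int)

conn[0, 2] = conn[0, 3] = 1          # x -> A1, A2
conn[1, 4] = conn[1, 5] = 1          # y -> B1, B2
# each rule node combines one A-node and one B-node (2 x 2 = 4 rules)
for r, (a, b) in enumerate([(2, 4), (2, 5), (3, 4), (3, 5)]):
    conn[a, 6 + r] = conn[b, 6 + r] = 1
conn[6:10, 10] = 1                   # every rule feeds the output node

# fan-in of the output node equals the number of rules
print(conn[:, 10].sum())  # -> 4
```

Reading a column gives a node's inputs and reading a row gives its outputs, which is what a forward pass over the network needs.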

5
Architecture of the Adaptive Network
  • Adaptive network: a multi-layer feedforward
    network
  • Each node performs a node function on its incoming
    signals, using a set of parameters
  • Square nodes (adaptive nodes) have parameters
  • Circle nodes (fixed nodes) have no parameters
  • Links indicate the flow direction of signals
    between nodes and have no weights associated with
    them

6
Network Structure
[Figure: network structure diagram. Inputs x, y, z feed the
membership nodes A1-A3, B1-B3, C1-C3 (premise parameters);
their outputs pass through the 27 rule and normalization
nodes (numbered 1-27) to produce the overall output f
(consequent parameters).]
7
The Network Structure
  • Has 3 input variables and 27 rules
  • Three membership functions are associated with
    each input
  • So the input space is partitioned into 27 fuzzy
    subspaces, each of which is governed by a fuzzy
    IF-THEN rule
  • The premise part of a rule defines a fuzzy
    subspace, and the consequent part specifies the
    output within this subspace
  • x, y, z are inputs to nodes Ai, Bi, Ci (linguistic
    labels)
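
The 27 subspaces above are simply every combination of one label per input; a short sketch (the rule text is schematic, with p, q, r, j standing for each rule's consequent parameters):

```python
from itertools import product

# Three linguistic labels per input; one rule per combination.
A = ["A1", "A2", "A3"]   # labels for x
B = ["B1", "B2", "B3"]   # labels for y
C = ["C1", "C2", "C3"]   # labels for z

rules = [f"IF x is {a} AND y is {b} AND z is {c} THEN f = p*x + q*y + r*z + j"
         for a, b, c in product(A, B, C)]

print(len(rules))   # -> 27
print(rules[0])
```

Three membership functions on each of three inputs gives 3 x 3 x 3 = 27 rules, matching the 27 rule nodes in the network diagram.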

8
Architecture of the Adaptive Network
  • Layer 0 consists of the input nodes
  • Layer 1
  • Nodes are square nodes with node functions
  • O_i^1 = µ_Ai(x)
  • O_i^1 = µ_Bi(y)
  • O_i^1 = µ_Ci(z)
  • For O_i^1 = µ_Ai(x):
  • Input to node i: x
  • Linguistic label associated with this node
    function: Ai
  • O_i^1 is the membership grade of Ai
  • µ_Ai(x) is a bell function

9
Membership Function
To get a bell-shaped membership distribution with
maximum equal to 1 and minimum equal to 0, we use
the generalized bell membership function

  µ_Ai(x) = 1 / (1 + (((x - c_i) / a_i)^2)^b_i)

where {a_i, b_i, c_i} is the premise parameter set.
As the values of these parameters change, the
bell-shaped function varies accordingly,
exhibiting various forms of membership functions
for linguistic label Ai.
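
The generalized bell function from Jang's ANFIS paper (cited on the references slide) can be sketched as follows; the parameter values are illustrative, not the project's trained values:

```python
# Generalized bell membership function: a controls the width,
# b the slope of the shoulders, c the center.
def bell_mf(x, a, b, c):
    """Maximum 1 at x = c, approaching 0 far from c."""
    return 1.0 / (1.0 + (((x - c) / a) ** 2) ** b)

# at the center the membership grade is exactly 1
print(bell_mf(50.0, a=10.0, b=2.0, c=50.0))  # -> 1.0
# one width away from the center it is exactly 0.5
print(bell_mf(60.0, a=10.0, b=2.0, c=50.0))  # -> 0.5
# far from the center it approaches 0
print(round(bell_mf(90.0, a=10.0, b=2.0, c=50.0), 4))
```

Tuning a, b, c reshapes the bell, which is exactly what the layer-1 training adjusts.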
10
Architecture of the Adaptive Network
  • Layer 2
  • Nodes are circle nodes whose outputs represent
    the firing strengths of the rules:
  • w_i = µ_Ai(x) · µ_Bi(y) · µ_Ci(z), i = 1, 2, 3
  • Layer 3
  • Nodes are circle nodes.
  • The ith node computes the ratio of the ith rule's
    firing strength to the sum of all rules' firing
    strengths:
  • w̄_i = w_i / (w_1 + w_2 + w_3), i = 1, 2, 3
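
Layers 2 and 3 can be sketched directly from the two formulas above; the membership grades here are made-up values standing in for the layer-1 outputs:

```python
# Illustrative layer-1 outputs (membership grades), one per label.
mu_A = [0.8, 0.3, 0.1]   # µ_Ai(x) for i = 1..3
mu_B = [0.6, 0.5, 0.2]   # µ_Bi(y)
mu_C = [0.9, 0.4, 0.1]   # µ_Ci(z)

# layer 2: w_i = µ_Ai(x) * µ_Bi(y) * µ_Ci(z)
w = [a * b * c for a, b, c in zip(mu_A, mu_B, mu_C)]

# layer 3: w̄_i = w_i / (w_1 + w_2 + w_3)
total = sum(w)
w_bar = [wi / total for wi in w]

print([round(v, 3) for v in w_bar])
print(round(sum(w_bar), 6))  # normalized strengths sum to 1 -> 1.0
```

The product in layer 2 implements the AND of the rule's three premise clauses; the normalization in layer 3 makes the rule contributions comparable.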

11
Architecture of the Adaptive Network
  • Layer 4
  • Nodes are square nodes with node function
  • O_i^4 = w̄_i · f_i = w̄_i (p_i x + q_i y + r_i z + j_i)
  • Output of layer 3: w̄_i
  • Consequent parameters: p_i, q_i, r_i, j_i
  • Layer 5
  • Single circle node in this layer, with the overall
    output computed as the summation of all incoming
    signals:
  • O_1^5 = overall output = Σ_i w̄_i f_i
         = Σ_i w_i f_i / Σ_i w_i
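
Layers 4 and 5 can be sketched the same way; the normalized firing strengths and consequent parameters below are made-up values, not trained ones:

```python
x, y, z = 30.0, 64.0, 1.0          # one hypothetical input vector
w_bar = [0.7, 0.2, 0.1]            # assumed layer-3 outputs

# illustrative consequent parameter sets (p_i, q_i, r_i, j_i)
params = [(0.1, 0.2, -0.5, 1.0),
          (0.3, -0.1, 0.2, 0.5),
          (-0.2, 0.4, 0.1, 2.0)]

# layer 4: O_i^4 = w̄_i * f_i = w̄_i * (p_i*x + q_i*y + r_i*z + j_i)
layer4 = [wb * (p * x + q * y + r * z + j)
          for wb, (p, q, r, j) in zip(w_bar, params)]

# layer 5: overall output = sum of all incoming signals
output = sum(layer4)
print(round(output, 3))
```

Because each f_i is linear in the consequent parameters, these parameters can be fitted by least squares once the w̄_i are known, which is what the hybrid training exploits.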

12
Basic steps for part 3
  • Input the command and parameters according to the
    training dataset
  • Create data structures according to those
    parameters
  • Read the node structure generated in part 1
  • Build a neuro-fuzzy network
  • Read the initial parameters generated in part 2
  • Read the training data
  • for (i = 0; i < epoch number; i++)
  •   for (j = 0; j < dataset size; j++)
  •     Read the input
  •     Calculate the node outputs of layer 1,
        layer 2, and layer 3
  •     Put the results into a 2-dimensional data
        structure
  •   Apply a Kalman filter to the data structure
  •   Use the least squares method to generate the
      parameters for layer 4
  •   for (j = 0; j < dataset size; j++)
  •     Calculate the outputs of layer 4 and layer
        5
  •     Calculate the error
  •   Use the gradient descent method to update the
      parameters for layer 1
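
The hybrid loop above can be sketched end to end. This is a deliberately tiny stand-in (1 input, 2 rules, toy data, a batch least-squares solve in place of the Kalman filter, and finite-difference gradients), not the project's code:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=50)
T = np.where(X < 5, 1.0, 2.0)                # toy class labels

def bell(x, a, b, c):
    return 1.0 / (1.0 + (((x - c) / a) ** 2) ** b)

def forward_l3(x, prem):
    w = np.array([bell(x, *p) for p in prem])   # layer 2: firing strengths
    return w / w.sum()                           # layer 3: normalization

def predict(x, prem, conseq):
    wb = forward_l3(x, prem)
    f = conseq[:, 0] * x + conseq[:, 1]          # f_i = p_i * x + j_i
    return float(np.dot(wb, f))                  # layers 4-5

def sse(prem, conseq):
    return sum((predict(x, prem, conseq) - t) ** 2 for x, t in zip(X, T))

premise = np.array([[2.0, 2.0, 2.0],             # a, b, c for rule 1
                    [2.0, 2.0, 8.0]])            # a, b, c for rule 2
lr, eps = 0.001, 1e-4
for epoch in range(10):
    # forward pass: collect layer-3 outputs, solve layer-4 params by LSE
    A = np.zeros((len(X), 4))
    for n, x in enumerate(X):
        wb = forward_l3(x, premise)
        A[n] = [wb[0] * x, wb[0], wb[1] * x, wb[1]]
    sol, *_ = np.linalg.lstsq(A, T, rcond=None)
    conseq = sol.reshape(2, 2)
    # backward pass: finite-difference gradient descent on premise params
    base = sse(premise, conseq)
    grad = np.zeros_like(premise)
    for idx in np.ndindex(premise.shape):
        p = premise.copy()
        p[idx] += eps
        grad[idx] = (sse(p, conseq) - base) / eps
    premise -= lr * grad

err = sse(premise, conseq)
print(round(err, 4))
```

The split mirrors ANFIS hybrid learning: the consequent parameters are linear given the firing strengths, so each epoch solves them exactly by least squares, while the nonlinear premise parameters are nudged by gradient descent.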

13
  • CURRENT STATUS OF THE PROJECT

14
Applying the model to our Haberman dataset
  • 1. Dataset
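
Loading the Haberman survival data (the UCI data set cited on the references slide) can be sketched as follows; the rows here are inlined sample-style values rather than the file read from disk, and the attribute meanings are as documented by the repository:

```python
import csv
import io

# Each row: patient age, year of operation (year minus 1900),
# number of positive axillary nodes, and survival status
# (1 = survived 5 or more years, 2 = died within 5 years).
sample = io.StringIO("30,64,1,1\n34,60,1,1\n75,62,1,2\n")

inputs, labels = [], []
for row in csv.reader(sample):
    age, year, nodes, cls = map(int, row)
    inputs.append((age, year, nodes))    # the three input parameters
    labels.append(cls)

print(len(inputs))   # -> 3
print(inputs[0])     # -> (30, 64, 1)
```

The three attributes per row are exactly the three input parameters the classifier feeds into the network, and the class column is the 5-year survival target.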


15
  • 2. Initial parameters generated by the system

16
  • 3. Matrix representing the node connections of
    the neuro-fuzzy network, generated by our system.

A matrix with 94 rows and 94 columns
17
  • 4. Final parameters for the Neuro-Fuzzy network

18
Fuzzy set for age
Fuzzy set produced from the final parameters
obtained in step 4
19
Fuzzy set for year of operation
20
Fuzzy set for number of nodes
21
Conclusion
22
References
  • Haberman data set,
    ftp://ftp.ics.uci.edu/pub/machine-learning-databases/
  • Adaptive-Network-Based Fuzzy Inference System,
    http://wwwmath.uni-muenster.de/SoftComputing/lehre/seminar/ss2002/jang93anfis.pdf