1
Neurofuzzy Systems
  • Dr. Béla Pataki

2
Neurofuzzy Systems
  • References
  • M. Sugeno: Industrial Applications of Fuzzy Control,
    Elsevier Science Pub. Co., 1985.
  • Fuzzy Logic Toolbox: For Use with MATLAB, Version 2,
    The MathWorks Inc., 2001.
  • H. Ishibuchi, R. Fujioka, H. Tanaka: Neural Networks
    That Learn from Fuzzy If-Then Rules, IEEE Trans. on
    Fuzzy Systems, Vol. 1, No. 2, pp. 85-97, 1993.

3
Neurofuzzy Systems
  • 1. ANFIS (Adaptive Neuro-Fuzzy Inference System)
  • Hybrid system
  • Artificial neural network
  • A Sugeno-type fuzzy system structure and training
    method

4
Neurofuzzy Systems
  • Layered feedforward artificial neural network
    structure
  • circle: fixed nodes (no adjustable parameters)
  • rectangle: nodes with adjustable parameters

5
Neurofuzzy Systems
  • Training of the parameters
  • Training is based on the patterns available: a
    pattern is a pair of an input vector (x) and the
    corresponding desired output vector (d). Training
    set: a set of Np such patterns.
  • Error to be minimized:
  • for one pattern
  • for all the patterns
  • Output of the ith node of the kth layer
    (the formulas are reconstructed below)
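The formulas themselves are images in the original slides; a plausible
reconstruction, assuming the usual squared-error notation (with O_{k,i}
denoting the output of node i in layer k), is:

  E_p = \sum_j \left( d_{p,j} - y_{p,j} \right)^2
  E   = \sum_{p=1}^{N_p} E_p
  O_{k,i} = f_{k,i}\left( O_{k-1,1}, \ldots, O_{k-1,n_{k-1}} \right)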

6
Neurofuzzy Systems
  • Learning rule for the ith parameter of the jth
    node in layer k: gradient descent on the error,
    computed by backpropagation (a sketch of the
    update is given below).
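The update rule is shown only as an image; assuming the usual
gradient-descent form with learning rate \mu on a generic parameter
\alpha_{i,j,k}, it reads:

  \alpha_{i,j,k}(t+1) = \alpha_{i,j,k}(t)
                        - \mu \, \frac{\partial E}{\partial \alpha_{i,j,k}}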

7
Neurofuzzy Systems
  • A Sugeno-type (MISO) fuzzy system (N inputs, 1
    output, NR rules)
  • kth rule
  • Firing weight of the kth rule
  • Normalized output of the kth rule
  • Output of the fuzzy system
    (the rule form and the formulas are reconstructed
    below)
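The equations appear only as images; the standard first-order Sugeno
formulation they most likely correspond to (assuming a product T-norm
for the AND connective) is:

  Rule k: IF x_1 is A_{k,1} AND ... AND x_N is A_{k,N}
          THEN y_k = c_{k,0} + c_{k,1} x_1 + ... + c_{k,N} x_N
  w_k = \prod_{i=1}^{N} \mu_{A_{k,i}}(x_i)              (firing weight)
  \bar{w}_k = w_k / \sum_{l=1}^{N_R} w_l                (normalization)
  y = \sum_{k=1}^{N_R} \bar{w}_k \, y_k                 (system output)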

8
Neurofuzzy Systems
  • Neural-network-like implementation of one fuzzy
    rule

9
Neurofuzzy Systems
  • Neural structure of the whole fuzzy system

10
Neurofuzzy Systems
  • Parameters to be adjusted (by the training):
  • parameters of the membership functions,
  • parameters of the consequents of the rules.
  • The output of the system

11
Neurofuzzy Systems
  • The output of the system linearly depends on the c
    parameters; the effect of the membership function
    parameters is hidden in the normalized firing
    weights, so it is a nonlinear relationship (see the
    expression below).
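Written out with the Sugeno output reconstructed above (an assumption,
since the slide only shows an image): for fixed normalized firing
weights \bar{w}_k the output is linear in the consequent parameters c,
while the \bar{w}_k depend nonlinearly on the membership-function
parameters:

  y = \sum_{k=1}^{N_R} \bar{w}_k \left( c_{k,0} + c_{k,1} x_1 + \ldots
      + c_{k,N} x_N \right)
    = \sum_{k=1}^{N_R} \sum_{i=0}^{N} \left( \bar{w}_k x_i \right) c_{k,i},
      \qquad x_0 \equiv 1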

12
Neurofuzzy Systems
  • Parameter adjustment (hybrid learning)
  • the c parameters can be obtained using LS
    (least-squares) estimation procedures,
  • the membership function parameters are obtained
    using backpropagation.
  • A minimal sketch of the least-squares step is
    shown below.
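A minimal Python sketch of the least-squares step under the assumptions
above; the Gaussian membership function and all names (gaussian_mf,
ls_consequents, etc.) are illustrative choices, not taken from the
slides:

  import numpy as np

  def gaussian_mf(x, c, sigma):
      # Gaussian membership function; c and sigma are premise parameters.
      return np.exp(-0.5 * ((x - c) / sigma) ** 2)

  def normalized_firing_weights(X, centers, sigmas):
      # X: (Np, N) inputs; centers, sigmas: (NR, N) premise parameters.
      # Product T-norm over the inputs gives each rule's firing weight.
      w = np.ones((X.shape[0], centers.shape[0]))
      for k in range(centers.shape[0]):
          w[:, k] = np.prod(gaussian_mf(X, centers[k], sigmas[k]), axis=1)
      return w / w.sum(axis=1, keepdims=True)

  def ls_consequents(X, d, centers, sigmas):
      # For fixed premise parameters the output is linear in c, so the
      # consequent parameters follow from an ordinary least-squares fit.
      wbar = normalized_firing_weights(X, centers, sigmas)   # (Np, NR)
      Xa = np.hstack([np.ones((X.shape[0], 1)), X])          # x0 = 1 (bias)
      A = np.einsum('pk,pi->pki', wbar, Xa).reshape(X.shape[0], -1)
      c, *_ = np.linalg.lstsq(A, d, rcond=None)
      return c.reshape(wbar.shape[1], Xa.shape[1])           # (NR, N+1)

In a full ANFIS training loop this least-squares step would alternate
with backpropagation updates of the premise parameters.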

13
Neurofuzzy Systems
  • Simple to generalize the system to the
    multiple-output (MIMO) case (each rule can have
    more than one output variable)
  • the c parameters are then indexed per output:
    c_{k,j} is the c parameter of the jth output
    variable in the kth rule

14
Neurofuzzy Systems
  • 2. Classical neural network using fuzzy data
  • 2.A. Neural network for binary classification
  • Example: 2-input binary classifier

15
Neurofuzzy Systems
  • Classical neural network: x1, x2 are crisp numbers
  • Fuzzy rules: small and large can be considered as
    fuzzy numbers. The fuzzy rule is a special pattern.
  • The same structure is used in both cases; the only
    difference is that
  • in the classical network, inputs and outputs are
    crisp,
  • in the fuzzy case, the input is a vector of fuzzy
    numbers and the desired output is crisp.

16
Neurofuzzy Systems
  • Example
  • A two-input, one-output classification network is
    to be trained using the following hybrid
    knowledge: A. patterns are collected by measuring
    two parameters x and y (the measured pattern is
    (x, y)), both parameters in the range [0, 20];
    B. some rules of thumb are available about the
    patterns.
  • Data patterns collected:
  • Class1: (1, 20), (2, 13), (15, 15)
  • Class2: (14, 19), (16, 20), (2, 5)
  • Rules of thumb known:
  • IF y is small THEN pattern belongs to Class2
  • IF x is small and y is large THEN pattern
    belongs to Class1

17
Neurofuzzy Systems
  • The rules of thumb are fuzzy rules which can be
    transformed to patterns using the following
    definitions of the linguistic variables (or fuzzy
    numbers): Small, Large, All.
  • Training set:
  • Class1: (1, 20), (2, 13), (15, 15), (Small, Large)
  • Class2: (14, 19), (16, 20), (2, 5), (All, Small)
  • (A sketch of possible membership functions for
    Small, Large and All is given below.)
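The membership functions are only shown graphically in the original;
the sketch below uses illustrative piecewise-linear shapes on the
range [0, 20] (the breakpoints are assumptions, not taken from the
slides):

  import numpy as np

  def small(v):
      # Illustrative "Small": full membership at 0, falling to 0 by 10.
      return np.clip((10.0 - np.asarray(v, dtype=float)) / 10.0, 0.0, 1.0)

  def large(v):
      # Illustrative "Large": 0 below 10, full membership at 20.
      return np.clip((np.asarray(v, dtype=float) - 10.0) / 10.0, 0.0, 1.0)

  def all_(v):
      # "All": membership 1 over the whole universe [0, 20].
      return np.ones_like(np.asarray(v, dtype=float))

  # The rule "IF x is small AND y is large THEN Class1" then enters the
  # training set as the fuzzy pattern ((Small, Large), Class1).
  print(small(2.0), large(13.0), all_(5.0))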

18
Neurofuzzy Systems
  • In neural networks the data processing is
    performed in the neurons; 3 types of operations
    are used:
  • addition,
  • multiplication with the (crisp) weights,
  • a nonlinear (typically monotonous, squashing)
    function.
  • Summary: if the 3 operations can process both
    crisp and fuzzy numbers, the same network
    structure can be used.

19
Neurofuzzy Systems
  • In neural networks the data processing is
    performed in the neurons; 3 types of operations
    are applied:
  • addition,
  • multiplication with the (crisp) weights,
  • a nonlinear (typically monotonous, squashing)
    function.
  • Summary: if the 3 operations can process both
    crisp and fuzzy numbers, the same network
    structure can be used.

20
Neurofuzzy Systems
Define the fuzzy number (A) by the alpha-cuts of
its membership function. If the (A_αL, A_αH)
intervals are given for all 1 ≥ α > 0, the
membership function is properly defined → it is
enough to define the operations for the intervals
(interval arithmetic).
21
Neurofuzzy Systems
Operations needed: addition, multiplication with
crisp numbers, and (monotonically increasing)
functions. (These interval operations are sketched
below.)
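A minimal Python sketch of these operations on one alpha-cut; the
(lo, hi) pair representation of an interval is an assumption made for
illustration:

  import math

  def add(a, b):
      # Interval addition: [aL, aH] + [bL, bH] = [aL + bL, aH + bH]
      return (a[0] + b[0], a[1] + b[1])

  def scale(w, a):
      # Multiplication with a crisp number w; a negative w swaps borders.
      lo, hi = w * a[0], w * a[1]
      return (lo, hi) if w >= 0 else (hi, lo)

  def apply_monotone(f, a):
      # A monotonically increasing function maps an interval endpoint-wise.
      return (f(a[0]), f(a[1]))

  sigm = lambda s: 1.0 / (1.0 + math.exp(-s))
  print(apply_monotone(sigm, add(scale(-3, (0, 2)), scale(0.5, (2, 3)))))
  # -> (0.0067, 0.8176), matching the alpha = 0 level of the worked
  #    example on the next slides.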
22
Neurofuzzy Systems
With these operations the forward phase of the
network can be performed. The only difference is
that the fuzzy numbers produced are approximated
by using some alpha-cuts. Example:
23
Neurofuzzy Systems
(Here w1 = -3, w2 = 0.5, wbias = 0.)
Alpha-level 0:
X1,0H = 2, X2,0L = 2
Y0L = sigm(w1·X1,0H + w2·X2,0L + wbias·1)
    = sigm(-3·2 + 0.5·2 + 0·1) = sigm(-5) = 0.0067
X1,0L = 0, X2,0H = 3
Y0H = sigm(w1·X1,0L + w2·X2,0H + wbias·1)
    = sigm(-3·0 + 0.5·3 + 0·1) = sigm(1.5) = 0.8176
Alpha-level 0.2:
X1,0.2H = 1.8, X2,0.2L = 2.16
Y0.2L = sigm(-3·1.8 + 0.5·2.16 + 0·1) = sigm(-4.32) = 0.0131
X1,0.2L = 0.2, X2,0.2H = 2.96
Y0.2H = sigm(-3·0.2 + 0.5·2.96 + 0·1) = sigm(0.88) = 0.7068
Alpha-level 0.4:
X1,0.4H = 1.6, X2,0.4L = 2.32
Y0.4L = sigm(-3·1.6 + 0.5·2.32 + 0·1) = sigm(-3.64) = 0.0256
X1,0.4L = 0.4, X2,0.4H = 2.92
Y0.4H = sigm(-3·0.4 + 0.5·2.92 + 0·1) = sigm(0.26) = 0.5646
24
Neurofuzzy Systems
Alpha-level 0.8:
X1,0.8H = 1.2, X2,0.8L = 2.64
Y0.8L = sigm(-3·1.2 + 0.5·2.64 + 0·1) = sigm(-2.28) = 0.0928
X1,0.8L = 0.8, X2,0.8H = 2.84
Y0.8H = sigm(-3·0.8 + 0.5·2.84 + 0·1) = sigm(-0.98) = 0.2729
Alpha-level 1:
X1,1H = 1, X2,1L = 2.8
Y1L = sigm(-3·1 + 0.5·2.8 + 0·1) = sigm(-1.6) = 0.1680
X1,1L = 1, X2,1H = 2.8
Y1H = sigm(-1.6) = 0.1680
(The short sketch below reproduces these numbers.)
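A short Python sketch reproducing the numbers above; the weights
w1 = -3, w2 = 0.5, wbias = 0 and the triangular alpha-cuts are read off
the slides, everything else is illustrative:

  import math

  sigm = lambda s: 1.0 / (1.0 + math.exp(-s))
  w1, w2, wbias = -3.0, 0.5, 0.0

  for alpha in (0.0, 0.2, 0.4, 0.8, 1.0):
      # Alpha-cuts of the two fuzzy inputs as used on the slides:
      # X1 is triangular with peak 1 and base [0, 2],
      # X2 is triangular with peak 2.8 and base [2, 3].
      x1_lo, x1_hi = alpha, 2.0 - alpha
      x2_lo, x2_hi = 2.0 + 0.8 * alpha, 3.0 - 0.2 * alpha
      # w1 < 0, so the lower output border uses x1_hi and the upper
      # border uses x1_lo (the border swap discussed on slide 28).
      y_lo = sigm(w1 * x1_hi + w2 * x2_lo + wbias)
      y_hi = sigm(w1 * x1_lo + w2 * x2_hi + wbias)
      print(f"alpha={alpha}: Y_L={y_lo:.4f}, Y_H={y_hi:.4f}")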
25
Neurofuzzy Systems
  • How to solve the learning phase
    (backpropagation)?
  • First the output error (to be optimized during
    the learning) should be properly defined.
  • In the binary classification case, for the sake
    of convenience, it can be assumed that
  • the desired output is either 0 or 1,
  • the real output using crisp input values
    (classical measured patterns) is a crisp number
    between 0 and 1,
  • the real output using fuzzy input numbers is a
    fuzzy number given by some alpha-cuts, and all
    the intervals lie within [0, 1].

26
Neurofuzzy Systems
Error to be optimized (a reconstruction is given below):
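The error formula is an image in the original; one plausible
reconstruction, consistent with the border-selection rule described on
slide 28 (the original may additionally weight the terms by α), is:

  e = \sum_{\alpha} (1 - Y_{\alpha L})^2   when the desired output d = 1,
  e = \sum_{\alpha} (0 - Y_{\alpha H})^2   when the desired output d = 0,

where (Y_{\alpha L}, Y_{\alpha H}) are the alpha-cut borders of the
fuzzy output.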
27
Neurofuzzy Systems
Optimization (learning): the classical
backpropagation algorithm can be used, because the
quantities entering the error (the crisp desired
output and the alpha-cut borders of the fuzzy
output) are all crisp numbers.
28
Neurofuzzy Systems
  • Care should be taken because propagating back
    through a negative weight means that the interval
    borders should be changed.
  • In the above equation one alpha-cut border of the
    output is used, which can be either the lower (L)
    or the upper (H) one. The actual (crisp) interval
    border value depends on
  • the desired value (if d = 0 the upper border is
    used, if d = 1 the lower one),
  • the sign of the weight through which the
    backpropagation is actually performed: if the
    weight is negative, then X changes either L → H
    or H → L; if it is positive, no change will occur.

29
Neurofuzzy Systems
  • 2.B. Generalization to multiclass classification
  • It is straightforward to generalize the error
    definition if at the Lth (output) layer NL nodes
    (outputs) are present (a reconstruction is given
    below).
  • In the error definition only crisp numbers
    (variables) appear again, therefore the classical
    backpropagation algorithm can be used again.
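A plausible reconstruction of the generalized error, assuming the same
border-selection rule is applied independently to each of the NL output
nodes:

  e = \sum_{j=1}^{N_L} \sum_{\alpha} \left( d_j - Y_{j,\alpha} \right)^2,

where Y_{j,\alpha} is the selected alpha-cut border of output node j
(the lower border if d_j = 1, the upper border if d_j = 0).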

30
Neurofuzzy Systems
  • 2.C. Generalization to approximation (control)
    problems
  • The error to be optimized should be properly
    defined again. The most important difference
    compared to the classification problem is that
    the desired output is a fuzzy number (variable)
    as well.
  • This rule corresponds to the pattern of fuzzy
    inputs and output (L, S, L).
  • (In the classification case, this pattern would
    look like (L, S, 1) or (L, S, 0).)

31
Neurofuzzy Systems
The error definition (for one output): again, only
crisp numbers (variables) are used in the error →
the classical backpropagation algorithm can be
used again. (A reconstruction of this error is
sketched below.)
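The formula itself is an image in the original; a plausible
reconstruction, assuming the error compares the alpha-cut borders of
the desired and the computed fuzzy outputs:

  e = \sum_{\alpha} \left[ (D_{\alpha L} - Y_{\alpha L})^2
      + (D_{\alpha H} - Y_{\alpha H})^2 \right],

where (D_{\alpha L}, D_{\alpha H}) and (Y_{\alpha L}, Y_{\alpha H}) are
the alpha-cuts of the desired and the computed output, respectively.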