A Radial Basis Neural Network For The Analysis of Transportation Data

1
A Radial Basis Neural Network For The Analysis of
Transportation Data
  • Thesis Presentation
  • By David Aguilar
  • Date: 10/04/04

2
Introduction
  • Motivation for Thesis
  • Pollution levels in L.A.
  • Cost of implementing new programs
  • Large data set lends itself to method
  • Contribution to Discipline
  • Exploration of data analysis methods
  • Examining classifier effectiveness

3
Theoretical Foundations
  • What is a Neural Network?
  • Types of Networks
  • Threshold logic units
  • Linear Associators
  • Multi-Layer Networks
  • Back Propagation Networks
  • Radial Basis Networks

4
TLUs and Linear Associators
  • Sigmoid function vs. Threshold logic
  • Z is a linear combination of input values
  • Can be used to classify linearly separable data
    e.g. AND, OR, NOT functions
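
Such a unit can be sketched in a few lines of Java (the thesis implementation language). The weights and threshold below are hand-picked illustrative values for the AND function, not values from the thesis:

```java
// Sketch of a threshold logic unit (TLU) computing the linearly
// separable AND function. Z is a linear combination of the inputs;
// the unit fires (outputs 1) when Z reaches the threshold.
public class TluAnd {
    static int tlu(int x1, int x2) {
        double w1 = 1.0, w2 = 1.0, threshold = 1.5; // hand-picked values
        double z = w1 * x1 + w2 * x2;               // linear combination
        return z >= threshold ? 1 : 0;
    }

    public static void main(String[] args) {
        System.out.println(tlu(0, 0)); // 0
        System.out.println(tlu(0, 1)); // 0
        System.out.println(tlu(1, 0)); // 0
        System.out.println(tlu(1, 1)); // 1
    }
}
```

Any linearly separable function can be realized this way by choosing suitable weights; XOR cannot, which is what motivates multi-layer networks.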

5
Multi-Layer Networks
  • More complex patterns are recognizable
  • Decision boundaries combine to enclose
    regions of accepted values
  • Useful for functions such as XOR, XNOR

6
Training The Networks
  • Delta Rule for weights:
  • Δw_i = μ(T − Z)·x_i
  • Back propagation for multilayer nets:
  • Output layer: Δw_pi = η·Z_p(1 − Z_p)(T_p − Z_p)·Z_i
  • Hidden layers: Δw_ij = η·Z_i(1 − Z_i)(Σ_p δ_p·w_pi)·Z_j, where
    Σ_p δ_p·w_pi sums the error terms back through the weights of
    the layer above
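
As a sketch, the delta rule Δw_i = μ(T − Z)·x_i can train a single threshold unit on the OR function. The learning rate, epoch count, and initial weights below are illustrative assumptions:

```java
// Minimal sketch of delta-rule training for a single threshold unit
// learning OR. Each update applies w_i += mu * (T - Z) * x_i, with a
// constant bias input of 1.
public class DeltaRule {
    static double[] w = {0.0, 0.0, 0.0};   // bias weight + two input weights

    static int out(int x1, int x2) {
        double z = w[0] + w[1] * x1 + w[2] * x2;
        return z >= 0.5 ? 1 : 0;
    }

    public static void main(String[] args) {
        int[][] X = {{0, 0}, {0, 1}, {1, 0}, {1, 1}};
        int[] T = {0, 1, 1, 1};            // OR targets
        double mu = 0.2;                   // learning rate (illustrative)
        for (int epoch = 0; epoch < 100; epoch++) {
            for (int p = 0; p < X.length; p++) {
                double err = T[p] - out(X[p][0], X[p][1]);  // (T - Z)
                w[0] += mu * err;                           // bias input is 1
                w[1] += mu * err * X[p][0];
                w[2] += mu * err * X[p][1];
            }
        }
        for (int[] x : X)
            System.out.println(out(x[0], x[1]));  // prints 0 1 1 1 (OR)
    }
}
```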

7
Radial Basis Networks
  • Gaussian units instead of sigmoid units
  • Gaussian Function: f(r) = exp(−r_i²/2σ²)
  • Output Function: Z = Σ_j f(r)_j·W_j
  • Complex pattern recognition is possible
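
The output function Z = Σ_j f(r)_j·W_j can be sketched as follows; the centers, weights, and σ used here are illustrative only:

```java
// Sketch of a radial basis layer: each hidden unit applies a Gaussian
// f(r) = exp(-r^2 / (2*sigma^2)) to its distance r from the input,
// and the network output is the weighted sum Z = sum_j f(r_j) * w_j.
public class RbfOutput {
    static double gaussian(double r, double sigma) {
        return Math.exp(-(r * r) / (2 * sigma * sigma));
    }

    static double output(double[] x, double[][] centers,
                         double[] w, double sigma) {
        double z = 0.0;
        for (int j = 0; j < centers.length; j++) {
            double r2 = 0.0;               // squared distance to center j
            for (int k = 0; k < x.length; k++) {
                double d = x[k] - centers[j][k];
                r2 += d * d;
            }
            z += gaussian(Math.sqrt(r2), sigma) * w[j];
        }
        return z;
    }

    public static void main(String[] args) {
        double[][] centers = {{0, 0}, {1, 1}};  // illustrative centers
        double[] w = {1.0, -1.0};               // illustrative weights
        System.out.println(output(new double[]{0, 0}, centers, w, 0.5));
    }
}
```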

8
RBNet Training Techniques
  • Type 1: Delta Rule for weights
  • e_i = (T_i − Z)
  • w_j(n+1) = w_j(n) + μ₁·e_i·f(r)_j
  • Type 2: Vector manipulation (moving the centers)
  • reference vector_j(n+1) = reference vector_j(n) +
    μ₂·2w_j(n)·Σ_i e_i(n)·G′(f(r)_j)·(invector_i − reference vector_j)
  • Type 3: Adjusting the sigma value
  • σ(n+1) = σ(n) − μ₃·w(n)·Σ_i e_i(n)·G′(f(r)_j)·
    (invector_i − reference vector_j)(invector_i − reference vector_j)ᵀ
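
A minimal sketch of Type 1 training (fixed centers, delta rule on the output weights) learning XOR; the two Gaussian centers at (0,0) and (1,1), the choice 2σ² = 1, the learning rate, and the epoch count are illustrative assumptions, not the thesis settings:

```java
// Type 1 RBF training sketch: centers stay fixed and only the output
// weights move, via w_j(n+1) = w_j(n) + mu * e * f(r)_j with e = T - Z.
// With centers at (0,0) and (1,1), XOR becomes linearly separable in
// the hidden-unit space, so the delta rule can solve it.
public class RbfDeltaXor {
    static double[][] centers = {{0, 0}, {1, 1}};
    static double[] w = {0, 0};       // output weights
    static double bias = 0;

    static double phi(double[] x, double[] c) {
        double r2 = 0;
        for (int k = 0; k < x.length; k++) r2 += (x[k] - c[k]) * (x[k] - c[k]);
        return Math.exp(-r2);         // Gaussian with 2*sigma^2 = 1
    }

    static double out(double[] x) {
        double z = bias;
        for (int j = 0; j < centers.length; j++) z += w[j] * phi(x, centers[j]);
        return z;
    }

    public static void main(String[] args) {
        double[][] X = {{0, 0}, {0, 1}, {1, 0}, {1, 1}};
        double[] T = {0, 1, 1, 0};    // XOR targets
        double mu = 0.2;
        for (int n = 0; n < 5000; n++)
            for (int p = 0; p < X.length; p++) {
                double e = T[p] - out(X[p]);              // e = T - Z
                for (int j = 0; j < centers.length; j++)
                    w[j] += mu * e * phi(X[p], centers[j]);
                bias += mu * e;
            }
        for (double[] x : X)
            System.out.println(Math.round(out(x)));       // 0 1 1 0
    }
}
```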

9
Implementation and Training
10
Implementation and Training
  • Java implementation
  • GUI system easy to use
  • Main commands easily accessible
  • Help files available during runtime
  • Repetitive functions easy to perform

11
Training Module
  • Three methods of training
  • Level 1: Moving Weights (Delta rule)
  • Level 2: Moving Centers
  • Level 3: Sigma Adjust

12
Training 1: Fixed Centers
13
SSE Curve
14
Network Diagram
15
Training 2: Moving Centers
16
Training 3: Moving Sigmas
17
Testing the Network
  • Allows quick comparisons of data
  • Weight/Output values can be verified

18
Execution and Analysis
  • Running a Saved Network
  • The Save feature
  • Initiating the Run module
  • The Report file

19
The Run Module
20
The Report File
  • Features
  • Filenames
  • Timestamps
  • Output values
  • Threshold
  • Rounded values
  • Accuracy level
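
A sketch of how the report's rounded values and accuracy level could be derived from raw outputs: each output is thresholded to 0/1 and compared against its target. The 0.5 threshold and the sample data are illustrative assumptions:

```java
// Illustrative accuracy computation for a report file: outputs are
// rounded against a threshold, then compared with the targets, and the
// fraction correct is reported as a percentage.
public class ReportAccuracy {
    static double accuracy(double[] outputs, int[] targets, double threshold) {
        int correct = 0;
        for (int i = 0; i < outputs.length; i++) {
            int rounded = outputs[i] >= threshold ? 1 : 0;  // rounded value
            if (rounded == targets[i]) correct++;
        }
        return 100.0 * correct / outputs.length;            // accuracy level
    }

    public static void main(String[] args) {
        double[] out = {0.91, 0.12, 0.64, 0.38};  // illustrative outputs
        int[] t = {1, 0, 0, 0};                   // illustrative targets
        System.out.println(accuracy(out, t, 0.5)); // 75.0
    }
}
```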

21
C.U.T.R. Data
  • Data consists of 16,302 records with 33
    attributes each
  • Dependent variable is the Delta_VTR
  • 4 types of Regression used to analyze data and
    select variables
  • 2 types of Neural Networks built based on each
    regression technique

22
Variable Selection Methods
  • Forward Regression: the most significant predictor is
    selected first, then others are added based on their
    contribution to variance
  • Backward Regression: all predictors are used
    initially, then less significant factors are deleted
  • Stepwise Regression: the most complex method;
    variables are selected based on their
    inter-correlations
  • Force-entered Regression: all potential predictor
    variables are used
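
Forward selection can be sketched as a greedy loop. The version below is a simplified approximation: it scores each remaining candidate by its correlation with the current residual and removes only a univariate fit at each step, whereas the regression procedures above refit the full multivariate model; the data in main are synthetic:

```java
import java.util.ArrayList;
import java.util.List;

// Simplified forward-selection sketch: repeatedly pick the candidate
// predictor most correlated with the current residual, then subtract
// that predictor's univariate least-squares fit from the residual.
public class ForwardSelect {
    static double mean(double[] v) {
        double s = 0;
        for (double x : v) s += x;
        return s / v.length;
    }

    static double corr(double[] a, double[] b) {
        double ma = mean(a), mb = mean(b), sab = 0, sa = 0, sb = 0;
        for (int i = 0; i < a.length; i++) {
            sab += (a[i] - ma) * (b[i] - mb);
            sa  += (a[i] - ma) * (a[i] - ma);
            sb  += (b[i] - mb) * (b[i] - mb);
        }
        return sab / Math.sqrt(sa * sb);
    }

    // X[j] is the j-th candidate predictor; returns indices in selection order
    static List<Integer> select(double[][] X, double[] y, int k) {
        double[] resid = y.clone();
        List<Integer> chosen = new ArrayList<>();
        for (int step = 0; step < k; step++) {
            int best = -1;
            double bestAbs = -1;
            for (int j = 0; j < X.length; j++) {
                if (chosen.contains(j)) continue;
                double c = Math.abs(corr(X[j], resid));
                if (c > bestAbs) { bestAbs = c; best = j; }
            }
            chosen.add(best);
            // subtract the univariate fit of the residual on X[best]
            double mx = mean(X[best]), mr = mean(resid), num = 0, den = 0;
            for (int i = 0; i < resid.length; i++) {
                num += (X[best][i] - mx) * (resid[i] - mr);
                den += (X[best][i] - mx) * (X[best][i] - mx);
            }
            double slope = num / den;
            for (int i = 0; i < resid.length; i++)
                resid[i] -= mr + slope * (X[best][i] - mx);
        }
        return chosen;
    }

    public static void main(String[] args) {
        double[] x0 = {1, 2, 3, 4, 5, 6};
        double[] x1 = {1, 0, 1, 0, 1, 0};
        double[] x2 = {2, 2, 3, 3, 4, 4};   // strongly correlated with x0
        double[] y = new double[6];
        for (int i = 0; i < 6; i++) y[i] = 2 * x0[i] + 0.5 * x1[i];
        System.out.println(select(new double[][]{x0, x1, x2}, y, 2));
    }
}
```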

23
RBF Architecture
  • Number of Neurons

24
RBF Architecture
  • Number of Epochs

25
RBF Approaches
  • Two types of Radial Basis analyses were utilized
  • A binary approach to analysis (8 different binary
    networks)
  • A collective analysis of all 8 bins at once
  • The results of the first approach are labeled RBF
    Net A, and the second RBF Net B

26
Performance Comparisons
  • Comparison With Linear Regression Models

27
Performance Comparisons
  • Comparison With Linear Regression Models

28
Performance Comparisons
  • Comparison with Backpropagation Models 1

29
Performance Comparisons
  • Comparison with Backpropagation Models 1

30
Performance Comparisons
  • Comparison with Backpropagation Models 2
  • (with hidden units)

31
Performance Comparisons
  • Comparison with Backpropagation Models 2

32
Conclusions
  • RBF Network performs better than regression-based
    analysis and networks on 1-Off Training results
  • RBF Network performs as well as other analysis
    methods for Exact validation
  • RBF Network performs 250% as well as the best
    regression-based method of data analysis
    (Stepwise Regression)

33
Future Work
  • Further refinements to data set may be useful
  • Platform testing to reduce training time
  • Testing on other data sets to provide additional
    evidence of effectiveness
  • Automatic selection of number of neurons
  • Improved error handling for system

34
References
  • [1] Consumer Information, California Air
    Resources Board Official Website,
    http://www.arb.ca.gov/, May 7, 2003
  • [2] R. Yelkur, Radial Basis Function Network
    for Predicting The Impact of Trip Reduction
    Strategies, Thesis report, April 1999
  • [3] R. Perez, Artificial Neural Networks,
    University of South Florida Lecture Notes, Spring
    2002
  • [4] A. Blum, R.L. Rivest, Training a 3-node
    neural network is NP-complete, Advances in
    Neural Information Processing Systems I, pp.
    494-501, San Mateo, California, 1989
  • [5] P. van der Smagt, G. Hirzinger, Why
    feed-forward networks are in a bad shape,
    Proceedings of the 8th International Conference
    on Artificial Neural Networks (ICANN), Skövde,
    Sweden, 2-4 September 1998, pp. 159-164,
    Springer-Verlag Berlin Heidelberg New York, 1998
  • [6] S. Haykin, Neural Networks: A Comprehensive
    Foundation, 2nd ed., Prentice-Hall, 1999
  • [7] D. Katic, S. Stanlovic, Fast Learning
    Algorithms for Training of Feedforward Multilayer
    Perceptrons Based on Extended Kalman Filter,
    IEEE International Conference on Neural Networks,
    Vol. 1, pp. 196-201, 1996

35
References
  • [8] N. Sundararajan, P. Saratchandran, L.
    YingWei, Radial Basis Function Neural Networks
    with Sequential Learning, World Scientific
    Publishing Co. Pte. Ltd., 1999
  • [9] T. Cover, Geometrical and Statistical
    Properties of Systems of Linear Inequalities with
    Applications in Pattern Recognition, IEEE
    Transactions on Electronic Computers,
    EC-14(3):326-334, June 1965
  • [10] L. Mendelsohn, Preprocessing Data For
    Neural Networks, Technical Analysis of
    Stocks & Commodities (magazine), Technical
    Analysis, Inc., 1993
  • [11] L. Kai Hansen, J. Larsen, Unsupervised
    Learning and Generalization, IEEE International
    Conference on Neural Networks, Vol. 1, pp. 25-30,
    1996
  • [12] T. Poggio, F. Girosi, A Theory of Networks
    for Approximation and Learning, Massachusetts
    Institute of Technology Artificial Intelligence
    Laboratory and Center for Biological Information
    Processing, Whitaker College, Cambridge, MA, 1989
  • [13] H. M. Deitel, P.J. Deitel, Java How To
    Program, 4th edition, Prentice-Hall, Inc., 1997
  • [14] P. Kanjilal, Orthogonal Transformation
    Techniques in the Optimization of Feedforward
    Neural Network Systems, Optimization Techniques,
    pp. 53-79, Academic Press, 1998