1
Functional Networks Framework
ICS 581 Advanced Artificial Intelligence
Lecture 15
Dr. Emad A. A. El-Sebakhy
Term 061, Meeting Time 6:30-7:45, Location Building 22, Room 132
2
Agenda
  • Introduction
  • What is a Neural Network (NN)?
  • What is a Functional Network (FunNet)?
  • How do Functional Networks work?
  • Differences between FunNets and NNs
  • Examples of some applications of FunNets
  • Summary

3
Introduction
The Data Pyramids
4
Knowledge Discovery (KD)
  • Data is accumulated rapidly and needs to be analyzed.
  • KD is the use of computational intelligence schemes to extract hidden patterns (useful information) from bodies of data, for use in decision support and estimation. It is the automated extraction of hidden predictive information from large databases.
  • Prediction or estimation of an outcome
  • Classification (supervised learning)
  • Clustering (unsupervised learning)

5
The Common Learning Schemes
6
Artificial Neural Networks (ANNs) Background
A neural network is a powerful data modeling tool that is able to represent complex input/output relationships (relationships that cannot be described by traditional methods).
An ANN is an information processing system that tries to simulate the human brain in the following two ways:
  • An NN acquires knowledge through learning.
  • An NN's knowledge is stored within inter-neuron connection strengths known as synaptic weights.
  • The goal of an NN:
  • Create a model that maps the input to the output using the input data.
  • The model is then used to predict the desired output when it is unknown.

7
The Most Common Neural Network Architecture
Multilayer Perceptron (MLP)
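Slide 7 shows the MLP architecture as a figure. To make the later contrast with functional networks concrete, here is a minimal sketch of an MLP forward pass (the layer sizes and the sigmoid activation are illustrative choices, not taken from the slides): every neuron applies the same fixed activation to a weighted sum of its inputs, and only the weights are learned.

    import numpy as np

    def sigmoid(z):
        # The same fixed activation is used by every neuron in a standard MLP.
        return 1.0 / (1.0 + np.exp(-z))

    def mlp_forward(x, W1, b1, W2, b2):
        # One hidden layer: each hidden neuron computes sigmoid(weighted sum of inputs).
        h = sigmoid(W1 @ x + b1)
        return W2 @ h + b2          # linear output layer

    # Illustrative shapes: 3 inputs, 5 hidden neurons, 1 output.
    rng = np.random.default_rng(0)
    W1, b1 = rng.normal(size=(5, 3)), np.zeros(5)
    W2, b2 = rng.normal(size=(1, 5)), np.zeros(1)
    y = mlp_forward(np.array([0.2, 0.5, 0.1]), W1, b1, W2, b2)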
8
Advantages of Neural Networks
  • The true power and advantage of neural networks
    lies in their ability to
  • represent both linear and non-linear
    relationships
  • learn these relationships directly from the data
    being modeled

Disadvantages of Neural Networks
  • The weights of the MLP network are initially set to random values, so the learning algorithm may take a large number of iterations to converge.
  • The speed of convergence and the stability of the backpropagation learning algorithm depend on the magnitude of the learning rate parameter; a poorly chosen learning rate causes oscillations during training.
  • The configuration of the MLP network is determined by the number of hidden layers and the number of neurons in each hidden layer. These choices are still critical, and the number of hidden layers is typically determined by trial and error.

9
Limitations of Neural Networks
  • Ad hoc approach for determining the network structure and the training process.
  • Significant inputs are not immediately obvious.
  • When to stop training to avoid over-fitting?
  • Stuck at local minima: the network may be unable to converge to the optimal solution because of the initial random weights.
  • A neural network model is a relative "black box" and has limited ability to explicitly identify possible causal relationships.
  • The multi-layer feed-forward perceptron requires off-line training and iterative presentation of the training data.
  • The choice of the number of hidden layers and the number of neurons in each hidden layer is made by trial and error.
  • In practice, it is difficult to determine a sufficient number of neurons to achieve the desired degree of approximation accuracy.

10
Agenda
  • References
  • What is a Neural Network (NN)?
  • What is a Functional Network (FunNet)?
  • How do Functional Networks work?
  • Differences between FunNets and NNs
  • Examples of some applications of FunNets
  • Summary

11
What is a Functional Network (FunNet)?
Like a neural network, a functional network is a
powerful data modeling tool that is able to
capture and represent complex input/output
relationships.
Functional networks, however, are a
generalization or extension of neural networks.
They are also problem driven (not a black box).
12
The Mathematical Definition of FunNet
A functional network is a pair (X, F), where X is a set of nodes and F = {f1, f2, ..., fp} is a set of neuron functions over X, such that every node must be either an input or an output node of at least one neuron function in F.
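As a small illustration of this definition (a sketch with illustrative node names and functions, not the network from the slides), a functional network can be stored as a set of nodes together with a list of neuron functions, each having its own input nodes and output node:

    # Toy functional network matching the definition: nodes X, and a set F of
    # neuron functions over X, every node being an input or an output of at
    # least one neuron.  Names and functions here are illustrative only.
    neurons = [
        {"name": "K", "inputs": ["a", "b"], "output": "h", "f": lambda a, b: a + b},
        {"name": "M", "inputs": ["h", "c"], "output": "i", "f": lambda h, c: h * c},
    ]

    def evaluate(neurons, values):
        # Propagate the input values through the neuron functions
        # (assumes the neurons are listed in evaluation order).
        values = dict(values)
        for n in neurons:
            values[n["output"]] = n["f"](*(values[v] for v in n["inputs"]))
        return values

    print(evaluate(neurons, {"a": 1.0, "b": 2.0, "c": 3.0}))
    # When two neurons write to the same output node (as N and M do on slide 19),
    # their values are forced to coincide, which becomes a constraint in learning.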
13
A FunNet is analogous to a Printed Circuit Board
(PCB)
14
Elements of Functional Networks
1. Input Units: a, b, c, d, e, f, g
[Network diagram: input units a-g feed computing neurons K, L, M, N, with intermediate unit h and output units i, j]
15
Elements of Functional Networks
2. Computing Neurons: K, L, M, N
[Same network diagram, with the computing neurons highlighted]
16
Elements of Functional Networks
3. Output Units: i, j
[Same network diagram, with the output units highlighted]
17
Elements of Functional Networks
4. Intermediate Units: h
[Same network diagram, with the intermediate unit highlighted]
18
Elements of Functional Networks
5. Directed Links: arrows
[Same network diagram, with the directed links (arrows) highlighted]
19
Note: N gives two outputs, i and j, and the i output of N must be identical to the output of M.
20
Agenda
  • References
  • What is a Neural Network (NN)?
  • What is a Functional Network (FN)?
  • How do Functional Networks work?
  • Differences between FNs and NNs
  • Examples of some applications of FNs
  • Summary

21
How Do Functional Networks Work?
  • Selection of the initial topology
  • Simplifying the initial topology
  • Uniqueness of representation
  • Parametric Learning
  • Model selection
  • Model validation

22
1. Selection of the Initial Topology
  • Problem-driven design: the selection of the initial topology of a functional network is often based on the characteristics of the problem at hand.

23
Example: Medical Diagnosis
Suppose the level of a disease d is a function of three symptoms x, y and z, that is, d = D(x, y, z).
Suppose we obtain the symptoms in three different sequences.
Case 1: We measure x and y, then z.
24
Example: Medical Diagnosis
Case 2: We measure y and z, then x.
25
Example Medical Diagnosis
26
Example Medical Diagnosis
Combine the three cases: the resulting network has intermediate units feeding a single output unit.
This is the initial topology of the functional network. Written out, the three cases give the functional equations shown below.
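The three measurement orders compute the same disease level through different intermediate units. A plausible rendering of the functional equations that the combined network imposes (the letters F, G, H for the output-layer neurons and K, L, M for the intermediate neurons are my own labels, and the third measurement order is assumed to be x and z first, then y):

  \[
    d \;=\; F\bigl(K(x,y),\,z\bigr) \;=\; G\bigl(x,\,L(y,z)\bigr) \;=\; H\bigl(M(x,z),\,y\bigr) \;=\; D(x,y,z).
  \]

Equations of this generalized-associativity type are exactly what the simplification step on slides 28-29 solves; under mild regularity conditions their solution is a separable form like the one shown on slide 29.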
27
The Corresponding Topology of the ANNs
What is the difference between ANNs and FunNets?
28
2. Simplifying Functional Nets
  • Can the initial topology of a FunNet be simplified?
  • Using functional equations, we can determine whether or not there exists a simpler but equivalent functional network that gives the same output for the same input.

29
Simplifying Functional Nets (cont.)
From these functional equations we obtain d = D(x, y, z) = k(p(x) + q(y) + r(z)).
Therefore, the initial network is equivalent to the simpler, separable network shown on the slide.
30
3. Uniqueness of Representation
  • Given the topology of a FunNet, we need to know
    the conditions for uniqueness (whether or not
    several sets of functions (neurons) lead to
    exactly the same output for the same input).
  • See the following list of references for more
    details.
  • Castillo, E., Cobo, A., Gómez, N., and Hadi, A. (2000), "A General Framework for Functional Networks," Networks, 35, 70-82.
  • Castillo, E., Gutiérrez, J. M., Hadi, A. S., and Lacruz, B. (2001), "Some Applications of Functional Networks in Statistics and Engineering," Technometrics, 43, 10-24.
  • Castillo, E., Hadi, A., and Lacruz, B. (2001), "Optimal Transformations in Multiple Linear Regression Using Functional Networks," Proceedings of the International Work-Conference on Artificial and Natural Neural Networks (IWANN 2001), Lecture Notes in Computer Science 2084, Part I, 316-324.
  • Castillo, E., Hadi, A. S., Lacruz, B., and Pruneda, R. E. (2003), "Functional Network Models in Statistics," Monografías del Seminario Matemático García de Galdeano, 27, 174-177.

31
More References
  • El-Sebakhy, E. A. (2004), "Functional Networks Training Algorithm for Statistical Pattern Recognition," Ninth IEEE International Symposium on Computers and Communications, Vol. 1, 92-97.
  • Castillo, E. and Hadi, A. S. (2006), "Functional Networks," in Encyclopedia of Statistical Sciences (Samuel Kotz, N. Balakrishnan, Campbell B. Read, and Brani Vidakovic, eds.), 4, 2573-2583.
  • El-Sebakhy, E. A. (2005), "Unconstrained Functional Networks Classifier," International Conference on Artificial Intelligence and Machine Learning (AIML05), Vol. 3, 19-21 December 2005, 99-105.
  • El-Sebakhy, E. A., Faisal, K., El-Bassuny, T., Azzedin, F., and Al-Suhaim, A. (2006), "Evaluation of Breast Cancer Tumor Classification with Unconstrained Functional Networks Classifier," 4th ACS/IEEE International Conference on Computer Systems and Applications, 281-287.
  • El-Sebakhy, E. A., Kanaan, F. A., and Hadi, A. S. (2006), "Iterative Least Squares Functional Networks Classifier," IEEE Transactions on Neural Networks, Vol. 2, March 2007.
  • El-Sebakhy, E. A. (2007), "Functional Networks as a Novel Approach for Building Knowledge-Based Classification Systems," Journal of Artificial Intelligence (in press).
  • El-Sebakhy, E. A. (2007), "Constrained Estimation Functional Networks for Statistical Pattern Recognition Problems," International Journal of Machine Learning (in press).
  • El-Sebakhy, E. A. (2007), "Mining the Breast Cancer Diagnosis Using Functional Networks-Maximum Likelihood Classifier," International Journal of Bioinformatics (in press).

32
4. Parametric Learning
Each neuron function in a functional network can be approximated by a family of linearly independent functions Φj = {φj1(x), ..., φjq(x)}, j = 1, ..., p, where p is the number of neuron functions and q is the number of elements in each family.
The common families of linearly independent functions are:
Polynomial family (e.g., {1, x, x^2, ..., x^q})
Exponential family (e.g., {1, e^x, e^(-x), ...})
Fourier family (e.g., {1, sin x, cos x, sin 2x, cos 2x, ...})
A sketch of how these families are used follows.
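As an illustration of how such families are used in practice (a sketch; the family sizes and the NumPy-based construction are my own choices), each neuron function is replaced by a linear combination of basis functions, so learning reduces to estimating the coefficients:

    import numpy as np

    def polynomial_basis(x, q):
        # {1, x, x^2, ..., x^(q-1)} evaluated at each data point.
        return np.vander(np.asarray(x, dtype=float), N=q, increasing=True)

    def fourier_basis(x, q):
        # {1, sin(x), cos(x), sin(2x), cos(2x), ...}, truncated to q terms.
        # (An exponential family can be built the same way with exp(+/- k*x).)
        x = np.asarray(x, dtype=float)
        cols, k = [np.ones_like(x)], 1
        while len(cols) < q:
            cols.append(np.sin(k * x))
            if len(cols) < q:
                cols.append(np.cos(k * x))
            k += 1
        return np.column_stack(cols)

    # A neuron function f_j(x) is then approximated as polynomial_basis(x, q) @ a_j,
    # where the coefficient vector a_j is learned from data.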
33
Example: Nonlinear Parametric Learning
Medical Diagnosis Example (cont.)
Let {(x_i, y_i, z_i, d_i), i = 1, ..., n} be the training set. We can write the model as on slide 29, and the functions k, p, q and r can each be approximated by a linear combination of basis functions from one of the families above.
34
Nonlinear Parametric Learning
The parameters Θ = {a_j, b_j, c_j, d_j : j = 1, ..., q} can then be estimated by minimizing some function of the errors over the training data, such as the sum of squared errors, subject to the uniqueness constraints (a minimal sketch follows).
This leads to a nonlinear system of equations or to a nonlinear programming problem.
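A minimal sketch of this estimation step for a single neuron function, assuming a polynomial family, a sum-of-squared-errors criterion, and a uniqueness constraint imposed by fixing the function's value at a point (the specific criterion and constraint on the slide are shown as images, so these are assumptions):

    import numpy as np
    from scipy.optimize import minimize

    def fit_neuron(x, d, q, x0=0.0, f0=0.0):
        # Estimate coefficients a of f(x) = sum_j a_j * x**j by least squares,
        # subject to the uniqueness constraint f(x0) = f0.
        Phi = np.vander(np.asarray(x, dtype=float), N=q, increasing=True)
        phi0 = np.vander([x0], N=q, increasing=True)[0]

        def sse(a):                         # sum of squared errors
            return np.sum((d - Phi @ a) ** 2)

        constraint = {"type": "eq", "fun": lambda a: phi0 @ a - f0}
        return minimize(sse, x0=np.zeros(q), constraints=[constraint]).x

    # Example with synthetic data consistent with the constraint f(0) = 0:
    rng = np.random.default_rng(1)
    x = rng.uniform(0, 1, 50)
    d = 2.0 * x + 0.5 * x ** 2 + rng.normal(0, 0.01, 50)
    print(fit_neuron(x, d, q=3))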
35
5. Model Selection
  • There are two questions to be answered when selecting a functional network:
  • Which family of functions should be used?
  • Which terms in the family are important?

6. Model Validation
Tests for quality and cross-validation are performed, using internal and external validation techniques. See the following list of references for more details.
36
Model Selection: Let x be a sample of size n and let θ be the set of parameters to be estimated.
  • We select the best model as follows:
  • Selection methods:
  • Backward-Forward (BF)
  • Forward-Backward (FB)
  • Quality criterion: we use the minimum description length (MDL), as sketched below.
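A sketch of the selection loop with an MDL-style quality measure (the MDL formula on the slide is an image; the form used below, n/2·log(RSS/n) + k/2·log(n), is a common choice and is assumed here; only the backward-elimination half of the Backward-Forward method is shown):

    import numpy as np

    def mdl(rss, n, k):
        # MDL-style criterion: goodness-of-fit term plus a complexity penalty.
        return 0.5 * n * np.log(max(rss, 1e-12) / n) + 0.5 * k * np.log(n)

    def backward_elimination(Phi, y):
        # Greedily drop columns of the design matrix Phi (one column per
        # candidate basis term), keeping the subset with the smallest MDL.
        n, q = Phi.shape
        active = list(range(q))

        def score(cols):
            a, *_ = np.linalg.lstsq(Phi[:, cols], y, rcond=None)
            rss = float(np.sum((y - Phi[:, cols] @ a) ** 2))
            return mdl(rss, n, len(cols))

        best, improved = score(active), True
        while improved and len(active) > 1:
            improved = False
            for c in list(active):
                trial = [t for t in active if t != c]
                if (s := score(trial)) < best:
                    best, active, improved = s, trial, True
        return active, best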

37
More References
(Same reference list as on slide 31.)

38
From data to predictions
[Workflow diagram with the stages: FunNets architectural design, FunNets learning algorithm, functional networks training algorithm, and implementation and prediction]
39
Agenda
  • References
  • What is a Neural Network (NN)?
  • What is a Functional Network (FN)?
  • How do Functional Networks work?
  • Differences between FNs and NNs
  • Examples of some applications of FNs
  • Summary

40
Differences Between FunNets and ANNs
  • The topology of an NN is chosen from among several topologies using trial and error. The initial topology of a FunNet is problem driven and can be simplified using functional equations.
  • In standard NNs, the neural functions are given and the weights are learned. In FunNets, the neural functions themselves are learned from data.
  • In standard NNs all the neural functions are identical, univariate and single-argument (a weighted sum of input values). In FunNets the neural functions can be different, multivariate, and/or multi-argument (see the schematic after this list).
  • In FunNets, common outputs of different functions (neurons) are forced to be identical. This structure is not possible in standard neural networks.
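In symbols, the third difference above can be summarized as follows (a schematic rendering, not a formula taken from the slides):

  \[
    \text{MLP neuron: } y = \sigma\Bigl(\sum_i w_i x_i + b\Bigr)\ \text{(fixed }\sigma\text{, learned }w_i, b\text{)},
    \qquad
    \text{FunNet neuron: } y = f(x_1,\ldots,x_k)\ \text{(the multivariate function } f \text{ itself is learned)}.
  \]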

41
Agenda
  • References
  • What is a Neural Network (NN)?
  • What is a Functional Network (FunNet)?
  • How do Functional Networks work?
  • Differences between FunNets and NNs
  • Examples of some applications of FunNets
  • Summary

42
Typical Applications of Functional Networks
  • Optical Character Recognition (OCR): scanning typewritten/handwritten documents, fingerprints, etc.
  • Voice Recognition: transcribing spoken words into ASCII text
  • Medical Diagnosis: assisting doctors with their diagnoses by analyzing the reported symptoms and/or medical imaging data such as MRIs or X-rays
  • Machine Diagnostics: detecting when a machine has failed, so that the system can automatically shut the machine down when this occurs
  • Target Recognition: a military application which uses video and/or infrared image data to determine whether an enemy target is present

43
Typical Applications of Functional Networks
  • Targeted Marketing: finding the set of demographics which have the highest response rate for a particular marketing campaign
  • Intelligent Searching: an internet search engine that provides the most relevant content and banner ads based on the users' past behavior
  • Fraud Detection: detecting fraudulent credit card transactions and automatically declining the charge

44
Some Examples of Functional Networks
  • Modeling structural engineering problems
  • Bayesian Statistics
  • Time series
  • Iterative problems
  • Bioinformatics
  • Transformations of variables
  • Nonlinear Regression
  • Pattern Classification
  • Cryptography and security
  • Signal Processing with complex arguments

45
Example: Iterative Functions
  • Suppose that we wish to calculate the n-th iterate of a given function f, that is, f^(n)(x) = f(f(...f(x)...)), with f applied n times.

1. Selecting the Initial Topology
The initial network is a cascade of n identical neurons, each computing f, which maps the input x to the output y = f^(n)(x).
2. Simplifying the Initial Topology
Let f(x) = g^(-1)(g(x) + 1) for some invertible function g, so that g(f(x)) = g(x) + 1.
Since f^(n)(x) = f(f^(n-1)(x)),
then g(f^(n)(x)) = g(x) + n.
46
2. Simplifying the Initial Topology (cont.)
A general solution of this functional equation is f^(n)(x) = g^(-1)(g(x) + n).
So, the two functional networks (the cascade of n neurons computing f and the single-unit network built from g) are equivalent.
47
3. Uniqueness of Representation
Since the same output can be produced by more than one choice of g, the uniqueness of representation implies solving the functional equation g1^(-1)(g1(x) + n) = g2^(-1)(g2(x) + n) for all x and n,
with unique solution g2(x) = g1(x) + c,
where c is an arbitrary constant.
So, the function g must be fixed at a point.
48
4. Learning the Model
49
4. Learning the Model
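The learning formulas on slides 48-49 appear as images. A minimal sketch of this step, assuming the representation f(x) = g^(-1)(g(x) + 1) introduced on slide 45, a polynomial family for g, data points (x_i, y_i) with y_i = f(x_i), and the uniqueness constraint g(x0) = 0 (all of these modeling choices are assumptions, not taken from the slides):

    import numpy as np

    def learn_g(x, y, q, x0=0.0):
        # Learn polynomial coefficients a of g so that g(y_i) ~ g(x_i) + 1,
        # where y_i = f(x_i).  The constant basis term is omitted because a
        # constant shift of g cancels in g(y) - g(x); g is then re-anchored
        # so that g(x0) = 0 (the uniqueness constraint).
        def basis(t):
            return np.vander(np.asarray(t, dtype=float), N=q + 1, increasing=True)[:, 1:]
        A = basis(y) - basis(x)                     # linear in the coefficients
        a, *_ = np.linalg.lstsq(A, np.ones(len(x)), rcond=None)
        offset = basis([x0])[0] @ a
        return lambda t: basis(np.atleast_1d(t)) @ a - offset

    # Once g is learned, the n-th iterate can be evaluated as
    # f^(n)(x) = g^(-1)(g(x) + n), inverting g numerically (e.g., by root finding).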
50
Example: Nonlinear Regression
  • Consider the semi-parametric regression model
  • h(y) = f1(x1) + ... + fq(xq)    (1)
  • FunNets do not require the functions h(.), f1(.), ..., fq(.) to be known.

51
Assuming that h(.) is invertible, the semi-parametric regression model can be represented by y = h^(-1)(f1(x1) + ... + fq(xq)) and by the following functional network:
Example: Nonlinear Regression
52
Numeric Example
A data set consisting of n = 40 observations is generated from the model shown on the slide, where X1 and X2 are U(0, 1) and the error term is U(-0.005, 0.005).
53
We now fit the following model to these data: h(y) = f1(x1) + f2(x2), which is equivalent to y = h^(-1)(f1(x1) + f2(x2)). (A sketch of this fit appears after this slide.)
Numeric Example
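The fitted transformations on the slides are shown as plots and tables. As a sketch of how this fit reduces to ordinary linear least squares once polynomial families are chosen for h, f1 and f2 (the normalization below, which fixes the coefficient of y in h to 1 and sets f1(0) = f2(0) = 0, and the data generator in the usage lines are my own assumptions, not those used on the slides):

    import numpy as np

    def fit_transform_model(x1, x2, y, q=3):
        # Fit h(y) = f1(x1) + f2(x2) with polynomial families of size q.
        # Writing h(y) - f1(x1) - f2(x2) = 0 for every observation gives a
        # homogeneous linear system in the coefficients; the trivial solution
        # is ruled out by fixing the coefficient of y in h to 1, and the
        # constant terms of f1 and f2 are dropped (they are absorbed into h).
        def basis(t):
            return np.vander(np.asarray(t, dtype=float), N=q, increasing=True)
        By, B1, B2 = basis(y), basis(x1), basis(x2)
        A = np.column_stack([By[:, [0] + list(range(2, q))], -B1[:, 1:], -B2[:, 1:]])
        rhs = -By[:, 1]
        coef, *_ = np.linalg.lstsq(A, rhs, rcond=None)
        return coef  # remaining coefficients of h, then of f1, then of f2

    # Hypothetical usage with a generator of the same general form as slide 52:
    rng = np.random.default_rng(2)
    x1, x2 = rng.uniform(0, 1, 40), rng.uniform(0, 1, 40)
    y = np.exp(x1 + x2 ** 2) + rng.uniform(-0.005, 0.005, 40)
    print(fit_transform_model(x1, x2, y, q=3))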
54
An Example
Consider the function f(x) given on the slide and suppose that we are interested in its n-th iterate.
Assume also that we have a set of data points {(x_i, f(x_i)), i = 1, ..., m} with which to learn the function f(x).
Then, we select a polynomial family and learn its coefficients.
55
An Example
Then, we select a polynomial family and minimize the error criterion shown on the slide.
56
An Example
We obtain the following models with the Exhaustive and Backward-Forward methods (results shown on the slide).
57
An Example
Forward-backward method
58
An Example
Forward-backward method
59
An Example
Exact and predicted values for different values
of n.
60
Questions?