1. Kharkiv National University of Radio Electronics
Control Systems Research Laboratory
HYBRID GMDH-NEURAL NETWORK OF COMPUTATIONAL
INTELLIGENCE
Bodyanskiy Yevgeniy, Doctor of Sciences, Professor of the Artificial Intelligence Department, Scientific Head of CSRL
Vynokurova Olena, Ph.D., Senior Researcher of CSRL
Pliss Iryna, Ph.D., Leading Researcher of CSRL
Control Systems Research Laboratory, Kharkiv National University of Radio Electronics, Lenina av., 14, Kharkiv, Ukraine, 61166
E-mails: {bodya, vinokurova, pliss}_at_kture.kharkov.ua
2. GMDH-Neural Network
3. Hybrid GMDH-Neural Network Architectures
4. Q-Neuron Structure
Input signal
5. Q-Neuron Structure
Output of Q-neuron
which, considering the notation where
can be rewritten in the following form
or in the more compact matrix form
where
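The Q-neuron equations on this slide were embedded as images and did not survive extraction. A hedged reconstruction, following the standard quadratic-neuron (Q-neuron) definition in [7], where the symbols $w_0$, $w_i$, $w_{ij}$, $\varphi$ are assumed notation:

```latex
\hat{y}(k) = w_0 + \sum_{i=1}^{n} w_i\, x_i(k)
           + \sum_{i=1}^{n} \sum_{j=i}^{n} w_{ij}\, x_i(k)\, x_j(k)
           = w^{\mathrm{T}} \varphi(x(k)),
```

where $\varphi(x) = (1,\, x_1, \ldots, x_n,\, x_1^2,\, x_1 x_2, \ldots, x_n^2)^{\mathrm{T}}$ stacks the bias, linear, and pairwise quadratic terms, and $w$ collects all tunable parameters into a single vector.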
6. Learning algorithm of Q-neuron
Optimization criterion
Learning algorithm
Lyapunov function
where
7. Learning algorithm of Q-neuron
Afterwards, solving the difference equation
taking into account that
we can obtain the optimal value of the step parameter in the form
Learning algorithm
or
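The learning formulas on slides 6-7 were likewise lost in extraction. Assuming the standard quadratic criterion and a Lyapunov-based derivation of the optimal step (consistent with the Kaczmarz algorithm mentioned on slide 13), a plausible reconstruction is:

```latex
E(k) = \tfrac{1}{2}\, e^{2}(k), \qquad
e(k) = y(k) - w^{\mathrm{T}}(k-1)\,\varphi(x(k)),
```

with the gradient step and its optimal step-size value

```latex
w(k) = w(k-1) + \eta(k)\, e(k)\, \varphi(x(k)), \qquad
\eta(k) = \frac{1}{\left\| \varphi(x(k)) \right\|^{2}},
```

which together yield the Kaczmarz-type update $w(k) = w(k-1) + e(k)\,\varphi(x(k)) / \|\varphi(x(k))\|^{2}$.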
8. Wavelon Structure
Input signal
Output of wavelon
where
9. Wavelon Structure
Adaptive membership function based on the Mexican hat wavelet and the Itakura-Saito metric
where
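The membership function formula on this slide was an image. A hedged reconstruction, assuming the Mexican hat wavelet with tunable center $c_{ji}$ and width $\sigma_{ji}$ (assumed symbols):

```latex
\mu_{ji}\bigl(x_i(k)\bigr) = \bigl(1 - \tau_{ji}^{2}(k)\bigr)\, e^{-\tau_{ji}^{2}(k)/2},
\qquad
\tau_{ji}(k) = \frac{x_i(k) - c_{ji}}{\sigma_{ji}},
```

where the Itakura-Saito metric [9] would replace the Euclidean distance $\tau_{ji}^{2}$ as the argument of the wavelet; the exact combination used in the original slide cannot be recovered from the extracted text.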
10. Learning algorithm of wavelon
Optimization criterion
In the general case, the learning algorithm can be written in the form
11. Learning algorithm of wavelon
where
12. Learning algorithm of wavelon
Using the matrix inversion lemma and applying simple transformations, we obtain an effective parameter learning algorithm in the form
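The resulting equation was an image in the original slide. The standard outcome of applying the matrix inversion lemma to the least-squares normal equations is the recursive-least-squares form, which is a plausible reconstruction here ($w$, $\varphi$, $e$ as before; $P$ is the inverse-covariance-type matrix, an assumed symbol):

```latex
w(k) = w(k-1) + \frac{P(k-1)\,\varphi(k)\, e(k)}
                     {1 + \varphi^{\mathrm{T}}(k)\, P(k-1)\, \varphi(k)},
\qquad
P(k) = P(k-1) - \frac{P(k-1)\,\varphi(k)\,\varphi^{\mathrm{T}}(k)\, P(k-1)}
                     {1 + \varphi^{\mathrm{T}}(k)\, P(k-1)\, \varphi(k)}.
```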
13. Learning algorithm of wavelon
It is known that one-step algorithms, such as the Kaczmarz algorithm, have a rapid response, but they lack filtering properties, i.e. they do not perform well under intensive disturbances and noise. For this reason, the learning algorithm is endowed with smoothing properties.
Proposed learning algorithm
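The proposed algorithm itself was an image in the original slide. A hedged reconstruction that matches the stated goal, a Kaczmarz-type update endowed with smoothing (filtering) properties, is the exponentially weighted variant ($\alpha$ and $r$ are assumed symbols):

```latex
w(k) = w(k-1) + \frac{e(k)\,\varphi(x(k))}{r(k)}, \qquad
r(k) = \alpha\, r(k-1) + \left\| \varphi(x(k)) \right\|^{2}, \quad 0 \le \alpha \le 1,
```

for $\alpha = 0$ this reduces to the one-step Kaczmarz algorithm (fast but unfiltered), while for $0 < \alpha \le 1$ the accumulated denominator $r(k)$ averages over past regressors and suppresses disturbances and noise.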
14. Results of Experiments
15. Experiment results
Signal emulation task
Narendra-Parthasarathy dynamical object
Every Q-neuron was trained in batch mode for 10 epochs. The initial values of the Q-neuron parameters were set to zero.
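As an illustration of this setup, a minimal sketch in Python: the plant equation, input signal, and learning rate below are assumptions (the slides do not specify which of the Narendra-Parthasarathy benchmark plants [13] was used), while the zero initialization and the 10 batch epochs follow the slide.

```python
import numpy as np

def plant(y, u):
    # assumed benchmark plant: y(k+1) = y(k) / (1 + y(k)^2) + u(k)^3
    return y / (1.0 + y ** 2) + u ** 3

def q_features(x):
    # quadratic (Q-neuron) expansion: bias, linear and pairwise terms
    n = len(x)
    quad = [x[i] * x[j] for i in range(n) for j in range(i, n)]
    return np.array([1.0, *list(x), *quad])

# generate input/output data from the plant
u = np.sin(2.0 * np.pi * np.arange(200) / 25.0)   # assumed excitation signal
y = np.zeros(201)
for k in range(200):
    y[k + 1] = plant(y[k], u[k])

X = np.array([q_features([y[k], u[k]]) for k in range(200)])
t = y[1:]

w = np.zeros(X.shape[1])          # zero initial parameters, as in the slides
eta = 0.05                        # assumed learning rate
for epoch in range(10):           # batch mode, 10 epochs
    e = t - X @ w
    w += eta * X.T @ e / len(t)   # batch gradient step on the quadratic criterion

print("MSE after training:", np.mean((t - X @ w) ** 2))
```

The quadratic feature expansion makes the Q-neuron linear in its parameters, so a simple batch gradient step on the quadratic criterion suffices here; the online algorithms above replace it when samples arrive one at a time.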
16. Experiment results
[Figure: emulation results vs. actual values]
17. Experiment results
Signal forecasting task
Cryobiology time series
The second experiment was carried out on a real electroencephalogram (EEG) signal recorded during deep artificial hypothermia, using the GMDH-neural network with the adaptive compartmental wavelon. The EEG signal was obtained from an experiment performed on a male Wistar rat with a mass of 180-200 g during the winter period. The signal quantization frequency was 40 Hz. The hypometabolic state in the rat was induced by the Andzhus-Bakhmetev-Dzhaja method. The data were obtained in the course of joint research under scientific collaboration with the Institute of Cryobiology and Cryomedicine of the Academy of Medical Sciences.
18. Experiment results
[Figure: forecasted vs. actual values]
19. Conclusion
A modified GMDH-neural network architecture has been proposed, in which Q-neurons and the adaptive compartmental wavelon are used as nodes. A computationally simple and effective Q-neuron learning algorithm in matrix form has been considered, which allows processing of non-stationary nonlinear signals and sequences under conditions of significant uncertainty. A highly effective learning algorithm for all parameters of the adaptive compartmental wavelon has been proposed. Replacing the standard GMDH neural network node and expanding the number of node inputs improve the approximating properties of the network. Experimental simulations on different kinds of signals have been carried out, and their results confirm the advantages of the proposed approach.
20. Literature
1. Stepashko V. S. Analyzing the Criteria Effectiveness for Structure Identification of Forecasting Models. Problems of Control and Informatics, 28(3-4), 1994.
2. Ivakhnenko A. G., Madala H. R. Inductive Learning Algorithms for Complex Systems Modeling. London, Tokyo: CRC Press, 1994.
3. Bishop C. M. Neural Networks for Pattern Recognition. Oxford: Clarendon Press, 1995.
4. Bodyanskiy Ye., Pliss I., Vynokurova O. Adaptive wavelet-neuro-fuzzy network in the forecasting and emulation tasks. Int. Journal on Information Theory and Applications, 15(1): 47-55, 2008.
5. Bodyanskiy Ye., Vynokurova O. Hybrid radial-basis neuro-fuzzy wavelon in the non-stationary sequences forecasting problems. Proc. 2nd Int. Conf. on Inductive Modelling, 144-147, 2008.
6. Carotenuto L., Raiconi G. On the minimization of quadratic functions with bilinear constraints via augmented Lagrangians. Journal of Optimization Theory and Applications, 55: 23-36, 1987.
7. Bodyanskiy Ye., Vynokurova O., Pliss I. Hybrid neural network architecture on Q-neurons and its learning algorithms. Proc. Int. Conf. "Intellectual Systems for Decision Making and Problems of Computational Intelligence", 2: 235-239, 2009. (in Russian)
8. Jang J. S. R., Sun C. T. Functional equivalence between radial basis function networks and fuzzy inference systems. IEEE Trans. on Neural Networks, 4: 156-159, 1993.
9. Itakura F. Minimum prediction residual principle applied to speech recognition. IEEE Trans. on Acoustics, Speech and Signal Processing, 23: 67-72, 1975.
10. Mitaim S., Kosko B. What is the best shape for a fuzzy set in function approximation? Proc. 5th IEEE Int. Conf. on Fuzzy Systems (FUZZ-96), 2: 1237-1213, 1996.
11. Bodyanskiy Ye., Vynokurova O., Yegorova E. Radial-basis-fuzzy-wavelet-neural network with adaptive activation membership function. Int. Journal on Artificial Intelligence and Machine Learning, 8: 9-15, 2008.
12. Bodyanskiy Ye., Vynokurova O. Adaptive wavelon and its learning algorithm. Int. Journal Control Systems and Computers, 1: 47-53, 2009. (in Russian)
13. Narendra K. S., Parthasarathy K. Identification and control of dynamical systems using neural networks. IEEE Trans. on Neural Networks, 1: 4-26, 1990.
14. Jekabsons G. A software tool for performing regression modelling using various modelling methods. http://www.cs.rtu.lv/jekabsons/.
21. THANK YOU FOR YOUR ATTENTION AND PATIENCE