Title: A Hybrid Self-Organizing Neural Gas Network
1. A Hybrid Self-Organizing Neural Gas Network
- James Graham and Janusz Starzyk
- School of EECS, Ohio University
- Stocker Center, Athens, OH 45701 USA
- IEEE World Congress on Computational Intelligence (WCCI 2008) - June 1-6, 2008
- Hong Kong
2. Introduction
- Self-organizing networks
  - Useful for representation building in unsupervised learning
  - Useful for clustering, visualization, and feature maps
  - Numerous applications in surveillance, traffic monitoring, flight control, rescue missions, reinforcement learning, etc.
- Some types of self-organizing networks
  - Traditional Self-Organizing Map (SOM)
  - Parameterless SOM
  - Neural Gas Network
  - Growing Neural Gas (GNG)
  - Self-Organizing Neural Gas (SONG)
3. Description of the Approach - Fritzke's GNG Network Algorithm Highlights
- GNG starts with a set A of two units a and b at random positions wa and wb in R^n.
- In the set A, find the two nearest neighbors s1 and s2 to the input signal x.
- Connect s1 and s2 with an edge and set the edge age to zero.
- Adjust the positions of s1 and its neighborhood by a constant times (x - s1): εb for s1 and εn for the neighborhood.
- Remove edges in the neighborhood that are older than amax.
- Place a new node every λ cycles between the node with the greatest error and its nearest neighbor.
- Reduce the error of the maximum-error node and of its nearest neighbor by α, and add the removed error to the new node.
- Reduce the error of all nodes by a constant (β) times their current error.
4. Example
- Example of Fritzke's network results for 40,000 iterations with the following constants: εb = 0.05, εn = 0.0006, amax = 88, λ = 200, α = 0.5, β = 0.0005.
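The GNG loop and the constants above can be sketched as a short Python program. This is a minimal illustration, not the authors' code: the function name `gng` and the bookkeeping choices (a dict of edge ages keyed by sorted node pairs, a list of per-unit errors) are my own, and removal of units left without edges is omitted for brevity.

```python
import numpy as np

def gng(data, max_nodes=50, eps_b=0.05, eps_n=0.0006,
        a_max=88, lam=200, alpha=0.5, beta=0.0005,
        n_iter=5000, seed=0):
    rng = np.random.default_rng(seed)
    # Start with two units a and b at random positions in the data's bounding box.
    W = list(rng.uniform(data.min(0), data.max(0), size=(2, data.shape[1])))
    E = [0.0, 0.0]                       # accumulated squared error per unit
    edges = {}                           # (i, j) with i < j -> edge age
    for t in range(1, n_iter + 1):
        x = data[rng.integers(len(data))]
        # Find the two units nearest to the input signal x.
        d2 = [float(np.sum((w - x) ** 2)) for w in W]
        order = np.argsort(d2)
        s1, s2 = int(order[0]), int(order[1])
        # Age every edge emanating from s1 and accumulate s1's error.
        for e in edges:
            if s1 in e:
                edges[e] += 1
        E[s1] += d2[s1]
        # Move s1 (by eps_b) and its topological neighbors (by eps_n) toward x.
        W[s1] = W[s1] + eps_b * (x - W[s1])
        for e in edges:
            if s1 in e:
                n = e[0] + e[1] - s1
                W[n] = W[n] + eps_n * (x - W[n])
        # Connect s1 and s2 with an age-zero edge (refreshing any existing one).
        edges[(min(s1, s2), max(s1, s2))] = 0
        # Drop edges older than a_max (removal of isolated units is omitted).
        edges = {e: a for e, a in edges.items() if a <= a_max}
        # Every lam steps, insert a unit halfway between the maximum-error
        # unit q and its worst neighbor f, reducing their errors by alpha.
        if t % lam == 0 and len(W) < max_nodes:
            q = int(np.argmax(E))
            nbrs = [e[0] + e[1] - q for e in edges if q in e]
            if nbrs:
                f = max(nbrs, key=lambda n: E[n])
                r = len(W)
                W.append(0.5 * (W[q] + W[f]))
                edges.pop((min(q, f), max(q, f)), None)
                edges[(min(q, r), max(q, r))] = 0
                edges[(min(f, r), max(f, r))] = 0
                E[q] *= alpha
                E[f] *= alpha
                E.append(E[q])      # new unit inherits the reduced error
        # Decay all accumulated errors by a factor beta.
        E = [err * (1.0 - beta) for err in E]
    return np.array(W), edges
```

Run on a 2-D uniform distribution, the network grows one unit every λ steps and the units spread over the data support.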
5. Description of the Approach - Proposed Hybrid SONG Network Algorithm Highlights
- SONG starts with a random pre-generated network of a fixed size.
- Connections get stiffer with age, making their weights harder to change.
- Error is calculated after the node position updates rather than before.
- Weight adjustment and error distribution are functions of distance rather than of arbitrary, hard-to-set constants.
- Edge connections are removed only under the following conditions:
  - When a connection is added and the node has a long connection more than 2x its average connection length, the long edge is removed.
  - When a node is moved and has at least 2 connections (after attaching to its destination node), its longest connection is removed.
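The two edge-removal conditions are concrete enough to sketch. Only the rules themselves come from the slide; the helper names (`prune_after_add`, `prune_after_move`) and the representation of edges as 2-tuples over a position dict are assumptions for illustration.

```python
import math

def edge_length(pos, e):
    # Euclidean length of edge e = (i, j) given a node -> (x, y) position map.
    (x1, y1), (x2, y2) = pos[e[0]], pos[e[1]]
    return math.hypot(x2 - x1, y2 - y1)

def prune_after_add(node, pos, edges):
    """Rule 1: after a connection is added to `node`, remove its longest
    edge if that edge exceeds 2x the node's average connection length."""
    mine = [e for e in edges if node in e]
    if len(mine) < 2:
        return edges
    avg = sum(edge_length(pos, e) for e in mine) / len(mine)
    longest = max(mine, key=lambda e: edge_length(pos, e))
    if edge_length(pos, longest) > 2 * avg:
        edges = [e for e in edges if e != longest]
    return edges

def prune_after_move(node, pos, edges):
    """Rule 2: after `node` is moved and attached to its destination,
    remove its longest connection if it still has at least two."""
    mine = [e for e in edges if node in e]
    if len(mine) >= 2:
        longest = max(mine, key=lambda e: edge_length(pos, e))
        edges = [e for e in edges if e != longest]
    return edges
```

Note that with only two edges, rule 1 can never fire (no edge can exceed twice the average of two lengths that include it), so it only prunes genuinely outlying connections in denser neighborhoods.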
6. Description of the Approach - Modification of the New Data Neighborhood
- Force calculations
- Weight adjustment
- Error increase
- Age increase by 1
(The corresponding equations appear as figures on the original slide.)
7. Description of the Approach - Node Replacement
- Select a node sk with the minimum error Esk.
- Spread Esk to sk's neighborhood.
- (Figure: the minimum-error node sk is moved toward the maximum-error node sq.)
8. Description of the Approach - Node Replacement (cont.)
- Select a node sk with the minimum error Esk.
- Spread Esk to sk's neighborhood.
- Insert sk into the neighborhood of sq using weights.
- Remove the longest connection.
- Spread half of the sq neighborhood error to sk.
- (Figure: sk inserted next to the maximum-error node sq; its longest connection removed.)
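A rough sketch of this replacement step, under simple assumed data structures (`err`: node to accumulated error, `nbrs`: node to neighbor set, `pos`: node to coordinates). The exact weighted insertion is not given on the slide, so the centroid placement below is a stand-in, and `replace_node` is a hypothetical name.

```python
def replace_node(err, nbrs, pos):
    # sk: minimum-error node (a wasted unit); sq: maximum-error node
    # (an under-represented region of the input space).
    sk = min(err, key=err.get)
    sq = max(err, key=err.get)
    dim = len(pos[sq])
    # Spread sk's error evenly over its old neighborhood, then detach sk.
    old = set(nbrs[sk])
    for n in old:
        err[n] += err[sk] / len(old)
        nbrs[n].discard(sk)
    err[sk] = 0.0
    nbrs[sk] = set()
    # Insert sk into sq's neighborhood; the centroid of that group stands
    # in for the slide's weighted placement.
    group = (nbrs[sq] | {sq}) - {sk}
    pos[sk] = tuple(sum(pos[n][i] for n in group) / len(group)
                    for i in range(dim))
    for n in group:
        nbrs[sk].add(n)
        nbrs[n].add(sk)
    # Remove sk's longest new connection if it has more than one.
    if len(nbrs[sk]) > 1:
        far = max(nbrs[sk], key=lambda n: sum((pos[sk][i] - pos[n][i]) ** 2
                                              for i in range(dim)))
        nbrs[sk].discard(far)
        nbrs[far].discard(sk)
    # Hand half of the sq neighborhood's error to the newcomer.
    moved = 0.0
    for n in group:
        moved += err[n] / 2
        err[n] /= 2
    err[sk] = moved
    return sk, sq
```

A useful property of this bookkeeping is that total accumulated error is conserved: error is only redistributed, never created or destroyed.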
9. Results
- Initial network structure with 1 random connection per node (for 200 nodes).
10. Results (cont.)
- Structure resulting from 1 initial random connection.
11. Results (cont.)
- Connection equilibrium reached for 1 initial connection.
12. Results (cont.)
- Structure resulting from 16 initial random connections.
13. Results (cont.)
- Connection equilibrium for 16 initial connections.
14. Video of Network Progression
- Hybrid SONG Network
- Fritzke GNG Network
15. Results (cont.)
- 2-D comparison with the SOM network
- Salient features of the SOM algorithm
  - The SOM network starts as a predefined grid and is adjusted over many iterations.
  - Connections are fixed; nodes are not inserted, moved, or relocated outside their preexisting grid.
  - Weight adjustments occur over the entire grid and are controlled by the weighted distance to the data point.
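For reference, a single textbook SOM update step illustrates the fixed-grid, distance-weighted adjustment just described. This is the standard formulation (Gaussian neighborhood, decaying learning rate), not necessarily the exact variant used in the paper's comparison; the function name and decay schedules are my own.

```python
import numpy as np

def som_step(grid, x, t, n_iter, sigma0=3.0, lr0=0.5):
    """One SOM update: the whole fixed (rows x cols x dim) grid moves
    toward input x, weighted by grid distance to the best-matching unit."""
    rows, cols, dim = grid.shape
    # Best-matching unit: grid node closest to x in input space.
    d = np.sum((grid - x) ** 2, axis=2)
    bi, bj = np.unravel_index(np.argmin(d), d.shape)
    # Learning rate and neighborhood radius both decay over time.
    frac = t / n_iter
    lr = lr0 * (1.0 - frac)
    sigma = sigma0 * (1.0 - frac) + 1e-9
    # Gaussian neighborhood measured in *grid* coordinates, not input space.
    ii, jj = np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij")
    grid_dist2 = (ii - bi) ** 2 + (jj - bj) ** 2
    h = np.exp(-grid_dist2 / (2 * sigma ** 2))
    # Every node moves toward x, scaled by its neighborhood weight.
    grid += lr * h[:, :, None] * (x - grid)
    return grid
```

The contrast with GNG/SONG is visible in the last line: every node in the grid is updated on every input, and the topology (the grid) never changes.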
16. Growing SONG Network
- The number of nodes in SONG can be obtained automatically.
- The SONG network starts with a few randomly placed nodes and builds itself up until an equilibrium is reached between the network size and the error.
- A node is added every λ cycles if MaxError > AveError × Constant.
- Equilibrium appears to be 200 nodes.
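The growth criterion can be stated directly in code. The multiplicative form and the constant's value are assumptions on my part (the slide's operator was lost in extraction); the idea is simply that growth stops once no node's error dominates the average, which is the equilibrium described above.

```python
def should_add_node(errors, c=2.0):
    """Growth test: add a node (every lambda cycles) while the worst
    node's error still dominates the average by factor c (assumed form:
    MaxError > AveError * Constant)."""
    mx = max(errors)
    avg = sum(errors) / len(errors)
    return mx > avg * c
```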
17. Growing SONG Network (cont.)
- Error handling in the growing SONG network was modified:
  - The error is reset and recomputed after equilibrium is reached.
  - The network continues to learn, reaching a new equilibrium.
  - Approximation accuracy varies from run to run.
18. Growing SONG Network (cont.)
- The results of a growing SONG network run (on the right) compared to the simpler static approach (on the left).
19. Other Applications - Sparsely Connected Hierarchical Sensory Network
- The major features of the SONG algorithm, such as weight adjustment, error calculation, and neighborhood selection, are utilized in building self-organizing, sparsely connected hierarchical networks.
- The sparse hierarchical network is locally connected based on neuron firing correlation.
- Feedback and time-based correlation are used for invariant object recognition.
20. Other Applications - Sparsely Connected Hierarchical Sensory Network (cont.)
21. Other Applications - Sparsely Connected Hierarchical Network (cont.)
- Correlation-based wiring
- Declining neuron activations
- Sparse hierarchical representations
22. Conclusions
- The SONG algorithm is more biologically plausible than Fritzke's GNG algorithm. Specifically:
  - Weight and error adjustments are not parameter based.
  - Connections become stiffer with age rather than being removed at a maximum age as in Fritzke's method.
  - The network has all of its neurons from the beginning.
- SONG approximates the data distribution faster than the other methods tested.
- Connectivity between neurons is obtained automatically and depends on the parameter that controls edge removal and on the network size.
- The number of neurons can be obtained automatically in growing SONG to achieve the desired accuracy.
23. Future Work
- Adapt the SONG algorithm to large input spaces (high dimensionality, e.g. images).
- Adapt the SONG algorithm to a hierarchical network.
- Explore possible applications in feature extraction, representation building, and shape recognition.
- Insert new nodes as needed to reduce error.
- Optimize the network design.
24. Questions
starzyk_at_bobcat.ent.ohiou.edu