Title: Sparse Coding in Sparse Winner Networks
1. Sparse Coding in Sparse Winner Networks
ISNN 2007, The 4th International Symposium on Neural Networks
Janusz A. Starzyk (1), Yinyin Liu (1), David Vogel (2)
(1) School of Electrical Engineering and Computer Science, Ohio University, USA
(2) Ross University School of Medicine, Commonwealth of Dominica
2. Outline
- Sparse Coding
- Sparse Structure
- Sparse winner network with winner-take-all (WTA) mechanism
- Sparse winner network with oligarchy-take-all (OTA) mechanism
- Experimental results
- Conclusions
3. Sparse Coding
- How do we take in sensory information and make sense of it?
4. Sparse Coding
- Neurons become active to represent objects and concepts, producing a sparse neural representation: sparse coding
- Motivated by the metabolic demands of the human sensory system and brain
- Statistical properties of the environment: not every single bit of information matters
http://gandalf.psych.umn.edu/kersten/kersten-lab/CompNeuro2002/
- Grandmother cell (J. V. Lettvin): only one neuron on the top level represents and recognizes an object (extreme case)
- A small group of neurons on the top level represents an object
C. Connor, "Friends and grandmothers," Nature, Vol. 435, June 2005
5. Sparse Structure
- The 10^12 neurons in the human brain are sparsely connected
- On average, each neuron is connected to other neurons through about 10^4 synapses
- Sparse structure enables efficient computation and saves energy and cost
6. Sparse Coding in Sparse Structure
- Cortical learning is unsupervised learning
- Finding the activation pathway of the sensory input
- Competition is needed: find neurons with stronger activities and suppress the ones with weaker activities
- Winner-take-all (WTA) → a single winning neuron
- Oligarchy-take-all (OTA) → a group of neurons with strong activities as winners
7. Outline
- Sparse Coding
- Sparse Structure
- Sparse winner network with winner-take-all (WTA) mechanism
- Sparse winner network with oligarchy-take-all (OTA) mechanism
- Experimental results
- Conclusions
8. Sparse winner network with winner-take-all (WTA)
- Local network model of cognition: R-net
- Primary layer and secondary layer
- Random sparse connections
- For associative memories, not for feature extraction
- Not in a hierarchical structure
[Figure: R-net with a primary layer and a secondary layer]
David Vogel, "A neural network model of memory and higher cognitive functions in the cerebrum"
9. Sparse winner network with winner-take-all (WTA)
- Hierarchical learning network
- Use secondary neurons to provide full connectivity in a sparse structure (see the sketch after this list)
- More secondary levels can increase the sparsity
- Primary levels and secondary levels
- Finding neuronal representations
- Finding the global winner, the neuron with the strongest signal strength
- For a large number of neurons, this is very time-consuming
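As a rough illustration of this structure, the sketch below wires consecutive primary levels through randomly and sparsely connected secondary (relay) neurons. The primary sizes follow the 64-256-1024-4096 example given later in the slides; the secondary level sizes, the fan-in of 8, and the NumPy representation are illustrative assumptions, not details from the talk.

import numpy as np

rng = np.random.default_rng(0)

def sparse_links(n_pre, n_post, fan_in, rng):
    """Random sparse connectivity: each post-synaptic neuron picks `fan_in` pre-synaptic neurons."""
    w = np.zeros((n_post, n_pre))
    for j in range(n_post):
        idx = rng.choice(n_pre, size=fan_in, replace=False)
        w[j, idx] = rng.random(fan_in)              # random positive weights on the chosen links
    return w

# Primary sizes follow the 64-256-1024-4096 example in the slides;
# the secondary (relay) level sizes and the fan-in of 8 are assumptions.
primary_sizes = [64, 256, 1024, 4096]
secondary_sizes = [128, 512, 2048]
fan_in = 8

weights = []
for n_pre, n_sec, n_post in zip(primary_sizes[:-1], secondary_sizes, primary_sizes[1:]):
    w_pre_to_sec = sparse_links(n_pre, n_sec, fan_in, rng)    # primary level h -> secondary level
    w_sec_to_post = sparse_links(n_sec, n_post, fan_in, rng)  # secondary level -> primary level h+1
    weights.append((w_pre_to_sec, w_sec_to_post))

# Composed reachability: fraction of primary-to-primary pairs connected through some secondary neuron.
reachable = (weights[0][1] @ weights[0][0]) > 0
print("coverage from primary level 0 to primary level 1:", reachable.mean())

The point of the relay level is that two sparsely connected primary levels can still reach (nearly) full pairwise connectivity through the composed links, which is what the coverage fraction measures here.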
10. Sparse winner network with winner-take-all (WTA)
- Finding the global winner using localized WTA
- Data transmission: feed-forward computation
- Winner tree finding: local competition and feedback
- Winner selection: feed-forward computation and weight adjustment
[Figure: the input pattern propagates from primary level h through secondary levels s1 and s2 to primary level h+1, where the global winner is found]
11. Sparse winner network with winner-take-all (WTA)
- Data transmission: feed-forward computation
- Signal calculation
- Transfer function
[Figure: signals computed level by level from the input pattern]
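The slides do not spell out the exact signal calculation or transfer function, so the following is only a minimal sketch of the feed-forward pass, assuming a plain weighted sum followed by a tanh nonlinearity as a placeholder.

import numpy as np

def feed_forward(x, level_weights, transfer=np.tanh):
    """Propagate an input pattern level by level and return the activity at every level.

    `level_weights` is a list of (n_post, n_pre) arrays; `transfer` is a placeholder
    nonlinearity (the paper's actual transfer function may differ)."""
    activities = [np.asarray(x, dtype=float)]
    for w in level_weights:
        activities.append(transfer(w @ activities[-1]))
    return activities

# Tiny illustration: an 8 x 8 binary pattern flattened into a 64-element input vector.
rng = np.random.default_rng(1)
x = rng.integers(0, 2, size=64).astype(float)
level_weights = [
    rng.random((256, 64)) * (rng.random((256, 64)) < 0.1),     # ~10% of links present (assumed density)
    rng.random((1024, 256)) * (rng.random((1024, 256)) < 0.1),
]
acts = feed_forward(x, level_weights)
print([a.shape for a in acts])                   # (64,), (256,), (1024,)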
12. Sparse winner network with winner-take-all (WTA)
- Winner tree finding: local competition and feedback
- Local competition
- Current-mode WTA circuit (signal represented as a current)
- Local competitions on the network (illustrated below)
[Figure: local competition within a local neighborhood of pre- and post-synaptic neurons. The local winner is selected and the losing branches are logically cut off, so only the winning signal propagates on. Example: N4 at level l+1 is the winner among its pre-synaptic neurons {4, 5, 6, 7, 8} at level l, and its winning link is kept.]
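In hardware this competition is a current-mode WTA circuit; the sketch below is only a software analogy of one local competition, with the neighborhood indices and the surviving-link bookkeeping chosen for illustration.

import numpy as np

def local_wta(pre_activity, w_row, neighborhood):
    """Local competition for one post-synaptic neuron.

    `neighborhood` lists the indices of its pre-synaptic neurons; the branch with the
    strongest contribution wins, and the losing branches are 'logically cut off' by
    being left out of the surviving-link set."""
    neighborhood = np.asarray(neighborhood)
    contributions = w_row[neighborhood] * pre_activity[neighborhood]
    winner = int(neighborhood[np.argmax(contributions)])
    surviving_links = {winner}                   # only the local winner stays connected
    return winner, surviving_links

# Usage: a neuron with 5 pre-synaptic neighbors competing on weighted signal strength.
rng = np.random.default_rng(2)
pre_activity = rng.random(10)
w_row = rng.random(10)
winner, links = local_wta(pre_activity, w_row, neighborhood=[3, 4, 5, 6, 7])
print("local winner:", winner, "surviving links:", links)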
13. Sparse winner network with winner-take-all (WTA)
- The winner network is found: all the neurons directly or indirectly connected to the global winner neuron (a traversal sketch follows below)
[Figure: the winner tree]
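A minimal sketch of collecting that winner network, assuming the surviving links are stored as a mapping from each neuron to its pre-synaptic local winners (a hypothetical representation, not the paper's data structure).

def winner_network(global_winner, surviving_pre_links):
    """Collect every neuron directly or indirectly connected to the global winner
    through the links that survived the local competitions.

    `surviving_pre_links` maps a neuron id to the list of its pre-synaptic local winners."""
    tree, stack = set(), [global_winner]
    while stack:
        node = stack.pop()
        if node not in tree:
            tree.add(node)
            stack.extend(surviving_pre_links.get(node, []))
    return tree

# Usage with made-up neuron ids: the global winner 'top' reaches three lower-level neurons.
links = {"top": ["m1", "m2"], "m1": ["in3"], "m2": []}
print(winner_network("top", links))              # {'top', 'm1', 'm2', 'in3'}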
14. Sparse winner network with winner-take-all (WTA)
- Winner selection: feed-forward computation and weight adjustment
- Signals are recalculated through the logically connected links
- Weights are adjusted using the concept of Hebbian learning (see the sketch below)
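The following is a simplified Hebbian update restricted to the links of the winner tree; the outer-product form, the learning rate, and the row normalization are illustrative assumptions rather than the paper's exact rule.

import numpy as np

def hebbian_update(w, pre, post, winner_links, lr=0.1):
    """Strengthen the links between co-active neurons on the winner tree.

    `winner_links` is a boolean (n_post, n_pre) mask of the logically connected links."""
    w = w + lr * np.outer(post, pre) * winner_links      # Hebbian term only on surviving links
    norms = np.linalg.norm(w, axis=1, keepdims=True)
    return w / np.maximum(norms, 1e-12)                  # normalization (an assumption) keeps weights bounded

# Usage: 3 post-synaptic and 4 pre-synaptic neurons, with two links on the winner tree.
rng = np.random.default_rng(3)
w = rng.random((3, 4))
pre, post = rng.random(4), rng.random(3)
mask = np.zeros((3, 4), dtype=bool)
mask[0, 1] = mask[2, 3] = True
print(np.round(hebbian_update(w, pre, post, mask), 3))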
15. Sparse winner network with winner-take-all (WTA)
- The number of global winners found is typically 1 with sufficiently many input links
- 64-256-1024-4096 network
- Finds 1 global winner with over 8 connections
16. Outline
- Sparse Coding
- Sparse Structure
- Sparse winner network with winner-take-all (WTA) mechanism
- Sparse winner network with oligarchy-take-all (OTA) mechanism
- Experimental results
- Conclusions
17. Sparse winner network with oligarchy-take-all (OTA)
- The signal goes through the network layer by layer
- Local competition (local WTA) is performed once a layer is reached
- Multiple local winner neurons on each level
- Multiple winner neurons on the top level: oligarchy-take-all
- The oligarchy represents the sensory input (see the sketch after this list)
- Provides coding redundancy
- More reliable than WTA
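A minimal sketch of one OTA level, assuming the post-synaptic neurons of each level are partitioned into local competition pools and that a tanh transfer function stands in for the actual signal calculation; the pool layout and densities here are purely illustrative.

import numpy as np

def ota_level(pre_activity, w, pools, transfer=np.tanh):
    """One OTA level: every local pool runs its own WTA, so several winners per level
    stay active instead of a single global winner.

    `pools` partitions the post-synaptic neurons into local competition neighborhoods."""
    post = transfer(w @ pre_activity)
    out = np.zeros_like(post)
    for pool in pools:
        winner = pool[int(np.argmax(post[pool]))]
        out[winner] = post[winner]               # losers within the pool are suppressed
    return out

# Usage: 64-bit input, 16 top-level neurons in 4 pools -> up to 4 active neurons (the oligarchy).
rng = np.random.default_rng(4)
x = rng.integers(0, 2, size=64).astype(float)
w = rng.random((16, 64)) * (rng.random((16, 64)) < 0.2)
pools = [np.arange(i, i + 4) for i in range(0, 16, 4)]
top = ota_level(x, w, pools)
print("oligarchy (active top-level neurons):", np.flatnonzero(top))

The set of neurons left active at the top level is the oligarchy that codes the input, which is what gives OTA its redundancy compared with a single WTA winner.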
18. Outline
- Sparse Coding
- Sparse Structure
- Sparse winner network with winner-take-all (WTA)
- Sparse winner network with oligarchy-take-all (OTA)
- Experimental results
- Conclusions
19. Experimental Results
- WTA scheme in the sparse network
[Figure: original image; input size 8 x 8]
20. Experimental Results
- OTA scheme in the sparse network, 64-bit input
- On average, 28.3 active neurons represent an object
- Varies from 26 to 34 neurons
21. Experimental Results
[Figure: recognition performance, with random recognition shown as a baseline]
- OTA has better fault tolerance than WTA
22. Conclusions and Future Work
- Sparse coding is built in sparsely connected networks
- WTA scheme: local competitions accomplish the global competition using primary and secondary layers; efficient hardware implementation
- OTA scheme: local competition produces a reduction of neuronal activity
- OTA: redundant coding, more reliable and robust
- WTA/OTA learning memories for developing machine intelligence
- Future work:
- Introducing temporal sequence learning
- Building a motor pathway on such a learning memory
- Combining with a goal-creation pathway to build an intelligent machine