1
Neural Representation and Neural Computation
  • P. Churchland and T. Sejnowski, 1990
  • Presented by Chip Mappus, cmappus@cc

2
Connectionism
  • A computational approach to modeling the brain
    that relies on the interconnection of many
    simple units to produce complex behavior
  • High-level mental states are emergent properties

3
Basic questions
  • How is representation of a world by a self
    possible?
  • Traditional philosophical methods have not been
    successful; they ignore psychological and
    neurobiological findings
  • Quine's naturalism: epistemology is continuous
    with natural science
  • How did a priori knowledge evolve into human
    consciousness?

4
Connectionist Knowledge Acquisition
  • Biological phenomenon
  • No a priori knowledge
  • Innate dispositions to behave/believe/organize
  • The a priori beliefs of philosophy should not be
    taken literally, but within the context of the
    assumptions we make in our collective
    conceptions of self
  • Science needs some framework, and this is as good
    as any

5
Why is Neurobiology overlooked?
  • Dualism rejected the physical brain as the
    source of mental phenomena
  • Materialism links the physical brain with mental
    phenomena, but there is disagreement on how
    mental states are explained within the brain
  • Behaviorists conclude that to explain behavior,
    we do not need to know anything about the brain
  • Identity theorists claim mental states are
    states of the brain

6
Theory Dualism
  • Will there be a unified science of the
    mind/brain? No
  • Will psychological theory reduce to
    neuroscience? No
  • Neuroscience is too hard
  • Mental states are implemented in diverse
    machines: no one machine state explains the same
    belief for everyone
  • Intentionality: mental states are identified in
    terms of semantic content, and neurobiological
    explanations cannot be sensitive to the logical
    relations between the contents of cognitive
    states

7
Reductionism
  • Strives for the integration of psychological and
    neurobiological theories
  • Reduction: an explanation of the phenomena
    described by one theory in terms of the phenomena
    described by a more basic theory
  • Theories co-evolve: compare the histories of
    concepts such as impetus, caloric, gene, neuron,
    electricity, instinct, and excitability
  • Not micro vs. macro
  • Reduced phenomena do not disappear

8
Reductionism Responds
  • Neuroscience is hard, but new techniques are
    making more data available to constrain theories
  • There is no problem with high-level states that
    are multiply instantiable
  • Explanations of high-level cognitive phenomena
    will not be in terms of the lowest level
  • The explanation of intentionality will be set in
    the context of the evolution and development of
    the nervous system

9
Intentionality: Sentence Logic Theory of Cognition
  • Representations are symbolic
  • Computations are rules for manipulating symbolic
    structures
  • Computer metaphor: cognitive psychology seeks to
    find the program that the mind is running
  • Thoughts are akin to having sentences in mind;
    thinking is performing logical operations on
    sentences

10
Problems with Cognition as Symbol Manipulation
  • Timing: many cognitive tasks are completed in
    about 0.5 s
  • A neuron takes roughly 5 ms per computational
    step
  • So at most about 100 serial steps fit in the
    task (the 100-step rule); see the sketch below
  • Sequential programs on conventional computers
    require far more steps
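A quick back-of-the-envelope check of that constraint, as a minimal Python sketch (the numbers are the slide's; the code is purely illustrative):

    # The 100-step argument: if a whole task finishes in ~0.5 s and each
    # neural computational step takes ~5 ms, any serial algorithm for the
    # task can use at most ~100 steps.
    task_time_s = 0.5
    step_time_s = 0.005
    print(task_time_s / step_time_s)  # 100.0 serial steps, at most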

11
More Problems
  • The brain is a parallel machine: Purkinje cells
    have about 80,000 input connections
  • Cerebral cortex neurons have about 10,000 output
    connections
  • Information storage is radically different from
    digital computer storage
  • In a computer, storage and processing are
    separate and memory is location-addressable
  • In the nervous system, information is stored in
    the connections between neurons
  • There is no distinct storage location for each
    piece of information; memory is
    content-addressable (see the sketch after this
    list)
  • Representation is distributed, not punctate
    (which explains graceful degradation)
  • Computations that are difficult for humans are
    easily done with a computer, and vice versa
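To make "stored in the connections, content-addressable" concrete, here is a minimal sketch of a Hopfield-style associative memory. This is my illustration, not part of the original slides: patterns are superimposed in one weight matrix, and a corrupted cue retrieves the nearest stored pattern.

    import numpy as np

    # Two 8-unit patterns stored in the same connection weights.
    patterns = np.array([[ 1, -1,  1, -1,  1, -1,  1, -1],
                         [ 1,  1,  1,  1, -1, -1, -1, -1]])

    # Hebbian storage: every pattern is added into the weight matrix.
    W = sum(np.outer(p, p) for p in patterns).astype(float)
    np.fill_diagonal(W, 0)                  # no self-connections

    # Content-addressable recall: start from a cue with one bit flipped.
    x = np.array([1, -1, 1, -1, 1, -1, -1, -1], dtype=float)
    for _ in range(5):                      # settle via synchronous updates
        x = np.sign(W @ x)
    print(np.array_equal(x, patterns[0]))   # True: whole pattern recovered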

12
Even More Problems
  • The nervous system is plastic: neurons change
  • It degrades gracefully
  • It is fault tolerant (compared to a computer)
  • The analogy between levels of description in a
    conventional computer and in the nervous system
    is misleading
  • Nonverbal humans and animals pose a problem for
    the sentence logic theory of cognition

13
So, how does the brain represent?
  • Neuronal processes have structure of some kind,
    but not like the semantic structure of language
  • This organization is what permits complex
    behaviors
  • Information processing is modeled in terms of
    the trajectory of a complex nonlinear dynamical
    system in a very high dimensional space.

14
Connectionism in Practice: NETtalk
  • Converts English text to speech sounds
  • It demonstrates:
  • How a network can learn to perform a very complex
    task without symbols and without rules to
    manipulate symbols
  • The differences between local and distributed
    representations

15
NETtalk Layout
  • Three layers of processing units
  • The first layer receives input letters from a
    word
  • The last layer outputs elementary speech sounds
    (phonemes)
  • The hidden layer performs the transformation from
    letters to sounds
  • Training provides the structure (backpropagation)
  • Gradient descent learning (no rules)

16
NETtalk Network
  • 7 groups of 29 input units (203 total)
  • 80 hidden units
  • 26 output units
  • 18,629 weight parameters (a network of this
    shape is sketched below)
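A minimal NumPy sketch of a network with NETtalk's shape, trained by backpropagation as the slides describe. The input, target, and learning rate are made up for illustration; the real NETtalk used a 7-letter text window and phoneme-feature targets, and its exact weight count depends on details not reproduced here.

    import numpy as np

    rng = np.random.default_rng(0)
    n_in, n_hid, n_out = 7 * 29, 80, 26     # 7 groups of 29 input units = 203

    W1 = rng.normal(0, 0.1, (n_hid, n_in)); b1 = np.zeros(n_hid)
    W2 = rng.normal(0, 0.1, (n_out, n_hid)); b2 = np.zeros(n_out)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    x = rng.random(n_in)                    # stand-in letter-window encoding
    t = rng.random(n_out)                   # stand-in phoneme target
    lr = 0.5                                # illustrative learning rate

    for step in range(100):                 # plain gradient descent
        h = sigmoid(W1 @ x + b1)            # hidden-layer activity
        y = sigmoid(W2 @ h + b2)            # output-layer activity
        d2 = (y - t) * y * (1 - y)          # output delta (squared-error loss)
        d1 = (W2.T @ d2) * h * (1 - h)      # delta backpropagated to hidden layer
        W2 -= lr * np.outer(d2, h); b2 -= lr * d2
        W1 -= lr * np.outer(d1, x); b1 -= lr * d1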

17
Learning without rules
  • NETtalk can generalize well and pronounce new
    words correctly.
  • It embodies knowledge of English pronunciation
  • How does it do this without rules?
  • How is the knowledge organized?
  • Each time the network was trained from a random
    set of initial weights, a different network was
    generated

18
What did it do?
  • For each input, there is a pattern of activity
    among the hidden units
  • The hard-c sound caused 15 of the 80 hidden units
    to be highly activated (the rest had little or no
    activity)
  • This was repeated for all 79 letter-to-sound
    correspondences
  • The result: 79 vectors, each pointing in a
    different direction in the 80-dimensional space
    of hidden-unit activities
  • Cluster analysis of NETtalk showed that the
    vectors for vowels clustered together in an
    organized manner (see the sketch after this list)
  • Consonants were more variable across different
    networks
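A sketch of that cluster analysis with made-up data: one 80-dimensional average activation vector per letter-to-sound correspondence (NETtalk had 79 of them; six toy labels stand in here).

    import numpy as np
    from scipy.cluster.hierarchy import linkage, dendrogram

    rng = np.random.default_rng(0)
    labels = ["a->/a/", "e->/e/", "i->/i/", "b->/b/", "k->/k/", "s->/s/"]
    vectors = rng.random((len(labels), 80))  # stand-ins for measured activities

    # Agglomerative clustering over the activation vectors. In the real
    # analysis, the vowel vectors grouped together, apart from the consonants.
    tree = linkage(vectors, method="average")
    dendrogram(tree, labels=labels, no_plot=True)  # inspect/plot the hierarchy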

19
NETtalk Organization (1)
  • The representational organization in the network
    is not programmed in
  • Each unit in the network participates in more
    than one letter-to-sound transformation
  • The representation does not resemble
    sentence-logic organization
  • The output could be used as input to other
    networks

20
NETtalk Organization (2)
  • General properties of the hierarchical
    organization of letter-to-sound correspondences
    emerged only at the level of groups of units
  • Different initial conditions yielded different
    networks, but all achieved about the same level
    of performance
  • NETtalk is not a model of the human reading
    process, but a demonstration of network ability

21
Detecting surface curvatures from gray-level input
images
  • Hubel and Wiesel (1962): certain neurons in cat
    visual cortex respond better to oriented bars of
    light and light/dark edges than to spots of
    light
  • Can the information contained in a population of
    partially activated cortical neurons be used to
    compute useful information about the
    three-dimensional surfaces between boundaries of
    objects in the image?

22
Shape from Shading Network
  • Input: an array of on-center and off-center
    receptive fields (122 units); see the sketch
    after this list
  • Output: a population of units representing
    curvatures and the broadly tuned direction of
    maximum curvature (23 units)
  • Intermediate layer (12 units)
  • Oriented receptive fields emerged during training
  • A unit's function cannot be inferred from its
    receptive-field properties alone
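A sketch of one on-center/off-center input unit as a difference-of-Gaussians filter over a gray-level image. This is a standard model assumed for illustration; the paper's exact input encoding of 122 such units is not reproduced.

    import numpy as np

    def dog_kernel(size=7, sigma_c=1.0, sigma_s=2.0):
        ax = np.arange(size) - size // 2
        xx, yy = np.meshgrid(ax, ax)
        r2 = xx ** 2 + yy ** 2
        center = np.exp(-r2 / (2 * sigma_c ** 2)) / (2 * np.pi * sigma_c ** 2)
        surround = np.exp(-r2 / (2 * sigma_s ** 2)) / (2 * np.pi * sigma_s ** 2)
        return center - surround          # excitatory center, inhibitory surround

    image = np.random.default_rng(0).random((32, 32))  # stand-in shaded image
    k = dog_kernel()
    patch = image[10:17, 10:17]                        # region one unit sees
    on_response = max(0.0, float(np.sum(k * patch)))   # on-center unit
    off_response = max(0.0, -float(np.sum(k * patch))) # off-center unit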

23
Next Generation Networks
  • Parallel networks
  • Scalability
  • Backpropagation is biologically implausible
  • Temporal chaining
  • Language acquisition

24
Conclusions (1)
  • The neural networks demonstrated by Sejnowski fit
    well with Smolensky's subsymbolic description of
    connectionism
  • Not Hofstadter's Boolean Dream
  • Not a neurobiologist's dream either
  • System properties are not available at the single
    unit level
  • Model must be biologically constrained

25
Conclusions (2)
  • Orgel's Second Rule: Nature is more ingenious
    than we are
  • Marr's analytical levels are not independent of
    each other
  • It seems likely that more than three levels of
    analysis are needed to explain the nervous system

26
Other Systems (Elman)
  • Emergentism: artificial life (Alife) systems
  • Cells on a grid
  • At each time step, whether a cell is alive or
    dead depends on its immediate neighbors (see the
    sketch below)
  • Evolution: genetic algorithms (GAs)
  • Goal direction as a product of evolution
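A minimal sketch of such a grid system, using Conway's Game of Life rules as an assumed example (the slide does not name a specific rule set):

    import numpy as np

    def step(grid):
        # Count each cell's eight immediate neighbors (wraparound edges).
        n = sum(np.roll(np.roll(grid, dy, 0), dx, 1)
                for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                if (dy, dx) != (0, 0))
        # Live cells survive with 2 or 3 neighbors; dead cells are born with 3.
        return (grid & ((n == 2) | (n == 3))) | (~grid & (n == 3))

    rng = np.random.default_rng(0)
    grid = rng.random((16, 16)) < 0.3       # random initial live cells
    for _ in range(10):
        grid = step(grid)                   # global patterns emerge from local rules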

27
Other Sources
  • Dictionary of Philosophy of Mind
  • http://www.artsci.wustl.edu/~philos/MindDict
  • DECtalk development
  • http://research.compaq.com/wrl/DECarchives/DTJ/DTJK01/