Title: A Cognitive Substrate for Human-Level Intelligence
1. A Cognitive Substrate for Human-Level Intelligence
- Nick Cassimatis
- In collaboration with Paul Bello, Magda Bugajska, Arthi Murugesan
- Human-Level Intelligence Laboratory
2. N. L. Cassimatis (2006). A Cognitive Substrate for Human-Level Intelligence. AI Magazine, Volume 27, Number 2.
3. General and Human-Level Intelligence
- General intelligence.
- Human-level intelligence is a useful proxy for this.
  - It is a useful lower bound on what you can aim for.
  - If humans can do X (e.g., use natural language), then I am confident a computer can do X. If humans cannot do X (e.g., find prime factors of million-bit numbers), I am less confident.
  - Humans are the most general intelligence we know of, so you can learn a lot by observing us (including through introspection).
  - Thinking exactly like a human is not a hard constraint.
4. Obstacles: Profusion and Integration
- Profusion of knowledge.
- Profusion of algorithms.
- Difficulty of integrating it all.
6. Great amount and variety of knowledge
- Even very simple situations can require a great deal of knowledge.
- Knowledge about a piggy bank (Charniak):
  - It is used to store money.
  - If you shake it and there is no sound, it is empty.
  - You can remove money by breaking it.
  - You can remove money by turning it upside down and shaking it.
  - The more money you put in, the more you get out.
  - (Dozens more pieces of knowledge.)
  - Exceptions to each point.
- Cyc: millions of assertions, yet nowhere near complete.
- How do we get all this into one computer program?
7. Diversity of algorithms
- E.g., natural language conversation requires:
- Vision for recognizing faces, tracking eyes, gestures:
  - PCA, Bayesian networks, Kalman filters.
- Acoustic speech recognition:
  - Hidden Markov Models, Fourier transforms.
- Syntax, phonology, morphology:
  - Search- or table-based parsers, rules, statistical N-gram models.
- Semantics and pragmatics:
  - Almost everything.
- There are often dozens or hundreds of variations within each algorithmic class.
8. Integration
- How do you get all these algorithms and data structures to work with each other?
- Procedural integration: Bayes nets, logic theorem provers, case-based reasoning, neural networks, ...?
- Knowledge integration: scripts, frames, logical propositions, patterns of activation, ...?
9. Learning
- Relying on learning commits you to weak representations and execution algorithms, because these are easier to prove theorems about and to keep general.
- You need a rich conceptual foundation to do learning in the first place.
10. How do we deal with this?
- Profusion: the cognitive substrate.
- Integration: Polyscheme.
11. Cognitive substrate
- A small set of reasoning mechanisms can underlie the whole range of human cognition.
- Preliminary guess at what would make a good substrate (see the sketch after this list): time, space, causality, identity, events, parthood, desire.
- Hypothesis: once you have implemented a substrate, the rest of AI is relatively simple.
- The substrate is AI-complete.
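As a concrete (and deliberately oversimplified) illustration of what a substrate built around time, space, causality, identity, events, parthood, and desire might manipulate, here is a minimal sketch in Python. The Proposition class, the relation names, and the world/time fields are my assumptions for illustration, not Polyscheme's actual representation.

```python
# Hypothetical sketch of substrate-level propositions: relations over time,
# space, causality, identity, events, parthood, and desire. All names here
# are illustrative; this is not Polyscheme's actual data model.
from dataclasses import dataclass
from typing import Tuple

@dataclass(frozen=True)
class Proposition:
    relation: str          # e.g., "Occurs", "PartOf", "Causes", "Same", "Desires"
    args: Tuple[str, ...]  # identifiers for the objects/events involved
    time: str              # temporal interval over which the proposition holds
    world: str = "R"       # "R" = the real world; other names = alternate worlds

# A few substrate-level facts about a simple physical and social scene.
facts = {
    Proposition("Occurs",  ("collision-1",), "t1"),
    Proposition("PartOf",  ("wheel-2", "cart-7"), "t1"),
    Proposition("Causes",  ("collision-1", "fall-3"), "t1"),
    Proposition("Same",    ("object-at-t0", "object-at-t1"), "t1"),
    Proposition("Desires", ("agent-1", "fall-3"), "t1"),
}

for p in sorted(facts, key=lambda p: p.relation):
    print(p.relation, p.args, "@", p.time, "in world", p.world)
```

The point of the sketch is only that a handful of relations like these can serve as a shared vocabulary across domains; the rest of the talk shows mappings of other domains onto them.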
12. Evidence for substrate
- Personal experience.
- Linguistics.
- Psychology.
- Neuroscience.
- AI.
- Evolution and learning.
13. Evolution
- We evolved to deal with a relatively immediate and concrete physical and social world, not to:
  - Contemplate life on Mars.
  - Trade stock options.
  - Explore number theory.
  - Design airplanes.
  - Repair speed boats.
  - Calculate tips.
  - Market insurance policies.
  - Etc.
- Whatever mechanisms we use to reason about these were originally designed to deal with the physical and social world.
- Hence, human social and physical reasoning mechanisms are sufficient for the full array of human reasoning.
14. Learning
- What is it that kids have that gives them the ability to learn so much, to be so general?
  - Substrate mechanisms.
  - Mechanisms for mapping.
  - Mechanisms for learning.
  - Mechanisms for being taught.
15. Substrate research
- Overall approach:
  - Build the substrate (a 2-4 year old?).
  - Turn it loose on the world.
- Building the substrate:
  - First guess (physical reasoning).
  - Map it onto several domains (epistemic reasoning, syntax, word learning).
  - Each mapping leads to refinements and generalizations.
  - Learning mechanisms (analogy).
16. Contrast
- Many people dream of building a baby and setting it loose on the world.
- In contrast, this approach holds that you:
  - Need a richer substrate.
  - Need to integrate learning with reasoning.
17. Building a substrate
- Reasoning about time, space, causality, identity, events, parthood, and desire.
- Requires integrating temporal, spatial, and causal data structures and algorithms.
- Polyscheme is an approach to this problem.
18. Common Functions
- Basic functions:
  - Forward inference.
  - Subgoaling.
  - Identity matching.
  - Representing alternate worlds.
- The basic functions can be computed using different representations, e.g., subgoaling (see the sketch after this list):
  - Logic: when B → H and you want to know whether H, make a subgoal of B.
  - Neural network: to know the value of the output units, make a goal of the input units.
  - Perception: to know what is at P, point the camera at P.
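A minimal sketch of that point: the same basic functions (here, forward inference and subgoaling) computed over two different representations. The specialists, class names, and the piggy-bank rule are illustrative assumptions, not Polyscheme's actual API.

```python
# Hypothetical sketch: several "specialists" implementing the same basic
# functions (forward inference, subgoaling) over different representations.
# The class and method names are illustrative, not Polyscheme's actual API.

class RuleSpecialist:
    """Logic-style specialist: rules of the form antecedent -> consequent."""
    def __init__(self, rules):
        self.rules = rules  # list of (antecedent, consequent) pairs

    def forward_infer(self, known):
        """Forward inference: derive consequents whose antecedents are known."""
        return {c for a, c in self.rules if a in known}

    def subgoal(self, goal):
        """Subgoaling: to establish a consequent, make its antecedent a goal."""
        return {a for a, c in self.rules if c == goal}

class PerceptionSpecialist:
    """Perception-style specialist: subgoaling on a location means looking at it."""
    def __init__(self, scene):
        self.scene = scene  # location -> observed object

    def forward_infer(self, known):
        return set()  # derives nothing without being asked to look

    def subgoal(self, location):
        # To know what is at a location, "point the camera" (look it up here).
        if location in self.scene:
            return {("At", self.scene[location], location)}
        return set()

rules = RuleSpecialist([("shaken-and-silent", "piggy-bank-empty")])
eyes = PerceptionSpecialist({"table": "piggy-bank"})

print(rules.subgoal("piggy-bank-empty"))           # {'shaken-and-silent'}
print(rules.forward_infer({"shaken-and-silent"}))  # {'piggy-bank-empty'}
print(eyes.subgoal("table"))                       # {('At', 'piggy-bank', 'table')}
```

Because both specialists expose the same interface, an integration layer can pose a goal once and let each representation contribute what it can.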
19. AI algorithms are ways of ordering common functions
- Counterfactual reasoning:
  - When uncertain about A, simulate the world where A and simulate the world where not-A.
- Backtracking search:
  - Nested counterfactual reasoning (see the sketch after this list).
- Stochastic simulation:
  - When you think A is more likely than not-A, simulate the world where A is true more often than the world where A is not true.
- Logic theorem proving:
  - When uncertain about P:
    - Ground P if you can.
    - Subgoal on P if you can.
- Means-ends planning:
  - When you want G, and A achieves G, simulate the world where A is true and subgoal on A.
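To make the claim that backtracking search is nested counterfactual reasoning concrete, here is a minimal sketch that solves a tiny constraint problem purely by simulating alternate worlds in which each undecided proposition is assumed true or false. The toy problem, function names, and encoding are my assumptions for illustration, not part of Polyscheme.

```python
# Hypothetical sketch: backtracking search expressed as nested counterfactual
# simulation. For each undecided proposition we simulate the world where it is
# True and the world where it is False; contradictory worlds are abandoned.

def consistent(world, constraints):
    """Check every constraint whose variables are all decided in this world."""
    for vars_, test in constraints:
        if all(v in world for v in vars_):
            if not test(*(world[v] for v in vars_)):
                return False
    return True

def simulate(world, undecided, constraints):
    """Counterfactually extend `world`; return a complete consistent world or None."""
    if not consistent(world, constraints):
        return None                      # this alternate world is contradictory
    if not undecided:
        return world                     # a fully specified, consistent world
    var, rest = undecided[0], undecided[1:]
    for value in (True, False):          # the world where var, the world where not-var
        result = simulate({**world, var: value}, rest, constraints)
        if result is not None:
            return result
    return None                          # both alternatives failed: backtrack

# Toy problem: (A or B), not (A and C), and C must hold.
constraints = [
    (("A", "B"), lambda a, b: a or b),
    (("A", "C"), lambda a, c: not (a and c)),
    (("C",),     lambda c: c),
]
print(simulate({}, ["A", "B", "C"], constraints))
# -> {'A': False, 'B': True, 'C': True}
```

The outer simulation over A contains an inner simulation over B, and so on; abandoning an inconsistent world and returning to its parent is exactly the backtrack step.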
20. Integration of algorithms
21. Integration of representations
22. Physical reasoner demonstrates flexible integration
- Reactive/deliberative robot architecture.
- Combines means-ends planning, logical inference, production rules, neural networks, truth maintenance, a reactive subsystem, etc.
- A promising approach to (hard) substrate problems.
- In the physical reasoner:
  - Adding algorithms and representations leads to a huge increase in efficiency.
  - Several problems have been mapped onto the physical reasoning substrate.
- N. L. Cassimatis, J. Trafton, M. Bugajska, A. Schultz (2004). Integrating Cognition, Perception and Action through Mental Simulation in Robots. Journal of Robotics and Autonomous Systems, Volume 49, Issues 1-2, 30 November 2004, Pages 13-23.
23. Example: Syntax
- A. Murugesan, N. L. Cassimatis (2006). A Model of Syntactic Parsing Based on Domain-General Cognitive Mechanisms. In Proceedings of the 28th Annual Conference of the Cognitive Science Society.
- N. L. Cassimatis (2004). Grammatical Processing Using the Mechanisms of Physical Inferences. In Proceedings of the Twenty-Sixth Annual Conference of the Cognitive Science Society.
- These show how to map syntactic parsing onto physical reasoning.
- What could words, phrases, case, empty categories, traces, long-distance dependencies, coreference, subjacency, anaphora, etc. have to do with gravity and collision?
- If these two domains have an underlying unity, then you cannot quickly rule out mappings between other domains.
24. Syntax
Verbal World                            Physical World
Word, phrase, sentence                  Event
Constituency                            Parthood
Phrase structure constraints            Physical constraints
Word/phrase category                    Categories
Word/phrase order                       Temporal order
Phrase attachment                       Event identity
Coreference/binding                     Object identity
Traces                                  Object permanence
Short- and long-distance dependencies   Apparent motion over short and long paths
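A hypothetical illustration of the mapping above: a few parsing facts for a short sentence stated with the same kinds of substrate relations one would use for a physical scene. The relation names, entity identifiers, and sentence are mine, not taken from the cited models.

```python
# Hypothetical sketch of the mapping above: parsing facts for a short sentence
# stated with the same kinds of substrate relations used for physical scenes.
# Relation names, entity identifiers, and the sentence are illustrative only.
sentence = ["the", "dog", "barked"]

facts = []

# Words and phrases are events; word order is temporal order.
for i, w in enumerate(sentence):
    facts.append(("Occurs", f"word-{i}-{w}", f"t{i}"))
for i in range(len(sentence) - 1):
    facts.append(("Before", f"t{i}", f"t{i + 1}"))

# Constituency is parthood: the NP "the dog" contains its two words.
facts.append(("Occurs", "np-1", "t0-t1"))
facts.append(("PartOf", "word-0-the", "np-1"))
facts.append(("PartOf", "word-1-dog", "np-1"))

# Phrase attachment is event identity: the NP is the same event as the
# subject slot of the sentence.
facts.append(("Same", "np-1", "subject-of-s-1"))

for fact in facts:
    print(fact)
```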
25. Syntax
26. Word Learning
- One-shot, non-associative word learning.
- M. Bugajska, N. L. Cassimatis (2006). Beyond Association: Social Cognition in Word Learning. In Proceedings of the International Conference on Development and Learning.
27. Theory of Mind
- Use counterfactual and default reasoning mechanisms to reason about other people's beliefs (see the sketch after this list).
- P. Bello, N. L. Cassimatis (2006). Developmental Accounts of Theory-of-Mind Acquisition: Achieving Clarity via Computational Cognitive Modeling. In Proceedings of the 28th Annual Conference of the Cognitive Science Society.
- P. Bello, N. L. Cassimatis (2006). Understanding Other Minds: A Cognitive Modeling Approach. In Proceedings of the 7th International Conference on Cognitive Modeling.
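A minimal sketch of the first bullet: treat another agent's beliefs as an alternate world that inherits from the real world by default and is overridden where the agent's own observations differ (a simplified Sally-Anne-style false-belief scenario; the names and dictionary-based encoding are my assumptions, not the cited models).

```python
# Hypothetical sketch: another agent's beliefs as an alternate world that
# inherits from the real world by default and is overridden by the agent's
# own observations (a simplified Sally-Anne-style false-belief setup).

real_world = {
    ("marble", "location"): "box",     # the marble was moved while Sally was away
}

sally_observations = {
    ("marble", "location"): "basket",  # Sally last saw it in the basket
}

def believes(agent_observations, real_world, query):
    """Default reasoning: the agent's world is the real world,
    except where the agent's own observations say otherwise."""
    agent_world = dict(real_world)          # inherit the real world by default
    agent_world.update(agent_observations)  # override with what the agent observed
    return agent_world.get(query)

print("Actually at:", real_world[("marble", "location")])    # box
print("Sally believes:",
      believes(sally_observations, real_world, ("marble", "location")))  # basket
```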
28. Summary of progress
- Preliminary implementation (physical reasoning):
  - Demonstrates that Polyscheme enables advances in the flexibility, integration, and power of intelligent systems.
- Manually mapped onto several domains (epistemic reasoning, syntax, word learning, wargaming):
  - Each mapping demonstrates the plausibility of the substrate approach.
  - Each mapping leads to refinements, generalizations, and eliminations in the substrate.
- Learning mechanisms (analogy).
- Just starting:
  - Teaching the substrate (this will gradually result from our NLP work).
29. What this demonstrates
- The cognitive substrate enables a real advance toward solving the profusion and integration problems.
- It enables qualitative advances in the capabilities of intelligent systems.
- It enables faster development of systems.
30. Future work
- Keep doing mappings:
  - Pragmatics.
  - Metacognition.
  - Self-awareness, consciousness.
- Use insights from these to enhance the substrate.
- Automate mappings.
- Keep driving this process toward the goal of a 2-4-year-old intelligence that can learn from interacting with the world and with people.
31. How people can help
- Software engineering.
- Find a domain and do a mapping.
- Add an algorithm or subdomain to the substrate.