1
Logical Agents
  • Chapter 7, AIMA 2nd Ed.

2
Outline
  • Knowledge-Based Agents
  • Wumpus World
  • Logic in general: models and entailment
  • Propositional (Boolean) logic
  • Equivalence, validity and satisfiability
  • Inference rules and theorem proving
  • forward chaining
  • backward chaining
  • resolution

3
Knowledge bases
Inference engine: domain-independent algorithms
Knowledge base: domain-specific content
  • Knowledge base = set of sentences in a formal language
  • Declarative approach to building an agent (or other system): Tell it what it needs to know
  • Then it can Ask itself what to do; answers should follow from the KB
  • Agents can be viewed at the knowledge level
  • i.e., what they know, regardless of how they're implemented
  • Or at the implementation level
  • i.e., data structures in the KB and algorithms that manipulate them

4
A simple knowledge-based agent
  • The agent must be able to:
  • Represent states, actions, etc.
  • Incorporate new percepts
  • Update internal representations of the world
  • Deduce hidden properties of the world
  • Deduce appropriate actions (a minimal sketch of the generic agent loop follows below)
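
The generic knowledge-based agent can be summarized as a short Tell/Ask loop. The sketch below is illustrative Python in the spirit of the book's KB-AGENT pseudocode; the SimpleKB stub and the make_* helper functions are hypothetical placeholders, not a real wumpus-world implementation.

    # Minimal sketch of a generic knowledge-based agent loop (after AIMA's KB-AGENT).
    # The toy KB and the make_* helpers below are hypothetical stand-ins.

    class SimpleKB:
        def __init__(self):
            self.sentences = []

        def tell(self, sentence):
            self.sentences.append(sentence)

        def ask(self, query):
            # A real KB would run inference here; this stub just returns a default.
            return "Forward"

    def make_percept_sentence(percept, t):
        return ("percept", percept, t)

    def make_action_query(t):
        return ("best-action?", t)

    def make_action_sentence(action, t):
        return ("did", action, t)

    class KBAgent:
        def __init__(self, kb):
            self.kb = kb
            self.t = 0

        def step(self, percept):
            self.kb.tell(make_percept_sentence(percept, self.t))   # Tell the KB what was perceived
            action = self.kb.ask(make_action_query(self.t))        # Ask what to do
            self.kb.tell(make_action_sentence(action, self.t))     # Record the chosen action
            self.t += 1
            return action

    agent = KBAgent(SimpleKB())
    print(agent.step(["None", "None", "None", "None", "None"]))    # -> "Forward"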

5
WUMPUS WORLD
[Figure: the 4 x 4 wumpus world cave, showing the agent in the start square, pits with breezes in adjacent squares, the wumpus with stenches in adjacent squares, and the gold.]
6
Wumpus World PEAS description
  • Performance measure:
  • pick up gold: +1000
  • fall into a pit or be eaten by the wumpus: -1000
  • each action taken: -1
  • using up the arrow: -10
  • Environment:
  • 4 x 4 grid of rooms. The agent starts at the bottom left (square [1,1]), facing right. Gold and wumpus locations are chosen randomly. Each square other than the start can be a pit with probability 0.2.

7
Wumpus World PEAS description
  • Actuators:
  • Forward, Turn Left 90°, Turn Right 90°
  • Grab: pick up an object in the same square as the agent.
  • Shoot: fire an arrow in a straight line in the direction the agent is facing. The arrow continues until it hits (and kills) the wumpus or hits a wall. The agent has only one arrow ⇒ only the first Shoot action has any effect.
  • The agent dies if it enters a square containing a pit or a live wumpus. (It is safe to enter a square with a dead wumpus.)

8
Wumpus World PEAS description
  • Sensors: five sensors, reported as an ordered five-element list, each element carrying a single bit of information.
  • In the square containing the wumpus and in the directly (not diagonally) adjacent squares, the agent will perceive a stench.
  • In the squares directly adjacent to a pit, it will perceive a breeze.
  • In the square where the gold is, it will perceive a glitter.
  • When the agent walks into a wall, it will perceive a bump.
  • When the wumpus is killed, it emits a woeful scream that can be perceived anywhere in the cave.
  • E.g. if there is a stench and a breeze, but no glitter, bump or scream, the agent receives the percept [Stench, Breeze, None, None, None].

9
Wumpus world characterization
  • Observable?? No: only local perception
  • Deterministic?? Yes: outcomes exactly specified
  • Episodic?? No: sequential at the level of actions
  • Static?? Yes: wumpus and pits do not move
  • Discrete?? Yes
  • Single-agent?? Yes: the wumpus is essentially a natural feature
10
Wumpus world: Initial State
A = Agent, B = Breeze, G = Glitter/Gold, OK = Safe square, P = Pit, S = Stench, V = Visited, W = Wumpus
Percept: [None, None, None, None, None]
11
Wumpus world: After one move
Percept: [None, Breeze, None, None, None]
12
Wumpus world: After the third move
Percept: [Stench, None, None, None, None]
13
Wumpus world: After the fifth move
Percept: [Stench, Breeze, Glitter, None, None]
14
Wumpus world: Example tight spots
Breeze in (1,2) and (2,1) ⇒ no safe actions. You have to compute the probability of a pit in each of (3,1), (2,2) and (1,3) to decide which room is most likely to be OK.
15
Wumpus world: Example tight spots
  • Stench in (1,1) ⇒ cannot move safely.
  • Can use a strategy of coercion:
  • shoot straight ahead
  • if the wumpus was there, it is now dead ⇒ safe
  • if the wumpus wasn't there ⇒ safe

16
Logic in general
  • Logics are formal languages for representing information such that conclusions can be drawn.
  • Syntax defines the sentences in the language.
  • Semantics defines the meaning of sentences, i.e., defines the truth of a sentence in a world.
  • E.g. the language of arithmetic:
  • x + 2 ≥ y is a sentence; x2+y > is not a sentence
  • x + 2 ≥ y is true iff the number x + 2 is no less than the number y
  • x + 2 ≥ y is true in a world where x = 7, y = 1
  • x + 2 ≥ y is false in a world where x = 0, y = 6

17
Entailment
  • Entailment means that one thing follows from another:
  • KB ⊨ α
  • Knowledge base KB entails sentence α if and only if α is true in all worlds where KB is true.
  • E.g., the KB containing "Milan won" and "Roma won" entails "Either Milan won or Roma won".
  • E.g., x + y = 4 entails 4 = x + y
  • Entailment is a relationship between sentences (i.e., syntax) that is based on semantics.
  • Note: brains process syntax (of some sort).

18
Models
  • Logicians typically think in terms of models, which are formally structured worlds with respect to which truth can be evaluated.
  • We say m is a model of a sentence α if α is true in m
  • M(α) is the set of all models of α
  • Then KB ⊨ α if and only if M(KB) ⊆ M(α)
  • E.g. KB = Milan won and Roma won
  • α = Milan won

19
Entailment in wumpus world
  • Situation after detecting nothing in [1,1], moving right, breeze in [2,1]
  • Consider the possible models for this (assuming only pits)
  • 3 Boolean choices ⇒ 8 possible models

20
Wumpus world models
21
Wumpus world models
KB = wumpus-world rules + observations
22
Wumpus world models
KB = wumpus-world rules + observations
α1 = "[1,2] is safe". KB ⊨ α1, proved by model checking.
23
Wumpus world models
KB = wumpus-world rules + observations
24
Wumpus world models
KB = wumpus-world rules + observations
α2 = "[2,2] is safe". KB ⊭ α2.
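The model checking argued on the last few slides can be reproduced mechanically: enumerate all 8 assignments of pit/no-pit to [1,2], [2,2] and [3,1], keep the assignments consistent with the KB (the breeze rules plus the percepts ¬B1,1 and B2,1), and test whether α1 and α2 hold in every surviving model. A minimal illustrative sketch in Python, not the book's code:

    # Enumerate the 8 possible pit configurations for squares (1,2), (2,2), (3,1)
    # and check the two queries by model checking. Illustrative sketch only.
    from itertools import product

    def breeze(sq, pits):
        # A square is breezy iff some adjacent square has a pit.
        (x, y) = sq
        adjacent = [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]
        return any(a in pits for a in adjacent)

    models = []
    for bits in product([False, True], repeat=3):
        pits = {sq for sq, has_pit in zip([(1, 2), (2, 2), (3, 1)], bits) if has_pit}
        # KB = wumpus-world rules + observations: no breeze in (1,1), breeze in (2,1).
        if not breeze((1, 1), pits) and breeze((2, 1), pits):
            models.append(pits)

    alpha1 = all((1, 2) not in pits for pits in models)   # "[1,2] is safe"
    alpha2 = all((2, 2) not in pits for pits in models)   # "[2,2] is safe"
    print(len(models), alpha1, alpha2)   # 3 models; alpha1 holds in all of them, alpha2 does not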
25
Inference
  • KB ⊢i α: sentence α can be derived from KB by procedure i
  • The set of all consequences of KB is a haystack; α is a needle. Entailment = the needle being in the haystack; inference = finding it.
  • Soundness: i is sound if
  • whenever KB ⊢i α, it is also true that KB ⊨ α
  • Completeness: i is complete if
  • whenever KB ⊨ α, it is also true that KB ⊢i α
  • an unsound procedure essentially makes things up as it goes along: it announces the discovery of a nonexistent needle.
  • an incomplete procedure cannot derive some sentences entailed by the KB: we know that a particular needle exists in the haystack, but the procedure is unable to find it.

26
Correspondence between World and Representation
  • If a KB is true in the real world, then any sentence α derived from the KB by a sound inference procedure is also true in the real world.

27
Logic
  • Grounding: the connection, if any, between the logical reasoning process and the real environment in which the agent exists.
  • How do we know that the KB is true in the real world? ⇒ a philosophical question ⇒ many discussions ⇒ see chapter 26.
  • Simple answer: the agent's sensors create the connection.
  • The meaning and truth of percept sentences are defined by the processes of sensing and sentence construction.
  • Some knowledge is not a direct representation of a single percept, but a general rule, derived perhaps from perceptual experience but not identical to a statement of that experience. General rules like this are produced by a sentence construction process called learning.

28
Propositional Logic Syntax
  • The proposition symbols P1, P2, etc. are sentences
  • If S is a sentence, ¬S is a sentence (negation)
  • If S1 and S2 are sentences, S1 ∧ S2 is a sentence (conjunction)
  • If S1 and S2 are sentences, S1 ∨ S2 is a sentence (disjunction)
  • If S1 and S2 are sentences, S1 ⇒ S2 is a sentence (implication)
  • If S1 and S2 are sentences, S1 ⇔ S2 is a sentence (biconditional)

29
Propositional Logic Semantics
  • Each model specifies true/false for each proposition symbol
  • E.g., with symbols P1,2, P2,2, P3,1 there are 8 possible models; one of them, m1, assigns
  • P1,2 = false, P2,2 = false, P3,1 = true
  • Truth evaluation rules with respect to a model m:
  • ¬S is true iff S is false
  • S1 ∧ S2 is true iff S1 is true and S2 is true
  • S1 ∨ S2 is true iff S1 is true or S2 is true
  • S1 ⇒ S2 is true iff S1 is false or S2 is true
  • i.e., it is false iff S1 is true and S2 is false
  • S1 ⇔ S2 is true iff S1 ⇒ S2 is true and S2 ⇒ S1 is true
  • A simple recursive process evaluates an arbitrary sentence, e.g., in m1: ¬P1,2 ∧ (P2,2 ∨ P3,1) = true ∧ (false ∨ true) = true ∧ true = true (sketched in code below)
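The recursive evaluation just described is straightforward to implement; the sketch below uses nested tuples as a hypothetical sentence representation and is not code from the book.

    # Recursive truth evaluation of a propositional sentence in a model.
    # Sentences are nested tuples, e.g. ('and', ('not', 'P12'), ('or', 'P22', 'P31')).
    def pl_true(sentence, model):
        if isinstance(sentence, str):            # proposition symbol
            return model[sentence]
        op, *args = sentence
        if op == 'not':
            return not pl_true(args[0], model)
        if op == 'and':
            return all(pl_true(a, model) for a in args)
        if op == 'or':
            return any(pl_true(a, model) for a in args)
        if op == 'implies':
            return (not pl_true(args[0], model)) or pl_true(args[1], model)
        if op == 'iff':
            return pl_true(args[0], model) == pl_true(args[1], model)
        raise ValueError(f"unknown operator: {op}")

    m1 = {'P12': False, 'P22': False, 'P31': True}
    s = ('and', ('not', 'P12'), ('or', 'P22', 'P31'))
    print(pl_true(s, m1))   # True, matching the worked example on this slide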

30
Truth tables for connectives
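The table itself appears only as an image in the original deck; the standard truth table for the five connectives is reproduced here for reference.

    P      Q      ¬P     P ∧ Q   P ∨ Q   P ⇒ Q   P ⇔ Q
    false  false  true   false   false   true    true
    false  true   true   false   true    true    false
    true   false  false  false   true    false   false
    true   true   false  true    true    true    true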
31
Wumpus world sentences
  • Let Pi,j be true if there is a pit in [i,j]
  • Let Bi,j be true if there is a breeze in [i,j]
  • There is no pit in [1,1]:
  • R1: ¬P1,1
  • A square is breezy if and only if there is an adjacent pit:
  • R2: B1,1 ⇔ (P1,2 ∨ P2,1)
  • R3: B2,1 ⇔ (P1,1 ∨ P2,2 ∨ P3,1)
  • Include the breeze percepts for the first two squares visited:
  • R4: ¬B1,1
  • R5: B2,1


32
Truth table for the knowledge base
  • Does the KB entail α1 ("there is no pit in [1,2]")? α1 = ¬P1,2
  • The KB is true in 3 of the 128 possible models (7 symbols ⇒ 2^7 = 128). Since α1 is also true in those 3 models, the KB entails α1.

33
Inference by enumeration
  • Depth-first enumeration of all models is sound and complete
  • O(2^n) for n symbols; the problem is co-NP-complete.
  • PL-True? returns true if a sentence holds within a model
  • Extend(P, true, model) returns a new partial model in which P has the value true
  • (a sketch of this enumeration over R1-R5 follows)
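As a concrete illustration of enumeration, the sketch below checks the wumpus KB R1-R5 from the previous slides against α1 = ¬P1,2 by brute force over all 2^7 = 128 models. An illustrative sketch, not the book's TT-Entails? implementation:

    # Truth-table enumeration: KB |= alpha iff alpha holds in every model of the KB.
    from itertools import product

    symbols = ['B11', 'B21', 'P11', 'P12', 'P21', 'P22', 'P31']

    def kb(m):
        r1 = not m['P11']
        r2 = m['B11'] == (m['P12'] or m['P21'])
        r3 = m['B21'] == (m['P11'] or m['P22'] or m['P31'])
        r4 = not m['B11']
        r5 = m['B21']
        return r1 and r2 and r3 and r4 and r5

    def alpha1(m):
        return not m['P12']          # "there is no pit in [1,2]"

    all_models = [dict(zip(symbols, values))
                  for values in product([False, True], repeat=len(symbols))]
    models_of_kb = [m for m in all_models if kb(m)]
    print(len(models_of_kb))                        # 3 of the 128 models satisfy the KB
    print(all(alpha1(m) for m in models_of_kb))     # True: the KB entails alpha1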

34
Logical Equivalence
  • Two sentences are logically equivalent if and only if they are true in the same models:
  • α ≡ β if and only if α ⊨ β and β ⊨ α
  • (α ∧ β) ≡ (β ∧ α)  commutativity of ∧
  • (α ∨ β) ≡ (β ∨ α)  commutativity of ∨
  • ((α ∧ β) ∧ γ) ≡ (α ∧ (β ∧ γ))  associativity of ∧
  • ((α ∨ β) ∨ γ) ≡ (α ∨ (β ∨ γ))  associativity of ∨
  • ¬(¬α) ≡ α  double-negation elimination
  • (α ⇒ β) ≡ (¬β ⇒ ¬α)  contraposition
  • (α ⇒ β) ≡ (¬α ∨ β)  implication elimination
  • (α ⇔ β) ≡ ((α ⇒ β) ∧ (β ⇒ α))  biconditional elimination
  • ¬(α ∧ β) ≡ (¬α ∨ ¬β)  de Morgan
  • ¬(α ∨ β) ≡ (¬α ∧ ¬β)  de Morgan
  • (α ∧ (β ∨ γ)) ≡ ((α ∧ β) ∨ (α ∧ γ))  distributivity of ∧ over ∨
  • (α ∨ (β ∧ γ)) ≡ ((α ∨ β) ∧ (α ∨ γ))  distributivity of ∨ over ∧

35
Validity and Satisfiability
  • A sentence is valid if it is true in all models,
  • e.g., True, A ∨ ¬A, (A ∧ (A ⇒ B)) ⇒ B
  • Validity is connected to inference via the Deduction Theorem:
  • KB ⊨ α if and only if (KB ⇒ α) is valid
  • A sentence is satisfiable if it is true in some model
  • e.g., A ∨ B, C
  • A sentence is unsatisfiable if it is true in no models
  • e.g., A ∧ ¬A
  • Satisfiability is connected to inference via the following:
  • KB ⊨ α if and only if (KB ∧ ¬α) is unsatisfiable
  • i.e. prove α by reductio ad absurdum (contradiction)

36
Proof methods
  • Proof methods divide into (roughly) two kinds:
  • Application of inference rules
  • Legitimate (sound) generation of new sentences from old ones
  • Proof = a sequence of inference rule applications. Can use inference rules as operators in a standard search algorithm.
  • Typically requires translation of sentences into a normal form
  • Model checking
  • Truth table enumeration (always exponential in n)
  • Improved backtracking, e.g., Davis-Putnam-Logemann-Loveland (DPLL)
  • Heuristic search in model space (sound but incomplete), e.g., min-conflicts-like hill-climbing algorithms

37
Resolution
  • Resolution is one of the inference rules; other rules include Modus Ponens, And-Elimination, etc.
  • Conjunctive Normal Form (CNF: universal, i.e., every sentence can be expressed in it)
  • conjunction of disjunctions of literals, i.e., of clauses
  • e.g., (A ∨ ¬B) ∧ (B ∨ ¬C ∨ ¬D)

38
Resolution
  • Resolution inference rule (for CNF), complete for propositional logic:

    l1 ∨ ... ∨ lk,    m1 ∨ ... ∨ mn
    ------------------------------------------------------------------------
    l1 ∨ ... ∨ li-1 ∨ li+1 ∨ ... ∨ lk ∨ m1 ∨ ... ∨ mj-1 ∨ mj+1 ∨ ... ∨ mn

    where li and mj are complementary literals.
  • E.g., from P1,3 ∨ P2,2 and ¬P2,2, infer P1,3.
  • Resolution is sound and complete for propositional logic (a one-step resolve function is sketched below).
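
A single resolution step is easy to implement over clauses represented as sets of literal strings; the '~' prefix marking negation is an assumption of this sketch.

    # Resolve two clauses (frozensets of literal strings); '~' marks negation.
    # Illustrative sketch only.
    def negate(lit):
        return lit[1:] if lit.startswith('~') else '~' + lit

    def resolve(c1, c2):
        """Return all clauses obtainable by resolving c1 with c2 on one literal."""
        resolvents = []
        for lit in c1:
            if negate(lit) in c2:
                resolvents.append(frozenset((c1 - {lit}) | (c2 - {negate(lit)})))
        return resolvents

    # The slide's example: from P13 v P22 and ~P22, infer P13.
    print(resolve(frozenset({'P13', 'P22'}), frozenset({'~P22'})))   # [frozenset({'P13'})]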

39
Conversion to CNF
  • B1,1 ⇔ (P1,2 ∨ P2,1)
  • 1. Eliminate ⇔, replacing α ⇔ β with (α ⇒ β) ∧ (β ⇒ α):
    (B1,1 ⇒ (P1,2 ∨ P2,1)) ∧ ((P1,2 ∨ P2,1) ⇒ B1,1)
  • 2. Eliminate ⇒, replacing α ⇒ β with ¬α ∨ β:
    (¬B1,1 ∨ P1,2 ∨ P2,1) ∧ (¬(P1,2 ∨ P2,1) ∨ B1,1)
  • 3. Move ¬ inwards using de Morgan's rules and double negation:
    (¬B1,1 ∨ P1,2 ∨ P2,1) ∧ ((¬P1,2 ∧ ¬P2,1) ∨ B1,1)
  • 4. Apply the distributivity law (∨ over ∧) and flatten:
    (¬B1,1 ∨ P1,2 ∨ P2,1) ∧ (¬P1,2 ∨ B1,1) ∧ (¬P2,1 ∨ B1,1)
  • (a code sketch of these four steps follows)
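
The four steps above can be carried out mechanically. A minimal sketch, assuming the same hypothetical nested-tuple sentence representation as the earlier evaluator sketch; the function names are my own, and flattening of nested ∧/∨ (the last part of step 4) is omitted for brevity.

    # Step-by-step conversion to CNF, mirroring the four steps above.
    # Sentences are nested tuples; this representation is an assumption for illustration.
    def to_cnf(s):
        return distribute(push_not(eliminate_imp(eliminate_iff(s))))

    def eliminate_iff(s):
        if isinstance(s, str):
            return s
        op, *args = s
        args = [eliminate_iff(a) for a in args]
        if op == 'iff':
            a, b = args
            return ('and', ('imp', a, b), ('imp', b, a))
        return (op, *args)

    def eliminate_imp(s):
        if isinstance(s, str):
            return s
        op, *args = s
        args = [eliminate_imp(a) for a in args]
        if op == 'imp':
            a, b = args
            return ('or', ('not', a), b)
        return (op, *args)

    def push_not(s):
        if isinstance(s, str):
            return s
        op, *args = s
        if op == 'not':
            inner = args[0]
            if isinstance(inner, str):
                return s
            iop, *iargs = inner
            if iop == 'not':                                    # double negation
                return push_not(iargs[0])
            if iop == 'and':                                    # de Morgan
                return ('or', *[push_not(('not', a)) for a in iargs])
            if iop == 'or':                                     # de Morgan
                return ('and', *[push_not(('not', a)) for a in iargs])
        return (op, *[push_not(a) for a in args])

    def distribute(s):
        if isinstance(s, str) or s[0] == 'not':
            return s
        op, *args = s
        args = [distribute(a) for a in args]
        if op == 'or':
            a, b = args
            if not isinstance(a, str) and a[0] == 'and':        # distribute or over and
                return distribute(('and', ('or', a[1], b), ('or', a[2], b)))
            if not isinstance(b, str) and b[0] == 'and':
                return distribute(('and', ('or', a, b[1]), ('or', a, b[2])))
        return (op, *args)

    print(to_cnf(('iff', 'B11', ('or', 'P12', 'P21'))))
    # corresponds to (¬B11 ∨ P12 ∨ P21) ∧ (¬P12 ∨ B11) ∧ (¬P21 ∨ B11), as above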

40
Resolution algorithm
  • Proof by contradiction, i.e., show KB ∧ ¬α unsatisfiable. PL-Resolve returns the set of all possible clauses obtained by resolving its two inputs (a minimal saturation loop is sketched below).
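
The algorithm itself is shown only as an image in the original slides; below is a minimal, self-contained saturation loop in the same spirit, reusing the set-of-literals clause encoding from the earlier sketch. It is not the book's PL-Resolution code.

    # Proof by contradiction via resolution: KB |= alpha iff KB plus the negation
    # of alpha resolves to the empty clause. Illustrative sketch only.
    def negate(lit):
        return lit[1:] if lit.startswith('~') else '~' + lit

    def resolve(c1, c2):
        return [frozenset((c1 - {l}) | (c2 - {negate(l)})) for l in c1 if negate(l) in c2]

    def pl_resolution(clauses):
        clauses = set(clauses)
        while True:
            new = set()
            for c1 in clauses:
                for c2 in clauses:
                    if c1 == c2:
                        continue
                    for resolvent in resolve(c1, c2):
                        if not resolvent:          # empty clause: contradiction found
                            return True
                        new.add(resolvent)
            if new.issubset(clauses):              # no new clauses: KB plus ¬alpha is satisfiable
                return False
            clauses |= new

    # KB = (B11 <=> (P12 v P21)) ∧ ¬B11 in clause form, plus the negated query ¬alpha = P12:
    clauses = [frozenset({'~B11', 'P12', 'P21'}), frozenset({'~P12', 'B11'}),
               frozenset({'~P21', 'B11'}), frozenset({'~B11'}), frozenset({'P12'})]
    print(pl_resolution(clauses))   # True: the KB entails ¬P12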

41
Resolution example
  • KB = (B1,1 ⇔ (P1,2 ∨ P2,1)) ∧ ¬B1,1
  • α = ¬P1,2
  • KB ∧ ¬α in clause form: ¬P2,1 ∨ B1,1; ¬B1,1 ∨ P1,2 ∨ P2,1; ¬P1,2 ∨ B1,1; ¬B1,1; P1,2
  • Resolving pairs of clauses yields, among others, ¬P2,1, ¬P1,2, and tautologies such as ¬B1,1 ∨ P1,2 ∨ B1,1 and P1,2 ∨ P2,1 ∨ ¬P2,1.
  • Resolving ¬P1,2 with P1,2 produces the empty clause, so KB ∧ ¬α is unsatisfiable and KB ⊨ α.
42
Horn form
  • Horn form (restricted)
  • Real-world KBs often contain only clauses of a restricted kind called Horn clauses
  • KB = conjunction of Horn clauses
  • Horn clause:
  • disjunction of literals of which at most one is positive
  • e.g., ¬C ∨ ¬B ∨ A, which can be written as (C ∧ B) ⇒ A
  • Horn clauses with exactly one positive literal are called definite clauses
  • The positive literal is the head; the negative literals form the body
  • A Horn clause with no positive literal can be written as an implication whose conclusion is False.

43
Forward and backward chaining
  • Modus Ponens (for Horn form) is complete for Horn KBs:
  • from α1, ..., αn and α1 ∧ ... ∧ αn ⇒ β, infer β
  • Can be used with forward chaining or backward chaining. These algorithms run in time linear in the size of the KB.

44
Forward chaining
  • Idea: fire any rule whose premises are satisfied in the KB and add its conclusion to the KB, until the query is found

45
Simple (inefficient?) forward chaining algorithm
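The algorithm itself appears only as an image in the original slides; below is a minimal Python sketch of the same idea. The (premises, conclusion) encoding of Horn clauses is an assumption of this sketch.

    # Forward chaining for a Horn KB: count unknown premises per clause, fire clauses
    # whose premises are all known, until the query is inferred or nothing changes.
    from collections import deque

    def fc_entails(clauses, facts, query):
        """clauses: list of (premises, conclusion) pairs; facts: initially known symbols."""
        count = {i: len(premises) for i, (premises, _) in enumerate(clauses)}
        inferred = set()
        agenda = deque(facts)
        while agenda:
            p = agenda.popleft()
            if p == query:
                return True
            if p in inferred:
                continue
            inferred.add(p)
            for i, (premises, conclusion) in enumerate(clauses):
                if p in premises:
                    count[i] -= 1
                    if count[i] == 0:              # all premises known: fire the clause
                        agenda.append(conclusion)
        return False

    # Example KB over the symbols A, B, L, M, P, Q used on the next slide:
    # P=>Q, L∧M=>P, B∧L=>M, A∧P=>L, A∧B=>L, with facts A and B.
    clauses = [({'P'}, 'Q'), ({'L', 'M'}, 'P'), ({'B', 'L'}, 'M'),
               ({'A', 'P'}, 'L'), ({'A', 'B'}, 'L')]
    print(fc_entails(clauses, ['A', 'B'], 'Q'))   # True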
46
Forward chaining example
[Figure: AND-OR graph for the forward chaining example over the symbols A, B, L, M, P and Q; the numbers beside each clause count the premises not yet known, dropping to 0 as facts are inferred.]
47
FC Proof of completeness
  • FC derives every atomic sentence that is entailed by the KB
  • FC reaches a fixed point where no new atomic sentences are derived
  • Consider the final state as a model m, assigning true/false to symbols
  • Every clause in the original KB is true in m
  • Proof: suppose a clause a1 ∧ ... ∧ ak ⇒ b is false in m. Then a1 ∧ ... ∧ ak is true in m and b is false in m. Therefore the algorithm has not reached a fixed point!
  • Hence m is a model of the KB
  • If KB ⊨ q, then q is true in every model of the KB, including m

48
Backward chaining
  • Idea: work backwards from the query q. To prove q by BC:
  • check if q is known already, or
  • prove by BC all premises of some rule concluding q
  • Avoid loops: check if a new subgoal is already on the goal stack
  • Avoid repeated work: check if a new subgoal
  • has already been proved true, or
  • has already failed
  • (a recursive sketch follows)
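
A recursive sketch of this idea, using the same hypothetical (premises, conclusion) Horn-clause encoding as the forward-chaining sketch; the goal stack guards against loops, while memoization of already proved or failed subgoals is omitted for brevity.

    # Backward chaining for a Horn KB: prove the query by proving all premises of
    # some rule that concludes it; the 'stack' set guards against loops.
    def bc_entails(clauses, facts, query, stack=frozenset()):
        if query in facts:
            return True
        if query in stack:                         # already a pending subgoal: avoid looping
            return False
        for premises, conclusion in clauses:
            if conclusion == query:
                if all(bc_entails(clauses, facts, p, stack | {query}) for p in premises):
                    return True
        return False

    clauses = [({'P'}, 'Q'), ({'L', 'M'}, 'P'), ({'B', 'L'}, 'M'),
               ({'A', 'P'}, 'L'), ({'A', 'B'}, 'L')]
    print(bc_entails(clauses, {'A', 'B'}, 'Q'))   # True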

49
Backward chaining example
[Figure: backward chaining on the same AND-OR graph, expanding the query Q back through P, M and L to the known facts A and B.]
50
Forward vs. backward chaining
  • FC is data-driven, appropriate for automatic, unconscious processing,
  • e.g., object recognition, routine decisions
  • may do lots of work that is irrelevant to the goal
  • BC is goal-driven, appropriate for problem solving
  • e.g., Where are my keys? How do I get into Fasilkom UI?
  • Complexity of BC can be much less than linear in the size of the KB

51
Summary
  • Logical agents apply inference to a knowledge base to derive new information and make decisions.
  • Basic concepts of logic:
  • syntax: formal structure of sentences
  • semantics: truth of sentences with respect to models
  • entailment: necessary truth of one sentence given another
  • inference: deriving sentences from other sentences
  • soundness: derivations produce only entailed sentences
  • completeness: derivations can produce all entailed sentences
  • The wumpus world requires the ability to represent partial and negated information, reason by cases, etc.
  • Forward and backward chaining are linear-time and complete for Horn clauses. Resolution is complete for propositional logic.
  • Propositional logic lacks expressive power.