Intelligent Systems 2II40 C5 - PowerPoint PPT Presentation

1
Intelligent Systems (2II40)C5
  • Alexandra I. Cristea

October 2003
2
Outline
  • IV. Knowledge reasoning (finish)
  • V. Planning

3
Knowledge and reasoning
  • IV.1. Generalities on logics
  • IV.2. Propositional logic – short visit
  • IV.3. First-order logic (FOL) – short visit
  • IV.4. Knowledge-based agents
  • IV.5. Knowledge representation
  • Ontological engineering
  • Categories and objects
  • Actions, situations and events
  • Mental events and mental objects
  • Internet shopping world
  • Reasoning systems for categories
  • Semantic networks
  • Description logic

4
IV.4.D. Mental Events and Mental Objects
5
Mental events & objects – ontology
  • Knowledge vs. Belief, e.g.,
  • Knows(agent, BlueCoatPoliceman)
  • KnowsWhether(agent, not(BlueCoatPoliceman))
  • KnowsWhat(agent,Job(BlueCoat),String)
  • Believes(agent, KnowsWhat(Policeman,
    WayTo(Bucharest), ListStrings))

6
Referential opaqueness for beliefs
  • Referential transparency: being able to
    substitute freely a term for an equal term
  • Referential opaqueness: a term cannot be
    substituted for an equal term (without changing
    the meaning)
  • Why?

7
Example: need for referential opaqueness
  • (Superman = Clark)
  • Believes(Lois, Flies(Superman)) ⇒ Believes(Lois,
    Flies(Clark))
  • which is false, so FOL is inadequate. Answers:
  • Syntactic theory:
  • mental objects represented as strings, as on the
    slide on knowledge vs. beliefs
  • Modal logic: FOL + modal operators
  • B (belief), K (knowledge); modal context ⇒
    substitution limitation

8
IV.4.F. Reasoning System for categories
9
IV.4.F. Reasoning System for categories
  • Semantic networks
  • Graphical aid for visualizing a knowledge base
  • Algorithms for inferring object properties
    (based on category membership)
  • Description logics
  • Formal language for constructing & combining
    category definitions
  • Algorithms for deciding sub-/super-set
    relationships between categories

10
Semantic networks
  • Represent
  • Objects
  • Categories of objects
  • Relationships between objects
  • Objective (like FOL):
  • to say (easily) things about objects

11
Simple Semantic Net Example
[Diagram: a simple semantic net. Persons SubsetOf
Mammals; Female Persons and Male Persons SubsetOf
Persons; Mary MemberOf Female Persons, John MemberOf
Male Persons; a SisterOf link between Mary and John;
Persons HasMother Persons; Persons have Legs 2 as a
default, John has Legs 1 as an exception]
12
Discussion Semantic Nets
  • Semantic nets are related to concept maps and
    mind maps, but usually come with a better
    formalism for resolution.
  • Reification of links is needed for
  • ∀p,s HasSister(p,s) ⇔ SisterOf(s,p)
  • because semantic nets are indexed by objects,
    categories, and the links from them
  • (FOL is indexed by the first argument of each
    predicate)

13
Pros & Cons of Semantic nets
  • Drawbacks:
  • only binary relations
  • a subset of FOL: no negation, disjunction, nested
    functions, or existential quantification
  • Advantages:
  • Visual aid, easy queries
  • Default values, which are allowed to be overridden
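As a concrete sketch of these ideas (a hypothetical mini-net in Python, not taken from the slides), links can be stored as dictionaries and property lookup can climb MemberOf/SubsetOf links, so defaults are inherited and can be overridden at more specific nodes:

```python
# Minimal semantic-net sketch: objects/categories as nodes with
# MemberOf/SubsetOf links; property lookup climbs the hierarchy,
# so the default Legs=2 on Persons is overridden by Legs=1 on John.
subset_of = {"FemalePersons": "Persons", "MalePersons": "Persons",
             "Persons": "Mammals"}
member_of = {"Mary": "FemalePersons", "John": "MalePersons"}
props = {"Persons": {"Legs": 2}, "John": {"Legs": 1}}

def lookup(node, prop):
    """Return the most specific value of prop, following links upward."""
    while node is not None:
        if prop in props.get(node, {}):
            return props[node][prop]
        node = member_of.get(node) or subset_of.get(node)
    return None

print(lookup("Mary", "Legs"))  # 2  (default inherited from Persons)
print(lookup("John", "Legs"))  # 1  (exception overrides the default)
```

This is exactly the "easy query by graph traversal, with overridable defaults" advantage listed above.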

14
Description Logics
  • Objective (unlike FOL): to describe definitions
    and properties of categories, e.g.
  • Subsumption (is one category a subset of another?)
  • Classification (does an object belong to a
    category?)
  • +/− Consistency (are the membership criteria
    logically satisfiable?)
  • Allows logical operations on predicates
  • And(…), All(…), AtLeast(…), AtMost(…)
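A toy illustration (an illustrative sketch, not a real DL reasoner): if a concept is a conjunction of atomic constraints, subsumption reduces to a subset test on the constraint sets. The concept names below are invented for the example:

```python
# Toy description-logic sketch: a concept = a conjunction (set) of
# atomic constraints; And() is set union, and "C subsumes D" (every
# D is a C) holds iff C's constraints are a subset of D's.
def And(*concepts):
    """Conjunction of concepts = union of their constraint sets."""
    return frozenset().union(*concepts)

Person = frozenset({"person"})
Male = And(Person, frozenset({"male"}))
Unmarried = frozenset({"unmarried"})
Bachelor = And(Male, Unmarried)

def subsumes(c, d):
    """Does category c subsume (contain) category d?"""
    return c <= d

print(subsumes(Male, Bachelor))   # True: every Bachelor is Male
print(subsumes(Bachelor, Male))   # False: not every Male is a Bachelor
```

Real description logics decide subsumption for far richer constructors (All, AtLeast, AtMost), but the principle is the same: a dedicated algorithm over category definitions.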

15
The ontology spectrum
[Diagram: the ontology spectrum, from weak to strong
semantics:
  • Taxonomy ("is subclassification of") – Relational
    Model, ER
  • Thesaurus ("has narrower meaning than") – Schema,
    Extended ER
  • Conceptual Model ("is subclass of") – RDF/S, XTM,
    UML
  • Local Domain Theory ("is disjoint subclass of,
    with transitivity property") – Description Logic,
    DAML+OIL, OWL, First Order Logic, Modal Logic]
16
RDF, XML
  • The Resource Description Framework (RDF),
    recommended by the World Wide Web Consortium
    (W3C), models meta-data about the resources of
    the web.
  • RDF can be written in XML.
  • The eXtensible Markup Language (XML) is accepted
    as THE emerging standard for data interchange on
    the Web.
  • XML allows authors to create their own markup
    (e.g. <AUTHOR>), which seems to carry some
    semantics. However, from a computational
    perspective a tag like <AUTHOR> carries as much
    semantics as a tag like <H1>.
  • What is needed?

17
DAML+OIL
  • The DARPA Agent Markup Language (DAML) (started
    August 2000): language & tools to facilitate the
    concept of the Semantic Web.
  • Example: DAML beer ontology in RDF
  • The Ontology Inference Layer (OIL) is a proposal
    for a web-based representation & inference layer
    for ontologies, combining
  • modelling primitives from frame-based languages
  • with the formal semantics and reasoning services
    provided by description logics.

18
Conclusion: Knowledge & Reasoning
  • 384–322 BC: Aristotle's comprehensive taxonomies,
    emphasizing classification & categorization
  • Still a hot topic:
  • IEEE Standard Upper Ontology Working Group
  • 2003-08-19: The OWL Web Ontology Language is now
    a W3C Candidate Recommendation.
  • RDF, XML
  • Semantic Web (Tim Berners-Lee, 2001)
  • DARPA Agent Markup Language (DAML)
  • Ontology Inference Layer (OIL)
  • However, e.g., Ted Nelson is against this
    (hierarchical) movement!

19
V. Planning
20
V. Planning
  • V.1. Planning generalities
  • V.1.A. Search vs. Planning
  • V.1.B. STRIPS operators
  • V.1.C. Partial Order Planning
  • V.2. Planning in the real world
  • V.2.A. Conditional Planning
  • V.2.B. Monitoring and Replanning
  • V.2.C. Continuous Planning
  • V.2.D. Multi-agent planning

21
V.1. Planning generalities
  • V.1.A. Search vs. Planning
  • V.1.B. STRIPS operators
  • V.1.C. Partial Order Planning

22
V.1.A. Search vs. Planning
23
Search vs. Planning: Example
  • Task: get milk, bananas, and a cordless drill
  • Standard search:

24
Problem decomposition
  • Deliver n packages, worst case:
  • O(n!)
  • O((n/k)! · k) if the problem can be decomposed
    into k equal parts
  • Most problems are partially decomposable
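The gain from decomposition can be checked numerically; a short sketch with illustrative numbers (n = 12 packages, k = 4 parts):

```python
from math import factorial

# Compare the O(n!) worst case against O((n/k)! * k) after
# decomposing the n-package problem into k independent equal parts.
n, k = 12, 4
whole = factorial(n)                # orderings of all 12 packages
decomposed = factorial(n // k) * k  # 4 independent sub-problems of 3
print(whole, decomposed)            # 479001600 vs 24
```

The decomposed cost is smaller by many orders of magnitude, which is why planners try to exploit (even partial) decomposability.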

25
V.1.B. STRIPS operators
26
STRIPS ('71)
  • Restricted language ⇒ efficient algorithm
  • Represents states, goals and actions
  • State: literals, ground and function-free
  • not allowed: At(x,y), At(Father(Fred),Sydney)
  • Goal: partially specified state, a conjunction of
    positive ground literals
  • e.g., Rich ∧ Famous ∧ At(P2,Tahiti)
  • Action: precondition + effect
  • Precondition: conjunction of positive literals
  • Effect: conjunction of literals
  • e.g.,
  • ACTION: Buy(x)
  • PRECONDITION: At(p), Sells(p,x)
  • EFFECT: Have(x)
  • Closed world assumption!
  • However, many details are omitted
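A minimal Python sketch of this representation (the encoding and the Buy example's literals are illustrative): a state is a set of positive ground literals (closed-world assumption), and an action carries a precondition set plus add/delete effect sets:

```python
# Minimal STRIPS-style representation: states are sets of positive
# ground literals; actions have precondition, add, and delete sets.
class Action:
    def __init__(self, name, precond, add, delete):
        self.name, self.precond = name, set(precond)
        self.add, self.delete = set(add), set(delete)

    def applicable(self, state):
        return self.precond <= state          # all preconditions hold

    def apply(self, state):
        return (state - self.delete) | self.add

buy_milk = Action("Buy(Milk)",
                  precond={"At(Shop)", "Sells(Shop,Milk)"},
                  add={"Have(Milk)"}, delete=set())

state = {"At(Shop)", "Sells(Shop,Milk)"}
if buy_milk.applicable(state):
    state = buy_milk.apply(state)
print("Have(Milk)" in state)  # True
```

Anything not listed in the state is assumed false (the closed-world assumption), and anything not mentioned in an effect is assumed unchanged.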

27
V.1.C. Partial Order Planning (POP)
28
POP
  • Least commitment strategy
  • Obvious, important decisions first
  • POP: partially ordered collection of steps, with
  • Start step (initial state = effect)
  • Finish step (goal description = precondition)
  • Causal links (outcome of one step = precondition
    of another)
  • Temporal ordering
  • Plan complete ⇔ every precondition achieved
  • Precondition achieved ⇔
  • it is the effect of an earlier step AND
  • no (possibly intervening) step undoes it
  • Open condition: precondition of a step not yet
    causally linked

29
Planning process
  • Operators on partial plans:
  • Add a link from an existing action to an open
    condition
  • Add a step to fulfill an open condition
  • Order one step with respect to another (to remove
    possible conflicts)
  • Incomplete/vague plans
  • ⇒ complete, correct plans
  • (backtrack if an open condition is unachievable
    or a conflict is unsolvable)

30
POP algorithm
31
POP alg. cont.
32
Clobbering
  • Clobbering: a step that potentially destroys the
    condition achieved by a causal link.
  • E.g., Go(Home) clobbers At(Supermarket)
  • Solution: promotion or demotion

33
Clobbering: promotion/demotion
  • Demotion: put before
  • Go(Supermarket)
  • Promotion: put after
  • Buy(Milk)

34
Example POP
35
Example POP
36
Example POP
37
POP properties
  • sound, complete & systematic (no repetitions)
  • Nondeterministic: backtracks at the choice point
    of failure
  • Choice: S_add to achieve S_need
  • Choice: demotion/promotion of the clobberer
  • Extensions: disjunction, universals, negation,
    conditionals
  • Efficient with a good heuristic
  • Good for problems with loosely related sub-goals

38
Ex. blocks world
39
Ex. blocks world - cont.
  • ACTION: PutOn(x,y)
  • PRECONDITION: Clear(x), On(x,z), Clear(y)
  • EFFECT: On(x,y), Clear(z), ¬On(x,z), ¬Clear(y)
  • ACTION: PutOnTable(x)
  • PRECONDITION: Clear(x), On(x,z)
  • EFFECT: On(x,Table), Clear(z), ¬On(x,z)
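The same add/delete state-update rule applied to a concrete blocks-world move, sketched in Python (the initial state is illustrative):

```python
# Blocks-world sketch using the STRIPS update rule: effects are split
# into an add set and a delete set (the negated literals).
def put_on(state, x, y, z):
    """Move block x from z onto y; return the successor state."""
    pre = {f"Clear({x})", f"On({x},{z})", f"Clear({y})"}
    assert pre <= state, "preconditions not met"
    add = {f"On({x},{y})", f"Clear({z})"}
    delete = {f"On({x},{z})", f"Clear({y})"}
    return (state - delete) | add

# A and B on the table, both clear -> put A on B.
s0 = {"On(A,Table)", "On(B,Table)", "Clear(A)", "Clear(B)",
      "Clear(Table)"}
s1 = put_on(s0, "A", "B", "Table")
print("On(A,B)" in s1, "Clear(B)" in s1)  # True False
```

(A real blocks-world axiomatization treats Table specially, since the table never stops being clear; the sketch ignores that detail.)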

40
Ex. blocks world - cont.
41
Ex. blocks world - cont.
42
Ex. blocks world - cont.
43
Ex. blocks world - cont.
44
Hierarchical task network planning (HTN)
  • Generalization of POP
  • Higher-level actions, to be decomposed into
    lower-level actions
  • Preconditions of high-level actions are the
    intersection of the external preconditions of
    their decompositions.
  • Effects of high-level actions are the intersection
    of the external effects of their decompositions.

Counter-intuitive!!
45
V.2. Planning in the real world
46
Time, schedules, resources
  • Critical Path Method (CPM)
  • Path: a linearly ordered sequence of actions
    beginning with Start and ending with Finish.
  • Critical path: the path with the longest duration.
  • ES(action): earliest possible start time of the
    action
  • LS(action): latest possible start time of the
    action
  • Slack = LS − ES

47
Computing CPM
  • ES(Start) = 0
  • ES(B) = max_{A<B} (ES(A) + Duration(A))
  • LS(Finish) = ES(Finish)
  • LS(A) = min_{A<B} LS(B) − Duration(A)
  • Time complexity: O(Nb),
  • where N = number of actions, b = max branching
    factor into or out of an action
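These two passes are easy to sketch on a small action DAG (the graph and durations below are made up for the example):

```python
# Critical-path sketch: forward pass computes ES, backward pass LS;
# actions with zero slack (LS == ES) lie on the critical path.
duration = {"Start": 0, "A": 3, "B": 2, "C": 4, "Finish": 0}
succ = {"Start": ["A", "B"], "A": ["C"], "B": ["C"],
        "C": ["Finish"], "Finish": []}
pred = {n: [m for m in succ if n in succ[m]] for n in succ}

order = ["Start", "A", "B", "C", "Finish"]   # a topological order

ES, LS = {}, {}
for n in order:                               # forward pass
    ES[n] = max((ES[p] + duration[p] for p in pred[n]), default=0)
for n in reversed(order):                     # backward pass
    LS[n] = min((LS[s] - duration[n] for s in succ[n]),
                default=ES[n])

slack = {n: LS[n] - ES[n] for n in order}
print(slack)  # only B has slack (1); Start, A, C, Finish are critical
```

With one max/min per edge, the cost is linear in the number of edges, matching the O(Nb) bound above.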

48
V.2. Planning in the real world
  • V.2.A. Conditional Planning
  • V.2.B. Monitoring and Replanning
  • V.2.C. Continuous Planning
  • V.2.D. Multi-agent planning

49
V.2. Real world planning
50
Things do go wrong
  • Incomplete info
  • Unknown preconditions: Intact(Spare)?
  • Disjunctive effects: Inflate(x) causes
  • Inflated(x) ∨ SlowHiss(x) ∨ Burst(x) ∨
    BrokenPump(x)
  • Incorrect info
  • Current state incorrect, e.g., spare NOT intact
  • Missing/incorrect postconditions in operators
  • Qualification problem: it is impossible to list
    all required preconditions and possible outcomes
    of actions.

51
Solutions
  • Sensorless planning: a plan that works regardless
    of state or outcome
  • may not exist
  • Conditional planning: plan to obtain info; plan
    for every possible situation
  • expensive (plans for unlikely cases as well)
  • Monitoring/repairing: assume normal states and
    outcomes; check progress during execution, replan
    if necessary
  • unanticipated outcomes may lead to failure
  • Best: a combination of the above

52
Sensorless planning = Planning with Beliefs
53
(No Transcript)
54
V.2.A.Conditional Planning
  • World non-deterministic, or partially observable
    ⇒ percepts provide info to split the belief state

55
Conditional planning examples
56
Murphy x 2
57
Search Algorithm AND-OR graph
  • function AND-OR-GRAPH-SEARCH(problem) returns a
    conditional plan, or failure
  •   return OR-SEARCH(INITIAL-STATE[problem],
    problem, [])
  • function OR-SEARCH(state, problem, path) returns
    a conditional plan, or failure
  •   if GOAL-TEST[problem](state) then return the
    empty plan
  •   if state is on path then return failure
  •   for each (action, state_set) in
    SUCCESSORS[problem](state) do
  •     plan ← AND-SEARCH(state_set, problem,
    [state | path])
  •     if plan ≠ failure then return [action | plan]
  •   return failure
  • function AND-SEARCH(state_set, problem, path)
    returns a conditional plan, or failure
  •   for each s_i in state_set do
  •     plan_i ← OR-SEARCH(s_i, problem, path)
  •     if plan_i = failure then return failure
  •   return [if s_1 then plan_1 else if s_2 then
    plan_2 else … if s_{n-1} then plan_{n-1} else
    plan_n]
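A runnable Python sketch of the same algorithm on a tiny illustrative non-deterministic problem (states and actions are made up; a returned plan is a nested `[action, {state: subplan}]` structure standing in for the if-then-else above):

```python
# AND-OR graph search sketch: OR nodes pick an action, AND nodes must
# plan for every possible outcome state; loops on the path fail.
def or_search(state, successors, goal, path):
    """Return a conditional plan from state, or None on failure."""
    if goal(state):
        return []                              # empty plan: at the goal
    if state in path:
        return None                            # cycle -> failure
    for action, states in successors(state):
        plan = and_search(states, successors, goal, path + [state])
        if plan is not None:
            return [action, plan]              # [action | plan]
    return None

def and_search(states, successors, goal, path):
    """Plan for every outcome state, or None if any of them fails."""
    plans = {}
    for s in states:
        p = or_search(s, successors, goal, path)
        if p is None:
            return None
        plans[s] = p                           # "if s then plans[s]"
    return plans

# Toy problem: action "a" from state 1 may land in 2 or 3; action "b"
# from 2 surely reaches 4; states 3 and 4 are goals.
def successors(s):
    return {1: [("a", {2, 3})], 2: [("b", {4})]}.get(s, [])

plan = or_search(1, successors, lambda s: s in (3, 4), [])
print(plan)
```

The resulting plan says: do "a"; if that leaves us in state 2, do "b"; if it leaves us in state 3, we are already done.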

58
Murphy x 3
59
V.2.B. Monitoring and Replanning
  • Failure: preconditions of the remaining plan not
    met
  • Preconditions of the remaining plan =
  • all preconditions of remaining steps not achieved
    by remaining steps
  • = all causal links crossing the current time point
  • On failure, resume POP to achieve the open
    conditions from the current state
  • Performs action monitoring + plan monitoring
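A minimal action-monitoring loop sketched in Python (the steps and the replanning fallback are illustrative, reusing the STRIPS-style set encoding from earlier):

```python
# Action monitoring: before executing each step, check that its
# preconditions still hold in the observed world state; on a
# violation, invoke the planner again from the current state.
def execute(plan, state, replan):
    """Run a plan step by step with action monitoring."""
    for name, precond, add, delete in plan:
        if not precond <= state:          # monitoring detects failure
            return execute(replan(state), state, replan)
        state = (state - delete) | add    # STRIPS-style state update
    return state

# Illustrative steps: (name, preconditions, add-set, delete-set)
normal = [("Inflate(Spare)", {"Intact(Spare)"},
           {"Inflated(Spare)"}, set())]

def replan(state):
    """Fallback plan when the spare turns out not to be intact."""
    return [("CallGarage", set(), {"Rescued"}, set())]

print(execute(normal, {"Intact(Spare)"}, replan))  # normal execution
print(execute(normal, set(), replan))              # replanning fires
```

Plan monitoring would additionally check the causal links of the whole remaining plan, so a failure can be caught before the offending step is reached.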

60
(No Transcript)
61
(No Transcript)
62
(No Transcript)
63
(No Transcript)
64
(No Transcript)
65
(No Transcript)
66
Other ex. of emergent behaviour
67
Other ex. of emergent behaviour
  • Loop until success – until the behaviour emerges

68
Multi-agent planning
  • Cooperation: joint goals & plans
  • Problems:
  • the wrong plan might be selected
  • Synchronization
  • Communication
  • Interesting examples: boids (separation,
    cohesion, alignment)
  • Competition:
  • success/failure vs. minimization of cost
  • Plan recognition (for both): hot research area
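The three boid rules can be sketched in a few lines of Python (the weights, neighbourhood radius, and initial flock are illustrative):

```python
# Boids update sketch: each agent steers by three purely local rules.
def step(boids, radius=5.0, w_sep=0.05, w_coh=0.01, w_ali=0.05):
    """One synchronous update of a flock of (x, y, vx, vy) tuples."""
    new = []
    for (x, y, vx, vy) in boids:
        nbrs = [b for b in boids
                if (b[0], b[1]) != (x, y)
                and (b[0] - x) ** 2 + (b[1] - y) ** 2 < radius ** 2]
        if nbrs:
            cx = sum(b[0] for b in nbrs) / len(nbrs)  # cohesion:
            cy = sum(b[1] for b in nbrs) / len(nbrs)  # toward centre
            ax = sum(b[2] for b in nbrs) / len(nbrs)  # alignment:
            ay = sum(b[3] for b in nbrs) / len(nbrs)  # match velocity
            sx = sum(x - b[0] for b in nbrs)          # separation:
            sy = sum(y - b[1] for b in nbrs)          # move away
            vx += w_coh * (cx - x) + w_ali * (ax - vx) + w_sep * sx
            vy += w_coh * (cy - y) + w_ali * (ay - vy) + w_sep * sy
        new.append((x + vx, y + vy, vx, vy))
    return new

flock = [(0, 0, 1, 0), (1, 0, 0, 1), (0, 1, 1, 1)]
flock = step(flock)
print(flock)
```

No boid has a joint goal or a shared plan; flocking emerges entirely from these local interactions, which is why boids are the classic multi-agent example here.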

69
Summary planning
  • There is much more going on in planning; we
    barely touched the surface
  • HTNs are used in practice a lot, although they
    are undecidable in the general case
  • Conditional plans can be combined with info
    gathering from sensing actions when info is
    missing
  • In non-fully observable environments, sensorless
    planning can, in principle, still lead to the
    desired outcome
  • All the planning methods can be applied as
    searches within belief-state trees
  • Execution monitoring can detect violations of
    preconditions and inform the replanning agent
  • Boids were the basis of the penguin flocks in
    Batman Returns
  • Plan recognition is used for utterance detection
    in speech recognition

70
Homework 5
  • The monkey-and-bananas problem is faced by a
    monkey in a lab with some bananas hanging out of
    reach from the ceiling.
  • A box is available that will enable the monkey to
    reach the bananas if he climbs on it.
  • Initially, the monkey is at A, the bananas at B,
    and the box at C. The monkey and box have the
    same height Low, but if the monkey climbs on the
    box it will have height High, the same as the
    bananas.
  • The actions available for the monkey include Go
    from one place to another, Push an object from
    one place to another, ClimbUp onto or ClimbDown
    from an object, and Grasp or UnGrasp an object.
    Grasping results in holding the object if the
    monkey and object are in the same place at the
    same height.
  • 1. Write down the initial state description.
  • 2. Write down STRIPS-style definitions of the six
    actions.
  • 3. Suppose the monkey wants to fool the
    scientists, who are off to tea, by grabbing the
    bananas, but leaving the box in its original
    place. Write this as a general goal (i.e., not
    assuming that the box is necessarily at C) in the
    language of situation calculus. Can this goal be
    solved by a STRIPS-style system? (Hint: check
    also the comments on the STRIPS slide and on POP)
  • 4. Continue till step 6 with your project.