Title: Abductive Logic Programming Agents
1. Abductive Logic Programming Agents
- The ALP agent cycle
- ALP combines backward and forward reasoning
- ALP gives a semantics to production rules
- ALP can be used for explaining observations, conditional solutions, generating actions, and default reasoning
- Pre-active reasoning, combining utility and uncertainty
- Deciding whether or not to carry an umbrella
- The prisoner's dilemma
2. Abductive logic programming (ALP) agent model

[Diagram: an agent containing maintenance goals and achievement goals. Forward reasoning using beliefs derives consequences of observations; backward reasoning using beliefs reduces goals; forward reasoning using beliefs derives consequences of candidate actions; the agent judges probabilities and utilities, decides, and acts. Observe and act connect the agent to the world.]
3ALP agents combine beliefs and goals
- Beliefs, represented by logic programs,
- describe how things are.
- Goals represented by integrity constraints,
- prescribe how things should be. They include
- condition-action rules
- commands
- queries
- obligations prohibitions
- atomic and non-atomic actions
- denials
4. The ALP agent cycle
- Record current observations.
- Use forward reasoning to derive consequences of the observations, triggering any integrity constraints and adding any new goals.
- Use backward reasoning to reduce goals to sub-goals.
- Perform conflict-resolution to choose between candidate sub-goals that are atomic actions.
- Execute the associated actions.
- Conflict-resolution can be performed by using forward reasoning to derive consequences of candidate actions.
- Decision theory can be used to choose actions whose consequences have maximal expected utility.
- Backward reasoning can also be used to explain observations, before using forward reasoning to derive consequences.
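Below is a minimal sketch of this cycle in Python. It is illustrative only: ALPAgent and its methods (forward_reason, backward_reason, choose_action) are placeholder names of mine, not an existing ALP implementation, and the reasoning steps are left as stubs.

    class ALPAgent:
        """Illustrative skeleton of the ALP agent cycle (not a real ALP system)."""

        def __init__(self, beliefs, goals):
            self.beliefs = beliefs        # logic program P
            self.goals = list(goals)      # integrity constraints and achievement goals

        # Placeholder reasoning steps: a real ALP agent would implement these
        # by forward and backward reasoning over self.beliefs and self.goals.
        def forward_reason(self, observations):
            return []                     # derive consequences, trigger integrity constraints

        def backward_reason(self, goals):
            return []                     # reduce goals to candidate atomic actions

        def choose_action(self, candidates):
            return candidates[0] if candidates else None   # conflict resolution

        def cycle_once(self, world):
            observations = world.observe()                    # 1. record observations
            self.goals += self.forward_reason(observations)   # 2. forward reasoning
            candidates = self.backward_reason(self.goals)     # 3. backward reasoning
            action = self.choose_action(candidates)           # 4. conflict resolution
            if action is not None:
                world.act(action)                             # 5. execute the chosen action

Conflict resolution in step 4 is where forward reasoning about the consequences of candidate actions, and decision theory over their expected utilities, would plug in.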
5. The London underground

Goal: If there is an emergency then I get help.

Beliefs:
- A person gets help if the person alerts the driver.
- A person alerts the driver if the person presses the alarm signal button.
- There is an emergency if there is a fire.
- There is an emergency if one person attacks another.
- There is an emergency if someone becomes seriously ill.
- There is an emergency if there is an accident.
- There is a fire if there are flames.
- There is a fire if there is smoke.
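A sketch of these beliefs as a propositional rule base in Python, with a naive backward chainer that reduces the goal "get help" to the atomic action "press the alarm signal button" (the encoding and function names are mine, for illustration only):

    # Beliefs as (conclusion, [conditions]) pairs, in propositional form.
    rules = [
        ("get help", ["alert the driver"]),
        ("alert the driver", ["press the alarm signal button"]),
        ("emergency", ["fire"]),
        ("emergency", ["attack"]),
        ("emergency", ["seriously ill"]),
        ("emergency", ["accident"]),
        ("fire", ["flames"]),
        ("fire", ["smoke"]),
    ]

    actions = {"press the alarm signal button"}   # executable atomic actions

    def reduce_goal(goal):
        """Backward reasoning: reduce a goal to a list of atomic actions (first solution)."""
        if goal in actions:
            return [goal]
        for conclusion, conditions in rules:
            if conclusion == goal:
                plan = []
                for condition in conditions:
                    sub_plan = reduce_goal(condition)
                    if sub_plan is None:
                        break
                    plan += sub_plan
                else:
                    return plan
        return None

    print(reduce_goal("get help"))   # ['press the alarm signal button']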
6. ALP combines forward and backward reasoning

[Diagram: the agent observes "there is a fire" in the world and reasons forward to "there is an emergency", which triggers the goal "if there is an emergency then get help". Backward reasoning then reduces "get help" to "alert the driver" and to the atomic action "press the alarm signal button", which the agent performs in the world.]
7. Abductive Logic Programming
- Abductive logic programs <P, A, IC> have three components:
  - P is a normal logic program.
  - A is a set of abducible predicates.
  - IC, the set of integrity constraints, is a set of first-order sentences.
- Often, ICs are expressed as conditionals
  - If A1 and ... and An then B
- or as denials
  - not (A1 and ... and An and not B)
- Normally, P is not allowed to contain any clauses whose conclusion contains an abducible predicate. (This restriction can be made without loss of generality.)
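One simple way to carry the triple <P, A, IC> around in code, shown here as a sketch (the class and field names are mine, and the clause encoding is purely illustrative; the example data anticipates the grass-wet program used later in these slides):

    from dataclasses import dataclass, field

    @dataclass
    class AbductiveLogicProgram:
        """An abductive logic program <P, A, IC> (illustrative representation)."""
        program: list                       # P: clauses as (conclusion, [conditions])
        abducibles: set                     # A: abducible predicate names
        integrity_constraints: list = field(default_factory=list)   # IC

    alp = AbductiveLogicProgram(
        program=[("grass is wet", ["it rained"]),
                 ("grass is wet", ["the sprinkler was on"]),
                 ("the sun was shining", [])],
        abducibles={"it rained", "the sprinkler was on"},
        integrity_constraints=["not (it rained and the sun was shining)"],
    )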
8. ALP Semantics and Proof Procedures
- Semantics
  - Given an abductive logic program <P, A, IC>, an abductive explanation for a goal G is a set Δ of ground atoms in terms of the abducible predicates such that
    - G holds in P ∪ Δ
    - IC holds in P ∪ Δ (or P ∪ Δ ∪ IC is consistent).
- Proof procedures
  - Backward reasoning to show G.
  - Forward reasoning to show that observations and explanations satisfy IC.
- Different notions of "holds" are compatible with these characterisations, i.e. truth in the intended minimal model, truth in all models, etc.
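The same conditions in symbols (a standard rendering, in LaTeX, of what the slide states):

    \Delta \subseteq \{\text{ground atoms of predicates in } A\}, \qquad
    P \cup \Delta \models G, \qquad
    P \cup \Delta \models IC \ \ (\text{or } P \cup \Delta \cup IC \text{ is consistent})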
9. ALP gives a logical semantics to production rules
- Logical rules used to reason forward can be represented by LP clauses, with forward reasoning.
- Reactive rules that implement stimulus-response associations can be represented by integrity constraints, with forward reasoning.
- Pro-active rules that simulate goal-reduction, such as "If goal G and conditions C then add H as a sub-goal", can be represented by LP clauses, with backward reasoning.
10. ALP viewed in Active Deductive Database terms
- Logic programs define data. E.g.
  - The bus leaves at 9:00.
  - The bus leaves at 10:00.
  - The bus leaves at X:00 if X is an integer and 9 ≤ X ≤ 18.
- Integrity constraints maintain integrity. E.g.
  - There is no bus before 9:00.
  - If the bus leaves at X:00, then it arrives at its destination at X:Y, where 20 ≤ Y ≤ 30.
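A small sketch of how the data and integrity constraints above might be checked in code (the time encoding and function name are mine, purely illustrative):

    # The rule "the bus leaves at X:00 if X is an integer and 9 <= X <= 18" as data.
    departures = [(hour, 0) for hour in range(9, 19)]

    def integrity_ok(departure, arrival):
        """No bus before 9:00; a bus leaving at X:00 arrives at X:Y with 20 <= Y <= 30."""
        dep_hour, dep_minute = departure
        arr_hour, arr_minute = arrival
        if (dep_hour, dep_minute) < (9, 0):
            return False
        return arr_hour == dep_hour and 20 <= arr_minute <= 30

    print(integrity_ok((9, 0), (9, 25)))   # True
    print(integrity_ok((9, 0), (9, 45)))   # False: violates the arrival-time constraint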
11. ALP can be used to explain observations
- Program:
  - Grass is wet if it rained.
  - Grass is wet if the sprinkler was on.
  - The sun was shining.
- Abducible predicates: it rained, the sprinkler was on
- Integrity constraint: not (it rained and the sun was shining)
- Observation: Grass is wet
- Two potential explanations: it rained; the sprinkler was on
- The only explanation that satisfies the integrity constraint is: the sprinkler was on.
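A minimal sketch of this abductive step in Python: enumerate candidate sets of abducibles, and keep those that entail the observation and satisfy the integrity constraint (the encoding is mine, for illustration only):

    from itertools import combinations

    abducibles = ["it rained", "the sprinkler was on"]
    facts = {"the sun was shining"}

    def grass_is_wet(assumed):
        # Program: grass is wet if it rained; grass is wet if the sprinkler was on.
        return "it rained" in assumed or "the sprinkler was on" in assumed

    def integrity_ok(assumed):
        # Integrity constraint: not (it rained and the sun was shining).
        world = facts | assumed
        return not ("it rained" in world and "the sun was shining" in world)

    explanations = [set(delta)
                    for r in range(1, len(abducibles) + 1)
                    for delta in combinations(abducibles, r)
                    if grass_is_wet(set(delta)) and integrity_ok(set(delta))]
    print(explanations)   # [{'the sprinkler was on'}]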
12. ALP can be used to generate conditional solutions
- Program:
  - X citizen if X born in USA.
  - X citizen if X born outside USA and X resident of USA and X naturalised.
  - X citizen if X born outside USA and Y is mother of X and Y citizen and X registered.
  - Mary is mother of John.
  - Mary is citizen.
- Abducible predicates: X born in USA, X born outside USA, X resident of USA, X naturalised, X registered
- Integrity constraint: if John resident of USA then false.
- Goal: John citizen
- Two abductive solutions:
  - John born in USA
  - John born outside USA and John registered
13. ALP can be used to generate actions
- Program:
  - there is an emergency if there is a fire
  - you get help if you alert the driver
  - you alert the driver if you press the alarm signal button
- Abducible predicates: there is a fire, you press the alarm signal button
- Integrity constraint functioning as a maintenance goal: If there is an emergency, then you get help
- Abductive solution: you press the alarm signal button
14. ALP can be used for default reasoning
- Program:
  - X can fly if X is a bird and normal X
  - X is a bird if X is a penguin
- Abducible predicate: normal X
- Integrity constraint functioning as a denial: If normal X and penguin X then false
- Observation: tweety is a bird
- Abductive consequence: tweety can fly, assuming normal tweety
- New observation: tweety is a penguin. The consequence is withdrawn.
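A compact sketch of the consistency check behind this default (the encoding and names are mine, purely illustrative):

    def can_assume_normal(x, observations):
        """Assume normal(x) only if that is consistent with the denial
        'if normal X and penguin X then false'."""
        return ("penguin", x) not in observations

    observations = {("bird", "tweety")}
    print(can_assume_normal("tweety", observations))   # True: tweety can fly

    observations.add(("penguin", "tweety"))            # new observation
    print(can_assume_normal("tweety", observations))   # False: the conclusion is withdrawn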
15. ALP agents can reason pre-actively, taking into account utility and uncertainty
- A common form of belief has the form:
  - an effect takes place if an agent does something and some conditions hold in the environment.
- Different effects have different utilities, and the state of the environment is uncertain.
- The same belief can be used
  - to reason forwards from observations
  - to reason backwards from desired effects
  - to reason forwards from candidate actions
  - to reason backwards from observed effects.
16. Combining utility and uncertainty with pre-active thinking
- To get rich, I am thinking about robbing a bank.
- But before constructing a plan in all its detail, I mentally infer the possible consequences.
- Apart from any moral considerations, if I rob a bank, get caught, and am convicted, then I will end up in jail.
- But I don't want to go to jail.
- I can control whether or not I try to rob a bank.
- But I cannot control whether I will be caught or be convicted. I can only judge their likelihood.
- If I judge that the likelihood of getting caught and being convicted is high, then I will decide not to rob a bank, because I don't want to go to jail.
- I will not even think about how I might rob a bank, because all of the alternatives lead to the same undesirable consequence.
17. Pre-active thinking can be applied at different levels of detail

[Diagram: the ALP agent cycle again, showing maintenance and achievement goals, backward and forward reasoning over consequences, judging probabilities and utilities, deciding, acting and observing.]
18. Pre-active thinking
- Goal: I carry an umbrella or I do not carry an umbrella.
- Beliefs:
  - I stay dry if I carry an umbrella.
  - I get wet if I do not carry an umbrella and it rains.
  - I stay dry if it doesn't rain.
- Assume: I carry an umbrella.
  - Infer: I stay dry (whether or not it rains).
- Assume: I do not carry an umbrella.
  - Infer: I get wet if it rains.
  - I stay dry if it doesn't rain (whether or not I carry an umbrella).
19. Decision Theory
- To find the expected utility of a proposed action, find all the alternative resulting states of affairs, weigh the utility of each such state by its probability, and add them all up.
[Decision tree: action1 leads to outcomes with probabilities p11, p12, p13, p14 and utilities u11, u12, u13, u14; action2 leads to outcomes with probabilities p21, p22, p23, p24 and utilities u21, u22, u23, u24.]

Expected utility of action1 = p11·u11 + p12·u12 + p13·u13 + p14·u14
Expected utility of action2 = p21·u21 + p22·u22 + p23·u23 + p24·u24
Choose the action of highest expected utility
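The same rule as a few lines of Python (the function name and the example numbers are mine, purely illustrative):

    def expected_utility(outcomes):
        """Sum of probability * utility over the alternative resulting states."""
        return sum(p * u for p, u in outcomes)

    # Choose the action of highest expected utility (illustrative numbers).
    actions = {"action1": [(0.5, 10), (0.5, -2)],
               "action2": [(1.0, 3)]}
    best = max(actions, key=lambda name: expected_utility(actions[name]))
    print(best, expected_utility(actions[best]))   # action1 4.0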
20. Deciding whether or not to carry an umbrella
- Assume:
  - Probability it rains = 0.1
  - Probability it doesn't rain = 0.9
  - Utility of getting wet = -10
  - Utility of staying dry = 1
  - Utility of carrying an umbrella = -2
  - Utility of not carrying an umbrella = 0
- Assume: I carry an umbrella.
  - Infer: I stay dry with probability 1.
  - Expected utility = -2 + 1 = -1
- Assume: I do not carry an umbrella.
  - Infer: I get wet with probability 0.1, I stay dry with probability 0.9.
  - Expected utility = 0 + (-10)(0.1) + (1)(0.9) = -1 + 0.9 = -0.1
- Decide: I do not carry an umbrella!
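The same calculation as a quick, self-contained check in Python (probabilities and utilities taken from the slide above):

    p_rain, p_no_rain = 0.1, 0.9
    u_wet, u_dry = -10, 1
    u_carry, u_not_carry = -2, 0

    eu_carry = u_carry + 1.0 * u_dry                                # I stay dry with probability 1
    eu_not_carry = u_not_carry + p_rain * u_wet + p_no_rain * u_dry
    print(eu_carry, eu_not_carry)   # -1.0 and roughly -0.1: do not carry the umbrella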
21. A more practical alternative might be to use maintenance goals or condition-action rules instead
- If I leave home and it is raining, then I take an umbrella.
- If I leave home and there are dark clouds in the sky, then I take an umbrella.
- If I leave home and the weather forecast predicts rain, then I take an umbrella.
- The maintenance goals compile decision-making into the thinking component of the agent cycle.
- The compilation might be an exact implementation of the Decision Theoretic specification, or it might be only an approximation.
22. The Prisoner's Dilemma
- Goal: I turn state witness or I do not turn state witness.
- Beliefs:
  - A prisoner gets 0 years in jail if the prisoner turns state witness and the other prisoner does not.
  - A prisoner gets 4 years in jail if the prisoner does not turn state witness and the other prisoner does.
  - A prisoner gets 3 years in jail if the prisoner turns state witness and the other prisoner does too.
  - A prisoner gets 1 year in jail if the prisoner does not turn state witness and the other prisoner does not turn state witness either.
23. Pre-active thinking
- Assume: I turn state witness.
  - Infer: I get 0 years in jail if the other prisoner does not turn state witness.
  - I get 3 years in jail if the other prisoner turns state witness.
- Assume: I do not turn state witness.
  - Infer: I get 4 years in jail if the other prisoner turns state witness.
  - I get 1 year in jail if the other prisoner does not turn state witness.
24. In Classical Logic
- Given the additional belief: the other prisoner turns state witness or the other prisoner does not turn state witness.
- Infer: If I turn state witness, then I get 0 years in jail or I get 3 years in jail.
- If I do not turn state witness, then I get 4 years in jail or I get 1 year in jail.
25. In Decision Theory
- Assume:
  - Probability the other prisoner turns state witness = 0.5
  - Probability the other prisoner does not turn state witness = 0.5
  - Utility of getting N years in jail = -N
- Assume: I turn state witness.
  - Infer: Probability I get 0 years in jail = 0.5; Probability I get 3 years in jail = 0.5.
  - Expected utility = -(0.5 × 0 + 0.5 × 3) = -1.5, i.e. 1.5 expected years in jail.
- Assume: I do not turn state witness.
  - Infer: Probability I get 4 years in jail = 0.5; Probability I get 1 year in jail = 0.5.
  - Expected utility = -(0.5 × 4 + 0.5 × 1) = -2.5, i.e. 2.5 expected years in jail.
- Decide: I turn state witness.
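A quick check of the two expectations in Python (probabilities and jail terms taken from the slides; the utility of N years in jail is taken as -N):

    p_other_turns = 0.5
    p_other_does_not = 0.5

    def expected_jail_years(i_turn_state_witness):
        # Payoffs from the beliefs on the Prisoner's Dilemma slide.
        if i_turn_state_witness:
            return p_other_does_not * 0 + p_other_turns * 3
        return p_other_turns * 4 + p_other_does_not * 1

    print(expected_jail_years(True))    # 1.5 years, expected utility -1.5
    print(expected_jail_years(False))   # 2.5 years, expected utility -2.5
    # Decide: turn state witness (fewer expected years in jail, higher expected utility).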
26. Conclusion
- Logic can be used to combine proactive, reactive and pre-active thinking, together with Decision Theory (and other ways of making decisions).
[Diagram: the ALP agent cycle once more, showing maintenance and achievement goals, backward and forward reasoning over consequences, judging probabilities and utilities, deciding, acting and observing.]