Logic & Prolog
1
Dave Reed
  • Knowledge-based problem solving
  • expert systems
  • rule-based reasoning, heuristics
  • reasoning with uncertainty
  • Bayesian probabilities, certainty factors, fuzzy
    reasoning
  • alternative approaches
  • case-based reasoning, model-based reasoning

2
Expert systems
  • expert systems are AI's greatest commercial success
  • an expert system uses knowledge specific to a problem domain to
    provide "expert quality" performance in that application area
  • DENDRAL (1967): determine molecular structure based on mass
    spectrograms
  • MYCIN (1976): diagnosis & therapy recommendation for infectious
    blood diseases
  • PROSPECTOR (1978): mineral exploration (found a $100M ore deposit)
  • XCON (1984): configure VAX and PDP-11 series computer systems
    (saved DEC $70M per year)
  • today, expert systems are used extensively in finance,
    manufacturing, scheduling, customer service, ...
  • FocalPoint (TriPath Imaging) screens 10% of all pap smears in the
    U.S.
  • American Express uses an ES to automatically approve purchases
  • Mrs. Field's cookies uses an ES to model the founder's operational
    ideas
  • TaxCut uses an ES to give tax advice
  • Phoenix Police Dept uses an ES to help identify suspects using M.O.

3
Common characteristics of expert systems
  • system performs at a level generally recognized
    as equivalent to a human expert in the field
  • presumably, human expertise is rare or expensive
  • the demand for a solution justifies the cost & effort of building
    the system
  • system is highly domain specific
  • lots of knowledge in a narrow field (does not
    require common sense)
  • amenable to symbolic reasoning, but not solvable
    using traditional methods
  • system can explain its reasoning
  • in order to be useful, it must be able to justify
    its advice/conclusions
  • system manipulates probabilistic or fuzzy
    information
  • must be able to propagate uncertainties and
    provide a range of conclusions
  • system allows for easy modification
  • knowledge bases must be refined

4
System architecture
  • usually, expert systems are rule-based
  • extract expert knowledge in the form of facts & rules, e.g.,
  • if P1 and P2 and P3, then conclude C.
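Such a rule maps directly onto a Prolog clause; a minimal sketch with placeholder names (p1, p2, p3, c are illustrative, not from the slides):

c :- p1, p2, p3.    % "if P1 and P2 and P3, then conclude C"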

The main components:
  • user interface: acquires information and displays results
  • inference engine: performs deductions on the known facts & rules
    (i.e., applies the knowledge base)
  • knowledge base: domain-specific facts & rules for solving problems
    in the domain
  • case-specific data: working memory, stores info about the current
    deduction
5
Inference example
  • Consider the following rules about diagnosing
    auto problems
  • (R1) if gas_in_engine and turns_over, then problem(spark_plugs).
  • (R2) if not(turns_over) and not(lights_on), then problem(battery).
  • (R3) if not(turns_over) and lights_on, then problem(starter).
  • (R4) if gas_in_tank and gas_in_carb, then gas_in_engine.

The Knowledge Base (KB) contains the general rules & facts about the
domain. The User Interface may be used to load initial facts about the
specific task and to specify a goal.

    known: gas_in_tank
    known: gas_in_carb
    goal:  problem(X)
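These rules can be prototyped directly as Prolog clauses; a minimal sketch (negation-as-failure stands in for not, which only approximates the shell developed later):

% R1-R4 from above, plus the two task-specific facts
problem(spark_plugs) :- gas_in_engine, turns_over.       % R1
problem(battery)     :- \+ turns_over, \+ lights_on.     % R2
problem(starter)     :- \+ turns_over, lights_on.        % R3
gas_in_engine        :- gas_in_tank, gas_in_carb.        % R4

gas_in_tank.
gas_in_carb.

% ?- problem(X).
% X = battery
% since turns_over and lights_on are simply absent, negation-as-failure
% treats them as false; a real shell should ask the user instead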
6
Inference example (cont.)
  • Consider the following rules about diagnosing
    auto problems
  • (R1) if gas_in_engine and turns_over, then problem(spark_plugs).
  • (R2) if not(turns_over) and not(lights_on), then problem(battery).
  • (R3) if not(turns_over) and lights_on, then problem(starter).
  • (R4) if gas_in_tank and gas_in_carb, then gas_in_engine.

The Inference Engine can make forward deductions (use rules and existing
facts to deduce new facts); it can also reason backwards, reducing a
goal to subgoals (à la Prolog). Goals can be solved by facts, or may
prompt the user for more info.

    known: gas_in_tank
    known: gas_in_carb
    known: gas_in_engine      (deduced forward via R4)
    goal:  problem(X), reduced via R1 to subgoals gas_in_engine, turns_over
7
Rule-based reasoning
  • rule-based expert systems have many options when
    applying rules
  • forward reasoning vs. backward reasoning
  • depth-first vs. breadth-first vs. ...
  • apply "best" rule vs. apply all applicable rules
  • also, many ways to handle uncertainty
  • probabilities
  • specify likelihood of a conclusion, apply
    Bayesian reasoning
  • certainty factors
  • a certainty factor is an estimate of confidence
    in conclusions
  • not probabilistically precise, but effective
  • fuzzy logic
  • reason in terms of fuzzy sets (conclusion can be
    a member to a degree)
  • again, not probabilistically precise, but
    effective

8
Case study: MYCIN
  • MYCIN (1976) provided consultative advice on
    bacterial infections
  • rule-based
  • backward reasoning (from a specific goal back to
    known facts)
  • performs depth first, exhaustive search of all
    rules
  • utilizes certainty factors
  • sample rule:
        IF   (1) the stain of the organism is gram-positive, AND
             (2) the morphology of the organism is coccus, AND
             (3) the growth conformation of the organism is clumps,
        THEN there is suggestive evidence (0.7) that the identity of
             the organism is staphylococcus.
  • MYCIN used rules to compute certainty factors for hypotheses
  • find rules whose conclusions match the hypothesis
  • obtain CF's for premises (look up, use rules, ask, ...) and compute
    the CF for the conclusion
  • combine CF's obtained from all applicable rules

9
Certainty factors in MYCIN
  • Consider two rules:
  • (R1) hasHair → mammal,  CF(R1) = 0.9
  • (R2) forwardEyes ∧ sharpTeeth → mammal,  CF(R2) = 0.7
  • Suppose you have determined that
  • CF(hasHair) = 0.8,  CF(forwardEyes) = 0.75,  CF(sharpTeeth) = 0.3
  • Given multiple premises, how do you combine them into one CF?
  • CF(P1 ∨ P2) = max( CF(P1), CF(P2) )
  • CF(P1 ∧ P2) = min( CF(P1), CF(P2) )
  • So, CF(forwardEyes ∧ sharpTeeth) = min( 0.75, 0.3 ) = 0.3

10
Certainty factors in MYCIN
  • Consider two rules:
  • (R1) hasHair → mammal,  CF(R1) = 0.9
  • (R2) forwardEyes ∧ sharpTeeth → mammal,  CF(R2) = 0.7
  • We now know that
  • CF(hasHair) = 0.8,  CF(forwardEyes) = 0.75,  CF(sharpTeeth) = 0.3
  • CF(forwardEyes ∧ sharpTeeth) = min( 0.75, 0.3 ) = 0.3
  • Given the premise CF, how do you combine it with the CF for the rule?
  • CF(H, Rule) = CF(Premise) × CF(Rule)
  • So, CF(mammal, R1) = CF(hasHair) × CF(R1) = 0.8 × 0.9 = 0.72
  • CF(mammal, R2) = CF(forwardEyes ∧ sharpTeeth) × CF(R2)
                   = 0.3 × 0.7
                   = 0.21

11
Certainty factors in MYCIN
  • Consider two rules:
  • (R1) hasHair → mammal,  CF(R1) = 0.9
  • (R2) forwardEyes ∧ sharpTeeth → mammal,  CF(R2) = 0.7
  • We now know that
  • CF(hasHair) = 0.8,  CF(forwardEyes) = 0.75,  CF(sharpTeeth) = 0.3
  • CF(forwardEyes ∧ sharpTeeth) = min( 0.75, 0.3 ) = 0.3
  • CF(mammal, R1) = 0.72,  CF(mammal, R2) = 0.21
  • Given different rules with the same conclusion, how do you combine
    the CF's?
  • CF(H, Rule1 & Rule2) = CF(H, Rule1) + CF(H, Rule2) × (1 - CF(H, Rule1))
  • So, CF(mammal, R1 & R2)
       = CF(mammal, R1) + CF(mammal, R2) × (1 - CF(mammal, R1))
       = 0.72 + 0.21 × 0.28
       = 0.72 + 0.0588
       = 0.7788
  • note: CF(mammal, R1 & R2) = CF(mammal, R2 & R1), i.e., the
    combination is order-independent
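These combination rules are easy to express directly; a minimal Prolog sketch (the helper names cf_and, cf_or, cf_rule, cf_combine are illustrative, not from the shell developed later, and CF's here are on the 0..1 scale):

cf_and(CF1, CF2, CF) :- CF is min(CF1, CF2).              % conjoined premises
cf_or(CF1, CF2, CF)  :- CF is max(CF1, CF2).              % disjoined premises
cf_rule(CFPrem, CFRule, CF) :- CF is CFPrem * CFRule.     % premise + rule
cf_combine(CF1, CF2, CF) :- CF is CF1 + CF2 * (1 - CF1).  % two rules, same conclusion

% reproducing the mammal example:
% ?- cf_rule(0.8, 0.9, R1), cf_rule(0.3, 0.7, R2), cf_combine(R1, R2, CF).
% R1 = 0.72, R2 = 0.21, CF = 0.7788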

12
Rule representations
  • rules don't have to be represented as IF-THEN
    statements
  • PROSPECTOR (1978) represented rules as semantic
    nets
  • allowed for inheritance, class/subclass relations
  • allowed for the overlap of rules (i.e., structure
    sharing)
  • potential for a smooth interface with natural
    language systems

Example rule from PROSPECTOR's net: "Barite overlying sulfides suggests
the possible presence of a massive sulfide deposit."
13
Knowledge engineering
  • knowledge acquisition is the bottleneck in
    developing expert systems
  • often difficult to codify knowledge as facts & rules
  • extracting/formalizing/refining knowledge is long
    and laborious
  • known as knowledge engineering
  • in addition, explanation facilities are
    imperative for acceptance
  • TEIRESIAS (1977): front-end for MYCIN, supported knowledge
    acquisition and explanation
  • could answer WHY is that knowledge relevant, HOW did it come to
    that conclusion, WHAT is it currently trying to show
  • could add new rules and adjust existing rules
  • today, expert system shells are a huge market
  • ES shell is a general-purpose system, can plug in
    any knowledge base
  • includes tools to assist in knowledge acquisition
    and refinement

14
Example expert system shell
  • for illustration, we will develop a simple expert
    system shell
  • knowledge base will consist of rules and facts of
    the form
  • Premise1, ..., PremiseN ---> Conclusion.     % rule
  • true ---> Conclusion.                        % fact
  • will utilize a depth-first, stop-at-first-answer
    strategy
  • will start with a simple inference engine
  • no user interface for data acquisition
  • no uncertainties
  • no explanation or justification facilities
  • each of these features will be added
    incrementally
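For instance, two entries from the auto repair KB developed on the following slides, in exactly this format:

not(car_starts), not(turns_over) ---> bad_system(starter_system).   % rule
true ---> fix(starter, 'replace starter').                          % fact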

15
ES shell (v. 1)
  • this expert system shell consists of a simple
    inference engine
  • will work with any KB
  • can handle
  • task-specific facts, entered in the form
    known(Fact).
  • facts in the KB
  • negated goals
  • conjunctive goals
  • rules (via back-chaining)

%%% shell1.pro    Dave Reed    3/25/02
%%% Expert system shell

:- op(1100, xfx, '--->').

%%% CASE 1: truth of goal is already known
solve(Goal) :-
    (known(Goal) ; (true ---> Goal)), !
    ;
    (known(not(Goal)) ; (true ---> not(Goal))), !, fail.

%%% CASE 2: negated goal
solve(not(Goal)) :-
    solve(Goal), !, fail
    ;
    true.

%%% CASE 3: conjunctive goals
solve((Goal1, Goal2)) :-
    !, solve(Goal1), solve(Goal2).

%%% CASE 4: back chain on rule in KB
solve(Goal) :-
    (Premise ---> Goal), solve(Premise).
16
Auto repair knowledge base (v. 1)
  • consider an extension of the auto repair KB

%%% autoKB1.pro    Dave Reed    3/25/02

:- op(1100, xfx, '--->').

bad_component(X), fix(X, Advice) ---> fix(Advice).

bad_system(starter_system), lights(come_on) ---> bad_component(starter).
bad_system(starter_system), not(lights(come_on)) ---> bad_component(battery).
bad_system(ignition_system), not(tuned_recently) ---> bad_component(timing).
bad_system(ignition_system), plugs(dirty) ---> bad_component(plugs).
bad_system(ignition_system), not(plugs(dirty)), tuned_recently --->
    bad_component(ignition_wires).

not(car_starts), not(turns_over) ---> bad_system(starter_system).
not(car_starts), turns_over, gas_in_carb ---> bad_system(ignition_system).
runs(rough), gas_in_carb ---> bad_system(ignition_system).
car_starts, runs(dies), gas_in_carb ---> bad_system(ignition_system).

true ---> fix(starter, 'replace starter').
true ---> fix(battery, 'replace or recharge battery').
true ---> fix(timing, 'get the timing adjusted').
true ---> fix(plugs, 'replace the spark plugs').
true ---> fix(ignition_wires, 'check ignition wires').
17
ES query (v. 1)
  • note no user interface for knowledge acquisition
  • must assert task-specific facts directly into the
    Prolog database
  • since no uncertainties are involved, the ES shell behaves similarly
    to a Prolog interpreter

?- consult(shell1).
% shell1.pro compiled 0.00 sec, 1,280 bytes
Yes
?- consult(autoKB1).
% autoKB1 compiled 0.00 sec, 3,136 bytes
Yes
?- assert(known(not(car_starts))).
Yes
?- assert(known(not(turns_over))).
Yes
?- assert(known(lights(come_on))).
Yes
?- solve(fix(X)).
X = 'replace starter'
Yes
18
ES shell (v. 2)
  • would like to add a user interface: ask the user for info when it
    is needed
  • note: since not all info is askable, can define a predicate in the
    KB to identify askable info

%%% shell2.pro    Dave Reed    3/25/02
%%% Expert system shell

top_solve(Goal) :-
    retractall(known(_)), solve(Goal).

%%% CASE 1: truth of goal is already known
solve(Goal) :-
    (known(Goal) ; (true ---> Goal)), !
    ;
    (known(not(Goal)) ; (true ---> not(Goal))), !, fail.

%%% CASE 2: negated goal
solve(not(Goal)) :-
    solve(Goal), !, fail
    ;
    true.

%%% CASE 3: conjunctive goals
solve((Goal1, Goal2)) :-
    !, solve(Goal1), solve(Goal2).

%%% CASE 4: back chain on rule in KB
solve(Goal) :-
    (Premise ---> Goal), solve(Premise).

%%% CASE 5: ask user
solve(Goal) :-
    askable(Goal),
    ask_user(Goal, Response),
    ( Response = 'y', assert(known(Goal)), !
    ; Response = 'n', assert(known(not(Goal))), !, fail ).

ask_user(Goal, Answer) :-
    nl, write('User query: '), write(Goal),
    nl, write('(y/n) '),
    read(Response),
    respond(Goal, Response, Answer).

respond(_, 'y', 'y') :- !.
respond(_, 'n', 'n') :- !.
respond(Goal, _, Answer) :-
    write('Illegal response.'), nl,
    ask_user(Goal, Answer).
19
Auto repair knowledge base (v. 2)
  • must identify which info it is reasonable to ask the user for
  • info that is deducible by rules should not be askable

%%% autoKB2.pro    Dave Reed    3/25/02

:- op(1100, xfx, '--->').

bad_component(X), fix(X, Advice) ---> fix(Advice).

bad_system(starter_system), lights(come_on) ---> bad_component(starter).
bad_system(starter_system), not(lights(come_on)) ---> bad_component(battery).
bad_system(ignition_system), not(tuned_recently) ---> bad_component(timing).
bad_system(ignition_system), plugs(dirty) ---> bad_component(plugs).
bad_system(ignition_system), not(plugs(dirty)), tuned_recently --->
    bad_component(ignition_wires).

not(car_starts), not(turns_over) ---> bad_system(starter_system).
not(car_starts), turns_over, gas_in_carb ---> bad_system(ignition_system).
runs(rough), gas_in_carb ---> bad_system(ignition_system).
car_starts, runs(dies), gas_in_carb ---> bad_system(ignition_system).

true ---> fix(starter, 'replace starter').
true ---> fix(battery, 'replace or recharge battery').
true ---> fix(timing, 'get the timing adjusted').
true ---> fix(plugs, 'replace the spark plugs').
true ---> fix(ignition_wires, 'check ignition wires').

askable(car_starts).
askable(turns_over).
askable(lights(_)).
askable(runs(_)).
askable(gas_in_carb).
askable(tuned_recently).
askable(plugs(_)).

20
ES query (v. 2)
?- consult(shell2).
% shell2 compiled 0.05 sec, 2,904 bytes
Yes
?- consult(autoKB2).
% autoKB2 compiled 0.00 sec, 3,548 bytes
Yes
?- top_solve(fix(X)).

User query: car_starts
(y/n) n.

User query: turns_over
(y/n) foo.
Illegal response.

User query: turns_over
(y/n) n.

User query: lights(come_on)
(y/n) y.

X = 'replace starter'
Yes
  • with the addition of the user interface, don't
    have to assert knowledge ahead of time
  • will be prompted for info as it becomes relevant
  • user input is asserted automatically
  • note top_solve automatically retracts all known
    info before beginning the deduction, so no
    leftover knowledge

21
Adding uncertainty
  • will handle uncertainties via certainty factors
    (similar to MYCIN)
  • associate a CF between 0 (known false) and 100
    (known true) for info
  • each fact and rule in the KB will have a CF
    associated with it
  • for askable info, the user will specify a CF for
    that info
  • combine CF's of rule premises as in MYCIN
  • CF(P1 ∨ P2) = max( CF(P1), CF(P2) )
  • CF(P1 ∧ P2) = min( CF(P1), CF(P2) )
  • combine the premise CF and the rule's CF as in MYCIN
  • CF(H, Rule) = CF(Premise) × CF(Rule) / 100   (dividing by 100
    rescales the product to the 0..100 range)
  • will only consider a premise or rule if its CF exceeds a threshold
    (60)
  • will report the first conclusion that exceeds the
    threshold (but backtrackable)
  • thus, no need to combine CF's of multiple rules
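For example, using the helper predicates defined in shell3 on the next slide: premises known with CF 100 feeding a rule whose CF is 90 yield a conclusion CF of 90, which clears the threshold of 60:

?- and_cf(100, 100, CF1), rule_cf(90, CF1, CF), above_threshold(CF, 60).
CF1 = 100, CF = 90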

22
ES shell (v. 3)
%%% shell3.pro    Dave Reed    3/25/02
%%% Expert system shell

solve(Goal, CF) :-
    retractall(known(_, _)),
    solve(Goal, CF, 60).

%%% CASE 1: truth of goal is already known
solve(Goal, CF, Threshold) :-
    (known(Goal, CF) ; (true ---> Goal = CF)), !,
    above_threshold(CF, Threshold).

%%% CASE 2: negated goal
solve(not(Goal), CF, Threshold) :-
    !,
    negate_cf(Threshold, New_threshold),
    solve(Goal, CF_goal, New_threshold),
    negate_cf(CF_goal, CF).

%%% CASE 3: conjunctive goals
solve((Goal1, Goal2), CF, Threshold) :-
    !,
    solve(Goal1, CF1, Threshold),
    above_threshold(CF1, Threshold),
    solve(Goal2, CF2, Threshold),
    above_threshold(CF2, Threshold),
    and_cf(CF1, CF2, CF).

%%% CASE 4: back chain on rule in KB
solve(Goal, CF, Threshold) :-
    (Premise ---> Goal = CF_rule),
    solve(Premise, CF_premise, Threshold),
    rule_cf(CF_rule, CF_premise, CF),
    above_threshold(CF, Threshold).

%%% CASE 5: ask user
solve(Goal, CF, Threshold) :-
    askable(Goal),
    ask_user(Goal, CF), !,
    assert(known(Goal, CF)),
    above_threshold(CF, Threshold).

and_cf(A, B, Min) :- Min is min(A, B).
rule_cf(CF_rule, CF_premise, CF) :- CF is (CF_rule * CF_premise / 100).
negate_cf(CF, Negated_CF) :- Negated_CF is 100 - CF.

above_threshold(CF, T) :- T >= 50, CF >= T.
above_threshold(CF, T) :- T < 50, CF =< T.

ask_user(Goal, CF) :-
    nl, write('User query: '), write(Goal),
    nl, write('? '),
    read(Answer),
    respond(Answer, Goal, CF).

respond(CF, _, CF) :- number(CF), CF =< 100, CF >= 0.
respond(_, Goal, CF) :-
    write('Illegal response.'), nl,
    ask_user(Goal, CF).
23
Auto repair knowledge base (v. 3)
%%% autoKB3.pro    Dave Reed    3/25/02

:- op(1100, xfx, '--->').

bad_component(X), fix(X, Advice) ---> fix(Advice) = 100.

bad_system(starter_system), lights(come_on) --->
    bad_component(starter) = 75.
bad_system(starter_system), not(lights(come_on)) --->
    bad_component(battery) = 95.
bad_system(ignition_system), not(tuned_recently) --->
    bad_component(timing) = 90.
bad_system(ignition_system), plugs(dirty) --->
    bad_component(plugs) = 95.
bad_system(ignition_system), not(plugs(dirty)), tuned_recently --->
    bad_component(ignition_wires) = 90.

not(car_starts), not(turns_over) ---> bad_system(starter_system) = 95.
not(car_starts), turns_over, gas_in_carb --->
    bad_system(ignition_system) = 90.
runs(rough), gas_in_carb ---> bad_system(ignition_system) = 90.
car_starts, runs(dies), gas_in_carb --->
    bad_system(ignition_system) = 80.

true ---> fix(starter, 'replace starter') = 100.
true ---> fix(battery, 'replace or recharge battery') = 100.
true ---> fix(timing, 'get the timing adjusted') = 100.
true ---> fix(plugs, 'replace the spark plugs') = 100.
true ---> fix(ignition_wires, 'check ignition wires') = 100.

askable(car_starts).
askable(turns_over).
askable(lights(_)).
askable(runs(_)).
askable(gas_in_carb).
askable(tuned_recently).
askable(plugs(_)).
  • associate certainty factors with facts and rules in the KB
  • here, a CF is attached to each conclusion using the built-in '='
    operator

24
ES query (v. 3)
?- consult(shell3).
% shell3 compiled 0.00 sec, 4,180 bytes
Yes
?- consult(autoKB3).
% autoKB3 compiled 0.00 sec, 3,836 bytes
Yes
?- solve(fix(X), CF).

User query: car_starts
? 0.

User query: turns_over
? 10.

User query: lights(come_on)
? 80.

X = 'replace starter'
CF = 60 ;

User query: runs(rough)
? 100.

User query: gas_in_carb
? 100.

User query: tuned_recently
? 85.

User query: plugs(dirty)
? 75.

X = 'replace the spark plugs'
CF = 71.25
Yes
  • solve predicate has the CF as an additional
    argument, so that the certainty of the conclusion
    is also reported
  • note: solve does not necessarily give the conclusion with the
    highest CF first; it reports the first conclusion with CF at or
    above 60

25
ES shell (v. 4)
  • to add explanation facilities, must keep track of the chain of rules
  • allow user to ask why at a prompt

%%% shell.pro    Dave Reed    3/25/02

solve(Goal, CF) :-
    print_instructions,
    retractall(known(_, _)),
    solve(Goal, CF, [], 60).

%%% CASE 1: truth of goal is already known
solve(Goal, CF, _, Threshold) :-
    (known(Goal, CF) ; (true ---> Goal = CF)), !,
    above_threshold(CF, Threshold).

%%% CASE 2: negated goal
solve(not(Goal), CF, Rules, Threshold) :-
    !,
    negate_cf(Threshold, New_threshold),
    solve(Goal, CF_goal, Rules, New_threshold),
    negate_cf(CF_goal, CF).

%%% CASE 3: conjunctive goals
solve((Goal1, Goal2), CF, Rules, Threshold) :-
    !,
    solve(Goal1, CF1, Rules, Threshold),
    above_threshold(CF1, Threshold),
    solve(Goal2, CF2, Rules, Threshold),
    above_threshold(CF2, Threshold),
    and_cf(CF1, CF2, CF).

%%% CASE 4: back chain on rule in KB, pushing the rule onto the stack
solve(Goal, CF, Rules, Threshold) :-
    (Premise ---> Goal = CF_rule),
    solve(Premise, CF_premise,
          [(Premise ---> Goal = CF_rule) | Rules], Threshold),
    rule_cf(CF_rule, CF_premise, CF),
    above_threshold(CF, Threshold).

%%% CASE 5: ask user
solve(Goal, CF, Rules, Threshold) :-
    askable(Goal),
    ask_user(Goal, CF, Rules), !,
    assert(known(Goal, CF)),
    above_threshold(CF, Threshold).

print_instructions :-
    nl, write('Responses must be either:'), nl,
    write('  (1) a number between 0 and 100 (a confidence factor).'), nl,
    write('  (2) why (to justify the relevance of the question).'), nl.

ask_user(Goal, CF, Rules) :-
    nl, write('User query: '), write(Goal),
    nl, write('? '),
    read(Answer),
    respond(Answer, Goal, CF, Rules).

respond(CF, _, CF, _) :- number(CF), CF =< 100, CF >= 0.
respond(why, Goal, CF, [Rule|Rules]) :-
    write(Rule), nl,
    ask_user(Goal, CF, Rules).
respond(why, Goal, CF, []) :-
    write('At the top of the rule stack.'), nl,
    ask_user(Goal, CF, []).
respond(_, Goal, CF, Rules) :-
    write('Illegal response.'), nl,
    ask_user(Goal, CF, Rules).
26
ES query (v. 4)
?- solve(fix(X), CF).
Responses must be either:
  (1) a number between 0 and 100 (a confidence factor).
  (2) why (to justify the relevance of the question).

User query: car_starts
? 0.

User query: turns_over
? why.
not(car_starts), not(turns_over) ---> bad_system(starter_system) = 95

User query: turns_over
? why.
bad_system(starter_system), lights(come_on) ---> bad_component(starter) = 75

User query: turns_over
? why.
bad_component(starter), fix(starter, _G245) ---> fix(_G245) = 100

User query: turns_over
? why.
At the top of the rule stack.

User query: turns_over
? 0.

User query: lights(come_on)
? 80.

X = 'replace starter'
CF = 60
Yes
  • when the user enters 'why' at a prompt, will be
    shown the rule being investigated
  • subsequent 'why's pop from the stack of rules
  • ideally, would also like for user to be able to
    ask 'how' conclusions were reached
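One way a 'how' facility could work (a sketch, not part of the slides' shell; it assumes solve/4 is extended to record each successful rule application with assert(derived(Goal, Rule))):

how(Goal) :-
    derived(Goal, Rule), !,
    write(Goal), write(' was concluded using the rule:'), nl,
    write(Rule), nl.
how(Goal) :-
    write(Goal), write(' was supplied by the user or given as a fact.'), nl.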

27
Alternative approaches
  • case-based reasoning
  • begin with a collection of cases (previous
    solutions)
  • when you encounter a new situation, find the closest match and
    modify it to apply to the new situation (see the sketch after this
    list)
  • common applications: legal advice, hardware diagnosis, help lines, ...
  • model-based reasoning
  • attempt to construct a model of the situation
  • provides deeper understanding of the system, but is more difficult
    & detailed
  • common example: hardware diagnosis
  • construct software models of individual
    components
  • when an error occurs, compare with the model's
    behavior
  • model-based reasoning is used to troubleshoot
    NASA space probes

see Chapter 7 for a summary of advantages/disadvantages