1
Moral Decision Making with ACORDA
  • Luís Moniz Pereira CENTRIA UNL, Portugal
  • Ari Saptawijaya FASILKOM UI, Indonesia

2
Study on Morality
  • Interdisciplinary perspectives
  • Philosophy: virtue ethics,
    utilitarianism/consequentialism, deontological
    principles/nonconsequentialism, etc.
  • Science: primatology, cognitive sciences,
    neuroscience, artificial intelligence, etc.

3
Computational Study on Morality
  • Several names: machine ethics, machine morality,
    artificial morality, computational morality
  • Two purposes
  • To understand morality better, from the
    computational point of view
  • To equip artificial agents with the capability of
    moral decision making

4
Our Goal
  • To provide a general framework to model morality
    computationally
  • A toolkit to codify arbitrarily chosen moral
    rules as declaratively as possible
  • Logic programming as a promising paradigm
  • Default negation
  • Abductive logic programming
  • Stable model semantics
  • Preferences
  • etc.

5
Prospective Logic Programming
  • Enables evolving programs to look ahead
    prospectively at their possible future states,
    and to prefer among them in order to satisfy
    goals
  • Working implementation ACORDA
  • Based on EVOLP
  • Benefits from XSB-XASP interface to Smodels
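The look-ahead cycle described above can be sketched in Python. This is a hypothetical illustration, not the actual ACORDA/EVOLP implementation: the function names (`prospect`, `consequences_of`, `violates`, `prefer`) are invented for this sketch.

```python
# Hypothetical sketch of the prospective cycle (not ACORDA itself):
# enumerate the expected abducibles, derive each one's consequences,
# drop futures that violate integrity constraints, and prefer among
# the remaining ones.

def prospect(abducibles, consequences_of, violates, prefer):
    """Look ahead over possible futures and return the preferred one(s)."""
    futures = {a: consequences_of(a) for a in abducibles}
    allowed = {a: c for a, c in futures.items() if not violates(c)}
    return prefer(allowed)
```

In the real system the consequence step is a query over the evolving logic program and preference is declarative; here both are plain callables, purely to show the shape of the cycle.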

6
Prospective Logic Agent Architecture
7
The Trolley Problem (1)
8
The Trolley Problem (2)
9
The Trolley Problem (3)
10
The Trolley Problem (4)
11
The Principle of Double Effect
  • Harming another individual is permissible if it
    is the foreseen consequence of an act that will
    lead to a greater good
  • In contrast, it is impermissible to harm someone
    else as an intended means to a greater good.

12
Modelling Denise Case
  • There is a man standing on the side track.
  • human_on_side_track(1).

13
Modelling Two Possible Decisions (1)
  • Merely watching
  • expect(watching).
  • train_straight <- consider(watching).
  • end(die(5)) <- train_straight.
  • observed_end <- end(X).

14
Modelling Two Possible Decisions (2)
  • Throwing the switch
  • expect(throwing_switch).
  • redirect_train <- consider(throwing_switch).
  • kill(N) <- human_on_side_track(N),
    redirect_train.
  • end(save_men, ni_kill(N)) <- redirect_train,
    kill(N).
  • observed_end <- end(X,Y).
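Assuming five people on the main track and one on the side track (as in the clauses above), the two abducibles and their ends can be rendered as a small Python sketch. This is a hypothetical rendering for illustration; the original rules are logic-programming clauses, not Python.

```python
# Hypothetical Python rendering of the two modelled decisions.
HUMANS_ON_MAIN_TRACK = 5   # end(die(5)) if the train goes straight
HUMANS_ON_SIDE_TRACK = 1   # human_on_side_track(1)

def consequences(decision):
    """Return the observed end for a considered decision."""
    if decision == "watching":
        # train_straight <- consider(watching); end(die(5)) <- train_straight.
        return ("die", HUMANS_ON_MAIN_TRACK)
    if decision == "throwing_switch":
        # redirect_train; kill(N) <- human_on_side_track(N), redirect_train;
        # end(save_men, ni_kill(N)) <- redirect_train, kill(N).
        return ("save_men", ("ni_kill", HUMANS_ON_SIDE_TRACK))
    raise ValueError(f"unexpected decision: {decision}")
```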

15
Modelling the Principle of Double Effect (1)
  • Observe the final end of each possible decision,
    so that decisions can later be morally preferred
    by comparing the greater good among them
  • falsum <- not observed_end.

16
Modelling the Principle of Double Effect (2)
  • Rule out impermissible actions, i.e. actions that
    involve intentional killing in the process of
    reaching the goal.
  • falsum <- intentional_killing.
  • intentional_killing <- end(save_men, i_kill(Y)).
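Read operationally, the integrity constraint discards any candidate model whose end involves intentional killing (`i_kill`), while non-intentional killing (`ni_kill`) survives. A hypothetical Python reading (the names `intentional_killing` and `permissible` mirror the clauses but are not ACORDA code):

```python
# Hypothetical sketch of the integrity-constraint filter.
def intentional_killing(end):
    """True when save_men is reached through an intended kill, i_kill(Y)."""
    return end[0] == "save_men" and end[1][0] == "i_kill"

def permissible(ends):
    """Keep only ends that do not trigger falsum <- intentional_killing."""
    return [e for e in ends if not intentional_killing(e)]
```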

17
Modelling the Principle of Double Effect (3)
  • Preferring, among permissible actions, those
    resulting in the greater good.
  • elim(end(die(N))) <- exists(end(save_men,
    ni_kill(K))), N > K.
  • elim(end(save_men, ni_kill(K))) <-
    exists(end(die(N))), N < K.
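The two `elim/1` rules above each eliminate an end whenever some other permissible end kills fewer people, so the ends with the fewest deaths remain. A hypothetical Python sketch of this a posteriori preference (the `deaths` helper is invented for illustration):

```python
# Hypothetical sketch of the a posteriori preference step:
# keep only the permissible ends with the fewest deaths.
def prefer(ends):
    def deaths(end):
        if end[0] == "die":
            return end[1]        # end(die(N))
        return end[1][1]         # end(save_men, ni_kill(K))
    fewest = min(deaths(e) for e in ends)
    return [e for e in ends if deaths(e) == fewest]
```

On the Denise case this prefers throwing the switch, since one non-intentional death counts as a greater good than five.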

18
Conclusions
  • Used ACORDA to make moral decisions in the
    classic trolley problem
  • Abducibles to model possible moral decisions
  • Compute abductive stable models
  • Capturing abduced decisions along with their
    consequences
  • Integrity constraints
  • Rule out impermissible actions
  • A posteriori preferences
  • Prefer among permissible actions, using utility
    functions

19
Future Work
  • Extend ACORDA concerning a posteriori evaluation
    of choices
  • refinement of morals, utility functions, and
    conditional probabilities
  • once an action is done, ACORDA should receive an
    update with the results of the action
  • ACORDA must tune itself to lessen the chance of
    repeating errors
  • To explore how to express metarules and metamoral
    injunctions
  • To have a framework for generating precompiled
    rules
  • Fast and frugal moral decision making, rather
    than full deliberative moral reasoning every time