1
Information Systems, Architecture and Moral
responsibility
  • A plea for due moral diligence
  • Jeroen van den Hoven (vandenhoven@fwb.eur.nl)
  • Department of Philosophy
  • Erasmus University Rotterdam

2
Outline
  • I. ICT and conceptual confusion
  • II. Risk Society
  • III. Moral Responsibility and Information
    Systems
  • IV. Due Moral Diligence in Architecture, Design
    and Development

3
I. ICT and Conceptual Confusion
  • new practices (new business, new economy, new
    science, new publishing, new libraries, new
    correspondence, new friendship), new
    institutions, and new experiences.
  • Change-enabling vs. change-constitutive
    technology

4
Conceptual Confusion (1)
  • democracy
  • community
  • trust
  • friendship
  • jurisdiction
  • reality
  • life

5
Conceptual Confusion (2)
  • intelligence
  • agent
  • property
  • privacy
  • art
  • person

6
Adding to the confusion (1)
  • tele-democracy
  • cyber-community
  • e-trust
  • net-friendship
  • cyber-jurisdiction
  • virtual reality
  • artificial life

7
Adding to the confusion (2)
  • artificial intelligence
  • software agent
  • intellectual property
  • informational privacy
  • computer art
  • digital person

8
Pseudo-certainty
  • A new sort of democracy
  • A new sort of privacy

9
Vacuum
  • Conceptual vacuum → policy vacuum (Moor, 1985)

10
II. Risk societies (Giddens, Beck)
  • Experiment and rapid innovation
  • Interdependence and tight-coupling
  • Team work and collective agency
  • Global and pluralistic value context

11
Risk societies (2)
  • Manufactured risks
  • Side-effect-driven development
  • Experts are part of the problem, not the solution
  • Normal Accidents (Perrow)

12
Risk societies (3)
  • Chernobyl
  • Bhopal
  • BSE crisis
  • Millennium Bug
  • Internet security

13
Risk Societies (4)
  • Control Dilemma
  • Certainty Trough
  • Adequate conception of Responsibility

14
Standard Responsibility (Ladd)
  • Backward-looking
  • Blame and compensation oriented
  • Exclusive
  • Legal

15
Non-standard Responsibility
  • Anticipating vs Backward-looking
  • Design-oriented vs Blame-oriented
  • Inclusive vs Exclusive
  • Managerial vs Legal

16
III. Moral Responsibility and Information Systems
  • Information Systems and Moral Responsibility?
  • Epistemic Empowerment ↔ Epistemic Enslavement

17
A. Epistemic Empowerment (Goldman, Thagard)
  • Power: the ability to help find true answers to
    interesting questions
  • Fecundity: the ability to lead to a large number
    of true beliefs for many people
  • Speed: how quickly it leads to true answers

18
Epistemic empowerment
  • Efficiency: how well it limits the cost of getting
    true answers
  • Reliability: the ratio of true beliefs to the total
    number of beliefs
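
A quantitative gloss that is not in the original slides: these Goldman-style
measures can in principle be scored for a concrete system. The Python sketch
below assumes a hypothetical evaluation log in which each system output is
marked true or false and annotated with timing and cost; all names and
numbers are illustrative.

  from dataclasses import dataclass

  @dataclass
  class Output:
      """One answer produced by the system during an evaluation run."""
      correct: bool    # did the answer turn out to be true?
      seconds: float   # time the system took to produce it
      cost: float      # resources spent obtaining it

  def veritistic_scores(outputs: list[Output], users_reached: int) -> dict:
      """Score a system on (simplified) Goldman-style epistemic measures."""
      true_outputs = [o for o in outputs if o.correct]
      return {
          # Reliability: ratio of true beliefs to the total number of beliefs
          "reliability": len(true_outputs) / len(outputs),
          # Fecundity: true beliefs spread over the people relying on them
          "fecundity": len(true_outputs) * users_reached,
          # Speed: average time taken to reach a true answer
          "speed": sum(o.seconds for o in true_outputs) / len(true_outputs),
          # Efficiency: true answers obtained per unit of cost
          "efficiency": len(true_outputs) / sum(o.cost for o in outputs),
      }

  # Illustrative run: 3 correct answers out of 4, reaching 50 users.
  log = [Output(True, 2.0, 1.0), Output(True, 1.5, 1.0),
         Output(False, 3.0, 2.0), Output(True, 2.5, 1.0)]
  print(veritistic_scores(log, users_reached=50))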

19
B. Epistemic empowerment's moral backlash
  • Some Cases
  • Mrs. Engle
  • The French Police and the stolen vehicle
  • Therac-25 incidents
  • USS Vincennes

20
Thinking for yourself (1)
  • Kant: every human being ought to think for
    himself
  • John Stuart Mill: our understanding should be
    our own

21
Thinking for yourself (2)
  • Hilary Putnam: the autonomous person can no more
    imagine giving up his capacity to think for
    himself than he can imagine submitting to a
    lobotomy
  • Thomas Scanlon: an autonomous person cannot
    accept without independent consideration the
    judgement of others

22
Argument
  • 1. Narrowly embedded users
  • 2. Limited freedom in the acquisition of their beliefs
  • 3. Epistemic dependence adds up...
  • 4. Epistemic enslavement

23
Narrowly embedded User (1)
  • Information Systems as epistemic artefacts
  • Artefacts do have politics (L. Winner)
  • System environments as artificial epistemic
    niches
  • User-configured

24
Narrowly embedded User (2)
  • inaccessibility/intractability
  • pressure (limited time/pressure to decide)
  • error (errors and inaccuracies, flawed
    world-models, software brittleness, bugs, limits
    of testing and proof, emergent properties)
  • absence of discursive scrutiny

25
Doxastic Voluntarism is false
  • Not entirely free to (decide to) believe

26
Epistemic Dependence
  • Inability to justify one's beliefs personally
  • If I have good reasons to believe that he has
    good reasons to believe it, then I have good
    reasons to believe it.
  • Errors may not always be novice-detectable
  • Inability to find defeating information

27
Epistemic Enslavement
  • Narrow embeddedness
  • Epistemic Dependence
  • Falseness of doxastic voluntarism
  • → Epistemic Enslavement
  • A situation where non-compliance is a form of moral
    risk-taking that the user cannot justify at the
    moment of his non-compliance.

28
C. Moral Empowerment (1)
  • No epistemic empowerment without moral
    empowerment
  • ME1: users have an obligation to evaluate a system
    before they work with it
  • ME2: ICT professionals and management have an
    obligation to enable users to evaluate the
    knowledge environment before they start
    working in it.

29
Moral Empowerment (2)
  • ME1: end-users ought to endorse (or act upon) the
    output of information systems they are
    epistemically dependent upon, and with which they
    know they will be working under conditions of
    narrow embeddedness, only after an inquiry into
    the acceptability of the system, the cost of which
    is proportional to the cost that could reasonably
    be expected if what is acted upon should prove
    inadequate.
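
One illustrative, decision-theoretic reading of ME1's proportionality clause,
which the slide itself does not spell out: the vetting effort a user owes
scales with the expected harm of acting on inadequate output. The Python
sketch below uses purely hypothetical numbers and a hypothetical
proportionality factor; it is a sketch of this reading, not a rule given by
ME1.

  def required_inquiry_budget(p_inadequate: float,
                              harm_if_inadequate: float,
                              proportionality: float = 0.1) -> float:
      """Minimum acceptability-inquiry effort on this reading of ME1:
      proportional to the expected cost of acting on inadequate output."""
      expected_harm = p_inadequate * harm_if_inadequate
      return proportionality * expected_harm

  def may_rely_on_system(inquiry_spent: float,
                         p_inadequate: float,
                         harm_if_inadequate: float) -> bool:
      """True if the user has done at least the proportional inquiry."""
      return inquiry_spent >= required_inquiry_budget(p_inadequate,
                                                      harm_if_inadequate)

  # A clinical decision-support example: 2% chance of inadequate output,
  # severe harm when it is acted upon.
  print(may_rely_on_system(inquiry_spent=500.0,
                           p_inadequate=0.02,
                           harm_if_inadequate=1_000_000.0))  # False: vet more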

30
Moral Empowerment (3)
  • ME2: ICT professionals and system designers ought
    to allow users to work with systems in a way that
    does not make it impossible for them to live up
    to their obligations as users, as specified by ME1.

31
Moral Empowerment (4)
  • Participatory design
  • Value-sensitive design (Friedman)
  • Due Moral Diligence (Van den Hoven)

32
D. Due Moral Diligence
  • Non-standard account of responsibility
  • Fine-grained account of responsibility
  • Account that accommodates moral empowerment
  • Account that is integrated into software
    development methodologies and
  • Accommodated by relevant codes of ethics