Title: Software: The most dangerous artifact in the world?
1. Software: The most dangerous artifact in the world?
- Michael Cavanagh
- Bob Wood
2. What is Software?
Software
3. Software is ...
- Invisible
- Intangible
- Intolerant
- Increasingly complex
- Indispensable
- ... and totally amoral
Which makes it bloody dangerous.
4. The dilemma
"The release of atom power has changed everything except our way of thinking ..."
"If only I had known to what my research would lead, I would have become a watchmaker."
(Albert Einstein)
5. Light the blue touchpaper and stand well clear...
6. Operational States
7. Problems of technology use
- Tobacco
- CFCs
- Credit reporting
- Mobile phones
Problems of technology abuse
- Diamorphine
- Nuclear fission
- Spam
- System intrusion
- Chipping
- Tagging
8. Problems of technology failure
9. Mess
11. The Stakeholders
- Your children's children
- You
- Line management
- Passers-by
- Shareholders
- Users
- Suppliers
- Society
- Me
- Environment
- Regulators
- Employees
- Customers
- The law
12. Probability
[Diagram: responses to an effect on a scale ranging from Extreme Prevention, through Prevent and Promote, to Extreme Promotion]
13. Professionalism
- Model Zero: compliant uniformity
- The 3-P model: Proficient (deftness, skill and agility), Permanent (long practice), Professing (<name of profession> = <me>)
- The 4-P model: Promise-keeping (being in a permanent state of ethical introspection) (Tom DeMarco, 1996)
14. Ethics is ...
- Doing good
- Not doing bad
- Not screwing people
- Only screwing the competition
- Letting the competition screw you
- Doing things right
- Doing the right thing
15. Simple tensions
- Duty (What ought I to do?)
- Consequence (What will achieve my desired goal?)
16. Ethics of Duty: what I ought to do (Deontological)
- Abiding by a code, a process, a command
- "But I was only following orders!"
17. Ethics of Consequence (Teleological / Utilitarian)
- The best outcome for the greatest number
- "How was I to know that would happen?"
18. So can a machine be ethical?
[Diagram: a decision process running from Issue and Goal, through Method, to Consequence and Decision]
Assumptions:
- Ethics can be reduced to a process
- Human behaviour is based on reason
- Actions have uniform, predictable effects
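Taken at face value, those assumptions reduce an ethical choice to a computation: enumerate the candidate actions, predict their consequences, score them, and pick the best. A minimal Python sketch of that reduction follows; the action names, outcome figures and utility weighting are invented for illustration and are not from the presentation.

```python
# A deliberately naive "ethics as a process" sketch: choose the action
# whose predicted consequences score best. Every name and number here is
# hypothetical.

def predicted_consequences(action):
    # Assumes actions have uniform, predictable effects -- the very
    # assumption the slide calls into question.
    outcomes = {
        "ship_now":  {"benefited": 1000, "harmed": 10},
        "delay_fix": {"benefited": 900,  "harmed": 1},
    }
    return outcomes[action]

def utility(consequence):
    # Assumes ethics can be reduced to a single number.
    return consequence["benefited"] - 100 * consequence["harmed"]

def decide(actions):
    # Assumes behaviour is based on reason alone.
    return max(actions, key=lambda a: utility(predicted_consequences(a)))

print(decide(["ship_now", "delay_fix"]))  # -> delay_fix
```

The point of the sketch is how much is buried in the helper functions: the predictions and the weighting are exactly where the three assumptions do their work.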
19. Asimov's three laws of robotics
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey orders from a human being, provided those orders do not conflict with the first law.
3. A robot must protect itself, provided this does not conflict with either of the first two laws.
20. The 0th law
A robot may not injure humanity or, through inaction, allow humanity to come to harm.
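One way to read the four laws together is as a strict precedence order: a lower law only applies when every higher law is satisfied. The rough Python sketch below encodes that ordering; the Action flags are invented for illustration, and deciding whether a real action actually "harms a human", let alone "harms humanity", is the genuinely hard part the laws leave open.

```python
# Hypothetical sketch: Asimov's laws as a lexicographic priority order.
from dataclasses import dataclass

@dataclass
class Action:
    name: str
    harms_humanity: bool = False   # would violate the 0th law
    harms_a_human: bool = False    # would violate the 1st law
    disobeys_order: bool = False   # would violate the 2nd law
    endangers_self: bool = False   # would violate the 3rd law

def choose(candidates: list[Action]) -> Action:
    # Lexicographic comparison encodes the precedence: avoiding harm to
    # humanity outweighs harm to a human, which outweighs disobedience,
    # which outweighs self-preservation.
    return min(candidates, key=lambda a: (a.harms_humanity,
                                          a.harms_a_human,
                                          a.disobeys_order,
                                          a.endangers_self))

# An order that would harm a human loses to disobeying the order.
print(choose([Action("obey", harms_a_human=True),
              Action("refuse", disobeys_order=True)]).name)  # -> refuse
```

Encoding the precedence as a lexicographic comparison means a robot ordered to harm a human prefers disobedience, which is exactly the behaviour the second law's proviso describes.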
21. A Key Governance Process Area? Ethical Software Management
- To establish a process whereby the probability and severity of the effects of use, abuse and failure of the system of which this development is a component are assessed from the viewpoint of every stakeholder, and outstanding risks are managed appropriately.
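One way to make that process concrete is a simple risk register: for each stakeholder and each mode of use, abuse and failure, estimate a probability and a severity, then rank the products so the worst exposures are managed first. The 1-5 scales, stakeholders and entries below are illustrative assumptions, not something the presentation prescribes.

```python
# A minimal sketch of an ethical-risk register, assuming ordinal 1-5
# scales for probability and severity. Entries are illustrative only.
from dataclasses import dataclass

@dataclass
class Risk:
    stakeholder: str    # e.g. "Users", "Passers-by", "Society"
    mode: str           # "use", "abuse" or "failure"
    effect: str
    probability: int    # 1 (rare) .. 5 (almost certain)
    severity: int       # 1 (negligible) .. 5 (catastrophic)

    @property
    def exposure(self) -> int:
        # A crude ranking measure: probability x severity.
        return self.probability * self.severity

register = [
    Risk("Users", "failure", "loss of data", 3, 4),
    Risk("Passers-by", "use", "privacy intrusion", 2, 3),
    Risk("Society", "abuse", "system intrusion", 2, 5),
]

# Outstanding risks, worst exposure first, one line per stakeholder viewpoint.
for risk in sorted(register, key=lambda r: r.exposure, reverse=True):
    print(f"{risk.exposure:>2}  {risk.stakeholder:<12} {risk.mode:<8} {risk.effect}")
```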
22. Attitudes to disaster
- From the dawn of time until a few years ago: an Act of God
- From a few years ago to the foreseeable future: "Who can I sue?"
23. Consumer Protection Act 1987
It is unnecessary to show negligence. The only requirements are that:
- the product was defective
- the defect caused the damage
"... liability is ... imposed on the producer of the product" (DTI guide to the Act)
24. Negligence (1)
In defence, the burden is on the manufacturer or designer to show that they took reasonable care:
- "... reasonable efforts ..." (standards, practices)
- "... the state of the art defence ..." (development risks)
25. Negligence (2)
"A design which departs substantially from relevant engineering codes is prima facie a faulty design ..."
26. The Hippocratic oath
- I swear to fulfil, to the best of my ability and judgment, this covenant:
- I will respect the hard-won scientific gains of those physicians in whose steps I walk, and gladly share such knowledge as is mine with those who are to follow.
- I will apply, for the benefit of the sick, all measures which are required, avoiding those twin traps of overtreatment and therapeutic nihilism.
- I will remember that there is art to medicine as well as science, and that warmth, sympathy, and understanding may outweigh the surgeon's knife or the chemist's drug.
- I will not be ashamed to say "I know not," nor will I fail to call in my colleagues when the skills of another are needed for a patient's recovery.
- I will respect the privacy of my patients, for their problems are not disclosed to me that the world may know. Most especially must I tread with care in matters of life and death. If it is given me to save a life, all thanks. But it may also be within my power to take a life; this awesome responsibility must be faced with great humbleness and awareness of my own frailty. Above all, I must not play at God.
- I will remember that I do not treat a fever chart, a cancerous growth, but a sick human being, whose illness may affect the person's family and economic stability. My responsibility includes these related problems, if I am to care adequately for the sick.
- I will prevent disease whenever I can, for prevention is preferable to cure.
- I will remember that I remain a member of society, with special obligations to all my fellow human beings, those sound of mind and body as well as the infirm.
- If I do not violate this oath, may I enjoy life and art, respected while I live and remembered with affection thereafter. May I always act so as to preserve the finest traditions of my calling and may I long experience the joy of healing those who seek my help.
Written in 1964 by Louis Lasagna, Academic Dean of the School of Medicine at Tufts University, and used in many medical schools today.
27. An oath for Scientists, Engineers and Executives
- I vow to practice my profession with conscience and dignity.
- I will strive to apply my skills only with the utmost respect for the well-being of humanity, the earth and all its species.
- I will not permit considerations of nationality, politics, prejudice or material advancement to intervene between my work and this duty to present and future generations.
28. Professional ethics in Software Engineering
- First, do no harm
- Do things on purpose, with a purpose
- Be competent
- Seek always to improve
- ... and make a contribution ...