1
Course - DT249/1
  • Subject - Information Systems in Organisations

Semester 1, Week 8
BUILDING INFORMATION SYSTEMS
2
Textbooks?
  • The Laudon and Laudon book, Management
    Information Systems (Seventh Edition): Chapters
    10 and 11 match this lecture the closest, though
    I have gone off at a tangent of my own a bit!

3
Where/When Does the Build Begin?
  • The building of an Information System could be
    viewed as occurring throughout the life cycle,
    but the main build, the one that has the
    appearance of a project, is in the latter part
    of the Design Phase and throughout the
    Implementation Phase.
  • Building assumes that most, if not all, of the
    design issues have been agreed upon between the
    client and the vendor.

4
System Build: Where It Fits In
The build

Implementation is building upon the Design to
further develop the selected solution
5
From Design to Implementation
  • Let us take a look at the typical features of
    Design and Implementation phases to see how the
    building of an Information System spreads itself
    as an extended process.
  • Later in the lecture we can view this part of the
    Life Cycle as a kind of managed project.

6
Systems Design
  • The Systems Design Phase
  • Details how the system will meet information
    requirements as determined by the procedure of
    Systems Analysis
  • May involve the user interface and output design
    for usability
  • Deals with configuration and architecture issues
    related to the system for scalability,
    reusability, and performance (measurement)

7
Systems Design (2)
  • The Systems Design Phase usually has many items
    of documentation.
  • Typical and important documents are the
    Specifications for the system solution.
  • The Design Phase should clearly reflect the
    client's business priorities and information
    needs.

8
Between Design and Implementation
  • Programming and Testing
  • Programming
  • The process of translating System Specifications
    into program code.
  • There are many different languages. More than
    one may be needed for coding the complete
    Specification. Breaks down into logic of linear
    processes, logical decisions, repetition, and
    calls to other programs.
  • Integration is needed and may be required at
    module, application, system, intra and
    inter-system levels via middleware and calls to
    programs/modules.
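By way of illustration only (this example is not from the lecture; the function names and figures are invented), a few lines of Python can show all four kinds of logic mentioned above: a linear sequence, a logical decision, repetition, and a call to another module.

```python
# Illustrative only: a tiny routine showing sequence, selection,
# iteration and a call to another module.

def apply_discount(total):
    """Called module: a logical decision (selection)."""
    if total > 100:                    # selection
        return total * 0.9
    return total

def invoice_total(prices):
    """Calling module: sequence, iteration and a call."""
    total = 0                          # sequence: simple linear steps
    for price in prices:               # repetition (iteration)
        total += price
    return apply_discount(total)       # call to another program/module

print(invoice_total([40, 70, 10]))     # 108.0 after the 10% discount
```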

9
Between Design and Implementation (2)
  • Programming and Testing
  • Testing
  • Checks whether the system of programs/modules
    produces the desired results under known
    conditions.
  • Is a serious part of the development of a system
    and good test cases are required.
  • Testing includes unit testing (testing
    individual modules), system testing (testing all
    programs together) and acceptance testing
    (demonstrating all programs for client
    agreement).
  • There is often a document showing a test plan.

10
Between Design and Implementation (3)
  • System Conversion
  • The process of changing from the old system to
    the new system.
  • Four possible installation choices
  • Firm-wide rollout versus single location
    installation rollout (Example: all Currys stores
    vs. one Currys store at a time)
  • Entire application installation versus phased
    installation (Example: all software loaded vs.
    one application at a time)
  • Direct installation vs. parallel installation
    (Running without the old system vs. using new and
    old together until the new is trusted)
  • Insource versus Outsource (Internal IT expertise
    vs. hired-in IT expertise)

11
Between Design and Implementation (4)
  • System Conversion
  • Possible strategies for conversion
  • Direct cutover (As in 1, 2 and 3 of previous
    slide)
  • Parallel (As in 1, 2 and 3 of previous slide)
  • Pilot study (As in 1 and 2 of previous slide)
  • Phased approach (As in 1 and 2 of previous slide)
  • Documentation-driven (where no previous systems
    exist, relying on documentation to test and
    compare)

12
  • Four installation choices
  • Firm-wide rollout vs. single location
    installation rollout
  • Entire application installation vs. phased
    installation
  • Direct installation vs. parallel installation
  • Insource vs. Outsource (usually transitional)

Source: Valacich et al., 2001
13
Between Design and Implementation (5)
  • Production and Maintenance
  • Production is the stage after the new system is
    installed and the conversion is complete.
    Basically, it is the system in use.
  • Maintenance is a series of changes to hardware,
    software, documentation or procedures made to
    correct errors in the system(s).
  • Maintenance can be seen as starting the process
    over again but on a different scale.

14
A Mid-Session Summary
  • Evaluating and choosing Information Systems
    solutions involves the processes of
  • Feasibility issues examined
  • Costs and benefits examined
  • Advantages and disadvantages identified
  • Business value of systems identified
  • Change management planned (More on this later)

15
A Mid-Session Summary (2)
  • Developing an Information Systems solution looks
    a lot like this

16
A Mid-Session Summary (3)
  • Implementing the solution involves the processes of
  • Systems design
  • Completing implementation
  • Hardware selection and acquisition
  • Software development and programming
  • Testing
  • Training and documentation
  • Conversion
  • Production and maintenance
  • Managing the change

17
Managing the Build as a Project
  • As we work through the Design Phase agreements,
    take their product (the Specifications) and apply
    those Specs to program-writing, moving into the
    Implementation Phase requires management of a
    specific kind.
  • Building of the Information System can be viewed
    as a project and treated as such.

18
Who's Who in Project Management
  • The people associated with this broad level of
    management include
  • Users
  • Organisational Management
  • Auditors, Quality Assurance personnel and
    standard bearers, those who represent the
    products or services.

19
Who's Who (2)
  • The people visible during the coding of
    Specifications are typically
  • Systems Analysts
  • Systems Designers
  • Programmers
  • Web Designers (in the case of Internet
    applications)
  • Operations Personnel (End users)

20
Issues in Project Management
  • Establish the Scope of the Project
  • This usually means making a statement on
  • Key deliverables
  • Timescales and budgets
  • Performance standards
  • Documentation standards
  • Verification, validation and testing
  • Keeping strictly within the agreed scope to avoid
    scope creep

21
Issues in Project Management (2)
  • Managing Change
  • Change, as an issue in Information Systems
    development, has several connotations, and each is
    dealt with differently at different times. The
    two variations that a Systems Developer may need
    to deal with are
  • changes to the workplace (environmental)
  • changes to the systems development project
    (administrative).
  • We will look at the idea of changing the project
    on this occasion.
  • / Continued

22
Issues in Project Management (3)
  • Change, then, in the current context includes
    modifications to the approved project baseline.
  • In this case, change request procedures are
    established
  • Any change request is documented so that its
    exact nature is understood by all.
  • Investigate the impact of each change request on
    timescales, budgets, quality, resources and end
    users.
  • Seek management approval of change requests.

23
Issues in Project Management (4)
  • Managing Risk
  • Types of risk
  • Budget overrun,
  • Time overrun,
  • Performance requirements not being met,
  • Incompatibility with the existing or proposed
    environment(s),
  • Business needs not being met,
  • Business and technical benefits not achieved (at
    the implementation stage),
  • Resource or skill deficiency becoming evident.

24
Issues in Project Management (5)
  • Strategies for managing risk
  • Have clear objectives and a well defined scope
    (This may need client agreement.)
  • Monitor potential problem areas carefully,
  • Ensure training and education is adequate,
  • Partition design and development work into
    manageable chunks (Consider life cycle stages.)
  • Involve the users in the project,
  • Select the appropriate techniques and tools.

25
Issues in Project Management (6)
  • Managing Quality
  • Managing quality is concerned with the
    prevention and detection of defects (mainly in
    the software, in this context).
  • A quality plan for software might ask
  • What standards are applicable?
  • What inspections will take place and when?
  • Who will be involved in the inspections?
  • What techniques will be used in the inspections?
  • What procedures will be applied at each type of
    inspection?
  • What standards will be applied to products and
    deliverables?
  • / Continued

26
Issues in Project Management (7)
  • Four levels of review
  • Project team reviews
  • Product reviews
  • Management reviews and approvals
  • Quality assurance reviews

27
Four Levels
  • The Who's Who list from earlier could be viewed
    as a list of people who might become members of
    temporary committees and groups that will help
    coordinate the development of a complex
    (expensive) system under development.
  • Members of one committee or group can be members
    of other committees or groups.

28
Four Levels (2)
  • The Information System Steering Group/Committee
  • A Steering Group may be a permanent coordinating
    mechanism at corporate level, responsible for
    establishing and implementing company policy.
  • Members usually include heads of all key business
    functions and the head of Information Technology.
    Normally chaired by a Board member or similar
    high-ranking individual.

29
Four Levels (3)
  • The Project Steering Committee
  • Responsible for the management of one or more
    specific projects.
  • Established, where necessary, by the Steering
    Group.
  • Should consist of representatives of the business
    units affected by the projects and of IT
    specialists.

30
Four Levels (4)
  • The Project Team itself
  • A group responsible for the actual work in the
    project.
  • Headed by a Project Manager who reports to the
    Steering Committee on/for the project.
  • This leader is normally a senior member of the
    Information Technology staff.

31
Four Levels (5)
  • The Quality Assurance Group
  • A permanent group responsible for assuring the
    quality of all systems development.
  • One member of this group will normally be
    seconded to each project.

32
Four Levels (6)
  • The Information Systems Steering Group
  • A group concerned with the business and strategic
    issues arising from Information Systems and
    setting the technical direction from an
    organisational perspective.
  • Ensures that all IS projects are coordinated and
    properly carried out.
  • Responsible for formalising and publishing the
    Strategic IS/IT provision (documentation).
  • Initiates reviews of the Strategic IS/IT usage.
  • Sets priorities that provide the business
    direction of IT.
    /Continued

33
Four Levels (7)
  • Responsible for Information Technology (IT)
    strategy.
  • Ensures that IT runs as an 'organisation within
    an organisation'.
  • Provides a forum for the exchanging of ideas,
    concerns and experiences.
  • Sets priorities for
  • Benefits - what is most important to do?
  • Resources - what can be done?
  • Risks - What is likely to succeed?

34
Four Levels (8)
  • The Project Steering Committee
  • Initiates formal communication procedures between
    groups involved in the project.
  • Helps ensure that all planned deliverables are on
    time and within budget.
  • Reviews and approves all project plans.
  • Authorises commitment of resources.
  • Approves or disapproves continuance of the project.
  • Evaluates the post-implementation review.

35
The Problems of Steering Committees
  • They may have difficulty in reaching decisions.
  • They may have an inability to communicate
    decisions in a clear and unambiguous manner.
  • They are increasingly involved in administrative
    detail.
  • They may show a lack of communication between
    themselves and systems groups.
  • Poor attendance may occur where committee members
    fail to attend or to send representatives to
    meetings.
  • They may have unrepresentative members (due to
    laziness or disinterest).
  • Sometimes there is excessive involvement in
    minutiae.

36
Manager of a Project
  • The Project Manager
  • Defines and reviews deliverables,
  • Estimates resources,
  • Plans, schedules and tracks tasks,
  • Assesses risk,
  • Resolves issues.

37
Manager of a Project (2)
  • Problems with poor project management
  • 60% of projects overrun budget, timescales or
    both,
  • 33% were abandoned due to rising costs,
  • 52% had timescales extended,
  • 15% had more people assigned to the project.
  • (From a 1989 survey)

38
The Quality Assurance Group
  • Process reviews done by this group ask
  • What went wrong and why?
  • How could it have been avoided?
  • What lessons can be learned for the future?
  • What went right?
  • How were new problems and challenges overcome?

39
The Quality Assurance Group (2)
  • Total Quality Management (TQM)
  • Ensures quality is a planned rather than
    coincidental feature.
  • Needs full management commitment to quality.
  • Needs well-trained and highly motivated staff.
  • Sends a positive signal to the outside world.

40
From Management to Programming
  • Let us examine some of the issues related to the
    sub-phases of Programming and Testing.
  • The next part of the lecture tries to keep to the
    themes of Design and Implementation Phases as
    well as Project Management.

41
Coding Practices
  • A good software program
  • works according to specification and is
    verifiable
  • is well commented
  • is written in an appropriate language
  • has a simple design
  • is modular, with independence
  • uses only sequence, selection and iteration
  • is independent of specific hardware constraints
  • is efficient
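As a rough, invented sketch of several of the practices above (small independent modules, brief comments, and behaviour that can be checked against its specification), consider:

```python
# Illustrative sketch only, not from the lecture: independent modules,
# brief comments, and behaviour verifiable against a written spec.

def mean(values):
    """Spec: return the arithmetic mean of a non-empty list."""
    return sum(values) / len(values)

def grade(score):
    """Spec: 'pass' for scores of 40 or more, otherwise 'fail'."""
    return "pass" if score >= 40 else "fail"

def report(scores):
    """Combine the independent modules; no hidden shared state."""
    avg = mean(scores)                 # simple linear sequence
    return {"average": avg, "result": grade(avg)}

# Verifiable against the specification with known values.
assert grade(40) == "pass" and grade(39) == "fail"
print(report([35, 55, 60]))            # {'average': 50.0, 'result': 'pass'}
```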

42
Coding Practices (2)
  • Examples of good commenting (comments
    accompanying the instructions)
  • 4 - 5 lines per module (subroutine/section)
  • 1 line per 4 - 5 lines of code.
  • Assembler programs should have almost one comment
    per line
  • 4GLs do not need much commenting.
  • Comments should be brief and to the point.
  • Data and module names should also be brief and to
    the point.

43
Coding Practices (3)
  • Pitfalls in code commenting
  • Redundant commenting,
  • Obsolete comments,
  • Incorrect comments,
  • Vague comments,
  • Correct, but incomplete comments,
  • Incomprehensible comments.
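A tiny, invented contrast may help: the first comment below is redundant because it merely restates the code, while the second records the programmer's intent.

```python
count = 0

# Redundant comment: it only restates the statement.
count = count + 1   # add 1 to count

# Useful comment: it records the intent behind the statement.
count = count + 1   # another order line processed for this customer
```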

44
Design Options
  • When designing software programs there are two
    main options: Top-Down Design and Bottom-Up
    Design.
  • Top-Down is probably the most popular and begins
    with a general view, moving to more detailed
    views that become comparable to code modules.
  • Bottom-Up begins with finer-detailed program
    design options and tries to fit them to broader
    solutions.

45
Top-Down Design
  • Top-Down Design practice includes
  • Formal and rigorous specification of input,
    processing and output of each module.
  • When the module is properly specified, disregard
    the internal workings.
  • Keep away from trivialities.
  • Each level of design should be expressible on a
    single page of flowchart.
  • Pay as much attention to data design as to
    process / algorithm design.

46
Top-Down Structure Diagrams
HIPO Diagrams
HIPO stands for Hierarchical Input Process Output
47
Top-Down Structure Diagrams (2)
  • For each module do an Input-Process-Output (IPO)
    chart

(IPO chart: boxes for Input, Processing and Output.)
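One possible way to carry an IPO chart into the code itself is as a module header; the module and its record layout below are invented for illustration.

```python
def reorder_report(stock_records):
    """IPO chart for this module (illustrative only).

    Input:      list of (item_code, quantity_on_hand, reorder_level)
    Processing: select records whose quantity is below the reorder level
    Output:     list of item codes that need to be reordered
    """
    return [code for code, qty, level in stock_records if qty < level]

print(reorder_report([("A1", 3, 5), ("B2", 9, 5)]))   # ['A1']
```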
48
Top-Down Coding
  • As a level is specified, the coding is done for
    that level, before subordinate levels are
    specified.
  • Design flaws discovered early on.
  • 'Dummy' modules (stubs) must be inserted to allow
    the program to run.
  • Some modules will take precedence over others
  • a processing module cannot run without the input
    module being written and the results cannot be
    seen without the output module.
  • Arrange modules in the program in an organised
    fashion, i.e. either horizontally or vertically.
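A minimal, invented Python sketch of this idea: the top-level module is coded and runnable first, with 'dummy' stub modules standing in for the subordinate levels.

```python
# Illustrative sketch of top-down coding with dummy (stub) modules:
# the top level runs end to end before the lower levels are written.

def read_input():
    """Stub: returns fixed data until the real input module exists."""
    return [12, 7, 19]

def process(data):
    """Stub: to be replaced by the real processing module."""
    return data                        # placeholder behaviour only

def write_output(results):
    """Stub: prints instead of producing the real report."""
    print("results:", results)

def main():
    """Top-level module, specified and coded first."""
    write_output(process(read_input()))

main()   # runs even though the subordinate levels are only stubs
```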

49
The Advantages of Modularity
  • When writing code as modules there are
    advantages.
  • Code is
  • Easier to write and debug.
  • Easier to maintain and change.
  • Easier for a manager to control (e.g. as regards
    delegating programming tasks to programmers of
    varying abilities).

50
The Disadvantages of Modularity
  • Coding modules has disadvantages
  • A lot of programmers do not understand it.
  • It requires a great deal of extra work.
  • It may require more processing time on the
    computer.
  • It may require more main memory.
  • It may cause problems in real-time and on-line
    systems.
  • Modules normally working together should be on
    the same page.

51
Testing Levels
  1. Module (unit/program) Testing.
  2. Subsystem Testing (Groups of modules but not the
    whole system).
  3. Integration Testing (The whole system).
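For example, a module (unit) test checks one module against known conditions and desired results. The VAT function and figures below are invented; the test uses Python's standard unittest module.

```python
import unittest

def vat_inclusive(price, rate=0.23):
    """Module under test: add VAT at the given rate (invented example)."""
    return round(price * (1 + rate), 2)

class VatModuleTest(unittest.TestCase):
    """Module (unit) test: known conditions, desired results."""

    def test_standard_rate(self):
        self.assertEqual(vat_inclusive(100), 123.0)

    def test_zero_price(self):
        self.assertEqual(vat_inclusive(0), 0.0)

if __name__ == "__main__":
    unittest.main()
```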

52
Testing Methods
  • Black Box Testing - (Functional Testing)
  • Top-Down testing
  • Knowing the specific functions a system is
    required to perform, tests are developed to
    demonstrate that each function is working
    correctly.

53
Testing Methods (2)
  • White Box Testing - (Structure Testing)
  • Bottom-Up testing
  • Knowing the internal workings of the system,
    tests are developed to ensure that each program
    is working according to its specification.

54
Black Box
  • Use dummy modules to represent the lower echelons

(Diagram: a Main module calling dummy modules A, B and C.)
55
Black Box (2)
  • The system is considered as a black box whose
    behaviour is determined by examining inputs and
    related outputs.
  • Tests are designed to demonstrate that
  • system functions are operational
  • input is correctly accepted
  • output is correctly produced
  • the integrity of the system files/databases is
    maintained
  • Test cases are derived from the requirements
    documentation.
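A small, invented example of the black-box idea: the test cases, including the boundary value, come straight from the (assumed) requirement, with no reference to how the function is written.

```python
# Illustrative black-box test: cases are derived from the requirements
# document (invented here), not from the code's internal structure.

def delivery_charge(order_value):
    """Requirement (invented): orders of 50 or more ship free,
    otherwise a flat charge of 4.99 applies."""
    return 0.0 if order_value >= 50 else 4.99

# (input, expected output) pairs taken straight from the requirement,
# including the boundary value.
cases = [(49.99, 4.99), (50.00, 0.0), (120.00, 0.0)]
for order_value, expected in cases:
    assert delivery_charge(order_value) == expected
print("all black-box cases passed")
```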

56
Black Box (3)
  • Black box tests attempt to find software errors
    such as
  • incorrect or missing functions
  • interface errors
  • errors in data structures or external file or
    database access
  • performance errors
  • initialisation and termination errors

57
Black Box (4)
  • The benefits of Top-Down testing
  • System testing ought to be eliminated
  • Major interfaces are tested first
  • Prototyping is enabled
  • A usable subset is available before the deadline
  • Testing is evenly distributed
  • Quicker results
  • It creates a natural test harness

58
White Box

59
White Box Testing
  • Using White Box Testing the software engineer can
    design test cases that check
  • Path - Guarantee that all independent program
    paths within a module have been exercised at
    least once
  • Condition - Exercise all logical decisions on
    their true and false sides
  • Loop - Execute all loops at their boundaries
  • Data Structures - Exercise internal data
    structures to ensure their validity
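A brief, invented sketch of white-box case selection: the tests below are chosen from the code's structure so that both sides of the decision and the loop boundaries (zero, one and many iterations) are exercised.

```python
# Illustrative white-box tests: cases follow the internal structure.

def first_negative(values):
    for v in values:          # loop under test
        if v < 0:             # condition under test
            return v          # true side of the decision
    return None               # false side / loop exhausted

assert first_negative([]) is None           # loop: zero iterations
assert first_negative([-3]) == -3           # one iteration, true branch
assert first_negative([4, 5, 6]) is None    # many iterations, false branch
assert first_negative([4, -1, -9]) == -1    # mixed path through the loop
print("all white-box cases passed")
```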

60
White Box Testing (2)
  • Bottom-Up Testing might be needed
  • for testing a module for insertion into a
    Top-Down structure.
  • To rigorously test a module where the surrounding
    program environment cannot,
  • i.e. the module cannot be adequately tested as
    part of the whole software system.
  • To allow for an imperfect Top-Down implementation
    (e.g. where the design is flawed or inefficient).

61
Desk Checking
  • Desk checking is when a programmer checks the
    program logic by reading through a print-out.
    General errors found include
  • Failure to follow specification
  • Commenting errors
  • Quality (standards) errors
  • Fitting-in (being able to run on the hardware)
  • Logic errors (sequence, selection, iteration
    logic errors)

62
Structured Walkthrough
  • This is a presentation of a program to a group,
    which may include other programmers on the
    project, the project leader or manager and maybe
    a user.
  • All are issued with a listing of the program
    specification, coding, test data and results, a
    day or two before the meeting.

63
Structured Walkthrough (2)
  • The purpose of the walkthrough is to provide a
    non-aggressive evaluation of the program, with
    regard to its 'goodness' as described earlier.
  • The programmer receives advice on where the
    program contains errors. It is the programmer's
    responsibility to correct any errors uncovered
    and to hold another walkthrough. The idea of
    the walkthrough is that responsibility for the
    'goodness' of the program is shared.

64
Evaluating Test Results
  • Test results can be -
  • output files
  • reports
  • screens
  • updated data on a database
  • To check them they must be printed, browsable or
    compared with expected results. Differences
    should be printed or otherwise recorded.
  • If differences exist, a storage dump may be
    produced. This is difficult to use, stops the
    test run and generally signifies serious trouble.
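As an invented sketch of this comparison step (the file names are assumptions, not part of the lecture), differences between actual and expected output can be recorded like this:

```python
# Illustrative sketch: compare actual test output with expected output
# line by line and record any differences.

import difflib

with open("expected_report.txt") as f:
    expected = f.readlines()
with open("actual_report.txt") as f:
    actual = f.readlines()

differences = list(difflib.unified_diff(expected, actual,
                                        fromfile="expected",
                                        tofile="actual"))
if differences:
    with open("test_differences.txt", "w") as log:
        log.writelines(differences)   # record differences for review
    print(f"{len(differences)} difference lines recorded")
else:
    print("actual output matches expected output")
```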

65
Utilities
  • The utilities used in code testing (which may
    themselves be programs or system software) include
  • Traces
  • Core dumps
  • Snapshots
  • Desk checking
  • Test data loader
  • Test data generator
  • Transaction capture facility
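For instance, a test data generator might look like the invented sketch below, producing repeatable records that include boundary values such as zero quantities.

```python
# Illustrative sketch of a simple test data generator (one of the
# utilities listed above); the record layout is invented.

import csv
import random

def generate_test_orders(path, n=100, seed=1):
    """Write n pseudo-random order records, including boundary values."""
    random.seed(seed)                  # repeatable test data
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["order_id", "quantity", "unit_price"])
        for order_id in range(1, n + 1):
            quantity = random.choice([0, 1, 99, random.randint(2, 98)])
            unit_price = round(random.uniform(0.01, 500.00), 2)
            writer.writerow([order_id, quantity, unit_price])

generate_test_orders("test_orders.csv")
```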

66
Other Testing Methods
  • Static program analysis (isolating programs and
    inspecting them line by line)
  • Dynamic program analyser (This may be a testing
    program run over the module under test)
  • Mathematical proofs (Using comparative formulae)
  • Seeded bugs (Placing bugs, which are erroneous
    instructions or data, into a program to see how
    many are detected. Example: placing 10 with 6
    found indicates a 40% detection failure rate, as
    worked through below)
  • Cleanroom approach (Where the code is tested as
    it is written so that no errors creep in)
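The seeded-bug arithmetic works out as follows (same figures as the slide):

```python
# Worked example of the seeded-bug arithmetic from the slide:
# 10 bugs seeded, 6 found, so 40% of seeded bugs went undetected.

seeded = 10
found = 6
detection_rate = found / seeded        # 0.6 -> 60% detected
failure_rate = 1 - detection_rate      # 0.4 -> 40% missed
print(f"detected {detection_rate:.0%}, missed {failure_rate:.0%}")
```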

67
Testing Principles
  • All tests should be traceable to client
    requirements. (A quality quotient should be
    met).
  • Tests should be planned long before testing
    begins.
  • 80% of all errors uncovered during testing will
    likely be traceable to 20% of all program
    modules.

68
Testing Principles (2)
  • Testing should begin in the small and progress
    toward testing in the large.
  • Exhaustive testing is not possible.
  • To be most effective, testing should be conducted
    by an independent third party.

69
Characteristics of Testable Software
  • Operability
  • Observability
  • Controllability
  • Decomposability
  • Simplicity
  • Stability
  • Understandability

70
What Next?
  • That's it for Building Information Systems.
  • Next week
  • Managing Information Systems operation
  • along with
  • Introduction to Information Systems security