Abstract Reasoning of Programs, Part 2
1
Abstract Reasoning of Programs, Part 2
  • Francis Tang
  • March 21, 2003

2
References
  • [CLRS] Cormen, Leiserson, Rivest and Stein, Introduction to Algorithms, 2nd Ed., MIT Press, 2001.
  • [Gusfield] Gusfield, Algorithms on Strings, Trees, and Sequences: Computer Science and Computational Biology, CUP, 1997. (The BII Library has a copy of this book.)

3
Revision: Motivation
  • Want to apply rigour to programming
  • Need a feasible technique for reasoning about complex algorithms
  • Abstraction is what makes reasoning about complex algorithms feasible
  • The technique should be language independent

4
Objectives
  • Demonstrate loop invariant arguments by example
  • Provide you with the foundations to be able to
    reason about while-loop algorithms/programs
  • Provide you with enough background to be able to
    understand other people's reasoning
  • Show you how abstraction can give simplifications

5
Non-objectives (sorry)
  • Teach you about programming in Java/C/C++/Perl
  • Teach you about Operating Systems

6
Revision: Loop Invariant
  • while bexp
  • do
  • cmd
  • od
  • A loop invariant is a logical formula, INV, describing the machine state, such that
  • INV is true before the loop, and
  • if bexp is true and INV is true, then after executing cmd, INV is still true.
  • If INV is a loop invariant, then after the loop we know that bexp is false and INV is true (a small runnable sketch follows this list).
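
A minimal sketch (not from the slides) of the three obligations above, using a simple running-sum loop; the function name sum_list and the invariant "total equals the sum of xs[0..i-1]" are just illustrative choices.

    def sum_list(xs):
        total, i = 0, 0
        assert total == sum(xs[:i])        # INV holds before the loop
        while i != len(xs):                # bexp
            total += xs[i]                 # cmd ...
            i += 1
            assert total == sum(xs[:i])    # ... re-establishes INV
        # Here bexp is false (i == len(xs)) and INV holds, so total == sum(xs).
        return total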

7
Analogy: Proof by Induction
  • Q: How do you prove that a statement P(n) holds for all natural numbers n?
  • A: Proof by induction
  • Prove it for n = 0
  • Prove that if it is true for n = k, then it is true for n = k+1
  • (A tiny worked instance follows.)
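
A small worked instance (not from the slides) of the two steps, for the claim that 0 + 1 + ... + n = n(n+1)/2: it holds for n = 0, since both sides are 0; and if it holds for n = k, then adding k+1 to both sides gives 0 + 1 + ... + k + (k+1) = k(k+1)/2 + (k+1) = (k+1)(k+2)/2, which is exactly the claim for n = k+1.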

8
Reminder: Fast Exponentiation
  • x = a; z = 1; n = N
  • while n != 0
  • do
  • if even(n)
  • then x = x*x; n = n/2
  • else z = z*x; n = n-1
  • od
  • Invariant: z * x^n = a^N
  • An invariant argument allowed us to program this tricky algorithm correctly (a runnable sketch follows this list)
  • Aside: why is this algorithm faster than the naive algorithm? (Hint: consider the rate at which n decreases)
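
A minimal sketch (not from the slides) of the loop above, with the invariant z * x^n = a^N asserted before the loop and after every iteration; the function name fast_power is just illustrative.

    def fast_power(a, N):
        x, z, n = a, 1, N
        assert z * x**n == a**N            # invariant holds before the loop
        while n != 0:
            if n % 2 == 0:                 # even(n)
                x, n = x * x, n // 2
            else:
                z, n = z * x, n - 1
            assert z * x**n == a**N        # invariant maintained by the body
        return z                           # n == 0, so z == a**N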

9
Example: Array Search
  • Suppose we have an array of numbers
  • a[0] gives you the first entry
  • a[1] gives you the second entry
  • Suppose this array is sorted
  • For indices i, j, we have i < j implies a[i] <= a[j]
  • We assume that this array contains N entries. That is, a[0], a[1], ..., a[N-1] are all valid array entries.
  • The problem is: given a number x, and assuming there is j such that a[j] = x, find j
  • The naive algorithm checks x against a[0]; if that fails, it checks against a[1], and so on (a sketch follows this list)
  • The binary search algorithm can do this faster by using the sortedness assumption
  • Can be generalised to other random-access data structures and other order relations
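
A minimal sketch (not from the slides) of the naive algorithm, assuming x does occur somewhere in the list a; the name naive_search is just illustrative.

    def naive_search(a, x):
        j = 0
        while a[j] != x:       # safe only because some a[j] == x is assumed
            j += 1
        return j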

10
Binary Chop
  • The idea is as follows
  • Compare x against an element in the middle of the
    array
  • From this we know whether x is stored above or
    below this middle element
  • Repeat this for the top/bottom half of the array,
    and so on

[Figure: x is tested against a middle element of the array]
11
An example
12
Programming the Binary Chop
  • Despite its simplicity, programming the Binary
    Chop correctly requires care
  • The initialisation of variables must be done
    carefully
  • We must make sure we don't accidentally miss out any elements in our search
  • Let's use a loop invariant to help us write the program
  • The idea is this
  • We define a window that contains the element we
    are looking for
  • On each iteration step we narrow the window
  • When the window contains exactly one element,
    then that must be our answer

13
An explicit invariant
  • Assume that there is j such that a[j] = x.
  • Define the window using two index variables, u and l.
  • Define the invariant as follows: there is j such that u <= j < l and a[j] = x
  • u = 0; l = N
  • while (l - u) != 1
  • do
  • m = (u + l) / 2
  • if a[m] <= x
  • then u = m
  • else l = m
  • od
  • (A runnable sketch of this loop follows.)
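
A minimal sketch (not from the slides) of the binary chop above, assuming x occurs in the sorted list a; the window invariant is asserted on every iteration, and the name binary_chop is just illustrative.

    def binary_chop(a, x):
        u, l = 0, len(a)                                   # window is a[u .. l-1]
        assert any(a[j] == x for j in range(u, l))         # invariant before the loop
        while l - u != 1:
            m = (u + l) // 2
            if a[m] <= x:
                u = m                                      # some occurrence of x lies at index >= m
            else:
                l = m                                      # every occurrence of x lies below index m
            assert any(a[j] == x for j in range(u, l))     # invariant maintained
        return u                                           # one-element window, so a[u] == x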

14
Exercise
  • Check that the invariant defined on the
    previous slide really is an invariant for the
    loop defined on the same slide.
  • (The assumption is important!)
  • Extended exercise: modify the invariant so that, when the program terminates, a[u] = x even without the assumption. (This is a bit tricky.)

15
Some Graph Definitions
  • An (undirected) edge-weighted graph G is given by
  • a set of nodes V,
  • a set of edges E (a symmetric relation over V), and
  • a weight function w mapping each edge to a (real-valued) weight
  • Such a graph is said to be connected if there is a path between any two nodes
  • The weight of a graph is the sum of the weights of its edges (a small representation sketch follows this list)
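
One possible representation, as a minimal sketch (not from the slides): the node set, the weight function stored as a dict keyed by undirected edges, and the graph weight as the sum of the edge weights. The example values are invented.

    nodes = {"a", "b", "c", "d"}
    weight = {
        frozenset({"a", "b"}): 1.0,        # w(a,b) = w(b,a): edges are symmetric
        frozenset({"b", "c"}): 2.5,
        frozenset({"c", "d"}): 0.5,
        frozenset({"a", "c"}): 3.0,
    }
    graph_weight = sum(weight.values())    # the weight of the graph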

16
Minimum Spanning Tree (MST)
  • [CLRS] MSTs can be used to solve wiring problems in circuit design.
  • [Gusfield] MST algorithms are closely related to iterative pairwise alignment, which gives approximate solutions to Multiple Sequence Alignment.
  • [Gusfield] MSTs can be used to solve the Weighted Steiner Tree Problem on Hypercubes to within a factor of less than two.

17
Trees
  • A tree is a connected acyclic graph
  • For any graph G, a subset of its edges determines
    a subgraph
  • A spanning tree is a subgraph that is a tree
  • A Minimum Spanning Tree (MST) is a spanning tree
    of least weight
  • Two well-known algorithms for finding an MST are those of Kruskal and Prim

A nice online resource: http://students.ceid.upatras.gr/papagel/project/kef5_4.htm
18
Kruskal's Algorithm
  • We construct a set of edges A satisfying the following invariant:
  • A is a subset of some MST
  • We start with A empty
  • On each iteration, we add a suitable edge to A
  • In the case of Kruskal's algorithm, candidate edges are considered in order of increasing weight
  • A = {}
  • sort E by increasing weight
  • e = first edge in E
  • while A not a spanning tree
  • do
  • if A ∪ {e} is acyclic then A = A ∪ {e}
  • e = next edge from E
  • od

19
Kruskal's Algorithm, continued
  • An easy way to determine whether A is a spanning tree or not is to use the following theorem:
  • if T is a spanning tree of G = (V, E), then T has |V| - 1 edges
  • The main difficulty in implementing Kruskal's algorithm efficiently is determining whether adding e to A still gives an acyclic graph

20
Kruskal: disjoint sets
  • In [CLRS], it is noted that, at any point, A defines a forest (i.e. a set of trees)
  • Whenever the ends of e are in distinct trees,
    then it is safe to add e to A
  • The algorithm in [CLRS] maintains a collection of disjoint sets of nodes that provides a fast way to determine whether two nodes are in the same tree
  • Though this is explained in the prose, it is not
    explicit in the invariant, and thus not
    rigorously justified

21
Partitionings
  • P = {X_1, ..., X_k} is said to be a partitioning of V if, for some k,
  • X_1 ∪ X_2 ∪ ... ∪ X_k = V, and
  • X_i ∩ X_j ≠ ∅ implies i = j
  • (A small checking sketch follows.)
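
A minimal sketch (not from the slides) of this definition as an executable check, assuming P is given as a collection of sets; the name is_partitioning is just illustrative.

    def is_partitioning(P, V):
        covers = set().union(*P) == set(V)                 # X_1 ∪ ... ∪ X_k = V
        pairwise_disjoint = all(Xi == Xj or not (Xi & Xj)  # X_i ∩ X_j ≠ ∅ implies i = j
                                for Xi in P for Xj in P)
        return covers and pairwise_disjoint
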
22
Kruskal Refined
  • P = { {u} | u in V }
  • A = {}
  • sort E by increasing weight
  • (u,v) = first edge in E
  • while A not a spanning tree
  • do
  • if u ≁ v then
  • P = union(P, u, v)
  • A = A ∪ {(u,v)}
  • (u,v) = next edge from E
  • od
  • We write u ≁ v to mean that u and v are in different partitions in P
  • The operation union(P, u, v) computes a new partitioning from P in which the partitions containing u and v are merged
  • Implementations of partitionings with these operations (called Disjoint Sets) are described in Ch. 21 of [CLRS]. (A runnable sketch using a simple disjoint-sets structure follows.)
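
A minimal sketch (not from the slides) of the refined loop, using a simple dict-based disjoint-sets structure in place of the Ch. 21 data structure of [CLRS]; edges are given as (weight, u, v) triples, and the name kruskal is just illustrative.

    def kruskal(nodes, edges):
        parent = {u: u for u in nodes}            # P: initially each node is its own partition

        def find(u):                              # representative of u's partition
            while parent[u] != u:
                u = parent[u]
            return u

        def union(u, v):                          # merge the partitions containing u and v
            parent[find(u)] = find(v)

        A = []                                    # invariant: A is a subset of some MST
        for w, u, v in sorted(edges):             # consider edges in order of increasing weight
            if find(u) != find(v):                # u ≁ v: the ends lie in different partitions
                union(u, v)
                A.append((u, v))
            if len(A) == len(nodes) - 1:          # |V| - 1 edges: A is now a spanning tree
                break
        return A

    # e.g. kruskal({"a", "b", "c", "d"},
    #              [(1, "a", "b"), (2, "b", "c"), (3, "a", "c"), (4, "c", "d")])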

23
A Stronger Invariant
  • Invariant:
  • A is a subset of some MST
  • P is a partitioning of V
  • u and v are connected by edges in A iff there is some X in P containing both u and v
  • We have to check that this really is an invariant:
  • it is true initially, and
  • it is maintained by the loop body
  • (A small sketch for checking the third clause is given below.)
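
A minimal sketch (not from the slides) of how the third clause might be tested for a given edge set A (pairs (u, v)) and partitioning P, by comparing reachability over the edges in A with membership in a common part of P; both function names are just illustrative.

    def connected_in_A(A, u, v):
        frontier, seen = [u], {u}                 # breadth-first search over edges in A
        while frontier:
            x = frontier.pop()
            for (p, q) in A:
                if x == p and q not in seen:
                    seen.add(q); frontier.append(q)
                elif x == q and p not in seen:
                    seen.add(p); frontier.append(p)
        return v in seen

    def third_clause_holds(A, P, V):
        return all(connected_in_A(A, u, v) == any(u in X and v in X for X in P)
                   for u in V for v in V)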