1
  • Binary search trees
  • Heaps
  • Single source shortest paths
  • Prepared by: Jatinder Paul, Shraddha Rumade

2
Binary Trees
  • A binary tree is a tree that is either empty, or one in which every
    node has no children, has just a left child, has just a right child,
    or has both a left and a right child.
  • In other words, in a binary tree each node can have zero, one, or two
    children.
  • A complete binary tree is a special case of a binary tree in which all
    the levels, except perhaps the last, are full, while on the last level
    any missing nodes are to the right of all the nodes that are present.

Fig. A complete binary tree
3
Binary Search Trees
  • A Binary Search Tree (BST) is a prominent data structure used in many
    systems programming applications for representing and managing dynamic
    sets.
  • Assuming k represents the value of a given node, a BST is a binary tree
    with the following property (see the sketch below):
  • all nodes in the left subtree of the node have values smaller than k,
    and
  • all nodes in the right subtree of the node have values larger than k.
  • A BST can be used to build
  • Dictionaries
  • Priority Queues
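A minimal Python sketch of a BST node and an insertion that maintains this
ordering property (the names Node and bst_insert are illustrative, not from
the slides):

class Node:
    def __init__(self, key):
        self.key = key
        self.left = None    # subtree of keys smaller than key
        self.right = None   # subtree of keys larger than key

def bst_insert(root, key):
    # Insert key into the BST rooted at root and return the (possibly new) root.
    if root is None:
        return Node(key)
    if key < root.key:
        root.left = bst_insert(root.left, key)
    elif key > root.key:
        root.right = bst_insert(root.right, key)
    return root  # duplicate keys are ignored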

4
Binary Search Trees (Contd.)
  • Heap elements are stored in an array, since a heap is filled level by
    level. The levels of a BST, in contrast, are filled dynamically and are
    unpredictable, so we use linked nodes to represent a BST.
  • Heaps are usually used to find the maximum element, not the minimum
    element. A BST is useful for finding both the minimum and the maximum
    element.
  • The highest-valued element in a BST can be found by traversing from the
    root to the right until a node with no right link is found (we can call
    that the rightmost element in the BST).
  • The lowest-valued element in a BST can be found by traversing from the
    root to the left until a node with no left link is found (we can call
    that the leftmost element in the BST). Both traversals are sketched
    below.
  • In other words, the leftmost node holds the minimum value and the
    rightmost node holds the maximum value.
  • If the tree is balanced, finding the minimum or the maximum element
    takes O(log n) time.
  • A BST maintains sorted order in the presence of insertions and
    deletions.
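A sketch of the two traversals just described, assuming the Node class from
the previous sketch (any node object with left and right links works):

def bst_minimum(node):
    # Follow left links until a node with no left child is reached.
    while node.left is not None:
        node = node.left
    return node

def bst_maximum(node):
    # Follow right links until a node with no right child is reached.
    while node.right is not None:
        node = node.right
    return node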

5
Inorder Traversal
  • The inorder traversal of a binary tree is defined as follows:
  • traverse the left subtree,
  • visit the root,
  • traverse the right subtree.
  • Inorder-Traversal(x)
  • 1. if x ≠ NIL
  • 2.   then Inorder-Traversal(left[x])
  • 3.        print key[x]
  • 4.        Inorder-Traversal(right[x])
  • The above algorithm takes linear time. The inorder procedure is called
    recursively twice for each element (once for its left child and once
    for its right child), and the element is visited between those two
    calls. Therefore the total traversal time is Θ(n). A Python sketch
    follows this slide.
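The same procedure as a Python sketch, using the attribute names from the
earlier Node sketch:

def inorder(x):
    # Visit the left subtree, then the node itself, then the right subtree.
    if x is not None:
        inorder(x.left)
        print(x.key)
        inorder(x.right)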

6
Sorting Using BST
  • An inorder traversal of a binary search tree always yields the values
    in sorted order. This is a direct consequence of the BST property.
  • Given a set of unordered elements, the following method can be used to
    sort them:
  • construct a binary search tree whose keys are those elements, and then
  • perform an inorder traversal of this tree.
  • If the data is already stored in a binary search tree, only the
    traversal is needed, not the construction.
  • BSTSort(A)
    1. for each element in the array A do
    2.   insert the element into the BST   // each insertion takes O(log n) time in a balanced tree
    3. Inorder-Traversal(root)             // takes O(n) time
  • The total running time of BSTSort(A) is O(n log n), the same as
    HeapSort. (A Python sketch follows this slide.)
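A sketch of BSTSort in Python, reusing bst_insert from the earlier sketch
and collecting the keys in inorder instead of printing them (the name
bst_sort is illustrative):

def bst_sort(values):
    root = None
    for v in values:                 # build the BST: O(n log n) if it stays balanced
        root = bst_insert(root, v)

    result = []
    def collect(x):                  # inorder traversal: O(n)
        if x is not None:
            collect(x.left)
            result.append(x.key)
            collect(x.right)
    collect(root)
    return result

# Example with the input sequence from the next slide:
# bst_sort([2, 3, 8, 4, 5, 7, 6, 1, 2.5, 2.4])
# -> [1, 2, 2.4, 2.5, 3, 4, 5, 6, 7, 8]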

7
Example Sorting Using BST
  • Input sequence: 2  3  8  4  5  7  6  1  2.5  2.4
  • Step 1: Create a binary search tree from the given input sequence.


[Figure: the binary search tree built from the input sequence, with 2 at
the root]
8
Example Sorting Using BST (cont.)
  • Input sequence: 2  3  8  4  5  7  6  1  2.5  2.4
  • Step 2: Perform an inorder traversal.


[Figure: the inorder traversal in progress; output so far: 1  2]
9
Example Sorting Using BST (cont.)
  • Input sequence: 2  3  8  4  5  7  6  1  2.5  2.4
  • Step 2: Perform an inorder traversal.


[Figure: the inorder traversal in progress; output so far: 1  2  2.4  2.5  3]
10
Example Sorting Using BST (cont.)
  • Input sequence: 2  3  8  4  5  7  6  1  2.5  2.4
  • Step 2: Perform an inorder traversal.


[Figure: the inorder traversal is complete; sorted array:
1  2  2.4  2.5  3  4  5  6  7  8]
11
Binary Search Trees (cont.)
  • Search is straightforward in a BST. Start at the root and keep moving
    left or right using the BST property. If the key we are seeking is
    present, this search procedure leads us to the key. If the key is not
    present, we end up at a null link. (A sketch follows this slide.)
  • The running time of the search operation is O(h), where h can be
  • h ≈ log n for a balanced binary tree, or
  • h = n in the worst case, for an unbalanced tree that resembles a linear
    chain of n nodes.
  • For example, for a balanced binary search tree with n = 1,000 elements,
    a search needs about 10 comparisons; with n = 1,000,000, it needs about
    20 comparisons. However, if the binary search tree is unbalanced and
    elongated, the running time of a search operation is longer.
  • Insertion in a BST is also a straightforward operation. If we need to
    insert an element x, we first search for x. If x is present, there is
    nothing to do. If x is not present, the search procedure ends at a null
    link, and it is at the position of this null link that x is inserted.
  • Red-black trees are a variation of binary search trees that ensures the
    tree stays balanced.
  • Their height is O(log n), where n is the number of nodes.
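A sketch of the search just described, returning the node that holds the
key, or None when the search ends at a null link (bst_search is an assumed
name; nodes are as in the earlier Node sketch):

def bst_search(root, key):
    node = root
    while node is not None and node.key != key:
        # Move left or right using the BST property.
        node = node.left if key < node.key else node.right
    return node  # None means the key is not present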

12
Heaps
Definition: A heap is a complete binary tree with the condition that every
node (except the root) must have a value less than or equal to its parent.
  • A binary tree of height h is complete iff
  • it is empty, or
  • its left subtree is complete of height h-1 and its right subtree is
    completely full of height h-2, or
  • its left subtree is completely full of height h-1 and its right subtree
    is complete of height h-1.
  • A complete tree is filled from the left:
  • all the leaves are on the same level or on two adjacent levels, and
  • all nodes at the lowest level are as far to the left as possible.


Figure 1
13
Heap property: The value of every parent node is greater than or equal to
the values of its child nodes, i.e. for every node i,
key(parent(i)) ≥ key(i).

Why do we need heaps? Heaps are used to maintain a set of numbers in a
dynamic sense. The basic data structures we usually use have drawbacks:
Arrays - searching takes O(n) time.
Sorted arrays - fast searching, but expensive insertion.
Linked lists - searching is complex, and extracting the minimum or maximum
node is not a trivial operation.
In a heap, insertion and deletion take O(log n) time, while extracting the
minimum and the maximum node takes O(n/2) and O(1) time respectively.
14
Heap representation: A heap data structure is represented as an array
object A with two attributes:
length[A] - the number of elements in the array,
heap-size[A] - the number of elements in the heap,
with heap-size[A] ≤ length[A].
In an array A, the root of the heap resides in A[1]. For a node with
index i (helpers are sketched below):
Index of the parent:       PARENT(i) = ⌊i/2⌋
Index of the left child:   LEFT_CHILD(i) = 2i
Index of the right child:  RIGHT_CHILD(i) = 2i + 1

Array representation of the heap in Figure 1:
index:  1   2   3   4   5   6   7   8   9  10
value: 16  11   9  10   5   6   8   1   2   4
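The index arithmetic above as Python helpers for a 1-indexed array (A[0]
left unused); the function names are illustrative:

def parent(i):
    return i // 2          # ⌊i/2⌋

def left_child(i):
    return 2 * i

def right_child(i):
    return 2 * i + 1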
15
Minimum-Maximum nodes in a Heap
[Figure: heap levels 0 through h, containing 2^0, 2^1, 2^2, ..., 2^h nodes]

The height of a heap with n elements is h = ⌊log n⌋. A heap has the minimum
number of elements when it has just one node at the lowest level; the
levels above the lowest level then form a complete binary tree of height
h-1 with 2^h - 1 nodes, so the minimum number of nodes possible in a heap
of height h is 2^h. A heap of height h has the maximum number of elements
when its lowest level is completely filled; in this case the heap is a
complete binary tree of height h and hence has 2^(h+1) - 1 nodes. For
example, a heap of height 3 has between 2^3 = 8 and 2^4 - 1 = 15 nodes.
16
  • Heap Algorithms
  • HEAPIFY
  • HEAPIFY is an important subroutine for maintaining the heap property.
  • Given a node i in the heap with children l and r, each subtree rooted
    at l and r is assumed to be a heap. The subtree rooted at i may violate
    the heap property: key(i) < key(l) OR key(i) < key(r).
  • HEAPIFY therefore lets the value of the parent node float down so that
    the subtree rooted at i satisfies the heap property. (A Python sketch
    follows this slide.)
  • Algorithm
  • HEAPIFY(A, i)
  • 1. l ← LEFT_CHILD(i)
  • 2. r ← RIGHT_CHILD(i)
  • 3. if l ≤ heap_size[A] and A[l] > A[i]
  • 4.   then largest ← l
  • 5.   else largest ← i
  • 6. if r ≤ heap_size[A] and A[r] > A[largest]
  • 7.   then largest ← r
  • 8. if largest ≠ i
  • 9.   then exchange A[i] ↔ A[largest]
  • 10.       HEAPIFY(A, largest)
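A Python sketch of HEAPIFY for a max-heap stored 1-indexed in a list A
(A[0] unused), following the pseudocode above; max_heapify is an assumed
name:

def max_heapify(A, i, heap_size):
    # Let the value at index i float down until the subtree rooted at i
    # satisfies the heap property.
    l, r = 2 * i, 2 * i + 1
    largest = i
    if l <= heap_size and A[l] > A[i]:
        largest = l
    if r <= heap_size and A[r] > A[largest]:
        largest = r
    if largest != i:
        A[i], A[largest] = A[largest], A[i]
        max_heapify(A, largest, heap_size)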

17
Heapify Example
[Figure: HEAPIFY is called on the array A = 16 1 13 10 5 9 3 2 8 4; the
node with value 1 violates the heap property]
18
Heapify Example
[Figure: the value 1 is compared with its children and will be exchanged
with the larger child, 10]
19
Heapify Example
[Figure: after the first exchange, A = 16 10 13 1 5 9 3 2 8 4; HEAPIFY
recurses on the node that now holds 1]
20
Heapify Example
[Figure: after the second exchange, A = 16 10 13 8 5 9 3 2 1 4; the heap
property is restored]
21
Running Time of Heapify
  • Fixing the relation between i (a node), l (the left child of i), and r
    (the right child of i) takes Θ(1) time. Let the heap rooted at node i
    have n elements. The number of elements in the subtree rooted at l or
    r is at most 2n/3 in the worst case, i.e. when the last level is
    exactly half full.
  • Mathematically:
        T(n) ≤ T(2n/3) + Θ(1)
  • Applying the Master Theorem (Case 2), we can solve the above to
        T(n) = O(log n)
  • Alternatively, in the worst case the algorithm walks down a heap of
    height h = ⌊log n⌋. Thus the running time of the algorithm is O(log n).

22
Algorithm to build a Heap
  • This procedure builds a heap out of an arbitrary array by repeated
    calls to the HEAPIFY algorithm. (A Python sketch follows this slide.)
  • BUILD_HEAP(A)
  • 1. heap_size[A] ← length[A]
  • 2. for i ← ⌊length[A]/2⌋ downto 1 do
  • 3.     HEAPIFY(A, i)
  • The elements at indices ⌊length[A]/2⌋ + 1 through n are leaf nodes,
    hence in line 2 we apply HEAPIFY only to the nodes from ⌊length[A]/2⌋
    down to 1.

[Figure: a 10-element heap; ⌊length[A]/2⌋ = 5, so from the 6th node onward
all nodes are leaves]
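A sketch of BUILD_HEAP on the same 1-indexed representation, reusing
max_heapify from the previous sketch:

def build_max_heap(A):
    # A[1..n] holds the elements; nodes n//2 + 1 .. n are leaves.
    heap_size = len(A) - 1
    for i in range(heap_size // 2, 0, -1):
        max_heapify(A, i, heap_size)
    return heap_size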
23
Running Time of Build_Heap
  • We represent the heap level by level, as shown below.

[Figure: a heap of height h, with a generic level i marked]

At level i there are 2^i nodes, and for each of them HEAPIFY does work
proportional to the h - i levels below it, where h = log n.

    Total work done = Σ_{i=1}^{h} 2^i (h - i) = Σ_{i=1}^{log n} 2^i (log n - i)
24
Running Time of Build_Heap (cont.)
  • Substituting j = log n - i, we get:

    Total work done = Σ_{j=0}^{log n - 1} 2^(log n - j) · j
                    = Σ_j (2^(log n) / 2^j) · j
                    = n · Σ_j (j / 2^j)
                    = O(n),    since Σ_{j≥0} j / 2^j converges to 2.

  • Thus the running time of Build_Heap is O(n).

25
HeapSort
  • HEAPSORT(A)
  • BUILD_HEAP(A)
  • for i ← length[A] downto 2
  •     do exchange A[1] ↔ A[i]
  •        heap-size[A] ← heap-size[A] - 1
  •        HEAPIFY(A, 1)
  • Running time of HEAPSORT
  • The call to BUILD_HEAP takes O(n) time and each of the n-1 calls to
    HEAPIFY takes O(log n) time. Thus the HEAPSORT procedure takes
    O(n log n) time. (A Python sketch follows this slide.)
  • Why doesn't HeapSort take O(n) time, as Build_Heap does? In Build_Heap
    a node is pushed down, and because most nodes sit in the lower, shorter
    part of the heap, the work per node shrinks quickly and the total is
    linear. In HeapSort each of the n-1 calls to HEAPIFY starts at the root
    and may walk down the full height of the heap, which shrinks only
    slowly, so the decreasing lower part does not reduce the number of
    operations below O(n log n).
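A sketch of HEAPSORT built on the two previous sketches; after the call,
A[1:] is in ascending order:

def heapsort(A):
    heap_size = build_max_heap(A)
    for i in range(heap_size, 1, -1):
        A[1], A[i] = A[i], A[1]        # move the current maximum into place
        heap_size -= 1
        max_heapify(A, 1, heap_size)   # restore the heap on the remaining prefix

# Example with the array from Figure 1:
# A = [None, 16, 11, 9, 10, 5, 6, 8, 1, 2, 4]
# heapsort(A)  ->  A[1:] == [1, 2, 4, 5, 6, 8, 9, 10, 11, 16]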

26
HEAP-EXTRACT-MAX
  • HEAP-EXTRACT-MAX removes and returns the maximum element of the heap,
    i.e. the root.
  • HEAP-EXTRACT-MAX(A)
  • if heap-size[A] < 1
  •   then error "heap underflow"
  • max ← A[1]
  • A[1] ← A[heap-size[A]]
  • heap-size[A] ← heap-size[A] - 1
  • HEAPIFY(A, 1)
  • return max
  • The assignment A[1] ← A[heap-size[A]] takes the last element in the
    heap, places it at the root, and then HEAPIFY is applied. The running
    time of HEAP-EXTRACT-MAX is O(log n), since it performs only a constant
    amount of work on top of the O(log n) time for HEAPIFY. (A Python
    sketch follows this slide.)
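A sketch of HEAP-EXTRACT-MAX on the same 1-indexed representation; it
returns the maximum together with the new heap size, since a plain Python
list does not carry a separate heap-size attribute:

def heap_extract_max(A, heap_size):
    if heap_size < 1:
        raise IndexError("heap underflow")
    maximum = A[1]
    A[1] = A[heap_size]           # move the last element to the root
    heap_size -= 1
    max_heapify(A, 1, heap_size)  # let the new root float down
    return maximum, heap_size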

27
Comparison between the running times of heap algorithms:

Algorithm       Time Complexity
Heapify         O(log n)
Build_Heap      O(n)
Extract_Max     O(log n)
Delete_Heap     O(log n)
Find_Max        O(1)
Insert          O(log n)
28
Single Source Shortest Paths
  • Let G be a weighted directed graph where a weight w(u, v) is associated
    with each edge (u, v) in E. These weights represent the cost to
    traverse the edge.
  • A path from vertex u to vertex v is a sequence of one or more edges
  •     <(v1, v2), (v2, v3), . . . , (vn-1, vn)> in E[G], where u = v1 and
        v = vn.

The cost (or length, or weight) of a path P is the sum of the weights of
the edges in the sequence. The shortest-path weight from a vertex u ∈ V to
a vertex v ∈ V in the weighted graph is the minimum cost over all paths
from u to v. If there exists no path from vertex u to vertex v, then the
weight of the shortest path is ∞.
[Figure: a small weighted directed graph on vertices 1 through 6]
29
Single source shortest path problem: Until now we have used the BFS
traversal algorithm to find shortest paths, but that is a special case in
which path length is measured in links, i.e. all weights are 1. Now we
consider graphs with arbitrary link weights.

Problem: Given a weighted graph G, find a shortest path from a given
vertex to each other vertex in G.

One solution to this problem is Dijkstra's algorithm, which is a greedy
algorithm. It turns out that one can find the shortest paths from a given
source to all vertices in the graph in the same time, hence this problem
is called the single-source shortest paths problem.
30
Dijkstra's Single Source Shortest Paths Algorithm
  • DIJKSTRA(G, w, s)
  • INITIALIZE-SINGLE-SOURCE(G, s)
  • S ← ∅
  • Q ← V[G]
  • while Q ≠ ∅
  •   do u ← EXTRACT-MIN(Q)
  •      S ← S ∪ {u}
  •      for each vertex v ∈ Adj[u] do
  •        if d[v] > d[u] + w(u, v)
  •          then d[v] ← d[u] + w(u, v)
  •               π[v] ← u
  • The algorithm repeatedly selects the vertex u ∈ V - S with the minimum
    shortest-path estimate, adds u to S, and relaxes all edges leaving u.
    (A Python sketch using a binary heap follows this slide.)
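A sketch of the algorithm using Python's heapq module as the priority
queue Q (a lazy-deletion variant rather than a true DECREASE-KEY); the
adjacency-list format and the names dist/pred are assumptions, not from
the slides:

import heapq

def dijkstra(adj, s):
    # adj maps each vertex u to a list of (v, w) pairs, one per edge (u, v) of weight w.
    dist = {u: float("inf") for u in adj}
    pred = {u: None for u in adj}
    dist[s] = 0
    pq = [(0, s)]                       # the fringe, keyed by tentative distance
    while pq:
        d, u = heapq.heappop(pq)        # EXTRACT-MIN
        if d > dist[u]:
            continue                    # stale entry; u was already settled
        for v, w in adj[u]:
            if dist[v] > dist[u] + w:   # relax edge (u, v)
                dist[v] = dist[u] + w
                pred[v] = u
                heapq.heappush(pq, (dist[v], v))
    return dist, pred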

31
Dijkstra's Algorithm
To find the shortest path from node 1 to node 5.

Step   Fringe Set
1      {2, 7}
2      {3, 7}

[Figure: the example weighted directed graph on nodes 1 through 7, with
the tentative distances of the fringe nodes]
32
Dijkstra's Algorithm
To find the shortest path from node 1 to node 5.

Step   Fringe Set
1      {2, 7}
2      {3, 7}
3      {3, 6}
4      {4, 5, 6}

[Figure: the example graph with tentative distances such as 0.01, 4.01,
9.01, 10.01, 11.01, and 12]
33
Dijkstra's Algorithm
To find the shortest path from node 1 to node 5.

Step   Fringe Set
1      {2, 7}
2      {3, 7}
3      {3, 6}
4      {4, 5, 6}
5      {4, 6}
6      {6}

[Figure: the example graph with its final distances]

The shortest path from 1 to 5 is 1-7-3-5, with the minimum weight of 9.01.
34
Dijkstra's Algorithm (cont.)
The graph formed as a result of the algorithm is also the shortest-path
spanning tree.

Running time of Dijkstra's Algorithm: for a graph G(V, E) with |V| = n and
|E| = m, the running time of Dijkstra's algorithm is O((m + n) log n).
Thus it is slower than the previously viewed search algorithms.

Note: The fringe sets created at each iteration can be stored in a heap.