CSE621 : Parallel Algorithms - PowerPoint PPT Presentation

1
CSE621 Parallel Algorithms, Lecture 5: Matrix Operation (II)
September 27, 1999
2
Overview
  • Review of the previous lecture
  • Pointer Jumping
  • Gaussian Elimination
  • Matrix Inversion
  • Summary

3
Review of the previous lecture
  • Parallel Prefix Computations
  • PRAM, Tree, 1-D, 2-D algorithms
  • Carry-Lookahead Addition Application
  • Parallel Matrix-Vector Product
  • PRAM, 1-D, 2-D MOT algorithms
  • Parallel Matrix Multiplication
  • PRAM, 2-D, 3-D MOT algorithms

4
Pointer Jumping
  • When traversing a linked list, we need to reach the end of the list.
  • By nature, traversal of a linked list is sequential.
  • Pointer jumping is a technique that finds the path to the end in parallel.
  • Applications
  • Finding roots in a forest
  • List ranking

5
Finding Roots in a Forest
  • Consider a rooted forest F on n nodes, where we denote the nodes by 1, ..., n.
  • F is implemented using the parent-array representation Parent[1..n], where Parent[j] = j when j is a root node.
  • Given a node i in the forest, R[i] denotes the root of the tree in F containing node i.
  • The entire array R[1..n] thus gives the root of every node.
  • See the CREW PRAM algorithm with n processors.
  • Time complexity: W(n) = O(log n), S(n) = O(n / log n).
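The pointer-jumping idea can be sketched as a sequential Python simulation of the synchronous CREW PRAM rounds (0-based arrays rather than the slide's Parent[1..n]; in each round every node jumps to its grandparent, so path lengths halve):

```python
def find_roots(parent):
    """Find the root of every node in a rooted forest by pointer jumping.

    parent[j] == j marks a root.  Each round replaces every pointer with
    its grandparent's pointer; all reads use the old array, mimicking a
    synchronous CREW PRAM step, so O(log n) rounds suffice.
    """
    r = list(parent)
    while True:
        nxt = [r[r[i]] for i in range(len(r))]  # all "processors" jump at once
        if nxt == r:                            # every pointer already at a root
            return r
        r = nxt

# Forest: 0 and 3 are roots; 2 -> 1 -> 0 and 4 -> 3
print(find_roots([0, 0, 1, 3, 3]))  # [0, 0, 0, 3, 3]
```

Note that many processors may read the same parent cell in one round, which is why the concurrent-read (CREW) model is needed.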

8
List Ranking
  • Determine the distance from each element in the list to the end of the list.
  • Using pointer jumping, we obtain an O(log n) algorithm on an EREW PRAM.
  • Assume that the linked list is maintained using the arrays L[1..n] and Next[1..n].
  • The variable Start holds the index in Next corresponding to the beginning of the linked list.
  • Next[i] = 0 indicates the end of the linked list.
  • The sequential algorithm takes O(n) time.
  • Time complexity: W(n) = O(log n), S(n) = O(n / log n).
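A minimal sequential simulation of list ranking by pointer jumping (0-based indices, and None rather than the slide's Next[i] = 0 marking the end of the list):

```python
def list_rank(nxt):
    """Rank each element: its distance to the end of the linked list.

    nxt[i] is the successor of i, or None at the last element.  Pointer
    jumping doubles the distance accounted for in each synchronous
    round, so O(log n) rounds suffice.
    """
    n = len(nxt)
    rank = [0 if nxt[i] is None else 1 for i in range(n)]
    nxt = list(nxt)
    for _ in range(n.bit_length()):           # at least ceil(log2 n) rounds
        new_rank, new_nxt = list(rank), list(nxt)
        for i in range(n):                    # conceptually all in parallel
            if nxt[i] is not None:
                new_rank[i] = rank[i] + rank[nxt[i]]
                new_nxt[i] = nxt[nxt[i]]
        rank, nxt = new_rank, new_nxt
    return rank

# List order 2 -> 0 -> 3 -> 1 (Start = 2); element 1 is the last one
print(list_rank([3, None, 0, 1]))  # [2, 0, 3, 1]
```

Each processor reads and writes only its own pointer and its successor's values in distinct cells per round, which is why the weaker EREW model suffices here.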

11
Gaussian Elimination
  • A method of solving systems of linear equations by adding multiples of one row of a matrix to another row to make the matrix upper triangular.
  • Elimination step (for pivot row k):
  • Normalize row k by dividing each element by a constant so that the diagonal element becomes 1.
  • Add multiples of row k to each row beneath it so that a column of zeros appears beneath the kth element:
  • a_ij <- a_ij - a_ik * a_kj / a_kk
  • See the steps.
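The elimination step above can be sketched in Python. This is a sequential sketch assuming nonzero pivots (no row reordering); on a PRAM, the two inner loops run in parallel:

```python
def gaussian_eliminate(a):
    """Reduce a (list of row lists, nonzero pivots assumed) to upper
    triangular form with unit diagonal, in place.

    Mirrors the slide's elimination step: normalize pivot row k, then
    update a[i][j] -= a[i][k] * a[k][j]; since a[k][k] == 1 after
    normalization, this equals a_ij - a_ik * a_kj / a_kk.
    """
    n = len(a)
    for k in range(n):
        pivot = a[k][k]
        for j in range(k, n):              # normalize row k
            a[k][j] /= pivot
        for i in range(k + 1, n):          # zero the column below the pivot
            factor = a[i][k]
            for j in range(k, n):
                a[i][j] -= factor * a[k][j]
    return a

print(gaussian_eliminate([[2.0, 4.0], [1.0, 3.0]]))  # [[1.0, 2.0], [0.0, 1.0]]
```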

13
Gaussian Elimination (contd)
  • Algorithm on a linear diagonal array
  • Inputs are provided from outside.
  • Actions by the processors on the diagonal (at the left ends of rows):
  • signal the processor on the right to pass the a_ij's through downward unchanged until the first nonzero element (a_kk) reaches the left-end processor (this takes care of reordering rows to keep the diagonal entries nonzero);
  • calculate 1/a_kk for the first nonzero element and pass it to the right as a signal and multiplicative factor;
  • for every element a_ik after the first nonzero one, pass it to the right (this is used in calculating a_ij - a_ik * a_kj / a_kk).
  • Actions by the non-diagonal processors:
  • pass any data and signals received from the left on to the right.
  • The signals are:
  • the signal to pass an element through unchanged;
  • the signal to calculate a_kj / a_kk and both store and pass through the value;
  • the signal to calculate a_ij - a_ik * a_kj / a_kk using the stored value.

15
Matrix Inversion using Gaussian Elimination
  • Using Gaussian elimination:
  • Perform Gaussian elimination on the N x 2N matrix A' = [A | I].
  • If A is non-singular, the sequence of elementary row operations used to reduce A to I will also transform I into A^(-1).
  • See the extended figure.
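A Gauss-Jordan sketch of this idea in Python (sequential, and assuming nonzero pivots so no row exchanges are needed): the row operations that turn the left half of [A | I] into I leave A^(-1) in the right half.

```python
def invert_via_elimination(a):
    """Invert a nonsingular matrix by row-reducing the N x 2N matrix [A | I]."""
    n = len(a)
    aug = [list(a[i]) + [1.0 if i == j else 0.0 for j in range(n)]
           for i in range(n)]                 # build [A | I]
    for k in range(n):
        pivot = aug[k][k]
        aug[k] = [x / pivot for x in aug[k]]  # normalize pivot row
        for i in range(n):
            if i != k:                        # clear column k in other rows
                factor = aug[i][k]
                aug[i] = [x - factor * y for x, y in zip(aug[i], aug[k])]
    return [row[n:] for row in aug]           # right half is A^(-1)

print(invert_via_elimination([[1.0, 2.0], [3.0, 4.0]]))
# [[-2.0, 1.0], [1.5, -0.5]]
```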

16
Matrix Inversion using Matrix Multiplication
  • Using matrix multiplication and a characteristic polynomial:
  • Char_A(x) = det(xI - A) = x^n + c_1 x^(n-1) + ... + c_(n-1) x + c_n
  • By the classical Cayley-Hamilton theorem, the matrix A is a zero of its own characteristic polynomial, that is:
  • A^n + c_1 A^(n-1) + ... + c_(n-1) A + c_n I = 0
  • Setting x = 0, we immediately obtain c_n = det(-A) = (-1)^n det(A). Hence, c_n ≠ 0 if and only if A is invertible.
  • A^(-1) = (-1/c_n) (A^(n-1) + c_1 A^(n-2) + ... + c_(n-1) I)
  • To compute A^(-1), it is sufficient to compute the coefficients c_1, ..., c_n of the characteristic polynomial and the powers A^2, ..., A^(n-1).
  • Computation of matrix powers: O(log^2 n) time on a CREW PRAM with n^4 processors.
  • Computation of the coefficients: via the Leverier theorem.

17
Matrix Inversion (contd)
  • Leverier Theorem: Given the n x n matrix A, let s_i denote the trace of A^i, that is, the sum of the diagonal entries of A^i, for i = 1, ..., n. Then we have

        s_k + c_1 s_(k-1) + ... + c_(k-1) s_1 + k c_k = 0,   for k = 1, ..., n,

  • where c_1, ..., c_n are the coefficients of the characteristic polynomial of A and s_k = trace(A^k) = sum of the diagonal elements of A^k.
  • The trace of A^i can be computed in O(log n) parallel addition steps.
  • We can compute the c_i's from the s_i's by inverting a lower triangular matrix on a CREW PRAM using n^4 processors.
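The whole characteristic-polynomial method can be sketched sequentially: compute the traces s_k, solve the lower triangular Leverier system for the c_k, then apply the Cayley-Hamilton inverse formula. A plain-list Python sketch (no pivoting or numerical safeguards):

```python
def mat_mul(a, b):
    """Plain-list product of two n x n matrices."""
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def invert_via_char_poly(a):
    """Invert A via traces, the Leverier identities, and Cayley-Hamilton:
    A^(-1) = (-1/c_n)(A^(n-1) + c_1 A^(n-2) + ... + c_(n-1) I)."""
    n = len(a)
    ident = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    powers = [ident]                          # powers[k] holds A^k
    for _ in range(n):
        powers.append(mat_mul(powers[-1], a))
    s = [sum(p[i][i] for i in range(n)) for p in powers]  # s[k] = trace(A^k)
    c = [1.0]                                 # c[0] = 1 by convention
    for k in range(1, n + 1):                 # s_k + c_1 s_(k-1) + ... + k c_k = 0
        c.append(-sum(c[i] * s[k - i] for i in range(k)) / k)
    return [[-sum(c[i] * powers[n - 1 - i][r][col] for i in range(n)) / c[n]
             for col in range(n)] for r in range(n)]

print(invert_via_char_poly([[1.0, 2.0], [3.0, 4.0]]))
# [[-2.0, 1.0], [1.5, -0.5]]
```

Here the coefficient recurrence solves the lower triangular system by forward substitution instead of explicitly inverting it as the parallel algorithm does; the results coincide.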

18
Matrix Inversion of Lower Triangular Matrix
  • Lower triangular matrix: a matrix L such that l_ij = 0 for all pairs (i, j) with i < j.
  • The matrix is invertible iff the product of its diagonal elements is nonzero.
  • For convenience, assume that n = 2^k, k > 0.
  • Let

        L = | L1  0  |
            | A   L2 |

  • We can verify that

        L^(-1) = |  L1^(-1)              0       |
                 | -L2^(-1) A L1^(-1)    L2^(-1) |
19
Matrix Inversion of Lower Triangular Matrix
(contd)
  • Use a divide-and-conquer algorithm:
  • recursively invert the two lower triangular matrices of size n/2, and
  • perform two parallel matrix multiplications.
  • CREW PRAM algorithm with n^3 processors, using the matrix multiplication algorithm.
  • Time complexity: W(n) = W(n/2) + Θ(log n) = Θ(log^2 n).
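A sketch of the divide-and-conquer inversion (sequential Python; the two recursive calls are exactly the ones a PRAM would run in parallel, followed by the two multiplications):

```python
def mat_mul(a, b):
    """Plain-list product of two square matrices/blocks."""
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def invert_lower_triangular(L):
    """Divide-and-conquer inverse of a lower triangular matrix.

    Assumes n is a power of two and all diagonal entries are nonzero.
    With L = [[L1, 0], [A, L2]], the inverse is
    [[L1^(-1), 0], [-L2^(-1) A L1^(-1), L2^(-1)]].
    """
    n = len(L)
    if n == 1:
        return [[1.0 / L[0][0]]]
    h = n // 2
    L1 = [row[:h] for row in L[:h]]           # top-left block
    A = [row[:h] for row in L[h:]]            # bottom-left block
    L2 = [row[h:] for row in L[h:]]           # bottom-right block
    inv1 = invert_lower_triangular(L1)        # independent recursive calls:
    inv2 = invert_lower_triangular(L2)        # parallel on a CREW PRAM
    corner = mat_mul(mat_mul(inv2, A), inv1)  # L2^(-1) A L1^(-1)
    top = [inv1[i] + [0.0] * h for i in range(h)]
    bot = [[-x for x in corner[i]] + inv2[i] for i in range(h)]
    return top + bot

print(invert_lower_triangular([[2.0, 0.0], [4.0, 1.0]]))
# [[0.5, 0.0], [-2.0, 1.0]]
```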

20
Summary
  • Pointer Jumping
  • Finding Roots in a Forest
  • List Ranking
  • Gaussian Elimination
  • On the Linear Diagonal Array
  • Matrix Inversion
  • Using Gaussian Elimination
  • Using Matrix Multiplication
  • of Lower Triangular Matrix