Title: Self-Adjusting Computation (Umut Acar, Carnegie Mellon University)

1. Self-Adjusting Computation
Umut Acar, Carnegie Mellon University
- Joint work with Guy Blelloch, Robert Harper,
Srinath Sridhar, Jorge Vittes, Maverick Woo
2. Dynamic Algorithms
- Maintain their input-output relationship as the input changes
- Example: a dynamic MST algorithm maintains the MST of a graph as the user inserts/deletes edges
- Useful in many applications involving interactive systems, motion, ...
3. Developing Dynamic Algorithms, Approach I
- Dynamic by design
- Many papers: Agarwal, Atallah, Bash, Bentley, Chan, Cohen, Demaine, Eppstein, Even, Frederickson, Galil, Guibas, Henzinger, Hershberger, King, Italiano, Mehlhorn, Overmars, Powell, Ramalingam, Roditty, Reif, Reps, Sleator, Tamassia, Tarjan, Thorup, Vitter, ...
- Efficient algorithms, but can be complex
4. Approach II: Re-execute the algorithm when the input changes
- Very simple
- General
- Poor performance
5. Smart re-execution
- Suppose we can identify the pieces of the execution affected by the input change
- Re-execute by re-building only the affected pieces
[Figure: Execution (A, I) vs. Execution (A, Ie), with the affected pieces highlighted]
6. Smart Re-execution
- Time to re-execute: O(distance between the executions)
[Figure: Execution (A, I) vs. Execution (A, Ie)]
7. Incremental Computation or Dynamization
- General techniques for transforming static algorithms into dynamic ones
- Many papers: Alpern, Demers, Field, Hoover, Horwitz, Hudak, Liu, de Moor, Paige, Pugh, Reps, Ryder, Strom, Teitelbaum, Weiser, Yellin, ...
- The most effective techniques are
  - Static dependence graphs [Demers, Reps, Teitelbaum 81]
  - Memoization [Pugh, Teitelbaum 89]
- These techniques work well for certain problems
8. Bridging the two worlds
- Dynamization simplifies the development of dynamic algorithms
  - but generally yields inefficient algorithms
- Algorithmic techniques yield good performance
- Can we have the best of both worlds?
9. Our Work
- Dynamization techniques
  - Dynamic dependence graphs [Acar, Blelloch, Harper 02]
  - Adaptive memoization [Acar, Blelloch, Harper 04]
- Stability: a technique for analyzing performance [ABHVW 04]
  - Provides a reduction from dynamic to static problems
  - Reduces solving a dynamic problem to finding a stable solution to the corresponding static problem
- Example: dynamizing the parallel tree contraction algorithm [Miller, Reif 85] yields an efficient solution to the dynamic trees problem [Sleator, Tarjan 83] (ABHVW, SODA 04)
10. Outline
- Dynamic Dependence Graphs
- Adaptive Memoization
- Applications to
  - Sorting
  - Kinetic Data Structures (with experimental results)
  - Retroactive Data Structures
11. Dynamic Dependence Graphs
- Control dependences arise from function calls
12. Dynamic Dependence Graphs
- Control dependences arise from function calls
- Data dependences arise from reading/writing the memory (see the sketch below)
[Figure: dynamic dependence graph with modifiables a, b, c]
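A minimal code sketch of how these dependences arise, written in the mod/read/write style of the quicksort code later in the talk; the exact primitive signatures are assumptions for illustration, not the library's definitive interface:

  (* Data dependences: reading a or b records an edge from that modifiable
     to this read.  Control dependence: the reads and the write run inside
     the call to add, so changing a or b re-runs only this piece of the
     program during change propagation.  add returns the modifiable c. *)
  fun add (a, b) =
    mod (fn c =>
      read (a, fn av =>
        read (b, fn bv =>
          write (c, av + bv))))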
13. Change propagation
[Figure: propagating an input change through the dynamic dependence graph (modifiables a, b, c)]
14. Change propagation
[Figure: change propagation, continued]
15. Change propagation with Memoization
[Figure: change propagation reusing memoized results (modifiables a, b, c)]
16. Change propagation with Memoization
[Figure: continued]
17. Change propagation with Memoization
[Figure: continued]
18. Change Propagation with Adaptive Memoization
[Figure: change propagation with adaptive memoization (modifiables a, b, c)]
19. The Internals
- 1. Order-maintenance data structure [Dietz, Sleator 87]
  - Time-stamps the vertices of the DDG in sequential execution order
- 2. Priority queue for change propagation (a schematic loop is sketched below)
  - Priority = time stamp
  - Re-executes functions in sequential execution order
  - Ensures that a value is updated before being read
- 3. Hash tables for memoization
  - Remember results from the previous execution only
- 4. Constant-time equality tests
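A schematic sketch of the propagation loop these pieces support; the priority-queue module (PQ) and the re-execution step are illustrative assumptions, not the library's actual interface:

  (* Re-execute affected reads in time-stamp order.  Re-running a read may
     write modifiables, which makes later readers affected; those are fed
     back into the queue, so every value is updated before it is read. *)
  fun propagate queue =
    if PQ.isEmpty queue then ()
    else
      let
        val (reader, queue') = PQ.deleteMin queue   (* earliest time stamp first *)
        val affected = reexecute reader             (* re-run the read's closure *)
      in
        propagate (foldl PQ.insert queue' affected)
      end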
20. Standard Quicksort
fun qsort l =
  let
    fun qs (l, rest) =
      case l of
        NIL => rest
      | CONS (h, t) =>
          let val (smaller, bigger) = split (h, t)
              val sbigger = qs (bigger, rest)
          in qs (smaller, CONS (h, sbigger)) end
  in qs (l, NIL) end
21. Dynamic Quicksort
fun qsort l =
  let
    fun qs (l, rest, d) =
      read (l, fn l' =>
        case l' of
          NIL => write (d, rest)
        | CONS (h, t) =>
            let val (less, bigger) = split (h, t)
                val sbigger = mod (fn d => qs (bigger, rest, d))
            in qs (less, CONS (h, sbigger), d) end)
  in mod (fn d => qs (l, NIL, d)) end
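A hypothetical usage sketch of the dynamized quicksort: the list construction, the meta-level change operation, and propagate are assumed names for illustration, not the code on the slides:

  val input  = fromList [30, 26, 1, 5, 16]   (* build a modifiable input list *)
  val output = qsort input                   (* initial run records the DDG *)
  val ()     = insertAtEnd (input, 42)       (* write a new cell into a modifiable *)
  val ()     = propagate ()                  (* update output; expected O(log n) for
                                                an insertion at the end of the input *)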
22. Performance of Quicksort
- Dynamized Quicksort updates its output in expected
  - O(log n) time for insertions/deletions at the end of the input
  - O(n) time for insertions/deletions at the beginning of the input
  - O(log n) time for insertions/deletions at a random location
- Other results, for insertions/deletions anywhere in the input
  - Dynamized Mergesort: expected O(log n)
  - Dynamized Insertion Sort: expected O(n)
  - Dynamized minimum/maximum/sum/...: expected O(log n)
23. Function Call Tree for Quicksort
24. Function Call Tree for Quicksort
25. Function Call Tree for Quicksort
26. Insertion at the end of the input
27. Insertion in the middle
28. Insertion in the middle
29. Insertion at the start, in linear time
Input: 15, 30, 26, 1, 5, 16, 27, 9, 3, 35, 46
[Figure: function call tree of Quicksort on this input]
30. Insertion at the start, in linear time
Input: 20, 15, 30, 26, 1, 5, 16, 27, 9, 3, 35, 46
[Figure: function call tree after inserting 20 at the start of the input]
31. Kinetic Data Structures [Basch, Guibas, Hershberger 99]
- Goal: maintain properties of continuously moving objects
- Example: a kinetic convex-hull data structure maintains the convex hull of a set of continuously moving objects
32. Kinetic Data Structures
- Run a static algorithm to obtain a proof of the property
- Certificate = comparison + failure time
- Insert the certificates into a priority queue
  - Priority = failure time
- A framework for handling motion [Guibas, Karavelas, Russel, ALENEX 04]

  while queue ≠ empty do
    certificate := remove (queue)
    flip (certificate)
    update the certificate set (proof)
33. Kinetic Data Structures via Self-Adjusting Computation
- Update the proof automatically with change propagation
- A library for kinetic data structures [Acar, Blelloch, Vittes]
- Quicksort: expected O(1); Mergesort: expected O(1)
- Quick Hull, Chan's algorithm, Merge Hull: expected O(log n)

  while queue ≠ empty do
    certificate := remove (queue)
    flip (certificate)
    propagate ()
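A hedged SML rendering of this event loop; the priority-queue module (PQ) and the flip operation are illustrative assumptions layered on the slide's pseudocode:

  (* Each certificate guards a comparison stored in a modifiable.  When a
     certificate fails, flip writes the flipped outcome into that modifiable
     and change propagation repairs the proof and the output, inserting or
     removing certificates in the event queue as a side effect. *)
  fun simulate queue =
    if PQ.isEmpty queue then ()
    else
      let val (cert, queue') = PQ.deleteMin queue   (* earliest failure time *)
          val () = flip cert                        (* write the flipped comparison *)
          val () = propagate ()                     (* repair proof via propagation *)
      in simulate queue' end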
34. Quick Hull: Find Min and Max
[Figure: point set; current list: A B C D E F G H I J K L M N O P]
35. Quick Hull: Furthest Point
[Figure: point set; current list: A B D F G H J K M O P]
36. Quick Hull: Filter
[Figure: point set; current list: A B F J J O P]
37. Quick Hull: Find left hull
[Figure: point set; current list: A B B J J O O P]
38. Quick Hull: Done
[Figure: point set; current list: A B B J J O O P]
39. Static Quick Hull
fun findHull (line as (p1, p2), l, hull) =
  let
    val pts = filter l (fn p => Geo.lineside (p, line))
  in
    case pts of
      EMPTY => CONS (p1, hull)
    | _ =>
        let val pm = max (Geo.dist line) l
            val left = findHull ((pm, p2), l, hull)
            val full = findHull ((p1, pm), l, left)
        in full end
  end
fun quickHull l =
  let val (mx, xx) = minmax (Geo.minX, Geo.maxX) l
  in findHull ((mx, xx), l, CONS (xx, NIL)) end
40. Kinetic Quick Hull
fun findHull (line as (p1, p2), l, hull, dest) =
  let
    val pts = filter l (fn p => Kin.lineside (p, line))
  in
    read (pts, fn pts =>
      case pts of
        NIL => write (dest, CONS (p1, hull))
      | _ =>
          read (max (Kin.dist line) l, fn pm =>
            let val gr = modr (fn d => findHull ((pm, p2), l, hull, d))
            in findHull ((p1, pm), l, gr, dest) end))
  end
fun quickHull l =
  let val (mx, xx) = minmax (Kin.minX, Kin.maxX) l
  in modr (fn d =>
       read ((mx, xx), fn (mx, xx) =>
         split ((mx, xx), l, CONS (xx, NIL), d)))
  end
41. Kinetic Quick Hull
[Plot: certificates per kinetic event vs. input size]
42. Dynamic and Kinetic Changes
- Often interested in dynamic as well as kinetic changes
  - Insert and delete objects
  - Change the motion plan, e.g., direction, velocity
- Easily programmed via self-adjusting computation
  - Example: the Kinetic Quick Hull code is both dynamic and kinetic
- Batch changes
- Real-time changes: can maintain partially correct data structures (stop propagation when time expires)
43. Retroactive Data Structures [Demaine, Iacono, Langerman 04]
- Can change the sequence of operations performed on the data structure
- Example: a retroactive queue would allow the user to go back in time and insert/remove an item
44. Retroactive Data Structures via Self-Adjusting Computation
- Dynamize the static algorithm that takes as input the list of operations performed
- Example: retroactive queues (see the static sketch below)
  - Input: list of insert/remove operations
  - Output: list of items removed
  - Retroactive change: change the input list and propagate
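A minimal static sketch of such an algorithm for a FIFO queue: it folds over the operation list and returns the list of removed items. Dynamizing this function (making the operation list modifiable) would make a retroactive change a change to the input list followed by propagation. The datatype and names are illustrative, not code from the talk:

  datatype 'a oper = INSERT of 'a | REMOVE

  (* simulate : 'a oper list -> 'a list, returning removed items in order *)
  fun simulate ops =
    let
      fun go ([], _) = []
        | go (INSERT x :: rest, q) = go (rest, q @ [x])
        | go (REMOVE :: rest, x :: q) = x :: go (rest, q)
        | go (REMOVE :: rest, []) = go (rest, [])   (* remove on empty queue: ignore *)
    in go (ops, []) end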
45. Rake and Compress Trees [Acar, Blelloch, Vittes]
- Obtained by dynamizing tree contraction [ABHVW 04]
- Experimental analysis
  - Implemented and applied to a broad set of applications
  - Path queries, subtree queries, non-local queries, etc.
  - For path queries, compared to Link-Cut Trees [Werneck]
    - Structural changes are relatively slow
    - Data changes are faster
46. Conclusions
- Automatic dynamization techniques can yield efficient dynamic and kinetic algorithms/data structures
- General-purpose techniques for
  - transforming static algorithms into dynamic and kinetic ones
  - analyzing their performance
- Applications to kinetic and retroactive data structures
- Reduce dynamic problems to static problems
- Future work: lots of interesting problems
  - Dynamic/kinetic/retroactive data structures
47. Thank you!