Title: An Overview of Cache Replacement Algorithms Based on Recency and Frequency

1
An Overview of Cache Replacement Algorithms Based on Recency and Frequency
  • ECE 7970
  • Manjeera Jeedigunta
  • Date: 10-23-06

2
Outline
  • Introduction to Caches
  • Cache Replacement Algorithms
  • Recency Based Algorithm
  • LIRS
  • Frequency Based Algorithm
  • FBR
  • Recency/Frequency Based Algorithm
  • LRFU
  • Conclusions

Introduction to Caches
3
Introduction to Caches: Present Situation
  • Memory Intensive Applications
  • Video, Audio applications
  • Database applications
  • Scientific computations
  • Better Processor Technology
  • High processor speed

4
Introduction to Caches: State of the Art
  • SRAM - Strengths: high speed. Drawbacks: low density, high cost. Used in: caches.
  • DRAM - Strengths: low cost, high density. Drawbacks: low speed. Used in: main memory.
5
Introduction to Caches: Need of the Hour
  • Large memory size
  • To cater to memory intensive applications
  • Fast memory
  • To bridge the memory-processor speed gap

Solution: better memory hierarchy designs, i.e., caches
6
Introduction to Caches: What is a Cache?
  • High speed
  • Expensive
  • Small
  • Stores a copy of data requested by the CPU

7
Introduction to Caches: Placement
8
Introduction to Caches: Basic Concept
Principle of Locality
Temporal Locality: if a particular piece of data is accessed once, it is likely to be accessed again in the near future.
Spatial Locality: if a particular piece of data is accessed, data in and around its block is likely to be accessed as well.
9
Cache Management Issues
Block Placement: where to place a block in the cache
Block Identification: how to determine whether a block is already present in the cache
Block Replacement: which block to discard when the cache is full
Write Strategy: how and when data is written back to main memory
10
Cache Performance
Miss Percentage: ratio of unsuccessful cache accesses (misses) to the total number of accesses
Hit Percentage: ratio of successful cache accesses (hits) to the total number of accesses
For example, 95 hits out of 100 accesses gives a hit percentage of 95% and a miss percentage of 5%.
11
Cache Performance
  • Increase in hit percentage
  • Reduced memory access time
  • Reduced execution time
  • Increased performance
  • Increase in miss percentage
  • Increased memory access time
  • Increased execution time
  • Decreased performance

12
Cache Performance
  • All four phases of Cache Management are important
  • Block Placement
  • Block Identification
  • Block Replacement
  • Write strategy

Focus of this presentation: Block Replacement
13
Outline
  • Introduction to Caches
  • Cache Replacement Algorithms
  • Recency Based Algorithm
  • LIRS
  • Frequency Based Algorithm
  • FBR
  • Recency/Frequency Based Algorithm
  • LRFU
  • Conclusions

Cache Replacement Algorithms
14
Working of the Cache
Figure: the CPU issues a request for data; the cache controller splits the address field into a tag and an index to look up the cache.
15
Working of the Cache: In Case of a Cache Hit
Figure: the index selects a cache entry; if the entry is valid and its stored tag matches the address tag, the data is returned to the CPU.
16
Working of the Cache: In Case of a Miss
Figure: the requested block is fetched from main memory; if the cache is full, a cache replacement algorithm chooses which block to evict before the new data block is placed in the cache (a minimal lookup sketch follows).
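To make the tag/index lookup and the miss path concrete, here is a minimal sketch of a direct-mapped cache lookup. It is not from the slides; the 16-set, 16-byte-block geometry, the class name, and the fetch_from_memory callback are illustrative assumptions.

```python
# Minimal sketch of a direct-mapped cache lookup (illustrative, not from the slides).
# Assumed geometry: 16 sets, 16-byte blocks, byte-addressed memory.

NUM_SETS = 16
BLOCK_SIZE = 16

class DirectMappedCache:
    def __init__(self):
        # One entry per set: valid bit, tag, and the cached data block.
        self.entries = [{"valid": False, "tag": None, "data": None} for _ in range(NUM_SETS)]

    def split_address(self, addr):
        # offset selects a byte within the block, index selects the set, tag identifies the block
        offset = addr % BLOCK_SIZE
        index = (addr // BLOCK_SIZE) % NUM_SETS
        tag = addr // (BLOCK_SIZE * NUM_SETS)
        return tag, index, offset

    def access(self, addr, fetch_from_memory):
        tag, index, _ = self.split_address(addr)
        entry = self.entries[index]
        if entry["valid"] and entry["tag"] == tag:
            return entry["data"], True            # cache hit: data goes to the CPU
        # Cache miss: fetch the block from main memory and place it in the cache.
        # In a direct-mapped cache the victim is forced; set-associative caches
        # need a replacement algorithm here (LRU, LIRS, FBR, LRFU, ...).
        entry["valid"], entry["tag"] = True, tag
        entry["data"] = fetch_from_memory(addr)
        return entry["data"], False
```

With associativity greater than one, the index selects a set of candidate entries instead of a single entry, and the replacement algorithms discussed next decide which block in the set to evict.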
17
Types of Cache Replacement Algorithms
Recency Based Algorithms: based on the time since the last reference. Examples: Least Recently Used (LRU), Low Inter-reference Recency Set (LIRS).
Frequency Based Algorithms: based on the number of times a block is referenced. Examples: Least Frequently Used (LFU), Frequency Based Replacement (FBR).
18
Outline
  • Introduction to Caches
  • Cache Replacement Algorithms
  • Recency Based Algorithm
  • LIRS
  • Frequency Based Algorithm
  • FBR
  • Recency/Frequency Based Algorithm
  • LRFU
  • Conclusions

Recency Based Algorithms
19
Principle Behind Recency Based Algorithm
  • Based on the recency of reference of each block in the cache
  • Most popular: LRU
  • Associates each block with the time of its last reference
  • The block with the longest time since its last reference is discarded (see the sketch below)
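As a concrete reference point for the recency-based family, here is a minimal LRU sketch (not from the slides); it uses Python's OrderedDict, and the capacity value and load_block callback are illustrative.

```python
from collections import OrderedDict

# Minimal LRU sketch (illustrative): the most recently used block sits at the end,
# the least recently used block at the front.
class LRUCache:
    def __init__(self, capacity=4):           # capacity is an illustrative value
        self.capacity = capacity
        self.blocks = OrderedDict()            # block id -> data, in recency order

    def access(self, block, load_block):
        if block in self.blocks:
            self.blocks.move_to_end(block)     # hit: refresh its recency
            return self.blocks[block], True
        if len(self.blocks) >= self.capacity:
            self.blocks.popitem(last=False)    # miss: discard the least recently used block
        self.blocks[block] = load_block(block)
        return self.blocks[block], False
```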

20
Outline
  • Introduction to Caches
  • Cache Replacement Algorithms
  • Recency Based Algorithm
  • LIRS
  • Frequency Based Algorithm
  • FBR
  • Recency/Frequency Based Algorithm
  • LRFU
  • Conclusions

LIRS
21
Low Inter-Reference Recency Set (LIRS) Replacement Algorithm
  • Based on IRR (Inter-Reference Recency) information
  • IRR of a block: the number of other blocks referenced between the two most recent consecutive references to that block (see the sketch below)
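A small sketch (not from the slides) showing how IRR can be measured from a reference trace; the example trace is made up.

```python
# For each reference, IRR = number of distinct other blocks referenced since the
# previous reference to the same block (None for a first-time reference).
def irr_trace(trace):
    last_pos = {}          # block -> position of its previous reference
    irrs = []
    for i, block in enumerate(trace):
        if block in last_pos:
            between = set(trace[last_pos[block] + 1 : i])   # blocks seen in between
            irrs.append((block, len(between)))
        else:
            irrs.append((block, None))
        last_pos[block] = i
    return irrs

# Example (made-up trace): block A has IRR 2 at its second reference (B and C in between).
print(irr_trace(["A", "B", "C", "A", "B"]))
# [('A', None), ('B', None), ('C', None), ('A', 2), ('B', 2)]
```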

22
Low Inter-Reference Recency Set (LIRS) Replacement Algorithm
  • Referenced blocks are divided into
  • High Inter-Reference Recency (HIR) blocks
  • Low Inter-Reference Recency (LIR) blocks

L (cache size) = LIR set size + HIR set size
23
HIR vs. LIR
  • HIR: High Inter-Reference Recency
  • Low probability of being referenced again soon
  • Two types: HIR-R (resident, in the cache) and HIR-N (non-resident, not in the cache)
  • LIR: Low Inter-Reference Recency
  • High probability of being referenced again soon
  • Forms the major part of the cache
24
LIRS Algorithm
Figure: the LIRS stack S orders LIR blocks and recently seen HIR blocks (resident and non-resident) by recency; the separate list Q holds the resident HIR blocks, which are the eviction candidates.
25
LIRS Advantages
  • Dynamically and responsively distinguishes blocks by comparing their probabilities of being referenced in the near future
  • Keeps implementation overhead low (a simplified sketch of the algorithm follows)
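The following is a simplified LIRS sketch, not the authors' implementation: it keeps the stack S as an OrderedDict of block statuses, the list Q as an OrderedDict of resident HIR blocks, and uses a fixed 90/10 split between the LIR and HIR set sizes; the cache size, split, and class name are illustrative assumptions.

```python
from collections import OrderedDict

class LIRSCache:
    """Simplified LIRS sketch (illustrative): stack S holds LIR blocks plus recently
    seen HIR blocks (resident or not); queue Q holds the resident HIR blocks,
    with the eviction candidate at the front."""

    def __init__(self, cache_size=10, hir_fraction=0.1):
        # cache_size and the 90/10 LIR/HIR split are illustrative assumptions.
        self.lirs_size = max(1, int(cache_size * (1 - hir_fraction)))  # LIR set size
        self.hirs_size = max(1, cache_size - self.lirs_size)           # resident HIR set size
        self.S = OrderedDict()   # block -> 'LIR', 'HIR' (resident) or 'NHIR' (non-resident)
        self.Q = OrderedDict()   # resident HIR blocks, front = eviction candidate
        self.resident = set()    # blocks currently held in the cache
        self.lir_count = 0

    def _prune(self):
        # Keep the LIRS invariant: the bottom of stack S is always a LIR block.
        while self.S and next(iter(self.S.values())) != 'LIR':
            self.S.popitem(last=False)

    def _demote_bottom_lir(self):
        # Turn the LIR block at the bottom of S into a resident HIR block at the end of Q.
        block, _ = self.S.popitem(last=False)
        self.lir_count -= 1
        self.Q[block] = True
        self._prune()

    def access(self, x):
        """Returns True on a hit, False on a miss."""
        if x in self.resident:                        # ---------- hit ----------
            if self.S.get(x) == 'LIR':
                self.S.move_to_end(x)                 # raise x to the top of S
                self._prune()
            elif x in self.S:                         # resident HIR with low IRR: promote
                self.S.move_to_end(x)
                self.S[x] = 'LIR'
                self.lir_count += 1
                del self.Q[x]
                self._demote_bottom_lir()
            else:                                     # resident HIR not in S: stays HIR
                self.S[x] = 'HIR'                     # pushed onto the top of S
                self.Q.move_to_end(x)
            return True

        # ---------- miss ----------
        if self.lir_count < self.lirs_size:           # warm-up: fill the LIR set first
            self.S[x] = 'LIR'
            self.lir_count += 1
            self.resident.add(x)
            return False

        if len(self.Q) >= self.hirs_size:             # make room: evict the front of Q
            victim, _ = self.Q.popitem(last=False)
            self.resident.discard(victim)
            if victim in self.S:
                self.S[victim] = 'NHIR'               # it may stay in S as non-resident HIR

        self.resident.add(x)
        if x in self.S:                               # recently seen (non-resident HIR):
            self.S[x] = 'LIR'                         # its new IRR is low, so it becomes LIR
            self.S.move_to_end(x)
            self.lir_count += 1
            self._demote_bottom_lir()
        else:                                         # not seen recently: enters as HIR
            self.S[x] = 'HIR'
            self.Q[x] = True
        return False
```

The key design point the sketch illustrates is that eviction always takes the front of Q (a resident HIR block), while blocks that are re-referenced while still in S have demonstrated a low IRR and are promoted to the LIR set.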

26
Outline
  • Introduction to Caches
  • Cache Replacement Algorithms
  • Recency Based Algorithm
  • LIRS
  • Frequency Based Algorithm
  • FBR
  • Recency/Frequency Based Algorithm
  • LRFU
  • Conclusions

Frequency Based Algorithms
27
Frequency Based Replacement Algorithms
  • Based on the frequency of reference of each block in the cache
  • Most popular: LFU
  • Associates each block with the number of times it has been referenced
  • The block with the smallest number of references is discarded

28
Outline
  • Introduction to Caches
  • Cache Replacement Algorithms
  • Recency Based Algorithm
  • LIRS
  • Frequency Based Algorithm
  • FBR
  • Recency/Frequency Based Algorithm
  • LRFU
  • Conclusions

FBR
29
Frequency Based Replacement (FBR)
  • Each block is associated with a counter that keeps track of the number of times the block has been referenced
  • The cache is divided into three sections, ordered by recency
  • New section
  • Middle section
  • Old section

30
FBR Algorithm
  • New section: the most recently accessed blocks
  • Cache hit: if the block is in the new section, its counter is not incremented; otherwise the counter is incremented
  • Cache miss: the block with the smallest count outside the new section is discarded

31
FBR Algorithm
  • Old section: consists of blocks that have not been referenced recently
  • Cache miss: a block from the old section with the smallest count is discarded
  • If two blocks have the same count, the older (least recently used) one is discarded (see the sketch below)
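A simplified FBR sketch, not the original implementation: blocks are kept in recency order and split into new/middle/old sections by fixed position fractions; the capacity, the 30/40 section fractions, and the class name are illustrative assumptions.

```python
from collections import OrderedDict

class FBRCache:
    """Simplified FBR sketch (illustrative): blocks are kept in recency order
    (most recent last) and partitioned by position into new / middle / old sections."""

    def __init__(self, capacity=10, new_frac=0.3, old_frac=0.4):
        # capacity and section fractions are illustrative assumptions
        self.capacity = capacity
        self.new_size = max(1, int(capacity * new_frac))
        self.old_size = max(1, int(capacity * old_frac))
        self.blocks = OrderedDict()   # block -> reference count, in recency order

    def _section(self, block):
        # Position from the most recently used end: 0 = most recent.
        order = list(reversed(self.blocks))
        pos = order.index(block)
        if pos < self.new_size:
            return "new"
        if pos >= len(order) - self.old_size:
            return "old"
        return "middle"

    def access(self, block):
        if block in self.blocks:                        # ----- hit -----
            # Counts are incremented only outside the new section, which factors
            # out short-term locality from the frequency counts.
            if self._section(block) != "new":
                self.blocks[block] += 1
            self.blocks.move_to_end(block)
            return True

        if len(self.blocks) >= self.capacity:           # ----- miss, cache full -----
            order = list(self.blocks)                    # least recent first
            old_section = order[: self.old_size]
            # Victim: smallest count in the old section; on a tie, min() keeps the
            # first (least recently used) block, matching the slide's tie-break.
            victim = min(old_section, key=lambda b: self.blocks[b])
            del self.blocks[victim]

        self.blocks[block] = 1                           # new block enters the new section
        return False
```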

32
Advantages of FBR Algorithm
  • Factors out locality, so that genuinely frequently accessed blocks can be identified
  • Reasonable run-time overhead
  • Reasonable implementation complexity

33
Outline
  • Introduction to Caches
  • Cache Replacement Algorithms
  • Recency Based Algorithm
  • LIRS
  • Frequency Based Algorithm
  • FBR
  • Recency/Frequency Based Algorithm
  • LRFU
  • Conclusions

Recency/Frequency Based Algorithms
34
Least Recently/Frequently Used (LRFU)
  • Associates a CRF value with each block
  • CRF: Combined Recency and Frequency
  • x: time interval between the current time and a past reference
  • Each past reference contributes F(x) to a block's CRF, where F(x) is a weighing function
35
CRF Value Computation
  • b: the block being referenced
  • t_c: the current time, at which b is being referenced
  • t_{b_1} < t_{b_2} < ... < t_{b_k}: the time instances of the past references to block b, with t_{b_k} <= t_c
  • CRF of b at time t_c: C_{t_c}(b) = \sum_{i=1}^{k} F(t_c - t_{b_i})

Block with the lowest CRF value is replaced (see the sketch below)
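A direct transcription of this definition into Python (a sketch, not from the slides); the weighing function F(x) = (1/2)^(lambda*x) is the family discussed on a later slide, and lambda = 0.5 and the example reference times are illustrative choices.

```python
# CRF of block b at current time t_c, given the times of its past references.
def F(x, lam=0.5):                  # weighing function; lam (lambda) is illustrative
    return 0.5 ** (lam * x)

def crf(current_time, reference_times, lam=0.5):
    # C_{t_c}(b) = sum over past references of F(t_c - t_{b_i})
    return sum(F(current_time - t, lam) for t in reference_times)

# Example: a block referenced at times 1, 4, and 7, evaluated at time 8.
print(crf(8, [1, 4, 7]))            # recent references contribute more than old ones
```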
36
Properties of LRFU: Property 1
  • The LRFU algorithm replaces the same block as LFU if F(x) = c for all x, where c is a positive constant.

37
Properties of LRFU: Property 2
  • The LRFU algorithm replaces the same block as LRU if the weighing function satisfies the following condition: F(x) >= \sum_{i=x+1}^{\infty} F(i) for all x >= 0.

38
Illustration of Property 2
  • a: referenced once, at time t
  • b: referenced at every instant of time, with its most recent reference at t-1
  • A purely recency based algorithm replaces b instead of a
  • LRFU: under the Property 2 condition, C_t(a) = F(0) >= \sum_{i>=1} F(i) >= C_t(b), so LRFU also replaces b

39
Weighing Functions
  • Class of functions satisfying Property 1 and Property 2: F(x) = (1/p)^(lambda * x)
  • lambda is the control parameter, 0 <= lambda <= 1
  • p is greater than or equal to 2 (see the numeric check below)
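A small numeric check (illustrative, not from the slides) of how lambda positions F(x) = (1/p)^(lambda*x) between the LFU and LRU behaviors, using p = 2, the Property 2 condition F(x) >= sum of F(i) for i > x, and a truncated infinite sum.

```python
# F(x) = (1/p)^(lambda * x) with p = 2. lambda = 0 gives a constant F (Property 1,
# LFU-like behavior); lambda = 1 satisfies the Property 2 condition (LRU-like behavior).
def F(x, lam, p=2):
    return (1.0 / p) ** (lam * x)

def satisfies_lru_condition(lam, max_x=20, tail=2000):
    # Check F(x) >= sum_{i=x+1}^{infinity} F(i), truncating the tail of the sum.
    return all(F(x, lam) >= sum(F(i, lam) for i in range(x + 1, x + tail)) - 1e-9
               for x in range(max_x))

print(F(5, 0.0), F(9, 0.0))                  # 1.0 1.0 -> constant weights, LFU-like
print(satisfies_lru_condition(1.0))          # True  -> LRU-like
print(satisfies_lru_condition(0.5))          # False -> a policy in between
```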

40
Property 3
  • If F(x + y) = F(x) * F(y), the CRF value of block b at the time of its kth reference can be derived from its CRF value at the time of the (k-1)th reference as follows:
  • C_{t_k}(b) = F(0) + F(t_k - t_{k-1}) * C_{t_{k-1}}(b)
  • Thus only the CRF value at the last reference and the last reference time need to be stored per block (a verification sketch follows)
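A short self-contained check (illustrative) that the incremental update matches the naive sum when F(x+y) = F(x)*F(y), which holds for exponential F such as F(x) = (1/2)^(lambda*x); the lambda value and reference times are made up.

```python
def F(x, lam=0.5):                               # weighing function (lambda illustrative)
    return 0.5 ** (lam * x)

def crf_naive(current_time, reference_times):
    return sum(F(current_time - t) for t in reference_times)

# Property 3: C_{t_k}(b) = F(0) + F(t_k - t_{k-1}) * C_{t_{k-1}}(b)
def crf_update(prev_crf, prev_time, new_time):
    return F(0) + F(new_time - prev_time) * prev_crf

times = [1, 4, 7]                                # made-up reference times
c, last = F(0), times[0]
for t in times[1:]:
    c, last = crf_update(c, last, t), t
print(abs(c - crf_naive(times[-1], times)) < 1e-12)   # True: the recurrence reproduces the sum
```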

41
Property 4
  • If C_t(a) > C_t(b) and neither a nor b has been referenced after time t, then C_{t'}(a) > C_{t'}(b) for all t' > t
  • With a multiplicative F (as in Property 3), the CRF values of unreferenced blocks all decay by the same factor, so their relative ordering never changes; only the block that is referenced needs to be repositioned in the heap

42
Algorithm: Cache Hit Case
  • Block b is requested
  • The cache is checked for b
  • If present
  • Its CRF value is recomputed (using the Property 3 recurrence)
  • Its time of last reference is recorded
  • The heap that orders blocks by CRF value is restored

43
Algorithm: Cache Miss Case
  • Block b is requested
  • The cache is checked for b
  • If not present
  • The block is fetched from main memory
  • Its CRF value is initialized to F(0)
  • Its time of reference is recorded
  • The root of the heap (the block with the lowest CRF value) is replaced with the new block and the heap order is restored (see the sketch below)
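Putting the pieces together, a compact LRFU sketch (illustrative, not the authors' implementation): it stores only the last CRF value and last reference time per block and, for simplicity, scans for the minimum decayed CRF on eviction instead of maintaining the heap described on the slides; the capacity, lambda, class name, and toy trace are illustrative assumptions.

```python
class LRFUCache:
    """Minimal LRFU sketch: per-block state is (CRF at last reference, last reference time)."""

    def __init__(self, capacity=4, lam=0.5):      # capacity and lambda are illustrative
        self.capacity = capacity
        self.lam = lam
        self.clock = 0
        self.blocks = {}                          # block -> (crf_at_last_ref, last_ref_time)

    def _F(self, x):
        return 0.5 ** (self.lam * x)              # weighing function F(x) = (1/2)^(lambda x)

    def _crf_now(self, block):
        crf, t = self.blocks[block]
        return self._F(self.clock - t) * crf      # decay the stored CRF to the current time

    def access(self, block):
        self.clock += 1
        if block in self.blocks:                  # hit: incremental CRF update (Property 3)
            crf, t = self.blocks[block]
            self.blocks[block] = (self._F(0) + self._F(self.clock - t) * crf, self.clock)
            return True
        if len(self.blocks) >= self.capacity:     # miss: evict the block with the lowest
            victim = min(self.blocks, key=self._crf_now)   # current CRF value
            del self.blocks[victim]
        self.blocks[block] = (self._F(0), self.clock)      # new block: CRF = F(0)
        return False

# Tiny usage example on a made-up reference string.
cache = LRFUCache(capacity=3, lam=0.5)
hits = sum(cache.access(b) for b in "ABCABDAEAB")
print(hits)                                       # number of hits for this toy trace
```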

44
Experimental Conditions
  • Workloads
  • File System traces
  • Database traces
  • Algorithms compared with
  • LRU
  • LFU
  • LRU-2
  • FBR

45
Experiments Conducted
  • Performance metric: hit ratio
  • Experiment 1
  • Comparison of the hit ratios of LRFU with those of the other algorithms, for varying cache sizes
  • Experiment 2
  • Effect of lambda on the performance of LRFU, in terms of hit ratio

46
Results: Experiment 1
LRFU performs as well as the other algorithms.
Figure: comparison of LRFU with the other algorithms (file system trace)
47
Results: Experiment 1
LRFU performs as well as the other algorithms.
Figure: comparison of LRFU with the other algorithms (database trace)
48
Results: Experiment 2
The hit rate first increases with lambda, then decreases and levels off as lambda increases further.
As the cache size increases, the peak hit rate is obtained at a lower lambda.
Figure: effect of lambda on the performance of LRFU
49
Summary of LRFU
  • Offers a spectrum of policies between LRU and LFU through the weighing function and the parameter lambda
  • Exploits the benefits of both recency and frequency based algorithms
  • Performs competitively with the other algorithms
  • For larger cache sizes, the peak hit rate is obtained at a lower lambda

50
Outline
  • Introduction to Caches
  • Cache Replacement Algorithms
  • Recency Based Algorithm
  • LIRS
  • Frequency Based Algorithm
  • FBR
  • Recency/Frequency Based Algorithm
  • LRFU
  • Conclusions

Conclusions
51
Conclusions
  • Cache management is vital in computer systems
  • Cache replacement algorithms
  • Recency based (e.g., LIRS)
  • Frequency based (e.g., FBR)
  • Recency/frequency based (e.g., LRFU), exploiting the benefits of both types

52
  • Thank You!!!