Client Cache Management

1
Client Cache Management
  • Improving the broadcast for one access-probability distribution will hurt the performance of other clients with different access distributions.
  • Therefore, client machines need to cache pages obtained from the broadcast.

2
Client Cache Management
  • With traditional caching, clients cache the data most likely to be accessed in the future.
  • With Broadcast Disks, traditional caching may lead to poor performance if the server's broadcast is poorly matched to the client's access distribution.

3
Client Cache Management
  • In the Broadcast Disk system, clients cache the
    pages for which the local probability of access
    is higher than the frequency of broadcast.
  • This leads to the need for cost-based page
    replacement.

7
Client Cache Management
  • One cost-based page replacement strategy, called PIX, replaces the page that has the lowest ratio between its probability of access (P) and its frequency of broadcast (X).
  • PIX requires the following:
  • 1. Perfect knowledge of access probabilities.
  • 2. Comparison of PIX values for all cache-resident pages at cache replacement time.
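
A minimal sketch of the PIX replacement decision, assuming each cached page's access probability P and broadcast frequency X are already known; the function name and data layout below are illustrative, not taken from the slides:

```python
def pix_victim(cache):
    """Return the cache-resident page with the lowest PIX = P / X.

    `cache` maps page_id -> (p, x), where p is the page's local access
    probability and x is its broadcast frequency.  PIX assumes both are
    known exactly and compares every cached page at replacement time.
    """
    return min(cache, key=lambda page: cache[page][0] / cache[page][1])
```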

8
Example
  • One page is accessed 1% of the time and is also broadcast 1% of the time.
  • A second page is accessed only 0.5% of the time but is broadcast only 0.1% of the time.
  • Which page should be replaced? The first will be replaced in favor of the second, since its PIX value (1 / 1 = 1) is lower than the second's (0.5 / 0.1 = 5).
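
Plugging these numbers into the pix_victim sketch above (the page identifiers are illustrative):

```python
cache = {
    "page1": (0.010, 0.010),  # accessed 1% of the time, broadcast 1% of the time     -> PIX = 1
    "page2": (0.005, 0.001),  # accessed 0.5% of the time, broadcast 0.1% of the time -> PIX = 5
}
print(pix_victim(cache))  # -> 'page1': the first page is replaced in favor of the second
```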

11
Client Cache Management
  • Another page replacement strategy adds the frequency of broadcast to an LRU-style policy. This policy is known as LIX.
  • LIX maintains a separate list (chain) of cache-resident pages for each logical disk.
  • A page enters the chain corresponding to the disk on which it is broadcast.
  • Each list is ordered based on an approximation of the access probability (L) for each page.

12
Cont.
  • When a page is hit, it is moved to the top of its chain.
  • When a new page enters the cache, a LIX value is computed for the page at the bottom of each chain by dividing its L value by X, its frequency of broadcast.
  • The page with the lowest LIX value is replaced, as sketched below.
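
A minimal sketch of LIX, assuming one chain per logical disk, a known broadcast frequency X per page, and an exponentially smoothed estimate L of the access rate; the class name, the smoothing constant, and the exact form of the L estimator are illustrative choices, since the slides do not specify them:

```python
from collections import OrderedDict

ALPHA = 0.25  # smoothing constant for the running access estimate (illustrative)

class LixCache:
    """Sketch of LIX replacement: one LRU-style chain per logical disk."""

    def __init__(self, capacity, num_disks, broadcast_freq):
        self.capacity = capacity
        self.broadcast_freq = broadcast_freq    # page -> X, its frequency of broadcast
        # Each chain keeps the most recently hit page at the front (top).
        self.chains = [OrderedDict() for _ in range(num_disks)]

    def access(self, page, disk, now):
        chain = self.chains[disk]
        if page in chain:
            # Hit: refresh the running estimate L and move the page to the top of its chain.
            last_time, l_est = chain.pop(page)
            l_est = ALPHA / max(now - last_time, 1e-9) + (1 - ALPHA) * l_est
            chain[page] = (now, l_est)
        else:
            # Miss: make room if the cache is full, then enter the chain of this page's disk.
            if sum(len(c) for c in self.chains) >= self.capacity:
                self._evict()
            chain[page] = (now, 1e-3)           # new pages start with a small default L
        chain.move_to_end(page, last=False)     # top of the chain

    def _evict(self):
        # Only the page at the bottom of each chain is a candidate: LIX = L / X.
        victim_chain, victim, best = None, None, float("inf")
        for chain in self.chains:
            if not chain:
                continue
            page = next(reversed(chain))        # bottom of this chain
            _, l_est = chain[page]
            lix = l_est / self.broadcast_freq[page]
            if lix < best:
                victim_chain, victim, best = chain, page, lix
        if victim is not None:
            del victim_chain[victim]
```

Because only the bottom page of each chain is examined, the work done at replacement time grows with the number of disks rather than with the number of cached pages, which is what makes LIX cheaper than PIX.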

15
Prefetching
  • PIX and LIX apply only to demand-driven page requests.
  • Prefetching is an alternative approach to obtaining pages from the broadcast.
  • The goal is to improve the response time of clients that access data from the broadcast.
  • Methods of prefetching:
  • Tag Team Caching
  • Prefetching Heuristic

16
Prefetching
  • Tag Team Caching - pages continually replace each other in the cache.
  • For example, with two pages x and y being broadcast, the client caches x as it arrives on the broadcast, then drops x and caches y when y arrives on the broadcast.
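
A minimal sketch of tag-team caching for this two-page example, assuming a single cache slot and a hook that is called as each page goes by on the broadcast (the class and method names are illustrative):

```python
class TagTeamCache:
    """Single-slot cache in which two pages continually replace each other."""

    def __init__(self, pages=("x", "y")):
        self.team = set(pages)  # the pages that take turns in the slot
        self.slot = None        # currently cached page

    def on_broadcast(self, page):
        # When a team page arrives on the broadcast, it displaces its partner.
        if page in self.team:
            self.slot = page

    def lookup(self, page):
        return page == self.slot  # hit only if the request matches the cached page
```

Because the slot always holds the page broadcast most recently, a request for the other page waits at most half a broadcast rotation, which is where the reduced expected delay computed below comes from.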

18
  • Expected delay in the demand-driven scheme
  • Suppose a client is interested in accessing pages X and Y, with Px = Py = 0.5 and a single cache slot.
  • In the demand-driven scheme the client caches X; if it then needs Y, it waits for Y on the broadcast and replaces the cached X with Y.
  • The expected delay on a cache miss is ½ of a rotation of the disk.
  • The expected delay over all accesses is Σi (Ci × Mi × Di), where Ci is the access probability, Mi is the probability of a cache miss, and Di is the expected broadcast delay for page i.
  • For pages X and Y this is 0.5 × 0.5 × 0.5 + 0.5 × 0.5 × 0.5 = 0.25 of a rotation.
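
As a check on this arithmetic, the formula can be evaluated directly (values taken from the example above; delays are in units of one broadcast rotation):

```python
# Expected delay = sum over pages i of Ci * Mi * Di (demand-driven, two pages).
pages = {
    "X": {"C": 0.5, "M": 0.5, "D": 0.5},  # access prob., miss prob., expected broadcast delay
    "Y": {"C": 0.5, "M": 0.5, "D": 0.5},
}
expected_delay = sum(p["C"] * p["M"] * p["D"] for p in pages.values())
print(expected_delay)  # 0.25 of a rotation
```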

19
Expected Delay in Tag-Team Caching
  • 0.5 × 0.5 × 0.25 + 0.5 × 0.5 × 0.25 = 0.125, that is, the average cost is ½ that of the demand-driven scheme.
  • Why? In the demand-driven scheme a miss can occur at any time during the broadcast, whereas with tag-team caching misses can only occur during the half of the broadcast in which the requested page is not cached, so the expected wait on a miss is only 0.25 of a rotation.
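
Both figures can also be checked with a small simulation of the two-page broadcast, assuming one rotation lasts 1 time unit, X is broadcast at integer times, and Y half a rotation later (the whole setup is illustrative):

```python
import random

def wait_for(page, t):
    """Time from t until the next broadcast of `page` (X at integers, Y at k + 0.5)."""
    offset = 0.0 if page == "X" else 0.5
    return (offset - t) % 1.0

def average_delay(policy, trials=200_000, seed=1):
    rng = random.Random(seed)
    total, cached = 0.0, "X"
    for _ in range(trials):
        t = rng.random()                      # request arrives at a random point in the rotation
        want = rng.choice(("X", "Y"))         # Px = Py = 0.5
        if policy == "tag-team":
            cached = "X" if t < 0.5 else "Y"  # slot holds the page broadcast most recently
        if want != cached:
            total += wait_for(want, t)        # cache miss: wait for the page on the broadcast
            if policy == "demand-driven":
                cached = want                 # the faulted-in page takes the single slot
    return total / trials

print(average_delay("demand-driven"))  # about 0.25 of a rotation
print(average_delay("tag-team"))       # about 0.125 of a rotation
```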

20
Prefetching
  • Simple Prefetching Heuristic (PT)
  • Performs a calculation for each page that arrives on the broadcast, based on the probability of access for the page (P) and the amount of time that will elapse before the page comes around again (T).
  • If the PT value of the page being broadcast is higher than the lowest PT value of any page in the cache, the cached page with the lowest PT value is replaced.
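
A minimal sketch of the PT decision made as each page arrives, assuming the client knows each page's access probability and can read the time to its next broadcast off the schedule; the function and parameter names are illustrative:

```python
def prefetch_decision(arriving_page, cache, prob, time_to_next):
    """Decide whether the arriving broadcast page should displace a cached page.

    cache:        set of currently cached page ids (assumed full)
    prob:         page -> P, the page's access probability
    time_to_next: page -> T, time until that page next comes around on the broadcast
    Returns the (possibly updated) cache contents.
    """
    def pt(page):
        return prob[page] * time_to_next[page]

    if arriving_page in cache or not cache:
        return cache
    victim = min(cache, key=pt)               # cached page with the lowest PT value
    if pt(arriving_page) > pt(victim):
        cache = (cache - {victim}) | {arriving_page}
    return cache
```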
