1
Page allocation to reduce access time of physical
caches
  • Brian K Bray
  • William L Lynch
  • M J Flynn

2
Paging with TLB
3
Key words
  • Identity function
  • A function with no side effects; it always
    returns the same value that was passed as its
    argument
  • Unified cache
  • A single cache that holds both instructions and
    data, balancing the load between instruction and
    data fetches
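The identity function can be illustrated with a one-line sketch (Python used purely for illustration; the function name is not from the slides):

```python
def identity(x):
    """Return the argument unchanged (no side effects)."""
    return x

print(identity(42))        # prints 42
print(identity("cache"))   # prints cache
```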

4
Associative-mapped cache
  • In an associative cache, any block from main
    memory can be placed anywhere in the cache.
  • The block is uniquely identified by its main
    memory block number (tag).

5
Conventional address translation
6
Main points
  • Set-selection bits are required to begin cache
    access.
  • Associativity bits represent the increase in
    cache size.
  • The bin bits indicate the page location.
  • Number of bins = (cache size per degree of
    associativity) / page size
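The bin-count formula above can be checked with a small sketch (the function name and example sizes are illustrative, not from the slides):

```python
# number_of_bins = (cache size / associativity) / page size
def number_of_bins(cache_size, associativity, page_size):
    return (cache_size // associativity) // page_size

# Example: a 256 KiB 2-way cache with 4 KiB pages has 32 bins,
# i.e. 5 bin bits above the page offset in the cache index.
bins = number_of_bins(256 * 1024, 2, 4 * 1024)
print(bins)                    # prints 32
print(bins.bit_length() - 1)   # prints 5 (bin bits)
```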

7
  • Requires translation before cache access.
  • This requires cycle extension or an extra
    pipeline stage to do address translation,
    offsetting some of the latency advantage of a
    low-associativity cache.

8
Page coloring
9
(No Transcript)
10
Effects of page coloring
  • Translation proceeds in parallel with cache
    indexing, allowing more time to access the TLB.
  • Improves cache performance.

11
Inclusion Benefit
  • Reduces the cache-coherence complexity of
    two-level cache organizations.
  • Screens unnecessary cache-coherency traffic from
    the first-level cache.
  • Formula for the necessary associativity of the
    second-level cache:
  • A2 > (size1 / page size) × (B2 / B1)

12
  • More general definition:
  • A2 > (size1 / X) × (B2 / B1)
  • Here X = minimum(size1 / A1, size of the unit in
    which virtual and physical addresses are equal)
  • With page coloring:
  • X = minimum(size1 / A1, 2^b × page size)
    for b > 0
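The general formula can be evaluated with a small sketch (variable names and the example configuration are assumptions, not from the slides):

```python
import math

# A2 > (size1 / X) * (B2 / B1), where with page coloring
# X = min(size1 / A1, 2**b * page_size) for b > 0.
def min_l2_associativity(size1, A1, B1, B2, page_size, b):
    X = min(size1 // A1, (2 ** b) * page_size)
    return math.ceil((size1 / X) * (B2 / B1))

# Example: 16 KiB direct-mapped L1 (32 B blocks), 64 B L2 blocks,
# 4 KiB pages, b = 2 colored bits.
print(min_l2_associativity(16 * 1024, 1, 32, 64, 4 * 1024, 2))  # prints 2
```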

13
Benchmarks
  • A trace-driven simulator produced data comparing
    colored and uncolored page mapping.

14
(No Transcript)
15
(No Transcript)
16
(No Transcript)
17
(No Transcript)
18
Memory Partitioning
  • Main memory is partitioned into 2^b sets
    (reducing the effective main-memory size).
  • Page allocation no longer consists of selecting
    the first element on a free list (fully
    associative).
  • Rather, a correctly-colored page must be selected
    from the matching free list (set associative).
  • Causes extra paging.
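The colored free lists described above can be sketched as follows (a minimal illustration of the idea; the class and its interface are assumptions, not the paper's code):

```python
from collections import deque

# Main memory is partitioned into 2**b colors; allocation pulls a frame
# from the free list matching the page's color instead of from a single
# global free list.
class ColoredFrameAllocator:
    def __init__(self, num_frames, b):
        self.colors = 2 ** b
        self.free = [deque() for _ in range(self.colors)]
        for frame in range(num_frames):
            self.free[frame % self.colors].append(frame)

    def alloc(self, virtual_page_number):
        color = virtual_page_number % self.colors  # required frame color
        if not self.free[color]:
            # An empty per-color list can force paging even though
            # frames of other colors remain free.
            raise MemoryError("no free frame of this color")
        return self.free[color].popleft()

allocator = ColoredFrameAllocator(num_frames=8, b=1)
print(allocator.alloc(5))  # odd virtual page gets an odd-colored frame: 1
```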

19
(No Transcript)
20
Conclusion
  • Page coloring removes the need for address
    translation before cache access.
  • Proper use of page coloring retains overall low
    latency and gains repeatable performance, with
    significant OS complexity.
  • The TLB can be accessed in parallel with the
    cache.
  • Reduces the associativity needed by the
    second-level cache.