Title: Cache and Caching
1 Cache and Caching
- David Sands
- CS 147 Spring 08
- Dr. Sin-Min Lee
2 The Cache
- Faster memory access times.
- Remembering frequently accessed data.
- Block of memory for temporary storage or indexing.
3 What are Pages?
- Not made of paper
- Just a block of data being accessed.
- Memory from disk -> RAM or another container.
- When we are out of page space, what do we do?
4 Page Replacement Algorithms
- Optimal Page Replacement -- swap out the page that will not be needed for the longest time in favor of a page that is about to be used.
- First In, First Out (FIFO) -- a simple queue: the oldest page is evicted first.
- Second-Chance -- a circular queue. If the reference bit is set, clear it and put the page at the back of the queue; if it is not set, evict the page. (See the sketch below.)
- Clock -- just like second-chance, but uses a hand (iterator) instead of putting pages at the back of the queue.
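A minimal sketch of the second-chance policy in Python, assuming a fixed-capacity frame queue and one reference bit per page (the class and method names are illustrative, not from the slides):

from collections import deque

class SecondChanceCache:
    """Second-chance replacement: FIFO, but referenced pages get another pass."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.queue = deque()   # pages in arrival order
        self.ref_bit = {}      # page -> reference bit

    def access(self, page):
        if page in self.ref_bit:          # hit: mark the page as referenced
            self.ref_bit[page] = True
            return "hit"
        if len(self.queue) == self.capacity:
            self._evict()
        self.queue.append(page)           # miss: load the new page
        self.ref_bit[page] = False
        return "miss"

    def _evict(self):
        while True:
            victim = self.queue.popleft()
            if self.ref_bit[victim]:      # referenced: clear the bit, second chance
                self.ref_bit[victim] = False
                self.queue.append(victim)
            else:                         # not referenced: evict it
                del self.ref_bit[victim]
                return victim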
5 More Algorithms
- Not Recently Used (NRU) -- favors keeping pages in the cache that were used recently (keeps track of the referenced / modified bits).
- Least Recently Used (LRU) -- assumes pages used recently will be used again in the near future, so it evicts the page unused for the longest time. Hard to implement. (See the sketch below.)
- Random -- swaps out random pages; fast compared to FIFO and LRU.
- Not Frequently Used (NFU) -- keeps a counter of uses for each page and swaps out the underutilized pages.
- Aging -- favoritism (priority) for recently referenced pages; swaps out old pages first.
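The slides note LRU is hard to implement; in software, a minimal sketch using Python's OrderedDict (a stand-in for the real bookkeeping, not the slides' method) could look like this:

from collections import OrderedDict

class LRUCache:
    """Least Recently Used: evict the page that has gone unused the longest."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.pages = OrderedDict()   # ordered from least to most recently used

    def access(self, page):
        if page in self.pages:
            self.pages.move_to_end(page)    # hit: now the most recently used
            return "hit"
        if len(self.pages) == self.capacity:
            self.pages.popitem(last=False)  # miss on a full cache: drop the LRU page
        self.pages[page] = True
        return "miss"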
6 Primitive Example
-- Fetch D, then C, then B, then A from the tree memory.
-- What we access is chained above the tree into a tree-like cache.
-- If we use the FIFO algorithm, we replace D when there's no cache space left (sketch below).
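A minimal sketch of that example, assuming a cache that holds three pages, so fetching A after D, C, and B forces the oldest entry (D) out:

from collections import deque

def fifo_fetch(accesses, capacity=3):
    """Replay the D, C, B, A example with FIFO replacement."""
    cache = deque()
    for page in accesses:
        if page in cache:
            continue                      # already cached, nothing to do
        if len(cache) == capacity:
            evicted = cache.popleft()     # the oldest page leaves first
            print("evict", evicted, "to make room for", page)
        cache.append(page)
    return list(cache)

print(fifo_fetch(["D", "C", "B", "A"]))   # evicts D; cache ends as ['C', 'B', 'A']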
7 Cache Types
- Memory Cache -- RAM to CPU
- Disk Cache -- disk to RAM
- Memory, hardware, software, disk, page, and virtual memory caches
8 Disk Cache
- The hard disk's buffer acts as a cache.
- The page cache is controlled by the kernel.
9 Other Cache Examples
- DNS daemon -- mapping of IP addresses.
- Web browser -- recently visited websites.
- Search engines -- popular sites.
- Databases -- indexing and the data dictionary.
10 L1, L2, L3 Cache
- Provides tiers of cache memory.
- As memory size and distance from the CPU increase, access time becomes longer.
- A cost-benefit problem.
- L3 cache is not required, but its larger storage makes it attractive.
11L1, L2, L3 Cache (cont.)
?L1 Inside processor chip(like registers)
- L2 Outside processor(can be on motherboard)
? L3 between L2 and main memory
12 Cache Write Policy
- A datum is written to the cache.
- How do we update the entry in main memory?
- Write-through -- if there is a copy in the cache, every write updates the cache and main memory on the fly. -- Overloads the bus with multiple requests.
- Write-back -- updates main memory with the final data only (when the block is replaced). -- Reduces bus traffic, but hides the inconsistency. (Sketch below.)
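A minimal sketch of the two policies, assuming a plain dictionary stands in for main memory (class and method names are illustrative):

class WriteThroughCache:
    """Write-through: every write updates the cache and main memory right away."""

    def __init__(self, memory):
        self.memory = memory
        self.cache = {}

    def write(self, addr, value):
        self.cache[addr] = value
        self.memory[addr] = value        # extra bus traffic on every write


class WriteBackCache:
    """Write-back: writes stay in the cache; memory sees only the final data."""

    def __init__(self, memory):
        self.memory = memory
        self.cache = {}
        self.dirty = set()

    def write(self, addr, value):
        self.cache[addr] = value
        self.dirty.add(addr)             # memory is now stale (hidden inconsistency)

    def flush(self, addr):
        if addr in self.dirty:           # write the final value back once
            self.memory[addr] = self.cache[addr]
            self.dirty.discard(addr)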
13 Hit or Miss? -- Searching
- Does the desired tag in the cache memory match the index from main memory?
-- If so, use the data from the cache: a HIT.
-- Else, search main memory for the data: a MISS.
- HIT RATIO = the percentage of accesses that hit (see the sketch below).
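A minimal lookup sketch that counts hits and misses and reports the hit ratio (dictionaries stand in for the cache and main memory; the names are assumptions, not from the slides):

def lookup(tag, cache, main_memory, stats):
    """Return the data for a tag, counting hits and misses along the way."""
    if tag in cache:
        stats["hits"] += 1            # HIT: serve straight from the cache
        return cache[tag]
    stats["misses"] += 1              # MISS: fall back to main memory
    data = main_memory[tag]
    cache[tag] = data                 # remember it for the next access
    return data

def hit_ratio(stats):
    """Percentage of accesses that hit in the cache."""
    total = stats["hits"] + stats["misses"]
    return 100.0 * stats["hits"] / total if total else 0.0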
14 Miss Rate vs. Cache Size
- (Chart: miss rate vs. cache size for each method of mapping the cache to data elements.)
15 Time Analysis for One L1 Cache
- L1 Cache Avg. Cost = r * Ch + (1 - r) * Cm (worked example below)
- r = hit ratio
- Ch = L1 cache access time
- Cm = main memory access time
- This is an expected value weighted by the hit and miss probabilities.
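A worked example with hypothetical numbers (a 90% hit ratio, a 10 ns cache, and 100 ns main memory; none of these figures come from the slides):

def avg_access_time(r, c_hit, c_mem):
    """Average cost = r*Ch + (1 - r)*Cm for a single L1 cache."""
    return r * c_hit + (1 - r) * c_mem

# 90% of accesses pay 10 ns, the other 10% pay 100 ns.
print(avg_access_time(0.9, 10, 100))   # 0.9*10 + 0.1*100 = 19 ns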
16 Multiple Cache Analysis
- Just extend the probability function.
- L1 and L2 cache setup:
- Avg. Cost = r1 * Ch1 + r2 * Ch2 + (1 - r1 - r2) * Cm
- Subscript 1 = L1 cache, subscript 2 = L2 cache
- Probability of a main-memory fetch = 1 - r1 - r2
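The same calculation extended to two cache levels, again with hypothetical timings (80% L1 hits at 5 ns, 15% L2 hits at 20 ns, 100 ns main memory):

def avg_access_time_two_levels(r1, r2, c_h1, c_h2, c_mem):
    """Average cost = r1*Ch1 + r2*Ch2 + (1 - r1 - r2)*Cm."""
    return r1 * c_h1 + r2 * c_h2 + (1 - r1 - r2) * c_mem

# 80% hit in L1, 15% hit in L2, the remaining 5% go to main memory.
print(avg_access_time_two_levels(0.80, 0.15, 5, 20, 100))   # 4 + 3 + 5 = 12 ns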
18 References
- http://en.wikipedia.org/wiki/Cache
- http://en.wikipedia.org/wiki/Paging
- http://en.wikipedia.org/wiki/Page_replacement_algorithm
- http://en.wikipedia.org/wiki/CPU_cache