CS61C Midterm - PowerPoint PPT Presentation

Provided by: dadk
1
CS61C Midterm 2 Review Session
  • A little Cache goes a long way

2
The Ideal Memory System
Fast, Cheap (Large)
3
Actual Memory Systems
Fast, Expensive (Small)
Slow, Cheap (Large)
4
Idea: Multilevel Memory (Cache)


5
The Cache
(Diagram: CPU reads through the cache, where each line holds a Tag and Data, backed by Main Memory)
  • Store recently used data in fast memory
  • Cache Hit
  • Address we're looking for is in the cache
  • Cache Miss
  • Not found; read main memory and insert the block into the cache
  • This works because programs exhibit locality
6
Locality
(Diagram: address space — code, data, stack — plotted against time; x marks a just-referenced address)
  • Spatial Locality: a reference to data near x is likely
  • Temporal Locality: likely to reference x again soon
7
Computing Average Access Time
Q: Suppose we have a cache with a 5 ns access time, main memory with a 60 ns access time, and a cache hit rate of 95%. What is the average access time?
8
Cache Design Issues
  • Associativity
    • Fully associative, direct-mapped, n-way set associative
  • Block Size
  • Replacement Strategy
    • LRU, etc.
  • Write Strategy
    • Write-through, write-back

9
An Example
10
Multiple Choice (1)
  • LRU is an effective cache replacement policy primarily because programs
    • exhibit locality of reference
    • usually have small working sets
    • read data much more frequently than they write it
    • can generate addresses that collide in the cache

11
Multiple Choice (2)
  • Increasing the associativity of a cache improves performance primarily because programs
    • exhibit locality of reference
    • usually have small working sets
    • read data much more frequently than they write it
    • can generate addresses that collide in the cache

12
Multiple Choice (3)
  • Increasing the block size of a cache improves performance primarily because programs
    • exhibit locality of reference
    • usually have small working sets
    • read data much more frequently than they write it
    • can generate addresses that collide in the cache