Cache Memory: E048-Shailesh Tanwar, E061-Yash Nair, E017-Shairal Neema, E019-Aashita Nyati

Posted on 15-Aug-2015

Page 1: Cache memory

Cache Memory

E048-Shailesh Tanwar

E061-Yash Nair

E017-Shairal Neema

E019-Aashita Nyati

Page 2: Cache memory

Agenda

Memory Hierarchy

What is Cache Memory

Working of Cache

Structure of Cache

Cache Write Policies

Levels of Cache

Cache Organization

Mapping techniques

Replacement algorithms

Page 3: Cache memory

Memory Hierarchy – Diagram

Going down the hierarchy:

Decreasing cost per bit

Increasing capacity

Increasing access time

Decreasing frequency of access of the memory by the processor

Page 4: Cache memory

What is Cache Memory?

Cache memory is used to achieve higher CPU performance by allowing the CPU to access data at a faster speed.

It is placed closest to the processor in the computer assembly.

It is far more expensive per bit than main memory.

It is also a type of memory, but because of this cost factor it cannot be used as primary memory.

Diagram: CPU <-> Cache <-> Main Memory

Page 5: Cache memory

AN INSIGHT INTO THE WORKING OF CACHE

Diagram: CPU <-> Cache <-> Main Memory

Page 6: Cache memory

Structure of the Cache memory

A cache line has two fields: a Tag and a Data Block.

Tag: contains the address of the actual data fetched from main memory.

Data Block: contains the actual data fetched from main memory.

Page 7: Cache memory

Cache write policies

When we write, should we write to the cache or to main memory?

Write-through cache: write to both the cache and main memory. Cache and memory are always consistent.

Write-back cache: write only to the cache and set a "dirty bit". When the block is replaced from the cache, write it out to memory.
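The two policies can be sketched in a few lines. This is a minimal illustration, not from the slides: the CacheBlock class is invented here, and its memory attribute stands in for the main-memory copy that real hardware would hold.

```python
class CacheBlock:
    """Toy model of one cache block under write-through or write-back."""

    def __init__(self, data, write_back=False):
        self.data = data            # copy held in the cache
        self.memory = data          # stands in for the main-memory copy
        self.write_back = write_back
        self.dirty = False          # only meaningful for write-back

    def write(self, value):
        self.data = value
        if self.write_back:
            self.dirty = True       # defer the memory update
        else:
            self.memory = value     # write-through: update memory at once

    def evict(self):
        if self.write_back and self.dirty:
            self.memory = self.data # flush the dirty block on replacement
            self.dirty = False

wt = CacheBlock(0)                  # write-through
wt.write(42)
assert wt.memory == 42              # memory already consistent

wb = CacheBlock(0, write_back=True)
wb.write(42)
assert wb.memory == 0               # memory stale until eviction
wb.evict()
assert wb.memory == 42
```

The asserts show the trade-off from the slide: write-through keeps memory consistent on every write, while write-back touches memory only when a dirty block is replaced.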

Page 8: Cache memory

Levels of Cache

CPU -> Level 1 Cache (fastest) -> Level 2 Cache (fast) -> Level 3 Cache (less fast) -> Main Memory (slow)

Page 9: Cache memory

Cache Organization

Diagram: the processor exchanges address, control, and data signals with the cache; the cache connects to the system bus through an address buffer and a data buffer, with control lines to both.

Page 10: Cache memory

Mapping Techniques

Direct mapping

Associative mapping

Set associative mapping

Page 11: Cache memory

Direct mapping

The simplest technique. Each block of main memory is mapped into only one possible cache line:

i = j modulo m

where:

i = cache line number

j = main memory block number

m = number of lines in the cache
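The mapping rule i = j mod m can be seen with a short sketch (the cache size of 128 lines and the block numbers are assumed values for illustration):

```python
m = 128                      # number of cache lines (assumed value)

for j in (0, 128, 300):      # main-memory block numbers
    i = j % m                # the only cache line block j may occupy
    print(f"block {j} -> cache line {i}")
```

Blocks 0 and 128 both map to line 0, which is exactly the conflict that associative mapping later removes.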

Page 12: Cache memory
Page 13: Cache memory

Address length = (s + w) bits

Number of addressable units = 2^(s+w) words or bytes

Block size = line size = 2^w words or bytes

Number of blocks in main memory = 2^(s+w) / 2^w = 2^s

Number of lines in cache = m = 2^r

Size of tag = (s - r) bits
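Splitting a direct-mapped address into its tag, line, and word fields is just bit masking. A sketch with assumed field widths s = 16, r = 8, w = 4 (the address value is arbitrary):

```python
s, r, w = 16, 8, 4                   # assumed field widths
addr = 0xABCD5                       # an arbitrary (s + w)-bit address

word = addr & ((1 << w) - 1)         # low w bits: word within the block
line = (addr >> w) & ((1 << r) - 1)  # next r bits: cache line number
tag  = addr >> (w + r)               # remaining (s - r) bits: tag

print(hex(tag), hex(line), hex(word))
```

For 0xABCD5 this yields tag 0xAB, line 0xCD, word 0x5, matching the (s - r), r, w field widths above.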

Page 14: Cache memory

ASSOCIATIVE MAPPING

It overcomes the disadvantage of direct mapping.

It permits each main memory block to be loaded into any line of the cache.

Page 15: Cache memory
Page 16: Cache memory

Address length = (s + w) bits

Number of addressable units = 2^(s+w) words or bytes

Block size = line size = 2^w words or bytes

Number of blocks in main memory = 2^(s+w) / 2^w = 2^s

Number of lines in cache = undetermined

Size of tag = s bits

Page 17: Cache memory

SET ASSOCIATIVE MAPPING

The relationships followed here are

m = v * k

i = j modulo v

where:

i = cache set number

j = main memory block number

m = number of lines in the cache

v = number of sets

k = number of lines in each set

This is called k-way set associative mapping.
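The relations m = v * k and i = j mod v can be sketched directly (the 64 sets and 2-way organization are assumed values for illustration):

```python
v, k = 64, 2                 # 64 sets, 2 lines per set (2-way, assumed)
m = v * k                    # total cache lines: m = v * k

for j in (0, 64, 200):       # main-memory block numbers
    i = j % v                # set that block j maps into
    print(f"block {j} -> set {i} (any of its {k} lines)")
```

Blocks 0 and 64 land in the same set, but unlike direct mapping they can coexist because the set holds k = 2 lines.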

Page 18: Cache memory
Page 19: Cache memory

Address length = (s + w) bits

Number of addressable units = 2^(s+w) words or bytes

Block size = line size = 2^w words or bytes

Number of blocks in main memory = 2^s

Number of lines in set = k

Number of sets = v = 2^d

Number of lines in cache = k * v = k * 2^d

Size of tag = (s - d) bits

Page 20: Cache memory

Replacement algorithms

When the cache is full, old items must be discarded to make room for new ones.

Page 21: Cache memory

Replacement algorithms

Replacement algorithms decide which block to discard when the cache is full. In direct mapping, each main memory block maps to only one cache line, so there is no choice to make. Replacement algorithms are needed for associative and set associative mapping.

Page 22: Cache memory

Replacement algorithms

Least recently used (LRU)

First in first out (FIFO)

Least frequently used (LFU)

Random

Page 23: Cache memory

Least Recently Used

The most effective algorithm. It keeps track of when each block was used and discards the least recently used block first.

For a two-way set associative cache this can be implemented with a single USE bit per line (USE bit 0 / USE bit 1).

Page 24: Cache memory

EXAMPLE

Reference string: 2 3 4 2 1 3 7 (3 frames)

Step:    1  2  3  4  5  6  7
Frame 1: 2  2  2  2  2  2  7
Frame 2:    3  3  3  1  1  1
Frame 3:       4  4  4  3  3

Step 4 is a page hit; all other steps are page faults (1 hit, 6 faults).
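The LRU trace above can be reproduced with a short sketch. The helper lru_trace and the use of Python's OrderedDict (whose insertion order tracks recency) are my own, not from the slides:

```python
from collections import OrderedDict

def lru_trace(refs, frames):
    """Return the number of hits when running LRU over a reference string."""
    cache, hits = OrderedDict(), 0
    for page in refs:
        if page in cache:
            hits += 1
            cache.move_to_end(page)          # mark as most recently used
        else:
            if len(cache) == frames:
                cache.popitem(last=False)    # evict least recently used
            cache[page] = True
    return hits

print(lru_trace([2, 3, 4, 2, 1, 3, 7], 3))  # prints 1, matching the table
```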

Page 25: Cache memory

First In First Out

The simplest algorithm, but with bad performance. The first block to enter is discarded first: it replaces the block that has been in the cache the longest.

Page 26: Cache memory

EXAMPLE

Reference string: 2 3 4 2 1 3 7 (3 frames)

Step:    1  2  3  4  5  6  7
Frame 1: 2  2  2  2  1  1  1
Frame 2:    3  3  3  3  3  7
Frame 3:       4  4  4  4  4

Steps 4 and 6 are page hits; the other steps are page faults (2 hits, 5 faults). At step 7, block 3 is evicted because it is the oldest resident (2 entered first but was already replaced at step 5).
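The FIFO trace can be checked with a sketch of my own (a deque records insertion order; hits never change that order):

```python
from collections import deque

def fifo_trace(refs, frames):
    """Return the number of hits when running FIFO over a reference string."""
    queue, resident, hits = deque(), set(), 0
    for page in refs:
        if page in resident:
            hits += 1                             # FIFO ignores hits entirely
        else:
            if len(resident) == frames:
                resident.discard(queue.popleft()) # evict the oldest block
            queue.append(page)
            resident.add(page)
    return hits

print(fifo_trace([2, 3, 4, 2, 1, 3, 7], 3))       # prints 2, matching the table
```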

Page 27: Cache memory

Least Frequently Used

Counts how often each block is used. Every block has a counter of its own, initially set to 0; each time the block is referenced, the counter is incremented. The block with the lowest reference count is replaced.

Page 28: Cache memory

EXAMPLE

Reference string: 2 3 4 2 1 3 7 (3 frames; ties broken by evicting the least recently used of the tied blocks)

Step:    1  2  3  4  5  6  7
Frame 1: 2  2  2  2  2  2  2
Frame 2:    3  3  3  1  1  7
Frame 3:       4  4  4  3  3

Step 4 is a page hit; all other steps are page faults (1 hit, 6 faults). Block 2, with count 2, is never evicted.

Page 29: Cache memory

Random

Randomly selects a block and discards it to make space. It does not keep track of access history, which eliminates the overhead cost of tracking page references.
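Random replacement needs no bookkeeping at all, as this sketch of my own shows (the seed parameter is only there to make the run reproducible; real hardware would not seed anything):

```python
import random

def random_trace(refs, frames, seed=0):
    """Return the number of hits under random replacement."""
    rng = random.Random(seed)        # seeded only for reproducibility
    resident, hits = [], 0
    for page in refs:
        if page in resident:
            hits += 1
        elif len(resident) < frames:
            resident.append(page)    # fill an empty frame first
        else:
            # no access history: evict a frame chosen at random
            resident[rng.randrange(frames)] = page
    return hits

print(random_trace([2, 3, 4, 2, 1, 3, 7], 3))
```

The hit count varies with the evictions chosen, which is why random replacement performs unpredictably compared with LRU on the same trace.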

Page 30: Cache memory

Thank You