Cache Simulator

Visualize how CPU caches map, store, and evict memory blocks

How to use

This cache simulator lets you access memory addresses and watch how a CPU cache maps, stores, and evicts data in real time. Step through preset access patterns to see spatial locality, temporal locality, thrashing, and conflict misses in action across direct-mapped, set-associative, and fully associative cache configurations with LRU, FIFO, and random replacement policies.

Configure the cache using the dropdowns above. Block size determines how many consecutive bytes are fetched together (spatial locality). Cache lines sets the total capacity. Associativity controls how many ways per set. More ways means fewer conflict misses but more hardware cost. Replacement policy (LRU, FIFO, Random) determines which line gets evicted when a set is full.
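The derived geometry follows directly from those dropdowns. A minimal sketch (variable names are illustrative, not the simulator's actual code) using the default configuration:

```python
# Derive cache geometry from the configuration dropdowns.
# Defaults: block size 4 bytes, 8 cache lines, direct-mapped (1 way).
from math import log2

block_size = 4      # bytes fetched per block
cache_lines = 8     # total capacity in lines
associativity = 1   # ways per set; 1 = direct-mapped

num_sets = cache_lines // associativity
offset_bits = int(log2(block_size))
index_bits = int(log2(num_sets))
tag_bits = 8 - index_bits - offset_bits  # the simulator uses 8-bit addresses

print(num_sets, offset_bits, index_bits, tag_bits)  # 8 2 3 3
```

Switching to 2-way associativity halves the number of sets, so one index bit moves into the tag.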

Access addresses by typing a number (0–255) and clicking Access, or load a preset pattern and step through with the Next button or Spacebar.

Watch the breakdown above the cache table. It shows how each address splits into tag, index, and offset bits, which determines where data maps in the cache.
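The same split can be sketched in a few lines of Python, assuming the default configuration (2 offset bits, 3 index bits; the function name is hypothetical):

```python
# Split an 8-bit address into (tag, index, offset) fields.
def split_address(addr, offset_bits=2, index_bits=3):
    offset = addr & ((1 << offset_bits) - 1)            # low bits: byte within block
    index = (addr >> offset_bits) & ((1 << index_bits) - 1)  # middle bits: which set
    tag = addr >> (offset_bits + index_bits)            # remaining high bits
    return tag, index, offset

# Address 100 = 0b01100100 -> tag 0b011, index 0b001, offset 0b00
print(split_address(100))  # (3, 1, 0)
```

The index picks the set, the tag disambiguates which memory block occupies it, and the offset selects a byte inside the block.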

Try different configs with the same pattern. Load Thrashing in direct-mapped mode, then switch to 2-way and reload to see how associativity eliminates conflict misses.
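The effect is easy to reproduce outside the simulator. Below is a toy LRU cache model (illustrative only, not the simulator's implementation) run on two addresses that map to the same set, mirroring the Thrashing preset:

```python
# Toy set-associative cache with LRU replacement, counting hits.
def run(pattern, lines=8, ways=1, block=4):
    sets = [[] for _ in range(lines // ways)]  # each set holds tags, MRU last
    hits = 0
    for addr in pattern:
        block_num = addr // block
        s = sets[block_num % len(sets)]
        tag = block_num // len(sets)
        if tag in s:
            hits += 1
            s.remove(tag)   # refresh: move tag to MRU position
        elif len(s) == ways:
            s.pop(0)        # set full: evict least recently used
        s.append(tag)
    return hits

thrash = [0, 32] * 4            # two blocks that alias to the same set
print(run(thrash, ways=1))      # 0 hits: each access evicts the other block
print(run(thrash, ways=2))      # 6 hits: both blocks fit in one 2-way set
```

Direct-mapped, the two blocks fight over a single line and every access misses; with two ways per set, both blocks coexist and only the first two (compulsory) misses remain.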

Bit Split Formulas
Field        Formula                                    Example (default)
Offset bits  log2(block size)                           log2(4) = 2
Index bits   log2(cache lines / associativity)          log2(8 / 1) = 3
Tag bits     address bits − index bits − offset bits    8 − 3 − 2 = 3
Num sets     cache lines / associativity                8 / 1 = 8