Cache Mapping Techniques Tutorial

Today in this cache mapping techniques tutorial for GATE, we will learn about the different types of cache memory mapping techniques. These techniques determine how information is brought from main memory into cache memory.

Cache Mapping Techniques

What is cache memory mapping?

How is mapping done in cache memory?


There are three types of mapping techniques used in cache memory. Let us see them one by one.

What is cache memory mapping?

Cache memory mapping is a method of loading data from main memory into cache memory. In a more technical sense, the content of main memory that is referenced by the CPU is brought into cache memory. This can be done in three ways -

(i) Associative mapping Technique

The fastest and most flexible cache organization uses an associative memory. The associative memory stores both the address and the memory word, so in the associative mapping technique any word from main memory can be stored at any location in cache memory.
The 15-bit address value is shown as a five-digit octal number, and its corresponding 12-bit word is shown as a four-digit octal number. The 15-bit address generated by the CPU is placed in the argument register, and the associative memory is searched for a matching address.


If the address is present, the corresponding 12-bit data word is read and sent to the CPU. If no match occurs, the required word is accessed from main memory, and this address-data pair is then transferred to the associative cache memory.
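The lookup just described can be sketched in code. This is a minimal illustrative model, not real hardware: the class and variable names are my own, and a Python dictionary stands in for the parallel associative search of all address tags.

```python
# Sketch of an associative cache: every line stores the full 15-bit address
# together with its 12-bit data word, and a lookup searches all entries.

class AssociativeCache:
    def __init__(self, size):
        self.size = size
        self.lines = {}          # address -> data; models the parallel tag search

    def read(self, address, main_memory):
        if address in self.lines:            # hit: a matching address was found
            return self.lines[address]
        data = main_memory[address]          # miss: fetch the word from main memory
        if len(self.lines) < self.size:      # room left: store the address-data pair
            self.lines[address] = data
        return data

memory = {0o00000: 0o1220, 0o02000: 0o5670}  # octal addresses and words
cache = AssociativeCache(size=4)
cache.read(0o02000, memory)   # first access: miss, pair loaded into the cache
cache.read(0o02000, memory)   # second access: hit, served from the cache
```

Note that when the cache is full this sketch simply bypasses it; deciding which entry to evict instead is exactly the replacement problem discussed next.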


Suppose the cache is full; then the question arises of where to store this address-data pair. This is where the concept of replacement algorithms comes into play.

A replacement algorithm determines which existing entry in the cache is removed to free space so that the required data can be placed in the cache.

A simple procedure is to replace cells of the cache in round-robin order whenever a new word is requested from main memory. This constitutes a first-in first-out (FIFO) replacement policy.
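The FIFO policy above can be sketched as follows. This is an illustrative model with names of my own choosing; an `OrderedDict` records arrival order, so evicting its first entry gives exactly first-in first-out behaviour.

```python
from collections import OrderedDict

# FIFO replacement in a full associative cache: when every line is occupied,
# the entry that was loaded earliest is evicted, in round-robin fashion.

class FifoCache:
    def __init__(self, size):
        self.size = size
        self.lines = OrderedDict()   # preserves insertion (arrival) order

    def load(self, address, data):
        if address in self.lines:
            return                               # already cached: nothing to do
        if len(self.lines) == self.size:
            self.lines.popitem(last=False)       # evict the oldest entry
        self.lines[address] = data

cache = FifoCache(size=2)
cache.load(0o100, 111)
cache.load(0o200, 222)
cache.load(0o300, 333)   # cache full: 0o100, the first in, is evicted first
```

After the third load, addresses 0o200 and 0o300 remain cached while 0o100 has been replaced.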

(ii) Direct mapping

Associative memories are more costly than random-access memories because matching logic is added to each cell. In direct mapping, the 15-bit address generated by the CPU is divided into two fields.

The nine lower bits form the index field and the remaining six bits form the tag field. The figure shows that main memory requires an address that includes both the tag and the index bits, while the index field alone contains the number of address bits required to access the cache memory.

Consider a case where there are 2^k words in cache memory and 2^n words in main memory. The n-bit memory address is divided into two fields: k bits for the index field and n-k bits for the tag field.
  •  The direct-mapping cache organization uses the n-bit address to access main memory and the k-bit index to access the cache. The internal arrangement of the words in the cache memory is shown in the figure.
  •  Each word in the cache consists of a data word and the tag associated with it. When a new word is loaded into the cache, its tag bits are stored alongside the data bits. When the CPU generates a memory request, the index field of the address is used to access the cache.
  • The tag field of the address referenced by the CPU is compared with the tag in the word read from the cache. If the two tags match, there is a hit and the desired data word is available in the cache.
  •  If the two tags do not match, there is a miss: the required word is not present in the cache and is read from main memory. It is then stored in the cache along with its new tag.
  • The disadvantage of the direct mapping technique is that the hit ratio can drop considerably if two or more words whose addresses have the same index but different tags are accessed repeatedly.


To see how the direct-mapping organization operates, consider the numerical example shown. The word at address 00000 is presently stored in the cache with index = 000, tag = 00, data = 1220. Assume that the CPU now wants to read the word at address 02000. Since its index is 000, that index is used to read the cache, and the two tags are then compared.

Here the cache tag is 00 but the address tag is 02. Since the two tags do not match, a miss occurs. The main memory is therefore accessed and the data word 5670 is sent to the CPU. The cache word at index address 000 is then replaced with a tag of 02 and data of 5670.
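The walk-through above can be sketched in code. This is an illustrative model using the article's sizes (6-bit tag, 9-bit index, octal values); the class and variable names are my own.

```python
# Sketch of a direct-mapped cache lookup: the low 9 bits of a 15-bit address
# select a cache slot, and the high 6 bits are the tag stored with the data.

INDEX_BITS = 9   # 2**9 = 512 words of cache

class DirectMappedCache:
    def __init__(self):
        self.slots = [None] * (1 << INDEX_BITS)   # each slot: (tag, data) or None

    def read(self, address, main_memory):
        index = address & ((1 << INDEX_BITS) - 1)  # low 9 bits select the slot
        tag = address >> INDEX_BITS                # remaining high bits are the tag
        entry = self.slots[index]
        if entry is not None and entry[0] == tag:
            return entry[1], "hit"                 # tags match: word is in cache
        data = main_memory[address]                # miss: read from main memory
        self.slots[index] = (tag, data)            # replace slot, store the new tag
        return data, "miss"

memory = {0o00000: 0o1220, 0o02000: 0o5670}
cache = DirectMappedCache()
cache.read(0o00000, memory)                  # loads tag 00, data 1220 at index 000
data, result = cache.read(0o02000, memory)   # same index 000, tag 02: a miss
```

Both addresses map to index 000, so the second read evicts the first word exactly as in the numerical example, even though the rest of the cache is empty.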

The index field itself can be divided into two parts: the block field and the word field. In a 512-word cache there are 64 blocks, and the size of each block is 8 words. Since there are 64 blocks, 6 bits are used to identify a block among them, and since each block holds 8 words, 3 bits are used to identify a word within a block.
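This split of the 9-bit index can be sketched with simple bit operations (the function name is my own):

```python
# Split the 9-bit index into a 6-bit block field and a 3-bit word field
# (64 blocks of 8 words each, as in the 512-word cache above).

BLOCK_BITS, WORD_BITS = 6, 3

def split_index(index):
    word = index & ((1 << WORD_BITS) - 1)    # low 3 bits: word within the block
    block = index >> WORD_BITS               # high 6 bits: one of the 64 blocks
    return block, word

block, word = split_index(0b110101011)   # block 0b110101, word 0b011
```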

(iii) Set associative mapping

The disadvantage of the direct mapping technique is that two words with the same index but different tags cannot reside in the cache at the same time.

A third type of cache organization, called set associative mapping, is an improvement over the direct-mapping organization: each cache word can store two or more words of memory under the same index address. Each data word is stored together with its tag, and the number of tag-data items in one word of cache is said to form a set.

Here the concept of set associative mapping is explained with the help of an example. An example of a set associative cache is shown in the figure. Each index address refers to two data words and their associated tags.

Each tag requires 6 bits and each data word requires 12 bits, so each cache word (a set of two tag-data pairs) is 36 bits long. A 9-bit index address can select 512 such words, so the cache memory size is 512 x 36.

It can accommodate 1024 words of main memory, since each word of cache contains two data words. In general, a set-associative cache of set size k will accommodate k words of main memory in each word of cache.

When the CPU generates an address to fetch a word from main memory, the index value of the address is used to access the cache. The tag field of the CPU-generated address is then compared with both tags in the cache to determine whether they match.

The comparison is performed by an associative search of the tags in the set, similar to an associative memory search; hence the name "set-associative."

We can improve cache performance by improving the hit ratio, and the hit ratio improves as the set size increases, because more words with the same index but different tags can reside in the cache.

When a miss occurs in a set-associative cache and the set is full, it is necessary to replace one of the tag-data items with a new value using a cache replacement algorithm.
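The set-associative lookup and replacement can be sketched as follows. This is an illustrative model, not the figure's hardware: the names are my own, set size k = 2 matches the example above, and FIFO (discussed earlier) stands in for the replacement algorithm.

```python
from collections import deque

# Two-way set-associative cache: each index selects a set of two (tag, data)
# pairs, searched associatively; on a miss with a full set, FIFO replacement
# evicts the pair that was loaded earliest.

INDEX_BITS, SET_SIZE = 9, 2

class SetAssociativeCache:
    def __init__(self):
        self.sets = [deque() for _ in range(1 << INDEX_BITS)]

    def read(self, address, main_memory):
        index = address & ((1 << INDEX_BITS) - 1)
        tag = address >> INDEX_BITS
        cache_set = self.sets[index]
        for stored_tag, stored_data in cache_set:   # compare against both tags
            if stored_tag == tag:
                return stored_data, "hit"
        data = main_memory[address]                 # miss: fetch from main memory
        if len(cache_set) == SET_SIZE:              # set full: FIFO replacement
            cache_set.popleft()
        cache_set.append((tag, data))
        return data, "miss"

memory = {0o00000: 0o1220, 0o02000: 0o5670}
cache = SetAssociativeCache()
cache.read(0o00000, memory)      # miss: loaded into set 000
cache.read(0o02000, memory)      # same index, different tag: both now coexist
data, result = cache.read(0o00000, memory)   # still cached, unlike direct mapping
```

Note the contrast with the direct-mapped example: the two conflicting addresses now share set 000, so the third read is a hit.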

I hope this Computer Science Study Material for GATE will be beneficial for GATE aspirants.

Also read : Different cache levels and Cache Performance


