A Seminar Presentation On "Dynamic Cache Management Technique" Presented By: Ajay Singh Lamba (IT, Final Year)
Content
- Introduction to cache memory
- How stored data is transferred to the CPU
- Mapping functions
- Dynamic cache management
- Dynamic techniques for L0-cache management
Introduction to cache memory
- A cache, in computer terms, is a place to store information that is faster than the place where the information is usually stored.
- Cache memory is fast memory that holds the most recently accessed data.
- Only frequently accessed data stays in the cache, which allows the CPU to access it more quickly.
- It is placed on the processor chip, which allows it to 'talk' to the processor directly, at a much higher speed than standard RAM.
How is stored data transferred to the CPU?
Mapping functions
Since M >> C (main memory holds far more blocks than the cache has lines), how are blocks mapped to specific lines in the cache?
- Direct mapping
- Associative mapping
- Set-associative mapping
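Below is a minimal sketch, not part of the original slides, of how an address is split into tag, index, and offset fields under direct and set-associative mapping; the cache geometry (BLOCK_SIZE, NUM_LINES, ASSOC) is an assumed example.

```python
# Sketch of address decomposition for cache mapping (illustrative parameters).
BLOCK_SIZE = 32        # bytes per cache block (assumed)
NUM_LINES  = 256       # cache lines (assumed)
ASSOC      = 4         # ways per set for the set-associative case (assumed)

def direct_mapped_fields(addr: int):
    """Each memory block can live in exactly one cache line."""
    offset = addr % BLOCK_SIZE                   # byte within the block
    index  = (addr // BLOCK_SIZE) % NUM_LINES    # the one line this block maps to
    tag    = addr // (BLOCK_SIZE * NUM_LINES)    # identifies which block occupies the line
    return tag, index, offset

def set_associative_fields(addr: int):
    """Each memory block maps to one set but may occupy any of its ASSOC ways."""
    num_sets = NUM_LINES // ASSOC
    offset = addr % BLOCK_SIZE
    index  = (addr // BLOCK_SIZE) % num_sets
    tag    = addr // (BLOCK_SIZE * num_sets)
    return tag, index, offset

# Fully associative mapping has no index field: any block may occupy any line,
# so the whole block number serves as the tag.
```

Direct mapping is the cheapest to look up but suffers conflict misses when two hot blocks share a line; associativity trades extra lookup hardware for fewer conflicts.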
Dynamic Cache Management
- It is a resizing strategy for the cache memory.
- Dynamic caching allows the cache to be resized dynamically, both across and within application executions.
- The basic idea is that only the most frequently executed portion of the code should be stored in the L0-cache.
POWER TRENDS FOR CURRENT MICROPROCESSORS
DYNAMIC TECHNIQUES FOR L0-CACHE MANAGEMENT
1. Simple Method
2. Static Method
3. Dynamic Confidence Estimation Method
4. Restrictive Dynamic Confidence Estimation Method
5. Dynamic Distance Estimation Method
SIMPLE METHOD
- If a branch is mispredicted, the machine accesses the I-cache to fetch the instructions.
- If a branch is predicted correctly, the machine accesses the L0-cache.
- On a misprediction, the machine starts fetching the instructions from the correct address by accessing the I-cache.
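As a minimal sketch (not from the slides), the simple method reduces to a single check on the branch outcome; the function name and cache labels below are illustrative only.

```python
def fetch_source_simple(branch_mispredicted: bool) -> str:
    """Pick the cache the next basic block is fetched from (simple method sketch)."""
    if branch_mispredicted:
        # Refetch from the correct target address out of the regular I-cache.
        return "I-cache"
    # A correctly predicted path keeps streaming out of the small, low-power L0-cache.
    return "L0-cache"
```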
STATIC METHOD
- If a 'high confidence' branch was predicted incorrectly, the I-cache is accessed for the subsequent basic blocks.
- If more than n 'low confidence' branches have been decoded in a row, the I-cache is accessed.
- The L0-cache is therefore bypassed when either of these two conditions is satisfied.
- In any other case, the machine accesses the L0-cache.
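A sketch of the static bypass rule, assuming the machine tracks the run length of successive 'low confidence' branches; the threshold value N and the function signature are illustrative assumptions.

```python
N = 3  # threshold on successive 'low confidence' branches (assumed value)

def fetch_source_static(high_confidence: bool, mispredicted: bool,
                        low_conf_run: int) -> str:
    """low_conf_run = number of 'low confidence' branches decoded in a row so far."""
    if high_confidence and mispredicted:
        return "I-cache"          # condition 1: a high-confidence branch was mispredicted
    if low_conf_run > N:
        return "I-cache"          # condition 2: too many low-confidence branches in a row
    return "L0-cache"             # otherwise keep fetching from the L0-cache
```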
DYNAMIC CONFIDENCE ESTIMATION METHOD
- This is a dynamic version of the static method.
- The I-cache is accessed if:
  1. A 'high confidence' branch is mispredicted, or
  2. More than n successive 'low confidence' branches are encountered.
- Because confidence is estimated dynamically, it is more accurate in characterizing a branch and, hence, in regulating access to the L0-cache.
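The slides do not describe the confidence hardware itself; the sketch below assumes a per-branch 2-bit saturating counter as the dynamic confidence estimator, whose output then feeds the same bypass rule as the static method.

```python
MAX_COUNT = 3          # 2-bit saturating counter (assumed estimator)
counters = {}          # branch PC -> counter value

def update_confidence(pc: int, predicted_correctly: bool) -> None:
    """Train the estimator: count up on correct predictions, reset on a misprediction."""
    c = counters.get(pc, 0)
    counters[pc] = min(c + 1, MAX_COUNT) if predicted_correctly else 0

def is_high_confidence(pc: int) -> bool:
    """A branch is treated as 'high confidence' once its counter saturates."""
    return counters.get(pc, 0) == MAX_COUNT
```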
RESTRICTIVE DYNAMIC CONFIDENCE ESTIMATION METHOD
- This is a more selective scheme in which only the really important basic blocks are selected for the L0-cache.
- The L0-cache is accessed only if a 'high confidence' branch is predicted correctly; the I-cache is accessed in every other case.
- This method selects some of the most frequently executed basic blocks, yet it misses some others.
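A sketch of the restrictive rule under the same illustrative naming: the L0-cache is used only in the single favourable case.

```python
def fetch_source_restrictive(high_confidence: bool, mispredicted: bool) -> str:
    """Only a correctly predicted 'high confidence' branch earns an L0-cache fetch."""
    if high_confidence and not mispredicted:
        return "L0-cache"
    return "I-cache"      # every other case bypasses the L0-cache
```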
DYNAMIC DISTANCE ESTIMATION METHOD
- All n branches after a mispredicted branch are tagged as 'low confidence'; all other branches are tagged as 'high confidence'.
- A counter is used to measure the distance of a branch from the previous mispredicted branch.
- The basic blocks after a 'low confidence' branch are fetched from the I-cache; otherwise they are fetched from the L0-cache.
- The net effect is that a branch misprediction causes a series of fetches from the I-cache.
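A sketch of the distance scheme: the counter below stands in for the hardware distance counter the slide mentions; the threshold N and the initial value are assumptions.

```python
N = 3             # branches after a misprediction treated as 'low confidence' (assumed)
distance = N + 1  # start beyond the window so fetching begins from the L0-cache

def on_branch(mispredicted: bool) -> str:
    """Update the distance counter and choose the fetch source for the next block."""
    global distance
    distance = 0 if mispredicted else distance + 1
    # Within N branches of a misprediction -> 'low confidence' -> I-cache;
    # otherwise -> 'high confidence' -> L0-cache.
    return "I-cache" if distance <= N else "L0-cache"
```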
Thank you. Any queries?
