What is Caching?
- Caching is an area of a computer’s memory devoted to temporarily storing recently used data.
- In caching, content, which includes HTML pages, images, files and other web objects, is stored on the local hard drive to make it faster for the user to access, which improves the efficiency of the computer and its overall performance.
- For example, when a user returns to a Web page they have recently accessed, the browser can pull those files from the cache instead of the original server because it has stored the user’s activity.
- Storing that information lets the user get to it faster and puts less traffic on the network.
How does a cache work?
- When a cache client wants to access data, it first checks the cache. When the requested data is found there, it is known as a cache hit.
- The percentage of attempts that result in cache hits is known as the cache hit rate or ratio.
- If the requested data isn't found in the cache, a situation known as a cache miss, it is fetched from main memory and copied into the cache. How this is done, and what data is evicted from the cache to make room for the new data, depends on the caching algorithm or policy the system uses.
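The hit/miss bookkeeping above can be sketched as follows, assuming a plain dict-backed cache and a stand-in `slow_fetch` function playing the role of main memory (both names are illustrative, not from the original):

```python
class SimpleCache:
    """Minimal cache that counts hits and misses."""

    def __init__(self):
        self.store = {}   # cached key -> value
        self.hits = 0     # requests satisfied from the cache
        self.misses = 0   # requests that fell through to the backing store

    def get(self, key, fetch):
        """Return the value for key, counting a hit or a miss."""
        if key in self.store:
            self.hits += 1            # cache hit
            return self.store[key]
        self.misses += 1              # cache miss: fetch and fill the cache
        value = fetch(key)
        self.store[key] = value
        return value

    def hit_rate(self):
        """Fraction of lookups that were cache hits."""
        total = self.hits + self.misses
        return self.hits / total if total else 0.0


cache = SimpleCache()
slow_fetch = lambda key: key.upper()  # stand-in for the slow origin
cache.get("page", slow_fetch)         # miss: fetched and cached
cache.get("page", slow_fetch)         # hit: served from the cache
print(cache.hit_rate())               # 0.5
```

One miss followed by one hit gives the 50% hit rate printed at the end; a real system would track the same ratio over millions of requests.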
Caching Memory
- Web browsers, such as Internet Explorer, Firefox, Safari and Chrome, use a browser cache to improve the performance of frequently accessed webpages.
- When we visit a webpage, the requested files are stored on local storage in the browser's cache.
- Clicking back and returning to a previous page enables the browser to retrieve most of the files it needs from the cache instead of having them all resent from the web server.
- This approach is called a read cache. The browser can read data from the browser cache much faster than it can re-download the files from the web server.
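Python's standard library ships a ready-made read cache, `functools.lru_cache`, which serves repeated calls from memory instead of recomputing them. Here `load_page` is a hypothetical stand-in for a slow network fetch:

```python
from functools import lru_cache

@lru_cache(maxsize=32)
def load_page(url):
    # Stand-in for a slow network fetch (illustrative only).
    return f"<html>contents of {url}</html>"

load_page("https://example.com/")   # first call: computed and cached
load_page("https://example.com/")   # second call: served from the cache
print(load_page.cache_info().hits)  # 1
```

`cache_info()` reports the hit and miss counts, much like a browser's cache statistics.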
Cache Algorithms
- Some examples of cache algorithms are:
- Least Frequently Used (LFU)
- Least Recently Used (LRU)
- Most Recently Used (MRU)
Least Frequently Used (LFU)
- Keeps track of how often an entry is accessed. The entry with the lowest access count is removed first.
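A minimal LFU sketch, assuming a fixed capacity and one access counter per key (ties between equal counts are broken arbitrarily here for brevity):

```python
class LFUCache:
    """Evicts the least frequently used entry when full."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.store = {}   # key -> value
        self.counts = {}  # key -> access count

    def get(self, key):
        if key in self.store:
            self.counts[key] += 1     # record another access
            return self.store[key]
        return None

    def put(self, key, value):
        if key not in self.store and len(self.store) >= self.capacity:
            # Evict the entry with the lowest access count.
            victim = min(self.counts, key=self.counts.get)
            del self.store[victim]
            del self.counts[victim]
        self.store[key] = value
        self.counts[key] = self.counts.get(key, 0) + 1


c = LFUCache(2)
c.put("a", 1)
c.put("b", 2)
c.get("a")            # bump "a"'s count
c.put("c", 3)         # evicts "b", the least frequently used
print(c.get("b"))     # None
print(c.get("a"))     # 1
```

Because "a" was read once more than "b", "b" has the lowest count and is the one evicted.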
Least Recently Used (LRU)
- Puts recently accessed data near the top of the cache. When the cache reaches its limit, the least recently accessed entries are removed.
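A minimal LRU sketch using `collections.OrderedDict`, which keeps keys in insertion order: `move_to_end()` marks an entry as most recently used, and `popitem(last=False)` evicts the oldest.

```python
from collections import OrderedDict

class LRUCache:
    """Evicts the least recently used entry when full."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.store = OrderedDict()   # oldest key first, newest last

    def get(self, key):
        if key not in self.store:
            return None
        self.store.move_to_end(key)          # mark as most recently used
        return self.store[key]

    def put(self, key, value):
        if key in self.store:
            self.store.move_to_end(key)
        self.store[key] = value
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)   # evict least recently used


c = LRUCache(2)
c.put("a", 1)
c.put("b", 2)
c.get("a")            # "a" is now the most recently used
c.put("c", 3)         # evicts "b", the least recently used
print(c.get("b"))     # None
print(c.get("a"))     # 1
```

Reading "a" just before inserting "c" is what saves it: "b" becomes the least recently used entry and is evicted instead.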
Most Recently Used (MRU)
- Removes the most recently accessed data first. This approach is best when older items are more likely to be used.
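A minimal MRU sketch, again built on `OrderedDict`; the only change from LRU is that eviction takes the entry at the *end* of the order, the one touched most recently:

```python
from collections import OrderedDict

class MRUCache:
    """Evicts the most recently used entry when full."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.store = OrderedDict()   # last key = most recently used

    def get(self, key):
        if key not in self.store:
            return None
        self.store.move_to_end(key)          # mark as most recently used
        return self.store[key]

    def put(self, key, value):
        if key not in self.store and len(self.store) >= self.capacity:
            self.store.popitem(last=True)    # evict most recently used
        self.store[key] = value
        self.store.move_to_end(key)


c = MRUCache(2)
c.put("a", 1)
c.put("b", 2)
c.get("a")            # "a" becomes the most recently used
c.put("c", 3)         # evicts "a", the most recently used
print(c.get("a"))     # None
print(c.get("b"))     # 2
```

Note the contrast with LRU: here touching "a" is what dooms it, because the policy assumes the older entries ("b") are the ones more likely to be needed again.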