What is Cache Memory?
Computers deal with massive quantities of data that no single human brain could ever process, and yet, just like humans, computers need to remember things. Also like humans, computers distinguish between different types of memory: "short-term" cache memory, and "long-term" disk storage.
But what is cache memory exactly, and how should it be used for a website or web application? In this article, we'll discuss the definition of cache memory, how cache memory works, and the difference between cache memory and RAM.
What is cache memory?
Cache memory is a type of computer memory for temporary storage of important, frequently accessed information. Reading from and writing to cache memory is much faster than other forms of data storage. By storing data in cache memory, you can dramatically speed up time-consuming or high-volume transactions.
How does cache memory work?
Cache memory sits between the CPU and main memory, which allows it to be accessed relatively quickly. Depending on the processor, cache memory may be split into multiple levels, including L1 and L2 (and possibly L3). The L1 cache is smaller and faster to access than the L2 and L3 caches.
Before searching through slower forms of memory such as RAM or disk storage, an application first checks the cache to see if the information is already present there. If it is, the application simply reads the data from the cache, saving a significant amount of time. Otherwise, the application retrieves the data from its original source, and then stores a copy of this data in the cache for possible future access.
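The check-the-cache-first pattern described above can be sketched in a few lines of Python. This is a minimal illustration, not a production cache; `fetch_from_source` is a hypothetical stand-in for a slow lookup such as a database query or disk read.

```python
cache = {}

def fetch_from_source(key):
    # Placeholder for an expensive operation (database query, disk read, etc.).
    return f"value-for-{key}"

def get(key):
    if key in cache:                    # cache hit: fast path
        return cache[key]
    value = fetch_from_source(key)      # cache miss: retrieve from the source
    cache[key] = value                  # store a copy for future access
    return value

print(get("user:42"))  # first call: miss, fetched from source, then cached
print(get("user:42"))  # second call: hit, served directly from the cache
```

The first call pays the full cost of the lookup; every later call for the same key is answered from the in-memory dictionary.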
Only a small amount of data can be stored in cache memory, due to the cache's limited size. To make room for new information, a cache needs a replacement policy that decides which entries to evict, removing the least "useful" information first. For example, the cache may discard the data that is least frequently used (LFU) or least recently used (LRU).
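An LRU replacement policy, for instance, can be sketched in Python using `collections.OrderedDict`, which remembers insertion order and lets us move an entry to the "most recently used" end on every access. The capacity of 2 is chosen only to make the eviction visible.

```python
from collections import OrderedDict

class LRUCache:
    """Fixed-capacity cache that evicts the least recently used entry."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.entries = OrderedDict()

    def get(self, key):
        if key not in self.entries:
            return None
        self.entries.move_to_end(key)        # mark as most recently used
        return self.entries[key]

    def put(self, key, value):
        if key in self.entries:
            self.entries.move_to_end(key)
        self.entries[key] = value
        if len(self.entries) > self.capacity:
            self.entries.popitem(last=False)  # evict least recently used

cache = LRUCache(2)
cache.put("a", 1)
cache.put("b", 2)
cache.get("a")         # "a" is now the most recently used entry
cache.put("c", 3)      # capacity exceeded: "b" is evicted, not "a"
print(cache.get("b"))  # None
print(cache.get("a"))  # 1
```

Reading "a" just before inserting "c" is what saves it from eviction; an LFU policy would instead count accesses rather than track recency.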
Another challenge when implementing caching is "stale" data: entries that remain in the cache after the underlying source data has changed. One common solution is to give each entry a time-to-live (TTL), an "expiration date" after which the cached information must be refreshed from the source.
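One way to implement such an expiration date is to store a timestamp alongside each value and treat any entry past its deadline as a miss. A minimal sketch, with a deliberately short TTL so the expiration is observable:

```python
import time

cache = {}  # key -> (value, expires_at)

def put(key, value, ttl_seconds):
    # Record when this entry stops being trustworthy.
    cache[key] = (value, time.monotonic() + ttl_seconds)

def get(key):
    entry = cache.get(key)
    if entry is None:
        return None
    value, expires_at = entry
    if time.monotonic() >= expires_at:
        del cache[key]    # stale: drop the entry to force a refresh
        return None
    return value

put("report", {"total": 100}, ttl_seconds=0.05)
print(get("report"))   # fresh: {'total': 100}
time.sleep(0.1)
print(get("report"))   # expired: None
```

In a real system the TTL would be tuned per data type: seconds for volatile data, hours or days for content that rarely changes.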
Cache memory vs. RAM
Cache memory is often confused or conflated with RAM (random access memory), but there's an important distinction between the two concepts.
First, we note that modern computer systems have multiple cascading levels of memory:
- CPU registers, which can be accessed extremely quickly but which store an extremely limited amount of information.
- Cache memory, which we've already discussed.
- Random access memory (RAM), which makes up the bulk of a computer's memory resources. The term "random access" comes from the fact that data anywhere within the RAM can be accessed in roughly the same time, regardless of the data's physical location in memory.
- Disk storage such as hard drives, which are the slowest form of memory. Accessing data stored on disk is orders of magnitude slower than accessing the same data stored in a cache.
In strict terms, cache memory is RAM that has been specially designated to serve as a cache. In practice, however, "cache memory" is usually treated as distinct from RAM, due to their differing use cases and performance characteristics.
More technically, cache memory is usually implemented with static RAM (SRAM), while system RAM is typically dynamic RAM (DRAM). SRAM is faster and costlier than DRAM, which makes it an ideal choice for cache memory.
Cache memory use cases
The increased speed and efficiency of cache memory has obvious benefits. Both users and servers can take advantage of caching for websites and web applications.
On the user side, web browsers cache static assets such as images, scripts, and stylesheets, so that repeat visits to a site don't re-download files that haven't changed. Meanwhile, servers make use of cache memory in order to provide content more quickly to users. Caching is especially important for database-heavy websites and applications: if users repeatedly access certain data in the database, the site can improve performance by serving that information from cache memory instead.
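Server-side query caching of this kind can be sketched as follows. The `run_query` function here is a hypothetical stand-in for a real database round trip; the point is that identical queries are answered from memory after the first call.

```python
query_cache = {}

def run_query(sql):
    # Placeholder for an expensive trip to the database.
    return [("widget", 19.99), ("gadget", 4.99)]

def cached_query(sql):
    if sql not in query_cache:
        query_cache[sql] = run_query(sql)  # miss: hit the database once
    return query_cache[sql]                # every later call: cache hit

rows = cached_query("SELECT name, price FROM products")
rows_again = cached_query("SELECT name, price FROM products")  # no DB call
```

In production this pattern is usually combined with the expiration and eviction policies discussed earlier, or delegated to a dedicated caching layer such as Redis or Memcached.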
Analytical databases (e.g. for business intelligence and reporting) also make frequent use of caching. The intermediate results of certain highly complex query operations, which can take a long time to execute, can be stored in cache memory. The performance boost provided by caching is what makes real-time dashboards and reporting feasible.