What is Redis caching?
Redis is an open-source, in-memory data structure store that is used to implement key-value databases and application caches. But what is Redis caching exactly, and how does caching in Redis work? This article answers those questions so that you can get the most out of caching in Redis and improve your application's performance.
What is caching?
Caching is a strategy to increase the speed and efficiency of an application by storing data locally to prevent the need for costly retrievals. The types of data that are typically cached include expensive database queries, user session data, and API responses.
In hardware, cache memory sits between the CPU and main RAM and is often organized into multiple levels: L1, L2, and L3. The lower the level number, the smaller the cache and the faster its contents can be accessed. Application-level caches such as Redis apply the same principle in software: keep frequently used data close at hand so it can be served quickly.
How does caching in Redis work?
Because Redis is a data structure-centric solution, caching in Redis can make use of strings, hashes, lists, sets, sorted sets, streams, and many other data structures. Redis also supports client-side caching, known as "tracking," in which clients keep a local copy of keys and the server notifies them when those keys change. At its simplest, reading from and writing to a Redis cache requires nothing more than the GET and SET commands.
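For example, from redis-cli, the following stores a value under a key with a 60-second expiration and then reads it back on a later request (the key name, value, and TTL are arbitrary examples):

SET user:42:profile "cached-value" EX 60
GET user:42:profile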
Redis includes support for many different cache eviction policies, giving you fine-grained control over how to clear your cache. These policies are:
- Evict the least frequently used keys.
- Evict the least recently used keys.
- Evict the keys with the shortest time to live.
- Randomly evict keys.
- Do not evict keys (block writes until memory is manually freed).
In addition, the LRU, LFU, and random policies each come in two variants: one that considers all keys and one that considers only keys that have an expiration ("expire") set, as shown in the configuration example below.
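Eviction is controlled through Redis's maxmemory and maxmemory-policy settings. A minimal redis.conf sketch (the memory limit and chosen policy here are illustrative):

maxmemory 256mb
maxmemory-policy allkeys-lru

The same policy names (allkeys-lfu, volatile-lru, volatile-ttl, allkeys-random, noeviction, and so on) can also be applied at runtime with CONFIG SET.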
Redis caching comparison
When implemented correctly, caching in Redis can significantly improve application performance. But how does Redis caching stack up against other alternatives?
Redis is frequently compared with Memcached, another open-source in-memory key-value store. Like Redis, Memcached is a popular caching engine with support for dozens of programming languages. According to InfoWorld, however, Redis is the superior choice for nearly all caching use cases - although Memcached may be better for "relatively small and static data, such as HTML code fragments."
The reasons to prefer Redis over Memcached for caching include:
- Redis offers multiple cache eviction policies, while Memcached only offers LRU (least recently used).
- Keys and values in Redis can each be up to 512 megabytes, while Memcached limits key names to 250 bytes and values to 1 megabyte by default.
- Redis caches can store a wide range of data structures, including lists, sets, sorted sets, bitmaps, and geospatial indices.
Caching in Redis Java clients
Caching in the base version of Redis is powerful enough, but you can enhance it even further by installing a third-party Redis Java client such as Redisson.
Redisson includes implementations of many familiar Java distributed objects and collections, smoothing the Redis learning curve for Java developers. The RLocalCachedMap interface in Redisson extends the ConcurrentMap interface in Java, providing a local cache feature to Redis developers.
Below is a quick example of how to use RLocalCachedMap for caching in Redis:
RLocalCachedMap<String, Integer> map = redisson.getLocalCachedMap("test", LocalCachedMapOptions.defaults());

// standard Map operations return the previous value
Integer prevObject = map.put("123", 1);
Integer currentObject = map.putIfAbsent("323", 2);
Integer obj = map.remove("123");

// use fast* methods when the previous value is not required
map.fastPut("a", 1);
map.fastPutIfAbsent("d", 32);
map.fastRemove("b");

// asynchronous variants return RFuture objects
RFuture<Integer> putAsyncFuture = map.putAsync("321", 1);
RFuture<Boolean> fastPutAsyncFuture = map.fastPutAsync("321", 2);
map.fastRemoveAsync("321");
The RLocalCachedMap interface allows developers to modify settings such as the following (see the sketch after this list):
- The maximum size of the cache.
- The time to live per cache entry.
- The maximum idle time per cache entry.
- The cache's eviction policy.
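Here is a minimal sketch of how those settings might be configured through LocalCachedMapOptions, assuming an initialized RedissonClient named redisson and the usual imports (java.util.concurrent.TimeUnit plus the org.redisson.api package); the values are illustrative and method names may vary slightly between Redisson versions:

LocalCachedMapOptions<String, Integer> options = LocalCachedMapOptions.<String, Integer>defaults()
        .cacheSize(10000)                                           // maximum number of entries held in the local cache
        .timeToLive(10, TimeUnit.MINUTES)                           // time to live per cache entry
        .maxIdle(2, TimeUnit.MINUTES)                               // maximum idle time per cache entry
        .evictionPolicy(LocalCachedMapOptions.EvictionPolicy.LRU);  // eviction policy for the local cache

RLocalCachedMap<String, Integer> cachedMap = redisson.getLocalCachedMap("test", options);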
Redisson includes support for three different caching strategies, sketched in the example after this list:
- Read-through caching: The application queries the cache for the desired data; on a cache miss, the data is loaded from the database into the cache and then returned.
- Write-through caching: The application updates the cache first and the database second.
- Write-behind caching: The application first updates the cache, which then updates the database after a given period of time.
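In Redisson, these strategies are typically wired up through MapOptions with a MapLoader (read-through) and a MapWriter (write-through or write-behind). The sketch below assumes an initialized RedissonClient named redisson and the usual imports; loadFromDatabase, loadAllKeysFromDatabase, saveToDatabase, and deleteFromDatabase are hypothetical DAO methods, and the exact MapLoader/MapWriter signatures may differ between Redisson versions:

MapOptions<String, Integer> options = MapOptions.<String, Integer>defaults()
        .loader(new MapLoader<String, Integer>() {
            // read-through: called by Redisson on a cache miss
            @Override
            public Integer load(String key) {
                return loadFromDatabase(key);                 // hypothetical DAO call
            }
            @Override
            public Iterable<String> loadAllKeys() {
                return loadAllKeysFromDatabase();             // hypothetical DAO call
            }
        })
        .writer(new MapWriter<String, Integer>() {
            // write-through / write-behind: called by Redisson after cache updates
            @Override
            public void write(Map<String, Integer> entries) {
                saveToDatabase(entries);                      // hypothetical DAO call
            }
            @Override
            public void delete(Collection<String> keys) {
                deleteFromDatabase(keys);                     // hypothetical DAO call
            }
        })
        .writeMode(MapOptions.WriteMode.WRITE_BEHIND);        // use WriteMode.WRITE_THROUGH for synchronous writes

RMap<String, Integer> accounts = redisson.getMap("accounts", options);

With WRITE_BEHIND, database updates are deferred and applied after the cache has already been updated, matching the write-behind description above.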
Local caching in Redisson can provide a dramatic performance enhancement. Redisson also includes a "near cache," a smaller version of the local cache for the most frequently or most recently accessed data. Using this near cache can boost the performance of the standard JCache API by up to 45 times.
What's more, Redisson includes support for a variety of third-party caching frameworks: Spring Cache, Hibernate Cache, and MyBatis Cache. By using Redis together with Redisson, you'll have more than enough options for your application caching.