Redis Caching in ASP.NET Core- Depth in Distributed Caching

In this article, we will learn how to use Redis caching in an ASP.NET Core Web API. Before starting, please read our previous article, In-Memory Caching in ASP.NET Core, where we discussed caching in general and in-memory caching in particular. In this article, we will cover distributed caching, Redis, and setting up Redis caching in ASP.NET Core.

Find Source Code

Prerequisites

– Basic knowledge of caching & in-memory caching
– Visual Studio 2019
– .NET Core SDK 5.0 ( You can use SDK 3.1 )

What is Distributed Caching?

A distributed cache is a cache shared by multiple app servers, typically maintained as an external service to the app servers that access it. A distributed cache can improve the performance and scalability of an ASP.NET Core app, especially when the app is hosted by a cloud service or a server farm.

Like an in-memory cache, it can improve the application's response time quite drastically. However, the implementation of a distributed cache is provider-specific: multiple cache providers support distributed caching, the most popular being Redis and NCache.

Pros of Distributed Caching

A distributed cache has several advantages over other caching scenarios where cached data is stored on individual app servers. When cached data is distributed, the data:

  • Is coherent (consistent) across requests to multiple servers.
  • Can be shared: multiple applications/servers can use one Redis instance to cache data, which reduces maintenance cost in the longer run.
  • Survives server restarts and application deployments, because the cache lives external to the application.
  • Does not use the local server's resources.

Cons of Distributed Caching

There are two main disadvantages of distributed caching:

  • The cache is slower to access because it is no longer held locally on each application instance.
  • The requirement to implement a separate cache service might add complexity to the solution.

What is Redis Caching?

Redis (Remote Dictionary Server) is an open-source, in-memory data structure store, used as a database, cache, and message broker. It supports data structures such as strings, hashes, lists, sets, sorted sets with range queries, bitmaps, HyperLogLogs, geospatial indexes with radius queries, and streams.

It's a NoSQL database as well and is used at tech giants like Stack Overflow, Flickr, GitHub, and so on. Redis is a great option for implementing a highly available cache to reduce data access latency and improve application response time. As a result, we can reduce the load on our database to a very good extent.

Let's walk through how the Redis cache fits into a typical request flow:

  • The user requests a user object.
  • The app server checks whether the user is already in the cache and returns the object if present.
  • If not, the app server makes an HTTP call to retrieve the list of users.
  • The users service returns the users list to the app server.
  • The app server writes the users list to the distributed (Redis) cache.
  • The app server serves the cached version on subsequent requests until it expires (TTL).
  • The user gets the cached user object.

IDistributedCache Interface

ASP.NET Core provides the IDistributedCache interface to interact with distributed caches, including Redis. IDistributedCache provides the following methods to perform actions on the actual cache (a minimal usage sketch follows the list).

  • GetAsync – Gets the value from the cache server based on the passed key.
  • SetAsync – Accepts a key and value and sets them in the cache server.
  • RefreshAsync – Resets the sliding expiration timer (more about this later in the article), if any.
  • RemoveAsync – Deletes the cache entry based on the key.
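
Here is a minimal usage sketch, assuming this code runs inside an async method and that _cache is an injected IDistributedCache instance; the "greeting" key and its value are purely illustrative.

// Requires: using Microsoft.Extensions.Caching.Distributed; using System; using System.Text;
byte[] value = Encoding.UTF8.GetBytes("Hello from the cache");
var options = new DistributedCacheEntryOptions()
    .SetSlidingExpiration(TimeSpan.FromMinutes(2));

await _cache.SetAsync("greeting", value, options);    // write the entry
byte[] cached = await _cache.GetAsync("greeting");    // read it back (null if missing or expired)
await _cache.RefreshAsync("greeting");                // reset the sliding expiration timer
await _cache.RemoveAsync("greeting");                 // delete the entry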

How to set up Redis on a Windows Machine

To set up Redis on a Windows machine, download it from the Redis open-source GitHub repository; you could also use the MSI installer.

Extract the downloaded zip archive and run the redis-server.exe file.

Now, the Redis server is up and running. We can test it using the redis-cli command.

Open a new command prompt, run redis-cli, and try the following commands:

Once we have installed the Redis server and confirmed it is working properly, we can modify the API to use Redis-based distributed caching.

Redis CLI Commands

Setting a cache entry
-> SET name "corespider"
OK

Getting the cache entry
-> GET name
"corespider"

Deleting a cache entry
-> GET name
"corespider"
-> DEL name
(integer) 1
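
Cached entries in Redis are usually given a time-to-live (TTL). The following commands illustrate that; the key name and the TTL values here are arbitrary examples.

Setting a cache entry with a 120-second TTL
-> SET name "corespider" EX 120
OK
Checking the remaining time-to-live (in seconds)
-> TTL name
(integer) 118
Resetting the TTL to 300 seconds
-> EXPIRE name 300
(integer) 1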

Integrating Redis Cache In ASP.NET Core

In the previous In-Memory Caching example we created an ASP.NET Core Web API project. This API is connected to the database via Entity Framework Core and returns the customer list.

Make sure that the Redis server is up and running in the background.

  • Install the package below using the NuGet Package Manager Console, or install it via the NuGet search UI.
Install-Package Microsoft.Extensions.Caching.StackExchangeRedis
  • To support the Redis cache, configure it in the ConfigureServices method of Startup.cs.

By default, Redis listens on port 6379. To change this, open PowerShell and start the server with the following command (in this example we use port 4450, which matches the configuration below).

./redis-server --port {your_port}

services.AddStackExchangeRedisCache(options =>
{
    options.Configuration = "localhost:4450";
});

Implement the Redis Cache in the Controller

// Requires: using Microsoft.Extensions.Caching.Distributed;
//           using Microsoft.Extensions.Caching.Memory;
//           using Newtonsoft.Json;
//           using System.Text;
private readonly ApplicationDbContext _context;
private readonly IMemoryCache memoryCache;
private readonly IDistributedCache distributedCache;

public CustomersController(ApplicationDbContext context, IMemoryCache memoryCache, IDistributedCache distributedCache)
{
    _context = context;
    this.memoryCache = memoryCache;
    this.distributedCache = distributedCache;
}

// GET: api/Customers
[HttpGet]
public async Task<ActionResult<IEnumerable<Customer>>> GetCustomers()
{
    var cacheKey = "Allcustomer";
    string serializedCustomerList;
    var customerList = new List<Customer>();

    // Try to read the serialized customer list from Redis.
    var redisCustomerList = await distributedCache.GetAsync(cacheKey);
    if (redisCustomerList != null)
    {
        // Cache hit: the value is stored as a byte array, so decode and deserialize it.
        serializedCustomerList = Encoding.UTF8.GetString(redisCustomerList);
        customerList = JsonConvert.DeserializeObject<List<Customer>>(serializedCustomerList);
    }
    else
    {
        // Cache miss: load from the database via EF Core, then store the result in Redis.
        customerList = await _context.Customers.ToListAsync();
        serializedCustomerList = JsonConvert.SerializeObject(customerList);
        redisCustomerList = Encoding.UTF8.GetBytes(serializedCustomerList);

        var options = new DistributedCacheEntryOptions()
            .SetAbsoluteExpiration(DateTime.Now.AddMinutes(10))
            .SetSlidingExpiration(TimeSpan.FromMinutes(2));

        await distributedCache.SetAsync(cacheKey, redisCustomerList, options);
    }
    return customerList;
}
  • Constructor: the ApplicationDbContext, IMemoryCache, and IDistributedCache dependencies are injected into the controller.
  • Cache key: we set the key ("Allcustomer") internally in the code.
  • Read: we access the distributed cache object to get the data from Redis using that key.
  • Cache hit: if the key has a value in Redis, the data comes back as a byte array, so we decode it to a string, deserialize it into a List<Customer>, and send the data back.
  • Cache miss: if the value does not exist in Redis, we load the data from the database via EF Core, serialize it, set the expiration options, and write it to Redis.
  • SlidingExpiration: Gets or sets how long a cache entry can be inactive (e.g. not accessed) before it will be removed. This will not extend the entry lifetime beyond the absolute expiration (if set).
  • AbsoluteExpiration: Gets or sets an absolute expiration date for the cache entry.

Test the Redis Cache in Postman

Let’s open the postman tool and test the cache performance.

The first API call took about 6850 ms to return a response. During that first call, our API also sets the customerList cache entry on the Redis server, so the second call goes straight to the Redis cache, which should reduce the response time significantly. In our test it takes only 33 ms.

How to conditionally inject an IDistributedCache implementation of MemoryDistributedCache during local development?

You are using the key-value caching functionality of Redis. You are coding an Azure Function (it could also be an ASP.NET Core web app) and you want to develop and debug the code. Do you need to install Redis locally? Not necessarily. With a couple of lines of clever DI, you can "fool" your classes into using the MemoryDistributedCache implementation of the IDistributedCache interface.

In this snippet we check the value of an environment variable and conditionally inject the desired implementation of IDistributedCache.

[TestMethod]
public void Condition_Injection_Of_IDistributedCache()
{
    // Config is assumed to be an IConfigurationRoot property on the test class.
    var builder = new ConfigurationBuilder();
    builder.AddJsonFile("settings.json", optional: false);
    Config = builder.Build();

    ServiceCollection coll = new ServiceCollection();
    if (System.Environment.GetEnvironmentVariable("localdebug") == "1")
    {
        // Local development/debugging: use the in-memory implementation.
        coll.AddDistributedMemoryCache();
    }
    else
    {
        // Everywhere else: use the real Redis-backed implementation.
        coll.AddStackExchangeRedisCache(options =>
        {
            string server = Config["redis-server"];
            string port = Config["redis-port"];
            string cnstring = $"{server}:{port}";
            options.Configuration = cnstring;
        });
    }

    var provider = coll.BuildServiceProvider();
    var cache = provider.GetService<IDistributedCache>();
}

How do we benchmark the performance of Redis cache?

The primary motivation for using a distributed cache is to make the application perform better. In most scenarios the central data storage becomes the bottleneck. Benchmarking the distributed cache gives us an idea of how much latency and throughput to expect from the cache for various document sizes.
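
Here is a rough sketch of how such a benchmark could look. It simply times Set/Get round trips with a Stopwatch; the payload sizes, iteration count, and the CacheBenchmark/RunAsync names are illustrative assumptions rather than code from this article's source.

// A minimal latency sketch: pass in any IDistributedCache (e.g. the Redis-backed one).
using System;
using System.Diagnostics;
using System.Threading.Tasks;
using Microsoft.Extensions.Caching.Distributed;

public static class CacheBenchmark
{
    public static async Task RunAsync(IDistributedCache cache)
    {
        int[] payloadSizes = { 1_000, 10_000, 100_000 };   // document sizes in bytes
        const int iterations = 100;

        foreach (var size in payloadSizes)
        {
            var payload = new byte[size];
            var key = $"bench:{size}";

            var sw = Stopwatch.StartNew();
            for (int i = 0; i < iterations; i++)
            {
                await cache.SetAsync(key, payload);   // write round trip
                await cache.GetAsync(key);            // read round trip
            }
            sw.Stop();

            Console.WriteLine($"{size} bytes: {sw.ElapsedMilliseconds / (double)iterations:F2} ms per set+get");
        }
    }
}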

Find Source Code

Summary

In this detailed article, we have learnt about Redis caching in ASP.NET Core. We discussed distributed caching, Redis, setting up a Redis server, and implementing Redis-based caching in an ASP.NET Core Web API.

That's a wrap for this article. I hope it cleared up Redis caching in ASP.NET Core. Please write to us if you have any questions.

In-Memory Caching in ASP.NET Core-Basic guide of Caching

In this article we will discuss In-Memory Caching in ASP.NET Core in depth. We will cover the basics of caching, build a simple endpoint that demonstrates serving cache entries from the in-memory cache, and finally check the application's performance after enabling caching.

Find Source Code

Prerequisites

– Basic knowledge of Caching
– Visual Studio 2019
– .NET Core SDK 5.0 ( You can use SDK 3.1 )

What is Caching?

Caching is the process of storing copies of files in a cache, or temporary storage location, so that they can be accessed more quickly.

A cache is a hardware or software component that stores data so that future requests for that data can be served faster; the data stored in a cache might be the result of an earlier computation or a copy of data stored elsewhere. Simply put, the most frequently used data is stored in temporary storage so that it can be accessed much faster on future calls from the client. This technique boosts the performance of the application drastically by removing unnecessary, frequent requests to the data source.

Caching can significantly improve the performance and scalability of an app by reducing the work required to generate content. Caching works best with data that changes infrequently and is expensive to generate. Caching makes a copy of data that can be returned much faster than from the source. Apps should be written and tested to never depend on cached data.

  • Without caching, every request goes to the data source, which burdens the server and hurts performance.
  • With caching, the first request populates the cache, and subsequent requests are served from the cache instead of the main data source, so from a performance point of view they are much faster.

Caching In ASP.NET Core

ASP.NET Core supports the following types of caching:

  1. In-Memory Caching – the data is cached within the server's memory.
  2. Distributed Caching – the data is stored external to the application in stores like Redis (covered in the Redis section above).

What is In-Memory Caching in ASP.NET Core?

In ASP.NET Core, it is now possible to cache the data within the application, this is known as In-Memory Caching in ASP.NET Core. The simplest cache is based on the IMemoryCache. IMemoryCache represents a cache stored in the memory of the web server. Apps running on a server farm (multiple servers) should ensure sessions are sticky when using the in-memory cache. Sticky sessions ensure that subsequent requests from a client all go to the same server.

In-memory caching is a service that’s referenced from an app using Dependency Injection. So, we first need to register this service to the built-in IoC container of ASP.NET Core by modifying ConfigureServices method of Startup.cs.

Cache Implementation in ASP.NET Core application

We have created an ASP.NET Core Web API application with an API set up and Entity Framework Core configured. This API returns a list of all customers in the database. If you need to learn about setting up Entity Framework Core in ASP.NET Core, we recommend going through this article – CRUD Operation in ASP.NET Web API with Entity Framework.

In this example we created an ASP.NET Core Web API 5.0 project and connected it to a customer database with EF Core.

In-Memory Caching in ASP.NET Core is a Service that should be registered in the service container of the application.

services.AddMemoryCache();

The above line of code should be added to the ConfigureServices method. This adds a non-distributed, in-memory cache implementation to the IServiceCollection. Our application is now able to use in-memory caching.
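
For context, a minimal ConfigureServices method could look roughly like this; the AddDbContext registration and the connection string name are assumptions based on a typical EF Core setup with SQL Server, not copied from the project.

// Startup.cs (sketch)
public void ConfigureServices(IServiceCollection services)
{
    // Existing EF Core registration from the Web API project (assumed to use SQL Server).
    services.AddDbContext<ApplicationDbContext>(options =>
        options.UseSqlServer(Configuration.GetConnectionString("DefaultConnection")));

    // Register the non-distributed, in-memory cache service.
    services.AddMemoryCache();

    services.AddControllers();
}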

Implement Cache in Controller Endpoint

[Route("api/[controller]")]
[ApiController]
public class CustomersController : ControllerBase
{
    private readonly ApplicationDbContext _context;
    private readonly IMemoryCache memoryCache;

    public CustomersController(ApplicationDbContext context, IMemoryCache memoryCache)
    {
        _context = context;
        this.memoryCache = memoryCache;
    }

    // GET: api/Customers
    [HttpGet]
    public async Task<ActionResult<IEnumerable<Customer>>> GetCustomers()
    {
        var cacheKey = "Allcustomer";

        // If the entry is not in the cache, load it from the database and cache it.
        if (!memoryCache.TryGetValue(cacheKey, out List<Customer> customerList))
        {
            customerList = await _context.Customers.ToListAsync();

            var cacheExpiryOptions = new MemoryCacheEntryOptions
            {
                AbsoluteExpiration = DateTime.Now.AddMinutes(5),
                Priority = CacheItemPriority.High,
                SlidingExpiration = TimeSpan.FromMinutes(2)
            };
            memoryCache.Set(cacheKey, customerList, cacheExpiryOptions);
        }
        return customerList;
    }
}

Code Explanation

  • IMemoryCache is injected into the controller through the constructor.
  • We set the cache key ("Allcustomer") internally in our code.
  • If a cache entry with that key exists, TryGetValue pushes its data into the customerList object. If no such entry exists, we call the database via the context object and set the result as a new cache entry along with the expiration options. Choose the expiration values according to your requirements.
  • SlidingExpiration: Gets or sets how long a cache entry can be inactive (e.g. not accessed) before it will be removed. This will not extend the entry lifetime beyond the absolute expiration (if set).
  • AbsoluteExpiration: Gets or sets an absolute expiration date for the cache entry.

A cached item set with a sliding expiration only is at risk of becoming stale. If it is accessed more frequently than the sliding expiration interval, the item will never expire. Combine a sliding expiration with an absolute expiration to guarantee that the item expires once its absolute expiration time passes.

Test the In-Memory Cache in Postman

Let’s open the postman tool and test the cache performance.

You can see that the first call to the customer API fetches around 50-60 records and takes 9391 ms.

In this call, we are hitting the database directly, which may be slow depending on the traffic, connection status, size of the response, and so on. We have also configured our API to store these records in the cache during the same call.

So on the second call, the response should come back much faster.

As you can see, the second API call takes only 39 ms. That is a drastic improvement in performance; this is how powerful caching is.

Pros of In-Memory Cache

  • Much quicker than distributed caching, as it avoids communicating over a network.
  • Highly reliable and best suited for small to mid-scale applications.

Cons of In-Memory Cache

  • If configured incorrectly, it can consume your server’s resources.
  • With the scaling of application and longer caching periods, it can prove to be costly to maintain the server.
  • If deployed in the cloud, maintaining consistent caches can be difficult.

Points To Remember about Caching

  • The application should never depend on cached data, as it is highly probable to be unavailable at any given time. It should always rely on your actual data source; caching is just an enhancement to be used only when it is available/valid.
  • Try to restrict the growth of the cache in memory. This is crucial, as caching can take up your server's resources if not configured properly. You can make use of the Size property to limit the cache used for entries (see the sketch after this list).
  • Use absolute expiration / sliding expiration to make your application much faster and smarter. It also helps restrict cache memory usage.
  • Try to avoid external inputs as cache keys. Always set your keys in code.
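
As a small sketch of the Size/SizeLimit idea mentioned above (the limit of 1024 and the per-entry Size of 1 are arbitrary illustrative values):

// Registration: cap the cache at a total "size" of 1024 units (the unit meaning is up to you).
services.AddMemoryCache(options =>
{
    options.SizeLimit = 1024;
});

// When adding entries, assign each one a Size so the limit can be enforced.
var cacheExpiryOptions = new MemoryCacheEntryOptions
{
    AbsoluteExpiration = DateTime.Now.AddMinutes(5),
    SlidingExpiration = TimeSpan.FromMinutes(2),
    Size = 1   // this entry counts as 1 unit toward SizeLimit
};
memoryCache.Set(cacheKey, customerList, cacheExpiryOptions);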

How to improve Cache Entries

To improve caching further, we can use background jobs to update the cache at a regular interval. If the absolute cache expiration is set to 10 minutes, we can run a recurring job at an 11-minute interval to refresh the cache entry. You can use Hangfire to achieve this in ASP.NET Core applications; a rough sketch follows.
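
Here is a hedged sketch of that idea, assuming Hangfire is already installed and running in the same application process (so it shares the same IMemoryCache) and that a CustomerCacheRefresher class is registered with DI; the class name, job id, and exact AddOrUpdate overload are assumptions and may vary with your Hangfire version.

// At application startup: schedule a recurring job that refreshes the "Allcustomer"
// entry roughly every 11 minutes, just after the 10-minute absolute expiration.
RecurringJob.AddOrUpdate<CustomerCacheRefresher>(
    "refresh-customer-cache",                 // hypothetical job id
    r => r.RefreshCustomerCacheAsync(),
    "*/11 * * * *");                          // cron expression: every 11 minutes

public class CustomerCacheRefresher
{
    private readonly ApplicationDbContext _context;
    private readonly IMemoryCache _memoryCache;

    public CustomerCacheRefresher(ApplicationDbContext context, IMemoryCache memoryCache)
    {
        _context = context;
        _memoryCache = memoryCache;
    }

    public async Task RefreshCustomerCacheAsync()
    {
        // Reload the customer list and overwrite the cache entry before it expires.
        var customerList = await _context.Customers.ToListAsync();
        _memoryCache.Set("Allcustomer", customerList,
            new MemoryCacheEntryOptions { AbsoluteExpiration = DateTime.Now.AddMinutes(10) });
    }
}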

Find Source Code

Summary

In this detailed article, we have learnt about In-Memory Caching in ASP.NET Core. We discussed caching basics, in-memory caching, and implementing in-memory caching in ASP.NET Core.

That's a wrap for this article. I hope it cleared up In-Memory Caching in ASP.NET Core. Please write to us if you have any questions.
