Context and Problem


ASP.NET Core's in-memory cache has its limits. It is difficult to scale in a cloud environment with multiple instances, because each instance keeps its own cache. Cloud instances can also be short-lived (especially in Docker), which means the memory cache is lost on restart and data must be fetched from the database again for subsequent requests. My project Elf used the memory cache and had exactly these problems. Let's see how Azure can solve them.

Solution


Azure Cache for Redis provides a fully managed distributed cache service for cloud applications. It is built on top of Redis, a popular cache and messaging product. Integrating Redis into cloud applications can improve performance and scalability: all instances of the application connect to Redis to read and write cache entries instead of operating on their own separate memories, so the cache stays consistent across instances. Restarting an instance no longer loses the cache, so short-lived containers are no longer a problem for caching.

ASP.NET Core's memory cache, when deployed to multiple instances, looks like this:

Azure Cache for Redis looks like this:

I also use a cloud-native design pattern called the "Cache-Aside pattern" with Azure Cache for Redis, so that repeated access to the same data does not hit the database every time.
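The idea behind Cache-Aside can be sketched with a plain dictionary standing in for Redis and a counter standing in for the database. All names here are illustrative, not Elf's actual code:

```csharp
using System;
using System.Collections.Generic;

// Cache-Aside sketch: check the cache first, fall back to the "database"
// on a miss, then populate the cache so later reads are served from it.
class CacheAsideDemo
{
    static readonly Dictionary<string, string> Cache = new();
    static int _dbHits;

    static string GetValue(string key)
    {
        if (Cache.TryGetValue(key, out var cached)) return cached; // cache hit

        _dbHits++;                          // cache miss: query the "database"
        var value = $"value-for-{key}";
        Cache[key] = value;                 // write back for subsequent reads
        return value;
    }

    static void Main()
    {
        GetValue("a"); GetValue("a"); GetValue("b");
        Console.WriteLine(_dbHits); // the "database" was only hit twice
    }
}
```

With real Redis behind IDistributedCache, the flow is identical; only the storage changes.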

Implementation


Create Azure Cache for Redis

First, create an Azure Cache for Redis instance. Choose a location near your application server. 

The creation process can take 15-20 minutes. When it has finished, I recommend going to "Advanced settings" and setting the minimum TLS version to 1.2, so that you can get rid of an annoying warning message on the overview page.

Then, go to the Console on the Overview page.

Enter "PING"; if the server responds "PONG", your Redis instance is ready to go.

Now, go to the "Access keys" page and copy either the primary or the secondary connection string for later use.

Adding Redis to ASP.NET Core

Install the NuGet package Microsoft.Extensions.Caching.StackExchangeRedis into your project.
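From the command line, this can be done with the .NET CLI:

```shell
dotnet add package Microsoft.Extensions.Caching.StackExchangeRedis
```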

Add the Redis connection string you copied from Azure to your configuration (e.g. appsettings.json). In my case, I name it "RedisConnection".

"ConnectionStrings": {
  "RedisConnection": "************"
}

Add Redis to the DI container.

builder.Services.AddStackExchangeRedisCache(options =>
{
    options.Configuration = builder.Configuration.GetConnectionString("RedisConnection");
});

Now we can use the IDistributedCache interface from DI. However, unlike the IMemoryCache interface, IDistributedCache cannot store objects directly: objects need to be serialized when writing to the cache and deserialized when reading from it. In my case, the only object I deal with is the "Link" type, so I created extension methods on IDistributedCache like this:

using System.Text;
using System.Text.Json;
using Microsoft.Extensions.Caching.Distributed;

public static class DistributedCacheExtensions
{
    public static async Task<Link> GetLink(this IDistributedCache cache, string token)
    {
        // Cache stores raw bytes; deserialize the JSON payload back into a Link.
        var cachedLinkBytes = await cache.GetAsync(token);
        if (null == cachedLinkBytes) return null;

        var cachedLinkJson = Encoding.UTF8.GetString(cachedLinkBytes);
        var cachedLink = JsonSerializer.Deserialize<Link>(cachedLinkJson);

        return cachedLink;
    }

    public static async Task SetLink(this IDistributedCache cache, string token, Link link, TimeSpan? ttl = null)
    {
        // Serialize the Link to JSON bytes before writing to the cache.
        var json = JsonSerializer.Serialize(link);
        var bytes = Encoding.UTF8.GetBytes(json);

        if (ttl == null)
        {
            await cache.SetAsync(token, bytes);
        }
        else
        {
            // Sliding expiration: the entry's lifetime resets on each access.
            await cache.SetAsync(token, bytes, new() { SlidingExpiration = ttl });
        }
    }
}

Get an IDistributedCache instance from DI:

private readonly IDistributedCache _cache;

public ForwardController(IDistributedCache cache)
{
    _cache = cache;
}

Now I can write into Redis like this:

await _cache.SetLink(token, link, TimeSpan.FromSeconds(link.TTL.GetValueOrDefault()));

And read from Redis:

var linkEntry = await _cache.GetLink(token);
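Putting the two together gives the Cache-Aside flow in the controller action: try Redis first, and only query the database on a miss. The `_linkService.GetLinkByToken` call below is a hypothetical stand-in for Elf's actual data access, not its real API:

```csharp
var linkEntry = await _cache.GetLink(token);
if (linkEntry == null)
{
    // Cache miss: load from the database (hypothetical data-access call),
    // then write the result back to Redis for subsequent requests.
    linkEntry = await _linkService.GetLinkByToken(token);
    if (linkEntry != null)
    {
        await _cache.SetLink(token, linkEntry,
            TimeSpan.FromSeconds(linkEntry.TTL.GetValueOrDefault()));
    }
}
```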

For complete code, please see my Elf project on GitHub: https://github.com/EdiWang/Elf