Remove Caching Boilerplate Code With PostSharp Caching

Over the past year or so, I’ve been taking a look at the PostSharp framework. It started with looking at how complex multi-threaded scenarios can be handled with PostSharp Threading, then I took a dive into creating a mini-APM logger with PostSharp Logging. Since then, I’ve asked people I’ve worked with, both past and present, whether they’ve had experience using PostSharp, and the response was overwhelmingly positive. The feedback kept coming back to the same thing time and time again: PostSharp just takes care of super common scenarios without you having to work from scratch every time.

And that brings me to the PostSharp Caching library. At some point, on every project, a developer is going to implement caching. And as the saying goes…

There are two hard things in Computer Science: cache invalidation, naming things, and off-by-one errors.

PostSharp unfortunately can’t help with the latter two, but it can make the “simple at first, complex over time” act of caching and cache invalidation an absolute breeze.

Caching With PostSharp Attributes

Before jumping into invalidating our cache, we need to actually have items cached to begin with. I’m going to set up a simple console application like so:

static async Task Main(string[] args)
{
    Stopwatch watch = new Stopwatch();
    watch.Start();
    Console.WriteLine((await SayHello("Wade")) + " - " + watch.ElapsedMilliseconds);
    Console.WriteLine((await SayHello("Wade")) + " - " + watch.ElapsedMilliseconds);
    Console.WriteLine((await SayHello("John")) + " - " + watch.ElapsedMilliseconds);
    Console.WriteLine((await SayHello("Wade")) + " - " + watch.ElapsedMilliseconds);
}

static async Task<string> SayHello(string name)
{
    await Task.Delay(1000);
    return $"Hello {name}!";
}

I’ve added a Task.Delay in our SayHello method purely to demonstrate a slow function and make our caching more obvious. It’s meant to simulate a slow, unavoidable operation such as a database call – i.e. exactly the sort of thing you’d want to cache in the first place!

When we run this, it’s pretty obvious what’s going to happen.

Hello Wade! - 1019
Hello Wade! - 2024
Hello John! - 3031
Hello Wade! - 4043

Between each call to SayHello, there’s about a 1 second delay.

Now let’s add PostSharp. First we need to use our Package Manager Console to install the PostSharp Caching library:

Install-Package PostSharp.Patterns.Caching

Then we just have to add two lines:

static async Task Main(string[] args)
{
    //Set a default backend (In our case a memory cache, but more on that later)
    CachingServices.DefaultBackend = new MemoryCachingBackend();

    Stopwatch watch = new Stopwatch();
    watch.Start();
    Console.WriteLine((await SayHello("Wade")) + " - " + watch.ElapsedMilliseconds);
    Console.WriteLine((await SayHello("Wade")) + " - " + watch.ElapsedMilliseconds);
    Console.WriteLine((await SayHello("John")) + " - " + watch.ElapsedMilliseconds);
    Console.WriteLine((await SayHello("Wade")) + " - " + watch.ElapsedMilliseconds);
}

//Cache this method
[Cache]
static async Task<string> SayHello(string name)
{
    await Task.Delay(1000);
    return $"Hello {name}!";
}

We just have to set up our caching backend (in this case, a memory cache), and then do nothing more than add the [Cache] attribute to any method whose result we wish to cache.

Running this again:

Hello Wade! - 1077
Hello Wade! - 1080
Hello John! - 2102
Hello Wade! - 2102

Perfect! We can see that the first call to SayHello with the param “Wade” took a full second, but each subsequent call was almost instant (served from the cache). Interestingly, when we pass in a different name, in our case “John”, it again took the full second. That’s because PostSharp Caching takes the method’s parameters into account and creates unique cache entries based on the input. Pretty impressive stuff, but it doesn’t stop there!

PostSharp Caching Is Highly Configurable

Of course, the above is just a simple example. We can extend this out to include things like absolute expiration in minutes:

[Cache(AbsoluteExpiration = 5)]

Or sliding expiration:

[Cache(SlidingExpiration = 5)]
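Side by side, the two expiration modes behave quite differently; here's a minimal sketch showing both (the service and method names are my own illustration, not from the PostSharp docs):

```csharp
using System.Threading.Tasks;
using PostSharp.Patterns.Caching;

public class ReportService
{
    // Evicted 5 minutes after the entry was created,
    // no matter how often it is read in the meantime.
    [Cache(AbsoluteExpiration = 5)]
    public async Task<string> GetDailyReport(string region)
    {
        await Task.Delay(1000); // simulate a slow lookup
        return $"Report for {region}";
    }

    // Evicted 5 minutes after the entry was *last read*,
    // so frequently-accessed items stay cached indefinitely.
    [Cache(SlidingExpiration = 5)]
    public async Task<string> GetUserProfile(string userId)
    {
        await Task.Delay(1000);
        return $"Profile for {userId}";
    }
}
```

Absolute expiration suits data that goes stale on a schedule; sliding expiration suits data where recency of access is the better signal.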

Or you can even have a hierarchy, where you place a CacheConfiguration attribute on an entire class:

[CacheConfiguration(AbsoluteExpiration = 5)]
public class MyService
{
    [Cache]
    public string HelloWorld()
    {
        return "Hello World!";
    }
}

What I’m trying to get at is that things are highly configurable. You can even configure caching (including disabling it completely) at runtime using Caching Profiles, like so:

//Make all caching default to 5 minutes. 
CachingServices.Profiles.Default.AbsoluteExpiration = TimeSpan.FromMinutes(5);
//On second thought, let's just disable all caching for the meantime. 
CachingServices.Profiles.Default.IsEnabled = false;

So, stopping short of giving you the laundry list of configuration options, I really just want to point out that this thing is super configurable. Every time I thought “Well, what about if you want to…”, the documentation was there to give me exactly that.

Controlling Cache Keys

As we’ve seen earlier, our cache key is automatically built from the parameters of the method we are trying to cache. It’s actually a little more complicated than that: it uses the enclosing type (e.g. a ToString() on the enclosing class), the method name, and the method parameters. (And even then, it actually uses more, but let’s stick with that for now!)
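Because the method name and enclosing type are part of the key, two methods with identical parameters never collide in the cache. A quick sketch, using only the [Cache] attribute we've already seen:

```csharp
using System.Threading.Tasks;
using PostSharp.Patterns.Caching;

public class GreetingService
{
    // Both methods take the same argument value, but each gets its own
    // cache entry, because the method name (and enclosing type) form
    // part of the generated cache key.
    [Cache]
    public async Task<string> SayHello(string name)
    {
        await Task.Delay(1000);
        return $"Hello {name}!";
    }

    [Cache]
    public async Task<string> SayGoodbye(string name)
    {
        await Task.Delay(1000);
        return $"Goodbye {name}!";
    }
}
```

Calling SayHello("Wade") and then SayGoodbye("Wade") will each incur the one-second delay once, since they populate separate entries.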

Imagine I have code like so:

static async Task Main(string[] args)
{
    //Set a default backend (In our case a memory cache, but more on that later)
    CachingServices.DefaultBackend = new MemoryCachingBackend();

    Stopwatch watch = new Stopwatch();
    watch.Start();
    Console.WriteLine((await SayHello("Wade", Guid.NewGuid())) + " - " + watch.ElapsedMilliseconds);
    Console.WriteLine((await SayHello("Wade", Guid.NewGuid())) + " - " + watch.ElapsedMilliseconds);
}

[Cache]
static async Task<string> SayHello(string name, Guid random)
{
    await Task.Delay(1000);
    return $"Hello {name}!";
}

Yes, I know it’s a rather contrived example! But the fact is, the random Guid being passed into the method is breaking caching. When I run this I get:

Hello Wade! - 1079
Hello Wade! - 2091

But if it’s a parameter that I really don’t care about as part of the cache key, I want to be able to ignore it. And I can, with another attribute of course:

[Cache]
static async Task<string> SayHello(string name, [NotCacheKey]Guid random)
{
    await Task.Delay(1000);
    return $"Hello {name}!";
}

And running this I get:

Hello Wade! - 1079
Hello Wade! - 1082

While this is a simple example, I just wanted to bring up the fact that while PostSharp Caching is super easy to implement and works right out of the box, it’s also crazy extensible, and there hasn’t yet been a scenario where I’ve been “stuck” with the library unable to do what I want.

Invalidating Cache

For all the talk my intro did about cache invalidation being hard… it’s taken us a while to get to this point in the post. And that’s because PostSharp makes it a breeze!

Your first option is simply annotating the methods that update/insert new values (i.e. the places where you would normally invalidate the cache). It looks a bit like this:

static async Task Main(string[] args)
{
    //Set a default backend (In our case a memory cache, but more on that later)
    CachingServices.DefaultBackend = new MemoryCachingBackend();

    Stopwatch watch = new Stopwatch();
    watch.Start();
    Console.WriteLine((await SayHello("Wade")) + " - " + watch.ElapsedMilliseconds);
    UpdateHello("Wade");
    Console.WriteLine((await SayHello("Wade")) + " - " + watch.ElapsedMilliseconds);

    Console.WriteLine((await SayHello("John")) + " - " + watch.ElapsedMilliseconds);
    UpdateHello("Wade");
    Console.WriteLine((await SayHello("John")) + " - " + watch.ElapsedMilliseconds);
}

[Cache]
static async Task<string> SayHello(string name)
{
    await Task.Delay(1000);
    return $"Hello {name}!";
}

[InvalidateCache(nameof(SayHello))]
static void UpdateHello(string name)
{
    //Do something here. 
}

A long example, but I wanted to point something out: when you ask to invalidate the cache, it takes the parameters of your update method and matches them against the keys of your cached method. So in this case, because I update the name “Wade” twice, it only ever clears that one cache key; it doesn’t simply wipe the entire cache for the entire method.

So our output becomes:

Hello Wade! - 1062
Hello Wade! - 2092
Hello John! - 3111
Hello John! - 3111

But maybe the whole attribute thing isn’t for you. You can also invalidate the cache imperatively, like so:

[Cache]
static async Task<string> SayHello(string name)
{
    await Task.Delay(1000);
    return $"Hello {name}!";
}

static void UpdateHello(string name)
{
    //Do some work

    //Now invalidate the cache. 
    CachingServices.Invalidation.Invalidate(SayHello, name);
}

What I love about this is that I’m passing a reference to the method and the value of the parameter. Even though I’m invalidating a specific cache item, I’m still not having to work out what the actual cache key is. That means that should the way the cache key for SayHello is built ever change, the invalidation code never changes, because it’s a strongly typed reference to the method.

Obviously the added benefit of invalidating the cache imperatively is that you can conditionally define logic for when you want to invalidate, and you can invalidate the cache from inside an existing method without adding attributes. That being said, the attributes are really useful if a lot of your logic is behind CRUD interfaces.
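As a sketch of that conditional invalidation (the UpdateHello shape and the dataActuallyChanged flag are my own illustration, not from the docs):

```csharp
using System.Threading.Tasks;
using PostSharp.Patterns.Caching;

static class Greetings
{
    [Cache]
    public static async Task<string> SayHello(string name)
    {
        await Task.Delay(1000);
        return $"Hello {name}!";
    }

    public static void UpdateHello(string name, bool dataActuallyChanged)
    {
        //Do some work

        // Only pay the invalidation cost when the underlying
        // data really changed; otherwise leave the cache warm.
        if (dataActuallyChanged)
        {
            CachingServices.Invalidation.Invalidate(SayHello, name);
        }
    }
}
```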

Distributed Caching Backends

I don’t want to go on too long about caching backends because, for the most part, you’re going to use Redis for a distributed cache, or a local in-memory cache if you just need something within a single instance of your application.

But I did want to mention one feature that I’ve had to implement myself many times when using other caching libraries. That is, the combination of using a Redis server for distributed cache, but *also* a local in-memory cache. The reason being, if my application is horizontally scaled, of course I want to use Redis so that every instance can share the same cached entities. However, fetching from Redis incurs a level of overhead (namely network overhead) on every read. So what I’ve generally had to implement myself is keeping a local in-memory store for faster retrieval, as well as managing the Redis distributed cache for the other machines.

But of course, PostSharp Caching makes this a breeze:

RedisCachingBackendConfiguration redisCachingConfiguration = new RedisCachingBackendConfiguration();
redisCachingConfiguration.IsLocallyCached = true;
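For completeness, the full wiring might look something like the sketch below. I'm assuming PostSharp's RedisCachingBackend.Create factory and a StackExchange.Redis ConnectionMultiplexer here, so double-check the names against the PostSharp documentation:

```csharp
using PostSharp.Patterns.Caching;
using PostSharp.Patterns.Caching.Backends.Redis;
using StackExchange.Redis;

// Assumed wiring: connect via StackExchange.Redis, then hand the
// connection to PostSharp's Redis backend with the local in-memory
// layer enabled on top of it.
ConnectionMultiplexer connection = ConnectionMultiplexer.Connect("localhost:6379");

RedisCachingBackendConfiguration redisCachingConfiguration = new RedisCachingBackendConfiguration();
redisCachingConfiguration.IsLocallyCached = true;

CachingServices.DefaultBackend = RedisCachingBackend.Create(connection, redisCachingConfiguration);
```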

In general, the caching backends used in PostSharp are very extensible. While Redis and in-memory serve me just fine, you could implement PostSharp’s backend interface to add your own (for example, SQL). You then get the power of the attributes and all of the goodness of PostSharp in code, but with your own one-off caching backend.

Who Is This Library For?

When I recommend libraries or products, I like to add a little bit around “Who is this for?”, because not everything I use works in small start-ups or, vice versa, in large enterprises. However, I really believe PostSharp Caching has the ability to fit into almost any product.

Early in my development career, I thought that I could develop everything myself. All it really meant was taking time away from the features my customers really cared about, and devoting it to re-inventing the wheel on boilerplate code. When I thought not only about putting a dollar amount on my time, but also about the opportunity cost of not having additional features, it suddenly made much more sense to leave boilerplate code, like caching, to people who actually dedicate their time to getting it right. PostSharp Caching is one of those products: one that you can just plug in, have it work right away, and save your time for features that actually matter.


This is a sponsored post however all opinions are mine and mine alone. 
