It’s been a while since we’ve talked about the preview releases coming from the .NET Team. Mostly that’s because the preview updates have been small, incremental releases that aren’t quite anything to write home about yet. For example, if you’re not interested in (or using) Blazor at the moment, then three quarters of the updates aren’t for you.
But that changes with the .NET 7 Preview 6 release, which brings two new features that are going to be outright game changers. Those are:
- Output caching middleware
- Rate limiting middleware
I’ll chat about both of these, but first, how can you get your hands on them yourself?
Downloading .NET 7 Preview 6
The first thing you should do is download the preview SDK from here.
The next part is a little tricky. If you are using Visual Studio Code, then you *should* be able to get things running immediately. However, for Visual Studio, you will need the preview version available here.
Again, I want to reiterate that you need the preview version of Visual Studio 2022. You cannot use any previous version of Visual Studio (e.g. 2019), nor can you use the release version.
Output Caching Middleware
When this gets a general release, we’ll dive into it more thoroughly, but for now I just wanted to talk about one feature that, the more I thought about it, the more I found myself thinking, “I can’t believe this hasn’t been solved before now”.
Think about a scenario where an extremely popular API endpoint gets called multiple times per second. That endpoint calls a database stored procedure that itself takes several seconds to complete (that’s why we’re caching it, after all!).
Now let’s say we invalidate that cache at some point. Typically, every request will now hit the endpoint and execute the stored procedure all at the same time, because the check is simply “No cache? Go to the database”. This is known as a “cache stampede”.
.NET, however, has a solution for this. Instead of every request calling the database until the first result is returned, only the very first request calls the database, and all subsequent requests wait for the cache to be populated. It’s something so simple, yet it’s a problem I’ve seen in production a lot.
Even if you take cache invalidation out of the picture, I’ve often seen deployments go belly up because the cache isn’t “pre-primed” before go-live. This solution from the .NET Team solves that too!
So what complex code do you need to set this up? Well… just a single line, of course:
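To make the idea concrete, here is a minimal sketch of that “only the first caller does the work” pattern. This is an illustration of the concept, not the middleware’s actual internals: `CoalescingCache` and its members are hypothetical names, and the sketch never evicts entries.

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

// Illustration only (not the middleware's real implementation): concurrent
// callers for the same key all await a single in-flight task, so the slow
// work (e.g. the stored procedure) runs once instead of once per request.
public static class CoalescingCache
{
    private static readonly ConcurrentDictionary<string, Lazy<Task<string>>> _inflight = new();

    public static Task<string> GetOrAddAsync(string key, Func<Task<string>> factory)
    {
        // Lazy<T> (thread-safe by default) guarantees the factory runs at
        // most once per key, even under a stampede of concurrent requests.
        var lazy = _inflight.GetOrAdd(key, _ => new Lazy<Task<string>>(factory));
        return lazy.Value;
    }
}
```

Ten simultaneous requests for the same key would trigger exactly one call to the factory; the other nine simply await the same task.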
app.MapGet("/myendpoint", () => DoWork()).CacheOutput();
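That one-liner assumes the output caching services and middleware are already registered. Since this is a preview, the exact API surface may still change, but as of Preview 6 a minimal Program.cs looks roughly like this (`DoWork` is a stand-in for your expensive call):

```csharp
var builder = WebApplication.CreateBuilder(args);
builder.Services.AddOutputCache();   // register the output caching services

var app = builder.Build();
app.UseOutputCache();                // add the caching middleware to the pipeline

// DoWork stands in for the expensive operation (e.g. the slow stored procedure).
string DoWork() => "expensive result";

app.MapGet("/myendpoint", () => DoWork()).CacheOutput();

app.Run();
```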
Rate Limiting Middleware
Similar to the Output Caching Middleware, a Rate Limiting Middleware is well overdue as a first class citizen in the .NET ecosystem.
I think we all understand the concept of rate limiting in general, but what caught my interest was that the .NET Team has implemented various types of rate limits and packaged them all into the same middleware. That is, you now have many strategies with which to rate limit your application, rather than a simple “X requests in Y timespan”.
Already announced are:
Concurrent Request Limit
Does what it says on the tin. If your concurrency limit is 5 requests, then only 5 requests can be processed at one time (and the 6th will be rejected). As soon as one of those 5 completes, a new request can be accepted.
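The concurrency strategy can be sketched with a plain semaphore. Again, this is the concept only, not the middleware’s implementation; the class name and members here are made up for illustration:

```csharp
using System.Threading;

// Concept sketch of a concurrent-request limiter: at most `limit` requests
// may be in flight at once; further requests are rejected, not queued.
public class ConcurrencySketch
{
    private readonly SemaphoreSlim _slots;

    public ConcurrencySketch(int limit) => _slots = new SemaphoreSlim(limit, limit);

    // Returns true if the request may proceed; false means "reject".
    public bool TryAcquire() => _slots.Wait(0);

    // Called when a request completes, freeing its slot for the next one.
    public void Release() => _slots.Release();
}
```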
Token Bucket Limit
Imagine a bucket that can hold 100 tokens. And every minute, 10 tokens are added back into the bucket. When a request comes in, a token is removed from the bucket, and so on. This type of strategy is very common when you need some sort of burstability because you can completely drain the bucket all at once if you wish, and then wait for the tokens to slowly fill up again.
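A token bucket with those numbers (100-token capacity, 10 tokens per refill) can be sketched like so. This is the general algorithm, not the middleware’s code, and for simplicity the refill is driven by an explicit method call rather than a timer:

```csharp
using System;

// Concept sketch of a token bucket: starts full (so bursts are allowed
// immediately), each request spends one token, and Refill() tops the
// bucket back up by a fixed amount per interval, capped at capacity.
public class TokenBucketSketch
{
    private readonly int _capacity;
    private readonly int _refillAmount;
    private int _tokens;

    public TokenBucketSketch(int capacity, int refillAmount)
    {
        _capacity = capacity;
        _refillAmount = refillAmount;
        _tokens = capacity;
    }

    // A request spends one token; an empty bucket means "rejected".
    public bool TryTake()
    {
        if (_tokens == 0) return false;
        _tokens--;
        return true;
    }

    // In a real limiter this would run on a timer (e.g. every minute).
    public void Refill() => _tokens = Math.Min(_capacity, _tokens + _refillAmount);
}
```

Note how draining the bucket all at once is perfectly legal; that’s the burstability the strategy is designed for.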
Fixed Window Limit
Put simply: every hour you can make 100 requests, and every hour this limit resets back to 100. Nice and simple! This is extremely useful if you have “daily” limits, for example, where the limit is tied to a particular time (e.g. at midnight the rate limit resets).
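The fixed window strategy is just a counter that resets at each window boundary. A concept sketch (time is passed in explicitly to keep it testable; the real middleware obviously uses the clock):

```csharp
using System;

// Concept sketch of a fixed-window limiter: up to `limit` requests per
// window, and the counter resets completely at each window boundary.
public class FixedWindowSketch
{
    private readonly int _limit;
    private readonly TimeSpan _window;
    private DateTime _windowStart;
    private int _count;

    public FixedWindowSketch(int limit, TimeSpan window, DateTime start)
    {
        _limit = limit;
        _window = window;
        _windowStart = start;
    }

    public bool TryAcquire(DateTime now)
    {
        if (now - _windowStart >= _window)
        {
            // A new window has started: reset the counter.
            _windowStart = now;
            _count = 0;
        }
        if (_count >= _limit) return false;
        _count++;
        return true;
    }
}
```

One known quirk of fixed windows: a client can make 100 requests just before the reset and 100 just after, so short bursts of up to double the limit are possible across a boundary. That’s exactly what the sliding window variant below addresses.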
Sliding Window Limit
Similar to the fixed window limit, you can make 100 requests in any 1-hour window, but that 1-hour window moves. More or less, the strategy becomes “in the past hour, you may make no more than 100 requests”.
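The simplest way to sketch a rolling window is to remember a timestamp per request and discard any that have aged out. (Real implementations often approximate this with window segments to avoid storing every timestamp; this sketch just shows the concept.)

```csharp
using System;
using System.Collections.Generic;

// Concept sketch of a sliding-window limiter: no more than `limit`
// requests in any rolling `window` period, tracked via timestamps.
public class SlidingWindowSketch
{
    private readonly int _limit;
    private readonly TimeSpan _window;
    private readonly Queue<DateTime> _timestamps = new();

    public SlidingWindowSketch(int limit, TimeSpan window)
    {
        _limit = limit;
        _window = window;
    }

    public bool TryAcquire(DateTime now)
    {
        // Drop requests that have aged out of the rolling window.
        while (_timestamps.Count > 0 && now - _timestamps.Peek() >= _window)
            _timestamps.Dequeue();

        if (_timestamps.Count >= _limit) return false;
        _timestamps.Enqueue(now);
        return true;
    }
}
```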