Queues In C# .NET

I was recently looking into the new Channel<T> API in the latest version of .NET Core (For an upcoming post), but while writing it up, I wanted to do a quick refresher of all the existing “queues” in C# .NET. These queues are also available in full framework (And possibly other platforms), but all examples are written in .NET Core, so your mileage may vary if you are trying to run them on a different platform.

FIFO vs LIFO

Before we jump into the .NET specifics, we should talk about the concepts of FIFO and LIFO, or “First In, First Out” and “Last In, First Out”. When it comes to queues, we typically think of FIFO: the first message put into the queue is the first one that comes out, so messages are processed in the order they arrive. LIFO is typically rare when it comes to queues, but in .NET there is a type called Stack<T> that works with LIFO. That is, after filling the stack with messages/objects, the last one put in is the first one out. Essentially, the order is reversed.
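To make the difference concrete, here’s a minimal sketch (using the Queue<T> and Stack<T> types covered below) that pushes the same items into both and prints what comes out:

using System;
using System.Collections.Generic;

static void Main(string[] args)
{
    var queue = new Queue<int>();
    var stack = new Stack<int>();

    // Put the same items into both structures.
    foreach (var i in new[] { 1, 2, 3 })
    {
        queue.Enqueue(i);
        stack.Push(i);
    }

    // FIFO: items come out in insertion order.
    while (queue.Count > 0)
        Console.Write(queue.Dequeue() + " "); // 1 2 3

    Console.WriteLine();

    // LIFO: items come out in reverse insertion order.
    while (stack.Count > 0)
        Console.Write(stack.Pop() + " "); // 3 2 1
}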

Queue<T>

Queue<T> is going to be our barebones simple queue in .NET Core. It takes messages, and then pops them out in order. Here’s a quick code example:

using System;
using System.Collections.Generic;

static void Main(string[] args)
{
    Queue<string> messageQueue = new Queue<string>();
    messageQueue.Enqueue("Hello");
    messageQueue.Enqueue("World!");

    // Dequeue pops messages off in FIFO order.
    Console.WriteLine(messageQueue.Dequeue());
    Console.WriteLine(messageQueue.Dequeue());
    Console.ReadLine();
}

Pretty stock standard and not a lot of hidden meaning here. The Enqueue method puts a message on our queue, and the Dequeue method takes one off (In a FIFO manner). Our console app obviously prints out two lines, “Hello” then “World!”.

Barring multi-threaded scenarios (Which we will talk about shortly), you’re not going to find too many reasons to use this barebones queue. In a single threaded app, you might pass around a queue to process a “list” of messages, but you may find that using a List<T> within a loop is a simpler way of achieving the same result, as shown below. In fact, if you look at the source code of Queue<T>, you will see it’s really just a wrapper around an array that implements IEnumerable<T> anyway!
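For example, in a single threaded app the two approaches end up looking much the same (a minimal sketch):

using System;
using System.Collections.Generic;

static void Main(string[] args)
{
    // Draining a Queue<string> in a loop...
    Queue<string> queue = new Queue<string>(new[] { "Hello", "World!" });
    while (queue.Count > 0)
    {
        Console.WriteLine(queue.Dequeue());
    }

    // ...achieves much the same as simply iterating a List<string>.
    List<string> list = new List<string> { "Hello", "World!" };
    foreach (var message in list)
    {
        Console.WriteLine(message);
    }
}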

So how about multi-threaded scenarios? It kind of makes sense that you may want to load up a queue with items, and then have multiple threads all trying to process the messages. Well, using a queue in this manner is actually not threadsafe, but .NET has a different type to handle multi-threading…

ConcurrentQueue<T>

ConcurrentQueue<T> is pretty similar to Queue<T>, but is made threadsafe by a copious amount of spinlocks. A common misconception is that ConcurrentQueues are just a wrapper around a queue with the use of the lock keyword. A quick look at the source code here shows that’s definitely not the case. Why do I feel the need to point this out? Because I often see people try to make their use of Queue<T> threadsafe by using locks, thinking that they are doing what Microsoft does when using ConcurrentQueue. That’s pretty far from the truth, and it actually incurs a pretty big performance hit.
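For reference, the hand-rolled lock approach usually looks something like this (a minimal sketch of the pattern; the LockedQueue name is made up for illustration, and this is not how ConcurrentQueue<T> works internally):

using System.Collections.Generic;

// A hand-rolled "threadsafe" queue using the lock keyword.
// Every operation contends on a single lock, which is exactly
// the performance hit described above.
public class LockedQueue<T>
{
    private readonly Queue<T> _queue = new Queue<T>();
    private readonly object _lock = new object();

    public void Enqueue(T item)
    {
        lock (_lock)
        {
            _queue.Enqueue(item);
        }
    }

    public bool TryDequeue(out T item)
    {
        lock (_lock)
        {
            if (_queue.Count > 0)
            {
                item = _queue.Dequeue();
                return true;
            }

            item = default(T);
            return false;
        }
    }
}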

Here’s a code sample of a ConcurrentQueue:

using System;
using System.Collections.Concurrent;

static void Main(string[] args)
{
    ConcurrentQueue<string> concurrentQueue = new ConcurrentQueue<string>();
    concurrentQueue.Enqueue("Hello");
    concurrentQueue.Enqueue("World!");

    // TryDequeue returns false once the queue is empty.
    string message;
    while (concurrentQueue.TryDequeue(out message))
    {
        Console.WriteLine(message);
    }

    Console.ReadLine();
}

So you’ll notice we can no longer just dequeue a message; we need to TryDequeue. It will return true if we managed to dequeue a message, and false if there is no message to dequeue.

Again, the main point of using a ConcurrentQueue over a regular Queue is that it’s threadsafe to have multiple consumers (Or producers/enqueuers) all using it at the same time.
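To make that concrete, here’s a rough sketch of multiple consumers draining the same ConcurrentQueue<T> at once (the consumer count and messages are illustrative):

using System;
using System.Collections.Concurrent;
using System.Linq;
using System.Threading.Tasks;

static void Main(string[] args)
{
    ConcurrentQueue<int> concurrentQueue = new ConcurrentQueue<int>();
    for (int i = 0; i < 100; i++)
    {
        concurrentQueue.Enqueue(i);
    }

    // Three consumers all dequeuing at the same time.
    // Each message is handed to exactly one consumer.
    var consumers = Enumerable.Range(1, 3).Select(id => Task.Run(() =>
    {
        int message;
        while (concurrentQueue.TryDequeue(out message))
        {
            Console.WriteLine($"Consumer {id} processed message {message}");
        }
    })).ToArray();

    Task.WaitAll(consumers);
}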

BlockingCollection<T>

A blocking collection is an interesting “wrapper” type that can go over the top of any IProducerConsumerCollection<T> type (Which ConcurrentQueue<T> and ConcurrentStack<T> both implement; note that the plain Queue<T> does not). This can be handy if you have your own implementation of a queue, but for most cases you can roll with the default constructor of BlockingCollection. When doing this, it uses a ConcurrentQueue<T> under the hood, making everything threadsafe (See source code here). The main reason to use a BlockingCollection is that it can be given a limit to how many items can sit in the queue/collection. Obviously this is beneficial if your producer is much faster than your consumers.

Let’s take a quick look:

using System;
using System.Collections.Concurrent;

static void Main(string[] args)
{
    // Bounded to a maximum of 2 items.
    BlockingCollection<string> blockingCollection = new BlockingCollection<string>(2);
    Console.WriteLine("Adding Hello");
    blockingCollection.Add("Hello");
    Console.WriteLine("Adding World!");
    blockingCollection.Add("World!");
    Console.WriteLine("Adding Good");
    blockingCollection.Add("Good");
    Console.WriteLine("Adding Evening");
    blockingCollection.Add("Evening!");

    Console.ReadLine();
}

What will happen with this code? You will see “Adding Hello”, “Adding World!” and “Adding Good”, and then nothing… Your application will just hang on the third Add call. The reason is this line:

BlockingCollection<string> blockingCollection = new BlockingCollection<string>(2);

We’ve initialized the collection with a max size of 2. If we try to add an item when the collection is already at this size, we will just wait until a message is dequeued. How long will we wait? Well, by default, forever. However, we can change our add line to be:

blockingCollection.TryAdd("Hello", TimeSpan.FromSeconds(60));

So we’ve changed our Add call to TryAdd, and we’ve specified a timespan to wait. If this timespan is hit, then the TryAdd method will return false to let us know we weren’t able to add the item to the collection. This is handy if you need to alert someone that your queue is overloaded (e.g. the consumers are stalled for whatever reason).
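The flip side is the consumer. Take and GetConsumingEnumerable both block waiting for items to arrive, which in turn unblocks any producer stuck on Add. Here’s a rough sketch of the full producer/consumer pair (the setup is illustrative):

using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

static void Main(string[] args)
{
    BlockingCollection<string> blockingCollection = new BlockingCollection<string>(2);

    // Consumer: GetConsumingEnumerable blocks waiting for items,
    // and completes once CompleteAdding has been called.
    var consumer = Task.Run(() =>
    {
        foreach (var message in blockingCollection.GetConsumingEnumerable())
        {
            Console.WriteLine(message);
        }
    });

    blockingCollection.Add("Hello");
    blockingCollection.Add("World!");
    blockingCollection.Add("Good");
    blockingCollection.Add("Evening!"); // No longer hangs, the consumer is draining items.
    blockingCollection.CompleteAdding();

    consumer.Wait();
}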

Stack<T>

As we talked about earlier, a Stack<T> type allows for a Last In, First Out (LIFO) queuing style. Consider the following code:

using System;
using System.Collections.Generic;

static void Main(string[] args)
{
    Stack<string> stack = new Stack<string>();
    stack.Push("Hello");
    stack.Push("World!");

    // Pop returns messages in LIFO order.
    Console.WriteLine(stack.Pop());
    Console.WriteLine(stack.Pop());

    Console.ReadLine();
}

The output would be “World!” then “Hello”. It’s rare that you would need this reversal of messages, but it does happen. Stack<T> also has its companion in ConcurrentStack<T>, and you can initialize BlockingCollection with a ConcurrentStack within it.
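For example (a minimal sketch), passing a ConcurrentStack<T> to the BlockingCollection constructor gives you a threadsafe, bounded LIFO collection:

using System;
using System.Collections.Concurrent;

static void Main(string[] args)
{
    // BlockingCollection accepts any IProducerConsumerCollection<T>,
    // so backing it with a ConcurrentStack gives LIFO behaviour.
    var blockingStack = new BlockingCollection<string>(new ConcurrentStack<string>(), 2);

    blockingStack.Add("Hello");
    blockingStack.Add("World!");

    Console.WriteLine(blockingStack.Take()); // World!
    Console.WriteLine(blockingStack.Take()); // Hello
}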

Channel<T>

There is a brand new Channel<T> type released with .NET Core. Because it’s just so different from the existing queue types in .NET, I’ll have an upcoming post discussing a tonne more about how channels work, and why you might use them.
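As a tiny taste in the meantime, here’s a minimal sketch (assuming the System.Threading.Channels package, and C# 8 or later for await foreach):

using System;
using System.Threading.Channels;
using System.Threading.Tasks;

static async Task Main(string[] args)
{
    // Channels are async-first: writers and readers await rather than block.
    var channel = Channel.CreateUnbounded<string>();

    await channel.Writer.WriteAsync("Hello");
    await channel.Writer.WriteAsync("World!");
    channel.Writer.Complete();

    // ReadAllAsync yields messages until the writer completes.
    await foreach (var message in channel.Reader.ReadAllAsync())
    {
        Console.WriteLine(message);
    }
}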
