Queue, Stack and LinkedList — Specialised Sequences

Queue<T>, Stack<T>, and LinkedList<T> are specialised collections optimised for specific access patterns that List<T> handles less efficiently. A queue processes items in the order they arrive (FIFO — first in, first out), a stack processes the most recently added item first (LIFO — last in, first out), and a linked list enables efficient insertion and removal anywhere in the sequence without shifting elements. Knowing when to reach for these instead of List<T> separates competent C# from excellent C#.

Queue<T> — FIFO Processing

// Queue — first in, first out
var jobQueue = new Queue<string>();

// Enqueue — add to the back
jobQueue.Enqueue("send-welcome-email");
jobQueue.Enqueue("resize-avatar");
jobQueue.Enqueue("send-push-notification");

Console.WriteLine(jobQueue.Count);  // 3

// Dequeue — remove and return from the front
string nextJob = jobQueue.Dequeue();         // "send-welcome-email"
Console.WriteLine(nextJob);

// Peek — look at the front without removing
string upcoming = jobQueue.Peek();           // "resize-avatar"
Console.WriteLine(jobQueue.Count);  // 2

// Safe versions (no exception when empty)
bool hasJob = jobQueue.TryDequeue(out string? job);
bool hasPeek = jobQueue.TryPeek(out string? next);

// Background job processor pattern (simplified)
while (jobQueue.TryDequeue(out var currentJob))
{
    Console.WriteLine($"Processing: {currentJob}");
    // ProcessJob(currentJob);
}
Note: The built-in Queue<T> is not thread-safe. For background task queues in ASP.NET Core (where a web request enqueues a job and a background service dequeues it), use System.Threading.Channels.Channel<T> (the modern approach) or ConcurrentQueue<T>. The Channel<T> API supports async producer-consumer patterns: the producer calls await channel.Writer.WriteAsync(item) and the consumer loops with await foreach (var item in channel.Reader.ReadAllAsync()), making it the idiomatic choice for high-throughput background processing in ASP.NET Core 5+.
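A minimal sketch of that Channel<T> producer-consumer pattern, with both sides collapsed into one console program for brevity (in ASP.NET Core the producer would sit in a request handler and the consumer in a BackgroundService; the job names are illustrative):

```csharp
using System;
using System.Threading.Channels;
using System.Threading.Tasks;

var channel = Channel.CreateUnbounded<string>();

// Producer — writes jobs, then signals completion so the consumer can finish.
var producer = Task.Run(async () =>
{
    await channel.Writer.WriteAsync("send-welcome-email");
    await channel.Writer.WriteAsync("resize-avatar");
    channel.Writer.Complete();
});

// Consumer — ReadAllAsync yields items as they arrive and ends at completion.
await foreach (string job in channel.Reader.ReadAllAsync())
    Console.WriteLine($"Processing: {job}");

await producer;
```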
Tip: For priority queues (process high-priority items before low-priority ones), use PriorityQueue<TElement, TPriority>, added in .NET 6. It dequeues the element with the lowest priority value first: given var pq = new PriorityQueue<string, int>(); pq.Enqueue("low", 10); pq.Enqueue("high", 1);, a call to pq.Dequeue() returns "high". This is useful for notification systems where urgent alerts should be sent before routine ones.
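Expanding that tip into a runnable sketch, using the notification scenario (alert names and priority values are illustrative):

```csharp
using System;
using System.Collections.Generic;

var alerts = new PriorityQueue<string, int>();
alerts.Enqueue("weekly-digest", 10);    // routine
alerts.Enqueue("security-alert", 1);    // urgent — lowest priority value wins
alerts.Enqueue("friend-request", 5);

// TryDequeue hands back both the element and its priority.
while (alerts.TryDequeue(out string? alert, out int priority))
    Console.WriteLine($"{priority}: {alert}");

// 1: security-alert
// 5: friend-request
// 10: weekly-digest
```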
Warning: Avoid using List<T> as a queue by calling RemoveAt(0) — this is O(n) because every remaining element shifts left by one position. A Queue<T> dequeues from the front in O(1) using a circular buffer internally. For 1,000 items per second over a busy request cycle, the difference between O(1) and O(n) dequeue is significant.

Stack<T> — LIFO Processing

// Stack — last in, first out
var undoStack = new Stack<string>();

// Push — add to the top
undoStack.Push("typed 'Hello'");
undoStack.Push("inserted image");
undoStack.Push("added heading");

// Pop — remove and return from the top
string lastAction = undoStack.Pop();   // "added heading" — most recent
Console.WriteLine(lastAction);

// Peek — look at top without removing
string nextUndo = undoStack.Peek();   // "inserted image"

// Safe pop
bool success = undoStack.TryPop(out string? action);

// Expression evaluator / bracket matcher
public static bool AreBracketsBalanced(string expression)
{
    var stack = new Stack<char>();
    foreach (char c in expression)
    {
        if (c is '(' or '[' or '{')
            stack.Push(c);
        else if (c is ')' or ']' or '}')
        {
            if (stack.Count == 0) return false;
            char open = stack.Pop();
            if (!IsMatchingPair(open, c)) return false;
        }
    }
    return stack.Count == 0;

    static bool IsMatchingPair(char o, char c) =>
        (o == '(' && c == ')') || (o == '[' && c == ']') || (o == '{' && c == '}');
}
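Calling the matcher above on a few sample expressions shows the three ways it resolves:

```csharp
// Assumes AreBracketsBalanced from the block above is in scope.
Console.WriteLine(AreBracketsBalanced("(a + b) * [c]"));  // True
Console.WriteLine(AreBracketsBalanced("{(a + b]}"));      // False — ']' does not match the open '('
Console.WriteLine(AreBracketsBalanced("((a)"));           // False — one '(' is never closed
```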

LinkedList<T> — Efficient Mid-Sequence Operations

// LinkedList — O(1) insert/remove anywhere given a node reference
var history = new LinkedList<string>();

LinkedListNode<string> first  = history.AddFirst("page1");
LinkedListNode<string> second = history.AddLast("page2");
LinkedListNode<string> third  = history.AddLast("page3");

// Insert before a node — O(1)
history.AddBefore(third, "page2.5");

// Remove a node — O(1) given the node reference
history.Remove(second);

// Traverse forward
foreach (string page in history)
    Console.Write(page + " ");   // page1 page2.5 page3

// When to use LinkedList vs List:
// LinkedList: O(1) insert/remove at known node; O(n) random access
// List: O(1) random access; O(n) insert/remove in middle (element shifting)
// Use LinkedList only when you hold node references and need frequent mid-sequence changes
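One pattern that meets both conditions is a fixed-capacity LRU cache: a Dictionary maps each key to its list node, so touching or evicting an entry is an O(1) splice. A minimal sketch (the LruCache class and its API are illustrative, not a BCL type):

```csharp
using System;
using System.Collections.Generic;

// Most recently used at the front; least recently used evicted from the back.
class LruCache<TKey, TValue> where TKey : notnull
{
    private readonly int _capacity;
    private readonly Dictionary<TKey, LinkedListNode<(TKey Key, TValue Value)>> _map = new();
    private readonly LinkedList<(TKey Key, TValue Value)> _order = new();

    public LruCache(int capacity) => _capacity = capacity;

    public bool TryGet(TKey key, out TValue value)
    {
        if (_map.TryGetValue(key, out var node))
        {
            _order.Remove(node);      // O(1) — we hold the node reference
            _order.AddFirst(node);    // move to front: most recently used
            value = node.Value.Value;
            return true;
        }
        value = default!;
        return false;
    }

    public void Put(TKey key, TValue value)
    {
        if (_map.TryGetValue(key, out var existing))
        {
            _order.Remove(existing);  // replace existing entry
            _map.Remove(key);
        }
        else if (_map.Count == _capacity)
        {
            var lru = _order.Last!;   // evict least recently used
            _map.Remove(lru.Value.Key);
            _order.RemoveLast();
        }
        _map[key] = _order.AddFirst((key, value));
    }
}
```

Every operation stays O(1) because the dictionary hands back the exact node to splice out, which is precisely what a List<T>-based design cannot offer.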

When to Use Each

Collection          | Access Pattern                    | Key Use Cases
--------------------|-----------------------------------|--------------------------------------------------
Queue<T>            | FIFO — process in arrival order   | Background jobs, message processing, print queues
Stack<T>            | LIFO — process most recent first  | Undo/redo, expression parsing, DFS traversal
LinkedList<T>       | O(1) insert/remove by node        | LRU cache eviction, playback queues with reorder
PriorityQueue<T,P>  | Process by priority               | Notification urgency, task scheduling

Common Mistakes

Mistake 1 — Using List.RemoveAt(0) as a queue (O(n) dequeue)

❌ Wrong — shifts all elements left on every dequeue:

var jobs = new List<string>();
jobs.Add("job1");
string next = jobs[0]; jobs.RemoveAt(0);  // O(n)!

✅ Correct — use Queue<T>.Dequeue() which is O(1).

Mistake 2 — Using Queue in a multi-threaded ASP.NET Core service without synchronisation

❌ Wrong — race condition: two threads dequeue the same item.

✅ Correct — use ConcurrentQueue<T> or Channel<T> for thread-safe producer-consumer in ASP.NET Core background services.
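A minimal sketch of the ConcurrentQueue<T> fix (job names are illustrative). TryDequeue is atomic, so even if many threads drain the queue concurrently, no two of them ever receive the same item:

```csharp
using System;
using System.Collections.Concurrent;

var jobs = new ConcurrentQueue<string>();
jobs.Enqueue("job1");
jobs.Enqueue("job2");

// Safe to call from multiple threads — each item is handed to exactly one caller.
while (jobs.TryDequeue(out string? job))
    Console.WriteLine($"Processing: {job}");
```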

🧠 Test Yourself

A user navigates: Home → Products → Category → Item. They click Back. Which collection models the navigation history and what operation do Back and Forward use?