Question

Inspired by my current problem, which is almost identical to this one: Analogue of Queue.Peek() for BlockingCollection when listening to consuming IEnumerable<T> (with the difference that I am currently using ConcurrentQueue<T> instead of BlockingCollection<T>), I wonder: what is a use case for ConcurrentQueue<T>.TryPeek()?

Of course I mean a use case that does not require manual lock(myQueue) statements to serialize queue accesses, since TPL's concurrent collections are meant to replace that kind of locking.


Solution

I had an application that used ConcurrentQueue<T>.TryPeek to good effect. One thread was set up to monitor the queue. Mostly it was looking at queue size, but we also wanted to get an idea of latency. Because the items in the queue had a time stamp field that said what time they were put into the queue, my monitoring thread could call TryPeek to get the item at the head of the queue, subtract the insertion time from the current time, and tell me how long the item had been in the queue. Over time and many samples, that gave me a very clear picture of how long it was taking for a received item to be processed.

It didn't matter that some other thread might dequeue the item while my monitoring code was still examining it.
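A minimal sketch of that monitoring idea; the `WorkItem` type, its `EnqueuedUtc` field, and the helper names are my own, not from the original application:

```csharp
using System;
using System.Collections.Concurrent;

// Hypothetical work item that stamps itself with its enqueue time.
class WorkItem
{
    public DateTime EnqueuedUtc { get; } = DateTime.UtcNow;
}

static class QueueMonitor
{
    // Returns how long the current head item has been waiting, or null if
    // the queue is empty. Another thread may dequeue the item while we are
    // looking at it; for periodic latency sampling that is harmless.
    public static TimeSpan? HeadLatency(ConcurrentQueue<WorkItem> queue)
    {
        if (queue.TryPeek(out WorkItem head))
            return DateTime.UtcNow - head.EnqueuedUtc;
        return null;
    }
}
```

A monitoring thread would call `HeadLatency` on a timer and aggregate the samples to get the latency picture described above.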

I can think of a few other scenarios in which it would be useful to see what's at the head of the queue, even though it might be pulled off immediately.

OTHER TIPS

I have a ConcurrentQueue where many threads may Enqueue, but I restrict TryPeek and TryDequeue to a single thread at a time with a lock:

lock (dequeueLock)
{
    // Peek first; only dequeue if the head item is below the threshold.
    if (queue.TryPeek(out item) && item.Value <= threshold)
        queue.TryDequeue(out item);
}

Note: other threads may continue to Enqueue while this code runs.

It would be nicer to have an atomic peek-check-dequeue operation, but the lock works fine for my scenario.

TryPeek can be used to wait for a particular object to reach the head of the queue, whereas TryDequeue removes whatever object happens to be there. For instance, I wrote a multithreaded web server in which, when authorization is enabled for certain requests, those requests must at one point be processed in the order they were received. I didn't want to lock up the whole thread function, or even half of it, just so that some clients' requests could be processed in order.

So I created a Dictionary<string, ConcurrentQueue<HttpListenerContext>>. At the very beginning of the server thread, I take a short lock and check whether authorization will be required; if so, I store the HttpListenerContext in a queue keyed by the client IP, so that different clients don't block each other's threads unnecessarily. Then I process the headers and compute the hashes as normal; since a page may make two or three requests over AJAX and WebSocket connections after the initial one, it is better to multithread the hashing of the authorization information (a digest authorization I implemented for HttpListener myself, so that I am not restricted to using Active Directory). Then, when the authorization check requires that the client nonce count be exactly one greater than in that client's previous request (a security feature), I use the queue I created: TryPeek in a loop with Thread.Yield() waits until that thread's HttpListenerContext is first in the queue, the thread finishes the authorization, and then it dequeues the context.

In short, it can be used in multithreaded code where, for most of each thread, you want things to run in parallel to take advantage of multiple cores, but where some portion of the work must be done back in the original order.
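The per-client ordering trick above can be sketched roughly as follows. This is my own simplified reconstruction, not the author's code: `OrderedStage`, `Enter`, `Exit`, and the use of plain `object` tokens in place of HttpListenerContext are all illustrative:

```csharp
using System.Collections.Concurrent;
using System.Threading;

class OrderedStage
{
    // One queue per client; the key would be the client IP in the
    // web-server scenario described above.
    readonly ConcurrentDictionary<string, ConcurrentQueue<object>> queues =
        new ConcurrentDictionary<string, ConcurrentQueue<object>>();

    // Called when a request reaches the section that must run in order.
    public void Enter(string clientKey, object token)
    {
        var q = queues.GetOrAdd(clientKey, _ => new ConcurrentQueue<object>());
        q.Enqueue(token);
        // Spin until our token reaches the head of this client's queue,
        // i.e. all earlier requests from the same client have passed through.
        while (!(q.TryPeek(out object head) && ReferenceEquals(head, token)))
            Thread.Yield();
    }

    // Called when the ordered section is finished, releasing the next request.
    public void Exit(string clientKey)
    {
        if (queues.TryGetValue(clientKey, out var q))
            q.TryDequeue(out _);
    }

    public int Pending(string clientKey) =>
        queues.TryGetValue(clientKey, out var q) ? q.Count : 0;
}
```

Requests from different clients never wait on each other, because each client key gets its own queue.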

My understanding is that you use this method when you want to peek but are not sure there is an item in the queue. Peek on an ordinary empty Queue<T> throws an exception; TryPeek instead returns false when no item is there. This is extremely useful in multithreaded scenarios, where another thread may dequeue the item between your check for an empty queue and the actual peek.
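The contrast between the two behaviors can be demonstrated directly; this small sketch is mine, not from the answer:

```csharp
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;

class PeekDemo
{
    static void Main()
    {
        var plain = new Queue<int>();
        try
        {
            plain.Peek();   // Queue<T>.Peek throws on an empty queue
        }
        catch (InvalidOperationException)
        {
            Console.WriteLine("Queue<T>.Peek threw on empty queue");
        }

        // ConcurrentQueue<T>.TryPeek never throws; it reports emptiness
        // via its return value instead.
        var concurrent = new ConcurrentQueue<int>();
        bool got = concurrent.TryPeek(out int head);
        Console.WriteLine($"TryPeek returned {got}"); // TryPeek returned False
    }
}
```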

Try it

// Note: GetConsumingEnumerable() removes items from the collection, so this
// takes (rather than peeks) the next item, blocking until one is available
// or the collection is marked complete, in which case FirstOrDefault
// returns default(T).
T item = bc.GetConsumingEnumerable().FirstOrDefault();
if (item != null)   // assumes T is a reference type
{
    //...
}

Having a look at an object to see whether it is valid before taking it out is an option; just remember that ConcurrentQueue keeps a reference to each item in its internal segments, so dequeuing an object does not necessarily release it for garbage collection. If you then memory-profile, as I did with my ConcurrentQueue, you will see something like this.

[memory profiler screenshot]

Notice the ConcurrentQueueSegment with 11,060 instances while the queue only holds 8.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow