Question

I'm interacting with an Azure storage account via CloudQueueClient, similar to the approach described in this MSDN example:

CloudStorageAccount storageAccount = CloudStorageAccount.Parse("some connection string");
CloudQueueClient queueClient = storageAccount.CreateCloudQueueClient();

and to add an element to a queue (with some boilerplate removed):

var queue = queueClient.GetQueueReference("queuename");
var message = new CloudQueueMessage("myString");
await queue.AddMessageAsync(message);

So I can add "myString" to my queue. Great. And if I repeatedly call those lines of code, I can add "myString" lots of times. Also good, but inefficient.

How do I add multiple items to the queue in one message?

I've researched this a bit and found Entity Group Transactions, which may be a suitable fit. However, this looks very different from what I've been doing, and it doesn't really give me any code examples. Is there any way to use this and still use the Microsoft.WindowsAzure.StorageClient library to construct my messages?

Solution 2

I believe there is no real need to add multiple items to a queue in one message, because the best practice is to keep each message, and its corresponding handler, as small as possible.

It is very hard to talk about inefficiency here. But when the number of messages grows to the point where it really impacts performance, you could use the BatchFlushInterval property to batch your messages. Otherwise, follow Best Practices for Performance Improvements Using Service Bus Brokered Messaging.
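
For context, BatchFlushInterval is a Service Bus (brokered messaging) setting; it does not exist on the storage-queue CloudQueueClient. A minimal sketch of enabling it, assuming the classic WindowsAzure.ServiceBus SDK, with the namespace address, key, queue name and 50 ms flush window all being placeholder assumptions:

// Sketch only: namespace address, credentials and flush window are placeholders.
var settings = new MessagingFactorySettings
{
    TokenProvider = TokenProvider.CreateSharedAccessSignatureTokenProvider("keyName", "sharedAccessKey"),
    NetMessagingTransportSettings = new NetMessagingTransportSettings
    {
        // Sends issued within this window are flushed to the service as one batch.
        BatchFlushInterval = TimeSpan.FromMilliseconds(50)
    }
};

MessagingFactory factory = MessagingFactory.Create(
    new Uri("sb://yournamespace.servicebus.windows.net/"), settings);
QueueClient serviceBusQueue = factory.CreateQueueClient("queuename");

// Individual sends made close together are transparently batched on the wire.
await serviceBusQueue.SendAsync(new BrokeredMessage("myString"));
await serviceBusQueue.SendAsync(new BrokeredMessage("myOtherString"));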

UPDATE:

By batching messages yourself, for example within a list, you would need to solve at the very least the following problems, which may result in an unmanageable solution:

  • Keep track of the size of the message so as not to exceed the maximum message size limit
  • Find ways to abandon, complete and move particular messages to a dead letter queue
  • Implement a batching strategy yourself
  • Keep track of large message processing times and implement locking if processing takes too long

PS If you could outline the purpose of your question, then a better solution might be found.

OTHER TIPS

One way you can send multiple messages is to build a wrapper class that contains a list of individual string values (or objects), serialize it into, for example, a JSON object, and send that as the payload for the message. The issue here is that, depending on the size of your objects, you could eventually exceed the size limitation of a message (64 KB for a storage queue message), so that's not a recommended implementation.
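
As a rough sketch of that wrapper approach (the MessageBatch class, the use of Newtonsoft.Json, and the size guard are illustrative assumptions, not code from the question):

// Requires Newtonsoft.Json (JsonConvert) and System.Text (Encoding).
public class MessageBatch
{
    public List<string> Items { get; set; } = new List<string>();
}

var batch = new MessageBatch();
batch.Items.Add("myString");
batch.Items.Add("myOtherString");

string payload = JsonConvert.SerializeObject(batch);

// A single storage queue message is capped at 64 KB, so guard the payload size.
if (Encoding.UTF8.GetByteCount(payload) > 64 * 1024)
    throw new InvalidOperationException("Batch exceeds the 64 KB message limit.");

await queue.AddMessageAsync(new CloudQueueMessage(payload));

// On the receiving side, deserialize and process each item individually:
// var received = JsonConvert.DeserializeObject<MessageBatch>(message.AsString);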

At some point I was dealing with a massively distributed system that needed to send a lot of messages per second. Instead of batching multiple messages, we ended up sharding across a set of message queues spread over multiple storage accounts. The scalability requirements drove us to this implementation.
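
A minimal sketch of that sharding idea, assuming four queues in a single storage account for brevity (the queue names, shard count and routing key are placeholders):

// Build references to N shard queues; in practice these can live in different storage accounts.
CloudQueue[] shards = Enumerable.Range(0, 4)
    .Select(i => queueClient.GetQueueReference("queuename-" + i))
    .ToArray();

// Route each message to a shard, for example by hashing a key taken from the message.
int shardIndex = Math.Abs("someMessageKey".GetHashCode()) % shards.Length;
await shards[shardIndex].AddMessageAsync(new CloudQueueMessage("myString"));

Consumers then read from all shards (or are themselves partitioned across them), which is where the scale-out comes from.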

Keep in mind that Chunky vs Chatty applies to sending more information in order to avoid roundtrips, so as to optimize performance. Message queuing is not as much about performance; it is more about scalability and distribution of information. In other words, eventual consistency and distributed scale out are the patterns to favor in this environment. I am not saying you should ignore chunky vs chatty, but you should apply it where it makes sense.

If you need to send 1 million messages per second, for example, then chunkier calls are an option; sharding is another. I typically favor sharding because there are fewer scalability boundaries. But in some cases, if the problem is simple enough, chunking might suffice.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow