Question

I found this related question but my situation is a little bit different.

I have an ASP.NET application that produces long-running tasks that should be processed by a number of background processes (Windows Services). Most of the tasks are similar and can be handled by most task runners. Due to different versions of the client application (where the tasks are generated by users), some tasks can only be processed by task runners of a specific version. The web server has no knowledge of the kind of task. It just sends all the tasks to the same queue using MSMQ.

If a task enters the queue, the next free task runner should receive the task, decide whether it can handle this kind of task, remove the task from the queue, and run it.

If the runner that received the message is not able to process this kind of task, it should put the message back on the queue, so that another runner can have a look at it.

I tried to implement a conditional receive using a transaction, that I can abort if the task has the wrong format:

using (var transaction = new MessageQueueTransaction())
{
    transaction.Begin();
    var msg = queue.Receive(TimeSpan.FromSeconds(1000), transaction);
    if (CanHandle(msg))
    {
        transaction.Commit();
        // handle
    }
    else
    {
        transaction.Abort();
    }
}

It seems to work, but I don't know if this is the preferred way to go.

Another problem with this solution: if there is no free runner that can handle this message, I will receive it again and again.

Is there a way I can solve this problem using only MSMQ? The whole task data is already stored in a SQL database. The task runner accesses the task data over an HTTP API (that's why I rule out solutions like SQL Server Service Broker). The data sent to the message queue is only metadata used to identify the job.

If plain MSMQ is not the right tool, can I solve the problem using MassTransit, for example (I didn't like the fact that I would have to install and run the additional MassTransit RuntimeServices plus a SQL database for it)? Other suggestions?


Solution

The way you are utilizing MSMQ is really circumventing some of the fundamental features of the technology. If a queue message cannot be universally handled by every reader, you are incurring a sizable system performance penalty, where many of your task-processing services can get sent back empty-handed when they ask for tasks. In an extreme scenario, imagine what would happen if there were only one service that could perform task type "A." If that service were to go down, and the first task pulled out of the queue were of type "A," then your entire system would shut down.

I would suggest one of two approaches:

  1. Utilize multiple queues, as in one per task version. Hide task retrieval behind an API or some other service. Your service can request a task of one or more task types, or you can even allow it to ask for anything. The API would then be charged with figuring out which queue to pull from (i.e. map the request to a specific task type's queue, pick one at random, do some sort of round-robin, etc.)
  2. Opt for a different storage technology over queueing. If you write good enough SQL, a relational database would be more than up to the task. You just have to take care not to incur deadlocks.
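The first approach could be sketched roughly like this. This is only an illustration, assuming one private MSMQ queue per task version; the queue paths, the `VersionedQueueDispatcher` name, and the version keys are all made up:

```csharp
using System;
using System.Collections.Generic;
using System.Messaging;

// Sketch of approach 1: one queue per task version, hidden behind a small
// dispatcher API. Queue paths and names are illustrative assumptions.
public class VersionedQueueDispatcher
{
    private readonly Dictionary<string, MessageQueue> _queuesByVersion =
        new Dictionary<string, MessageQueue>
        {
            { "v1", new MessageQueue(@".\private$\tasks-v1") },
            { "v2", new MessageQueue(@".\private$\tasks-v2") },
        };

    // A runner asks only for the versions it can handle, so it is never
    // handed a task that it has to put back.
    public Message ReceiveTask(IEnumerable<string> supportedVersions,
                               MessageQueueTransaction transaction)
    {
        foreach (var version in supportedVersions)
        {
            MessageQueue queue;
            if (!_queuesByVersion.TryGetValue(version, out queue))
                continue;
            try
            {
                // Short timeout: fall through to the next version if empty.
                return queue.Receive(TimeSpan.FromSeconds(1), transaction);
            }
            catch (MessageQueueException e)
                when (e.MessageQueueErrorCode == MessageQueueErrorCode.IOTimeout)
            {
                // Nothing available for this version; try the next one.
            }
        }
        return null; // no matching task right now
    }
}
```

The point of the dispatcher is that the "can I handle this?" decision happens before a message is ever dequeued, so rejected-and-requeued messages disappear from the design entirely.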

Other tips

Can you create another queue? If yes, then I would create multiple queues: a GenericTaskQ, which will have all the tasks in it, plus an xTaskQ and a yTaskQ. Your xTaskRunner picks tasks from the generic queue, and if it cannot process one, puts it in the appropriate queue (yTaskQ, say). The same goes for yTaskRunner: if it can't handle a message, it puts it in xTaskQ. And the x and y task runners should always look in their own respective queues first; only if nothing is there should they go look in the generic queue.
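A minimal sketch of that re-routing idea, assuming private queues with the names above; the `CanHandle`/`Run` helpers and all paths are illustrative, not from the question:

```csharp
using System;
using System.Messaging;

// Sketch: check the runner's own queue first, then the generic queue;
// forward tasks it cannot handle to the other runner's queue.
public class XTaskRunner
{
    private readonly MessageQueue _ownQueue = new MessageQueue(@".\private$\xTaskQ");
    private readonly MessageQueue _genericQueue = new MessageQueue(@".\private$\GenericTaskQ");
    private readonly MessageQueue _otherQueue = new MessageQueue(@".\private$\yTaskQ");

    public void ProcessNext()
    {
        using (var tx = new MessageQueueTransaction())
        {
            tx.Begin();
            var msg = TryReceive(_ownQueue, tx) ?? TryReceive(_genericQueue, tx);
            if (msg == null)
            {
                tx.Abort();
                return;
            }
            if (CanHandle(msg))
            {
                Run(msg);
            }
            else
            {
                // Forward within the same transaction, so the task cannot
                // be lost between the receive and the re-send.
                _otherQueue.Send(msg, tx);
            }
            tx.Commit();
        }
    }

    private static Message TryReceive(MessageQueue queue, MessageQueueTransaction tx)
    {
        try { return queue.Receive(TimeSpan.FromSeconds(1), tx); }
        catch (MessageQueueException e)
            when (e.MessageQueueErrorCode == MessageQueueErrorCode.IOTimeout)
        { return null; }
    }

    private bool CanHandle(Message msg) { /* inspect task metadata */ return true; }
    private void Run(Message msg) { /* execute the task */ }
}
```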

If you cannot create multiple queues, use message (task) labels (which should be unique; we normally use a GUID) to remember which tasks a runner has already seen and cannot process. Also use Peek to check whether a message has already been rejected before actually receiving it.
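The single-queue variant could look roughly like this sketch. It assumes the sender sets a unique GUID as the message `Label`; the queue path and helper names are made up for illustration:

```csharp
using System;
using System.Collections.Generic;
using System.Messaging;

// Sketch: Peek first, and track the labels (GUIDs) this runner has
// already rejected, so it does not keep receiving the same
// unprocessable task over and over.
public class PeekingRunner
{
    private readonly MessageQueue _queue = new MessageQueue(@".\private$\tasks");
    private readonly HashSet<string> _rejectedLabels = new HashSet<string>();

    public void ProcessNext()
    {
        Message peeked;
        try
        {
            peeked = _queue.Peek(TimeSpan.FromSeconds(1));
        }
        catch (MessageQueueException e)
            when (e.MessageQueueErrorCode == MessageQueueErrorCode.IOTimeout)
        {
            return; // queue is empty
        }

        if (_rejectedLabels.Contains(peeked.Label))
            return; // already seen and rejected; leave it for another runner

        using (var tx = new MessageQueueTransaction())
        {
            tx.Begin();
            var msg = _queue.ReceiveById(peeked.Id, tx);
            if (CanHandle(msg))
            {
                Run(msg);
                tx.Commit();
            }
            else
            {
                _rejectedLabels.Add(msg.Label);
                tx.Abort(); // puts the message back on the queue
            }
        }
    }

    private bool CanHandle(Message msg) { /* inspect task metadata */ return true; }
    private void Run(Message msg) { /* execute the task */ }
}
```

Note that this only prevents one runner from re-receiving a task it has rejected; if no runner at all can handle the task, it will still sit in the queue indefinitely.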

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow