Question

I'm trying to design a multi-server update deployment system, and I was wondering whether there is any limitation on big binary strings. What happens if, for example, I put the contents of a 100MB file into the queue as a string?

Thanks,
Pedro

Solution 2

Searching for "RabbitMQ Large Files" turned up a significant amount of advice on the subject.

The standard response seems to be that RabbitMQ should, in theory, be able to handle messages of that size, but in practice you may find that your broker becomes unresponsive.

If you own both sides of the queue (sender and receiver), consider splitting the data into more manageable chunks, e.g. 100KB each. This will be kinder to your broker. One of the search hits above links to a 'streaming' sender written in Ruby that does this kind of chunking; a rough sketch of the idea follows.
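
Here is a minimal sketch of chunked publishing using the Python pika client. The queue name, chunk size, and header fields are assumptions made up for the example, not anything RabbitMQ itself prescribes:

    import os
    import uuid
    import pika

    CHUNK_SIZE = 100 * 1024  # 100KB per message, an arbitrary choice

    def publish_file_in_chunks(path, queue='update_chunks'):
        connection = pika.BlockingConnection(pika.ConnectionParameters(host='localhost'))
        channel = connection.channel()
        channel.queue_declare(queue=queue, durable=True)

        file_id = str(uuid.uuid4())  # lets the consumer group the chunks back together
        total = (os.path.getsize(path) + CHUNK_SIZE - 1) // CHUNK_SIZE

        with open(path, 'rb') as f:
            index = 0
            while True:
                chunk = f.read(CHUNK_SIZE)
                if not chunk:
                    break
                channel.basic_publish(
                    exchange='',
                    routing_key=queue,
                    body=chunk,
                    properties=pika.BasicProperties(
                        delivery_mode=2,  # persistent messages
                        headers={'file_id': file_id, 'index': index, 'total': total},
                    ),
                )
                index += 1

        connection.close()

The consumer would buffer chunks by file_id and reassemble the payload once it has received all total pieces.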

If you do not own both sides of the queue, consider a form of 'claim check', where the message contains only a reference to the large blob/file/data, which is kept in a storage location better suited to it.
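
As a sketch of the claim-check idea, assuming there is a shared directory (an NFS mount, object store, etc.) that both producer and consumers can reach; the paths and queue name here are placeholders:

    import json
    import shutil
    import uuid
    import pika

    SHARED_DIR = '/mnt/shared/updates'  # assumed storage reachable by producer and consumers

    def publish_claim_check(path, queue='update_refs'):
        # Copy the large file to shared storage and send only a small reference message.
        blob_name = f'{uuid.uuid4()}-{path.rsplit("/", 1)[-1]}'
        blob_path = f'{SHARED_DIR}/{blob_name}'
        shutil.copyfile(path, blob_path)

        connection = pika.BlockingConnection(pika.ConnectionParameters(host='localhost'))
        channel = connection.channel()
        channel.queue_declare(queue=queue, durable=True)
        channel.basic_publish(
            exchange='',
            routing_key=queue,
            body=json.dumps({'blob_path': blob_path}),
        )
        connection.close()

The consumer reads the small JSON message, fetches the blob from the shared location, and deletes it when finished.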

This thread may be interesting background reading: http://rabbitmq.1065348.n5.nabble.com/Can-RabbitMQ-handle-big-messages-tt566.html#a569

Other tips

I've done it, and I would not necessarily recommend it. It's probably better to store the file in something like GridFS (MongoDB) and then reference the _id in the RabbitMQ message. The consumer can then pull the file through Mongo's interface and delete it once done; a sketch of that setup is shown below.

I have this running with about 20M objects in GridFS and it's been rock solid.
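
A rough sketch of that pattern, assuming pika and pymongo; the connection strings, queue name, and function names are placeholders. The producer stores the payload in GridFS and publishes only the _id; the consumer fetches it and deletes it when done:

    import gridfs
    import pika
    from bson import ObjectId
    from pymongo import MongoClient

    db = MongoClient('mongodb://localhost:27017')['deployments']
    fs = gridfs.GridFS(db)

    def publish(path, queue='update_files'):
        with open(path, 'rb') as f:
            file_id = fs.put(f, filename=path)  # store the big payload in GridFS
        connection = pika.BlockingConnection(pika.ConnectionParameters(host='localhost'))
        channel = connection.channel()
        channel.queue_declare(queue=queue, durable=True)
        channel.basic_publish(exchange='', routing_key=queue, body=str(file_id))
        connection.close()

    def on_message(channel, method, properties, body):
        file_id = ObjectId(body.decode())
        data = fs.get(file_id).read()  # pull the payload from GridFS on the consumer
        # ... apply the update using `data` ...
        fs.delete(file_id)             # clean up once the consumer is done
        channel.basic_ack(delivery_tag=method.delivery_tag)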

Licensed under: CC-BY-SA with attribution