I am developing a Python web crawler using RabbitMQ. The server sends a list of URLs to be crawled to a client, and the client sends back the results after crawling. My problem is that when a client is processing a request and dies suddenly, all the information about the request is lost, i.e. RabbitMQ removes the message from the queue as soon as the client takes it. I want the setup to be reliable: even if the client dies, the message should remain in the queue so it can be allocated to another client.
Please reply as soon as possible; hoping for a positive response on this issue.
Thanks