Hello,

Are there any known memory leaks when using shovels with 2.6.1? We're
seeing a slow leak that grows over 6-8 days on a few instances of
rabbitmq that are using the shovel plugin and we're not exactly sure
why. I'm not sure if this is related to the memory leak we previously
reported a few weeks ago that was fixed in 2.7.1

These instances are handling between 5 and 7 million messages a day.

The shovel config (and rabbit config) we're using is:


{rabbitmq_shovel,
 [{shovels,
   [{engine_my_prod_shovel,
     [{sources,
       [{brokers, ["amqp://my-user-prod:rabbitmq-my-prod@localhost/my-prod"]},
        {declarations,
         ['queue.declare',
          {'queue.bind',
           [{exchange, <<"my-exchange">>},
            {queue,    <<"my-queue">>}]}]}]},
      {destinations,
       [{brokers,
         ["amqp://my-user-prod:rabbitmq-my-prod@aus-svc3/my-prod?heartbeat=5",
          "amqp://my-user-prod:rabbitmq-my-prod@aus-svc4/my-prod?heartbeat=5"]},
        {declarations,
         [{'exchange.declare',
           [{exchange, <<"my-exchange">>},
            {type,     <<"direct">>},
            durable]},
          {'queue.bind',
           [{exchange, <<"my-exchange">>},
            {queue,    <<"my-queue">>}]}]}]},
      {queue, <<"my-queue">>},
      {prefetch_count, 1000},
      {ack_mode, on_confirm},
      {publish_properties, [{delivery_mode, 2}]},
      {publish_fields, [{exchange, <<"my-exchange">>},
                        {routing_key, <<"">>}]},
      {reconnect_delay, 5}
     ]}
   ]}
 ]}
].


I have the output of the following if it's helpful (though I won't
email it to the list; it's about 5M)

io:format("~p", [lists:reverse(lists:sort([{process_info(Pid, memory),
Pid, process_info(Pid)} || Pid <- processes()]))]).
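
A trimmed variant of that one-liner (my sketch, not something from the thread)
keeps only the top 20 processes by memory, so the dump stays small enough to
share:

io:format("~p~n",
          [lists:sublist(
             lists:reverse(lists:sort(
               [{process_info(Pid, memory), Pid,
                 process_info(Pid, [registered_name, current_function,
                                    message_queue_len])}
                || Pid <- processes()])),
             20)]).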


Travis
--
Travis Campbell
travis at ghostar.org

  • Matthias Radestock at Dec 20, 2011 at 7:20 pm
    Travis,
    On 20/12/11 16:42, Travis wrote:
    Are there any known memory leaks when using shovels with 2.6.1?
    Nope.
    We're seeing a slow leak that grows over 6-8 days on a few instances of
    rabbitmq that are using the shovel plugin and we're not exactly sure
    why. I'm not sure if this is related to the memory leak we previously
    reported a few weeks ago that was fixed in 2.7.1
    Is the shovel publishing to mirrored queues? If so then it's
    probably the same problem.
    The shovel config (and rabbit config) we're using is:
    [...]
    Two observations:

    1) there is a stray 'queue.declare' in the first shovel config. This
    will cause a new queue to be created every time the shovel connection is
    interrupted. You would surely have seen those queues in the management
    UI and rabbitmqctl, so I doubt that's the cause of the leak.
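
    A possible fix, sketched here rather than spelled out in the thread, is to
    name the queue in that declaration (or drop it if the queue already
    exists), so a reconnect re-declares the same queue instead of creating a
    server-named one:

    {declarations,
     [{'queue.declare', [{queue, <<"my-queue">>}]},
      {'queue.bind',
       [{exchange, <<"my-exchange">>},
        {queue,    <<"my-queue">>}]}]}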

    2) Instead of establishing a connection to 'localhost', specify no
    host at all, e.g. "amqp://my-user-prod@/my-prod"; that way a more
    efficient and less fault-prone direct connection is used instead of a
    network connection (which is also why no password is required).
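
    Applied to the sources section of that config, the suggestion would look
    roughly like this (my rendering, assuming the same user and vhost, plus
    the named queue.declare from point 1):

    {sources,
     [{brokers, ["amqp://my-user-prod@/my-prod"]},
      {declarations,
       [{'queue.declare', [{queue, <<"my-queue">>}]},
        {'queue.bind',
         [{exchange, <<"my-exchange">>},
          {queue,    <<"my-queue">>}]}]}]}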


    Regards,

    Matthias.
  • Travis at Dec 30, 2011 at 6:59 pm

    On Tue, Dec 20, 2011 at 1:20 PM, Matthias Radestock wrote:
    We're seeing a slow leak that grows over 6-8 days on a few instances of
    rabbitmq that are using the shovel plugin and we're not exactly sure
    why. I'm not sure if this is related to the memory leak we previously
    reported a few weeks ago that was fixed in 2.7.1

    Is the shovel publishing to mirrored queues? If so then it's probably
    the same problem.
    It is.

    I upgraded to a patched version of 2.6.1 that has the fix y'all
    created, but I'm still seeing a slow increase of memory usage for no
    apparent reason.
    Two observations:

    1) there is a stray 'queue.declare' in the first shovel config. This will
    cause a new queue to be created every time the shovel connection is
    interrupted. You would surely have seen those queues in the management UI
    and rabbitmqctl, so I doubt that's the cause of the leak.
    Would this be seen as queues named amq.gen-<randomname>? We've been
    plagued by those and I never could figure out why.
    2) Instead of establishing a connection to 'localhost', specify no host
    at all, e.g. "amqp://my-user-prod@/my-prod"; that way a more efficient
    and less fault-prone direct connection is used instead of a network
    connection (which is also why no password is required).
    Interesting. Thanks!

    Travis
    --
    Travis Campbell
    travis at ghostar.org
  • Matthias Radestock at Dec 30, 2011 at 9:44 pm
    Travis,
    On 30/12/11 18:59, Travis wrote:
    I upgraded to a patched version of 2.6.1 that has the fix y'all
    created, but I'm still seeing a slow increase of memory usage for no
    apparent reason.
    Please upgrade to 2.7.1. That way, if there still is a problem, we
    have a sound basis for continuing our investigation.
    Two observations:

    1) there is a stray 'queue.declare' in the first shovel config.
    [...]
    Would this be seen as queues named amq.gen-<randomname>?
    Yes. And obviously if nothing clears those up they constitute a leak.
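
    One way to spot them, suggested here rather than in the thread, is to
    list the queues in that vhost and filter for server-generated names, e.g.:

    rabbitmqctl list_queues -p my-prod name messages consumers | grep '^amq\.gen'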

    Regards,

    Matthias.
