Hi,

My project uses RabbitMQ and Redis. For each MQ message it starts a new
goroutine to store the data in Redis (so the following code isn't blocked).
It uses the vitess pool
<https://github.com/youtube/vitess/tree/master/go/pools> to manage Redis
connections (if each goroutine opened its own Redis connection, the server
would end up with too many Redis clients).

When the Redis server runs BGSAVE (or anything else that slows it down),
writes to Redis become slow, more and more goroutines pile up, and the
program's RES memory (as reported by top) keeps growing.

For this scenario, where the consumer is slower than the producer, is there
any solution in the Go world?

Thanks,
Linbo

--
You received this message because you are subscribed to the Google Groups "golang-nuts" group.
To unsubscribe from this group and stop receiving emails from it, send an email to golang-nuts+unsubscribe@googlegroups.com.
For more options, visit https://groups.google.com/d/optout.


  • Rui Ueyama at Aug 22, 2014 at 4:07 am
A common approach is to use a channel as a semaphore. Create a channel with
buffer size N, and send a value on the channel before starting a new
goroutine. When the channel is full, the send blocks, so no new goroutine
is created.

Each goroutine receives a value from the channel when it's done, so that a
new goroutine can be spawned.

    In code, it's like this:

c := make(chan bool, 50) // concurrency = 50
for <whatever> {
   c <- true // blocks if the channel is full
   go func() {
     defer func() { <-c }() // make room for another goroutine
     // do whatever the goroutine should do
   }()
}

  • Linbo liao at Aug 22, 2014 at 7:59 am
Thanks, that looks like a reasonable approach; I'll give it a try.


Discussion Overview
group: golang-nuts
categories: go
posted: Aug 22, '14 at 3:57a
active: Aug 22, '14 at 7:59a
posts: 3
users: 2
website: golang.org

2 users in discussion: Linbo liao (2 posts), Rui Ueyama (1 post)
