This is more of a theoretical question:
I've read and re-read Rob Pike's 2012 talk on Go concurrency patterns. One statement that struck me: "Buffering removes synchronization."
So, I haven't used buffered channels much, but I'm wondering whether this statement means what I think it does.
Suppose I build an app that is concurrent and technically sound, i.e. correct and free of race conditions, implemented using unbuffered channels only. Let's assume the app is written for true correctness in the sense of proper communication and synchronization.
Now let's say I go into my app and tweak some of the channels to be buffered, for performance or whatever reason. If buffering removes synchronization, does that mean my program may no longer be correct, because I can no longer guarantee that certain areas of code run in synchronization?
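To make the question concrete, here's a toy sketch of the kind of tweak I mean (the names, sleep, and capacity are made up for illustration, not from the talk). With an unbuffered channel the send doesn't return until a receiver has taken the value; bump the capacity to 1 and that guarantee is gone:

package main

import (
    "fmt"
    "time"
)

func main() {
    // Unbuffered: ch <- "hello" below blocks until the receiver takes the
    // value, so by the time "send completed" prints, the value has
    // definitely been received. Change this to make(chan string, 1) and
    // the send returns immediately, so main can print "send completed"
    // before the goroutine has received anything.
    ch := make(chan string)

    done := make(chan struct{})
    go func() {
        time.Sleep(100 * time.Millisecond)
        fmt.Println("received:", <-ch)
        close(done)
    }()

    ch <- "hello" // rendezvous with cap 0; fire-and-forget with cap 1
    fmt.Println("send completed")
    <-done // only here so main doesn't exit before the goroutine prints
}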
I guess, if this is true, does that mean buffered channels are dangerous? I thought they were still synchronized, and that what changed was that they allow you to send asynchronously: a message going into the channel is queued in the buffer, requiring more memory, but the receiver will still block while waiting for a message. Is my understanding correct?
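And to check that mental model, here's a second toy program (again with made-up values) that I'd expect to print 1 and 2 immediately, then pause briefly before printing 3, because receives still block on an empty buffered channel even though sends don't block until the buffer is full:

package main

import (
    "fmt"
    "time"
)

func main() {
    ch := make(chan int, 2) // buffered: sends complete without a receiver until full

    // Both of these return immediately; the values are queued in the buffer.
    ch <- 1
    ch <- 2

    go func() {
        time.Sleep(100 * time.Millisecond)
        ch <- 3 // sent later, after main has started draining the buffer
    }()

    // Receives still block on an empty channel: the first two values come
    // straight out of the buffer, and the third receive waits until the
    // goroutine sends it.
    for i := 0; i < 3; i++ {
        fmt.Println(<-ch)
    }
}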
Anybody want to take a stab?
I appreciate the answers as usual, and I'm having more fun with this language than with any language before.