Fetching the resource can be expensive, so I want to cache the fetched
data. The basic logic: when the object gets a request, if the resource is
in the cache, it returns it; otherwise it fetches, caches, and returns it.
That's very simple in the single-threaded case. Now I want to make it
concurrent. In particular, I
want fetching requests not to block non-fetching requests, and if two
clients ask for an uncached resource at the same time, only one fetch
should be launched for that resource.
A solution I've thought of is to have a mutex-guarded map from resource
identifier to a channel which sends the resource over and over. Each
resource would then have a single goroutine which is either in the process
of fetching the resource or will cheaply send it back. Is this a reasonable
idea? Would creating a goroutine for every entry in the cache have
excessive overhead? If so, any other suggestions? This is a use case I've
run into a few times.
Hunter
--
You received this message because you are subscribed to the Google Groups "golang-nuts" group.