We have a Redis server running on a fairly fast system with 4 GB of RAM and an SSD.
There are fewer than 400,000 records in the database for now.
When we run a piece of code that makes thousands of connections, it
consistently fails with a "Redis server went away" error after about 4,000
records have been inserted.
We have two ways of doing this: one works and one doesn't. We're
trying to figure out why the latter fails with the error above.
The first way that works:
1. We connect to the server.
2. We process 40,000 database inserts.
3. We close the server connection.
That way works every time and does what it's supposed to.
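For reference, the working pattern boils down to something like this. This is a minimal stdlib-only sketch that speaks raw RESP to the server; the real code uses a client library, and the function names here are just illustrative:

```python
import socket

def encode_command(*parts):
    """Encode a Redis command as a RESP array of bulk strings."""
    out = b"*%d\r\n" % len(parts)
    for part in parts:
        raw = part if isinstance(part, bytes) else str(part).encode()
        out += b"$%d\r\n%s\r\n" % (len(raw), raw)
    return out

def bulk_insert(records, host="localhost", port=6379):
    """Open ONE connection, push every insert through it, then close."""
    with socket.create_connection((host, port)) as sock:
        for key, value in records:
            sock.sendall(encode_command("SET", key, value))
            sock.recv(4096)  # read the +OK reply before sending the next
```

One TCP connect, 40,000 commands, one close: the only socket churn is a single connection for the whole run.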
The second way that fails:
1. We run through several recursive loops in the program, and each function
has its own server connect, process, and close steps.
2. After about 4,000 inserts, the program stops and errors out with the
"Redis server went away" error.
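The failing pattern looks roughly like this (same stdlib-only sketch as an illustration; `insert_one` and `recursive_import` are hypothetical names, not our actual code):

```python
import socket

def set_command(key, value):
    """RESP encoding of SET <key> <value>."""
    k, v = key.encode(), value.encode()
    return b"*3\r\n$3\r\nSET\r\n$%d\r\n%s\r\n$%d\r\n%s\r\n" % (len(k), k, len(v), v)

def insert_one(key, value, host="localhost", port=6379):
    """Anti-pattern: a fresh connect/close for every single insert."""
    with socket.create_connection((host, port)) as sock:
        sock.sendall(set_command(key, value))
        sock.recv(4096)
    # On close, the closing side's socket enters TIME_WAIT (about 60 s
    # on Linux).  Thousands of these within a few seconds can exhaust
    # the client's ephemeral ports, at which point new connects fail --
    # which a client library may surface as "server went away".

def recursive_import(batches):
    """Mirrors our structure: each level reconnects for its own work."""
    if not batches:
        return
    for key, value in batches[0]:
        insert_one(key, value)
    recursive_import(batches[1:])
```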
I know some people will say to just use the first method, since it works,
but this might be an indicator of a larger problem that we want to resolve
before going into production. The program only takes a few seconds to
complete, which means we are doing thousands of connect/disconnect cycles
within a few seconds. Maybe there is some timeout after a connection is
closed that takes a second or two to complete, so the closed connections
pile up?
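One way to test that hypothesis, assuming a Linux client (availability of `ss` vs. `netstat` varies by system): run this while the import is going and see whether the count climbs into the thousands.

```shell
# Count client sockets lingering in TIME_WAIT while the import runs.
# A count in the thousands suggests the client is burning through its
# ephemeral ports, rather than Redis itself rejecting connections.
(ss -tan 2>/dev/null || netstat -tan 2>/dev/null) | grep -ci 'time[_-]wait' || true
```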
Any advice would be appreciated. I saw that our redis.conf file has a
commented-out `maxclients` line controlling how many clients can connect,
and the line says 10,000.
Should I try uncommenting it and raising that number to, say, 50,000?
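For reference, that change would look like the following in redis.conf. Though if that limit were being hit, my understanding is the server would normally reply with a "max number of clients reached" error rather than going away, so it may not be the fix:

```conf
# redis.conf -- uncomment and raise the client connection limit.
maxclients 50000
```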
Thank you for any help,
You received this message because you are subscribed to the Google Groups "Redis DB" group.
To view this discussion on the web visit https://groups.google.com/d/msg/redis-db/-/WICAi9hZyBwJ.