python - Connection reset on large MGET requests
When making large MGET requests to Redis (>2,000,000 arguments) using redis-py, I get the following socket error:

ConnectionError: Error 104 while writing to socket. Connection reset by peer.

I've tried different clients, but the issue remains. I read here that there is possibly a window scaling bug going on, so I tried adjusting net.ipv4.tcp_wmem and net.ipv4.tcp_rmem to have a smaller maximum window, but that didn't work either. I'm running Python 2.7.3 on Ubuntu 12.04.1 LTS, with Redis 2.6.4.
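
For reference, a minimal sketch of the kind of call that triggers the error (the key names and count here are hypothetical):

import redis

r = redis.Redis(host='localhost', port=6379, db=0)

# One MGET carrying several million arguments in a single command;
# the server drops the connection while the request is being written.
keys = ['key:%d' % i for i in range(2500000)]
values = r.mget(keys)  # ConnectionError: Error 104 while writing to socket.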
You cannot retrieve such a number of values with a single MGET; the command is not designed to sustain such a workload. It is generally a wrong idea to generate very large Redis commands:
On the server side, the command should fit in the input buffer, and the result of the command should fit in the output buffer. The input buffer is limited to 1 GB. For the output buffer, there are soft and hard limits depending on the nature of the client. Growing the buffers close to these limits is asking for trouble: Redis simply closes the connection when the limits are reached.
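
For context, the output buffer limits mentioned above correspond to the client-output-buffer-limit directive in redis.conf and can be inspected at runtime; a minimal sketch, assuming a default localhost instance:

import redis

r = redis.Redis(host='localhost', port=6379)

# Soft and hard output buffer limits per client class (normal, slave, pubsub).
print r.config_get('client-output-buffer-limit')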
On the client side, there are probably similar buffers and hard-coded limits.
Redis is a single-threaded event loop, so the execution of commands is serialized. A very large command will make Redis unresponsive to all other clients.
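
To see this serialization in practice, you can time a trivial command from a second client while another connection runs a large MGET; a rough sketch (the batch size is illustrative, and the keys need not exist):

import redis, threading, time

r1 = redis.Redis(host='localhost', port=6379)
r2 = redis.Redis(host='localhost', port=6379)

def big_command():
    # A large (but not connection-breaking) batch issued on one connection.
    r1.mget(['key:%d' % i for i in range(200000)])

t = threading.Thread(target=big_command)
t.start()
time.sleep(0.1)           # let the big MGET reach the server first
start = time.time()
r2.ping()                 # waits until the MGET has been fully processed
print 'PING latency: %.3fs' % (time.time() - start)
t.join()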
Should you want to retrieve a massive amount of data, you are supposed to pipeline several GET or MGET commands. For example, the following code can be used to retrieve an arbitrary number of items while minimizing the number of roundtrips and server-side CPU consumption:
import redis

N_PIPE = 50   # number of MGET commands per pipeline execution
N_MGET = 20   # number of keys per MGET command

# Return a dictionary built from the input array containing the keys
def massive_get(r, array):
    res = {}
    pipe = r.pipeline(transaction=False)
    i = 0
    while i < len(array):
        keys = []
        for n in range(0, N_PIPE):
            k = array[i:i+N_MGET]
            keys.append(k)
            pipe.mget(k)
            i += N_MGET
            if i >= len(array):
                break
        for k, v in zip(keys, pipe.execute()):
            res.update(dict(zip(k, v)))
    return res

# Example: retrieve keys 0 to 1022
pool = redis.ConnectionPool(host='localhost', port=6379, db=0)
r = redis.Redis(connection_pool=pool)
array = range(0, 1023)
print massive_get(r, array)
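
Note that the pipeline is created with transaction=False, so redis-py sends the batched commands without wrapping them in MULTI/EXEC. With the values above, each execute() round trip carries 50 MGET commands of 20 keys each (1000 keys total), which keeps both the buffers and the per-command execution time bounded; N_PIPE and N_MGET can be tuned to trade fewer roundtrips against larger individual commands.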