Data loss when mass-inserting data into Redis #3466
Comments
@phenix3443 How much memory do you have?
@charsyam Redis uses about 1.98 GB of memory. I also tried using a single process to handle all 200 files, but data is still lost.
@phenix3443 Did you check whether there are duplicate keys?
There are no duplicate keys. Wait a moment, I will check the result each process gets back from redis-server.
@charsyam, the results returned from redis-server do show some errors, so it looks like I need to solve those problems first. Thank you very much.
@charsyam, I find I can only store about 3 GB of data in Redis with the default config. After that I cannot store any more, and the Redis server disconnects the redis-cli --pipe connection. Is there any suggestion?
@phenix3443 How about running it on a 64-bit machine?
This is a 64-bit machine. I found maxmemory set in the configuration file; that may be causing the problem. I am trying to fix it.
@phenix3443 Yes, if you are already on 64-bit, increase the maxmemory value or remove it.
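A quick way to confirm whether maxmemory is what cuts off the inserts is to read the setting at runtime. This is a minimal sketch using the redis-py client (an assumption; the same check works with `redis-cli CONFIG GET maxmemory`); a value of 0 means no limit, and with the default `noeviction` policy writes start failing once the limit is reached.

```python
import redis

r = redis.Redis(host="localhost", port=6379)

print(r.config_get("maxmemory"))          # e.g. {'maxmemory': '3221225472'} for a 3 GB cap
print(r.config_get("maxmemory-policy"))   # 'noeviction' makes further writes return errors

# r.config_set("maxmemory", 0)            # lift the limit at runtime, or edit/remove
                                          # the maxmemory line in redis.conf and restart
```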
I use Python multiprocessing to insert a lot of records into Redis. Each process uses the method introduced in this article http://redis.io/topics/mass-insert, but in the end Redis loses some data.
When the total is 50,000 or 200,000 records, every file holds 10,000 records (so there are 5 or 20 files) and each process handles one file; in that case the data in Redis is correct.
But when the total is 2,000,000 records, Redis loses some data.
Code as follows (see also the sketch after the attachment):
code.py.txt
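The attachment is not inlined here, so below is a minimal sketch of the approach described: each worker process reads one input file, converts its lines into the Redis wire protocol as shown in http://redis.io/topics/mass-insert, and pipes the result into `redis-cli --pipe`. The file naming, the tab-separated "key value" format, and the SET-only payload are assumptions for illustration, not the reporter's actual code.

```python
import glob
import subprocess
from multiprocessing import Pool


def to_resp(*args):
    """Encode one command in the Redis protocol (RESP)."""
    out = "*%d\r\n" % len(args)
    for a in args:
        a = str(a)
        out += "$%d\r\n%s\r\n" % (len(a.encode("utf-8")), a)
    return out


def load_file(path):
    """Convert one input file to RESP and feed it to redis-cli --pipe."""
    payload = []
    with open(path) as f:
        for line in f:
            key, value = line.rstrip("\n").split("\t", 1)
            payload.append(to_resp("SET", key, value))
    proc = subprocess.run(
        ["redis-cli", "--pipe"],
        input="".join(payload).encode("utf-8"),
        stdout=subprocess.PIPE,
    )
    # redis-cli --pipe prints "errors: N, replies: M"; keeping this output makes
    # per-file failures (e.g. OOM errors once maxmemory is reached) visible.
    return path, proc.stdout.decode("utf-8", "replace")


if __name__ == "__main__":
    files = sorted(glob.glob("records_*.txt"))   # assumed input file naming
    with Pool(len(files) or 1) as pool:
        for path, report in pool.imap_unordered(load_file, files):
            print(path, report.strip())
```

Checking the "errors" count in each report is the quickest way to see whether the missing records were rejected by the server rather than silently dropped by the pipe.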