
use pagination to speed up #92

Open
gouchaoer opened this issue Apr 8, 2016 · 7 comments
@gouchaoer

If there are thousands of keys in redis, the web frontend will load all of the data from the backend, which is slow and wasteful.

1. Use pagination on keys that share the same prefix.
2. The user does not actually need to load every key in the left-side tree, so only load the keys the user selects.

phpredisadmin is difficult to use in a development environment because it loads all keys!

@erikdubbelboer
Owner

Before, when we used the KEYS command, pagination was never really possible. Now, with the SCAN command, it is. But a lot needs to be modified for this to work, and at the moment I don't have time for this, I'm afraid.
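
For reference, a minimal sketch of what cursor-based paging over SCAN could look like with the phpredis extension; the function name, pattern, and page size of 100 are illustrative only, not anything phpRedisAdmin currently implements.

```php
<?php
// Minimal sketch: fetch one "page" of keys with SCAN instead of KEYS.
// $cursor comes from the client (0 on the first request); the updated
// cursor is returned so the next request can continue where this one stopped.
function scanPage(Redis $redis, int $cursor, string $pattern = '*', int $pageSize = 100): array
{
    $it = $cursor === 0 ? null : $cursor;           // phpredis starts a scan with a null iterator
    $keys = $redis->scan($it, $pattern, $pageSize); // COUNT is a hint; batches can be smaller or larger
    return [
        'keys'   => $keys === false ? [] : $keys,
        'cursor' => (int) $it,                      // 0 means the iteration is complete
    ];
}

$redis = new Redis();
$redis->connect('127.0.0.1', 6379);
$page = scanPage($redis, 0, 'session:*');           // first page; pass back $page['cursor'] for the next one
```

Unlike KEYS, this never asks the server for the full keyspace in one call, which is what makes per-page loading feasible.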

@gouchaoer
Author

I suggest that anyone who uses phpredisadmin to visualize redis in a development environment avoid putting too many keys in the global keyspace. You can always use a hash instead, unless you need to set an expire time on individual keys.
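
To illustrate the workaround (the hash name and fields below are made-up examples, using the phpredis extension):

```php
<?php
// Group development data under one hash instead of many top-level keys,
// so the key tree shown by phpredisadmin stays small.
$redis = new Redis();
$redis->connect('127.0.0.1', 6379);

$json = json_encode(['name' => 'alice']);

// Instead of one top-level key per record: $redis->set('user:1001', $json);
$redis->hSet('dev:cache', 'user:1001', $json);
$value = $redis->hGet('dev:cache', 'user:1001');

// The caveat from the comment above: you can only expire the whole hash,
// not its individual fields.
$redis->expire('dev:cache', 3600);
```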

@GregOriol
Contributor

phpredisadmin is completely broken for me on a setup where there are >100'000 keys :-(

@sombahadurlimbu

sombahadurlimbu commented Jun 13, 2018

Without pagination, it's difficult to work with a large set of keys. Any news on when this will be included? Also, in the case of clusters, rather than searching for keys or key patterns on individual redis servers, a common interface to search keys across all the cluster nodes would be great. Right now, if we have to search for a key, we have to look on every node of the cluster until we find it.

@gouchaoer
Author

@sombahadurlimbu if you find that you need to search for keys in redis, you probably need a relational database such as mysql (for large scale, maybe tidb). Redis is a cache, and we should not query or iterate over things in redis.

@erikdubbelboer
Owner

Yes, pagination would definitely be a good addition for big data sets.

In theory, the javascript code should make ajax requests that result in SCAN operations in redis to fetch all the keys. This way PHP won't run out of memory or time.

But this is quite a big change and I just don't have the time to write this at the moment I'm afraid 😞

Pull requests are always welcome!
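
A rough sketch of what such an ajax-backed endpoint might look like; the parameter names and JSON shape are hypothetical and not part of phpRedisAdmin:

```php
<?php
// Hypothetical ajax endpoint: return one page of keys as JSON plus the SCAN
// cursor the browser should send back to request the next page.
$redis = new Redis();
$redis->connect('127.0.0.1', 6379);

$cursor  = isset($_GET['cursor'])  ? (int) $_GET['cursor']     : 0;
$pattern = isset($_GET['pattern']) ? (string) $_GET['pattern'] : '*';

$it   = $cursor === 0 ? null : $cursor;
$keys = $redis->scan($it, $pattern, 100);  // 100 is a COUNT hint, not an exact page size

header('Content-Type: application/json');
echo json_encode([
    'keys'   => $keys === false ? [] : $keys,
    'cursor' => (int) $it,                 // 0 tells the client the iteration is finished
]);
```

The browser would keep requesting pages until the returned cursor is 0, appending keys to the tree as they arrive, so PHP only ever holds one batch in memory.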

@sombahadurlimbu

sombahadurlimbu commented Jun 28, 2018

@gouchaoer
Why would I need a mysql/rdbms? We use redis for caching purposes and we have a three-master-node setup. Since the data is sharded across nodes, there should additionally be a single search interface that can look for a key or key pattern on multiple nodes together.

Developers don't care which redis node a key is sharded to; they just want to know whether the data exists and whether its value is correct. Reddie provides one such interface at the cluster level.

I ended up creating a shell script, called from our nodejs api, to do this search, since our major issue was the huge number of keys combined with phpredisadmin not supporting pagination, and another problem was searching for a single key or pattern across multiple nodes.

It would be great if these two things could be addressed in phpredisadmin (pagination, and search across all master nodes of a cluster in addition to the per-node search phpredisadmin does by default).
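
For what it's worth, a cluster-wide search along these lines can be sketched with the phpredis RedisCluster API, scanning each master node in turn; the seed addresses and pattern below are illustrative assumptions:

```php
<?php
// Sketch: run SCAN on every master node of the cluster and merge the matches,
// remembering which node each key lives on.
$cluster = new RedisCluster(null, ['127.0.0.1:7000', '127.0.0.1:7001', '127.0.0.1:7002']);

$pattern = 'order:*';
$matches = [];

foreach ($cluster->_masters() as $master) {            // each $master is a [host, port] pair
    $it = null;
    do {
        $keys = $cluster->scan($it, $master, $pattern, 500);
        if ($keys !== false) {
            foreach ($keys as $key) {
                $matches[$key] = $master[0] . ':' . $master[1];
            }
        }
    } while ($it > 0);
}

print_r($matches);                                     // key => node that holds it
```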
