Add support for consistent hashing #135

Open

EnchanterIO opened this issue Aug 6, 2021 · 1 comment

@EnchanterIO

From a quick look, it seems that the node a key goes to is picked with a CRC-32 hash and a modulo over the server list:

func (ss *ServerList) PickServer(key string) (net.Addr, error) {
	ss.mu.RLock()
	defer ss.mu.RUnlock()
	if len(ss.addrs) == 0 {
		return nil, ErrNoServers
	}
	// Fast path: with a single server there is nothing to hash.
	if len(ss.addrs) == 1 {
		return ss.addrs[0], nil
	}
	// Hash the key via a pooled buffer to avoid an allocation per call.
	bufp := keyBufPool.Get().(*[]byte)
	n := copy(*bufp, key)
	cs := crc32.ChecksumIEEE((*bufp)[:n])
	keyBufPool.Put(bufp)

	// Map the checksum onto the server list with a plain modulo.
	return ss.addrs[cs%uint32(len(ss.addrs))], nil
}
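
To see the cost of that scheme when the pool size changes, here is a small standalone demo (not from gomemcache; the key names are made up) that counts how many keys land on a different server index when scaling from 3 to 4 nodes:

package main

import (
	"fmt"
	"hash/crc32"
)

func main() {
	const keys = 100000
	moved := 0
	for i := 0; i < keys; i++ {
		h := crc32.ChecksumIEEE([]byte(fmt.Sprintf("key-%d", i)))
		// The key moves whenever its index under mod-3 differs from mod-4.
		if h%3 != h%4 {
			moved++
		}
	}
	fmt.Printf("%d of %d keys (%.1f%%) remapped\n", moved, keys, 100*float64(moved)/keys)
}

Running this should show roughly 75% of keys remapped, matching the whitepaper quote below.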

But apparently (TIL) there is a better practice: consistent hashing.

Unfortunately, this particular approach suffers from a fatal flaw due to the way that modulo works. As the number of cache nodes scales up, most hash keys will get remapped to new nodes with empty caches, as a side effect of using modulo. You can calculate the number of keys that would be remapped to a new cache node by dividing the old node count by the new node count. For example, scaling from 1 to 2 nodes remaps half (½) of your cache keys; scaling from 3 to 4 nodes remaps three-quarters (¾) of your keys; and scaling from 9 to 10 nodes remaps 90 percent of your keys to empty caches.

From https://d0.awsstatic.com/whitepapers/performance-at-scale-with-amazon-elasticache.pdf
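
For comparison, a consistent-hashing selector could look roughly like the sketch below. This is a minimal illustration, not gomemcache code: HashRing, NewHashRing, AddServer, and the replica count are names I made up; only memcache.ServerSelector, memcache.NewFromSelector, and memcache.ErrNoServers are real gomemcache API.

package consistenthash

import (
	"hash/crc32"
	"net"
	"sort"
	"strconv"
	"sync"

	"github.com/bradfitz/gomemcache/memcache"
)

// HashRing is a sketch of a consistent-hash selector: every server is
// placed on a ring at several virtual points, and a key is served by the
// first point at or after the key's own hash. Adding a node then only
// moves the keys that fall between the new points and their predecessors,
// instead of remapping most of the keyspace.
type HashRing struct {
	mu       sync.RWMutex
	replicas int                 // virtual points per server (tunable)
	ring     []uint32            // sorted ring positions
	nodes    map[uint32]net.Addr // ring position -> server
}

func NewHashRing(replicas int) *HashRing {
	return &HashRing{replicas: replicas, nodes: make(map[uint32]net.Addr)}
}

// AddServer places `replicas` virtual points for addr on the ring.
func (r *HashRing) AddServer(addr net.Addr) {
	r.mu.Lock()
	defer r.mu.Unlock()
	for i := 0; i < r.replicas; i++ {
		h := crc32.ChecksumIEEE([]byte(strconv.Itoa(i) + "-" + addr.String()))
		r.ring = append(r.ring, h)
		r.nodes[h] = addr
	}
	sort.Slice(r.ring, func(a, b int) bool { return r.ring[a] < r.ring[b] })
}

// PickServer implements memcache.ServerSelector.
func (r *HashRing) PickServer(key string) (net.Addr, error) {
	r.mu.RLock()
	defer r.mu.RUnlock()
	if len(r.ring) == 0 {
		return nil, memcache.ErrNoServers
	}
	h := crc32.ChecksumIEEE([]byte(key))
	// Binary-search for the first ring position >= h, wrapping to 0.
	i := sort.Search(len(r.ring), func(i int) bool { return r.ring[i] >= h })
	if i == len(r.ring) {
		i = 0
	}
	return r.nodes[r.ring[i]], nil
}

// Each implements memcache.ServerSelector by visiting every distinct server.
func (r *HashRing) Each(f func(net.Addr) error) error {
	r.mu.RLock()
	defer r.mu.RUnlock()
	seen := make(map[string]bool)
	for _, addr := range r.nodes {
		if !seen[addr.String()] {
			seen[addr.String()] = true
			if err := f(addr); err != nil {
				return err
			}
		}
	}
	return nil
}

Since HashRing satisfies the ServerSelector interface, it could presumably be plugged in with memcache.NewFromSelector(ring) without touching the rest of the client.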

Am I understanding correctly that gomemcache doesn't have consistent hashing?

Thanks

@lovelock

Feel free to extend it like I did: https://github.com/lovelock/gomemcache
