Simple implementation of Golomb Compressed Sets (GCS), a statistical
compressed data structure. It is similar to a Bloom filter, but far more
compact: given N elements and a false-positive probability P, an optimal
Bloom filter requires at least N*log2(e)*log2(1/P) bits, while a GCS gets
close to the theoretical minimum of N*log2(1/P) bits. With real-world data
sets, a GCS can be 20-30% more compact than a Bloom filter.
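To make the formulas above concrete, here is a small sketch that plugs in
example numbers (N = 1,000,000 and P = 1/1024 are arbitrary choices, not
values taken from the repository):

    from math import log2, e

    N, P = 1_000_000, 1 / 1024
    bloom_bits = N * log2(e) * log2(1 / P)   # optimal Bloom filter size
    gcs_bits = N * log2(1 / P)               # theoretical minimum, approached by GCS
    print(bloom_bits / N)                    # ~14.43 bits per element
    print(gcs_bits / N)                      # 10.0 bits per element, ~30% smaller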

The downside is of course speed: a GCS is fully compressed, so a query is an
order of magnitude slower than a Bloom filter. On the other hand, it does not
need to be fully decompressed in RAM, so it can be streamed. This makes GCS a
good fit for environments where queries happen at interactive rates and RAM
is scarce compared to the size of the dataset.
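The idea behind the structure: hash every element onto the range [0, N*P),
sort the hashes, and Golomb-Rice-encode the deltas between consecutive
values, which are geometrically distributed with mean ~P. A query hashes the
candidate and decodes deltas until it matches or overshoots the target. The
following is a minimal, self-contained Python sketch of that idea, assuming
MD5 as the hash and a power-of-two P so Rice coding applies; it is not the
repository's actual format or API:

    import hashlib

    def _hash(item, n, p):
        # Map each item uniformly onto [0, n*p).
        h = int(hashlib.md5(item.encode("utf-8")).hexdigest(), 16)
        return h % (n * p)

    def build(items, p):
        # p is the inverse false-positive probability (e.g. 1024 for P = 1/1024),
        # assumed here to be a power of two.
        n = len(items)
        values = sorted(_hash(it, n, p) for it in items)
        k = p.bit_length() - 1               # Rice parameter: log2(p)
        bits, prev = [], 0
        for v in values:
            q, r = divmod(v - prev, p)
            prev = v
            bits.extend([1] * q)             # quotient, unary-coded
            bits.append(0)                   # unary terminator
            bits.extend((r >> (k - 1 - i)) & 1 for i in range(k))  # k-bit remainder
        return bits, n, p

    def query(gcs, item):
        bits, n, p = gcs
        target = _hash(item, n, p)
        k = p.bit_length() - 1
        pos, value = 0, 0
        while pos < len(bits):
            q = 0
            while bits[pos] == 1:            # decode unary quotient
                q += 1
                pos += 1
            pos += 1                         # skip terminator
            r = 0
            for _ in range(k):               # decode k-bit remainder
                r = (r << 1) | bits[pos]
                pos += 1
            value += q * p + r
            if value == target:
                return True                  # probably present
            if value > target:
                return False
        return False

    gcs = build(["apple", "banana", "cherry"], 1024)
    print(query(gcs, "banana"))              # True
    print(query(gcs, "durian"))              # almost surely False

Because decoding only ever walks forward through the bit stream, the same
query loop works whether the bits live in memory or are read sequentially
from disk, which is why the structure can be streamed.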

A full explanation of GCS can be found in my blog post on the subject:
http://giovanni.bajo.it/2011/09/golomb-coded-sets/

The provided implementations are in Python, C++ and JavaScript. They are
fully equivalent, but the C++ and JS implementations cache the GCS in memory,
while the Python implementation streams it from disk. Example data sets
(English and Italian dictionaries) are provided to play with; see test.sh.

Try the live JavaScript implementation here:
http://cybercase.github.io/gcs/

See these references for more details:
http://www.imperialviolet.org/2011/04/29/filters.html
http://algo2.iti.uni-karlsruhe.de/singler/publications/cacheefficientbloomfilters-wea2007.pdf