Reduce memory pressure of the caching to the minimum #357

Open
hicolour opened this issue Mar 9, 2022 · 1 comment

hicolour commented Mar 9, 2022

I'm trying to reduce the memory pressure of caching to the minimum.

I managed to reduce it a lot using the configuration below, but I still see leftovers that have symptoms of a leak.

At first glance it looks like it comes from LevelZeroMapCache, but I'm not sure, and I cannot find any reference to it in the documentation.

[screenshot: 2022-03-09-12-40-10]

val cacheKeyValueIdsOverride = false

val memoryCacheOverride = MemoryCache.off

val mmapDisabled = MMAP.Off(
  ForceSave.Off
)

val segmentConfigOverride = DefaultConfigs
  .segmentConfig()
  .copyWithMmap(
    mmapDisabled
  )
  .copyWithCacheSegmentBlocksOnCreate(false)
  .copyWithFileOpenIOStrategy(IOStrategy.AsyncIO(cacheOnAccess = false))
  .copyWithBlockIOStrategy(
    blockIOStrategy = (_) => IOStrategy.AsyncIO(cacheOnAccess = false)
  )

val fileCacheOverride = FileCache.On(
  0,
  ActorConfig.TimeLoop(
    name = s"${this.getClass.getName} - FileCache TimeLoop Actor",
    delay = 1.seconds,
    ec = DefaultExecutionContext.sweeperEC
  )
)

persistent.Map[String, Int, Nothing, Glass](
  dir = File.newTemporaryDirectory("suggestions_topn_base_").deleteOnExit().path,
  mmapMaps = mmapDisabled,
  cacheKeyValueIds = cacheKeyValueIdsOverride,
  segmentConfig = segmentConfigOverride,
  fileCache = fileCacheOverride,
  memoryCache = memoryCacheOverride
)


simerplaha (Owner) commented

Yep, detailed documentation on the cache configurations is missing. We should really document each cache config with performance benchmarks for when it's turned off vs on.

Everything is configurable, so you should be able to tune caching anywhere from very aggressive to almost none.

For now, similar to how you've disabled caching with segmentConfigOverride in your code, you can try the same for sortedIndex, hashIndex, binarySearchIndex, bloomFilter and valuesConfig (find them in DefaultConfigs).

Set the following for all the above blocks.

case action: IOAction.DecompressAction => IOStrategy.SynchronisedIO(cacheOnAccess = false)
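Roughly something like the sketch below. The DefaultConfigs accessor names and the copyWithBlockIOStrategy copy methods are assumptions carried over from the segmentConfig example above and might differ slightly between versions, so double-check them against DefaultConfigs; the pattern is the same as your segmentConfigOverride.

// Assumed sketch: one shared no-cache block IO strategy for all block configs.
val noCacheBlockIO: IOAction => IOStrategy = {
  // decompress actions: synchronised IO without caching, as suggested above
  case _: IOAction.DecompressAction => IOStrategy.SynchronisedIO(cacheOnAccess = false)
  // keep the function total by applying the same no-cache strategy to any other IOAction
  case _                            => IOStrategy.SynchronisedIO(cacheOnAccess = false)
}

// Accessor and copy method names below are assumed to follow the same
// convention as DefaultConfigs.segmentConfig().copyWithBlockIOStrategy(...).
val sortedIndexOverride       = DefaultConfigs.sortedKeyIndex().copyWithBlockIOStrategy(noCacheBlockIO)
val hashIndexOverride         = DefaultConfigs.randomSearchIndex().copyWithBlockIOStrategy(noCacheBlockIO)
val binarySearchIndexOverride = DefaultConfigs.binarySearchIndex().copyWithBlockIOStrategy(noCacheBlockIO)
val bloomFilterOverride       = DefaultConfigs.mightContainIndex().copyWithBlockIOStrategy(noCacheBlockIO)
val valuesConfigOverride      = DefaultConfigs.valuesConfig().copyWithBlockIOStrategy(noCacheBlockIO)

These can then be passed into persistent.Map next to your existing segmentConfig, fileCache and memoryCache overrides.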

Just a note: in your settings above you've disabled caching of opened files with this setting:

.copyWithFileOpenIOStrategy(IOStrategy.AsyncIO(cacheOnAccess = false))

So every thread that tries to access a file would open a new java.nio.FileChannel. Unless you have an extreme use case, I would set this to cacheOnAccess = true so there are fewer concurrently opened FileChannels per file.
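For example, keeping everything else in your segmentConfigOverride the same and only flipping that one flag:

val segmentConfigOverride = DefaultConfigs
  .segmentConfig()
  .copyWithMmap(mmapDisabled)
  .copyWithCacheSegmentBlocksOnCreate(false)
  // cache the opened FileChannel so concurrent readers reuse the same channel
  .copyWithFileOpenIOStrategy(IOStrategy.AsyncIO(cacheOnAccess = true))
  .copyWithBlockIOStrategy(
    blockIOStrategy = (_) => IOStrategy.AsyncIO(cacheOnAccess = false)
  )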
