
Library is not Simple and fast NodeJS internal caching #286

Open

nopeless opened this issue Apr 16, 2022 · 6 comments

@nopeless

The title says "Simple and fast NodeJS internal caching"

Yet the source code is nothing like that

The API is not simple: it provides callbacks where there shouldn't be any, because everything is synchronous
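To illustrate what I mean (a hypothetical sketch, not this library's actual signatures): a purely in-memory store needs nothing more than a synchronous return, and a callback form only adds ceremony.

```ts
// Hypothetical in-memory cache; names are illustrative, not node-cache's API.
class SyncCache<V> {
  private store = new Map<string, V>();

  set(key: string, value: V): void {
    this.store.set(key, value);
  }

  // Everything is in-process and synchronous, so a plain return value suffices.
  get(key: string): V | undefined {
    return this.store.get(key);
  }
}

// Callback style adds noise without any asynchrony behind it:
//   cache.get("user:1", (err, value) => { ... });
// versus the synchronous form:
//   const value = cache.get("user:1");
```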

Many methods went through pointless revisions because of initial design failures

It says "typescript rewrite" but the last commit in the typescript rewrite branch is almost 2 years old.

There is no way to opt out of the statistics, which are recalculated every time an item is added to the cache, adding to the bloat

The TTL expiry check goes through every item instead of keeping a binary heap sorted by expiration time
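A rough sketch of the alternative I'm suggesting (my own illustration, not code from this repo): keep entries in a binary min-heap ordered by expiry time, so the periodic expiry pass only pops entries that are actually due instead of scanning every key.

```ts
// Minimal binary min-heap keyed by expiry timestamp (illustrative sketch).
interface Entry { key: string; expiresAt: number }

class ExpiryHeap {
  private heap: Entry[] = [];

  push(e: Entry): void {
    this.heap.push(e);
    let i = this.heap.length - 1;
    // Sift up until the parent expires no later than the child.
    while (i > 0) {
      const parent = (i - 1) >> 1;
      if (this.heap[parent].expiresAt <= this.heap[i].expiresAt) break;
      [this.heap[parent], this.heap[i]] = [this.heap[i], this.heap[parent]];
      i = parent;
    }
  }

  // Pop every entry whose TTL has elapsed; untouched entries are never visited.
  popExpired(now: number): Entry[] {
    const expired: Entry[] = [];
    while (this.heap.length && this.heap[0].expiresAt <= now) {
      expired.push(this.heap[0]);
      const last = this.heap.pop()!;
      if (this.heap.length) {
        this.heap[0] = last;
        this.siftDown(0);
      }
    }
    return expired;
  }

  private siftDown(i: number): void {
    for (;;) {
      const l = 2 * i + 1, r = 2 * i + 2;
      let smallest = i;
      if (l < this.heap.length && this.heap[l].expiresAt < this.heap[smallest].expiresAt) smallest = l;
      if (r < this.heap.length && this.heap[r].expiresAt < this.heap[smallest].expiresAt) smallest = r;
      if (smallest === i) return;
      [this.heap[smallest], this.heap[i]] = [this.heap[i], this.heap[smallest]];
      i = smallest;
    }
  }
}
```

In a real cache you would still need to skip heap entries whose keys were deleted or overwritten, but the expiry pass stays proportional to the number of expired items rather than the total number of keys.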

Maybe some people use this library because they like slow speeds or bloated code, but I feel the README should mention these issues and tone down the buzzwords.

If you agree with me, please upvote this issue. Maybe I'll write a better library

@jeremyj563

jeremyj563 commented Apr 23, 2022

The title says "Simple and fast NodeJS internal caching"

Yet the source code is nothing like that...

Have you located a more capable alternative library? It looks like lru-cache is active but I haven't actually tried it yet.

Edit:
Okay, that one's a different use case (LRU, Least Recently Used)

@wemod123

I had used this package in a project before, but only recently did I realize the performance issue.

Reading a 2.87 MB JSON object (just the get method) takes around 2 s in my real-world case.

For me the only useful functional feature is TTL. With only a few keys holding tiny objects it should still work fine, and in that case the performance issue can be ignored.
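If the slow read is coming from value cloning (as far as I know, node-cache deep-clones values on get/set by default), constructing the cache with useClones: false may avoid most of that cost. The option name and behavior are from memory, so treat this as a sketch:

```ts
import NodeCache from "node-cache";

// Stand-in for the ~2.87 MB parsed JSON payload from the real case.
const hugeObject = { data: [] as unknown[] };

// useClones: false hands back the stored reference instead of a deep copy,
// which is what matters for large objects (assuming the option exists in your version).
const cache = new NodeCache({ stdTTL: 600, useClones: false });

cache.set("big-json", hugeObject);
const value = cache.get("big-json"); // no multi-megabyte clone on read
```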

@nopeless
Author

@wemod123 @jeremyj563
I wrote a small npm package called nope-mem-cache
It's written in TypeScript and I'm willing to fix bugs within a reasonable time
(I would actually love testers)
Check it out maybe

@NexZhu

NexZhu commented Sep 7, 2022

Did a quick search and found some libraries, haven't tried yet:
https://github.com/aholstenson/transitory
https://github.com/tinovyatkin/tlru
https://github.com/rafikalid/lru-ttl-cache

@nopeless
Author

nopeless commented Sep 7, 2022

Did a quick search and found some libraries, haven't tried yet:
https://github.com/aholstenson/transitory
https://github.com/tinovyatkin/tlru
https://github.com/rafikalid/lru-ttl-cache

I just use my libraries

@falsandtru

I recommend the highest-performance, constant-complexity cache algorithm:

https://github.com/falsandtru/dw-cache
