
Pluggable cache strategies? #21

Open
ivos opened this issue Oct 17, 2020 · 8 comments
Labels
enhancement New feature or request

Comments

@ivos

ivos commented Oct 17, 2020

Have you considered supporting different cache strategies?

Right now the cache is hard-coded in the library, but it might be useful to provide a cache interface (or adopt some standard, existing one) that people can implement in order to plug in their own cache. Some ideas of what could be handy:

  1. No cache. In a multi-user app where data is frequently modified, one might want to fetch fresh data from the backend every time a page is displayed.
  2. A time-to-live setting. Some data has a logical lifespan, after which it could expire automatically, saving us from having to clear the cache imperatively. The timeout could be set globally, per wrapped function, or per call.

Alternatively, it might be beneficial to use an existing, proven caching library (e.g. https://www.npmjs.com/package/lru-cache ?) instead of implementing a custom solution, and let users configure it.
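To make the idea concrete, a minimal pluggable interface might look something like this (a sketch only; none of these names exist in use-async-resource):

```typescript
// Hypothetical cache interface a user could implement and plug in.
interface ResourceCache<K, V> {
  get(key: K): V | undefined;
  set(key: K, value: V): void;
  delete(key: K): void;
  clear(): void;
}

// Idea 1: a no-op implementation effectively disables caching,
// so every render fetches fresh data from the backend.
class NoCache<K, V> implements ResourceCache<K, V> {
  get(_key: K): V | undefined { return undefined; }
  set(_key: K, _value: V): void {}
  delete(_key: K): void {}
  clear(): void {}
}
```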

@andreiduca
Owner

andreiduca commented Oct 17, 2020

Currently, all data is cached in plain JavaScript Map objects, with no clearing strategy other than the manual one.

However, I have considered both an actual caching strategy and a multi-level TTL approach; see this comment: https://github.com/andreiduca/use-async-resource/blob/9d09a48/src/cache.ts#L8 😋 I don't have a roadmap for this yet.

A pluggable caching layer is an interesting idea. Adding another library as a dependency would be easy and fast to implement, but not necessarily the best idea in terms of bundle size.

I will consider all options.
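For reference, the TTL idea could be layered on top of the existing Map-based storage roughly like this (a sketch under assumed semantics, not the library's actual code; all names here are hypothetical):

```typescript
// Sketch of a Map wrapper with time-to-live expiry.
// Entries are lazily evicted when read after their deadline.
class TtlCache<K, V> {
  private entries = new Map<K, { value: V; expiresAt: number }>();

  constructor(private ttlMs: number) {}

  set(key: K, value: V): void {
    this.entries.set(key, { value, expiresAt: Date.now() + this.ttlMs });
  }

  get(key: K): V | undefined {
    const entry = this.entries.get(key);
    if (!entry) return undefined;
    if (Date.now() > entry.expiresAt) {
      this.entries.delete(key); // expired: evict and report a miss
      return undefined;
    }
    return entry.value;
  }

  clear(): void {
    this.entries.clear();
  }
}
```

The per-function or per-call timeouts ivos mentions would just mean constructing instances with different `ttlMs` values at the appropriate level.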

@andreiduca andreiduca added the enhancement New feature or request label May 26, 2021
@lubieowoce

@andreiduca Any updates? :) Not sure if you've been following the progress on React 18, but it looks like React will provide a cache of its own at some point: reactwg/react-18#25

Also related: reactwg/react-18#80

@lubieowoce

Also semi-related: have you thought about allowing people to scope the cache via a Provider (falling back to the default cache when no Provider is used)? That would be very useful!
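The fallback logic itself is simple. Stripped of React for illustration, it might look like this (hypothetical names; in a real implementation the scoped cache would come from React.createContext and a Provider rather than a function argument):

```typescript
// Hypothetical: a module-level default cache, used when no scoped cache exists.
const defaultCache = new Map<string, unknown>();

// In React, `scoped` would be read via useContext(CacheContext);
// with no Provider mounted it is undefined, so we fall back to the default.
function resolveCache(scoped?: Map<string, unknown>): Map<string, unknown> {
  return scoped ?? defaultCache;
}
```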

@mmasdivins

In my case the data is frequently modified, so I don't want to cache the requests. Is there a way to disable the cache?

@scrungrth

I'm trying to add a generic error boundary component with a retry button, but it is cumbersome: when an error occurs, I have to know all the apiFns used in order to clear their caches, because currently even errors are cached.

@Kleptine

Kleptine commented Nov 6, 2021

I'd also like to disable the cache. :(

@Kleptine

Kleptine commented Nov 6, 2021

Dumb option:

const [dataReader] = useAsyncResource(nodeValueFunction, GlobalUiState.selected_node);
// Clear this function's cache entry on every render, so each mount refetches.
useEffect(() => {
    resourceCache(nodeValueFunction).clear();
});

@scrungrth

I ended up forking so that I could update the apiFn(...).then error handler to clear the cache shortly after an error. That way my error fallback component doesn't need to know which API call failed in order to retry.
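A similar effect might be achievable without a fork by wrapping the API function so a rejection triggers a cache clear before re-throwing. A rough sketch (the clearOnError helper is hypothetical; the cleanup callback would be something like resourceCache(apiFn).clear() from the library):

```typescript
// Hypothetical wrapper: run a cleanup callback whenever the wrapped
// promise-returning function rejects, then re-throw so error boundaries
// (and retry logic) still see the failure.
function clearOnError<A extends unknown[], R>(
  apiFn: (...args: A) => Promise<R>,
  clear: () => void
): (...args: A) => Promise<R> {
  return (...args: A) =>
    apiFn(...args).catch((err) => {
      clear(); // e.g. resourceCache(apiFn).clear()
      throw err;
    });
}
```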
