
Add a lazy method #4608

Open
lucsoft opened this issue Apr 19, 2024 · 4 comments

Comments

@lucsoft

lucsoft commented Apr 19, 2024

Is your feature request related to a problem? Please describe.

Often in code there is a need to run something lazily, such as connecting to a database. This pattern is also seen in Kotlin.

Describe the solution you'd like

I would like a function whose callback is executed only once, with every call returning the cached result.

const connect = lazy(async () => db.connect()) // async lazy
const con = await connect()

const firstHitDate = lazy(() => new Date()) // sync lazy

Describe alternatives you've considered

Doing it manually and more verbose.

Example Implementation

export const lazy = <T>(callback: () => T): (() => T) => {
    let initialized = false;
    let result: T;
    return () => {
        // Run the callback only on the first call; an explicit flag is used
        // so that falsy results (0, "", null) are cached correctly too.
        if (!initialized) {
            result = callback();
            initialized = true;
        }
        return result;
    };
};
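A minimal, self-contained sketch of how this could behave in practice (the counter is only there to demonstrate that the callback runs once; `lazy` is repeated inline so the snippet stands alone):

```typescript
// Self-contained copy of the proposed helper. A flag is used so that
// falsy results (0, "", null) are also cached correctly.
const lazy = <T>(callback: () => T): (() => T) => {
    let initialized = false;
    let result: T;
    return () => {
        if (!initialized) {
            result = callback();
            initialized = true;
        }
        return result;
    };
};

// The callback runs only on the first call; later calls return the cached value.
let calls = 0;
const firstHitDate = lazy(() => {
    calls++;
    return new Date();
});

const a = firstHitDate();
const b = firstHitDate();
console.log(calls);   // 1
console.log(a === b); // true
```

Note that `lazy` returns a thunk rather than the value itself, so async callers would write `await firstHit()` on the returned function.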
@lionel-rowe
Contributor

lionel-rowe commented May 5, 2024

Looks like a special case of memoization (memoizing a nullary function), and IMO it'd be more useful to have a general memoize function. Memoization in the general case can be implemented in various different ways though. A (probably non-exhaustive) list of considerations:

  • Passing custom caches (e.g. pre-populated, automatically capped size by deleting least-recently-used entries, maybe even backed by synchronous/asynchronous storage APIs?)
  • Exposing the cache for fine-grained adding/deleting of entries
  • For asynchronous callback functions, deleting promises from the cache upon rejection
  • Caching by nullary (fn()), unary primitive (fn(n: number)), unary reference (fn(el: HTMLElement)), or n>1-ary (fn(...args: any)), which may be a combination of reference/primitive args and may be variable length
  • Custom resolving of arguments to cache keys, e.g. more aggressive caching for reference arguments by serializing them to primitives first
  • How to deal with additional arguments being passed to non-variable-length functions, e.g. does [x, x, x].map(memoizedUnaryFn) always miss the cache due to the index and array implicitly being passed as arguments to map?
  • Whether and how to cache the this argument

There's also currently a Stage-1 TC39 proposal for memoization which could be a good reference point. The proposal as it stands relies on tuples though, which are currently at Stage 2.
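To make a couple of the considerations above concrete, here is a rough sketch of a unary `memoize` with an injectable cache and a custom key resolver. This is illustrative only, not a proposed std API; the `cache` and `getKey` option names are made up for this example.

```typescript
// Sketch of a unary memoize: the cache can be injected (pre-populated,
// LRU-capped, etc.) and is exposed for fine-grained adding/deleting of
// entries. getKey allows custom resolution of arguments to cache keys.
function memoize<Arg, Result, Key = Arg>(
    fn: (arg: Arg) => Result,
    options: {
        cache?: Map<Key, Result>;
        getKey?: (arg: Arg) => Key;
    } = {},
): ((arg: Arg) => Result) & { cache: Map<Key, Result> } {
    const cache = options.cache ?? new Map<Key, Result>();
    const getKey = options.getKey ?? ((arg: Arg) => arg as unknown as Key);
    const memoized = (arg: Arg): Result => {
        const key = getKey(arg);
        if (cache.has(key)) return cache.get(key)!;
        const result = fn(arg);
        cache.set(key, result);
        return result;
    };
    // Expose the cache alongside the memoized function.
    return Object.assign(memoized, { cache });
}

// Example of "more aggressive caching for reference arguments by
// serializing them to primitives first": distinct objects, same key.
let hits = 0;
const area = memoize(
    (rect: { w: number; h: number }) => {
        hits++;
        return rect.w * rect.h;
    },
    { getKey: (rect) => `${rect.w}x${rect.h}` },
);

console.log(area({ w: 2, h: 3 })); // 6
console.log(area({ w: 2, h: 3 })); // 6, cache hit despite a fresh object
console.log(hits);                 // 1
```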

@lucsoft
Author

lucsoft commented May 5, 2024

Thanks for your comment @lionel-rowe! The scope of this feature is just lazy initialization / lazy code execution. See the Kotlin std, where the inspiration comes from.

Your detailed comment shows how complex real memoization is, which may not be beneficial once the proposal advances, nor as a first step into the memoization topic for the Deno std. The lazy method is simple enough to have no large drawbacks, as it's quite straightforward: if the cache is empty, run the function, cache the result, and return the cached value.

@lionel-rowe
Contributor

Your detailed comment shows how complex real memoization is, which may not be beneficial once the proposal advances

Proposals at stage 1 typically take several years to get to stage 4 and achieve widespread implementation, and there's no guarantee they'll get there at all — quite often the committee fails to reach consensus on some sticking point and they get delayed indefinitely or withdrawn. std routinely covers ground that overlaps with APIs advancing through the TC39 proposal process, with the relevant std APIs being deprecated and removed if and when those proposals achieve implementation. So I think overlap is a feature, not a bug.

The reason I mention memoization here is that I have a couple of use cases for it currently, and only the other day I was considering suggesting it for inclusion in std. Given that lazy is a special case of memoization, it'd make sense to roll them into one if feasible to do so (otherwise the rollout would go something like "release lazy → release memoize → alias lazy to memoize → deprecate the lazy alias → remove the lazy alias", which isn't a great story for API stability).

I currently have an implementation of memoize that's pretty versatile and well-tested, although there might still be some edge cases it doesn't do so well with. I just added some tests for your examples above.

@lionel-rowe
Contributor

@iuioiua any enthusiasm for adding a caching module containing memoize and LruCache (least-recently-used cache), with memoize also fulfilling the needs of lazy? I think I've ironed out most of those edge cases in my standalone implementation now and could have a PR up pretty soon.

I've also had a look at the Kotlin lazy implementation: it has some thread-safety machinery, but the non-thread-safe version used for Kotlin/JS is pretty simple. The only thing that wouldn't be covered by memoize is that lazy overwrites initializer with null after running, whereas memoize would always hold a reference to the wrapped function (as it doesn't know the function won't need to be re-run later with different args). If freeing up variables closed over by the initialization function is likely to be useful, then maybe I'm wrong that lazy is a special case of memoize, and they should be two different functions after all. In that case the surface area of the putative caching module would be { lazy, LruCache, memoize }.
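For illustration, a `lazy` that drops its initializer reference after the first run, mirroring the non-thread-safe Kotlin/JS behaviour described above, could look roughly like this. It's a sketch under those assumptions, not a proposed std implementation:

```typescript
// Sketch: the initializer reference is cleared after the first call,
// so anything its closure captures becomes eligible for GC.
const lazy = <T>(initializer: () => T): (() => T) => {
    let init: (() => T) | null = initializer;
    let value: T;
    return () => {
        if (init !== null) {
            value = init();
            init = null; // free the initializer and its closure
        }
        return value;
    };
};

// The large array below is only reachable until the first call.
const sum = lazy(() => {
    const big = new Array(1_000_000).fill(1);
    return big.reduce((a: number, b: number) => a + b, 0);
});
console.log(sum()); // 1000000
```

A plain `memoize` can't do this, since it must keep the wrapped function around in case it's called with a different, uncached argument.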
