
Add the possibility to track the execution time of async functions #34

Open
antoniomuso opened this issue Nov 8, 2021 · 12 comments · May be fixed by #43
Labels
enhancement New feature or request

Comments

@antoniomuso
Contributor

antoniomuso commented Nov 8, 2021

What do you think of a trackAsyncCbTime function to trace an async function's resolution time?

async function asyncFunction(a, b, c) {
    await doSomething(a, b, c);
    return doSomething2(a, b, c);
}

const asyncFunctionOutput = await fastify.trackAsyncCbTime(
    { scope: 'something', label: 'test' },
    asyncFunction,
    a,
    b,
    c
);

WDYT?
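A minimal sketch of what such a helper could look like (hypothetical implementation; the logging and scope/label handling are assumptions, not part of the plugin):

const { performance } = require('perf_hooks');

// Hypothetical helper: measures how long it takes for fn(...args) to settle
// and reports the duration under the given scope/label.
async function trackAsyncCbTime({ scope, label }, fn, ...args) {
  const start = performance.now();
  try {
    return await fn(...args);
  } finally {
    const duration = performance.now() - start;
    console.log(`${scope}.${label}: ${duration.toFixed(3)} ms`);
  }
}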

@antoniomuso antoniomuso changed the title from Add the possibility to track the execution time of an async function to Add the possibility to track the execution time of async functions Nov 8, 2021
@dnlup
Contributor

dnlup commented Nov 9, 2021

Yes that is useful 😉

@simonecorsi
Collaborator

simonecorsi commented Nov 10, 2021

I've missed a point in this that I hadn't thought about before, and there is one thing to keep in mind: this will never be a 100% correct timing.

Because of how the underlying queueing works, the then callback (async/await here, but that's syntactic sugar) is queued on the microtask queue, which may or may not (and probably does) already have multiple callbacks queued up, and those will count toward our measurement.

There is no way around this because that's how async jobs work.

If we're OK with it I'm fine, but we should at least warn the users!
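A small illustration of that caveat (a standalone sketch, not part of the plugin): the measured time includes whatever microtasks happen to be queued ahead of the awaited continuation, not only the function's own work.

const { performance } = require('perf_hooks');

async function work() {
  // The async operation itself is nearly instantaneous.
  await Promise.resolve();
}

async function main() {
  // Flood the microtask queue before measuring.
  for (let i = 0; i < 100_000; i++) {
    Promise.resolve().then(() => {});
  }

  const start = performance.now();
  await work();
  // The elapsed time also covers draining the callbacks queued above,
  // so it overstates the cost of work() itself.
  console.log(`measured: ${(performance.now() - start).toFixed(3)} ms`);
}

main();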

@antoniomuso
Contributor Author

It's OK for me to add a warning in the docs.

@dnlup
Contributor

dnlup commented Nov 10, 2021

> I've missed a point in this that I hadn't thought about before, and there is one thing to keep in mind: this will never be a 100% correct timing.
>
> Because of how the underlying queueing works, the then callback (async/await here, but that's syntactic sugar) is queued on the microtask queue, which may or may not (and probably does) already have multiple callbacks queued up, and those will count toward our measurement.
>
> There is no way around this because that's how async jobs work.
>
> If we're OK with it I'm fine, but we should at least warn the users!

That would be the same if you timed a Promise without the utility function we want to provide (if I understand correctly). I don't think we need a warning for that.

@simonecorsi
Collaborator

If your callback ends up far down in the queue, your timing would include the overhead of all the others. That is something I would want to know about, if it can make my metrics somehow not what I expect.

@dnlup
Contributor

dnlup commented Nov 11, 2021

A couple of thoughts:

  1. it could also be useful to expose this for sync functions
  2. if the function we want to time is attached to a Fastify instance, we could have issues with the this context

@antoniomuso
Contributor Author

The first point is OK for me; for the second, I don't understand the possible problem. Can you give an example?

@dnlup
Contributor

dnlup commented Nov 11, 2021

Sure, imagine we have to time something like:

fastify.decorate('someFn', function (arg1, arg2) {
  this.log.info(arg1, arg2)
})
fastify.timeAsync(fastify.someFn, arg1, arg2)

I think we would lose the this context in the decorated function.
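One possible way to keep the context, assuming the hypothetical timeAsync helper simply calls the function it receives:

fastify.decorate('someFn', function (arg1, arg2) {
  this.log.info(arg1, arg2)
})

// Pre-binding the decorated function to the Fastify instance preserves `this`
// when the timing helper invokes it.
fastify.timeAsync(fastify.someFn.bind(fastify), arg1, arg2)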

@dnlup
Contributor

dnlup commented Nov 16, 2021

Finally found this:

https://nodejs.org/dist/latest-v16.x/docs/api/perf_hooks.html#performancetimerifyfn-options

We could provide a wrapper of this in the plugin, but I think it's also pretty straightforward to use as it is. I think it's a much cleaner approach.
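For reference, a minimal usage sketch of performance.timerify together with a PerformanceObserver (the asyncFunction here is a made-up placeholder):

const { performance, PerformanceObserver } = require('perf_hooks');

// Entries produced by timerified functions have entryType 'function'.
const obs = new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    console.log(`${entry.name}: ${entry.duration.toFixed(3)} ms`);
  }
});
obs.observe({ entryTypes: ['function'] });

async function asyncFunction(a, b, c) {
  // Placeholder async work.
  await new Promise((resolve) => setTimeout(resolve, 10));
  return a + b + c;
}

const timedAsyncFunction = performance.timerify(asyncFunction);
timedAsyncFunction(1, 2, 3);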

@antoniomuso
Contributor Author

I like it; we should test whether this function adds overhead. It uses a PerformanceObserver, which can add a performance cost.

@dnlup
Contributor

dnlup commented Nov 16, 2021

I have used it to track GC activity in the past with no perf issues, but I'll test it again for sure.

@dnlup dnlup added the enhancement New feature or request label Nov 16, 2021
@dnlup dnlup linked a pull request Nov 30, 2021 that will close this issue
@simonecorsi simonecorsi added this to the 🔥 Version 2 milestone Nov 30, 2021
@dnlup
Contributor

dnlup commented Feb 22, 2022

I am removing this one from the milestone. I would prefer not to block a new release waiting for this feature.

@dnlup dnlup removed this from the 🔥 Version 2 milestone Feb 22, 2022