
Optimisations #19

Open
dzannotti opened this issue May 16, 2019 · 6 comments

Comments

@dzannotti
Owner

I'm not sure whether V8 pre-optimises math operations like divisions by a static denominator, but if it doesn't, we can definitely improve on that ourselves.
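
For example, a divide by a constant in a per-sample loop can be turned into a multiply by a precomputed reciprocal. A minimal sketch (names and sizes are illustrative, not from the codebase):

    // Hypothetical hot loop; names and sizes are illustrative.
    const bufferLength = 128
    const sampleRate = 44100
    const input = new Float32Array(bufferLength)
    const out = new Float32Array(bufferLength)

    // Before: one division per sample.
    for (let i = 0; i < bufferLength; i++) {
      out[i] = input[i] / sampleRate
    }

    // After: precompute the reciprocal once, then multiply per sample.
    // (The result can differ from the division by a last-bit rounding error.)
    const invSampleRate = 1 / sampleRate
    for (let i = 0; i < bufferLength; i++) {
      out[i] = input[i] * invSampleRate
    }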

Also, babel-macro probably gives us a lot of inlining capability (at least for params.js).

@pendragon-andyh
Collaborator

I have experimented with converting static-denominator divisions into multiplications in the past, for a C# rating engine. It introduces rounding errors that would not be a problem for us but would affect other types of application, so I would not expect V8 to do that optimization. We should be doing it ourselves. Likewise, we should keep an eye open for areas that use lots of Math functions.

Probably the biggest optimization is in terms of the algorithms used (e.g. how we construct the oscillators and whether we use over-sampling).

I am a little worried about babel doing anything inside the engine core. Only modern browsers can use the Audio Worklet (I am assuming Firefox support will land in the near future) ... so we should not need to use wacky future-JS features that might be inefficiently emulated by babel. I would rather we constrain ourselves to bog-standard ES6. I am more than fine with babel being used for the React UI.

It would be nice if the engine could monitor its own performance. We could have it emit a message if it starts to exceed the time budget. Max budget is 3ms. We want to be able to run at least 6 voices within that budget.
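
A rough sketch of what that self-monitoring could look like inside the worklet (the class name and message shape are made up for illustration; the actual rendering is elided):

    // A minimal sketch of a processor that reports when a render quantum
    // exceeds the time budget. BUDGET_MS and the 'budget-exceeded' message
    // shape are hypothetical; the real rendering code is elided.
    const BUDGET_MS = 3

    class MonitoredProcessor extends AudioWorkletProcessor {
      process (inputs, outputs) {
        const start = Date.now() // or performance.now() if it is exposed in the worklet scope
        // ...render the 128-sample block into outputs here...
        const elapsed = Date.now() - start
        if (elapsed > BUDGET_MS) {
          // tell the main thread we blew the budget
          this.port.postMessage({ type: 'budget-exceeded', elapsed })
        }
        return true
      }
    }

    registerProcessor('monitored-processor', MonitoredProcessor)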

@dzannotti
Owner Author

babel-macro works exactly like C macros: expansion happens at compile time. This would give us inline-function behaviour, saving the function-call overhead at zero cost to us. I hear your concern and agree overall.
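
To illustrate the idea (the lerp helper and values below are hypothetical, not the real params.js code), compile-time inlining just pastes the helper body at the call site:

    // Hypothetical helper in params.js (illustrative, not the real file):
    function lerp (a, b, t) {
      return a + (b - a) * t
    }

    const prev = 0.2
    const next = 0.8
    const t = 0.5

    // Call site as written:
    const level = lerp(prev, next, t)

    // What compile-time inlining (e.g. via a babel macro) would emit instead,
    // removing the function-call overhead on the hot path:
    const levelInlined = prev + (next - prev) * t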

@pendragon-andyh
Collaborator

Nice work on last night's commit.

Can you look at allocations (so that we can reduce the load from garbage collection):

synth.worklet.js

  • this.bufferL and this.bufferR do not appear to be used. Allocated once every 128 samples.

junox.js

  • this.voices = this.voices.filter(voice => !voice.isFinished())
  • Suggest this is moved after the following loop, and have that loop identify if the list of voices needs any removals.

chorus.js

  • Returns a newly-allocated array on each call.
  • Suggest left and right outputs are written to instance variables - and have the calling junox.js pick results up from them.
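
Something along these lines, as a sketch only (class and field names are illustrative, and the actual chorus processing is elided):

    // chorus.js sketch: write into pre-allocated instance fields instead of
    // returning a new array on every call (names are illustrative).
    class Chorus {
      constructor () {
        this.leftOutput = 0
        this.rightOutput = 0
      }

      render (input) {
        // ...actual chorus processing elided; it would set the two fields...
        this.leftOutput = input
        this.rightOutput = input
      }
    }

    // junox.js would then pick the results up from the fields:
    const chorus = new Chorus()
    chorus.render(0.5)
    const outL = chorus.leftOutput
    const outR = chorus.rightOutput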

I will have a play with my git setup at the weekend. I have a feeling that my inability to commit changes is because I am not authenticated.

@dzannotti
Owner Author

dzannotti commented May 20, 2019

synth.worklet.js

  • this.bufferL and this.bufferR do not appear to be used. Allocated once every 128 samples.

It's allocated only once (and then re-allocated only if WebAudio ever requests more than the previous size), but it is indeed unused and can be removed.

junox.js

  • this.voices = this.voices.filter(voice => !voice.isFinished())
  • Suggest this is moved after the following loop, and have that loop identify if the list of voices needs any removals.

This means we'd be running tick and render for every voice during the 128-sample period, even if they're disabled (and their output is guaranteed to be 0). I am not sure which of the two choices is more performant. Probably a mix of both: we should check whether the voice is finished before rendering it (that only costs one if), and clean up once per 128 samples.

chorus.js

  • Returns a newly-allocated array on each call.
  • Suggest left and right outputs are written to instance variables - and have the calling junox.js pick results up from them.

Good idea!

dzannotti added a commit that referenced this issue May 20, 2019
dzannotti added a commit that referenced this issue May 20, 2019
@pendragon-andyh
Collaborator

Something like this within junox.js

      this.tick()

      let monoOut = 0
      let isVoiceFinished = false
      for (let j = 0; j < this.voices.length; j++) {
        const voice = this.voices[j]
        monoOut += voice.render()
        isVoiceFinished = isVoiceFinished || voice.isFinished() // flag any voice needing removal
      }

      monoOut *= this.patch.vca

      if (isVoiceFinished) {
        // remove dead voices.
        this.voices = this.voices.filter(voice => !voice.isFinished())
      }

@dzannotti
Owner Author

Yup, I've implemented something similar in #37.
