X-Y oscilloscope music #31

Open
benbovy opened this issue Mar 18, 2021 · 5 comments
Labels: enhancement (New feature or request)

@benbovy (Collaborator) commented Mar 18, 2021

I'd love to be able to experiment with something like this (well, perhaps less ambitious animations) in a notebook at some point.

This would require some X-Y oscilloscope widget to which we can plug in an ipytone audio node. I've found some examples either using custom shaders (https://www.shadertoy.com/view/XttSzf, https://github.com/m1el/woscope) or based on the Canvas API (https://github.com/Sean-Bradley/Oscilloscope).

I guess it shouldn't be too hard to implement a basic version in ipytone, so that we can have a very flexible emulator (code + widgets). I have zero experience with shaders / the Canvas API, though.

With both ipytone and ipycanvas, I imagine a pipeline like: real-time drawing -> generate an audio signal from the drawing -> feed the X-Y oscilloscope emulator with that audio signal, like this: https://www.youtube.com/watch?v=AGeHwNEwbZk. :-)

Cc @martinRenou (in case you have any thoughts / suggestion for the X-Y oscilloscope widget, that would be really helpful!).
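
For illustration, here is a minimal sketch (not tied to any ipytone API; it only assumes numpy and matplotlib are available) of the kind of stereo signal such a widget would display: the left channel drives the x deflection and the right channel the y deflection, so two sines in quadrature trace a circle:

```python
# Minimal sketch: build one second of "X-Y oscilloscope compatible" stereo audio.
# Left channel = x deflection, right channel = y deflection.
import numpy as np
import matplotlib.pyplot as plt

sr = 44100                            # audio sample rate (Hz)
freq = 440.0                          # how many times per second the shape is retraced
t = np.arange(sr) / sr

left = np.cos(2 * np.pi * freq * t)   # x coordinate
right = np.sin(2 * np.pi * freq * t)  # y coordinate

# Offline preview of what the widget would render: plot left vs. right.
plt.plot(left, right, linewidth=0.5)
plt.gca().set_aspect("equal")
plt.show()
```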

@davidbrochart (Contributor)

Not sure it's 100% relevant, but could be interesting:
https://github.com/meyda/meyda
https://github.com/vcync/modV

@martinRenou

My two cents: I wouldn't use ipycanvas for that, for performance reasons. You'd get much better performance using the Web Canvas API directly in JavaScript.

@benbovy (Collaborator, Author) commented Mar 18, 2021

> Not sure it's 100% relevant, but could be interesting:
> https://github.com/meyda/meyda
> https://github.com/vcync/modV

Hmm, modV looks like a full application rather than a library? I'm not sure how meyda would be useful here, as it goes in the opposite direction of ipytone (extracting parameters from sound rather than generating sound from parameters), and there are similar libraries in Python such as librosa... Maybe for real-time analysis of an imported audio track, so that we could programmatically generate some smart overdubs with ipytone?

> My two cents: I wouldn't use ipycanvas for that, for performance reasons. You'd get much better performance using the Web Canvas API directly in JavaScript.

Yes, I was rather thinking of using ipycanvas to draw a shape "by hand", get the canvas image, and do some computation on the Python side to convert it into an "X-Y oscilloscope compatible" audio signal generated with ipytone... There's still some way to go before even trying something like that, though!
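
Roughly what I have in mind for that conversion step, as a hypothetical sketch (it doesn't use any actual ipycanvas or ipytone API, and just assumes the drawn shape has been extracted as an (N, 2) array of pixel coordinates):

```python
import numpy as np

def path_to_stereo(points, sr=44100, freq=110.0, duration=2.0):
    """Turn a hand-drawn path into a stereo (x = left, y = right) audio buffer."""
    pts = np.asarray(points, dtype=float)

    # Resample the path onto one audio-rate period of the target frequency,
    # parameterized by cumulative arc length so drawing speed doesn't matter.
    seg = np.linalg.norm(np.diff(pts, axis=0), axis=1)
    s = np.concatenate([[0.0], np.cumsum(seg)])
    n_per_period = max(int(sr / freq), 2)
    u = np.linspace(0.0, s[-1], n_per_period)
    x = np.interp(u, s, pts[:, 0])
    y = np.interp(u, s, pts[:, 1])

    # Center and normalize to [-1, 1]; flip y because canvas y grows downward.
    x = 2.0 * (x - x.min()) / max(np.ptp(x), 1e-9) - 1.0
    y = 1.0 - 2.0 * (y - y.min()) / max(np.ptp(y), 1e-9)

    # Repeat the single period for the requested duration.
    period = np.column_stack([x, y])
    return np.tile(period, (int(duration * freq), 1))  # shape (n_samples, 2)
```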

Do you know if the Canvas API is performant enough for responsive drawing at the audio sample rate?

@martinRenou

> Do you know if the Canvas API is performant enough for responsive drawing at the audio sample rate?

The Canvas API is quite fast; as long as you don't draw tens of thousands of shapes and use the API wisely, it should be fine. Of course, WebGL shaders are faster, but they are also a lot more complicated.

I am not an audio expert; what is the order of magnitude of the audio rate? I am not sure you want to match the audio frequency: you'll be limited by your screen refresh rate anyway (60 fps on my laptop). You can definitely make 60 fps animations with a 2D Web canvas.

@benbovy (Collaborator, Author) commented Mar 18, 2021

> I am not an audio expert; what is the order of magnitude of the audio rate? I am not sure you want to match the audio frequency: you'll be limited by your screen refresh rate anyway (60 fps on my laptop). You can definitely make 60 fps animations with a 2D Web canvas.

Sorry, my comment was dumb (typical audio rate is 44.1 kHz vs. a 60 Hz screen refresh rate, haha).

> The Canvas API is quite fast; as long as you don't draw tens of thousands of shapes and use the API wisely, it should be fine. Of course, WebGL shaders are faster, but they are also a lot more complicated.

It'll be wiser to start with the Canvas API, then, for a basic rendering (without trying to emulate the nice look of an analog oscilloscope). Thanks!
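
For the record, a back-of-the-envelope sketch (written in Python for readability; the actual render loop would live in the widget's JavaScript frontend) of how many points a basic Canvas renderer would need to draw per animation frame:

```python
# At audio rate, each ~60 fps animation frame only covers a few hundred samples,
# which a 2D canvas can easily draw as a single polyline.
sr = 44100                       # audio sample rate (Hz)
fps = 60                         # screen refresh rate (Hz)
samples_per_frame = sr // fps    # ~735 (x, y) points per frame

def frame_chunk(stereo_buffer, frame_index):
    """Slice of the (n_samples, 2) buffer to draw for a given animation frame."""
    start = frame_index * samples_per_frame
    return stereo_buffer[start:start + samples_per_frame]
```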

@benbovy added the "enhancement" (New feature or request) label Jun 23, 2022