Hi! First of all, thank you for an amazing library.

Feature Request 🛍️

According to Apple's own documentation:

Currently, all devices running iOS are limited to playback of a single audio or video stream at any time. Playing more than one video—side by side, partly overlapping, or completely overlaid—is not currently supported on iOS devices. Playing multiple simultaneous audio streams is also not supported.
What this means for Remotion is that I can use the existing components to build complex compositions with overlaid audio, which works fine on almost any device except mobile iOS, which accounts for a substantial share of users worldwide. There, playback simply alternates between the audio layers.
Possible Solution
I'd like to see a new Audio component based on the Web Audio API, which, according to Apple's own documentation, does support parallel audio streams on mobile iOS.
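Conceptually, the Web Audio API plays streams in parallel by summing every source node connected to the same destination. A minimal pure-TypeScript sketch of that summing behavior (`mixChannels` is an illustrative name, not a browser API):

```typescript
// Sketch of the Web Audio mixing model: two channels playing at once
// are combined by per-sample addition at the shared destination.
function mixChannels(a: Float32Array, b: Float32Array): Float32Array {
  const out = new Float32Array(Math.max(a.length, b.length));
  for (let i = 0; i < out.length; i++) {
    // Out-of-range typed-array reads yield undefined, so treat them as silence.
    out[i] = (a[i] ?? 0) + (b[i] ?? 0);
  }
  return out;
}
```

In the real API, this summing happens implicitly once multiple `AudioBufferSourceNode`s are connected to `AudioContext.destination`, which is exactly the behavior Remotion would rely on here.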
One could extract a global AudioContext that the new WebAudioAPIAudio component reads from, adding its buffer and the start / end / duration / loop parameters. Then we'd need to implement an interface between Remotion's internal state (play/pause/time, etc.) and that global AudioContext.
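One piece of that interface can be sketched as a pure function that maps Remotion's frame-based state onto the `when`/`offset`/`duration` arguments of `AudioBufferSourceNode.start()`. All names below (`computeSchedule`, the parameter shapes) are hypothetical, not existing Remotion API, and `currentFrame` is simplified to mean frames elapsed since the clip began:

```typescript
// Hypothetical helper: translate frame-based playback state into
// Web Audio scheduling arguments for AudioBufferSourceNode.start().
interface ScheduleParams {
  currentFrame: number;     // frames elapsed since the clip's first frame
  fps: number;              // composition frame rate
  startFrom: number;        // frame in the audio file where the clip begins
  durationInFrames: number; // total length of the clip in frames
}

interface Schedule {
  offset: number;   // seconds into the audio buffer to begin playback
  duration: number; // seconds of the buffer left to play
}

function computeSchedule(
  { currentFrame, fps, startFrom, durationInFrames }: ScheduleParams
): Schedule {
  const elapsed = currentFrame / fps;       // seconds already played
  const offset = startFrom / fps + elapsed; // resume position in the file
  const duration = Math.max(0, durationInFrames / fps - elapsed);
  return { offset, duration };
}
```

On play, each component would then call something like `source.start(ctx.currentTime, offset, duration)`; on pause or seek, it would stop the source and recompute the schedule from the new frame.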
I'd be more than happy to give this implementation a shot.
Valid feature request, and I am happy if you give it a shot!
Feel free to write in Discord if you want help along the way; see https://remotion.dev/discord to get started.
Just want to add that this documentation seems out of date; you can indeed play multiple audio streams now.