
Changing byteArray to SoundSource #34

Open
VivekFitkariwala opened this issue Nov 16, 2014 · 7 comments

Comments

@VivekFitkariwala

I am trying to pass microphone data through the filters, but I don't know how to convert the ByteArray to a SoundSource so I can run it through them. I am recording the audio at 11 kHz. Please help!
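For context, a minimal sketch of the recording setup described above (an assumption about the setup, not code from this thread: the microphone samples are appended to a ByteArray named soundBytes as 32-bit floats, using the standard flash.* classes):

import flash.media.Microphone;
import flash.events.SampleDataEvent;
import flash.utils.ByteArray;

var mic:Microphone = Microphone.getMicrophone();
mic.rate = 11; // ~11.025 kHz capture
var soundBytes:ByteArray = new ByteArray();

mic.addEventListener(SampleDataEvent.SAMPLE_DATA, onMicData);

function onMicData(event:SampleDataEvent):void {
    // each microphone sample arrives as a 32-bit float in event.data
    while (event.data.bytesAvailable) {
        soundBytes.writeFloat(event.data.readFloat());
    }
}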

@kenfehling

Maybe create a Sample and then do sample.writeWavBytes(byteArray)

@VivekFitkariwala
Author

@kenfehling I don't have a Sample. I want to create a SoundSource from the ByteArray I recorded from the microphone, i.e. convert the ByteArray to a Sound so that I can apply filters to the sound source:

var source:SoundSource = new SoundSource(sound as Sound);

@kenfehling

Oh right, I had that backwards; writeWavBytes(byteArray) writes data out into a ByteArray.

Maybe something like this:

const sound:Sound = new Sound();
// args: bytes, sample count, sample format, stereo (false = mono), sample rate in Hz
sound.loadPCMFromByteArray(byteArray, numSamples, format, false, 11025.0);

I'm not sure what the format should be; the default is "float", but you might have signed/unsigned integers or something else in the byte array.
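For what it's worth, if the bytes came straight from the Microphone's SampleDataEvent, they are 32-bit float mono samples, so "float" with stereo = false should match. A sketch under that assumption:

soundBytes.position = 0; // rewind before handing the bytes to loadPCMFromByteArray
const sound:Sound = new Sound();
sound.loadPCMFromByteArray(soundBytes, soundBytes.length / 4, "float", false, 11025.0); // 4 bytes per float sample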

@VivekFitkariwala
Author

Okay, I was able to convert the ByteArray to a Sound and play it. But when I make the SoundSource from the Sound, it does not play.

var recordedSound:Sound = new Sound();
recordedSound.loadPCMFromByteArray(soundBytes, soundBytes.length / 4, "float", false);

var soundSource:SoundSource = new SoundSource(recordedSound);
var audioSource:AudioPlayer = new AudioPlayer();
audioSource.play(soundSource);

Does the Sound passed to the SoundSource need to be in any specific format?

@kenfehling

Yes, give it an AudioDescriptor to specify the sample rate and mono/stereo.

const audioDescriptor:AudioDescriptor = new AudioDescriptor(AudioDescriptor.RATE_11025, 1); // 11.025 kHz, 1 channel (mono)
const soundSource:SoundSource = new SoundSource(recordedSound, audioDescriptor);

@VivekFitkariwala
Author

Thanks for the quick reply. I still can't play the sound when it is passed through the SoundSource.

Using the above code, I got this error:

AudioPlayer no longer supports lower audio descriptors. Please pass the source through the StandardizeFilter() before output.

After passing the source through the StandardizeFilter, no audio played in the player.

In the source code, if the rate is 44.1 kHz and the channel count is 2 (stereo), the source does not need to pass through the StandardizeFilter. So should I change the recording to 44.1 kHz?

@VivekFitkariwala
Author

Here is the code:

soundBytes.position = 0;
var recordedSound:Sound = new Sound();
recordedSound.loadPCMFromByteArray(soundBytes, soundBytes.length / 4, "float", false, AudioDescriptor.RATE_11025);
//recordedSound.play();

var soundSource:SoundSource = new SoundSource(recordedSound, new AudioDescriptor(AudioDescriptor.RATE_11025, 1));
var audioSource:IAudioSource;
audioSource = new StandardizeFilter(soundSource);
var audioPlayer:AudioPlayer = new AudioPlayer();
audioPlayer.play(soundSource); // note: this plays the unfiltered soundSource, not the StandardizeFilter output
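A likely culprit in the snippet above: the StandardizeFilter output is stored in audioSource, but the unfiltered soundSource is what gets played. A minimal sketch of the intended chain (an assumption based on the error message, not a confirmed fix), reusing the soundSource built above:

var standardized:IAudioSource = new StandardizeFilter(soundSource); // convert to the player's standard rate/channels
var audioPlayer:AudioPlayer = new AudioPlayer();
audioPlayer.play(standardized); // play the filter output rather than the raw source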
