CREPE model Tensorflow on Android #79

Open
remyut opened this issue Jan 24, 2022 · 4 comments
@remyut

remyut commented Jan 24, 2022

Hi,

I use the CREPE model in a web browser and it works well, but is there a way to integrate it into Android, or into a Flutter web/mobile app, using a TensorFlow library?

The model works just fine when the mic is used from the web browser (with ml5.js, for example). I would like to connect another source, such as the phone's mic. Is this possible?

Could you please help me understand?

I believe it should be possible, and I do not want to re-train the model, since it already works fine. I just need to understand how to adapt the input audio data from another source to fit the model.
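If I understand the repo correctly, the model expects 16 kHz mono audio cut into 1024-sample frames, each normalized to zero mean and unit variance. Here is a rough sketch of the kind of adaptation I mean (the sample rate, frame size, hop, and normalization are my assumptions from reading the code, so please correct me if they are wrong):

```python
import numpy as np

def audio_to_crepe_frames(samples, sample_rate, hop_ms=10):
    """Shape a 1-D float array of mic samples into (N, 1024) model input."""
    target_sr = 16000
    if sample_rate != target_sr:
        # Naive linear resampling for illustration; use a real resampler
        # (e.g. a polyphase filter) in production code.
        n_target = int(len(samples) * target_sr / sample_rate)
        samples = np.interp(
            np.linspace(0, len(samples) - 1, n_target),
            np.arange(len(samples)),
            samples,
        )
    if len(samples) < 1024:
        raise ValueError("need at least 1024 samples (64 ms at 16 kHz)")
    hop = int(target_sr * hop_ms / 1000)  # 160 samples at a 10 ms hop
    n_frames = (len(samples) - 1024) // hop + 1
    frames = np.stack(
        [samples[i * hop : i * hop + 1024] for i in range(n_frames)]
    ).astype(np.float32)
    # Normalize each frame to zero mean and unit variance.
    frames -= frames.mean(axis=1, keepdims=True)
    frames /= np.clip(frames.std(axis=1, keepdims=True), 1e-8, None)
    return frames
```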

Thanks for your help
Remy

@martingasser

I'm using it right now in an iOS/Android React Native app using TFLite.

You can read up on how to convert the Keras model to TFLite here: https://www.tensorflow.org/lite/convert

I just had to write a small conversion script that builds the Keras model and loads the weights from the .h5 files provided in the repo; then I used the tf.lite.TFLiteConverter.from_keras_model(...) API.
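In outline, the script looks something like this (a sketch, not the exact script I used; I'm assuming the build_and_load_model helper in crepe/core.py, so check the name against the version of the repo you have):

```python
import tensorflow as tf
from crepe.core import build_and_load_model

# Build the Keras model and load the pretrained .h5 weights from the repo.
# Capacities: 'tiny', 'small', 'medium', 'large', 'full'.
model = build_and_load_model('tiny')

# Convert the in-memory Keras model to a TFLite flatbuffer.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
# Optional: enable default optimizations (e.g. quantization) for mobile.
# converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()

with open('crepe-tiny.tflite', 'wb') as f:
    f.write(tflite_model)
```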

@remyut

remyut commented Apr 17, 2022 via email

@RemyNtshaykolo

> I'm using it right now in an iOS/Android React Native app using TFLite.
>
> You can read up on how to convert the Keras model to TFLite here: https://www.tensorflow.org/lite/convert
>
> I just had to write a small conversion script that builds the Keras model and loads the weights from the .h5 files provided in the repo; then I used the tf.lite.TFLiteConverter.from_keras_model(...) API.

Hey, were you able to load a model bigger than the tiny one? The bigger the model, the more latency I get during inference.

@martingasser

> > I'm using it right now in an iOS/Android React Native app using TFLite.
> > You can read up on how to convert the Keras model to TFLite here: https://www.tensorflow.org/lite/convert
> > I just had to write a small conversion script that builds the Keras model and loads the weights from the .h5 files provided in the repo; then I used the tf.lite.TFLiteConverter.from_keras_model(...) API.
>
> Hey, were you able to load a model bigger than the tiny one? The bigger the model, the more latency I get during inference.

Of course, bigger models need more computation. On a low-end Android device (Samsung Galaxy A12), I can only run the tiny model in real time. On an iPhone 11, I can use the "small" model without problems.

Which device are you using?
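One thing worth checking before ruling out the bigger capacities is how the interpreter itself is configured. Here is a rough benchmarking sketch with the Python tf.lite.Interpreter (the model path, input shape, and thread count are just examples; on-device you would use the Android/iOS TFLite APIs, possibly with a delegate such as NNAPI or XNNPACK):

```python
import time
import numpy as np
import tensorflow as tf

# Sketch: time a converted CREPE .tflite model with several CPU threads.
interpreter = tf.lite.Interpreter(model_path='crepe-small.tflite', num_threads=4)
interpreter.allocate_tensors()

inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

# Dummy frame matching the model's input shape, e.g. (1, 1024).
frame = np.random.randn(*inp['shape']).astype(np.float32)

runs = 100
start = time.perf_counter()
for _ in range(runs):
    interpreter.set_tensor(inp['index'], frame)
    interpreter.invoke()
    _ = interpreter.get_tensor(out['index'])
print(f"avg inference: {(time.perf_counter() - start) / runs * 1000:.1f} ms/frame")
```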
