Serving a trained model #29
Comments
I'm about to be in the same situation. My plan was basically to save the predictor after training, then load it in an API container to get my predictions.
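The save-and-reload flow described above can be sketched with ktrain's predictor API. This is a minimal sketch, not code from the thread: `learner` and `preproc` are assumed to come from the usual ktrain training workflow, and the save path is a placeholder.

```python
import ktrain

# After training: wrap the model and preprocessor in a Predictor and save it.
# `learner` and `preproc` are assumed to exist from the training step, e.g.
# preproc from texts_from_* and learner from ktrain.get_learner(...).
predictor = ktrain.get_predictor(learner.model, preproc)
predictor.save('/tmp/my_predictor')  # placeholder path

# Later, inside the API container: reload the predictor and use it.
predictor = ktrain.load_predictor('/tmp/my_predictor')
print(predictor.predict('This movie was great!'))
```

`load_predictor` restores both the model and the preprocessing pipeline, so the serving process does not need the training code at all.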
Yes, you could serve the model by taking a …
I have tried the Flask app approach, but I get a `RuntimeError: The Session graph is empty` error.
For those of you who are trying to serve a ktrain model with Flask: this looks like an issue with Flask/TensorFlow, not ktrain. The latest version of Flask causes a `Session graph is empty` error when trying to serve a TensorFlow model on TensorFlow 1.14. See the linked Keras issue for more information. It apparently works in TensorFlow 2.0. However, pre-v0.8 versions of ktrain still run in TensorFlow 1.x compatibility mode even on TensorFlow 2, in order to support both TF 1.14 and TF 2.0 for now, which is why you see this error on both TF 1.14 and TF 2.0 when using ktrain. This will no longer be a problem in ktrain v0.8 (not yet released), because that version will only support TensorFlow 2 (not TensorFlow 1.14). For right now, the workaround is to downgrade Flask with:
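The actual command was not captured in the scrape; the exact Flask version to pin is an assumption here (check the linked Keras issue for the version the maintainers recommended), but the downgrade would look something like:

```shell
# Assumed version pin -- 1.0.2 predates the Flask release that
# triggered the "Session graph is empty" error; verify against
# the linked Keras issue before using.
pip uninstall -y flask
pip install flask==1.0.2
```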
After starting the server, open the app in a browser. If the model was trained on IMDB, the browser should display the model's prediction.
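The serving setup being discussed can be sketched as a minimal Flask app wrapping a saved ktrain predictor. This is an illustrative sketch, not the code from the thread: the route name, port, and predictor path are all assumptions. `jsonify` is used so the example also works on the downgraded Flask 1.0.x, which does not support returning a dict directly from a view.

```python
import ktrain
from flask import Flask, jsonify, request

app = Flask(__name__)

# Load the saved predictor once at startup (placeholder path).
predictor = ktrain.load_predictor('/tmp/my_predictor')

@app.route('/predict')
def predict():
    # e.g. GET /predict?text=This+movie+was+great
    text = request.args.get('text', '')
    return jsonify({'prediction': predictor.predict(text)})

if __name__ == '__main__':
    app.run(host='0.0.0.0', port=8000)
```

For production use you would put this behind a WSGI server such as gunicorn rather than Flask's development server.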
I have a ktrain text classifier based on BERT. What would be the right way to go about saving the model for serving?