
Freezing the model for deployment #370

Open

gautam247gk opened this issue Feb 3, 2022 · 0 comments

Comments

@gautam247gk
Hi @keithito, I have successfully trained a model and tested it with the demo server (demo_server.py). I'm fairly new to TensorFlow and machine learning, and I'd like to know how to export the model checkpoints as a frozen model so that I can use it with something like tfjs in a browser, for integration with a chat bot. A short guide on freezing the model would be really helpful. I tried the approach from https://blog.metaflow.fr/tensorflow-how-to-freeze-a-model-and-serve-it-with-a-python-api-d4f3596b3adc, but I ran into issues with the output nodes.

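For reference, this is roughly what I was trying: a minimal sketch assuming a TensorFlow 1.x checkpoint (as this repo uses). The checkpoint directory and output node name below are placeholders I guessed, not names taken from the repo.

```python
# Minimal sketch of freezing a TF 1.x checkpoint into a single .pb file.
# Assumptions (not from this repo): CHECKPOINT_DIR and OUTPUT_NODE_NAMES are
# placeholders and must be replaced with the real values from your training run.
import tensorflow as tf

CHECKPOINT_DIR = 'logs-tacotron'            # placeholder: directory with model.ckpt-* files
OUTPUT_NODE_NAMES = ['model/audio_output']  # placeholder: the real output node name(s) go here

checkpoint_path = tf.train.latest_checkpoint(CHECKPOINT_DIR)
saver = tf.train.import_meta_graph(checkpoint_path + '.meta', clear_devices=True)

with tf.Session() as sess:
    saver.restore(sess, checkpoint_path)

    # Uncomment to list every node name and locate the actual output node:
    # for op in sess.graph.get_operations():
    #     print(op.name)

    # Replace variables with constants so the graph is self-contained.
    frozen_graph_def = tf.graph_util.convert_variables_to_constants(
        sess, sess.graph.as_graph_def(), OUTPUT_NODE_NAMES)

    with tf.gfile.GFile('frozen_tacotron.pb', 'wb') as f:
        f.write(frozen_graph_def.SerializeToString())
```

The part I'm stuck on is OUTPUT_NODE_NAMES: printing sess.graph.get_operations() gives a very long list, and I'm not sure which node corresponds to the synthesized audio.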