saving the trained model for future use #4

Open

vijaymanikandan opened this issue Sep 6, 2015 · 5 comments

@vijaymanikandan

Thanks for posting your code; I played with it to understand the LSTM implementation. I have a quick question about saving the model for later use. Right now, when I run train.py, I can see the model being trained and some outputs, but is there a way to save the model to a file and later use it to retrain or predict on future data? I tried using pickle, but I get an error saying the maximum recursion depth was reached. Please post your thoughts.

Thanks!
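
For what it is worth, a workaround that often helps with Theano-style networks is to save only the parameter arrays instead of pickling the whole network object, since the full object drags the compiled graph along and that is what blows the recursion limit. A minimal sketch, assuming the network keeps its Theano shared variables in a list called `ntwk.params` (that attribute name is an assumption; adjust it to whatever train.py actually exposes):

```python
# Sketch: persist only the learned parameter values (plain numpy arrays),
# not the network object itself. `ntwk.params` is an assumed attribute name.
import pickle

def save_params(ntwk, path='model_params.pkl'):
    values = [p.get_value() for p in ntwk.params]   # Theano shared -> numpy
    with open(path, 'wb') as f:
        pickle.dump(values, f)

def load_params(ntwk, path='model_params.pkl'):
    with open(path, 'rb') as f:
        values = pickle.load(f)
    for p, v in zip(ntwk.params, values):
        p.set_value(v)                               # restore weights in place
```

To resume training or to predict later, rebuild the network the same way train.py does and then call `load_params` on it.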

@lmqs

lmqs commented Jun 3, 2016

I also need to save the trained model. I tried to save "NTWK" but I get a serialization error: maximum recursion depth exceeded. Has anyone experienced this? Thanks!
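
If you just want to get past that error and pickle the whole NTWK object anyway, raising Python's recursion limit sometimes works, although it is fragile compared with saving only the parameter values as sketched above:

```python
# Quick, somewhat fragile workaround: deep Theano graphs can exceed the
# default recursion limit (~1000) during pickling, so raise it first.
import sys
import pickle

sys.setrecursionlimit(50000)

with open('ntwk.pkl', 'wb') as f:
    pickle.dump(NTWK, f, protocol=pickle.HIGHEST_PROTOCOL)
```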

@guduxingzou

Hello, have you managed to do that, i.e. save the model and predict with it? @vijaymanikandan @lucianaqueiroz098

@rakeshvar
Owner

I don't plan to, as this is not meant to be a full-fledged product, only a proof of concept. It is a very bare-bones implementation, not an end-to-end product. Maybe someone has added it on a fork?

@lmqs

lmqs commented Jul 15, 2016

I think saving the outputs requires the "tester" function: layer 1 and layer 2 work. @rakeshvar, can you confirm this?
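
If it helps, here is a rough sketch of dumping the per-layer weights to a single .npz file, assuming the layers are reachable as `ntwk.layer1` / `ntwk.layer2` and each keeps its Theano shared variables in a `params` list (all of those names are assumptions about the code layout):

```python
# Sketch: save and restore the weights of layer1 and layer2 in one .npz file.
# The attribute names below are assumptions; adapt them to the real network.
import numpy as np

LAYER_NAMES = ['layer1', 'layer2']

def save_layer_weights(ntwk, path='layers.npz'):
    arrays = {}
    for name in LAYER_NAMES:
        layer = getattr(ntwk, name)
        for i, p in enumerate(layer.params):
            arrays['{}_{}'.format(name, i)] = p.get_value()
    np.savez(path, **arrays)

def load_layer_weights(ntwk, path='layers.npz'):
    arrays = np.load(path)
    for name in LAYER_NAMES:
        layer = getattr(ntwk, name)
        for i, p in enumerate(layer.params):
            p.set_value(arrays['{}_{}'.format(name, i)])
```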

@guduxingzou

For prediction, I think you can use the tester function.
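
Something along these lines, assuming the script exposes a way to rebuild the network and a compiled tester function that does a forward pass (`build_network`, `tester`, and `load_params` below are assumed names, not the repo's actual API):

```python
# Sketch: predict on new data with the compiled tester function after
# restoring previously saved weights. All names here are assumptions.
import numpy as np

ntwk, tester = build_network()          # rebuild exactly as train.py does
load_params(ntwk, 'model_params.pkl')   # restore the saved weights

x = np.load('new_digit.npy')            # one input, shaped like training data
probs = tester(x)                       # forward pass only, no training step
print(probs.argmax(axis=-1))            # predicted label(s)
```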

I found this repo: https://github.com/mosessoh/CNN-LSTM-Caption-Generator. @rakeshvar, can that repo handle this digit recognition task?
