
Auto-Save function #279

Open
c0def0x01 opened this issue Oct 13, 2022 · 3 comments

Comments

@c0def0x01

c0def0x01 commented Oct 13, 2022

I frequently find myself training, e.g., a SymbolicRegressor on my local PC. With a higher number of generations this can take several hours. If, for some reason, the process is interrupted, I lose all the previously calculated generations.

Would it be possible for you to add an option that auto-saves the model during the fit() operation in training, e.g. every n generations?

@trevorstephens
Owner

You could generate this functionality within a simple loop by combining warm starting with pickling.
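
For reference, a minimal sketch of such a loop might look like the following (this is not a built-in gplearn feature; the toy dataset, generation step, and checkpoint file name are placeholders chosen for illustration):

```python
# Sketch of the warm-start + pickle approach: fit in chunks of generations
# and snapshot the estimator to disk after each chunk.
import pickle

from gplearn.genetic import SymbolicRegressor
from sklearn.datasets import make_regression

X, y = make_regression(n_samples=200, n_features=5, random_state=0)

step = 5     # generations per chunk (arbitrary choice)
total = 50   # total generations to evolve

est = SymbolicRegressor(generations=step, warm_start=True, random_state=0)

for gens in range(step, total + step, step):
    est.set_params(generations=gens)  # raise the cumulative generation target
    est.fit(X, y)                     # continues from the existing population
    with open('checkpoint.pkl', 'wb') as f:
        pickle.dump(est, f)           # snapshot after every chunk
```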

@c0def0x01
Author

Thank you. Yes, you are right. I could have had that idea myself.

However, one question: if I reload a previously trained SymbolicRegressor and evolve it for further generations with warm_start, I observe that it displays very high 'population average fitness' values (>10^20), whereas the best-individual fitness values continue around where the previous training round finished. Is it just a matter of display, or is it because the old population averages are not saved with the model?
Please excuse me if my question is naive or ignorant.
If these values are just a matter of display, then your proposal will indeed do. Thanks.
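
A sketch of the reload-and-continue step described above (it assumes the hypothetical checkpoint.pkl written by the loop in the earlier comment and the same toy data; the run_details_ attribute is gplearn's per-generation training log, which should include the population averages in question):

```python
# Reload a pickled SymbolicRegressor and evolve it for further generations.
import pickle

from sklearn.datasets import make_regression

X, y = make_regression(n_samples=200, n_features=5, random_state=0)

with open('checkpoint.pkl', 'rb') as f:
    est = pickle.load(f)

# warm_start=True was set before pickling, so only the target needs raising.
est.set_params(generations=est.generations + 10)
est.fit(X, y)

# Per-generation average fitness of the population, where the very large
# values were observed after resuming.
print(est.run_details_['average_fitness'])
```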

@trevorstephens
Owner

If you can provide a short, self-contained example with a toy dataset, I can look into it. I'm not familiar with the issue.
