
[BUG] Random seed does not take effect during finetuning #3769

Closed · zjgemi opened this issue May 10, 2024 · 0 comments · Fixed by #3773


zjgemi commented May 10, 2024

Bug summary

The seed parameter in the training section does not take effect during finetuning: the results do not change after the seed is modified.
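
For clarity, the seed referred to here is the one in the training section of input.json. A minimal sketch of that section is shown below; every value other than the seed key itself is an illustrative placeholder, not taken from the actual input file:

    {
        "training": {
            "training_data": { "...": "..." },
            "numb_steps": 100000,
            "seed": 1
        }
    }

Changing this seed value and rerunning the finetuning command below still produces identical results.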

DeePMD-kit Version

2024Q1

Backend and its version

PyTorch

How did you download the software?

Built from source

Input Files, Running Commands, Error Log, etc.

dp --pt train --finetune pretrained_model.pt --model-branch H2O_H2O-PD input.json

Steps to Reproduce

Modify the seed in the training section of input.json and rerun the command above; the results do not change.

Further Information, Files, and Links

No response

@zjgemi zjgemi added the bug label May 10, 2024
@iProzd iProzd self-assigned this May 10, 2024
@iProzd iProzd linked a pull request May 12, 2024 that will close this issue
@njzjz njzjz closed this as completed May 22, 2024