Describe the bug
Inference from a model I trained isn't working; it just prints "Error: 'config'" instead.
To Reproduce
I used the Colab notebook in the repo, with one change: I made the main directory live on my Google Drive so I don't have to pull every time and so the models I train are saved automatically.
I trained a model with v2 and a 40k sample rate (maybe this is the problem: I later noticed there is no v2-40k config in the configs directory).
I then tried inference there, and it failed, printing only "Error: 'config'".
I traced it to the vc pipeline, where it loads the checkpoint and tries to access ckpt['config'].
Then, outside the code, I loaded the checkpoints saved during my training, and they do not have a 'config' key.
I also looked at the code that saves the checkpoints, and it never writes a 'config' key either.
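For what it's worth, the exact error string can be reproduced with a plain dict; the quotes around config come from Python's KeyError formatting. The keys below are only illustrative stand-ins for what my checkpoints actually contain:

```python
# Minimal sketch of the failure, using a plain dict in place of the real
# checkpoint (keys here are illustrative -- inspect your own .pth to
# confirm what it actually contains).
training_ckpt = {
    "model": {},        # generator state_dict
    "optimizer": {},    # optimizer state_dict
    "iteration": 1000,  # training step
}

try:
    cfg = training_ckpt["config"]  # what the inference code tries to read
except KeyError as e:
    print(f"Error: {e}")  # prints: Error: 'config'
```

So the "Error: 'config'" output is just a caught KeyError, which matches the missing key I see in the saved checkpoints.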
I think I'm doing something wrong, but I'm not sure what.
Should all of the saved G_*.pth checkpoints be usable for inference? My best guess is that I'm using the wrong model file: I'm using the ones in the log/model_name directory. RVC usually saves exported weights in a weights directory, but there isn't one here.
As mentioned above, I checked where the training code saves checkpoints, and there is no explicit 'config' key in the .pth it writes for the epoch checkpoints. Are different model files supposed to be saved somewhere else?
Is v2 + 40k supported even though there is no config file for it?
Could having the repo on Google Drive cause issues? I know it sometimes causes Linux path issues.
save_only_latest doesn't seem like a usable flag, since the best model isn't necessarily the latest one; usually you need to go back, compare performance, and pick the best checkpoint. How is this flag actually meant to be used?
Expected behavior
Inference works.