Error: 'config' when trying to infer from a model that I trained #50

sleimanitani opened this issue Apr 25, 2024 · 0 comments

Describe the bug
Inference from a model I trained isn't working; it just prints "Error: 'config'" instead.

To Reproduce
I used the Colab notebook in the repo, with one change: I put the main directory on my Google Drive so I don't have to pull the repo every time and so the models I train are saved automatically.

I trained a model with v2 and a 40k sample rate (maybe this is the problem); I later noticed in the configs that there is no v2 40k config.

I then tried inference there, and it didn't work; it just spat out "Error: 'config'".

I traced it to the vc pipeline, where it loads the checkpoint and tries to access ckpt['config'].
Then, outside of the code, I loaded the checkpoints saved from my training and confirmed they do not have a 'config' key.
I also looked at the code that saves the checkpoints, and there is no 'config' key there either.
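
For reference, this is roughly how I checked the keys (the checkpoint filename is just an example from my run; any of the saved G checkpoints shows the same thing):

```python
import torch

# Load one of the training checkpoints on the CPU and list its
# top-level keys. "G_2333.pth" is a placeholder filename.
ckpt = torch.load("log/model_name/G_2333.pth", map_location="cpu")
print(sorted(ckpt.keys()))  # no 'config' key anywhere in the output
```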

I think I'm doing something wrong, but I'm not sure what.

  1. Should all of the saved G's be usable? The most likely issue is that I'm using the wrong model file: I'm using the ones in the log/model_name directory. RVC usually saves separate weights in a weights directory, but there isn't one here. (See the sketch after this list.)
  2. As mentioned above, I checked the training code where it saves checkpoints, and there is no explicit 'config' key in the .pth it writes for the epoch checkpoints. Are different models saved somewhere else?
  3. Is v2 + 40k supported even though there is no config file for it?
  4. Could having the repo on Google Drive cause issues? I know it sometimes causes Linux path issues.
  5. save_only_latest doesn't seem like a usable flag, since the best model is not necessarily the latest one, and usually we need to go back, compare checkpoints, and pick the best. How is this flag meant to be used?
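
If it helps, here is a sketch of a guard that would make this failure clearer than the bare "Error: 'config'" (my own code, not from the repo; load_exported_checkpoint is a hypothetical helper name):

```python
import torch

def load_exported_checkpoint(path: str) -> dict:
    """Load a checkpoint and fail loudly if it is a raw training
    checkpoint rather than an exported weight file with a 'config' key."""
    ckpt = torch.load(path, map_location="cpu")
    if "config" not in ckpt:
        raise KeyError(
            f"{path} has top-level keys {sorted(ckpt.keys())} but no "
            "'config'; this looks like a raw training checkpoint "
            "(e.g. log/model_name/G_*.pth), not an exported weight file."
        )
    return ckpt
```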

Expected behavior
Inference works
