
can't train the clap #21

Open
vivagwb opened this issue May 10, 2023 · 3 comments

vivagwb commented May 10, 2023

```
(open-musiclm) G:\Learn\AmateurLearning\AI\Practice\open-musiclm-main>python ./scripts/train_clap_rvq.py --results_folder ./results/clap_rvq --model_config ./configs/model/musiclm_small.json --training_config ./configs/training/train_musiclm_fma.json
loading clap...
Some weights of the model checkpoint at roberta-base were not used when initializing RobertaModel: ['lm_head.decoder.weight', 'lm_head.dense.weight', 'lm_head.bias', 'lm_head.layer_norm.weight', 'lm_head.layer_norm.bias', 'lm_head.dense.bias']
- This IS expected if you are initializing RobertaModel from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).
- This IS NOT expected if you are initializing RobertaModel from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).
F:\Miniconda\envs\open-musiclm\Lib\site-packages\torchaudio\transforms\_transforms.py:611: UserWarning: Argument 'onesided' has been deprecated and has no influence on the behavior of this module.
  warnings.warn(
F:\Miniconda\envs\open-musiclm\Lib\site-packages\accelerate\accelerator.py:258: FutureWarning: logging_dir is deprecated and will be removed in version 0.18.0 of 🤗 Accelerate. Use project_dir instead.
  warnings.warn(
F:\Miniconda\envs\open-musiclm\Lib\site-packages\accelerate\accelerator.py:375: UserWarning: log_with=tensorboard was passed but no supported trackers are currently installed.
  warnings.warn(f"log_with={log_with} was passed but no supported trackers are currently installed.")
Traceback (most recent call last):
  File "G:\Learn\AmateurLearning\AI\Practice\open-musiclm-main\scripts\train_clap_rvq.py", line 33, in <module>
    trainer = create_clap_rvq_trainer_from_config(
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "G:\Learn\AmateurLearning\AI\Practice\open-musiclm-main\scripts\..\open_musiclm\config.py", line 317, in create_clap_rvq_trainer_from_config
    trainer = ClapRVQTrainer(
              ^^^^^^^^^^^^^^^
  File "G:\Learn\AmateurLearning\AI\Practice\open-musiclm-main\scripts\..\open_musiclm\trainer.py", line 606, in __init__
    self.ds = SoundDataset(
              ^^^^^^^^^^^^^
  File "G:\Learn\AmateurLearning\AI\Practice\open-musiclm-main\scripts\..\open_musiclm\data.py", line 80, in __init__
    assert path.exists(), 'folder does not exist'
AssertionError: folder does not exist
```

@zhvng @jlalmes
@MichaelALong

This error occurs because the script cannot find the training data. The training_config parameter should point to the config file that specifies the data directory. To download the FMA data, use the script download_fma_large.sh. There is also a small (7GB) FMA version which you can download manually instead.
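One quick way to confirm this is to check whether the folder named in the training config actually exists before launching training. The sketch below assumes the dataset location is stored under a `folder` key somewhere in the JSON config (adjust `key` if your config uses a different name); `check_data_folder` is a hypothetical helper, not part of the repo:

```python
import json
from pathlib import Path

def check_data_folder(config_path, key="folder"):
    """Report whether the data folder named in a training config exists.

    Depth-first-searches the (possibly nested) JSON config for `key`
    and checks the path it points to. The key name is an assumption
    about the config layout; adjust it to match your config file.
    """
    with open(config_path) as f:
        cfg = json.load(f)

    def find(node):
        # return the first value stored under `key`, searching nested dicts
        if isinstance(node, dict):
            if key in node:
                return node[key]
            for value in node.values():
                found = find(value)
                if found is not None:
                    return found
        return None

    folder = find(cfg)
    if folder is None:
        return f"no '{key}' key found in {config_path}"
    path = Path(folder)
    return f"{path} exists: {path.exists()}"
```

Running this against your `train_musiclm_fma.json` before training makes the "folder does not exist" assertion easy to anticipate.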


fahnub commented Jun 12, 2023

Thanks for this @MichaelALong


baardev commented Nov 4, 2023

Just create the folder `data/fma_large` (or whatever is defined as "folder" in `configs/training/train_musiclm_fma.json`) and you will move past that error to the next one, "no files exists". That mystery, in turn, is solved by copying your mp3, wav, or flac files into `data/fma_large`.
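The two steps above can be sketched as shell commands; the source path for the audio files is hypothetical, so substitute wherever your files actually live:

```shell
# create the data folder named under "folder" in configs/training/train_musiclm_fma.json
mkdir -p data/fma_large

# copy your audio into it; mp3, wav, and flac are all accepted
cp /path/to/your/audio/*.mp3 data/fma_large/
```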

Suggestion to the devs: instead of "folder does not exist" and "no files exist", maybe say "folder 'data/fma_large' does not exist" and "no wav, flac, or mp3 files found in 'data/fma_large'".
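That suggestion could look something like the following in the dataset's `__init__`. This is a sketch of the idea rather than the repo's actual code, and `validate_sound_folder` is a hypothetical helper name:

```python
from pathlib import Path

def validate_sound_folder(folder, exts=(".wav", ".flac", ".mp3")):
    """Fail with messages that name the folder and the extensions searched,
    instead of a bare 'folder does not exist' / 'no files exist'."""
    path = Path(folder)
    assert path.exists(), f"folder '{path}' does not exist"
    # search the folder recursively for every accepted extension
    files = [f for ext in exts for f in path.rglob(f"*{ext}")]
    assert len(files) > 0, f"no {', '.join(exts)} files found in '{path}'"
    return files
```

With that, both failure modes tell the user exactly which path was checked and which file types were expected.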
