
ValueError: math domain error #169

Open
3 tasks done
mp075496706 opened this issue Apr 18, 2023 · 0 comments
Labels: help wanted (The issue author is asking for help)

Comments

@mp075496706

Please check the checkboxes below.

  • I have read README.md and Quick solution in wiki carefully.
  • I have already tried troubleshooting through various search engines; the question I am asking is not a common one.
  • I am NOT using one click package / environment package.

OS version

Win10 Professional 22H2

GPU

RTX3060Ti, CUDA Version:12.1, Driver Version:531.41

Python version

3.8.9

PyTorch version

torch-2.0.0+cu118-cp38-cp38-win_amd64.whl

Branch of sovits

4.0(Default)

Dataset source (Used to judge the dataset quality)

Clear vocal audio with no background noise, recorded with a mobile phone

Where the problem occurs or what command you executed

Training

Problem description

When I train with train.py, the command-line window keeps printing lines such as "====> Epoch: XXXX, cost 6.x s".
At first everything was normal, but once the epoch count passed 3000, an occasional "ValueError: math domain error" would appear and stop training,
so each time I had to run the training command again.
I have currently trained up to the G_20800.pth model,
so at line 227 of train.py I added some code; the final modification is as follows:

 for i in losses:
     try:
         reference_loss += math.log(i, 10)
     except ValueError:
         # math.log() raises ValueError ("math domain error") when i <= 0
         print("value error")
         continue

As of the time I submitted this issue, training has not been interrupted again and I have successfully obtained a G_21600.pth model.
So, does anyone know whether this change actually solves my problem properly?
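
For context, math.log raises "ValueError: math domain error" whenever its argument is zero or negative, and the log below does contain a negative loss term (-0.0486... at step 21400), which seems to be exactly the situation that triggers the error. Here is a minimal sketch of an equivalent guard, assuming (this is only how I read the code around line 227, not a confirmed detail of train.py) that reference_loss is only used for the logged "Losses:" line and not for back-propagation:

 import math

 # Example loss values taken from the "Train Epoch: 4274" line in the log below;
 # the last term is negative, so math.log() would raise ValueError on it.
 losses = [1.2744, 4.0403, 12.2335, 18.3336, -0.0486]

 reference_loss = 0.0
 for i in losses:
     if i > 0:
         # math.log(i, 10) is only defined for i > 0
         reference_loss += math.log(i, 10)
     else:
         # Skip non-positive terms instead of catching the exception;
         # this only changes the logged reference value, not the training step.
         print(f"skipping non-positive loss term: {i}")

 print(f"reference_loss: {reference_loss}")

Either way (the try/except above or an explicit i > 0 check), the occasional negative term is simply left out of the logged reference value; as long as reference_loss is not used to compute gradients, skipping it should not affect the G_/D_ checkpoints that are saved.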

Log

2023-04-18 08:25:09,015	44k	INFO	====> Epoch: 4268, cost 6.95 s
2023-04-18 08:25:15,851	44k	INFO	====> Epoch: 4269, cost 6.84 s
2023-04-18 08:25:22,641	44k	INFO	====> Epoch: 4270, cost 6.79 s
2023-04-18 08:25:29,419	44k	INFO	====> Epoch: 4271, cost 6.78 s
2023-04-18 08:25:36,209	44k	INFO	====> Epoch: 4272, cost 6.79 s
2023-04-18 08:25:43,076	44k	INFO	====> Epoch: 4273, cost 6.87 s
2023-04-18 09:25:55,262	44k	INFO	{'train': {'log_interval': 200, 'eval_interval': 800, 'seed': 1234, 'epochs': 30000000, 'learning_rate': 0.0001, 'betas': [0.8, 0.99], 'eps': 1e-09, 'batch_size': 6, 'fp16_run': False, 'lr_decay': 0.999875, 'segment_size': 10240, 'init_lr_ratio': 1, 'warmup_epochs': 0, 'c_mel': 45, 'c_kl': 1.0, 'use_sr': True, 'max_speclen': 512, 'port': '8001', 'keep_ckpts': 3, 'all_in_mem': False}, 'data': {'training_files': 'filelists/train.txt', 'validation_files': 'filelists/val.txt', 'max_wav_value': 32768.0, 'sampling_rate': 44100, 'filter_length': 2048, 'hop_length': 512, 'win_length': 2048, 'n_mel_channels': 80, 'mel_fmin': 0.0, 'mel_fmax': 22050}, 'model': {'inter_channels': 192, 'hidden_channels': 192, 'filter_channels': 768, 'n_heads': 2, 'n_layers': 6, 'kernel_size': 3, 'p_dropout': 0.1, 'resblock': '1', 'resblock_kernel_sizes': [3, 7, 11], 'resblock_dilation_sizes': [[1, 3, 5], [1, 3, 5], [1, 3, 5]], 'upsample_rates': [8, 8, 2, 2, 2], 'upsample_initial_channel': 512, 'upsample_kernel_sizes': [16, 16, 4, 4, 4], 'n_layers_q': 3, 'use_spectral_norm': False, 'gin_channels': 256, 'ssl_dim': 256, 'n_speakers': 1}, 'spk': {'girl': 0}, 'model_dir': './logs\\44k'}
2023-04-18 09:25:55,263	44k	WARNING	D:\so-vits-svc-4.0 is not a git repository, therefore hash value comparison will be ignored.
2023-04-18 09:25:57,163	44k	INFO	Loaded checkpoint './logs\44k\G_20800.pth' (iteration 4155)
2023-04-18 09:25:57,669	44k	INFO	Loaded checkpoint './logs\44k\D_20800.pth' (iteration 4155)
2023-04-18 09:26:14,955	44k	INFO	====> Epoch: 4155, cost 19.69 s
2023-04-18 09:26:21,865	44k	INFO	====> Epoch: 4156, cost 6.91 s
2023-04-18 09:26:28,869	44k	INFO	====> Epoch: 4157, cost 7.00 s
2023-04-18 09:26:35,866	44k	INFO	====> Epoch: 4158, cost 7.00 s
2023-04-18 09:26:42,847	44k	INFO	====> Epoch: 4159, cost 6.98 s
2023-04-18 09:26:49,664	44k	INFO	====> Epoch: 4160, cost 6.82 s
2023-04-18 09:26:56,796	44k	INFO	====> Epoch: 4161, cost 7.13 s
2023-04-18 09:27:03,655	44k	INFO	====> Epoch: 4162, cost 6.86 s
2023-04-18 09:27:10,634	44k	INFO	====> Epoch: 4163, cost 6.98 s
2023-04-18 09:27:17,629	44k	INFO	====> Epoch: 4164, cost 6.99 s
2023-04-18 09:27:24,616	44k	INFO	====> Epoch: 4165, cost 6.99 s
2023-04-18 09:27:31,605	44k	INFO	====> Epoch: 4166, cost 6.99 s
2023-04-18 09:27:38,464	44k	INFO	====> Epoch: 4167, cost 6.86 s
2023-04-18 09:27:45,199	44k	INFO	====> Epoch: 4168, cost 6.73 s
2023-04-18 09:27:52,243	44k	INFO	====> Epoch: 4169, cost 7.04 s
2023-04-18 09:27:59,200	44k	INFO	====> Epoch: 4170, cost 6.96 s
2023-04-18 09:28:06,193	44k	INFO	====> Epoch: 4171, cost 6.99 s
2023-04-18 09:28:13,205	44k	INFO	====> Epoch: 4172, cost 7.01 s
2023-04-18 09:28:20,027	44k	INFO	====> Epoch: 4173, cost 6.82 s
2023-04-18 09:28:27,053	44k	INFO	====> Epoch: 4174, cost 7.03 s
2023-04-18 09:28:34,137	44k	INFO	====> Epoch: 4175, cost 7.08 s
2023-04-18 09:28:41,204	44k	INFO	====> Epoch: 4176, cost 7.07 s
2023-04-18 09:28:48,197	44k	INFO	====> Epoch: 4177, cost 6.99 s
2023-04-18 09:28:55,371	44k	INFO	====> Epoch: 4178, cost 7.17 s
2023-04-18 09:29:02,392	44k	INFO	====> Epoch: 4179, cost 7.02 s
2023-04-18 09:29:09,405	44k	INFO	====> Epoch: 4180, cost 7.01 s
2023-04-18 09:29:16,420	44k	INFO	====> Epoch: 4181, cost 7.01 s
2023-04-18 09:29:23,422	44k	INFO	====> Epoch: 4182, cost 7.00 s
2023-04-18 09:29:30,422	44k	INFO	====> Epoch: 4183, cost 7.00 s
2023-04-18 09:29:37,431	44k	INFO	====> Epoch: 4184, cost 7.01 s
2023-04-18 09:29:44,452	44k	INFO	====> Epoch: 4185, cost 7.02 s
2023-04-18 09:29:51,429	44k	INFO	====> Epoch: 4186, cost 6.98 s
2023-04-18 09:29:58,439	44k	INFO	====> Epoch: 4187, cost 7.01 s
2023-04-18 09:30:05,488	44k	INFO	====> Epoch: 4188, cost 7.05 s
2023-04-18 09:30:12,565	44k	INFO	====> Epoch: 4189, cost 7.08 s
2023-04-18 09:30:19,584	44k	INFO	====> Epoch: 4190, cost 7.02 s
2023-04-18 09:30:26,417	44k	INFO	====> Epoch: 4191, cost 6.83 s
2023-04-18 09:30:33,376	44k	INFO	====> Epoch: 4192, cost 6.96 s
2023-04-18 09:30:40,353	44k	INFO	====> Epoch: 4193, cost 6.98 s
2023-04-18 09:30:46,549	44k	INFO	Train Epoch: 4194 [80%]
2023-04-18 09:30:46,550	44k	INFO	Losses: [1.1533994674682617, 3.6290440559387207, 14.561955451965332, 19.54532814025879, 0.36340442299842834], step: 21000, lr: 5.911663351026662e-05, reference_loss: 26.36424860035109
2023-04-18 09:30:47,752	44k	INFO	====> Epoch: 4194, cost 7.40 s
2023-04-18 09:30:54,723	44k	INFO	====> Epoch: 4195, cost 6.97 s
2023-04-18 09:31:01,739	44k	INFO	====> Epoch: 4196, cost 7.02 s
2023-04-18 09:31:08,731	44k	INFO	====> Epoch: 4197, cost 6.99 s
2023-04-18 09:31:15,757	44k	INFO	====> Epoch: 4198, cost 7.03 s
2023-04-18 09:31:22,581	44k	INFO	====> Epoch: 4199, cost 6.82 s
2023-04-18 09:31:30,207	44k	INFO	====> Epoch: 4200, cost 7.63 s
2023-04-18 09:31:37,363	44k	INFO	====> Epoch: 4201, cost 7.16 s
2023-04-18 09:31:44,371	44k	INFO	====> Epoch: 4202, cost 7.01 s
2023-04-18 09:31:51,436	44k	INFO	====> Epoch: 4203, cost 7.07 s
2023-04-18 09:31:58,523	44k	INFO	====> Epoch: 4204, cost 7.09 s
2023-04-18 09:32:05,562	44k	INFO	====> Epoch: 4205, cost 7.04 s
2023-04-18 09:32:12,568	44k	INFO	====> Epoch: 4206, cost 7.01 s
2023-04-18 09:32:19,588	44k	INFO	====> Epoch: 4207, cost 7.02 s
2023-04-18 09:32:26,580	44k	INFO	====> Epoch: 4208, cost 6.99 s
2023-04-18 09:32:33,740	44k	INFO	====> Epoch: 4209, cost 7.16 s
2023-04-18 09:32:40,922	44k	INFO	====> Epoch: 4210, cost 7.18 s
2023-04-18 09:32:47,981	44k	INFO	====> Epoch: 4211, cost 7.06 s
2023-04-18 09:33:56,656	44k	INFO	{'train': {'log_interval': 200, 'eval_interval': 800, 'seed': 1234, 'epochs': 30000000, 'learning_rate': 0.0001, 'betas': [0.8, 0.99], 'eps': 1e-09, 'batch_size': 6, 'fp16_run': False, 'lr_decay': 0.999875, 'segment_size': 10240, 'init_lr_ratio': 1, 'warmup_epochs': 0, 'c_mel': 45, 'c_kl': 1.0, 'use_sr': True, 'max_speclen': 512, 'port': '8001', 'keep_ckpts': 3, 'all_in_mem': False}, 'data': {'training_files': 'filelists/train.txt', 'validation_files': 'filelists/val.txt', 'max_wav_value': 32768.0, 'sampling_rate': 44100, 'filter_length': 2048, 'hop_length': 512, 'win_length': 2048, 'n_mel_channels': 80, 'mel_fmin': 0.0, 'mel_fmax': 22050}, 'model': {'inter_channels': 192, 'hidden_channels': 192, 'filter_channels': 768, 'n_heads': 2, 'n_layers': 6, 'kernel_size': 3, 'p_dropout': 0.1, 'resblock': '1', 'resblock_kernel_sizes': [3, 7, 11], 'resblock_dilation_sizes': [[1, 3, 5], [1, 3, 5], [1, 3, 5]], 'upsample_rates': [8, 8, 2, 2, 2], 'upsample_initial_channel': 512, 'upsample_kernel_sizes': [16, 16, 4, 4, 4], 'n_layers_q': 3, 'use_spectral_norm': False, 'gin_channels': 256, 'ssl_dim': 256, 'n_speakers': 1}, 'spk': {'girl': 0}, 'model_dir': './logs\\44k'}
2023-04-18 09:33:56,656	44k	WARNING	D:\so-vits-svc-4.0 is not a git repository, therefore hash value comparison will be ignored.
2023-04-18 09:33:58,752	44k	INFO	Loaded checkpoint './logs\44k\G_20800.pth' (iteration 4155)
2023-04-18 09:33:59,262	44k	INFO	Loaded checkpoint './logs\44k\D_20800.pth' (iteration 4155)
2023-04-18 09:34:16,157	44k	INFO	====> Epoch: 4155, cost 19.50 s
2023-04-18 09:34:23,356	44k	INFO	====> Epoch: 4156, cost 7.20 s
2023-04-18 09:34:30,503	44k	INFO	====> Epoch: 4157, cost 7.15 s
2023-04-18 09:34:37,497	44k	INFO	====> Epoch: 4158, cost 6.99 s
2023-04-18 09:34:44,505	44k	INFO	====> Epoch: 4159, cost 7.01 s
2023-04-18 09:34:51,516	44k	INFO	====> Epoch: 4160, cost 7.01 s
2023-04-18 09:34:58,451	44k	INFO	====> Epoch: 4161, cost 6.94 s
2023-04-18 09:35:05,475	44k	INFO	====> Epoch: 4162, cost 7.02 s
2023-04-18 09:35:12,485	44k	INFO	====> Epoch: 4163, cost 7.01 s
2023-04-18 09:35:19,623	44k	INFO	====> Epoch: 4164, cost 7.14 s
2023-04-18 09:35:26,481	44k	INFO	====> Epoch: 4165, cost 6.86 s
2023-04-18 09:35:33,276	44k	INFO	====> Epoch: 4166, cost 6.79 s
2023-04-18 09:35:40,297	44k	INFO	====> Epoch: 4167, cost 7.02 s
2023-04-18 09:35:47,058	44k	INFO	====> Epoch: 4168, cost 6.76 s
2023-04-18 09:35:53,872	44k	INFO	====> Epoch: 4169, cost 6.81 s
2023-04-18 09:36:00,877	44k	INFO	====> Epoch: 4170, cost 7.00 s
2023-04-18 09:36:07,828	44k	INFO	====> Epoch: 4171, cost 6.95 s
2023-04-18 09:36:14,829	44k	INFO	====> Epoch: 4172, cost 7.00 s
2023-04-18 09:36:21,836	44k	INFO	====> Epoch: 4173, cost 7.01 s
2023-04-18 09:36:28,956	44k	INFO	====> Epoch: 4174, cost 7.12 s
2023-04-18 09:36:36,014	44k	INFO	====> Epoch: 4175, cost 7.06 s
2023-04-18 09:36:43,062	44k	INFO	====> Epoch: 4176, cost 7.05 s
2023-04-18 09:36:50,053	44k	INFO	====> Epoch: 4177, cost 6.99 s
2023-04-18 09:36:57,063	44k	INFO	====> Epoch: 4178, cost 7.01 s
2023-04-18 09:37:04,202	44k	INFO	====> Epoch: 4179, cost 7.14 s
2023-04-18 09:37:11,389	44k	INFO	====> Epoch: 4180, cost 7.19 s
2023-04-18 09:37:18,245	44k	INFO	====> Epoch: 4181, cost 6.86 s
2023-04-18 09:37:25,232	44k	INFO	====> Epoch: 4182, cost 6.99 s
2023-04-18 09:37:32,269	44k	INFO	====> Epoch: 4183, cost 7.04 s
2023-04-18 09:37:39,063	44k	INFO	====> Epoch: 4184, cost 6.79 s
2023-04-18 09:37:46,045	44k	INFO	====> Epoch: 4185, cost 6.98 s
2023-04-18 09:37:53,036	44k	INFO	====> Epoch: 4186, cost 6.99 s
2023-04-18 09:38:00,049	44k	INFO	====> Epoch: 4187, cost 7.01 s
2023-04-18 09:38:07,153	44k	INFO	====> Epoch: 4188, cost 7.10 s
2023-04-18 09:38:14,216	44k	INFO	====> Epoch: 4189, cost 7.06 s
2023-04-18 09:38:21,235	44k	INFO	====> Epoch: 4190, cost 7.02 s
2023-04-18 09:38:28,040	44k	INFO	====> Epoch: 4191, cost 6.80 s
2023-04-18 09:38:35,169	44k	INFO	====> Epoch: 4192, cost 7.13 s
2023-04-18 09:38:42,192	44k	INFO	====> Epoch: 4193, cost 7.02 s
2023-04-18 09:38:48,463	44k	INFO	Train Epoch: 4194 [80%]
2023-04-18 09:38:48,463	44k	INFO	Losses: [1.0800206661224365, 4.064505100250244, 15.424717903137207, 19.831018447875977, 0.3648463785648346], step: 21000, lr: 5.911663351026662e-05, reference_loss: 26.90112027122955
2023-04-18 09:38:49,673	44k	INFO	====> Epoch: 4194, cost 7.48 s
2023-04-18 09:38:56,654	44k	INFO	====> Epoch: 4195, cost 6.98 s
2023-04-18 09:39:03,624	44k	INFO	====> Epoch: 4196, cost 6.97 s
2023-04-18 09:39:10,415	44k	INFO	====> Epoch: 4197, cost 6.79 s
2023-04-18 09:39:17,242	44k	INFO	====> Epoch: 4198, cost 6.83 s
2023-04-18 09:39:23,999	44k	INFO	====> Epoch: 4199, cost 6.76 s
2023-04-18 09:39:31,070	44k	INFO	====> Epoch: 4200, cost 7.07 s
2023-04-18 09:39:38,214	44k	INFO	====> Epoch: 4201, cost 7.14 s
2023-04-18 09:39:45,221	44k	INFO	====> Epoch: 4202, cost 7.01 s
2023-04-18 09:39:52,633	44k	INFO	====> Epoch: 4203, cost 7.41 s
2023-04-18 09:39:59,948	44k	INFO	====> Epoch: 4204, cost 7.32 s
2023-04-18 09:40:07,205	44k	INFO	====> Epoch: 4205, cost 7.26 s
2023-04-18 09:40:14,472	44k	INFO	====> Epoch: 4206, cost 7.27 s
2023-04-18 09:40:21,267	44k	INFO	====> Epoch: 4207, cost 6.79 s
2023-04-18 09:40:28,029	44k	INFO	====> Epoch: 4208, cost 6.76 s
2023-04-18 09:40:35,251	44k	INFO	====> Epoch: 4209, cost 7.22 s
2023-04-18 09:40:42,567	44k	INFO	====> Epoch: 4210, cost 7.32 s
2023-04-18 09:40:49,788	44k	INFO	====> Epoch: 4211, cost 7.22 s
2023-04-18 09:40:57,181	44k	INFO	====> Epoch: 4212, cost 7.39 s
2023-04-18 09:41:04,581	44k	INFO	====> Epoch: 4213, cost 7.40 s
2023-04-18 09:41:11,961	44k	INFO	====> Epoch: 4214, cost 7.38 s
2023-04-18 09:41:19,223	44k	INFO	====> Epoch: 4215, cost 7.26 s
2023-04-18 09:41:26,156	44k	INFO	====> Epoch: 4216, cost 6.93 s
2023-04-18 09:41:33,114	44k	INFO	====> Epoch: 4217, cost 6.96 s
2023-04-18 09:41:39,961	44k	INFO	====> Epoch: 4218, cost 6.85 s
2023-04-18 09:41:46,999	44k	INFO	====> Epoch: 4219, cost 7.04 s
2023-04-18 09:41:53,956	44k	INFO	====> Epoch: 4220, cost 6.96 s
2023-04-18 09:42:00,966	44k	INFO	====> Epoch: 4221, cost 7.01 s
2023-04-18 09:42:07,971	44k	INFO	====> Epoch: 4222, cost 7.01 s
2023-04-18 09:42:14,960	44k	INFO	====> Epoch: 4223, cost 6.99 s
2023-04-18 09:42:21,943	44k	INFO	====> Epoch: 4224, cost 6.98 s
2023-04-18 09:42:28,961	44k	INFO	====> Epoch: 4225, cost 7.02 s
2023-04-18 09:42:35,951	44k	INFO	====> Epoch: 4226, cost 6.99 s
2023-04-18 09:42:42,933	44k	INFO	====> Epoch: 4227, cost 6.98 s
2023-04-18 09:42:49,954	44k	INFO	====> Epoch: 4228, cost 7.02 s
2023-04-18 09:42:57,011	44k	INFO	====> Epoch: 4229, cost 7.06 s
2023-04-18 09:43:03,967	44k	INFO	====> Epoch: 4230, cost 6.96 s
2023-04-18 09:43:10,945	44k	INFO	====> Epoch: 4231, cost 6.98 s
2023-04-18 09:43:17,946	44k	INFO	====> Epoch: 4232, cost 7.00 s
2023-04-18 09:43:24,962	44k	INFO	====> Epoch: 4233, cost 7.02 s
2023-04-18 09:43:31,123	44k	INFO	Train Epoch: 4234 [80%]
2023-04-18 09:43:31,124	44k	INFO	Losses: [1.0183080434799194, 3.8086729049682617, 17.133316040039062, 20.698583602905273, 0.18538106977939606], step: 21200, lr: 5.882176968723764e-05, reference_loss: 24.065002883507034
2023-04-18 09:43:32,331	44k	INFO	====> Epoch: 4234, cost 7.37 s
2023-04-18 09:43:39,333	44k	INFO	====> Epoch: 4235, cost 7.00 s
2023-04-18 09:43:46,345	44k	INFO	====> Epoch: 4236, cost 7.01 s
2023-04-18 09:43:53,353	44k	INFO	====> Epoch: 4237, cost 7.01 s
2023-04-18 09:44:00,457	44k	INFO	====> Epoch: 4238, cost 7.10 s
2023-04-18 09:44:07,568	44k	INFO	====> Epoch: 4239, cost 7.11 s
2023-04-18 09:44:14,483	44k	INFO	====> Epoch: 4240, cost 6.92 s
2023-04-18 09:44:21,336	44k	INFO	====> Epoch: 4241, cost 6.85 s
2023-04-18 09:44:28,332	44k	INFO	====> Epoch: 4242, cost 7.00 s
2023-04-18 09:44:35,170	44k	INFO	====> Epoch: 4243, cost 6.84 s
2023-04-18 09:44:42,134	44k	INFO	====> Epoch: 4244, cost 6.96 s
2023-04-18 09:44:49,084	44k	INFO	====> Epoch: 4245, cost 6.95 s
2023-04-18 09:44:55,920	44k	INFO	====> Epoch: 4246, cost 6.84 s
2023-04-18 09:45:02,928	44k	INFO	====> Epoch: 4247, cost 7.01 s
2023-04-18 09:45:09,722	44k	INFO	====> Epoch: 4248, cost 6.79 s
2023-04-18 09:45:16,692	44k	INFO	====> Epoch: 4249, cost 6.97 s
2023-04-18 09:45:23,679	44k	INFO	====> Epoch: 4250, cost 6.99 s
2023-04-18 09:45:30,709	44k	INFO	====> Epoch: 4251, cost 7.03 s
2023-04-18 09:45:37,836	44k	INFO	====> Epoch: 4252, cost 7.13 s
2023-04-18 09:45:45,289	44k	INFO	====> Epoch: 4253, cost 7.45 s
2023-04-18 09:45:52,102	44k	INFO	====> Epoch: 4254, cost 6.81 s
2023-04-18 09:45:59,170	44k	INFO	====> Epoch: 4255, cost 7.07 s
2023-04-18 09:46:06,285	44k	INFO	====> Epoch: 4256, cost 7.12 s
2023-04-18 09:46:13,114	44k	INFO	====> Epoch: 4257, cost 6.83 s
2023-04-18 09:46:19,899	44k	INFO	====> Epoch: 4258, cost 6.79 s
2023-04-18 09:46:26,868	44k	INFO	====> Epoch: 4259, cost 6.97 s
2023-04-18 09:46:33,669	44k	INFO	====> Epoch: 4260, cost 6.80 s
2023-04-18 09:46:40,690	44k	INFO	====> Epoch: 4261, cost 7.02 s
2023-04-18 09:46:47,659	44k	INFO	====> Epoch: 4262, cost 6.97 s
2023-04-18 09:46:54,645	44k	INFO	====> Epoch: 4263, cost 6.99 s
2023-04-18 09:47:01,637	44k	INFO	====> Epoch: 4264, cost 6.99 s
2023-04-18 09:47:08,659	44k	INFO	====> Epoch: 4265, cost 7.02 s
2023-04-18 09:47:15,675	44k	INFO	====> Epoch: 4266, cost 7.02 s
2023-04-18 09:47:22,893	44k	INFO	====> Epoch: 4267, cost 7.22 s
2023-04-18 09:47:29,880	44k	INFO	====> Epoch: 4268, cost 6.99 s
2023-04-18 09:47:36,841	44k	INFO	====> Epoch: 4269, cost 6.96 s
2023-04-18 09:47:44,082	44k	INFO	====> Epoch: 4270, cost 7.24 s
2023-04-18 09:47:51,259	44k	INFO	====> Epoch: 4271, cost 7.18 s
2023-04-18 09:47:58,268	44k	INFO	====> Epoch: 4272, cost 7.01 s
2023-04-18 09:48:05,247	44k	INFO	====> Epoch: 4273, cost 6.98 s
2023-04-18 09:48:11,453	44k	INFO	Train Epoch: 4274 [80%]
2023-04-18 09:48:11,453	44k	INFO	Losses: [1.2744048833847046, 4.040312767028809, 12.233488082885742, 18.33358383178711, -0.048623278737068176], step: 21400, lr: 5.852837659535434e-05, reference_loss: 30.625200847914876
2023-04-18 09:48:12,646	44k	INFO	====> Epoch: 4274, cost 7.40 s
2023-04-18 09:48:19,683	44k	INFO	====> Epoch: 4275, cost 7.04 s
2023-04-18 09:48:26,643	44k	INFO	====> Epoch: 4276, cost 6.96 s
2023-04-18 09:48:33,451	44k	INFO	====> Epoch: 4277, cost 6.81 s
2023-04-18 09:48:40,242	44k	INFO	====> Epoch: 4278, cost 6.79 s
2023-04-18 09:48:47,055	44k	INFO	====> Epoch: 4279, cost 6.81 s
2023-04-18 09:48:54,005	44k	INFO	====> Epoch: 4280, cost 6.95 s
2023-04-18 09:49:00,981	44k	INFO	====> Epoch: 4281, cost 6.98 s
2023-04-18 09:49:08,014	44k	INFO	====> Epoch: 4282, cost 7.03 s
2023-04-18 09:49:15,008	44k	INFO	====> Epoch: 4283, cost 6.99 s
2023-04-18 09:49:22,001	44k	INFO	====> Epoch: 4284, cost 6.99 s
2023-04-18 09:49:28,849	44k	INFO	====> Epoch: 4285, cost 6.85 s
2023-04-18 09:49:35,853	44k	INFO	====> Epoch: 4286, cost 7.00 s
2023-04-18 09:49:42,830	44k	INFO	====> Epoch: 4287, cost 6.98 s
2023-04-18 09:49:49,835	44k	INFO	====> Epoch: 4288, cost 7.00 s
2023-04-18 09:49:56,828	44k	INFO	====> Epoch: 4289, cost 6.99 s
2023-04-18 09:50:03,802	44k	INFO	====> Epoch: 4290, cost 6.97 s
2023-04-18 09:50:10,824	44k	INFO	====> Epoch: 4291, cost 7.02 s
2023-04-18 09:50:17,819	44k	INFO	====> Epoch: 4292, cost 7.00 s
2023-04-18 09:50:24,790	44k	INFO	====> Epoch: 4293, cost 6.97 s
2023-04-18 09:50:31,561	44k	INFO	====> Epoch: 4294, cost 6.77 s
2023-04-18 09:50:38,417	44k	INFO	====> Epoch: 4295, cost 6.86 s
2023-04-18 09:50:45,347	44k	INFO	====> Epoch: 4296, cost 6.93 s
2023-04-18 09:50:52,376	44k	INFO	====> Epoch: 4297, cost 7.03 s
2023-04-18 09:50:59,358	44k	INFO	====> Epoch: 4298, cost 6.98 s
2023-04-18 09:51:06,171	44k	INFO	====> Epoch: 4299, cost 6.81 s
2023-04-18 09:51:13,159	44k	INFO	====> Epoch: 4300, cost 6.99 s
2023-04-18 09:51:20,055	44k	INFO	====> Epoch: 4301, cost 6.90 s
2023-04-18 09:51:26,807	44k	INFO	====> Epoch: 4302, cost 6.75 s
2023-04-18 09:51:33,754	44k	INFO	====> Epoch: 4303, cost 6.95 s
2023-04-18 09:51:40,844	44k	INFO	====> Epoch: 4304, cost 7.09 s
2023-04-18 09:51:47,810	44k	INFO	====> Epoch: 4305, cost 6.97 s
2023-04-18 09:51:54,802	44k	INFO	====> Epoch: 4306, cost 6.99 s
2023-04-18 09:52:02,009	44k	INFO	====> Epoch: 4307, cost 7.21 s
2023-04-18 09:52:09,021	44k	INFO	====> Epoch: 4308, cost 7.01 s
2023-04-18 09:52:15,971	44k	INFO	====> Epoch: 4309, cost 6.95 s
2023-04-18 09:52:22,962	44k	INFO	====> Epoch: 4310, cost 6.99 s
2023-04-18 09:52:29,934	44k	INFO	====> Epoch: 4311, cost 6.97 s
2023-04-18 09:52:36,929	44k	INFO	====> Epoch: 4312, cost 7.00 s
2023-04-18 09:52:43,778	44k	INFO	====> Epoch: 4313, cost 6.85 s
2023-04-18 09:52:49,831	44k	INFO	Train Epoch: 4314 [80%]
2023-04-18 09:52:49,831	44k	INFO	Losses: [1.0186668634414673, 4.287105083465576, 14.895614624023438, 21.103084564208984, 0.8100769519805908], step: 21600, lr: 5.8236446898857163e-05, reference_loss: 30.461269509519603
2023-04-18 09:52:57,074	44k	INFO	Saving model and optimizer state at iteration 4314 to ./logs\44k\G_21600.pth
2023-04-18 09:52:57,777	44k	INFO	Saving model and optimizer state at iteration 4314 to ./logs\44k\D_21600.pth
2023-04-18 09:52:58,408	44k	INFO	.. Free up space by deleting ckpt ./logs\44k\G_19200.pth
2023-04-18 09:52:58,438	44k	INFO	.. Free up space by deleting ckpt ./logs\44k\D_19200.pth
2023-04-18 09:52:59,281	44k	INFO	====> Epoch: 4314, cost 15.50 s
2023-04-18 09:53:06,400	44k	INFO	====> Epoch: 4315, cost 7.12 s
2023-04-18 09:53:13,503	44k	INFO	====> Epoch: 4316, cost 7.10 s
2023-04-18 09:53:20,536	44k	INFO	====> Epoch: 4317, cost 7.03 s
2023-04-18 09:53:27,394	44k	INFO	====> Epoch: 4318, cost 6.86 s
2023-04-18 09:53:34,184	44k	INFO	====> Epoch: 4319, cost 6.79 s
2023-04-18 09:53:41,173	44k	INFO	====> Epoch: 4320, cost 6.99 s
2023-04-18 09:53:48,193	44k	INFO	====> Epoch: 4321, cost 7.02 s
2023-04-18 09:53:55,135	44k	INFO	====> Epoch: 4322, cost 6.94 s
2023-04-18 09:54:02,142	44k	INFO	====> Epoch: 4323, cost 7.01 s
2023-04-18 09:54:09,154	44k	INFO	====> Epoch: 4324, cost 7.01 s
2023-04-18 09:54:16,281	44k	INFO	====> Epoch: 4325, cost 7.13 s
2023-04-18 09:54:23,353	44k	INFO	====> Epoch: 4326, cost 7.07 s
2023-04-18 09:54:30,382	44k	INFO	====> Epoch: 4327, cost 7.03 s
2023-04-18 09:54:37,331	44k	INFO	====> Epoch: 4328, cost 6.95 s
2023-04-18 09:54:44,170	44k	INFO	====> Epoch: 4329, cost 6.84 s
2023-04-18 09:54:51,116	44k	INFO	====> Epoch: 4330, cost 6.95 s
2023-04-18 09:54:57,971	44k	INFO	====> Epoch: 4331, cost 6.86 s
2023-04-18 09:55:04,945	44k	INFO	====> Epoch: 4332, cost 6.97 s
2023-04-18 09:55:11,903	44k	INFO	====> Epoch: 4333, cost 6.96 s
2023-04-18 09:55:18,925	44k	INFO	====> Epoch: 4334, cost 7.02 s
2023-04-18 09:55:25,903	44k	INFO	====> Epoch: 4335, cost 6.98 s
2023-04-18 09:55:32,890	44k	INFO	====> Epoch: 4336, cost 6.99 s
2023-04-18 09:55:39,928	44k	INFO	====> Epoch: 4337, cost 7.04 s
2023-04-18 09:55:47,016	44k	INFO	====> Epoch: 4338, cost 7.09 s
2023-04-18 09:55:53,979	44k	INFO	====> Epoch: 4339, cost 6.96 s
2023-04-18 09:56:00,959	44k	INFO	====> Epoch: 4340, cost 6.98 s
2023-04-18 09:56:08,048	44k	INFO	====> Epoch: 4341, cost 7.09 s
2023-04-18 09:56:16,485	44k	INFO	====> Epoch: 4342, cost 8.44 s
2023-04-18 09:56:23,844	44k	INFO	====> Epoch: 4343, cost 7.36 s
2023-04-18 09:56:30,674	44k	INFO	====> Epoch: 4344, cost 6.83 s
2023-04-18 09:56:37,678	44k	INFO	====> Epoch: 4345, cost 7.00 s
2023-04-18 09:56:44,666	44k	INFO	====> Epoch: 4346, cost 6.99 s
2023-04-18 09:56:51,496	44k	INFO	====> Epoch: 4347, cost 6.83 s
2023-04-18 09:56:58,496	44k	INFO	====> Epoch: 4348, cost 7.00 s
2023-04-18 09:57:05,491	44k	INFO	====> Epoch: 4349, cost 7.00 s
2023-04-18 09:57:12,481	44k	INFO	====> Epoch: 4350, cost 6.99 s
2023-04-18 09:57:19,402	44k	INFO	====> Epoch: 4351, cost 6.92 s
2023-04-18 09:57:26,246	44k	INFO	====> Epoch: 4352, cost 6.84 s
2023-04-18 09:57:33,263	44k	INFO	====> Epoch: 4353, cost 7.02 s
2023-04-18 09:57:39,569	44k	INFO	Train Epoch: 4354 [80%]
2023-04-18 09:57:39,569	44k	INFO	Losses: [2.0403993129730225, 2.927959442138672, 8.839935302734375, 17.390018463134766, 0.8205661177635193], step: 21800, lr: 5.7945973298576136e-05, reference_loss: 28.771429352922596
2023-04-18 09:57:40,772	44k	INFO	====> Epoch: 4354, cost 7.51 s
2023-04-18 09:57:47,675	44k	INFO	====> Epoch: 4355, cost 6.90 s
2023-04-18 09:57:54,603	44k	INFO	====> Epoch: 4356, cost 6.93 s
2023-04-18 09:58:01,504	44k	INFO	====> Epoch: 4357, cost 6.90 s
2023-04-18 09:58:08,493	44k	INFO	====> Epoch: 4358, cost 6.99 s
2023-04-18 09:58:15,454	44k	INFO	====> Epoch: 4359, cost 6.96 s
2023-04-18 09:58:22,253	44k	INFO	====> Epoch: 4360, cost 6.80 s
2023-04-18 09:58:29,204	44k	INFO	====> Epoch: 4361, cost 6.95 s
2023-04-18 09:58:36,046	44k	INFO	====> Epoch: 4362, cost 6.84 s
2023-04-18 09:58:42,876	44k	INFO	====> Epoch: 4363, cost 6.83 s

Screenshot so-vits-svc and logs/44k folders and paste here

[screenshots of the so-vits-svc and logs/44k folders]

Supplementary description

No response

mp075496706 added the "help wanted" label on Apr 18, 2023