
Strange output log #2

Open
launchauto opened this issue May 28, 2021 · 12 comments
Comments

@launchauto

Hi authors, I have pretrained your moby_swin_tiny model using 8 Tesla V100 GPUs
and reproduced your results on the downstream tasks: 74.394% on linear evaluation, 43.1% on COCO object detection, and 39.3% on COCO segmentation. However, the loss and grad_norm look really weird during training. Could you share your log?
Here is mine. The loss drops to 7, then rises to 16 and never drops again. During pretraining, the average grad norm sometimes rises to infinity.
log_rank0.txt
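For reference, a common guard against the infinite grad norm described above is to compute the global gradient norm and skip the optimizer step when it is non-finite. A minimal pure-Python sketch of that check (the `grads` list and the `max_norm` threshold are illustrative, not taken from the MoBY code):

```python
import math

def global_grad_norm(grads):
    """L2 norm over all gradient values, i.e. what is logged as grad_norm."""
    return math.sqrt(sum(g * g for g in grads))

def should_skip_step(grads, max_norm=1e4):
    """Skip the optimizer step if the norm is inf/NaN or explodes."""
    norm = global_grad_norm(grads)
    return not math.isfinite(norm) or norm > max_norm

# A healthy gradient passes; an overflowed one (e.g. from fp16) is skipped.
print(should_skip_step([0.1, -0.2, 0.05]))    # small finite norm -> False
print(should_skip_step([float("inf"), 0.3]))  # inf norm -> True
```

In a real PyTorch loop the same idea is usually expressed with `torch.nn.utils.clip_grad_norm_`, whose return value can be checked for finiteness before calling `optimizer.step()`.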

@launchauto

launchauto commented May 28, 2021

The uploaded log_rank0.txt is one of the eight per-GPU pretraining logs.
The uploaded log_rank7.txt is one of the eight per-GPU linear evaluation logs.
log_rank7.txt

@michuanhaohao

I also encountered the same problem.

@tbup

tbup commented Feb 8, 2022

@launchauto @michuanhaohao Me too, but I ran it with opt level O0. Did you also run with O0 precision?
log_rank0.txt

@Rocky1salady-killer

I ran into this problem too! The loss stays at 16 and never drops.

@Rocky1salady-killer

怎么才能不适用apex混合精度呢?我使用swin transformer进行训练的时候,loss就会下降并且收敛。然而,我注意到swin transformer工程当中没有使用apex混合精度
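On the question of training without apex: a minimal sketch of making apex optional, assuming the training script wraps the model with `amp.initialize` (the `use_amp` flag and the `setup_training` helper are illustrative, not part of the MoBY code; `opt_level="O0"` is apex's pure-fp32 mode):

```python
# Make apex optional: fall back to plain fp32 training when apex is
# unavailable or mixed precision is explicitly disabled.
try:
    from apex import amp
except ImportError:
    amp = None

def setup_training(model, optimizer, use_amp=False, opt_level="O1"):
    """Wrap model/optimizer with apex AMP, or return them unchanged.

    Passing use_amp=False (or opt_level="O0", apex's fp32 mode) avoids
    the fp16 overflow behaviour discussed in this thread.
    """
    if use_amp and amp is not None:
        return amp.initialize(model, optimizer, opt_level=opt_level)
    return model, optimizer

# With use_amp=False the objects come back untouched, so the rest of
# the training loop needs no apex-specific calls.
```

With this pattern the loss backward call also needs a matching branch (`with amp.scale_loss(loss, optimizer)` only when AMP is enabled), which is why many repos guard every apex call behind a single flag.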

@Chengyang852

Is it normal for the loss value to be around 16? Has anyone encountered this problem?

@NonTerraePlusUltra

How can I train without apex mixed precision? When I train with Swin Transformer, the loss drops and converges. However, I noticed that the Swin Transformer project does not use apex mixed precision.

Has your problem been solved?

@NonTerraePlusUltra

Is it normal for the loss value to be around 16? Has anyone encountered this problem?

Me too.

@Pang-b0

Pang-b0 commented Apr 3, 2023

Excuse me, have you solved the problem where the loss drops to 8.9 and then rises back up? Is it caused by apex mixed precision training?
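On the apex question: with mixed precision, apex uses dynamic loss scaling, which halves the scale and skips the optimizer step whenever fp16 gradients overflow; repeated skips can stall or destabilize training, which is one plausible link to the loss behaviour in this thread. A toy simulation of that policy (the class and its constants are an illustration of the general technique, not apex's actual code):

```python
class DynamicLossScaler:
    """Toy version of dynamic loss scaling as used in fp16 training.

    On overflow: halve the scale and skip the optimizer step.
    After `growth_interval` consecutive clean steps: double the scale.
    """
    def __init__(self, init_scale=2.0 ** 16, growth_interval=2000):
        self.scale = init_scale
        self.growth_interval = growth_interval
        self._good_steps = 0

    def update(self, found_overflow):
        """Return True if the optimizer step should run this iteration."""
        if found_overflow:
            self.scale /= 2.0
            self._good_steps = 0
            return False  # skip this step entirely
        self._good_steps += 1
        if self._good_steps == self.growth_interval:
            self.scale *= 2.0
            self._good_steps = 0
        return True

scaler = DynamicLossScaler(init_scale=8.0, growth_interval=2)
# One overflow skips a step and halves the scale; two clean steps
# afterwards double it back.
print([scaler.update(flag) for flag in (False, True, False, False)])
```

Checking the run logs for lines about the loss scale being reduced (apex prints these on overflow) is a quick way to tell whether this mechanism is firing constantly.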

@NonTerraePlusUltra

Have you solved the problem where the loss drops to 8.9 and then rises back up? Is it caused by apex mixed precision training?

No /(ㄒoㄒ)/~~

@Pang-b0

Pang-b0 commented Apr 3, 2023

Could it be a problem with the loss function? Are you still following this code? My loss has been 16 from the start and never goes down.

@NonTerraePlusUltra

Could it be a problem with the loss function? Are you still following this code? My loss has been 16 from the start and never goes down.

I haven't solved it either...
