Training takes 2x longer since 1.13.0 with FastAI #1234
Comments
Hey @mhtrinh, have you observed the same slowdown with the newer versions of ClearML? The most recent one is 1.14.4.
Yes, this also happens with the current version, 1.14.4: still 2x slower. Note: this may be specific to fastai, as we have another network based on yolov5 where this does not happen.
Hi @mhtrinh! It looks like calculating the metrics that ClearML reports may be taking a long time. We will try to improve performance.
Hi @mhtrinh! We will release a fix for this issue in the next ClearML release.
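One way to check the maintainers' diagnosis on your own run is to profile a single epoch and see whether functions from the clearml package dominate the cumulative time. A minimal sketch, assuming `run_training` is a hypothetical placeholder for your own fastai fit call and the project/task names are illustrative:

```python
import cProfile
import pstats

from clearml import Task

# Placeholder project/task names; substitute your own.
task = Task.init(project_name="debug", task_name="profile-fastai-slowdown")

def run_training():
    # Hypothetical stand-in for your usual fastai call,
    # e.g. learn.fit_one_cycle(1)
    pass

profiler = cProfile.Profile()
profiler.enable()
run_training()
profiler.disable()

# Print the 20 most expensive calls by cumulative time. If clearml's
# reporting functions dominate, the overhead is in metric capture
# rather than in the model itself.
pstats.Stats(profiler).sort_stats("cumulative").print_stats(20)
```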
Training our model, which is based on FastAI, takes 2x longer with ClearML 1.13.0 compared to 1.12.2.
There are no errors or warnings.
I cannot share our code. Here is the requirements.txt of the virtualenv:
Simply
pip install clearml==1.12.2
and
pip install clearml==1.13.0
and re-run the same code.

OS: openSUSE Leap 15.4
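Since the code itself cannot be shared, here is a hedged stand-in for the kind of run being compared, using fastai's public MNIST sample instead of the private model; the project and task names are placeholders. Run it once after each `pip install` above and compare the printed wall-clock times:

```python
import time

from clearml import Task
from fastai.vision.all import (
    ImageDataLoaders, URLs, error_rate, resnet18, untar_data, vision_learner,
)

# Task.init enables ClearML's automatic fastai logging for this run.
task = Task.init(project_name="debug", task_name="fastai-epoch-timing")

# Small public dataset as a stand-in for the private model and data.
path = untar_data(URLs.MNIST_SAMPLE)
dls = ImageDataLoaders.from_folder(path, train="train", valid="valid")
learn = vision_learner(dls, resnet18, metrics=error_rate)

start = time.perf_counter()
learn.fit_one_cycle(1)
print(f"one epoch took {time.perf_counter() - start:.1f}s")
```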