strange bug, due to missing copy ... ? #345
Comments
Note that if I change
then it works, but it should not be necessary ...
Yeah, this is related to what is and isn't copied during a transform. It is indeed tricky. The problem is that deep copying can take long if the tensor is large, and I'm not sure it's a good idea to copy e.g. the ...
[I edited your code a bit to understand it more easily, sorry]
Sorry for the bad variable name ... (I'll make an effort next time.) OK, then it may not be a bug, since it is a choice, but if you keep it like that, it clearly needs to be well documented. I can understand the different choice for ...
This was not on purpose. It's just that label maps are modified by spatial transforms, but not by intensity transforms. Do you think we should change the behaviour? The simplest way is just deep-copying everything all the time, but I think that would be too slow.
I see, it goes through transform.py's __call__ function, where a copy is made. If there are issues with memory and/or execution time, it may be wiser to make it configurable ... I don't know; there was a similar discussion previously about adding a copy in transform.py, and you finally decided to add it.
Actually, the fix is always easy: just add a deepcopy in your script. So it may make sense (more Pythonic?) not to have it in the code and instead warn the user, so that we add it ourselves when needed (which is not always the case). On the other hand, not having it will lead to very silent bugs (difficult to detect), so from the user's point of view it is safer to have it systematically. What I find strange is having a different behaviour for intensity images and label maps ...
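The user-side workaround mentioned above can be sketched with the standard library alone. `fake_transform` and the dict-based subject below are hypothetical stand-ins for a real TorchIO transform and subject, used only to show the pattern:

```python
import copy

def fake_transform(subject):
    """Hypothetical transform that modifies its input in place,
    mimicking a transform that does not deep-copy the data."""
    subject["label"][0] = 0  # in-place modification of the 'tensor'
    return subject

original = {"label": [1, 1, 1]}

# Workaround: deep-copy in the calling script before transforming,
# so the original subject is preserved even if the transform mutates.
transformed = fake_transform(copy.deepcopy(original))

print(original["label"])     # [1, 1, 1] -- unchanged
print(transformed["label"])  # [0, 1, 1]
```

The cost of this safety is one full copy per call, which is exactly the time/memory trade-off being debated in this thread.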
Yes, but it's a shallow copy: the array is the same. I'll try to measure timings with and without deep copying and see what would need to be changed in the docs, or what the best solution is.
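The shallow-vs-deep distinction being discussed can be demonstrated with plain Python (the dict here is a hypothetical stand-in for a subject holding a tensor):

```python
import copy

subject = {"label": [1, 1, 1]}

shallow = copy.copy(subject)   # new dict, but the inner list is shared
deep = copy.deepcopy(subject)  # new dict AND a new inner list

shallow["label"][0] = 0        # mutating through the shallow copy...
print(subject["label"])        # ...also changes the original: [0, 1, 1]

deep["label"][1] = 9           # mutating the deep copy...
print(subject["label"])        # ...leaves the original untouched: [0, 1, 1]
```

This is why a shallow copy in `__call__` is not enough to protect the original data: the wrapper object is new, but the underlying buffer is still aliased.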
🐛 Bug
I find Python bugs related to a missing copy very difficult to spot; I guess this one is related to that ...
When I modify the content of a transformed subject, it also changes the original subject.
This happens only when using LabelMap AND RandomBlur,
or, to be more precise: it does not happen if one uses either ScalarImage OR RandomAffine.
To reproduce
It plots a black image, although I only modify the transformed data.
Now un-comment either RandomAffine or ScalarImage, and the bug is gone.
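The original reproduction script is not preserved in this thread. A minimal stand-in (plain Python, no TorchIO; `LabelMap` and `random_blur` here are hypothetical mimics, not the real classes) shows the aliasing pattern being reported, where the transform returns a new wrapper whose buffer is still shared with the original:

```python
import copy

class LabelMap:
    """Hypothetical stand-in: holds a flat list as its 'tensor'."""
    def __init__(self, data):
        self.data = data

def random_blur(subject):
    """Mimics a transform that only shallow-copies the subject:
    the wrapper is new, but subject.data is the same object."""
    return copy.copy(subject)

original = LabelMap([1, 1, 1, 1])
transformed = random_blur(original)

transformed.data[:] = [0, 0, 0, 0]  # zero out only the *transformed* data
print(original.data)                # the original is zeroed too: [0, 0, 0, 0]
```

Zeroing the transformed tensor also zeroes the original, which matches the "black image" described above.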
Expected behavior
Actual behavior
TorchIO version