Memory leak caused by eq.solve at each step? #907
Comments
What version of FiPy are you running?
This may be an issue that was reported (and fixed) a couple of months ago: #896
I am using FiPy version 3.4.3. Interestingly, the problem does not arise when running in Google Colab. In Colab I installed FiPy via pip, whereas on my machine I am using a conda installation.
I am attaching the package list of my conda environment below in case it is helpful.
Thank you for forwarding your environment. We're delinquent in doing a release since fixing #896. Please try a git checkout of FiPy.
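(For anyone following along, installing the development version straight from the repository might look like this; the `usnistgov/fipy` location is an assumption based on the project's canonical GitHub home:)

```shell
# Sketch: replace the released FiPy with the current development checkout
pip install git+https://github.com/usnistgov/fipy.git
```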
Thanks a lot! I can confirm that the issue is fixed by the latest Git version of FiPy (3.4.3+5.g4db188c1), but only together with the line
I am running a simple code for diffusion in two dimensions, as shown below. However, the memory consumed inside the for loop grows quickly, reaching 30GB of RAM within a minute. Increasing the number of time steps eventually leads to the process being 'Killed' once it exhausts the 64GB of memory. Is this because the value of the target variable 'C' is somehow stored for all time steps instead of only the current one? I am quite new to FiPy, so any clue as to why this is happening would be very helpful :)
Code:
I am using Ubuntu 22.04 with a conda Python 3.10 environment.