When running my multiscene configuration (`gprMax.run(scenes=scenes, n=len(scenes), ...)`), each scene uses about 2 GB of memory. It turns out that each scene's memory is added on top of the previous scene's, so after a certain number of scenes all of my machine's RAM (32 GB) is exhausted and the calculation crashes. Could a mechanism be provided for freeing memory between scenes in a multiscene run? I have read that similar problems existed in the release version of gprMax when using the GPU, but I am using the new Python API and no GPU.
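As a workaround until such a mechanism exists, one common pattern is to run each scene in a short-lived child process, so that all of the scene's memory is returned to the OS when the process exits. The sketch below is generic and hedged: `run_scene` is a placeholder for whatever per-scene call you make (in gprMax this would presumably be something like `gprMax.run(scenes=[scene], n=1, ...)`, but that exact call is an assumption, not tested here).

```python
import multiprocessing as mp

def run_scene(scene_index, result_queue):
    # Placeholder for the real per-scene simulation. In gprMax this is where
    # you would build the scene and call gprMax.run(...) (assumed API).
    # The list below stands in for the ~2 GB of per-scene state.
    big = [0.0] * 10_000
    result_queue.put((scene_index, len(big)))

def run_all(num_scenes):
    """Run each scene in its own process so its memory is freed on exit."""
    results = []
    for i in range(num_scenes):
        q = mp.Queue()
        p = mp.Process(target=run_scene, args=(i, q))
        p.start()
        results.append(q.get())  # fetch the result before joining
        p.join()                 # process exit releases the scene's memory to the OS
    return results
```

The key point is that Python rarely returns large freed allocations to the OS within a single process, whereas a child process's memory is reclaimed unconditionally when it terminates.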
One more question: when using multiscene, the output files are numbered 1, 2, and so on. How can I pass the parameters of a specific scene into that scene's output file name, or into its output file attributes?
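If the API lets you run scenes one at a time and choose each output file name yourself (an assumption about gprMax's API, not verified here), a small helper can encode a scene's parameters into the file name. The `.out` extension and the `scene_params` dictionary below are illustrative, not part of any confirmed gprMax interface:

```python
def scene_outputfile(base_dir, scene_params):
    """Build a descriptive output file name from a scene's parameters.

    scene_params is a dict of parameter names to values, e.g.
    {"antenna_height": 0.05, "soil_eps": 6} (hypothetical names).
    Keys are sorted so the same parameters always give the same name.
    """
    tag = "_".join(f"{k}-{v}" for k, v in sorted(scene_params.items()))
    return f"{base_dir}/scan_{tag}.out"
```

For writing parameters into the output file itself, gprMax output files are HDF5, so after a run you could reopen the file with `h5py` and store the scene parameters as attributes on the root group.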