
Displacement Memory Growth & Purge Option + Disk Caching of Displacements #178

SoundsSerious opened this issue Nov 25, 2023 · 3 comments



SoundsSerious commented Nov 25, 2023

Hi again,

I've been running a very large number of load combos in a distributed system, with an early-stopping-on-failure routine. I'm having trouble predicting how much memory to allocate per task, since memory usage grows as I keep analyzing more load combos.

I'm currently addressing this by running the following function after I've processed each combo:

```python
def purge_combo(struct: 'Frame3D', combo: str):
    """Drop all stored displacement results for a load combo."""
    # Remove the combo's entry from the model's global displacement dict
    struct._D.pop(combo, None)
    # Remove the combo's entries from each node's displacement/rotation dicts
    for node in struct.Nodes.values():
        node.DX.pop(combo, None)
        node.DY.pop(combo, None)
        node.DZ.pop(combo, None)
        node.RX.pop(combo, None)
        node.RY.pop(combo, None)
        node.RZ.pop(combo, None)
```

This works well and keeps my memory consumption per task consistent (and lets me fully utilize my AWS resources!).

A reorganization of some of the data structures to have one authoritative dictionary for displacements, _D, would be useful. The node dictionaries DX, DY, ..., RZ could then be replaced with weakref.WeakValueDictionary instances, so that removing an object from _D automatically drops the nodes' references as well, without looping. A quick sketch of the mechanism is below.
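Here's a minimal sketch of the mechanism (illustrative names, not Pynite's; one gotcha is that plain floats can't be weakly referenced, so per-combo results would need to live in objects):

```python
import weakref

class ComboResult:
    """Illustrative wrapper: floats can't be weakly referenced,
    so per-combo results must be held in an object."""
    def __init__(self, values):
        self.values = values

# The one authoritative dict holds the only strong references
_D = {'Combo 1': ComboResult([0.0] * 6)}

# A node-side view that never keeps results alive on its own
node_DX = weakref.WeakValueDictionary()
node_DX['Combo 1'] = _D['Combo 1']

del _D['Combo 1']            # purge from the primary storage...
print('Combo 1' in node_DX)  # ...and the node's entry vanishes: False (immediately in CPython)
```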

Of course this raises the question of what happens if you want to reactivate the combo. I believe there might be some interesting options with https://github.com/grantjenks/python-diskcache, although that brings up the issue of how to identify the structure/combo pair.
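For illustration, a rough sketch of how python-diskcache could fit (the key scheme here is invented, since identifying the structure/combo pair is exactly the open question):

```python
from diskcache import Cache

# An on-disk cache that survives the Python session
cache = Cache('./pynite_results')

def result_key(model_name: str, combo: str) -> str:
    # Hypothetical scheme: a stable model identifier plus the combo name
    return f'{model_name}/{combo}'

# Spill a combo's results to disk before purging them from memory
cache[result_key('my_frame', '1.2D+1.6L')] = {'N1': {'DX': 0.0012}}  # toy payload

# Later, pull them back if the combo is reactivated
restored = cache.get(result_key('my_frame', '1.2D+1.6L'))
```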

SoundsSerious commented

Let me know if this is something you'd support, and I can write a PR for it!


JWock82 commented Nov 29, 2023

You've identified one of the hardest parts of finite element analysis - the sheer amount of data and memory it consumes. I've had large models in commercial software run out of disk space when running calculations. For every node there are 12 displacement values to store, multiplied by the number of load combinations.
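To put rough numbers on that (illustrative figures, not from any particular model):

```python
# 100,000 nodes x 12 values per node per combo x 200 combos x 8 bytes per float
nodes, values_per_combo, combos, bytes_per_float = 100_000, 12, 200, 8
total_gb = nodes * values_per_combo * combos * bytes_per_float / 1e9
print(f'{total_gb:.2f} GB')  # 1.92 GB of displacements alone
```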

What your code does works, but purging the displacements for unused load combinations basically purges all the results for those combinations too, since results are derived "on the fly" from the displacements. You could also use load combination tags to run only selected load combinations - that has already been implemented in a recent change. See "Tags" here: https://pynite.readthedocs.io/en/latest/load_combo.html
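Roughly like this (a sketch based on the linked docs; double-check the exact parameter names, like combo_tags, against the version you're running):

```python
from Pynite import FEModel3D

model = FEModel3D()
# ... define nodes, members, supports, and loads ...

# Tag combinations as they are defined
model.add_load_combo('1.4D', {'D': 1.4}, combo_tags=['strength'])
model.add_load_combo('D+L', {'D': 1.0, 'L': 1.0}, combo_tags=['service'])

# Only combos carrying a matching tag are run (and stored)
model.analyze(combo_tags=['strength'])
```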

I'll admit I've never worked with weak references. I'm not a programmer by trade; I'm a structural engineer. My code is probably a little weak, but the math/science behind it is pretty solid. I skimmed the Python docs on weak references, and it's an interesting concept for efficient memory management. I'm not sure how I'd go about implementing it in Pynite.

Regarding python-diskcache, it also looks interesting. I'd be hesitant to add any additional dependencies to the project.


SoundsSerious commented Dec 4, 2023

Sounds like this is absolutely a pain point for many an FEA tool! PyNite should have a solution for this, and maybe it could also speed up analysis by loading results from a previous run. That would prevent rework if, say, you accidentally shut down your Python session.

The more I think about it, the more it would make sense to have some kind of remove/save/load combo set of functions that could save in-progress displacements into some user-defined storage, say via a callback or a dict-like interface.
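A rough sketch of the dict-like idea (every name here is hypothetical, it reuses the purge_combo helper from above, and struct._D would need the same treatment):

```python
import pickle

DOFS = ('DX', 'DY', 'DZ', 'RX', 'RY', 'RZ')

def save_combo(struct: 'Frame3D', combo: str, store):
    """Serialize a combo's nodal results into any dict-like store, then purge."""
    record = {name: {dof: getattr(node, dof).get(combo) for dof in DOFS}
              for name, node in struct.Nodes.items()}
    store[combo] = pickle.dumps(record)
    purge_combo(struct, combo)  # free the in-memory copies

def load_combo(struct: 'Frame3D', combo: str, store):
    """Restore a previously saved combo's nodal results."""
    record = pickle.loads(store[combo])
    for name, dofs in record.items():
        node = struct.Nodes[name]
        for dof, value in dofs.items():
            getattr(node, dof)[combo] = value
```

The store could be a plain dict, a shelve file, or a diskcache.Cache, which is what makes the callback/dict-like interface appealing.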

As for weak references, I didn't really get them until I learned about Python garbage collection, where an object is kept alive as long as references to it exist, as explained in this example: https://eli.thegreenplace.net/2009/06/12/safely-using-destructors-in-python/.

As for implementation, I think it would be relatively straightforward. The node dictionaries would be replaced with WeakValueDictionary instances so that results automatically disappear when removed from the primary storage dictionary. Likewise, the load_combo routine would have to reattach entries to these WeakValueDictionary instances for a combo to work again.

I'll find some time to write a demo of this and see if there are any gotchas.

BTW, glad to see those load combo tags made it into the repo! I'll be able to merge the latest release into my fork before submitting a PR, so it'll be on the same branch. The reorganization looks wonderful with the analysis code segmented out!
