
Error in forward simulation with large 3D geometry #529

Open
dmnk1308 opened this issue Apr 25, 2023 · 8 comments
dmnk1308 commented Apr 25, 2023

Problem description

Hi,
First of all, thanks for the amazing library and for providing it open source!
I am trying to simulate the current flow through the human body with pyGIMLi. For this I already have a volume mesh with about 3 million nodes and 17 million tetrahedrons as a vtk file. I load this via an external library (vedo) and then transfer the nodes and cells into a new, empty pyGIMLi 3D mesh.
Next, I place the electrodes on the surface of the mesh, set the reference electrode and configuration point, and assign the resistivity.
Then, I perform my forward calculation. This worked fine for simpler meshes where I included only two modalities (e.g. an organ and the body shape). For multiple modalities, however, this no longer works. I then get the error message CHOLMOD error: problem too large. file: ../Supernodal/cholmod_super_symbolic.c line: 683. It cannot be due to too little RAM, since the RAM is not used up during the calculation. Are you already aware of this error, and is there a solution? Unfortunately, I cannot provide you with the volume mesh for data protection reasons. I will try to reconstruct the case with less sensitive data.

Your environment

OS : Linux
CPU(s) : 80
Machine : x86_64
Architecture : 64bit
RAM : 754.6 GiB
Environment : Python

Python 3.10.10 | packaged by conda-forge | (main, Mar 24 2023, 20:08:06)
[GCC 11.3.0]

pygimli : 1.4.0
pgcore : 1.4.0
numpy : 1.24.2
matplotlib : 3.7.1
scipy : 1.10.1
IPython : 8.12.0
pyvista : 0.38.5

Steps to reproduce

import numpy as np
import scipy.io
import vedo
import pygimli as pg
from pygimli.physics import ert

# load volume mesh
mesh = vedo.TetMesh('path/to/mesh.vtk', mapper='tetra')
nodes = mesh.points()
index = mesh.celldata['Index']
cells = np.array(mesh.cells())

# load nodes and cells to pyGimLi mesh
test = pg.Mesh(3)
for i in nodes:
    test.createNode(i)
for i, k in zip(cells, index):
    test.createCell(i, int(k))

# load electrode position
electrodes = scipy.io.loadmat('path/to/electrodes.mat')['coord']

# find closest existing node in mesh to place electrode at
corr_electrodes = []
for i in electrodes:
    corr_electrodes.append(np.asarray(test.nodes()[test.findNearestNode(i)].pos()))
electrodes = np.stack(corr_electrodes)

# set ert scheme
scheme = pg.physics.ert.createData(electrodes, schemeName='dd', inverse=False, closed=True)
for s in scheme.sensors():
    tmp_idx = test.findNearestNode(s)
    test.nodes()[tmp_idx].setMarker(-99)

# reference electrode
tmp_idx = test.findNearestNode([-0.180, -0.15, -0.6])
test.nodes()[tmp_idx].setMarker(-999)

# calibration node
tmp_idx = test.findNearestNode([0., 0., -0.5])
test.nodes()[tmp_idx].setMarker(-1000)

# forward simulation
res = [[1, 1/0.3], [2, 1/0.2], [3, 1/0.7], [4, 1/0.02], [5, 1/0.025]]
fwd = ert.simulate(test, res=res, scheme=scheme, sr=False, calcOnly=True, verbose=True)

Expected behavior

Return the potentials at the electrodes.

Actual behavior

Error message.

CHOLMOD error: problem too large. file: ../Supernodal/cholmod_super_symbolic.c line: 683
CHOLMOD error: argument missing. file: ../Cholesky/cholmod_factorize.c line: 121
 0 (296.12s)CHOLMOD error: argument missing. file: ../Cholesky/cholmod_solve.c line: 1062
Segmentation fault

Label: question

dmnk1308 changed the title from "Forward simulation with large 3D geometry" to "Error in forward simulation with large 3D geometry" on Apr 25, 2023
halbmy (Contributor) commented Apr 25, 2023

The code looks clean and sound. Can you specify the number of nodes and the available RAM size?

dmnk1308 (Author) commented:

Thanks for your fast reply.
The mesh has 3,017,743 nodes and 17,471,149 cells.
My RAM size is 754.6 GiB.

carsten-forty2 (Contributor) commented:

We don't have much experience with model sizes like this, but I remember similar memory problems for node counts above roughly 1e6, which might be caused by internal constraints of the default direct solver (CHOLMOD). Alternatively, we also implemented a wrapper for the UMFPACK solver, which may or may not be able to handle such larger systems. Unfortunately, it's not possible to manually switch the solver with the current release, but I committed a small addition to the dev branch where you can now set the environment variable BERTUSEUMFPACK=1 to force the use of UMFPACK instead of CHOLMOD for BERT simulations. Might be worth a try.

Direct solvers (CHOLMOD and UMFPACK) are intended for 'smaller' problems, which is why we chose them as the default. For larger systems (like yours) iterative solvers might be a better choice and could, in principle, be implemented via the Python interface.
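For reference, here is a minimal sketch of the environment-variable workaround mentioned above. It assumes the dev branch is installed and that BERTUSEUMFPACK is read before the forward operator sets up its solver, so setting it before importing pygimli should be on the safe side:

import os
os.environ['BERTUSEUMFPACK'] = '1'  # force UMFPACK instead of CHOLMOD (dev branch only)

import pygimli as pg
from pygimli.physics import ert

# ... build mesh, res and scheme as in the snippet above, then:
# fwd = ert.simulate(mesh, res=res, scheme=scheme, sr=False, calcOnly=True, verbose=True)

Setting the variable in the shell (e.g. BERTUSEUMFPACK=1 python script.py) should work the same way.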


dmnk1308 commented Apr 25, 2023

I'll check it out and let you know if it works.
It would be nice if you plan to implement iterative solvers for such problems in the future. I might also look into that at some point, thanks!

dmnk1308 (Author) commented:

Unfortunately, the UMFPACK solver didn't work either.
In case you plan to extend the package to iterative solvers, I uploaded the mesh and the corresponding electrode positions of my problem (link).


mariosgeo commented Dec 11, 2023

Hi, I am dealing with a similar issue (problem too big for direct solver).

I searched the code a bit, and in solver.py, in the __init__ of the class LinSolver, I added the following

    elif solver.lower() == 'iter':
        self.solver = 'iter'

and in the function linSolve I added the following

elif solver == 'iter':
    from scipy.sparse.linalg import cg
    x, info = cg(_m, b)

I wanted to add, as a starting guess, the mean value x0 = mean(resistivity), but I don't know how to find that value.

Is this on the right track?
Also, where do you add solver='iter'?
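For what it's worth, a minimal standalone sketch of the SciPy call with an explicit starting guess could look like the following; A and b are hypothetical placeholders for the assembled system matrix and right-hand side, not pyGIMLi objects:

import numpy as np
from scipy.sparse import identity, random as sparse_random
from scipy.sparse.linalg import cg

# hypothetical symmetric positive definite test system
n = 1000
A = sparse_random(n, n, density=1e-3, format='csr')
A = A @ A.T + identity(n)
b = np.ones(n)

x0 = np.full(n, b.mean())  # constant starting guess, purely for illustration
x, info = cg(A, b, x0=x0)  # info == 0 means the solver converged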

carsten-forty2 (Contributor) commented:

I assume you also want to solve ERT?

There are two spots where a linear solver is called:

  1. (Default) It is hardcoded in the core and calls the CHOLMOD solver, or UMFPACK with the hack from above.
  2. The Python LinSolve is only called from pg.solve in the ERTModellingReference implementation, but there is not yet an option there for forwarding the solver string either.

I agree it would be good to have an option for changing the default solver via a Python call, to allow for other third-party solvers. However, this would need some changes in the core and a subsequent core update. I'm unsure when we'll find the priority for this at the moment.

It would maybe be easier to add a solver string to the reference implementation; however, this implementation is much slower than the default and probably not suitable for an already large problem.

A starting guess for a CG solver needs to be a potential distribution, maybe the analytical solution for mean(resistivity)?
But please note that the solver needs to be called for every injection electrode (because of this, we preferred direct solvers, which can be prefactorized). To be at least a little efficient, a good preconditioner needs to be used with any iterative solver.
Maybe it's advisable to try to reduce the problem size first, if possible.
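As a rough illustration of that suggestion (not pyGIMLi's API), a preconditioned CG loop over several right-hand sides could look like the sketch below. It assumes a sparse, symmetric positive definite system matrix A and one right-hand side per injection electrode; the incomplete-LU factorization is computed once and reused as a preconditioner:

from scipy.sparse.linalg import cg, spilu, LinearOperator

def solve_all_injections(A, rhs_list, x0=None):
    # approximate factorization, computed once and reused for every right-hand side
    ilu = spilu(A.tocsc(), drop_tol=1e-5, fill_factor=10)
    M = LinearOperator(A.shape, ilu.solve)
    solutions = []
    for b in rhs_list:
        x, info = cg(A, b, x0=x0, M=M)
        if info != 0:
            print('CG did not converge, info =', info)
        solutions.append(x)
    return solutions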

mariosgeo commented:

Thanks for the detailed reply. We will wait for a future version of the code.
