
multiprocessing error when using graph_dir #603

Open · tkoskela opened this issue Dec 14, 2023 · 2 comments

@tkoskela

I'm setting up FORD in my Fortran project. The HTML documentation is produced fine, and I have set `graph: true` to produce dependency graphs, but they aren't rendered well in the HTML output. I tried setting `graph_dir:` to save copies of the graphs, and got the warning and error below.

```
/home/tkoskela/python_envs/conquest/lib/python3.10/site-packages/ford/graphs.py:1483: TqdmWarning: Iterable length 1112 > 1000 but `chunksize` is not set. This may seriously degrade multiprocess performance. Set `chunksize=1` or more.
  process_map(
Writing graphs:   0%|                                                      | 0/1112 [00:00<?, ?it/s]
concurrent.futures.process._RemoteTraceback:
"""
Traceback (most recent call last):
  File "/usr/lib/python3.10/multiprocessing/queues.py", line 244, in _feed
    obj = _ForkingPickler.dumps(obj)
  File "/usr/lib/python3.10/multiprocessing/reduction.py", line 51, in dumps
    cls(buf, protocol).dump(obj)
TypeError: cannot pickle '_io.TextIOWrapper' object
"""
```

I'm using ford version 7.0.3 and Python 3.10.12
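
For context, the `TypeError` means one of the objects being shipped to the worker processes is carrying an open file handle, which `pickle` (and therefore `multiprocessing`) cannot serialize. A minimal sketch that reproduces the failure, together with the usual workaround of passing the path instead of the handle (`GraphJob` here is a hypothetical stand-in, not FORD's actual class):

```python
import pickle
from pathlib import Path

Path("example.txt").write_text("hello\n")

class GraphJob:
    """Hypothetical stand-in for an object sent to a worker process."""
    def __init__(self, path):
        self.handle = open(path)  # open file handle stored on the object

try:
    pickle.dumps(GraphJob("example.txt"))
except TypeError as err:
    print(err)  # cannot pickle '_io.TextIOWrapper' object

class PicklableGraphJob:
    """Store only the path; open the file inside the worker instead."""
    def __init__(self, path):
        self.path = path  # a plain string pickles fine

    def run(self):
        with open(self.path) as f:
            return f.read()

pickle.dumps(PicklableGraphJob("example.txt"))  # succeeds
```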

@tkoskela (Author) commented Dec 14, 2023

I tried passing `chunksize=1` to `process_map`, as the `TqdmWarning` suggested, and got:

```
Writing graphs:   0%|                                                      | 0/1112 [00:00<?, ?it/s]
concurrent.futures.process._RemoteTraceback:
"""
Traceback (most recent call last):
  File "/usr/lib/python3.10/multiprocessing/queues.py", line 244, in _feed
    obj = _ForkingPickler.dumps(obj)
  File "/usr/lib/python3.10/multiprocessing/reduction.py", line 51, in dumps
    cls(buf, protocol).dump(obj)
RecursionError: maximum recursion depth exceeded while pickling an object
"""
```

Are my graphs too large? Is there anything I can do about this?

Here is a link to my FORD config.
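
For reference, `chunksize` just controls how many items `ProcessPoolExecutor.map` batches into each worker task; a minimal sketch of passing it through tqdm's `process_map` (`render_graph` is a hypothetical stand-in for the per-graph work, not FORD's actual code):

```python
from tqdm.contrib.concurrent import process_map

def render_graph(name):
    """Hypothetical stand-in for FORD's per-graph rendering work."""
    return f"rendered {name}"

if __name__ == "__main__":
    graphs = [f"graph_{i}" for i in range(1112)]
    # Batching items with chunksize silences the TqdmWarning, but it only
    # affects performance: the items are still pickled to reach the workers.
    results = process_map(render_graph, graphs, chunksize=16)
```

As the second traceback shows, batching can't rescue objects that fail to pickle in the first place.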

@ZedThree (Member) commented Jan 5, 2024

I think the second bug is a duplicate of #517, but this time with a public repo, so I have some chance of being able to debug it.

I've not managed to track this down yet, but this SO question/answer looks promising: https://stackoverflow.com/questions/63876046/cannot-pickle-object-maximum-recursion-depth-exceeded
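
The gist of that answer: `pickle` recurses roughly once per level of object nesting, so a sufficiently deep chain of objects blows the interpreter's recursion limit. A minimal sketch of the failure, plus the blunt workaround suggested there (raising the limit in the parent process, since the pickling in the tracebacks above happens in its queue feeder thread):

```python
import pickle
import sys

class Node:
    def __init__(self, child=None):
        self.child = child

# Build a chain nested deeper than the default recursion limit (~1000).
node = None
for _ in range(2000):
    node = Node(node)

try:
    pickle.dumps(node)
except RecursionError as err:
    print(err)  # maximum recursion depth exceeded while pickling an object

# Workaround from the linked answer: raise the interpreter's limit.
sys.setrecursionlimit(10_000)
data = pickle.dumps(node)  # now succeeds
print(f"pickled {len(data)} bytes")
```

That's more of a diagnostic than a fix, of course; the real question is why the graph objects end up nested that deeply.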

