
Best practice for rendering large datasets with multiple renders + avoiding reading/writing the COCO file every time #1062

Open
AlcatrAAz opened this issue Feb 8, 2024 · 1 comment
Labels
question Question, not yet a bug ;)


@AlcatrAAz

Describe the issue

Hello everyone,

I have a question about the best practice for rendering a large dataset. In my case I have the objects that I want to create labels for, plus background objects whose appearance changes every frame: either a different image plugged into the Principled BSDF as input, or a changed hue on simple objects.

From my understanding I can't use keyframes, since I change the images/materials of the background objects between frames. Basically it is a plane with changing COCO images (or cc textures) as the background. Please correct me if I'm wrong!
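
To make this concrete, here is a minimal sketch of the per-image loop I currently have in mind. "background_image_paths" and "sample_camera_pose" are placeholders, the foreground object loading is omitted, and I may well be misusing the API:

```python
import blenderproc as bproc  # must be the first import
import bpy

bproc.init()

# ... load the foreground objects that get labels (omitted) ...

bg_plane = bproc.object.create_primitive("PLANE")
bproc.renderer.enable_segmentation_output(map_by=["instance", "class", "name"])

for i, image_path in enumerate(background_image_paths):  # placeholder list
    bproc.utility.reset_keyframes()  # render a single frame per iteration

    # swap the background image -> this is what breaks keyframe batching
    mat = bproc.material.create(f"bg_mat_{i}")
    mat.set_principled_shader_value("Base Color", bpy.data.images.load(image_path))
    bg_plane.replace_materials(mat)

    bproc.camera.add_camera_pose(sample_camera_pose())  # placeholder sampler

    data = bproc.renderer.render()
    # re-reads and re-writes the ever-growing json on every iteration:
    bproc.writer.write_coco_annotations(
        "output/coco_data",
        instance_segmaps=data["instance_segmaps"],
        instance_attribute_maps=data["instance_attribute_maps"],
        colors=data["colors"],
        append_to_existing_output=True,
    )
```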

After rendering 5000+ images into one folder I hit a big bottleneck in reading/writing the COCO json file after each new image: rendering takes about 2 seconds, and then reading/writing the json file takes 10+ seconds on a single core. Reading/writing on multiple cores would definitely be really nice (like the bop writer has now, see #994).
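
As a workaround I am considering writing each chunk of images into its own folder, so that every json stays small, and then merging the jsons offline with a plain script, roughly like this (a hypothetical, untested helper; the "file_name" entries would additionally need to be prefixed with their chunk folder):

```python
import json
from pathlib import Path

def merge_coco(chunk_dirs, out_path):
    """Merge per-chunk coco_annotations.json files into one (hypothetical helper)."""
    merged = {"images": [], "annotations": [], "categories": None}
    next_img_id, next_ann_id = 0, 0
    for chunk_dir in chunk_dirs:
        coco = json.loads((Path(chunk_dir) / "coco_annotations.json").read_text())
        if merged["categories"] is None:
            merged["categories"] = coco["categories"]  # assumed identical per chunk
        id_map = {}
        for img in coco["images"]:
            id_map[img["id"]] = next_img_id  # re-index image ids across chunks
            img["id"] = next_img_id
            merged["images"].append(img)
            next_img_id += 1
        for ann in coco["annotations"]:
            ann["id"] = next_ann_id
            ann["image_id"] = id_map[ann["image_id"]]
            merged["annotations"].append(ann)
            next_ann_id += 1
    Path(out_path).write_text(json.dumps(merged))

merge_coco([f"output/chunk_{i:03d}/coco_data" for i in range(20)], "output/merged.json")
```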

So is there a way to avoid this, and what is the most time-efficient way to create big datasets when you can't use keyframes?

Maybe store the results of multiple "render" and "render_segmap" calls and then write several images at once with "write_coco_annotations"?
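
What I mean is something like the following. I am assuming the lists returned by "render" can simply be concatenated across calls before a single "write_coco_annotations", since the writer appears to just iterate over them, but I have not verified this:

```python
batch = {"colors": [], "instance_segmaps": [], "instance_attribute_maps": []}
for _ in range(100):                 # e.g. write every 100 images
    bproc.utility.reset_keyframes()  # one frame per iteration
    randomize_scene()                # placeholder: texture/hue changes, camera pose
    data = bproc.renderer.render()
    for key in batch:
        batch[key].extend(data[key])

# one json read/write per 100 images instead of one per image
bproc.writer.write_coco_annotations(
    "output/coco_data",
    instance_segmaps=batch["instance_segmaps"],
    instance_attribute_maps=batch["instance_attribute_maps"],
    colors=batch["colors"],
    append_to_existing_output=True,
)
```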

Somewhere else in the issues here I read that it is not recommended to render a lot of images with one script. Does this only apply to rendering many keyframes in a single "render" call, or also to multiple "render" calls in one Python file?
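
If long-running scripts are indeed discouraged, I could also restart the whole generator per chunk from a small driver outside of Blender, roughly like this (the script name and its CLI arguments are made up; I am relying on "blenderproc run" forwarding extra arguments to the script):

```python
import subprocess

IMAGES_PER_CHUNK = 500

for chunk in range(20):
    # a fresh process per chunk, so nothing can accumulate across thousands of frames
    subprocess.run(
        ["blenderproc", "run", "generate.py",
         "--output-dir", f"output/chunk_{chunk:03d}",
         "--num-images", str(IMAGES_PER_CHUNK)],
        check=True,
    )
```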

Best regards and thanks in advance!

Minimal code example

No response

Files required to run the code

No response

Expected behavior

Writing the COCO json file on multiple cores.

BlenderProc version

2.6.2

@AlcatrAAz added the question label on Feb 8, 2024
@AlcatrAAz (Author)

Additional Info:

I don't think the problem is only reading/writing the json file. I ran a Python script overnight and observed the following.

When I started the script and generated images into a new folder (new json), I could generate ca. 30-35 images per minute.
After the Python script had been running for 15 hours, I could only generate ca. 4 images per minute into a new folder (new json).

I observed this with two Python scripts that I ran in parallel overnight.
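
To narrow down whether the render itself or the json writing degrades over time, I will time both steps separately in the next run, roughly like this:

```python
import time

t0 = time.perf_counter()
data = bproc.renderer.render()
t1 = time.perf_counter()
bproc.writer.write_coco_annotations(
    "output/coco_data",
    instance_segmaps=data["instance_segmaps"],
    instance_attribute_maps=data["instance_attribute_maps"],
    colors=data["colors"],
    append_to_existing_output=True,
)
t2 = time.perf_counter()
print(f"render: {t1 - t0:.1f}s  coco write: {t2 - t1:.1f}s")
```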
