
Docker image doesn't follow concurrency limit for ffmpeg #283

Open
GlenHertz opened this issue Jan 24, 2022 · 4 comments
Labels
bug Existing bug

Comments

@GlenHertz

Bug description

I start thumbsup using the Docker image with concurrency limited to 2. The terminal shows "Processing media" with 2 files being processed (good), but checking the processes there are about 20 or so ffmpeg processes/threads running and my load is 10 to 20. I expect only 2 ffmpeg processes to be running at a time. I only have 4 CPUs, so all these extra processes make it run slower and use more memory.

Looking at things a bit closer, when I view the process tree (using htop and pressing t) I actually only have two top-level ffmpeg processes running, each with 20 threads. Should a -threads 1 argument be passed to ffmpeg? Is there some way to do this manually from thumbsup? I don't recall the non-Docker version having this issue.
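
For reference, ffmpeg does accept a -threads option, so a quick manual transcode can show what single-threaded behaviour looks like (the file names and codec here are just placeholders, not what thumbsup actually runs):

# Transcode one clip with a single codec thread (illustrative paths)
ffmpeg -i input.mov -threads 1 -c:v libx264 output.mp4
# Count threads across all running ffmpeg processes on Linux
ps -eLf | grep '[f]fmpeg' | wc -l

Note that -threads limits codec threading; a few auxiliary threads (demuxing, filters) may still show up, but the count should no longer scale with the number of cores.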

Steps to reproduce

On Ubuntu 20.04:

dir=/mnt/zpool/tank/media
docker run -t -v $dir/Pictures:/Pictures:ro -v /etc/localtime:/etc/localtime -v $dir/Pictures.browse:/Pictures.browse -u $(id -u):$(id -g) ghcr.io/thumbsup/thumbsup thumbsup --input /Pictures --output /Pictures.browse --include '20[0-9][0-9]/**' --include-videos true --photo-download symlink --video-download symlink --link-prefix ../Pictures --sort-albums-by title --cleanup true --embed-exif true --log default --log-file /Pictures.browse/thumbsup.log --concurrency 2
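
As an aside (my own workaround idea, not something from the thumbsup docs): independently of --concurrency, Docker itself can cap how much CPU the container gets, which bounds the load even if ffmpeg spawns many threads. For example, adding --cpus=2 to the same command:

# Illustrative: same command as above, with the container capped at roughly 2 CPUs' worth of time
docker run --cpus=2 -t -v $dir/Pictures:/Pictures:ro ... ghcr.io/thumbsup/thumbsup thumbsup --concurrency 2 ...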
GlenHertz added the bug (Existing bug) label on Jan 24, 2022
@rprieto
Member

rprieto commented Jan 29, 2022

Hi @GlenHertz. Thank you for raising this!

You're right, --concurrency controls the number of files processed at once, so from what you found that part is working as expected.

It's a very good point that ffmpeg can run multiple threads. I'm not sure if the difference in behaviour could be related to Docker vs non-Docker. Do you know how ffmpeg decides on the number of threads? Or maybe it's the ffmpeg version that's running?

I'll try running --threads 1 on some test galleries and see how it affects performance. We could either make it the default or make it configurable.
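
A rough way to compare (just a sketch, not the actual test setup) is to transcode the same clip with ffmpeg's default auto-threading and again with -threads 1, then compare wall-clock time and observed load:

# Default threading (ffmpeg picks a thread count automatically)
time ffmpeg -y -i sample.mp4 -c:v libx264 out_auto.mp4
# Single codec thread
time ffmpeg -y -i sample.mp4 -threads 1 -c:v libx264 out_single.mp4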

@GlenHertz
Author

Thanks. If I do one run with just photos and 4 CPUs, and then a separate video-only run with 1 CPU, will the final database have both videos and photos?

@GlenHertz
Author

I believe ffmpeg uses as many threads as possible to fully consume all CPUs, so it really should not be run multiple times in parallel.
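
As a rough back-of-the-envelope (based on the numbers above): with --concurrency 2 and ffmpeg auto-threading on a 4-CPU machine, each ffmpeg process can easily end up with around 10 threads (encoder threads plus demux/filter helpers), so 2 processes × ~10 threads ≈ 20 threads, which would line up with the thread count and the load of 10 to 20 I'm seeing.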

@rprieto
Member

rprieto commented Jan 31, 2022

Hi @GlenHertz, for your first question: no. Each run is independent, so the second run would overwrite the first one and you'd only have videos.
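
Put differently (a sketch only; --include-photos is the flag name as I recall it, so please double-check it):

# Run 1: photos only -> the gallery contains photos
thumbsup --input /Pictures --output /Pictures.browse --include-videos false
# Run 2: videos only, same --output -> the gallery is rebuilt and now contains only videos
thumbsup --input /Pictures --output /Pictures.browse --include-photos false --include-videos true

To get both photos and videos in one gallery, you'd do a single combined run, like your original command with --include-videos true.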
