Expectations around CPU/Memory usage with serverless #390
Comments
Following up on this, I feel like something might be amiss: when running an asset reindex, we see hundreds of Optimize Image jobs put into the Craft queue. Would you expect this to trigger such a spike in both CPU and memory when the transforms are just being farmed off to the serverless API in AWS? We see a CPU peak of nearly 2 this time, and max memory hits 2GB (which, relative to normal load, is massive).
Two parts are not farmed out: the low-resolution placeholder images and the silhouettes.
These are both done locally. The CPU/memory spikes you mention do seem excessive, but if the images are very large, it's possible, I suppose. You could disable those options if you don't use them, or disable them to test.
We don't use silhouettes. Do the low-resolution placeholders always get regenerated, regardless of whether they previously existed? We are looking at maybe 200 images of up to 4-5k resolution, so I guess that is plausible.
So this information is saved in the field data itself. It will be regenerated any time the optimized images fields are resaved |
Just to confirm: these jobs should be limited to the max PHP memory allocated (in our case 512MB)? We only run 2 jobs concurrently, so I'm a little perplexed why it is hitting 2GB.
No... because image processing packages like GD and Imagick are actually implemented as PHP extensions (written in C, I believe) and use memory outside of the PHP memory pool: https://stackoverflow.com/questions/9993085/php-imagick-memory-leak
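One way to cap that native memory from within PHP is Imagick's resource-limit API. A minimal sketch (not from this thread; it assumes the Imagick extension is loaded, and the path is hypothetical):

```php
<?php
// Cap ImageMagick's own allocator before processing. These limits apply
// to memory outside PHP's memory_limit. Note: depending on the Imagick
// version, the values may be interpreted as bytes or as megabytes, so
// verify against your installed version.
Imagick::setResourceLimit(Imagick::RESOURCETYPE_MEMORY, 256 * 1024 * 1024); // heap cap
Imagick::setResourceLimit(Imagick::RESOURCETYPE_MAP, 512 * 1024 * 1024);    // mmap cap

$image = new Imagick('/path/to/large-image.jpg'); // hypothetical path
$image->thumbnailImage(300, 0); // runs under the caps set above
```

When the memory cap is exceeded, ImageMagick falls back to a disk-based pixel cache rather than failing outright, so tight limits trade memory spikes for slower processing.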
Interesting. Is there any way to limit that process outside of PHP? It sort of defeats the point of our nice serverless solution if the placeholder images can cause such large memory spikes.
Presumably something similar to this issue? craftcms/cms#13098. The suggestion from Brad there is to limit it using a security policy, as per: https://www.bigbinary.com/blog/configuring-memory-allocation-in-imagemagick. Worryingly, there are quite a few memory-related issues on their GH repo: https://github.com/ImageMagick/ImageMagick/issues?q=is%3Aissue+is%3Aopen+memory
True for Imagick, but I don't think so for GD, since that is usually part of the PHP compilation. GD should respect the resource limits in php.ini. The way to limit resources in Imagick is via https://imagemagick.org/script/security-policy.php
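As a sketch of what such a security policy might look like (the values below are illustrative, not from this thread), the limits go in ImageMagick's policy.xml:

```xml
<!-- Illustrative policy.xml fragment. The file's location varies by
     install (e.g. /etc/ImageMagick-6/policy.xml on many Linux distros). -->
<policymap>
  <!-- Cap heap memory; past this, ImageMagick spills the pixel cache to disk -->
  <policy domain="resource" name="memory" value="256MiB"/>
  <!-- Cap the memory-mapped pixel cache -->
  <policy domain="resource" name="map" value="512MiB"/>
  <!-- Cap temporary disk usage for spilled pixel caches -->
  <policy domain="resource" name="disk" value="1GiB"/>
</policymap>
```

You can verify what limits are in effect with `identify -list resource`.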
Ah, thanks @angrybrad, I thought they both worked the same way.
Question
Hi Andrew, adopting this plugin has long been on our list, and our initial tests/usage look good. However, we hit the classic memory issues with native Craft transforms and instead moved over to AWS serverless transforms. Again, this looks promising, but I wanted to understand the expectations around CPU and memory usage. As you can see from the Grafana dashboard below, we are using 1 full CPU for the single transform job (run via the CLI) and peaking at about 500MB memory, even though there are no transforms happening on the container (it returns to a baseline of around 250MB when complete).
This is processing about 180 images with 3 ImageOptimize fields, each with 2-3 variants.