Exporter uses a lot of memory when scraping secrets from k8s, which sometimes leads to OOM #222

Open
agrevtsev opened this issue Dec 16, 2023 · 1 comment

Comments

@agrevtsev

Since we scrape all secrets from k8s regardless of their type, this can take a LOT of memory (3 GB+).
I suspect this could be the cause of #97 and #103.
For example, a long-running cluster can have hundreds of helm.sh/release.v1 secrets per namespace.
So my proposal is to fetch only the secret types listed in kubeSecretTypes from the k8s API.
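
A minimal sketch of that idea with client-go, assuming the API server's field selector on the secret `type` field (hard-coded here to kubernetes.io/tls; the real change would loop over the configured kubeSecretTypes). This is only an illustration, not the exporter's actual code:

```go
package main

import (
	"context"
	"fmt"

	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/apimachinery/pkg/fields"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/rest"
)

func main() {
	config, err := rest.InClusterConfig()
	if err != nil {
		panic(err)
	}
	clientset, err := kubernetes.NewForConfig(config)
	if err != nil {
		panic(err)
	}

	// Ask the API server for TLS secrets only, instead of listing every
	// secret (including the large helm.sh/release.v1 ones) and filtering
	// client-side. One List call would be needed per watched secret type.
	selector := fields.OneTermEqualSelector("type", string(corev1.SecretTypeTLS)).String()
	secrets, err := clientset.CoreV1().Secrets(metav1.NamespaceAll).List(context.TODO(), metav1.ListOptions{
		FieldSelector: selector,
	})
	if err != nil {
		panic(err)
	}
	fmt.Printf("fetched %d TLS secrets\n", len(secrets.Items))
}
```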

I prepared a POC in #221, which works.

Disclaimer: unfortunately, I'm not proficient in Go, so I can't write anything complicated yet.

Cheers, Alexey

```
go tool pprof -sample_index=inuse_space -text heap2.out
File: x509-certificate-exporter
Type: inuse_space
Time: Dec 15, 2023 at 6:40pm (-03)
Showing nodes accounting for 1.54GB, 99.65% of 1.54GB total
Dropped 20 nodes (cum <= 0.01GB)
      flat  flat%   sum%        cum   cum%
    1.54GB 99.65% 99.65%     1.54GB 99.68%  io.ReadAll
         0     0% 99.65%     1.54GB 99.68%  github.com/enix/x509-certificate-exporter/v3/internal.(*Exporter).DiscoverCertificates
         0     0% 99.65%     1.54GB 99.68%  github.com/enix/x509-certificate-exporter/v3/internal.(*Exporter).ListenAndServe
         0     0% 99.65%     1.54GB 99.68%  github.com/enix/x509-certificate-exporter/v3/internal.(*Exporter).getWatchedSecrets
         0     0% 99.65%     1.54GB 99.68%  github.com/enix/x509-certificate-exporter/v3/internal.(*Exporter).parseAllCertificates
         0     0% 99.65%     1.54GB 99.68%  github.com/enix/x509-certificate-exporter/v3/internal.(*Exporter).parseAllKubeSecrets
         0     0% 99.65%     1.54GB 99.68%  k8s.io/client-go/kubernetes/typed/core/v1.(*secrets).List
         0     0% 99.65%     1.54GB 99.68%  k8s.io/client-go/rest.(*Request).Do
         0     0% 99.65%     1.54GB 99.68%  k8s.io/client-go/rest.(*Request).Do.func1
         0     0% 99.65%     1.54GB 99.68%  k8s.io/client-go/rest.(*Request).request
         0     0% 99.65%     1.54GB 99.68%  k8s.io/client-go/rest.(*Request).request.func3
         0     0% 99.65%     1.54GB 99.68%  k8s.io/client-go/rest.(*Request).request.func3.1 (inline)
         0     0% 99.65%     1.54GB 99.68%  k8s.io/client-go/rest.(*Request).transformResponse
         0     0% 99.65%     1.54GB 99.68%  main.main
         0     0% 99.65%     1.54GB 99.68%  runtime.main
```
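
The profile above shows the whole List response being buffered at once by io.ReadAll. A complementary idea (not part of the proposal in #221) would be to page through secrets with Limit/Continue via client-go's pager helper, so each HTTP response stays small; a rough sketch, assuming per-item processing is acceptable:

```go
package main

import (
	"context"
	"fmt"

	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/apimachinery/pkg/runtime"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/rest"
	"k8s.io/client-go/tools/pager"
)

func main() {
	config, err := rest.InClusterConfig()
	if err != nil {
		panic(err)
	}
	clientset, err := kubernetes.NewForConfig(config)
	if err != nil {
		panic(err)
	}

	// List secrets in pages of 100 so each response (and the buffer
	// io.ReadAll allocates for it) stays bounded, instead of one giant list.
	p := pager.New(pager.SimplePageFunc(func(opts metav1.ListOptions) (runtime.Object, error) {
		return clientset.CoreV1().Secrets(metav1.NamespaceAll).List(context.TODO(), opts)
	}))
	count := 0
	err = p.EachListItem(context.TODO(), metav1.ListOptions{Limit: 100}, func(obj runtime.Object) error {
		secret := obj.(*corev1.Secret)
		_ = secret // inspect secret.Data for certificates here
		count++
		return nil
	})
	if err != nil {
		panic(err)
	}
	fmt.Printf("processed %d secrets\n", count)
}
```
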
@tech1ndex

+1, we are seeing the same issue in our environments and have had to bump the exporter's memory limit twice already.
