
High CPU and memory usage in ztunnel when client opens many connections #864

Closed
ably77 opened this issue Apr 2, 2024 · 2 comments
ably77 commented Apr 2, 2024

Is this the right place to submit this?

  • This is not a security vulnerability or a crashing bug
  • This is not a question about how to use Istio

Bug Description

While running a load test against an application in Ambient mode, I ran into high CPU and memory usage in ztunnel when the client opens many connections.

Output of kubectl top pods -n kube-system --sort-by cpu

NAME                                                             CPU(cores)   MEMORY(bytes)
ztunnel-p9lx7                                                    2021m        7757Mi
ztunnel-lvn4h                                                    1008m        937Mi
ztunnel-hkrxt                                                    928m         4510Mi
ztunnel-p26kl                                                    885m         3374Mi
ztunnel-g4hx7                                                    865m         3757Mi
ztunnel-r6nq9                                                    828m         3006Mi
ztunnel-bfl9n                                                    825m         2890Mi
ztunnel-69d4v                                                    779m         2211Mi
ztunnel-tcq6z                                                    765m         1542Mi
ztunnel-tw6rr                                                    732m         1389Mi
ztunnel-zn9nb                                                    694m         1408Mi
ztunnel-sxvb2                                                    681m         1452Mi
ztunnel-h2pf9                                                    659m         3198Mi
ztunnel-cj2tm                                                    658m         1900Mi
ztunnel-cxrmw                                                    657m         1319Mi
ztunnel-sd6bn                                                    655m         1431Mi
ztunnel-jw9ch                                                    643m         3887Mi
ztunnel-45kp8                                                    610m         1249Mi
ztunnel-7n4hw                                                    596m         1886Mi
ztunnel-r5x4g                                                    562m         4716Mi
ztunnel-dpxb4                                                    525m         3465Mi
ztunnel-ptn8j                                                    471m         3807Mi
ztunnel-cnxpc                                                    439m         3252Mi

Attachments: ztunnel-p9lx7-profile.zip (CPU profile of the worst pod) and a flamegraph rendered from it.

Reproduction Steps
I have outlined the steps in this repo here to reproduce the issue on a small cluster (2-3 nodes) with a 5-namespace app plus load generators.
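For a rough idea of the load pattern (not part of the linked repo): a minimal Python asyncio client that opens and holds a large number of concurrent TCP connections to an in-mesh service. The target host, port, connection count, and hold time below are placeholder assumptions, not values from the actual load generators.

# Illustrative sketch only: hold many concurrent connections open so the
# node's ztunnel has to proxy all of them at once. Placeholders throughout.
import asyncio

TARGET_HOST = "app.demo.svc.cluster.local"  # hypothetical in-mesh service
TARGET_PORT = 8080                          # hypothetical port
NUM_CONNECTIONS = 5000                      # may require raising ulimit -n

async def hold_connection(hold_seconds: float = 60.0) -> None:
    # Open a TCP connection, keep it idle, then close it cleanly.
    reader, writer = await asyncio.open_connection(TARGET_HOST, TARGET_PORT)
    try:
        await asyncio.sleep(hold_seconds)
    finally:
        writer.close()
        await writer.wait_closed()

async def main() -> None:
    # Every open connection is proxied by ztunnel on the client's node,
    # so a large concurrent count drives its CPU and memory usage.
    await asyncio.gather(*(hold_connection() for _ in range(NUM_CONNECTIONS)))

if __name__ == "__main__":
    asyncio.run(main())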

Version

% istioctl version
client version: 1.21.0
control plane version: 1.21.0
data plane version: 1.21.0

% kubectl version
Client Version: v1.29.0
Kustomize Version: v5.0.4-0.20230601165947-6ce0bf390ce3
Server Version: v1.28.3-gke.1118000

Additional Information

No response

@howardjohn
Member

With 1.22-beta1 this is basically expected to be fixed.

@howardjohn
Member

Assumed fixed; anyone seeing this in rc0+, let us know!
