
Helm install fails in PSS restricted clusters because of hardcoded securityContext values #599

Open
m00nyONE opened this issue Jan 19, 2024 · 2 comments

Comments

@m00nyONE

What happened:

I wanted to deploy KubeClarity via Helm on a cluster secured with PSS (restricted mode).

What you expected to happen:

That the Helm chart would give the opportunity to set seccompProfiles.

How to reproduce it (as minimally and precisely as possible):

Create a cluster with PSS enabled and try to install KubeClarity in a restricted namespace.
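For reference, a namespace can be switched into restricted enforcement with the standard Pod Security Standards label (the namespace name `kubeclarity` is an assumption here):

```shell
# Label the target namespace so the restricted PSS profile is enforced;
# any pod lacking a compliant securityContext will then be rejected.
kubectl label namespace kubeclarity \
  pod-security.kubernetes.io/enforce=restricted
```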

Are there any error messages in KubeClarity logs?

no

Anything else we need to know?:

pods "kubeclarity-kubeclarity-grype-server-59f88f8f8d-x8mzb" is forbidden: violates PodSecurity "restricted:latest": seccompProfile (pod or container "grype-server" must set securityContext.seccompProfile.type to "RuntimeDefault" or "Localhost")

The problem can easily be solved by not hardcoding the securityContext and allowing a manual override in the Helm chart.

Environment:

  • Kubernetes version (use kubectl version --short): 1.26
  • Helm version (use helm version): 3.13.2
  • KubeClarity version (use kubectl -n kubeclarity exec deploy/kubeclarity -- ./backend version): 2.23.1
  • KubeClarity Helm Chart version (use helm -n kubeclarity list): 2.23.1
  • Cloud provider or hardware configuration: I cannot share details; the only thing I am allowed to say is that it is a Gardener-provisioned cluster
  • Others:

Thank you for your contribution! This issue has been automatically marked as stale because it has no recent activity in the last 60 days. It will be closed in 14 days, if no further activity occurs. If this issue is still relevant, please leave a comment to let us know, and the stale label will be automatically removed.

@github-actions github-actions bot added the stale label Mar 24, 2024
@tobiasbartel

+1 facing the same issue right now
