
PostgreSQL PVC migration doesn't seem to work #231

Open
mbareck7 opened this issue Jul 13, 2023 · 0 comments

Describe the bug
When migrating a PVC that holds PostgreSQL data and then deleting the pods after running the migration, the newly created pod crashes with the logs attached below. I've encountered the problem with two different apps (Helm charts): keycloak and minio.

We are using Skaffold; below is the relevant part of the skaffold.yaml file:

    # truncated  
      - name: keycloak
        namespace: softwarefactory
        repo: https://codecentric.github.io/helm-charts
        remoteChart: keycloak
        version: "15.1.0"
        valuesFiles:
          - values/keycloak.yaml

      - name: outline
        namespace: softwarefactory
        upgradeOnChange: true
        repo: https://gitlab.com/api/v4/projects/30221184/packages/helm/stable/
        remoteChart: outline
        version: 0.0.8
        valuesFiles:
          - values/outline.yaml
         
    # truncated 

To Reproduce
Steps to reproduce the behavior:
Install/download the Skaffold binary.

  1. Run command 'skaffold run'
  2. Use pv-migrate to migrate the PVCs, then delete the pods so they get recreated (a rough sketch of these commands is shown after this list).
  3. When viewing the newly created pod's logs with kubectl logs pod-name -f, you will get the error shown in the Console output section.
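
For reference, a rough sketch of step 2; the PVC names and the exact pv-migrate flags below are illustrative, not the literal invocation used:

    # Copy the data from the old PVC to the new one (PVC names are illustrative).
    pv-migrate migrate \
      --source-namespace softwarefactory \
      --dest-namespace softwarefactory \
      data-outline-postgresql-0-old data-outline-postgresql-0

    # Delete the pod so the StatefulSet recreates it on top of the migrated PVC.
    kubectl delete pod outline-postgresql-0 -n softwarefactory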

Expected behavior
The pod is in a Running state
Console output

mbareck@bombry:~/bitamer/workflow/devsecops/work/k8s$ k logs outline-postgresql-0 -f
postgresql 10:44:17.96 
postgresql 10:44:17.96 Welcome to the Bitnami postgresql container
postgresql 10:44:17.96 Subscribe to project updates by watching https://github.com/bitnami/bitnami-docker-postgresql
postgresql 10:44:17.96 Submit issues and feature requests at https://github.com/bitnami/bitnami-docker-postgresql/issues
postgresql 10:44:17.96 
postgresql 10:44:17.98 INFO  ==> ** Starting PostgreSQL setup **
postgresql 10:44:18.01 INFO  ==> Validating settings in POSTGRESQL_* env vars..
postgresql 10:44:18.01 INFO  ==> Loading custom pre-init scripts...
postgresql 10:44:18.02 INFO  ==> Initializing PostgreSQL database...
postgresql 10:44:18.03 INFO  ==> pg_hba.conf file not detected. Generating it...
postgresql 10:44:18.03 INFO  ==> Generating local authentication configuration
postgresql 10:44:18.04 INFO  ==> Deploying PostgreSQL with persisted data...
postgresql 10:44:18.05 INFO  ==> Configuring replication parameters
postgresql 10:44:18.07 INFO  ==> Configuring fsync
postgresql 10:44:18.09 INFO  ==> Loading custom scripts...
postgresql 10:44:18.10 INFO  ==> Enabling remote connections
postgresql 10:44:18.11 INFO  ==> ** PostgreSQL setup finished! **

postgresql 10:44:18.12 INFO  ==> ** Starting PostgreSQL **
2023-07-13 10:44:18.137 GMT [1] LOG:  pgaudit extension initialized
2023-07-13 10:44:18.137 GMT [1] LOG:  listening on IPv4 address "0.0.0.0", port 5432
2023-07-13 10:44:18.137 GMT [1] LOG:  listening on IPv6 address "::", port 5432
2023-07-13 10:44:18.147 GMT [1] LOG:  listening on Unix socket "/tmp/.s.PGSQL.5432"
2023-07-13 10:44:18.166 GMT [90] LOG:  database system shutdown was interrupted; last known up at 2023-07-13 10:43:47 GMT
2023-07-13 10:44:18.706 GMT [90] LOG:  invalid primary checkpoint record
2023-07-13 10:44:18.706 GMT [90] PANIC:  could not locate a valid checkpoint record
2023-07-13 10:44:18.790 GMT [1] LOG:  startup process (PID 90) was terminated by signal 6: Aborted
2023-07-13 10:44:18.790 GMT [1] LOG:  aborting startup due to startup process failure
2023-07-13 10:44:18.791 GMT [1] LOG:  database system is shut down

Version

  • Source and destination Kubernetes versions [v1.24.5 (Scaleway managed k8s), v1.25.11+rke2r1]
  • Source and destination container runtimes [containerd://1.6.6, containerd://1.7.1-k3s1]
  • pv-migrate version and architecture [v1.1.0 - linux_x86_64]
  • Installation method [binary download]
  • Source and destination PVC type, size and accessModes [ReadWriteOnce, 4G, csi.scaleway.com -> ReadWriteOnce, 4Gi, openebs.io/local]

Additional context
Content of the skaffold.yaml, keycloak.yaml and outline.yaml files:

skaffold.yaml

apiVersion: skaffold/v2beta28
kind: Config
metadata:
  name: softwarefactory
deploy:
  helm:
    flags:
      upgrade:
        - --install
    releases:
      - name: keycloak
        namespace: softwarefactory
        repo: https://codecentric.github.io/helm-charts
        remoteChart: keycloak
        version: "15.1.0"
        valuesFiles:
          - values/keycloak.yaml

      - name: outline
        namespace: softwarefactory
        upgradeOnChange: true
        repo: https://gitlab.com/api/v4/projects/30221184/packages/helm/stable/
        remoteChart: outline
        version: 0.0.8
        valuesFiles:
          - values/outline.yaml
        

keycloak.yaml

replicas: 1
extraEnv: |
  - name: KEYCLOAK_USER
    value: admin-user
  - name: KEYCLOAK_PASSWORD
    value: aVerySecureAdminPassword 
  - name: PROXY_ADDRESS_FORWARDING
    value: "true"
ingress:
  enabled: true
  annotations:
    kubernetes.io/ingress.class: "nginx" #change this if not using nginx as ingress controller
    cert-manager.io/cluster-issuer: "cluster-issuer-name" #change this
  rules:
    - host: somedomain.something
      paths:
        - path: /
          pathType: Prefix
  tls:
    - hosts:
        - somedomain.something #change this
      secretName: keycloak-adias-tls
postgresql:
  postgresqlPostgresPassword: postgres

outline.yaml

env:
  OIDC_CLIENT_ID: client-id #change this
  OIDC_CLIENT_SECRET: client-secret #change this
  OIDC_AUTH_URI: https://sso.somedomain.something/auth/realms/Adias/protocol/openid-connect/auth
  OIDC_TOKEN_URI: https://sso.somedomain.something/auth/realms/Adias/protocol/openid-connect/token
  OIDC_USERINFO_URI: https://sso.somedomain.something/auth/realms/Adias/protocol/openid-connect/userinfo
envSecrets:
secretKey: "5974e36f82085f057223565ac2d0d01427a90248ade611108b4b937672649bd4" # MUST be replaced! Generate a hex-encoded 32-byte random key. You should use `openssl rand -hex 32`
utilsSecret: "8c5a14973325ba56694a99856e6be330812b875e5f683dd37022cf76444774bb" # Generate a unique random key. The format is not important but you could still use `openssl rand -hex 32`


ingress:
  host: wiki.somedomain.something
  annotations:
    kubernetes.io/ingress.class: "nginx"
    cert-manager.io/cluster-issuer: "cluster-issuer-name"
  tls:
    enabled: true

redis:
  enabled: true
  architecture: "standalone"
  persistence:
    size: 2Gi
    storageClass: "storageclass-name" 
  auth:
    enabled: false

postgresql:
  enabled: true
  architecture: "standalone"
  postgresqlDatabase: "outline"
  postgresqlUsername: "outline"
  postgresqlPassword: "pg-passs"
  postgresqlPostgresPassword: "pg-postgres-pass"
  persistence:
    enabled: true
    storageClass: "storageclass-name"
    size: 4Gi

minio:
  ingress:
    hostname: "minio.somedomain.something"
    certManager: true
    tls: true
    annotations:
      kubernetes.io/ingress.class: "nginx"
      cert-manager.io/cluster-issuer: "cluster-issuer-name"
  secretKey:
    password: "secretKeyHere"
  accessKey:
    password: "accessKeyHere2"
  persistence:
    storageClass: "storageclass-name"
    size: 30Gi
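
The secretKey and utilsSecret values in outline.yaml above are placeholders; as the inline comments note, real values can be generated with openssl:

    # Generate a hex-encoded 32-byte random key (run once for secretKey and once for utilsSecret).
    openssl rand -hex 32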