-
Hello all,

How can I use the external Postgres server that we have in production? That may sound confusing, so let me break it down:

1 - We need to export / migrate the data that we already have in the DD PROD environment (with tests, findings, and so on) to an external Postgres DB.
2 - We need to install DD using the external Postgres DB. I have already found some information about this, but it is a bit confusing where everything should be configured.
3 - Can we streamline the docker-compose file so that we only use the configuration we need (Postgres and so on), instead of all the confusion of the profiles?
Thanks in advance for your support. Jorge Gomes
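For step 1, a minimal sketch of exporting the existing data as JSON with Django's `dumpdata`, assuming the stack runs via docker-compose and the Django service is called `uwsgi` (service, path, and container names here are assumptions and may differ in your setup):

```shell
# Dump everything except content types and permissions, which Django
# recreates on migrate and which commonly cause loaddata conflicts:
docker compose exec uwsgi python manage.py dumpdata \
    --exclude contenttypes --exclude auth.permission \
    --indent 2 --output dumpfile.json

# Copy the dump out of the container to the host
# (assumes the app's working directory inside the container is /app):
docker compose cp uwsgi:/app/dumpfile.json ./dumpfile.json
```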
Replies: 7 comments 13 replies
-
@quirinziessler, maybe you can help here.
-
Hi @jasgggit, I actually had nearly the same situation as you a few months ago. I will try to answer your questions.

Regarding 1:
Regarding 2:
Regarding 3:

Hope this helps you a bit.
-
Hi @quirinziessler, I did; the '-v 3' sets verbosity level 3. This is what it outputs:

/opt/containers/django-DefectDojo # docker exec -it django-defectdojo-uwsgi-1 python manage.py loaddata -v 3 dumpfile.json

Can't figure out what the issue is.
-
About this:
An example of that is available in the Community Contribs repo: If you navigate into the dojo directory in that part of Community Contribs, you can see a 'simplified' version of compose at:
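As a rough sketch of pointing DefectDojo at an external Postgres server instead of the bundled one: DefectDojo reads its database connection from environment variables, so an override file can set them for each Django service. The variable name `DD_DATABASE_URL`, the service names, and the host/credentials below are assumptions to verify against your DefectDojo version's compose file and settings:

```shell
# Write a compose override that injects the external DB connection
# string into the Django services (names assumed; check your compose file):
cat > docker-compose.override.yml <<'EOF'
services:
  uwsgi:
    environment:
      DD_DATABASE_URL: "postgres://defectdojo:CHANGEME@db.example.com:5432/defectdojo"
  celeryworker:
    environment:
      DD_DATABASE_URL: "postgres://defectdojo:CHANGEME@db.example.com:5432/defectdojo"
  celerybeat:
    environment:
      DD_DATABASE_URL: "postgres://defectdojo:CHANGEME@db.example.com:5432/defectdojo"
EOF

# Then bring up only the services you need, skipping the bundled postgres:
docker compose up -d uwsgi nginx celeryworker celerybeat redis
```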
-
Hi @jasgggit, I just tried to reproduce your case; however, dumping and loading the file works fine for me. Could you please check whether the dumpfile is available inside the container?

Just a side note: the command I sent you saved the file in the current directory (not inside the Docker container/volume). In order to have it inside the container, move it to one of the DefectDojo volumes, e.g. media. Another option would be to log in to the uwsgi container and drop it directly into media. After configuring with Postgres, log in again and load it from media.
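The steps above can be sketched like this, using `docker cp` to place the dump inside the container before loading (the container name is taken from the command earlier in the thread; the in-container path is an assumption):

```shell
# Copy the dump from the host into the uwsgi container:
docker cp dumpfile.json django-defectdojo-uwsgi-1:/app/dumpfile.json

# Flush first, then load. Note: flush wipes the database, so only do
# this on a fresh install you intend to overwrite:
docker exec -it django-defectdojo-uwsgi-1 python manage.py flush --no-input
docker exec -it django-defectdojo-uwsgi-1 python manage.py loaddata -v 3 /app/dumpfile.json
```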
-
Hello all, the problem has been solved by one of my colleagues, who knows more about Postgres than I do. He found that the issue with importing the JSON dump file was related to the sequences, specifically 'auditlog_logentry_id_seq': it needs to be changed from the default '1' to something greater than the highest id in the JSON file. Then we can run the commands to load the JSON after flushing first. This way everything is imported. Thanks for the support. Regards.
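A sketch of resetting the out-of-sync sequence described above, assuming psql access to the DefectDojo database (host, user, and database name are placeholders):

```shell
# setval() moves the sequence past the highest imported id so new
# audit log entries do not collide with the loaded rows:
psql -h db.example.com -U defectdojo -d defectdojo -c \
  "SELECT setval('auditlog_logentry_id_seq', (SELECT COALESCE(MAX(id), 1) FROM auditlog_logentry));"

# Django can also print the reset statements for all sequences of an app:
docker exec -it django-defectdojo-uwsgi-1 python manage.py sqlsequencereset auditlog
```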
-
For future readers of this discussion, there is some good info in OWASP's Slack instance in the #defectdojo channel. I'm summarizing greatly here, but one suggestion I found interesting was to use mysqldump to export the schema/data from MySQL and then use a tool like pgloader to convert the dump to Postgres. Anyway, hope that helps those who want/need to migrate.
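A minimal sketch of the MySQL-to-Postgres conversion mentioned above using pgloader; hostnames and credentials are placeholders. pgloader can read the MySQL schema and data directly over the wire, so a separate mysqldump step is not strictly required:

```shell
# Migrate schema and data from MySQL straight into Postgres:
pgloader mysql://dojo_user:CHANGEME@mysql.example.com/defectdojo \
         postgresql://defectdojo:CHANGEME@db.example.com/defectdojo
```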