Hi. We love SnowDDL on my team, but we have a gap right now. We rely on, I guess, either of:
It seems a pity to spin up full-on Terraform for just these two items. Thank you!
Replies: 1 comment
Ok, it is a complicated topic.

The main problem with these types of integrations is the strict requirement to do something outside of Snowflake to start or finish the configuration. In some cases (like storage integration in GCP), you even have to create an integration first, describe it, get some meta-information, and use it to create the actual infra objects.

With this in mind, it makes more sense to manage integrations together with infra objects, outside of Snowflake. Normally you would have a DevOps team with their own tools to do the job. Creating or changing integrations without proper infra won't work: you'll either get an error message or create a non-functioning object that fails on the first call.

Managing integrations in SnowDDL would also be a bit awkward in terms of "single responsibility" / "single point of truth". As a data team, how would you know bucket names, role IDs, queue IDs? Roles, buckets, and queues are normally created by the DevOps team, so DevOps should hold this information. If you hardcode it in your config, you now have a second copy of this information, which will become outdated on every infra update. On top of that, a valid integration definition does not mean the integration actually works and is configured correctly. Some ACLs may be revoked, and your integration won't work anymore, even if it remains perfectly correct from the config standpoint.

So... what can we do instead? If integration monitoring is really important for you, I suggest writing a basic Python script that checks every external stage with a LIST command and every API integration with a test call. You can run it regularly or after every run of SnowDDL. This would check not only that integrations are correctly configured in Snowflake, but also that your data infra is fully configured and working. It is like writing a functional test.
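A minimal sketch of what such a check could look like. The helper name, the assumed column positions in the `SHOW STAGES` output, and the cursor handling are my assumptions, not part of SnowDDL itself — adapt to your connector version and naming:

```python
# Sketch of the "functional test" idea: run LIST against every stage to
# confirm that the underlying storage integration and infra actually work,
# not just that the definition parses.

def check_stages(cursor):
    """Return a list of (stage_fqn, ok, error) for every stage found."""
    cursor.execute("SHOW STAGES IN ACCOUNT")
    # Assumed SHOW STAGES column layout:
    # created_on(0), name(1), database_name(2), schema_name(3), ...
    stages = [(row[2], row[3], row[1]) for row in cursor.fetchall()]

    results = []
    for database, schema, name in stages:
        fqn = f'"{database}"."{schema}"."{name}"'
        try:
            # LIST fails if the integration, credentials, or bucket ACLs
            # are broken -- exactly the failures config validation misses.
            cursor.execute(f"LIST @{fqn}")
            results.append((fqn, True, None))
        except Exception as exc:
            results.append((fqn, False, str(exc)))
    return results
```

In real use you would pass in a cursor from `snowflake.connector.connect(...)` (requires the `snowflake-connector-python` package) and probably filter the rows to external stages only; API integrations would get a similar loop with a cheap test call instead of LIST.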