The one known issue is the inability to do insert/overwrite on a SQL Warehouse: https://community.databricks.com/t5/data-engineering/configuration-spark-sql-sources-partitionoverwritemode-is-not/td-p/43646
We may have to detect a SQL Warehouse connection and have it do a delete/insert instead.
We should add a new test to our engine test_integrations tests to cover both general-purpose clusters and Serverless SQL Warehouse.
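A minimal sketch of the detection idea above. All names here (`is_sql_warehouse`, `overwrite_statements`, the parameters) are hypothetical illustrations, not the project's actual API; the HTTP-path convention is an assumption based on how Databricks connection strings commonly look.

```python
# Hypothetical sketch: pick an overwrite strategy based on whether the
# connection targets a Databricks SQL Warehouse. These helper names are
# illustrative only.

def is_sql_warehouse(http_path: str) -> bool:
    """Guess the endpoint type from the connection's HTTP path.

    Assumption: SQL Warehouse paths look like /sql/1.0/warehouses/<id>,
    while all-purpose clusters use /sql/protocolv1/o/<org>/<cluster-id>.
    """
    return "/warehouses/" in http_path


def overwrite_statements(
    table: str, predicate: str, select_sql: str, http_path: str
) -> list[str]:
    """Return the SQL statements needed to overwrite rows matching `predicate`."""
    if is_sql_warehouse(http_path):
        # SQL Warehouses reject spark.sql.sources.partitionOverwriteMode,
        # so emulate the overwrite with an explicit delete + insert.
        return [
            f"DELETE FROM {table} WHERE {predicate}",
            f"INSERT INTO {table} {select_sql}",
        ]
    # General-purpose clusters support INSERT OVERWRITE directly.
    return [f"INSERT OVERWRITE {table} {select_sql}"]
```

For a warehouse path this yields the two-statement delete/insert pair; for a cluster path, a single INSERT OVERWRITE. A real implementation would need to wrap the two statements in a transaction (or otherwise handle partial failure between the delete and the insert).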
This is now implemented, I believe? @eakmanrq
Although we don't test it, we do believe this is addressed now. Thanks.