DB Parameters not passed through to Snowflake API Connection #39622
Labels: area:providers, good first issue, kind:bug, provider:snowflake
Apache Airflow Provider(s)
snowflake
Versions of Apache Airflow Providers
apache-airflow-providers-snowflake == 5.5.0
Apache Airflow version
2.9.1
Operating System
mac os
Deployment
Official Apache Airflow Helm Chart
Deployment details
No response
What happened
When using the `SnowflakeSqlApiOperator` and specifying a DB parameter such as `schema` to override what is in the Airflow connection, the DB parameters are ignored. For example, if the schema retrieved via the `snowflake_conn_id` is `DEFAULT`, but the `SnowflakeSqlApiOperator` is instantiated with the argument `schema='OTHER'`, the query still executes against the API with the `DEFAULT` schema. The same is true for role, warehouse, etc.
What you think should happen instead
In this case, the query should execute with the schema specified in the arguments of `SnowflakeSqlApiOperator`, which should take precedence over what is in the connection object referenced by `snowflake_conn_id`.
How to reproduce
Assume that in Snowflake you have a schema structure like:
Database:
Then in Airflow, the connection object 'db_conn' contains the key: {.... "schema": "DEFAULT"....}
Finally, DAG code contains
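For reference, a minimal DAG sketch of the reproduction (the connection id `db_conn` matches the connection above; the DAG id, table name, and SQL are illustrative placeholders):

```python
# Hypothetical reproduction DAG; the table name and SQL are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.snowflake.operators.snowflake import SnowflakeSqlApiOperator

with DAG(
    dag_id="snowflake_schema_override",
    start_date=datetime(2024, 1, 1),
    schedule=None,
) as dag:
    # schema='OTHER' should override the connection's "schema": "DEFAULT",
    # but the query is still posted to the SQL API with the DEFAULT schema.
    run_query = SnowflakeSqlApiOperator(
        task_id="run_query",
        snowflake_conn_id="db_conn",
        schema="OTHER",
        sql="SELECT * FROM my_table",
    )
```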
This will fail with
Further debugging (checking `conn_config['schema']` in the hook's `execute_query`) indicates this is because the schema is still `DEFAULT` in the payload posted to the API.
Anything else
`hook_params` are set in the `SnowflakeSqlApiOperator` `__init__`, following a similar pattern to the one used in the base Airflow SQL operator. However, in the case of the `SnowflakeSqlApiOperator`, these `hook_params` are never used when creating the hook.
I think this issue can be solved by modifying the instantiation of the `SnowflakeSqlApiHook` to pass the `hook_params` as kwargs:
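A minimal, self-contained sketch of the proposed pattern, using stand-in classes rather than the real provider code (the real `SnowflakeSqlApiHook` already accepts extra kwargs such as `schema`, `warehouse`, and `role` that override the connection values; only the forwarding step is missing in the operator):

```python
# Stand-in classes illustrating the proposed fix; the names mirror the
# provider classes, but this is NOT the actual Airflow source.

class StubSnowflakeSqlApiHook:
    def __init__(self, snowflake_conn_id, **kwargs):
        self.snowflake_conn_id = snowflake_conn_id
        # In the real hook, these kwargs (schema, warehouse, role, ...)
        # override the values stored on the Airflow connection.
        self.overrides = kwargs


class StubSnowflakeSqlApiOperator:
    def __init__(self, snowflake_conn_id, **hook_params):
        self.snowflake_conn_id = snowflake_conn_id
        self.hook_params = hook_params  # collected in __init__, as today

    def get_hook(self):
        # Proposed fix: forward hook_params as kwargs when instantiating
        # the hook, instead of discarding them.
        return StubSnowflakeSqlApiHook(
            snowflake_conn_id=self.snowflake_conn_id,
            **self.hook_params,
        )


op = StubSnowflakeSqlApiOperator(snowflake_conn_id="db_conn", schema="OTHER")
hook = op.get_hook()
print(hook.overrides["schema"])  # OTHER
```

With this change, per-task values such as `schema='OTHER'` reach the hook and take precedence over the connection defaults.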
Are you willing to submit PR?
Code of Conduct