When trying to insert rows (even a single row) into a table using the pyodbc Python package, with either the user-level async_insert=1 setting or SETTINGS async_insert=1 inside the query, I receive the following error:
DB::Exception: Substitution `odbc_positional_1` is not set: While executing WaitForAsyncInsert.
(UNKNOWN_QUERY_PARAMETER) (version 23.3.5.9 (official build))
Everything works fine when async_insert=0, or when using the isql utility from unixODBC.
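A minimal repro sketch of what I'm running. The DSN name, table, and columns here are placeholders, not my real schema; the comment about odbc_positional_1 is an inference from the server error message:

```python
# Hypothetical DSN configured for clickhouse-odbc in odbc.ini.
DSN = "ClickHouse"

# pyodbc binds each '?' as an ODBC positional parameter; judging by the
# error text, clickhouse-odbc forwards them to the server as query
# substitutions named odbc_positional_1, odbc_positional_2, ...
INSERT_SQL = (
    "INSERT INTO test_table (id, value, name) "
    "SETTINGS async_insert=1 VALUES (?, ?, ?)"
)

def insert_row(row):
    # Deferred import: running this needs pyodbc plus a working
    # clickhouse-odbc driver registration.
    import pyodbc
    with pyodbc.connect(f"DSN={DSN}", autocommit=True) as conn:
        cur = conn.cursor()
        # Raises DB::Exception ... UNKNOWN_QUERY_PARAMETER with
        # async_insert=1; succeeds when async_insert=0.
        cur.execute(INSERT_SQL, row)

# insert_row((1, 3.14, "hello"))
```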
OS: Ubuntu 22.04.2 LTS (GNU/Linux 5.15.0-76-generic x86_64)
ODBC Driver Manager: unixODBC 2.3.9-5
ClickHouse ODBC driver: 1.2.1.20220905 (built from sources according to this doc)
ClickHouse server: 23.3
Python: 3.8.17
Pyodbc: 4.0.39
There is also a more general question: is it possible to make bulk inserts with pyodbc + the clickhouse-odbc driver? According to the logs, pyodbc seems to insert row by row: it takes 20 seconds to insert 1000 rows into a table with 3 columns (Int64, Float64 and String types) when using pyodbc.Cursor.executemany(), even with fast_executemany=True.
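The workaround I've been considering, sketched below, is to build a single multi-row INSERT ... VALUES statement so pyodbc issues one query instead of one round trip per row. This assumes clickhouse-odbc accepts multi-row VALUES with positional parameters; the table and column names are placeholders:

```python
from typing import Sequence

def build_bulk_insert(table: str, columns: Sequence[str], nrows: int) -> str:
    """Build one parameterized multi-row INSERT statement.

    cursor.execute() with this statement sends a single query, avoiding
    the per-row round trips seen with executemany().
    """
    row = "(" + ", ".join(["?"] * len(columns)) + ")"
    return (
        f"INSERT INTO {table} ({', '.join(columns)}) VALUES "
        + ", ".join([row] * nrows)
    )

def flatten(rows):
    # pyodbc expects a flat parameter sequence for a single execute().
    return [value for row in rows for value in row]

# Usage against a live connection (hypothetical table "t"):
# rows = [(1, 3.14, "a"), (2, 2.72, "b")]
# cur.execute(build_bulk_insert("t", ["id", "value", "name"], len(rows)),
#             flatten(rows))
```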