Support of Storage API #41
Comments
This requires adding support for the BigQuery Storage API to the DB-API adapter in google-cloud-bigquery first. I assume you want to improve the performance of downloading large results?
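For context, a minimal sketch of what the faster download path looks like against google-cloud-bigquery directly, assuming placeholder project and table names; the `bqstorage_client` argument to `to_dataframe` is the hook the DB-API adapter would need to expose:

```python
from google.cloud import bigquery
from google.cloud import bigquery_storage  # pip install google-cloud-bigquery-storage

bq_client = bigquery.Client()
bqstorage_client = bigquery_storage.BigQueryReadClient()

# Placeholder table name; use any table with a large result set.
query = "SELECT * FROM `my-project.my_dataset.large_table`"

# With a Storage API client, the row iterator streams result pages over
# gRPC instead of paging through the slower tabledata.list REST endpoint.
df = bq_client.query(query).result().to_dataframe(bqstorage_client=bqstorage_client)
```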
You are right on with your comment in https://github.com/googleapis/google-cloud-python/issues/9465; it is indeed to test/use with Superset. As a workaround, I can try pyodbc with the BQ ODBC driver.
@tswast does this close it? googleapis/python-bigquery#36 (comment)
It unblocks this feature from being completed, but doesn't quite fix it yet. We need to update `create_connect_args` to (optionally) return a BQ Storage API client alongside the BQ client. As I understand it, the return value of `create_connect_args` is what gets passed into the DB-API module's `connect()` function. This will also require adding a `use_bqstorage_api` option to the connection URL parsing logic.
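A minimal sketch of those two steps, not the actual pybigquery implementation; it relies on SQLAlchemy's convention that `create_connect_args` returns an `(args, kwargs)` pair that is passed straight into the DB-API `connect()` function:

```python
from google.cloud import bigquery, bigquery_storage
from sqlalchemy.engine.default import DefaultDialect


class BigQueryDialectSketch(DefaultDialect):
    """Illustrative only; the real pybigquery dialect differs."""

    def create_connect_args(self, url):
        # SQLAlchemy passes the returned (args, kwargs) directly into the
        # DB-API connect(), i.e. google.cloud.bigquery.dbapi.connect.
        kwargs = {"client": bigquery.Client(project=url.host)}
        # `use_bqstorage_api` is the URL option proposed in this thread.
        if url.query.get("use_bqstorage_api", "").lower() in ("true", "1"):
            kwargs["bqstorage_client"] = bigquery_storage.BigQueryReadClient()
        return ([], kwargs)
```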
Can you guys add some more details about the usage of this feature?
@alonme nothing to document since it's not implemented yet.
Maybe I wasn't clear, I was offering to help implement this.
@alonme Thank you for volunteering to help. To verify the updates suggested in #41 (comment), make a query against a table where the expected result set is moderately large (100 MB or more). When the BigQuery Storage API is used, the download should be noticeably faster.
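A sketch of such a verification run, assuming the `use_bqstorage_api` URL option proposed above (not a released flag) and placeholder names; running the same query through a plain `bigquery://my-project` engine gives the baseline to compare against:

```python
import time

from sqlalchemy import create_engine, text

# Placeholder project/table; pick a table whose result set is >= 100 MB.
engine = create_engine("bigquery://my-project?use_bqstorage_api=true")

start = time.monotonic()
with engine.connect() as conn:
    rows = conn.execute(
        text("SELECT * FROM `my-project.my_dataset.large_table`")
    ).fetchall()
print(f"fetched {len(rows)} rows in {time.monotonic() - start:.1f}s")
```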
@tswast cool, got a PoC to work.
Closing as per #61 (comment)
Are there any plans to implement this feature as optional? My understanding is that using the storage API incurs an extra financial cost, so it would be nice to be able to disable it when desired.
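If the option lands as the opt-in URL parameter proposed earlier in this thread (hypothetical until implemented), disabling it would just mean leaving it off the connection string:

```python
from sqlalchemy import create_engine

# Assuming the opt-in `use_bqstorage_api` option proposed in this thread:
# omit it (or set it to false) to avoid Storage API read charges, set it
# to true for faster downloads of large result sets.
rest_engine = create_engine("bigquery://my-project")
storage_engine = create_engine("bigquery://my-project?use_bqstorage_api=true")
```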
Any plan on supporting the BigQuery Storage API?

Edit: sub-steps (see the sketch after this list):

- Update `create_connect_args` to (optionally) return a BQ Storage API client alongside the BQ client.
- Add a `use_bqstorage_api` option to the connection URL parsing logic.
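For reference, the DB-API layer these sub-steps feed into: `google.cloud.bigquery.dbapi.connect` accepts both clients (this parameter exists in current google-cloud-bigquery; at the time of this thread it was the prerequisite being added), so the dialect only has to build the keyword arguments:

```python
from google.cloud import bigquery, bigquery_storage
from google.cloud.bigquery import dbapi

# Both clients pick up application default credentials here.
conn = dbapi.connect(
    client=bigquery.Client(),
    bqstorage_client=bigquery_storage.BigQueryReadClient(),
)
cursor = conn.cursor()
cursor.execute("SELECT 1")
print(cursor.fetchall())
```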