ISSUE-41: Support storage api #61
Conversation
Force-pushed e947734 to 2149ba4
dev_requirements.txt (outdated)

@@ -1,5 +1,6 @@
 sqlalchemy>=1.1.9
-google-cloud-bigquery>=1.6.0
+google-cloud-bigquery[bqstorage]>=1.25.0
+google-cloud-bigquery-storage[fastavro]>=1.0.0
I am not sure what should be done concerning this. fastavro was needed for the most basic functionality for me, and I think pyarrow may also be needed for other basic scenarios.
Should we include both as extras?
This is expected. googleapis/python-bigquery#55 updates the DB API implementation to use the Arrow wire format, but I believe it hasn't been released yet.
Should we include both as extras?
Some thoughts:
- The BigQuery Storage API is now GA, I'm more comfortable making a GA library a required dependency than when it was in Beta.
- When we bump the minimum version of anything, it will generate some complaints.
- The BigQuery Storage API, Arrow, and Avro all are a bit heavier of dependencies (contain compiled code). If we can avoid making these dependencies required, it will prevent some trouble.
How about we try to make these optional dependencies (extras) for now and revisit once there have been a few releases with Arrow support in the BQ DB-API?
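Making the storage dependencies optional could look roughly like this in setup.py. This is only a sketch: the extra name "bqstorage" and the version pins are assumptions, not decisions made in this thread.

```python
# Sketch of optional extras for setup.py; the extra name "bqstorage" and
# the version pins here are illustrative assumptions.
extras_require = {
    # Pull in the BQ Storage client plus fastavro/pyarrow only on demand.
    "bqstorage": [
        "google-cloud-bigquery-storage>=1.0.0",
        "fastavro",
        "pyarrow",
    ],
}

# This dict would then be passed to setup() alongside install_requires:
# setup(
#     ...,
#     install_requires=["sqlalchemy>=1.1.9", "google-cloud-bigquery>=1.12.0"],
#     extras_require=extras_require,
# )
```

Users would then opt in with something like `pip install pybigquery[bqstorage]`, while a plain install stays free of the compiled Arrow/Avro dependencies.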
setup.py (outdated)

@@ -25,7 +25,8 @@ def readme():
     ],
     install_requires=[
         'sqlalchemy>=1.1.9',
-        'google-cloud-bigquery>=1.6.0',
+        'google-cloud-bigquery[bqstorage]>=1.25.0',
+        'google-cloud-bigquery-storage[fastavro]>=1.0.0',
See dev_requirements.txt comment
Thanks! A few comments. I think we need to make the BigQuery Storage API an optional dependency (at least for now).
pybigquery/sqlalchemy_bigquery.py (outdated)

@@ -4,7 +4,7 @@
 from __future__ import unicode_literals

 from google import auth
-from google.cloud import bigquery
+from google.cloud import bigquery, bigquery_storage_v1beta1
I believe the bigquery_storage_v1 library now works with the DB-API. Let's use that, since it points to the GA endpoint.
Also, if we make the BQ Storage library an "extra", then let's catch any import errors to make this an optional dependency.
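Catching the import error as suggested typically looks like the pattern below. A minimal sketch, assuming the module names from this thread; the HAVE_BQSTORAGE flag and make_storage_client helper are hypothetical names, not code from this PR.

```python
# Guard the optional import so the package still loads when the
# storage extra isn't installed.
try:
    from google.cloud import bigquery_storage_v1
    HAVE_BQSTORAGE = True
except ImportError:
    bigquery_storage_v1 = None
    HAVE_BQSTORAGE = False


def make_storage_client(credentials=None):
    """Return a BigQueryReadClient, or None when the extra is missing."""
    if not HAVE_BQSTORAGE:
        return None
    return bigquery_storage_v1.BigQueryReadClient(credentials=credentials)
```

Callers can then fall back to the plain REST path whenever the helper returns None.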
@tswast It's a bit of a funny situation: we can't use the storage API with the current dependencies. I will try to make it work without changing them, but it feels to me like we should move to 1.26 if there aren't many breaking changes between 1.12 and 1.26 (the current minimum is 1.6, but it seems we already break with anything below 1.12).
Force-pushed e6cbc1f to 47d0310
@@ -1,6 +1,7 @@
 sqlalchemy>=1.1.9
-google-cloud-bigquery>=1.6.0
+google-cloud-bigquery>=1.12.0
The module doesn't work with anything below that (also on master).
@alonme I didn't realize that the DB-API is using the BigQuery Storage API by default now. That's actually really good news. I wonder if that means we can drop this PR altogether? Or do you think it'd be useful to be able to explicitly enable/disable this API usage?
@tswast It's used by default as of version 1.26 (which we don't require), and only if you install the extra dependencies. The storage API has different pricing than regular queries, so making it opt-in kind of makes sense to me; still, they are using it by default in the DB-API. We will probably want to add an option to install the extras through this package. This PR also contains some refactoring and a fix for a requirements issue, so if we decide not to use it, I'll open another PR for those changes.
It's actually free to download results from query result tables, and it can be used with the BigQuery Sandbox, so it shouldn't make a difference regarding pricing (see https://cloud.google.com/bigquery/pricing#storage-api). It is an extra API to enable and extra packages to install, though, so there will certainly be some users who aren't able to use it. 🤔
So what do we want to do with this?
Looking at the DB-API docs, it looks like it's not actually possible to explicitly disable the BQ Storage API with the DB-API. That makes setting an option to False a bit misleading. If we do want to proceed with this PR, we should file an issue on https://github.com/googleapis/python-bigquery to add a way to disable using the BQ Storage API.
Probably worth waiting to see whether folks have trouble with the current default behavior before filing that, though.
if parse_version(bigquery.__version__) >= Version("1.26.0"):
    try:
        storage_client = bigquery_storage_v1.BigQueryReadClient(credentials=credentials)
        clients.append(storage_client)
Since bqstorage_client is an optional argument to connect(), we should probably be using the dictionary for this rather than appending to the list of positional arguments. (In fact, maybe we should do the same for the BigQuery client object, too.)
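The keyword-over-positional suggestion could be sketched like this. The bqstorage_client argument name comes from the BigQuery DB-API as discussed above; the build_connect_kwargs helper itself is hypothetical.

```python
# Hypothetical helper: collect connect() arguments in a dict so optional
# clients are passed by keyword instead of by position.
def build_connect_kwargs(client, storage_client=None):
    kwargs = {"client": client}
    if storage_client is not None:
        # Forward the storage client only when it could actually be created.
        kwargs["bqstorage_client"] = storage_client
    return kwargs

# Usage (sketch): connection = dbapi.connect(**build_connect_kwargs(client, storage_client))
```

This keeps the call site correct whether or not the optional storage client exists, which a positional list does not.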
Closing, since there doesn't seem to be demand for turning off the BigQuery Storage API support, and that API is now used by default. Thank you for your investigation and work on this @alonme!
closes #41