
ISSUE-41: Support storage api #61

Closed · wants to merge 4 commits

Conversation

@alonme (Contributor) commented Jun 26, 2020

closes #41

```diff
@@ -1,5 +1,6 @@
 sqlalchemy>=1.1.9
-google-cloud-bigquery>=1.6.0
+google-cloud-bigquery[bqstorage]>=1.25.0
+google-cloud-bigquery-storage[fastavro]>=1.0.0
```
alonme (Contributor, PR author) commented:

I am not sure what should be done about this.
fastavro was needed for the most basic functionality for me, and I think pyarrow may also be needed for other basic scenarios.
Should we include both as extras?

tswast (Collaborator) commented:

This is expected. googleapis/python-bigquery#55 updates the DB API implementation to use the Arrow wire format, but I believe it hasn't been released yet.

> Should we include both as extras?

Some thoughts:

  • The BigQuery Storage API is now GA; I'm more comfortable making a GA library a required dependency than when it was in Beta.
  • When we bump the minimum version of anything, it will generate some complaints.
  • The BigQuery Storage API, Arrow, and Avro are all heavier dependencies (they contain compiled code). If we can avoid making them required, it will prevent some trouble.

How about we try to make these optional dependencies (extras) for now and revisit once there have been a few releases with Arrow support in the BQ DB-API?
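The "extras" approach suggested here would be expressed through setuptools. A minimal sketch (the extra's name `bqstorage` and the version pins are assumptions for illustration, not the PR's actual setup.py):

```python
# Sketch of the optional-dependency ("extras") layout discussed above.
# These dicts would be passed to setuptools.setup(install_requires=...,
# extras_require=...). Names and pins are illustrative assumptions.
INSTALL_REQUIRES = [
    "sqlalchemy>=1.1.9",
    "google-cloud-bigquery>=1.6.0",
]

EXTRAS_REQUIRE = {
    # Installed only when the user opts in, e.g.
    # `pip install pybigquery[bqstorage]`
    "bqstorage": [
        "google-cloud-bigquery-storage[fastavro]>=1.0.0",
    ],
}
```

With this layout, users who never touch the Storage API avoid downloading the compiled Arrow/Avro dependencies entirely.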

setup.py (Outdated)

```diff
@@ -25,7 +25,8 @@ def readme():
     ],
     install_requires=[
         'sqlalchemy>=1.1.9',
-        'google-cloud-bigquery>=1.6.0',
+        'google-cloud-bigquery[bqstorage]>=1.25.0',
+        'google-cloud-bigquery-storage[fastavro]>=1.0.0',
```
alonme (Contributor, PR author) commented:

See dev_requirements.txt comment

@tswast (Collaborator) left a comment:

Thanks! A few comments. I think we need to make the BigQuery Storage API an optional dependency (at least for now).


```diff
@@ -4,7 +4,7 @@
 from __future__ import unicode_literals

 from google import auth
-from google.cloud import bigquery
+from google.cloud import bigquery, bigquery_storage_v1beta1
```
tswast (Collaborator) commented:

I believe the bigquery_storage_v1 library works with the DB-API now. Let's use that, since it points to the GA endpoint.

Also, if we make the BQ Storage library an "extra", then let's catch any import errors to make this an optional dependency.
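The optional-import pattern suggested here might look like the following sketch (the helper name and module layout are assumptions, not the PR's actual code):

```python
# Sketch of treating google-cloud-bigquery-storage as an optional "extra":
# if the package isn't installed, the import fails and we fall back to None,
# so the dialect can continue using the plain REST API.
try:
    from google.cloud import bigquery_storage_v1
except ImportError:
    bigquery_storage_v1 = None


def make_bqstorage_client(credentials):
    """Return a BigQueryReadClient, or None when the extra isn't installed."""
    if bigquery_storage_v1 is None:
        return None
    return bigquery_storage_v1.BigQueryReadClient(credentials=credentials)
```

Callers then pass the result (possibly None) to `dbapi.connect()`, which already treats a missing Storage client as "use the REST API".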

@alonme commented Aug 1, 2020

@tswast it's a bit of a funny situation: we can't use the Storage API with google-cloud-bigquery < 1.25.0, and we can't use the V1 client with anything less than 1.26.
However, if the user does have 1.26 and the bqstorage extras, then the use_bqstorage_api flag is irrelevant, as the Storage API is used by default in 1.26.

I will try to make it work without changing the current dependencies, but it feels to me like we should move to 1.26, if there are not many breaking changes between 1.12 and 1.26 (the current minimum is 1.6, but it seems we already break with anything below 1.12).

```diff
@@ -1,6 +1,7 @@
 sqlalchemy>=1.1.9
-google-cloud-bigquery>=1.6.0
+google-cloud-bigquery>=1.12.0
```
alonme (Contributor, PR author) commented:

The module doesn't work with anything below that (also on master).

@alonme alonme requested a review from tswast August 6, 2020 19:15
@tswast commented Aug 11, 2020

> it's a bit of a funny situation, we can't work with a storage api with google-cloud-bigquery < 1.25.0, and we can't use the V1 with less than 1.26. However, if the user does have 1.26, and the bqstorage extras, then the use_bqstorage_api flag is irrelevant as it is used by default in 1.26.

@alonme I didn't realize that the DB-API is using the BigQuery Storage API by default now. That's actually really good news. I wonder if that means we can drop this PR altogether? Or do you think it'd be useful to be able to explicitly enable/disable this API usage?

@alonme commented Aug 12, 2020

@tswast It's used by default in version 1.26 (which we don't require), and only if you install the extra dependencies.

The Storage API has different pricing than regular queries, so that makes some sense to me; still, they are using it by default in the DB-API...

We will probably want to add an option to install the extras through this package.
Another question is whether we should surface the usage of the Storage API to the user in any way, e.g. indicate that the dependencies needed to use it are missing?

This PR also contains some refactoring and a fix for a requirements issue, so if we decide not to use it, I'll open another PR for those changes.

@tswast commented Aug 12, 2020

> The storage api has different pricing than the regular queries, so it kind of makes sense to me, however they are using it as default in the DB-API...

It's actually free to download results from query results tables and can be used with BigQuery Sandbox, so it shouldn't make a difference regarding pricing.

> The number of bytes read includes data used for filtering but not returned to you as output from ReadRows. You are not charged for data read from temporary tables.

(from https://cloud.google.com/bigquery/pricing#storage-api)

It is an extra API to enable and extra packages to install, so there will certainly be some users who aren't able to use it. 🤔

@alonme commented Aug 27, 2020

So what do we want to do with this?

@tswast (Collaborator) left a comment:

Looking at the DB-API docs, it looks like it's not actually possible to explicitly disable the BQ Storage API with the DB-API. That makes setting an option to False a bit misleading. If we do want to proceed with this PR, we should file an issue on https://github.com/googleapis/python-bigquery to have a way to disable using the BQ Storage API.

Probably worth waiting to see if folks have trouble with the current default behavior before filing that, though.

```python
if parse_version(bigquery.__version__) >= Version("1.26.0"):
    try:
        storage_client = bigquery_storage_v1.BigQueryReadClient(credentials=credentials)
        clients.append(storage_client)
```
tswast (Collaborator) commented:

Since bqstorage_client is an optional argument to connect(), we should probably be using the dictionary for this rather than appending to the list of positional arguments. (In fact, maybe we should do the same for the BigQuery client object, too)
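The keyword-arguments approach suggested here could be sketched as follows (the helper name `build_connect_args` is hypothetical; `client` and `bqstorage_client` are real keyword parameters of `google.cloud.bigquery.dbapi.connect`):

```python
# Sketch of collecting dbapi.connect() arguments in a keyword dict instead
# of appending to a list of positional arguments. The helper name is an
# illustrative assumption, not code from the PR.
def build_connect_args(client, storage_client=None):
    """Assemble keyword arguments for google.cloud.bigquery.dbapi.connect()."""
    connect_kwargs = {"client": client}
    if storage_client is not None:
        # Only included when the optional Storage API client was created.
        connect_kwargs["bqstorage_client"] = storage_client
    return connect_kwargs
    # Used as: connection = dbapi.connect(**connect_kwargs)
```

Because optional arguments are passed by name, the call keeps working even if new parameters are added to `connect()` later, which is the advantage over positional appending.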

@tswast commented Jan 7, 2021

Closing, since there doesn't seem to be demand for turning off the BigQuery Storage API support and that API is now used by default.

Thank you for your investigation and work on this @alonme!

Successfully merging this pull request may close these issues: Support of Storage API (#41)