doc: update wording in RowIterator docstrings to reduce confusion (#127)
Goal: Talking about partition filtering can be problematic, so refer
to Storage API capabilities more obliquely.  In particular, partition
filtering is possible via a query or a direct Storage API read, so
avoid overly severe language when describing helper mechanisms such
as to_dataframe() on row iterators.

Fixes: googleapis/python-bigquery-storage#22

Co-authored-by: Peter Lamut <plamut@users.noreply.github.com>
shollyman and plamut committed Jun 11, 2020
1 parent e75ff82 commit 445ae08
Showing 1 changed file with 9 additions and 6 deletions.
15 changes: 9 additions & 6 deletions google/cloud/bigquery/table.py
@@ -1515,8 +1515,9 @@ def to_arrow(
      This method requires the ``pyarrow`` and
      ``google-cloud-bigquery-storage`` libraries.
-     Reading from a specific partition or snapshot is not
-     currently supported by this method.
+     This method only exposes a subset of the capabilities of the
+     BigQuery Storage API. For full access to all features
+     (projections, filters, snapshots) use the Storage API directly.
  create_bqstorage_client (bool):
      Optional. If ``True`` (default), create a BigQuery Storage API
      client using the default API settings. The BigQuery Storage API
@@ -1598,8 +1599,9 @@ def to_dataframe_iterable(self, bqstorage_client=None, dtypes=None):
      This method requires the ``pyarrow`` and
      ``google-cloud-bigquery-storage`` libraries.
-     Reading from a specific partition or snapshot is not
-     currently supported by this method.
+     This method only exposes a subset of the capabilities of the
+     BigQuery Storage API. For full access to all features
+     (projections, filters, snapshots) use the Storage API directly.
  **Caution**: There is a known issue reading small anonymous
  query result tables with the BQ Storage API. When a problem
@@ -1666,8 +1668,9 @@ def to_dataframe(
      This method requires the ``pyarrow`` and
      ``google-cloud-bigquery-storage`` libraries.
-     Reading from a specific partition or snapshot is not
-     currently supported by this method.
+     This method only exposes a subset of the capabilities of the
+     BigQuery Storage API. For full access to all features
+     (projections, filters, snapshots) use the Storage API directly.
  **Caution**: There is a known issue reading small anonymous
  query result tables with the BQ Storage API. When a problem
