fix: RowIterator.to_arrow() error when BQ Storage client cannot be created (#181)

* fix: to_arrow() when can't create BQ Storage client

* Clarify using BQ Storage client by default
plamut committed Jul 23, 2020
1 parent 1ec41ef commit 7afa3d7
Showing 3 changed files with 29 additions and 2 deletions.
2 changes: 1 addition & 1 deletion CHANGELOG.md
@@ -9,7 +9,7 @@

### Features

-* use BigQuery Storage client by default ([#55](https://www.github.com/googleapis/python-bigquery/issues/55)) ([e75ff82](https://www.github.com/googleapis/python-bigquery/commit/e75ff8297c65981545b097f75a17cf9e78ac6772)), closes [#91](https://www.github.com/googleapis/python-bigquery/issues/91)
+* use BigQuery Storage client by default (if dependencies available) ([#55](https://www.github.com/googleapis/python-bigquery/issues/55)) ([e75ff82](https://www.github.com/googleapis/python-bigquery/commit/e75ff8297c65981545b097f75a17cf9e78ac6772)), closes [#91](https://www.github.com/googleapis/python-bigquery/issues/91)
* **bigquery:** add __eq__ method for class PartitionRange and RangePartitioning ([#162](https://www.github.com/googleapis/python-bigquery/issues/162)) ([0d2a88d](https://www.github.com/googleapis/python-bigquery/commit/0d2a88d8072154cfc9152afd6d26a60ddcdfbc73))
* **bigquery:** expose date_as_object parameter to users ([#150](https://www.github.com/googleapis/python-bigquery/issues/150)) ([a2d5ce9](https://www.github.com/googleapis/python-bigquery/commit/a2d5ce9e97992318d7dc85c51c053cab74e25a11))
* **bigquery:** expose date_as_object parameter to users ([#150](https://www.github.com/googleapis/python-bigquery/issues/150)) ([cbd831e](https://www.github.com/googleapis/python-bigquery/commit/cbd831e08024a67148723afd49e1db085e0a862c))
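The "(if dependencies available)" wording refers to the optional google-cloud-bigquery-storage package and the `create_bqstorage_client` argument. A minimal sketch of how a caller controls that default, assuming valid credentials and an installed pyarrow; the project, dataset, and table names are hypothetical:

```python
from google.cloud import bigquery

client = bigquery.Client()
query = "SELECT name, age FROM `my-project.my_dataset.people`"  # hypothetical table

# Default: try to download results via the BigQuery Storage API, but only if
# the optional google-cloud-bigquery-storage package can be imported.
arrow_table = client.query(query).result().to_arrow()

# Opt out and download via the REST API (tabledata.list) only.
arrow_table = client.query(query).result().to_arrow(create_bqstorage_client=False)
```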
2 changes: 1 addition & 1 deletion google/cloud/bigquery/table.py
@@ -1534,8 +1534,8 @@ def to_arrow(

         owns_bqstorage_client = False
         if not bqstorage_client and create_bqstorage_client:
-            owns_bqstorage_client = True
             bqstorage_client = self.client._create_bqstorage_client()
+            owns_bqstorage_client = bqstorage_client is not None

         try:
             progress_bar = self._get_progress_bar(progress_bar_type)
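For reference, the changed block inside RowIterator.to_arrow() then reads roughly as below (a sketch: the rest of the method, including the finally clause that closes an owned client's transport, is elided, and the comments are added here for explanation):

```python
owns_bqstorage_client = False
if not bqstorage_client and create_bqstorage_client:
    bqstorage_client = self.client._create_bqstorage_client()
    # _create_bqstorage_client() returns None when the optional
    # google-cloud-bigquery-storage dependency cannot be imported, so only
    # claim ownership (and later close the transport) when a client was
    # actually created.
    owns_bqstorage_client = bqstorage_client is not None
```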
27 changes: 27 additions & 0 deletions tests/unit/test_table.py
@@ -1990,6 +1990,33 @@ def test_to_arrow_w_bqstorage_creates_client(self):
         mock_client._create_bqstorage_client.assert_called_once()
         bqstorage_client.transport.channel.close.assert_called_once()

+    @unittest.skipIf(pyarrow is None, "Requires `pyarrow`")
+    def test_to_arrow_create_bqstorage_client_wo_bqstorage(self):
+        from google.cloud.bigquery.schema import SchemaField
+
+        schema = [
+            SchemaField("name", "STRING", mode="REQUIRED"),
+            SchemaField("age", "INTEGER", mode="REQUIRED"),
+        ]
+        rows = [
+            {"f": [{"v": "Alice"}, {"v": "98"}]},
+            {"f": [{"v": "Bob"}, {"v": "99"}]},
+        ]
+        path = "/foo"
+        api_request = mock.Mock(return_value={"rows": rows})
+
+        mock_client = _mock_client()
+        mock_client._create_bqstorage_client.return_value = None
+        row_iterator = self._make_one(mock_client, api_request, path, schema)
+
+        tbl = row_iterator.to_arrow(create_bqstorage_client=True)
+
+        # The client attempted to create a BQ Storage client, and even though
+        # that was not possible, results were still returned without errors.
+        mock_client._create_bqstorage_client.assert_called_once()
+        self.assertIsInstance(tbl, pyarrow.Table)
+        self.assertEqual(tbl.num_rows, 2)
+
     @unittest.skipIf(pyarrow is None, "Requires `pyarrow`")
     @unittest.skipIf(
         bigquery_storage_v1 is None, "Requires `google-cloud-bigquery-storage`"
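The new test mocks `_create_bqstorage_client()` returning None, which is the path an end user without google-cloud-bigquery-storage hits. A hedged usage sketch of that scenario, assuming pyarrow is installed, the storage dependency is not, and a hypothetical table name:

```python
from google.cloud import bigquery

client = bigquery.Client()
rows = client.list_rows("my-project.my_dataset.people")  # hypothetical table

# create_bqstorage_client=True (also the default) asks for a BQ Storage client;
# when the optional dependency cannot be imported, the library now falls back
# to the REST API (possibly logging a warning) instead of raising.
arrow_table = rows.to_arrow(create_bqstorage_client=True)
print(arrow_table.num_rows)
```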
