chore: Prepare for 2.0 release (#278)
* Remove BQ Storage v1beta1 compatibility code

* Adjust code to new BQ Storage 2.0 (see the sketch after this list)

* Remove Python 2/3 compatibility code

* Bump test coverage to 100%

* Update supported Python versions in README

* Add UPGRADING guide.

* Regenerate bigquery_v2 code with microgenerator

* Adjust hand-written unit tests to regened BQ v2

* Adjust samples to BQ v2 regenerated code

* Adjust system tests to regenerated BQ v2

* Skip failing generated unit test

The assertion seems to fail for a trivial reason: an extra newline
in the string representation.

* Delete Kokoro config for Python 2.7

* Fix docs build

* Undelete failing test, but mark as skipped

* Fix namespace name in docstrings and comments

* Define minimum dependency versions for Python 3.6

* Exclude autogenerated docs from docs index

* Exclude generated services from the library

There are currently no public API endpoints for the ModelServiceClient,
thus there is no point in generating that code in the first place.

* Bump minimum proto-plus version to 1.10.0

The old pin (1.4.0) does not work; tests detected problems with it.

* Include generated types in the docs and rebuild

* Ignore skipped test in coverage check

* Explain moved enums in UPGRADING guide
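
For reference, the adjusted read-session flow under BQ Storage 2.0 looks roughly like the following sketch (project and table names are hypothetical, and `google-cloud-bigquery-storage >= 2.0.0` is assumed):

```py
from google.cloud import bigquery_storage

# Hypothetical resource names, for illustration only.
parent = "projects/my-project"
table_path = "projects/my-project/datasets/my_dataset/tables/my_table"

client = bigquery_storage.BigQueryReadClient()

# BQ Storage 2.0 accepts a ReadSession message plus keyword arguments,
# replacing the positional v1beta1 calling convention.
requested_session = bigquery_storage.types.ReadSession(
    table=table_path,
    data_format=bigquery_storage.types.DataFormat.ARROW,
)
session = client.create_read_session(
    parent=parent,
    read_session=requested_session,
    max_stream_count=1,
)

# Streams are now read by name rather than by StreamPosition.
reader = client.read_rows(session.streams[0].name)
```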
plamut committed Sep 30, 2020
1 parent fbbe0cb commit cbcb4b8
Showing 69 changed files with 1,974 additions and 1,682 deletions.
8 changes: 1 addition & 7 deletions .kokoro/presubmit/presubmit.cfg
@@ -1,7 +1 @@
# Format: //devtools/kokoro/config/proto/build.proto

# Disable system tests.
env_vars: {
key: "RUN_SYSTEM_TESTS"
value: "false"
}
# Format: //devtools/kokoro/config/proto/build.proto
7 changes: 0 additions & 7 deletions .kokoro/presubmit/system-2.7.cfg

This file was deleted.

6 changes: 6 additions & 0 deletions .kokoro/samples/python3.6/common.cfg
@@ -13,6 +13,12 @@ env_vars: {
value: "py-3.6"
}

# Declare build specific Cloud project.
env_vars: {
key: "BUILD_SPECIFIC_GCLOUD_PROJECT"
value: "python-docs-samples-tests-py36"
}

env_vars: {
key: "TRAMPOLINE_BUILD_FILE"
value: "github/python-bigquery/.kokoro/test-samples.sh"
6 changes: 6 additions & 0 deletions .kokoro/samples/python3.7/common.cfg
@@ -13,6 +13,12 @@ env_vars: {
value: "py-3.7"
}

# Declare build specific Cloud project.
env_vars: {
key: "BUILD_SPECIFIC_GCLOUD_PROJECT"
value: "python-docs-samples-tests-py37"
}

env_vars: {
key: "TRAMPOLINE_BUILD_FILE"
value: "github/python-bigquery/.kokoro/test-samples.sh"
6 changes: 6 additions & 0 deletions .kokoro/samples/python3.8/common.cfg
@@ -13,6 +13,12 @@ env_vars: {
value: "py-3.8"
}

# Declare build specific Cloud project.
env_vars: {
key: "BUILD_SPECIFIC_GCLOUD_PROJECT"
value: "python-docs-samples-tests-py38"
}

env_vars: {
key: "TRAMPOLINE_BUILD_FILE"
value: "github/python-bigquery/.kokoro/test-samples.sh"
19 changes: 0 additions & 19 deletions CONTRIBUTING.rst
@@ -80,25 +80,6 @@ We use `nox <https://nox.readthedocs.io/en/latest/>`__ to instrument our tests.

.. nox: https://pypi.org/project/nox/
Note on Editable Installs / Develop Mode
========================================

- As mentioned previously, using ``setuptools`` in `develop mode`_
or a ``pip`` `editable install`_ is not possible with this
library. This is because this library uses `namespace packages`_.
For context see `Issue #2316`_ and the relevant `PyPA issue`_.

Since ``editable`` / ``develop`` mode can't be used, packages
need to be installed directly. Hence your changes to the source
tree don't get incorporated into the **already installed**
package.

.. _namespace packages: https://www.python.org/dev/peps/pep-0420/
.. _Issue #2316: https://github.com/GoogleCloudPlatform/google-cloud-python/issues/2316
.. _PyPA issue: https://github.com/pypa/packaging-problems/issues/12
.. _develop mode: https://setuptools.readthedocs.io/en/latest/setuptools.html#development-mode
.. _editable install: https://pip.pypa.io/en/stable/reference/pip_install/#editable-installs

*****************************************
I'm getting weird errors... Can you help?
*****************************************
11 changes: 7 additions & 4 deletions README.rst
@@ -52,11 +52,14 @@ dependencies.

Supported Python Versions
^^^^^^^^^^^^^^^^^^^^^^^^^
Python >= 3.5
Python >= 3.6

Deprecated Python Versions
^^^^^^^^^^^^^^^^^^^^^^^^^^
Python == 2.7. Python 2.7 support will be removed on January 1, 2020.
Unsupported Python Versions
^^^^^^^^^^^^^^^^^^^^^^^^^^^
Python == 2.7, Python == 3.5.

The last version of this library compatible with Python 2.7 and 3.5 is
`google-cloud-bigquery==1.28.0`.


Mac/Linux
59 changes: 59 additions & 0 deletions UPGRADING.md
@@ -0,0 +1,59 @@
<!--
Copyright 2020 Google LLC
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
https://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->


# 2.0.0 Migration Guide

The 2.0 release of the `google-cloud-bigquery` client drops support for Python
versions below 3.6. The client surface itself has not changed, but the 1.x series
will not be receiving any more feature updates or bug fixes. You are thus
encouraged to upgrade to the 2.x series.

If you experience issues or have questions, please file an
[issue](https://github.com/googleapis/python-bigquery/issues).


## Supported Python Versions

> **WARNING**: Breaking change
The 2.0.0 release requires Python 3.6+.


## Supported BigQuery Storage Clients

The 2.0.0 release requires BigQuery Storage `>= 2.0.0`, which dropped support
for `v1beta1` and `v1beta2` versions of the BigQuery Storage API. If you want to
use a BigQuery Storage client, it must be the one supporting the `v1` API version.
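
For example, a BigQuery Storage client can still be passed to `to_dataframe`; a minimal sketch (assuming `google-cloud-bigquery-storage >= 2.0.0` and `pandas` are installed):

```py
from google.cloud import bigquery
from google.cloud import bigquery_storage

bq_client = bigquery.Client()

# The 2.0 client lives in the top-level google.cloud.bigquery_storage
# package and speaks the v1 API version.
bqstorage_client = bigquery_storage.BigQueryReadClient()

query_job = bq_client.query("SELECT 1 AS x")
df = query_job.to_dataframe(bqstorage_client=bqstorage_client)
```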


## Changed GAPIC Enums Path

> **WARNING**: Breaking change
Generated GAPIC enum types have been moved under `types`. Import paths need to be
adjusted.

**Before:**
```py
from google.cloud.bigquery_v2.gapic import enums

distance_type = enums.Model.DistanceType.COSINE
```

**After:**
```py
from google.cloud.bigquery_v2 import types

distance_type = types.Model.DistanceType.COSINE
```
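
The enum values themselves are unchanged; only the import path moved. Since the regenerated code no longer ships a `gapic` subpackage, the old import is expected to fail with an `ImportError`.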
1 change: 1 addition & 0 deletions docs/UPGRADING.md
6 changes: 6 additions & 0 deletions docs/bigquery_v2/services.rst
@@ -0,0 +1,6 @@
Services for Google Cloud Bigquery v2 API
=========================================

.. automodule:: google.cloud.bigquery_v2.services.model_service
:members:
:inherited-members:
5 changes: 5 additions & 0 deletions docs/bigquery_v2/types.rst
@@ -0,0 +1,5 @@
Types for Google Cloud Bigquery v2 API
======================================

.. automodule:: google.cloud.bigquery_v2.types
:members:
1 change: 1 addition & 0 deletions docs/conf.py
@@ -100,6 +100,7 @@
"samples/AUTHORING_GUIDE.md",
"samples/CONTRIBUTING.md",
"samples/snippets/README.rst",
"bigquery_v2/services.rst", # generated by the code generator
]

# The reST default role (used for this markup: `text`) to use for all
8 changes: 0 additions & 8 deletions docs/gapic/v2/enums.rst

This file was deleted.

6 changes: 0 additions & 6 deletions docs/gapic/v2/types.rst

This file was deleted.

10 changes: 10 additions & 0 deletions docs/index.rst
@@ -27,6 +27,16 @@ API Reference
reference
dbapi

Migration Guide
---------------

See the guide below for instructions on migrating to the 2.x release of this library.

.. toctree::
:maxdepth: 2

UPGRADING

Changelog
---------

4 changes: 2 additions & 2 deletions docs/reference.rst
@@ -182,6 +182,7 @@ Encryption Configuration

encryption_configuration.EncryptionConfiguration


Additional Types
================

@@ -190,5 +191,4 @@ Protocol buffer classes for working with the Models API.
.. toctree::
:maxdepth: 2

gapic/v2/enums
gapic/v2/types
bigquery_v2/types
77 changes: 14 additions & 63 deletions google/cloud/bigquery/_pandas_helpers.py
@@ -22,11 +22,6 @@
import six
from six.moves import queue

try:
from google.cloud import bigquery_storage_v1
except ImportError: # pragma: NO COVER
bigquery_storage_v1 = None

try:
import pandas
except ImportError: # pragma: NO COVER
@@ -287,14 +282,6 @@ def dataframe_to_bq_schema(dataframe, bq_schema):
"""
if bq_schema:
bq_schema = schema._to_schema_fields(bq_schema)
if six.PY2:
for field in bq_schema:
if field.field_type in schema._STRUCT_TYPES:
raise ValueError(
"Uploading dataframes with struct (record) column types "
"is not supported under Python2. See: "
"https://github.com/googleapis/python-bigquery/issues/21"
)
bq_schema_index = {field.name: field for field in bq_schema}
bq_schema_unused = set(bq_schema_index.keys())
else:
@@ -578,19 +565,7 @@ def _bqstorage_page_to_dataframe(column_names, dtypes, page):
def _download_table_bqstorage_stream(
download_state, bqstorage_client, session, stream, worker_queue, page_to_item
):
# Passing a BQ Storage client in implies that the BigQuery Storage library
# is available and can be imported.
from google.cloud import bigquery_storage_v1beta1

# We want to preserve comaptibility with the v1beta1 BQ Storage clients,
# thus adjust constructing the rowstream if needed.
# The assumption is that the caller provides a BQ Storage `session` that is
# compatible with the version of the BQ Storage client passed in.
if isinstance(bqstorage_client, bigquery_storage_v1beta1.BigQueryStorageClient):
position = bigquery_storage_v1beta1.types.StreamPosition(stream=stream)
rowstream = bqstorage_client.read_rows(position).rows(session)
else:
rowstream = bqstorage_client.read_rows(stream.name).rows(session)
rowstream = bqstorage_client.read_rows(stream.name).rows(session)

for page in rowstream.pages:
if download_state.done:
@@ -625,8 +600,7 @@ def _download_table_bqstorage(

# Passing a BQ Storage client in implies that the BigQuery Storage library
# is available and can be imported.
from google.cloud import bigquery_storage_v1
from google.cloud import bigquery_storage_v1beta1
from google.cloud import bigquery_storage

if "$" in table.table_id:
raise ValueError(
@@ -637,41 +611,18 @@

requested_streams = 1 if preserve_order else 0

# We want to preserve comaptibility with the v1beta1 BQ Storage clients,
# thus adjust the session creation if needed.
if isinstance(bqstorage_client, bigquery_storage_v1beta1.BigQueryStorageClient):
warnings.warn(
"Support for BigQuery Storage v1beta1 clients is deprecated, please "
"consider upgrading the client to BigQuery Storage v1 stable version.",
category=DeprecationWarning,
)
read_options = bigquery_storage_v1beta1.types.TableReadOptions()

if selected_fields is not None:
for field in selected_fields:
read_options.selected_fields.append(field.name)

session = bqstorage_client.create_read_session(
table.to_bqstorage(v1beta1=True),
"projects/{}".format(project_id),
format_=bigquery_storage_v1beta1.enums.DataFormat.ARROW,
read_options=read_options,
requested_streams=requested_streams,
)
else:
requested_session = bigquery_storage_v1.types.ReadSession(
table=table.to_bqstorage(),
data_format=bigquery_storage_v1.enums.DataFormat.ARROW,
)
if selected_fields is not None:
for field in selected_fields:
requested_session.read_options.selected_fields.append(field.name)

session = bqstorage_client.create_read_session(
parent="projects/{}".format(project_id),
read_session=requested_session,
max_stream_count=requested_streams,
)
requested_session = bigquery_storage.types.ReadSession(
table=table.to_bqstorage(), data_format=bigquery_storage.types.DataFormat.ARROW
)
if selected_fields is not None:
for field in selected_fields:
requested_session.read_options.selected_fields.append(field.name)

session = bqstorage_client.create_read_session(
parent="projects/{}".format(project_id),
read_session=requested_session,
max_stream_count=requested_streams,
)

_LOGGER.debug(
"Started reading table '{}.{}.{}' with BQ Storage API session '{}'.".format(
12 changes: 4 additions & 8 deletions google/cloud/bigquery/client.py
@@ -17,11 +17,7 @@
from __future__ import absolute_import
from __future__ import division

try:
from collections import abc as collections_abc
except ImportError: # Python 2.7
import collections as collections_abc

from collections import abc as collections_abc
import copy
import functools
import gzip
@@ -435,19 +431,19 @@ def _create_bqstorage_client(self):
warning and return ``None``.
Returns:
Optional[google.cloud.bigquery_storage_v1.BigQueryReadClient]:
Optional[google.cloud.bigquery_storage.BigQueryReadClient]:
A BigQuery Storage API client.
"""
try:
from google.cloud import bigquery_storage_v1
from google.cloud import bigquery_storage
except ImportError:
warnings.warn(
"Cannot create BigQuery Storage client, the dependency "
"google-cloud-bigquery-storage is not installed."
)
return None

return bigquery_storage_v1.BigQueryReadClient(credentials=self._credentials)
return bigquery_storage.BigQueryReadClient(credentials=self._credentials)

def create_dataset(
self, dataset, exists_ok=False, retry=DEFAULT_RETRY, timeout=None
5 changes: 1 addition & 4 deletions google/cloud/bigquery/dbapi/_helpers.py
@@ -12,11 +12,8 @@
# See the License for the specific language governing permissions and
# limitations under the License.

try:
from collections import abc as collections_abc
except ImportError: # Python 2.7
import collections as collections_abc

from collections import abc as collections_abc
import datetime
import decimal
import functools
