chore: upgrade gapic-generator-python to 0.46.3 (#196)
* changes without context

        autosynth cannot find the source of changes triggered by earlier changes in this
        repository, or by version upgrades to tools such as linters.

* chore: upgrade to gapic-generator-python 0.44.0

chore: add GAPIC metadata
feat: add support for self-signed JWT
PiperOrigin-RevId: 370525906

Source-Author: Google APIs <noreply@google.com>
Source-Date: Mon Apr 26 13:12:26 2021 -0700
Source-Repo: googleapis/googleapis
Source-Sha: 60e129d0672a1be2c70b41bf76aadc7ad1b1ca0f
Source-Link: googleapis/googleapis@60e129d

* chore: revert to gapic-generator-python 0.43.3

PiperOrigin-RevId: 371362703

Source-Author: Google APIs <noreply@google.com>
Source-Date: Fri Apr 30 10:44:40 2021 -0700
Source-Repo: googleapis/googleapis
Source-Sha: 5a04154e7c7c0e98e0e4085f6e2c67bd5bff6ff8
Source-Link: googleapis/googleapis@5a04154

* fix: add async client to %name_%version/__init__.py

chore: add autogenerated snippets
chore: remove auth, policy, and options from the reserved names list
feat: support self-signed JWT flow for service accounts
chore: enable GAPIC metadata generation
chore: sort subpackages in %namespace/%name/__init__.py

PiperOrigin-RevId: 372197450

Source-Author: Google APIs <noreply@google.com>
Source-Date: Wed May 5 13:39:02 2021 -0700
Source-Repo: googleapis/googleapis
Source-Sha: 83a7e1c8c2f7421ded45ed323eb1fda99ef5ea46
Source-Link: googleapis/googleapis@83a7e1c

* chore: upgrade gapic-generator-python to 0.46.1

PiperOrigin-RevId: 373400747

Source-Author: Google APIs <noreply@google.com>
Source-Date: Wed May 12 10:34:35 2021 -0700
Source-Repo: googleapis/googleapis
Source-Sha: 162641cfe5573c648df679a6dd30385650a08704
Source-Link: googleapis/googleapis@162641c

* chore: upgrade gapic-generator-python to 0.46.3

PiperOrigin-RevId: 373649163

Source-Author: Google APIs <noreply@google.com>
Source-Date: Thu May 13 13:40:36 2021 -0700
Source-Repo: googleapis/googleapis
Source-Sha: 7e1b14e6c7a9ab96d2db7e4a131981f162446d34
Source-Link: googleapis/googleapis@7e1b14e
yoshi-automation committed May 18, 2021
1 parent ceae220 commit 0fe6484
Showing 52 changed files with 1,775 additions and 1,081 deletions.
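Among the generated changes below, the self-signed JWT feature lets service-account credentials authenticate with a JWT signed by the key itself instead of exchanging the key for an OAuth 2.0 access token. A minimal sketch of how that flow is reached, assuming a placeholder key path:

# Sketch: self-signed JWT flow for service accounts (placeholder path).
from google.cloud import bigquery_storage
from google.oauth2 import service_account

credentials = service_account.Credentials.from_service_account_file(
    "/path/to/service-account.json"  # placeholder
)

# With self-signed JWT support, the generated client can authenticate
# the service account without the OAuth 2.0 token-exchange round trip.
client = bigquery_storage.BigQueryReadClient(credentials=credentials)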
1 change: 0 additions & 1 deletion .coveragerc
@@ -2,7 +2,6 @@
branch = True

[report]
fail_under = 100
show_missing = True
omit =
google/cloud/bigquery_storage/__init__.py
2 changes: 1 addition & 1 deletion .pre-commit-config.yaml
@@ -26,6 +26,6 @@ repos:
hooks:
- id: black
- repo: https://gitlab.com/pycqa/flake8
rev: 3.9.0
rev: 3.9.2
hooks:
- id: flake8
16 changes: 1 addition & 15 deletions CONTRIBUTING.rst
@@ -160,21 +160,7 @@ Running System Tests
auth settings and change some configuration in your project to
run all the tests.

- System tests will be run against an actual project and
so you'll need to provide some environment variables to facilitate
authentication to your project:

- ``GOOGLE_APPLICATION_CREDENTIALS``: The path to a JSON key file;
Such a file can be downloaded directly from the developer's console by clicking
"Generate new JSON key". See private key
`docs <https://cloud.google.com/storage/docs/authentication#generating-a-private-key>`__
for more details.

- Once you have downloaded your json keys, set the environment variable
``GOOGLE_APPLICATION_CREDENTIALS`` to the absolute path of the json file::

$ export GOOGLE_APPLICATION_CREDENTIALS="/Users/<your_username>/path/to/app_credentials.json"

- System tests will be run against an actual project. You should use local credentials from gcloud when possible. See `Best practices for application authentication <https://cloud.google.com/docs/authentication/best-practices-applications#local_development_and_testing_with_the>`__. Some tests require a service account. For those tests see `Authenticating as a service account <https://cloud.google.com/docs/authentication/production>`__.

*************
Test Coverage
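The updated guidance above favors local gcloud credentials over downloaded JSON keys. A minimal sketch of the Application Default Credentials lookup, assuming gcloud auth application-default login has been run:

# Sketch: Application Default Credentials for local development.
# Assumes `gcloud auth application-default login` was run beforehand;
# no key file or GOOGLE_APPLICATION_CREDENTIALS variable is needed.
import google.auth

credentials, project_id = google.auth.default()
print(project_id)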
12 changes: 6 additions & 6 deletions google/cloud/bigquery_storage/__init__.py
@@ -1,5 +1,4 @@
# -*- coding: utf-8 -*-

# Copyright 2020 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
@@ -16,6 +15,7 @@
#

from google.cloud.bigquery_storage_v1 import BigQueryReadClient

from google.cloud.bigquery_storage_v1 import gapic_types as types
from google.cloud.bigquery_storage_v1 import __version__
from google.cloud.bigquery_storage_v1.types.arrow import ArrowRecordBatch
@@ -30,27 +30,27 @@
from google.cloud.bigquery_storage_v1.types.storage import SplitReadStreamResponse
from google.cloud.bigquery_storage_v1.types.storage import StreamStats
from google.cloud.bigquery_storage_v1.types.storage import ThrottleState
from google.cloud.bigquery_storage_v1.types.stream import DataFormat
from google.cloud.bigquery_storage_v1.types.stream import ReadSession
from google.cloud.bigquery_storage_v1.types.stream import ReadStream
from google.cloud.bigquery_storage_v1.types.stream import DataFormat

__all__ = (
"BigQueryReadClient",
"__version__",
"types",
"ArrowRecordBatch",
"ArrowSchema",
"ArrowSerializationOptions",
"AvroRows",
"AvroSchema",
"BigQueryReadClient",
"CreateReadSessionRequest",
"DataFormat",
"ReadRowsRequest",
"ReadRowsResponse",
"ReadSession",
"ReadStream",
"SplitReadStreamRequest",
"SplitReadStreamResponse",
"StreamStats",
"ThrottleState",
"ReadSession",
"ReadStream",
"DataFormat",
)
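The reshuffled imports and re-sorted __all__ above are cosmetic; the package surface is unchanged. A quick sketch of the top-level imports this file continues to expose:

# Sketch: the re-sorted __all__ leaves the public surface unchanged.
from google.cloud.bigquery_storage import (
    BigQueryReadClient,
    DataFormat,
    ReadSession,
)

session = ReadSession(data_format=DataFormat.ARROW)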
53 changes: 53 additions & 0 deletions google/cloud/bigquery_storage_v1/gapic_metadata.json
@@ -0,0 +1,53 @@
{
"comment": "This file maps proto services/RPCs to the corresponding library clients/methods",
"language": "python",
"libraryPackage": "google.cloud.bigquery_storage_v1",
"protoPackage": "google.cloud.bigquery.storage.v1",
"schema": "1.0",
"services": {
"BigQueryRead": {
"clients": {
"grpc": {
"libraryClient": "BigQueryReadClient",
"rpcs": {
"CreateReadSession": {
"methods": [
"create_read_session"
]
},
"ReadRows": {
"methods": [
"read_rows"
]
},
"SplitReadStream": {
"methods": [
"split_read_stream"
]
}
}
},
"grpc-async": {
"libraryClient": "BigQueryReadAsyncClient",
"rpcs": {
"CreateReadSession": {
"methods": [
"create_read_session"
]
},
"ReadRows": {
"methods": [
"read_rows"
]
},
"SplitReadStream": {
"methods": [
"split_read_stream"
]
}
}
}
}
}
}
}
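The new gapic_metadata.json maps each proto RPC to the generated client methods, so tooling can resolve method names without parsing source. A minimal lookup sketch, assuming the path of a repository checkout:

# Sketch: resolving an RPC to its generated method via gapic_metadata.json.
import json

with open("google/cloud/bigquery_storage_v1/gapic_metadata.json") as f:
    metadata = json.load(f)

rpcs = metadata["services"]["BigQueryRead"]["clients"]["grpc"]["rpcs"]
print(rpcs["ReadRows"]["methods"])  # ["read_rows"]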
1 change: 0 additions & 1 deletion google/cloud/bigquery_storage_v1/services/__init__.py
@@ -1,5 +1,4 @@
# -*- coding: utf-8 -*-

# Copyright 2020 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
@@ -1,5 +1,4 @@
# -*- coding: utf-8 -*-

# Copyright 2020 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
@@ -14,7 +13,6 @@
# See the License for the specific language governing permissions and
# limitations under the License.
#

from .client import BigQueryReadClient
from .async_client import BigQueryReadAsyncClient

@@ -1,5 +1,4 @@
# -*- coding: utf-8 -*-

# Copyright 2020 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
@@ -14,26 +13,24 @@
# See the License for the specific language governing permissions and
# limitations under the License.
#

from collections import OrderedDict
import functools
import re
from typing import Dict, AsyncIterable, Awaitable, Sequence, Tuple, Type, Union
import pkg_resources

import google.api_core.client_options as ClientOptions # type: ignore
from google.api_core import exceptions # type: ignore
from google.api_core import exceptions as core_exceptions # type: ignore
from google.api_core import gapic_v1 # type: ignore
from google.api_core import retry as retries # type: ignore
from google.auth import credentials # type: ignore
from google.auth import credentials as ga_credentials # type: ignore
from google.oauth2 import service_account # type: ignore

from google.cloud.bigquery_storage_v1.types import arrow
from google.cloud.bigquery_storage_v1.types import avro
from google.cloud.bigquery_storage_v1.types import storage
from google.cloud.bigquery_storage_v1.types import stream
from google.protobuf import timestamp_pb2 as timestamp # type: ignore

from google.protobuf import timestamp_pb2 # type: ignore
from .transports.base import BigQueryReadTransport, DEFAULT_CLIENT_INFO
from .transports.grpc_asyncio import BigQueryReadGrpcAsyncIOTransport
from .client import BigQueryReadClient
@@ -55,35 +52,31 @@ class BigQueryReadAsyncClient:
parse_read_stream_path = staticmethod(BigQueryReadClient.parse_read_stream_path)
table_path = staticmethod(BigQueryReadClient.table_path)
parse_table_path = staticmethod(BigQueryReadClient.parse_table_path)

common_billing_account_path = staticmethod(
BigQueryReadClient.common_billing_account_path
)
parse_common_billing_account_path = staticmethod(
BigQueryReadClient.parse_common_billing_account_path
)

common_folder_path = staticmethod(BigQueryReadClient.common_folder_path)
parse_common_folder_path = staticmethod(BigQueryReadClient.parse_common_folder_path)

common_organization_path = staticmethod(BigQueryReadClient.common_organization_path)
parse_common_organization_path = staticmethod(
BigQueryReadClient.parse_common_organization_path
)

common_project_path = staticmethod(BigQueryReadClient.common_project_path)
parse_common_project_path = staticmethod(
BigQueryReadClient.parse_common_project_path
)

common_location_path = staticmethod(BigQueryReadClient.common_location_path)
parse_common_location_path = staticmethod(
BigQueryReadClient.parse_common_location_path
)

@classmethod
def from_service_account_info(cls, info: dict, *args, **kwargs):
"""Creates an instance of this client using the provided credentials info.
"""Creates an instance of this client using the provided credentials
info.
Args:
info (dict): The service account private key info.
@@ -98,7 +91,7 @@ def from_service_account_info(cls, info: dict, *args, **kwargs):
@classmethod
def from_service_account_file(cls, filename: str, *args, **kwargs):
"""Creates an instance of this client using the provided credentials
file.
file.
Args:
filename (str): The path to the service account private key json
@@ -115,7 +108,7 @@ def from_service_account_file(cls, filename: str, *args, **kwargs):

@property
def transport(self) -> BigQueryReadTransport:
"""Return the transport used by the client instance.
"""Returns the transport used by the client instance.
Returns:
BigQueryReadTransport: The transport used by the client instance.
@@ -129,12 +122,12 @@ def transport(self) -> BigQueryReadTransport:
def __init__(
self,
*,
credentials: credentials.Credentials = None,
credentials: ga_credentials.Credentials = None,
transport: Union[str, BigQueryReadTransport] = "grpc_asyncio",
client_options: ClientOptions = None,
client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
) -> None:
"""Instantiate the big query read client.
"""Instantiates the big query read client.
Args:
credentials (Optional[google.auth.credentials.Credentials]): The
@@ -166,7 +159,6 @@ def __init__(
google.auth.exceptions.MutualTlsChannelError: If mutual TLS transport
creation failed for any reason.
"""

self._client = BigQueryReadClient(
credentials=credentials,
transport=transport,
@@ -244,7 +236,6 @@ async def create_read_session(
This corresponds to the ``max_stream_count`` field
on the ``request`` instance; if ``request`` is provided, this
should not be set.
retry (google.api_core.retry.Retry): Designation of what errors, if any,
should be retried.
timeout (float): The timeout for this request.
@@ -269,7 +260,6 @@

# If we have keyword arguments corresponding to fields on the
# request, apply these.

if parent is not None:
request.parent = parent
if read_session is not None:
@@ -286,7 +276,8 @@
maximum=60.0,
multiplier=1.3,
predicate=retries.if_exception_type(
exceptions.DeadlineExceeded, exceptions.ServiceUnavailable,
core_exceptions.DeadlineExceeded,
core_exceptions.ServiceUnavailable,
),
deadline=600.0,
),
@@ -345,7 +336,6 @@ def read_rows(
This corresponds to the ``offset`` field
on the ``request`` instance; if ``request`` is provided, this
should not be set.
retry (google.api_core.retry.Retry): Designation of what errors, if any,
should be retried.
timeout (float): The timeout for this request.
@@ -372,7 +362,6 @@

# If we have keyword arguments corresponding to fields on the
# request, apply these.

if read_stream is not None:
request.read_stream = read_stream
if offset is not None:
@@ -386,7 +375,9 @@
initial=0.1,
maximum=60.0,
multiplier=1.3,
predicate=retries.if_exception_type(exceptions.ServiceUnavailable,),
predicate=retries.if_exception_type(
core_exceptions.ServiceUnavailable,
),
deadline=86400.0,
),
default_timeout=86400.0,
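Besides renaming exceptions to core_exceptions, the hunks above show the generated per-method retry defaults. A sketch of overriding them on a call, mirroring the read_rows defaults (the commented call is hypothetical usage):

# Sketch: a custom retry mirroring the generated read_rows defaults
# (retry only on ServiceUnavailable, exponential backoff, 24h deadline).
from google.api_core import exceptions as core_exceptions
from google.api_core import retry as retries

custom_retry = retries.Retry(
    initial=0.1,
    maximum=60.0,
    multiplier=1.3,
    predicate=retries.if_exception_type(core_exceptions.ServiceUnavailable),
    deadline=86400.0,
)

# Hypothetical usage:
# client.read_rows(read_stream=stream_name, retry=custom_retry)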
@@ -433,7 +424,6 @@ async def split_read_stream(
request (:class:`google.cloud.bigquery_storage_v1.types.SplitReadStreamRequest`):
The request object. Request message for
`SplitReadStream`.
retry (google.api_core.retry.Retry): Designation of what errors, if any,
should be retried.
timeout (float): The timeout for this request.
@@ -445,7 +435,6 @@
Response message for SplitReadStream.
"""
# Create or coerce a protobuf request object.

request = storage.SplitReadStreamRequest(request)

# Wrap the RPC method; this adds retry and timeout information,
@@ -457,7 +446,8 @@
maximum=60.0,
multiplier=1.3,
predicate=retries.if_exception_type(
exceptions.DeadlineExceeded, exceptions.ServiceUnavailable,
core_exceptions.DeadlineExceeded,
core_exceptions.ServiceUnavailable,
),
deadline=600.0,
),
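Taken together, a minimal end-to-end sketch of the async client touched above, assuming the BigQueryReadAsyncClient export this commit's fix adds and placeholder resource names:

# Sketch: reading rows with the async client (placeholder names).
import asyncio

from google.cloud import bigquery_storage
from google.cloud.bigquery_storage_v1 import BigQueryReadAsyncClient


async def main():
    client = BigQueryReadAsyncClient()
    session = await client.create_read_session(
        parent="projects/my-project",  # placeholder
        read_session=bigquery_storage.ReadSession(
            table="projects/my-project/datasets/my_dataset/tables/my_table",
            data_format=bigquery_storage.DataFormat.ARROW,
        ),
        max_stream_count=1,
    )
    # read_rows is a server-streaming call; awaiting it yields an
    # async iterable of ReadRowsResponse messages.
    stream = await client.read_rows(read_stream=session.streams[0].name)
    async for response in stream:
        print(response.row_count)


asyncio.run(main())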
