chore: remove manual workaround for response size validation #112

Merged
merged 2 commits on Jan 12, 2021
31 changes: 7 additions & 24 deletions .coveragerc
@@ -1,35 +1,18 @@
# -*- coding: utf-8 -*-
#
# Copyright 2020 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# https://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

# Generated by synthtool. DO NOT EDIT!
[run]
branch = True

[report]
fail_under = 100
show_missing = True
omit =
google/cloud/bigquery_storage/__init__.py
exclude_lines =
# Re-enable the standard pragma
pragma: NO COVER
# Ignore debug-only repr
def __repr__
# Ignore abstract methods
raise NotImplementedError
omit =
*/gapic/*.py
*/proto/*.py
*/core/*.py
*/site-packages/*.py
# Ignore pkg_resources exceptions.
# This is added at the module level as a safeguard for if someone
# generates the code and tries to run it without pip installing. This
# makes it virtually impossible to test properly.
except pkg_resources.DistributionNotFound
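The exclude_lines entries above are regular expressions that coverage.py matches against source lines; a matching line (and, for a pattern such as def __repr__, the block it introduces) is left out of the fail_under = 100 calculation. A rough sketch of the kind of code these patterns are meant to match — the module and class below are hypothetical, not taken from this repository:

# Hypothetical module illustrating the coverage exclusions configured above.
try:
    import pkg_resources

    __version__ = pkg_resources.get_distribution(
        "google-cloud-bigquery-storage"
    ).version
except pkg_resources.DistributionNotFound:  # excluded: untestable without pip install
    __version__ = None


class ExampleStream:
    def __repr__(self):  # excluded: debug-only repr
        return "<ExampleStream>"

    def rows(self):
        raise NotImplementedError  # excluded: abstract method
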
1 change: 1 addition & 0 deletions .flake8
@@ -26,6 +26,7 @@ exclude =
*_pb2.py

# Standard linting exemptions.
**/.nox/**
__pycache__,
.git,
*.pyc,
16 changes: 10 additions & 6 deletions .kokoro/build.sh
@@ -15,7 +15,11 @@

set -eo pipefail

cd github/python-bigquery-storage
if [[ -z "${PROJECT_ROOT:-}" ]]; then
PROJECT_ROOT="github/python-bigquery-storage"
fi

cd "${PROJECT_ROOT}"

# Disable buffering, so that the logs stream through.
export PYTHONUNBUFFERED=1
@@ -30,16 +34,16 @@ export GOOGLE_APPLICATION_CREDENTIALS=${KOKORO_GFILE_DIR}/service-account.json
export PROJECT_ID=$(cat "${KOKORO_GFILE_DIR}/project-id.json")

# Remove old nox
python3.6 -m pip uninstall --yes --quiet nox-automation
python3 -m pip uninstall --yes --quiet nox-automation

# Install nox
python3.6 -m pip install --upgrade --quiet nox
python3.6 -m nox --version
python3 -m pip install --upgrade --quiet nox
python3 -m nox --version

# If NOX_SESSION is set, it only runs the specified session,
# otherwise run all the sessions.
if [[ -n "${NOX_SESSION:-}" ]]; then
python3.6 -m nox -s "${NOX_SESSION:-}"
python3 -m nox -s ${NOX_SESSION:-}
else
python3.6 -m nox
python3 -m nox
fi
11 changes: 11 additions & 0 deletions .kokoro/docs/docs-presubmit.cfg
@@ -15,3 +15,14 @@ env_vars: {
key: "TRAMPOLINE_IMAGE_UPLOAD"
value: "false"
}

env_vars: {
key: "TRAMPOLINE_BUILD_FILE"
value: "github/python-bigquery-storage/.kokoro/build.sh"
}

# Only run this nox session.
env_vars: {
key: "NOX_SESSION"
value: "docs docfx"
}
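The NOX_SESSION value above ("docs docfx") names sessions defined in the repository's noxfile.py, which .kokoro/build.sh now forwards to nox -s. As a rough illustration only — the dependency list and sphinx-build flags below are assumptions, not the repository's actual noxfile — a docs session generally looks like this:

# Illustrative nox session; package pins and flags are assumptions.
import nox


@nox.session(python="3.8")
def docs(session):
    """Build the HTML documentation with Sphinx."""
    session.install("sphinx", "alabaster", "recommonmark")
    session.install("-e", ".")
    session.run(
        "sphinx-build",
        "-W",  # treat warnings as errors
        "-b",
        "html",
        "docs/",
        "docs/_build/html/",
    )
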
17 changes: 17 additions & 0 deletions .pre-commit-config.yaml
@@ -0,0 +1,17 @@
# See https://pre-commit.com for more information
# See https://pre-commit.com/hooks.html for more hooks
repos:
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v3.4.0
hooks:
- id: trailing-whitespace
- id: end-of-file-fixer
- id: check-yaml
- repo: https://github.com/psf/black
rev: 19.10b0
hooks:
- id: black
- repo: https://gitlab.com/pycqa/flake8
rev: 3.8.4
hooks:
- id: flake8
2 changes: 2 additions & 0 deletions .trampolinerc
@@ -18,12 +18,14 @@
required_envvars+=(
"STAGING_BUCKET"
"V2_STAGING_BUCKET"
"NOX_SESSION"
)

# Add env vars which are passed down into the container here.
pass_down_envvars+=(
"STAGING_BUCKET"
"V2_STAGING_BUCKET"
"NOX_SESSION"
)

# Prevent unintentional override on the default image.
21 changes: 15 additions & 6 deletions CONTRIBUTING.rst
@@ -21,8 +21,8 @@ In order to add a feature:
- The feature must be documented in both the API and narrative
documentation.

- The feature must work fully on the following CPython versions: 2.7,
3.5, 3.6, 3.7 and 3.8 on both UNIX and Windows.
- The feature must work fully on the following CPython versions:
3.6, 3.7, 3.8 and 3.9 on both UNIX and Windows.

- The feature must not add unnecessary dependencies (where
"unnecessary" is of course subjective, but new dependencies should
@@ -111,6 +111,16 @@ Coding Style
should point to the official ``googleapis`` checkout and the
the branch should be the main branch on that remote (``master``).

- This repository contains configuration for the
`pre-commit <https://pre-commit.com/>`__ tool, which automates checking
our linters during a commit. If you have it installed on your ``$PATH``,
you can enable enforcing those checks via:

.. code-block:: bash

$ pre-commit install
pre-commit installed at .git/hooks/pre-commit

Exceptions to PEP8:

- Many unit tests use a helper method, ``_call_fut`` ("FUT" is short for
@@ -192,25 +202,24 @@ Supported Python Versions

We support:

- `Python 3.5`_
- `Python 3.6`_
- `Python 3.7`_
- `Python 3.8`_
- `Python 3.9`_

.. _Python 3.5: https://docs.python.org/3.5/
.. _Python 3.6: https://docs.python.org/3.6/
.. _Python 3.7: https://docs.python.org/3.7/
.. _Python 3.8: https://docs.python.org/3.8/
.. _Python 3.9: https://docs.python.org/3.9/


Supported versions can be found in our ``noxfile.py`` `config`_.

.. _config: https://github.com/googleapis/python-bigquery-storage/blob/master/noxfile.py

Python 2.7 support is deprecated. All code changes should maintain Python 2.7 compatibility until January 1, 2020.

We also explicitly decided to support Python 3 beginning with version
3.5. Reasons for this include:
3.6. Reasons for this include:

- Encouraging use of newest versions of Python 3
- Taking the lead of `prominent`_ open-source `projects`_
7 changes: 4 additions & 3 deletions LICENSE
@@ -1,6 +1,7 @@
Apache License

Apache License
Version 2.0, January 2004
https://www.apache.org/licenses/
http://www.apache.org/licenses/

TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION

@@ -192,7 +193,7 @@
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

https://www.apache.org/licenses/LICENSE-2.0
http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
7 changes: 6 additions & 1 deletion docs/_static/custom.css
@@ -1,4 +1,9 @@
div#python2-eol {
border-color: red;
border-width: medium;
}
}

/* Ensure minimum width for 'Parameters' / 'Returns' column */
dl.field-list > dt {
min-width: 100px
}
6 changes: 6 additions & 0 deletions docs/bigquery_storage_v1/big_query_read.rst
@@ -0,0 +1,6 @@
BigQueryRead
------------------------------

.. automodule:: google.cloud.bigquery_storage_v1.services.big_query_read
:members:
:inherited-members:
6 changes: 3 additions & 3 deletions docs/bigquery_storage_v1/services.rst
@@ -1,6 +1,6 @@
Services for Google Cloud Bigquery Storage v1 API
=================================================
.. toctree::
:maxdepth: 2

.. automodule:: google.cloud.bigquery_storage_v1.services.big_query_read
:members:
:inherited-members:
big_query_read
1 change: 1 addition & 0 deletions docs/bigquery_storage_v1/types.rst
@@ -3,4 +3,5 @@ Types for Google Cloud Bigquery Storage v1 API

.. automodule:: google.cloud.bigquery_storage_v1.types
:members:
:undoc-members:
:show-inheritance:
@@ -81,6 +81,7 @@ class BigQueryReadAsyncClient:
BigQueryReadClient.parse_common_location_path
)

from_service_account_info = BigQueryReadClient.from_service_account_info
from_service_account_file = BigQueryReadClient.from_service_account_file
from_service_account_json = from_service_account_file

@@ -181,16 +182,17 @@ async def create_read_session(
caller.

Args:
request (:class:`~.storage.CreateReadSessionRequest`):
request (:class:`google.cloud.bigquery_storage_v1.types.CreateReadSessionRequest`):
The request object. Request message for
`CreateReadSession`.
parent (:class:`str`):
Required. The request project that owns the session, in
the form of ``projects/{project_id}``.

This corresponds to the ``parent`` field
on the ``request`` instance; if ``request`` is provided, this
should not be set.
read_session (:class:`~.stream.ReadSession`):
read_session (:class:`google.cloud.bigquery_storage_v1.types.ReadSession`):
Required. Session to be created.
This corresponds to the ``read_session`` field
on the ``request`` instance; if ``request`` is provided, this
@@ -210,6 +212,7 @@

Streams must be read starting from
offset 0.

This corresponds to the ``max_stream_count`` field
on the ``request`` instance; if ``request`` is provided, this
should not be set.
@@ -221,7 +224,7 @@
sent along with the request as metadata.

Returns:
~.stream.ReadSession:
google.cloud.bigquery_storage_v1.types.ReadSession:
Information about the ReadSession.
"""
# Create or coerce a protobuf request object.
@@ -296,7 +299,7 @@ def read_rows(
reflecting the current state of the stream.

Args:
request (:class:`~.storage.ReadRowsRequest`):
request (:class:`google.cloud.bigquery_storage_v1.types.ReadRowsRequest`):
The request object. Request message for `ReadRows`.
read_stream (:class:`str`):
Required. Stream to read rows from.
@@ -309,6 +312,7 @@
Requesting a larger offset is undefined.
If not specified, start reading from
offset zero.

This corresponds to the ``offset`` field
on the ``request`` instance; if ``request`` is provided, this
should not be set.
@@ -320,9 +324,9 @@
sent along with the request as metadata.

Returns:
AsyncIterable[~.storage.ReadRowsResponse]:
Response from calling ``ReadRows`` may include row data,
progress and throttling information.
AsyncIterable[google.cloud.bigquery_storage_v1.types.ReadRowsResponse]:
Response from calling ReadRows may include row data, progress and
throttling information.

"""
# Create or coerce a protobuf request object.
@@ -396,7 +400,7 @@ async def split_read_stream(
once the streams have been read to completion.

Args:
request (:class:`~.storage.SplitReadStreamRequest`):
request (:class:`google.cloud.bigquery_storage_v1.types.SplitReadStreamRequest`):
The request object. Request message for
`SplitReadStream`.

@@ -407,8 +411,8 @@
sent along with the request as metadata.

Returns:
~.storage.SplitReadStreamResponse:
Response message for ``SplitReadStream``.
google.cloud.bigquery_storage_v1.types.SplitReadStreamResponse:
Response message for SplitReadStream.
"""
# Create or coerce a protobuf request object.

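The docstring changes above switch cross-references to fully qualified names such as google.cloud.bigquery_storage_v1.types.ReadSession. For orientation, here is a minimal usage sketch of the generated read API these docstrings describe, shown with the synchronous BigQueryReadClient (the async client exposes the same surface); the project, dataset, and table names are placeholders, and the snippet is an illustration rather than an official sample:

# Minimal sketch: create a read session, then stream rows from its first stream.
from google.cloud.bigquery_storage_v1 import types
from google.cloud.bigquery_storage_v1.services.big_query_read import BigQueryReadClient

client = BigQueryReadClient()

requested_session = types.ReadSession(
    table="projects/my-project/datasets/my_dataset/tables/my_table",  # placeholder
    data_format=types.DataFormat.AVRO,
)

# CreateReadSession: parent project plus a session template; ask for one stream.
session = client.create_read_session(
    parent="projects/my-project",  # placeholder
    read_session=requested_session,
    max_stream_count=1,
)

# ReadRows: each ReadRowsResponse may carry row data plus progress/throttling info.
for response in client.read_rows(read_stream=session.streams[0].name, offset=0):
    print(response.row_count)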