From 14fbdb811688296e3978b4dfe7d4c240b5b1da5d Mon Sep 17 00:00:00 2001
From: hkdevandla <60490673+hkdevandla@users.noreply.github.com>
Date: Thu, 20 Aug 2020 10:28:46 -0700
Subject: [PATCH] feat: Migrate API client to Microgenerator (#54)

* Add samples for Data Catalog lookup_entry [(#2148)](https://github.com/GoogleCloudPlatform/python-docs-samples/issues/2148)
* Add samples for Data Catalog lookup_entry
* Add tests for Data Catalog lookup_entry
* Add samples for lookup_entry by SQL Resource
* Add README.rst
* Improve command line interface
* Removed the "lookup-" prefix from commands
* Handle the --sql-resource optional argument by subparsers
* Refer to GCP public assets in tests
* Add region tags to support Data Catalog docs [(#2169)](https://github.com/GoogleCloudPlatform/python-docs-samples/issues/2169)
* Adds updates including compute [(#2436)](https://github.com/GoogleCloudPlatform/python-docs-samples/issues/2436)
* Adds updates including compute
* Python 2 compat pytest
* Fixing weird \r\n issue from GH merge
* Put asset tests back in
* Re-add pod operator test
* Hack parameter for k8s pod operator
* Auto-update dependencies. [(#2005)](https://github.com/GoogleCloudPlatform/python-docs-samples/issues/2005)
* Auto-update dependencies.
* Revert update of appengine/flexible/datastore.
* revert update of appengine/flexible/scipy
* revert update of bigquery/bqml
* revert update of bigquery/cloud-client
* revert update of bigquery/datalab-migration
* revert update of bigtable/quickstart
* revert update of compute/api
* revert update of container_registry/container_analysis
* revert update of dataflow/run_template
* revert update of datastore/cloud-ndb
* revert update of dialogflow/cloud-client
* revert update of dlp
* revert update of functions/imagemagick
* revert update of functions/ocr/app
* revert update of healthcare/api-client/fhir
* revert update of iam/api-client
* revert update of iot/api-client/gcs_file_to_device
* revert update of iot/api-client/mqtt_example
* revert update of language/automl
* revert update of run/image-processing
* revert update of vision/automl
* revert update testing/requirements.txt
* revert update of vision/cloud-client/detect
* revert update of vision/cloud-client/product_search
* revert update of jobs/v2/api_client
* revert update of jobs/v3/api_client
* revert update of opencensus
* revert update of translate/cloud-client
* revert update to speech/cloud-client

Co-authored-by: Kurtis Van Gent <31518063+kurtisvg@users.noreply.github.com>
Co-authored-by: Doug Mahugh

* chore(deps): update dependency google-cloud-datacatalog to v0.6.0 [(#3069)](https://github.com/GoogleCloudPlatform/python-docs-samples/issues/3069)

This PR contains the following updates:

| Package | Update | Change |
|---|---|---|
| [google-cloud-datacatalog](https://togithub.com/googleapis/python-datacatalog) | minor | `==0.5.0` -> `==0.6.0` |

---

### Release Notes
googleapis/python-datacatalog

### [`v0.6.0`](https://togithub.com/googleapis/python-datacatalog/blob/master/CHANGELOG.md#​060httpswwwgithubcomgoogleapispython-datacatalogcomparev050v060-2020-02-24)

[Compare Source](https://togithub.com/googleapis/python-datacatalog/compare/v0.5.0...v0.6.0)

##### Features

- **datacatalog:** add sample for create a fileset entry quickstart ([#​9977](https://www.github.com/googleapis/python-datacatalog/issues/9977)) ([16eaf4b](https://www.github.com/googleapis/python-datacatalog/commit/16eaf4b16cdc0ce7361afb1d8dac666cea2a9db0))
- **datacatalog:** undeprecate resource name helper methods, bump copyright year to 2020, tweak docstring formatting (via synth) ([#​10228](https://www.github.com/googleapis/python-datacatalog/issues/10228)) ([84e5e7c](https://www.github.com/googleapis/python-datacatalog/commit/84e5e7c340fa189ce4cffca4fdee82cc7ded9f70))
- add `list_entry_groups`, `list_entries`, `update_entry_group` methods to v1beta1 (via synth) ([#​6](https://www.github.com/googleapis/python-datacatalog/issues/6)) ([b51902e](https://www.github.com/googleapis/python-datacatalog/commit/b51902e26d590f52c9412756a178265850b7d516))

##### Bug Fixes

- **datacatalog:** deprecate resource name helper methods (via synth) ([#​9831](https://www.github.com/googleapis/python-datacatalog/issues/9831)) ([22db3f0](https://www.github.com/googleapis/python-datacatalog/commit/22db3f0683b8aca544cd96c0063dcc8157ad7335))
---

### Renovate configuration

:date: **Schedule**: At any time (no schedule defined).

:vertical_traffic_light: **Automerge**: Disabled by config. Please merge this manually once you are satisfied.

:recycle: **Rebasing**: Whenever PR becomes conflicted, or you tick the rebase/retry checkbox.

:no_bell: **Ignore**: Close this PR and you won't be reminded about this update again.

---

- [ ] If you want to rebase/retry this PR, check this box

---

This PR has been generated by [WhiteSource Renovate](https://renovate.whitesourcesoftware.com). View repository job log [here](https://app.renovatebot.com/dashboard#GoogleCloudPlatform/python-docs-samples).

* Simplify noxfile setup. [(#2806)](https://github.com/GoogleCloudPlatform/python-docs-samples/issues/2806)

* chore(deps): update dependency requests to v2.23.0
* Simplify noxfile and add version control.
* Configure appengine/standard to only test Python 2.7.
* Update Kokokro configs to match noxfile.
* Add requirements-test to each folder.
* Remove Py2 versions from everything execept appengine/standard.
* Remove conftest.py.
* Remove appengine/standard/conftest.py
* Remove 'no-sucess-flaky-report' from pytest.ini.
* Add GAE SDK back to appengine/standard tests.
* Fix typo.
* Roll pytest to python 2 version.
* Add a bunch of testing requirements.
* Remove typo.
* Add appengine lib directory back in.
* Add some additional requirements.
* Fix issue with flake8 args.
* Even more requirements.
* Readd appengine conftest.py.
* Add a few more requirements.
* Even more Appengine requirements.
* Add webtest for appengine/standard/mailgun.
* Add some additional requirements.
* Add workaround for issue with mailjet-rest.
* Add responses for appengine/standard/mailjet.
Co-authored-by: Renovate Bot

* Update dependency google-cloud-datacatalog to v0.7.0 [(#3320)](https://github.com/GoogleCloudPlatform/python-docs-samples/issues/3320)

Co-authored-by: Bu Sun Kim <8822365+busunkim96@users.noreply.github.com>

* Update Data Catalog samples to V1 [(#3382)](https://github.com/GoogleCloudPlatform/python-docs-samples/issues/3382)

Co-authored-by: Takashi Matsuo

* chore(deps): update dependency google-cloud-datacatalog to v0.8.0 [(#3850)](https://github.com/GoogleCloudPlatform/python-docs-samples/issues/3850)

* Update dependency google-cloud-datacatalog to v1 [(#4115)](https://github.com/GoogleCloudPlatform/python-docs-samples/issues/4115)

* chore(deps): update dependency pytest to v5.4.3 [(#4279)](https://github.com/GoogleCloudPlatform/python-docs-samples/issues/4279)

* chore(deps): update dependency pytest to v5.4.3
* specify pytest for python 2 in appengine

Co-authored-by: Leah Cole

* Update dependency pytest to v6 [(#4390)](https://github.com/GoogleCloudPlatform/python-docs-samples/issues/4390)

* chore: update templates
* chore: update templates
* feat: Migrate to use Microgenerator
* feat: Migrate to use Microgenerator
* feat: Migrate to use Microgenerator
* Migrate API to microgenerator
* Migrate API to microgenerator
* Samples tests
* fix samples tests
* fix lint errors and test coverage metrics
* docs update
* fix docs
* fix docs
* fix docs
* remove .python-version file

Co-authored-by: Ricardo Mendes <50331050+ricardosm-cit@users.noreply.github.com>
Co-authored-by: Gus Class
Co-authored-by: DPEBot
Co-authored-by: Kurtis Van Gent <31518063+kurtisvg@users.noreply.github.com>
Co-authored-by: Doug Mahugh
Co-authored-by: WhiteSource Renovate
Co-authored-by: Bu Sun Kim <8822365+busunkim96@users.noreply.github.com>
Co-authored-by: Marcelo Costa
Co-authored-by: Takashi Matsuo
Co-authored-by: Leah Cole
---
 .coveragerc                      |  11 +-
 README.rst                       |  11 +
 UPGRADING.md                     | 150 +
 docs/UPGRADING.md                |   1 +
 docs/datacatalog_v1/services.rst |   6 +
docs/datacatalog_v1/types.rst | 5 + docs/datacatalog_v1beta1/services.rst | 12 + .../v1beta1 => datacatalog_v1beta1}/types.rst | 4 +- docs/gapic/v1/api.rst | 6 - docs/gapic/v1/types.rst | 5 - docs/gapic/v1beta1/api.rst | 6 - docs/index.rst | 18 +- google/cloud/datacatalog/__init__.py | 226 + google/cloud/datacatalog/py.typed | 2 + google/cloud/datacatalog_v1/__init__.py | 130 +- google/cloud/datacatalog_v1/gapic/__init__.py | 0 .../gapic/data_catalog_client.py | 2725 ------- .../gapic/data_catalog_client_config.py | 177 - google/cloud/datacatalog_v1/gapic/enums.py | 110 - .../gapic/transports/__init__.py | 0 .../transports/data_catalog_grpc_transport.py | 603 -- google/cloud/datacatalog_v1/proto/__init__.py | 0 .../cloud/datacatalog_v1/proto/common.proto | 38 + .../cloud/datacatalog_v1/proto/common_pb2.py | 75 - .../datacatalog_v1/proto/common_pb2_grpc.py | 3 - .../datacatalog_v1/proto/datacatalog.proto | 1261 +++ .../datacatalog_v1/proto/datacatalog_pb2.py | 3906 ---------- .../proto/datacatalog_pb2_grpc.py | 1373 ---- .../proto/gcs_fileset_spec.proto | 77 + .../proto/gcs_fileset_spec_pb2.py | 254 - .../proto/gcs_fileset_spec_pb2_grpc.py | 3 - .../cloud/datacatalog_v1/proto/schema.proto | 55 + .../cloud/datacatalog_v1/proto/schema_pb2.py | 249 - .../datacatalog_v1/proto/schema_pb2_grpc.py | 3 - .../cloud/datacatalog_v1/proto/search.proto | 84 + .../cloud/datacatalog_v1/proto/search_pb2.py | 305 - .../datacatalog_v1/proto/search_pb2_grpc.py | 3 - .../datacatalog_v1/proto/table_spec.proto | 101 + .../datacatalog_v1/proto/table_spec_pb2.py | 450 -- .../proto/table_spec_pb2_grpc.py | 3 - google/cloud/datacatalog_v1/proto/tags.proto | 229 + google/cloud/datacatalog_v1/proto/tags_pb2.py | 1217 --- .../datacatalog_v1/proto/tags_pb2_grpc.py | 3 - .../datacatalog_v1/proto/timestamps.proto | 41 + .../datacatalog_v1/proto/timestamps_pb2.py | 149 - .../proto/timestamps_pb2_grpc.py | 3 - google/cloud/datacatalog_v1/py.typed | 2 + .../cloud/datacatalog_v1/services/__init__.py | 
16 + .../services/data_catalog}/__init__.py | 18 +- .../services/data_catalog/async_client.py | 2716 +++++++ .../services/data_catalog/client.py | 2913 +++++++ .../services/data_catalog/pagers.py | 534 ++ .../data_catalog/transports/__init__.py | 36 + .../services/data_catalog/transports/base.py | 528 ++ .../services/data_catalog/transports/grpc.py | 1065 +++ .../data_catalog/transports/grpc_asyncio.py | 1086 +++ google/cloud/datacatalog_v1/types.py | 72 - google/cloud/datacatalog_v1/types/__init__.py | 121 + google/cloud/datacatalog_v1/types/common.py | 35 + .../cloud/datacatalog_v1/types/datacatalog.py | 1042 +++ .../datacatalog_v1/types/gcs_fileset_spec.py | 102 + google/cloud/datacatalog_v1/types/schema.py | 71 + google/cloud/datacatalog_v1/types/search.py | 94 + .../cloud/datacatalog_v1/types/table_spec.py | 119 + google/cloud/datacatalog_v1/types/tags.py | 291 + .../cloud/datacatalog_v1/types/timestamps.py | 53 + google/cloud/datacatalog_v1beta1/__init__.py | 190 +- .../datacatalog_v1beta1/gapic/__init__.py | 0 .../gapic/data_catalog_client.py | 2715 ------- .../gapic/data_catalog_client_config.py | 177 - .../cloud/datacatalog_v1beta1/gapic/enums.py | 125 - .../gapic/policy_tag_manager_client.py | 1270 --- .../gapic/policy_tag_manager_client_config.py | 107 - ...policy_tag_manager_serialization_client.py | 399 - ...tag_manager_serialization_client_config.py | 52 - .../gapic/transports/__init__.py | 0 .../transports/data_catalog_grpc_transport.py | 591 -- .../policy_tag_manager_grpc_transport.py | 282 - ...ag_manager_serialization_grpc_transport.py | 145 - .../datacatalog_v1beta1/proto/__init__.py | 0 .../datacatalog_v1beta1/proto/common.proto | 38 + .../datacatalog_v1beta1/proto/common_pb2.py | 75 - .../proto/common_pb2_grpc.py | 3 - .../proto/datacatalog.proto | 502 +- .../proto/datacatalog_pb2.py | 3849 ---------- .../proto/datacatalog_pb2_grpc.py | 1361 ---- .../proto/gcs_fileset_spec.proto | 42 +- .../proto/gcs_fileset_spec_pb2.py | 254 - 
.../proto/gcs_fileset_spec_pb2_grpc.py | 3 - .../proto/policytagmanager.proto | 417 + .../proto/policytagmanager_pb2.py | 1514 ---- .../proto/policytagmanager_pb2_grpc.py | 620 -- .../proto/policytagmanagerserialization.proto | 157 + .../policytagmanagerserialization_pb2.py | 713 -- .../policytagmanagerserialization_pb2_grpc.py | 138 - .../datacatalog_v1beta1/proto/schema.proto | 21 +- .../datacatalog_v1beta1/proto/schema_pb2.py | 249 - .../proto/schema_pb2_grpc.py | 3 - .../datacatalog_v1beta1/proto/search.proto | 9 +- .../datacatalog_v1beta1/proto/search_pb2.py | 230 - .../proto/search_pb2_grpc.py | 3 - .../proto/table_spec.proto | 24 +- .../proto/table_spec_pb2.py | 450 -- .../proto/table_spec_pb2_grpc.py | 3 - .../datacatalog_v1beta1/proto/tags.proto | 57 +- .../datacatalog_v1beta1/proto/tags_pb2.py | 1216 --- .../proto/tags_pb2_grpc.py | 3 - .../proto/timestamps.proto | 11 +- .../proto/timestamps_pb2.py | 149 - .../proto/timestamps_pb2_grpc.py | 3 - google/cloud/datacatalog_v1beta1/py.typed | 2 + .../datacatalog_v1beta1/services/__init__.py | 16 + .../services/data_catalog}/__init__.py | 18 +- .../services/data_catalog/async_client.py | 2740 +++++++ .../services/data_catalog/client.py | 2905 +++++++ .../services/data_catalog/pagers.py | 534 ++ .../data_catalog/transports/__init__.py | 36 + .../services/data_catalog/transports/base.py | 560 ++ .../services/data_catalog/transports/grpc.py | 1054 +++ .../data_catalog/transports/grpc_asyncio.py | 1075 +++ .../services/policy_tag_manager/__init__.py | 24 + .../policy_tag_manager/async_client.py | 1177 +++ .../services/policy_tag_manager/client.py | 1344 ++++ .../services/policy_tag_manager/pagers.py | 276 + .../policy_tag_manager/transports/__init__.py | 36 + .../policy_tag_manager/transports/base.py | 288 + .../policy_tag_manager/transports/grpc.py | 566 ++ .../transports/grpc_asyncio.py | 568 ++ .../__init__.py} | 21 +- .../async_client.py | 222 + .../client.py | 357 + .../transports/__init__.py | 38 + 
.../transports/base.py | 136 + .../transports/grpc.py | 277 + .../transports/grpc_asyncio.py | 270 + google/cloud/datacatalog_v1beta1/types.py | 76 - .../datacatalog_v1beta1/types/__init__.py | 167 + .../cloud/datacatalog_v1beta1/types/common.py | 35 + .../datacatalog_v1beta1/types/datacatalog.py | 996 +++ .../types/gcs_fileset_spec.py | 103 + .../types/policytagmanager.py | 368 + .../types/policytagmanagerserialization.py | 174 + .../cloud/datacatalog_v1beta1/types/schema.py | 71 + .../cloud/datacatalog_v1beta1/types/search.py | 77 + .../datacatalog_v1beta1/types/table_spec.py | 119 + .../cloud/datacatalog_v1beta1/types/tags.py | 291 + .../datacatalog_v1beta1/types/timestamps.py | 53 + mypy.ini | 3 + noxfile.py | 10 +- .../create_fileset_entry_quickstart.py | 25 +- samples/snippets/README.rst | 24 +- samples/snippets/lookup_entry.py | 12 +- samples/tests/conftest.py | 13 +- samples/tests/test_create_entry_group.py | 2 +- samples/v1beta1/create_entry_group.py | 5 +- samples/v1beta1/create_fileset_entry.py | 2 +- samples/v1beta1/datacatalog_get_entry.py | 6 +- samples/v1beta1/datacatalog_lookup_entry.py | 5 +- .../datacatalog_lookup_entry_sql_resource.py | 5 +- samples/v1beta1/datacatalog_search.py | 2 +- scripts/fixup_datacatalog_v1_keywords.py | 204 + scripts/fixup_datacatalog_v1beta1_keywords.py | 216 + setup.py | 17 +- synth.metadata | 14 +- synth.py | 15 +- tests/unit/gapic/datacatalog_v1/__init__.py | 1 + .../gapic/datacatalog_v1/test_data_catalog.py | 6836 +++++++++++++++++ .../gapic/datacatalog_v1beta1/__init__.py | 1 + .../datacatalog_v1beta1/test_data_catalog.py | 6833 ++++++++++++++++ .../test_policy_tag_manager.py | 3683 +++++++++ .../test_policy_tag_manager_serialization.py | 967 +++ .../gapic/v1/test_data_catalog_client_v1.py | 1268 --- .../test_data_catalog_client_v1beta1.py | 1268 --- .../test_policy_tag_manager_client_v1beta1.py | 613 -- ...ag_manager_serialization_client_v1beta1.py | 145 - 175 files changed, 50245 insertions(+), 32197 deletions(-) 
create mode 100644 UPGRADING.md create mode 100644 docs/UPGRADING.md create mode 100644 docs/datacatalog_v1/services.rst create mode 100644 docs/datacatalog_v1/types.rst create mode 100644 docs/datacatalog_v1beta1/services.rst rename docs/{gapic/v1beta1 => datacatalog_v1beta1}/types.rst (62%) delete mode 100644 docs/gapic/v1/api.rst delete mode 100644 docs/gapic/v1/types.rst delete mode 100644 docs/gapic/v1beta1/api.rst create mode 100644 google/cloud/datacatalog/__init__.py create mode 100644 google/cloud/datacatalog/py.typed delete mode 100644 google/cloud/datacatalog_v1/gapic/__init__.py delete mode 100644 google/cloud/datacatalog_v1/gapic/data_catalog_client.py delete mode 100644 google/cloud/datacatalog_v1/gapic/data_catalog_client_config.py delete mode 100644 google/cloud/datacatalog_v1/gapic/enums.py delete mode 100644 google/cloud/datacatalog_v1/gapic/transports/__init__.py delete mode 100644 google/cloud/datacatalog_v1/gapic/transports/data_catalog_grpc_transport.py delete mode 100644 google/cloud/datacatalog_v1/proto/__init__.py create mode 100644 google/cloud/datacatalog_v1/proto/common.proto delete mode 100644 google/cloud/datacatalog_v1/proto/common_pb2.py delete mode 100644 google/cloud/datacatalog_v1/proto/common_pb2_grpc.py create mode 100644 google/cloud/datacatalog_v1/proto/datacatalog.proto delete mode 100644 google/cloud/datacatalog_v1/proto/datacatalog_pb2.py delete mode 100644 google/cloud/datacatalog_v1/proto/datacatalog_pb2_grpc.py create mode 100644 google/cloud/datacatalog_v1/proto/gcs_fileset_spec.proto delete mode 100644 google/cloud/datacatalog_v1/proto/gcs_fileset_spec_pb2.py delete mode 100644 google/cloud/datacatalog_v1/proto/gcs_fileset_spec_pb2_grpc.py create mode 100644 google/cloud/datacatalog_v1/proto/schema.proto delete mode 100644 google/cloud/datacatalog_v1/proto/schema_pb2.py delete mode 100644 google/cloud/datacatalog_v1/proto/schema_pb2_grpc.py create mode 100644 google/cloud/datacatalog_v1/proto/search.proto delete mode 
100644 google/cloud/datacatalog_v1/proto/search_pb2.py delete mode 100644 google/cloud/datacatalog_v1/proto/search_pb2_grpc.py create mode 100644 google/cloud/datacatalog_v1/proto/table_spec.proto delete mode 100644 google/cloud/datacatalog_v1/proto/table_spec_pb2.py delete mode 100644 google/cloud/datacatalog_v1/proto/table_spec_pb2_grpc.py create mode 100644 google/cloud/datacatalog_v1/proto/tags.proto delete mode 100644 google/cloud/datacatalog_v1/proto/tags_pb2.py delete mode 100644 google/cloud/datacatalog_v1/proto/tags_pb2_grpc.py create mode 100644 google/cloud/datacatalog_v1/proto/timestamps.proto delete mode 100644 google/cloud/datacatalog_v1/proto/timestamps_pb2.py delete mode 100644 google/cloud/datacatalog_v1/proto/timestamps_pb2_grpc.py create mode 100644 google/cloud/datacatalog_v1/py.typed create mode 100644 google/cloud/datacatalog_v1/services/__init__.py rename google/{ => cloud/datacatalog_v1/services/data_catalog}/__init__.py (71%) create mode 100644 google/cloud/datacatalog_v1/services/data_catalog/async_client.py create mode 100644 google/cloud/datacatalog_v1/services/data_catalog/client.py create mode 100644 google/cloud/datacatalog_v1/services/data_catalog/pagers.py create mode 100644 google/cloud/datacatalog_v1/services/data_catalog/transports/__init__.py create mode 100644 google/cloud/datacatalog_v1/services/data_catalog/transports/base.py create mode 100644 google/cloud/datacatalog_v1/services/data_catalog/transports/grpc.py create mode 100644 google/cloud/datacatalog_v1/services/data_catalog/transports/grpc_asyncio.py delete mode 100644 google/cloud/datacatalog_v1/types.py create mode 100644 google/cloud/datacatalog_v1/types/__init__.py create mode 100644 google/cloud/datacatalog_v1/types/common.py create mode 100644 google/cloud/datacatalog_v1/types/datacatalog.py create mode 100644 google/cloud/datacatalog_v1/types/gcs_fileset_spec.py create mode 100644 google/cloud/datacatalog_v1/types/schema.py create mode 100644 
google/cloud/datacatalog_v1/types/search.py create mode 100644 google/cloud/datacatalog_v1/types/table_spec.py create mode 100644 google/cloud/datacatalog_v1/types/tags.py create mode 100644 google/cloud/datacatalog_v1/types/timestamps.py delete mode 100644 google/cloud/datacatalog_v1beta1/gapic/__init__.py delete mode 100644 google/cloud/datacatalog_v1beta1/gapic/data_catalog_client.py delete mode 100644 google/cloud/datacatalog_v1beta1/gapic/data_catalog_client_config.py delete mode 100644 google/cloud/datacatalog_v1beta1/gapic/enums.py delete mode 100644 google/cloud/datacatalog_v1beta1/gapic/policy_tag_manager_client.py delete mode 100644 google/cloud/datacatalog_v1beta1/gapic/policy_tag_manager_client_config.py delete mode 100644 google/cloud/datacatalog_v1beta1/gapic/policy_tag_manager_serialization_client.py delete mode 100644 google/cloud/datacatalog_v1beta1/gapic/policy_tag_manager_serialization_client_config.py delete mode 100644 google/cloud/datacatalog_v1beta1/gapic/transports/__init__.py delete mode 100644 google/cloud/datacatalog_v1beta1/gapic/transports/data_catalog_grpc_transport.py delete mode 100644 google/cloud/datacatalog_v1beta1/gapic/transports/policy_tag_manager_grpc_transport.py delete mode 100644 google/cloud/datacatalog_v1beta1/gapic/transports/policy_tag_manager_serialization_grpc_transport.py delete mode 100644 google/cloud/datacatalog_v1beta1/proto/__init__.py create mode 100644 google/cloud/datacatalog_v1beta1/proto/common.proto delete mode 100644 google/cloud/datacatalog_v1beta1/proto/common_pb2.py delete mode 100644 google/cloud/datacatalog_v1beta1/proto/common_pb2_grpc.py delete mode 100644 google/cloud/datacatalog_v1beta1/proto/datacatalog_pb2.py delete mode 100644 google/cloud/datacatalog_v1beta1/proto/datacatalog_pb2_grpc.py delete mode 100644 google/cloud/datacatalog_v1beta1/proto/gcs_fileset_spec_pb2.py delete mode 100644 google/cloud/datacatalog_v1beta1/proto/gcs_fileset_spec_pb2_grpc.py create mode 100644 
google/cloud/datacatalog_v1beta1/proto/policytagmanager.proto delete mode 100644 google/cloud/datacatalog_v1beta1/proto/policytagmanager_pb2.py delete mode 100644 google/cloud/datacatalog_v1beta1/proto/policytagmanager_pb2_grpc.py create mode 100644 google/cloud/datacatalog_v1beta1/proto/policytagmanagerserialization.proto delete mode 100644 google/cloud/datacatalog_v1beta1/proto/policytagmanagerserialization_pb2.py delete mode 100644 google/cloud/datacatalog_v1beta1/proto/policytagmanagerserialization_pb2_grpc.py delete mode 100644 google/cloud/datacatalog_v1beta1/proto/schema_pb2.py delete mode 100644 google/cloud/datacatalog_v1beta1/proto/schema_pb2_grpc.py delete mode 100644 google/cloud/datacatalog_v1beta1/proto/search_pb2.py delete mode 100644 google/cloud/datacatalog_v1beta1/proto/search_pb2_grpc.py delete mode 100644 google/cloud/datacatalog_v1beta1/proto/table_spec_pb2.py delete mode 100644 google/cloud/datacatalog_v1beta1/proto/table_spec_pb2_grpc.py delete mode 100644 google/cloud/datacatalog_v1beta1/proto/tags_pb2.py delete mode 100644 google/cloud/datacatalog_v1beta1/proto/tags_pb2_grpc.py delete mode 100644 google/cloud/datacatalog_v1beta1/proto/timestamps_pb2.py delete mode 100644 google/cloud/datacatalog_v1beta1/proto/timestamps_pb2_grpc.py create mode 100644 google/cloud/datacatalog_v1beta1/py.typed create mode 100644 google/cloud/datacatalog_v1beta1/services/__init__.py rename google/cloud/{ => datacatalog_v1beta1/services/data_catalog}/__init__.py (71%) create mode 100644 google/cloud/datacatalog_v1beta1/services/data_catalog/async_client.py create mode 100644 google/cloud/datacatalog_v1beta1/services/data_catalog/client.py create mode 100644 google/cloud/datacatalog_v1beta1/services/data_catalog/pagers.py create mode 100644 google/cloud/datacatalog_v1beta1/services/data_catalog/transports/__init__.py create mode 100644 google/cloud/datacatalog_v1beta1/services/data_catalog/transports/base.py create mode 100644 
google/cloud/datacatalog_v1beta1/services/data_catalog/transports/grpc.py create mode 100644 google/cloud/datacatalog_v1beta1/services/data_catalog/transports/grpc_asyncio.py create mode 100644 google/cloud/datacatalog_v1beta1/services/policy_tag_manager/__init__.py create mode 100644 google/cloud/datacatalog_v1beta1/services/policy_tag_manager/async_client.py create mode 100644 google/cloud/datacatalog_v1beta1/services/policy_tag_manager/client.py create mode 100644 google/cloud/datacatalog_v1beta1/services/policy_tag_manager/pagers.py create mode 100644 google/cloud/datacatalog_v1beta1/services/policy_tag_manager/transports/__init__.py create mode 100644 google/cloud/datacatalog_v1beta1/services/policy_tag_manager/transports/base.py create mode 100644 google/cloud/datacatalog_v1beta1/services/policy_tag_manager/transports/grpc.py create mode 100644 google/cloud/datacatalog_v1beta1/services/policy_tag_manager/transports/grpc_asyncio.py rename google/cloud/{datacatalog.py => datacatalog_v1beta1/services/policy_tag_manager_serialization/__init__.py} (55%) create mode 100644 google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/async_client.py create mode 100644 google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/client.py create mode 100644 google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/transports/__init__.py create mode 100644 google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/transports/base.py create mode 100644 google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/transports/grpc.py create mode 100644 google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/transports/grpc_asyncio.py delete mode 100644 google/cloud/datacatalog_v1beta1/types.py create mode 100644 google/cloud/datacatalog_v1beta1/types/__init__.py create mode 100644 google/cloud/datacatalog_v1beta1/types/common.py create mode 100644 
google/cloud/datacatalog_v1beta1/types/datacatalog.py create mode 100644 google/cloud/datacatalog_v1beta1/types/gcs_fileset_spec.py create mode 100644 google/cloud/datacatalog_v1beta1/types/policytagmanager.py create mode 100644 google/cloud/datacatalog_v1beta1/types/policytagmanagerserialization.py create mode 100644 google/cloud/datacatalog_v1beta1/types/schema.py create mode 100644 google/cloud/datacatalog_v1beta1/types/search.py create mode 100644 google/cloud/datacatalog_v1beta1/types/table_spec.py create mode 100644 google/cloud/datacatalog_v1beta1/types/tags.py create mode 100644 google/cloud/datacatalog_v1beta1/types/timestamps.py create mode 100644 mypy.ini create mode 100644 scripts/fixup_datacatalog_v1_keywords.py create mode 100644 scripts/fixup_datacatalog_v1beta1_keywords.py create mode 100644 tests/unit/gapic/datacatalog_v1/__init__.py create mode 100644 tests/unit/gapic/datacatalog_v1/test_data_catalog.py create mode 100644 tests/unit/gapic/datacatalog_v1beta1/__init__.py create mode 100644 tests/unit/gapic/datacatalog_v1beta1/test_data_catalog.py create mode 100644 tests/unit/gapic/datacatalog_v1beta1/test_policy_tag_manager.py create mode 100644 tests/unit/gapic/datacatalog_v1beta1/test_policy_tag_manager_serialization.py delete mode 100644 tests/unit/gapic/v1/test_data_catalog_client_v1.py delete mode 100644 tests/unit/gapic/v1beta1/test_data_catalog_client_v1beta1.py delete mode 100644 tests/unit/gapic/v1beta1/test_policy_tag_manager_client_v1beta1.py delete mode 100644 tests/unit/gapic/v1beta1/test_policy_tag_manager_serialization_client_v1beta1.py diff --git a/.coveragerc b/.coveragerc index dd39c854..7b592626 100644 --- a/.coveragerc +++ b/.coveragerc @@ -21,6 +21,7 @@ branch = True [report] fail_under = 100 show_missing = True +omit = google/cloud/datacatalog/__init.py exclude_lines = # Re-enable the standard pragma pragma: NO COVER @@ -28,8 +29,8 @@ exclude_lines = def __repr__ # Ignore abstract methods raise NotImplementedError -omit = - 
*/gapic/*.py
-    */proto/*.py
-    */core/*.py
-    */site-packages/*.py
\ No newline at end of file
+    # Ignore pkg_resources exceptions.
+    # This is added at the module level as a safeguard for if someone
+    # generates the code and tries to run it without pip installing. This
+    # makes it virtually impossible to test properly.
+    except pkg_resources.DistributionNotFound
diff --git a/README.rst b/README.rst
index e14f861f..556d9583 100644
--- a/README.rst
+++ b/README.rst
@@ -35,6 +35,17 @@ In order to use this library, you first need to go through the following steps:
 .. _Enable the Google Cloud Data Catalog API.:  https://cloud.google.com/data-catalog
 .. _Setup Authentication.: https://googleapis.dev/python/google-api-core/latest/auth.html
+
+Supported Python Versions
+^^^^^^^^^^^^^^^^^^^^^^^^^
+Python >= 3.6
+
+Deprecated Python Versions
+^^^^^^^^^^^^^^^^^^^^^^^^^^
+Python == 2.7.
+
+The last version of this library compatible with Python 2.7 is google-cloud-datacatalog==1.0.0.
+
 Installation
 ~~~~~~~~~~~~
diff --git a/UPGRADING.md b/UPGRADING.md
new file mode 100644
index 00000000..6a3961ce
--- /dev/null
+++ b/UPGRADING.md
@@ -0,0 +1,150 @@
+# 2.0.0 Migration Guide
+
+The 2.0 release of the `google-cloud-datacatalog` client is a significant upgrade based on a [next-gen code generator](https://github.com/googleapis/gapic-generator-python), and includes substantial interface changes. Existing code written for earlier versions of this library will likely require updates to use this version. This document describes the changes that have been made, and what you need to do to update your usage.
+
+If you experience issues or have questions, please file an [issue](https://github.com/googleapis/python-datacatalog/issues).
+
+## Supported Python Versions
+
+> **WARNING**: Breaking change
+
+The 2.0.0 release requires Python 3.6+.
+
+## Method Calls
+
+> **WARNING**: Breaking change
+
+Methods expect request objects. We provide a script that will convert most common use cases.
+
+* Install the library
+
+```py
+python3 -m pip install google-cloud-datacatalog
+```
+
+* The script `fixup_datacatalog_v1_keywords.py` is shipped with the library. It expects
+an input directory (with the code to convert) and an empty destination directory.
+
+```sh
+$ fixup_datacatalog_v1_keywords.py --input-directory .samples/ --output-directory samples/
+```
+
+**Before:**
+```py
+from google.cloud import datacatalog_v1
+datacatalog = datacatalog_v1.DataCatalogClient()
+return datacatalog.lookup_entry(linked_resource=resource_name)
+```
+
+**After:**
+```py
+from google.cloud import datacatalog_v1
+datacatalog = datacatalog_v1.DataCatalogClient()
+return datacatalog.lookup_entry(request={'linked_resource': resource_name})
+```
+
+### More Details
+
+In `google-cloud-datacatalog<2.0.0`, parameters required by the API were positional parameters and optional parameters were keyword parameters.
+
+**Before:**
+```py
+    def create_entry_group(
+        self,
+        parent,
+        entry_group_id,
+        entry_group=None,
+        retry=google.api_core.gapic_v1.method.DEFAULT,
+        timeout=google.api_core.gapic_v1.method.DEFAULT,
+        metadata=None,
+    ):
+```
+
+In the 2.0.0 release, all methods have a single positional parameter `request`. Method docstrings indicate whether a parameter is required or optional.
+
+Some methods have additional keyword only parameters. The available parameters depend on the `google.api.method_signature` annotation specified by the API producer.
+
+**After:**
+```py
+    def create_entry_group(
+        self,
+        request: datacatalog.CreateEntryGroupRequest = None,
+        *,
+        parent: str = None,
+        entry_group_id: str = None,
+        entry_group: datacatalog.EntryGroup = None,
+        retry: retries.Retry = gapic_v1.method.DEFAULT,
+        timeout: float = None,
+        metadata: Sequence[Tuple[str, str]] = (),
+    ) -> datacatalog.EntryGroup:
+```
+
+> **NOTE:** The `request` parameter and flattened keyword parameters for the API are mutually exclusive.
+> Passing both will result in an error.
+
+Both of these calls are valid:
+
+```py
+response = client.create_entry_group(
+    request={
+        "parent": parent,
+        "entry_group_id": entry_group_id,
+        "entry_group": entry_group
+    }
+)
+```
+
+```py
+response = client.create_entry_group(
+    parent=parent,
+    entry_group_id=entry_group_id,
+    entry_group=entry_group
+)  # Make an API request.
+```
+
+This call is invalid because it mixes `request` with the keyword argument `entry_group`. Executing this code
+will result in an error.
+
+```py
+response = client.create_entry_group(
+    request={
+        "parent": parent,
+        "entry_group_id": entry_group_id
+    },
+    entry_group=entry_group
+)
+```
+
+## Enums and Types
+
+> **WARNING**: Breaking change
+
+The submodules `enums` and `types` have been removed.
+
+**Before:**
+```py
+from google.cloud import datacatalog_v1beta1
+entry = datacatalog_v1beta1.types.Entry()
+entry.type = datacatalog_v1beta1.enums.EntryType.FILESET
+```
+
+**After:**
+```py
+from google.cloud import datacatalog_v1beta1
+entry = datacatalog_v1beta1.Entry()
+entry.type = datacatalog_v1beta1.EntryType.FILESET
+```
+
+## Project Path Helper Methods
+
+The project path helper method `project_path` has been removed. Please construct
+this path manually.
+
+```py
+project = 'my-project'
+project_path = f'projects/{project}'
+```
\ No newline at end of file
diff --git a/docs/UPGRADING.md b/docs/UPGRADING.md
new file mode 100644
index 00000000..01097c8c
--- /dev/null
+++ b/docs/UPGRADING.md
@@ -0,0 +1 @@
+../UPGRADING.md
\ No newline at end of file
diff --git a/docs/datacatalog_v1/services.rst b/docs/datacatalog_v1/services.rst
new file mode 100644
index 00000000..a73ca817
--- /dev/null
+++ b/docs/datacatalog_v1/services.rst
@@ -0,0 +1,6 @@
+Services for Google Cloud Datacatalog v1 API
+============================================
+
+.. automodule:: google.cloud.datacatalog_v1.services.data_catalog
+    :members:
+    :inherited-members:
diff --git a/docs/datacatalog_v1/types.rst b/docs/datacatalog_v1/types.rst
new file mode 100644
index 00000000..cb94a5e5
--- /dev/null
+++ b/docs/datacatalog_v1/types.rst
@@ -0,0 +1,5 @@
+Types for Google Cloud Datacatalog v1 API
+=========================================
+
+.. automodule:: google.cloud.datacatalog_v1.types
+    :members:
diff --git a/docs/datacatalog_v1beta1/services.rst b/docs/datacatalog_v1beta1/services.rst
new file mode 100644
index 00000000..43425e2f
--- /dev/null
+++ b/docs/datacatalog_v1beta1/services.rst
@@ -0,0 +1,12 @@
+Services for Google Cloud Datacatalog v1beta1 API
+=================================================
+
+.. automodule:: google.cloud.datacatalog_v1beta1.services.data_catalog
+    :members:
+    :inherited-members:
+.. automodule:: google.cloud.datacatalog_v1beta1.services.policy_tag_manager
+    :members:
+    :inherited-members:
+.. automodule:: google.cloud.datacatalog_v1beta1.services.policy_tag_manager_serialization
+    :members:
+    :inherited-members:
diff --git a/docs/gapic/v1beta1/types.rst b/docs/datacatalog_v1beta1/types.rst
similarity index 62%
rename from docs/gapic/v1beta1/types.rst
rename to docs/datacatalog_v1beta1/types.rst
index bcc6cefb..75ee2bb4 100644
--- a/docs/gapic/v1beta1/types.rst
+++ b/docs/datacatalog_v1beta1/types.rst
@@ -1,5 +1,5 @@
-Types for Google Cloud Data Catalog API Client
+Types for Google Cloud Datacatalog v1beta1 API
 ==============================================
 
 .. automodule:: google.cloud.datacatalog_v1beta1.types
-    :members:
\ No newline at end of file
+    :members:
diff --git a/docs/gapic/v1/api.rst b/docs/gapic/v1/api.rst
deleted file mode 100644
index c9dfa5cf..00000000
--- a/docs/gapic/v1/api.rst
+++ /dev/null
@@ -1,6 +0,0 @@
-Client for Google Cloud Data Catalog API
-========================================
-
-.. automodule:: google.cloud.datacatalog_v1
-    :members:
-    :inherited-members:
\ No newline at end of file
diff --git a/docs/gapic/v1/types.rst b/docs/gapic/v1/types.rst
deleted file mode 100644
index 98c3a90a..00000000
--- a/docs/gapic/v1/types.rst
+++ /dev/null
@@ -1,5 +0,0 @@
-Types for Google Cloud Data Catalog API Client
-==============================================
-
-.. automodule:: google.cloud.datacatalog_v1.types
-    :members:
\ No newline at end of file
diff --git a/docs/gapic/v1beta1/api.rst b/docs/gapic/v1beta1/api.rst
deleted file mode 100644
index 4c56460c..00000000
--- a/docs/gapic/v1beta1/api.rst
+++ /dev/null
@@ -1,6 +0,0 @@
-Client for Google Cloud Data Catalog API
-========================================
-
-.. automodule:: google.cloud.datacatalog_v1beta1
-    :members:
-    :inherited-members:
\ No newline at end of file
diff --git a/docs/index.rst b/docs/index.rst
index ad554342..2c61ee9b 100644
--- a/docs/index.rst
+++ b/docs/index.rst
@@ -8,9 +8,8 @@ v1
 .. toctree::
     :maxdepth: 2
 
-    gapic/v1/api
-    gapic/v1/types
-
+    datacatalog_v1/services
+    datacatalog_v1/types
 
 v1beta1
 -------------
@@ -18,9 +17,18 @@ v1beta1
 .. toctree::
     :maxdepth: 2
 
-    gapic/v1beta1/api
-    gapic/v1beta1/types
+    datacatalog_v1beta1/services
+    datacatalog_v1beta1/types
+
+Migration Guide
+---------------
+
+See the guide below for instructions on migrating to the 2.x release of this library.
+
+.. toctree::
+    :maxdepth: 2
+
+    UPGRADING
 
 Changelog
 ---------
diff --git a/google/cloud/datacatalog/__init__.py b/google/cloud/datacatalog/__init__.py
new file mode 100644
index 00000000..c33e190b
--- /dev/null
+++ b/google/cloud/datacatalog/__init__.py
@@ -0,0 +1,226 @@
+# -*- coding: utf-8 -*-
+
+# Copyright 2020 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +# + +from google.cloud.datacatalog_v1beta1.services.data_catalog.async_client import ( + DataCatalogAsyncClient, +) +from google.cloud.datacatalog_v1beta1.services.data_catalog.client import ( + DataCatalogClient, +) +from google.cloud.datacatalog_v1beta1.services.policy_tag_manager.async_client import ( + PolicyTagManagerAsyncClient, +) +from google.cloud.datacatalog_v1beta1.services.policy_tag_manager.client import ( + PolicyTagManagerClient, +) +from google.cloud.datacatalog_v1beta1.services.policy_tag_manager_serialization.async_client import ( + PolicyTagManagerSerializationAsyncClient, +) +from google.cloud.datacatalog_v1beta1.services.policy_tag_manager_serialization.client import ( + PolicyTagManagerSerializationClient, +) +from google.cloud.datacatalog_v1beta1.types.common import IntegratedSystem +from google.cloud.datacatalog_v1beta1.types.datacatalog import CreateEntryGroupRequest +from google.cloud.datacatalog_v1beta1.types.datacatalog import CreateEntryRequest +from google.cloud.datacatalog_v1beta1.types.datacatalog import CreateTagRequest +from google.cloud.datacatalog_v1beta1.types.datacatalog import ( + CreateTagTemplateFieldRequest, +) +from google.cloud.datacatalog_v1beta1.types.datacatalog import CreateTagTemplateRequest +from google.cloud.datacatalog_v1beta1.types.datacatalog import DeleteEntryGroupRequest +from google.cloud.datacatalog_v1beta1.types.datacatalog import DeleteEntryRequest +from google.cloud.datacatalog_v1beta1.types.datacatalog import DeleteTagRequest +from google.cloud.datacatalog_v1beta1.types.datacatalog import ( + 
DeleteTagTemplateFieldRequest, +) +from google.cloud.datacatalog_v1beta1.types.datacatalog import DeleteTagTemplateRequest +from google.cloud.datacatalog_v1beta1.types.datacatalog import Entry +from google.cloud.datacatalog_v1beta1.types.datacatalog import EntryGroup +from google.cloud.datacatalog_v1beta1.types.datacatalog import EntryType +from google.cloud.datacatalog_v1beta1.types.datacatalog import GetEntryGroupRequest +from google.cloud.datacatalog_v1beta1.types.datacatalog import GetEntryRequest +from google.cloud.datacatalog_v1beta1.types.datacatalog import GetTagTemplateRequest +from google.cloud.datacatalog_v1beta1.types.datacatalog import ListEntriesRequest +from google.cloud.datacatalog_v1beta1.types.datacatalog import ListEntriesResponse +from google.cloud.datacatalog_v1beta1.types.datacatalog import ListEntryGroupsRequest +from google.cloud.datacatalog_v1beta1.types.datacatalog import ListEntryGroupsResponse +from google.cloud.datacatalog_v1beta1.types.datacatalog import ListTagsRequest +from google.cloud.datacatalog_v1beta1.types.datacatalog import ListTagsResponse +from google.cloud.datacatalog_v1beta1.types.datacatalog import LookupEntryRequest +from google.cloud.datacatalog_v1beta1.types.datacatalog import ( + RenameTagTemplateFieldRequest, +) +from google.cloud.datacatalog_v1beta1.types.datacatalog import SearchCatalogRequest +from google.cloud.datacatalog_v1beta1.types.datacatalog import SearchCatalogResponse +from google.cloud.datacatalog_v1beta1.types.datacatalog import UpdateEntryGroupRequest +from google.cloud.datacatalog_v1beta1.types.datacatalog import UpdateEntryRequest +from google.cloud.datacatalog_v1beta1.types.datacatalog import UpdateTagRequest +from google.cloud.datacatalog_v1beta1.types.datacatalog import ( + UpdateTagTemplateFieldRequest, +) +from google.cloud.datacatalog_v1beta1.types.datacatalog import UpdateTagTemplateRequest +from google.cloud.datacatalog_v1beta1.types.gcs_fileset_spec import GcsFileSpec +from 
google.cloud.datacatalog_v1beta1.types.gcs_fileset_spec import GcsFilesetSpec +from google.cloud.datacatalog_v1beta1.types.policytagmanager import ( + CreatePolicyTagRequest, +) +from google.cloud.datacatalog_v1beta1.types.policytagmanager import ( + CreateTaxonomyRequest, +) +from google.cloud.datacatalog_v1beta1.types.policytagmanager import ( + DeletePolicyTagRequest, +) +from google.cloud.datacatalog_v1beta1.types.policytagmanager import ( + DeleteTaxonomyRequest, +) +from google.cloud.datacatalog_v1beta1.types.policytagmanager import GetPolicyTagRequest +from google.cloud.datacatalog_v1beta1.types.policytagmanager import GetTaxonomyRequest +from google.cloud.datacatalog_v1beta1.types.policytagmanager import ( + ListPolicyTagsRequest, +) +from google.cloud.datacatalog_v1beta1.types.policytagmanager import ( + ListPolicyTagsResponse, +) +from google.cloud.datacatalog_v1beta1.types.policytagmanager import ( + ListTaxonomiesRequest, +) +from google.cloud.datacatalog_v1beta1.types.policytagmanager import ( + ListTaxonomiesResponse, +) +from google.cloud.datacatalog_v1beta1.types.policytagmanager import PolicyTag +from google.cloud.datacatalog_v1beta1.types.policytagmanager import Taxonomy +from google.cloud.datacatalog_v1beta1.types.policytagmanager import ( + UpdatePolicyTagRequest, +) +from google.cloud.datacatalog_v1beta1.types.policytagmanager import ( + UpdateTaxonomyRequest, +) +from google.cloud.datacatalog_v1beta1.types.policytagmanagerserialization import ( + ExportTaxonomiesRequest, +) +from google.cloud.datacatalog_v1beta1.types.policytagmanagerserialization import ( + ExportTaxonomiesResponse, +) +from google.cloud.datacatalog_v1beta1.types.policytagmanagerserialization import ( + ImportTaxonomiesRequest, +) +from google.cloud.datacatalog_v1beta1.types.policytagmanagerserialization import ( + ImportTaxonomiesResponse, +) +from google.cloud.datacatalog_v1beta1.types.policytagmanagerserialization import ( + InlineSource, +) +from 
google.cloud.datacatalog_v1beta1.types.policytagmanagerserialization import ( + SerializedPolicyTag, +) +from google.cloud.datacatalog_v1beta1.types.policytagmanagerserialization import ( + SerializedTaxonomy, +) +from google.cloud.datacatalog_v1beta1.types.schema import ColumnSchema +from google.cloud.datacatalog_v1beta1.types.schema import Schema +from google.cloud.datacatalog_v1beta1.types.search import SearchCatalogResult +from google.cloud.datacatalog_v1beta1.types.search import SearchResultType +from google.cloud.datacatalog_v1beta1.types.table_spec import BigQueryDateShardedSpec +from google.cloud.datacatalog_v1beta1.types.table_spec import BigQueryTableSpec +from google.cloud.datacatalog_v1beta1.types.table_spec import TableSourceType +from google.cloud.datacatalog_v1beta1.types.table_spec import TableSpec +from google.cloud.datacatalog_v1beta1.types.table_spec import ViewSpec +from google.cloud.datacatalog_v1beta1.types.tags import FieldType +from google.cloud.datacatalog_v1beta1.types.tags import Tag +from google.cloud.datacatalog_v1beta1.types.tags import TagField +from google.cloud.datacatalog_v1beta1.types.tags import TagTemplate +from google.cloud.datacatalog_v1beta1.types.tags import TagTemplateField +from google.cloud.datacatalog_v1beta1.types.timestamps import SystemTimestamps + +__all__ = ( + "BigQueryDateShardedSpec", + "BigQueryTableSpec", + "ColumnSchema", + "CreateEntryGroupRequest", + "CreateEntryRequest", + "CreatePolicyTagRequest", + "CreateTagRequest", + "CreateTagTemplateFieldRequest", + "CreateTagTemplateRequest", + "CreateTaxonomyRequest", + "DataCatalogAsyncClient", + "DataCatalogClient", + "DeleteEntryGroupRequest", + "DeleteEntryRequest", + "DeletePolicyTagRequest", + "DeleteTagRequest", + "DeleteTagTemplateFieldRequest", + "DeleteTagTemplateRequest", + "DeleteTaxonomyRequest", + "Entry", + "EntryGroup", + "EntryType", + "ExportTaxonomiesRequest", + "ExportTaxonomiesResponse", + "FieldType", + "GcsFileSpec", + "GcsFilesetSpec", + 
"GetEntryGroupRequest", + "GetEntryRequest", + "GetPolicyTagRequest", + "GetTagTemplateRequest", + "GetTaxonomyRequest", + "ImportTaxonomiesRequest", + "ImportTaxonomiesResponse", + "InlineSource", + "IntegratedSystem", + "ListEntriesRequest", + "ListEntriesResponse", + "ListEntryGroupsRequest", + "ListEntryGroupsResponse", + "ListPolicyTagsRequest", + "ListPolicyTagsResponse", + "ListTagsRequest", + "ListTagsResponse", + "ListTaxonomiesRequest", + "ListTaxonomiesResponse", + "LookupEntryRequest", + "PolicyTag", + "PolicyTagManagerAsyncClient", + "PolicyTagManagerClient", + "PolicyTagManagerSerializationAsyncClient", + "PolicyTagManagerSerializationClient", + "RenameTagTemplateFieldRequest", + "Schema", + "SearchCatalogRequest", + "SearchCatalogResponse", + "SearchCatalogResult", + "SearchResultType", + "SerializedPolicyTag", + "SerializedTaxonomy", + "SystemTimestamps", + "TableSourceType", + "TableSpec", + "Tag", + "TagField", + "TagTemplate", + "TagTemplateField", + "Taxonomy", + "UpdateEntryGroupRequest", + "UpdateEntryRequest", + "UpdatePolicyTagRequest", + "UpdateTagRequest", + "UpdateTagTemplateFieldRequest", + "UpdateTagTemplateRequest", + "UpdateTaxonomyRequest", + "ViewSpec", +) diff --git a/google/cloud/datacatalog/py.typed b/google/cloud/datacatalog/py.typed new file mode 100644 index 00000000..bb4088a3 --- /dev/null +++ b/google/cloud/datacatalog/py.typed @@ -0,0 +1,2 @@ +# Marker file for PEP 561. +# The google-cloud-datacatalog package uses inline types. diff --git a/google/cloud/datacatalog_v1/__init__.py b/google/cloud/datacatalog_v1/__init__.py index c4476222..734df087 100644 --- a/google/cloud/datacatalog_v1/__init__.py +++ b/google/cloud/datacatalog_v1/__init__.py @@ -1,41 +1,121 @@ # -*- coding: utf-8 -*- -# + # Copyright 2020 Google LLC # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. 
# You may obtain a copy of the License at # -# https://www.apache.org/licenses/LICENSE-2.0 +# http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. +# - -from __future__ import absolute_import -import sys -import warnings - -from google.cloud.datacatalog_v1 import types -from google.cloud.datacatalog_v1.gapic import data_catalog_client -from google.cloud.datacatalog_v1.gapic import enums - - -if sys.version_info[:2] == (2, 7): - message = ( - "A future version of this library will drop support for Python 2.7. " - "More details about Python 2 support for Google Cloud Client Libraries " - "can be found at https://cloud.google.com/python/docs/python2-sunset/" - ) - warnings.warn(message, DeprecationWarning) - - -class DataCatalogClient(data_catalog_client.DataCatalogClient): - __doc__ = data_catalog_client.DataCatalogClient.__doc__ - enums = enums +from .services.data_catalog import DataCatalogClient +from .types.common import IntegratedSystem +from .types.datacatalog import CreateEntryGroupRequest +from .types.datacatalog import CreateEntryRequest +from .types.datacatalog import CreateTagRequest +from .types.datacatalog import CreateTagTemplateFieldRequest +from .types.datacatalog import CreateTagTemplateRequest +from .types.datacatalog import DeleteEntryGroupRequest +from .types.datacatalog import DeleteEntryRequest +from .types.datacatalog import DeleteTagRequest +from .types.datacatalog import DeleteTagTemplateFieldRequest +from .types.datacatalog import DeleteTagTemplateRequest +from .types.datacatalog import Entry +from .types.datacatalog import EntryGroup +from .types.datacatalog import EntryType +from .types.datacatalog import GetEntryGroupRequest +from 
.types.datacatalog import GetEntryRequest +from .types.datacatalog import GetTagTemplateRequest +from .types.datacatalog import ListEntriesRequest +from .types.datacatalog import ListEntriesResponse +from .types.datacatalog import ListEntryGroupsRequest +from .types.datacatalog import ListEntryGroupsResponse +from .types.datacatalog import ListTagsRequest +from .types.datacatalog import ListTagsResponse +from .types.datacatalog import LookupEntryRequest +from .types.datacatalog import RenameTagTemplateFieldRequest +from .types.datacatalog import SearchCatalogRequest +from .types.datacatalog import SearchCatalogResponse +from .types.datacatalog import UpdateEntryGroupRequest +from .types.datacatalog import UpdateEntryRequest +from .types.datacatalog import UpdateTagRequest +from .types.datacatalog import UpdateTagTemplateFieldRequest +from .types.datacatalog import UpdateTagTemplateRequest +from .types.gcs_fileset_spec import GcsFileSpec +from .types.gcs_fileset_spec import GcsFilesetSpec +from .types.schema import ColumnSchema +from .types.schema import Schema +from .types.search import SearchCatalogResult +from .types.search import SearchResultType +from .types.table_spec import BigQueryDateShardedSpec +from .types.table_spec import BigQueryTableSpec +from .types.table_spec import TableSourceType +from .types.table_spec import TableSpec +from .types.table_spec import ViewSpec +from .types.tags import FieldType +from .types.tags import Tag +from .types.tags import TagField +from .types.tags import TagTemplate +from .types.tags import TagTemplateField +from .types.timestamps import SystemTimestamps -__all__ = ("enums", "types", "DataCatalogClient") +__all__ = ( + "BigQueryDateShardedSpec", + "BigQueryTableSpec", + "ColumnSchema", + "CreateEntryGroupRequest", + "CreateEntryRequest", + "CreateTagRequest", + "CreateTagTemplateFieldRequest", + "CreateTagTemplateRequest", + "DeleteEntryGroupRequest", + "DeleteEntryRequest", + "DeleteTagRequest", + 
"DeleteTagTemplateFieldRequest", + "DeleteTagTemplateRequest", + "Entry", + "EntryGroup", + "EntryType", + "FieldType", + "GcsFileSpec", + "GcsFilesetSpec", + "GetEntryGroupRequest", + "GetEntryRequest", + "GetTagTemplateRequest", + "IntegratedSystem", + "ListEntriesRequest", + "ListEntriesResponse", + "ListEntryGroupsRequest", + "ListEntryGroupsResponse", + "ListTagsRequest", + "ListTagsResponse", + "LookupEntryRequest", + "RenameTagTemplateFieldRequest", + "Schema", + "SearchCatalogRequest", + "SearchCatalogResponse", + "SearchCatalogResult", + "SearchResultType", + "SystemTimestamps", + "TableSourceType", + "TableSpec", + "Tag", + "TagField", + "TagTemplate", + "TagTemplateField", + "UpdateEntryGroupRequest", + "UpdateEntryRequest", + "UpdateTagRequest", + "UpdateTagTemplateFieldRequest", + "UpdateTagTemplateRequest", + "ViewSpec", + "DataCatalogClient", +) diff --git a/google/cloud/datacatalog_v1/gapic/__init__.py b/google/cloud/datacatalog_v1/gapic/__init__.py deleted file mode 100644 index e69de29b..00000000 diff --git a/google/cloud/datacatalog_v1/gapic/data_catalog_client.py b/google/cloud/datacatalog_v1/gapic/data_catalog_client.py deleted file mode 100644 index bcf39755..00000000 --- a/google/cloud/datacatalog_v1/gapic/data_catalog_client.py +++ /dev/null @@ -1,2725 +0,0 @@ -# -*- coding: utf-8 -*- -# -# Copyright 2020 Google LLC -# -# Licensed under the Apache License, Version 2.0 (the "License"); -# you may not use this file except in compliance with the License. -# You may obtain a copy of the License at -# -# https://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, software -# distributed under the License is distributed on an "AS IS" BASIS, -# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -# See the License for the specific language governing permissions and -# limitations under the License. 
- -"""Accesses the google.cloud.datacatalog.v1 DataCatalog API.""" - -import functools -import pkg_resources -import warnings - -from google.oauth2 import service_account -import google.api_core.client_options -import google.api_core.gapic_v1.client_info -import google.api_core.gapic_v1.config -import google.api_core.gapic_v1.method -import google.api_core.gapic_v1.routing_header -import google.api_core.grpc_helpers -import google.api_core.page_iterator -import google.api_core.path_template -import google.api_core.protobuf_helpers -import grpc - -from google.cloud.datacatalog_v1.gapic import data_catalog_client_config -from google.cloud.datacatalog_v1.gapic import enums -from google.cloud.datacatalog_v1.gapic.transports import data_catalog_grpc_transport -from google.cloud.datacatalog_v1.proto import datacatalog_pb2 -from google.cloud.datacatalog_v1.proto import datacatalog_pb2_grpc -from google.cloud.datacatalog_v1.proto import tags_pb2 -from google.iam.v1 import iam_policy_pb2 -from google.iam.v1 import options_pb2 -from google.iam.v1 import policy_pb2 -from google.protobuf import empty_pb2 -from google.protobuf import field_mask_pb2 - - -_GAPIC_LIBRARY_VERSION = pkg_resources.get_distribution( - "google-cloud-datacatalog" -).version - - -class DataCatalogClient(object): - """ - Data Catalog API service allows clients to discover, understand, and manage - their data. - """ - - SERVICE_ADDRESS = "datacatalog.googleapis.com:443" - """The default address of the service.""" - - # The name of the interface for this client. This is the key used to - # find the method configuration in the client_config dictionary. - _INTERFACE_NAME = "google.cloud.datacatalog.v1.DataCatalog" - - @classmethod - def from_service_account_file(cls, filename, *args, **kwargs): - """Creates an instance of this client using the provided credentials - file. - - Args: - filename (str): The path to the service account private key json - file. 
- args: Additional arguments to pass to the constructor. - kwargs: Additional arguments to pass to the constructor. - - Returns: - DataCatalogClient: The constructed client. - """ - credentials = service_account.Credentials.from_service_account_file(filename) - kwargs["credentials"] = credentials - return cls(*args, **kwargs) - - from_service_account_json = from_service_account_file - - @classmethod - def entry_path(cls, project, location, entry_group, entry): - """Return a fully-qualified entry string.""" - return google.api_core.path_template.expand( - "projects/{project}/locations/{location}/entryGroups/{entry_group}/entries/{entry}", - project=project, - location=location, - entry_group=entry_group, - entry=entry, - ) - - @classmethod - def entry_group_path(cls, project, location, entry_group): - """Return a fully-qualified entry_group string.""" - return google.api_core.path_template.expand( - "projects/{project}/locations/{location}/entryGroups/{entry_group}", - project=project, - location=location, - entry_group=entry_group, - ) - - @classmethod - def location_path(cls, project, location): - """Return a fully-qualified location string.""" - return google.api_core.path_template.expand( - "projects/{project}/locations/{location}", - project=project, - location=location, - ) - - @classmethod - def tag_path(cls, project, location, entry_group, entry, tag): - """Return a fully-qualified tag string.""" - return google.api_core.path_template.expand( - "projects/{project}/locations/{location}/entryGroups/{entry_group}/entries/{entry}/tags/{tag}", - project=project, - location=location, - entry_group=entry_group, - entry=entry, - tag=tag, - ) - - @classmethod - def tag_template_path(cls, project, location, tag_template): - """Return a fully-qualified tag_template string.""" - return google.api_core.path_template.expand( - "projects/{project}/locations/{location}/tagTemplates/{tag_template}", - project=project, - location=location, - tag_template=tag_template, - ) - - 
@classmethod - def tag_template_field_path(cls, project, location, tag_template, field): - """Return a fully-qualified tag_template_field string.""" - return google.api_core.path_template.expand( - "projects/{project}/locations/{location}/tagTemplates/{tag_template}/fields/{field}", - project=project, - location=location, - tag_template=tag_template, - field=field, - ) - - def __init__( - self, - transport=None, - channel=None, - credentials=None, - client_config=None, - client_info=None, - client_options=None, - ): - """Constructor. - - Args: - transport (Union[~.DataCatalogGrpcTransport, - Callable[[~.Credentials, type], ~.DataCatalogGrpcTransport]): A transport - instance, responsible for actually making the API calls. - The default transport uses the gRPC protocol. - This argument may also be a callable which returns a - transport instance. Callables will be sent the credentials - as the first argument and the default transport class as - the second argument. - channel (grpc.Channel): DEPRECATED. A ``Channel`` instance - through which to make calls. This argument is mutually exclusive - with ``credentials``; providing both will raise an exception. - credentials (google.auth.credentials.Credentials): The - authorization credentials to attach to requests. These - credentials identify this application to the service. If none - are specified, the client will attempt to ascertain the - credentials from the environment. - This argument is mutually exclusive with providing a - transport instance to ``transport``; doing so will raise - an exception. - client_config (dict): DEPRECATED. A dictionary of call options for - each method. If not specified, the default configuration is used. - client_info (google.api_core.gapic_v1.client_info.ClientInfo): - The client info used to send a user-agent string along with - API requests. If ``None``, then default info will be used. - Generally, you only need to set this if you're developing - your own client library. 
- client_options (Union[dict, google.api_core.client_options.ClientOptions]): - Client options used to set user options on the client. API Endpoint - should be set through client_options. - """ - # Raise deprecation warnings for things we want to go away. - if client_config is not None: - warnings.warn( - "The `client_config` argument is deprecated.", - PendingDeprecationWarning, - stacklevel=2, - ) - else: - client_config = data_catalog_client_config.config - - if channel: - warnings.warn( - "The `channel` argument is deprecated; use " "`transport` instead.", - PendingDeprecationWarning, - stacklevel=2, - ) - - api_endpoint = self.SERVICE_ADDRESS - if client_options: - if type(client_options) == dict: - client_options = google.api_core.client_options.from_dict( - client_options - ) - if client_options.api_endpoint: - api_endpoint = client_options.api_endpoint - - # Instantiate the transport. - # The transport is responsible for handling serialization and - # deserialization and actually sending data to the service. - if transport: - if callable(transport): - self.transport = transport( - credentials=credentials, - default_class=data_catalog_grpc_transport.DataCatalogGrpcTransport, - address=api_endpoint, - ) - else: - if credentials: - raise ValueError( - "Received both a transport instance and " - "credentials; these are mutually exclusive." - ) - self.transport = transport - else: - self.transport = data_catalog_grpc_transport.DataCatalogGrpcTransport( - address=api_endpoint, channel=channel, credentials=credentials - ) - - if client_info is None: - client_info = google.api_core.gapic_v1.client_info.ClientInfo( - gapic_version=_GAPIC_LIBRARY_VERSION - ) - else: - client_info.gapic_version = _GAPIC_LIBRARY_VERSION - self._client_info = client_info - - # Parse out the default settings for retry and timeout for each RPC - # from the client configuration. - # (Ordinarily, these are the defaults specified in the `*_config.py` - # file next to this one.) 
- self._method_configs = google.api_core.gapic_v1.config.parse_method_configs( - client_config["interfaces"][self._INTERFACE_NAME] - ) - - # Save a dictionary of cached API call functions. - # These are the actual callables which invoke the proper - # transport methods, wrapped with `wrap_method` to add retry, - # timeout, and the like. - self._inner_api_calls = {} - - # Service calls - def search_catalog( - self, - scope, - query, - page_size=None, - order_by=None, - retry=google.api_core.gapic_v1.method.DEFAULT, - timeout=google.api_core.gapic_v1.method.DEFAULT, - metadata=None, - ): - """ - Searches Data Catalog for multiple resources like entries, tags that - match a query. - - This is a custom method - (https://cloud.google.com/apis/design/custom_methods) and does not - return the complete resource, only the resource identifier and high - level fields. Clients can subsequentally call ``Get`` methods. - - Note that Data Catalog search queries do not guarantee full recall. - Query results that match your query may not be returned, even in - subsequent result pages. Also note that results returned (and not - returned) can vary across repeated search queries. - - See `Data Catalog Search - Syntax `__ - for more information. - - Example: - >>> from google.cloud import datacatalog_v1 - >>> - >>> client = datacatalog_v1.DataCatalogClient() - >>> - >>> # TODO: Initialize `scope`: - >>> scope = {} - >>> - >>> # TODO: Initialize `query`: - >>> query = '' - >>> - >>> # Iterate over all results - >>> for element in client.search_catalog(scope, query): - ... # process element - ... pass - >>> - >>> - >>> # Alternatively: - >>> - >>> # Iterate over results one page at a time - >>> for page in client.search_catalog(scope, query).pages: - ... for element in page: - ... # process element - ... pass - - Args: - scope (Union[dict, ~google.cloud.datacatalog_v1.types.Scope]): Required. The scope of this search request. 
A ``scope`` that has - empty ``include_org_ids``, ``include_project_ids`` AND false - ``include_gcp_public_datasets`` is considered invalid. Data Catalog will - return an error in such a case. - - If a dict is provided, it must be of the same form as the protobuf - message :class:`~google.cloud.datacatalog_v1.types.Scope` - query (str): Required. The query string in search query syntax. The query must be - non-empty. - - Query strings can be simple as "x" or more qualified as: - - - name:x - - column:x - - description:y - - Note: Query tokens need to have a minimum of 3 characters for substring - matching to work correctly. See `Data Catalog Search - Syntax `__ - for more information. - page_size (int): The maximum number of resources contained in the - underlying API response. If page streaming is performed per- - resource, this parameter does not affect the return value. If page - streaming is performed per-page, this determines the maximum number - of resources in a page. - order_by (str): Specifies the ordering of results, currently supported - case-sensitive choices are: - - - ``relevance``, only supports descending - - ``last_modified_timestamp [asc|desc]``, defaults to descending if not - specified - - If not specified, defaults to ``relevance`` descending. - retry (Optional[google.api_core.retry.Retry]): A retry object used - to retry requests. If ``None`` is specified, requests will - be retried using a default configuration. - timeout (Optional[float]): The amount of time, in seconds, to wait - for the request to complete. Note that if ``retry`` is - specified, the timeout applies to each individual attempt. - metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata - that is provided to the method. - - Returns: - A :class:`~google.api_core.page_iterator.PageIterator` instance. - An iterable of :class:`~google.cloud.datacatalog_v1.types.SearchCatalogResult` instances. 
- You can also iterate over the pages of the response - using its `pages` property. - - Raises: - google.api_core.exceptions.GoogleAPICallError: If the request - failed for any reason. - google.api_core.exceptions.RetryError: If the request failed due - to a retryable error and retry attempts failed. - ValueError: If the parameters are invalid. - """ - # Wrap the transport method to add retry and timeout logic. - if "search_catalog" not in self._inner_api_calls: - self._inner_api_calls[ - "search_catalog" - ] = google.api_core.gapic_v1.method.wrap_method( - self.transport.search_catalog, - default_retry=self._method_configs["SearchCatalog"].retry, - default_timeout=self._method_configs["SearchCatalog"].timeout, - client_info=self._client_info, - ) - - request = datacatalog_pb2.SearchCatalogRequest( - scope=scope, query=query, page_size=page_size, order_by=order_by - ) - iterator = google.api_core.page_iterator.GRPCIterator( - client=None, - method=functools.partial( - self._inner_api_calls["search_catalog"], - retry=retry, - timeout=timeout, - metadata=metadata, - ), - request=request, - items_field="results", - request_token_field="page_token", - response_token_field="next_page_token", - ) - return iterator - - def create_entry_group( - self, - parent, - entry_group_id, - entry_group=None, - retry=google.api_core.gapic_v1.method.DEFAULT, - timeout=google.api_core.gapic_v1.method.DEFAULT, - metadata=None, - ): - """ - Creates an EntryGroup. - - An entry group contains logically related entries together with Cloud - Identity and Access Management policies that specify the users who can - create, edit, and view entries within the entry group. - - Data Catalog automatically creates an entry group for BigQuery entries - ("@bigquery") and Pub/Sub topics ("@pubsub"). Users create their own - entry group to contain Cloud Storage fileset entries or custom type - entries, and the IAM policies associated with those entries. Entry - groups, like entries, can be searched. 
- - A maximum of 10,000 entry groups may be created per organization across - all locations. - - Users should enable the Data Catalog API in the project identified by - the ``parent`` parameter (see [Data Catalog Resource Project] - (https://cloud.google.com/data-catalog/docs/concepts/resource-project) - for more information). - - Example: - >>> from google.cloud import datacatalog_v1 - >>> - >>> client = datacatalog_v1.DataCatalogClient() - >>> - >>> parent = client.location_path('[PROJECT]', '[LOCATION]') - >>> - >>> # TODO: Initialize `entry_group_id`: - >>> entry_group_id = '' - >>> - >>> response = client.create_entry_group(parent, entry_group_id) - - Args: - parent (str): Required. The name of the project this entry group is in. Example: - - - projects/{project_id}/locations/{location} - - Note that this EntryGroup and its child resources may not actually be - stored in the location in this name. - entry_group_id (str): Required. The id of the entry group to create. - The id must begin with a letter or underscore, contain only English - letters, numbers and underscores, and be at most 64 characters. - entry_group (Union[dict, ~google.cloud.datacatalog_v1.types.EntryGroup]): The entry group to create. Defaults to an empty entry group. - - If a dict is provided, it must be of the same form as the protobuf - message :class:`~google.cloud.datacatalog_v1.types.EntryGroup` - retry (Optional[google.api_core.retry.Retry]): A retry object used - to retry requests. If ``None`` is specified, requests will - be retried using a default configuration. - timeout (Optional[float]): The amount of time, in seconds, to wait - for the request to complete. Note that if ``retry`` is - specified, the timeout applies to each individual attempt. - metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata - that is provided to the method. - - Returns: - A :class:`~google.cloud.datacatalog_v1.types.EntryGroup` instance. 
- - Raises: - google.api_core.exceptions.GoogleAPICallError: If the request - failed for any reason. - google.api_core.exceptions.RetryError: If the request failed due - to a retryable error and retry attempts failed. - ValueError: If the parameters are invalid. - """ - # Wrap the transport method to add retry and timeout logic. - if "create_entry_group" not in self._inner_api_calls: - self._inner_api_calls[ - "create_entry_group" - ] = google.api_core.gapic_v1.method.wrap_method( - self.transport.create_entry_group, - default_retry=self._method_configs["CreateEntryGroup"].retry, - default_timeout=self._method_configs["CreateEntryGroup"].timeout, - client_info=self._client_info, - ) - - request = datacatalog_pb2.CreateEntryGroupRequest( - parent=parent, entry_group_id=entry_group_id, entry_group=entry_group - ) - if metadata is None: - metadata = [] - metadata = list(metadata) - try: - routing_header = [("parent", parent)] - except AttributeError: - pass - else: - routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata( - routing_header - ) - metadata.append(routing_metadata) - - return self._inner_api_calls["create_entry_group"]( - request, retry=retry, timeout=timeout, metadata=metadata - ) - - def get_entry_group( - self, - name, - read_mask=None, - retry=google.api_core.gapic_v1.method.DEFAULT, - timeout=google.api_core.gapic_v1.method.DEFAULT, - metadata=None, - ): - """ - Gets an EntryGroup. - - Example: - >>> from google.cloud import datacatalog_v1 - >>> - >>> client = datacatalog_v1.DataCatalogClient() - >>> - >>> name = client.entry_group_path('[PROJECT]', '[LOCATION]', '[ENTRY_GROUP]') - >>> - >>> response = client.get_entry_group(name) - - Args: - name (str): Required. The name of the entry group. For example, - ``projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}``. - read_mask (Union[dict, ~google.cloud.datacatalog_v1.types.FieldMask]): The fields to return. If not set or empty, all fields are returned. 
- - If a dict is provided, it must be of the same form as the protobuf - message :class:`~google.cloud.datacatalog_v1.types.FieldMask` - retry (Optional[google.api_core.retry.Retry]): A retry object used - to retry requests. If ``None`` is specified, requests will - be retried using a default configuration. - timeout (Optional[float]): The amount of time, in seconds, to wait - for the request to complete. Note that if ``retry`` is - specified, the timeout applies to each individual attempt. - metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata - that is provided to the method. - - Returns: - A :class:`~google.cloud.datacatalog_v1.types.EntryGroup` instance. - - Raises: - google.api_core.exceptions.GoogleAPICallError: If the request - failed for any reason. - google.api_core.exceptions.RetryError: If the request failed due - to a retryable error and retry attempts failed. - ValueError: If the parameters are invalid. - """ - # Wrap the transport method to add retry and timeout logic. 
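The ``metadata`` parameter documented above is the channel through which the client attaches the ``x-goog-request-params`` routing header built from the resource ``name``. A minimal sketch of that encoding, assuming only that the header is a URL-encoded key/value pair; ``to_routing_metadata`` is a hypothetical stand-in for ``google.api_core.gapic_v1.routing_header.to_grpc_metadata``, not the real function:

```python
from urllib.parse import quote

def to_routing_metadata(params):
    # Hypothetical stand-in: each (field, value) pair is URL-encoded and
    # joined into a single "x-goog-request-params" metadata entry that is
    # appended to the per-call gRPC metadata.
    value = "&".join(
        "{}={}".format(key, quote(str(val), safe="")) for key, val in params
    )
    return ("x-goog-request-params", value)

header = to_routing_metadata(
    [("name", "projects/p/locations/us-central1/entryGroups/g")]
)
```

The server uses this header to route the request to the region that owns the resource, which is why the client rebuilds it for every resource-addressed call.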
- if "get_entry_group" not in self._inner_api_calls: - self._inner_api_calls[ - "get_entry_group" - ] = google.api_core.gapic_v1.method.wrap_method( - self.transport.get_entry_group, - default_retry=self._method_configs["GetEntryGroup"].retry, - default_timeout=self._method_configs["GetEntryGroup"].timeout, - client_info=self._client_info, - ) - - request = datacatalog_pb2.GetEntryGroupRequest(name=name, read_mask=read_mask) - if metadata is None: - metadata = [] - metadata = list(metadata) - try: - routing_header = [("name", name)] - except AttributeError: - pass - else: - routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata( - routing_header - ) - metadata.append(routing_metadata) - - return self._inner_api_calls["get_entry_group"]( - request, retry=retry, timeout=timeout, metadata=metadata - ) - - def update_entry_group( - self, - entry_group, - update_mask=None, - retry=google.api_core.gapic_v1.method.DEFAULT, - timeout=google.api_core.gapic_v1.method.DEFAULT, - metadata=None, - ): - """ - Updates an EntryGroup. The user should enable the Data Catalog API - in the project identified by the ``entry_group.name`` parameter (see - [Data Catalog Resource Project] - (https://cloud.google.com/data-catalog/docs/concepts/resource-project) - for more information). - - Example: - >>> from google.cloud import datacatalog_v1 - >>> - >>> client = datacatalog_v1.DataCatalogClient() - >>> - >>> # TODO: Initialize `entry_group`: - >>> entry_group = {} - >>> - >>> response = client.update_entry_group(entry_group) - - Args: - entry_group (Union[dict, ~google.cloud.datacatalog_v1.types.EntryGroup]): Required. The updated entry group. "name" field must be set. - - If a dict is provided, it must be of the same form as the protobuf - message :class:`~google.cloud.datacatalog_v1.types.EntryGroup` - update_mask (Union[dict, ~google.cloud.datacatalog_v1.types.FieldMask]): The fields to update on the entry group. 
If absent or empty, all modifiable - fields are updated. - - If a dict is provided, it must be of the same form as the protobuf - message :class:`~google.cloud.datacatalog_v1.types.FieldMask` - retry (Optional[google.api_core.retry.Retry]): A retry object used - to retry requests. If ``None`` is specified, requests will - be retried using a default configuration. - timeout (Optional[float]): The amount of time, in seconds, to wait - for the request to complete. Note that if ``retry`` is - specified, the timeout applies to each individual attempt. - metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata - that is provided to the method. - - Returns: - A :class:`~google.cloud.datacatalog_v1.types.EntryGroup` instance. - - Raises: - google.api_core.exceptions.GoogleAPICallError: If the request - failed for any reason. - google.api_core.exceptions.RetryError: If the request failed due - to a retryable error and retry attempts failed. - ValueError: If the parameters are invalid. - """ - # Wrap the transport method to add retry and timeout logic. 
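The ``update_mask`` semantics above (absent or empty means every modifiable field is updated; otherwise only the named fields change) can be sketched with plain dicts. ``apply_update_mask`` is an illustrative helper, not part of the library:

```python
def apply_update_mask(current, updated, paths=None):
    # With no paths (absent or empty mask), all fields are replaced;
    # otherwise only the top-level fields named in the mask are copied.
    result = dict(current)
    if not paths:
        result.update(updated)
    else:
        for path in paths:
            if path in updated:
                result[path] = updated[path]
    return result

group = {"name": "eg1", "display_name": "old", "description": "old desc"}
patched = apply_update_mask(
    group, {"display_name": "new", "description": "ignored"}, ["display_name"]
)
```

In the real API the mask is a ``FieldMask`` proto and the merge happens server-side, but the field-selection behavior follows the same shape.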
- if "update_entry_group" not in self._inner_api_calls: - self._inner_api_calls[ - "update_entry_group" - ] = google.api_core.gapic_v1.method.wrap_method( - self.transport.update_entry_group, - default_retry=self._method_configs["UpdateEntryGroup"].retry, - default_timeout=self._method_configs["UpdateEntryGroup"].timeout, - client_info=self._client_info, - ) - - request = datacatalog_pb2.UpdateEntryGroupRequest( - entry_group=entry_group, update_mask=update_mask - ) - if metadata is None: - metadata = [] - metadata = list(metadata) - try: - routing_header = [("entry_group.name", entry_group.name)] - except AttributeError: - pass - else: - routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata( - routing_header - ) - metadata.append(routing_metadata) - - return self._inner_api_calls["update_entry_group"]( - request, retry=retry, timeout=timeout, metadata=metadata - ) - - def delete_entry_group( - self, - name, - force=None, - retry=google.api_core.gapic_v1.method.DEFAULT, - timeout=google.api_core.gapic_v1.method.DEFAULT, - metadata=None, - ): - """ - Deletes an EntryGroup. Only entry groups that do not contain entries - can be deleted. Users should enable the Data Catalog API in the project - identified by the ``name`` parameter (see [Data Catalog Resource - Project] - (https://cloud.google.com/data-catalog/docs/concepts/resource-project) - for more information). - - Example: - >>> from google.cloud import datacatalog_v1 - >>> - >>> client = datacatalog_v1.DataCatalogClient() - >>> - >>> name = client.entry_group_path('[PROJECT]', '[LOCATION]', '[ENTRY_GROUP]') - >>> - >>> client.delete_entry_group(name) - - Args: - name (str): Required. The name of the entry group. For example, - ``projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}``. - force (bool): Optional. If true, deletes all entries in the entry group. - retry (Optional[google.api_core.retry.Retry]): A retry object used - to retry requests. 
If ``None`` is specified, requests will - be retried using a default configuration. - timeout (Optional[float]): The amount of time, in seconds, to wait - for the request to complete. Note that if ``retry`` is - specified, the timeout applies to each individual attempt. - metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata - that is provided to the method. - - Raises: - google.api_core.exceptions.GoogleAPICallError: If the request - failed for any reason. - google.api_core.exceptions.RetryError: If the request failed due - to a retryable error and retry attempts failed. - ValueError: If the parameters are invalid. - """ - # Wrap the transport method to add retry and timeout logic. - if "delete_entry_group" not in self._inner_api_calls: - self._inner_api_calls[ - "delete_entry_group" - ] = google.api_core.gapic_v1.method.wrap_method( - self.transport.delete_entry_group, - default_retry=self._method_configs["DeleteEntryGroup"].retry, - default_timeout=self._method_configs["DeleteEntryGroup"].timeout, - client_info=self._client_info, - ) - - request = datacatalog_pb2.DeleteEntryGroupRequest(name=name, force=force) - if metadata is None: - metadata = [] - metadata = list(metadata) - try: - routing_header = [("name", name)] - except AttributeError: - pass - else: - routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata( - routing_header - ) - metadata.append(routing_metadata) - - self._inner_api_calls["delete_entry_group"]( - request, retry=retry, timeout=timeout, metadata=metadata - ) - - def list_entry_groups( - self, - parent, - page_size=None, - retry=google.api_core.gapic_v1.method.DEFAULT, - timeout=google.api_core.gapic_v1.method.DEFAULT, - metadata=None, - ): - """ - Lists entry groups. 
- - Example: - >>> from google.cloud import datacatalog_v1 - >>> - >>> client = datacatalog_v1.DataCatalogClient() - >>> - >>> parent = client.entry_group_path('[PROJECT]', '[LOCATION]', '[ENTRY_GROUP]') - >>> - >>> # Iterate over all results - >>> for element in client.list_entry_groups(parent): - ... # process element - ... pass - >>> - >>> - >>> # Alternatively: - >>> - >>> # Iterate over results one page at a time - >>> for page in client.list_entry_groups(parent).pages: - ... for element in page: - ... # process element - ... pass - - Args: - parent (str): Required. The name of the location that contains the entry groups, - which can be provided in URL format. Example: - - - projects/{project_id}/locations/{location} - page_size (int): The maximum number of resources contained in the - underlying API response. If page streaming is performed per- - resource, this parameter does not affect the return value. If page - streaming is performed per-page, this determines the maximum number - of resources in a page. - retry (Optional[google.api_core.retry.Retry]): A retry object used - to retry requests. If ``None`` is specified, requests will - be retried using a default configuration. - timeout (Optional[float]): The amount of time, in seconds, to wait - for the request to complete. Note that if ``retry`` is - specified, the timeout applies to each individual attempt. - metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata - that is provided to the method. - - Returns: - A :class:`~google.api_core.page_iterator.PageIterator` instance. - An iterable of :class:`~google.cloud.datacatalog_v1.types.EntryGroup` instances. - You can also iterate over the pages of the response - using its `pages` property. - - Raises: - google.api_core.exceptions.GoogleAPICallError: If the request - failed for any reason. - google.api_core.exceptions.RetryError: If the request failed due - to a retryable error and retry attempts failed. 
- ValueError: If the parameters are invalid. - """ - # Wrap the transport method to add retry and timeout logic. - if "list_entry_groups" not in self._inner_api_calls: - self._inner_api_calls[ - "list_entry_groups" - ] = google.api_core.gapic_v1.method.wrap_method( - self.transport.list_entry_groups, - default_retry=self._method_configs["ListEntryGroups"].retry, - default_timeout=self._method_configs["ListEntryGroups"].timeout, - client_info=self._client_info, - ) - - request = datacatalog_pb2.ListEntryGroupsRequest( - parent=parent, page_size=page_size - ) - if metadata is None: - metadata = [] - metadata = list(metadata) - try: - routing_header = [("parent", parent)] - except AttributeError: - pass - else: - routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata( - routing_header - ) - metadata.append(routing_metadata) - - iterator = google.api_core.page_iterator.GRPCIterator( - client=None, - method=functools.partial( - self._inner_api_calls["list_entry_groups"], - retry=retry, - timeout=timeout, - metadata=metadata, - ), - request=request, - items_field="entry_groups", - request_token_field="page_token", - response_token_field="next_page_token", - ) - return iterator - - def create_entry( - self, - parent, - entry_id, - entry, - retry=google.api_core.gapic_v1.method.DEFAULT, - timeout=google.api_core.gapic_v1.method.DEFAULT, - metadata=None, - ): - """ - Creates an entry. Only entries of 'FILESET' type or user-specified - type can be created. - - Users should enable the Data Catalog API in the project identified by - the ``parent`` parameter (see [Data Catalog Resource Project] - (https://cloud.google.com/data-catalog/docs/concepts/resource-project) - for more information). - - A maximum of 100,000 entries may be created per entry group. 
- - Example: - >>> from google.cloud import datacatalog_v1 - >>> - >>> client = datacatalog_v1.DataCatalogClient() - >>> - >>> parent = client.entry_group_path('[PROJECT]', '[LOCATION]', '[ENTRY_GROUP]') - >>> - >>> # TODO: Initialize `entry_id`: - >>> entry_id = '' - >>> - >>> # TODO: Initialize `entry`: - >>> entry = {} - >>> - >>> response = client.create_entry(parent, entry_id, entry) - - Args: - parent (str): Required. The name of the entry group this entry is in. Example: - - - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id} - - Note that this Entry and its child resources may not actually be stored - in the location in this name. - entry_id (str): Required. The id of the entry to create. - entry (Union[dict, ~google.cloud.datacatalog_v1.types.Entry]): Required. The entry to create. - - If a dict is provided, it must be of the same form as the protobuf - message :class:`~google.cloud.datacatalog_v1.types.Entry` - retry (Optional[google.api_core.retry.Retry]): A retry object used - to retry requests. If ``None`` is specified, requests will - be retried using a default configuration. - timeout (Optional[float]): The amount of time, in seconds, to wait - for the request to complete. Note that if ``retry`` is - specified, the timeout applies to each individual attempt. - metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata - that is provided to the method. - - Returns: - A :class:`~google.cloud.datacatalog_v1.types.Entry` instance. - - Raises: - google.api_core.exceptions.GoogleAPICallError: If the request - failed for any reason. - google.api_core.exceptions.RetryError: If the request failed due - to a retryable error and retry attempts failed. - ValueError: If the parameters are invalid. - """ - # Wrap the transport method to add retry and timeout logic. 
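The ``# Wrap the transport method`` step that follows each docstring is a one-time memoization: the raw transport callable is wrapped with retry/timeout defaults on first use and cached in ``_inner_api_calls`` so later calls reuse the wrapped version. A toy sketch of the idea (class and method names are illustrative, not the real API):

```python
import functools

class LazyCallCache:
    # Toy model of the client's _inner_api_calls dict: wrap each
    # transport method once, then reuse the wrapped callable.
    def __init__(self):
        self._inner_api_calls = {}
        self.wrap_count = 0

    def _wrap(self, func):
        self.wrap_count += 1  # happens once per method name

        @functools.wraps(func)
        def wrapped(*args, **kwargs):
            # A real wrapper would add retry and timeout handling here.
            return func(*args, **kwargs)

        return wrapped

    def call(self, name, func, *args, **kwargs):
        if name not in self._inner_api_calls:
            self._inner_api_calls[name] = self._wrap(func)
        return self._inner_api_calls[name](*args, **kwargs)

cache = LazyCallCache()
first = cache.call("create_entry", lambda x: x.upper(), "entry")
second = cache.call("create_entry", lambda x: x.upper(), "entry")
```

Caching by method name keeps the per-method retry/timeout configuration lookup off the hot path after the first call.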
- if "create_entry" not in self._inner_api_calls: - self._inner_api_calls[ - "create_entry" - ] = google.api_core.gapic_v1.method.wrap_method( - self.transport.create_entry, - default_retry=self._method_configs["CreateEntry"].retry, - default_timeout=self._method_configs["CreateEntry"].timeout, - client_info=self._client_info, - ) - - request = datacatalog_pb2.CreateEntryRequest( - parent=parent, entry_id=entry_id, entry=entry - ) - if metadata is None: - metadata = [] - metadata = list(metadata) - try: - routing_header = [("parent", parent)] - except AttributeError: - pass - else: - routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata( - routing_header - ) - metadata.append(routing_metadata) - - return self._inner_api_calls["create_entry"]( - request, retry=retry, timeout=timeout, metadata=metadata - ) - - def update_entry( - self, - entry, - update_mask=None, - retry=google.api_core.gapic_v1.method.DEFAULT, - timeout=google.api_core.gapic_v1.method.DEFAULT, - metadata=None, - ): - """ - Updates an existing entry. Users should enable the Data Catalog API - in the project identified by the ``entry.name`` parameter (see [Data - Catalog Resource Project] - (https://cloud.google.com/data-catalog/docs/concepts/resource-project) - for more information). - - Example: - >>> from google.cloud import datacatalog_v1 - >>> - >>> client = datacatalog_v1.DataCatalogClient() - >>> - >>> # TODO: Initialize `entry`: - >>> entry = {} - >>> - >>> response = client.update_entry(entry) - - Args: - entry (Union[dict, ~google.cloud.datacatalog_v1.types.Entry]): Required. The updated entry. The "name" field must be set. - - If a dict is provided, it must be of the same form as the protobuf - message :class:`~google.cloud.datacatalog_v1.types.Entry` - update_mask (Union[dict, ~google.cloud.datacatalog_v1.types.FieldMask]): The fields to update on the entry. If absent or empty, all - modifiable fields are updated. 
- - The following fields are modifiable: - - - For entries with type ``DATA_STREAM``: - - - ``schema`` - - - For entries with type ``FILESET``: - - - ``schema`` - - ``display_name`` - - ``description`` - - ``gcs_fileset_spec`` - - ``gcs_fileset_spec.file_patterns`` - - - For entries with ``user_specified_type``: - - - ``schema`` - - ``display_name`` - - ``description`` - - ``user_specified_type`` - - ``user_specified_system`` - - ``linked_resource`` - - ``source_system_timestamps`` - - If a dict is provided, it must be of the same form as the protobuf - message :class:`~google.cloud.datacatalog_v1.types.FieldMask` - retry (Optional[google.api_core.retry.Retry]): A retry object used - to retry requests. If ``None`` is specified, requests will - be retried using a default configuration. - timeout (Optional[float]): The amount of time, in seconds, to wait - for the request to complete. Note that if ``retry`` is - specified, the timeout applies to each individual attempt. - metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata - that is provided to the method. - - Returns: - A :class:`~google.cloud.datacatalog_v1.types.Entry` instance. - - Raises: - google.api_core.exceptions.GoogleAPICallError: If the request - failed for any reason. - google.api_core.exceptions.RetryError: If the request failed due - to a retryable error and retry attempts failed. - ValueError: If the parameters are invalid. - """ - # Wrap the transport method to add retry and timeout logic. 
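The per-type lists of modifiable fields above lend themselves to a simple client-side check before sending an update. ``MODIFIABLE_FIELDS`` and ``validate_update_paths`` below are illustrative helpers condensed from the docstring, not part of the library, and cover only the two fixed entry types:

```python
MODIFIABLE_FIELDS = {
    # Condensed from the update_entry docstring; entries with a
    # user_specified_type allow additional fields not listed here.
    "DATA_STREAM": {"schema"},
    "FILESET": {
        "schema", "display_name", "description",
        "gcs_fileset_spec", "gcs_fileset_spec.file_patterns",
    },
}

def validate_update_paths(entry_type, paths):
    # Return the mask paths that are NOT modifiable for this entry type,
    # so a caller can fail fast instead of waiting for a server error.
    allowed = MODIFIABLE_FIELDS.get(entry_type, set())
    return [p for p in paths if p not in allowed]

bad = validate_update_paths("DATA_STREAM", ["schema", "display_name"])
```

The server remains the source of truth; a check like this only saves a round trip for masks that would certainly be rejected.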
- if "update_entry" not in self._inner_api_calls: - self._inner_api_calls[ - "update_entry" - ] = google.api_core.gapic_v1.method.wrap_method( - self.transport.update_entry, - default_retry=self._method_configs["UpdateEntry"].retry, - default_timeout=self._method_configs["UpdateEntry"].timeout, - client_info=self._client_info, - ) - - request = datacatalog_pb2.UpdateEntryRequest( - entry=entry, update_mask=update_mask - ) - if metadata is None: - metadata = [] - metadata = list(metadata) - try: - routing_header = [("entry.name", entry.name)] - except AttributeError: - pass - else: - routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata( - routing_header - ) - metadata.append(routing_metadata) - - return self._inner_api_calls["update_entry"]( - request, retry=retry, timeout=timeout, metadata=metadata - ) - - def delete_entry( - self, - name, - retry=google.api_core.gapic_v1.method.DEFAULT, - timeout=google.api_core.gapic_v1.method.DEFAULT, - metadata=None, - ): - """ - Deletes an existing entry. Only entries created through - ``CreateEntry`` method can be deleted. Users should enable the Data - Catalog API in the project identified by the ``name`` parameter (see - [Data Catalog Resource Project] - (https://cloud.google.com/data-catalog/docs/concepts/resource-project) - for more information). - - Example: - >>> from google.cloud import datacatalog_v1 - >>> - >>> client = datacatalog_v1.DataCatalogClient() - >>> - >>> name = client.entry_path('[PROJECT]', '[LOCATION]', '[ENTRY_GROUP]', '[ENTRY]') - >>> - >>> client.delete_entry(name) - - Args: - name (str): Required. The name of the entry. Example: - - - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id} - retry (Optional[google.api_core.retry.Retry]): A retry object used - to retry requests. If ``None`` is specified, requests will - be retried using a default configuration. 
- timeout (Optional[float]): The amount of time, in seconds, to wait - for the request to complete. Note that if ``retry`` is - specified, the timeout applies to each individual attempt. - metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata - that is provided to the method. - - Raises: - google.api_core.exceptions.GoogleAPICallError: If the request - failed for any reason. - google.api_core.exceptions.RetryError: If the request failed due - to a retryable error and retry attempts failed. - ValueError: If the parameters are invalid. - """ - # Wrap the transport method to add retry and timeout logic. - if "delete_entry" not in self._inner_api_calls: - self._inner_api_calls[ - "delete_entry" - ] = google.api_core.gapic_v1.method.wrap_method( - self.transport.delete_entry, - default_retry=self._method_configs["DeleteEntry"].retry, - default_timeout=self._method_configs["DeleteEntry"].timeout, - client_info=self._client_info, - ) - - request = datacatalog_pb2.DeleteEntryRequest(name=name) - if metadata is None: - metadata = [] - metadata = list(metadata) - try: - routing_header = [("name", name)] - except AttributeError: - pass - else: - routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata( - routing_header - ) - metadata.append(routing_metadata) - - self._inner_api_calls["delete_entry"]( - request, retry=retry, timeout=timeout, metadata=metadata - ) - - def get_entry( - self, - name, - retry=google.api_core.gapic_v1.method.DEFAULT, - timeout=google.api_core.gapic_v1.method.DEFAULT, - metadata=None, - ): - """ - Gets an entry. - - Example: - >>> from google.cloud import datacatalog_v1 - >>> - >>> client = datacatalog_v1.DataCatalogClient() - >>> - >>> name = client.entry_path('[PROJECT]', '[LOCATION]', '[ENTRY_GROUP]', '[ENTRY]') - >>> - >>> response = client.get_entry(name) - - Args: - name (str): Required. The name of the entry. 
Example: - - - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id} - retry (Optional[google.api_core.retry.Retry]): A retry object used - to retry requests. If ``None`` is specified, requests will - be retried using a default configuration. - timeout (Optional[float]): The amount of time, in seconds, to wait - for the request to complete. Note that if ``retry`` is - specified, the timeout applies to each individual attempt. - metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata - that is provided to the method. - - Returns: - A :class:`~google.cloud.datacatalog_v1.types.Entry` instance. - - Raises: - google.api_core.exceptions.GoogleAPICallError: If the request - failed for any reason. - google.api_core.exceptions.RetryError: If the request failed due - to a retryable error and retry attempts failed. - ValueError: If the parameters are invalid. - """ - # Wrap the transport method to add retry and timeout logic. - if "get_entry" not in self._inner_api_calls: - self._inner_api_calls[ - "get_entry" - ] = google.api_core.gapic_v1.method.wrap_method( - self.transport.get_entry, - default_retry=self._method_configs["GetEntry"].retry, - default_timeout=self._method_configs["GetEntry"].timeout, - client_info=self._client_info, - ) - - request = datacatalog_pb2.GetEntryRequest(name=name) - if metadata is None: - metadata = [] - metadata = list(metadata) - try: - routing_header = [("name", name)] - except AttributeError: - pass - else: - routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata( - routing_header - ) - metadata.append(routing_metadata) - - return self._inner_api_calls["get_entry"]( - request, retry=retry, timeout=timeout, metadata=metadata - ) - - def lookup_entry( - self, - linked_resource=None, - sql_resource=None, - retry=google.api_core.gapic_v1.method.DEFAULT, - timeout=google.api_core.gapic_v1.method.DEFAULT, - metadata=None, - ): - """ - Get an entry by target resource name. 
This method allows clients to use - the resource name from the source Google Cloud Platform service to get the - Data Catalog Entry. - - Example: - >>> from google.cloud import datacatalog_v1 - >>> - >>> client = datacatalog_v1.DataCatalogClient() - >>> - >>> response = client.lookup_entry() - - Args: - linked_resource (str): The full name of the Google Cloud Platform resource the Data Catalog - entry represents. See: - https://cloud.google.com/apis/design/resource_names#full_resource_name. - Full names are case-sensitive. - - Examples: - - - //bigquery.googleapis.com/projects/projectId/datasets/datasetId/tables/tableId - - //pubsub.googleapis.com/projects/projectId/topics/topicId - sql_resource (str): The SQL name of the entry. SQL names are case-sensitive. - - Examples: - - - ``pubsub.project_id.topic_id`` - - :literal:`pubsub.project_id.`topic.id.with.dots\`` - - ``bigquery.table.project_id.dataset_id.table_id`` - - ``bigquery.dataset.project_id.dataset_id`` - - ``datacatalog.entry.project_id.location_id.entry_group_id.entry_id`` - - ``*_id``\ s should satisfy the standard SQL rules for identifiers. - https://cloud.google.com/bigquery/docs/reference/standard-sql/lexical. - retry (Optional[google.api_core.retry.Retry]): A retry object used - to retry requests. If ``None`` is specified, requests will - be retried using a default configuration. - timeout (Optional[float]): The amount of time, in seconds, to wait - for the request to complete. Note that if ``retry`` is - specified, the timeout applies to each individual attempt. - metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata - that is provided to the method. - - Returns: - A :class:`~google.cloud.datacatalog_v1.types.Entry` instance. - - Raises: - google.api_core.exceptions.GoogleAPICallError: If the request - failed for any reason. - google.api_core.exceptions.RetryError: If the request failed due - to a retryable error and retry attempts failed. - ValueError: If the parameters are invalid. 
- """ - # Wrap the transport method to add retry and timeout logic. - if "lookup_entry" not in self._inner_api_calls: - self._inner_api_calls[ - "lookup_entry" - ] = google.api_core.gapic_v1.method.wrap_method( - self.transport.lookup_entry, - default_retry=self._method_configs["LookupEntry"].retry, - default_timeout=self._method_configs["LookupEntry"].timeout, - client_info=self._client_info, - ) - - # Sanity check: We have some fields which are mutually exclusive; - # raise ValueError if more than one is sent. - google.api_core.protobuf_helpers.check_oneof( - linked_resource=linked_resource, sql_resource=sql_resource - ) - - request = datacatalog_pb2.LookupEntryRequest( - linked_resource=linked_resource, sql_resource=sql_resource - ) - return self._inner_api_calls["lookup_entry"]( - request, retry=retry, timeout=timeout, metadata=metadata - ) - - def list_entries( - self, - parent, - page_size=None, - read_mask=None, - retry=google.api_core.gapic_v1.method.DEFAULT, - timeout=google.api_core.gapic_v1.method.DEFAULT, - metadata=None, - ): - """ - Lists entries. - - Example: - >>> from google.cloud import datacatalog_v1 - >>> - >>> client = datacatalog_v1.DataCatalogClient() - >>> - >>> parent = client.entry_group_path('[PROJECT]', '[LOCATION]', '[ENTRY_GROUP]') - >>> - >>> # Iterate over all results - >>> for element in client.list_entries(parent): - ... # process element - ... pass - >>> - >>> - >>> # Alternatively: - >>> - >>> # Iterate over results one page at a time - >>> for page in client.list_entries(parent).pages: - ... for element in page: - ... # process element - ... pass - - Args: - parent (str): Required. The name of the entry group that contains the entries, - which can be provided in URL format. Example: - - - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id} - page_size (int): The maximum number of resources contained in the - underlying API response. 
If page streaming is performed per- - resource, this parameter does not affect the return value. If page - streaming is performed per-page, this determines the maximum number - of resources in a page. - read_mask (Union[dict, ~google.cloud.datacatalog_v1.types.FieldMask]): The fields to return for each Entry. If not set or empty, all fields - are returned. For example, setting read_mask to contain only one path - "name" will cause ListEntries to return a list of Entries with only - "name" field. - - If a dict is provided, it must be of the same form as the protobuf - message :class:`~google.cloud.datacatalog_v1.types.FieldMask` - retry (Optional[google.api_core.retry.Retry]): A retry object used - to retry requests. If ``None`` is specified, requests will - be retried using a default configuration. - timeout (Optional[float]): The amount of time, in seconds, to wait - for the request to complete. Note that if ``retry`` is - specified, the timeout applies to each individual attempt. - metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata - that is provided to the method. - - Returns: - A :class:`~google.api_core.page_iterator.PageIterator` instance. - An iterable of :class:`~google.cloud.datacatalog_v1.types.Entry` instances. - You can also iterate over the pages of the response - using its `pages` property. - - Raises: - google.api_core.exceptions.GoogleAPICallError: If the request - failed for any reason. - google.api_core.exceptions.RetryError: If the request failed due - to a retryable error and retry attempts failed. - ValueError: If the parameters are invalid. - """ - # Wrap the transport method to add retry and timeout logic. 
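The ``GRPCIterator`` constructed below drives exactly this page-token loop: request a page, yield its items, and repeat until ``next_page_token`` comes back empty. A self-contained sketch of the protocol, where ``fake_fetch_page`` stands in for the wrapped ``ListEntries`` call:

```python
def iterate_items(fetch_page, page_size):
    # fetch_page(page_token, page_size) -> (items, next_page_token);
    # an empty next_page_token marks the final page.
    token = ""
    while True:
        items, token = fetch_page(token, page_size)
        for item in items:
            yield item
        if not token:
            break

ENTRIES = ["e1", "e2", "e3", "e4", "e5"]

def fake_fetch_page(token, page_size):
    # Toy backend: the token is just the next start index as a string.
    start = int(token or 0)
    end = start + page_size
    next_token = str(end) if end < len(ENTRIES) else ""
    return ENTRIES[start:end], next_token

result = list(iterate_items(fake_fetch_page, page_size=2))
```

This is why ``page_size`` only bounds the per-page response when streaming per-page: the iterator keeps fetching pages until the token runs out, so the total number of items returned is unaffected.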
- if "list_entries" not in self._inner_api_calls: - self._inner_api_calls[ - "list_entries" - ] = google.api_core.gapic_v1.method.wrap_method( - self.transport.list_entries, - default_retry=self._method_configs["ListEntries"].retry, - default_timeout=self._method_configs["ListEntries"].timeout, - client_info=self._client_info, - ) - - request = datacatalog_pb2.ListEntriesRequest( - parent=parent, page_size=page_size, read_mask=read_mask - ) - if metadata is None: - metadata = [] - metadata = list(metadata) - try: - routing_header = [("parent", parent)] - except AttributeError: - pass - else: - routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata( - routing_header - ) - metadata.append(routing_metadata) - - iterator = google.api_core.page_iterator.GRPCIterator( - client=None, - method=functools.partial( - self._inner_api_calls["list_entries"], - retry=retry, - timeout=timeout, - metadata=metadata, - ), - request=request, - items_field="entries", - request_token_field="page_token", - response_token_field="next_page_token", - ) - return iterator - - def create_tag_template( - self, - parent, - tag_template_id, - tag_template, - retry=google.api_core.gapic_v1.method.DEFAULT, - timeout=google.api_core.gapic_v1.method.DEFAULT, - metadata=None, - ): - """ - Creates a tag template. The user should enable the Data Catalog API - in the project identified by the ``parent`` parameter (see `Data Catalog - Resource - Project <https://cloud.google.com/data-catalog/docs/concepts/resource-project>`__ - for more information). - - Example: - >>> from google.cloud import datacatalog_v1 - >>> - >>> client = datacatalog_v1.DataCatalogClient() - >>> - >>> parent = client.location_path('[PROJECT]', '[LOCATION]') - >>> - >>> # TODO: Initialize `tag_template_id`: - >>> tag_template_id = '' - >>> - >>> # TODO: Initialize `tag_template`: - >>> tag_template = {} - >>> - >>> response = client.create_tag_template(parent, tag_template_id, tag_template) - - Args: - parent (str): Required. 
The name of the project and the template location - `region `__. - - Example: - - - projects/{project_id}/locations/us-central1 - tag_template_id (str): Required. The id of the tag template to create. - tag_template (Union[dict, ~google.cloud.datacatalog_v1.types.TagTemplate]): Required. The tag template to create. - - If a dict is provided, it must be of the same form as the protobuf - message :class:`~google.cloud.datacatalog_v1.types.TagTemplate` - retry (Optional[google.api_core.retry.Retry]): A retry object used - to retry requests. If ``None`` is specified, requests will - be retried using a default configuration. - timeout (Optional[float]): The amount of time, in seconds, to wait - for the request to complete. Note that if ``retry`` is - specified, the timeout applies to each individual attempt. - metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata - that is provided to the method. - - Returns: - A :class:`~google.cloud.datacatalog_v1.types.TagTemplate` instance. - - Raises: - google.api_core.exceptions.GoogleAPICallError: If the request - failed for any reason. - google.api_core.exceptions.RetryError: If the request failed due - to a retryable error and retry attempts failed. - ValueError: If the parameters are invalid. - """ - # Wrap the transport method to add retry and timeout logic. 
- if "create_tag_template" not in self._inner_api_calls: - self._inner_api_calls[ - "create_tag_template" - ] = google.api_core.gapic_v1.method.wrap_method( - self.transport.create_tag_template, - default_retry=self._method_configs["CreateTagTemplate"].retry, - default_timeout=self._method_configs["CreateTagTemplate"].timeout, - client_info=self._client_info, - ) - - request = datacatalog_pb2.CreateTagTemplateRequest( - parent=parent, tag_template_id=tag_template_id, tag_template=tag_template - ) - if metadata is None: - metadata = [] - metadata = list(metadata) - try: - routing_header = [("parent", parent)] - except AttributeError: - pass - else: - routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata( - routing_header - ) - metadata.append(routing_metadata) - - return self._inner_api_calls["create_tag_template"]( - request, retry=retry, timeout=timeout, metadata=metadata - ) - - def get_tag_template( - self, - name, - retry=google.api_core.gapic_v1.method.DEFAULT, - timeout=google.api_core.gapic_v1.method.DEFAULT, - metadata=None, - ): - """ - Gets a tag template. - - Example: - >>> from google.cloud import datacatalog_v1 - >>> - >>> client = datacatalog_v1.DataCatalogClient() - >>> - >>> name = client.tag_template_path('[PROJECT]', '[LOCATION]', '[TAG_TEMPLATE]') - >>> - >>> response = client.get_tag_template(name) - - Args: - name (str): Required. The name of the tag template. Example: - - - projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id} - retry (Optional[google.api_core.retry.Retry]): A retry object used - to retry requests. If ``None`` is specified, requests will - be retried using a default configuration. - timeout (Optional[float]): The amount of time, in seconds, to wait - for the request to complete. Note that if ``retry`` is - specified, the timeout applies to each individual attempt. - metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata - that is provided to the method. 
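Each removed method follows the same pattern: on first use it wraps its transport callable with default retry/timeout settings and caches the wrapper in `_inner_api_calls`, so later calls reuse it. A stripped-down sketch of that memoization, assuming a simplified wrapper that only applies a default timeout (the real `gapic_v1.method.wrap_method` also layers in retry and client-info metadata):

```python
class ClientSketch:
    """Caches wrapped transport callables, as the old GAPIC client did."""

    def __init__(self, transport_methods, default_timeout=60.0):
        self._transport = transport_methods   # method name -> raw callable
        self._default_timeout = default_timeout
        self._inner_api_calls = {}            # method name -> wrapped callable

    def _wrapped(self, name):
        # Wrap on first use only; subsequent calls hit the cache.
        if name not in self._inner_api_calls:
            inner = self._transport[name]

            def call(request, timeout=None):
                effective = timeout if timeout is not None else self._default_timeout
                # A real wrapper would also apply retry logic here.
                return inner(request, timeout=effective)

            self._inner_api_calls[name] = call
        return self._inner_api_calls[name]

client = ClientSketch({"get_tag_template": lambda req, timeout: (req, timeout)})
assert client._wrapped("get_tag_template") is client._wrapped("get_tag_template")
```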
- - Returns: - A :class:`~google.cloud.datacatalog_v1.types.TagTemplate` instance. - - Raises: - google.api_core.exceptions.GoogleAPICallError: If the request - failed for any reason. - google.api_core.exceptions.RetryError: If the request failed due - to a retryable error and retry attempts failed. - ValueError: If the parameters are invalid. - """ - # Wrap the transport method to add retry and timeout logic. - if "get_tag_template" not in self._inner_api_calls: - self._inner_api_calls[ - "get_tag_template" - ] = google.api_core.gapic_v1.method.wrap_method( - self.transport.get_tag_template, - default_retry=self._method_configs["GetTagTemplate"].retry, - default_timeout=self._method_configs["GetTagTemplate"].timeout, - client_info=self._client_info, - ) - - request = datacatalog_pb2.GetTagTemplateRequest(name=name) - if metadata is None: - metadata = [] - metadata = list(metadata) - try: - routing_header = [("name", name)] - except AttributeError: - pass - else: - routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata( - routing_header - ) - metadata.append(routing_metadata) - - return self._inner_api_calls["get_tag_template"]( - request, retry=retry, timeout=timeout, metadata=metadata - ) - - def update_tag_template( - self, - tag_template, - update_mask=None, - retry=google.api_core.gapic_v1.method.DEFAULT, - timeout=google.api_core.gapic_v1.method.DEFAULT, - metadata=None, - ): - """ - Updates a tag template. This method cannot be used to update the - fields of a template. The tag template fields are represented as - separate resources and should be updated using their own - create/update/delete methods. Users should enable the Data Catalog API - in the project identified by the ``tag_template.name`` parameter (see - [Data Catalog Resource Project] - (https://cloud.google.com/data-catalog/docs/concepts/resource-project) - for more information). 
- - Example: - >>> from google.cloud import datacatalog_v1 - >>> - >>> client = datacatalog_v1.DataCatalogClient() - >>> - >>> # TODO: Initialize `tag_template`: - >>> tag_template = {} - >>> - >>> response = client.update_tag_template(tag_template) - - Args: - tag_template (Union[dict, ~google.cloud.datacatalog_v1.types.TagTemplate]): Required. The template to update. The "name" field must be set. - - If a dict is provided, it must be of the same form as the protobuf - message :class:`~google.cloud.datacatalog_v1.types.TagTemplate` - update_mask (Union[dict, ~google.cloud.datacatalog_v1.types.FieldMask]): The field mask specifies the parts of the template to overwrite. - - Allowed fields: - - - ``display_name`` - - If absent or empty, all of the allowed fields above will be updated. - - If a dict is provided, it must be of the same form as the protobuf - message :class:`~google.cloud.datacatalog_v1.types.FieldMask` - retry (Optional[google.api_core.retry.Retry]): A retry object used - to retry requests. If ``None`` is specified, requests will - be retried using a default configuration. - timeout (Optional[float]): The amount of time, in seconds, to wait - for the request to complete. Note that if ``retry`` is - specified, the timeout applies to each individual attempt. - metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata - that is provided to the method. - - Returns: - A :class:`~google.cloud.datacatalog_v1.types.TagTemplate` instance. - - Raises: - google.api_core.exceptions.GoogleAPICallError: If the request - failed for any reason. - google.api_core.exceptions.RetryError: If the request failed due - to a retryable error and retry attempts failed. - ValueError: If the parameters are invalid. - """ - # Wrap the transport method to add retry and timeout logic. 
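Before each RPC, the removed code also builds an `x-goog-request-params` routing header from the request's resource name (via `google.api_core.gapic_v1.routing_header.to_grpc_metadata`) and appends it to the call metadata. A self-contained sketch of that construction; the helper name matches the `api_core` one, but the body here is an illustrative approximation of its key=value encoding:

```python
from urllib.parse import quote

def to_grpc_metadata(params):
    """Build the routing metadata tuple the old client appended per call."""
    value = "&".join(
        "{}={}".format(key, quote(val, safe="")) for key, val in params
    )
    return ("x-goog-request-params", value)

# As in the removed get_tag_template body: route on the resource name.
metadata = []
routing = to_grpc_metadata([("name", "projects/p/locations/l/tagTemplates/t")])
metadata.append(routing)
```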
- if "update_tag_template" not in self._inner_api_calls: - self._inner_api_calls[ - "update_tag_template" - ] = google.api_core.gapic_v1.method.wrap_method( - self.transport.update_tag_template, - default_retry=self._method_configs["UpdateTagTemplate"].retry, - default_timeout=self._method_configs["UpdateTagTemplate"].timeout, - client_info=self._client_info, - ) - - request = datacatalog_pb2.UpdateTagTemplateRequest( - tag_template=tag_template, update_mask=update_mask - ) - if metadata is None: - metadata = [] - metadata = list(metadata) - try: - routing_header = [("tag_template.name", tag_template.name)] - except AttributeError: - pass - else: - routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata( - routing_header - ) - metadata.append(routing_metadata) - - return self._inner_api_calls["update_tag_template"]( - request, retry=retry, timeout=timeout, metadata=metadata - ) - - def delete_tag_template( - self, - name, - force, - retry=google.api_core.gapic_v1.method.DEFAULT, - timeout=google.api_core.gapic_v1.method.DEFAULT, - metadata=None, - ): - """ - Deletes a tag template and all tags using the template. Users should - enable the Data Catalog API in the project identified by the ``name`` - parameter (see [Data Catalog Resource Project] - (https://cloud.google.com/data-catalog/docs/concepts/resource-project) - for more information). - - Example: - >>> from google.cloud import datacatalog_v1 - >>> - >>> client = datacatalog_v1.DataCatalogClient() - >>> - >>> name = client.tag_template_path('[PROJECT]', '[LOCATION]', '[TAG_TEMPLATE]') - >>> - >>> # TODO: Initialize `force`: - >>> force = False - >>> - >>> client.delete_tag_template(name, force) - - Args: - name (str): Required. The name of the tag template to delete. Example: - - - projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id} - force (bool): Required. Currently, this field must always be set to ``true``. 
This - confirms the deletion of any possible tags using this template. - ``force = false`` will be supported in the future. - retry (Optional[google.api_core.retry.Retry]): A retry object used - to retry requests. If ``None`` is specified, requests will - be retried using a default configuration. - timeout (Optional[float]): The amount of time, in seconds, to wait - for the request to complete. Note that if ``retry`` is - specified, the timeout applies to each individual attempt. - metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata - that is provided to the method. - - Raises: - google.api_core.exceptions.GoogleAPICallError: If the request - failed for any reason. - google.api_core.exceptions.RetryError: If the request failed due - to a retryable error and retry attempts failed. - ValueError: If the parameters are invalid. - """ - # Wrap the transport method to add retry and timeout logic. - if "delete_tag_template" not in self._inner_api_calls: - self._inner_api_calls[ - "delete_tag_template" - ] = google.api_core.gapic_v1.method.wrap_method( - self.transport.delete_tag_template, - default_retry=self._method_configs["DeleteTagTemplate"].retry, - default_timeout=self._method_configs["DeleteTagTemplate"].timeout, - client_info=self._client_info, - ) - - request = datacatalog_pb2.DeleteTagTemplateRequest(name=name, force=force) - if metadata is None: - metadata = [] - metadata = list(metadata) - try: - routing_header = [("name", name)] - except AttributeError: - pass - else: - routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata( - routing_header - ) - metadata.append(routing_metadata) - - self._inner_api_calls["delete_tag_template"]( - request, retry=retry, timeout=timeout, metadata=metadata - ) - - def create_tag_template_field( - self, - parent, - tag_template_field_id, - tag_template_field, - retry=google.api_core.gapic_v1.method.DEFAULT, - timeout=google.api_core.gapic_v1.method.DEFAULT, - metadata=None, - ): - """ - 
Creates a field in a tag template. The user should enable the Data - Catalog API in the project identified by the ``parent`` parameter (see - `Data Catalog Resource - Project <https://cloud.google.com/data-catalog/docs/concepts/resource-project>`__ - for more information). - - Example: - >>> from google.cloud import datacatalog_v1 - >>> - >>> client = datacatalog_v1.DataCatalogClient() - >>> - >>> parent = client.tag_template_path('[PROJECT]', '[LOCATION]', '[TAG_TEMPLATE]') - >>> - >>> # TODO: Initialize `tag_template_field_id`: - >>> tag_template_field_id = '' - >>> - >>> # TODO: Initialize `tag_template_field`: - >>> tag_template_field = {} - >>> - >>> response = client.create_tag_template_field(parent, tag_template_field_id, tag_template_field) - - Args: - parent (str): Required. The name of the project and the template location - `region `__. - - Example: - - - projects/{project_id}/locations/us-central1/tagTemplates/{tag_template_id} - tag_template_field_id (str): Required. The ID of the tag template field to create. Field ids can - contain letters (both uppercase and lowercase), numbers (0-9), - underscores (_) and dashes (-). Field IDs must be at least 1 character - long and at most 128 characters long. Field IDs must also be unique - within their template. - tag_template_field (Union[dict, ~google.cloud.datacatalog_v1.types.TagTemplateField]): Required. The tag template field to create. - - If a dict is provided, it must be of the same form as the protobuf - message :class:`~google.cloud.datacatalog_v1.types.TagTemplateField` - retry (Optional[google.api_core.retry.Retry]): A retry object used - to retry requests. If ``None`` is specified, requests will - be retried using a default configuration. - timeout (Optional[float]): The amount of time, in seconds, to wait - for the request to complete. Note that if ``retry`` is - specified, the timeout applies to each individual attempt. - metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata - that is provided to the method. 
- - Returns: - A :class:`~google.cloud.datacatalog_v1.types.TagTemplateField` instance. - - Raises: - google.api_core.exceptions.GoogleAPICallError: If the request - failed for any reason. - google.api_core.exceptions.RetryError: If the request failed due - to a retryable error and retry attempts failed. - ValueError: If the parameters are invalid. - """ - # Wrap the transport method to add retry and timeout logic. - if "create_tag_template_field" not in self._inner_api_calls: - self._inner_api_calls[ - "create_tag_template_field" - ] = google.api_core.gapic_v1.method.wrap_method( - self.transport.create_tag_template_field, - default_retry=self._method_configs["CreateTagTemplateField"].retry, - default_timeout=self._method_configs["CreateTagTemplateField"].timeout, - client_info=self._client_info, - ) - - request = datacatalog_pb2.CreateTagTemplateFieldRequest( - parent=parent, - tag_template_field_id=tag_template_field_id, - tag_template_field=tag_template_field, - ) - if metadata is None: - metadata = [] - metadata = list(metadata) - try: - routing_header = [("parent", parent)] - except AttributeError: - pass - else: - routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata( - routing_header - ) - metadata.append(routing_metadata) - - return self._inner_api_calls["create_tag_template_field"]( - request, retry=retry, timeout=timeout, metadata=metadata - ) - - def update_tag_template_field( - self, - name, - tag_template_field, - update_mask=None, - retry=google.api_core.gapic_v1.method.DEFAULT, - timeout=google.api_core.gapic_v1.method.DEFAULT, - metadata=None, - ): - """ - Updates a field in a tag template. This method cannot be used to - update the field type. Users should enable the Data Catalog API in the - project identified by the ``name`` parameter (see [Data Catalog Resource - Project] - (https://cloud.google.com/data-catalog/docs/concepts/resource-project) - for more information). 
- - Example: - >>> from google.cloud import datacatalog_v1 - >>> - >>> client = datacatalog_v1.DataCatalogClient() - >>> - >>> name = client.tag_template_field_path('[PROJECT]', '[LOCATION]', '[TAG_TEMPLATE]', '[FIELD]') - >>> - >>> # TODO: Initialize `tag_template_field`: - >>> tag_template_field = {} - >>> - >>> response = client.update_tag_template_field(name, tag_template_field) - - Args: - name (str): Required. The name of the tag template field. Example: - - - projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id}/fields/{tag_template_field_id} - tag_template_field (Union[dict, ~google.cloud.datacatalog_v1.types.TagTemplateField]): Required. The template to update. - - If a dict is provided, it must be of the same form as the protobuf - message :class:`~google.cloud.datacatalog_v1.types.TagTemplateField` - update_mask (Union[dict, ~google.cloud.datacatalog_v1.types.FieldMask]): Optional. The field mask specifies the parts of the template to be - updated. Allowed fields: - - - ``display_name`` - - ``type.enum_type`` - - ``is_required`` - - If ``update_mask`` is not set or empty, all of the allowed fields above - will be updated. - - When updating an enum type, the provided values will be merged with the - existing values. Therefore, enum values can only be added, existing enum - values cannot be deleted nor renamed. Updating a template field from - optional to required is NOT allowed. - - If a dict is provided, it must be of the same form as the protobuf - message :class:`~google.cloud.datacatalog_v1.types.FieldMask` - retry (Optional[google.api_core.retry.Retry]): A retry object used - to retry requests. If ``None`` is specified, requests will - be retried using a default configuration. - timeout (Optional[float]): The amount of time, in seconds, to wait - for the request to complete. Note that if ``retry`` is - specified, the timeout applies to each individual attempt. 
- metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata - that is provided to the method. - - Returns: - A :class:`~google.cloud.datacatalog_v1.types.TagTemplateField` instance. - - Raises: - google.api_core.exceptions.GoogleAPICallError: If the request - failed for any reason. - google.api_core.exceptions.RetryError: If the request failed due - to a retryable error and retry attempts failed. - ValueError: If the parameters are invalid. - """ - # Wrap the transport method to add retry and timeout logic. - if "update_tag_template_field" not in self._inner_api_calls: - self._inner_api_calls[ - "update_tag_template_field" - ] = google.api_core.gapic_v1.method.wrap_method( - self.transport.update_tag_template_field, - default_retry=self._method_configs["UpdateTagTemplateField"].retry, - default_timeout=self._method_configs["UpdateTagTemplateField"].timeout, - client_info=self._client_info, - ) - - request = datacatalog_pb2.UpdateTagTemplateFieldRequest( - name=name, tag_template_field=tag_template_field, update_mask=update_mask - ) - if metadata is None: - metadata = [] - metadata = list(metadata) - try: - routing_header = [("name", name)] - except AttributeError: - pass - else: - routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata( - routing_header - ) - metadata.append(routing_metadata) - - return self._inner_api_calls["update_tag_template_field"]( - request, retry=retry, timeout=timeout, metadata=metadata - ) - - def rename_tag_template_field( - self, - name, - new_tag_template_field_id, - retry=google.api_core.gapic_v1.method.DEFAULT, - timeout=google.api_core.gapic_v1.method.DEFAULT, - metadata=None, - ): - """ - Renames a field in a tag template. The user should enable the Data - Catalog API in the project identified by the ``name`` parameter (see - `Data Catalog Resource - Project <https://cloud.google.com/data-catalog/docs/concepts/resource-project>`__ - for more information). 
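The `update_mask` arguments documented above select which parts of a resource to overwrite, with an absent or empty mask meaning "update all allowed fields". A minimal dict-based sketch of that FieldMask semantics (plain dicts standing in for the protobuf messages; not the protobuf implementation):

```python
# Allowed paths for update_tag_template_field, per the docstring above.
ALLOWED_FIELDS = {"display_name", "type.enum_type", "is_required"}

def apply_update(existing, update, update_mask=None):
    """Overwrite only the masked fields; an empty/None mask means all allowed."""
    paths = set(update_mask) if update_mask else ALLOWED_FIELDS
    unknown = paths - ALLOWED_FIELDS
    if unknown:
        raise ValueError("fields not allowed in mask: {}".format(sorted(unknown)))
    merged = dict(existing)
    for path in paths:
        if path in update:
            merged[path] = update[path]
    return merged

field = {"display_name": "old", "is_required": False}
patched = apply_update(
    field,
    {"display_name": "new", "is_required": True},
    update_mask=["display_name"],
)
# Only display_name changes; is_required keeps its old value.
```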
- - Example: - >>> from google.cloud import datacatalog_v1 - >>> - >>> client = datacatalog_v1.DataCatalogClient() - >>> - >>> name = client.tag_template_field_path('[PROJECT]', '[LOCATION]', '[TAG_TEMPLATE]', '[FIELD]') - >>> - >>> # TODO: Initialize `new_tag_template_field_id`: - >>> new_tag_template_field_id = '' - >>> - >>> response = client.rename_tag_template_field(name, new_tag_template_field_id) - - Args: - name (str): Required. The name of the tag template. Example: - - - projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id}/fields/{tag_template_field_id} - new_tag_template_field_id (str): Required. The new ID of this tag template field. For example, - ``my_new_field``. - retry (Optional[google.api_core.retry.Retry]): A retry object used - to retry requests. If ``None`` is specified, requests will - be retried using a default configuration. - timeout (Optional[float]): The amount of time, in seconds, to wait - for the request to complete. Note that if ``retry`` is - specified, the timeout applies to each individual attempt. - metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata - that is provided to the method. - - Returns: - A :class:`~google.cloud.datacatalog_v1.types.TagTemplateField` instance. - - Raises: - google.api_core.exceptions.GoogleAPICallError: If the request - failed for any reason. - google.api_core.exceptions.RetryError: If the request failed due - to a retryable error and retry attempts failed. - ValueError: If the parameters are invalid. - """ - # Wrap the transport method to add retry and timeout logic. 
- if "rename_tag_template_field" not in self._inner_api_calls: - self._inner_api_calls[ - "rename_tag_template_field" - ] = google.api_core.gapic_v1.method.wrap_method( - self.transport.rename_tag_template_field, - default_retry=self._method_configs["RenameTagTemplateField"].retry, - default_timeout=self._method_configs["RenameTagTemplateField"].timeout, - client_info=self._client_info, - ) - - request = datacatalog_pb2.RenameTagTemplateFieldRequest( - name=name, new_tag_template_field_id=new_tag_template_field_id - ) - if metadata is None: - metadata = [] - metadata = list(metadata) - try: - routing_header = [("name", name)] - except AttributeError: - pass - else: - routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata( - routing_header - ) - metadata.append(routing_metadata) - - return self._inner_api_calls["rename_tag_template_field"]( - request, retry=retry, timeout=timeout, metadata=metadata - ) - - def delete_tag_template_field( - self, - name, - force, - retry=google.api_core.gapic_v1.method.DEFAULT, - timeout=google.api_core.gapic_v1.method.DEFAULT, - metadata=None, - ): - """ - Deletes a field in a tag template and all uses of that field. Users - should enable the Data Catalog API in the project identified by the - ``name`` parameter (see [Data Catalog Resource Project] - (https://cloud.google.com/data-catalog/docs/concepts/resource-project) - for more information). - - Example: - >>> from google.cloud import datacatalog_v1 - >>> - >>> client = datacatalog_v1.DataCatalogClient() - >>> - >>> name = client.tag_template_field_path('[PROJECT]', '[LOCATION]', '[TAG_TEMPLATE]', '[FIELD]') - >>> - >>> # TODO: Initialize `force`: - >>> force = False - >>> - >>> client.delete_tag_template_field(name, force) - - Args: - name (str): Required. The name of the tag template field to delete. Example: - - - projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id}/fields/{tag_template_field_id} - force (bool): Required. 
Currently, this field must always be set to ``true``. This - confirms the deletion of this field from any tags using this field. - ``force = false`` will be supported in the future. - retry (Optional[google.api_core.retry.Retry]): A retry object used - to retry requests. If ``None`` is specified, requests will - be retried using a default configuration. - timeout (Optional[float]): The amount of time, in seconds, to wait - for the request to complete. Note that if ``retry`` is - specified, the timeout applies to each individual attempt. - metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata - that is provided to the method. - - Raises: - google.api_core.exceptions.GoogleAPICallError: If the request - failed for any reason. - google.api_core.exceptions.RetryError: If the request failed due - to a retryable error and retry attempts failed. - ValueError: If the parameters are invalid. - """ - # Wrap the transport method to add retry and timeout logic. - if "delete_tag_template_field" not in self._inner_api_calls: - self._inner_api_calls[ - "delete_tag_template_field" - ] = google.api_core.gapic_v1.method.wrap_method( - self.transport.delete_tag_template_field, - default_retry=self._method_configs["DeleteTagTemplateField"].retry, - default_timeout=self._method_configs["DeleteTagTemplateField"].timeout, - client_info=self._client_info, - ) - - request = datacatalog_pb2.DeleteTagTemplateFieldRequest(name=name, force=force) - if metadata is None: - metadata = [] - metadata = list(metadata) - try: - routing_header = [("name", name)] - except AttributeError: - pass - else: - routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata( - routing_header - ) - metadata.append(routing_metadata) - - self._inner_api_calls["delete_tag_template_field"]( - request, retry=retry, timeout=timeout, metadata=metadata - ) - - def create_tag( - self, - parent, - tag, - retry=google.api_core.gapic_v1.method.DEFAULT, - 
timeout=google.api_core.gapic_v1.method.DEFAULT, - metadata=None, - ): - """ - Creates a tag on an ``Entry``. Note: The project identified by the - ``parent`` parameter for the - `tag `__ - and the `tag - template `__ - used to create the tag must be from the same organization. - - Example: - >>> from google.cloud import datacatalog_v1 - >>> - >>> client = datacatalog_v1.DataCatalogClient() - >>> - >>> parent = client.tag_path('[PROJECT]', '[LOCATION]', '[ENTRY_GROUP]', '[ENTRY]', '[TAG]') - >>> - >>> # TODO: Initialize `tag`: - >>> tag = {} - >>> - >>> response = client.create_tag(parent, tag) - - Args: - parent (str): Required. The name of the resource to attach this tag to. Tags can - be attached to Entries. Example: - - - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id} - - Note that this Tag and its child resources may not actually be stored in - the location in this name. - tag (Union[dict, ~google.cloud.datacatalog_v1.types.Tag]): Required. The tag to create. - - If a dict is provided, it must be of the same form as the protobuf - message :class:`~google.cloud.datacatalog_v1.types.Tag` - retry (Optional[google.api_core.retry.Retry]): A retry object used - to retry requests. If ``None`` is specified, requests will - be retried using a default configuration. - timeout (Optional[float]): The amount of time, in seconds, to wait - for the request to complete. Note that if ``retry`` is - specified, the timeout applies to each individual attempt. - metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata - that is provided to the method. - - Returns: - A :class:`~google.cloud.datacatalog_v1.types.Tag` instance. - - Raises: - google.api_core.exceptions.GoogleAPICallError: If the request - failed for any reason. - google.api_core.exceptions.RetryError: If the request failed due - to a retryable error and retry attempts failed. - ValueError: If the parameters are invalid. 
- """ - # Wrap the transport method to add retry and timeout logic. - if "create_tag" not in self._inner_api_calls: - self._inner_api_calls[ - "create_tag" - ] = google.api_core.gapic_v1.method.wrap_method( - self.transport.create_tag, - default_retry=self._method_configs["CreateTag"].retry, - default_timeout=self._method_configs["CreateTag"].timeout, - client_info=self._client_info, - ) - - request = datacatalog_pb2.CreateTagRequest(parent=parent, tag=tag) - if metadata is None: - metadata = [] - metadata = list(metadata) - try: - routing_header = [("parent", parent)] - except AttributeError: - pass - else: - routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata( - routing_header - ) - metadata.append(routing_metadata) - - return self._inner_api_calls["create_tag"]( - request, retry=retry, timeout=timeout, metadata=metadata - ) - - def update_tag( - self, - tag, - update_mask=None, - retry=google.api_core.gapic_v1.method.DEFAULT, - timeout=google.api_core.gapic_v1.method.DEFAULT, - metadata=None, - ): - """ - Updates an existing tag. - - Example: - >>> from google.cloud import datacatalog_v1 - >>> - >>> client = datacatalog_v1.DataCatalogClient() - >>> - >>> # TODO: Initialize `tag`: - >>> tag = {} - >>> - >>> response = client.update_tag(tag) - - Args: - tag (Union[dict, ~google.cloud.datacatalog_v1.types.Tag]): Required. The updated tag. The "name" field must be set. - - If a dict is provided, it must be of the same form as the protobuf - message :class:`~google.cloud.datacatalog_v1.types.Tag` - update_mask (Union[dict, ~google.cloud.datacatalog_v1.types.FieldMask]): The fields to update on the Tag. If absent or empty, all modifiable - fields are updated. Currently the only modifiable field is the field - ``fields``. - - If a dict is provided, it must be of the same form as the protobuf - message :class:`~google.cloud.datacatalog_v1.types.FieldMask` - retry (Optional[google.api_core.retry.Retry]): A retry object used - to retry requests. 
If ``None`` is specified, requests will - be retried using a default configuration. - timeout (Optional[float]): The amount of time, in seconds, to wait - for the request to complete. Note that if ``retry`` is - specified, the timeout applies to each individual attempt. - metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata - that is provided to the method. - - Returns: - A :class:`~google.cloud.datacatalog_v1.types.Tag` instance. - - Raises: - google.api_core.exceptions.GoogleAPICallError: If the request - failed for any reason. - google.api_core.exceptions.RetryError: If the request failed due - to a retryable error and retry attempts failed. - ValueError: If the parameters are invalid. - """ - # Wrap the transport method to add retry and timeout logic. - if "update_tag" not in self._inner_api_calls: - self._inner_api_calls[ - "update_tag" - ] = google.api_core.gapic_v1.method.wrap_method( - self.transport.update_tag, - default_retry=self._method_configs["UpdateTag"].retry, - default_timeout=self._method_configs["UpdateTag"].timeout, - client_info=self._client_info, - ) - - request = datacatalog_pb2.UpdateTagRequest(tag=tag, update_mask=update_mask) - if metadata is None: - metadata = [] - metadata = list(metadata) - try: - routing_header = [("tag.name", tag.name)] - except AttributeError: - pass - else: - routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata( - routing_header - ) - metadata.append(routing_metadata) - - return self._inner_api_calls["update_tag"]( - request, retry=retry, timeout=timeout, metadata=metadata - ) - - def delete_tag( - self, - name, - retry=google.api_core.gapic_v1.method.DEFAULT, - timeout=google.api_core.gapic_v1.method.DEFAULT, - metadata=None, - ): - """ - Deletes a tag. 
- - Example: - >>> from google.cloud import datacatalog_v1 - >>> - >>> client = datacatalog_v1.DataCatalogClient() - >>> - >>> name = client.entry_path('[PROJECT]', '[LOCATION]', '[ENTRY_GROUP]', '[ENTRY]') - >>> - >>> client.delete_tag(name) - - Args: - name (str): Required. The name of the tag to delete. Example: - - - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id}/tags/{tag_id} - retry (Optional[google.api_core.retry.Retry]): A retry object used - to retry requests. If ``None`` is specified, requests will - be retried using a default configuration. - timeout (Optional[float]): The amount of time, in seconds, to wait - for the request to complete. Note that if ``retry`` is - specified, the timeout applies to each individual attempt. - metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata - that is provided to the method. - - Raises: - google.api_core.exceptions.GoogleAPICallError: If the request - failed for any reason. - google.api_core.exceptions.RetryError: If the request failed due - to a retryable error and retry attempts failed. - ValueError: If the parameters are invalid. - """ - # Wrap the transport method to add retry and timeout logic. 
- if "delete_tag" not in self._inner_api_calls: - self._inner_api_calls[ - "delete_tag" - ] = google.api_core.gapic_v1.method.wrap_method( - self.transport.delete_tag, - default_retry=self._method_configs["DeleteTag"].retry, - default_timeout=self._method_configs["DeleteTag"].timeout, - client_info=self._client_info, - ) - - request = datacatalog_pb2.DeleteTagRequest(name=name) - if metadata is None: - metadata = [] - metadata = list(metadata) - try: - routing_header = [("name", name)] - except AttributeError: - pass - else: - routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata( - routing_header - ) - metadata.append(routing_metadata) - - self._inner_api_calls["delete_tag"]( - request, retry=retry, timeout=timeout, metadata=metadata - ) - - def list_tags( - self, - parent, - page_size=None, - retry=google.api_core.gapic_v1.method.DEFAULT, - timeout=google.api_core.gapic_v1.method.DEFAULT, - metadata=None, - ): - """ - Lists the tags on an ``Entry``. - - Example: - >>> from google.cloud import datacatalog_v1 - >>> - >>> client = datacatalog_v1.DataCatalogClient() - >>> - >>> parent = client.entry_path('[PROJECT]', '[LOCATION]', '[ENTRY_GROUP]', '[ENTRY]') - >>> - >>> # Iterate over all results - >>> for element in client.list_tags(parent): - ... # process element - ... pass - >>> - >>> - >>> # Alternatively: - >>> - >>> # Iterate over results one page at a time - >>> for page in client.list_tags(parent).pages: - ... for element in page: - ... # process element - ... pass - - Args: - parent (str): Required. The name of the Data Catalog resource to list the tags of. - The resource could be an ``Entry`` or an ``EntryGroup``. - - Examples: - - - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id} - - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id} - page_size (int): The maximum number of resources contained in the - underlying API response. 
If page streaming is performed per- - resource, this parameter does not affect the return value. If page - streaming is performed per-page, this determines the maximum number - of resources in a page. - retry (Optional[google.api_core.retry.Retry]): A retry object used - to retry requests. If ``None`` is specified, requests will - be retried using a default configuration. - timeout (Optional[float]): The amount of time, in seconds, to wait - for the request to complete. Note that if ``retry`` is - specified, the timeout applies to each individual attempt. - metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata - that is provided to the method. - - Returns: - A :class:`~google.api_core.page_iterator.PageIterator` instance. - An iterable of :class:`~google.cloud.datacatalog_v1.types.Tag` instances. - You can also iterate over the pages of the response - using its `pages` property. - - Raises: - google.api_core.exceptions.GoogleAPICallError: If the request - failed for any reason. - google.api_core.exceptions.RetryError: If the request failed due - to a retryable error and retry attempts failed. - ValueError: If the parameters are invalid. - """ - # Wrap the transport method to add retry and timeout logic. 
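The per-page streaming described above can be sketched without the real `GRPCIterator`: keep calling the wrapped method, feeding each `next_page_token` back as the next `page_token`. Here `fake_list_tags` is a hypothetical stand-in for the ListTags RPC:

```python
def fake_list_tags(request):
    # Hypothetical stand-in for the ListTags RPC: three tags, two per page.
    pages = {
        "": (["tag1", "tag2"], "page-2"),
        "page-2": (["tag3"], ""),
    }
    items, next_token = pages[request["page_token"]]
    return {"tags": items, "next_page_token": next_token}

def iterate_pages(method, request):
    # Mirrors the token-threading idea behind
    # google.api_core.page_iterator.GRPCIterator (items_field="tags",
    # request_token_field="page_token", response_token_field="next_page_token").
    while True:
        response = method(request)
        yield response["tags"]
        if not response["next_page_token"]:
            break
        request["page_token"] = response["next_page_token"]

all_tags = [t for page in iterate_pages(fake_list_tags, {"page_token": ""}) for t in page]
```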
- if "list_tags" not in self._inner_api_calls: - self._inner_api_calls[ - "list_tags" - ] = google.api_core.gapic_v1.method.wrap_method( - self.transport.list_tags, - default_retry=self._method_configs["ListTags"].retry, - default_timeout=self._method_configs["ListTags"].timeout, - client_info=self._client_info, - ) - - request = datacatalog_pb2.ListTagsRequest(parent=parent, page_size=page_size) - if metadata is None: - metadata = [] - metadata = list(metadata) - try: - routing_header = [("parent", parent)] - except AttributeError: - pass - else: - routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata( - routing_header - ) - metadata.append(routing_metadata) - - iterator = google.api_core.page_iterator.GRPCIterator( - client=None, - method=functools.partial( - self._inner_api_calls["list_tags"], - retry=retry, - timeout=timeout, - metadata=metadata, - ), - request=request, - items_field="tags", - request_token_field="page_token", - response_token_field="next_page_token", - ) - return iterator - - def set_iam_policy( - self, - resource, - policy, - retry=google.api_core.gapic_v1.method.DEFAULT, - timeout=google.api_core.gapic_v1.method.DEFAULT, - metadata=None, - ): - """ - Sets the access control policy for a resource. Replaces any existing - policy. Supported resources are: - - - Tag templates. - - Entries. - - Entry groups. Note, this method cannot be used to manage policies for - BigQuery, Pub/Sub and any external Google Cloud Platform resources - synced to Data Catalog. - - Callers must have the following Google IAM permissions: - - - ``datacatalog.tagTemplates.setIamPolicy`` to set policies on tag - templates. - - ``datacatalog.entries.setIamPolicy`` to set policies on entries. - - ``datacatalog.entryGroups.setIamPolicy`` to set policies on entry - groups. 
- - Example: - >>> from google.cloud import datacatalog_v1 - >>> - >>> client = datacatalog_v1.DataCatalogClient() - >>> - >>> # TODO: Initialize `resource`: - >>> resource = '' - >>> - >>> # TODO: Initialize `policy`: - >>> policy = {} - >>> - >>> response = client.set_iam_policy(resource, policy) - - Args: - resource (str): REQUIRED: The resource for which the policy is being specified. - See the operation documentation for the appropriate value for this field. - policy (Union[dict, ~google.cloud.datacatalog_v1.types.Policy]): REQUIRED: The complete policy to be applied to the ``resource``. The - size of the policy is limited to a few 10s of KB. An empty policy is a - valid policy but certain Cloud Platform services (such as Projects) - might reject them. - - If a dict is provided, it must be of the same form as the protobuf - message :class:`~google.cloud.datacatalog_v1.types.Policy` - retry (Optional[google.api_core.retry.Retry]): A retry object used - to retry requests. If ``None`` is specified, requests will - be retried using a default configuration. - timeout (Optional[float]): The amount of time, in seconds, to wait - for the request to complete. Note that if ``retry`` is - specified, the timeout applies to each individual attempt. - metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata - that is provided to the method. - - Returns: - A :class:`~google.cloud.datacatalog_v1.types.Policy` instance. - - Raises: - google.api_core.exceptions.GoogleAPICallError: If the request - failed for any reason. - google.api_core.exceptions.RetryError: If the request failed due - to a retryable error and retry attempts failed. - ValueError: If the parameters are invalid. - """ - # Wrap the transport method to add retry and timeout logic. 
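The `# Wrap the transport method to add retry and timeout logic` step that opens every implementation lazily wraps the transport stub once and caches it in `_inner_api_calls`, so defaults are baked in but still overridable per call. A minimal sketch of that wrap-and-cache idea, with a hypothetical `Client` and stand-in stub (not the real `wrap_method` signature):

```python
import functools

class Client:
    """Hypothetical sketch of the GAPIC wrap-and-cache pattern above."""

    def __init__(self):
        self._inner_api_calls = {}

    def _wrap_method(self, func, default_timeout):
        # Stand-in for google.api_core.gapic_v1.method.wrap_method: bake in
        # a default timeout that the caller can still override per call.
        @functools.wraps(func)
        def wrapped(request, timeout=default_timeout):
            return func(request, timeout)
        return wrapped

    def set_iam_policy(self, request, timeout=None):
        # Wrap on first use only; later calls reuse the cached wrapper.
        if "set_iam_policy" not in self._inner_api_calls:
            self._inner_api_calls["set_iam_policy"] = self._wrap_method(
                lambda req, t: (req, t),  # stand-in transport stub
                default_timeout=60.0,
            )
        call = self._inner_api_calls["set_iam_policy"]
        return call(request) if timeout is None else call(request, timeout=timeout)

client = Client()
```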
- if "set_iam_policy" not in self._inner_api_calls: - self._inner_api_calls[ - "set_iam_policy" - ] = google.api_core.gapic_v1.method.wrap_method( - self.transport.set_iam_policy, - default_retry=self._method_configs["SetIamPolicy"].retry, - default_timeout=self._method_configs["SetIamPolicy"].timeout, - client_info=self._client_info, - ) - - request = iam_policy_pb2.SetIamPolicyRequest(resource=resource, policy=policy) - if metadata is None: - metadata = [] - metadata = list(metadata) - try: - routing_header = [("resource", resource)] - except AttributeError: - pass - else: - routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata( - routing_header - ) - metadata.append(routing_metadata) - - return self._inner_api_calls["set_iam_policy"]( - request, retry=retry, timeout=timeout, metadata=metadata - ) - - def get_iam_policy( - self, - resource, - options_=None, - retry=google.api_core.gapic_v1.method.DEFAULT, - timeout=google.api_core.gapic_v1.method.DEFAULT, - metadata=None, - ): - """ - Gets the access control policy for a resource. A ``NOT_FOUND`` error - is returned if the resource does not exist. An empty policy is returned - if the resource exists but does not have a policy set on it. - - Supported resources are: - - - Tag templates. - - Entries. - - Entry groups. Note, this method cannot be used to manage policies for - BigQuery, Pub/Sub and any external Google Cloud Platform resources - synced to Data Catalog. - - Callers must have the following Google IAM permissions: - - - ``datacatalog.tagTemplates.getIamPolicy`` to get policies on tag - templates. - - ``datacatalog.entries.getIamPolicy`` to get policies on entries. - - ``datacatalog.entryGroups.getIamPolicy`` to get policies on entry - groups. 
- - Example: - >>> from google.cloud import datacatalog_v1 - >>> - >>> client = datacatalog_v1.DataCatalogClient() - >>> - >>> # TODO: Initialize `resource`: - >>> resource = '' - >>> - >>> response = client.get_iam_policy(resource) - - Args: - resource (str): REQUIRED: The resource for which the policy is being requested. - See the operation documentation for the appropriate value for this field. - options_ (Union[dict, ~google.cloud.datacatalog_v1.types.GetPolicyOptions]): OPTIONAL: A ``GetPolicyOptions`` object for specifying options to - ``GetIamPolicy``. This field is only used by Cloud IAM. - - If a dict is provided, it must be of the same form as the protobuf - message :class:`~google.cloud.datacatalog_v1.types.GetPolicyOptions` - retry (Optional[google.api_core.retry.Retry]): A retry object used - to retry requests. If ``None`` is specified, requests will - be retried using a default configuration. - timeout (Optional[float]): The amount of time, in seconds, to wait - for the request to complete. Note that if ``retry`` is - specified, the timeout applies to each individual attempt. - metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata - that is provided to the method. - - Returns: - A :class:`~google.cloud.datacatalog_v1.types.Policy` instance. - - Raises: - google.api_core.exceptions.GoogleAPICallError: If the request - failed for any reason. - google.api_core.exceptions.RetryError: If the request failed due - to a retryable error and retry attempts failed. - ValueError: If the parameters are invalid. - """ - # Wrap the transport method to add retry and timeout logic. 
- if "get_iam_policy" not in self._inner_api_calls: - self._inner_api_calls[ - "get_iam_policy" - ] = google.api_core.gapic_v1.method.wrap_method( - self.transport.get_iam_policy, - default_retry=self._method_configs["GetIamPolicy"].retry, - default_timeout=self._method_configs["GetIamPolicy"].timeout, - client_info=self._client_info, - ) - - request = iam_policy_pb2.GetIamPolicyRequest( - resource=resource, options=options_ - ) - if metadata is None: - metadata = [] - metadata = list(metadata) - try: - routing_header = [("resource", resource)] - except AttributeError: - pass - else: - routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata( - routing_header - ) - metadata.append(routing_metadata) - - return self._inner_api_calls["get_iam_policy"]( - request, retry=retry, timeout=timeout, metadata=metadata - ) - - def test_iam_permissions( - self, - resource, - permissions, - retry=google.api_core.gapic_v1.method.DEFAULT, - timeout=google.api_core.gapic_v1.method.DEFAULT, - metadata=None, - ): - """ - Returns the caller's permissions on a resource. If the resource does - not exist, an empty set of permissions is returned (We don't return a - ``NOT_FOUND`` error). - - Supported resources are: - - - Tag templates. - - Entries. - - Entry groups. Note, this method cannot be used to manage policies for - BigQuery, Pub/Sub and any external Google Cloud Platform resources - synced to Data Catalog. - - A caller is not required to have Google IAM permission to make this - request. - - Example: - >>> from google.cloud import datacatalog_v1 - >>> - >>> client = datacatalog_v1.DataCatalogClient() - >>> - >>> # TODO: Initialize `resource`: - >>> resource = '' - >>> - >>> # TODO: Initialize `permissions`: - >>> permissions = [] - >>> - >>> response = client.test_iam_permissions(resource, permissions) - - Args: - resource (str): REQUIRED: The resource for which the policy detail is being requested. 
- See the operation documentation for the appropriate value for this field. - permissions (list[str]): The set of permissions to check for the ``resource``. Permissions - with wildcards (such as '*' or 'storage.*') are not allowed. For more - information see `IAM - Overview `__. - retry (Optional[google.api_core.retry.Retry]): A retry object used - to retry requests. If ``None`` is specified, requests will - be retried using a default configuration. - timeout (Optional[float]): The amount of time, in seconds, to wait - for the request to complete. Note that if ``retry`` is - specified, the timeout applies to each individual attempt. - metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata - that is provided to the method. - - Returns: - A :class:`~google.cloud.datacatalog_v1.types.TestIamPermissionsResponse` instance. - - Raises: - google.api_core.exceptions.GoogleAPICallError: If the request - failed for any reason. - google.api_core.exceptions.RetryError: If the request failed due - to a retryable error and retry attempts failed. - ValueError: If the parameters are invalid. - """ - # Wrap the transport method to add retry and timeout logic. 
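The semantics described above (return the subset of permissions the caller holds rather than a ``NOT_FOUND`` error; wildcards are not allowed) can be sketched as a plain function. `check_permissions` here is a hypothetical local model of the RPC, not the real client method:

```python
def check_permissions(granted, requested):
    # Returns the subset of `requested` the caller actually holds, in request
    # order, mirroring TestIamPermissionsResponse.permissions. If the caller
    # holds nothing, the result is simply empty rather than an error.
    for p in requested:
        if "*" in p:
            raise ValueError(f"wildcard permissions are not allowed: {p}")
    held = set(granted)
    return [p for p in requested if p in held]

held = check_permissions(
    granted={"datacatalog.entries.get"},
    requested=["datacatalog.entries.get", "datacatalog.entries.update"],
)
```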
- if "test_iam_permissions" not in self._inner_api_calls: - self._inner_api_calls[ - "test_iam_permissions" - ] = google.api_core.gapic_v1.method.wrap_method( - self.transport.test_iam_permissions, - default_retry=self._method_configs["TestIamPermissions"].retry, - default_timeout=self._method_configs["TestIamPermissions"].timeout, - client_info=self._client_info, - ) - - request = iam_policy_pb2.TestIamPermissionsRequest( - resource=resource, permissions=permissions - ) - if metadata is None: - metadata = [] - metadata = list(metadata) - try: - routing_header = [("resource", resource)] - except AttributeError: - pass - else: - routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata( - routing_header - ) - metadata.append(routing_metadata) - - return self._inner_api_calls["test_iam_permissions"]( - request, retry=retry, timeout=timeout, metadata=metadata - ) diff --git a/google/cloud/datacatalog_v1/gapic/data_catalog_client_config.py b/google/cloud/datacatalog_v1/gapic/data_catalog_client_config.py deleted file mode 100644 index d59dc6a5..00000000 --- a/google/cloud/datacatalog_v1/gapic/data_catalog_client_config.py +++ /dev/null @@ -1,177 +0,0 @@ -config = { - "interfaces": { - "google.cloud.datacatalog.v1.DataCatalog": { - "retry_codes": { - "retry_policy_1_codes": ["UNAVAILABLE"], - "no_retry_codes": [], - "no_retry_1_codes": [], - }, - "retry_params": { - "retry_policy_1_params": { - "initial_retry_delay_millis": 100, - "retry_delay_multiplier": 1.3, - "max_retry_delay_millis": 60000, - "initial_rpc_timeout_millis": 60000, - "rpc_timeout_multiplier": 1.0, - "max_rpc_timeout_millis": 60000, - "total_timeout_millis": 60000, - }, - "no_retry_params": { - "initial_retry_delay_millis": 0, - "retry_delay_multiplier": 0.0, - "max_retry_delay_millis": 0, - "initial_rpc_timeout_millis": 0, - "rpc_timeout_multiplier": 1.0, - "max_rpc_timeout_millis": 0, - "total_timeout_millis": 0, - }, - "no_retry_1_params": { - "initial_retry_delay_millis": 0, - 
"retry_delay_multiplier": 0.0, - "max_retry_delay_millis": 0, - "initial_rpc_timeout_millis": 60000, - "rpc_timeout_multiplier": 1.0, - "max_rpc_timeout_millis": 60000, - "total_timeout_millis": 60000, - }, - }, - "methods": { - "SearchCatalog": { - "timeout_millis": 60000, - "retry_codes_name": "retry_policy_1_codes", - "retry_params_name": "retry_policy_1_params", - }, - "CreateEntryGroup": { - "timeout_millis": 60000, - "retry_codes_name": "no_retry_1_codes", - "retry_params_name": "no_retry_1_params", - }, - "GetEntryGroup": { - "timeout_millis": 60000, - "retry_codes_name": "retry_policy_1_codes", - "retry_params_name": "retry_policy_1_params", - }, - "UpdateEntryGroup": { - "timeout_millis": 60000, - "retry_codes_name": "no_retry_1_codes", - "retry_params_name": "no_retry_1_params", - }, - "DeleteEntryGroup": { - "timeout_millis": 60000, - "retry_codes_name": "no_retry_1_codes", - "retry_params_name": "no_retry_1_params", - }, - "ListEntryGroups": { - "timeout_millis": 60000, - "retry_codes_name": "retry_policy_1_codes", - "retry_params_name": "retry_policy_1_params", - }, - "CreateEntry": { - "timeout_millis": 60000, - "retry_codes_name": "no_retry_1_codes", - "retry_params_name": "no_retry_1_params", - }, - "UpdateEntry": { - "timeout_millis": 60000, - "retry_codes_name": "no_retry_1_codes", - "retry_params_name": "no_retry_1_params", - }, - "DeleteEntry": { - "timeout_millis": 60000, - "retry_codes_name": "no_retry_1_codes", - "retry_params_name": "no_retry_1_params", - }, - "GetEntry": { - "timeout_millis": 60000, - "retry_codes_name": "retry_policy_1_codes", - "retry_params_name": "retry_policy_1_params", - }, - "LookupEntry": { - "timeout_millis": 60000, - "retry_codes_name": "retry_policy_1_codes", - "retry_params_name": "retry_policy_1_params", - }, - "ListEntries": { - "timeout_millis": 60000, - "retry_codes_name": "retry_policy_1_codes", - "retry_params_name": "retry_policy_1_params", - }, - "CreateTagTemplate": { - "timeout_millis": 60000, - 
"retry_codes_name": "no_retry_1_codes", - "retry_params_name": "no_retry_1_params", - }, - "GetTagTemplate": { - "timeout_millis": 60000, - "retry_codes_name": "no_retry_1_codes", - "retry_params_name": "no_retry_1_params", - }, - "UpdateTagTemplate": { - "timeout_millis": 60000, - "retry_codes_name": "no_retry_1_codes", - "retry_params_name": "no_retry_1_params", - }, - "DeleteTagTemplate": { - "timeout_millis": 60000, - "retry_codes_name": "no_retry_1_codes", - "retry_params_name": "no_retry_1_params", - }, - "CreateTagTemplateField": { - "timeout_millis": 60000, - "retry_codes_name": "no_retry_1_codes", - "retry_params_name": "no_retry_1_params", - }, - "UpdateTagTemplateField": { - "timeout_millis": 60000, - "retry_codes_name": "no_retry_1_codes", - "retry_params_name": "no_retry_1_params", - }, - "RenameTagTemplateField": { - "timeout_millis": 60000, - "retry_codes_name": "no_retry_1_codes", - "retry_params_name": "no_retry_1_params", - }, - "DeleteTagTemplateField": { - "timeout_millis": 60000, - "retry_codes_name": "no_retry_1_codes", - "retry_params_name": "no_retry_1_params", - }, - "CreateTag": { - "timeout_millis": 60000, - "retry_codes_name": "no_retry_1_codes", - "retry_params_name": "no_retry_1_params", - }, - "UpdateTag": { - "timeout_millis": 60000, - "retry_codes_name": "no_retry_1_codes", - "retry_params_name": "no_retry_1_params", - }, - "DeleteTag": { - "timeout_millis": 60000, - "retry_codes_name": "no_retry_1_codes", - "retry_params_name": "no_retry_1_params", - }, - "ListTags": { - "timeout_millis": 60000, - "retry_codes_name": "retry_policy_1_codes", - "retry_params_name": "retry_policy_1_params", - }, - "SetIamPolicy": { - "timeout_millis": 60000, - "retry_codes_name": "no_retry_1_codes", - "retry_params_name": "no_retry_1_params", - }, - "GetIamPolicy": { - "timeout_millis": 60000, - "retry_codes_name": "retry_policy_1_codes", - "retry_params_name": "retry_policy_1_params", - }, - "TestIamPermissions": { - "timeout_millis": 60000, - 
"retry_codes_name": "no_retry_1_codes", - "retry_params_name": "no_retry_1_params", - }, - }, - } - } -} diff --git a/google/cloud/datacatalog_v1/gapic/enums.py b/google/cloud/datacatalog_v1/gapic/enums.py deleted file mode 100644 index b5c584ba..00000000 --- a/google/cloud/datacatalog_v1/gapic/enums.py +++ /dev/null @@ -1,110 +0,0 @@ -# -*- coding: utf-8 -*- -# -# Copyright 2020 Google LLC -# -# Licensed under the Apache License, Version 2.0 (the "License"); -# you may not use this file except in compliance with the License. -# You may obtain a copy of the License at -# -# https://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, software -# distributed under the License is distributed on an "AS IS" BASIS, -# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -# See the License for the specific language governing permissions and -# limitations under the License. - -"""Wrappers for protocol buffer enum types.""" - -import enum - - -class EntryType(enum.IntEnum): - """ - Entry resources in Data Catalog can be of different types e.g. a - BigQuery Table entry is of type ``TABLE``. This enum describes all the - possible types Data Catalog contains. - - Attributes: - ENTRY_TYPE_UNSPECIFIED (int): Default unknown type. - TABLE (int): Output only. The type of entry that has a GoogleSQL schema, including - logical views. - MODEL (int): Output only. The type of models, examples include - https://cloud.google.com/bigquery-ml/docs/bigqueryml-intro - DATA_STREAM (int): Output only. An entry type which is used for streaming entries. Example: - Pub/Sub topic. - FILESET (int): An entry type which is a set of files or objects. Example: - Cloud Storage fileset. - """ - - ENTRY_TYPE_UNSPECIFIED = 0 - TABLE = 2 - MODEL = 5 - DATA_STREAM = 3 - FILESET = 4 - - -class IntegratedSystem(enum.IntEnum): - """ - This enum describes all the possible systems that Data Catalog integrates - with. 
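The deleted `retry_policy_1_params` block encodes an exponential backoff schedule: start at `initial_retry_delay_millis` (100 ms), multiply each retry's wait by `retry_delay_multiplier` (1.3), and cap at `max_retry_delay_millis` (60000 ms). A sketch of the implied delay schedule (real clients also add random jitter, omitted here):

```python
def backoff_delays(initial_ms=100, multiplier=1.3, max_ms=60000, attempts=6):
    # Delay schedule implied by retry_policy_1_params: each retry waits
    # `multiplier` times longer than the last, capped at max_retry_delay_millis.
    delays, delay = [], float(initial_ms)
    for _ in range(attempts):
        delays.append(round(delay))
        delay = min(delay * multiplier, max_ms)
    return delays

schedule = backoff_delays()
```

With the configured values the first six waits grow from 100 ms toward (but here nowhere near) the 60-second cap.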
- - Attributes: - INTEGRATED_SYSTEM_UNSPECIFIED (int): Default unknown system. - BIGQUERY (int): BigQuery. - CLOUD_PUBSUB (int): Cloud Pub/Sub. - """ - - INTEGRATED_SYSTEM_UNSPECIFIED = 0 - BIGQUERY = 1 - CLOUD_PUBSUB = 2 - - -class SearchResultType(enum.IntEnum): - """ - The different types of resources that can be returned in search. - - Attributes: - SEARCH_RESULT_TYPE_UNSPECIFIED (int): Default unknown type. - ENTRY (int): An ``Entry``. - TAG_TEMPLATE (int): A ``TagTemplate``. - ENTRY_GROUP (int): An ``EntryGroup``. - """ - - SEARCH_RESULT_TYPE_UNSPECIFIED = 0 - ENTRY = 1 - TAG_TEMPLATE = 2 - ENTRY_GROUP = 3 - - -class TableSourceType(enum.IntEnum): - """ - Table source type. - - Attributes: - TABLE_SOURCE_TYPE_UNSPECIFIED (int): Default unknown type. - BIGQUERY_VIEW (int): Table view. - BIGQUERY_TABLE (int): BigQuery native table. - """ - - TABLE_SOURCE_TYPE_UNSPECIFIED = 0 - BIGQUERY_VIEW = 2 - BIGQUERY_TABLE = 5 - - -class FieldType(object): - class PrimitiveType(enum.IntEnum): - """ - Attributes: - PRIMITIVE_TYPE_UNSPECIFIED (int): This is the default invalid value for a type. - DOUBLE (int): A double precision number. - STRING (int): A UTF-8 string. - BOOL (int): A boolean value. - TIMESTAMP (int): A timestamp. 
- """ - - PRIMITIVE_TYPE_UNSPECIFIED = 0 - DOUBLE = 1 - STRING = 2 - BOOL = 3 - TIMESTAMP = 4 diff --git a/google/cloud/datacatalog_v1/gapic/transports/__init__.py b/google/cloud/datacatalog_v1/gapic/transports/__init__.py deleted file mode 100644 index e69de29b..00000000 diff --git a/google/cloud/datacatalog_v1/gapic/transports/data_catalog_grpc_transport.py b/google/cloud/datacatalog_v1/gapic/transports/data_catalog_grpc_transport.py deleted file mode 100644 index df44fac7..00000000 --- a/google/cloud/datacatalog_v1/gapic/transports/data_catalog_grpc_transport.py +++ /dev/null @@ -1,603 +0,0 @@ -# -*- coding: utf-8 -*- -# -# Copyright 2020 Google LLC -# -# Licensed under the Apache License, Version 2.0 (the "License"); -# you may not use this file except in compliance with the License. -# You may obtain a copy of the License at -# -# https://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, software -# distributed under the License is distributed on an "AS IS" BASIS, -# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -# See the License for the specific language governing permissions and -# limitations under the License. - - -import google.api_core.grpc_helpers - -from google.cloud.datacatalog_v1.proto import datacatalog_pb2_grpc - - -class DataCatalogGrpcTransport(object): - """gRPC transport class providing stubs for - google.cloud.datacatalog.v1 DataCatalog API. - - The transport provides access to the raw gRPC stubs, - which can be used to take advantage of advanced - features of gRPC. - """ - - # The scopes needed to make gRPC calls to all of the methods defined - # in this service. - _OAUTH_SCOPES = ("https://www.googleapis.com/auth/cloud-platform",) - - def __init__( - self, channel=None, credentials=None, address="datacatalog.googleapis.com:443" - ): - """Instantiate the transport class. - - Args: - channel (grpc.Channel): A ``Channel`` instance through - which to make calls. 
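The deleted `enums.py` wrappers are plain `enum.IntEnum` classes whose members compare equal to the raw protobuf integers, which is how values round-trip off the wire. A self-contained sketch using the `EntryType` values shown above:

```python
import enum

class EntryType(enum.IntEnum):
    # Values copied from the deleted enums.py; note they are not contiguous
    # (TABLE is 2, MODEL is 5) because proto enum numbers are stable IDs.
    ENTRY_TYPE_UNSPECIFIED = 0
    TABLE = 2
    MODEL = 5
    DATA_STREAM = 3
    FILESET = 4

# An integer from a protobuf field converts to the symbolic member,
# and members still compare equal to plain ints.
wire_value = 4
entry_type = EntryType(wire_value)
```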
This argument is mutually exclusive - with ``credentials``; providing both will raise an exception. - credentials (google.auth.credentials.Credentials): The - authorization credentials to attach to requests. These - credentials identify this application to the service. If none - are specified, the client will attempt to ascertain the - credentials from the environment. - address (str): The address where the service is hosted. - """ - # If both `channel` and `credentials` are specified, raise an - # exception (channels come with credentials baked in already). - if channel is not None and credentials is not None: - raise ValueError( - "The `channel` and `credentials` arguments are mutually " "exclusive." - ) - - # Create the channel. - if channel is None: - channel = self.create_channel( - address=address, - credentials=credentials, - options={ - "grpc.max_send_message_length": -1, - "grpc.max_receive_message_length": -1, - }.items(), - ) - - self._channel = channel - - # gRPC uses objects called "stubs" that are bound to the - # channel and provide a basic method for each RPC. - self._stubs = { - "data_catalog_stub": datacatalog_pb2_grpc.DataCatalogStub(channel) - } - - @classmethod - def create_channel( - cls, address="datacatalog.googleapis.com:443", credentials=None, **kwargs - ): - """Create and return a gRPC channel object. - - Args: - address (str): The host for the channel to use. - credentials (~.Credentials): The - authorization credentials to attach to requests. These - credentials identify this application to the service. If - none are specified, the client will attempt to ascertain - the credentials from the environment. - kwargs (dict): Keyword arguments, which are passed to the - channel creation. - - Returns: - grpc.Channel: A gRPC channel object. 
- """ - return google.api_core.grpc_helpers.create_channel( - address, credentials=credentials, scopes=cls._OAUTH_SCOPES, **kwargs - ) - - @property - def channel(self): - """The gRPC channel used by the transport. - - Returns: - grpc.Channel: A gRPC channel object. - """ - return self._channel - - @property - def search_catalog(self): - """Return the gRPC stub for :meth:`DataCatalogClient.search_catalog`. - - Searches Data Catalog for multiple resources like entries, tags that - match a query. - - This is a custom method - (https://cloud.google.com/apis/design/custom_methods) and does not - return the complete resource, only the resource identifier and high - level fields. Clients can subsequently call ``Get`` methods. - - Note that Data Catalog search queries do not guarantee full recall. - Query results that match your query may not be returned, even in - subsequent result pages. Also note that results returned (and not - returned) can vary across repeated search queries. - - See `Data Catalog Search - Syntax `__ - for more information. - - Returns: - Callable: A callable which accepts the appropriate - deserialized request object and returns a - deserialized response object. - """ - return self._stubs["data_catalog_stub"].SearchCatalog - - @property - def create_entry_group(self): - """Return the gRPC stub for :meth:`DataCatalogClient.create_entry_group`. - - Creates an EntryGroup. - - An entry group contains logically related entries together with Cloud - Identity and Access Management policies that specify the users who can - create, edit, and view entries within the entry group. - - Data Catalog automatically creates an entry group for BigQuery entries - ("@bigquery") and Pub/Sub topics ("@pubsub"). Users create their own - entry group to contain Cloud Storage fileset entries or custom type - entries, and the IAM policies associated with those entries. Entry - groups, like entries, can be searched. 
- - A maximum of 10,000 entry groups may be created per organization across - all locations. - - Users should enable the Data Catalog API in the project identified by - the ``parent`` parameter (see [Data Catalog Resource Project] - (https://cloud.google.com/data-catalog/docs/concepts/resource-project) - for more information). - - Returns: - Callable: A callable which accepts the appropriate - deserialized request object and returns a - deserialized response object. - """ - return self._stubs["data_catalog_stub"].CreateEntryGroup - - @property - def get_entry_group(self): - """Return the gRPC stub for :meth:`DataCatalogClient.get_entry_group`. - - Gets an EntryGroup. - - Returns: - Callable: A callable which accepts the appropriate - deserialized request object and returns a - deserialized response object. - """ - return self._stubs["data_catalog_stub"].GetEntryGroup - - @property - def update_entry_group(self): - """Return the gRPC stub for :meth:`DataCatalogClient.update_entry_group`. - - Updates an EntryGroup. The user should enable the Data Catalog API - in the project identified by the ``entry_group.name`` parameter (see - [Data Catalog Resource Project] - (https://cloud.google.com/data-catalog/docs/concepts/resource-project) - for more information). - - Returns: - Callable: A callable which accepts the appropriate - deserialized request object and returns a - deserialized response object. - """ - return self._stubs["data_catalog_stub"].UpdateEntryGroup - - @property - def delete_entry_group(self): - """Return the gRPC stub for :meth:`DataCatalogClient.delete_entry_group`. - - Deletes an EntryGroup. Only entry groups that do not contain entries - can be deleted. Users should enable the Data Catalog API in the project - identified by the ``name`` parameter (see [Data Catalog Resource - Project] - (https://cloud.google.com/data-catalog/docs/concepts/resource-project) - for more information). 
- - Returns: - Callable: A callable which accepts the appropriate - deserialized request object and returns a - deserialized response object. - """ - return self._stubs["data_catalog_stub"].DeleteEntryGroup - - @property - def list_entry_groups(self): - """Return the gRPC stub for :meth:`DataCatalogClient.list_entry_groups`. - - Lists entry groups. - - Returns: - Callable: A callable which accepts the appropriate - deserialized request object and returns a - deserialized response object. - """ - return self._stubs["data_catalog_stub"].ListEntryGroups - - @property - def create_entry(self): - """Return the gRPC stub for :meth:`DataCatalogClient.create_entry`. - - Creates an entry. Only entries of 'FILESET' type or user-specified - type can be created. - - Users should enable the Data Catalog API in the project identified by - the ``parent`` parameter (see [Data Catalog Resource Project] - (https://cloud.google.com/data-catalog/docs/concepts/resource-project) - for more information). - - A maximum of 100,000 entries may be created per entry group. - - Returns: - Callable: A callable which accepts the appropriate - deserialized request object and returns a - deserialized response object. - """ - return self._stubs["data_catalog_stub"].CreateEntry - - @property - def update_entry(self): - """Return the gRPC stub for :meth:`DataCatalogClient.update_entry`. - - Updates an existing entry. Users should enable the Data Catalog API - in the project identified by the ``entry.name`` parameter (see [Data - Catalog Resource Project] - (https://cloud.google.com/data-catalog/docs/concepts/resource-project) - for more information). - - Returns: - Callable: A callable which accepts the appropriate - deserialized request object and returns a - deserialized response object. - """ - return self._stubs["data_catalog_stub"].UpdateEntry - - @property - def delete_entry(self): - """Return the gRPC stub for :meth:`DataCatalogClient.delete_entry`. - - Deletes an existing entry. 
Only entries created through
-        ``CreateEntry`` method can be deleted. Users should enable the Data
-        Catalog API in the project identified by the ``name`` parameter (see
-        [Data Catalog Resource Project]
-        (https://cloud.google.com/data-catalog/docs/concepts/resource-project)
-        for more information).
-
-        Returns:
-            Callable: A callable which accepts the appropriate
-                deserialized request object and returns a
-                deserialized response object.
-        """
-        return self._stubs["data_catalog_stub"].DeleteEntry
-
-    @property
-    def get_entry(self):
-        """Return the gRPC stub for :meth:`DataCatalogClient.get_entry`.
-
-        Gets an entry.
-
-        Returns:
-            Callable: A callable which accepts the appropriate
-                deserialized request object and returns a
-                deserialized response object.
-        """
-        return self._stubs["data_catalog_stub"].GetEntry
-
-    @property
-    def lookup_entry(self):
-        """Return the gRPC stub for :meth:`DataCatalogClient.lookup_entry`.
-
-        Get an entry by target resource name. This method allows clients to use
-        the resource name from the source Google Cloud Platform service to get the
-        Data Catalog Entry.
-
-        Returns:
-            Callable: A callable which accepts the appropriate
-                deserialized request object and returns a
-                deserialized response object.
-        """
-        return self._stubs["data_catalog_stub"].LookupEntry
-
-    @property
-    def list_entries(self):
-        """Return the gRPC stub for :meth:`DataCatalogClient.list_entries`.
-
-        Lists entries.
-
-        Returns:
-            Callable: A callable which accepts the appropriate
-                deserialized request object and returns a
-                deserialized response object.
-        """
-        return self._stubs["data_catalog_stub"].ListEntries
-
-    @property
-    def create_tag_template(self):
-        """Return the gRPC stub for :meth:`DataCatalogClient.create_tag_template`.
-
-        Creates a tag template. The user should enable the Data Catalog API
-        in the project identified by the ``parent`` parameter (see `Data Catalog
-        Resource
-        Project <https://cloud.google.com/data-catalog/docs/concepts/resource-project>`__
-        for more information).
- - Returns: - Callable: A callable which accepts the appropriate - deserialized request object and returns a - deserialized response object. - """ - return self._stubs["data_catalog_stub"].CreateTagTemplate - - @property - def get_tag_template(self): - """Return the gRPC stub for :meth:`DataCatalogClient.get_tag_template`. - - Gets a tag template. - - Returns: - Callable: A callable which accepts the appropriate - deserialized request object and returns a - deserialized response object. - """ - return self._stubs["data_catalog_stub"].GetTagTemplate - - @property - def update_tag_template(self): - """Return the gRPC stub for :meth:`DataCatalogClient.update_tag_template`. - - Updates a tag template. This method cannot be used to update the - fields of a template. The tag template fields are represented as - separate resources and should be updated using their own - create/update/delete methods. Users should enable the Data Catalog API - in the project identified by the ``tag_template.name`` parameter (see - [Data Catalog Resource Project] - (https://cloud.google.com/data-catalog/docs/concepts/resource-project) - for more information). - - Returns: - Callable: A callable which accepts the appropriate - deserialized request object and returns a - deserialized response object. - """ - return self._stubs["data_catalog_stub"].UpdateTagTemplate - - @property - def delete_tag_template(self): - """Return the gRPC stub for :meth:`DataCatalogClient.delete_tag_template`. - - Deletes a tag template and all tags using the template. Users should - enable the Data Catalog API in the project identified by the ``name`` - parameter (see [Data Catalog Resource Project] - (https://cloud.google.com/data-catalog/docs/concepts/resource-project) - for more information). - - Returns: - Callable: A callable which accepts the appropriate - deserialized request object and returns a - deserialized response object. 
-
-        """
-        return self._stubs["data_catalog_stub"].DeleteTagTemplate
-
-    @property
-    def create_tag_template_field(self):
-        """Return the gRPC stub for :meth:`DataCatalogClient.create_tag_template_field`.
-
-        Creates a field in a tag template. The user should enable the Data
-        Catalog API in the project identified by the ``parent`` parameter (see
-        `Data Catalog Resource
-        Project <https://cloud.google.com/data-catalog/docs/concepts/resource-project>`__
-        for more information).
-
-        Returns:
-            Callable: A callable which accepts the appropriate
-                deserialized request object and returns a
-                deserialized response object.
-        """
-        return self._stubs["data_catalog_stub"].CreateTagTemplateField
-
-    @property
-    def update_tag_template_field(self):
-        """Return the gRPC stub for :meth:`DataCatalogClient.update_tag_template_field`.
-
-        Updates a field in a tag template. This method cannot be used to
-        update the field type. Users should enable the Data Catalog API in the
-        project identified by the ``name`` parameter (see [Data Catalog Resource
-        Project]
-        (https://cloud.google.com/data-catalog/docs/concepts/resource-project)
-        for more information).
-
-        Returns:
-            Callable: A callable which accepts the appropriate
-                deserialized request object and returns a
-                deserialized response object.
-        """
-        return self._stubs["data_catalog_stub"].UpdateTagTemplateField
-
-    @property
-    def rename_tag_template_field(self):
-        """Return the gRPC stub for :meth:`DataCatalogClient.rename_tag_template_field`.
-
-        Renames a field in a tag template. The user should enable the Data
-        Catalog API in the project identified by the ``name`` parameter (see
-        `Data Catalog Resource
-        Project <https://cloud.google.com/data-catalog/docs/concepts/resource-project>`__
-        for more information).
-
-        Returns:
-            Callable: A callable which accepts the appropriate
-                deserialized request object and returns a
-                deserialized response object.
-        """
-        return self._stubs["data_catalog_stub"].RenameTagTemplateField
-
-    @property
-    def delete_tag_template_field(self):
-        """Return the gRPC stub for :meth:`DataCatalogClient.delete_tag_template_field`.
-
-        Deletes a field in a tag template and all uses of that field. Users
-        should enable the Data Catalog API in the project identified by the
-        ``name`` parameter (see [Data Catalog Resource Project]
-        (https://cloud.google.com/data-catalog/docs/concepts/resource-project)
-        for more information).
-
-        Returns:
-            Callable: A callable which accepts the appropriate
-                deserialized request object and returns a
-                deserialized response object.
-        """
-        return self._stubs["data_catalog_stub"].DeleteTagTemplateField
-
-    @property
-    def create_tag(self):
-        """Return the gRPC stub for :meth:`DataCatalogClient.create_tag`.
-
-        Creates a tag on an ``Entry``. Note: The project identified by the
-        ``parent`` parameter for the
-        `tag <https://cloud.google.com/data-catalog/docs/reference/rest/v1/projects.locations.entryGroups.entries.tags/create#path-parameters>`__
-        and the `tag
-        template <https://cloud.google.com/data-catalog/docs/reference/rest/v1/projects.locations.tagTemplates/create#path-parameters>`__
-        used to create the tag must be from the same organization.
-
-        Returns:
-            Callable: A callable which accepts the appropriate
-                deserialized request object and returns a
-                deserialized response object.
-        """
-        return self._stubs["data_catalog_stub"].CreateTag
-
-    @property
-    def update_tag(self):
-        """Return the gRPC stub for :meth:`DataCatalogClient.update_tag`.
-
-        Updates an existing tag.
-
-        Returns:
-            Callable: A callable which accepts the appropriate
-                deserialized request object and returns a
-                deserialized response object.
-        """
-        return self._stubs["data_catalog_stub"].UpdateTag
-
-    @property
-    def delete_tag(self):
-        """Return the gRPC stub for :meth:`DataCatalogClient.delete_tag`.
-
-        Deletes a tag.
-
-        Returns:
-            Callable: A callable which accepts the appropriate
-                deserialized request object and returns a
-                deserialized response object.
-        """
-        return self._stubs["data_catalog_stub"].DeleteTag
-
-    @property
-    def list_tags(self):
-        """Return the gRPC stub for :meth:`DataCatalogClient.list_tags`.
-
-        Lists the tags on an ``Entry``.
-
-        Returns:
-            Callable: A callable which accepts the appropriate
-                deserialized request object and returns a
-                deserialized response object.
- """ - return self._stubs["data_catalog_stub"].ListTags - - @property - def set_iam_policy(self): - """Return the gRPC stub for :meth:`DataCatalogClient.set_iam_policy`. - - Sets the access control policy for a resource. Replaces any existing - policy. Supported resources are: - - - Tag templates. - - Entries. - - Entry groups. Note, this method cannot be used to manage policies for - BigQuery, Pub/Sub and any external Google Cloud Platform resources - synced to Data Catalog. - - Callers must have following Google IAM permission - - - ``datacatalog.tagTemplates.setIamPolicy`` to set policies on tag - templates. - - ``datacatalog.entries.setIamPolicy`` to set policies on entries. - - ``datacatalog.entryGroups.setIamPolicy`` to set policies on entry - groups. - - Returns: - Callable: A callable which accepts the appropriate - deserialized request object and returns a - deserialized response object. - """ - return self._stubs["data_catalog_stub"].SetIamPolicy - - @property - def get_iam_policy(self): - """Return the gRPC stub for :meth:`DataCatalogClient.get_iam_policy`. - - Gets the access control policy for a resource. A ``NOT_FOUND`` error - is returned if the resource does not exist. An empty policy is returned - if the resource exists but does not have a policy set on it. - - Supported resources are: - - - Tag templates. - - Entries. - - Entry groups. Note, this method cannot be used to manage policies for - BigQuery, Pub/Sub and any external Google Cloud Platform resources - synced to Data Catalog. - - Callers must have following Google IAM permission - - - ``datacatalog.tagTemplates.getIamPolicy`` to get policies on tag - templates. - - ``datacatalog.entries.getIamPolicy`` to get policies on entries. - - ``datacatalog.entryGroups.getIamPolicy`` to get policies on entry - groups. - - Returns: - Callable: A callable which accepts the appropriate - deserialized request object and returns a - deserialized response object. 
- """ - return self._stubs["data_catalog_stub"].GetIamPolicy - - @property - def test_iam_permissions(self): - """Return the gRPC stub for :meth:`DataCatalogClient.test_iam_permissions`. - - Returns the caller's permissions on a resource. If the resource does - not exist, an empty set of permissions is returned (We don't return a - ``NOT_FOUND`` error). - - Supported resources are: - - - Tag templates. - - Entries. - - Entry groups. Note, this method cannot be used to manage policies for - BigQuery, Pub/Sub and any external Google Cloud Platform resources - synced to Data Catalog. - - A caller is not required to have Google IAM permission to make this - request. - - Returns: - Callable: A callable which accepts the appropriate - deserialized request object and returns a - deserialized response object. - """ - return self._stubs["data_catalog_stub"].TestIamPermissions diff --git a/google/cloud/datacatalog_v1/proto/__init__.py b/google/cloud/datacatalog_v1/proto/__init__.py deleted file mode 100644 index e69de29b..00000000 diff --git a/google/cloud/datacatalog_v1/proto/common.proto b/google/cloud/datacatalog_v1/proto/common.proto new file mode 100644 index 00000000..bb31bceb --- /dev/null +++ b/google/cloud/datacatalog_v1/proto/common.proto @@ -0,0 +1,38 @@ +// Copyright 2020 Google LLC +// +// Licensed under the Apache License, Version 2.0 (the "License"); +// you may not use this file except in compliance with the License. +// You may obtain a copy of the License at +// +// http://www.apache.org/licenses/LICENSE-2.0 +// +// Unless required by applicable law or agreed to in writing, software +// distributed under the License is distributed on an "AS IS" BASIS, +// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +// See the License for the specific language governing permissions and +// limitations under the License. 
+ +syntax = "proto3"; + +package google.cloud.datacatalog.v1; + +option cc_enable_arenas = true; +option csharp_namespace = "Google.Cloud.DataCatalog.V1"; +option go_package = "google.golang.org/genproto/googleapis/cloud/datacatalog/v1;datacatalog"; +option java_multiple_files = true; +option java_package = "com.google.cloud.datacatalog.v1"; +option php_namespace = "Google\\Cloud\\DataCatalog\\V1"; +option ruby_package = "Google::Cloud::DataCatalog::V1"; + +// This enum describes all the possible systems that Data Catalog integrates +// with. +enum IntegratedSystem { + // Default unknown system. + INTEGRATED_SYSTEM_UNSPECIFIED = 0; + + // BigQuery. + BIGQUERY = 1; + + // Cloud Pub/Sub. + CLOUD_PUBSUB = 2; +} diff --git a/google/cloud/datacatalog_v1/proto/common_pb2.py b/google/cloud/datacatalog_v1/proto/common_pb2.py deleted file mode 100644 index 22f8fe5a..00000000 --- a/google/cloud/datacatalog_v1/proto/common_pb2.py +++ /dev/null @@ -1,75 +0,0 @@ -# -*- coding: utf-8 -*- -# Generated by the protocol buffer compiler. DO NOT EDIT! 
-# source: google/cloud/datacatalog_v1/proto/common.proto - -from google.protobuf.internal import enum_type_wrapper -from google.protobuf import descriptor as _descriptor -from google.protobuf import message as _message -from google.protobuf import reflection as _reflection -from google.protobuf import symbol_database as _symbol_database - -# @@protoc_insertion_point(imports) - -_sym_db = _symbol_database.Default() - - -DESCRIPTOR = _descriptor.FileDescriptor( - name="google/cloud/datacatalog_v1/proto/common.proto", - package="google.cloud.datacatalog.v1", - syntax="proto3", - serialized_options=b"\n\037com.google.cloud.datacatalog.v1P\001ZFgoogle.golang.org/genproto/googleapis/cloud/datacatalog/v1;datacatalog\370\001\001\252\002\033Google.Cloud.DataCatalog.V1\312\002\033Google\\Cloud\\DataCatalog\\V1\352\002\036Google::Cloud::DataCatalog::V1", - create_key=_descriptor._internal_create_key, - serialized_pb=b"\n.google/cloud/datacatalog_v1/proto/common.proto\x12\x1bgoogle.cloud.datacatalog.v1*U\n\x10IntegratedSystem\x12!\n\x1dINTEGRATED_SYSTEM_UNSPECIFIED\x10\x00\x12\x0c\n\x08\x42IGQUERY\x10\x01\x12\x10\n\x0c\x43LOUD_PUBSUB\x10\x02\x42\xcb\x01\n\x1f\x63om.google.cloud.datacatalog.v1P\x01ZFgoogle.golang.org/genproto/googleapis/cloud/datacatalog/v1;datacatalog\xf8\x01\x01\xaa\x02\x1bGoogle.Cloud.DataCatalog.V1\xca\x02\x1bGoogle\\Cloud\\DataCatalog\\V1\xea\x02\x1eGoogle::Cloud::DataCatalog::V1b\x06proto3", -) - -_INTEGRATEDSYSTEM = _descriptor.EnumDescriptor( - name="IntegratedSystem", - full_name="google.cloud.datacatalog.v1.IntegratedSystem", - filename=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - values=[ - _descriptor.EnumValueDescriptor( - name="INTEGRATED_SYSTEM_UNSPECIFIED", - index=0, - number=0, - serialized_options=None, - type=None, - create_key=_descriptor._internal_create_key, - ), - _descriptor.EnumValueDescriptor( - name="BIGQUERY", - index=1, - number=1, - serialized_options=None, - type=None, - 
create_key=_descriptor._internal_create_key, - ), - _descriptor.EnumValueDescriptor( - name="CLOUD_PUBSUB", - index=2, - number=2, - serialized_options=None, - type=None, - create_key=_descriptor._internal_create_key, - ), - ], - containing_type=None, - serialized_options=None, - serialized_start=79, - serialized_end=164, -) -_sym_db.RegisterEnumDescriptor(_INTEGRATEDSYSTEM) - -IntegratedSystem = enum_type_wrapper.EnumTypeWrapper(_INTEGRATEDSYSTEM) -INTEGRATED_SYSTEM_UNSPECIFIED = 0 -BIGQUERY = 1 -CLOUD_PUBSUB = 2 - - -DESCRIPTOR.enum_types_by_name["IntegratedSystem"] = _INTEGRATEDSYSTEM -_sym_db.RegisterFileDescriptor(DESCRIPTOR) - - -DESCRIPTOR._options = None -# @@protoc_insertion_point(module_scope) diff --git a/google/cloud/datacatalog_v1/proto/common_pb2_grpc.py b/google/cloud/datacatalog_v1/proto/common_pb2_grpc.py deleted file mode 100644 index 8a939394..00000000 --- a/google/cloud/datacatalog_v1/proto/common_pb2_grpc.py +++ /dev/null @@ -1,3 +0,0 @@ -# Generated by the gRPC Python protocol compiler plugin. DO NOT EDIT! -"""Client and server classes corresponding to protobuf-defined services.""" -import grpc diff --git a/google/cloud/datacatalog_v1/proto/datacatalog.proto b/google/cloud/datacatalog_v1/proto/datacatalog.proto new file mode 100644 index 00000000..c5b700dd --- /dev/null +++ b/google/cloud/datacatalog_v1/proto/datacatalog.proto @@ -0,0 +1,1261 @@ +// Copyright 2020 Google LLC +// +// Licensed under the Apache License, Version 2.0 (the "License"); +// you may not use this file except in compliance with the License. +// You may obtain a copy of the License at +// +// http://www.apache.org/licenses/LICENSE-2.0 +// +// Unless required by applicable law or agreed to in writing, software +// distributed under the License is distributed on an "AS IS" BASIS, +// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +// See the License for the specific language governing permissions and +// limitations under the License. 
+
+syntax = "proto3";
+
+package google.cloud.datacatalog.v1;
+
+import "google/api/annotations.proto";
+import "google/api/client.proto";
+import "google/api/field_behavior.proto";
+import "google/api/resource.proto";
+import "google/cloud/datacatalog/v1/common.proto";
+import "google/cloud/datacatalog/v1/gcs_fileset_spec.proto";
+import "google/cloud/datacatalog/v1/schema.proto";
+import "google/cloud/datacatalog/v1/search.proto";
+import "google/cloud/datacatalog/v1/table_spec.proto";
+import "google/cloud/datacatalog/v1/tags.proto";
+import "google/cloud/datacatalog/v1/timestamps.proto";
+import "google/iam/v1/iam_policy.proto";
+import "google/iam/v1/policy.proto";
+import "google/protobuf/empty.proto";
+import "google/protobuf/field_mask.proto";
+
+option cc_enable_arenas = true;
+option csharp_namespace = "Google.Cloud.DataCatalog.V1";
+option go_package = "google.golang.org/genproto/googleapis/cloud/datacatalog/v1;datacatalog";
+option java_multiple_files = true;
+option java_package = "com.google.cloud.datacatalog.v1";
+option php_namespace = "Google\\Cloud\\DataCatalog\\V1";
+option ruby_package = "Google::Cloud::DataCatalog::V1";
+
+// Data Catalog API service allows clients to discover, understand, and manage
+// their data.
+service DataCatalog {
+  option (google.api.default_host) = "datacatalog.googleapis.com";
+  option (google.api.oauth_scopes) =
+      "https://www.googleapis.com/auth/cloud-platform";
+
+  // Searches Data Catalog for multiple resources like entries, tags that
+  // match a query.
+  //
+  // This is a custom method
+  // (https://cloud.google.com/apis/design/custom_methods) and does not return
+  // the complete resource, only the resource identifier and high level
+  // fields. Clients can subsequently call `Get` methods.
+  //
+  // Note that Data Catalog search queries do not guarantee full recall. Query
+  // results that match your query may not be returned, even in subsequent
+  // result pages.
Also note that results returned (and not returned) can vary + // across repeated search queries. + // + // See [Data Catalog Search + // Syntax](https://cloud.google.com/data-catalog/docs/how-to/search-reference) + // for more information. + rpc SearchCatalog(SearchCatalogRequest) returns (SearchCatalogResponse) { + option (google.api.http) = { + post: "/v1/catalog:search" + body: "*" + }; + option (google.api.method_signature) = "scope,query"; + } + + // Creates an EntryGroup. + // + // An entry group contains logically related entries together with Cloud + // Identity and Access Management policies that specify the users who can + // create, edit, and view entries within the entry group. + // + // Data Catalog automatically creates an entry group for BigQuery entries + // ("@bigquery") and Pub/Sub topics ("@pubsub"). Users create their own entry + // group to contain Cloud Storage fileset entries or custom type entries, + // and the IAM policies associated with those entries. Entry groups, like + // entries, can be searched. + // + // A maximum of 10,000 entry groups may be created per organization across all + // locations. + // + // Users should enable the Data Catalog API in the project identified by + // the `parent` parameter (see [Data Catalog Resource Project] + // (https://cloud.google.com/data-catalog/docs/concepts/resource-project) for + // more information). + rpc CreateEntryGroup(CreateEntryGroupRequest) returns (EntryGroup) { + option (google.api.http) = { + post: "/v1/{parent=projects/*/locations/*}/entryGroups" + body: "entry_group" + }; + option (google.api.method_signature) = "parent,entry_group_id,entry_group"; + } + + // Gets an EntryGroup. + rpc GetEntryGroup(GetEntryGroupRequest) returns (EntryGroup) { + option (google.api.http) = { + get: "/v1/{name=projects/*/locations/*/entryGroups/*}" + }; + option (google.api.method_signature) = "name"; + option (google.api.method_signature) = "name,read_mask"; + } + + // Updates an EntryGroup. 
The user should enable the Data Catalog API in the + // project identified by the `entry_group.name` parameter (see [Data Catalog + // Resource Project] + // (https://cloud.google.com/data-catalog/docs/concepts/resource-project) for + // more information). + rpc UpdateEntryGroup(UpdateEntryGroupRequest) returns (EntryGroup) { + option (google.api.http) = { + patch: "/v1/{entry_group.name=projects/*/locations/*/entryGroups/*}" + body: "entry_group" + }; + option (google.api.method_signature) = "entry_group"; + option (google.api.method_signature) = "entry_group,update_mask"; + } + + // Deletes an EntryGroup. Only entry groups that do not contain entries can be + // deleted. Users should enable the Data Catalog API in the project + // identified by the `name` parameter (see [Data Catalog Resource Project] + // (https://cloud.google.com/data-catalog/docs/concepts/resource-project) for + // more information). + rpc DeleteEntryGroup(DeleteEntryGroupRequest) + returns (google.protobuf.Empty) { + option (google.api.http) = { + delete: "/v1/{name=projects/*/locations/*/entryGroups/*}" + }; + option (google.api.method_signature) = "name"; + } + + // Lists entry groups. + rpc ListEntryGroups(ListEntryGroupsRequest) + returns (ListEntryGroupsResponse) { + option (google.api.http) = { + get: "/v1/{parent=projects/*/locations/*}/entryGroups" + }; + option (google.api.method_signature) = "parent"; + } + + // Creates an entry. Only entries of 'FILESET' type or user-specified type can + // be created. + // + // Users should enable the Data Catalog API in the project identified by + // the `parent` parameter (see [Data Catalog Resource Project] + // (https://cloud.google.com/data-catalog/docs/concepts/resource-project) for + // more information). + // + // A maximum of 100,000 entries may be created per entry group. 
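On the client side, the `CreateEntryGroup` and `CreateEntry` RPCs surface in the migrated microgenerated library as `DataCatalogClient.create_entry_group` and `DataCatalogClient.create_entry`. The following is a minimal sketch of that flow, not authoritative sample code: it assumes `google-cloud-datacatalog` is installed, the project/location/bucket values are placeholders, and the dict request form (including the `type_` spelling for the reserved-word proto field `type`) is accepted by the generated messages.

```python
def fileset_entry_payload(bucket: str) -> dict:
    """Plain-dict Entry payload for a FILESET entry (generated messages accept dicts)."""
    return {
        "display_name": "Demo fileset",
        # The proto field is `type`; the generated Python message renames it to
        # `type_` because `type` shadows a Python builtin.
        "type_": "FILESET",
        "gcs_fileset_spec": {"file_patterns": [f"gs://{bucket}/*"]},
    }


def create_fileset(project_id: str, location_id: str, bucket: str):
    # Imported lazily so the payload helper stays usable without the library.
    from google.cloud import datacatalog_v1  # pip install google-cloud-datacatalog

    client = datacatalog_v1.DataCatalogClient()
    parent = f"projects/{project_id}/locations/{location_id}"

    # Entry groups hold user-created entries; "demo_entry_group" is a placeholder ID.
    entry_group = client.create_entry_group(
        parent=parent,
        entry_group_id="demo_entry_group",
        entry_group={"display_name": "Demo entry group"},
    )
    # Per the comment above, up to 100,000 entries may be created per entry group.
    return client.create_entry(
        parent=entry_group.name,
        entry_id="demo_entry",
        entry=fileset_entry_payload(bucket),
    )
```

The flattened keyword arguments mirror the `"parent,entry_group_id,entry_group"` and `"parent,entry_id,entry"` method signatures declared in the proto.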
+ rpc CreateEntry(CreateEntryRequest) returns (Entry) { + option (google.api.http) = { + post: "/v1/{parent=projects/*/locations/*/entryGroups/*}/entries" + body: "entry" + }; + option (google.api.method_signature) = "parent,entry_id,entry"; + } + + // Updates an existing entry. + // Users should enable the Data Catalog API in the project identified by + // the `entry.name` parameter (see [Data Catalog Resource Project] + // (https://cloud.google.com/data-catalog/docs/concepts/resource-project) for + // more information). + rpc UpdateEntry(UpdateEntryRequest) returns (Entry) { + option (google.api.http) = { + patch: "/v1/{entry.name=projects/*/locations/*/entryGroups/*/entries/*}" + body: "entry" + }; + option (google.api.method_signature) = "entry"; + option (google.api.method_signature) = "entry,update_mask"; + } + + // Deletes an existing entry. Only entries created through + // [CreateEntry][google.cloud.datacatalog.v1.DataCatalog.CreateEntry] + // method can be deleted. + // Users should enable the Data Catalog API in the project identified by + // the `name` parameter (see [Data Catalog Resource Project] + // (https://cloud.google.com/data-catalog/docs/concepts/resource-project) for + // more information). + rpc DeleteEntry(DeleteEntryRequest) returns (google.protobuf.Empty) { + option (google.api.http) = { + delete: "/v1/{name=projects/*/locations/*/entryGroups/*/entries/*}" + }; + option (google.api.method_signature) = "name"; + } + + // Gets an entry. + rpc GetEntry(GetEntryRequest) returns (Entry) { + option (google.api.http) = { + get: "/v1/{name=projects/*/locations/*/entryGroups/*/entries/*}" + }; + option (google.api.method_signature) = "name"; + } + + // Get an entry by target resource name. This method allows clients to use + // the resource name from the source Google Cloud Platform service to get the + // Data Catalog Entry. 
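Because `LookupEntry` declares no `method_signature`, the generated `lookup_entry` method takes a whole request rather than flattened arguments, and the request's target is a oneof. A hedged sketch (resource-name strings are placeholders, and `google-cloud-datacatalog` is assumed installed for the client call):

```python
def lookup_entry_request(linked_resource: str = None, sql_resource: str = None) -> dict:
    """Build a LookupEntryRequest dict; exactly one oneof member must be set."""
    if (linked_resource is None) == (sql_resource is None):
        raise ValueError("set exactly one of linked_resource or sql_resource")
    if linked_resource is not None:
        # Full resource name of the source service asset, e.g. a BigQuery table.
        return {"linked_resource": linked_resource}
    # SQL-style name, e.g. "bigquery.table.<project>.<dataset>.<table>".
    return {"sql_resource": sql_resource}


def lookup_entry(request: dict):
    from google.cloud import datacatalog_v1  # imported lazily

    client = datacatalog_v1.DataCatalogClient()
    # No flattened parameters for this RPC: pass the request as a whole.
    return client.lookup_entry(request=request)
```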
+ rpc LookupEntry(LookupEntryRequest) returns (Entry) { + option (google.api.http) = { + get: "/v1/entries:lookup" + }; + } + + // Lists entries. + rpc ListEntries(ListEntriesRequest) returns (ListEntriesResponse) { + option (google.api.http) = { + get: "/v1/{parent=projects/*/locations/*/entryGroups/*}/entries" + }; + option (google.api.method_signature) = "parent"; + } + + // Creates a tag template. The user should enable the Data Catalog API in + // the project identified by the `parent` parameter (see [Data Catalog + // Resource + // Project](https://cloud.google.com/data-catalog/docs/concepts/resource-project) + // for more information). + rpc CreateTagTemplate(CreateTagTemplateRequest) returns (TagTemplate) { + option (google.api.http) = { + post: "/v1/{parent=projects/*/locations/*}/tagTemplates" + body: "tag_template" + }; + option (google.api.method_signature) = + "parent,tag_template_id,tag_template"; + } + + // Gets a tag template. + rpc GetTagTemplate(GetTagTemplateRequest) returns (TagTemplate) { + option (google.api.http) = { + get: "/v1/{name=projects/*/locations/*/tagTemplates/*}" + }; + option (google.api.method_signature) = "name"; + } + + // Updates a tag template. This method cannot be used to update the fields of + // a template. The tag template fields are represented as separate resources + // and should be updated using their own create/update/delete methods. + // Users should enable the Data Catalog API in the project identified by + // the `tag_template.name` parameter (see [Data Catalog Resource Project] + // (https://cloud.google.com/data-catalog/docs/concepts/resource-project) for + // more information). 
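Since `UpdateTagTemplate` cannot change a template's fields, the fields are normally laid out up front in the `CreateTagTemplate` request and then managed through the `*TagTemplateField` RPCs. A sketch of such a payload in dict form; the field names, display names, and the string spelling of the `primitive_type` enum are illustrative assumptions:

```python
def tag_template_payload() -> dict:
    """TagTemplate payload with two primitive fields. After creation the fields
    are separate resources, updated via the *TagTemplateField methods."""
    return {
        "display_name": "Demo tag template",
        "fields": {
            "source": {
                "display_name": "Source of data asset",
                "type_": {"primitive_type": "STRING"},
            },
            "num_rows": {
                "display_name": "Number of rows",
                "type_": {"primitive_type": "DOUBLE"},
            },
        },
    }


def create_template(project_id: str, location_id: str):
    from google.cloud import datacatalog_v1  # imported lazily

    client = datacatalog_v1.DataCatalogClient()
    return client.create_tag_template(
        parent=f"projects/{project_id}/locations/{location_id}",
        tag_template_id="demo_template",  # placeholder ID
        tag_template=tag_template_payload(),
    )
```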
+ rpc UpdateTagTemplate(UpdateTagTemplateRequest) returns (TagTemplate) { + option (google.api.http) = { + patch: "/v1/{tag_template.name=projects/*/locations/*/tagTemplates/*}" + body: "tag_template" + }; + option (google.api.method_signature) = "tag_template"; + option (google.api.method_signature) = "tag_template,update_mask"; + } + + // Deletes a tag template and all tags using the template. + // Users should enable the Data Catalog API in the project identified by + // the `name` parameter (see [Data Catalog Resource Project] + // (https://cloud.google.com/data-catalog/docs/concepts/resource-project) for + // more information). + rpc DeleteTagTemplate(DeleteTagTemplateRequest) + returns (google.protobuf.Empty) { + option (google.api.http) = { + delete: "/v1/{name=projects/*/locations/*/tagTemplates/*}" + }; + option (google.api.method_signature) = "name,force"; + } + + // Creates a field in a tag template. The user should enable the Data Catalog + // API in the project identified by the `parent` parameter (see + // [Data Catalog Resource + // Project](https://cloud.google.com/data-catalog/docs/concepts/resource-project) + // for more information). + rpc CreateTagTemplateField(CreateTagTemplateFieldRequest) + returns (TagTemplateField) { + option (google.api.http) = { + post: "/v1/{parent=projects/*/locations/*/tagTemplates/*}/fields" + body: "tag_template_field" + }; + option (google.api.method_signature) = + "parent,tag_template_field_id,tag_template_field"; + } + + // Updates a field in a tag template. This method cannot be used to update the + // field type. Users should enable the Data Catalog API in the project + // identified by the `name` parameter (see [Data Catalog Resource Project] + // (https://cloud.google.com/data-catalog/docs/concepts/resource-project) for + // more information). 
+ rpc UpdateTagTemplateField(UpdateTagTemplateFieldRequest) + returns (TagTemplateField) { + option (google.api.http) = { + patch: "/v1/{name=projects/*/locations/*/tagTemplates/*/fields/*}" + body: "tag_template_field" + }; + option (google.api.method_signature) = "name,tag_template_field"; + option (google.api.method_signature) = + "name,tag_template_field,update_mask"; + } + + // Renames a field in a tag template. The user should enable the Data Catalog + // API in the project identified by the `name` parameter (see [Data Catalog + // Resource + // Project](https://cloud.google.com/data-catalog/docs/concepts/resource-project) + // for more information). + rpc RenameTagTemplateField(RenameTagTemplateFieldRequest) + returns (TagTemplateField) { + option (google.api.http) = { + post: "/v1/{name=projects/*/locations/*/tagTemplates/*/fields/*}:rename" + body: "*" + }; + option (google.api.method_signature) = "name,new_tag_template_field_id"; + } + + // Deletes a field in a tag template and all uses of that field. + // Users should enable the Data Catalog API in the project identified by + // the `name` parameter (see [Data Catalog Resource Project] + // (https://cloud.google.com/data-catalog/docs/concepts/resource-project) for + // more information). + rpc DeleteTagTemplateField(DeleteTagTemplateFieldRequest) + returns (google.protobuf.Empty) { + option (google.api.http) = { + delete: "/v1/{name=projects/*/locations/*/tagTemplates/*/fields/*}" + }; + option (google.api.method_signature) = "name,force"; + } + + // Creates a tag on an [Entry][google.cloud.datacatalog.v1.Entry]. 
+ // Note: The project identified by the `parent` parameter for the + // [tag](https://cloud.google.com/data-catalog/docs/reference/rest/v1/projects.locations.entryGroups.entries.tags/create#path-parameters) + // and the + // [tag + // template](https://cloud.google.com/data-catalog/docs/reference/rest/v1/projects.locations.tagTemplates/create#path-parameters) + // used to create the tag must be from the same organization. + rpc CreateTag(CreateTagRequest) returns (Tag) { + option (google.api.http) = { + post: "/v1/{parent=projects/*/locations/*/entryGroups/*/entries/*}/tags" + body: "tag" + additional_bindings { + post: "/v1/{parent=projects/*/locations/*/entryGroups/*}/tags" + body: "tag" + } + }; + option (google.api.method_signature) = "parent,tag"; + } + + // Updates an existing tag. + rpc UpdateTag(UpdateTagRequest) returns (Tag) { + option (google.api.http) = { + patch: "/v1/{tag.name=projects/*/locations/*/entryGroups/*/entries/*/tags/*}" + body: "tag" + additional_bindings { + patch: "/v1/{tag.name=projects/*/locations/*/entryGroups/*/tags/*}" + body: "tag" + } + }; + option (google.api.method_signature) = "tag"; + option (google.api.method_signature) = "tag,update_mask"; + } + + // Deletes a tag. + rpc DeleteTag(DeleteTagRequest) returns (google.protobuf.Empty) { + option (google.api.http) = { + delete: "/v1/{name=projects/*/locations/*/entryGroups/*/entries/*/tags/*}" + additional_bindings { + delete: "/v1/{name=projects/*/locations/*/entryGroups/*/tags/*}" + } + }; + option (google.api.method_signature) = "name"; + } + + // Lists the tags on an [Entry][google.cloud.datacatalog.v1.Entry]. + rpc ListTags(ListTagsRequest) returns (ListTagsResponse) { + option (google.api.http) = { + get: "/v1/{parent=projects/*/locations/*/entryGroups/*/entries/*}/tags" + additional_bindings { + get: "/v1/{parent=projects/*/locations/*/entryGroups/*}/tags" + } + }; + option (google.api.method_signature) = "parent"; + } + + // Sets the access control policy for a resource. 
Replaces any existing
+  // policy.
+  // Supported resources are:
+  //   - Tag templates.
+  //   - Entries.
+  //   - Entry groups.
+  // Note, this method cannot be used to manage policies for BigQuery, Pub/Sub
+  // and any external Google Cloud Platform resources synced to Data Catalog.
+  //
+  // Callers must have the following Google IAM permissions:
+  //   - `datacatalog.tagTemplates.setIamPolicy` to set policies on tag
+  //     templates.
+  //   - `datacatalog.entries.setIamPolicy` to set policies on entries.
+  //   - `datacatalog.entryGroups.setIamPolicy` to set policies on entry groups.
+  rpc SetIamPolicy(google.iam.v1.SetIamPolicyRequest)
+      returns (google.iam.v1.Policy) {
+    option (google.api.http) = {
+      post: "/v1/{resource=projects/*/locations/*/tagTemplates/*}:setIamPolicy"
+      body: "*"
+      additional_bindings {
+        post: "/v1/{resource=projects/*/locations/*/entryGroups/*}:setIamPolicy"
+        body: "*"
+      }
+    };
+    option (google.api.method_signature) = "resource,policy";
+  }
+
+  // Gets the access control policy for a resource. A `NOT_FOUND` error
+  // is returned if the resource does not exist. An empty policy is returned
+  // if the resource exists but does not have a policy set on it.
+  //
+  // Supported resources are:
+  //   - Tag templates.
+  //   - Entries.
+  //   - Entry groups.
+  // Note, this method cannot be used to manage policies for BigQuery, Pub/Sub
+  // and any external Google Cloud Platform resources synced to Data Catalog.
+  //
+  // Callers must have the following Google IAM permissions:
+  //   - `datacatalog.tagTemplates.getIamPolicy` to get policies on tag
+  //     templates.
+  //   - `datacatalog.entries.getIamPolicy` to get policies on entries.
+  //   - `datacatalog.entryGroups.getIamPolicy` to get policies on entry groups.
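In the migrated client these IAM RPCs surface as `set_iam_policy`, `get_iam_policy`, and `test_iam_permissions`, with `resource` naming a tag template, entry group, or entry. A sketch of reading a tag template's policy, assuming `google-cloud-datacatalog` is installed and the IDs are placeholders:

```python
def tag_template_resource(project_id: str, location_id: str, template_id: str) -> str:
    """Resource name accepted by the IAM methods for a tag template."""
    return f"projects/{project_id}/locations/{location_id}/tagTemplates/{template_id}"


def show_policy(resource: str):
    from google.cloud import datacatalog_v1  # imported lazily

    client = datacatalog_v1.DataCatalogClient()
    # `resource` follows the "resource" method_signature; the caller needs
    # datacatalog.tagTemplates.getIamPolicy to read a tag template's policy.
    policy = client.get_iam_policy(resource=resource)
    for binding in policy.bindings:
        print(binding.role, list(binding.members))
    return policy
```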
+ rpc GetIamPolicy(google.iam.v1.GetIamPolicyRequest) + returns (google.iam.v1.Policy) { + option (google.api.http) = { + post: "/v1/{resource=projects/*/locations/*/tagTemplates/*}:getIamPolicy" + body: "*" + additional_bindings { + post: "/v1/{resource=projects/*/locations/*/entryGroups/*}:getIamPolicy" + body: "*" + } + additional_bindings { + post: "/v1/{resource=projects/*/locations/*/entryGroups/*/entries/*}:getIamPolicy" + body: "*" + } + }; + option (google.api.method_signature) = "resource"; + } + + // Returns the caller's permissions on a resource. + // If the resource does not exist, an empty set of permissions is returned + // (We don't return a `NOT_FOUND` error). + // + // Supported resources are: + // - Tag templates. + // - Entries. + // - Entry groups. + // Note, this method cannot be used to manage policies for BigQuery, Pub/Sub + // and any external Google Cloud Platform resources synced to Data Catalog. + // + // A caller is not required to have Google IAM permission to make this + // request. + rpc TestIamPermissions(google.iam.v1.TestIamPermissionsRequest) + returns (google.iam.v1.TestIamPermissionsResponse) { + option (google.api.http) = { + post: "/v1/{resource=projects/*/locations/*/tagTemplates/*}:testIamPermissions" + body: "*" + additional_bindings { + post: "/v1/{resource=projects/*/locations/*/entryGroups/*}:testIamPermissions" + body: "*" + } + additional_bindings { + post: "/v1/{resource=projects/*/locations/*/entryGroups/*/entries/*}:testIamPermissions" + body: "*" + } + }; + } +} + +// Request message for +// [SearchCatalog][google.cloud.datacatalog.v1.DataCatalog.SearchCatalog]. +message SearchCatalogRequest { + // The criteria that select the subspace used for query matching. + message Scope { + // The list of organization IDs to search within. To find your organization + // ID, follow instructions in + // https://cloud.google.com/resource-manager/docs/creating-managing-organization. 
+ repeated string include_org_ids = 2; + + // The list of project IDs to search within. To learn more about the + // distinction between project names/IDs/numbers, go to + // https://cloud.google.com/docs/overview/#projects. + repeated string include_project_ids = 3; + + // If `true`, include Google Cloud Platform (GCP) public datasets in the + // search results. Info on GCP public datasets is available at + // https://cloud.google.com/public-datasets/. By default, GCP public + // datasets are excluded. + bool include_gcp_public_datasets = 7; + + // Optional. The list of locations to search within. + // 1. If empty, search will be performed in all locations; + // 2. If any of the locations are NOT in the valid locations list, error + // will be returned; + // 3. Otherwise, search only the given locations for matching results. + // Typical usage is to leave this field empty. When a location is + // unreachable as returned in the `SearchCatalogResponse.unreachable` field, + // users can repeat the search request with this parameter set to get + // additional information on the error. + // + // Valid locations: + // * asia-east1 + // * asia-east2 + // * asia-northeast1 + // * asia-northeast2 + // * asia-northeast3 + // * asia-south1 + // * asia-southeast1 + // * australia-southeast1 + // * eu + // * europe-north1 + // * europe-west1 + // * europe-west2 + // * europe-west3 + // * europe-west4 + // * europe-west6 + // * global + // * northamerica-northeast1 + // * southamerica-east1 + // * us + // * us-central1 + // * us-east1 + // * us-east4 + // * us-west1 + // * us-west2 + repeated string restricted_locations = 16 + [(google.api.field_behavior) = OPTIONAL]; + } + + // Required. The scope of this search request. A `scope` that has empty + // `include_org_ids`, `include_project_ids` AND false + // `include_gcp_public_datasets` is considered invalid. Data Catalog will + // return an error in such a case. 
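The validity rule stated above (a `Scope` with empty `include_org_ids`, empty `include_project_ids`, and `include_gcp_public_datasets` false is rejected) can be sketched with a plain-Python stand-in for the message. This dataclass is illustrative only, not the generated proto class; the service performs the authoritative check.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Scope:
    # Stand-in for SearchCatalogRequest.Scope; field names mirror the proto.
    include_org_ids: List[str] = field(default_factory=list)
    include_project_ids: List[str] = field(default_factory=list)
    include_gcp_public_datasets: bool = False

def scope_is_valid(scope: Scope) -> bool:
    # At least one subspace selector must be set, per the comment above.
    return bool(
        scope.include_org_ids
        or scope.include_project_ids
        or scope.include_gcp_public_datasets
    )
```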
+ Scope scope = 6 [(google.api.field_behavior) = REQUIRED]; + + // Required. The query string in search query syntax. The query must be + // non-empty. + // + // Query strings can be as simple as "x" or more qualified as: + // + // * name:x + // * column:x + // * description:y + // + // Note: Query tokens need to have a minimum of 3 characters for substring + // matching to work correctly. See [Data Catalog Search + // Syntax](https://cloud.google.com/data-catalog/docs/how-to/search-reference) + // for more information. + string query = 1 [(google.api.field_behavior) = REQUIRED]; + + // Number of results in the search page. If <=0 then defaults to 10. Max limit + // for page_size is 1000. Throws an invalid argument for page_size > 1000. + int32 page_size = 2; + + // Optional. Pagination token returned in an earlier + // [SearchCatalogResponse.next_page_token][google.cloud.datacatalog.v1.SearchCatalogResponse.next_page_token], + // which indicates that this is a continuation of a prior + // [SearchCatalogRequest][google.cloud.datacatalog.v1.DataCatalog.SearchCatalog] + // call, and that the system should return the next page of data. If empty, + // the first page is returned. + string page_token = 3 [(google.api.field_behavior) = OPTIONAL]; + + // Specifies the ordering of results; currently supported case-sensitive + // choices are: + // + // * `relevance`, only supports descending + // * `last_modified_timestamp [asc|desc]`, defaults to descending if not + // specified + // + // If not specified, defaults to `relevance` descending. + string order_by = 5; +} + +// Response message for +// [SearchCatalog][google.cloud.datacatalog.v1.DataCatalog.SearchCatalog]. +message SearchCatalogResponse { + // Search results. + repeated SearchCatalogResult results = 1; + + // The token that can be used to retrieve the next page of results. + string next_page_token = 3; + + // Unreachable locations. Search result does not include data from those + // locations.
Users can get additional information on the error by repeating + // the search request with a more restrictive parameter -- setting the value + // for `SearchCatalogRequest.scope.restricted_locations`. + repeated string unreachable = 6; +} + +// Request message for +// [CreateEntryGroup][google.cloud.datacatalog.v1.DataCatalog.CreateEntryGroup]. +message CreateEntryGroupRequest { + // Required. The name of the project this entry group is in. Example: + // + // * projects/{project_id}/locations/{location} + // + // Note that this EntryGroup and its child resources may not actually be + // stored in the location in this name. + string parent = 1 [ + (google.api.field_behavior) = REQUIRED, + (google.api.resource_reference) = { + child_type: "datacatalog.googleapis.com/EntryGroup" + } + ]; + + // Required. The id of the entry group to create. + // The id must begin with a letter or underscore, contain only English + // letters, numbers and underscores, and be at most 64 characters. + string entry_group_id = 3 [(google.api.field_behavior) = REQUIRED]; + + // The entry group to create. Defaults to an empty entry group. + EntryGroup entry_group = 2; +} + +// Request message for +// [UpdateEntryGroup][google.cloud.datacatalog.v1.DataCatalog.UpdateEntryGroup]. +message UpdateEntryGroupRequest { + // Required. The updated entry group. The "name" field must be set. + EntryGroup entry_group = 1 [(google.api.field_behavior) = REQUIRED]; + + // The fields to update on the entry group. If absent or empty, all modifiable + // fields are updated. + google.protobuf.FieldMask update_mask = 2; +} + +// Request message for +// [GetEntryGroup][google.cloud.datacatalog.v1.DataCatalog.GetEntryGroup]. +message GetEntryGroupRequest { + // Required. The name of the entry group. For example, + // `projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}`.
+ string name = 1 [ + (google.api.field_behavior) = REQUIRED, + (google.api.resource_reference) = { + type: "datacatalog.googleapis.com/EntryGroup" + } + ]; + + // The fields to return. If not set or empty, all fields are returned. + google.protobuf.FieldMask read_mask = 2; +} + +// Request message for +// [DeleteEntryGroup][google.cloud.datacatalog.v1.DataCatalog.DeleteEntryGroup]. +message DeleteEntryGroupRequest { + // Required. The name of the entry group. For example, + // `projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}`. + string name = 1 [ + (google.api.field_behavior) = REQUIRED, + (google.api.resource_reference) = { + type: "datacatalog.googleapis.com/EntryGroup" + } + ]; + + // Optional. If true, deletes all entries in the entry group. + bool force = 2 [(google.api.field_behavior) = OPTIONAL]; +} + +// Request message for +// [ListEntryGroups][google.cloud.datacatalog.v1.DataCatalog.ListEntryGroups]. +message ListEntryGroupsRequest { + // Required. The name of the location that contains the entry groups, which + // can be provided in URL format. Example: + // + // * projects/{project_id}/locations/{location} + string parent = 1 [ + (google.api.field_behavior) = REQUIRED, + (google.api.resource_reference) = { + type: "datacatalog.googleapis.com/EntryGroup" + } + ]; + + // Optional. The maximum number of items to return. Default is 10. Max limit + // is 1000. Throws an invalid argument for `page_size > 1000`. + int32 page_size = 2 [(google.api.field_behavior) = OPTIONAL]; + + // Optional. Token that specifies which page is requested. If empty, the first + // page is returned. + string page_token = 3 [(google.api.field_behavior) = OPTIONAL]; +} + +// Response message for +// [ListEntryGroups][google.cloud.datacatalog.v1.DataCatalog.ListEntryGroups]. +message ListEntryGroupsResponse { + // EntryGroup details. + repeated EntryGroup entry_groups = 1; + + // Token to retrieve the next page of results. 
It is set to empty if no items + // remain in results. + string next_page_token = 2; +} + +// Request message for +// [CreateEntry][google.cloud.datacatalog.v1.DataCatalog.CreateEntry]. +message CreateEntryRequest { + // Required. The name of the entry group this entry is in. Example: + // + // * projects/{project_id}/locations/{location}/entryGroups/{entry_group_id} + // + // Note that this Entry and its child resources may not actually be stored in + // the location in this name. + string parent = 1 [ + (google.api.field_behavior) = REQUIRED, + (google.api.resource_reference) = { + type: "datacatalog.googleapis.com/EntryGroup" + } + ]; + + // Required. The id of the entry to create. + string entry_id = 3 [(google.api.field_behavior) = REQUIRED]; + + // Required. The entry to create. + Entry entry = 2 [(google.api.field_behavior) = REQUIRED]; +} + +// Request message for +// [UpdateEntry][google.cloud.datacatalog.v1.DataCatalog.UpdateEntry]. +message UpdateEntryRequest { + // Required. The updated entry. The "name" field must be set. + Entry entry = 1 [(google.api.field_behavior) = REQUIRED]; + + // The fields to update on the entry. If absent or empty, all modifiable + // fields are updated. + // + // The following fields are modifiable: + // * For entries with type `DATA_STREAM`: + // * `schema` + // * For entries with type `FILESET` + // * `schema` + // * `display_name` + // * `description` + // * `gcs_fileset_spec` + // * `gcs_fileset_spec.file_patterns` + // * For entries with `user_specified_type` + // * `schema` + // * `display_name` + // * `description` + // * user_specified_type + // * user_specified_system + // * linked_resource + // * source_system_timestamps + google.protobuf.FieldMask update_mask = 2; +} + +// Request message for +// [DeleteEntry][google.cloud.datacatalog.v1.DataCatalog.DeleteEntry]. +message DeleteEntryRequest { + // Required. The name of the entry. 
Example: + // + // * projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id} + string name = 1 [ + (google.api.field_behavior) = REQUIRED, + (google.api.resource_reference) = { + type: "datacatalog.googleapis.com/Entry" + } + ]; +} + +// Request message for +// [GetEntry][google.cloud.datacatalog.v1.DataCatalog.GetEntry]. +message GetEntryRequest { + // Required. The name of the entry. Example: + // + // * projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id} + string name = 1 [ + (google.api.field_behavior) = REQUIRED, + (google.api.resource_reference) = { + type: "datacatalog.googleapis.com/Entry" + } + ]; +} + +// Request message for +// [LookupEntry][google.cloud.datacatalog.v1.DataCatalog.LookupEntry]. +message LookupEntryRequest { + // Required. Represents either the Google Cloud Platform resource or SQL name + // for a Google Cloud Platform resource. + oneof target_name { + // The full name of the Google Cloud Platform resource the Data Catalog + // entry represents. See: + // https://cloud.google.com/apis/design/resource_names#full_resource_name. + // Full names are case-sensitive. + // + // Examples: + // + // * //bigquery.googleapis.com/projects/projectId/datasets/datasetId/tables/tableId + // * //pubsub.googleapis.com/projects/projectId/topics/topicId + string linked_resource = 1; + + // The SQL name of the entry. SQL names are case-sensitive. + // + // Examples: + // + // * `pubsub.project_id.topic_id` + // * ``pubsub.project_id.`topic.id.with.dots` `` + // * `bigquery.table.project_id.dataset_id.table_id` + // * `bigquery.dataset.project_id.dataset_id` + // * `datacatalog.entry.project_id.location_id.entry_group_id.entry_id` + // + // `*_id`s should satisfy the standard SQL rules for identifiers. + // https://cloud.google.com/bigquery/docs/reference/standard-sql/lexical. + string sql_resource = 3; + } +} + +// Entry Metadata.
+// A Data Catalog Entry resource represents another resource in Google +// Cloud Platform (such as a BigQuery dataset or a Pub/Sub topic) or +// outside of Google Cloud Platform. Clients can use the `linked_resource` field +// in the Entry resource to refer to the original resource ID of the source +// system. +// +// An Entry resource contains resource details, such as its schema. An Entry can +// also be used to attach flexible metadata, such as a +// [Tag][google.cloud.datacatalog.v1.Tag]. +message Entry { + option (google.api.resource) = { + type: "datacatalog.googleapis.com/Entry" + pattern: "projects/{project}/locations/{location}/entryGroups/{entry_group}/entries/{entry}" + }; + + // The Data Catalog resource name of the entry in URL format. Example: + // + // * projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id} + // + // Note that this Entry and its child resources may not actually be stored in + // the location in this name. + string name = 1 [(google.api.resource_reference) = { + type: "datacatalog.googleapis.com/EntryGroup" + }]; + + // The resource this metadata entry refers to. + // + // For Google Cloud Platform resources, `linked_resource` is the [full name of + // the + // resource](https://cloud.google.com/apis/design/resource_names#full_resource_name). + // For example, the `linked_resource` for a table resource from BigQuery is: + // + // * //bigquery.googleapis.com/projects/projectId/datasets/datasetId/tables/tableId + // + // Output only when Entry is of type in the EntryType enum. For entries with + // user_specified_type, this field is optional and defaults to an empty + // string. + string linked_resource = 9; + + // Required. Entry type. + oneof entry_type { + // The type of the entry. + // Only used for Entries with types in the EntryType enum. + EntryType type = 2; + + // Entry type if it does not fit any of the input-allowed values listed in + // `EntryType` enum above. 
When creating an entry, users should check the + // enum values first, if nothing matches the entry to be created, then + // provide a custom value, for example "my_special_type". + // `user_specified_type` strings must begin with a letter or underscore and + // can only contain letters, numbers, and underscores; are case insensitive; + // must be at least 1 character and at most 64 characters long. + // + // Currently, only FILESET enum value is allowed. All other entries created + // through Data Catalog must use `user_specified_type`. + string user_specified_type = 16; + } + + // The source system of the entry. + oneof system { + // Output only. This field indicates the entry's source system that Data + // Catalog integrates with, such as BigQuery or Pub/Sub. + IntegratedSystem integrated_system = 17 + [(google.api.field_behavior) = OUTPUT_ONLY]; + + // This field indicates the entry's source system that Data Catalog does not + // integrate with. `user_specified_system` strings must begin with a letter + // or underscore and can only contain letters, numbers, and underscores; are + // case insensitive; must be at least 1 character and at most 64 characters + // long. + string user_specified_system = 18; + } + + // Type specification information. + oneof type_spec { + // Specification that applies to a Cloud Storage fileset. This is only valid + // on entries of type FILESET. + GcsFilesetSpec gcs_fileset_spec = 6; + + // Specification that applies to a BigQuery table. This is only valid on + // entries of type `TABLE`. + BigQueryTableSpec bigquery_table_spec = 12; + + // Specification for a group of BigQuery tables with name pattern + // `[prefix]YYYYMMDD`. Context: + // https://cloud.google.com/bigquery/docs/partitioned-tables#partitioning_versus_sharding. + BigQueryDateShardedSpec bigquery_date_sharded_spec = 15; + } + + // Display information such as title and description. A short name to identify + // the entry, for example, "Analytics Data - Jan 2011". 
Default value is an + // empty string. + string display_name = 3; + + // Entry description, which can consist of several sentences or paragraphs + // that describe entry contents. Default value is an empty string. + string description = 4; + + // Schema of the entry. An entry might not have any schema attached to it. + Schema schema = 5; + + // Timestamps about the underlying resource, not about this Data Catalog + // entry. Output only when Entry is of type in the EntryType enum. For entries + // with user_specified_type, this field is optional and defaults to an empty + // timestamp. + SystemTimestamps source_system_timestamps = 7; +} + +// EntryGroup Metadata. +// An EntryGroup resource represents a logical grouping of zero or more +// Data Catalog [Entry][google.cloud.datacatalog.v1.Entry] resources. +message EntryGroup { + option (google.api.resource) = { + type: "datacatalog.googleapis.com/EntryGroup" + pattern: "projects/{project}/locations/{location}/entryGroups/{entry_group}" + }; + + // The resource name of the entry group in URL format. Example: + // + // * projects/{project_id}/locations/{location}/entryGroups/{entry_group_id} + // + // Note that this EntryGroup and its child resources may not actually be + // stored in the location in this name. + string name = 1; + + // A short name to identify the entry group, for example, + // "analytics data - jan 2011". Default value is an empty string. + string display_name = 2; + + // Entry group description, which can consist of several sentences or + // paragraphs that describe entry group contents. Default value is an empty + // string. + string description = 3; + + // Output only. Timestamps about this EntryGroup. Default value is empty + // timestamps. + SystemTimestamps data_catalog_timestamps = 4 + [(google.api.field_behavior) = OUTPUT_ONLY]; +} + +// Request message for +// [CreateTagTemplate][google.cloud.datacatalog.v1.DataCatalog.CreateTagTemplate]. +message CreateTagTemplateRequest { + // Required. 
The name of the project and the template location + // [region](https://cloud.google.com/data-catalog/docs/concepts/regions). + // + // Example: + // + // * projects/{project_id}/locations/us-central1 + string parent = 1 [ + (google.api.field_behavior) = REQUIRED, + (google.api.resource_reference) = { + child_type: "datacatalog.googleapis.com/TagTemplate" + } + ]; + + // Required. The id of the tag template to create. + string tag_template_id = 3 [(google.api.field_behavior) = REQUIRED]; + + // Required. The tag template to create. + TagTemplate tag_template = 2 [(google.api.field_behavior) = REQUIRED]; +} + +// Request message for +// [GetTagTemplate][google.cloud.datacatalog.v1.DataCatalog.GetTagTemplate]. +message GetTagTemplateRequest { + // Required. The name of the tag template. Example: + // + // * projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id} + string name = 1 [ + (google.api.field_behavior) = REQUIRED, + (google.api.resource_reference) = { + type: "datacatalog.googleapis.com/TagTemplate" + } + ]; +} + +// Request message for +// [UpdateTagTemplate][google.cloud.datacatalog.v1.DataCatalog.UpdateTagTemplate]. +message UpdateTagTemplateRequest { + // Required. The template to update. The "name" field must be set. + TagTemplate tag_template = 1 [(google.api.field_behavior) = REQUIRED]; + + // The field mask specifies the parts of the template to overwrite. + // + // Allowed fields: + // + // * `display_name` + // + // If absent or empty, all of the allowed fields above will be updated. + google.protobuf.FieldMask update_mask = 2; +} + +// Request message for +// [DeleteTagTemplate][google.cloud.datacatalog.v1.DataCatalog.DeleteTagTemplate]. +message DeleteTagTemplateRequest { + // Required. The name of the tag template to delete. 
Example: + // + // * projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id} + string name = 1 [ + (google.api.field_behavior) = REQUIRED, + (google.api.resource_reference) = { + type: "datacatalog.googleapis.com/TagTemplate" + } + ]; + + // Required. Currently, this field must always be set to `true`. + // This confirms the deletion of any possible tags using this template. + // `force = false` will be supported in the future. + bool force = 2 [(google.api.field_behavior) = REQUIRED]; +} + +// Request message for +// [CreateTag][google.cloud.datacatalog.v1.DataCatalog.CreateTag]. +message CreateTagRequest { + // Required. The name of the resource to attach this tag to. Tags can be + // attached to Entries. Example: + // + // * projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id} + // + // Note that this Tag and its child resources may not actually be stored in + // the location in this name. + string parent = 1 [ + (google.api.field_behavior) = REQUIRED, + (google.api.resource_reference) = { type: "datacatalog.googleapis.com/Tag" } + ]; + + // Required. The tag to create. + Tag tag = 2 [(google.api.field_behavior) = REQUIRED]; +} + +// Request message for +// [UpdateTag][google.cloud.datacatalog.v1.DataCatalog.UpdateTag]. +message UpdateTagRequest { + // Required. The updated tag. The "name" field must be set. + Tag tag = 1 [(google.api.field_behavior) = REQUIRED]; + + // The fields to update on the Tag. If absent or empty, all modifiable fields + // are updated. Currently the only modifiable field is the field `fields`. + google.protobuf.FieldMask update_mask = 2; +} + +// Request message for +// [DeleteTag][google.cloud.datacatalog.v1.DataCatalog.DeleteTag]. +message DeleteTagRequest { + // Required. The name of the tag to delete. 
Example: + // + // * projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id}/tags/{tag_id} + string name = 1 [ + (google.api.field_behavior) = REQUIRED, + (google.api.resource_reference) = { + child_type: "datacatalog.googleapis.com/Tag" + } + ]; +} + +// Request message for +// [CreateTagTemplateField][google.cloud.datacatalog.v1.DataCatalog.CreateTagTemplateField]. +message CreateTagTemplateFieldRequest { + // Required. The name of the project and the template location + // [region](https://cloud.google.com/data-catalog/docs/concepts/regions). + // + // Example: + // + // * projects/{project_id}/locations/us-central1/tagTemplates/{tag_template_id} + string parent = 1 [ + (google.api.field_behavior) = REQUIRED, + (google.api.resource_reference) = { + type: "datacatalog.googleapis.com/TagTemplate" + } + ]; + + // Required. The ID of the tag template field to create. + // Field ids can contain letters (both uppercase and lowercase), numbers + // (0-9), underscores (_) and dashes (-). Field IDs must be at least 1 + // character long and at most 128 characters long. Field IDs must also be + // unique within their template. + string tag_template_field_id = 2 [(google.api.field_behavior) = REQUIRED]; + + // Required. The tag template field to create. + TagTemplateField tag_template_field = 3 + [(google.api.field_behavior) = REQUIRED]; +} + +// Request message for +// [UpdateTagTemplateField][google.cloud.datacatalog.v1.DataCatalog.UpdateTagTemplateField]. +message UpdateTagTemplateFieldRequest { + // Required. The name of the tag template field. Example: + // + // * projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id}/fields/{tag_template_field_id} + string name = 1 [ + (google.api.field_behavior) = REQUIRED, + (google.api.resource_reference) = { + type: "datacatalog.googleapis.com/TagTemplateField" + } + ]; + + // Required. The template to update. 
+ TagTemplateField tag_template_field = 2 + [(google.api.field_behavior) = REQUIRED]; + + // Optional. The field mask specifies the parts of the template to be updated. + // Allowed fields: + // + // * `display_name` + // * `type.enum_type` + // * `is_required` + // + // If `update_mask` is not set or empty, all of the allowed fields above will + // be updated. + // + // When updating an enum type, the provided values will be merged with the + // existing values. Therefore, enum values can only be added, existing enum + // values cannot be deleted nor renamed. Updating a template field from + // optional to required is NOT allowed. + google.protobuf.FieldMask update_mask = 3 + [(google.api.field_behavior) = OPTIONAL]; +} + +// Request message for +// [RenameTagTemplateField][google.cloud.datacatalog.v1.DataCatalog.RenameTagTemplateField]. +message RenameTagTemplateFieldRequest { + // Required. The name of the tag template. Example: + // + // * projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id}/fields/{tag_template_field_id} + string name = 1 [ + (google.api.field_behavior) = REQUIRED, + (google.api.resource_reference) = { + type: "datacatalog.googleapis.com/TagTemplateField" + } + ]; + + // Required. The new ID of this tag template field. For example, + // `my_new_field`. + string new_tag_template_field_id = 2 [(google.api.field_behavior) = REQUIRED]; +} + +// Request message for +// [DeleteTagTemplateField][google.cloud.datacatalog.v1.DataCatalog.DeleteTagTemplateField]. +message DeleteTagTemplateFieldRequest { + // Required. The name of the tag template field to delete. Example: + // + // * projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id}/fields/{tag_template_field_id} + string name = 1 [ + (google.api.field_behavior) = REQUIRED, + (google.api.resource_reference) = { + type: "datacatalog.googleapis.com/TagTemplateField" + } + ]; + + // Required. Currently, this field must always be set to `true`. 
+ // This confirms the deletion of this field from any tags using this field. + // `force = false` will be supported in the future. + bool force = 2 [(google.api.field_behavior) = REQUIRED]; +} + +// Request message for +// [ListTags][google.cloud.datacatalog.v1.DataCatalog.ListTags]. +message ListTagsRequest { + // Required. The name of the Data Catalog resource to list the tags of. The + // resource could be an [Entry][google.cloud.datacatalog.v1.Entry] or an + // [EntryGroup][google.cloud.datacatalog.v1.EntryGroup]. + // + // Examples: + // + // * projects/{project_id}/locations/{location}/entryGroups/{entry_group_id} + // * projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id} + string parent = 1 [ + (google.api.field_behavior) = REQUIRED, + (google.api.resource_reference) = { + child_type: "datacatalog.googleapis.com/Tag" + } + ]; + + // The maximum number of tags to return. Default is 10. Max limit is 1000. + int32 page_size = 2; + + // Token that specifies which page is requested. If empty, the first page is + // returned. + string page_token = 3; +} + +// Response message for +// [ListTags][google.cloud.datacatalog.v1.DataCatalog.ListTags]. +message ListTagsResponse { + // [Tag][google.cloud.datacatalog.v1.Tag] details. + repeated Tag tags = 1; + + // Token to retrieve the next page of results. It is set to empty if no items + // remain in results. + string next_page_token = 2; +} + +// Request message for +// [ListEntries][google.cloud.datacatalog.v1.DataCatalog.ListEntries]. +message ListEntriesRequest { + // Required. The name of the entry group that contains the entries, which can + // be provided in URL format. Example: + // + // * projects/{project_id}/locations/{location}/entryGroups/{entry_group_id} + string parent = 1 [ + (google.api.field_behavior) = REQUIRED, + (google.api.resource_reference) = { + type: "datacatalog.googleapis.com/EntryGroup" + } + ]; + + // The maximum number of items to return. 
Default is 10. Max limit is 1000. + // Throws an invalid argument for `page_size > 1000`. + int32 page_size = 2; + + // Token that specifies which page is requested. If empty, the first page is + // returned. + string page_token = 3; + + // The fields to return for each Entry. If not set or empty, all + // fields are returned. + // For example, setting read_mask to contain only one path "name" will cause + // ListEntries to return a list of Entries with only "name" field. + google.protobuf.FieldMask read_mask = 4; +} + +// Response message for +// [ListEntries][google.cloud.datacatalog.v1.DataCatalog.ListEntries]. +message ListEntriesResponse { + // Entry details. + repeated Entry entries = 1; + + // Token to retrieve the next page of results. It is set to empty if no items + // remain in results. + string next_page_token = 2; +} + +// Entry resources in Data Catalog can be of different types e.g. a BigQuery +// Table entry is of type `TABLE`. This enum describes all the possible types +// Data Catalog contains. +enum EntryType { + // Default unknown type. + ENTRY_TYPE_UNSPECIFIED = 0; + + // Output only. The type of entry that has a GoogleSQL schema, including + // logical views. + TABLE = 2; + + // Output only. The type of models, examples include + // https://cloud.google.com/bigquery-ml/docs/bigqueryml-intro + MODEL = 5; + + // Output only. An entry type which is used for streaming entries. Example: + // Pub/Sub topic. + DATA_STREAM = 3; + + // An entry type which is a set of files or objects. Example: + // Cloud Storage fileset. + FILESET = 4; +} diff --git a/google/cloud/datacatalog_v1/proto/datacatalog_pb2.py b/google/cloud/datacatalog_v1/proto/datacatalog_pb2.py deleted file mode 100644 index ef10d680..00000000 --- a/google/cloud/datacatalog_v1/proto/datacatalog_pb2.py +++ /dev/null @@ -1,3906 +0,0 @@ -# -*- coding: utf-8 -*- -# Generated by the protocol buffer compiler. DO NOT EDIT! 
-# source: google/cloud/datacatalog_v1/proto/datacatalog.proto - -from google.protobuf.internal import enum_type_wrapper -from google.protobuf import descriptor as _descriptor -from google.protobuf import message as _message -from google.protobuf import reflection as _reflection -from google.protobuf import symbol_database as _symbol_database - -# @@protoc_insertion_point(imports) - -_sym_db = _symbol_database.Default() - - -from google.api import annotations_pb2 as google_dot_api_dot_annotations__pb2 -from google.api import client_pb2 as google_dot_api_dot_client__pb2 -from google.api import field_behavior_pb2 as google_dot_api_dot_field__behavior__pb2 -from google.api import resource_pb2 as google_dot_api_dot_resource__pb2 -from google.cloud.datacatalog_v1.proto import ( - common_pb2 as google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_common__pb2, -) -from google.cloud.datacatalog_v1.proto import ( - gcs_fileset_spec_pb2 as google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_gcs__fileset__spec__pb2, -) -from google.cloud.datacatalog_v1.proto import ( - schema_pb2 as google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_schema__pb2, -) -from google.cloud.datacatalog_v1.proto import ( - search_pb2 as google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_search__pb2, -) -from google.cloud.datacatalog_v1.proto import ( - table_spec_pb2 as google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_table__spec__pb2, -) -from google.cloud.datacatalog_v1.proto import ( - tags_pb2 as google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_tags__pb2, -) -from google.cloud.datacatalog_v1.proto import ( - timestamps_pb2 as google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_timestamps__pb2, -) -from google.iam.v1 import iam_policy_pb2 as google_dot_iam_dot_v1_dot_iam__policy__pb2 -from google.iam.v1 import policy_pb2 as google_dot_iam_dot_v1_dot_policy__pb2 -from google.protobuf import empty_pb2 as google_dot_protobuf_dot_empty__pb2 -from google.protobuf import field_mask_pb2 as 
google_dot_protobuf_dot_field__mask__pb2 - - -DESCRIPTOR = _descriptor.FileDescriptor( - name="google/cloud/datacatalog_v1/proto/datacatalog.proto", - package="google.cloud.datacatalog.v1", - syntax="proto3", - serialized_options=b"\n\037com.google.cloud.datacatalog.v1P\001ZFgoogle.golang.org/genproto/googleapis/cloud/datacatalog/v1;datacatalog\370\001\001\252\002\033Google.Cloud.DataCatalog.V1\312\002\033Google\\Cloud\\DataCatalog\\V1\352\002\036Google::Cloud::DataCatalog::V1", - create_key=_descriptor._internal_create_key, - serialized_pb=b'\n3google/cloud/datacatalog_v1/proto/datacatalog.proto\x12\x1bgoogle.cloud.datacatalog.v1\x1a\x1cgoogle/api/annotations.proto\x1a\x17google/api/client.proto\x1a\x1fgoogle/api/field_behavior.proto\x1a\x19google/api/resource.proto\x1a.google/cloud/datacatalog_v1/proto/common.proto\x1a\x38google/cloud/datacatalog_v1/proto/gcs_fileset_spec.proto\x1a.google/cloud/datacatalog_v1/proto/schema.proto\x1a.google/cloud/datacatalog_v1/proto/search.proto\x1a\x32google/cloud/datacatalog_v1/proto/table_spec.proto\x1a,google/cloud/datacatalog_v1/proto/tags.proto\x1a\x32google/cloud/datacatalog_v1/proto/timestamps.proto\x1a\x1egoogle/iam/v1/iam_policy.proto\x1a\x1agoogle/iam/v1/policy.proto\x1a\x1bgoogle/protobuf/empty.proto\x1a google/protobuf/field_mask.proto"\xbd\x02\n\x14SearchCatalogRequest\x12K\n\x05scope\x18\x06 \x01(\x0b\x32\x37.google.cloud.datacatalog.v1.SearchCatalogRequest.ScopeB\x03\xe0\x41\x02\x12\x12\n\x05query\x18\x01 \x01(\tB\x03\xe0\x41\x02\x12\x11\n\tpage_size\x18\x02 \x01(\x05\x12\x17\n\npage_token\x18\x03 \x01(\tB\x03\xe0\x41\x01\x12\x10\n\x08order_by\x18\x05 \x01(\t\x1a\x85\x01\n\x05Scope\x12\x17\n\x0finclude_org_ids\x18\x02 \x03(\t\x12\x1b\n\x13include_project_ids\x18\x03 \x03(\t\x12#\n\x1binclude_gcp_public_datasets\x18\x07 \x01(\x08\x12!\n\x14restricted_locations\x18\x10 \x03(\tB\x03\xe0\x41\x01"\x88\x01\n\x15SearchCatalogResponse\x12\x41\n\x07results\x18\x01 
\x03(\x0b\x32\x30.google.cloud.datacatalog.v1.SearchCatalogResult\x12\x17\n\x0fnext_page_token\x18\x03 \x01(\t\x12\x13\n\x0bunreachable\x18\x06 \x03(\t"\xb3\x01\n\x17\x43reateEntryGroupRequest\x12=\n\x06parent\x18\x01 \x01(\tB-\xe0\x41\x02\xfa\x41\'\x12%datacatalog.googleapis.com/EntryGroup\x12\x1b\n\x0e\x65ntry_group_id\x18\x03 \x01(\tB\x03\xe0\x41\x02\x12<\n\x0b\x65ntry_group\x18\x02 \x01(\x0b\x32\'.google.cloud.datacatalog.v1.EntryGroup"\x8d\x01\n\x17UpdateEntryGroupRequest\x12\x41\n\x0b\x65ntry_group\x18\x01 \x01(\x0b\x32\'.google.cloud.datacatalog.v1.EntryGroupB\x03\xe0\x41\x02\x12/\n\x0bupdate_mask\x18\x02 \x01(\x0b\x32\x1a.google.protobuf.FieldMask"\x82\x01\n\x14GetEntryGroupRequest\x12;\n\x04name\x18\x01 \x01(\tB-\xe0\x41\x02\xfa\x41\'\n%datacatalog.googleapis.com/EntryGroup\x12-\n\tread_mask\x18\x02 \x01(\x0b\x32\x1a.google.protobuf.FieldMask"j\n\x17\x44\x65leteEntryGroupRequest\x12;\n\x04name\x18\x01 \x01(\tB-\xe0\x41\x02\xfa\x41\'\n%datacatalog.googleapis.com/EntryGroup\x12\x12\n\x05\x66orce\x18\x02 \x01(\x08\x42\x03\xe0\x41\x01"\x88\x01\n\x16ListEntryGroupsRequest\x12=\n\x06parent\x18\x01 \x01(\tB-\xe0\x41\x02\xfa\x41\'\n%datacatalog.googleapis.com/EntryGroup\x12\x16\n\tpage_size\x18\x02 \x01(\x05\x42\x03\xe0\x41\x01\x12\x17\n\npage_token\x18\x03 \x01(\tB\x03\xe0\x41\x01"q\n\x17ListEntryGroupsResponse\x12=\n\x0c\x65ntry_groups\x18\x01 \x03(\x0b\x32\'.google.cloud.datacatalog.v1.EntryGroup\x12\x17\n\x0fnext_page_token\x18\x02 \x01(\t"\xa2\x01\n\x12\x43reateEntryRequest\x12=\n\x06parent\x18\x01 \x01(\tB-\xe0\x41\x02\xfa\x41\'\n%datacatalog.googleapis.com/EntryGroup\x12\x15\n\x08\x65ntry_id\x18\x03 \x01(\tB\x03\xe0\x41\x02\x12\x36\n\x05\x65ntry\x18\x02 \x01(\x0b\x32".google.cloud.datacatalog.v1.EntryB\x03\xe0\x41\x02"}\n\x12UpdateEntryRequest\x12\x36\n\x05\x65ntry\x18\x01 \x01(\x0b\x32".google.cloud.datacatalog.v1.EntryB\x03\xe0\x41\x02\x12/\n\x0bupdate_mask\x18\x02 
\x01(\x0b\x32\x1a.google.protobuf.FieldMask"L\n\x12\x44\x65leteEntryRequest\x12\x36\n\x04name\x18\x01 \x01(\tB(\xe0\x41\x02\xfa\x41"\n datacatalog.googleapis.com/Entry"I\n\x0fGetEntryRequest\x12\x36\n\x04name\x18\x01 \x01(\tB(\xe0\x41\x02\xfa\x41"\n datacatalog.googleapis.com/Entry"V\n\x12LookupEntryRequest\x12\x19\n\x0flinked_resource\x18\x01 \x01(\tH\x00\x12\x16\n\x0csql_resource\x18\x03 \x01(\tH\x00\x42\r\n\x0btarget_name"\xe7\x06\n\x05\x45ntry\x12\x38\n\x04name\x18\x01 \x01(\tB*\xfa\x41\'\n%datacatalog.googleapis.com/EntryGroup\x12\x17\n\x0flinked_resource\x18\t \x01(\t\x12\x36\n\x04type\x18\x02 \x01(\x0e\x32&.google.cloud.datacatalog.v1.EntryTypeH\x00\x12\x1d\n\x13user_specified_type\x18\x10 \x01(\tH\x00\x12O\n\x11integrated_system\x18\x11 \x01(\x0e\x32-.google.cloud.datacatalog.v1.IntegratedSystemB\x03\xe0\x41\x03H\x01\x12\x1f\n\x15user_specified_system\x18\x12 \x01(\tH\x01\x12G\n\x10gcs_fileset_spec\x18\x06 \x01(\x0b\x32+.google.cloud.datacatalog.v1.GcsFilesetSpecH\x02\x12M\n\x13\x62igquery_table_spec\x18\x0c \x01(\x0b\x32..google.cloud.datacatalog.v1.BigQueryTableSpecH\x02\x12Z\n\x1a\x62igquery_date_sharded_spec\x18\x0f \x01(\x0b\x32\x34.google.cloud.datacatalog.v1.BigQueryDateShardedSpecH\x02\x12\x14\n\x0c\x64isplay_name\x18\x03 \x01(\t\x12\x13\n\x0b\x64\x65scription\x18\x04 \x01(\t\x12\x33\n\x06schema\x18\x05 \x01(\x0b\x32#.google.cloud.datacatalog.v1.Schema\x12O\n\x18source_system_timestamps\x18\x07 \x01(\x0b\x32-.google.cloud.datacatalog.v1.SystemTimestamps:x\xea\x41u\n datacatalog.googleapis.com/Entry\x12Qprojects/{project}/locations/{location}/entryGroups/{entry_group}/entries/{entry}B\x0c\n\nentry_typeB\x08\n\x06systemB\x0b\n\ttype_spec"\x89\x02\n\nEntryGroup\x12\x0c\n\x04name\x18\x01 \x01(\t\x12\x14\n\x0c\x64isplay_name\x18\x02 \x01(\t\x12\x13\n\x0b\x64\x65scription\x18\x03 \x01(\t\x12S\n\x17\x64\x61ta_catalog_timestamps\x18\x04 
\x01(\x0b\x32-.google.cloud.datacatalog.v1.SystemTimestampsB\x03\xe0\x41\x03:m\xea\x41j\n%datacatalog.googleapis.com/EntryGroup\x12\x41projects/{project}/locations/{location}/entryGroups/{entry_group}"\xbd\x01\n\x18\x43reateTagTemplateRequest\x12>\n\x06parent\x18\x01 \x01(\tB.\xe0\x41\x02\xfa\x41(\x12&datacatalog.googleapis.com/TagTemplate\x12\x1c\n\x0ftag_template_id\x18\x03 \x01(\tB\x03\xe0\x41\x02\x12\x43\n\x0ctag_template\x18\x02 \x01(\x0b\x32(.google.cloud.datacatalog.v1.TagTemplateB\x03\xe0\x41\x02"U\n\x15GetTagTemplateRequest\x12<\n\x04name\x18\x01 \x01(\tB.\xe0\x41\x02\xfa\x41(\n&datacatalog.googleapis.com/TagTemplate"\x90\x01\n\x18UpdateTagTemplateRequest\x12\x43\n\x0ctag_template\x18\x01 \x01(\x0b\x32(.google.cloud.datacatalog.v1.TagTemplateB\x03\xe0\x41\x02\x12/\n\x0bupdate_mask\x18\x02 \x01(\x0b\x32\x1a.google.protobuf.FieldMask"l\n\x18\x44\x65leteTagTemplateRequest\x12<\n\x04name\x18\x01 \x01(\tB.\xe0\x41\x02\xfa\x41(\n&datacatalog.googleapis.com/TagTemplate\x12\x12\n\x05\x66orce\x18\x02 \x01(\x08\x42\x03\xe0\x41\x02"~\n\x10\x43reateTagRequest\x12\x36\n\x06parent\x18\x01 \x01(\tB&\xe0\x41\x02\xfa\x41 \n\x1e\x64\x61tacatalog.googleapis.com/Tag\x12\x32\n\x03tag\x18\x02 \x01(\x0b\x32 .google.cloud.datacatalog.v1.TagB\x03\xe0\x41\x02"w\n\x10UpdateTagRequest\x12\x32\n\x03tag\x18\x01 \x01(\x0b\x32 .google.cloud.datacatalog.v1.TagB\x03\xe0\x41\x02\x12/\n\x0bupdate_mask\x18\x02 \x01(\x0b\x32\x1a.google.protobuf.FieldMask"H\n\x10\x44\x65leteTagRequest\x12\x34\n\x04name\x18\x01 \x01(\tB&\xe0\x41\x02\xfa\x41 \x12\x1e\x64\x61tacatalog.googleapis.com/Tag"\xd3\x01\n\x1d\x43reateTagTemplateFieldRequest\x12>\n\x06parent\x18\x01 \x01(\tB.\xe0\x41\x02\xfa\x41(\n&datacatalog.googleapis.com/TagTemplate\x12"\n\x15tag_template_field_id\x18\x02 \x01(\tB\x03\xe0\x41\x02\x12N\n\x12tag_template_field\x18\x03 \x01(\x0b\x32-.google.cloud.datacatalog.v1.TagTemplateFieldB\x03\xe0\x41\x02"\xe8\x01\n\x1dUpdateTagTemplateFieldRequest\x12\x41\n\x04name\x18\x01 
\x01(\tB3\xe0\x41\x02\xfa\x41-\n+datacatalog.googleapis.com/TagTemplateField\x12N\n\x12tag_template_field\x18\x02 \x01(\x0b\x32-.google.cloud.datacatalog.v1.TagTemplateFieldB\x03\xe0\x41\x02\x12\x34\n\x0bupdate_mask\x18\x03 \x01(\x0b\x32\x1a.google.protobuf.FieldMaskB\x03\xe0\x41\x01"\x8a\x01\n\x1dRenameTagTemplateFieldRequest\x12\x41\n\x04name\x18\x01 \x01(\tB3\xe0\x41\x02\xfa\x41-\n+datacatalog.googleapis.com/TagTemplateField\x12&\n\x19new_tag_template_field_id\x18\x02 \x01(\tB\x03\xe0\x41\x02"v\n\x1d\x44\x65leteTagTemplateFieldRequest\x12\x41\n\x04name\x18\x01 \x01(\tB3\xe0\x41\x02\xfa\x41-\n+datacatalog.googleapis.com/TagTemplateField\x12\x12\n\x05\x66orce\x18\x02 \x01(\x08\x42\x03\xe0\x41\x02"p\n\x0fListTagsRequest\x12\x36\n\x06parent\x18\x01 \x01(\tB&\xe0\x41\x02\xfa\x41 \x12\x1e\x64\x61tacatalog.googleapis.com/Tag\x12\x11\n\tpage_size\x18\x02 \x01(\x05\x12\x12\n\npage_token\x18\x03 \x01(\t"[\n\x10ListTagsResponse\x12.\n\x04tags\x18\x01 \x03(\x0b\x32 .google.cloud.datacatalog.v1.Tag\x12\x17\n\x0fnext_page_token\x18\x02 \x01(\t"\xa9\x01\n\x12ListEntriesRequest\x12=\n\x06parent\x18\x01 \x01(\tB-\xe0\x41\x02\xfa\x41\'\n%datacatalog.googleapis.com/EntryGroup\x12\x11\n\tpage_size\x18\x02 \x01(\x05\x12\x12\n\npage_token\x18\x03 \x01(\t\x12-\n\tread_mask\x18\x04 \x01(\x0b\x32\x1a.google.protobuf.FieldMask"c\n\x13ListEntriesResponse\x12\x33\n\x07\x65ntries\x18\x01 \x03(\x0b\x32".google.cloud.datacatalog.v1.Entry\x12\x17\n\x0fnext_page_token\x18\x02 
\x01(\t*[\n\tEntryType\x12\x1a\n\x16\x45NTRY_TYPE_UNSPECIFIED\x10\x00\x12\t\n\x05TABLE\x10\x02\x12\t\n\x05MODEL\x10\x05\x12\x0f\n\x0b\x44\x41TA_STREAM\x10\x03\x12\x0b\n\x07\x46ILESET\x10\x04\x32\xad/\n\x0b\x44\x61taCatalog\x12\xa3\x01\n\rSearchCatalog\x12\x31.google.cloud.datacatalog.v1.SearchCatalogRequest\x1a\x32.google.cloud.datacatalog.v1.SearchCatalogResponse"+\x82\xd3\xe4\x93\x02\x17"\x12/v1/catalog:search:\x01*\xda\x41\x0bscope,query\x12\xdb\x01\n\x10\x43reateEntryGroup\x12\x34.google.cloud.datacatalog.v1.CreateEntryGroupRequest\x1a\'.google.cloud.datacatalog.v1.EntryGroup"h\x82\xd3\xe4\x93\x02>"//v1/{parent=projects/*/locations/*}/entryGroups:\x0b\x65ntry_group\xda\x41!parent,entry_group_id,entry_group\x12\xbc\x01\n\rGetEntryGroup\x12\x31.google.cloud.datacatalog.v1.GetEntryGroupRequest\x1a\'.google.cloud.datacatalog.v1.EntryGroup"O\x82\xd3\xe4\x93\x02\x31\x12//v1/{name=projects/*/locations/*/entryGroups/*}\xda\x41\x04name\xda\x41\x0ename,read_mask\x12\xeb\x01\n\x10UpdateEntryGroup\x12\x34.google.cloud.datacatalog.v1.UpdateEntryGroupRequest\x1a\'.google.cloud.datacatalog.v1.EntryGroup"x\x82\xd3\xe4\x93\x02J2;/v1/{entry_group.name=projects/*/locations/*/entryGroups/*}:\x0b\x65ntry_group\xda\x41\x0b\x65ntry_group\xda\x41\x17\x65ntry_group,update_mask\x12\xa0\x01\n\x10\x44\x65leteEntryGroup\x12\x34.google.cloud.datacatalog.v1.DeleteEntryGroupRequest\x1a\x16.google.protobuf.Empty">\x82\xd3\xe4\x93\x02\x31*//v1/{name=projects/*/locations/*/entryGroups/*}\xda\x41\x04name\x12\xbe\x01\n\x0fListEntryGroups\x12\x33.google.cloud.datacatalog.v1.ListEntryGroupsRequest\x1a\x34.google.cloud.datacatalog.v1.ListEntryGroupsResponse"@\x82\xd3\xe4\x93\x02\x31\x12//v1/{parent=projects/*/locations/*}/entryGroups\xda\x41\x06parent\x12\xc4\x01\n\x0b\x43reateEntry\x12/.google.cloud.datacatalog.v1.CreateEntryRequest\x1a".google.cloud.datacatalog.v1.Entry"`\x82\xd3\xe4\x93\x02\x42"9/v1/{parent=projects/*/locations/*/entryGroups/*}/entries:\x05\x65ntry\xda\x41\x15parent,entry_id,entry\
x12\xce\x01\n\x0bUpdateEntry\x12/.google.cloud.datacatalog.v1.UpdateEntryRequest\x1a".google.cloud.datacatalog.v1.Entry"j\x82\xd3\xe4\x93\x02H2?/v1/{entry.name=projects/*/locations/*/entryGroups/*/entries/*}:\x05\x65ntry\xda\x41\x05\x65ntry\xda\x41\x11\x65ntry,update_mask\x12\xa0\x01\n\x0b\x44\x65leteEntry\x12/.google.cloud.datacatalog.v1.DeleteEntryRequest\x1a\x16.google.protobuf.Empty"H\x82\xd3\xe4\x93\x02;*9/v1/{name=projects/*/locations/*/entryGroups/*/entries/*}\xda\x41\x04name\x12\xa6\x01\n\x08GetEntry\x12,.google.cloud.datacatalog.v1.GetEntryRequest\x1a".google.cloud.datacatalog.v1.Entry"H\x82\xd3\xe4\x93\x02;\x12\x39/v1/{name=projects/*/locations/*/entryGroups/*/entries/*}\xda\x41\x04name\x12~\n\x0bLookupEntry\x12/.google.cloud.datacatalog.v1.LookupEntryRequest\x1a".google.cloud.datacatalog.v1.Entry"\x1a\x82\xd3\xe4\x93\x02\x14\x12\x12/v1/entries:lookup\x12\xbc\x01\n\x0bListEntries\x12/.google.cloud.datacatalog.v1.ListEntriesRequest\x1a\x30.google.cloud.datacatalog.v1.ListEntriesResponse"J\x82\xd3\xe4\x93\x02;\x12\x39/v1/{parent=projects/*/locations/*/entryGroups/*}/entries\xda\x41\x06parent\x12\xe2\x01\n\x11\x43reateTagTemplate\x12\x35.google.cloud.datacatalog.v1.CreateTagTemplateRequest\x1a(.google.cloud.datacatalog.v1.TagTemplate"l\x82\xd3\xe4\x93\x02@"0/v1/{parent=projects/*/locations/*}/tagTemplates:\x0ctag_template\xda\x41#parent,tag_template_id,tag_template\x12\xaf\x01\n\x0eGetTagTemplate\x12\x32.google.cloud.datacatalog.v1.GetTagTemplateRequest\x1a(.google.cloud.datacatalog.v1.TagTemplate"?\x82\xd3\xe4\x93\x02\x32\x12\x30/v1/{name=projects/*/locations/*/tagTemplates/*}\xda\x41\x04name\x12\xf3\x01\n\x11UpdateTagTemplate\x12\x35.google.cloud.datacatalog.v1.UpdateTagTemplateRequest\x1a(.google.cloud.datacatalog.v1.TagTemplate"}\x82\xd3\xe4\x93\x02M2=/v1/{tag_template.name=projects/*/locations/*/tagTemplates/*}:\x0ctag_template\xda\x41\x0ctag_template\xda\x41\x18tag_template,update_mask\x12\xa9\x01\n\x11\x44\x65leteTagTemplate\x12\x35.google.cloud.dataca
talog.v1.DeleteTagTemplateRequest\x1a\x16.google.protobuf.Empty"E\x82\xd3\xe4\x93\x02\x32*0/v1/{name=projects/*/locations/*/tagTemplates/*}\xda\x41\nname,force\x12\x8d\x02\n\x16\x43reateTagTemplateField\x12:.google.cloud.datacatalog.v1.CreateTagTemplateFieldRequest\x1a-.google.cloud.datacatalog.v1.TagTemplateField"\x87\x01\x82\xd3\xe4\x93\x02O"9/v1/{parent=projects/*/locations/*/tagTemplates/*}/fields:\x12tag_template_field\xda\x41/parent,tag_template_field_id,tag_template_field\x12\x9b\x02\n\x16UpdateTagTemplateField\x12:.google.cloud.datacatalog.v1.UpdateTagTemplateFieldRequest\x1a-.google.cloud.datacatalog.v1.TagTemplateField"\x95\x01\x82\xd3\xe4\x93\x02O29/v1/{name=projects/*/locations/*/tagTemplates/*/fields/*}:\x12tag_template_field\xda\x41\x17name,tag_template_field\xda\x41#name,tag_template_field,update_mask\x12\xf1\x01\n\x16RenameTagTemplateField\x12:.google.cloud.datacatalog.v1.RenameTagTemplateFieldRequest\x1a-.google.cloud.datacatalog.v1.TagTemplateField"l\x82\xd3\xe4\x93\x02\x45"@/v1/{name=projects/*/locations/*/tagTemplates/*/fields/*}:rename:\x01*\xda\x41\x1ename,new_tag_template_field_id\x12\xbc\x01\n\x16\x44\x65leteTagTemplateField\x12:.google.cloud.datacatalog.v1.DeleteTagTemplateFieldRequest\x1a\x16.google.protobuf.Empty"N\x82\xd3\xe4\x93\x02;*9/v1/{name=projects/*/locations/*/tagTemplates/*/fields/*}\xda\x41\nname,force\x12\xf9\x01\n\tCreateTag\x12-.google.cloud.datacatalog.v1.CreateTagRequest\x1a .google.cloud.datacatalog.v1.Tag"\x9a\x01\x82\xd3\xe4\x93\x02\x86\x01"@/v1/{parent=projects/*/locations/*/entryGroups/*/entries/*}/tags:\x03tagZ="6/v1/{parent=projects/*/locations/*/entryGroups/*}/tags:\x03tag\xda\x41\nparent,tag\x12\x8c\x02\n\tUpdateTag\x12-.google.cloud.datacatalog.v1.UpdateTagRequest\x1a 
.google.cloud.datacatalog.v1.Tag"\xad\x01\x82\xd3\xe4\x93\x02\x8e\x01\x32\x44/v1/{tag.name=projects/*/locations/*/entryGroups/*/entries/*/tags/*}:\x03tagZA2:/v1/{tag.name=projects/*/locations/*/entryGroups/*/tags/*}:\x03tag\xda\x41\x03tag\xda\x41\x0ftag,update_mask\x12\xde\x01\n\tDeleteTag\x12-.google.cloud.datacatalog.v1.DeleteTagRequest\x1a\x16.google.protobuf.Empty"\x89\x01\x82\xd3\xe4\x93\x02|*@/v1/{name=projects/*/locations/*/entryGroups/*/entries/*/tags/*}Z8*6/v1/{name=projects/*/locations/*/entryGroups/*/tags/*}\xda\x41\x04name\x12\xf5\x01\n\x08ListTags\x12,.google.cloud.datacatalog.v1.ListTagsRequest\x1a-.google.cloud.datacatalog.v1.ListTagsResponse"\x8b\x01\x82\xd3\xe4\x93\x02|\x12@/v1/{parent=projects/*/locations/*/entryGroups/*/entries/*}/tagsZ8\x12\x36/v1/{parent=projects/*/locations/*/entryGroups/*}/tags\xda\x41\x06parent\x12\xf2\x01\n\x0cSetIamPolicy\x12".google.iam.v1.SetIamPolicyRequest\x1a\x15.google.iam.v1.Policy"\xa6\x01\x82\xd3\xe4\x93\x02\x8d\x01"A/v1/{resource=projects/*/locations/*/tagTemplates/*}:setIamPolicy:\x01*ZE"@/v1/{resource=projects/*/locations/*/entryGroups/*}:setIamPolicy:\x01*\xda\x41\x0fresource,policy\x12\xbc\x02\n\x0cGetIamPolicy\x12".google.iam.v1.GetIamPolicyRequest\x1a\x15.google.iam.v1.Policy"\xf0\x01\x82\xd3\xe4\x93\x02\xde\x01"A/v1/{resource=projects/*/locations/*/tagTemplates/*}:getIamPolicy:\x01*ZE"@/v1/{resource=projects/*/locations/*/entryGroups/*}:getIamPolicy:\x01*ZO"J/v1/{resource=projects/*/locations/*/entryGroups/*/entries/*}:getIamPolicy:\x01*\xda\x41\x08resource\x12\xe3\x02\n\x12TestIamPermissions\x12(.google.iam.v1.TestIamPermissionsRequest\x1a).google.iam.v1.TestIamPermissionsResponse"\xf7\x01\x82\xd3\xe4\x93\x02\xf0\x01"G/v1/{resource=projects/*/locations/*/tagTemplates/*}:testIamPermissions:\x01*ZK"F/v1/{resource=projects/*/locations/*/entryGroups/*}:testIamPermissions:\x01*ZU"P/v1/{resource=projects/*/locations/*/entryGroups/*/entries/*}:testIamPermissions:\x01*\x1aN\xca\x41\x1a\x64\x61tacatalog.googleapis.
com\xd2\x41.https://www.googleapis.com/auth/cloud-platformB\xcb\x01\n\x1f\x63om.google.cloud.datacatalog.v1P\x01ZFgoogle.golang.org/genproto/googleapis/cloud/datacatalog/v1;datacatalog\xf8\x01\x01\xaa\x02\x1bGoogle.Cloud.DataCatalog.V1\xca\x02\x1bGoogle\\Cloud\\DataCatalog\\V1\xea\x02\x1eGoogle::Cloud::DataCatalog::V1b\x06proto3', - dependencies=[ - google_dot_api_dot_annotations__pb2.DESCRIPTOR, - google_dot_api_dot_client__pb2.DESCRIPTOR, - google_dot_api_dot_field__behavior__pb2.DESCRIPTOR, - google_dot_api_dot_resource__pb2.DESCRIPTOR, - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_common__pb2.DESCRIPTOR, - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_gcs__fileset__spec__pb2.DESCRIPTOR, - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_schema__pb2.DESCRIPTOR, - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_search__pb2.DESCRIPTOR, - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_table__spec__pb2.DESCRIPTOR, - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_tags__pb2.DESCRIPTOR, - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_timestamps__pb2.DESCRIPTOR, - google_dot_iam_dot_v1_dot_iam__policy__pb2.DESCRIPTOR, - google_dot_iam_dot_v1_dot_policy__pb2.DESCRIPTOR, - google_dot_protobuf_dot_empty__pb2.DESCRIPTOR, - google_dot_protobuf_dot_field__mask__pb2.DESCRIPTOR, - ], -) - -_ENTRYTYPE = _descriptor.EnumDescriptor( - name="EntryType", - full_name="google.cloud.datacatalog.v1.EntryType", - filename=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - values=[ - _descriptor.EnumValueDescriptor( - name="ENTRY_TYPE_UNSPECIFIED", - index=0, - number=0, - serialized_options=None, - type=None, - create_key=_descriptor._internal_create_key, - ), - _descriptor.EnumValueDescriptor( - name="TABLE", - index=1, - number=2, - serialized_options=None, - type=None, - create_key=_descriptor._internal_create_key, - ), - _descriptor.EnumValueDescriptor( - name="MODEL", - index=2, - number=5, - serialized_options=None, - type=None, 
- create_key=_descriptor._internal_create_key, - ), - _descriptor.EnumValueDescriptor( - name="DATA_STREAM", - index=3, - number=3, - serialized_options=None, - type=None, - create_key=_descriptor._internal_create_key, - ), - _descriptor.EnumValueDescriptor( - name="FILESET", - index=4, - number=4, - serialized_options=None, - type=None, - create_key=_descriptor._internal_create_key, - ), - ], - containing_type=None, - serialized_options=None, - serialized_start=5678, - serialized_end=5769, -) -_sym_db.RegisterEnumDescriptor(_ENTRYTYPE) - -EntryType = enum_type_wrapper.EnumTypeWrapper(_ENTRYTYPE) -ENTRY_TYPE_UNSPECIFIED = 0 -TABLE = 2 -MODEL = 5 -DATA_STREAM = 3 -FILESET = 4 - - -_SEARCHCATALOGREQUEST_SCOPE = _descriptor.Descriptor( - name="Scope", - full_name="google.cloud.datacatalog.v1.SearchCatalogRequest.Scope", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="include_org_ids", - full_name="google.cloud.datacatalog.v1.SearchCatalogRequest.Scope.include_org_ids", - index=0, - number=2, - type=9, - cpp_type=9, - label=3, - has_default_value=False, - default_value=[], - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="include_project_ids", - full_name="google.cloud.datacatalog.v1.SearchCatalogRequest.Scope.include_project_ids", - index=1, - number=3, - type=9, - cpp_type=9, - label=3, - has_default_value=False, - default_value=[], - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="include_gcp_public_datasets", - 
full_name="google.cloud.datacatalog.v1.SearchCatalogRequest.Scope.include_gcp_public_datasets", - index=2, - number=7, - type=8, - cpp_type=7, - label=1, - has_default_value=False, - default_value=False, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="restricted_locations", - full_name="google.cloud.datacatalog.v1.SearchCatalogRequest.Scope.restricted_locations", - index=3, - number=16, - type=9, - cpp_type=9, - label=3, - has_default_value=False, - default_value=[], - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\001", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=859, - serialized_end=992, -) - -_SEARCHCATALOGREQUEST = _descriptor.Descriptor( - name="SearchCatalogRequest", - full_name="google.cloud.datacatalog.v1.SearchCatalogRequest", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="scope", - full_name="google.cloud.datacatalog.v1.SearchCatalogRequest.scope", - index=0, - number=6, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="query", - full_name="google.cloud.datacatalog.v1.SearchCatalogRequest.query", - index=1, - number=1, - type=9, - cpp_type=9, - label=1, - 
has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="page_size", - full_name="google.cloud.datacatalog.v1.SearchCatalogRequest.page_size", - index=2, - number=2, - type=5, - cpp_type=1, - label=1, - has_default_value=False, - default_value=0, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="page_token", - full_name="google.cloud.datacatalog.v1.SearchCatalogRequest.page_token", - index=3, - number=3, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\001", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="order_by", - full_name="google.cloud.datacatalog.v1.SearchCatalogRequest.order_by", - index=4, - number=5, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[_SEARCHCATALOGREQUEST_SCOPE], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=675, - serialized_end=992, -) - - -_SEARCHCATALOGRESPONSE = _descriptor.Descriptor( - name="SearchCatalogResponse", - 
full_name="google.cloud.datacatalog.v1.SearchCatalogResponse", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="results", - full_name="google.cloud.datacatalog.v1.SearchCatalogResponse.results", - index=0, - number=1, - type=11, - cpp_type=10, - label=3, - has_default_value=False, - default_value=[], - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="next_page_token", - full_name="google.cloud.datacatalog.v1.SearchCatalogResponse.next_page_token", - index=1, - number=3, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="unreachable", - full_name="google.cloud.datacatalog.v1.SearchCatalogResponse.unreachable", - index=2, - number=6, - type=9, - cpp_type=9, - label=3, - has_default_value=False, - default_value=[], - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=995, - serialized_end=1131, -) - - -_CREATEENTRYGROUPREQUEST = _descriptor.Descriptor( - name="CreateEntryGroupRequest", - full_name="google.cloud.datacatalog.v1.CreateEntryGroupRequest", - filename=None, - file=DESCRIPTOR, - containing_type=None, - 
create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="parent", - full_name="google.cloud.datacatalog.v1.CreateEntryGroupRequest.parent", - index=0, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002\372A'\022%datacatalog.googleapis.com/EntryGroup", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="entry_group_id", - full_name="google.cloud.datacatalog.v1.CreateEntryGroupRequest.entry_group_id", - index=1, - number=3, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="entry_group", - full_name="google.cloud.datacatalog.v1.CreateEntryGroupRequest.entry_group", - index=2, - number=2, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=1134, - serialized_end=1313, -) - - -_UPDATEENTRYGROUPREQUEST = _descriptor.Descriptor( - name="UpdateEntryGroupRequest", - full_name="google.cloud.datacatalog.v1.UpdateEntryGroupRequest", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - 
_descriptor.FieldDescriptor( - name="entry_group", - full_name="google.cloud.datacatalog.v1.UpdateEntryGroupRequest.entry_group", - index=0, - number=1, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="update_mask", - full_name="google.cloud.datacatalog.v1.UpdateEntryGroupRequest.update_mask", - index=1, - number=2, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=1316, - serialized_end=1457, -) - - -_GETENTRYGROUPREQUEST = _descriptor.Descriptor( - name="GetEntryGroupRequest", - full_name="google.cloud.datacatalog.v1.GetEntryGroupRequest", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="name", - full_name="google.cloud.datacatalog.v1.GetEntryGroupRequest.name", - index=0, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002\372A'\n%datacatalog.googleapis.com/EntryGroup", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="read_mask", - 
full_name="google.cloud.datacatalog.v1.GetEntryGroupRequest.read_mask", - index=1, - number=2, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=1460, - serialized_end=1590, -) - - -_DELETEENTRYGROUPREQUEST = _descriptor.Descriptor( - name="DeleteEntryGroupRequest", - full_name="google.cloud.datacatalog.v1.DeleteEntryGroupRequest", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="name", - full_name="google.cloud.datacatalog.v1.DeleteEntryGroupRequest.name", - index=0, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002\372A'\n%datacatalog.googleapis.com/EntryGroup", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="force", - full_name="google.cloud.datacatalog.v1.DeleteEntryGroupRequest.force", - index=1, - number=2, - type=8, - cpp_type=7, - label=1, - has_default_value=False, - default_value=False, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\001", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - 
serialized_start=1592, - serialized_end=1698, -) - - -_LISTENTRYGROUPSREQUEST = _descriptor.Descriptor( - name="ListEntryGroupsRequest", - full_name="google.cloud.datacatalog.v1.ListEntryGroupsRequest", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="parent", - full_name="google.cloud.datacatalog.v1.ListEntryGroupsRequest.parent", - index=0, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002\372A'\n%datacatalog.googleapis.com/EntryGroup", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="page_size", - full_name="google.cloud.datacatalog.v1.ListEntryGroupsRequest.page_size", - index=1, - number=2, - type=5, - cpp_type=1, - label=1, - has_default_value=False, - default_value=0, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\001", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="page_token", - full_name="google.cloud.datacatalog.v1.ListEntryGroupsRequest.page_token", - index=2, - number=3, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\001", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=1701, - serialized_end=1837, -) - - -_LISTENTRYGROUPSRESPONSE = 
_descriptor.Descriptor( - name="ListEntryGroupsResponse", - full_name="google.cloud.datacatalog.v1.ListEntryGroupsResponse", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="entry_groups", - full_name="google.cloud.datacatalog.v1.ListEntryGroupsResponse.entry_groups", - index=0, - number=1, - type=11, - cpp_type=10, - label=3, - has_default_value=False, - default_value=[], - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="next_page_token", - full_name="google.cloud.datacatalog.v1.ListEntryGroupsResponse.next_page_token", - index=1, - number=2, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=1839, - serialized_end=1952, -) - - -_CREATEENTRYREQUEST = _descriptor.Descriptor( - name="CreateEntryRequest", - full_name="google.cloud.datacatalog.v1.CreateEntryRequest", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="parent", - full_name="google.cloud.datacatalog.v1.CreateEntryRequest.parent", - index=0, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - 
serialized_options=b"\340A\002\372A'\n%datacatalog.googleapis.com/EntryGroup", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="entry_id", - full_name="google.cloud.datacatalog.v1.CreateEntryRequest.entry_id", - index=1, - number=3, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="entry", - full_name="google.cloud.datacatalog.v1.CreateEntryRequest.entry", - index=2, - number=2, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=1955, - serialized_end=2117, -) - - -_UPDATEENTRYREQUEST = _descriptor.Descriptor( - name="UpdateEntryRequest", - full_name="google.cloud.datacatalog.v1.UpdateEntryRequest", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="entry", - full_name="google.cloud.datacatalog.v1.UpdateEntryRequest.entry", - index=0, - number=1, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - 
name="update_mask", - full_name="google.cloud.datacatalog.v1.UpdateEntryRequest.update_mask", - index=1, - number=2, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=2119, - serialized_end=2244, -) - - -_DELETEENTRYREQUEST = _descriptor.Descriptor( - name="DeleteEntryRequest", - full_name="google.cloud.datacatalog.v1.DeleteEntryRequest", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="name", - full_name="google.cloud.datacatalog.v1.DeleteEntryRequest.name", - index=0, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b'\340A\002\372A"\n datacatalog.googleapis.com/Entry', - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ) - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=2246, - serialized_end=2322, -) - - -_GETENTRYREQUEST = _descriptor.Descriptor( - name="GetEntryRequest", - full_name="google.cloud.datacatalog.v1.GetEntryRequest", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="name", - full_name="google.cloud.datacatalog.v1.GetEntryRequest.name", - index=0, - number=1, - type=9, - cpp_type=9, - 
label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b'\340A\002\372A"\n datacatalog.googleapis.com/Entry', - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ) - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=2324, - serialized_end=2397, -) - - -_LOOKUPENTRYREQUEST = _descriptor.Descriptor( - name="LookupEntryRequest", - full_name="google.cloud.datacatalog.v1.LookupEntryRequest", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="linked_resource", - full_name="google.cloud.datacatalog.v1.LookupEntryRequest.linked_resource", - index=0, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="sql_resource", - full_name="google.cloud.datacatalog.v1.LookupEntryRequest.sql_resource", - index=1, - number=3, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[ - _descriptor.OneofDescriptor( - name="target_name", - 
full_name="google.cloud.datacatalog.v1.LookupEntryRequest.target_name", - index=0, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[], - ) - ], - serialized_start=2399, - serialized_end=2485, -) - - -_ENTRY = _descriptor.Descriptor( - name="Entry", - full_name="google.cloud.datacatalog.v1.Entry", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="name", - full_name="google.cloud.datacatalog.v1.Entry.name", - index=0, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\372A'\n%datacatalog.googleapis.com/EntryGroup", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="linked_resource", - full_name="google.cloud.datacatalog.v1.Entry.linked_resource", - index=1, - number=9, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="type", - full_name="google.cloud.datacatalog.v1.Entry.type", - index=2, - number=2, - type=14, - cpp_type=8, - label=1, - has_default_value=False, - default_value=0, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="user_specified_type", - full_name="google.cloud.datacatalog.v1.Entry.user_specified_type", - index=3, - number=16, - type=9, - cpp_type=9, - label=1, - 
has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="integrated_system", - full_name="google.cloud.datacatalog.v1.Entry.integrated_system", - index=4, - number=17, - type=14, - cpp_type=8, - label=1, - has_default_value=False, - default_value=0, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\003", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="user_specified_system", - full_name="google.cloud.datacatalog.v1.Entry.user_specified_system", - index=5, - number=18, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="gcs_fileset_spec", - full_name="google.cloud.datacatalog.v1.Entry.gcs_fileset_spec", - index=6, - number=6, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="bigquery_table_spec", - full_name="google.cloud.datacatalog.v1.Entry.bigquery_table_spec", - index=7, - number=12, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - 
file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="bigquery_date_sharded_spec", - full_name="google.cloud.datacatalog.v1.Entry.bigquery_date_sharded_spec", - index=8, - number=15, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="display_name", - full_name="google.cloud.datacatalog.v1.Entry.display_name", - index=9, - number=3, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="description", - full_name="google.cloud.datacatalog.v1.Entry.description", - index=10, - number=4, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="schema", - full_name="google.cloud.datacatalog.v1.Entry.schema", - index=11, - number=5, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="source_system_timestamps", - full_name="google.cloud.datacatalog.v1.Entry.source_system_timestamps", - index=12, - 
number=7, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=b"\352Au\n datacatalog.googleapis.com/Entry\022Qprojects/{project}/locations/{location}/entryGroups/{entry_group}/entries/{entry}", - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[ - _descriptor.OneofDescriptor( - name="entry_type", - full_name="google.cloud.datacatalog.v1.Entry.entry_type", - index=0, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[], - ), - _descriptor.OneofDescriptor( - name="system", - full_name="google.cloud.datacatalog.v1.Entry.system", - index=1, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[], - ), - _descriptor.OneofDescriptor( - name="type_spec", - full_name="google.cloud.datacatalog.v1.Entry.type_spec", - index=2, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[], - ), - ], - serialized_start=2488, - serialized_end=3359, -) - - -_ENTRYGROUP = _descriptor.Descriptor( - name="EntryGroup", - full_name="google.cloud.datacatalog.v1.EntryGroup", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="name", - full_name="google.cloud.datacatalog.v1.EntryGroup.name", - index=0, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="display_name", - 
full_name="google.cloud.datacatalog.v1.EntryGroup.display_name", - index=1, - number=2, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="description", - full_name="google.cloud.datacatalog.v1.EntryGroup.description", - index=2, - number=3, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="data_catalog_timestamps", - full_name="google.cloud.datacatalog.v1.EntryGroup.data_catalog_timestamps", - index=3, - number=4, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\003", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=b"\352Aj\n%datacatalog.googleapis.com/EntryGroup\022Aprojects/{project}/locations/{location}/entryGroups/{entry_group}", - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=3362, - serialized_end=3627, -) - - -_CREATETAGTEMPLATEREQUEST = _descriptor.Descriptor( - name="CreateTagTemplateRequest", - full_name="google.cloud.datacatalog.v1.CreateTagTemplateRequest", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="parent", - 
full_name="google.cloud.datacatalog.v1.CreateTagTemplateRequest.parent", - index=0, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002\372A(\022&datacatalog.googleapis.com/TagTemplate", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="tag_template_id", - full_name="google.cloud.datacatalog.v1.CreateTagTemplateRequest.tag_template_id", - index=1, - number=3, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="tag_template", - full_name="google.cloud.datacatalog.v1.CreateTagTemplateRequest.tag_template", - index=2, - number=2, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=3630, - serialized_end=3819, -) - - -_GETTAGTEMPLATEREQUEST = _descriptor.Descriptor( - name="GetTagTemplateRequest", - full_name="google.cloud.datacatalog.v1.GetTagTemplateRequest", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="name", - full_name="google.cloud.datacatalog.v1.GetTagTemplateRequest.name", - 
index=0, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002\372A(\n&datacatalog.googleapis.com/TagTemplate", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ) - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=3821, - serialized_end=3906, -) - - -_UPDATETAGTEMPLATEREQUEST = _descriptor.Descriptor( - name="UpdateTagTemplateRequest", - full_name="google.cloud.datacatalog.v1.UpdateTagTemplateRequest", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="tag_template", - full_name="google.cloud.datacatalog.v1.UpdateTagTemplateRequest.tag_template", - index=0, - number=1, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="update_mask", - full_name="google.cloud.datacatalog.v1.UpdateTagTemplateRequest.update_mask", - index=1, - number=2, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=3909, - serialized_end=4053, -) - - 
-_DELETETAGTEMPLATEREQUEST = _descriptor.Descriptor( - name="DeleteTagTemplateRequest", - full_name="google.cloud.datacatalog.v1.DeleteTagTemplateRequest", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="name", - full_name="google.cloud.datacatalog.v1.DeleteTagTemplateRequest.name", - index=0, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002\372A(\n&datacatalog.googleapis.com/TagTemplate", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="force", - full_name="google.cloud.datacatalog.v1.DeleteTagTemplateRequest.force", - index=1, - number=2, - type=8, - cpp_type=7, - label=1, - has_default_value=False, - default_value=False, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=4055, - serialized_end=4163, -) - - -_CREATETAGREQUEST = _descriptor.Descriptor( - name="CreateTagRequest", - full_name="google.cloud.datacatalog.v1.CreateTagRequest", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="parent", - full_name="google.cloud.datacatalog.v1.CreateTagRequest.parent", - index=0, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - 
is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002\372A \n\036datacatalog.googleapis.com/Tag", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="tag", - full_name="google.cloud.datacatalog.v1.CreateTagRequest.tag", - index=1, - number=2, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=4165, - serialized_end=4291, -) - - -_UPDATETAGREQUEST = _descriptor.Descriptor( - name="UpdateTagRequest", - full_name="google.cloud.datacatalog.v1.UpdateTagRequest", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="tag", - full_name="google.cloud.datacatalog.v1.UpdateTagRequest.tag", - index=0, - number=1, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="update_mask", - full_name="google.cloud.datacatalog.v1.UpdateTagRequest.update_mask", - index=1, - number=2, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - 
nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=4293, - serialized_end=4412, -) - - -_DELETETAGREQUEST = _descriptor.Descriptor( - name="DeleteTagRequest", - full_name="google.cloud.datacatalog.v1.DeleteTagRequest", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="name", - full_name="google.cloud.datacatalog.v1.DeleteTagRequest.name", - index=0, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002\372A \022\036datacatalog.googleapis.com/Tag", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ) - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=4414, - serialized_end=4486, -) - - -_CREATETAGTEMPLATEFIELDREQUEST = _descriptor.Descriptor( - name="CreateTagTemplateFieldRequest", - full_name="google.cloud.datacatalog.v1.CreateTagTemplateFieldRequest", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="parent", - full_name="google.cloud.datacatalog.v1.CreateTagTemplateFieldRequest.parent", - index=0, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002\372A(\n&datacatalog.googleapis.com/TagTemplate", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - 
name="tag_template_field_id", - full_name="google.cloud.datacatalog.v1.CreateTagTemplateFieldRequest.tag_template_field_id", - index=1, - number=2, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="tag_template_field", - full_name="google.cloud.datacatalog.v1.CreateTagTemplateFieldRequest.tag_template_field", - index=2, - number=3, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=4489, - serialized_end=4700, -) - - -_UPDATETAGTEMPLATEFIELDREQUEST = _descriptor.Descriptor( - name="UpdateTagTemplateFieldRequest", - full_name="google.cloud.datacatalog.v1.UpdateTagTemplateFieldRequest", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="name", - full_name="google.cloud.datacatalog.v1.UpdateTagTemplateFieldRequest.name", - index=0, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002\372A-\n+datacatalog.googleapis.com/TagTemplateField", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - 
name="tag_template_field", - full_name="google.cloud.datacatalog.v1.UpdateTagTemplateFieldRequest.tag_template_field", - index=1, - number=2, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="update_mask", - full_name="google.cloud.datacatalog.v1.UpdateTagTemplateFieldRequest.update_mask", - index=2, - number=3, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\001", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=4703, - serialized_end=4935, -) - - -_RENAMETAGTEMPLATEFIELDREQUEST = _descriptor.Descriptor( - name="RenameTagTemplateFieldRequest", - full_name="google.cloud.datacatalog.v1.RenameTagTemplateFieldRequest", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="name", - full_name="google.cloud.datacatalog.v1.RenameTagTemplateFieldRequest.name", - index=0, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002\372A-\n+datacatalog.googleapis.com/TagTemplateField", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="new_tag_template_field_id", - 
full_name="google.cloud.datacatalog.v1.RenameTagTemplateFieldRequest.new_tag_template_field_id", - index=1, - number=2, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=4938, - serialized_end=5076, -) - - -_DELETETAGTEMPLATEFIELDREQUEST = _descriptor.Descriptor( - name="DeleteTagTemplateFieldRequest", - full_name="google.cloud.datacatalog.v1.DeleteTagTemplateFieldRequest", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="name", - full_name="google.cloud.datacatalog.v1.DeleteTagTemplateFieldRequest.name", - index=0, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002\372A-\n+datacatalog.googleapis.com/TagTemplateField", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="force", - full_name="google.cloud.datacatalog.v1.DeleteTagTemplateFieldRequest.force", - index=1, - number=2, - type=8, - cpp_type=7, - label=1, - has_default_value=False, - default_value=False, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - 
is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=5078, - serialized_end=5196, -) - - -_LISTTAGSREQUEST = _descriptor.Descriptor( - name="ListTagsRequest", - full_name="google.cloud.datacatalog.v1.ListTagsRequest", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="parent", - full_name="google.cloud.datacatalog.v1.ListTagsRequest.parent", - index=0, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002\372A \022\036datacatalog.googleapis.com/Tag", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="page_size", - full_name="google.cloud.datacatalog.v1.ListTagsRequest.page_size", - index=1, - number=2, - type=5, - cpp_type=1, - label=1, - has_default_value=False, - default_value=0, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="page_token", - full_name="google.cloud.datacatalog.v1.ListTagsRequest.page_token", - index=2, - number=3, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=5198, - serialized_end=5310, -) - - -_LISTTAGSRESPONSE = 
_descriptor.Descriptor( - name="ListTagsResponse", - full_name="google.cloud.datacatalog.v1.ListTagsResponse", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="tags", - full_name="google.cloud.datacatalog.v1.ListTagsResponse.tags", - index=0, - number=1, - type=11, - cpp_type=10, - label=3, - has_default_value=False, - default_value=[], - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="next_page_token", - full_name="google.cloud.datacatalog.v1.ListTagsResponse.next_page_token", - index=1, - number=2, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=5312, - serialized_end=5403, -) - - -_LISTENTRIESREQUEST = _descriptor.Descriptor( - name="ListEntriesRequest", - full_name="google.cloud.datacatalog.v1.ListEntriesRequest", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="parent", - full_name="google.cloud.datacatalog.v1.ListEntriesRequest.parent", - index=0, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - 
serialized_options=b"\340A\002\372A'\n%datacatalog.googleapis.com/EntryGroup", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="page_size", - full_name="google.cloud.datacatalog.v1.ListEntriesRequest.page_size", - index=1, - number=2, - type=5, - cpp_type=1, - label=1, - has_default_value=False, - default_value=0, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="page_token", - full_name="google.cloud.datacatalog.v1.ListEntriesRequest.page_token", - index=2, - number=3, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="read_mask", - full_name="google.cloud.datacatalog.v1.ListEntriesRequest.read_mask", - index=3, - number=4, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=5406, - serialized_end=5575, -) - - -_LISTENTRIESRESPONSE = _descriptor.Descriptor( - name="ListEntriesResponse", - full_name="google.cloud.datacatalog.v1.ListEntriesResponse", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - 
name="entries", - full_name="google.cloud.datacatalog.v1.ListEntriesResponse.entries", - index=0, - number=1, - type=11, - cpp_type=10, - label=3, - has_default_value=False, - default_value=[], - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="next_page_token", - full_name="google.cloud.datacatalog.v1.ListEntriesResponse.next_page_token", - index=1, - number=2, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=5577, - serialized_end=5676, -) - -_SEARCHCATALOGREQUEST_SCOPE.containing_type = _SEARCHCATALOGREQUEST -_SEARCHCATALOGREQUEST.fields_by_name["scope"].message_type = _SEARCHCATALOGREQUEST_SCOPE -_SEARCHCATALOGRESPONSE.fields_by_name[ - "results" -].message_type = ( - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_search__pb2._SEARCHCATALOGRESULT -) -_CREATEENTRYGROUPREQUEST.fields_by_name["entry_group"].message_type = _ENTRYGROUP -_UPDATEENTRYGROUPREQUEST.fields_by_name["entry_group"].message_type = _ENTRYGROUP -_UPDATEENTRYGROUPREQUEST.fields_by_name[ - "update_mask" -].message_type = google_dot_protobuf_dot_field__mask__pb2._FIELDMASK -_GETENTRYGROUPREQUEST.fields_by_name[ - "read_mask" -].message_type = google_dot_protobuf_dot_field__mask__pb2._FIELDMASK -_LISTENTRYGROUPSRESPONSE.fields_by_name["entry_groups"].message_type = _ENTRYGROUP -_CREATEENTRYREQUEST.fields_by_name["entry"].message_type = _ENTRY 
-_UPDATEENTRYREQUEST.fields_by_name["entry"].message_type = _ENTRY -_UPDATEENTRYREQUEST.fields_by_name[ - "update_mask" -].message_type = google_dot_protobuf_dot_field__mask__pb2._FIELDMASK -_LOOKUPENTRYREQUEST.oneofs_by_name["target_name"].fields.append( - _LOOKUPENTRYREQUEST.fields_by_name["linked_resource"] -) -_LOOKUPENTRYREQUEST.fields_by_name[ - "linked_resource" -].containing_oneof = _LOOKUPENTRYREQUEST.oneofs_by_name["target_name"] -_LOOKUPENTRYREQUEST.oneofs_by_name["target_name"].fields.append( - _LOOKUPENTRYREQUEST.fields_by_name["sql_resource"] -) -_LOOKUPENTRYREQUEST.fields_by_name[ - "sql_resource" -].containing_oneof = _LOOKUPENTRYREQUEST.oneofs_by_name["target_name"] -_ENTRY.fields_by_name["type"].enum_type = _ENTRYTYPE -_ENTRY.fields_by_name[ - "integrated_system" -].enum_type = ( - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_common__pb2._INTEGRATEDSYSTEM -) -_ENTRY.fields_by_name[ - "gcs_fileset_spec" -].message_type = ( - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_gcs__fileset__spec__pb2._GCSFILESETSPEC -) -_ENTRY.fields_by_name[ - "bigquery_table_spec" -].message_type = ( - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_table__spec__pb2._BIGQUERYTABLESPEC -) -_ENTRY.fields_by_name[ - "bigquery_date_sharded_spec" -].message_type = ( - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_table__spec__pb2._BIGQUERYDATESHARDEDSPEC -) -_ENTRY.fields_by_name[ - "schema" -].message_type = google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_schema__pb2._SCHEMA -_ENTRY.fields_by_name[ - "source_system_timestamps" -].message_type = ( - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_timestamps__pb2._SYSTEMTIMESTAMPS -) -_ENTRY.oneofs_by_name["entry_type"].fields.append(_ENTRY.fields_by_name["type"]) -_ENTRY.fields_by_name["type"].containing_oneof = _ENTRY.oneofs_by_name["entry_type"] -_ENTRY.oneofs_by_name["entry_type"].fields.append( - _ENTRY.fields_by_name["user_specified_type"] -) 
-_ENTRY.fields_by_name["user_specified_type"].containing_oneof = _ENTRY.oneofs_by_name[ - "entry_type" -] -_ENTRY.oneofs_by_name["system"].fields.append( - _ENTRY.fields_by_name["integrated_system"] -) -_ENTRY.fields_by_name["integrated_system"].containing_oneof = _ENTRY.oneofs_by_name[ - "system" -] -_ENTRY.oneofs_by_name["system"].fields.append( - _ENTRY.fields_by_name["user_specified_system"] -) -_ENTRY.fields_by_name["user_specified_system"].containing_oneof = _ENTRY.oneofs_by_name[ - "system" -] -_ENTRY.oneofs_by_name["type_spec"].fields.append( - _ENTRY.fields_by_name["gcs_fileset_spec"] -) -_ENTRY.fields_by_name["gcs_fileset_spec"].containing_oneof = _ENTRY.oneofs_by_name[ - "type_spec" -] -_ENTRY.oneofs_by_name["type_spec"].fields.append( - _ENTRY.fields_by_name["bigquery_table_spec"] -) -_ENTRY.fields_by_name["bigquery_table_spec"].containing_oneof = _ENTRY.oneofs_by_name[ - "type_spec" -] -_ENTRY.oneofs_by_name["type_spec"].fields.append( - _ENTRY.fields_by_name["bigquery_date_sharded_spec"] -) -_ENTRY.fields_by_name[ - "bigquery_date_sharded_spec" -].containing_oneof = _ENTRY.oneofs_by_name["type_spec"] -_ENTRYGROUP.fields_by_name[ - "data_catalog_timestamps" -].message_type = ( - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_timestamps__pb2._SYSTEMTIMESTAMPS -) -_CREATETAGTEMPLATEREQUEST.fields_by_name[ - "tag_template" -].message_type = ( - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_tags__pb2._TAGTEMPLATE -) -_UPDATETAGTEMPLATEREQUEST.fields_by_name[ - "tag_template" -].message_type = ( - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_tags__pb2._TAGTEMPLATE -) -_UPDATETAGTEMPLATEREQUEST.fields_by_name[ - "update_mask" -].message_type = google_dot_protobuf_dot_field__mask__pb2._FIELDMASK -_CREATETAGREQUEST.fields_by_name[ - "tag" -].message_type = google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_tags__pb2._TAG -_UPDATETAGREQUEST.fields_by_name[ - "tag" -].message_type = google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_tags__pb2._TAG 
-_UPDATETAGREQUEST.fields_by_name[ - "update_mask" -].message_type = google_dot_protobuf_dot_field__mask__pb2._FIELDMASK -_CREATETAGTEMPLATEFIELDREQUEST.fields_by_name[ - "tag_template_field" -].message_type = ( - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_tags__pb2._TAGTEMPLATEFIELD -) -_UPDATETAGTEMPLATEFIELDREQUEST.fields_by_name[ - "tag_template_field" -].message_type = ( - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_tags__pb2._TAGTEMPLATEFIELD -) -_UPDATETAGTEMPLATEFIELDREQUEST.fields_by_name[ - "update_mask" -].message_type = google_dot_protobuf_dot_field__mask__pb2._FIELDMASK -_LISTTAGSRESPONSE.fields_by_name[ - "tags" -].message_type = google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_tags__pb2._TAG -_LISTENTRIESREQUEST.fields_by_name[ - "read_mask" -].message_type = google_dot_protobuf_dot_field__mask__pb2._FIELDMASK -_LISTENTRIESRESPONSE.fields_by_name["entries"].message_type = _ENTRY -DESCRIPTOR.message_types_by_name["SearchCatalogRequest"] = _SEARCHCATALOGREQUEST -DESCRIPTOR.message_types_by_name["SearchCatalogResponse"] = _SEARCHCATALOGRESPONSE -DESCRIPTOR.message_types_by_name["CreateEntryGroupRequest"] = _CREATEENTRYGROUPREQUEST -DESCRIPTOR.message_types_by_name["UpdateEntryGroupRequest"] = _UPDATEENTRYGROUPREQUEST -DESCRIPTOR.message_types_by_name["GetEntryGroupRequest"] = _GETENTRYGROUPREQUEST -DESCRIPTOR.message_types_by_name["DeleteEntryGroupRequest"] = _DELETEENTRYGROUPREQUEST -DESCRIPTOR.message_types_by_name["ListEntryGroupsRequest"] = _LISTENTRYGROUPSREQUEST -DESCRIPTOR.message_types_by_name["ListEntryGroupsResponse"] = _LISTENTRYGROUPSRESPONSE -DESCRIPTOR.message_types_by_name["CreateEntryRequest"] = _CREATEENTRYREQUEST -DESCRIPTOR.message_types_by_name["UpdateEntryRequest"] = _UPDATEENTRYREQUEST -DESCRIPTOR.message_types_by_name["DeleteEntryRequest"] = _DELETEENTRYREQUEST -DESCRIPTOR.message_types_by_name["GetEntryRequest"] = _GETENTRYREQUEST -DESCRIPTOR.message_types_by_name["LookupEntryRequest"] = _LOOKUPENTRYREQUEST 
-DESCRIPTOR.message_types_by_name["Entry"] = _ENTRY -DESCRIPTOR.message_types_by_name["EntryGroup"] = _ENTRYGROUP -DESCRIPTOR.message_types_by_name["CreateTagTemplateRequest"] = _CREATETAGTEMPLATEREQUEST -DESCRIPTOR.message_types_by_name["GetTagTemplateRequest"] = _GETTAGTEMPLATEREQUEST -DESCRIPTOR.message_types_by_name["UpdateTagTemplateRequest"] = _UPDATETAGTEMPLATEREQUEST -DESCRIPTOR.message_types_by_name["DeleteTagTemplateRequest"] = _DELETETAGTEMPLATEREQUEST -DESCRIPTOR.message_types_by_name["CreateTagRequest"] = _CREATETAGREQUEST -DESCRIPTOR.message_types_by_name["UpdateTagRequest"] = _UPDATETAGREQUEST -DESCRIPTOR.message_types_by_name["DeleteTagRequest"] = _DELETETAGREQUEST -DESCRIPTOR.message_types_by_name[ - "CreateTagTemplateFieldRequest" -] = _CREATETAGTEMPLATEFIELDREQUEST -DESCRIPTOR.message_types_by_name[ - "UpdateTagTemplateFieldRequest" -] = _UPDATETAGTEMPLATEFIELDREQUEST -DESCRIPTOR.message_types_by_name[ - "RenameTagTemplateFieldRequest" -] = _RENAMETAGTEMPLATEFIELDREQUEST -DESCRIPTOR.message_types_by_name[ - "DeleteTagTemplateFieldRequest" -] = _DELETETAGTEMPLATEFIELDREQUEST -DESCRIPTOR.message_types_by_name["ListTagsRequest"] = _LISTTAGSREQUEST -DESCRIPTOR.message_types_by_name["ListTagsResponse"] = _LISTTAGSRESPONSE -DESCRIPTOR.message_types_by_name["ListEntriesRequest"] = _LISTENTRIESREQUEST -DESCRIPTOR.message_types_by_name["ListEntriesResponse"] = _LISTENTRIESRESPONSE -DESCRIPTOR.enum_types_by_name["EntryType"] = _ENTRYTYPE -_sym_db.RegisterFileDescriptor(DESCRIPTOR) - -SearchCatalogRequest = _reflection.GeneratedProtocolMessageType( - "SearchCatalogRequest", - (_message.Message,), - { - "Scope": _reflection.GeneratedProtocolMessageType( - "Scope", - (_message.Message,), - { - "DESCRIPTOR": _SEARCHCATALOGREQUEST_SCOPE, - "__module__": "google.cloud.datacatalog_v1.proto.datacatalog_pb2", - "__doc__": """The criteria that select the subspace used for query matching. 
- - Attributes: - include_org_ids: - The list of organization IDs to search within. To find your - organization ID, follow instructions in - https://cloud.google.com/resource-manager/docs/creating- - managing-organization. - include_project_ids: - The list of project IDs to search within. To learn more about - the distinction between project names/IDs/numbers, go to - https://cloud.google.com/docs/overview/#projects. - include_gcp_public_datasets: - If ``true``, include Google Cloud Platform (GCP) public - datasets in the search results. Info on GCP public datasets is - available at https://cloud.google.com/public-datasets/. By - default, GCP public datasets are excluded. - restricted_locations: - Optional. The list of locations to search within. 1. If empty, - search will be performed in all locations; 2. If any of the - locations are NOT in the valid locations list, error will be - returned; 3. Otherwise, search only the given locations for - matching results. Typical usage is to leave this field empty. - When a location is unreachable as returned in the - ``SearchCatalogResponse.unreachable`` field, users can repeat - the search request with this parameter set to get additional - information on the error. Valid locations: \* asia-east1 \* - asia-east2 \* asia-northeast1 \* asia-northeast2 \* asia- - northeast3 \* asia-south1 \* asia-southeast1 \* australia- - southeast1 \* eu \* europe-north1 \* europe-west1 \* europe- - west2 \* europe-west3 \* europe-west4 \* europe-west6 \* - global \* northamerica-northeast1 \* southamerica-east1 \* us - \* us-central1 \* us-east1 \* us-east4 \* us-west1 \* us-west2 - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1.SearchCatalogRequest.Scope) - }, - ), - "DESCRIPTOR": _SEARCHCATALOGREQUEST, - "__module__": "google.cloud.datacatalog_v1.proto.datacatalog_pb2", - "__doc__": """Request message for [SearchCatalog][google.cloud.datacatalog.v1.DataCa - talog.SearchCatalog]. 
- - Attributes: - scope: - Required. The scope of this search request. A ``scope`` that - has empty ``include_org_ids``, ``include_project_ids`` AND - false ``include_gcp_public_datasets`` is considered invalid. - Data Catalog will return an error in such a case. - query: - Required. The query string in search query syntax. The query - must be non-empty. Query strings can be simple as “x” or more - qualified as: - name:x - column:x - description:y Note: - Query tokens need to have a minimum of 3 characters for - substring matching to work correctly. See `Data Catalog Search - Syntax `__ for more information. - page_size: - Number of results in the search page. If <=0 then defaults to - 10. Max limit for page_size is 1000. Throws an invalid - argument for page_size > 1000. - page_token: - Optional. Pagination token returned in an earlier [SearchCatal - ogResponse.next_page_token][google.cloud.datacatalog.v1.Search - CatalogResponse.next_page_token], which indicates that this is - a continuation of a prior [SearchCatalogRequest][google.cloud. - datacatalog.v1.DataCatalog.SearchCatalog] call, and that the - system should return the next page of data. If empty, the - first page is returned. - order_by: - Specifies the ordering of results, currently supported case- - sensitive choices are: - ``relevance``, only supports - descending - ``last_modified_timestamp [asc|desc]``, defaults - to descending if not specified If not specified, defaults - to ``relevance`` descending. 
- """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1.SearchCatalogRequest) - }, -) -_sym_db.RegisterMessage(SearchCatalogRequest) -_sym_db.RegisterMessage(SearchCatalogRequest.Scope) - -SearchCatalogResponse = _reflection.GeneratedProtocolMessageType( - "SearchCatalogResponse", - (_message.Message,), - { - "DESCRIPTOR": _SEARCHCATALOGRESPONSE, - "__module__": "google.cloud.datacatalog_v1.proto.datacatalog_pb2", - "__doc__": """Response message for [SearchCatalog][google.cloud.datacatalog.v1.DataC - atalog.SearchCatalog]. - - Attributes: - results: - Search results. - next_page_token: - The token that can be used to retrieve the next page of - results. - unreachable: - Unreachable locations. Search result does not include data - from those locations. Users can get additional information on - the error by repeating the search request with a more - restrictive parameter – setting the value for - ``SearchDataCatalogRequest.scope.include_locations``. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1.SearchCatalogResponse) - }, -) -_sym_db.RegisterMessage(SearchCatalogResponse) - -CreateEntryGroupRequest = _reflection.GeneratedProtocolMessageType( - "CreateEntryGroupRequest", - (_message.Message,), - { - "DESCRIPTOR": _CREATEENTRYGROUPREQUEST, - "__module__": "google.cloud.datacatalog_v1.proto.datacatalog_pb2", - "__doc__": """Request message for [CreateEntryGroup][google.cloud.datacatalog.v1.Dat - aCatalog.CreateEntryGroup]. - - Attributes: - parent: - Required. The name of the project this entry group is in. - Example: - projects/{project_id}/locations/{location} Note - that this EntryGroup and its child resources may not actually - be stored in the location in this name. - entry_group_id: - Required. The id of the entry group to create. The id must - begin with a letter or underscore, contain only English - letters, numbers and underscores, and be at most 64 - characters. - entry_group: - The entry group to create. 
Defaults to an empty entry group. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1.CreateEntryGroupRequest) - }, -) -_sym_db.RegisterMessage(CreateEntryGroupRequest) - -UpdateEntryGroupRequest = _reflection.GeneratedProtocolMessageType( - "UpdateEntryGroupRequest", - (_message.Message,), - { - "DESCRIPTOR": _UPDATEENTRYGROUPREQUEST, - "__module__": "google.cloud.datacatalog_v1.proto.datacatalog_pb2", - "__doc__": """Request message for [UpdateEntryGroup][google.cloud.datacatalog.v1.Dat - aCatalog.UpdateEntryGroup]. - - Attributes: - entry_group: - Required. The updated entry group. “name” field must be set. - update_mask: - The fields to update on the entry group. If absent or empty, - all modifiable fields are updated. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1.UpdateEntryGroupRequest) - }, -) -_sym_db.RegisterMessage(UpdateEntryGroupRequest) - -GetEntryGroupRequest = _reflection.GeneratedProtocolMessageType( - "GetEntryGroupRequest", - (_message.Message,), - { - "DESCRIPTOR": _GETENTRYGROUPREQUEST, - "__module__": "google.cloud.datacatalog_v1.proto.datacatalog_pb2", - "__doc__": """Request message for [GetEntryGroup][google.cloud.datacatalog.v1.DataCa - talog.GetEntryGroup]. - - Attributes: - name: - Required. The name of the entry group. For example, ``projects - /{project_id}/locations/{location}/entryGroups/{entry_group_id - }``. - read_mask: - The fields to return. If not set or empty, all fields are - returned. 
- """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1.GetEntryGroupRequest) - }, -) -_sym_db.RegisterMessage(GetEntryGroupRequest) - -DeleteEntryGroupRequest = _reflection.GeneratedProtocolMessageType( - "DeleteEntryGroupRequest", - (_message.Message,), - { - "DESCRIPTOR": _DELETEENTRYGROUPREQUEST, - "__module__": "google.cloud.datacatalog_v1.proto.datacatalog_pb2", - "__doc__": """Request message for [DeleteEntryGroup][google.cloud.datacatalog.v1.Dat - aCatalog.DeleteEntryGroup]. - - Attributes: - name: - Required. The name of the entry group. For example, ``projects - /{project_id}/locations/{location}/entryGroups/{entry_group_id - }``. - force: - Optional. If true, deletes all entries in the entry group. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1.DeleteEntryGroupRequest) - }, -) -_sym_db.RegisterMessage(DeleteEntryGroupRequest) - -ListEntryGroupsRequest = _reflection.GeneratedProtocolMessageType( - "ListEntryGroupsRequest", - (_message.Message,), - { - "DESCRIPTOR": _LISTENTRYGROUPSREQUEST, - "__module__": "google.cloud.datacatalog_v1.proto.datacatalog_pb2", - "__doc__": """Request message for [ListEntryGroups][google.cloud.datacatalog.v1.Data - Catalog.ListEntryGroups]. - - Attributes: - parent: - Required. The name of the location that contains the entry - groups, which can be provided in URL format. Example: - - projects/{project_id}/locations/{location} - page_size: - Optional. The maximum number of items to return. Default is - 10. Max limit is 1000. Throws an invalid argument for - ``page_size > 1000``. - page_token: - Optional. Token that specifies which page is requested. If - empty, the first page is returned. 
- """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1.ListEntryGroupsRequest) - }, -) -_sym_db.RegisterMessage(ListEntryGroupsRequest) - -ListEntryGroupsResponse = _reflection.GeneratedProtocolMessageType( - "ListEntryGroupsResponse", - (_message.Message,), - { - "DESCRIPTOR": _LISTENTRYGROUPSRESPONSE, - "__module__": "google.cloud.datacatalog_v1.proto.datacatalog_pb2", - "__doc__": """Response message for [ListEntryGroups][google.cloud.datacatalog.v1.Dat - aCatalog.ListEntryGroups]. - - Attributes: - entry_groups: - EntryGroup details. - next_page_token: - Token to retrieve the next page of results. It is set to empty - if no items remain in results. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1.ListEntryGroupsResponse) - }, -) -_sym_db.RegisterMessage(ListEntryGroupsResponse) - -CreateEntryRequest = _reflection.GeneratedProtocolMessageType( - "CreateEntryRequest", - (_message.Message,), - { - "DESCRIPTOR": _CREATEENTRYREQUEST, - "__module__": "google.cloud.datacatalog_v1.proto.datacatalog_pb2", - "__doc__": """Request message for - [CreateEntry][google.cloud.datacatalog.v1.DataCatalog.CreateEntry]. - - Attributes: - parent: - Required. The name of the entry group this entry is in. - Example: - projects/{project_id}/locations/{location}/entryG - roups/{entry_group_id} Note that this Entry and its child - resources may not actually be stored in the location in this - name. - entry_id: - Required. The id of the entry to create. - entry: - Required. The entry to create. 
- """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1.CreateEntryRequest) - }, -) -_sym_db.RegisterMessage(CreateEntryRequest) - -UpdateEntryRequest = _reflection.GeneratedProtocolMessageType( - "UpdateEntryRequest", - (_message.Message,), - { - "DESCRIPTOR": _UPDATEENTRYREQUEST, - "__module__": "google.cloud.datacatalog_v1.proto.datacatalog_pb2", - "__doc__": """Request message for - [UpdateEntry][google.cloud.datacatalog.v1.DataCatalog.UpdateEntry]. - - Attributes: - entry: - Required. The updated entry. The “name” field must be set. - update_mask: - The fields to update on the entry. If absent or empty, all - modifiable fields are updated. The following fields are - modifiable: \* For entries with type ``DATA_STREAM``: \* - ``schema`` \* For entries with type ``FILESET`` \* ``schema`` - \* ``display_name`` \* ``description`` \* ``gcs_fileset_spec`` - \* ``gcs_fileset_spec.file_patterns`` \* For entries with - ``user_specified_type`` \* ``schema`` \* ``display_name`` \* - ``description`` \* user_specified_type \* - user_specified_system \* linked_resource \* - source_system_timestamps - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1.UpdateEntryRequest) - }, -) -_sym_db.RegisterMessage(UpdateEntryRequest) - -DeleteEntryRequest = _reflection.GeneratedProtocolMessageType( - "DeleteEntryRequest", - (_message.Message,), - { - "DESCRIPTOR": _DELETEENTRYREQUEST, - "__module__": "google.cloud.datacatalog_v1.proto.datacatalog_pb2", - "__doc__": """Request message for - [DeleteEntry][google.cloud.datacatalog.v1.DataCatalog.DeleteEntry]. - - Attributes: - name: - Required. The name of the entry. 
Example: - projects/{projec - t_id}/locations/{location}/entryGroups/{entry_group_id}/entrie - s/{entry_id} - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1.DeleteEntryRequest) - }, -) -_sym_db.RegisterMessage(DeleteEntryRequest) - -GetEntryRequest = _reflection.GeneratedProtocolMessageType( - "GetEntryRequest", - (_message.Message,), - { - "DESCRIPTOR": _GETENTRYREQUEST, - "__module__": "google.cloud.datacatalog_v1.proto.datacatalog_pb2", - "__doc__": """Request message for - [GetEntry][google.cloud.datacatalog.v1.DataCatalog.GetEntry]. - - Attributes: - name: - Required. The name of the entry. Example: - projects/{projec - t_id}/locations/{location}/entryGroups/{entry_group_id}/entrie - s/{entry_id} - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1.GetEntryRequest) - }, -) -_sym_db.RegisterMessage(GetEntryRequest) - -LookupEntryRequest = _reflection.GeneratedProtocolMessageType( - "LookupEntryRequest", - (_message.Message,), - { - "DESCRIPTOR": _LOOKUPENTRYREQUEST, - "__module__": "google.cloud.datacatalog_v1.proto.datacatalog_pb2", - "__doc__": """Request message for - [LookupEntry][google.cloud.datacatalog.v1.DataCatalog.LookupEntry]. - - Attributes: - target_name: - Required. Represents either the Google Cloud Platform resource - or SQL name for a Google Cloud Platform resource. - linked_resource: - The full name of the Google Cloud Platform resource the Data - Catalog entry represents. See: https://cloud.google.com/apis/d - esign/resource_names#full_resource_name. Full names are case- - sensitive. Examples: - //bigquery.googleapis.com/projects/p - rojectId/datasets/datasetId/tables/tableId - - //pubsub.googleapis.com/projects/projectId/topics/topicId - sql_resource: - The SQL name of the entry. SQL names are case-sensitive. 
- Examples: - ``pubsub.project_id.topic_id`` - - :literal:`pubsub.project_id.`topic.id.with.dots\`` - - ``bigquery.table.project_id.dataset_id.table_id`` - - ``bigquery.dataset.project_id.dataset_id`` - ``datacatalog.en - try.project_id.location_id.entry_group_id.entry_id`` - ``*_id``\ s shoud satisfy the standard SQL rules for - identifiers. - https://cloud.google.com/bigquery/docs/reference/standard- - sql/lexical. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1.LookupEntryRequest) - }, -) -_sym_db.RegisterMessage(LookupEntryRequest) - -Entry = _reflection.GeneratedProtocolMessageType( - "Entry", - (_message.Message,), - { - "DESCRIPTOR": _ENTRY, - "__module__": "google.cloud.datacatalog_v1.proto.datacatalog_pb2", - "__doc__": """Entry Metadata. A Data Catalog Entry resource represents another - resource in Google Cloud Platform (such as a BigQuery dataset or a - Pub/Sub topic) or outside of Google Cloud Platform. Clients can use - the ``linked_resource`` field in the Entry resource to refer to the - original resource ID of the source system. An Entry resource contains - resource details, such as its schema. An Entry can also be used to - attach flexible metadata, such as a - [Tag][google.cloud.datacatalog.v1.Tag]. - - Attributes: - name: - The Data Catalog resource name of the entry in URL format. - Example: - projects/{project_id}/locations/{location}/entryG - roups/{entry_group_id}/entries/{entry_id} Note that this - Entry and its child resources may not actually be stored in - the location in this name. - linked_resource: - The resource this metadata entry refers to. For Google Cloud - Platform resources, ``linked_resource`` is the `full name of - the resource `__. For example, the - ``linked_resource`` for a table resource from BigQuery is: - - //bigquery.googleapis.com/projects/projectId/datasets/datasetI - d/tables/tableId Output only when Entry is of type in the - EntryType enum. 
For entries with user_specified_type, this - field is optional and defaults to an empty string. - entry_type: - Required. Entry type. - type: - The type of the entry. Only used for Entries with types in the - EntryType enum. - user_specified_type: - Entry type if it does not fit any of the input-allowed values - listed in ``EntryType`` enum above. When creating an entry, - users should check the enum values first, if nothing matches - the entry to be created, then provide a custom value, for - example “my_special_type”. ``user_specified_type`` strings - must begin with a letter or underscore and can only contain - letters, numbers, and underscores; are case insensitive; must - be at least 1 character and at most 64 characters long. - Currently, only FILESET enum value is allowed. All other - entries created through Data Catalog must use - ``user_specified_type``. - system: - The source system of the entry. - integrated_system: - Output only. This field indicates the entry’s source system - that Data Catalog integrates with, such as BigQuery or - Pub/Sub. - user_specified_system: - This field indicates the entry’s source system that Data - Catalog does not integrate with. ``user_specified_system`` - strings must begin with a letter or underscore and can only - contain letters, numbers, and underscores; are case - insensitive; must be at least 1 character and at most 64 - characters long. - type_spec: - Type specification information. - gcs_fileset_spec: - Specification that applies to a Cloud Storage fileset. This is - only valid on entries of type FILESET. - bigquery_table_spec: - Specification that applies to a BigQuery table. This is only - valid on entries of type ``TABLE``. - bigquery_date_sharded_spec: - Specification for a group of BigQuery tables with name pattern - ``[prefix]YYYYMMDD``. Context: - https://cloud.google.com/bigquery/docs/partitioned- - tables#partitioning_versus_sharding. - display_name: - Display information such as title and description. 
A short - name to identify the entry, for example, “Analytics Data - Jan - 2011”. Default value is an empty string. - description: - Entry description, which can consist of several sentences or - paragraphs that describe entry contents. Default value is an - empty string. - schema: - Schema of the entry. An entry might not have any schema - attached to it. - source_system_timestamps: - Timestamps about the underlying resource, not about this Data - Catalog entry. Output only when Entry is of type in the - EntryType enum. For entries with user_specified_type, this - field is optional and defaults to an empty timestamp. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1.Entry) - }, -) -_sym_db.RegisterMessage(Entry) - -EntryGroup = _reflection.GeneratedProtocolMessageType( - "EntryGroup", - (_message.Message,), - { - "DESCRIPTOR": _ENTRYGROUP, - "__module__": "google.cloud.datacatalog_v1.proto.datacatalog_pb2", - "__doc__": """EntryGroup Metadata. An EntryGroup resource represents a logical - grouping of zero or more Data Catalog - [Entry][google.cloud.datacatalog.v1.Entry] resources. - - Attributes: - name: - The resource name of the entry group in URL format. Example: - - projects/{project_id}/locations/{location}/entryGroups/{ent - ry_group_id} Note that this EntryGroup and its child - resources may not actually be stored in the location in this - name. - display_name: - A short name to identify the entry group, for example, - “analytics data - jan 2011”. Default value is an empty string. - description: - Entry group description, which can consist of several - sentences or paragraphs that describe entry group contents. - Default value is an empty string. - data_catalog_timestamps: - Output only. Timestamps about this EntryGroup. Default value - is empty timestamps. 
- """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1.EntryGroup) - }, -) -_sym_db.RegisterMessage(EntryGroup) - -CreateTagTemplateRequest = _reflection.GeneratedProtocolMessageType( - "CreateTagTemplateRequest", - (_message.Message,), - { - "DESCRIPTOR": _CREATETAGTEMPLATEREQUEST, - "__module__": "google.cloud.datacatalog_v1.proto.datacatalog_pb2", - "__doc__": """Request message for [CreateTagTemplate][google.cloud.datacatalog.v1.Da - taCatalog.CreateTagTemplate]. - - Attributes: - parent: - Required. The name of the project and the template location - `region `__. Example: - - projects/{project_id}/locations/us-central1 - tag_template_id: - Required. The id of the tag template to create. - tag_template: - Required. The tag template to create. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1.CreateTagTemplateRequest) - }, -) -_sym_db.RegisterMessage(CreateTagTemplateRequest) - -GetTagTemplateRequest = _reflection.GeneratedProtocolMessageType( - "GetTagTemplateRequest", - (_message.Message,), - { - "DESCRIPTOR": _GETTAGTEMPLATEREQUEST, - "__module__": "google.cloud.datacatalog_v1.proto.datacatalog_pb2", - "__doc__": """Request message for [GetTagTemplate][google.cloud.datacatalog.v1.DataC - atalog.GetTagTemplate]. - - Attributes: - name: - Required. The name of the tag template. Example: - projects/ - {project_id}/locations/{location}/tagTemplates/{tag_template_i - d} - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1.GetTagTemplateRequest) - }, -) -_sym_db.RegisterMessage(GetTagTemplateRequest) - -UpdateTagTemplateRequest = _reflection.GeneratedProtocolMessageType( - "UpdateTagTemplateRequest", - (_message.Message,), - { - "DESCRIPTOR": _UPDATETAGTEMPLATEREQUEST, - "__module__": "google.cloud.datacatalog_v1.proto.datacatalog_pb2", - "__doc__": """Request message for [UpdateTagTemplate][google.cloud.datacatalog.v1.Da - taCatalog.UpdateTagTemplate]. - - Attributes: - tag_template: - Required. 
The template to update. The “name” field must be - set. - update_mask: - The field mask specifies the parts of the template to - overwrite. Allowed fields: - ``display_name`` If absent or - empty, all of the allowed fields above will be updated. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1.UpdateTagTemplateRequest) - }, -) -_sym_db.RegisterMessage(UpdateTagTemplateRequest) - -DeleteTagTemplateRequest = _reflection.GeneratedProtocolMessageType( - "DeleteTagTemplateRequest", - (_message.Message,), - { - "DESCRIPTOR": _DELETETAGTEMPLATEREQUEST, - "__module__": "google.cloud.datacatalog_v1.proto.datacatalog_pb2", - "__doc__": """Request message for [DeleteTagTemplate][google.cloud.datacatalog.v1.Da - taCatalog.DeleteTagTemplate]. - - Attributes: - name: - Required. The name of the tag template to delete. Example: - - projects/{project_id}/locations/{location}/tagTemplates/{tag_t - emplate_id} - force: - Required. Currently, this field must always be set to - ``true``. This confirms the deletion of any possible tags - using this template. ``force = false`` will be supported in - the future. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1.DeleteTagTemplateRequest) - }, -) -_sym_db.RegisterMessage(DeleteTagTemplateRequest) - -CreateTagRequest = _reflection.GeneratedProtocolMessageType( - "CreateTagRequest", - (_message.Message,), - { - "DESCRIPTOR": _CREATETAGREQUEST, - "__module__": "google.cloud.datacatalog_v1.proto.datacatalog_pb2", - "__doc__": """Request message for - [CreateTag][google.cloud.datacatalog.v1.DataCatalog.CreateTag]. - - Attributes: - parent: - Required. The name of the resource to attach this tag to. Tags - can be attached to Entries. Example: - projects/{project_id} - /locations/{location}/entryGroups/{entry_group_id}/entries/{en - try_id} Note that this Tag and its child resources may not - actually be stored in the location in this name. - tag: - Required. The tag to create. 
- """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1.CreateTagRequest) - }, -) -_sym_db.RegisterMessage(CreateTagRequest) - -UpdateTagRequest = _reflection.GeneratedProtocolMessageType( - "UpdateTagRequest", - (_message.Message,), - { - "DESCRIPTOR": _UPDATETAGREQUEST, - "__module__": "google.cloud.datacatalog_v1.proto.datacatalog_pb2", - "__doc__": """Request message for - [UpdateTag][google.cloud.datacatalog.v1.DataCatalog.UpdateTag]. - - Attributes: - tag: - Required. The updated tag. The “name” field must be set. - update_mask: - The fields to update on the Tag. If absent or empty, all - modifiable fields are updated. Currently the only modifiable - field is the field ``fields``. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1.UpdateTagRequest) - }, -) -_sym_db.RegisterMessage(UpdateTagRequest) - -DeleteTagRequest = _reflection.GeneratedProtocolMessageType( - "DeleteTagRequest", - (_message.Message,), - { - "DESCRIPTOR": _DELETETAGREQUEST, - "__module__": "google.cloud.datacatalog_v1.proto.datacatalog_pb2", - "__doc__": """Request message for - [DeleteTag][google.cloud.datacatalog.v1.DataCatalog.DeleteTag]. - - Attributes: - name: - Required. The name of the tag to delete. Example: - projects - /{project_id}/locations/{location}/entryGroups/{entry_group_id - }/entries/{entry_id}/tags/{tag_id} - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1.DeleteTagRequest) - }, -) -_sym_db.RegisterMessage(DeleteTagRequest) - -CreateTagTemplateFieldRequest = _reflection.GeneratedProtocolMessageType( - "CreateTagTemplateFieldRequest", - (_message.Message,), - { - "DESCRIPTOR": _CREATETAGTEMPLATEFIELDREQUEST, - "__module__": "google.cloud.datacatalog_v1.proto.datacatalog_pb2", - "__doc__": """Request message for [CreateTagTemplateField][google.cloud.datacatalog. - v1.DataCatalog.CreateTagTemplateField]. - - Attributes: - parent: - Required. 
The name of the project and the template location - `region `__. Example: - - projects/{project_id}/locations/us- - central1/tagTemplates/{tag_template_id} - tag_template_field_id: - Required. The ID of the tag template field to create. Field - ids can contain letters (both uppercase and lowercase), - numbers (0-9), underscores (_) and dashes (-). Field IDs must - be at least 1 character long and at most 128 characters long. - Field IDs must also be unique within their template. - tag_template_field: - Required. The tag template field to create. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1.CreateTagTemplateFieldRequest) - }, -) -_sym_db.RegisterMessage(CreateTagTemplateFieldRequest) - -UpdateTagTemplateFieldRequest = _reflection.GeneratedProtocolMessageType( - "UpdateTagTemplateFieldRequest", - (_message.Message,), - { - "DESCRIPTOR": _UPDATETAGTEMPLATEFIELDREQUEST, - "__module__": "google.cloud.datacatalog_v1.proto.datacatalog_pb2", - "__doc__": """Request message for [UpdateTagTemplateField][google.cloud.datacatalog. - v1.DataCatalog.UpdateTagTemplateField]. - - Attributes: - name: - Required. The name of the tag template field. Example: - pro - jects/{project_id}/locations/{location}/tagTemplates/{tag_temp - late_id}/fields/{tag_template_field_id} - tag_template_field: - Required. The template to update. - update_mask: - Optional. The field mask specifies the parts of the template - to be updated. Allowed fields: - ``display_name`` - - ``type.enum_type`` - ``is_required`` If ``update_mask`` is - not set or empty, all of the allowed fields above will be - updated. When updating an enum type, the provided values will - be merged with the existing values. Therefore, enum values can - only be added, existing enum values cannot be deleted nor - renamed. Updating a template field from optional to required - is NOT allowed. 
- """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1.UpdateTagTemplateFieldRequest) - }, -) -_sym_db.RegisterMessage(UpdateTagTemplateFieldRequest) - -RenameTagTemplateFieldRequest = _reflection.GeneratedProtocolMessageType( - "RenameTagTemplateFieldRequest", - (_message.Message,), - { - "DESCRIPTOR": _RENAMETAGTEMPLATEFIELDREQUEST, - "__module__": "google.cloud.datacatalog_v1.proto.datacatalog_pb2", - "__doc__": """Request message for [RenameTagTemplateField][google.cloud.datacatalog. - v1.DataCatalog.RenameTagTemplateField]. - - Attributes: - name: - Required. The name of the tag template. Example: - projects/ - {project_id}/locations/{location}/tagTemplates/{tag_template_i - d}/fields/{tag_template_field_id} - new_tag_template_field_id: - Required. The new ID of this tag template field. For example, - ``my_new_field``. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1.RenameTagTemplateFieldRequest) - }, -) -_sym_db.RegisterMessage(RenameTagTemplateFieldRequest) - -DeleteTagTemplateFieldRequest = _reflection.GeneratedProtocolMessageType( - "DeleteTagTemplateFieldRequest", - (_message.Message,), - { - "DESCRIPTOR": _DELETETAGTEMPLATEFIELDREQUEST, - "__module__": "google.cloud.datacatalog_v1.proto.datacatalog_pb2", - "__doc__": """Request message for [DeleteTagTemplateField][google.cloud.datacatalog. - v1.DataCatalog.DeleteTagTemplateField]. - - Attributes: - name: - Required. The name of the tag template field to delete. - Example: - projects/{project_id}/locations/{location}/tagTem - plates/{tag_template_id}/fields/{tag_template_field_id} - force: - Required. Currently, this field must always be set to - ``true``. This confirms the deletion of this field from any - tags using this field. ``force = false`` will be supported in - the future. 
- """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1.DeleteTagTemplateFieldRequest) - }, -) -_sym_db.RegisterMessage(DeleteTagTemplateFieldRequest) - -ListTagsRequest = _reflection.GeneratedProtocolMessageType( - "ListTagsRequest", - (_message.Message,), - { - "DESCRIPTOR": _LISTTAGSREQUEST, - "__module__": "google.cloud.datacatalog_v1.proto.datacatalog_pb2", - "__doc__": """Request message for - [ListTags][google.cloud.datacatalog.v1.DataCatalog.ListTags]. - - Attributes: - parent: - Required. The name of the Data Catalog resource to list the - tags of. The resource could be an - [Entry][google.cloud.datacatalog.v1.Entry] or an - [EntryGroup][google.cloud.datacatalog.v1.EntryGroup]. - Examples: - projects/{project_id}/locations/{location}/entry - Groups/{entry_group_id} - projects/{project_id}/locations/{lo - cation}/entryGroups/{entry_group_id}/entries/{entry_id} - page_size: - The maximum number of tags to return. Default is 10. Max limit - is 1000. - page_token: - Token that specifies which page is requested. If empty, the - first page is returned. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1.ListTagsRequest) - }, -) -_sym_db.RegisterMessage(ListTagsRequest) - -ListTagsResponse = _reflection.GeneratedProtocolMessageType( - "ListTagsResponse", - (_message.Message,), - { - "DESCRIPTOR": _LISTTAGSRESPONSE, - "__module__": "google.cloud.datacatalog_v1.proto.datacatalog_pb2", - "__doc__": """Response message for - [ListTags][google.cloud.datacatalog.v1.DataCatalog.ListTags]. - - Attributes: - tags: - [Tag][google.cloud.datacatalog.v1.Tag] details. - next_page_token: - Token to retrieve the next page of results. It is set to empty - if no items remain in results. 
- """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1.ListTagsResponse) - }, -) -_sym_db.RegisterMessage(ListTagsResponse) - -ListEntriesRequest = _reflection.GeneratedProtocolMessageType( - "ListEntriesRequest", - (_message.Message,), - { - "DESCRIPTOR": _LISTENTRIESREQUEST, - "__module__": "google.cloud.datacatalog_v1.proto.datacatalog_pb2", - "__doc__": """Request message for - [ListEntries][google.cloud.datacatalog.v1.DataCatalog.ListEntries]. - - Attributes: - parent: - Required. The name of the entry group that contains the - entries, which can be provided in URL format. Example: - pro - jects/{project_id}/locations/{location}/entryGroups/{entry_gro - up_id} - page_size: - The maximum number of items to return. Default is 10. Max - limit is 1000. Throws an invalid argument for ``page_size > - 1000``. - page_token: - Token that specifies which page is requested. If empty, the - first page is returned. - read_mask: - The fields to return for each Entry. If not set or empty, all - fields are returned. For example, setting read_mask to contain - only one path “name” will cause ListEntries to return a list - of Entries with only “name” field. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1.ListEntriesRequest) - }, -) -_sym_db.RegisterMessage(ListEntriesRequest) - -ListEntriesResponse = _reflection.GeneratedProtocolMessageType( - "ListEntriesResponse", - (_message.Message,), - { - "DESCRIPTOR": _LISTENTRIESRESPONSE, - "__module__": "google.cloud.datacatalog_v1.proto.datacatalog_pb2", - "__doc__": """Response message for - [ListEntries][google.cloud.datacatalog.v1.DataCatalog.ListEntries]. - - Attributes: - entries: - Entry details. - next_page_token: - Token to retrieve the next page of results. It is set to empty - if no items remain in results. 
- """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1.ListEntriesResponse) - }, -) -_sym_db.RegisterMessage(ListEntriesResponse) - - -DESCRIPTOR._options = None -_SEARCHCATALOGREQUEST_SCOPE.fields_by_name["restricted_locations"]._options = None -_SEARCHCATALOGREQUEST.fields_by_name["scope"]._options = None -_SEARCHCATALOGREQUEST.fields_by_name["query"]._options = None -_SEARCHCATALOGREQUEST.fields_by_name["page_token"]._options = None -_CREATEENTRYGROUPREQUEST.fields_by_name["parent"]._options = None -_CREATEENTRYGROUPREQUEST.fields_by_name["entry_group_id"]._options = None -_UPDATEENTRYGROUPREQUEST.fields_by_name["entry_group"]._options = None -_GETENTRYGROUPREQUEST.fields_by_name["name"]._options = None -_DELETEENTRYGROUPREQUEST.fields_by_name["name"]._options = None -_DELETEENTRYGROUPREQUEST.fields_by_name["force"]._options = None -_LISTENTRYGROUPSREQUEST.fields_by_name["parent"]._options = None -_LISTENTRYGROUPSREQUEST.fields_by_name["page_size"]._options = None -_LISTENTRYGROUPSREQUEST.fields_by_name["page_token"]._options = None -_CREATEENTRYREQUEST.fields_by_name["parent"]._options = None -_CREATEENTRYREQUEST.fields_by_name["entry_id"]._options = None -_CREATEENTRYREQUEST.fields_by_name["entry"]._options = None -_UPDATEENTRYREQUEST.fields_by_name["entry"]._options = None -_DELETEENTRYREQUEST.fields_by_name["name"]._options = None -_GETENTRYREQUEST.fields_by_name["name"]._options = None -_ENTRY.fields_by_name["name"]._options = None -_ENTRY.fields_by_name["integrated_system"]._options = None -_ENTRY._options = None -_ENTRYGROUP.fields_by_name["data_catalog_timestamps"]._options = None -_ENTRYGROUP._options = None -_CREATETAGTEMPLATEREQUEST.fields_by_name["parent"]._options = None -_CREATETAGTEMPLATEREQUEST.fields_by_name["tag_template_id"]._options = None -_CREATETAGTEMPLATEREQUEST.fields_by_name["tag_template"]._options = None -_GETTAGTEMPLATEREQUEST.fields_by_name["name"]._options = None 
-_UPDATETAGTEMPLATEREQUEST.fields_by_name["tag_template"]._options = None -_DELETETAGTEMPLATEREQUEST.fields_by_name["name"]._options = None -_DELETETAGTEMPLATEREQUEST.fields_by_name["force"]._options = None -_CREATETAGREQUEST.fields_by_name["parent"]._options = None -_CREATETAGREQUEST.fields_by_name["tag"]._options = None -_UPDATETAGREQUEST.fields_by_name["tag"]._options = None -_DELETETAGREQUEST.fields_by_name["name"]._options = None -_CREATETAGTEMPLATEFIELDREQUEST.fields_by_name["parent"]._options = None -_CREATETAGTEMPLATEFIELDREQUEST.fields_by_name["tag_template_field_id"]._options = None -_CREATETAGTEMPLATEFIELDREQUEST.fields_by_name["tag_template_field"]._options = None -_UPDATETAGTEMPLATEFIELDREQUEST.fields_by_name["name"]._options = None -_UPDATETAGTEMPLATEFIELDREQUEST.fields_by_name["tag_template_field"]._options = None -_UPDATETAGTEMPLATEFIELDREQUEST.fields_by_name["update_mask"]._options = None -_RENAMETAGTEMPLATEFIELDREQUEST.fields_by_name["name"]._options = None -_RENAMETAGTEMPLATEFIELDREQUEST.fields_by_name[ - "new_tag_template_field_id" -]._options = None -_DELETETAGTEMPLATEFIELDREQUEST.fields_by_name["name"]._options = None -_DELETETAGTEMPLATEFIELDREQUEST.fields_by_name["force"]._options = None -_LISTTAGSREQUEST.fields_by_name["parent"]._options = None -_LISTENTRIESREQUEST.fields_by_name["parent"]._options = None - -_DATACATALOG = _descriptor.ServiceDescriptor( - name="DataCatalog", - full_name="google.cloud.datacatalog.v1.DataCatalog", - file=DESCRIPTOR, - index=0, - serialized_options=b"\312A\032datacatalog.googleapis.com\322A.https://www.googleapis.com/auth/cloud-platform", - create_key=_descriptor._internal_create_key, - serialized_start=5772, - serialized_end=11833, - methods=[ - _descriptor.MethodDescriptor( - name="SearchCatalog", - full_name="google.cloud.datacatalog.v1.DataCatalog.SearchCatalog", - index=0, - containing_service=None, - input_type=_SEARCHCATALOGREQUEST, - output_type=_SEARCHCATALOGRESPONSE, - 
serialized_options=b'\202\323\344\223\002\027"\022/v1/catalog:search:\001*\332A\013scope,query', - create_key=_descriptor._internal_create_key, - ), - _descriptor.MethodDescriptor( - name="CreateEntryGroup", - full_name="google.cloud.datacatalog.v1.DataCatalog.CreateEntryGroup", - index=1, - containing_service=None, - input_type=_CREATEENTRYGROUPREQUEST, - output_type=_ENTRYGROUP, - serialized_options=b'\202\323\344\223\002>"//v1/{parent=projects/*/locations/*}/entryGroups:\013entry_group\332A!parent,entry_group_id,entry_group', - create_key=_descriptor._internal_create_key, - ), - _descriptor.MethodDescriptor( - name="GetEntryGroup", - full_name="google.cloud.datacatalog.v1.DataCatalog.GetEntryGroup", - index=2, - containing_service=None, - input_type=_GETENTRYGROUPREQUEST, - output_type=_ENTRYGROUP, - serialized_options=b"\202\323\344\223\0021\022//v1/{name=projects/*/locations/*/entryGroups/*}\332A\004name\332A\016name,read_mask", - create_key=_descriptor._internal_create_key, - ), - _descriptor.MethodDescriptor( - name="UpdateEntryGroup", - full_name="google.cloud.datacatalog.v1.DataCatalog.UpdateEntryGroup", - index=3, - containing_service=None, - input_type=_UPDATEENTRYGROUPREQUEST, - output_type=_ENTRYGROUP, - serialized_options=b"\202\323\344\223\002J2;/v1/{entry_group.name=projects/*/locations/*/entryGroups/*}:\013entry_group\332A\013entry_group\332A\027entry_group,update_mask", - create_key=_descriptor._internal_create_key, - ), - _descriptor.MethodDescriptor( - name="DeleteEntryGroup", - full_name="google.cloud.datacatalog.v1.DataCatalog.DeleteEntryGroup", - index=4, - containing_service=None, - input_type=_DELETEENTRYGROUPREQUEST, - output_type=google_dot_protobuf_dot_empty__pb2._EMPTY, - serialized_options=b"\202\323\344\223\0021*//v1/{name=projects/*/locations/*/entryGroups/*}\332A\004name", - create_key=_descriptor._internal_create_key, - ), - _descriptor.MethodDescriptor( - name="ListEntryGroups", - 
full_name="google.cloud.datacatalog.v1.DataCatalog.ListEntryGroups", - index=5, - containing_service=None, - input_type=_LISTENTRYGROUPSREQUEST, - output_type=_LISTENTRYGROUPSRESPONSE, - serialized_options=b"\202\323\344\223\0021\022//v1/{parent=projects/*/locations/*}/entryGroups\332A\006parent", - create_key=_descriptor._internal_create_key, - ), - _descriptor.MethodDescriptor( - name="CreateEntry", - full_name="google.cloud.datacatalog.v1.DataCatalog.CreateEntry", - index=6, - containing_service=None, - input_type=_CREATEENTRYREQUEST, - output_type=_ENTRY, - serialized_options=b'\202\323\344\223\002B"9/v1/{parent=projects/*/locations/*/entryGroups/*}/entries:\005entry\332A\025parent,entry_id,entry', - create_key=_descriptor._internal_create_key, - ), - _descriptor.MethodDescriptor( - name="UpdateEntry", - full_name="google.cloud.datacatalog.v1.DataCatalog.UpdateEntry", - index=7, - containing_service=None, - input_type=_UPDATEENTRYREQUEST, - output_type=_ENTRY, - serialized_options=b"\202\323\344\223\002H2?/v1/{entry.name=projects/*/locations/*/entryGroups/*/entries/*}:\005entry\332A\005entry\332A\021entry,update_mask", - create_key=_descriptor._internal_create_key, - ), - _descriptor.MethodDescriptor( - name="DeleteEntry", - full_name="google.cloud.datacatalog.v1.DataCatalog.DeleteEntry", - index=8, - containing_service=None, - input_type=_DELETEENTRYREQUEST, - output_type=google_dot_protobuf_dot_empty__pb2._EMPTY, - serialized_options=b"\202\323\344\223\002;*9/v1/{name=projects/*/locations/*/entryGroups/*/entries/*}\332A\004name", - create_key=_descriptor._internal_create_key, - ), - _descriptor.MethodDescriptor( - name="GetEntry", - full_name="google.cloud.datacatalog.v1.DataCatalog.GetEntry", - index=9, - containing_service=None, - input_type=_GETENTRYREQUEST, - output_type=_ENTRY, - serialized_options=b"\202\323\344\223\002;\0229/v1/{name=projects/*/locations/*/entryGroups/*/entries/*}\332A\004name", - create_key=_descriptor._internal_create_key, - ), - 
_descriptor.MethodDescriptor( - name="LookupEntry", - full_name="google.cloud.datacatalog.v1.DataCatalog.LookupEntry", - index=10, - containing_service=None, - input_type=_LOOKUPENTRYREQUEST, - output_type=_ENTRY, - serialized_options=b"\202\323\344\223\002\024\022\022/v1/entries:lookup", - create_key=_descriptor._internal_create_key, - ), - _descriptor.MethodDescriptor( - name="ListEntries", - full_name="google.cloud.datacatalog.v1.DataCatalog.ListEntries", - index=11, - containing_service=None, - input_type=_LISTENTRIESREQUEST, - output_type=_LISTENTRIESRESPONSE, - serialized_options=b"\202\323\344\223\002;\0229/v1/{parent=projects/*/locations/*/entryGroups/*}/entries\332A\006parent", - create_key=_descriptor._internal_create_key, - ), - _descriptor.MethodDescriptor( - name="CreateTagTemplate", - full_name="google.cloud.datacatalog.v1.DataCatalog.CreateTagTemplate", - index=12, - containing_service=None, - input_type=_CREATETAGTEMPLATEREQUEST, - output_type=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_tags__pb2._TAGTEMPLATE, - serialized_options=b'\202\323\344\223\002@"0/v1/{parent=projects/*/locations/*}/tagTemplates:\014tag_template\332A#parent,tag_template_id,tag_template', - create_key=_descriptor._internal_create_key, - ), - _descriptor.MethodDescriptor( - name="GetTagTemplate", - full_name="google.cloud.datacatalog.v1.DataCatalog.GetTagTemplate", - index=13, - containing_service=None, - input_type=_GETTAGTEMPLATEREQUEST, - output_type=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_tags__pb2._TAGTEMPLATE, - serialized_options=b"\202\323\344\223\0022\0220/v1/{name=projects/*/locations/*/tagTemplates/*}\332A\004name", - create_key=_descriptor._internal_create_key, - ), - _descriptor.MethodDescriptor( - name="UpdateTagTemplate", - full_name="google.cloud.datacatalog.v1.DataCatalog.UpdateTagTemplate", - index=14, - containing_service=None, - input_type=_UPDATETAGTEMPLATEREQUEST, - 
output_type=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_tags__pb2._TAGTEMPLATE, - serialized_options=b"\202\323\344\223\002M2=/v1/{tag_template.name=projects/*/locations/*/tagTemplates/*}:\014tag_template\332A\014tag_template\332A\030tag_template,update_mask", - create_key=_descriptor._internal_create_key, - ), - _descriptor.MethodDescriptor( - name="DeleteTagTemplate", - full_name="google.cloud.datacatalog.v1.DataCatalog.DeleteTagTemplate", - index=15, - containing_service=None, - input_type=_DELETETAGTEMPLATEREQUEST, - output_type=google_dot_protobuf_dot_empty__pb2._EMPTY, - serialized_options=b"\202\323\344\223\0022*0/v1/{name=projects/*/locations/*/tagTemplates/*}\332A\nname,force", - create_key=_descriptor._internal_create_key, - ), - _descriptor.MethodDescriptor( - name="CreateTagTemplateField", - full_name="google.cloud.datacatalog.v1.DataCatalog.CreateTagTemplateField", - index=16, - containing_service=None, - input_type=_CREATETAGTEMPLATEFIELDREQUEST, - output_type=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_tags__pb2._TAGTEMPLATEFIELD, - serialized_options=b'\202\323\344\223\002O"9/v1/{parent=projects/*/locations/*/tagTemplates/*}/fields:\022tag_template_field\332A/parent,tag_template_field_id,tag_template_field', - create_key=_descriptor._internal_create_key, - ), - _descriptor.MethodDescriptor( - name="UpdateTagTemplateField", - full_name="google.cloud.datacatalog.v1.DataCatalog.UpdateTagTemplateField", - index=17, - containing_service=None, - input_type=_UPDATETAGTEMPLATEFIELDREQUEST, - output_type=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_tags__pb2._TAGTEMPLATEFIELD, - serialized_options=b"\202\323\344\223\002O29/v1/{name=projects/*/locations/*/tagTemplates/*/fields/*}:\022tag_template_field\332A\027name,tag_template_field\332A#name,tag_template_field,update_mask", - create_key=_descriptor._internal_create_key, - ), - _descriptor.MethodDescriptor( - name="RenameTagTemplateField", - 
full_name="google.cloud.datacatalog.v1.DataCatalog.RenameTagTemplateField", - index=18, - containing_service=None, - input_type=_RENAMETAGTEMPLATEFIELDREQUEST, - output_type=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_tags__pb2._TAGTEMPLATEFIELD, - serialized_options=b'\202\323\344\223\002E"@/v1/{name=projects/*/locations/*/tagTemplates/*/fields/*}:rename:\001*\332A\036name,new_tag_template_field_id', - create_key=_descriptor._internal_create_key, - ), - _descriptor.MethodDescriptor( - name="DeleteTagTemplateField", - full_name="google.cloud.datacatalog.v1.DataCatalog.DeleteTagTemplateField", - index=19, - containing_service=None, - input_type=_DELETETAGTEMPLATEFIELDREQUEST, - output_type=google_dot_protobuf_dot_empty__pb2._EMPTY, - serialized_options=b"\202\323\344\223\002;*9/v1/{name=projects/*/locations/*/tagTemplates/*/fields/*}\332A\nname,force", - create_key=_descriptor._internal_create_key, - ), - _descriptor.MethodDescriptor( - name="CreateTag", - full_name="google.cloud.datacatalog.v1.DataCatalog.CreateTag", - index=20, - containing_service=None, - input_type=_CREATETAGREQUEST, - output_type=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_tags__pb2._TAG, - serialized_options=b'\202\323\344\223\002\206\001"@/v1/{parent=projects/*/locations/*/entryGroups/*/entries/*}/tags:\003tagZ="6/v1/{parent=projects/*/locations/*/entryGroups/*}/tags:\003tag\332A\nparent,tag', - create_key=_descriptor._internal_create_key, - ), - _descriptor.MethodDescriptor( - name="UpdateTag", - full_name="google.cloud.datacatalog.v1.DataCatalog.UpdateTag", - index=21, - containing_service=None, - input_type=_UPDATETAGREQUEST, - output_type=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_tags__pb2._TAG, - serialized_options=b"\202\323\344\223\002\216\0012D/v1/{tag.name=projects/*/locations/*/entryGroups/*/entries/*/tags/*}:\003tagZA2:/v1/{tag.name=projects/*/locations/*/entryGroups/*/tags/*}:\003tag\332A\003tag\332A\017tag,update_mask", - 
create_key=_descriptor._internal_create_key, - ), - _descriptor.MethodDescriptor( - name="DeleteTag", - full_name="google.cloud.datacatalog.v1.DataCatalog.DeleteTag", - index=22, - containing_service=None, - input_type=_DELETETAGREQUEST, - output_type=google_dot_protobuf_dot_empty__pb2._EMPTY, - serialized_options=b"\202\323\344\223\002|*@/v1/{name=projects/*/locations/*/entryGroups/*/entries/*/tags/*}Z8*6/v1/{name=projects/*/locations/*/entryGroups/*/tags/*}\332A\004name", - create_key=_descriptor._internal_create_key, - ), - _descriptor.MethodDescriptor( - name="ListTags", - full_name="google.cloud.datacatalog.v1.DataCatalog.ListTags", - index=23, - containing_service=None, - input_type=_LISTTAGSREQUEST, - output_type=_LISTTAGSRESPONSE, - serialized_options=b"\202\323\344\223\002|\022@/v1/{parent=projects/*/locations/*/entryGroups/*/entries/*}/tagsZ8\0226/v1/{parent=projects/*/locations/*/entryGroups/*}/tags\332A\006parent", - create_key=_descriptor._internal_create_key, - ), - _descriptor.MethodDescriptor( - name="SetIamPolicy", - full_name="google.cloud.datacatalog.v1.DataCatalog.SetIamPolicy", - index=24, - containing_service=None, - input_type=google_dot_iam_dot_v1_dot_iam__policy__pb2._SETIAMPOLICYREQUEST, - output_type=google_dot_iam_dot_v1_dot_policy__pb2._POLICY, - serialized_options=b'\202\323\344\223\002\215\001"A/v1/{resource=projects/*/locations/*/tagTemplates/*}:setIamPolicy:\001*ZE"@/v1/{resource=projects/*/locations/*/entryGroups/*}:setIamPolicy:\001*\332A\017resource,policy', - create_key=_descriptor._internal_create_key, - ), - _descriptor.MethodDescriptor( - name="GetIamPolicy", - full_name="google.cloud.datacatalog.v1.DataCatalog.GetIamPolicy", - index=25, - containing_service=None, - input_type=google_dot_iam_dot_v1_dot_iam__policy__pb2._GETIAMPOLICYREQUEST, - output_type=google_dot_iam_dot_v1_dot_policy__pb2._POLICY, - 
serialized_options=b'\202\323\344\223\002\336\001"A/v1/{resource=projects/*/locations/*/tagTemplates/*}:getIamPolicy:\001*ZE"@/v1/{resource=projects/*/locations/*/entryGroups/*}:getIamPolicy:\001*ZO"J/v1/{resource=projects/*/locations/*/entryGroups/*/entries/*}:getIamPolicy:\001*\332A\010resource', - create_key=_descriptor._internal_create_key, - ), - _descriptor.MethodDescriptor( - name="TestIamPermissions", - full_name="google.cloud.datacatalog.v1.DataCatalog.TestIamPermissions", - index=26, - containing_service=None, - input_type=google_dot_iam_dot_v1_dot_iam__policy__pb2._TESTIAMPERMISSIONSREQUEST, - output_type=google_dot_iam_dot_v1_dot_iam__policy__pb2._TESTIAMPERMISSIONSRESPONSE, - serialized_options=b'\202\323\344\223\002\360\001"G/v1/{resource=projects/*/locations/*/tagTemplates/*}:testIamPermissions:\001*ZK"F/v1/{resource=projects/*/locations/*/entryGroups/*}:testIamPermissions:\001*ZU"P/v1/{resource=projects/*/locations/*/entryGroups/*/entries/*}:testIamPermissions:\001*', - create_key=_descriptor._internal_create_key, - ), - ], -) -_sym_db.RegisterServiceDescriptor(_DATACATALOG) - -DESCRIPTOR.services_by_name["DataCatalog"] = _DATACATALOG - -# @@protoc_insertion_point(module_scope) diff --git a/google/cloud/datacatalog_v1/proto/datacatalog_pb2_grpc.py b/google/cloud/datacatalog_v1/proto/datacatalog_pb2_grpc.py deleted file mode 100644 index 8af913ca..00000000 --- a/google/cloud/datacatalog_v1/proto/datacatalog_pb2_grpc.py +++ /dev/null @@ -1,1373 +0,0 @@ -# Generated by the gRPC Python protocol compiler plugin. DO NOT EDIT! 
-"""Client and server classes corresponding to protobuf-defined services.""" -import grpc - -from google.cloud.datacatalog_v1.proto import ( - datacatalog_pb2 as google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2, -) -from google.cloud.datacatalog_v1.proto import ( - tags_pb2 as google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_tags__pb2, -) -from google.iam.v1 import iam_policy_pb2 as google_dot_iam_dot_v1_dot_iam__policy__pb2 -from google.iam.v1 import policy_pb2 as google_dot_iam_dot_v1_dot_policy__pb2 -from google.protobuf import empty_pb2 as google_dot_protobuf_dot_empty__pb2 - - -class DataCatalogStub(object): - """Data Catalog API service allows clients to discover, understand, and manage - their data. - """ - - def __init__(self, channel): - """Constructor. - - Args: - channel: A grpc.Channel. - """ - self.SearchCatalog = channel.unary_unary( - "/google.cloud.datacatalog.v1.DataCatalog/SearchCatalog", - request_serializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.SearchCatalogRequest.SerializeToString, - response_deserializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.SearchCatalogResponse.FromString, - ) - self.CreateEntryGroup = channel.unary_unary( - "/google.cloud.datacatalog.v1.DataCatalog/CreateEntryGroup", - request_serializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.CreateEntryGroupRequest.SerializeToString, - response_deserializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.EntryGroup.FromString, - ) - self.GetEntryGroup = channel.unary_unary( - "/google.cloud.datacatalog.v1.DataCatalog/GetEntryGroup", - request_serializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.GetEntryGroupRequest.SerializeToString, - response_deserializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.EntryGroup.FromString, - ) - self.UpdateEntryGroup = channel.unary_unary( - 
"/google.cloud.datacatalog.v1.DataCatalog/UpdateEntryGroup", - request_serializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.UpdateEntryGroupRequest.SerializeToString, - response_deserializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.EntryGroup.FromString, - ) - self.DeleteEntryGroup = channel.unary_unary( - "/google.cloud.datacatalog.v1.DataCatalog/DeleteEntryGroup", - request_serializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.DeleteEntryGroupRequest.SerializeToString, - response_deserializer=google_dot_protobuf_dot_empty__pb2.Empty.FromString, - ) - self.ListEntryGroups = channel.unary_unary( - "/google.cloud.datacatalog.v1.DataCatalog/ListEntryGroups", - request_serializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.ListEntryGroupsRequest.SerializeToString, - response_deserializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.ListEntryGroupsResponse.FromString, - ) - self.CreateEntry = channel.unary_unary( - "/google.cloud.datacatalog.v1.DataCatalog/CreateEntry", - request_serializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.CreateEntryRequest.SerializeToString, - response_deserializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.Entry.FromString, - ) - self.UpdateEntry = channel.unary_unary( - "/google.cloud.datacatalog.v1.DataCatalog/UpdateEntry", - request_serializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.UpdateEntryRequest.SerializeToString, - response_deserializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.Entry.FromString, - ) - self.DeleteEntry = channel.unary_unary( - "/google.cloud.datacatalog.v1.DataCatalog/DeleteEntry", - request_serializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.DeleteEntryRequest.SerializeToString, - response_deserializer=google_dot_protobuf_dot_empty__pb2.Empty.FromString, - ) - 
self.GetEntry = channel.unary_unary( - "/google.cloud.datacatalog.v1.DataCatalog/GetEntry", - request_serializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.GetEntryRequest.SerializeToString, - response_deserializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.Entry.FromString, - ) - self.LookupEntry = channel.unary_unary( - "/google.cloud.datacatalog.v1.DataCatalog/LookupEntry", - request_serializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.LookupEntryRequest.SerializeToString, - response_deserializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.Entry.FromString, - ) - self.ListEntries = channel.unary_unary( - "/google.cloud.datacatalog.v1.DataCatalog/ListEntries", - request_serializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.ListEntriesRequest.SerializeToString, - response_deserializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.ListEntriesResponse.FromString, - ) - self.CreateTagTemplate = channel.unary_unary( - "/google.cloud.datacatalog.v1.DataCatalog/CreateTagTemplate", - request_serializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.CreateTagTemplateRequest.SerializeToString, - response_deserializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_tags__pb2.TagTemplate.FromString, - ) - self.GetTagTemplate = channel.unary_unary( - "/google.cloud.datacatalog.v1.DataCatalog/GetTagTemplate", - request_serializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.GetTagTemplateRequest.SerializeToString, - response_deserializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_tags__pb2.TagTemplate.FromString, - ) - self.UpdateTagTemplate = channel.unary_unary( - "/google.cloud.datacatalog.v1.DataCatalog/UpdateTagTemplate", - request_serializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.UpdateTagTemplateRequest.SerializeToString, - 
response_deserializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_tags__pb2.TagTemplate.FromString, - ) - self.DeleteTagTemplate = channel.unary_unary( - "/google.cloud.datacatalog.v1.DataCatalog/DeleteTagTemplate", - request_serializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.DeleteTagTemplateRequest.SerializeToString, - response_deserializer=google_dot_protobuf_dot_empty__pb2.Empty.FromString, - ) - self.CreateTagTemplateField = channel.unary_unary( - "/google.cloud.datacatalog.v1.DataCatalog/CreateTagTemplateField", - request_serializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.CreateTagTemplateFieldRequest.SerializeToString, - response_deserializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_tags__pb2.TagTemplateField.FromString, - ) - self.UpdateTagTemplateField = channel.unary_unary( - "/google.cloud.datacatalog.v1.DataCatalog/UpdateTagTemplateField", - request_serializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.UpdateTagTemplateFieldRequest.SerializeToString, - response_deserializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_tags__pb2.TagTemplateField.FromString, - ) - self.RenameTagTemplateField = channel.unary_unary( - "/google.cloud.datacatalog.v1.DataCatalog/RenameTagTemplateField", - request_serializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.RenameTagTemplateFieldRequest.SerializeToString, - response_deserializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_tags__pb2.TagTemplateField.FromString, - ) - self.DeleteTagTemplateField = channel.unary_unary( - "/google.cloud.datacatalog.v1.DataCatalog/DeleteTagTemplateField", - request_serializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.DeleteTagTemplateFieldRequest.SerializeToString, - response_deserializer=google_dot_protobuf_dot_empty__pb2.Empty.FromString, - ) - self.CreateTag = channel.unary_unary( - 
"/google.cloud.datacatalog.v1.DataCatalog/CreateTag", - request_serializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.CreateTagRequest.SerializeToString, - response_deserializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_tags__pb2.Tag.FromString, - ) - self.UpdateTag = channel.unary_unary( - "/google.cloud.datacatalog.v1.DataCatalog/UpdateTag", - request_serializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.UpdateTagRequest.SerializeToString, - response_deserializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_tags__pb2.Tag.FromString, - ) - self.DeleteTag = channel.unary_unary( - "/google.cloud.datacatalog.v1.DataCatalog/DeleteTag", - request_serializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.DeleteTagRequest.SerializeToString, - response_deserializer=google_dot_protobuf_dot_empty__pb2.Empty.FromString, - ) - self.ListTags = channel.unary_unary( - "/google.cloud.datacatalog.v1.DataCatalog/ListTags", - request_serializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.ListTagsRequest.SerializeToString, - response_deserializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.ListTagsResponse.FromString, - ) - self.SetIamPolicy = channel.unary_unary( - "/google.cloud.datacatalog.v1.DataCatalog/SetIamPolicy", - request_serializer=google_dot_iam_dot_v1_dot_iam__policy__pb2.SetIamPolicyRequest.SerializeToString, - response_deserializer=google_dot_iam_dot_v1_dot_policy__pb2.Policy.FromString, - ) - self.GetIamPolicy = channel.unary_unary( - "/google.cloud.datacatalog.v1.DataCatalog/GetIamPolicy", - request_serializer=google_dot_iam_dot_v1_dot_iam__policy__pb2.GetIamPolicyRequest.SerializeToString, - response_deserializer=google_dot_iam_dot_v1_dot_policy__pb2.Policy.FromString, - ) - self.TestIamPermissions = channel.unary_unary( - "/google.cloud.datacatalog.v1.DataCatalog/TestIamPermissions", - 
request_serializer=google_dot_iam_dot_v1_dot_iam__policy__pb2.TestIamPermissionsRequest.SerializeToString, - response_deserializer=google_dot_iam_dot_v1_dot_iam__policy__pb2.TestIamPermissionsResponse.FromString, - ) - - -class DataCatalogServicer(object): - """Data Catalog API service allows clients to discover, understand, and manage - their data. - """ - - def SearchCatalog(self, request, context): - """Searches Data Catalog for multiple resources like entries, tags that - match a query. - - This is a custom method - (https://cloud.google.com/apis/design/custom_methods) and does not return - the complete resource, only the resource identifier and high level - fields. Clients can subsequently call `Get` methods. - - Note that Data Catalog search queries do not guarantee full recall. Query - results that match your query may not be returned, even in subsequent - result pages. Also note that results returned (and not returned) can vary - across repeated search queries. - - See [Data Catalog Search - Syntax](https://cloud.google.com/data-catalog/docs/how-to/search-reference) - for more information. - """ - context.set_code(grpc.StatusCode.UNIMPLEMENTED) - context.set_details("Method not implemented!") - raise NotImplementedError("Method not implemented!") - - def CreateEntryGroup(self, request, context): - """Creates an EntryGroup. - - An entry group contains logically related entries together with Cloud - Identity and Access Management policies that specify the users who can - create, edit, and view entries within the entry group. - - Data Catalog automatically creates an entry group for BigQuery entries - ("@bigquery") and Pub/Sub topics ("@pubsub"). Users create their own entry - group to contain Cloud Storage fileset entries or custom type entries, - and the IAM policies associated with those entries. Entry groups, like - entries, can be searched. - - A maximum of 10,000 entry groups may be created per organization across all - locations. 
- - Users should enable the Data Catalog API in the project identified by - the `parent` parameter (see [Data Catalog Resource Project] - (https://cloud.google.com/data-catalog/docs/concepts/resource-project) for - more information). - """ - context.set_code(grpc.StatusCode.UNIMPLEMENTED) - context.set_details("Method not implemented!") - raise NotImplementedError("Method not implemented!") - - def GetEntryGroup(self, request, context): - """Gets an EntryGroup. - """ - context.set_code(grpc.StatusCode.UNIMPLEMENTED) - context.set_details("Method not implemented!") - raise NotImplementedError("Method not implemented!") - - def UpdateEntryGroup(self, request, context): - """Updates an EntryGroup. The user should enable the Data Catalog API in the - project identified by the `entry_group.name` parameter (see [Data Catalog - Resource Project] - (https://cloud.google.com/data-catalog/docs/concepts/resource-project) for - more information). - """ - context.set_code(grpc.StatusCode.UNIMPLEMENTED) - context.set_details("Method not implemented!") - raise NotImplementedError("Method not implemented!") - - def DeleteEntryGroup(self, request, context): - """Deletes an EntryGroup. Only entry groups that do not contain entries can be - deleted. Users should enable the Data Catalog API in the project - identified by the `name` parameter (see [Data Catalog Resource Project] - (https://cloud.google.com/data-catalog/docs/concepts/resource-project) for - more information). - """ - context.set_code(grpc.StatusCode.UNIMPLEMENTED) - context.set_details("Method not implemented!") - raise NotImplementedError("Method not implemented!") - - def ListEntryGroups(self, request, context): - """Lists entry groups. - """ - context.set_code(grpc.StatusCode.UNIMPLEMENTED) - context.set_details("Method not implemented!") - raise NotImplementedError("Method not implemented!") - - def CreateEntry(self, request, context): - """Creates an entry. 
Only entries of 'FILESET' type or user-specified type can - be created. - - Users should enable the Data Catalog API in the project identified by - the `parent` parameter (see [Data Catalog Resource Project] - (https://cloud.google.com/data-catalog/docs/concepts/resource-project) for - more information). - - A maximum of 100,000 entries may be created per entry group. - """ - context.set_code(grpc.StatusCode.UNIMPLEMENTED) - context.set_details("Method not implemented!") - raise NotImplementedError("Method not implemented!") - - def UpdateEntry(self, request, context): - """Updates an existing entry. - Users should enable the Data Catalog API in the project identified by - the `entry.name` parameter (see [Data Catalog Resource Project] - (https://cloud.google.com/data-catalog/docs/concepts/resource-project) for - more information). - """ - context.set_code(grpc.StatusCode.UNIMPLEMENTED) - context.set_details("Method not implemented!") - raise NotImplementedError("Method not implemented!") - - def DeleteEntry(self, request, context): - """Deletes an existing entry. Only entries created through - [CreateEntry][google.cloud.datacatalog.v1.DataCatalog.CreateEntry] - method can be deleted. - Users should enable the Data Catalog API in the project identified by - the `name` parameter (see [Data Catalog Resource Project] - (https://cloud.google.com/data-catalog/docs/concepts/resource-project) for - more information). - """ - context.set_code(grpc.StatusCode.UNIMPLEMENTED) - context.set_details("Method not implemented!") - raise NotImplementedError("Method not implemented!") - - def GetEntry(self, request, context): - """Gets an entry. - """ - context.set_code(grpc.StatusCode.UNIMPLEMENTED) - context.set_details("Method not implemented!") - raise NotImplementedError("Method not implemented!") - - def LookupEntry(self, request, context): - """Get an entry by target resource name. 
This method allows clients to use - the resource name from the source Google Cloud Platform service to get the - Data Catalog Entry. - """ - context.set_code(grpc.StatusCode.UNIMPLEMENTED) - context.set_details("Method not implemented!") - raise NotImplementedError("Method not implemented!") - - def ListEntries(self, request, context): - """Lists entries. - """ - context.set_code(grpc.StatusCode.UNIMPLEMENTED) - context.set_details("Method not implemented!") - raise NotImplementedError("Method not implemented!") - - def CreateTagTemplate(self, request, context): - """Creates a tag template. The user should enable the Data Catalog API in - the project identified by the `parent` parameter (see [Data Catalog - Resource - Project](https://cloud.google.com/data-catalog/docs/concepts/resource-project) - for more information). - """ - context.set_code(grpc.StatusCode.UNIMPLEMENTED) - context.set_details("Method not implemented!") - raise NotImplementedError("Method not implemented!") - - def GetTagTemplate(self, request, context): - """Gets a tag template. - """ - context.set_code(grpc.StatusCode.UNIMPLEMENTED) - context.set_details("Method not implemented!") - raise NotImplementedError("Method not implemented!") - - def UpdateTagTemplate(self, request, context): - """Updates a tag template. This method cannot be used to update the fields of - a template. The tag template fields are represented as separate resources - and should be updated using their own create/update/delete methods. - Users should enable the Data Catalog API in the project identified by - the `tag_template.name` parameter (see [Data Catalog Resource Project] - (https://cloud.google.com/data-catalog/docs/concepts/resource-project) for - more information). 
- """ - context.set_code(grpc.StatusCode.UNIMPLEMENTED) - context.set_details("Method not implemented!") - raise NotImplementedError("Method not implemented!") - - def DeleteTagTemplate(self, request, context): - """Deletes a tag template and all tags using the template. - Users should enable the Data Catalog API in the project identified by - the `name` parameter (see [Data Catalog Resource Project] - (https://cloud.google.com/data-catalog/docs/concepts/resource-project) for - more information). - """ - context.set_code(grpc.StatusCode.UNIMPLEMENTED) - context.set_details("Method not implemented!") - raise NotImplementedError("Method not implemented!") - - def CreateTagTemplateField(self, request, context): - """Creates a field in a tag template. The user should enable the Data Catalog - API in the project identified by the `parent` parameter (see - [Data Catalog Resource - Project](https://cloud.google.com/data-catalog/docs/concepts/resource-project) - for more information). - """ - context.set_code(grpc.StatusCode.UNIMPLEMENTED) - context.set_details("Method not implemented!") - raise NotImplementedError("Method not implemented!") - - def UpdateTagTemplateField(self, request, context): - """Updates a field in a tag template. This method cannot be used to update the - field type. Users should enable the Data Catalog API in the project - identified by the `name` parameter (see [Data Catalog Resource Project] - (https://cloud.google.com/data-catalog/docs/concepts/resource-project) for - more information). - """ - context.set_code(grpc.StatusCode.UNIMPLEMENTED) - context.set_details("Method not implemented!") - raise NotImplementedError("Method not implemented!") - - def RenameTagTemplateField(self, request, context): - """Renames a field in a tag template. 
The user should enable the Data Catalog - API in the project identified by the `name` parameter (see [Data Catalog - Resource - Project](https://cloud.google.com/data-catalog/docs/concepts/resource-project) - for more information). - """ - context.set_code(grpc.StatusCode.UNIMPLEMENTED) - context.set_details("Method not implemented!") - raise NotImplementedError("Method not implemented!") - - def DeleteTagTemplateField(self, request, context): - """Deletes a field in a tag template and all uses of that field. - Users should enable the Data Catalog API in the project identified by - the `name` parameter (see [Data Catalog Resource Project] - (https://cloud.google.com/data-catalog/docs/concepts/resource-project) for - more information). - """ - context.set_code(grpc.StatusCode.UNIMPLEMENTED) - context.set_details("Method not implemented!") - raise NotImplementedError("Method not implemented!") - - def CreateTag(self, request, context): - """Creates a tag on an [Entry][google.cloud.datacatalog.v1.Entry]. - Note: The project identified by the `parent` parameter for the - [tag](https://cloud.google.com/data-catalog/docs/reference/rest/v1/projects.locations.entryGroups.entries.tags/create#path-parameters) - and the - [tag - template](https://cloud.google.com/data-catalog/docs/reference/rest/v1/projects.locations.tagTemplates/create#path-parameters) - used to create the tag must be from the same organization. - """ - context.set_code(grpc.StatusCode.UNIMPLEMENTED) - context.set_details("Method not implemented!") - raise NotImplementedError("Method not implemented!") - - def UpdateTag(self, request, context): - """Updates an existing tag. - """ - context.set_code(grpc.StatusCode.UNIMPLEMENTED) - context.set_details("Method not implemented!") - raise NotImplementedError("Method not implemented!") - - def DeleteTag(self, request, context): - """Deletes a tag. 
- """ - context.set_code(grpc.StatusCode.UNIMPLEMENTED) - context.set_details("Method not implemented!") - raise NotImplementedError("Method not implemented!") - - def ListTags(self, request, context): - """Lists the tags on an [Entry][google.cloud.datacatalog.v1.Entry]. - """ - context.set_code(grpc.StatusCode.UNIMPLEMENTED) - context.set_details("Method not implemented!") - raise NotImplementedError("Method not implemented!") - - def SetIamPolicy(self, request, context): - """Sets the access control policy for a resource. Replaces any existing - policy. - Supported resources are: - - Tag templates. - - Entries. - - Entry groups. - Note, this method cannot be used to manage policies for BigQuery, Pub/Sub - and any external Google Cloud Platform resources synced to Data Catalog. - - Callers must have following Google IAM permission - - `datacatalog.tagTemplates.setIamPolicy` to set policies on tag - templates. - - `datacatalog.entries.setIamPolicy` to set policies on entries. - - `datacatalog.entryGroups.setIamPolicy` to set policies on entry groups. - """ - context.set_code(grpc.StatusCode.UNIMPLEMENTED) - context.set_details("Method not implemented!") - raise NotImplementedError("Method not implemented!") - - def GetIamPolicy(self, request, context): - """Gets the access control policy for a resource. A `NOT_FOUND` error - is returned if the resource does not exist. An empty policy is returned - if the resource exists but does not have a policy set on it. - - Supported resources are: - - Tag templates. - - Entries. - - Entry groups. - Note, this method cannot be used to manage policies for BigQuery, Pub/Sub - and any external Google Cloud Platform resources synced to Data Catalog. - - Callers must have following Google IAM permission - - `datacatalog.tagTemplates.getIamPolicy` to get policies on tag - templates. - - `datacatalog.entries.getIamPolicy` to get policies on entries. - - `datacatalog.entryGroups.getIamPolicy` to get policies on entry groups. 
- """ - context.set_code(grpc.StatusCode.UNIMPLEMENTED) - context.set_details("Method not implemented!") - raise NotImplementedError("Method not implemented!") - - def TestIamPermissions(self, request, context): - """Returns the caller's permissions on a resource. - If the resource does not exist, an empty set of permissions is returned - (We don't return a `NOT_FOUND` error). - - Supported resources are: - - Tag templates. - - Entries. - - Entry groups. - Note, this method cannot be used to manage policies for BigQuery, Pub/Sub - and any external Google Cloud Platform resources synced to Data Catalog. - - A caller is not required to have Google IAM permission to make this - request. - """ - context.set_code(grpc.StatusCode.UNIMPLEMENTED) - context.set_details("Method not implemented!") - raise NotImplementedError("Method not implemented!") - - -def add_DataCatalogServicer_to_server(servicer, server): - rpc_method_handlers = { - "SearchCatalog": grpc.unary_unary_rpc_method_handler( - servicer.SearchCatalog, - request_deserializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.SearchCatalogRequest.FromString, - response_serializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.SearchCatalogResponse.SerializeToString, - ), - "CreateEntryGroup": grpc.unary_unary_rpc_method_handler( - servicer.CreateEntryGroup, - request_deserializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.CreateEntryGroupRequest.FromString, - response_serializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.EntryGroup.SerializeToString, - ), - "GetEntryGroup": grpc.unary_unary_rpc_method_handler( - servicer.GetEntryGroup, - request_deserializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.GetEntryGroupRequest.FromString, - response_serializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.EntryGroup.SerializeToString, - ), - "UpdateEntryGroup": 
grpc.unary_unary_rpc_method_handler( - servicer.UpdateEntryGroup, - request_deserializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.UpdateEntryGroupRequest.FromString, - response_serializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.EntryGroup.SerializeToString, - ), - "DeleteEntryGroup": grpc.unary_unary_rpc_method_handler( - servicer.DeleteEntryGroup, - request_deserializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.DeleteEntryGroupRequest.FromString, - response_serializer=google_dot_protobuf_dot_empty__pb2.Empty.SerializeToString, - ), - "ListEntryGroups": grpc.unary_unary_rpc_method_handler( - servicer.ListEntryGroups, - request_deserializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.ListEntryGroupsRequest.FromString, - response_serializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.ListEntryGroupsResponse.SerializeToString, - ), - "CreateEntry": grpc.unary_unary_rpc_method_handler( - servicer.CreateEntry, - request_deserializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.CreateEntryRequest.FromString, - response_serializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.Entry.SerializeToString, - ), - "UpdateEntry": grpc.unary_unary_rpc_method_handler( - servicer.UpdateEntry, - request_deserializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.UpdateEntryRequest.FromString, - response_serializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.Entry.SerializeToString, - ), - "DeleteEntry": grpc.unary_unary_rpc_method_handler( - servicer.DeleteEntry, - request_deserializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.DeleteEntryRequest.FromString, - response_serializer=google_dot_protobuf_dot_empty__pb2.Empty.SerializeToString, - ), - "GetEntry": grpc.unary_unary_rpc_method_handler( - servicer.GetEntry, - 
request_deserializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.GetEntryRequest.FromString, - response_serializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.Entry.SerializeToString, - ), - "LookupEntry": grpc.unary_unary_rpc_method_handler( - servicer.LookupEntry, - request_deserializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.LookupEntryRequest.FromString, - response_serializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.Entry.SerializeToString, - ), - "ListEntries": grpc.unary_unary_rpc_method_handler( - servicer.ListEntries, - request_deserializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.ListEntriesRequest.FromString, - response_serializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.ListEntriesResponse.SerializeToString, - ), - "CreateTagTemplate": grpc.unary_unary_rpc_method_handler( - servicer.CreateTagTemplate, - request_deserializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.CreateTagTemplateRequest.FromString, - response_serializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_tags__pb2.TagTemplate.SerializeToString, - ), - "GetTagTemplate": grpc.unary_unary_rpc_method_handler( - servicer.GetTagTemplate, - request_deserializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.GetTagTemplateRequest.FromString, - response_serializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_tags__pb2.TagTemplate.SerializeToString, - ), - "UpdateTagTemplate": grpc.unary_unary_rpc_method_handler( - servicer.UpdateTagTemplate, - request_deserializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.UpdateTagTemplateRequest.FromString, - response_serializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_tags__pb2.TagTemplate.SerializeToString, - ), - "DeleteTagTemplate": grpc.unary_unary_rpc_method_handler( - servicer.DeleteTagTemplate, - 
request_deserializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.DeleteTagTemplateRequest.FromString, - response_serializer=google_dot_protobuf_dot_empty__pb2.Empty.SerializeToString, - ), - "CreateTagTemplateField": grpc.unary_unary_rpc_method_handler( - servicer.CreateTagTemplateField, - request_deserializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.CreateTagTemplateFieldRequest.FromString, - response_serializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_tags__pb2.TagTemplateField.SerializeToString, - ), - "UpdateTagTemplateField": grpc.unary_unary_rpc_method_handler( - servicer.UpdateTagTemplateField, - request_deserializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.UpdateTagTemplateFieldRequest.FromString, - response_serializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_tags__pb2.TagTemplateField.SerializeToString, - ), - "RenameTagTemplateField": grpc.unary_unary_rpc_method_handler( - servicer.RenameTagTemplateField, - request_deserializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.RenameTagTemplateFieldRequest.FromString, - response_serializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_tags__pb2.TagTemplateField.SerializeToString, - ), - "DeleteTagTemplateField": grpc.unary_unary_rpc_method_handler( - servicer.DeleteTagTemplateField, - request_deserializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.DeleteTagTemplateFieldRequest.FromString, - response_serializer=google_dot_protobuf_dot_empty__pb2.Empty.SerializeToString, - ), - "CreateTag": grpc.unary_unary_rpc_method_handler( - servicer.CreateTag, - request_deserializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.CreateTagRequest.FromString, - response_serializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_tags__pb2.Tag.SerializeToString, - ), - "UpdateTag": grpc.unary_unary_rpc_method_handler( - servicer.UpdateTag, - 
request_deserializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.UpdateTagRequest.FromString, - response_serializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_tags__pb2.Tag.SerializeToString, - ), - "DeleteTag": grpc.unary_unary_rpc_method_handler( - servicer.DeleteTag, - request_deserializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.DeleteTagRequest.FromString, - response_serializer=google_dot_protobuf_dot_empty__pb2.Empty.SerializeToString, - ), - "ListTags": grpc.unary_unary_rpc_method_handler( - servicer.ListTags, - request_deserializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.ListTagsRequest.FromString, - response_serializer=google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.ListTagsResponse.SerializeToString, - ), - "SetIamPolicy": grpc.unary_unary_rpc_method_handler( - servicer.SetIamPolicy, - request_deserializer=google_dot_iam_dot_v1_dot_iam__policy__pb2.SetIamPolicyRequest.FromString, - response_serializer=google_dot_iam_dot_v1_dot_policy__pb2.Policy.SerializeToString, - ), - "GetIamPolicy": grpc.unary_unary_rpc_method_handler( - servicer.GetIamPolicy, - request_deserializer=google_dot_iam_dot_v1_dot_iam__policy__pb2.GetIamPolicyRequest.FromString, - response_serializer=google_dot_iam_dot_v1_dot_policy__pb2.Policy.SerializeToString, - ), - "TestIamPermissions": grpc.unary_unary_rpc_method_handler( - servicer.TestIamPermissions, - request_deserializer=google_dot_iam_dot_v1_dot_iam__policy__pb2.TestIamPermissionsRequest.FromString, - response_serializer=google_dot_iam_dot_v1_dot_iam__policy__pb2.TestIamPermissionsResponse.SerializeToString, - ), - } - generic_handler = grpc.method_handlers_generic_handler( - "google.cloud.datacatalog.v1.DataCatalog", rpc_method_handlers - ) - server.add_generic_rpc_handlers((generic_handler,)) - - -# This class is part of an EXPERIMENTAL API. 
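The generated registration above builds a table mapping each RPC name to a `grpc.unary_unary_rpc_method_handler` that pairs the servicer method with its request deserializer and response serializer, then hands the whole table to `grpc.method_handlers_generic_handler`. A minimal plain-Python sketch of that dispatch-table pattern — JSON standing in for protobuf, and the `get_entry`/`delete_entry` handler names being illustrative, not part of the real API — might look like:

```python
import json

def make_handlers(servicer):
    # Each RPC name maps to (request_deserializer, handler, response_serializer),
    # mirroring the structure of the generated rpc_method_handlers dict.
    return {
        "GetEntry": (json.loads, servicer.get_entry, json.dumps),
        "DeleteEntry": (json.loads, servicer.delete_entry, json.dumps),
    }

def dispatch(handlers, method, raw_request):
    # Deserialize the wire bytes, invoke the servicer method, reserialize.
    deserialize, handle, serialize = handlers[method]
    return serialize(handle(deserialize(raw_request)))
```

In the generated code the same lookup is performed by the gRPC server itself once the generic handler is registered with `server.add_generic_rpc_handlers(...)`.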
-class DataCatalog(object): - """Data Catalog API service allows clients to discover, understand, and manage - their data. - """ - - @staticmethod - def SearchCatalog( - request, - target, - options=(), - channel_credentials=None, - call_credentials=None, - compression=None, - wait_for_ready=None, - timeout=None, - metadata=None, - ): - return grpc.experimental.unary_unary( - request, - target, - "/google.cloud.datacatalog.v1.DataCatalog/SearchCatalog", - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.SearchCatalogRequest.SerializeToString, - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.SearchCatalogResponse.FromString, - options, - channel_credentials, - call_credentials, - compression, - wait_for_ready, - timeout, - metadata, - ) - - @staticmethod - def CreateEntryGroup( - request, - target, - options=(), - channel_credentials=None, - call_credentials=None, - compression=None, - wait_for_ready=None, - timeout=None, - metadata=None, - ): - return grpc.experimental.unary_unary( - request, - target, - "/google.cloud.datacatalog.v1.DataCatalog/CreateEntryGroup", - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.CreateEntryGroupRequest.SerializeToString, - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.EntryGroup.FromString, - options, - channel_credentials, - call_credentials, - compression, - wait_for_ready, - timeout, - metadata, - ) - - @staticmethod - def GetEntryGroup( - request, - target, - options=(), - channel_credentials=None, - call_credentials=None, - compression=None, - wait_for_ready=None, - timeout=None, - metadata=None, - ): - return grpc.experimental.unary_unary( - request, - target, - "/google.cloud.datacatalog.v1.DataCatalog/GetEntryGroup", - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.GetEntryGroupRequest.SerializeToString, - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.EntryGroup.FromString, - options, - 
channel_credentials, - call_credentials, - compression, - wait_for_ready, - timeout, - metadata, - ) - - @staticmethod - def UpdateEntryGroup( - request, - target, - options=(), - channel_credentials=None, - call_credentials=None, - compression=None, - wait_for_ready=None, - timeout=None, - metadata=None, - ): - return grpc.experimental.unary_unary( - request, - target, - "/google.cloud.datacatalog.v1.DataCatalog/UpdateEntryGroup", - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.UpdateEntryGroupRequest.SerializeToString, - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.EntryGroup.FromString, - options, - channel_credentials, - call_credentials, - compression, - wait_for_ready, - timeout, - metadata, - ) - - @staticmethod - def DeleteEntryGroup( - request, - target, - options=(), - channel_credentials=None, - call_credentials=None, - compression=None, - wait_for_ready=None, - timeout=None, - metadata=None, - ): - return grpc.experimental.unary_unary( - request, - target, - "/google.cloud.datacatalog.v1.DataCatalog/DeleteEntryGroup", - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.DeleteEntryGroupRequest.SerializeToString, - google_dot_protobuf_dot_empty__pb2.Empty.FromString, - options, - channel_credentials, - call_credentials, - compression, - wait_for_ready, - timeout, - metadata, - ) - - @staticmethod - def ListEntryGroups( - request, - target, - options=(), - channel_credentials=None, - call_credentials=None, - compression=None, - wait_for_ready=None, - timeout=None, - metadata=None, - ): - return grpc.experimental.unary_unary( - request, - target, - "/google.cloud.datacatalog.v1.DataCatalog/ListEntryGroups", - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.ListEntryGroupsRequest.SerializeToString, - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.ListEntryGroupsResponse.FromString, - options, - channel_credentials, - call_credentials, - compression, - 
wait_for_ready, - timeout, - metadata, - ) - - @staticmethod - def CreateEntry( - request, - target, - options=(), - channel_credentials=None, - call_credentials=None, - compression=None, - wait_for_ready=None, - timeout=None, - metadata=None, - ): - return grpc.experimental.unary_unary( - request, - target, - "/google.cloud.datacatalog.v1.DataCatalog/CreateEntry", - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.CreateEntryRequest.SerializeToString, - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.Entry.FromString, - options, - channel_credentials, - call_credentials, - compression, - wait_for_ready, - timeout, - metadata, - ) - - @staticmethod - def UpdateEntry( - request, - target, - options=(), - channel_credentials=None, - call_credentials=None, - compression=None, - wait_for_ready=None, - timeout=None, - metadata=None, - ): - return grpc.experimental.unary_unary( - request, - target, - "/google.cloud.datacatalog.v1.DataCatalog/UpdateEntry", - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.UpdateEntryRequest.SerializeToString, - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.Entry.FromString, - options, - channel_credentials, - call_credentials, - compression, - wait_for_ready, - timeout, - metadata, - ) - - @staticmethod - def DeleteEntry( - request, - target, - options=(), - channel_credentials=None, - call_credentials=None, - compression=None, - wait_for_ready=None, - timeout=None, - metadata=None, - ): - return grpc.experimental.unary_unary( - request, - target, - "/google.cloud.datacatalog.v1.DataCatalog/DeleteEntry", - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.DeleteEntryRequest.SerializeToString, - google_dot_protobuf_dot_empty__pb2.Empty.FromString, - options, - channel_credentials, - call_credentials, - compression, - wait_for_ready, - timeout, - metadata, - ) - - @staticmethod - def GetEntry( - request, - target, - options=(), - 
channel_credentials=None, - call_credentials=None, - compression=None, - wait_for_ready=None, - timeout=None, - metadata=None, - ): - return grpc.experimental.unary_unary( - request, - target, - "/google.cloud.datacatalog.v1.DataCatalog/GetEntry", - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.GetEntryRequest.SerializeToString, - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.Entry.FromString, - options, - channel_credentials, - call_credentials, - compression, - wait_for_ready, - timeout, - metadata, - ) - - @staticmethod - def LookupEntry( - request, - target, - options=(), - channel_credentials=None, - call_credentials=None, - compression=None, - wait_for_ready=None, - timeout=None, - metadata=None, - ): - return grpc.experimental.unary_unary( - request, - target, - "/google.cloud.datacatalog.v1.DataCatalog/LookupEntry", - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.LookupEntryRequest.SerializeToString, - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.Entry.FromString, - options, - channel_credentials, - call_credentials, - compression, - wait_for_ready, - timeout, - metadata, - ) - - @staticmethod - def ListEntries( - request, - target, - options=(), - channel_credentials=None, - call_credentials=None, - compression=None, - wait_for_ready=None, - timeout=None, - metadata=None, - ): - return grpc.experimental.unary_unary( - request, - target, - "/google.cloud.datacatalog.v1.DataCatalog/ListEntries", - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.ListEntriesRequest.SerializeToString, - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.ListEntriesResponse.FromString, - options, - channel_credentials, - call_credentials, - compression, - wait_for_ready, - timeout, - metadata, - ) - - @staticmethod - def CreateTagTemplate( - request, - target, - options=(), - channel_credentials=None, - call_credentials=None, - compression=None, - 
wait_for_ready=None, - timeout=None, - metadata=None, - ): - return grpc.experimental.unary_unary( - request, - target, - "/google.cloud.datacatalog.v1.DataCatalog/CreateTagTemplate", - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.CreateTagTemplateRequest.SerializeToString, - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_tags__pb2.TagTemplate.FromString, - options, - channel_credentials, - call_credentials, - compression, - wait_for_ready, - timeout, - metadata, - ) - - @staticmethod - def GetTagTemplate( - request, - target, - options=(), - channel_credentials=None, - call_credentials=None, - compression=None, - wait_for_ready=None, - timeout=None, - metadata=None, - ): - return grpc.experimental.unary_unary( - request, - target, - "/google.cloud.datacatalog.v1.DataCatalog/GetTagTemplate", - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.GetTagTemplateRequest.SerializeToString, - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_tags__pb2.TagTemplate.FromString, - options, - channel_credentials, - call_credentials, - compression, - wait_for_ready, - timeout, - metadata, - ) - - @staticmethod - def UpdateTagTemplate( - request, - target, - options=(), - channel_credentials=None, - call_credentials=None, - compression=None, - wait_for_ready=None, - timeout=None, - metadata=None, - ): - return grpc.experimental.unary_unary( - request, - target, - "/google.cloud.datacatalog.v1.DataCatalog/UpdateTagTemplate", - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.UpdateTagTemplateRequest.SerializeToString, - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_tags__pb2.TagTemplate.FromString, - options, - channel_credentials, - call_credentials, - compression, - wait_for_ready, - timeout, - metadata, - ) - - @staticmethod - def DeleteTagTemplate( - request, - target, - options=(), - channel_credentials=None, - call_credentials=None, - compression=None, - wait_for_ready=None, - timeout=None, - 
metadata=None, - ): - return grpc.experimental.unary_unary( - request, - target, - "/google.cloud.datacatalog.v1.DataCatalog/DeleteTagTemplate", - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.DeleteTagTemplateRequest.SerializeToString, - google_dot_protobuf_dot_empty__pb2.Empty.FromString, - options, - channel_credentials, - call_credentials, - compression, - wait_for_ready, - timeout, - metadata, - ) - - @staticmethod - def CreateTagTemplateField( - request, - target, - options=(), - channel_credentials=None, - call_credentials=None, - compression=None, - wait_for_ready=None, - timeout=None, - metadata=None, - ): - return grpc.experimental.unary_unary( - request, - target, - "/google.cloud.datacatalog.v1.DataCatalog/CreateTagTemplateField", - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.CreateTagTemplateFieldRequest.SerializeToString, - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_tags__pb2.TagTemplateField.FromString, - options, - channel_credentials, - call_credentials, - compression, - wait_for_ready, - timeout, - metadata, - ) - - @staticmethod - def UpdateTagTemplateField( - request, - target, - options=(), - channel_credentials=None, - call_credentials=None, - compression=None, - wait_for_ready=None, - timeout=None, - metadata=None, - ): - return grpc.experimental.unary_unary( - request, - target, - "/google.cloud.datacatalog.v1.DataCatalog/UpdateTagTemplateField", - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.UpdateTagTemplateFieldRequest.SerializeToString, - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_tags__pb2.TagTemplateField.FromString, - options, - channel_credentials, - call_credentials, - compression, - wait_for_ready, - timeout, - metadata, - ) - - @staticmethod - def RenameTagTemplateField( - request, - target, - options=(), - channel_credentials=None, - call_credentials=None, - compression=None, - wait_for_ready=None, - timeout=None, - metadata=None, - ): - return 
grpc.experimental.unary_unary( - request, - target, - "/google.cloud.datacatalog.v1.DataCatalog/RenameTagTemplateField", - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.RenameTagTemplateFieldRequest.SerializeToString, - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_tags__pb2.TagTemplateField.FromString, - options, - channel_credentials, - call_credentials, - compression, - wait_for_ready, - timeout, - metadata, - ) - - @staticmethod - def DeleteTagTemplateField( - request, - target, - options=(), - channel_credentials=None, - call_credentials=None, - compression=None, - wait_for_ready=None, - timeout=None, - metadata=None, - ): - return grpc.experimental.unary_unary( - request, - target, - "/google.cloud.datacatalog.v1.DataCatalog/DeleteTagTemplateField", - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.DeleteTagTemplateFieldRequest.SerializeToString, - google_dot_protobuf_dot_empty__pb2.Empty.FromString, - options, - channel_credentials, - call_credentials, - compression, - wait_for_ready, - timeout, - metadata, - ) - - @staticmethod - def CreateTag( - request, - target, - options=(), - channel_credentials=None, - call_credentials=None, - compression=None, - wait_for_ready=None, - timeout=None, - metadata=None, - ): - return grpc.experimental.unary_unary( - request, - target, - "/google.cloud.datacatalog.v1.DataCatalog/CreateTag", - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.CreateTagRequest.SerializeToString, - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_tags__pb2.Tag.FromString, - options, - channel_credentials, - call_credentials, - compression, - wait_for_ready, - timeout, - metadata, - ) - - @staticmethod - def UpdateTag( - request, - target, - options=(), - channel_credentials=None, - call_credentials=None, - compression=None, - wait_for_ready=None, - timeout=None, - metadata=None, - ): - return grpc.experimental.unary_unary( - request, - target, - 
"/google.cloud.datacatalog.v1.DataCatalog/UpdateTag", - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.UpdateTagRequest.SerializeToString, - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_tags__pb2.Tag.FromString, - options, - channel_credentials, - call_credentials, - compression, - wait_for_ready, - timeout, - metadata, - ) - - @staticmethod - def DeleteTag( - request, - target, - options=(), - channel_credentials=None, - call_credentials=None, - compression=None, - wait_for_ready=None, - timeout=None, - metadata=None, - ): - return grpc.experimental.unary_unary( - request, - target, - "/google.cloud.datacatalog.v1.DataCatalog/DeleteTag", - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.DeleteTagRequest.SerializeToString, - google_dot_protobuf_dot_empty__pb2.Empty.FromString, - options, - channel_credentials, - call_credentials, - compression, - wait_for_ready, - timeout, - metadata, - ) - - @staticmethod - def ListTags( - request, - target, - options=(), - channel_credentials=None, - call_credentials=None, - compression=None, - wait_for_ready=None, - timeout=None, - metadata=None, - ): - return grpc.experimental.unary_unary( - request, - target, - "/google.cloud.datacatalog.v1.DataCatalog/ListTags", - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.ListTagsRequest.SerializeToString, - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_datacatalog__pb2.ListTagsResponse.FromString, - options, - channel_credentials, - call_credentials, - compression, - wait_for_ready, - timeout, - metadata, - ) - - @staticmethod - def SetIamPolicy( - request, - target, - options=(), - channel_credentials=None, - call_credentials=None, - compression=None, - wait_for_ready=None, - timeout=None, - metadata=None, - ): - return grpc.experimental.unary_unary( - request, - target, - "/google.cloud.datacatalog.v1.DataCatalog/SetIamPolicy", - google_dot_iam_dot_v1_dot_iam__policy__pb2.SetIamPolicyRequest.SerializeToString, - 
google_dot_iam_dot_v1_dot_policy__pb2.Policy.FromString, - options, - channel_credentials, - call_credentials, - compression, - wait_for_ready, - timeout, - metadata, - ) - - @staticmethod - def GetIamPolicy( - request, - target, - options=(), - channel_credentials=None, - call_credentials=None, - compression=None, - wait_for_ready=None, - timeout=None, - metadata=None, - ): - return grpc.experimental.unary_unary( - request, - target, - "/google.cloud.datacatalog.v1.DataCatalog/GetIamPolicy", - google_dot_iam_dot_v1_dot_iam__policy__pb2.GetIamPolicyRequest.SerializeToString, - google_dot_iam_dot_v1_dot_policy__pb2.Policy.FromString, - options, - channel_credentials, - call_credentials, - compression, - wait_for_ready, - timeout, - metadata, - ) - - @staticmethod - def TestIamPermissions( - request, - target, - options=(), - channel_credentials=None, - call_credentials=None, - compression=None, - wait_for_ready=None, - timeout=None, - metadata=None, - ): - return grpc.experimental.unary_unary( - request, - target, - "/google.cloud.datacatalog.v1.DataCatalog/TestIamPermissions", - google_dot_iam_dot_v1_dot_iam__policy__pb2.TestIamPermissionsRequest.SerializeToString, - google_dot_iam_dot_v1_dot_iam__policy__pb2.TestIamPermissionsResponse.FromString, - options, - channel_credentials, - call_credentials, - compression, - wait_for_ready, - timeout, - metadata, - ) diff --git a/google/cloud/datacatalog_v1/proto/gcs_fileset_spec.proto b/google/cloud/datacatalog_v1/proto/gcs_fileset_spec.proto new file mode 100644 index 00000000..bcf0ead6 --- /dev/null +++ b/google/cloud/datacatalog_v1/proto/gcs_fileset_spec.proto @@ -0,0 +1,77 @@ +// Copyright 2020 Google LLC +// +// Licensed under the Apache License, Version 2.0 (the "License"); +// you may not use this file except in compliance with the License. 
+// You may obtain a copy of the License at +// +// http://www.apache.org/licenses/LICENSE-2.0 +// +// Unless required by applicable law or agreed to in writing, software +// distributed under the License is distributed on an "AS IS" BASIS, +// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +// See the License for the specific language governing permissions and +// limitations under the License. + +syntax = "proto3"; + +package google.cloud.datacatalog.v1; + +import "google/api/field_behavior.proto"; +import "google/cloud/datacatalog/v1/timestamps.proto"; + +option cc_enable_arenas = true; +option csharp_namespace = "Google.Cloud.DataCatalog.V1"; +option go_package = "google.golang.org/genproto/googleapis/cloud/datacatalog/v1;datacatalog"; +option java_multiple_files = true; +option java_package = "com.google.cloud.datacatalog.v1"; +option php_namespace = "Google\\Cloud\\DataCatalog\\V1"; +option ruby_package = "Google::Cloud::DataCatalog::V1"; + +// Describes a Cloud Storage fileset entry. +message GcsFilesetSpec { + // Required. Patterns to identify a set of files in Google Cloud Storage. + // See [Cloud Storage + // documentation](https://cloud.google.com/storage/docs/gsutil/addlhelp/WildcardNames) + // for more information. Note that bucket wildcards are currently not + // supported. + // + // Examples of valid file_patterns: + // + // * `gs://bucket_name/dir/*`: matches all files within `bucket_name/dir` + // directory. + // * `gs://bucket_name/dir/**`: matches all files in `bucket_name/dir` + // spanning all subdirectories. + // * `gs://bucket_name/file*`: matches files prefixed by `file` in + // `bucket_name` + // * `gs://bucket_name/??.txt`: matches files with two characters followed by + // `.txt` in `bucket_name` + // * `gs://bucket_name/[aeiou].txt`: matches files that contain a single + // vowel character followed by `.txt` in + // `bucket_name` + // * `gs://bucket_name/[a-m].txt`: matches files that contain `a`, `b`, ... 
+ // or `m` followed by `.txt` in `bucket_name` + // * `gs://bucket_name/a/*/b`: matches all files in `bucket_name` that match + // `a/*/b` pattern, such as `a/c/b`, `a/d/b` + // * `gs://another_bucket/a.txt`: matches `gs://another_bucket/a.txt` + // + // You can combine wildcards to provide more powerful matches, for example: + // + // * `gs://bucket_name/[a-m]??.j*g` + repeated string file_patterns = 1 [(google.api.field_behavior) = REQUIRED]; + + // Output only. Sample files contained in this fileset; not all files + // in the fileset are represented here. + repeated GcsFileSpec sample_gcs_file_specs = 2 [(google.api.field_behavior) = OUTPUT_ONLY]; +} + +// Specification of a single file in Cloud Storage. +message GcsFileSpec { + // Required. The full file path. Example: `gs://bucket_name/a/b.txt`. + string file_path = 1 [(google.api.field_behavior) = REQUIRED]; + + // Output only. Timestamps about the Cloud Storage file. + SystemTimestamps gcs_timestamps = 2 [(google.api.field_behavior) = OUTPUT_ONLY]; + + // Output only. The size of the file, in bytes. + int64 size_bytes = 4 [(google.api.field_behavior) = OUTPUT_ONLY]; +} diff --git a/google/cloud/datacatalog_v1/proto/gcs_fileset_spec_pb2.py b/google/cloud/datacatalog_v1/proto/gcs_fileset_spec_pb2.py deleted file mode 100644 index 27828664..00000000 --- a/google/cloud/datacatalog_v1/proto/gcs_fileset_spec_pb2.py +++ /dev/null @@ -1,254 +0,0 @@ -# -*- coding: utf-8 -*- -# Generated by the protocol buffer compiler. DO NOT EDIT!
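The `file_patterns` comment above defines wildcard semantics where `*` stops at a directory boundary, `**` spans subdirectories, `?` matches a single character, and `[...]` is a character class. A small translator into regular expressions — a client-side illustration of the documented rules, not Data Catalog's actual server-side matcher — could be sketched as:

```python
import re

def gcs_pattern_to_regex(pattern: str):
    # Translate the documented file_pattern wildcards into a regex.
    parts = []
    i = 0
    while i < len(pattern):
        if pattern.startswith("**", i):
            parts.append(".*")         # ** spans subdirectories
            i += 2
        elif pattern[i] == "*":
            parts.append("[^/]*")      # * stays within one directory level
            i += 1
        elif pattern[i] == "?":
            parts.append("[^/]")       # ? matches one non-slash character
            i += 1
        elif pattern[i] == "[":
            j = pattern.index("]", i)  # pass character classes through verbatim
            parts.append(pattern[i : j + 1])
            i = j + 1
        else:
            parts.append(re.escape(pattern[i]))
            i += 1
    return re.compile("".join(parts) + r"\Z")

dir_only = gcs_pattern_to_regex("gs://bucket_name/dir/*")
recursive = gcs_pattern_to_regex("gs://bucket_name/dir/**")
assert dir_only.match("gs://bucket_name/dir/a.txt")
assert not dir_only.match("gs://bucket_name/dir/sub/a.txt")  # * does not cross /
assert recursive.match("gs://bucket_name/dir/sub/a.txt")     # ** does
```

The distinction between `*` and `**` is the one the proto comment illustrates with `gs://bucket_name/dir/*` versus `gs://bucket_name/dir/**`.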
-# source: google/cloud/datacatalog_v1/proto/gcs_fileset_spec.proto - -from google.protobuf import descriptor as _descriptor -from google.protobuf import message as _message -from google.protobuf import reflection as _reflection -from google.protobuf import symbol_database as _symbol_database - -# @@protoc_insertion_point(imports) - -_sym_db = _symbol_database.Default() - - -from google.api import field_behavior_pb2 as google_dot_api_dot_field__behavior__pb2 -from google.cloud.datacatalog_v1.proto import ( - timestamps_pb2 as google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_timestamps__pb2, -) - - -DESCRIPTOR = _descriptor.FileDescriptor( - name="google/cloud/datacatalog_v1/proto/gcs_fileset_spec.proto", - package="google.cloud.datacatalog.v1", - syntax="proto3", - serialized_options=b"\n\037com.google.cloud.datacatalog.v1P\001ZFgoogle.golang.org/genproto/googleapis/cloud/datacatalog/v1;datacatalog\370\001\001\252\002\033Google.Cloud.DataCatalog.V1\312\002\033Google\\Cloud\\DataCatalog\\V1\352\002\036Google::Cloud::DataCatalog::V1", - create_key=_descriptor._internal_create_key, - serialized_pb=b'\n8google/cloud/datacatalog_v1/proto/gcs_fileset_spec.proto\x12\x1bgoogle.cloud.datacatalog.v1\x1a\x1fgoogle/api/field_behavior.proto\x1a\x32google/cloud/datacatalog_v1/proto/timestamps.proto"z\n\x0eGcsFilesetSpec\x12\x1a\n\rfile_patterns\x18\x01 \x03(\tB\x03\xe0\x41\x02\x12L\n\x15sample_gcs_file_specs\x18\x02 \x03(\x0b\x32(.google.cloud.datacatalog.v1.GcsFileSpecB\x03\xe0\x41\x03"\x8a\x01\n\x0bGcsFileSpec\x12\x16\n\tfile_path\x18\x01 \x01(\tB\x03\xe0\x41\x02\x12J\n\x0egcs_timestamps\x18\x02 \x01(\x0b\x32-.google.cloud.datacatalog.v1.SystemTimestampsB\x03\xe0\x41\x03\x12\x17\n\nsize_bytes\x18\x04 
\x01(\x03\x42\x03\xe0\x41\x03\x42\xcb\x01\n\x1f\x63om.google.cloud.datacatalog.v1P\x01ZFgoogle.golang.org/genproto/googleapis/cloud/datacatalog/v1;datacatalog\xf8\x01\x01\xaa\x02\x1bGoogle.Cloud.DataCatalog.V1\xca\x02\x1bGoogle\\Cloud\\DataCatalog\\V1\xea\x02\x1eGoogle::Cloud::DataCatalog::V1b\x06proto3', - dependencies=[ - google_dot_api_dot_field__behavior__pb2.DESCRIPTOR, - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_timestamps__pb2.DESCRIPTOR, - ], -) - - -_GCSFILESETSPEC = _descriptor.Descriptor( - name="GcsFilesetSpec", - full_name="google.cloud.datacatalog.v1.GcsFilesetSpec", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="file_patterns", - full_name="google.cloud.datacatalog.v1.GcsFilesetSpec.file_patterns", - index=0, - number=1, - type=9, - cpp_type=9, - label=3, - has_default_value=False, - default_value=[], - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="sample_gcs_file_specs", - full_name="google.cloud.datacatalog.v1.GcsFilesetSpec.sample_gcs_file_specs", - index=1, - number=2, - type=11, - cpp_type=10, - label=3, - has_default_value=False, - default_value=[], - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\003", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=174, - serialized_end=296, -) - - -_GCSFILESPEC = _descriptor.Descriptor( - name="GcsFileSpec", - full_name="google.cloud.datacatalog.v1.GcsFileSpec", - filename=None, - file=DESCRIPTOR, - 
containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="file_path", - full_name="google.cloud.datacatalog.v1.GcsFileSpec.file_path", - index=0, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="gcs_timestamps", - full_name="google.cloud.datacatalog.v1.GcsFileSpec.gcs_timestamps", - index=1, - number=2, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\003", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="size_bytes", - full_name="google.cloud.datacatalog.v1.GcsFileSpec.size_bytes", - index=2, - number=4, - type=3, - cpp_type=2, - label=1, - has_default_value=False, - default_value=0, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\003", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=299, - serialized_end=437, -) - -_GCSFILESETSPEC.fields_by_name["sample_gcs_file_specs"].message_type = _GCSFILESPEC -_GCSFILESPEC.fields_by_name[ - "gcs_timestamps" -].message_type = ( - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_timestamps__pb2._SYSTEMTIMESTAMPS -) -DESCRIPTOR.message_types_by_name["GcsFilesetSpec"] = _GCSFILESETSPEC -DESCRIPTOR.message_types_by_name["GcsFileSpec"] = 
_GCSFILESPEC -_sym_db.RegisterFileDescriptor(DESCRIPTOR) - -GcsFilesetSpec = _reflection.GeneratedProtocolMessageType( - "GcsFilesetSpec", - (_message.Message,), - { - "DESCRIPTOR": _GCSFILESETSPEC, - "__module__": "google.cloud.datacatalog_v1.proto.gcs_fileset_spec_pb2", - "__doc__": """Describes a Cloud Storage fileset entry. - - Attributes: - file_patterns: - Required. Patterns to identify a set of files in Google Cloud - Storage. See `Cloud Storage documentation `__ for more - information. Note that bucket wildcards are currently not - supported. Examples of valid file_patterns: - - ``gs://bucket_name/dir/*``: matches all files within - ``bucket_name/dir`` directory. - ``gs://bucket_name/dir/**``: - matches all files in ``bucket_name/dir`` spanning all - subdirectories. - ``gs://bucket_name/file*``: matches files - prefixed by ``file`` in ``bucket_name`` - - ``gs://bucket_name/??.txt``: matches files with two characters - followed by ``.txt`` in ``bucket_name`` - - ``gs://bucket_name/[aeiou].txt``: matches files that contain a - single vowel character followed by ``.txt`` in - ``bucket_name`` - ``gs://bucket_name/[a-m].txt``: matches - files that contain ``a``, ``b``, … or ``m`` followed by - ``.txt`` in ``bucket_name`` - ``gs://bucket_name/a/*/b``: - matches all files in ``bucket_name`` that match ``a/*/b`` - pattern, such as ``a/c/b``, ``a/d/b`` - - ``gs://another_bucket/a.txt``: matches - ``gs://another_bucket/a.txt`` You can combine wildcards to - provide more powerful matches, for example: - - ``gs://bucket_name/[a-m]??.j*g`` - sample_gcs_file_specs: - Output only. Sample files contained in this fileset, not all - files contained in this fileset are represented here. 
- """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1.GcsFilesetSpec) - }, -) -_sym_db.RegisterMessage(GcsFilesetSpec) - -GcsFileSpec = _reflection.GeneratedProtocolMessageType( - "GcsFileSpec", - (_message.Message,), - { - "DESCRIPTOR": _GCSFILESPEC, - "__module__": "google.cloud.datacatalog_v1.proto.gcs_fileset_spec_pb2", - "__doc__": """Specifications of a single file in Cloud Storage. - - Attributes: - file_path: - Required. The full file path. Example: - ``gs://bucket_name/a/b.txt``. - gcs_timestamps: - Output only. Timestamps about the Cloud Storage file. - size_bytes: - Output only. The size of the file, in bytes. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1.GcsFileSpec) - }, -) -_sym_db.RegisterMessage(GcsFileSpec) - - -DESCRIPTOR._options = None -_GCSFILESETSPEC.fields_by_name["file_patterns"]._options = None -_GCSFILESETSPEC.fields_by_name["sample_gcs_file_specs"]._options = None -_GCSFILESPEC.fields_by_name["file_path"]._options = None -_GCSFILESPEC.fields_by_name["gcs_timestamps"]._options = None -_GCSFILESPEC.fields_by_name["size_bytes"]._options = None -# @@protoc_insertion_point(module_scope) diff --git a/google/cloud/datacatalog_v1/proto/gcs_fileset_spec_pb2_grpc.py b/google/cloud/datacatalog_v1/proto/gcs_fileset_spec_pb2_grpc.py deleted file mode 100644 index 8a939394..00000000 --- a/google/cloud/datacatalog_v1/proto/gcs_fileset_spec_pb2_grpc.py +++ /dev/null @@ -1,3 +0,0 @@ -# Generated by the gRPC Python protocol compiler plugin. DO NOT EDIT! 
-"""Client and server classes corresponding to protobuf-defined services.""" -import grpc diff --git a/google/cloud/datacatalog_v1/proto/schema.proto b/google/cloud/datacatalog_v1/proto/schema.proto new file mode 100644 index 00000000..c34d99e2 --- /dev/null +++ b/google/cloud/datacatalog_v1/proto/schema.proto @@ -0,0 +1,55 @@ +// Copyright 2020 Google LLC +// +// Licensed under the Apache License, Version 2.0 (the "License"); +// you may not use this file except in compliance with the License. +// You may obtain a copy of the License at +// +// http://www.apache.org/licenses/LICENSE-2.0 +// +// Unless required by applicable law or agreed to in writing, software +// distributed under the License is distributed on an "AS IS" BASIS, +// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +// See the License for the specific language governing permissions and +// limitations under the License. + +syntax = "proto3"; + +package google.cloud.datacatalog.v1; + +import "google/api/field_behavior.proto"; + +option cc_enable_arenas = true; +option csharp_namespace = "Google.Cloud.DataCatalog.V1"; +option go_package = "google.golang.org/genproto/googleapis/cloud/datacatalog/v1;datacatalog"; +option java_multiple_files = true; +option java_package = "com.google.cloud.datacatalog.v1"; +option php_namespace = "Google\\Cloud\\DataCatalog\\V1"; +option ruby_package = "Google::Cloud::DataCatalog::V1"; + +// Represents a schema (e.g. BigQuery, GoogleSQL, Avro schema). +message Schema { + // Required. Schema of columns. A maximum of 10,000 columns and sub-columns can be + // specified. + repeated ColumnSchema columns = 2 [(google.api.field_behavior) = REQUIRED]; +} + +// Representation of a column within a schema. Columns could be nested inside +// other columns. +message ColumnSchema { + // Required. Name of the column. + string column = 6 [(google.api.field_behavior) = REQUIRED]; + + // Required. Type of the column. 
+ string type = 1 [(google.api.field_behavior) = REQUIRED]; + + // Optional. Description of the column. Default value is an empty string. + string description = 2 [(google.api.field_behavior) = OPTIONAL]; + + // Optional. A column's mode indicates whether the values in this column are required, + // nullable, etc. Only `NULLABLE`, `REQUIRED` and `REPEATED` are supported. + // Default mode is `NULLABLE`. + string mode = 3 [(google.api.field_behavior) = OPTIONAL]; + + // Optional. Schema of sub-columns. A column can have zero or more sub-columns. + repeated ColumnSchema subcolumns = 7 [(google.api.field_behavior) = OPTIONAL]; +} diff --git a/google/cloud/datacatalog_v1/proto/schema_pb2.py b/google/cloud/datacatalog_v1/proto/schema_pb2.py deleted file mode 100644 index 5f32f487..00000000 --- a/google/cloud/datacatalog_v1/proto/schema_pb2.py +++ /dev/null @@ -1,249 +0,0 @@ -# -*- coding: utf-8 -*- -# Generated by the protocol buffer compiler. DO NOT EDIT! -# source: google/cloud/datacatalog_v1/proto/schema.proto - -from google.protobuf import descriptor as _descriptor -from google.protobuf import message as _message -from google.protobuf import reflection as _reflection -from google.protobuf import symbol_database as _symbol_database - -# @@protoc_insertion_point(imports) - -_sym_db = _symbol_database.Default() - - -from google.api import field_behavior_pb2 as google_dot_api_dot_field__behavior__pb2 - - -DESCRIPTOR = _descriptor.FileDescriptor( - name="google/cloud/datacatalog_v1/proto/schema.proto", - package="google.cloud.datacatalog.v1", - syntax="proto3", - serialized_options=b"\n\037com.google.cloud.datacatalog.v1P\001ZFgoogle.golang.org/genproto/googleapis/cloud/datacatalog/v1;datacatalog\370\001\001\252\002\033Google.Cloud.DataCatalog.V1\312\002\033Google\\Cloud\\DataCatalog\\V1\352\002\036Google::Cloud::DataCatalog::V1", - create_key=_descriptor._internal_create_key, - 
serialized_pb=b'\n.google/cloud/datacatalog_v1/proto/schema.proto\x12\x1bgoogle.cloud.datacatalog.v1\x1a\x1fgoogle/api/field_behavior.proto"I\n\x06Schema\x12?\n\x07\x63olumns\x18\x02 \x03(\x0b\x32).google.cloud.datacatalog.v1.ColumnSchemaB\x03\xe0\x41\x02"\xa7\x01\n\x0c\x43olumnSchema\x12\x13\n\x06\x63olumn\x18\x06 \x01(\tB\x03\xe0\x41\x02\x12\x11\n\x04type\x18\x01 \x01(\tB\x03\xe0\x41\x02\x12\x18\n\x0b\x64\x65scription\x18\x02 \x01(\tB\x03\xe0\x41\x01\x12\x11\n\x04mode\x18\x03 \x01(\tB\x03\xe0\x41\x01\x12\x42\n\nsubcolumns\x18\x07 \x03(\x0b\x32).google.cloud.datacatalog.v1.ColumnSchemaB\x03\xe0\x41\x01\x42\xcb\x01\n\x1f\x63om.google.cloud.datacatalog.v1P\x01ZFgoogle.golang.org/genproto/googleapis/cloud/datacatalog/v1;datacatalog\xf8\x01\x01\xaa\x02\x1bGoogle.Cloud.DataCatalog.V1\xca\x02\x1bGoogle\\Cloud\\DataCatalog\\V1\xea\x02\x1eGoogle::Cloud::DataCatalog::V1b\x06proto3', - dependencies=[google_dot_api_dot_field__behavior__pb2.DESCRIPTOR], -) - - -_SCHEMA = _descriptor.Descriptor( - name="Schema", - full_name="google.cloud.datacatalog.v1.Schema", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="columns", - full_name="google.cloud.datacatalog.v1.Schema.columns", - index=0, - number=2, - type=11, - cpp_type=10, - label=3, - has_default_value=False, - default_value=[], - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ) - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=112, - serialized_end=185, -) - - -_COLUMNSCHEMA = _descriptor.Descriptor( - name="ColumnSchema", - full_name="google.cloud.datacatalog.v1.ColumnSchema", - filename=None, - file=DESCRIPTOR, - 
containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="column", - full_name="google.cloud.datacatalog.v1.ColumnSchema.column", - index=0, - number=6, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="type", - full_name="google.cloud.datacatalog.v1.ColumnSchema.type", - index=1, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="description", - full_name="google.cloud.datacatalog.v1.ColumnSchema.description", - index=2, - number=2, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\001", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="mode", - full_name="google.cloud.datacatalog.v1.ColumnSchema.mode", - index=3, - number=3, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\001", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="subcolumns", - full_name="google.cloud.datacatalog.v1.ColumnSchema.subcolumns", - 
index=4, - number=7, - type=11, - cpp_type=10, - label=3, - has_default_value=False, - default_value=[], - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\001", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=188, - serialized_end=355, -) - -_SCHEMA.fields_by_name["columns"].message_type = _COLUMNSCHEMA -_COLUMNSCHEMA.fields_by_name["subcolumns"].message_type = _COLUMNSCHEMA -DESCRIPTOR.message_types_by_name["Schema"] = _SCHEMA -DESCRIPTOR.message_types_by_name["ColumnSchema"] = _COLUMNSCHEMA -_sym_db.RegisterFileDescriptor(DESCRIPTOR) - -Schema = _reflection.GeneratedProtocolMessageType( - "Schema", - (_message.Message,), - { - "DESCRIPTOR": _SCHEMA, - "__module__": "google.cloud.datacatalog_v1.proto.schema_pb2", - "__doc__": """Represents a schema (e.g. BigQuery, GoogleSQL, Avro schema). - - Attributes: - columns: - Required. Schema of columns. A maximum of 10,000 columns and - sub-columns can be specified. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1.Schema) - }, -) -_sym_db.RegisterMessage(Schema) - -ColumnSchema = _reflection.GeneratedProtocolMessageType( - "ColumnSchema", - (_message.Message,), - { - "DESCRIPTOR": _COLUMNSCHEMA, - "__module__": "google.cloud.datacatalog_v1.proto.schema_pb2", - "__doc__": """Representation of a column within a schema. Columns could be nested - inside other columns. - - Attributes: - column: - Required. Name of the column. - type: - Required. Type of the column. - description: - Optional. Description of the column. Default value is an empty - string. - mode: - Optional. A column’s mode indicates whether the values in this - column are required, nullable, etc. 
Only ``NULLABLE``, - ``REQUIRED`` and ``REPEATED`` are supported. Default mode is - ``NULLABLE``. - subcolumns: - Optional. Schema of sub-columns. A column can have zero or - more sub-columns. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1.ColumnSchema) - }, -) -_sym_db.RegisterMessage(ColumnSchema) - - -DESCRIPTOR._options = None -_SCHEMA.fields_by_name["columns"]._options = None -_COLUMNSCHEMA.fields_by_name["column"]._options = None -_COLUMNSCHEMA.fields_by_name["type"]._options = None -_COLUMNSCHEMA.fields_by_name["description"]._options = None -_COLUMNSCHEMA.fields_by_name["mode"]._options = None -_COLUMNSCHEMA.fields_by_name["subcolumns"]._options = None -# @@protoc_insertion_point(module_scope) diff --git a/google/cloud/datacatalog_v1/proto/schema_pb2_grpc.py b/google/cloud/datacatalog_v1/proto/schema_pb2_grpc.py deleted file mode 100644 index 8a939394..00000000 --- a/google/cloud/datacatalog_v1/proto/schema_pb2_grpc.py +++ /dev/null @@ -1,3 +0,0 @@ -# Generated by the gRPC Python protocol compiler plugin. DO NOT EDIT! -"""Client and server classes corresponding to protobuf-defined services.""" -import grpc diff --git a/google/cloud/datacatalog_v1/proto/search.proto b/google/cloud/datacatalog_v1/proto/search.proto new file mode 100644 index 00000000..37f6923b --- /dev/null +++ b/google/cloud/datacatalog_v1/proto/search.proto @@ -0,0 +1,84 @@ +// Copyright 2020 Google LLC +// +// Licensed under the Apache License, Version 2.0 (the "License"); +// you may not use this file except in compliance with the License. +// You may obtain a copy of the License at +// +// http://www.apache.org/licenses/LICENSE-2.0 +// +// Unless required by applicable law or agreed to in writing, software +// distributed under the License is distributed on an "AS IS" BASIS, +// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +// See the License for the specific language governing permissions and +// limitations under the License. 
+ +syntax = "proto3"; + +package google.cloud.datacatalog.v1; + +import "google/api/field_behavior.proto"; +import "google/cloud/datacatalog/v1/common.proto"; +import "google/protobuf/timestamp.proto"; + +option cc_enable_arenas = true; +option csharp_namespace = "Google.Cloud.DataCatalog.V1"; +option go_package = "google.golang.org/genproto/googleapis/cloud/datacatalog/v1;datacatalog"; +option java_multiple_files = true; +option java_package = "com.google.cloud.datacatalog.v1"; +option php_namespace = "Google\\Cloud\\DataCatalog\\V1"; +option ruby_package = "Google::Cloud::DataCatalog::V1"; + +// A result that appears in the response of a search request. Each result +// captures details of one entry that matches the search. +message SearchCatalogResult { + // Type of the search result. This field can be used to determine which Get + // method to call to fetch the full resource. + SearchResultType search_result_type = 1; + + // Sub-type of the search result. This is a dot-delimited description of the + // resource's full type, and is the same as the value callers would provide in + // the "type" search facet. Examples: `entry.table`, `entry.dataStream`, + // `tagTemplate`. + string search_result_subtype = 2; + + // The relative resource name of the resource in URL format. + // Examples: + // + // * `projects/{project_id}/locations/{location_id}/entryGroups/{entry_group_id}/entries/{entry_id}` + // * `projects/{project_id}/tagTemplates/{tag_template_id}` + string relative_resource_name = 3; + + // The full name of the cloud resource the entry belongs to. See: + // https://cloud.google.com/apis/design/resource_names#full_resource_name. + // Example: + // + // * `//bigquery.googleapis.com/projects/projectId/datasets/datasetId/tables/tableId` + string linked_resource = 4; + + // The source system of the entry. Only applicable when `search_result_type` + // is ENTRY. + oneof system { + // Output only. 
This field indicates the entry's source system that Data Catalog + // integrates with, such as BigQuery or Cloud Pub/Sub. + IntegratedSystem integrated_system = 8 [(google.api.field_behavior) = OUTPUT_ONLY]; + + // This field indicates the entry's source system that Data Catalog does not + // integrate with. + string user_specified_system = 9; + } +} + +// The different types of resources that can be returned in search. +enum SearchResultType { + // Default unknown type. + SEARCH_RESULT_TYPE_UNSPECIFIED = 0; + + // An [Entry][google.cloud.datacatalog.v1.Entry]. + ENTRY = 1; + + // A [TagTemplate][google.cloud.datacatalog.v1.TagTemplate]. + TAG_TEMPLATE = 2; + + // An [EntryGroup][google.cloud.datacatalog.v1.EntryGroup]. + ENTRY_GROUP = 3; +} diff --git a/google/cloud/datacatalog_v1/proto/search_pb2.py b/google/cloud/datacatalog_v1/proto/search_pb2.py deleted file mode 100644 index 0fa3ff18..00000000 --- a/google/cloud/datacatalog_v1/proto/search_pb2.py +++ /dev/null @@ -1,305 +0,0 @@ -# -*- coding: utf-8 -*- -# Generated by the protocol buffer compiler. DO NOT EDIT! 
-# source: google/cloud/datacatalog_v1/proto/search.proto - -from google.protobuf.internal import enum_type_wrapper -from google.protobuf import descriptor as _descriptor -from google.protobuf import message as _message -from google.protobuf import reflection as _reflection -from google.protobuf import symbol_database as _symbol_database - -# @@protoc_insertion_point(imports) - -_sym_db = _symbol_database.Default() - - -from google.api import field_behavior_pb2 as google_dot_api_dot_field__behavior__pb2 -from google.cloud.datacatalog_v1.proto import ( - common_pb2 as google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_common__pb2, -) -from google.protobuf import timestamp_pb2 as google_dot_protobuf_dot_timestamp__pb2 - - -DESCRIPTOR = _descriptor.FileDescriptor( - name="google/cloud/datacatalog_v1/proto/search.proto", - package="google.cloud.datacatalog.v1", - syntax="proto3", - serialized_options=b"\n\037com.google.cloud.datacatalog.v1P\001ZFgoogle.golang.org/genproto/googleapis/cloud/datacatalog/v1;datacatalog\370\001\001\252\002\033Google.Cloud.DataCatalog.V1\312\002\033Google\\Cloud\\DataCatalog\\V1\352\002\036Google::Cloud::DataCatalog::V1", - create_key=_descriptor._internal_create_key, - serialized_pb=b'\n.google/cloud/datacatalog_v1/proto/search.proto\x12\x1bgoogle.cloud.datacatalog.v1\x1a\x1fgoogle/api/field_behavior.proto\x1a.google/cloud/datacatalog_v1/proto/common.proto\x1a\x1fgoogle/protobuf/timestamp.proto"\xb4\x02\n\x13SearchCatalogResult\x12I\n\x12search_result_type\x18\x01 \x01(\x0e\x32-.google.cloud.datacatalog.v1.SearchResultType\x12\x1d\n\x15search_result_subtype\x18\x02 \x01(\t\x12\x1e\n\x16relative_resource_name\x18\x03 \x01(\t\x12\x17\n\x0flinked_resource\x18\x04 \x01(\t\x12O\n\x11integrated_system\x18\x08 \x01(\x0e\x32-.google.cloud.datacatalog.v1.IntegratedSystemB\x03\xe0\x41\x03H\x00\x12\x1f\n\x15user_specified_system\x18\t 
\x01(\tH\x00\x42\x08\n\x06system*d\n\x10SearchResultType\x12"\n\x1eSEARCH_RESULT_TYPE_UNSPECIFIED\x10\x00\x12\t\n\x05\x45NTRY\x10\x01\x12\x10\n\x0cTAG_TEMPLATE\x10\x02\x12\x0f\n\x0b\x45NTRY_GROUP\x10\x03\x42\xcb\x01\n\x1f\x63om.google.cloud.datacatalog.v1P\x01ZFgoogle.golang.org/genproto/googleapis/cloud/datacatalog/v1;datacatalog\xf8\x01\x01\xaa\x02\x1bGoogle.Cloud.DataCatalog.V1\xca\x02\x1bGoogle\\Cloud\\DataCatalog\\V1\xea\x02\x1eGoogle::Cloud::DataCatalog::V1b\x06proto3', - dependencies=[ - google_dot_api_dot_field__behavior__pb2.DESCRIPTOR, - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_common__pb2.DESCRIPTOR, - google_dot_protobuf_dot_timestamp__pb2.DESCRIPTOR, - ], -) - -_SEARCHRESULTTYPE = _descriptor.EnumDescriptor( - name="SearchResultType", - full_name="google.cloud.datacatalog.v1.SearchResultType", - filename=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - values=[ - _descriptor.EnumValueDescriptor( - name="SEARCH_RESULT_TYPE_UNSPECIFIED", - index=0, - number=0, - serialized_options=None, - type=None, - create_key=_descriptor._internal_create_key, - ), - _descriptor.EnumValueDescriptor( - name="ENTRY", - index=1, - number=1, - serialized_options=None, - type=None, - create_key=_descriptor._internal_create_key, - ), - _descriptor.EnumValueDescriptor( - name="TAG_TEMPLATE", - index=2, - number=2, - serialized_options=None, - type=None, - create_key=_descriptor._internal_create_key, - ), - _descriptor.EnumValueDescriptor( - name="ENTRY_GROUP", - index=3, - number=3, - serialized_options=None, - type=None, - create_key=_descriptor._internal_create_key, - ), - ], - containing_type=None, - serialized_options=None, - serialized_start=504, - serialized_end=604, -) -_sym_db.RegisterEnumDescriptor(_SEARCHRESULTTYPE) - -SearchResultType = enum_type_wrapper.EnumTypeWrapper(_SEARCHRESULTTYPE) -SEARCH_RESULT_TYPE_UNSPECIFIED = 0 -ENTRY = 1 -TAG_TEMPLATE = 2 -ENTRY_GROUP = 3 - - -_SEARCHCATALOGRESULT = _descriptor.Descriptor( - 
name="SearchCatalogResult", - full_name="google.cloud.datacatalog.v1.SearchCatalogResult", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="search_result_type", - full_name="google.cloud.datacatalog.v1.SearchCatalogResult.search_result_type", - index=0, - number=1, - type=14, - cpp_type=8, - label=1, - has_default_value=False, - default_value=0, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="search_result_subtype", - full_name="google.cloud.datacatalog.v1.SearchCatalogResult.search_result_subtype", - index=1, - number=2, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="relative_resource_name", - full_name="google.cloud.datacatalog.v1.SearchCatalogResult.relative_resource_name", - index=2, - number=3, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="linked_resource", - full_name="google.cloud.datacatalog.v1.SearchCatalogResult.linked_resource", - index=3, - number=4, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - 
serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="integrated_system", - full_name="google.cloud.datacatalog.v1.SearchCatalogResult.integrated_system", - index=4, - number=8, - type=14, - cpp_type=8, - label=1, - has_default_value=False, - default_value=0, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\003", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="user_specified_system", - full_name="google.cloud.datacatalog.v1.SearchCatalogResult.user_specified_system", - index=5, - number=9, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[ - _descriptor.OneofDescriptor( - name="system", - full_name="google.cloud.datacatalog.v1.SearchCatalogResult.system", - index=0, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[], - ) - ], - serialized_start=194, - serialized_end=502, -) - -_SEARCHCATALOGRESULT.fields_by_name["search_result_type"].enum_type = _SEARCHRESULTTYPE -_SEARCHCATALOGRESULT.fields_by_name[ - "integrated_system" -].enum_type = ( - google_dot_cloud_dot_datacatalog__v1_dot_proto_dot_common__pb2._INTEGRATEDSYSTEM -) -_SEARCHCATALOGRESULT.oneofs_by_name["system"].fields.append( - _SEARCHCATALOGRESULT.fields_by_name["integrated_system"] -) -_SEARCHCATALOGRESULT.fields_by_name[ - "integrated_system" -].containing_oneof = _SEARCHCATALOGRESULT.oneofs_by_name["system"] 
-_SEARCHCATALOGRESULT.oneofs_by_name["system"].fields.append( - _SEARCHCATALOGRESULT.fields_by_name["user_specified_system"] -) -_SEARCHCATALOGRESULT.fields_by_name[ - "user_specified_system" -].containing_oneof = _SEARCHCATALOGRESULT.oneofs_by_name["system"] -DESCRIPTOR.message_types_by_name["SearchCatalogResult"] = _SEARCHCATALOGRESULT -DESCRIPTOR.enum_types_by_name["SearchResultType"] = _SEARCHRESULTTYPE -_sym_db.RegisterFileDescriptor(DESCRIPTOR) - -SearchCatalogResult = _reflection.GeneratedProtocolMessageType( - "SearchCatalogResult", - (_message.Message,), - { - "DESCRIPTOR": _SEARCHCATALOGRESULT, - "__module__": "google.cloud.datacatalog_v1.proto.search_pb2", - "__doc__": """A result that appears in the response of a search request. Each result - captures details of one entry that matches the search. - - Attributes: - search_result_type: - Type of the search result. This field can be used to determine - which Get method to call to fetch the full resource. - search_result_subtype: - Sub-type of the search result. This is a dot-delimited - description of the resource’s full type, and is the same as - the value callers would provide in the “type” search facet. - Examples: ``entry.table``, ``entry.dataStream``, - ``tagTemplate``. - relative_resource_name: - The relative resource name of the resource in URL format. - Examples: - ``projects/{project_id}/locations/{location_id}/ - entryGroups/{entry_group_id}/entries/{entry_id}`` - - ``projects/{project_id}/tagTemplates/{tag_template_id}`` - linked_resource: - The full name of the cloud resource the entry belongs to. See: - https://cloud.google.com/apis/design/resource_names#full_resou - rce_name. Example: - ``//bigquery.googleapis.com/projects/pr - ojectId/datasets/datasetId/tables/tableId`` - system: - The source system of the entry. Only applicable when - ``search_result_type`` is ENTRY. - integrated_system: - Output only. 
This field indicates the entry’s source system - that Data Catalog integrates with, such as BigQuery or Cloud - Pub/Sub. - user_specified_system: - This field indicates the entry’s source system that Data - Catalog does not integrate with. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1.SearchCatalogResult) - }, -) -_sym_db.RegisterMessage(SearchCatalogResult) - - -DESCRIPTOR._options = None -_SEARCHCATALOGRESULT.fields_by_name["integrated_system"]._options = None -# @@protoc_insertion_point(module_scope) diff --git a/google/cloud/datacatalog_v1/proto/search_pb2_grpc.py b/google/cloud/datacatalog_v1/proto/search_pb2_grpc.py deleted file mode 100644 index 8a939394..00000000 --- a/google/cloud/datacatalog_v1/proto/search_pb2_grpc.py +++ /dev/null @@ -1,3 +0,0 @@ -# Generated by the gRPC Python protocol compiler plugin. DO NOT EDIT! -"""Client and server classes corresponding to protobuf-defined services.""" -import grpc diff --git a/google/cloud/datacatalog_v1/proto/table_spec.proto b/google/cloud/datacatalog_v1/proto/table_spec.proto new file mode 100644 index 00000000..c87afc54 --- /dev/null +++ b/google/cloud/datacatalog_v1/proto/table_spec.proto @@ -0,0 +1,101 @@ +// Copyright 2020 Google LLC +// +// Licensed under the Apache License, Version 2.0 (the "License"); +// you may not use this file except in compliance with the License. +// You may obtain a copy of the License at +// +// http://www.apache.org/licenses/LICENSE-2.0 +// +// Unless required by applicable law or agreed to in writing, software +// distributed under the License is distributed on an "AS IS" BASIS, +// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +// See the License for the specific language governing permissions and +// limitations under the License. 
+ +syntax = "proto3"; + +package google.cloud.datacatalog.v1; + +import "google/api/field_behavior.proto"; +import "google/api/resource.proto"; + +option cc_enable_arenas = true; +option csharp_namespace = "Google.Cloud.DataCatalog.V1"; +option go_package = "google.golang.org/genproto/googleapis/cloud/datacatalog/v1;datacatalog"; +option java_multiple_files = true; +option java_package = "com.google.cloud.datacatalog.v1"; +option php_namespace = "Google\\Cloud\\DataCatalog\\V1"; +option ruby_package = "Google::Cloud::DataCatalog::V1"; + +// Describes a BigQuery table. +message BigQueryTableSpec { + // Output only. The table source type. + TableSourceType table_source_type = 1 [(google.api.field_behavior) = OUTPUT_ONLY]; + + // Output only. + oneof type_spec { + // Table view specification. This field should only be populated if + // `table_source_type` is `BIGQUERY_VIEW`. + ViewSpec view_spec = 2; + + // Spec of a BigQuery table. This field should only be populated if + // `table_source_type` is `BIGQUERY_TABLE`. + TableSpec table_spec = 3; + } +} + +// Table source type. +enum TableSourceType { + // Default unknown type. + TABLE_SOURCE_TYPE_UNSPECIFIED = 0; + + // Table view. + BIGQUERY_VIEW = 2; + + // BigQuery native table. + BIGQUERY_TABLE = 5; +} + +// Table view specification. +message ViewSpec { + // Output only. The query that defines the table view. + string view_query = 1 [(google.api.field_behavior) = OUTPUT_ONLY]; +} + +// Normal BigQuery table spec. +message TableSpec { + // Output only. If the table is a dated shard, i.e., with name pattern `[prefix]YYYYMMDD`, + // `grouped_entry` is the Data Catalog resource name of the date sharded + // grouped entry, for example, + // `projects/{project_id}/locations/{location}/entrygroups/{entry_group_id}/entries/{entry_id}`. + // Otherwise, `grouped_entry` is empty. 
+ string grouped_entry = 1 [ + (google.api.field_behavior) = OUTPUT_ONLY, + (google.api.resource_reference) = { + type: "datacatalog.googleapis.com/Entry" + } + ]; +} + +// Spec for a group of BigQuery tables with name pattern `[prefix]YYYYMMDD`. +// Context: +// https://cloud.google.com/bigquery/docs/partitioned-tables#partitioning_versus_sharding +message BigQueryDateShardedSpec { + // Output only. The Data Catalog resource name of the dataset entry the current table + // belongs to, for example, + // `projects/{project_id}/locations/{location}/entrygroups/{entry_group_id}/entries/{entry_id}`. + string dataset = 1 [ + (google.api.field_behavior) = OUTPUT_ONLY, + (google.api.resource_reference) = { + type: "datacatalog.googleapis.com/Entry" + } + ]; + + // Output only. The table name prefix of the shards. The name of any given shard is + // `[table_prefix]YYYYMMDD`, for example, for shard `MyTable20180101`, the + // `table_prefix` is `MyTable`. + string table_prefix = 2 [(google.api.field_behavior) = OUTPUT_ONLY]; + + // Output only. Total number of shards. + int64 shard_count = 3 [(google.api.field_behavior) = OUTPUT_ONLY]; +} diff --git a/google/cloud/datacatalog_v1/proto/table_spec_pb2.py b/google/cloud/datacatalog_v1/proto/table_spec_pb2.py deleted file mode 100644 index d0d4f4e9..00000000 --- a/google/cloud/datacatalog_v1/proto/table_spec_pb2.py +++ /dev/null @@ -1,450 +0,0 @@ -# -*- coding: utf-8 -*- -# Generated by the protocol buffer compiler. DO NOT EDIT! 
-# source: google/cloud/datacatalog_v1/proto/table_spec.proto - -from google.protobuf.internal import enum_type_wrapper -from google.protobuf import descriptor as _descriptor -from google.protobuf import message as _message -from google.protobuf import reflection as _reflection -from google.protobuf import symbol_database as _symbol_database - -# @@protoc_insertion_point(imports) - -_sym_db = _symbol_database.Default() - - -from google.api import field_behavior_pb2 as google_dot_api_dot_field__behavior__pb2 -from google.api import resource_pb2 as google_dot_api_dot_resource__pb2 - - -DESCRIPTOR = _descriptor.FileDescriptor( - name="google/cloud/datacatalog_v1/proto/table_spec.proto", - package="google.cloud.datacatalog.v1", - syntax="proto3", - serialized_options=b"\n\037com.google.cloud.datacatalog.v1P\001ZFgoogle.golang.org/genproto/googleapis/cloud/datacatalog/v1;datacatalog\370\001\001\252\002\033Google.Cloud.DataCatalog.V1\312\002\033Google\\Cloud\\DataCatalog\\V1\352\002\036Google::Cloud::DataCatalog::V1", - create_key=_descriptor._internal_create_key, - serialized_pb=b'\n2google/cloud/datacatalog_v1/proto/table_spec.proto\x12\x1bgoogle.cloud.datacatalog.v1\x1a\x1fgoogle/api/field_behavior.proto\x1a\x19google/api/resource.proto"\xe8\x01\n\x11\x42igQueryTableSpec\x12L\n\x11table_source_type\x18\x01 \x01(\x0e\x32,.google.cloud.datacatalog.v1.TableSourceTypeB\x03\xe0\x41\x03\x12:\n\tview_spec\x18\x02 \x01(\x0b\x32%.google.cloud.datacatalog.v1.ViewSpecH\x00\x12<\n\ntable_spec\x18\x03 \x01(\x0b\x32&.google.cloud.datacatalog.v1.TableSpecH\x00\x42\x0b\n\ttype_spec"#\n\x08ViewSpec\x12\x17\n\nview_query\x18\x01 \x01(\tB\x03\xe0\x41\x03"L\n\tTableSpec\x12?\n\rgrouped_entry\x18\x01 \x01(\tB(\xe0\x41\x03\xfa\x41"\n datacatalog.googleapis.com/Entry"\x89\x01\n\x17\x42igQueryDateShardedSpec\x12\x39\n\x07\x64\x61taset\x18\x01 \x01(\tB(\xe0\x41\x03\xfa\x41"\n datacatalog.googleapis.com/Entry\x12\x19\n\x0ctable_prefix\x18\x02 
\x01(\tB\x03\xe0\x41\x03\x12\x18\n\x0bshard_count\x18\x03 \x01(\x03\x42\x03\xe0\x41\x03*[\n\x0fTableSourceType\x12!\n\x1dTABLE_SOURCE_TYPE_UNSPECIFIED\x10\x00\x12\x11\n\rBIGQUERY_VIEW\x10\x02\x12\x12\n\x0e\x42IGQUERY_TABLE\x10\x05\x42\xcb\x01\n\x1f\x63om.google.cloud.datacatalog.v1P\x01ZFgoogle.golang.org/genproto/googleapis/cloud/datacatalog/v1;datacatalog\xf8\x01\x01\xaa\x02\x1bGoogle.Cloud.DataCatalog.V1\xca\x02\x1bGoogle\\Cloud\\DataCatalog\\V1\xea\x02\x1eGoogle::Cloud::DataCatalog::V1b\x06proto3', - dependencies=[ - google_dot_api_dot_field__behavior__pb2.DESCRIPTOR, - google_dot_api_dot_resource__pb2.DESCRIPTOR, - ], -) - -_TABLESOURCETYPE = _descriptor.EnumDescriptor( - name="TableSourceType", - full_name="google.cloud.datacatalog.v1.TableSourceType", - filename=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - values=[ - _descriptor.EnumValueDescriptor( - name="TABLE_SOURCE_TYPE_UNSPECIFIED", - index=0, - number=0, - serialized_options=None, - type=None, - create_key=_descriptor._internal_create_key, - ), - _descriptor.EnumValueDescriptor( - name="BIGQUERY_VIEW", - index=1, - number=2, - serialized_options=None, - type=None, - create_key=_descriptor._internal_create_key, - ), - _descriptor.EnumValueDescriptor( - name="BIGQUERY_TABLE", - index=2, - number=5, - serialized_options=None, - type=None, - create_key=_descriptor._internal_create_key, - ), - ], - containing_type=None, - serialized_options=None, - serialized_start=633, - serialized_end=724, -) -_sym_db.RegisterEnumDescriptor(_TABLESOURCETYPE) - -TableSourceType = enum_type_wrapper.EnumTypeWrapper(_TABLESOURCETYPE) -TABLE_SOURCE_TYPE_UNSPECIFIED = 0 -BIGQUERY_VIEW = 2 -BIGQUERY_TABLE = 5 - - -_BIGQUERYTABLESPEC = _descriptor.Descriptor( - name="BigQueryTableSpec", - full_name="google.cloud.datacatalog.v1.BigQueryTableSpec", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - 
name="table_source_type", - full_name="google.cloud.datacatalog.v1.BigQueryTableSpec.table_source_type", - index=0, - number=1, - type=14, - cpp_type=8, - label=1, - has_default_value=False, - default_value=0, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\003", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="view_spec", - full_name="google.cloud.datacatalog.v1.BigQueryTableSpec.view_spec", - index=1, - number=2, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="table_spec", - full_name="google.cloud.datacatalog.v1.BigQueryTableSpec.table_spec", - index=2, - number=3, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[ - _descriptor.OneofDescriptor( - name="type_spec", - full_name="google.cloud.datacatalog.v1.BigQueryTableSpec.type_spec", - index=0, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[], - ) - ], - serialized_start=144, - serialized_end=376, -) - - -_VIEWSPEC = _descriptor.Descriptor( - name="ViewSpec", - full_name="google.cloud.datacatalog.v1.ViewSpec", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( 
- name="view_query", - full_name="google.cloud.datacatalog.v1.ViewSpec.view_query", - index=0, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\003", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ) - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=378, - serialized_end=413, -) - - -_TABLESPEC = _descriptor.Descriptor( - name="TableSpec", - full_name="google.cloud.datacatalog.v1.TableSpec", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="grouped_entry", - full_name="google.cloud.datacatalog.v1.TableSpec.grouped_entry", - index=0, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b'\340A\003\372A"\n datacatalog.googleapis.com/Entry', - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ) - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=415, - serialized_end=491, -) - - -_BIGQUERYDATESHARDEDSPEC = _descriptor.Descriptor( - name="BigQueryDateShardedSpec", - full_name="google.cloud.datacatalog.v1.BigQueryDateShardedSpec", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="dataset", - full_name="google.cloud.datacatalog.v1.BigQueryDateShardedSpec.dataset", - index=0, - number=1, - 
type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b'\340A\003\372A"\n datacatalog.googleapis.com/Entry', - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="table_prefix", - full_name="google.cloud.datacatalog.v1.BigQueryDateShardedSpec.table_prefix", - index=1, - number=2, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\003", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="shard_count", - full_name="google.cloud.datacatalog.v1.BigQueryDateShardedSpec.shard_count", - index=2, - number=3, - type=3, - cpp_type=2, - label=1, - has_default_value=False, - default_value=0, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\003", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=494, - serialized_end=631, -) - -_BIGQUERYTABLESPEC.fields_by_name["table_source_type"].enum_type = _TABLESOURCETYPE -_BIGQUERYTABLESPEC.fields_by_name["view_spec"].message_type = _VIEWSPEC -_BIGQUERYTABLESPEC.fields_by_name["table_spec"].message_type = _TABLESPEC -_BIGQUERYTABLESPEC.oneofs_by_name["type_spec"].fields.append( - _BIGQUERYTABLESPEC.fields_by_name["view_spec"] -) -_BIGQUERYTABLESPEC.fields_by_name[ - "view_spec" -].containing_oneof = _BIGQUERYTABLESPEC.oneofs_by_name["type_spec"] 
-_BIGQUERYTABLESPEC.oneofs_by_name["type_spec"].fields.append( - _BIGQUERYTABLESPEC.fields_by_name["table_spec"] -) -_BIGQUERYTABLESPEC.fields_by_name[ - "table_spec" -].containing_oneof = _BIGQUERYTABLESPEC.oneofs_by_name["type_spec"] -DESCRIPTOR.message_types_by_name["BigQueryTableSpec"] = _BIGQUERYTABLESPEC -DESCRIPTOR.message_types_by_name["ViewSpec"] = _VIEWSPEC -DESCRIPTOR.message_types_by_name["TableSpec"] = _TABLESPEC -DESCRIPTOR.message_types_by_name["BigQueryDateShardedSpec"] = _BIGQUERYDATESHARDEDSPEC -DESCRIPTOR.enum_types_by_name["TableSourceType"] = _TABLESOURCETYPE -_sym_db.RegisterFileDescriptor(DESCRIPTOR) - -BigQueryTableSpec = _reflection.GeneratedProtocolMessageType( - "BigQueryTableSpec", - (_message.Message,), - { - "DESCRIPTOR": _BIGQUERYTABLESPEC, - "__module__": "google.cloud.datacatalog_v1.proto.table_spec_pb2", - "__doc__": """Describes a BigQuery table. - - Attributes: - table_source_type: - Output only. The table source type. - type_spec: - Output only. - view_spec: - Table view specification. This field should only be populated - if ``table_source_type`` is ``BIGQUERY_VIEW``. - table_spec: - Spec of a BigQuery table. This field should only be populated - if ``table_source_type`` is ``BIGQUERY_TABLE``. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1.BigQueryTableSpec) - }, -) -_sym_db.RegisterMessage(BigQueryTableSpec) - -ViewSpec = _reflection.GeneratedProtocolMessageType( - "ViewSpec", - (_message.Message,), - { - "DESCRIPTOR": _VIEWSPEC, - "__module__": "google.cloud.datacatalog_v1.proto.table_spec_pb2", - "__doc__": """Table view specification. - - Attributes: - view_query: - Output only. The query that defines the table view. 
- """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1.ViewSpec) - }, -) -_sym_db.RegisterMessage(ViewSpec) - -TableSpec = _reflection.GeneratedProtocolMessageType( - "TableSpec", - (_message.Message,), - { - "DESCRIPTOR": _TABLESPEC, - "__module__": "google.cloud.datacatalog_v1.proto.table_spec_pb2", - "__doc__": """Normal BigQuery table spec. - - Attributes: - grouped_entry: - Output only. If the table is a dated shard, i.e., with name - pattern ``[prefix]YYYYMMDD``, ``grouped_entry`` is the Data - Catalog resource name of the date sharded grouped entry, for - example, ``projects/{project_id}/locations/{location}/entrygro - ups/{entry_group_id}/entries/{entry_id}``. Otherwise, - ``grouped_entry`` is empty. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1.TableSpec) - }, -) -_sym_db.RegisterMessage(TableSpec) - -BigQueryDateShardedSpec = _reflection.GeneratedProtocolMessageType( - "BigQueryDateShardedSpec", - (_message.Message,), - { - "DESCRIPTOR": _BIGQUERYDATESHARDEDSPEC, - "__module__": "google.cloud.datacatalog_v1.proto.table_spec_pb2", - "__doc__": """Spec for a group of BigQuery tables with name pattern - ``[prefix]YYYYMMDD``. Context: - https://cloud.google.com/bigquery/docs/partitioned- - tables#partitioning_versus_sharding - - Attributes: - dataset: - Output only. The Data Catalog resource name of the dataset - entry the current table belongs to, for example, ``projects/{p - roject_id}/locations/{location}/entrygroups/{entry_group_id}/e - ntries/{entry_id}``. - table_prefix: - Output only. The table name prefix of the shards. The name of - any given shard is ``[table_prefix]YYYYMMDD``, for example, - for shard ``MyTable20180101``, the ``table_prefix`` is - ``MyTable``. - shard_count: - Output only. Total number of shards. 
- """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1.BigQueryDateShardedSpec) - }, -) -_sym_db.RegisterMessage(BigQueryDateShardedSpec) - - -DESCRIPTOR._options = None -_BIGQUERYTABLESPEC.fields_by_name["table_source_type"]._options = None -_VIEWSPEC.fields_by_name["view_query"]._options = None -_TABLESPEC.fields_by_name["grouped_entry"]._options = None -_BIGQUERYDATESHARDEDSPEC.fields_by_name["dataset"]._options = None -_BIGQUERYDATESHARDEDSPEC.fields_by_name["table_prefix"]._options = None -_BIGQUERYDATESHARDEDSPEC.fields_by_name["shard_count"]._options = None -# @@protoc_insertion_point(module_scope) diff --git a/google/cloud/datacatalog_v1/proto/table_spec_pb2_grpc.py b/google/cloud/datacatalog_v1/proto/table_spec_pb2_grpc.py deleted file mode 100644 index 8a939394..00000000 --- a/google/cloud/datacatalog_v1/proto/table_spec_pb2_grpc.py +++ /dev/null @@ -1,3 +0,0 @@ -# Generated by the gRPC Python protocol compiler plugin. DO NOT EDIT! -"""Client and server classes corresponding to protobuf-defined services.""" -import grpc diff --git a/google/cloud/datacatalog_v1/proto/tags.proto b/google/cloud/datacatalog_v1/proto/tags.proto new file mode 100644 index 00000000..4efefa52 --- /dev/null +++ b/google/cloud/datacatalog_v1/proto/tags.proto @@ -0,0 +1,229 @@ +// Copyright 2020 Google LLC +// +// Licensed under the Apache License, Version 2.0 (the "License"); +// you may not use this file except in compliance with the License. +// You may obtain a copy of the License at +// +// http://www.apache.org/licenses/LICENSE-2.0 +// +// Unless required by applicable law or agreed to in writing, software +// distributed under the License is distributed on an "AS IS" BASIS, +// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +// See the License for the specific language governing permissions and +// limitations under the License. 
+ +syntax = "proto3"; + +package google.cloud.datacatalog.v1; + +import "google/api/field_behavior.proto"; +import "google/api/resource.proto"; +import "google/protobuf/timestamp.proto"; + +option cc_enable_arenas = true; +option csharp_namespace = "Google.Cloud.DataCatalog.V1"; +option go_package = "google.golang.org/genproto/googleapis/cloud/datacatalog/v1;datacatalog"; +option java_multiple_files = true; +option java_package = "com.google.cloud.datacatalog.v1"; +option php_namespace = "Google\\Cloud\\DataCatalog\\V1"; +option ruby_package = "Google::Cloud::DataCatalog::V1"; + +// Tags are used to attach custom metadata to Data Catalog resources. Tags +// conform to the specifications within their tag template. +// +// See [Data Catalog +// IAM](https://cloud.google.com/data-catalog/docs/concepts/iam) for information +// on the permissions needed to create or view tags. +message Tag { + option (google.api.resource) = { + type: "datacatalog.googleapis.com/Tag" + pattern: "projects/{project}/locations/{location}/entryGroups/{entry_group}/entries/{entry}/tags/{tag}" + }; + + // The resource name of the tag in URL format. Example: + // + // * projects/{project_id}/locations/{location}/entrygroups/{entry_group_id}/entries/{entry_id}/tags/{tag_id} + // + // where `tag_id` is a system-generated identifier. + // Note that this Tag may not actually be stored in the location in this name. + string name = 1; + + // Required. The resource name of the tag template that this tag uses. Example: + // + // * projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id} + // + // This field cannot be modified after creation. + string template = 2 [(google.api.field_behavior) = REQUIRED]; + + // Output only. The display name of the tag template. + string template_display_name = 5 [(google.api.field_behavior) = OUTPUT_ONLY]; + + // The scope within the parent resource that this tag is attached to. If not + // provided, the tag is attached to the parent resource itself. 
+ // Deleting the scope from the parent resource will delete all tags attached + // to that scope. These fields cannot be updated after creation. + oneof scope { + // Resources like Entry can have schemas associated with them. This scope + // allows users to attach tags to an individual column based on that schema. + // + // For attaching a tag to a nested column, use `.` to separate the column + // names. Example: + // + // * `outer_column.inner_column` + string column = 4; + } + + // Required. This maps the ID of a tag field to the value of, and additional information + // about, that field. Valid field IDs are defined by the tag's template. A tag + // must have at least 1 field and at most 500 fields. + map<string, TagField> fields = 3 [(google.api.field_behavior) = REQUIRED]; +} + + // Contains the value and supporting information for a field within + // a [Tag][google.cloud.datacatalog.v1.Tag]. + message TagField { + // Holds an enum value. + message EnumValue { + // The display name of the enum value. + string display_name = 1; + } + + // Output only. The display name of this field. + string display_name = 1 [(google.api.field_behavior) = OUTPUT_ONLY]; + + // Required. The value of this field. + oneof kind { + // Holds the value for a tag field with double type. + double double_value = 2; + + // Holds the value for a tag field with string type. + string string_value = 3; + + // Holds the value for a tag field with boolean type. + bool bool_value = 4; + + // Holds the value for a tag field with timestamp type. + google.protobuf.Timestamp timestamp_value = 5; + + // Holds the value for a tag field with enum type. This value must be + // one of the allowed values in the definition of this enum. + EnumValue enum_value = 6; + } + + // Output only. The order of this field with respect to other fields in this tag. It can be + // set in [Tag][google.cloud.datacatalog.v1.TagTemplateField.order]. For + // example, a higher value can indicate a more important field. 
The value can + // be negative. Multiple fields can have the same order, and field orders + // within a tag do not have to be sequential. + int32 order = 7 [(google.api.field_behavior) = OUTPUT_ONLY]; +} + +// A tag template defines a tag, which can have one or more typed fields. +// The template is used to create and attach the tag to GCP resources. +// [Tag template +// roles](https://cloud.google.com/iam/docs/understanding-roles#data-catalog-roles) +// provide permissions to create, edit, and use the template. See, for example, +// the [TagTemplate +// User](https://cloud.google.com/data-catalog/docs/how-to/template-user) role, +// which includes permission to use the tag template to tag resources. +message TagTemplate { + option (google.api.resource) = { + type: "datacatalog.googleapis.com/TagTemplate" + pattern: "projects/{project}/locations/{location}/tagTemplates/{tag_template}" + }; + + // The resource name of the tag template in URL format. Example: + // + // * projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id} + // + // Note that this TagTemplate and its child resources may not actually be + // stored in the location in this name. + string name = 1; + + // The display name for this template. Defaults to an empty string. + string display_name = 2; + + // Required. Map of tag template field IDs to the settings for the field. + // This map is an exhaustive list of the allowed fields. This map must contain + // at least one field and at most 500 fields. + // + // The keys to this map are tag template field IDs. Field IDs can contain + // letters (both uppercase and lowercase), numbers (0-9) and underscores (_). + // Field IDs must be at least 1 character long and at most + // 64 characters long. Field IDs must start with a letter or underscore. + map fields = 3 [(google.api.field_behavior) = REQUIRED]; +} + +// The template for an individual field within a tag template. 
+message TagTemplateField { + option (google.api.resource) = { + type: "datacatalog.googleapis.com/TagTemplateField" + pattern: "projects/{project}/locations/{location}/tagTemplates/{tag_template}/fields/{field}" + }; + + // Output only. The resource name of the tag template field in URL format. Example: + // + // * projects/{project_id}/locations/{location}/tagTemplates/{tag_template}/fields/{field} + // + // Note that this TagTemplateField may not actually be stored in the location + // in this name. + string name = 6 [(google.api.field_behavior) = OUTPUT_ONLY]; + + // The display name for this field. Defaults to an empty string. + string display_name = 1; + + // Required. The type of value this tag field can contain. + FieldType type = 2 [(google.api.field_behavior) = REQUIRED]; + + // Whether this is a required field. Defaults to false. + bool is_required = 3; + + // The order of this field with respect to other fields in this tag + // template. For example, a higher value can indicate a more important field. + // The value can be negative. Multiple fields can have the same order, and + // field orders within a tag do not have to be sequential. + int32 order = 5; +} + +message FieldType { + message EnumType { + message EnumValue { + // Required. The display name of the enum value. Must not be an empty string. + string display_name = 1 [(google.api.field_behavior) = REQUIRED]; + } + + // Required on create; optional on update. The set of allowed values for + // this enum. This set must not be empty; the display names of the values in + // this set must not be empty; and the display names of the values must be + // case-insensitively unique within this set. Currently, enum values can + // only be added to the list of allowed values. Deletion and renaming of + // enum values are not supported. Can have up to 500 allowed values. + repeated EnumValue allowed_values = 1; + } + + enum PrimitiveType { + // This is the default invalid value for a type. 
+ PRIMITIVE_TYPE_UNSPECIFIED = 0; + + // A double precision number. + DOUBLE = 1; + + // A UTF-8 string. + STRING = 2; + + // A boolean value. + BOOL = 3; + + // A timestamp. + TIMESTAMP = 4; + } + + // Required. + oneof type_decl { + // Represents primitive types - string, bool, etc. + PrimitiveType primitive_type = 1; + + // Represents an enum type. + EnumType enum_type = 2; + } +} diff --git a/google/cloud/datacatalog_v1/proto/tags_pb2.py b/google/cloud/datacatalog_v1/proto/tags_pb2.py deleted file mode 100644 index da6eb3d1..00000000 --- a/google/cloud/datacatalog_v1/proto/tags_pb2.py +++ /dev/null @@ -1,1217 +0,0 @@ -# -*- coding: utf-8 -*- -# Generated by the protocol buffer compiler. DO NOT EDIT! -# source: google/cloud/datacatalog_v1/proto/tags.proto - -from google.protobuf import descriptor as _descriptor -from google.protobuf import message as _message -from google.protobuf import reflection as _reflection -from google.protobuf import symbol_database as _symbol_database - -# @@protoc_insertion_point(imports) - -_sym_db = _symbol_database.Default() - - -from google.api import field_behavior_pb2 as google_dot_api_dot_field__behavior__pb2 -from google.api import resource_pb2 as google_dot_api_dot_resource__pb2 -from google.protobuf import timestamp_pb2 as google_dot_protobuf_dot_timestamp__pb2 - - -DESCRIPTOR = _descriptor.FileDescriptor( - name="google/cloud/datacatalog_v1/proto/tags.proto", - package="google.cloud.datacatalog.v1", - syntax="proto3", - serialized_options=b"\n\037com.google.cloud.datacatalog.v1P\001ZFgoogle.golang.org/genproto/googleapis/cloud/datacatalog/v1;datacatalog\370\001\001\252\002\033Google.Cloud.DataCatalog.V1\312\002\033Google\\Cloud\\DataCatalog\\V1\352\002\036Google::Cloud::DataCatalog::V1", - create_key=_descriptor._internal_create_key, - 
serialized_pb=b'\n,google/cloud/datacatalog_v1/proto/tags.proto\x12\x1bgoogle.cloud.datacatalog.v1\x1a\x1fgoogle/api/field_behavior.proto\x1a\x19google/api/resource.proto\x1a\x1fgoogle/protobuf/timestamp.proto"\x86\x03\n\x03Tag\x12\x0c\n\x04name\x18\x01 \x01(\t\x12\x15\n\x08template\x18\x02 \x01(\tB\x03\xe0\x41\x02\x12"\n\x15template_display_name\x18\x05 \x01(\tB\x03\xe0\x41\x03\x12\x10\n\x06\x63olumn\x18\x04 \x01(\tH\x00\x12\x41\n\x06\x66ields\x18\x03 \x03(\x0b\x32,.google.cloud.datacatalog.v1.Tag.FieldsEntryB\x03\xe0\x41\x02\x1aT\n\x0b\x46ieldsEntry\x12\x0b\n\x03key\x18\x01 \x01(\t\x12\x34\n\x05value\x18\x02 \x01(\x0b\x32%.google.cloud.datacatalog.v1.TagField:\x02\x38\x01:\x81\x01\xea\x41~\n\x1e\x64\x61tacatalog.googleapis.com/Tag\x12\\projects/{project}/locations/{location}/entryGroups/{entry_group}/entries/{entry}/tags/{tag}B\x07\n\x05scope"\xa8\x02\n\x08TagField\x12\x19\n\x0c\x64isplay_name\x18\x01 \x01(\tB\x03\xe0\x41\x03\x12\x16\n\x0c\x64ouble_value\x18\x02 \x01(\x01H\x00\x12\x16\n\x0cstring_value\x18\x03 \x01(\tH\x00\x12\x14\n\nbool_value\x18\x04 \x01(\x08H\x00\x12\x35\n\x0ftimestamp_value\x18\x05 \x01(\x0b\x32\x1a.google.protobuf.TimestampH\x00\x12\x45\n\nenum_value\x18\x06 \x01(\x0b\x32/.google.cloud.datacatalog.v1.TagField.EnumValueH\x00\x12\x12\n\x05order\x18\x07 \x01(\x05\x42\x03\xe0\x41\x03\x1a!\n\tEnumValue\x12\x14\n\x0c\x64isplay_name\x18\x01 \x01(\tB\x06\n\x04kind"\xcc\x02\n\x0bTagTemplate\x12\x0c\n\x04name\x18\x01 \x01(\t\x12\x14\n\x0c\x64isplay_name\x18\x02 \x01(\t\x12I\n\x06\x66ields\x18\x03 \x03(\x0b\x32\x34.google.cloud.datacatalog.v1.TagTemplate.FieldsEntryB\x03\xe0\x41\x02\x1a\\\n\x0b\x46ieldsEntry\x12\x0b\n\x03key\x18\x01 \x01(\t\x12<\n\x05value\x18\x02 \x01(\x0b\x32-.google.cloud.datacatalog.v1.TagTemplateField:\x02\x38\x01:p\xea\x41m\n&datacatalog.googleapis.com/TagTemplate\x12\x43projects/{project}/locations/{location}/tagTemplates/{tag_template}"\xa2\x02\n\x10TagTemplateField\x12\x11\n\x04name\x18\x06 
\x01(\tB\x03\xe0\x41\x03\x12\x14\n\x0c\x64isplay_name\x18\x01 \x01(\t\x12\x39\n\x04type\x18\x02 \x01(\x0b\x32&.google.cloud.datacatalog.v1.FieldTypeB\x03\xe0\x41\x02\x12\x13\n\x0bis_required\x18\x03 \x01(\x08\x12\r\n\x05order\x18\x05 \x01(\x05:\x85\x01\xea\x41\x81\x01\n+datacatalog.googleapis.com/TagTemplateField\x12Rprojects/{project}/locations/{location}/tagTemplates/{tag_template}/fields/{field}"\x98\x03\n\tFieldType\x12N\n\x0eprimitive_type\x18\x01 \x01(\x0e\x32\x34.google.cloud.datacatalog.v1.FieldType.PrimitiveTypeH\x00\x12\x44\n\tenum_type\x18\x02 \x01(\x0b\x32/.google.cloud.datacatalog.v1.FieldType.EnumTypeH\x00\x1a\x85\x01\n\x08\x45numType\x12Q\n\x0e\x61llowed_values\x18\x01 \x03(\x0b\x32\x39.google.cloud.datacatalog.v1.FieldType.EnumType.EnumValue\x1a&\n\tEnumValue\x12\x19\n\x0c\x64isplay_name\x18\x01 \x01(\tB\x03\xe0\x41\x02"`\n\rPrimitiveType\x12\x1e\n\x1aPRIMITIVE_TYPE_UNSPECIFIED\x10\x00\x12\n\n\x06\x44OUBLE\x10\x01\x12\n\n\x06STRING\x10\x02\x12\x08\n\x04\x42OOL\x10\x03\x12\r\n\tTIMESTAMP\x10\x04\x42\x0b\n\ttype_declB\xcb\x01\n\x1f\x63om.google.cloud.datacatalog.v1P\x01ZFgoogle.golang.org/genproto/googleapis/cloud/datacatalog/v1;datacatalog\xf8\x01\x01\xaa\x02\x1bGoogle.Cloud.DataCatalog.V1\xca\x02\x1bGoogle\\Cloud\\DataCatalog\\V1\xea\x02\x1eGoogle::Cloud::DataCatalog::V1b\x06proto3', - dependencies=[ - google_dot_api_dot_field__behavior__pb2.DESCRIPTOR, - google_dot_api_dot_resource__pb2.DESCRIPTOR, - google_dot_protobuf_dot_timestamp__pb2.DESCRIPTOR, - ], -) - - -_FIELDTYPE_PRIMITIVETYPE = _descriptor.EnumDescriptor( - name="PrimitiveType", - full_name="google.cloud.datacatalog.v1.FieldType.PrimitiveType", - filename=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - values=[ - _descriptor.EnumValueDescriptor( - name="PRIMITIVE_TYPE_UNSPECIFIED", - index=0, - number=0, - serialized_options=None, - type=None, - create_key=_descriptor._internal_create_key, - ), - _descriptor.EnumValueDescriptor( - name="DOUBLE", - index=1, - 
number=1, - serialized_options=None, - type=None, - create_key=_descriptor._internal_create_key, - ), - _descriptor.EnumValueDescriptor( - name="STRING", - index=2, - number=2, - serialized_options=None, - type=None, - create_key=_descriptor._internal_create_key, - ), - _descriptor.EnumValueDescriptor( - name="BOOL", - index=3, - number=3, - serialized_options=None, - type=None, - create_key=_descriptor._internal_create_key, - ), - _descriptor.EnumValueDescriptor( - name="TIMESTAMP", - index=4, - number=4, - serialized_options=None, - type=None, - create_key=_descriptor._internal_create_key, - ), - ], - containing_type=None, - serialized_options=None, - serialized_start=1790, - serialized_end=1886, -) -_sym_db.RegisterEnumDescriptor(_FIELDTYPE_PRIMITIVETYPE) - - -_TAG_FIELDSENTRY = _descriptor.Descriptor( - name="FieldsEntry", - full_name="google.cloud.datacatalog.v1.Tag.FieldsEntry", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="key", - full_name="google.cloud.datacatalog.v1.Tag.FieldsEntry.key", - index=0, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="value", - full_name="google.cloud.datacatalog.v1.Tag.FieldsEntry.value", - index=1, - number=2, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=b"8\001", - is_extendable=False, - 
syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=336, - serialized_end=420, -) - -_TAG = _descriptor.Descriptor( - name="Tag", - full_name="google.cloud.datacatalog.v1.Tag", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="name", - full_name="google.cloud.datacatalog.v1.Tag.name", - index=0, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="template", - full_name="google.cloud.datacatalog.v1.Tag.template", - index=1, - number=2, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="template_display_name", - full_name="google.cloud.datacatalog.v1.Tag.template_display_name", - index=2, - number=5, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\003", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="column", - full_name="google.cloud.datacatalog.v1.Tag.column", - index=3, - number=4, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - 
serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="fields", - full_name="google.cloud.datacatalog.v1.Tag.fields", - index=4, - number=3, - type=11, - cpp_type=10, - label=3, - has_default_value=False, - default_value=[], - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[_TAG_FIELDSENTRY], - enum_types=[], - serialized_options=b"\352A~\n\036datacatalog.googleapis.com/Tag\022\\projects/{project}/locations/{location}/entryGroups/{entry_group}/entries/{entry}/tags/{tag}", - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[ - _descriptor.OneofDescriptor( - name="scope", - full_name="google.cloud.datacatalog.v1.Tag.scope", - index=0, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[], - ) - ], - serialized_start=171, - serialized_end=561, -) - - -_TAGFIELD_ENUMVALUE = _descriptor.Descriptor( - name="EnumValue", - full_name="google.cloud.datacatalog.v1.TagField.EnumValue", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="display_name", - full_name="google.cloud.datacatalog.v1.TagField.EnumValue.display_name", - index=0, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ) - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=819, - 
serialized_end=852, -) - -_TAGFIELD = _descriptor.Descriptor( - name="TagField", - full_name="google.cloud.datacatalog.v1.TagField", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="display_name", - full_name="google.cloud.datacatalog.v1.TagField.display_name", - index=0, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\003", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="double_value", - full_name="google.cloud.datacatalog.v1.TagField.double_value", - index=1, - number=2, - type=1, - cpp_type=5, - label=1, - has_default_value=False, - default_value=float(0), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="string_value", - full_name="google.cloud.datacatalog.v1.TagField.string_value", - index=2, - number=3, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="bool_value", - full_name="google.cloud.datacatalog.v1.TagField.bool_value", - index=3, - number=4, - type=8, - cpp_type=7, - label=1, - has_default_value=False, - default_value=False, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - 
create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="timestamp_value", - full_name="google.cloud.datacatalog.v1.TagField.timestamp_value", - index=4, - number=5, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="enum_value", - full_name="google.cloud.datacatalog.v1.TagField.enum_value", - index=5, - number=6, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="order", - full_name="google.cloud.datacatalog.v1.TagField.order", - index=6, - number=7, - type=5, - cpp_type=1, - label=1, - has_default_value=False, - default_value=0, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\003", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[_TAGFIELD_ENUMVALUE], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[ - _descriptor.OneofDescriptor( - name="kind", - full_name="google.cloud.datacatalog.v1.TagField.kind", - index=0, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[], - ) - ], - serialized_start=564, - serialized_end=860, -) - - -_TAGTEMPLATE_FIELDSENTRY = _descriptor.Descriptor( - name="FieldsEntry", - full_name="google.cloud.datacatalog.v1.TagTemplate.FieldsEntry", - filename=None, - file=DESCRIPTOR, - containing_type=None, - 
create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="key", - full_name="google.cloud.datacatalog.v1.TagTemplate.FieldsEntry.key", - index=0, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="value", - full_name="google.cloud.datacatalog.v1.TagTemplate.FieldsEntry.value", - index=1, - number=2, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=b"8\001", - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=989, - serialized_end=1081, -) - -_TAGTEMPLATE = _descriptor.Descriptor( - name="TagTemplate", - full_name="google.cloud.datacatalog.v1.TagTemplate", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="name", - full_name="google.cloud.datacatalog.v1.TagTemplate.name", - index=0, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="display_name", - full_name="google.cloud.datacatalog.v1.TagTemplate.display_name", - index=1, - number=2, - type=9, - cpp_type=9, - 
label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="fields", - full_name="google.cloud.datacatalog.v1.TagTemplate.fields", - index=2, - number=3, - type=11, - cpp_type=10, - label=3, - has_default_value=False, - default_value=[], - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[_TAGTEMPLATE_FIELDSENTRY], - enum_types=[], - serialized_options=b"\352Am\n&datacatalog.googleapis.com/TagTemplate\022Cprojects/{project}/locations/{location}/tagTemplates/{tag_template}", - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=863, - serialized_end=1195, -) - - -_TAGTEMPLATEFIELD = _descriptor.Descriptor( - name="TagTemplateField", - full_name="google.cloud.datacatalog.v1.TagTemplateField", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="name", - full_name="google.cloud.datacatalog.v1.TagTemplateField.name", - index=0, - number=6, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\003", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="display_name", - full_name="google.cloud.datacatalog.v1.TagTemplateField.display_name", - index=1, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - 
default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="type", - full_name="google.cloud.datacatalog.v1.TagTemplateField.type", - index=2, - number=2, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="is_required", - full_name="google.cloud.datacatalog.v1.TagTemplateField.is_required", - index=3, - number=3, - type=8, - cpp_type=7, - label=1, - has_default_value=False, - default_value=False, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="order", - full_name="google.cloud.datacatalog.v1.TagTemplateField.order", - index=4, - number=5, - type=5, - cpp_type=1, - label=1, - has_default_value=False, - default_value=0, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=b"\352A\201\001\n+datacatalog.googleapis.com/TagTemplateField\022Rprojects/{project}/locations/{location}/tagTemplates/{tag_template}/fields/{field}", - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=1198, - serialized_end=1488, -) - - -_FIELDTYPE_ENUMTYPE_ENUMVALUE = _descriptor.Descriptor( - name="EnumValue", - 
full_name="google.cloud.datacatalog.v1.FieldType.EnumType.EnumValue", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="display_name", - full_name="google.cloud.datacatalog.v1.FieldType.EnumType.EnumValue.display_name", - index=0, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ) - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=1750, - serialized_end=1788, -) - -_FIELDTYPE_ENUMTYPE = _descriptor.Descriptor( - name="EnumType", - full_name="google.cloud.datacatalog.v1.FieldType.EnumType", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="allowed_values", - full_name="google.cloud.datacatalog.v1.FieldType.EnumType.allowed_values", - index=0, - number=1, - type=11, - cpp_type=10, - label=3, - has_default_value=False, - default_value=[], - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ) - ], - extensions=[], - nested_types=[_FIELDTYPE_ENUMTYPE_ENUMVALUE], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=1655, - serialized_end=1788, -) - -_FIELDTYPE = _descriptor.Descriptor( - name="FieldType", - full_name="google.cloud.datacatalog.v1.FieldType", - filename=None, - file=DESCRIPTOR, - containing_type=None, - 
create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="primitive_type", - full_name="google.cloud.datacatalog.v1.FieldType.primitive_type", - index=0, - number=1, - type=14, - cpp_type=8, - label=1, - has_default_value=False, - default_value=0, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="enum_type", - full_name="google.cloud.datacatalog.v1.FieldType.enum_type", - index=1, - number=2, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[_FIELDTYPE_ENUMTYPE], - enum_types=[_FIELDTYPE_PRIMITIVETYPE], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[ - _descriptor.OneofDescriptor( - name="type_decl", - full_name="google.cloud.datacatalog.v1.FieldType.type_decl", - index=0, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[], - ) - ], - serialized_start=1491, - serialized_end=1899, -) - -_TAG_FIELDSENTRY.fields_by_name["value"].message_type = _TAGFIELD -_TAG_FIELDSENTRY.containing_type = _TAG -_TAG.fields_by_name["fields"].message_type = _TAG_FIELDSENTRY -_TAG.oneofs_by_name["scope"].fields.append(_TAG.fields_by_name["column"]) -_TAG.fields_by_name["column"].containing_oneof = _TAG.oneofs_by_name["scope"] -_TAGFIELD_ENUMVALUE.containing_type = _TAGFIELD -_TAGFIELD.fields_by_name[ - "timestamp_value" -].message_type = google_dot_protobuf_dot_timestamp__pb2._TIMESTAMP -_TAGFIELD.fields_by_name["enum_value"].message_type = _TAGFIELD_ENUMVALUE 
-_TAGFIELD.oneofs_by_name["kind"].fields.append(_TAGFIELD.fields_by_name["double_value"]) -_TAGFIELD.fields_by_name["double_value"].containing_oneof = _TAGFIELD.oneofs_by_name[ - "kind" -] -_TAGFIELD.oneofs_by_name["kind"].fields.append(_TAGFIELD.fields_by_name["string_value"]) -_TAGFIELD.fields_by_name["string_value"].containing_oneof = _TAGFIELD.oneofs_by_name[ - "kind" -] -_TAGFIELD.oneofs_by_name["kind"].fields.append(_TAGFIELD.fields_by_name["bool_value"]) -_TAGFIELD.fields_by_name["bool_value"].containing_oneof = _TAGFIELD.oneofs_by_name[ - "kind" -] -_TAGFIELD.oneofs_by_name["kind"].fields.append( - _TAGFIELD.fields_by_name["timestamp_value"] -) -_TAGFIELD.fields_by_name["timestamp_value"].containing_oneof = _TAGFIELD.oneofs_by_name[ - "kind" -] -_TAGFIELD.oneofs_by_name["kind"].fields.append(_TAGFIELD.fields_by_name["enum_value"]) -_TAGFIELD.fields_by_name["enum_value"].containing_oneof = _TAGFIELD.oneofs_by_name[ - "kind" -] -_TAGTEMPLATE_FIELDSENTRY.fields_by_name["value"].message_type = _TAGTEMPLATEFIELD -_TAGTEMPLATE_FIELDSENTRY.containing_type = _TAGTEMPLATE -_TAGTEMPLATE.fields_by_name["fields"].message_type = _TAGTEMPLATE_FIELDSENTRY -_TAGTEMPLATEFIELD.fields_by_name["type"].message_type = _FIELDTYPE -_FIELDTYPE_ENUMTYPE_ENUMVALUE.containing_type = _FIELDTYPE_ENUMTYPE -_FIELDTYPE_ENUMTYPE.fields_by_name[ - "allowed_values" -].message_type = _FIELDTYPE_ENUMTYPE_ENUMVALUE -_FIELDTYPE_ENUMTYPE.containing_type = _FIELDTYPE -_FIELDTYPE.fields_by_name["primitive_type"].enum_type = _FIELDTYPE_PRIMITIVETYPE -_FIELDTYPE.fields_by_name["enum_type"].message_type = _FIELDTYPE_ENUMTYPE -_FIELDTYPE_PRIMITIVETYPE.containing_type = _FIELDTYPE -_FIELDTYPE.oneofs_by_name["type_decl"].fields.append( - _FIELDTYPE.fields_by_name["primitive_type"] -) -_FIELDTYPE.fields_by_name[ - "primitive_type" -].containing_oneof = _FIELDTYPE.oneofs_by_name["type_decl"] -_FIELDTYPE.oneofs_by_name["type_decl"].fields.append( - _FIELDTYPE.fields_by_name["enum_type"] -) 
-_FIELDTYPE.fields_by_name["enum_type"].containing_oneof = _FIELDTYPE.oneofs_by_name[ - "type_decl" -] -DESCRIPTOR.message_types_by_name["Tag"] = _TAG -DESCRIPTOR.message_types_by_name["TagField"] = _TAGFIELD -DESCRIPTOR.message_types_by_name["TagTemplate"] = _TAGTEMPLATE -DESCRIPTOR.message_types_by_name["TagTemplateField"] = _TAGTEMPLATEFIELD -DESCRIPTOR.message_types_by_name["FieldType"] = _FIELDTYPE -_sym_db.RegisterFileDescriptor(DESCRIPTOR) - -Tag = _reflection.GeneratedProtocolMessageType( - "Tag", - (_message.Message,), - { - "FieldsEntry": _reflection.GeneratedProtocolMessageType( - "FieldsEntry", - (_message.Message,), - { - "DESCRIPTOR": _TAG_FIELDSENTRY, - "__module__": "google.cloud.datacatalog_v1.proto.tags_pb2" - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1.Tag.FieldsEntry) - }, - ), - "DESCRIPTOR": _TAG, - "__module__": "google.cloud.datacatalog_v1.proto.tags_pb2", - "__doc__": """Tags are used to attach custom metadata to Data Catalog resources. - Tags conform to the specifications within their tag template. See - `Data Catalog IAM `__ for information on the permissions - needed to create or view tags. - - Attributes: - name: - The resource name of the tag in URL format. Example: - proje - cts/{project_id}/locations/{location}/entrygroups/{entry_group - _id}/entries/{entry_id}/tags/{tag_id} where ``tag_id`` is a - system-generated identifier. Note that this Tag may not - actually be stored in the location in this name. - template: - Required. The resource name of the tag template that this tag - uses. Example: - projects/{project_id}/locations/{location}/ - tagTemplates/{tag_template_id} This field cannot be modified - after creation. - template_display_name: - Output only. The display name of the tag template. - scope: - The scope within the parent resource that this tag is attached - to. If not provided, the tag is attached to the parent - resource itself. 
Deleting the scope from the parent resource - will delete all tags attached to that scope. These fields - cannot be updated after creation. - column: - Resources like Entry can have schemas associated with them. - This scope allows users to attach tags to an individual column - based on that schema. For attaching a tag to a nested column, - use ``.`` to separate the column names. Example: - - ``outer_column.inner_column`` - fields: - Required. This maps the ID of a tag field to the value of and - additional information about that field. Valid field IDs are - defined by the tag’s template. A tag must have at least 1 - field and at most 500 fields. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1.Tag) - }, -) -_sym_db.RegisterMessage(Tag) -_sym_db.RegisterMessage(Tag.FieldsEntry) - -TagField = _reflection.GeneratedProtocolMessageType( - "TagField", - (_message.Message,), - { - "EnumValue": _reflection.GeneratedProtocolMessageType( - "EnumValue", - (_message.Message,), - { - "DESCRIPTOR": _TAGFIELD_ENUMVALUE, - "__module__": "google.cloud.datacatalog_v1.proto.tags_pb2", - "__doc__": """Holds an enum value. - - Attributes: - display_name: - The display name of the enum value. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1.TagField.EnumValue) - }, - ), - "DESCRIPTOR": _TAGFIELD, - "__module__": "google.cloud.datacatalog_v1.proto.tags_pb2", - "__doc__": """Contains the value and supporting information for a field within a - [Tag][google.cloud.datacatalog.v1.Tag]. - - Attributes: - display_name: - Output only. The display name of this field. - kind: - Required. The value of this field. - double_value: - Holds the value for a tag field with double type. - string_value: - Holds the value for a tag field with string type. - bool_value: - Holds the value for a tag field with boolean type. - timestamp_value: - Holds the value for a tag field with timestamp type. 
- enum_value: - Holds the value for a tag field with enum type. This value - must be one of the allowed values in the definition of this - enum. - order: - Output only. The order of this field with respect to other - fields in this tag. It can be set in - [Tag][google.cloud.datacatalog.v1.TagTemplateField.order]. For - example, a higher value can indicate a more important field. - The value can be negative. Multiple fields can have the same - order, and field orders within a tag do not have to be - sequential. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1.TagField) - }, -) -_sym_db.RegisterMessage(TagField) -_sym_db.RegisterMessage(TagField.EnumValue) - -TagTemplate = _reflection.GeneratedProtocolMessageType( - "TagTemplate", - (_message.Message,), - { - "FieldsEntry": _reflection.GeneratedProtocolMessageType( - "FieldsEntry", - (_message.Message,), - { - "DESCRIPTOR": _TAGTEMPLATE_FIELDSENTRY, - "__module__": "google.cloud.datacatalog_v1.proto.tags_pb2" - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1.TagTemplate.FieldsEntry) - }, - ), - "DESCRIPTOR": _TAGTEMPLATE, - "__module__": "google.cloud.datacatalog_v1.proto.tags_pb2", - "__doc__": """A tag template defines a tag, which can have one or more typed fields. - The template is used to create and attach the tag to GCP resources. - `Tag template roles `__ provide permissions to create, edit, and - use the template. See, for example, the `TagTemplate User - `__ - role, which includes permission to use the tag template to tag - resources. - - Attributes: - name: - The resource name of the tag template in URL format. Example: - - projects/{project_id}/locations/{location}/tagTemplates/{ta - g_template_id} Note that this TagTemplate and its child - resources may not actually be stored in the location in this - name. - display_name: - The display name for this template. Defaults to an empty - string. - fields: - Required. 
Map of tag template field IDs to the settings for - the field. This map is an exhaustive list of the allowed - fields. This map must contain at least one field and at most - 500 fields. The keys to this map are tag template field IDs. - Field IDs can contain letters (both uppercase and lowercase), - numbers (0-9) and underscores (_). Field IDs must be at least - 1 character long and at most 64 characters long. Field IDs - must start with a letter or underscore. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1.TagTemplate) - }, -) -_sym_db.RegisterMessage(TagTemplate) -_sym_db.RegisterMessage(TagTemplate.FieldsEntry) - -TagTemplateField = _reflection.GeneratedProtocolMessageType( - "TagTemplateField", - (_message.Message,), - { - "DESCRIPTOR": _TAGTEMPLATEFIELD, - "__module__": "google.cloud.datacatalog_v1.proto.tags_pb2", - "__doc__": """The template for an individual field within a tag template. - - Attributes: - name: - Output only. The resource name of the tag template field in - URL format. Example: - projects/{project_id}/locations/{loca - tion}/tagTemplates/{tag_template}/fields/{field} Note that - this TagTemplateField may not actually be stored in the - location in this name. - display_name: - The display name for this field. Defaults to an empty string. - type: - Required. The type of value this tag field can contain. - is_required: - Whether this is a required field. Defaults to false. - order: - The order of this field with respect to other fields in this - tag template. For example, a higher value can indicate a more - important field. The value can be negative. Multiple fields - can have the same order, and field orders within a tag do not - have to be sequential. 
- """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1.TagTemplateField) - }, -) -_sym_db.RegisterMessage(TagTemplateField) - -FieldType = _reflection.GeneratedProtocolMessageType( - "FieldType", - (_message.Message,), - { - "EnumType": _reflection.GeneratedProtocolMessageType( - "EnumType", - (_message.Message,), - { - "EnumValue": _reflection.GeneratedProtocolMessageType( - "EnumValue", - (_message.Message,), - { - "DESCRIPTOR": _FIELDTYPE_ENUMTYPE_ENUMVALUE, - "__module__": "google.cloud.datacatalog_v1.proto.tags_pb2", - "__doc__": """ - Attributes: - display_name: - Required. The display name of the enum value. Must not be an - empty string. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1.FieldType.EnumType.EnumValue) - }, - ), - "DESCRIPTOR": _FIELDTYPE_ENUMTYPE, - "__module__": "google.cloud.datacatalog_v1.proto.tags_pb2", - "__doc__": """ - Attributes: - allowed_values: - Required on create; optional on update. The set of allowed - values for this enum. This set must not be empty, the display - names of the values in this set must not be empty and the - display names of the values must be case-insensitively unique - within this set. Currently, enum values can only be added to - the list of allowed values. Deletion and renaming of enum - values are not supported. Can have up to 500 allowed values. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1.FieldType.EnumType) - }, - ), - "DESCRIPTOR": _FIELDTYPE, - "__module__": "google.cloud.datacatalog_v1.proto.tags_pb2", - "__doc__": """ - Attributes: - type_decl: - Required. - primitive_type: - Represents primitive types - string, bool etc. - enum_type: - Represents an enum type. 
- """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1.FieldType) - }, -) -_sym_db.RegisterMessage(FieldType) -_sym_db.RegisterMessage(FieldType.EnumType) -_sym_db.RegisterMessage(FieldType.EnumType.EnumValue) - - -DESCRIPTOR._options = None -_TAG_FIELDSENTRY._options = None -_TAG.fields_by_name["template"]._options = None -_TAG.fields_by_name["template_display_name"]._options = None -_TAG.fields_by_name["fields"]._options = None -_TAG._options = None -_TAGFIELD.fields_by_name["display_name"]._options = None -_TAGFIELD.fields_by_name["order"]._options = None -_TAGTEMPLATE_FIELDSENTRY._options = None -_TAGTEMPLATE.fields_by_name["fields"]._options = None -_TAGTEMPLATE._options = None -_TAGTEMPLATEFIELD.fields_by_name["name"]._options = None -_TAGTEMPLATEFIELD.fields_by_name["type"]._options = None -_TAGTEMPLATEFIELD._options = None -_FIELDTYPE_ENUMTYPE_ENUMVALUE.fields_by_name["display_name"]._options = None -# @@protoc_insertion_point(module_scope) diff --git a/google/cloud/datacatalog_v1/proto/tags_pb2_grpc.py b/google/cloud/datacatalog_v1/proto/tags_pb2_grpc.py deleted file mode 100644 index 8a939394..00000000 --- a/google/cloud/datacatalog_v1/proto/tags_pb2_grpc.py +++ /dev/null @@ -1,3 +0,0 @@ -# Generated by the gRPC Python protocol compiler plugin. DO NOT EDIT! -"""Client and server classes corresponding to protobuf-defined services.""" -import grpc diff --git a/google/cloud/datacatalog_v1/proto/timestamps.proto b/google/cloud/datacatalog_v1/proto/timestamps.proto new file mode 100644 index 00000000..a4372ae3 --- /dev/null +++ b/google/cloud/datacatalog_v1/proto/timestamps.proto @@ -0,0 +1,41 @@ +// Copyright 2020 Google LLC +// +// Licensed under the Apache License, Version 2.0 (the "License"); +// you may not use this file except in compliance with the License. 
+// You may obtain a copy of the License at +// +// http://www.apache.org/licenses/LICENSE-2.0 +// +// Unless required by applicable law or agreed to in writing, software +// distributed under the License is distributed on an "AS IS" BASIS, +// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +// See the License for the specific language governing permissions and +// limitations under the License. + +syntax = "proto3"; + +package google.cloud.datacatalog.v1; + +import "google/api/field_behavior.proto"; +import "google/protobuf/timestamp.proto"; + +option cc_enable_arenas = true; +option csharp_namespace = "Google.Cloud.DataCatalog.V1"; +option go_package = "google.golang.org/genproto/googleapis/cloud/datacatalog/v1;datacatalog"; +option java_multiple_files = true; +option java_package = "com.google.cloud.datacatalog.v1"; +option php_namespace = "Google\\Cloud\\DataCatalog\\V1"; +option ruby_package = "Google::Cloud::DataCatalog::V1"; + +// Timestamps about this resource according to a particular system. +message SystemTimestamps { + // The creation time of the resource within the given system. + google.protobuf.Timestamp create_time = 1; + + // The last-modified time of the resource within the given system. + google.protobuf.Timestamp update_time = 2; + + // Output only. The expiration time of the resource within the given system. + // Currently only applicable to BigQuery resources. + google.protobuf.Timestamp expire_time = 3 [(google.api.field_behavior) = OUTPUT_ONLY]; +} diff --git a/google/cloud/datacatalog_v1/proto/timestamps_pb2.py b/google/cloud/datacatalog_v1/proto/timestamps_pb2.py deleted file mode 100644 index 10cdf2cf..00000000 --- a/google/cloud/datacatalog_v1/proto/timestamps_pb2.py +++ /dev/null @@ -1,149 +0,0 @@ -# -*- coding: utf-8 -*- -# Generated by the protocol buffer compiler. DO NOT EDIT!
-# source: google/cloud/datacatalog_v1/proto/timestamps.proto - -from google.protobuf import descriptor as _descriptor -from google.protobuf import message as _message -from google.protobuf import reflection as _reflection -from google.protobuf import symbol_database as _symbol_database - -# @@protoc_insertion_point(imports) - -_sym_db = _symbol_database.Default() - - -from google.api import field_behavior_pb2 as google_dot_api_dot_field__behavior__pb2 -from google.protobuf import timestamp_pb2 as google_dot_protobuf_dot_timestamp__pb2 - - -DESCRIPTOR = _descriptor.FileDescriptor( - name="google/cloud/datacatalog_v1/proto/timestamps.proto", - package="google.cloud.datacatalog.v1", - syntax="proto3", - serialized_options=b"\n\037com.google.cloud.datacatalog.v1P\001ZFgoogle.golang.org/genproto/googleapis/cloud/datacatalog/v1;datacatalog\370\001\001\252\002\033Google.Cloud.DataCatalog.V1\312\002\033Google\\Cloud\\DataCatalog\\V1\352\002\036Google::Cloud::DataCatalog::V1", - create_key=_descriptor._internal_create_key, - serialized_pb=b'\n2google/cloud/datacatalog_v1/proto/timestamps.proto\x12\x1bgoogle.cloud.datacatalog.v1\x1a\x1fgoogle/api/field_behavior.proto\x1a\x1fgoogle/protobuf/timestamp.proto"\xaa\x01\n\x10SystemTimestamps\x12/\n\x0b\x63reate_time\x18\x01 \x01(\x0b\x32\x1a.google.protobuf.Timestamp\x12/\n\x0bupdate_time\x18\x02 \x01(\x0b\x32\x1a.google.protobuf.Timestamp\x12\x34\n\x0b\x65xpire_time\x18\x03 \x01(\x0b\x32\x1a.google.protobuf.TimestampB\x03\xe0\x41\x03\x42\xcb\x01\n\x1f\x63om.google.cloud.datacatalog.v1P\x01ZFgoogle.golang.org/genproto/googleapis/cloud/datacatalog/v1;datacatalog\xf8\x01\x01\xaa\x02\x1bGoogle.Cloud.DataCatalog.V1\xca\x02\x1bGoogle\\Cloud\\DataCatalog\\V1\xea\x02\x1eGoogle::Cloud::DataCatalog::V1b\x06proto3', - dependencies=[ - google_dot_api_dot_field__behavior__pb2.DESCRIPTOR, - google_dot_protobuf_dot_timestamp__pb2.DESCRIPTOR, - ], -) - - -_SYSTEMTIMESTAMPS = _descriptor.Descriptor( - name="SystemTimestamps", - 
full_name="google.cloud.datacatalog.v1.SystemTimestamps", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="create_time", - full_name="google.cloud.datacatalog.v1.SystemTimestamps.create_time", - index=0, - number=1, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="update_time", - full_name="google.cloud.datacatalog.v1.SystemTimestamps.update_time", - index=1, - number=2, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="expire_time", - full_name="google.cloud.datacatalog.v1.SystemTimestamps.expire_time", - index=2, - number=3, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\003", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=150, - serialized_end=320, -) - -_SYSTEMTIMESTAMPS.fields_by_name[ - "create_time" -].message_type = google_dot_protobuf_dot_timestamp__pb2._TIMESTAMP -_SYSTEMTIMESTAMPS.fields_by_name[ - "update_time" -].message_type = google_dot_protobuf_dot_timestamp__pb2._TIMESTAMP -_SYSTEMTIMESTAMPS.fields_by_name[ - 
"expire_time" -].message_type = google_dot_protobuf_dot_timestamp__pb2._TIMESTAMP -DESCRIPTOR.message_types_by_name["SystemTimestamps"] = _SYSTEMTIMESTAMPS -_sym_db.RegisterFileDescriptor(DESCRIPTOR) - -SystemTimestamps = _reflection.GeneratedProtocolMessageType( - "SystemTimestamps", - (_message.Message,), - { - "DESCRIPTOR": _SYSTEMTIMESTAMPS, - "__module__": "google.cloud.datacatalog_v1.proto.timestamps_pb2", - "__doc__": """Timestamps about this resource according to a particular system. - - Attributes: - create_time: - The creation time of the resource within the given system. - update_time: - The last-modified time of the resource within the given - system. - expire_time: - Output only. The expiration time of the resource within the - given system. Currently only applicable to BigQuery resources. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1.SystemTimestamps) - }, -) -_sym_db.RegisterMessage(SystemTimestamps) - - -DESCRIPTOR._options = None -_SYSTEMTIMESTAMPS.fields_by_name["expire_time"]._options = None -# @@protoc_insertion_point(module_scope) diff --git a/google/cloud/datacatalog_v1/proto/timestamps_pb2_grpc.py b/google/cloud/datacatalog_v1/proto/timestamps_pb2_grpc.py deleted file mode 100644 index 8a939394..00000000 --- a/google/cloud/datacatalog_v1/proto/timestamps_pb2_grpc.py +++ /dev/null @@ -1,3 +0,0 @@ -# Generated by the gRPC Python protocol compiler plugin. DO NOT EDIT! -"""Client and server classes corresponding to protobuf-defined services.""" -import grpc diff --git a/google/cloud/datacatalog_v1/py.typed b/google/cloud/datacatalog_v1/py.typed new file mode 100644 index 00000000..bb4088a3 --- /dev/null +++ b/google/cloud/datacatalog_v1/py.typed @@ -0,0 +1,2 @@ +# Marker file for PEP 561. +# The google-cloud-datacatalog package uses inline types.
diff --git a/google/cloud/datacatalog_v1/services/__init__.py b/google/cloud/datacatalog_v1/services/__init__.py new file mode 100644 index 00000000..42ffdf2b --- /dev/null +++ b/google/cloud/datacatalog_v1/services/__init__.py @@ -0,0 +1,16 @@ +# -*- coding: utf-8 -*- + +# Copyright 2020 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +# diff --git a/google/__init__.py b/google/cloud/datacatalog_v1/services/data_catalog/__init__.py similarity index 71% rename from google/__init__.py rename to google/cloud/datacatalog_v1/services/data_catalog/__init__.py index 9a1b64a6..e56ed8a6 100644 --- a/google/__init__.py +++ b/google/cloud/datacatalog_v1/services/data_catalog/__init__.py @@ -1,24 +1,24 @@ # -*- coding: utf-8 -*- -# + # Copyright 2020 Google LLC # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # -# https://www.apache.org/licenses/LICENSE-2.0 +# http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. 
+# -try: - import pkg_resources - - pkg_resources.declare_namespace(__name__) -except ImportError: - import pkgutil +from .client import DataCatalogClient +from .async_client import DataCatalogAsyncClient - __path__ = pkgutil.extend_path(__path__, __name__) +__all__ = ( + "DataCatalogClient", + "DataCatalogAsyncClient", +) diff --git a/google/cloud/datacatalog_v1/services/data_catalog/async_client.py b/google/cloud/datacatalog_v1/services/data_catalog/async_client.py new file mode 100644 index 00000000..84dac12f --- /dev/null +++ b/google/cloud/datacatalog_v1/services/data_catalog/async_client.py @@ -0,0 +1,2716 @@ +# -*- coding: utf-8 -*- + +# Copyright 2020 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
+# + +from collections import OrderedDict +import functools +import re +from typing import Dict, Sequence, Tuple, Type, Union +import pkg_resources + +import google.api_core.client_options as ClientOptions # type: ignore +from google.api_core import exceptions # type: ignore +from google.api_core import gapic_v1 # type: ignore +from google.api_core import retry as retries # type: ignore +from google.auth import credentials # type: ignore +from google.oauth2 import service_account # type: ignore + +from google.cloud.datacatalog_v1.services.data_catalog import pagers +from google.cloud.datacatalog_v1.types import common +from google.cloud.datacatalog_v1.types import datacatalog +from google.cloud.datacatalog_v1.types import gcs_fileset_spec +from google.cloud.datacatalog_v1.types import schema +from google.cloud.datacatalog_v1.types import search +from google.cloud.datacatalog_v1.types import table_spec +from google.cloud.datacatalog_v1.types import tags +from google.cloud.datacatalog_v1.types import timestamps +from google.iam.v1 import iam_policy_pb2 as iam_policy # type: ignore +from google.iam.v1 import policy_pb2 as policy # type: ignore +from google.protobuf import field_mask_pb2 as field_mask # type: ignore + +from .transports.base import DataCatalogTransport +from .transports.grpc_asyncio import DataCatalogGrpcAsyncIOTransport +from .client import DataCatalogClient + + +class DataCatalogAsyncClient: + """Data Catalog API service allows clients to discover, + understand, and manage their data. 
+ """ + + _client: DataCatalogClient + + DEFAULT_ENDPOINT = DataCatalogClient.DEFAULT_ENDPOINT + DEFAULT_MTLS_ENDPOINT = DataCatalogClient.DEFAULT_MTLS_ENDPOINT + + tag_template_field_path = staticmethod(DataCatalogClient.tag_template_field_path) + + tag_template_path = staticmethod(DataCatalogClient.tag_template_path) + + entry_path = staticmethod(DataCatalogClient.entry_path) + + tag_path = staticmethod(DataCatalogClient.tag_path) + + entry_group_path = staticmethod(DataCatalogClient.entry_group_path) + + from_service_account_file = DataCatalogClient.from_service_account_file + from_service_account_json = from_service_account_file + + get_transport_class = functools.partial( + type(DataCatalogClient).get_transport_class, type(DataCatalogClient) + ) + + def __init__( + self, + *, + credentials: credentials.Credentials = None, + transport: Union[str, DataCatalogTransport] = "grpc_asyncio", + client_options: ClientOptions = None, + ) -> None: + """Instantiate the data catalog client. + + Args: + credentials (Optional[google.auth.credentials.Credentials]): The + authorization credentials to attach to requests. These + credentials identify the application to the service; if none + are specified, the client will attempt to ascertain the + credentials from the environment. + transport (Union[str, ~.DataCatalogTransport]): The + transport to use. If set to None, a transport is chosen + automatically. + client_options (ClientOptions): Custom options for the client. It + won't take effect if a ``transport`` instance is provided. + (1) The ``api_endpoint`` property can be used to override the + default endpoint provided by the client. GOOGLE_API_USE_MTLS + environment variable can also be used to override the endpoint: + "always" (always use the default mTLS endpoint), "never" (always + use the default regular endpoint, this is the default value for + the environment variable) and "auto" (auto switch to the default + mTLS endpoint if client SSL credentials is present). 
+ However, + the ``api_endpoint`` property takes precedence if provided. + (2) The ``client_cert_source`` property is used to provide client + SSL credentials for mutual TLS transport. If not provided, the + default SSL credentials will be used if present. + + Raises: + google.auth.exceptions.MutualTlsChannelError: If mutual TLS transport + creation failed for any reason. + """ + + self._client = DataCatalogClient( + credentials=credentials, transport=transport, client_options=client_options, + ) + + async def search_catalog( + self, + request: datacatalog.SearchCatalogRequest = None, + *, + scope: datacatalog.SearchCatalogRequest.Scope = None, + query: str = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> pagers.SearchCatalogAsyncPager: + r"""Searches Data Catalog for multiple resources like entries and tags + that match a query. + + This is a custom method + (https://cloud.google.com/apis/design/custom_methods) and does + not return the complete resource, only the resource identifier + and high level fields. Clients can subsequently call ``Get`` + methods. + + Note that Data Catalog search queries do not guarantee full + recall. Query results that match your query may not be returned, + even in subsequent result pages. Also note that results returned + (and not returned) can vary across repeated search queries. + + See `Data Catalog Search + Syntax `__ + for more information. + + Args: + request (:class:`~.datacatalog.SearchCatalogRequest`): + The request object. Request message for + [SearchCatalog][google.cloud.datacatalog.v1.DataCatalog.SearchCatalog]. + scope (:class:`~.datacatalog.SearchCatalogRequest.Scope`): + Required. The scope of this search request. A ``scope`` + that has empty ``include_org_ids``, + ``include_project_ids`` AND false + ``include_gcp_public_datasets`` is considered invalid. + Data Catalog will return an error in such a case. 
+ This corresponds to the ``scope`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + query (:class:`str`): + Required. The query string in search query syntax. The + query must be non-empty. + + Query strings can be as simple as "x" or more qualified as: + + - name:x + - column:x + - description:y + + Note: Query tokens need to have a minimum of 3 + characters for substring matching to work correctly. See + `Data Catalog Search + Syntax `__ + for more information. + This corresponds to the ``query`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.pagers.SearchCatalogAsyncPager: + Response message for + [SearchCatalog][google.cloud.datacatalog.v1.DataCatalog.SearchCatalog]. + + Iterating over this object will yield results and + resolve additional pages automatically. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + if request is not None and any([scope, query]): + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + request = datacatalog.SearchCatalogRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if scope is not None: + request.scope = scope + if query is not None: + request.query = query + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. 
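The request-coercion pattern that opens `search_catalog` (and every other method in this file) can be sketched generically. `coerce_request` and the dict standing in for a protobuf request are illustrative; only the flattening rule the generated code enforces is assumed:

```python
def coerce_request(request=None, **field_args):
    """Sketch of the request-coercion pattern in the generated methods.

    ``field_args`` stands in for the per-method keyword fields such as
    ``scope`` and ``query``; the real client lists them explicitly.
    """
    provided = {k: v for k, v in field_args.items() if v is not None}
    # A full request object and individual field arguments are
    # mutually exclusive, mirroring the sanity check above.
    if request is not None and provided:
        raise ValueError(
            "If the `request` argument is set, then none of "
            "the individual field arguments should be set."
        )
    # Coerce whatever was given into a request, then apply keyword fields.
    request = dict(request or {})
    request.update(provided)
    return request
```

The same shape repeats verbatim in every method below; only the field names change.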
+ rpc = gapic_v1.method_async.wrap_method( + self._client._transport.search_catalog, + default_retry=retries.Retry( + initial=0.1, + maximum=60.0, + multiplier=1.3, + predicate=retries.if_exception_type(exceptions.ServiceUnavailable,), + ), + default_timeout=60.0, + client_info=_client_info, + ) + + # Send the request. + response = await rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # This method is paged; wrap the response in a pager, which provides + # an `__aiter__` convenience method. + response = pagers.SearchCatalogAsyncPager( + method=rpc, request=request, response=response, metadata=metadata, + ) + + # Done; return the response. + return response + + async def create_entry_group( + self, + request: datacatalog.CreateEntryGroupRequest = None, + *, + parent: str = None, + entry_group_id: str = None, + entry_group: datacatalog.EntryGroup = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> datacatalog.EntryGroup: + r"""Creates an EntryGroup. + + An entry group contains logically related entries together with + Cloud Identity and Access Management policies that specify the + users who can create, edit, and view entries within the entry + group. + + Data Catalog automatically creates an entry group for BigQuery + entries ("@bigquery") and Pub/Sub topics ("@pubsub"). Users + create their own entry group to contain Cloud Storage fileset + entries or custom type entries, and the IAM policies associated + with those entries. Entry groups, like entries, can be searched. + + A maximum of 10,000 entry groups may be created per organization + across all locations. + + Users should enable the Data Catalog API in the project + identified by the ``parent`` parameter (see [Data Catalog + Resource Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). 
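The `SearchCatalogAsyncPager` returned above resolves additional pages transparently during `async for` iteration. A minimal stdlib sketch of that `__aiter__` convenience; `AsyncPager` and `fake_search` are hypothetical stand-ins, not the gapic classes:

```python
import asyncio


class AsyncPager:
    """Minimal sketch of the ``__aiter__`` convenience the pagers add."""

    def __init__(self, fetch_page, page_token=""):
        # ``fetch_page`` is a coroutine: token -> (items, next_token).
        self._fetch_page = fetch_page
        self._token = page_token

    async def __aiter__(self):
        while True:
            items, self._token = await self._fetch_page(self._token)
            for item in items:
                yield item
            if not self._token:  # an empty token means no more pages
                break


async def fake_search(token):
    # Two fixed pages standing in for paged SearchCatalog responses.
    pages = {"": (["entry1", "entry2"], "p2"), "p2": (["entry3"], "")}
    return pages[token]


async def main():
    # The caller never touches page tokens; iteration fetches pages lazily.
    return [item async for item in AsyncPager(fake_search)]
```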
+ + Args: + request (:class:`~.datacatalog.CreateEntryGroupRequest`): + The request object. Request message for + [CreateEntryGroup][google.cloud.datacatalog.v1.DataCatalog.CreateEntryGroup]. + parent (:class:`str`): + Required. The name of the project this entry group is + in. Example: + + - projects/{project_id}/locations/{location} + + Note that this EntryGroup and its child resources may + not actually be stored in the location in this name. + This corresponds to the ``parent`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + entry_group_id (:class:`str`): + Required. The id of the entry group + to create. The id must begin with a + letter or underscore, contain only + English letters, numbers and + underscores, and be at most 64 + characters. + This corresponds to the ``entry_group_id`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + entry_group (:class:`~.datacatalog.EntryGroup`): + The entry group to create. Defaults + to an empty entry group. + This corresponds to the ``entry_group`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.datacatalog.EntryGroup: + EntryGroup Metadata. An EntryGroup resource represents a + logical grouping of zero or more Data Catalog + [Entry][google.cloud.datacatalog.v1.Entry] resources. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. 
+ if request is not None and any([parent, entry_group_id, entry_group]): + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + request = datacatalog.CreateEntryGroupRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if parent is not None: + request.parent = parent + if entry_group_id is not None: + request.entry_group_id = entry_group_id + if entry_group is not None: + request.entry_group = entry_group + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = gapic_v1.method_async.wrap_method( + self._client._transport.create_entry_group, + default_timeout=None, + client_info=_client_info, + ) + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)), + ) + + # Send the request. + response = await rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. + return response + + async def get_entry_group( + self, + request: datacatalog.GetEntryGroupRequest = None, + *, + name: str = None, + read_mask: field_mask.FieldMask = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> datacatalog.EntryGroup: + r"""Gets an EntryGroup. + + Args: + request (:class:`~.datacatalog.GetEntryGroupRequest`): + The request object. Request message for + [GetEntryGroup][google.cloud.datacatalog.v1.DataCatalog.GetEntryGroup]. + name (:class:`str`): + Required. The name of the entry group. For example, + ``projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}``. + This corresponds to the ``name`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. 
+ read_mask (:class:`~.field_mask.FieldMask`): + The fields to return. If not set or + empty, all fields are returned. + This corresponds to the ``read_mask`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.datacatalog.EntryGroup: + EntryGroup Metadata. An EntryGroup resource represents a + logical grouping of zero or more Data Catalog + [Entry][google.cloud.datacatalog.v1.Entry] resources. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + if request is not None and any([name, read_mask]): + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + request = datacatalog.GetEntryGroupRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if name is not None: + request.name = name + if read_mask is not None: + request.read_mask = read_mask + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = gapic_v1.method_async.wrap_method( + self._client._transport.get_entry_group, + default_retry=retries.Retry( + initial=0.1, + maximum=60.0, + multiplier=1.3, + predicate=retries.if_exception_type(exceptions.ServiceUnavailable,), + ), + default_timeout=60.0, + client_info=_client_info, + ) + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)), + ) + + # Send the request. 
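The `Retry` settings used above (`initial=0.1`, `maximum=60.0`, `multiplier=1.3`) imply a geometrically growing delay schedule capped at the maximum. A sketch of the deterministic component only; google-api-core also applies random jitter, which this omits:

```python
def backoff_schedule(initial=0.1, maximum=60.0, multiplier=1.3, attempts=5):
    """Approximate the delay sequence implied by the Retry settings.

    Each retry waits ``multiplier`` times longer than the previous one,
    never exceeding ``maximum`` seconds.
    """
    delay, delays = initial, []
    for _ in range(attempts):
        delays.append(min(delay, maximum))
        delay *= multiplier
    return delays
```

Retries apply only to errors matching the predicate, here `ServiceUnavailable`, and the whole attempt loop is bounded by the 60-second default timeout.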
+ response = await rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. + return response + + async def update_entry_group( + self, + request: datacatalog.UpdateEntryGroupRequest = None, + *, + entry_group: datacatalog.EntryGroup = None, + update_mask: field_mask.FieldMask = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> datacatalog.EntryGroup: + r"""Updates an EntryGroup. The user should enable the Data Catalog + API in the project identified by the ``entry_group.name`` + parameter (see [Data Catalog Resource Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + Args: + request (:class:`~.datacatalog.UpdateEntryGroupRequest`): + The request object. Request message for + [UpdateEntryGroup][google.cloud.datacatalog.v1.DataCatalog.UpdateEntryGroup]. + entry_group (:class:`~.datacatalog.EntryGroup`): + Required. The updated entry group. + "name" field must be set. + This corresponds to the ``entry_group`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + update_mask (:class:`~.field_mask.FieldMask`): + The fields to update on the entry + group. If absent or empty, all + modifiable fields are updated. + This corresponds to the ``update_mask`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.datacatalog.EntryGroup: + EntryGroup Metadata. An EntryGroup resource represents a + logical grouping of zero or more Data Catalog + [Entry][google.cloud.datacatalog.v1.Entry] resources. + + """ + # Create or coerce a protobuf request object. 
+ # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + if request is not None and any([entry_group, update_mask]): + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + request = datacatalog.UpdateEntryGroupRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if entry_group is not None: + request.entry_group = entry_group + if update_mask is not None: + request.update_mask = update_mask + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = gapic_v1.method_async.wrap_method( + self._client._transport.update_entry_group, + default_timeout=None, + client_info=_client_info, + ) + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata( + (("entry_group.name", request.entry_group.name),) + ), + ) + + # Send the request. + response = await rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. + return response + + async def delete_entry_group( + self, + request: datacatalog.DeleteEntryGroupRequest = None, + *, + name: str = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> None: + r"""Deletes an EntryGroup. Only entry groups that do not contain + entries can be deleted. Users should enable the Data Catalog API + in the project identified by the ``name`` parameter (see [Data + Catalog Resource Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + Args: + request (:class:`~.datacatalog.DeleteEntryGroupRequest`): + The request object. Request message for + [DeleteEntryGroup][google.cloud.datacatalog.v1.DataCatalog.DeleteEntryGroup]. 
+ name (:class:`str`): + Required. The name of the entry group. For example, + ``projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}``. + This corresponds to the ``name`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + if request is not None and any([name]): + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + request = datacatalog.DeleteEntryGroupRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if name is not None: + request.name = name + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = gapic_v1.method_async.wrap_method( + self._client._transport.delete_entry_group, + default_timeout=None, + client_info=_client_info, + ) + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)), + ) + + # Send the request. + await rpc( + request, retry=retry, timeout=timeout, metadata=metadata, + ) + + async def list_entry_groups( + self, + request: datacatalog.ListEntryGroupsRequest = None, + *, + parent: str = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> pagers.ListEntryGroupsAsyncPager: + r"""Lists entry groups. 
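The routing-header step seen in each method (`gapic_v1.routing_header.to_grpc_metadata`) packs the resource name into the `x-goog-request-params` metadata key so the backend can route the RPC by resource. A rough stdlib approximation, assuming simple key/value pairs and percent-encoding of values:

```python
from urllib.parse import quote


def to_request_params_metadata(fields):
    """Sketch of how the routing header is assembled.

    ``fields`` is a sequence of (key, value) pairs such as
    (("name", request.name),); values are URL-encoded so that
    slashes in resource names survive transport as metadata.
    """
    params = "&".join(
        "{}={}".format(key, quote(value, safe="")) for key, value in fields
    )
    return ("x-goog-request-params", params)
```

The generated code appends this pair to any caller-supplied metadata before sending the request.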
+ + Args: + request (:class:`~.datacatalog.ListEntryGroupsRequest`): + The request object. Request message for + [ListEntryGroups][google.cloud.datacatalog.v1.DataCatalog.ListEntryGroups]. + parent (:class:`str`): + Required. The name of the location that contains the + entry groups, which can be provided in URL format. + Example: + + - projects/{project_id}/locations/{location} + This corresponds to the ``parent`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.pagers.ListEntryGroupsAsyncPager: + Response message for + [ListEntryGroups][google.cloud.datacatalog.v1.DataCatalog.ListEntryGroups]. + + Iterating over this object will yield results and + resolve additional pages automatically. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + if request is not None and any([parent]): + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + request = datacatalog.ListEntryGroupsRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if parent is not None: + request.parent = parent + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. 
+ rpc = gapic_v1.method_async.wrap_method( + self._client._transport.list_entry_groups, + default_retry=retries.Retry( + initial=0.1, + maximum=60.0, + multiplier=1.3, + predicate=retries.if_exception_type(exceptions.ServiceUnavailable,), + ), + default_timeout=60.0, + client_info=_client_info, + ) + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)), + ) + + # Send the request. + response = await rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # This method is paged; wrap the response in a pager, which provides + # an `__aiter__` convenience method. + response = pagers.ListEntryGroupsAsyncPager( + method=rpc, request=request, response=response, metadata=metadata, + ) + + # Done; return the response. + return response + + async def create_entry( + self, + request: datacatalog.CreateEntryRequest = None, + *, + parent: str = None, + entry_id: str = None, + entry: datacatalog.Entry = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> datacatalog.Entry: + r"""Creates an entry. Only entries of 'FILESET' type or + user-specified type can be created. + + Users should enable the Data Catalog API in the project + identified by the ``parent`` parameter (see [Data Catalog + Resource Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + A maximum of 100,000 entries may be created per entry group. + + Args: + request (:class:`~.datacatalog.CreateEntryRequest`): + The request object. Request message for + [CreateEntry][google.cloud.datacatalog.v1.DataCatalog.CreateEntry]. + parent (:class:`str`): + Required. The name of the entry group this entry is in. 
+ Example: + + - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id} + + Note that this Entry and its child resources may not + actually be stored in the location in this name. + This corresponds to the ``parent`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + entry_id (:class:`str`): + Required. The id of the entry to + create. + This corresponds to the ``entry_id`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + entry (:class:`~.datacatalog.Entry`): + Required. The entry to create. + This corresponds to the ``entry`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.datacatalog.Entry: + Entry Metadata. A Data Catalog Entry resource represents + another resource in Google Cloud Platform (such as a + BigQuery dataset or a Pub/Sub topic) or outside of + Google Cloud Platform. Clients can use the + ``linked_resource`` field in the Entry resource to refer + to the original resource ID of the source system. + + An Entry resource contains resource details, such as its + schema. An Entry can also be used to attach flexible + metadata, such as a + [Tag][google.cloud.datacatalog.v1.Tag]. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + if request is not None and any([parent, entry_id, entry]): + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." 
+ ) + + request = datacatalog.CreateEntryRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if parent is not None: + request.parent = parent + if entry_id is not None: + request.entry_id = entry_id + if entry is not None: + request.entry = entry + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = gapic_v1.method_async.wrap_method( + self._client._transport.create_entry, + default_timeout=None, + client_info=_client_info, + ) + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)), + ) + + # Send the request. + response = await rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. + return response + + async def update_entry( + self, + request: datacatalog.UpdateEntryRequest = None, + *, + entry: datacatalog.Entry = None, + update_mask: field_mask.FieldMask = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> datacatalog.Entry: + r"""Updates an existing entry. Users should enable the Data Catalog + API in the project identified by the ``entry.name`` parameter + (see [Data Catalog Resource Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + Args: + request (:class:`~.datacatalog.UpdateEntryRequest`): + The request object. Request message for + [UpdateEntry][google.cloud.datacatalog.v1.DataCatalog.UpdateEntry]. + entry (:class:`~.datacatalog.Entry`): + Required. The updated entry. The + "name" field must be set. + This corresponds to the ``entry`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + update_mask (:class:`~.field_mask.FieldMask`): + The fields to update on the entry. 
If absent or empty, + all modifiable fields are updated. + + The following fields are modifiable: + + - For entries with type ``DATA_STREAM``: + + - ``schema`` + + - For entries with type ``FILESET`` + + - ``schema`` + - ``display_name`` + - ``description`` + - ``gcs_fileset_spec`` + - ``gcs_fileset_spec.file_patterns`` + + - For entries with ``user_specified_type`` + + - ``schema`` + - ``display_name`` + - ``description`` + - user_specified_type + - user_specified_system + - linked_resource + - source_system_timestamps + This corresponds to the ``update_mask`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.datacatalog.Entry: + Entry Metadata. A Data Catalog Entry resource represents + another resource in Google Cloud Platform (such as a + BigQuery dataset or a Pub/Sub topic) or outside of + Google Cloud Platform. Clients can use the + ``linked_resource`` field in the Entry resource to refer + to the original resource ID of the source system. + + An Entry resource contains resource details, such as its + schema. An Entry can also be used to attach flexible + metadata, such as a + [Tag][google.cloud.datacatalog.v1.Tag]. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + if request is not None and any([entry, update_mask]): + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + request = datacatalog.UpdateEntryRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. 
+
+        if entry is not None:
+            request.entry = entry
+        if update_mask is not None:
+            request.update_mask = update_mask
+
+        # Wrap the RPC method; this adds retry and timeout information,
+        # and friendly error handling.
+        rpc = gapic_v1.method_async.wrap_method(
+            self._client._transport.update_entry,
+            default_timeout=None,
+            client_info=_client_info,
+        )
+
+        # Certain fields should be provided within the metadata header;
+        # add these here.
+        metadata = tuple(metadata) + (
+            gapic_v1.routing_header.to_grpc_metadata(
+                (("entry.name", request.entry.name),)
+            ),
+        )
+
+        # Send the request.
+        response = await rpc(request, retry=retry, timeout=timeout, metadata=metadata,)
+
+        # Done; return the response.
+        return response
+
+    async def delete_entry(
+        self,
+        request: datacatalog.DeleteEntryRequest = None,
+        *,
+        name: str = None,
+        retry: retries.Retry = gapic_v1.method.DEFAULT,
+        timeout: float = None,
+        metadata: Sequence[Tuple[str, str]] = (),
+    ) -> None:
+        r"""Deletes an existing entry. Only entries created through
+        [CreateEntry][google.cloud.datacatalog.v1.DataCatalog.CreateEntry]
+        method can be deleted. Users should enable the Data Catalog API
+        in the project identified by the ``name`` parameter (see [Data
+        Catalog Resource Project]
+        (https://cloud.google.com/data-catalog/docs/concepts/resource-project)
+        for more information).
+
+        Args:
+            request (:class:`~.datacatalog.DeleteEntryRequest`):
+                The request object. Request message for
+                [DeleteEntry][google.cloud.datacatalog.v1.DataCatalog.DeleteEntry].
+            name (:class:`str`):
+                Required. The name of the entry. Example:
+
+                -  projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id}
+                This corresponds to the ``name`` field
+                on the ``request`` instance; if ``request`` is provided, this
+                should not be set.
+
+            retry (google.api_core.retry.Retry): Designation of what errors, if any,
+                should be retried.
+            timeout (float): The timeout for this request.
+            metadata (Sequence[Tuple[str, str]]): Strings which should be
+                sent along with the request as metadata.
+        """
+        # Create or coerce a protobuf request object.
+        # Sanity check: If we got a request object, we should *not* have
+        # gotten any keyword arguments that map to the request.
+        if request is not None and any([name]):
+            raise ValueError(
+                "If the `request` argument is set, then none of "
+                "the individual field arguments should be set."
+            )
+
+        request = datacatalog.DeleteEntryRequest(request)
+
+        # If we have keyword arguments corresponding to fields on the
+        # request, apply these.
+
+        if name is not None:
+            request.name = name
+
+        # Wrap the RPC method; this adds retry and timeout information,
+        # and friendly error handling.
+        rpc = gapic_v1.method_async.wrap_method(
+            self._client._transport.delete_entry,
+            default_timeout=None,
+            client_info=_client_info,
+        )
+
+        # Certain fields should be provided within the metadata header;
+        # add these here.
+        metadata = tuple(metadata) + (
+            gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+        )
+
+        # Send the request.
+        await rpc(
+            request, retry=retry, timeout=timeout, metadata=metadata,
+        )
+
+    async def get_entry(
+        self,
+        request: datacatalog.GetEntryRequest = None,
+        *,
+        name: str = None,
+        retry: retries.Retry = gapic_v1.method.DEFAULT,
+        timeout: float = None,
+        metadata: Sequence[Tuple[str, str]] = (),
+    ) -> datacatalog.Entry:
+        r"""Gets an entry.
+
+        Args:
+            request (:class:`~.datacatalog.GetEntryRequest`):
+                The request object. Request message for
+                [GetEntry][google.cloud.datacatalog.v1.DataCatalog.GetEntry].
+            name (:class:`str`):
+                Required. The name of the entry. Example:
+
+                -  projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id}
+                This corresponds to the ``name`` field
+                on the ``request`` instance; if ``request`` is provided, this
+                should not be set.
+
+            retry (google.api_core.retry.Retry): Designation of what errors, if any,
+                should be retried.
+            timeout (float): The timeout for this request.
+            metadata (Sequence[Tuple[str, str]]): Strings which should be
+                sent along with the request as metadata.
+
+        Returns:
+            ~.datacatalog.Entry:
+                Entry Metadata. A Data Catalog Entry resource represents
+                another resource in Google Cloud Platform (such as a
+                BigQuery dataset or a Pub/Sub topic) or outside of
+                Google Cloud Platform. Clients can use the
+                ``linked_resource`` field in the Entry resource to refer
+                to the original resource ID of the source system.
+
+                An Entry resource contains resource details, such as its
+                schema. An Entry can also be used to attach flexible
+                metadata, such as a
+                [Tag][google.cloud.datacatalog.v1.Tag].
+
+        """
+        # Create or coerce a protobuf request object.
+        # Sanity check: If we got a request object, we should *not* have
+        # gotten any keyword arguments that map to the request.
+        if request is not None and any([name]):
+            raise ValueError(
+                "If the `request` argument is set, then none of "
+                "the individual field arguments should be set."
+            )
+
+        request = datacatalog.GetEntryRequest(request)
+
+        # If we have keyword arguments corresponding to fields on the
+        # request, apply these.
+
+        if name is not None:
+            request.name = name
+
+        # Wrap the RPC method; this adds retry and timeout information,
+        # and friendly error handling.
+        rpc = gapic_v1.method_async.wrap_method(
+            self._client._transport.get_entry,
+            default_retry=retries.Retry(
+                initial=0.1,
+                maximum=60.0,
+                multiplier=1.3,
+                predicate=retries.if_exception_type(exceptions.ServiceUnavailable,),
+            ),
+            default_timeout=60.0,
+            client_info=_client_info,
+        )
+
+        # Certain fields should be provided within the metadata header;
+        # add these here.
+        metadata = tuple(metadata) + (
+            gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+        )
+
+        # Send the request.
+        response = await rpc(request, retry=retry, timeout=timeout, metadata=metadata,)
+
+        # Done; return the response.
+        return response
+
+    async def lookup_entry(
+        self,
+        request: datacatalog.LookupEntryRequest = None,
+        *,
+        retry: retries.Retry = gapic_v1.method.DEFAULT,
+        timeout: float = None,
+        metadata: Sequence[Tuple[str, str]] = (),
+    ) -> datacatalog.Entry:
+        r"""Get an entry by target resource name. This method
+        allows clients to use the resource name from the source
+        Google Cloud Platform service to get the Data Catalog
+        Entry.
+
+        Args:
+            request (:class:`~.datacatalog.LookupEntryRequest`):
+                The request object. Request message for
+                [LookupEntry][google.cloud.datacatalog.v1.DataCatalog.LookupEntry].
+
+            retry (google.api_core.retry.Retry): Designation of what errors, if any,
+                should be retried.
+            timeout (float): The timeout for this request.
+            metadata (Sequence[Tuple[str, str]]): Strings which should be
+                sent along with the request as metadata.
+
+        Returns:
+            ~.datacatalog.Entry:
+                Entry Metadata. A Data Catalog Entry resource represents
+                another resource in Google Cloud Platform (such as a
+                BigQuery dataset or a Pub/Sub topic) or outside of
+                Google Cloud Platform. Clients can use the
+                ``linked_resource`` field in the Entry resource to refer
+                to the original resource ID of the source system.
+
+                An Entry resource contains resource details, such as its
+                schema. An Entry can also be used to attach flexible
+                metadata, such as a
+                [Tag][google.cloud.datacatalog.v1.Tag].
+
+        """
+        # Create or coerce a protobuf request object.
+
+        request = datacatalog.LookupEntryRequest(request)
+
+        # Wrap the RPC method; this adds retry and timeout information,
+        # and friendly error handling.
+        rpc = gapic_v1.method_async.wrap_method(
+            self._client._transport.lookup_entry,
+            default_retry=retries.Retry(
+                initial=0.1,
+                maximum=60.0,
+                multiplier=1.3,
+                predicate=retries.if_exception_type(exceptions.ServiceUnavailable,),
+            ),
+            default_timeout=60.0,
+            client_info=_client_info,
+        )
+
+        # Send the request.
+        response = await rpc(request, retry=retry, timeout=timeout, metadata=metadata,)
+
+        # Done; return the response.
+        return response
+
+    async def list_entries(
+        self,
+        request: datacatalog.ListEntriesRequest = None,
+        *,
+        parent: str = None,
+        retry: retries.Retry = gapic_v1.method.DEFAULT,
+        timeout: float = None,
+        metadata: Sequence[Tuple[str, str]] = (),
+    ) -> pagers.ListEntriesAsyncPager:
+        r"""Lists entries.
+
+        Args:
+            request (:class:`~.datacatalog.ListEntriesRequest`):
+                The request object. Request message for
+                [ListEntries][google.cloud.datacatalog.v1.DataCatalog.ListEntries].
+            parent (:class:`str`):
+                Required. The name of the entry group that contains the
+                entries, which can be provided in URL format. Example:
+
+                -  projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}
+                This corresponds to the ``parent`` field
+                on the ``request`` instance; if ``request`` is provided, this
+                should not be set.
+
+            retry (google.api_core.retry.Retry): Designation of what errors, if any,
+                should be retried.
+            timeout (float): The timeout for this request.
+            metadata (Sequence[Tuple[str, str]]): Strings which should be
+                sent along with the request as metadata.
+
+        Returns:
+            ~.pagers.ListEntriesAsyncPager:
+                Response message for
+                [ListEntries][google.cloud.datacatalog.v1.DataCatalog.ListEntries].
+
+                Iterating over this object will yield results and
+                resolve additional pages automatically.
+
+        """
+        # Create or coerce a protobuf request object.
+        # Sanity check: If we got a request object, we should *not* have
+        # gotten any keyword arguments that map to the request.
+        if request is not None and any([parent]):
+            raise ValueError(
+                "If the `request` argument is set, then none of "
+                "the individual field arguments should be set."
+            )
+
+        request = datacatalog.ListEntriesRequest(request)
+
+        # If we have keyword arguments corresponding to fields on the
+        # request, apply these.
+
+        if parent is not None:
+            request.parent = parent
+
+        # Wrap the RPC method; this adds retry and timeout information,
+        # and friendly error handling.
+        rpc = gapic_v1.method_async.wrap_method(
+            self._client._transport.list_entries,
+            default_retry=retries.Retry(
+                initial=0.1,
+                maximum=60.0,
+                multiplier=1.3,
+                predicate=retries.if_exception_type(exceptions.ServiceUnavailable,),
+            ),
+            default_timeout=60.0,
+            client_info=_client_info,
+        )
+
+        # Certain fields should be provided within the metadata header;
+        # add these here.
+        metadata = tuple(metadata) + (
+            gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
+        )
+
+        # Send the request.
+        response = await rpc(request, retry=retry, timeout=timeout, metadata=metadata,)
+
+        # This method is paged; wrap the response in a pager, which provides
+        # an `__aiter__` convenience method.
+        response = pagers.ListEntriesAsyncPager(
+            method=rpc, request=request, response=response, metadata=metadata,
+        )
+
+        # Done; return the response.
+        return response
+
+    async def create_tag_template(
+        self,
+        request: datacatalog.CreateTagTemplateRequest = None,
+        *,
+        parent: str = None,
+        tag_template_id: str = None,
+        tag_template: tags.TagTemplate = None,
+        retry: retries.Retry = gapic_v1.method.DEFAULT,
+        timeout: float = None,
+        metadata: Sequence[Tuple[str, str]] = (),
+    ) -> tags.TagTemplate:
+        r"""Creates a tag template. The user should enable the Data Catalog
+        API in the project identified by the ``parent`` parameter (see
+        `Data Catalog Resource
+        Project <https://cloud.google.com/data-catalog/docs/concepts/resource-project>`__
+        for more information).
+
+        Args:
+            request (:class:`~.datacatalog.CreateTagTemplateRequest`):
+                The request object.
+                Request message for
+                [CreateTagTemplate][google.cloud.datacatalog.v1.DataCatalog.CreateTagTemplate].
+            parent (:class:`str`):
+                Required. The name of the project and the template
+                location
+                `region <https://cloud.google.com/data-catalog/docs/concepts/regions>`__.
+
+                Example:
+
+                -  projects/{project_id}/locations/us-central1
+                This corresponds to the ``parent`` field
+                on the ``request`` instance; if ``request`` is provided, this
+                should not be set.
+            tag_template_id (:class:`str`):
+                Required. The id of the tag template
+                to create.
+                This corresponds to the ``tag_template_id`` field
+                on the ``request`` instance; if ``request`` is provided, this
+                should not be set.
+            tag_template (:class:`~.tags.TagTemplate`):
+                Required. The tag template to create.
+                This corresponds to the ``tag_template`` field
+                on the ``request`` instance; if ``request`` is provided, this
+                should not be set.
+
+            retry (google.api_core.retry.Retry): Designation of what errors, if any,
+                should be retried.
+            timeout (float): The timeout for this request.
+            metadata (Sequence[Tuple[str, str]]): Strings which should be
+                sent along with the request as metadata.
+
+        Returns:
+            ~.tags.TagTemplate:
+                A tag template defines a tag, which can have one or more
+                typed fields. The template is used to create and attach
+                the tag to GCP resources. `Tag template
+                roles <https://cloud.google.com/data-catalog/docs/concepts/iam>`__
+                provide permissions to create, edit, and use the
+                template. See, for example, the `TagTemplate
+                User <https://cloud.google.com/data-catalog/docs/how-to/template-user>`__
+                role, which includes permission to use the tag template
+                to tag resources.
+
+        """
+        # Create or coerce a protobuf request object.
+        # Sanity check: If we got a request object, we should *not* have
+        # gotten any keyword arguments that map to the request.
+        if request is not None and any([parent, tag_template_id, tag_template]):
+            raise ValueError(
+                "If the `request` argument is set, then none of "
+                "the individual field arguments should be set."
+            )
+
+        request = datacatalog.CreateTagTemplateRequest(request)
+
+        # If we have keyword arguments corresponding to fields on the
+        # request, apply these.
+
+        if parent is not None:
+            request.parent = parent
+        if tag_template_id is not None:
+            request.tag_template_id = tag_template_id
+        if tag_template is not None:
+            request.tag_template = tag_template
+
+        # Wrap the RPC method; this adds retry and timeout information,
+        # and friendly error handling.
+        rpc = gapic_v1.method_async.wrap_method(
+            self._client._transport.create_tag_template,
+            default_timeout=None,
+            client_info=_client_info,
+        )
+
+        # Certain fields should be provided within the metadata header;
+        # add these here.
+        metadata = tuple(metadata) + (
+            gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
+        )
+
+        # Send the request.
+        response = await rpc(request, retry=retry, timeout=timeout, metadata=metadata,)
+
+        # Done; return the response.
+        return response
+
+    async def get_tag_template(
+        self,
+        request: datacatalog.GetTagTemplateRequest = None,
+        *,
+        name: str = None,
+        retry: retries.Retry = gapic_v1.method.DEFAULT,
+        timeout: float = None,
+        metadata: Sequence[Tuple[str, str]] = (),
+    ) -> tags.TagTemplate:
+        r"""Gets a tag template.
+
+        Args:
+            request (:class:`~.datacatalog.GetTagTemplateRequest`):
+                The request object. Request message for
+                [GetTagTemplate][google.cloud.datacatalog.v1.DataCatalog.GetTagTemplate].
+            name (:class:`str`):
+                Required. The name of the tag template. Example:
+
+                -  projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id}
+                This corresponds to the ``name`` field
+                on the ``request`` instance; if ``request`` is provided, this
+                should not be set.
+
+            retry (google.api_core.retry.Retry): Designation of what errors, if any,
+                should be retried.
+            timeout (float): The timeout for this request.
+            metadata (Sequence[Tuple[str, str]]): Strings which should be
+                sent along with the request as metadata.
+
+        Returns:
+            ~.tags.TagTemplate:
+                A tag template defines a tag, which can have one or more
+                typed fields. The template is used to create and attach
+                the tag to GCP resources. `Tag template
+                roles <https://cloud.google.com/data-catalog/docs/concepts/iam>`__
+                provide permissions to create, edit, and use the
+                template. See, for example, the `TagTemplate
+                User <https://cloud.google.com/data-catalog/docs/how-to/template-user>`__
+                role, which includes permission to use the tag template
+                to tag resources.
+
+        """
+        # Create or coerce a protobuf request object.
+        # Sanity check: If we got a request object, we should *not* have
+        # gotten any keyword arguments that map to the request.
+        if request is not None and any([name]):
+            raise ValueError(
+                "If the `request` argument is set, then none of "
+                "the individual field arguments should be set."
+            )
+
+        request = datacatalog.GetTagTemplateRequest(request)
+
+        # If we have keyword arguments corresponding to fields on the
+        # request, apply these.
+
+        if name is not None:
+            request.name = name
+
+        # Wrap the RPC method; this adds retry and timeout information,
+        # and friendly error handling.
+        rpc = gapic_v1.method_async.wrap_method(
+            self._client._transport.get_tag_template,
+            default_timeout=None,
+            client_info=_client_info,
+        )
+
+        # Certain fields should be provided within the metadata header;
+        # add these here.
+        metadata = tuple(metadata) + (
+            gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+        )
+
+        # Send the request.
+        response = await rpc(request, retry=retry, timeout=timeout, metadata=metadata,)
+
+        # Done; return the response.
+        return response
+
+    async def update_tag_template(
+        self,
+        request: datacatalog.UpdateTagTemplateRequest = None,
+        *,
+        tag_template: tags.TagTemplate = None,
+        update_mask: field_mask.FieldMask = None,
+        retry: retries.Retry = gapic_v1.method.DEFAULT,
+        timeout: float = None,
+        metadata: Sequence[Tuple[str, str]] = (),
+    ) -> tags.TagTemplate:
+        r"""Updates a tag template. This method cannot be used to update the
+        fields of a template.
+        The tag template fields are represented as
+        separate resources and should be updated using their own
+        create/update/delete methods. Users should enable the Data
+        Catalog API in the project identified by the
+        ``tag_template.name`` parameter (see [Data Catalog Resource
+        Project]
+        (https://cloud.google.com/data-catalog/docs/concepts/resource-project)
+        for more information).
+
+        Args:
+            request (:class:`~.datacatalog.UpdateTagTemplateRequest`):
+                The request object. Request message for
+                [UpdateTagTemplate][google.cloud.datacatalog.v1.DataCatalog.UpdateTagTemplate].
+            tag_template (:class:`~.tags.TagTemplate`):
+                Required. The template to update. The
+                "name" field must be set.
+                This corresponds to the ``tag_template`` field
+                on the ``request`` instance; if ``request`` is provided, this
+                should not be set.
+            update_mask (:class:`~.field_mask.FieldMask`):
+                The field mask specifies the parts of the template to
+                overwrite.
+
+                Allowed fields:
+
+                -  ``display_name``
+
+                If absent or empty, all of the allowed fields above will
+                be updated.
+                This corresponds to the ``update_mask`` field
+                on the ``request`` instance; if ``request`` is provided, this
+                should not be set.
+
+            retry (google.api_core.retry.Retry): Designation of what errors, if any,
+                should be retried.
+            timeout (float): The timeout for this request.
+            metadata (Sequence[Tuple[str, str]]): Strings which should be
+                sent along with the request as metadata.
+
+        Returns:
+            ~.tags.TagTemplate:
+                A tag template defines a tag, which can have one or more
+                typed fields. The template is used to create and attach
+                the tag to GCP resources. `Tag template
+                roles <https://cloud.google.com/data-catalog/docs/concepts/iam>`__
+                provide permissions to create, edit, and use the
+                template. See, for example, the `TagTemplate
+                User <https://cloud.google.com/data-catalog/docs/how-to/template-user>`__
+                role, which includes permission to use the tag template
+                to tag resources.
+
+        """
+        # Create or coerce a protobuf request object.
+        # Sanity check: If we got a request object, we should *not* have
+        # gotten any keyword arguments that map to the request.
+        if request is not None and any([tag_template, update_mask]):
+            raise ValueError(
+                "If the `request` argument is set, then none of "
+                "the individual field arguments should be set."
+            )
+
+        request = datacatalog.UpdateTagTemplateRequest(request)
+
+        # If we have keyword arguments corresponding to fields on the
+        # request, apply these.
+
+        if tag_template is not None:
+            request.tag_template = tag_template
+        if update_mask is not None:
+            request.update_mask = update_mask
+
+        # Wrap the RPC method; this adds retry and timeout information,
+        # and friendly error handling.
+        rpc = gapic_v1.method_async.wrap_method(
+            self._client._transport.update_tag_template,
+            default_timeout=None,
+            client_info=_client_info,
+        )
+
+        # Certain fields should be provided within the metadata header;
+        # add these here.
+        metadata = tuple(metadata) + (
+            gapic_v1.routing_header.to_grpc_metadata(
+                (("tag_template.name", request.tag_template.name),)
+            ),
+        )
+
+        # Send the request.
+        response = await rpc(request, retry=retry, timeout=timeout, metadata=metadata,)
+
+        # Done; return the response.
+        return response
+
+    async def delete_tag_template(
+        self,
+        request: datacatalog.DeleteTagTemplateRequest = None,
+        *,
+        name: str = None,
+        force: bool = None,
+        retry: retries.Retry = gapic_v1.method.DEFAULT,
+        timeout: float = None,
+        metadata: Sequence[Tuple[str, str]] = (),
+    ) -> None:
+        r"""Deletes a tag template and all tags using the template. Users
+        should enable the Data Catalog API in the project identified by
+        the ``name`` parameter (see [Data Catalog Resource Project]
+        (https://cloud.google.com/data-catalog/docs/concepts/resource-project)
+        for more information).
+
+        Args:
+            request (:class:`~.datacatalog.DeleteTagTemplateRequest`):
+                The request object.
+                Request message for
+                [DeleteTagTemplate][google.cloud.datacatalog.v1.DataCatalog.DeleteTagTemplate].
+            name (:class:`str`):
+                Required. The name of the tag template to delete.
+                Example:
+
+                -  projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id}
+                This corresponds to the ``name`` field
+                on the ``request`` instance; if ``request`` is provided, this
+                should not be set.
+            force (:class:`bool`):
+                Required. Currently, this field must always be set to
+                ``true``. This confirms the deletion of any possible
+                tags using this template. ``force = false`` will be
+                supported in the future.
+                This corresponds to the ``force`` field
+                on the ``request`` instance; if ``request`` is provided, this
+                should not be set.
+
+            retry (google.api_core.retry.Retry): Designation of what errors, if any,
+                should be retried.
+            timeout (float): The timeout for this request.
+            metadata (Sequence[Tuple[str, str]]): Strings which should be
+                sent along with the request as metadata.
+        """
+        # Create or coerce a protobuf request object.
+        # Sanity check: If we got a request object, we should *not* have
+        # gotten any keyword arguments that map to the request.
+        if request is not None and any([name, force]):
+            raise ValueError(
+                "If the `request` argument is set, then none of "
+                "the individual field arguments should be set."
+            )
+
+        request = datacatalog.DeleteTagTemplateRequest(request)
+
+        # If we have keyword arguments corresponding to fields on the
+        # request, apply these.
+
+        if name is not None:
+            request.name = name
+        if force is not None:
+            request.force = force
+
+        # Wrap the RPC method; this adds retry and timeout information,
+        # and friendly error handling.
+        rpc = gapic_v1.method_async.wrap_method(
+            self._client._transport.delete_tag_template,
+            default_timeout=None,
+            client_info=_client_info,
+        )
+
+        # Certain fields should be provided within the metadata header;
+        # add these here.
+        metadata = tuple(metadata) + (
+            gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+        )
+
+        # Send the request.
+        await rpc(
+            request, retry=retry, timeout=timeout, metadata=metadata,
+        )
+
+    async def create_tag_template_field(
+        self,
+        request: datacatalog.CreateTagTemplateFieldRequest = None,
+        *,
+        parent: str = None,
+        tag_template_field_id: str = None,
+        tag_template_field: tags.TagTemplateField = None,
+        retry: retries.Retry = gapic_v1.method.DEFAULT,
+        timeout: float = None,
+        metadata: Sequence[Tuple[str, str]] = (),
+    ) -> tags.TagTemplateField:
+        r"""Creates a field in a tag template. The user should enable the
+        Data Catalog API in the project identified by the ``parent``
+        parameter (see `Data Catalog Resource
+        Project <https://cloud.google.com/data-catalog/docs/concepts/resource-project>`__
+        for more information).
+
+        Args:
+            request (:class:`~.datacatalog.CreateTagTemplateFieldRequest`):
+                The request object. Request message for
+                [CreateTagTemplateField][google.cloud.datacatalog.v1.DataCatalog.CreateTagTemplateField].
+            parent (:class:`str`):
+                Required. The name of the project and the template
+                location
+                `region <https://cloud.google.com/data-catalog/docs/concepts/regions>`__.
+
+                Example:
+
+                -  projects/{project_id}/locations/us-central1/tagTemplates/{tag_template_id}
+                This corresponds to the ``parent`` field
+                on the ``request`` instance; if ``request`` is provided, this
+                should not be set.
+            tag_template_field_id (:class:`str`):
+                Required. The ID of the tag template field to create.
+                Field ids can contain letters (both uppercase and
+                lowercase), numbers (0-9), underscores (_) and dashes
+                (-). Field IDs must be at least 1 character long and at
+                most 128 characters long. Field IDs must also be unique
+                within their template.
+                This corresponds to the ``tag_template_field_id`` field
+                on the ``request`` instance; if ``request`` is provided, this
+                should not be set.
+            tag_template_field (:class:`~.tags.TagTemplateField`):
+                Required. The tag template field to
+                create.
+                This corresponds to the ``tag_template_field`` field
+                on the ``request`` instance; if ``request`` is provided, this
+                should not be set.
+
+            retry (google.api_core.retry.Retry): Designation of what errors, if any,
+                should be retried.
+            timeout (float): The timeout for this request.
+            metadata (Sequence[Tuple[str, str]]): Strings which should be
+                sent along with the request as metadata.
+
+        Returns:
+            ~.tags.TagTemplateField:
+                The template for an individual field
+                within a tag template.
+
+        """
+        # Create or coerce a protobuf request object.
+        # Sanity check: If we got a request object, we should *not* have
+        # gotten any keyword arguments that map to the request.
+        if request is not None and any(
+            [parent, tag_template_field_id, tag_template_field]
+        ):
+            raise ValueError(
+                "If the `request` argument is set, then none of "
+                "the individual field arguments should be set."
+            )
+
+        request = datacatalog.CreateTagTemplateFieldRequest(request)
+
+        # If we have keyword arguments corresponding to fields on the
+        # request, apply these.
+
+        if parent is not None:
+            request.parent = parent
+        if tag_template_field_id is not None:
+            request.tag_template_field_id = tag_template_field_id
+        if tag_template_field is not None:
+            request.tag_template_field = tag_template_field
+
+        # Wrap the RPC method; this adds retry and timeout information,
+        # and friendly error handling.
+        rpc = gapic_v1.method_async.wrap_method(
+            self._client._transport.create_tag_template_field,
+            default_timeout=None,
+            client_info=_client_info,
+        )
+
+        # Certain fields should be provided within the metadata header;
+        # add these here.
+        metadata = tuple(metadata) + (
+            gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
+        )
+
+        # Send the request.
+        response = await rpc(request, retry=retry, timeout=timeout, metadata=metadata,)
+
+        # Done; return the response.
+        return response
+
+    async def update_tag_template_field(
+        self,
+        request: datacatalog.UpdateTagTemplateFieldRequest = None,
+        *,
+        name: str = None,
+        tag_template_field: tags.TagTemplateField = None,
+        update_mask: field_mask.FieldMask = None,
+        retry: retries.Retry = gapic_v1.method.DEFAULT,
+        timeout: float = None,
+        metadata: Sequence[Tuple[str, str]] = (),
+    ) -> tags.TagTemplateField:
+        r"""Updates a field in a tag template. This method cannot be used to
+        update the field type. Users should enable the Data Catalog API
+        in the project identified by the ``name`` parameter (see [Data
+        Catalog Resource Project]
+        (https://cloud.google.com/data-catalog/docs/concepts/resource-project)
+        for more information).
+
+        Args:
+            request (:class:`~.datacatalog.UpdateTagTemplateFieldRequest`):
+                The request object. Request message for
+                [UpdateTagTemplateField][google.cloud.datacatalog.v1.DataCatalog.UpdateTagTemplateField].
+            name (:class:`str`):
+                Required. The name of the tag template field. Example:
+
+                -  projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id}/fields/{tag_template_field_id}
+                This corresponds to the ``name`` field
+                on the ``request`` instance; if ``request`` is provided, this
+                should not be set.
+            tag_template_field (:class:`~.tags.TagTemplateField`):
+                Required. The template to update.
+                This corresponds to the ``tag_template_field`` field
+                on the ``request`` instance; if ``request`` is provided, this
+                should not be set.
+            update_mask (:class:`~.field_mask.FieldMask`):
+                Optional. The field mask specifies the parts of the
+                template to be updated. Allowed fields:
+
+                -  ``display_name``
+                -  ``type.enum_type``
+                -  ``is_required``
+
+                If ``update_mask`` is not set or empty, all of the
+                allowed fields above will be updated.
+
+                When updating an enum type, the provided values will be
+                merged with the existing values. Therefore, enum values
+                can only be added, existing enum values cannot be
+                deleted nor renamed.
+                Updating a template field from
+                optional to required is NOT allowed.
+                This corresponds to the ``update_mask`` field
+                on the ``request`` instance; if ``request`` is provided, this
+                should not be set.
+
+            retry (google.api_core.retry.Retry): Designation of what errors, if any,
+                should be retried.
+            timeout (float): The timeout for this request.
+            metadata (Sequence[Tuple[str, str]]): Strings which should be
+                sent along with the request as metadata.
+
+        Returns:
+            ~.tags.TagTemplateField:
+                The template for an individual field
+                within a tag template.
+
+        """
+        # Create or coerce a protobuf request object.
+        # Sanity check: If we got a request object, we should *not* have
+        # gotten any keyword arguments that map to the request.
+        if request is not None and any([name, tag_template_field, update_mask]):
+            raise ValueError(
+                "If the `request` argument is set, then none of "
+                "the individual field arguments should be set."
+            )
+
+        request = datacatalog.UpdateTagTemplateFieldRequest(request)
+
+        # If we have keyword arguments corresponding to fields on the
+        # request, apply these.
+
+        if name is not None:
+            request.name = name
+        if tag_template_field is not None:
+            request.tag_template_field = tag_template_field
+        if update_mask is not None:
+            request.update_mask = update_mask
+
+        # Wrap the RPC method; this adds retry and timeout information,
+        # and friendly error handling.
+        rpc = gapic_v1.method_async.wrap_method(
+            self._client._transport.update_tag_template_field,
+            default_timeout=None,
+            client_info=_client_info,
+        )
+
+        # Certain fields should be provided within the metadata header;
+        # add these here.
+        metadata = tuple(metadata) + (
+            gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+        )
+
+        # Send the request.
+        response = await rpc(request, retry=retry, timeout=timeout, metadata=metadata,)
+
+        # Done; return the response.
+        return response
+
+    async def rename_tag_template_field(
+        self,
+        request: datacatalog.RenameTagTemplateFieldRequest = None,
+        *,
+        name: str = None,
+        new_tag_template_field_id: str = None,
+        retry: retries.Retry = gapic_v1.method.DEFAULT,
+        timeout: float = None,
+        metadata: Sequence[Tuple[str, str]] = (),
+    ) -> tags.TagTemplateField:
+        r"""Renames a field in a tag template. The user should enable the
+        Data Catalog API in the project identified by the ``name``
+        parameter (see `Data Catalog Resource
+        Project <https://cloud.google.com/data-catalog/docs/concepts/resource-project>`__
+        for more information).
+
+        Args:
+            request (:class:`~.datacatalog.RenameTagTemplateFieldRequest`):
+                The request object. Request message for
+                [RenameTagTemplateField][google.cloud.datacatalog.v1.DataCatalog.RenameTagTemplateField].
+            name (:class:`str`):
+                Required. The name of the tag template. Example:
+
+                -  projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id}/fields/{tag_template_field_id}
+                This corresponds to the ``name`` field
+                on the ``request`` instance; if ``request`` is provided, this
+                should not be set.
+            new_tag_template_field_id (:class:`str`):
+                Required. The new ID of this tag template field. For
+                example, ``my_new_field``.
+                This corresponds to the ``new_tag_template_field_id`` field
+                on the ``request`` instance; if ``request`` is provided, this
+                should not be set.
+
+            retry (google.api_core.retry.Retry): Designation of what errors, if any,
+                should be retried.
+            timeout (float): The timeout for this request.
+            metadata (Sequence[Tuple[str, str]]): Strings which should be
+                sent along with the request as metadata.
+
+        Returns:
+            ~.tags.TagTemplateField:
+                The template for an individual field
+                within a tag template.
+
+        """
+        # Create or coerce a protobuf request object.
+        # Sanity check: If we got a request object, we should *not* have
+        # gotten any keyword arguments that map to the request.
+ if request is not None and any([name, new_tag_template_field_id]): + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + request = datacatalog.RenameTagTemplateFieldRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if name is not None: + request.name = name + if new_tag_template_field_id is not None: + request.new_tag_template_field_id = new_tag_template_field_id + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = gapic_v1.method_async.wrap_method( + self._client._transport.rename_tag_template_field, + default_timeout=None, + client_info=_client_info, + ) + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)), + ) + + # Send the request. + response = await rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. + return response + + async def delete_tag_template_field( + self, + request: datacatalog.DeleteTagTemplateFieldRequest = None, + *, + name: str = None, + force: bool = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> None: + r"""Deletes a field in a tag template and all uses of that field. + Users should enable the Data Catalog API in the project + identified by the ``name`` parameter (see [Data Catalog Resource + Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + Args: + request (:class:`~.datacatalog.DeleteTagTemplateFieldRequest`): + The request object. Request message for + [DeleteTagTemplateField][google.cloud.datacatalog.v1.DataCatalog.DeleteTagTemplateField]. + name (:class:`str`): + Required. The name of the tag template field to delete. 
+ Example: + + - projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id}/fields/{tag_template_field_id} + This corresponds to the ``name`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + force (:class:`bool`): + Required. Currently, this field must always be set to + ``true``. This confirms the deletion of this field from + any tags using this field. ``force = false`` will be + supported in the future. + This corresponds to the ``force`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + if request is not None and any([name, force]): + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + request = datacatalog.DeleteTagTemplateFieldRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if name is not None: + request.name = name + if force is not None: + request.force = force + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = gapic_v1.method_async.wrap_method( + self._client._transport.delete_tag_template_field, + default_timeout=None, + client_info=_client_info, + ) + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)), + ) + + # Send the request. 
+ await rpc( + request, retry=retry, timeout=timeout, metadata=metadata, + ) + + async def create_tag( + self, + request: datacatalog.CreateTagRequest = None, + *, + parent: str = None, + tag: tags.Tag = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> tags.Tag: + r"""Creates a tag on an [Entry][google.cloud.datacatalog.v1.Entry]. + Note: The project identified by the ``parent`` parameter for the + `tag <https://cloud.google.com/data-catalog/docs/concepts/resource-project>`__ + and the `tag + template <https://cloud.google.com/data-catalog/docs/concepts/resource-project>`__ + used to create the tag must be from the same organization. + + Args: + request (:class:`~.datacatalog.CreateTagRequest`): + The request object. Request message for + [CreateTag][google.cloud.datacatalog.v1.DataCatalog.CreateTag]. + parent (:class:`str`): + Required. The name of the resource to attach this tag + to. Tags can be attached to Entries. Example: + + - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id} + + Note that this Tag and its child resources may not + actually be stored in the location in this name. + This corresponds to the ``parent`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + tag (:class:`~.tags.Tag`): + Required. The tag to create. + This corresponds to the ``tag`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.tags.Tag: + Tags are used to attach custom metadata to Data Catalog + resources. Tags conform to the specifications within + their tag template. + + See `Data Catalog + IAM <https://cloud.google.com/data-catalog/docs/concepts/iam>`__ + for information on the permissions needed to create or + view tags. + + """ + # Create or coerce a protobuf request object.
+ # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + if request is not None and any([parent, tag]): + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + request = datacatalog.CreateTagRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if parent is not None: + request.parent = parent + if tag is not None: + request.tag = tag + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = gapic_v1.method_async.wrap_method( + self._client._transport.create_tag, + default_timeout=None, + client_info=_client_info, + ) + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)), + ) + + # Send the request. + response = await rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. + return response + + async def update_tag( + self, + request: datacatalog.UpdateTagRequest = None, + *, + tag: tags.Tag = None, + update_mask: field_mask.FieldMask = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> tags.Tag: + r"""Updates an existing tag. + + Args: + request (:class:`~.datacatalog.UpdateTagRequest`): + The request object. Request message for + [UpdateTag][google.cloud.datacatalog.v1.DataCatalog.UpdateTag]. + tag (:class:`~.tags.Tag`): + Required. The updated tag. The "name" + field must be set. + This corresponds to the ``tag`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + update_mask (:class:`~.field_mask.FieldMask`): + The fields to update on the Tag. If absent or empty, all + modifiable fields are updated. 
Currently the only + modifiable field is the field ``fields``. + This corresponds to the ``update_mask`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.tags.Tag: + Tags are used to attach custom metadata to Data Catalog + resources. Tags conform to the specifications within + their tag template. + + See `Data Catalog + IAM <https://cloud.google.com/data-catalog/docs/concepts/iam>`__ + for information on the permissions needed to create or + view tags. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + if request is not None and any([tag, update_mask]): + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + request = datacatalog.UpdateTagRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if tag is not None: + request.tag = tag + if update_mask is not None: + request.update_mask = update_mask + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = gapic_v1.method_async.wrap_method( + self._client._transport.update_tag, + default_timeout=None, + client_info=_client_info, + ) + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("tag.name", request.tag.name),)), + ) + + # Send the request. + response = await rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response.
+ return response + + async def delete_tag( + self, + request: datacatalog.DeleteTagRequest = None, + *, + name: str = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> None: + r"""Deletes a tag. + + Args: + request (:class:`~.datacatalog.DeleteTagRequest`): + The request object. Request message for + [DeleteTag][google.cloud.datacatalog.v1.DataCatalog.DeleteTag]. + name (:class:`str`): + Required. The name of the tag to delete. Example: + + - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id}/tags/{tag_id} + This corresponds to the ``name`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + if request is not None and any([name]): + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + request = datacatalog.DeleteTagRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if name is not None: + request.name = name + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = gapic_v1.method_async.wrap_method( + self._client._transport.delete_tag, + default_timeout=None, + client_info=_client_info, + ) + + # Certain fields should be provided within the metadata header; + # add these here. 
+ metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)), + ) + + # Send the request. + await rpc( + request, retry=retry, timeout=timeout, metadata=metadata, + ) + + async def list_tags( + self, + request: datacatalog.ListTagsRequest = None, + *, + parent: str = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> pagers.ListTagsAsyncPager: + r"""Lists the tags on an [Entry][google.cloud.datacatalog.v1.Entry]. + + Args: + request (:class:`~.datacatalog.ListTagsRequest`): + The request object. Request message for + [ListTags][google.cloud.datacatalog.v1.DataCatalog.ListTags]. + parent (:class:`str`): + Required. The name of the Data Catalog resource to list + the tags of. The resource could be an + [Entry][google.cloud.datacatalog.v1.Entry] or an + [EntryGroup][google.cloud.datacatalog.v1.EntryGroup]. + + Examples: + + - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id} + - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id} + This corresponds to the ``parent`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.pagers.ListTagsAsyncPager: + Response message for + [ListTags][google.cloud.datacatalog.v1.DataCatalog.ListTags]. + + Iterating over this object will yield results and + resolve additional pages automatically. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. 
+ if request is not None and any([parent]): + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + request = datacatalog.ListTagsRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if parent is not None: + request.parent = parent + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = gapic_v1.method_async.wrap_method( + self._client._transport.list_tags, + default_retry=retries.Retry( + initial=0.1, + maximum=60.0, + multiplier=1.3, + predicate=retries.if_exception_type(exceptions.ServiceUnavailable,), + ), + default_timeout=60.0, + client_info=_client_info, + ) + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)), + ) + + # Send the request. + response = await rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # This method is paged; wrap the response in a pager, which provides + # an `__aiter__` convenience method. + response = pagers.ListTagsAsyncPager( + method=rpc, request=request, response=response, metadata=metadata, + ) + + # Done; return the response. + return response + + async def set_iam_policy( + self, + request: iam_policy.SetIamPolicyRequest = None, + *, + resource: str = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> policy.Policy: + r"""Sets the access control policy for a resource. Replaces any + existing policy. Supported resources are: + + - Tag templates. + - Entries. + - Entry groups. Note, this method cannot be used to manage + policies for BigQuery, Pub/Sub and any external Google Cloud + Platform resources synced to Data Catalog. 
+ + Callers must have following Google IAM permission + + - ``datacatalog.tagTemplates.setIamPolicy`` to set policies on + tag templates. + - ``datacatalog.entries.setIamPolicy`` to set policies on + entries. + - ``datacatalog.entryGroups.setIamPolicy`` to set policies on + entry groups. + + Args: + request (:class:`~.iam_policy.SetIamPolicyRequest`): + The request object. Request message for `SetIamPolicy` + method. + resource (:class:`str`): + REQUIRED: The resource for which the + policy is being specified. See the + operation documentation for the + appropriate value for this field. + This corresponds to the ``resource`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.policy.Policy: + Defines an Identity and Access Management (IAM) policy. + It is used to specify access control policies for Cloud + Platform resources. + + A ``Policy`` is a collection of ``bindings``. A + ``binding`` binds one or more ``members`` to a single + ``role``. Members can be user accounts, service + accounts, Google groups, and domains (such as G Suite). + A ``role`` is a named list of permissions (defined by + IAM or configured by users). A ``binding`` can + optionally specify a ``condition``, which is a logic + expression that further constrains the role binding + based on attributes about the request and/or target + resource. 
+ + **JSON Example** + + :: + + { + "bindings": [ + { + "role": "roles/resourcemanager.organizationAdmin", + "members": [ + "user:mike@example.com", + "group:admins@example.com", + "domain:google.com", + "serviceAccount:my-project-id@appspot.gserviceaccount.com" + ] + }, + { + "role": "roles/resourcemanager.organizationViewer", + "members": ["user:eve@example.com"], + "condition": { + "title": "expirable access", + "description": "Does not grant access after Sep 2020", + "expression": "request.time < + timestamp('2020-10-01T00:00:00.000Z')", + } + } + ] + } + + **YAML Example** + + :: + + bindings: + - members: + - user:mike@example.com + - group:admins@example.com + - domain:google.com + - serviceAccount:my-project-id@appspot.gserviceaccount.com + role: roles/resourcemanager.organizationAdmin + - members: + - user:eve@example.com + role: roles/resourcemanager.organizationViewer + condition: + title: expirable access + description: Does not grant access after Sep 2020 + expression: request.time < timestamp('2020-10-01T00:00:00.000Z') + + For a description of IAM and its features, see the `IAM + developer's + guide <https://cloud.google.com/iam/docs>`__. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + if request is not None and any([resource]): + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + # The request isn't a proto-plus wrapped type, + # so it must be constructed via keyword expansion. + if isinstance(request, dict): + request = iam_policy.SetIamPolicyRequest(**request) + + elif not request: + request = iam_policy.SetIamPolicyRequest() + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if resource is not None: + request.resource = resource + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method( + self._client._transport.set_iam_policy, + default_timeout=None, + client_info=_client_info, + ) + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("resource", request.resource),)), + ) + + # Send the request. + response = await rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. + return response + + async def get_iam_policy( + self, + request: iam_policy.GetIamPolicyRequest = None, + *, + resource: str = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> policy.Policy: + r"""Gets the access control policy for a resource. A ``NOT_FOUND`` + error is returned if the resource does not exist. An empty + policy is returned if the resource exists but does not have a + policy set on it. + + Supported resources are: + + - Tag templates. + - Entries. + - Entry groups. Note, this method cannot be used to manage + policies for BigQuery, Pub/Sub and any external Google Cloud + Platform resources synced to Data Catalog. + + Callers must have following Google IAM permission + + - ``datacatalog.tagTemplates.getIamPolicy`` to get policies on + tag templates. + - ``datacatalog.entries.getIamPolicy`` to get policies on + entries. + - ``datacatalog.entryGroups.getIamPolicy`` to get policies on + entry groups. + + Args: + request (:class:`~.iam_policy.GetIamPolicyRequest`): + The request object. Request message for `GetIamPolicy` + method. + resource (:class:`str`): + REQUIRED: The resource for which the + policy is being requested. See the + operation documentation for the + appropriate value for this field. + This corresponds to the ``resource`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. 
+ + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.policy.Policy: + Defines an Identity and Access Management (IAM) policy. + It is used to specify access control policies for Cloud + Platform resources. + + A ``Policy`` is a collection of ``bindings``. A + ``binding`` binds one or more ``members`` to a single + ``role``. Members can be user accounts, service + accounts, Google groups, and domains (such as G Suite). + A ``role`` is a named list of permissions (defined by + IAM or configured by users). A ``binding`` can + optionally specify a ``condition``, which is a logic + expression that further constrains the role binding + based on attributes about the request and/or target + resource. + + **JSON Example** + + :: + + { + "bindings": [ + { + "role": "roles/resourcemanager.organizationAdmin", + "members": [ + "user:mike@example.com", + "group:admins@example.com", + "domain:google.com", + "serviceAccount:my-project-id@appspot.gserviceaccount.com" + ] + }, + { + "role": "roles/resourcemanager.organizationViewer", + "members": ["user:eve@example.com"], + "condition": { + "title": "expirable access", + "description": "Does not grant access after Sep 2020", + "expression": "request.time < + timestamp('2020-10-01T00:00:00.000Z')", + } + } + ] + } + + **YAML Example** + + :: + + bindings: + - members: + - user:mike@example.com + - group:admins@example.com + - domain:google.com + - serviceAccount:my-project-id@appspot.gserviceaccount.com + role: roles/resourcemanager.organizationAdmin + - members: + - user:eve@example.com + role: roles/resourcemanager.organizationViewer + condition: + title: expirable access + description: Does not grant access after Sep 2020 + expression: request.time < timestamp('2020-10-01T00:00:00.000Z') + + For a description of IAM 
and its features, see the `IAM + developer's + guide <https://cloud.google.com/iam/docs>`__. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + if request is not None and any([resource]): + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + # The request isn't a proto-plus wrapped type, + # so it must be constructed via keyword expansion. + if isinstance(request, dict): + request = iam_policy.GetIamPolicyRequest(**request) + + elif not request: + request = iam_policy.GetIamPolicyRequest() + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if resource is not None: + request.resource = resource + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = gapic_v1.method_async.wrap_method( + self._client._transport.get_iam_policy, + default_retry=retries.Retry( + initial=0.1, + maximum=60.0, + multiplier=1.3, + predicate=retries.if_exception_type(exceptions.ServiceUnavailable,), + ), + default_timeout=60.0, + client_info=_client_info, + ) + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("resource", request.resource),)), + ) + + # Send the request. + response = await rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. + return response + + async def test_iam_permissions( + self, + request: iam_policy.TestIamPermissionsRequest = None, + *, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> iam_policy.TestIamPermissionsResponse: + r"""Returns the caller's permissions on a resource.
If the resource + does not exist, an empty set of permissions is returned (We + don't return a ``NOT_FOUND`` error). + + Supported resources are: + + - Tag templates. + - Entries. + - Entry groups. Note, this method cannot be used to manage + policies for BigQuery, Pub/Sub and any external Google Cloud + Platform resources synced to Data Catalog. + + A caller is not required to have Google IAM permission to make + this request. + + Args: + request (:class:`~.iam_policy.TestIamPermissionsRequest`): + The request object. Request message for + `TestIamPermissions` method. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.iam_policy.TestIamPermissionsResponse: + Response message for ``TestIamPermissions`` method. + """ + # Create or coerce a protobuf request object. + + # The request isn't a proto-plus wrapped type, + # so it must be constructed via keyword expansion. + if isinstance(request, dict): + request = iam_policy.TestIamPermissionsRequest(**request) + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = gapic_v1.method_async.wrap_method( + self._client._transport.test_iam_permissions, + default_timeout=None, + client_info=_client_info, + ) + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("resource", request.resource),)), + ) + + # Send the request. + response = await rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. 
+ return response + + +try: + _client_info = gapic_v1.client_info.ClientInfo( + gapic_version=pkg_resources.get_distribution( + "google-cloud-datacatalog", + ).version, + ) +except pkg_resources.DistributionNotFound: + _client_info = gapic_v1.client_info.ClientInfo() + + +__all__ = ("DataCatalogAsyncClient",) diff --git a/google/cloud/datacatalog_v1/services/data_catalog/client.py b/google/cloud/datacatalog_v1/services/data_catalog/client.py new file mode 100644 index 00000000..b0e61bf2 --- /dev/null +++ b/google/cloud/datacatalog_v1/services/data_catalog/client.py @@ -0,0 +1,2913 @@ +# -*- coding: utf-8 -*- + +# Copyright 2020 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
+# + +from collections import OrderedDict +import os +import re +from typing import Callable, Dict, Sequence, Tuple, Type, Union +import pkg_resources + +import google.api_core.client_options as ClientOptions # type: ignore +from google.api_core import exceptions # type: ignore +from google.api_core import gapic_v1 # type: ignore +from google.api_core import retry as retries # type: ignore +from google.auth import credentials # type: ignore +from google.auth.transport import mtls # type: ignore +from google.auth.exceptions import MutualTLSChannelError # type: ignore +from google.oauth2 import service_account # type: ignore + +from google.cloud.datacatalog_v1.services.data_catalog import pagers +from google.cloud.datacatalog_v1.types import common +from google.cloud.datacatalog_v1.types import datacatalog +from google.cloud.datacatalog_v1.types import gcs_fileset_spec +from google.cloud.datacatalog_v1.types import schema +from google.cloud.datacatalog_v1.types import search +from google.cloud.datacatalog_v1.types import table_spec +from google.cloud.datacatalog_v1.types import tags +from google.cloud.datacatalog_v1.types import timestamps +from google.iam.v1 import iam_policy_pb2 as iam_policy # type: ignore +from google.iam.v1 import policy_pb2 as policy # type: ignore +from google.protobuf import field_mask_pb2 as field_mask # type: ignore + +from .transports.base import DataCatalogTransport +from .transports.grpc import DataCatalogGrpcTransport +from .transports.grpc_asyncio import DataCatalogGrpcAsyncIOTransport + + +class DataCatalogClientMeta(type): + """Metaclass for the DataCatalog client. + + This provides class-level methods for building and retrieving + support objects (e.g. transport) without polluting the client instance + objects. 
+ """ + + _transport_registry = OrderedDict() # type: Dict[str, Type[DataCatalogTransport]] + _transport_registry["grpc"] = DataCatalogGrpcTransport + _transport_registry["grpc_asyncio"] = DataCatalogGrpcAsyncIOTransport + + def get_transport_class(cls, label: str = None,) -> Type[DataCatalogTransport]: + """Return an appropriate transport class. + + Args: + label: The name of the desired transport. If none is + provided, then the first transport in the registry is used. + + Returns: + The transport class to use. + """ + # If a specific transport is requested, return that one. + if label: + return cls._transport_registry[label] + + # No transport is requested; return the default (that is, the first one + # in the dictionary). + return next(iter(cls._transport_registry.values())) + + +class DataCatalogClient(metaclass=DataCatalogClientMeta): + """Data Catalog API service allows clients to discover, + understand, and manage their data. + """ + + @staticmethod + def _get_default_mtls_endpoint(api_endpoint): + """Convert api endpoint to mTLS endpoint. + Convert "*.sandbox.googleapis.com" and "*.googleapis.com" to + "*.mtls.sandbox.googleapis.com" and "*.mtls.googleapis.com" respectively. + Args: + api_endpoint (Optional[str]): the api endpoint to convert. + Returns: + str: converted mTLS api endpoint. + """ + if not api_endpoint: + return api_endpoint + + mtls_endpoint_re = re.compile( + r"(?P[^.]+)(?P\.mtls)?(?P\.sandbox)?(?P\.googleapis\.com)?" 
+ ) + + m = mtls_endpoint_re.match(api_endpoint) + name, mtls, sandbox, googledomain = m.groups() + if mtls or not googledomain: + return api_endpoint + + if sandbox: + return api_endpoint.replace( + "sandbox.googleapis.com", "mtls.sandbox.googleapis.com" + ) + + return api_endpoint.replace(".googleapis.com", ".mtls.googleapis.com") + + DEFAULT_ENDPOINT = "datacatalog.googleapis.com" + DEFAULT_MTLS_ENDPOINT = _get_default_mtls_endpoint.__func__( # type: ignore + DEFAULT_ENDPOINT + ) + + @classmethod + def from_service_account_file(cls, filename: str, *args, **kwargs): + """Creates an instance of this client using the provided credentials + file. + + Args: + filename (str): The path to the service account private key json + file. + args: Additional arguments to pass to the constructor. + kwargs: Additional arguments to pass to the constructor. + + Returns: + {@api.name}: The constructed client. + """ + credentials = service_account.Credentials.from_service_account_file(filename) + kwargs["credentials"] = credentials + return cls(*args, **kwargs) + + from_service_account_json = from_service_account_file + + @staticmethod + def entry_path(project: str, location: str, entry_group: str, entry: str,) -> str: + """Return a fully-qualified entry string.""" + return "projects/{project}/locations/{location}/entryGroups/{entry_group}/entries/{entry}".format( + project=project, location=location, entry_group=entry_group, entry=entry, + ) + + @staticmethod + def parse_entry_path(path: str) -> Dict[str, str]: + """Parse an entry path into its component segments.""" + m = re.match( + r"^projects/(?P<project>.+?)/locations/(?P<location>.+?)/entryGroups/(?P<entry_group>.+?)/entries/(?P<entry>.+?)$", + path, + ) + return m.groupdict() if m else {} + + @staticmethod + def entry_group_path(project: str, location: str, entry_group: str,) -> str: + """Return a fully-qualified entry_group string.""" + return "projects/{project}/locations/{location}/entryGroups/{entry_group}".format( + project=project, location=location,
entry_group=entry_group, + ) + + @staticmethod + def parse_entry_group_path(path: str) -> Dict[str, str]: + """Parse a entry_group path into its component segments.""" + m = re.match( + r"^projects/(?P.+?)/locations/(?P.+?)/entryGroups/(?P.+?)$", + path, + ) + return m.groupdict() if m else {} + + @staticmethod + def tag_path( + project: str, location: str, entry_group: str, entry: str, tag: str, + ) -> str: + """Return a fully-qualified tag string.""" + return "projects/{project}/locations/{location}/entryGroups/{entry_group}/entries/{entry}/tags/{tag}".format( + project=project, + location=location, + entry_group=entry_group, + entry=entry, + tag=tag, + ) + + @staticmethod + def parse_tag_path(path: str) -> Dict[str, str]: + """Parse a tag path into its component segments.""" + m = re.match( + r"^projects/(?P.+?)/locations/(?P.+?)/entryGroups/(?P.+?)/entries/(?P.+?)/tags/(?P.+?)$", + path, + ) + return m.groupdict() if m else {} + + @staticmethod + def tag_template_path(project: str, location: str, tag_template: str,) -> str: + """Return a fully-qualified tag_template string.""" + return "projects/{project}/locations/{location}/tagTemplates/{tag_template}".format( + project=project, location=location, tag_template=tag_template, + ) + + @staticmethod + def parse_tag_template_path(path: str) -> Dict[str, str]: + """Parse a tag_template path into its component segments.""" + m = re.match( + r"^projects/(?P.+?)/locations/(?P.+?)/tagTemplates/(?P.+?)$", + path, + ) + return m.groupdict() if m else {} + + @staticmethod + def tag_template_field_path( + project: str, location: str, tag_template: str, field: str, + ) -> str: + """Return a fully-qualified tag_template_field string.""" + return "projects/{project}/locations/{location}/tagTemplates/{tag_template}/fields/{field}".format( + project=project, location=location, tag_template=tag_template, field=field, + ) + + @staticmethod + def parse_tag_template_field_path(path: str) -> Dict[str, str]: + """Parse a 
tag_template_field path into its component segments.""" + m = re.match( + r"^projects/(?P.+?)/locations/(?P.+?)/tagTemplates/(?P.+?)/fields/(?P.+?)$", + path, + ) + return m.groupdict() if m else {} + + def __init__( + self, + *, + credentials: credentials.Credentials = None, + transport: Union[str, DataCatalogTransport] = None, + client_options: ClientOptions = None, + ) -> None: + """Instantiate the data catalog client. + + Args: + credentials (Optional[google.auth.credentials.Credentials]): The + authorization credentials to attach to requests. These + credentials identify the application to the service; if none + are specified, the client will attempt to ascertain the + credentials from the environment. + transport (Union[str, ~.DataCatalogTransport]): The + transport to use. If set to None, a transport is chosen + automatically. + client_options (ClientOptions): Custom options for the client. It + won't take effect if a ``transport`` instance is provided. + (1) The ``api_endpoint`` property can be used to override the + default endpoint provided by the client. GOOGLE_API_USE_MTLS + environment variable can also be used to override the endpoint: + "always" (always use the default mTLS endpoint), "never" (always + use the default regular endpoint, this is the default value for + the environment variable) and "auto" (auto switch to the default + mTLS endpoint if client SSL credentials is present). However, + the ``api_endpoint`` property takes precedence if provided. + (2) The ``client_cert_source`` property is used to provide client + SSL credentials for mutual TLS transport. If not provided, the + default SSL credentials will be used if present. + + Raises: + google.auth.exceptions.MutualTLSChannelError: If mutual TLS transport + creation failed for any reason. 
+        """
+        if isinstance(client_options, dict):
+            client_options = ClientOptions.from_dict(client_options)
+        if client_options is None:
+            client_options = ClientOptions.ClientOptions()
+
+        if client_options.api_endpoint is None:
+            use_mtls_env = os.getenv("GOOGLE_API_USE_MTLS", "never")
+            if use_mtls_env == "never":
+                client_options.api_endpoint = self.DEFAULT_ENDPOINT
+            elif use_mtls_env == "always":
+                client_options.api_endpoint = self.DEFAULT_MTLS_ENDPOINT
+            elif use_mtls_env == "auto":
+                has_client_cert_source = (
+                    client_options.client_cert_source is not None
+                    or mtls.has_default_client_cert_source()
+                )
+                client_options.api_endpoint = (
+                    self.DEFAULT_MTLS_ENDPOINT
+                    if has_client_cert_source
+                    else self.DEFAULT_ENDPOINT
+                )
+            else:
+                raise MutualTLSChannelError(
+                    "Unsupported GOOGLE_API_USE_MTLS value. Accepted values: never, auto, always"
+                )
+
+        # Save or instantiate the transport.
+        # Ordinarily, we provide the transport, but allowing a custom transport
+        # instance provides an extensibility point for unusual situations.
+        if isinstance(transport, DataCatalogTransport):
+            # transport is a DataCatalogTransport instance.
+            if credentials or client_options.credentials_file:
+                raise ValueError(
+                    "When providing a transport instance, "
+                    "provide its credentials directly."
+                )
+            if client_options.scopes:
+                raise ValueError(
+                    "When providing a transport instance, "
+                    "provide its scopes directly."
+                )
+            self._transport = transport
+        else:
+            Transport = type(self).get_transport_class(transport)
+            self._transport = Transport(
+                credentials=credentials,
+                credentials_file=client_options.credentials_file,
+                host=client_options.api_endpoint,
+                scopes=client_options.scopes,
+                api_mtls_endpoint=client_options.api_endpoint,
+                client_cert_source=client_options.client_cert_source,
+                quota_project_id=client_options.quota_project_id,
+            )
+
+    def search_catalog(
+        self,
+        request: datacatalog.SearchCatalogRequest = None,
+        *,
+        scope: datacatalog.SearchCatalogRequest.Scope = None,
+        query: str = None,
+        retry: retries.Retry = gapic_v1.method.DEFAULT,
+        timeout: float = None,
+        metadata: Sequence[Tuple[str, str]] = (),
+    ) -> pagers.SearchCatalogPager:
+        r"""Searches Data Catalog for multiple resources like entries, tags
+        that match a query.
+
+        This is a custom method
+        (https://cloud.google.com/apis/design/custom_methods) and does
+        not return the complete resource, only the resource identifier
+        and high level fields. Clients can subsequently call ``Get``
+        methods.
+
+        Note that Data Catalog search queries do not guarantee full
+        recall. Query results that match your query may not be returned,
+        even in subsequent result pages. Also note that results returned
+        (and not returned) can vary across repeated search queries.
+
+        See `Data Catalog Search
+        Syntax <https://cloud.google.com/data-catalog/docs/how-to/search-reference>`__
+        for more information.
+
+        Args:
+            request (:class:`~.datacatalog.SearchCatalogRequest`):
+                The request object. Request message for
+                [SearchCatalog][google.cloud.datacatalog.v1.DataCatalog.SearchCatalog].
+            scope (:class:`~.datacatalog.SearchCatalogRequest.Scope`):
+                Required. The scope of this search request. A ``scope``
+                that has empty ``include_org_ids``,
+                ``include_project_ids`` AND false
+                ``include_gcp_public_datasets`` is considered invalid.
+                Data Catalog will return an error in such a case.
+                This corresponds to the ``scope`` field
+                on the ``request`` instance; if ``request`` is provided, this
+                should not be set.
+            query (:class:`str`):
+                Required. The query string in search query syntax. The
+                query must be non-empty.
+
+                Query strings can be as simple as "x" or more qualified as:
+
+                -  name:x
+                -  column:x
+                -  description:y
+
+                Note: Query tokens need to have a minimum of 3
+                characters for substring matching to work correctly. See
+                `Data Catalog Search
+                Syntax <https://cloud.google.com/data-catalog/docs/how-to/search-reference>`__
+                for more information.
+                This corresponds to the ``query`` field
+                on the ``request`` instance; if ``request`` is provided, this
+                should not be set.
+
+            retry (google.api_core.retry.Retry): Designation of what errors, if any,
+                should be retried.
+            timeout (float): The timeout for this request.
+            metadata (Sequence[Tuple[str, str]]): Strings which should be
+                sent along with the request as metadata.
+
+        Returns:
+            ~.pagers.SearchCatalogPager:
+                Response message for
+                [SearchCatalog][google.cloud.datacatalog.v1.DataCatalog.SearchCatalog].
+
+                Iterating over this object will yield results and
+                resolve additional pages automatically.
+
+        """
+        # Create or coerce a protobuf request object.
+        # Sanity check: If we got a request object, we should *not* have
+        # gotten any keyword arguments that map to the request.
+        has_flattened_params = any([scope, query])
+        if request is not None and has_flattened_params:
+            raise ValueError(
+                "If the `request` argument is set, then none of "
+                "the individual field arguments should be set."
+            )
+
+        # Minor optimization to avoid making a copy if the user passes
+        # in a datacatalog.SearchCatalogRequest.
+        # There's no risk of modifying the input as we've already verified
+        # there are no flattened fields.
+        if not isinstance(request, datacatalog.SearchCatalogRequest):
+            request = datacatalog.SearchCatalogRequest(request)
+
+        # If we have keyword arguments corresponding to fields on the
+        # request, apply these.
+
+        if scope is not None:
+            request.scope = scope
+        if query is not None:
+            request.query = query
+
+        # Wrap the RPC method; this adds retry and timeout information,
+        # and friendly error handling.
+        rpc = self._transport._wrapped_methods[self._transport.search_catalog]
+
+        # Send the request.
+        response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,)
+
+        # This method is paged; wrap the response in a pager, which provides
+        # an `__iter__` convenience method.
+        response = pagers.SearchCatalogPager(
+            method=rpc, request=request, response=response, metadata=metadata,
+        )
+
+        # Done; return the response.
+        return response
+
+    def create_entry_group(
+        self,
+        request: datacatalog.CreateEntryGroupRequest = None,
+        *,
+        parent: str = None,
+        entry_group_id: str = None,
+        entry_group: datacatalog.EntryGroup = None,
+        retry: retries.Retry = gapic_v1.method.DEFAULT,
+        timeout: float = None,
+        metadata: Sequence[Tuple[str, str]] = (),
+    ) -> datacatalog.EntryGroup:
+        r"""Creates an EntryGroup.
+
+        An entry group contains logically related entries together with
+        Cloud Identity and Access Management policies that specify the
+        users who can create, edit, and view entries within the entry
+        group.
+
+        Data Catalog automatically creates an entry group for BigQuery
+        entries ("@bigquery") and Pub/Sub topics ("@pubsub"). Users
+        create their own entry group to contain Cloud Storage fileset
+        entries or custom type entries, and the IAM policies associated
+        with those entries. Entry groups, like entries, can be searched.
+
+        A maximum of 10,000 entry groups may be created per organization
+        across all locations.
+
+        Users should enable the Data Catalog API in the project
+        identified by the ``parent`` parameter (see [Data Catalog
+        Resource Project]
+        (https://cloud.google.com/data-catalog/docs/concepts/resource-project)
+        for more information).
+
+        Args:
+            request (:class:`~.datacatalog.CreateEntryGroupRequest`):
+                The request object. Request message for
+                [CreateEntryGroup][google.cloud.datacatalog.v1.DataCatalog.CreateEntryGroup].
+            parent (:class:`str`):
+                Required. The name of the project this entry group is
+                in. Example:
+
+                -  projects/{project_id}/locations/{location}
+
+                Note that this EntryGroup and its child resources may
+                not actually be stored in the location in this name.
+                This corresponds to the ``parent`` field
+                on the ``request`` instance; if ``request`` is provided, this
+                should not be set.
+            entry_group_id (:class:`str`):
+                Required. The id of the entry group
+                to create. The id must begin with a
+                letter or underscore, contain only
+                English letters, numbers and
+                underscores, and be at most 64
+                characters.
+                This corresponds to the ``entry_group_id`` field
+                on the ``request`` instance; if ``request`` is provided, this
+                should not be set.
+            entry_group (:class:`~.datacatalog.EntryGroup`):
+                The entry group to create. Defaults
+                to an empty entry group.
+                This corresponds to the ``entry_group`` field
+                on the ``request`` instance; if ``request`` is provided, this
+                should not be set.
+
+            retry (google.api_core.retry.Retry): Designation of what errors, if any,
+                should be retried.
+            timeout (float): The timeout for this request.
+            metadata (Sequence[Tuple[str, str]]): Strings which should be
+                sent along with the request as metadata.
+
+        Returns:
+            ~.datacatalog.EntryGroup:
+                EntryGroup Metadata. An EntryGroup resource represents a
+                logical grouping of zero or more Data Catalog
+                [Entry][google.cloud.datacatalog.v1.Entry] resources.
+
+        """
+        # Create or coerce a protobuf request object.
+        # Sanity check: If we got a request object, we should *not* have
+        # gotten any keyword arguments that map to the request.
+        has_flattened_params = any([parent, entry_group_id, entry_group])
+        if request is not None and has_flattened_params:
+            raise ValueError(
+                "If the `request` argument is set, then none of "
+                "the individual field arguments should be set."
+            )
+
+        # Minor optimization to avoid making a copy if the user passes
+        # in a datacatalog.CreateEntryGroupRequest.
+        # There's no risk of modifying the input as we've already verified
+        # there are no flattened fields.
+        if not isinstance(request, datacatalog.CreateEntryGroupRequest):
+            request = datacatalog.CreateEntryGroupRequest(request)
+
+        # If we have keyword arguments corresponding to fields on the
+        # request, apply these.
+
+        if parent is not None:
+            request.parent = parent
+        if entry_group_id is not None:
+            request.entry_group_id = entry_group_id
+        if entry_group is not None:
+            request.entry_group = entry_group
+
+        # Wrap the RPC method; this adds retry and timeout information,
+        # and friendly error handling.
+        rpc = self._transport._wrapped_methods[self._transport.create_entry_group]
+
+        # Certain fields should be provided within the metadata header;
+        # add these here.
+        metadata = tuple(metadata) + (
+            gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
+        )
+
+        # Send the request.
+        response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,)
+
+        # Done; return the response.
+        return response
+
+    def get_entry_group(
+        self,
+        request: datacatalog.GetEntryGroupRequest = None,
+        *,
+        name: str = None,
+        read_mask: field_mask.FieldMask = None,
+        retry: retries.Retry = gapic_v1.method.DEFAULT,
+        timeout: float = None,
+        metadata: Sequence[Tuple[str, str]] = (),
+    ) -> datacatalog.EntryGroup:
+        r"""Gets an EntryGroup.
+
+        Args:
+            request (:class:`~.datacatalog.GetEntryGroupRequest`):
+                The request object. Request message for
+                [GetEntryGroup][google.cloud.datacatalog.v1.DataCatalog.GetEntryGroup].
+            name (:class:`str`):
+                Required. The name of the entry group. For example,
+                ``projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}``.
+                This corresponds to the ``name`` field
+                on the ``request`` instance; if ``request`` is provided, this
+                should not be set.
+            read_mask (:class:`~.field_mask.FieldMask`):
+                The fields to return. If not set or
+                empty, all fields are returned.
+                This corresponds to the ``read_mask`` field
+                on the ``request`` instance; if ``request`` is provided, this
+                should not be set.
+
+            retry (google.api_core.retry.Retry): Designation of what errors, if any,
+                should be retried.
+            timeout (float): The timeout for this request.
+            metadata (Sequence[Tuple[str, str]]): Strings which should be
+                sent along with the request as metadata.
+
+        Returns:
+            ~.datacatalog.EntryGroup:
+                EntryGroup Metadata. An EntryGroup resource represents a
+                logical grouping of zero or more Data Catalog
+                [Entry][google.cloud.datacatalog.v1.Entry] resources.
+
+        """
+        # Create or coerce a protobuf request object.
+        # Sanity check: If we got a request object, we should *not* have
+        # gotten any keyword arguments that map to the request.
+        has_flattened_params = any([name, read_mask])
+        if request is not None and has_flattened_params:
+            raise ValueError(
+                "If the `request` argument is set, then none of "
+                "the individual field arguments should be set."
+            )
+
+        # Minor optimization to avoid making a copy if the user passes
+        # in a datacatalog.GetEntryGroupRequest.
+        # There's no risk of modifying the input as we've already verified
+        # there are no flattened fields.
+        if not isinstance(request, datacatalog.GetEntryGroupRequest):
+            request = datacatalog.GetEntryGroupRequest(request)
+
+        # If we have keyword arguments corresponding to fields on the
+        # request, apply these.
+
+        if name is not None:
+            request.name = name
+        if read_mask is not None:
+            request.read_mask = read_mask
+
+        # Wrap the RPC method; this adds retry and timeout information,
+        # and friendly error handling.
+        rpc = self._transport._wrapped_methods[self._transport.get_entry_group]
+
+        # Certain fields should be provided within the metadata header;
+        # add these here.
+        metadata = tuple(metadata) + (
+            gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+        )
+
+        # Send the request.
+        response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,)
+
+        # Done; return the response.
+        return response
+
+    def update_entry_group(
+        self,
+        request: datacatalog.UpdateEntryGroupRequest = None,
+        *,
+        entry_group: datacatalog.EntryGroup = None,
+        update_mask: field_mask.FieldMask = None,
+        retry: retries.Retry = gapic_v1.method.DEFAULT,
+        timeout: float = None,
+        metadata: Sequence[Tuple[str, str]] = (),
+    ) -> datacatalog.EntryGroup:
+        r"""Updates an EntryGroup. The user should enable the Data Catalog
+        API in the project identified by the ``entry_group.name``
+        parameter (see [Data Catalog Resource Project]
+        (https://cloud.google.com/data-catalog/docs/concepts/resource-project)
+        for more information).
+
+        Args:
+            request (:class:`~.datacatalog.UpdateEntryGroupRequest`):
+                The request object. Request message for
+                [UpdateEntryGroup][google.cloud.datacatalog.v1.DataCatalog.UpdateEntryGroup].
+            entry_group (:class:`~.datacatalog.EntryGroup`):
+                Required. The updated entry group.
+                "name" field must be set.
+                This corresponds to the ``entry_group`` field
+                on the ``request`` instance; if ``request`` is provided, this
+                should not be set.
+            update_mask (:class:`~.field_mask.FieldMask`):
+                The fields to update on the entry
+                group. If absent or empty, all
+                modifiable fields are updated.
+                This corresponds to the ``update_mask`` field
+                on the ``request`` instance; if ``request`` is provided, this
+                should not be set.
+
+            retry (google.api_core.retry.Retry): Designation of what errors, if any,
+                should be retried.
+            timeout (float): The timeout for this request.
+            metadata (Sequence[Tuple[str, str]]): Strings which should be
+                sent along with the request as metadata.
+
+        Returns:
+            ~.datacatalog.EntryGroup:
+                EntryGroup Metadata. An EntryGroup resource represents a
+                logical grouping of zero or more Data Catalog
+                [Entry][google.cloud.datacatalog.v1.Entry] resources.
+
+        """
+        # Create or coerce a protobuf request object.
+        # Sanity check: If we got a request object, we should *not* have
+        # gotten any keyword arguments that map to the request.
+        has_flattened_params = any([entry_group, update_mask])
+        if request is not None and has_flattened_params:
+            raise ValueError(
+                "If the `request` argument is set, then none of "
+                "the individual field arguments should be set."
+            )
+
+        # Minor optimization to avoid making a copy if the user passes
+        # in a datacatalog.UpdateEntryGroupRequest.
+        # There's no risk of modifying the input as we've already verified
+        # there are no flattened fields.
+        if not isinstance(request, datacatalog.UpdateEntryGroupRequest):
+            request = datacatalog.UpdateEntryGroupRequest(request)
+
+        # If we have keyword arguments corresponding to fields on the
+        # request, apply these.
+
+        if entry_group is not None:
+            request.entry_group = entry_group
+        if update_mask is not None:
+            request.update_mask = update_mask
+
+        # Wrap the RPC method; this adds retry and timeout information,
+        # and friendly error handling.
+        rpc = self._transport._wrapped_methods[self._transport.update_entry_group]
+
+        # Certain fields should be provided within the metadata header;
+        # add these here.
+        metadata = tuple(metadata) + (
+            gapic_v1.routing_header.to_grpc_metadata(
+                (("entry_group.name", request.entry_group.name),)
+            ),
+        )
+
+        # Send the request.
+        response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,)
+
+        # Done; return the response.
+        return response
+
+    def delete_entry_group(
+        self,
+        request: datacatalog.DeleteEntryGroupRequest = None,
+        *,
+        name: str = None,
+        retry: retries.Retry = gapic_v1.method.DEFAULT,
+        timeout: float = None,
+        metadata: Sequence[Tuple[str, str]] = (),
+    ) -> None:
+        r"""Deletes an EntryGroup. Only entry groups that do not contain
+        entries can be deleted. Users should enable the Data Catalog API
+        in the project identified by the ``name`` parameter (see [Data
+        Catalog Resource Project]
+        (https://cloud.google.com/data-catalog/docs/concepts/resource-project)
+        for more information).
+
+        Args:
+            request (:class:`~.datacatalog.DeleteEntryGroupRequest`):
+                The request object. Request message for
+                [DeleteEntryGroup][google.cloud.datacatalog.v1.DataCatalog.DeleteEntryGroup].
+            name (:class:`str`):
+                Required. The name of the entry group. For example,
+                ``projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}``.
+                This corresponds to the ``name`` field
+                on the ``request`` instance; if ``request`` is provided, this
+                should not be set.
+
+            retry (google.api_core.retry.Retry): Designation of what errors, if any,
+                should be retried.
+            timeout (float): The timeout for this request.
+            metadata (Sequence[Tuple[str, str]]): Strings which should be
+                sent along with the request as metadata.
+        """
+        # Create or coerce a protobuf request object.
+        # Sanity check: If we got a request object, we should *not* have
+        # gotten any keyword arguments that map to the request.
+        has_flattened_params = any([name])
+        if request is not None and has_flattened_params:
+            raise ValueError(
+                "If the `request` argument is set, then none of "
+                "the individual field arguments should be set."
+            )
+
+        # Minor optimization to avoid making a copy if the user passes
+        # in a datacatalog.DeleteEntryGroupRequest.
+        # There's no risk of modifying the input as we've already verified
+        # there are no flattened fields.
+        if not isinstance(request, datacatalog.DeleteEntryGroupRequest):
+            request = datacatalog.DeleteEntryGroupRequest(request)
+
+        # If we have keyword arguments corresponding to fields on the
+        # request, apply these.
+
+        if name is not None:
+            request.name = name
+
+        # Wrap the RPC method; this adds retry and timeout information,
+        # and friendly error handling.
+        rpc = self._transport._wrapped_methods[self._transport.delete_entry_group]
+
+        # Certain fields should be provided within the metadata header;
+        # add these here.
+        metadata = tuple(metadata) + (
+            gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+        )
+
+        # Send the request.
+        rpc(
+            request, retry=retry, timeout=timeout, metadata=metadata,
+        )
+
+    def list_entry_groups(
+        self,
+        request: datacatalog.ListEntryGroupsRequest = None,
+        *,
+        parent: str = None,
+        retry: retries.Retry = gapic_v1.method.DEFAULT,
+        timeout: float = None,
+        metadata: Sequence[Tuple[str, str]] = (),
+    ) -> pagers.ListEntryGroupsPager:
+        r"""Lists entry groups.
+
+        Args:
+            request (:class:`~.datacatalog.ListEntryGroupsRequest`):
+                The request object. Request message for
+                [ListEntryGroups][google.cloud.datacatalog.v1.DataCatalog.ListEntryGroups].
+            parent (:class:`str`):
+                Required. The name of the location that contains the
+                entry groups, which can be provided in URL format.
+                Example:
+
+                -  projects/{project_id}/locations/{location}
+                This corresponds to the ``parent`` field
+                on the ``request`` instance; if ``request`` is provided, this
+                should not be set.
+
+            retry (google.api_core.retry.Retry): Designation of what errors, if any,
+                should be retried.
+            timeout (float): The timeout for this request.
+            metadata (Sequence[Tuple[str, str]]): Strings which should be
+                sent along with the request as metadata.
+
+        Returns:
+            ~.pagers.ListEntryGroupsPager:
+                Response message for
+                [ListEntryGroups][google.cloud.datacatalog.v1.DataCatalog.ListEntryGroups].
+
+                Iterating over this object will yield results and
+                resolve additional pages automatically.
+
+        """
+        # Create or coerce a protobuf request object.
+        # Sanity check: If we got a request object, we should *not* have
+        # gotten any keyword arguments that map to the request.
+        has_flattened_params = any([parent])
+        if request is not None and has_flattened_params:
+            raise ValueError(
+                "If the `request` argument is set, then none of "
+                "the individual field arguments should be set."
+            )
+
+        # Minor optimization to avoid making a copy if the user passes
+        # in a datacatalog.ListEntryGroupsRequest.
+        # There's no risk of modifying the input as we've already verified
+        # there are no flattened fields.
+        if not isinstance(request, datacatalog.ListEntryGroupsRequest):
+            request = datacatalog.ListEntryGroupsRequest(request)
+
+        # If we have keyword arguments corresponding to fields on the
+        # request, apply these.
+
+        if parent is not None:
+            request.parent = parent
+
+        # Wrap the RPC method; this adds retry and timeout information,
+        # and friendly error handling.
+        rpc = self._transport._wrapped_methods[self._transport.list_entry_groups]
+
+        # Certain fields should be provided within the metadata header;
+        # add these here.
+        metadata = tuple(metadata) + (
+            gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
+        )
+
+        # Send the request.
+        response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,)
+
+        # This method is paged; wrap the response in a pager, which provides
+        # an `__iter__` convenience method.
+        response = pagers.ListEntryGroupsPager(
+            method=rpc, request=request, response=response, metadata=metadata,
+        )
+
+        # Done; return the response.
+        return response
+
+    def create_entry(
+        self,
+        request: datacatalog.CreateEntryRequest = None,
+        *,
+        parent: str = None,
+        entry_id: str = None,
+        entry: datacatalog.Entry = None,
+        retry: retries.Retry = gapic_v1.method.DEFAULT,
+        timeout: float = None,
+        metadata: Sequence[Tuple[str, str]] = (),
+    ) -> datacatalog.Entry:
+        r"""Creates an entry. Only entries of 'FILESET' type or
+        user-specified type can be created.
+ + Users should enable the Data Catalog API in the project + identified by the ``parent`` parameter (see [Data Catalog + Resource Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + A maximum of 100,000 entries may be created per entry group. + + Args: + request (:class:`~.datacatalog.CreateEntryRequest`): + The request object. Request message for + [CreateEntry][google.cloud.datacatalog.v1.DataCatalog.CreateEntry]. + parent (:class:`str`): + Required. The name of the entry group this entry is in. + Example: + + - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id} + + Note that this Entry and its child resources may not + actually be stored in the location in this name. + This corresponds to the ``parent`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + entry_id (:class:`str`): + Required. The id of the entry to + create. + This corresponds to the ``entry_id`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + entry (:class:`~.datacatalog.Entry`): + Required. The entry to create. + This corresponds to the ``entry`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.datacatalog.Entry: + Entry Metadata. A Data Catalog Entry resource represents + another resource in Google Cloud Platform (such as a + BigQuery dataset or a Pub/Sub topic) or outside of + Google Cloud Platform. Clients can use the + ``linked_resource`` field in the Entry resource to refer + to the original resource ID of the source system. + + An Entry resource contains resource details, such as its + schema. 
An Entry can also be used to attach flexible + metadata, such as a + [Tag][google.cloud.datacatalog.v1.Tag]. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + has_flattened_params = any([parent, entry_id, entry]) + if request is not None and has_flattened_params: + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + # Minor optimization to avoid making a copy if the user passes + # in a datacatalog.CreateEntryRequest. + # There's no risk of modifying the input as we've already verified + # there are no flattened fields. + if not isinstance(request, datacatalog.CreateEntryRequest): + request = datacatalog.CreateEntryRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if parent is not None: + request.parent = parent + if entry_id is not None: + request.entry_id = entry_id + if entry is not None: + request.entry = entry + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = self._transport._wrapped_methods[self._transport.create_entry] + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)), + ) + + # Send the request. + response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. + return response + + def update_entry( + self, + request: datacatalog.UpdateEntryRequest = None, + *, + entry: datacatalog.Entry = None, + update_mask: field_mask.FieldMask = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> datacatalog.Entry: + r"""Updates an existing entry. 
Users should enable the Data Catalog + API in the project identified by the ``entry.name`` parameter + (see [Data Catalog Resource Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + Args: + request (:class:`~.datacatalog.UpdateEntryRequest`): + The request object. Request message for + [UpdateEntry][google.cloud.datacatalog.v1.DataCatalog.UpdateEntry]. + entry (:class:`~.datacatalog.Entry`): + Required. The updated entry. The + "name" field must be set. + This corresponds to the ``entry`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + update_mask (:class:`~.field_mask.FieldMask`): + The fields to update on the entry. If absent or empty, + all modifiable fields are updated. + + The following fields are modifiable: + + - For entries with type ``DATA_STREAM``: + + - ``schema`` + + - For entries with type ``FILESET`` + + - ``schema`` + - ``display_name`` + - ``description`` + - ``gcs_fileset_spec`` + - ``gcs_fileset_spec.file_patterns`` + + - For entries with ``user_specified_type`` + + - ``schema`` + - ``display_name`` + - ``description`` + - user_specified_type + - user_specified_system + - linked_resource + - source_system_timestamps + This corresponds to the ``update_mask`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.datacatalog.Entry: + Entry Metadata. A Data Catalog Entry resource represents + another resource in Google Cloud Platform (such as a + BigQuery dataset or a Pub/Sub topic) or outside of + Google Cloud Platform. 
Clients can use the + ``linked_resource`` field in the Entry resource to refer + to the original resource ID of the source system. + + An Entry resource contains resource details, such as its + schema. An Entry can also be used to attach flexible + metadata, such as a + [Tag][google.cloud.datacatalog.v1.Tag]. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + has_flattened_params = any([entry, update_mask]) + if request is not None and has_flattened_params: + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + # Minor optimization to avoid making a copy if the user passes + # in a datacatalog.UpdateEntryRequest. + # There's no risk of modifying the input as we've already verified + # there are no flattened fields. + if not isinstance(request, datacatalog.UpdateEntryRequest): + request = datacatalog.UpdateEntryRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if entry is not None: + request.entry = entry + if update_mask is not None: + request.update_mask = update_mask + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = self._transport._wrapped_methods[self._transport.update_entry] + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata( + (("entry.name", request.entry.name),) + ), + ) + + # Send the request. + response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. 
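The `has_flattened_params` sanity check repeated in each method above can be sketched as a stand-alone helper (illustrative only; in the generated client the check is inlined per method rather than factored out like this):

```python
def check_flattened_params(request, *field_values):
    """Mirror of the generated guard: a `request` object and flattened
    keyword fields (e.g. `entry`, `update_mask`) are mutually exclusive."""
    if request is not None and any(v is not None for v in field_values):
        raise ValueError(
            "If the `request` argument is set, then none of "
            "the individual field arguments should be set."
        )
```

For example, `check_flattened_params(None, entry, update_mask)` passes when only flattened fields are given, while passing both a request object and any flattened field raises `ValueError`.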
+ return response + + def delete_entry( + self, + request: datacatalog.DeleteEntryRequest = None, + *, + name: str = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> None: + r"""Deletes an existing entry. Only entries created through + [CreateEntry][google.cloud.datacatalog.v1.DataCatalog.CreateEntry] + method can be deleted. Users should enable the Data Catalog API + in the project identified by the ``name`` parameter (see [Data + Catalog Resource Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + Args: + request (:class:`~.datacatalog.DeleteEntryRequest`): + The request object. Request message for + [DeleteEntry][google.cloud.datacatalog.v1.DataCatalog.DeleteEntry]. + name (:class:`str`): + Required. The name of the entry. Example: + + - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id} + This corresponds to the ``name`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + has_flattened_params = any([name]) + if request is not None and has_flattened_params: + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + # Minor optimization to avoid making a copy if the user passes + # in a datacatalog.DeleteEntryRequest. + # There's no risk of modifying the input as we've already verified + # there are no flattened fields. 
+ if not isinstance(request, datacatalog.DeleteEntryRequest): + request = datacatalog.DeleteEntryRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if name is not None: + request.name = name + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = self._transport._wrapped_methods[self._transport.delete_entry] + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)), + ) + + # Send the request. + rpc( + request, retry=retry, timeout=timeout, metadata=metadata, + ) + + def get_entry( + self, + request: datacatalog.GetEntryRequest = None, + *, + name: str = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> datacatalog.Entry: + r"""Gets an entry. + + Args: + request (:class:`~.datacatalog.GetEntryRequest`): + The request object. Request message for + [GetEntry][google.cloud.datacatalog.v1.DataCatalog.GetEntry]. + name (:class:`str`): + Required. The name of the entry. Example: + + - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id} + This corresponds to the ``name`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.datacatalog.Entry: + Entry Metadata. A Data Catalog Entry resource represents + another resource in Google Cloud Platform (such as a + BigQuery dataset or a Pub/Sub topic) or outside of + Google Cloud Platform. 
Clients can use the + ``linked_resource`` field in the Entry resource to refer + to the original resource ID of the source system. + + An Entry resource contains resource details, such as its + schema. An Entry can also be used to attach flexible + metadata, such as a + [Tag][google.cloud.datacatalog.v1.Tag]. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + has_flattened_params = any([name]) + if request is not None and has_flattened_params: + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + # Minor optimization to avoid making a copy if the user passes + # in a datacatalog.GetEntryRequest. + # There's no risk of modifying the input as we've already verified + # there are no flattened fields. + if not isinstance(request, datacatalog.GetEntryRequest): + request = datacatalog.GetEntryRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if name is not None: + request.name = name + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = self._transport._wrapped_methods[self._transport.get_entry] + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)), + ) + + # Send the request. + response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. + return response + + def lookup_entry( + self, + request: datacatalog.LookupEntryRequest = None, + *, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> datacatalog.Entry: + r"""Get an entry by target resource name. 
This method + allows clients to use the resource name from the source + Google Cloud Platform service to get the Data Catalog + Entry. + + Args: + request (:class:`~.datacatalog.LookupEntryRequest`): + The request object. Request message for + [LookupEntry][google.cloud.datacatalog.v1.DataCatalog.LookupEntry]. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.datacatalog.Entry: + Entry Metadata. A Data Catalog Entry resource represents + another resource in Google Cloud Platform (such as a + BigQuery dataset or a Pub/Sub topic) or outside of + Google Cloud Platform. Clients can use the + ``linked_resource`` field in the Entry resource to refer + to the original resource ID of the source system. + + An Entry resource contains resource details, such as its + schema. An Entry can also be used to attach flexible + metadata, such as a + [Tag][google.cloud.datacatalog.v1.Tag]. + + """ + # Create or coerce a protobuf request object. + + # Minor optimization to avoid making a copy if the user passes + # in a datacatalog.LookupEntryRequest. + # There's no risk of modifying the input as we've already verified + # there are no flattened fields. + if not isinstance(request, datacatalog.LookupEntryRequest): + request = datacatalog.LookupEntryRequest(request) + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = self._transport._wrapped_methods[self._transport.lookup_entry] + + # Send the request. + response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. 
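`LookupEntryRequest` identifies the target by exactly one of two fields: `linked_resource` (a full Google Cloud resource name) or `sql_resource` (a SQL name, as exercised by the `--sql-resource` samples in this PR). A hedged sketch of that one-of constraint, using a plain dict in place of the protobuf request (the helper itself is illustrative, not part of the generated client):

```python
def make_lookup_request(linked_resource=None, sql_resource=None):
    # Enforce the one-of: exactly one of the two identifiers must be set.
    if (linked_resource is None) == (sql_resource is None):
        raise ValueError("Provide exactly one of linked_resource or sql_resource.")
    if linked_resource is not None:
        return {"linked_resource": linked_resource}
    return {"sql_resource": sql_resource}
```

A caller would then pass the resulting fields on the real `datacatalog.LookupEntryRequest` before invoking `lookup_entry`.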
+ return response + + def list_entries( + self, + request: datacatalog.ListEntriesRequest = None, + *, + parent: str = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> pagers.ListEntriesPager: + r"""Lists entries. + + Args: + request (:class:`~.datacatalog.ListEntriesRequest`): + The request object. Request message for + [ListEntries][google.cloud.datacatalog.v1.DataCatalog.ListEntries]. + parent (:class:`str`): + Required. The name of the entry group that contains the + entries, which can be provided in URL format. Example: + + - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id} + This corresponds to the ``parent`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.pagers.ListEntriesPager: + Response message for + [ListEntries][google.cloud.datacatalog.v1.DataCatalog.ListEntries]. + + Iterating over this object will yield results and + resolve additional pages automatically. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + has_flattened_params = any([parent]) + if request is not None and has_flattened_params: + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + # Minor optimization to avoid making a copy if the user passes + # in a datacatalog.ListEntriesRequest. + # There's no risk of modifying the input as we've already verified + # there are no flattened fields. 
+ if not isinstance(request, datacatalog.ListEntriesRequest): + request = datacatalog.ListEntriesRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if parent is not None: + request.parent = parent + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = self._transport._wrapped_methods[self._transport.list_entries] + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)), + ) + + # Send the request. + response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # This method is paged; wrap the response in a pager, which provides + # an `__iter__` convenience method. + response = pagers.ListEntriesPager( + method=rpc, request=request, response=response, metadata=metadata, + ) + + # Done; return the response. + return response + + def create_tag_template( + self, + request: datacatalog.CreateTagTemplateRequest = None, + *, + parent: str = None, + tag_template_id: str = None, + tag_template: tags.TagTemplate = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> tags.TagTemplate: + r"""Creates a tag template. The user should enable the Data Catalog + API in the project identified by the ``parent`` parameter (see + `Data Catalog Resource + Project `__ + for more information). + + Args: + request (:class:`~.datacatalog.CreateTagTemplateRequest`): + The request object. Request message for + [CreateTagTemplate][google.cloud.datacatalog.v1.DataCatalog.CreateTagTemplate]. + parent (:class:`str`): + Required. The name of the project and the template + location + `region `__. 
+ + Example: + + - projects/{project_id}/locations/us-central1 + This corresponds to the ``parent`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + tag_template_id (:class:`str`): + Required. The id of the tag template + to create. + This corresponds to the ``tag_template_id`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + tag_template (:class:`~.tags.TagTemplate`): + Required. The tag template to create. + This corresponds to the ``tag_template`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.tags.TagTemplate: + A tag template defines a tag, which can have one or more + typed fields. The template is used to create and attach + the tag to GCP resources. `Tag template + roles `__ + provide permissions to create, edit, and use the + template. See, for example, the `TagTemplate + User `__ + role, which includes permission to use the tag template + to tag resources. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + has_flattened_params = any([parent, tag_template_id, tag_template]) + if request is not None and has_flattened_params: + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + # Minor optimization to avoid making a copy if the user passes + # in a datacatalog.CreateTagTemplateRequest. + # There's no risk of modifying the input as we've already verified + # there are no flattened fields. 
+ if not isinstance(request, datacatalog.CreateTagTemplateRequest): + request = datacatalog.CreateTagTemplateRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if parent is not None: + request.parent = parent + if tag_template_id is not None: + request.tag_template_id = tag_template_id + if tag_template is not None: + request.tag_template = tag_template + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = self._transport._wrapped_methods[self._transport.create_tag_template] + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)), + ) + + # Send the request. + response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. + return response + + def get_tag_template( + self, + request: datacatalog.GetTagTemplateRequest = None, + *, + name: str = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> tags.TagTemplate: + r"""Gets a tag template. + + Args: + request (:class:`~.datacatalog.GetTagTemplateRequest`): + The request object. Request message for + [GetTagTemplate][google.cloud.datacatalog.v1.DataCatalog.GetTagTemplate]. + name (:class:`str`): + Required. The name of the tag template. Example: + + - projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id} + This corresponds to the ``name`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. 
+ + Returns: + ~.tags.TagTemplate: + A tag template defines a tag, which can have one or more + typed fields. The template is used to create and attach + the tag to GCP resources. `Tag template + roles `__ + provide permissions to create, edit, and use the + template. See, for example, the `TagTemplate + User `__ + role, which includes permission to use the tag template + to tag resources. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + has_flattened_params = any([name]) + if request is not None and has_flattened_params: + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + # Minor optimization to avoid making a copy if the user passes + # in a datacatalog.GetTagTemplateRequest. + # There's no risk of modifying the input as we've already verified + # there are no flattened fields. + if not isinstance(request, datacatalog.GetTagTemplateRequest): + request = datacatalog.GetTagTemplateRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if name is not None: + request.name = name + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = self._transport._wrapped_methods[self._transport.get_tag_template] + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)), + ) + + # Send the request. + response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. 
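The routing metadata that `gapic_v1.routing_header.to_grpc_metadata` attaches in these methods can be approximated with the standard library (a simplified sketch of my understanding of the helper: it percent-encodes each routing value and packs the pairs into a single `x-goog-request-params` gRPC metadata entry; the real implementation in `google.api_core` is authoritative):

```python
from urllib.parse import quote

def to_grpc_metadata(params):
    # params is a sequence of (field, value) pairs, e.g. (("name", request.name),).
    # Values are percent-encoded so resource-name slashes survive as one header value.
    value = "&".join(f"{key}={quote(str(val), safe='')}" for key, val in params)
    return ("x-goog-request-params", value)
```

This is why each method appends the result to the user-supplied `metadata` tuple rather than replacing it: caller metadata and routing metadata travel together.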
+ return response + + def update_tag_template( + self, + request: datacatalog.UpdateTagTemplateRequest = None, + *, + tag_template: tags.TagTemplate = None, + update_mask: field_mask.FieldMask = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> tags.TagTemplate: + r"""Updates a tag template. This method cannot be used to update the + fields of a template. The tag template fields are represented as + separate resources and should be updated using their own + create/update/delete methods. Users should enable the Data + Catalog API in the project identified by the + ``tag_template.name`` parameter (see [Data Catalog Resource + Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + Args: + request (:class:`~.datacatalog.UpdateTagTemplateRequest`): + The request object. Request message for + [UpdateTagTemplate][google.cloud.datacatalog.v1.DataCatalog.UpdateTagTemplate]. + tag_template (:class:`~.tags.TagTemplate`): + Required. The template to update. The + "name" field must be set. + This corresponds to the ``tag_template`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + update_mask (:class:`~.field_mask.FieldMask`): + The field mask specifies the parts of the template to + overwrite. + + Allowed fields: + + - ``display_name`` + + If absent or empty, all of the allowed fields above will + be updated. + This corresponds to the ``update_mask`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.tags.TagTemplate: + A tag template defines a tag, which can have one or more + typed fields. 
The template is used to create and attach + the tag to GCP resources. `Tag template + roles `__ + provide permissions to create, edit, and use the + template. See, for example, the `TagTemplate + User `__ + role, which includes permission to use the tag template + to tag resources. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + has_flattened_params = any([tag_template, update_mask]) + if request is not None and has_flattened_params: + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + # Minor optimization to avoid making a copy if the user passes + # in a datacatalog.UpdateTagTemplateRequest. + # There's no risk of modifying the input as we've already verified + # there are no flattened fields. + if not isinstance(request, datacatalog.UpdateTagTemplateRequest): + request = datacatalog.UpdateTagTemplateRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if tag_template is not None: + request.tag_template = tag_template + if update_mask is not None: + request.update_mask = update_mask + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = self._transport._wrapped_methods[self._transport.update_tag_template] + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata( + (("tag_template.name", request.tag_template.name),) + ), + ) + + # Send the request. + response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. 
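The `update_mask` semantics described in the docstrings above (an absent or empty mask means all allowed fields are overwritten; otherwise only the listed fields change) can be sketched on plain dicts. This is illustrative only; the real `update_mask` is the protobuf `FieldMask` well-known type applied server-side:

```python
def apply_update_mask(current, update, paths):
    # Copy only the masked top-level fields from `update` onto a copy of
    # `current`; an empty mask applies every field present in `update`.
    merged = dict(current)
    for field in (paths or update):
        merged[field] = update[field]
    return merged
```

For example, masking only `display_name` leaves the template's other fields untouched even if the update message carries values for them.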
+ return response + + def delete_tag_template( + self, + request: datacatalog.DeleteTagTemplateRequest = None, + *, + name: str = None, + force: bool = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> None: + r"""Deletes a tag template and all tags using the template. Users + should enable the Data Catalog API in the project identified by + the ``name`` parameter (see [Data Catalog Resource Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + Args: + request (:class:`~.datacatalog.DeleteTagTemplateRequest`): + The request object. Request message for + [DeleteTagTemplate][google.cloud.datacatalog.v1.DataCatalog.DeleteTagTemplate]. + name (:class:`str`): + Required. The name of the tag template to delete. + Example: + + - projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id} + This corresponds to the ``name`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + force (:class:`bool`): + Required. Currently, this field must always be set to + ``true``. This confirms the deletion of any possible + tags using this template. ``force = false`` will be + supported in the future. + This corresponds to the ``force`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. 
+ has_flattened_params = any([name, force]) + if request is not None and has_flattened_params: + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + # Minor optimization to avoid making a copy if the user passes + # in a datacatalog.DeleteTagTemplateRequest. + # There's no risk of modifying the input as we've already verified + # there are no flattened fields. + if not isinstance(request, datacatalog.DeleteTagTemplateRequest): + request = datacatalog.DeleteTagTemplateRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if name is not None: + request.name = name + if force is not None: + request.force = force + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = self._transport._wrapped_methods[self._transport.delete_tag_template] + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)), + ) + + # Send the request. + rpc( + request, retry=retry, timeout=timeout, metadata=metadata, + ) + + def create_tag_template_field( + self, + request: datacatalog.CreateTagTemplateFieldRequest = None, + *, + parent: str = None, + tag_template_field_id: str = None, + tag_template_field: tags.TagTemplateField = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> tags.TagTemplateField: + r"""Creates a field in a tag template. The user should enable the + Data Catalog API in the project identified by the ``parent`` + parameter (see `Data Catalog Resource + Project `__ + for more information). + + Args: + request (:class:`~.datacatalog.CreateTagTemplateFieldRequest`): + The request object. 
Request message for + [CreateTagTemplateField][google.cloud.datacatalog.v1.DataCatalog.CreateTagTemplateField]. + parent (:class:`str`): + Required. The name of the project and the template + location + `region `__. + + Example: + + - projects/{project_id}/locations/us-central1/tagTemplates/{tag_template_id} + This corresponds to the ``parent`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + tag_template_field_id (:class:`str`): + Required. The ID of the tag template field to create. + Field ids can contain letters (both uppercase and + lowercase), numbers (0-9), underscores (_) and dashes + (-). Field IDs must be at least 1 character long and at + most 128 characters long. Field IDs must also be unique + within their template. + This corresponds to the ``tag_template_field_id`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + tag_template_field (:class:`~.tags.TagTemplateField`): + Required. The tag template field to + create. + This corresponds to the ``tag_template_field`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.tags.TagTemplateField: + The template for an individual field + within a tag template. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + has_flattened_params = any([parent, tag_template_field_id, tag_template_field]) + if request is not None and has_flattened_params: + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." 
+ ) + + # Minor optimization to avoid making a copy if the user passes + # in a datacatalog.CreateTagTemplateFieldRequest. + # There's no risk of modifying the input as we've already verified + # there are no flattened fields. + if not isinstance(request, datacatalog.CreateTagTemplateFieldRequest): + request = datacatalog.CreateTagTemplateFieldRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if parent is not None: + request.parent = parent + if tag_template_field_id is not None: + request.tag_template_field_id = tag_template_field_id + if tag_template_field is not None: + request.tag_template_field = tag_template_field + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = self._transport._wrapped_methods[ + self._transport.create_tag_template_field + ] + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)), + ) + + # Send the request. + response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. + return response + + def update_tag_template_field( + self, + request: datacatalog.UpdateTagTemplateFieldRequest = None, + *, + name: str = None, + tag_template_field: tags.TagTemplateField = None, + update_mask: field_mask.FieldMask = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> tags.TagTemplateField: + r"""Updates a field in a tag template. This method cannot be used to + update the field type. Users should enable the Data Catalog API + in the project identified by the ``name`` parameter (see [Data + Catalog Resource Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). 
+ + Args: + request (:class:`~.datacatalog.UpdateTagTemplateFieldRequest`): + The request object. Request message for + [UpdateTagTemplateField][google.cloud.datacatalog.v1.DataCatalog.UpdateTagTemplateField]. + name (:class:`str`): + Required. The name of the tag template field. Example: + + - projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id}/fields/{tag_template_field_id} + This corresponds to the ``name`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + tag_template_field (:class:`~.tags.TagTemplateField`): + Required. The template to update. + This corresponds to the ``tag_template_field`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + update_mask (:class:`~.field_mask.FieldMask`): + Optional. The field mask specifies the parts of the + template to be updated. Allowed fields: + + - ``display_name`` + - ``type.enum_type`` + - ``is_required`` + + If ``update_mask`` is not set or empty, all of the + allowed fields above will be updated. + + When updating an enum type, the provided values will be + merged with the existing values. Therefore, enum values + can only be added, existing enum values cannot be + deleted nor renamed. Updating a template field from + optional to required is NOT allowed. + This corresponds to the ``update_mask`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.tags.TagTemplateField: + The template for an individual field + within a tag template. + + """ + # Create or coerce a protobuf request object. 
+ # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + has_flattened_params = any([name, tag_template_field, update_mask]) + if request is not None and has_flattened_params: + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + # Minor optimization to avoid making a copy if the user passes + # in a datacatalog.UpdateTagTemplateFieldRequest. + # There's no risk of modifying the input as we've already verified + # there are no flattened fields. + if not isinstance(request, datacatalog.UpdateTagTemplateFieldRequest): + request = datacatalog.UpdateTagTemplateFieldRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if name is not None: + request.name = name + if tag_template_field is not None: + request.tag_template_field = tag_template_field + if update_mask is not None: + request.update_mask = update_mask + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = self._transport._wrapped_methods[ + self._transport.update_tag_template_field + ] + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)), + ) + + # Send the request. + response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. + return response + + def rename_tag_template_field( + self, + request: datacatalog.RenameTagTemplateFieldRequest = None, + *, + name: str = None, + new_tag_template_field_id: str = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> tags.TagTemplateField: + r"""Renames a field in a tag template. 
The user should enable the
+ Data Catalog API in the project identified by the ``name``
+ parameter (see `Data Catalog Resource
+ Project <https://cloud.google.com/data-catalog/docs/concepts/resource-project>`__
+ for more information).
+
+ Args:
+ request (:class:`~.datacatalog.RenameTagTemplateFieldRequest`):
+ The request object. Request message for
+ [RenameTagTemplateField][google.cloud.datacatalog.v1.DataCatalog.RenameTagTemplateField].
+ name (:class:`str`):
+ Required. The name of the tag template. Example:
+
+ - projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id}/fields/{tag_template_field_id}
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ new_tag_template_field_id (:class:`str`):
+ Required. The new ID of this tag template field. For
+ example, ``my_new_field``.
+ This corresponds to the ``new_tag_template_field_id`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.tags.TagTemplateField:
+ The template for an individual field
+ within a tag template.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Sanity check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name, new_tag_template_field_id])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a datacatalog.RenameTagTemplateFieldRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, datacatalog.RenameTagTemplateFieldRequest): + request = datacatalog.RenameTagTemplateFieldRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if name is not None: + request.name = name + if new_tag_template_field_id is not None: + request.new_tag_template_field_id = new_tag_template_field_id + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = self._transport._wrapped_methods[ + self._transport.rename_tag_template_field + ] + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)), + ) + + # Send the request. + response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. + return response + + def delete_tag_template_field( + self, + request: datacatalog.DeleteTagTemplateFieldRequest = None, + *, + name: str = None, + force: bool = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> None: + r"""Deletes a field in a tag template and all uses of that field. + Users should enable the Data Catalog API in the project + identified by the ``name`` parameter (see [Data Catalog Resource + Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + Args: + request (:class:`~.datacatalog.DeleteTagTemplateFieldRequest`): + The request object. Request message for + [DeleteTagTemplateField][google.cloud.datacatalog.v1.DataCatalog.DeleteTagTemplateField]. + name (:class:`str`): + Required. The name of the tag template field to delete. 
+ Example: + + - projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id}/fields/{tag_template_field_id} + This corresponds to the ``name`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + force (:class:`bool`): + Required. Currently, this field must always be set to + ``true``. This confirms the deletion of this field from + any tags using this field. ``force = false`` will be + supported in the future. + This corresponds to the ``force`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + has_flattened_params = any([name, force]) + if request is not None and has_flattened_params: + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + # Minor optimization to avoid making a copy if the user passes + # in a datacatalog.DeleteTagTemplateFieldRequest. + # There's no risk of modifying the input as we've already verified + # there are no flattened fields. + if not isinstance(request, datacatalog.DeleteTagTemplateFieldRequest): + request = datacatalog.DeleteTagTemplateFieldRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if name is not None: + request.name = name + if force is not None: + request.force = force + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. 
+ rpc = self._transport._wrapped_methods[ + self._transport.delete_tag_template_field + ] + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)), + ) + + # Send the request. + rpc( + request, retry=retry, timeout=timeout, metadata=metadata, + ) + + def create_tag( + self, + request: datacatalog.CreateTagRequest = None, + *, + parent: str = None, + tag: tags.Tag = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> tags.Tag: + r"""Creates a tag on an [Entry][google.cloud.datacatalog.v1.Entry]. + Note: The project identified by the ``parent`` parameter for the + `tag `__ + and the `tag + template `__ + used to create the tag must be from the same organization. + + Args: + request (:class:`~.datacatalog.CreateTagRequest`): + The request object. Request message for + [CreateTag][google.cloud.datacatalog.v1.DataCatalog.CreateTag]. + parent (:class:`str`): + Required. The name of the resource to attach this tag + to. Tags can be attached to Entries. Example: + + - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id} + + Note that this Tag and its child resources may not + actually be stored in the location in this name. + This corresponds to the ``parent`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + tag (:class:`~.tags.Tag`): + Required. The tag to create. + This corresponds to the ``tag`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. 
+ + Returns: + ~.tags.Tag: + Tags are used to attach custom metadata to Data Catalog + resources. Tags conform to the specifications within + their tag template. + + See `Data Catalog + IAM `__ + for information on the permissions needed to create or + view tags. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + has_flattened_params = any([parent, tag]) + if request is not None and has_flattened_params: + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + # Minor optimization to avoid making a copy if the user passes + # in a datacatalog.CreateTagRequest. + # There's no risk of modifying the input as we've already verified + # there are no flattened fields. + if not isinstance(request, datacatalog.CreateTagRequest): + request = datacatalog.CreateTagRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if parent is not None: + request.parent = parent + if tag is not None: + request.tag = tag + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = self._transport._wrapped_methods[self._transport.create_tag] + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)), + ) + + # Send the request. + response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. 
+ return response + + def update_tag( + self, + request: datacatalog.UpdateTagRequest = None, + *, + tag: tags.Tag = None, + update_mask: field_mask.FieldMask = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> tags.Tag: + r"""Updates an existing tag. + + Args: + request (:class:`~.datacatalog.UpdateTagRequest`): + The request object. Request message for + [UpdateTag][google.cloud.datacatalog.v1.DataCatalog.UpdateTag]. + tag (:class:`~.tags.Tag`): + Required. The updated tag. The "name" + field must be set. + This corresponds to the ``tag`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + update_mask (:class:`~.field_mask.FieldMask`): + The fields to update on the Tag. If absent or empty, all + modifiable fields are updated. Currently the only + modifiable field is the field ``fields``. + This corresponds to the ``update_mask`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.tags.Tag: + Tags are used to attach custom metadata to Data Catalog + resources. Tags conform to the specifications within + their tag template. + + See `Data Catalog + IAM `__ + for information on the permissions needed to create or + view tags. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + has_flattened_params = any([tag, update_mask]) + if request is not None and has_flattened_params: + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." 
+ ) + + # Minor optimization to avoid making a copy if the user passes + # in a datacatalog.UpdateTagRequest. + # There's no risk of modifying the input as we've already verified + # there are no flattened fields. + if not isinstance(request, datacatalog.UpdateTagRequest): + request = datacatalog.UpdateTagRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if tag is not None: + request.tag = tag + if update_mask is not None: + request.update_mask = update_mask + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = self._transport._wrapped_methods[self._transport.update_tag] + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("tag.name", request.tag.name),)), + ) + + # Send the request. + response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. + return response + + def delete_tag( + self, + request: datacatalog.DeleteTagRequest = None, + *, + name: str = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> None: + r"""Deletes a tag. + + Args: + request (:class:`~.datacatalog.DeleteTagRequest`): + The request object. Request message for + [DeleteTag][google.cloud.datacatalog.v1.DataCatalog.DeleteTag]. + name (:class:`str`): + Required. The name of the tag to delete. Example: + + - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id}/tags/{tag_id} + This corresponds to the ``name`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. 
+ metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + has_flattened_params = any([name]) + if request is not None and has_flattened_params: + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + # Minor optimization to avoid making a copy if the user passes + # in a datacatalog.DeleteTagRequest. + # There's no risk of modifying the input as we've already verified + # there are no flattened fields. + if not isinstance(request, datacatalog.DeleteTagRequest): + request = datacatalog.DeleteTagRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if name is not None: + request.name = name + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = self._transport._wrapped_methods[self._transport.delete_tag] + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)), + ) + + # Send the request. + rpc( + request, retry=retry, timeout=timeout, metadata=metadata, + ) + + def list_tags( + self, + request: datacatalog.ListTagsRequest = None, + *, + parent: str = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> pagers.ListTagsPager: + r"""Lists the tags on an [Entry][google.cloud.datacatalog.v1.Entry]. + + Args: + request (:class:`~.datacatalog.ListTagsRequest`): + The request object. Request message for + [ListTags][google.cloud.datacatalog.v1.DataCatalog.ListTags]. + parent (:class:`str`): + Required. 
The name of the Data Catalog resource to list + the tags of. The resource could be an + [Entry][google.cloud.datacatalog.v1.Entry] or an + [EntryGroup][google.cloud.datacatalog.v1.EntryGroup]. + + Examples: + + - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id} + - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id} + This corresponds to the ``parent`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.pagers.ListTagsPager: + Response message for + [ListTags][google.cloud.datacatalog.v1.DataCatalog.ListTags]. + + Iterating over this object will yield results and + resolve additional pages automatically. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + has_flattened_params = any([parent]) + if request is not None and has_flattened_params: + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + # Minor optimization to avoid making a copy if the user passes + # in a datacatalog.ListTagsRequest. + # There's no risk of modifying the input as we've already verified + # there are no flattened fields. + if not isinstance(request, datacatalog.ListTagsRequest): + request = datacatalog.ListTagsRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if parent is not None: + request.parent = parent + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. 
+ rpc = self._transport._wrapped_methods[self._transport.list_tags] + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)), + ) + + # Send the request. + response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # This method is paged; wrap the response in a pager, which provides + # an `__iter__` convenience method. + response = pagers.ListTagsPager( + method=rpc, request=request, response=response, metadata=metadata, + ) + + # Done; return the response. + return response + + def set_iam_policy( + self, + request: iam_policy.SetIamPolicyRequest = None, + *, + resource: str = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> policy.Policy: + r"""Sets the access control policy for a resource. Replaces any + existing policy. Supported resources are: + + - Tag templates. + - Entries. + - Entry groups. Note, this method cannot be used to manage + policies for BigQuery, Pub/Sub and any external Google Cloud + Platform resources synced to Data Catalog. + + Callers must have following Google IAM permission + + - ``datacatalog.tagTemplates.setIamPolicy`` to set policies on + tag templates. + - ``datacatalog.entries.setIamPolicy`` to set policies on + entries. + - ``datacatalog.entryGroups.setIamPolicy`` to set policies on + entry groups. + + Args: + request (:class:`~.iam_policy.SetIamPolicyRequest`): + The request object. Request message for `SetIamPolicy` + method. + resource (:class:`str`): + REQUIRED: The resource for which the + policy is being specified. See the + operation documentation for the + appropriate value for this field. + This corresponds to the ``resource`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. 
+ + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.policy.Policy: + Defines an Identity and Access Management (IAM) policy. + It is used to specify access control policies for Cloud + Platform resources. + + A ``Policy`` is a collection of ``bindings``. A + ``binding`` binds one or more ``members`` to a single + ``role``. Members can be user accounts, service + accounts, Google groups, and domains (such as G Suite). + A ``role`` is a named list of permissions (defined by + IAM or configured by users). A ``binding`` can + optionally specify a ``condition``, which is a logic + expression that further constrains the role binding + based on attributes about the request and/or target + resource. + + **JSON Example** + + :: + + { + "bindings": [ + { + "role": "roles/resourcemanager.organizationAdmin", + "members": [ + "user:mike@example.com", + "group:admins@example.com", + "domain:google.com", + "serviceAccount:my-project-id@appspot.gserviceaccount.com" + ] + }, + { + "role": "roles/resourcemanager.organizationViewer", + "members": ["user:eve@example.com"], + "condition": { + "title": "expirable access", + "description": "Does not grant access after Sep 2020", + "expression": "request.time < + timestamp('2020-10-01T00:00:00.000Z')", + } + } + ] + } + + **YAML Example** + + :: + + bindings: + - members: + - user:mike@example.com + - group:admins@example.com + - domain:google.com + - serviceAccount:my-project-id@appspot.gserviceaccount.com + role: roles/resourcemanager.organizationAdmin + - members: + - user:eve@example.com + role: roles/resourcemanager.organizationViewer + condition: + title: expirable access + description: Does not grant access after Sep 2020 + expression: request.time < timestamp('2020-10-01T00:00:00.000Z') + + For a description of IAM 
and its features, see the `IAM + developer's + guide `__. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + has_flattened_params = any([resource]) + if request is not None and has_flattened_params: + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + # The request isn't a proto-plus wrapped type, + # so it must be constructed via keyword expansion. + if isinstance(request, dict): + request = iam_policy.SetIamPolicyRequest(**request) + + elif not request: + request = iam_policy.SetIamPolicyRequest() + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if resource is not None: + request.resource = resource + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = self._transport._wrapped_methods[self._transport.set_iam_policy] + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("resource", request.resource),)), + ) + + # Send the request. + response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. + return response + + def get_iam_policy( + self, + request: iam_policy.GetIamPolicyRequest = None, + *, + resource: str = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> policy.Policy: + r"""Gets the access control policy for a resource. A ``NOT_FOUND`` + error is returned if the resource does not exist. An empty + policy is returned if the resource exists but does not have a + policy set on it. + + Supported resources are: + + - Tag templates. + - Entries. + - Entry groups. 
Note, this method cannot be used to manage + policies for BigQuery, Pub/Sub and any external Google Cloud + Platform resources synced to Data Catalog. + + Callers must have following Google IAM permission + + - ``datacatalog.tagTemplates.getIamPolicy`` to get policies on + tag templates. + - ``datacatalog.entries.getIamPolicy`` to get policies on + entries. + - ``datacatalog.entryGroups.getIamPolicy`` to get policies on + entry groups. + + Args: + request (:class:`~.iam_policy.GetIamPolicyRequest`): + The request object. Request message for `GetIamPolicy` + method. + resource (:class:`str`): + REQUIRED: The resource for which the + policy is being requested. See the + operation documentation for the + appropriate value for this field. + This corresponds to the ``resource`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.policy.Policy: + Defines an Identity and Access Management (IAM) policy. + It is used to specify access control policies for Cloud + Platform resources. + + A ``Policy`` is a collection of ``bindings``. A + ``binding`` binds one or more ``members`` to a single + ``role``. Members can be user accounts, service + accounts, Google groups, and domains (such as G Suite). + A ``role`` is a named list of permissions (defined by + IAM or configured by users). A ``binding`` can + optionally specify a ``condition``, which is a logic + expression that further constrains the role binding + based on attributes about the request and/or target + resource. 
+ + **JSON Example** + + :: + + { + "bindings": [ + { + "role": "roles/resourcemanager.organizationAdmin", + "members": [ + "user:mike@example.com", + "group:admins@example.com", + "domain:google.com", + "serviceAccount:my-project-id@appspot.gserviceaccount.com" + ] + }, + { + "role": "roles/resourcemanager.organizationViewer", + "members": ["user:eve@example.com"], + "condition": { + "title": "expirable access", + "description": "Does not grant access after Sep 2020", + "expression": "request.time < + timestamp('2020-10-01T00:00:00.000Z')", + } + } + ] + } + + **YAML Example** + + :: + + bindings: + - members: + - user:mike@example.com + - group:admins@example.com + - domain:google.com + - serviceAccount:my-project-id@appspot.gserviceaccount.com + role: roles/resourcemanager.organizationAdmin + - members: + - user:eve@example.com + role: roles/resourcemanager.organizationViewer + condition: + title: expirable access + description: Does not grant access after Sep 2020 + expression: request.time < timestamp('2020-10-01T00:00:00.000Z') + + For a description of IAM and its features, see the `IAM + developer's + guide `__. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + has_flattened_params = any([resource]) + if request is not None and has_flattened_params: + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + # The request isn't a proto-plus wrapped type, + # so it must be constructed via keyword expansion. + if isinstance(request, dict): + request = iam_policy.GetIamPolicyRequest(**request) + + elif not request: + request = iam_policy.GetIamPolicyRequest() + + # If we have keyword arguments corresponding to fields on the + # request, apply these. 
+ + if resource is not None: + request.resource = resource + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = self._transport._wrapped_methods[self._transport.get_iam_policy] + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("resource", request.resource),)), + ) + + # Send the request. + response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. + return response + + def test_iam_permissions( + self, + request: iam_policy.TestIamPermissionsRequest = None, + *, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> iam_policy.TestIamPermissionsResponse: + r"""Returns the caller's permissions on a resource. If the resource + does not exist, an empty set of permissions is returned (We + don't return a ``NOT_FOUND`` error). + + Supported resources are: + + - Tag templates. + - Entries. + - Entry groups. Note, this method cannot be used to manage + policies for BigQuery, Pub/Sub and any external Google Cloud + Platform resources synced to Data Catalog. + + A caller is not required to have Google IAM permission to make + this request. + + Args: + request (:class:`~.iam_policy.TestIamPermissionsRequest`): + The request object. Request message for + `TestIamPermissions` method. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.iam_policy.TestIamPermissionsResponse: + Response message for ``TestIamPermissions`` method. + """ + # Create or coerce a protobuf request object. 
+ + # The request isn't a proto-plus wrapped type, + # so it must be constructed via keyword expansion. + if isinstance(request, dict): + request = iam_policy.TestIamPermissionsRequest(**request) + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = self._transport._wrapped_methods[self._transport.test_iam_permissions] + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("resource", request.resource),)), + ) + + # Send the request. + response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. + return response + + +try: + _client_info = gapic_v1.client_info.ClientInfo( + gapic_version=pkg_resources.get_distribution( + "google-cloud-datacatalog", + ).version, + ) +except pkg_resources.DistributionNotFound: + _client_info = gapic_v1.client_info.ClientInfo() + + +__all__ = ("DataCatalogClient",) diff --git a/google/cloud/datacatalog_v1/services/data_catalog/pagers.py b/google/cloud/datacatalog_v1/services/data_catalog/pagers.py new file mode 100644 index 00000000..05c81bfd --- /dev/null +++ b/google/cloud/datacatalog_v1/services/data_catalog/pagers.py @@ -0,0 +1,534 @@ +# -*- coding: utf-8 -*- + +# Copyright 2020 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
+# + +from typing import Any, AsyncIterable, Awaitable, Callable, Iterable, Sequence, Tuple + +from google.cloud.datacatalog_v1.types import datacatalog +from google.cloud.datacatalog_v1.types import search +from google.cloud.datacatalog_v1.types import tags + + +class SearchCatalogPager: + """A pager for iterating through ``search_catalog`` requests. + + This class thinly wraps an initial + :class:`~.datacatalog.SearchCatalogResponse` object, and + provides an ``__iter__`` method to iterate through its + ``results`` field. + + If there are more pages, the ``__iter__`` method will make additional + ``SearchCatalog`` requests and continue to iterate + through the ``results`` field on the + corresponding responses. + + All the usual :class:`~.datacatalog.SearchCatalogResponse` + attributes are available on the pager. If multiple requests are made, only + the most recent response is retained, and thus used for attribute lookup. + """ + + def __init__( + self, + method: Callable[..., datacatalog.SearchCatalogResponse], + request: datacatalog.SearchCatalogRequest, + response: datacatalog.SearchCatalogResponse, + *, + metadata: Sequence[Tuple[str, str]] = () + ): + """Instantiate the pager. + + Args: + method (Callable): The method that was originally called, and + which instantiated this pager. + request (:class:`~.datacatalog.SearchCatalogRequest`): + The initial request object. + response (:class:`~.datacatalog.SearchCatalogResponse`): + The initial response object. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. 
+ """ + self._method = method + self._request = datacatalog.SearchCatalogRequest(request) + self._response = response + self._metadata = metadata + + def __getattr__(self, name: str) -> Any: + return getattr(self._response, name) + + @property + def pages(self) -> Iterable[datacatalog.SearchCatalogResponse]: + yield self._response + while self._response.next_page_token: + self._request.page_token = self._response.next_page_token + self._response = self._method(self._request, metadata=self._metadata) + yield self._response + + def __iter__(self) -> Iterable[search.SearchCatalogResult]: + for page in self.pages: + yield from page.results + + def __repr__(self) -> str: + return "{0}<{1!r}>".format(self.__class__.__name__, self._response) + + +class SearchCatalogAsyncPager: + """A pager for iterating through ``search_catalog`` requests. + + This class thinly wraps an initial + :class:`~.datacatalog.SearchCatalogResponse` object, and + provides an ``__aiter__`` method to iterate through its + ``results`` field. + + If there are more pages, the ``__aiter__`` method will make additional + ``SearchCatalog`` requests and continue to iterate + through the ``results`` field on the + corresponding responses. + + All the usual :class:`~.datacatalog.SearchCatalogResponse` + attributes are available on the pager. If multiple requests are made, only + the most recent response is retained, and thus used for attribute lookup. + """ + + def __init__( + self, + method: Callable[..., Awaitable[datacatalog.SearchCatalogResponse]], + request: datacatalog.SearchCatalogRequest, + response: datacatalog.SearchCatalogResponse, + *, + metadata: Sequence[Tuple[str, str]] = () + ): + """Instantiate the pager. + + Args: + method (Callable): The method that was originally called, and + which instantiated this pager. + request (:class:`~.datacatalog.SearchCatalogRequest`): + The initial request object. + response (:class:`~.datacatalog.SearchCatalogResponse`): + The initial response object. 
+ metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + """ + self._method = method + self._request = datacatalog.SearchCatalogRequest(request) + self._response = response + self._metadata = metadata + + def __getattr__(self, name: str) -> Any: + return getattr(self._response, name) + + @property + async def pages(self) -> AsyncIterable[datacatalog.SearchCatalogResponse]: + yield self._response + while self._response.next_page_token: + self._request.page_token = self._response.next_page_token + self._response = await self._method(self._request, metadata=self._metadata) + yield self._response + + def __aiter__(self) -> AsyncIterable[search.SearchCatalogResult]: + async def async_generator(): + async for page in self.pages: + for response in page.results: + yield response + + return async_generator() + + def __repr__(self) -> str: + return "{0}<{1!r}>".format(self.__class__.__name__, self._response) + + +class ListEntryGroupsPager: + """A pager for iterating through ``list_entry_groups`` requests. + + This class thinly wraps an initial + :class:`~.datacatalog.ListEntryGroupsResponse` object, and + provides an ``__iter__`` method to iterate through its + ``entry_groups`` field. + + If there are more pages, the ``__iter__`` method will make additional + ``ListEntryGroups`` requests and continue to iterate + through the ``entry_groups`` field on the + corresponding responses. + + All the usual :class:`~.datacatalog.ListEntryGroupsResponse` + attributes are available on the pager. If multiple requests are made, only + the most recent response is retained, and thus used for attribute lookup. + """ + + def __init__( + self, + method: Callable[..., datacatalog.ListEntryGroupsResponse], + request: datacatalog.ListEntryGroupsRequest, + response: datacatalog.ListEntryGroupsResponse, + *, + metadata: Sequence[Tuple[str, str]] = () + ): + """Instantiate the pager. 
+ + Args: + method (Callable): The method that was originally called, and + which instantiated this pager. + request (:class:`~.datacatalog.ListEntryGroupsRequest`): + The initial request object. + response (:class:`~.datacatalog.ListEntryGroupsResponse`): + The initial response object. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + """ + self._method = method + self._request = datacatalog.ListEntryGroupsRequest(request) + self._response = response + self._metadata = metadata + + def __getattr__(self, name: str) -> Any: + return getattr(self._response, name) + + @property + def pages(self) -> Iterable[datacatalog.ListEntryGroupsResponse]: + yield self._response + while self._response.next_page_token: + self._request.page_token = self._response.next_page_token + self._response = self._method(self._request, metadata=self._metadata) + yield self._response + + def __iter__(self) -> Iterable[datacatalog.EntryGroup]: + for page in self.pages: + yield from page.entry_groups + + def __repr__(self) -> str: + return "{0}<{1!r}>".format(self.__class__.__name__, self._response) + + +class ListEntryGroupsAsyncPager: + """A pager for iterating through ``list_entry_groups`` requests. + + This class thinly wraps an initial + :class:`~.datacatalog.ListEntryGroupsResponse` object, and + provides an ``__aiter__`` method to iterate through its + ``entry_groups`` field. + + If there are more pages, the ``__aiter__`` method will make additional + ``ListEntryGroups`` requests and continue to iterate + through the ``entry_groups`` field on the + corresponding responses. + + All the usual :class:`~.datacatalog.ListEntryGroupsResponse` + attributes are available on the pager. If multiple requests are made, only + the most recent response is retained, and thus used for attribute lookup. 
+ """ + + def __init__( + self, + method: Callable[..., Awaitable[datacatalog.ListEntryGroupsResponse]], + request: datacatalog.ListEntryGroupsRequest, + response: datacatalog.ListEntryGroupsResponse, + *, + metadata: Sequence[Tuple[str, str]] = () + ): + """Instantiate the pager. + + Args: + method (Callable): The method that was originally called, and + which instantiated this pager. + request (:class:`~.datacatalog.ListEntryGroupsRequest`): + The initial request object. + response (:class:`~.datacatalog.ListEntryGroupsResponse`): + The initial response object. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + """ + self._method = method + self._request = datacatalog.ListEntryGroupsRequest(request) + self._response = response + self._metadata = metadata + + def __getattr__(self, name: str) -> Any: + return getattr(self._response, name) + + @property + async def pages(self) -> AsyncIterable[datacatalog.ListEntryGroupsResponse]: + yield self._response + while self._response.next_page_token: + self._request.page_token = self._response.next_page_token + self._response = await self._method(self._request, metadata=self._metadata) + yield self._response + + def __aiter__(self) -> AsyncIterable[datacatalog.EntryGroup]: + async def async_generator(): + async for page in self.pages: + for response in page.entry_groups: + yield response + + return async_generator() + + def __repr__(self) -> str: + return "{0}<{1!r}>".format(self.__class__.__name__, self._response) + + +class ListEntriesPager: + """A pager for iterating through ``list_entries`` requests. + + This class thinly wraps an initial + :class:`~.datacatalog.ListEntriesResponse` object, and + provides an ``__iter__`` method to iterate through its + ``entries`` field. + + If there are more pages, the ``__iter__`` method will make additional + ``ListEntries`` requests and continue to iterate + through the ``entries`` field on the + corresponding responses. 
+ + All the usual :class:`~.datacatalog.ListEntriesResponse` + attributes are available on the pager. If multiple requests are made, only + the most recent response is retained, and thus used for attribute lookup. + """ + + def __init__( + self, + method: Callable[..., datacatalog.ListEntriesResponse], + request: datacatalog.ListEntriesRequest, + response: datacatalog.ListEntriesResponse, + *, + metadata: Sequence[Tuple[str, str]] = () + ): + """Instantiate the pager. + + Args: + method (Callable): The method that was originally called, and + which instantiated this pager. + request (:class:`~.datacatalog.ListEntriesRequest`): + The initial request object. + response (:class:`~.datacatalog.ListEntriesResponse`): + The initial response object. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + """ + self._method = method + self._request = datacatalog.ListEntriesRequest(request) + self._response = response + self._metadata = metadata + + def __getattr__(self, name: str) -> Any: + return getattr(self._response, name) + + @property + def pages(self) -> Iterable[datacatalog.ListEntriesResponse]: + yield self._response + while self._response.next_page_token: + self._request.page_token = self._response.next_page_token + self._response = self._method(self._request, metadata=self._metadata) + yield self._response + + def __iter__(self) -> Iterable[datacatalog.Entry]: + for page in self.pages: + yield from page.entries + + def __repr__(self) -> str: + return "{0}<{1!r}>".format(self.__class__.__name__, self._response) + + +class ListEntriesAsyncPager: + """A pager for iterating through ``list_entries`` requests. + + This class thinly wraps an initial + :class:`~.datacatalog.ListEntriesResponse` object, and + provides an ``__aiter__`` method to iterate through its + ``entries`` field. 
+ + If there are more pages, the ``__aiter__`` method will make additional + ``ListEntries`` requests and continue to iterate + through the ``entries`` field on the + corresponding responses. + + All the usual :class:`~.datacatalog.ListEntriesResponse` + attributes are available on the pager. If multiple requests are made, only + the most recent response is retained, and thus used for attribute lookup. + """ + + def __init__( + self, + method: Callable[..., Awaitable[datacatalog.ListEntriesResponse]], + request: datacatalog.ListEntriesRequest, + response: datacatalog.ListEntriesResponse, + *, + metadata: Sequence[Tuple[str, str]] = () + ): + """Instantiate the pager. + + Args: + method (Callable): The method that was originally called, and + which instantiated this pager. + request (:class:`~.datacatalog.ListEntriesRequest`): + The initial request object. + response (:class:`~.datacatalog.ListEntriesResponse`): + The initial response object. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + """ + self._method = method + self._request = datacatalog.ListEntriesRequest(request) + self._response = response + self._metadata = metadata + + def __getattr__(self, name: str) -> Any: + return getattr(self._response, name) + + @property + async def pages(self) -> AsyncIterable[datacatalog.ListEntriesResponse]: + yield self._response + while self._response.next_page_token: + self._request.page_token = self._response.next_page_token + self._response = await self._method(self._request, metadata=self._metadata) + yield self._response + + def __aiter__(self) -> AsyncIterable[datacatalog.Entry]: + async def async_generator(): + async for page in self.pages: + for response in page.entries: + yield response + + return async_generator() + + def __repr__(self) -> str: + return "{0}<{1!r}>".format(self.__class__.__name__, self._response) + + +class ListTagsPager: + """A pager for iterating through ``list_tags`` requests. 
+ + This class thinly wraps an initial + :class:`~.datacatalog.ListTagsResponse` object, and + provides an ``__iter__`` method to iterate through its + ``tags`` field. + + If there are more pages, the ``__iter__`` method will make additional + ``ListTags`` requests and continue to iterate + through the ``tags`` field on the + corresponding responses. + + All the usual :class:`~.datacatalog.ListTagsResponse` + attributes are available on the pager. If multiple requests are made, only + the most recent response is retained, and thus used for attribute lookup. + """ + + def __init__( + self, + method: Callable[..., datacatalog.ListTagsResponse], + request: datacatalog.ListTagsRequest, + response: datacatalog.ListTagsResponse, + *, + metadata: Sequence[Tuple[str, str]] = () + ): + """Instantiate the pager. + + Args: + method (Callable): The method that was originally called, and + which instantiated this pager. + request (:class:`~.datacatalog.ListTagsRequest`): + The initial request object. + response (:class:`~.datacatalog.ListTagsResponse`): + The initial response object. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + """ + self._method = method + self._request = datacatalog.ListTagsRequest(request) + self._response = response + self._metadata = metadata + + def __getattr__(self, name: str) -> Any: + return getattr(self._response, name) + + @property + def pages(self) -> Iterable[datacatalog.ListTagsResponse]: + yield self._response + while self._response.next_page_token: + self._request.page_token = self._response.next_page_token + self._response = self._method(self._request, metadata=self._metadata) + yield self._response + + def __iter__(self) -> Iterable[tags.Tag]: + for page in self.pages: + yield from page.tags + + def __repr__(self) -> str: + return "{0}<{1!r}>".format(self.__class__.__name__, self._response) + + +class ListTagsAsyncPager: + """A pager for iterating through ``list_tags`` requests. 
+ + This class thinly wraps an initial + :class:`~.datacatalog.ListTagsResponse` object, and + provides an ``__aiter__`` method to iterate through its + ``tags`` field. + + If there are more pages, the ``__aiter__`` method will make additional + ``ListTags`` requests and continue to iterate + through the ``tags`` field on the + corresponding responses. + + All the usual :class:`~.datacatalog.ListTagsResponse` + attributes are available on the pager. If multiple requests are made, only + the most recent response is retained, and thus used for attribute lookup. + """ + + def __init__( + self, + method: Callable[..., Awaitable[datacatalog.ListTagsResponse]], + request: datacatalog.ListTagsRequest, + response: datacatalog.ListTagsResponse, + *, + metadata: Sequence[Tuple[str, str]] = () + ): + """Instantiate the pager. + + Args: + method (Callable): The method that was originally called, and + which instantiated this pager. + request (:class:`~.datacatalog.ListTagsRequest`): + The initial request object. + response (:class:`~.datacatalog.ListTagsResponse`): + The initial response object. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. 
+ """ + self._method = method + self._request = datacatalog.ListTagsRequest(request) + self._response = response + self._metadata = metadata + + def __getattr__(self, name: str) -> Any: + return getattr(self._response, name) + + @property + async def pages(self) -> AsyncIterable[datacatalog.ListTagsResponse]: + yield self._response + while self._response.next_page_token: + self._request.page_token = self._response.next_page_token + self._response = await self._method(self._request, metadata=self._metadata) + yield self._response + + def __aiter__(self) -> AsyncIterable[tags.Tag]: + async def async_generator(): + async for page in self.pages: + for response in page.tags: + yield response + + return async_generator() + + def __repr__(self) -> str: + return "{0}<{1!r}>".format(self.__class__.__name__, self._response) diff --git a/google/cloud/datacatalog_v1/services/data_catalog/transports/__init__.py b/google/cloud/datacatalog_v1/services/data_catalog/transports/__init__.py new file mode 100644 index 00000000..77a41a96 --- /dev/null +++ b/google/cloud/datacatalog_v1/services/data_catalog/transports/__init__.py @@ -0,0 +1,36 @@ +# -*- coding: utf-8 -*- + +# Copyright 2020 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
+# + +from collections import OrderedDict +from typing import Dict, Type + +from .base import DataCatalogTransport +from .grpc import DataCatalogGrpcTransport +from .grpc_asyncio import DataCatalogGrpcAsyncIOTransport + + +# Compile a registry of transports. +_transport_registry = OrderedDict() # type: Dict[str, Type[DataCatalogTransport]] +_transport_registry["grpc"] = DataCatalogGrpcTransport +_transport_registry["grpc_asyncio"] = DataCatalogGrpcAsyncIOTransport + + +__all__ = ( + "DataCatalogTransport", + "DataCatalogGrpcTransport", + "DataCatalogGrpcAsyncIOTransport", +) diff --git a/google/cloud/datacatalog_v1/services/data_catalog/transports/base.py b/google/cloud/datacatalog_v1/services/data_catalog/transports/base.py new file mode 100644 index 00000000..326b640d --- /dev/null +++ b/google/cloud/datacatalog_v1/services/data_catalog/transports/base.py @@ -0,0 +1,528 @@ +# -*- coding: utf-8 -*- + +# Copyright 2020 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
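The ``OrderedDict`` registry defined above is what lets the client layer resolve a transport class by name, falling back to the first registered entry when no name is given. A toy version, with hypothetical placeholder classes standing in for the real transports:

```python
from collections import OrderedDict

# Hypothetical stand-ins for the generated transport classes.
class GrpcTransport: ...
class GrpcAsyncIOTransport: ...

_transport_registry = OrderedDict()
_transport_registry["grpc"] = GrpcTransport
_transport_registry["grpc_asyncio"] = GrpcAsyncIOTransport

def get_transport_class(label=None):
    # A named transport if requested; otherwise the first registered one.
    if label:
        return _transport_registry[label]
    return next(iter(_transport_registry.values()))

print(get_transport_class().__name__)  # GrpcTransport
```

Using ``OrderedDict`` (rather than relying on plain-dict ordering) makes the "first registered wins" default explicit.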
+# + +import abc +import typing +import pkg_resources + +from google import auth +from google.api_core import exceptions # type: ignore +from google.api_core import gapic_v1 # type: ignore +from google.api_core import retry as retries # type: ignore +from google.auth import credentials # type: ignore + +from google.cloud.datacatalog_v1.types import datacatalog +from google.cloud.datacatalog_v1.types import tags +from google.iam.v1 import iam_policy_pb2 as iam_policy # type: ignore +from google.iam.v1 import policy_pb2 as policy # type: ignore +from google.protobuf import empty_pb2 as empty # type: ignore + + +try: + _client_info = gapic_v1.client_info.ClientInfo( + gapic_version=pkg_resources.get_distribution( + "google-cloud-datacatalog", + ).version, + ) +except pkg_resources.DistributionNotFound: + _client_info = gapic_v1.client_info.ClientInfo() + + +class DataCatalogTransport(abc.ABC): + """Abstract transport class for DataCatalog.""" + + AUTH_SCOPES = ("https://www.googleapis.com/auth/cloud-platform",) + + def __init__( + self, + *, + host: str = "datacatalog.googleapis.com", + credentials: credentials.Credentials = None, + credentials_file: typing.Optional[str] = None, + scopes: typing.Optional[typing.Sequence[str]] = AUTH_SCOPES, + quota_project_id: typing.Optional[str] = None, + **kwargs, + ) -> None: + """Instantiate the transport. + + Args: + host (Optional[str]): The hostname to connect to. + credentials (Optional[google.auth.credentials.Credentials]): The + authorization credentials to attach to requests. These + credentials identify the application to the service; if none + are specified, the client will attempt to ascertain the + credentials from the environment. + credentials_file (Optional[str]): A file with credentials that can + be loaded with :func:`google.auth.load_credentials_from_file`. + This argument is mutually exclusive with credentials. + scopes (Optional[Sequence[str]]): A list of scopes. 
+ quota_project_id (Optional[str]): An optional project to use for billing + and quota. + """ + # Save the hostname. Default to port 443 (HTTPS) if none is specified. + if ":" not in host: + host += ":443" + self._host = host + + # If no credentials are provided, then determine the appropriate + # defaults. + if credentials and credentials_file: + raise exceptions.DuplicateCredentialArgs( + "'credentials_file' and 'credentials' are mutually exclusive" + ) + + if credentials_file is not None: + credentials, _ = auth.load_credentials_from_file( + credentials_file, scopes=scopes, quota_project_id=quota_project_id + ) + + elif credentials is None: + credentials, _ = auth.default( + scopes=scopes, quota_project_id=quota_project_id + ) + + # Save the credentials. + self._credentials = credentials + + # Lifted into its own function so it can be stubbed out during tests. + self._prep_wrapped_messages() + + def _prep_wrapped_messages(self): + # Precompute the wrapped methods. + self._wrapped_methods = { + self.search_catalog: gapic_v1.method.wrap_method( + self.search_catalog, + default_retry=retries.Retry( + initial=0.1, + maximum=60.0, + multiplier=1.3, + predicate=retries.if_exception_type(exceptions.ServiceUnavailable,), + ), + default_timeout=60.0, + client_info=_client_info, + ), + self.create_entry_group: gapic_v1.method.wrap_method( + self.create_entry_group, default_timeout=None, client_info=_client_info, + ), + self.get_entry_group: gapic_v1.method.wrap_method( + self.get_entry_group, + default_retry=retries.Retry( + initial=0.1, + maximum=60.0, + multiplier=1.3, + predicate=retries.if_exception_type(exceptions.ServiceUnavailable,), + ), + default_timeout=60.0, + client_info=_client_info, + ), + self.update_entry_group: gapic_v1.method.wrap_method( + self.update_entry_group, default_timeout=None, client_info=_client_info, + ), + self.delete_entry_group: gapic_v1.method.wrap_method( + self.delete_entry_group, default_timeout=None, client_info=_client_info, + ), + 
self.list_entry_groups: gapic_v1.method.wrap_method( + self.list_entry_groups, + default_retry=retries.Retry( + initial=0.1, + maximum=60.0, + multiplier=1.3, + predicate=retries.if_exception_type(exceptions.ServiceUnavailable,), + ), + default_timeout=60.0, + client_info=_client_info, + ), + self.create_entry: gapic_v1.method.wrap_method( + self.create_entry, default_timeout=None, client_info=_client_info, + ), + self.update_entry: gapic_v1.method.wrap_method( + self.update_entry, default_timeout=None, client_info=_client_info, + ), + self.delete_entry: gapic_v1.method.wrap_method( + self.delete_entry, default_timeout=None, client_info=_client_info, + ), + self.get_entry: gapic_v1.method.wrap_method( + self.get_entry, + default_retry=retries.Retry( + initial=0.1, + maximum=60.0, + multiplier=1.3, + predicate=retries.if_exception_type(exceptions.ServiceUnavailable,), + ), + default_timeout=60.0, + client_info=_client_info, + ), + self.lookup_entry: gapic_v1.method.wrap_method( + self.lookup_entry, + default_retry=retries.Retry( + initial=0.1, + maximum=60.0, + multiplier=1.3, + predicate=retries.if_exception_type(exceptions.ServiceUnavailable,), + ), + default_timeout=60.0, + client_info=_client_info, + ), + self.list_entries: gapic_v1.method.wrap_method( + self.list_entries, + default_retry=retries.Retry( + initial=0.1, + maximum=60.0, + multiplier=1.3, + predicate=retries.if_exception_type(exceptions.ServiceUnavailable,), + ), + default_timeout=60.0, + client_info=_client_info, + ), + self.create_tag_template: gapic_v1.method.wrap_method( + self.create_tag_template, + default_timeout=None, + client_info=_client_info, + ), + self.get_tag_template: gapic_v1.method.wrap_method( + self.get_tag_template, default_timeout=None, client_info=_client_info, + ), + self.update_tag_template: gapic_v1.method.wrap_method( + self.update_tag_template, + default_timeout=None, + client_info=_client_info, + ), + self.delete_tag_template: gapic_v1.method.wrap_method( + 
self.delete_tag_template, + default_timeout=None, + client_info=_client_info, + ), + self.create_tag_template_field: gapic_v1.method.wrap_method( + self.create_tag_template_field, + default_timeout=None, + client_info=_client_info, + ), + self.update_tag_template_field: gapic_v1.method.wrap_method( + self.update_tag_template_field, + default_timeout=None, + client_info=_client_info, + ), + self.rename_tag_template_field: gapic_v1.method.wrap_method( + self.rename_tag_template_field, + default_timeout=None, + client_info=_client_info, + ), + self.delete_tag_template_field: gapic_v1.method.wrap_method( + self.delete_tag_template_field, + default_timeout=None, + client_info=_client_info, + ), + self.create_tag: gapic_v1.method.wrap_method( + self.create_tag, default_timeout=None, client_info=_client_info, + ), + self.update_tag: gapic_v1.method.wrap_method( + self.update_tag, default_timeout=None, client_info=_client_info, + ), + self.delete_tag: gapic_v1.method.wrap_method( + self.delete_tag, default_timeout=None, client_info=_client_info, + ), + self.list_tags: gapic_v1.method.wrap_method( + self.list_tags, + default_retry=retries.Retry( + initial=0.1, + maximum=60.0, + multiplier=1.3, + predicate=retries.if_exception_type(exceptions.ServiceUnavailable,), + ), + default_timeout=60.0, + client_info=_client_info, + ), + self.set_iam_policy: gapic_v1.method.wrap_method( + self.set_iam_policy, default_timeout=None, client_info=_client_info, + ), + self.get_iam_policy: gapic_v1.method.wrap_method( + self.get_iam_policy, + default_retry=retries.Retry( + initial=0.1, + maximum=60.0, + multiplier=1.3, + predicate=retries.if_exception_type(exceptions.ServiceUnavailable,), + ), + default_timeout=60.0, + client_info=_client_info, + ), + self.test_iam_permissions: gapic_v1.method.wrap_method( + self.test_iam_permissions, + default_timeout=None, + client_info=_client_info, + ), + } + + @property + def search_catalog( + self, + ) -> typing.Callable[ + 
[datacatalog.SearchCatalogRequest], + typing.Union[ + datacatalog.SearchCatalogResponse, + typing.Awaitable[datacatalog.SearchCatalogResponse], + ], + ]: + raise NotImplementedError() + + @property + def create_entry_group( + self, + ) -> typing.Callable[ + [datacatalog.CreateEntryGroupRequest], + typing.Union[datacatalog.EntryGroup, typing.Awaitable[datacatalog.EntryGroup]], + ]: + raise NotImplementedError() + + @property + def get_entry_group( + self, + ) -> typing.Callable[ + [datacatalog.GetEntryGroupRequest], + typing.Union[datacatalog.EntryGroup, typing.Awaitable[datacatalog.EntryGroup]], + ]: + raise NotImplementedError() + + @property + def update_entry_group( + self, + ) -> typing.Callable[ + [datacatalog.UpdateEntryGroupRequest], + typing.Union[datacatalog.EntryGroup, typing.Awaitable[datacatalog.EntryGroup]], + ]: + raise NotImplementedError() + + @property + def delete_entry_group( + self, + ) -> typing.Callable[ + [datacatalog.DeleteEntryGroupRequest], + typing.Union[empty.Empty, typing.Awaitable[empty.Empty]], + ]: + raise NotImplementedError() + + @property + def list_entry_groups( + self, + ) -> typing.Callable[ + [datacatalog.ListEntryGroupsRequest], + typing.Union[ + datacatalog.ListEntryGroupsResponse, + typing.Awaitable[datacatalog.ListEntryGroupsResponse], + ], + ]: + raise NotImplementedError() + + @property + def create_entry( + self, + ) -> typing.Callable[ + [datacatalog.CreateEntryRequest], + typing.Union[datacatalog.Entry, typing.Awaitable[datacatalog.Entry]], + ]: + raise NotImplementedError() + + @property + def update_entry( + self, + ) -> typing.Callable[ + [datacatalog.UpdateEntryRequest], + typing.Union[datacatalog.Entry, typing.Awaitable[datacatalog.Entry]], + ]: + raise NotImplementedError() + + @property + def delete_entry( + self, + ) -> typing.Callable[ + [datacatalog.DeleteEntryRequest], + typing.Union[empty.Empty, typing.Awaitable[empty.Empty]], + ]: + raise NotImplementedError() + + @property + def get_entry( + self, + ) -> 
typing.Callable[ + [datacatalog.GetEntryRequest], + typing.Union[datacatalog.Entry, typing.Awaitable[datacatalog.Entry]], + ]: + raise NotImplementedError() + + @property + def lookup_entry( + self, + ) -> typing.Callable[ + [datacatalog.LookupEntryRequest], + typing.Union[datacatalog.Entry, typing.Awaitable[datacatalog.Entry]], + ]: + raise NotImplementedError() + + @property + def list_entries( + self, + ) -> typing.Callable[ + [datacatalog.ListEntriesRequest], + typing.Union[ + datacatalog.ListEntriesResponse, + typing.Awaitable[datacatalog.ListEntriesResponse], + ], + ]: + raise NotImplementedError() + + @property + def create_tag_template( + self, + ) -> typing.Callable[ + [datacatalog.CreateTagTemplateRequest], + typing.Union[tags.TagTemplate, typing.Awaitable[tags.TagTemplate]], + ]: + raise NotImplementedError() + + @property + def get_tag_template( + self, + ) -> typing.Callable[ + [datacatalog.GetTagTemplateRequest], + typing.Union[tags.TagTemplate, typing.Awaitable[tags.TagTemplate]], + ]: + raise NotImplementedError() + + @property + def update_tag_template( + self, + ) -> typing.Callable[ + [datacatalog.UpdateTagTemplateRequest], + typing.Union[tags.TagTemplate, typing.Awaitable[tags.TagTemplate]], + ]: + raise NotImplementedError() + + @property + def delete_tag_template( + self, + ) -> typing.Callable[ + [datacatalog.DeleteTagTemplateRequest], + typing.Union[empty.Empty, typing.Awaitable[empty.Empty]], + ]: + raise NotImplementedError() + + @property + def create_tag_template_field( + self, + ) -> typing.Callable[ + [datacatalog.CreateTagTemplateFieldRequest], + typing.Union[tags.TagTemplateField, typing.Awaitable[tags.TagTemplateField]], + ]: + raise NotImplementedError() + + @property + def update_tag_template_field( + self, + ) -> typing.Callable[ + [datacatalog.UpdateTagTemplateFieldRequest], + typing.Union[tags.TagTemplateField, typing.Awaitable[tags.TagTemplateField]], + ]: + raise NotImplementedError() + + @property + def 
rename_tag_template_field( + self, + ) -> typing.Callable[ + [datacatalog.RenameTagTemplateFieldRequest], + typing.Union[tags.TagTemplateField, typing.Awaitable[tags.TagTemplateField]], + ]: + raise NotImplementedError() + + @property + def delete_tag_template_field( + self, + ) -> typing.Callable[ + [datacatalog.DeleteTagTemplateFieldRequest], + typing.Union[empty.Empty, typing.Awaitable[empty.Empty]], + ]: + raise NotImplementedError() + + @property + def create_tag( + self, + ) -> typing.Callable[ + [datacatalog.CreateTagRequest], + typing.Union[tags.Tag, typing.Awaitable[tags.Tag]], + ]: + raise NotImplementedError() + + @property + def update_tag( + self, + ) -> typing.Callable[ + [datacatalog.UpdateTagRequest], + typing.Union[tags.Tag, typing.Awaitable[tags.Tag]], + ]: + raise NotImplementedError() + + @property + def delete_tag( + self, + ) -> typing.Callable[ + [datacatalog.DeleteTagRequest], + typing.Union[empty.Empty, typing.Awaitable[empty.Empty]], + ]: + raise NotImplementedError() + + @property + def list_tags( + self, + ) -> typing.Callable[ + [datacatalog.ListTagsRequest], + typing.Union[ + datacatalog.ListTagsResponse, typing.Awaitable[datacatalog.ListTagsResponse] + ], + ]: + raise NotImplementedError() + + @property + def set_iam_policy( + self, + ) -> typing.Callable[ + [iam_policy.SetIamPolicyRequest], + typing.Union[policy.Policy, typing.Awaitable[policy.Policy]], + ]: + raise NotImplementedError() + + @property + def get_iam_policy( + self, + ) -> typing.Callable[ + [iam_policy.GetIamPolicyRequest], + typing.Union[policy.Policy, typing.Awaitable[policy.Policy]], + ]: + raise NotImplementedError() + + @property + def test_iam_permissions( + self, + ) -> typing.Callable[ + [iam_policy.TestIamPermissionsRequest], + typing.Union[ + iam_policy.TestIamPermissionsResponse, + typing.Awaitable[iam_policy.TestIamPermissionsResponse], + ], + ]: + raise NotImplementedError() + + +__all__ = ("DataCatalogTransport",) diff --git 
a/google/cloud/datacatalog_v1/services/data_catalog/transports/grpc.py b/google/cloud/datacatalog_v1/services/data_catalog/transports/grpc.py new file mode 100644 index 00000000..9de2ca50 --- /dev/null +++ b/google/cloud/datacatalog_v1/services/data_catalog/transports/grpc.py @@ -0,0 +1,1065 @@ +# -*- coding: utf-8 -*- + +# Copyright 2020 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +# + +from typing import Callable, Dict, Optional, Sequence, Tuple + +from google.api_core import grpc_helpers # type: ignore +from google import auth # type: ignore +from google.auth import credentials # type: ignore +from google.auth.transport.grpc import SslCredentials # type: ignore + + +import grpc # type: ignore + +from google.cloud.datacatalog_v1.types import datacatalog +from google.cloud.datacatalog_v1.types import tags +from google.iam.v1 import iam_policy_pb2 as iam_policy # type: ignore +from google.iam.v1 import policy_pb2 as policy # type: ignore +from google.protobuf import empty_pb2 as empty # type: ignore + +from .base import DataCatalogTransport + + +class DataCatalogGrpcTransport(DataCatalogTransport): + """gRPC backend transport for DataCatalog. + + Data Catalog API service allows clients to discover, + understand, and manage their data. + + This class defines the same methods as the primary client, so the + primary client can load the underlying transport implementation + and call it. 
+ + It sends protocol buffers over the wire using gRPC (which is built on + top of HTTP/2); the ``grpcio`` package must be installed. + """ + + _stubs: Dict[str, Callable] + + def __init__( + self, + *, + host: str = "datacatalog.googleapis.com", + credentials: credentials.Credentials = None, + credentials_file: str = None, + scopes: Sequence[str] = None, + channel: grpc.Channel = None, + api_mtls_endpoint: str = None, + client_cert_source: Callable[[], Tuple[bytes, bytes]] = None, + quota_project_id: Optional[str] = None + ) -> None: + """Instantiate the transport. + + Args: + host (Optional[str]): The hostname to connect to. + credentials (Optional[google.auth.credentials.Credentials]): The + authorization credentials to attach to requests. These + credentials identify the application to the service; if none + are specified, the client will attempt to ascertain the + credentials from the environment. + This argument is ignored if ``channel`` is provided. + credentials_file (Optional[str]): A file with credentials that can + be loaded with :func:`google.auth.load_credentials_from_file`. + This argument is ignored if ``channel`` is provided. + scopes (Optional[Sequence[str]]): A list of scopes. This argument is + ignored if ``channel`` is provided. + channel (Optional[grpc.Channel]): A ``Channel`` instance through + which to make calls. + api_mtls_endpoint (Optional[str]): The mutual TLS endpoint. If + provided, it overrides the ``host`` argument and tries to create + a mutual TLS channel with client SSL credentials from + ``client_cert_source`` or application default SSL credentials. + client_cert_source (Optional[Callable[[], Tuple[bytes, bytes]]]): A + callback to provide client SSL certificate bytes and private key + bytes, both in PEM format. It is ignored if ``api_mtls_endpoint`` + is None. + quota_project_id (Optional[str]): An optional project to use for billing + and quota. 
+ + Raises: + google.auth.exceptions.MutualTLSChannelError: If mutual TLS transport + creation failed for any reason. + google.api_core.exceptions.DuplicateCredentialArgs: If both ``credentials`` + and ``credentials_file`` are passed. + """ + if channel: + # Sanity check: Ensure that channel and credentials are not both + # provided. + credentials = False + + # If a channel was explicitly provided, set it. + self._grpc_channel = channel + elif api_mtls_endpoint: + host = ( + api_mtls_endpoint + if ":" in api_mtls_endpoint + else api_mtls_endpoint + ":443" + ) + + if credentials is None: + credentials, _ = auth.default( + scopes=self.AUTH_SCOPES, quota_project_id=quota_project_id + ) + + # Create SSL credentials with client_cert_source or application + # default SSL credentials. + if client_cert_source: + cert, key = client_cert_source() + ssl_credentials = grpc.ssl_channel_credentials( + certificate_chain=cert, private_key=key + ) + else: + ssl_credentials = SslCredentials().ssl_credentials + + # Create a new channel. The provided one is ignored. + self._grpc_channel = type(self).create_channel( + host, + credentials=credentials, + credentials_file=credentials_file, + ssl_credentials=ssl_credentials, + scopes=scopes or self.AUTH_SCOPES, + quota_project_id=quota_project_id, + ) + + self._stubs = {} # type: Dict[str, Callable] + + # Run the base constructor. + super().__init__( + host=host, + credentials=credentials, + credentials_file=credentials_file, + scopes=scopes or self.AUTH_SCOPES, + quota_project_id=quota_project_id, + ) + + @classmethod + def create_channel( + cls, + host: str = "datacatalog.googleapis.com", + credentials: credentials.Credentials = None, + credentials_file: str = None, + scopes: Optional[Sequence[str]] = None, + quota_project_id: Optional[str] = None, + **kwargs + ) -> grpc.Channel: + """Create and return a gRPC channel object. + Args: + host (Optional[str]): The host for the channel to use. 
+ credentials (Optional[~.Credentials]): The + authorization credentials to attach to requests. These + credentials identify this application to the service. If + none are specified, the client will attempt to ascertain + the credentials from the environment. + credentials_file (Optional[str]): A file with credentials that can + be loaded with :func:`google.auth.load_credentials_from_file`. + This argument is mutually exclusive with credentials. + scopes (Optional[Sequence[str]]): An optional list of scopes needed for this + service. These are only used when credentials are not specified and + are passed to :func:`google.auth.default`. + quota_project_id (Optional[str]): An optional project to use for billing + and quota. + kwargs (Optional[dict]): Keyword arguments, which are passed to the + channel creation. + Returns: + grpc.Channel: A gRPC channel object. + + Raises: + google.api_core.exceptions.DuplicateCredentialArgs: If both ``credentials`` + and ``credentials_file`` are passed. + """ + scopes = scopes or cls.AUTH_SCOPES + return grpc_helpers.create_channel( + host, + credentials=credentials, + credentials_file=credentials_file, + scopes=scopes, + quota_project_id=quota_project_id, + **kwargs + ) + + @property + def grpc_channel(self) -> grpc.Channel: + """Create the channel designed to connect to this service. + + This property caches on the instance; repeated calls return + the same channel. + """ + # Sanity check: Only create a new channel if we do not already + # have one. + if not hasattr(self, "_grpc_channel"): + self._grpc_channel = self.create_channel( + self._host, credentials=self._credentials, + ) + + # Return the channel from cache. + return self._grpc_channel + + @property + def search_catalog( + self, + ) -> Callable[ + [datacatalog.SearchCatalogRequest], datacatalog.SearchCatalogResponse + ]: + r"""Return a callable for the search catalog method over gRPC. + + Searches Data Catalog for multiple resources like entries and tags + that match a query. 
+ + This is a custom method + (https://cloud.google.com/apis/design/custom_methods) and does + not return the complete resource, only the resource identifier + and high-level fields. Clients can subsequently call ``Get`` + methods. + + Note that Data Catalog search queries do not guarantee full + recall. Query results that match your query may not be returned, + even in subsequent result pages. Also note that results returned + (and not returned) can vary across repeated search queries. + + See `Data Catalog Search + Syntax `__ + for more information. + + Returns: + Callable[[~.SearchCatalogRequest], + ~.SearchCatalogResponse]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "search_catalog" not in self._stubs: + self._stubs["search_catalog"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1.DataCatalog/SearchCatalog", + request_serializer=datacatalog.SearchCatalogRequest.serialize, + response_deserializer=datacatalog.SearchCatalogResponse.deserialize, + ) + return self._stubs["search_catalog"] + + @property + def create_entry_group( + self, + ) -> Callable[[datacatalog.CreateEntryGroupRequest], datacatalog.EntryGroup]: + r"""Return a callable for the create entry group method over gRPC. + + Creates an EntryGroup. + + An entry group contains logically related entries together with + Cloud Identity and Access Management policies that specify the + users who can create, edit, and view entries within the entry + group. + + Data Catalog automatically creates an entry group for BigQuery + entries ("@bigquery") and Pub/Sub topics ("@pubsub"). Users + create their own entry group to contain Cloud Storage fileset + entries or custom type entries, and the IAM policies associated + with those entries. 
Entry groups, like entries, can be searched. + + A maximum of 10,000 entry groups may be created per organization + across all locations. + + Users should enable the Data Catalog API in the project + identified by the ``parent`` parameter (see [Data Catalog + Resource Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + Returns: + Callable[[~.CreateEntryGroupRequest], + ~.EntryGroup]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "create_entry_group" not in self._stubs: + self._stubs["create_entry_group"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1.DataCatalog/CreateEntryGroup", + request_serializer=datacatalog.CreateEntryGroupRequest.serialize, + response_deserializer=datacatalog.EntryGroup.deserialize, + ) + return self._stubs["create_entry_group"] + + @property + def get_entry_group( + self, + ) -> Callable[[datacatalog.GetEntryGroupRequest], datacatalog.EntryGroup]: + r"""Return a callable for the get entry group method over gRPC. + + Gets an EntryGroup. + + Returns: + Callable[[~.GetEntryGroupRequest], + ~.EntryGroup]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. 
+ if "get_entry_group" not in self._stubs: + self._stubs["get_entry_group"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1.DataCatalog/GetEntryGroup", + request_serializer=datacatalog.GetEntryGroupRequest.serialize, + response_deserializer=datacatalog.EntryGroup.deserialize, + ) + return self._stubs["get_entry_group"] + + @property + def update_entry_group( + self, + ) -> Callable[[datacatalog.UpdateEntryGroupRequest], datacatalog.EntryGroup]: + r"""Return a callable for the update entry group method over gRPC. + + Updates an EntryGroup. The user should enable the Data Catalog + API in the project identified by the ``entry_group.name`` + parameter (see [Data Catalog Resource Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + Returns: + Callable[[~.UpdateEntryGroupRequest], + ~.EntryGroup]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "update_entry_group" not in self._stubs: + self._stubs["update_entry_group"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1.DataCatalog/UpdateEntryGroup", + request_serializer=datacatalog.UpdateEntryGroupRequest.serialize, + response_deserializer=datacatalog.EntryGroup.deserialize, + ) + return self._stubs["update_entry_group"] + + @property + def delete_entry_group( + self, + ) -> Callable[[datacatalog.DeleteEntryGroupRequest], empty.Empty]: + r"""Return a callable for the delete entry group method over gRPC. + + Deletes an EntryGroup. Only entry groups that do not contain + entries can be deleted. 
Users should enable the Data Catalog API + in the project identified by the ``name`` parameter (see [Data + Catalog Resource Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + Returns: + Callable[[~.DeleteEntryGroupRequest], + ~.Empty]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "delete_entry_group" not in self._stubs: + self._stubs["delete_entry_group"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1.DataCatalog/DeleteEntryGroup", + request_serializer=datacatalog.DeleteEntryGroupRequest.serialize, + response_deserializer=empty.Empty.FromString, + ) + return self._stubs["delete_entry_group"] + + @property + def list_entry_groups( + self, + ) -> Callable[ + [datacatalog.ListEntryGroupsRequest], datacatalog.ListEntryGroupsResponse + ]: + r"""Return a callable for the list entry groups method over gRPC. + + Lists entry groups. + + Returns: + Callable[[~.ListEntryGroupsRequest], + ~.ListEntryGroupsResponse]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. 
+ if "list_entry_groups" not in self._stubs: + self._stubs["list_entry_groups"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1.DataCatalog/ListEntryGroups", + request_serializer=datacatalog.ListEntryGroupsRequest.serialize, + response_deserializer=datacatalog.ListEntryGroupsResponse.deserialize, + ) + return self._stubs["list_entry_groups"] + + @property + def create_entry( + self, + ) -> Callable[[datacatalog.CreateEntryRequest], datacatalog.Entry]: + r"""Return a callable for the create entry method over gRPC. + + Creates an entry. Only entries of 'FILESET' type or + user-specified type can be created. + + Users should enable the Data Catalog API in the project + identified by the ``parent`` parameter (see [Data Catalog + Resource Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + A maximum of 100,000 entries may be created per entry group. + + Returns: + Callable[[~.CreateEntryRequest], + ~.Entry]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "create_entry" not in self._stubs: + self._stubs["create_entry"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1.DataCatalog/CreateEntry", + request_serializer=datacatalog.CreateEntryRequest.serialize, + response_deserializer=datacatalog.Entry.deserialize, + ) + return self._stubs["create_entry"] + + @property + def update_entry( + self, + ) -> Callable[[datacatalog.UpdateEntryRequest], datacatalog.Entry]: + r"""Return a callable for the update entry method over gRPC. + + Updates an existing entry. 
Users should enable the Data Catalog + API in the project identified by the ``entry.name`` parameter + (see [Data Catalog Resource Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + Returns: + Callable[[~.UpdateEntryRequest], + ~.Entry]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "update_entry" not in self._stubs: + self._stubs["update_entry"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1.DataCatalog/UpdateEntry", + request_serializer=datacatalog.UpdateEntryRequest.serialize, + response_deserializer=datacatalog.Entry.deserialize, + ) + return self._stubs["update_entry"] + + @property + def delete_entry(self) -> Callable[[datacatalog.DeleteEntryRequest], empty.Empty]: + r"""Return a callable for the delete entry method over gRPC. + + Deletes an existing entry. Only entries created through + [CreateEntry][google.cloud.datacatalog.v1.DataCatalog.CreateEntry] + method can be deleted. Users should enable the Data Catalog API + in the project identified by the ``name`` parameter (see [Data + Catalog Resource Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + Returns: + Callable[[~.DeleteEntryRequest], + ~.Empty]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. 
+ if "delete_entry" not in self._stubs: + self._stubs["delete_entry"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1.DataCatalog/DeleteEntry", + request_serializer=datacatalog.DeleteEntryRequest.serialize, + response_deserializer=empty.Empty.FromString, + ) + return self._stubs["delete_entry"] + + @property + def get_entry(self) -> Callable[[datacatalog.GetEntryRequest], datacatalog.Entry]: + r"""Return a callable for the get entry method over gRPC. + + Gets an entry. + + Returns: + Callable[[~.GetEntryRequest], + ~.Entry]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "get_entry" not in self._stubs: + self._stubs["get_entry"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1.DataCatalog/GetEntry", + request_serializer=datacatalog.GetEntryRequest.serialize, + response_deserializer=datacatalog.Entry.deserialize, + ) + return self._stubs["get_entry"] + + @property + def lookup_entry( + self, + ) -> Callable[[datacatalog.LookupEntryRequest], datacatalog.Entry]: + r"""Return a callable for the lookup entry method over gRPC. + + Get an entry by target resource name. This method + allows clients to use the resource name from the source + Google Cloud Platform service to get the Data Catalog + Entry. + + Returns: + Callable[[~.LookupEntryRequest], + ~.Entry]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. 
+ if "lookup_entry" not in self._stubs: + self._stubs["lookup_entry"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1.DataCatalog/LookupEntry", + request_serializer=datacatalog.LookupEntryRequest.serialize, + response_deserializer=datacatalog.Entry.deserialize, + ) + return self._stubs["lookup_entry"] + + @property + def list_entries( + self, + ) -> Callable[[datacatalog.ListEntriesRequest], datacatalog.ListEntriesResponse]: + r"""Return a callable for the list entries method over gRPC. + + Lists entries. + + Returns: + Callable[[~.ListEntriesRequest], + ~.ListEntriesResponse]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "list_entries" not in self._stubs: + self._stubs["list_entries"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1.DataCatalog/ListEntries", + request_serializer=datacatalog.ListEntriesRequest.serialize, + response_deserializer=datacatalog.ListEntriesResponse.deserialize, + ) + return self._stubs["list_entries"] + + @property + def create_tag_template( + self, + ) -> Callable[[datacatalog.CreateTagTemplateRequest], tags.TagTemplate]: + r"""Return a callable for the create tag template method over gRPC. + + Creates a tag template. The user should enable the Data Catalog + API in the project identified by the ``parent`` parameter (see + `Data Catalog Resource + Project `__ + for more information). + + Returns: + Callable[[~.CreateTagTemplateRequest], + ~.TagTemplate]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. 
+ if "create_tag_template" not in self._stubs: + self._stubs["create_tag_template"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1.DataCatalog/CreateTagTemplate", + request_serializer=datacatalog.CreateTagTemplateRequest.serialize, + response_deserializer=tags.TagTemplate.deserialize, + ) + return self._stubs["create_tag_template"] + + @property + def get_tag_template( + self, + ) -> Callable[[datacatalog.GetTagTemplateRequest], tags.TagTemplate]: + r"""Return a callable for the get tag template method over gRPC. + + Gets a tag template. + + Returns: + Callable[[~.GetTagTemplateRequest], + ~.TagTemplate]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "get_tag_template" not in self._stubs: + self._stubs["get_tag_template"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1.DataCatalog/GetTagTemplate", + request_serializer=datacatalog.GetTagTemplateRequest.serialize, + response_deserializer=tags.TagTemplate.deserialize, + ) + return self._stubs["get_tag_template"] + + @property + def update_tag_template( + self, + ) -> Callable[[datacatalog.UpdateTagTemplateRequest], tags.TagTemplate]: + r"""Return a callable for the update tag template method over gRPC. + + Updates a tag template. This method cannot be used to update the + fields of a template. The tag template fields are represented as + separate resources and should be updated using their own + create/update/delete methods. Users should enable the Data + Catalog API in the project identified by the + ``tag_template.name`` parameter (see [Data Catalog Resource + Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). 
+ + Returns: + Callable[[~.UpdateTagTemplateRequest], + ~.TagTemplate]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "update_tag_template" not in self._stubs: + self._stubs["update_tag_template"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1.DataCatalog/UpdateTagTemplate", + request_serializer=datacatalog.UpdateTagTemplateRequest.serialize, + response_deserializer=tags.TagTemplate.deserialize, + ) + return self._stubs["update_tag_template"] + + @property + def delete_tag_template( + self, + ) -> Callable[[datacatalog.DeleteTagTemplateRequest], empty.Empty]: + r"""Return a callable for the delete tag template method over gRPC. + + Deletes a tag template and all tags using the template. Users + should enable the Data Catalog API in the project identified by + the ``name`` parameter (see [Data Catalog Resource Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + Returns: + Callable[[~.DeleteTagTemplateRequest], + ~.Empty]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. 
+ if "delete_tag_template" not in self._stubs: + self._stubs["delete_tag_template"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1.DataCatalog/DeleteTagTemplate", + request_serializer=datacatalog.DeleteTagTemplateRequest.serialize, + response_deserializer=empty.Empty.FromString, + ) + return self._stubs["delete_tag_template"] + + @property + def create_tag_template_field( + self, + ) -> Callable[[datacatalog.CreateTagTemplateFieldRequest], tags.TagTemplateField]: + r"""Return a callable for the create tag template field method over gRPC. + + Creates a field in a tag template. The user should enable the + Data Catalog API in the project identified by the ``parent`` + parameter (see `Data Catalog Resource + Project `__ + for more information). + + Returns: + Callable[[~.CreateTagTemplateFieldRequest], + ~.TagTemplateField]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "create_tag_template_field" not in self._stubs: + self._stubs["create_tag_template_field"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1.DataCatalog/CreateTagTemplateField", + request_serializer=datacatalog.CreateTagTemplateFieldRequest.serialize, + response_deserializer=tags.TagTemplateField.deserialize, + ) + return self._stubs["create_tag_template_field"] + + @property + def update_tag_template_field( + self, + ) -> Callable[[datacatalog.UpdateTagTemplateFieldRequest], tags.TagTemplateField]: + r"""Return a callable for the update tag template field method over gRPC. + + Updates a field in a tag template. This method cannot be used to + update the field type. 
Users should enable the Data Catalog API + in the project identified by the ``name`` parameter (see [Data + Catalog Resource Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + Returns: + Callable[[~.UpdateTagTemplateFieldRequest], + ~.TagTemplateField]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "update_tag_template_field" not in self._stubs: + self._stubs["update_tag_template_field"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1.DataCatalog/UpdateTagTemplateField", + request_serializer=datacatalog.UpdateTagTemplateFieldRequest.serialize, + response_deserializer=tags.TagTemplateField.deserialize, + ) + return self._stubs["update_tag_template_field"] + + @property + def rename_tag_template_field( + self, + ) -> Callable[[datacatalog.RenameTagTemplateFieldRequest], tags.TagTemplateField]: + r"""Return a callable for the rename tag template field method over gRPC. + + Renames a field in a tag template. The user should enable the + Data Catalog API in the project identified by the ``name`` + parameter (see `Data Catalog Resource + Project `__ + for more information). + + Returns: + Callable[[~.RenameTagTemplateFieldRequest], + ~.TagTemplateField]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. 
+ if "rename_tag_template_field" not in self._stubs: + self._stubs["rename_tag_template_field"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1.DataCatalog/RenameTagTemplateField", + request_serializer=datacatalog.RenameTagTemplateFieldRequest.serialize, + response_deserializer=tags.TagTemplateField.deserialize, + ) + return self._stubs["rename_tag_template_field"] + + @property + def delete_tag_template_field( + self, + ) -> Callable[[datacatalog.DeleteTagTemplateFieldRequest], empty.Empty]: + r"""Return a callable for the delete tag template field method over gRPC. + + Deletes a field in a tag template and all uses of that field. + Users should enable the Data Catalog API in the project + identified by the ``name`` parameter (see [Data Catalog Resource + Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + Returns: + Callable[[~.DeleteTagTemplateFieldRequest], + ~.Empty]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "delete_tag_template_field" not in self._stubs: + self._stubs["delete_tag_template_field"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1.DataCatalog/DeleteTagTemplateField", + request_serializer=datacatalog.DeleteTagTemplateFieldRequest.serialize, + response_deserializer=empty.Empty.FromString, + ) + return self._stubs["delete_tag_template_field"] + + @property + def create_tag(self) -> Callable[[datacatalog.CreateTagRequest], tags.Tag]: + r"""Return a callable for the create tag method over gRPC. + + Creates a tag on an [Entry][google.cloud.datacatalog.v1.Entry]. 
+ Note: The project identified by the ``parent`` parameter for the + `tag `__ + and the `tag + template `__ + used to create the tag must be from the same organization. + + Returns: + Callable[[~.CreateTagRequest], + ~.Tag]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "create_tag" not in self._stubs: + self._stubs["create_tag"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1.DataCatalog/CreateTag", + request_serializer=datacatalog.CreateTagRequest.serialize, + response_deserializer=tags.Tag.deserialize, + ) + return self._stubs["create_tag"] + + @property + def update_tag(self) -> Callable[[datacatalog.UpdateTagRequest], tags.Tag]: + r"""Return a callable for the update tag method over gRPC. + + Updates an existing tag. + + Returns: + Callable[[~.UpdateTagRequest], + ~.Tag]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "update_tag" not in self._stubs: + self._stubs["update_tag"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1.DataCatalog/UpdateTag", + request_serializer=datacatalog.UpdateTagRequest.serialize, + response_deserializer=tags.Tag.deserialize, + ) + return self._stubs["update_tag"] + + @property + def delete_tag(self) -> Callable[[datacatalog.DeleteTagRequest], empty.Empty]: + r"""Return a callable for the delete tag method over gRPC. + + Deletes a tag. + + Returns: + Callable[[~.DeleteTagRequest], + ~.Empty]: + A function that, when called, will call the underlying RPC + on the server. 
+ """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "delete_tag" not in self._stubs: + self._stubs["delete_tag"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1.DataCatalog/DeleteTag", + request_serializer=datacatalog.DeleteTagRequest.serialize, + response_deserializer=empty.Empty.FromString, + ) + return self._stubs["delete_tag"] + + @property + def list_tags( + self, + ) -> Callable[[datacatalog.ListTagsRequest], datacatalog.ListTagsResponse]: + r"""Return a callable for the list tags method over gRPC. + + Lists the tags on an [Entry][google.cloud.datacatalog.v1.Entry]. + + Returns: + Callable[[~.ListTagsRequest], + ~.ListTagsResponse]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "list_tags" not in self._stubs: + self._stubs["list_tags"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1.DataCatalog/ListTags", + request_serializer=datacatalog.ListTagsRequest.serialize, + response_deserializer=datacatalog.ListTagsResponse.deserialize, + ) + return self._stubs["list_tags"] + + @property + def set_iam_policy( + self, + ) -> Callable[[iam_policy.SetIamPolicyRequest], policy.Policy]: + r"""Return a callable for the set iam policy method over gRPC. + + Sets the access control policy for a resource. Replaces any + existing policy. Supported resources are: + + - Tag templates. + - Entries. + - Entry groups. Note, this method cannot be used to manage + policies for BigQuery, Pub/Sub and any external Google Cloud + Platform resources synced to Data Catalog. 
+ + Callers must have the following Google IAM permissions + + - ``datacatalog.tagTemplates.setIamPolicy`` to set policies on + tag templates. + - ``datacatalog.entries.setIamPolicy`` to set policies on + entries. + - ``datacatalog.entryGroups.setIamPolicy`` to set policies on + entry groups. + + Returns: + Callable[[~.SetIamPolicyRequest], + ~.Policy]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "set_iam_policy" not in self._stubs: + self._stubs["set_iam_policy"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1.DataCatalog/SetIamPolicy", + request_serializer=iam_policy.SetIamPolicyRequest.SerializeToString, + response_deserializer=policy.Policy.FromString, + ) + return self._stubs["set_iam_policy"] + + @property + def get_iam_policy( + self, + ) -> Callable[[iam_policy.GetIamPolicyRequest], policy.Policy]: + r"""Return a callable for the get iam policy method over gRPC. + + Gets the access control policy for a resource. A ``NOT_FOUND`` + error is returned if the resource does not exist. An empty + policy is returned if the resource exists but does not have a + policy set on it. + + Supported resources are: + + - Tag templates. + - Entries. + - Entry groups. Note, this method cannot be used to manage + policies for BigQuery, Pub/Sub and any external Google Cloud + Platform resources synced to Data Catalog. + + Callers must have the following Google IAM permissions + + - ``datacatalog.tagTemplates.getIamPolicy`` to get policies on + tag templates. + - ``datacatalog.entries.getIamPolicy`` to get policies on + entries. + - ``datacatalog.entryGroups.getIamPolicy`` to get policies on + entry groups.
+ + Returns: + Callable[[~.GetIamPolicyRequest], + ~.Policy]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "get_iam_policy" not in self._stubs: + self._stubs["get_iam_policy"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1.DataCatalog/GetIamPolicy", + request_serializer=iam_policy.GetIamPolicyRequest.SerializeToString, + response_deserializer=policy.Policy.FromString, + ) + return self._stubs["get_iam_policy"] + + @property + def test_iam_permissions( + self, + ) -> Callable[ + [iam_policy.TestIamPermissionsRequest], iam_policy.TestIamPermissionsResponse + ]: + r"""Return a callable for the test iam permissions method over gRPC. + + Returns the caller's permissions on a resource. If the resource + does not exist, an empty set of permissions is returned (we + don't return a ``NOT_FOUND`` error). + + Supported resources are: + + - Tag templates. + - Entries. + - Entry groups. Note, this method cannot be used to manage + policies for BigQuery, Pub/Sub and any external Google Cloud + Platform resources synced to Data Catalog. + + A caller is not required to have Google IAM permission to make + this request. + + Returns: + Callable[[~.TestIamPermissionsRequest], + ~.TestIamPermissionsResponse]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each.
+ if "test_iam_permissions" not in self._stubs: + self._stubs["test_iam_permissions"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1.DataCatalog/TestIamPermissions", + request_serializer=iam_policy.TestIamPermissionsRequest.SerializeToString, + response_deserializer=iam_policy.TestIamPermissionsResponse.FromString, + ) + return self._stubs["test_iam_permissions"] + + +__all__ = ("DataCatalogGrpcTransport",) diff --git a/google/cloud/datacatalog_v1/services/data_catalog/transports/grpc_asyncio.py b/google/cloud/datacatalog_v1/services/data_catalog/transports/grpc_asyncio.py new file mode 100644 index 00000000..24fdb5c9 --- /dev/null +++ b/google/cloud/datacatalog_v1/services/data_catalog/transports/grpc_asyncio.py @@ -0,0 +1,1086 @@ +# -*- coding: utf-8 -*- + +# Copyright 2020 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
+# + + from typing import Awaitable, Callable, Dict, Optional, Sequence, Tuple + + from google.api_core import grpc_helpers_async  # type: ignore + from google.auth import credentials  # type: ignore + from google.auth.transport.grpc import SslCredentials  # type: ignore + + import grpc  # type: ignore + from grpc.experimental import aio  # type: ignore + + from google.cloud.datacatalog_v1.types import datacatalog + from google.cloud.datacatalog_v1.types import tags + from google.iam.v1 import iam_policy_pb2 as iam_policy  # type: ignore + from google.iam.v1 import policy_pb2 as policy  # type: ignore + from google.protobuf import empty_pb2 as empty  # type: ignore + + from .base import DataCatalogTransport + from .grpc import DataCatalogGrpcTransport + + + class DataCatalogGrpcAsyncIOTransport(DataCatalogTransport): + """gRPC AsyncIO backend transport for DataCatalog. + + Data Catalog API service allows clients to discover, + understand, and manage their data. + + This class defines the same methods as the primary client, so the + primary client can load the underlying transport implementation + and call it. + + It sends protocol buffers over the wire using gRPC (which is built on + top of HTTP/2); the ``grpcio`` package must be installed. + """ + + _grpc_channel: aio.Channel + _stubs: Dict[str, Callable] = {} + + @classmethod + def create_channel( + cls, + host: str = "datacatalog.googleapis.com", + credentials: credentials.Credentials = None, + credentials_file: Optional[str] = None, + scopes: Optional[Sequence[str]] = None, + quota_project_id: Optional[str] = None, + **kwargs, + ) -> aio.Channel: + """Create and return a gRPC AsyncIO channel object. + Args: + host (Optional[str]): The host for the channel to use. + credentials (Optional[~.Credentials]): The + authorization credentials to attach to requests. These + credentials identify this application to the service. If + none are specified, the client will attempt to ascertain + the credentials from the environment.
+ credentials_file (Optional[str]): A file with credentials that can + be loaded with :func:`google.auth.load_credentials_from_file`. + This argument is ignored if ``channel`` is provided. + scopes (Optional[Sequence[str]]): An optional list of scopes needed for this + service. These are only used when credentials are not specified and + are passed to :func:`google.auth.default`. + quota_project_id (Optional[str]): An optional project to use for billing + and quota. + kwargs (Optional[dict]): Keyword arguments, which are passed to the + channel creation. + Returns: + aio.Channel: A gRPC AsyncIO channel object. + """ + scopes = scopes or cls.AUTH_SCOPES + return grpc_helpers_async.create_channel( + host, + credentials=credentials, + credentials_file=credentials_file, + scopes=scopes, + quota_project_id=quota_project_id, + **kwargs, + ) + + def __init__( + self, + *, + host: str = "datacatalog.googleapis.com", + credentials: credentials.Credentials = None, + credentials_file: Optional[str] = None, + scopes: Optional[Sequence[str]] = None, + channel: aio.Channel = None, + api_mtls_endpoint: str = None, + client_cert_source: Callable[[], Tuple[bytes, bytes]] = None, + quota_project_id=None, + ) -> None: + """Instantiate the transport. + + Args: + host (Optional[str]): The hostname to connect to. + credentials (Optional[google.auth.credentials.Credentials]): The + authorization credentials to attach to requests. These + credentials identify the application to the service; if none + are specified, the client will attempt to ascertain the + credentials from the environment. + This argument is ignored if ``channel`` is provided. + credentials_file (Optional[str]): A file with credentials that can + be loaded with :func:`google.auth.load_credentials_from_file`. + This argument is ignored if ``channel`` is provided. + scopes (Optional[Sequence[str]]): An optional list of scopes needed for this + service.
These are only used when credentials are not specified and + are passed to :func:`google.auth.default`. + channel (Optional[aio.Channel]): A ``Channel`` instance through + which to make calls. + api_mtls_endpoint (Optional[str]): The mutual TLS endpoint. If + provided, it overrides the ``host`` argument and tries to create + a mutual TLS channel with client SSL credentials from + ``client_cert_source`` or application default SSL credentials. + client_cert_source (Optional[Callable[[], Tuple[bytes, bytes]]]): A + callback to provide client SSL certificate bytes and private key + bytes, both in PEM format. It is ignored if ``api_mtls_endpoint`` + is None. + quota_project_id (Optional[str]): An optional project to use for billing + and quota. + + Raises: + google.auth.exceptions.MutualTlsChannelError: If mutual TLS transport + creation failed for any reason. + google.api_core.exceptions.DuplicateCredentialArgs: If both ``credentials`` + and ``credentials_file`` are passed. + """ + if channel: + # Sanity check: Ensure that channel and credentials are not both + # provided. + credentials = False + + # If a channel was explicitly provided, set it. + self._grpc_channel = channel + elif api_mtls_endpoint: + host = ( + api_mtls_endpoint + if ":" in api_mtls_endpoint + else api_mtls_endpoint + ":443" + ) + + # Create SSL credentials with client_cert_source or application + # default SSL credentials. + if client_cert_source: + cert, key = client_cert_source() + ssl_credentials = grpc.ssl_channel_credentials( + certificate_chain=cert, private_key=key + ) + else: + ssl_credentials = SslCredentials().ssl_credentials + + # create a new channel. The provided one is ignored. + self._grpc_channel = type(self).create_channel( + host, + credentials=credentials, + credentials_file=credentials_file, + ssl_credentials=ssl_credentials, + scopes=scopes or self.AUTH_SCOPES, + quota_project_id=quota_project_id, + ) + + # Run the base constructor.
+ super().__init__( + host=host, + credentials=credentials, + credentials_file=credentials_file, + scopes=scopes or self.AUTH_SCOPES, + quota_project_id=quota_project_id, + ) + + self._stubs = {} + + @property + def grpc_channel(self) -> aio.Channel: + """Create the channel designed to connect to this service. + + This property caches on the instance; repeated calls return + the same channel. + """ + # Sanity check: Only create a new channel if we do not already + # have one. + if not hasattr(self, "_grpc_channel"): + self._grpc_channel = self.create_channel( + self._host, credentials=self._credentials, + ) + + # Return the channel from cache. + return self._grpc_channel + + @property + def search_catalog( + self, + ) -> Callable[ + [datacatalog.SearchCatalogRequest], Awaitable[datacatalog.SearchCatalogResponse] + ]: + r"""Return a callable for the search catalog method over gRPC. + + Searches Data Catalog for multiple resources like entries and + tags that match a query. + + This is a custom method + (https://cloud.google.com/apis/design/custom_methods) and does + not return the complete resource, only the resource identifier + and high level fields. Clients can subsequently call ``Get`` + methods. + + Note that Data Catalog search queries do not guarantee full + recall. Query results that match your query may not be returned, + even in subsequent result pages. Also note that results returned + (and not returned) can vary across repeated search queries. + + See `Data Catalog Search + Syntax `__ + for more information. + + Returns: + Callable[[~.SearchCatalogRequest], + Awaitable[~.SearchCatalogResponse]]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each.
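The recurring comment above ("Generate a 'stub function' on-the-fly...") and the ``grpc_channel`` property both use the same memoization pattern: the expensive object is created on first access and cached for reuse. A minimal, self-contained sketch of that pattern in plain Python (no gRPC; ``FakeChannel``, ``Transport``, and the method path are illustrative stand-ins, not the real library API):

```python
class FakeChannel:
    """Stands in for an expensive-to-create gRPC channel."""

    instances = 0  # count how many channels ever get built

    def __init__(self):
        FakeChannel.instances += 1

    def unary_unary(self, path):
        # Return a trivial "stub": a callable that echoes its input.
        return lambda request: (path, request)


class Transport:
    def __init__(self):
        self._stubs = {}

    @property
    def channel(self):
        # Only create a new channel if we do not already have one.
        if not hasattr(self, "_channel"):
            self._channel = FakeChannel()
        return self._channel

    @property
    def get_entry(self):
        # Generate the stub on first use, then cache it in the dict.
        if "get_entry" not in self._stubs:
            self._stubs["get_entry"] = self.channel.unary_unary(
                "/DataCatalog/GetEntry"
            )
        return self._stubs["get_entry"]


t = Transport()
first = t.get_entry
second = t.get_entry
assert first is second             # the stub is created once and reused
assert FakeChannel.instances == 1  # and so is the channel
```

Repeated property access therefore costs a dictionary lookup, not a new channel or stub, which is why the generated transports can expose every RPC as a property without paying a setup cost per call.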
+ if "search_catalog" not in self._stubs: + self._stubs["search_catalog"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1.DataCatalog/SearchCatalog", + request_serializer=datacatalog.SearchCatalogRequest.serialize, + response_deserializer=datacatalog.SearchCatalogResponse.deserialize, + ) + return self._stubs["search_catalog"] + + @property + def create_entry_group( + self, + ) -> Callable[ + [datacatalog.CreateEntryGroupRequest], Awaitable[datacatalog.EntryGroup] + ]: + r"""Return a callable for the create entry group method over gRPC. + + Creates an EntryGroup. + + An entry group contains logically related entries together with + Cloud Identity and Access Management policies that specify the + users who can create, edit, and view entries within the entry + group. + + Data Catalog automatically creates an entry group for BigQuery + entries ("@bigquery") and Pub/Sub topics ("@pubsub"). Users + create their own entry group to contain Cloud Storage fileset + entries or custom type entries, and the IAM policies associated + with those entries. Entry groups, like entries, can be searched. + + A maximum of 10,000 entry groups may be created per organization + across all locations. + + Users should enable the Data Catalog API in the project + identified by the ``parent`` parameter (see [Data Catalog + Resource Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + Returns: + Callable[[~.CreateEntryGroupRequest], + Awaitable[~.EntryGroup]]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. 
+ if "create_entry_group" not in self._stubs: + self._stubs["create_entry_group"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1.DataCatalog/CreateEntryGroup", + request_serializer=datacatalog.CreateEntryGroupRequest.serialize, + response_deserializer=datacatalog.EntryGroup.deserialize, + ) + return self._stubs["create_entry_group"] + + @property + def get_entry_group( + self, + ) -> Callable[ + [datacatalog.GetEntryGroupRequest], Awaitable[datacatalog.EntryGroup] + ]: + r"""Return a callable for the get entry group method over gRPC. + + Gets an EntryGroup. + + Returns: + Callable[[~.GetEntryGroupRequest], + Awaitable[~.EntryGroup]]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "get_entry_group" not in self._stubs: + self._stubs["get_entry_group"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1.DataCatalog/GetEntryGroup", + request_serializer=datacatalog.GetEntryGroupRequest.serialize, + response_deserializer=datacatalog.EntryGroup.deserialize, + ) + return self._stubs["get_entry_group"] + + @property + def update_entry_group( + self, + ) -> Callable[ + [datacatalog.UpdateEntryGroupRequest], Awaitable[datacatalog.EntryGroup] + ]: + r"""Return a callable for the update entry group method over gRPC. + + Updates an EntryGroup. The user should enable the Data Catalog + API in the project identified by the ``entry_group.name`` + parameter (see [Data Catalog Resource Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + Returns: + Callable[[~.UpdateEntryGroupRequest], + Awaitable[~.EntryGroup]]: + A function that, when called, will call the underlying RPC + on the server. 
+ """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "update_entry_group" not in self._stubs: + self._stubs["update_entry_group"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1.DataCatalog/UpdateEntryGroup", + request_serializer=datacatalog.UpdateEntryGroupRequest.serialize, + response_deserializer=datacatalog.EntryGroup.deserialize, + ) + return self._stubs["update_entry_group"] + + @property + def delete_entry_group( + self, + ) -> Callable[[datacatalog.DeleteEntryGroupRequest], Awaitable[empty.Empty]]: + r"""Return a callable for the delete entry group method over gRPC. + + Deletes an EntryGroup. Only entry groups that do not contain + entries can be deleted. Users should enable the Data Catalog API + in the project identified by the ``name`` parameter (see [Data + Catalog Resource Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + Returns: + Callable[[~.DeleteEntryGroupRequest], + Awaitable[~.Empty]]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "delete_entry_group" not in self._stubs: + self._stubs["delete_entry_group"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1.DataCatalog/DeleteEntryGroup", + request_serializer=datacatalog.DeleteEntryGroupRequest.serialize, + response_deserializer=empty.Empty.FromString, + ) + return self._stubs["delete_entry_group"] + + @property + def list_entry_groups( + self, + ) -> Callable[ + [datacatalog.ListEntryGroupsRequest], + Awaitable[datacatalog.ListEntryGroupsResponse], + ]: + r"""Return a callable for the list entry groups method over gRPC. 
+ + Lists entry groups. + + Returns: + Callable[[~.ListEntryGroupsRequest], + Awaitable[~.ListEntryGroupsResponse]]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "list_entry_groups" not in self._stubs: + self._stubs["list_entry_groups"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1.DataCatalog/ListEntryGroups", + request_serializer=datacatalog.ListEntryGroupsRequest.serialize, + response_deserializer=datacatalog.ListEntryGroupsResponse.deserialize, + ) + return self._stubs["list_entry_groups"] + + @property + def create_entry( + self, + ) -> Callable[[datacatalog.CreateEntryRequest], Awaitable[datacatalog.Entry]]: + r"""Return a callable for the create entry method over gRPC. + + Creates an entry. Only entries of 'FILESET' type or + user-specified type can be created. + + Users should enable the Data Catalog API in the project + identified by the ``parent`` parameter (see [Data Catalog + Resource Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + A maximum of 100,000 entries may be created per entry group. + + Returns: + Callable[[~.CreateEntryRequest], + Awaitable[~.Entry]]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. 
+ if "create_entry" not in self._stubs: + self._stubs["create_entry"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1.DataCatalog/CreateEntry", + request_serializer=datacatalog.CreateEntryRequest.serialize, + response_deserializer=datacatalog.Entry.deserialize, + ) + return self._stubs["create_entry"] + + @property + def update_entry( + self, + ) -> Callable[[datacatalog.UpdateEntryRequest], Awaitable[datacatalog.Entry]]: + r"""Return a callable for the update entry method over gRPC. + + Updates an existing entry. Users should enable the Data Catalog + API in the project identified by the ``entry.name`` parameter + (see [Data Catalog Resource Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + Returns: + Callable[[~.UpdateEntryRequest], + Awaitable[~.Entry]]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "update_entry" not in self._stubs: + self._stubs["update_entry"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1.DataCatalog/UpdateEntry", + request_serializer=datacatalog.UpdateEntryRequest.serialize, + response_deserializer=datacatalog.Entry.deserialize, + ) + return self._stubs["update_entry"] + + @property + def delete_entry( + self, + ) -> Callable[[datacatalog.DeleteEntryRequest], Awaitable[empty.Empty]]: + r"""Return a callable for the delete entry method over gRPC. + + Deletes an existing entry. Only entries created through + [CreateEntry][google.cloud.datacatalog.v1.DataCatalog.CreateEntry] + method can be deleted. 
Users should enable the Data Catalog API + in the project identified by the ``name`` parameter (see [Data + Catalog Resource Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + Returns: + Callable[[~.DeleteEntryRequest], + Awaitable[~.Empty]]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "delete_entry" not in self._stubs: + self._stubs["delete_entry"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1.DataCatalog/DeleteEntry", + request_serializer=datacatalog.DeleteEntryRequest.serialize, + response_deserializer=empty.Empty.FromString, + ) + return self._stubs["delete_entry"] + + @property + def get_entry( + self, + ) -> Callable[[datacatalog.GetEntryRequest], Awaitable[datacatalog.Entry]]: + r"""Return a callable for the get entry method over gRPC. + + Gets an entry. + + Returns: + Callable[[~.GetEntryRequest], + Awaitable[~.Entry]]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "get_entry" not in self._stubs: + self._stubs["get_entry"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1.DataCatalog/GetEntry", + request_serializer=datacatalog.GetEntryRequest.serialize, + response_deserializer=datacatalog.Entry.deserialize, + ) + return self._stubs["get_entry"] + + @property + def lookup_entry( + self, + ) -> Callable[[datacatalog.LookupEntryRequest], Awaitable[datacatalog.Entry]]: + r"""Return a callable for the lookup entry method over gRPC. + + Get an entry by target resource name. 
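Each stub registration above pairs a ``request_serializer`` with a ``response_deserializer``: the channel only moves bytes, and these two callables translate between message objects and the wire format. A hedged sketch of that contract, using JSON in place of protobuf (``make_unary_unary`` and ``fake_server`` are illustrative helpers, not part of gRPC or this library):

```python
import json

def make_unary_unary(method, request_serializer, response_deserializer, server):
    """Mimic channel.unary_unary: wire up serializers around a transport."""
    def call(request):
        wire = request_serializer(request)   # message -> bytes
        raw = server(method, wire)           # pretend network round trip
        return response_deserializer(raw)    # bytes -> message
    return call

def fake_server(method, wire):
    # Echo the request name back, tagged with the method that was called.
    req = json.loads(wire)
    return json.dumps({"name": req["name"], "method": method}).encode()

lookup = make_unary_unary(
    "/DataCatalog/LookupEntry",
    request_serializer=lambda r: json.dumps(r).encode(),
    response_deserializer=lambda b: json.loads(b),
    server=fake_server,
)

entry = lookup({"name": "projects/p/locations/l/entryGroups/g/entries/e"})
assert entry["method"] == "/DataCatalog/LookupEntry"
```

This is why the generated code "just needs to pass in the functions for each": serialization is fully delegated, and the transport's only job is to pair the right message types with the right RPC path.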
This method + allows clients to use the resource name from the source + Google Cloud Platform service to get the Data Catalog + Entry. + + Returns: + Callable[[~.LookupEntryRequest], + Awaitable[~.Entry]]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "lookup_entry" not in self._stubs: + self._stubs["lookup_entry"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1.DataCatalog/LookupEntry", + request_serializer=datacatalog.LookupEntryRequest.serialize, + response_deserializer=datacatalog.Entry.deserialize, + ) + return self._stubs["lookup_entry"] + + @property + def list_entries( + self, + ) -> Callable[ + [datacatalog.ListEntriesRequest], Awaitable[datacatalog.ListEntriesResponse] + ]: + r"""Return a callable for the list entries method over gRPC. + + Lists entries. + + Returns: + Callable[[~.ListEntriesRequest], + Awaitable[~.ListEntriesResponse]]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "list_entries" not in self._stubs: + self._stubs["list_entries"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1.DataCatalog/ListEntries", + request_serializer=datacatalog.ListEntriesRequest.serialize, + response_deserializer=datacatalog.ListEntriesResponse.deserialize, + ) + return self._stubs["list_entries"] + + @property + def create_tag_template( + self, + ) -> Callable[[datacatalog.CreateTagTemplateRequest], Awaitable[tags.TagTemplate]]: + r"""Return a callable for the create tag template method over gRPC. + + Creates a tag template. 
The user should enable the Data Catalog + API in the project identified by the ``parent`` parameter (see + `Data Catalog Resource + Project `__ + for more information). + + Returns: + Callable[[~.CreateTagTemplateRequest], + Awaitable[~.TagTemplate]]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "create_tag_template" not in self._stubs: + self._stubs["create_tag_template"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1.DataCatalog/CreateTagTemplate", + request_serializer=datacatalog.CreateTagTemplateRequest.serialize, + response_deserializer=tags.TagTemplate.deserialize, + ) + return self._stubs["create_tag_template"] + + @property + def get_tag_template( + self, + ) -> Callable[[datacatalog.GetTagTemplateRequest], Awaitable[tags.TagTemplate]]: + r"""Return a callable for the get tag template method over gRPC. + + Gets a tag template. + + Returns: + Callable[[~.GetTagTemplateRequest], + Awaitable[~.TagTemplate]]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "get_tag_template" not in self._stubs: + self._stubs["get_tag_template"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1.DataCatalog/GetTagTemplate", + request_serializer=datacatalog.GetTagTemplateRequest.serialize, + response_deserializer=tags.TagTemplate.deserialize, + ) + return self._stubs["get_tag_template"] + + @property + def update_tag_template( + self, + ) -> Callable[[datacatalog.UpdateTagTemplateRequest], Awaitable[tags.TagTemplate]]: + r"""Return a callable for the update tag template method over gRPC. 
+ + Updates a tag template. This method cannot be used to update the + fields of a template. The tag template fields are represented as + separate resources and should be updated using their own + create/update/delete methods. Users should enable the Data + Catalog API in the project identified by the + ``tag_template.name`` parameter (see [Data Catalog Resource + Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + Returns: + Callable[[~.UpdateTagTemplateRequest], + Awaitable[~.TagTemplate]]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "update_tag_template" not in self._stubs: + self._stubs["update_tag_template"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1.DataCatalog/UpdateTagTemplate", + request_serializer=datacatalog.UpdateTagTemplateRequest.serialize, + response_deserializer=tags.TagTemplate.deserialize, + ) + return self._stubs["update_tag_template"] + + @property + def delete_tag_template( + self, + ) -> Callable[[datacatalog.DeleteTagTemplateRequest], Awaitable[empty.Empty]]: + r"""Return a callable for the delete tag template method over gRPC. + + Deletes a tag template and all tags using the template. Users + should enable the Data Catalog API in the project identified by + the ``name`` parameter (see [Data Catalog Resource Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + Returns: + Callable[[~.DeleteTagTemplateRequest], + Awaitable[~.Empty]]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. 
+        # gRPC handles serialization and deserialization, so we just need
+        # to pass in the functions for each.
+        if "delete_tag_template" not in self._stubs:
+            self._stubs["delete_tag_template"] = self.grpc_channel.unary_unary(
+                "/google.cloud.datacatalog.v1.DataCatalog/DeleteTagTemplate",
+                request_serializer=datacatalog.DeleteTagTemplateRequest.serialize,
+                response_deserializer=empty.Empty.FromString,
+            )
+        return self._stubs["delete_tag_template"]
+
+    @property
+    def create_tag_template_field(
+        self,
+    ) -> Callable[
+        [datacatalog.CreateTagTemplateFieldRequest], Awaitable[tags.TagTemplateField]
+    ]:
+        r"""Return a callable for the create tag template field method over gRPC.
+
+        Creates a field in a tag template. The user should enable the
+        Data Catalog API in the project identified by the ``parent``
+        parameter (see `Data Catalog Resource
+        Project <https://cloud.google.com/data-catalog/docs/concepts/resource-project>`__
+        for more information).
+
+        Returns:
+            Callable[[~.CreateTagTemplateFieldRequest],
+                    Awaitable[~.TagTemplateField]]:
+                A function that, when called, will call the underlying RPC
+                on the server.
+        """
+        # Generate a "stub function" on-the-fly which will actually make
+        # the request.
+        # gRPC handles serialization and deserialization, so we just need
+        # to pass in the functions for each.
+        if "create_tag_template_field" not in self._stubs:
+            self._stubs["create_tag_template_field"] = self.grpc_channel.unary_unary(
+                "/google.cloud.datacatalog.v1.DataCatalog/CreateTagTemplateField",
+                request_serializer=datacatalog.CreateTagTemplateFieldRequest.serialize,
+                response_deserializer=tags.TagTemplateField.deserialize,
+            )
+        return self._stubs["create_tag_template_field"]
+
+    @property
+    def update_tag_template_field(
+        self,
+    ) -> Callable[
+        [datacatalog.UpdateTagTemplateFieldRequest], Awaitable[tags.TagTemplateField]
+    ]:
+        r"""Return a callable for the update tag template field method over gRPC.
+
+        Updates a field in a tag template. This method cannot be used to
+        update the field type. Users should enable the Data Catalog API
+        in the project identified by the ``name`` parameter (see [Data
+        Catalog Resource Project]
+        (https://cloud.google.com/data-catalog/docs/concepts/resource-project)
+        for more information).
+
+        Returns:
+            Callable[[~.UpdateTagTemplateFieldRequest],
+                    Awaitable[~.TagTemplateField]]:
+                A function that, when called, will call the underlying RPC
+                on the server.
+        """
+        # Generate a "stub function" on-the-fly which will actually make
+        # the request.
+        # gRPC handles serialization and deserialization, so we just need
+        # to pass in the functions for each.
+        if "update_tag_template_field" not in self._stubs:
+            self._stubs["update_tag_template_field"] = self.grpc_channel.unary_unary(
+                "/google.cloud.datacatalog.v1.DataCatalog/UpdateTagTemplateField",
+                request_serializer=datacatalog.UpdateTagTemplateFieldRequest.serialize,
+                response_deserializer=tags.TagTemplateField.deserialize,
+            )
+        return self._stubs["update_tag_template_field"]
+
+    @property
+    def rename_tag_template_field(
+        self,
+    ) -> Callable[
+        [datacatalog.RenameTagTemplateFieldRequest], Awaitable[tags.TagTemplateField]
+    ]:
+        r"""Return a callable for the rename tag template field method over gRPC.
+
+        Renames a field in a tag template. The user should enable the
+        Data Catalog API in the project identified by the ``name``
+        parameter (see `Data Catalog Resource
+        Project <https://cloud.google.com/data-catalog/docs/concepts/resource-project>`__
+        for more information).
+
+        Returns:
+            Callable[[~.RenameTagTemplateFieldRequest],
+                    Awaitable[~.TagTemplateField]]:
+                A function that, when called, will call the underlying RPC
+                on the server.
+        """
+        # Generate a "stub function" on-the-fly which will actually make
+        # the request.
+        # gRPC handles serialization and deserialization, so we just need
+        # to pass in the functions for each.
+ if "rename_tag_template_field" not in self._stubs: + self._stubs["rename_tag_template_field"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1.DataCatalog/RenameTagTemplateField", + request_serializer=datacatalog.RenameTagTemplateFieldRequest.serialize, + response_deserializer=tags.TagTemplateField.deserialize, + ) + return self._stubs["rename_tag_template_field"] + + @property + def delete_tag_template_field( + self, + ) -> Callable[[datacatalog.DeleteTagTemplateFieldRequest], Awaitable[empty.Empty]]: + r"""Return a callable for the delete tag template field method over gRPC. + + Deletes a field in a tag template and all uses of that field. + Users should enable the Data Catalog API in the project + identified by the ``name`` parameter (see [Data Catalog Resource + Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + Returns: + Callable[[~.DeleteTagTemplateFieldRequest], + Awaitable[~.Empty]]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "delete_tag_template_field" not in self._stubs: + self._stubs["delete_tag_template_field"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1.DataCatalog/DeleteTagTemplateField", + request_serializer=datacatalog.DeleteTagTemplateFieldRequest.serialize, + response_deserializer=empty.Empty.FromString, + ) + return self._stubs["delete_tag_template_field"] + + @property + def create_tag( + self, + ) -> Callable[[datacatalog.CreateTagRequest], Awaitable[tags.Tag]]: + r"""Return a callable for the create tag method over gRPC. + + Creates a tag on an [Entry][google.cloud.datacatalog.v1.Entry]. 
+ Note: The project identified by the ``parent`` parameter for the + `tag `__ + and the `tag + template `__ + used to create the tag must be from the same organization. + + Returns: + Callable[[~.CreateTagRequest], + Awaitable[~.Tag]]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "create_tag" not in self._stubs: + self._stubs["create_tag"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1.DataCatalog/CreateTag", + request_serializer=datacatalog.CreateTagRequest.serialize, + response_deserializer=tags.Tag.deserialize, + ) + return self._stubs["create_tag"] + + @property + def update_tag( + self, + ) -> Callable[[datacatalog.UpdateTagRequest], Awaitable[tags.Tag]]: + r"""Return a callable for the update tag method over gRPC. + + Updates an existing tag. + + Returns: + Callable[[~.UpdateTagRequest], + Awaitable[~.Tag]]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "update_tag" not in self._stubs: + self._stubs["update_tag"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1.DataCatalog/UpdateTag", + request_serializer=datacatalog.UpdateTagRequest.serialize, + response_deserializer=tags.Tag.deserialize, + ) + return self._stubs["update_tag"] + + @property + def delete_tag( + self, + ) -> Callable[[datacatalog.DeleteTagRequest], Awaitable[empty.Empty]]: + r"""Return a callable for the delete tag method over gRPC. + + Deletes a tag. + + Returns: + Callable[[~.DeleteTagRequest], + Awaitable[~.Empty]]: + A function that, when called, will call the underlying RPC + on the server. 
+ """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "delete_tag" not in self._stubs: + self._stubs["delete_tag"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1.DataCatalog/DeleteTag", + request_serializer=datacatalog.DeleteTagRequest.serialize, + response_deserializer=empty.Empty.FromString, + ) + return self._stubs["delete_tag"] + + @property + def list_tags( + self, + ) -> Callable[ + [datacatalog.ListTagsRequest], Awaitable[datacatalog.ListTagsResponse] + ]: + r"""Return a callable for the list tags method over gRPC. + + Lists the tags on an [Entry][google.cloud.datacatalog.v1.Entry]. + + Returns: + Callable[[~.ListTagsRequest], + Awaitable[~.ListTagsResponse]]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "list_tags" not in self._stubs: + self._stubs["list_tags"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1.DataCatalog/ListTags", + request_serializer=datacatalog.ListTagsRequest.serialize, + response_deserializer=datacatalog.ListTagsResponse.deserialize, + ) + return self._stubs["list_tags"] + + @property + def set_iam_policy( + self, + ) -> Callable[[iam_policy.SetIamPolicyRequest], Awaitable[policy.Policy]]: + r"""Return a callable for the set iam policy method over gRPC. + + Sets the access control policy for a resource. Replaces any + existing policy. Supported resources are: + + - Tag templates. + - Entries. + - Entry groups. Note, this method cannot be used to manage + policies for BigQuery, Pub/Sub and any external Google Cloud + Platform resources synced to Data Catalog. 
+
+        Callers must have the following Google IAM permissions:
+
+        -  ``datacatalog.tagTemplates.setIamPolicy`` to set policies on
+           tag templates.
+        -  ``datacatalog.entries.setIamPolicy`` to set policies on
+           entries.
+        -  ``datacatalog.entryGroups.setIamPolicy`` to set policies on
+           entry groups.
+
+        Returns:
+            Callable[[~.SetIamPolicyRequest],
+                    Awaitable[~.Policy]]:
+                A function that, when called, will call the underlying RPC
+                on the server.
+        """
+        # Generate a "stub function" on-the-fly which will actually make
+        # the request.
+        # gRPC handles serialization and deserialization, so we just need
+        # to pass in the functions for each.
+        if "set_iam_policy" not in self._stubs:
+            self._stubs["set_iam_policy"] = self.grpc_channel.unary_unary(
+                "/google.cloud.datacatalog.v1.DataCatalog/SetIamPolicy",
+                request_serializer=iam_policy.SetIamPolicyRequest.SerializeToString,
+                response_deserializer=policy.Policy.FromString,
+            )
+        return self._stubs["set_iam_policy"]
+
+    @property
+    def get_iam_policy(
+        self,
+    ) -> Callable[[iam_policy.GetIamPolicyRequest], Awaitable[policy.Policy]]:
+        r"""Return a callable for the get iam policy method over gRPC.
+
+        Gets the access control policy for a resource. A ``NOT_FOUND``
+        error is returned if the resource does not exist. An empty
+        policy is returned if the resource exists but does not have a
+        policy set on it.
+
+        Supported resources are:
+
+        -  Tag templates.
+        -  Entries.
+        -  Entry groups. Note, this method cannot be used to manage
+           policies for BigQuery, Pub/Sub and any external Google Cloud
+           Platform resources synced to Data Catalog.
+
+        Callers must have the following Google IAM permissions:
+
+        -  ``datacatalog.tagTemplates.getIamPolicy`` to get policies on
+           tag templates.
+        -  ``datacatalog.entries.getIamPolicy`` to get policies on
+           entries.
+        -  ``datacatalog.entryGroups.getIamPolicy`` to get policies on
+           entry groups.
+ + Returns: + Callable[[~.GetIamPolicyRequest], + Awaitable[~.Policy]]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "get_iam_policy" not in self._stubs: + self._stubs["get_iam_policy"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1.DataCatalog/GetIamPolicy", + request_serializer=iam_policy.GetIamPolicyRequest.SerializeToString, + response_deserializer=policy.Policy.FromString, + ) + return self._stubs["get_iam_policy"] + + @property + def test_iam_permissions( + self, + ) -> Callable[ + [iam_policy.TestIamPermissionsRequest], + Awaitable[iam_policy.TestIamPermissionsResponse], + ]: + r"""Return a callable for the test iam permissions method over gRPC. + + Returns the caller's permissions on a resource. If the resource + does not exist, an empty set of permissions is returned (We + don't return a ``NOT_FOUND`` error). + + Supported resources are: + + - Tag templates. + - Entries. + - Entry groups. Note, this method cannot be used to manage + policies for BigQuery, Pub/Sub and any external Google Cloud + Platform resources synced to Data Catalog. + + A caller is not required to have Google IAM permission to make + this request. + + Returns: + Callable[[~.TestIamPermissionsRequest], + Awaitable[~.TestIamPermissionsResponse]]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. 
+ if "test_iam_permissions" not in self._stubs: + self._stubs["test_iam_permissions"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1.DataCatalog/TestIamPermissions", + request_serializer=iam_policy.TestIamPermissionsRequest.SerializeToString, + response_deserializer=iam_policy.TestIamPermissionsResponse.FromString, + ) + return self._stubs["test_iam_permissions"] + + +__all__ = ("DataCatalogGrpcAsyncIOTransport",) diff --git a/google/cloud/datacatalog_v1/types.py b/google/cloud/datacatalog_v1/types.py deleted file mode 100644 index 6bff6007..00000000 --- a/google/cloud/datacatalog_v1/types.py +++ /dev/null @@ -1,72 +0,0 @@ -# -*- coding: utf-8 -*- -# -# Copyright 2020 Google LLC -# -# Licensed under the Apache License, Version 2.0 (the "License"); -# you may not use this file except in compliance with the License. -# You may obtain a copy of the License at -# -# https://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, software -# distributed under the License is distributed on an "AS IS" BASIS, -# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -# See the License for the specific language governing permissions and -# limitations under the License. 
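
Every transport property in this generated file follows the same lazy stub-cache pattern: build the gRPC callable on first access, memoize it in ``self._stubs``, and return the cached callable afterward, so each channel stub is created at most once. A minimal stdlib-only sketch of that pattern (the ``stub_factory`` parameter and the ``LazyStubTransport`` class are illustrative stand-ins, not part of the generated client):

```python
class LazyStubTransport:
    """Illustrative stand-in for the generated transport's stub caching."""

    def __init__(self, stub_factory):
        # RPC name -> memoized callable, mirroring ``self._stubs`` above.
        self._stubs = {}
        self._stub_factory = stub_factory

    def _get_stub(self, rpc_name):
        # Create the callable on first access only, then reuse it,
        # mirroring the ``if name not in self._stubs:`` guard above.
        if rpc_name not in self._stubs:
            self._stubs[rpc_name] = self._stub_factory(rpc_name)
        return self._stubs[rpc_name]

    @property
    def delete_tag(self):
        return self._get_stub("DeleteTag")
```

Repeated accesses of the property return the same object, so serialization setup happens once per RPC rather than once per call.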
- - -from __future__ import absolute_import -import sys - -from google.api_core.protobuf_helpers import get_messages - -from google.cloud.datacatalog_v1.proto import datacatalog_pb2 -from google.cloud.datacatalog_v1.proto import gcs_fileset_spec_pb2 -from google.cloud.datacatalog_v1.proto import schema_pb2 -from google.cloud.datacatalog_v1.proto import search_pb2 -from google.cloud.datacatalog_v1.proto import table_spec_pb2 -from google.cloud.datacatalog_v1.proto import tags_pb2 -from google.cloud.datacatalog_v1.proto import timestamps_pb2 -from google.iam.v1 import iam_policy_pb2 -from google.iam.v1 import options_pb2 -from google.iam.v1 import policy_pb2 -from google.protobuf import empty_pb2 -from google.protobuf import field_mask_pb2 -from google.protobuf import timestamp_pb2 -from google.type import expr_pb2 - - -_shared_modules = [ - iam_policy_pb2, - options_pb2, - policy_pb2, - empty_pb2, - field_mask_pb2, - timestamp_pb2, - expr_pb2, -] - -_local_modules = [ - datacatalog_pb2, - gcs_fileset_spec_pb2, - schema_pb2, - search_pb2, - table_spec_pb2, - tags_pb2, - timestamps_pb2, -] - -names = [] - -for module in _shared_modules: # pragma: NO COVER - for name, message in get_messages(module).items(): - setattr(sys.modules[__name__], name, message) - names.append(name) -for module in _local_modules: - for name, message in get_messages(module).items(): - message.__module__ = "google.cloud.datacatalog_v1.types" - setattr(sys.modules[__name__], name, message) - names.append(name) - - -__all__ = tuple(sorted(names)) diff --git a/google/cloud/datacatalog_v1/types/__init__.py b/google/cloud/datacatalog_v1/types/__init__.py new file mode 100644 index 00000000..b3bff88c --- /dev/null +++ b/google/cloud/datacatalog_v1/types/__init__.py @@ -0,0 +1,121 @@ +# -*- coding: utf-8 -*- + +# Copyright 2020 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. 
+# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +# + +from .timestamps import SystemTimestamps +from .gcs_fileset_spec import ( + GcsFilesetSpec, + GcsFileSpec, +) +from .schema import ( + Schema, + ColumnSchema, +) +from .search import SearchCatalogResult +from .table_spec import ( + BigQueryTableSpec, + ViewSpec, + TableSpec, + BigQueryDateShardedSpec, +) +from .tags import ( + Tag, + TagField, + TagTemplate, + TagTemplateField, + FieldType, +) +from .datacatalog import ( + SearchCatalogRequest, + SearchCatalogResponse, + CreateEntryGroupRequest, + UpdateEntryGroupRequest, + GetEntryGroupRequest, + DeleteEntryGroupRequest, + ListEntryGroupsRequest, + ListEntryGroupsResponse, + CreateEntryRequest, + UpdateEntryRequest, + DeleteEntryRequest, + GetEntryRequest, + LookupEntryRequest, + Entry, + EntryGroup, + CreateTagTemplateRequest, + GetTagTemplateRequest, + UpdateTagTemplateRequest, + DeleteTagTemplateRequest, + CreateTagRequest, + UpdateTagRequest, + DeleteTagRequest, + CreateTagTemplateFieldRequest, + UpdateTagTemplateFieldRequest, + RenameTagTemplateFieldRequest, + DeleteTagTemplateFieldRequest, + ListTagsRequest, + ListTagsResponse, + ListEntriesRequest, + ListEntriesResponse, +) + + +__all__ = ( + "SystemTimestamps", + "GcsFilesetSpec", + "GcsFileSpec", + "Schema", + "ColumnSchema", + "SearchCatalogResult", + "BigQueryTableSpec", + "ViewSpec", + "TableSpec", + "BigQueryDateShardedSpec", + "Tag", + "TagField", + "TagTemplate", + "TagTemplateField", + "FieldType", + "SearchCatalogRequest", + "SearchCatalogResponse", + "CreateEntryGroupRequest", + "UpdateEntryGroupRequest", + "GetEntryGroupRequest", + 
"DeleteEntryGroupRequest", + "ListEntryGroupsRequest", + "ListEntryGroupsResponse", + "CreateEntryRequest", + "UpdateEntryRequest", + "DeleteEntryRequest", + "GetEntryRequest", + "LookupEntryRequest", + "Entry", + "EntryGroup", + "CreateTagTemplateRequest", + "GetTagTemplateRequest", + "UpdateTagTemplateRequest", + "DeleteTagTemplateRequest", + "CreateTagRequest", + "UpdateTagRequest", + "DeleteTagRequest", + "CreateTagTemplateFieldRequest", + "UpdateTagTemplateFieldRequest", + "RenameTagTemplateFieldRequest", + "DeleteTagTemplateFieldRequest", + "ListTagsRequest", + "ListTagsResponse", + "ListEntriesRequest", + "ListEntriesResponse", +) diff --git a/google/cloud/datacatalog_v1/types/common.py b/google/cloud/datacatalog_v1/types/common.py new file mode 100644 index 00000000..feace354 --- /dev/null +++ b/google/cloud/datacatalog_v1/types/common.py @@ -0,0 +1,35 @@ +# -*- coding: utf-8 -*- + +# Copyright 2020 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +# + +import proto # type: ignore + + +__protobuf__ = proto.module( + package="google.cloud.datacatalog.v1", manifest={"IntegratedSystem",}, +) + + +class IntegratedSystem(proto.Enum): + r"""This enum describes all the possible systems that Data + Catalog integrates with. 
+ """ + INTEGRATED_SYSTEM_UNSPECIFIED = 0 + BIGQUERY = 1 + CLOUD_PUBSUB = 2 + + +__all__ = tuple(sorted(__protobuf__.manifest)) diff --git a/google/cloud/datacatalog_v1/types/datacatalog.py b/google/cloud/datacatalog_v1/types/datacatalog.py new file mode 100644 index 00000000..fdb31546 --- /dev/null +++ b/google/cloud/datacatalog_v1/types/datacatalog.py @@ -0,0 +1,1042 @@ +# -*- coding: utf-8 -*- + +# Copyright 2020 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +# + +import proto # type: ignore + + +from google.cloud.datacatalog_v1.types import common +from google.cloud.datacatalog_v1.types import gcs_fileset_spec as gcd_gcs_fileset_spec +from google.cloud.datacatalog_v1.types import schema as gcd_schema +from google.cloud.datacatalog_v1.types import search +from google.cloud.datacatalog_v1.types import table_spec +from google.cloud.datacatalog_v1.types import tags as gcd_tags +from google.cloud.datacatalog_v1.types import timestamps +from google.protobuf import field_mask_pb2 as field_mask # type: ignore + + +__protobuf__ = proto.module( + package="google.cloud.datacatalog.v1", + manifest={ + "EntryType", + "SearchCatalogRequest", + "SearchCatalogResponse", + "CreateEntryGroupRequest", + "UpdateEntryGroupRequest", + "GetEntryGroupRequest", + "DeleteEntryGroupRequest", + "ListEntryGroupsRequest", + "ListEntryGroupsResponse", + "CreateEntryRequest", + "UpdateEntryRequest", + "DeleteEntryRequest", + "GetEntryRequest", + "LookupEntryRequest", + "Entry", + 
+        "EntryGroup",
+        "CreateTagTemplateRequest",
+        "GetTagTemplateRequest",
+        "UpdateTagTemplateRequest",
+        "DeleteTagTemplateRequest",
+        "CreateTagRequest",
+        "UpdateTagRequest",
+        "DeleteTagRequest",
+        "CreateTagTemplateFieldRequest",
+        "UpdateTagTemplateFieldRequest",
+        "RenameTagTemplateFieldRequest",
+        "DeleteTagTemplateFieldRequest",
+        "ListTagsRequest",
+        "ListTagsResponse",
+        "ListEntriesRequest",
+        "ListEntriesResponse",
+    },
+)
+
+
+class EntryType(proto.Enum):
+    r"""Entry resources in Data Catalog can be of different types, e.g., a
+    BigQuery Table entry is of type ``TABLE``. This enum describes all
+    the possible types Data Catalog contains.
+    """
+    ENTRY_TYPE_UNSPECIFIED = 0
+    TABLE = 2
+    MODEL = 5
+    DATA_STREAM = 3
+    FILESET = 4
+
+
+class SearchCatalogRequest(proto.Message):
+    r"""Request message for
+    [SearchCatalog][google.cloud.datacatalog.v1.DataCatalog.SearchCatalog].
+
+    Attributes:
+        scope (~.datacatalog.SearchCatalogRequest.Scope):
+            Required. The scope of this search request. A ``scope`` that
+            has empty ``include_org_ids``, ``include_project_ids`` AND
+            false ``include_gcp_public_datasets`` is considered invalid.
+            Data Catalog will return an error in such a case.
+        query (str):
+            Required. The query string in search query syntax. The query
+            must be non-empty.
+
+            Query strings can be as simple as "x" or more qualified as:
+
+            -  name:x
+            -  column:x
+            -  description:y
+
+            Note: Query tokens need to have a minimum of 3 characters
+            for substring matching to work correctly. See `Data Catalog
+            Search
+            Syntax <https://cloud.google.com/data-catalog/docs/how-to/search-reference>`__
+            for more information.
+        page_size (int):
+            Number of results in the search page. If <=0 then defaults
+            to 10. Max limit for page_size is 1000. Throws an invalid
+            argument for page_size > 1000.
+        page_token (str):
+            Optional. Pagination token returned in an earlier
+            [SearchCatalogResponse.next_page_token][google.cloud.datacatalog.v1.SearchCatalogResponse.next_page_token],
+            which indicates that this is a continuation of a prior
+            [SearchCatalogRequest][google.cloud.datacatalog.v1.DataCatalog.SearchCatalog]
+            call, and that the system should return the next page of
+            data. If empty, the first page is returned.
+        order_by (str):
+            Specifies the ordering of results, currently supported
+            case-sensitive choices are:
+
+            -  ``relevance``, only supports descending
+            -  ``last_modified_timestamp [asc|desc]``, defaults to
+               descending if not specified
+
+            If not specified, defaults to ``relevance`` descending.
+    """
+
+    class Scope(proto.Message):
+        r"""The criteria that select the subspace used for query
+        matching.
+
+        Attributes:
+            include_org_ids (Sequence[str]):
+                The list of organization IDs to search
+                within. To find your organization ID, follow
+                instructions in
+                https://cloud.google.com/resource-manager/docs/creating-managing-organization.
+            include_project_ids (Sequence[str]):
+                The list of project IDs to search within. To
+                learn more about the distinction between project
+                names/IDs/numbers, go to
+                https://cloud.google.com/docs/overview/#projects.
+            include_gcp_public_datasets (bool):
+                If ``true``, include Google Cloud Platform (GCP) public
+                datasets in the search results. Info on GCP public datasets
+                is available at https://cloud.google.com/public-datasets/.
+                By default, GCP public datasets are excluded.
+            restricted_locations (Sequence[str]):
+                Optional. The list of locations to search within.
+
+                1. If empty, search will be performed in all locations;
+                2. If any of the locations are NOT in the valid locations
+                   list, error will be returned;
+                3. Otherwise, search only the given locations for matching
+                   results. Typical usage is to leave this field empty.
When + a location is unreachable as returned in the + ``SearchCatalogResponse.unreachable`` field, users can + repeat the search request with this parameter set to get + additional information on the error. + + Valid locations: + + - asia-east1 + - asia-east2 + - asia-northeast1 + - asia-northeast2 + - asia-northeast3 + - asia-south1 + - asia-southeast1 + - australia-southeast1 + - eu + - europe-north1 + - europe-west1 + - europe-west2 + - europe-west3 + - europe-west4 + - europe-west6 + - global + - northamerica-northeast1 + - southamerica-east1 + - us + - us-central1 + - us-east1 + - us-east4 + - us-west1 + - us-west2 + """ + + include_org_ids = proto.RepeatedField(proto.STRING, number=2) + + include_project_ids = proto.RepeatedField(proto.STRING, number=3) + + include_gcp_public_datasets = proto.Field(proto.BOOL, number=7) + + restricted_locations = proto.RepeatedField(proto.STRING, number=16) + + scope = proto.Field(proto.MESSAGE, number=6, message=Scope,) + + query = proto.Field(proto.STRING, number=1) + + page_size = proto.Field(proto.INT32, number=2) + + page_token = proto.Field(proto.STRING, number=3) + + order_by = proto.Field(proto.STRING, number=5) + + +class SearchCatalogResponse(proto.Message): + r"""Response message for + [SearchCatalog][google.cloud.datacatalog.v1.DataCatalog.SearchCatalog]. + + Attributes: + results (Sequence[~.search.SearchCatalogResult]): + Search results. + next_page_token (str): + The token that can be used to retrieve the + next page of results. + unreachable (Sequence[str]): + Unreachable locations. Search result does not include data + from those locations. Users can get additional information + on the error by repeating the search request with a more + restrictive parameter -- setting the value for + ``SearchDataCatalogRequest.scope.include_locations``. 
+ """ + + @property + def raw_page(self): + return self + + results = proto.RepeatedField( + proto.MESSAGE, number=1, message=search.SearchCatalogResult, + ) + + next_page_token = proto.Field(proto.STRING, number=3) + + unreachable = proto.RepeatedField(proto.STRING, number=6) + + +class CreateEntryGroupRequest(proto.Message): + r"""Request message for + [CreateEntryGroup][google.cloud.datacatalog.v1.DataCatalog.CreateEntryGroup]. + + Attributes: + parent (str): + Required. The name of the project this entry group is in. + Example: + + - projects/{project_id}/locations/{location} + + Note that this EntryGroup and its child resources may not + actually be stored in the location in this name. + entry_group_id (str): + Required. The id of the entry group to + create. The id must begin with a letter or + underscore, contain only English letters, + numbers and underscores, and be at most 64 + characters. + entry_group (~.datacatalog.EntryGroup): + The entry group to create. Defaults to an + empty entry group. + """ + + parent = proto.Field(proto.STRING, number=1) + + entry_group_id = proto.Field(proto.STRING, number=3) + + entry_group = proto.Field(proto.MESSAGE, number=2, message="EntryGroup",) + + +class UpdateEntryGroupRequest(proto.Message): + r"""Request message for + [UpdateEntryGroup][google.cloud.datacatalog.v1.DataCatalog.UpdateEntryGroup]. + + Attributes: + entry_group (~.datacatalog.EntryGroup): + Required. The updated entry group. "name" + field must be set. + update_mask (~.field_mask.FieldMask): + The fields to update on the entry group. If + absent or empty, all modifiable fields are + updated. + """ + + entry_group = proto.Field(proto.MESSAGE, number=1, message="EntryGroup",) + + update_mask = proto.Field(proto.MESSAGE, number=2, message=field_mask.FieldMask,) + + +class GetEntryGroupRequest(proto.Message): + r"""Request message for + [GetEntryGroup][google.cloud.datacatalog.v1.DataCatalog.GetEntryGroup]. + + Attributes: + name (str): + Required. 
The name of the entry group. For example, + ``projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}``. + read_mask (~.field_mask.FieldMask): + The fields to return. If not set or empty, + all fields are returned. + """ + + name = proto.Field(proto.STRING, number=1) + + read_mask = proto.Field(proto.MESSAGE, number=2, message=field_mask.FieldMask,) + + +class DeleteEntryGroupRequest(proto.Message): + r"""Request message for + [DeleteEntryGroup][google.cloud.datacatalog.v1.DataCatalog.DeleteEntryGroup]. + + Attributes: + name (str): + Required. The name of the entry group. For example, + ``projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}``. + force (bool): + Optional. If true, deletes all entries in the + entry group. + """ + + name = proto.Field(proto.STRING, number=1) + + force = proto.Field(proto.BOOL, number=2) + + +class ListEntryGroupsRequest(proto.Message): + r"""Request message for + [ListEntryGroups][google.cloud.datacatalog.v1.DataCatalog.ListEntryGroups]. + + Attributes: + parent (str): + Required. The name of the location that contains the entry + groups, which can be provided in URL format. Example: + + - projects/{project_id}/locations/{location} + page_size (int): + Optional. The maximum number of items to return. Default is + 10. Max limit is 1000. Throws an invalid argument for + ``page_size > 1000``. + page_token (str): + Optional. Token that specifies which page is + requested. If empty, the first page is returned. + """ + + parent = proto.Field(proto.STRING, number=1) + + page_size = proto.Field(proto.INT32, number=2) + + page_token = proto.Field(proto.STRING, number=3) + + +class ListEntryGroupsResponse(proto.Message): + r"""Response message for + [ListEntryGroups][google.cloud.datacatalog.v1.DataCatalog.ListEntryGroups]. + + Attributes: + entry_groups (Sequence[~.datacatalog.EntryGroup]): + EntryGroup details. + next_page_token (str): + Token to retrieve the next page of results. 
+ It is set to empty if no items remain in + results. + """ + + @property + def raw_page(self): + return self + + entry_groups = proto.RepeatedField(proto.MESSAGE, number=1, message="EntryGroup",) + + next_page_token = proto.Field(proto.STRING, number=2) + + +class CreateEntryRequest(proto.Message): + r"""Request message for + [CreateEntry][google.cloud.datacatalog.v1.DataCatalog.CreateEntry]. + + Attributes: + parent (str): + Required. The name of the entry group this entry is in. + Example: + + - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id} + + Note that this Entry and its child resources may not + actually be stored in the location in this name. + entry_id (str): + Required. The id of the entry to create. + entry (~.datacatalog.Entry): + Required. The entry to create. + """ + + parent = proto.Field(proto.STRING, number=1) + + entry_id = proto.Field(proto.STRING, number=3) + + entry = proto.Field(proto.MESSAGE, number=2, message="Entry",) + + +class UpdateEntryRequest(proto.Message): + r"""Request message for + [UpdateEntry][google.cloud.datacatalog.v1.DataCatalog.UpdateEntry]. + + Attributes: + entry (~.datacatalog.Entry): + Required. The updated entry. The "name" field + must be set. + update_mask (~.field_mask.FieldMask): + The fields to update on the entry. If absent or empty, all + modifiable fields are updated. 
+ + The following fields are modifiable: + + - For entries with type ``DATA_STREAM``: + + - ``schema`` + + - For entries with type ``FILESET`` + + - ``schema`` + - ``display_name`` + - ``description`` + - ``gcs_fileset_spec`` + - ``gcs_fileset_spec.file_patterns`` + + - For entries with ``user_specified_type`` + + - ``schema`` + - ``display_name`` + - ``description`` + - user_specified_type + - user_specified_system + - linked_resource + - source_system_timestamps + """ + + entry = proto.Field(proto.MESSAGE, number=1, message="Entry",) + + update_mask = proto.Field(proto.MESSAGE, number=2, message=field_mask.FieldMask,) + + +class DeleteEntryRequest(proto.Message): + r"""Request message for + [DeleteEntry][google.cloud.datacatalog.v1.DataCatalog.DeleteEntry]. + + Attributes: + name (str): + Required. The name of the entry. Example: + + - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id} + """ + + name = proto.Field(proto.STRING, number=1) + + +class GetEntryRequest(proto.Message): + r"""Request message for + [GetEntry][google.cloud.datacatalog.v1.DataCatalog.GetEntry]. + + Attributes: + name (str): + Required. The name of the entry. Example: + + - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id} + """ + + name = proto.Field(proto.STRING, number=1) + + +class LookupEntryRequest(proto.Message): + r"""Request message for + [LookupEntry][google.cloud.datacatalog.v1.DataCatalog.LookupEntry]. + + Attributes: + linked_resource (str): + The full name of the Google Cloud Platform resource the Data + Catalog entry represents. See: + https://cloud.google.com/apis/design/resource_names#full_resource_name. + Full names are case-sensitive. + + Examples: + + - //bigquery.googleapis.com/projects/projectId/datasets/datasetId/tables/tableId + - //pubsub.googleapis.com/projects/projectId/topics/topicId + sql_resource (str): + The SQL name of the entry. SQL names are case-sensitive. 
+ + Examples: + + - ``pubsub.project_id.topic_id`` + - :literal:`pubsub.project_id.`topic.id.with.dots\`` + - ``bigquery.table.project_id.dataset_id.table_id`` + - ``bigquery.dataset.project_id.dataset_id`` + - ``datacatalog.entry.project_id.location_id.entry_group_id.entry_id`` + + ``*_id``\ s should satisfy the standard SQL rules for + identifiers. + https://cloud.google.com/bigquery/docs/reference/standard-sql/lexical. + """ + + linked_resource = proto.Field(proto.STRING, number=1, oneof="target_name") + + sql_resource = proto.Field(proto.STRING, number=3, oneof="target_name") + + +class Entry(proto.Message): + r"""Entry Metadata. A Data Catalog Entry resource represents another + resource in Google Cloud Platform (such as a BigQuery dataset or a + Pub/Sub topic) or outside of Google Cloud Platform. Clients can use + the ``linked_resource`` field in the Entry resource to refer to the + original resource ID of the source system. + + An Entry resource contains resource details, such as its schema. An + Entry can also be used to attach flexible metadata, such as a + [Tag][google.cloud.datacatalog.v1.Tag]. + + Attributes: + name (str): + The Data Catalog resource name of the entry in URL format. + Example: + + - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id} + + Note that this Entry and its child resources may not + actually be stored in the location in this name. + linked_resource (str): + The resource this metadata entry refers to. + + For Google Cloud Platform resources, ``linked_resource`` is + the `full name of the + resource `__. + For example, the ``linked_resource`` for a table resource + from BigQuery is: + + - //bigquery.googleapis.com/projects/projectId/datasets/datasetId/tables/tableId + + Output only when Entry is of type in the EntryType enum. For + entries with user_specified_type, this field is optional and + defaults to an empty string. + type (~.datacatalog.EntryType): + The type of the entry.
+ Only used for Entries with types in the + EntryType enum. + user_specified_type (str): + Entry type if it does not fit any of the input-allowed + values listed in ``EntryType`` enum above. When creating an + entry, users should check the enum values first; if nothing + matches the entry to be created, then provide a custom + value, for example "my_special_type". + ``user_specified_type`` strings must begin with a letter or + underscore and can only contain letters, numbers, and + underscores; are case insensitive; must be at least 1 + character and at most 64 characters long. + + Currently, only the FILESET enum value is allowed. All other + entries created through Data Catalog must use + ``user_specified_type``. + integrated_system (~.common.IntegratedSystem): + Output only. This field indicates the entry's + source system that Data Catalog integrates with, + such as BigQuery or Pub/Sub. + user_specified_system (str): + This field indicates the entry's source system that Data + Catalog does not integrate with. ``user_specified_system`` + strings must begin with a letter or underscore and can only + contain letters, numbers, and underscores; are case + insensitive; must be at least 1 character and at most 64 + characters long. + gcs_fileset_spec (~.gcd_gcs_fileset_spec.GcsFilesetSpec): + Specification that applies to a Cloud Storage + fileset. This is only valid on entries of type + FILESET. + bigquery_table_spec (~.table_spec.BigQueryTableSpec): + Specification that applies to a BigQuery table. This is only + valid on entries of type ``TABLE``. + bigquery_date_sharded_spec (~.table_spec.BigQueryDateShardedSpec): + Specification for a group of BigQuery tables with name + pattern ``[prefix]YYYYMMDD``. Context: + https://cloud.google.com/bigquery/docs/partitioned-tables#partitioning_versus_sharding. + display_name (str): + Display information such as title and + description. A short name to identify the entry, + for example, "Analytics Data - Jan 2011".
+ Default value is an empty string. + description (str): + Entry description, which can consist of + several sentences or paragraphs that describe + entry contents. Default value is an empty + string. + schema (~.gcd_schema.Schema): + Schema of the entry. An entry might not have + any schema attached to it. + source_system_timestamps (~.timestamps.SystemTimestamps): + Timestamps about the underlying resource, not about this + Data Catalog entry. Output only when Entry is of type in the + EntryType enum. For entries with user_specified_type, this + field is optional and defaults to an empty timestamp. + """ + + name = proto.Field(proto.STRING, number=1) + + linked_resource = proto.Field(proto.STRING, number=9) + + type = proto.Field(proto.ENUM, number=2, oneof="entry_type", enum="EntryType",) + + user_specified_type = proto.Field(proto.STRING, number=16, oneof="entry_type") + + integrated_system = proto.Field( + proto.ENUM, number=17, oneof="system", enum=common.IntegratedSystem, + ) + + user_specified_system = proto.Field(proto.STRING, number=18, oneof="system") + + gcs_fileset_spec = proto.Field( + proto.MESSAGE, + number=6, + oneof="type_spec", + message=gcd_gcs_fileset_spec.GcsFilesetSpec, + ) + + bigquery_table_spec = proto.Field( + proto.MESSAGE, + number=12, + oneof="type_spec", + message=table_spec.BigQueryTableSpec, + ) + + bigquery_date_sharded_spec = proto.Field( + proto.MESSAGE, + number=15, + oneof="type_spec", + message=table_spec.BigQueryDateShardedSpec, + ) + + display_name = proto.Field(proto.STRING, number=3) + + description = proto.Field(proto.STRING, number=4) + + schema = proto.Field(proto.MESSAGE, number=5, message=gcd_schema.Schema,) + + source_system_timestamps = proto.Field( + proto.MESSAGE, number=7, message=timestamps.SystemTimestamps, + ) + + +class EntryGroup(proto.Message): + r"""EntryGroup Metadata. 
An EntryGroup resource represents a logical + grouping of zero or more Data Catalog + [Entry][google.cloud.datacatalog.v1.Entry] resources. + + Attributes: + name (str): + The resource name of the entry group in URL format. Example: + + - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id} + + Note that this EntryGroup and its child resources may not + actually be stored in the location in this name. + display_name (str): + A short name to identify the entry group, for + example, "analytics data - jan 2011". Default + value is an empty string. + description (str): + Entry group description, which can consist of + several sentences or paragraphs that describe + entry group contents. Default value is an empty + string. + data_catalog_timestamps (~.timestamps.SystemTimestamps): + Output only. Timestamps about this + EntryGroup. Default value is empty timestamps. + """ + + name = proto.Field(proto.STRING, number=1) + + display_name = proto.Field(proto.STRING, number=2) + + description = proto.Field(proto.STRING, number=3) + + data_catalog_timestamps = proto.Field( + proto.MESSAGE, number=4, message=timestamps.SystemTimestamps, + ) + + +class CreateTagTemplateRequest(proto.Message): + r"""Request message for + [CreateTagTemplate][google.cloud.datacatalog.v1.DataCatalog.CreateTagTemplate]. + + Attributes: + parent (str): + Required. The name of the project and the template location + `region `__. + + Example: + + - projects/{project_id}/locations/us-central1 + tag_template_id (str): + Required. The id of the tag template to + create. + tag_template (~.gcd_tags.TagTemplate): + Required. The tag template to create. 
+ """ + + parent = proto.Field(proto.STRING, number=1) + + tag_template_id = proto.Field(proto.STRING, number=3) + + tag_template = proto.Field(proto.MESSAGE, number=2, message=gcd_tags.TagTemplate,) + + +class GetTagTemplateRequest(proto.Message): + r"""Request message for + [GetTagTemplate][google.cloud.datacatalog.v1.DataCatalog.GetTagTemplate]. + + Attributes: + name (str): + Required. The name of the tag template. Example: + + - projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id} + """ + + name = proto.Field(proto.STRING, number=1) + + +class UpdateTagTemplateRequest(proto.Message): + r"""Request message for + [UpdateTagTemplate][google.cloud.datacatalog.v1.DataCatalog.UpdateTagTemplate]. + + Attributes: + tag_template (~.gcd_tags.TagTemplate): + Required. The template to update. The "name" + field must be set. + update_mask (~.field_mask.FieldMask): + The field mask specifies the parts of the template to + overwrite. + + Allowed fields: + + - ``display_name`` + + If absent or empty, all of the allowed fields above will be + updated. + """ + + tag_template = proto.Field(proto.MESSAGE, number=1, message=gcd_tags.TagTemplate,) + + update_mask = proto.Field(proto.MESSAGE, number=2, message=field_mask.FieldMask,) + + +class DeleteTagTemplateRequest(proto.Message): + r"""Request message for + [DeleteTagTemplate][google.cloud.datacatalog.v1.DataCatalog.DeleteTagTemplate]. + + Attributes: + name (str): + Required. The name of the tag template to delete. Example: + + - projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id} + force (bool): + Required. Currently, this field must always be set to + ``true``. This confirms the deletion of any possible tags + using this template. ``force = false`` will be supported in + the future. 
+ """ + + name = proto.Field(proto.STRING, number=1) + + force = proto.Field(proto.BOOL, number=2) + + +class CreateTagRequest(proto.Message): + r"""Request message for + [CreateTag][google.cloud.datacatalog.v1.DataCatalog.CreateTag]. + + Attributes: + parent (str): + Required. The name of the resource to attach this tag to. + Tags can be attached to Entries. Example: + + - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id} + + Note that this Tag and its child resources may not actually + be stored in the location in this name. + tag (~.gcd_tags.Tag): + Required. The tag to create. + """ + + parent = proto.Field(proto.STRING, number=1) + + tag = proto.Field(proto.MESSAGE, number=2, message=gcd_tags.Tag,) + + +class UpdateTagRequest(proto.Message): + r"""Request message for + [UpdateTag][google.cloud.datacatalog.v1.DataCatalog.UpdateTag]. + + Attributes: + tag (~.gcd_tags.Tag): + Required. The updated tag. The "name" field + must be set. + update_mask (~.field_mask.FieldMask): + The fields to update on the Tag. If absent or empty, all + modifiable fields are updated. Currently the only modifiable + field is the field ``fields``. + """ + + tag = proto.Field(proto.MESSAGE, number=1, message=gcd_tags.Tag,) + + update_mask = proto.Field(proto.MESSAGE, number=2, message=field_mask.FieldMask,) + + +class DeleteTagRequest(proto.Message): + r"""Request message for + [DeleteTag][google.cloud.datacatalog.v1.DataCatalog.DeleteTag]. + + Attributes: + name (str): + Required. The name of the tag to delete. Example: + + - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id}/tags/{tag_id} + """ + + name = proto.Field(proto.STRING, number=1) + + +class CreateTagTemplateFieldRequest(proto.Message): + r"""Request message for + [CreateTagTemplateField][google.cloud.datacatalog.v1.DataCatalog.CreateTagTemplateField]. + + Attributes: + parent (str): + Required. 
The name of the project and the template location + `region `__. + + Example: + + - projects/{project_id}/locations/us-central1/tagTemplates/{tag_template_id} + tag_template_field_id (str): + Required. The ID of the tag template field to create. Field + IDs can contain letters (both uppercase and lowercase), + numbers (0-9), underscores (_) and dashes (-). Field IDs + must be at least 1 character long and at most 128 characters + long. Field IDs must also be unique within their template. + tag_template_field (~.gcd_tags.TagTemplateField): + Required. The tag template field to create. + """ + + parent = proto.Field(proto.STRING, number=1) + + tag_template_field_id = proto.Field(proto.STRING, number=2) + + tag_template_field = proto.Field( + proto.MESSAGE, number=3, message=gcd_tags.TagTemplateField, + ) + + +class UpdateTagTemplateFieldRequest(proto.Message): + r"""Request message for + [UpdateTagTemplateField][google.cloud.datacatalog.v1.DataCatalog.UpdateTagTemplateField]. + + Attributes: + name (str): + Required. The name of the tag template field. Example: + + - projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id}/fields/{tag_template_field_id} + tag_template_field (~.gcd_tags.TagTemplateField): + Required. The template to update. + update_mask (~.field_mask.FieldMask): + Optional. The field mask specifies the parts of the template + to be updated. Allowed fields: + + - ``display_name`` + - ``type.enum_type`` + - ``is_required`` + + If ``update_mask`` is not set or empty, all of the allowed + fields above will be updated. + + When updating an enum type, the provided values will be + merged with the existing values. Therefore, enum values can + only be added; existing enum values cannot be deleted or + renamed. Updating a template field from optional to required + is NOT allowed.
+ """ + + name = proto.Field(proto.STRING, number=1) + + tag_template_field = proto.Field( + proto.MESSAGE, number=2, message=gcd_tags.TagTemplateField, + ) + + update_mask = proto.Field(proto.MESSAGE, number=3, message=field_mask.FieldMask,) + + +class RenameTagTemplateFieldRequest(proto.Message): + r"""Request message for + [RenameTagTemplateField][google.cloud.datacatalog.v1.DataCatalog.RenameTagTemplateField]. + + Attributes: + name (str): + Required. The name of the tag template. Example: + + - projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id}/fields/{tag_template_field_id} + new_tag_template_field_id (str): + Required. The new ID of this tag template field. For + example, ``my_new_field``. + """ + + name = proto.Field(proto.STRING, number=1) + + new_tag_template_field_id = proto.Field(proto.STRING, number=2) + + +class DeleteTagTemplateFieldRequest(proto.Message): + r"""Request message for + [DeleteTagTemplateField][google.cloud.datacatalog.v1.DataCatalog.DeleteTagTemplateField]. + + Attributes: + name (str): + Required. The name of the tag template field to delete. + Example: + + - projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id}/fields/{tag_template_field_id} + force (bool): + Required. Currently, this field must always be set to + ``true``. This confirms the deletion of this field from any + tags using this field. ``force = false`` will be supported + in the future. + """ + + name = proto.Field(proto.STRING, number=1) + + force = proto.Field(proto.BOOL, number=2) + + +class ListTagsRequest(proto.Message): + r"""Request message for + [ListTags][google.cloud.datacatalog.v1.DataCatalog.ListTags]. + + Attributes: + parent (str): + Required. The name of the Data Catalog resource to list the + tags of. The resource could be an + [Entry][google.cloud.datacatalog.v1.Entry] or an + [EntryGroup][google.cloud.datacatalog.v1.EntryGroup]. 
+ + Examples: + + - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id} + - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id} + page_size (int): + The maximum number of tags to return. Default + is 10. Max limit is 1000. + page_token (str): + Token that specifies which page is requested. + If empty, the first page is returned. + """ + + parent = proto.Field(proto.STRING, number=1) + + page_size = proto.Field(proto.INT32, number=2) + + page_token = proto.Field(proto.STRING, number=3) + + +class ListTagsResponse(proto.Message): + r"""Response message for + [ListTags][google.cloud.datacatalog.v1.DataCatalog.ListTags]. + + Attributes: + tags (Sequence[~.gcd_tags.Tag]): + [Tag][google.cloud.datacatalog.v1.Tag] details. + next_page_token (str): + Token to retrieve the next page of results. + It is set to empty if no items remain in + results. + """ + + @property + def raw_page(self): + return self + + tags = proto.RepeatedField(proto.MESSAGE, number=1, message=gcd_tags.Tag,) + + next_page_token = proto.Field(proto.STRING, number=2) + + +class ListEntriesRequest(proto.Message): + r"""Request message for + [ListEntries][google.cloud.datacatalog.v1.DataCatalog.ListEntries]. + + Attributes: + parent (str): + Required. The name of the entry group that contains the + entries, which can be provided in URL format. Example: + + - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id} + page_size (int): + The maximum number of items to return. Default is 10. Max + limit is 1000. Throws an invalid argument for + ``page_size > 1000``. + page_token (str): + Token that specifies which page is requested. + If empty, the first page is returned. + read_mask (~.field_mask.FieldMask): + The fields to return for each Entry. If not set or empty, + all fields are returned. For example, setting read_mask to + contain only one path "name" will cause ListEntries to + return a list of Entries with only "name" field. 
+ """ + + parent = proto.Field(proto.STRING, number=1) + + page_size = proto.Field(proto.INT32, number=2) + + page_token = proto.Field(proto.STRING, number=3) + + read_mask = proto.Field(proto.MESSAGE, number=4, message=field_mask.FieldMask,) + + +class ListEntriesResponse(proto.Message): + r"""Response message for + [ListEntries][google.cloud.datacatalog.v1.DataCatalog.ListEntries]. + + Attributes: + entries (Sequence[~.datacatalog.Entry]): + Entry details. + next_page_token (str): + Token to retrieve the next page of results. + It is set to empty if no items remain in + results. + """ + + @property + def raw_page(self): + return self + + entries = proto.RepeatedField(proto.MESSAGE, number=1, message=Entry,) + + next_page_token = proto.Field(proto.STRING, number=2) + + +__all__ = tuple(sorted(__protobuf__.manifest)) diff --git a/google/cloud/datacatalog_v1/types/gcs_fileset_spec.py b/google/cloud/datacatalog_v1/types/gcs_fileset_spec.py new file mode 100644 index 00000000..64518aff --- /dev/null +++ b/google/cloud/datacatalog_v1/types/gcs_fileset_spec.py @@ -0,0 +1,102 @@ +# -*- coding: utf-8 -*- + +# Copyright 2020 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
+# + +import proto  # type: ignore + + +from google.cloud.datacatalog_v1.types import timestamps + + +__protobuf__ = proto.module( + package="google.cloud.datacatalog.v1", manifest={"GcsFilesetSpec", "GcsFileSpec",}, +) + + +class GcsFilesetSpec(proto.Message): + r"""Describes a Cloud Storage fileset entry. + + Attributes: + file_patterns (Sequence[str]): + Required. Patterns to identify a set of files in Google + Cloud Storage. See `Cloud Storage + documentation `__ + for more information. Note that bucket wildcards are + currently not supported. + + Examples of valid file_patterns: + + - ``gs://bucket_name/dir/*``: matches all files within + ``bucket_name/dir`` directory. + - ``gs://bucket_name/dir/**``: matches all files in + ``bucket_name/dir`` spanning all subdirectories. + - ``gs://bucket_name/file*``: matches files prefixed by + ``file`` in ``bucket_name`` + - ``gs://bucket_name/??.txt``: matches files with two + characters followed by ``.txt`` in ``bucket_name`` + - ``gs://bucket_name/[aeiou].txt``: matches files that + contain a single vowel character followed by ``.txt`` in + ``bucket_name`` + - ``gs://bucket_name/[a-m].txt``: matches files that + contain ``a``, ``b``, ... or ``m`` followed by ``.txt`` + in ``bucket_name`` + - ``gs://bucket_name/a/*/b``: matches all files in + ``bucket_name`` that match ``a/*/b`` pattern, such as + ``a/c/b``, ``a/d/b`` + - ``gs://another_bucket/a.txt``: matches + ``gs://another_bucket/a.txt`` + + You can combine wildcards to provide more powerful matches, + for example: + + - ``gs://bucket_name/[a-m]??.j*g`` + sample_gcs_file_specs (Sequence[~.gcs_fileset_spec.GcsFileSpec]): + Output only. Sample files contained in this + fileset; not all files contained in this fileset + are represented here.
+ """ + + file_patterns = proto.RepeatedField(proto.STRING, number=1) + + sample_gcs_file_specs = proto.RepeatedField( + proto.MESSAGE, number=2, message="GcsFileSpec", + ) + + +class GcsFileSpec(proto.Message): + r"""Specifications of a single file in Cloud Storage. + + Attributes: + file_path (str): + Required. The full file path. Example: + ``gs://bucket_name/a/b.txt``. + gcs_timestamps (~.timestamps.SystemTimestamps): + Output only. Timestamps about the Cloud + Storage file. + size_bytes (int): + Output only. The size of the file, in bytes. + """ + + file_path = proto.Field(proto.STRING, number=1) + + gcs_timestamps = proto.Field( + proto.MESSAGE, number=2, message=timestamps.SystemTimestamps, + ) + + size_bytes = proto.Field(proto.INT64, number=4) + + +__all__ = tuple(sorted(__protobuf__.manifest)) diff --git a/google/cloud/datacatalog_v1/types/schema.py b/google/cloud/datacatalog_v1/types/schema.py new file mode 100644 index 00000000..4a51a122 --- /dev/null +++ b/google/cloud/datacatalog_v1/types/schema.py @@ -0,0 +1,71 @@ +# -*- coding: utf-8 -*- + +# Copyright 2020 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +# + +import proto # type: ignore + + +__protobuf__ = proto.module( + package="google.cloud.datacatalog.v1", manifest={"Schema", "ColumnSchema",}, +) + + +class Schema(proto.Message): + r"""Represents a schema (e.g. BigQuery, GoogleSQL, Avro schema). + + Attributes: + columns (Sequence[~.schema.ColumnSchema]): + Required. Schema of columns. 
A maximum of + 10,000 columns and sub-columns can be specified. + """ + + columns = proto.RepeatedField(proto.MESSAGE, number=2, message="ColumnSchema",) + + +class ColumnSchema(proto.Message): + r"""Representation of a column within a schema. Columns could be + nested inside other columns. + + Attributes: + column (str): + Required. Name of the column. + type (str): + Required. Type of the column. + description (str): + Optional. Description of the column. Default + value is an empty string. + mode (str): + Optional. A column's mode indicates whether the values in + this column are required, nullable, etc. Only ``NULLABLE``, + ``REQUIRED`` and ``REPEATED`` are supported. Default mode is + ``NULLABLE``. + subcolumns (Sequence[~.schema.ColumnSchema]): + Optional. Schema of sub-columns. A column can + have zero or more sub-columns. + """ + + column = proto.Field(proto.STRING, number=6) + + type = proto.Field(proto.STRING, number=1) + + description = proto.Field(proto.STRING, number=2) + + mode = proto.Field(proto.STRING, number=3) + + subcolumns = proto.RepeatedField(proto.MESSAGE, number=7, message="ColumnSchema",) + + +__all__ = tuple(sorted(__protobuf__.manifest)) diff --git a/google/cloud/datacatalog_v1/types/search.py b/google/cloud/datacatalog_v1/types/search.py new file mode 100644 index 00000000..eb4370da --- /dev/null +++ b/google/cloud/datacatalog_v1/types/search.py @@ -0,0 +1,94 @@ +# -*- coding: utf-8 -*- + +# Copyright 2020 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
+# See the License for the specific language governing permissions and +# limitations under the License. +# + +import proto # type: ignore + + +from google.cloud.datacatalog_v1.types import common + + +__protobuf__ = proto.module( + package="google.cloud.datacatalog.v1", + manifest={"SearchResultType", "SearchCatalogResult",}, +) + + +class SearchResultType(proto.Enum): + r"""The different types of resources that can be returned in + search. + """ + SEARCH_RESULT_TYPE_UNSPECIFIED = 0 + ENTRY = 1 + TAG_TEMPLATE = 2 + ENTRY_GROUP = 3 + + +class SearchCatalogResult(proto.Message): + r"""A result that appears in the response of a search request. + Each result captures details of one entry that matches the + search. + + Attributes: + search_result_type (~.search.SearchResultType): + Type of the search result. This field can be + used to determine which Get method to call to + fetch the full resource. + search_result_subtype (str): + Sub-type of the search result. This is a dot-delimited + description of the resource's full type, and is the same as + the value callers would provide in the "type" search facet. + Examples: ``entry.table``, ``entry.dataStream``, + ``tagTemplate``. + relative_resource_name (str): + The relative resource name of the resource in URL format. + Examples: + + - ``projects/{project_id}/locations/{location_id}/entryGroups/{entry_group_id}/entries/{entry_id}`` + - ``projects/{project_id}/tagTemplates/{tag_template_id}`` + linked_resource (str): + The full name of the cloud resource the entry belongs to. + See: + https://cloud.google.com/apis/design/resource_names#full_resource_name. + Example: + + - ``//bigquery.googleapis.com/projects/projectId/datasets/datasetId/tables/tableId`` + integrated_system (~.common.IntegratedSystem): + Output only. This field indicates the entry's + source system that Data Catalog integrates with, + such as BigQuery or Cloud Pub/Sub. 
+ user_specified_system (str): + This field indicates the entry's source + system that Data Catalog does not integrate + with. + """ + + search_result_type = proto.Field(proto.ENUM, number=1, enum="SearchResultType",) + + search_result_subtype = proto.Field(proto.STRING, number=2) + + relative_resource_name = proto.Field(proto.STRING, number=3) + + linked_resource = proto.Field(proto.STRING, number=4) + + integrated_system = proto.Field( + proto.ENUM, number=8, oneof="system", enum=common.IntegratedSystem, + ) + + user_specified_system = proto.Field(proto.STRING, number=9, oneof="system") + + +__all__ = tuple(sorted(__protobuf__.manifest)) diff --git a/google/cloud/datacatalog_v1/types/table_spec.py b/google/cloud/datacatalog_v1/types/table_spec.py new file mode 100644 index 00000000..4c86f64f --- /dev/null +++ b/google/cloud/datacatalog_v1/types/table_spec.py @@ -0,0 +1,119 @@ +# -*- coding: utf-8 -*- + +# Copyright 2020 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +# + +import proto # type: ignore + + +__protobuf__ = proto.module( + package="google.cloud.datacatalog.v1", + manifest={ + "TableSourceType", + "BigQueryTableSpec", + "ViewSpec", + "TableSpec", + "BigQueryDateShardedSpec", + }, +) + + +class TableSourceType(proto.Enum): + r"""Table source type.""" + TABLE_SOURCE_TYPE_UNSPECIFIED = 0 + BIGQUERY_VIEW = 2 + BIGQUERY_TABLE = 5 + + +class BigQueryTableSpec(proto.Message): + r"""Describes a BigQuery table. 
+ + Attributes: + table_source_type (~.gcd_table_spec.TableSourceType): + Output only. The table source type. + view_spec (~.gcd_table_spec.ViewSpec): + Table view specification. This field should only be + populated if ``table_source_type`` is ``BIGQUERY_VIEW``. + table_spec (~.gcd_table_spec.TableSpec): + Spec of a BigQuery table. This field should only be + populated if ``table_source_type`` is ``BIGQUERY_TABLE``. + """ + + table_source_type = proto.Field(proto.ENUM, number=1, enum="TableSourceType",) + + view_spec = proto.Field( + proto.MESSAGE, number=2, oneof="type_spec", message="ViewSpec", + ) + + table_spec = proto.Field( + proto.MESSAGE, number=3, oneof="type_spec", message="TableSpec", + ) + + +class ViewSpec(proto.Message): + r"""Table view specification. + + Attributes: + view_query (str): + Output only. The query that defines the table + view. + """ + + view_query = proto.Field(proto.STRING, number=1) + + +class TableSpec(proto.Message): + r"""Normal BigQuery table spec. + + Attributes: + grouped_entry (str): + Output only. If the table is a dated shard, i.e., with name + pattern ``[prefix]YYYYMMDD``, ``grouped_entry`` is the Data + Catalog resource name of the date sharded grouped entry, for + example, + ``projects/{project_id}/locations/{location}/entrygroups/{entry_group_id}/entries/{entry_id}``. + Otherwise, ``grouped_entry`` is empty. + """ + + grouped_entry = proto.Field(proto.STRING, number=1) + + +class BigQueryDateShardedSpec(proto.Message): + r"""Spec for a group of BigQuery tables with name pattern + ``[prefix]YYYYMMDD``. Context: + https://cloud.google.com/bigquery/docs/partitioned-tables#partitioning_versus_sharding + + Attributes: + dataset (str): + Output only. The Data Catalog resource name of the dataset + entry the current table belongs to, for example, + ``projects/{project_id}/locations/{location}/entrygroups/{entry_group_id}/entries/{entry_id}``. + table_prefix (str): + Output only. The table name prefix of the shards. 
The name + of any given shard is ``[table_prefix]YYYYMMDD``, for + example, for shard ``MyTable20180101``, the ``table_prefix`` + is ``MyTable``. + shard_count (int): + Output only. Total number of shards. + """ + + dataset = proto.Field(proto.STRING, number=1) + + table_prefix = proto.Field(proto.STRING, number=2) + + shard_count = proto.Field(proto.INT64, number=3) + + +__all__ = tuple(sorted(__protobuf__.manifest)) diff --git a/google/cloud/datacatalog_v1/types/tags.py b/google/cloud/datacatalog_v1/types/tags.py new file mode 100644 index 00000000..8e6e94e0 --- /dev/null +++ b/google/cloud/datacatalog_v1/types/tags.py @@ -0,0 +1,291 @@ +# -*- coding: utf-8 -*- + +# Copyright 2020 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +# + +import proto # type: ignore + + +from google.protobuf import timestamp_pb2 as timestamp # type: ignore + + +__protobuf__ = proto.module( + package="google.cloud.datacatalog.v1", + manifest={"Tag", "TagField", "TagTemplate", "TagTemplateField", "FieldType",}, +) + + +class Tag(proto.Message): + r"""Tags are used to attach custom metadata to Data Catalog resources. + Tags conform to the specifications within their tag template. + + See `Data Catalog + IAM `__ for + information on the permissions needed to create or view tags. + + Attributes: + name (str): + The resource name of the tag in URL format. 
Example: + + - projects/{project_id}/locations/{location}/entrygroups/{entry_group_id}/entries/{entry_id}/tags/{tag_id} + + where ``tag_id`` is a system-generated identifier. Note that + this Tag may not actually be stored in the location in this + name. + template (str): + Required. The resource name of the tag template that this + tag uses. Example: + + - projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id} + + This field cannot be modified after creation. + template_display_name (str): + Output only. The display name of the tag + template. + column (str): + Resources like Entry can have schemas associated with them. + This scope allows users to attach tags to an individual + column based on that schema. + + For attaching a tag to a nested column, use ``.`` to + separate the column names. Example: + + - ``outer_column.inner_column`` + fields (Sequence[~.tags.Tag.FieldsEntry]): + Required. This maps the ID of a tag field to + the value of and additional information about + that field. Valid field IDs are defined by the + tag's template. A tag must have at least 1 field + and at most 500 fields. + """ + + name = proto.Field(proto.STRING, number=1) + + template = proto.Field(proto.STRING, number=2) + + template_display_name = proto.Field(proto.STRING, number=5) + + column = proto.Field(proto.STRING, number=4, oneof="scope") + + fields = proto.MapField(proto.STRING, proto.MESSAGE, number=3, message="TagField",) + + +class TagField(proto.Message): + r"""Contains the value and supporting information for a field within a + [Tag][google.cloud.datacatalog.v1.Tag]. + + Attributes: + display_name (str): + Output only. The display name of this field. + double_value (float): + Holds the value for a tag field with double + type. + string_value (str): + Holds the value for a tag field with string + type. + bool_value (bool): + Holds the value for a tag field with boolean + type. 
+ timestamp_value (~.timestamp.Timestamp): + Holds the value for a tag field with + timestamp type. + enum_value (~.tags.TagField.EnumValue): + Holds the value for a tag field with enum + type. This value must be one of the allowed + values in the definition of this enum. + order (int): + Output only. The order of this field with respect to other + fields in this tag. It can be set in + [Tag][google.cloud.datacatalog.v1.TagTemplateField.order]. + For example, a higher value can indicate a more important + field. The value can be negative. Multiple fields can have + the same order, and field orders within a tag do not have to + be sequential. + """ + + class EnumValue(proto.Message): + r"""Holds an enum value. + + Attributes: + display_name (str): + The display name of the enum value. + """ + + display_name = proto.Field(proto.STRING, number=1) + + display_name = proto.Field(proto.STRING, number=1) + + double_value = proto.Field(proto.DOUBLE, number=2, oneof="kind") + + string_value = proto.Field(proto.STRING, number=3, oneof="kind") + + bool_value = proto.Field(proto.BOOL, number=4, oneof="kind") + + timestamp_value = proto.Field( + proto.MESSAGE, number=5, oneof="kind", message=timestamp.Timestamp, + ) + + enum_value = proto.Field(proto.MESSAGE, number=6, oneof="kind", message=EnumValue,) + + order = proto.Field(proto.INT32, number=7) + + +class TagTemplate(proto.Message): + r"""A tag template defines a tag, which can have one or more typed + fields. The template is used to create and attach the tag to GCP + resources. `Tag template + roles `__ + provide permissions to create, edit, and use the template. See, for + example, the `TagTemplate + User `__ + role, which includes permission to use the tag template to tag + resources. + + Attributes: + name (str): + The resource name of the tag template in URL format. 
+ Example: + + - projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id} + + Note that this TagTemplate and its child resources may not + actually be stored in the location in this name. + display_name (str): + The display name for this template. Defaults + to an empty string. + fields (Sequence[~.tags.TagTemplate.FieldsEntry]): + Required. Map of tag template field IDs to the settings for + the field. This map is an exhaustive list of the allowed + fields. This map must contain at least one field and at most + 500 fields. + + The keys to this map are tag template field IDs. Field IDs + can contain letters (both uppercase and lowercase), numbers + (0-9) and underscores (_). Field IDs must be at least 1 + character long and at most 64 characters long. Field IDs + must start with a letter or underscore. + """ + + name = proto.Field(proto.STRING, number=1) + + display_name = proto.Field(proto.STRING, number=2) + + fields = proto.MapField( + proto.STRING, proto.MESSAGE, number=3, message="TagTemplateField", + ) + + +class TagTemplateField(proto.Message): + r"""The template for an individual field within a tag template. + + Attributes: + name (str): + Output only. The resource name of the tag template field in + URL format. Example: + + - projects/{project_id}/locations/{location}/tagTemplates/{tag_template}/fields/{field} + + Note that this TagTemplateField may not actually be stored + in the location in this name. + display_name (str): + The display name for this field. Defaults to + an empty string. + type (~.tags.FieldType): + Required. The type of value this tag field + can contain. + is_required (bool): + Whether this is a required field. Defaults to + false. + order (int): + The order of this field with respect to other + fields in this tag template. For example, a + higher value can indicate a more important + field. The value can be negative. 
Multiple + fields can have the same order, and field orders + within a tag do not have to be sequential. + """ + + name = proto.Field(proto.STRING, number=6) + + display_name = proto.Field(proto.STRING, number=1) + + type = proto.Field(proto.MESSAGE, number=2, message="FieldType",) + + is_required = proto.Field(proto.BOOL, number=3) + + order = proto.Field(proto.INT32, number=5) + + +class FieldType(proto.Message): + r""" + + Attributes: + primitive_type (~.tags.FieldType.PrimitiveType): + Represents primitive types - string, bool + etc. + enum_type (~.tags.FieldType.EnumType): + Represents an enum type. + """ + + class PrimitiveType(proto.Enum): + r"""""" + PRIMITIVE_TYPE_UNSPECIFIED = 0 + DOUBLE = 1 + STRING = 2 + BOOL = 3 + TIMESTAMP = 4 + + class EnumType(proto.Message): + r""" + + Attributes: + allowed_values (Sequence[~.tags.FieldType.EnumType.EnumValue]): + Required on create; optional on update. The + set of allowed values for this enum. This set + must not be empty, the display names of the + values in this set must not be empty and the + display names of the values must be case- + insensitively unique within this set. Currently, + enum values can only be added to the list of + allowed values. Deletion and renaming of enum + values are not supported. Can have up to 500 + allowed values. + """ + + class EnumValue(proto.Message): + r""" + + Attributes: + display_name (str): + Required. The display name of the enum value. + Must not be an empty string. 
+ """ + + display_name = proto.Field(proto.STRING, number=1) + + allowed_values = proto.RepeatedField( + proto.MESSAGE, number=1, message="FieldType.EnumType.EnumValue", + ) + + primitive_type = proto.Field( + proto.ENUM, number=1, oneof="type_decl", enum=PrimitiveType, + ) + + enum_type = proto.Field( + proto.MESSAGE, number=2, oneof="type_decl", message=EnumType, + ) + + +__all__ = tuple(sorted(__protobuf__.manifest)) diff --git a/google/cloud/datacatalog_v1/types/timestamps.py b/google/cloud/datacatalog_v1/types/timestamps.py new file mode 100644 index 00000000..451b9a43 --- /dev/null +++ b/google/cloud/datacatalog_v1/types/timestamps.py @@ -0,0 +1,53 @@ +# -*- coding: utf-8 -*- + +# Copyright 2020 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +# + +import proto # type: ignore + + +from google.protobuf import timestamp_pb2 as timestamp # type: ignore + + +__protobuf__ = proto.module( + package="google.cloud.datacatalog.v1", manifest={"SystemTimestamps",}, +) + + +class SystemTimestamps(proto.Message): + r"""Timestamps about this resource according to a particular + system. + + Attributes: + create_time (~.timestamp.Timestamp): + The creation time of the resource within the + given system. + update_time (~.timestamp.Timestamp): + The last-modified time of the resource within + the given system. + expire_time (~.timestamp.Timestamp): + Output only. The expiration time of the + resource within the given system. 
Currently only + applicable to BigQuery resources. + """ + + create_time = proto.Field(proto.MESSAGE, number=1, message=timestamp.Timestamp,) + + update_time = proto.Field(proto.MESSAGE, number=2, message=timestamp.Timestamp,) + + expire_time = proto.Field(proto.MESSAGE, number=3, message=timestamp.Timestamp,) + + +__all__ = tuple(sorted(__protobuf__.manifest)) diff --git a/google/cloud/datacatalog_v1beta1/__init__.py b/google/cloud/datacatalog_v1beta1/__init__.py index 9610cbbd..be0bdd8e 100644 --- a/google/cloud/datacatalog_v1beta1/__init__.py +++ b/google/cloud/datacatalog_v1beta1/__init__.py @@ -1,65 +1,169 @@ # -*- coding: utf-8 -*- -# + # Copyright 2020 Google LLC # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # -# https://www.apache.org/licenses/LICENSE-2.0 +# http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License.
+# - -from __future__ import absolute_import -import sys -import warnings - -from google.cloud.datacatalog_v1beta1 import types -from google.cloud.datacatalog_v1beta1.gapic import data_catalog_client -from google.cloud.datacatalog_v1beta1.gapic import enums -from google.cloud.datacatalog_v1beta1.gapic import policy_tag_manager_client -from google.cloud.datacatalog_v1beta1.gapic import ( - policy_tag_manager_serialization_client, +from .services.data_catalog import DataCatalogClient +from .services.policy_tag_manager import PolicyTagManagerClient +from .services.policy_tag_manager_serialization import ( + PolicyTagManagerSerializationClient, ) - - -if sys.version_info[:2] == (2, 7): - message = ( - "A future version of this library will drop support for Python 2.7. " - "More details about Python 2 support for Google Cloud Client Libraries " - "can be found at https://cloud.google.com/python/docs/python2-sunset/" - ) - warnings.warn(message, DeprecationWarning) - - -class DataCatalogClient(data_catalog_client.DataCatalogClient): - __doc__ = data_catalog_client.DataCatalogClient.__doc__ - enums = enums - - -class PolicyTagManagerClient(policy_tag_manager_client.PolicyTagManagerClient): - __doc__ = policy_tag_manager_client.PolicyTagManagerClient.__doc__ - enums = enums - - -class PolicyTagManagerSerializationClient( - policy_tag_manager_serialization_client.PolicyTagManagerSerializationClient -): - __doc__ = ( - policy_tag_manager_serialization_client.PolicyTagManagerSerializationClient.__doc__ - ) - enums = enums +from .types.common import IntegratedSystem +from .types.datacatalog import CreateEntryGroupRequest +from .types.datacatalog import CreateEntryRequest +from .types.datacatalog import CreateTagRequest +from .types.datacatalog import CreateTagTemplateFieldRequest +from .types.datacatalog import CreateTagTemplateRequest +from .types.datacatalog import DeleteEntryGroupRequest +from .types.datacatalog import DeleteEntryRequest +from .types.datacatalog import 
DeleteTagRequest +from .types.datacatalog import DeleteTagTemplateFieldRequest +from .types.datacatalog import DeleteTagTemplateRequest +from .types.datacatalog import Entry +from .types.datacatalog import EntryGroup +from .types.datacatalog import EntryType +from .types.datacatalog import GetEntryGroupRequest +from .types.datacatalog import GetEntryRequest +from .types.datacatalog import GetTagTemplateRequest +from .types.datacatalog import ListEntriesRequest +from .types.datacatalog import ListEntriesResponse +from .types.datacatalog import ListEntryGroupsRequest +from .types.datacatalog import ListEntryGroupsResponse +from .types.datacatalog import ListTagsRequest +from .types.datacatalog import ListTagsResponse +from .types.datacatalog import LookupEntryRequest +from .types.datacatalog import RenameTagTemplateFieldRequest +from .types.datacatalog import SearchCatalogRequest +from .types.datacatalog import SearchCatalogResponse +from .types.datacatalog import UpdateEntryGroupRequest +from .types.datacatalog import UpdateEntryRequest +from .types.datacatalog import UpdateTagRequest +from .types.datacatalog import UpdateTagTemplateFieldRequest +from .types.datacatalog import UpdateTagTemplateRequest +from .types.gcs_fileset_spec import GcsFileSpec +from .types.gcs_fileset_spec import GcsFilesetSpec +from .types.policytagmanager import CreatePolicyTagRequest +from .types.policytagmanager import CreateTaxonomyRequest +from .types.policytagmanager import DeletePolicyTagRequest +from .types.policytagmanager import DeleteTaxonomyRequest +from .types.policytagmanager import GetPolicyTagRequest +from .types.policytagmanager import GetTaxonomyRequest +from .types.policytagmanager import ListPolicyTagsRequest +from .types.policytagmanager import ListPolicyTagsResponse +from .types.policytagmanager import ListTaxonomiesRequest +from .types.policytagmanager import ListTaxonomiesResponse +from .types.policytagmanager import PolicyTag +from .types.policytagmanager import 
Taxonomy +from .types.policytagmanager import UpdatePolicyTagRequest +from .types.policytagmanager import UpdateTaxonomyRequest +from .types.policytagmanagerserialization import ExportTaxonomiesRequest +from .types.policytagmanagerserialization import ExportTaxonomiesResponse +from .types.policytagmanagerserialization import ImportTaxonomiesRequest +from .types.policytagmanagerserialization import ImportTaxonomiesResponse +from .types.policytagmanagerserialization import InlineSource +from .types.policytagmanagerserialization import SerializedPolicyTag +from .types.policytagmanagerserialization import SerializedTaxonomy +from .types.schema import ColumnSchema +from .types.schema import Schema +from .types.search import SearchCatalogResult +from .types.search import SearchResultType +from .types.table_spec import BigQueryDateShardedSpec +from .types.table_spec import BigQueryTableSpec +from .types.table_spec import TableSourceType +from .types.table_spec import TableSpec +from .types.table_spec import ViewSpec +from .types.tags import FieldType +from .types.tags import Tag +from .types.tags import TagField +from .types.tags import TagTemplate +from .types.tags import TagTemplateField +from .types.timestamps import SystemTimestamps __all__ = ( - "enums", - "types", + "BigQueryDateShardedSpec", + "BigQueryTableSpec", + "ColumnSchema", + "CreateEntryGroupRequest", + "CreateEntryRequest", + "CreatePolicyTagRequest", + "CreateTagRequest", + "CreateTagTemplateFieldRequest", + "CreateTagTemplateRequest", + "CreateTaxonomyRequest", "DataCatalogClient", + "DeleteEntryGroupRequest", + "DeleteEntryRequest", + "DeletePolicyTagRequest", + "DeleteTagRequest", + "DeleteTagTemplateFieldRequest", + "DeleteTagTemplateRequest", + "DeleteTaxonomyRequest", + "Entry", + "EntryGroup", + "EntryType", + "ExportTaxonomiesRequest", + "ExportTaxonomiesResponse", + "FieldType", + "GcsFileSpec", + "GcsFilesetSpec", + "GetEntryGroupRequest", + "GetEntryRequest", + "GetPolicyTagRequest", + 
"GetTagTemplateRequest", + "GetTaxonomyRequest", + "ImportTaxonomiesRequest", + "ImportTaxonomiesResponse", + "InlineSource", + "IntegratedSystem", + "ListEntriesRequest", + "ListEntriesResponse", + "ListEntryGroupsRequest", + "ListEntryGroupsResponse", + "ListPolicyTagsRequest", + "ListPolicyTagsResponse", + "ListTagsRequest", + "ListTagsResponse", + "ListTaxonomiesRequest", + "ListTaxonomiesResponse", + "LookupEntryRequest", + "PolicyTag", "PolicyTagManagerClient", + "RenameTagTemplateFieldRequest", + "Schema", + "SearchCatalogRequest", + "SearchCatalogResponse", + "SearchCatalogResult", + "SearchResultType", + "SerializedPolicyTag", + "SerializedTaxonomy", + "SystemTimestamps", + "TableSourceType", + "TableSpec", + "Tag", + "TagField", + "TagTemplate", + "TagTemplateField", + "Taxonomy", + "UpdateEntryGroupRequest", + "UpdateEntryRequest", + "UpdatePolicyTagRequest", + "UpdateTagRequest", + "UpdateTagTemplateFieldRequest", + "UpdateTagTemplateRequest", + "UpdateTaxonomyRequest", + "ViewSpec", "PolicyTagManagerSerializationClient", ) diff --git a/google/cloud/datacatalog_v1beta1/gapic/__init__.py b/google/cloud/datacatalog_v1beta1/gapic/__init__.py deleted file mode 100644 index e69de29b..00000000 diff --git a/google/cloud/datacatalog_v1beta1/gapic/data_catalog_client.py b/google/cloud/datacatalog_v1beta1/gapic/data_catalog_client.py deleted file mode 100644 index c7bc3d3c..00000000 --- a/google/cloud/datacatalog_v1beta1/gapic/data_catalog_client.py +++ /dev/null @@ -1,2715 +0,0 @@ -# -*- coding: utf-8 -*- -# -# Copyright 2020 Google LLC -# -# Licensed under the Apache License, Version 2.0 (the "License"); -# you may not use this file except in compliance with the License. 
-# You may obtain a copy of the License at -# -# https://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, software -# distributed under the License is distributed on an "AS IS" BASIS, -# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -# See the License for the specific language governing permissions and -# limitations under the License. - -"""Accesses the google.cloud.datacatalog.v1beta1 DataCatalog API.""" - -import functools -import pkg_resources -import warnings - -from google.oauth2 import service_account -import google.api_core.client_options -import google.api_core.gapic_v1.client_info -import google.api_core.gapic_v1.config -import google.api_core.gapic_v1.method -import google.api_core.gapic_v1.routing_header -import google.api_core.grpc_helpers -import google.api_core.page_iterator -import google.api_core.path_template -import google.api_core.protobuf_helpers -import grpc - -from google.cloud.datacatalog_v1beta1.gapic import data_catalog_client_config -from google.cloud.datacatalog_v1beta1.gapic import enums -from google.cloud.datacatalog_v1beta1.gapic.transports import ( - data_catalog_grpc_transport, -) -from google.cloud.datacatalog_v1beta1.proto import datacatalog_pb2 -from google.cloud.datacatalog_v1beta1.proto import datacatalog_pb2_grpc -from google.cloud.datacatalog_v1beta1.proto import tags_pb2 -from google.iam.v1 import iam_policy_pb2 -from google.iam.v1 import options_pb2 -from google.iam.v1 import policy_pb2 -from google.protobuf import empty_pb2 -from google.protobuf import field_mask_pb2 - - -_GAPIC_LIBRARY_VERSION = pkg_resources.get_distribution( - "google-cloud-datacatalog" -).version - - -class DataCatalogClient(object): - """ - Data Catalog API service allows clients to discover, understand, and manage - their data. 
- """ - - SERVICE_ADDRESS = "datacatalog.googleapis.com:443" - """The default address of the service.""" - - # The name of the interface for this client. This is the key used to - # find the method configuration in the client_config dictionary. - _INTERFACE_NAME = "google.cloud.datacatalog.v1beta1.DataCatalog" - - @classmethod - def from_service_account_file(cls, filename, *args, **kwargs): - """Creates an instance of this client using the provided credentials - file. - - Args: - filename (str): The path to the service account private key json - file. - args: Additional arguments to pass to the constructor. - kwargs: Additional arguments to pass to the constructor. - - Returns: - DataCatalogClient: The constructed client. - """ - credentials = service_account.Credentials.from_service_account_file(filename) - kwargs["credentials"] = credentials - return cls(*args, **kwargs) - - from_service_account_json = from_service_account_file - - @classmethod - def entry_path(cls, project, location, entry_group, entry): - """Return a fully-qualified entry string.""" - return google.api_core.path_template.expand( - "projects/{project}/locations/{location}/entryGroups/{entry_group}/entries/{entry}", - project=project, - location=location, - entry_group=entry_group, - entry=entry, - ) - - @classmethod - def entry_group_path(cls, project, location, entry_group): - """Return a fully-qualified entry_group string.""" - return google.api_core.path_template.expand( - "projects/{project}/locations/{location}/entryGroups/{entry_group}", - project=project, - location=location, - entry_group=entry_group, - ) - - @classmethod - def location_path(cls, project, location): - """Return a fully-qualified location string.""" - return google.api_core.path_template.expand( - "projects/{project}/locations/{location}", - project=project, - location=location, - ) - - @classmethod - def tag_path(cls, project, location, entry_group, entry, tag): - """Return a fully-qualified tag string.""" - return 
google.api_core.path_template.expand( - "projects/{project}/locations/{location}/entryGroups/{entry_group}/entries/{entry}/tags/{tag}", - project=project, - location=location, - entry_group=entry_group, - entry=entry, - tag=tag, - ) - - @classmethod - def tag_template_path(cls, project, location, tag_template): - """Return a fully-qualified tag_template string.""" - return google.api_core.path_template.expand( - "projects/{project}/locations/{location}/tagTemplates/{tag_template}", - project=project, - location=location, - tag_template=tag_template, - ) - - @classmethod - def tag_template_field_path(cls, project, location, tag_template, field): - """Return a fully-qualified tag_template_field string.""" - return google.api_core.path_template.expand( - "projects/{project}/locations/{location}/tagTemplates/{tag_template}/fields/{field}", - project=project, - location=location, - tag_template=tag_template, - field=field, - ) - - def __init__( - self, - transport=None, - channel=None, - credentials=None, - client_config=None, - client_info=None, - client_options=None, - ): - """Constructor. - - Args: - transport (Union[~.DataCatalogGrpcTransport, - Callable[[~.Credentials, type], ~.DataCatalogGrpcTransport]): A transport - instance, responsible for actually making the API calls. - The default transport uses the gRPC protocol. - This argument may also be a callable which returns a - transport instance. Callables will be sent the credentials - as the first argument and the default transport class as - the second argument. - channel (grpc.Channel): DEPRECATED. A ``Channel`` instance - through which to make calls. This argument is mutually exclusive - with ``credentials``; providing both will raise an exception. - credentials (google.auth.credentials.Credentials): The - authorization credentials to attach to requests. These - credentials identify this application to the service. 
If none - are specified, the client will attempt to ascertain the - credentials from the environment. - This argument is mutually exclusive with providing a - transport instance to ``transport``; doing so will raise - an exception. - client_config (dict): DEPRECATED. A dictionary of call options for - each method. If not specified, the default configuration is used. - client_info (google.api_core.gapic_v1.client_info.ClientInfo): - The client info used to send a user-agent string along with - API requests. If ``None``, then default info will be used. - Generally, you only need to set this if you're developing - your own client library. - client_options (Union[dict, google.api_core.client_options.ClientOptions]): - Client options used to set user options on the client. API Endpoint - should be set through client_options. - """ - # Raise deprecation warnings for things we want to go away. - if client_config is not None: - warnings.warn( - "The `client_config` argument is deprecated.", - PendingDeprecationWarning, - stacklevel=2, - ) - else: - client_config = data_catalog_client_config.config - - if channel: - warnings.warn( - "The `channel` argument is deprecated; use " "`transport` instead.", - PendingDeprecationWarning, - stacklevel=2, - ) - - api_endpoint = self.SERVICE_ADDRESS - if client_options: - if type(client_options) == dict: - client_options = google.api_core.client_options.from_dict( - client_options - ) - if client_options.api_endpoint: - api_endpoint = client_options.api_endpoint - - # Instantiate the transport. - # The transport is responsible for handling serialization and - # deserialization and actually sending data to the service. 
- if transport: - if callable(transport): - self.transport = transport( - credentials=credentials, - default_class=data_catalog_grpc_transport.DataCatalogGrpcTransport, - address=api_endpoint, - ) - else: - if credentials: - raise ValueError( - "Received both a transport instance and " - "credentials; these are mutually exclusive." - ) - self.transport = transport - else: - self.transport = data_catalog_grpc_transport.DataCatalogGrpcTransport( - address=api_endpoint, channel=channel, credentials=credentials - ) - - if client_info is None: - client_info = google.api_core.gapic_v1.client_info.ClientInfo( - gapic_version=_GAPIC_LIBRARY_VERSION - ) - else: - client_info.gapic_version = _GAPIC_LIBRARY_VERSION - self._client_info = client_info - - # Parse out the default settings for retry and timeout for each RPC - # from the client configuration. - # (Ordinarily, these are the defaults specified in the `*_config.py` - # file next to this one.) - self._method_configs = google.api_core.gapic_v1.config.parse_method_configs( - client_config["interfaces"][self._INTERFACE_NAME] - ) - - # Save a dictionary of cached API call functions. - # These are the actual callables which invoke the proper - # transport methods, wrapped with `wrap_method` to add retry, - # timeout, and the like. - self._inner_api_calls = {} - - # Service calls - def search_catalog( - self, - scope, - query, - page_size=None, - order_by=None, - retry=google.api_core.gapic_v1.method.DEFAULT, - timeout=google.api_core.gapic_v1.method.DEFAULT, - metadata=None, - ): - """ - Searches Data Catalog for multiple resources like entries, tags that - match a query. - - This is a custom method - (https://cloud.google.com/apis/design/custom_methods) and does not - return the complete resource, only the resource identifier and high - level fields. Clients can subsequently call ``Get`` methods. - - Note that Data Catalog search queries do not guarantee full recall.
- Query results that match your query may not be returned, even in - subsequent result pages. Also note that results returned (and not - returned) can vary across repeated search queries. - - See `Data Catalog Search - Syntax `__ - for more information. - - Example: - >>> from google.cloud import datacatalog_v1beta1 - >>> - >>> client = datacatalog_v1beta1.DataCatalogClient() - >>> - >>> # TODO: Initialize `scope`: - >>> scope = {} - >>> - >>> # TODO: Initialize `query`: - >>> query = '' - >>> - >>> # Iterate over all results - >>> for element in client.search_catalog(scope, query): - ... # process element - ... pass - >>> - >>> - >>> # Alternatively: - >>> - >>> # Iterate over results one page at a time - >>> for page in client.search_catalog(scope, query).pages: - ... for element in page: - ... # process element - ... pass - - Args: - scope (Union[dict, ~google.cloud.datacatalog_v1beta1.types.Scope]): Required. The scope of this search request. A ``scope`` that has - empty ``include_org_ids``, ``include_project_ids`` AND false - ``include_gcp_public_datasets`` is considered invalid. Data Catalog will - return an error in such a case. - - If a dict is provided, it must be of the same form as the protobuf - message :class:`~google.cloud.datacatalog_v1beta1.types.Scope` - query (str): Required. The query string in search query syntax. The query must be - non-empty. - - Query strings can be simple as "x" or more qualified as: - - - name:x - - column:x - - description:y - - Note: Query tokens need to have a minimum of 3 characters for substring - matching to work correctly. See `Data Catalog Search - Syntax `__ - for more information. - page_size (int): The maximum number of resources contained in the - underlying API response. If page streaming is performed per- - resource, this parameter does not affect the return value. If page - streaming is performed per-page, this determines the maximum number - of resources in a page. 
- order_by (str): Specifies the ordering of results, currently supported - case-sensitive choices are: - - - ``relevance``, only supports descending - - ``last_modified_timestamp [asc|desc]``, defaults to descending if not - specified - - If not specified, defaults to ``relevance`` descending. - retry (Optional[google.api_core.retry.Retry]): A retry object used - to retry requests. If ``None`` is specified, requests will - be retried using a default configuration. - timeout (Optional[float]): The amount of time, in seconds, to wait - for the request to complete. Note that if ``retry`` is - specified, the timeout applies to each individual attempt. - metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata - that is provided to the method. - - Returns: - A :class:`~google.api_core.page_iterator.PageIterator` instance. - An iterable of :class:`~google.cloud.datacatalog_v1beta1.types.SearchCatalogResult` instances. - You can also iterate over the pages of the response - using its `pages` property. - - Raises: - google.api_core.exceptions.GoogleAPICallError: If the request - failed for any reason. - google.api_core.exceptions.RetryError: If the request failed due - to a retryable error and retry attempts failed. - ValueError: If the parameters are invalid. - """ - # Wrap the transport method to add retry and timeout logic. 
- if "search_catalog" not in self._inner_api_calls: - self._inner_api_calls[ - "search_catalog" - ] = google.api_core.gapic_v1.method.wrap_method( - self.transport.search_catalog, - default_retry=self._method_configs["SearchCatalog"].retry, - default_timeout=self._method_configs["SearchCatalog"].timeout, - client_info=self._client_info, - ) - - request = datacatalog_pb2.SearchCatalogRequest( - scope=scope, query=query, page_size=page_size, order_by=order_by - ) - iterator = google.api_core.page_iterator.GRPCIterator( - client=None, - method=functools.partial( - self._inner_api_calls["search_catalog"], - retry=retry, - timeout=timeout, - metadata=metadata, - ), - request=request, - items_field="results", - request_token_field="page_token", - response_token_field="next_page_token", - ) - return iterator - - def get_entry( - self, - name, - retry=google.api_core.gapic_v1.method.DEFAULT, - timeout=google.api_core.gapic_v1.method.DEFAULT, - metadata=None, - ): - """ - Gets an entry. - - Example: - >>> from google.cloud import datacatalog_v1beta1 - >>> - >>> client = datacatalog_v1beta1.DataCatalogClient() - >>> - >>> name = client.entry_path('[PROJECT]', '[LOCATION]', '[ENTRY_GROUP]', '[ENTRY]') - >>> - >>> response = client.get_entry(name) - - Args: - name (str): Required. The name of the entry. Example: - - - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id} - retry (Optional[google.api_core.retry.Retry]): A retry object used - to retry requests. If ``None`` is specified, requests will - be retried using a default configuration. - timeout (Optional[float]): The amount of time, in seconds, to wait - for the request to complete. Note that if ``retry`` is - specified, the timeout applies to each individual attempt. - metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata - that is provided to the method. - - Returns: - A :class:`~google.cloud.datacatalog_v1beta1.types.Entry` instance. 
- - Raises: - google.api_core.exceptions.GoogleAPICallError: If the request - failed for any reason. - google.api_core.exceptions.RetryError: If the request failed due - to a retryable error and retry attempts failed. - ValueError: If the parameters are invalid. - """ - # Wrap the transport method to add retry and timeout logic. - if "get_entry" not in self._inner_api_calls: - self._inner_api_calls[ - "get_entry" - ] = google.api_core.gapic_v1.method.wrap_method( - self.transport.get_entry, - default_retry=self._method_configs["GetEntry"].retry, - default_timeout=self._method_configs["GetEntry"].timeout, - client_info=self._client_info, - ) - - request = datacatalog_pb2.GetEntryRequest(name=name) - if metadata is None: - metadata = [] - metadata = list(metadata) - try: - routing_header = [("name", name)] - except AttributeError: - pass - else: - routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata( - routing_header - ) - metadata.append(routing_metadata) - - return self._inner_api_calls["get_entry"]( - request, retry=retry, timeout=timeout, metadata=metadata - ) - - def lookup_entry( - self, - linked_resource=None, - sql_resource=None, - retry=google.api_core.gapic_v1.method.DEFAULT, - timeout=google.api_core.gapic_v1.method.DEFAULT, - metadata=None, - ): - """ - Get an entry by target resource name. This method allows clients to use - the resource name from the source Google Cloud Platform service to get the - Data Catalog Entry. - - Example: - >>> from google.cloud import datacatalog_v1beta1 - >>> - >>> client = datacatalog_v1beta1.DataCatalogClient() - >>> - >>> response = client.lookup_entry() - - Args: - linked_resource (str): The full name of the Google Cloud Platform resource the Data Catalog - entry represents. See: - https://cloud.google.com/apis/design/resource_names#full_resource_name. - Full names are case-sensitive. 
- - Examples: - - - //bigquery.googleapis.com/projects/projectId/datasets/datasetId/tables/tableId - - //pubsub.googleapis.com/projects/projectId/topics/topicId - sql_resource (str): The SQL name of the entry. SQL names are case-sensitive. - - Examples: - - - ``pubsub.project_id.topic_id`` - - :literal:`pubsub.project_id.`topic.id.with.dots\`` - - ``bigquery.table.project_id.dataset_id.table_id`` - - ``bigquery.dataset.project_id.dataset_id`` - - ``datacatalog.entry.project_id.location_id.entry_group_id.entry_id`` - - ``*_id``\ s should satisfy the standard SQL rules for identifiers. - https://cloud.google.com/bigquery/docs/reference/standard-sql/lexical. - retry (Optional[google.api_core.retry.Retry]): A retry object used - to retry requests. If ``None`` is specified, requests will - be retried using a default configuration. - timeout (Optional[float]): The amount of time, in seconds, to wait - for the request to complete. Note that if ``retry`` is - specified, the timeout applies to each individual attempt. - metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata - that is provided to the method. - - Returns: - A :class:`~google.cloud.datacatalog_v1beta1.types.Entry` instance. - - Raises: - google.api_core.exceptions.GoogleAPICallError: If the request - failed for any reason. - google.api_core.exceptions.RetryError: If the request failed due - to a retryable error and retry attempts failed. - ValueError: If the parameters are invalid. - """ - # Wrap the transport method to add retry and timeout logic. - if "lookup_entry" not in self._inner_api_calls: - self._inner_api_calls[ - "lookup_entry" - ] = google.api_core.gapic_v1.method.wrap_method( - self.transport.lookup_entry, - default_retry=self._method_configs["LookupEntry"].retry, - default_timeout=self._method_configs["LookupEntry"].timeout, - client_info=self._client_info, - ) - - # Sanity check: We have some fields which are mutually exclusive; - # raise ValueError if more than one is sent.
- google.api_core.protobuf_helpers.check_oneof( - linked_resource=linked_resource, sql_resource=sql_resource - ) - - request = datacatalog_pb2.LookupEntryRequest( - linked_resource=linked_resource, sql_resource=sql_resource - ) - return self._inner_api_calls["lookup_entry"]( - request, retry=retry, timeout=timeout, metadata=metadata - ) - - def create_entry_group( - self, - parent, - entry_group_id, - entry_group=None, - retry=google.api_core.gapic_v1.method.DEFAULT, - timeout=google.api_core.gapic_v1.method.DEFAULT, - metadata=None, - ): - """ - Creates an EntryGroup. A maximum of 10,000 entry groups may be created per organization - across all locations. - - Users should enable the Data Catalog API in the project identified by - the ``parent`` parameter (see [Data Catalog Resource Project] - (https://cloud.google.com/data-catalog/docs/concepts/resource-project) - for more information). - - Example: - >>> from google.cloud import datacatalog_v1beta1 - >>> - >>> client = datacatalog_v1beta1.DataCatalogClient() - >>> - >>> parent = client.location_path('[PROJECT]', '[LOCATION]') - >>> - >>> # TODO: Initialize `entry_group_id`: - >>> entry_group_id = '' - >>> - >>> response = client.create_entry_group(parent, entry_group_id) - - Args: - parent (str): Required. The name of the project this entry group is in. Example: - - - projects/{project_id}/locations/{location} - - Note that this EntryGroup and its child resources may not actually be - stored in the location in this name. - entry_group_id (str): Required. The id of the entry group to create. - The id must begin with a letter or underscore, contain only English - letters, numbers and underscores, and be at most 64 characters. - entry_group (Union[dict, ~google.cloud.datacatalog_v1beta1.types.EntryGroup]): The entry group to create. Defaults to an empty entry group.
- - If a dict is provided, it must be of the same form as the protobuf - message :class:`~google.cloud.datacatalog_v1beta1.types.EntryGroup` - retry (Optional[google.api_core.retry.Retry]): A retry object used - to retry requests. If ``None`` is specified, requests will - be retried using a default configuration. - timeout (Optional[float]): The amount of time, in seconds, to wait - for the request to complete. Note that if ``retry`` is - specified, the timeout applies to each individual attempt. - metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata - that is provided to the method. - - Returns: - A :class:`~google.cloud.datacatalog_v1beta1.types.EntryGroup` instance. - - Raises: - google.api_core.exceptions.GoogleAPICallError: If the request - failed for any reason. - google.api_core.exceptions.RetryError: If the request failed due - to a retryable error and retry attempts failed. - ValueError: If the parameters are invalid. - """ - # Wrap the transport method to add retry and timeout logic. 
- if "create_entry_group" not in self._inner_api_calls: - self._inner_api_calls[ - "create_entry_group" - ] = google.api_core.gapic_v1.method.wrap_method( - self.transport.create_entry_group, - default_retry=self._method_configs["CreateEntryGroup"].retry, - default_timeout=self._method_configs["CreateEntryGroup"].timeout, - client_info=self._client_info, - ) - - request = datacatalog_pb2.CreateEntryGroupRequest( - parent=parent, entry_group_id=entry_group_id, entry_group=entry_group - ) - if metadata is None: - metadata = [] - metadata = list(metadata) - try: - routing_header = [("parent", parent)] - except AttributeError: - pass - else: - routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata( - routing_header - ) - metadata.append(routing_metadata) - - return self._inner_api_calls["create_entry_group"]( - request, retry=retry, timeout=timeout, metadata=metadata - ) - - def update_entry_group( - self, - entry_group, - update_mask=None, - retry=google.api_core.gapic_v1.method.DEFAULT, - timeout=google.api_core.gapic_v1.method.DEFAULT, - metadata=None, - ): - """ - Updates an EntryGroup. The user should enable the Data Catalog API - in the project identified by the ``entry_group.name`` parameter (see - [Data Catalog Resource Project] - (https://cloud.google.com/data-catalog/docs/concepts/resource-project) - for more information). - - Example: - >>> from google.cloud import datacatalog_v1beta1 - >>> - >>> client = datacatalog_v1beta1.DataCatalogClient() - >>> - >>> # TODO: Initialize `entry_group`: - >>> entry_group = {} - >>> - >>> response = client.update_entry_group(entry_group) - - Args: - entry_group (Union[dict, ~google.cloud.datacatalog_v1beta1.types.EntryGroup]): Required. The updated entry group. "name" field must be set. 
- - If a dict is provided, it must be of the same form as the protobuf - message :class:`~google.cloud.datacatalog_v1beta1.types.EntryGroup` - update_mask (Union[dict, ~google.cloud.datacatalog_v1beta1.types.FieldMask]): The fields to update on the entry group. If absent or empty, all modifiable - fields are updated. - - If a dict is provided, it must be of the same form as the protobuf - message :class:`~google.cloud.datacatalog_v1beta1.types.FieldMask` - retry (Optional[google.api_core.retry.Retry]): A retry object used - to retry requests. If ``None`` is specified, requests will - be retried using a default configuration. - timeout (Optional[float]): The amount of time, in seconds, to wait - for the request to complete. Note that if ``retry`` is - specified, the timeout applies to each individual attempt. - metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata - that is provided to the method. - - Returns: - A :class:`~google.cloud.datacatalog_v1beta1.types.EntryGroup` instance. - - Raises: - google.api_core.exceptions.GoogleAPICallError: If the request - failed for any reason. - google.api_core.exceptions.RetryError: If the request failed due - to a retryable error and retry attempts failed. - ValueError: If the parameters are invalid. - """ - # Wrap the transport method to add retry and timeout logic. 
- if "update_entry_group" not in self._inner_api_calls: - self._inner_api_calls[ - "update_entry_group" - ] = google.api_core.gapic_v1.method.wrap_method( - self.transport.update_entry_group, - default_retry=self._method_configs["UpdateEntryGroup"].retry, - default_timeout=self._method_configs["UpdateEntryGroup"].timeout, - client_info=self._client_info, - ) - - request = datacatalog_pb2.UpdateEntryGroupRequest( - entry_group=entry_group, update_mask=update_mask - ) - if metadata is None: - metadata = [] - metadata = list(metadata) - try: - routing_header = [("entry_group.name", entry_group.name)] - except AttributeError: - pass - else: - routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata( - routing_header - ) - metadata.append(routing_metadata) - - return self._inner_api_calls["update_entry_group"]( - request, retry=retry, timeout=timeout, metadata=metadata - ) - - def get_entry_group( - self, - name, - read_mask=None, - retry=google.api_core.gapic_v1.method.DEFAULT, - timeout=google.api_core.gapic_v1.method.DEFAULT, - metadata=None, - ): - """ - Gets an EntryGroup. - - Example: - >>> from google.cloud import datacatalog_v1beta1 - >>> - >>> client = datacatalog_v1beta1.DataCatalogClient() - >>> - >>> name = client.entry_group_path('[PROJECT]', '[LOCATION]', '[ENTRY_GROUP]') - >>> - >>> response = client.get_entry_group(name) - - Args: - name (str): Required. The name of the entry group. For example, - ``projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}``. - read_mask (Union[dict, ~google.cloud.datacatalog_v1beta1.types.FieldMask]): The fields to return. If not set or empty, all fields are returned. - - If a dict is provided, it must be of the same form as the protobuf - message :class:`~google.cloud.datacatalog_v1beta1.types.FieldMask` - retry (Optional[google.api_core.retry.Retry]): A retry object used - to retry requests. If ``None`` is specified, requests will - be retried using a default configuration. 
- timeout (Optional[float]): The amount of time, in seconds, to wait - for the request to complete. Note that if ``retry`` is - specified, the timeout applies to each individual attempt. - metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata - that is provided to the method. - - Returns: - A :class:`~google.cloud.datacatalog_v1beta1.types.EntryGroup` instance. - - Raises: - google.api_core.exceptions.GoogleAPICallError: If the request - failed for any reason. - google.api_core.exceptions.RetryError: If the request failed due - to a retryable error and retry attempts failed. - ValueError: If the parameters are invalid. - """ - # Wrap the transport method to add retry and timeout logic. - if "get_entry_group" not in self._inner_api_calls: - self._inner_api_calls[ - "get_entry_group" - ] = google.api_core.gapic_v1.method.wrap_method( - self.transport.get_entry_group, - default_retry=self._method_configs["GetEntryGroup"].retry, - default_timeout=self._method_configs["GetEntryGroup"].timeout, - client_info=self._client_info, - ) - - request = datacatalog_pb2.GetEntryGroupRequest(name=name, read_mask=read_mask) - if metadata is None: - metadata = [] - metadata = list(metadata) - try: - routing_header = [("name", name)] - except AttributeError: - pass - else: - routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata( - routing_header - ) - metadata.append(routing_metadata) - - return self._inner_api_calls["get_entry_group"]( - request, retry=retry, timeout=timeout, metadata=metadata - ) - - def delete_entry_group( - self, - name, - force=None, - retry=google.api_core.gapic_v1.method.DEFAULT, - timeout=google.api_core.gapic_v1.method.DEFAULT, - metadata=None, - ): - """ - Deletes an EntryGroup. Only entry groups that do not contain entries - can be deleted. 
Users should enable the Data Catalog API in the project - identified by the ``name`` parameter (see [Data Catalog Resource - Project] - (https://cloud.google.com/data-catalog/docs/concepts/resource-project) - for more information). - - Example: - >>> from google.cloud import datacatalog_v1beta1 - >>> - >>> client = datacatalog_v1beta1.DataCatalogClient() - >>> - >>> name = client.entry_group_path('[PROJECT]', '[LOCATION]', '[ENTRY_GROUP]') - >>> - >>> client.delete_entry_group(name) - - Args: - name (str): Required. The name of the entry group. For example, - ``projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}``. - force (bool): Optional. If true, deletes all entries in the entry group. - retry (Optional[google.api_core.retry.Retry]): A retry object used - to retry requests. If ``None`` is specified, requests will - be retried using a default configuration. - timeout (Optional[float]): The amount of time, in seconds, to wait - for the request to complete. Note that if ``retry`` is - specified, the timeout applies to each individual attempt. - metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata - that is provided to the method. - - Raises: - google.api_core.exceptions.GoogleAPICallError: If the request - failed for any reason. - google.api_core.exceptions.RetryError: If the request failed due - to a retryable error and retry attempts failed. - ValueError: If the parameters are invalid. - """ - # Wrap the transport method to add retry and timeout logic. 
- if "delete_entry_group" not in self._inner_api_calls: - self._inner_api_calls[ - "delete_entry_group" - ] = google.api_core.gapic_v1.method.wrap_method( - self.transport.delete_entry_group, - default_retry=self._method_configs["DeleteEntryGroup"].retry, - default_timeout=self._method_configs["DeleteEntryGroup"].timeout, - client_info=self._client_info, - ) - - request = datacatalog_pb2.DeleteEntryGroupRequest(name=name, force=force) - if metadata is None: - metadata = [] - metadata = list(metadata) - try: - routing_header = [("name", name)] - except AttributeError: - pass - else: - routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata( - routing_header - ) - metadata.append(routing_metadata) - - self._inner_api_calls["delete_entry_group"]( - request, retry=retry, timeout=timeout, metadata=metadata - ) - - def list_entry_groups( - self, - parent, - page_size=None, - retry=google.api_core.gapic_v1.method.DEFAULT, - timeout=google.api_core.gapic_v1.method.DEFAULT, - metadata=None, - ): - """ - Lists entry groups. - - Example: - >>> from google.cloud import datacatalog_v1beta1 - >>> - >>> client = datacatalog_v1beta1.DataCatalogClient() - >>> - >>> parent = client.entry_group_path('[PROJECT]', '[LOCATION]', '[ENTRY_GROUP]') - >>> - >>> # Iterate over all results - >>> for element in client.list_entry_groups(parent): - ... # process element - ... pass - >>> - >>> - >>> # Alternatively: - >>> - >>> # Iterate over results one page at a time - >>> for page in client.list_entry_groups(parent).pages: - ... for element in page: - ... # process element - ... pass - - Args: - parent (str): Required. The name of the location that contains the entry groups, - which can be provided in URL format. Example: - - - projects/{project_id}/locations/{location} - page_size (int): The maximum number of resources contained in the - underlying API response. If page streaming is performed per- - resource, this parameter does not affect the return value. 
If page - streaming is performed per-page, this determines the maximum number - of resources in a page. - retry (Optional[google.api_core.retry.Retry]): A retry object used - to retry requests. If ``None`` is specified, requests will - be retried using a default configuration. - timeout (Optional[float]): The amount of time, in seconds, to wait - for the request to complete. Note that if ``retry`` is - specified, the timeout applies to each individual attempt. - metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata - that is provided to the method. - - Returns: - A :class:`~google.api_core.page_iterator.PageIterator` instance. - An iterable of :class:`~google.cloud.datacatalog_v1beta1.types.EntryGroup` instances. - You can also iterate over the pages of the response - using its `pages` property. - - Raises: - google.api_core.exceptions.GoogleAPICallError: If the request - failed for any reason. - google.api_core.exceptions.RetryError: If the request failed due - to a retryable error and retry attempts failed. - ValueError: If the parameters are invalid. - """ - # Wrap the transport method to add retry and timeout logic. 
- if "list_entry_groups" not in self._inner_api_calls: - self._inner_api_calls[ - "list_entry_groups" - ] = google.api_core.gapic_v1.method.wrap_method( - self.transport.list_entry_groups, - default_retry=self._method_configs["ListEntryGroups"].retry, - default_timeout=self._method_configs["ListEntryGroups"].timeout, - client_info=self._client_info, - ) - - request = datacatalog_pb2.ListEntryGroupsRequest( - parent=parent, page_size=page_size - ) - if metadata is None: - metadata = [] - metadata = list(metadata) - try: - routing_header = [("parent", parent)] - except AttributeError: - pass - else: - routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata( - routing_header - ) - metadata.append(routing_metadata) - - iterator = google.api_core.page_iterator.GRPCIterator( - client=None, - method=functools.partial( - self._inner_api_calls["list_entry_groups"], - retry=retry, - timeout=timeout, - metadata=metadata, - ), - request=request, - items_field="entry_groups", - request_token_field="page_token", - response_token_field="next_page_token", - ) - return iterator - - def create_entry( - self, - parent, - entry_id, - entry, - retry=google.api_core.gapic_v1.method.DEFAULT, - timeout=google.api_core.gapic_v1.method.DEFAULT, - metadata=None, - ): - """ - Creates an entry. Only entries of 'FILESET' type or user-specified - type can be created. - - Users should enable the Data Catalog API in the project identified by - the ``parent`` parameter (see [Data Catalog Resource Project] - (https://cloud.google.com/data-catalog/docs/concepts/resource-project) - for more information). - - A maximum of 100,000 entries may be created per entry group. 
- - Example: - >>> from google.cloud import datacatalog_v1beta1 - >>> - >>> client = datacatalog_v1beta1.DataCatalogClient() - >>> - >>> parent = client.entry_group_path('[PROJECT]', '[LOCATION]', '[ENTRY_GROUP]') - >>> - >>> # TODO: Initialize `entry_id`: - >>> entry_id = '' - >>> - >>> # TODO: Initialize `entry`: - >>> entry = {} - >>> - >>> response = client.create_entry(parent, entry_id, entry) - - Args: - parent (str): Required. The name of the entry group this entry is in. Example: - - - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id} - - Note that this Entry and its child resources may not actually be stored - in the location in this name. - entry_id (str): Required. The id of the entry to create. - entry (Union[dict, ~google.cloud.datacatalog_v1beta1.types.Entry]): Required. The entry to create. - - If a dict is provided, it must be of the same form as the protobuf - message :class:`~google.cloud.datacatalog_v1beta1.types.Entry` - retry (Optional[google.api_core.retry.Retry]): A retry object used - to retry requests. If ``None`` is specified, requests will - be retried using a default configuration. - timeout (Optional[float]): The amount of time, in seconds, to wait - for the request to complete. Note that if ``retry`` is - specified, the timeout applies to each individual attempt. - metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata - that is provided to the method. - - Returns: - A :class:`~google.cloud.datacatalog_v1beta1.types.Entry` instance. - - Raises: - google.api_core.exceptions.GoogleAPICallError: If the request - failed for any reason. - google.api_core.exceptions.RetryError: If the request failed due - to a retryable error and retry attempts failed. - ValueError: If the parameters are invalid. - """ - # Wrap the transport method to add retry and timeout logic. 
- if "create_entry" not in self._inner_api_calls: - self._inner_api_calls[ - "create_entry" - ] = google.api_core.gapic_v1.method.wrap_method( - self.transport.create_entry, - default_retry=self._method_configs["CreateEntry"].retry, - default_timeout=self._method_configs["CreateEntry"].timeout, - client_info=self._client_info, - ) - - request = datacatalog_pb2.CreateEntryRequest( - parent=parent, entry_id=entry_id, entry=entry - ) - if metadata is None: - metadata = [] - metadata = list(metadata) - try: - routing_header = [("parent", parent)] - except AttributeError: - pass - else: - routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata( - routing_header - ) - metadata.append(routing_metadata) - - return self._inner_api_calls["create_entry"]( - request, retry=retry, timeout=timeout, metadata=metadata - ) - - def update_entry( - self, - entry, - update_mask=None, - retry=google.api_core.gapic_v1.method.DEFAULT, - timeout=google.api_core.gapic_v1.method.DEFAULT, - metadata=None, - ): - """ - Updates an existing entry. Users should enable the Data Catalog API - in the project identified by the ``entry.name`` parameter (see [Data - Catalog Resource Project] - (https://cloud.google.com/data-catalog/docs/concepts/resource-project) - for more information). - - Example: - >>> from google.cloud import datacatalog_v1beta1 - >>> - >>> client = datacatalog_v1beta1.DataCatalogClient() - >>> - >>> # TODO: Initialize `entry`: - >>> entry = {} - >>> - >>> response = client.update_entry(entry) - - Args: - entry (Union[dict, ~google.cloud.datacatalog_v1beta1.types.Entry]): Required. The updated entry. The "name" field must be set. - - If a dict is provided, it must be of the same form as the protobuf - message :class:`~google.cloud.datacatalog_v1beta1.types.Entry` - update_mask (Union[dict, ~google.cloud.datacatalog_v1beta1.types.FieldMask]): The fields to update on the entry. If absent or empty, all - modifiable fields are updated. 
- - The following fields are modifiable: - - - For entries with type ``DATA_STREAM``: - - - ``schema`` - - - For entries with type ``FILESET`` - - - ``schema`` - - ``display_name`` - - ``description`` - - ``gcs_fileset_spec`` - - ``gcs_fileset_spec.file_patterns`` - - - For entries with ``user_specified_type`` - - - ``schema`` - - ``display_name`` - - ``description`` - - user_specified_type - - user_specified_system - - linked_resource - - source_system_timestamps - - If a dict is provided, it must be of the same form as the protobuf - message :class:`~google.cloud.datacatalog_v1beta1.types.FieldMask` - retry (Optional[google.api_core.retry.Retry]): A retry object used - to retry requests. If ``None`` is specified, requests will - be retried using a default configuration. - timeout (Optional[float]): The amount of time, in seconds, to wait - for the request to complete. Note that if ``retry`` is - specified, the timeout applies to each individual attempt. - metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata - that is provided to the method. - - Returns: - A :class:`~google.cloud.datacatalog_v1beta1.types.Entry` instance. - - Raises: - google.api_core.exceptions.GoogleAPICallError: If the request - failed for any reason. - google.api_core.exceptions.RetryError: If the request failed due - to a retryable error and retry attempts failed. - ValueError: If the parameters are invalid. - """ - # Wrap the transport method to add retry and timeout logic. 
- if "update_entry" not in self._inner_api_calls: - self._inner_api_calls[ - "update_entry" - ] = google.api_core.gapic_v1.method.wrap_method( - self.transport.update_entry, - default_retry=self._method_configs["UpdateEntry"].retry, - default_timeout=self._method_configs["UpdateEntry"].timeout, - client_info=self._client_info, - ) - - request = datacatalog_pb2.UpdateEntryRequest( - entry=entry, update_mask=update_mask - ) - if metadata is None: - metadata = [] - metadata = list(metadata) - try: - routing_header = [("entry.name", entry.name)] - except AttributeError: - pass - else: - routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata( - routing_header - ) - metadata.append(routing_metadata) - - return self._inner_api_calls["update_entry"]( - request, retry=retry, timeout=timeout, metadata=metadata - ) - - def delete_entry( - self, - name, - retry=google.api_core.gapic_v1.method.DEFAULT, - timeout=google.api_core.gapic_v1.method.DEFAULT, - metadata=None, - ): - """ - Deletes an existing entry. Only entries created through - ``CreateEntry`` method can be deleted. Users should enable the Data - Catalog API in the project identified by the ``name`` parameter (see - [Data Catalog Resource Project] - (https://cloud.google.com/data-catalog/docs/concepts/resource-project) - for more information). - - Example: - >>> from google.cloud import datacatalog_v1beta1 - >>> - >>> client = datacatalog_v1beta1.DataCatalogClient() - >>> - >>> name = client.entry_path('[PROJECT]', '[LOCATION]', '[ENTRY_GROUP]', '[ENTRY]') - >>> - >>> client.delete_entry(name) - - Args: - name (str): Required. The name of the entry. Example: - - - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id} - retry (Optional[google.api_core.retry.Retry]): A retry object used - to retry requests. If ``None`` is specified, requests will - be retried using a default configuration. 
- timeout (Optional[float]): The amount of time, in seconds, to wait - for the request to complete. Note that if ``retry`` is - specified, the timeout applies to each individual attempt. - metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata - that is provided to the method. - - Raises: - google.api_core.exceptions.GoogleAPICallError: If the request - failed for any reason. - google.api_core.exceptions.RetryError: If the request failed due - to a retryable error and retry attempts failed. - ValueError: If the parameters are invalid. - """ - # Wrap the transport method to add retry and timeout logic. - if "delete_entry" not in self._inner_api_calls: - self._inner_api_calls[ - "delete_entry" - ] = google.api_core.gapic_v1.method.wrap_method( - self.transport.delete_entry, - default_retry=self._method_configs["DeleteEntry"].retry, - default_timeout=self._method_configs["DeleteEntry"].timeout, - client_info=self._client_info, - ) - - request = datacatalog_pb2.DeleteEntryRequest(name=name) - if metadata is None: - metadata = [] - metadata = list(metadata) - try: - routing_header = [("name", name)] - except AttributeError: - pass - else: - routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata( - routing_header - ) - metadata.append(routing_metadata) - - self._inner_api_calls["delete_entry"]( - request, retry=retry, timeout=timeout, metadata=metadata - ) - - def list_entries( - self, - parent, - page_size=None, - read_mask=None, - retry=google.api_core.gapic_v1.method.DEFAULT, - timeout=google.api_core.gapic_v1.method.DEFAULT, - metadata=None, - ): - """ - Lists entries. - - Example: - >>> from google.cloud import datacatalog_v1beta1 - >>> - >>> client = datacatalog_v1beta1.DataCatalogClient() - >>> - >>> parent = client.entry_group_path('[PROJECT]', '[LOCATION]', '[ENTRY_GROUP]') - >>> - >>> # Iterate over all results - >>> for element in client.list_entries(parent): - ... # process element - ... 
pass - >>> - >>> - >>> # Alternatively: - >>> - >>> # Iterate over results one page at a time - >>> for page in client.list_entries(parent).pages: - ... for element in page: - ... # process element - ... pass - - Args: - parent (str): Required. The name of the entry group that contains the entries, - which can be provided in URL format. Example: - - - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id} - page_size (int): The maximum number of resources contained in the - underlying API response. If page streaming is performed per- - resource, this parameter does not affect the return value. If page - streaming is performed per-page, this determines the maximum number - of resources in a page. - read_mask (Union[dict, ~google.cloud.datacatalog_v1beta1.types.FieldMask]): The fields to return for each Entry. If not set or empty, all fields - are returned. For example, setting read_mask to contain only one path - "name" will cause ListEntries to return a list of Entries with only - "name" field. - - If a dict is provided, it must be of the same form as the protobuf - message :class:`~google.cloud.datacatalog_v1beta1.types.FieldMask` - retry (Optional[google.api_core.retry.Retry]): A retry object used - to retry requests. If ``None`` is specified, requests will - be retried using a default configuration. - timeout (Optional[float]): The amount of time, in seconds, to wait - for the request to complete. Note that if ``retry`` is - specified, the timeout applies to each individual attempt. - metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata - that is provided to the method. - - Returns: - A :class:`~google.api_core.page_iterator.PageIterator` instance. - An iterable of :class:`~google.cloud.datacatalog_v1beta1.types.Entry` instances. - You can also iterate over the pages of the response - using its `pages` property. - - Raises: - google.api_core.exceptions.GoogleAPICallError: If the request - failed for any reason. 
- google.api_core.exceptions.RetryError: If the request failed due - to a retryable error and retry attempts failed. - ValueError: If the parameters are invalid. - """ - # Wrap the transport method to add retry and timeout logic. - if "list_entries" not in self._inner_api_calls: - self._inner_api_calls[ - "list_entries" - ] = google.api_core.gapic_v1.method.wrap_method( - self.transport.list_entries, - default_retry=self._method_configs["ListEntries"].retry, - default_timeout=self._method_configs["ListEntries"].timeout, - client_info=self._client_info, - ) - - request = datacatalog_pb2.ListEntriesRequest( - parent=parent, page_size=page_size, read_mask=read_mask - ) - if metadata is None: - metadata = [] - metadata = list(metadata) - try: - routing_header = [("parent", parent)] - except AttributeError: - pass - else: - routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata( - routing_header - ) - metadata.append(routing_metadata) - - iterator = google.api_core.page_iterator.GRPCIterator( - client=None, - method=functools.partial( - self._inner_api_calls["list_entries"], - retry=retry, - timeout=timeout, - metadata=metadata, - ), - request=request, - items_field="entries", - request_token_field="page_token", - response_token_field="next_page_token", - ) - return iterator - - def create_tag_template( - self, - parent, - tag_template_id, - tag_template, - retry=google.api_core.gapic_v1.method.DEFAULT, - timeout=google.api_core.gapic_v1.method.DEFAULT, - metadata=None, - ): - """ - Creates a tag template. The user should enable the Data Catalog API - in the project identified by the ``parent`` parameter (see `Data Catalog - Resource - Project `__ - for more information). 
- - Example: - >>> from google.cloud import datacatalog_v1beta1 - >>> - >>> client = datacatalog_v1beta1.DataCatalogClient() - >>> - >>> parent = client.location_path('[PROJECT]', '[LOCATION]') - >>> - >>> # TODO: Initialize `tag_template_id`: - >>> tag_template_id = '' - >>> - >>> # TODO: Initialize `tag_template`: - >>> tag_template = {} - >>> - >>> response = client.create_tag_template(parent, tag_template_id, tag_template) - - Args: - parent (str): Required. The name of the project and the template location - [region](https://cloud.google.com/data-catalog/docs/concepts/regions). - - Example: - - - projects/{project_id}/locations/us-central1 - tag_template_id (str): Required. The ID of the tag template to create. - tag_template (Union[dict, ~google.cloud.datacatalog_v1beta1.types.TagTemplate]): Required. The tag template to create. - - If a dict is provided, it must be of the same form as the protobuf - message :class:`~google.cloud.datacatalog_v1beta1.types.TagTemplate` - retry (Optional[google.api_core.retry.Retry]): A retry object used - to retry requests. If ``None`` is specified, requests will - be retried using a default configuration. - timeout (Optional[float]): The amount of time, in seconds, to wait - for the request to complete. Note that if ``retry`` is - specified, the timeout applies to each individual attempt. - metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata - that is provided to the method. - - Returns: - A :class:`~google.cloud.datacatalog_v1beta1.types.TagTemplate` instance. - - Raises: - google.api_core.exceptions.GoogleAPICallError: If the request - failed for any reason. - google.api_core.exceptions.RetryError: If the request failed due - to a retryable error and retry attempts failed. - ValueError: If the parameters are invalid. - """ - # Wrap the transport method to add retry and timeout logic.
- if "create_tag_template" not in self._inner_api_calls: - self._inner_api_calls[ - "create_tag_template" - ] = google.api_core.gapic_v1.method.wrap_method( - self.transport.create_tag_template, - default_retry=self._method_configs["CreateTagTemplate"].retry, - default_timeout=self._method_configs["CreateTagTemplate"].timeout, - client_info=self._client_info, - ) - - request = datacatalog_pb2.CreateTagTemplateRequest( - parent=parent, tag_template_id=tag_template_id, tag_template=tag_template - ) - if metadata is None: - metadata = [] - metadata = list(metadata) - try: - routing_header = [("parent", parent)] - except AttributeError: - pass - else: - routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata( - routing_header - ) - metadata.append(routing_metadata) - - return self._inner_api_calls["create_tag_template"]( - request, retry=retry, timeout=timeout, metadata=metadata - ) - - def get_tag_template( - self, - name, - retry=google.api_core.gapic_v1.method.DEFAULT, - timeout=google.api_core.gapic_v1.method.DEFAULT, - metadata=None, - ): - """ - Gets a tag template. - - Example: - >>> from google.cloud import datacatalog_v1beta1 - >>> - >>> client = datacatalog_v1beta1.DataCatalogClient() - >>> - >>> name = client.tag_template_path('[PROJECT]', '[LOCATION]', '[TAG_TEMPLATE]') - >>> - >>> response = client.get_tag_template(name) - - Args: - name (str): Required. The name of the tag template. Example: - - - projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id} - retry (Optional[google.api_core.retry.Retry]): A retry object used - to retry requests. If ``None`` is specified, requests will - be retried using a default configuration. - timeout (Optional[float]): The amount of time, in seconds, to wait - for the request to complete. Note that if ``retry`` is - specified, the timeout applies to each individual attempt. - metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata - that is provided to the method. 
- - Returns: - A :class:`~google.cloud.datacatalog_v1beta1.types.TagTemplate` instance. - - Raises: - google.api_core.exceptions.GoogleAPICallError: If the request - failed for any reason. - google.api_core.exceptions.RetryError: If the request failed due - to a retryable error and retry attempts failed. - ValueError: If the parameters are invalid. - """ - # Wrap the transport method to add retry and timeout logic. - if "get_tag_template" not in self._inner_api_calls: - self._inner_api_calls[ - "get_tag_template" - ] = google.api_core.gapic_v1.method.wrap_method( - self.transport.get_tag_template, - default_retry=self._method_configs["GetTagTemplate"].retry, - default_timeout=self._method_configs["GetTagTemplate"].timeout, - client_info=self._client_info, - ) - - request = datacatalog_pb2.GetTagTemplateRequest(name=name) - if metadata is None: - metadata = [] - metadata = list(metadata) - try: - routing_header = [("name", name)] - except AttributeError: - pass - else: - routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata( - routing_header - ) - metadata.append(routing_metadata) - - return self._inner_api_calls["get_tag_template"]( - request, retry=retry, timeout=timeout, metadata=metadata - ) - - def update_tag_template( - self, - tag_template, - update_mask=None, - retry=google.api_core.gapic_v1.method.DEFAULT, - timeout=google.api_core.gapic_v1.method.DEFAULT, - metadata=None, - ): - """ - Updates a tag template. This method cannot be used to update the - fields of a template. The tag template fields are represented as - separate resources and should be updated using their own - create/update/delete methods. Users should enable the Data Catalog API - in the project identified by the ``tag_template.name`` parameter (see - [Data Catalog Resource Project] - (https://cloud.google.com/data-catalog/docs/concepts/resource-project) - for more information). 
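Editor's note: wherever these docstrings accept a dict in place of a protobuf ``FieldMask``, the dict mirrors the message's single repeated string field, ``paths``. A minimal sketch for ``update_tag_template``, whose docs allow only ``display_name``; the helper and its client-side validation are illustrative, not part of the library (the server performs the authoritative check):

```python
# Sketch: dict form of a FieldMask is {"paths": [...]}.
# ALLOWED_PATHS reflects the allowed fields documented for
# update_tag_template; this pre-check is a local convenience only.
ALLOWED_PATHS = {"display_name"}

def make_update_mask(paths):
    unknown = set(paths) - ALLOWED_PATHS
    if unknown:
        raise ValueError("unsupported update_mask paths: %s" % sorted(unknown))
    return {"paths": list(paths)}

mask = make_update_mask(["display_name"])
```

Such a dict can be passed as the ``update_mask`` argument exactly where the docstring says "If a dict is provided, it must be of the same form as the protobuf message FieldMask".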
- - Example: - >>> from google.cloud import datacatalog_v1beta1 - >>> - >>> client = datacatalog_v1beta1.DataCatalogClient() - >>> - >>> # TODO: Initialize `tag_template`: - >>> tag_template = {} - >>> - >>> response = client.update_tag_template(tag_template) - - Args: - tag_template (Union[dict, ~google.cloud.datacatalog_v1beta1.types.TagTemplate]): Required. The template to update. The "name" field must be set. - - If a dict is provided, it must be of the same form as the protobuf - message :class:`~google.cloud.datacatalog_v1beta1.types.TagTemplate` - update_mask (Union[dict, ~google.cloud.datacatalog_v1beta1.types.FieldMask]): The field mask specifies the parts of the template to overwrite. - - Allowed fields: - - - ``display_name`` - - If absent or empty, all of the allowed fields above will be updated. - - If a dict is provided, it must be of the same form as the protobuf - message :class:`~google.cloud.datacatalog_v1beta1.types.FieldMask` - retry (Optional[google.api_core.retry.Retry]): A retry object used - to retry requests. If ``None`` is specified, requests will - be retried using a default configuration. - timeout (Optional[float]): The amount of time, in seconds, to wait - for the request to complete. Note that if ``retry`` is - specified, the timeout applies to each individual attempt. - metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata - that is provided to the method. - - Returns: - A :class:`~google.cloud.datacatalog_v1beta1.types.TagTemplate` instance. - - Raises: - google.api_core.exceptions.GoogleAPICallError: If the request - failed for any reason. - google.api_core.exceptions.RetryError: If the request failed due - to a retryable error and retry attempts failed. - ValueError: If the parameters are invalid. - """ - # Wrap the transport method to add retry and timeout logic. 
- if "update_tag_template" not in self._inner_api_calls: - self._inner_api_calls[ - "update_tag_template" - ] = google.api_core.gapic_v1.method.wrap_method( - self.transport.update_tag_template, - default_retry=self._method_configs["UpdateTagTemplate"].retry, - default_timeout=self._method_configs["UpdateTagTemplate"].timeout, - client_info=self._client_info, - ) - - request = datacatalog_pb2.UpdateTagTemplateRequest( - tag_template=tag_template, update_mask=update_mask - ) - if metadata is None: - metadata = [] - metadata = list(metadata) - try: - routing_header = [("tag_template.name", tag_template.name)] - except AttributeError: - pass - else: - routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata( - routing_header - ) - metadata.append(routing_metadata) - - return self._inner_api_calls["update_tag_template"]( - request, retry=retry, timeout=timeout, metadata=metadata - ) - - def delete_tag_template( - self, - name, - force, - retry=google.api_core.gapic_v1.method.DEFAULT, - timeout=google.api_core.gapic_v1.method.DEFAULT, - metadata=None, - ): - """ - Deletes a tag template and all tags using the template. Users should - enable the Data Catalog API in the project identified by the ``name`` - parameter (see [Data Catalog Resource Project] - (https://cloud.google.com/data-catalog/docs/concepts/resource-project) - for more information). - - Example: - >>> from google.cloud import datacatalog_v1beta1 - >>> - >>> client = datacatalog_v1beta1.DataCatalogClient() - >>> - >>> name = client.tag_template_path('[PROJECT]', '[LOCATION]', '[TAG_TEMPLATE]') - >>> - >>> # TODO: Initialize `force`: - >>> force = False - >>> - >>> client.delete_tag_template(name, force) - - Args: - name (str): Required. The name of the tag template to delete. Example: - - - projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id} - force (bool): Required. Currently, this field must always be set to ``true``. 
This - confirms the deletion of any possible tags using this template. - ``force = false`` will be supported in the future. - retry (Optional[google.api_core.retry.Retry]): A retry object used - to retry requests. If ``None`` is specified, requests will - be retried using a default configuration. - timeout (Optional[float]): The amount of time, in seconds, to wait - for the request to complete. Note that if ``retry`` is - specified, the timeout applies to each individual attempt. - metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata - that is provided to the method. - - Raises: - google.api_core.exceptions.GoogleAPICallError: If the request - failed for any reason. - google.api_core.exceptions.RetryError: If the request failed due - to a retryable error and retry attempts failed. - ValueError: If the parameters are invalid. - """ - # Wrap the transport method to add retry and timeout logic. - if "delete_tag_template" not in self._inner_api_calls: - self._inner_api_calls[ - "delete_tag_template" - ] = google.api_core.gapic_v1.method.wrap_method( - self.transport.delete_tag_template, - default_retry=self._method_configs["DeleteTagTemplate"].retry, - default_timeout=self._method_configs["DeleteTagTemplate"].timeout, - client_info=self._client_info, - ) - - request = datacatalog_pb2.DeleteTagTemplateRequest(name=name, force=force) - if metadata is None: - metadata = [] - metadata = list(metadata) - try: - routing_header = [("name", name)] - except AttributeError: - pass - else: - routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata( - routing_header - ) - metadata.append(routing_metadata) - - self._inner_api_calls["delete_tag_template"]( - request, retry=retry, timeout=timeout, metadata=metadata - ) - - def create_tag_template_field( - self, - parent, - tag_template_field_id, - tag_template_field, - retry=google.api_core.gapic_v1.method.DEFAULT, - timeout=google.api_core.gapic_v1.method.DEFAULT, - metadata=None, - ): - """ - 
Creates a field in a tag template. The user should enable the Data - Catalog API in the project identified by the ``parent`` parameter (see - `Data Catalog Resource - Project `__ - for more information). - - Example: - >>> from google.cloud import datacatalog_v1beta1 - >>> - >>> client = datacatalog_v1beta1.DataCatalogClient() - >>> - >>> parent = client.tag_template_path('[PROJECT]', '[LOCATION]', '[TAG_TEMPLATE]') - >>> - >>> # TODO: Initialize `tag_template_field_id`: - >>> tag_template_field_id = '' - >>> - >>> # TODO: Initialize `tag_template_field`: - >>> tag_template_field = {} - >>> - >>> response = client.create_tag_template_field(parent, tag_template_field_id, tag_template_field) - - Args: - parent (str): Required. The name of the project and the template location - `region `__. - - Example: - - - projects/{project_id}/locations/us-central1/tagTemplates/{tag_template_id} - tag_template_field_id (str): Required. The ID of the tag template field to create. Field ids can - contain letters (both uppercase and lowercase), numbers (0-9), - underscores (_) and dashes (-). Field IDs must be at least 1 character - long and at most 128 characters long. Field IDs must also be unique - within their template. - tag_template_field (Union[dict, ~google.cloud.datacatalog_v1beta1.types.TagTemplateField]): Required. The tag template field to create. - - If a dict is provided, it must be of the same form as the protobuf - message :class:`~google.cloud.datacatalog_v1beta1.types.TagTemplateField` - retry (Optional[google.api_core.retry.Retry]): A retry object used - to retry requests. If ``None`` is specified, requests will - be retried using a default configuration. - timeout (Optional[float]): The amount of time, in seconds, to wait - for the request to complete. Note that if ``retry`` is - specified, the timeout applies to each individual attempt. - metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata - that is provided to the method. 
- - Returns: - A :class:`~google.cloud.datacatalog_v1beta1.types.TagTemplateField` instance. - - Raises: - google.api_core.exceptions.GoogleAPICallError: If the request - failed for any reason. - google.api_core.exceptions.RetryError: If the request failed due - to a retryable error and retry attempts failed. - ValueError: If the parameters are invalid. - """ - # Wrap the transport method to add retry and timeout logic. - if "create_tag_template_field" not in self._inner_api_calls: - self._inner_api_calls[ - "create_tag_template_field" - ] = google.api_core.gapic_v1.method.wrap_method( - self.transport.create_tag_template_field, - default_retry=self._method_configs["CreateTagTemplateField"].retry, - default_timeout=self._method_configs["CreateTagTemplateField"].timeout, - client_info=self._client_info, - ) - - request = datacatalog_pb2.CreateTagTemplateFieldRequest( - parent=parent, - tag_template_field_id=tag_template_field_id, - tag_template_field=tag_template_field, - ) - if metadata is None: - metadata = [] - metadata = list(metadata) - try: - routing_header = [("parent", parent)] - except AttributeError: - pass - else: - routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata( - routing_header - ) - metadata.append(routing_metadata) - - return self._inner_api_calls["create_tag_template_field"]( - request, retry=retry, timeout=timeout, metadata=metadata - ) - - def update_tag_template_field( - self, - name, - tag_template_field, - update_mask=None, - retry=google.api_core.gapic_v1.method.DEFAULT, - timeout=google.api_core.gapic_v1.method.DEFAULT, - metadata=None, - ): - """ - Updates a field in a tag template. This method cannot be used to - update the field type. Users should enable the Data Catalog API in the - project identified by the ``name`` parameter (see [Data Catalog Resource - Project] - (https://cloud.google.com/data-catalog/docs/concepts/resource-project) - for more information). 
- - Example: - >>> from google.cloud import datacatalog_v1beta1 - >>> - >>> client = datacatalog_v1beta1.DataCatalogClient() - >>> - >>> name = client.tag_template_field_path('[PROJECT]', '[LOCATION]', '[TAG_TEMPLATE]', '[FIELD]') - >>> - >>> # TODO: Initialize `tag_template_field`: - >>> tag_template_field = {} - >>> - >>> response = client.update_tag_template_field(name, tag_template_field) - - Args: - name (str): Required. The name of the tag template field. Example: - - - projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id}/fields/{tag_template_field_id} - tag_template_field (Union[dict, ~google.cloud.datacatalog_v1beta1.types.TagTemplateField]): Required. The template to update. - - If a dict is provided, it must be of the same form as the protobuf - message :class:`~google.cloud.datacatalog_v1beta1.types.TagTemplateField` - update_mask (Union[dict, ~google.cloud.datacatalog_v1beta1.types.FieldMask]): Optional. The field mask specifies the parts of the template to be - updated. Allowed fields: - - - ``display_name`` - - ``type.enum_type`` - - ``is_required`` - - If ``update_mask`` is not set or empty, all of the allowed fields above - will be updated. - - When updating an enum type, the provided values will be merged with the - existing values. Therefore, enum values can only be added, existing enum - values cannot be deleted nor renamed. Updating a template field from - optional to required is NOT allowed. - - If a dict is provided, it must be of the same form as the protobuf - message :class:`~google.cloud.datacatalog_v1beta1.types.FieldMask` - retry (Optional[google.api_core.retry.Retry]): A retry object used - to retry requests. If ``None`` is specified, requests will - be retried using a default configuration. - timeout (Optional[float]): The amount of time, in seconds, to wait - for the request to complete. Note that if ``retry`` is - specified, the timeout applies to each individual attempt. 
- metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata - that is provided to the method. - - Returns: - A :class:`~google.cloud.datacatalog_v1beta1.types.TagTemplateField` instance. - - Raises: - google.api_core.exceptions.GoogleAPICallError: If the request - failed for any reason. - google.api_core.exceptions.RetryError: If the request failed due - to a retryable error and retry attempts failed. - ValueError: If the parameters are invalid. - """ - # Wrap the transport method to add retry and timeout logic. - if "update_tag_template_field" not in self._inner_api_calls: - self._inner_api_calls[ - "update_tag_template_field" - ] = google.api_core.gapic_v1.method.wrap_method( - self.transport.update_tag_template_field, - default_retry=self._method_configs["UpdateTagTemplateField"].retry, - default_timeout=self._method_configs["UpdateTagTemplateField"].timeout, - client_info=self._client_info, - ) - - request = datacatalog_pb2.UpdateTagTemplateFieldRequest( - name=name, tag_template_field=tag_template_field, update_mask=update_mask - ) - if metadata is None: - metadata = [] - metadata = list(metadata) - try: - routing_header = [("name", name)] - except AttributeError: - pass - else: - routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata( - routing_header - ) - metadata.append(routing_metadata) - - return self._inner_api_calls["update_tag_template_field"]( - request, retry=retry, timeout=timeout, metadata=metadata - ) - - def rename_tag_template_field( - self, - name, - new_tag_template_field_id, - retry=google.api_core.gapic_v1.method.DEFAULT, - timeout=google.api_core.gapic_v1.method.DEFAULT, - metadata=None, - ): - """ - Renames a field in a tag template. The user should enable the Data - Catalog API in the project identified by the ``name`` parameter (see - `Data Catalog Resource - Project `__ - for more information). 
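Editor's note: the new field ID passed to ``rename_tag_template_field`` follows the same constraints documented for ``create_tag_template_field`` IDs (letters, digits, underscores, dashes; length 1 to 128). A hypothetical client-side check, not performed by the library itself:

```python
import re

# Documented rules for tag template field IDs: [A-Za-z0-9_-], 1..128
# characters. This regex is a local sketch; the server validates
# authoritatively and rejects invalid IDs.
FIELD_ID_RE = re.compile(r"^[A-Za-z0-9_-]{1,128}$")

def is_valid_field_id(field_id):
    return bool(FIELD_ID_RE.match(field_id))
```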
- - Example: - >>> from google.cloud import datacatalog_v1beta1 - >>> - >>> client = datacatalog_v1beta1.DataCatalogClient() - >>> - >>> name = client.tag_template_field_path('[PROJECT]', '[LOCATION]', '[TAG_TEMPLATE]', '[FIELD]') - >>> - >>> # TODO: Initialize `new_tag_template_field_id`: - >>> new_tag_template_field_id = '' - >>> - >>> response = client.rename_tag_template_field(name, new_tag_template_field_id) - - Args: - name (str): Required. The name of the tag template. Example: - - - projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id}/fields/{tag_template_field_id} - new_tag_template_field_id (str): Required. The new ID of this tag template field. For example, - ``my_new_field``. - retry (Optional[google.api_core.retry.Retry]): A retry object used - to retry requests. If ``None`` is specified, requests will - be retried using a default configuration. - timeout (Optional[float]): The amount of time, in seconds, to wait - for the request to complete. Note that if ``retry`` is - specified, the timeout applies to each individual attempt. - metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata - that is provided to the method. - - Returns: - A :class:`~google.cloud.datacatalog_v1beta1.types.TagTemplateField` instance. - - Raises: - google.api_core.exceptions.GoogleAPICallError: If the request - failed for any reason. - google.api_core.exceptions.RetryError: If the request failed due - to a retryable error and retry attempts failed. - ValueError: If the parameters are invalid. - """ - # Wrap the transport method to add retry and timeout logic. 
- if "rename_tag_template_field" not in self._inner_api_calls: - self._inner_api_calls[ - "rename_tag_template_field" - ] = google.api_core.gapic_v1.method.wrap_method( - self.transport.rename_tag_template_field, - default_retry=self._method_configs["RenameTagTemplateField"].retry, - default_timeout=self._method_configs["RenameTagTemplateField"].timeout, - client_info=self._client_info, - ) - - request = datacatalog_pb2.RenameTagTemplateFieldRequest( - name=name, new_tag_template_field_id=new_tag_template_field_id - ) - if metadata is None: - metadata = [] - metadata = list(metadata) - try: - routing_header = [("name", name)] - except AttributeError: - pass - else: - routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata( - routing_header - ) - metadata.append(routing_metadata) - - return self._inner_api_calls["rename_tag_template_field"]( - request, retry=retry, timeout=timeout, metadata=metadata - ) - - def delete_tag_template_field( - self, - name, - force, - retry=google.api_core.gapic_v1.method.DEFAULT, - timeout=google.api_core.gapic_v1.method.DEFAULT, - metadata=None, - ): - """ - Deletes a field in a tag template and all uses of that field. Users - should enable the Data Catalog API in the project identified by the - ``name`` parameter (see [Data Catalog Resource Project] - (https://cloud.google.com/data-catalog/docs/concepts/resource-project) - for more information). - - Example: - >>> from google.cloud import datacatalog_v1beta1 - >>> - >>> client = datacatalog_v1beta1.DataCatalogClient() - >>> - >>> name = client.tag_template_field_path('[PROJECT]', '[LOCATION]', '[TAG_TEMPLATE]', '[FIELD]') - >>> - >>> # TODO: Initialize `force`: - >>> force = False - >>> - >>> client.delete_tag_template_field(name, force) - - Args: - name (str): Required. The name of the tag template field to delete. Example: - - - projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id}/fields/{tag_template_field_id} - force (bool): Required. 
Currently, this field must always be set to ``true``. This - confirms the deletion of this field from any tags using this field. - ``force = false`` will be supported in the future. - retry (Optional[google.api_core.retry.Retry]): A retry object used - to retry requests. If ``None`` is specified, requests will - be retried using a default configuration. - timeout (Optional[float]): The amount of time, in seconds, to wait - for the request to complete. Note that if ``retry`` is - specified, the timeout applies to each individual attempt. - metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata - that is provided to the method. - - Raises: - google.api_core.exceptions.GoogleAPICallError: If the request - failed for any reason. - google.api_core.exceptions.RetryError: If the request failed due - to a retryable error and retry attempts failed. - ValueError: If the parameters are invalid. - """ - # Wrap the transport method to add retry and timeout logic. - if "delete_tag_template_field" not in self._inner_api_calls: - self._inner_api_calls[ - "delete_tag_template_field" - ] = google.api_core.gapic_v1.method.wrap_method( - self.transport.delete_tag_template_field, - default_retry=self._method_configs["DeleteTagTemplateField"].retry, - default_timeout=self._method_configs["DeleteTagTemplateField"].timeout, - client_info=self._client_info, - ) - - request = datacatalog_pb2.DeleteTagTemplateFieldRequest(name=name, force=force) - if metadata is None: - metadata = [] - metadata = list(metadata) - try: - routing_header = [("name", name)] - except AttributeError: - pass - else: - routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata( - routing_header - ) - metadata.append(routing_metadata) - - self._inner_api_calls["delete_tag_template_field"]( - request, retry=retry, timeout=timeout, metadata=metadata - ) - - def create_tag( - self, - parent, - tag, - retry=google.api_core.gapic_v1.method.DEFAULT, - 
timeout=google.api_core.gapic_v1.method.DEFAULT, - metadata=None, - ): - """ - Creates a tag on an ``Entry``. Note: The project identified by the - ``parent`` parameter for the - `tag `__ - and the `tag - template `__ - used to create the tag must be from the same organization. - - Example: - >>> from google.cloud import datacatalog_v1beta1 - >>> - >>> client = datacatalog_v1beta1.DataCatalogClient() - >>> - >>> parent = client.tag_path('[PROJECT]', '[LOCATION]', '[ENTRY_GROUP]', '[ENTRY]', '[TAG]') - >>> - >>> # TODO: Initialize `tag`: - >>> tag = {} - >>> - >>> response = client.create_tag(parent, tag) - - Args: - parent (str): Required. The name of the resource to attach this tag to. Tags can - be attached to Entries. Example: - - - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id} - - Note that this Tag and its child resources may not actually be stored in - the location in this name. - tag (Union[dict, ~google.cloud.datacatalog_v1beta1.types.Tag]): Required. The tag to create. - - If a dict is provided, it must be of the same form as the protobuf - message :class:`~google.cloud.datacatalog_v1beta1.types.Tag` - retry (Optional[google.api_core.retry.Retry]): A retry object used - to retry requests. If ``None`` is specified, requests will - be retried using a default configuration. - timeout (Optional[float]): The amount of time, in seconds, to wait - for the request to complete. Note that if ``retry`` is - specified, the timeout applies to each individual attempt. - metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata - that is provided to the method. - - Returns: - A :class:`~google.cloud.datacatalog_v1beta1.types.Tag` instance. - - Raises: - google.api_core.exceptions.GoogleAPICallError: If the request - failed for any reason. - google.api_core.exceptions.RetryError: If the request failed due - to a retryable error and retry attempts failed. - ValueError: If the parameters are invalid. 
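Editor's note: every method body in this file attaches routing metadata derived from the request's resource name via ``google.api_core.gapic_v1.routing_header.to_grpc_metadata``. A rough pure-Python approximation of the header it produces (an assumption about the encoding, not the library's exact code):

```python
from urllib.parse import urlencode

# to_grpc_metadata builds a single ("x-goog-request-params", <encoded
# pairs>) tuple that gRPC sends so the backend can route the request by
# resource name. Approximation only; see google-api-core for the real thing.
def to_routing_metadata(params):
    return ("x-goog-request-params", urlencode(params))

md = to_routing_metadata([("parent", "projects/p/locations/us-central1")])
```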
- """ - # Wrap the transport method to add retry and timeout logic. - if "create_tag" not in self._inner_api_calls: - self._inner_api_calls[ - "create_tag" - ] = google.api_core.gapic_v1.method.wrap_method( - self.transport.create_tag, - default_retry=self._method_configs["CreateTag"].retry, - default_timeout=self._method_configs["CreateTag"].timeout, - client_info=self._client_info, - ) - - request = datacatalog_pb2.CreateTagRequest(parent=parent, tag=tag) - if metadata is None: - metadata = [] - metadata = list(metadata) - try: - routing_header = [("parent", parent)] - except AttributeError: - pass - else: - routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata( - routing_header - ) - metadata.append(routing_metadata) - - return self._inner_api_calls["create_tag"]( - request, retry=retry, timeout=timeout, metadata=metadata - ) - - def update_tag( - self, - tag, - update_mask=None, - retry=google.api_core.gapic_v1.method.DEFAULT, - timeout=google.api_core.gapic_v1.method.DEFAULT, - metadata=None, - ): - """ - Updates an existing tag. - - Example: - >>> from google.cloud import datacatalog_v1beta1 - >>> - >>> client = datacatalog_v1beta1.DataCatalogClient() - >>> - >>> # TODO: Initialize `tag`: - >>> tag = {} - >>> - >>> response = client.update_tag(tag) - - Args: - tag (Union[dict, ~google.cloud.datacatalog_v1beta1.types.Tag]): Required. The updated tag. The "name" field must be set. - - If a dict is provided, it must be of the same form as the protobuf - message :class:`~google.cloud.datacatalog_v1beta1.types.Tag` - update_mask (Union[dict, ~google.cloud.datacatalog_v1beta1.types.FieldMask]): The fields to update on the Tag. If absent or empty, all modifiable - fields are updated. Currently the only modifiable field is the field - ``fields``. 
- - If a dict is provided, it must be of the same form as the protobuf - message :class:`~google.cloud.datacatalog_v1beta1.types.FieldMask` - retry (Optional[google.api_core.retry.Retry]): A retry object used - to retry requests. If ``None`` is specified, requests will - be retried using a default configuration. - timeout (Optional[float]): The amount of time, in seconds, to wait - for the request to complete. Note that if ``retry`` is - specified, the timeout applies to each individual attempt. - metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata - that is provided to the method. - - Returns: - A :class:`~google.cloud.datacatalog_v1beta1.types.Tag` instance. - - Raises: - google.api_core.exceptions.GoogleAPICallError: If the request - failed for any reason. - google.api_core.exceptions.RetryError: If the request failed due - to a retryable error and retry attempts failed. - ValueError: If the parameters are invalid. - """ - # Wrap the transport method to add retry and timeout logic. 
- if "update_tag" not in self._inner_api_calls: - self._inner_api_calls[ - "update_tag" - ] = google.api_core.gapic_v1.method.wrap_method( - self.transport.update_tag, - default_retry=self._method_configs["UpdateTag"].retry, - default_timeout=self._method_configs["UpdateTag"].timeout, - client_info=self._client_info, - ) - - request = datacatalog_pb2.UpdateTagRequest(tag=tag, update_mask=update_mask) - if metadata is None: - metadata = [] - metadata = list(metadata) - try: - routing_header = [("tag.name", tag.name)] - except AttributeError: - pass - else: - routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata( - routing_header - ) - metadata.append(routing_metadata) - - return self._inner_api_calls["update_tag"]( - request, retry=retry, timeout=timeout, metadata=metadata - ) - - def delete_tag( - self, - name, - retry=google.api_core.gapic_v1.method.DEFAULT, - timeout=google.api_core.gapic_v1.method.DEFAULT, - metadata=None, - ): - """ - Deletes a tag. - - Example: - >>> from google.cloud import datacatalog_v1beta1 - >>> - >>> client = datacatalog_v1beta1.DataCatalogClient() - >>> - >>> name = client.entry_path('[PROJECT]', '[LOCATION]', '[ENTRY_GROUP]', '[ENTRY]') - >>> - >>> client.delete_tag(name) - - Args: - name (str): Required. The name of the tag to delete. Example: - - - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id}/tags/{tag_id} - retry (Optional[google.api_core.retry.Retry]): A retry object used - to retry requests. If ``None`` is specified, requests will - be retried using a default configuration. - timeout (Optional[float]): The amount of time, in seconds, to wait - for the request to complete. Note that if ``retry`` is - specified, the timeout applies to each individual attempt. - metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata - that is provided to the method. - - Raises: - google.api_core.exceptions.GoogleAPICallError: If the request - failed for any reason. 
- google.api_core.exceptions.RetryError: If the request failed due - to a retryable error and retry attempts failed. - ValueError: If the parameters are invalid. - """ - # Wrap the transport method to add retry and timeout logic. - if "delete_tag" not in self._inner_api_calls: - self._inner_api_calls[ - "delete_tag" - ] = google.api_core.gapic_v1.method.wrap_method( - self.transport.delete_tag, - default_retry=self._method_configs["DeleteTag"].retry, - default_timeout=self._method_configs["DeleteTag"].timeout, - client_info=self._client_info, - ) - - request = datacatalog_pb2.DeleteTagRequest(name=name) - if metadata is None: - metadata = [] - metadata = list(metadata) - try: - routing_header = [("name", name)] - except AttributeError: - pass - else: - routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata( - routing_header - ) - metadata.append(routing_metadata) - - self._inner_api_calls["delete_tag"]( - request, retry=retry, timeout=timeout, metadata=metadata - ) - - def list_tags( - self, - parent, - page_size=None, - retry=google.api_core.gapic_v1.method.DEFAULT, - timeout=google.api_core.gapic_v1.method.DEFAULT, - metadata=None, - ): - """ - Lists the tags on an ``Entry``. - - Example: - >>> from google.cloud import datacatalog_v1beta1 - >>> - >>> client = datacatalog_v1beta1.DataCatalogClient() - >>> - >>> parent = client.entry_path('[PROJECT]', '[LOCATION]', '[ENTRY_GROUP]', '[ENTRY]') - >>> - >>> # Iterate over all results - >>> for element in client.list_tags(parent): - ... # process element - ... pass - >>> - >>> - >>> # Alternatively: - >>> - >>> # Iterate over results one page at a time - >>> for page in client.list_tags(parent).pages: - ... for element in page: - ... # process element - ... pass - - Args: - parent (str): Required. The name of the Data Catalog resource to list the tags of. - The resource could be an ``Entry`` or an ``EntryGroup``. 
- - Examples: - - - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id} - - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id} - page_size (int): The maximum number of resources contained in the - underlying API response. If page streaming is performed per- - resource, this parameter does not affect the return value. If page - streaming is performed per-page, this determines the maximum number - of resources in a page. - retry (Optional[google.api_core.retry.Retry]): A retry object used - to retry requests. If ``None`` is specified, requests will - be retried using a default configuration. - timeout (Optional[float]): The amount of time, in seconds, to wait - for the request to complete. Note that if ``retry`` is - specified, the timeout applies to each individual attempt. - metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata - that is provided to the method. - - Returns: - A :class:`~google.api_core.page_iterator.PageIterator` instance. - An iterable of :class:`~google.cloud.datacatalog_v1beta1.types.Tag` instances. - You can also iterate over the pages of the response - using its `pages` property. - - Raises: - google.api_core.exceptions.GoogleAPICallError: If the request - failed for any reason. - google.api_core.exceptions.RetryError: If the request failed due - to a retryable error and retry attempts failed. - ValueError: If the parameters are invalid. - """ - # Wrap the transport method to add retry and timeout logic. 
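The `retry`/`timeout` defaults referenced in these docstrings come from the client config deleted later in this patch (see `retry_policy_1_params`: 100 ms initial delay, 1.3x multiplier, 60 s cap). A minimal sketch of how such parameters expand into a delay schedule — deterministic for illustration only, since the real `google.api_core` retry machinery also applies randomized jitter to each delay:

```python
def backoff_schedule(initial_ms, multiplier, max_ms, attempts):
    """Yield deterministic retry delays in milliseconds.

    Illustration only: the real google.api_core retry helpers also
    randomize (jitter) each delay before sleeping.
    """
    delay = initial_ms
    for _ in range(attempts):
        yield delay
        delay = min(delay * multiplier, max_ms)


# Values matching "retry_policy_1_params" from the deleted client config:
# 100 ms initial delay, 1.3x growth, capped at 60,000 ms.
delays = list(backoff_schedule(100, 1.3, 60000, 5))
```

With these parameters the schedule grows geometrically (100, 130, 169, ... ms) until it hits the 60-second ceiling, which bounds how long any single retry waits.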
- if "list_tags" not in self._inner_api_calls: - self._inner_api_calls[ - "list_tags" - ] = google.api_core.gapic_v1.method.wrap_method( - self.transport.list_tags, - default_retry=self._method_configs["ListTags"].retry, - default_timeout=self._method_configs["ListTags"].timeout, - client_info=self._client_info, - ) - - request = datacatalog_pb2.ListTagsRequest(parent=parent, page_size=page_size) - if metadata is None: - metadata = [] - metadata = list(metadata) - try: - routing_header = [("parent", parent)] - except AttributeError: - pass - else: - routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata( - routing_header - ) - metadata.append(routing_metadata) - - iterator = google.api_core.page_iterator.GRPCIterator( - client=None, - method=functools.partial( - self._inner_api_calls["list_tags"], - retry=retry, - timeout=timeout, - metadata=metadata, - ), - request=request, - items_field="tags", - request_token_field="page_token", - response_token_field="next_page_token", - ) - return iterator - - def set_iam_policy( - self, - resource, - policy, - retry=google.api_core.gapic_v1.method.DEFAULT, - timeout=google.api_core.gapic_v1.method.DEFAULT, - metadata=None, - ): - """ - Sets the access control policy for a resource. Replaces any existing - policy. Supported resources are: - - - Tag templates. - - Entries. - - Entry groups. Note, this method cannot be used to manage policies for - BigQuery, Pub/Sub and any external Google Cloud Platform resources - synced to Data Catalog. - - Callers must have following Google IAM permission - - - ``datacatalog.tagTemplates.setIamPolicy`` to set policies on tag - templates. - - ``datacatalog.entries.setIamPolicy`` to set policies on entries. - - ``datacatalog.entryGroups.setIamPolicy`` to set policies on entry - groups. 
- - Example: - >>> from google.cloud import datacatalog_v1beta1 - >>> - >>> client = datacatalog_v1beta1.DataCatalogClient() - >>> - >>> # TODO: Initialize `resource`: - >>> resource = '' - >>> - >>> # TODO: Initialize `policy`: - >>> policy = {} - >>> - >>> response = client.set_iam_policy(resource, policy) - - Args: - resource (str): REQUIRED: The resource for which the policy is being specified. - See the operation documentation for the appropriate value for this field. - policy (Union[dict, ~google.cloud.datacatalog_v1beta1.types.Policy]): REQUIRED: The complete policy to be applied to the ``resource``. The - size of the policy is limited to a few 10s of KB. An empty policy is a - valid policy but certain Cloud Platform services (such as Projects) - might reject them. - - If a dict is provided, it must be of the same form as the protobuf - message :class:`~google.cloud.datacatalog_v1beta1.types.Policy` - retry (Optional[google.api_core.retry.Retry]): A retry object used - to retry requests. If ``None`` is specified, requests will - be retried using a default configuration. - timeout (Optional[float]): The amount of time, in seconds, to wait - for the request to complete. Note that if ``retry`` is - specified, the timeout applies to each individual attempt. - metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata - that is provided to the method. - - Returns: - A :class:`~google.cloud.datacatalog_v1beta1.types.Policy` instance. - - Raises: - google.api_core.exceptions.GoogleAPICallError: If the request - failed for any reason. - google.api_core.exceptions.RetryError: If the request failed due - to a retryable error and retry attempts failed. - ValueError: If the parameters are invalid. - """ - # Wrap the transport method to add retry and timeout logic. 
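Each of these helpers also attaches a routing header derived from a request field (here `("resource", resource)`), which `gapic_v1.routing_header.to_grpc_metadata` folds into a single `x-goog-request-params` metadata entry. A simplified re-implementation of that convention, assuming standard URL-encoding of values:

```python
from urllib.parse import quote


def to_grpc_metadata_sketch(params):
    """Simplified sketch of gapic_v1.routing_header.to_grpc_metadata:
    encode (key, value) pairs into one x-goog-request-params entry."""
    encoded = "&".join(f"{key}={quote(str(value))}" for key, value in params)
    return ("x-goog-request-params", encoded)


md = to_grpc_metadata_sketch([("resource", "projects/p/locations/l/tagTemplates/t")])
```

The server uses this header to route the request to the right regional backend, which is why the client code tolerates an `AttributeError` and simply skips the header when the field is absent.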
- if "set_iam_policy" not in self._inner_api_calls: - self._inner_api_calls[ - "set_iam_policy" - ] = google.api_core.gapic_v1.method.wrap_method( - self.transport.set_iam_policy, - default_retry=self._method_configs["SetIamPolicy"].retry, - default_timeout=self._method_configs["SetIamPolicy"].timeout, - client_info=self._client_info, - ) - - request = iam_policy_pb2.SetIamPolicyRequest(resource=resource, policy=policy) - if metadata is None: - metadata = [] - metadata = list(metadata) - try: - routing_header = [("resource", resource)] - except AttributeError: - pass - else: - routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata( - routing_header - ) - metadata.append(routing_metadata) - - return self._inner_api_calls["set_iam_policy"]( - request, retry=retry, timeout=timeout, metadata=metadata - ) - - def get_iam_policy( - self, - resource, - options_=None, - retry=google.api_core.gapic_v1.method.DEFAULT, - timeout=google.api_core.gapic_v1.method.DEFAULT, - metadata=None, - ): - """ - Gets the access control policy for a resource. A ``NOT_FOUND`` error - is returned if the resource does not exist. An empty policy is returned - if the resource exists but does not have a policy set on it. - - Supported resources are: - - - Tag templates. - - Entries. - - Entry groups. Note, this method cannot be used to manage policies for - BigQuery, Pub/Sub and any external Google Cloud Platform resources - synced to Data Catalog. - - Callers must have following Google IAM permission - - - ``datacatalog.tagTemplates.getIamPolicy`` to get policies on tag - templates. - - ``datacatalog.entries.getIamPolicy`` to get policies on entries. - - ``datacatalog.entryGroups.getIamPolicy`` to get policies on entry - groups. 
- - Example: - >>> from google.cloud import datacatalog_v1beta1 - >>> - >>> client = datacatalog_v1beta1.DataCatalogClient() - >>> - >>> # TODO: Initialize `resource`: - >>> resource = '' - >>> - >>> response = client.get_iam_policy(resource) - - Args: - resource (str): REQUIRED: The resource for which the policy is being requested. - See the operation documentation for the appropriate value for this field. - options_ (Union[dict, ~google.cloud.datacatalog_v1beta1.types.GetPolicyOptions]): OPTIONAL: A ``GetPolicyOptions`` object for specifying options to - ``GetIamPolicy``. This field is only used by Cloud IAM. - - If a dict is provided, it must be of the same form as the protobuf - message :class:`~google.cloud.datacatalog_v1beta1.types.GetPolicyOptions` - retry (Optional[google.api_core.retry.Retry]): A retry object used - to retry requests. If ``None`` is specified, requests will - be retried using a default configuration. - timeout (Optional[float]): The amount of time, in seconds, to wait - for the request to complete. Note that if ``retry`` is - specified, the timeout applies to each individual attempt. - metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata - that is provided to the method. - - Returns: - A :class:`~google.cloud.datacatalog_v1beta1.types.Policy` instance. - - Raises: - google.api_core.exceptions.GoogleAPICallError: If the request - failed for any reason. - google.api_core.exceptions.RetryError: If the request failed due - to a retryable error and retry attempts failed. - ValueError: If the parameters are invalid. - """ - # Wrap the transport method to add retry and timeout logic. 
- if "get_iam_policy" not in self._inner_api_calls: - self._inner_api_calls[ - "get_iam_policy" - ] = google.api_core.gapic_v1.method.wrap_method( - self.transport.get_iam_policy, - default_retry=self._method_configs["GetIamPolicy"].retry, - default_timeout=self._method_configs["GetIamPolicy"].timeout, - client_info=self._client_info, - ) - - request = iam_policy_pb2.GetIamPolicyRequest( - resource=resource, options=options_ - ) - if metadata is None: - metadata = [] - metadata = list(metadata) - try: - routing_header = [("resource", resource)] - except AttributeError: - pass - else: - routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata( - routing_header - ) - metadata.append(routing_metadata) - - return self._inner_api_calls["get_iam_policy"]( - request, retry=retry, timeout=timeout, metadata=metadata - ) - - def test_iam_permissions( - self, - resource, - permissions, - retry=google.api_core.gapic_v1.method.DEFAULT, - timeout=google.api_core.gapic_v1.method.DEFAULT, - metadata=None, - ): - """ - Returns the caller's permissions on a resource. If the resource does - not exist, an empty set of permissions is returned (We don't return a - ``NOT_FOUND`` error). - - Supported resources are: - - - Tag templates. - - Entries. - - Entry groups. Note, this method cannot be used to manage policies for - BigQuery, Pub/Sub and any external Google Cloud Platform resources - synced to Data Catalog. - - A caller is not required to have Google IAM permission to make this - request. - - Example: - >>> from google.cloud import datacatalog_v1beta1 - >>> - >>> client = datacatalog_v1beta1.DataCatalogClient() - >>> - >>> # TODO: Initialize `resource`: - >>> resource = '' - >>> - >>> # TODO: Initialize `permissions`: - >>> permissions = [] - >>> - >>> response = client.test_iam_permissions(resource, permissions) - - Args: - resource (str): REQUIRED: The resource for which the policy detail is being requested. 
- See the operation documentation for the appropriate value for this field. - permissions (list[str]): The set of permissions to check for the ``resource``. Permissions - with wildcards (such as '*' or 'storage.*') are not allowed. For more - information see `IAM - Overview <https://cloud.google.com/iam/docs/overview#permissions>`__. - retry (Optional[google.api_core.retry.Retry]): A retry object used - to retry requests. If ``None`` is specified, requests will - be retried using a default configuration. - timeout (Optional[float]): The amount of time, in seconds, to wait - for the request to complete. Note that if ``retry`` is - specified, the timeout applies to each individual attempt. - metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata - that is provided to the method. - - Returns: - A :class:`~google.cloud.datacatalog_v1beta1.types.TestIamPermissionsResponse` instance. - - Raises: - google.api_core.exceptions.GoogleAPICallError: If the request - failed for any reason. - google.api_core.exceptions.RetryError: If the request failed due - to a retryable error and retry attempts failed. - ValueError: If the parameters are invalid. - """ - # Wrap the transport method to add retry and timeout logic.
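The `_inner_api_calls` dictionary checked at the top of every helper memoizes the wrapped transport methods, so the retry/timeout wrapping is paid once per RPC name rather than per call. The caching pattern in isolation, with a stand-in for `gapic_v1.method.wrap_method`:

```python
class CallCache:
    """Memoize wrapped callables by RPC name, as _inner_api_calls does."""

    def __init__(self):
        self._inner_api_calls = {}
        self.wrap_count = 0  # how many times wrapping actually ran

    def _wrap(self, func):
        # Stand-in for gapic_v1.method.wrap_method, which layers
        # retry/timeout defaults onto the raw transport method.
        self.wrap_count += 1

        def wrapped(*args, **kwargs):
            return func(*args, **kwargs)

        return wrapped

    def get(self, name, func):
        if name not in self._inner_api_calls:
            self._inner_api_calls[name] = self._wrap(func)
        return self._inner_api_calls[name]


cache = CallCache()
first = cache.get("delete_tag", lambda name: f"deleted {name}")
second = cache.get("delete_tag", lambda name: "never used")
```

Note the second lookup returns the cached wrapper unchanged; in the generated client this is why passing per-call `retry`/`timeout` arguments, rather than mutating defaults, is the supported override path.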
- if "test_iam_permissions" not in self._inner_api_calls: - self._inner_api_calls[ - "test_iam_permissions" - ] = google.api_core.gapic_v1.method.wrap_method( - self.transport.test_iam_permissions, - default_retry=self._method_configs["TestIamPermissions"].retry, - default_timeout=self._method_configs["TestIamPermissions"].timeout, - client_info=self._client_info, - ) - - request = iam_policy_pb2.TestIamPermissionsRequest( - resource=resource, permissions=permissions - ) - if metadata is None: - metadata = [] - metadata = list(metadata) - try: - routing_header = [("resource", resource)] - except AttributeError: - pass - else: - routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata( - routing_header - ) - metadata.append(routing_metadata) - - return self._inner_api_calls["test_iam_permissions"]( - request, retry=retry, timeout=timeout, metadata=metadata - ) diff --git a/google/cloud/datacatalog_v1beta1/gapic/data_catalog_client_config.py b/google/cloud/datacatalog_v1beta1/gapic/data_catalog_client_config.py deleted file mode 100644 index e4c261e3..00000000 --- a/google/cloud/datacatalog_v1beta1/gapic/data_catalog_client_config.py +++ /dev/null @@ -1,177 +0,0 @@ -config = { - "interfaces": { - "google.cloud.datacatalog.v1beta1.DataCatalog": { - "retry_codes": { - "retry_policy_1_codes": ["DEADLINE_EXCEEDED", "UNAVAILABLE"], - "no_retry_codes": [], - "no_retry_1_codes": [], - }, - "retry_params": { - "retry_policy_1_params": { - "initial_retry_delay_millis": 100, - "retry_delay_multiplier": 1.3, - "max_retry_delay_millis": 60000, - "initial_rpc_timeout_millis": 60000, - "rpc_timeout_multiplier": 1.0, - "max_rpc_timeout_millis": 60000, - "total_timeout_millis": 60000, - }, - "no_retry_params": { - "initial_retry_delay_millis": 0, - "retry_delay_multiplier": 0.0, - "max_retry_delay_millis": 0, - "initial_rpc_timeout_millis": 0, - "rpc_timeout_multiplier": 1.0, - "max_rpc_timeout_millis": 0, - "total_timeout_millis": 0, - }, - "no_retry_1_params": { 
- "initial_retry_delay_millis": 0, - "retry_delay_multiplier": 0.0, - "max_retry_delay_millis": 0, - "initial_rpc_timeout_millis": 60000, - "rpc_timeout_multiplier": 1.0, - "max_rpc_timeout_millis": 60000, - "total_timeout_millis": 60000, - }, - }, - "methods": { - "SearchCatalog": { - "timeout_millis": 60000, - "retry_codes_name": "no_retry_1_codes", - "retry_params_name": "no_retry_1_params", - }, - "GetEntry": { - "timeout_millis": 60000, - "retry_codes_name": "retry_policy_1_codes", - "retry_params_name": "retry_policy_1_params", - }, - "LookupEntry": { - "timeout_millis": 60000, - "retry_codes_name": "retry_policy_1_codes", - "retry_params_name": "retry_policy_1_params", - }, - "CreateEntryGroup": { - "timeout_millis": 60000, - "retry_codes_name": "no_retry_1_codes", - "retry_params_name": "no_retry_1_params", - }, - "UpdateEntryGroup": { - "timeout_millis": 60000, - "retry_codes_name": "no_retry_1_codes", - "retry_params_name": "no_retry_1_params", - }, - "GetEntryGroup": { - "timeout_millis": 60000, - "retry_codes_name": "retry_policy_1_codes", - "retry_params_name": "retry_policy_1_params", - }, - "DeleteEntryGroup": { - "timeout_millis": 60000, - "retry_codes_name": "retry_policy_1_codes", - "retry_params_name": "retry_policy_1_params", - }, - "ListEntryGroups": { - "timeout_millis": 60000, - "retry_codes_name": "no_retry_1_codes", - "retry_params_name": "no_retry_1_params", - }, - "CreateEntry": { - "timeout_millis": 60000, - "retry_codes_name": "no_retry_1_codes", - "retry_params_name": "no_retry_1_params", - }, - "UpdateEntry": { - "timeout_millis": 60000, - "retry_codes_name": "no_retry_1_codes", - "retry_params_name": "no_retry_1_params", - }, - "DeleteEntry": { - "timeout_millis": 60000, - "retry_codes_name": "retry_policy_1_codes", - "retry_params_name": "retry_policy_1_params", - }, - "ListEntries": { - "timeout_millis": 60000, - "retry_codes_name": "no_retry_1_codes", - "retry_params_name": "no_retry_1_params", - }, - "CreateTagTemplate": { - 
"timeout_millis": 60000, - "retry_codes_name": "no_retry_1_codes", - "retry_params_name": "no_retry_1_params", - }, - "GetTagTemplate": { - "timeout_millis": 60000, - "retry_codes_name": "retry_policy_1_codes", - "retry_params_name": "retry_policy_1_params", - }, - "UpdateTagTemplate": { - "timeout_millis": 60000, - "retry_codes_name": "no_retry_1_codes", - "retry_params_name": "no_retry_1_params", - }, - "DeleteTagTemplate": { - "timeout_millis": 60000, - "retry_codes_name": "retry_policy_1_codes", - "retry_params_name": "retry_policy_1_params", - }, - "CreateTagTemplateField": { - "timeout_millis": 60000, - "retry_codes_name": "no_retry_1_codes", - "retry_params_name": "no_retry_1_params", - }, - "UpdateTagTemplateField": { - "timeout_millis": 60000, - "retry_codes_name": "no_retry_1_codes", - "retry_params_name": "no_retry_1_params", - }, - "RenameTagTemplateField": { - "timeout_millis": 60000, - "retry_codes_name": "no_retry_1_codes", - "retry_params_name": "no_retry_1_params", - }, - "DeleteTagTemplateField": { - "timeout_millis": 60000, - "retry_codes_name": "retry_policy_1_codes", - "retry_params_name": "retry_policy_1_params", - }, - "CreateTag": { - "timeout_millis": 60000, - "retry_codes_name": "no_retry_1_codes", - "retry_params_name": "no_retry_1_params", - }, - "UpdateTag": { - "timeout_millis": 60000, - "retry_codes_name": "no_retry_1_codes", - "retry_params_name": "no_retry_1_params", - }, - "DeleteTag": { - "timeout_millis": 60000, - "retry_codes_name": "retry_policy_1_codes", - "retry_params_name": "retry_policy_1_params", - }, - "ListTags": { - "timeout_millis": 60000, - "retry_codes_name": "retry_policy_1_codes", - "retry_params_name": "retry_policy_1_params", - }, - "SetIamPolicy": { - "timeout_millis": 60000, - "retry_codes_name": "no_retry_1_codes", - "retry_params_name": "no_retry_1_params", - }, - "GetIamPolicy": { - "timeout_millis": 60000, - "retry_codes_name": "no_retry_1_codes", - "retry_params_name": "no_retry_1_params", - }, - 
"TestIamPermissions": { - "timeout_millis": 60000, - "retry_codes_name": "no_retry_1_codes", - "retry_params_name": "no_retry_1_params", - }, - }, - } - } -} diff --git a/google/cloud/datacatalog_v1beta1/gapic/enums.py b/google/cloud/datacatalog_v1beta1/gapic/enums.py deleted file mode 100644 index ebec5ac2..00000000 --- a/google/cloud/datacatalog_v1beta1/gapic/enums.py +++ /dev/null @@ -1,125 +0,0 @@ -# -*- coding: utf-8 -*- -# -# Copyright 2020 Google LLC -# -# Licensed under the Apache License, Version 2.0 (the "License"); -# you may not use this file except in compliance with the License. -# You may obtain a copy of the License at -# -# https://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, software -# distributed under the License is distributed on an "AS IS" BASIS, -# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -# See the License for the specific language governing permissions and -# limitations under the License. - -"""Wrappers for protocol buffer enum types.""" - -import enum - - -class EntryType(enum.IntEnum): - """ - Entry resources in Data Catalog can be of different types e.g. a - BigQuery Table entry is of type ``TABLE``. This enum describes all the - possible types Data Catalog contains. - - Attributes: - ENTRY_TYPE_UNSPECIFIED (int): Default unknown type. - TABLE (int): Output only. The type of entry that has a GoogleSQL schema, including - logical views. - MODEL (int): Output only. The type of models. - https://cloud.google.com/bigquery-ml/docs/bigqueryml-intro - DATA_STREAM (int): Output only. An entry type which is used for streaming entries. Example: - Pub/Sub topic. - FILESET (int): An entry type which is a set of files or objects. Example: - Cloud Storage fileset. 
- """ - - ENTRY_TYPE_UNSPECIFIED = 0 - TABLE = 2 - MODEL = 5 - DATA_STREAM = 3 - FILESET = 4 - - -class IntegratedSystem(enum.IntEnum): - """ - This enum describes all the possible systems that Data Catalog integrates - with. - - Attributes: - INTEGRATED_SYSTEM_UNSPECIFIED (int): Default unknown system. - BIGQUERY (int): BigQuery. - CLOUD_PUBSUB (int): Cloud Pub/Sub. - """ - - INTEGRATED_SYSTEM_UNSPECIFIED = 0 - BIGQUERY = 1 - CLOUD_PUBSUB = 2 - - -class SearchResultType(enum.IntEnum): - """ - The different types of resources that can be returned in search. - - Attributes: - SEARCH_RESULT_TYPE_UNSPECIFIED (int): Default unknown type. - ENTRY (int): An ``Entry``. - TAG_TEMPLATE (int): A ``TagTemplate``. - ENTRY_GROUP (int): An ``EntryGroup``. - """ - - SEARCH_RESULT_TYPE_UNSPECIFIED = 0 - ENTRY = 1 - TAG_TEMPLATE = 2 - ENTRY_GROUP = 3 - - -class TableSourceType(enum.IntEnum): - """ - Table source type. - - Attributes: - TABLE_SOURCE_TYPE_UNSPECIFIED (int): Default unknown type. - BIGQUERY_VIEW (int): Table view. - BIGQUERY_TABLE (int): BigQuery native table. - """ - - TABLE_SOURCE_TYPE_UNSPECIFIED = 0 - BIGQUERY_VIEW = 2 - BIGQUERY_TABLE = 5 - - -class FieldType(object): - class PrimitiveType(enum.IntEnum): - """ - Attributes: - PRIMITIVE_TYPE_UNSPECIFIED (int): This is the default invalid value for a type. - DOUBLE (int): A double precision number. - STRING (int): An UTF-8 string. - BOOL (int): A boolean value. - TIMESTAMP (int): A timestamp. - """ - - PRIMITIVE_TYPE_UNSPECIFIED = 0 - DOUBLE = 1 - STRING = 2 - BOOL = 3 - TIMESTAMP = 4 - - -class Taxonomy(object): - class PolicyType(enum.IntEnum): - """ - Defines policy types where policy tag can be used for. - - Attributes: - POLICY_TYPE_UNSPECIFIED (int): Unspecified policy type. - FINE_GRAINED_ACCESS_CONTROL (int): Fine grained access control policy, which enables access control on - tagged resources. 
- """ - - POLICY_TYPE_UNSPECIFIED = 0 - FINE_GRAINED_ACCESS_CONTROL = 1 diff --git a/google/cloud/datacatalog_v1beta1/gapic/policy_tag_manager_client.py b/google/cloud/datacatalog_v1beta1/gapic/policy_tag_manager_client.py deleted file mode 100644 index 4d49ce44..00000000 --- a/google/cloud/datacatalog_v1beta1/gapic/policy_tag_manager_client.py +++ /dev/null @@ -1,1270 +0,0 @@ -# -*- coding: utf-8 -*- -# -# Copyright 2020 Google LLC -# -# Licensed under the Apache License, Version 2.0 (the "License"); -# you may not use this file except in compliance with the License. -# You may obtain a copy of the License at -# -# https://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, software -# distributed under the License is distributed on an "AS IS" BASIS, -# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -# See the License for the specific language governing permissions and -# limitations under the License. - -"""Accesses the google.cloud.datacatalog.v1beta1 PolicyTagManager API.""" - -import functools -import pkg_resources -import warnings - -from google.oauth2 import service_account -import google.api_core.client_options -import google.api_core.gapic_v1.client_info -import google.api_core.gapic_v1.config -import google.api_core.gapic_v1.method -import google.api_core.gapic_v1.routing_header -import google.api_core.grpc_helpers -import google.api_core.page_iterator -import google.api_core.path_template -import grpc - -from google.cloud.datacatalog_v1beta1.gapic import enums -from google.cloud.datacatalog_v1beta1.gapic import policy_tag_manager_client_config -from google.cloud.datacatalog_v1beta1.gapic.transports import ( - policy_tag_manager_grpc_transport, -) -from google.cloud.datacatalog_v1beta1.proto import datacatalog_pb2 -from google.cloud.datacatalog_v1beta1.proto import datacatalog_pb2_grpc -from google.cloud.datacatalog_v1beta1.proto import policytagmanager_pb2 -from 
google.cloud.datacatalog_v1beta1.proto import policytagmanager_pb2_grpc -from google.cloud.datacatalog_v1beta1.proto import tags_pb2 -from google.iam.v1 import iam_policy_pb2 -from google.iam.v1 import options_pb2 -from google.iam.v1 import policy_pb2 -from google.protobuf import empty_pb2 -from google.protobuf import field_mask_pb2 - - -_GAPIC_LIBRARY_VERSION = pkg_resources.get_distribution( - "google-cloud-datacatalog" -).version - - -class PolicyTagManagerClient(object): - """ - The policy tag manager API service allows clients to manage their taxonomies - and policy tags. - """ - - SERVICE_ADDRESS = "datacatalog.googleapis.com:443" - """The default address of the service.""" - - # The name of the interface for this client. This is the key used to - # find the method configuration in the client_config dictionary. - _INTERFACE_NAME = "google.cloud.datacatalog.v1beta1.PolicyTagManager" - - @classmethod - def from_service_account_file(cls, filename, *args, **kwargs): - """Creates an instance of this client using the provided credentials - file. - - Args: - filename (str): The path to the service account private key json - file. - args: Additional arguments to pass to the constructor. - kwargs: Additional arguments to pass to the constructor. - - Returns: - PolicyTagManagerClient: The constructed client. 
- """ - credentials = service_account.Credentials.from_service_account_file(filename) - kwargs["credentials"] = credentials - return cls(*args, **kwargs) - - from_service_account_json = from_service_account_file - - @classmethod - def location_path(cls, project, location): - """Return a fully-qualified location string.""" - return google.api_core.path_template.expand( - "projects/{project}/locations/{location}", - project=project, - location=location, - ) - - @classmethod - def policy_tag_path(cls, project, location, taxonomy, policy_tag): - """Return a fully-qualified policy_tag string.""" - return google.api_core.path_template.expand( - "projects/{project}/locations/{location}/taxonomies/{taxonomy}/policyTags/{policy_tag}", - project=project, - location=location, - taxonomy=taxonomy, - policy_tag=policy_tag, - ) - - @classmethod - def taxonomy_path(cls, project, location, taxonomy): - """Return a fully-qualified taxonomy string.""" - return google.api_core.path_template.expand( - "projects/{project}/locations/{location}/taxonomies/{taxonomy}", - project=project, - location=location, - taxonomy=taxonomy, - ) - - def __init__( - self, - transport=None, - channel=None, - credentials=None, - client_config=None, - client_info=None, - client_options=None, - ): - """Constructor. - - Args: - transport (Union[~.PolicyTagManagerGrpcTransport, - Callable[[~.Credentials, type], ~.PolicyTagManagerGrpcTransport]): A transport - instance, responsible for actually making the API calls. - The default transport uses the gRPC protocol. - This argument may also be a callable which returns a - transport instance. Callables will be sent the credentials - as the first argument and the default transport class as - the second argument. - channel (grpc.Channel): DEPRECATED. A ``Channel`` instance - through which to make calls. This argument is mutually exclusive - with ``credentials``; providing both will raise an exception. 
- credentials (google.auth.credentials.Credentials): The - authorization credentials to attach to requests. These - credentials identify this application to the service. If none - are specified, the client will attempt to ascertain the - credentials from the environment. - This argument is mutually exclusive with providing a - transport instance to ``transport``; doing so will raise - an exception. - client_config (dict): DEPRECATED. A dictionary of call options for - each method. If not specified, the default configuration is used. - client_info (google.api_core.gapic_v1.client_info.ClientInfo): - The client info used to send a user-agent string along with - API requests. If ``None``, then default info will be used. - Generally, you only need to set this if you're developing - your own client library. - client_options (Union[dict, google.api_core.client_options.ClientOptions]): - Client options used to set user options on the client. API Endpoint - should be set through client_options. - """ - # Raise deprecation warnings for things we want to go away. - if client_config is not None: - warnings.warn( - "The `client_config` argument is deprecated.", - PendingDeprecationWarning, - stacklevel=2, - ) - else: - client_config = policy_tag_manager_client_config.config - - if channel: - warnings.warn( - "The `channel` argument is deprecated; use " "`transport` instead.", - PendingDeprecationWarning, - stacklevel=2, - ) - - api_endpoint = self.SERVICE_ADDRESS - if client_options: - if type(client_options) == dict: - client_options = google.api_core.client_options.from_dict( - client_options - ) - if client_options.api_endpoint: - api_endpoint = client_options.api_endpoint - - # Instantiate the transport. - # The transport is responsible for handling serialization and - # deserialization and actually sending data to the service. 
- if transport: - if callable(transport): - self.transport = transport( - credentials=credentials, - default_class=policy_tag_manager_grpc_transport.PolicyTagManagerGrpcTransport, - address=api_endpoint, - ) - else: - if credentials: - raise ValueError( - "Received both a transport instance and " - "credentials; these are mutually exclusive." - ) - self.transport = transport - else: - self.transport = policy_tag_manager_grpc_transport.PolicyTagManagerGrpcTransport( - address=api_endpoint, channel=channel, credentials=credentials - ) - - if client_info is None: - client_info = google.api_core.gapic_v1.client_info.ClientInfo( - gapic_version=_GAPIC_LIBRARY_VERSION - ) - else: - client_info.gapic_version = _GAPIC_LIBRARY_VERSION - self._client_info = client_info - - # Parse out the default settings for retry and timeout for each RPC - # from the client configuration. - # (Ordinarily, these are the defaults specified in the `*_config.py` - # file next to this one.) - self._method_configs = google.api_core.gapic_v1.config.parse_method_configs( - client_config["interfaces"][self._INTERFACE_NAME] - ) - - # Save a dictionary of cached API call functions. - # These are the actual callables which invoke the proper - # transport methods, wrapped with `wrap_method` to add retry, - # timeout, and the like. - self._inner_api_calls = {} - - # Service calls - def create_taxonomy( - self, - parent, - taxonomy=None, - retry=google.api_core.gapic_v1.method.DEFAULT, - timeout=google.api_core.gapic_v1.method.DEFAULT, - metadata=None, - ): - """ - Creates a taxonomy in the specified project. - - Example: - >>> from google.cloud import datacatalog_v1beta1 - >>> - >>> client = datacatalog_v1beta1.PolicyTagManagerClient() - >>> - >>> parent = client.location_path('[PROJECT]', '[LOCATION]') - >>> - >>> response = client.create_taxonomy(parent) - - Args: - parent (str): Required. Resource name of the project that the taxonomy will belong to. 
- taxonomy (Union[dict, ~google.cloud.datacatalog_v1beta1.types.Taxonomy]): The taxonomy to be created. - - If a dict is provided, it must be of the same form as the protobuf - message :class:`~google.cloud.datacatalog_v1beta1.types.Taxonomy` - retry (Optional[google.api_core.retry.Retry]): A retry object used - to retry requests. If ``None`` is specified, requests will - be retried using a default configuration. - timeout (Optional[float]): The amount of time, in seconds, to wait - for the request to complete. Note that if ``retry`` is - specified, the timeout applies to each individual attempt. - metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata - that is provided to the method. - - Returns: - A :class:`~google.cloud.datacatalog_v1beta1.types.Taxonomy` instance. - - Raises: - google.api_core.exceptions.GoogleAPICallError: If the request - failed for any reason. - google.api_core.exceptions.RetryError: If the request failed due - to a retryable error and retry attempts failed. - ValueError: If the parameters are invalid. - """ - # Wrap the transport method to add retry and timeout logic. 
-        if "create_taxonomy" not in self._inner_api_calls:
-            self._inner_api_calls[
-                "create_taxonomy"
-            ] = google.api_core.gapic_v1.method.wrap_method(
-                self.transport.create_taxonomy,
-                default_retry=self._method_configs["CreateTaxonomy"].retry,
-                default_timeout=self._method_configs["CreateTaxonomy"].timeout,
-                client_info=self._client_info,
-            )
-
-        request = policytagmanager_pb2.CreateTaxonomyRequest(
-            parent=parent, taxonomy=taxonomy
-        )
-        if metadata is None:
-            metadata = []
-        metadata = list(metadata)
-        try:
-            routing_header = [("parent", parent)]
-        except AttributeError:
-            pass
-        else:
-            routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata(
-                routing_header
-            )
-            metadata.append(routing_metadata)
-
-        return self._inner_api_calls["create_taxonomy"](
-            request, retry=retry, timeout=timeout, metadata=metadata
-        )
-
-    def delete_taxonomy(
-        self,
-        name,
-        retry=google.api_core.gapic_v1.method.DEFAULT,
-        timeout=google.api_core.gapic_v1.method.DEFAULT,
-        metadata=None,
-    ):
-        """
-        Deletes a taxonomy. This operation will also delete all
-        policy tags in this taxonomy along with their associated policies.
-
-        Example:
-            >>> from google.cloud import datacatalog_v1beta1
-            >>>
-            >>> client = datacatalog_v1beta1.PolicyTagManagerClient()
-            >>>
-            >>> name = client.taxonomy_path('[PROJECT]', '[LOCATION]', '[TAXONOMY]')
-            >>>
-            >>> client.delete_taxonomy(name)
-
-        Args:
-            name (str): Required. Resource name of the taxonomy to be deleted. All policy tags in
-                this taxonomy will also be deleted.
-            retry (Optional[google.api_core.retry.Retry]): A retry object used
-                to retry requests. If ``None`` is specified, requests will
-                be retried using a default configuration.
-            timeout (Optional[float]): The amount of time, in seconds, to wait
-                for the request to complete. Note that if ``retry`` is
-                specified, the timeout applies to each individual attempt.
-            metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata
-                that is provided to the method.
-
-        Raises:
-            google.api_core.exceptions.GoogleAPICallError: If the request
-                failed for any reason.
-            google.api_core.exceptions.RetryError: If the request failed due
-                to a retryable error and retry attempts failed.
-            ValueError: If the parameters are invalid.
-        """
-        # Wrap the transport method to add retry and timeout logic.
-        if "delete_taxonomy" not in self._inner_api_calls:
-            self._inner_api_calls[
-                "delete_taxonomy"
-            ] = google.api_core.gapic_v1.method.wrap_method(
-                self.transport.delete_taxonomy,
-                default_retry=self._method_configs["DeleteTaxonomy"].retry,
-                default_timeout=self._method_configs["DeleteTaxonomy"].timeout,
-                client_info=self._client_info,
-            )
-
-        request = policytagmanager_pb2.DeleteTaxonomyRequest(name=name)
-        if metadata is None:
-            metadata = []
-        metadata = list(metadata)
-        try:
-            routing_header = [("name", name)]
-        except AttributeError:
-            pass
-        else:
-            routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata(
-                routing_header
-            )
-            metadata.append(routing_metadata)
-
-        self._inner_api_calls["delete_taxonomy"](
-            request, retry=retry, timeout=timeout, metadata=metadata
-        )
-
-    def update_taxonomy(
-        self,
-        taxonomy=None,
-        update_mask=None,
-        retry=google.api_core.gapic_v1.method.DEFAULT,
-        timeout=google.api_core.gapic_v1.method.DEFAULT,
-        metadata=None,
-    ):
-        """
-        Updates a taxonomy.
-
-        Example:
-            >>> from google.cloud import datacatalog_v1beta1
-            >>>
-            >>> client = datacatalog_v1beta1.PolicyTagManagerClient()
-            >>>
-            >>> response = client.update_taxonomy()
-
-        Args:
-            taxonomy (Union[dict, ~google.cloud.datacatalog_v1beta1.types.Taxonomy]): The taxonomy to update. Only description, display_name, and
-                activated policy types can be updated.
-
-                If a dict is provided, it must be of the same form as the protobuf
-                message :class:`~google.cloud.datacatalog_v1beta1.types.Taxonomy`
-            update_mask (Union[dict, ~google.cloud.datacatalog_v1beta1.types.FieldMask]): The update mask applies to the resource. For the ``FieldMask``
-                definition, see
-                https://developers.google.com/protocol-buffers/docs/reference/google.protobuf#fieldmask
-                If not set, defaults to all of the fields that are allowed to update.
-
-                If a dict is provided, it must be of the same form as the protobuf
-                message :class:`~google.cloud.datacatalog_v1beta1.types.FieldMask`
-            retry (Optional[google.api_core.retry.Retry]): A retry object used
-                to retry requests. If ``None`` is specified, requests will
-                be retried using a default configuration.
-            timeout (Optional[float]): The amount of time, in seconds, to wait
-                for the request to complete. Note that if ``retry`` is
-                specified, the timeout applies to each individual attempt.
-            metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata
-                that is provided to the method.
-
-        Returns:
-            A :class:`~google.cloud.datacatalog_v1beta1.types.Taxonomy` instance.
-
-        Raises:
-            google.api_core.exceptions.GoogleAPICallError: If the request
-                failed for any reason.
-            google.api_core.exceptions.RetryError: If the request failed due
-                to a retryable error and retry attempts failed.
-            ValueError: If the parameters are invalid.
-        """
-        # Wrap the transport method to add retry and timeout logic.
-        if "update_taxonomy" not in self._inner_api_calls:
-            self._inner_api_calls[
-                "update_taxonomy"
-            ] = google.api_core.gapic_v1.method.wrap_method(
-                self.transport.update_taxonomy,
-                default_retry=self._method_configs["UpdateTaxonomy"].retry,
-                default_timeout=self._method_configs["UpdateTaxonomy"].timeout,
-                client_info=self._client_info,
-            )
-
-        request = policytagmanager_pb2.UpdateTaxonomyRequest(
-            taxonomy=taxonomy, update_mask=update_mask
-        )
-        if metadata is None:
-            metadata = []
-        metadata = list(metadata)
-        try:
-            routing_header = [("taxonomy.name", taxonomy.name)]
-        except AttributeError:
-            pass
-        else:
-            routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata(
-                routing_header
-            )
-            metadata.append(routing_metadata)
-
-        return self._inner_api_calls["update_taxonomy"](
-            request, retry=retry, timeout=timeout, metadata=metadata
-        )
-
-    def list_taxonomies(
-        self,
-        parent,
-        page_size=None,
-        retry=google.api_core.gapic_v1.method.DEFAULT,
-        timeout=google.api_core.gapic_v1.method.DEFAULT,
-        metadata=None,
-    ):
-        """
-        Lists all taxonomies in a project in a particular location that the caller
-        has permission to view.
-
-        Example:
-            >>> from google.cloud import datacatalog_v1beta1
-            >>>
-            >>> client = datacatalog_v1beta1.PolicyTagManagerClient()
-            >>>
-            >>> parent = client.location_path('[PROJECT]', '[LOCATION]')
-            >>>
-            >>> # Iterate over all results
-            >>> for element in client.list_taxonomies(parent):
-            ...     # process element
-            ...     pass
-            >>>
-            >>>
-            >>> # Alternatively:
-            >>>
-            >>> # Iterate over results one page at a time
-            >>> for page in client.list_taxonomies(parent).pages:
-            ...     for element in page:
-            ...         # process element
-            ...         pass
-
-        Args:
-            parent (str): Required. Resource name of the project to list the taxonomies of.
-            page_size (int): The maximum number of resources contained in the
-                underlying API response. If page streaming is performed per-
-                resource, this parameter does not affect the return value. If page
-                streaming is performed per-page, this determines the maximum number
-                of resources in a page.
-            retry (Optional[google.api_core.retry.Retry]): A retry object used
-                to retry requests. If ``None`` is specified, requests will
-                be retried using a default configuration.
-            timeout (Optional[float]): The amount of time, in seconds, to wait
-                for the request to complete. Note that if ``retry`` is
-                specified, the timeout applies to each individual attempt.
-            metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata
-                that is provided to the method.
-
-        Returns:
-            A :class:`~google.api_core.page_iterator.PageIterator` instance.
-            An iterable of :class:`~google.cloud.datacatalog_v1beta1.types.Taxonomy` instances.
-            You can also iterate over the pages of the response
-            using its `pages` property.
-
-        Raises:
-            google.api_core.exceptions.GoogleAPICallError: If the request
-                failed for any reason.
-            google.api_core.exceptions.RetryError: If the request failed due
-                to a retryable error and retry attempts failed.
-            ValueError: If the parameters are invalid.
-        """
-        # Wrap the transport method to add retry and timeout logic.
-        if "list_taxonomies" not in self._inner_api_calls:
-            self._inner_api_calls[
-                "list_taxonomies"
-            ] = google.api_core.gapic_v1.method.wrap_method(
-                self.transport.list_taxonomies,
-                default_retry=self._method_configs["ListTaxonomies"].retry,
-                default_timeout=self._method_configs["ListTaxonomies"].timeout,
-                client_info=self._client_info,
-            )
-
-        request = policytagmanager_pb2.ListTaxonomiesRequest(
-            parent=parent, page_size=page_size
-        )
-        if metadata is None:
-            metadata = []
-        metadata = list(metadata)
-        try:
-            routing_header = [("parent", parent)]
-        except AttributeError:
-            pass
-        else:
-            routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata(
-                routing_header
-            )
-            metadata.append(routing_metadata)
-
-        iterator = google.api_core.page_iterator.GRPCIterator(
-            client=None,
-            method=functools.partial(
-                self._inner_api_calls["list_taxonomies"],
-                retry=retry,
-                timeout=timeout,
-                metadata=metadata,
-            ),
-            request=request,
-            items_field="taxonomies",
-            request_token_field="page_token",
-            response_token_field="next_page_token",
-        )
-        return iterator
-
-    def get_taxonomy(
-        self,
-        name,
-        retry=google.api_core.gapic_v1.method.DEFAULT,
-        timeout=google.api_core.gapic_v1.method.DEFAULT,
-        metadata=None,
-    ):
-        """
-        Gets a taxonomy.
-
-        Example:
-            >>> from google.cloud import datacatalog_v1beta1
-            >>>
-            >>> client = datacatalog_v1beta1.PolicyTagManagerClient()
-            >>>
-            >>> name = client.taxonomy_path('[PROJECT]', '[LOCATION]', '[TAXONOMY]')
-            >>>
-            >>> response = client.get_taxonomy(name)
-
-        Args:
-            name (str): Required. Resource name of the requested taxonomy.
-            retry (Optional[google.api_core.retry.Retry]): A retry object used
-                to retry requests. If ``None`` is specified, requests will
-                be retried using a default configuration.
-            timeout (Optional[float]): The amount of time, in seconds, to wait
-                for the request to complete. Note that if ``retry`` is
-                specified, the timeout applies to each individual attempt.
-            metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata
-                that is provided to the method.
-
-        Returns:
-            A :class:`~google.cloud.datacatalog_v1beta1.types.Taxonomy` instance.
-
-        Raises:
-            google.api_core.exceptions.GoogleAPICallError: If the request
-                failed for any reason.
-            google.api_core.exceptions.RetryError: If the request failed due
-                to a retryable error and retry attempts failed.
-            ValueError: If the parameters are invalid.
-        """
-        # Wrap the transport method to add retry and timeout logic.
-        if "get_taxonomy" not in self._inner_api_calls:
-            self._inner_api_calls[
-                "get_taxonomy"
-            ] = google.api_core.gapic_v1.method.wrap_method(
-                self.transport.get_taxonomy,
-                default_retry=self._method_configs["GetTaxonomy"].retry,
-                default_timeout=self._method_configs["GetTaxonomy"].timeout,
-                client_info=self._client_info,
-            )
-
-        request = policytagmanager_pb2.GetTaxonomyRequest(name=name)
-        if metadata is None:
-            metadata = []
-        metadata = list(metadata)
-        try:
-            routing_header = [("name", name)]
-        except AttributeError:
-            pass
-        else:
-            routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata(
-                routing_header
-            )
-            metadata.append(routing_metadata)
-
-        return self._inner_api_calls["get_taxonomy"](
-            request, retry=retry, timeout=timeout, metadata=metadata
-        )
-
-    def create_policy_tag(
-        self,
-        parent,
-        policy_tag=None,
-        retry=google.api_core.gapic_v1.method.DEFAULT,
-        timeout=google.api_core.gapic_v1.method.DEFAULT,
-        metadata=None,
-    ):
-        """
-        Creates a policy tag in the specified taxonomy.
-
-        Example:
-            >>> from google.cloud import datacatalog_v1beta1
-            >>>
-            >>> client = datacatalog_v1beta1.PolicyTagManagerClient()
-            >>>
-            >>> parent = client.taxonomy_path('[PROJECT]', '[LOCATION]', '[TAXONOMY]')
-            >>>
-            >>> response = client.create_policy_tag(parent)
-
-        Args:
-            parent (str): Required. Resource name of the taxonomy that the policy tag will belong to.
-            policy_tag (Union[dict, ~google.cloud.datacatalog_v1beta1.types.PolicyTag]): The policy tag to be created.
-
-                If a dict is provided, it must be of the same form as the protobuf
-                message :class:`~google.cloud.datacatalog_v1beta1.types.PolicyTag`
-            retry (Optional[google.api_core.retry.Retry]): A retry object used
-                to retry requests. If ``None`` is specified, requests will
-                be retried using a default configuration.
-            timeout (Optional[float]): The amount of time, in seconds, to wait
-                for the request to complete. Note that if ``retry`` is
-                specified, the timeout applies to each individual attempt.
-            metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata
-                that is provided to the method.
-
-        Returns:
-            A :class:`~google.cloud.datacatalog_v1beta1.types.PolicyTag` instance.
-
-        Raises:
-            google.api_core.exceptions.GoogleAPICallError: If the request
-                failed for any reason.
-            google.api_core.exceptions.RetryError: If the request failed due
-                to a retryable error and retry attempts failed.
-            ValueError: If the parameters are invalid.
-        """
-        # Wrap the transport method to add retry and timeout logic.
-        if "create_policy_tag" not in self._inner_api_calls:
-            self._inner_api_calls[
-                "create_policy_tag"
-            ] = google.api_core.gapic_v1.method.wrap_method(
-                self.transport.create_policy_tag,
-                default_retry=self._method_configs["CreatePolicyTag"].retry,
-                default_timeout=self._method_configs["CreatePolicyTag"].timeout,
-                client_info=self._client_info,
-            )
-
-        request = policytagmanager_pb2.CreatePolicyTagRequest(
-            parent=parent, policy_tag=policy_tag
-        )
-        if metadata is None:
-            metadata = []
-        metadata = list(metadata)
-        try:
-            routing_header = [("parent", parent)]
-        except AttributeError:
-            pass
-        else:
-            routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata(
-                routing_header
-            )
-            metadata.append(routing_metadata)
-
-        return self._inner_api_calls["create_policy_tag"](
-            request, retry=retry, timeout=timeout, metadata=metadata
-        )
-
-    def delete_policy_tag(
-        self,
-        name,
-        retry=google.api_core.gapic_v1.method.DEFAULT,
-        timeout=google.api_core.gapic_v1.method.DEFAULT,
-        metadata=None,
-    ):
-        """
-        Deletes a policy tag. Also deletes all of its descendant policy tags.
-
-        Example:
-            >>> from google.cloud import datacatalog_v1beta1
-            >>>
-            >>> client = datacatalog_v1beta1.PolicyTagManagerClient()
-            >>>
-            >>> name = client.policy_tag_path('[PROJECT]', '[LOCATION]', '[TAXONOMY]', '[POLICY_TAG]')
-            >>>
-            >>> client.delete_policy_tag(name)
-
-        Args:
-            name (str): Required. Resource name of the policy tag to be deleted. All of its descendant
-                policy tags will also be deleted.
-            retry (Optional[google.api_core.retry.Retry]): A retry object used
-                to retry requests. If ``None`` is specified, requests will
-                be retried using a default configuration.
-            timeout (Optional[float]): The amount of time, in seconds, to wait
-                for the request to complete. Note that if ``retry`` is
-                specified, the timeout applies to each individual attempt.
-            metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata
-                that is provided to the method.
-
-        Raises:
-            google.api_core.exceptions.GoogleAPICallError: If the request
-                failed for any reason.
-            google.api_core.exceptions.RetryError: If the request failed due
-                to a retryable error and retry attempts failed.
-            ValueError: If the parameters are invalid.
-        """
-        # Wrap the transport method to add retry and timeout logic.
-        if "delete_policy_tag" not in self._inner_api_calls:
-            self._inner_api_calls[
-                "delete_policy_tag"
-            ] = google.api_core.gapic_v1.method.wrap_method(
-                self.transport.delete_policy_tag,
-                default_retry=self._method_configs["DeletePolicyTag"].retry,
-                default_timeout=self._method_configs["DeletePolicyTag"].timeout,
-                client_info=self._client_info,
-            )
-
-        request = policytagmanager_pb2.DeletePolicyTagRequest(name=name)
-        if metadata is None:
-            metadata = []
-        metadata = list(metadata)
-        try:
-            routing_header = [("name", name)]
-        except AttributeError:
-            pass
-        else:
-            routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata(
-                routing_header
-            )
-            metadata.append(routing_metadata)
-
-        self._inner_api_calls["delete_policy_tag"](
-            request, retry=retry, timeout=timeout, metadata=metadata
-        )
-
-    def update_policy_tag(
-        self,
-        policy_tag=None,
-        update_mask=None,
-        retry=google.api_core.gapic_v1.method.DEFAULT,
-        timeout=google.api_core.gapic_v1.method.DEFAULT,
-        metadata=None,
-    ):
-        """
-        Updates a policy tag.
-
-        Example:
-            >>> from google.cloud import datacatalog_v1beta1
-            >>>
-            >>> client = datacatalog_v1beta1.PolicyTagManagerClient()
-            >>>
-            >>> response = client.update_policy_tag()
-
-        Args:
-            policy_tag (Union[dict, ~google.cloud.datacatalog_v1beta1.types.PolicyTag]): The policy tag to update. Only the description, display_name, and
-                parent_policy_tag fields can be updated.
-
-                If a dict is provided, it must be of the same form as the protobuf
-                message :class:`~google.cloud.datacatalog_v1beta1.types.PolicyTag`
-            update_mask (Union[dict, ~google.cloud.datacatalog_v1beta1.types.FieldMask]): The update mask applies to the resource. Only display_name,
-                description and parent_policy_tag can be updated and thus can be listed
-                in the mask. If update_mask is not provided, all allowed fields (i.e.
-                display_name, description and parent) will be updated. For more
-                information including the ``FieldMask`` definition, see
-                https://developers.google.com/protocol-buffers/docs/reference/google.protobuf#fieldmask
-                If not set, defaults to all of the fields that are allowed to update.
-
-                If a dict is provided, it must be of the same form as the protobuf
-                message :class:`~google.cloud.datacatalog_v1beta1.types.FieldMask`
-            retry (Optional[google.api_core.retry.Retry]): A retry object used
-                to retry requests. If ``None`` is specified, requests will
-                be retried using a default configuration.
-            timeout (Optional[float]): The amount of time, in seconds, to wait
-                for the request to complete. Note that if ``retry`` is
-                specified, the timeout applies to each individual attempt.
-            metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata
-                that is provided to the method.
-
-        Returns:
-            A :class:`~google.cloud.datacatalog_v1beta1.types.PolicyTag` instance.
-
-        Raises:
-            google.api_core.exceptions.GoogleAPICallError: If the request
-                failed for any reason.
-            google.api_core.exceptions.RetryError: If the request failed due
-                to a retryable error and retry attempts failed.
-            ValueError: If the parameters are invalid.
-        """
-        # Wrap the transport method to add retry and timeout logic.
-        if "update_policy_tag" not in self._inner_api_calls:
-            self._inner_api_calls[
-                "update_policy_tag"
-            ] = google.api_core.gapic_v1.method.wrap_method(
-                self.transport.update_policy_tag,
-                default_retry=self._method_configs["UpdatePolicyTag"].retry,
-                default_timeout=self._method_configs["UpdatePolicyTag"].timeout,
-                client_info=self._client_info,
-            )
-
-        request = policytagmanager_pb2.UpdatePolicyTagRequest(
-            policy_tag=policy_tag, update_mask=update_mask
-        )
-        if metadata is None:
-            metadata = []
-        metadata = list(metadata)
-        try:
-            routing_header = [("policy_tag.name", policy_tag.name)]
-        except AttributeError:
-            pass
-        else:
-            routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata(
-                routing_header
-            )
-            metadata.append(routing_metadata)
-
-        return self._inner_api_calls["update_policy_tag"](
-            request, retry=retry, timeout=timeout, metadata=metadata
-        )
-
-    def list_policy_tags(
-        self,
-        parent,
-        page_size=None,
-        retry=google.api_core.gapic_v1.method.DEFAULT,
-        timeout=google.api_core.gapic_v1.method.DEFAULT,
-        metadata=None,
-    ):
-        """
-        Lists all policy tags in a taxonomy.
-
-        Example:
-            >>> from google.cloud import datacatalog_v1beta1
-            >>>
-            >>> client = datacatalog_v1beta1.PolicyTagManagerClient()
-            >>>
-            >>> parent = client.taxonomy_path('[PROJECT]', '[LOCATION]', '[TAXONOMY]')
-            >>>
-            >>> # Iterate over all results
-            >>> for element in client.list_policy_tags(parent):
-            ...     # process element
-            ...     pass
-            >>>
-            >>>
-            >>> # Alternatively:
-            >>>
-            >>> # Iterate over results one page at a time
-            >>> for page in client.list_policy_tags(parent).pages:
-            ...     for element in page:
-            ...         # process element
-            ...         pass
-
-        Args:
-            parent (str): Required. Resource name of the taxonomy to list the policy tags of.
-            page_size (int): The maximum number of resources contained in the
-                underlying API response. If page streaming is performed per-
-                resource, this parameter does not affect the return value. If page
-                streaming is performed per-page, this determines the maximum number
-                of resources in a page.
-            retry (Optional[google.api_core.retry.Retry]): A retry object used
-                to retry requests. If ``None`` is specified, requests will
-                be retried using a default configuration.
-            timeout (Optional[float]): The amount of time, in seconds, to wait
-                for the request to complete. Note that if ``retry`` is
-                specified, the timeout applies to each individual attempt.
-            metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata
-                that is provided to the method.
-
-        Returns:
-            A :class:`~google.api_core.page_iterator.PageIterator` instance.
-            An iterable of :class:`~google.cloud.datacatalog_v1beta1.types.PolicyTag` instances.
-            You can also iterate over the pages of the response
-            using its `pages` property.
-
-        Raises:
-            google.api_core.exceptions.GoogleAPICallError: If the request
-                failed for any reason.
-            google.api_core.exceptions.RetryError: If the request failed due
-                to a retryable error and retry attempts failed.
-            ValueError: If the parameters are invalid.
-        """
-        # Wrap the transport method to add retry and timeout logic.
-        if "list_policy_tags" not in self._inner_api_calls:
-            self._inner_api_calls[
-                "list_policy_tags"
-            ] = google.api_core.gapic_v1.method.wrap_method(
-                self.transport.list_policy_tags,
-                default_retry=self._method_configs["ListPolicyTags"].retry,
-                default_timeout=self._method_configs["ListPolicyTags"].timeout,
-                client_info=self._client_info,
-            )
-
-        request = policytagmanager_pb2.ListPolicyTagsRequest(
-            parent=parent, page_size=page_size
-        )
-        if metadata is None:
-            metadata = []
-        metadata = list(metadata)
-        try:
-            routing_header = [("parent", parent)]
-        except AttributeError:
-            pass
-        else:
-            routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata(
-                routing_header
-            )
-            metadata.append(routing_metadata)
-
-        iterator = google.api_core.page_iterator.GRPCIterator(
-            client=None,
-            method=functools.partial(
-                self._inner_api_calls["list_policy_tags"],
-                retry=retry,
-                timeout=timeout,
-                metadata=metadata,
-            ),
-            request=request,
-            items_field="policy_tags",
-            request_token_field="page_token",
-            response_token_field="next_page_token",
-        )
-        return iterator
-
-    def get_policy_tag(
-        self,
-        name,
-        retry=google.api_core.gapic_v1.method.DEFAULT,
-        timeout=google.api_core.gapic_v1.method.DEFAULT,
-        metadata=None,
-    ):
-        """
-        Gets a policy tag.
-
-        Example:
-            >>> from google.cloud import datacatalog_v1beta1
-            >>>
-            >>> client = datacatalog_v1beta1.PolicyTagManagerClient()
-            >>>
-            >>> name = client.policy_tag_path('[PROJECT]', '[LOCATION]', '[TAXONOMY]', '[POLICY_TAG]')
-            >>>
-            >>> response = client.get_policy_tag(name)
-
-        Args:
-            name (str): Required. Resource name of the requested policy tag.
-            retry (Optional[google.api_core.retry.Retry]): A retry object used
-                to retry requests. If ``None`` is specified, requests will
-                be retried using a default configuration.
-            timeout (Optional[float]): The amount of time, in seconds, to wait
-                for the request to complete. Note that if ``retry`` is
-                specified, the timeout applies to each individual attempt.
-            metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata
-                that is provided to the method.
-
-        Returns:
-            A :class:`~google.cloud.datacatalog_v1beta1.types.PolicyTag` instance.
-
-        Raises:
-            google.api_core.exceptions.GoogleAPICallError: If the request
-                failed for any reason.
-            google.api_core.exceptions.RetryError: If the request failed due
-                to a retryable error and retry attempts failed.
-            ValueError: If the parameters are invalid.
-        """
-        # Wrap the transport method to add retry and timeout logic.
-        if "get_policy_tag" not in self._inner_api_calls:
-            self._inner_api_calls[
-                "get_policy_tag"
-            ] = google.api_core.gapic_v1.method.wrap_method(
-                self.transport.get_policy_tag,
-                default_retry=self._method_configs["GetPolicyTag"].retry,
-                default_timeout=self._method_configs["GetPolicyTag"].timeout,
-                client_info=self._client_info,
-            )
-
-        request = policytagmanager_pb2.GetPolicyTagRequest(name=name)
-        if metadata is None:
-            metadata = []
-        metadata = list(metadata)
-        try:
-            routing_header = [("name", name)]
-        except AttributeError:
-            pass
-        else:
-            routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata(
-                routing_header
-            )
-            metadata.append(routing_metadata)
-
-        return self._inner_api_calls["get_policy_tag"](
-            request, retry=retry, timeout=timeout, metadata=metadata
-        )
-
-    def get_iam_policy(
-        self,
-        resource,
-        options_=None,
-        retry=google.api_core.gapic_v1.method.DEFAULT,
-        timeout=google.api_core.gapic_v1.method.DEFAULT,
-        metadata=None,
-    ):
-        """
-        Gets the IAM policy for a taxonomy or a policy tag.
-
-        Example:
-            >>> from google.cloud import datacatalog_v1beta1
-            >>>
-            >>> client = datacatalog_v1beta1.PolicyTagManagerClient()
-            >>>
-            >>> # TODO: Initialize `resource`:
-            >>> resource = ''
-            >>>
-            >>> response = client.get_iam_policy(resource)
-
-        Args:
-            resource (str): REQUIRED: The resource for which the policy is being requested.
-                See the operation documentation for the appropriate value for this field.
-            options_ (Union[dict, ~google.cloud.datacatalog_v1beta1.types.GetPolicyOptions]): OPTIONAL: A ``GetPolicyOptions`` object for specifying options to
-                ``GetIamPolicy``. This field is only used by Cloud IAM.
-
-                If a dict is provided, it must be of the same form as the protobuf
-                message :class:`~google.cloud.datacatalog_v1beta1.types.GetPolicyOptions`
-            retry (Optional[google.api_core.retry.Retry]): A retry object used
-                to retry requests. If ``None`` is specified, requests will
-                be retried using a default configuration.
-            timeout (Optional[float]): The amount of time, in seconds, to wait
-                for the request to complete. Note that if ``retry`` is
-                specified, the timeout applies to each individual attempt.
-            metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata
-                that is provided to the method.
-
-        Returns:
-            A :class:`~google.cloud.datacatalog_v1beta1.types.Policy` instance.
-
-        Raises:
-            google.api_core.exceptions.GoogleAPICallError: If the request
-                failed for any reason.
-            google.api_core.exceptions.RetryError: If the request failed due
-                to a retryable error and retry attempts failed.
-            ValueError: If the parameters are invalid.
-        """
-        # Wrap the transport method to add retry and timeout logic.
-        if "get_iam_policy" not in self._inner_api_calls:
-            self._inner_api_calls[
-                "get_iam_policy"
-            ] = google.api_core.gapic_v1.method.wrap_method(
-                self.transport.get_iam_policy,
-                default_retry=self._method_configs["GetIamPolicy"].retry,
-                default_timeout=self._method_configs["GetIamPolicy"].timeout,
-                client_info=self._client_info,
-            )
-
-        request = iam_policy_pb2.GetIamPolicyRequest(
-            resource=resource, options=options_
-        )
-        if metadata is None:
-            metadata = []
-        metadata = list(metadata)
-        try:
-            routing_header = [("resource", resource)]
-        except AttributeError:
-            pass
-        else:
-            routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata(
-                routing_header
-            )
-            metadata.append(routing_metadata)
-
-        return self._inner_api_calls["get_iam_policy"](
-            request, retry=retry, timeout=timeout, metadata=metadata
-        )
-
-    def set_iam_policy(
-        self,
-        resource,
-        policy,
-        retry=google.api_core.gapic_v1.method.DEFAULT,
-        timeout=google.api_core.gapic_v1.method.DEFAULT,
-        metadata=None,
-    ):
-        """
-        Sets the IAM policy for a taxonomy or a policy tag.
-
-        Example:
-            >>> from google.cloud import datacatalog_v1beta1
-            >>>
-            >>> client = datacatalog_v1beta1.PolicyTagManagerClient()
-            >>>
-            >>> # TODO: Initialize `resource`:
-            >>> resource = ''
-            >>>
-            >>> # TODO: Initialize `policy`:
-            >>> policy = {}
-            >>>
-            >>> response = client.set_iam_policy(resource, policy)
-
-        Args:
-            resource (str): REQUIRED: The resource for which the policy is being specified.
-                See the operation documentation for the appropriate value for this field.
-            policy (Union[dict, ~google.cloud.datacatalog_v1beta1.types.Policy]): REQUIRED: The complete policy to be applied to the ``resource``. The
-                size of the policy is limited to a few 10s of KB. An empty policy is a
-                valid policy but certain Cloud Platform services (such as Projects)
-                might reject them.
-
-                If a dict is provided, it must be of the same form as the protobuf
-                message :class:`~google.cloud.datacatalog_v1beta1.types.Policy`
-            retry (Optional[google.api_core.retry.Retry]): A retry object used
-                to retry requests. If ``None`` is specified, requests will
-                be retried using a default configuration.
-            timeout (Optional[float]): The amount of time, in seconds, to wait
-                for the request to complete. Note that if ``retry`` is
-                specified, the timeout applies to each individual attempt.
-            metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata
-                that is provided to the method.
-
-        Returns:
-            A :class:`~google.cloud.datacatalog_v1beta1.types.Policy` instance.
-
-        Raises:
-            google.api_core.exceptions.GoogleAPICallError: If the request
-                failed for any reason.
-            google.api_core.exceptions.RetryError: If the request failed due
-                to a retryable error and retry attempts failed.
-            ValueError: If the parameters are invalid.
-        """
-        # Wrap the transport method to add retry and timeout logic.
-        if "set_iam_policy" not in self._inner_api_calls:
-            self._inner_api_calls[
-                "set_iam_policy"
-            ] = google.api_core.gapic_v1.method.wrap_method(
-                self.transport.set_iam_policy,
-                default_retry=self._method_configs["SetIamPolicy"].retry,
-                default_timeout=self._method_configs["SetIamPolicy"].timeout,
-                client_info=self._client_info,
-            )
-
-        request = iam_policy_pb2.SetIamPolicyRequest(resource=resource, policy=policy)
-        if metadata is None:
-            metadata = []
-        metadata = list(metadata)
-        try:
-            routing_header = [("resource", resource)]
-        except AttributeError:
-            pass
-        else:
-            routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata(
-                routing_header
-            )
-            metadata.append(routing_metadata)
-
-        return self._inner_api_calls["set_iam_policy"](
-            request, retry=retry, timeout=timeout, metadata=metadata
-        )
-
-    def test_iam_permissions(
-        self,
-        resource,
-        permissions,
-        retry=google.api_core.gapic_v1.method.DEFAULT,
-        timeout=google.api_core.gapic_v1.method.DEFAULT,
-        metadata=None,
-    ):
-        """
-        Returns the permissions that a caller has on the specified taxonomy or
-        policy tag.
-
-        Example:
-            >>> from google.cloud import datacatalog_v1beta1
-            >>>
-            >>> client = datacatalog_v1beta1.PolicyTagManagerClient()
-            >>>
-            >>> # TODO: Initialize `resource`:
-            >>> resource = ''
-            >>>
-            >>> # TODO: Initialize `permissions`:
-            >>> permissions = []
-            >>>
-            >>> response = client.test_iam_permissions(resource, permissions)
-
-        Args:
-            resource (str): REQUIRED: The resource for which the policy detail is being requested.
-                See the operation documentation for the appropriate value for this field.
-            permissions (list[str]): The set of permissions to check for the ``resource``. Permissions
-                with wildcards (such as '*' or 'storage.*') are not allowed. For more
-                information see `IAM
-                Overview `__.
-            retry (Optional[google.api_core.retry.Retry]): A retry object used
-                to retry requests. If ``None`` is specified, requests will
-                be retried using a default configuration.
-            timeout (Optional[float]): The amount of time, in seconds, to wait
-                for the request to complete. Note that if ``retry`` is
-                specified, the timeout applies to each individual attempt.
-            metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata
-                that is provided to the method.
-
-        Returns:
-            A :class:`~google.cloud.datacatalog_v1beta1.types.TestIamPermissionsResponse` instance.
-
-        Raises:
-            google.api_core.exceptions.GoogleAPICallError: If the request
-                failed for any reason.
-            google.api_core.exceptions.RetryError: If the request failed due
-                to a retryable error and retry attempts failed.
-            ValueError: If the parameters are invalid.
-        """
-        # Wrap the transport method to add retry and timeout logic.
-        if "test_iam_permissions" not in self._inner_api_calls:
-            self._inner_api_calls[
-                "test_iam_permissions"
-            ] = google.api_core.gapic_v1.method.wrap_method(
-                self.transport.test_iam_permissions,
-                default_retry=self._method_configs["TestIamPermissions"].retry,
-                default_timeout=self._method_configs["TestIamPermissions"].timeout,
-                client_info=self._client_info,
-            )
-
-        request = iam_policy_pb2.TestIamPermissionsRequest(
-            resource=resource, permissions=permissions
-        )
-        if metadata is None:
-            metadata = []
-        metadata = list(metadata)
-        try:
-            routing_header = [("resource", resource)]
-        except AttributeError:
-            pass
-        else:
-            routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata(
-                routing_header
-            )
-            metadata.append(routing_metadata)
-
-        return self._inner_api_calls["test_iam_permissions"](
-            request, retry=retry, timeout=timeout, metadata=metadata
-        )
diff --git a/google/cloud/datacatalog_v1beta1/gapic/policy_tag_manager_client_config.py b/google/cloud/datacatalog_v1beta1/gapic/policy_tag_manager_client_config.py
deleted file mode 100644
index 57019019..00000000
--- a/google/cloud/datacatalog_v1beta1/gapic/policy_tag_manager_client_config.py
+++ /dev/null
@@ -1,107 +0,0 @@
-config = {
-    "interfaces": {
-        "google.cloud.datacatalog.v1beta1.PolicyTagManager": {
-            "retry_codes": {
-                "retry_policy_1_codes": ["DEADLINE_EXCEEDED", "UNAVAILABLE"],
-                "no_retry_codes": [],
-                "no_retry_1_codes": [],
-            },
-            "retry_params": {
-                "retry_policy_1_params": {
-                    "initial_retry_delay_millis": 100,
-                    "retry_delay_multiplier": 1.3,
-                    "max_retry_delay_millis": 60000,
-                    "initial_rpc_timeout_millis": 60000,
-                    "rpc_timeout_multiplier": 1.0,
-                    "max_rpc_timeout_millis": 60000,
-                    "total_timeout_millis": 60000,
-                },
-                "no_retry_params": {
-                    "initial_retry_delay_millis": 0,
-                    "retry_delay_multiplier": 0.0,
-                    "max_retry_delay_millis": 0,
-                    "initial_rpc_timeout_millis": 0,
-                    "rpc_timeout_multiplier": 1.0,
-                    "max_rpc_timeout_millis": 0,
-                    "total_timeout_millis": 0,
-                },
-                "no_retry_1_params": {
-                    "initial_retry_delay_millis": 0,
-                    "retry_delay_multiplier": 0.0,
-                    "max_retry_delay_millis": 0,
-                    "initial_rpc_timeout_millis": 60000,
-                    "rpc_timeout_multiplier": 1.0,
-                    "max_rpc_timeout_millis": 60000,
-                    "total_timeout_millis": 60000,
-                },
-            },
-            "methods": {
-                "CreateTaxonomy": {
-                    "timeout_millis": 60000,
-                    "retry_codes_name": "no_retry_codes",
-                    "retry_params_name": "no_retry_params",
-                },
-                "DeleteTaxonomy": {
-                    "timeout_millis": 60000,
-                    "retry_codes_name": "no_retry_codes",
-                    "retry_params_name": "no_retry_params",
-                },
-                "UpdateTaxonomy": {
-                    "timeout_millis": 60000,
-                    "retry_codes_name": "no_retry_codes",
-                    "retry_params_name": "no_retry_params",
-                },
-                "ListTaxonomies": {
-                    "timeout_millis": 60000,
-                    "retry_codes_name": "no_retry_codes",
-                    "retry_params_name": "no_retry_params",
-                },
-                "GetTaxonomy": {
-                    "timeout_millis": 60000,
-                    "retry_codes_name": "no_retry_codes",
-                    "retry_params_name": "no_retry_params",
-                },
-                "CreatePolicyTag": {
-                    "timeout_millis": 60000,
-                    "retry_codes_name": "no_retry_codes",
-                    "retry_params_name": "no_retry_params",
-                },
-                "DeletePolicyTag": {
-                    "timeout_millis": 60000,
-                    "retry_codes_name": "no_retry_codes",
- "retry_params_name": "no_retry_params", - }, - "UpdatePolicyTag": { - "timeout_millis": 60000, - "retry_codes_name": "no_retry_codes", - "retry_params_name": "no_retry_params", - }, - "ListPolicyTags": { - "timeout_millis": 60000, - "retry_codes_name": "no_retry_codes", - "retry_params_name": "no_retry_params", - }, - "GetPolicyTag": { - "timeout_millis": 60000, - "retry_codes_name": "no_retry_codes", - "retry_params_name": "no_retry_params", - }, - "GetIamPolicy": { - "timeout_millis": 60000, - "retry_codes_name": "no_retry_1_codes", - "retry_params_name": "no_retry_params", - }, - "SetIamPolicy": { - "timeout_millis": 60000, - "retry_codes_name": "no_retry_1_codes", - "retry_params_name": "no_retry_params", - }, - "TestIamPermissions": { - "timeout_millis": 60000, - "retry_codes_name": "no_retry_1_codes", - "retry_params_name": "no_retry_params", - }, - }, - } - } -} diff --git a/google/cloud/datacatalog_v1beta1/gapic/policy_tag_manager_serialization_client.py b/google/cloud/datacatalog_v1beta1/gapic/policy_tag_manager_serialization_client.py deleted file mode 100644 index 32356923..00000000 --- a/google/cloud/datacatalog_v1beta1/gapic/policy_tag_manager_serialization_client.py +++ /dev/null @@ -1,399 +0,0 @@ -# -*- coding: utf-8 -*- -# -# Copyright 2020 Google LLC -# -# Licensed under the Apache License, Version 2.0 (the "License"); -# you may not use this file except in compliance with the License. -# You may obtain a copy of the License at -# -# https://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, software -# distributed under the License is distributed on an "AS IS" BASIS, -# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -# See the License for the specific language governing permissions and -# limitations under the License. 
- -"""Accesses the google.cloud.datacatalog.v1beta1 PolicyTagManagerSerialization API.""" - -import pkg_resources -import warnings - -from google.oauth2 import service_account -import google.api_core.client_options -import google.api_core.gapic_v1.client_info -import google.api_core.gapic_v1.config -import google.api_core.gapic_v1.method -import google.api_core.gapic_v1.routing_header -import google.api_core.grpc_helpers -import google.api_core.path_template -import google.api_core.protobuf_helpers -import grpc - -from google.cloud.datacatalog_v1beta1.gapic import enums -from google.cloud.datacatalog_v1beta1.gapic import ( - policy_tag_manager_serialization_client_config, -) -from google.cloud.datacatalog_v1beta1.gapic.transports import ( - policy_tag_manager_serialization_grpc_transport, -) -from google.cloud.datacatalog_v1beta1.proto import datacatalog_pb2 -from google.cloud.datacatalog_v1beta1.proto import datacatalog_pb2_grpc -from google.cloud.datacatalog_v1beta1.proto import policytagmanager_pb2 -from google.cloud.datacatalog_v1beta1.proto import policytagmanager_pb2_grpc -from google.cloud.datacatalog_v1beta1.proto import policytagmanagerserialization_pb2 -from google.cloud.datacatalog_v1beta1.proto import ( - policytagmanagerserialization_pb2_grpc, -) -from google.cloud.datacatalog_v1beta1.proto import tags_pb2 -from google.iam.v1 import iam_policy_pb2 -from google.iam.v1 import options_pb2 -from google.iam.v1 import policy_pb2 -from google.protobuf import empty_pb2 -from google.protobuf import field_mask_pb2 - - -_GAPIC_LIBRARY_VERSION = pkg_resources.get_distribution( - "google-cloud-datacatalog" -).version - - -class PolicyTagManagerSerializationClient(object): - """ - Policy tag manager serialization API service allows clients to manipulate - their taxonomies and policy tags data with serialized format. 
- """ - - SERVICE_ADDRESS = "datacatalog.googleapis.com:443" - """The default address of the service.""" - - # The name of the interface for this client. This is the key used to - # find the method configuration in the client_config dictionary. - _INTERFACE_NAME = "google.cloud.datacatalog.v1beta1.PolicyTagManagerSerialization" - - @classmethod - def from_service_account_file(cls, filename, *args, **kwargs): - """Creates an instance of this client using the provided credentials - file. - - Args: - filename (str): The path to the service account private key json - file. - args: Additional arguments to pass to the constructor. - kwargs: Additional arguments to pass to the constructor. - - Returns: - PolicyTagManagerSerializationClient: The constructed client. - """ - credentials = service_account.Credentials.from_service_account_file(filename) - kwargs["credentials"] = credentials - return cls(*args, **kwargs) - - from_service_account_json = from_service_account_file - - @classmethod - def location_path(cls, project, location): - """Return a fully-qualified location string.""" - return google.api_core.path_template.expand( - "projects/{project}/locations/{location}", - project=project, - location=location, - ) - - @classmethod - def taxonomy_path(cls, project, location, taxonomy): - """Return a fully-qualified taxonomy string.""" - return google.api_core.path_template.expand( - "projects/{project}/locations/{location}/taxonomies/{taxonomy}", - project=project, - location=location, - taxonomy=taxonomy, - ) - - def __init__( - self, - transport=None, - channel=None, - credentials=None, - client_config=None, - client_info=None, - client_options=None, - ): - """Constructor. - - Args: - transport (Union[~.PolicyTagManagerSerializationGrpcTransport, - Callable[[~.Credentials, type], ~.PolicyTagManagerSerializationGrpcTransport]): A transport - instance, responsible for actually making the API calls. - The default transport uses the gRPC protocol. 
- This argument may also be a callable which returns a - transport instance. Callables will be sent the credentials - as the first argument and the default transport class as - the second argument. - channel (grpc.Channel): DEPRECATED. A ``Channel`` instance - through which to make calls. This argument is mutually exclusive - with ``credentials``; providing both will raise an exception. - credentials (google.auth.credentials.Credentials): The - authorization credentials to attach to requests. These - credentials identify this application to the service. If none - are specified, the client will attempt to ascertain the - credentials from the environment. - This argument is mutually exclusive with providing a - transport instance to ``transport``; doing so will raise - an exception. - client_config (dict): DEPRECATED. A dictionary of call options for - each method. If not specified, the default configuration is used. - client_info (google.api_core.gapic_v1.client_info.ClientInfo): - The client info used to send a user-agent string along with - API requests. If ``None``, then default info will be used. - Generally, you only need to set this if you're developing - your own client library. - client_options (Union[dict, google.api_core.client_options.ClientOptions]): - Client options used to set user options on the client. API Endpoint - should be set through client_options. - """ - # Raise deprecation warnings for things we want to go away. 
- if client_config is not None: - warnings.warn( - "The `client_config` argument is deprecated.", - PendingDeprecationWarning, - stacklevel=2, - ) - else: - client_config = policy_tag_manager_serialization_client_config.config - - if channel: - warnings.warn( - "The `channel` argument is deprecated; use " "`transport` instead.", - PendingDeprecationWarning, - stacklevel=2, - ) - - api_endpoint = self.SERVICE_ADDRESS - if client_options: - if type(client_options) == dict: - client_options = google.api_core.client_options.from_dict( - client_options - ) - if client_options.api_endpoint: - api_endpoint = client_options.api_endpoint - - # Instantiate the transport. - # The transport is responsible for handling serialization and - # deserialization and actually sending data to the service. - if transport: - if callable(transport): - self.transport = transport( - credentials=credentials, - default_class=policy_tag_manager_serialization_grpc_transport.PolicyTagManagerSerializationGrpcTransport, - address=api_endpoint, - ) - else: - if credentials: - raise ValueError( - "Received both a transport instance and " - "credentials; these are mutually exclusive." - ) - self.transport = transport - else: - self.transport = policy_tag_manager_serialization_grpc_transport.PolicyTagManagerSerializationGrpcTransport( - address=api_endpoint, channel=channel, credentials=credentials - ) - - if client_info is None: - client_info = google.api_core.gapic_v1.client_info.ClientInfo( - gapic_version=_GAPIC_LIBRARY_VERSION - ) - else: - client_info.gapic_version = _GAPIC_LIBRARY_VERSION - self._client_info = client_info - - # Parse out the default settings for retry and timeout for each RPC - # from the client configuration. - # (Ordinarily, these are the defaults specified in the `*_config.py` - # file next to this one.) 
- self._method_configs = google.api_core.gapic_v1.config.parse_method_configs( - client_config["interfaces"][self._INTERFACE_NAME] - ) - - # Save a dictionary of cached API call functions. - # These are the actual callables which invoke the proper - # transport methods, wrapped with `wrap_method` to add retry, - # timeout, and the like. - self._inner_api_calls = {} - - # Service calls - def import_taxonomies( - self, - parent, - inline_source=None, - retry=google.api_core.gapic_v1.method.DEFAULT, - timeout=google.api_core.gapic_v1.method.DEFAULT, - metadata=None, - ): - """ - Imports all taxonomies and their policy tags to a project as new - taxonomies. - - This method provides a bulk taxonomy / policy tag creation using nested - proto structure. - - Example: - >>> from google.cloud import datacatalog_v1beta1 - >>> - >>> client = datacatalog_v1beta1.PolicyTagManagerSerializationClient() - >>> - >>> parent = client.location_path('[PROJECT]', '[LOCATION]') - >>> - >>> response = client.import_taxonomies(parent) - - Args: - parent (str): Required. Resource name of project that the newly created taxonomies will - belong to. - inline_source (Union[dict, ~google.cloud.datacatalog_v1beta1.types.InlineSource]): Inline source used for taxonomies import - - If a dict is provided, it must be of the same form as the protobuf - message :class:`~google.cloud.datacatalog_v1beta1.types.InlineSource` - retry (Optional[google.api_core.retry.Retry]): A retry object used - to retry requests. If ``None`` is specified, requests will - be retried using a default configuration. - timeout (Optional[float]): The amount of time, in seconds, to wait - for the request to complete. Note that if ``retry`` is - specified, the timeout applies to each individual attempt. - metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata - that is provided to the method. - - Returns: - A :class:`~google.cloud.datacatalog_v1beta1.types.ImportTaxonomiesResponse` instance. 
- - Raises: - google.api_core.exceptions.GoogleAPICallError: If the request - failed for any reason. - google.api_core.exceptions.RetryError: If the request failed due - to a retryable error and retry attempts failed. - ValueError: If the parameters are invalid. - """ - # Wrap the transport method to add retry and timeout logic. - if "import_taxonomies" not in self._inner_api_calls: - self._inner_api_calls[ - "import_taxonomies" - ] = google.api_core.gapic_v1.method.wrap_method( - self.transport.import_taxonomies, - default_retry=self._method_configs["ImportTaxonomies"].retry, - default_timeout=self._method_configs["ImportTaxonomies"].timeout, - client_info=self._client_info, - ) - - # Sanity check: We have some fields which are mutually exclusive; - # raise ValueError if more than one is sent. - google.api_core.protobuf_helpers.check_oneof(inline_source=inline_source) - - request = policytagmanagerserialization_pb2.ImportTaxonomiesRequest( - parent=parent, inline_source=inline_source - ) - if metadata is None: - metadata = [] - metadata = list(metadata) - try: - routing_header = [("parent", parent)] - except AttributeError: - pass - else: - routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata( - routing_header - ) - metadata.append(routing_metadata) - - return self._inner_api_calls["import_taxonomies"]( - request, retry=retry, timeout=timeout, metadata=metadata - ) - - def export_taxonomies( - self, - parent, - taxonomies, - serialized_taxonomies=None, - retry=google.api_core.gapic_v1.method.DEFAULT, - timeout=google.api_core.gapic_v1.method.DEFAULT, - metadata=None, - ): - """ - Exports all taxonomies and their policy tags in a project. - - This method generates SerializedTaxonomy protos with nested policy tags - that can be used as an input for future ImportTaxonomies calls. 
- - Example: - >>> from google.cloud import datacatalog_v1beta1 - >>> - >>> client = datacatalog_v1beta1.PolicyTagManagerSerializationClient() - >>> - >>> parent = client.location_path('[PROJECT]', '[LOCATION]') - >>> - >>> # TODO: Initialize `taxonomies`: - >>> taxonomies = [] - >>> - >>> response = client.export_taxonomies(parent, taxonomies) - - Args: - parent (str): Required. Resource name of the project that taxonomies to be exported - will share. - taxonomies (list[str]): Required. Resource names of the taxonomies to be exported. - serialized_taxonomies (bool): Export taxonomies as serialized taxonomies. - retry (Optional[google.api_core.retry.Retry]): A retry object used - to retry requests. If ``None`` is specified, requests will - be retried using a default configuration. - timeout (Optional[float]): The amount of time, in seconds, to wait - for the request to complete. Note that if ``retry`` is - specified, the timeout applies to each individual attempt. - metadata (Optional[Sequence[Tuple[str, str]]]): Additional metadata - that is provided to the method. - - Returns: - A :class:`~google.cloud.datacatalog_v1beta1.types.ExportTaxonomiesResponse` instance. - - Raises: - google.api_core.exceptions.GoogleAPICallError: If the request - failed for any reason. - google.api_core.exceptions.RetryError: If the request failed due - to a retryable error and retry attempts failed. - ValueError: If the parameters are invalid. - """ - # Wrap the transport method to add retry and timeout logic. - if "export_taxonomies" not in self._inner_api_calls: - self._inner_api_calls[ - "export_taxonomies" - ] = google.api_core.gapic_v1.method.wrap_method( - self.transport.export_taxonomies, - default_retry=self._method_configs["ExportTaxonomies"].retry, - default_timeout=self._method_configs["ExportTaxonomies"].timeout, - client_info=self._client_info, - ) - - # Sanity check: We have some fields which are mutually exclusive; - # raise ValueError if more than one is sent. 
- google.api_core.protobuf_helpers.check_oneof( - serialized_taxonomies=serialized_taxonomies - ) - - request = policytagmanagerserialization_pb2.ExportTaxonomiesRequest( - parent=parent, - taxonomies=taxonomies, - serialized_taxonomies=serialized_taxonomies, - ) - if metadata is None: - metadata = [] - metadata = list(metadata) - try: - routing_header = [("parent", parent)] - except AttributeError: - pass - else: - routing_metadata = google.api_core.gapic_v1.routing_header.to_grpc_metadata( - routing_header - ) - metadata.append(routing_metadata) - - return self._inner_api_calls["export_taxonomies"]( - request, retry=retry, timeout=timeout, metadata=metadata - ) diff --git a/google/cloud/datacatalog_v1beta1/gapic/policy_tag_manager_serialization_client_config.py b/google/cloud/datacatalog_v1beta1/gapic/policy_tag_manager_serialization_client_config.py deleted file mode 100644 index ec539229..00000000 --- a/google/cloud/datacatalog_v1beta1/gapic/policy_tag_manager_serialization_client_config.py +++ /dev/null @@ -1,52 +0,0 @@ -config = { - "interfaces": { - "google.cloud.datacatalog.v1beta1.PolicyTagManagerSerialization": { - "retry_codes": { - "retry_policy_1_codes": ["DEADLINE_EXCEEDED", "UNAVAILABLE"], - "no_retry_codes": [], - "no_retry_1_codes": [], - }, - "retry_params": { - "retry_policy_1_params": { - "initial_retry_delay_millis": 100, - "retry_delay_multiplier": 1.3, - "max_retry_delay_millis": 60000, - "initial_rpc_timeout_millis": 60000, - "rpc_timeout_multiplier": 1.0, - "max_rpc_timeout_millis": 60000, - "total_timeout_millis": 60000, - }, - "no_retry_params": { - "initial_retry_delay_millis": 0, - "retry_delay_multiplier": 0.0, - "max_retry_delay_millis": 0, - "initial_rpc_timeout_millis": 0, - "rpc_timeout_multiplier": 1.0, - "max_rpc_timeout_millis": 0, - "total_timeout_millis": 0, - }, - "no_retry_1_params": { - "initial_retry_delay_millis": 0, - "retry_delay_multiplier": 0.0, - "max_retry_delay_millis": 0, - "initial_rpc_timeout_millis": 60000, - 
"rpc_timeout_multiplier": 1.0, - "max_rpc_timeout_millis": 60000, - "total_timeout_millis": 60000, - }, - }, - "methods": { - "ImportTaxonomies": { - "timeout_millis": 60000, - "retry_codes_name": "no_retry_codes", - "retry_params_name": "no_retry_params", - }, - "ExportTaxonomies": { - "timeout_millis": 60000, - "retry_codes_name": "no_retry_codes", - "retry_params_name": "no_retry_params", - }, - }, - } - } -} diff --git a/google/cloud/datacatalog_v1beta1/gapic/transports/__init__.py b/google/cloud/datacatalog_v1beta1/gapic/transports/__init__.py deleted file mode 100644 index e69de29b..00000000 diff --git a/google/cloud/datacatalog_v1beta1/gapic/transports/data_catalog_grpc_transport.py b/google/cloud/datacatalog_v1beta1/gapic/transports/data_catalog_grpc_transport.py deleted file mode 100644 index 2b4a79e5..00000000 --- a/google/cloud/datacatalog_v1beta1/gapic/transports/data_catalog_grpc_transport.py +++ /dev/null @@ -1,591 +0,0 @@ -# -*- coding: utf-8 -*- -# -# Copyright 2020 Google LLC -# -# Licensed under the Apache License, Version 2.0 (the "License"); -# you may not use this file except in compliance with the License. -# You may obtain a copy of the License at -# -# https://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, software -# distributed under the License is distributed on an "AS IS" BASIS, -# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -# See the License for the specific language governing permissions and -# limitations under the License. - - -import google.api_core.grpc_helpers - -from google.cloud.datacatalog_v1beta1.proto import datacatalog_pb2_grpc - - -class DataCatalogGrpcTransport(object): - """gRPC transport class providing stubs for - google.cloud.datacatalog.v1beta1 DataCatalog API. - - The transport provides access to the raw gRPC stubs, - which can be used to take advantage of advanced - features of gRPC. 
- """ - - # The scopes needed to make gRPC calls to all of the methods defined - # in this service. - _OAUTH_SCOPES = ("https://www.googleapis.com/auth/cloud-platform",) - - def __init__( - self, channel=None, credentials=None, address="datacatalog.googleapis.com:443" - ): - """Instantiate the transport class. - - Args: - channel (grpc.Channel): A ``Channel`` instance through - which to make calls. This argument is mutually exclusive - with ``credentials``; providing both will raise an exception. - credentials (google.auth.credentials.Credentials): The - authorization credentials to attach to requests. These - credentials identify this application to the service. If none - are specified, the client will attempt to ascertain the - credentials from the environment. - address (str): The address where the service is hosted. - """ - # If both `channel` and `credentials` are specified, raise an - # exception (channels come with credentials baked in already). - if channel is not None and credentials is not None: - raise ValueError( - "The `channel` and `credentials` arguments are mutually " "exclusive." - ) - - # Create the channel. - if channel is None: - channel = self.create_channel( - address=address, - credentials=credentials, - options={ - "grpc.max_send_message_length": -1, - "grpc.max_receive_message_length": -1, - }.items(), - ) - - self._channel = channel - - # gRPC uses objects called "stubs" that are bound to the - # channel and provide a basic method for each RPC. - self._stubs = { - "data_catalog_stub": datacatalog_pb2_grpc.DataCatalogStub(channel) - } - - @classmethod - def create_channel( - cls, address="datacatalog.googleapis.com:443", credentials=None, **kwargs - ): - """Create and return a gRPC channel object. - - Args: - address (str): The host for the channel to use. - credentials (~.Credentials): The - authorization credentials to attach to requests. These - credentials identify this application to the service. 
If - none are specified, the client will attempt to ascertain - the credentials from the environment. - kwargs (dict): Keyword arguments, which are passed to the - channel creation. - - Returns: - grpc.Channel: A gRPC channel object. - """ - return google.api_core.grpc_helpers.create_channel( - address, credentials=credentials, scopes=cls._OAUTH_SCOPES, **kwargs - ) - - @property - def channel(self): - """The gRPC channel used by the transport. - - Returns: - grpc.Channel: A gRPC channel object. - """ - return self._channel - - @property - def search_catalog(self): - """Return the gRPC stub for :meth:`DataCatalogClient.search_catalog`. - - Searches Data Catalog for multiple resources like entries, tags that - match a query. - - This is a custom method - (https://cloud.google.com/apis/design/custom_methods) and does not - return the complete resource, only the resource identifier and high - level fields. Clients can subsequently call ``Get`` methods. - - Note that Data Catalog search queries do not guarantee full recall. - Query results that match your query may not be returned, even in - subsequent result pages. Also note that results returned (and not - returned) can vary across repeated search queries. - - See `Data Catalog Search - Syntax `__ - for more information. - - Returns: - Callable: A callable which accepts the appropriate - deserialized request object and returns a - deserialized response object. - """ - return self._stubs["data_catalog_stub"].SearchCatalog - - @property - def get_entry(self): - """Return the gRPC stub for :meth:`DataCatalogClient.get_entry`. - - Gets an entry. - - Returns: - Callable: A callable which accepts the appropriate - deserialized request object and returns a - deserialized response object. - """ - return self._stubs["data_catalog_stub"].GetEntry - - @property - def lookup_entry(self): - """Return the gRPC stub for :meth:`DataCatalogClient.lookup_entry`. - - Get an entry by target resource name. 
This method allows clients to use - the resource name from the source Google Cloud Platform service to get the - Data Catalog Entry. - - Returns: - Callable: A callable which accepts the appropriate - deserialized request object and returns a - deserialized response object. - """ - return self._stubs["data_catalog_stub"].LookupEntry - - @property - def create_entry_group(self): - """Return the gRPC stub for :meth:`DataCatalogClient.create_entry_group`. - - A maximum of 10,000 entry groups may be created per organization - across all locations. - - Users should enable the Data Catalog API in the project identified by - the ``parent`` parameter (see [Data Catalog Resource Project] - (https://cloud.google.com/data-catalog/docs/concepts/resource-project) - for more information). - - Returns: - Callable: A callable which accepts the appropriate - deserialized request object and returns a - deserialized response object. - """ - return self._stubs["data_catalog_stub"].CreateEntryGroup - - @property - def update_entry_group(self): - """Return the gRPC stub for :meth:`DataCatalogClient.update_entry_group`. - - Updates an EntryGroup. The user should enable the Data Catalog API - in the project identified by the ``entry_group.name`` parameter (see - [Data Catalog Resource Project] - (https://cloud.google.com/data-catalog/docs/concepts/resource-project) - for more information). - - Returns: - Callable: A callable which accepts the appropriate - deserialized request object and returns a - deserialized response object. - """ - return self._stubs["data_catalog_stub"].UpdateEntryGroup - - @property - def get_entry_group(self): - """Return the gRPC stub for :meth:`DataCatalogClient.get_entry_group`. - - Gets an EntryGroup. - - Returns: - Callable: A callable which accepts the appropriate - deserialized request object and returns a - deserialized response object. 
- """ - return self._stubs["data_catalog_stub"].GetEntryGroup - - @property - def delete_entry_group(self): - """Return the gRPC stub for :meth:`DataCatalogClient.delete_entry_group`. - - Deletes an EntryGroup. Only entry groups that do not contain entries - can be deleted. Users should enable the Data Catalog API in the project - identified by the ``name`` parameter (see [Data Catalog Resource - Project] - (https://cloud.google.com/data-catalog/docs/concepts/resource-project) - for more information). - - Returns: - Callable: A callable which accepts the appropriate - deserialized request object and returns a - deserialized response object. - """ - return self._stubs["data_catalog_stub"].DeleteEntryGroup - - @property - def list_entry_groups(self): - """Return the gRPC stub for :meth:`DataCatalogClient.list_entry_groups`. - - Lists entry groups. - - Returns: - Callable: A callable which accepts the appropriate - deserialized request object and returns a - deserialized response object. - """ - return self._stubs["data_catalog_stub"].ListEntryGroups - - @property - def create_entry(self): - """Return the gRPC stub for :meth:`DataCatalogClient.create_entry`. - - Creates an entry. Only entries of 'FILESET' type or user-specified - type can be created. - - Users should enable the Data Catalog API in the project identified by - the ``parent`` parameter (see [Data Catalog Resource Project] - (https://cloud.google.com/data-catalog/docs/concepts/resource-project) - for more information). - - A maximum of 100,000 entries may be created per entry group. - - Returns: - Callable: A callable which accepts the appropriate - deserialized request object and returns a - deserialized response object. - """ - return self._stubs["data_catalog_stub"].CreateEntry - - @property - def update_entry(self): - """Return the gRPC stub for :meth:`DataCatalogClient.update_entry`. - - Updates an existing entry. 
Users should enable the Data Catalog API - in the project identified by the ``entry.name`` parameter (see [Data - Catalog Resource Project] - (https://cloud.google.com/data-catalog/docs/concepts/resource-project) - for more information). - - Returns: - Callable: A callable which accepts the appropriate - deserialized request object and returns a - deserialized response object. - """ - return self._stubs["data_catalog_stub"].UpdateEntry - - @property - def delete_entry(self): - """Return the gRPC stub for :meth:`DataCatalogClient.delete_entry`. - - Deletes an existing entry. Only entries created through - ``CreateEntry`` method can be deleted. Users should enable the Data - Catalog API in the project identified by the ``name`` parameter (see - [Data Catalog Resource Project] - (https://cloud.google.com/data-catalog/docs/concepts/resource-project) - for more information). - - Returns: - Callable: A callable which accepts the appropriate - deserialized request object and returns a - deserialized response object. - """ - return self._stubs["data_catalog_stub"].DeleteEntry - - @property - def list_entries(self): - """Return the gRPC stub for :meth:`DataCatalogClient.list_entries`. - - Lists entries. - - Returns: - Callable: A callable which accepts the appropriate - deserialized request object and returns a - deserialized response object. - """ - return self._stubs["data_catalog_stub"].ListEntries - - @property - def create_tag_template(self): - """Return the gRPC stub for :meth:`DataCatalogClient.create_tag_template`. - - Creates a tag template. The user should enable the Data Catalog API - in the project identified by the ``parent`` parameter (see `Data Catalog - Resource - Project `__ - for more information). - - Returns: - Callable: A callable which accepts the appropriate - deserialized request object and returns a - deserialized response object. 
- """ - return self._stubs["data_catalog_stub"].CreateTagTemplate - - @property - def get_tag_template(self): - """Return the gRPC stub for :meth:`DataCatalogClient.get_tag_template`. - - Gets a tag template. - - Returns: - Callable: A callable which accepts the appropriate - deserialized request object and returns a - deserialized response object. - """ - return self._stubs["data_catalog_stub"].GetTagTemplate - - @property - def update_tag_template(self): - """Return the gRPC stub for :meth:`DataCatalogClient.update_tag_template`. - - Updates a tag template. This method cannot be used to update the - fields of a template. The tag template fields are represented as - separate resources and should be updated using their own - create/update/delete methods. Users should enable the Data Catalog API - in the project identified by the ``tag_template.name`` parameter (see - [Data Catalog Resource Project] - (https://cloud.google.com/data-catalog/docs/concepts/resource-project) - for more information). - - Returns: - Callable: A callable which accepts the appropriate - deserialized request object and returns a - deserialized response object. - """ - return self._stubs["data_catalog_stub"].UpdateTagTemplate - - @property - def delete_tag_template(self): - """Return the gRPC stub for :meth:`DataCatalogClient.delete_tag_template`. - - Deletes a tag template and all tags using the template. Users should - enable the Data Catalog API in the project identified by the ``name`` - parameter (see [Data Catalog Resource Project] - (https://cloud.google.com/data-catalog/docs/concepts/resource-project) - for more information). - - Returns: - Callable: A callable which accepts the appropriate - deserialized request object and returns a - deserialized response object. - """ - return self._stubs["data_catalog_stub"].DeleteTagTemplate - - @property - def create_tag_template_field(self): - """Return the gRPC stub for :meth:`DataCatalogClient.create_tag_template_field`. 
- - Creates a field in a tag template. The user should enable the Data - Catalog API in the project identified by the ``parent`` parameter (see - `Data Catalog Resource - Project `__ - for more information). - - Returns: - Callable: A callable which accepts the appropriate - deserialized request object and returns a - deserialized response object. - """ - return self._stubs["data_catalog_stub"].CreateTagTemplateField - - @property - def update_tag_template_field(self): - """Return the gRPC stub for :meth:`DataCatalogClient.update_tag_template_field`. - - Updates a field in a tag template. This method cannot be used to - update the field type. Users should enable the Data Catalog API in the - project identified by the ``name`` parameter (see [Data Catalog Resource - Project] - (https://cloud.google.com/data-catalog/docs/concepts/resource-project) - for more information). - - Returns: - Callable: A callable which accepts the appropriate - deserialized request object and returns a - deserialized response object. - """ - return self._stubs["data_catalog_stub"].UpdateTagTemplateField - - @property - def rename_tag_template_field(self): - """Return the gRPC stub for :meth:`DataCatalogClient.rename_tag_template_field`. - - Renames a field in a tag template. The user should enable the Data - Catalog API in the project identified by the ``name`` parameter (see - `Data Catalog Resource - Project `__ - for more information). - - Returns: - Callable: A callable which accepts the appropriate - deserialized request object and returns a - deserialized response object. - """ - return self._stubs["data_catalog_stub"].RenameTagTemplateField - - @property - def delete_tag_template_field(self): - """Return the gRPC stub for :meth:`DataCatalogClient.delete_tag_template_field`. - - Deletes a field in a tag template and all uses of that field. 
Users - should enable the Data Catalog API in the project identified by the - ``name`` parameter (see [Data Catalog Resource Project] - (https://cloud.google.com/data-catalog/docs/concepts/resource-project) - for more information). - - Returns: - Callable: A callable which accepts the appropriate - deserialized request object and returns a - deserialized response object. - """ - return self._stubs["data_catalog_stub"].DeleteTagTemplateField - - @property - def create_tag(self): - """Return the gRPC stub for :meth:`DataCatalogClient.create_tag`. - - Creates a tag on an ``Entry``. Note: The project identified by the - ``parent`` parameter for the - `tag `__ - and the `tag - template `__ - used to create the tag must be from the same organization. - - Returns: - Callable: A callable which accepts the appropriate - deserialized request object and returns a - deserialized response object. - """ - return self._stubs["data_catalog_stub"].CreateTag - - @property - def update_tag(self): - """Return the gRPC stub for :meth:`DataCatalogClient.update_tag`. - - Updates an existing tag. - - Returns: - Callable: A callable which accepts the appropriate - deserialized request object and returns a - deserialized response object. - """ - return self._stubs["data_catalog_stub"].UpdateTag - - @property - def delete_tag(self): - """Return the gRPC stub for :meth:`DataCatalogClient.delete_tag`. - - Deletes a tag. - - Returns: - Callable: A callable which accepts the appropriate - deserialized request object and returns a - deserialized response object. - """ - return self._stubs["data_catalog_stub"].DeleteTag - - @property - def list_tags(self): - """Return the gRPC stub for :meth:`DataCatalogClient.list_tags`. - - Lists the tags on an ``Entry``. - - Returns: - Callable: A callable which accepts the appropriate - deserialized request object and returns a - deserialized response object. 
- """ - return self._stubs["data_catalog_stub"].ListTags - - @property - def set_iam_policy(self): - """Return the gRPC stub for :meth:`DataCatalogClient.set_iam_policy`. - - Sets the access control policy for a resource. Replaces any existing - policy. Supported resources are: - - - Tag templates. - - Entries. - - Entry groups. Note, this method cannot be used to manage policies for - BigQuery, Pub/Sub and any external Google Cloud Platform resources - synced to Data Catalog. - - Callers must have following Google IAM permission - - - ``datacatalog.tagTemplates.setIamPolicy`` to set policies on tag - templates. - - ``datacatalog.entries.setIamPolicy`` to set policies on entries. - - ``datacatalog.entryGroups.setIamPolicy`` to set policies on entry - groups. - - Returns: - Callable: A callable which accepts the appropriate - deserialized request object and returns a - deserialized response object. - """ - return self._stubs["data_catalog_stub"].SetIamPolicy - - @property - def get_iam_policy(self): - """Return the gRPC stub for :meth:`DataCatalogClient.get_iam_policy`. - - Gets the access control policy for a resource. A ``NOT_FOUND`` error - is returned if the resource does not exist. An empty policy is returned - if the resource exists but does not have a policy set on it. - - Supported resources are: - - - Tag templates. - - Entries. - - Entry groups. Note, this method cannot be used to manage policies for - BigQuery, Pub/Sub and any external Google Cloud Platform resources - synced to Data Catalog. - - Callers must have following Google IAM permission - - - ``datacatalog.tagTemplates.getIamPolicy`` to get policies on tag - templates. - - ``datacatalog.entries.getIamPolicy`` to get policies on entries. - - ``datacatalog.entryGroups.getIamPolicy`` to get policies on entry - groups. - - Returns: - Callable: A callable which accepts the appropriate - deserialized request object and returns a - deserialized response object. 
- """ - return self._stubs["data_catalog_stub"].GetIamPolicy - - @property - def test_iam_permissions(self): - """Return the gRPC stub for :meth:`DataCatalogClient.test_iam_permissions`. - - Returns the caller's permissions on a resource. If the resource does - not exist, an empty set of permissions is returned (We don't return a - ``NOT_FOUND`` error). - - Supported resources are: - - - Tag templates. - - Entries. - - Entry groups. Note, this method cannot be used to manage policies for - BigQuery, Pub/Sub and any external Google Cloud Platform resources - synced to Data Catalog. - - A caller is not required to have Google IAM permission to make this - request. - - Returns: - Callable: A callable which accepts the appropriate - deserialized request object and returns a - deserialized response object. - """ - return self._stubs["data_catalog_stub"].TestIamPermissions diff --git a/google/cloud/datacatalog_v1beta1/gapic/transports/policy_tag_manager_grpc_transport.py b/google/cloud/datacatalog_v1beta1/gapic/transports/policy_tag_manager_grpc_transport.py deleted file mode 100644 index 71a1ca50..00000000 --- a/google/cloud/datacatalog_v1beta1/gapic/transports/policy_tag_manager_grpc_transport.py +++ /dev/null @@ -1,282 +0,0 @@ -# -*- coding: utf-8 -*- -# -# Copyright 2020 Google LLC -# -# Licensed under the Apache License, Version 2.0 (the "License"); -# you may not use this file except in compliance with the License. -# You may obtain a copy of the License at -# -# https://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, software -# distributed under the License is distributed on an "AS IS" BASIS, -# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -# See the License for the specific language governing permissions and -# limitations under the License. 
- - -import google.api_core.grpc_helpers - -from google.cloud.datacatalog_v1beta1.proto import policytagmanager_pb2_grpc - - -class PolicyTagManagerGrpcTransport(object): - """gRPC transport class providing stubs for - google.cloud.datacatalog.v1beta1 PolicyTagManager API. - - The transport provides access to the raw gRPC stubs, - which can be used to take advantage of advanced - features of gRPC. - """ - - # The scopes needed to make gRPC calls to all of the methods defined - # in this service. - _OAUTH_SCOPES = ("https://www.googleapis.com/auth/cloud-platform",) - - def __init__( - self, channel=None, credentials=None, address="datacatalog.googleapis.com:443" - ): - """Instantiate the transport class. - - Args: - channel (grpc.Channel): A ``Channel`` instance through - which to make calls. This argument is mutually exclusive - with ``credentials``; providing both will raise an exception. - credentials (google.auth.credentials.Credentials): The - authorization credentials to attach to requests. These - credentials identify this application to the service. If none - are specified, the client will attempt to ascertain the - credentials from the environment. - address (str): The address where the service is hosted. - """ - # If both `channel` and `credentials` are specified, raise an - # exception (channels come with credentials baked in already). - if channel is not None and credentials is not None: - raise ValueError( - "The `channel` and `credentials` arguments are mutually " "exclusive." - ) - - # Create the channel. - if channel is None: - channel = self.create_channel( - address=address, - credentials=credentials, - options={ - "grpc.max_send_message_length": -1, - "grpc.max_receive_message_length": -1, - }.items(), - ) - - self._channel = channel - - # gRPC uses objects called "stubs" that are bound to the - # channel and provide a basic method for each RPC. 
- self._stubs = { - "policy_tag_manager_stub": policytagmanager_pb2_grpc.PolicyTagManagerStub( - channel - ) - } - - @classmethod - def create_channel( - cls, address="datacatalog.googleapis.com:443", credentials=None, **kwargs - ): - """Create and return a gRPC channel object. - - Args: - address (str): The host for the channel to use. - credentials (~.Credentials): The - authorization credentials to attach to requests. These - credentials identify this application to the service. If - none are specified, the client will attempt to ascertain - the credentials from the environment. - kwargs (dict): Keyword arguments, which are passed to the - channel creation. - - Returns: - grpc.Channel: A gRPC channel object. - """ - return google.api_core.grpc_helpers.create_channel( - address, credentials=credentials, scopes=cls._OAUTH_SCOPES, **kwargs - ) - - @property - def channel(self): - """The gRPC channel used by the transport. - - Returns: - grpc.Channel: A gRPC channel object. - """ - return self._channel - - @property - def create_taxonomy(self): - """Return the gRPC stub for :meth:`PolicyTagManagerClient.create_taxonomy`. - - Creates a taxonomy in the specified project. - - Returns: - Callable: A callable which accepts the appropriate - deserialized request object and returns a - deserialized response object. - """ - return self._stubs["policy_tag_manager_stub"].CreateTaxonomy - - @property - def delete_taxonomy(self): - """Return the gRPC stub for :meth:`PolicyTagManagerClient.delete_taxonomy`. - - Deletes a taxonomy. This operation will also delete all - policy tags in this taxonomy along with their associated policies. - - Returns: - Callable: A callable which accepts the appropriate - deserialized request object and returns a - deserialized response object. - """ - return self._stubs["policy_tag_manager_stub"].DeleteTaxonomy - - @property - def update_taxonomy(self): - """Return the gRPC stub for :meth:`PolicyTagManagerClient.update_taxonomy`. 
- - Updates a taxonomy. - - Returns: - Callable: A callable which accepts the appropriate - deserialized request object and returns a - deserialized response object. - """ - return self._stubs["policy_tag_manager_stub"].UpdateTaxonomy - - @property - def list_taxonomies(self): - """Return the gRPC stub for :meth:`PolicyTagManagerClient.list_taxonomies`. - - Lists all taxonomies in a project in a particular location that the caller - has permission to view. - - Returns: - Callable: A callable which accepts the appropriate - deserialized request object and returns a - deserialized response object. - """ - return self._stubs["policy_tag_manager_stub"].ListTaxonomies - - @property - def get_taxonomy(self): - """Return the gRPC stub for :meth:`PolicyTagManagerClient.get_taxonomy`. - - Gets a taxonomy. - - Returns: - Callable: A callable which accepts the appropriate - deserialized request object and returns a - deserialized response object. - """ - return self._stubs["policy_tag_manager_stub"].GetTaxonomy - - @property - def create_policy_tag(self): - """Return the gRPC stub for :meth:`PolicyTagManagerClient.create_policy_tag`. - - Creates a policy tag in the specified taxonomy. - - Returns: - Callable: A callable which accepts the appropriate - deserialized request object and returns a - deserialized response object. - """ - return self._stubs["policy_tag_manager_stub"].CreatePolicyTag - - @property - def delete_policy_tag(self): - """Return the gRPC stub for :meth:`PolicyTagManagerClient.delete_policy_tag`. - - Deletes a policy tag. Also deletes all of its descendant policy tags. - - Returns: - Callable: A callable which accepts the appropriate - deserialized request object and returns a - deserialized response object. - """ - return self._stubs["policy_tag_manager_stub"].DeletePolicyTag - - @property - def update_policy_tag(self): - """Return the gRPC stub for :meth:`PolicyTagManagerClient.update_policy_tag`. - - Updates a policy tag. 
- - Returns: - Callable: A callable which accepts the appropriate - deserialized request object and returns a - deserialized response object. - """ - return self._stubs["policy_tag_manager_stub"].UpdatePolicyTag - - @property - def list_policy_tags(self): - """Return the gRPC stub for :meth:`PolicyTagManagerClient.list_policy_tags`. - - Lists all policy tags in a taxonomy. - - Returns: - Callable: A callable which accepts the appropriate - deserialized request object and returns a - deserialized response object. - """ - return self._stubs["policy_tag_manager_stub"].ListPolicyTags - - @property - def get_policy_tag(self): - """Return the gRPC stub for :meth:`PolicyTagManagerClient.get_policy_tag`. - - Gets a policy tag. - - Returns: - Callable: A callable which accepts the appropriate - deserialized request object and returns a - deserialized response object. - """ - return self._stubs["policy_tag_manager_stub"].GetPolicyTag - - @property - def get_iam_policy(self): - """Return the gRPC stub for :meth:`PolicyTagManagerClient.get_iam_policy`. - - Gets the IAM policy for a taxonomy or a policy tag. - - Returns: - Callable: A callable which accepts the appropriate - deserialized request object and returns a - deserialized response object. - """ - return self._stubs["policy_tag_manager_stub"].GetIamPolicy - - @property - def set_iam_policy(self): - """Return the gRPC stub for :meth:`PolicyTagManagerClient.set_iam_policy`. - - Sets the IAM policy for a taxonomy or a policy tag. - - Returns: - Callable: A callable which accepts the appropriate - deserialized request object and returns a - deserialized response object. - """ - return self._stubs["policy_tag_manager_stub"].SetIamPolicy - - @property - def test_iam_permissions(self): - """Return the gRPC stub for :meth:`PolicyTagManagerClient.test_iam_permissions`. - - Returns the permissions that a caller has on the specified taxonomy or - policy tag. 
- - Returns: - Callable: A callable which accepts the appropriate - deserialized request object and returns a - deserialized response object. - """ - return self._stubs["policy_tag_manager_stub"].TestIamPermissions diff --git a/google/cloud/datacatalog_v1beta1/gapic/transports/policy_tag_manager_serialization_grpc_transport.py b/google/cloud/datacatalog_v1beta1/gapic/transports/policy_tag_manager_serialization_grpc_transport.py deleted file mode 100644 index a3cfd42a..00000000 --- a/google/cloud/datacatalog_v1beta1/gapic/transports/policy_tag_manager_serialization_grpc_transport.py +++ /dev/null @@ -1,145 +0,0 @@ -# -*- coding: utf-8 -*- -# -# Copyright 2020 Google LLC -# -# Licensed under the Apache License, Version 2.0 (the "License"); -# you may not use this file except in compliance with the License. -# You may obtain a copy of the License at -# -# https://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, software -# distributed under the License is distributed on an "AS IS" BASIS, -# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -# See the License for the specific language governing permissions and -# limitations under the License. - - -import google.api_core.grpc_helpers - -from google.cloud.datacatalog_v1beta1.proto import ( - policytagmanagerserialization_pb2_grpc, -) - - -class PolicyTagManagerSerializationGrpcTransport(object): - """gRPC transport class providing stubs for - google.cloud.datacatalog.v1beta1 PolicyTagManagerSerialization API. - - The transport provides access to the raw gRPC stubs, - which can be used to take advantage of advanced - features of gRPC. - """ - - # The scopes needed to make gRPC calls to all of the methods defined - # in this service. - _OAUTH_SCOPES = ("https://www.googleapis.com/auth/cloud-platform",) - - def __init__( - self, channel=None, credentials=None, address="datacatalog.googleapis.com:443" - ): - """Instantiate the transport class. 
- - Args: - channel (grpc.Channel): A ``Channel`` instance through - which to make calls. This argument is mutually exclusive - with ``credentials``; providing both will raise an exception. - credentials (google.auth.credentials.Credentials): The - authorization credentials to attach to requests. These - credentials identify this application to the service. If none - are specified, the client will attempt to ascertain the - credentials from the environment. - address (str): The address where the service is hosted. - """ - # If both `channel` and `credentials` are specified, raise an - # exception (channels come with credentials baked in already). - if channel is not None and credentials is not None: - raise ValueError( - "The `channel` and `credentials` arguments are mutually " "exclusive." - ) - - # Create the channel. - if channel is None: - channel = self.create_channel( - address=address, - credentials=credentials, - options={ - "grpc.max_send_message_length": -1, - "grpc.max_receive_message_length": -1, - }.items(), - ) - - self._channel = channel - - # gRPC uses objects called "stubs" that are bound to the - # channel and provide a basic method for each RPC. - self._stubs = { - "policy_tag_manager_serialization_stub": policytagmanagerserialization_pb2_grpc.PolicyTagManagerSerializationStub( - channel - ) - } - - @classmethod - def create_channel( - cls, address="datacatalog.googleapis.com:443", credentials=None, **kwargs - ): - """Create and return a gRPC channel object. - - Args: - address (str): The host for the channel to use. - credentials (~.Credentials): The - authorization credentials to attach to requests. These - credentials identify this application to the service. If - none are specified, the client will attempt to ascertain - the credentials from the environment. - kwargs (dict): Keyword arguments, which are passed to the - channel creation. - - Returns: - grpc.Channel: A gRPC channel object. 
- """ - return google.api_core.grpc_helpers.create_channel( - address, credentials=credentials, scopes=cls._OAUTH_SCOPES, **kwargs - ) - - @property - def channel(self): - """The gRPC channel used by the transport. - - Returns: - grpc.Channel: A gRPC channel object. - """ - return self._channel - - @property - def import_taxonomies(self): - """Return the gRPC stub for :meth:`PolicyTagManagerSerializationClient.import_taxonomies`. - - Imports all taxonomies and their policy tags to a project as new - taxonomies. - - This method provides a bulk taxonomy / policy tag creation using nested - proto structure. - - Returns: - Callable: A callable which accepts the appropriate - deserialized request object and returns a - deserialized response object. - """ - return self._stubs["policy_tag_manager_serialization_stub"].ImportTaxonomies - - @property - def export_taxonomies(self): - """Return the gRPC stub for :meth:`PolicyTagManagerSerializationClient.export_taxonomies`. - - Exports all taxonomies and their policy tags in a project. - - This method generates SerializedTaxonomy protos with nested policy tags - that can be used as an input for future ImportTaxonomies calls. - - Returns: - Callable: A callable which accepts the appropriate - deserialized request object and returns a - deserialized response object. 
- """ - return self._stubs["policy_tag_manager_serialization_stub"].ExportTaxonomies diff --git a/google/cloud/datacatalog_v1beta1/proto/__init__.py b/google/cloud/datacatalog_v1beta1/proto/__init__.py deleted file mode 100644 index e69de29b..00000000 diff --git a/google/cloud/datacatalog_v1beta1/proto/common.proto b/google/cloud/datacatalog_v1beta1/proto/common.proto new file mode 100644 index 00000000..a759b371 --- /dev/null +++ b/google/cloud/datacatalog_v1beta1/proto/common.proto @@ -0,0 +1,38 @@ +// Copyright 2020 Google LLC +// +// Licensed under the Apache License, Version 2.0 (the "License"); +// you may not use this file except in compliance with the License. +// You may obtain a copy of the License at +// +// http://www.apache.org/licenses/LICENSE-2.0 +// +// Unless required by applicable law or agreed to in writing, software +// distributed under the License is distributed on an "AS IS" BASIS, +// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +// See the License for the specific language governing permissions and +// limitations under the License. + +syntax = "proto3"; + +package google.cloud.datacatalog.v1beta1; + +option cc_enable_arenas = true; +option csharp_namespace = "Google.Cloud.DataCatalog.V1Beta1"; +option go_package = "google.golang.org/genproto/googleapis/cloud/datacatalog/v1beta1;datacatalog"; +option java_multiple_files = true; +option java_package = "com.google.cloud.datacatalog.v1beta1"; +option php_namespace = "Google\\Cloud\\DataCatalog\\V1beta1"; +option ruby_package = "Google::Cloud::DataCatalog::V1beta1"; + +// This enum describes all the possible systems that Data Catalog integrates +// with. +enum IntegratedSystem { + // Default unknown system. + INTEGRATED_SYSTEM_UNSPECIFIED = 0; + + // BigQuery. + BIGQUERY = 1; + + // Cloud Pub/Sub. 
+ CLOUD_PUBSUB = 2; +} diff --git a/google/cloud/datacatalog_v1beta1/proto/common_pb2.py b/google/cloud/datacatalog_v1beta1/proto/common_pb2.py deleted file mode 100644 index b07c70cc..00000000 --- a/google/cloud/datacatalog_v1beta1/proto/common_pb2.py +++ /dev/null @@ -1,75 +0,0 @@ -# -*- coding: utf-8 -*- -# Generated by the protocol buffer compiler. DO NOT EDIT! -# source: google/cloud/datacatalog_v1beta1/proto/common.proto - -from google.protobuf.internal import enum_type_wrapper -from google.protobuf import descriptor as _descriptor -from google.protobuf import message as _message -from google.protobuf import reflection as _reflection -from google.protobuf import symbol_database as _symbol_database - -# @@protoc_insertion_point(imports) - -_sym_db = _symbol_database.Default() - - -DESCRIPTOR = _descriptor.FileDescriptor( - name="google/cloud/datacatalog_v1beta1/proto/common.proto", - package="google.cloud.datacatalog.v1beta1", - syntax="proto3", - serialized_options=b"\n$com.google.cloud.datacatalog.v1beta1P\001ZKgoogle.golang.org/genproto/googleapis/cloud/datacatalog/v1beta1;datacatalog\370\001\001\252\002 Google.Cloud.DataCatalog.V1Beta1\312\002 Google\\Cloud\\DataCatalog\\V1beta1\352\002#Google::Cloud::DataCatalog::V1beta1", - create_key=_descriptor._internal_create_key, - serialized_pb=b"\n3google/cloud/datacatalog_v1beta1/proto/common.proto\x12 google.cloud.datacatalog.v1beta1*U\n\x10IntegratedSystem\x12!\n\x1dINTEGRATED_SYSTEM_UNSPECIFIED\x10\x00\x12\x0c\n\x08\x42IGQUERY\x10\x01\x12\x10\n\x0c\x43LOUD_PUBSUB\x10\x02\x42\xe4\x01\n$com.google.cloud.datacatalog.v1beta1P\x01ZKgoogle.golang.org/genproto/googleapis/cloud/datacatalog/v1beta1;datacatalog\xf8\x01\x01\xaa\x02 Google.Cloud.DataCatalog.V1Beta1\xca\x02 Google\\Cloud\\DataCatalog\\V1beta1\xea\x02#Google::Cloud::DataCatalog::V1beta1b\x06proto3", -) - -_INTEGRATEDSYSTEM = _descriptor.EnumDescriptor( - name="IntegratedSystem", - full_name="google.cloud.datacatalog.v1beta1.IntegratedSystem", - 
filename=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - values=[ - _descriptor.EnumValueDescriptor( - name="INTEGRATED_SYSTEM_UNSPECIFIED", - index=0, - number=0, - serialized_options=None, - type=None, - create_key=_descriptor._internal_create_key, - ), - _descriptor.EnumValueDescriptor( - name="BIGQUERY", - index=1, - number=1, - serialized_options=None, - type=None, - create_key=_descriptor._internal_create_key, - ), - _descriptor.EnumValueDescriptor( - name="CLOUD_PUBSUB", - index=2, - number=2, - serialized_options=None, - type=None, - create_key=_descriptor._internal_create_key, - ), - ], - containing_type=None, - serialized_options=None, - serialized_start=89, - serialized_end=174, -) -_sym_db.RegisterEnumDescriptor(_INTEGRATEDSYSTEM) - -IntegratedSystem = enum_type_wrapper.EnumTypeWrapper(_INTEGRATEDSYSTEM) -INTEGRATED_SYSTEM_UNSPECIFIED = 0 -BIGQUERY = 1 -CLOUD_PUBSUB = 2 - - -DESCRIPTOR.enum_types_by_name["IntegratedSystem"] = _INTEGRATEDSYSTEM -_sym_db.RegisterFileDescriptor(DESCRIPTOR) - - -DESCRIPTOR._options = None -# @@protoc_insertion_point(module_scope) diff --git a/google/cloud/datacatalog_v1beta1/proto/common_pb2_grpc.py b/google/cloud/datacatalog_v1beta1/proto/common_pb2_grpc.py deleted file mode 100644 index 8a939394..00000000 --- a/google/cloud/datacatalog_v1beta1/proto/common_pb2_grpc.py +++ /dev/null @@ -1,3 +0,0 @@ -# Generated by the gRPC Python protocol compiler plugin. DO NOT EDIT! -"""Client and server classes corresponding to protobuf-defined services.""" -import grpc diff --git a/google/cloud/datacatalog_v1beta1/proto/datacatalog.proto b/google/cloud/datacatalog_v1beta1/proto/datacatalog.proto index 8b67be1a..038e2203 100644 --- a/google/cloud/datacatalog_v1beta1/proto/datacatalog.proto +++ b/google/cloud/datacatalog_v1beta1/proto/datacatalog.proto @@ -1,4 +1,4 @@ -// Copyright 2019 Google LLC. 
+// Copyright 2020 Google LLC // // Licensed under the Apache License, Version 2.0 (the "License"); // you may not use this file except in compliance with the License. @@ -11,7 +11,6 @@ // WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. // See the License for the specific language governing permissions and // limitations under the License. -// syntax = "proto3"; @@ -21,6 +20,7 @@ import "google/api/annotations.proto"; import "google/api/client.proto"; import "google/api/field_behavior.proto"; import "google/api/resource.proto"; +import "google/cloud/datacatalog/v1beta1/common.proto"; import "google/cloud/datacatalog/v1beta1/gcs_fileset_spec.proto"; import "google/cloud/datacatalog/v1beta1/schema.proto"; import "google/cloud/datacatalog/v1beta1/search.proto"; @@ -33,16 +33,18 @@ import "google/protobuf/empty.proto"; import "google/protobuf/field_mask.proto"; option cc_enable_arenas = true; +option csharp_namespace = "Google.Cloud.DataCatalog.V1Beta1"; option go_package = "google.golang.org/genproto/googleapis/cloud/datacatalog/v1beta1;datacatalog"; option java_multiple_files = true; -option java_package = "com.google.cloud.datacatalog"; +option java_package = "com.google.cloud.datacatalog.v1beta1"; +option php_namespace = "Google\\Cloud\\DataCatalog\\V1beta1"; +option ruby_package = "Google::Cloud::DataCatalog::V1beta1"; // Data Catalog API service allows clients to discover, understand, and manage // their data. service DataCatalog { option (google.api.default_host) = "datacatalog.googleapis.com"; - option (google.api.oauth_scopes) = - "https://www.googleapis.com/auth/cloud-platform"; + option (google.api.oauth_scopes) = "https://www.googleapis.com/auth/cloud-platform"; // Searches Data Catalog for multiple resources like entries, tags that // match a query. @@ -52,13 +54,14 @@ service DataCatalog { // the complete resource, only the resource identifier and high level // fields. Clients can subsequentally call `Get` methods. 
// - // Note that searches do not have full recall. There may be results that match - // your query but are not returned, even in subsequent pages of results. These - // missing results may vary across repeated calls to search. Do not rely on - // this method if you need to guarantee full recall. + // Note that Data Catalog search queries do not guarantee full recall. Query + // results that match your query may not be returned, even in subsequent + // result pages. Also note that results returned (and not returned) can vary + // across repeated search queries. // // See [Data Catalog Search - // Syntax](/data-catalog/docs/how-to/search-reference) for more information. + // Syntax](https://cloud.google.com/data-catalog/docs/how-to/search-reference) + // for more information. rpc SearchCatalog(SearchCatalogRequest) returns (SearchCatalogResponse) { option (google.api.http) = { post: "/v1beta1/catalog:search" @@ -67,11 +70,13 @@ service DataCatalog { option (google.api.method_signature) = "scope,query"; } - // Alpha feature. - // Creates an EntryGroup. - // The user should enable the Data Catalog API in the project identified by + // A maximum of 10,000 entry groups may be created per organization across all + // locations. + // + // Users should enable the Data Catalog API in the project identified by // the `parent` parameter (see [Data Catalog Resource Project] - // (/data-catalog/docs/concepts/resource-project) for more information). + // (https://cloud.google.com/data-catalog/docs/concepts/resource-project) for + // more information). rpc CreateEntryGroup(CreateEntryGroupRequest) returns (EntryGroup) { option (google.api.http) = { post: "/v1beta1/{parent=projects/*/locations/*}/entryGroups" @@ -80,7 +85,20 @@ service DataCatalog { option (google.api.method_signature) = "parent,entry_group_id,entry_group"; } - // Alpha feature. + // Updates an EntryGroup. 
The user should enable the Data Catalog API in the + // project identified by the `entry_group.name` parameter (see [Data Catalog + // Resource Project] + // (https://cloud.google.com/data-catalog/docs/concepts/resource-project) for + // more information). + rpc UpdateEntryGroup(UpdateEntryGroupRequest) returns (EntryGroup) { + option (google.api.http) = { + patch: "/v1beta1/{entry_group.name=projects/*/locations/*/entryGroups/*}" + body: "entry_group" + }; + option (google.api.method_signature) = "entry_group"; + option (google.api.method_signature) = "entry_group,update_mask"; + } + // Gets an EntryGroup. rpc GetEntryGroup(GetEntryGroupRequest) returns (EntryGroup) { option (google.api.http) = { @@ -90,24 +108,35 @@ service DataCatalog { option (google.api.method_signature) = "name,read_mask"; } - // Alpha feature. // Deletes an EntryGroup. Only entry groups that do not contain entries can be - // deleted. The user should enable the Data Catalog API in the project + // deleted. Users should enable the Data Catalog API in the project // identified by the `name` parameter (see [Data Catalog Resource Project] - // (/data-catalog/docs/concepts/resource-project) for more information). - rpc DeleteEntryGroup(DeleteEntryGroupRequest) - returns (google.protobuf.Empty) { + // (https://cloud.google.com/data-catalog/docs/concepts/resource-project) for + // more information). + rpc DeleteEntryGroup(DeleteEntryGroupRequest) returns (google.protobuf.Empty) { option (google.api.http) = { delete: "/v1beta1/{name=projects/*/locations/*/entryGroups/*}" }; option (google.api.method_signature) = "name"; } - // Alpha feature. - // Creates an entry. Currently only entries of 'FILESET' type can be created. - // The user should enable the Data Catalog API in the project identified by + // Lists entry groups. 
+ rpc ListEntryGroups(ListEntryGroupsRequest) returns (ListEntryGroupsResponse) { + option (google.api.http) = { + get: "/v1beta1/{parent=projects/*/locations/*}/entryGroups" + }; + option (google.api.method_signature) = "parent"; + } + + // Creates an entry. Only entries of 'FILESET' type or user-specified type can + // be created. + // + // Users should enable the Data Catalog API in the project identified by // the `parent` parameter (see [Data Catalog Resource Project] - // (/data-catalog/docs/concepts/resource-project) for more information). + // (https://cloud.google.com/data-catalog/docs/concepts/resource-project) for + // more information). + // + // A maximum of 100,000 entries may be created per entry group. rpc CreateEntry(CreateEntryRequest) returns (Entry) { option (google.api.http) = { post: "/v1beta1/{parent=projects/*/locations/*/entryGroups/*}/entries" @@ -117,9 +146,10 @@ service DataCatalog { } // Updates an existing entry. - // The user should enable the Data Catalog API in the project identified by + // Users should enable the Data Catalog API in the project identified by // the `entry.name` parameter (see [Data Catalog Resource Project] - // (/data-catalog/docs/concepts/resource-project) for more information). + // (https://cloud.google.com/data-catalog/docs/concepts/resource-project) for + // more information). rpc UpdateEntry(UpdateEntryRequest) returns (Entry) { option (google.api.http) = { patch: "/v1beta1/{entry.name=projects/*/locations/*/entryGroups/*/entries/*}" @@ -129,13 +159,13 @@ service DataCatalog { option (google.api.method_signature) = "entry,update_mask"; } - // Alpha feature. // Deletes an existing entry. Only entries created through // [CreateEntry][google.cloud.datacatalog.v1beta1.DataCatalog.CreateEntry] // method can be deleted. 
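Per the `CreateEntry` comment above, `entry_type` is a oneof and only the `FILESET` enum value (or a user-specified type) may be supplied on create. A client-side pre-check restating that documented restriction could look like this; the helper is a hypothetical sketch, not part of the generated client:

```python
# Per the CreateEntry comment: only FILESET among the EntryType enum
# values may be supplied when creating an entry.
CREATABLE_ENUM_TYPES = {"FILESET"}

def check_creatable(type_=None, user_specified_type=None):
    """Mimic the documented CreateEntry restriction: exactly one member of
    the entry_type oneof must be set, and an enum type must be FILESET."""
    if (type_ is None) == (user_specified_type is None):
        raise ValueError("set exactly one of type or user_specified_type")
    if type_ is not None and type_ not in CREATABLE_ENUM_TYPES:
        raise ValueError(f"enum type {type_!r} cannot be created directly")
```

The actual enforcement happens server-side; this sketch only surfaces the error earlier.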
- // The user should enable the Data Catalog API in the project identified by + // Users should enable the Data Catalog API in the project identified by // the `name` parameter (see [Data Catalog Resource Project] - // (/data-catalog/docs/concepts/resource-project) for more information). + // (https://cloud.google.com/data-catalog/docs/concepts/resource-project) for + // more information). rpc DeleteEntry(DeleteEntryRequest) returns (google.protobuf.Empty) { option (google.api.http) = { delete: "/v1beta1/{name=projects/*/locations/*/entryGroups/*/entries/*}" @@ -160,17 +190,25 @@ service DataCatalog { }; } + // Lists entries. + rpc ListEntries(ListEntriesRequest) returns (ListEntriesResponse) { + option (google.api.http) = { + get: "/v1beta1/{parent=projects/*/locations/*/entryGroups/*}/entries" + }; + option (google.api.method_signature) = "parent"; + } + // Creates a tag template. The user should enable the Data Catalog API in // the project identified by the `parent` parameter (see [Data Catalog - // Resource Project](/data-catalog/docs/concepts/resource-project) for more - // information). + // Resource + // Project](https://cloud.google.com/data-catalog/docs/concepts/resource-project) + // for more information). rpc CreateTagTemplate(CreateTagTemplateRequest) returns (TagTemplate) { option (google.api.http) = { post: "/v1beta1/{parent=projects/*/locations/*}/tagTemplates" body: "tag_template" }; - option (google.api.method_signature) = - "parent,tag_template_id,tag_template"; + option (google.api.method_signature) = "parent,tag_template_id,tag_template"; } // Gets a tag template. @@ -184,9 +222,10 @@ service DataCatalog { // Updates a tag template. This method cannot be used to update the fields of // a template. The tag template fields are represented as separate resources // and should be updated using their own create/update/delete methods. 
- // The user should enable the Data Catalog API in the project identified by + // Users should enable the Data Catalog API in the project identified by // the `tag_template.name` parameter (see [Data Catalog Resource Project] - // (/data-catalog/docs/concepts/resource-project) for more information). + // (https://cloud.google.com/data-catalog/docs/concepts/resource-project) for + // more information). rpc UpdateTagTemplate(UpdateTagTemplateRequest) returns (TagTemplate) { option (google.api.http) = { patch: "/v1beta1/{tag_template.name=projects/*/locations/*/tagTemplates/*}" @@ -197,11 +236,11 @@ service DataCatalog { } // Deletes a tag template and all tags using the template. - // The user should enable the Data Catalog API in the project identified by + // Users should enable the Data Catalog API in the project identified by // the `name` parameter (see [Data Catalog Resource Project] - // (/data-catalog/docs/concepts/resource-project) for more information). - rpc DeleteTagTemplate(DeleteTagTemplateRequest) - returns (google.protobuf.Empty) { + // (https://cloud.google.com/data-catalog/docs/concepts/resource-project) for + // more information). + rpc DeleteTagTemplate(DeleteTagTemplateRequest) returns (google.protobuf.Empty) { option (google.api.http) = { delete: "/v1beta1/{name=projects/*/locations/*/tagTemplates/*}" }; @@ -211,39 +250,36 @@ service DataCatalog { // Creates a field in a tag template. The user should enable the Data Catalog // API in the project identified by the `parent` parameter (see // [Data Catalog Resource - // Project](/data-catalog/docs/concepts/resource-project) for more - // information). - rpc CreateTagTemplateField(CreateTagTemplateFieldRequest) - returns (TagTemplateField) { + // Project](https://cloud.google.com/data-catalog/docs/concepts/resource-project) + // for more information). 
+ rpc CreateTagTemplateField(CreateTagTemplateFieldRequest) returns (TagTemplateField) { option (google.api.http) = { post: "/v1beta1/{parent=projects/*/locations/*/tagTemplates/*}/fields" body: "tag_template_field" }; - option (google.api.method_signature) = - "parent,tag_template_field_id,tag_template_field"; + option (google.api.method_signature) = "parent,tag_template_field_id,tag_template_field"; } // Updates a field in a tag template. This method cannot be used to update the - // field type. The user should enable the Data Catalog API in the project + // field type. Users should enable the Data Catalog API in the project // identified by the `name` parameter (see [Data Catalog Resource Project] - // (/data-catalog/docs/concepts/resource-project) for more information). - rpc UpdateTagTemplateField(UpdateTagTemplateFieldRequest) - returns (TagTemplateField) { + // (https://cloud.google.com/data-catalog/docs/concepts/resource-project) for + // more information). + rpc UpdateTagTemplateField(UpdateTagTemplateFieldRequest) returns (TagTemplateField) { option (google.api.http) = { patch: "/v1beta1/{name=projects/*/locations/*/tagTemplates/*/fields/*}" body: "tag_template_field" }; option (google.api.method_signature) = "name,tag_template_field"; - option (google.api.method_signature) = - "name,tag_template_field,update_mask"; + option (google.api.method_signature) = "name,tag_template_field,update_mask"; } // Renames a field in a tag template. The user should enable the Data Catalog // API in the project identified by the `name` parameter (see [Data Catalog - // Resource Project](/data-catalog/docs/concepts/resource-project) for more - // information). - rpc RenameTagTemplateField(RenameTagTemplateFieldRequest) - returns (TagTemplateField) { + // Resource + // Project](https://cloud.google.com/data-catalog/docs/concepts/resource-project) + // for more information). 
+ rpc RenameTagTemplateField(RenameTagTemplateFieldRequest) returns (TagTemplateField) { option (google.api.http) = { post: "/v1beta1/{name=projects/*/locations/*/tagTemplates/*/fields/*}:rename" body: "*" @@ -252,11 +288,11 @@ service DataCatalog { } // Deletes a field in a tag template and all uses of that field. - // The user should enable the Data Catalog API in the project identified by + // Users should enable the Data Catalog API in the project identified by // the `name` parameter (see [Data Catalog Resource Project] - // (/data-catalog/docs/concepts/resource-project) for more information). - rpc DeleteTagTemplateField(DeleteTagTemplateFieldRequest) - returns (google.protobuf.Empty) { + // (https://cloud.google.com/data-catalog/docs/concepts/resource-project) for + // more information). + rpc DeleteTagTemplateField(DeleteTagTemplateFieldRequest) returns (google.protobuf.Empty) { option (google.api.http) = { delete: "/v1beta1/{name=projects/*/locations/*/tagTemplates/*/fields/*}" }; @@ -265,15 +301,19 @@ service DataCatalog { // Creates a tag on an [Entry][google.cloud.datacatalog.v1beta1.Entry]. // Note: The project identified by the `parent` parameter for the - // [tag](/data-catalog/docs/reference/rest/v1beta1/projects.locations.entryGroups.entries.tags/create#path-parameters) + // [tag](https://cloud.google.com/data-catalog/docs/reference/rest/v1beta1/projects.locations.entryGroups.entries.tags/create#path-parameters) // and the // [tag - // template](/data-catalog/docs/reference/rest/v1beta1/projects.locations.tagTemplates/create#path-parameters) + // template](https://cloud.google.com/data-catalog/docs/reference/rest/v1beta1/projects.locations.tagTemplates/create#path-parameters) // used to create the tag must be from the same organization. 
rpc CreateTag(CreateTagRequest) returns (Tag) { option (google.api.http) = { post: "/v1beta1/{parent=projects/*/locations/*/entryGroups/*/entries/*}/tags" body: "tag" + additional_bindings { + post: "/v1beta1/{parent=projects/*/locations/*/entryGroups/*}/tags" + body: "tag" + } }; option (google.api.method_signature) = "parent,tag"; } @@ -283,6 +323,10 @@ service DataCatalog { option (google.api.http) = { patch: "/v1beta1/{tag.name=projects/*/locations/*/entryGroups/*/entries/*/tags/*}" body: "tag" + additional_bindings { + patch: "/v1beta1/{tag.name=projects/*/locations/*/entryGroups/*/tags/*}" + body: "tag" + } }; option (google.api.method_signature) = "tag"; option (google.api.method_signature) = "tag,update_mask"; @@ -292,6 +336,9 @@ service DataCatalog { rpc DeleteTag(DeleteTagRequest) returns (google.protobuf.Empty) { option (google.api.http) = { delete: "/v1beta1/{name=projects/*/locations/*/entryGroups/*/entries/*/tags/*}" + additional_bindings { + delete: "/v1beta1/{name=projects/*/locations/*/entryGroups/*/tags/*}" + } }; option (google.api.method_signature) = "name"; } @@ -300,6 +347,9 @@ service DataCatalog { rpc ListTags(ListTagsRequest) returns (ListTagsResponse) { option (google.api.http) = { get: "/v1beta1/{parent=projects/*/locations/*/entryGroups/*/entries/*}/tags" + additional_bindings { + get: "/v1beta1/{parent=projects/*/locations/*/entryGroups/*}/tags" + } }; option (google.api.method_signature) = "parent"; } @@ -310,17 +360,15 @@ service DataCatalog { // - Tag templates. // - Entries. // - Entry groups. - // Note, this method cannot be used to manage policies for BigQuery, Cloud - // Pub/Sub and any external Google Cloud Platform resources synced to Cloud - // Data Catalog. + // Note, this method cannot be used to manage policies for BigQuery, Pub/Sub + // and any external Google Cloud Platform resources synced to Data Catalog. 
// // Callers must have following Google IAM permission // - `datacatalog.tagTemplates.setIamPolicy` to set policies on tag // templates. // - `datacatalog.entries.setIamPolicy` to set policies on entries. // - `datacatalog.entryGroups.setIamPolicy` to set policies on entry groups. - rpc SetIamPolicy(google.iam.v1.SetIamPolicyRequest) - returns (google.iam.v1.Policy) { + rpc SetIamPolicy(google.iam.v1.SetIamPolicyRequest) returns (google.iam.v1.Policy) { option (google.api.http) = { post: "/v1beta1/{resource=projects/*/locations/*/tagTemplates/*}:setIamPolicy" body: "*" @@ -328,11 +376,9 @@ service DataCatalog { post: "/v1beta1/{resource=projects/*/locations/*/entryGroups/*}:setIamPolicy" body: "*" } - additional_bindings { - post: "/v1beta1/{resource=projects/*/locations/*/entryGroups/*/entries/*}:setIamPolicy" - body: "*" - } }; + + option (google.api.method_signature) = "resource,policy"; } // Gets the access control policy for a resource. A `NOT_FOUND` error @@ -343,17 +389,15 @@ service DataCatalog { // - Tag templates. // - Entries. // - Entry groups. - // Note, this method cannot be used to manage policies for BigQuery, Cloud - // Pub/Sub and any external Google Cloud Platform resources synced to Cloud - // Data Catalog. + // Note, this method cannot be used to manage policies for BigQuery, Pub/Sub + // and any external Google Cloud Platform resources synced to Data Catalog. // // Callers must have following Google IAM permission // - `datacatalog.tagTemplates.getIamPolicy` to get policies on tag // templates. // - `datacatalog.entries.getIamPolicy` to get policies on entries. // - `datacatalog.entryGroups.getIamPolicy` to get policies on entry groups. 
- rpc GetIamPolicy(google.iam.v1.GetIamPolicyRequest) - returns (google.iam.v1.Policy) { + rpc GetIamPolicy(google.iam.v1.GetIamPolicyRequest) returns (google.iam.v1.Policy) { option (google.api.http) = { post: "/v1beta1/{resource=projects/*/locations/*/tagTemplates/*}:getIamPolicy" body: "*" @@ -366,6 +410,7 @@ service DataCatalog { body: "*" } }; + option (google.api.method_signature) = "resource"; } // Returns the caller's permissions on a resource. @@ -376,14 +421,12 @@ service DataCatalog { // - Tag templates. // - Entries. // - Entry groups. - // Note, this method cannot be used to manage policies for BigQuery, Cloud - // Pub/Sub and any external Google Cloud Platform resources synced to Cloud - // Data Catalog. + // Note, this method cannot be used to manage policies for BigQuery, Pub/Sub + // and any external Google Cloud Platform resources synced to Data Catalog. // // A caller is not required to have Google IAM permission to make this // request. - rpc TestIamPermissions(google.iam.v1.TestIamPermissionsRequest) - returns (google.iam.v1.TestIamPermissionsResponse) { + rpc TestIamPermissions(google.iam.v1.TestIamPermissionsRequest) returns (google.iam.v1.TestIamPermissionsResponse) { option (google.api.http) = { post: "/v1beta1/{resource=projects/*/locations/*/tagTemplates/*}:testIamPermissions" body: "*" @@ -402,19 +445,14 @@ service DataCatalog { // Request message for // [SearchCatalog][google.cloud.datacatalog.v1beta1.DataCatalog.SearchCatalog]. message SearchCatalogRequest { + // The criteria that select the subspace used for query matching. message Scope { - // Data Catalog tries to automatically choose the right corpus of data to - // search through. You can ensure an organization is included by adding it - // to `include_org_ids`. You can ensure a project's org is included with - // `include_project_ids`. You must specify at least one organization - // using `include_org_ids` or `include_project_ids` in all search requests. 
- // - // List of organization IDs to search within. To find your organization ID, - // follow instructions in + // The list of organization IDs to search within. To find your organization + // ID, follow instructions in // https://cloud.google.com/resource-manager/docs/creating-managing-organization. repeated string include_org_ids = 2; - // List of project IDs to search within. To learn more about the + // The list of project IDs to search within. To learn more about the // distinction between project names/IDs/numbers, go to // https://cloud.google.com/docs/overview/#projects. repeated string include_project_ids = 3; @@ -426,11 +464,13 @@ message SearchCatalogRequest { bool include_gcp_public_datasets = 7; } - // Required. The scope of this search request. + // Required. The scope of this search request. A `scope` that has empty + // `include_org_ids`, `include_project_ids` AND false + // `include_gcp_public_datasets` is considered invalid. Data Catalog will + // return an error in such a case. Scope scope = 6 [(google.api.field_behavior) = REQUIRED]; - // Required. The query string in search query syntax. The query must be - // non-empty. + // Required. The query string in search query syntax. The query must be non-empty. // // Query strings can be simple as "x" or more qualified as: // @@ -440,7 +480,8 @@ message SearchCatalogRequest { // // Note: Query tokens need to have a minimum of 3 characters for substring // matching to work correctly. See [Data Catalog Search - // Syntax](/data-catalog/docs/how-to/search-reference) for more information. + // Syntax](https://cloud.google.com/data-catalog/docs/how-to/search-reference) + // for more information. string query = 1 [(google.api.field_behavior) = REQUIRED]; // Number of results in the search page. If <=0 then defaults to 10. Max limit @@ -448,8 +489,8 @@ message SearchCatalogRequest { int32 page_size = 2; // Optional. 
Pagination token returned in an earlier - // [SearchCatalogResponse.next_page_token][google.cloud.datacatalog.v1beta1.SearchCatalogResponse.next_page_token], - // which indicates that this is a continuation of a prior + // [SearchCatalogResponse.next_page_token][google.cloud.datacatalog.v1beta1.SearchCatalogResponse.next_page_token], which + // indicates that this is a continuation of a prior // [SearchCatalogRequest][google.cloud.datacatalog.v1beta1.DataCatalog.SearchCatalog] // call, and that the system should return the next page of data. If empty, // the first page is returned. @@ -458,9 +499,7 @@ message SearchCatalogRequest { // Specifies the ordering of results, currently supported case-sensitive // choices are: // - // * `relevance`, only supports desecending - // * `last_access_timestamp [asc|desc]`, defaults to descending if not - // specified + // * `relevance`, only supports descending // * `last_modified_timestamp [asc|desc]`, defaults to descending if not // specified // @@ -495,12 +534,25 @@ message CreateEntryGroupRequest { ]; // Required. The id of the entry group to create. + // The id must begin with a letter or underscore, contain only English + // letters, numbers and underscores, and be at most 64 characters. string entry_group_id = 3 [(google.api.field_behavior) = REQUIRED]; // The entry group to create. Defaults to an empty entry group. EntryGroup entry_group = 2; } +// Request message for +// [UpdateEntryGroup][google.cloud.datacatalog.v1beta1.DataCatalog.UpdateEntryGroup]. +message UpdateEntryGroupRequest { + // Required. The updated entry group. "name" field must be set. + EntryGroup entry_group = 1 [(google.api.field_behavior) = REQUIRED]; + + // The fields to update on the entry group. If absent or empty, all modifiable + // fields are updated. + google.protobuf.FieldMask update_mask = 2; +} + // Request message for // [GetEntryGroup][google.cloud.datacatalog.v1beta1.DataCatalog.GetEntryGroup]. 
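The `SearchCatalogRequest.scope` comment above says a scope with empty `include_org_ids`, empty `include_project_ids` and `include_gcp_public_datasets` false is rejected by the service. That rule is easy to pre-check client-side; a sketch (the helper is ours, the rule is quoted from the comment):

```python
def scope_is_valid(include_org_ids=(), include_project_ids=(),
                   include_gcp_public_datasets=False) -> bool:
    """Restate the SearchCatalog scope rule: at least one org ID, one
    project ID, or the public-datasets flag must be supplied, otherwise
    Data Catalog returns an error for the request."""
    return (bool(include_org_ids)
            or bool(include_project_ids)
            or include_gcp_public_datasets)
```

For example, a scope with only `include_project_ids=["my-project"]` is valid, while the all-defaults scope is not.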
message GetEntryGroupRequest { @@ -528,6 +580,43 @@ message DeleteEntryGroupRequest { type: "datacatalog.googleapis.com/EntryGroup" } ]; + + // Optional. If true, deletes all entries in the entry group. + bool force = 2 [(google.api.field_behavior) = OPTIONAL]; +} + +// Request message for +// [ListEntryGroups][google.cloud.datacatalog.v1beta1.DataCatalog.ListEntryGroups]. +message ListEntryGroupsRequest { + // Required. The name of the location that contains the entry groups, which can be + // provided in URL format. Example: + // + // * projects/{project_id}/locations/{location} + string parent = 1 [ + (google.api.field_behavior) = REQUIRED, + (google.api.resource_reference) = { + type: "datacatalog.googleapis.com/EntryGroup" + } + ]; + + // Optional. The maximum number of items to return. Default is 10. Max limit is 1000. + // Throws an invalid argument for `page_size > 1000`. + int32 page_size = 2 [(google.api.field_behavior) = OPTIONAL]; + + // Optional. Token that specifies which page is requested. If empty, the first page is + // returned. + string page_token = 3 [(google.api.field_behavior) = OPTIONAL]; +} + +// Response message for +// [ListEntryGroups][google.cloud.datacatalog.v1beta1.DataCatalog.ListEntryGroups]. +message ListEntryGroupsResponse { + // EntryGroup details. + repeated EntryGroup entry_groups = 1; + + // Token to retrieve the next page of results. It is set to empty if no items + // remain in results. + string next_page_token = 2; } // Request message for @@ -571,6 +660,14 @@ message UpdateEntryRequest { // * `description` // * `gcs_fileset_spec` // * `gcs_fileset_spec.file_patterns` + // * For entries with `user_specified_type` + // * `schema` + // * `display_name` + // * `description` + // * user_specified_type + // * user_specified_system + // * linked_resource + // * source_system_timestamps google.protobuf.FieldMask update_mask = 2; } @@ -594,11 +691,6 @@ message GetEntryRequest { // Required. The name of the entry. 
Example: // // * projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id} - // - // Entry groups are logical groupings of entries. Currently, users cannot - // create/modify entry groups. They are created by Data Catalog; they include - // `@bigquery` for all BigQuery entries, and `@pubsub` for all Cloud Pub/Sub - // entries. string name = 1 [ (google.api.field_behavior) = REQUIRED, (google.api.resource_reference) = { @@ -628,10 +720,11 @@ message LookupEntryRequest { // // Examples: // - // * `cloud_pubsub.project_id.topic_id` + // * `pubsub.project_id.topic_id` // * ``pubsub.project_id.`topic.id.with.dots` `` - // * `bigquery.project_id.dataset_id.table_id` - // * `datacatalog.project_id.location_id.entry_group_id.entry_id` + // * `bigquery.table.project_id.dataset_id.table_id` + // * `bigquery.dataset.project_id.dataset_id` + // * `datacatalog.entry.project_id.location_id.entry_group_id.entry_id` // // `*_id`s shoud satisfy the standard SQL rules for identifiers. // https://cloud.google.com/bigquery/docs/reference/standard-sql/lexical. @@ -641,9 +734,10 @@ message LookupEntryRequest { // Entry Metadata. // A Data Catalog Entry resource represents another resource in Google -// Cloud Platform, such as a BigQuery dataset or a Cloud Pub/Sub topic. -// Clients can use the `linked_resource` field in the Entry resource to refer to -// the original resource ID of the source system. +// Cloud Platform (such as a BigQuery dataset or a Pub/Sub topic), or +// outside of Google Cloud Platform. Clients can use the `linked_resource` field +// in the Entry resource to refer to the original resource ID of the source +// system. // // An Entry resource contains resource details, such as its schema. An Entry can // also be used to attach flexible metadata, such as a @@ -661,10 +755,10 @@ message Entry { // Note that this Entry and its child resources may not actually be stored in // the location in this name. 
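The `LookupEntryRequest.sql_resource` examples above show that each dot-separated segment must be a standard SQL identifier, and that segments containing dots are backtick-quoted (`` pubsub.project_id.`topic.id.with.dots` ``). A small builder that applies that quoting rule, as we read it from the comment (the helper names are ours):

```python
import re

# Plain SQL identifier: letter or underscore, then letters/digits/underscores.
_SQL_IDENT = re.compile(r"^[A-Za-z_][A-Za-z0-9_]*$")

def sql_segment(identifier: str) -> str:
    """Backtick-quote a segment of a lookup_entry SQL resource name when it
    is not a plain SQL identifier (e.g. it contains dots or dashes)."""
    return identifier if _SQL_IDENT.match(identifier) else f"`{identifier}`"

def sql_resource_name(*segments: str) -> str:
    """Join segments into a sql_resource string such as
    'bigquery.table.project_id.dataset_id.table_id'."""
    return ".".join(sql_segment(s) for s in segments)
```

This mirrors the documented examples only; the authoritative grammar is the standard SQL lexical rules linked in the comment.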
string name = 1 [(google.api.resource_reference) = { - type: "datacatalog.googleapis.com/EntryGroup" - }]; + type: "datacatalog.googleapis.com/EntryGroup" + }]; - // Output only. The resource this metadata entry refers to. + // The resource this metadata entry refers to. // // For Google Cloud Platform resources, `linked_resource` is the [full name of // the @@ -672,12 +766,43 @@ message Entry { // For example, the `linked_resource` for a table resource from BigQuery is: // // * //bigquery.googleapis.com/projects/projectId/datasets/datasetId/tables/tableId - string linked_resource = 9 [(google.api.field_behavior) = OUTPUT_ONLY]; + // + // Output only when Entry is of type in the EntryType enum. For entries with + // user_specified_type, this field is optional and defaults to an empty + // string. + string linked_resource = 9; // Required. Entry type. oneof entry_type { // The type of the entry. + // Only used for Entries with types in the EntryType enum. EntryType type = 2; + + // Entry type if it does not fit any of the input-allowed values listed in + // `EntryType` enum above. When creating an entry, users should check the + // enum values first, if nothing matches the entry to be created, then + // provide a custom value, for example "my_special_type". + // `user_specified_type` strings must begin with a letter or underscore and + // can only contain letters, numbers, and underscores; are case insensitive; + // must be at least 1 character and at most 64 characters long. + // + // Currently, only FILESET enum value is allowed. All other entries created + // through Data Catalog must use `user_specified_type`. + string user_specified_type = 16; + } + + // The source system of the entry. + oneof system { + // Output only. This field indicates the entry's source system that Data Catalog + // integrates with, such as BigQuery or Pub/Sub. 
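The lexical rule stated above for `user_specified_type` (and repeated for `user_specified_system`, and in essentially the same form for `entry_group_id`) translates directly into a regular expression: begin with a letter or underscore, then only letters, digits and underscores, 1 to 64 characters total. A sketch of that check (helper name is ours):

```python
import re

# First char: letter or underscore; then up to 63 more letters/digits/
# underscores, for a total length of 1-64 characters.
_USER_SPECIFIED = re.compile(r"^[A-Za-z_][A-Za-z0-9_]{0,63}$")

def valid_user_specified(value: str) -> bool:
    """Check the documented lexical rule for user_specified_type /
    user_specified_system values such as 'my_special_type'."""
    return bool(_USER_SPECIFIED.fullmatch(value))
```

Note the comment says matching is case insensitive, i.e. `My_Type` and `my_type` name the same type; the regex above only validates shape, it does not normalize case.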
+ IntegratedSystem integrated_system = 17 [(google.api.field_behavior) = OUTPUT_ONLY]; + + // This field indicates the entry's source system that Data Catalog does not + // integrate with. `user_specified_system` strings must begin with a letter + // or underscore and can only contain letters, numbers, and underscores; are + // case insensitive; must be at least 1 character and at most 64 characters + // long. + string user_specified_system = 18; } // Type specification information. @@ -708,10 +833,11 @@ message Entry { // Schema of the entry. An entry might not have any schema attached to it. Schema schema = 5; - // Output only. Timestamps about the underlying Google Cloud Platform - // resource, not about this Data Catalog Entry. - SystemTimestamps source_system_timestamps = 7 - [(google.api.field_behavior) = OUTPUT_ONLY]; + // Output only. Timestamps about the underlying resource, not about this Data Catalog + // entry. Output only when Entry is of type in the EntryType enum. For entries + // with user_specified_type, this field is optional and defaults to an empty + // timestamp. + SystemTimestamps source_system_timestamps = 7 [(google.api.field_behavior) = OUTPUT_ONLY]; } // EntryGroup Metadata. @@ -740,22 +866,19 @@ message EntryGroup { // string. string description = 3; - // Output only. Timestamps about this EntryGroup. Default value is empty - // timestamps. - SystemTimestamps data_catalog_timestamps = 4 - [(google.api.field_behavior) = OUTPUT_ONLY]; + // Output only. Timestamps about this EntryGroup. Default value is empty timestamps. + SystemTimestamps data_catalog_timestamps = 4 [(google.api.field_behavior) = OUTPUT_ONLY]; } // Request message for // [CreateTagTemplate][google.cloud.datacatalog.v1beta1.DataCatalog.CreateTagTemplate]. message CreateTagTemplateRequest { - // Required. The name of the project and the location this template is in. - // Example: + // Required. 
The name of the project and the template location + // [region](https://cloud.google.com/data-catalog/docs/concepts/regions). // - // * projects/{project_id}/locations/{location} + // Example: // - // TagTemplate and its child resources may not actually be stored in the - // location in this name. + // * projects/{project_id}/locations/us-central1 string parent = 1 [ (google.api.field_behavior) = REQUIRED, (google.api.resource_reference) = { @@ -784,26 +907,6 @@ message GetTagTemplateRequest { ]; } -// Entry resources in Data Catalog can be of different types e.g. a BigQuery -// Table entry is of type `TABLE`. This enum describes all the possible types -// Data Catalog contains. -enum EntryType { - // Default unknown type - ENTRY_TYPE_UNSPECIFIED = 0; - - // Output only. The type of entry that has a GoogleSQL schema, including - // logical views. - TABLE = 2; - - // Output only. An entry type which is used for streaming entries. Example: - // Cloud Pub/Sub topic. - DATA_STREAM = 3; - - // Alpha feature. An entry type which is a set of files or objects. Example: - // Cloud Storage fileset. - FILESET = 4; -} - // Request message for // [UpdateTagTemplate][google.cloud.datacatalog.v1beta1.DataCatalog.UpdateTagTemplate]. message UpdateTagTemplateRequest { @@ -842,8 +945,8 @@ message DeleteTagTemplateRequest { // Request message for // [CreateTag][google.cloud.datacatalog.v1beta1.DataCatalog.CreateTag]. message CreateTagRequest { - // Required. The name of the resource to attach this tag to. Tags can be - // attached to Entries. Example: + // Required. The name of the resource to attach this tag to. Tags can be attached to + // Entries. Example: // // * projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id} // @@ -851,7 +954,9 @@ message CreateTagRequest { // the location in this name. 
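The `CreateTagTemplateRequest.parent` example above fixes the resource-name shape `projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id}`. The generated clients ship `*_path` helpers for names like this; a standalone sketch of the format/parse pair (function names are ours, not the generated ones):

```python
def tag_template_name(project_id: str, location: str,
                      tag_template_id: str) -> str:
    """Format a TagTemplate resource name, e.g.
    projects/p/locations/us-central1/tagTemplates/t."""
    return (f"projects/{project_id}/locations/{location}"
            f"/tagTemplates/{tag_template_id}")

def parse_tag_template_name(name: str) -> dict:
    """Split a TagTemplate resource name back into its components."""
    parts = name.split("/")
    if (len(parts) != 6 or parts[0] != "projects"
            or parts[2] != "locations" or parts[4] != "tagTemplates"):
        raise ValueError(f"not a TagTemplate name: {name!r}")
    return {"project_id": parts[1], "location": parts[3],
            "tag_template_id": parts[5]}
```

Round-tripping a name through these two functions returns the original components.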
string parent = 1 [ (google.api.field_behavior) = REQUIRED, - (google.api.resource_reference) = { type: "datacatalog.googleapis.com/Tag" } + (google.api.resource_reference) = { + type: "datacatalog.googleapis.com/Tag" + } ]; // Required. The tag to create. @@ -886,12 +991,12 @@ message DeleteTagRequest { // Request message for // [CreateTagTemplateField][google.cloud.datacatalog.v1beta1.DataCatalog.CreateTagTemplateField]. message CreateTagTemplateFieldRequest { - // Required. The name of the project this template is in. Example: + // Required. The name of the project and the template location + // [region](https://cloud.google.com/data-catalog/docs/concepts/regions). // - // * projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id} + // Example: // - // Note that this TagTemplateField may not actually be stored in the location - // in this name. + // * projects/{project_id}/locations/us-central1/tagTemplates/{tag_template_id} string parent = 1 [ (google.api.field_behavior) = REQUIRED, (google.api.resource_reference) = { @@ -907,8 +1012,7 @@ message CreateTagTemplateFieldRequest { string tag_template_field_id = 2 [(google.api.field_behavior) = REQUIRED]; // Required. The tag template field to create. - TagTemplateField tag_template_field = 3 - [(google.api.field_behavior) = REQUIRED]; + TagTemplateField tag_template_field = 3 [(google.api.field_behavior) = REQUIRED]; } // Request message for @@ -925,22 +1029,23 @@ message UpdateTagTemplateFieldRequest { ]; // Required. The template to update. - TagTemplateField tag_template_field = 2 - [(google.api.field_behavior) = REQUIRED]; + TagTemplateField tag_template_field = 2 [(google.api.field_behavior) = REQUIRED]; - // The field mask specifies the parts of the template to be updated. + // Optional. The field mask specifies the parts of the template to be updated. 
// Allowed fields: // // * `display_name` // * `type.enum_type` + // * `is_required` // // If `update_mask` is not set or empty, all of the allowed fields above will // be updated. // // When updating an enum type, the provided values will be merged with the // existing values. Therefore, enum values can only be added, existing enum - // values cannot be deleted nor renamed. - google.protobuf.FieldMask update_mask = 3; + // values cannot be deleted nor renamed. Updating a template field from + // optional to required is NOT allowed. + google.protobuf.FieldMask update_mask = 3 [(google.api.field_behavior) = OPTIONAL]; } // Request message for @@ -956,8 +1061,7 @@ message RenameTagTemplateFieldRequest { } ]; - // Required. The new ID of this tag template field. For example, - // `my_new_field`. + // Required. The new ID of this tag template field. For example, `my_new_field`. string new_tag_template_field_id = 2 [(google.api.field_behavior) = REQUIRED]; } @@ -983,8 +1087,14 @@ message DeleteTagTemplateFieldRequest { // Request message for // [ListTags][google.cloud.datacatalog.v1beta1.DataCatalog.ListTags]. message ListTagsRequest { - // Required. The name of the Data Catalog resource to list the tags of. The - // resource could be an [Entry][google.cloud.datacatalog.v1beta1.Entry]. + // Required. The name of the Data Catalog resource to list the tags of. The resource + // could be an [Entry][google.cloud.datacatalog.v1beta1.Entry] or an + // [EntryGroup][google.cloud.datacatalog.v1beta1.EntryGroup]. + // + // Examples: + // + // * projects/{project_id}/locations/{location}/entryGroups/{entry_group_id} + // * projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id} string parent = 1 [ (google.api.field_behavior) = REQUIRED, (google.api.resource_reference) = { @@ -1010,3 +1120,67 @@ message ListTagsResponse { // remain in results. 
   string next_page_token = 2;
 }
+
+// Request message for
+// [ListEntries][google.cloud.datacatalog.v1beta1.DataCatalog.ListEntries].
+message ListEntriesRequest {
+  // Required. The name of the entry group that contains the entries, which can
+  // be provided in URL format. Example:
+  //
+  // * projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}
+  string parent = 1 [
+    (google.api.field_behavior) = REQUIRED,
+    (google.api.resource_reference) = {
+      type: "datacatalog.googleapis.com/EntryGroup"
+    }
+  ];
+
+  // The maximum number of items to return. Default is 10. Max limit is 1000.
+  // Throws an invalid argument for `page_size > 1000`.
+  int32 page_size = 2;
+
+  // Token that specifies which page is requested. If empty, the first page is
+  // returned.
+  string page_token = 3;
+
+  // The fields to return for each Entry. If not set or empty, all
+  // fields are returned.
+  // For example, setting read_mask to contain only one path "name" will cause
+  // ListEntries to return a list of Entries with only "name" field.
+  google.protobuf.FieldMask read_mask = 4;
+}
+
+// Response message for
+// [ListEntries][google.cloud.datacatalog.v1beta1.DataCatalog.ListEntries].
+message ListEntriesResponse {
+  // Entry details.
+  repeated Entry entries = 1;
+
+  // Token to retrieve the next page of results. It is set to empty if no items
+  // remain in results.
+  string next_page_token = 2;
+}
+
+// Entry resources in Data Catalog can be of different types e.g. a BigQuery
+// Table entry is of type `TABLE`. This enum describes all the possible types
+// Data Catalog contains.
+enum EntryType {
+  // Default unknown type.
+  ENTRY_TYPE_UNSPECIFIED = 0;
+
+  // Output only. The type of entry that has a GoogleSQL schema, including
+  // logical views.
+  TABLE = 2;
+
+  // Output only. The type of models.
+  // https://cloud.google.com/bigquery-ml/docs/bigqueryml-intro
+  MODEL = 5;
+
+  // Output only. An entry type which is used for streaming entries. Example:
+  // Pub/Sub topic.
+  DATA_STREAM = 3;
+
+  // An entry type which is a set of files or objects. Example:
+  // Cloud Storage fileset.
+  FILESET = 4;
+}
diff --git a/google/cloud/datacatalog_v1beta1/proto/datacatalog_pb2.py b/google/cloud/datacatalog_v1beta1/proto/datacatalog_pb2.py
deleted file mode 100644
index dd85fbce..00000000
--- a/google/cloud/datacatalog_v1beta1/proto/datacatalog_pb2.py
+++ /dev/null
@@ -1,3849 +0,0 @@
-# -*- coding: utf-8 -*-
-# Generated by the protocol buffer compiler. DO NOT EDIT!
-# source: google/cloud/datacatalog_v1beta1/proto/datacatalog.proto
-
-from google.protobuf.internal import enum_type_wrapper
-from google.protobuf import descriptor as _descriptor
-from google.protobuf import message as _message
-from google.protobuf import reflection as _reflection
-from google.protobuf import symbol_database as _symbol_database
-
-# @@protoc_insertion_point(imports)
-
-_sym_db = _symbol_database.Default()
-
-
-from google.api import annotations_pb2 as google_dot_api_dot_annotations__pb2
-from google.api import client_pb2 as google_dot_api_dot_client__pb2
-from google.api import field_behavior_pb2 as google_dot_api_dot_field__behavior__pb2
-from google.api import resource_pb2 as google_dot_api_dot_resource__pb2
-from google.cloud.datacatalog_v1beta1.proto import (
-    common_pb2 as google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_common__pb2,
-)
-from google.cloud.datacatalog_v1beta1.proto import (
-    gcs_fileset_spec_pb2 as google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_gcs__fileset__spec__pb2,
-)
-from google.cloud.datacatalog_v1beta1.proto import (
-    schema_pb2 as google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_schema__pb2,
-)
-from google.cloud.datacatalog_v1beta1.proto import (
-    search_pb2 as google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_search__pb2,
-)
-from google.cloud.datacatalog_v1beta1.proto import (
-    table_spec_pb2 as google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_table__spec__pb2,
-) -from google.cloud.datacatalog_v1beta1.proto import ( - tags_pb2 as google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_tags__pb2, -) -from google.cloud.datacatalog_v1beta1.proto import ( - timestamps_pb2 as google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_timestamps__pb2, -) -from google.iam.v1 import iam_policy_pb2 as google_dot_iam_dot_v1_dot_iam__policy__pb2 -from google.iam.v1 import policy_pb2 as google_dot_iam_dot_v1_dot_policy__pb2 -from google.protobuf import empty_pb2 as google_dot_protobuf_dot_empty__pb2 -from google.protobuf import field_mask_pb2 as google_dot_protobuf_dot_field__mask__pb2 - - -DESCRIPTOR = _descriptor.FileDescriptor( - name="google/cloud/datacatalog_v1beta1/proto/datacatalog.proto", - package="google.cloud.datacatalog.v1beta1", - syntax="proto3", - serialized_options=b"\n$com.google.cloud.datacatalog.v1beta1P\001ZKgoogle.golang.org/genproto/googleapis/cloud/datacatalog/v1beta1;datacatalog\370\001\001\252\002 Google.Cloud.DataCatalog.V1Beta1\312\002 Google\\Cloud\\DataCatalog\\V1beta1\352\002#Google::Cloud::DataCatalog::V1beta1", - create_key=_descriptor._internal_create_key, - serialized_pb=b'\n8google/cloud/datacatalog_v1beta1/proto/datacatalog.proto\x12 google.cloud.datacatalog.v1beta1\x1a\x1cgoogle/api/annotations.proto\x1a\x17google/api/client.proto\x1a\x1fgoogle/api/field_behavior.proto\x1a\x19google/api/resource.proto\x1a\x33google/cloud/datacatalog_v1beta1/proto/common.proto\x1a=google/cloud/datacatalog_v1beta1/proto/gcs_fileset_spec.proto\x1a\x33google/cloud/datacatalog_v1beta1/proto/schema.proto\x1a\x33google/cloud/datacatalog_v1beta1/proto/search.proto\x1a\x37google/cloud/datacatalog_v1beta1/proto/table_spec.proto\x1a\x31google/cloud/datacatalog_v1beta1/proto/tags.proto\x1a\x37google/cloud/datacatalog_v1beta1/proto/timestamps.proto\x1a\x1egoogle/iam/v1/iam_policy.proto\x1a\x1agoogle/iam/v1/policy.proto\x1a\x1bgoogle/protobuf/empty.proto\x1a 
google/protobuf/field_mask.proto"\x9e\x02\n\x14SearchCatalogRequest\x12P\n\x05scope\x18\x06 \x01(\x0b\x32<.google.cloud.datacatalog.v1beta1.SearchCatalogRequest.ScopeB\x03\xe0\x41\x02\x12\x12\n\x05query\x18\x01 \x01(\tB\x03\xe0\x41\x02\x12\x11\n\tpage_size\x18\x02 \x01(\x05\x12\x17\n\npage_token\x18\x03 \x01(\tB\x03\xe0\x41\x01\x12\x10\n\x08order_by\x18\x05 \x01(\t\x1a\x62\n\x05Scope\x12\x17\n\x0finclude_org_ids\x18\x02 \x03(\t\x12\x1b\n\x13include_project_ids\x18\x03 \x03(\t\x12#\n\x1binclude_gcp_public_datasets\x18\x07 \x01(\x08"x\n\x15SearchCatalogResponse\x12\x46\n\x07results\x18\x01 \x03(\x0b\x32\x35.google.cloud.datacatalog.v1beta1.SearchCatalogResult\x12\x17\n\x0fnext_page_token\x18\x03 \x01(\t"\xb8\x01\n\x17\x43reateEntryGroupRequest\x12=\n\x06parent\x18\x01 \x01(\tB-\xe0\x41\x02\xfa\x41\'\x12%datacatalog.googleapis.com/EntryGroup\x12\x1b\n\x0e\x65ntry_group_id\x18\x03 \x01(\tB\x03\xe0\x41\x02\x12\x41\n\x0b\x65ntry_group\x18\x02 \x01(\x0b\x32,.google.cloud.datacatalog.v1beta1.EntryGroup"\x92\x01\n\x17UpdateEntryGroupRequest\x12\x46\n\x0b\x65ntry_group\x18\x01 \x01(\x0b\x32,.google.cloud.datacatalog.v1beta1.EntryGroupB\x03\xe0\x41\x02\x12/\n\x0bupdate_mask\x18\x02 \x01(\x0b\x32\x1a.google.protobuf.FieldMask"\x82\x01\n\x14GetEntryGroupRequest\x12;\n\x04name\x18\x01 \x01(\tB-\xe0\x41\x02\xfa\x41\'\n%datacatalog.googleapis.com/EntryGroup\x12-\n\tread_mask\x18\x02 \x01(\x0b\x32\x1a.google.protobuf.FieldMask"j\n\x17\x44\x65leteEntryGroupRequest\x12;\n\x04name\x18\x01 \x01(\tB-\xe0\x41\x02\xfa\x41\'\n%datacatalog.googleapis.com/EntryGroup\x12\x12\n\x05\x66orce\x18\x02 \x01(\x08\x42\x03\xe0\x41\x01"\x88\x01\n\x16ListEntryGroupsRequest\x12=\n\x06parent\x18\x01 \x01(\tB-\xe0\x41\x02\xfa\x41\'\n%datacatalog.googleapis.com/EntryGroup\x12\x16\n\tpage_size\x18\x02 \x01(\x05\x42\x03\xe0\x41\x01\x12\x17\n\npage_token\x18\x03 \x01(\tB\x03\xe0\x41\x01"v\n\x17ListEntryGroupsResponse\x12\x42\n\x0c\x65ntry_groups\x18\x01 
\x03(\x0b\x32,.google.cloud.datacatalog.v1beta1.EntryGroup\x12\x17\n\x0fnext_page_token\x18\x02 \x01(\t"\xa7\x01\n\x12\x43reateEntryRequest\x12=\n\x06parent\x18\x01 \x01(\tB-\xe0\x41\x02\xfa\x41\'\n%datacatalog.googleapis.com/EntryGroup\x12\x15\n\x08\x65ntry_id\x18\x03 \x01(\tB\x03\xe0\x41\x02\x12;\n\x05\x65ntry\x18\x02 \x01(\x0b\x32\'.google.cloud.datacatalog.v1beta1.EntryB\x03\xe0\x41\x02"\x82\x01\n\x12UpdateEntryRequest\x12;\n\x05\x65ntry\x18\x01 \x01(\x0b\x32\'.google.cloud.datacatalog.v1beta1.EntryB\x03\xe0\x41\x02\x12/\n\x0bupdate_mask\x18\x02 \x01(\x0b\x32\x1a.google.protobuf.FieldMask"L\n\x12\x44\x65leteEntryRequest\x12\x36\n\x04name\x18\x01 \x01(\tB(\xe0\x41\x02\xfa\x41"\n datacatalog.googleapis.com/Entry"I\n\x0fGetEntryRequest\x12\x36\n\x04name\x18\x01 \x01(\tB(\xe0\x41\x02\xfa\x41"\n datacatalog.googleapis.com/Entry"V\n\x12LookupEntryRequest\x12\x19\n\x0flinked_resource\x18\x01 \x01(\tH\x00\x12\x16\n\x0csql_resource\x18\x03 \x01(\tH\x00\x42\r\n\x0btarget_name"\x8f\x07\n\x05\x45ntry\x12\x38\n\x04name\x18\x01 \x01(\tB*\xfa\x41\'\n%datacatalog.googleapis.com/EntryGroup\x12\x17\n\x0flinked_resource\x18\t \x01(\t\x12;\n\x04type\x18\x02 \x01(\x0e\x32+.google.cloud.datacatalog.v1beta1.EntryTypeH\x00\x12\x1d\n\x13user_specified_type\x18\x10 \x01(\tH\x00\x12T\n\x11integrated_system\x18\x11 \x01(\x0e\x32\x32.google.cloud.datacatalog.v1beta1.IntegratedSystemB\x03\xe0\x41\x03H\x01\x12\x1f\n\x15user_specified_system\x18\x12 \x01(\tH\x01\x12L\n\x10gcs_fileset_spec\x18\x06 \x01(\x0b\x32\x30.google.cloud.datacatalog.v1beta1.GcsFilesetSpecH\x02\x12R\n\x13\x62igquery_table_spec\x18\x0c \x01(\x0b\x32\x33.google.cloud.datacatalog.v1beta1.BigQueryTableSpecH\x02\x12_\n\x1a\x62igquery_date_sharded_spec\x18\x0f \x01(\x0b\x32\x39.google.cloud.datacatalog.v1beta1.BigQueryDateShardedSpecH\x02\x12\x14\n\x0c\x64isplay_name\x18\x03 \x01(\t\x12\x13\n\x0b\x64\x65scription\x18\x04 \x01(\t\x12\x38\n\x06schema\x18\x05 
\x01(\x0b\x32(.google.cloud.datacatalog.v1beta1.Schema\x12Y\n\x18source_system_timestamps\x18\x07 \x01(\x0b\x32\x32.google.cloud.datacatalog.v1beta1.SystemTimestampsB\x03\xe0\x41\x03:x\xea\x41u\n datacatalog.googleapis.com/Entry\x12Qprojects/{project}/locations/{location}/entryGroups/{entry_group}/entries/{entry}B\x0c\n\nentry_typeB\x08\n\x06systemB\x0b\n\ttype_spec"\x8e\x02\n\nEntryGroup\x12\x0c\n\x04name\x18\x01 \x01(\t\x12\x14\n\x0c\x64isplay_name\x18\x02 \x01(\t\x12\x13\n\x0b\x64\x65scription\x18\x03 \x01(\t\x12X\n\x17\x64\x61ta_catalog_timestamps\x18\x04 \x01(\x0b\x32\x32.google.cloud.datacatalog.v1beta1.SystemTimestampsB\x03\xe0\x41\x03:m\xea\x41j\n%datacatalog.googleapis.com/EntryGroup\x12\x41projects/{project}/locations/{location}/entryGroups/{entry_group}"\xc2\x01\n\x18\x43reateTagTemplateRequest\x12>\n\x06parent\x18\x01 \x01(\tB.\xe0\x41\x02\xfa\x41(\x12&datacatalog.googleapis.com/TagTemplate\x12\x1c\n\x0ftag_template_id\x18\x03 \x01(\tB\x03\xe0\x41\x02\x12H\n\x0ctag_template\x18\x02 \x01(\x0b\x32-.google.cloud.datacatalog.v1beta1.TagTemplateB\x03\xe0\x41\x02"U\n\x15GetTagTemplateRequest\x12<\n\x04name\x18\x01 \x01(\tB.\xe0\x41\x02\xfa\x41(\n&datacatalog.googleapis.com/TagTemplate"\x95\x01\n\x18UpdateTagTemplateRequest\x12H\n\x0ctag_template\x18\x01 \x01(\x0b\x32-.google.cloud.datacatalog.v1beta1.TagTemplateB\x03\xe0\x41\x02\x12/\n\x0bupdate_mask\x18\x02 \x01(\x0b\x32\x1a.google.protobuf.FieldMask"l\n\x18\x44\x65leteTagTemplateRequest\x12<\n\x04name\x18\x01 \x01(\tB.\xe0\x41\x02\xfa\x41(\n&datacatalog.googleapis.com/TagTemplate\x12\x12\n\x05\x66orce\x18\x02 \x01(\x08\x42\x03\xe0\x41\x02"\x83\x01\n\x10\x43reateTagRequest\x12\x36\n\x06parent\x18\x01 \x01(\tB&\xe0\x41\x02\xfa\x41 \n\x1e\x64\x61tacatalog.googleapis.com/Tag\x12\x37\n\x03tag\x18\x02 \x01(\x0b\x32%.google.cloud.datacatalog.v1beta1.TagB\x03\xe0\x41\x02"|\n\x10UpdateTagRequest\x12\x37\n\x03tag\x18\x01 
\x01(\x0b\x32%.google.cloud.datacatalog.v1beta1.TagB\x03\xe0\x41\x02\x12/\n\x0bupdate_mask\x18\x02 \x01(\x0b\x32\x1a.google.protobuf.FieldMask"H\n\x10\x44\x65leteTagRequest\x12\x34\n\x04name\x18\x01 \x01(\tB&\xe0\x41\x02\xfa\x41 \x12\x1e\x64\x61tacatalog.googleapis.com/Tag"\xd8\x01\n\x1d\x43reateTagTemplateFieldRequest\x12>\n\x06parent\x18\x01 \x01(\tB.\xe0\x41\x02\xfa\x41(\n&datacatalog.googleapis.com/TagTemplate\x12"\n\x15tag_template_field_id\x18\x02 \x01(\tB\x03\xe0\x41\x02\x12S\n\x12tag_template_field\x18\x03 \x01(\x0b\x32\x32.google.cloud.datacatalog.v1beta1.TagTemplateFieldB\x03\xe0\x41\x02"\xed\x01\n\x1dUpdateTagTemplateFieldRequest\x12\x41\n\x04name\x18\x01 \x01(\tB3\xe0\x41\x02\xfa\x41-\n+datacatalog.googleapis.com/TagTemplateField\x12S\n\x12tag_template_field\x18\x02 \x01(\x0b\x32\x32.google.cloud.datacatalog.v1beta1.TagTemplateFieldB\x03\xe0\x41\x02\x12\x34\n\x0bupdate_mask\x18\x03 \x01(\x0b\x32\x1a.google.protobuf.FieldMaskB\x03\xe0\x41\x01"\x8a\x01\n\x1dRenameTagTemplateFieldRequest\x12\x41\n\x04name\x18\x01 \x01(\tB3\xe0\x41\x02\xfa\x41-\n+datacatalog.googleapis.com/TagTemplateField\x12&\n\x19new_tag_template_field_id\x18\x02 \x01(\tB\x03\xe0\x41\x02"v\n\x1d\x44\x65leteTagTemplateFieldRequest\x12\x41\n\x04name\x18\x01 \x01(\tB3\xe0\x41\x02\xfa\x41-\n+datacatalog.googleapis.com/TagTemplateField\x12\x12\n\x05\x66orce\x18\x02 \x01(\x08\x42\x03\xe0\x41\x02"p\n\x0fListTagsRequest\x12\x36\n\x06parent\x18\x01 \x01(\tB&\xe0\x41\x02\xfa\x41 \x12\x1e\x64\x61tacatalog.googleapis.com/Tag\x12\x11\n\tpage_size\x18\x02 \x01(\x05\x12\x12\n\npage_token\x18\x03 \x01(\t"`\n\x10ListTagsResponse\x12\x33\n\x04tags\x18\x01 \x03(\x0b\x32%.google.cloud.datacatalog.v1beta1.Tag\x12\x17\n\x0fnext_page_token\x18\x02 \x01(\t"\xa9\x01\n\x12ListEntriesRequest\x12=\n\x06parent\x18\x01 \x01(\tB-\xe0\x41\x02\xfa\x41\'\n%datacatalog.googleapis.com/EntryGroup\x12\x11\n\tpage_size\x18\x02 \x01(\x05\x12\x12\n\npage_token\x18\x03 \x01(\t\x12-\n\tread_mask\x18\x04 
\x01(\x0b\x32\x1a.google.protobuf.FieldMask"h\n\x13ListEntriesResponse\x12\x38\n\x07\x65ntries\x18\x01 \x03(\x0b\x32\'.google.cloud.datacatalog.v1beta1.Entry\x12\x17\n\x0fnext_page_token\x18\x02 \x01(\t*[\n\tEntryType\x12\x1a\n\x16\x45NTRY_TYPE_UNSPECIFIED\x10\x00\x12\t\n\x05TABLE\x10\x02\x12\t\n\x05MODEL\x10\x05\x12\x0f\n\x0b\x44\x41TA_STREAM\x10\x03\x12\x0b\n\x07\x46ILESET\x10\x04\x32\xbc\x32\n\x0b\x44\x61taCatalog\x12\xb2\x01\n\rSearchCatalog\x12\x36.google.cloud.datacatalog.v1beta1.SearchCatalogRequest\x1a\x37.google.cloud.datacatalog.v1beta1.SearchCatalogResponse"0\x82\xd3\xe4\x93\x02\x1c"\x17/v1beta1/catalog:search:\x01*\xda\x41\x0bscope,query\x12\xea\x01\n\x10\x43reateEntryGroup\x12\x39.google.cloud.datacatalog.v1beta1.CreateEntryGroupRequest\x1a,.google.cloud.datacatalog.v1beta1.EntryGroup"m\x82\xd3\xe4\x93\x02\x43"4/v1beta1/{parent=projects/*/locations/*}/entryGroups:\x0b\x65ntry_group\xda\x41!parent,entry_group_id,entry_group\x12\xfa\x01\n\x10UpdateEntryGroup\x12\x39.google.cloud.datacatalog.v1beta1.UpdateEntryGroupRequest\x1a,.google.cloud.datacatalog.v1beta1.EntryGroup"}\x82\xd3\xe4\x93\x02O2@/v1beta1/{entry_group.name=projects/*/locations/*/entryGroups/*}:\x0b\x65ntry_group\xda\x41\x0b\x65ntry_group\xda\x41\x17\x65ntry_group,update_mask\x12\xcb\x01\n\rGetEntryGroup\x12\x36.google.cloud.datacatalog.v1beta1.GetEntryGroupRequest\x1a,.google.cloud.datacatalog.v1beta1.EntryGroup"T\x82\xd3\xe4\x93\x02\x36\x12\x34/v1beta1/{name=projects/*/locations/*/entryGroups/*}\xda\x41\x04name\xda\x41\x0ename,read_mask\x12\xaa\x01\n\x10\x44\x65leteEntryGroup\x12\x39.google.cloud.datacatalog.v1beta1.DeleteEntryGroupRequest\x1a\x16.google.protobuf.Empty"C\x82\xd3\xe4\x93\x02\x36*4/v1beta1/{name=projects/*/locations/*/entryGroups/*}\xda\x41\x04name\x12\xcd\x01\n\x0fListEntryGroups\x12\x38.google.cloud.datacatalog.v1beta1.ListEntryGroupsRequest\x1a\x39.google.cloud.datacatalog.v1beta1.ListEntryGroupsResponse"E\x82\xd3\xe4\x93\x02\x36\x12\x34/v1beta1/{parent=projects/*/location
s/*}/entryGroups\xda\x41\x06parent\x12\xd3\x01\n\x0b\x43reateEntry\x12\x34.google.cloud.datacatalog.v1beta1.CreateEntryRequest\x1a\'.google.cloud.datacatalog.v1beta1.Entry"e\x82\xd3\xe4\x93\x02G">/v1beta1/{parent=projects/*/locations/*/entryGroups/*}/entries:\x05\x65ntry\xda\x41\x15parent,entry_id,entry\x12\xdd\x01\n\x0bUpdateEntry\x12\x34.google.cloud.datacatalog.v1beta1.UpdateEntryRequest\x1a\'.google.cloud.datacatalog.v1beta1.Entry"o\x82\xd3\xe4\x93\x02M2D/v1beta1/{entry.name=projects/*/locations/*/entryGroups/*/entries/*}:\x05\x65ntry\xda\x41\x05\x65ntry\xda\x41\x11\x65ntry,update_mask\x12\xaa\x01\n\x0b\x44\x65leteEntry\x12\x34.google.cloud.datacatalog.v1beta1.DeleteEntryRequest\x1a\x16.google.protobuf.Empty"M\x82\xd3\xe4\x93\x02@*>/v1beta1/{name=projects/*/locations/*/entryGroups/*/entries/*}\xda\x41\x04name\x12\xb5\x01\n\x08GetEntry\x12\x31.google.cloud.datacatalog.v1beta1.GetEntryRequest\x1a\'.google.cloud.datacatalog.v1beta1.Entry"M\x82\xd3\xe4\x93\x02@\x12>/v1beta1/{name=projects/*/locations/*/entryGroups/*/entries/*}\xda\x41\x04name\x12\x8d\x01\n\x0bLookupEntry\x12\x34.google.cloud.datacatalog.v1beta1.LookupEntryRequest\x1a\'.google.cloud.datacatalog.v1beta1.Entry"\x1f\x82\xd3\xe4\x93\x02\x19\x12\x17/v1beta1/entries:lookup\x12\xcb\x01\n\x0bListEntries\x12\x34.google.cloud.datacatalog.v1beta1.ListEntriesRequest\x1a\x35.google.cloud.datacatalog.v1beta1.ListEntriesResponse"O\x82\xd3\xe4\x93\x02@\x12>/v1beta1/{parent=projects/*/locations/*/entryGroups/*}/entries\xda\x41\x06parent\x12\xf1\x01\n\x11\x43reateTagTemplate\x12:.google.cloud.datacatalog.v1beta1.CreateTagTemplateRequest\x1a-.google.cloud.datacatalog.v1beta1.TagTemplate"q\x82\xd3\xe4\x93\x02\x45"5/v1beta1/{parent=projects/*/locations/*}/tagTemplates:\x0ctag_template\xda\x41#parent,tag_template_id,tag_template\x12\xbe\x01\n\x0eGetTagTemplate\x12\x37.google.cloud.datacatalog.v1beta1.GetTagTemplateRequest\x1a-.google.cloud.datacatalog.v1beta1.TagTemplate"D\x82\xd3\xe4\x93\x02\x37\x12\x35/v1beta1/{name=pro
jects/*/locations/*/tagTemplates/*}\xda\x41\x04name\x12\x83\x02\n\x11UpdateTagTemplate\x12:.google.cloud.datacatalog.v1beta1.UpdateTagTemplateRequest\x1a-.google.cloud.datacatalog.v1beta1.TagTemplate"\x82\x01\x82\xd3\xe4\x93\x02R2B/v1beta1/{tag_template.name=projects/*/locations/*/tagTemplates/*}:\x0ctag_template\xda\x41\x0ctag_template\xda\x41\x18tag_template,update_mask\x12\xb3\x01\n\x11\x44\x65leteTagTemplate\x12:.google.cloud.datacatalog.v1beta1.DeleteTagTemplateRequest\x1a\x16.google.protobuf.Empty"J\x82\xd3\xe4\x93\x02\x37*5/v1beta1/{name=projects/*/locations/*/tagTemplates/*}\xda\x41\nname,force\x12\x9c\x02\n\x16\x43reateTagTemplateField\x12?.google.cloud.datacatalog.v1beta1.CreateTagTemplateFieldRequest\x1a\x32.google.cloud.datacatalog.v1beta1.TagTemplateField"\x8c\x01\x82\xd3\xe4\x93\x02T">/v1beta1/{parent=projects/*/locations/*/tagTemplates/*}/fields:\x12tag_template_field\xda\x41/parent,tag_template_field_id,tag_template_field\x12\xaa\x02\n\x16UpdateTagTemplateField\x12?.google.cloud.datacatalog.v1beta1.UpdateTagTemplateFieldRequest\x1a\x32.google.cloud.datacatalog.v1beta1.TagTemplateField"\x9a\x01\x82\xd3\xe4\x93\x02T2>/v1beta1/{name=projects/*/locations/*/tagTemplates/*/fields/*}:\x12tag_template_field\xda\x41\x17name,tag_template_field\xda\x41#name,tag_template_field,update_mask\x12\x80\x02\n\x16RenameTagTemplateField\x12?.google.cloud.datacatalog.v1beta1.RenameTagTemplateFieldRequest\x1a\x32.google.cloud.datacatalog.v1beta1.TagTemplateField"q\x82\xd3\xe4\x93\x02J"E/v1beta1/{name=projects/*/locations/*/tagTemplates/*/fields/*}:rename:\x01*\xda\x41\x1ename,new_tag_template_field_id\x12\xc6\x01\n\x16\x44\x65leteTagTemplateField\x12?.google.cloud.datacatalog.v1beta1.DeleteTagTemplateFieldRequest\x1a\x16.google.protobuf.Empty"S\x82\xd3\xe4\x93\x02@*>/v1beta1/{name=projects/*/locations/*/tagTemplates/*/fields/*}\xda\x41\nname,force\x12\x8d\x02\n\tCreateTag\x12\x32.google.cloud.datacatalog.v1beta1.CreateTagRequest\x1a%.google.cloud.datacatalog.v1beta1.Tag"\x
a4\x01\x82\xd3\xe4\x93\x02\x90\x01"E/v1beta1/{parent=projects/*/locations/*/entryGroups/*/entries/*}/tags:\x03tagZB";/v1beta1/{parent=projects/*/locations/*/entryGroups/*}/tags:\x03tag\xda\x41\nparent,tag\x12\xa0\x02\n\tUpdateTag\x12\x32.google.cloud.datacatalog.v1beta1.UpdateTagRequest\x1a%.google.cloud.datacatalog.v1beta1.Tag"\xb7\x01\x82\xd3\xe4\x93\x02\x98\x01\x32I/v1beta1/{tag.name=projects/*/locations/*/entryGroups/*/entries/*/tags/*}:\x03tagZF2?/v1beta1/{tag.name=projects/*/locations/*/entryGroups/*/tags/*}:\x03tag\xda\x41\x03tag\xda\x41\x0ftag,update_mask\x12\xee\x01\n\tDeleteTag\x12\x32.google.cloud.datacatalog.v1beta1.DeleteTagRequest\x1a\x16.google.protobuf.Empty"\x94\x01\x82\xd3\xe4\x93\x02\x86\x01*E/v1beta1/{name=projects/*/locations/*/entryGroups/*/entries/*/tags/*}Z=*;/v1beta1/{name=projects/*/locations/*/entryGroups/*/tags/*}\xda\x41\x04name\x12\x8a\x02\n\x08ListTags\x12\x31.google.cloud.datacatalog.v1beta1.ListTagsRequest\x1a\x32.google.cloud.datacatalog.v1beta1.ListTagsResponse"\x96\x01\x82\xd3\xe4\x93\x02\x86\x01\x12\x45/v1beta1/{parent=projects/*/locations/*/entryGroups/*/entries/*}/tagsZ=\x12;/v1beta1/{parent=projects/*/locations/*/entryGroups/*}/tags\xda\x41\x06parent\x12\xfc\x01\n\x0cSetIamPolicy\x12".google.iam.v1.SetIamPolicyRequest\x1a\x15.google.iam.v1.Policy"\xb0\x01\x82\xd3\xe4\x93\x02\x97\x01"F/v1beta1/{resource=projects/*/locations/*/tagTemplates/*}:setIamPolicy:\x01*ZJ"E/v1beta1/{resource=projects/*/locations/*/entryGroups/*}:setIamPolicy:\x01*\xda\x41\x0fresource,policy\x12\xcb\x02\n\x0cGetIamPolicy\x12".google.iam.v1.GetIamPolicyRequest\x1a\x15.google.iam.v1.Policy"\xff\x01\x82\xd3\xe4\x93\x02\xed\x01"F/v1beta1/{resource=projects/*/locations/*/tagTemplates/*}:getIamPolicy:\x01*ZJ"E/v1beta1/{resource=projects/*/locations/*/entryGroups/*}:getIamPolicy:\x01*ZT"O/v1beta1/{resource=projects/*/locations/*/entryGroups/*/entries/*}:getIamPolicy:\x01*\xda\x41\x08resource\x12\xf2\x02\n\x12TestIamPermissions\x12(.google.iam.v1.TestIamPermissio
nsRequest\x1a).google.iam.v1.TestIamPermissionsResponse"\x86\x02\x82\xd3\xe4\x93\x02\xff\x01"L/v1beta1/{resource=projects/*/locations/*/tagTemplates/*}:testIamPermissions:\x01*ZP"K/v1beta1/{resource=projects/*/locations/*/entryGroups/*}:testIamPermissions:\x01*ZZ"U/v1beta1/{resource=projects/*/locations/*/entryGroups/*/entries/*}:testIamPermissions:\x01*\x1aN\xca\x41\x1a\x64\x61tacatalog.googleapis.com\xd2\x41.https://www.googleapis.com/auth/cloud-platformB\xe4\x01\n$com.google.cloud.datacatalog.v1beta1P\x01ZKgoogle.golang.org/genproto/googleapis/cloud/datacatalog/v1beta1;datacatalog\xf8\x01\x01\xaa\x02 Google.Cloud.DataCatalog.V1Beta1\xca\x02 Google\\Cloud\\DataCatalog\\V1beta1\xea\x02#Google::Cloud::DataCatalog::V1beta1b\x06proto3', - dependencies=[ - google_dot_api_dot_annotations__pb2.DESCRIPTOR, - google_dot_api_dot_client__pb2.DESCRIPTOR, - google_dot_api_dot_field__behavior__pb2.DESCRIPTOR, - google_dot_api_dot_resource__pb2.DESCRIPTOR, - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_common__pb2.DESCRIPTOR, - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_gcs__fileset__spec__pb2.DESCRIPTOR, - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_schema__pb2.DESCRIPTOR, - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_search__pb2.DESCRIPTOR, - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_table__spec__pb2.DESCRIPTOR, - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_tags__pb2.DESCRIPTOR, - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_timestamps__pb2.DESCRIPTOR, - google_dot_iam_dot_v1_dot_iam__policy__pb2.DESCRIPTOR, - google_dot_iam_dot_v1_dot_policy__pb2.DESCRIPTOR, - google_dot_protobuf_dot_empty__pb2.DESCRIPTOR, - google_dot_protobuf_dot_field__mask__pb2.DESCRIPTOR, - ], -) - -_ENTRYTYPE = _descriptor.EnumDescriptor( - name="EntryType", - full_name="google.cloud.datacatalog.v1beta1.EntryType", - filename=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - values=[ - 
_descriptor.EnumValueDescriptor( - name="ENTRY_TYPE_UNSPECIFIED", - index=0, - number=0, - serialized_options=None, - type=None, - create_key=_descriptor._internal_create_key, - ), - _descriptor.EnumValueDescriptor( - name="TABLE", - index=1, - number=2, - serialized_options=None, - type=None, - create_key=_descriptor._internal_create_key, - ), - _descriptor.EnumValueDescriptor( - name="MODEL", - index=2, - number=5, - serialized_options=None, - type=None, - create_key=_descriptor._internal_create_key, - ), - _descriptor.EnumValueDescriptor( - name="DATA_STREAM", - index=3, - number=3, - serialized_options=None, - type=None, - create_key=_descriptor._internal_create_key, - ), - _descriptor.EnumValueDescriptor( - name="FILESET", - index=4, - number=4, - serialized_options=None, - type=None, - create_key=_descriptor._internal_create_key, - ), - ], - containing_type=None, - serialized_options=None, - serialized_start=5787, - serialized_end=5878, -) -_sym_db.RegisterEnumDescriptor(_ENTRYTYPE) - -EntryType = enum_type_wrapper.EnumTypeWrapper(_ENTRYTYPE) -ENTRY_TYPE_UNSPECIFIED = 0 -TABLE = 2 -MODEL = 5 -DATA_STREAM = 3 -FILESET = 4 - - -_SEARCHCATALOGREQUEST_SCOPE = _descriptor.Descriptor( - name="Scope", - full_name="google.cloud.datacatalog.v1beta1.SearchCatalogRequest.Scope", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="include_org_ids", - full_name="google.cloud.datacatalog.v1beta1.SearchCatalogRequest.Scope.include_org_ids", - index=0, - number=2, - type=9, - cpp_type=9, - label=3, - has_default_value=False, - default_value=[], - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="include_project_ids", - 
full_name="google.cloud.datacatalog.v1beta1.SearchCatalogRequest.Scope.include_project_ids", - index=1, - number=3, - type=9, - cpp_type=9, - label=3, - has_default_value=False, - default_value=[], - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="include_gcp_public_datasets", - full_name="google.cloud.datacatalog.v1beta1.SearchCatalogRequest.Scope.include_gcp_public_datasets", - index=2, - number=7, - type=8, - cpp_type=7, - label=1, - has_default_value=False, - default_value=False, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=908, - serialized_end=1006, -) - -_SEARCHCATALOGREQUEST = _descriptor.Descriptor( - name="SearchCatalogRequest", - full_name="google.cloud.datacatalog.v1beta1.SearchCatalogRequest", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="scope", - full_name="google.cloud.datacatalog.v1beta1.SearchCatalogRequest.scope", - index=0, - number=6, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="query", - full_name="google.cloud.datacatalog.v1beta1.SearchCatalogRequest.query", - index=1, - number=1, - type=9, - cpp_type=9, - label=1, 
- has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="page_size", - full_name="google.cloud.datacatalog.v1beta1.SearchCatalogRequest.page_size", - index=2, - number=2, - type=5, - cpp_type=1, - label=1, - has_default_value=False, - default_value=0, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="page_token", - full_name="google.cloud.datacatalog.v1beta1.SearchCatalogRequest.page_token", - index=3, - number=3, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\001", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="order_by", - full_name="google.cloud.datacatalog.v1beta1.SearchCatalogRequest.order_by", - index=4, - number=5, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[_SEARCHCATALOGREQUEST_SCOPE], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=720, - serialized_end=1006, -) - - -_SEARCHCATALOGRESPONSE = _descriptor.Descriptor( - name="SearchCatalogResponse", - 
full_name="google.cloud.datacatalog.v1beta1.SearchCatalogResponse", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="results", - full_name="google.cloud.datacatalog.v1beta1.SearchCatalogResponse.results", - index=0, - number=1, - type=11, - cpp_type=10, - label=3, - has_default_value=False, - default_value=[], - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="next_page_token", - full_name="google.cloud.datacatalog.v1beta1.SearchCatalogResponse.next_page_token", - index=1, - number=3, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=1008, - serialized_end=1128, -) - - -_CREATEENTRYGROUPREQUEST = _descriptor.Descriptor( - name="CreateEntryGroupRequest", - full_name="google.cloud.datacatalog.v1beta1.CreateEntryGroupRequest", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="parent", - full_name="google.cloud.datacatalog.v1beta1.CreateEntryGroupRequest.parent", - index=0, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - 
serialized_options=b"\340A\002\372A'\022%datacatalog.googleapis.com/EntryGroup", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="entry_group_id", - full_name="google.cloud.datacatalog.v1beta1.CreateEntryGroupRequest.entry_group_id", - index=1, - number=3, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="entry_group", - full_name="google.cloud.datacatalog.v1beta1.CreateEntryGroupRequest.entry_group", - index=2, - number=2, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=1131, - serialized_end=1315, -) - - -_UPDATEENTRYGROUPREQUEST = _descriptor.Descriptor( - name="UpdateEntryGroupRequest", - full_name="google.cloud.datacatalog.v1beta1.UpdateEntryGroupRequest", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="entry_group", - full_name="google.cloud.datacatalog.v1beta1.UpdateEntryGroupRequest.entry_group", - index=0, - number=1, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002", - file=DESCRIPTOR, - 
create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="update_mask", - full_name="google.cloud.datacatalog.v1beta1.UpdateEntryGroupRequest.update_mask", - index=1, - number=2, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=1318, - serialized_end=1464, -) - - -_GETENTRYGROUPREQUEST = _descriptor.Descriptor( - name="GetEntryGroupRequest", - full_name="google.cloud.datacatalog.v1beta1.GetEntryGroupRequest", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="name", - full_name="google.cloud.datacatalog.v1beta1.GetEntryGroupRequest.name", - index=0, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002\372A'\n%datacatalog.googleapis.com/EntryGroup", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="read_mask", - full_name="google.cloud.datacatalog.v1beta1.GetEntryGroupRequest.read_mask", - index=1, - number=2, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - 
serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=1467, - serialized_end=1597, -) - - -_DELETEENTRYGROUPREQUEST = _descriptor.Descriptor( - name="DeleteEntryGroupRequest", - full_name="google.cloud.datacatalog.v1beta1.DeleteEntryGroupRequest", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="name", - full_name="google.cloud.datacatalog.v1beta1.DeleteEntryGroupRequest.name", - index=0, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002\372A'\n%datacatalog.googleapis.com/EntryGroup", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="force", - full_name="google.cloud.datacatalog.v1beta1.DeleteEntryGroupRequest.force", - index=1, - number=2, - type=8, - cpp_type=7, - label=1, - has_default_value=False, - default_value=False, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\001", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=1599, - serialized_end=1705, -) - - -_LISTENTRYGROUPSREQUEST = _descriptor.Descriptor( - name="ListEntryGroupsRequest", - full_name="google.cloud.datacatalog.v1beta1.ListEntryGroupsRequest", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="parent", - 
full_name="google.cloud.datacatalog.v1beta1.ListEntryGroupsRequest.parent", - index=0, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002\372A'\n%datacatalog.googleapis.com/EntryGroup", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="page_size", - full_name="google.cloud.datacatalog.v1beta1.ListEntryGroupsRequest.page_size", - index=1, - number=2, - type=5, - cpp_type=1, - label=1, - has_default_value=False, - default_value=0, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\001", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="page_token", - full_name="google.cloud.datacatalog.v1beta1.ListEntryGroupsRequest.page_token", - index=2, - number=3, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\001", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=1708, - serialized_end=1844, -) - - -_LISTENTRYGROUPSRESPONSE = _descriptor.Descriptor( - name="ListEntryGroupsResponse", - full_name="google.cloud.datacatalog.v1beta1.ListEntryGroupsResponse", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="entry_groups", - 
full_name="google.cloud.datacatalog.v1beta1.ListEntryGroupsResponse.entry_groups", - index=0, - number=1, - type=11, - cpp_type=10, - label=3, - has_default_value=False, - default_value=[], - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="next_page_token", - full_name="google.cloud.datacatalog.v1beta1.ListEntryGroupsResponse.next_page_token", - index=1, - number=2, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=1846, - serialized_end=1964, -) - - -_CREATEENTRYREQUEST = _descriptor.Descriptor( - name="CreateEntryRequest", - full_name="google.cloud.datacatalog.v1beta1.CreateEntryRequest", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="parent", - full_name="google.cloud.datacatalog.v1beta1.CreateEntryRequest.parent", - index=0, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002\372A'\n%datacatalog.googleapis.com/EntryGroup", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="entry_id", - full_name="google.cloud.datacatalog.v1beta1.CreateEntryRequest.entry_id", - index=1, - 
number=3, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="entry", - full_name="google.cloud.datacatalog.v1beta1.CreateEntryRequest.entry", - index=2, - number=2, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=1967, - serialized_end=2134, -) - - -_UPDATEENTRYREQUEST = _descriptor.Descriptor( - name="UpdateEntryRequest", - full_name="google.cloud.datacatalog.v1beta1.UpdateEntryRequest", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="entry", - full_name="google.cloud.datacatalog.v1beta1.UpdateEntryRequest.entry", - index=0, - number=1, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="update_mask", - full_name="google.cloud.datacatalog.v1beta1.UpdateEntryRequest.update_mask", - index=1, - number=2, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - 
is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=2137, - serialized_end=2267, -) - - -_DELETEENTRYREQUEST = _descriptor.Descriptor( - name="DeleteEntryRequest", - full_name="google.cloud.datacatalog.v1beta1.DeleteEntryRequest", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="name", - full_name="google.cloud.datacatalog.v1beta1.DeleteEntryRequest.name", - index=0, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b'\340A\002\372A"\n datacatalog.googleapis.com/Entry', - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ) - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=2269, - serialized_end=2345, -) - - -_GETENTRYREQUEST = _descriptor.Descriptor( - name="GetEntryRequest", - full_name="google.cloud.datacatalog.v1beta1.GetEntryRequest", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="name", - full_name="google.cloud.datacatalog.v1beta1.GetEntryRequest.name", - index=0, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b'\340A\002\372A"\n datacatalog.googleapis.com/Entry', - 
file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ) - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=2347, - serialized_end=2420, -) - - -_LOOKUPENTRYREQUEST = _descriptor.Descriptor( - name="LookupEntryRequest", - full_name="google.cloud.datacatalog.v1beta1.LookupEntryRequest", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="linked_resource", - full_name="google.cloud.datacatalog.v1beta1.LookupEntryRequest.linked_resource", - index=0, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="sql_resource", - full_name="google.cloud.datacatalog.v1beta1.LookupEntryRequest.sql_resource", - index=1, - number=3, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[ - _descriptor.OneofDescriptor( - name="target_name", - full_name="google.cloud.datacatalog.v1beta1.LookupEntryRequest.target_name", - index=0, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[], - ) - ], - serialized_start=2422, - serialized_end=2508, -) - - -_ENTRY = _descriptor.Descriptor( - 
name="Entry", - full_name="google.cloud.datacatalog.v1beta1.Entry", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="name", - full_name="google.cloud.datacatalog.v1beta1.Entry.name", - index=0, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\372A'\n%datacatalog.googleapis.com/EntryGroup", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="linked_resource", - full_name="google.cloud.datacatalog.v1beta1.Entry.linked_resource", - index=1, - number=9, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="type", - full_name="google.cloud.datacatalog.v1beta1.Entry.type", - index=2, - number=2, - type=14, - cpp_type=8, - label=1, - has_default_value=False, - default_value=0, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="user_specified_type", - full_name="google.cloud.datacatalog.v1beta1.Entry.user_specified_type", - index=3, - number=16, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - 
create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="integrated_system", - full_name="google.cloud.datacatalog.v1beta1.Entry.integrated_system", - index=4, - number=17, - type=14, - cpp_type=8, - label=1, - has_default_value=False, - default_value=0, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\003", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="user_specified_system", - full_name="google.cloud.datacatalog.v1beta1.Entry.user_specified_system", - index=5, - number=18, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="gcs_fileset_spec", - full_name="google.cloud.datacatalog.v1beta1.Entry.gcs_fileset_spec", - index=6, - number=6, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="bigquery_table_spec", - full_name="google.cloud.datacatalog.v1beta1.Entry.bigquery_table_spec", - index=7, - number=12, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="bigquery_date_sharded_spec", - 
full_name="google.cloud.datacatalog.v1beta1.Entry.bigquery_date_sharded_spec", - index=8, - number=15, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="display_name", - full_name="google.cloud.datacatalog.v1beta1.Entry.display_name", - index=9, - number=3, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="description", - full_name="google.cloud.datacatalog.v1beta1.Entry.description", - index=10, - number=4, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="schema", - full_name="google.cloud.datacatalog.v1beta1.Entry.schema", - index=11, - number=5, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="source_system_timestamps", - full_name="google.cloud.datacatalog.v1beta1.Entry.source_system_timestamps", - index=12, - number=7, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - 
enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\003", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=b"\352Au\n datacatalog.googleapis.com/Entry\022Qprojects/{project}/locations/{location}/entryGroups/{entry_group}/entries/{entry}", - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[ - _descriptor.OneofDescriptor( - name="entry_type", - full_name="google.cloud.datacatalog.v1beta1.Entry.entry_type", - index=0, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[], - ), - _descriptor.OneofDescriptor( - name="system", - full_name="google.cloud.datacatalog.v1beta1.Entry.system", - index=1, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[], - ), - _descriptor.OneofDescriptor( - name="type_spec", - full_name="google.cloud.datacatalog.v1beta1.Entry.type_spec", - index=2, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[], - ), - ], - serialized_start=2511, - serialized_end=3422, -) - - -_ENTRYGROUP = _descriptor.Descriptor( - name="EntryGroup", - full_name="google.cloud.datacatalog.v1beta1.EntryGroup", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="name", - full_name="google.cloud.datacatalog.v1beta1.EntryGroup.name", - index=0, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="display_name", - full_name="google.cloud.datacatalog.v1beta1.EntryGroup.display_name", - index=1, - 
number=2, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="description", - full_name="google.cloud.datacatalog.v1beta1.EntryGroup.description", - index=2, - number=3, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="data_catalog_timestamps", - full_name="google.cloud.datacatalog.v1beta1.EntryGroup.data_catalog_timestamps", - index=3, - number=4, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\003", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=b"\352Aj\n%datacatalog.googleapis.com/EntryGroup\022Aprojects/{project}/locations/{location}/entryGroups/{entry_group}", - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=3425, - serialized_end=3695, -) - - -_CREATETAGTEMPLATEREQUEST = _descriptor.Descriptor( - name="CreateTagTemplateRequest", - full_name="google.cloud.datacatalog.v1beta1.CreateTagTemplateRequest", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="parent", - full_name="google.cloud.datacatalog.v1beta1.CreateTagTemplateRequest.parent", - index=0, - 
number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002\372A(\022&datacatalog.googleapis.com/TagTemplate", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="tag_template_id", - full_name="google.cloud.datacatalog.v1beta1.CreateTagTemplateRequest.tag_template_id", - index=1, - number=3, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="tag_template", - full_name="google.cloud.datacatalog.v1beta1.CreateTagTemplateRequest.tag_template", - index=2, - number=2, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=3698, - serialized_end=3892, -) - - -_GETTAGTEMPLATEREQUEST = _descriptor.Descriptor( - name="GetTagTemplateRequest", - full_name="google.cloud.datacatalog.v1beta1.GetTagTemplateRequest", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="name", - full_name="google.cloud.datacatalog.v1beta1.GetTagTemplateRequest.name", - index=0, - number=1, - type=9, - cpp_type=9, - label=1, - 
has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002\372A(\n&datacatalog.googleapis.com/TagTemplate", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ) - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=3894, - serialized_end=3979, -) - - -_UPDATETAGTEMPLATEREQUEST = _descriptor.Descriptor( - name="UpdateTagTemplateRequest", - full_name="google.cloud.datacatalog.v1beta1.UpdateTagTemplateRequest", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="tag_template", - full_name="google.cloud.datacatalog.v1beta1.UpdateTagTemplateRequest.tag_template", - index=0, - number=1, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="update_mask", - full_name="google.cloud.datacatalog.v1beta1.UpdateTagTemplateRequest.update_mask", - index=1, - number=2, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=3982, - serialized_end=4131, -) - - -_DELETETAGTEMPLATEREQUEST = _descriptor.Descriptor( - 
name="DeleteTagTemplateRequest", - full_name="google.cloud.datacatalog.v1beta1.DeleteTagTemplateRequest", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="name", - full_name="google.cloud.datacatalog.v1beta1.DeleteTagTemplateRequest.name", - index=0, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002\372A(\n&datacatalog.googleapis.com/TagTemplate", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="force", - full_name="google.cloud.datacatalog.v1beta1.DeleteTagTemplateRequest.force", - index=1, - number=2, - type=8, - cpp_type=7, - label=1, - has_default_value=False, - default_value=False, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=4133, - serialized_end=4241, -) - - -_CREATETAGREQUEST = _descriptor.Descriptor( - name="CreateTagRequest", - full_name="google.cloud.datacatalog.v1beta1.CreateTagRequest", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="parent", - full_name="google.cloud.datacatalog.v1beta1.CreateTagRequest.parent", - index=0, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - 
is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002\372A \n\036datacatalog.googleapis.com/Tag", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="tag", - full_name="google.cloud.datacatalog.v1beta1.CreateTagRequest.tag", - index=1, - number=2, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=4244, - serialized_end=4375, -) - - -_UPDATETAGREQUEST = _descriptor.Descriptor( - name="UpdateTagRequest", - full_name="google.cloud.datacatalog.v1beta1.UpdateTagRequest", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="tag", - full_name="google.cloud.datacatalog.v1beta1.UpdateTagRequest.tag", - index=0, - number=1, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="update_mask", - full_name="google.cloud.datacatalog.v1beta1.UpdateTagRequest.update_mask", - index=1, - number=2, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - 
extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=4377, - serialized_end=4501, -) - - -_DELETETAGREQUEST = _descriptor.Descriptor( - name="DeleteTagRequest", - full_name="google.cloud.datacatalog.v1beta1.DeleteTagRequest", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="name", - full_name="google.cloud.datacatalog.v1beta1.DeleteTagRequest.name", - index=0, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002\372A \022\036datacatalog.googleapis.com/Tag", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ) - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=4503, - serialized_end=4575, -) - - -_CREATETAGTEMPLATEFIELDREQUEST = _descriptor.Descriptor( - name="CreateTagTemplateFieldRequest", - full_name="google.cloud.datacatalog.v1beta1.CreateTagTemplateFieldRequest", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="parent", - full_name="google.cloud.datacatalog.v1beta1.CreateTagTemplateFieldRequest.parent", - index=0, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002\372A(\n&datacatalog.googleapis.com/TagTemplate", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - 
_descriptor.FieldDescriptor( - name="tag_template_field_id", - full_name="google.cloud.datacatalog.v1beta1.CreateTagTemplateFieldRequest.tag_template_field_id", - index=1, - number=2, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="tag_template_field", - full_name="google.cloud.datacatalog.v1beta1.CreateTagTemplateFieldRequest.tag_template_field", - index=2, - number=3, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=4578, - serialized_end=4794, -) - - -_UPDATETAGTEMPLATEFIELDREQUEST = _descriptor.Descriptor( - name="UpdateTagTemplateFieldRequest", - full_name="google.cloud.datacatalog.v1beta1.UpdateTagTemplateFieldRequest", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="name", - full_name="google.cloud.datacatalog.v1beta1.UpdateTagTemplateFieldRequest.name", - index=0, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002\372A-\n+datacatalog.googleapis.com/TagTemplateField", - file=DESCRIPTOR, - 
create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="tag_template_field", - full_name="google.cloud.datacatalog.v1beta1.UpdateTagTemplateFieldRequest.tag_template_field", - index=1, - number=2, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="update_mask", - full_name="google.cloud.datacatalog.v1beta1.UpdateTagTemplateFieldRequest.update_mask", - index=2, - number=3, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\001", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=4797, - serialized_end=5034, -) - - -_RENAMETAGTEMPLATEFIELDREQUEST = _descriptor.Descriptor( - name="RenameTagTemplateFieldRequest", - full_name="google.cloud.datacatalog.v1beta1.RenameTagTemplateFieldRequest", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="name", - full_name="google.cloud.datacatalog.v1beta1.RenameTagTemplateFieldRequest.name", - index=0, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002\372A-\n+datacatalog.googleapis.com/TagTemplateField", - file=DESCRIPTOR, - 
create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="new_tag_template_field_id", - full_name="google.cloud.datacatalog.v1beta1.RenameTagTemplateFieldRequest.new_tag_template_field_id", - index=1, - number=2, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=5037, - serialized_end=5175, -) - - -_DELETETAGTEMPLATEFIELDREQUEST = _descriptor.Descriptor( - name="DeleteTagTemplateFieldRequest", - full_name="google.cloud.datacatalog.v1beta1.DeleteTagTemplateFieldRequest", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="name", - full_name="google.cloud.datacatalog.v1beta1.DeleteTagTemplateFieldRequest.name", - index=0, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002\372A-\n+datacatalog.googleapis.com/TagTemplateField", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="force", - full_name="google.cloud.datacatalog.v1beta1.DeleteTagTemplateFieldRequest.force", - index=1, - number=2, - type=8, - cpp_type=7, - label=1, - has_default_value=False, - default_value=False, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002", - file=DESCRIPTOR, - 
create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=5177, - serialized_end=5295, -) - - -_LISTTAGSREQUEST = _descriptor.Descriptor( - name="ListTagsRequest", - full_name="google.cloud.datacatalog.v1beta1.ListTagsRequest", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="parent", - full_name="google.cloud.datacatalog.v1beta1.ListTagsRequest.parent", - index=0, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002\372A \022\036datacatalog.googleapis.com/Tag", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="page_size", - full_name="google.cloud.datacatalog.v1beta1.ListTagsRequest.page_size", - index=1, - number=2, - type=5, - cpp_type=1, - label=1, - has_default_value=False, - default_value=0, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="page_token", - full_name="google.cloud.datacatalog.v1beta1.ListTagsRequest.page_token", - index=2, - number=3, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - 
is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=5297, - serialized_end=5409, -) - - -_LISTTAGSRESPONSE = _descriptor.Descriptor( - name="ListTagsResponse", - full_name="google.cloud.datacatalog.v1beta1.ListTagsResponse", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="tags", - full_name="google.cloud.datacatalog.v1beta1.ListTagsResponse.tags", - index=0, - number=1, - type=11, - cpp_type=10, - label=3, - has_default_value=False, - default_value=[], - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="next_page_token", - full_name="google.cloud.datacatalog.v1beta1.ListTagsResponse.next_page_token", - index=1, - number=2, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=5411, - serialized_end=5507, -) - - -_LISTENTRIESREQUEST = _descriptor.Descriptor( - name="ListEntriesRequest", - full_name="google.cloud.datacatalog.v1beta1.ListEntriesRequest", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="parent", - full_name="google.cloud.datacatalog.v1beta1.ListEntriesRequest.parent", - index=0, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - 
default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002\372A'\n%datacatalog.googleapis.com/EntryGroup", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="page_size", - full_name="google.cloud.datacatalog.v1beta1.ListEntriesRequest.page_size", - index=1, - number=2, - type=5, - cpp_type=1, - label=1, - has_default_value=False, - default_value=0, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="page_token", - full_name="google.cloud.datacatalog.v1beta1.ListEntriesRequest.page_token", - index=2, - number=3, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="read_mask", - full_name="google.cloud.datacatalog.v1beta1.ListEntriesRequest.read_mask", - index=3, - number=4, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=5510, - serialized_end=5679, -) - - -_LISTENTRIESRESPONSE = _descriptor.Descriptor( - name="ListEntriesResponse", - 
full_name="google.cloud.datacatalog.v1beta1.ListEntriesResponse", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="entries", - full_name="google.cloud.datacatalog.v1beta1.ListEntriesResponse.entries", - index=0, - number=1, - type=11, - cpp_type=10, - label=3, - has_default_value=False, - default_value=[], - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="next_page_token", - full_name="google.cloud.datacatalog.v1beta1.ListEntriesResponse.next_page_token", - index=1, - number=2, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=5681, - serialized_end=5785, -) - -_SEARCHCATALOGREQUEST_SCOPE.containing_type = _SEARCHCATALOGREQUEST -_SEARCHCATALOGREQUEST.fields_by_name["scope"].message_type = _SEARCHCATALOGREQUEST_SCOPE -_SEARCHCATALOGRESPONSE.fields_by_name[ - "results" -].message_type = ( - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_search__pb2._SEARCHCATALOGRESULT -) -_CREATEENTRYGROUPREQUEST.fields_by_name["entry_group"].message_type = _ENTRYGROUP -_UPDATEENTRYGROUPREQUEST.fields_by_name["entry_group"].message_type = _ENTRYGROUP -_UPDATEENTRYGROUPREQUEST.fields_by_name[ - "update_mask" -].message_type = google_dot_protobuf_dot_field__mask__pb2._FIELDMASK -_GETENTRYGROUPREQUEST.fields_by_name[ - "read_mask" -].message_type = 
google_dot_protobuf_dot_field__mask__pb2._FIELDMASK -_LISTENTRYGROUPSRESPONSE.fields_by_name["entry_groups"].message_type = _ENTRYGROUP -_CREATEENTRYREQUEST.fields_by_name["entry"].message_type = _ENTRY -_UPDATEENTRYREQUEST.fields_by_name["entry"].message_type = _ENTRY -_UPDATEENTRYREQUEST.fields_by_name[ - "update_mask" -].message_type = google_dot_protobuf_dot_field__mask__pb2._FIELDMASK -_LOOKUPENTRYREQUEST.oneofs_by_name["target_name"].fields.append( - _LOOKUPENTRYREQUEST.fields_by_name["linked_resource"] -) -_LOOKUPENTRYREQUEST.fields_by_name[ - "linked_resource" -].containing_oneof = _LOOKUPENTRYREQUEST.oneofs_by_name["target_name"] -_LOOKUPENTRYREQUEST.oneofs_by_name["target_name"].fields.append( - _LOOKUPENTRYREQUEST.fields_by_name["sql_resource"] -) -_LOOKUPENTRYREQUEST.fields_by_name[ - "sql_resource" -].containing_oneof = _LOOKUPENTRYREQUEST.oneofs_by_name["target_name"] -_ENTRY.fields_by_name["type"].enum_type = _ENTRYTYPE -_ENTRY.fields_by_name[ - "integrated_system" -].enum_type = ( - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_common__pb2._INTEGRATEDSYSTEM -) -_ENTRY.fields_by_name[ - "gcs_fileset_spec" -].message_type = ( - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_gcs__fileset__spec__pb2._GCSFILESETSPEC -) -_ENTRY.fields_by_name[ - "bigquery_table_spec" -].message_type = ( - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_table__spec__pb2._BIGQUERYTABLESPEC -) -_ENTRY.fields_by_name[ - "bigquery_date_sharded_spec" -].message_type = ( - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_table__spec__pb2._BIGQUERYDATESHARDEDSPEC -) -_ENTRY.fields_by_name[ - "schema" -].message_type = ( - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_schema__pb2._SCHEMA -) -_ENTRY.fields_by_name[ - "source_system_timestamps" -].message_type = ( - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_timestamps__pb2._SYSTEMTIMESTAMPS -) -_ENTRY.oneofs_by_name["entry_type"].fields.append(_ENTRY.fields_by_name["type"]) 
-_ENTRY.fields_by_name["type"].containing_oneof = _ENTRY.oneofs_by_name["entry_type"] -_ENTRY.oneofs_by_name["entry_type"].fields.append( - _ENTRY.fields_by_name["user_specified_type"] -) -_ENTRY.fields_by_name["user_specified_type"].containing_oneof = _ENTRY.oneofs_by_name[ - "entry_type" -] -_ENTRY.oneofs_by_name["system"].fields.append( - _ENTRY.fields_by_name["integrated_system"] -) -_ENTRY.fields_by_name["integrated_system"].containing_oneof = _ENTRY.oneofs_by_name[ - "system" -] -_ENTRY.oneofs_by_name["system"].fields.append( - _ENTRY.fields_by_name["user_specified_system"] -) -_ENTRY.fields_by_name["user_specified_system"].containing_oneof = _ENTRY.oneofs_by_name[ - "system" -] -_ENTRY.oneofs_by_name["type_spec"].fields.append( - _ENTRY.fields_by_name["gcs_fileset_spec"] -) -_ENTRY.fields_by_name["gcs_fileset_spec"].containing_oneof = _ENTRY.oneofs_by_name[ - "type_spec" -] -_ENTRY.oneofs_by_name["type_spec"].fields.append( - _ENTRY.fields_by_name["bigquery_table_spec"] -) -_ENTRY.fields_by_name["bigquery_table_spec"].containing_oneof = _ENTRY.oneofs_by_name[ - "type_spec" -] -_ENTRY.oneofs_by_name["type_spec"].fields.append( - _ENTRY.fields_by_name["bigquery_date_sharded_spec"] -) -_ENTRY.fields_by_name[ - "bigquery_date_sharded_spec" -].containing_oneof = _ENTRY.oneofs_by_name["type_spec"] -_ENTRYGROUP.fields_by_name[ - "data_catalog_timestamps" -].message_type = ( - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_timestamps__pb2._SYSTEMTIMESTAMPS -) -_CREATETAGTEMPLATEREQUEST.fields_by_name[ - "tag_template" -].message_type = ( - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_tags__pb2._TAGTEMPLATE -) -_UPDATETAGTEMPLATEREQUEST.fields_by_name[ - "tag_template" -].message_type = ( - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_tags__pb2._TAGTEMPLATE -) -_UPDATETAGTEMPLATEREQUEST.fields_by_name[ - "update_mask" -].message_type = google_dot_protobuf_dot_field__mask__pb2._FIELDMASK -_CREATETAGREQUEST.fields_by_name[ - "tag" 
-].message_type = google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_tags__pb2._TAG -_UPDATETAGREQUEST.fields_by_name[ - "tag" -].message_type = google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_tags__pb2._TAG -_UPDATETAGREQUEST.fields_by_name[ - "update_mask" -].message_type = google_dot_protobuf_dot_field__mask__pb2._FIELDMASK -_CREATETAGTEMPLATEFIELDREQUEST.fields_by_name[ - "tag_template_field" -].message_type = ( - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_tags__pb2._TAGTEMPLATEFIELD -) -_UPDATETAGTEMPLATEFIELDREQUEST.fields_by_name[ - "tag_template_field" -].message_type = ( - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_tags__pb2._TAGTEMPLATEFIELD -) -_UPDATETAGTEMPLATEFIELDREQUEST.fields_by_name[ - "update_mask" -].message_type = google_dot_protobuf_dot_field__mask__pb2._FIELDMASK -_LISTTAGSRESPONSE.fields_by_name[ - "tags" -].message_type = google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_tags__pb2._TAG -_LISTENTRIESREQUEST.fields_by_name[ - "read_mask" -].message_type = google_dot_protobuf_dot_field__mask__pb2._FIELDMASK -_LISTENTRIESRESPONSE.fields_by_name["entries"].message_type = _ENTRY -DESCRIPTOR.message_types_by_name["SearchCatalogRequest"] = _SEARCHCATALOGREQUEST -DESCRIPTOR.message_types_by_name["SearchCatalogResponse"] = _SEARCHCATALOGRESPONSE -DESCRIPTOR.message_types_by_name["CreateEntryGroupRequest"] = _CREATEENTRYGROUPREQUEST -DESCRIPTOR.message_types_by_name["UpdateEntryGroupRequest"] = _UPDATEENTRYGROUPREQUEST -DESCRIPTOR.message_types_by_name["GetEntryGroupRequest"] = _GETENTRYGROUPREQUEST -DESCRIPTOR.message_types_by_name["DeleteEntryGroupRequest"] = _DELETEENTRYGROUPREQUEST -DESCRIPTOR.message_types_by_name["ListEntryGroupsRequest"] = _LISTENTRYGROUPSREQUEST -DESCRIPTOR.message_types_by_name["ListEntryGroupsResponse"] = _LISTENTRYGROUPSRESPONSE -DESCRIPTOR.message_types_by_name["CreateEntryRequest"] = _CREATEENTRYREQUEST -DESCRIPTOR.message_types_by_name["UpdateEntryRequest"] = 
_UPDATEENTRYREQUEST -DESCRIPTOR.message_types_by_name["DeleteEntryRequest"] = _DELETEENTRYREQUEST -DESCRIPTOR.message_types_by_name["GetEntryRequest"] = _GETENTRYREQUEST -DESCRIPTOR.message_types_by_name["LookupEntryRequest"] = _LOOKUPENTRYREQUEST -DESCRIPTOR.message_types_by_name["Entry"] = _ENTRY -DESCRIPTOR.message_types_by_name["EntryGroup"] = _ENTRYGROUP -DESCRIPTOR.message_types_by_name["CreateTagTemplateRequest"] = _CREATETAGTEMPLATEREQUEST -DESCRIPTOR.message_types_by_name["GetTagTemplateRequest"] = _GETTAGTEMPLATEREQUEST -DESCRIPTOR.message_types_by_name["UpdateTagTemplateRequest"] = _UPDATETAGTEMPLATEREQUEST -DESCRIPTOR.message_types_by_name["DeleteTagTemplateRequest"] = _DELETETAGTEMPLATEREQUEST -DESCRIPTOR.message_types_by_name["CreateTagRequest"] = _CREATETAGREQUEST -DESCRIPTOR.message_types_by_name["UpdateTagRequest"] = _UPDATETAGREQUEST -DESCRIPTOR.message_types_by_name["DeleteTagRequest"] = _DELETETAGREQUEST -DESCRIPTOR.message_types_by_name[ - "CreateTagTemplateFieldRequest" -] = _CREATETAGTEMPLATEFIELDREQUEST -DESCRIPTOR.message_types_by_name[ - "UpdateTagTemplateFieldRequest" -] = _UPDATETAGTEMPLATEFIELDREQUEST -DESCRIPTOR.message_types_by_name[ - "RenameTagTemplateFieldRequest" -] = _RENAMETAGTEMPLATEFIELDREQUEST -DESCRIPTOR.message_types_by_name[ - "DeleteTagTemplateFieldRequest" -] = _DELETETAGTEMPLATEFIELDREQUEST -DESCRIPTOR.message_types_by_name["ListTagsRequest"] = _LISTTAGSREQUEST -DESCRIPTOR.message_types_by_name["ListTagsResponse"] = _LISTTAGSRESPONSE -DESCRIPTOR.message_types_by_name["ListEntriesRequest"] = _LISTENTRIESREQUEST -DESCRIPTOR.message_types_by_name["ListEntriesResponse"] = _LISTENTRIESRESPONSE -DESCRIPTOR.enum_types_by_name["EntryType"] = _ENTRYTYPE -_sym_db.RegisterFileDescriptor(DESCRIPTOR) - -SearchCatalogRequest = _reflection.GeneratedProtocolMessageType( - "SearchCatalogRequest", - (_message.Message,), - { - "Scope": _reflection.GeneratedProtocolMessageType( - "Scope", - (_message.Message,), - { - "DESCRIPTOR": 
_SEARCHCATALOGREQUEST_SCOPE, - "__module__": "google.cloud.datacatalog_v1beta1.proto.datacatalog_pb2", - "__doc__": """The criteria that select the subspace used for query matching. - - Attributes: - include_org_ids: - The list of organization IDs to search within. To find your - organization ID, follow instructions in - https://cloud.google.com/resource-manager/docs/creating- - managing-organization. - include_project_ids: - The list of project IDs to search within. To learn more about - the distinction between project names/IDs/numbers, go to - https://cloud.google.com/docs/overview/#projects. - include_gcp_public_datasets: - If ``true``, include Google Cloud Platform (GCP) public - datasets in the search results. Info on GCP public datasets is - available at https://cloud.google.com/public-datasets/. By - default, GCP public datasets are excluded. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.SearchCatalogRequest.Scope) - }, - ), - "DESCRIPTOR": _SEARCHCATALOGREQUEST, - "__module__": "google.cloud.datacatalog_v1beta1.proto.datacatalog_pb2", - "__doc__": """Request message for [SearchCatalog][google.cloud.datacatalog.v1beta1.D - ataCatalog.SearchCatalog]. - - Attributes: - scope: - Required. The scope of this search request. A ``scope`` that - has empty ``include_org_ids``, ``include_project_ids`` AND - false ``include_gcp_public_datasets`` is considered invalid. - Data Catalog will return an error in such a case. - query: - Required. The query string in search query syntax. The query - must be non-empty. Query strings can be simple as “x” or more - qualified as: - name:x - column:x - description:y Note: - Query tokens need to have a minimum of 3 characters for - substring matching to work correctly. See `Data Catalog Search - Syntax `__ for more information. - page_size: - Number of results in the search page. If <=0 then defaults to - 10. Max limit for page_size is 1000. Throws an invalid - argument for page_size > 1000. 
- page_token: - Optional. Pagination token returned in an earlier [SearchCatal - ogResponse.next_page_token][google.cloud.datacatalog.v1beta1.S - earchCatalogResponse.next_page_token], which indicates that - this is a continuation of a prior [SearchCatalogRequest][googl - e.cloud.datacatalog.v1beta1.DataCatalog.SearchCatalog] call, - and that the system should return the next page of data. If - empty, the first page is returned. - order_by: - Specifies the ordering of results, currently supported case- - sensitive choices are: - ``relevance``, only supports - descending - ``last_modified_timestamp [asc|desc]``, defaults - to descending if not specified If not specified, defaults - to ``relevance`` descending. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.SearchCatalogRequest) - }, -) -_sym_db.RegisterMessage(SearchCatalogRequest) -_sym_db.RegisterMessage(SearchCatalogRequest.Scope) - -SearchCatalogResponse = _reflection.GeneratedProtocolMessageType( - "SearchCatalogResponse", - (_message.Message,), - { - "DESCRIPTOR": _SEARCHCATALOGRESPONSE, - "__module__": "google.cloud.datacatalog_v1beta1.proto.datacatalog_pb2", - "__doc__": """Response message for [SearchCatalog][google.cloud.datacatalog.v1beta1. - DataCatalog.SearchCatalog]. - - Attributes: - results: - Search results. - next_page_token: - The token that can be used to retrieve the next page of - results. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.SearchCatalogResponse) - }, -) -_sym_db.RegisterMessage(SearchCatalogResponse) - -CreateEntryGroupRequest = _reflection.GeneratedProtocolMessageType( - "CreateEntryGroupRequest", - (_message.Message,), - { - "DESCRIPTOR": _CREATEENTRYGROUPREQUEST, - "__module__": "google.cloud.datacatalog_v1beta1.proto.datacatalog_pb2", - "__doc__": """Request message for [CreateEntryGroup][google.cloud.datacatalog.v1beta - 1.DataCatalog.CreateEntryGroup]. - - Attributes: - parent: - Required. 
The name of the project this entry group is in. - Example: - projects/{project_id}/locations/{location} Note - that this EntryGroup and its child resources may not actually - be stored in the location in this name. - entry_group_id: - Required. The id of the entry group to create. The id must - begin with a letter or underscore, contain only English - letters, numbers and underscores, and be at most 64 - characters. - entry_group: - The entry group to create. Defaults to an empty entry group. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.CreateEntryGroupRequest) - }, -) -_sym_db.RegisterMessage(CreateEntryGroupRequest) - -UpdateEntryGroupRequest = _reflection.GeneratedProtocolMessageType( - "UpdateEntryGroupRequest", - (_message.Message,), - { - "DESCRIPTOR": _UPDATEENTRYGROUPREQUEST, - "__module__": "google.cloud.datacatalog_v1beta1.proto.datacatalog_pb2", - "__doc__": """Request message for [UpdateEntryGroup][google.cloud.datacatalog.v1beta - 1.DataCatalog.UpdateEntryGroup]. - - Attributes: - entry_group: - Required. The updated entry group. “name” field must be set. - update_mask: - The fields to update on the entry group. If absent or empty, - all modifiable fields are updated. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.UpdateEntryGroupRequest) - }, -) -_sym_db.RegisterMessage(UpdateEntryGroupRequest) - -GetEntryGroupRequest = _reflection.GeneratedProtocolMessageType( - "GetEntryGroupRequest", - (_message.Message,), - { - "DESCRIPTOR": _GETENTRYGROUPREQUEST, - "__module__": "google.cloud.datacatalog_v1beta1.proto.datacatalog_pb2", - "__doc__": """Request message for [GetEntryGroup][google.cloud.datacatalog.v1beta1.D - ataCatalog.GetEntryGroup]. - - Attributes: - name: - Required. The name of the entry group. For example, ``projects - /{project_id}/locations/{location}/entryGroups/{entry_group_id - }``. - read_mask: - The fields to return. If not set or empty, all fields are - returned. 
- """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.GetEntryGroupRequest) - }, -) -_sym_db.RegisterMessage(GetEntryGroupRequest) - -DeleteEntryGroupRequest = _reflection.GeneratedProtocolMessageType( - "DeleteEntryGroupRequest", - (_message.Message,), - { - "DESCRIPTOR": _DELETEENTRYGROUPREQUEST, - "__module__": "google.cloud.datacatalog_v1beta1.proto.datacatalog_pb2", - "__doc__": """Request message for [DeleteEntryGroup][google.cloud.datacatalog.v1beta - 1.DataCatalog.DeleteEntryGroup]. - - Attributes: - name: - Required. The name of the entry group. For example, ``projects - /{project_id}/locations/{location}/entryGroups/{entry_group_id - }``. - force: - Optional. If true, deletes all entries in the entry group. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.DeleteEntryGroupRequest) - }, -) -_sym_db.RegisterMessage(DeleteEntryGroupRequest) - -ListEntryGroupsRequest = _reflection.GeneratedProtocolMessageType( - "ListEntryGroupsRequest", - (_message.Message,), - { - "DESCRIPTOR": _LISTENTRYGROUPSREQUEST, - "__module__": "google.cloud.datacatalog_v1beta1.proto.datacatalog_pb2", - "__doc__": """Request message for [ListEntryGroups][google.cloud.datacatalog.v1beta1 - .DataCatalog.ListEntryGroups]. - - Attributes: - parent: - Required. The name of the location that contains the entry - groups, which can be provided in URL format. Example: - - projects/{project_id}/locations/{location} - page_size: - Optional. The maximum number of items to return. Default is - 10. Max limit is 1000. Throws an invalid argument for - ``page_size > 1000``. - page_token: - Optional. Token that specifies which page is requested. If - empty, the first page is returned. 
- """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.ListEntryGroupsRequest) - }, -) -_sym_db.RegisterMessage(ListEntryGroupsRequest) - -ListEntryGroupsResponse = _reflection.GeneratedProtocolMessageType( - "ListEntryGroupsResponse", - (_message.Message,), - { - "DESCRIPTOR": _LISTENTRYGROUPSRESPONSE, - "__module__": "google.cloud.datacatalog_v1beta1.proto.datacatalog_pb2", - "__doc__": """Response message for [ListEntryGroups][google.cloud.datacatalog.v1beta - 1.DataCatalog.ListEntryGroups]. - - Attributes: - entry_groups: - EntryGroup details. - next_page_token: - Token to retrieve the next page of results. It is set to empty - if no items remain in results. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.ListEntryGroupsResponse) - }, -) -_sym_db.RegisterMessage(ListEntryGroupsResponse) - -CreateEntryRequest = _reflection.GeneratedProtocolMessageType( - "CreateEntryRequest", - (_message.Message,), - { - "DESCRIPTOR": _CREATEENTRYREQUEST, - "__module__": "google.cloud.datacatalog_v1beta1.proto.datacatalog_pb2", - "__doc__": """Request message for [CreateEntry][google.cloud.datacatalog.v1beta1.Dat - aCatalog.CreateEntry]. - - Attributes: - parent: - Required. The name of the entry group this entry is in. - Example: - projects/{project_id}/locations/{location}/entryG - roups/{entry_group_id} Note that this Entry and its child - resources may not actually be stored in the location in this - name. - entry_id: - Required. The id of the entry to create. - entry: - Required. The entry to create. 
- """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.CreateEntryRequest) - }, -) -_sym_db.RegisterMessage(CreateEntryRequest) - -UpdateEntryRequest = _reflection.GeneratedProtocolMessageType( - "UpdateEntryRequest", - (_message.Message,), - { - "DESCRIPTOR": _UPDATEENTRYREQUEST, - "__module__": "google.cloud.datacatalog_v1beta1.proto.datacatalog_pb2", - "__doc__": """Request message for [UpdateEntry][google.cloud.datacatalog.v1beta1.Dat - aCatalog.UpdateEntry]. - - Attributes: - entry: - Required. The updated entry. The “name” field must be set. - update_mask: - The fields to update on the entry. If absent or empty, all - modifiable fields are updated. The following fields are - modifiable: \* For entries with type ``DATA_STREAM``: \* - ``schema`` \* For entries with type ``FILESET`` \* ``schema`` - \* ``display_name`` \* ``description`` \* ``gcs_fileset_spec`` - \* ``gcs_fileset_spec.file_patterns`` \* For entries with - ``user_specified_type`` \* ``schema`` \* ``display_name`` \* - ``description`` \* user_specified_type \* - user_specified_system \* linked_resource \* - source_system_timestamps - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.UpdateEntryRequest) - }, -) -_sym_db.RegisterMessage(UpdateEntryRequest) - -DeleteEntryRequest = _reflection.GeneratedProtocolMessageType( - "DeleteEntryRequest", - (_message.Message,), - { - "DESCRIPTOR": _DELETEENTRYREQUEST, - "__module__": "google.cloud.datacatalog_v1beta1.proto.datacatalog_pb2", - "__doc__": """Request message for [DeleteEntry][google.cloud.datacatalog.v1beta1.Dat - aCatalog.DeleteEntry]. - - Attributes: - name: - Required. The name of the entry. 
Example: - projects/{projec - t_id}/locations/{location}/entryGroups/{entry_group_id}/entrie - s/{entry_id} - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.DeleteEntryRequest) - }, -) -_sym_db.RegisterMessage(DeleteEntryRequest) - -GetEntryRequest = _reflection.GeneratedProtocolMessageType( - "GetEntryRequest", - (_message.Message,), - { - "DESCRIPTOR": _GETENTRYREQUEST, - "__module__": "google.cloud.datacatalog_v1beta1.proto.datacatalog_pb2", - "__doc__": """Request message for - [GetEntry][google.cloud.datacatalog.v1beta1.DataCatalog.GetEntry]. - - Attributes: - name: - Required. The name of the entry. Example: - projects/{projec - t_id}/locations/{location}/entryGroups/{entry_group_id}/entrie - s/{entry_id} - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.GetEntryRequest) - }, -) -_sym_db.RegisterMessage(GetEntryRequest) - -LookupEntryRequest = _reflection.GeneratedProtocolMessageType( - "LookupEntryRequest", - (_message.Message,), - { - "DESCRIPTOR": _LOOKUPENTRYREQUEST, - "__module__": "google.cloud.datacatalog_v1beta1.proto.datacatalog_pb2", - "__doc__": """Request message for [LookupEntry][google.cloud.datacatalog.v1beta1.Dat - aCatalog.LookupEntry]. - - Attributes: - target_name: - Required. Represents either the Google Cloud Platform resource - or SQL name for a Google Cloud Platform resource. - linked_resource: - The full name of the Google Cloud Platform resource the Data - Catalog entry represents. See: https://cloud.google.com/apis/d - esign/resource_names#full_resource_name. Full names are case- - sensitive. Examples: - //bigquery.googleapis.com/projects/p - rojectId/datasets/datasetId/tables/tableId - - //pubsub.googleapis.com/projects/projectId/topics/topicId - sql_resource: - The SQL name of the entry. SQL names are case-sensitive. 
- Examples: - ``pubsub.project_id.topic_id`` - - :literal:`pubsub.project_id.`topic.id.with.dots\`` - - ``bigquery.table.project_id.dataset_id.table_id`` - - ``bigquery.dataset.project_id.dataset_id`` - ``datacatalog.en - try.project_id.location_id.entry_group_id.entry_id`` - ``*_id``\ s shoud satisfy the standard SQL rules for - identifiers. - https://cloud.google.com/bigquery/docs/reference/standard- - sql/lexical. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.LookupEntryRequest) - }, -) -_sym_db.RegisterMessage(LookupEntryRequest) - -Entry = _reflection.GeneratedProtocolMessageType( - "Entry", - (_message.Message,), - { - "DESCRIPTOR": _ENTRY, - "__module__": "google.cloud.datacatalog_v1beta1.proto.datacatalog_pb2", - "__doc__": """Entry Metadata. A Data Catalog Entry resource represents another - resource in Google Cloud Platform (such as a BigQuery dataset or a - Pub/Sub topic), or outside of Google Cloud Platform. Clients can use - the ``linked_resource`` field in the Entry resource to refer to the - original resource ID of the source system. An Entry resource contains - resource details, such as its schema. An Entry can also be used to - attach flexible metadata, such as a - [Tag][google.cloud.datacatalog.v1beta1.Tag]. - - Attributes: - name: - The Data Catalog resource name of the entry in URL format. - Example: - projects/{project_id}/locations/{location}/entryG - roups/{entry_group_id}/entries/{entry_id} Note that this - Entry and its child resources may not actually be stored in - the location in this name. - linked_resource: - The resource this metadata entry refers to. For Google Cloud - Platform resources, ``linked_resource`` is the `full name of - the resource `__. For example, the - ``linked_resource`` for a table resource from BigQuery is: - - //bigquery.googleapis.com/projects/projectId/datasets/datasetI - d/tables/tableId Output only when Entry is of type in the - EntryType enum. 
For entries with user_specified_type, this - field is optional and defaults to an empty string. - entry_type: - Required. Entry type. - type: - The type of the entry. Only used for Entries with types in the - EntryType enum. - user_specified_type: - Entry type if it does not fit any of the input-allowed values - listed in ``EntryType`` enum above. When creating an entry, - users should check the enum values first, if nothing matches - the entry to be created, then provide a custom value, for - example “my_special_type”. ``user_specified_type`` strings - must begin with a letter or underscore and can only contain - letters, numbers, and underscores; are case insensitive; must - be at least 1 character and at most 64 characters long. - Currently, only FILESET enum value is allowed. All other - entries created through Data Catalog must use - ``user_specified_type``. - system: - The source system of the entry. - integrated_system: - Output only. This field indicates the entry’s source system - that Data Catalog integrates with, such as BigQuery or - Pub/Sub. - user_specified_system: - This field indicates the entry’s source system that Data - Catalog does not integrate with. ``user_specified_system`` - strings must begin with a letter or underscore and can only - contain letters, numbers, and underscores; are case - insensitive; must be at least 1 character and at most 64 - characters long. - type_spec: - Type specification information. - gcs_fileset_spec: - Specification that applies to a Cloud Storage fileset. This is - only valid on entries of type FILESET. - bigquery_table_spec: - Specification that applies to a BigQuery table. This is only - valid on entries of type ``TABLE``. - bigquery_date_sharded_spec: - Specification for a group of BigQuery tables with name pattern - ``[prefix]YYYYMMDD``. Context: - https://cloud.google.com/bigquery/docs/partitioned- - tables#partitioning_versus_sharding. - display_name: - Display information such as title and description. 
A short - name to identify the entry, for example, “Analytics Data - Jan - 2011”. Default value is an empty string. - description: - Entry description, which can consist of several sentences or - paragraphs that describe entry contents. Default value is an - empty string. - schema: - Schema of the entry. An entry might not have any schema - attached to it. - source_system_timestamps: - Output only. Timestamps about the underlying resource, not - about this Data Catalog entry. Output only when Entry is of - type in the EntryType enum. For entries with - user_specified_type, this field is optional and defaults to an - empty timestamp. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.Entry) - }, -) -_sym_db.RegisterMessage(Entry) - -EntryGroup = _reflection.GeneratedProtocolMessageType( - "EntryGroup", - (_message.Message,), - { - "DESCRIPTOR": _ENTRYGROUP, - "__module__": "google.cloud.datacatalog_v1beta1.proto.datacatalog_pb2", - "__doc__": """EntryGroup Metadata. An EntryGroup resource represents a logical - grouping of zero or more Data Catalog - [Entry][google.cloud.datacatalog.v1beta1.Entry] resources. - - Attributes: - name: - The resource name of the entry group in URL format. Example: - - projects/{project_id}/locations/{location}/entryGroups/{ent - ry_group_id} Note that this EntryGroup and its child - resources may not actually be stored in the location in this - name. - display_name: - A short name to identify the entry group, for example, - “analytics data - jan 2011”. Default value is an empty string. - description: - Entry group description, which can consist of several - sentences or paragraphs that describe entry group contents. - Default value is an empty string. - data_catalog_timestamps: - Output only. Timestamps about this EntryGroup. Default value - is empty timestamps. 
- """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.EntryGroup) - }, -) -_sym_db.RegisterMessage(EntryGroup) - -CreateTagTemplateRequest = _reflection.GeneratedProtocolMessageType( - "CreateTagTemplateRequest", - (_message.Message,), - { - "DESCRIPTOR": _CREATETAGTEMPLATEREQUEST, - "__module__": "google.cloud.datacatalog_v1beta1.proto.datacatalog_pb2", - "__doc__": """Request message for [CreateTagTemplate][google.cloud.datacatalog.v1bet - a1.DataCatalog.CreateTagTemplate]. - - Attributes: - parent: - Required. The name of the project and the template location - [region](https://cloud.google.com/data- - catalog/docs/concepts/regions. Example: - - projects/{project_id}/locations/us-central1 - tag_template_id: - Required. The id of the tag template to create. - tag_template: - Required. The tag template to create. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.CreateTagTemplateRequest) - }, -) -_sym_db.RegisterMessage(CreateTagTemplateRequest) - -GetTagTemplateRequest = _reflection.GeneratedProtocolMessageType( - "GetTagTemplateRequest", - (_message.Message,), - { - "DESCRIPTOR": _GETTAGTEMPLATEREQUEST, - "__module__": "google.cloud.datacatalog_v1beta1.proto.datacatalog_pb2", - "__doc__": """Request message for [GetTagTemplate][google.cloud.datacatalog.v1beta1. - DataCatalog.GetTagTemplate]. - - Attributes: - name: - Required. The name of the tag template. 
Example: - projects/ - {project_id}/locations/{location}/tagTemplates/{tag_template_i - d} - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.GetTagTemplateRequest) - }, -) -_sym_db.RegisterMessage(GetTagTemplateRequest) - -UpdateTagTemplateRequest = _reflection.GeneratedProtocolMessageType( - "UpdateTagTemplateRequest", - (_message.Message,), - { - "DESCRIPTOR": _UPDATETAGTEMPLATEREQUEST, - "__module__": "google.cloud.datacatalog_v1beta1.proto.datacatalog_pb2", - "__doc__": """Request message for [UpdateTagTemplate][google.cloud.datacatalog.v1bet - a1.DataCatalog.UpdateTagTemplate]. - - Attributes: - tag_template: - Required. The template to update. The “name” field must be - set. - update_mask: - The field mask specifies the parts of the template to - overwrite. Allowed fields: - ``display_name`` If absent or - empty, all of the allowed fields above will be updated. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.UpdateTagTemplateRequest) - }, -) -_sym_db.RegisterMessage(UpdateTagTemplateRequest) - -DeleteTagTemplateRequest = _reflection.GeneratedProtocolMessageType( - "DeleteTagTemplateRequest", - (_message.Message,), - { - "DESCRIPTOR": _DELETETAGTEMPLATEREQUEST, - "__module__": "google.cloud.datacatalog_v1beta1.proto.datacatalog_pb2", - "__doc__": """Request message for [DeleteTagTemplate][google.cloud.datacatalog.v1bet - a1.DataCatalog.DeleteTagTemplate]. - - Attributes: - name: - Required. The name of the tag template to delete. Example: - - projects/{project_id}/locations/{location}/tagTemplates/{tag_t - emplate_id} - force: - Required. Currently, this field must always be set to - ``true``. This confirms the deletion of any possible tags - using this template. ``force = false`` will be supported in - the future. 
- """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.DeleteTagTemplateRequest) - }, -) -_sym_db.RegisterMessage(DeleteTagTemplateRequest) - -CreateTagRequest = _reflection.GeneratedProtocolMessageType( - "CreateTagRequest", - (_message.Message,), - { - "DESCRIPTOR": _CREATETAGREQUEST, - "__module__": "google.cloud.datacatalog_v1beta1.proto.datacatalog_pb2", - "__doc__": """Request message for - [CreateTag][google.cloud.datacatalog.v1beta1.DataCatalog.CreateTag]. - - Attributes: - parent: - Required. The name of the resource to attach this tag to. Tags - can be attached to Entries. Example: - projects/{project_id} - /locations/{location}/entryGroups/{entry_group_id}/entries/{en - try_id} Note that this Tag and its child resources may not - actually be stored in the location in this name. - tag: - Required. The tag to create. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.CreateTagRequest) - }, -) -_sym_db.RegisterMessage(CreateTagRequest) - -UpdateTagRequest = _reflection.GeneratedProtocolMessageType( - "UpdateTagRequest", - (_message.Message,), - { - "DESCRIPTOR": _UPDATETAGREQUEST, - "__module__": "google.cloud.datacatalog_v1beta1.proto.datacatalog_pb2", - "__doc__": """Request message for - [UpdateTag][google.cloud.datacatalog.v1beta1.DataCatalog.UpdateTag]. - - Attributes: - tag: - Required. The updated tag. The “name” field must be set. - update_mask: - The fields to update on the Tag. If absent or empty, all - modifiable fields are updated. Currently the only modifiable - field is the field ``fields``. 
- """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.UpdateTagRequest) - }, -) -_sym_db.RegisterMessage(UpdateTagRequest) - -DeleteTagRequest = _reflection.GeneratedProtocolMessageType( - "DeleteTagRequest", - (_message.Message,), - { - "DESCRIPTOR": _DELETETAGREQUEST, - "__module__": "google.cloud.datacatalog_v1beta1.proto.datacatalog_pb2", - "__doc__": """Request message for - [DeleteTag][google.cloud.datacatalog.v1beta1.DataCatalog.DeleteTag]. - - Attributes: - name: - Required. The name of the tag to delete. Example: - projects - /{project_id}/locations/{location}/entryGroups/{entry_group_id - }/entries/{entry_id}/tags/{tag_id} - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.DeleteTagRequest) - }, -) -_sym_db.RegisterMessage(DeleteTagRequest) - -CreateTagTemplateFieldRequest = _reflection.GeneratedProtocolMessageType( - "CreateTagTemplateFieldRequest", - (_message.Message,), - { - "DESCRIPTOR": _CREATETAGTEMPLATEFIELDREQUEST, - "__module__": "google.cloud.datacatalog_v1beta1.proto.datacatalog_pb2", - "__doc__": """Request message for [CreateTagTemplateField][google.cloud.datacatalog. - v1beta1.DataCatalog.CreateTagTemplateField]. - - Attributes: - parent: - Required. The name of the project and the template location - `region `__. Example: - - projects/{project_id}/locations/us- - central1/tagTemplates/{tag_template_id} - tag_template_field_id: - Required. The ID of the tag template field to create. Field - ids can contain letters (both uppercase and lowercase), - numbers (0-9), underscores (_) and dashes (-). Field IDs must - be at least 1 character long and at most 128 characters long. - Field IDs must also be unique within their template. - tag_template_field: - Required. The tag template field to create. 
- """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.CreateTagTemplateFieldRequest) - }, -) -_sym_db.RegisterMessage(CreateTagTemplateFieldRequest) - -UpdateTagTemplateFieldRequest = _reflection.GeneratedProtocolMessageType( - "UpdateTagTemplateFieldRequest", - (_message.Message,), - { - "DESCRIPTOR": _UPDATETAGTEMPLATEFIELDREQUEST, - "__module__": "google.cloud.datacatalog_v1beta1.proto.datacatalog_pb2", - "__doc__": """Request message for [UpdateTagTemplateField][google.cloud.datacatalog. - v1beta1.DataCatalog.UpdateTagTemplateField]. - - Attributes: - name: - Required. The name of the tag template field. Example: - pro - jects/{project_id}/locations/{location}/tagTemplates/{tag_temp - late_id}/fields/{tag_template_field_id} - tag_template_field: - Required. The template to update. - update_mask: - Optional. The field mask specifies the parts of the template - to be updated. Allowed fields: - ``display_name`` - - ``type.enum_type`` - ``is_required`` If ``update_mask`` is - not set or empty, all of the allowed fields above will be - updated. When updating an enum type, the provided values will - be merged with the existing values. Therefore, enum values can - only be added, existing enum values cannot be deleted nor - renamed. Updating a template field from optional to required - is NOT allowed. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.UpdateTagTemplateFieldRequest) - }, -) -_sym_db.RegisterMessage(UpdateTagTemplateFieldRequest) - -RenameTagTemplateFieldRequest = _reflection.GeneratedProtocolMessageType( - "RenameTagTemplateFieldRequest", - (_message.Message,), - { - "DESCRIPTOR": _RENAMETAGTEMPLATEFIELDREQUEST, - "__module__": "google.cloud.datacatalog_v1beta1.proto.datacatalog_pb2", - "__doc__": """Request message for [RenameTagTemplateField][google.cloud.datacatalog. - v1beta1.DataCatalog.RenameTagTemplateField]. - - Attributes: - name: - Required. The name of the tag template. 
Example: - projects/ - {project_id}/locations/{location}/tagTemplates/{tag_template_i - d}/fields/{tag_template_field_id} - new_tag_template_field_id: - Required. The new ID of this tag template field. For example, - ``my_new_field``. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.RenameTagTemplateFieldRequest) - }, -) -_sym_db.RegisterMessage(RenameTagTemplateFieldRequest) - -DeleteTagTemplateFieldRequest = _reflection.GeneratedProtocolMessageType( - "DeleteTagTemplateFieldRequest", - (_message.Message,), - { - "DESCRIPTOR": _DELETETAGTEMPLATEFIELDREQUEST, - "__module__": "google.cloud.datacatalog_v1beta1.proto.datacatalog_pb2", - "__doc__": """Request message for [DeleteTagTemplateField][google.cloud.datacatalog. - v1beta1.DataCatalog.DeleteTagTemplateField]. - - Attributes: - name: - Required. The name of the tag template field to delete. - Example: - projects/{project_id}/locations/{location}/tagTem - plates/{tag_template_id}/fields/{tag_template_field_id} - force: - Required. Currently, this field must always be set to - ``true``. This confirms the deletion of this field from any - tags using this field. ``force = false`` will be supported in - the future. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.DeleteTagTemplateFieldRequest) - }, -) -_sym_db.RegisterMessage(DeleteTagTemplateFieldRequest) - -ListTagsRequest = _reflection.GeneratedProtocolMessageType( - "ListTagsRequest", - (_message.Message,), - { - "DESCRIPTOR": _LISTTAGSREQUEST, - "__module__": "google.cloud.datacatalog_v1beta1.proto.datacatalog_pb2", - "__doc__": """Request message for - [ListTags][google.cloud.datacatalog.v1beta1.DataCatalog.ListTags]. - - Attributes: - parent: - Required. The name of the Data Catalog resource to list the - tags of. The resource could be an - [Entry][google.cloud.datacatalog.v1beta1.Entry] or an - [EntryGroup][google.cloud.datacatalog.v1beta1.EntryGroup]. 
- Examples: - projects/{project_id}/locations/{location}/entry - Groups/{entry_group_id} - projects/{project_id}/locations/{lo - cation}/entryGroups/{entry_group_id}/entries/{entry_id} - page_size: - The maximum number of tags to return. Default is 10. Max limit - is 1000. - page_token: - Token that specifies which page is requested. If empty, the - first page is returned. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.ListTagsRequest) - }, -) -_sym_db.RegisterMessage(ListTagsRequest) - -ListTagsResponse = _reflection.GeneratedProtocolMessageType( - "ListTagsResponse", - (_message.Message,), - { - "DESCRIPTOR": _LISTTAGSRESPONSE, - "__module__": "google.cloud.datacatalog_v1beta1.proto.datacatalog_pb2", - "__doc__": """Response message for - [ListTags][google.cloud.datacatalog.v1beta1.DataCatalog.ListTags]. - - Attributes: - tags: - [Tag][google.cloud.datacatalog.v1beta1.Tag] details. - next_page_token: - Token to retrieve the next page of results. It is set to empty - if no items remain in results. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.ListTagsResponse) - }, -) -_sym_db.RegisterMessage(ListTagsResponse) - -ListEntriesRequest = _reflection.GeneratedProtocolMessageType( - "ListEntriesRequest", - (_message.Message,), - { - "DESCRIPTOR": _LISTENTRIESREQUEST, - "__module__": "google.cloud.datacatalog_v1beta1.proto.datacatalog_pb2", - "__doc__": """Request message for [ListEntries][google.cloud.datacatalog.v1beta1.Dat - aCatalog.ListEntries]. - - Attributes: - parent: - Required. The name of the entry group that contains the - entries, which can be provided in URL format. Example: - pro - jects/{project_id}/locations/{location}/entryGroups/{entry_gro - up_id} - page_size: - The maximum number of items to return. Default is 10. Max - limit is 1000. Throws an invalid argument for ``page_size > - 1000``. - page_token: - Token that specifies which page is requested. 
If empty, the - first page is returned. - read_mask: - The fields to return for each Entry. If not set or empty, all - fields are returned. For example, setting read_mask to contain - only one path “name” will cause ListEntries to return a list - of Entries with only “name” field. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.ListEntriesRequest) - }, -) -_sym_db.RegisterMessage(ListEntriesRequest) - -ListEntriesResponse = _reflection.GeneratedProtocolMessageType( - "ListEntriesResponse", - (_message.Message,), - { - "DESCRIPTOR": _LISTENTRIESRESPONSE, - "__module__": "google.cloud.datacatalog_v1beta1.proto.datacatalog_pb2", - "__doc__": """Response message for [ListEntries][google.cloud.datacatalog.v1beta1.Da - taCatalog.ListEntries]. - - Attributes: - entries: - Entry details. - next_page_token: - Token to retrieve the next page of results. It is set to empty - if no items remain in results. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.ListEntriesResponse) - }, -) -_sym_db.RegisterMessage(ListEntriesResponse) - - -DESCRIPTOR._options = None -_SEARCHCATALOGREQUEST.fields_by_name["scope"]._options = None -_SEARCHCATALOGREQUEST.fields_by_name["query"]._options = None -_SEARCHCATALOGREQUEST.fields_by_name["page_token"]._options = None -_CREATEENTRYGROUPREQUEST.fields_by_name["parent"]._options = None -_CREATEENTRYGROUPREQUEST.fields_by_name["entry_group_id"]._options = None -_UPDATEENTRYGROUPREQUEST.fields_by_name["entry_group"]._options = None -_GETENTRYGROUPREQUEST.fields_by_name["name"]._options = None -_DELETEENTRYGROUPREQUEST.fields_by_name["name"]._options = None -_DELETEENTRYGROUPREQUEST.fields_by_name["force"]._options = None -_LISTENTRYGROUPSREQUEST.fields_by_name["parent"]._options = None -_LISTENTRYGROUPSREQUEST.fields_by_name["page_size"]._options = None -_LISTENTRYGROUPSREQUEST.fields_by_name["page_token"]._options = None -_CREATEENTRYREQUEST.fields_by_name["parent"]._options = None 
-_CREATEENTRYREQUEST.fields_by_name["entry_id"]._options = None -_CREATEENTRYREQUEST.fields_by_name["entry"]._options = None -_UPDATEENTRYREQUEST.fields_by_name["entry"]._options = None -_DELETEENTRYREQUEST.fields_by_name["name"]._options = None -_GETENTRYREQUEST.fields_by_name["name"]._options = None -_ENTRY.fields_by_name["name"]._options = None -_ENTRY.fields_by_name["integrated_system"]._options = None -_ENTRY.fields_by_name["source_system_timestamps"]._options = None -_ENTRY._options = None -_ENTRYGROUP.fields_by_name["data_catalog_timestamps"]._options = None -_ENTRYGROUP._options = None -_CREATETAGTEMPLATEREQUEST.fields_by_name["parent"]._options = None -_CREATETAGTEMPLATEREQUEST.fields_by_name["tag_template_id"]._options = None -_CREATETAGTEMPLATEREQUEST.fields_by_name["tag_template"]._options = None -_GETTAGTEMPLATEREQUEST.fields_by_name["name"]._options = None -_UPDATETAGTEMPLATEREQUEST.fields_by_name["tag_template"]._options = None -_DELETETAGTEMPLATEREQUEST.fields_by_name["name"]._options = None -_DELETETAGTEMPLATEREQUEST.fields_by_name["force"]._options = None -_CREATETAGREQUEST.fields_by_name["parent"]._options = None -_CREATETAGREQUEST.fields_by_name["tag"]._options = None -_UPDATETAGREQUEST.fields_by_name["tag"]._options = None -_DELETETAGREQUEST.fields_by_name["name"]._options = None -_CREATETAGTEMPLATEFIELDREQUEST.fields_by_name["parent"]._options = None -_CREATETAGTEMPLATEFIELDREQUEST.fields_by_name["tag_template_field_id"]._options = None -_CREATETAGTEMPLATEFIELDREQUEST.fields_by_name["tag_template_field"]._options = None -_UPDATETAGTEMPLATEFIELDREQUEST.fields_by_name["name"]._options = None -_UPDATETAGTEMPLATEFIELDREQUEST.fields_by_name["tag_template_field"]._options = None -_UPDATETAGTEMPLATEFIELDREQUEST.fields_by_name["update_mask"]._options = None -_RENAMETAGTEMPLATEFIELDREQUEST.fields_by_name["name"]._options = None -_RENAMETAGTEMPLATEFIELDREQUEST.fields_by_name[ - "new_tag_template_field_id" -]._options = None 
-_DELETETAGTEMPLATEFIELDREQUEST.fields_by_name["name"]._options = None -_DELETETAGTEMPLATEFIELDREQUEST.fields_by_name["force"]._options = None -_LISTTAGSREQUEST.fields_by_name["parent"]._options = None -_LISTENTRIESREQUEST.fields_by_name["parent"]._options = None - -_DATACATALOG = _descriptor.ServiceDescriptor( - name="DataCatalog", - full_name="google.cloud.datacatalog.v1beta1.DataCatalog", - file=DESCRIPTOR, - index=0, - serialized_options=b"\312A\032datacatalog.googleapis.com\322A.https://www.googleapis.com/auth/cloud-platform", - create_key=_descriptor._internal_create_key, - serialized_start=5881, - serialized_end=12341, - methods=[ - _descriptor.MethodDescriptor( - name="SearchCatalog", - full_name="google.cloud.datacatalog.v1beta1.DataCatalog.SearchCatalog", - index=0, - containing_service=None, - input_type=_SEARCHCATALOGREQUEST, - output_type=_SEARCHCATALOGRESPONSE, - serialized_options=b'\202\323\344\223\002\034"\027/v1beta1/catalog:search:\001*\332A\013scope,query', - create_key=_descriptor._internal_create_key, - ), - _descriptor.MethodDescriptor( - name="CreateEntryGroup", - full_name="google.cloud.datacatalog.v1beta1.DataCatalog.CreateEntryGroup", - index=1, - containing_service=None, - input_type=_CREATEENTRYGROUPREQUEST, - output_type=_ENTRYGROUP, - serialized_options=b'\202\323\344\223\002C"4/v1beta1/{parent=projects/*/locations/*}/entryGroups:\013entry_group\332A!parent,entry_group_id,entry_group', - create_key=_descriptor._internal_create_key, - ), - _descriptor.MethodDescriptor( - name="UpdateEntryGroup", - full_name="google.cloud.datacatalog.v1beta1.DataCatalog.UpdateEntryGroup", - index=2, - containing_service=None, - input_type=_UPDATEENTRYGROUPREQUEST, - output_type=_ENTRYGROUP, - serialized_options=b"\202\323\344\223\002O2@/v1beta1/{entry_group.name=projects/*/locations/*/entryGroups/*}:\013entry_group\332A\013entry_group\332A\027entry_group,update_mask", - create_key=_descriptor._internal_create_key, - ), - _descriptor.MethodDescriptor( - 
name="GetEntryGroup", - full_name="google.cloud.datacatalog.v1beta1.DataCatalog.GetEntryGroup", - index=3, - containing_service=None, - input_type=_GETENTRYGROUPREQUEST, - output_type=_ENTRYGROUP, - serialized_options=b"\202\323\344\223\0026\0224/v1beta1/{name=projects/*/locations/*/entryGroups/*}\332A\004name\332A\016name,read_mask", - create_key=_descriptor._internal_create_key, - ), - _descriptor.MethodDescriptor( - name="DeleteEntryGroup", - full_name="google.cloud.datacatalog.v1beta1.DataCatalog.DeleteEntryGroup", - index=4, - containing_service=None, - input_type=_DELETEENTRYGROUPREQUEST, - output_type=google_dot_protobuf_dot_empty__pb2._EMPTY, - serialized_options=b"\202\323\344\223\0026*4/v1beta1/{name=projects/*/locations/*/entryGroups/*}\332A\004name", - create_key=_descriptor._internal_create_key, - ), - _descriptor.MethodDescriptor( - name="ListEntryGroups", - full_name="google.cloud.datacatalog.v1beta1.DataCatalog.ListEntryGroups", - index=5, - containing_service=None, - input_type=_LISTENTRYGROUPSREQUEST, - output_type=_LISTENTRYGROUPSRESPONSE, - serialized_options=b"\202\323\344\223\0026\0224/v1beta1/{parent=projects/*/locations/*}/entryGroups\332A\006parent", - create_key=_descriptor._internal_create_key, - ), - _descriptor.MethodDescriptor( - name="CreateEntry", - full_name="google.cloud.datacatalog.v1beta1.DataCatalog.CreateEntry", - index=6, - containing_service=None, - input_type=_CREATEENTRYREQUEST, - output_type=_ENTRY, - serialized_options=b'\202\323\344\223\002G">/v1beta1/{parent=projects/*/locations/*/entryGroups/*}/entries:\005entry\332A\025parent,entry_id,entry', - create_key=_descriptor._internal_create_key, - ), - _descriptor.MethodDescriptor( - name="UpdateEntry", - full_name="google.cloud.datacatalog.v1beta1.DataCatalog.UpdateEntry", - index=7, - containing_service=None, - input_type=_UPDATEENTRYREQUEST, - output_type=_ENTRY, - 
serialized_options=b"\202\323\344\223\002M2D/v1beta1/{entry.name=projects/*/locations/*/entryGroups/*/entries/*}:\005entry\332A\005entry\332A\021entry,update_mask", - create_key=_descriptor._internal_create_key, - ), - _descriptor.MethodDescriptor( - name="DeleteEntry", - full_name="google.cloud.datacatalog.v1beta1.DataCatalog.DeleteEntry", - index=8, - containing_service=None, - input_type=_DELETEENTRYREQUEST, - output_type=google_dot_protobuf_dot_empty__pb2._EMPTY, - serialized_options=b"\202\323\344\223\002@*>/v1beta1/{name=projects/*/locations/*/entryGroups/*/entries/*}\332A\004name", - create_key=_descriptor._internal_create_key, - ), - _descriptor.MethodDescriptor( - name="GetEntry", - full_name="google.cloud.datacatalog.v1beta1.DataCatalog.GetEntry", - index=9, - containing_service=None, - input_type=_GETENTRYREQUEST, - output_type=_ENTRY, - serialized_options=b"\202\323\344\223\002@\022>/v1beta1/{name=projects/*/locations/*/entryGroups/*/entries/*}\332A\004name", - create_key=_descriptor._internal_create_key, - ), - _descriptor.MethodDescriptor( - name="LookupEntry", - full_name="google.cloud.datacatalog.v1beta1.DataCatalog.LookupEntry", - index=10, - containing_service=None, - input_type=_LOOKUPENTRYREQUEST, - output_type=_ENTRY, - serialized_options=b"\202\323\344\223\002\031\022\027/v1beta1/entries:lookup", - create_key=_descriptor._internal_create_key, - ), - _descriptor.MethodDescriptor( - name="ListEntries", - full_name="google.cloud.datacatalog.v1beta1.DataCatalog.ListEntries", - index=11, - containing_service=None, - input_type=_LISTENTRIESREQUEST, - output_type=_LISTENTRIESRESPONSE, - serialized_options=b"\202\323\344\223\002@\022>/v1beta1/{parent=projects/*/locations/*/entryGroups/*}/entries\332A\006parent", - create_key=_descriptor._internal_create_key, - ), - _descriptor.MethodDescriptor( - name="CreateTagTemplate", - full_name="google.cloud.datacatalog.v1beta1.DataCatalog.CreateTagTemplate", - index=12, - containing_service=None, - 
input_type=_CREATETAGTEMPLATEREQUEST, - output_type=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_tags__pb2._TAGTEMPLATE, - serialized_options=b'\202\323\344\223\002E"5/v1beta1/{parent=projects/*/locations/*}/tagTemplates:\014tag_template\332A#parent,tag_template_id,tag_template', - create_key=_descriptor._internal_create_key, - ), - _descriptor.MethodDescriptor( - name="GetTagTemplate", - full_name="google.cloud.datacatalog.v1beta1.DataCatalog.GetTagTemplate", - index=13, - containing_service=None, - input_type=_GETTAGTEMPLATEREQUEST, - output_type=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_tags__pb2._TAGTEMPLATE, - serialized_options=b"\202\323\344\223\0027\0225/v1beta1/{name=projects/*/locations/*/tagTemplates/*}\332A\004name", - create_key=_descriptor._internal_create_key, - ), - _descriptor.MethodDescriptor( - name="UpdateTagTemplate", - full_name="google.cloud.datacatalog.v1beta1.DataCatalog.UpdateTagTemplate", - index=14, - containing_service=None, - input_type=_UPDATETAGTEMPLATEREQUEST, - output_type=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_tags__pb2._TAGTEMPLATE, - serialized_options=b"\202\323\344\223\002R2B/v1beta1/{tag_template.name=projects/*/locations/*/tagTemplates/*}:\014tag_template\332A\014tag_template\332A\030tag_template,update_mask", - create_key=_descriptor._internal_create_key, - ), - _descriptor.MethodDescriptor( - name="DeleteTagTemplate", - full_name="google.cloud.datacatalog.v1beta1.DataCatalog.DeleteTagTemplate", - index=15, - containing_service=None, - input_type=_DELETETAGTEMPLATEREQUEST, - output_type=google_dot_protobuf_dot_empty__pb2._EMPTY, - serialized_options=b"\202\323\344\223\0027*5/v1beta1/{name=projects/*/locations/*/tagTemplates/*}\332A\nname,force", - create_key=_descriptor._internal_create_key, - ), - _descriptor.MethodDescriptor( - name="CreateTagTemplateField", - full_name="google.cloud.datacatalog.v1beta1.DataCatalog.CreateTagTemplateField", - index=16, - containing_service=None, - 
input_type=_CREATETAGTEMPLATEFIELDREQUEST, - output_type=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_tags__pb2._TAGTEMPLATEFIELD, - serialized_options=b'\202\323\344\223\002T">/v1beta1/{parent=projects/*/locations/*/tagTemplates/*}/fields:\022tag_template_field\332A/parent,tag_template_field_id,tag_template_field', - create_key=_descriptor._internal_create_key, - ), - _descriptor.MethodDescriptor( - name="UpdateTagTemplateField", - full_name="google.cloud.datacatalog.v1beta1.DataCatalog.UpdateTagTemplateField", - index=17, - containing_service=None, - input_type=_UPDATETAGTEMPLATEFIELDREQUEST, - output_type=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_tags__pb2._TAGTEMPLATEFIELD, - serialized_options=b"\202\323\344\223\002T2>/v1beta1/{name=projects/*/locations/*/tagTemplates/*/fields/*}:\022tag_template_field\332A\027name,tag_template_field\332A#name,tag_template_field,update_mask", - create_key=_descriptor._internal_create_key, - ), - _descriptor.MethodDescriptor( - name="RenameTagTemplateField", - full_name="google.cloud.datacatalog.v1beta1.DataCatalog.RenameTagTemplateField", - index=18, - containing_service=None, - input_type=_RENAMETAGTEMPLATEFIELDREQUEST, - output_type=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_tags__pb2._TAGTEMPLATEFIELD, - serialized_options=b'\202\323\344\223\002J"E/v1beta1/{name=projects/*/locations/*/tagTemplates/*/fields/*}:rename:\001*\332A\036name,new_tag_template_field_id', - create_key=_descriptor._internal_create_key, - ), - _descriptor.MethodDescriptor( - name="DeleteTagTemplateField", - full_name="google.cloud.datacatalog.v1beta1.DataCatalog.DeleteTagTemplateField", - index=19, - containing_service=None, - input_type=_DELETETAGTEMPLATEFIELDREQUEST, - output_type=google_dot_protobuf_dot_empty__pb2._EMPTY, - serialized_options=b"\202\323\344\223\002@*>/v1beta1/{name=projects/*/locations/*/tagTemplates/*/fields/*}\332A\nname,force", - create_key=_descriptor._internal_create_key, - ), - 
_descriptor.MethodDescriptor( - name="CreateTag", - full_name="google.cloud.datacatalog.v1beta1.DataCatalog.CreateTag", - index=20, - containing_service=None, - input_type=_CREATETAGREQUEST, - output_type=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_tags__pb2._TAG, - serialized_options=b'\202\323\344\223\002\220\001"E/v1beta1/{parent=projects/*/locations/*/entryGroups/*/entries/*}/tags:\003tagZB";/v1beta1/{parent=projects/*/locations/*/entryGroups/*}/tags:\003tag\332A\nparent,tag', - create_key=_descriptor._internal_create_key, - ), - _descriptor.MethodDescriptor( - name="UpdateTag", - full_name="google.cloud.datacatalog.v1beta1.DataCatalog.UpdateTag", - index=21, - containing_service=None, - input_type=_UPDATETAGREQUEST, - output_type=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_tags__pb2._TAG, - serialized_options=b"\202\323\344\223\002\230\0012I/v1beta1/{tag.name=projects/*/locations/*/entryGroups/*/entries/*/tags/*}:\003tagZF2?/v1beta1/{tag.name=projects/*/locations/*/entryGroups/*/tags/*}:\003tag\332A\003tag\332A\017tag,update_mask", - create_key=_descriptor._internal_create_key, - ), - _descriptor.MethodDescriptor( - name="DeleteTag", - full_name="google.cloud.datacatalog.v1beta1.DataCatalog.DeleteTag", - index=22, - containing_service=None, - input_type=_DELETETAGREQUEST, - output_type=google_dot_protobuf_dot_empty__pb2._EMPTY, - serialized_options=b"\202\323\344\223\002\206\001*E/v1beta1/{name=projects/*/locations/*/entryGroups/*/entries/*/tags/*}Z=*;/v1beta1/{name=projects/*/locations/*/entryGroups/*/tags/*}\332A\004name", - create_key=_descriptor._internal_create_key, - ), - _descriptor.MethodDescriptor( - name="ListTags", - full_name="google.cloud.datacatalog.v1beta1.DataCatalog.ListTags", - index=23, - containing_service=None, - input_type=_LISTTAGSREQUEST, - output_type=_LISTTAGSRESPONSE, - 
serialized_options=b"\202\323\344\223\002\206\001\022E/v1beta1/{parent=projects/*/locations/*/entryGroups/*/entries/*}/tagsZ=\022;/v1beta1/{parent=projects/*/locations/*/entryGroups/*}/tags\332A\006parent", - create_key=_descriptor._internal_create_key, - ), - _descriptor.MethodDescriptor( - name="SetIamPolicy", - full_name="google.cloud.datacatalog.v1beta1.DataCatalog.SetIamPolicy", - index=24, - containing_service=None, - input_type=google_dot_iam_dot_v1_dot_iam__policy__pb2._SETIAMPOLICYREQUEST, - output_type=google_dot_iam_dot_v1_dot_policy__pb2._POLICY, - serialized_options=b'\202\323\344\223\002\227\001"F/v1beta1/{resource=projects/*/locations/*/tagTemplates/*}:setIamPolicy:\001*ZJ"E/v1beta1/{resource=projects/*/locations/*/entryGroups/*}:setIamPolicy:\001*\332A\017resource,policy', - create_key=_descriptor._internal_create_key, - ), - _descriptor.MethodDescriptor( - name="GetIamPolicy", - full_name="google.cloud.datacatalog.v1beta1.DataCatalog.GetIamPolicy", - index=25, - containing_service=None, - input_type=google_dot_iam_dot_v1_dot_iam__policy__pb2._GETIAMPOLICYREQUEST, - output_type=google_dot_iam_dot_v1_dot_policy__pb2._POLICY, - serialized_options=b'\202\323\344\223\002\355\001"F/v1beta1/{resource=projects/*/locations/*/tagTemplates/*}:getIamPolicy:\001*ZJ"E/v1beta1/{resource=projects/*/locations/*/entryGroups/*}:getIamPolicy:\001*ZT"O/v1beta1/{resource=projects/*/locations/*/entryGroups/*/entries/*}:getIamPolicy:\001*\332A\010resource', - create_key=_descriptor._internal_create_key, - ), - _descriptor.MethodDescriptor( - name="TestIamPermissions", - full_name="google.cloud.datacatalog.v1beta1.DataCatalog.TestIamPermissions", - index=26, - containing_service=None, - input_type=google_dot_iam_dot_v1_dot_iam__policy__pb2._TESTIAMPERMISSIONSREQUEST, - output_type=google_dot_iam_dot_v1_dot_iam__policy__pb2._TESTIAMPERMISSIONSRESPONSE, - 
serialized_options=b'\202\323\344\223\002\377\001"L/v1beta1/{resource=projects/*/locations/*/tagTemplates/*}:testIamPermissions:\001*ZP"K/v1beta1/{resource=projects/*/locations/*/entryGroups/*}:testIamPermissions:\001*ZZ"U/v1beta1/{resource=projects/*/locations/*/entryGroups/*/entries/*}:testIamPermissions:\001*', - create_key=_descriptor._internal_create_key, - ), - ], -) -_sym_db.RegisterServiceDescriptor(_DATACATALOG) - -DESCRIPTOR.services_by_name["DataCatalog"] = _DATACATALOG - -# @@protoc_insertion_point(module_scope) diff --git a/google/cloud/datacatalog_v1beta1/proto/datacatalog_pb2_grpc.py b/google/cloud/datacatalog_v1beta1/proto/datacatalog_pb2_grpc.py deleted file mode 100644 index 4761d635..00000000 --- a/google/cloud/datacatalog_v1beta1/proto/datacatalog_pb2_grpc.py +++ /dev/null @@ -1,1361 +0,0 @@ -# Generated by the gRPC Python protocol compiler plugin. DO NOT EDIT! -"""Client and server classes corresponding to protobuf-defined services.""" -import grpc - -from google.cloud.datacatalog_v1beta1.proto import ( - datacatalog_pb2 as google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2, -) -from google.cloud.datacatalog_v1beta1.proto import ( - tags_pb2 as google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_tags__pb2, -) -from google.iam.v1 import iam_policy_pb2 as google_dot_iam_dot_v1_dot_iam__policy__pb2 -from google.iam.v1 import policy_pb2 as google_dot_iam_dot_v1_dot_policy__pb2 -from google.protobuf import empty_pb2 as google_dot_protobuf_dot_empty__pb2 - - -class DataCatalogStub(object): - """Data Catalog API service allows clients to discover, understand, and manage - their data. - """ - - def __init__(self, channel): - """Constructor. - - Args: - channel: A grpc.Channel. 
- """ - self.SearchCatalog = channel.unary_unary( - "/google.cloud.datacatalog.v1beta1.DataCatalog/SearchCatalog", - request_serializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.SearchCatalogRequest.SerializeToString, - response_deserializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.SearchCatalogResponse.FromString, - ) - self.CreateEntryGroup = channel.unary_unary( - "/google.cloud.datacatalog.v1beta1.DataCatalog/CreateEntryGroup", - request_serializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.CreateEntryGroupRequest.SerializeToString, - response_deserializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.EntryGroup.FromString, - ) - self.UpdateEntryGroup = channel.unary_unary( - "/google.cloud.datacatalog.v1beta1.DataCatalog/UpdateEntryGroup", - request_serializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.UpdateEntryGroupRequest.SerializeToString, - response_deserializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.EntryGroup.FromString, - ) - self.GetEntryGroup = channel.unary_unary( - "/google.cloud.datacatalog.v1beta1.DataCatalog/GetEntryGroup", - request_serializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.GetEntryGroupRequest.SerializeToString, - response_deserializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.EntryGroup.FromString, - ) - self.DeleteEntryGroup = channel.unary_unary( - "/google.cloud.datacatalog.v1beta1.DataCatalog/DeleteEntryGroup", - request_serializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.DeleteEntryGroupRequest.SerializeToString, - response_deserializer=google_dot_protobuf_dot_empty__pb2.Empty.FromString, - ) - self.ListEntryGroups = channel.unary_unary( - "/google.cloud.datacatalog.v1beta1.DataCatalog/ListEntryGroups", - 
request_serializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.ListEntryGroupsRequest.SerializeToString, - response_deserializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.ListEntryGroupsResponse.FromString, - ) - self.CreateEntry = channel.unary_unary( - "/google.cloud.datacatalog.v1beta1.DataCatalog/CreateEntry", - request_serializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.CreateEntryRequest.SerializeToString, - response_deserializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.Entry.FromString, - ) - self.UpdateEntry = channel.unary_unary( - "/google.cloud.datacatalog.v1beta1.DataCatalog/UpdateEntry", - request_serializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.UpdateEntryRequest.SerializeToString, - response_deserializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.Entry.FromString, - ) - self.DeleteEntry = channel.unary_unary( - "/google.cloud.datacatalog.v1beta1.DataCatalog/DeleteEntry", - request_serializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.DeleteEntryRequest.SerializeToString, - response_deserializer=google_dot_protobuf_dot_empty__pb2.Empty.FromString, - ) - self.GetEntry = channel.unary_unary( - "/google.cloud.datacatalog.v1beta1.DataCatalog/GetEntry", - request_serializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.GetEntryRequest.SerializeToString, - response_deserializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.Entry.FromString, - ) - self.LookupEntry = channel.unary_unary( - "/google.cloud.datacatalog.v1beta1.DataCatalog/LookupEntry", - request_serializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.LookupEntryRequest.SerializeToString, - response_deserializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.Entry.FromString, 
- ) - self.ListEntries = channel.unary_unary( - "/google.cloud.datacatalog.v1beta1.DataCatalog/ListEntries", - request_serializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.ListEntriesRequest.SerializeToString, - response_deserializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.ListEntriesResponse.FromString, - ) - self.CreateTagTemplate = channel.unary_unary( - "/google.cloud.datacatalog.v1beta1.DataCatalog/CreateTagTemplate", - request_serializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.CreateTagTemplateRequest.SerializeToString, - response_deserializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_tags__pb2.TagTemplate.FromString, - ) - self.GetTagTemplate = channel.unary_unary( - "/google.cloud.datacatalog.v1beta1.DataCatalog/GetTagTemplate", - request_serializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.GetTagTemplateRequest.SerializeToString, - response_deserializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_tags__pb2.TagTemplate.FromString, - ) - self.UpdateTagTemplate = channel.unary_unary( - "/google.cloud.datacatalog.v1beta1.DataCatalog/UpdateTagTemplate", - request_serializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.UpdateTagTemplateRequest.SerializeToString, - response_deserializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_tags__pb2.TagTemplate.FromString, - ) - self.DeleteTagTemplate = channel.unary_unary( - "/google.cloud.datacatalog.v1beta1.DataCatalog/DeleteTagTemplate", - request_serializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.DeleteTagTemplateRequest.SerializeToString, - response_deserializer=google_dot_protobuf_dot_empty__pb2.Empty.FromString, - ) - self.CreateTagTemplateField = channel.unary_unary( - "/google.cloud.datacatalog.v1beta1.DataCatalog/CreateTagTemplateField", - 
request_serializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.CreateTagTemplateFieldRequest.SerializeToString, - response_deserializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_tags__pb2.TagTemplateField.FromString, - ) - self.UpdateTagTemplateField = channel.unary_unary( - "/google.cloud.datacatalog.v1beta1.DataCatalog/UpdateTagTemplateField", - request_serializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.UpdateTagTemplateFieldRequest.SerializeToString, - response_deserializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_tags__pb2.TagTemplateField.FromString, - ) - self.RenameTagTemplateField = channel.unary_unary( - "/google.cloud.datacatalog.v1beta1.DataCatalog/RenameTagTemplateField", - request_serializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.RenameTagTemplateFieldRequest.SerializeToString, - response_deserializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_tags__pb2.TagTemplateField.FromString, - ) - self.DeleteTagTemplateField = channel.unary_unary( - "/google.cloud.datacatalog.v1beta1.DataCatalog/DeleteTagTemplateField", - request_serializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.DeleteTagTemplateFieldRequest.SerializeToString, - response_deserializer=google_dot_protobuf_dot_empty__pb2.Empty.FromString, - ) - self.CreateTag = channel.unary_unary( - "/google.cloud.datacatalog.v1beta1.DataCatalog/CreateTag", - request_serializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.CreateTagRequest.SerializeToString, - response_deserializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_tags__pb2.Tag.FromString, - ) - self.UpdateTag = channel.unary_unary( - "/google.cloud.datacatalog.v1beta1.DataCatalog/UpdateTag", - request_serializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.UpdateTagRequest.SerializeToString, - 
response_deserializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_tags__pb2.Tag.FromString, - ) - self.DeleteTag = channel.unary_unary( - "/google.cloud.datacatalog.v1beta1.DataCatalog/DeleteTag", - request_serializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.DeleteTagRequest.SerializeToString, - response_deserializer=google_dot_protobuf_dot_empty__pb2.Empty.FromString, - ) - self.ListTags = channel.unary_unary( - "/google.cloud.datacatalog.v1beta1.DataCatalog/ListTags", - request_serializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.ListTagsRequest.SerializeToString, - response_deserializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.ListTagsResponse.FromString, - ) - self.SetIamPolicy = channel.unary_unary( - "/google.cloud.datacatalog.v1beta1.DataCatalog/SetIamPolicy", - request_serializer=google_dot_iam_dot_v1_dot_iam__policy__pb2.SetIamPolicyRequest.SerializeToString, - response_deserializer=google_dot_iam_dot_v1_dot_policy__pb2.Policy.FromString, - ) - self.GetIamPolicy = channel.unary_unary( - "/google.cloud.datacatalog.v1beta1.DataCatalog/GetIamPolicy", - request_serializer=google_dot_iam_dot_v1_dot_iam__policy__pb2.GetIamPolicyRequest.SerializeToString, - response_deserializer=google_dot_iam_dot_v1_dot_policy__pb2.Policy.FromString, - ) - self.TestIamPermissions = channel.unary_unary( - "/google.cloud.datacatalog.v1beta1.DataCatalog/TestIamPermissions", - request_serializer=google_dot_iam_dot_v1_dot_iam__policy__pb2.TestIamPermissionsRequest.SerializeToString, - response_deserializer=google_dot_iam_dot_v1_dot_iam__policy__pb2.TestIamPermissionsResponse.FromString, - ) - - -class DataCatalogServicer(object): - """Data Catalog API service allows clients to discover, understand, and manage - their data. - """ - - def SearchCatalog(self, request, context): - """Searches Data Catalog for multiple resources like entries, tags that - match a query. 
- - This is a custom method - (https://cloud.google.com/apis/design/custom_methods) and does not return - the complete resource, only the resource identifier and high level - fields. Clients can subsequently call `Get` methods. - - Note that Data Catalog search queries do not guarantee full recall. Query - results that match your query may not be returned, even in subsequent - result pages. Also note that results returned (and not returned) can vary - across repeated search queries. - - See [Data Catalog Search - Syntax](https://cloud.google.com/data-catalog/docs/how-to/search-reference) - for more information. - """ - context.set_code(grpc.StatusCode.UNIMPLEMENTED) - context.set_details("Method not implemented!") - raise NotImplementedError("Method not implemented!") - - def CreateEntryGroup(self, request, context): - """A maximum of 10,000 entry groups may be created per organization across all - locations. - - Users should enable the Data Catalog API in the project identified by - the `parent` parameter (see [Data Catalog Resource Project] - (https://cloud.google.com/data-catalog/docs/concepts/resource-project) for - more information). - """ - context.set_code(grpc.StatusCode.UNIMPLEMENTED) - context.set_details("Method not implemented!") - raise NotImplementedError("Method not implemented!") - - def UpdateEntryGroup(self, request, context): - """Updates an EntryGroup. The user should enable the Data Catalog API in the - project identified by the `entry_group.name` parameter (see [Data Catalog - Resource Project] - (https://cloud.google.com/data-catalog/docs/concepts/resource-project) for - more information). - """ - context.set_code(grpc.StatusCode.UNIMPLEMENTED) - context.set_details("Method not implemented!") - raise NotImplementedError("Method not implemented!") - - def GetEntryGroup(self, request, context): - """Gets an EntryGroup. 
- """ - context.set_code(grpc.StatusCode.UNIMPLEMENTED) - context.set_details("Method not implemented!") - raise NotImplementedError("Method not implemented!") - - def DeleteEntryGroup(self, request, context): - """Deletes an EntryGroup. Only entry groups that do not contain entries can be - deleted. Users should enable the Data Catalog API in the project - identified by the `name` parameter (see [Data Catalog Resource Project] - (https://cloud.google.com/data-catalog/docs/concepts/resource-project) for - more information). - """ - context.set_code(grpc.StatusCode.UNIMPLEMENTED) - context.set_details("Method not implemented!") - raise NotImplementedError("Method not implemented!") - - def ListEntryGroups(self, request, context): - """Lists entry groups. - """ - context.set_code(grpc.StatusCode.UNIMPLEMENTED) - context.set_details("Method not implemented!") - raise NotImplementedError("Method not implemented!") - - def CreateEntry(self, request, context): - """Creates an entry. Only entries of 'FILESET' type or user-specified type can - be created. - - Users should enable the Data Catalog API in the project identified by - the `parent` parameter (see [Data Catalog Resource Project] - (https://cloud.google.com/data-catalog/docs/concepts/resource-project) for - more information). - - A maximum of 100,000 entries may be created per entry group. - """ - context.set_code(grpc.StatusCode.UNIMPLEMENTED) - context.set_details("Method not implemented!") - raise NotImplementedError("Method not implemented!") - - def UpdateEntry(self, request, context): - """Updates an existing entry. - Users should enable the Data Catalog API in the project identified by - the `entry.name` parameter (see [Data Catalog Resource Project] - (https://cloud.google.com/data-catalog/docs/concepts/resource-project) for - more information). 
- """ - context.set_code(grpc.StatusCode.UNIMPLEMENTED) - context.set_details("Method not implemented!") - raise NotImplementedError("Method not implemented!") - - def DeleteEntry(self, request, context): - """Deletes an existing entry. Only entries created through - [CreateEntry][google.cloud.datacatalog.v1beta1.DataCatalog.CreateEntry] - method can be deleted. - Users should enable the Data Catalog API in the project identified by - the `name` parameter (see [Data Catalog Resource Project] - (https://cloud.google.com/data-catalog/docs/concepts/resource-project) for - more information). - """ - context.set_code(grpc.StatusCode.UNIMPLEMENTED) - context.set_details("Method not implemented!") - raise NotImplementedError("Method not implemented!") - - def GetEntry(self, request, context): - """Gets an entry. - """ - context.set_code(grpc.StatusCode.UNIMPLEMENTED) - context.set_details("Method not implemented!") - raise NotImplementedError("Method not implemented!") - - def LookupEntry(self, request, context): - """Get an entry by target resource name. This method allows clients to use - the resource name from the source Google Cloud Platform service to get the - Data Catalog Entry. - """ - context.set_code(grpc.StatusCode.UNIMPLEMENTED) - context.set_details("Method not implemented!") - raise NotImplementedError("Method not implemented!") - - def ListEntries(self, request, context): - """Lists entries. - """ - context.set_code(grpc.StatusCode.UNIMPLEMENTED) - context.set_details("Method not implemented!") - raise NotImplementedError("Method not implemented!") - - def CreateTagTemplate(self, request, context): - """Creates a tag template. The user should enable the Data Catalog API in - the project identified by the `parent` parameter (see [Data Catalog - Resource - Project](https://cloud.google.com/data-catalog/docs/concepts/resource-project) - for more information). 
- """ - context.set_code(grpc.StatusCode.UNIMPLEMENTED) - context.set_details("Method not implemented!") - raise NotImplementedError("Method not implemented!") - - def GetTagTemplate(self, request, context): - """Gets a tag template. - """ - context.set_code(grpc.StatusCode.UNIMPLEMENTED) - context.set_details("Method not implemented!") - raise NotImplementedError("Method not implemented!") - - def UpdateTagTemplate(self, request, context): - """Updates a tag template. This method cannot be used to update the fields of - a template. The tag template fields are represented as separate resources - and should be updated using their own create/update/delete methods. - Users should enable the Data Catalog API in the project identified by - the `tag_template.name` parameter (see [Data Catalog Resource Project] - (https://cloud.google.com/data-catalog/docs/concepts/resource-project) for - more information). - """ - context.set_code(grpc.StatusCode.UNIMPLEMENTED) - context.set_details("Method not implemented!") - raise NotImplementedError("Method not implemented!") - - def DeleteTagTemplate(self, request, context): - """Deletes a tag template and all tags using the template. - Users should enable the Data Catalog API in the project identified by - the `name` parameter (see [Data Catalog Resource Project] - (https://cloud.google.com/data-catalog/docs/concepts/resource-project) for - more information). - """ - context.set_code(grpc.StatusCode.UNIMPLEMENTED) - context.set_details("Method not implemented!") - raise NotImplementedError("Method not implemented!") - - def CreateTagTemplateField(self, request, context): - """Creates a field in a tag template. The user should enable the Data Catalog - API in the project identified by the `parent` parameter (see - [Data Catalog Resource - Project](https://cloud.google.com/data-catalog/docs/concepts/resource-project) - for more information). 
- """ - context.set_code(grpc.StatusCode.UNIMPLEMENTED) - context.set_details("Method not implemented!") - raise NotImplementedError("Method not implemented!") - - def UpdateTagTemplateField(self, request, context): - """Updates a field in a tag template. This method cannot be used to update the - field type. Users should enable the Data Catalog API in the project - identified by the `name` parameter (see [Data Catalog Resource Project] - (https://cloud.google.com/data-catalog/docs/concepts/resource-project) for - more information). - """ - context.set_code(grpc.StatusCode.UNIMPLEMENTED) - context.set_details("Method not implemented!") - raise NotImplementedError("Method not implemented!") - - def RenameTagTemplateField(self, request, context): - """Renames a field in a tag template. The user should enable the Data Catalog - API in the project identified by the `name` parameter (see [Data Catalog - Resource - Project](https://cloud.google.com/data-catalog/docs/concepts/resource-project) - for more information). - """ - context.set_code(grpc.StatusCode.UNIMPLEMENTED) - context.set_details("Method not implemented!") - raise NotImplementedError("Method not implemented!") - - def DeleteTagTemplateField(self, request, context): - """Deletes a field in a tag template and all uses of that field. - Users should enable the Data Catalog API in the project identified by - the `name` parameter (see [Data Catalog Resource Project] - (https://cloud.google.com/data-catalog/docs/concepts/resource-project) for - more information). - """ - context.set_code(grpc.StatusCode.UNIMPLEMENTED) - context.set_details("Method not implemented!") - raise NotImplementedError("Method not implemented!") - - def CreateTag(self, request, context): - """Creates a tag on an [Entry][google.cloud.datacatalog.v1beta1.Entry]. 
- Note: The project identified by the `parent` parameter for the - [tag](https://cloud.google.com/data-catalog/docs/reference/rest/v1beta1/projects.locations.entryGroups.entries.tags/create#path-parameters) - and the - [tag - template](https://cloud.google.com/data-catalog/docs/reference/rest/v1beta1/projects.locations.tagTemplates/create#path-parameters) - used to create the tag must be from the same organization. - """ - context.set_code(grpc.StatusCode.UNIMPLEMENTED) - context.set_details("Method not implemented!") - raise NotImplementedError("Method not implemented!") - - def UpdateTag(self, request, context): - """Updates an existing tag. - """ - context.set_code(grpc.StatusCode.UNIMPLEMENTED) - context.set_details("Method not implemented!") - raise NotImplementedError("Method not implemented!") - - def DeleteTag(self, request, context): - """Deletes a tag. - """ - context.set_code(grpc.StatusCode.UNIMPLEMENTED) - context.set_details("Method not implemented!") - raise NotImplementedError("Method not implemented!") - - def ListTags(self, request, context): - """Lists the tags on an [Entry][google.cloud.datacatalog.v1beta1.Entry]. - """ - context.set_code(grpc.StatusCode.UNIMPLEMENTED) - context.set_details("Method not implemented!") - raise NotImplementedError("Method not implemented!") - - def SetIamPolicy(self, request, context): - """Sets the access control policy for a resource. Replaces any existing - policy. - Supported resources are: - - Tag templates. - - Entries. - - Entry groups. - Note, this method cannot be used to manage policies for BigQuery, Pub/Sub - and any external Google Cloud Platform resources synced to Data Catalog. - - Callers must have the following Google IAM permissions - - `datacatalog.tagTemplates.setIamPolicy` to set policies on tag - templates. - - `datacatalog.entries.setIamPolicy` to set policies on entries. - - `datacatalog.entryGroups.setIamPolicy` to set policies on entry groups. 
- """ - context.set_code(grpc.StatusCode.UNIMPLEMENTED) - context.set_details("Method not implemented!") - raise NotImplementedError("Method not implemented!") - - def GetIamPolicy(self, request, context): - """Gets the access control policy for a resource. A `NOT_FOUND` error - is returned if the resource does not exist. An empty policy is returned - if the resource exists but does not have a policy set on it. - - Supported resources are: - - Tag templates. - - Entries. - - Entry groups. - Note, this method cannot be used to manage policies for BigQuery, Pub/Sub - and any external Google Cloud Platform resources synced to Data Catalog. - - Callers must have the following Google IAM permissions - - `datacatalog.tagTemplates.getIamPolicy` to get policies on tag - templates. - - `datacatalog.entries.getIamPolicy` to get policies on entries. - - `datacatalog.entryGroups.getIamPolicy` to get policies on entry groups. - """ - context.set_code(grpc.StatusCode.UNIMPLEMENTED) - context.set_details("Method not implemented!") - raise NotImplementedError("Method not implemented!") - - def TestIamPermissions(self, request, context): - """Returns the caller's permissions on a resource. - If the resource does not exist, an empty set of permissions is returned - (We don't return a `NOT_FOUND` error). - - Supported resources are: - - Tag templates. - - Entries. - - Entry groups. - Note, this method cannot be used to manage policies for BigQuery, Pub/Sub - and any external Google Cloud Platform resources synced to Data Catalog. - - A caller is not required to have Google IAM permission to make this - request. 
- """ - context.set_code(grpc.StatusCode.UNIMPLEMENTED) - context.set_details("Method not implemented!") - raise NotImplementedError("Method not implemented!") - - -def add_DataCatalogServicer_to_server(servicer, server): - rpc_method_handlers = { - "SearchCatalog": grpc.unary_unary_rpc_method_handler( - servicer.SearchCatalog, - request_deserializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.SearchCatalogRequest.FromString, - response_serializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.SearchCatalogResponse.SerializeToString, - ), - "CreateEntryGroup": grpc.unary_unary_rpc_method_handler( - servicer.CreateEntryGroup, - request_deserializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.CreateEntryGroupRequest.FromString, - response_serializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.EntryGroup.SerializeToString, - ), - "UpdateEntryGroup": grpc.unary_unary_rpc_method_handler( - servicer.UpdateEntryGroup, - request_deserializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.UpdateEntryGroupRequest.FromString, - response_serializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.EntryGroup.SerializeToString, - ), - "GetEntryGroup": grpc.unary_unary_rpc_method_handler( - servicer.GetEntryGroup, - request_deserializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.GetEntryGroupRequest.FromString, - response_serializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.EntryGroup.SerializeToString, - ), - "DeleteEntryGroup": grpc.unary_unary_rpc_method_handler( - servicer.DeleteEntryGroup, - request_deserializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.DeleteEntryGroupRequest.FromString, - response_serializer=google_dot_protobuf_dot_empty__pb2.Empty.SerializeToString, - ), - "ListEntryGroups": 
grpc.unary_unary_rpc_method_handler( - servicer.ListEntryGroups, - request_deserializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.ListEntryGroupsRequest.FromString, - response_serializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.ListEntryGroupsResponse.SerializeToString, - ), - "CreateEntry": grpc.unary_unary_rpc_method_handler( - servicer.CreateEntry, - request_deserializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.CreateEntryRequest.FromString, - response_serializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.Entry.SerializeToString, - ), - "UpdateEntry": grpc.unary_unary_rpc_method_handler( - servicer.UpdateEntry, - request_deserializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.UpdateEntryRequest.FromString, - response_serializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.Entry.SerializeToString, - ), - "DeleteEntry": grpc.unary_unary_rpc_method_handler( - servicer.DeleteEntry, - request_deserializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.DeleteEntryRequest.FromString, - response_serializer=google_dot_protobuf_dot_empty__pb2.Empty.SerializeToString, - ), - "GetEntry": grpc.unary_unary_rpc_method_handler( - servicer.GetEntry, - request_deserializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.GetEntryRequest.FromString, - response_serializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.Entry.SerializeToString, - ), - "LookupEntry": grpc.unary_unary_rpc_method_handler( - servicer.LookupEntry, - request_deserializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.LookupEntryRequest.FromString, - response_serializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.Entry.SerializeToString, - ), - "ListEntries": grpc.unary_unary_rpc_method_handler( - 
servicer.ListEntries, - request_deserializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.ListEntriesRequest.FromString, - response_serializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.ListEntriesResponse.SerializeToString, - ), - "CreateTagTemplate": grpc.unary_unary_rpc_method_handler( - servicer.CreateTagTemplate, - request_deserializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.CreateTagTemplateRequest.FromString, - response_serializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_tags__pb2.TagTemplate.SerializeToString, - ), - "GetTagTemplate": grpc.unary_unary_rpc_method_handler( - servicer.GetTagTemplate, - request_deserializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.GetTagTemplateRequest.FromString, - response_serializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_tags__pb2.TagTemplate.SerializeToString, - ), - "UpdateTagTemplate": grpc.unary_unary_rpc_method_handler( - servicer.UpdateTagTemplate, - request_deserializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.UpdateTagTemplateRequest.FromString, - response_serializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_tags__pb2.TagTemplate.SerializeToString, - ), - "DeleteTagTemplate": grpc.unary_unary_rpc_method_handler( - servicer.DeleteTagTemplate, - request_deserializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.DeleteTagTemplateRequest.FromString, - response_serializer=google_dot_protobuf_dot_empty__pb2.Empty.SerializeToString, - ), - "CreateTagTemplateField": grpc.unary_unary_rpc_method_handler( - servicer.CreateTagTemplateField, - request_deserializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.CreateTagTemplateFieldRequest.FromString, - response_serializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_tags__pb2.TagTemplateField.SerializeToString, - ), - 
"UpdateTagTemplateField": grpc.unary_unary_rpc_method_handler( - servicer.UpdateTagTemplateField, - request_deserializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.UpdateTagTemplateFieldRequest.FromString, - response_serializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_tags__pb2.TagTemplateField.SerializeToString, - ), - "RenameTagTemplateField": grpc.unary_unary_rpc_method_handler( - servicer.RenameTagTemplateField, - request_deserializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.RenameTagTemplateFieldRequest.FromString, - response_serializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_tags__pb2.TagTemplateField.SerializeToString, - ), - "DeleteTagTemplateField": grpc.unary_unary_rpc_method_handler( - servicer.DeleteTagTemplateField, - request_deserializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.DeleteTagTemplateFieldRequest.FromString, - response_serializer=google_dot_protobuf_dot_empty__pb2.Empty.SerializeToString, - ), - "CreateTag": grpc.unary_unary_rpc_method_handler( - servicer.CreateTag, - request_deserializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.CreateTagRequest.FromString, - response_serializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_tags__pb2.Tag.SerializeToString, - ), - "UpdateTag": grpc.unary_unary_rpc_method_handler( - servicer.UpdateTag, - request_deserializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.UpdateTagRequest.FromString, - response_serializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_tags__pb2.Tag.SerializeToString, - ), - "DeleteTag": grpc.unary_unary_rpc_method_handler( - servicer.DeleteTag, - request_deserializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.DeleteTagRequest.FromString, - response_serializer=google_dot_protobuf_dot_empty__pb2.Empty.SerializeToString, - ), - "ListTags": 
grpc.unary_unary_rpc_method_handler( - servicer.ListTags, - request_deserializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.ListTagsRequest.FromString, - response_serializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.ListTagsResponse.SerializeToString, - ), - "SetIamPolicy": grpc.unary_unary_rpc_method_handler( - servicer.SetIamPolicy, - request_deserializer=google_dot_iam_dot_v1_dot_iam__policy__pb2.SetIamPolicyRequest.FromString, - response_serializer=google_dot_iam_dot_v1_dot_policy__pb2.Policy.SerializeToString, - ), - "GetIamPolicy": grpc.unary_unary_rpc_method_handler( - servicer.GetIamPolicy, - request_deserializer=google_dot_iam_dot_v1_dot_iam__policy__pb2.GetIamPolicyRequest.FromString, - response_serializer=google_dot_iam_dot_v1_dot_policy__pb2.Policy.SerializeToString, - ), - "TestIamPermissions": grpc.unary_unary_rpc_method_handler( - servicer.TestIamPermissions, - request_deserializer=google_dot_iam_dot_v1_dot_iam__policy__pb2.TestIamPermissionsRequest.FromString, - response_serializer=google_dot_iam_dot_v1_dot_iam__policy__pb2.TestIamPermissionsResponse.SerializeToString, - ), - } - generic_handler = grpc.method_handlers_generic_handler( - "google.cloud.datacatalog.v1beta1.DataCatalog", rpc_method_handlers - ) - server.add_generic_rpc_handlers((generic_handler,)) - - -# This class is part of an EXPERIMENTAL API. -class DataCatalog(object): - """Data Catalog API service allows clients to discover, understand, and manage - their data. 
- """ - - @staticmethod - def SearchCatalog( - request, - target, - options=(), - channel_credentials=None, - call_credentials=None, - compression=None, - wait_for_ready=None, - timeout=None, - metadata=None, - ): - return grpc.experimental.unary_unary( - request, - target, - "/google.cloud.datacatalog.v1beta1.DataCatalog/SearchCatalog", - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.SearchCatalogRequest.SerializeToString, - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.SearchCatalogResponse.FromString, - options, - channel_credentials, - call_credentials, - compression, - wait_for_ready, - timeout, - metadata, - ) - - @staticmethod - def CreateEntryGroup( - request, - target, - options=(), - channel_credentials=None, - call_credentials=None, - compression=None, - wait_for_ready=None, - timeout=None, - metadata=None, - ): - return grpc.experimental.unary_unary( - request, - target, - "/google.cloud.datacatalog.v1beta1.DataCatalog/CreateEntryGroup", - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.CreateEntryGroupRequest.SerializeToString, - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.EntryGroup.FromString, - options, - channel_credentials, - call_credentials, - compression, - wait_for_ready, - timeout, - metadata, - ) - - @staticmethod - def UpdateEntryGroup( - request, - target, - options=(), - channel_credentials=None, - call_credentials=None, - compression=None, - wait_for_ready=None, - timeout=None, - metadata=None, - ): - return grpc.experimental.unary_unary( - request, - target, - "/google.cloud.datacatalog.v1beta1.DataCatalog/UpdateEntryGroup", - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.UpdateEntryGroupRequest.SerializeToString, - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.EntryGroup.FromString, - options, - channel_credentials, - call_credentials, - compression, - wait_for_ready, - 
timeout, - metadata, - ) - - @staticmethod - def GetEntryGroup( - request, - target, - options=(), - channel_credentials=None, - call_credentials=None, - compression=None, - wait_for_ready=None, - timeout=None, - metadata=None, - ): - return grpc.experimental.unary_unary( - request, - target, - "/google.cloud.datacatalog.v1beta1.DataCatalog/GetEntryGroup", - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.GetEntryGroupRequest.SerializeToString, - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.EntryGroup.FromString, - options, - channel_credentials, - call_credentials, - compression, - wait_for_ready, - timeout, - metadata, - ) - - @staticmethod - def DeleteEntryGroup( - request, - target, - options=(), - channel_credentials=None, - call_credentials=None, - compression=None, - wait_for_ready=None, - timeout=None, - metadata=None, - ): - return grpc.experimental.unary_unary( - request, - target, - "/google.cloud.datacatalog.v1beta1.DataCatalog/DeleteEntryGroup", - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.DeleteEntryGroupRequest.SerializeToString, - google_dot_protobuf_dot_empty__pb2.Empty.FromString, - options, - channel_credentials, - call_credentials, - compression, - wait_for_ready, - timeout, - metadata, - ) - - @staticmethod - def ListEntryGroups( - request, - target, - options=(), - channel_credentials=None, - call_credentials=None, - compression=None, - wait_for_ready=None, - timeout=None, - metadata=None, - ): - return grpc.experimental.unary_unary( - request, - target, - "/google.cloud.datacatalog.v1beta1.DataCatalog/ListEntryGroups", - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.ListEntryGroupsRequest.SerializeToString, - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.ListEntryGroupsResponse.FromString, - options, - channel_credentials, - call_credentials, - compression, - wait_for_ready, - timeout, - metadata, - ) - - 
@staticmethod - def CreateEntry( - request, - target, - options=(), - channel_credentials=None, - call_credentials=None, - compression=None, - wait_for_ready=None, - timeout=None, - metadata=None, - ): - return grpc.experimental.unary_unary( - request, - target, - "/google.cloud.datacatalog.v1beta1.DataCatalog/CreateEntry", - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.CreateEntryRequest.SerializeToString, - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.Entry.FromString, - options, - channel_credentials, - call_credentials, - compression, - wait_for_ready, - timeout, - metadata, - ) - - @staticmethod - def UpdateEntry( - request, - target, - options=(), - channel_credentials=None, - call_credentials=None, - compression=None, - wait_for_ready=None, - timeout=None, - metadata=None, - ): - return grpc.experimental.unary_unary( - request, - target, - "/google.cloud.datacatalog.v1beta1.DataCatalog/UpdateEntry", - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.UpdateEntryRequest.SerializeToString, - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.Entry.FromString, - options, - channel_credentials, - call_credentials, - compression, - wait_for_ready, - timeout, - metadata, - ) - - @staticmethod - def DeleteEntry( - request, - target, - options=(), - channel_credentials=None, - call_credentials=None, - compression=None, - wait_for_ready=None, - timeout=None, - metadata=None, - ): - return grpc.experimental.unary_unary( - request, - target, - "/google.cloud.datacatalog.v1beta1.DataCatalog/DeleteEntry", - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.DeleteEntryRequest.SerializeToString, - google_dot_protobuf_dot_empty__pb2.Empty.FromString, - options, - channel_credentials, - call_credentials, - compression, - wait_for_ready, - timeout, - metadata, - ) - - @staticmethod - def GetEntry( - request, - target, - options=(), - 
channel_credentials=None, - call_credentials=None, - compression=None, - wait_for_ready=None, - timeout=None, - metadata=None, - ): - return grpc.experimental.unary_unary( - request, - target, - "/google.cloud.datacatalog.v1beta1.DataCatalog/GetEntry", - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.GetEntryRequest.SerializeToString, - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.Entry.FromString, - options, - channel_credentials, - call_credentials, - compression, - wait_for_ready, - timeout, - metadata, - ) - - @staticmethod - def LookupEntry( - request, - target, - options=(), - channel_credentials=None, - call_credentials=None, - compression=None, - wait_for_ready=None, - timeout=None, - metadata=None, - ): - return grpc.experimental.unary_unary( - request, - target, - "/google.cloud.datacatalog.v1beta1.DataCatalog/LookupEntry", - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.LookupEntryRequest.SerializeToString, - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.Entry.FromString, - options, - channel_credentials, - call_credentials, - compression, - wait_for_ready, - timeout, - metadata, - ) - - @staticmethod - def ListEntries( - request, - target, - options=(), - channel_credentials=None, - call_credentials=None, - compression=None, - wait_for_ready=None, - timeout=None, - metadata=None, - ): - return grpc.experimental.unary_unary( - request, - target, - "/google.cloud.datacatalog.v1beta1.DataCatalog/ListEntries", - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.ListEntriesRequest.SerializeToString, - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.ListEntriesResponse.FromString, - options, - channel_credentials, - call_credentials, - compression, - wait_for_ready, - timeout, - metadata, - ) - - @staticmethod - def CreateTagTemplate( - request, - target, - options=(), - channel_credentials=None, - 
call_credentials=None, - compression=None, - wait_for_ready=None, - timeout=None, - metadata=None, - ): - return grpc.experimental.unary_unary( - request, - target, - "/google.cloud.datacatalog.v1beta1.DataCatalog/CreateTagTemplate", - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.CreateTagTemplateRequest.SerializeToString, - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_tags__pb2.TagTemplate.FromString, - options, - channel_credentials, - call_credentials, - compression, - wait_for_ready, - timeout, - metadata, - ) - - @staticmethod - def GetTagTemplate( - request, - target, - options=(), - channel_credentials=None, - call_credentials=None, - compression=None, - wait_for_ready=None, - timeout=None, - metadata=None, - ): - return grpc.experimental.unary_unary( - request, - target, - "/google.cloud.datacatalog.v1beta1.DataCatalog/GetTagTemplate", - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.GetTagTemplateRequest.SerializeToString, - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_tags__pb2.TagTemplate.FromString, - options, - channel_credentials, - call_credentials, - compression, - wait_for_ready, - timeout, - metadata, - ) - - @staticmethod - def UpdateTagTemplate( - request, - target, - options=(), - channel_credentials=None, - call_credentials=None, - compression=None, - wait_for_ready=None, - timeout=None, - metadata=None, - ): - return grpc.experimental.unary_unary( - request, - target, - "/google.cloud.datacatalog.v1beta1.DataCatalog/UpdateTagTemplate", - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.UpdateTagTemplateRequest.SerializeToString, - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_tags__pb2.TagTemplate.FromString, - options, - channel_credentials, - call_credentials, - compression, - wait_for_ready, - timeout, - metadata, - ) - - @staticmethod - def DeleteTagTemplate( - request, - target, - options=(), - channel_credentials=None, - 
call_credentials=None, - compression=None, - wait_for_ready=None, - timeout=None, - metadata=None, - ): - return grpc.experimental.unary_unary( - request, - target, - "/google.cloud.datacatalog.v1beta1.DataCatalog/DeleteTagTemplate", - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.DeleteTagTemplateRequest.SerializeToString, - google_dot_protobuf_dot_empty__pb2.Empty.FromString, - options, - channel_credentials, - call_credentials, - compression, - wait_for_ready, - timeout, - metadata, - ) - - @staticmethod - def CreateTagTemplateField( - request, - target, - options=(), - channel_credentials=None, - call_credentials=None, - compression=None, - wait_for_ready=None, - timeout=None, - metadata=None, - ): - return grpc.experimental.unary_unary( - request, - target, - "/google.cloud.datacatalog.v1beta1.DataCatalog/CreateTagTemplateField", - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.CreateTagTemplateFieldRequest.SerializeToString, - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_tags__pb2.TagTemplateField.FromString, - options, - channel_credentials, - call_credentials, - compression, - wait_for_ready, - timeout, - metadata, - ) - - @staticmethod - def UpdateTagTemplateField( - request, - target, - options=(), - channel_credentials=None, - call_credentials=None, - compression=None, - wait_for_ready=None, - timeout=None, - metadata=None, - ): - return grpc.experimental.unary_unary( - request, - target, - "/google.cloud.datacatalog.v1beta1.DataCatalog/UpdateTagTemplateField", - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.UpdateTagTemplateFieldRequest.SerializeToString, - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_tags__pb2.TagTemplateField.FromString, - options, - channel_credentials, - call_credentials, - compression, - wait_for_ready, - timeout, - metadata, - ) - - @staticmethod - def RenameTagTemplateField( - request, - target, - options=(), - 
channel_credentials=None, - call_credentials=None, - compression=None, - wait_for_ready=None, - timeout=None, - metadata=None, - ): - return grpc.experimental.unary_unary( - request, - target, - "/google.cloud.datacatalog.v1beta1.DataCatalog/RenameTagTemplateField", - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.RenameTagTemplateFieldRequest.SerializeToString, - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_tags__pb2.TagTemplateField.FromString, - options, - channel_credentials, - call_credentials, - compression, - wait_for_ready, - timeout, - metadata, - ) - - @staticmethod - def DeleteTagTemplateField( - request, - target, - options=(), - channel_credentials=None, - call_credentials=None, - compression=None, - wait_for_ready=None, - timeout=None, - metadata=None, - ): - return grpc.experimental.unary_unary( - request, - target, - "/google.cloud.datacatalog.v1beta1.DataCatalog/DeleteTagTemplateField", - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.DeleteTagTemplateFieldRequest.SerializeToString, - google_dot_protobuf_dot_empty__pb2.Empty.FromString, - options, - channel_credentials, - call_credentials, - compression, - wait_for_ready, - timeout, - metadata, - ) - - @staticmethod - def CreateTag( - request, - target, - options=(), - channel_credentials=None, - call_credentials=None, - compression=None, - wait_for_ready=None, - timeout=None, - metadata=None, - ): - return grpc.experimental.unary_unary( - request, - target, - "/google.cloud.datacatalog.v1beta1.DataCatalog/CreateTag", - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.CreateTagRequest.SerializeToString, - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_tags__pb2.Tag.FromString, - options, - channel_credentials, - call_credentials, - compression, - wait_for_ready, - timeout, - metadata, - ) - - @staticmethod - def UpdateTag( - request, - target, - options=(), - channel_credentials=None, - 
call_credentials=None, - compression=None, - wait_for_ready=None, - timeout=None, - metadata=None, - ): - return grpc.experimental.unary_unary( - request, - target, - "/google.cloud.datacatalog.v1beta1.DataCatalog/UpdateTag", - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.UpdateTagRequest.SerializeToString, - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_tags__pb2.Tag.FromString, - options, - channel_credentials, - call_credentials, - compression, - wait_for_ready, - timeout, - metadata, - ) - - @staticmethod - def DeleteTag( - request, - target, - options=(), - channel_credentials=None, - call_credentials=None, - compression=None, - wait_for_ready=None, - timeout=None, - metadata=None, - ): - return grpc.experimental.unary_unary( - request, - target, - "/google.cloud.datacatalog.v1beta1.DataCatalog/DeleteTag", - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.DeleteTagRequest.SerializeToString, - google_dot_protobuf_dot_empty__pb2.Empty.FromString, - options, - channel_credentials, - call_credentials, - compression, - wait_for_ready, - timeout, - metadata, - ) - - @staticmethod - def ListTags( - request, - target, - options=(), - channel_credentials=None, - call_credentials=None, - compression=None, - wait_for_ready=None, - timeout=None, - metadata=None, - ): - return grpc.experimental.unary_unary( - request, - target, - "/google.cloud.datacatalog.v1beta1.DataCatalog/ListTags", - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.ListTagsRequest.SerializeToString, - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_datacatalog__pb2.ListTagsResponse.FromString, - options, - channel_credentials, - call_credentials, - compression, - wait_for_ready, - timeout, - metadata, - ) - - @staticmethod - def SetIamPolicy( - request, - target, - options=(), - channel_credentials=None, - call_credentials=None, - compression=None, - wait_for_ready=None, - timeout=None, - metadata=None, - ): 
- return grpc.experimental.unary_unary( - request, - target, - "/google.cloud.datacatalog.v1beta1.DataCatalog/SetIamPolicy", - google_dot_iam_dot_v1_dot_iam__policy__pb2.SetIamPolicyRequest.SerializeToString, - google_dot_iam_dot_v1_dot_policy__pb2.Policy.FromString, - options, - channel_credentials, - call_credentials, - compression, - wait_for_ready, - timeout, - metadata, - ) - - @staticmethod - def GetIamPolicy( - request, - target, - options=(), - channel_credentials=None, - call_credentials=None, - compression=None, - wait_for_ready=None, - timeout=None, - metadata=None, - ): - return grpc.experimental.unary_unary( - request, - target, - "/google.cloud.datacatalog.v1beta1.DataCatalog/GetIamPolicy", - google_dot_iam_dot_v1_dot_iam__policy__pb2.GetIamPolicyRequest.SerializeToString, - google_dot_iam_dot_v1_dot_policy__pb2.Policy.FromString, - options, - channel_credentials, - call_credentials, - compression, - wait_for_ready, - timeout, - metadata, - ) - - @staticmethod - def TestIamPermissions( - request, - target, - options=(), - channel_credentials=None, - call_credentials=None, - compression=None, - wait_for_ready=None, - timeout=None, - metadata=None, - ): - return grpc.experimental.unary_unary( - request, - target, - "/google.cloud.datacatalog.v1beta1.DataCatalog/TestIamPermissions", - google_dot_iam_dot_v1_dot_iam__policy__pb2.TestIamPermissionsRequest.SerializeToString, - google_dot_iam_dot_v1_dot_iam__policy__pb2.TestIamPermissionsResponse.FromString, - options, - channel_credentials, - call_credentials, - compression, - wait_for_ready, - timeout, - metadata, - ) diff --git a/google/cloud/datacatalog_v1beta1/proto/gcs_fileset_spec.proto b/google/cloud/datacatalog_v1beta1/proto/gcs_fileset_spec.proto index e7397d05..c8ca9779 100644 --- a/google/cloud/datacatalog_v1beta1/proto/gcs_fileset_spec.proto +++ b/google/cloud/datacatalog_v1beta1/proto/gcs_fileset_spec.proto @@ -1,4 +1,4 @@ -// Copyright 2019 Google LLC. 
+// Copyright 2020 Google LLC
 //
 // Licensed under the Apache License, Version 2.0 (the "License");
 // you may not use this file except in compliance with the License.
@@ -11,7 +11,6 @@
 // WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 // See the License for the specific language governing permissions and
 // limitations under the License.
-//
 
 syntax = "proto3";
 
@@ -21,38 +20,57 @@ import "google/api/field_behavior.proto";
 import "google/cloud/datacatalog/v1beta1/timestamps.proto";
 
 option cc_enable_arenas = true;
+option csharp_namespace = "Google.Cloud.DataCatalog.V1Beta1";
 option go_package = "google.golang.org/genproto/googleapis/cloud/datacatalog/v1beta1;datacatalog";
 option java_multiple_files = true;
-option java_package = "com.google.cloud.datacatalog";
+option java_package = "com.google.cloud.datacatalog.v1beta1";
+option php_namespace = "Google\\Cloud\\DataCatalog\\V1beta1";
+option ruby_package = "Google::Cloud::DataCatalog::V1beta1";
 
 // Describes a Cloud Storage fileset entry.
 message GcsFilesetSpec {
   // Required. Patterns to identify a set of files in Google Cloud Storage.
+  // See [Cloud Storage
+  // documentation](https://cloud.google.com/storage/docs/gsutil/addlhelp/WildcardNames)
+  // for more information. Note that bucket wildcards are currently not
+  // supported.
   //
   // Examples of valid file_patterns:
   //
-  //  * `gs://bucket_name/*`: matches all files in `bucket_name`
+  //  * `gs://bucket_name/dir/*`: matches all files within `bucket_name/dir`
+  //                              directory.
+  //  * `gs://bucket_name/dir/**`: matches all files in `bucket_name/dir`
+  //                               spanning all subdirectories.
  //  * `gs://bucket_name/file*`: matches files prefixed by `file` in
   //                              `bucket_name`
+  //  * `gs://bucket_name/??.txt`: matches files with two characters followed by
+  //                               `.txt` in `bucket_name`
+  //  * `gs://bucket_name/[aeiou].txt`: matches files that contain a single
+  //                                    vowel character followed by `.txt` in
+  //                                    `bucket_name`
+  //  * `gs://bucket_name/[a-m].txt`: matches files that contain `a`, `b`, ...
+  //                                  or `m` followed by `.txt` in `bucket_name`
   //  * `gs://bucket_name/a/*/b`: matches all files in `bucket_name` that match
   //                              `a/*/b` pattern, such as `a/c/b`, `a/d/b`
   //  * `gs://another_bucket/a.txt`: matches `gs://another_bucket/a.txt`
+  //
+  // You can combine wildcards to provide more powerful matches, for example:
+  //
+  //  * `gs://bucket_name/[a-m]??.j*g`
   repeated string file_patterns = 1 [(google.api.field_behavior) = REQUIRED];
 
-  // Output only. Sample files contained in this fileset, not all files
-  // contained in this fileset are represented here.
-  repeated GcsFileSpec sample_gcs_file_specs = 2
-      [(google.api.field_behavior) = OUTPUT_ONLY];
+  // Output only. Sample files contained in this fileset, not all files contained in this
+  // fileset are represented here.
+  repeated GcsFileSpec sample_gcs_file_specs = 2 [(google.api.field_behavior) = OUTPUT_ONLY];
 }
 
-// Specifications of a single file in GCS.
+// Specifications of a single file in Cloud Storage.
 message GcsFileSpec {
   // Required. The full file path. Example: `gs://bucket_name/a/b.txt`.
   string file_path = 1 [(google.api.field_behavior) = REQUIRED];
 
-  // Output only. Timestamps about the GCS file.
-  SystemTimestamps gcs_timestamps = 2
-      [(google.api.field_behavior) = OUTPUT_ONLY];
+  // Output only. Timestamps about the Cloud Storage file.
+  SystemTimestamps gcs_timestamps = 2 [(google.api.field_behavior) = OUTPUT_ONLY];
 
   // Output only. The size of the file, in bytes.
int64 size_bytes = 4 [(google.api.field_behavior) = OUTPUT_ONLY]; diff --git a/google/cloud/datacatalog_v1beta1/proto/gcs_fileset_spec_pb2.py b/google/cloud/datacatalog_v1beta1/proto/gcs_fileset_spec_pb2.py deleted file mode 100644 index 12387dfc..00000000 --- a/google/cloud/datacatalog_v1beta1/proto/gcs_fileset_spec_pb2.py +++ /dev/null @@ -1,254 +0,0 @@ -# -*- coding: utf-8 -*- -# Generated by the protocol buffer compiler. DO NOT EDIT! -# source: google/cloud/datacatalog_v1beta1/proto/gcs_fileset_spec.proto - -from google.protobuf import descriptor as _descriptor -from google.protobuf import message as _message -from google.protobuf import reflection as _reflection -from google.protobuf import symbol_database as _symbol_database - -# @@protoc_insertion_point(imports) - -_sym_db = _symbol_database.Default() - - -from google.api import field_behavior_pb2 as google_dot_api_dot_field__behavior__pb2 -from google.cloud.datacatalog_v1beta1.proto import ( - timestamps_pb2 as google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_timestamps__pb2, -) - - -DESCRIPTOR = _descriptor.FileDescriptor( - name="google/cloud/datacatalog_v1beta1/proto/gcs_fileset_spec.proto", - package="google.cloud.datacatalog.v1beta1", - syntax="proto3", - serialized_options=b"\n$com.google.cloud.datacatalog.v1beta1P\001ZKgoogle.golang.org/genproto/googleapis/cloud/datacatalog/v1beta1;datacatalog\370\001\001\252\002 Google.Cloud.DataCatalog.V1Beta1\312\002 Google\\Cloud\\DataCatalog\\V1beta1\352\002#Google::Cloud::DataCatalog::V1beta1", - create_key=_descriptor._internal_create_key, - serialized_pb=b'\n=google/cloud/datacatalog_v1beta1/proto/gcs_fileset_spec.proto\x12 google.cloud.datacatalog.v1beta1\x1a\x1fgoogle/api/field_behavior.proto\x1a\x37google/cloud/datacatalog_v1beta1/proto/timestamps.proto"\x7f\n\x0eGcsFilesetSpec\x12\x1a\n\rfile_patterns\x18\x01 \x03(\tB\x03\xe0\x41\x02\x12Q\n\x15sample_gcs_file_specs\x18\x02 
\x03(\x0b\x32-.google.cloud.datacatalog.v1beta1.GcsFileSpecB\x03\xe0\x41\x03"\x8f\x01\n\x0bGcsFileSpec\x12\x16\n\tfile_path\x18\x01 \x01(\tB\x03\xe0\x41\x02\x12O\n\x0egcs_timestamps\x18\x02 \x01(\x0b\x32\x32.google.cloud.datacatalog.v1beta1.SystemTimestampsB\x03\xe0\x41\x03\x12\x17\n\nsize_bytes\x18\x04 \x01(\x03\x42\x03\xe0\x41\x03\x42\xe4\x01\n$com.google.cloud.datacatalog.v1beta1P\x01ZKgoogle.golang.org/genproto/googleapis/cloud/datacatalog/v1beta1;datacatalog\xf8\x01\x01\xaa\x02 Google.Cloud.DataCatalog.V1Beta1\xca\x02 Google\\Cloud\\DataCatalog\\V1beta1\xea\x02#Google::Cloud::DataCatalog::V1beta1b\x06proto3', - dependencies=[ - google_dot_api_dot_field__behavior__pb2.DESCRIPTOR, - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_timestamps__pb2.DESCRIPTOR, - ], -) - - -_GCSFILESETSPEC = _descriptor.Descriptor( - name="GcsFilesetSpec", - full_name="google.cloud.datacatalog.v1beta1.GcsFilesetSpec", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="file_patterns", - full_name="google.cloud.datacatalog.v1beta1.GcsFilesetSpec.file_patterns", - index=0, - number=1, - type=9, - cpp_type=9, - label=3, - has_default_value=False, - default_value=[], - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="sample_gcs_file_specs", - full_name="google.cloud.datacatalog.v1beta1.GcsFilesetSpec.sample_gcs_file_specs", - index=1, - number=2, - type=11, - cpp_type=10, - label=3, - has_default_value=False, - default_value=[], - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\003", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - 
nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=189, - serialized_end=316, -) - - -_GCSFILESPEC = _descriptor.Descriptor( - name="GcsFileSpec", - full_name="google.cloud.datacatalog.v1beta1.GcsFileSpec", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="file_path", - full_name="google.cloud.datacatalog.v1beta1.GcsFileSpec.file_path", - index=0, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="gcs_timestamps", - full_name="google.cloud.datacatalog.v1beta1.GcsFileSpec.gcs_timestamps", - index=1, - number=2, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\003", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="size_bytes", - full_name="google.cloud.datacatalog.v1beta1.GcsFileSpec.size_bytes", - index=2, - number=4, - type=3, - cpp_type=2, - label=1, - has_default_value=False, - default_value=0, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\003", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=319, - serialized_end=462, -) 
- -_GCSFILESETSPEC.fields_by_name["sample_gcs_file_specs"].message_type = _GCSFILESPEC -_GCSFILESPEC.fields_by_name[ - "gcs_timestamps" -].message_type = ( - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_timestamps__pb2._SYSTEMTIMESTAMPS -) -DESCRIPTOR.message_types_by_name["GcsFilesetSpec"] = _GCSFILESETSPEC -DESCRIPTOR.message_types_by_name["GcsFileSpec"] = _GCSFILESPEC -_sym_db.RegisterFileDescriptor(DESCRIPTOR) - -GcsFilesetSpec = _reflection.GeneratedProtocolMessageType( - "GcsFilesetSpec", - (_message.Message,), - { - "DESCRIPTOR": _GCSFILESETSPEC, - "__module__": "google.cloud.datacatalog_v1beta1.proto.gcs_fileset_spec_pb2", - "__doc__": """Describes a Cloud Storage fileset entry. - - Attributes: - file_patterns: - Required. Patterns to identify a set of files in Google Cloud - Storage. See `Cloud Storage documentation `__ for more - information. Note that bucket wildcards are currently not - supported. Examples of valid file_patterns: - - ``gs://bucket_name/dir/*``: matches all files within - ``bucket_name/dir`` directory. - ``gs://bucket_name/dir/**``: - matches all files in ``bucket_name/dir`` spanning all - subdirectories. - ``gs://bucket_name/file*``: matches files - prefixed by ``file`` in ``bucket_name`` - - ``gs://bucket_name/??.txt``: matches files with two characters - followed by ``.txt`` in ``bucket_name`` - - ``gs://bucket_name/[aeiou].txt``: matches files that contain a - single vowel character followed by ``.txt`` in - ``bucket_name`` - ``gs://bucket_name/[a-m].txt``: matches - files that contain ``a``, ``b``, … or ``m`` followed by - ``.txt`` in ``bucket_name`` - ``gs://bucket_name/a/*/b``: - matches all files in ``bucket_name`` that match ``a/*/b`` - pattern, such as ``a/c/b``, ``a/d/b`` - - ``gs://another_bucket/a.txt``: matches - ``gs://another_bucket/a.txt`` You can combine wildcards to - provide more powerful matches, for example: - - ``gs://bucket_name/[a-m]??.j*g`` - sample_gcs_file_specs: - Output only. 
Sample files contained in this fileset, not all - files contained in this fileset are represented here. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.GcsFilesetSpec) - }, -) -_sym_db.RegisterMessage(GcsFilesetSpec) - -GcsFileSpec = _reflection.GeneratedProtocolMessageType( - "GcsFileSpec", - (_message.Message,), - { - "DESCRIPTOR": _GCSFILESPEC, - "__module__": "google.cloud.datacatalog_v1beta1.proto.gcs_fileset_spec_pb2", - "__doc__": """Specifications of a single file in Cloud Storage. - - Attributes: - file_path: - Required. The full file path. Example: - ``gs://bucket_name/a/b.txt``. - gcs_timestamps: - Output only. Timestamps about the Cloud Storage file. - size_bytes: - Output only. The size of the file, in bytes. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.GcsFileSpec) - }, -) -_sym_db.RegisterMessage(GcsFileSpec) - - -DESCRIPTOR._options = None -_GCSFILESETSPEC.fields_by_name["file_patterns"]._options = None -_GCSFILESETSPEC.fields_by_name["sample_gcs_file_specs"]._options = None -_GCSFILESPEC.fields_by_name["file_path"]._options = None -_GCSFILESPEC.fields_by_name["gcs_timestamps"]._options = None -_GCSFILESPEC.fields_by_name["size_bytes"]._options = None -# @@protoc_insertion_point(module_scope) diff --git a/google/cloud/datacatalog_v1beta1/proto/gcs_fileset_spec_pb2_grpc.py b/google/cloud/datacatalog_v1beta1/proto/gcs_fileset_spec_pb2_grpc.py deleted file mode 100644 index 8a939394..00000000 --- a/google/cloud/datacatalog_v1beta1/proto/gcs_fileset_spec_pb2_grpc.py +++ /dev/null @@ -1,3 +0,0 @@ -# Generated by the gRPC Python protocol compiler plugin. DO NOT EDIT! 
-"""Client and server classes corresponding to protobuf-defined services.""" -import grpc diff --git a/google/cloud/datacatalog_v1beta1/proto/policytagmanager.proto b/google/cloud/datacatalog_v1beta1/proto/policytagmanager.proto new file mode 100644 index 00000000..5602bcf1 --- /dev/null +++ b/google/cloud/datacatalog_v1beta1/proto/policytagmanager.proto @@ -0,0 +1,417 @@ +// Copyright 2020 Google LLC +// +// Licensed under the Apache License, Version 2.0 (the "License"); +// you may not use this file except in compliance with the License. +// You may obtain a copy of the License at +// +// http://www.apache.org/licenses/LICENSE-2.0 +// +// Unless required by applicable law or agreed to in writing, software +// distributed under the License is distributed on an "AS IS" BASIS, +// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +// See the License for the specific language governing permissions and +// limitations under the License. + +syntax = "proto3"; + +package google.cloud.datacatalog.v1beta1; + +import "google/api/annotations.proto"; +import "google/api/client.proto"; +import "google/api/field_behavior.proto"; +import "google/api/resource.proto"; +import "google/cloud/datacatalog/v1beta1/timestamps.proto"; +import "google/iam/v1/iam_policy.proto"; +import "google/iam/v1/policy.proto"; +import "google/protobuf/empty.proto"; +import "google/protobuf/field_mask.proto"; + +option cc_enable_arenas = true; +option csharp_namespace = "Google.Cloud.DataCatalog.V1Beta1"; +option go_package = "google.golang.org/genproto/googleapis/cloud/datacatalog/v1beta1;datacatalog"; +option java_multiple_files = true; +option java_outer_classname = "PolicyTagManagerProto"; +option java_package = "com.google.cloud.datacatalog.v1beta1"; +option php_namespace = "Google\\Cloud\\DataCatalog\\V1beta1"; +option ruby_package = "Google::Cloud::DataCatalog::V1beta1"; + +// The policy tag manager API service allows clients to manage their taxonomies +// and policy 
tags. +service PolicyTagManager { + option (google.api.default_host) = "datacatalog.googleapis.com"; + option (google.api.oauth_scopes) = "https://www.googleapis.com/auth/cloud-platform"; + + // Creates a taxonomy in the specified project. + rpc CreateTaxonomy(CreateTaxonomyRequest) returns (Taxonomy) { + option (google.api.http) = { + post: "/v1beta1/{parent=projects/*/locations/*}/taxonomies" + body: "taxonomy" + }; + option (google.api.method_signature) = "parent,taxonomy"; + } + + // Deletes a taxonomy. This operation will also delete all + // policy tags in this taxonomy along with their associated policies. + rpc DeleteTaxonomy(DeleteTaxonomyRequest) returns (google.protobuf.Empty) { + option (google.api.http) = { + delete: "/v1beta1/{name=projects/*/locations/*/taxonomies/*}" + }; + option (google.api.method_signature) = "name"; + } + + // Updates a taxonomy. + rpc UpdateTaxonomy(UpdateTaxonomyRequest) returns (Taxonomy) { + option (google.api.http) = { + patch: "/v1beta1/{taxonomy.name=projects/*/locations/*/taxonomies/*}" + body: "taxonomy" + }; + option (google.api.method_signature) = "taxonomy"; + } + + // Lists all taxonomies in a project in a particular location that the caller + // has permission to view. + rpc ListTaxonomies(ListTaxonomiesRequest) returns (ListTaxonomiesResponse) { + option (google.api.http) = { + get: "/v1beta1/{parent=projects/*/locations/*}/taxonomies" + }; + option (google.api.method_signature) = "parent"; + } + + // Gets a taxonomy. + rpc GetTaxonomy(GetTaxonomyRequest) returns (Taxonomy) { + option (google.api.http) = { + get: "/v1beta1/{name=projects/*/locations/*/taxonomies/*}" + }; + option (google.api.method_signature) = "name"; + } + + // Creates a policy tag in the specified taxonomy. 
+ rpc CreatePolicyTag(CreatePolicyTagRequest) returns (PolicyTag) { + option (google.api.http) = { + post: "/v1beta1/{parent=projects/*/locations/*/taxonomies/*}/policyTags" + body: "policy_tag" + }; + option (google.api.method_signature) = "parent,policy_tag"; + } + + // Deletes a policy tag. Also deletes all of its descendant policy tags. + rpc DeletePolicyTag(DeletePolicyTagRequest) returns (google.protobuf.Empty) { + option (google.api.http) = { + delete: "/v1beta1/{name=projects/*/locations/*/taxonomies/*/policyTags/*}" + }; + option (google.api.method_signature) = "name"; + } + + // Updates a policy tag. + rpc UpdatePolicyTag(UpdatePolicyTagRequest) returns (PolicyTag) { + option (google.api.http) = { + patch: "/v1beta1/{policy_tag.name=projects/*/locations/*/taxonomies/*/policyTags/*}" + body: "policy_tag" + }; + option (google.api.method_signature) = "policy_tag"; + } + + // Lists all policy tags in a taxonomy. + rpc ListPolicyTags(ListPolicyTagsRequest) returns (ListPolicyTagsResponse) { + option (google.api.http) = { + get: "/v1beta1/{parent=projects/*/locations/*/taxonomies/*}/policyTags" + }; + option (google.api.method_signature) = "parent"; + } + + // Gets a policy tag. + rpc GetPolicyTag(GetPolicyTagRequest) returns (PolicyTag) { + option (google.api.http) = { + get: "/v1beta1/{name=projects/*/locations/*/taxonomies/*/policyTags/*}" + }; + option (google.api.method_signature) = "name"; + } + + // Gets the IAM policy for a taxonomy or a policy tag. + rpc GetIamPolicy(google.iam.v1.GetIamPolicyRequest) returns (google.iam.v1.Policy) { + option (google.api.http) = { + post: "/v1beta1/{resource=projects/*/locations/*/taxonomies/*}:getIamPolicy" + body: "*" + additional_bindings { + post: "/v1beta1/{resource=projects/*/locations/*/taxonomies/*/policyTags/*}:getIamPolicy" + body: "*" + } + }; + } + + // Sets the IAM policy for a taxonomy or a policy tag. 
+ rpc SetIamPolicy(google.iam.v1.SetIamPolicyRequest) returns (google.iam.v1.Policy) { + option (google.api.http) = { + post: "/v1beta1/{resource=projects/*/locations/*/taxonomies/*}:setIamPolicy" + body: "*" + additional_bindings { + post: "/v1beta1/{resource=projects/*/locations/*/taxonomies/*/policyTags/*}:setIamPolicy" + body: "*" + } + }; + } + + // Returns the permissions that a caller has on the specified taxonomy or + // policy tag. + rpc TestIamPermissions(google.iam.v1.TestIamPermissionsRequest) returns (google.iam.v1.TestIamPermissionsResponse) { + option (google.api.http) = { + post: "/v1beta1/{resource=projects/*/locations/*/taxonomies/*}:testIamPermissions" + body: "*" + additional_bindings { + post: "/v1beta1/{resource=projects/*/locations/*/taxonomies/*/policyTags/*}:testIamPermissions" + body: "*" + } + }; + } +} + +// A taxonomy is a collection of policy tags that classify data along a common +// axis. For instance, a data *sensitivity* taxonomy could contain policy tags +// denoting PII such as age, zipcode, and SSN. A data *origin* taxonomy could +// contain policy tags to distinguish user data, employee data, partner data, +// and public data. +message Taxonomy { + option (google.api.resource) = { + type: "datacatalog.googleapis.com/Taxonomy" + pattern: "projects/{project}/locations/{location}/taxonomies/{taxonomy}" + }; + + // Defines the policy types for which a policy tag can be used. + enum PolicyType { + // Unspecified policy type. + POLICY_TYPE_UNSPECIFIED = 0; + + // Fine grained access control policy, which enables access control on + // tagged resources. + FINE_GRAINED_ACCESS_CONTROL = 1; + } + + // Output only. Resource name of this taxonomy, whose format is: + // "projects/{project_number}/locations/{location_id}/taxonomies/{id}". + string name = 1 [(google.api.field_behavior) = OUTPUT_ONLY]; + + // Required. User defined name of this taxonomy. 
It must: contain only unicode letters, + // numbers, underscores, dashes and spaces; not start or end with spaces; and + // be at most 200 bytes long when encoded in UTF-8. + string display_name = 2 [(google.api.field_behavior) = REQUIRED]; + + // Optional. Description of this taxonomy. It must: contain only unicode characters, + // tabs, newlines, carriage returns and page breaks; and be at most 2000 bytes + // long when encoded in UTF-8. If not set, defaults to an empty description. + string description = 3 [(google.api.field_behavior) = OPTIONAL]; + + // Optional. A list of policy types that are activated for this taxonomy. If not set, + // defaults to an empty list. + repeated PolicyType activated_policy_types = 6 [(google.api.field_behavior) = OPTIONAL]; +} + +// Denotes one policy tag in a taxonomy (e.g. ssn). Policy Tags can be defined +// in a hierarchy. For example, consider the following hierarchy: +// Geolocation -> (LatLong, City, ZipCode). PolicyTag "Geolocation" +// contains three child policy tags: "LatLong", "City", and "ZipCode". +message PolicyTag { + option (google.api.resource) = { + type: "datacatalog.googleapis.com/PolicyTag" + pattern: "projects/{project}/locations/{location}/taxonomies/{taxonomy}/policyTags/{policy_tag}" + }; + + // Output only. Resource name of this policy tag, whose format is: + // "projects/{project_number}/locations/{location_id}/taxonomies/{taxonomy_id}/policyTags/{id}". + string name = 1 [(google.api.field_behavior) = OUTPUT_ONLY]; + + // Required. User defined name of this policy tag. It must: be unique within the parent + // taxonomy; contain only unicode letters, numbers, underscores, dashes and + // spaces; not start or end with spaces; and be at most 200 bytes long when + // encoded in UTF-8. + string display_name = 2 [(google.api.field_behavior) = REQUIRED]; + + // Description of this policy tag. 
It must: contain only unicode characters, + // tabs, newlines, carriage returns and page breaks; and be at most 2000 bytes + // long when encoded in UTF-8. If not set, defaults to an empty description. + string description = 3; + + // Resource name of this policy tag's parent policy tag (e.g. for the + // "LatLong" policy tag in the example above, this field contains the + // resource name of the "Geolocation" policy tag). If empty, it means this + // policy tag is a top level policy tag (e.g. this field is empty for the + // "Geolocation" policy tag in the example above). If not set, defaults to an + // empty string. + string parent_policy_tag = 4; + + // Output only. Resource names of child policy tags of this policy tag. + repeated string child_policy_tags = 5 [(google.api.field_behavior) = OUTPUT_ONLY]; +} + +// Request message for +// [CreateTaxonomy][google.cloud.datacatalog.v1beta1.PolicyTagManager.CreateTaxonomy]. +message CreateTaxonomyRequest { + // Required. Resource name of the project that the taxonomy will belong to. + string parent = 1 [ + (google.api.field_behavior) = REQUIRED, + (google.api.resource_reference) = { + child_type: "datacatalog.googleapis.com/Taxonomy" + } + ]; + + // The taxonomy to be created. + Taxonomy taxonomy = 2; +} + +// Request message for +// [DeleteTaxonomy][google.cloud.datacatalog.v1beta1.PolicyTagManager.DeleteTaxonomy]. +message DeleteTaxonomyRequest { + // Required. Resource name of the taxonomy to be deleted. All policy tags in + // this taxonomy will also be deleted. + string name = 1 [ + (google.api.field_behavior) = REQUIRED, + (google.api.resource_reference) = { + type: "datacatalog.googleapis.com/Taxonomy" + } + ]; +} + +// Request message for +// [UpdateTaxonomy][google.cloud.datacatalog.v1beta1.PolicyTagManager.UpdateTaxonomy]. +message UpdateTaxonomyRequest { + // The taxonomy to update. 
Only description, display_name, and activated + // policy types can be updated. + Taxonomy taxonomy = 1; + + // The update mask applies to the resource. For the `FieldMask` definition, + // see + // https://developers.google.com/protocol-buffers/docs/reference/google.protobuf#fieldmask + // If not set, defaults to all of the fields that are allowed to update. + google.protobuf.FieldMask update_mask = 2; +} + +// Request message for +// [ListTaxonomies][google.cloud.datacatalog.v1beta1.PolicyTagManager.ListTaxonomies]. +message ListTaxonomiesRequest { + // Required. Resource name of the project to list the taxonomies of. + string parent = 1 [ + (google.api.field_behavior) = REQUIRED, + (google.api.resource_reference) = { + child_type: "datacatalog.googleapis.com/Taxonomy" + } + ]; + + // The maximum number of items to return. Must be a value between 1 and 1000. + // If not set, defaults to 50. + int32 page_size = 2; + + // The next_page_token value returned from a previous list request, if any. If + // not set, defaults to an empty string. + string page_token = 3; +} + +// Response message for +// [ListTaxonomies][google.cloud.datacatalog.v1beta1.PolicyTagManager.ListTaxonomies]. +message ListTaxonomiesResponse { + // Taxonomies that the project contains. + repeated Taxonomy taxonomies = 1; + + // Token used to retrieve the next page of results, or empty if there are no + // more results in the list. + string next_page_token = 2; +} + +// Request message for +// [GetTaxonomy][google.cloud.datacatalog.v1beta1.PolicyTagManager.GetTaxonomy]. +message GetTaxonomyRequest { + // Required. Resource name of the requested taxonomy. + string name = 1 [ + (google.api.field_behavior) = REQUIRED, + (google.api.resource_reference) = { + type: "datacatalog.googleapis.com/Taxonomy" + } + ]; +} + +// Request message for +// [CreatePolicyTag][google.cloud.datacatalog.v1beta1.PolicyTagManager.CreatePolicyTag]. +message CreatePolicyTagRequest { + // Required. 
Resource name of the taxonomy that the policy tag will belong to. + string parent = 1 [ + (google.api.field_behavior) = REQUIRED, + (google.api.resource_reference) = { + child_type: "datacatalog.googleapis.com/PolicyTag" + } + ]; + + // The policy tag to be created. + PolicyTag policy_tag = 2; +} + +// Request message for +// [DeletePolicyTag][google.cloud.datacatalog.v1beta1.PolicyTagManager.DeletePolicyTag]. +message DeletePolicyTagRequest { + // Required. Resource name of the policy tag to be deleted. All of its descendant + // policy tags will also be deleted. + string name = 1 [ + (google.api.field_behavior) = REQUIRED, + (google.api.resource_reference) = { + type: "datacatalog.googleapis.com/PolicyTag" + } + ]; +} + +// Request message for +// [UpdatePolicyTag][google.cloud.datacatalog.v1beta1.PolicyTagManager.UpdatePolicyTag]. +message UpdatePolicyTagRequest { + // The policy tag to update. Only the description, display_name, and + // parent_policy_tag fields can be updated. + PolicyTag policy_tag = 1; + + // The update mask applies to the resource. Only display_name, description and + // parent_policy_tag can be updated and thus can be listed in the mask. If + // update_mask is not provided, all allowed fields (i.e. display_name, + // description and parent_policy_tag) will be updated. For more information including the + // `FieldMask` definition, see + // https://developers.google.com/protocol-buffers/docs/reference/google.protobuf#fieldmask + google.protobuf.FieldMask update_mask = 2; +} + +// Request message for +// [ListPolicyTags][google.cloud.datacatalog.v1beta1.PolicyTagManager.ListPolicyTags]. +message ListPolicyTagsRequest { + // Required. Resource name of the taxonomy to list the policy tags of. 
+ string parent = 1 [ + (google.api.field_behavior) = REQUIRED, + (google.api.resource_reference) = { + child_type: "datacatalog.googleapis.com/PolicyTag" + } + ]; + + // The maximum number of items to return. Must be a value between 1 and 1000. + // If not set, defaults to 50. + int32 page_size = 2; + + // The next_page_token value returned from a previous List request, if any. If + // not set, defaults to an empty string. + string page_token = 3; +} + +// Response message for +// [ListPolicyTags][google.cloud.datacatalog.v1beta1.PolicyTagManager.ListPolicyTags]. +message ListPolicyTagsResponse { + // The policy tags that are in the requested taxonomy. + repeated PolicyTag policy_tags = 1; + + // Token used to retrieve the next page of results, or empty if there are no + // more results in the list. + string next_page_token = 2; +} + +// Request message for +// [GetPolicyTag][google.cloud.datacatalog.v1beta1.PolicyTagManager.GetPolicyTag]. +message GetPolicyTagRequest { + // Required. Resource name of the requested policy tag. + string name = 1 [ + (google.api.field_behavior) = REQUIRED, + (google.api.resource_reference) = { + type: "datacatalog.googleapis.com/PolicyTag" + } + ]; +} diff --git a/google/cloud/datacatalog_v1beta1/proto/policytagmanager_pb2.py b/google/cloud/datacatalog_v1beta1/proto/policytagmanager_pb2.py deleted file mode 100644 index 3866a3bc..00000000 --- a/google/cloud/datacatalog_v1beta1/proto/policytagmanager_pb2.py +++ /dev/null @@ -1,1514 +0,0 @@ -# -*- coding: utf-8 -*- -# Generated by the protocol buffer compiler. DO NOT EDIT! 
-# source: google/cloud/datacatalog_v1beta1/proto/policytagmanager.proto - -from google.protobuf import descriptor as _descriptor -from google.protobuf import message as _message -from google.protobuf import reflection as _reflection -from google.protobuf import symbol_database as _symbol_database - -# @@protoc_insertion_point(imports) - -_sym_db = _symbol_database.Default() - - -from google.api import annotations_pb2 as google_dot_api_dot_annotations__pb2 -from google.api import client_pb2 as google_dot_api_dot_client__pb2 -from google.api import field_behavior_pb2 as google_dot_api_dot_field__behavior__pb2 -from google.api import resource_pb2 as google_dot_api_dot_resource__pb2 -from google.cloud.datacatalog_v1beta1.proto import ( - timestamps_pb2 as google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_timestamps__pb2, -) -from google.iam.v1 import iam_policy_pb2 as google_dot_iam_dot_v1_dot_iam__policy__pb2 -from google.iam.v1 import policy_pb2 as google_dot_iam_dot_v1_dot_policy__pb2 -from google.protobuf import empty_pb2 as google_dot_protobuf_dot_empty__pb2 -from google.protobuf import field_mask_pb2 as google_dot_protobuf_dot_field__mask__pb2 - - -DESCRIPTOR = _descriptor.FileDescriptor( - name="google/cloud/datacatalog_v1beta1/proto/policytagmanager.proto", - package="google.cloud.datacatalog.v1beta1", - syntax="proto3", - serialized_options=b"\n$com.google.cloud.datacatalog.v1beta1B\025PolicyTagManagerProtoP\001ZKgoogle.golang.org/genproto/googleapis/cloud/datacatalog/v1beta1;datacatalog\370\001\001\252\002 Google.Cloud.DataCatalog.V1Beta1\312\002 Google\\Cloud\\DataCatalog\\V1beta1\352\002#Google::Cloud::DataCatalog::V1beta1", - create_key=_descriptor._internal_create_key, - serialized_pb=b'\n=google/cloud/datacatalog_v1beta1/proto/policytagmanager.proto\x12 
google.cloud.datacatalog.v1beta1\x1a\x1cgoogle/api/annotations.proto\x1a\x17google/api/client.proto\x1a\x1fgoogle/api/field_behavior.proto\x1a\x19google/api/resource.proto\x1a\x37google/cloud/datacatalog_v1beta1/proto/timestamps.proto\x1a\x1egoogle/iam/v1/iam_policy.proto\x1a\x1agoogle/iam/v1/policy.proto\x1a\x1bgoogle/protobuf/empty.proto\x1a google/protobuf/field_mask.proto"\xe3\x02\n\x08Taxonomy\x12\x11\n\x04name\x18\x01 \x01(\tB\x03\xe0\x41\x03\x12\x19\n\x0c\x64isplay_name\x18\x02 \x01(\tB\x03\xe0\x41\x02\x12\x18\n\x0b\x64\x65scription\x18\x03 \x01(\tB\x03\xe0\x41\x01\x12Z\n\x16\x61\x63tivated_policy_types\x18\x06 \x03(\x0e\x32\x35.google.cloud.datacatalog.v1beta1.Taxonomy.PolicyTypeB\x03\xe0\x41\x01"J\n\nPolicyType\x12\x1b\n\x17POLICY_TYPE_UNSPECIFIED\x10\x00\x12\x1f\n\x1b\x46INE_GRAINED_ACCESS_CONTROL\x10\x01:g\xea\x41\x64\n#datacatalog.googleapis.com/Taxonomy\x12=projects/{project}/locations/{location}/taxonomies/{taxonomy}"\x8c\x02\n\tPolicyTag\x12\x11\n\x04name\x18\x01 \x01(\tB\x03\xe0\x41\x03\x12\x19\n\x0c\x64isplay_name\x18\x02 \x01(\tB\x03\xe0\x41\x02\x12\x13\n\x0b\x64\x65scription\x18\x03 \x01(\t\x12\x19\n\x11parent_policy_tag\x18\x04 \x01(\t\x12\x1e\n\x11\x63hild_policy_tags\x18\x05 \x03(\tB\x03\xe0\x41\x03:\x80\x01\xea\x41}\n$datacatalog.googleapis.com/PolicyTag\x12Uprojects/{project}/locations/{location}/taxonomies/{taxonomy}/policyTags/{policy_tag}"\x92\x01\n\x15\x43reateTaxonomyRequest\x12;\n\x06parent\x18\x01 \x01(\tB+\xe0\x41\x02\xfa\x41%\x12#datacatalog.googleapis.com/Taxonomy\x12<\n\x08taxonomy\x18\x02 \x01(\x0b\x32*.google.cloud.datacatalog.v1beta1.Taxonomy"R\n\x15\x44\x65leteTaxonomyRequest\x12\x39\n\x04name\x18\x01 \x01(\tB+\xe0\x41\x02\xfa\x41%\n#datacatalog.googleapis.com/Taxonomy"\x86\x01\n\x15UpdateTaxonomyRequest\x12<\n\x08taxonomy\x18\x01 \x01(\x0b\x32*.google.cloud.datacatalog.v1beta1.Taxonomy\x12/\n\x0bupdate_mask\x18\x02 \x01(\x0b\x32\x1a.google.protobuf.FieldMask"{\n\x15ListTaxonomiesRequest\x12;\n\x06parent\x18\x01 
\x01(\tB+\xe0\x41\x02\xfa\x41%\x12#datacatalog.googleapis.com/Taxonomy\x12\x11\n\tpage_size\x18\x02 \x01(\x05\x12\x12\n\npage_token\x18\x03 \x01(\t"q\n\x16ListTaxonomiesResponse\x12>\n\ntaxonomies\x18\x01 \x03(\x0b\x32*.google.cloud.datacatalog.v1beta1.Taxonomy\x12\x17\n\x0fnext_page_token\x18\x02 \x01(\t"O\n\x12GetTaxonomyRequest\x12\x39\n\x04name\x18\x01 \x01(\tB+\xe0\x41\x02\xfa\x41%\n#datacatalog.googleapis.com/Taxonomy"\x97\x01\n\x16\x43reatePolicyTagRequest\x12<\n\x06parent\x18\x01 \x01(\tB,\xe0\x41\x02\xfa\x41&\x12$datacatalog.googleapis.com/PolicyTag\x12?\n\npolicy_tag\x18\x02 \x01(\x0b\x32+.google.cloud.datacatalog.v1beta1.PolicyTag"T\n\x16\x44\x65letePolicyTagRequest\x12:\n\x04name\x18\x01 \x01(\tB,\xe0\x41\x02\xfa\x41&\n$datacatalog.googleapis.com/PolicyTag"\x8a\x01\n\x16UpdatePolicyTagRequest\x12?\n\npolicy_tag\x18\x01 \x01(\x0b\x32+.google.cloud.datacatalog.v1beta1.PolicyTag\x12/\n\x0bupdate_mask\x18\x02 \x01(\x0b\x32\x1a.google.protobuf.FieldMask"|\n\x15ListPolicyTagsRequest\x12<\n\x06parent\x18\x01 \x01(\tB,\xe0\x41\x02\xfa\x41&\x12$datacatalog.googleapis.com/PolicyTag\x12\x11\n\tpage_size\x18\x02 \x01(\x05\x12\x12\n\npage_token\x18\x03 \x01(\t"s\n\x16ListPolicyTagsResponse\x12@\n\x0bpolicy_tags\x18\x01 \x03(\x0b\x32+.google.cloud.datacatalog.v1beta1.PolicyTag\x12\x17\n\x0fnext_page_token\x18\x02 \x01(\t"Q\n\x13GetPolicyTagRequest\x12:\n\x04name\x18\x01 
\x01(\tB,\xe0\x41\x02\xfa\x41&\n$datacatalog.googleapis.com/PolicyTag2\xe5\x16\n\x10PolicyTagManager\x12\xce\x01\n\x0e\x43reateTaxonomy\x12\x37.google.cloud.datacatalog.v1beta1.CreateTaxonomyRequest\x1a*.google.cloud.datacatalog.v1beta1.Taxonomy"W\x82\xd3\xe4\x93\x02?"3/v1beta1/{parent=projects/*/locations/*}/taxonomies:\x08taxonomy\xda\x41\x0fparent,taxonomy\x12\xa5\x01\n\x0e\x44\x65leteTaxonomy\x12\x37.google.cloud.datacatalog.v1beta1.DeleteTaxonomyRequest\x1a\x16.google.protobuf.Empty"B\x82\xd3\xe4\x93\x02\x35*3/v1beta1/{name=projects/*/locations/*/taxonomies/*}\xda\x41\x04name\x12\xd0\x01\n\x0eUpdateTaxonomy\x12\x37.google.cloud.datacatalog.v1beta1.UpdateTaxonomyRequest\x1a*.google.cloud.datacatalog.v1beta1.Taxonomy"Y\x82\xd3\xe4\x93\x02H2 (LatLong, City, ZipCode). PolicyTag “Geolocation” - contains three child policy tags: “LatLong”, “City”, and “ZipCode”. - - Attributes: - name: - Output only. Resource name of this policy tag, whose format - is: “projects/{project_number}/locations/{location_id}/taxonom - ies/{taxonomy_id}/policyTags/{id}”. - display_name: - Required. User defined name of this policy tag. It must: be - unique within the parent taxonomy; contain only unicode - letters, numbers, underscores, dashes and spaces; not start or - end with spaces; and be at most 200 bytes long when encoded in - UTF-8. - description: - Description of this policy tag. It must: contain only unicode - characters, tabs, newlines, carriage returns and page breaks; - and be at most 2000 bytes long when encoded in UTF-8. If not - set, defaults to an empty description. If not set, defaults to - an empty description. - parent_policy_tag: - Resource name of this policy tag’s parent policy tag (e.g. for - the “LatLong” policy tag in the example above, this field - contains the resource name of the “Geolocation” policy tag). - If empty, it means this policy tag is a top level policy tag - (e.g. this field is empty for the “Geolocation” policy tag in - the example above). 
If not set, defaults to an empty string. - child_policy_tags: - Output only. Resource names of child policy tags of this - policy tag. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.PolicyTag) - }, -) -_sym_db.RegisterMessage(PolicyTag) - -CreateTaxonomyRequest = _reflection.GeneratedProtocolMessageType( - "CreateTaxonomyRequest", - (_message.Message,), - { - "DESCRIPTOR": _CREATETAXONOMYREQUEST, - "__module__": "google.cloud.datacatalog_v1beta1.proto.policytagmanager_pb2", - "__doc__": """Request message for [CreateTaxonomy][google.cloud.datacatalog.v1beta1. - PolicyTagManager.CreateTaxonomy]. - - Attributes: - parent: - Required. Resource name of the project that the taxonomy will - belong to. - taxonomy: - The taxonomy to be created. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.CreateTaxonomyRequest) - }, -) -_sym_db.RegisterMessage(CreateTaxonomyRequest) - -DeleteTaxonomyRequest = _reflection.GeneratedProtocolMessageType( - "DeleteTaxonomyRequest", - (_message.Message,), - { - "DESCRIPTOR": _DELETETAXONOMYREQUEST, - "__module__": "google.cloud.datacatalog_v1beta1.proto.policytagmanager_pb2", - "__doc__": """Request message for [DeleteTaxonomy][google.cloud.datacatalog.v1beta1. - PolicyTagManager.DeleteTaxonomy]. - - Attributes: - name: - Required. Resource name of the taxonomy to be deleted. All - policy tags in this taxonomy will also be deleted. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.DeleteTaxonomyRequest) - }, -) -_sym_db.RegisterMessage(DeleteTaxonomyRequest) - -UpdateTaxonomyRequest = _reflection.GeneratedProtocolMessageType( - "UpdateTaxonomyRequest", - (_message.Message,), - { - "DESCRIPTOR": _UPDATETAXONOMYREQUEST, - "__module__": "google.cloud.datacatalog_v1beta1.proto.policytagmanager_pb2", - "__doc__": """Request message for [UpdateTaxonomy][google.cloud.datacatalog.v1beta1. - PolicyTagManager.UpdateTaxonomy]. 
- - Attributes: - taxonomy: - The taxonomy to update. Only description, display_name, and - activated policy types can be updated. - update_mask: - The update mask applies to the resource. For the ``FieldMask`` - definition, see https://developers.google.com/protocol- - buffers/docs/reference/google.protobuf#fieldmask If not set, - defaults to all of the fields that are allowed to update. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.UpdateTaxonomyRequest) - }, -) -_sym_db.RegisterMessage(UpdateTaxonomyRequest) - -ListTaxonomiesRequest = _reflection.GeneratedProtocolMessageType( - "ListTaxonomiesRequest", - (_message.Message,), - { - "DESCRIPTOR": _LISTTAXONOMIESREQUEST, - "__module__": "google.cloud.datacatalog_v1beta1.proto.policytagmanager_pb2", - "__doc__": """Request message for [ListTaxonomies][google.cloud.datacatalog.v1beta1. - PolicyTagManager.ListTaxonomies]. - - Attributes: - parent: - Required. Resource name of the project to list the taxonomies - of. - page_size: - The maximum number of items to return. Must be a value between - 1 and 1000. If not set, defaults to 50. - page_token: - The next_page_token value returned from a previous list - request, if any. If not set, defaults to an empty string. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.ListTaxonomiesRequest) - }, -) -_sym_db.RegisterMessage(ListTaxonomiesRequest) - -ListTaxonomiesResponse = _reflection.GeneratedProtocolMessageType( - "ListTaxonomiesResponse", - (_message.Message,), - { - "DESCRIPTOR": _LISTTAXONOMIESRESPONSE, - "__module__": "google.cloud.datacatalog_v1beta1.proto.policytagmanager_pb2", - "__doc__": """Response message for [ListTaxonomies][google.cloud.datacatalog.v1beta1 - .PolicyTagManager.ListTaxonomies]. - - Attributes: - taxonomies: - Taxonomies that the project contains. - next_page_token: - Token used to retrieve the next page of results, or empty if - there are no more results in the list. 
- """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.ListTaxonomiesResponse) - }, -) -_sym_db.RegisterMessage(ListTaxonomiesResponse) - -GetTaxonomyRequest = _reflection.GeneratedProtocolMessageType( - "GetTaxonomyRequest", - (_message.Message,), - { - "DESCRIPTOR": _GETTAXONOMYREQUEST, - "__module__": "google.cloud.datacatalog_v1beta1.proto.policytagmanager_pb2", - "__doc__": """Request message for [GetTaxonomy][google.cloud.datacatalog.v1beta1.Pol - icyTagManager.GetTaxonomy]. - - Attributes: - name: - Required. Resource name of the requested taxonomy. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.GetTaxonomyRequest) - }, -) -_sym_db.RegisterMessage(GetTaxonomyRequest) - -CreatePolicyTagRequest = _reflection.GeneratedProtocolMessageType( - "CreatePolicyTagRequest", - (_message.Message,), - { - "DESCRIPTOR": _CREATEPOLICYTAGREQUEST, - "__module__": "google.cloud.datacatalog_v1beta1.proto.policytagmanager_pb2", - "__doc__": """Request message for [CreatePolicyTag][google.cloud.datacatalog.v1beta1 - .PolicyTagManager.CreatePolicyTag]. - - Attributes: - parent: - Required. Resource name of the taxonomy that the policy tag - will belong to. - policy_tag: - The policy tag to be created. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.CreatePolicyTagRequest) - }, -) -_sym_db.RegisterMessage(CreatePolicyTagRequest) - -DeletePolicyTagRequest = _reflection.GeneratedProtocolMessageType( - "DeletePolicyTagRequest", - (_message.Message,), - { - "DESCRIPTOR": _DELETEPOLICYTAGREQUEST, - "__module__": "google.cloud.datacatalog_v1beta1.proto.policytagmanager_pb2", - "__doc__": """Request message for [DeletePolicyTag][google.cloud.datacatalog.v1beta1 - .PolicyTagManager.DeletePolicyTag]. - - Attributes: - name: - Required. Resource name of the policy tag to be deleted. All - of its descendant policy tags will also be deleted. 
- """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.DeletePolicyTagRequest) - }, -) -_sym_db.RegisterMessage(DeletePolicyTagRequest) - -UpdatePolicyTagRequest = _reflection.GeneratedProtocolMessageType( - "UpdatePolicyTagRequest", - (_message.Message,), - { - "DESCRIPTOR": _UPDATEPOLICYTAGREQUEST, - "__module__": "google.cloud.datacatalog_v1beta1.proto.policytagmanager_pb2", - "__doc__": """Request message for [UpdatePolicyTag][google.cloud.datacatalog.v1beta1 - .PolicyTagManager.UpdatePolicyTag]. - - Attributes: - policy_tag: - The policy tag to update. Only the description, display_name, - and parent_policy_tag fields can be updated. - update_mask: - The update mask applies to the resource. Only display_name, - description and parent_policy_tag can be updated and thus can - be listed in the mask. If update_mask is not provided, all - allowed fields (i.e. display_name, description and parent) - will be updated. For more information including the - ``FieldMask`` definition, see - https://developers.google.com/protocol- - buffers/docs/reference/google.protobuf#fieldmask If not set, - defaults to all of the fields that are allowed to update. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.UpdatePolicyTagRequest) - }, -) -_sym_db.RegisterMessage(UpdatePolicyTagRequest) - -ListPolicyTagsRequest = _reflection.GeneratedProtocolMessageType( - "ListPolicyTagsRequest", - (_message.Message,), - { - "DESCRIPTOR": _LISTPOLICYTAGSREQUEST, - "__module__": "google.cloud.datacatalog_v1beta1.proto.policytagmanager_pb2", - "__doc__": """Request message for [ListPolicyTags][google.cloud.datacatalog.v1beta1. - PolicyTagManager.ListPolicyTags]. - - Attributes: - parent: - Required. Resource name of the taxonomy to list the policy - tags of. - page_size: - The maximum number of items to return. Must be a value between - 1 and 1000. If not set, defaults to 50. 
- page_token: - The next_page_token value returned from a previous List - request, if any. If not set, defaults to an empty string. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.ListPolicyTagsRequest) - }, -) -_sym_db.RegisterMessage(ListPolicyTagsRequest) - -ListPolicyTagsResponse = _reflection.GeneratedProtocolMessageType( - "ListPolicyTagsResponse", - (_message.Message,), - { - "DESCRIPTOR": _LISTPOLICYTAGSRESPONSE, - "__module__": "google.cloud.datacatalog_v1beta1.proto.policytagmanager_pb2", - "__doc__": """Response message for [ListPolicyTags][google.cloud.datacatalog.v1beta1 - .PolicyTagManager.ListPolicyTags]. - - Attributes: - policy_tags: - The policy tags that are in the requested taxonomy. - next_page_token: - Token used to retrieve the next page of results, or empty if - there are no more results in the list. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.ListPolicyTagsResponse) - }, -) -_sym_db.RegisterMessage(ListPolicyTagsResponse) - -GetPolicyTagRequest = _reflection.GeneratedProtocolMessageType( - "GetPolicyTagRequest", - (_message.Message,), - { - "DESCRIPTOR": _GETPOLICYTAGREQUEST, - "__module__": "google.cloud.datacatalog_v1beta1.proto.policytagmanager_pb2", - "__doc__": """Request message for [GetPolicyTag][google.cloud.datacatalog.v1beta1.Po - licyTagManager.GetPolicyTag]. - - Attributes: - name: - Required. Resource name of the requested policy tag. 
- """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.GetPolicyTagRequest) - }, -) -_sym_db.RegisterMessage(GetPolicyTagRequest) - - -DESCRIPTOR._options = None -_TAXONOMY.fields_by_name["name"]._options = None -_TAXONOMY.fields_by_name["display_name"]._options = None -_TAXONOMY.fields_by_name["description"]._options = None -_TAXONOMY.fields_by_name["activated_policy_types"]._options = None -_TAXONOMY._options = None -_POLICYTAG.fields_by_name["name"]._options = None -_POLICYTAG.fields_by_name["display_name"]._options = None -_POLICYTAG.fields_by_name["child_policy_tags"]._options = None -_POLICYTAG._options = None -_CREATETAXONOMYREQUEST.fields_by_name["parent"]._options = None -_DELETETAXONOMYREQUEST.fields_by_name["name"]._options = None -_LISTTAXONOMIESREQUEST.fields_by_name["parent"]._options = None -_GETTAXONOMYREQUEST.fields_by_name["name"]._options = None -_CREATEPOLICYTAGREQUEST.fields_by_name["parent"]._options = None -_DELETEPOLICYTAGREQUEST.fields_by_name["name"]._options = None -_LISTPOLICYTAGSREQUEST.fields_by_name["parent"]._options = None -_GETPOLICYTAGREQUEST.fields_by_name["name"]._options = None - -_POLICYTAGMANAGER = _descriptor.ServiceDescriptor( - name="PolicyTagManager", - full_name="google.cloud.datacatalog.v1beta1.PolicyTagManager", - file=DESCRIPTOR, - index=0, - serialized_options=b"\312A\032datacatalog.googleapis.com\322A.https://www.googleapis.com/auth/cloud-platform", - create_key=_descriptor._internal_create_key, - serialized_start=2422, - serialized_end=5339, - methods=[ - _descriptor.MethodDescriptor( - name="CreateTaxonomy", - full_name="google.cloud.datacatalog.v1beta1.PolicyTagManager.CreateTaxonomy", - index=0, - containing_service=None, - input_type=_CREATETAXONOMYREQUEST, - output_type=_TAXONOMY, - serialized_options=b'\202\323\344\223\002?"3/v1beta1/{parent=projects/*/locations/*}/taxonomies:\010taxonomy\332A\017parent,taxonomy', - create_key=_descriptor._internal_create_key, - ), - 
_descriptor.MethodDescriptor( - name="DeleteTaxonomy", - full_name="google.cloud.datacatalog.v1beta1.PolicyTagManager.DeleteTaxonomy", - index=1, - containing_service=None, - input_type=_DELETETAXONOMYREQUEST, - output_type=google_dot_protobuf_dot_empty__pb2._EMPTY, - serialized_options=b"\202\323\344\223\0025*3/v1beta1/{name=projects/*/locations/*/taxonomies/*}\332A\004name", - create_key=_descriptor._internal_create_key, - ), - _descriptor.MethodDescriptor( - name="UpdateTaxonomy", - full_name="google.cloud.datacatalog.v1beta1.PolicyTagManager.UpdateTaxonomy", - index=2, - containing_service=None, - input_type=_UPDATETAXONOMYREQUEST, - output_type=_TAXONOMY, - serialized_options=b"\202\323\344\223\002H2\n\ntaxonomies\x18\x01 \x03(\x0b\x32*.google.cloud.datacatalog.v1beta1.Taxonomy"\xc7\x01\n\x17\x45xportTaxonomiesRequest\x12;\n\x06parent\x18\x01 \x01(\tB+\xe0\x41\x02\xfa\x41%\x12#datacatalog.googleapis.com/Taxonomy\x12?\n\ntaxonomies\x18\x02 \x03(\tB+\xe0\x41\x02\xfa\x41%\n#datacatalog.googleapis.com/Taxonomy\x12\x1f\n\x15serialized_taxonomies\x18\x03 \x01(\x08H\x00\x42\r\n\x0b\x64\x65stination"d\n\x18\x45xportTaxonomiesResponse\x12H\n\ntaxonomies\x18\x01 
\x03(\x0b\x32\x34.google.cloud.datacatalog.v1beta1.SerializedTaxonomy2\x92\x04\n\x1dPolicyTagManagerSerialization\x12\xd0\x01\n\x10ImportTaxonomies\x12\x39.google.cloud.datacatalog.v1beta1.ImportTaxonomiesRequest\x1a:.google.cloud.datacatalog.v1beta1.ImportTaxonomiesResponse"E\x82\xd3\xe4\x93\x02?":/v1beta1/{parent=projects/*/locations/*}/taxonomies:import:\x01*\x12\xcd\x01\n\x10\x45xportTaxonomies\x12\x39.google.cloud.datacatalog.v1beta1.ExportTaxonomiesRequest\x1a:.google.cloud.datacatalog.v1beta1.ExportTaxonomiesResponse"B\x82\xd3\xe4\x93\x02<\x12:/v1beta1/{parent=projects/*/locations/*}/taxonomies:export\x1aN\xca\x41\x1a\x64\x61tacatalog.googleapis.com\xd2\x41.https://www.googleapis.com/auth/cloud-platformB\x88\x02\n$com.google.cloud.datacatalog.v1beta1B"PolicyTagManagerSerializationProtoP\x01ZKgoogle.golang.org/genproto/googleapis/cloud/datacatalog/v1beta1;datacatalog\xf8\x01\x01\xaa\x02 Google.Cloud.DataCatalog.V1Beta1\xca\x02 Google\\Cloud\\DataCatalog\\V1beta1\xea\x02#Google::Cloud::DataCatalog::V1beta1b\x06proto3', - dependencies=[ - google_dot_api_dot_annotations__pb2.DESCRIPTOR, - google_dot_api_dot_field__behavior__pb2.DESCRIPTOR, - google_dot_api_dot_resource__pb2.DESCRIPTOR, - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_policytagmanager__pb2.DESCRIPTOR, - google_dot_iam_dot_v1_dot_policy__pb2.DESCRIPTOR, - google_dot_api_dot_client__pb2.DESCRIPTOR, - ], -) - - -_SERIALIZEDTAXONOMY = _descriptor.Descriptor( - name="SerializedTaxonomy", - full_name="google.cloud.datacatalog.v1beta1.SerializedTaxonomy", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="display_name", - full_name="google.cloud.datacatalog.v1beta1.SerializedTaxonomy.display_name", - index=0, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - 
is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="description", - full_name="google.cloud.datacatalog.v1beta1.SerializedTaxonomy.description", - index=1, - number=2, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="policy_tags", - full_name="google.cloud.datacatalog.v1beta1.SerializedTaxonomy.policy_tags", - index=2, - number=3, - type=11, - cpp_type=10, - label=3, - has_default_value=False, - default_value=[], - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=319, - serialized_end=463, -) - - -_SERIALIZEDPOLICYTAG = _descriptor.Descriptor( - name="SerializedPolicyTag", - full_name="google.cloud.datacatalog.v1beta1.SerializedPolicyTag", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="display_name", - full_name="google.cloud.datacatalog.v1beta1.SerializedPolicyTag.display_name", - index=0, - number=2, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002", - file=DESCRIPTOR, - 
create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="description", - full_name="google.cloud.datacatalog.v1beta1.SerializedPolicyTag.description", - index=1, - number=3, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="child_policy_tags", - full_name="google.cloud.datacatalog.v1beta1.SerializedPolicyTag.child_policy_tags", - index=2, - number=4, - type=11, - cpp_type=10, - label=3, - has_default_value=False, - default_value=[], - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=466, - serialized_end=617, -) - - -_IMPORTTAXONOMIESREQUEST = _descriptor.Descriptor( - name="ImportTaxonomiesRequest", - full_name="google.cloud.datacatalog.v1beta1.ImportTaxonomiesRequest", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="parent", - full_name="google.cloud.datacatalog.v1beta1.ImportTaxonomiesRequest.parent", - index=0, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002\372A%\022#datacatalog.googleapis.com/Taxonomy", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - 
_descriptor.FieldDescriptor( - name="inline_source", - full_name="google.cloud.datacatalog.v1beta1.ImportTaxonomiesRequest.inline_source", - index=1, - number=2, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[ - _descriptor.OneofDescriptor( - name="source", - full_name="google.cloud.datacatalog.v1beta1.ImportTaxonomiesRequest.source", - index=0, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[], - ) - ], - serialized_start=620, - serialized_end=789, -) - - -_INLINESOURCE = _descriptor.Descriptor( - name="InlineSource", - full_name="google.cloud.datacatalog.v1beta1.InlineSource", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="taxonomies", - full_name="google.cloud.datacatalog.v1beta1.InlineSource.taxonomies", - index=0, - number=1, - type=11, - cpp_type=10, - label=3, - has_default_value=False, - default_value=[], - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ) - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=791, - serialized_end=884, -) - - -_IMPORTTAXONOMIESRESPONSE = _descriptor.Descriptor( - name="ImportTaxonomiesResponse", - full_name="google.cloud.datacatalog.v1beta1.ImportTaxonomiesResponse", - filename=None, - 
file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="taxonomies", - full_name="google.cloud.datacatalog.v1beta1.ImportTaxonomiesResponse.taxonomies", - index=0, - number=1, - type=11, - cpp_type=10, - label=3, - has_default_value=False, - default_value=[], - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ) - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=886, - serialized_end=976, -) - - -_EXPORTTAXONOMIESREQUEST = _descriptor.Descriptor( - name="ExportTaxonomiesRequest", - full_name="google.cloud.datacatalog.v1beta1.ExportTaxonomiesRequest", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="parent", - full_name="google.cloud.datacatalog.v1beta1.ExportTaxonomiesRequest.parent", - index=0, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002\372A%\022#datacatalog.googleapis.com/Taxonomy", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="taxonomies", - full_name="google.cloud.datacatalog.v1beta1.ExportTaxonomiesRequest.taxonomies", - index=1, - number=2, - type=9, - cpp_type=9, - label=3, - has_default_value=False, - default_value=[], - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002\372A%\n#datacatalog.googleapis.com/Taxonomy", - file=DESCRIPTOR, - 
create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="serialized_taxonomies", - full_name="google.cloud.datacatalog.v1beta1.ExportTaxonomiesRequest.serialized_taxonomies", - index=2, - number=3, - type=8, - cpp_type=7, - label=1, - has_default_value=False, - default_value=False, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[ - _descriptor.OneofDescriptor( - name="destination", - full_name="google.cloud.datacatalog.v1beta1.ExportTaxonomiesRequest.destination", - index=0, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[], - ) - ], - serialized_start=979, - serialized_end=1178, -) - - -_EXPORTTAXONOMIESRESPONSE = _descriptor.Descriptor( - name="ExportTaxonomiesResponse", - full_name="google.cloud.datacatalog.v1beta1.ExportTaxonomiesResponse", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="taxonomies", - full_name="google.cloud.datacatalog.v1beta1.ExportTaxonomiesResponse.taxonomies", - index=0, - number=1, - type=11, - cpp_type=10, - label=3, - has_default_value=False, - default_value=[], - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ) - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=1180, - serialized_end=1280, -) - -_SERIALIZEDTAXONOMY.fields_by_name["policy_tags"].message_type = 
_SERIALIZEDPOLICYTAG -_SERIALIZEDPOLICYTAG.fields_by_name[ - "child_policy_tags" -].message_type = _SERIALIZEDPOLICYTAG -_IMPORTTAXONOMIESREQUEST.fields_by_name["inline_source"].message_type = _INLINESOURCE -_IMPORTTAXONOMIESREQUEST.oneofs_by_name["source"].fields.append( - _IMPORTTAXONOMIESREQUEST.fields_by_name["inline_source"] -) -_IMPORTTAXONOMIESREQUEST.fields_by_name[ - "inline_source" -].containing_oneof = _IMPORTTAXONOMIESREQUEST.oneofs_by_name["source"] -_INLINESOURCE.fields_by_name["taxonomies"].message_type = _SERIALIZEDTAXONOMY -_IMPORTTAXONOMIESRESPONSE.fields_by_name[ - "taxonomies" -].message_type = ( - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_policytagmanager__pb2._TAXONOMY -) -_EXPORTTAXONOMIESREQUEST.oneofs_by_name["destination"].fields.append( - _EXPORTTAXONOMIESREQUEST.fields_by_name["serialized_taxonomies"] -) -_EXPORTTAXONOMIESREQUEST.fields_by_name[ - "serialized_taxonomies" -].containing_oneof = _EXPORTTAXONOMIESREQUEST.oneofs_by_name["destination"] -_EXPORTTAXONOMIESRESPONSE.fields_by_name[ - "taxonomies" -].message_type = _SERIALIZEDTAXONOMY -DESCRIPTOR.message_types_by_name["SerializedTaxonomy"] = _SERIALIZEDTAXONOMY -DESCRIPTOR.message_types_by_name["SerializedPolicyTag"] = _SERIALIZEDPOLICYTAG -DESCRIPTOR.message_types_by_name["ImportTaxonomiesRequest"] = _IMPORTTAXONOMIESREQUEST -DESCRIPTOR.message_types_by_name["InlineSource"] = _INLINESOURCE -DESCRIPTOR.message_types_by_name["ImportTaxonomiesResponse"] = _IMPORTTAXONOMIESRESPONSE -DESCRIPTOR.message_types_by_name["ExportTaxonomiesRequest"] = _EXPORTTAXONOMIESREQUEST -DESCRIPTOR.message_types_by_name["ExportTaxonomiesResponse"] = _EXPORTTAXONOMIESRESPONSE -_sym_db.RegisterFileDescriptor(DESCRIPTOR) - -SerializedTaxonomy = _reflection.GeneratedProtocolMessageType( - "SerializedTaxonomy", - (_message.Message,), - { - "DESCRIPTOR": _SERIALIZEDTAXONOMY, - "__module__": "google.cloud.datacatalog_v1beta1.proto.policytagmanagerserialization_pb2", - "__doc__": """Message 
capturing a taxonomy and its policy tag hierarchy as a nested - proto. Used for taxonomy import/export and mutation. - - Attributes: - display_name: - Required. Display name of the taxonomy. Max 200 bytes when - encoded in UTF-8. - description: - Description of the serialized taxonomy. The length of the - description is limited to 2000 bytes when encoded in UTF-8. If - not set, defaults to an empty description. - policy_tags: - Top level policy tags associated with the taxonomy if any. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.SerializedTaxonomy) - }, -) -_sym_db.RegisterMessage(SerializedTaxonomy) - -SerializedPolicyTag = _reflection.GeneratedProtocolMessageType( - "SerializedPolicyTag", - (_message.Message,), - { - "DESCRIPTOR": _SERIALIZEDPOLICYTAG, - "__module__": "google.cloud.datacatalog_v1beta1.proto.policytagmanagerserialization_pb2", - "__doc__": """Message representing one policy tag when exported as a nested proto. - - Attributes: - display_name: - Required. Display name of the policy tag. Max 200 bytes when - encoded in UTF-8. - description: - Description of the serialized policy tag. The length of the - description is limited to 2000 bytes when encoded in UTF-8. If - not set, defaults to an empty description. - child_policy_tags: - Children of the policy tag if any. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.SerializedPolicyTag) - }, -) -_sym_db.RegisterMessage(SerializedPolicyTag) - -ImportTaxonomiesRequest = _reflection.GeneratedProtocolMessageType( - "ImportTaxonomiesRequest", - (_message.Message,), - { - "DESCRIPTOR": _IMPORTTAXONOMIESREQUEST, - "__module__": "google.cloud.datacatalog_v1beta1.proto.policytagmanagerserialization_pb2", - "__doc__": """Request message for [ImportTaxonomies][google.cloud.datacatalog.v1beta - 1.PolicyTagManagerSerialization.ImportTaxonomies]. - - Attributes: - parent: - Required. 
Resource name of project that the newly created - taxonomies will belong to. - source: - Required. Source taxonomies to be imported in a tree - structure. - inline_source: - Inline source used for taxonomies import - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.ImportTaxonomiesRequest) - }, -) -_sym_db.RegisterMessage(ImportTaxonomiesRequest) - -InlineSource = _reflection.GeneratedProtocolMessageType( - "InlineSource", - (_message.Message,), - { - "DESCRIPTOR": _INLINESOURCE, - "__module__": "google.cloud.datacatalog_v1beta1.proto.policytagmanagerserialization_pb2", - "__doc__": """Inline source used for taxonomies import. - - Attributes: - taxonomies: - Required. Taxonomies to be imported. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.InlineSource) - }, -) -_sym_db.RegisterMessage(InlineSource) - -ImportTaxonomiesResponse = _reflection.GeneratedProtocolMessageType( - "ImportTaxonomiesResponse", - (_message.Message,), - { - "DESCRIPTOR": _IMPORTTAXONOMIESRESPONSE, - "__module__": "google.cloud.datacatalog_v1beta1.proto.policytagmanagerserialization_pb2", - "__doc__": """Response message for [ImportTaxonomies][google.cloud.datacatalog.v1bet - a1.PolicyTagManagerSerialization.ImportTaxonomies]. - - Attributes: - taxonomies: - Taxonomies that were imported. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.ImportTaxonomiesResponse) - }, -) -_sym_db.RegisterMessage(ImportTaxonomiesResponse) - -ExportTaxonomiesRequest = _reflection.GeneratedProtocolMessageType( - "ExportTaxonomiesRequest", - (_message.Message,), - { - "DESCRIPTOR": _EXPORTTAXONOMIESREQUEST, - "__module__": "google.cloud.datacatalog_v1beta1.proto.policytagmanagerserialization_pb2", - "__doc__": """Request message for [ExportTaxonomies][google.cloud.datacatalog.v1beta - 1.PolicyTagManagerSerialization.ExportTaxonomies]. - - Attributes: - parent: - Required. 
Resource name of the project that taxonomies to be - exported will share. - taxonomies: - Required. Resource names of the taxonomies to be exported. - destination: - Required. Taxonomies export destination. - serialized_taxonomies: - Export taxonomies as serialized taxonomies. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.ExportTaxonomiesRequest) - }, -) -_sym_db.RegisterMessage(ExportTaxonomiesRequest) - -ExportTaxonomiesResponse = _reflection.GeneratedProtocolMessageType( - "ExportTaxonomiesResponse", - (_message.Message,), - { - "DESCRIPTOR": _EXPORTTAXONOMIESRESPONSE, - "__module__": "google.cloud.datacatalog_v1beta1.proto.policytagmanagerserialization_pb2", - "__doc__": """Response message for [ExportTaxonomies][google.cloud.datacatalog.v1bet - a1.PolicyTagManagerSerialization.ExportTaxonomies]. - - Attributes: - taxonomies: - List of taxonomies and policy tags in a tree structure. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.ExportTaxonomiesResponse) - }, -) -_sym_db.RegisterMessage(ExportTaxonomiesResponse) - - -DESCRIPTOR._options = None -_SERIALIZEDTAXONOMY.fields_by_name["display_name"]._options = None -_SERIALIZEDPOLICYTAG.fields_by_name["display_name"]._options = None -_IMPORTTAXONOMIESREQUEST.fields_by_name["parent"]._options = None -_INLINESOURCE.fields_by_name["taxonomies"]._options = None -_EXPORTTAXONOMIESREQUEST.fields_by_name["parent"]._options = None -_EXPORTTAXONOMIESREQUEST.fields_by_name["taxonomies"]._options = None - -_POLICYTAGMANAGERSERIALIZATION = _descriptor.ServiceDescriptor( - name="PolicyTagManagerSerialization", - full_name="google.cloud.datacatalog.v1beta1.PolicyTagManagerSerialization", - file=DESCRIPTOR, - index=0, - serialized_options=b"\312A\032datacatalog.googleapis.com\322A.https://www.googleapis.com/auth/cloud-platform", - create_key=_descriptor._internal_create_key, - serialized_start=1283, - serialized_end=1813, - methods=[ - 
_descriptor.MethodDescriptor( - name="ImportTaxonomies", - full_name="google.cloud.datacatalog.v1beta1.PolicyTagManagerSerialization.ImportTaxonomies", - index=0, - containing_service=None, - input_type=_IMPORTTAXONOMIESREQUEST, - output_type=_IMPORTTAXONOMIESRESPONSE, - serialized_options=b'\202\323\344\223\002?":/v1beta1/{parent=projects/*/locations/*}/taxonomies:import:\001*', - create_key=_descriptor._internal_create_key, - ), - _descriptor.MethodDescriptor( - name="ExportTaxonomies", - full_name="google.cloud.datacatalog.v1beta1.PolicyTagManagerSerialization.ExportTaxonomies", - index=1, - containing_service=None, - input_type=_EXPORTTAXONOMIESREQUEST, - output_type=_EXPORTTAXONOMIESRESPONSE, - serialized_options=b"\202\323\344\223\002<\022:/v1beta1/{parent=projects/*/locations/*}/taxonomies:export", - create_key=_descriptor._internal_create_key, - ), - ], -) -_sym_db.RegisterServiceDescriptor(_POLICYTAGMANAGERSERIALIZATION) - -DESCRIPTOR.services_by_name[ - "PolicyTagManagerSerialization" -] = _POLICYTAGMANAGERSERIALIZATION - -# @@protoc_insertion_point(module_scope) diff --git a/google/cloud/datacatalog_v1beta1/proto/policytagmanagerserialization_pb2_grpc.py b/google/cloud/datacatalog_v1beta1/proto/policytagmanagerserialization_pb2_grpc.py deleted file mode 100644 index 1d06f122..00000000 --- a/google/cloud/datacatalog_v1beta1/proto/policytagmanagerserialization_pb2_grpc.py +++ /dev/null @@ -1,138 +0,0 @@ -# Generated by the gRPC Python protocol compiler plugin. DO NOT EDIT! -"""Client and server classes corresponding to protobuf-defined services.""" -import grpc - -from google.cloud.datacatalog_v1beta1.proto import ( - policytagmanagerserialization_pb2 as google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_policytagmanagerserialization__pb2, -) - - -class PolicyTagManagerSerializationStub(object): - """Policy tag manager serialization API service allows clients to manipulate - their taxonomies and policy tags data with serialized format. 
- """ - - def __init__(self, channel): - """Constructor. - - Args: - channel: A grpc.Channel. - """ - self.ImportTaxonomies = channel.unary_unary( - "/google.cloud.datacatalog.v1beta1.PolicyTagManagerSerialization/ImportTaxonomies", - request_serializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_policytagmanagerserialization__pb2.ImportTaxonomiesRequest.SerializeToString, - response_deserializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_policytagmanagerserialization__pb2.ImportTaxonomiesResponse.FromString, - ) - self.ExportTaxonomies = channel.unary_unary( - "/google.cloud.datacatalog.v1beta1.PolicyTagManagerSerialization/ExportTaxonomies", - request_serializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_policytagmanagerserialization__pb2.ExportTaxonomiesRequest.SerializeToString, - response_deserializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_policytagmanagerserialization__pb2.ExportTaxonomiesResponse.FromString, - ) - - -class PolicyTagManagerSerializationServicer(object): - """Policy tag manager serialization API service allows clients to manipulate - their taxonomies and policy tags data with serialized format. - """ - - def ImportTaxonomies(self, request, context): - """Imports all taxonomies and their policy tags to a project as new - taxonomies. - - This method provides a bulk taxonomy / policy tag creation using nested - proto structure. - """ - context.set_code(grpc.StatusCode.UNIMPLEMENTED) - context.set_details("Method not implemented!") - raise NotImplementedError("Method not implemented!") - - def ExportTaxonomies(self, request, context): - """Exports all taxonomies and their policy tags in a project. - - This method generates SerializedTaxonomy protos with nested policy tags - that can be used as an input for future ImportTaxonomies calls. 
- """ - context.set_code(grpc.StatusCode.UNIMPLEMENTED) - context.set_details("Method not implemented!") - raise NotImplementedError("Method not implemented!") - - -def add_PolicyTagManagerSerializationServicer_to_server(servicer, server): - rpc_method_handlers = { - "ImportTaxonomies": grpc.unary_unary_rpc_method_handler( - servicer.ImportTaxonomies, - request_deserializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_policytagmanagerserialization__pb2.ImportTaxonomiesRequest.FromString, - response_serializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_policytagmanagerserialization__pb2.ImportTaxonomiesResponse.SerializeToString, - ), - "ExportTaxonomies": grpc.unary_unary_rpc_method_handler( - servicer.ExportTaxonomies, - request_deserializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_policytagmanagerserialization__pb2.ExportTaxonomiesRequest.FromString, - response_serializer=google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_policytagmanagerserialization__pb2.ExportTaxonomiesResponse.SerializeToString, - ), - } - generic_handler = grpc.method_handlers_generic_handler( - "google.cloud.datacatalog.v1beta1.PolicyTagManagerSerialization", - rpc_method_handlers, - ) - server.add_generic_rpc_handlers((generic_handler,)) - - -# This class is part of an EXPERIMENTAL API. -class PolicyTagManagerSerialization(object): - """Policy tag manager serialization API service allows clients to manipulate - their taxonomies and policy tags data with serialized format. 
- """ - - @staticmethod - def ImportTaxonomies( - request, - target, - options=(), - channel_credentials=None, - call_credentials=None, - compression=None, - wait_for_ready=None, - timeout=None, - metadata=None, - ): - return grpc.experimental.unary_unary( - request, - target, - "/google.cloud.datacatalog.v1beta1.PolicyTagManagerSerialization/ImportTaxonomies", - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_policytagmanagerserialization__pb2.ImportTaxonomiesRequest.SerializeToString, - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_policytagmanagerserialization__pb2.ImportTaxonomiesResponse.FromString, - options, - channel_credentials, - call_credentials, - compression, - wait_for_ready, - timeout, - metadata, - ) - - @staticmethod - def ExportTaxonomies( - request, - target, - options=(), - channel_credentials=None, - call_credentials=None, - compression=None, - wait_for_ready=None, - timeout=None, - metadata=None, - ): - return grpc.experimental.unary_unary( - request, - target, - "/google.cloud.datacatalog.v1beta1.PolicyTagManagerSerialization/ExportTaxonomies", - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_policytagmanagerserialization__pb2.ExportTaxonomiesRequest.SerializeToString, - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_policytagmanagerserialization__pb2.ExportTaxonomiesResponse.FromString, - options, - channel_credentials, - call_credentials, - compression, - wait_for_ready, - timeout, - metadata, - ) diff --git a/google/cloud/datacatalog_v1beta1/proto/schema.proto b/google/cloud/datacatalog_v1beta1/proto/schema.proto index aca588b4..d8e69fd4 100644 --- a/google/cloud/datacatalog_v1beta1/proto/schema.proto +++ b/google/cloud/datacatalog_v1beta1/proto/schema.proto @@ -1,4 +1,4 @@ -// Copyright 2019 Google LLC. +// Copyright 2020 Google LLC // // Licensed under the Apache License, Version 2.0 (the "License"); // you may not use this file except in compliance with the License. 
@@ -11,7 +11,6 @@ // WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. // See the License for the specific language governing permissions and // limitations under the License. -// syntax = "proto3"; @@ -20,14 +19,17 @@ package google.cloud.datacatalog.v1beta1; import "google/api/field_behavior.proto"; option cc_enable_arenas = true; +option csharp_namespace = "Google.Cloud.DataCatalog.V1Beta1"; option go_package = "google.golang.org/genproto/googleapis/cloud/datacatalog/v1beta1;datacatalog"; option java_multiple_files = true; -option java_package = "com.google.cloud.datacatalog"; +option java_package = "com.google.cloud.datacatalog.v1beta1"; +option php_namespace = "Google\\Cloud\\DataCatalog\\V1beta1"; +option ruby_package = "Google::Cloud::DataCatalog::V1beta1"; // Represents a schema (e.g. BigQuery, GoogleSQL, Avro schema). message Schema { - // Required. Schema of columns. A maximum of 10,000 columns and sub-columns - // can be specified. + // Required. Schema of columns. A maximum of 10,000 columns and sub-columns can be + // specified. repeated ColumnSchema columns = 2 [(google.api.field_behavior) = REQUIRED]; } @@ -43,12 +45,11 @@ message ColumnSchema { // Optional. Description of the column. Default value is an empty string. string description = 2 [(google.api.field_behavior) = OPTIONAL]; - // Optional. A column's mode indicates whether the values in this column are - // required, nullable, etc. Only `NULLABLE`, `REQUIRED` and `REPEATED` are - // supported. Default mode is `NULLABLE`. + // Optional. A column's mode indicates whether the values in this column are required, + // nullable, etc. Only `NULLABLE`, `REQUIRED` and `REPEATED` are supported. + // Default mode is `NULLABLE`. string mode = 3 [(google.api.field_behavior) = OPTIONAL]; - // Optional. Schema of sub-columns. A column can have zero or more - // sub-columns. + // Optional. Schema of sub-columns. A column can have zero or more sub-columns. 
repeated ColumnSchema subcolumns = 7 [(google.api.field_behavior) = OPTIONAL]; } diff --git a/google/cloud/datacatalog_v1beta1/proto/schema_pb2.py b/google/cloud/datacatalog_v1beta1/proto/schema_pb2.py deleted file mode 100644 index 9485c7f8..00000000 --- a/google/cloud/datacatalog_v1beta1/proto/schema_pb2.py +++ /dev/null @@ -1,249 +0,0 @@ -# -*- coding: utf-8 -*- -# Generated by the protocol buffer compiler. DO NOT EDIT! -# source: google/cloud/datacatalog_v1beta1/proto/schema.proto - -from google.protobuf import descriptor as _descriptor -from google.protobuf import message as _message -from google.protobuf import reflection as _reflection -from google.protobuf import symbol_database as _symbol_database - -# @@protoc_insertion_point(imports) - -_sym_db = _symbol_database.Default() - - -from google.api import field_behavior_pb2 as google_dot_api_dot_field__behavior__pb2 - - -DESCRIPTOR = _descriptor.FileDescriptor( - name="google/cloud/datacatalog_v1beta1/proto/schema.proto", - package="google.cloud.datacatalog.v1beta1", - syntax="proto3", - serialized_options=b"\n$com.google.cloud.datacatalog.v1beta1P\001ZKgoogle.golang.org/genproto/googleapis/cloud/datacatalog/v1beta1;datacatalog\370\001\001\252\002 Google.Cloud.DataCatalog.V1Beta1\312\002 Google\\Cloud\\DataCatalog\\V1beta1\352\002#Google::Cloud::DataCatalog::V1beta1", - create_key=_descriptor._internal_create_key, - serialized_pb=b'\n3google/cloud/datacatalog_v1beta1/proto/schema.proto\x12 google.cloud.datacatalog.v1beta1\x1a\x1fgoogle/api/field_behavior.proto"N\n\x06Schema\x12\x44\n\x07\x63olumns\x18\x02 \x03(\x0b\x32..google.cloud.datacatalog.v1beta1.ColumnSchemaB\x03\xe0\x41\x02"\xac\x01\n\x0c\x43olumnSchema\x12\x13\n\x06\x63olumn\x18\x06 \x01(\tB\x03\xe0\x41\x02\x12\x11\n\x04type\x18\x01 \x01(\tB\x03\xe0\x41\x02\x12\x18\n\x0b\x64\x65scription\x18\x02 \x01(\tB\x03\xe0\x41\x01\x12\x11\n\x04mode\x18\x03 \x01(\tB\x03\xe0\x41\x01\x12G\n\nsubcolumns\x18\x07 
\x03(\x0b\x32..google.cloud.datacatalog.v1beta1.ColumnSchemaB\x03\xe0\x41\x01\x42\xe4\x01\n$com.google.cloud.datacatalog.v1beta1P\x01ZKgoogle.golang.org/genproto/googleapis/cloud/datacatalog/v1beta1;datacatalog\xf8\x01\x01\xaa\x02 Google.Cloud.DataCatalog.V1Beta1\xca\x02 Google\\Cloud\\DataCatalog\\V1beta1\xea\x02#Google::Cloud::DataCatalog::V1beta1b\x06proto3', - dependencies=[google_dot_api_dot_field__behavior__pb2.DESCRIPTOR], -) - - -_SCHEMA = _descriptor.Descriptor( - name="Schema", - full_name="google.cloud.datacatalog.v1beta1.Schema", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="columns", - full_name="google.cloud.datacatalog.v1beta1.Schema.columns", - index=0, - number=2, - type=11, - cpp_type=10, - label=3, - has_default_value=False, - default_value=[], - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ) - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=122, - serialized_end=200, -) - - -_COLUMNSCHEMA = _descriptor.Descriptor( - name="ColumnSchema", - full_name="google.cloud.datacatalog.v1beta1.ColumnSchema", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="column", - full_name="google.cloud.datacatalog.v1beta1.ColumnSchema.column", - index=0, - number=6, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002", - file=DESCRIPTOR, - 
create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="type", - full_name="google.cloud.datacatalog.v1beta1.ColumnSchema.type", - index=1, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="description", - full_name="google.cloud.datacatalog.v1beta1.ColumnSchema.description", - index=2, - number=2, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\001", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="mode", - full_name="google.cloud.datacatalog.v1beta1.ColumnSchema.mode", - index=3, - number=3, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\001", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="subcolumns", - full_name="google.cloud.datacatalog.v1beta1.ColumnSchema.subcolumns", - index=4, - number=7, - type=11, - cpp_type=10, - label=3, - has_default_value=False, - default_value=[], - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\001", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - 
extension_ranges=[], - oneofs=[], - serialized_start=203, - serialized_end=375, -) - -_SCHEMA.fields_by_name["columns"].message_type = _COLUMNSCHEMA -_COLUMNSCHEMA.fields_by_name["subcolumns"].message_type = _COLUMNSCHEMA -DESCRIPTOR.message_types_by_name["Schema"] = _SCHEMA -DESCRIPTOR.message_types_by_name["ColumnSchema"] = _COLUMNSCHEMA -_sym_db.RegisterFileDescriptor(DESCRIPTOR) - -Schema = _reflection.GeneratedProtocolMessageType( - "Schema", - (_message.Message,), - { - "DESCRIPTOR": _SCHEMA, - "__module__": "google.cloud.datacatalog_v1beta1.proto.schema_pb2", - "__doc__": """Represents a schema (e.g. BigQuery, GoogleSQL, Avro schema). - - Attributes: - columns: - Required. Schema of columns. A maximum of 10,000 columns and - sub-columns can be specified. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.Schema) - }, -) -_sym_db.RegisterMessage(Schema) - -ColumnSchema = _reflection.GeneratedProtocolMessageType( - "ColumnSchema", - (_message.Message,), - { - "DESCRIPTOR": _COLUMNSCHEMA, - "__module__": "google.cloud.datacatalog_v1beta1.proto.schema_pb2", - "__doc__": """Representation of a column within a schema. Columns could be nested - inside other columns. - - Attributes: - column: - Required. Name of the column. - type: - Required. Type of the column. - description: - Optional. Description of the column. Default value is an empty - string. - mode: - Optional. A column’s mode indicates whether the values in this - column are required, nullable, etc. Only ``NULLABLE``, - ``REQUIRED`` and ``REPEATED`` are supported. Default mode is - ``NULLABLE``. - subcolumns: - Optional. Schema of sub-columns. A column can have zero or - more sub-columns. 
- """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.ColumnSchema) - }, -) -_sym_db.RegisterMessage(ColumnSchema) - - -DESCRIPTOR._options = None -_SCHEMA.fields_by_name["columns"]._options = None -_COLUMNSCHEMA.fields_by_name["column"]._options = None -_COLUMNSCHEMA.fields_by_name["type"]._options = None -_COLUMNSCHEMA.fields_by_name["description"]._options = None -_COLUMNSCHEMA.fields_by_name["mode"]._options = None -_COLUMNSCHEMA.fields_by_name["subcolumns"]._options = None -# @@protoc_insertion_point(module_scope) diff --git a/google/cloud/datacatalog_v1beta1/proto/schema_pb2_grpc.py b/google/cloud/datacatalog_v1beta1/proto/schema_pb2_grpc.py deleted file mode 100644 index 8a939394..00000000 --- a/google/cloud/datacatalog_v1beta1/proto/schema_pb2_grpc.py +++ /dev/null @@ -1,3 +0,0 @@ -# Generated by the gRPC Python protocol compiler plugin. DO NOT EDIT! -"""Client and server classes corresponding to protobuf-defined services.""" -import grpc diff --git a/google/cloud/datacatalog_v1beta1/proto/search.proto b/google/cloud/datacatalog_v1beta1/proto/search.proto index 372c1573..c1f41412 100644 --- a/google/cloud/datacatalog_v1beta1/proto/search.proto +++ b/google/cloud/datacatalog_v1beta1/proto/search.proto @@ -1,4 +1,4 @@ -// Copyright 2019 Google LLC. +// Copyright 2020 Google LLC // // Licensed under the Apache License, Version 2.0 (the "License"); // you may not use this file except in compliance with the License. @@ -11,19 +11,22 @@ // WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. // See the License for the specific language governing permissions and // limitations under the License. 
-// syntax = "proto3"; package google.cloud.datacatalog.v1beta1; import "google/api/field_behavior.proto"; +import "google/cloud/datacatalog/v1beta1/common.proto"; import "google/protobuf/timestamp.proto"; option cc_enable_arenas = true; +option csharp_namespace = "Google.Cloud.DataCatalog.V1Beta1"; option go_package = "google.golang.org/genproto/googleapis/cloud/datacatalog/v1beta1;datacatalog"; option java_multiple_files = true; -option java_package = "com.google.cloud.datacatalog"; +option java_package = "com.google.cloud.datacatalog.v1beta1"; +option php_namespace = "Google\\Cloud\\DataCatalog\\V1beta1"; +option ruby_package = "Google::Cloud::DataCatalog::V1beta1"; // A result that appears in the response of a search request. Each result // captures details of one entry that matches the search. diff --git a/google/cloud/datacatalog_v1beta1/proto/search_pb2.py b/google/cloud/datacatalog_v1beta1/proto/search_pb2.py deleted file mode 100644 index ce38e790..00000000 --- a/google/cloud/datacatalog_v1beta1/proto/search_pb2.py +++ /dev/null @@ -1,230 +0,0 @@ -# -*- coding: utf-8 -*- -# Generated by the protocol buffer compiler. DO NOT EDIT! 
-# source: google/cloud/datacatalog_v1beta1/proto/search.proto - -from google.protobuf.internal import enum_type_wrapper -from google.protobuf import descriptor as _descriptor -from google.protobuf import message as _message -from google.protobuf import reflection as _reflection -from google.protobuf import symbol_database as _symbol_database - -# @@protoc_insertion_point(imports) - -_sym_db = _symbol_database.Default() - - -from google.api import field_behavior_pb2 as google_dot_api_dot_field__behavior__pb2 -from google.cloud.datacatalog_v1beta1.proto import ( - common_pb2 as google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_common__pb2, -) -from google.protobuf import timestamp_pb2 as google_dot_protobuf_dot_timestamp__pb2 - - -DESCRIPTOR = _descriptor.FileDescriptor( - name="google/cloud/datacatalog_v1beta1/proto/search.proto", - package="google.cloud.datacatalog.v1beta1", - syntax="proto3", - serialized_options=b"\n$com.google.cloud.datacatalog.v1beta1P\001ZKgoogle.golang.org/genproto/googleapis/cloud/datacatalog/v1beta1;datacatalog\370\001\001\252\002 Google.Cloud.DataCatalog.V1Beta1\312\002 Google\\Cloud\\DataCatalog\\V1beta1\352\002#Google::Cloud::DataCatalog::V1beta1", - create_key=_descriptor._internal_create_key, - serialized_pb=b'\n3google/cloud/datacatalog_v1beta1/proto/search.proto\x12 google.cloud.datacatalog.v1beta1\x1a\x1fgoogle/api/field_behavior.proto\x1a\x33google/cloud/datacatalog_v1beta1/proto/common.proto\x1a\x1fgoogle/protobuf/timestamp.proto"\xbd\x01\n\x13SearchCatalogResult\x12N\n\x12search_result_type\x18\x01 \x01(\x0e\x32\x32.google.cloud.datacatalog.v1beta1.SearchResultType\x12\x1d\n\x15search_result_subtype\x18\x02 \x01(\t\x12\x1e\n\x16relative_resource_name\x18\x03 \x01(\t\x12\x17\n\x0flinked_resource\x18\x04 
\x01(\t*d\n\x10SearchResultType\x12"\n\x1eSEARCH_RESULT_TYPE_UNSPECIFIED\x10\x00\x12\t\n\x05\x45NTRY\x10\x01\x12\x10\n\x0cTAG_TEMPLATE\x10\x02\x12\x0f\n\x0b\x45NTRY_GROUP\x10\x03\x42\xe4\x01\n$com.google.cloud.datacatalog.v1beta1P\x01ZKgoogle.golang.org/genproto/googleapis/cloud/datacatalog/v1beta1;datacatalog\xf8\x01\x01\xaa\x02 Google.Cloud.DataCatalog.V1Beta1\xca\x02 Google\\Cloud\\DataCatalog\\V1beta1\xea\x02#Google::Cloud::DataCatalog::V1beta1b\x06proto3', - dependencies=[ - google_dot_api_dot_field__behavior__pb2.DESCRIPTOR, - google_dot_cloud_dot_datacatalog__v1beta1_dot_proto_dot_common__pb2.DESCRIPTOR, - google_dot_protobuf_dot_timestamp__pb2.DESCRIPTOR, - ], -) - -_SEARCHRESULTTYPE = _descriptor.EnumDescriptor( - name="SearchResultType", - full_name="google.cloud.datacatalog.v1beta1.SearchResultType", - filename=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - values=[ - _descriptor.EnumValueDescriptor( - name="SEARCH_RESULT_TYPE_UNSPECIFIED", - index=0, - number=0, - serialized_options=None, - type=None, - create_key=_descriptor._internal_create_key, - ), - _descriptor.EnumValueDescriptor( - name="ENTRY", - index=1, - number=1, - serialized_options=None, - type=None, - create_key=_descriptor._internal_create_key, - ), - _descriptor.EnumValueDescriptor( - name="TAG_TEMPLATE", - index=2, - number=2, - serialized_options=None, - type=None, - create_key=_descriptor._internal_create_key, - ), - _descriptor.EnumValueDescriptor( - name="ENTRY_GROUP", - index=3, - number=3, - serialized_options=None, - type=None, - create_key=_descriptor._internal_create_key, - ), - ], - containing_type=None, - serialized_options=None, - serialized_start=400, - serialized_end=500, -) -_sym_db.RegisterEnumDescriptor(_SEARCHRESULTTYPE) - -SearchResultType = enum_type_wrapper.EnumTypeWrapper(_SEARCHRESULTTYPE) -SEARCH_RESULT_TYPE_UNSPECIFIED = 0 -ENTRY = 1 -TAG_TEMPLATE = 2 -ENTRY_GROUP = 3 - - -_SEARCHCATALOGRESULT = _descriptor.Descriptor( - 
name="SearchCatalogResult", - full_name="google.cloud.datacatalog.v1beta1.SearchCatalogResult", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="search_result_type", - full_name="google.cloud.datacatalog.v1beta1.SearchCatalogResult.search_result_type", - index=0, - number=1, - type=14, - cpp_type=8, - label=1, - has_default_value=False, - default_value=0, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="search_result_subtype", - full_name="google.cloud.datacatalog.v1beta1.SearchCatalogResult.search_result_subtype", - index=1, - number=2, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="relative_resource_name", - full_name="google.cloud.datacatalog.v1beta1.SearchCatalogResult.relative_resource_name", - index=2, - number=3, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="linked_resource", - full_name="google.cloud.datacatalog.v1beta1.SearchCatalogResult.linked_resource", - index=3, - number=4, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - 
extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=209, - serialized_end=398, -) - -_SEARCHCATALOGRESULT.fields_by_name["search_result_type"].enum_type = _SEARCHRESULTTYPE -DESCRIPTOR.message_types_by_name["SearchCatalogResult"] = _SEARCHCATALOGRESULT -DESCRIPTOR.enum_types_by_name["SearchResultType"] = _SEARCHRESULTTYPE -_sym_db.RegisterFileDescriptor(DESCRIPTOR) - -SearchCatalogResult = _reflection.GeneratedProtocolMessageType( - "SearchCatalogResult", - (_message.Message,), - { - "DESCRIPTOR": _SEARCHCATALOGRESULT, - "__module__": "google.cloud.datacatalog_v1beta1.proto.search_pb2", - "__doc__": """A result that appears in the response of a search request. Each result - captures details of one entry that matches the search. - - Attributes: - search_result_type: - Type of the search result. This field can be used to determine - which Get method to call to fetch the full resource. - search_result_subtype: - Sub-type of the search result. This is a dot-delimited - description of the resource’s full type, and is the same as - the value callers would provide in the “type” search facet. - Examples: ``entry.table``, ``entry.dataStream``, - ``tagTemplate``. - relative_resource_name: - The relative resource name of the resource in URL format. - Examples: - ``projects/{project_id}/locations/{location_id}/ - entryGroups/{entry_group_id}/entries/{entry_id}`` - - ``projects/{project_id}/tagTemplates/{tag_template_id}`` - linked_resource: - The full name of the cloud resource the entry belongs to. See: - https://cloud.google.com/apis/design/resource_names#full_resou - rce_name. 
Example: - ``//bigquery.googleapis.com/projects/pr - ojectId/datasets/datasetId/tables/tableId`` - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.SearchCatalogResult) - }, -) -_sym_db.RegisterMessage(SearchCatalogResult) - - -DESCRIPTOR._options = None -# @@protoc_insertion_point(module_scope) diff --git a/google/cloud/datacatalog_v1beta1/proto/search_pb2_grpc.py b/google/cloud/datacatalog_v1beta1/proto/search_pb2_grpc.py deleted file mode 100644 index 8a939394..00000000 --- a/google/cloud/datacatalog_v1beta1/proto/search_pb2_grpc.py +++ /dev/null @@ -1,3 +0,0 @@ -# Generated by the gRPC Python protocol compiler plugin. DO NOT EDIT! -"""Client and server classes corresponding to protobuf-defined services.""" -import grpc diff --git a/google/cloud/datacatalog_v1beta1/proto/table_spec.proto b/google/cloud/datacatalog_v1beta1/proto/table_spec.proto index 4f9fddaa..c08f43ef 100644 --- a/google/cloud/datacatalog_v1beta1/proto/table_spec.proto +++ b/google/cloud/datacatalog_v1beta1/proto/table_spec.proto @@ -1,4 +1,4 @@ -// Copyright 2019 Google LLC. +// Copyright 2020 Google LLC // // Licensed under the Apache License, Version 2.0 (the "License"); // you may not use this file except in compliance with the License. @@ -11,7 +11,6 @@ // WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. // See the License for the specific language governing permissions and // limitations under the License. 
-// syntax = "proto3"; @@ -21,15 +20,17 @@ import "google/api/field_behavior.proto"; import "google/api/resource.proto"; option cc_enable_arenas = true; +option csharp_namespace = "Google.Cloud.DataCatalog.V1Beta1"; option go_package = "google.golang.org/genproto/googleapis/cloud/datacatalog/v1beta1;datacatalog"; option java_multiple_files = true; -option java_package = "com.google.cloud.datacatalog"; +option java_package = "com.google.cloud.datacatalog.v1beta1"; +option php_namespace = "Google\\Cloud\\DataCatalog\\V1beta1"; +option ruby_package = "Google::Cloud::DataCatalog::V1beta1"; // Describes a BigQuery table. message BigQueryTableSpec { // Output only. The table source type. - TableSourceType table_source_type = 1 - [(google.api.field_behavior) = OUTPUT_ONLY]; + TableSourceType table_source_type = 1 [(google.api.field_behavior) = OUTPUT_ONLY]; // Output only. oneof type_spec { @@ -63,9 +64,9 @@ message ViewSpec { // Normal BigQuery table spec. message TableSpec { - // Output only. If the table is a dated shard, i.e., with name pattern - // `[prefix]YYYYMMDD`, `grouped_entry` is the Data Catalog resource name of - // the date sharded grouped entry, for example, + // Output only. If the table is a dated shard, i.e., with name pattern `[prefix]YYYYMMDD`, + // `grouped_entry` is the Data Catalog resource name of the date sharded + // grouped entry, for example, // `projects/{project_id}/locations/{location}/entrygroups/{entry_group_id}/entries/{entry_id}`. // Otherwise, `grouped_entry` is empty. string grouped_entry = 1 [ @@ -80,8 +81,8 @@ message TableSpec { // Context: // https://cloud.google.com/bigquery/docs/partitioned-tables#partitioning_versus_sharding message BigQueryDateShardedSpec { - // Output only. The Data Catalog resource name of the dataset entry the - // current table belongs to, for example, + // Output only. 
The Data Catalog resource name of the dataset entry the current table + // belongs to, for example, // `projects/{project_id}/locations/{location}/entrygroups/{entry_group_id}/entries/{entry_id}`. string dataset = 1 [ (google.api.field_behavior) = OUTPUT_ONLY, @@ -90,8 +91,7 @@ message BigQueryDateShardedSpec { } ]; - // Output only. The table name prefix of the shards. The name of any given - // shard is + // Output only. The table name prefix of the shards. The name of any given shard is // `[table_prefix]YYYYMMDD`, for example, for shard `MyTable20180101`, the // `table_prefix` is `MyTable`. string table_prefix = 2 [(google.api.field_behavior) = OUTPUT_ONLY]; diff --git a/google/cloud/datacatalog_v1beta1/proto/table_spec_pb2.py b/google/cloud/datacatalog_v1beta1/proto/table_spec_pb2.py deleted file mode 100644 index 875da600..00000000 --- a/google/cloud/datacatalog_v1beta1/proto/table_spec_pb2.py +++ /dev/null @@ -1,450 +0,0 @@ -# -*- coding: utf-8 -*- -# Generated by the protocol buffer compiler. DO NOT EDIT! 
-# source: google/cloud/datacatalog_v1beta1/proto/table_spec.proto - -from google.protobuf.internal import enum_type_wrapper -from google.protobuf import descriptor as _descriptor -from google.protobuf import message as _message -from google.protobuf import reflection as _reflection -from google.protobuf import symbol_database as _symbol_database - -# @@protoc_insertion_point(imports) - -_sym_db = _symbol_database.Default() - - -from google.api import field_behavior_pb2 as google_dot_api_dot_field__behavior__pb2 -from google.api import resource_pb2 as google_dot_api_dot_resource__pb2 - - -DESCRIPTOR = _descriptor.FileDescriptor( - name="google/cloud/datacatalog_v1beta1/proto/table_spec.proto", - package="google.cloud.datacatalog.v1beta1", - syntax="proto3", - serialized_options=b"\n$com.google.cloud.datacatalog.v1beta1P\001ZKgoogle.golang.org/genproto/googleapis/cloud/datacatalog/v1beta1;datacatalog\370\001\001\252\002 Google.Cloud.DataCatalog.V1Beta1\312\002 Google\\Cloud\\DataCatalog\\V1beta1\352\002#Google::Cloud::DataCatalog::V1beta1", - create_key=_descriptor._internal_create_key, - serialized_pb=b'\n7google/cloud/datacatalog_v1beta1/proto/table_spec.proto\x12 google.cloud.datacatalog.v1beta1\x1a\x1fgoogle/api/field_behavior.proto\x1a\x19google/api/resource.proto"\xf7\x01\n\x11\x42igQueryTableSpec\x12Q\n\x11table_source_type\x18\x01 \x01(\x0e\x32\x31.google.cloud.datacatalog.v1beta1.TableSourceTypeB\x03\xe0\x41\x03\x12?\n\tview_spec\x18\x02 \x01(\x0b\x32*.google.cloud.datacatalog.v1beta1.ViewSpecH\x00\x12\x41\n\ntable_spec\x18\x03 \x01(\x0b\x32+.google.cloud.datacatalog.v1beta1.TableSpecH\x00\x42\x0b\n\ttype_spec"#\n\x08ViewSpec\x12\x17\n\nview_query\x18\x01 \x01(\tB\x03\xe0\x41\x03"L\n\tTableSpec\x12?\n\rgrouped_entry\x18\x01 \x01(\tB(\xe0\x41\x03\xfa\x41"\n datacatalog.googleapis.com/Entry"\x89\x01\n\x17\x42igQueryDateShardedSpec\x12\x39\n\x07\x64\x61taset\x18\x01 \x01(\tB(\xe0\x41\x03\xfa\x41"\n 
datacatalog.googleapis.com/Entry\x12\x19\n\x0ctable_prefix\x18\x02 \x01(\tB\x03\xe0\x41\x03\x12\x18\n\x0bshard_count\x18\x03 \x01(\x03\x42\x03\xe0\x41\x03*[\n\x0fTableSourceType\x12!\n\x1dTABLE_SOURCE_TYPE_UNSPECIFIED\x10\x00\x12\x11\n\rBIGQUERY_VIEW\x10\x02\x12\x12\n\x0e\x42IGQUERY_TABLE\x10\x05\x42\xe4\x01\n$com.google.cloud.datacatalog.v1beta1P\x01ZKgoogle.golang.org/genproto/googleapis/cloud/datacatalog/v1beta1;datacatalog\xf8\x01\x01\xaa\x02 Google.Cloud.DataCatalog.V1Beta1\xca\x02 Google\\Cloud\\DataCatalog\\V1beta1\xea\x02#Google::Cloud::DataCatalog::V1beta1b\x06proto3', - dependencies=[ - google_dot_api_dot_field__behavior__pb2.DESCRIPTOR, - google_dot_api_dot_resource__pb2.DESCRIPTOR, - ], -) - -_TABLESOURCETYPE = _descriptor.EnumDescriptor( - name="TableSourceType", - full_name="google.cloud.datacatalog.v1beta1.TableSourceType", - filename=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - values=[ - _descriptor.EnumValueDescriptor( - name="TABLE_SOURCE_TYPE_UNSPECIFIED", - index=0, - number=0, - serialized_options=None, - type=None, - create_key=_descriptor._internal_create_key, - ), - _descriptor.EnumValueDescriptor( - name="BIGQUERY_VIEW", - index=1, - number=2, - serialized_options=None, - type=None, - create_key=_descriptor._internal_create_key, - ), - _descriptor.EnumValueDescriptor( - name="BIGQUERY_TABLE", - index=2, - number=5, - serialized_options=None, - type=None, - create_key=_descriptor._internal_create_key, - ), - ], - containing_type=None, - serialized_options=None, - serialized_start=658, - serialized_end=749, -) -_sym_db.RegisterEnumDescriptor(_TABLESOURCETYPE) - -TableSourceType = enum_type_wrapper.EnumTypeWrapper(_TABLESOURCETYPE) -TABLE_SOURCE_TYPE_UNSPECIFIED = 0 -BIGQUERY_VIEW = 2 -BIGQUERY_TABLE = 5 - - -_BIGQUERYTABLESPEC = _descriptor.Descriptor( - name="BigQueryTableSpec", - full_name="google.cloud.datacatalog.v1beta1.BigQueryTableSpec", - filename=None, - file=DESCRIPTOR, - containing_type=None, - 
create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="table_source_type", - full_name="google.cloud.datacatalog.v1beta1.BigQueryTableSpec.table_source_type", - index=0, - number=1, - type=14, - cpp_type=8, - label=1, - has_default_value=False, - default_value=0, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\003", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="view_spec", - full_name="google.cloud.datacatalog.v1beta1.BigQueryTableSpec.view_spec", - index=1, - number=2, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="table_spec", - full_name="google.cloud.datacatalog.v1beta1.BigQueryTableSpec.table_spec", - index=2, - number=3, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[ - _descriptor.OneofDescriptor( - name="type_spec", - full_name="google.cloud.datacatalog.v1beta1.BigQueryTableSpec.type_spec", - index=0, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[], - ) - ], - serialized_start=154, - serialized_end=401, -) - - -_VIEWSPEC = _descriptor.Descriptor( - name="ViewSpec", - full_name="google.cloud.datacatalog.v1beta1.ViewSpec", - filename=None, - 
file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="view_query", - full_name="google.cloud.datacatalog.v1beta1.ViewSpec.view_query", - index=0, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\003", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ) - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=403, - serialized_end=438, -) - - -_TABLESPEC = _descriptor.Descriptor( - name="TableSpec", - full_name="google.cloud.datacatalog.v1beta1.TableSpec", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="grouped_entry", - full_name="google.cloud.datacatalog.v1beta1.TableSpec.grouped_entry", - index=0, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b'\340A\003\372A"\n datacatalog.googleapis.com/Entry', - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ) - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=440, - serialized_end=516, -) - - -_BIGQUERYDATESHARDEDSPEC = _descriptor.Descriptor( - name="BigQueryDateShardedSpec", - full_name="google.cloud.datacatalog.v1beta1.BigQueryDateShardedSpec", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - 
fields=[ - _descriptor.FieldDescriptor( - name="dataset", - full_name="google.cloud.datacatalog.v1beta1.BigQueryDateShardedSpec.dataset", - index=0, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b'\340A\003\372A"\n datacatalog.googleapis.com/Entry', - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="table_prefix", - full_name="google.cloud.datacatalog.v1beta1.BigQueryDateShardedSpec.table_prefix", - index=1, - number=2, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\003", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="shard_count", - full_name="google.cloud.datacatalog.v1beta1.BigQueryDateShardedSpec.shard_count", - index=2, - number=3, - type=3, - cpp_type=2, - label=1, - has_default_value=False, - default_value=0, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\003", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=519, - serialized_end=656, -) - -_BIGQUERYTABLESPEC.fields_by_name["table_source_type"].enum_type = _TABLESOURCETYPE -_BIGQUERYTABLESPEC.fields_by_name["view_spec"].message_type = _VIEWSPEC -_BIGQUERYTABLESPEC.fields_by_name["table_spec"].message_type = _TABLESPEC -_BIGQUERYTABLESPEC.oneofs_by_name["type_spec"].fields.append( - 
_BIGQUERYTABLESPEC.fields_by_name["view_spec"] -) -_BIGQUERYTABLESPEC.fields_by_name[ - "view_spec" -].containing_oneof = _BIGQUERYTABLESPEC.oneofs_by_name["type_spec"] -_BIGQUERYTABLESPEC.oneofs_by_name["type_spec"].fields.append( - _BIGQUERYTABLESPEC.fields_by_name["table_spec"] -) -_BIGQUERYTABLESPEC.fields_by_name[ - "table_spec" -].containing_oneof = _BIGQUERYTABLESPEC.oneofs_by_name["type_spec"] -DESCRIPTOR.message_types_by_name["BigQueryTableSpec"] = _BIGQUERYTABLESPEC -DESCRIPTOR.message_types_by_name["ViewSpec"] = _VIEWSPEC -DESCRIPTOR.message_types_by_name["TableSpec"] = _TABLESPEC -DESCRIPTOR.message_types_by_name["BigQueryDateShardedSpec"] = _BIGQUERYDATESHARDEDSPEC -DESCRIPTOR.enum_types_by_name["TableSourceType"] = _TABLESOURCETYPE -_sym_db.RegisterFileDescriptor(DESCRIPTOR) - -BigQueryTableSpec = _reflection.GeneratedProtocolMessageType( - "BigQueryTableSpec", - (_message.Message,), - { - "DESCRIPTOR": _BIGQUERYTABLESPEC, - "__module__": "google.cloud.datacatalog_v1beta1.proto.table_spec_pb2", - "__doc__": """Describes a BigQuery table. - - Attributes: - table_source_type: - Output only. The table source type. - type_spec: - Output only. - view_spec: - Table view specification. This field should only be populated - if ``table_source_type`` is ``BIGQUERY_VIEW``. - table_spec: - Spec of a BigQuery table. This field should only be populated - if ``table_source_type`` is ``BIGQUERY_TABLE``. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.BigQueryTableSpec) - }, -) -_sym_db.RegisterMessage(BigQueryTableSpec) - -ViewSpec = _reflection.GeneratedProtocolMessageType( - "ViewSpec", - (_message.Message,), - { - "DESCRIPTOR": _VIEWSPEC, - "__module__": "google.cloud.datacatalog_v1beta1.proto.table_spec_pb2", - "__doc__": """Table view specification. - - Attributes: - view_query: - Output only. The query that defines the table view. 
- """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.ViewSpec) - }, -) -_sym_db.RegisterMessage(ViewSpec) - -TableSpec = _reflection.GeneratedProtocolMessageType( - "TableSpec", - (_message.Message,), - { - "DESCRIPTOR": _TABLESPEC, - "__module__": "google.cloud.datacatalog_v1beta1.proto.table_spec_pb2", - "__doc__": """Normal BigQuery table spec. - - Attributes: - grouped_entry: - Output only. If the table is a dated shard, i.e., with name - pattern ``[prefix]YYYYMMDD``, ``grouped_entry`` is the Data - Catalog resource name of the date sharded grouped entry, for - example, ``projects/{project_id}/locations/{location}/entrygro - ups/{entry_group_id}/entries/{entry_id}``. Otherwise, - ``grouped_entry`` is empty. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.TableSpec) - }, -) -_sym_db.RegisterMessage(TableSpec) - -BigQueryDateShardedSpec = _reflection.GeneratedProtocolMessageType( - "BigQueryDateShardedSpec", - (_message.Message,), - { - "DESCRIPTOR": _BIGQUERYDATESHARDEDSPEC, - "__module__": "google.cloud.datacatalog_v1beta1.proto.table_spec_pb2", - "__doc__": """Spec for a group of BigQuery tables with name pattern - ``[prefix]YYYYMMDD``. Context: - https://cloud.google.com/bigquery/docs/partitioned- - tables#partitioning_versus_sharding - - Attributes: - dataset: - Output only. The Data Catalog resource name of the dataset - entry the current table belongs to, for example, ``projects/{p - roject_id}/locations/{location}/entrygroups/{entry_group_id}/e - ntries/{entry_id}``. - table_prefix: - Output only. The table name prefix of the shards. The name of - any given shard is ``[table_prefix]YYYYMMDD``, for example, - for shard ``MyTable20180101``, the ``table_prefix`` is - ``MyTable``. - shard_count: - Output only. Total number of shards. 
- """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.BigQueryDateShardedSpec) - }, -) -_sym_db.RegisterMessage(BigQueryDateShardedSpec) - - -DESCRIPTOR._options = None -_BIGQUERYTABLESPEC.fields_by_name["table_source_type"]._options = None -_VIEWSPEC.fields_by_name["view_query"]._options = None -_TABLESPEC.fields_by_name["grouped_entry"]._options = None -_BIGQUERYDATESHARDEDSPEC.fields_by_name["dataset"]._options = None -_BIGQUERYDATESHARDEDSPEC.fields_by_name["table_prefix"]._options = None -_BIGQUERYDATESHARDEDSPEC.fields_by_name["shard_count"]._options = None -# @@protoc_insertion_point(module_scope) diff --git a/google/cloud/datacatalog_v1beta1/proto/table_spec_pb2_grpc.py b/google/cloud/datacatalog_v1beta1/proto/table_spec_pb2_grpc.py deleted file mode 100644 index 8a939394..00000000 --- a/google/cloud/datacatalog_v1beta1/proto/table_spec_pb2_grpc.py +++ /dev/null @@ -1,3 +0,0 @@ -# Generated by the gRPC Python protocol compiler plugin. DO NOT EDIT! -"""Client and server classes corresponding to protobuf-defined services.""" -import grpc diff --git a/google/cloud/datacatalog_v1beta1/proto/tags.proto b/google/cloud/datacatalog_v1beta1/proto/tags.proto index c2fc2da4..c15fb218 100644 --- a/google/cloud/datacatalog_v1beta1/proto/tags.proto +++ b/google/cloud/datacatalog_v1beta1/proto/tags.proto @@ -1,4 +1,4 @@ -// Copyright 2019 Google LLC. +// Copyright 2020 Google LLC // // Licensed under the Apache License, Version 2.0 (the "License"); // you may not use this file except in compliance with the License. @@ -11,7 +11,6 @@ // WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. // See the License for the specific language governing permissions and // limitations under the License. 
-// syntax = "proto3"; @@ -22,12 +21,19 @@ import "google/api/resource.proto"; import "google/protobuf/timestamp.proto"; option cc_enable_arenas = true; +option csharp_namespace = "Google.Cloud.DataCatalog.V1Beta1"; option go_package = "google.golang.org/genproto/googleapis/cloud/datacatalog/v1beta1;datacatalog"; option java_multiple_files = true; -option java_package = "com.google.cloud.datacatalog"; +option java_package = "com.google.cloud.datacatalog.v1beta1"; +option php_namespace = "Google\\Cloud\\DataCatalog\\V1beta1"; +option ruby_package = "Google::Cloud::DataCatalog::V1beta1"; // Tags are used to attach custom metadata to Data Catalog resources. Tags // conform to the specifications within their tag template. +// +// See [Data Catalog +// IAM](https://cloud.google.com/data-catalog/docs/concepts/iam) for information +// on the permissions needed to create or view tags. message Tag { option (google.api.resource) = { type: "datacatalog.googleapis.com/Tag" @@ -42,8 +48,7 @@ message Tag { // Note that this Tag may not actually be stored in the location in this name. string name = 1; - // Required. The resource name of the tag template that this tag uses. - // Example: + // Required. The resource name of the tag template that this tag uses. Example: // // * projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id} // @@ -68,9 +73,9 @@ message Tag { string column = 4; } - // Required. This maps the ID of a tag field to the value of and additional - // information about that field. Valid field IDs are defined by the tag's - // template. A tag must have at least 1 field and at most 500 fields. + // Required. This maps the ID of a tag field to the value of and additional information + // about that field. Valid field IDs are defined by the tag's template. A tag + // must have at least 1 field and at most 500 fields. 
map<string, TagField> fields = 3 [(google.api.field_behavior) = REQUIRED]; } @@ -104,11 +109,23 @@ message TagField { // one of the allowed values in the definition of this enum. EnumValue enum_value = 6; } + + // Output only. The order of this field with respect to other fields in this tag. It can be + // set in [Tag][google.cloud.datacatalog.v1beta1.TagTemplateField.order]. For + // example, a higher value can indicate a more important field. The value can + // be negative. Multiple fields can have the same order, and field orders + // within a tag do not have to be sequential. + int32 order = 7 [(google.api.field_behavior) = OUTPUT_ONLY]; } -// A tag template defines the schema of the tags used to attach to Data Catalog -// resources. It defines the mapping of accepted field names and types that can -// be used within the tag. The tag template also controls the access to the tag. +// A tag template defines a tag, which can have one or more typed fields. +// The template is used to create and attach the tag to GCP resources. +// [Tag template +// roles](https://cloud.google.com/iam/docs/understanding-roles#data-catalog-roles) +// provide permissions to create, edit, and use the template. See, for example, +// the [TagTemplate +// User](https://cloud.google.com/data-catalog/docs/how-to/template-user) role, +// which includes permission to use the tag template to tag resources. message TagTemplate { option (google.api.resource) = { type: "datacatalog.googleapis.com/TagTemplate" @@ -134,8 +151,7 @@ message TagTemplate { // letters (both uppercase and lowercase), numbers (0-9) and underscores (_). // Field IDs must be at least 1 character long and at most // 64 characters long. Field IDs must start with a letter or underscore. - map<string, TagTemplateField> fields = 3 - [(google.api.field_behavior) = REQUIRED]; + map<string, TagTemplateField> fields = 3 [(google.api.field_behavior) = REQUIRED]; } // The template for an individual field within a tag template.
@@ -145,8 +161,7 @@ message TagTemplateField { pattern: "projects/{project}/locations/{location}/tagTemplates/{tag_template}/fields/{field}" }; - // Output only. The resource name of the tag template field in URL format. - // Example: + // Output only. The resource name of the tag template field in URL format. Example: // // * projects/{project_id}/locations/{location}/tagTemplates/{tag_template}/fields/{field} // @@ -159,13 +174,21 @@ message TagTemplateField { // Required. The type of value this tag field can contain. FieldType type = 2 [(google.api.field_behavior) = REQUIRED]; + + // Whether this is a required field. Defaults to false. + bool is_required = 3; + + // The order of this field with respect to other fields in this tag + // template. A higher value indicates a more important field. The value can + // be negative. Multiple fields can have the same order, and field orders + // within a tag do not have to be sequential. + int32 order = 5; } message FieldType { message EnumType { message EnumValue { - // Required. The display name of the enum value. Must not be an empty - // string. + // Required. The display name of the enum value. Must not be an empty string. string display_name = 1 [(google.api.field_behavior) = REQUIRED]; } diff --git a/google/cloud/datacatalog_v1beta1/proto/tags_pb2.py b/google/cloud/datacatalog_v1beta1/proto/tags_pb2.py deleted file mode 100644 index 2d6c783e..00000000 --- a/google/cloud/datacatalog_v1beta1/proto/tags_pb2.py +++ /dev/null @@ -1,1216 +0,0 @@ -# -*- coding: utf-8 -*- -# Generated by the protocol buffer compiler. DO NOT EDIT! 
-# source: google/cloud/datacatalog_v1beta1/proto/tags.proto - -from google.protobuf import descriptor as _descriptor -from google.protobuf import message as _message -from google.protobuf import reflection as _reflection -from google.protobuf import symbol_database as _symbol_database - -# @@protoc_insertion_point(imports) - -_sym_db = _symbol_database.Default() - - -from google.api import field_behavior_pb2 as google_dot_api_dot_field__behavior__pb2 -from google.api import resource_pb2 as google_dot_api_dot_resource__pb2 -from google.protobuf import timestamp_pb2 as google_dot_protobuf_dot_timestamp__pb2 - - -DESCRIPTOR = _descriptor.FileDescriptor( - name="google/cloud/datacatalog_v1beta1/proto/tags.proto", - package="google.cloud.datacatalog.v1beta1", - syntax="proto3", - serialized_options=b"\n$com.google.cloud.datacatalog.v1beta1P\001ZKgoogle.golang.org/genproto/googleapis/cloud/datacatalog/v1beta1;datacatalog\370\001\001\252\002 Google.Cloud.DataCatalog.V1Beta1\312\002 Google\\Cloud\\DataCatalog\\V1beta1\352\002#Google::Cloud::DataCatalog::V1beta1", - create_key=_descriptor._internal_create_key, - serialized_pb=b'\n1google/cloud/datacatalog_v1beta1/proto/tags.proto\x12 google.cloud.datacatalog.v1beta1\x1a\x1fgoogle/api/field_behavior.proto\x1a\x19google/api/resource.proto\x1a\x1fgoogle/protobuf/timestamp.proto"\x90\x03\n\x03Tag\x12\x0c\n\x04name\x18\x01 \x01(\t\x12\x15\n\x08template\x18\x02 \x01(\tB\x03\xe0\x41\x02\x12"\n\x15template_display_name\x18\x05 \x01(\tB\x03\xe0\x41\x03\x12\x10\n\x06\x63olumn\x18\x04 \x01(\tH\x00\x12\x46\n\x06\x66ields\x18\x03 \x03(\x0b\x32\x31.google.cloud.datacatalog.v1beta1.Tag.FieldsEntryB\x03\xe0\x41\x02\x1aY\n\x0b\x46ieldsEntry\x12\x0b\n\x03key\x18\x01 \x01(\t\x12\x39\n\x05value\x18\x02 
\x01(\x0b\x32*.google.cloud.datacatalog.v1beta1.TagField:\x02\x38\x01:\x81\x01\xea\x41~\n\x1e\x64\x61tacatalog.googleapis.com/Tag\x12\\projects/{project}/locations/{location}/entryGroups/{entry_group}/entries/{entry}/tags/{tag}B\x07\n\x05scope"\xad\x02\n\x08TagField\x12\x19\n\x0c\x64isplay_name\x18\x01 \x01(\tB\x03\xe0\x41\x03\x12\x16\n\x0c\x64ouble_value\x18\x02 \x01(\x01H\x00\x12\x16\n\x0cstring_value\x18\x03 \x01(\tH\x00\x12\x14\n\nbool_value\x18\x04 \x01(\x08H\x00\x12\x35\n\x0ftimestamp_value\x18\x05 \x01(\x0b\x32\x1a.google.protobuf.TimestampH\x00\x12J\n\nenum_value\x18\x06 \x01(\x0b\x32\x34.google.cloud.datacatalog.v1beta1.TagField.EnumValueH\x00\x12\x12\n\x05order\x18\x07 \x01(\x05\x42\x03\xe0\x41\x03\x1a!\n\tEnumValue\x12\x14\n\x0c\x64isplay_name\x18\x01 \x01(\tB\x06\n\x04kind"\xd6\x02\n\x0bTagTemplate\x12\x0c\n\x04name\x18\x01 \x01(\t\x12\x14\n\x0c\x64isplay_name\x18\x02 \x01(\t\x12N\n\x06\x66ields\x18\x03 \x03(\x0b\x32\x39.google.cloud.datacatalog.v1beta1.TagTemplate.FieldsEntryB\x03\xe0\x41\x02\x1a\x61\n\x0b\x46ieldsEntry\x12\x0b\n\x03key\x18\x01 \x01(\t\x12\x41\n\x05value\x18\x02 \x01(\x0b\x32\x32.google.cloud.datacatalog.v1beta1.TagTemplateField:\x02\x38\x01:p\xea\x41m\n&datacatalog.googleapis.com/TagTemplate\x12\x43projects/{project}/locations/{location}/tagTemplates/{tag_template}"\xa7\x02\n\x10TagTemplateField\x12\x11\n\x04name\x18\x06 \x01(\tB\x03\xe0\x41\x03\x12\x14\n\x0c\x64isplay_name\x18\x01 \x01(\t\x12>\n\x04type\x18\x02 \x01(\x0b\x32+.google.cloud.datacatalog.v1beta1.FieldTypeB\x03\xe0\x41\x02\x12\x13\n\x0bis_required\x18\x03 \x01(\x08\x12\r\n\x05order\x18\x05 \x01(\x05:\x85\x01\xea\x41\x81\x01\n+datacatalog.googleapis.com/TagTemplateField\x12Rprojects/{project}/locations/{location}/tagTemplates/{tag_template}/fields/{field}"\xa7\x03\n\tFieldType\x12S\n\x0eprimitive_type\x18\x01 \x01(\x0e\x32\x39.google.cloud.datacatalog.v1beta1.FieldType.PrimitiveTypeH\x00\x12I\n\tenum_type\x18\x02 
\x01(\x0b\x32\x34.google.cloud.datacatalog.v1beta1.FieldType.EnumTypeH\x00\x1a\x8a\x01\n\x08\x45numType\x12V\n\x0e\x61llowed_values\x18\x01 \x03(\x0b\x32>.google.cloud.datacatalog.v1beta1.FieldType.EnumType.EnumValue\x1a&\n\tEnumValue\x12\x19\n\x0c\x64isplay_name\x18\x01 \x01(\tB\x03\xe0\x41\x02"`\n\rPrimitiveType\x12\x1e\n\x1aPRIMITIVE_TYPE_UNSPECIFIED\x10\x00\x12\n\n\x06\x44OUBLE\x10\x01\x12\n\n\x06STRING\x10\x02\x12\x08\n\x04\x42OOL\x10\x03\x12\r\n\tTIMESTAMP\x10\x04\x42\x0b\n\ttype_declB\xe4\x01\n$com.google.cloud.datacatalog.v1beta1P\x01ZKgoogle.golang.org/genproto/googleapis/cloud/datacatalog/v1beta1;datacatalog\xf8\x01\x01\xaa\x02 Google.Cloud.DataCatalog.V1Beta1\xca\x02 Google\\Cloud\\DataCatalog\\V1beta1\xea\x02#Google::Cloud::DataCatalog::V1beta1b\x06proto3', - dependencies=[ - google_dot_api_dot_field__behavior__pb2.DESCRIPTOR, - google_dot_api_dot_resource__pb2.DESCRIPTOR, - google_dot_protobuf_dot_timestamp__pb2.DESCRIPTOR, - ], -) - - -_FIELDTYPE_PRIMITIVETYPE = _descriptor.EnumDescriptor( - name="PrimitiveType", - full_name="google.cloud.datacatalog.v1beta1.FieldType.PrimitiveType", - filename=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - values=[ - _descriptor.EnumValueDescriptor( - name="PRIMITIVE_TYPE_UNSPECIFIED", - index=0, - number=0, - serialized_options=None, - type=None, - create_key=_descriptor._internal_create_key, - ), - _descriptor.EnumValueDescriptor( - name="DOUBLE", - index=1, - number=1, - serialized_options=None, - type=None, - create_key=_descriptor._internal_create_key, - ), - _descriptor.EnumValueDescriptor( - name="STRING", - index=2, - number=2, - serialized_options=None, - type=None, - create_key=_descriptor._internal_create_key, - ), - _descriptor.EnumValueDescriptor( - name="BOOL", - index=3, - number=3, - serialized_options=None, - type=None, - create_key=_descriptor._internal_create_key, - ), - _descriptor.EnumValueDescriptor( - name="TIMESTAMP", - index=4, - number=4, - serialized_options=None, 
- type=None, - create_key=_descriptor._internal_create_key, - ), - ], - containing_type=None, - serialized_options=None, - serialized_start=1845, - serialized_end=1941, -) -_sym_db.RegisterEnumDescriptor(_FIELDTYPE_PRIMITIVETYPE) - - -_TAG_FIELDSENTRY = _descriptor.Descriptor( - name="FieldsEntry", - full_name="google.cloud.datacatalog.v1beta1.Tag.FieldsEntry", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="key", - full_name="google.cloud.datacatalog.v1beta1.Tag.FieldsEntry.key", - index=0, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="value", - full_name="google.cloud.datacatalog.v1beta1.Tag.FieldsEntry.value", - index=1, - number=2, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=b"8\001", - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=351, - serialized_end=440, -) - -_TAG = _descriptor.Descriptor( - name="Tag", - full_name="google.cloud.datacatalog.v1beta1.Tag", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="name", - full_name="google.cloud.datacatalog.v1beta1.Tag.name", - index=0, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - 
default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="template", - full_name="google.cloud.datacatalog.v1beta1.Tag.template", - index=1, - number=2, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="template_display_name", - full_name="google.cloud.datacatalog.v1beta1.Tag.template_display_name", - index=2, - number=5, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\003", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="column", - full_name="google.cloud.datacatalog.v1beta1.Tag.column", - index=3, - number=4, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="fields", - full_name="google.cloud.datacatalog.v1beta1.Tag.fields", - index=4, - number=3, - type=11, - cpp_type=10, - label=3, - has_default_value=False, - default_value=[], - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002", - file=DESCRIPTOR, - 
create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[_TAG_FIELDSENTRY], - enum_types=[], - serialized_options=b"\352A~\n\036datacatalog.googleapis.com/Tag\022\\projects/{project}/locations/{location}/entryGroups/{entry_group}/entries/{entry}/tags/{tag}", - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[ - _descriptor.OneofDescriptor( - name="scope", - full_name="google.cloud.datacatalog.v1beta1.Tag.scope", - index=0, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[], - ) - ], - serialized_start=181, - serialized_end=581, -) - - -_TAGFIELD_ENUMVALUE = _descriptor.Descriptor( - name="EnumValue", - full_name="google.cloud.datacatalog.v1beta1.TagField.EnumValue", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="display_name", - full_name="google.cloud.datacatalog.v1beta1.TagField.EnumValue.display_name", - index=0, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ) - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=844, - serialized_end=877, -) - -_TAGFIELD = _descriptor.Descriptor( - name="TagField", - full_name="google.cloud.datacatalog.v1beta1.TagField", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="display_name", - full_name="google.cloud.datacatalog.v1beta1.TagField.display_name", - index=0, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - 
default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\003", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="double_value", - full_name="google.cloud.datacatalog.v1beta1.TagField.double_value", - index=1, - number=2, - type=1, - cpp_type=5, - label=1, - has_default_value=False, - default_value=float(0), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="string_value", - full_name="google.cloud.datacatalog.v1beta1.TagField.string_value", - index=2, - number=3, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="bool_value", - full_name="google.cloud.datacatalog.v1beta1.TagField.bool_value", - index=3, - number=4, - type=8, - cpp_type=7, - label=1, - has_default_value=False, - default_value=False, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="timestamp_value", - full_name="google.cloud.datacatalog.v1beta1.TagField.timestamp_value", - index=4, - number=5, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - 
create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="enum_value", - full_name="google.cloud.datacatalog.v1beta1.TagField.enum_value", - index=5, - number=6, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="order", - full_name="google.cloud.datacatalog.v1beta1.TagField.order", - index=6, - number=7, - type=5, - cpp_type=1, - label=1, - has_default_value=False, - default_value=0, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\003", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[_TAGFIELD_ENUMVALUE], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[ - _descriptor.OneofDescriptor( - name="kind", - full_name="google.cloud.datacatalog.v1beta1.TagField.kind", - index=0, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[], - ) - ], - serialized_start=584, - serialized_end=885, -) - - -_TAGTEMPLATE_FIELDSENTRY = _descriptor.Descriptor( - name="FieldsEntry", - full_name="google.cloud.datacatalog.v1beta1.TagTemplate.FieldsEntry", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="key", - full_name="google.cloud.datacatalog.v1beta1.TagTemplate.FieldsEntry.key", - index=0, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - 
serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="value", - full_name="google.cloud.datacatalog.v1beta1.TagTemplate.FieldsEntry.value", - index=1, - number=2, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=b"8\001", - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=1019, - serialized_end=1116, -) - -_TAGTEMPLATE = _descriptor.Descriptor( - name="TagTemplate", - full_name="google.cloud.datacatalog.v1beta1.TagTemplate", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="name", - full_name="google.cloud.datacatalog.v1beta1.TagTemplate.name", - index=0, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="display_name", - full_name="google.cloud.datacatalog.v1beta1.TagTemplate.display_name", - index=1, - number=2, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="fields", - 
full_name="google.cloud.datacatalog.v1beta1.TagTemplate.fields", - index=2, - number=3, - type=11, - cpp_type=10, - label=3, - has_default_value=False, - default_value=[], - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[_TAGTEMPLATE_FIELDSENTRY], - enum_types=[], - serialized_options=b"\352Am\n&datacatalog.googleapis.com/TagTemplate\022Cprojects/{project}/locations/{location}/tagTemplates/{tag_template}", - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=888, - serialized_end=1230, -) - - -_TAGTEMPLATEFIELD = _descriptor.Descriptor( - name="TagTemplateField", - full_name="google.cloud.datacatalog.v1beta1.TagTemplateField", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="name", - full_name="google.cloud.datacatalog.v1beta1.TagTemplateField.name", - index=0, - number=6, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\003", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="display_name", - full_name="google.cloud.datacatalog.v1beta1.TagTemplateField.display_name", - index=1, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="type", - 
full_name="google.cloud.datacatalog.v1beta1.TagTemplateField.type", - index=2, - number=2, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="is_required", - full_name="google.cloud.datacatalog.v1beta1.TagTemplateField.is_required", - index=3, - number=3, - type=8, - cpp_type=7, - label=1, - has_default_value=False, - default_value=False, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="order", - full_name="google.cloud.datacatalog.v1beta1.TagTemplateField.order", - index=4, - number=5, - type=5, - cpp_type=1, - label=1, - has_default_value=False, - default_value=0, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=b"\352A\201\001\n+datacatalog.googleapis.com/TagTemplateField\022Rprojects/{project}/locations/{location}/tagTemplates/{tag_template}/fields/{field}", - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=1233, - serialized_end=1528, -) - - -_FIELDTYPE_ENUMTYPE_ENUMVALUE = _descriptor.Descriptor( - name="EnumValue", - full_name="google.cloud.datacatalog.v1beta1.FieldType.EnumType.EnumValue", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="display_name", - 
full_name="google.cloud.datacatalog.v1beta1.FieldType.EnumType.EnumValue.display_name", - index=0, - number=1, - type=9, - cpp_type=9, - label=1, - has_default_value=False, - default_value=b"".decode("utf-8"), - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\002", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ) - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=1805, - serialized_end=1843, -) - -_FIELDTYPE_ENUMTYPE = _descriptor.Descriptor( - name="EnumType", - full_name="google.cloud.datacatalog.v1beta1.FieldType.EnumType", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="allowed_values", - full_name="google.cloud.datacatalog.v1beta1.FieldType.EnumType.allowed_values", - index=0, - number=1, - type=11, - cpp_type=10, - label=3, - has_default_value=False, - default_value=[], - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ) - ], - extensions=[], - nested_types=[_FIELDTYPE_ENUMTYPE_ENUMVALUE], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=1705, - serialized_end=1843, -) - -_FIELDTYPE = _descriptor.Descriptor( - name="FieldType", - full_name="google.cloud.datacatalog.v1beta1.FieldType", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="primitive_type", - full_name="google.cloud.datacatalog.v1beta1.FieldType.primitive_type", - index=0, - number=1, - type=14, - cpp_type=8, - 
label=1, - has_default_value=False, - default_value=0, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="enum_type", - full_name="google.cloud.datacatalog.v1beta1.FieldType.enum_type", - index=1, - number=2, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[_FIELDTYPE_ENUMTYPE], - enum_types=[_FIELDTYPE_PRIMITIVETYPE], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[ - _descriptor.OneofDescriptor( - name="type_decl", - full_name="google.cloud.datacatalog.v1beta1.FieldType.type_decl", - index=0, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[], - ) - ], - serialized_start=1531, - serialized_end=1954, -) - -_TAG_FIELDSENTRY.fields_by_name["value"].message_type = _TAGFIELD -_TAG_FIELDSENTRY.containing_type = _TAG -_TAG.fields_by_name["fields"].message_type = _TAG_FIELDSENTRY -_TAG.oneofs_by_name["scope"].fields.append(_TAG.fields_by_name["column"]) -_TAG.fields_by_name["column"].containing_oneof = _TAG.oneofs_by_name["scope"] -_TAGFIELD_ENUMVALUE.containing_type = _TAGFIELD -_TAGFIELD.fields_by_name[ - "timestamp_value" -].message_type = google_dot_protobuf_dot_timestamp__pb2._TIMESTAMP -_TAGFIELD.fields_by_name["enum_value"].message_type = _TAGFIELD_ENUMVALUE -_TAGFIELD.oneofs_by_name["kind"].fields.append(_TAGFIELD.fields_by_name["double_value"]) -_TAGFIELD.fields_by_name["double_value"].containing_oneof = _TAGFIELD.oneofs_by_name[ - "kind" -] 
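The wiring above attaches each `*_value` field of `TagField` to the `kind` oneof. In proto3, assigning any member of a oneof clears whichever member was previously set; a minimal pure-Python sketch of that behavior (toy class with hypothetical names, not the generated protobuf API):

```python
class OneofKind:
    """Toy model of the TagField `kind` oneof: at most one member set at a time."""

    _MEMBERS = ("double_value", "string_value", "bool_value",
                "timestamp_value", "enum_value")

    def __init__(self):
        self._set_member = None
        self._value = None

    def set(self, member, value):
        if member not in self._MEMBERS:
            raise ValueError(f"unknown oneof member: {member}")
        # Setting any member replaces (clears) the previously set member.
        self._set_member = member
        self._value = value

    def which_oneof(self):
        # Mirrors protobuf's WhichOneof(): name of the set member, or None.
        return self._set_member


field = OneofKind()
field.set("double_value", 3.5)
field.set("string_value", "PII")
print(field.which_oneof())  # prints "string_value"; setting the string cleared the double
```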
-_TAGFIELD.oneofs_by_name["kind"].fields.append(_TAGFIELD.fields_by_name["string_value"]) -_TAGFIELD.fields_by_name["string_value"].containing_oneof = _TAGFIELD.oneofs_by_name[ - "kind" -] -_TAGFIELD.oneofs_by_name["kind"].fields.append(_TAGFIELD.fields_by_name["bool_value"]) -_TAGFIELD.fields_by_name["bool_value"].containing_oneof = _TAGFIELD.oneofs_by_name[ - "kind" -] -_TAGFIELD.oneofs_by_name["kind"].fields.append( - _TAGFIELD.fields_by_name["timestamp_value"] -) -_TAGFIELD.fields_by_name["timestamp_value"].containing_oneof = _TAGFIELD.oneofs_by_name[ - "kind" -] -_TAGFIELD.oneofs_by_name["kind"].fields.append(_TAGFIELD.fields_by_name["enum_value"]) -_TAGFIELD.fields_by_name["enum_value"].containing_oneof = _TAGFIELD.oneofs_by_name[ - "kind" -] -_TAGTEMPLATE_FIELDSENTRY.fields_by_name["value"].message_type = _TAGTEMPLATEFIELD -_TAGTEMPLATE_FIELDSENTRY.containing_type = _TAGTEMPLATE -_TAGTEMPLATE.fields_by_name["fields"].message_type = _TAGTEMPLATE_FIELDSENTRY -_TAGTEMPLATEFIELD.fields_by_name["type"].message_type = _FIELDTYPE -_FIELDTYPE_ENUMTYPE_ENUMVALUE.containing_type = _FIELDTYPE_ENUMTYPE -_FIELDTYPE_ENUMTYPE.fields_by_name[ - "allowed_values" -].message_type = _FIELDTYPE_ENUMTYPE_ENUMVALUE -_FIELDTYPE_ENUMTYPE.containing_type = _FIELDTYPE -_FIELDTYPE.fields_by_name["primitive_type"].enum_type = _FIELDTYPE_PRIMITIVETYPE -_FIELDTYPE.fields_by_name["enum_type"].message_type = _FIELDTYPE_ENUMTYPE -_FIELDTYPE_PRIMITIVETYPE.containing_type = _FIELDTYPE -_FIELDTYPE.oneofs_by_name["type_decl"].fields.append( - _FIELDTYPE.fields_by_name["primitive_type"] -) -_FIELDTYPE.fields_by_name[ - "primitive_type" -].containing_oneof = _FIELDTYPE.oneofs_by_name["type_decl"] -_FIELDTYPE.oneofs_by_name["type_decl"].fields.append( - _FIELDTYPE.fields_by_name["enum_type"] -) -_FIELDTYPE.fields_by_name["enum_type"].containing_oneof = _FIELDTYPE.oneofs_by_name[ - "type_decl" -] -DESCRIPTOR.message_types_by_name["Tag"] = _TAG -DESCRIPTOR.message_types_by_name["TagField"] = 
_TAGFIELD -DESCRIPTOR.message_types_by_name["TagTemplate"] = _TAGTEMPLATE -DESCRIPTOR.message_types_by_name["TagTemplateField"] = _TAGTEMPLATEFIELD -DESCRIPTOR.message_types_by_name["FieldType"] = _FIELDTYPE -_sym_db.RegisterFileDescriptor(DESCRIPTOR) - -Tag = _reflection.GeneratedProtocolMessageType( - "Tag", - (_message.Message,), - { - "FieldsEntry": _reflection.GeneratedProtocolMessageType( - "FieldsEntry", - (_message.Message,), - { - "DESCRIPTOR": _TAG_FIELDSENTRY, - "__module__": "google.cloud.datacatalog_v1beta1.proto.tags_pb2" - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.Tag.FieldsEntry) - }, - ), - "DESCRIPTOR": _TAG, - "__module__": "google.cloud.datacatalog_v1beta1.proto.tags_pb2", - "__doc__": """Tags are used to attach custom metadata to Data Catalog resources. - Tags conform to the specifications within their tag template. See - `Data Catalog IAM `__ for information on the permissions - needed to create or view tags. - - Attributes: - name: - The resource name of the tag in URL format. Example: - proje - cts/{project_id}/locations/{location}/entrygroups/{entry_group - _id}/entries/{entry_id}/tags/{tag_id} where ``tag_id`` is a - system-generated identifier. Note that this Tag may not - actually be stored in the location in this name. - template: - Required. The resource name of the tag template that this tag - uses. Example: - projects/{project_id}/locations/{location}/ - tagTemplates/{tag_template_id} This field cannot be modified - after creation. - template_display_name: - Output only. The display name of the tag template. - scope: - The scope within the parent resource that this tag is attached - to. If not provided, the tag is attached to the parent - resource itself. Deleting the scope from the parent resource - will delete all tags attached to that scope. These fields - cannot be updated after creation. - column: - Resources like Entry can have schemas associated with them. 
- This scope allows users to attach tags to an individual column - based on that schema. For attaching a tag to a nested column, - use ``.`` to separate the column names. Example: - - ``outer_column.inner_column`` - fields: - Required. This maps the ID of a tag field to the value of and - additional information about that field. Valid field IDs are - defined by the tag’s template. A tag must have at least 1 - field and at most 500 fields. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.Tag) - }, -) -_sym_db.RegisterMessage(Tag) -_sym_db.RegisterMessage(Tag.FieldsEntry) - -TagField = _reflection.GeneratedProtocolMessageType( - "TagField", - (_message.Message,), - { - "EnumValue": _reflection.GeneratedProtocolMessageType( - "EnumValue", - (_message.Message,), - { - "DESCRIPTOR": _TAGFIELD_ENUMVALUE, - "__module__": "google.cloud.datacatalog_v1beta1.proto.tags_pb2", - "__doc__": """Holds an enum value. - - Attributes: - display_name: - The display name of the enum value. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.TagField.EnumValue) - }, - ), - "DESCRIPTOR": _TAGFIELD, - "__module__": "google.cloud.datacatalog_v1beta1.proto.tags_pb2", - "__doc__": """Contains the value and supporting information for a field within a - [Tag][google.cloud.datacatalog.v1beta1.Tag]. - - Attributes: - display_name: - Output only. The display name of this field. - kind: - Required. The value of this field. - double_value: - Holds the value for a tag field with double type. - string_value: - Holds the value for a tag field with string type. - bool_value: - Holds the value for a tag field with boolean type. - timestamp_value: - Holds the value for a tag field with timestamp type. - enum_value: - Holds the value for a tag field with enum type. This value - must be one of the allowed values in the definition of this - enum. - order: - Output only. The order of this field with respect to other - fields in this tag. 
It can be set in [Tag][google.cloud.dataca - talog.v1beta1.TagTemplateField.order]. For example, a higher - value can indicate a more important field. The value can be - negative. Multiple fields can have the same order, and field - orders within a tag do not have to be sequential. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.TagField) - }, -) -_sym_db.RegisterMessage(TagField) -_sym_db.RegisterMessage(TagField.EnumValue) - -TagTemplate = _reflection.GeneratedProtocolMessageType( - "TagTemplate", - (_message.Message,), - { - "FieldsEntry": _reflection.GeneratedProtocolMessageType( - "FieldsEntry", - (_message.Message,), - { - "DESCRIPTOR": _TAGTEMPLATE_FIELDSENTRY, - "__module__": "google.cloud.datacatalog_v1beta1.proto.tags_pb2" - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.TagTemplate.FieldsEntry) - }, - ), - "DESCRIPTOR": _TAGTEMPLATE, - "__module__": "google.cloud.datacatalog_v1beta1.proto.tags_pb2", - "__doc__": """A tag template defines a tag, which can have one or more typed fields. - The template is used to create and attach the tag to GCP resources. - `Tag template roles `__ provide permissions to create, edit, and - use the template. See, for example, the `TagTemplate User - `__ - role, which includes permission to use the tag template to tag - resources. - - Attributes: - name: - The resource name of the tag template in URL format. Example: - - projects/{project_id}/locations/{location}/tagTemplates/{ta - g_template_id} Note that this TagTemplate and its child - resources may not actually be stored in the location in this - name. - display_name: - The display name for this template. Defaults to an empty - string. - fields: - Required. Map of tag template field IDs to the settings for - the field. This map is an exhaustive list of the allowed - fields. This map must contain at least one field and at most - 500 fields. The keys to this map are tag template field IDs. 
- Field IDs can contain letters (both uppercase and lowercase), - numbers (0-9) and underscores (_). Field IDs must be at least - 1 character long and at most 64 characters long. Field IDs - must start with a letter or underscore. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.TagTemplate) - }, -) -_sym_db.RegisterMessage(TagTemplate) -_sym_db.RegisterMessage(TagTemplate.FieldsEntry) - -TagTemplateField = _reflection.GeneratedProtocolMessageType( - "TagTemplateField", - (_message.Message,), - { - "DESCRIPTOR": _TAGTEMPLATEFIELD, - "__module__": "google.cloud.datacatalog_v1beta1.proto.tags_pb2", - "__doc__": """The template for an individual field within a tag template. - - Attributes: - name: - Output only. The resource name of the tag template field in - URL format. Example: - projects/{project_id}/locations/{loca - tion}/tagTemplates/{tag_template}/fields/{field} Note that - this TagTemplateField may not actually be stored in the - location in this name. - display_name: - The display name for this field. Defaults to an empty string. - type: - Required. The type of value this tag field can contain. - is_required: - Whether this is a required field. Defaults to false. - order: - The order of this field with respect to other fields in this - tag template. A higher value indicates a more important field. - The value can be negative. Multiple fields can have the same - order, and field orders within a tag do not have to be - sequential. 
- """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.TagTemplateField) - }, -) -_sym_db.RegisterMessage(TagTemplateField) - -FieldType = _reflection.GeneratedProtocolMessageType( - "FieldType", - (_message.Message,), - { - "EnumType": _reflection.GeneratedProtocolMessageType( - "EnumType", - (_message.Message,), - { - "EnumValue": _reflection.GeneratedProtocolMessageType( - "EnumValue", - (_message.Message,), - { - "DESCRIPTOR": _FIELDTYPE_ENUMTYPE_ENUMVALUE, - "__module__": "google.cloud.datacatalog_v1beta1.proto.tags_pb2", - "__doc__": """ - Attributes: - display_name: - Required. The display name of the enum value. Must not be an - empty string. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.FieldType.EnumType.EnumValue) - }, - ), - "DESCRIPTOR": _FIELDTYPE_ENUMTYPE, - "__module__": "google.cloud.datacatalog_v1beta1.proto.tags_pb2", - "__doc__": """ - Attributes: - allowed_values: - Required on create; optional on update. The set of allowed - values for this enum. This set must not be empty, the display - names of the values in this set must not be empty and the - display names of the values must be case-insensitively unique - within this set. Currently, enum values can only be added to - the list of allowed values. Deletion and renaming of enum - values are not supported. Can have up to 500 allowed values. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.FieldType.EnumType) - }, - ), - "DESCRIPTOR": _FIELDTYPE, - "__module__": "google.cloud.datacatalog_v1beta1.proto.tags_pb2", - "__doc__": """ - Attributes: - type_decl: - Required. - primitive_type: - Represents primitive types - string, bool etc. - enum_type: - Represents an enum type. 
- """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.FieldType) - }, -) -_sym_db.RegisterMessage(FieldType) -_sym_db.RegisterMessage(FieldType.EnumType) -_sym_db.RegisterMessage(FieldType.EnumType.EnumValue) - - -DESCRIPTOR._options = None -_TAG_FIELDSENTRY._options = None -_TAG.fields_by_name["template"]._options = None -_TAG.fields_by_name["template_display_name"]._options = None -_TAG.fields_by_name["fields"]._options = None -_TAG._options = None -_TAGFIELD.fields_by_name["display_name"]._options = None -_TAGFIELD.fields_by_name["order"]._options = None -_TAGTEMPLATE_FIELDSENTRY._options = None -_TAGTEMPLATE.fields_by_name["fields"]._options = None -_TAGTEMPLATE._options = None -_TAGTEMPLATEFIELD.fields_by_name["name"]._options = None -_TAGTEMPLATEFIELD.fields_by_name["type"]._options = None -_TAGTEMPLATEFIELD._options = None -_FIELDTYPE_ENUMTYPE_ENUMVALUE.fields_by_name["display_name"]._options = None -# @@protoc_insertion_point(module_scope) diff --git a/google/cloud/datacatalog_v1beta1/proto/tags_pb2_grpc.py b/google/cloud/datacatalog_v1beta1/proto/tags_pb2_grpc.py deleted file mode 100644 index 8a939394..00000000 --- a/google/cloud/datacatalog_v1beta1/proto/tags_pb2_grpc.py +++ /dev/null @@ -1,3 +0,0 @@ -# Generated by the gRPC Python protocol compiler plugin. DO NOT EDIT! -"""Client and server classes corresponding to protobuf-defined services.""" -import grpc diff --git a/google/cloud/datacatalog_v1beta1/proto/timestamps.proto b/google/cloud/datacatalog_v1beta1/proto/timestamps.proto index 9a3d640e..dc49c75c 100644 --- a/google/cloud/datacatalog_v1beta1/proto/timestamps.proto +++ b/google/cloud/datacatalog_v1beta1/proto/timestamps.proto @@ -1,4 +1,4 @@ -// Copyright 2019 Google LLC. +// Copyright 2020 Google LLC // // Licensed under the Apache License, Version 2.0 (the "License"); // you may not use this file except in compliance with the License. 
@@ -11,7 +11,6 @@ // WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. // See the License for the specific language governing permissions and // limitations under the License. -// syntax = "proto3"; @@ -21,9 +20,12 @@ import "google/api/field_behavior.proto"; import "google/protobuf/timestamp.proto"; option cc_enable_arenas = true; +option csharp_namespace = "Google.Cloud.DataCatalog.V1Beta1"; option go_package = "google.golang.org/genproto/googleapis/cloud/datacatalog/v1beta1;datacatalog"; option java_multiple_files = true; -option java_package = "com.google.cloud.datacatalog"; +option java_package = "com.google.cloud.datacatalog.v1beta1"; +option php_namespace = "Google\\Cloud\\DataCatalog\\V1beta1"; +option ruby_package = "Google::Cloud::DataCatalog::V1beta1"; // Timestamps about this resource according to a particular system. message SystemTimestamps { @@ -35,6 +37,5 @@ message SystemTimestamps { // Output only. The expiration time of the resource within the given system. // Currently only applicable to BigQuery resources. - google.protobuf.Timestamp expire_time = 3 - [(google.api.field_behavior) = OUTPUT_ONLY]; + google.protobuf.Timestamp expire_time = 3 [(google.api.field_behavior) = OUTPUT_ONLY]; } diff --git a/google/cloud/datacatalog_v1beta1/proto/timestamps_pb2.py b/google/cloud/datacatalog_v1beta1/proto/timestamps_pb2.py deleted file mode 100644 index c8cea0a4..00000000 --- a/google/cloud/datacatalog_v1beta1/proto/timestamps_pb2.py +++ /dev/null @@ -1,149 +0,0 @@ -# -*- coding: utf-8 -*- -# Generated by the protocol buffer compiler. DO NOT EDIT!
-# source: google/cloud/datacatalog_v1beta1/proto/timestamps.proto - -from google.protobuf import descriptor as _descriptor -from google.protobuf import message as _message -from google.protobuf import reflection as _reflection -from google.protobuf import symbol_database as _symbol_database - -# @@protoc_insertion_point(imports) - -_sym_db = _symbol_database.Default() - - -from google.api import field_behavior_pb2 as google_dot_api_dot_field__behavior__pb2 -from google.protobuf import timestamp_pb2 as google_dot_protobuf_dot_timestamp__pb2 - - -DESCRIPTOR = _descriptor.FileDescriptor( - name="google/cloud/datacatalog_v1beta1/proto/timestamps.proto", - package="google.cloud.datacatalog.v1beta1", - syntax="proto3", - serialized_options=b"\n$com.google.cloud.datacatalog.v1beta1P\001ZKgoogle.golang.org/genproto/googleapis/cloud/datacatalog/v1beta1;datacatalog\370\001\001\252\002 Google.Cloud.DataCatalog.V1Beta1\312\002 Google\\Cloud\\DataCatalog\\V1beta1\352\002#Google::Cloud::DataCatalog::V1beta1", - create_key=_descriptor._internal_create_key, - serialized_pb=b'\n7google/cloud/datacatalog_v1beta1/proto/timestamps.proto\x12 google.cloud.datacatalog.v1beta1\x1a\x1fgoogle/api/field_behavior.proto\x1a\x1fgoogle/protobuf/timestamp.proto"\xaa\x01\n\x10SystemTimestamps\x12/\n\x0b\x63reate_time\x18\x01 \x01(\x0b\x32\x1a.google.protobuf.Timestamp\x12/\n\x0bupdate_time\x18\x02 \x01(\x0b\x32\x1a.google.protobuf.Timestamp\x12\x34\n\x0b\x65xpire_time\x18\x03 \x01(\x0b\x32\x1a.google.protobuf.TimestampB\x03\xe0\x41\x03\x42\xe4\x01\n$com.google.cloud.datacatalog.v1beta1P\x01ZKgoogle.golang.org/genproto/googleapis/cloud/datacatalog/v1beta1;datacatalog\xf8\x01\x01\xaa\x02 Google.Cloud.DataCatalog.V1Beta1\xca\x02 Google\\Cloud\\DataCatalog\\V1beta1\xea\x02#Google::Cloud::DataCatalog::V1beta1b\x06proto3', - dependencies=[ - google_dot_api_dot_field__behavior__pb2.DESCRIPTOR, - google_dot_protobuf_dot_timestamp__pb2.DESCRIPTOR, - ], -) - - -_SYSTEMTIMESTAMPS = _descriptor.Descriptor( - 
name="SystemTimestamps", - full_name="google.cloud.datacatalog.v1beta1.SystemTimestamps", - filename=None, - file=DESCRIPTOR, - containing_type=None, - create_key=_descriptor._internal_create_key, - fields=[ - _descriptor.FieldDescriptor( - name="create_time", - full_name="google.cloud.datacatalog.v1beta1.SystemTimestamps.create_time", - index=0, - number=1, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="update_time", - full_name="google.cloud.datacatalog.v1beta1.SystemTimestamps.update_time", - index=1, - number=2, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=None, - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - _descriptor.FieldDescriptor( - name="expire_time", - full_name="google.cloud.datacatalog.v1beta1.SystemTimestamps.expire_time", - index=2, - number=3, - type=11, - cpp_type=10, - label=1, - has_default_value=False, - default_value=None, - message_type=None, - enum_type=None, - containing_type=None, - is_extension=False, - extension_scope=None, - serialized_options=b"\340A\003", - file=DESCRIPTOR, - create_key=_descriptor._internal_create_key, - ), - ], - extensions=[], - nested_types=[], - enum_types=[], - serialized_options=None, - is_extendable=False, - syntax="proto3", - extension_ranges=[], - oneofs=[], - serialized_start=160, - serialized_end=330, -) - -_SYSTEMTIMESTAMPS.fields_by_name[ - "create_time" -].message_type = google_dot_protobuf_dot_timestamp__pb2._TIMESTAMP -_SYSTEMTIMESTAMPS.fields_by_name[ - "update_time" -].message_type = 
google_dot_protobuf_dot_timestamp__pb2._TIMESTAMP -_SYSTEMTIMESTAMPS.fields_by_name[ - "expire_time" -].message_type = google_dot_protobuf_dot_timestamp__pb2._TIMESTAMP -DESCRIPTOR.message_types_by_name["SystemTimestamps"] = _SYSTEMTIMESTAMPS -_sym_db.RegisterFileDescriptor(DESCRIPTOR) - -SystemTimestamps = _reflection.GeneratedProtocolMessageType( - "SystemTimestamps", - (_message.Message,), - { - "DESCRIPTOR": _SYSTEMTIMESTAMPS, - "__module__": "google.cloud.datacatalog_v1beta1.proto.timestamps_pb2", - "__doc__": """Timestamps about this resource according to a particular system. - - Attributes: - create_time: - The creation time of the resource within the given system. - update_time: - The last-modified time of the resource within the given - system. - expire_time: - Output only. The expiration time of the resource within the - given system. Currently only applicable to BigQuery resources. - """, - # @@protoc_insertion_point(class_scope:google.cloud.datacatalog.v1beta1.SystemTimestamps) - }, -) -_sym_db.RegisterMessage(SystemTimestamps) - - -DESCRIPTOR._options = None -_SYSTEMTIMESTAMPS.fields_by_name["expire_time"]._options = None -# @@protoc_insertion_point(module_scope) diff --git a/google/cloud/datacatalog_v1beta1/proto/timestamps_pb2_grpc.py b/google/cloud/datacatalog_v1beta1/proto/timestamps_pb2_grpc.py deleted file mode 100644 index 8a939394..00000000 --- a/google/cloud/datacatalog_v1beta1/proto/timestamps_pb2_grpc.py +++ /dev/null @@ -1,3 +0,0 @@ -# Generated by the gRPC Python protocol compiler plugin. DO NOT EDIT! -"""Client and server classes corresponding to protobuf-defined services.""" -import grpc diff --git a/google/cloud/datacatalog_v1beta1/py.typed b/google/cloud/datacatalog_v1beta1/py.typed new file mode 100644 index 00000000..bb4088a3 --- /dev/null +++ b/google/cloud/datacatalog_v1beta1/py.typed @@ -0,0 +1,2 @@ +# Marker file for PEP 561. +# The google-cloud-datacatalog package uses inline types.
diff --git a/google/cloud/datacatalog_v1beta1/services/__init__.py b/google/cloud/datacatalog_v1beta1/services/__init__.py new file mode 100644 index 00000000..42ffdf2b --- /dev/null +++ b/google/cloud/datacatalog_v1beta1/services/__init__.py @@ -0,0 +1,16 @@ +# -*- coding: utf-8 -*- + +# Copyright 2020 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +# diff --git a/google/cloud/__init__.py b/google/cloud/datacatalog_v1beta1/services/data_catalog/__init__.py similarity index 71% rename from google/cloud/__init__.py rename to google/cloud/datacatalog_v1beta1/services/data_catalog/__init__.py index 9a1b64a6..e56ed8a6 100644 --- a/google/cloud/__init__.py +++ b/google/cloud/datacatalog_v1beta1/services/data_catalog/__init__.py @@ -1,24 +1,24 @@ # -*- coding: utf-8 -*- -# + # Copyright 2020 Google LLC # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # -# https://www.apache.org/licenses/LICENSE-2.0 +# http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. 
+# -try: - import pkg_resources - - pkg_resources.declare_namespace(__name__) -except ImportError: - import pkgutil +from .client import DataCatalogClient +from .async_client import DataCatalogAsyncClient - __path__ = pkgutil.extend_path(__path__, __name__) +__all__ = ( + "DataCatalogClient", + "DataCatalogAsyncClient", +) diff --git a/google/cloud/datacatalog_v1beta1/services/data_catalog/async_client.py b/google/cloud/datacatalog_v1beta1/services/data_catalog/async_client.py new file mode 100644 index 00000000..ee21855f --- /dev/null +++ b/google/cloud/datacatalog_v1beta1/services/data_catalog/async_client.py @@ -0,0 +1,2740 @@ +# -*- coding: utf-8 -*- + +# Copyright 2020 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
+# + +from collections import OrderedDict +import functools +import re +from typing import Dict, Sequence, Tuple, Type, Union +import pkg_resources + +import google.api_core.client_options as ClientOptions # type: ignore +from google.api_core import exceptions # type: ignore +from google.api_core import gapic_v1 # type: ignore +from google.api_core import retry as retries # type: ignore +from google.auth import credentials # type: ignore +from google.oauth2 import service_account # type: ignore + +from google.cloud.datacatalog_v1beta1.services.data_catalog import pagers +from google.cloud.datacatalog_v1beta1.types import common +from google.cloud.datacatalog_v1beta1.types import datacatalog +from google.cloud.datacatalog_v1beta1.types import gcs_fileset_spec +from google.cloud.datacatalog_v1beta1.types import schema +from google.cloud.datacatalog_v1beta1.types import search +from google.cloud.datacatalog_v1beta1.types import table_spec +from google.cloud.datacatalog_v1beta1.types import tags +from google.cloud.datacatalog_v1beta1.types import timestamps +from google.iam.v1 import iam_policy_pb2 as iam_policy # type: ignore +from google.iam.v1 import policy_pb2 as policy # type: ignore +from google.protobuf import field_mask_pb2 as field_mask # type: ignore + +from .transports.base import DataCatalogTransport +from .transports.grpc_asyncio import DataCatalogGrpcAsyncIOTransport +from .client import DataCatalogClient + + +class DataCatalogAsyncClient: + """Data Catalog API service allows clients to discover, + understand, and manage their data. 
+ """ + + _client: DataCatalogClient + + DEFAULT_ENDPOINT = DataCatalogClient.DEFAULT_ENDPOINT + DEFAULT_MTLS_ENDPOINT = DataCatalogClient.DEFAULT_MTLS_ENDPOINT + + tag_template_path = staticmethod(DataCatalogClient.tag_template_path) + + entry_path = staticmethod(DataCatalogClient.entry_path) + + entry_group_path = staticmethod(DataCatalogClient.entry_group_path) + + tag_template_field_path = staticmethod(DataCatalogClient.tag_template_field_path) + + tag_path = staticmethod(DataCatalogClient.tag_path) + + from_service_account_file = DataCatalogClient.from_service_account_file + from_service_account_json = from_service_account_file + + get_transport_class = functools.partial( + type(DataCatalogClient).get_transport_class, type(DataCatalogClient) + ) + + def __init__( + self, + *, + credentials: credentials.Credentials = None, + transport: Union[str, DataCatalogTransport] = "grpc_asyncio", + client_options: ClientOptions = None, + ) -> None: + """Instantiate the data catalog client. + + Args: + credentials (Optional[google.auth.credentials.Credentials]): The + authorization credentials to attach to requests. These + credentials identify the application to the service; if none + are specified, the client will attempt to ascertain the + credentials from the environment. + transport (Union[str, ~.DataCatalogTransport]): The + transport to use. If set to None, a transport is chosen + automatically. + client_options (ClientOptions): Custom options for the client. It + won't take effect if a ``transport`` instance is provided. + (1) The ``api_endpoint`` property can be used to override the + default endpoint provided by the client. GOOGLE_API_USE_MTLS + environment variable can also be used to override the endpoint: + "always" (always use the default mTLS endpoint), "never" (always + use the default regular endpoint, this is the default value for + the environment variable) and "auto" (auto switch to the default + mTLS endpoint if client SSL credentials is present). 
However, + the ``api_endpoint`` property takes precedence if provided. + (2) The ``client_cert_source`` property is used to provide client + SSL credentials for mutual TLS transport. If not provided, the + default SSL credentials will be used if present. + + Raises: + google.auth.exceptions.MutualTlsChannelError: If mutual TLS transport + creation failed for any reason. + """ + + self._client = DataCatalogClient( + credentials=credentials, transport=transport, client_options=client_options, + ) + + async def search_catalog( + self, + request: datacatalog.SearchCatalogRequest = None, + *, + scope: datacatalog.SearchCatalogRequest.Scope = None, + query: str = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> pagers.SearchCatalogAsyncPager: + r"""Searches Data Catalog for multiple resources, such as entries and + tags, that match a query. + + This is a custom method + (https://cloud.google.com/apis/design/custom_methods) and does + not return the complete resource, only the resource identifier + and high-level fields. Clients can subsequently call ``Get`` + methods. + + Note that Data Catalog search queries do not guarantee full + recall. Query results that match your query may not be returned, + even in subsequent result pages. Also note that results returned + (and not returned) can vary across repeated search queries. + + See `Data Catalog Search + Syntax `__ + for more information. + + Args: + request (:class:`~.datacatalog.SearchCatalogRequest`): + The request object. Request message for + [SearchCatalog][google.cloud.datacatalog.v1beta1.DataCatalog.SearchCatalog]. + scope (:class:`~.datacatalog.SearchCatalogRequest.Scope`): + Required. The scope of this search request. A ``scope`` + that has empty ``include_org_ids``, + ``include_project_ids`` AND false + ``include_gcp_public_datasets`` is considered invalid. + Data Catalog will return an error in such a case. 
+ This corresponds to the ``scope`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + query (:class:`str`): + Required. The query string in search query syntax. The + query must be non-empty. + + Query strings can be as simple as "x" or more qualified, such as: + + - name:x + - column:x + - description:y + + Note: Query tokens need to have a minimum of 3 + characters for substring matching to work correctly. See + `Data Catalog Search + Syntax `__ + for more information. + This corresponds to the ``query`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.pagers.SearchCatalogAsyncPager: + Response message for + [SearchCatalog][google.cloud.datacatalog.v1beta1.DataCatalog.SearchCatalog]. + + Iterating over this object will yield results and + resolve additional pages automatically. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + if request is not None and any([scope, query]): + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + request = datacatalog.SearchCatalogRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if scope is not None: + request.scope = scope + if query is not None: + request.query = query + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. 
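Every generated method follows the same calling convention seen here: the caller passes either a fully populated ``request`` object or the flattened field keywords, never both. A minimal, dependency-free sketch of that guard (the ``build_request`` helper and the plain-dict request representation are illustrative, not part of the library):

```python
def build_request(request=None, *, scope=None, query=None):
    """Illustrative guard mirroring the generated client's convention:
    `request` and the flattened field kwargs are mutually exclusive."""
    if request is not None and any([scope, query]):
        raise ValueError(
            "If the `request` argument is set, then none of "
            "the individual field arguments should be set."
        )
    # Stand-in for datacatalog.SearchCatalogRequest(request): a plain dict.
    merged = dict(request or {})
    if scope is not None:
        merged["scope"] = scope
    if query is not None:
        merged["query"] = query
    return merged
```

Either calling style yields an equivalent request, which is why the generated code rejects mixing them rather than guessing at precedence.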
+ rpc = gapic_v1.method_async.wrap_method( + self._client._transport.search_catalog, + default_timeout=None, + client_info=_client_info, + ) + + # Send the request. + response = await rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # This method is paged; wrap the response in a pager, which provides + # an `__aiter__` convenience method. + response = pagers.SearchCatalogAsyncPager( + method=rpc, request=request, response=response, metadata=metadata, + ) + + # Done; return the response. + return response + + async def create_entry_group( + self, + request: datacatalog.CreateEntryGroupRequest = None, + *, + parent: str = None, + entry_group_id: str = None, + entry_group: datacatalog.EntryGroup = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> datacatalog.EntryGroup: + r"""A maximum of 10,000 entry groups may be created per organization + across all locations. + + Users should enable the Data Catalog API in the project + identified by the ``parent`` parameter (see [Data Catalog + Resource Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + Args: + request (:class:`~.datacatalog.CreateEntryGroupRequest`): + The request object. Request message for + [CreateEntryGroup][google.cloud.datacatalog.v1beta1.DataCatalog.CreateEntryGroup]. + parent (:class:`str`): + Required. The name of the project this entry group is + in. Example: + + - projects/{project_id}/locations/{location} + + Note that this EntryGroup and its child resources may + not actually be stored in the location in this name. + This corresponds to the ``parent`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + entry_group_id (:class:`str`): + Required. The id of the entry group + to create. 
The id must begin with a + letter or underscore, contain only + English letters, numbers and + underscores, and be at most 64 + characters. + This corresponds to the ``entry_group_id`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + entry_group (:class:`~.datacatalog.EntryGroup`): + The entry group to create. Defaults + to an empty entry group. + This corresponds to the ``entry_group`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.datacatalog.EntryGroup: + EntryGroup Metadata. An EntryGroup resource represents a + logical grouping of zero or more Data Catalog + [Entry][google.cloud.datacatalog.v1beta1.Entry] + resources. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + if request is not None and any([parent, entry_group_id, entry_group]): + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + request = datacatalog.CreateEntryGroupRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if parent is not None: + request.parent = parent + if entry_group_id is not None: + request.entry_group_id = entry_group_id + if entry_group is not None: + request.entry_group = entry_group + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. 
+ rpc = gapic_v1.method_async.wrap_method( + self._client._transport.create_entry_group, + default_timeout=None, + client_info=_client_info, + ) + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)), + ) + + # Send the request. + response = await rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. + return response + + async def update_entry_group( + self, + request: datacatalog.UpdateEntryGroupRequest = None, + *, + entry_group: datacatalog.EntryGroup = None, + update_mask: field_mask.FieldMask = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> datacatalog.EntryGroup: + r"""Updates an EntryGroup. The user should enable the Data Catalog + API in the project identified by the ``entry_group.name`` + parameter (see [Data Catalog Resource Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + Args: + request (:class:`~.datacatalog.UpdateEntryGroupRequest`): + The request object. Request message for + [UpdateEntryGroup][google.cloud.datacatalog.v1beta1.DataCatalog.UpdateEntryGroup]. + entry_group (:class:`~.datacatalog.EntryGroup`): + Required. The updated entry group. + "name" field must be set. + This corresponds to the ``entry_group`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + update_mask (:class:`~.field_mask.FieldMask`): + The fields to update on the entry + group. If absent or empty, all + modifiable fields are updated. + This corresponds to the ``update_mask`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. 
+ metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.datacatalog.EntryGroup: + EntryGroup Metadata. An EntryGroup resource represents a + logical grouping of zero or more Data Catalog + [Entry][google.cloud.datacatalog.v1beta1.Entry] + resources. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + if request is not None and any([entry_group, update_mask]): + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + request = datacatalog.UpdateEntryGroupRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if entry_group is not None: + request.entry_group = entry_group + if update_mask is not None: + request.update_mask = update_mask + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = gapic_v1.method_async.wrap_method( + self._client._transport.update_entry_group, + default_timeout=None, + client_info=_client_info, + ) + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata( + (("entry_group.name", request.entry_group.name),) + ), + ) + + # Send the request. + response = await rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. + return response + + async def get_entry_group( + self, + request: datacatalog.GetEntryGroupRequest = None, + *, + name: str = None, + read_mask: field_mask.FieldMask = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> datacatalog.EntryGroup: + r"""Gets an EntryGroup. 
+ + Args: + request (:class:`~.datacatalog.GetEntryGroupRequest`): + The request object. Request message for + [GetEntryGroup][google.cloud.datacatalog.v1beta1.DataCatalog.GetEntryGroup]. + name (:class:`str`): + Required. The name of the entry group. For example, + ``projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}``. + This corresponds to the ``name`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + read_mask (:class:`~.field_mask.FieldMask`): + The fields to return. If not set or + empty, all fields are returned. + This corresponds to the ``read_mask`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.datacatalog.EntryGroup: + EntryGroup Metadata. An EntryGroup resource represents a + logical grouping of zero or more Data Catalog + [Entry][google.cloud.datacatalog.v1beta1.Entry] + resources. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + if request is not None and any([name, read_mask]): + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + request = datacatalog.GetEntryGroupRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if name is not None: + request.name = name + if read_mask is not None: + request.read_mask = read_mask + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. 
+ rpc = gapic_v1.method_async.wrap_method( + self._client._transport.get_entry_group, + default_retry=retries.Retry( + initial=0.1, + maximum=60.0, + multiplier=1.3, + predicate=retries.if_exception_type( + exceptions.DeadlineExceeded, exceptions.ServiceUnavailable, + ), + ), + default_timeout=60.0, + client_info=_client_info, + ) + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)), + ) + + # Send the request. + response = await rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. + return response + + async def delete_entry_group( + self, + request: datacatalog.DeleteEntryGroupRequest = None, + *, + name: str = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> None: + r"""Deletes an EntryGroup. Only entry groups that do not contain + entries can be deleted. Users should enable the Data Catalog API + in the project identified by the ``name`` parameter (see [Data + Catalog Resource Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + Args: + request (:class:`~.datacatalog.DeleteEntryGroupRequest`): + The request object. Request message for + [DeleteEntryGroup][google.cloud.datacatalog.v1beta1.DataCatalog.DeleteEntryGroup]. + name (:class:`str`): + Required. The name of the entry group. For example, + ``projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}``. + This corresponds to the ``name`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. 
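The ``default_retry`` configured above describes exponential backoff: a 0.1 s initial delay, multiplied by 1.3 per attempt and capped at 60 s, applied only to ``DeadlineExceeded`` and ``ServiceUnavailable`` errors. A deterministic sketch of that schedule (the real ``google.api_core.retry.Retry`` may additionally randomize each sleep and enforces an overall deadline):

```python
def backoff_schedule(initial=0.1, multiplier=1.3, maximum=60.0, attempts=6):
    """Sketch of the retry delay sequence configured in default_retry:
    each delay grows by `multiplier` and is capped at `maximum`."""
    delays = []
    delay = initial
    for _ in range(attempts):
        delays.append(min(delay, maximum))
        delay *= multiplier
    return delays
```

Methods such as ``get_entry_group`` and ``delete_entry_group`` get this policy by default; passing an explicit ``retry`` argument overrides it per call.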
+ """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + if request is not None and any([name]): + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + request = datacatalog.DeleteEntryGroupRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if name is not None: + request.name = name + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = gapic_v1.method_async.wrap_method( + self._client._transport.delete_entry_group, + default_retry=retries.Retry( + initial=0.1, + maximum=60.0, + multiplier=1.3, + predicate=retries.if_exception_type( + exceptions.DeadlineExceeded, exceptions.ServiceUnavailable, + ), + ), + default_timeout=60.0, + client_info=_client_info, + ) + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)), + ) + + # Send the request. + await rpc( + request, retry=retry, timeout=timeout, metadata=metadata, + ) + + async def list_entry_groups( + self, + request: datacatalog.ListEntryGroupsRequest = None, + *, + parent: str = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> pagers.ListEntryGroupsAsyncPager: + r"""Lists entry groups. + + Args: + request (:class:`~.datacatalog.ListEntryGroupsRequest`): + The request object. Request message for + [ListEntryGroups][google.cloud.datacatalog.v1beta1.DataCatalog.ListEntryGroups]. + parent (:class:`str`): + Required. The name of the location that contains the + entry groups, which can be provided in URL format. 
+ Example: + + - projects/{project_id}/locations/{location} + This corresponds to the ``parent`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.pagers.ListEntryGroupsAsyncPager: + Response message for + [ListEntryGroups][google.cloud.datacatalog.v1beta1.DataCatalog.ListEntryGroups]. + + Iterating over this object will yield results and + resolve additional pages automatically. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + if request is not None and any([parent]): + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + request = datacatalog.ListEntryGroupsRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if parent is not None: + request.parent = parent + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = gapic_v1.method_async.wrap_method( + self._client._transport.list_entry_groups, + default_timeout=None, + client_info=_client_info, + ) + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)), + ) + + # Send the request. + response = await rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # This method is paged; wrap the response in a pager, which provides + # an `__aiter__` convenience method. 
+ response = pagers.ListEntryGroupsAsyncPager( + method=rpc, request=request, response=response, metadata=metadata, + ) + + # Done; return the response. + return response + + async def create_entry( + self, + request: datacatalog.CreateEntryRequest = None, + *, + parent: str = None, + entry_id: str = None, + entry: datacatalog.Entry = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> datacatalog.Entry: + r"""Creates an entry. Only entries of 'FILESET' type or + user-specified type can be created. + + Users should enable the Data Catalog API in the project + identified by the ``parent`` parameter (see [Data Catalog + Resource Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + A maximum of 100,000 entries may be created per entry group. + + Args: + request (:class:`~.datacatalog.CreateEntryRequest`): + The request object. Request message for + [CreateEntry][google.cloud.datacatalog.v1beta1.DataCatalog.CreateEntry]. + parent (:class:`str`): + Required. The name of the entry group this entry is in. + Example: + + - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id} + + Note that this Entry and its child resources may not + actually be stored in the location in this name. + This corresponds to the ``parent`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + entry_id (:class:`str`): + Required. The id of the entry to + create. + This corresponds to the ``entry_id`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + entry (:class:`~.datacatalog.Entry`): + Required. The entry to create. + This corresponds to the ``entry`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. 
+ timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.datacatalog.Entry: + Entry Metadata. A Data Catalog Entry resource represents + another resource in Google Cloud Platform (such as a + BigQuery dataset or a Pub/Sub topic), or outside of + Google Cloud Platform. Clients can use the + ``linked_resource`` field in the Entry resource to refer + to the original resource ID of the source system. + + An Entry resource contains resource details, such as its + schema. An Entry can also be used to attach flexible + metadata, such as a + [Tag][google.cloud.datacatalog.v1beta1.Tag]. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + if request is not None and any([parent, entry_id, entry]): + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + request = datacatalog.CreateEntryRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if parent is not None: + request.parent = parent + if entry_id is not None: + request.entry_id = entry_id + if entry is not None: + request.entry = entry + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = gapic_v1.method_async.wrap_method( + self._client._transport.create_entry, + default_timeout=None, + client_info=_client_info, + ) + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)), + ) + + # Send the request. + response = await rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. 
+ return response + + async def update_entry( + self, + request: datacatalog.UpdateEntryRequest = None, + *, + entry: datacatalog.Entry = None, + update_mask: field_mask.FieldMask = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> datacatalog.Entry: + r"""Updates an existing entry. Users should enable the Data Catalog + API in the project identified by the ``entry.name`` parameter + (see [Data Catalog Resource Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + Args: + request (:class:`~.datacatalog.UpdateEntryRequest`): + The request object. Request message for + [UpdateEntry][google.cloud.datacatalog.v1beta1.DataCatalog.UpdateEntry]. + entry (:class:`~.datacatalog.Entry`): + Required. The updated entry. The + "name" field must be set. + This corresponds to the ``entry`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + update_mask (:class:`~.field_mask.FieldMask`): + The fields to update on the entry. If absent or empty, + all modifiable fields are updated. + + The following fields are modifiable: + + - For entries with type ``DATA_STREAM``: + + - ``schema`` + + - For entries with type ``FILESET``: + + - ``schema`` + - ``display_name`` + - ``description`` + - ``gcs_fileset_spec`` + - ``gcs_fileset_spec.file_patterns`` + + - For entries with ``user_specified_type``: + + - ``schema`` + - ``display_name`` + - ``description`` + - ``user_specified_type`` + - ``user_specified_system`` + - ``linked_resource`` + - ``source_system_timestamps`` + This corresponds to the ``update_mask`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. 
+ metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.datacatalog.Entry: + Entry Metadata. A Data Catalog Entry resource represents + another resource in Google Cloud Platform (such as a + BigQuery dataset or a Pub/Sub topic), or outside of + Google Cloud Platform. Clients can use the + ``linked_resource`` field in the Entry resource to refer + to the original resource ID of the source system. + + An Entry resource contains resource details, such as its + schema. An Entry can also be used to attach flexible + metadata, such as a + [Tag][google.cloud.datacatalog.v1beta1.Tag]. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + if request is not None and any([entry, update_mask]): + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + request = datacatalog.UpdateEntryRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if entry is not None: + request.entry = entry + if update_mask is not None: + request.update_mask = update_mask + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = gapic_v1.method_async.wrap_method( + self._client._transport.update_entry, + default_timeout=None, + client_info=_client_info, + ) + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata( + (("entry.name", request.entry.name),) + ), + ) + + # Send the request. + response = await rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. 
+ return response + + async def delete_entry( + self, + request: datacatalog.DeleteEntryRequest = None, + *, + name: str = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> None: + r"""Deletes an existing entry. Only entries created through + [CreateEntry][google.cloud.datacatalog.v1beta1.DataCatalog.CreateEntry] + method can be deleted. Users should enable the Data Catalog API + in the project identified by the ``name`` parameter (see [Data + Catalog Resource Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + Args: + request (:class:`~.datacatalog.DeleteEntryRequest`): + The request object. Request message for + [DeleteEntry][google.cloud.datacatalog.v1beta1.DataCatalog.DeleteEntry]. + name (:class:`str`): + Required. The name of the entry. Example: + + - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id} + This corresponds to the ``name`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + if request is not None and any([name]): + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + request = datacatalog.DeleteEntryRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. 
+ + if name is not None: + request.name = name + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = gapic_v1.method_async.wrap_method( + self._client._transport.delete_entry, + default_retry=retries.Retry( + initial=0.1, + maximum=60.0, + multiplier=1.3, + predicate=retries.if_exception_type( + exceptions.DeadlineExceeded, exceptions.ServiceUnavailable, + ), + ), + default_timeout=60.0, + client_info=_client_info, + ) + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)), + ) + + # Send the request. + await rpc( + request, retry=retry, timeout=timeout, metadata=metadata, + ) + + async def get_entry( + self, + request: datacatalog.GetEntryRequest = None, + *, + name: str = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> datacatalog.Entry: + r"""Gets an entry. + + Args: + request (:class:`~.datacatalog.GetEntryRequest`): + The request object. Request message for + [GetEntry][google.cloud.datacatalog.v1beta1.DataCatalog.GetEntry]. + name (:class:`str`): + Required. The name of the entry. Example: + + - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id} + This corresponds to the ``name`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.datacatalog.Entry: + Entry Metadata. A Data Catalog Entry resource represents + another resource in Google Cloud Platform (such as a + BigQuery dataset or a Pub/Sub topic), or outside of + Google Cloud Platform. 
Clients can use the + ``linked_resource`` field in the Entry resource to refer + to the original resource ID of the source system. + + An Entry resource contains resource details, such as its + schema. An Entry can also be used to attach flexible + metadata, such as a + [Tag][google.cloud.datacatalog.v1beta1.Tag]. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + if request is not None and any([name]): + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + request = datacatalog.GetEntryRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if name is not None: + request.name = name + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = gapic_v1.method_async.wrap_method( + self._client._transport.get_entry, + default_retry=retries.Retry( + initial=0.1, + maximum=60.0, + multiplier=1.3, + predicate=retries.if_exception_type( + exceptions.DeadlineExceeded, exceptions.ServiceUnavailable, + ), + ), + default_timeout=60.0, + client_info=_client_info, + ) + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)), + ) + + # Send the request. + response = await rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. + return response + + async def lookup_entry( + self, + request: datacatalog.LookupEntryRequest = None, + *, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> datacatalog.Entry: + r"""Get an entry by target resource name. 
This method + allows clients to use the resource name from the source + Google Cloud Platform service to get the Data Catalog + Entry. + + Args: + request (:class:`~.datacatalog.LookupEntryRequest`): + The request object. Request message for + [LookupEntry][google.cloud.datacatalog.v1beta1.DataCatalog.LookupEntry]. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.datacatalog.Entry: + Entry Metadata. A Data Catalog Entry resource represents + another resource in Google Cloud Platform (such as a + BigQuery dataset or a Pub/Sub topic), or outside of + Google Cloud Platform. Clients can use the + ``linked_resource`` field in the Entry resource to refer + to the original resource ID of the source system. + + An Entry resource contains resource details, such as its + schema. An Entry can also be used to attach flexible + metadata, such as a + [Tag][google.cloud.datacatalog.v1beta1.Tag]. + + """ + # Create or coerce a protobuf request object. + + request = datacatalog.LookupEntryRequest(request) + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = gapic_v1.method_async.wrap_method( + self._client._transport.lookup_entry, + default_retry=retries.Retry( + initial=0.1, + maximum=60.0, + multiplier=1.3, + predicate=retries.if_exception_type( + exceptions.DeadlineExceeded, exceptions.ServiceUnavailable, + ), + ), + default_timeout=60.0, + client_info=_client_info, + ) + + # Send the request. + response = await rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. 
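As a usage sketch for the async `lookup_entry` method above (hedged: the `DataCatalogAsyncClient` import path, credential setup, and resource names are assumptions for illustration, not taken from this patch; the request accepts exactly one of `linked_resource` or `sql_resource`):

```python
def lookup_request_kwargs(linked_resource=None, sql_resource=None) -> dict:
    """Build kwargs for a LookupEntryRequest; exactly one target must be set."""
    if (linked_resource is None) == (sql_resource is None):
        raise ValueError("Provide exactly one of linked_resource or sql_resource.")
    if linked_resource is not None:
        return {"linked_resource": linked_resource}
    return {"sql_resource": sql_resource}


async def lookup(linked_resource: str):
    # Deferred import; assumes google-cloud-datacatalog is installed and
    # application-default credentials are configured.
    from google.cloud import datacatalog_v1beta1

    client = datacatalog_v1beta1.DataCatalogAsyncClient()
    request = datacatalog_v1beta1.types.LookupEntryRequest(
        **lookup_request_kwargs(linked_resource=linked_resource)
    )
    entry = await client.lookup_entry(request=request)
    return entry.name
```

The helper mirrors the method's own sanity check: mixing a full `request` object with individual field arguments raises `ValueError`.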
+ return response + + async def list_entries( + self, + request: datacatalog.ListEntriesRequest = None, + *, + parent: str = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> pagers.ListEntriesAsyncPager: + r"""Lists entries. + + Args: + request (:class:`~.datacatalog.ListEntriesRequest`): + The request object. Request message for + [ListEntries][google.cloud.datacatalog.v1beta1.DataCatalog.ListEntries]. + parent (:class:`str`): + Required. The name of the entry group that contains the + entries, which can be provided in URL format. Example: + + - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id} + This corresponds to the ``parent`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.pagers.ListEntriesAsyncPager: + Response message for + [ListEntries][google.cloud.datacatalog.v1beta1.DataCatalog.ListEntries]. + + Iterating over this object will yield results and + resolve additional pages automatically. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + if request is not None and any([parent]): + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + request = datacatalog.ListEntriesRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if parent is not None: + request.parent = parent + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. 
+ rpc = gapic_v1.method_async.wrap_method( + self._client._transport.list_entries, + default_timeout=None, + client_info=_client_info, + ) + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)), + ) + + # Send the request. + response = await rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # This method is paged; wrap the response in a pager, which provides + # an `__aiter__` convenience method. + response = pagers.ListEntriesAsyncPager( + method=rpc, request=request, response=response, metadata=metadata, + ) + + # Done; return the response. + return response + + async def create_tag_template( + self, + request: datacatalog.CreateTagTemplateRequest = None, + *, + parent: str = None, + tag_template_id: str = None, + tag_template: tags.TagTemplate = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> tags.TagTemplate: + r"""Creates a tag template. The user should enable the Data Catalog + API in the project identified by the ``parent`` parameter (see + `Data Catalog Resource + Project `__ + for more information). + + Args: + request (:class:`~.datacatalog.CreateTagTemplateRequest`): + The request object. Request message for + [CreateTagTemplate][google.cloud.datacatalog.v1beta1.DataCatalog.CreateTagTemplate]. + parent (:class:`str`): + Required. The name of the project and the template + location + [region](https://cloud.google.com/data-catalog/docs/concepts/regions). + + Example: + + - projects/{project_id}/locations/us-central1 + This corresponds to the ``parent`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + tag_template_id (:class:`str`): + Required. The ID of the tag template + to create. 
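To illustrate the paged response handling that `list_entries` wraps in a `ListEntriesAsyncPager`, here is a minimal sketch (assumptions: the `DataCatalogAsyncClient` import path and the project/entry-group names are illustrative, not taken from this patch):

```python
def entry_group_path(project_id: str, location: str, entry_group_id: str) -> str:
    """Format the ``parent`` resource name that list_entries expects."""
    return f"projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}"


async def list_entry_names(parent: str) -> list:
    # Deferred import so the path helper stays usable without the package.
    from google.cloud import datacatalog_v1beta1

    client = datacatalog_v1beta1.DataCatalogAsyncClient()
    pager = await client.list_entries(parent=parent)
    names = []
    # The async pager yields entries and resolves further pages transparently.
    async for entry in pager:
        names.append(entry.name)
    return names
```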
+ This corresponds to the ``tag_template_id`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + tag_template (:class:`~.tags.TagTemplate`): + Required. The tag template to create. + This corresponds to the ``tag_template`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.tags.TagTemplate: + A tag template defines a tag, which can have one or more + typed fields. The template is used to create and attach + the tag to GCP resources. `Tag template + roles `__ + provide permissions to create, edit, and use the + template. See, for example, the `TagTemplate + User `__ + role, which includes permission to use the tag template + to tag resources. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + if request is not None and any([parent, tag_template_id, tag_template]): + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + request = datacatalog.CreateTagTemplateRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if parent is not None: + request.parent = parent + if tag_template_id is not None: + request.tag_template_id = tag_template_id + if tag_template is not None: + request.tag_template = tag_template + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. 
+ rpc = gapic_v1.method_async.wrap_method( + self._client._transport.create_tag_template, + default_timeout=None, + client_info=_client_info, + ) + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)), + ) + + # Send the request. + response = await rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. + return response + + async def get_tag_template( + self, + request: datacatalog.GetTagTemplateRequest = None, + *, + name: str = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> tags.TagTemplate: + r"""Gets a tag template. + + Args: + request (:class:`~.datacatalog.GetTagTemplateRequest`): + The request object. Request message for + [GetTagTemplate][google.cloud.datacatalog.v1beta1.DataCatalog.GetTagTemplate]. + name (:class:`str`): + Required. The name of the tag template. Example: + + - projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id} + This corresponds to the ``name`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.tags.TagTemplate: + A tag template defines a tag, which can have one or more + typed fields. The template is used to create and attach + the tag to GCP resources. `Tag template + roles `__ + provide permissions to create, edit, and use the + template. See, for example, the `TagTemplate + User `__ + role, which includes permission to use the tag template + to tag resources. + + """ + # Create or coerce a protobuf request object. 
+ # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + if request is not None and any([name]): + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + request = datacatalog.GetTagTemplateRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if name is not None: + request.name = name + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = gapic_v1.method_async.wrap_method( + self._client._transport.get_tag_template, + default_retry=retries.Retry( + initial=0.1, + maximum=60.0, + multiplier=1.3, + predicate=retries.if_exception_type( + exceptions.DeadlineExceeded, exceptions.ServiceUnavailable, + ), + ), + default_timeout=60.0, + client_info=_client_info, + ) + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)), + ) + + # Send the request. + response = await rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. + return response + + async def update_tag_template( + self, + request: datacatalog.UpdateTagTemplateRequest = None, + *, + tag_template: tags.TagTemplate = None, + update_mask: field_mask.FieldMask = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> tags.TagTemplate: + r"""Updates a tag template. This method cannot be used to update the + fields of a template. The tag template fields are represented as + separate resources and should be updated using their own + create/update/delete methods. 
Users should enable the Data + Catalog API in the project identified by the + ``tag_template.name`` parameter (see [Data Catalog Resource + Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + Args: + request (:class:`~.datacatalog.UpdateTagTemplateRequest`): + The request object. Request message for + [UpdateTagTemplate][google.cloud.datacatalog.v1beta1.DataCatalog.UpdateTagTemplate]. + tag_template (:class:`~.tags.TagTemplate`): + Required. The template to update. The + "name" field must be set. + This corresponds to the ``tag_template`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + update_mask (:class:`~.field_mask.FieldMask`): + The field mask specifies the parts of the template to + overwrite. + + Allowed fields: + + - ``display_name`` + + If absent or empty, all of the allowed fields above will + be updated. + This corresponds to the ``update_mask`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.tags.TagTemplate: + A tag template defines a tag, which can have one or more + typed fields. The template is used to create and attach + the tag to GCP resources. `Tag template + roles `__ + provide permissions to create, edit, and use the + template. See, for example, the `TagTemplate + User `__ + role, which includes permission to use the tag template + to tag resources. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. 
+ if request is not None and any([tag_template, update_mask]): + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + request = datacatalog.UpdateTagTemplateRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if tag_template is not None: + request.tag_template = tag_template + if update_mask is not None: + request.update_mask = update_mask + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = gapic_v1.method_async.wrap_method( + self._client._transport.update_tag_template, + default_timeout=None, + client_info=_client_info, + ) + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata( + (("tag_template.name", request.tag_template.name),) + ), + ) + + # Send the request. + response = await rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. + return response + + async def delete_tag_template( + self, + request: datacatalog.DeleteTagTemplateRequest = None, + *, + name: str = None, + force: bool = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> None: + r"""Deletes a tag template and all tags using the template. Users + should enable the Data Catalog API in the project identified by + the ``name`` parameter (see [Data Catalog Resource Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + Args: + request (:class:`~.datacatalog.DeleteTagTemplateRequest`): + The request object. Request message for + [DeleteTagTemplate][google.cloud.datacatalog.v1beta1.DataCatalog.DeleteTagTemplate]. + name (:class:`str`): + Required. The name of the tag template to delete. 
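The `update_tag_template` flow above can be sketched as follows (hedged: the client class, the `types` access path, and the resource names are assumptions; the allow-list below simply encodes the docstring's statement that only ``display_name`` may appear in the mask):

```python
_ALLOWED_TEMPLATE_MASK_PATHS = {"display_name"}


def template_update_mask_paths(paths):
    """Validate paths against the docstring's allow-list for update_tag_template."""
    unsupported = sorted(set(paths) - _ALLOWED_TEMPLATE_MASK_PATHS)
    if unsupported:
        raise ValueError(f"update_mask paths not allowed: {unsupported}")
    return list(paths)


async def set_template_display_name(name: str, display_name: str):
    # Deferred imports; assumes google-cloud-datacatalog and protobuf are installed.
    from google.cloud import datacatalog_v1beta1
    from google.protobuf import field_mask_pb2

    client = datacatalog_v1beta1.DataCatalogAsyncClient()
    # The "name" field must be set on the template being updated.
    template = datacatalog_v1beta1.types.TagTemplate(
        name=name, display_name=display_name
    )
    mask = field_mask_pb2.FieldMask(
        paths=template_update_mask_paths(["display_name"])
    )
    return await client.update_tag_template(tag_template=template, update_mask=mask)
```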
+ Example: + + - projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id} + This corresponds to the ``name`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + force (:class:`bool`): + Required. Currently, this field must always be set to + ``true``. This confirms the deletion of any possible + tags using this template. ``force = false`` will be + supported in the future. + This corresponds to the ``force`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + if request is not None and any([name, force]): + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + request = datacatalog.DeleteTagTemplateRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if name is not None: + request.name = name + if force is not None: + request.force = force + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = gapic_v1.method_async.wrap_method( + self._client._transport.delete_tag_template, + default_retry=retries.Retry( + initial=0.1, + maximum=60.0, + multiplier=1.3, + predicate=retries.if_exception_type( + exceptions.DeadlineExceeded, exceptions.ServiceUnavailable, + ), + ), + default_timeout=60.0, + client_info=_client_info, + ) + + # Certain fields should be provided within the metadata header; + # add these here. 
+ metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)), + ) + + # Send the request. + await rpc( + request, retry=retry, timeout=timeout, metadata=metadata, + ) + + async def create_tag_template_field( + self, + request: datacatalog.CreateTagTemplateFieldRequest = None, + *, + parent: str = None, + tag_template_field_id: str = None, + tag_template_field: tags.TagTemplateField = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> tags.TagTemplateField: + r"""Creates a field in a tag template. The user should enable the + Data Catalog API in the project identified by the ``parent`` + parameter (see `Data Catalog Resource + Project `__ + for more information). + + Args: + request (:class:`~.datacatalog.CreateTagTemplateFieldRequest`): + The request object. Request message for + [CreateTagTemplateField][google.cloud.datacatalog.v1beta1.DataCatalog.CreateTagTemplateField]. + parent (:class:`str`): + Required. The name of the project and the template + location + `region `__. + + Example: + + - projects/{project_id}/locations/us-central1/tagTemplates/{tag_template_id} + This corresponds to the ``parent`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + tag_template_field_id (:class:`str`): + Required. The ID of the tag template field to create. + Field IDs can contain letters (both uppercase and + lowercase), numbers (0-9), underscores (_) and dashes + (-). Field IDs must be at least 1 character long and at + most 128 characters long. Field IDs must also be unique + within their template. + This corresponds to the ``tag_template_field_id`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + tag_template_field (:class:`~.tags.TagTemplateField`): + Required. The tag template field to + create. 
+ This corresponds to the ``tag_template_field`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.tags.TagTemplateField: + The template for an individual field + within a tag template. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + if request is not None and any( + [parent, tag_template_field_id, tag_template_field] + ): + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + request = datacatalog.CreateTagTemplateFieldRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if parent is not None: + request.parent = parent + if tag_template_field_id is not None: + request.tag_template_field_id = tag_template_field_id + if tag_template_field is not None: + request.tag_template_field = tag_template_field + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = gapic_v1.method_async.wrap_method( + self._client._transport.create_tag_template_field, + default_timeout=None, + client_info=_client_info, + ) + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)), + ) + + # Send the request. + response = await rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. 
+ return response + + async def update_tag_template_field( + self, + request: datacatalog.UpdateTagTemplateFieldRequest = None, + *, + name: str = None, + tag_template_field: tags.TagTemplateField = None, + update_mask: field_mask.FieldMask = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> tags.TagTemplateField: + r"""Updates a field in a tag template. This method cannot be used to + update the field type. Users should enable the Data Catalog API + in the project identified by the ``name`` parameter (see [Data + Catalog Resource Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + Args: + request (:class:`~.datacatalog.UpdateTagTemplateFieldRequest`): + The request object. Request message for + [UpdateTagTemplateField][google.cloud.datacatalog.v1beta1.DataCatalog.UpdateTagTemplateField]. + name (:class:`str`): + Required. The name of the tag template field. Example: + + - projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id}/fields/{tag_template_field_id} + This corresponds to the ``name`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + tag_template_field (:class:`~.tags.TagTemplateField`): + Required. The template to update. + This corresponds to the ``tag_template_field`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + update_mask (:class:`~.field_mask.FieldMask`): + Optional. The field mask specifies the parts of the + template to be updated. Allowed fields: + + - ``display_name`` + - ``type.enum_type`` + - ``is_required`` + + If ``update_mask`` is not set or empty, all of the + allowed fields above will be updated. + + When updating an enum type, the provided values will be + merged with the existing values. Therefore, enum values + can only be added, existing enum values cannot be + deleted nor renamed. 
Updating a template field from + optional to required is NOT allowed. + This corresponds to the ``update_mask`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.tags.TagTemplateField: + The template for an individual field + within a tag template. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + if request is not None and any([name, tag_template_field, update_mask]): + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + request = datacatalog.UpdateTagTemplateFieldRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if name is not None: + request.name = name + if tag_template_field is not None: + request.tag_template_field = tag_template_field + if update_mask is not None: + request.update_mask = update_mask + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = gapic_v1.method_async.wrap_method( + self._client._transport.update_tag_template_field, + default_timeout=None, + client_info=_client_info, + ) + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)), + ) + + # Send the request. + response = await rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. 
+ return response + + async def rename_tag_template_field( + self, + request: datacatalog.RenameTagTemplateFieldRequest = None, + *, + name: str = None, + new_tag_template_field_id: str = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> tags.TagTemplateField: + r"""Renames a field in a tag template. The user should enable the + Data Catalog API in the project identified by the ``name`` + parameter (see `Data Catalog Resource + Project `__ + for more information). + + Args: + request (:class:`~.datacatalog.RenameTagTemplateFieldRequest`): + The request object. Request message for + [RenameTagTemplateField][google.cloud.datacatalog.v1beta1.DataCatalog.RenameTagTemplateField]. + name (:class:`str`): + Required. The name of the tag template. Example: + + - projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id}/fields/{tag_template_field_id} + This corresponds to the ``name`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + new_tag_template_field_id (:class:`str`): + Required. The new ID of this tag template field. For + example, ``my_new_field``. + This corresponds to the ``new_tag_template_field_id`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.tags.TagTemplateField: + The template for an individual field + within a tag template. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. 
+ if request is not None and any([name, new_tag_template_field_id]): + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + request = datacatalog.RenameTagTemplateFieldRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if name is not None: + request.name = name + if new_tag_template_field_id is not None: + request.new_tag_template_field_id = new_tag_template_field_id + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = gapic_v1.method_async.wrap_method( + self._client._transport.rename_tag_template_field, + default_timeout=None, + client_info=_client_info, + ) + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)), + ) + + # Send the request. + response = await rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. + return response + + async def delete_tag_template_field( + self, + request: datacatalog.DeleteTagTemplateFieldRequest = None, + *, + name: str = None, + force: bool = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> None: + r"""Deletes a field in a tag template and all uses of that field. + Users should enable the Data Catalog API in the project + identified by the ``name`` parameter (see [Data Catalog Resource + Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + Args: + request (:class:`~.datacatalog.DeleteTagTemplateFieldRequest`): + The request object. Request message for + [DeleteTagTemplateField][google.cloud.datacatalog.v1beta1.DataCatalog.DeleteTagTemplateField]. + name (:class:`str`): + Required. The name of the tag template field to delete. 
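The field-ID constraints documented for tag template fields (letters, digits, underscores, dashes; 1-128 characters) can be checked client-side before calling `rename_tag_template_field`. A hedged sketch (the validator and the client import path are illustrative assumptions):

```python
import re

_FIELD_ID_RE = re.compile(r"^[A-Za-z0-9_-]{1,128}$")


def is_valid_field_id(field_id: str) -> bool:
    """Check a tag template field ID against the documented constraints:
    letters, digits, underscores, and dashes; 1 to 128 characters."""
    return bool(_FIELD_ID_RE.match(field_id))


async def rename_field(name: str, new_field_id: str):
    # Deferred import; assumes google-cloud-datacatalog is installed.
    from google.cloud import datacatalog_v1beta1

    if not is_valid_field_id(new_field_id):
        raise ValueError(f"invalid field id: {new_field_id!r}")
    client = datacatalog_v1beta1.DataCatalogAsyncClient()
    return await client.rename_tag_template_field(
        name=name, new_tag_template_field_id=new_field_id
    )
```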
+ Example: + + - projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id}/fields/{tag_template_field_id} + This corresponds to the ``name`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + force (:class:`bool`): + Required. Currently, this field must always be set to + ``true``. This confirms the deletion of this field from + any tags using this field. ``force = false`` will be + supported in the future. + This corresponds to the ``force`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + if request is not None and any([name, force]): + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + request = datacatalog.DeleteTagTemplateFieldRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if name is not None: + request.name = name + if force is not None: + request.force = force + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = gapic_v1.method_async.wrap_method( + self._client._transport.delete_tag_template_field, + default_retry=retries.Retry( + initial=0.1, + maximum=60.0, + multiplier=1.3, + predicate=retries.if_exception_type( + exceptions.DeadlineExceeded, exceptions.ServiceUnavailable, + ), + ), + default_timeout=60.0, + client_info=_client_info, + ) + + # Certain fields should be provided within the metadata header; + # add these here. 
+ metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)), + ) + + # Send the request. + await rpc( + request, retry=retry, timeout=timeout, metadata=metadata, + ) + + async def create_tag( + self, + request: datacatalog.CreateTagRequest = None, + *, + parent: str = None, + tag: tags.Tag = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> tags.Tag: + r"""Creates a tag on an + [Entry][google.cloud.datacatalog.v1beta1.Entry]. Note: The + project identified by the ``parent`` parameter for the + `tag `__ + and the `tag + template `__ + used to create the tag must be from the same organization. + + Args: + request (:class:`~.datacatalog.CreateTagRequest`): + The request object. Request message for + [CreateTag][google.cloud.datacatalog.v1beta1.DataCatalog.CreateTag]. + parent (:class:`str`): + Required. The name of the resource to attach this tag + to. Tags can be attached to Entries. Example: + + - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id} + + Note that this Tag and its child resources may not + actually be stored in the location in this name. + This corresponds to the ``parent`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + tag (:class:`~.tags.Tag`): + Required. The tag to create. + This corresponds to the ``tag`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.tags.Tag: + Tags are used to attach custom metadata to Data Catalog + resources. Tags conform to the specifications within + their tag template. 
+ + See `Data Catalog + IAM `__ + for information on the permissions needed to create or + view tags. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + if request is not None and any([parent, tag]): + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + request = datacatalog.CreateTagRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if parent is not None: + request.parent = parent + if tag is not None: + request.tag = tag + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = gapic_v1.method_async.wrap_method( + self._client._transport.create_tag, + default_timeout=None, + client_info=_client_info, + ) + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)), + ) + + # Send the request. + response = await rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. + return response + + async def update_tag( + self, + request: datacatalog.UpdateTagRequest = None, + *, + tag: tags.Tag = None, + update_mask: field_mask.FieldMask = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> tags.Tag: + r"""Updates an existing tag. + + Args: + request (:class:`~.datacatalog.UpdateTagRequest`): + The request object. Request message for + [UpdateTag][google.cloud.datacatalog.v1beta1.DataCatalog.UpdateTag]. + tag (:class:`~.tags.Tag`): + Required. The updated tag. The "name" + field must be set. 
+ This corresponds to the ``tag`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + update_mask (:class:`~.field_mask.FieldMask`): + The fields to update on the Tag. If absent or empty, all + modifiable fields are updated. Currently the only + modifiable field is the field ``fields``. + This corresponds to the ``update_mask`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.tags.Tag: + Tags are used to attach custom metadata to Data Catalog + resources. Tags conform to the specifications within + their tag template. + + See `Data Catalog + IAM `__ + for information on the permissions needed to create or + view tags. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + if request is not None and any([tag, update_mask]): + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + request = datacatalog.UpdateTagRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if tag is not None: + request.tag = tag + if update_mask is not None: + request.update_mask = update_mask + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = gapic_v1.method_async.wrap_method( + self._client._transport.update_tag, + default_timeout=None, + client_info=_client_info, + ) + + # Certain fields should be provided within the metadata header; + # add these here. 
+ metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("tag.name", request.tag.name),)), + ) + + # Send the request. + response = await rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. + return response + + async def delete_tag( + self, + request: datacatalog.DeleteTagRequest = None, + *, + name: str = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> None: + r"""Deletes a tag. + + Args: + request (:class:`~.datacatalog.DeleteTagRequest`): + The request object. Request message for + [DeleteTag][google.cloud.datacatalog.v1beta1.DataCatalog.DeleteTag]. + name (:class:`str`): + Required. The name of the tag to delete. Example: + + - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id}/tags/{tag_id} + This corresponds to the ``name`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + if request is not None and any([name]): + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + request = datacatalog.DeleteTagRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if name is not None: + request.name = name + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. 
+ rpc = gapic_v1.method_async.wrap_method( + self._client._transport.delete_tag, + default_retry=retries.Retry( + initial=0.1, + maximum=60.0, + multiplier=1.3, + predicate=retries.if_exception_type( + exceptions.DeadlineExceeded, exceptions.ServiceUnavailable, + ), + ), + default_timeout=60.0, + client_info=_client_info, + ) + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)), + ) + + # Send the request. + await rpc( + request, retry=retry, timeout=timeout, metadata=metadata, + ) + + async def list_tags( + self, + request: datacatalog.ListTagsRequest = None, + *, + parent: str = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> pagers.ListTagsAsyncPager: + r"""Lists the tags on an + [Entry][google.cloud.datacatalog.v1beta1.Entry]. + + Args: + request (:class:`~.datacatalog.ListTagsRequest`): + The request object. Request message for + [ListTags][google.cloud.datacatalog.v1beta1.DataCatalog.ListTags]. + parent (:class:`str`): + Required. The name of the Data Catalog resource to list + the tags of. The resource could be an + [Entry][google.cloud.datacatalog.v1beta1.Entry] or an + [EntryGroup][google.cloud.datacatalog.v1beta1.EntryGroup]. + + Examples: + + - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id} + - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id} + This corresponds to the ``parent`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. 
+ + Returns: + ~.pagers.ListTagsAsyncPager: + Response message for + [ListTags][google.cloud.datacatalog.v1beta1.DataCatalog.ListTags]. + + Iterating over this object will yield results and + resolve additional pages automatically. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + if request is not None and any([parent]): + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + request = datacatalog.ListTagsRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if parent is not None: + request.parent = parent + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = gapic_v1.method_async.wrap_method( + self._client._transport.list_tags, + default_retry=retries.Retry( + initial=0.1, + maximum=60.0, + multiplier=1.3, + predicate=retries.if_exception_type( + exceptions.DeadlineExceeded, exceptions.ServiceUnavailable, + ), + ), + default_timeout=60.0, + client_info=_client_info, + ) + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)), + ) + + # Send the request. + response = await rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # This method is paged; wrap the response in a pager, which provides + # an `__aiter__` convenience method. + response = pagers.ListTagsAsyncPager( + method=rpc, request=request, response=response, metadata=metadata, + ) + + # Done; return the response. 
+ return response + + async def set_iam_policy( + self, + request: iam_policy.SetIamPolicyRequest = None, + *, + resource: str = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> policy.Policy: + r"""Sets the access control policy for a resource. Replaces any + existing policy. Supported resources are: + + - Tag templates. + - Entries. + - Entry groups. Note, this method cannot be used to manage + policies for BigQuery, Pub/Sub and any external Google Cloud + Platform resources synced to Data Catalog. + + Callers must have following Google IAM permission + + - ``datacatalog.tagTemplates.setIamPolicy`` to set policies on + tag templates. + - ``datacatalog.entries.setIamPolicy`` to set policies on + entries. + - ``datacatalog.entryGroups.setIamPolicy`` to set policies on + entry groups. + + Args: + request (:class:`~.iam_policy.SetIamPolicyRequest`): + The request object. Request message for `SetIamPolicy` + method. + resource (:class:`str`): + REQUIRED: The resource for which the + policy is being specified. See the + operation documentation for the + appropriate value for this field. + This corresponds to the ``resource`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.policy.Policy: + Defines an Identity and Access Management (IAM) policy. + It is used to specify access control policies for Cloud + Platform resources. + + A ``Policy`` is a collection of ``bindings``. A + ``binding`` binds one or more ``members`` to a single + ``role``. Members can be user accounts, service + accounts, Google groups, and domains (such as G Suite). 
+ A ``role`` is a named list of permissions (defined by + IAM or configured by users). A ``binding`` can + optionally specify a ``condition``, which is a logic + expression that further constrains the role binding + based on attributes about the request and/or target + resource. + + **JSON Example** + + :: + + { + "bindings": [ + { + "role": "roles/resourcemanager.organizationAdmin", + "members": [ + "user:mike@example.com", + "group:admins@example.com", + "domain:google.com", + "serviceAccount:my-project-id@appspot.gserviceaccount.com" + ] + }, + { + "role": "roles/resourcemanager.organizationViewer", + "members": ["user:eve@example.com"], + "condition": { + "title": "expirable access", + "description": "Does not grant access after Sep 2020", + "expression": "request.time < + timestamp('2020-10-01T00:00:00.000Z')", + } + } + ] + } + + **YAML Example** + + :: + + bindings: + - members: + - user:mike@example.com + - group:admins@example.com + - domain:google.com + - serviceAccount:my-project-id@appspot.gserviceaccount.com + role: roles/resourcemanager.organizationAdmin + - members: + - user:eve@example.com + role: roles/resourcemanager.organizationViewer + condition: + title: expirable access + description: Does not grant access after Sep 2020 + expression: request.time < timestamp('2020-10-01T00:00:00.000Z') + + For a description of IAM and its features, see the `IAM + developer's + guide `__. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + if request is not None and any([resource]): + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + # The request isn't a proto-plus wrapped type, + # so it must be constructed via keyword expansion. 
+ if isinstance(request, dict): + request = iam_policy.SetIamPolicyRequest(**request) + + elif not request: + request = iam_policy.SetIamPolicyRequest() + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if resource is not None: + request.resource = resource + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = gapic_v1.method_async.wrap_method( + self._client._transport.set_iam_policy, + default_timeout=None, + client_info=_client_info, + ) + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("resource", request.resource),)), + ) + + # Send the request. + response = await rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. + return response + + async def get_iam_policy( + self, + request: iam_policy.GetIamPolicyRequest = None, + *, + resource: str = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> policy.Policy: + r"""Gets the access control policy for a resource. A ``NOT_FOUND`` + error is returned if the resource does not exist. An empty + policy is returned if the resource exists but does not have a + policy set on it. + + Supported resources are: + + - Tag templates. + - Entries. + - Entry groups. Note, this method cannot be used to manage + policies for BigQuery, Pub/Sub and any external Google Cloud + Platform resources synced to Data Catalog. + + Callers must have following Google IAM permission + + - ``datacatalog.tagTemplates.getIamPolicy`` to get policies on + tag templates. + - ``datacatalog.entries.getIamPolicy`` to get policies on + entries. + - ``datacatalog.entryGroups.getIamPolicy`` to get policies on + entry groups. + + Args: + request (:class:`~.iam_policy.GetIamPolicyRequest`): + The request object. 
Request message for `GetIamPolicy` + method. + resource (:class:`str`): + REQUIRED: The resource for which the + policy is being requested. See the + operation documentation for the + appropriate value for this field. + This corresponds to the ``resource`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.policy.Policy: + Defines an Identity and Access Management (IAM) policy. + It is used to specify access control policies for Cloud + Platform resources. + + A ``Policy`` is a collection of ``bindings``. A + ``binding`` binds one or more ``members`` to a single + ``role``. Members can be user accounts, service + accounts, Google groups, and domains (such as G Suite). + A ``role`` is a named list of permissions (defined by + IAM or configured by users). A ``binding`` can + optionally specify a ``condition``, which is a logic + expression that further constrains the role binding + based on attributes about the request and/or target + resource. 
+ + **JSON Example** + + :: + + { + "bindings": [ + { + "role": "roles/resourcemanager.organizationAdmin", + "members": [ + "user:mike@example.com", + "group:admins@example.com", + "domain:google.com", + "serviceAccount:my-project-id@appspot.gserviceaccount.com" + ] + }, + { + "role": "roles/resourcemanager.organizationViewer", + "members": ["user:eve@example.com"], + "condition": { + "title": "expirable access", + "description": "Does not grant access after Sep 2020", + "expression": "request.time < + timestamp('2020-10-01T00:00:00.000Z')", + } + } + ] + } + + **YAML Example** + + :: + + bindings: + - members: + - user:mike@example.com + - group:admins@example.com + - domain:google.com + - serviceAccount:my-project-id@appspot.gserviceaccount.com + role: roles/resourcemanager.organizationAdmin + - members: + - user:eve@example.com + role: roles/resourcemanager.organizationViewer + condition: + title: expirable access + description: Does not grant access after Sep 2020 + expression: request.time < timestamp('2020-10-01T00:00:00.000Z') + + For a description of IAM and its features, see the `IAM + developer's + guide `__. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + if request is not None and any([resource]): + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + # The request isn't a proto-plus wrapped type, + # so it must be constructed via keyword expansion. + if isinstance(request, dict): + request = iam_policy.GetIamPolicyRequest(**request) + + elif not request: + request = iam_policy.GetIamPolicyRequest() + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if resource is not None: + request.resource = resource + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. 
+ rpc = gapic_v1.method_async.wrap_method( + self._client._transport.get_iam_policy, + default_timeout=None, + client_info=_client_info, + ) + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("resource", request.resource),)), + ) + + # Send the request. + response = await rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. + return response + + async def test_iam_permissions( + self, + request: iam_policy.TestIamPermissionsRequest = None, + *, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> iam_policy.TestIamPermissionsResponse: + r"""Returns the caller's permissions on a resource. If the resource + does not exist, an empty set of permissions is returned (We + don't return a ``NOT_FOUND`` error). + + Supported resources are: + + - Tag templates. + - Entries. + - Entry groups. Note, this method cannot be used to manage + policies for BigQuery, Pub/Sub and any external Google Cloud + Platform resources synced to Data Catalog. + + A caller is not required to have Google IAM permission to make + this request. + + Args: + request (:class:`~.iam_policy.TestIamPermissionsRequest`): + The request object. Request message for + `TestIamPermissions` method. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.iam_policy.TestIamPermissionsResponse: + Response message for ``TestIamPermissions`` method. + """ + # Create or coerce a protobuf request object. + + # The request isn't a proto-plus wrapped type, + # so it must be constructed via keyword expansion. 
+ if isinstance(request, dict): + request = iam_policy.TestIamPermissionsRequest(**request) + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = gapic_v1.method_async.wrap_method( + self._client._transport.test_iam_permissions, + default_timeout=None, + client_info=_client_info, + ) + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("resource", request.resource),)), + ) + + # Send the request. + response = await rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. + return response + + +try: + _client_info = gapic_v1.client_info.ClientInfo( + gapic_version=pkg_resources.get_distribution( + "google-cloud-datacatalog", + ).version, + ) +except pkg_resources.DistributionNotFound: + _client_info = gapic_v1.client_info.ClientInfo() + + +__all__ = ("DataCatalogAsyncClient",) diff --git a/google/cloud/datacatalog_v1beta1/services/data_catalog/client.py b/google/cloud/datacatalog_v1beta1/services/data_catalog/client.py new file mode 100644 index 00000000..08b4f5b7 --- /dev/null +++ b/google/cloud/datacatalog_v1beta1/services/data_catalog/client.py @@ -0,0 +1,2905 @@ +# -*- coding: utf-8 -*- + +# Copyright 2020 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
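Every generated method in this diff applies the same guard: a `request` object and flattened field arguments are mutually exclusive, and flattened values are copied onto a fresh request. A simplified, dependency-free sketch of that pattern, with plain dicts standing in for the proto-plus request types (`apply_flattened` is a hypothetical helper for illustration, not part of the library):

```python
def apply_flattened(request, **fields):
    """Merge flattened field arguments into a request, mirroring the
    generated clients' sanity check."""
    provided = {k: v for k, v in fields.items() if v is not None}
    # A request object and flattened arguments are mutually exclusive.
    if request is not None and provided:
        raise ValueError(
            "If the `request` argument is set, then none of "
            "the individual field arguments should be set."
        )
    merged = dict(request or {})
    merged.update(provided)
    return merged


# Flattened form: fields are copied onto a fresh request.
req = apply_flattened(None, name="projects/p/locations/l", force=True)

# Mixing both forms raises, just like the generated methods.
try:
    apply_flattened({"name": "x"}, name="y")
    conflict = False
except ValueError:
    conflict = True
```

The real clients then wrap the transport method with retry/timeout defaults and attach routing-header metadata derived from the resource name, as seen throughout the diff.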
+# + +from collections import OrderedDict +import os +import re +from typing import Callable, Dict, Sequence, Tuple, Type, Union +import pkg_resources + +import google.api_core.client_options as ClientOptions # type: ignore +from google.api_core import exceptions # type: ignore +from google.api_core import gapic_v1 # type: ignore +from google.api_core import retry as retries # type: ignore +from google.auth import credentials # type: ignore +from google.auth.transport import mtls # type: ignore +from google.auth.exceptions import MutualTLSChannelError # type: ignore +from google.oauth2 import service_account # type: ignore + +from google.cloud.datacatalog_v1beta1.services.data_catalog import pagers +from google.cloud.datacatalog_v1beta1.types import common +from google.cloud.datacatalog_v1beta1.types import datacatalog +from google.cloud.datacatalog_v1beta1.types import gcs_fileset_spec +from google.cloud.datacatalog_v1beta1.types import schema +from google.cloud.datacatalog_v1beta1.types import search +from google.cloud.datacatalog_v1beta1.types import table_spec +from google.cloud.datacatalog_v1beta1.types import tags +from google.cloud.datacatalog_v1beta1.types import timestamps +from google.iam.v1 import iam_policy_pb2 as iam_policy # type: ignore +from google.iam.v1 import policy_pb2 as policy # type: ignore +from google.protobuf import field_mask_pb2 as field_mask # type: ignore + +from .transports.base import DataCatalogTransport +from .transports.grpc import DataCatalogGrpcTransport +from .transports.grpc_asyncio import DataCatalogGrpcAsyncIOTransport + + +class DataCatalogClientMeta(type): + """Metaclass for the DataCatalog client. + + This provides class-level methods for building and retrieving + support objects (e.g. transport) without polluting the client instance + objects. 
+    """
+
+    _transport_registry = OrderedDict()  # type: Dict[str, Type[DataCatalogTransport]]
+    _transport_registry["grpc"] = DataCatalogGrpcTransport
+    _transport_registry["grpc_asyncio"] = DataCatalogGrpcAsyncIOTransport
+
+    def get_transport_class(cls, label: str = None,) -> Type[DataCatalogTransport]:
+        """Return an appropriate transport class.
+
+        Args:
+            label: The name of the desired transport. If none is
+                provided, then the first transport in the registry is used.
+
+        Returns:
+            The transport class to use.
+        """
+        # If a specific transport is requested, return that one.
+        if label:
+            return cls._transport_registry[label]
+
+        # No transport is requested; return the default (that is, the first one
+        # in the dictionary).
+        return next(iter(cls._transport_registry.values()))
+
+
+class DataCatalogClient(metaclass=DataCatalogClientMeta):
+    """Data Catalog API service allows clients to discover,
+    understand, and manage their data.
+    """
+
+    @staticmethod
+    def _get_default_mtls_endpoint(api_endpoint):
+        """Convert api endpoint to mTLS endpoint.
+        Convert "*.sandbox.googleapis.com" and "*.googleapis.com" to
+        "*.mtls.sandbox.googleapis.com" and "*.mtls.googleapis.com" respectively.
+        Args:
+            api_endpoint (Optional[str]): the api endpoint to convert.
+        Returns:
+            str: converted mTLS api endpoint.
+        """
+        if not api_endpoint:
+            return api_endpoint
+
+        mtls_endpoint_re = re.compile(
+            r"(?P<name>[^.]+)(?P<mtls>\.mtls)?(?P<sandbox>\.sandbox)?(?P<googledomain>\.googleapis\.com)?"
+        )
+
+        m = mtls_endpoint_re.match(api_endpoint)
+        name, mtls, sandbox, googledomain = m.groups()
+        if mtls or not googledomain:
+            return api_endpoint
+
+        if sandbox:
+            return api_endpoint.replace(
+                "sandbox.googleapis.com", "mtls.sandbox.googleapis.com"
+            )
+
+        return api_endpoint.replace(".googleapis.com", ".mtls.googleapis.com")
+
+    DEFAULT_ENDPOINT = "datacatalog.googleapis.com"
+    DEFAULT_MTLS_ENDPOINT = _get_default_mtls_endpoint.__func__(  # type: ignore
+        DEFAULT_ENDPOINT
+    )
+
+    @classmethod
+    def from_service_account_file(cls, filename: str, *args, **kwargs):
+        """Creates an instance of this client using the provided credentials
+        file.
+
+        Args:
+            filename (str): The path to the service account private key json
+                file.
+            args: Additional arguments to pass to the constructor.
+            kwargs: Additional arguments to pass to the constructor.
+
+        Returns:
+            {@api.name}: The constructed client.
+        """
+        credentials = service_account.Credentials.from_service_account_file(filename)
+        kwargs["credentials"] = credentials
+        return cls(*args, **kwargs)
+
+    from_service_account_json = from_service_account_file
+
+    @staticmethod
+    def entry_path(project: str, location: str, entry_group: str, entry: str,) -> str:
+        """Return a fully-qualified entry string."""
+        return "projects/{project}/locations/{location}/entryGroups/{entry_group}/entries/{entry}".format(
+            project=project, location=location, entry_group=entry_group, entry=entry,
+        )
+
+    @staticmethod
+    def parse_entry_path(path: str) -> Dict[str, str]:
+        """Parse an entry path into its component segments."""
+        m = re.match(
+            r"^projects/(?P<project>.+?)/locations/(?P<location>.+?)/entryGroups/(?P<entry_group>.+?)/entries/(?P<entry>.+?)$",
+            path,
+        )
+        return m.groupdict() if m else {}
+
+    @staticmethod
+    def entry_group_path(project: str, location: str, entry_group: str,) -> str:
+        """Return a fully-qualified entry_group string."""
+        return "projects/{project}/locations/{location}/entryGroups/{entry_group}".format(
+            project=project, location=location,
entry_group=entry_group,
+        )
+
+    @staticmethod
+    def parse_entry_group_path(path: str) -> Dict[str, str]:
+        """Parse an entry_group path into its component segments."""
+        m = re.match(
+            r"^projects/(?P<project>.+?)/locations/(?P<location>.+?)/entryGroups/(?P<entry_group>.+?)$",
+            path,
+        )
+        return m.groupdict() if m else {}
+
+    @staticmethod
+    def tag_path(
+        project: str, location: str, entry_group: str, entry: str, tag: str,
+    ) -> str:
+        """Return a fully-qualified tag string."""
+        return "projects/{project}/locations/{location}/entryGroups/{entry_group}/entries/{entry}/tags/{tag}".format(
+            project=project,
+            location=location,
+            entry_group=entry_group,
+            entry=entry,
+            tag=tag,
+        )
+
+    @staticmethod
+    def parse_tag_path(path: str) -> Dict[str, str]:
+        """Parse a tag path into its component segments."""
+        m = re.match(
+            r"^projects/(?P<project>.+?)/locations/(?P<location>.+?)/entryGroups/(?P<entry_group>.+?)/entries/(?P<entry>.+?)/tags/(?P<tag>.+?)$",
+            path,
+        )
+        return m.groupdict() if m else {}
+
+    @staticmethod
+    def tag_template_path(project: str, location: str, tag_template: str,) -> str:
+        """Return a fully-qualified tag_template string."""
+        return "projects/{project}/locations/{location}/tagTemplates/{tag_template}".format(
+            project=project, location=location, tag_template=tag_template,
+        )
+
+    @staticmethod
+    def parse_tag_template_path(path: str) -> Dict[str, str]:
+        """Parse a tag_template path into its component segments."""
+        m = re.match(
+            r"^projects/(?P<project>.+?)/locations/(?P<location>.+?)/tagTemplates/(?P<tag_template>.+?)$",
+            path,
+        )
+        return m.groupdict() if m else {}
+
+    @staticmethod
+    def tag_template_field_path(
+        project: str, location: str, tag_template: str, field: str,
+    ) -> str:
+        """Return a fully-qualified tag_template_field string."""
+        return "projects/{project}/locations/{location}/tagTemplates/{tag_template}/fields/{field}".format(
+            project=project, location=location, tag_template=tag_template, field=field,
+        )
+
+    @staticmethod
+    def parse_tag_template_field_path(path: str) -> Dict[str, str]:
+        """Parse a
tag_template_field path into its component segments."""
+        m = re.match(
+            r"^projects/(?P<project>.+?)/locations/(?P<location>.+?)/tagTemplates/(?P<tag_template>.+?)/fields/(?P<field>.+?)$",
+            path,
+        )
+        return m.groupdict() if m else {}
+
+    def __init__(
+        self,
+        *,
+        credentials: credentials.Credentials = None,
+        transport: Union[str, DataCatalogTransport] = None,
+        client_options: ClientOptions = None,
+    ) -> None:
+        """Instantiate the data catalog client.
+
+        Args:
+            credentials (Optional[google.auth.credentials.Credentials]): The
+                authorization credentials to attach to requests. These
+                credentials identify the application to the service; if none
+                are specified, the client will attempt to ascertain the
+                credentials from the environment.
+            transport (Union[str, ~.DataCatalogTransport]): The
+                transport to use. If set to None, a transport is chosen
+                automatically.
+            client_options (ClientOptions): Custom options for the client. It
+                won't take effect if a ``transport`` instance is provided.
+                (1) The ``api_endpoint`` property can be used to override the
+                default endpoint provided by the client. GOOGLE_API_USE_MTLS
+                environment variable can also be used to override the endpoint:
+                "always" (always use the default mTLS endpoint), "never" (always
+                use the default regular endpoint, this is the default value for
+                the environment variable) and "auto" (auto switch to the default
+                mTLS endpoint if client SSL credentials is present). However,
+                the ``api_endpoint`` property takes precedence if provided.
+                (2) The ``client_cert_source`` property is used to provide client
+                SSL credentials for mutual TLS transport. If not provided, the
+                default SSL credentials will be used if present.
+
+        Raises:
+            google.auth.exceptions.MutualTLSChannelError: If mutual TLS transport
+                creation failed for any reason.
+ """ + if isinstance(client_options, dict): + client_options = ClientOptions.from_dict(client_options) + if client_options is None: + client_options = ClientOptions.ClientOptions() + + if client_options.api_endpoint is None: + use_mtls_env = os.getenv("GOOGLE_API_USE_MTLS", "never") + if use_mtls_env == "never": + client_options.api_endpoint = self.DEFAULT_ENDPOINT + elif use_mtls_env == "always": + client_options.api_endpoint = self.DEFAULT_MTLS_ENDPOINT + elif use_mtls_env == "auto": + has_client_cert_source = ( + client_options.client_cert_source is not None + or mtls.has_default_client_cert_source() + ) + client_options.api_endpoint = ( + self.DEFAULT_MTLS_ENDPOINT + if has_client_cert_source + else self.DEFAULT_ENDPOINT + ) + else: + raise MutualTLSChannelError( + "Unsupported GOOGLE_API_USE_MTLS value. Accepted values: never, auto, always" + ) + + # Save or instantiate the transport. + # Ordinarily, we provide the transport, but allowing a custom transport + # instance provides an extensibility point for unusual situations. + if isinstance(transport, DataCatalogTransport): + # transport is a DataCatalogTransport instance. + if credentials or client_options.credentials_file: + raise ValueError( + "When providing a transport instance, " + "provide its credentials directly." + ) + if client_options.scopes: + raise ValueError( + "When providing a transport instance, " + "provide its scopes directly." 
+ ) + self._transport = transport + else: + Transport = type(self).get_transport_class(transport) + self._transport = Transport( + credentials=credentials, + credentials_file=client_options.credentials_file, + host=client_options.api_endpoint, + scopes=client_options.scopes, + api_mtls_endpoint=client_options.api_endpoint, + client_cert_source=client_options.client_cert_source, + quota_project_id=client_options.quota_project_id, + ) + + def search_catalog( + self, + request: datacatalog.SearchCatalogRequest = None, + *, + scope: datacatalog.SearchCatalogRequest.Scope = None, + query: str = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> pagers.SearchCatalogPager: + r"""Searches Data Catalog for multiple resources like entries, tags + that match a query. + + This is a custom method + (https://cloud.google.com/apis/design/custom_methods) and does + not return the complete resource, only the resource identifier + and high level fields. Clients can subsequently call ``Get`` + methods. + + Note that Data Catalog search queries do not guarantee full + recall. Query results that match your query may not be returned, + even in subsequent result pages. Also note that results returned + (and not returned) can vary across repeated search queries. + + See `Data Catalog Search + Syntax <https://cloud.google.com/data-catalog/docs/how-to/search-reference>`__ + for more information. + + Args: + request (:class:`~.datacatalog.SearchCatalogRequest`): + The request object. Request message for + [SearchCatalog][google.cloud.datacatalog.v1beta1.DataCatalog.SearchCatalog]. + scope (:class:`~.datacatalog.SearchCatalogRequest.Scope`): + Required. The scope of this search request. A ``scope`` + that has empty ``include_org_ids``, + ``include_project_ids`` AND false + ``include_gcp_public_datasets`` is considered invalid. + Data Catalog will return an error in such a case.
+ This corresponds to the ``scope`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + query (:class:`str`): + Required. The query string in search query syntax. The + query must be non-empty. + + Query strings can be as simple as "x" or more qualified as: + + - name:x + - column:x + - description:y + + Note: Query tokens need to have a minimum of 3 + characters for substring matching to work correctly. See + `Data Catalog Search + Syntax <https://cloud.google.com/data-catalog/docs/how-to/search-reference>`__ + for more information. + This corresponds to the ``query`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.pagers.SearchCatalogPager: + Response message for + [SearchCatalog][google.cloud.datacatalog.v1beta1.DataCatalog.SearchCatalog]. + + Iterating over this object will yield results and + resolve additional pages automatically. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + has_flattened_params = any([scope, query]) + if request is not None and has_flattened_params: + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + # Minor optimization to avoid making a copy if the user passes + # in a datacatalog.SearchCatalogRequest. + # There's no risk of modifying the input as we've already verified + # there are no flattened fields. + if not isinstance(request, datacatalog.SearchCatalogRequest): + request = datacatalog.SearchCatalogRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these.
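The `GOOGLE_API_USE_MTLS` endpoint selection performed in `__init__` above reduces to three cases. A minimal standalone sketch of that rule, in plain Python with placeholder endpoint constants (not the generated client's actual helpers):

```python
import os

# Placeholder constants standing in for the client's DEFAULT_ENDPOINT /
# DEFAULT_MTLS_ENDPOINT class attributes.
DEFAULT_ENDPOINT = "datacatalog.googleapis.com"
DEFAULT_MTLS_ENDPOINT = "datacatalog.mtls.googleapis.com"


def resolve_endpoint(has_client_cert_source: bool, env=os.environ) -> str:
    """Mirror the __init__ rules: "never" -> regular endpoint,
    "always" -> mTLS endpoint, "auto" -> mTLS only when a client
    certificate source is available; anything else is rejected."""
    use_mtls = env.get("GOOGLE_API_USE_MTLS", "never")
    if use_mtls == "never":
        return DEFAULT_ENDPOINT
    if use_mtls == "always":
        return DEFAULT_MTLS_ENDPOINT
    if use_mtls == "auto":
        return DEFAULT_MTLS_ENDPOINT if has_client_cert_source else DEFAULT_ENDPOINT
    raise ValueError(
        "Unsupported GOOGLE_API_USE_MTLS value. Accepted values: never, auto, always"
    )
```

Note that an explicit `client_options.api_endpoint` bypasses this logic entirely, since the generated code only consults the environment variable when no endpoint was supplied.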
+ + if scope is not None: + request.scope = scope + if query is not None: + request.query = query + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = self._transport._wrapped_methods[self._transport.search_catalog] + + # Send the request. + response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # This method is paged; wrap the response in a pager, which provides + # an `__iter__` convenience method. + response = pagers.SearchCatalogPager( + method=rpc, request=request, response=response, metadata=metadata, + ) + + # Done; return the response. + return response + + def create_entry_group( + self, + request: datacatalog.CreateEntryGroupRequest = None, + *, + parent: str = None, + entry_group_id: str = None, + entry_group: datacatalog.EntryGroup = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> datacatalog.EntryGroup: + r"""A maximum of 10,000 entry groups may be created per organization + across all locations. + + Users should enable the Data Catalog API in the project + identified by the ``parent`` parameter (see [Data Catalog + Resource Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + Args: + request (:class:`~.datacatalog.CreateEntryGroupRequest`): + The request object. Request message for + [CreateEntryGroup][google.cloud.datacatalog.v1beta1.DataCatalog.CreateEntryGroup]. + parent (:class:`str`): + Required. The name of the project this entry group is + in. Example: + + - projects/{project_id}/locations/{location} + + Note that this EntryGroup and its child resources may + not actually be stored in the location in this name. + This corresponds to the ``parent`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + entry_group_id (:class:`str`): + Required. The id of the entry group + to create. 
The id must begin with a + letter or underscore, contain only + English letters, numbers and + underscores, and be at most 64 + characters. + This corresponds to the ``entry_group_id`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + entry_group (:class:`~.datacatalog.EntryGroup`): + The entry group to create. Defaults + to an empty entry group. + This corresponds to the ``entry_group`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.datacatalog.EntryGroup: + EntryGroup Metadata. An EntryGroup resource represents a + logical grouping of zero or more Data Catalog + [Entry][google.cloud.datacatalog.v1beta1.Entry] + resources. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + has_flattened_params = any([parent, entry_group_id, entry_group]) + if request is not None and has_flattened_params: + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + # Minor optimization to avoid making a copy if the user passes + # in a datacatalog.CreateEntryGroupRequest. + # There's no risk of modifying the input as we've already verified + # there are no flattened fields. + if not isinstance(request, datacatalog.CreateEntryGroupRequest): + request = datacatalog.CreateEntryGroupRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. 
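Every method in this client repeats the same guard: a `request` object and flattened keyword fields are mutually exclusive, and flattened values are copied onto the request after coercion. A condensed sketch of that pattern as one hypothetical helper (`coerce_request` is illustrative, not part of the generated code):

```python
def coerce_request(request, request_cls, **flattened):
    """Reject mixing a request object with flattened fields, then build
    or copy the request and apply any non-None flattened values."""
    has_flattened_params = any(v is not None for v in flattened.values())
    if request is not None and has_flattened_params:
        raise ValueError(
            "If the `request` argument is set, then none of "
            "the individual field arguments should be set."
        )
    # Avoid copying when the caller already passed the right type.
    if not isinstance(request, request_cls):
        request = request_cls(**(request or {}))
    for field, value in flattened.items():
        if value is not None:
            setattr(request, field, value)
    return request
```

In the generated methods this logic is inlined per RPC, with `request_cls` fixed to the matching protobuf message (e.g. `datacatalog.CreateEntryGroupRequest`).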
+ + if parent is not None: + request.parent = parent + if entry_group_id is not None: + request.entry_group_id = entry_group_id + if entry_group is not None: + request.entry_group = entry_group + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = self._transport._wrapped_methods[self._transport.create_entry_group] + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)), + ) + + # Send the request. + response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. + return response + + def update_entry_group( + self, + request: datacatalog.UpdateEntryGroupRequest = None, + *, + entry_group: datacatalog.EntryGroup = None, + update_mask: field_mask.FieldMask = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> datacatalog.EntryGroup: + r"""Updates an EntryGroup. The user should enable the Data Catalog + API in the project identified by the ``entry_group.name`` + parameter (see [Data Catalog Resource Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + Args: + request (:class:`~.datacatalog.UpdateEntryGroupRequest`): + The request object. Request message for + [UpdateEntryGroup][google.cloud.datacatalog.v1beta1.DataCatalog.UpdateEntryGroup]. + entry_group (:class:`~.datacatalog.EntryGroup`): + Required. The updated entry group. + "name" field must be set. + This corresponds to the ``entry_group`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + update_mask (:class:`~.field_mask.FieldMask`): + The fields to update on the entry + group. If absent or empty, all + modifiable fields are updated. 
+ This corresponds to the ``update_mask`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.datacatalog.EntryGroup: + EntryGroup Metadata. An EntryGroup resource represents a + logical grouping of zero or more Data Catalog + [Entry][google.cloud.datacatalog.v1beta1.Entry] + resources. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + has_flattened_params = any([entry_group, update_mask]) + if request is not None and has_flattened_params: + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + # Minor optimization to avoid making a copy if the user passes + # in a datacatalog.UpdateEntryGroupRequest. + # There's no risk of modifying the input as we've already verified + # there are no flattened fields. + if not isinstance(request, datacatalog.UpdateEntryGroupRequest): + request = datacatalog.UpdateEntryGroupRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if entry_group is not None: + request.entry_group = entry_group + if update_mask is not None: + request.update_mask = update_mask + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = self._transport._wrapped_methods[self._transport.update_entry_group] + + # Certain fields should be provided within the metadata header; + # add these here. 
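The "metadata header" step flagged by the comment above produces a single extra gRPC metadata entry that tells the backend how to route the request. A stdlib model of the header shape (the real client builds it with `gapic_v1.routing_header.to_grpc_metadata`; the key and URL-encoding shown here are my reading of that helper):

```python
from urllib.parse import urlencode

def routing_metadata(params):
    # Models the ("x-goog-request-params", "<url-encoded key=value pairs>")
    # entry appended to the user-supplied metadata before sending the RPC.
    return ("x-goog-request-params", urlencode(params))

header = routing_metadata(
    [("entry_group.name", "projects/p/locations/us/entryGroups/g")]
)
```

The field path on the left (`parent`, `name`, `entry_group.name`, ...) matches the request field each generated method extracts, which is why the tuple differs per RPC.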
+ metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata( + (("entry_group.name", request.entry_group.name),) + ), + ) + + # Send the request. + response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. + return response + + def get_entry_group( + self, + request: datacatalog.GetEntryGroupRequest = None, + *, + name: str = None, + read_mask: field_mask.FieldMask = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> datacatalog.EntryGroup: + r"""Gets an EntryGroup. + + Args: + request (:class:`~.datacatalog.GetEntryGroupRequest`): + The request object. Request message for + [GetEntryGroup][google.cloud.datacatalog.v1beta1.DataCatalog.GetEntryGroup]. + name (:class:`str`): + Required. The name of the entry group. For example, + ``projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}``. + This corresponds to the ``name`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + read_mask (:class:`~.field_mask.FieldMask`): + The fields to return. If not set or + empty, all fields are returned. + This corresponds to the ``read_mask`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.datacatalog.EntryGroup: + EntryGroup Metadata. An EntryGroup resource represents a + logical grouping of zero or more Data Catalog + [Entry][google.cloud.datacatalog.v1beta1.Entry] + resources. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. 
+ has_flattened_params = any([name, read_mask]) + if request is not None and has_flattened_params: + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + # Minor optimization to avoid making a copy if the user passes + # in a datacatalog.GetEntryGroupRequest. + # There's no risk of modifying the input as we've already verified + # there are no flattened fields. + if not isinstance(request, datacatalog.GetEntryGroupRequest): + request = datacatalog.GetEntryGroupRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if name is not None: + request.name = name + if read_mask is not None: + request.read_mask = read_mask + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = self._transport._wrapped_methods[self._transport.get_entry_group] + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)), + ) + + # Send the request. + response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. + return response + + def delete_entry_group( + self, + request: datacatalog.DeleteEntryGroupRequest = None, + *, + name: str = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> None: + r"""Deletes an EntryGroup. Only entry groups that do not contain + entries can be deleted. Users should enable the Data Catalog API + in the project identified by the ``name`` parameter (see [Data + Catalog Resource Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + Args: + request (:class:`~.datacatalog.DeleteEntryGroupRequest`): + The request object. 
Request message for + [DeleteEntryGroup][google.cloud.datacatalog.v1beta1.DataCatalog.DeleteEntryGroup]. + name (:class:`str`): + Required. The name of the entry group. For example, + ``projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}``. + This corresponds to the ``name`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + has_flattened_params = any([name]) + if request is not None and has_flattened_params: + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + # Minor optimization to avoid making a copy if the user passes + # in a datacatalog.DeleteEntryGroupRequest. + # There's no risk of modifying the input as we've already verified + # there are no flattened fields. + if not isinstance(request, datacatalog.DeleteEntryGroupRequest): + request = datacatalog.DeleteEntryGroupRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if name is not None: + request.name = name + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = self._transport._wrapped_methods[self._transport.delete_entry_group] + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)), + ) + + # Send the request. 
+ rpc( + request, retry=retry, timeout=timeout, metadata=metadata, + ) + + def list_entry_groups( + self, + request: datacatalog.ListEntryGroupsRequest = None, + *, + parent: str = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> pagers.ListEntryGroupsPager: + r"""Lists entry groups. + + Args: + request (:class:`~.datacatalog.ListEntryGroupsRequest`): + The request object. Request message for + [ListEntryGroups][google.cloud.datacatalog.v1beta1.DataCatalog.ListEntryGroups]. + parent (:class:`str`): + Required. The name of the location that contains the + entry groups, which can be provided in URL format. + Example: + + - projects/{project_id}/locations/{location} + This corresponds to the ``parent`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.pagers.ListEntryGroupsPager: + Response message for + [ListEntryGroups][google.cloud.datacatalog.v1beta1.DataCatalog.ListEntryGroups]. + + Iterating over this object will yield results and + resolve additional pages automatically. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + has_flattened_params = any([parent]) + if request is not None and has_flattened_params: + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + # Minor optimization to avoid making a copy if the user passes + # in a datacatalog.ListEntryGroupsRequest. + # There's no risk of modifying the input as we've already verified + # there are no flattened fields. 
+ if not isinstance(request, datacatalog.ListEntryGroupsRequest): + request = datacatalog.ListEntryGroupsRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if parent is not None: + request.parent = parent + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = self._transport._wrapped_methods[self._transport.list_entry_groups] + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)), + ) + + # Send the request. + response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # This method is paged; wrap the response in a pager, which provides + # an `__iter__` convenience method. + response = pagers.ListEntryGroupsPager( + method=rpc, request=request, response=response, metadata=metadata, + ) + + # Done; return the response. + return response + + def create_entry( + self, + request: datacatalog.CreateEntryRequest = None, + *, + parent: str = None, + entry_id: str = None, + entry: datacatalog.Entry = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> datacatalog.Entry: + r"""Creates an entry. Only entries of 'FILESET' type or + user-specified type can be created. + + Users should enable the Data Catalog API in the project + identified by the ``parent`` parameter (see [Data Catalog + Resource Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + A maximum of 100,000 entries may be created per entry group. + + Args: + request (:class:`~.datacatalog.CreateEntryRequest`): + The request object. Request message for + [CreateEntry][google.cloud.datacatalog.v1beta1.DataCatalog.CreateEntry]. + parent (:class:`str`): + Required. The name of the entry group this entry is in. 
+ Example: + + - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id} + + Note that this Entry and its child resources may not + actually be stored in the location in this name. + This corresponds to the ``parent`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + entry_id (:class:`str`): + Required. The id of the entry to + create. + This corresponds to the ``entry_id`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + entry (:class:`~.datacatalog.Entry`): + Required. The entry to create. + This corresponds to the ``entry`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.datacatalog.Entry: + Entry Metadata. A Data Catalog Entry resource represents + another resource in Google Cloud Platform (such as a + BigQuery dataset or a Pub/Sub topic), or outside of + Google Cloud Platform. Clients can use the + ``linked_resource`` field in the Entry resource to refer + to the original resource ID of the source system. + + An Entry resource contains resource details, such as its + schema. An Entry can also be used to attach flexible + metadata, such as a + [Tag][google.cloud.datacatalog.v1beta1.Tag]. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + has_flattened_params = any([parent, entry_id, entry]) + if request is not None and has_flattened_params: + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." 
+ ) + + # Minor optimization to avoid making a copy if the user passes + # in a datacatalog.CreateEntryRequest. + # There's no risk of modifying the input as we've already verified + # there are no flattened fields. + if not isinstance(request, datacatalog.CreateEntryRequest): + request = datacatalog.CreateEntryRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if parent is not None: + request.parent = parent + if entry_id is not None: + request.entry_id = entry_id + if entry is not None: + request.entry = entry + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = self._transport._wrapped_methods[self._transport.create_entry] + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)), + ) + + # Send the request. + response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. + return response + + def update_entry( + self, + request: datacatalog.UpdateEntryRequest = None, + *, + entry: datacatalog.Entry = None, + update_mask: field_mask.FieldMask = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> datacatalog.Entry: + r"""Updates an existing entry. Users should enable the Data Catalog + API in the project identified by the ``entry.name`` parameter + (see [Data Catalog Resource Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + Args: + request (:class:`~.datacatalog.UpdateEntryRequest`): + The request object. Request message for + [UpdateEntry][google.cloud.datacatalog.v1beta1.DataCatalog.UpdateEntry]. + entry (:class:`~.datacatalog.Entry`): + Required. The updated entry. The + "name" field must be set. 
+ This corresponds to the ``entry`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + update_mask (:class:`~.field_mask.FieldMask`): + The fields to update on the entry. If absent or empty, + all modifiable fields are updated. + + The following fields are modifiable: + + - For entries with type ``DATA_STREAM``: + + - ``schema`` + + - For entries with type ``FILESET``: + + - ``schema`` + - ``display_name`` + - ``description`` + - ``gcs_fileset_spec`` + - ``gcs_fileset_spec.file_patterns`` + + - For entries with ``user_specified_type``: + + - ``schema`` + - ``display_name`` + - ``description`` + - ``user_specified_type`` + - ``user_specified_system`` + - ``linked_resource`` + - ``source_system_timestamps`` + This corresponds to the ``update_mask`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.datacatalog.Entry: + Entry Metadata. A Data Catalog Entry resource represents + another resource in Google Cloud Platform (such as a + BigQuery dataset or a Pub/Sub topic), or outside of + Google Cloud Platform. Clients can use the + ``linked_resource`` field in the Entry resource to refer + to the original resource ID of the source system. + + An Entry resource contains resource details, such as its + schema. An Entry can also be used to attach flexible + metadata, such as a + [Tag][google.cloud.datacatalog.v1beta1.Tag]. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request.
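The `update_mask` semantics described above (only listed fields change; an absent or empty mask updates every modifiable field) can be modeled with plain dicts. A simplified sketch under that assumption, not the protobuf `FieldMask` machinery itself:

```python
def apply_update(current: dict, updated: dict, mask_paths=None, modifiable=()):
    """Return a copy of `current` with fields replaced from `updated`.
    An empty or missing mask means "update all modifiable fields";
    paths outside the modifiable set are ignored."""
    paths = list(mask_paths) if mask_paths else list(modifiable)
    result = dict(current)
    for path in paths:
        if path in modifiable and path in updated:
            result[path] = updated[path]
    return result

# Hypothetical FILESET-style entry: only display_name is masked, so the
# name field in `updated` is left untouched.
entry = {"name": "entries/e1", "display_name": "old", "description": "d"}
updated = {"name": "entries/other", "display_name": "new"}
patched = apply_update(
    entry, updated, ["display_name"],
    modifiable=("schema", "display_name", "description"),
)
```

This mirrors why `update_entry` sends `entry.name` in the routing header yet never lets the mask rewrite identity fields: `name` is simply not a modifiable path.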
+ has_flattened_params = any([entry, update_mask]) + if request is not None and has_flattened_params: + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + # Minor optimization to avoid making a copy if the user passes + # in a datacatalog.UpdateEntryRequest. + # There's no risk of modifying the input as we've already verified + # there are no flattened fields. + if not isinstance(request, datacatalog.UpdateEntryRequest): + request = datacatalog.UpdateEntryRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if entry is not None: + request.entry = entry + if update_mask is not None: + request.update_mask = update_mask + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = self._transport._wrapped_methods[self._transport.update_entry] + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata( + (("entry.name", request.entry.name),) + ), + ) + + # Send the request. + response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. + return response + + def delete_entry( + self, + request: datacatalog.DeleteEntryRequest = None, + *, + name: str = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> None: + r"""Deletes an existing entry. Only entries created through + [CreateEntry][google.cloud.datacatalog.v1beta1.DataCatalog.CreateEntry] + method can be deleted. Users should enable the Data Catalog API + in the project identified by the ``name`` parameter (see [Data + Catalog Resource Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). 
+ + Args: + request (:class:`~.datacatalog.DeleteEntryRequest`): + The request object. Request message for + [DeleteEntry][google.cloud.datacatalog.v1beta1.DataCatalog.DeleteEntry]. + name (:class:`str`): + Required. The name of the entry. Example: + + - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id} + This corresponds to the ``name`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + has_flattened_params = any([name]) + if request is not None and has_flattened_params: + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + # Minor optimization to avoid making a copy if the user passes + # in a datacatalog.DeleteEntryRequest. + # There's no risk of modifying the input as we've already verified + # there are no flattened fields. + if not isinstance(request, datacatalog.DeleteEntryRequest): + request = datacatalog.DeleteEntryRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if name is not None: + request.name = name + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = self._transport._wrapped_methods[self._transport.delete_entry] + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)), + ) + + # Send the request. 
+ rpc( + request, retry=retry, timeout=timeout, metadata=metadata, + ) + + def get_entry( + self, + request: datacatalog.GetEntryRequest = None, + *, + name: str = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> datacatalog.Entry: + r"""Gets an entry. + + Args: + request (:class:`~.datacatalog.GetEntryRequest`): + The request object. Request message for + [GetEntry][google.cloud.datacatalog.v1beta1.DataCatalog.GetEntry]. + name (:class:`str`): + Required. The name of the entry. Example: + + - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id} + This corresponds to the ``name`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.datacatalog.Entry: + Entry Metadata. A Data Catalog Entry resource represents + another resource in Google Cloud Platform (such as a + BigQuery dataset or a Pub/Sub topic), or outside of + Google Cloud Platform. Clients can use the + ``linked_resource`` field in the Entry resource to refer + to the original resource ID of the source system. + + An Entry resource contains resource details, such as its + schema. An Entry can also be used to attach flexible + metadata, such as a + [Tag][google.cloud.datacatalog.v1beta1.Tag]. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + has_flattened_params = any([name]) + if request is not None and has_flattened_params: + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." 
+ ) + + # Minor optimization to avoid making a copy if the user passes + # in a datacatalog.GetEntryRequest. + # There's no risk of modifying the input as we've already verified + # there are no flattened fields. + if not isinstance(request, datacatalog.GetEntryRequest): + request = datacatalog.GetEntryRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if name is not None: + request.name = name + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = self._transport._wrapped_methods[self._transport.get_entry] + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)), + ) + + # Send the request. + response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. + return response + + def lookup_entry( + self, + request: datacatalog.LookupEntryRequest = None, + *, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> datacatalog.Entry: + r"""Get an entry by target resource name. This method + allows clients to use the resource name from the source + Google Cloud Platform service to get the Data Catalog + Entry. + + Args: + request (:class:`~.datacatalog.LookupEntryRequest`): + The request object. Request message for + [LookupEntry][google.cloud.datacatalog.v1beta1.DataCatalog.LookupEntry]. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.datacatalog.Entry: + Entry Metadata. 
A Data Catalog Entry resource represents + another resource in Google Cloud Platform (such as a + BigQuery dataset or a Pub/Sub topic), or outside of + Google Cloud Platform. Clients can use the + ``linked_resource`` field in the Entry resource to refer + to the original resource ID of the source system. + + An Entry resource contains resource details, such as its + schema. An Entry can also be used to attach flexible + metadata, such as a + [Tag][google.cloud.datacatalog.v1beta1.Tag]. + + """ + # Create or coerce a protobuf request object. + + # Minor optimization to avoid making a copy if the user passes + # in a datacatalog.LookupEntryRequest. + # There's no risk of modifying the input as we've already verified + # there are no flattened fields. + if not isinstance(request, datacatalog.LookupEntryRequest): + request = datacatalog.LookupEntryRequest(request) + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = self._transport._wrapped_methods[self._transport.lookup_entry] + + # Send the request. + response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. + return response + + def list_entries( + self, + request: datacatalog.ListEntriesRequest = None, + *, + parent: str = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> pagers.ListEntriesPager: + r"""Lists entries. + + Args: + request (:class:`~.datacatalog.ListEntriesRequest`): + The request object. Request message for + [ListEntries][google.cloud.datacatalog.v1beta1.DataCatalog.ListEntries]. + parent (:class:`str`): + Required. The name of the entry group that contains the + entries, which can be provided in URL format. 
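The `gapic_v1.routing_header.to_grpc_metadata` call used throughout turns resource-name fields into an `x-goog-request-params` header so the backend can route the request. A rough standalone approximation, assuming percent-encoded `key=value` pairs (the exact encoding is the library's concern; this is only a sketch):

```python
from urllib.parse import quote

def to_grpc_metadata(params):
    # Approximates the gapic routing header: percent-encode each value
    # and join the pairs under the x-goog-request-params key.
    value = "&".join(f"{k}={quote(v, safe='')}" for k, v in params)
    return (("x-goog-request-params", value),)

metadata = ()  # user-supplied metadata, normally a sequence of pairs
metadata = tuple(metadata) + to_grpc_metadata(
    (("name", "projects/p/locations/us-central1/entryGroups/g/entries/e"),)
)
print(metadata[0][0])  # x-goog-request-params
```

Note how the generated code always appends the routing entry to `tuple(metadata)`, preserving whatever metadata the caller supplied.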
Example: + + - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id} + This corresponds to the ``parent`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.pagers.ListEntriesPager: + Response message for + [ListEntries][google.cloud.datacatalog.v1beta1.DataCatalog.ListEntries]. + + Iterating over this object will yield results and + resolve additional pages automatically. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + has_flattened_params = any([parent]) + if request is not None and has_flattened_params: + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + # Minor optimization to avoid making a copy if the user passes + # in a datacatalog.ListEntriesRequest. + # There's no risk of modifying the input as we've already verified + # there are no flattened fields. + if not isinstance(request, datacatalog.ListEntriesRequest): + request = datacatalog.ListEntriesRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if parent is not None: + request.parent = parent + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = self._transport._wrapped_methods[self._transport.list_entries] + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)), + ) + + # Send the request. 
+ response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # This method is paged; wrap the response in a pager, which provides + # an `__iter__` convenience method. + response = pagers.ListEntriesPager( + method=rpc, request=request, response=response, metadata=metadata, + ) + + # Done; return the response. + return response + + def create_tag_template( + self, + request: datacatalog.CreateTagTemplateRequest = None, + *, + parent: str = None, + tag_template_id: str = None, + tag_template: tags.TagTemplate = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> tags.TagTemplate: + r"""Creates a tag template. The user should enable the Data Catalog + API in the project identified by the ``parent`` parameter (see + `Data Catalog Resource + Project `__ + for more information). + + Args: + request (:class:`~.datacatalog.CreateTagTemplateRequest`): + The request object. Request message for + [CreateTagTemplate][google.cloud.datacatalog.v1beta1.DataCatalog.CreateTagTemplate]. + parent (:class:`str`): + Required. The name of the project and the template + location + [region](https://cloud.google.com/data-catalog/docs/concepts/regions). + + Example: + + - projects/{project_id}/locations/us-central1 + This corresponds to the ``parent`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + tag_template_id (:class:`str`): + Required. The ID of the tag template + to create. + This corresponds to the ``tag_template_id`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + tag_template (:class:`~.tags.TagTemplate`): + Required. The tag template to create. + This corresponds to the ``tag_template`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried.
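`ListEntriesPager` hides page tokens from the caller: iterating it re-invokes the wrapped RPC to resolve additional pages on demand. A toy illustration of the same idea over canned pages (not the real pager class):

```python
class ToyPager:
    """Yields items across pages, fetching the next page lazily."""

    def __init__(self, fetch_page):
        # fetch_page: callable taking a page token (None for the first
        # page) and returning (items, next_token).
        self._fetch_page = fetch_page

    def __iter__(self):
        token = None
        while True:
            items, token = self._fetch_page(token)
            yield from items
            if not token:  # empty next_page_token means the last page
                break

# Two canned pages keyed by page token.
PAGES = {None: (["e1", "e2"], "t1"), "t1": (["e3"], None)}
entries = list(ToyPager(lambda tok: PAGES[tok]))
print(entries)  # ['e1', 'e2', 'e3']
```

Because fetching happens inside `__iter__`, a caller that stops early (e.g. `next(iter(pager))`) never requests the remaining pages.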
+ timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.tags.TagTemplate: + A tag template defines a tag, which can have one or more + typed fields. The template is used to create and attach + the tag to GCP resources. `Tag template + roles `__ + provide permissions to create, edit, and use the + template. See, for example, the `TagTemplate + User `__ + role, which includes permission to use the tag template + to tag resources. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + has_flattened_params = any([parent, tag_template_id, tag_template]) + if request is not None and has_flattened_params: + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + # Minor optimization to avoid making a copy if the user passes + # in a datacatalog.CreateTagTemplateRequest. + # There's no risk of modifying the input as we've already verified + # there are no flattened fields. + if not isinstance(request, datacatalog.CreateTagTemplateRequest): + request = datacatalog.CreateTagTemplateRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if parent is not None: + request.parent = parent + if tag_template_id is not None: + request.tag_template_id = tag_template_id + if tag_template is not None: + request.tag_template = tag_template + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = self._transport._wrapped_methods[self._transport.create_tag_template] + + # Certain fields should be provided within the metadata header; + # add these here. 
+ metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)), + ) + + # Send the request. + response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. + return response + + def get_tag_template( + self, + request: datacatalog.GetTagTemplateRequest = None, + *, + name: str = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> tags.TagTemplate: + r"""Gets a tag template. + + Args: + request (:class:`~.datacatalog.GetTagTemplateRequest`): + The request object. Request message for + [GetTagTemplate][google.cloud.datacatalog.v1beta1.DataCatalog.GetTagTemplate]. + name (:class:`str`): + Required. The name of the tag template. Example: + + - projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id} + This corresponds to the ``name`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.tags.TagTemplate: + A tag template defines a tag, which can have one or more + typed fields. The template is used to create and attach + the tag to GCP resources. `Tag template + roles `__ + provide permissions to create, edit, and use the + template. See, for example, the `TagTemplate + User `__ + role, which includes permission to use the tag template + to tag resources. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. 
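The `isinstance` coercion repeated in each method avoids copying when the caller already passes the right request type, while still accepting other representations such as dicts. The shape of that idea in plain Python (proto-plus does the real conversion; `FakeRequest` is a hypothetical stand-in):

```python
class FakeRequest:
    """Stand-in for a proto-plus request type (illustrative only)."""

    def __init__(self, mapping=None):
        self.name = (mapping or {}).get("name")

def coerce(request, cls):
    # Reuse the object untouched if it is already the right type;
    # otherwise construct a new instance from it (e.g. from a dict).
    if not isinstance(request, cls):
        request = cls(request)
    return request

req = FakeRequest({"name": "projects/p/locations/l/tagTemplates/t"})
assert coerce(req, FakeRequest) is req            # no copy made
assert coerce({"name": "x"}, FakeRequest).name == "x"
```

Skipping the copy is safe only because the mutual-exclusion guard has already ensured no flattened fields will be written onto a caller-owned request object.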
+ has_flattened_params = any([name]) + if request is not None and has_flattened_params: + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + # Minor optimization to avoid making a copy if the user passes + # in a datacatalog.GetTagTemplateRequest. + # There's no risk of modifying the input as we've already verified + # there are no flattened fields. + if not isinstance(request, datacatalog.GetTagTemplateRequest): + request = datacatalog.GetTagTemplateRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if name is not None: + request.name = name + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = self._transport._wrapped_methods[self._transport.get_tag_template] + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)), + ) + + # Send the request. + response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. + return response + + def update_tag_template( + self, + request: datacatalog.UpdateTagTemplateRequest = None, + *, + tag_template: tags.TagTemplate = None, + update_mask: field_mask.FieldMask = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> tags.TagTemplate: + r"""Updates a tag template. This method cannot be used to update the + fields of a template. The tag template fields are represented as + separate resources and should be updated using their own + create/update/delete methods. 
Users should enable the Data + Catalog API in the project identified by the + ``tag_template.name`` parameter (see [Data Catalog Resource + Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + Args: + request (:class:`~.datacatalog.UpdateTagTemplateRequest`): + The request object. Request message for + [UpdateTagTemplate][google.cloud.datacatalog.v1beta1.DataCatalog.UpdateTagTemplate]. + tag_template (:class:`~.tags.TagTemplate`): + Required. The template to update. The + "name" field must be set. + This corresponds to the ``tag_template`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + update_mask (:class:`~.field_mask.FieldMask`): + The field mask specifies the parts of the template to + overwrite. + + Allowed fields: + + - ``display_name`` + + If absent or empty, all of the allowed fields above will + be updated. + This corresponds to the ``update_mask`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.tags.TagTemplate: + A tag template defines a tag, which can have one or more + typed fields. The template is used to create and attach + the tag to GCP resources. `Tag template + roles `__ + provide permissions to create, edit, and use the + template. See, for example, the `TagTemplate + User `__ + role, which includes permission to use the tag template + to tag resources. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. 
+ has_flattened_params = any([tag_template, update_mask]) + if request is not None and has_flattened_params: + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + # Minor optimization to avoid making a copy if the user passes + # in a datacatalog.UpdateTagTemplateRequest. + # There's no risk of modifying the input as we've already verified + # there are no flattened fields. + if not isinstance(request, datacatalog.UpdateTagTemplateRequest): + request = datacatalog.UpdateTagTemplateRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if tag_template is not None: + request.tag_template = tag_template + if update_mask is not None: + request.update_mask = update_mask + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = self._transport._wrapped_methods[self._transport.update_tag_template] + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata( + (("tag_template.name", request.tag_template.name),) + ), + ) + + # Send the request. + response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. + return response + + def delete_tag_template( + self, + request: datacatalog.DeleteTagTemplateRequest = None, + *, + name: str = None, + force: bool = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> None: + r"""Deletes a tag template and all tags using the template. Users + should enable the Data Catalog API in the project identified by + the ``name`` parameter (see [Data Catalog Resource Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). 
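The `update_mask` described above restricts which parts of the template are overwritten, and an absent or empty mask means all allowed fields are updated. A dictionary-level sketch of that semantics (the real ``FieldMask`` lives in ``google.protobuf``; this mimics only the top-level behavior, with ``ALLOWED`` matching the single ``display_name`` field the docstring permits):

```python
ALLOWED = {"display_name"}

def apply_update(existing, new, mask_paths):
    """Overwrite only masked (and allowed) fields; empty mask means all allowed."""
    paths = set(mask_paths) & ALLOWED if mask_paths else ALLOWED
    updated = dict(existing)
    for path in paths:
        updated[path] = new[path]
    return updated

old = {"display_name": "Old template", "fields": ["a"]}
new = {"display_name": "New template", "fields": ["b"]}

result = apply_update(old, new, ["display_name"])
print(result)  # display_name replaced; 'fields' untouched, as it is not allowed
```

Intersecting with `ALLOWED` also models why masking a disallowed path is a no-op rather than an error in this sketch; the real service validates the mask server-side.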
+ + Args: + request (:class:`~.datacatalog.DeleteTagTemplateRequest`): + The request object. Request message for + [DeleteTagTemplate][google.cloud.datacatalog.v1beta1.DataCatalog.DeleteTagTemplate]. + name (:class:`str`): + Required. The name of the tag template to delete. + Example: + + - projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id} + This corresponds to the ``name`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + force (:class:`bool`): + Required. Currently, this field must always be set to + ``true``. This confirms the deletion of any possible + tags using this template. ``force = false`` will be + supported in the future. + This corresponds to the ``force`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + has_flattened_params = any([name, force]) + if request is not None and has_flattened_params: + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + # Minor optimization to avoid making a copy if the user passes + # in a datacatalog.DeleteTagTemplateRequest. + # There's no risk of modifying the input as we've already verified + # there are no flattened fields. + if not isinstance(request, datacatalog.DeleteTagTemplateRequest): + request = datacatalog.DeleteTagTemplateRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. 
+ + if name is not None: + request.name = name + if force is not None: + request.force = force + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = self._transport._wrapped_methods[self._transport.delete_tag_template] + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)), + ) + + # Send the request. + rpc( + request, retry=retry, timeout=timeout, metadata=metadata, + ) + + def create_tag_template_field( + self, + request: datacatalog.CreateTagTemplateFieldRequest = None, + *, + parent: str = None, + tag_template_field_id: str = None, + tag_template_field: tags.TagTemplateField = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> tags.TagTemplateField: + r"""Creates a field in a tag template. The user should enable the + Data Catalog API in the project identified by the ``parent`` + parameter (see `Data Catalog Resource + Project `__ + for more information). + + Args: + request (:class:`~.datacatalog.CreateTagTemplateFieldRequest`): + The request object. Request message for + [CreateTagTemplateField][google.cloud.datacatalog.v1beta1.DataCatalog.CreateTagTemplateField]. + parent (:class:`str`): + Required. The name of the project and the template + location + `region `__. + + Example: + + - projects/{project_id}/locations/us-central1/tagTemplates/{tag_template_id} + This corresponds to the ``parent`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + tag_template_field_id (:class:`str`): + Required. The ID of the tag template field to create. + Field ids can contain letters (both uppercase and + lowercase), numbers (0-9), underscores (_) and dashes + (-). Field IDs must be at least 1 character long and at + most 128 characters long. 
Field IDs must also be unique + within their template. + This corresponds to the ``tag_template_field_id`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + tag_template_field (:class:`~.tags.TagTemplateField`): + Required. The tag template field to + create. + This corresponds to the ``tag_template_field`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.tags.TagTemplateField: + The template for an individual field + within a tag template. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + has_flattened_params = any([parent, tag_template_field_id, tag_template_field]) + if request is not None and has_flattened_params: + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + # Minor optimization to avoid making a copy if the user passes + # in a datacatalog.CreateTagTemplateFieldRequest. + # There's no risk of modifying the input as we've already verified + # there are no flattened fields. + if not isinstance(request, datacatalog.CreateTagTemplateFieldRequest): + request = datacatalog.CreateTagTemplateFieldRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. 
+ + if parent is not None: + request.parent = parent + if tag_template_field_id is not None: + request.tag_template_field_id = tag_template_field_id + if tag_template_field is not None: + request.tag_template_field = tag_template_field + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = self._transport._wrapped_methods[ + self._transport.create_tag_template_field + ] + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)), + ) + + # Send the request. + response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. + return response + + def update_tag_template_field( + self, + request: datacatalog.UpdateTagTemplateFieldRequest = None, + *, + name: str = None, + tag_template_field: tags.TagTemplateField = None, + update_mask: field_mask.FieldMask = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> tags.TagTemplateField: + r"""Updates a field in a tag template. This method cannot be used to + update the field type. Users should enable the Data Catalog API + in the project identified by the ``name`` parameter (see [Data + Catalog Resource Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + Args: + request (:class:`~.datacatalog.UpdateTagTemplateFieldRequest`): + The request object. Request message for + [UpdateTagTemplateField][google.cloud.datacatalog.v1beta1.DataCatalog.UpdateTagTemplateField]. + name (:class:`str`): + Required. The name of the tag template field. 
Example: + + - projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id}/fields/{tag_template_field_id} + This corresponds to the ``name`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + tag_template_field (:class:`~.tags.TagTemplateField`): + Required. The template to update. + This corresponds to the ``tag_template_field`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + update_mask (:class:`~.field_mask.FieldMask`): + Optional. The field mask specifies the parts of the + template to be updated. Allowed fields: + + - ``display_name`` + - ``type.enum_type`` + - ``is_required`` + + If ``update_mask`` is not set or empty, all of the + allowed fields above will be updated. + + When updating an enum type, the provided values will be + merged with the existing values. Therefore, enum values + can only be added, existing enum values cannot be + deleted nor renamed. Updating a template field from + optional to required is NOT allowed. + This corresponds to the ``update_mask`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.tags.TagTemplateField: + The template for an individual field + within a tag template. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + has_flattened_params = any([name, tag_template_field, update_mask]) + if request is not None and has_flattened_params: + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." 
+ ) + + # Minor optimization to avoid making a copy if the user passes + # in a datacatalog.UpdateTagTemplateFieldRequest. + # There's no risk of modifying the input as we've already verified + # there are no flattened fields. + if not isinstance(request, datacatalog.UpdateTagTemplateFieldRequest): + request = datacatalog.UpdateTagTemplateFieldRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if name is not None: + request.name = name + if tag_template_field is not None: + request.tag_template_field = tag_template_field + if update_mask is not None: + request.update_mask = update_mask + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = self._transport._wrapped_methods[ + self._transport.update_tag_template_field + ] + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)), + ) + + # Send the request. + response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. + return response + + def rename_tag_template_field( + self, + request: datacatalog.RenameTagTemplateFieldRequest = None, + *, + name: str = None, + new_tag_template_field_id: str = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> tags.TagTemplateField: + r"""Renames a field in a tag template. The user should enable the + Data Catalog API in the project identified by the ``name`` + parameter (see `Data Catalog Resource + Project `__ + for more information). + + Args: + request (:class:`~.datacatalog.RenameTagTemplateFieldRequest`): + The request object. Request message for + [RenameTagTemplateField][google.cloud.datacatalog.v1beta1.DataCatalog.RenameTagTemplateField]. + name (:class:`str`): + Required. The name of the tag template. 
Example: + + - projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id}/fields/{tag_template_field_id} + This corresponds to the ``name`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + new_tag_template_field_id (:class:`str`): + Required. The new ID of this tag template field. For + example, ``my_new_field``. + This corresponds to the ``new_tag_template_field_id`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.tags.TagTemplateField: + The template for an individual field + within a tag template. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + has_flattened_params = any([name, new_tag_template_field_id]) + if request is not None and has_flattened_params: + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + # Minor optimization to avoid making a copy if the user passes + # in a datacatalog.RenameTagTemplateFieldRequest. + # There's no risk of modifying the input as we've already verified + # there are no flattened fields. + if not isinstance(request, datacatalog.RenameTagTemplateFieldRequest): + request = datacatalog.RenameTagTemplateFieldRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. 
+ + if name is not None: + request.name = name + if new_tag_template_field_id is not None: + request.new_tag_template_field_id = new_tag_template_field_id + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = self._transport._wrapped_methods[ + self._transport.rename_tag_template_field + ] + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)), + ) + + # Send the request. + response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. + return response + + def delete_tag_template_field( + self, + request: datacatalog.DeleteTagTemplateFieldRequest = None, + *, + name: str = None, + force: bool = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> None: + r"""Deletes a field in a tag template and all uses of that field. + Users should enable the Data Catalog API in the project + identified by the ``name`` parameter (see [Data Catalog Resource + Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + Args: + request (:class:`~.datacatalog.DeleteTagTemplateFieldRequest`): + The request object. Request message for + [DeleteTagTemplateField][google.cloud.datacatalog.v1beta1.DataCatalog.DeleteTagTemplateField]. + name (:class:`str`): + Required. The name of the tag template field to delete. + Example: + + - projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id}/fields/{tag_template_field_id} + This corresponds to the ``name`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + force (:class:`bool`): + Required. Currently, this field must always be set to + ``true``. This confirms the deletion of this field from + any tags using this field. 
``force = false`` will be + supported in the future. + This corresponds to the ``force`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + has_flattened_params = any([name, force]) + if request is not None and has_flattened_params: + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + # Minor optimization to avoid making a copy if the user passes + # in a datacatalog.DeleteTagTemplateFieldRequest. + # There's no risk of modifying the input as we've already verified + # there are no flattened fields. + if not isinstance(request, datacatalog.DeleteTagTemplateFieldRequest): + request = datacatalog.DeleteTagTemplateFieldRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if name is not None: + request.name = name + if force is not None: + request.force = force + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = self._transport._wrapped_methods[ + self._transport.delete_tag_template_field + ] + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)), + ) + + # Send the request. 
+ rpc( + request, retry=retry, timeout=timeout, metadata=metadata, + ) + + def create_tag( + self, + request: datacatalog.CreateTagRequest = None, + *, + parent: str = None, + tag: tags.Tag = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> tags.Tag: + r"""Creates a tag on an + [Entry][google.cloud.datacatalog.v1beta1.Entry]. Note: The + project identified by the ``parent`` parameter for the + `tag `__ + and the `tag + template `__ + used to create the tag must be from the same organization. + + Args: + request (:class:`~.datacatalog.CreateTagRequest`): + The request object. Request message for + [CreateTag][google.cloud.datacatalog.v1beta1.DataCatalog.CreateTag]. + parent (:class:`str`): + Required. The name of the resource to attach this tag + to. Tags can be attached to Entries. Example: + + - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id} + + Note that this Tag and its child resources may not + actually be stored in the location in this name. + This corresponds to the ``parent`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + tag (:class:`~.tags.Tag`): + Required. The tag to create. + This corresponds to the ``tag`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.tags.Tag: + Tags are used to attach custom metadata to Data Catalog + resources. Tags conform to the specifications within + their tag template. + + See `Data Catalog + IAM `__ + for information on the permissions needed to create or + view tags. + + """ + # Create or coerce a protobuf request object. 
+ # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + has_flattened_params = any([parent, tag]) + if request is not None and has_flattened_params: + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + # Minor optimization to avoid making a copy if the user passes + # in a datacatalog.CreateTagRequest. + # There's no risk of modifying the input as we've already verified + # there are no flattened fields. + if not isinstance(request, datacatalog.CreateTagRequest): + request = datacatalog.CreateTagRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if parent is not None: + request.parent = parent + if tag is not None: + request.tag = tag + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = self._transport._wrapped_methods[self._transport.create_tag] + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)), + ) + + # Send the request. + response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. + return response + + def update_tag( + self, + request: datacatalog.UpdateTagRequest = None, + *, + tag: tags.Tag = None, + update_mask: field_mask.FieldMask = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> tags.Tag: + r"""Updates an existing tag. + + Args: + request (:class:`~.datacatalog.UpdateTagRequest`): + The request object. Request message for + [UpdateTag][google.cloud.datacatalog.v1beta1.DataCatalog.UpdateTag]. + tag (:class:`~.tags.Tag`): + Required. The updated tag. The "name" + field must be set. 
+ This corresponds to the ``tag`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + update_mask (:class:`~.field_mask.FieldMask`): + The fields to update on the Tag. If absent or empty, all + modifiable fields are updated. Currently the only + modifiable field is the field ``fields``. + This corresponds to the ``update_mask`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.tags.Tag: + Tags are used to attach custom metadata to Data Catalog + resources. Tags conform to the specifications within + their tag template. + + See `Data Catalog + IAM `__ + for information on the permissions needed to create or + view tags. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + has_flattened_params = any([tag, update_mask]) + if request is not None and has_flattened_params: + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + # Minor optimization to avoid making a copy if the user passes + # in a datacatalog.UpdateTagRequest. + # There's no risk of modifying the input as we've already verified + # there are no flattened fields. + if not isinstance(request, datacatalog.UpdateTagRequest): + request = datacatalog.UpdateTagRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. 
+ + if tag is not None: + request.tag = tag + if update_mask is not None: + request.update_mask = update_mask + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = self._transport._wrapped_methods[self._transport.update_tag] + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("tag.name", request.tag.name),)), + ) + + # Send the request. + response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. + return response + + def delete_tag( + self, + request: datacatalog.DeleteTagRequest = None, + *, + name: str = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> None: + r"""Deletes a tag. + + Args: + request (:class:`~.datacatalog.DeleteTagRequest`): + The request object. Request message for + [DeleteTag][google.cloud.datacatalog.v1beta1.DataCatalog.DeleteTag]. + name (:class:`str`): + Required. The name of the tag to delete. Example: + + - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id}/tags/{tag_id} + This corresponds to the ``name`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. 
+ has_flattened_params = any([name]) + if request is not None and has_flattened_params: + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + # Minor optimization to avoid making a copy if the user passes + # in a datacatalog.DeleteTagRequest. + # There's no risk of modifying the input as we've already verified + # there are no flattened fields. + if not isinstance(request, datacatalog.DeleteTagRequest): + request = datacatalog.DeleteTagRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if name is not None: + request.name = name + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = self._transport._wrapped_methods[self._transport.delete_tag] + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)), + ) + + # Send the request. + rpc( + request, retry=retry, timeout=timeout, metadata=metadata, + ) + + def list_tags( + self, + request: datacatalog.ListTagsRequest = None, + *, + parent: str = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> pagers.ListTagsPager: + r"""Lists the tags on an + [Entry][google.cloud.datacatalog.v1beta1.Entry]. + + Args: + request (:class:`~.datacatalog.ListTagsRequest`): + The request object. Request message for + [ListTags][google.cloud.datacatalog.v1beta1.DataCatalog.ListTags]. + parent (:class:`str`): + Required. The name of the Data Catalog resource to list + the tags of. The resource could be an + [Entry][google.cloud.datacatalog.v1beta1.Entry] or an + [EntryGroup][google.cloud.datacatalog.v1beta1.EntryGroup]. 
+ + Examples: + + - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id} + - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id} + This corresponds to the ``parent`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.pagers.ListTagsPager: + Response message for + [ListTags][google.cloud.datacatalog.v1beta1.DataCatalog.ListTags]. + + Iterating over this object will yield results and + resolve additional pages automatically. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + has_flattened_params = any([parent]) + if request is not None and has_flattened_params: + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + # Minor optimization to avoid making a copy if the user passes + # in a datacatalog.ListTagsRequest. + # There's no risk of modifying the input as we've already verified + # there are no flattened fields. + if not isinstance(request, datacatalog.ListTagsRequest): + request = datacatalog.ListTagsRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if parent is not None: + request.parent = parent + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = self._transport._wrapped_methods[self._transport.list_tags] + + # Certain fields should be provided within the metadata header; + # add these here. 
+        metadata = tuple(metadata) + (
+            gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
+        )
+
+        # Send the request.
+        response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,)
+
+        # This method is paged; wrap the response in a pager, which provides
+        # an `__iter__` convenience method.
+        response = pagers.ListTagsPager(
+            method=rpc, request=request, response=response, metadata=metadata,
+        )
+
+        # Done; return the response.
+        return response
+
+    def set_iam_policy(
+        self,
+        request: iam_policy.SetIamPolicyRequest = None,
+        *,
+        resource: str = None,
+        retry: retries.Retry = gapic_v1.method.DEFAULT,
+        timeout: float = None,
+        metadata: Sequence[Tuple[str, str]] = (),
+    ) -> policy.Policy:
+        r"""Sets the access control policy for a resource. Replaces any
+        existing policy. Supported resources are:
+
+        -  Tag templates.
+        -  Entries.
+        -  Entry groups. Note, this method cannot be used to manage
+           policies for BigQuery, Pub/Sub and any external Google Cloud
+           Platform resources synced to Data Catalog.
+
+        Callers must have the following Google IAM permissions:
+
+        -  ``datacatalog.tagTemplates.setIamPolicy`` to set policies on
+           tag templates.
+        -  ``datacatalog.entries.setIamPolicy`` to set policies on
+           entries.
+        -  ``datacatalog.entryGroups.setIamPolicy`` to set policies on
+           entry groups.
+
+        Args:
+            request (:class:`~.iam_policy.SetIamPolicyRequest`):
+                The request object. Request message for `SetIamPolicy`
+                method.
+            resource (:class:`str`):
+                REQUIRED: The resource for which the
+                policy is being specified. See the
+                operation documentation for the
+                appropriate value for this field.
+                This corresponds to the ``resource`` field
+                on the ``request`` instance; if ``request`` is provided, this
+                should not be set.
+
+            retry (google.api_core.retry.Retry): Designation of what errors, if any,
+                should be retried.
+            timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.policy.Policy: + Defines an Identity and Access Management (IAM) policy. + It is used to specify access control policies for Cloud + Platform resources. + + A ``Policy`` is a collection of ``bindings``. A + ``binding`` binds one or more ``members`` to a single + ``role``. Members can be user accounts, service + accounts, Google groups, and domains (such as G Suite). + A ``role`` is a named list of permissions (defined by + IAM or configured by users). A ``binding`` can + optionally specify a ``condition``, which is a logic + expression that further constrains the role binding + based on attributes about the request and/or target + resource. + + **JSON Example** + + :: + + { + "bindings": [ + { + "role": "roles/resourcemanager.organizationAdmin", + "members": [ + "user:mike@example.com", + "group:admins@example.com", + "domain:google.com", + "serviceAccount:my-project-id@appspot.gserviceaccount.com" + ] + }, + { + "role": "roles/resourcemanager.organizationViewer", + "members": ["user:eve@example.com"], + "condition": { + "title": "expirable access", + "description": "Does not grant access after Sep 2020", + "expression": "request.time < + timestamp('2020-10-01T00:00:00.000Z')", + } + } + ] + } + + **YAML Example** + + :: + + bindings: + - members: + - user:mike@example.com + - group:admins@example.com + - domain:google.com + - serviceAccount:my-project-id@appspot.gserviceaccount.com + role: roles/resourcemanager.organizationAdmin + - members: + - user:eve@example.com + role: roles/resourcemanager.organizationViewer + condition: + title: expirable access + description: Does not grant access after Sep 2020 + expression: request.time < timestamp('2020-10-01T00:00:00.000Z') + + For a description of IAM and its features, see the `IAM + developer's + guide `__. + + """ + # Create or coerce a protobuf request object. 
+ # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + has_flattened_params = any([resource]) + if request is not None and has_flattened_params: + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + # The request isn't a proto-plus wrapped type, + # so it must be constructed via keyword expansion. + if isinstance(request, dict): + request = iam_policy.SetIamPolicyRequest(**request) + + elif not request: + request = iam_policy.SetIamPolicyRequest() + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if resource is not None: + request.resource = resource + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = self._transport._wrapped_methods[self._transport.set_iam_policy] + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("resource", request.resource),)), + ) + + # Send the request. + response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. + return response + + def get_iam_policy( + self, + request: iam_policy.GetIamPolicyRequest = None, + *, + resource: str = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> policy.Policy: + r"""Gets the access control policy for a resource. A ``NOT_FOUND`` + error is returned if the resource does not exist. An empty + policy is returned if the resource exists but does not have a + policy set on it. + + Supported resources are: + + - Tag templates. + - Entries. + - Entry groups. Note, this method cannot be used to manage + policies for BigQuery, Pub/Sub and any external Google Cloud + Platform resources synced to Data Catalog. 
+
+        Callers must have the following Google IAM permissions:
+
+        -  ``datacatalog.tagTemplates.getIamPolicy`` to get policies on
+           tag templates.
+        -  ``datacatalog.entries.getIamPolicy`` to get policies on
+           entries.
+        -  ``datacatalog.entryGroups.getIamPolicy`` to get policies on
+           entry groups.
+
+        Args:
+            request (:class:`~.iam_policy.GetIamPolicyRequest`):
+                The request object. Request message for `GetIamPolicy`
+                method.
+            resource (:class:`str`):
+                REQUIRED: The resource for which the
+                policy is being requested. See the
+                operation documentation for the
+                appropriate value for this field.
+                This corresponds to the ``resource`` field
+                on the ``request`` instance; if ``request`` is provided, this
+                should not be set.
+
+            retry (google.api_core.retry.Retry): Designation of what errors, if any,
+                should be retried.
+            timeout (float): The timeout for this request.
+            metadata (Sequence[Tuple[str, str]]): Strings which should be
+                sent along with the request as metadata.
+
+        Returns:
+            ~.policy.Policy:
+                Defines an Identity and Access Management (IAM) policy.
+                It is used to specify access control policies for Cloud
+                Platform resources.
+
+                A ``Policy`` is a collection of ``bindings``. A
+                ``binding`` binds one or more ``members`` to a single
+                ``role``. Members can be user accounts, service
+                accounts, Google groups, and domains (such as G Suite).
+                A ``role`` is a named list of permissions (defined by
+                IAM or configured by users). A ``binding`` can
+                optionally specify a ``condition``, which is a logic
+                expression that further constrains the role binding
+                based on attributes about the request and/or target
+                resource.
+ + **JSON Example** + + :: + + { + "bindings": [ + { + "role": "roles/resourcemanager.organizationAdmin", + "members": [ + "user:mike@example.com", + "group:admins@example.com", + "domain:google.com", + "serviceAccount:my-project-id@appspot.gserviceaccount.com" + ] + }, + { + "role": "roles/resourcemanager.organizationViewer", + "members": ["user:eve@example.com"], + "condition": { + "title": "expirable access", + "description": "Does not grant access after Sep 2020", + "expression": "request.time < + timestamp('2020-10-01T00:00:00.000Z')", + } + } + ] + } + + **YAML Example** + + :: + + bindings: + - members: + - user:mike@example.com + - group:admins@example.com + - domain:google.com + - serviceAccount:my-project-id@appspot.gserviceaccount.com + role: roles/resourcemanager.organizationAdmin + - members: + - user:eve@example.com + role: roles/resourcemanager.organizationViewer + condition: + title: expirable access + description: Does not grant access after Sep 2020 + expression: request.time < timestamp('2020-10-01T00:00:00.000Z') + + For a description of IAM and its features, see the `IAM + developer's + guide `__. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + has_flattened_params = any([resource]) + if request is not None and has_flattened_params: + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + # The request isn't a proto-plus wrapped type, + # so it must be constructed via keyword expansion. + if isinstance(request, dict): + request = iam_policy.GetIamPolicyRequest(**request) + + elif not request: + request = iam_policy.GetIamPolicyRequest() + + # If we have keyword arguments corresponding to fields on the + # request, apply these. 
+
+        if resource is not None:
+            request.resource = resource
+
+        # Wrap the RPC method; this adds retry and timeout information,
+        # and friendly error handling.
+        rpc = self._transport._wrapped_methods[self._transport.get_iam_policy]
+
+        # Certain fields should be provided within the metadata header;
+        # add these here.
+        metadata = tuple(metadata) + (
+            gapic_v1.routing_header.to_grpc_metadata((("resource", request.resource),)),
+        )
+
+        # Send the request.
+        response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,)
+
+        # Done; return the response.
+        return response
+
+    def test_iam_permissions(
+        self,
+        request: iam_policy.TestIamPermissionsRequest = None,
+        *,
+        retry: retries.Retry = gapic_v1.method.DEFAULT,
+        timeout: float = None,
+        metadata: Sequence[Tuple[str, str]] = (),
+    ) -> iam_policy.TestIamPermissionsResponse:
+        r"""Returns the caller's permissions on a resource. If the resource
+        does not exist, an empty set of permissions is returned (we
+        don't return a ``NOT_FOUND`` error).
+
+        Supported resources are:
+
+        -  Tag templates.
+        -  Entries.
+        -  Entry groups. Note, this method cannot be used to manage
+           policies for BigQuery, Pub/Sub and any external Google Cloud
+           Platform resources synced to Data Catalog.
+
+        A caller is not required to have Google IAM permission to make
+        this request.
+
+        Args:
+            request (:class:`~.iam_policy.TestIamPermissionsRequest`):
+                The request object. Request message for
+                `TestIamPermissions` method.
+
+            retry (google.api_core.retry.Retry): Designation of what errors, if any,
+                should be retried.
+            timeout (float): The timeout for this request.
+            metadata (Sequence[Tuple[str, str]]): Strings which should be
+                sent along with the request as metadata.
+
+        Returns:
+            ~.iam_policy.TestIamPermissionsResponse:
+                Response message for ``TestIamPermissions`` method.
+        """
+        # Create or coerce a protobuf request object.
+ + # The request isn't a proto-plus wrapped type, + # so it must be constructed via keyword expansion. + if isinstance(request, dict): + request = iam_policy.TestIamPermissionsRequest(**request) + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = self._transport._wrapped_methods[self._transport.test_iam_permissions] + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("resource", request.resource),)), + ) + + # Send the request. + response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. + return response + + +try: + _client_info = gapic_v1.client_info.ClientInfo( + gapic_version=pkg_resources.get_distribution( + "google-cloud-datacatalog", + ).version, + ) +except pkg_resources.DistributionNotFound: + _client_info = gapic_v1.client_info.ClientInfo() + + +__all__ = ("DataCatalogClient",) diff --git a/google/cloud/datacatalog_v1beta1/services/data_catalog/pagers.py b/google/cloud/datacatalog_v1beta1/services/data_catalog/pagers.py new file mode 100644 index 00000000..ae87331f --- /dev/null +++ b/google/cloud/datacatalog_v1beta1/services/data_catalog/pagers.py @@ -0,0 +1,534 @@ +# -*- coding: utf-8 -*- + +# Copyright 2020 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
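Every pager in the new `pagers.py` file below follows the same shape: a `pages` property that re-issues the wrapped method while the latest response carries a `next_page_token`, and an `__iter__` (or `__aiter__`) that flattens the per-page result lists into a single stream of items. A minimal, self-contained sketch of that pattern is shown here; `FakePage`, `SimplePager`, and `fake_method` are illustrative stand-ins, not names from the library (the real pagers also re-send the original request with its `page_token` field updated and forward the call metadata):

```python
from typing import Any, Callable, Iterable, Iterator, List


class FakePage:
    """Stand-in for a paged proto response (illustrative only)."""

    def __init__(self, results: List[Any], next_page_token: str = ""):
        self.results = results
        self.next_page_token = next_page_token


class SimplePager:
    """Minimal sketch of the generated synchronous pager pattern."""

    def __init__(self, method: Callable[[str], FakePage], response: FakePage):
        self._method = method
        self._response = response

    @property
    def pages(self) -> Iterable[FakePage]:
        # Yield the first page, then keep calling the wrapped method
        # while the server reports another page token.
        yield self._response
        while self._response.next_page_token:
            self._response = self._method(self._response.next_page_token)
            yield self._response

    def __iter__(self) -> Iterator[Any]:
        # Flatten the per-page result lists into one stream of items.
        for page in self.pages:
            yield from page.results


# Two fake "server" pages keyed by page token.
_pages = {"": FakePage([1, 2], next_page_token="t1"), "t1": FakePage([3])}


def fake_method(token: str) -> FakePage:
    return _pages[token]


pager = SimplePager(fake_method, fake_method(""))
assert list(pager) == [1, 2, 3]
```

The async variants differ only in that `pages` is an async generator and `__aiter__` awaits `self._method(...)`; in both cases attribute access on the pager falls through to the most recent response via `__getattr__`.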
+# + +from typing import Any, AsyncIterable, Awaitable, Callable, Iterable, Sequence, Tuple + +from google.cloud.datacatalog_v1beta1.types import datacatalog +from google.cloud.datacatalog_v1beta1.types import search +from google.cloud.datacatalog_v1beta1.types import tags + + +class SearchCatalogPager: + """A pager for iterating through ``search_catalog`` requests. + + This class thinly wraps an initial + :class:`~.datacatalog.SearchCatalogResponse` object, and + provides an ``__iter__`` method to iterate through its + ``results`` field. + + If there are more pages, the ``__iter__`` method will make additional + ``SearchCatalog`` requests and continue to iterate + through the ``results`` field on the + corresponding responses. + + All the usual :class:`~.datacatalog.SearchCatalogResponse` + attributes are available on the pager. If multiple requests are made, only + the most recent response is retained, and thus used for attribute lookup. + """ + + def __init__( + self, + method: Callable[..., datacatalog.SearchCatalogResponse], + request: datacatalog.SearchCatalogRequest, + response: datacatalog.SearchCatalogResponse, + *, + metadata: Sequence[Tuple[str, str]] = () + ): + """Instantiate the pager. + + Args: + method (Callable): The method that was originally called, and + which instantiated this pager. + request (:class:`~.datacatalog.SearchCatalogRequest`): + The initial request object. + response (:class:`~.datacatalog.SearchCatalogResponse`): + The initial response object. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. 
+ """ + self._method = method + self._request = datacatalog.SearchCatalogRequest(request) + self._response = response + self._metadata = metadata + + def __getattr__(self, name: str) -> Any: + return getattr(self._response, name) + + @property + def pages(self) -> Iterable[datacatalog.SearchCatalogResponse]: + yield self._response + while self._response.next_page_token: + self._request.page_token = self._response.next_page_token + self._response = self._method(self._request, metadata=self._metadata) + yield self._response + + def __iter__(self) -> Iterable[search.SearchCatalogResult]: + for page in self.pages: + yield from page.results + + def __repr__(self) -> str: + return "{0}<{1!r}>".format(self.__class__.__name__, self._response) + + +class SearchCatalogAsyncPager: + """A pager for iterating through ``search_catalog`` requests. + + This class thinly wraps an initial + :class:`~.datacatalog.SearchCatalogResponse` object, and + provides an ``__aiter__`` method to iterate through its + ``results`` field. + + If there are more pages, the ``__aiter__`` method will make additional + ``SearchCatalog`` requests and continue to iterate + through the ``results`` field on the + corresponding responses. + + All the usual :class:`~.datacatalog.SearchCatalogResponse` + attributes are available on the pager. If multiple requests are made, only + the most recent response is retained, and thus used for attribute lookup. + """ + + def __init__( + self, + method: Callable[..., Awaitable[datacatalog.SearchCatalogResponse]], + request: datacatalog.SearchCatalogRequest, + response: datacatalog.SearchCatalogResponse, + *, + metadata: Sequence[Tuple[str, str]] = () + ): + """Instantiate the pager. + + Args: + method (Callable): The method that was originally called, and + which instantiated this pager. + request (:class:`~.datacatalog.SearchCatalogRequest`): + The initial request object. + response (:class:`~.datacatalog.SearchCatalogResponse`): + The initial response object. 
+ metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + """ + self._method = method + self._request = datacatalog.SearchCatalogRequest(request) + self._response = response + self._metadata = metadata + + def __getattr__(self, name: str) -> Any: + return getattr(self._response, name) + + @property + async def pages(self) -> AsyncIterable[datacatalog.SearchCatalogResponse]: + yield self._response + while self._response.next_page_token: + self._request.page_token = self._response.next_page_token + self._response = await self._method(self._request, metadata=self._metadata) + yield self._response + + def __aiter__(self) -> AsyncIterable[search.SearchCatalogResult]: + async def async_generator(): + async for page in self.pages: + for response in page.results: + yield response + + return async_generator() + + def __repr__(self) -> str: + return "{0}<{1!r}>".format(self.__class__.__name__, self._response) + + +class ListEntryGroupsPager: + """A pager for iterating through ``list_entry_groups`` requests. + + This class thinly wraps an initial + :class:`~.datacatalog.ListEntryGroupsResponse` object, and + provides an ``__iter__`` method to iterate through its + ``entry_groups`` field. + + If there are more pages, the ``__iter__`` method will make additional + ``ListEntryGroups`` requests and continue to iterate + through the ``entry_groups`` field on the + corresponding responses. + + All the usual :class:`~.datacatalog.ListEntryGroupsResponse` + attributes are available on the pager. If multiple requests are made, only + the most recent response is retained, and thus used for attribute lookup. + """ + + def __init__( + self, + method: Callable[..., datacatalog.ListEntryGroupsResponse], + request: datacatalog.ListEntryGroupsRequest, + response: datacatalog.ListEntryGroupsResponse, + *, + metadata: Sequence[Tuple[str, str]] = () + ): + """Instantiate the pager. 
+ + Args: + method (Callable): The method that was originally called, and + which instantiated this pager. + request (:class:`~.datacatalog.ListEntryGroupsRequest`): + The initial request object. + response (:class:`~.datacatalog.ListEntryGroupsResponse`): + The initial response object. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + """ + self._method = method + self._request = datacatalog.ListEntryGroupsRequest(request) + self._response = response + self._metadata = metadata + + def __getattr__(self, name: str) -> Any: + return getattr(self._response, name) + + @property + def pages(self) -> Iterable[datacatalog.ListEntryGroupsResponse]: + yield self._response + while self._response.next_page_token: + self._request.page_token = self._response.next_page_token + self._response = self._method(self._request, metadata=self._metadata) + yield self._response + + def __iter__(self) -> Iterable[datacatalog.EntryGroup]: + for page in self.pages: + yield from page.entry_groups + + def __repr__(self) -> str: + return "{0}<{1!r}>".format(self.__class__.__name__, self._response) + + +class ListEntryGroupsAsyncPager: + """A pager for iterating through ``list_entry_groups`` requests. + + This class thinly wraps an initial + :class:`~.datacatalog.ListEntryGroupsResponse` object, and + provides an ``__aiter__`` method to iterate through its + ``entry_groups`` field. + + If there are more pages, the ``__aiter__`` method will make additional + ``ListEntryGroups`` requests and continue to iterate + through the ``entry_groups`` field on the + corresponding responses. + + All the usual :class:`~.datacatalog.ListEntryGroupsResponse` + attributes are available on the pager. If multiple requests are made, only + the most recent response is retained, and thus used for attribute lookup. 
+ """ + + def __init__( + self, + method: Callable[..., Awaitable[datacatalog.ListEntryGroupsResponse]], + request: datacatalog.ListEntryGroupsRequest, + response: datacatalog.ListEntryGroupsResponse, + *, + metadata: Sequence[Tuple[str, str]] = () + ): + """Instantiate the pager. + + Args: + method (Callable): The method that was originally called, and + which instantiated this pager. + request (:class:`~.datacatalog.ListEntryGroupsRequest`): + The initial request object. + response (:class:`~.datacatalog.ListEntryGroupsResponse`): + The initial response object. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + """ + self._method = method + self._request = datacatalog.ListEntryGroupsRequest(request) + self._response = response + self._metadata = metadata + + def __getattr__(self, name: str) -> Any: + return getattr(self._response, name) + + @property + async def pages(self) -> AsyncIterable[datacatalog.ListEntryGroupsResponse]: + yield self._response + while self._response.next_page_token: + self._request.page_token = self._response.next_page_token + self._response = await self._method(self._request, metadata=self._metadata) + yield self._response + + def __aiter__(self) -> AsyncIterable[datacatalog.EntryGroup]: + async def async_generator(): + async for page in self.pages: + for response in page.entry_groups: + yield response + + return async_generator() + + def __repr__(self) -> str: + return "{0}<{1!r}>".format(self.__class__.__name__, self._response) + + +class ListEntriesPager: + """A pager for iterating through ``list_entries`` requests. + + This class thinly wraps an initial + :class:`~.datacatalog.ListEntriesResponse` object, and + provides an ``__iter__`` method to iterate through its + ``entries`` field. + + If there are more pages, the ``__iter__`` method will make additional + ``ListEntries`` requests and continue to iterate + through the ``entries`` field on the + corresponding responses. 
+ + All the usual :class:`~.datacatalog.ListEntriesResponse` + attributes are available on the pager. If multiple requests are made, only + the most recent response is retained, and thus used for attribute lookup. + """ + + def __init__( + self, + method: Callable[..., datacatalog.ListEntriesResponse], + request: datacatalog.ListEntriesRequest, + response: datacatalog.ListEntriesResponse, + *, + metadata: Sequence[Tuple[str, str]] = () + ): + """Instantiate the pager. + + Args: + method (Callable): The method that was originally called, and + which instantiated this pager. + request (:class:`~.datacatalog.ListEntriesRequest`): + The initial request object. + response (:class:`~.datacatalog.ListEntriesResponse`): + The initial response object. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + """ + self._method = method + self._request = datacatalog.ListEntriesRequest(request) + self._response = response + self._metadata = metadata + + def __getattr__(self, name: str) -> Any: + return getattr(self._response, name) + + @property + def pages(self) -> Iterable[datacatalog.ListEntriesResponse]: + yield self._response + while self._response.next_page_token: + self._request.page_token = self._response.next_page_token + self._response = self._method(self._request, metadata=self._metadata) + yield self._response + + def __iter__(self) -> Iterable[datacatalog.Entry]: + for page in self.pages: + yield from page.entries + + def __repr__(self) -> str: + return "{0}<{1!r}>".format(self.__class__.__name__, self._response) + + +class ListEntriesAsyncPager: + """A pager for iterating through ``list_entries`` requests. + + This class thinly wraps an initial + :class:`~.datacatalog.ListEntriesResponse` object, and + provides an ``__aiter__`` method to iterate through its + ``entries`` field. 
+ + If there are more pages, the ``__aiter__`` method will make additional + ``ListEntries`` requests and continue to iterate + through the ``entries`` field on the + corresponding responses. + + All the usual :class:`~.datacatalog.ListEntriesResponse` + attributes are available on the pager. If multiple requests are made, only + the most recent response is retained, and thus used for attribute lookup. + """ + + def __init__( + self, + method: Callable[..., Awaitable[datacatalog.ListEntriesResponse]], + request: datacatalog.ListEntriesRequest, + response: datacatalog.ListEntriesResponse, + *, + metadata: Sequence[Tuple[str, str]] = () + ): + """Instantiate the pager. + + Args: + method (Callable): The method that was originally called, and + which instantiated this pager. + request (:class:`~.datacatalog.ListEntriesRequest`): + The initial request object. + response (:class:`~.datacatalog.ListEntriesResponse`): + The initial response object. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + """ + self._method = method + self._request = datacatalog.ListEntriesRequest(request) + self._response = response + self._metadata = metadata + + def __getattr__(self, name: str) -> Any: + return getattr(self._response, name) + + @property + async def pages(self) -> AsyncIterable[datacatalog.ListEntriesResponse]: + yield self._response + while self._response.next_page_token: + self._request.page_token = self._response.next_page_token + self._response = await self._method(self._request, metadata=self._metadata) + yield self._response + + def __aiter__(self) -> AsyncIterable[datacatalog.Entry]: + async def async_generator(): + async for page in self.pages: + for response in page.entries: + yield response + + return async_generator() + + def __repr__(self) -> str: + return "{0}<{1!r}>".format(self.__class__.__name__, self._response) + + +class ListTagsPager: + """A pager for iterating through ``list_tags`` requests. 
+ + This class thinly wraps an initial + :class:`~.datacatalog.ListTagsResponse` object, and + provides an ``__iter__`` method to iterate through its + ``tags`` field. + + If there are more pages, the ``__iter__`` method will make additional + ``ListTags`` requests and continue to iterate + through the ``tags`` field on the + corresponding responses. + + All the usual :class:`~.datacatalog.ListTagsResponse` + attributes are available on the pager. If multiple requests are made, only + the most recent response is retained, and thus used for attribute lookup. + """ + + def __init__( + self, + method: Callable[..., datacatalog.ListTagsResponse], + request: datacatalog.ListTagsRequest, + response: datacatalog.ListTagsResponse, + *, + metadata: Sequence[Tuple[str, str]] = () + ): + """Instantiate the pager. + + Args: + method (Callable): The method that was originally called, and + which instantiated this pager. + request (:class:`~.datacatalog.ListTagsRequest`): + The initial request object. + response (:class:`~.datacatalog.ListTagsResponse`): + The initial response object. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + """ + self._method = method + self._request = datacatalog.ListTagsRequest(request) + self._response = response + self._metadata = metadata + + def __getattr__(self, name: str) -> Any: + return getattr(self._response, name) + + @property + def pages(self) -> Iterable[datacatalog.ListTagsResponse]: + yield self._response + while self._response.next_page_token: + self._request.page_token = self._response.next_page_token + self._response = self._method(self._request, metadata=self._metadata) + yield self._response + + def __iter__(self) -> Iterable[tags.Tag]: + for page in self.pages: + yield from page.tags + + def __repr__(self) -> str: + return "{0}<{1!r}>".format(self.__class__.__name__, self._response) + + +class ListTagsAsyncPager: + """A pager for iterating through ``list_tags`` requests. 
+ + This class thinly wraps an initial + :class:`~.datacatalog.ListTagsResponse` object, and + provides an ``__aiter__`` method to iterate through its + ``tags`` field. + + If there are more pages, the ``__aiter__`` method will make additional + ``ListTags`` requests and continue to iterate + through the ``tags`` field on the + corresponding responses. + + All the usual :class:`~.datacatalog.ListTagsResponse` + attributes are available on the pager. If multiple requests are made, only + the most recent response is retained, and thus used for attribute lookup. + """ + + def __init__( + self, + method: Callable[..., Awaitable[datacatalog.ListTagsResponse]], + request: datacatalog.ListTagsRequest, + response: datacatalog.ListTagsResponse, + *, + metadata: Sequence[Tuple[str, str]] = () + ): + """Instantiate the pager. + + Args: + method (Callable): The method that was originally called, and + which instantiated this pager. + request (:class:`~.datacatalog.ListTagsRequest`): + The initial request object. + response (:class:`~.datacatalog.ListTagsResponse`): + The initial response object. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. 
+ """ + self._method = method + self._request = datacatalog.ListTagsRequest(request) + self._response = response + self._metadata = metadata + + def __getattr__(self, name: str) -> Any: + return getattr(self._response, name) + + @property + async def pages(self) -> AsyncIterable[datacatalog.ListTagsResponse]: + yield self._response + while self._response.next_page_token: + self._request.page_token = self._response.next_page_token + self._response = await self._method(self._request, metadata=self._metadata) + yield self._response + + def __aiter__(self) -> AsyncIterable[tags.Tag]: + async def async_generator(): + async for page in self.pages: + for response in page.tags: + yield response + + return async_generator() + + def __repr__(self) -> str: + return "{0}<{1!r}>".format(self.__class__.__name__, self._response) diff --git a/google/cloud/datacatalog_v1beta1/services/data_catalog/transports/__init__.py b/google/cloud/datacatalog_v1beta1/services/data_catalog/transports/__init__.py new file mode 100644 index 00000000..77a41a96 --- /dev/null +++ b/google/cloud/datacatalog_v1beta1/services/data_catalog/transports/__init__.py @@ -0,0 +1,36 @@ +# -*- coding: utf-8 -*- + +# Copyright 2020 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
+# + +from collections import OrderedDict +from typing import Dict, Type + +from .base import DataCatalogTransport +from .grpc import DataCatalogGrpcTransport +from .grpc_asyncio import DataCatalogGrpcAsyncIOTransport + + +# Compile a registry of transports. +_transport_registry = OrderedDict() # type: Dict[str, Type[DataCatalogTransport]] +_transport_registry["grpc"] = DataCatalogGrpcTransport +_transport_registry["grpc_asyncio"] = DataCatalogGrpcAsyncIOTransport + + +__all__ = ( + "DataCatalogTransport", + "DataCatalogGrpcTransport", + "DataCatalogGrpcAsyncIOTransport", +) diff --git a/google/cloud/datacatalog_v1beta1/services/data_catalog/transports/base.py b/google/cloud/datacatalog_v1beta1/services/data_catalog/transports/base.py new file mode 100644 index 00000000..fac99233 --- /dev/null +++ b/google/cloud/datacatalog_v1beta1/services/data_catalog/transports/base.py @@ -0,0 +1,560 @@ +# -*- coding: utf-8 -*- + +# Copyright 2020 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
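The ``_transport_registry`` above is a plain string-to-class mapping; the client resolves a transport implementation by label at construction time. A tiny runnable sketch of that dispatch pattern (the class names here are hypothetical stand-ins for the real transport classes):

```python
from collections import OrderedDict
from typing import Dict, Type


class GrpcTransport:
    """Stand-in for DataCatalogGrpcTransport."""

    label = "grpc"


class GrpcAsyncIOTransport:
    """Stand-in for DataCatalogGrpcAsyncIOTransport."""

    label = "grpc_asyncio"


# Same shape as the module's registry: label -> transport class.
_registry: Dict[str, Type] = OrderedDict()
_registry["grpc"] = GrpcTransport
_registry["grpc_asyncio"] = GrpcAsyncIOTransport


def get_transport_class(label: str = "grpc") -> Type:
    """Resolve a transport class by its label, defaulting to gRPC."""
    return _registry[label]
```

Keeping the registry an ``OrderedDict`` makes the first entry the deterministic default when no label is given.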
+# + + import abc + import typing + import pkg_resources + + from google import auth + from google.api_core import exceptions # type: ignore + from google.api_core import gapic_v1 # type: ignore + from google.api_core import retry as retries # type: ignore + from google.auth import credentials # type: ignore + + from google.cloud.datacatalog_v1beta1.types import datacatalog + from google.cloud.datacatalog_v1beta1.types import tags + from google.iam.v1 import iam_policy_pb2 as iam_policy # type: ignore + from google.iam.v1 import policy_pb2 as policy # type: ignore + from google.protobuf import empty_pb2 as empty # type: ignore + + + try: + _client_info = gapic_v1.client_info.ClientInfo( + gapic_version=pkg_resources.get_distribution( + "google-cloud-datacatalog", + ).version, + ) + except pkg_resources.DistributionNotFound: + _client_info = gapic_v1.client_info.ClientInfo() + + + class DataCatalogTransport(abc.ABC): + """Abstract transport class for DataCatalog.""" + + AUTH_SCOPES = ("https://www.googleapis.com/auth/cloud-platform",) + + def __init__( + self, + *, + host: str = "datacatalog.googleapis.com", + credentials: credentials.Credentials = None, + credentials_file: typing.Optional[str] = None, + scopes: typing.Optional[typing.Sequence[str]] = AUTH_SCOPES, + quota_project_id: typing.Optional[str] = None, + **kwargs, + ) -> None: + """Instantiate the transport. + + Args: + host (Optional[str]): The hostname to connect to. + credentials (Optional[google.auth.credentials.Credentials]): The + authorization credentials to attach to requests. These + credentials identify the application to the service; if none + are specified, the client will attempt to ascertain the + credentials from the environment. + credentials_file (Optional[str]): A file with credentials that can + be loaded with :func:`google.auth.load_credentials_from_file`. + This argument is mutually exclusive with credentials. + scopes (Optional[Sequence[str]]): A list of scopes.
+ quota_project_id (Optional[str]): An optional project to use for billing + and quota. + """ + # Save the hostname. Default to port 443 (HTTPS) if none is specified. + if ":" not in host: + host += ":443" + self._host = host + + # If no credentials are provided, then determine the appropriate + # defaults. + if credentials and credentials_file: + raise exceptions.DuplicateCredentialArgs( + "'credentials_file' and 'credentials' are mutually exclusive" + ) + + if credentials_file is not None: + credentials, _ = auth.load_credentials_from_file( + credentials_file, scopes=scopes, quota_project_id=quota_project_id + ) + + elif credentials is None: + credentials, _ = auth.default( + scopes=scopes, quota_project_id=quota_project_id + ) + + # Save the credentials. + self._credentials = credentials + + # Lifted into its own function so it can be stubbed out during tests. + self._prep_wrapped_messages() + + def _prep_wrapped_messages(self): + # Precompute the wrapped methods. + self._wrapped_methods = { + self.search_catalog: gapic_v1.method.wrap_method( + self.search_catalog, default_timeout=None, client_info=_client_info, + ), + self.create_entry_group: gapic_v1.method.wrap_method( + self.create_entry_group, default_timeout=None, client_info=_client_info, + ), + self.update_entry_group: gapic_v1.method.wrap_method( + self.update_entry_group, default_timeout=None, client_info=_client_info, + ), + self.get_entry_group: gapic_v1.method.wrap_method( + self.get_entry_group, + default_retry=retries.Retry( + initial=0.1, + maximum=60.0, + multiplier=1.3, + predicate=retries.if_exception_type( + exceptions.DeadlineExceeded, exceptions.ServiceUnavailable, + ), + ), + default_timeout=60.0, + client_info=_client_info, + ), + self.delete_entry_group: gapic_v1.method.wrap_method( + self.delete_entry_group, + default_retry=retries.Retry( + initial=0.1, + maximum=60.0, + multiplier=1.3, + predicate=retries.if_exception_type( + exceptions.DeadlineExceeded, exceptions.ServiceUnavailable, 
+ ), + ), + default_timeout=60.0, + client_info=_client_info, + ), + self.list_entry_groups: gapic_v1.method.wrap_method( + self.list_entry_groups, default_timeout=None, client_info=_client_info, + ), + self.create_entry: gapic_v1.method.wrap_method( + self.create_entry, default_timeout=None, client_info=_client_info, + ), + self.update_entry: gapic_v1.method.wrap_method( + self.update_entry, default_timeout=None, client_info=_client_info, + ), + self.delete_entry: gapic_v1.method.wrap_method( + self.delete_entry, + default_retry=retries.Retry( + initial=0.1, + maximum=60.0, + multiplier=1.3, + predicate=retries.if_exception_type( + exceptions.DeadlineExceeded, exceptions.ServiceUnavailable, + ), + ), + default_timeout=60.0, + client_info=_client_info, + ), + self.get_entry: gapic_v1.method.wrap_method( + self.get_entry, + default_retry=retries.Retry( + initial=0.1, + maximum=60.0, + multiplier=1.3, + predicate=retries.if_exception_type( + exceptions.DeadlineExceeded, exceptions.ServiceUnavailable, + ), + ), + default_timeout=60.0, + client_info=_client_info, + ), + self.lookup_entry: gapic_v1.method.wrap_method( + self.lookup_entry, + default_retry=retries.Retry( + initial=0.1, + maximum=60.0, + multiplier=1.3, + predicate=retries.if_exception_type( + exceptions.DeadlineExceeded, exceptions.ServiceUnavailable, + ), + ), + default_timeout=60.0, + client_info=_client_info, + ), + self.list_entries: gapic_v1.method.wrap_method( + self.list_entries, default_timeout=None, client_info=_client_info, + ), + self.create_tag_template: gapic_v1.method.wrap_method( + self.create_tag_template, + default_timeout=None, + client_info=_client_info, + ), + self.get_tag_template: gapic_v1.method.wrap_method( + self.get_tag_template, + default_retry=retries.Retry( + initial=0.1, + maximum=60.0, + multiplier=1.3, + predicate=retries.if_exception_type( + exceptions.DeadlineExceeded, exceptions.ServiceUnavailable, + ), + ), + default_timeout=60.0, + client_info=_client_info, + ), + 
self.update_tag_template: gapic_v1.method.wrap_method( + self.update_tag_template, + default_timeout=None, + client_info=_client_info, + ), + self.delete_tag_template: gapic_v1.method.wrap_method( + self.delete_tag_template, + default_retry=retries.Retry( + initial=0.1, + maximum=60.0, + multiplier=1.3, + predicate=retries.if_exception_type( + exceptions.DeadlineExceeded, exceptions.ServiceUnavailable, + ), + ), + default_timeout=60.0, + client_info=_client_info, + ), + self.create_tag_template_field: gapic_v1.method.wrap_method( + self.create_tag_template_field, + default_timeout=None, + client_info=_client_info, + ), + self.update_tag_template_field: gapic_v1.method.wrap_method( + self.update_tag_template_field, + default_timeout=None, + client_info=_client_info, + ), + self.rename_tag_template_field: gapic_v1.method.wrap_method( + self.rename_tag_template_field, + default_timeout=None, + client_info=_client_info, + ), + self.delete_tag_template_field: gapic_v1.method.wrap_method( + self.delete_tag_template_field, + default_retry=retries.Retry( + initial=0.1, + maximum=60.0, + multiplier=1.3, + predicate=retries.if_exception_type( + exceptions.DeadlineExceeded, exceptions.ServiceUnavailable, + ), + ), + default_timeout=60.0, + client_info=_client_info, + ), + self.create_tag: gapic_v1.method.wrap_method( + self.create_tag, default_timeout=None, client_info=_client_info, + ), + self.update_tag: gapic_v1.method.wrap_method( + self.update_tag, default_timeout=None, client_info=_client_info, + ), + self.delete_tag: gapic_v1.method.wrap_method( + self.delete_tag, + default_retry=retries.Retry( + initial=0.1, + maximum=60.0, + multiplier=1.3, + predicate=retries.if_exception_type( + exceptions.DeadlineExceeded, exceptions.ServiceUnavailable, + ), + ), + default_timeout=60.0, + client_info=_client_info, + ), + self.list_tags: gapic_v1.method.wrap_method( + self.list_tags, + default_retry=retries.Retry( + initial=0.1, + maximum=60.0, + multiplier=1.3, + 
predicate=retries.if_exception_type( + exceptions.DeadlineExceeded, exceptions.ServiceUnavailable, + ), + ), + default_timeout=60.0, + client_info=_client_info, + ), + self.set_iam_policy: gapic_v1.method.wrap_method( + self.set_iam_policy, default_timeout=None, client_info=_client_info, + ), + self.get_iam_policy: gapic_v1.method.wrap_method( + self.get_iam_policy, default_timeout=None, client_info=_client_info, + ), + self.test_iam_permissions: gapic_v1.method.wrap_method( + self.test_iam_permissions, + default_timeout=None, + client_info=_client_info, + ), + } + + @property + def search_catalog( + self, + ) -> typing.Callable[ + [datacatalog.SearchCatalogRequest], + typing.Union[ + datacatalog.SearchCatalogResponse, + typing.Awaitable[datacatalog.SearchCatalogResponse], + ], + ]: + raise NotImplementedError() + + @property + def create_entry_group( + self, + ) -> typing.Callable[ + [datacatalog.CreateEntryGroupRequest], + typing.Union[datacatalog.EntryGroup, typing.Awaitable[datacatalog.EntryGroup]], + ]: + raise NotImplementedError() + + @property + def update_entry_group( + self, + ) -> typing.Callable[ + [datacatalog.UpdateEntryGroupRequest], + typing.Union[datacatalog.EntryGroup, typing.Awaitable[datacatalog.EntryGroup]], + ]: + raise NotImplementedError() + + @property + def get_entry_group( + self, + ) -> typing.Callable[ + [datacatalog.GetEntryGroupRequest], + typing.Union[datacatalog.EntryGroup, typing.Awaitable[datacatalog.EntryGroup]], + ]: + raise NotImplementedError() + + @property + def delete_entry_group( + self, + ) -> typing.Callable[ + [datacatalog.DeleteEntryGroupRequest], + typing.Union[empty.Empty, typing.Awaitable[empty.Empty]], + ]: + raise NotImplementedError() + + @property + def list_entry_groups( + self, + ) -> typing.Callable[ + [datacatalog.ListEntryGroupsRequest], + typing.Union[ + datacatalog.ListEntryGroupsResponse, + typing.Awaitable[datacatalog.ListEntryGroupsResponse], + ], + ]: + raise NotImplementedError() + + @property + def 
create_entry( + self, + ) -> typing.Callable[ + [datacatalog.CreateEntryRequest], + typing.Union[datacatalog.Entry, typing.Awaitable[datacatalog.Entry]], + ]: + raise NotImplementedError() + + @property + def update_entry( + self, + ) -> typing.Callable[ + [datacatalog.UpdateEntryRequest], + typing.Union[datacatalog.Entry, typing.Awaitable[datacatalog.Entry]], + ]: + raise NotImplementedError() + + @property + def delete_entry( + self, + ) -> typing.Callable[ + [datacatalog.DeleteEntryRequest], + typing.Union[empty.Empty, typing.Awaitable[empty.Empty]], + ]: + raise NotImplementedError() + + @property + def get_entry( + self, + ) -> typing.Callable[ + [datacatalog.GetEntryRequest], + typing.Union[datacatalog.Entry, typing.Awaitable[datacatalog.Entry]], + ]: + raise NotImplementedError() + + @property + def lookup_entry( + self, + ) -> typing.Callable[ + [datacatalog.LookupEntryRequest], + typing.Union[datacatalog.Entry, typing.Awaitable[datacatalog.Entry]], + ]: + raise NotImplementedError() + + @property + def list_entries( + self, + ) -> typing.Callable[ + [datacatalog.ListEntriesRequest], + typing.Union[ + datacatalog.ListEntriesResponse, + typing.Awaitable[datacatalog.ListEntriesResponse], + ], + ]: + raise NotImplementedError() + + @property + def create_tag_template( + self, + ) -> typing.Callable[ + [datacatalog.CreateTagTemplateRequest], + typing.Union[tags.TagTemplate, typing.Awaitable[tags.TagTemplate]], + ]: + raise NotImplementedError() + + @property + def get_tag_template( + self, + ) -> typing.Callable[ + [datacatalog.GetTagTemplateRequest], + typing.Union[tags.TagTemplate, typing.Awaitable[tags.TagTemplate]], + ]: + raise NotImplementedError() + + @property + def update_tag_template( + self, + ) -> typing.Callable[ + [datacatalog.UpdateTagTemplateRequest], + typing.Union[tags.TagTemplate, typing.Awaitable[tags.TagTemplate]], + ]: + raise NotImplementedError() + + @property + def delete_tag_template( + self, + ) -> typing.Callable[ + 
[datacatalog.DeleteTagTemplateRequest], + typing.Union[empty.Empty, typing.Awaitable[empty.Empty]], + ]: + raise NotImplementedError() + + @property + def create_tag_template_field( + self, + ) -> typing.Callable[ + [datacatalog.CreateTagTemplateFieldRequest], + typing.Union[tags.TagTemplateField, typing.Awaitable[tags.TagTemplateField]], + ]: + raise NotImplementedError() + + @property + def update_tag_template_field( + self, + ) -> typing.Callable[ + [datacatalog.UpdateTagTemplateFieldRequest], + typing.Union[tags.TagTemplateField, typing.Awaitable[tags.TagTemplateField]], + ]: + raise NotImplementedError() + + @property + def rename_tag_template_field( + self, + ) -> typing.Callable[ + [datacatalog.RenameTagTemplateFieldRequest], + typing.Union[tags.TagTemplateField, typing.Awaitable[tags.TagTemplateField]], + ]: + raise NotImplementedError() + + @property + def delete_tag_template_field( + self, + ) -> typing.Callable[ + [datacatalog.DeleteTagTemplateFieldRequest], + typing.Union[empty.Empty, typing.Awaitable[empty.Empty]], + ]: + raise NotImplementedError() + + @property + def create_tag( + self, + ) -> typing.Callable[ + [datacatalog.CreateTagRequest], + typing.Union[tags.Tag, typing.Awaitable[tags.Tag]], + ]: + raise NotImplementedError() + + @property + def update_tag( + self, + ) -> typing.Callable[ + [datacatalog.UpdateTagRequest], + typing.Union[tags.Tag, typing.Awaitable[tags.Tag]], + ]: + raise NotImplementedError() + + @property + def delete_tag( + self, + ) -> typing.Callable[ + [datacatalog.DeleteTagRequest], + typing.Union[empty.Empty, typing.Awaitable[empty.Empty]], + ]: + raise NotImplementedError() + + @property + def list_tags( + self, + ) -> typing.Callable[ + [datacatalog.ListTagsRequest], + typing.Union[ + datacatalog.ListTagsResponse, typing.Awaitable[datacatalog.ListTagsResponse] + ], + ]: + raise NotImplementedError() + + @property + def set_iam_policy( + self, + ) -> typing.Callable[ + [iam_policy.SetIamPolicyRequest], + 
typing.Union[policy.Policy, typing.Awaitable[policy.Policy]], + ]: + raise NotImplementedError() + + @property + def get_iam_policy( + self, + ) -> typing.Callable[ + [iam_policy.GetIamPolicyRequest], + typing.Union[policy.Policy, typing.Awaitable[policy.Policy]], + ]: + raise NotImplementedError() + + @property + def test_iam_permissions( + self, + ) -> typing.Callable[ + [iam_policy.TestIamPermissionsRequest], + typing.Union[ + iam_policy.TestIamPermissionsResponse, + typing.Awaitable[iam_policy.TestIamPermissionsResponse], + ], + ]: + raise NotImplementedError() + + +__all__ = ("DataCatalogTransport",) diff --git a/google/cloud/datacatalog_v1beta1/services/data_catalog/transports/grpc.py b/google/cloud/datacatalog_v1beta1/services/data_catalog/transports/grpc.py new file mode 100644 index 00000000..1b96a954 --- /dev/null +++ b/google/cloud/datacatalog_v1beta1/services/data_catalog/transports/grpc.py @@ -0,0 +1,1054 @@ +# -*- coding: utf-8 -*- + +# Copyright 2020 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
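Several methods in the transport above are wrapped with ``retries.Retry(initial=0.1, maximum=60.0, multiplier=1.3, ...)``. Ignoring the random jitter that ``google.api_core`` adds, the base sleep schedule such a policy produces can be sketched with a hypothetical helper (not the api_core implementation):

```python
def backoff_delays(initial=0.1, multiplier=1.3, maximum=60.0, attempts=6):
    """Yield the successive base delays an exponential retry policy would
    use, capped at `maximum`; mirrors the Retry(...) parameters above."""
    delay = initial
    for _ in range(attempts):
        yield min(delay, maximum)
        delay *= multiplier


delays = list(backoff_delays())
```

Each retryable call (``get_entry``, ``lookup_entry``, ``list_tags``, and so on) thus waits a little longer after each ``DeadlineExceeded`` or ``ServiceUnavailable`` error, up to the 60-second cap and the 60-second overall timeout.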
+# + +from typing import Callable, Dict, Optional, Sequence, Tuple + +from google.api_core import grpc_helpers # type: ignore +from google import auth # type: ignore +from google.auth import credentials # type: ignore +from google.auth.transport.grpc import SslCredentials # type: ignore + + +import grpc # type: ignore + +from google.cloud.datacatalog_v1beta1.types import datacatalog +from google.cloud.datacatalog_v1beta1.types import tags +from google.iam.v1 import iam_policy_pb2 as iam_policy # type: ignore +from google.iam.v1 import policy_pb2 as policy # type: ignore +from google.protobuf import empty_pb2 as empty # type: ignore + +from .base import DataCatalogTransport + + +class DataCatalogGrpcTransport(DataCatalogTransport): + """gRPC backend transport for DataCatalog. + + Data Catalog API service allows clients to discover, + understand, and manage their data. + + This class defines the same methods as the primary client, so the + primary client can load the underlying transport implementation + and call it. + + It sends protocol buffers over the wire using gRPC (which is built on + top of HTTP/2); the ``grpcio`` package must be installed. + """ + + _stubs: Dict[str, Callable] + + def __init__( + self, + *, + host: str = "datacatalog.googleapis.com", + credentials: credentials.Credentials = None, + credentials_file: str = None, + scopes: Sequence[str] = None, + channel: grpc.Channel = None, + api_mtls_endpoint: str = None, + client_cert_source: Callable[[], Tuple[bytes, bytes]] = None, + quota_project_id: Optional[str] = None + ) -> None: + """Instantiate the transport. + + Args: + host (Optional[str]): The hostname to connect to. + credentials (Optional[google.auth.credentials.Credentials]): The + authorization credentials to attach to requests. These + credentials identify the application to the service; if none + are specified, the client will attempt to ascertain the + credentials from the environment. 
+ This argument is ignored if ``channel`` is provided. + credentials_file (Optional[str]): A file with credentials that can + be loaded with :func:`google.auth.load_credentials_from_file`. + This argument is ignored if ``channel`` is provided. + scopes (Optional[Sequence[str]]): A list of scopes. This argument is + ignored if ``channel`` is provided. + channel (Optional[grpc.Channel]): A ``Channel`` instance through + which to make calls. + api_mtls_endpoint (Optional[str]): The mutual TLS endpoint. If + provided, it overrides the ``host`` argument and tries to create + a mutual TLS channel with client SSL credentials from + ``client_cert_source`` or application default SSL credentials. + client_cert_source (Optional[Callable[[], Tuple[bytes, bytes]]]): A + callback to provide client SSL certificate bytes and private key + bytes, both in PEM format. It is ignored if ``api_mtls_endpoint`` + is None. + quota_project_id (Optional[str]): An optional project to use for billing + and quota. + + Raises: + google.auth.exceptions.MutualTLSChannelError: If mutual TLS transport + creation failed for any reason. + google.api_core.exceptions.DuplicateCredentialArgs: If both ``credentials`` + and ``credentials_file`` are passed. + """ + if channel: + # Sanity check: Ensure that channel and credentials are not both + # provided. + credentials = False + + # If a channel was explicitly provided, set it. + self._grpc_channel = channel + elif api_mtls_endpoint: + host = ( + api_mtls_endpoint + if ":" in api_mtls_endpoint + else api_mtls_endpoint + ":443" + ) + + if credentials is None: + credentials, _ = auth.default( + scopes=self.AUTH_SCOPES, quota_project_id=quota_project_id + ) + + # Create SSL credentials with client_cert_source or application + # default SSL credentials.
+ if client_cert_source: + cert, key = client_cert_source() + ssl_credentials = grpc.ssl_channel_credentials( + certificate_chain=cert, private_key=key + ) + else: + ssl_credentials = SslCredentials().ssl_credentials + + # create a new channel. The provided one is ignored. + self._grpc_channel = type(self).create_channel( + host, + credentials=credentials, + credentials_file=credentials_file, + ssl_credentials=ssl_credentials, + scopes=scopes or self.AUTH_SCOPES, + quota_project_id=quota_project_id, + ) + + self._stubs = {} # type: Dict[str, Callable] + + # Run the base constructor. + super().__init__( + host=host, + credentials=credentials, + credentials_file=credentials_file, + scopes=scopes or self.AUTH_SCOPES, + quota_project_id=quota_project_id, + ) + + @classmethod + def create_channel( + cls, + host: str = "datacatalog.googleapis.com", + credentials: credentials.Credentials = None, + credentials_file: str = None, + scopes: Optional[Sequence[str]] = None, + quota_project_id: Optional[str] = None, + **kwargs + ) -> grpc.Channel: + """Create and return a gRPC channel object. + Args: + host (Optional[str]): The host for the channel to use. + credentials (Optional[~.Credentials]): The + authorization credentials to attach to requests. These + credentials identify this application to the service. If + none are specified, the client will attempt to ascertain + the credentials from the environment. + credentials_file (Optional[str]): A file with credentials that can + be loaded with :func:`google.auth.load_credentials_from_file`. + This argument is mutually exclusive with credentials. + scopes (Optional[Sequence[str]]): An optional list of scopes needed for this + service. These are only used when credentials are not specified and + are passed to :func:`google.auth.default`. + quota_project_id (Optional[str]): An optional project to use for billing + and quota. + kwargs (Optional[dict]): Keyword arguments, which are passed to the + channel creation.
+ Returns: + grpc.Channel: A gRPC channel object. + + Raises: + google.api_core.exceptions.DuplicateCredentialArgs: If both ``credentials`` + and ``credentials_file`` are passed. + """ + scopes = scopes or cls.AUTH_SCOPES + return grpc_helpers.create_channel( + host, + credentials=credentials, + credentials_file=credentials_file, + scopes=scopes, + quota_project_id=quota_project_id, + **kwargs + ) + + @property + def grpc_channel(self) -> grpc.Channel: + """Create the channel designed to connect to this service. + + This property caches on the instance; repeated calls return + the same channel. + """ + # Sanity check: Only create a new channel if we do not already + # have one. + if not hasattr(self, "_grpc_channel"): + self._grpc_channel = self.create_channel( + self._host, credentials=self._credentials, + ) + + # Return the channel from cache. + return self._grpc_channel + + @property + def search_catalog( + self, + ) -> Callable[ + [datacatalog.SearchCatalogRequest], datacatalog.SearchCatalogResponse + ]: + r"""Return a callable for the search catalog method over gRPC. + + Searches Data Catalog for multiple resources like entries, tags + that match a query. + + This is a custom method + (https://cloud.google.com/apis/design/custom_methods) and does + not return the complete resource, only the resource identifier + and high level fields. Clients can subsequently call ``Get`` + methods. + + Note that Data Catalog search queries do not guarantee full + recall. Query results that match your query may not be returned, + even in subsequent result pages. Also note that results returned + (and not returned) can vary across repeated search queries. + + See `Data Catalog Search + Syntax `__ + for more information. + + Returns: + Callable[[~.SearchCatalogRequest], + ~.SearchCatalogResponse]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request.
+ # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "search_catalog" not in self._stubs: + self._stubs["search_catalog"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.DataCatalog/SearchCatalog", + request_serializer=datacatalog.SearchCatalogRequest.serialize, + response_deserializer=datacatalog.SearchCatalogResponse.deserialize, + ) + return self._stubs["search_catalog"] + + @property + def create_entry_group( + self, + ) -> Callable[[datacatalog.CreateEntryGroupRequest], datacatalog.EntryGroup]: + r"""Return a callable for the create entry group method over gRPC. + + A maximum of 10,000 entry groups may be created per organization + across all locations. + + Users should enable the Data Catalog API in the project + identified by the ``parent`` parameter (see [Data Catalog + Resource Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + Returns: + Callable[[~.CreateEntryGroupRequest], + ~.EntryGroup]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "create_entry_group" not in self._stubs: + self._stubs["create_entry_group"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.DataCatalog/CreateEntryGroup", + request_serializer=datacatalog.CreateEntryGroupRequest.serialize, + response_deserializer=datacatalog.EntryGroup.deserialize, + ) + return self._stubs["create_entry_group"] + + @property + def update_entry_group( + self, + ) -> Callable[[datacatalog.UpdateEntryGroupRequest], datacatalog.EntryGroup]: + r"""Return a callable for the update entry group method over gRPC. + + Updates an EntryGroup. 
The user should enable the Data Catalog + API in the project identified by the ``entry_group.name`` + parameter (see [Data Catalog Resource Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + Returns: + Callable[[~.UpdateEntryGroupRequest], + ~.EntryGroup]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "update_entry_group" not in self._stubs: + self._stubs["update_entry_group"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.DataCatalog/UpdateEntryGroup", + request_serializer=datacatalog.UpdateEntryGroupRequest.serialize, + response_deserializer=datacatalog.EntryGroup.deserialize, + ) + return self._stubs["update_entry_group"] + + @property + def get_entry_group( + self, + ) -> Callable[[datacatalog.GetEntryGroupRequest], datacatalog.EntryGroup]: + r"""Return a callable for the get entry group method over gRPC. + + Gets an EntryGroup. + + Returns: + Callable[[~.GetEntryGroupRequest], + ~.EntryGroup]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. 
+ if "get_entry_group" not in self._stubs: + self._stubs["get_entry_group"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.DataCatalog/GetEntryGroup", + request_serializer=datacatalog.GetEntryGroupRequest.serialize, + response_deserializer=datacatalog.EntryGroup.deserialize, + ) + return self._stubs["get_entry_group"] + + @property + def delete_entry_group( + self, + ) -> Callable[[datacatalog.DeleteEntryGroupRequest], empty.Empty]: + r"""Return a callable for the delete entry group method over gRPC. + + Deletes an EntryGroup. Only entry groups that do not contain + entries can be deleted. Users should enable the Data Catalog API + in the project identified by the ``name`` parameter (see [Data + Catalog Resource Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + Returns: + Callable[[~.DeleteEntryGroupRequest], + ~.Empty]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "delete_entry_group" not in self._stubs: + self._stubs["delete_entry_group"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.DataCatalog/DeleteEntryGroup", + request_serializer=datacatalog.DeleteEntryGroupRequest.serialize, + response_deserializer=empty.Empty.FromString, + ) + return self._stubs["delete_entry_group"] + + @property + def list_entry_groups( + self, + ) -> Callable[ + [datacatalog.ListEntryGroupsRequest], datacatalog.ListEntryGroupsResponse + ]: + r"""Return a callable for the list entry groups method over gRPC. + + Lists entry groups. + + Returns: + Callable[[~.ListEntryGroupsRequest], + ~.ListEntryGroupsResponse]: + A function that, when called, will call the underlying RPC + on the server. 
+ """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "list_entry_groups" not in self._stubs: + self._stubs["list_entry_groups"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.DataCatalog/ListEntryGroups", + request_serializer=datacatalog.ListEntryGroupsRequest.serialize, + response_deserializer=datacatalog.ListEntryGroupsResponse.deserialize, + ) + return self._stubs["list_entry_groups"] + + @property + def create_entry( + self, + ) -> Callable[[datacatalog.CreateEntryRequest], datacatalog.Entry]: + r"""Return a callable for the create entry method over gRPC. + + Creates an entry. Only entries of 'FILESET' type or + user-specified type can be created. + + Users should enable the Data Catalog API in the project + identified by the ``parent`` parameter (see [Data Catalog + Resource Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + A maximum of 100,000 entries may be created per entry group. + + Returns: + Callable[[~.CreateEntryRequest], + ~.Entry]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "create_entry" not in self._stubs: + self._stubs["create_entry"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.DataCatalog/CreateEntry", + request_serializer=datacatalog.CreateEntryRequest.serialize, + response_deserializer=datacatalog.Entry.deserialize, + ) + return self._stubs["create_entry"] + + @property + def update_entry( + self, + ) -> Callable[[datacatalog.UpdateEntryRequest], datacatalog.Entry]: + r"""Return a callable for the update entry method over gRPC. 
+ + Updates an existing entry. Users should enable the Data Catalog + API in the project identified by the ``entry.name`` parameter + (see [Data Catalog Resource Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + Returns: + Callable[[~.UpdateEntryRequest], + ~.Entry]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "update_entry" not in self._stubs: + self._stubs["update_entry"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.DataCatalog/UpdateEntry", + request_serializer=datacatalog.UpdateEntryRequest.serialize, + response_deserializer=datacatalog.Entry.deserialize, + ) + return self._stubs["update_entry"] + + @property + def delete_entry(self) -> Callable[[datacatalog.DeleteEntryRequest], empty.Empty]: + r"""Return a callable for the delete entry method over gRPC. + + Deletes an existing entry. Only entries created through + [CreateEntry][google.cloud.datacatalog.v1beta1.DataCatalog.CreateEntry] + method can be deleted. Users should enable the Data Catalog API + in the project identified by the ``name`` parameter (see [Data + Catalog Resource Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + Returns: + Callable[[~.DeleteEntryRequest], + ~.Empty]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. 
+ if "delete_entry" not in self._stubs: + self._stubs["delete_entry"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.DataCatalog/DeleteEntry", + request_serializer=datacatalog.DeleteEntryRequest.serialize, + response_deserializer=empty.Empty.FromString, + ) + return self._stubs["delete_entry"] + + @property + def get_entry(self) -> Callable[[datacatalog.GetEntryRequest], datacatalog.Entry]: + r"""Return a callable for the get entry method over gRPC. + + Gets an entry. + + Returns: + Callable[[~.GetEntryRequest], + ~.Entry]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "get_entry" not in self._stubs: + self._stubs["get_entry"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.DataCatalog/GetEntry", + request_serializer=datacatalog.GetEntryRequest.serialize, + response_deserializer=datacatalog.Entry.deserialize, + ) + return self._stubs["get_entry"] + + @property + def lookup_entry( + self, + ) -> Callable[[datacatalog.LookupEntryRequest], datacatalog.Entry]: + r"""Return a callable for the lookup entry method over gRPC. + + Get an entry by target resource name. This method + allows clients to use the resource name from the source + Google Cloud Platform service to get the Data Catalog + Entry. + + Returns: + Callable[[~.LookupEntryRequest], + ~.Entry]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. 
+ if "lookup_entry" not in self._stubs: + self._stubs["lookup_entry"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.DataCatalog/LookupEntry", + request_serializer=datacatalog.LookupEntryRequest.serialize, + response_deserializer=datacatalog.Entry.deserialize, + ) + return self._stubs["lookup_entry"] + + @property + def list_entries( + self, + ) -> Callable[[datacatalog.ListEntriesRequest], datacatalog.ListEntriesResponse]: + r"""Return a callable for the list entries method over gRPC. + + Lists entries. + + Returns: + Callable[[~.ListEntriesRequest], + ~.ListEntriesResponse]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "list_entries" not in self._stubs: + self._stubs["list_entries"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.DataCatalog/ListEntries", + request_serializer=datacatalog.ListEntriesRequest.serialize, + response_deserializer=datacatalog.ListEntriesResponse.deserialize, + ) + return self._stubs["list_entries"] + + @property + def create_tag_template( + self, + ) -> Callable[[datacatalog.CreateTagTemplateRequest], tags.TagTemplate]: + r"""Return a callable for the create tag template method over gRPC. + + Creates a tag template. The user should enable the Data Catalog + API in the project identified by the ``parent`` parameter (see + `Data Catalog Resource + Project `__ + for more information). + + Returns: + Callable[[~.CreateTagTemplateRequest], + ~.TagTemplate]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. 
+ if "create_tag_template" not in self._stubs: + self._stubs["create_tag_template"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.DataCatalog/CreateTagTemplate", + request_serializer=datacatalog.CreateTagTemplateRequest.serialize, + response_deserializer=tags.TagTemplate.deserialize, + ) + return self._stubs["create_tag_template"] + + @property + def get_tag_template( + self, + ) -> Callable[[datacatalog.GetTagTemplateRequest], tags.TagTemplate]: + r"""Return a callable for the get tag template method over gRPC. + + Gets a tag template. + + Returns: + Callable[[~.GetTagTemplateRequest], + ~.TagTemplate]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "get_tag_template" not in self._stubs: + self._stubs["get_tag_template"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.DataCatalog/GetTagTemplate", + request_serializer=datacatalog.GetTagTemplateRequest.serialize, + response_deserializer=tags.TagTemplate.deserialize, + ) + return self._stubs["get_tag_template"] + + @property + def update_tag_template( + self, + ) -> Callable[[datacatalog.UpdateTagTemplateRequest], tags.TagTemplate]: + r"""Return a callable for the update tag template method over gRPC. + + Updates a tag template. This method cannot be used to update the + fields of a template. The tag template fields are represented as + separate resources and should be updated using their own + create/update/delete methods. Users should enable the Data + Catalog API in the project identified by the + ``tag_template.name`` parameter (see [Data Catalog Resource + Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). 
+ + Returns: + Callable[[~.UpdateTagTemplateRequest], + ~.TagTemplate]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "update_tag_template" not in self._stubs: + self._stubs["update_tag_template"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.DataCatalog/UpdateTagTemplate", + request_serializer=datacatalog.UpdateTagTemplateRequest.serialize, + response_deserializer=tags.TagTemplate.deserialize, + ) + return self._stubs["update_tag_template"] + + @property + def delete_tag_template( + self, + ) -> Callable[[datacatalog.DeleteTagTemplateRequest], empty.Empty]: + r"""Return a callable for the delete tag template method over gRPC. + + Deletes a tag template and all tags using the template. Users + should enable the Data Catalog API in the project identified by + the ``name`` parameter (see [Data Catalog Resource Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + Returns: + Callable[[~.DeleteTagTemplateRequest], + ~.Empty]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. 
+ if "delete_tag_template" not in self._stubs: + self._stubs["delete_tag_template"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.DataCatalog/DeleteTagTemplate", + request_serializer=datacatalog.DeleteTagTemplateRequest.serialize, + response_deserializer=empty.Empty.FromString, + ) + return self._stubs["delete_tag_template"] + + @property + def create_tag_template_field( + self, + ) -> Callable[[datacatalog.CreateTagTemplateFieldRequest], tags.TagTemplateField]: + r"""Return a callable for the create tag template field method over gRPC. + + Creates a field in a tag template. The user should enable the + Data Catalog API in the project identified by the ``parent`` + parameter (see `Data Catalog Resource + Project `__ + for more information). + + Returns: + Callable[[~.CreateTagTemplateFieldRequest], + ~.TagTemplateField]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "create_tag_template_field" not in self._stubs: + self._stubs["create_tag_template_field"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.DataCatalog/CreateTagTemplateField", + request_serializer=datacatalog.CreateTagTemplateFieldRequest.serialize, + response_deserializer=tags.TagTemplateField.deserialize, + ) + return self._stubs["create_tag_template_field"] + + @property + def update_tag_template_field( + self, + ) -> Callable[[datacatalog.UpdateTagTemplateFieldRequest], tags.TagTemplateField]: + r"""Return a callable for the update tag template field method over gRPC. + + Updates a field in a tag template. This method cannot be used to + update the field type. 
Users should enable the Data Catalog API + in the project identified by the ``name`` parameter (see [Data + Catalog Resource Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + Returns: + Callable[[~.UpdateTagTemplateFieldRequest], + ~.TagTemplateField]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "update_tag_template_field" not in self._stubs: + self._stubs["update_tag_template_field"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.DataCatalog/UpdateTagTemplateField", + request_serializer=datacatalog.UpdateTagTemplateFieldRequest.serialize, + response_deserializer=tags.TagTemplateField.deserialize, + ) + return self._stubs["update_tag_template_field"] + + @property + def rename_tag_template_field( + self, + ) -> Callable[[datacatalog.RenameTagTemplateFieldRequest], tags.TagTemplateField]: + r"""Return a callable for the rename tag template field method over gRPC. + + Renames a field in a tag template. The user should enable the + Data Catalog API in the project identified by the ``name`` + parameter (see `Data Catalog Resource + Project `__ + for more information). + + Returns: + Callable[[~.RenameTagTemplateFieldRequest], + ~.TagTemplateField]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. 
+ if "rename_tag_template_field" not in self._stubs: + self._stubs["rename_tag_template_field"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.DataCatalog/RenameTagTemplateField", + request_serializer=datacatalog.RenameTagTemplateFieldRequest.serialize, + response_deserializer=tags.TagTemplateField.deserialize, + ) + return self._stubs["rename_tag_template_field"] + + @property + def delete_tag_template_field( + self, + ) -> Callable[[datacatalog.DeleteTagTemplateFieldRequest], empty.Empty]: + r"""Return a callable for the delete tag template field method over gRPC. + + Deletes a field in a tag template and all uses of that field. + Users should enable the Data Catalog API in the project + identified by the ``name`` parameter (see [Data Catalog Resource + Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + Returns: + Callable[[~.DeleteTagTemplateFieldRequest], + ~.Empty]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "delete_tag_template_field" not in self._stubs: + self._stubs["delete_tag_template_field"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.DataCatalog/DeleteTagTemplateField", + request_serializer=datacatalog.DeleteTagTemplateFieldRequest.serialize, + response_deserializer=empty.Empty.FromString, + ) + return self._stubs["delete_tag_template_field"] + + @property + def create_tag(self) -> Callable[[datacatalog.CreateTagRequest], tags.Tag]: + r"""Return a callable for the create tag method over gRPC. + + Creates a tag on an + [Entry][google.cloud.datacatalog.v1beta1.Entry]. 
Note: The + project identified by the ``parent`` parameter for the + `tag `__ + and the `tag + template `__ + used to create the tag must be from the same organization. + + Returns: + Callable[[~.CreateTagRequest], + ~.Tag]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "create_tag" not in self._stubs: + self._stubs["create_tag"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.DataCatalog/CreateTag", + request_serializer=datacatalog.CreateTagRequest.serialize, + response_deserializer=tags.Tag.deserialize, + ) + return self._stubs["create_tag"] + + @property + def update_tag(self) -> Callable[[datacatalog.UpdateTagRequest], tags.Tag]: + r"""Return a callable for the update tag method over gRPC. + + Updates an existing tag. + + Returns: + Callable[[~.UpdateTagRequest], + ~.Tag]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "update_tag" not in self._stubs: + self._stubs["update_tag"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.DataCatalog/UpdateTag", + request_serializer=datacatalog.UpdateTagRequest.serialize, + response_deserializer=tags.Tag.deserialize, + ) + return self._stubs["update_tag"] + + @property + def delete_tag(self) -> Callable[[datacatalog.DeleteTagRequest], empty.Empty]: + r"""Return a callable for the delete tag method over gRPC. + + Deletes a tag. + + Returns: + Callable[[~.DeleteTagRequest], + ~.Empty]: + A function that, when called, will call the underlying RPC + on the server. 
+ """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "delete_tag" not in self._stubs: + self._stubs["delete_tag"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.DataCatalog/DeleteTag", + request_serializer=datacatalog.DeleteTagRequest.serialize, + response_deserializer=empty.Empty.FromString, + ) + return self._stubs["delete_tag"] + + @property + def list_tags( + self, + ) -> Callable[[datacatalog.ListTagsRequest], datacatalog.ListTagsResponse]: + r"""Return a callable for the list tags method over gRPC. + + Lists the tags on an + [Entry][google.cloud.datacatalog.v1beta1.Entry]. + + Returns: + Callable[[~.ListTagsRequest], + ~.ListTagsResponse]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "list_tags" not in self._stubs: + self._stubs["list_tags"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.DataCatalog/ListTags", + request_serializer=datacatalog.ListTagsRequest.serialize, + response_deserializer=datacatalog.ListTagsResponse.deserialize, + ) + return self._stubs["list_tags"] + + @property + def set_iam_policy( + self, + ) -> Callable[[iam_policy.SetIamPolicyRequest], policy.Policy]: + r"""Return a callable for the set iam policy method over gRPC. + + Sets the access control policy for a resource. Replaces any + existing policy. Supported resources are: + + - Tag templates. + - Entries. + - Entry groups. Note, this method cannot be used to manage + policies for BigQuery, Pub/Sub and any external Google Cloud + Platform resources synced to Data Catalog. 
+ + Callers must have the following Google IAM permissions: + + - ``datacatalog.tagTemplates.setIamPolicy`` to set policies on + tag templates. + - ``datacatalog.entries.setIamPolicy`` to set policies on + entries. + - ``datacatalog.entryGroups.setIamPolicy`` to set policies on + entry groups. + + Returns: + Callable[[~.SetIamPolicyRequest], + ~.Policy]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "set_iam_policy" not in self._stubs: + self._stubs["set_iam_policy"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.DataCatalog/SetIamPolicy", + request_serializer=iam_policy.SetIamPolicyRequest.SerializeToString, + response_deserializer=policy.Policy.FromString, + ) + return self._stubs["set_iam_policy"] + + @property + def get_iam_policy( + self, + ) -> Callable[[iam_policy.GetIamPolicyRequest], policy.Policy]: + r"""Return a callable for the get iam policy method over gRPC. + + Gets the access control policy for a resource. A ``NOT_FOUND`` + error is returned if the resource does not exist. An empty + policy is returned if the resource exists but does not have a + policy set on it. + + Supported resources are: + + - Tag templates. + - Entries. + - Entry groups. Note, this method cannot be used to manage + policies for BigQuery, Pub/Sub and any external Google Cloud + Platform resources synced to Data Catalog. + + Callers must have the following Google IAM permissions: + + - ``datacatalog.tagTemplates.getIamPolicy`` to get policies on + tag templates. + - ``datacatalog.entries.getIamPolicy`` to get policies on + entries. + - ``datacatalog.entryGroups.getIamPolicy`` to get policies on + entry groups.
+ + Returns: + Callable[[~.GetIamPolicyRequest], + ~.Policy]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "get_iam_policy" not in self._stubs: + self._stubs["get_iam_policy"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.DataCatalog/GetIamPolicy", + request_serializer=iam_policy.GetIamPolicyRequest.SerializeToString, + response_deserializer=policy.Policy.FromString, + ) + return self._stubs["get_iam_policy"] + + @property + def test_iam_permissions( + self, + ) -> Callable[ + [iam_policy.TestIamPermissionsRequest], iam_policy.TestIamPermissionsResponse + ]: + r"""Return a callable for the test iam permissions method over gRPC. + + Returns the caller's permissions on a resource. If the resource + does not exist, an empty set of permissions is returned (We + don't return a ``NOT_FOUND`` error). + + Supported resources are: + + - Tag templates. + - Entries. + - Entry groups. Note, this method cannot be used to manage + policies for BigQuery, Pub/Sub and any external Google Cloud + Platform resources synced to Data Catalog. + + A caller is not required to have Google IAM permission to make + this request. + + Returns: + Callable[[~.TestIamPermissionsRequest], + ~.TestIamPermissionsResponse]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. 
+ if "test_iam_permissions" not in self._stubs: + self._stubs["test_iam_permissions"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.DataCatalog/TestIamPermissions", + request_serializer=iam_policy.TestIamPermissionsRequest.SerializeToString, + response_deserializer=iam_policy.TestIamPermissionsResponse.FromString, + ) + return self._stubs["test_iam_permissions"] + + +__all__ = ("DataCatalogGrpcTransport",) diff --git a/google/cloud/datacatalog_v1beta1/services/data_catalog/transports/grpc_asyncio.py b/google/cloud/datacatalog_v1beta1/services/data_catalog/transports/grpc_asyncio.py new file mode 100644 index 00000000..1d7f80fd --- /dev/null +++ b/google/cloud/datacatalog_v1beta1/services/data_catalog/transports/grpc_asyncio.py @@ -0,0 +1,1075 @@ +# -*- coding: utf-8 -*- + +# Copyright 2020 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
+# + +from typing import Awaitable, Callable, Dict, Optional, Sequence, Tuple + +from google.api_core import grpc_helpers_async # type: ignore +from google.auth import credentials # type: ignore +from google.auth.transport.grpc import SslCredentials # type: ignore + +import grpc # type: ignore +from grpc.experimental import aio # type: ignore + +from google.cloud.datacatalog_v1beta1.types import datacatalog +from google.cloud.datacatalog_v1beta1.types import tags +from google.iam.v1 import iam_policy_pb2 as iam_policy # type: ignore +from google.iam.v1 import policy_pb2 as policy # type: ignore +from google.protobuf import empty_pb2 as empty # type: ignore + +from .base import DataCatalogTransport +from .grpc import DataCatalogGrpcTransport + + +class DataCatalogGrpcAsyncIOTransport(DataCatalogTransport): + """gRPC AsyncIO backend transport for DataCatalog. + + Data Catalog API service allows clients to discover, + understand, and manage their data. + + This class defines the same methods as the primary client, so the + primary client can load the underlying transport implementation + and call it. + + It sends protocol buffers over the wire using gRPC (which is built on + top of HTTP/2); the ``grpcio`` package must be installed. + """ + + _grpc_channel: aio.Channel + _stubs: Dict[str, Callable] = {} + + @classmethod + def create_channel( + cls, + host: str = "datacatalog.googleapis.com", + credentials: credentials.Credentials = None, + credentials_file: Optional[str] = None, + scopes: Optional[Sequence[str]] = None, + quota_project_id: Optional[str] = None, + **kwargs, + ) -> aio.Channel: + """Create and return a gRPC AsyncIO channel object. + Args: + host (Optional[str]): The host for the channel to use. + credentials (Optional[~.Credentials]): The + authorization credentials to attach to requests. These + credentials identify this application to the service. If + none are specified, the client will attempt to ascertain + the credentials from the environment.
+ credentials_file (Optional[str]): A file with credentials that can + be loaded with :func:`google.auth.load_credentials_from_file`. + This argument is ignored if ``channel`` is provided. + scopes (Optional[Sequence[str]]): An optional list of scopes needed for this + service. These are only used when credentials are not specified and + are passed to :func:`google.auth.default`. + quota_project_id (Optional[str]): An optional project to use for billing + and quota. + kwargs (Optional[dict]): Keyword arguments, which are passed to the + channel creation. + Returns: + aio.Channel: A gRPC AsyncIO channel object. + """ + scopes = scopes or cls.AUTH_SCOPES + return grpc_helpers_async.create_channel( + host, + credentials=credentials, + credentials_file=credentials_file, + scopes=scopes, + quota_project_id=quota_project_id, + **kwargs, + ) + + def __init__( + self, + *, + host: str = "datacatalog.googleapis.com", + credentials: credentials.Credentials = None, + credentials_file: Optional[str] = None, + scopes: Optional[Sequence[str]] = None, + channel: aio.Channel = None, + api_mtls_endpoint: str = None, + client_cert_source: Callable[[], Tuple[bytes, bytes]] = None, + quota_project_id=None, + ) -> None: + """Instantiate the transport. + + Args: + host (Optional[str]): The hostname to connect to. + credentials (Optional[google.auth.credentials.Credentials]): The + authorization credentials to attach to requests. These + credentials identify the application to the service; if none + are specified, the client will attempt to ascertain the + credentials from the environment. + This argument is ignored if ``channel`` is provided. + credentials_file (Optional[str]): A file with credentials that can + be loaded with :func:`google.auth.load_credentials_from_file`. + This argument is ignored if ``channel`` is provided. + scopes (Optional[Sequence[str]]): An optional list of scopes needed for this + service.
These are only used when credentials are not specified and + are passed to :func:`google.auth.default`. + channel (Optional[aio.Channel]): A ``Channel`` instance through + which to make calls. + api_mtls_endpoint (Optional[str]): The mutual TLS endpoint. If + provided, it overrides the ``host`` argument and tries to create + a mutual TLS channel with client SSL credentials from + ``client_cert_source`` or application default SSL credentials. + client_cert_source (Optional[Callable[[], Tuple[bytes, bytes]]]): A + callback to provide client SSL certificate bytes and private key + bytes, both in PEM format. It is ignored if ``api_mtls_endpoint`` + is None. + quota_project_id (Optional[str]): An optional project to use for billing + and quota. + + Raises: + google.auth.exceptions.MutualTlsChannelError: If mutual TLS transport + creation failed for any reason. + google.api_core.exceptions.DuplicateCredentialArgs: If both ``credentials`` + and ``credentials_file`` are passed. + """ + if channel: + # Sanity check: Ensure that channel and credentials are not both + # provided. + credentials = False + + # If a channel was explicitly provided, set it. + self._grpc_channel = channel + elif api_mtls_endpoint: + host = ( + api_mtls_endpoint + if ":" in api_mtls_endpoint + else api_mtls_endpoint + ":443" + ) + + # Create SSL credentials with client_cert_source or application + # default SSL credentials. + if client_cert_source: + cert, key = client_cert_source() + ssl_credentials = grpc.ssl_channel_credentials( + certificate_chain=cert, private_key=key + ) + else: + ssl_credentials = SslCredentials().ssl_credentials + + # create a new channel. The provided one is ignored. + self._grpc_channel = type(self).create_channel( + host, + credentials=credentials, + credentials_file=credentials_file, + ssl_credentials=ssl_credentials, + scopes=scopes or self.AUTH_SCOPES, + quota_project_id=quota_project_id, + ) + + # Run the base constructor.
+ super().__init__( + host=host, + credentials=credentials, + credentials_file=credentials_file, + scopes=scopes or self.AUTH_SCOPES, + quota_project_id=quota_project_id, + ) + + self._stubs = {} + + @property + def grpc_channel(self) -> aio.Channel: + """Create the channel designed to connect to this service. + + This property caches on the instance; repeated calls return + the same channel. + """ + # Sanity check: Only create a new channel if we do not already + # have one. + if not hasattr(self, "_grpc_channel"): + self._grpc_channel = self.create_channel( + self._host, credentials=self._credentials, + ) + + # Return the channel from cache. + return self._grpc_channel + + @property + def search_catalog( + self, + ) -> Callable[ + [datacatalog.SearchCatalogRequest], Awaitable[datacatalog.SearchCatalogResponse] + ]: + r"""Return a callable for the search catalog method over gRPC. + + Searches Data Catalog for multiple resources, such as entries + and tags, that match a query. + + This is a custom method + (https://cloud.google.com/apis/design/custom_methods) and does + not return the complete resource, only the resource identifier + and high level fields. Clients can subsequently call ``Get`` + methods. + + Note that Data Catalog search queries do not guarantee full + recall. Query results that match your query may not be returned, + even in subsequent result pages. Also note that results returned + (and not returned) can vary across repeated search queries. + + See `Data Catalog Search + Syntax `__ + for more information. + + Returns: + Callable[[~.SearchCatalogRequest], + Awaitable[~.SearchCatalogResponse]]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each.
+ if "search_catalog" not in self._stubs: + self._stubs["search_catalog"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.DataCatalog/SearchCatalog", + request_serializer=datacatalog.SearchCatalogRequest.serialize, + response_deserializer=datacatalog.SearchCatalogResponse.deserialize, + ) + return self._stubs["search_catalog"] + + @property + def create_entry_group( + self, + ) -> Callable[ + [datacatalog.CreateEntryGroupRequest], Awaitable[datacatalog.EntryGroup] + ]: + r"""Return a callable for the create entry group method over gRPC. + + A maximum of 10,000 entry groups may be created per organization + across all locations. + + Users should enable the Data Catalog API in the project + identified by the ``parent`` parameter (see [Data Catalog + Resource Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + Returns: + Callable[[~.CreateEntryGroupRequest], + Awaitable[~.EntryGroup]]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "create_entry_group" not in self._stubs: + self._stubs["create_entry_group"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.DataCatalog/CreateEntryGroup", + request_serializer=datacatalog.CreateEntryGroupRequest.serialize, + response_deserializer=datacatalog.EntryGroup.deserialize, + ) + return self._stubs["create_entry_group"] + + @property + def update_entry_group( + self, + ) -> Callable[ + [datacatalog.UpdateEntryGroupRequest], Awaitable[datacatalog.EntryGroup] + ]: + r"""Return a callable for the update entry group method over gRPC. + + Updates an EntryGroup. 
The user should enable the Data Catalog + API in the project identified by the ``entry_group.name`` + parameter (see [Data Catalog Resource Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + Returns: + Callable[[~.UpdateEntryGroupRequest], + Awaitable[~.EntryGroup]]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "update_entry_group" not in self._stubs: + self._stubs["update_entry_group"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.DataCatalog/UpdateEntryGroup", + request_serializer=datacatalog.UpdateEntryGroupRequest.serialize, + response_deserializer=datacatalog.EntryGroup.deserialize, + ) + return self._stubs["update_entry_group"] + + @property + def get_entry_group( + self, + ) -> Callable[ + [datacatalog.GetEntryGroupRequest], Awaitable[datacatalog.EntryGroup] + ]: + r"""Return a callable for the get entry group method over gRPC. + + Gets an EntryGroup. + + Returns: + Callable[[~.GetEntryGroupRequest], + Awaitable[~.EntryGroup]]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. 
+ if "get_entry_group" not in self._stubs: + self._stubs["get_entry_group"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.DataCatalog/GetEntryGroup", + request_serializer=datacatalog.GetEntryGroupRequest.serialize, + response_deserializer=datacatalog.EntryGroup.deserialize, + ) + return self._stubs["get_entry_group"] + + @property + def delete_entry_group( + self, + ) -> Callable[[datacatalog.DeleteEntryGroupRequest], Awaitable[empty.Empty]]: + r"""Return a callable for the delete entry group method over gRPC. + + Deletes an EntryGroup. Only entry groups that do not contain + entries can be deleted. Users should enable the Data Catalog API + in the project identified by the ``name`` parameter (see [Data + Catalog Resource Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + Returns: + Callable[[~.DeleteEntryGroupRequest], + Awaitable[~.Empty]]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "delete_entry_group" not in self._stubs: + self._stubs["delete_entry_group"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.DataCatalog/DeleteEntryGroup", + request_serializer=datacatalog.DeleteEntryGroupRequest.serialize, + response_deserializer=empty.Empty.FromString, + ) + return self._stubs["delete_entry_group"] + + @property + def list_entry_groups( + self, + ) -> Callable[ + [datacatalog.ListEntryGroupsRequest], + Awaitable[datacatalog.ListEntryGroupsResponse], + ]: + r"""Return a callable for the list entry groups method over gRPC. + + Lists entry groups. + + Returns: + Callable[[~.ListEntryGroupsRequest], + Awaitable[~.ListEntryGroupsResponse]]: + A function that, when called, will call the underlying RPC + on the server. 
+ """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "list_entry_groups" not in self._stubs: + self._stubs["list_entry_groups"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.DataCatalog/ListEntryGroups", + request_serializer=datacatalog.ListEntryGroupsRequest.serialize, + response_deserializer=datacatalog.ListEntryGroupsResponse.deserialize, + ) + return self._stubs["list_entry_groups"] + + @property + def create_entry( + self, + ) -> Callable[[datacatalog.CreateEntryRequest], Awaitable[datacatalog.Entry]]: + r"""Return a callable for the create entry method over gRPC. + + Creates an entry. Only entries of 'FILESET' type or + user-specified type can be created. + + Users should enable the Data Catalog API in the project + identified by the ``parent`` parameter (see [Data Catalog + Resource Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + A maximum of 100,000 entries may be created per entry group. + + Returns: + Callable[[~.CreateEntryRequest], + Awaitable[~.Entry]]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. 
+ if "create_entry" not in self._stubs: + self._stubs["create_entry"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.DataCatalog/CreateEntry", + request_serializer=datacatalog.CreateEntryRequest.serialize, + response_deserializer=datacatalog.Entry.deserialize, + ) + return self._stubs["create_entry"] + + @property + def update_entry( + self, + ) -> Callable[[datacatalog.UpdateEntryRequest], Awaitable[datacatalog.Entry]]: + r"""Return a callable for the update entry method over gRPC. + + Updates an existing entry. Users should enable the Data Catalog + API in the project identified by the ``entry.name`` parameter + (see [Data Catalog Resource Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + Returns: + Callable[[~.UpdateEntryRequest], + Awaitable[~.Entry]]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "update_entry" not in self._stubs: + self._stubs["update_entry"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.DataCatalog/UpdateEntry", + request_serializer=datacatalog.UpdateEntryRequest.serialize, + response_deserializer=datacatalog.Entry.deserialize, + ) + return self._stubs["update_entry"] + + @property + def delete_entry( + self, + ) -> Callable[[datacatalog.DeleteEntryRequest], Awaitable[empty.Empty]]: + r"""Return a callable for the delete entry method over gRPC. + + Deletes an existing entry. Only entries created through + [CreateEntry][google.cloud.datacatalog.v1beta1.DataCatalog.CreateEntry] + method can be deleted. 
Users should enable the Data Catalog API + in the project identified by the ``name`` parameter (see [Data + Catalog Resource Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + Returns: + Callable[[~.DeleteEntryRequest], + Awaitable[~.Empty]]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "delete_entry" not in self._stubs: + self._stubs["delete_entry"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.DataCatalog/DeleteEntry", + request_serializer=datacatalog.DeleteEntryRequest.serialize, + response_deserializer=empty.Empty.FromString, + ) + return self._stubs["delete_entry"] + + @property + def get_entry( + self, + ) -> Callable[[datacatalog.GetEntryRequest], Awaitable[datacatalog.Entry]]: + r"""Return a callable for the get entry method over gRPC. + + Gets an entry. + + Returns: + Callable[[~.GetEntryRequest], + Awaitable[~.Entry]]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "get_entry" not in self._stubs: + self._stubs["get_entry"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.DataCatalog/GetEntry", + request_serializer=datacatalog.GetEntryRequest.serialize, + response_deserializer=datacatalog.Entry.deserialize, + ) + return self._stubs["get_entry"] + + @property + def lookup_entry( + self, + ) -> Callable[[datacatalog.LookupEntryRequest], Awaitable[datacatalog.Entry]]: + r"""Return a callable for the lookup entry method over gRPC. + + Get an entry by target resource name. 
This method + allows clients to use the resource name from the source + Google Cloud Platform service to get the Data Catalog + Entry. + + Returns: + Callable[[~.LookupEntryRequest], + Awaitable[~.Entry]]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "lookup_entry" not in self._stubs: + self._stubs["lookup_entry"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.DataCatalog/LookupEntry", + request_serializer=datacatalog.LookupEntryRequest.serialize, + response_deserializer=datacatalog.Entry.deserialize, + ) + return self._stubs["lookup_entry"] + + @property + def list_entries( + self, + ) -> Callable[ + [datacatalog.ListEntriesRequest], Awaitable[datacatalog.ListEntriesResponse] + ]: + r"""Return a callable for the list entries method over gRPC. + + Lists entries. + + Returns: + Callable[[~.ListEntriesRequest], + Awaitable[~.ListEntriesResponse]]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "list_entries" not in self._stubs: + self._stubs["list_entries"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.DataCatalog/ListEntries", + request_serializer=datacatalog.ListEntriesRequest.serialize, + response_deserializer=datacatalog.ListEntriesResponse.deserialize, + ) + return self._stubs["list_entries"] + + @property + def create_tag_template( + self, + ) -> Callable[[datacatalog.CreateTagTemplateRequest], Awaitable[tags.TagTemplate]]: + r"""Return a callable for the create tag template method over gRPC. + + Creates a tag template. 
The user should enable the Data Catalog + API in the project identified by the ``parent`` parameter (see + `Data Catalog Resource + Project `__ + for more information). + + Returns: + Callable[[~.CreateTagTemplateRequest], + Awaitable[~.TagTemplate]]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "create_tag_template" not in self._stubs: + self._stubs["create_tag_template"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.DataCatalog/CreateTagTemplate", + request_serializer=datacatalog.CreateTagTemplateRequest.serialize, + response_deserializer=tags.TagTemplate.deserialize, + ) + return self._stubs["create_tag_template"] + + @property + def get_tag_template( + self, + ) -> Callable[[datacatalog.GetTagTemplateRequest], Awaitable[tags.TagTemplate]]: + r"""Return a callable for the get tag template method over gRPC. + + Gets a tag template. + + Returns: + Callable[[~.GetTagTemplateRequest], + Awaitable[~.TagTemplate]]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. 
+ if "get_tag_template" not in self._stubs: + self._stubs["get_tag_template"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.DataCatalog/GetTagTemplate", + request_serializer=datacatalog.GetTagTemplateRequest.serialize, + response_deserializer=tags.TagTemplate.deserialize, + ) + return self._stubs["get_tag_template"] + + @property + def update_tag_template( + self, + ) -> Callable[[datacatalog.UpdateTagTemplateRequest], Awaitable[tags.TagTemplate]]: + r"""Return a callable for the update tag template method over gRPC. + + Updates a tag template. This method cannot be used to update the + fields of a template. The tag template fields are represented as + separate resources and should be updated using their own + create/update/delete methods. Users should enable the Data + Catalog API in the project identified by the + ``tag_template.name`` parameter (see [Data Catalog Resource + Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + Returns: + Callable[[~.UpdateTagTemplateRequest], + Awaitable[~.TagTemplate]]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "update_tag_template" not in self._stubs: + self._stubs["update_tag_template"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.DataCatalog/UpdateTagTemplate", + request_serializer=datacatalog.UpdateTagTemplateRequest.serialize, + response_deserializer=tags.TagTemplate.deserialize, + ) + return self._stubs["update_tag_template"] + + @property + def delete_tag_template( + self, + ) -> Callable[[datacatalog.DeleteTagTemplateRequest], Awaitable[empty.Empty]]: + r"""Return a callable for the delete tag template method over gRPC. + + Deletes a tag template and all tags using the template. 
Users + should enable the Data Catalog API in the project identified by + the ``name`` parameter (see [Data Catalog Resource Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + Returns: + Callable[[~.DeleteTagTemplateRequest], + Awaitable[~.Empty]]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "delete_tag_template" not in self._stubs: + self._stubs["delete_tag_template"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.DataCatalog/DeleteTagTemplate", + request_serializer=datacatalog.DeleteTagTemplateRequest.serialize, + response_deserializer=empty.Empty.FromString, + ) + return self._stubs["delete_tag_template"] + + @property + def create_tag_template_field( + self, + ) -> Callable[ + [datacatalog.CreateTagTemplateFieldRequest], Awaitable[tags.TagTemplateField] + ]: + r"""Return a callable for the create tag template field method over gRPC. + + Creates a field in a tag template. The user should enable the + Data Catalog API in the project identified by the ``parent`` + parameter (see `Data Catalog Resource + Project `__ + for more information). + + Returns: + Callable[[~.CreateTagTemplateFieldRequest], + Awaitable[~.TagTemplateField]]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. 
+ if "create_tag_template_field" not in self._stubs: + self._stubs["create_tag_template_field"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.DataCatalog/CreateTagTemplateField", + request_serializer=datacatalog.CreateTagTemplateFieldRequest.serialize, + response_deserializer=tags.TagTemplateField.deserialize, + ) + return self._stubs["create_tag_template_field"] + + @property + def update_tag_template_field( + self, + ) -> Callable[ + [datacatalog.UpdateTagTemplateFieldRequest], Awaitable[tags.TagTemplateField] + ]: + r"""Return a callable for the update tag template field method over gRPC. + + Updates a field in a tag template. This method cannot be used to + update the field type. Users should enable the Data Catalog API + in the project identified by the ``name`` parameter (see [Data + Catalog Resource Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + Returns: + Callable[[~.UpdateTagTemplateFieldRequest], + Awaitable[~.TagTemplateField]]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "update_tag_template_field" not in self._stubs: + self._stubs["update_tag_template_field"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.DataCatalog/UpdateTagTemplateField", + request_serializer=datacatalog.UpdateTagTemplateFieldRequest.serialize, + response_deserializer=tags.TagTemplateField.deserialize, + ) + return self._stubs["update_tag_template_field"] + + @property + def rename_tag_template_field( + self, + ) -> Callable[ + [datacatalog.RenameTagTemplateFieldRequest], Awaitable[tags.TagTemplateField] + ]: + r"""Return a callable for the rename tag template field method over gRPC. + + Renames a field in a tag template. 
The user should enable the + Data Catalog API in the project identified by the ``name`` + parameter (see `Data Catalog Resource + Project `__ + for more information). + + Returns: + Callable[[~.RenameTagTemplateFieldRequest], + Awaitable[~.TagTemplateField]]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "rename_tag_template_field" not in self._stubs: + self._stubs["rename_tag_template_field"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.DataCatalog/RenameTagTemplateField", + request_serializer=datacatalog.RenameTagTemplateFieldRequest.serialize, + response_deserializer=tags.TagTemplateField.deserialize, + ) + return self._stubs["rename_tag_template_field"] + + @property + def delete_tag_template_field( + self, + ) -> Callable[[datacatalog.DeleteTagTemplateFieldRequest], Awaitable[empty.Empty]]: + r"""Return a callable for the delete tag template field method over gRPC. + + Deletes a field in a tag template and all uses of that field. + Users should enable the Data Catalog API in the project + identified by the ``name`` parameter (see [Data Catalog Resource + Project] + (https://cloud.google.com/data-catalog/docs/concepts/resource-project) + for more information). + + Returns: + Callable[[~.DeleteTagTemplateFieldRequest], + Awaitable[~.Empty]]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. 
+ if "delete_tag_template_field" not in self._stubs: + self._stubs["delete_tag_template_field"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.DataCatalog/DeleteTagTemplateField", + request_serializer=datacatalog.DeleteTagTemplateFieldRequest.serialize, + response_deserializer=empty.Empty.FromString, + ) + return self._stubs["delete_tag_template_field"] + + @property + def create_tag( + self, + ) -> Callable[[datacatalog.CreateTagRequest], Awaitable[tags.Tag]]: + r"""Return a callable for the create tag method over gRPC. + + Creates a tag on an + [Entry][google.cloud.datacatalog.v1beta1.Entry]. Note: The + project identified by the ``parent`` parameter for the + `tag `__ + and the `tag + template `__ + used to create the tag must be from the same organization. + + Returns: + Callable[[~.CreateTagRequest], + Awaitable[~.Tag]]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "create_tag" not in self._stubs: + self._stubs["create_tag"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.DataCatalog/CreateTag", + request_serializer=datacatalog.CreateTagRequest.serialize, + response_deserializer=tags.Tag.deserialize, + ) + return self._stubs["create_tag"] + + @property + def update_tag( + self, + ) -> Callable[[datacatalog.UpdateTagRequest], Awaitable[tags.Tag]]: + r"""Return a callable for the update tag method over gRPC. + + Updates an existing tag. + + Returns: + Callable[[~.UpdateTagRequest], + Awaitable[~.Tag]]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. 
+ if "update_tag" not in self._stubs: + self._stubs["update_tag"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.DataCatalog/UpdateTag", + request_serializer=datacatalog.UpdateTagRequest.serialize, + response_deserializer=tags.Tag.deserialize, + ) + return self._stubs["update_tag"] + + @property + def delete_tag( + self, + ) -> Callable[[datacatalog.DeleteTagRequest], Awaitable[empty.Empty]]: + r"""Return a callable for the delete tag method over gRPC. + + Deletes a tag. + + Returns: + Callable[[~.DeleteTagRequest], + Awaitable[~.Empty]]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "delete_tag" not in self._stubs: + self._stubs["delete_tag"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.DataCatalog/DeleteTag", + request_serializer=datacatalog.DeleteTagRequest.serialize, + response_deserializer=empty.Empty.FromString, + ) + return self._stubs["delete_tag"] + + @property + def list_tags( + self, + ) -> Callable[ + [datacatalog.ListTagsRequest], Awaitable[datacatalog.ListTagsResponse] + ]: + r"""Return a callable for the list tags method over gRPC. + + Lists the tags on an + [Entry][google.cloud.datacatalog.v1beta1.Entry]. + + Returns: + Callable[[~.ListTagsRequest], + Awaitable[~.ListTagsResponse]]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. 
+ if "list_tags" not in self._stubs: + self._stubs["list_tags"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.DataCatalog/ListTags", + request_serializer=datacatalog.ListTagsRequest.serialize, + response_deserializer=datacatalog.ListTagsResponse.deserialize, + ) + return self._stubs["list_tags"] + + @property + def set_iam_policy( + self, + ) -> Callable[[iam_policy.SetIamPolicyRequest], Awaitable[policy.Policy]]: + r"""Return a callable for the set iam policy method over gRPC. + + Sets the access control policy for a resource. Replaces any + existing policy. Supported resources are: + + - Tag templates. + - Entries. + - Entry groups. Note, this method cannot be used to manage + policies for BigQuery, Pub/Sub and any external Google Cloud + Platform resources synced to Data Catalog. + + Callers must have following Google IAM permission + + - ``datacatalog.tagTemplates.setIamPolicy`` to set policies on + tag templates. + - ``datacatalog.entries.setIamPolicy`` to set policies on + entries. + - ``datacatalog.entryGroups.setIamPolicy`` to set policies on + entry groups. + + Returns: + Callable[[~.SetIamPolicyRequest], + Awaitable[~.Policy]]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "set_iam_policy" not in self._stubs: + self._stubs["set_iam_policy"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.DataCatalog/SetIamPolicy", + request_serializer=iam_policy.SetIamPolicyRequest.SerializeToString, + response_deserializer=policy.Policy.FromString, + ) + return self._stubs["set_iam_policy"] + + @property + def get_iam_policy( + self, + ) -> Callable[[iam_policy.GetIamPolicyRequest], Awaitable[policy.Policy]]: + r"""Return a callable for the get iam policy method over gRPC. 
+
+        Gets the access control policy for a resource. A ``NOT_FOUND``
+        error is returned if the resource does not exist. An empty
+        policy is returned if the resource exists but does not have a
+        policy set on it.
+
+        Supported resources are:
+
+        -  Tag templates.
+        -  Entries.
+        -  Entry groups. Note, this method cannot be used to manage
+           policies for BigQuery, Pub/Sub and any external Google Cloud
+           Platform resources synced to Data Catalog.
+
+        Callers must have the following Google IAM permissions:
+
+        -  ``datacatalog.tagTemplates.getIamPolicy`` to get policies on
+           tag templates.
+        -  ``datacatalog.entries.getIamPolicy`` to get policies on
+           entries.
+        -  ``datacatalog.entryGroups.getIamPolicy`` to get policies on
+           entry groups.
+
+        Returns:
+            Callable[[~.GetIamPolicyRequest],
+                    Awaitable[~.Policy]]:
+                A function that, when called, will call the underlying RPC
+                on the server.
+        """
+        # Generate a "stub function" on-the-fly which will actually make
+        # the request.
+        # gRPC handles serialization and deserialization, so we just need
+        # to pass in the functions for each.
+        if "get_iam_policy" not in self._stubs:
+            self._stubs["get_iam_policy"] = self.grpc_channel.unary_unary(
+                "/google.cloud.datacatalog.v1beta1.DataCatalog/GetIamPolicy",
+                request_serializer=iam_policy.GetIamPolicyRequest.SerializeToString,
+                response_deserializer=policy.Policy.FromString,
+            )
+        return self._stubs["get_iam_policy"]
+
+    @property
+    def test_iam_permissions(
+        self,
+    ) -> Callable[
+        [iam_policy.TestIamPermissionsRequest],
+        Awaitable[iam_policy.TestIamPermissionsResponse],
+    ]:
+        r"""Return a callable for the test iam permissions method over gRPC.
+
+        Returns the caller's permissions on a resource. If the resource
+        does not exist, an empty set of permissions is returned (we
+        don't return a ``NOT_FOUND`` error).
+
+        Supported resources are:
+
+        -  Tag templates.
+        -  Entries.
+        -  Entry groups.
Note, this method cannot be used to manage + policies for BigQuery, Pub/Sub and any external Google Cloud + Platform resources synced to Data Catalog. + + A caller is not required to have Google IAM permission to make + this request. + + Returns: + Callable[[~.TestIamPermissionsRequest], + Awaitable[~.TestIamPermissionsResponse]]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "test_iam_permissions" not in self._stubs: + self._stubs["test_iam_permissions"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.DataCatalog/TestIamPermissions", + request_serializer=iam_policy.TestIamPermissionsRequest.SerializeToString, + response_deserializer=iam_policy.TestIamPermissionsResponse.FromString, + ) + return self._stubs["test_iam_permissions"] + + +__all__ = ("DataCatalogGrpcAsyncIOTransport",) diff --git a/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/__init__.py b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/__init__.py new file mode 100644 index 00000000..8abc6009 --- /dev/null +++ b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/__init__.py @@ -0,0 +1,24 @@ +# -*- coding: utf-8 -*- + +# Copyright 2020 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
+# + +from .client import PolicyTagManagerClient +from .async_client import PolicyTagManagerAsyncClient + +__all__ = ( + "PolicyTagManagerClient", + "PolicyTagManagerAsyncClient", +) diff --git a/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/async_client.py b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/async_client.py new file mode 100644 index 00000000..de2eaeea --- /dev/null +++ b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/async_client.py @@ -0,0 +1,1177 @@ +# -*- coding: utf-8 -*- + +# Copyright 2020 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
+# + +from collections import OrderedDict +import functools +import re +from typing import Dict, Sequence, Tuple, Type, Union +import pkg_resources + +import google.api_core.client_options as ClientOptions # type: ignore +from google.api_core import exceptions # type: ignore +from google.api_core import gapic_v1 # type: ignore +from google.api_core import retry as retries # type: ignore +from google.auth import credentials # type: ignore +from google.oauth2 import service_account # type: ignore + +from google.cloud.datacatalog_v1beta1.services.policy_tag_manager import pagers +from google.cloud.datacatalog_v1beta1.types import policytagmanager +from google.iam.v1 import iam_policy_pb2 as iam_policy # type: ignore +from google.iam.v1 import policy_pb2 as policy # type: ignore + +from .transports.base import PolicyTagManagerTransport +from .transports.grpc_asyncio import PolicyTagManagerGrpcAsyncIOTransport +from .client import PolicyTagManagerClient + + +class PolicyTagManagerAsyncClient: + """The policy tag manager API service allows clients to manage + their taxonomies and policy tags. + """ + + _client: PolicyTagManagerClient + + DEFAULT_ENDPOINT = PolicyTagManagerClient.DEFAULT_ENDPOINT + DEFAULT_MTLS_ENDPOINT = PolicyTagManagerClient.DEFAULT_MTLS_ENDPOINT + + taxonomy_path = staticmethod(PolicyTagManagerClient.taxonomy_path) + + policy_tag_path = staticmethod(PolicyTagManagerClient.policy_tag_path) + + from_service_account_file = PolicyTagManagerClient.from_service_account_file + from_service_account_json = from_service_account_file + + get_transport_class = functools.partial( + type(PolicyTagManagerClient).get_transport_class, type(PolicyTagManagerClient) + ) + + def __init__( + self, + *, + credentials: credentials.Credentials = None, + transport: Union[str, PolicyTagManagerTransport] = "grpc_asyncio", + client_options: ClientOptions = None, + ) -> None: + """Instantiate the policy tag manager client. 
+
+        Args:
+            credentials (Optional[google.auth.credentials.Credentials]): The
+                authorization credentials to attach to requests. These
+                credentials identify the application to the service; if none
+                are specified, the client will attempt to ascertain the
+                credentials from the environment.
+            transport (Union[str, ~.PolicyTagManagerTransport]): The
+                transport to use. If set to None, a transport is chosen
+                automatically.
+            client_options (ClientOptions): Custom options for the client. It
+                won't take effect if a ``transport`` instance is provided.
+                (1) The ``api_endpoint`` property can be used to override the
+                default endpoint provided by the client. The GOOGLE_API_USE_MTLS
+                environment variable can also be used to override the endpoint:
+                "always" (always use the default mTLS endpoint), "never" (always
+                use the default regular endpoint; this is the default value for
+                the environment variable) and "auto" (auto switch to the default
+                mTLS endpoint if client SSL credentials are present). However,
+                the ``api_endpoint`` property takes precedence if provided.
+                (2) The ``client_cert_source`` property is used to provide client
+                SSL credentials for mutual TLS transport. If not provided, the
+                default SSL credentials will be used if present.
+
+        Raises:
+            google.auth.exceptions.MutualTlsChannelError: If mutual TLS transport
+                creation failed for any reason.
+        """
+
+        self._client = PolicyTagManagerClient(
+            credentials=credentials, transport=transport, client_options=client_options,
+        )
+
+    async def create_taxonomy(
+        self,
+        request: policytagmanager.CreateTaxonomyRequest = None,
+        *,
+        parent: str = None,
+        taxonomy: policytagmanager.Taxonomy = None,
+        retry: retries.Retry = gapic_v1.method.DEFAULT,
+        timeout: float = None,
+        metadata: Sequence[Tuple[str, str]] = (),
+    ) -> policytagmanager.Taxonomy:
+        r"""Creates a taxonomy in the specified project.
+
+        Args:
+            request (:class:`~.policytagmanager.CreateTaxonomyRequest`):
+                The request object.
Request message for + [CreateTaxonomy][google.cloud.datacatalog.v1beta1.PolicyTagManager.CreateTaxonomy]. + parent (:class:`str`): + Required. Resource name of the + project that the taxonomy will belong + to. + This corresponds to the ``parent`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + taxonomy (:class:`~.policytagmanager.Taxonomy`): + The taxonomy to be created. + This corresponds to the ``taxonomy`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.policytagmanager.Taxonomy: + A taxonomy is a collection of policy tags that classify + data along a common axis. For instance a data + *sensitivity* taxonomy could contain policy tags + denoting PII such as age, zipcode, and SSN. A data + *origin* taxonomy could contain policy tags to + distinguish user data, employee data, partner data, + public data. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + if request is not None and any([parent, taxonomy]): + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + request = policytagmanager.CreateTaxonomyRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if parent is not None: + request.parent = parent + if taxonomy is not None: + request.taxonomy = taxonomy + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. 
+ rpc = gapic_v1.method_async.wrap_method( + self._client._transport.create_taxonomy, + default_timeout=None, + client_info=_client_info, + ) + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)), + ) + + # Send the request. + response = await rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. + return response + + async def delete_taxonomy( + self, + request: policytagmanager.DeleteTaxonomyRequest = None, + *, + name: str = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> None: + r"""Deletes a taxonomy. This operation will also delete + all policy tags in this taxonomy along with their + associated policies. + + Args: + request (:class:`~.policytagmanager.DeleteTaxonomyRequest`): + The request object. Request message for + [DeleteTaxonomy][google.cloud.datacatalog.v1beta1.PolicyTagManager.DeleteTaxonomy]. + name (:class:`str`): + Required. Resource name of the + taxonomy to be deleted. All policy tags + in this taxonomy will also be deleted. + This corresponds to the ``name`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + if request is not None and any([name]): + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." 
+ ) + + request = policytagmanager.DeleteTaxonomyRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if name is not None: + request.name = name + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = gapic_v1.method_async.wrap_method( + self._client._transport.delete_taxonomy, + default_timeout=None, + client_info=_client_info, + ) + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)), + ) + + # Send the request. + await rpc( + request, retry=retry, timeout=timeout, metadata=metadata, + ) + + async def update_taxonomy( + self, + request: policytagmanager.UpdateTaxonomyRequest = None, + *, + taxonomy: policytagmanager.Taxonomy = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> policytagmanager.Taxonomy: + r"""Updates a taxonomy. + + Args: + request (:class:`~.policytagmanager.UpdateTaxonomyRequest`): + The request object. Request message for + [UpdateTaxonomy][google.cloud.datacatalog.v1beta1.PolicyTagManager.UpdateTaxonomy]. + taxonomy (:class:`~.policytagmanager.Taxonomy`): + The taxonomy to update. Only description, display_name, + and activated policy types can be updated. + This corresponds to the ``taxonomy`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.policytagmanager.Taxonomy: + A taxonomy is a collection of policy tags that classify + data along a common axis. 
For instance a data + *sensitivity* taxonomy could contain policy tags + denoting PII such as age, zipcode, and SSN. A data + *origin* taxonomy could contain policy tags to + distinguish user data, employee data, partner data, + public data. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + if request is not None and any([taxonomy]): + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + request = policytagmanager.UpdateTaxonomyRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if taxonomy is not None: + request.taxonomy = taxonomy + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = gapic_v1.method_async.wrap_method( + self._client._transport.update_taxonomy, + default_timeout=None, + client_info=_client_info, + ) + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata( + (("taxonomy.name", request.taxonomy.name),) + ), + ) + + # Send the request. + response = await rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. + return response + + async def list_taxonomies( + self, + request: policytagmanager.ListTaxonomiesRequest = None, + *, + parent: str = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> pagers.ListTaxonomiesAsyncPager: + r"""Lists all taxonomies in a project in a particular + location that the caller has permission to view. + + Args: + request (:class:`~.policytagmanager.ListTaxonomiesRequest`): + The request object. 
Request message for + [ListTaxonomies][google.cloud.datacatalog.v1beta1.PolicyTagManager.ListTaxonomies]. + parent (:class:`str`): + Required. Resource name of the + project to list the taxonomies of. + This corresponds to the ``parent`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.pagers.ListTaxonomiesAsyncPager: + Response message for + [ListTaxonomies][google.cloud.datacatalog.v1beta1.PolicyTagManager.ListTaxonomies]. + + Iterating over this object will yield results and + resolve additional pages automatically. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + if request is not None and any([parent]): + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + request = policytagmanager.ListTaxonomiesRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if parent is not None: + request.parent = parent + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = gapic_v1.method_async.wrap_method( + self._client._transport.list_taxonomies, + default_timeout=None, + client_info=_client_info, + ) + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)), + ) + + # Send the request. 
+ response = await rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # This method is paged; wrap the response in a pager, which provides + # an `__aiter__` convenience method. + response = pagers.ListTaxonomiesAsyncPager( + method=rpc, request=request, response=response, metadata=metadata, + ) + + # Done; return the response. + return response + + async def get_taxonomy( + self, + request: policytagmanager.GetTaxonomyRequest = None, + *, + name: str = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> policytagmanager.Taxonomy: + r"""Gets a taxonomy. + + Args: + request (:class:`~.policytagmanager.GetTaxonomyRequest`): + The request object. Request message for + [GetTaxonomy][google.cloud.datacatalog.v1beta1.PolicyTagManager.GetTaxonomy]. + name (:class:`str`): + Required. Resource name of the + requested taxonomy. + This corresponds to the ``name`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.policytagmanager.Taxonomy: + A taxonomy is a collection of policy tags that classify + data along a common axis. For instance a data + *sensitivity* taxonomy could contain policy tags + denoting PII such as age, zipcode, and SSN. A data + *origin* taxonomy could contain policy tags to + distinguish user data, employee data, partner data, + public data. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. 
+ if request is not None and any([name]): + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + request = policytagmanager.GetTaxonomyRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if name is not None: + request.name = name + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = gapic_v1.method_async.wrap_method( + self._client._transport.get_taxonomy, + default_timeout=None, + client_info=_client_info, + ) + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)), + ) + + # Send the request. + response = await rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. + return response + + async def create_policy_tag( + self, + request: policytagmanager.CreatePolicyTagRequest = None, + *, + parent: str = None, + policy_tag: policytagmanager.PolicyTag = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> policytagmanager.PolicyTag: + r"""Creates a policy tag in the specified taxonomy. + + Args: + request (:class:`~.policytagmanager.CreatePolicyTagRequest`): + The request object. Request message for + [CreatePolicyTag][google.cloud.datacatalog.v1beta1.PolicyTagManager.CreatePolicyTag]. + parent (:class:`str`): + Required. Resource name of the + taxonomy that the policy tag will belong + to. + This corresponds to the ``parent`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + policy_tag (:class:`~.policytagmanager.PolicyTag`): + The policy tag to be created. + This corresponds to the ``policy_tag`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. 
+ + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.policytagmanager.PolicyTag: + Denotes one policy tag in a taxonomy + (e.g. ssn). Policy Tags can be defined + in a hierarchy. For example, consider + the following hierarchy: Geolocation + -> (LatLong, City, ZipCode). + PolicyTag "Geolocation" contains three + child policy tags: "LatLong", "City", + and "ZipCode". + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + if request is not None and any([parent, policy_tag]): + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + request = policytagmanager.CreatePolicyTagRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if parent is not None: + request.parent = parent + if policy_tag is not None: + request.policy_tag = policy_tag + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = gapic_v1.method_async.wrap_method( + self._client._transport.create_policy_tag, + default_timeout=None, + client_info=_client_info, + ) + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)), + ) + + # Send the request. + response = await rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. 
+ return response + + async def delete_policy_tag( + self, + request: policytagmanager.DeletePolicyTagRequest = None, + *, + name: str = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> None: + r"""Deletes a policy tag. Also deletes all of its + descendant policy tags. + + Args: + request (:class:`~.policytagmanager.DeletePolicyTagRequest`): + The request object. Request message for + [DeletePolicyTag][google.cloud.datacatalog.v1beta1.PolicyTagManager.DeletePolicyTag]. + name (:class:`str`): + Required. Resource name of the policy + tag to be deleted. All of its descendant + policy tags will also be deleted. + This corresponds to the ``name`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + if request is not None and any([name]): + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + request = policytagmanager.DeletePolicyTagRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if name is not None: + request.name = name + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = gapic_v1.method_async.wrap_method( + self._client._transport.delete_policy_tag, + default_timeout=None, + client_info=_client_info, + ) + + # Certain fields should be provided within the metadata header; + # add these here. 
+ metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)), + ) + + # Send the request. + await rpc( + request, retry=retry, timeout=timeout, metadata=metadata, + ) + + async def update_policy_tag( + self, + request: policytagmanager.UpdatePolicyTagRequest = None, + *, + policy_tag: policytagmanager.PolicyTag = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> policytagmanager.PolicyTag: + r"""Updates a policy tag. + + Args: + request (:class:`~.policytagmanager.UpdatePolicyTagRequest`): + The request object. Request message for + [UpdatePolicyTag][google.cloud.datacatalog.v1beta1.PolicyTagManager.UpdatePolicyTag]. + policy_tag (:class:`~.policytagmanager.PolicyTag`): + The policy tag to update. Only the description, + display_name, and parent_policy_tag fields can be + updated. + This corresponds to the ``policy_tag`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.policytagmanager.PolicyTag: + Denotes one policy tag in a taxonomy + (e.g. ssn). Policy Tags can be defined + in a hierarchy. For example, consider + the following hierarchy: Geolocation + -> (LatLong, City, ZipCode). + PolicyTag "Geolocation" contains three + child policy tags: "LatLong", "City", + and "ZipCode". + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + if request is not None and any([policy_tag]): + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." 
+ ) + + request = policytagmanager.UpdatePolicyTagRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if policy_tag is not None: + request.policy_tag = policy_tag + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = gapic_v1.method_async.wrap_method( + self._client._transport.update_policy_tag, + default_timeout=None, + client_info=_client_info, + ) + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata( + (("policy_tag.name", request.policy_tag.name),) + ), + ) + + # Send the request. + response = await rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. + return response + + async def list_policy_tags( + self, + request: policytagmanager.ListPolicyTagsRequest = None, + *, + parent: str = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> pagers.ListPolicyTagsAsyncPager: + r"""Lists all policy tags in a taxonomy. + + Args: + request (:class:`~.policytagmanager.ListPolicyTagsRequest`): + The request object. Request message for + [ListPolicyTags][google.cloud.datacatalog.v1beta1.PolicyTagManager.ListPolicyTags]. + parent (:class:`str`): + Required. Resource name of the + taxonomy to list the policy tags of. + This corresponds to the ``parent`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. 
+ + Returns: + ~.pagers.ListPolicyTagsAsyncPager: + Response message for + [ListPolicyTags][google.cloud.datacatalog.v1beta1.PolicyTagManager.ListPolicyTags]. + + Iterating over this object will yield results and + resolve additional pages automatically. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + if request is not None and any([parent]): + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + request = policytagmanager.ListPolicyTagsRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if parent is not None: + request.parent = parent + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = gapic_v1.method_async.wrap_method( + self._client._transport.list_policy_tags, + default_timeout=None, + client_info=_client_info, + ) + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)), + ) + + # Send the request. + response = await rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # This method is paged; wrap the response in a pager, which provides + # an `__aiter__` convenience method. + response = pagers.ListPolicyTagsAsyncPager( + method=rpc, request=request, response=response, metadata=metadata, + ) + + # Done; return the response. + return response + + async def get_policy_tag( + self, + request: policytagmanager.GetPolicyTagRequest = None, + *, + name: str = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> policytagmanager.PolicyTag: + r"""Gets a policy tag. 
+
+        Args:
+            request (:class:`~.policytagmanager.GetPolicyTagRequest`):
+                The request object. Request message for
+                [GetPolicyTag][google.cloud.datacatalog.v1beta1.PolicyTagManager.GetPolicyTag].
+            name (:class:`str`):
+                Required. Resource name of the
+                requested policy tag.
+                This corresponds to the ``name`` field
+                on the ``request`` instance; if ``request`` is provided, this
+                should not be set.
+
+            retry (google.api_core.retry.Retry): Designation of what errors, if any,
+                should be retried.
+            timeout (float): The timeout for this request.
+            metadata (Sequence[Tuple[str, str]]): Strings which should be
+                sent along with the request as metadata.
+
+        Returns:
+            ~.policytagmanager.PolicyTag:
+                Denotes one policy tag in a taxonomy
+                (e.g. ssn). Policy Tags can be defined
+                in a hierarchy. For example, consider
+                the following hierarchy: Geolocation
+                -> (LatLong, City, ZipCode).
+                PolicyTag "Geolocation" contains three
+                child policy tags: "LatLong", "City",
+                and "ZipCode".
+
+        """
+        # Create or coerce a protobuf request object.
+        # Sanity check: If we got a request object, we should *not* have
+        # gotten any keyword arguments that map to the request.
+        if request is not None and any([name]):
+            raise ValueError(
+                "If the `request` argument is set, then none of "
+                "the individual field arguments should be set."
+            )
+
+        request = policytagmanager.GetPolicyTagRequest(request)
+
+        # If we have keyword arguments corresponding to fields on the
+        # request, apply these.
+
+        if name is not None:
+            request.name = name
+
+        # Wrap the RPC method; this adds retry and timeout information,
+        # and friendly error handling.
+        rpc = gapic_v1.method_async.wrap_method(
+            self._client._transport.get_policy_tag,
+            default_timeout=None,
+            client_info=_client_info,
+        )
+
+        # Certain fields should be provided within the metadata header;
+        # add these here.
+        metadata = tuple(metadata) + (
+            gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+        )
+
+        # Send the request.
+        response = await rpc(request, retry=retry, timeout=timeout, metadata=metadata,)
+
+        # Done; return the response.
+        return response
+
+    async def get_iam_policy(
+        self,
+        request: iam_policy.GetIamPolicyRequest = None,
+        *,
+        retry: retries.Retry = gapic_v1.method.DEFAULT,
+        timeout: float = None,
+        metadata: Sequence[Tuple[str, str]] = (),
+    ) -> policy.Policy:
+        r"""Gets the IAM policy for a taxonomy or a policy tag.
+
+        Args:
+            request (:class:`~.iam_policy.GetIamPolicyRequest`):
+                The request object. Request message for `GetIamPolicy`
+                method.
+
+            retry (google.api_core.retry.Retry): Designation of what errors, if any,
+                should be retried.
+            timeout (float): The timeout for this request.
+            metadata (Sequence[Tuple[str, str]]): Strings which should be
+                sent along with the request as metadata.
+
+        Returns:
+            ~.policy.Policy:
+                Defines an Identity and Access Management (IAM) policy.
+                It is used to specify access control policies for Cloud
+                Platform resources.
+
+                A ``Policy`` is a collection of ``bindings``. A
+                ``binding`` binds one or more ``members`` to a single
+                ``role``. Members can be user accounts, service
+                accounts, Google groups, and domains (such as G Suite).
+                A ``role`` is a named list of permissions (defined by
+                IAM or configured by users). A ``binding`` can
+                optionally specify a ``condition``, which is a logic
+                expression that further constrains the role binding
+                based on attributes about the request and/or target
+                resource.
+
+                **JSON Example**
+
+                ::
+
+                    {
+                      "bindings": [
+                        {
+                          "role": "roles/resourcemanager.organizationAdmin",
+                          "members": [
+                            "user:mike@example.com",
+                            "group:admins@example.com",
+                            "domain:google.com",
+                            "serviceAccount:my-project-id@appspot.gserviceaccount.com"
+                          ]
+                        },
+                        {
+                          "role": "roles/resourcemanager.organizationViewer",
+                          "members": ["user:eve@example.com"],
+                          "condition": {
+                            "title": "expirable access",
+                            "description": "Does not grant access after Sep 2020",
+                            "expression": "request.time <
+                            timestamp('2020-10-01T00:00:00.000Z')",
+                          }
+                        }
+                      ]
+                    }
+
+                **YAML Example**
+
+                ::
+
+                    bindings:
+                    - members:
+                      - user:mike@example.com
+                      - group:admins@example.com
+                      - domain:google.com
+                      - serviceAccount:my-project-id@appspot.gserviceaccount.com
+                      role: roles/resourcemanager.organizationAdmin
+                    - members:
+                      - user:eve@example.com
+                      role: roles/resourcemanager.organizationViewer
+                      condition:
+                        title: expirable access
+                        description: Does not grant access after Sep 2020
+                        expression: request.time < timestamp('2020-10-01T00:00:00.000Z')
+
+                For a description of IAM and its features, see the `IAM
+                developer's
+                guide `__.
+
+        """
+        # Create or coerce a protobuf request object.
+
+        # The request isn't a proto-plus wrapped type,
+        # so it must be constructed via keyword expansion.
+        if isinstance(request, dict):
+            request = iam_policy.GetIamPolicyRequest(**request)
+
+        # Wrap the RPC method; this adds retry and timeout information,
+        # and friendly error handling.
+        rpc = gapic_v1.method_async.wrap_method(
+            self._client._transport.get_iam_policy,
+            default_timeout=None,
+            client_info=_client_info,
+        )
+
+        # Certain fields should be provided within the metadata header;
+        # add these here.
+        metadata = tuple(metadata) + (
+            gapic_v1.routing_header.to_grpc_metadata((("resource", request.resource),)),
+        )
+
+        # Send the request.
+        response = await rpc(request, retry=retry, timeout=timeout, metadata=metadata,)
+
+        # Done; return the response.
+        return response
+
+    async def set_iam_policy(
+        self,
+        request: iam_policy.SetIamPolicyRequest = None,
+        *,
+        retry: retries.Retry = gapic_v1.method.DEFAULT,
+        timeout: float = None,
+        metadata: Sequence[Tuple[str, str]] = (),
+    ) -> policy.Policy:
+        r"""Sets the IAM policy for a taxonomy or a policy tag.
+
+        Args:
+            request (:class:`~.iam_policy.SetIamPolicyRequest`):
+                The request object. Request message for `SetIamPolicy`
+                method.
+
+            retry (google.api_core.retry.Retry): Designation of what errors, if any,
+                should be retried.
+            timeout (float): The timeout for this request.
+            metadata (Sequence[Tuple[str, str]]): Strings which should be
+                sent along with the request as metadata.
+
+        Returns:
+            ~.policy.Policy:
+                Defines an Identity and Access Management (IAM) policy.
+                It is used to specify access control policies for Cloud
+                Platform resources.
+
+                A ``Policy`` is a collection of ``bindings``. A
+                ``binding`` binds one or more ``members`` to a single
+                ``role``. Members can be user accounts, service
+                accounts, Google groups, and domains (such as G Suite).
+                A ``role`` is a named list of permissions (defined by
+                IAM or configured by users). A ``binding`` can
+                optionally specify a ``condition``, which is a logic
+                expression that further constrains the role binding
+                based on attributes about the request and/or target
+                resource.
+
+                **JSON Example**
+
+                ::
+
+                    {
+                      "bindings": [
+                        {
+                          "role": "roles/resourcemanager.organizationAdmin",
+                          "members": [
+                            "user:mike@example.com",
+                            "group:admins@example.com",
+                            "domain:google.com",
+                            "serviceAccount:my-project-id@appspot.gserviceaccount.com"
+                          ]
+                        },
+                        {
+                          "role": "roles/resourcemanager.organizationViewer",
+                          "members": ["user:eve@example.com"],
+                          "condition": {
+                            "title": "expirable access",
+                            "description": "Does not grant access after Sep 2020",
+                            "expression": "request.time <
+                            timestamp('2020-10-01T00:00:00.000Z')",
+                          }
+                        }
+                      ]
+                    }
+
+                **YAML Example**
+
+                ::
+
+                    bindings:
+                    - members:
+                      - user:mike@example.com
+                      - group:admins@example.com
+                      - domain:google.com
+                      - serviceAccount:my-project-id@appspot.gserviceaccount.com
+                      role: roles/resourcemanager.organizationAdmin
+                    - members:
+                      - user:eve@example.com
+                      role: roles/resourcemanager.organizationViewer
+                      condition:
+                        title: expirable access
+                        description: Does not grant access after Sep 2020
+                        expression: request.time < timestamp('2020-10-01T00:00:00.000Z')
+
+                For a description of IAM and its features, see the `IAM
+                developer's
+                guide `__.
+
+        """
+        # Create or coerce a protobuf request object.
+
+        # The request isn't a proto-plus wrapped type,
+        # so it must be constructed via keyword expansion.
+        if isinstance(request, dict):
+            request = iam_policy.SetIamPolicyRequest(**request)
+
+        # Wrap the RPC method; this adds retry and timeout information,
+        # and friendly error handling.
+        rpc = gapic_v1.method_async.wrap_method(
+            self._client._transport.set_iam_policy,
+            default_timeout=None,
+            client_info=_client_info,
+        )
+
+        # Certain fields should be provided within the metadata header;
+        # add these here.
+        metadata = tuple(metadata) + (
+            gapic_v1.routing_header.to_grpc_metadata((("resource", request.resource),)),
+        )
+
+        # Send the request.
+        response = await rpc(request, retry=retry, timeout=timeout, metadata=metadata,)
+
+        # Done; return the response.
+        return response
+
+    async def test_iam_permissions(
+        self,
+        request: iam_policy.TestIamPermissionsRequest = None,
+        *,
+        retry: retries.Retry = gapic_v1.method.DEFAULT,
+        timeout: float = None,
+        metadata: Sequence[Tuple[str, str]] = (),
+    ) -> iam_policy.TestIamPermissionsResponse:
+        r"""Returns the permissions that a caller has on the
+        specified taxonomy or policy tag.
+
+        Args:
+            request (:class:`~.iam_policy.TestIamPermissionsRequest`):
+                The request object. Request message for
+                `TestIamPermissions` method.
+
+            retry (google.api_core.retry.Retry): Designation of what errors, if any,
+                should be retried.
+            timeout (float): The timeout for this request.
+            metadata (Sequence[Tuple[str, str]]): Strings which should be
+                sent along with the request as metadata.
+
+        Returns:
+            ~.iam_policy.TestIamPermissionsResponse:
+                Response message for ``TestIamPermissions`` method.
+        """
+        # Create or coerce a protobuf request object.
+
+        # The request isn't a proto-plus wrapped type,
+        # so it must be constructed via keyword expansion.
+        if isinstance(request, dict):
+            request = iam_policy.TestIamPermissionsRequest(**request)
+
+        # Wrap the RPC method; this adds retry and timeout information,
+        # and friendly error handling.
+        rpc = gapic_v1.method_async.wrap_method(
+            self._client._transport.test_iam_permissions,
+            default_timeout=None,
+            client_info=_client_info,
+        )
+
+        # Certain fields should be provided within the metadata header;
+        # add these here.
+        metadata = tuple(metadata) + (
+            gapic_v1.routing_header.to_grpc_metadata((("resource", request.resource),)),
+        )
+
+        # Send the request.
+        response = await rpc(request, retry=retry, timeout=timeout, metadata=metadata,)
+
+        # Done; return the response.
+        return response
+
+
+try:
+    _client_info = gapic_v1.client_info.ClientInfo(
+        gapic_version=pkg_resources.get_distribution(
+            "google-cloud-datacatalog",
+        ).version,
+    )
+except pkg_resources.DistributionNotFound:
+    _client_info = gapic_v1.client_info.ClientInfo()
+
+
+__all__ = ("PolicyTagManagerAsyncClient",)
diff --git a/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/client.py b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/client.py
new file mode 100644
index 00000000..ac3eec4d
--- /dev/null
+++ b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/client.py
@@ -0,0 +1,1344 @@
+# -*- coding: utf-8 -*-
+
+# Copyright 2020 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+
+from collections import OrderedDict
+import os
+import re
+from typing import Callable, Dict, Sequence, Tuple, Type, Union
+import pkg_resources
+
+import google.api_core.client_options as ClientOptions  # type: ignore
+from google.api_core import exceptions  # type: ignore
+from google.api_core import gapic_v1  # type: ignore
+from google.api_core import retry as retries  # type: ignore
+from google.auth import credentials  # type: ignore
+from google.auth.transport import mtls  # type: ignore
+from google.auth.exceptions import MutualTLSChannelError  # type: ignore
+from google.oauth2 import service_account  # type: ignore
+
+from google.cloud.datacatalog_v1beta1.services.policy_tag_manager import pagers
+from google.cloud.datacatalog_v1beta1.types import policytagmanager
+from google.iam.v1 import iam_policy_pb2 as iam_policy  # type: ignore
+from google.iam.v1 import policy_pb2 as policy  # type: ignore
+
+from .transports.base import PolicyTagManagerTransport
+from .transports.grpc import PolicyTagManagerGrpcTransport
+from .transports.grpc_asyncio import PolicyTagManagerGrpcAsyncIOTransport
+
+
+class PolicyTagManagerClientMeta(type):
+    """Metaclass for the PolicyTagManager client.
+
+    This provides class-level methods for building and retrieving
+    support objects (e.g. transport) without polluting the client instance
+    objects.
+    """
+
+    _transport_registry = (
+        OrderedDict()
+    )  # type: Dict[str, Type[PolicyTagManagerTransport]]
+    _transport_registry["grpc"] = PolicyTagManagerGrpcTransport
+    _transport_registry["grpc_asyncio"] = PolicyTagManagerGrpcAsyncIOTransport
+
+    def get_transport_class(cls, label: str = None,) -> Type[PolicyTagManagerTransport]:
+        """Return an appropriate transport class.
+
+        Args:
+            label: The name of the desired transport. If none is
+                provided, then the first transport in the registry is used.
+
+        Returns:
+            The transport class to use.
+        """
+        # If a specific transport is requested, return that one.
+        if label:
+            return cls._transport_registry[label]
+
+        # No transport is requested; return the default (that is, the first one
+        # in the dictionary).
+        return next(iter(cls._transport_registry.values()))
+
+
+class PolicyTagManagerClient(metaclass=PolicyTagManagerClientMeta):
+    """The policy tag manager API service allows clients to manage
+    their taxonomies and policy tags.
+    """
+
+    @staticmethod
+    def _get_default_mtls_endpoint(api_endpoint):
+        """Convert api endpoint to mTLS endpoint.
+        Convert "*.sandbox.googleapis.com" and "*.googleapis.com" to
+        "*.mtls.sandbox.googleapis.com" and "*.mtls.googleapis.com" respectively.
+        Args:
+            api_endpoint (Optional[str]): the api endpoint to convert.
+        Returns:
+            str: converted mTLS api endpoint.
+        """
+        if not api_endpoint:
+            return api_endpoint
+
+        mtls_endpoint_re = re.compile(
+            r"(?P<name>[^.]+)(?P<mtls>\.mtls)?(?P<sandbox>\.sandbox)?(?P<googledomain>\.googleapis\.com)?"
+        )
+
+        m = mtls_endpoint_re.match(api_endpoint)
+        name, mtls, sandbox, googledomain = m.groups()
+        if mtls or not googledomain:
+            return api_endpoint
+
+        if sandbox:
+            return api_endpoint.replace(
+                "sandbox.googleapis.com", "mtls.sandbox.googleapis.com"
+            )
+
+        return api_endpoint.replace(".googleapis.com", ".mtls.googleapis.com")
+
+    DEFAULT_ENDPOINT = "datacatalog.googleapis.com"
+    DEFAULT_MTLS_ENDPOINT = _get_default_mtls_endpoint.__func__(  # type: ignore
+        DEFAULT_ENDPOINT
+    )
+
+    @classmethod
+    def from_service_account_file(cls, filename: str, *args, **kwargs):
+        """Creates an instance of this client using the provided credentials
+        file.
+
+        Args:
+            filename (str): The path to the service account private key json
+                file.
+            args: Additional arguments to pass to the constructor.
+            kwargs: Additional arguments to pass to the constructor.
+
+        Returns:
+            {@api.name}: The constructed client.
+        """
+        credentials = service_account.Credentials.from_service_account_file(filename)
+        kwargs["credentials"] = credentials
+        return cls(*args, **kwargs)
+
+    from_service_account_json = from_service_account_file
+
+    @staticmethod
+    def policy_tag_path(
+        project: str, location: str, taxonomy: str, policy_tag: str,
+    ) -> str:
+        """Return a fully-qualified policy_tag string."""
+        return "projects/{project}/locations/{location}/taxonomies/{taxonomy}/policyTags/{policy_tag}".format(
+            project=project,
+            location=location,
+            taxonomy=taxonomy,
+            policy_tag=policy_tag,
+        )
+
+    @staticmethod
+    def parse_policy_tag_path(path: str) -> Dict[str, str]:
+        """Parse a policy_tag path into its component segments."""
+        m = re.match(
+            r"^projects/(?P<project>.+?)/locations/(?P<location>.+?)/taxonomies/(?P<taxonomy>.+?)/policyTags/(?P<policy_tag>.+?)$",
+            path,
+        )
+        return m.groupdict() if m else {}
+
+    @staticmethod
+    def taxonomy_path(project: str, location: str, taxonomy: str,) -> str:
+        """Return a fully-qualified taxonomy string."""
+        return "projects/{project}/locations/{location}/taxonomies/{taxonomy}".format(
+            project=project, location=location, taxonomy=taxonomy,
+        )
+
+    @staticmethod
+    def parse_taxonomy_path(path: str) -> Dict[str, str]:
+        """Parse a taxonomy path into its component segments."""
+        m = re.match(
+            r"^projects/(?P<project>.+?)/locations/(?P<location>.+?)/taxonomies/(?P<taxonomy>.+?)$",
+            path,
+        )
+        return m.groupdict() if m else {}
+
+    def __init__(
+        self,
+        *,
+        credentials: credentials.Credentials = None,
+        transport: Union[str, PolicyTagManagerTransport] = None,
+        client_options: ClientOptions = None,
+    ) -> None:
+        """Instantiate the policy tag manager client.
+
+        Args:
+            credentials (Optional[google.auth.credentials.Credentials]): The
+                authorization credentials to attach to requests. These
+                credentials identify the application to the service; if none
+                are specified, the client will attempt to ascertain the
+                credentials from the environment.
+            transport (Union[str, ~.PolicyTagManagerTransport]): The
+                transport to use. If set to None, a transport is chosen
+                automatically.
+            client_options (ClientOptions): Custom options for the client. It
+                won't take effect if a ``transport`` instance is provided.
+                (1) The ``api_endpoint`` property can be used to override the
+                default endpoint provided by the client. GOOGLE_API_USE_MTLS
+                environment variable can also be used to override the endpoint:
+                "always" (always use the default mTLS endpoint), "never" (always
+                use the default regular endpoint, this is the default value for
+                the environment variable) and "auto" (auto switch to the default
+                mTLS endpoint if client SSL credentials is present). However,
+                the ``api_endpoint`` property takes precedence if provided.
+                (2) The ``client_cert_source`` property is used to provide client
+                SSL credentials for mutual TLS transport. If not provided, the
+                default SSL credentials will be used if present.
+
+        Raises:
+            google.auth.exceptions.MutualTLSChannelError: If mutual TLS transport
+                creation failed for any reason.
+        """
+        if isinstance(client_options, dict):
+            client_options = ClientOptions.from_dict(client_options)
+        if client_options is None:
+            client_options = ClientOptions.ClientOptions()
+
+        if client_options.api_endpoint is None:
+            use_mtls_env = os.getenv("GOOGLE_API_USE_MTLS", "never")
+            if use_mtls_env == "never":
+                client_options.api_endpoint = self.DEFAULT_ENDPOINT
+            elif use_mtls_env == "always":
+                client_options.api_endpoint = self.DEFAULT_MTLS_ENDPOINT
+            elif use_mtls_env == "auto":
+                has_client_cert_source = (
+                    client_options.client_cert_source is not None
+                    or mtls.has_default_client_cert_source()
+                )
+                client_options.api_endpoint = (
+                    self.DEFAULT_MTLS_ENDPOINT
+                    if has_client_cert_source
+                    else self.DEFAULT_ENDPOINT
+                )
+            else:
+                raise MutualTLSChannelError(
+                    "Unsupported GOOGLE_API_USE_MTLS value. Accepted values: never, auto, always"
+                )
+
+        # Save or instantiate the transport.
+        # Ordinarily, we provide the transport, but allowing a custom transport
+        # instance provides an extensibility point for unusual situations.
+        if isinstance(transport, PolicyTagManagerTransport):
+            # transport is a PolicyTagManagerTransport instance.
+            if credentials or client_options.credentials_file:
+                raise ValueError(
+                    "When providing a transport instance, "
+                    "provide its credentials directly."
+                )
+            if client_options.scopes:
+                raise ValueError(
+                    "When providing a transport instance, "
+                    "provide its scopes directly."
+                )
+            self._transport = transport
+        else:
+            Transport = type(self).get_transport_class(transport)
+            self._transport = Transport(
+                credentials=credentials,
+                credentials_file=client_options.credentials_file,
+                host=client_options.api_endpoint,
+                scopes=client_options.scopes,
+                api_mtls_endpoint=client_options.api_endpoint,
+                client_cert_source=client_options.client_cert_source,
+                quota_project_id=client_options.quota_project_id,
+            )
+
+    def create_taxonomy(
+        self,
+        request: policytagmanager.CreateTaxonomyRequest = None,
+        *,
+        parent: str = None,
+        taxonomy: policytagmanager.Taxonomy = None,
+        retry: retries.Retry = gapic_v1.method.DEFAULT,
+        timeout: float = None,
+        metadata: Sequence[Tuple[str, str]] = (),
+    ) -> policytagmanager.Taxonomy:
+        r"""Creates a taxonomy in the specified project.
+
+        Args:
+            request (:class:`~.policytagmanager.CreateTaxonomyRequest`):
+                The request object. Request message for
+                [CreateTaxonomy][google.cloud.datacatalog.v1beta1.PolicyTagManager.CreateTaxonomy].
+            parent (:class:`str`):
+                Required. Resource name of the
+                project that the taxonomy will belong
+                to.
+                This corresponds to the ``parent`` field
+                on the ``request`` instance; if ``request`` is provided, this
+                should not be set.
+            taxonomy (:class:`~.policytagmanager.Taxonomy`):
+                The taxonomy to be created.
+                This corresponds to the ``taxonomy`` field
+                on the ``request`` instance; if ``request`` is provided, this
+                should not be set.
+
+            retry (google.api_core.retry.Retry): Designation of what errors, if any,
+                should be retried.
+            timeout (float): The timeout for this request.
+            metadata (Sequence[Tuple[str, str]]): Strings which should be
+                sent along with the request as metadata.
+
+        Returns:
+            ~.policytagmanager.Taxonomy:
+                A taxonomy is a collection of policy tags that classify
+                data along a common axis. For instance a data
+                *sensitivity* taxonomy could contain policy tags
+                denoting PII such as age, zipcode, and SSN. A data
+                *origin* taxonomy could contain policy tags to
+                distinguish user data, employee data, partner data,
+                public data.
+
+        """
+        # Create or coerce a protobuf request object.
+        # Sanity check: If we got a request object, we should *not* have
+        # gotten any keyword arguments that map to the request.
+        has_flattened_params = any([parent, taxonomy])
+        if request is not None and has_flattened_params:
+            raise ValueError(
+                "If the `request` argument is set, then none of "
+                "the individual field arguments should be set."
+            )
+
+        # Minor optimization to avoid making a copy if the user passes
+        # in a policytagmanager.CreateTaxonomyRequest.
+        # There's no risk of modifying the input as we've already verified
+        # there are no flattened fields.
+        if not isinstance(request, policytagmanager.CreateTaxonomyRequest):
+            request = policytagmanager.CreateTaxonomyRequest(request)
+
+        # If we have keyword arguments corresponding to fields on the
+        # request, apply these.
+
+        if parent is not None:
+            request.parent = parent
+        if taxonomy is not None:
+            request.taxonomy = taxonomy
+
+        # Wrap the RPC method; this adds retry and timeout information,
+        # and friendly error handling.
+        rpc = self._transport._wrapped_methods[self._transport.create_taxonomy]
+
+        # Certain fields should be provided within the metadata header;
+        # add these here.
+        metadata = tuple(metadata) + (
+            gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
+        )
+
+        # Send the request.
+        response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,)
+
+        # Done; return the response.
+        return response
+
+    def delete_taxonomy(
+        self,
+        request: policytagmanager.DeleteTaxonomyRequest = None,
+        *,
+        name: str = None,
+        retry: retries.Retry = gapic_v1.method.DEFAULT,
+        timeout: float = None,
+        metadata: Sequence[Tuple[str, str]] = (),
+    ) -> None:
+        r"""Deletes a taxonomy. This operation will also delete
+        all policy tags in this taxonomy along with their
+        associated policies.
+
+        Args:
+            request (:class:`~.policytagmanager.DeleteTaxonomyRequest`):
+                The request object. Request message for
+                [DeleteTaxonomy][google.cloud.datacatalog.v1beta1.PolicyTagManager.DeleteTaxonomy].
+            name (:class:`str`):
+                Required. Resource name of the
+                taxonomy to be deleted. All policy tags
+                in this taxonomy will also be deleted.
+                This corresponds to the ``name`` field
+                on the ``request`` instance; if ``request`` is provided, this
+                should not be set.
+
+            retry (google.api_core.retry.Retry): Designation of what errors, if any,
+                should be retried.
+            timeout (float): The timeout for this request.
+            metadata (Sequence[Tuple[str, str]]): Strings which should be
+                sent along with the request as metadata.
+        """
+        # Create or coerce a protobuf request object.
+        # Sanity check: If we got a request object, we should *not* have
+        # gotten any keyword arguments that map to the request.
+        has_flattened_params = any([name])
+        if request is not None and has_flattened_params:
+            raise ValueError(
+                "If the `request` argument is set, then none of "
+                "the individual field arguments should be set."
+            )
+
+        # Minor optimization to avoid making a copy if the user passes
+        # in a policytagmanager.DeleteTaxonomyRequest.
+        # There's no risk of modifying the input as we've already verified
+        # there are no flattened fields.
+        if not isinstance(request, policytagmanager.DeleteTaxonomyRequest):
+            request = policytagmanager.DeleteTaxonomyRequest(request)
+
+        # If we have keyword arguments corresponding to fields on the
+        # request, apply these.
+
+        if name is not None:
+            request.name = name
+
+        # Wrap the RPC method; this adds retry and timeout information,
+        # and friendly error handling.
+        rpc = self._transport._wrapped_methods[self._transport.delete_taxonomy]
+
+        # Certain fields should be provided within the metadata header;
+        # add these here.
+        metadata = tuple(metadata) + (
+            gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+        )
+
+        # Send the request.
+        rpc(
+            request, retry=retry, timeout=timeout, metadata=metadata,
+        )
+
+    def update_taxonomy(
+        self,
+        request: policytagmanager.UpdateTaxonomyRequest = None,
+        *,
+        taxonomy: policytagmanager.Taxonomy = None,
+        retry: retries.Retry = gapic_v1.method.DEFAULT,
+        timeout: float = None,
+        metadata: Sequence[Tuple[str, str]] = (),
+    ) -> policytagmanager.Taxonomy:
+        r"""Updates a taxonomy.
+
+        Args:
+            request (:class:`~.policytagmanager.UpdateTaxonomyRequest`):
+                The request object. Request message for
+                [UpdateTaxonomy][google.cloud.datacatalog.v1beta1.PolicyTagManager.UpdateTaxonomy].
+            taxonomy (:class:`~.policytagmanager.Taxonomy`):
+                The taxonomy to update. Only description, display_name,
+                and activated policy types can be updated.
+                This corresponds to the ``taxonomy`` field
+                on the ``request`` instance; if ``request`` is provided, this
+                should not be set.
+
+            retry (google.api_core.retry.Retry): Designation of what errors, if any,
+                should be retried.
+            timeout (float): The timeout for this request.
+            metadata (Sequence[Tuple[str, str]]): Strings which should be
+                sent along with the request as metadata.
+
+        Returns:
+            ~.policytagmanager.Taxonomy:
+                A taxonomy is a collection of policy tags that classify
+                data along a common axis. For instance a data
+                *sensitivity* taxonomy could contain policy tags
+                denoting PII such as age, zipcode, and SSN. A data
+                *origin* taxonomy could contain policy tags to
+                distinguish user data, employee data, partner data,
+                public data.
+
+        """
+        # Create or coerce a protobuf request object.
+        # Sanity check: If we got a request object, we should *not* have
+        # gotten any keyword arguments that map to the request.
+        has_flattened_params = any([taxonomy])
+        if request is not None and has_flattened_params:
+            raise ValueError(
+                "If the `request` argument is set, then none of "
+                "the individual field arguments should be set."
+            )
+
+        # Minor optimization to avoid making a copy if the user passes
+        # in a policytagmanager.UpdateTaxonomyRequest.
+        # There's no risk of modifying the input as we've already verified
+        # there are no flattened fields.
+        if not isinstance(request, policytagmanager.UpdateTaxonomyRequest):
+            request = policytagmanager.UpdateTaxonomyRequest(request)
+
+        # If we have keyword arguments corresponding to fields on the
+        # request, apply these.
+
+        if taxonomy is not None:
+            request.taxonomy = taxonomy
+
+        # Wrap the RPC method; this adds retry and timeout information,
+        # and friendly error handling.
+        rpc = self._transport._wrapped_methods[self._transport.update_taxonomy]
+
+        # Certain fields should be provided within the metadata header;
+        # add these here.
+        metadata = tuple(metadata) + (
+            gapic_v1.routing_header.to_grpc_metadata(
+                (("taxonomy.name", request.taxonomy.name),)
+            ),
+        )
+
+        # Send the request.
+        response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,)
+
+        # Done; return the response.
+        return response
+
+    def list_taxonomies(
+        self,
+        request: policytagmanager.ListTaxonomiesRequest = None,
+        *,
+        parent: str = None,
+        retry: retries.Retry = gapic_v1.method.DEFAULT,
+        timeout: float = None,
+        metadata: Sequence[Tuple[str, str]] = (),
+    ) -> pagers.ListTaxonomiesPager:
+        r"""Lists all taxonomies in a project in a particular
+        location that the caller has permission to view.
+
+        Args:
+            request (:class:`~.policytagmanager.ListTaxonomiesRequest`):
+                The request object. Request message for
+                [ListTaxonomies][google.cloud.datacatalog.v1beta1.PolicyTagManager.ListTaxonomies].
+            parent (:class:`str`):
+                Required. Resource name of the
+                project to list the taxonomies of.
+                This corresponds to the ``parent`` field
+                on the ``request`` instance; if ``request`` is provided, this
+                should not be set.
+
+            retry (google.api_core.retry.Retry): Designation of what errors, if any,
+                should be retried.
+            timeout (float): The timeout for this request.
+            metadata (Sequence[Tuple[str, str]]): Strings which should be
+                sent along with the request as metadata.
+
+        Returns:
+            ~.pagers.ListTaxonomiesPager:
+                Response message for
+                [ListTaxonomies][google.cloud.datacatalog.v1beta1.PolicyTagManager.ListTaxonomies].
+
+                Iterating over this object will yield results and
+                resolve additional pages automatically.
+
+        """
+        # Create or coerce a protobuf request object.
+        # Sanity check: If we got a request object, we should *not* have
+        # gotten any keyword arguments that map to the request.
+        has_flattened_params = any([parent])
+        if request is not None and has_flattened_params:
+            raise ValueError(
+                "If the `request` argument is set, then none of "
+                "the individual field arguments should be set."
+            )
+
+        # Minor optimization to avoid making a copy if the user passes
+        # in a policytagmanager.ListTaxonomiesRequest.
+        # There's no risk of modifying the input as we've already verified
+        # there are no flattened fields.
+        if not isinstance(request, policytagmanager.ListTaxonomiesRequest):
+            request = policytagmanager.ListTaxonomiesRequest(request)
+
+        # If we have keyword arguments corresponding to fields on the
+        # request, apply these.
+
+        if parent is not None:
+            request.parent = parent
+
+        # Wrap the RPC method; this adds retry and timeout information,
+        # and friendly error handling.
+        rpc = self._transport._wrapped_methods[self._transport.list_taxonomies]
+
+        # Certain fields should be provided within the metadata header;
+        # add these here.
+        metadata = tuple(metadata) + (
+            gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
+        )
+
+        # Send the request.
+        response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,)
+
+        # This method is paged; wrap the response in a pager, which provides
+        # an `__iter__` convenience method.
+        response = pagers.ListTaxonomiesPager(
+            method=rpc, request=request, response=response, metadata=metadata,
+        )
+
+        # Done; return the response.
+        return response
+
+    def get_taxonomy(
+        self,
+        request: policytagmanager.GetTaxonomyRequest = None,
+        *,
+        name: str = None,
+        retry: retries.Retry = gapic_v1.method.DEFAULT,
+        timeout: float = None,
+        metadata: Sequence[Tuple[str, str]] = (),
+    ) -> policytagmanager.Taxonomy:
+        r"""Gets a taxonomy.
+
+        Args:
+            request (:class:`~.policytagmanager.GetTaxonomyRequest`):
+                The request object. Request message for
+                [GetTaxonomy][google.cloud.datacatalog.v1beta1.PolicyTagManager.GetTaxonomy].
+            name (:class:`str`):
+                Required. Resource name of the
+                requested taxonomy.
+                This corresponds to the ``name`` field
+                on the ``request`` instance; if ``request`` is provided, this
+                should not be set.
+
+            retry (google.api_core.retry.Retry): Designation of what errors, if any,
+                should be retried.
+            timeout (float): The timeout for this request.
+            metadata (Sequence[Tuple[str, str]]): Strings which should be
+                sent along with the request as metadata.
+ + Returns: + ~.policytagmanager.Taxonomy: + A taxonomy is a collection of policy tags that classify + data along a common axis. For instance a data + *sensitivity* taxonomy could contain policy tags + denoting PII such as age, zipcode, and SSN. A data + *origin* taxonomy could contain policy tags to + distinguish user data, employee data, partner data, + public data. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + has_flattened_params = any([name]) + if request is not None and has_flattened_params: + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + # Minor optimization to avoid making a copy if the user passes + # in a policytagmanager.GetTaxonomyRequest. + # There's no risk of modifying the input as we've already verified + # there are no flattened fields. + if not isinstance(request, policytagmanager.GetTaxonomyRequest): + request = policytagmanager.GetTaxonomyRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if name is not None: + request.name = name + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = self._transport._wrapped_methods[self._transport.get_taxonomy] + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)), + ) + + # Send the request. + response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. 
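The mutual-exclusion guard between a full `request` object and flattened keyword arguments recurs in every method above. A minimal standalone sketch of that pattern (the `build_request` helper and its plain-dict request are illustrative only, not part of the library):

```python
def build_request(request=None, *, name=None):
    """Coerce either a full request mapping or a flattened `name` kwarg
    into one request dict, mirroring the GAPIC guard above."""
    # Reject the ambiguous case: a request object plus flattened fields.
    if request is not None and any([name]):
        raise ValueError(
            "If the `request` argument is set, then none of "
            "the individual field arguments should be set."
        )
    request = dict(request) if request is not None else {}
    # Apply keyword arguments that correspond to request fields.
    if name is not None:
        request["name"] = name
    return request
```

The `any([...])` form scales to methods with several flattened fields (e.g. `any([parent, policy_tag])` in `create_policy_tag`).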
+ return response + + def create_policy_tag( + self, + request: policytagmanager.CreatePolicyTagRequest = None, + *, + parent: str = None, + policy_tag: policytagmanager.PolicyTag = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> policytagmanager.PolicyTag: + r"""Creates a policy tag in the specified taxonomy. + + Args: + request (:class:`~.policytagmanager.CreatePolicyTagRequest`): + The request object. Request message for + [CreatePolicyTag][google.cloud.datacatalog.v1beta1.PolicyTagManager.CreatePolicyTag]. + parent (:class:`str`): + Required. Resource name of the + taxonomy that the policy tag will belong + to. + This corresponds to the ``parent`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + policy_tag (:class:`~.policytagmanager.PolicyTag`): + The policy tag to be created. + This corresponds to the ``policy_tag`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.policytagmanager.PolicyTag: + Denotes one policy tag in a taxonomy + (e.g. ssn). Policy Tags can be defined + in a hierarchy. For example, consider + the following hierarchy: Geolocation + -> (LatLong, City, ZipCode). + PolicyTag "Geolocation" contains three + child policy tags: "LatLong", "City", + and "ZipCode". + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. 
+ has_flattened_params = any([parent, policy_tag]) + if request is not None and has_flattened_params: + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + # Minor optimization to avoid making a copy if the user passes + # in a policytagmanager.CreatePolicyTagRequest. + # There's no risk of modifying the input as we've already verified + # there are no flattened fields. + if not isinstance(request, policytagmanager.CreatePolicyTagRequest): + request = policytagmanager.CreatePolicyTagRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if parent is not None: + request.parent = parent + if policy_tag is not None: + request.policy_tag = policy_tag + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = self._transport._wrapped_methods[self._transport.create_policy_tag] + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)), + ) + + # Send the request. + response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. + return response + + def delete_policy_tag( + self, + request: policytagmanager.DeletePolicyTagRequest = None, + *, + name: str = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> None: + r"""Deletes a policy tag. Also deletes all of its + descendant policy tags. + + Args: + request (:class:`~.policytagmanager.DeletePolicyTagRequest`): + The request object. Request message for + [DeletePolicyTag][google.cloud.datacatalog.v1beta1.PolicyTagManager.DeletePolicyTag]. + name (:class:`str`): + Required. Resource name of the policy + tag to be deleted. All of its descendant + policy tags will also be deleted. 
+ This corresponds to the ``name`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + has_flattened_params = any([name]) + if request is not None and has_flattened_params: + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + # Minor optimization to avoid making a copy if the user passes + # in a policytagmanager.DeletePolicyTagRequest. + # There's no risk of modifying the input as we've already verified + # there are no flattened fields. + if not isinstance(request, policytagmanager.DeletePolicyTagRequest): + request = policytagmanager.DeletePolicyTagRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if name is not None: + request.name = name + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = self._transport._wrapped_methods[self._transport.delete_policy_tag] + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)), + ) + + # Send the request. 
+ rpc( + request, retry=retry, timeout=timeout, metadata=metadata, + ) + + def update_policy_tag( + self, + request: policytagmanager.UpdatePolicyTagRequest = None, + *, + policy_tag: policytagmanager.PolicyTag = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> policytagmanager.PolicyTag: + r"""Updates a policy tag. + + Args: + request (:class:`~.policytagmanager.UpdatePolicyTagRequest`): + The request object. Request message for + [UpdatePolicyTag][google.cloud.datacatalog.v1beta1.PolicyTagManager.UpdatePolicyTag]. + policy_tag (:class:`~.policytagmanager.PolicyTag`): + The policy tag to update. Only the description, + display_name, and parent_policy_tag fields can be + updated. + This corresponds to the ``policy_tag`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.policytagmanager.PolicyTag: + Denotes one policy tag in a taxonomy + (e.g. ssn). Policy Tags can be defined + in a hierarchy. For example, consider + the following hierarchy: Geolocation + -> (LatLong, City, ZipCode). + PolicyTag "Geolocation" contains three + child policy tags: "LatLong", "City", + and "ZipCode". + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + has_flattened_params = any([policy_tag]) + if request is not None and has_flattened_params: + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." 
+ ) + + # Minor optimization to avoid making a copy if the user passes + # in a policytagmanager.UpdatePolicyTagRequest. + # There's no risk of modifying the input as we've already verified + # there are no flattened fields. + if not isinstance(request, policytagmanager.UpdatePolicyTagRequest): + request = policytagmanager.UpdatePolicyTagRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if policy_tag is not None: + request.policy_tag = policy_tag + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = self._transport._wrapped_methods[self._transport.update_policy_tag] + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata( + (("policy_tag.name", request.policy_tag.name),) + ), + ) + + # Send the request. + response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. + return response + + def list_policy_tags( + self, + request: policytagmanager.ListPolicyTagsRequest = None, + *, + parent: str = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> pagers.ListPolicyTagsPager: + r"""Lists all policy tags in a taxonomy. + + Args: + request (:class:`~.policytagmanager.ListPolicyTagsRequest`): + The request object. Request message for + [ListPolicyTags][google.cloud.datacatalog.v1beta1.PolicyTagManager.ListPolicyTags]. + parent (:class:`str`): + Required. Resource name of the + taxonomy to list the policy tags of. + This corresponds to the ``parent`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. 
+ metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.pagers.ListPolicyTagsPager: + Response message for + [ListPolicyTags][google.cloud.datacatalog.v1beta1.PolicyTagManager.ListPolicyTags]. + + Iterating over this object will yield results and + resolve additional pages automatically. + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + has_flattened_params = any([parent]) + if request is not None and has_flattened_params: + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + # Minor optimization to avoid making a copy if the user passes + # in a policytagmanager.ListPolicyTagsRequest. + # There's no risk of modifying the input as we've already verified + # there are no flattened fields. + if not isinstance(request, policytagmanager.ListPolicyTagsRequest): + request = policytagmanager.ListPolicyTagsRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if parent is not None: + request.parent = parent + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = self._transport._wrapped_methods[self._transport.list_policy_tags] + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)), + ) + + # Send the request. + response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # This method is paged; wrap the response in a pager, which provides + # an `__iter__` convenience method. + response = pagers.ListPolicyTagsPager( + method=rpc, request=request, response=response, metadata=metadata, + ) + + # Done; return the response. 
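Each method folds resource fields into the request metadata through `gapic_v1.routing_header.to_grpc_metadata`, which condenses key/value pairs into a single `x-goog-request-params` header used for server-side routing. A simplified approximation of that behavior (an illustrative sketch, not the library source):

```python
from urllib.parse import quote

def to_grpc_metadata(params):
    # Encode ("key", "value") pairs into the single routing header
    # that GAPIC clients append to each request's metadata.
    encoded = "&".join(f"{quote(k)}={quote(v)}" for k, v in params)
    return ("x-goog-request-params", encoded)
```

Note that `quote` leaves `/` unescaped by default, so resource names such as `projects/p` pass through readably.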
+ return response + + def get_policy_tag( + self, + request: policytagmanager.GetPolicyTagRequest = None, + *, + name: str = None, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> policytagmanager.PolicyTag: + r"""Gets a policy tag. + + Args: + request (:class:`~.policytagmanager.GetPolicyTagRequest`): + The request object. Request message for + [GetPolicyTag][google.cloud.datacatalog.v1beta1.PolicyTagManager.GetPolicyTag]. + name (:class:`str`): + Required. Resource name of the + requested policy tag. + This corresponds to the ``name`` field + on the ``request`` instance; if ``request`` is provided, this + should not be set. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.policytagmanager.PolicyTag: + Denotes one policy tag in a taxonomy + (e.g. ssn). Policy Tags can be defined + in a hierarchy. For example, consider + the following hierarchy: Geolocation + -> (LatLong, City, ZipCode). + PolicyTag "Geolocation" contains three + child policy tags: "LatLong", "City", + and "ZipCode". + + """ + # Create or coerce a protobuf request object. + # Sanity check: If we got a request object, we should *not* have + # gotten any keyword arguments that map to the request. + has_flattened_params = any([name]) + if request is not None and has_flattened_params: + raise ValueError( + "If the `request` argument is set, then none of " + "the individual field arguments should be set." + ) + + # Minor optimization to avoid making a copy if the user passes + # in a policytagmanager.GetPolicyTagRequest. + # There's no risk of modifying the input as we've already verified + # there are no flattened fields. 
+ if not isinstance(request, policytagmanager.GetPolicyTagRequest): + request = policytagmanager.GetPolicyTagRequest(request) + + # If we have keyword arguments corresponding to fields on the + # request, apply these. + + if name is not None: + request.name = name + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = self._transport._wrapped_methods[self._transport.get_policy_tag] + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)), + ) + + # Send the request. + response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. + return response + + def get_iam_policy( + self, + request: iam_policy.GetIamPolicyRequest = None, + *, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> policy.Policy: + r"""Gets the IAM policy for a taxonomy or a policy tag. + + Args: + request (:class:`~.iam_policy.GetIamPolicyRequest`): + The request object. Request message for `GetIamPolicy` + method. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.policy.Policy: + Defines an Identity and Access Management (IAM) policy. + It is used to specify access control policies for Cloud + Platform resources. + + A ``Policy`` is a collection of ``bindings``. A + ``binding`` binds one or more ``members`` to a single + ``role``. Members can be user accounts, service + accounts, Google groups, and domains (such as G Suite). + A ``role`` is a named list of permissions (defined by + IAM or configured by users). 
A ``binding`` can + optionally specify a ``condition``, which is a logic + expression that further constrains the role binding + based on attributes about the request and/or target + resource. + + **JSON Example** + + :: + + { + "bindings": [ + { + "role": "roles/resourcemanager.organizationAdmin", + "members": [ + "user:mike@example.com", + "group:admins@example.com", + "domain:google.com", + "serviceAccount:my-project-id@appspot.gserviceaccount.com" + ] + }, + { + "role": "roles/resourcemanager.organizationViewer", + "members": ["user:eve@example.com"], + "condition": { + "title": "expirable access", + "description": "Does not grant access after Sep 2020", + "expression": "request.time < + timestamp('2020-10-01T00:00:00.000Z')", + } + } + ] + } + + **YAML Example** + + :: + + bindings: + - members: + - user:mike@example.com + - group:admins@example.com + - domain:google.com + - serviceAccount:my-project-id@appspot.gserviceaccount.com + role: roles/resourcemanager.organizationAdmin + - members: + - user:eve@example.com + role: roles/resourcemanager.organizationViewer + condition: + title: expirable access + description: Does not grant access after Sep 2020 + expression: request.time < timestamp('2020-10-01T00:00:00.000Z') + + For a description of IAM and its features, see the `IAM + developer's + guide <https://cloud.google.com/iam/docs>`__. + + """ + # Create or coerce a protobuf request object. + + # The request isn't a proto-plus wrapped type, + # so it must be constructed via keyword expansion. + if isinstance(request, dict): + request = iam_policy.GetIamPolicyRequest(**request) + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = self._transport._wrapped_methods[self._transport.get_iam_policy] + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("resource", request.resource),)), + ) + + # Send the request.
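Once decoded, the JSON policy shown in the docstring above is plain nested data. A small illustrative helper (not part of the client) that collects the members bound to a given role:

```python
def members_for_role(policy, role):
    """Return all members bound to `role` across a policy's bindings."""
    members = []
    for binding in policy.get("bindings", []):
        if binding.get("role") == role:
            members.extend(binding.get("members", []))
    return members

# Policy taken from the docstring's JSON example (trimmed).
policy = {
    "bindings": [
        {"role": "roles/resourcemanager.organizationAdmin",
         "members": ["user:mike@example.com", "group:admins@example.com"]},
        {"role": "roles/resourcemanager.organizationViewer",
         "members": ["user:eve@example.com"]},
    ]
}
```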
+ response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. + return response + + def set_iam_policy( + self, + request: iam_policy.SetIamPolicyRequest = None, + *, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> policy.Policy: + r"""Sets the IAM policy for a taxonomy or a policy tag. + + Args: + request (:class:`~.iam_policy.SetIamPolicyRequest`): + The request object. Request message for `SetIamPolicy` + method. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.policy.Policy: + Defines an Identity and Access Management (IAM) policy. + It is used to specify access control policies for Cloud + Platform resources. + + A ``Policy`` is a collection of ``bindings``. A + ``binding`` binds one or more ``members`` to a single + ``role``. Members can be user accounts, service + accounts, Google groups, and domains (such as G Suite). + A ``role`` is a named list of permissions (defined by + IAM or configured by users). A ``binding`` can + optionally specify a ``condition``, which is a logic + expression that further constrains the role binding + based on attributes about the request and/or target + resource. 
+ + **JSON Example** + + :: + + { + "bindings": [ + { + "role": "roles/resourcemanager.organizationAdmin", + "members": [ + "user:mike@example.com", + "group:admins@example.com", + "domain:google.com", + "serviceAccount:my-project-id@appspot.gserviceaccount.com" + ] + }, + { + "role": "roles/resourcemanager.organizationViewer", + "members": ["user:eve@example.com"], + "condition": { + "title": "expirable access", + "description": "Does not grant access after Sep 2020", + "expression": "request.time < + timestamp('2020-10-01T00:00:00.000Z')", + } + } + ] + } + + **YAML Example** + + :: + + bindings: + - members: + - user:mike@example.com + - group:admins@example.com + - domain:google.com + - serviceAccount:my-project-id@appspot.gserviceaccount.com + role: roles/resourcemanager.organizationAdmin + - members: + - user:eve@example.com + role: roles/resourcemanager.organizationViewer + condition: + title: expirable access + description: Does not grant access after Sep 2020 + expression: request.time < timestamp('2020-10-01T00:00:00.000Z') + + For a description of IAM and its features, see the `IAM + developer's + guide <https://cloud.google.com/iam/docs>`__. + + """ + # Create or coerce a protobuf request object. + + # The request isn't a proto-plus wrapped type, + # so it must be constructed via keyword expansion. + if isinstance(request, dict): + request = iam_policy.SetIamPolicyRequest(**request) + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = self._transport._wrapped_methods[self._transport.set_iam_policy] + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("resource", request.resource),)), + ) + + # Send the request. + response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response.
+ return response + + def test_iam_permissions( + self, + request: iam_policy.TestIamPermissionsRequest = None, + *, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> iam_policy.TestIamPermissionsResponse: + r"""Returns the permissions that a caller has on the + specified taxonomy or policy tag. + + Args: + request (:class:`~.iam_policy.TestIamPermissionsRequest`): + The request object. Request message for + `TestIamPermissions` method. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.iam_policy.TestIamPermissionsResponse: + Response message for ``TestIamPermissions`` method. + """ + # Create or coerce a protobuf request object. + + # The request isn't a proto-plus wrapped type, + # so it must be constructed via keyword expansion. + if isinstance(request, dict): + request = iam_policy.TestIamPermissionsRequest(**request) + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = self._transport._wrapped_methods[self._transport.test_iam_permissions] + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("resource", request.resource),)), + ) + + # Send the request. + response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. 
+ return response + + +try: + _client_info = gapic_v1.client_info.ClientInfo( + gapic_version=pkg_resources.get_distribution( + "google-cloud-datacatalog", + ).version, + ) +except pkg_resources.DistributionNotFound: + _client_info = gapic_v1.client_info.ClientInfo() + + +__all__ = ("PolicyTagManagerClient",) diff --git a/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/pagers.py b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/pagers.py new file mode 100644 index 00000000..4dd9013d --- /dev/null +++ b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/pagers.py @@ -0,0 +1,276 @@ +# -*- coding: utf-8 -*- + +# Copyright 2020 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +# + +from typing import Any, AsyncIterable, Awaitable, Callable, Iterable, Sequence, Tuple + +from google.cloud.datacatalog_v1beta1.types import policytagmanager + + +class ListTaxonomiesPager: + """A pager for iterating through ``list_taxonomies`` requests. + + This class thinly wraps an initial + :class:`~.policytagmanager.ListTaxonomiesResponse` object, and + provides an ``__iter__`` method to iterate through its + ``taxonomies`` field. + + If there are more pages, the ``__iter__`` method will make additional + ``ListTaxonomies`` requests and continue to iterate + through the ``taxonomies`` field on the + corresponding responses. + + All the usual :class:`~.policytagmanager.ListTaxonomiesResponse` + attributes are available on the pager. 
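The `pkg_resources` lookup above falls back to a default `ClientInfo` when the distribution isn't installed. The same guard can be sketched with `importlib.metadata`, used here purely as an illustrative stand-in for the `DistributionNotFound` handling:

```python
from importlib import metadata

def client_version(dist_name):
    """Return the installed distribution's version string, or None if
    the package isn't installed (mirroring the fallback above)."""
    try:
        return metadata.version(dist_name)
    except metadata.PackageNotFoundError:
        return None
```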
If multiple requests are made, only + the most recent response is retained, and thus used for attribute lookup. + """ + + def __init__( + self, + method: Callable[..., policytagmanager.ListTaxonomiesResponse], + request: policytagmanager.ListTaxonomiesRequest, + response: policytagmanager.ListTaxonomiesResponse, + *, + metadata: Sequence[Tuple[str, str]] = () + ): + """Instantiate the pager. + + Args: + method (Callable): The method that was originally called, and + which instantiated this pager. + request (:class:`~.policytagmanager.ListTaxonomiesRequest`): + The initial request object. + response (:class:`~.policytagmanager.ListTaxonomiesResponse`): + The initial response object. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + """ + self._method = method + self._request = policytagmanager.ListTaxonomiesRequest(request) + self._response = response + self._metadata = metadata + + def __getattr__(self, name: str) -> Any: + return getattr(self._response, name) + + @property + def pages(self) -> Iterable[policytagmanager.ListTaxonomiesResponse]: + yield self._response + while self._response.next_page_token: + self._request.page_token = self._response.next_page_token + self._response = self._method(self._request, metadata=self._metadata) + yield self._response + + def __iter__(self) -> Iterable[policytagmanager.Taxonomy]: + for page in self.pages: + yield from page.taxonomies + + def __repr__(self) -> str: + return "{0}<{1!r}>".format(self.__class__.__name__, self._response) + + +class ListTaxonomiesAsyncPager: + """A pager for iterating through ``list_taxonomies`` requests. + + This class thinly wraps an initial + :class:`~.policytagmanager.ListTaxonomiesResponse` object, and + provides an ``__aiter__`` method to iterate through its + ``taxonomies`` field. 
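The `pages`/`__iter__` pattern these pagers implement can be exercised with test doubles. `FakePage`, `PAGES`, and `fake_method` below are hypothetical stand-ins for the transport, not library types:

```python
class FakePage:
    def __init__(self, items, next_page_token=""):
        self.items = items
        self.next_page_token = next_page_token

# Two pages keyed by page token; "" is the initial request.
PAGES = {
    "": FakePage([1, 2], next_page_token="tok"),
    "tok": FakePage([3], next_page_token=""),
}

def fake_method(request):
    return PAGES[request.get("page_token", "")]

class Pager:
    """Minimal sketch of the sync pager: re-issues the request with each
    next_page_token and yields items across page boundaries."""
    def __init__(self, method, request, response):
        self._method, self._request, self._response = method, request, response

    @property
    def pages(self):
        yield self._response
        while self._response.next_page_token:
            self._request["page_token"] = self._response.next_page_token
            self._response = self._method(self._request)
            yield self._response

    def __iter__(self):
        for page in self.pages:
            yield from page.items
```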
+ + If there are more pages, the ``__aiter__`` method will make additional + ``ListTaxonomies`` requests and continue to iterate + through the ``taxonomies`` field on the + corresponding responses. + + All the usual :class:`~.policytagmanager.ListTaxonomiesResponse` + attributes are available on the pager. If multiple requests are made, only + the most recent response is retained, and thus used for attribute lookup. + """ + + def __init__( + self, + method: Callable[..., Awaitable[policytagmanager.ListTaxonomiesResponse]], + request: policytagmanager.ListTaxonomiesRequest, + response: policytagmanager.ListTaxonomiesResponse, + *, + metadata: Sequence[Tuple[str, str]] = () + ): + """Instantiate the pager. + + Args: + method (Callable): The method that was originally called, and + which instantiated this pager. + request (:class:`~.policytagmanager.ListTaxonomiesRequest`): + The initial request object. + response (:class:`~.policytagmanager.ListTaxonomiesResponse`): + The initial response object. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. 
+ """ + self._method = method + self._request = policytagmanager.ListTaxonomiesRequest(request) + self._response = response + self._metadata = metadata + + def __getattr__(self, name: str) -> Any: + return getattr(self._response, name) + + @property + async def pages(self) -> AsyncIterable[policytagmanager.ListTaxonomiesResponse]: + yield self._response + while self._response.next_page_token: + self._request.page_token = self._response.next_page_token + self._response = await self._method(self._request, metadata=self._metadata) + yield self._response + + def __aiter__(self) -> AsyncIterable[policytagmanager.Taxonomy]: + async def async_generator(): + async for page in self.pages: + for response in page.taxonomies: + yield response + + return async_generator() + + def __repr__(self) -> str: + return "{0}<{1!r}>".format(self.__class__.__name__, self._response) + + +class ListPolicyTagsPager: + """A pager for iterating through ``list_policy_tags`` requests. + + This class thinly wraps an initial + :class:`~.policytagmanager.ListPolicyTagsResponse` object, and + provides an ``__iter__`` method to iterate through its + ``policy_tags`` field. + + If there are more pages, the ``__iter__`` method will make additional + ``ListPolicyTags`` requests and continue to iterate + through the ``policy_tags`` field on the + corresponding responses. + + All the usual :class:`~.policytagmanager.ListPolicyTagsResponse` + attributes are available on the pager. If multiple requests are made, only + the most recent response is retained, and thus used for attribute lookup. + """ + + def __init__( + self, + method: Callable[..., policytagmanager.ListPolicyTagsResponse], + request: policytagmanager.ListPolicyTagsRequest, + response: policytagmanager.ListPolicyTagsResponse, + *, + metadata: Sequence[Tuple[str, str]] = () + ): + """Instantiate the pager. + + Args: + method (Callable): The method that was originally called, and + which instantiated this pager. 
+ request (:class:`~.policytagmanager.ListPolicyTagsRequest`): + The initial request object. + response (:class:`~.policytagmanager.ListPolicyTagsResponse`): + The initial response object. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + """ + self._method = method + self._request = policytagmanager.ListPolicyTagsRequest(request) + self._response = response + self._metadata = metadata + + def __getattr__(self, name: str) -> Any: + return getattr(self._response, name) + + @property + def pages(self) -> Iterable[policytagmanager.ListPolicyTagsResponse]: + yield self._response + while self._response.next_page_token: + self._request.page_token = self._response.next_page_token + self._response = self._method(self._request, metadata=self._metadata) + yield self._response + + def __iter__(self) -> Iterable[policytagmanager.PolicyTag]: + for page in self.pages: + yield from page.policy_tags + + def __repr__(self) -> str: + return "{0}<{1!r}>".format(self.__class__.__name__, self._response) + + +class ListPolicyTagsAsyncPager: + """A pager for iterating through ``list_policy_tags`` requests. + + This class thinly wraps an initial + :class:`~.policytagmanager.ListPolicyTagsResponse` object, and + provides an ``__aiter__`` method to iterate through its + ``policy_tags`` field. + + If there are more pages, the ``__aiter__`` method will make additional + ``ListPolicyTags`` requests and continue to iterate + through the ``policy_tags`` field on the + corresponding responses. + + All the usual :class:`~.policytagmanager.ListPolicyTagsResponse` + attributes are available on the pager. If multiple requests are made, only + the most recent response is retained, and thus used for attribute lookup. 
+ """ + + def __init__( + self, + method: Callable[..., Awaitable[policytagmanager.ListPolicyTagsResponse]], + request: policytagmanager.ListPolicyTagsRequest, + response: policytagmanager.ListPolicyTagsResponse, + *, + metadata: Sequence[Tuple[str, str]] = () + ): + """Instantiate the pager. + + Args: + method (Callable): The method that was originally called, and + which instantiated this pager. + request (:class:`~.policytagmanager.ListPolicyTagsRequest`): + The initial request object. + response (:class:`~.policytagmanager.ListPolicyTagsResponse`): + The initial response object. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + """ + self._method = method + self._request = policytagmanager.ListPolicyTagsRequest(request) + self._response = response + self._metadata = metadata + + def __getattr__(self, name: str) -> Any: + return getattr(self._response, name) + + @property + async def pages(self) -> AsyncIterable[policytagmanager.ListPolicyTagsResponse]: + yield self._response + while self._response.next_page_token: + self._request.page_token = self._response.next_page_token + self._response = await self._method(self._request, metadata=self._metadata) + yield self._response + + def __aiter__(self) -> AsyncIterable[policytagmanager.PolicyTag]: + async def async_generator(): + async for page in self.pages: + for response in page.policy_tags: + yield response + + return async_generator() + + def __repr__(self) -> str: + return "{0}<{1!r}>".format(self.__class__.__name__, self._response) diff --git a/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/transports/__init__.py b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/transports/__init__.py new file mode 100644 index 00000000..1a518753 --- /dev/null +++ b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/transports/__init__.py @@ -0,0 +1,36 @@ +# -*- coding: utf-8 -*- + +# Copyright 2020 Google LLC +# +# Licensed under 
the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +# + +from collections import OrderedDict +from typing import Dict, Type + +from .base import PolicyTagManagerTransport +from .grpc import PolicyTagManagerGrpcTransport +from .grpc_asyncio import PolicyTagManagerGrpcAsyncIOTransport + + +# Compile a registry of transports. +_transport_registry = OrderedDict() # type: Dict[str, Type[PolicyTagManagerTransport]] +_transport_registry["grpc"] = PolicyTagManagerGrpcTransport +_transport_registry["grpc_asyncio"] = PolicyTagManagerGrpcAsyncIOTransport + + +__all__ = ( + "PolicyTagManagerTransport", + "PolicyTagManagerGrpcTransport", + "PolicyTagManagerGrpcAsyncIOTransport", +) diff --git a/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/transports/base.py b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/transports/base.py new file mode 100644 index 00000000..abca4532 --- /dev/null +++ b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/transports/base.py @@ -0,0 +1,288 @@ +# -*- coding: utf-8 -*- + +# Copyright 2020 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. 
+# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +# + +import abc +import typing +import pkg_resources + +from google import auth +from google.api_core import exceptions # type: ignore +from google.api_core import gapic_v1 # type: ignore +from google.api_core import retry as retries # type: ignore +from google.auth import credentials # type: ignore + +from google.cloud.datacatalog_v1beta1.types import policytagmanager +from google.iam.v1 import iam_policy_pb2 as iam_policy # type: ignore +from google.iam.v1 import policy_pb2 as policy # type: ignore +from google.protobuf import empty_pb2 as empty # type: ignore + + +try: + _client_info = gapic_v1.client_info.ClientInfo( + gapic_version=pkg_resources.get_distribution( + "google-cloud-datacatalog", + ).version, + ) +except pkg_resources.DistributionNotFound: + _client_info = gapic_v1.client_info.ClientInfo() + + +class PolicyTagManagerTransport(abc.ABC): + """Abstract transport class for PolicyTagManager.""" + + AUTH_SCOPES = ("https://www.googleapis.com/auth/cloud-platform",) + + def __init__( + self, + *, + host: str = "datacatalog.googleapis.com", + credentials: credentials.Credentials = None, + credentials_file: typing.Optional[str] = None, + scopes: typing.Optional[typing.Sequence[str]] = AUTH_SCOPES, + quota_project_id: typing.Optional[str] = None, + **kwargs, + ) -> None: + """Instantiate the transport. + + Args: + host (Optional[str]): The hostname to connect to. + credentials (Optional[google.auth.credentials.Credentials]): The + authorization credentials to attach to requests. 
These + credentials identify the application to the service; if none + are specified, the client will attempt to ascertain the + credentials from the environment. + credentials_file (Optional[str]): A file with credentials that can + be loaded with :func:`google.auth.load_credentials_from_file`. + This argument is mutually exclusive with credentials. + scopes (Optional[Sequence[str]]): A list of scopes. + quota_project_id (Optional[str]): An optional project to use for billing + and quota. + """ + # Save the hostname. Default to port 443 (HTTPS) if none is specified. + if ":" not in host: + host += ":443" + self._host = host + + # If no credentials are provided, then determine the appropriate + # defaults. + if credentials and credentials_file: + raise exceptions.DuplicateCredentialArgs( + "'credentials_file' and 'credentials' are mutually exclusive" + ) + + if credentials_file is not None: + credentials, _ = auth.load_credentials_from_file( + credentials_file, scopes=scopes, quota_project_id=quota_project_id + ) + + elif credentials is None: + credentials, _ = auth.default( + scopes=scopes, quota_project_id=quota_project_id + ) + + # Save the credentials. + self._credentials = credentials + + # Lifted into its own function so it can be stubbed out during tests. + self._prep_wrapped_messages() + + def _prep_wrapped_messages(self): + # Precompute the wrapped methods.
+ self._wrapped_methods = { + self.create_taxonomy: gapic_v1.method.wrap_method( + self.create_taxonomy, default_timeout=None, client_info=_client_info, + ), + self.delete_taxonomy: gapic_v1.method.wrap_method( + self.delete_taxonomy, default_timeout=None, client_info=_client_info, + ), + self.update_taxonomy: gapic_v1.method.wrap_method( + self.update_taxonomy, default_timeout=None, client_info=_client_info, + ), + self.list_taxonomies: gapic_v1.method.wrap_method( + self.list_taxonomies, default_timeout=None, client_info=_client_info, + ), + self.get_taxonomy: gapic_v1.method.wrap_method( + self.get_taxonomy, default_timeout=None, client_info=_client_info, + ), + self.create_policy_tag: gapic_v1.method.wrap_method( + self.create_policy_tag, default_timeout=None, client_info=_client_info, + ), + self.delete_policy_tag: gapic_v1.method.wrap_method( + self.delete_policy_tag, default_timeout=None, client_info=_client_info, + ), + self.update_policy_tag: gapic_v1.method.wrap_method( + self.update_policy_tag, default_timeout=None, client_info=_client_info, + ), + self.list_policy_tags: gapic_v1.method.wrap_method( + self.list_policy_tags, default_timeout=None, client_info=_client_info, + ), + self.get_policy_tag: gapic_v1.method.wrap_method( + self.get_policy_tag, default_timeout=None, client_info=_client_info, + ), + self.get_iam_policy: gapic_v1.method.wrap_method( + self.get_iam_policy, default_timeout=None, client_info=_client_info, + ), + self.set_iam_policy: gapic_v1.method.wrap_method( + self.set_iam_policy, default_timeout=None, client_info=_client_info, + ), + self.test_iam_permissions: gapic_v1.method.wrap_method( + self.test_iam_permissions, + default_timeout=None, + client_info=_client_info, + ), + } + + @property + def create_taxonomy( + self, + ) -> typing.Callable[ + [policytagmanager.CreateTaxonomyRequest], + typing.Union[ + policytagmanager.Taxonomy, typing.Awaitable[policytagmanager.Taxonomy] + ], + ]: + raise NotImplementedError() + + @property + def 
delete_taxonomy( + self, + ) -> typing.Callable[ + [policytagmanager.DeleteTaxonomyRequest], + typing.Union[empty.Empty, typing.Awaitable[empty.Empty]], + ]: + raise NotImplementedError() + + @property + def update_taxonomy( + self, + ) -> typing.Callable[ + [policytagmanager.UpdateTaxonomyRequest], + typing.Union[ + policytagmanager.Taxonomy, typing.Awaitable[policytagmanager.Taxonomy] + ], + ]: + raise NotImplementedError() + + @property + def list_taxonomies( + self, + ) -> typing.Callable[ + [policytagmanager.ListTaxonomiesRequest], + typing.Union[ + policytagmanager.ListTaxonomiesResponse, + typing.Awaitable[policytagmanager.ListTaxonomiesResponse], + ], + ]: + raise NotImplementedError() + + @property + def get_taxonomy( + self, + ) -> typing.Callable[ + [policytagmanager.GetTaxonomyRequest], + typing.Union[ + policytagmanager.Taxonomy, typing.Awaitable[policytagmanager.Taxonomy] + ], + ]: + raise NotImplementedError() + + @property + def create_policy_tag( + self, + ) -> typing.Callable[ + [policytagmanager.CreatePolicyTagRequest], + typing.Union[ + policytagmanager.PolicyTag, typing.Awaitable[policytagmanager.PolicyTag] + ], + ]: + raise NotImplementedError() + + @property + def delete_policy_tag( + self, + ) -> typing.Callable[ + [policytagmanager.DeletePolicyTagRequest], + typing.Union[empty.Empty, typing.Awaitable[empty.Empty]], + ]: + raise NotImplementedError() + + @property + def update_policy_tag( + self, + ) -> typing.Callable[ + [policytagmanager.UpdatePolicyTagRequest], + typing.Union[ + policytagmanager.PolicyTag, typing.Awaitable[policytagmanager.PolicyTag] + ], + ]: + raise NotImplementedError() + + @property + def list_policy_tags( + self, + ) -> typing.Callable[ + [policytagmanager.ListPolicyTagsRequest], + typing.Union[ + policytagmanager.ListPolicyTagsResponse, + typing.Awaitable[policytagmanager.ListPolicyTagsResponse], + ], + ]: + raise NotImplementedError() + + @property + def get_policy_tag( + self, + ) -> typing.Callable[ + 
[policytagmanager.GetPolicyTagRequest], + typing.Union[ + policytagmanager.PolicyTag, typing.Awaitable[policytagmanager.PolicyTag] + ], + ]: + raise NotImplementedError() + + @property + def get_iam_policy( + self, + ) -> typing.Callable[ + [iam_policy.GetIamPolicyRequest], + typing.Union[policy.Policy, typing.Awaitable[policy.Policy]], + ]: + raise NotImplementedError() + + @property + def set_iam_policy( + self, + ) -> typing.Callable[ + [iam_policy.SetIamPolicyRequest], + typing.Union[policy.Policy, typing.Awaitable[policy.Policy]], + ]: + raise NotImplementedError() + + @property + def test_iam_permissions( + self, + ) -> typing.Callable[ + [iam_policy.TestIamPermissionsRequest], + typing.Union[ + iam_policy.TestIamPermissionsResponse, + typing.Awaitable[iam_policy.TestIamPermissionsResponse], + ], + ]: + raise NotImplementedError() + + +__all__ = ("PolicyTagManagerTransport",) diff --git a/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/transports/grpc.py b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/transports/grpc.py new file mode 100644 index 00000000..d7fc35f0 --- /dev/null +++ b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/transports/grpc.py @@ -0,0 +1,566 @@ +# -*- coding: utf-8 -*- + +# Copyright 2020 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
+# + +from typing import Callable, Dict, Optional, Sequence, Tuple + +from google.api_core import grpc_helpers # type: ignore +from google import auth # type: ignore +from google.auth import credentials # type: ignore +from google.auth.transport.grpc import SslCredentials # type: ignore + + +import grpc # type: ignore + +from google.cloud.datacatalog_v1beta1.types import policytagmanager +from google.iam.v1 import iam_policy_pb2 as iam_policy # type: ignore +from google.iam.v1 import policy_pb2 as policy # type: ignore +from google.protobuf import empty_pb2 as empty # type: ignore + +from .base import PolicyTagManagerTransport + + +class PolicyTagManagerGrpcTransport(PolicyTagManagerTransport): + """gRPC backend transport for PolicyTagManager. + + The policy tag manager API service allows clients to manage + their taxonomies and policy tags. + + This class defines the same methods as the primary client, so the + primary client can load the underlying transport implementation + and call it. + + It sends protocol buffers over the wire using gRPC (which is built on + top of HTTP/2); the ``grpcio`` package must be installed. + """ + + _stubs: Dict[str, Callable] + + def __init__( + self, + *, + host: str = "datacatalog.googleapis.com", + credentials: credentials.Credentials = None, + credentials_file: str = None, + scopes: Sequence[str] = None, + channel: grpc.Channel = None, + api_mtls_endpoint: str = None, + client_cert_source: Callable[[], Tuple[bytes, bytes]] = None, + quota_project_id: Optional[str] = None + ) -> None: + """Instantiate the transport. + + Args: + host (Optional[str]): The hostname to connect to. + credentials (Optional[google.auth.credentials.Credentials]): The + authorization credentials to attach to requests. These + credentials identify the application to the service; if none + are specified, the client will attempt to ascertain the + credentials from the environment. + This argument is ignored if ``channel`` is provided. 
+ credentials_file (Optional[str]): A file with credentials that can + be loaded with :func:`google.auth.load_credentials_from_file`. + This argument is ignored if ``channel`` is provided. + scopes (Optional[Sequence[str]]): A list of scopes. This argument is + ignored if ``channel`` is provided. + channel (Optional[grpc.Channel]): A ``Channel`` instance through + which to make calls. + api_mtls_endpoint (Optional[str]): The mutual TLS endpoint. If + provided, it overrides the ``host`` argument and tries to create + a mutual TLS channel with client SSL credentials from + ``client_cert_source`` or application default SSL credentials. + client_cert_source (Optional[Callable[[], Tuple[bytes, bytes]]]): A + callback to provide client SSL certificate bytes and private key + bytes, both in PEM format. It is ignored if ``api_mtls_endpoint`` + is None. + quota_project_id (Optional[str]): An optional project to use for billing + and quota. + + Raises: + google.auth.exceptions.MutualTLSChannelError: If mutual TLS transport + creation failed for any reason. + google.api_core.exceptions.DuplicateCredentialArgs: If both ``credentials`` + and ``credentials_file`` are passed. + """ + if channel: + # Sanity check: Ensure that channel and credentials are not both + # provided. + credentials = False + + # If a channel was explicitly provided, set it. + self._grpc_channel = channel + elif api_mtls_endpoint: + host = ( + api_mtls_endpoint + if ":" in api_mtls_endpoint + else api_mtls_endpoint + ":443" + ) + + if credentials is None: + credentials, _ = auth.default( + scopes=self.AUTH_SCOPES, quota_project_id=quota_project_id + ) + + # Create SSL credentials with client_cert_source or application + # default SSL credentials. + if client_cert_source: + cert, key = client_cert_source() + ssl_credentials = grpc.ssl_channel_credentials( + certificate_chain=cert, private_key=key + ) + else: + ssl_credentials = SslCredentials().ssl_credentials + + # create a new channel.
The provided one is ignored. + self._grpc_channel = type(self).create_channel( + host, + credentials=credentials, + credentials_file=credentials_file, + ssl_credentials=ssl_credentials, + scopes=scopes or self.AUTH_SCOPES, + quota_project_id=quota_project_id, + ) + + self._stubs = {}  # type: Dict[str, Callable] + + # Run the base constructor. + super().__init__( + host=host, + credentials=credentials, + credentials_file=credentials_file, + scopes=scopes or self.AUTH_SCOPES, + quota_project_id=quota_project_id, + ) + + @classmethod + def create_channel( + cls, + host: str = "datacatalog.googleapis.com", + credentials: credentials.Credentials = None, + credentials_file: str = None, + scopes: Optional[Sequence[str]] = None, + quota_project_id: Optional[str] = None, + **kwargs + ) -> grpc.Channel: + """Create and return a gRPC channel object. + Args: + host (Optional[str]): The host for the channel to use. + credentials (Optional[~.Credentials]): The + authorization credentials to attach to requests. These + credentials identify this application to the service. If + none are specified, the client will attempt to ascertain + the credentials from the environment. + credentials_file (Optional[str]): A file with credentials that can + be loaded with :func:`google.auth.load_credentials_from_file`. + This argument is mutually exclusive with credentials. + scopes (Optional[Sequence[str]]): An optional list of scopes needed for this + service. These are only used when credentials are not specified and + are passed to :func:`google.auth.default`. + quota_project_id (Optional[str]): An optional project to use for billing + and quota. + kwargs (Optional[dict]): Keyword arguments, which are passed to the + channel creation. + Returns: + grpc.Channel: A gRPC channel object. + + Raises: + google.api_core.exceptions.DuplicateCredentialArgs: If both ``credentials`` + and ``credentials_file`` are passed.
+ """ + scopes = scopes or cls.AUTH_SCOPES + return grpc_helpers.create_channel( + host, + credentials=credentials, + credentials_file=credentials_file, + scopes=scopes, + quota_project_id=quota_project_id, + **kwargs + ) + + @property + def grpc_channel(self) -> grpc.Channel: + """Create the channel designed to connect to this service. + + This property caches on the instance; repeated calls return + the same channel. + """ + # Sanity check: Only create a new channel if we do not already + # have one. + if not hasattr(self, "_grpc_channel"): + self._grpc_channel = self.create_channel( + self._host, credentials=self._credentials, + ) + + # Return the channel from cache. + return self._grpc_channel + + @property + def create_taxonomy( + self, + ) -> Callable[[policytagmanager.CreateTaxonomyRequest], policytagmanager.Taxonomy]: + r"""Return a callable for the create taxonomy method over gRPC. + + Creates a taxonomy in the specified project. + + Returns: + Callable[[~.CreateTaxonomyRequest], + ~.Taxonomy]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "create_taxonomy" not in self._stubs: + self._stubs["create_taxonomy"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.PolicyTagManager/CreateTaxonomy", + request_serializer=policytagmanager.CreateTaxonomyRequest.serialize, + response_deserializer=policytagmanager.Taxonomy.deserialize, + ) + return self._stubs["create_taxonomy"] + + @property + def delete_taxonomy( + self, + ) -> Callable[[policytagmanager.DeleteTaxonomyRequest], empty.Empty]: + r"""Return a callable for the delete taxonomy method over gRPC. + + Deletes a taxonomy. This operation will also delete + all policy tags in this taxonomy along with their + associated policies. 
+ + Returns: + Callable[[~.DeleteTaxonomyRequest], + ~.Empty]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "delete_taxonomy" not in self._stubs: + self._stubs["delete_taxonomy"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.PolicyTagManager/DeleteTaxonomy", + request_serializer=policytagmanager.DeleteTaxonomyRequest.serialize, + response_deserializer=empty.Empty.FromString, + ) + return self._stubs["delete_taxonomy"] + + @property + def update_taxonomy( + self, + ) -> Callable[[policytagmanager.UpdateTaxonomyRequest], policytagmanager.Taxonomy]: + r"""Return a callable for the update taxonomy method over gRPC. + + Updates a taxonomy. + + Returns: + Callable[[~.UpdateTaxonomyRequest], + ~.Taxonomy]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "update_taxonomy" not in self._stubs: + self._stubs["update_taxonomy"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.PolicyTagManager/UpdateTaxonomy", + request_serializer=policytagmanager.UpdateTaxonomyRequest.serialize, + response_deserializer=policytagmanager.Taxonomy.deserialize, + ) + return self._stubs["update_taxonomy"] + + @property + def list_taxonomies( + self, + ) -> Callable[ + [policytagmanager.ListTaxonomiesRequest], + policytagmanager.ListTaxonomiesResponse, + ]: + r"""Return a callable for the list taxonomies method over gRPC. + + Lists all taxonomies in a project in a particular + location that the caller has permission to view. 
+ + Returns: + Callable[[~.ListTaxonomiesRequest], + ~.ListTaxonomiesResponse]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "list_taxonomies" not in self._stubs: + self._stubs["list_taxonomies"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.PolicyTagManager/ListTaxonomies", + request_serializer=policytagmanager.ListTaxonomiesRequest.serialize, + response_deserializer=policytagmanager.ListTaxonomiesResponse.deserialize, + ) + return self._stubs["list_taxonomies"] + + @property + def get_taxonomy( + self, + ) -> Callable[[policytagmanager.GetTaxonomyRequest], policytagmanager.Taxonomy]: + r"""Return a callable for the get taxonomy method over gRPC. + + Gets a taxonomy. + + Returns: + Callable[[~.GetTaxonomyRequest], + ~.Taxonomy]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "get_taxonomy" not in self._stubs: + self._stubs["get_taxonomy"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.PolicyTagManager/GetTaxonomy", + request_serializer=policytagmanager.GetTaxonomyRequest.serialize, + response_deserializer=policytagmanager.Taxonomy.deserialize, + ) + return self._stubs["get_taxonomy"] + + @property + def create_policy_tag( + self, + ) -> Callable[ + [policytagmanager.CreatePolicyTagRequest], policytagmanager.PolicyTag + ]: + r"""Return a callable for the create policy tag method over gRPC. + + Creates a policy tag in the specified taxonomy. 
+ + Returns: + Callable[[~.CreatePolicyTagRequest], + ~.PolicyTag]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "create_policy_tag" not in self._stubs: + self._stubs["create_policy_tag"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.PolicyTagManager/CreatePolicyTag", + request_serializer=policytagmanager.CreatePolicyTagRequest.serialize, + response_deserializer=policytagmanager.PolicyTag.deserialize, + ) + return self._stubs["create_policy_tag"] + + @property + def delete_policy_tag( + self, + ) -> Callable[[policytagmanager.DeletePolicyTagRequest], empty.Empty]: + r"""Return a callable for the delete policy tag method over gRPC. + + Deletes a policy tag. Also deletes all of its + descendant policy tags. + + Returns: + Callable[[~.DeletePolicyTagRequest], + ~.Empty]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "delete_policy_tag" not in self._stubs: + self._stubs["delete_policy_tag"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.PolicyTagManager/DeletePolicyTag", + request_serializer=policytagmanager.DeletePolicyTagRequest.serialize, + response_deserializer=empty.Empty.FromString, + ) + return self._stubs["delete_policy_tag"] + + @property + def update_policy_tag( + self, + ) -> Callable[ + [policytagmanager.UpdatePolicyTagRequest], policytagmanager.PolicyTag + ]: + r"""Return a callable for the update policy tag method over gRPC. + + Updates a policy tag. 
+ + Returns: + Callable[[~.UpdatePolicyTagRequest], + ~.PolicyTag]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "update_policy_tag" not in self._stubs: + self._stubs["update_policy_tag"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.PolicyTagManager/UpdatePolicyTag", + request_serializer=policytagmanager.UpdatePolicyTagRequest.serialize, + response_deserializer=policytagmanager.PolicyTag.deserialize, + ) + return self._stubs["update_policy_tag"] + + @property + def list_policy_tags( + self, + ) -> Callable[ + [policytagmanager.ListPolicyTagsRequest], + policytagmanager.ListPolicyTagsResponse, + ]: + r"""Return a callable for the list policy tags method over gRPC. + + Lists all policy tags in a taxonomy. + + Returns: + Callable[[~.ListPolicyTagsRequest], + ~.ListPolicyTagsResponse]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "list_policy_tags" not in self._stubs: + self._stubs["list_policy_tags"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.PolicyTagManager/ListPolicyTags", + request_serializer=policytagmanager.ListPolicyTagsRequest.serialize, + response_deserializer=policytagmanager.ListPolicyTagsResponse.deserialize, + ) + return self._stubs["list_policy_tags"] + + @property + def get_policy_tag( + self, + ) -> Callable[[policytagmanager.GetPolicyTagRequest], policytagmanager.PolicyTag]: + r"""Return a callable for the get policy tag method over gRPC. + + Gets a policy tag. 
+ + Returns: + Callable[[~.GetPolicyTagRequest], + ~.PolicyTag]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "get_policy_tag" not in self._stubs: + self._stubs["get_policy_tag"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.PolicyTagManager/GetPolicyTag", + request_serializer=policytagmanager.GetPolicyTagRequest.serialize, + response_deserializer=policytagmanager.PolicyTag.deserialize, + ) + return self._stubs["get_policy_tag"] + + @property + def get_iam_policy( + self, + ) -> Callable[[iam_policy.GetIamPolicyRequest], policy.Policy]: + r"""Return a callable for the get iam policy method over gRPC. + + Gets the IAM policy for a taxonomy or a policy tag. + + Returns: + Callable[[~.GetIamPolicyRequest], + ~.Policy]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "get_iam_policy" not in self._stubs: + self._stubs["get_iam_policy"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.PolicyTagManager/GetIamPolicy", + request_serializer=iam_policy.GetIamPolicyRequest.SerializeToString, + response_deserializer=policy.Policy.FromString, + ) + return self._stubs["get_iam_policy"] + + @property + def set_iam_policy( + self, + ) -> Callable[[iam_policy.SetIamPolicyRequest], policy.Policy]: + r"""Return a callable for the set iam policy method over gRPC. + + Sets the IAM policy for a taxonomy or a policy tag. + + Returns: + Callable[[~.SetIamPolicyRequest], + ~.Policy]: + A function that, when called, will call the underlying RPC + on the server. 
+ """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "set_iam_policy" not in self._stubs: + self._stubs["set_iam_policy"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.PolicyTagManager/SetIamPolicy", + request_serializer=iam_policy.SetIamPolicyRequest.SerializeToString, + response_deserializer=policy.Policy.FromString, + ) + return self._stubs["set_iam_policy"] + + @property + def test_iam_permissions( + self, + ) -> Callable[ + [iam_policy.TestIamPermissionsRequest], iam_policy.TestIamPermissionsResponse + ]: + r"""Return a callable for the test iam permissions method over gRPC. + + Returns the permissions that a caller has on the + specified taxonomy or policy tag. + + Returns: + Callable[[~.TestIamPermissionsRequest], + ~.TestIamPermissionsResponse]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. 
+ if "test_iam_permissions" not in self._stubs: + self._stubs["test_iam_permissions"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.PolicyTagManager/TestIamPermissions", + request_serializer=iam_policy.TestIamPermissionsRequest.SerializeToString, + response_deserializer=iam_policy.TestIamPermissionsResponse.FromString, + ) + return self._stubs["test_iam_permissions"] + + +__all__ = ("PolicyTagManagerGrpcTransport",) diff --git a/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/transports/grpc_asyncio.py b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/transports/grpc_asyncio.py new file mode 100644 index 00000000..217f0a87 --- /dev/null +++ b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/transports/grpc_asyncio.py @@ -0,0 +1,568 @@ +# -*- coding: utf-8 -*- + +# Copyright 2020 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
+# + +from typing import Awaitable, Callable, Dict, Optional, Sequence, Tuple + +from google.api_core import grpc_helpers_async # type: ignore +from google.auth import credentials # type: ignore +from google.auth.transport.grpc import SslCredentials # type: ignore + +import grpc # type: ignore +from grpc.experimental import aio # type: ignore + +from google.cloud.datacatalog_v1beta1.types import policytagmanager +from google.iam.v1 import iam_policy_pb2 as iam_policy # type: ignore +from google.iam.v1 import policy_pb2 as policy # type: ignore +from google.protobuf import empty_pb2 as empty # type: ignore + +from .base import PolicyTagManagerTransport +from .grpc import PolicyTagManagerGrpcTransport + + +class PolicyTagManagerGrpcAsyncIOTransport(PolicyTagManagerTransport): + """gRPC AsyncIO backend transport for PolicyTagManager. + + The policy tag manager API service allows clients to manage + their taxonomies and policy tags. + + This class defines the same methods as the primary client, so the + primary client can load the underlying transport implementation + and call it. + + It sends protocol buffers over the wire using gRPC (which is built on + top of HTTP/2); the ``grpcio`` package must be installed. + """ + + _grpc_channel: aio.Channel + _stubs: Dict[str, Callable] = {} + + @classmethod + def create_channel( + cls, + host: str = "datacatalog.googleapis.com", + credentials: credentials.Credentials = None, + credentials_file: Optional[str] = None, + scopes: Optional[Sequence[str]] = None, + quota_project_id: Optional[str] = None, + **kwargs, + ) -> aio.Channel: + """Create and return a gRPC AsyncIO channel object. + Args: + host (Optional[str]): The host for the channel to use. + credentials (Optional[~.Credentials]): The + authorization credentials to attach to requests. These + credentials identify this application to the service. If + none are specified, the client will attempt to ascertain + the credentials from the environment.
+ credentials_file (Optional[str]): A file with credentials that can + be loaded with :func:`google.auth.load_credentials_from_file`. + This argument is ignored if ``channel`` is provided. + scopes (Optional[Sequence[str]]): An optional list of scopes needed for this + service. These are only used when credentials are not specified and + are passed to :func:`google.auth.default`. + quota_project_id (Optional[str]): An optional project to use for billing + and quota. + kwargs (Optional[dict]): Keyword arguments, which are passed to the + channel creation. + Returns: + aio.Channel: A gRPC AsyncIO channel object. + """ + scopes = scopes or cls.AUTH_SCOPES + return grpc_helpers_async.create_channel( + host, + credentials=credentials, + credentials_file=credentials_file, + scopes=scopes, + quota_project_id=quota_project_id, + **kwargs, + ) + + def __init__( + self, + *, + host: str = "datacatalog.googleapis.com", + credentials: credentials.Credentials = None, + credentials_file: Optional[str] = None, + scopes: Optional[Sequence[str]] = None, + channel: aio.Channel = None, + api_mtls_endpoint: str = None, + client_cert_source: Callable[[], Tuple[bytes, bytes]] = None, + quota_project_id=None, + ) -> None: + """Instantiate the transport. + + Args: + host (Optional[str]): The hostname to connect to. + credentials (Optional[google.auth.credentials.Credentials]): The + authorization credentials to attach to requests. These + credentials identify the application to the service; if none + are specified, the client will attempt to ascertain the + credentials from the environment. + This argument is ignored if ``channel`` is provided. + credentials_file (Optional[str]): A file with credentials that can + be loaded with :func:`google.auth.load_credentials_from_file`. + This argument is ignored if ``channel`` is provided. + scopes (Optional[Sequence[str]]): An optional list of scopes needed for this + service.
These are only used when credentials are not specified and + are passed to :func:`google.auth.default`. + channel (Optional[aio.Channel]): A ``Channel`` instance through + which to make calls. + api_mtls_endpoint (Optional[str]): The mutual TLS endpoint. If + provided, it overrides the ``host`` argument and tries to create + a mutual TLS channel with client SSL credentials from + ``client_cert_source`` or application default SSL credentials. + client_cert_source (Optional[Callable[[], Tuple[bytes, bytes]]]): A + callback to provide client SSL certificate bytes and private key + bytes, both in PEM format. It is ignored if ``api_mtls_endpoint`` + is None. + quota_project_id (Optional[str]): An optional project to use for billing + and quota. + + Raises: + google.auth.exceptions.MutualTlsChannelError: If mutual TLS transport + creation failed for any reason. + google.api_core.exceptions.DuplicateCredentialArgs: If both ``credentials`` + and ``credentials_file`` are passed. + """ + if channel: + # Sanity check: Ensure that channel and credentials are not both + # provided. + credentials = False + + # If a channel was explicitly provided, set it. + self._grpc_channel = channel + elif api_mtls_endpoint: + host = ( + api_mtls_endpoint + if ":" in api_mtls_endpoint + else api_mtls_endpoint + ":443" + ) + + # Create SSL credentials with client_cert_source or application + # default SSL credentials. + if client_cert_source: + cert, key = client_cert_source() + ssl_credentials = grpc.ssl_channel_credentials( + certificate_chain=cert, private_key=key + ) + else: + ssl_credentials = SslCredentials().ssl_credentials + + # create a new channel. The provided one is ignored. + self._grpc_channel = type(self).create_channel( + host, + credentials=credentials, + credentials_file=credentials_file, + ssl_credentials=ssl_credentials, + scopes=scopes or self.AUTH_SCOPES, + quota_project_id=quota_project_id, + ) + + # Run the base constructor.
+ super().__init__( + host=host, + credentials=credentials, + credentials_file=credentials_file, + scopes=scopes or self.AUTH_SCOPES, + quota_project_id=quota_project_id, + ) + + self._stubs = {} + + @property + def grpc_channel(self) -> aio.Channel: + """Create the channel designed to connect to this service. + + This property caches on the instance; repeated calls return + the same channel. + """ + # Sanity check: Only create a new channel if we do not already + # have one. + if not hasattr(self, "_grpc_channel"): + self._grpc_channel = self.create_channel( + self._host, credentials=self._credentials, + ) + + # Return the channel from cache. + return self._grpc_channel + + @property + def create_taxonomy( + self, + ) -> Callable[ + [policytagmanager.CreateTaxonomyRequest], Awaitable[policytagmanager.Taxonomy] + ]: + r"""Return a callable for the create taxonomy method over gRPC. + + Creates a taxonomy in the specified project. + + Returns: + Callable[[~.CreateTaxonomyRequest], + Awaitable[~.Taxonomy]]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "create_taxonomy" not in self._stubs: + self._stubs["create_taxonomy"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.PolicyTagManager/CreateTaxonomy", + request_serializer=policytagmanager.CreateTaxonomyRequest.serialize, + response_deserializer=policytagmanager.Taxonomy.deserialize, + ) + return self._stubs["create_taxonomy"] + + @property + def delete_taxonomy( + self, + ) -> Callable[[policytagmanager.DeleteTaxonomyRequest], Awaitable[empty.Empty]]: + r"""Return a callable for the delete taxonomy method over gRPC. + + Deletes a taxonomy. This operation will also delete + all policy tags in this taxonomy along with their + associated policies. 
+ + Returns: + Callable[[~.DeleteTaxonomyRequest], + Awaitable[~.Empty]]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "delete_taxonomy" not in self._stubs: + self._stubs["delete_taxonomy"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.PolicyTagManager/DeleteTaxonomy", + request_serializer=policytagmanager.DeleteTaxonomyRequest.serialize, + response_deserializer=empty.Empty.FromString, + ) + return self._stubs["delete_taxonomy"] + + @property + def update_taxonomy( + self, + ) -> Callable[ + [policytagmanager.UpdateTaxonomyRequest], Awaitable[policytagmanager.Taxonomy] + ]: + r"""Return a callable for the update taxonomy method over gRPC. + + Updates a taxonomy. + + Returns: + Callable[[~.UpdateTaxonomyRequest], + Awaitable[~.Taxonomy]]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "update_taxonomy" not in self._stubs: + self._stubs["update_taxonomy"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.PolicyTagManager/UpdateTaxonomy", + request_serializer=policytagmanager.UpdateTaxonomyRequest.serialize, + response_deserializer=policytagmanager.Taxonomy.deserialize, + ) + return self._stubs["update_taxonomy"] + + @property + def list_taxonomies( + self, + ) -> Callable[ + [policytagmanager.ListTaxonomiesRequest], + Awaitable[policytagmanager.ListTaxonomiesResponse], + ]: + r"""Return a callable for the list taxonomies method over gRPC. + + Lists all taxonomies in a project in a particular + location that the caller has permission to view. 
+ + Returns: + Callable[[~.ListTaxonomiesRequest], + Awaitable[~.ListTaxonomiesResponse]]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "list_taxonomies" not in self._stubs: + self._stubs["list_taxonomies"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.PolicyTagManager/ListTaxonomies", + request_serializer=policytagmanager.ListTaxonomiesRequest.serialize, + response_deserializer=policytagmanager.ListTaxonomiesResponse.deserialize, + ) + return self._stubs["list_taxonomies"] + + @property + def get_taxonomy( + self, + ) -> Callable[ + [policytagmanager.GetTaxonomyRequest], Awaitable[policytagmanager.Taxonomy] + ]: + r"""Return a callable for the get taxonomy method over gRPC. + + Gets a taxonomy. + + Returns: + Callable[[~.GetTaxonomyRequest], + Awaitable[~.Taxonomy]]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "get_taxonomy" not in self._stubs: + self._stubs["get_taxonomy"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.PolicyTagManager/GetTaxonomy", + request_serializer=policytagmanager.GetTaxonomyRequest.serialize, + response_deserializer=policytagmanager.Taxonomy.deserialize, + ) + return self._stubs["get_taxonomy"] + + @property + def create_policy_tag( + self, + ) -> Callable[ + [policytagmanager.CreatePolicyTagRequest], Awaitable[policytagmanager.PolicyTag] + ]: + r"""Return a callable for the create policy tag method over gRPC. + + Creates a policy tag in the specified taxonomy. 
+ + Returns: + Callable[[~.CreatePolicyTagRequest], + Awaitable[~.PolicyTag]]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "create_policy_tag" not in self._stubs: + self._stubs["create_policy_tag"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.PolicyTagManager/CreatePolicyTag", + request_serializer=policytagmanager.CreatePolicyTagRequest.serialize, + response_deserializer=policytagmanager.PolicyTag.deserialize, + ) + return self._stubs["create_policy_tag"] + + @property + def delete_policy_tag( + self, + ) -> Callable[[policytagmanager.DeletePolicyTagRequest], Awaitable[empty.Empty]]: + r"""Return a callable for the delete policy tag method over gRPC. + + Deletes a policy tag. Also deletes all of its + descendant policy tags. + + Returns: + Callable[[~.DeletePolicyTagRequest], + Awaitable[~.Empty]]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "delete_policy_tag" not in self._stubs: + self._stubs["delete_policy_tag"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.PolicyTagManager/DeletePolicyTag", + request_serializer=policytagmanager.DeletePolicyTagRequest.serialize, + response_deserializer=empty.Empty.FromString, + ) + return self._stubs["delete_policy_tag"] + + @property + def update_policy_tag( + self, + ) -> Callable[ + [policytagmanager.UpdatePolicyTagRequest], Awaitable[policytagmanager.PolicyTag] + ]: + r"""Return a callable for the update policy tag method over gRPC. + + Updates a policy tag. 
+ + Returns: + Callable[[~.UpdatePolicyTagRequest], + Awaitable[~.PolicyTag]]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "update_policy_tag" not in self._stubs: + self._stubs["update_policy_tag"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.PolicyTagManager/UpdatePolicyTag", + request_serializer=policytagmanager.UpdatePolicyTagRequest.serialize, + response_deserializer=policytagmanager.PolicyTag.deserialize, + ) + return self._stubs["update_policy_tag"] + + @property + def list_policy_tags( + self, + ) -> Callable[ + [policytagmanager.ListPolicyTagsRequest], + Awaitable[policytagmanager.ListPolicyTagsResponse], + ]: + r"""Return a callable for the list policy tags method over gRPC. + + Lists all policy tags in a taxonomy. + + Returns: + Callable[[~.ListPolicyTagsRequest], + Awaitable[~.ListPolicyTagsResponse]]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "list_policy_tags" not in self._stubs: + self._stubs["list_policy_tags"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.PolicyTagManager/ListPolicyTags", + request_serializer=policytagmanager.ListPolicyTagsRequest.serialize, + response_deserializer=policytagmanager.ListPolicyTagsResponse.deserialize, + ) + return self._stubs["list_policy_tags"] + + @property + def get_policy_tag( + self, + ) -> Callable[ + [policytagmanager.GetPolicyTagRequest], Awaitable[policytagmanager.PolicyTag] + ]: + r"""Return a callable for the get policy tag method over gRPC. + + Gets a policy tag. 
+ + Returns: + Callable[[~.GetPolicyTagRequest], + Awaitable[~.PolicyTag]]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "get_policy_tag" not in self._stubs: + self._stubs["get_policy_tag"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.PolicyTagManager/GetPolicyTag", + request_serializer=policytagmanager.GetPolicyTagRequest.serialize, + response_deserializer=policytagmanager.PolicyTag.deserialize, + ) + return self._stubs["get_policy_tag"] + + @property + def get_iam_policy( + self, + ) -> Callable[[iam_policy.GetIamPolicyRequest], Awaitable[policy.Policy]]: + r"""Return a callable for the get iam policy method over gRPC. + + Gets the IAM policy for a taxonomy or a policy tag. + + Returns: + Callable[[~.GetIamPolicyRequest], + Awaitable[~.Policy]]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "get_iam_policy" not in self._stubs: + self._stubs["get_iam_policy"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.PolicyTagManager/GetIamPolicy", + request_serializer=iam_policy.GetIamPolicyRequest.SerializeToString, + response_deserializer=policy.Policy.FromString, + ) + return self._stubs["get_iam_policy"] + + @property + def set_iam_policy( + self, + ) -> Callable[[iam_policy.SetIamPolicyRequest], Awaitable[policy.Policy]]: + r"""Return a callable for the set iam policy method over gRPC. + + Sets the IAM policy for a taxonomy or a policy tag. 
+ + Returns: + Callable[[~.SetIamPolicyRequest], + Awaitable[~.Policy]]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. + if "set_iam_policy" not in self._stubs: + self._stubs["set_iam_policy"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.PolicyTagManager/SetIamPolicy", + request_serializer=iam_policy.SetIamPolicyRequest.SerializeToString, + response_deserializer=policy.Policy.FromString, + ) + return self._stubs["set_iam_policy"] + + @property + def test_iam_permissions( + self, + ) -> Callable[ + [iam_policy.TestIamPermissionsRequest], + Awaitable[iam_policy.TestIamPermissionsResponse], + ]: + r"""Return a callable for the test iam permissions method over gRPC. + + Returns the permissions that a caller has on the + specified taxonomy or policy tag. + + Returns: + Callable[[~.TestIamPermissionsRequest], + Awaitable[~.TestIamPermissionsResponse]]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. 
+ if "test_iam_permissions" not in self._stubs: + self._stubs["test_iam_permissions"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.PolicyTagManager/TestIamPermissions", + request_serializer=iam_policy.TestIamPermissionsRequest.SerializeToString, + response_deserializer=iam_policy.TestIamPermissionsResponse.FromString, + ) + return self._stubs["test_iam_permissions"] + + +__all__ = ("PolicyTagManagerGrpcAsyncIOTransport",) diff --git a/google/cloud/datacatalog.py b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/__init__.py similarity index 55% rename from google/cloud/datacatalog.py rename to google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/__init__.py index 4c14f13a..16fecda2 100644 --- a/google/cloud/datacatalog.py +++ b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/__init__.py @@ -1,33 +1,24 @@ # -*- coding: utf-8 -*- -# + # Copyright 2020 Google LLC # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # -# https://www.apache.org/licenses/LICENSE-2.0 +# http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. 
+# - -from __future__ import absolute_import - -from google.cloud.datacatalog_v1beta1 import DataCatalogClient -from google.cloud.datacatalog_v1beta1 import PolicyTagManagerClient -from google.cloud.datacatalog_v1beta1 import PolicyTagManagerSerializationClient -from google.cloud.datacatalog_v1beta1 import enums -from google.cloud.datacatalog_v1beta1 import types - +from .client import PolicyTagManagerSerializationClient +from .async_client import PolicyTagManagerSerializationAsyncClient __all__ = ( - "enums", - "types", - "DataCatalogClient", - "PolicyTagManagerClient", "PolicyTagManagerSerializationClient", + "PolicyTagManagerSerializationAsyncClient", ) diff --git a/google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/async_client.py b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/async_client.py new file mode 100644 index 00000000..474cc182 --- /dev/null +++ b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/async_client.py @@ -0,0 +1,222 @@ +# -*- coding: utf-8 -*- + +# Copyright 2020 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
+# + +from collections import OrderedDict +import functools +import re +from typing import Dict, Sequence, Tuple, Type, Union +import pkg_resources + +import google.api_core.client_options as ClientOptions # type: ignore +from google.api_core import exceptions # type: ignore +from google.api_core import gapic_v1 # type: ignore +from google.api_core import retry as retries # type: ignore +from google.auth import credentials # type: ignore +from google.oauth2 import service_account # type: ignore + +from google.cloud.datacatalog_v1beta1.types import policytagmanager +from google.cloud.datacatalog_v1beta1.types import policytagmanagerserialization + +from .transports.base import PolicyTagManagerSerializationTransport +from .transports.grpc_asyncio import PolicyTagManagerSerializationGrpcAsyncIOTransport +from .client import PolicyTagManagerSerializationClient + + +class PolicyTagManagerSerializationAsyncClient: + """Policy tag manager serialization API service allows clients + to manipulate their taxonomies and policy tags data with + serialized format. + """ + + _client: PolicyTagManagerSerializationClient + + DEFAULT_ENDPOINT = PolicyTagManagerSerializationClient.DEFAULT_ENDPOINT + DEFAULT_MTLS_ENDPOINT = PolicyTagManagerSerializationClient.DEFAULT_MTLS_ENDPOINT + + from_service_account_file = ( + PolicyTagManagerSerializationClient.from_service_account_file + ) + from_service_account_json = from_service_account_file + + get_transport_class = functools.partial( + type(PolicyTagManagerSerializationClient).get_transport_class, + type(PolicyTagManagerSerializationClient), + ) + + def __init__( + self, + *, + credentials: credentials.Credentials = None, + transport: Union[str, PolicyTagManagerSerializationTransport] = "grpc_asyncio", + client_options: ClientOptions = None, + ) -> None: + """Instantiate the policy tag manager serialization client. 
+ + Args: + credentials (Optional[google.auth.credentials.Credentials]): The + authorization credentials to attach to requests. These + credentials identify the application to the service; if none + are specified, the client will attempt to ascertain the + credentials from the environment. + transport (Union[str, ~.PolicyTagManagerSerializationTransport]): The + transport to use. If set to None, a transport is chosen + automatically. + client_options (ClientOptions): Custom options for the client. It + won't take effect if a ``transport`` instance is provided. + (1) The ``api_endpoint`` property can be used to override the + default endpoint provided by the client. GOOGLE_API_USE_MTLS + environment variable can also be used to override the endpoint: + "always" (always use the default mTLS endpoint), "never" (always + use the default regular endpoint, this is the default value for + the environment variable) and "auto" (auto switch to the default + mTLS endpoint if client SSL credentials are present). However, + the ``api_endpoint`` property takes precedence if provided. + (2) The ``client_cert_source`` property is used to provide client + SSL credentials for mutual TLS transport. If not provided, the + default SSL credentials will be used if present. + + Raises: + google.auth.exceptions.MutualTlsChannelError: If mutual TLS transport + creation failed for any reason. + """ + + self._client = PolicyTagManagerSerializationClient( + credentials=credentials, transport=transport, client_options=client_options, + ) + + async def import_taxonomies( + self, + request: policytagmanagerserialization.ImportTaxonomiesRequest = None, + *, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> policytagmanagerserialization.ImportTaxonomiesResponse: + r"""Imports all taxonomies and their policy tags to a + project as new taxonomies.
+ + This method provides a bulk taxonomy / policy tag + creation using nested proto structure. + + Args: + request (:class:`~.policytagmanagerserialization.ImportTaxonomiesRequest`): + The request object. Request message for + [ImportTaxonomies][google.cloud.datacatalog.v1beta1.PolicyTagManagerSerialization.ImportTaxonomies]. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.policytagmanagerserialization.ImportTaxonomiesResponse: + Response message for + [ImportTaxonomies][google.cloud.datacatalog.v1beta1.PolicyTagManagerSerialization.ImportTaxonomies]. + + """ + # Create or coerce a protobuf request object. + + request = policytagmanagerserialization.ImportTaxonomiesRequest(request) + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = gapic_v1.method_async.wrap_method( + self._client._transport.import_taxonomies, + default_timeout=None, + client_info=_client_info, + ) + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)), + ) + + # Send the request. + response = await rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. + return response + + async def export_taxonomies( + self, + request: policytagmanagerserialization.ExportTaxonomiesRequest = None, + *, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> policytagmanagerserialization.ExportTaxonomiesResponse: + r"""Exports all taxonomies and their policy tags in a + project. 
+ This method generates SerializedTaxonomy protos with + nested policy tags that can be used as an input for + future ImportTaxonomies calls. + + Args: + request (:class:`~.policytagmanagerserialization.ExportTaxonomiesRequest`): + The request object. Request message for + [ExportTaxonomies][google.cloud.datacatalog.v1beta1.PolicyTagManagerSerialization.ExportTaxonomies]. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.policytagmanagerserialization.ExportTaxonomiesResponse: + Response message for + [ExportTaxonomies][google.cloud.datacatalog.v1beta1.PolicyTagManagerSerialization.ExportTaxonomies]. + + """ + # Create or coerce a protobuf request object. + + request = policytagmanagerserialization.ExportTaxonomiesRequest(request) + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = gapic_v1.method_async.wrap_method( + self._client._transport.export_taxonomies, + default_timeout=None, + client_info=_client_info, + ) + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)), + ) + + # Send the request. + response = await rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. 
+ return response + + +try: + _client_info = gapic_v1.client_info.ClientInfo( + gapic_version=pkg_resources.get_distribution( + "google-cloud-datacatalog", + ).version, + ) +except pkg_resources.DistributionNotFound: + _client_info = gapic_v1.client_info.ClientInfo() + + +__all__ = ("PolicyTagManagerSerializationAsyncClient",) diff --git a/google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/client.py b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/client.py new file mode 100644 index 00000000..445d151d --- /dev/null +++ b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/client.py @@ -0,0 +1,357 @@ +# -*- coding: utf-8 -*- + +# Copyright 2020 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
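The `_client_info` block above looks up the installed `google-cloud-datacatalog` version with `pkg_resources.get_distribution` and falls back to a blank `ClientInfo` when the distribution is not found. The same try/fall-back shape, sketched with the stdlib equivalent (`importlib.metadata`, Python 3.8+); the distribution name in the assertion is deliberately made up:

```python
from importlib import metadata


def resolve_gapic_version(dist_name: str, default: str = "") -> str:
    """Return the installed version of dist_name, or default if absent."""
    try:
        return metadata.version(dist_name)
    except metadata.PackageNotFoundError:
        # Not installed (e.g. running from a source checkout): use the default.
        return default


assert resolve_gapic_version("no-such-distribution-xyz") == ""
```

The fallback keeps the client importable from a source checkout, where the package metadata may not be installed.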
+# + +from collections import OrderedDict +import os +import re +from typing import Callable, Dict, Sequence, Tuple, Type, Union +import pkg_resources + +import google.api_core.client_options as ClientOptions # type: ignore +from google.api_core import exceptions # type: ignore +from google.api_core import gapic_v1 # type: ignore +from google.api_core import retry as retries # type: ignore +from google.auth import credentials # type: ignore +from google.auth.transport import mtls # type: ignore +from google.auth.exceptions import MutualTLSChannelError # type: ignore +from google.oauth2 import service_account # type: ignore + +from google.cloud.datacatalog_v1beta1.types import policytagmanager +from google.cloud.datacatalog_v1beta1.types import policytagmanagerserialization + +from .transports.base import PolicyTagManagerSerializationTransport +from .transports.grpc import PolicyTagManagerSerializationGrpcTransport +from .transports.grpc_asyncio import PolicyTagManagerSerializationGrpcAsyncIOTransport + + +class PolicyTagManagerSerializationClientMeta(type): + """Metaclass for the PolicyTagManagerSerialization client. + + This provides class-level methods for building and retrieving + support objects (e.g. transport) without polluting the client instance + objects. + """ + + _transport_registry = ( + OrderedDict() + ) # type: Dict[str, Type[PolicyTagManagerSerializationTransport]] + _transport_registry["grpc"] = PolicyTagManagerSerializationGrpcTransport + _transport_registry[ + "grpc_asyncio" + ] = PolicyTagManagerSerializationGrpcAsyncIOTransport + + def get_transport_class( + cls, label: str = None, + ) -> Type[PolicyTagManagerSerializationTransport]: + """Return an appropriate transport class. + + Args: + label: The name of the desired transport. If none is + provided, then the first transport in the registry is used. + + Returns: + The transport class to use. + """ + # If a specific transport is requested, return that one. 
+ if label: + return cls._transport_registry[label] + + # No transport is requested; return the default (that is, the first one + # in the dictionary). + return next(iter(cls._transport_registry.values())) + + +class PolicyTagManagerSerializationClient( + metaclass=PolicyTagManagerSerializationClientMeta +): + """Policy tag manager serialization API service allows clients + to manipulate their taxonomies and policy tags data with + serialized format. + """ + + @staticmethod + def _get_default_mtls_endpoint(api_endpoint): + """Convert api endpoint to mTLS endpoint. + Convert "*.sandbox.googleapis.com" and "*.googleapis.com" to + "*.mtls.sandbox.googleapis.com" and "*.mtls.googleapis.com" respectively. + Args: + api_endpoint (Optional[str]): the api endpoint to convert. + Returns: + str: converted mTLS api endpoint. + """ + if not api_endpoint: + return api_endpoint + + mtls_endpoint_re = re.compile( + r"(?P<name>[^.]+)(?P<mtls>\.mtls)?(?P<sandbox>\.sandbox)?(?P<googledomain>\.googleapis\.com)?" + ) + + m = mtls_endpoint_re.match(api_endpoint) + name, mtls, sandbox, googledomain = m.groups() + if mtls or not googledomain: + return api_endpoint + + if sandbox: + return api_endpoint.replace( + "sandbox.googleapis.com", "mtls.sandbox.googleapis.com" + ) + + return api_endpoint.replace(".googleapis.com", ".mtls.googleapis.com") + + DEFAULT_ENDPOINT = "datacatalog.googleapis.com" + DEFAULT_MTLS_ENDPOINT = _get_default_mtls_endpoint.__func__( # type: ignore + DEFAULT_ENDPOINT + ) + + @classmethod + def from_service_account_file(cls, filename: str, *args, **kwargs): + """Creates an instance of this client using the provided credentials + file. + + Args: + filename (str): The path to the service account private key json + file. + args: Additional arguments to pass to the constructor. + kwargs: Additional arguments to pass to the constructor. + + Returns: + {@api.name}: The constructed client.
+ """ + credentials = service_account.Credentials.from_service_account_file(filename) + kwargs["credentials"] = credentials + return cls(*args, **kwargs) + + from_service_account_json = from_service_account_file + + def __init__( + self, + *, + credentials: credentials.Credentials = None, + transport: Union[str, PolicyTagManagerSerializationTransport] = None, + client_options: ClientOptions = None, + ) -> None: + """Instantiate the policy tag manager serialization client. + + Args: + credentials (Optional[google.auth.credentials.Credentials]): The + authorization credentials to attach to requests. These + credentials identify the application to the service; if none + are specified, the client will attempt to ascertain the + credentials from the environment. + transport (Union[str, ~.PolicyTagManagerSerializationTransport]): The + transport to use. If set to None, a transport is chosen + automatically. + client_options (ClientOptions): Custom options for the client. It + won't take effect if a ``transport`` instance is provided. + (1) The ``api_endpoint`` property can be used to override the + default endpoint provided by the client. GOOGLE_API_USE_MTLS + environment variable can also be used to override the endpoint: + "always" (always use the default mTLS endpoint), "never" (always + use the default regular endpoint, this is the default value for + the environment variable) and "auto" (auto switch to the default + mTLS endpoint if client SSL credentials is present). However, + the ``api_endpoint`` property takes precedence if provided. + (2) The ``client_cert_source`` property is used to provide client + SSL credentials for mutual TLS transport. If not provided, the + default SSL credentials will be used if present. + + Raises: + google.auth.exceptions.MutualTLSChannelError: If mutual TLS transport + creation failed for any reason. 
+ """ + if isinstance(client_options, dict): + client_options = ClientOptions.from_dict(client_options) + if client_options is None: + client_options = ClientOptions.ClientOptions() + + if client_options.api_endpoint is None: + use_mtls_env = os.getenv("GOOGLE_API_USE_MTLS", "never") + if use_mtls_env == "never": + client_options.api_endpoint = self.DEFAULT_ENDPOINT + elif use_mtls_env == "always": + client_options.api_endpoint = self.DEFAULT_MTLS_ENDPOINT + elif use_mtls_env == "auto": + has_client_cert_source = ( + client_options.client_cert_source is not None + or mtls.has_default_client_cert_source() + ) + client_options.api_endpoint = ( + self.DEFAULT_MTLS_ENDPOINT + if has_client_cert_source + else self.DEFAULT_ENDPOINT + ) + else: + raise MutualTLSChannelError( + "Unsupported GOOGLE_API_USE_MTLS value. Accepted values: never, auto, always" + ) + + # Save or instantiate the transport. + # Ordinarily, we provide the transport, but allowing a custom transport + # instance provides an extensibility point for unusual situations. + if isinstance(transport, PolicyTagManagerSerializationTransport): + # transport is a PolicyTagManagerSerializationTransport instance. + if credentials or client_options.credentials_file: + raise ValueError( + "When providing a transport instance, " + "provide its credentials directly." + ) + if client_options.scopes: + raise ValueError( + "When providing a transport instance, " + "provide its scopes directly." 
+ ) + self._transport = transport + else: + Transport = type(self).get_transport_class(transport) + self._transport = Transport( + credentials=credentials, + credentials_file=client_options.credentials_file, + host=client_options.api_endpoint, + scopes=client_options.scopes, + api_mtls_endpoint=client_options.api_endpoint, + client_cert_source=client_options.client_cert_source, + quota_project_id=client_options.quota_project_id, + ) + + def import_taxonomies( + self, + request: policytagmanagerserialization.ImportTaxonomiesRequest = None, + *, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> policytagmanagerserialization.ImportTaxonomiesResponse: + r"""Imports all taxonomies and their policy tags to a + project as new taxonomies. + + This method provides a bulk taxonomy / policy tag + creation using nested proto structure. + + Args: + request (:class:`~.policytagmanagerserialization.ImportTaxonomiesRequest`): + The request object. Request message for + [ImportTaxonomies][google.cloud.datacatalog.v1beta1.PolicyTagManagerSerialization.ImportTaxonomies]. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.policytagmanagerserialization.ImportTaxonomiesResponse: + Response message for + [ImportTaxonomies][google.cloud.datacatalog.v1beta1.PolicyTagManagerSerialization.ImportTaxonomies]. + + """ + # Create or coerce a protobuf request object. + + # Minor optimization to avoid making a copy if the user passes + # in a policytagmanagerserialization.ImportTaxonomiesRequest. + # There's no risk of modifying the input as we've already verified + # there are no flattened fields. 
+ if not isinstance( + request, policytagmanagerserialization.ImportTaxonomiesRequest + ): + request = policytagmanagerserialization.ImportTaxonomiesRequest(request) + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = self._transport._wrapped_methods[self._transport.import_taxonomies] + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)), + ) + + # Send the request. + response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. + return response + + def export_taxonomies( + self, + request: policytagmanagerserialization.ExportTaxonomiesRequest = None, + *, + retry: retries.Retry = gapic_v1.method.DEFAULT, + timeout: float = None, + metadata: Sequence[Tuple[str, str]] = (), + ) -> policytagmanagerserialization.ExportTaxonomiesResponse: + r"""Exports all taxonomies and their policy tags in a + project. + This method generates SerializedTaxonomy protos with + nested policy tags that can be used as an input for + future ImportTaxonomies calls. + + Args: + request (:class:`~.policytagmanagerserialization.ExportTaxonomiesRequest`): + The request object. Request message for + [ExportTaxonomies][google.cloud.datacatalog.v1beta1.PolicyTagManagerSerialization.ExportTaxonomies]. + + retry (google.api_core.retry.Retry): Designation of what errors, if any, + should be retried. + timeout (float): The timeout for this request. + metadata (Sequence[Tuple[str, str]]): Strings which should be + sent along with the request as metadata. + + Returns: + ~.policytagmanagerserialization.ExportTaxonomiesResponse: + Response message for + [ExportTaxonomies][google.cloud.datacatalog.v1beta1.PolicyTagManagerSerialization.ExportTaxonomies]. + + """ + # Create or coerce a protobuf request object. 
+ + # Minor optimization to avoid making a copy if the user passes + # in a policytagmanagerserialization.ExportTaxonomiesRequest. + # There's no risk of modifying the input as we've already verified + # there are no flattened fields. + if not isinstance( + request, policytagmanagerserialization.ExportTaxonomiesRequest + ): + request = policytagmanagerserialization.ExportTaxonomiesRequest(request) + + # Wrap the RPC method; this adds retry and timeout information, + # and friendly error handling. + rpc = self._transport._wrapped_methods[self._transport.export_taxonomies] + + # Certain fields should be provided within the metadata header; + # add these here. + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)), + ) + + # Send the request. + response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,) + + # Done; return the response. + return response + + +try: + _client_info = gapic_v1.client_info.ClientInfo( + gapic_version=pkg_resources.get_distribution( + "google-cloud-datacatalog", + ).version, + ) +except pkg_resources.DistributionNotFound: + _client_info = gapic_v1.client_info.ClientInfo() + + +__all__ = ("PolicyTagManagerSerializationClient",) diff --git a/google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/transports/__init__.py b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/transports/__init__.py new file mode 100644 index 00000000..9e8babd0 --- /dev/null +++ b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/transports/__init__.py @@ -0,0 +1,38 @@ +# -*- coding: utf-8 -*- + +# Copyright 2020 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. 
+# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +# + +from collections import OrderedDict +from typing import Dict, Type + +from .base import PolicyTagManagerSerializationTransport +from .grpc import PolicyTagManagerSerializationGrpcTransport +from .grpc_asyncio import PolicyTagManagerSerializationGrpcAsyncIOTransport + + +# Compile a registry of transports. +_transport_registry = ( + OrderedDict() +) # type: Dict[str, Type[PolicyTagManagerSerializationTransport]] +_transport_registry["grpc"] = PolicyTagManagerSerializationGrpcTransport +_transport_registry["grpc_asyncio"] = PolicyTagManagerSerializationGrpcAsyncIOTransport + + +__all__ = ( + "PolicyTagManagerSerializationTransport", + "PolicyTagManagerSerializationGrpcTransport", + "PolicyTagManagerSerializationGrpcAsyncIOTransport", +) diff --git a/google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/transports/base.py b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/transports/base.py new file mode 100644 index 00000000..26360d93 --- /dev/null +++ b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/transports/base.py @@ -0,0 +1,136 @@ +# -*- coding: utf-8 -*- + +# Copyright 2020 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. 
+# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +# + +import abc +import typing +import pkg_resources + +from google import auth +from google.api_core import exceptions # type: ignore +from google.api_core import gapic_v1 # type: ignore +from google.api_core import retry as retries # type: ignore +from google.auth import credentials # type: ignore + +from google.cloud.datacatalog_v1beta1.types import policytagmanagerserialization + + +try: + _client_info = gapic_v1.client_info.ClientInfo( + gapic_version=pkg_resources.get_distribution( + "google-cloud-datacatalog", + ).version, + ) +except pkg_resources.DistributionNotFound: + _client_info = gapic_v1.client_info.ClientInfo() + + +class PolicyTagManagerSerializationTransport(abc.ABC): + """Abstract transport class for PolicyTagManagerSerialization.""" + + AUTH_SCOPES = ("https://www.googleapis.com/auth/cloud-platform",) + + def __init__( + self, + *, + host: str = "datacatalog.googleapis.com", + credentials: credentials.Credentials = None, + credentials_file: typing.Optional[str] = None, + scopes: typing.Optional[typing.Sequence[str]] = AUTH_SCOPES, + quota_project_id: typing.Optional[str] = None, + **kwargs, + ) -> None: + """Instantiate the transport. + + Args: + host (Optional[str]): The hostname to connect to. + credentials (Optional[google.auth.credentials.Credentials]): The + authorization credentials to attach to requests. These + credentials identify the application to the service; if none + are specified, the client will attempt to ascertain the + credentials from the environment. 
+ credentials_file (Optional[str]): A file with credentials that can + be loaded with :func:`google.auth.load_credentials_from_file`. + This argument is mutually exclusive with credentials. + scopes (Optional[Sequence[str]]): A list of scopes. + quota_project_id (Optional[str]): An optional project to use for billing + and quota. + """ + # Save the hostname. Default to port 443 (HTTPS) if none is specified. + if ":" not in host: + host += ":443" + self._host = host + + # If no credentials are provided, then determine the appropriate + # defaults. + if credentials and credentials_file: + raise exceptions.DuplicateCredentialArgs( + "'credentials_file' and 'credentials' are mutually exclusive" + ) + + if credentials_file is not None: + credentials, _ = auth.load_credentials_from_file( + credentials_file, scopes=scopes, quota_project_id=quota_project_id + ) + + elif credentials is None: + credentials, _ = auth.default( + scopes=scopes, quota_project_id=quota_project_id + ) + + # Save the credentials. + self._credentials = credentials + + # Lifted into its own function so it can be stubbed out during tests. + self._prep_wrapped_messages() + + def _prep_wrapped_messages(self): + # Precompute the wrapped methods.
+ self._wrapped_methods = { + self.import_taxonomies: gapic_v1.method.wrap_method( + self.import_taxonomies, default_timeout=None, client_info=_client_info, + ), + self.export_taxonomies: gapic_v1.method.wrap_method( + self.export_taxonomies, default_timeout=None, client_info=_client_info, + ), + } + + @property + def import_taxonomies( + self, + ) -> typing.Callable[ + [policytagmanagerserialization.ImportTaxonomiesRequest], + typing.Union[ + policytagmanagerserialization.ImportTaxonomiesResponse, + typing.Awaitable[policytagmanagerserialization.ImportTaxonomiesResponse], + ], + ]: + raise NotImplementedError() + + @property + def export_taxonomies( + self, + ) -> typing.Callable[ + [policytagmanagerserialization.ExportTaxonomiesRequest], + typing.Union[ + policytagmanagerserialization.ExportTaxonomiesResponse, + typing.Awaitable[policytagmanagerserialization.ExportTaxonomiesResponse], + ], + ]: + raise NotImplementedError() + + +__all__ = ("PolicyTagManagerSerializationTransport",) diff --git a/google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/transports/grpc.py b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/transports/grpc.py new file mode 100644 index 00000000..d2d74539 --- /dev/null +++ b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/transports/grpc.py @@ -0,0 +1,277 @@ +# -*- coding: utf-8 -*- + +# Copyright 2020 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
+# + +from typing import Callable, Dict, Optional, Sequence, Tuple + +from google.api_core import grpc_helpers # type: ignore +from google import auth # type: ignore +from google.auth import credentials # type: ignore +from google.auth.transport.grpc import SslCredentials # type: ignore + + +import grpc # type: ignore + +from google.cloud.datacatalog_v1beta1.types import policytagmanagerserialization + +from .base import PolicyTagManagerSerializationTransport + + +class PolicyTagManagerSerializationGrpcTransport( + PolicyTagManagerSerializationTransport +): + """gRPC backend transport for PolicyTagManagerSerialization. + + Policy tag manager serialization API service allows clients + to manipulate their taxonomies and policy tags data with + serialized format. + + This class defines the same methods as the primary client, so the + primary client can load the underlying transport implementation + and call it. + + It sends protocol buffers over the wire using gRPC (which is built on + top of HTTP/2); the ``grpcio`` package must be installed. + """ + + _stubs: Dict[str, Callable] + + def __init__( + self, + *, + host: str = "datacatalog.googleapis.com", + credentials: credentials.Credentials = None, + credentials_file: str = None, + scopes: Sequence[str] = None, + channel: grpc.Channel = None, + api_mtls_endpoint: str = None, + client_cert_source: Callable[[], Tuple[bytes, bytes]] = None, + quota_project_id: Optional[str] = None + ) -> None: + """Instantiate the transport. + + Args: + host (Optional[str]): The hostname to connect to. + credentials (Optional[google.auth.credentials.Credentials]): The + authorization credentials to attach to requests. These + credentials identify the application to the service; if none + are specified, the client will attempt to ascertain the + credentials from the environment. + This argument is ignored if ``channel`` is provided. 
+ credentials_file (Optional[str]): A file with credentials that can + be loaded with :func:`google.auth.load_credentials_from_file`. + This argument is ignored if ``channel`` is provided. + scopes (Optional[Sequence[str]]): A list of scopes. This argument is + ignored if ``channel`` is provided. + channel (Optional[grpc.Channel]): A ``Channel`` instance through + which to make calls. + api_mtls_endpoint (Optional[str]): The mutual TLS endpoint. If + provided, it overrides the ``host`` argument and tries to create + a mutual TLS channel with client SSL credentials from + ``client_cert_source`` or application default SSL credentials. + client_cert_source (Optional[Callable[[], Tuple[bytes, bytes]]]): A + callback to provide client SSL certificate bytes and private key + bytes, both in PEM format. It is ignored if ``api_mtls_endpoint`` + is None. + quota_project_id (Optional[str]): An optional project to use for billing + and quota. + + Raises: + google.auth.exceptions.MutualTLSChannelError: If mutual TLS transport + creation failed for any reason. + google.api_core.exceptions.DuplicateCredentialArgs: If both ``credentials`` + and ``credentials_file`` are passed. + """ + if channel: + # Sanity check: Ensure that channel and credentials are not both + # provided. + credentials = False + + # If a channel was explicitly provided, set it. + self._grpc_channel = channel + elif api_mtls_endpoint: + host = ( + api_mtls_endpoint + if ":" in api_mtls_endpoint + else api_mtls_endpoint + ":443" + ) + + if credentials is None: + credentials, _ = auth.default( + scopes=self.AUTH_SCOPES, quota_project_id=quota_project_id + ) + + # Create SSL credentials with client_cert_source or application + # default SSL credentials. + if client_cert_source: + cert, key = client_cert_source() + ssl_credentials = grpc.ssl_channel_credentials( + certificate_chain=cert, private_key=key + ) + else: + ssl_credentials = SslCredentials().ssl_credentials + + # create a new channel.
The provided one is ignored. + self._grpc_channel = type(self).create_channel( + host, + credentials=credentials, + credentials_file=credentials_file, + ssl_credentials=ssl_credentials, + scopes=scopes or self.AUTH_SCOPES, + quota_project_id=quota_project_id, + ) + + self._stubs = {} # type: Dict[str, Callable] + + # Run the base constructor. + super().__init__( + host=host, + credentials=credentials, + credentials_file=credentials_file, + scopes=scopes or self.AUTH_SCOPES, + quota_project_id=quota_project_id, + ) + + @classmethod + def create_channel( + cls, + host: str = "datacatalog.googleapis.com", + credentials: credentials.Credentials = None, + credentials_file: str = None, + scopes: Optional[Sequence[str]] = None, + quota_project_id: Optional[str] = None, + **kwargs + ) -> grpc.Channel: + """Create and return a gRPC channel object. + Args: + address (Optional[str]): The host for the channel to use. + credentials (Optional[~.Credentials]): The + authorization credentials to attach to requests. These + credentials identify this application to the service. If + none are specified, the client will attempt to ascertain + the credentials from the environment. + credentials_file (Optional[str]): A file with credentials that can + be loaded with :func:`google.auth.load_credentials_from_file`. + This argument is mutually exclusive with credentials. + scopes (Optional[Sequence[str]]): An optional list of scopes needed for this + service. These are only used when credentials are not specified and + are passed to :func:`google.auth.default`. + quota_project_id (Optional[str]): An optional project to use for billing + and quota. + kwargs (Optional[dict]): Keyword arguments, which are passed to the + channel creation. + Returns: + grpc.Channel: A gRPC channel object. + + Raises: + google.api_core.exceptions.DuplicateCredentialArgs: If both ``credentials`` + and ``credentials_file`` are passed.
+ """ + scopes = scopes or cls.AUTH_SCOPES + return grpc_helpers.create_channel( + host, + credentials=credentials, + credentials_file=credentials_file, + scopes=scopes, + quota_project_id=quota_project_id, + **kwargs + ) + + @property + def grpc_channel(self) -> grpc.Channel: + """Create the channel designed to connect to this service. + + This property caches on the instance; repeated calls return + the same channel. + """ + # Sanity check: Only create a new channel if we do not already + # have one. + if not hasattr(self, "_grpc_channel"): + self._grpc_channel = self.create_channel( + self._host, credentials=self._credentials, + ) + + # Return the channel from cache. + return self._grpc_channel + + @property + def import_taxonomies( + self, + ) -> Callable[ + [policytagmanagerserialization.ImportTaxonomiesRequest], + policytagmanagerserialization.ImportTaxonomiesResponse, + ]: + r"""Return a callable for the import taxonomies method over gRPC. + + Imports all taxonomies and their policy tags to a + project as new taxonomies. + + This method provides a bulk taxonomy / policy tag + creation using nested proto structure. + + Returns: + Callable[[~.ImportTaxonomiesRequest], + ~.ImportTaxonomiesResponse]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. 
+ if "import_taxonomies" not in self._stubs: + self._stubs["import_taxonomies"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.PolicyTagManagerSerialization/ImportTaxonomies", + request_serializer=policytagmanagerserialization.ImportTaxonomiesRequest.serialize, + response_deserializer=policytagmanagerserialization.ImportTaxonomiesResponse.deserialize, + ) + return self._stubs["import_taxonomies"] + + @property + def export_taxonomies( + self, + ) -> Callable[ + [policytagmanagerserialization.ExportTaxonomiesRequest], + policytagmanagerserialization.ExportTaxonomiesResponse, + ]: + r"""Return a callable for the export taxonomies method over gRPC. + + Exports all taxonomies and their policy tags in a + project. + This method generates SerializedTaxonomy protos with + nested policy tags that can be used as an input for + future ImportTaxonomies calls. + + Returns: + Callable[[~.ExportTaxonomiesRequest], + ~.ExportTaxonomiesResponse]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. 
+ if "export_taxonomies" not in self._stubs: + self._stubs["export_taxonomies"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.PolicyTagManagerSerialization/ExportTaxonomies", + request_serializer=policytagmanagerserialization.ExportTaxonomiesRequest.serialize, + response_deserializer=policytagmanagerserialization.ExportTaxonomiesResponse.deserialize, + ) + return self._stubs["export_taxonomies"] + + +__all__ = ("PolicyTagManagerSerializationGrpcTransport",) diff --git a/google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/transports/grpc_asyncio.py b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/transports/grpc_asyncio.py new file mode 100644 index 00000000..8e47b76f --- /dev/null +++ b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/transports/grpc_asyncio.py @@ -0,0 +1,270 @@ +# -*- coding: utf-8 -*- + +# Copyright 2020 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
+# + +from typing import Awaitable, Callable, Dict, Optional, Sequence, Tuple + +from google.api_core import grpc_helpers_async # type: ignore +from google.auth import credentials # type: ignore +from google.auth.transport.grpc import SslCredentials # type: ignore + +import grpc # type: ignore +from grpc.experimental import aio # type: ignore + +from google.cloud.datacatalog_v1beta1.types import policytagmanagerserialization + +from .base import PolicyTagManagerSerializationTransport +from .grpc import PolicyTagManagerSerializationGrpcTransport + + +class PolicyTagManagerSerializationGrpcAsyncIOTransport( + PolicyTagManagerSerializationTransport +): + """gRPC AsyncIO backend transport for PolicyTagManagerSerialization. + + Policy tag manager serialization API service allows clients + to manipulate their taxonomies and policy tags data with + serialized format. + + This class defines the same methods as the primary client, so the + primary client can load the underlying transport implementation + and call it. + + It sends protocol buffers over the wire using gRPC (which is built on + top of HTTP/2); the ``grpcio`` package must be installed. + """ + + _grpc_channel: aio.Channel + _stubs: Dict[str, Callable] = {} + + @classmethod + def create_channel( + cls, + host: str = "datacatalog.googleapis.com", + credentials: credentials.Credentials = None, + credentials_file: Optional[str] = None, + scopes: Optional[Sequence[str]] = None, + quota_project_id: Optional[str] = None, + **kwargs, + ) -> aio.Channel: + """Create and return a gRPC AsyncIO channel object. + Args: + address (Optional[str]): The host for the channel to use. + credentials (Optional[~.Credentials]): The + authorization credentials to attach to requests. These + credentials identify this application to the service. If + none are specified, the client will attempt to ascertain + the credentials from the environment. 
+ credentials_file (Optional[str]): A file with credentials that can + be loaded with :func:`google.auth.load_credentials_from_file`. + This argument is ignored if ``channel`` is provided. + scopes (Optional[Sequence[str]]): An optional list of scopes needed for this + service. These are only used when credentials are not specified and + are passed to :func:`google.auth.default`. + quota_project_id (Optional[str]): An optional project to use for billing + and quota. + kwargs (Optional[dict]): Keyword arguments, which are passed to the + channel creation. + Returns: + aio.Channel: A gRPC AsyncIO channel object. + """ + scopes = scopes or cls.AUTH_SCOPES + return grpc_helpers_async.create_channel( + host, + credentials=credentials, + credentials_file=credentials_file, + scopes=scopes, + quota_project_id=quota_project_id, + **kwargs, + ) + + def __init__( + self, + *, + host: str = "datacatalog.googleapis.com", + credentials: credentials.Credentials = None, + credentials_file: Optional[str] = None, + scopes: Optional[Sequence[str]] = None, + channel: aio.Channel = None, + api_mtls_endpoint: str = None, + client_cert_source: Callable[[], Tuple[bytes, bytes]] = None, + quota_project_id=None, + ) -> None: + """Instantiate the transport. + + Args: + host (Optional[str]): The hostname to connect to. + credentials (Optional[google.auth.credentials.Credentials]): The + authorization credentials to attach to requests. These + credentials identify the application to the service; if none + are specified, the client will attempt to ascertain the + credentials from the environment. + This argument is ignored if ``channel`` is provided. + credentials_file (Optional[str]): A file with credentials that can + be loaded with :func:`google.auth.load_credentials_from_file`. + This argument is ignored if ``channel`` is provided. + scopes (Optional[Sequence[str]]): An optional list of scopes needed for this + service.
These are only used when credentials are not specified and + are passed to :func:`google.auth.default`. + channel (Optional[aio.Channel]): A ``Channel`` instance through + which to make calls. + api_mtls_endpoint (Optional[str]): The mutual TLS endpoint. If + provided, it overrides the ``host`` argument and tries to create + a mutual TLS channel with client SSL credentials from + ``client_cert_source`` or application default SSL credentials. + client_cert_source (Optional[Callable[[], Tuple[bytes, bytes]]]): A + callback to provide client SSL certificate bytes and private key + bytes, both in PEM format. It is ignored if ``api_mtls_endpoint`` + is None. + quota_project_id (Optional[str]): An optional project to use for billing + and quota. + + Raises: + google.auth.exceptions.MutualTLSChannelError: If mutual TLS transport + creation failed for any reason. + google.api_core.exceptions.DuplicateCredentialArgs: If both ``credentials`` + and ``credentials_file`` are passed. + """ + if channel: + # Sanity check: Ensure that channel and credentials are not both + # provided. + credentials = False + + # If a channel was explicitly provided, set it. + self._grpc_channel = channel + elif api_mtls_endpoint: + host = ( + api_mtls_endpoint + if ":" in api_mtls_endpoint + else api_mtls_endpoint + ":443" + ) + + # Create SSL credentials with client_cert_source or application + # default SSL credentials. + if client_cert_source: + cert, key = client_cert_source() + ssl_credentials = grpc.ssl_channel_credentials( + certificate_chain=cert, private_key=key + ) + else: + ssl_credentials = SslCredentials().ssl_credentials + + # create a new channel. The provided one is ignored. + self._grpc_channel = type(self).create_channel( + host, + credentials=credentials, + credentials_file=credentials_file, + ssl_credentials=ssl_credentials, + scopes=scopes or self.AUTH_SCOPES, + quota_project_id=quota_project_id, + ) + + # Run the base constructor.
+ super().__init__( + host=host, + credentials=credentials, + credentials_file=credentials_file, + scopes=scopes or self.AUTH_SCOPES, + quota_project_id=quota_project_id, + ) + + self._stubs = {} + + @property + def grpc_channel(self) -> aio.Channel: + """Create the channel designed to connect to this service. + + This property caches on the instance; repeated calls return + the same channel. + """ + # Sanity check: Only create a new channel if we do not already + # have one. + if not hasattr(self, "_grpc_channel"): + self._grpc_channel = self.create_channel( + self._host, credentials=self._credentials, + ) + + # Return the channel from cache. + return self._grpc_channel + + @property + def import_taxonomies( + self, + ) -> Callable[ + [policytagmanagerserialization.ImportTaxonomiesRequest], + Awaitable[policytagmanagerserialization.ImportTaxonomiesResponse], + ]: + r"""Return a callable for the import taxonomies method over gRPC. + + Imports all taxonomies and their policy tags to a + project as new taxonomies. + + This method provides a bulk taxonomy / policy tag + creation using nested proto structure. + + Returns: + Callable[[~.ImportTaxonomiesRequest], + Awaitable[~.ImportTaxonomiesResponse]]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. 
+ if "import_taxonomies" not in self._stubs: + self._stubs["import_taxonomies"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.PolicyTagManagerSerialization/ImportTaxonomies", + request_serializer=policytagmanagerserialization.ImportTaxonomiesRequest.serialize, + response_deserializer=policytagmanagerserialization.ImportTaxonomiesResponse.deserialize, + ) + return self._stubs["import_taxonomies"] + + @property + def export_taxonomies( + self, + ) -> Callable[ + [policytagmanagerserialization.ExportTaxonomiesRequest], + Awaitable[policytagmanagerserialization.ExportTaxonomiesResponse], + ]: + r"""Return a callable for the export taxonomies method over gRPC. + + Exports all taxonomies and their policy tags in a + project. + This method generates SerializedTaxonomy protos with + nested policy tags that can be used as an input for + future ImportTaxonomies calls. + + Returns: + Callable[[~.ExportTaxonomiesRequest], + Awaitable[~.ExportTaxonomiesResponse]]: + A function that, when called, will call the underlying RPC + on the server. + """ + # Generate a "stub function" on-the-fly which will actually make + # the request. + # gRPC handles serialization and deserialization, so we just need + # to pass in the functions for each. 
+ if "export_taxonomies" not in self._stubs: + self._stubs["export_taxonomies"] = self.grpc_channel.unary_unary( + "/google.cloud.datacatalog.v1beta1.PolicyTagManagerSerialization/ExportTaxonomies", + request_serializer=policytagmanagerserialization.ExportTaxonomiesRequest.serialize, + response_deserializer=policytagmanagerserialization.ExportTaxonomiesResponse.deserialize, + ) + return self._stubs["export_taxonomies"] + + +__all__ = ("PolicyTagManagerSerializationGrpcAsyncIOTransport",) diff --git a/google/cloud/datacatalog_v1beta1/types.py b/google/cloud/datacatalog_v1beta1/types.py deleted file mode 100644 index b074c792..00000000 --- a/google/cloud/datacatalog_v1beta1/types.py +++ /dev/null @@ -1,76 +0,0 @@ -# -*- coding: utf-8 -*- -# -# Copyright 2020 Google LLC -# -# Licensed under the Apache License, Version 2.0 (the "License"); -# you may not use this file except in compliance with the License. -# You may obtain a copy of the License at -# -# https://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, software -# distributed under the License is distributed on an "AS IS" BASIS, -# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -# See the License for the specific language governing permissions and -# limitations under the License. 
- - -from __future__ import absolute_import -import sys - -from google.api_core.protobuf_helpers import get_messages - -from google.cloud.datacatalog_v1beta1.proto import datacatalog_pb2 -from google.cloud.datacatalog_v1beta1.proto import gcs_fileset_spec_pb2 -from google.cloud.datacatalog_v1beta1.proto import policytagmanager_pb2 -from google.cloud.datacatalog_v1beta1.proto import policytagmanagerserialization_pb2 -from google.cloud.datacatalog_v1beta1.proto import schema_pb2 -from google.cloud.datacatalog_v1beta1.proto import search_pb2 -from google.cloud.datacatalog_v1beta1.proto import table_spec_pb2 -from google.cloud.datacatalog_v1beta1.proto import tags_pb2 -from google.cloud.datacatalog_v1beta1.proto import timestamps_pb2 -from google.iam.v1 import iam_policy_pb2 -from google.iam.v1 import options_pb2 -from google.iam.v1 import policy_pb2 -from google.protobuf import empty_pb2 -from google.protobuf import field_mask_pb2 -from google.protobuf import timestamp_pb2 -from google.type import expr_pb2 - - -_shared_modules = [ - iam_policy_pb2, - options_pb2, - policy_pb2, - empty_pb2, - field_mask_pb2, - timestamp_pb2, - expr_pb2, -] - -_local_modules = [ - datacatalog_pb2, - gcs_fileset_spec_pb2, - policytagmanager_pb2, - policytagmanagerserialization_pb2, - schema_pb2, - search_pb2, - table_spec_pb2, - tags_pb2, - timestamps_pb2, -] - -names = [] - -for module in _shared_modules: # pragma: NO COVER - for name, message in get_messages(module).items(): - setattr(sys.modules[__name__], name, message) - names.append(name) -for module in _local_modules: - for name, message in get_messages(module).items(): - message.__module__ = "google.cloud.datacatalog_v1beta1.types" - setattr(sys.modules[__name__], name, message) - names.append(name) - - -__all__ = tuple(sorted(names)) diff --git a/google/cloud/datacatalog_v1beta1/types/__init__.py b/google/cloud/datacatalog_v1beta1/types/__init__.py new file mode 100644 index 00000000..8a5c9ee7 --- /dev/null +++ 
b/google/cloud/datacatalog_v1beta1/types/__init__.py @@ -0,0 +1,167 @@ +# -*- coding: utf-8 -*- + +# Copyright 2020 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +# + +from .timestamps import SystemTimestamps +from .gcs_fileset_spec import ( + GcsFilesetSpec, + GcsFileSpec, +) +from .schema import ( + Schema, + ColumnSchema, +) +from .search import SearchCatalogResult +from .table_spec import ( + BigQueryTableSpec, + ViewSpec, + TableSpec, + BigQueryDateShardedSpec, +) +from .tags import ( + Tag, + TagField, + TagTemplate, + TagTemplateField, + FieldType, +) +from .datacatalog import ( + SearchCatalogRequest, + SearchCatalogResponse, + CreateEntryGroupRequest, + UpdateEntryGroupRequest, + GetEntryGroupRequest, + DeleteEntryGroupRequest, + ListEntryGroupsRequest, + ListEntryGroupsResponse, + CreateEntryRequest, + UpdateEntryRequest, + DeleteEntryRequest, + GetEntryRequest, + LookupEntryRequest, + Entry, + EntryGroup, + CreateTagTemplateRequest, + GetTagTemplateRequest, + UpdateTagTemplateRequest, + DeleteTagTemplateRequest, + CreateTagRequest, + UpdateTagRequest, + DeleteTagRequest, + CreateTagTemplateFieldRequest, + UpdateTagTemplateFieldRequest, + RenameTagTemplateFieldRequest, + DeleteTagTemplateFieldRequest, + ListTagsRequest, + ListTagsResponse, + ListEntriesRequest, + ListEntriesResponse, +) +from .policytagmanager import ( + Taxonomy, + PolicyTag, + CreateTaxonomyRequest, + DeleteTaxonomyRequest, + UpdateTaxonomyRequest, + ListTaxonomiesRequest, + 
ListTaxonomiesResponse, + GetTaxonomyRequest, + CreatePolicyTagRequest, + DeletePolicyTagRequest, + UpdatePolicyTagRequest, + ListPolicyTagsRequest, + ListPolicyTagsResponse, + GetPolicyTagRequest, +) +from .policytagmanagerserialization import ( + SerializedTaxonomy, + SerializedPolicyTag, + ImportTaxonomiesRequest, + InlineSource, + ImportTaxonomiesResponse, + ExportTaxonomiesRequest, + ExportTaxonomiesResponse, +) + + +__all__ = ( + "SystemTimestamps", + "GcsFilesetSpec", + "GcsFileSpec", + "Schema", + "ColumnSchema", + "SearchCatalogResult", + "BigQueryTableSpec", + "ViewSpec", + "TableSpec", + "BigQueryDateShardedSpec", + "Tag", + "TagField", + "TagTemplate", + "TagTemplateField", + "FieldType", + "SearchCatalogRequest", + "SearchCatalogResponse", + "CreateEntryGroupRequest", + "UpdateEntryGroupRequest", + "GetEntryGroupRequest", + "DeleteEntryGroupRequest", + "ListEntryGroupsRequest", + "ListEntryGroupsResponse", + "CreateEntryRequest", + "UpdateEntryRequest", + "DeleteEntryRequest", + "GetEntryRequest", + "LookupEntryRequest", + "Entry", + "EntryGroup", + "CreateTagTemplateRequest", + "GetTagTemplateRequest", + "UpdateTagTemplateRequest", + "DeleteTagTemplateRequest", + "CreateTagRequest", + "UpdateTagRequest", + "DeleteTagRequest", + "CreateTagTemplateFieldRequest", + "UpdateTagTemplateFieldRequest", + "RenameTagTemplateFieldRequest", + "DeleteTagTemplateFieldRequest", + "ListTagsRequest", + "ListTagsResponse", + "ListEntriesRequest", + "ListEntriesResponse", + "Taxonomy", + "PolicyTag", + "CreateTaxonomyRequest", + "DeleteTaxonomyRequest", + "UpdateTaxonomyRequest", + "ListTaxonomiesRequest", + "ListTaxonomiesResponse", + "GetTaxonomyRequest", + "CreatePolicyTagRequest", + "DeletePolicyTagRequest", + "UpdatePolicyTagRequest", + "ListPolicyTagsRequest", + "ListPolicyTagsResponse", + "GetPolicyTagRequest", + "SerializedTaxonomy", + "SerializedPolicyTag", + "ImportTaxonomiesRequest", + "InlineSource", + "ImportTaxonomiesResponse", + "ExportTaxonomiesRequest", 
+ "ExportTaxonomiesResponse", +) diff --git a/google/cloud/datacatalog_v1beta1/types/common.py b/google/cloud/datacatalog_v1beta1/types/common.py new file mode 100644 index 00000000..73167f1e --- /dev/null +++ b/google/cloud/datacatalog_v1beta1/types/common.py @@ -0,0 +1,35 @@ +# -*- coding: utf-8 -*- + +# Copyright 2020 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +# + +import proto # type: ignore + + +__protobuf__ = proto.module( + package="google.cloud.datacatalog.v1beta1", manifest={"IntegratedSystem",}, +) + + +class IntegratedSystem(proto.Enum): + r"""This enum describes all the possible systems that Data + Catalog integrates with. + """ + INTEGRATED_SYSTEM_UNSPECIFIED = 0 + BIGQUERY = 1 + CLOUD_PUBSUB = 2 + + +__all__ = tuple(sorted(__protobuf__.manifest)) diff --git a/google/cloud/datacatalog_v1beta1/types/datacatalog.py b/google/cloud/datacatalog_v1beta1/types/datacatalog.py new file mode 100644 index 00000000..7bbbae2f --- /dev/null +++ b/google/cloud/datacatalog_v1beta1/types/datacatalog.py @@ -0,0 +1,996 @@ +# -*- coding: utf-8 -*- + +# Copyright 2020 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. 
+# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +# + +import proto # type: ignore + + +from google.cloud.datacatalog_v1beta1.types import common +from google.cloud.datacatalog_v1beta1.types import ( + gcs_fileset_spec as gcd_gcs_fileset_spec, +) +from google.cloud.datacatalog_v1beta1.types import schema as gcd_schema +from google.cloud.datacatalog_v1beta1.types import search +from google.cloud.datacatalog_v1beta1.types import table_spec +from google.cloud.datacatalog_v1beta1.types import tags as gcd_tags +from google.cloud.datacatalog_v1beta1.types import timestamps +from google.protobuf import field_mask_pb2 as field_mask # type: ignore + + +__protobuf__ = proto.module( + package="google.cloud.datacatalog.v1beta1", + manifest={ + "EntryType", + "SearchCatalogRequest", + "SearchCatalogResponse", + "CreateEntryGroupRequest", + "UpdateEntryGroupRequest", + "GetEntryGroupRequest", + "DeleteEntryGroupRequest", + "ListEntryGroupsRequest", + "ListEntryGroupsResponse", + "CreateEntryRequest", + "UpdateEntryRequest", + "DeleteEntryRequest", + "GetEntryRequest", + "LookupEntryRequest", + "Entry", + "EntryGroup", + "CreateTagTemplateRequest", + "GetTagTemplateRequest", + "UpdateTagTemplateRequest", + "DeleteTagTemplateRequest", + "CreateTagRequest", + "UpdateTagRequest", + "DeleteTagRequest", + "CreateTagTemplateFieldRequest", + "UpdateTagTemplateFieldRequest", + "RenameTagTemplateFieldRequest", + "DeleteTagTemplateFieldRequest", + "ListTagsRequest", + "ListTagsResponse", + "ListEntriesRequest", + "ListEntriesResponse", + }, +) + + +class EntryType(proto.Enum): + r"""Entry resources in Data Catalog can be of 
different types e.g. a + BigQuery Table entry is of type ``TABLE``. This enum describes all + the possible types Data Catalog contains. + """ + ENTRY_TYPE_UNSPECIFIED = 0 + TABLE = 2 + MODEL = 5 + DATA_STREAM = 3 + FILESET = 4 + + +class SearchCatalogRequest(proto.Message): + r"""Request message for + [SearchCatalog][google.cloud.datacatalog.v1beta1.DataCatalog.SearchCatalog]. + + Attributes: + scope (~.datacatalog.SearchCatalogRequest.Scope): + Required. The scope of this search request. A ``scope`` that + has empty ``include_org_ids``, ``include_project_ids`` AND + false ``include_gcp_public_datasets`` is considered invalid. + Data Catalog will return an error in such a case. + query (str): + Required. The query string in search query syntax. The query + must be non-empty. + + Query strings can be as simple as "x" or more qualified as: + + - name:x + - column:x + - description:y + + Note: Query tokens need to have a minimum of 3 characters + for substring matching to work correctly. See `Data Catalog + Search + Syntax `__ + for more information. + page_size (int): + Number of results in the search page. If <=0 then defaults + to 10. Max limit for page_size is 1000. Throws an invalid + argument for page_size > 1000. + page_token (str): + Optional. Pagination token returned in an earlier + [SearchCatalogResponse.next_page_token][google.cloud.datacatalog.v1beta1.SearchCatalogResponse.next_page_token], + which indicates that this is a continuation of a prior + [SearchCatalogRequest][google.cloud.datacatalog.v1beta1.DataCatalog.SearchCatalog] + call, and that the system should return the next page of + data. If empty, the first page is returned. + order_by (str): + Specifies the ordering of results, currently supported + case-sensitive choices are: + + - ``relevance``, only supports descending + - ``last_modified_timestamp [asc|desc]``, defaults to + descending if not specified + + If not specified, defaults to ``relevance`` descending.
+ """ + + class Scope(proto.Message): + r"""The criteria that select the subspace used for query + matching. + + Attributes: + include_org_ids (Sequence[str]): + The list of organization IDs to search + within. To find your organization ID, follow + instructions in + https://cloud.google.com/resource- + manager/docs/creating-managing-organization. + include_project_ids (Sequence[str]): + The list of project IDs to search within. To + learn more about the distinction between project + names/IDs/numbers, go to + https://cloud.google.com/docs/overview/#projects. + include_gcp_public_datasets (bool): + If ``true``, include Google Cloud Platform (GCP) public + datasets in the search results. Info on GCP public datasets + is available at https://cloud.google.com/public-datasets/. + By default, GCP public datasets are excluded. + """ + + include_org_ids = proto.RepeatedField(proto.STRING, number=2) + + include_project_ids = proto.RepeatedField(proto.STRING, number=3) + + include_gcp_public_datasets = proto.Field(proto.BOOL, number=7) + + scope = proto.Field(proto.MESSAGE, number=6, message=Scope,) + + query = proto.Field(proto.STRING, number=1) + + page_size = proto.Field(proto.INT32, number=2) + + page_token = proto.Field(proto.STRING, number=3) + + order_by = proto.Field(proto.STRING, number=5) + + +class SearchCatalogResponse(proto.Message): + r"""Response message for + [SearchCatalog][google.cloud.datacatalog.v1beta1.DataCatalog.SearchCatalog]. + + Attributes: + results (Sequence[~.search.SearchCatalogResult]): + Search results. + next_page_token (str): + The token that can be used to retrieve the + next page of results. 
+ """ + + @property + def raw_page(self): + return self + + results = proto.RepeatedField( + proto.MESSAGE, number=1, message=search.SearchCatalogResult, + ) + + next_page_token = proto.Field(proto.STRING, number=3) + + +class CreateEntryGroupRequest(proto.Message): + r"""Request message for + [CreateEntryGroup][google.cloud.datacatalog.v1beta1.DataCatalog.CreateEntryGroup]. + + Attributes: + parent (str): + Required. The name of the project this entry group is in. + Example: + + - projects/{project_id}/locations/{location} + + Note that this EntryGroup and its child resources may not + actually be stored in the location in this name. + entry_group_id (str): + Required. The id of the entry group to + create. The id must begin with a letter or + underscore, contain only English letters, + numbers and underscores, and be at most 64 + characters. + entry_group (~.datacatalog.EntryGroup): + The entry group to create. Defaults to an + empty entry group. + """ + + parent = proto.Field(proto.STRING, number=1) + + entry_group_id = proto.Field(proto.STRING, number=3) + + entry_group = proto.Field(proto.MESSAGE, number=2, message="EntryGroup",) + + +class UpdateEntryGroupRequest(proto.Message): + r"""Request message for + [UpdateEntryGroup][google.cloud.datacatalog.v1beta1.DataCatalog.UpdateEntryGroup]. + + Attributes: + entry_group (~.datacatalog.EntryGroup): + Required. The updated entry group. "name" + field must be set. + update_mask (~.field_mask.FieldMask): + The fields to update on the entry group. If + absent or empty, all modifiable fields are + updated. + """ + + entry_group = proto.Field(proto.MESSAGE, number=1, message="EntryGroup",) + + update_mask = proto.Field(proto.MESSAGE, number=2, message=field_mask.FieldMask,) + + +class GetEntryGroupRequest(proto.Message): + r"""Request message for + [GetEntryGroup][google.cloud.datacatalog.v1beta1.DataCatalog.GetEntryGroup]. + + Attributes: + name (str): + Required. The name of the entry group. 
For example, + ``projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}``. + read_mask (~.field_mask.FieldMask): + The fields to return. If not set or empty, + all fields are returned. + """ + + name = proto.Field(proto.STRING, number=1) + + read_mask = proto.Field(proto.MESSAGE, number=2, message=field_mask.FieldMask,) + + +class DeleteEntryGroupRequest(proto.Message): + r"""Request message for + [DeleteEntryGroup][google.cloud.datacatalog.v1beta1.DataCatalog.DeleteEntryGroup]. + + Attributes: + name (str): + Required. The name of the entry group. For example, + ``projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}``. + force (bool): + Optional. If true, deletes all entries in the + entry group. + """ + + name = proto.Field(proto.STRING, number=1) + + force = proto.Field(proto.BOOL, number=2) + + +class ListEntryGroupsRequest(proto.Message): + r"""Request message for + [ListEntryGroups][google.cloud.datacatalog.v1beta1.DataCatalog.ListEntryGroups]. + + Attributes: + parent (str): + Required. The name of the location that contains the entry + groups, which can be provided in URL format. Example: + + - projects/{project_id}/locations/{location} + page_size (int): + Optional. The maximum number of items to return. Default is + 10. Max limit is 1000. Throws an invalid argument for + ``page_size > 1000``. + page_token (str): + Optional. Token that specifies which page is + requested. If empty, the first page is returned. + """ + + parent = proto.Field(proto.STRING, number=1) + + page_size = proto.Field(proto.INT32, number=2) + + page_token = proto.Field(proto.STRING, number=3) + + +class ListEntryGroupsResponse(proto.Message): + r"""Response message for + [ListEntryGroups][google.cloud.datacatalog.v1beta1.DataCatalog.ListEntryGroups]. + + Attributes: + entry_groups (Sequence[~.datacatalog.EntryGroup]): + EntryGroup details. + next_page_token (str): + Token to retrieve the next page of results. 
+ It is set to empty if no items remain in + results. + """ + + @property + def raw_page(self): + return self + + entry_groups = proto.RepeatedField(proto.MESSAGE, number=1, message="EntryGroup",) + + next_page_token = proto.Field(proto.STRING, number=2) + + +class CreateEntryRequest(proto.Message): + r"""Request message for + [CreateEntry][google.cloud.datacatalog.v1beta1.DataCatalog.CreateEntry]. + + Attributes: + parent (str): + Required. The name of the entry group this entry is in. + Example: + + - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id} + + Note that this Entry and its child resources may not + actually be stored in the location in this name. + entry_id (str): + Required. The id of the entry to create. + entry (~.datacatalog.Entry): + Required. The entry to create. + """ + + parent = proto.Field(proto.STRING, number=1) + + entry_id = proto.Field(proto.STRING, number=3) + + entry = proto.Field(proto.MESSAGE, number=2, message="Entry",) + + +class UpdateEntryRequest(proto.Message): + r"""Request message for + [UpdateEntry][google.cloud.datacatalog.v1beta1.DataCatalog.UpdateEntry]. + + Attributes: + entry (~.datacatalog.Entry): + Required. The updated entry. The "name" field + must be set. + update_mask (~.field_mask.FieldMask): + The fields to update on the entry. If absent or empty, all + modifiable fields are updated. 
+ + The following fields are modifiable: + + - For entries with type ``DATA_STREAM``: + + - ``schema`` + + - For entries with type ``FILESET`` + + - ``schema`` + - ``display_name`` + - ``description`` + - ``gcs_fileset_spec`` + - ``gcs_fileset_spec.file_patterns`` + + - For entries with ``user_specified_type`` + + - ``schema`` + - ``display_name`` + - ``description`` + - user_specified_type + - user_specified_system + - linked_resource + - source_system_timestamps + """ + + entry = proto.Field(proto.MESSAGE, number=1, message="Entry",) + + update_mask = proto.Field(proto.MESSAGE, number=2, message=field_mask.FieldMask,) + + +class DeleteEntryRequest(proto.Message): + r"""Request message for + [DeleteEntry][google.cloud.datacatalog.v1beta1.DataCatalog.DeleteEntry]. + + Attributes: + name (str): + Required. The name of the entry. Example: + + - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id} + """ + + name = proto.Field(proto.STRING, number=1) + + +class GetEntryRequest(proto.Message): + r"""Request message for + [GetEntry][google.cloud.datacatalog.v1beta1.DataCatalog.GetEntry]. + + Attributes: + name (str): + Required. The name of the entry. Example: + + - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id} + """ + + name = proto.Field(proto.STRING, number=1) + + +class LookupEntryRequest(proto.Message): + r"""Request message for + [LookupEntry][google.cloud.datacatalog.v1beta1.DataCatalog.LookupEntry]. + + Attributes: + linked_resource (str): + The full name of the Google Cloud Platform resource the Data + Catalog entry represents. See: + https://cloud.google.com/apis/design/resource_names#full_resource_name. + Full names are case-sensitive. + + Examples: + + - //bigquery.googleapis.com/projects/projectId/datasets/datasetId/tables/tableId + - //pubsub.googleapis.com/projects/projectId/topics/topicId + sql_resource (str): + The SQL name of the entry. SQL names are case-sensitive. 
+ + Examples: + + - ``pubsub.project_id.topic_id`` + - :literal:`pubsub.project_id.`topic.id.with.dots\`` + - ``bigquery.table.project_id.dataset_id.table_id`` + - ``bigquery.dataset.project_id.dataset_id`` + - ``datacatalog.entry.project_id.location_id.entry_group_id.entry_id`` + + ``*_id``\ s should satisfy the standard SQL rules for + identifiers. + https://cloud.google.com/bigquery/docs/reference/standard-sql/lexical. + """ + + linked_resource = proto.Field(proto.STRING, number=1, oneof="target_name") + + sql_resource = proto.Field(proto.STRING, number=3, oneof="target_name") + + +class Entry(proto.Message): + r"""Entry Metadata. A Data Catalog Entry resource represents another + resource in Google Cloud Platform (such as a BigQuery dataset or a + Pub/Sub topic), or outside of Google Cloud Platform. Clients can use + the ``linked_resource`` field in the Entry resource to refer to the + original resource ID of the source system. + + An Entry resource contains resource details, such as its schema. An + Entry can also be used to attach flexible metadata, such as a + [Tag][google.cloud.datacatalog.v1beta1.Tag]. + + Attributes: + name (str): + The Data Catalog resource name of the entry in URL format. + Example: + + - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id} + + Note that this Entry and its child resources may not + actually be stored in the location in this name. + linked_resource (str): + The resource this metadata entry refers to. + + For Google Cloud Platform resources, ``linked_resource`` is + the `full name of the + resource `__. + For example, the ``linked_resource`` for a table resource + from BigQuery is: + + - //bigquery.googleapis.com/projects/projectId/datasets/datasetId/tables/tableId + + Output only when Entry is of type in the EntryType enum. For + entries with user_specified_type, this field is optional and + defaults to an empty string. + type (~.datacatalog.EntryType): + The type of the entry.
+ Only used for Entries with types in the + EntryType enum. + user_specified_type (str): + Entry type if it does not fit any of the input-allowed + values listed in ``EntryType`` enum above. When creating an + entry, users should check the enum values first, if nothing + matches the entry to be created, then provide a custom + value, for example "my_special_type". + ``user_specified_type`` strings must begin with a letter or + underscore and can only contain letters, numbers, and + underscores; are case insensitive; must be at least 1 + character and at most 64 characters long. + + Currently, only FILESET enum value is allowed. All other + entries created through Data Catalog must use + ``user_specified_type``. + integrated_system (~.common.IntegratedSystem): + Output only. This field indicates the entry's + source system that Data Catalog integrates with, + such as BigQuery or Pub/Sub. + user_specified_system (str): + This field indicates the entry's source system that Data + Catalog does not integrate with. ``user_specified_system`` + strings must begin with a letter or underscore and can only + contain letters, numbers, and underscores; are case + insensitive; must be at least 1 character and at most 64 + characters long. + gcs_fileset_spec (~.gcd_gcs_fileset_spec.GcsFilesetSpec): + Specification that applies to a Cloud Storage + fileset. This is only valid on entries of type + FILESET. + bigquery_table_spec (~.table_spec.BigQueryTableSpec): + Specification that applies to a BigQuery table. This is only + valid on entries of type ``TABLE``. + bigquery_date_sharded_spec (~.table_spec.BigQueryDateShardedSpec): + Specification for a group of BigQuery tables with name + pattern ``[prefix]YYYYMMDD``. Context: + https://cloud.google.com/bigquery/docs/partitioned-tables#partitioning_versus_sharding. + display_name (str): + Display information such as title and + description. A short name to identify the entry, + for example, "Analytics Data - Jan 2011". 
+ Default value is an empty string. + description (str): + Entry description, which can consist of + several sentences or paragraphs that describe + entry contents. Default value is an empty + string. + schema (~.gcd_schema.Schema): + Schema of the entry. An entry might not have + any schema attached to it. + source_system_timestamps (~.timestamps.SystemTimestamps): + Output only. Timestamps about the underlying resource, not + about this Data Catalog entry. Output only when Entry is of + type in the EntryType enum. For entries with + user_specified_type, this field is optional and defaults to + an empty timestamp. + """ + + name = proto.Field(proto.STRING, number=1) + + linked_resource = proto.Field(proto.STRING, number=9) + + type = proto.Field(proto.ENUM, number=2, oneof="entry_type", enum="EntryType",) + + user_specified_type = proto.Field(proto.STRING, number=16, oneof="entry_type") + + integrated_system = proto.Field( + proto.ENUM, number=17, oneof="system", enum=common.IntegratedSystem, + ) + + user_specified_system = proto.Field(proto.STRING, number=18, oneof="system") + + gcs_fileset_spec = proto.Field( + proto.MESSAGE, + number=6, + oneof="type_spec", + message=gcd_gcs_fileset_spec.GcsFilesetSpec, + ) + + bigquery_table_spec = proto.Field( + proto.MESSAGE, + number=12, + oneof="type_spec", + message=table_spec.BigQueryTableSpec, + ) + + bigquery_date_sharded_spec = proto.Field( + proto.MESSAGE, + number=15, + oneof="type_spec", + message=table_spec.BigQueryDateShardedSpec, + ) + + display_name = proto.Field(proto.STRING, number=3) + + description = proto.Field(proto.STRING, number=4) + + schema = proto.Field(proto.MESSAGE, number=5, message=gcd_schema.Schema,) + + source_system_timestamps = proto.Field( + proto.MESSAGE, number=7, message=timestamps.SystemTimestamps, + ) + + +class EntryGroup(proto.Message): + r"""EntryGroup Metadata. 
An EntryGroup resource represents a logical + grouping of zero or more Data Catalog + [Entry][google.cloud.datacatalog.v1beta1.Entry] resources. + + Attributes: + name (str): + The resource name of the entry group in URL format. Example: + + - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id} + + Note that this EntryGroup and its child resources may not + actually be stored in the location in this name. + display_name (str): + A short name to identify the entry group, for + example, "analytics data - jan 2011". Default + value is an empty string. + description (str): + Entry group description, which can consist of + several sentences or paragraphs that describe + entry group contents. Default value is an empty + string. + data_catalog_timestamps (~.timestamps.SystemTimestamps): + Output only. Timestamps about this + EntryGroup. Default value is empty timestamps. + """ + + name = proto.Field(proto.STRING, number=1) + + display_name = proto.Field(proto.STRING, number=2) + + description = proto.Field(proto.STRING, number=3) + + data_catalog_timestamps = proto.Field( + proto.MESSAGE, number=4, message=timestamps.SystemTimestamps, + ) + + +class CreateTagTemplateRequest(proto.Message): + r"""Request message for + [CreateTagTemplate][google.cloud.datacatalog.v1beta1.DataCatalog.CreateTagTemplate]. + + Attributes: + parent (str): + Required. The name of the project and the template location + [region](https://cloud.google.com/data-catalog/docs/concepts/regions). + + Example: + + - projects/{project_id}/locations/us-central1 + tag_template_id (str): + Required. The id of the tag template to + create. + tag_template (~.gcd_tags.TagTemplate): + Required. The tag template to create.
+ """ + + parent = proto.Field(proto.STRING, number=1) + + tag_template_id = proto.Field(proto.STRING, number=3) + + tag_template = proto.Field(proto.MESSAGE, number=2, message=gcd_tags.TagTemplate,) + + +class GetTagTemplateRequest(proto.Message): + r"""Request message for + [GetTagTemplate][google.cloud.datacatalog.v1beta1.DataCatalog.GetTagTemplate]. + + Attributes: + name (str): + Required. The name of the tag template. Example: + + - projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id} + """ + + name = proto.Field(proto.STRING, number=1) + + +class UpdateTagTemplateRequest(proto.Message): + r"""Request message for + [UpdateTagTemplate][google.cloud.datacatalog.v1beta1.DataCatalog.UpdateTagTemplate]. + + Attributes: + tag_template (~.gcd_tags.TagTemplate): + Required. The template to update. The "name" + field must be set. + update_mask (~.field_mask.FieldMask): + The field mask specifies the parts of the template to + overwrite. + + Allowed fields: + + - ``display_name`` + + If absent or empty, all of the allowed fields above will be + updated. + """ + + tag_template = proto.Field(proto.MESSAGE, number=1, message=gcd_tags.TagTemplate,) + + update_mask = proto.Field(proto.MESSAGE, number=2, message=field_mask.FieldMask,) + + +class DeleteTagTemplateRequest(proto.Message): + r"""Request message for + [DeleteTagTemplate][google.cloud.datacatalog.v1beta1.DataCatalog.DeleteTagTemplate]. + + Attributes: + name (str): + Required. The name of the tag template to delete. Example: + + - projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id} + force (bool): + Required. Currently, this field must always be set to + ``true``. This confirms the deletion of any possible tags + using this template. ``force = false`` will be supported in + the future. 
+ """ + + name = proto.Field(proto.STRING, number=1) + + force = proto.Field(proto.BOOL, number=2) + + +class CreateTagRequest(proto.Message): + r"""Request message for + [CreateTag][google.cloud.datacatalog.v1beta1.DataCatalog.CreateTag]. + + Attributes: + parent (str): + Required. The name of the resource to attach this tag to. + Tags can be attached to Entries. Example: + + - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id} + + Note that this Tag and its child resources may not actually + be stored in the location in this name. + tag (~.gcd_tags.Tag): + Required. The tag to create. + """ + + parent = proto.Field(proto.STRING, number=1) + + tag = proto.Field(proto.MESSAGE, number=2, message=gcd_tags.Tag,) + + +class UpdateTagRequest(proto.Message): + r"""Request message for + [UpdateTag][google.cloud.datacatalog.v1beta1.DataCatalog.UpdateTag]. + + Attributes: + tag (~.gcd_tags.Tag): + Required. The updated tag. The "name" field + must be set. + update_mask (~.field_mask.FieldMask): + The fields to update on the Tag. If absent or empty, all + modifiable fields are updated. Currently the only modifiable + field is the field ``fields``. + """ + + tag = proto.Field(proto.MESSAGE, number=1, message=gcd_tags.Tag,) + + update_mask = proto.Field(proto.MESSAGE, number=2, message=field_mask.FieldMask,) + + +class DeleteTagRequest(proto.Message): + r"""Request message for + [DeleteTag][google.cloud.datacatalog.v1beta1.DataCatalog.DeleteTag]. + + Attributes: + name (str): + Required. The name of the tag to delete. Example: + + - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id}/tags/{tag_id} + """ + + name = proto.Field(proto.STRING, number=1) + + +class CreateTagTemplateFieldRequest(proto.Message): + r"""Request message for + [CreateTagTemplateField][google.cloud.datacatalog.v1beta1.DataCatalog.CreateTagTemplateField]. + + Attributes: + parent (str): + Required. 
The name of the project and the template location + `region <https://cloud.google.com/data-catalog/docs/concepts/regions>`__. + + Example: + + - projects/{project_id}/locations/us-central1/tagTemplates/{tag_template_id} + tag_template_field_id (str): + Required. The ID of the tag template field to create. Field + IDs can contain letters (both uppercase and lowercase), + numbers (0-9), underscores (_) and dashes (-). Field IDs + must be at least 1 character long and at most 128 characters + long. Field IDs must also be unique within their template. + tag_template_field (~.gcd_tags.TagTemplateField): + Required. The tag template field to create. + """ + + parent = proto.Field(proto.STRING, number=1) + + tag_template_field_id = proto.Field(proto.STRING, number=2) + + tag_template_field = proto.Field( + proto.MESSAGE, number=3, message=gcd_tags.TagTemplateField, + ) + + +class UpdateTagTemplateFieldRequest(proto.Message): + r"""Request message for + [UpdateTagTemplateField][google.cloud.datacatalog.v1beta1.DataCatalog.UpdateTagTemplateField]. + + Attributes: + name (str): + Required. The name of the tag template field. Example: + + - projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id}/fields/{tag_template_field_id} + tag_template_field (~.gcd_tags.TagTemplateField): + Required. The template to update. + update_mask (~.field_mask.FieldMask): + Optional. The field mask specifies the parts of the template + to be updated. Allowed fields: + + - ``display_name`` + - ``type.enum_type`` + - ``is_required`` + + If ``update_mask`` is not set or empty, all of the allowed + fields above will be updated. + + When updating an enum type, the provided values will be + merged with the existing values. Therefore, enum values can + only be added, existing enum values cannot be deleted nor + renamed. Updating a template field from optional to required + is NOT allowed.
+ """ + + name = proto.Field(proto.STRING, number=1) + + tag_template_field = proto.Field( + proto.MESSAGE, number=2, message=gcd_tags.TagTemplateField, + ) + + update_mask = proto.Field(proto.MESSAGE, number=3, message=field_mask.FieldMask,) + + +class RenameTagTemplateFieldRequest(proto.Message): + r"""Request message for + [RenameTagTemplateField][google.cloud.datacatalog.v1beta1.DataCatalog.RenameTagTemplateField]. + + Attributes: + name (str): + Required. The name of the tag template. Example: + + - projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id}/fields/{tag_template_field_id} + new_tag_template_field_id (str): + Required. The new ID of this tag template field. For + example, ``my_new_field``. + """ + + name = proto.Field(proto.STRING, number=1) + + new_tag_template_field_id = proto.Field(proto.STRING, number=2) + + +class DeleteTagTemplateFieldRequest(proto.Message): + r"""Request message for + [DeleteTagTemplateField][google.cloud.datacatalog.v1beta1.DataCatalog.DeleteTagTemplateField]. + + Attributes: + name (str): + Required. The name of the tag template field to delete. + Example: + + - projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id}/fields/{tag_template_field_id} + force (bool): + Required. Currently, this field must always be set to + ``true``. This confirms the deletion of this field from any + tags using this field. ``force = false`` will be supported + in the future. + """ + + name = proto.Field(proto.STRING, number=1) + + force = proto.Field(proto.BOOL, number=2) + + +class ListTagsRequest(proto.Message): + r"""Request message for + [ListTags][google.cloud.datacatalog.v1beta1.DataCatalog.ListTags]. + + Attributes: + parent (str): + Required. The name of the Data Catalog resource to list the + tags of. The resource could be an + [Entry][google.cloud.datacatalog.v1beta1.Entry] or an + [EntryGroup][google.cloud.datacatalog.v1beta1.EntryGroup]. 
+ + Examples: + + - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id} + - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id} + page_size (int): + The maximum number of tags to return. Default + is 10. Max limit is 1000. + page_token (str): + Token that specifies which page is requested. + If empty, the first page is returned. + """ + + parent = proto.Field(proto.STRING, number=1) + + page_size = proto.Field(proto.INT32, number=2) + + page_token = proto.Field(proto.STRING, number=3) + + +class ListTagsResponse(proto.Message): + r"""Response message for + [ListTags][google.cloud.datacatalog.v1beta1.DataCatalog.ListTags]. + + Attributes: + tags (Sequence[~.gcd_tags.Tag]): + [Tag][google.cloud.datacatalog.v1beta1.Tag] details. + next_page_token (str): + Token to retrieve the next page of results. + It is set to empty if no items remain in + results. + """ + + @property + def raw_page(self): + return self + + tags = proto.RepeatedField(proto.MESSAGE, number=1, message=gcd_tags.Tag,) + + next_page_token = proto.Field(proto.STRING, number=2) + + +class ListEntriesRequest(proto.Message): + r"""Request message for + [ListEntries][google.cloud.datacatalog.v1beta1.DataCatalog.ListEntries]. + + Attributes: + parent (str): + Required. The name of the entry group that contains the + entries, which can be provided in URL format. Example: + + - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id} + page_size (int): + The maximum number of items to return. Default is 10. Max + limit is 1000. Throws an invalid argument for + ``page_size > 1000``. + page_token (str): + Token that specifies which page is requested. + If empty, the first page is returned. + read_mask (~.field_mask.FieldMask): + The fields to return for each Entry. If not set or empty, + all fields are returned. 
For example, setting read_mask to + contain only one path "name" will cause ListEntries to + return a list of Entries with only "name" field. + """ + + parent = proto.Field(proto.STRING, number=1) + + page_size = proto.Field(proto.INT32, number=2) + + page_token = proto.Field(proto.STRING, number=3) + + read_mask = proto.Field(proto.MESSAGE, number=4, message=field_mask.FieldMask,) + + +class ListEntriesResponse(proto.Message): + r"""Response message for + [ListEntries][google.cloud.datacatalog.v1beta1.DataCatalog.ListEntries]. + + Attributes: + entries (Sequence[~.datacatalog.Entry]): + Entry details. + next_page_token (str): + Token to retrieve the next page of results. + It is set to empty if no items remain in + results. + """ + + @property + def raw_page(self): + return self + + entries = proto.RepeatedField(proto.MESSAGE, number=1, message=Entry,) + + next_page_token = proto.Field(proto.STRING, number=2) + + +__all__ = tuple(sorted(__protobuf__.manifest)) diff --git a/google/cloud/datacatalog_v1beta1/types/gcs_fileset_spec.py b/google/cloud/datacatalog_v1beta1/types/gcs_fileset_spec.py new file mode 100644 index 00000000..cc52615b --- /dev/null +++ b/google/cloud/datacatalog_v1beta1/types/gcs_fileset_spec.py @@ -0,0 +1,103 @@ +# -*- coding: utf-8 -*- + +# Copyright 2020 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
+# + +import proto # type: ignore + + +from google.cloud.datacatalog_v1beta1.types import timestamps + + +__protobuf__ = proto.module( + package="google.cloud.datacatalog.v1beta1", + manifest={"GcsFilesetSpec", "GcsFileSpec",}, +) + + +class GcsFilesetSpec(proto.Message): + r"""Describes a Cloud Storage fileset entry. + + Attributes: + file_patterns (Sequence[str]): + Required. Patterns to identify a set of files in Google + Cloud Storage. See `Cloud Storage + documentation `__ + for more information. Note that bucket wildcards are + currently not supported. + + Examples of valid file_patterns: + + - ``gs://bucket_name/dir/*``: matches all files within + ``bucket_name/dir`` directory. + - ``gs://bucket_name/dir/**``: matches all files in + ``bucket_name/dir`` spanning all subdirectories. + - ``gs://bucket_name/file*``: matches files prefixed by + ``file`` in ``bucket_name`` + - ``gs://bucket_name/??.txt``: matches files with two + characters followed by ``.txt`` in ``bucket_name`` + - ``gs://bucket_name/[aeiou].txt``: matches files that + contain a single vowel character followed by ``.txt`` in + ``bucket_name`` + - ``gs://bucket_name/[a-m].txt``: matches files that + contain ``a``, ``b``, ... or ``m`` followed by ``.txt`` + in ``bucket_name`` + - ``gs://bucket_name/a/*/b``: matches all files in + ``bucket_name`` that match ``a/*/b`` pattern, such as + ``a/c/b``, ``a/d/b`` + - ``gs://another_bucket/a.txt``: matches + ``gs://another_bucket/a.txt`` + + You can combine wildcards to provide more powerful matches, + for example: + + - ``gs://bucket_name/[a-m]??.j*g`` + sample_gcs_file_specs (Sequence[~.gcs_fileset_spec.GcsFileSpec]): + Output only. Sample files contained in this + fileset, not all files contained in this fileset + are represented here. 
+ """ + + file_patterns = proto.RepeatedField(proto.STRING, number=1) + + sample_gcs_file_specs = proto.RepeatedField( + proto.MESSAGE, number=2, message="GcsFileSpec", + ) + + +class GcsFileSpec(proto.Message): + r"""Specifications of a single file in Cloud Storage. + + Attributes: + file_path (str): + Required. The full file path. Example: + ``gs://bucket_name/a/b.txt``. + gcs_timestamps (~.timestamps.SystemTimestamps): + Output only. Timestamps about the Cloud + Storage file. + size_bytes (int): + Output only. The size of the file, in bytes. + """ + + file_path = proto.Field(proto.STRING, number=1) + + gcs_timestamps = proto.Field( + proto.MESSAGE, number=2, message=timestamps.SystemTimestamps, + ) + + size_bytes = proto.Field(proto.INT64, number=4) + + +__all__ = tuple(sorted(__protobuf__.manifest)) diff --git a/google/cloud/datacatalog_v1beta1/types/policytagmanager.py b/google/cloud/datacatalog_v1beta1/types/policytagmanager.py new file mode 100644 index 00000000..259be1b3 --- /dev/null +++ b/google/cloud/datacatalog_v1beta1/types/policytagmanager.py @@ -0,0 +1,368 @@ +# -*- coding: utf-8 -*- + +# Copyright 2020 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
+# + +import proto # type: ignore + + +from google.protobuf import field_mask_pb2 as field_mask # type: ignore + + +__protobuf__ = proto.module( + package="google.cloud.datacatalog.v1beta1", + manifest={ + "Taxonomy", + "PolicyTag", + "CreateTaxonomyRequest", + "DeleteTaxonomyRequest", + "UpdateTaxonomyRequest", + "ListTaxonomiesRequest", + "ListTaxonomiesResponse", + "GetTaxonomyRequest", + "CreatePolicyTagRequest", + "DeletePolicyTagRequest", + "UpdatePolicyTagRequest", + "ListPolicyTagsRequest", + "ListPolicyTagsResponse", + "GetPolicyTagRequest", + }, +) + + +class Taxonomy(proto.Message): + r"""A taxonomy is a collection of policy tags that classify data along a + common axis. For instance a data *sensitivity* taxonomy could + contain policy tags denoting PII such as age, zipcode, and SSN. A + data *origin* taxonomy could contain policy tags to distinguish user + data, employee data, partner data, public data. + + Attributes: + name (str): + Output only. Resource name of this taxonomy, whose format + is: + "projects/{project_number}/locations/{location_id}/taxonomies/{id}". + display_name (str): + Required. User defined name of this taxonomy. + It must: contain only unicode letters, numbers, + underscores, dashes and spaces; not start or end + with spaces; and be at most 200 bytes long when + encoded in UTF-8. + description (str): + Optional. Description of this taxonomy. It + must: contain only unicode characters, tabs, + newlines, carriage returns and page breaks; and + be at most 2000 bytes long when encoded in + UTF-8. If not set, defaults to an empty + description. + activated_policy_types (Sequence[~.policytagmanager.Taxonomy.PolicyType]): + Optional. A list of policy types that are + activated for this taxonomy. If not set, + defaults to an empty list. 
+ """ + + class PolicyType(proto.Enum): + r"""Defines policy types where policy tag can be used for.""" + POLICY_TYPE_UNSPECIFIED = 0 + FINE_GRAINED_ACCESS_CONTROL = 1 + + name = proto.Field(proto.STRING, number=1) + + display_name = proto.Field(proto.STRING, number=2) + + description = proto.Field(proto.STRING, number=3) + + activated_policy_types = proto.RepeatedField(proto.ENUM, number=6, enum=PolicyType,) + + +class PolicyTag(proto.Message): + r"""Denotes one policy tag in a taxonomy (e.g. ssn). Policy Tags + can be defined in a hierarchy. For example, consider the + following hierarchy: Geolocation -> (LatLong, City, ZipCode). + PolicyTag "Geolocation" contains three child policy tags: + "LatLong", "City", and "ZipCode". + + Attributes: + name (str): + Output only. Resource name of this policy tag, whose format + is: + "projects/{project_number}/locations/{location_id}/taxonomies/{taxonomy_id}/policyTags/{id}". + display_name (str): + Required. User defined name of this policy + tag. It must: be unique within the parent + taxonomy; contain only unicode letters, numbers, + underscores, dashes and spaces; not start or end + with spaces; and be at most 200 bytes long when + encoded in UTF-8. + description (str): + Description of this policy tag. It must: + contain only unicode characters, tabs, newlines, + carriage returns and page breaks; and be at most + 2000 bytes long when encoded in UTF-8. If not + set, defaults to an empty description. If not + set, defaults to an empty description. + parent_policy_tag (str): + Resource name of this policy tag's parent + policy tag (e.g. for the "LatLong" policy tag in + the example above, this field contains the + resource name of the "Geolocation" policy tag). + If empty, it means this policy tag is a top + level policy tag (e.g. this field is empty for + the "Geolocation" policy tag in the example + above). If not set, defaults to an empty string. + child_policy_tags (Sequence[str]): + Output only. 
Resource names of child policy + tags of this policy tag. + """ + + name = proto.Field(proto.STRING, number=1) + + display_name = proto.Field(proto.STRING, number=2) + + description = proto.Field(proto.STRING, number=3) + + parent_policy_tag = proto.Field(proto.STRING, number=4) + + child_policy_tags = proto.RepeatedField(proto.STRING, number=5) + + +class CreateTaxonomyRequest(proto.Message): + r"""Request message for + [CreateTaxonomy][google.cloud.datacatalog.v1beta1.PolicyTagManager.CreateTaxonomy]. + + Attributes: + parent (str): + Required. Resource name of the project that + the taxonomy will belong to. + taxonomy (~.policytagmanager.Taxonomy): + The taxonomy to be created. + """ + + parent = proto.Field(proto.STRING, number=1) + + taxonomy = proto.Field(proto.MESSAGE, number=2, message=Taxonomy,) + + +class DeleteTaxonomyRequest(proto.Message): + r"""Request message for + [DeleteTaxonomy][google.cloud.datacatalog.v1beta1.PolicyTagManager.DeleteTaxonomy]. + + Attributes: + name (str): + Required. Resource name of the taxonomy to be + deleted. All policy tags in this taxonomy will + also be deleted. + """ + + name = proto.Field(proto.STRING, number=1) + + +class UpdateTaxonomyRequest(proto.Message): + r"""Request message for + [UpdateTaxonomy][google.cloud.datacatalog.v1beta1.PolicyTagManager.UpdateTaxonomy]. + + Attributes: + taxonomy (~.policytagmanager.Taxonomy): + The taxonomy to update. Only description, display_name, and + activated policy types can be updated. + update_mask (~.field_mask.FieldMask): + The update mask applies to the resource. For the + ``FieldMask`` definition, see + https://developers.google.com/protocol-buffers/docs/reference/google.protobuf#fieldmask + If not set, defaults to all of the fields that are allowed + to update. 
+ """ + + taxonomy = proto.Field(proto.MESSAGE, number=1, message=Taxonomy,) + + update_mask = proto.Field(proto.MESSAGE, number=2, message=field_mask.FieldMask,) + + +class ListTaxonomiesRequest(proto.Message): + r"""Request message for + [ListTaxonomies][google.cloud.datacatalog.v1beta1.PolicyTagManager.ListTaxonomies]. + + Attributes: + parent (str): + Required. Resource name of the project to + list the taxonomies of. + page_size (int): + The maximum number of items to return. Must + be a value between 1 and 1000. If not set, + defaults to 50. + page_token (str): + The next_page_token value returned from a previous list + request, if any. If not set, defaults to an empty string. + """ + + parent = proto.Field(proto.STRING, number=1) + + page_size = proto.Field(proto.INT32, number=2) + + page_token = proto.Field(proto.STRING, number=3) + + +class ListTaxonomiesResponse(proto.Message): + r"""Response message for + [ListTaxonomies][google.cloud.datacatalog.v1beta1.PolicyTagManager.ListTaxonomies]. + + Attributes: + taxonomies (Sequence[~.policytagmanager.Taxonomy]): + Taxonomies that the project contains. + next_page_token (str): + Token used to retrieve the next page of + results, or empty if there are no more results + in the list. + """ + + @property + def raw_page(self): + return self + + taxonomies = proto.RepeatedField(proto.MESSAGE, number=1, message=Taxonomy,) + + next_page_token = proto.Field(proto.STRING, number=2) + + +class GetTaxonomyRequest(proto.Message): + r"""Request message for + [GetTaxonomy][google.cloud.datacatalog.v1beta1.PolicyTagManager.GetTaxonomy]. + + Attributes: + name (str): + Required. Resource name of the requested + taxonomy. + """ + + name = proto.Field(proto.STRING, number=1) + + +class CreatePolicyTagRequest(proto.Message): + r"""Request message for + [CreatePolicyTag][google.cloud.datacatalog.v1beta1.PolicyTagManager.CreatePolicyTag]. + + Attributes: + parent (str): + Required. 
Resource name of the taxonomy that + the policy tag will belong to. + policy_tag (~.policytagmanager.PolicyTag): + The policy tag to be created. + """ + + parent = proto.Field(proto.STRING, number=1) + + policy_tag = proto.Field(proto.MESSAGE, number=2, message=PolicyTag,) + + +class DeletePolicyTagRequest(proto.Message): + r"""Request message for + [DeletePolicyTag][google.cloud.datacatalog.v1beta1.PolicyTagManager.DeletePolicyTag]. + + Attributes: + name (str): + Required. Resource name of the policy tag to + be deleted. All of its descendant policy tags + will also be deleted. + """ + + name = proto.Field(proto.STRING, number=1) + + +class UpdatePolicyTagRequest(proto.Message): + r"""Request message for + [UpdatePolicyTag][google.cloud.datacatalog.v1beta1.PolicyTagManager.UpdatePolicyTag]. + + Attributes: + policy_tag (~.policytagmanager.PolicyTag): + The policy tag to update. Only the description, + display_name, and parent_policy_tag fields can be updated. + update_mask (~.field_mask.FieldMask): + The update mask applies to the resource. Only display_name, + description and parent_policy_tag can be updated and thus + can be listed in the mask. If update_mask is not provided, + all allowed fields (i.e. display_name, description and + parent) will be updated. For more information including the + ``FieldMask`` definition, see + https://developers.google.com/protocol-buffers/docs/reference/google.protobuf#fieldmask + If not set, defaults to all of the fields that are allowed + to update. + """ + + policy_tag = proto.Field(proto.MESSAGE, number=1, message=PolicyTag,) + + update_mask = proto.Field(proto.MESSAGE, number=2, message=field_mask.FieldMask,) + + +class ListPolicyTagsRequest(proto.Message): + r"""Request message for + [ListPolicyTags][google.cloud.datacatalog.v1beta1.PolicyTagManager.ListPolicyTags]. + + Attributes: + parent (str): + Required. Resource name of the taxonomy to + list the policy tags of. 
+ page_size (int): + The maximum number of items to return. Must + be a value between 1 and 1000. If not set, + defaults to 50. + page_token (str): + The next_page_token value returned from a previous List + request, if any. If not set, defaults to an empty string. + """ + + parent = proto.Field(proto.STRING, number=1) + + page_size = proto.Field(proto.INT32, number=2) + + page_token = proto.Field(proto.STRING, number=3) + + +class ListPolicyTagsResponse(proto.Message): + r"""Response message for + [ListPolicyTags][google.cloud.datacatalog.v1beta1.PolicyTagManager.ListPolicyTags]. + + Attributes: + policy_tags (Sequence[~.policytagmanager.PolicyTag]): + The policy tags that are in the requested + taxonomy. + next_page_token (str): + Token used to retrieve the next page of + results, or empty if there are no more results + in the list. + """ + + @property + def raw_page(self): + return self + + policy_tags = proto.RepeatedField(proto.MESSAGE, number=1, message=PolicyTag,) + + next_page_token = proto.Field(proto.STRING, number=2) + + +class GetPolicyTagRequest(proto.Message): + r"""Request message for + [GetPolicyTag][google.cloud.datacatalog.v1beta1.PolicyTagManager.GetPolicyTag]. + + Attributes: + name (str): + Required. Resource name of the requested + policy tag. + """ + + name = proto.Field(proto.STRING, number=1) + + +__all__ = tuple(sorted(__protobuf__.manifest)) diff --git a/google/cloud/datacatalog_v1beta1/types/policytagmanagerserialization.py b/google/cloud/datacatalog_v1beta1/types/policytagmanagerserialization.py new file mode 100644 index 00000000..dd14cd86 --- /dev/null +++ b/google/cloud/datacatalog_v1beta1/types/policytagmanagerserialization.py @@ -0,0 +1,174 @@ +# -*- coding: utf-8 -*- + +# Copyright 2020 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. 
+# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +# + +import proto # type: ignore + + +from google.cloud.datacatalog_v1beta1.types import policytagmanager + + +__protobuf__ = proto.module( + package="google.cloud.datacatalog.v1beta1", + manifest={ + "SerializedTaxonomy", + "SerializedPolicyTag", + "ImportTaxonomiesRequest", + "InlineSource", + "ImportTaxonomiesResponse", + "ExportTaxonomiesRequest", + "ExportTaxonomiesResponse", + }, +) + + +class SerializedTaxonomy(proto.Message): + r"""Message capturing a taxonomy and its policy tag hierarchy as + a nested proto. Used for taxonomy import/export and mutation. + + Attributes: + display_name (str): + Required. Display name of the taxonomy. Max + 200 bytes when encoded in UTF-8. + description (str): + Description of the serialized taxonomy. The + length of the description is limited to 2000 + bytes when encoded in UTF-8. If not set, + defaults to an empty description. + policy_tags (Sequence[~.policytagmanagerserialization.SerializedPolicyTag]): + Top level policy tags associated with the + taxonomy if any. + """ + + display_name = proto.Field(proto.STRING, number=1) + + description = proto.Field(proto.STRING, number=2) + + policy_tags = proto.RepeatedField( + proto.MESSAGE, number=3, message="SerializedPolicyTag", + ) + + +class SerializedPolicyTag(proto.Message): + r"""Message representing one policy tag when exported as a nested + proto. + + Attributes: + display_name (str): + Required. Display name of the policy tag. Max + 200 bytes when encoded in UTF-8. + description (str): + Description of the serialized policy tag. 
The + length of the description is limited to 2000 + bytes when encoded in UTF-8. If not set, + defaults to an empty description. + child_policy_tags (Sequence[~.policytagmanagerserialization.SerializedPolicyTag]): + Children of the policy tag if any. + """ + + display_name = proto.Field(proto.STRING, number=2) + + description = proto.Field(proto.STRING, number=3) + + child_policy_tags = proto.RepeatedField( + proto.MESSAGE, number=4, message="SerializedPolicyTag", + ) + + +class ImportTaxonomiesRequest(proto.Message): + r"""Request message for + [ImportTaxonomies][google.cloud.datacatalog.v1beta1.PolicyTagManagerSerialization.ImportTaxonomies]. + + Attributes: + parent (str): + Required. Resource name of project that the + newly created taxonomies will belong to. + inline_source (~.policytagmanagerserialization.InlineSource): + Inline source used for taxonomies import + """ + + parent = proto.Field(proto.STRING, number=1) + + inline_source = proto.Field( + proto.MESSAGE, number=2, oneof="source", message="InlineSource", + ) + + +class InlineSource(proto.Message): + r"""Inline source used for taxonomies import. + + Attributes: + taxonomies (Sequence[~.policytagmanagerserialization.SerializedTaxonomy]): + Required. Taxonomies to be imported. + """ + + taxonomies = proto.RepeatedField( + proto.MESSAGE, number=1, message=SerializedTaxonomy, + ) + + +class ImportTaxonomiesResponse(proto.Message): + r"""Response message for + [ImportTaxonomies][google.cloud.datacatalog.v1beta1.PolicyTagManagerSerialization.ImportTaxonomies]. + + Attributes: + taxonomies (Sequence[~.policytagmanager.Taxonomy]): + Taxonomies that were imported. + """ + + taxonomies = proto.RepeatedField( + proto.MESSAGE, number=1, message=policytagmanager.Taxonomy, + ) + + +class ExportTaxonomiesRequest(proto.Message): + r"""Request message for + [ExportTaxonomies][google.cloud.datacatalog.v1beta1.PolicyTagManagerSerialization.ExportTaxonomies]. + + Attributes: + parent (str): + Required. 
Resource name of the project that + taxonomies to be exported will share. + taxonomies (Sequence[str]): + Required. Resource names of the taxonomies to + be exported. + serialized_taxonomies (bool): + Export taxonomies as serialized taxonomies. + """ + + parent = proto.Field(proto.STRING, number=1) + + taxonomies = proto.RepeatedField(proto.STRING, number=2) + + serialized_taxonomies = proto.Field(proto.BOOL, number=3, oneof="destination") + + +class ExportTaxonomiesResponse(proto.Message): + r"""Response message for + [ExportTaxonomies][google.cloud.datacatalog.v1beta1.PolicyTagManagerSerialization.ExportTaxonomies]. + + Attributes: + taxonomies (Sequence[~.policytagmanagerserialization.SerializedTaxonomy]): + List of taxonomies and policy tags in a tree + structure. + """ + + taxonomies = proto.RepeatedField( + proto.MESSAGE, number=1, message=SerializedTaxonomy, + ) + + +__all__ = tuple(sorted(__protobuf__.manifest)) diff --git a/google/cloud/datacatalog_v1beta1/types/schema.py b/google/cloud/datacatalog_v1beta1/types/schema.py new file mode 100644 index 00000000..55014c32 --- /dev/null +++ b/google/cloud/datacatalog_v1beta1/types/schema.py @@ -0,0 +1,71 @@ +# -*- coding: utf-8 -*- + +# Copyright 2020 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
+# + +import proto # type: ignore + + +__protobuf__ = proto.module( + package="google.cloud.datacatalog.v1beta1", manifest={"Schema", "ColumnSchema",}, +) + + +class Schema(proto.Message): + r"""Represents a schema (e.g. BigQuery, GoogleSQL, Avro schema). + + Attributes: + columns (Sequence[~.schema.ColumnSchema]): + Required. Schema of columns. A maximum of + 10,000 columns and sub-columns can be specified. + """ + + columns = proto.RepeatedField(proto.MESSAGE, number=2, message="ColumnSchema",) + + +class ColumnSchema(proto.Message): + r"""Representation of a column within a schema. Columns could be + nested inside other columns. + + Attributes: + column (str): + Required. Name of the column. + type (str): + Required. Type of the column. + description (str): + Optional. Description of the column. Default + value is an empty string. + mode (str): + Optional. A column's mode indicates whether the values in + this column are required, nullable, etc. Only ``NULLABLE``, + ``REQUIRED`` and ``REPEATED`` are supported. Default mode is + ``NULLABLE``. + subcolumns (Sequence[~.schema.ColumnSchema]): + Optional. Schema of sub-columns. A column can + have zero or more sub-columns. + """ + + column = proto.Field(proto.STRING, number=6) + + type = proto.Field(proto.STRING, number=1) + + description = proto.Field(proto.STRING, number=2) + + mode = proto.Field(proto.STRING, number=3) + + subcolumns = proto.RepeatedField(proto.MESSAGE, number=7, message="ColumnSchema",) + + +__all__ = tuple(sorted(__protobuf__.manifest)) diff --git a/google/cloud/datacatalog_v1beta1/types/search.py b/google/cloud/datacatalog_v1beta1/types/search.py new file mode 100644 index 00000000..87f828d2 --- /dev/null +++ b/google/cloud/datacatalog_v1beta1/types/search.py @@ -0,0 +1,77 @@ +# -*- coding: utf-8 -*- + +# Copyright 2020 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. 
+# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +# + +import proto # type: ignore + + +__protobuf__ = proto.module( + package="google.cloud.datacatalog.v1beta1", + manifest={"SearchResultType", "SearchCatalogResult",}, +) + + +class SearchResultType(proto.Enum): + r"""The different types of resources that can be returned in + search. + """ + SEARCH_RESULT_TYPE_UNSPECIFIED = 0 + ENTRY = 1 + TAG_TEMPLATE = 2 + ENTRY_GROUP = 3 + + +class SearchCatalogResult(proto.Message): + r"""A result that appears in the response of a search request. + Each result captures details of one entry that matches the + search. + + Attributes: + search_result_type (~.search.SearchResultType): + Type of the search result. This field can be + used to determine which Get method to call to + fetch the full resource. + search_result_subtype (str): + Sub-type of the search result. This is a dot-delimited + description of the resource's full type, and is the same as + the value callers would provide in the "type" search facet. + Examples: ``entry.table``, ``entry.dataStream``, + ``tagTemplate``. + relative_resource_name (str): + The relative resource name of the resource in URL format. + Examples: + + - ``projects/{project_id}/locations/{location_id}/entryGroups/{entry_group_id}/entries/{entry_id}`` + - ``projects/{project_id}/tagTemplates/{tag_template_id}`` + linked_resource (str): + The full name of the cloud resource the entry belongs to. + See: + https://cloud.google.com/apis/design/resource_names#full_resource_name. 
+ Example: + + - ``//bigquery.googleapis.com/projects/projectId/datasets/datasetId/tables/tableId`` + """ + + search_result_type = proto.Field(proto.ENUM, number=1, enum="SearchResultType",) + + search_result_subtype = proto.Field(proto.STRING, number=2) + + relative_resource_name = proto.Field(proto.STRING, number=3) + + linked_resource = proto.Field(proto.STRING, number=4) + + +__all__ = tuple(sorted(__protobuf__.manifest)) diff --git a/google/cloud/datacatalog_v1beta1/types/table_spec.py b/google/cloud/datacatalog_v1beta1/types/table_spec.py new file mode 100644 index 00000000..254afd21 --- /dev/null +++ b/google/cloud/datacatalog_v1beta1/types/table_spec.py @@ -0,0 +1,119 @@ +# -*- coding: utf-8 -*- + +# Copyright 2020 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +# + +import proto # type: ignore + + +__protobuf__ = proto.module( + package="google.cloud.datacatalog.v1beta1", + manifest={ + "TableSourceType", + "BigQueryTableSpec", + "ViewSpec", + "TableSpec", + "BigQueryDateShardedSpec", + }, +) + + +class TableSourceType(proto.Enum): + r"""Table source type.""" + TABLE_SOURCE_TYPE_UNSPECIFIED = 0 + BIGQUERY_VIEW = 2 + BIGQUERY_TABLE = 5 + + +class BigQueryTableSpec(proto.Message): + r"""Describes a BigQuery table. + + Attributes: + table_source_type (~.gcd_table_spec.TableSourceType): + Output only. The table source type. + view_spec (~.gcd_table_spec.ViewSpec): + Table view specification. 
This field should only be + populated if ``table_source_type`` is ``BIGQUERY_VIEW``. + table_spec (~.gcd_table_spec.TableSpec): + Spec of a BigQuery table. This field should only be + populated if ``table_source_type`` is ``BIGQUERY_TABLE``. + """ + + table_source_type = proto.Field(proto.ENUM, number=1, enum="TableSourceType",) + + view_spec = proto.Field( + proto.MESSAGE, number=2, oneof="type_spec", message="ViewSpec", + ) + + table_spec = proto.Field( + proto.MESSAGE, number=3, oneof="type_spec", message="TableSpec", + ) + + +class ViewSpec(proto.Message): + r"""Table view specification. + + Attributes: + view_query (str): + Output only. The query that defines the table + view. + """ + + view_query = proto.Field(proto.STRING, number=1) + + +class TableSpec(proto.Message): + r"""Normal BigQuery table spec. + + Attributes: + grouped_entry (str): + Output only. If the table is a dated shard, i.e., with name + pattern ``[prefix]YYYYMMDD``, ``grouped_entry`` is the Data + Catalog resource name of the date sharded grouped entry, for + example, + ``projects/{project_id}/locations/{location}/entrygroups/{entry_group_id}/entries/{entry_id}``. + Otherwise, ``grouped_entry`` is empty. + """ + + grouped_entry = proto.Field(proto.STRING, number=1) + + +class BigQueryDateShardedSpec(proto.Message): + r"""Spec for a group of BigQuery tables with name pattern + ``[prefix]YYYYMMDD``. Context: + https://cloud.google.com/bigquery/docs/partitioned-tables#partitioning_versus_sharding + + Attributes: + dataset (str): + Output only. The Data Catalog resource name of the dataset + entry the current table belongs to, for example, + ``projects/{project_id}/locations/{location}/entrygroups/{entry_group_id}/entries/{entry_id}``. + table_prefix (str): + Output only. The table name prefix of the shards. The name + of any given shard is ``[table_prefix]YYYYMMDD``, for + example, for shard ``MyTable20180101``, the ``table_prefix`` + is ``MyTable``. + shard_count (int): + Output only. 
Total number of shards. + """ + + dataset = proto.Field(proto.STRING, number=1) + + table_prefix = proto.Field(proto.STRING, number=2) + + shard_count = proto.Field(proto.INT64, number=3) + + +__all__ = tuple(sorted(__protobuf__.manifest)) diff --git a/google/cloud/datacatalog_v1beta1/types/tags.py b/google/cloud/datacatalog_v1beta1/types/tags.py new file mode 100644 index 00000000..ddd5cf1f --- /dev/null +++ b/google/cloud/datacatalog_v1beta1/types/tags.py @@ -0,0 +1,291 @@ +# -*- coding: utf-8 -*- + +# Copyright 2020 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +# + +import proto # type: ignore + + +from google.protobuf import timestamp_pb2 as timestamp # type: ignore + + +__protobuf__ = proto.module( + package="google.cloud.datacatalog.v1beta1", + manifest={"Tag", "TagField", "TagTemplate", "TagTemplateField", "FieldType",}, +) + + +class Tag(proto.Message): + r"""Tags are used to attach custom metadata to Data Catalog resources. + Tags conform to the specifications within their tag template. + + See `Data Catalog + IAM `__ for + information on the permissions needed to create or view tags. + + Attributes: + name (str): + The resource name of the tag in URL format. Example: + + - projects/{project_id}/locations/{location}/entrygroups/{entry_group_id}/entries/{entry_id}/tags/{tag_id} + + where ``tag_id`` is a system-generated identifier. Note that + this Tag may not actually be stored in the location in this + name. + template (str): + Required. 
The resource name of the tag template that this + tag uses. Example: + + - projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id} + + This field cannot be modified after creation. + template_display_name (str): + Output only. The display name of the tag + template. + column (str): + Resources like Entry can have schemas associated with them. + This scope allows users to attach tags to an individual + column based on that schema. + + For attaching a tag to a nested column, use ``.`` to + separate the column names. Example: + + - ``outer_column.inner_column`` + fields (Sequence[~.tags.Tag.FieldsEntry]): + Required. This maps the ID of a tag field to + the value of and additional information about + that field. Valid field IDs are defined by the + tag's template. A tag must have at least 1 field + and at most 500 fields. + """ + + name = proto.Field(proto.STRING, number=1) + + template = proto.Field(proto.STRING, number=2) + + template_display_name = proto.Field(proto.STRING, number=5) + + column = proto.Field(proto.STRING, number=4, oneof="scope") + + fields = proto.MapField(proto.STRING, proto.MESSAGE, number=3, message="TagField",) + + +class TagField(proto.Message): + r"""Contains the value and supporting information for a field within a + [Tag][google.cloud.datacatalog.v1beta1.Tag]. + + Attributes: + display_name (str): + Output only. The display name of this field. + double_value (float): + Holds the value for a tag field with double + type. + string_value (str): + Holds the value for a tag field with string + type. + bool_value (bool): + Holds the value for a tag field with boolean + type. + timestamp_value (~.timestamp.Timestamp): + Holds the value for a tag field with + timestamp type. + enum_value (~.tags.TagField.EnumValue): + Holds the value for a tag field with enum + type. This value must be one of the allowed + values in the definition of this enum. + order (int): + Output only. 
The order of this field with respect to other + fields in this tag. It can be set in + [Tag][google.cloud.datacatalog.v1beta1.TagTemplateField.order]. + For example, a higher value can indicate a more important + field. The value can be negative. Multiple fields can have + the same order, and field orders within a tag do not have to + be sequential. + """ + + class EnumValue(proto.Message): + r"""Holds an enum value. + + Attributes: + display_name (str): + The display name of the enum value. + """ + + display_name = proto.Field(proto.STRING, number=1) + + display_name = proto.Field(proto.STRING, number=1) + + double_value = proto.Field(proto.DOUBLE, number=2, oneof="kind") + + string_value = proto.Field(proto.STRING, number=3, oneof="kind") + + bool_value = proto.Field(proto.BOOL, number=4, oneof="kind") + + timestamp_value = proto.Field( + proto.MESSAGE, number=5, oneof="kind", message=timestamp.Timestamp, + ) + + enum_value = proto.Field(proto.MESSAGE, number=6, oneof="kind", message=EnumValue,) + + order = proto.Field(proto.INT32, number=7) + + +class TagTemplate(proto.Message): + r"""A tag template defines a tag, which can have one or more typed + fields. The template is used to create and attach the tag to GCP + resources. `Tag template + roles `__ + provide permissions to create, edit, and use the template. See, for + example, the `TagTemplate + User `__ + role, which includes permission to use the tag template to tag + resources. + + Attributes: + name (str): + The resource name of the tag template in URL format. + Example: + + - projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id} + + Note that this TagTemplate and its child resources may not + actually be stored in the location in this name. + display_name (str): + The display name for this template. Defaults + to an empty string. + fields (Sequence[~.tags.TagTemplate.FieldsEntry]): + Required. Map of tag template field IDs to the settings for + the field. 
This map is an exhaustive list of the allowed + fields. This map must contain at least one field and at most + 500 fields. + + The keys to this map are tag template field IDs. Field IDs + can contain letters (both uppercase and lowercase), numbers + (0-9) and underscores (_). Field IDs must be at least 1 + character long and at most 64 characters long. Field IDs + must start with a letter or underscore. + """ + + name = proto.Field(proto.STRING, number=1) + + display_name = proto.Field(proto.STRING, number=2) + + fields = proto.MapField( + proto.STRING, proto.MESSAGE, number=3, message="TagTemplateField", + ) + + +class TagTemplateField(proto.Message): + r"""The template for an individual field within a tag template. + + Attributes: + name (str): + Output only. The resource name of the tag template field in + URL format. Example: + + - projects/{project_id}/locations/{location}/tagTemplates/{tag_template}/fields/{field} + + Note that this TagTemplateField may not actually be stored + in the location in this name. + display_name (str): + The display name for this field. Defaults to + an empty string. + type (~.tags.FieldType): + Required. The type of value this tag field + can contain. + is_required (bool): + Whether this is a required field. Defaults to + false. + order (int): + The order of this field with respect to other + fields in this tag template. A higher value + indicates a more important field. The value can + be negative. Multiple fields can have the same + order, and field orders within a tag do not have + to be sequential. 
+ """ + + name = proto.Field(proto.STRING, number=6) + + display_name = proto.Field(proto.STRING, number=1) + + type = proto.Field(proto.MESSAGE, number=2, message="FieldType",) + + is_required = proto.Field(proto.BOOL, number=3) + + order = proto.Field(proto.INT32, number=5) + + +class FieldType(proto.Message): + r""" + + Attributes: + primitive_type (~.tags.FieldType.PrimitiveType): + Represents primitive types - string, bool + etc. + enum_type (~.tags.FieldType.EnumType): + Represents an enum type. + """ + + class PrimitiveType(proto.Enum): + r"""""" + PRIMITIVE_TYPE_UNSPECIFIED = 0 + DOUBLE = 1 + STRING = 2 + BOOL = 3 + TIMESTAMP = 4 + + class EnumType(proto.Message): + r""" + + Attributes: + allowed_values (Sequence[~.tags.FieldType.EnumType.EnumValue]): + Required on create; optional on update. The + set of allowed values for this enum. This set + must not be empty, the display names of the + values in this set must not be empty and the + display names of the values must be case- + insensitively unique within this set. Currently, + enum values can only be added to the list of + allowed values. Deletion and renaming of enum + values are not supported. Can have up to 500 + allowed values. + """ + + class EnumValue(proto.Message): + r""" + + Attributes: + display_name (str): + Required. The display name of the enum value. + Must not be an empty string. 
+ """ + + display_name = proto.Field(proto.STRING, number=1) + + allowed_values = proto.RepeatedField( + proto.MESSAGE, number=1, message="FieldType.EnumType.EnumValue", + ) + + primitive_type = proto.Field( + proto.ENUM, number=1, oneof="type_decl", enum=PrimitiveType, + ) + + enum_type = proto.Field( + proto.MESSAGE, number=2, oneof="type_decl", message=EnumType, + ) + + +__all__ = tuple(sorted(__protobuf__.manifest)) diff --git a/google/cloud/datacatalog_v1beta1/types/timestamps.py b/google/cloud/datacatalog_v1beta1/types/timestamps.py new file mode 100644 index 00000000..82ef8a06 --- /dev/null +++ b/google/cloud/datacatalog_v1beta1/types/timestamps.py @@ -0,0 +1,53 @@ +# -*- coding: utf-8 -*- + +# Copyright 2020 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +# + +import proto # type: ignore + + +from google.protobuf import timestamp_pb2 as timestamp # type: ignore + + +__protobuf__ = proto.module( + package="google.cloud.datacatalog.v1beta1", manifest={"SystemTimestamps",}, +) + + +class SystemTimestamps(proto.Message): + r"""Timestamps about this resource according to a particular + system. + + Attributes: + create_time (~.timestamp.Timestamp): + The creation time of the resource within the + given system. + update_time (~.timestamp.Timestamp): + The last-modified time of the resource within + the given system. + expire_time (~.timestamp.Timestamp): + Output only. The expiration time of the + resource within the given system. 
Currently only + apllicable to BigQuery resources. + """ + + create_time = proto.Field(proto.MESSAGE, number=1, message=timestamp.Timestamp,) + + update_time = proto.Field(proto.MESSAGE, number=2, message=timestamp.Timestamp,) + + expire_time = proto.Field(proto.MESSAGE, number=3, message=timestamp.Timestamp,) + + +__all__ = tuple(sorted(__protobuf__.manifest)) diff --git a/mypy.ini b/mypy.ini new file mode 100644 index 00000000..4505b485 --- /dev/null +++ b/mypy.ini @@ -0,0 +1,3 @@ +[mypy] +python_version = 3.6 +namespace_packages = True diff --git a/noxfile.py b/noxfile.py index ef349873..f2524069 100644 --- a/noxfile.py +++ b/noxfile.py @@ -27,8 +27,8 @@ BLACK_PATHS = ["docs", "google", "tests", "noxfile.py", "setup.py"] DEFAULT_PYTHON_VERSION = "3.8" -SYSTEM_TEST_PYTHON_VERSIONS = ["2.7", "3.8"] -UNIT_TEST_PYTHON_VERSIONS = ["2.7", "3.5", "3.6", "3.7", "3.8"] +SYSTEM_TEST_PYTHON_VERSIONS = ["3.8"] +UNIT_TEST_PYTHON_VERSIONS = ["3.6", "3.7", "3.8"] @nox.session(python=DEFAULT_PYTHON_VERSION) @@ -70,6 +70,8 @@ def lint_setup_py(session): def default(session): # Install all test dependencies, then install this package in-place. + session.install("asyncmock", "pytest-asyncio") + session.install("mock", "pytest", "pytest-cov") session.install("-e", ".") @@ -139,7 +141,7 @@ def cover(session): test runs (not system test runs), and then erases coverage data. 
""" session.install("coverage", "pytest-cov") - session.run("coverage", "report", "--show-missing", "--fail-under=79") + session.run("coverage", "report", "--show-missing", "--fail-under=99") session.run("coverage", "erase") @@ -154,7 +156,7 @@ def docs(session): shutil.rmtree(os.path.join("docs", "_build"), ignore_errors=True) session.run( "sphinx-build", - # "-W", # warnings as errors + # # "-W", # warnings as errors "-T", # show full traceback on exception "-N", # no colors "-b", diff --git a/samples/quickstart/create_fileset_entry_quickstart.py b/samples/quickstart/create_fileset_entry_quickstart.py index 55b0af59..5e1c99f0 100644 --- a/samples/quickstart/create_fileset_entry_quickstart.py +++ b/samples/quickstart/create_fileset_entry_quickstart.py @@ -40,7 +40,7 @@ def create_fileset_entry_quickstart(client, project_id, entry_group_id, entry_id # Create an Entry Group. # Construct a full Entry Group object to send to the API. - entry_group_obj = datacatalog_v1beta1.types.EntryGroup() + entry_group_obj = datacatalog_v1beta1.EntryGroup() entry_group_obj.display_name = "My Fileset Entry Group" entry_group_obj.description = "This Entry Group consists of ...." @@ -48,26 +48,23 @@ def create_fileset_entry_quickstart(client, project_id, entry_group_id, entry_id # Raises google.api_core.exceptions.AlreadyExists if the Entry Group # already exists within the project. entry_group = client.create_entry_group( - parent=datacatalog_v1beta1.DataCatalogClient.location_path( + request = {'parent': datacatalog_v1beta1.DataCatalogClient.location_path( project_id, location_id - ), - entry_group_id=entry_group_id, - entry_group=entry_group_obj, - ) + ), 'entry_group_id': entry_group_id, 'entry_group': entry_group_obj}) print("Created entry group {}".format(entry_group.name)) # Create a Fileset Entry. # Construct a full Entry object to send to the API. 
- entry = datacatalog_v1beta1.types.Entry() + entry = datacatalog_v1beta1.Entry() entry.display_name = "My Fileset" entry.description = "This Fileset consists of ..." entry.gcs_fileset_spec.file_patterns.append("gs://cloud-samples-data/*") - entry.type = datacatalog_v1beta1.enums.EntryType.FILESET + entry.type = datacatalog_v1beta1.EntryType.FILESET # Create the Schema, for example when you have a csv file. columns = [] columns.append( - datacatalog_v1beta1.types.ColumnSchema( + datacatalog_v1beta1.ColumnSchema( column="first_name", description="First name", mode="REQUIRED", @@ -76,7 +73,7 @@ def create_fileset_entry_quickstart(client, project_id, entry_group_id, entry_id ) columns.append( - datacatalog_v1beta1.types.ColumnSchema( + datacatalog_v1beta1.ColumnSchema( column="last_name", description="Last name", mode="REQUIRED", type="STRING" ) ) @@ -84,19 +81,19 @@ def create_fileset_entry_quickstart(client, project_id, entry_group_id, entry_id # Create sub columns for the addresses parent column subcolumns = [] subcolumns.append( - datacatalog_v1beta1.types.ColumnSchema( + datacatalog_v1beta1.ColumnSchema( column="city", description="City", mode="NULLABLE", type="STRING" ) ) subcolumns.append( - datacatalog_v1beta1.types.ColumnSchema( + datacatalog_v1beta1.ColumnSchema( column="state", description="State", mode="NULLABLE", type="STRING" ) ) columns.append( - datacatalog_v1beta1.types.ColumnSchema( + datacatalog_v1beta1.ColumnSchema( column="addresses", description="Addresses", mode="REPEATED", @@ -110,6 +107,6 @@ def create_fileset_entry_quickstart(client, project_id, entry_group_id, entry_id # Send the entry to the API for creation. # Raises google.api_core.exceptions.AlreadyExists if the Entry already # exists within the project. 
- entry = client.create_entry(entry_group.name, entry_id, entry) + entry = client.create_entry(request = {'parent': entry_group.name, 'entry_id': entry_id, 'entry': entry}) print("Created entry {}".format(entry.name)) # [END datacatalog_create_fileset_quickstart_tag] diff --git a/samples/snippets/README.rst b/samples/snippets/README.rst index 3476ccea..343431d9 100644 --- a/samples/snippets/README.rst +++ b/samples/snippets/README.rst @@ -1,4 +1,3 @@ - .. This file is automatically generated. Do not edit this file directly. Google Cloud Data Catalog Python Samples @@ -16,11 +15,13 @@ This directory contains samples for Google Cloud Data Catalog. `Google Cloud Dat .. _Google Cloud Data Catalog: https://cloud.google.com/data-catalog/docs + + + Setup ------------------------------------------------------------------------------- - Authentication ++++++++++++++ @@ -31,9 +32,6 @@ credentials for applications. .. _Authentication Getting Started Guide: https://cloud.google.com/docs/authentication/getting-started - - - Install Dependencies ++++++++++++++++++++ @@ -48,7 +46,7 @@ Install Dependencies .. _Python Development Environment Setup Guide: https://cloud.google.com/python/setup -#. Create a virtualenv. Samples are compatible with Python 3.6+. +#. Create a virtualenv. Samples are compatible with Python 2.7 and 3.4+. .. code-block:: bash @@ -64,15 +62,9 @@ Install Dependencies .. _pip: https://pip.pypa.io/ .. _virtualenv: https://virtualenv.pypa.io/ - - - - - Samples ------------------------------------------------------------------------------- - Lookup entry +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ @@ -88,7 +80,6 @@ To run this sample: $ python lookup_entry.py - usage: lookup_entry.py [-h] project_id {bigquery-dataset,bigquery-table,pubsub-topic} ... 
@@ -116,10 +107,6 @@ To run this sample: - - - - The client library ------------------------------------------------------------------------------- @@ -135,5 +122,4 @@ to `browse the source`_ and `report issues`_. https://github.com/GoogleCloudPlatform/google-cloud-python/issues - -.. _Google Cloud SDK: https://cloud.google.com/sdk/ +.. _Google Cloud SDK: https://cloud.google.com/sdk/ \ No newline at end of file diff --git a/samples/snippets/lookup_entry.py b/samples/snippets/lookup_entry.py index 4b6e8c58..656cb97e 100644 --- a/samples/snippets/lookup_entry.py +++ b/samples/snippets/lookup_entry.py @@ -34,7 +34,7 @@ def lookup_bigquery_dataset(project_id, dataset_id): resource_name = '//bigquery.googleapis.com/projects/{}/datasets/{}'\ .format(project_id, dataset_id) - return datacatalog.lookup_entry(linked_resource=resource_name) + return datacatalog.lookup_entry(request={'linked_resource': resource_name}) # [END datacatalog_lookup_dataset] @@ -48,7 +48,7 @@ def lookup_bigquery_dataset_sql_resource(project_id, dataset_id): sql_resource = 'bigquery.dataset.`{}`.`{}`'.format(project_id, dataset_id) - return datacatalog.lookup_entry(sql_resource=sql_resource) + return datacatalog.lookup_entry(request={'sql_resource': sql_resource}) def lookup_bigquery_table(project_id, dataset_id, table_id): @@ -61,7 +61,7 @@ def lookup_bigquery_table(project_id, dataset_id, table_id): '/tables/{}'\ .format(project_id, dataset_id, table_id) - return datacatalog.lookup_entry(linked_resource=resource_name) + return datacatalog.lookup_entry(request={'linked_resource': resource_name}) def lookup_bigquery_table_sql_resource(project_id, dataset_id, table_id): @@ -75,7 +75,7 @@ def lookup_bigquery_table_sql_resource(project_id, dataset_id, table_id): sql_resource = 'bigquery.table.`{}`.`{}`.`{}`'.format( project_id, dataset_id, table_id) - return datacatalog.lookup_entry(sql_resource=sql_resource) + return datacatalog.lookup_entry(request={'sql_resource': sql_resource}) def 
lookup_pubsub_topic(project_id, topic_id): @@ -87,7 +87,7 @@ def lookup_pubsub_topic(project_id, topic_id): resource_name = '//pubsub.googleapis.com/projects/{}/topics/{}'\ .format(project_id, topic_id) - return datacatalog.lookup_entry(linked_resource=resource_name) + return datacatalog.lookup_entry(request={'linked_resource': resource_name}) def lookup_pubsub_topic_sql_resource(project_id, topic_id): @@ -100,7 +100,7 @@ def lookup_pubsub_topic_sql_resource(project_id, topic_id): sql_resource = 'pubsub.topic.`{}`.`{}`'.format(project_id, topic_id) - return datacatalog.lookup_entry(sql_resource=sql_resource) + return datacatalog.lookup_entry(request={'sql_resource': sql_resource}) if __name__ == '__main__': diff --git a/samples/tests/conftest.py b/samples/tests/conftest.py index 75e6753f..6ee1fcb6 100644 --- a/samples/tests/conftest.py +++ b/samples/tests/conftest.py @@ -52,7 +52,7 @@ def random_entry_id(client, project_id, random_entry_group_id): entry_name = datacatalog_v1beta1.DataCatalogClient.entry_path( project_id, "us-central1", random_entry_group_id, random_entry_id ) - client.delete_entry(entry_name) + client.delete_entry(request = {'name': entry_name}) @pytest.fixture @@ -65,7 +65,7 @@ def random_entry_group_id(client, project_id): entry_group_name = datacatalog_v1beta1.DataCatalogClient.entry_group_path( project_id, "us-central1", random_entry_group_id ) - client.delete_entry_group(entry_group_name) + client.delete_entry_group(request = {'name': entry_group_name}) @pytest.fixture @@ -76,7 +76,7 @@ def random_entry_name(client, entry_group_name): ) random_entry_name = "{}/entries/{}".format(entry_group_name, random_entry_id) yield random_entry_name - client.delete_entry(random_entry_name) + client.delete_entry(request = {'name': random_entry_name}) @pytest.fixture @@ -86,9 +86,6 @@ def entry_group_name(client, project_id): now.strftime("%Y%m%d%H%M%S"), uuid.uuid4().hex[:8] ) entry_group = client.create_entry_group( - 
datacatalog_v1beta1.DataCatalogClient.location_path(project_id, "us-central1"), - entry_group_id, - {}, - ) + request = {'parent': datacatalog_v1beta1.DataCatalogClient.location_path(project_id, "us-central1"), 'entry_group_id': entry_group_id, 'entry_group': {}}) yield entry_group.name - client.delete_entry_group(entry_group.name) + client.delete_entry_group(request = {'name': entry_group.name}) diff --git a/samples/tests/test_create_entry_group.py b/samples/tests/test_create_entry_group.py index 9c8c33b8..443c97f9 100644 --- a/samples/tests/test_create_entry_group.py +++ b/samples/tests/test_create_entry_group.py @@ -18,7 +18,7 @@ def test_create_entry_group(capsys, client, project_id, random_entry_group_id): - create_entry_group.create_entry_group(client, project_id, random_entry_group_id) + create_entry_group.create_entry_group(request = {'parent': client, 'entry_group_id': project_id, 'entry_group': random_entry_group_id}) out, err = capsys.readouterr() assert ( "Created entry group" diff --git a/samples/v1beta1/create_entry_group.py b/samples/v1beta1/create_entry_group.py index 24a856d8..d2056ec6 100644 --- a/samples/v1beta1/create_entry_group.py +++ b/samples/v1beta1/create_entry_group.py @@ -40,7 +40,7 @@ def create_entry_group(client, project_id, entry_group_id): ) # Construct a full EntryGroup object to send to the API. - entry_group = datacatalog_v1beta1.types.EntryGroup() + entry_group = datacatalog_v1beta1.EntryGroup() entry_group.display_name = "My Entry Group" entry_group.description = "This Entry Group consists of ..." @@ -48,7 +48,6 @@ def create_entry_group(client, project_id, entry_group_id): # Raises google.api_core.exceptions.AlreadyExists if the Entry Group # already exists within the project. entry_group = client.create_entry_group( - parent, entry_group_id, entry_group - ) # Make an API request. + request = {'parent': parent, 'entry_group_id': entry_group_id, 'entry_group': entry_group}) # Make an API request. 
print("Created entry group {}".format(entry_group.name)) # [END datacatalog_create_entry_group_tag] diff --git a/samples/v1beta1/create_fileset_entry.py b/samples/v1beta1/create_fileset_entry.py index 6cc27565..f96255b2 100644 --- a/samples/v1beta1/create_fileset_entry.py +++ b/samples/v1beta1/create_fileset_entry.py @@ -81,6 +81,6 @@ def create_fileset_entry(client, entry_group_name, entry_id): # Send the entry to the API for creation. # Raises google.api_core.exceptions.AlreadyExists if the Entry already # exists within the project. - entry = client.create_entry(entry_group_name, entry_id, entry) + entry = client.create_entry(request = {'parent': entry_group_name, 'entry_id': entry_id, 'entry': entry}) print("Created entry {}".format(entry.name)) # [END datacatalog_create_fileset_tag] diff --git a/samples/v1beta1/datacatalog_get_entry.py b/samples/v1beta1/datacatalog_get_entry.py index fcd8b209..05bc0dd5 100644 --- a/samples/v1beta1/datacatalog_get_entry.py +++ b/samples/v1beta1/datacatalog_get_entry.py @@ -26,8 +26,6 @@ # [START datacatalog_get_entry] from google.cloud import datacatalog_v1beta1 -from google.cloud.datacatalog_v1beta1 import enums - def sample_get_entry(project_id, location_id, entry_group_id, entry_id): """ @@ -48,10 +46,10 @@ def sample_get_entry(project_id, location_id, entry_group_id, entry_id): # entry_id = '[Entry ID]' name = client.entry_path(project_id, location_id, entry_group_id, entry_id) - response = client.get_entry(name) + response = client.get_entry(request = {'name': name}) entry = response print(u"Entry name: {}".format(entry.name)) - print(u"Entry type: {}".format(enums.EntryType(entry.type).name)) + print(u"Entry type: {}".format(datacatalog_v1beta1.EntryType(entry.type).name)) print(u"Linked resource: {}".format(entry.linked_resource)) diff --git a/samples/v1beta1/datacatalog_lookup_entry.py b/samples/v1beta1/datacatalog_lookup_entry.py index 7920df16..176d080d 100644 --- a/samples/v1beta1/datacatalog_lookup_entry.py +++ 
b/samples/v1beta1/datacatalog_lookup_entry.py @@ -26,7 +26,6 @@ # [START datacatalog_lookup_entry] from google.cloud import datacatalog_v1beta1 -from google.cloud.datacatalog_v1beta1 import enums def sample_lookup_entry(resource_name): @@ -45,10 +44,10 @@ def sample_lookup_entry(resource_name): client = datacatalog_v1beta1.DataCatalogClient() # resource_name = '[Full Resource Name]' - response = client.lookup_entry(linked_resource=resource_name) + response = client.lookup_entry(request = {'linked_resource': resource_name}) entry = response print(u"Entry name: {}".format(entry.name)) - print(u"Entry type: {}".format(enums.EntryType(entry.type).name)) + print(u"Entry type: {}".format(datacatalog_v1beta1.EntryType(entry.type).name)) print(u"Linked resource: {}".format(entry.linked_resource)) diff --git a/samples/v1beta1/datacatalog_lookup_entry_sql_resource.py b/samples/v1beta1/datacatalog_lookup_entry_sql_resource.py index 9656759e..f46af369 100644 --- a/samples/v1beta1/datacatalog_lookup_entry_sql_resource.py +++ b/samples/v1beta1/datacatalog_lookup_entry_sql_resource.py @@ -26,7 +26,6 @@ # [START datacatalog_lookup_entry_sql_resource] from google.cloud import datacatalog_v1beta1 -from google.cloud.datacatalog_v1beta1 import enums def sample_lookup_entry(sql_name): @@ -44,10 +43,10 @@ def sample_lookup_entry(sql_name): client = datacatalog_v1beta1.DataCatalogClient() # sql_name = '[SQL Resource Name]' - response = client.lookup_entry(sql_resource=sql_name) + response = client.lookup_entry(request = {'sql_resource': sql_name}) entry = response print(u"Entry name: {}".format(entry.name)) - print(u"Entry type: {}".format(enums.EntryType(entry.type).name)) + print(u"Entry type: {}".format(datacatalog_v1beta1.EntryType(entry.type).name)) print(u"Linked resource: {}".format(entry.linked_resource)) diff --git a/samples/v1beta1/datacatalog_search.py b/samples/v1beta1/datacatalog_search.py index c4c1798c..ad102766 100644 --- a/samples/v1beta1/datacatalog_search.py +++ 
b/samples/v1beta1/datacatalog_search.py @@ -54,7 +54,7 @@ def sample_search_catalog(include_project_id, include_gcp_public_datasets, query } # Iterate over all results - for response_item in client.search_catalog(scope, query): + for response_item in client.search_catalog(request = {'scope': scope, 'query': query}): print( u"Result type: {}".format( enums.SearchResultType(response_item.search_result_type).name diff --git a/scripts/fixup_datacatalog_v1_keywords.py b/scripts/fixup_datacatalog_v1_keywords.py new file mode 100644 index 00000000..9ad22462 --- /dev/null +++ b/scripts/fixup_datacatalog_v1_keywords.py @@ -0,0 +1,204 @@ +# -*- coding: utf-8 -*- + +# Copyright 2020 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
+# + +import argparse +import os +import libcst as cst +import pathlib +import sys +from typing import (Any, Callable, Dict, List, Sequence, Tuple) + + +def partition( + predicate: Callable[[Any], bool], + iterator: Sequence[Any] +) -> Tuple[List[Any], List[Any]]: + """A stable, out-of-place partition.""" + results = ([], []) + + for i in iterator: + results[int(predicate(i))].append(i) + + # Returns trueList, falseList + return results[1], results[0] + + +class datacatalogCallTransformer(cst.CSTTransformer): + CTRL_PARAMS: Tuple[str] = ('retry', 'timeout', 'metadata') + METHOD_TO_PARAMS: Dict[str, Tuple[str]] = { + 'create_entry': ('parent', 'entry_id', 'entry', ), + 'create_entry_group': ('parent', 'entry_group_id', 'entry_group', ), + 'create_tag': ('parent', 'tag', ), + 'create_tag_template': ('parent', 'tag_template_id', 'tag_template', ), + 'create_tag_template_field': ('parent', 'tag_template_field_id', 'tag_template_field', ), + 'delete_entry': ('name', ), + 'delete_entry_group': ('name', 'force', ), + 'delete_tag': ('name', ), + 'delete_tag_template': ('name', 'force', ), + 'delete_tag_template_field': ('name', 'force', ), + 'get_entry': ('name', ), + 'get_entry_group': ('name', 'read_mask', ), + 'get_iam_policy': ('resource', 'options', ), + 'get_tag_template': ('name', ), + 'list_entries': ('parent', 'page_size', 'page_token', 'read_mask', ), + 'list_entry_groups': ('parent', 'page_size', 'page_token', ), + 'list_tags': ('parent', 'page_size', 'page_token', ), + 'lookup_entry': ('linked_resource', 'sql_resource', ), + 'rename_tag_template_field': ('name', 'new_tag_template_field_id', ), + 'search_catalog': ('scope', 'query', 'page_size', 'page_token', 'order_by', ), + 'set_iam_policy': ('resource', 'policy', ), + 'test_iam_permissions': ('resource', 'permissions', ), + 'update_entry': ('entry', 'update_mask', ), + 'update_entry_group': ('entry_group', 'update_mask', ), + 'update_tag': ('tag', 'update_mask', ), + 'update_tag_template': ('tag_template', 
'update_mask', ), + 'update_tag_template_field': ('name', 'tag_template_field', 'update_mask', ), + + } + + def leave_Call(self, original: cst.Call, updated: cst.Call) -> cst.CSTNode: + try: + key = original.func.attr.value + kword_params = self.METHOD_TO_PARAMS[key] + except (AttributeError, KeyError): + # Either not a method from the API or too convoluted to be sure. + return updated + + # If the existing code is valid, keyword args come after positional args. + # Therefore, all positional args must map to the first parameters. + args, kwargs = partition(lambda a: not bool(a.keyword), updated.args) + if any(k.keyword.value == "request" for k in kwargs): + # We've already fixed this file, don't fix it again. + return updated + + kwargs, ctrl_kwargs = partition( + lambda a: not a.keyword.value in self.CTRL_PARAMS, + kwargs + ) + + args, ctrl_args = args[:len(kword_params)], args[len(kword_params):] + ctrl_kwargs.extend(cst.Arg(value=a.value, keyword=cst.Name(value=ctrl)) + for a, ctrl in zip(ctrl_args, self.CTRL_PARAMS)) + + request_arg = cst.Arg( + value=cst.Dict([ + cst.DictElement( + cst.SimpleString("'{}'".format(name)), + cst.Element(value=arg.value) + ) + # Note: the args + kwargs looks silly, but keep in mind that + # the control parameters had to be stripped out, and that + # those could have been passed positionally or by keyword. + for name, arg in zip(kword_params, args + kwargs)]), + keyword=cst.Name("request") + ) + + return updated.with_changes( + args=[request_arg] + ctrl_kwargs + ) + + +def fix_files( + in_dir: pathlib.Path, + out_dir: pathlib.Path, + *, + transformer=datacatalogCallTransformer(), +): + """Duplicate the input dir to the output dir, fixing file method calls. 
+ + Preconditions: + * in_dir is a real directory + * out_dir is a real, empty directory + """ + pyfile_gen = ( + pathlib.Path(os.path.join(root, f)) + for root, _, files in os.walk(in_dir) + for f in files if os.path.splitext(f)[1] == ".py" + ) + + for fpath in pyfile_gen: + with open(fpath, 'r') as f: + src = f.read() + + # Parse the code and insert method call fixes. + tree = cst.parse_module(src) + updated = tree.visit(transformer) + + # Create the path and directory structure for the new file. + updated_path = out_dir.joinpath(fpath.relative_to(in_dir)) + updated_path.parent.mkdir(parents=True, exist_ok=True) + + # Generate the updated source file at the corresponding path. + with open(updated_path, 'w') as f: + f.write(updated.code) + + +if __name__ == '__main__': + parser = argparse.ArgumentParser( + description="""Fix up source that uses the datacatalog client library. + +The existing sources are NOT overwritten but are copied to output_dir with changes made. + +Note: This tool operates at a best-effort level at converting positional + parameters in client method calls to keyword based parameters. + Cases where it WILL FAIL include + A) * or ** expansion in a method call. + B) Calls via function or method alias (includes free function calls) + C) Indirect or dispatched calls (e.g. the method is looked up dynamically) + + These all constitute false negatives. The tool will also detect false + positives when an API method shares a name with another method. 
+""") + parser.add_argument( + '-d', + '--input-directory', + required=True, + dest='input_dir', + help='the input directory to walk for python files to fix up', + ) + parser.add_argument( + '-o', + '--output-directory', + required=True, + dest='output_dir', + help='the directory to output files fixed via un-flattening', + ) + args = parser.parse_args() + input_dir = pathlib.Path(args.input_dir) + output_dir = pathlib.Path(args.output_dir) + if not input_dir.is_dir(): + print( + f"input directory '{input_dir}' does not exist or is not a directory", + file=sys.stderr, + ) + sys.exit(-1) + + if not output_dir.is_dir(): + print( + f"output directory '{output_dir}' does not exist or is not a directory", + file=sys.stderr, + ) + sys.exit(-1) + + if os.listdir(output_dir): + print( + f"output directory '{output_dir}' is not empty", + file=sys.stderr, + ) + sys.exit(-1) + + fix_files(input_dir, output_dir) diff --git a/scripts/fixup_datacatalog_v1beta1_keywords.py b/scripts/fixup_datacatalog_v1beta1_keywords.py new file mode 100644 index 00000000..e48632cc --- /dev/null +++ b/scripts/fixup_datacatalog_v1beta1_keywords.py @@ -0,0 +1,216 @@ +# -*- coding: utf-8 -*- + +# Copyright 2020 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
+# + +import argparse +import os +import libcst as cst +import pathlib +import sys +from typing import (Any, Callable, Dict, List, Sequence, Tuple) + + +def partition( + predicate: Callable[[Any], bool], + iterator: Sequence[Any] +) -> Tuple[List[Any], List[Any]]: + """A stable, out-of-place partition.""" + results = ([], []) + + for i in iterator: + results[int(predicate(i))].append(i) + + # Returns trueList, falseList + return results[1], results[0] + + +class datacatalogCallTransformer(cst.CSTTransformer): + CTRL_PARAMS: Tuple[str] = ('retry', 'timeout', 'metadata') + METHOD_TO_PARAMS: Dict[str, Tuple[str]] = { + 'create_entry': ('parent', 'entry_id', 'entry', ), + 'create_entry_group': ('parent', 'entry_group_id', 'entry_group', ), + 'create_policy_tag': ('parent', 'policy_tag', ), + 'create_tag': ('parent', 'tag', ), + 'create_tag_template': ('parent', 'tag_template_id', 'tag_template', ), + 'create_tag_template_field': ('parent', 'tag_template_field_id', 'tag_template_field', ), + 'create_taxonomy': ('parent', 'taxonomy', ), + 'delete_entry': ('name', ), + 'delete_entry_group': ('name', 'force', ), + 'delete_policy_tag': ('name', ), + 'delete_tag': ('name', ), + 'delete_tag_template': ('name', 'force', ), + 'delete_tag_template_field': ('name', 'force', ), + 'delete_taxonomy': ('name', ), + 'export_taxonomies': ('parent', 'taxonomies', 'serialized_taxonomies', ), + 'get_entry': ('name', ), + 'get_entry_group': ('name', 'read_mask', ), + 'get_iam_policy': ('resource', 'options', ), + 'get_policy_tag': ('name', ), + 'get_tag_template': ('name', ), + 'get_taxonomy': ('name', ), + 'import_taxonomies': ('parent', 'inline_source', ), + 'list_entries': ('parent', 'page_size', 'page_token', 'read_mask', ), + 'list_entry_groups': ('parent', 'page_size', 'page_token', ), + 'list_policy_tags': ('parent', 'page_size', 'page_token', ), + 'list_tags': ('parent', 'page_size', 'page_token', ), + 'list_taxonomies': ('parent', 'page_size', 'page_token', ), + 'lookup_entry': 
('linked_resource', 'sql_resource', ), + 'rename_tag_template_field': ('name', 'new_tag_template_field_id', ), + 'search_catalog': ('scope', 'query', 'page_size', 'page_token', 'order_by', ), + 'set_iam_policy': ('resource', 'policy', ), + 'test_iam_permissions': ('resource', 'permissions', ), + 'update_entry': ('entry', 'update_mask', ), + 'update_entry_group': ('entry_group', 'update_mask', ), + 'update_policy_tag': ('policy_tag', 'update_mask', ), + 'update_tag': ('tag', 'update_mask', ), + 'update_tag_template': ('tag_template', 'update_mask', ), + 'update_tag_template_field': ('name', 'tag_template_field', 'update_mask', ), + 'update_taxonomy': ('taxonomy', 'update_mask', ), + + } + + def leave_Call(self, original: cst.Call, updated: cst.Call) -> cst.CSTNode: + try: + key = original.func.attr.value + kword_params = self.METHOD_TO_PARAMS[key] + except (AttributeError, KeyError): + # Either not a method from the API or too convoluted to be sure. + return updated + + # If the existing code is valid, keyword args come after positional args. + # Therefore, all positional args must map to the first parameters. + args, kwargs = partition(lambda a: not bool(a.keyword), updated.args) + if any(k.keyword.value == "request" for k in kwargs): + # We've already fixed this file, don't fix it again. + return updated + + kwargs, ctrl_kwargs = partition( + lambda a: not a.keyword.value in self.CTRL_PARAMS, + kwargs + ) + + args, ctrl_args = args[:len(kword_params)], args[len(kword_params):] + ctrl_kwargs.extend(cst.Arg(value=a.value, keyword=cst.Name(value=ctrl)) + for a, ctrl in zip(ctrl_args, self.CTRL_PARAMS)) + + request_arg = cst.Arg( + value=cst.Dict([ + cst.DictElement( + cst.SimpleString("'{}'".format(name)), + cst.Element(value=arg.value) + ) + # Note: the args + kwargs looks silly, but keep in mind that + # the control parameters had to be stripped out, and that + # those could have been passed positionally or by keyword. 
+ for name, arg in zip(kword_params, args + kwargs)]), + keyword=cst.Name("request") + ) + + return updated.with_changes( + args=[request_arg] + ctrl_kwargs + ) + + +def fix_files( + in_dir: pathlib.Path, + out_dir: pathlib.Path, + *, + transformer=datacatalogCallTransformer(), +): + """Duplicate the input dir to the output dir, fixing file method calls. + + Preconditions: + * in_dir is a real directory + * out_dir is a real, empty directory + """ + pyfile_gen = ( + pathlib.Path(os.path.join(root, f)) + for root, _, files in os.walk(in_dir) + for f in files if os.path.splitext(f)[1] == ".py" + ) + + for fpath in pyfile_gen: + with open(fpath, 'r') as f: + src = f.read() + + # Parse the code and insert method call fixes. + tree = cst.parse_module(src) + updated = tree.visit(transformer) + + # Create the path and directory structure for the new file. + updated_path = out_dir.joinpath(fpath.relative_to(in_dir)) + updated_path.parent.mkdir(parents=True, exist_ok=True) + + # Generate the updated source file at the corresponding path. + with open(updated_path, 'w') as f: + f.write(updated.code) + + +if __name__ == '__main__': + parser = argparse.ArgumentParser( + description="""Fix up source that uses the datacatalog client library. + +The existing sources are NOT overwritten but are copied to output_dir with changes made. + +Note: This tool operates at a best-effort level at converting positional + parameters in client method calls to keyword based parameters. + Cases where it WILL FAIL include + A) * or ** expansion in a method call. + B) Calls via function or method alias (includes free function calls) + C) Indirect or dispatched calls (e.g. the method is looked up dynamically) + + These all constitute false negatives. The tool will also detect false + positives when an API method shares a name with another method. 
+""") + parser.add_argument( + '-d', + '--input-directory', + required=True, + dest='input_dir', + help='the input directory to walk for python files to fix up', + ) + parser.add_argument( + '-o', + '--output-directory', + required=True, + dest='output_dir', + help='the directory to output files fixed via un-flattening', + ) + args = parser.parse_args() + input_dir = pathlib.Path(args.input_dir) + output_dir = pathlib.Path(args.output_dir) + if not input_dir.is_dir(): + print( + f"input directory '{input_dir}' does not exist or is not a directory", + file=sys.stderr, + ) + sys.exit(-1) + + if not output_dir.is_dir(): + print( + f"output directory '{output_dir}' does not exist or is not a directory", + file=sys.stderr, + ) + sys.exit(-1) + + if os.listdir(output_dir): + print( + f"output directory '{output_dir}' is not empty", + file=sys.stderr, + ) + sys.exit(-1) + + fix_files(input_dir, output_dir) diff --git a/setup.py b/setup.py index 8590f436..e21dad6c 100644 --- a/setup.py +++ b/setup.py @@ -28,9 +28,10 @@ # 'Development Status :: 5 - Production/Stable' release_status = "Development Status :: 5 - Production/Stable" dependencies = [ - "google-api-core[grpc] >= 1.14.0, < 2.0.0dev", + "google-api-core[grpc] >= 1.22.0, < 2.0.0dev", "grpc-google-iam-v1 >= 0.12.3, < 0.13dev", - 'enum34; python_version < "3.4"', + "libcst >= 0.2.5", + "proto-plus >= 1.4.0", ] package_root = os.path.abspath(os.path.dirname(__file__)) @@ -40,7 +41,9 @@ readme = readme_file.read() packages = [ - package for package in setuptools.find_packages() if package.startswith("google") + package + for package in setuptools.PEP420PackageFinder.find() + if package.startswith("google") ] namespaces = ["google"] @@ -61,17 +64,19 @@ "Intended Audience :: Developers", "License :: OSI Approved :: Apache Software License", "Programming Language :: Python", - "Programming Language :: Python :: 2", - "Programming Language :: Python :: 2.7", "Programming Language :: Python :: 3", - "Programming Language :: 
Python :: 3.5", "Programming Language :: Python :: 3.6", "Programming Language :: Python :: 3.7", + "Programming Language :: Python :: 3.8", "Operating System :: OS Independent", "Topic :: Internet", ], platforms="Posix; MacOS X; Windows", packages=packages, + scripts=[ + "scripts/fixup_datacatalog_v1_keywords.py", + "scripts/fixup_datacatalog_v1beta1_keywords.py", + ], namespace_packages=namespaces, install_requires=dependencies, include_package_data=True, diff --git a/synth.metadata b/synth.metadata index 55b4e42b..48e1a9a9 100644 --- a/synth.metadata +++ b/synth.metadata @@ -4,29 +4,21 @@ "git": { "name": ".", "remote": "git@github.com:googleapis/python-datacatalog.git", - "sha": "bc14411984259665f14119443f6d5e6901a7e3d4" - } - }, - { - "git": { - "name": "googleapis", - "remote": "https://github.com/googleapis/googleapis.git", - "sha": "874846a1917ee5c3fe271449f3cb9a06e75407be", - "internalRef": "326288259" + "sha": "09d02ebb2738c9663abe060da926c2432d6ffb42" } }, { "git": { "name": "synthtool", "remote": "https://github.com/googleapis/synthtool.git", - "sha": "5747555f7620113d9a2078a48f4c047a99d31b3e" + "sha": "d3049e66447b44dc10579e461d5e08e0e3838edd" } }, { "git": { "name": "synthtool", "remote": "https://github.com/googleapis/synthtool.git", - "sha": "5747555f7620113d9a2078a48f4c047a99d31b3e" + "sha": "d3049e66447b44dc10579e461d5e08e0e3838edd" } } ], diff --git a/synth.py b/synth.py index 9e783eeb..7037d24b 100644 --- a/synth.py +++ b/synth.py @@ -31,6 +31,7 @@ service='datacatalog', version=version, bazel_target=f"//google/cloud/datacatalog/{version}:datacatalog-{version}-py", + include_protos=True, ) s.move( @@ -57,10 +58,20 @@ # Add templated files # ---------------------------------------------------------------------------- templated_files = common.py_library( - cov_level=79, samples=True, + microgenerator=True, ) -s.move(templated_files) +s.move(templated_files, excludes=[".coveragerc"]) # microgenerator has a good .coveragerc file + +# 
---------------------------------------------------------------------------- +# Samples templates +# ---------------------------------------------------------------------------- + +python.py_samples() + +# Temporarily disable warnings due to +# https://github.com/googleapis/gapic-generator-python/issues/525 +s.replace("noxfile.py", '[\"\']-W[\"\']', '# "-W"') # ---------------------------------------------------------------------------- # Samples templates diff --git a/tests/unit/gapic/datacatalog_v1/__init__.py b/tests/unit/gapic/datacatalog_v1/__init__.py new file mode 100644 index 00000000..8b137891 --- /dev/null +++ b/tests/unit/gapic/datacatalog_v1/__init__.py @@ -0,0 +1 @@ + diff --git a/tests/unit/gapic/datacatalog_v1/test_data_catalog.py b/tests/unit/gapic/datacatalog_v1/test_data_catalog.py new file mode 100644 index 00000000..79523fa6 --- /dev/null +++ b/tests/unit/gapic/datacatalog_v1/test_data_catalog.py @@ -0,0 +1,6836 @@ +# -*- coding: utf-8 -*- + +# Copyright 2020 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
+# + +import os +import mock + +import grpc +from grpc.experimental import aio +import math +import pytest +from proto.marshal.rules.dates import DurationRule, TimestampRule + +from google import auth +from google.api_core import client_options +from google.api_core import exceptions +from google.api_core import gapic_v1 +from google.api_core import grpc_helpers +from google.api_core import grpc_helpers_async +from google.auth import credentials +from google.auth.exceptions import MutualTLSChannelError +from google.cloud.datacatalog_v1.services.data_catalog import DataCatalogAsyncClient +from google.cloud.datacatalog_v1.services.data_catalog import DataCatalogClient +from google.cloud.datacatalog_v1.services.data_catalog import pagers +from google.cloud.datacatalog_v1.services.data_catalog import transports +from google.cloud.datacatalog_v1.types import common +from google.cloud.datacatalog_v1.types import datacatalog +from google.cloud.datacatalog_v1.types import gcs_fileset_spec +from google.cloud.datacatalog_v1.types import gcs_fileset_spec as gcd_gcs_fileset_spec +from google.cloud.datacatalog_v1.types import schema +from google.cloud.datacatalog_v1.types import schema as gcd_schema +from google.cloud.datacatalog_v1.types import search +from google.cloud.datacatalog_v1.types import table_spec +from google.cloud.datacatalog_v1.types import table_spec as gcd_table_spec +from google.cloud.datacatalog_v1.types import tags +from google.cloud.datacatalog_v1.types import timestamps +from google.iam.v1 import iam_policy_pb2 as iam_policy # type: ignore +from google.iam.v1 import options_pb2 as options # type: ignore +from google.iam.v1 import policy_pb2 as policy # type: ignore +from google.oauth2 import service_account +from google.protobuf import field_mask_pb2 as field_mask # type: ignore +from google.protobuf import timestamp_pb2 as timestamp # type: ignore +from google.type import expr_pb2 as expr # type: ignore + + +def client_cert_source_callback(): + return 
b"cert bytes", b"key bytes" + + +# If default endpoint is localhost, then default mtls endpoint will be the same. +# This method modifies the default endpoint so the client can produce a different +# mtls endpoint for endpoint testing purposes. +def modify_default_endpoint(client): + return ( + "foo.googleapis.com" + if ("localhost" in client.DEFAULT_ENDPOINT) + else client.DEFAULT_ENDPOINT + ) + + +def test__get_default_mtls_endpoint(): + api_endpoint = "example.googleapis.com" + api_mtls_endpoint = "example.mtls.googleapis.com" + sandbox_endpoint = "example.sandbox.googleapis.com" + sandbox_mtls_endpoint = "example.mtls.sandbox.googleapis.com" + non_googleapi = "api.example.com" + + assert DataCatalogClient._get_default_mtls_endpoint(None) is None + assert ( + DataCatalogClient._get_default_mtls_endpoint(api_endpoint) == api_mtls_endpoint + ) + assert ( + DataCatalogClient._get_default_mtls_endpoint(api_mtls_endpoint) + == api_mtls_endpoint + ) + assert ( + DataCatalogClient._get_default_mtls_endpoint(sandbox_endpoint) + == sandbox_mtls_endpoint + ) + assert ( + DataCatalogClient._get_default_mtls_endpoint(sandbox_mtls_endpoint) + == sandbox_mtls_endpoint + ) + assert DataCatalogClient._get_default_mtls_endpoint(non_googleapi) == non_googleapi + + +@pytest.mark.parametrize("client_class", [DataCatalogClient, DataCatalogAsyncClient]) +def test_data_catalog_client_from_service_account_file(client_class): + creds = credentials.AnonymousCredentials() + with mock.patch.object( + service_account.Credentials, "from_service_account_file" + ) as factory: + factory.return_value = creds + client = client_class.from_service_account_file("dummy/file/path.json") + assert client._transport._credentials == creds + + client = client_class.from_service_account_json("dummy/file/path.json") + assert client._transport._credentials == creds + + assert client._transport._host == "datacatalog.googleapis.com:443" + + +def test_data_catalog_client_get_transport_class(): + transport = 
DataCatalogClient.get_transport_class() + assert transport == transports.DataCatalogGrpcTransport + + transport = DataCatalogClient.get_transport_class("grpc") + assert transport == transports.DataCatalogGrpcTransport + + +@pytest.mark.parametrize( + "client_class,transport_class,transport_name", + [ + (DataCatalogClient, transports.DataCatalogGrpcTransport, "grpc"), + ( + DataCatalogAsyncClient, + transports.DataCatalogGrpcAsyncIOTransport, + "grpc_asyncio", + ), + ], +) +@mock.patch.object( + DataCatalogClient, "DEFAULT_ENDPOINT", modify_default_endpoint(DataCatalogClient) +) +@mock.patch.object( + DataCatalogAsyncClient, + "DEFAULT_ENDPOINT", + modify_default_endpoint(DataCatalogAsyncClient), +) +def test_data_catalog_client_client_options( + client_class, transport_class, transport_name +): + # Check that if channel is provided we won't create a new one. + with mock.patch.object(DataCatalogClient, "get_transport_class") as gtc: + transport = transport_class(credentials=credentials.AnonymousCredentials()) + client = client_class(transport=transport) + gtc.assert_not_called() + + # Check that if channel is provided via str we will create a new one. + with mock.patch.object(DataCatalogClient, "get_transport_class") as gtc: + client = client_class(transport=transport_name) + gtc.assert_called() + + # Check the case api_endpoint is provided. + options = client_options.ClientOptions(api_endpoint="squid.clam.whelk") + with mock.patch.object(transport_class, "__init__") as patched: + patched.return_value = None + client = client_class(client_options=options) + patched.assert_called_once_with( + credentials=None, + credentials_file=None, + host="squid.clam.whelk", + scopes=None, + api_mtls_endpoint="squid.clam.whelk", + client_cert_source=None, + quota_project_id=None, + ) + + # Check the case api_endpoint is not provided and GOOGLE_API_USE_MTLS is + # "never". 
+ with mock.patch.dict(os.environ, {"GOOGLE_API_USE_MTLS": "never"}): + with mock.patch.object(transport_class, "__init__") as patched: + patched.return_value = None + client = client_class() + patched.assert_called_once_with( + credentials=None, + credentials_file=None, + host=client.DEFAULT_ENDPOINT, + scopes=None, + api_mtls_endpoint=client.DEFAULT_ENDPOINT, + client_cert_source=None, + quota_project_id=None, + ) + + # Check the case api_endpoint is not provided and GOOGLE_API_USE_MTLS is + # "always". + with mock.patch.dict(os.environ, {"GOOGLE_API_USE_MTLS": "always"}): + with mock.patch.object(transport_class, "__init__") as patched: + patched.return_value = None + client = client_class() + patched.assert_called_once_with( + credentials=None, + credentials_file=None, + host=client.DEFAULT_MTLS_ENDPOINT, + scopes=None, + api_mtls_endpoint=client.DEFAULT_MTLS_ENDPOINT, + client_cert_source=None, + quota_project_id=None, + ) + + # Check the case api_endpoint is not provided, GOOGLE_API_USE_MTLS is + # "auto", and client_cert_source is provided. + with mock.patch.dict(os.environ, {"GOOGLE_API_USE_MTLS": "auto"}): + options = client_options.ClientOptions( + client_cert_source=client_cert_source_callback + ) + with mock.patch.object(transport_class, "__init__") as patched: + patched.return_value = None + client = client_class(client_options=options) + patched.assert_called_once_with( + credentials=None, + credentials_file=None, + host=client.DEFAULT_MTLS_ENDPOINT, + scopes=None, + api_mtls_endpoint=client.DEFAULT_MTLS_ENDPOINT, + client_cert_source=client_cert_source_callback, + quota_project_id=None, + ) + + # Check the case api_endpoint is not provided, GOOGLE_API_USE_MTLS is + # "auto", and default_client_cert_source is provided. 
+ with mock.patch.dict(os.environ, {"GOOGLE_API_USE_MTLS": "auto"}): + with mock.patch.object(transport_class, "__init__") as patched: + with mock.patch( + "google.auth.transport.mtls.has_default_client_cert_source", + return_value=True, + ): + patched.return_value = None + client = client_class() + patched.assert_called_once_with( + credentials=None, + credentials_file=None, + host=client.DEFAULT_MTLS_ENDPOINT, + scopes=None, + api_mtls_endpoint=client.DEFAULT_MTLS_ENDPOINT, + client_cert_source=None, + quota_project_id=None, + ) + + # Check the case api_endpoint is not provided, GOOGLE_API_USE_MTLS is + # "auto", but client_cert_source and default_client_cert_source are None. + with mock.patch.dict(os.environ, {"GOOGLE_API_USE_MTLS": "auto"}): + with mock.patch.object(transport_class, "__init__") as patched: + with mock.patch( + "google.auth.transport.mtls.has_default_client_cert_source", + return_value=False, + ): + patched.return_value = None + client = client_class() + patched.assert_called_once_with( + credentials=None, + credentials_file=None, + host=client.DEFAULT_ENDPOINT, + scopes=None, + api_mtls_endpoint=client.DEFAULT_ENDPOINT, + client_cert_source=None, + quota_project_id=None, + ) + + # Check the case api_endpoint is not provided and GOOGLE_API_USE_MTLS has + # unsupported value. 
+ with mock.patch.dict(os.environ, {"GOOGLE_API_USE_MTLS": "Unsupported"}): + with pytest.raises(MutualTLSChannelError): + client = client_class() + + # Check the case quota_project_id is provided + options = client_options.ClientOptions(quota_project_id="octopus") + with mock.patch.object(transport_class, "__init__") as patched: + patched.return_value = None + client = client_class(client_options=options) + patched.assert_called_once_with( + credentials=None, + credentials_file=None, + host=client.DEFAULT_ENDPOINT, + scopes=None, + api_mtls_endpoint=client.DEFAULT_ENDPOINT, + client_cert_source=None, + quota_project_id="octopus", + ) + + +@pytest.mark.parametrize( + "client_class,transport_class,transport_name", + [ + (DataCatalogClient, transports.DataCatalogGrpcTransport, "grpc"), + ( + DataCatalogAsyncClient, + transports.DataCatalogGrpcAsyncIOTransport, + "grpc_asyncio", + ), + ], +) +def test_data_catalog_client_client_options_scopes( + client_class, transport_class, transport_name +): + # Check the case scopes are provided. + options = client_options.ClientOptions(scopes=["1", "2"],) + with mock.patch.object(transport_class, "__init__") as patched: + patched.return_value = None + client = client_class(client_options=options) + patched.assert_called_once_with( + credentials=None, + credentials_file=None, + host=client.DEFAULT_ENDPOINT, + scopes=["1", "2"], + api_mtls_endpoint=client.DEFAULT_ENDPOINT, + client_cert_source=None, + quota_project_id=None, + ) + + +@pytest.mark.parametrize( + "client_class,transport_class,transport_name", + [ + (DataCatalogClient, transports.DataCatalogGrpcTransport, "grpc"), + ( + DataCatalogAsyncClient, + transports.DataCatalogGrpcAsyncIOTransport, + "grpc_asyncio", + ), + ], +) +def test_data_catalog_client_client_options_credentials_file( + client_class, transport_class, transport_name +): + # Check the case credentials file is provided. 
+ options = client_options.ClientOptions(credentials_file="credentials.json") + with mock.patch.object(transport_class, "__init__") as patched: + patched.return_value = None + client = client_class(client_options=options) + patched.assert_called_once_with( + credentials=None, + credentials_file="credentials.json", + host=client.DEFAULT_ENDPOINT, + scopes=None, + api_mtls_endpoint=client.DEFAULT_ENDPOINT, + client_cert_source=None, + quota_project_id=None, + ) + + +def test_data_catalog_client_client_options_from_dict(): + with mock.patch( + "google.cloud.datacatalog_v1.services.data_catalog.transports.DataCatalogGrpcTransport.__init__" + ) as grpc_transport: + grpc_transport.return_value = None + client = DataCatalogClient(client_options={"api_endpoint": "squid.clam.whelk"}) + grpc_transport.assert_called_once_with( + credentials=None, + credentials_file=None, + host="squid.clam.whelk", + scopes=None, + api_mtls_endpoint="squid.clam.whelk", + client_cert_source=None, + quota_project_id=None, + ) + + +def test_search_catalog( + transport: str = "grpc", request_type=datacatalog.SearchCatalogRequest +): + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = request_type() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.search_catalog), "__call__") as call: + # Designate an appropriate return value for the call. + call.return_value = datacatalog.SearchCatalogResponse( + next_page_token="next_page_token_value", unreachable=["unreachable_value"], + ) + + response = client.search_catalog(request) + + # Establish that the underlying gRPC stub method was called. 
+ assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.SearchCatalogRequest() + + # Establish that the response is the type that we expect. + assert isinstance(response, pagers.SearchCatalogPager) + + assert response.next_page_token == "next_page_token_value" + + assert response.unreachable == ["unreachable_value"] + + +def test_search_catalog_from_dict(): + test_search_catalog(request_type=dict) + + +@pytest.mark.asyncio +async def test_search_catalog_async(transport: str = "grpc_asyncio"): + client = DataCatalogAsyncClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = datacatalog.SearchCatalogRequest() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.search_catalog), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + datacatalog.SearchCatalogResponse( + next_page_token="next_page_token_value", + unreachable=["unreachable_value"], + ) + ) + + response = await client.search_catalog(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0] == request + + # Establish that the response is the type that we expect. + assert isinstance(response, pagers.SearchCatalogAsyncPager) + + assert response.next_page_token == "next_page_token_value" + + assert response.unreachable == ["unreachable_value"] + + +def test_search_catalog_flattened(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. 
+ with mock.patch.object(type(client._transport.search_catalog), "__call__") as call: + # Designate an appropriate return value for the call. + call.return_value = datacatalog.SearchCatalogResponse() + + # Call the method with a truthy value for each flattened field, + # using the keyword arguments to the method. + client.search_catalog( + scope=datacatalog.SearchCatalogRequest.Scope( + include_org_ids=["include_org_ids_value"] + ), + query="query_value", + ) + + # Establish that the underlying call was made with the expected + # request object values. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0].scope == datacatalog.SearchCatalogRequest.Scope( + include_org_ids=["include_org_ids_value"] + ) + + assert args[0].query == "query_value" + + +def test_search_catalog_flattened_error(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Attempting to call a method with both a request object and flattened + # fields is an error. + with pytest.raises(ValueError): + client.search_catalog( + datacatalog.SearchCatalogRequest(), + scope=datacatalog.SearchCatalogRequest.Scope( + include_org_ids=["include_org_ids_value"] + ), + query="query_value", + ) + + +@pytest.mark.asyncio +async def test_search_catalog_flattened_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.search_catalog), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = datacatalog.SearchCatalogResponse() + + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + datacatalog.SearchCatalogResponse() + ) + # Call the method with a truthy value for each flattened field, + # using the keyword arguments to the method. 
+ response = await client.search_catalog(
+ scope=datacatalog.SearchCatalogRequest.Scope(
+ include_org_ids=["include_org_ids_value"]
+ ),
+ query="query_value",
+ )
+
+ # Establish that the underlying call was made with the expected
+ # request object values.
+ assert len(call.mock_calls)
+ _, args, _ = call.mock_calls[0]
+
+ assert args[0].scope == datacatalog.SearchCatalogRequest.Scope(
+ include_org_ids=["include_org_ids_value"]
+ )
+
+ assert args[0].query == "query_value"
+
+
+@pytest.mark.asyncio
+async def test_search_catalog_flattened_error_async():
+ client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+ # Attempting to call a method with both a request object and flattened
+ # fields is an error.
+ with pytest.raises(ValueError):
+ await client.search_catalog(
+ datacatalog.SearchCatalogRequest(),
+ scope=datacatalog.SearchCatalogRequest.Scope(
+ include_org_ids=["include_org_ids_value"]
+ ),
+ query="query_value",
+ )
+
+
+def test_search_catalog_pager():
+ client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),)
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(type(client._transport.search_catalog), "__call__") as call:
+ # Set the response to a series of pages.
+ call.side_effect = (
+ datacatalog.SearchCatalogResponse(
+ results=[
+ search.SearchCatalogResult(),
+ search.SearchCatalogResult(),
+ search.SearchCatalogResult(),
+ ],
+ next_page_token="abc",
+ ),
+ datacatalog.SearchCatalogResponse(results=[], next_page_token="def",),
+ datacatalog.SearchCatalogResponse(
+ results=[search.SearchCatalogResult(),], next_page_token="ghi",
+ ),
+ datacatalog.SearchCatalogResponse(
+ results=[search.SearchCatalogResult(), search.SearchCatalogResult(),],
+ ),
+ RuntimeError,
+ )
+
+ metadata = ()
+ pager = client.search_catalog(request={})
+
+ assert pager._metadata == metadata
+
+ results = [i for i in pager]
+ assert len(results) == 6
+ assert all(isinstance(i, search.SearchCatalogResult) for i in results)
+
+
+def test_search_catalog_pages():
+ client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),)
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(type(client._transport.search_catalog), "__call__") as call:
+ # Set the response to a series of pages.
+ call.side_effect = (
+ datacatalog.SearchCatalogResponse(
+ results=[
+ search.SearchCatalogResult(),
+ search.SearchCatalogResult(),
+ search.SearchCatalogResult(),
+ ],
+ next_page_token="abc",
+ ),
+ datacatalog.SearchCatalogResponse(results=[], next_page_token="def",),
+ datacatalog.SearchCatalogResponse(
+ results=[search.SearchCatalogResult(),], next_page_token="ghi",
+ ),
+ datacatalog.SearchCatalogResponse(
+ results=[search.SearchCatalogResult(), search.SearchCatalogResult(),],
+ ),
+ RuntimeError,
+ )
+ pages = list(client.search_catalog(request={}).pages)
+ for page, token in zip(pages, ["abc", "def", "ghi", ""]):
+ assert page.raw_page.next_page_token == token
+
+
+@pytest.mark.asyncio
+async def test_search_catalog_async_pager():
+ client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(
+ type(client._client._transport.search_catalog),
+ "__call__",
+ new_callable=mock.AsyncMock,
+ ) as call:
+ # Set the response to a series of pages.
+ call.side_effect = (
+ datacatalog.SearchCatalogResponse(
+ results=[
+ search.SearchCatalogResult(),
+ search.SearchCatalogResult(),
+ search.SearchCatalogResult(),
+ ],
+ next_page_token="abc",
+ ),
+ datacatalog.SearchCatalogResponse(results=[], next_page_token="def",),
+ datacatalog.SearchCatalogResponse(
+ results=[search.SearchCatalogResult(),], next_page_token="ghi",
+ ),
+ datacatalog.SearchCatalogResponse(
+ results=[search.SearchCatalogResult(), search.SearchCatalogResult(),],
+ ),
+ RuntimeError,
+ )
+ async_pager = await client.search_catalog(request={},)
+ assert async_pager.next_page_token == "abc"
+ responses = []
+ async for response in async_pager:
+ responses.append(response)
+
+ assert len(responses) == 6
+ assert all(isinstance(i, search.SearchCatalogResult) for i in responses)
+
+
+@pytest.mark.asyncio
+async def test_search_catalog_async_pages():
+ client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(
+ type(client._client._transport.search_catalog),
+ "__call__",
+ new_callable=mock.AsyncMock,
+ ) as call:
+ # Set the response to a series of pages.
+ call.side_effect = ( + datacatalog.SearchCatalogResponse( + results=[ + search.SearchCatalogResult(), + search.SearchCatalogResult(), + search.SearchCatalogResult(), + ], + next_page_token="abc", + ), + datacatalog.SearchCatalogResponse(results=[], next_page_token="def",), + datacatalog.SearchCatalogResponse( + results=[search.SearchCatalogResult(),], next_page_token="ghi", + ), + datacatalog.SearchCatalogResponse( + results=[search.SearchCatalogResult(), search.SearchCatalogResult(),], + ), + RuntimeError, + ) + pages = [] + async for page in (await client.search_catalog(request={})).pages: + pages.append(page) + for page, token in zip(pages, ["abc", "def", "ghi", ""]): + assert page.raw_page.next_page_token == token + + +def test_create_entry_group( + transport: str = "grpc", request_type=datacatalog.CreateEntryGroupRequest +): + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = request_type() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._transport.create_entry_group), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = datacatalog.EntryGroup( + name="name_value", + display_name="display_name_value", + description="description_value", + ) + + response = client.create_entry_group(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.CreateEntryGroupRequest() + + # Establish that the response is the type that we expect. 
+ assert isinstance(response, datacatalog.EntryGroup) + + assert response.name == "name_value" + + assert response.display_name == "display_name_value" + + assert response.description == "description_value" + + +def test_create_entry_group_from_dict(): + test_create_entry_group(request_type=dict) + + +@pytest.mark.asyncio +async def test_create_entry_group_async(transport: str = "grpc_asyncio"): + client = DataCatalogAsyncClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = datacatalog.CreateEntryGroupRequest() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.create_entry_group), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + datacatalog.EntryGroup( + name="name_value", + display_name="display_name_value", + description="description_value", + ) + ) + + response = await client.create_entry_group(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0] == request + + # Establish that the response is the type that we expect. + assert isinstance(response, datacatalog.EntryGroup) + + assert response.name == "name_value" + + assert response.display_name == "display_name_value" + + assert response.description == "description_value" + + +def test_create_entry_group_field_headers(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. 
+ request = datacatalog.CreateEntryGroupRequest() + request.parent = "parent/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._transport.create_entry_group), "__call__" + ) as call: + call.return_value = datacatalog.EntryGroup() + + client.create_entry_group(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "parent=parent/value",) in kw["metadata"] + + +@pytest.mark.asyncio +async def test_create_entry_group_field_headers_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = datacatalog.CreateEntryGroupRequest() + request.parent = "parent/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.create_entry_group), "__call__" + ) as call: + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + datacatalog.EntryGroup() + ) + + await client.create_entry_group(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "parent=parent/value",) in kw["metadata"] + + +def test_create_entry_group_flattened(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._transport.create_entry_group), "__call__" + ) as call: + # Designate an appropriate return value for the call. 
+ call.return_value = datacatalog.EntryGroup() + + # Call the method with a truthy value for each flattened field, + # using the keyword arguments to the method. + client.create_entry_group( + parent="parent_value", + entry_group_id="entry_group_id_value", + entry_group=datacatalog.EntryGroup(name="name_value"), + ) + + # Establish that the underlying call was made with the expected + # request object values. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0].parent == "parent_value" + + assert args[0].entry_group_id == "entry_group_id_value" + + assert args[0].entry_group == datacatalog.EntryGroup(name="name_value") + + +def test_create_entry_group_flattened_error(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Attempting to call a method with both a request object and flattened + # fields is an error. + with pytest.raises(ValueError): + client.create_entry_group( + datacatalog.CreateEntryGroupRequest(), + parent="parent_value", + entry_group_id="entry_group_id_value", + entry_group=datacatalog.EntryGroup(name="name_value"), + ) + + +@pytest.mark.asyncio +async def test_create_entry_group_flattened_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.create_entry_group), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = datacatalog.EntryGroup() + + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + datacatalog.EntryGroup() + ) + # Call the method with a truthy value for each flattened field, + # using the keyword arguments to the method. 
+ response = await client.create_entry_group( + parent="parent_value", + entry_group_id="entry_group_id_value", + entry_group=datacatalog.EntryGroup(name="name_value"), + ) + + # Establish that the underlying call was made with the expected + # request object values. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0].parent == "parent_value" + + assert args[0].entry_group_id == "entry_group_id_value" + + assert args[0].entry_group == datacatalog.EntryGroup(name="name_value") + + +@pytest.mark.asyncio +async def test_create_entry_group_flattened_error_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Attempting to call a method with both a request object and flattened + # fields is an error. + with pytest.raises(ValueError): + await client.create_entry_group( + datacatalog.CreateEntryGroupRequest(), + parent="parent_value", + entry_group_id="entry_group_id_value", + entry_group=datacatalog.EntryGroup(name="name_value"), + ) + + +def test_get_entry_group( + transport: str = "grpc", request_type=datacatalog.GetEntryGroupRequest +): + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = request_type() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.get_entry_group), "__call__") as call: + # Designate an appropriate return value for the call. + call.return_value = datacatalog.EntryGroup( + name="name_value", + display_name="display_name_value", + description="description_value", + ) + + response = client.get_entry_group(request) + + # Establish that the underlying gRPC stub method was called. 
+ assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.GetEntryGroupRequest() + + # Establish that the response is the type that we expect. + assert isinstance(response, datacatalog.EntryGroup) + + assert response.name == "name_value" + + assert response.display_name == "display_name_value" + + assert response.description == "description_value" + + +def test_get_entry_group_from_dict(): + test_get_entry_group(request_type=dict) + + +@pytest.mark.asyncio +async def test_get_entry_group_async(transport: str = "grpc_asyncio"): + client = DataCatalogAsyncClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = datacatalog.GetEntryGroupRequest() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.get_entry_group), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + datacatalog.EntryGroup( + name="name_value", + display_name="display_name_value", + description="description_value", + ) + ) + + response = await client.get_entry_group(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0] == request + + # Establish that the response is the type that we expect. + assert isinstance(response, datacatalog.EntryGroup) + + assert response.name == "name_value" + + assert response.display_name == "display_name_value" + + assert response.description == "description_value" + + +def test_get_entry_group_field_headers(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. 
+ # Set these to a non-empty value.
+ request = datacatalog.GetEntryGroupRequest()
+ request.name = "name/value"
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(type(client._transport.get_entry_group), "__call__") as call:
+ call.return_value = datacatalog.EntryGroup()
+
+ client.get_entry_group(request)
+
+ # Establish that the underlying gRPC stub method was called.
+ assert len(call.mock_calls) == 1
+ _, args, _ = call.mock_calls[0]
+ assert args[0] == request
+
+ # Establish that the field header was sent.
+ _, _, kw = call.mock_calls[0]
+ assert ("x-goog-request-params", "name=name/value",) in kw["metadata"]
+
+
+@pytest.mark.asyncio
+async def test_get_entry_group_field_headers_async():
+ client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+ # Any value that is part of the HTTP/1.1 URI should be sent as
+ # a field header. Set these to a non-empty value.
+ request = datacatalog.GetEntryGroupRequest()
+ request.name = "name/value"
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(
+ type(client._client._transport.get_entry_group), "__call__"
+ ) as call:
+ call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(
+ datacatalog.EntryGroup()
+ )
+
+ await client.get_entry_group(request)
+
+ # Establish that the underlying gRPC stub method was called.
+ assert len(call.mock_calls)
+ _, args, _ = call.mock_calls[0]
+ assert args[0] == request
+
+ # Establish that the field header was sent.
+ _, _, kw = call.mock_calls[0]
+ assert ("x-goog-request-params", "name=name/value",) in kw["metadata"]
+
+
+def test_get_entry_group_flattened():
+ client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),)
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(type(client._transport.get_entry_group), "__call__") as call:
+ # Designate an appropriate return value for the call.
+ call.return_value = datacatalog.EntryGroup() + + # Call the method with a truthy value for each flattened field, + # using the keyword arguments to the method. + client.get_entry_group( + name="name_value", read_mask=field_mask.FieldMask(paths=["paths_value"]), + ) + + # Establish that the underlying call was made with the expected + # request object values. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0].name == "name_value" + + assert args[0].read_mask == field_mask.FieldMask(paths=["paths_value"]) + + +def test_get_entry_group_flattened_error(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Attempting to call a method with both a request object and flattened + # fields is an error. + with pytest.raises(ValueError): + client.get_entry_group( + datacatalog.GetEntryGroupRequest(), + name="name_value", + read_mask=field_mask.FieldMask(paths=["paths_value"]), + ) + + +@pytest.mark.asyncio +async def test_get_entry_group_flattened_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.get_entry_group), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = datacatalog.EntryGroup() + + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + datacatalog.EntryGroup() + ) + # Call the method with a truthy value for each flattened field, + # using the keyword arguments to the method. + response = await client.get_entry_group( + name="name_value", read_mask=field_mask.FieldMask(paths=["paths_value"]), + ) + + # Establish that the underlying call was made with the expected + # request object values. 
+ assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0].name == "name_value" + + assert args[0].read_mask == field_mask.FieldMask(paths=["paths_value"]) + + +@pytest.mark.asyncio +async def test_get_entry_group_flattened_error_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Attempting to call a method with both a request object and flattened + # fields is an error. + with pytest.raises(ValueError): + await client.get_entry_group( + datacatalog.GetEntryGroupRequest(), + name="name_value", + read_mask=field_mask.FieldMask(paths=["paths_value"]), + ) + + +def test_update_entry_group( + transport: str = "grpc", request_type=datacatalog.UpdateEntryGroupRequest +): + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = request_type() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._transport.update_entry_group), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = datacatalog.EntryGroup( + name="name_value", + display_name="display_name_value", + description="description_value", + ) + + response = client.update_entry_group(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.UpdateEntryGroupRequest() + + # Establish that the response is the type that we expect. 
+ assert isinstance(response, datacatalog.EntryGroup) + + assert response.name == "name_value" + + assert response.display_name == "display_name_value" + + assert response.description == "description_value" + + +def test_update_entry_group_from_dict(): + test_update_entry_group(request_type=dict) + + +@pytest.mark.asyncio +async def test_update_entry_group_async(transport: str = "grpc_asyncio"): + client = DataCatalogAsyncClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = datacatalog.UpdateEntryGroupRequest() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.update_entry_group), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + datacatalog.EntryGroup( + name="name_value", + display_name="display_name_value", + description="description_value", + ) + ) + + response = await client.update_entry_group(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0] == request + + # Establish that the response is the type that we expect. + assert isinstance(response, datacatalog.EntryGroup) + + assert response.name == "name_value" + + assert response.display_name == "display_name_value" + + assert response.description == "description_value" + + +def test_update_entry_group_field_headers(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. 
+ request = datacatalog.UpdateEntryGroupRequest() + request.entry_group.name = "entry_group.name/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._transport.update_entry_group), "__call__" + ) as call: + call.return_value = datacatalog.EntryGroup() + + client.update_entry_group(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "entry_group.name=entry_group.name/value",) in kw[ + "metadata" + ] + + +@pytest.mark.asyncio +async def test_update_entry_group_field_headers_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = datacatalog.UpdateEntryGroupRequest() + request.entry_group.name = "entry_group.name/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.update_entry_group), "__call__" + ) as call: + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + datacatalog.EntryGroup() + ) + + await client.update_entry_group(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "entry_group.name=entry_group.name/value",) in kw[ + "metadata" + ] + + +def test_update_entry_group_flattened(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. 
+ with mock.patch.object( + type(client._transport.update_entry_group), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = datacatalog.EntryGroup() + + # Call the method with a truthy value for each flattened field, + # using the keyword arguments to the method. + client.update_entry_group( + entry_group=datacatalog.EntryGroup(name="name_value"), + update_mask=field_mask.FieldMask(paths=["paths_value"]), + ) + + # Establish that the underlying call was made with the expected + # request object values. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0].entry_group == datacatalog.EntryGroup(name="name_value") + + assert args[0].update_mask == field_mask.FieldMask(paths=["paths_value"]) + + +def test_update_entry_group_flattened_error(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Attempting to call a method with both a request object and flattened + # fields is an error. + with pytest.raises(ValueError): + client.update_entry_group( + datacatalog.UpdateEntryGroupRequest(), + entry_group=datacatalog.EntryGroup(name="name_value"), + update_mask=field_mask.FieldMask(paths=["paths_value"]), + ) + + +@pytest.mark.asyncio +async def test_update_entry_group_flattened_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.update_entry_group), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = datacatalog.EntryGroup() + + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + datacatalog.EntryGroup() + ) + # Call the method with a truthy value for each flattened field, + # using the keyword arguments to the method. 
+ response = await client.update_entry_group( + entry_group=datacatalog.EntryGroup(name="name_value"), + update_mask=field_mask.FieldMask(paths=["paths_value"]), + ) + + # Establish that the underlying call was made with the expected + # request object values. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0].entry_group == datacatalog.EntryGroup(name="name_value") + + assert args[0].update_mask == field_mask.FieldMask(paths=["paths_value"]) + + +@pytest.mark.asyncio +async def test_update_entry_group_flattened_error_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Attempting to call a method with both a request object and flattened + # fields is an error. + with pytest.raises(ValueError): + await client.update_entry_group( + datacatalog.UpdateEntryGroupRequest(), + entry_group=datacatalog.EntryGroup(name="name_value"), + update_mask=field_mask.FieldMask(paths=["paths_value"]), + ) + + +def test_delete_entry_group( + transport: str = "grpc", request_type=datacatalog.DeleteEntryGroupRequest +): + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = request_type() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._transport.delete_entry_group), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = None + + response = client.delete_entry_group(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.DeleteEntryGroupRequest() + + # Establish that the response is the type that we expect. 
+ assert response is None + + +def test_delete_entry_group_from_dict(): + test_delete_entry_group(request_type=dict) + + +@pytest.mark.asyncio +async def test_delete_entry_group_async(transport: str = "grpc_asyncio"): + client = DataCatalogAsyncClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = datacatalog.DeleteEntryGroupRequest() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.delete_entry_group), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(None) + + response = await client.delete_entry_group(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0] == request + + # Establish that the response is the type that we expect. + assert response is None + + +def test_delete_entry_group_field_headers(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = datacatalog.DeleteEntryGroupRequest() + request.name = "name/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._transport.delete_entry_group), "__call__" + ) as call: + call.return_value = None + + client.delete_entry_group(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. 
+ _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "name=name/value",) in kw["metadata"] + + +@pytest.mark.asyncio +async def test_delete_entry_group_field_headers_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = datacatalog.DeleteEntryGroupRequest() + request.name = "name/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.delete_entry_group), "__call__" + ) as call: + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(None) + + await client.delete_entry_group(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "name=name/value",) in kw["metadata"] + + +def test_delete_entry_group_flattened(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._transport.delete_entry_group), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = None + + # Call the method with a truthy value for each flattened field, + # using the keyword arguments to the method. + client.delete_entry_group(name="name_value",) + + # Establish that the underlying call was made with the expected + # request object values. 
+ assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0].name == "name_value" + + +def test_delete_entry_group_flattened_error(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Attempting to call a method with both a request object and flattened + # fields is an error. + with pytest.raises(ValueError): + client.delete_entry_group( + datacatalog.DeleteEntryGroupRequest(), name="name_value", + ) + + +@pytest.mark.asyncio +async def test_delete_entry_group_flattened_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.delete_entry_group), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(None) + # Call the method with a truthy value for each flattened field, + # using the keyword arguments to the method. + response = await client.delete_entry_group(name="name_value",) + + # Establish that the underlying call was made with the expected + # request object values. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0].name == "name_value" + + +@pytest.mark.asyncio +async def test_delete_entry_group_flattened_error_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Attempting to call a method with both a request object and flattened + # fields is an error. 
+ with pytest.raises(ValueError): + await client.delete_entry_group( + datacatalog.DeleteEntryGroupRequest(), name="name_value", + ) + + +def test_list_entry_groups( + transport: str = "grpc", request_type=datacatalog.ListEntryGroupsRequest +): + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = request_type() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._transport.list_entry_groups), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = datacatalog.ListEntryGroupsResponse( + next_page_token="next_page_token_value", + ) + + response = client.list_entry_groups(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.ListEntryGroupsRequest() + + # Establish that the response is the type that we expect. + assert isinstance(response, pagers.ListEntryGroupsPager) + + assert response.next_page_token == "next_page_token_value" + + +def test_list_entry_groups_from_dict(): + test_list_entry_groups(request_type=dict) + + +@pytest.mark.asyncio +async def test_list_entry_groups_async(transport: str = "grpc_asyncio"): + client = DataCatalogAsyncClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = datacatalog.ListEntryGroupsRequest() + + # Mock the actual call within the gRPC stub, and fake the request. 
+ with mock.patch.object( + type(client._client._transport.list_entry_groups), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + datacatalog.ListEntryGroupsResponse( + next_page_token="next_page_token_value", + ) + ) + + response = await client.list_entry_groups(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0] == request + + # Establish that the response is the type that we expect. + assert isinstance(response, pagers.ListEntryGroupsAsyncPager) + + assert response.next_page_token == "next_page_token_value" + + +def test_list_entry_groups_field_headers(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = datacatalog.ListEntryGroupsRequest() + request.parent = "parent/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._transport.list_entry_groups), "__call__" + ) as call: + call.return_value = datacatalog.ListEntryGroupsResponse() + + client.list_entry_groups(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "parent=parent/value",) in kw["metadata"] + + +@pytest.mark.asyncio +async def test_list_entry_groups_field_headers_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. 
+ request = datacatalog.ListEntryGroupsRequest() + request.parent = "parent/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.list_entry_groups), "__call__" + ) as call: + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + datacatalog.ListEntryGroupsResponse() + ) + + await client.list_entry_groups(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "parent=parent/value",) in kw["metadata"] + + +def test_list_entry_groups_flattened(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._transport.list_entry_groups), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = datacatalog.ListEntryGroupsResponse() + + # Call the method with a truthy value for each flattened field, + # using the keyword arguments to the method. + client.list_entry_groups(parent="parent_value",) + + # Establish that the underlying call was made with the expected + # request object values. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0].parent == "parent_value" + + +def test_list_entry_groups_flattened_error(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Attempting to call a method with both a request object and flattened + # fields is an error. 
+ with pytest.raises(ValueError): + client.list_entry_groups( + datacatalog.ListEntryGroupsRequest(), parent="parent_value", + ) + + +@pytest.mark.asyncio +async def test_list_entry_groups_flattened_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.list_entry_groups), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + datacatalog.ListEntryGroupsResponse() + ) + # Call the method with a truthy value for each flattened field, + # using the keyword arguments to the method. + response = await client.list_entry_groups(parent="parent_value",) + + # Establish that the underlying call was made with the expected + # request object values. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0].parent == "parent_value" + + +@pytest.mark.asyncio +async def test_list_entry_groups_flattened_error_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Attempting to call a method with both a request object and flattened + # fields is an error. + with pytest.raises(ValueError): + await client.list_entry_groups( + datacatalog.ListEntryGroupsRequest(), parent="parent_value", + ) + + +def test_list_entry_groups_pager(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._transport.list_entry_groups), "__call__" + ) as call: + # Set the response to a series of pages. 
+ call.side_effect = ( + datacatalog.ListEntryGroupsResponse( + entry_groups=[ + datacatalog.EntryGroup(), + datacatalog.EntryGroup(), + datacatalog.EntryGroup(), + ], + next_page_token="abc", + ), + datacatalog.ListEntryGroupsResponse( + entry_groups=[], next_page_token="def", + ), + datacatalog.ListEntryGroupsResponse( + entry_groups=[datacatalog.EntryGroup(),], next_page_token="ghi", + ), + datacatalog.ListEntryGroupsResponse( + entry_groups=[datacatalog.EntryGroup(), datacatalog.EntryGroup(),], + ), + RuntimeError, + ) + + metadata = () + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("parent", ""),)), + ) + pager = client.list_entry_groups(request={}) + + assert pager._metadata == metadata + + results = [i for i in pager] + assert len(results) == 6 + assert all(isinstance(i, datacatalog.EntryGroup) for i in results) + + +def test_list_entry_groups_pages(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._transport.list_entry_groups), "__call__" + ) as call: + # Set the response to a series of pages. 
+ call.side_effect = ( + datacatalog.ListEntryGroupsResponse( + entry_groups=[ + datacatalog.EntryGroup(), + datacatalog.EntryGroup(), + datacatalog.EntryGroup(), + ], + next_page_token="abc", + ), + datacatalog.ListEntryGroupsResponse( + entry_groups=[], next_page_token="def", + ), + datacatalog.ListEntryGroupsResponse( + entry_groups=[datacatalog.EntryGroup(),], next_page_token="ghi", + ), + datacatalog.ListEntryGroupsResponse( + entry_groups=[datacatalog.EntryGroup(), datacatalog.EntryGroup(),], + ), + RuntimeError, + ) + pages = list(client.list_entry_groups(request={}).pages) + for page, token in zip(pages, ["abc", "def", "ghi", ""]): + assert page.raw_page.next_page_token == token + + +@pytest.mark.asyncio +async def test_list_entry_groups_async_pager(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.list_entry_groups), + "__call__", + new_callable=mock.AsyncMock, + ) as call: + # Set the response to a series of pages. 
+ call.side_effect = ( + datacatalog.ListEntryGroupsResponse( + entry_groups=[ + datacatalog.EntryGroup(), + datacatalog.EntryGroup(), + datacatalog.EntryGroup(), + ], + next_page_token="abc", + ), + datacatalog.ListEntryGroupsResponse( + entry_groups=[], next_page_token="def", + ), + datacatalog.ListEntryGroupsResponse( + entry_groups=[datacatalog.EntryGroup(),], next_page_token="ghi", + ), + datacatalog.ListEntryGroupsResponse( + entry_groups=[datacatalog.EntryGroup(), datacatalog.EntryGroup(),], + ), + RuntimeError, + ) + async_pager = await client.list_entry_groups(request={},) + assert async_pager.next_page_token == "abc" + responses = [] + async for response in async_pager: + responses.append(response) + + assert len(responses) == 6 + assert all(isinstance(i, datacatalog.EntryGroup) for i in responses) + + +@pytest.mark.asyncio +async def test_list_entry_groups_async_pages(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.list_entry_groups), + "__call__", + new_callable=mock.AsyncMock, + ) as call: + # Set the response to a series of pages. 
+ call.side_effect = ( + datacatalog.ListEntryGroupsResponse( + entry_groups=[ + datacatalog.EntryGroup(), + datacatalog.EntryGroup(), + datacatalog.EntryGroup(), + ], + next_page_token="abc", + ), + datacatalog.ListEntryGroupsResponse( + entry_groups=[], next_page_token="def", + ), + datacatalog.ListEntryGroupsResponse( + entry_groups=[datacatalog.EntryGroup(),], next_page_token="ghi", + ), + datacatalog.ListEntryGroupsResponse( + entry_groups=[datacatalog.EntryGroup(), datacatalog.EntryGroup(),], + ), + RuntimeError, + ) + pages = [] + async for page in (await client.list_entry_groups(request={})).pages: + pages.append(page) + for page, token in zip(pages, ["abc", "def", "ghi", ""]): + assert page.raw_page.next_page_token == token + + +def test_create_entry( + transport: str = "grpc", request_type=datacatalog.CreateEntryRequest +): + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = request_type() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.create_entry), "__call__") as call: + # Designate an appropriate return value for the call. + call.return_value = datacatalog.Entry( + name="name_value", + linked_resource="linked_resource_value", + display_name="display_name_value", + description="description_value", + type=datacatalog.EntryType.TABLE, + integrated_system=common.IntegratedSystem.BIGQUERY, + gcs_fileset_spec=gcs_fileset_spec.GcsFilesetSpec( + file_patterns=["file_patterns_value"] + ), + ) + + response = client.create_entry(request) + + # Establish that the underlying gRPC stub method was called. 
+ assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.CreateEntryRequest() + + # Establish that the response is the type that we expect. + assert isinstance(response, datacatalog.Entry) + + assert response.name == "name_value" + + assert response.linked_resource == "linked_resource_value" + + assert response.display_name == "display_name_value" + + assert response.description == "description_value" + + +def test_create_entry_from_dict(): + test_create_entry(request_type=dict) + + +@pytest.mark.asyncio +async def test_create_entry_async(transport: str = "grpc_asyncio"): + client = DataCatalogAsyncClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = datacatalog.CreateEntryRequest() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.create_entry), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + datacatalog.Entry( + name="name_value", + linked_resource="linked_resource_value", + display_name="display_name_value", + description="description_value", + ) + ) + + response = await client.create_entry(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0] == request + + # Establish that the response is the type that we expect. 
+ assert isinstance(response, datacatalog.Entry) + + assert response.name == "name_value" + + assert response.linked_resource == "linked_resource_value" + + assert response.display_name == "display_name_value" + + assert response.description == "description_value" + + +def test_create_entry_field_headers(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = datacatalog.CreateEntryRequest() + request.parent = "parent/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.create_entry), "__call__") as call: + call.return_value = datacatalog.Entry() + + client.create_entry(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "parent=parent/value",) in kw["metadata"] + + +@pytest.mark.asyncio +async def test_create_entry_field_headers_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = datacatalog.CreateEntryRequest() + request.parent = "parent/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.create_entry), "__call__" + ) as call: + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(datacatalog.Entry()) + + await client.create_entry(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. 
+ _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "parent=parent/value",) in kw["metadata"] + + +def test_create_entry_flattened(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.create_entry), "__call__") as call: + # Designate an appropriate return value for the call. + call.return_value = datacatalog.Entry() + + # Call the method with a truthy value for each flattened field, + # using the keyword arguments to the method. + client.create_entry( + parent="parent_value", + entry_id="entry_id_value", + entry=datacatalog.Entry(name="name_value"), + ) + + # Establish that the underlying call was made with the expected + # request object values. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0].parent == "parent_value" + + assert args[0].entry_id == "entry_id_value" + + assert args[0].entry == datacatalog.Entry(name="name_value") + + +def test_create_entry_flattened_error(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Attempting to call a method with both a request object and flattened + # fields is an error. + with pytest.raises(ValueError): + client.create_entry( + datacatalog.CreateEntryRequest(), + parent="parent_value", + entry_id="entry_id_value", + entry=datacatalog.Entry(name="name_value"), + ) + + +@pytest.mark.asyncio +async def test_create_entry_flattened_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.create_entry), "__call__" + ) as call: + # Designate an appropriate return value for the call. 
+ call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(datacatalog.Entry()) + # Call the method with a truthy value for each flattened field, + # using the keyword arguments to the method. + response = await client.create_entry( + parent="parent_value", + entry_id="entry_id_value", + entry=datacatalog.Entry(name="name_value"), + ) + + # Establish that the underlying call was made with the expected + # request object values. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0].parent == "parent_value" + + assert args[0].entry_id == "entry_id_value" + + assert args[0].entry == datacatalog.Entry(name="name_value") + + +@pytest.mark.asyncio +async def test_create_entry_flattened_error_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Attempting to call a method with both a request object and flattened + # fields is an error. + with pytest.raises(ValueError): + await client.create_entry( + datacatalog.CreateEntryRequest(), + parent="parent_value", + entry_id="entry_id_value", + entry=datacatalog.Entry(name="name_value"), + ) + + +def test_update_entry( + transport: str = "grpc", request_type=datacatalog.UpdateEntryRequest +): + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = request_type() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.update_entry), "__call__") as call: + # Designate an appropriate return value for the call. 
+ call.return_value = datacatalog.Entry( + name="name_value", + linked_resource="linked_resource_value", + display_name="display_name_value", + description="description_value", + type=datacatalog.EntryType.TABLE, + integrated_system=common.IntegratedSystem.BIGQUERY, + gcs_fileset_spec=gcs_fileset_spec.GcsFilesetSpec( + file_patterns=["file_patterns_value"] + ), + ) + + response = client.update_entry(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.UpdateEntryRequest() + + # Establish that the response is the type that we expect. + assert isinstance(response, datacatalog.Entry) + + assert response.name == "name_value" + + assert response.linked_resource == "linked_resource_value" + + assert response.display_name == "display_name_value" + + assert response.description == "description_value" + + +def test_update_entry_from_dict(): + test_update_entry(request_type=dict) + + +@pytest.mark.asyncio +async def test_update_entry_async(transport: str = "grpc_asyncio"): + client = DataCatalogAsyncClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = datacatalog.UpdateEntryRequest() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.update_entry), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + datacatalog.Entry( + name="name_value", + linked_resource="linked_resource_value", + display_name="display_name_value", + description="description_value", + ) + ) + + response = await client.update_entry(request) + + # Establish that the underlying gRPC stub method was called. 
+ assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0] == request + + # Establish that the response is the type that we expect. + assert isinstance(response, datacatalog.Entry) + + assert response.name == "name_value" + + assert response.linked_resource == "linked_resource_value" + + assert response.display_name == "display_name_value" + + assert response.description == "description_value" + + +def test_update_entry_field_headers(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = datacatalog.UpdateEntryRequest() + request.entry.name = "entry.name/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.update_entry), "__call__") as call: + call.return_value = datacatalog.Entry() + + client.update_entry(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "entry.name=entry.name/value",) in kw["metadata"] + + +@pytest.mark.asyncio +async def test_update_entry_field_headers_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = datacatalog.UpdateEntryRequest() + request.entry.name = "entry.name/value" + + # Mock the actual call within the gRPC stub, and fake the request. 
+ with mock.patch.object( + type(client._client._transport.update_entry), "__call__" + ) as call: + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(datacatalog.Entry()) + + await client.update_entry(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "entry.name=entry.name/value",) in kw["metadata"] + + +def test_update_entry_flattened(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.update_entry), "__call__") as call: + # Designate an appropriate return value for the call. + call.return_value = datacatalog.Entry() + + # Call the method with a truthy value for each flattened field, + # using the keyword arguments to the method. + client.update_entry( + entry=datacatalog.Entry(name="name_value"), + update_mask=field_mask.FieldMask(paths=["paths_value"]), + ) + + # Establish that the underlying call was made with the expected + # request object values. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0].entry == datacatalog.Entry(name="name_value") + + assert args[0].update_mask == field_mask.FieldMask(paths=["paths_value"]) + + +def test_update_entry_flattened_error(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Attempting to call a method with both a request object and flattened + # fields is an error. 
+ with pytest.raises(ValueError): + client.update_entry( + datacatalog.UpdateEntryRequest(), + entry=datacatalog.Entry(name="name_value"), + update_mask=field_mask.FieldMask(paths=["paths_value"]), + ) + + +@pytest.mark.asyncio +async def test_update_entry_flattened_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.update_entry), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(datacatalog.Entry()) + # Call the method with a truthy value for each flattened field, + # using the keyword arguments to the method. + response = await client.update_entry( + entry=datacatalog.Entry(name="name_value"), + update_mask=field_mask.FieldMask(paths=["paths_value"]), + ) + + # Establish that the underlying call was made with the expected + # request object values. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0].entry == datacatalog.Entry(name="name_value") + + assert args[0].update_mask == field_mask.FieldMask(paths=["paths_value"]) + + +@pytest.mark.asyncio +async def test_update_entry_flattened_error_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Attempting to call a method with both a request object and flattened + # fields is an error. 
+ with pytest.raises(ValueError): + await client.update_entry( + datacatalog.UpdateEntryRequest(), + entry=datacatalog.Entry(name="name_value"), + update_mask=field_mask.FieldMask(paths=["paths_value"]), + ) + + +def test_delete_entry( + transport: str = "grpc", request_type=datacatalog.DeleteEntryRequest +): + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = request_type() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.delete_entry), "__call__") as call: + # Designate an appropriate return value for the call. + call.return_value = None + + response = client.delete_entry(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.DeleteEntryRequest() + + # Establish that the response is the type that we expect. + assert response is None + + +def test_delete_entry_from_dict(): + test_delete_entry(request_type=dict) + + +@pytest.mark.asyncio +async def test_delete_entry_async(transport: str = "grpc_asyncio"): + client = DataCatalogAsyncClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = datacatalog.DeleteEntryRequest() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.delete_entry), "__call__" + ) as call: + # Designate an appropriate return value for the call. 
+ call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(None) + + response = await client.delete_entry(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0] == request + + # Establish that the response is the type that we expect. + assert response is None + + +def test_delete_entry_field_headers(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = datacatalog.DeleteEntryRequest() + request.name = "name/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.delete_entry), "__call__") as call: + call.return_value = None + + client.delete_entry(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "name=name/value",) in kw["metadata"] + + +@pytest.mark.asyncio +async def test_delete_entry_field_headers_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = datacatalog.DeleteEntryRequest() + request.name = "name/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.delete_entry), "__call__" + ) as call: + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(None) + + await client.delete_entry(request) + + # Establish that the underlying gRPC stub method was called. 
+ assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "name=name/value",) in kw["metadata"] + + +def test_delete_entry_flattened(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.delete_entry), "__call__") as call: + # Designate an appropriate return value for the call. + call.return_value = None + + # Call the method with a truthy value for each flattened field, + # using the keyword arguments to the method. + client.delete_entry(name="name_value",) + + # Establish that the underlying call was made with the expected + # request object values. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0].name == "name_value" + + +def test_delete_entry_flattened_error(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Attempting to call a method with both a request object and flattened + # fields is an error. + with pytest.raises(ValueError): + client.delete_entry( + datacatalog.DeleteEntryRequest(), name="name_value", + ) + + +@pytest.mark.asyncio +async def test_delete_entry_flattened_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.delete_entry), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = None + + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(None) + # Call the method with a truthy value for each flattened field, + # using the keyword arguments to the method. 
+ response = await client.delete_entry(name="name_value",) + + # Establish that the underlying call was made with the expected + # request object values. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0].name == "name_value" + + +@pytest.mark.asyncio +async def test_delete_entry_flattened_error_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Attempting to call a method with both a request object and flattened + # fields is an error. + with pytest.raises(ValueError): + await client.delete_entry( + datacatalog.DeleteEntryRequest(), name="name_value", + ) + + +def test_get_entry(transport: str = "grpc", request_type=datacatalog.GetEntryRequest): + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = request_type() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.get_entry), "__call__") as call: + # Designate an appropriate return value for the call. + call.return_value = datacatalog.Entry( + name="name_value", + linked_resource="linked_resource_value", + display_name="display_name_value", + description="description_value", + type=datacatalog.EntryType.TABLE, + integrated_system=common.IntegratedSystem.BIGQUERY, + gcs_fileset_spec=gcs_fileset_spec.GcsFilesetSpec( + file_patterns=["file_patterns_value"] + ), + ) + + response = client.get_entry(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.GetEntryRequest() + + # Establish that the response is the type that we expect. 
+ assert isinstance(response, datacatalog.Entry) + + assert response.name == "name_value" + + assert response.linked_resource == "linked_resource_value" + + assert response.display_name == "display_name_value" + + assert response.description == "description_value" + + +def test_get_entry_from_dict(): + test_get_entry(request_type=dict) + + +@pytest.mark.asyncio +async def test_get_entry_async(transport: str = "grpc_asyncio"): + client = DataCatalogAsyncClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = datacatalog.GetEntryRequest() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.get_entry), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + datacatalog.Entry( + name="name_value", + linked_resource="linked_resource_value", + display_name="display_name_value", + description="description_value", + ) + ) + + response = await client.get_entry(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0] == request + + # Establish that the response is the type that we expect. + assert isinstance(response, datacatalog.Entry) + + assert response.name == "name_value" + + assert response.linked_resource == "linked_resource_value" + + assert response.display_name == "display_name_value" + + assert response.description == "description_value" + + +def test_get_entry_field_headers(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. 
+ request = datacatalog.GetEntryRequest() + request.name = "name/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.get_entry), "__call__") as call: + call.return_value = datacatalog.Entry() + + client.get_entry(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "name=name/value",) in kw["metadata"] + + +@pytest.mark.asyncio +async def test_get_entry_field_headers_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = datacatalog.GetEntryRequest() + request.name = "name/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.get_entry), "__call__" + ) as call: + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(datacatalog.Entry()) + + await client.get_entry(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "name=name/value",) in kw["metadata"] + + +def test_get_entry_flattened(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.get_entry), "__call__") as call: + # Designate an appropriate return value for the call. 
+ call.return_value = datacatalog.Entry() + + # Call the method with a truthy value for each flattened field, + # using the keyword arguments to the method. + client.get_entry(name="name_value",) + + # Establish that the underlying call was made with the expected + # request object values. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0].name == "name_value" + + +def test_get_entry_flattened_error(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Attempting to call a method with both a request object and flattened + # fields is an error. + with pytest.raises(ValueError): + client.get_entry( + datacatalog.GetEntryRequest(), name="name_value", + ) + + +@pytest.mark.asyncio +async def test_get_entry_flattened_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.get_entry), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = datacatalog.Entry() + + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(datacatalog.Entry()) + # Call the method with a truthy value for each flattened field, + # using the keyword arguments to the method. + response = await client.get_entry(name="name_value",) + + # Establish that the underlying call was made with the expected + # request object values. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0].name == "name_value" + + +@pytest.mark.asyncio +async def test_get_entry_flattened_error_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Attempting to call a method with both a request object and flattened + # fields is an error. 
+ with pytest.raises(ValueError): + await client.get_entry( + datacatalog.GetEntryRequest(), name="name_value", + ) + + +def test_lookup_entry( + transport: str = "grpc", request_type=datacatalog.LookupEntryRequest +): + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = request_type() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.lookup_entry), "__call__") as call: + # Designate an appropriate return value for the call. + call.return_value = datacatalog.Entry( + name="name_value", + linked_resource="linked_resource_value", + display_name="display_name_value", + description="description_value", + type=datacatalog.EntryType.TABLE, + integrated_system=common.IntegratedSystem.BIGQUERY, + gcs_fileset_spec=gcs_fileset_spec.GcsFilesetSpec( + file_patterns=["file_patterns_value"] + ), + ) + + response = client.lookup_entry(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.LookupEntryRequest() + + # Establish that the response is the type that we expect. 
+ assert isinstance(response, datacatalog.Entry) + + assert response.name == "name_value" + + assert response.linked_resource == "linked_resource_value" + + assert response.display_name == "display_name_value" + + assert response.description == "description_value" + + +def test_lookup_entry_from_dict(): + test_lookup_entry(request_type=dict) + + +@pytest.mark.asyncio +async def test_lookup_entry_async(transport: str = "grpc_asyncio"): + client = DataCatalogAsyncClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = datacatalog.LookupEntryRequest() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.lookup_entry), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + datacatalog.Entry( + name="name_value", + linked_resource="linked_resource_value", + display_name="display_name_value", + description="description_value", + ) + ) + + response = await client.lookup_entry(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0] == request + + # Establish that the response is the type that we expect. 
+ assert isinstance(response, datacatalog.Entry) + + assert response.name == "name_value" + + assert response.linked_resource == "linked_resource_value" + + assert response.display_name == "display_name_value" + + assert response.description == "description_value" + + +def test_list_entries( + transport: str = "grpc", request_type=datacatalog.ListEntriesRequest +): + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = request_type() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.list_entries), "__call__") as call: + # Designate an appropriate return value for the call. + call.return_value = datacatalog.ListEntriesResponse( + next_page_token="next_page_token_value", + ) + + response = client.list_entries(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.ListEntriesRequest() + + # Establish that the response is the type that we expect. + assert isinstance(response, pagers.ListEntriesPager) + + assert response.next_page_token == "next_page_token_value" + + +def test_list_entries_from_dict(): + test_list_entries(request_type=dict) + + +@pytest.mark.asyncio +async def test_list_entries_async(transport: str = "grpc_asyncio"): + client = DataCatalogAsyncClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = datacatalog.ListEntriesRequest() + + # Mock the actual call within the gRPC stub, and fake the request. 
+ with mock.patch.object( + type(client._client._transport.list_entries), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + datacatalog.ListEntriesResponse(next_page_token="next_page_token_value",) + ) + + response = await client.list_entries(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0] == request + + # Establish that the response is the type that we expect. + assert isinstance(response, pagers.ListEntriesAsyncPager) + + assert response.next_page_token == "next_page_token_value" + + +def test_list_entries_field_headers(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = datacatalog.ListEntriesRequest() + request.parent = "parent/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.list_entries), "__call__") as call: + call.return_value = datacatalog.ListEntriesResponse() + + client.list_entries(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "parent=parent/value",) in kw["metadata"] + + +@pytest.mark.asyncio +async def test_list_entries_field_headers_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. 
+ request = datacatalog.ListEntriesRequest() + request.parent = "parent/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.list_entries), "__call__" + ) as call: + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + datacatalog.ListEntriesResponse() + ) + + await client.list_entries(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "parent=parent/value",) in kw["metadata"] + + +def test_list_entries_flattened(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.list_entries), "__call__") as call: + # Designate an appropriate return value for the call. + call.return_value = datacatalog.ListEntriesResponse() + + # Call the method with a truthy value for each flattened field, + # using the keyword arguments to the method. + client.list_entries(parent="parent_value",) + + # Establish that the underlying call was made with the expected + # request object values. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0].parent == "parent_value" + + +def test_list_entries_flattened_error(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Attempting to call a method with both a request object and flattened + # fields is an error. 
+ with pytest.raises(ValueError): + client.list_entries( + datacatalog.ListEntriesRequest(), parent="parent_value", + ) + + +@pytest.mark.asyncio +async def test_list_entries_flattened_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.list_entries), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = datacatalog.ListEntriesResponse() + + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + datacatalog.ListEntriesResponse() + ) + # Call the method with a truthy value for each flattened field, + # using the keyword arguments to the method. + response = await client.list_entries(parent="parent_value",) + + # Establish that the underlying call was made with the expected + # request object values. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0].parent == "parent_value" + + +@pytest.mark.asyncio +async def test_list_entries_flattened_error_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Attempting to call a method with both a request object and flattened + # fields is an error. + with pytest.raises(ValueError): + await client.list_entries( + datacatalog.ListEntriesRequest(), parent="parent_value", + ) + + +def test_list_entries_pager(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.list_entries), "__call__") as call: + # Set the response to a series of pages. 
+ call.side_effect = ( + datacatalog.ListEntriesResponse( + entries=[ + datacatalog.Entry(), + datacatalog.Entry(), + datacatalog.Entry(), + ], + next_page_token="abc", + ), + datacatalog.ListEntriesResponse(entries=[], next_page_token="def",), + datacatalog.ListEntriesResponse( + entries=[datacatalog.Entry(),], next_page_token="ghi", + ), + datacatalog.ListEntriesResponse( + entries=[datacatalog.Entry(), datacatalog.Entry(),], + ), + RuntimeError, + ) + + metadata = () + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("parent", ""),)), + ) + pager = client.list_entries(request={}) + + assert pager._metadata == metadata + + results = [i for i in pager] + assert len(results) == 6 + assert all(isinstance(i, datacatalog.Entry) for i in results) + + +def test_list_entries_pages(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.list_entries), "__call__") as call: + # Set the response to a series of pages. + call.side_effect = ( + datacatalog.ListEntriesResponse( + entries=[ + datacatalog.Entry(), + datacatalog.Entry(), + datacatalog.Entry(), + ], + next_page_token="abc", + ), + datacatalog.ListEntriesResponse(entries=[], next_page_token="def",), + datacatalog.ListEntriesResponse( + entries=[datacatalog.Entry(),], next_page_token="ghi", + ), + datacatalog.ListEntriesResponse( + entries=[datacatalog.Entry(), datacatalog.Entry(),], + ), + RuntimeError, + ) + pages = list(client.list_entries(request={}).pages) + for page, token in zip(pages, ["abc", "def", "ghi", ""]): + assert page.raw_page.next_page_token == token + + +@pytest.mark.asyncio +async def test_list_entries_async_pager(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. 
+ with mock.patch.object( + type(client._client._transport.list_entries), + "__call__", + new_callable=mock.AsyncMock, + ) as call: + # Set the response to a series of pages. + call.side_effect = ( + datacatalog.ListEntriesResponse( + entries=[ + datacatalog.Entry(), + datacatalog.Entry(), + datacatalog.Entry(), + ], + next_page_token="abc", + ), + datacatalog.ListEntriesResponse(entries=[], next_page_token="def",), + datacatalog.ListEntriesResponse( + entries=[datacatalog.Entry(),], next_page_token="ghi", + ), + datacatalog.ListEntriesResponse( + entries=[datacatalog.Entry(), datacatalog.Entry(),], + ), + RuntimeError, + ) + async_pager = await client.list_entries(request={},) + assert async_pager.next_page_token == "abc" + responses = [] + async for response in async_pager: + responses.append(response) + + assert len(responses) == 6 + assert all(isinstance(i, datacatalog.Entry) for i in responses) + + +@pytest.mark.asyncio +async def test_list_entries_async_pages(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.list_entries), + "__call__", + new_callable=mock.AsyncMock, + ) as call: + # Set the response to a series of pages. 
+ call.side_effect = ( + datacatalog.ListEntriesResponse( + entries=[ + datacatalog.Entry(), + datacatalog.Entry(), + datacatalog.Entry(), + ], + next_page_token="abc", + ), + datacatalog.ListEntriesResponse(entries=[], next_page_token="def",), + datacatalog.ListEntriesResponse( + entries=[datacatalog.Entry(),], next_page_token="ghi", + ), + datacatalog.ListEntriesResponse( + entries=[datacatalog.Entry(), datacatalog.Entry(),], + ), + RuntimeError, + ) + pages = [] + async for page in (await client.list_entries(request={})).pages: + pages.append(page) + for page, token in zip(pages, ["abc", "def", "ghi", ""]): + assert page.raw_page.next_page_token == token + + +def test_create_tag_template( + transport: str = "grpc", request_type=datacatalog.CreateTagTemplateRequest +): + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = request_type() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._transport.create_tag_template), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = tags.TagTemplate( + name="name_value", display_name="display_name_value", + ) + + response = client.create_tag_template(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.CreateTagTemplateRequest() + + # Establish that the response is the type that we expect. 
+ assert isinstance(response, tags.TagTemplate) + + assert response.name == "name_value" + + assert response.display_name == "display_name_value" + + +def test_create_tag_template_from_dict(): + test_create_tag_template(request_type=dict) + + +@pytest.mark.asyncio +async def test_create_tag_template_async(transport: str = "grpc_asyncio"): + client = DataCatalogAsyncClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = datacatalog.CreateTagTemplateRequest() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.create_tag_template), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + tags.TagTemplate(name="name_value", display_name="display_name_value",) + ) + + response = await client.create_tag_template(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0] == request + + # Establish that the response is the type that we expect. + assert isinstance(response, tags.TagTemplate) + + assert response.name == "name_value" + + assert response.display_name == "display_name_value" + + +def test_create_tag_template_field_headers(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = datacatalog.CreateTagTemplateRequest() + request.parent = "parent/value" + + # Mock the actual call within the gRPC stub, and fake the request. 
+ with mock.patch.object( + type(client._transport.create_tag_template), "__call__" + ) as call: + call.return_value = tags.TagTemplate() + + client.create_tag_template(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "parent=parent/value",) in kw["metadata"] + + +@pytest.mark.asyncio +async def test_create_tag_template_field_headers_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = datacatalog.CreateTagTemplateRequest() + request.parent = "parent/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.create_tag_template), "__call__" + ) as call: + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(tags.TagTemplate()) + + await client.create_tag_template(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "parent=parent/value",) in kw["metadata"] + + +def test_create_tag_template_flattened(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._transport.create_tag_template), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = tags.TagTemplate() + + # Call the method with a truthy value for each flattened field, + # using the keyword arguments to the method. 
+ client.create_tag_template( + parent="parent_value", + tag_template_id="tag_template_id_value", + tag_template=tags.TagTemplate(name="name_value"), + ) + + # Establish that the underlying call was made with the expected + # request object values. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0].parent == "parent_value" + + assert args[0].tag_template_id == "tag_template_id_value" + + assert args[0].tag_template == tags.TagTemplate(name="name_value") + + +def test_create_tag_template_flattened_error(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Attempting to call a method with both a request object and flattened + # fields is an error. + with pytest.raises(ValueError): + client.create_tag_template( + datacatalog.CreateTagTemplateRequest(), + parent="parent_value", + tag_template_id="tag_template_id_value", + tag_template=tags.TagTemplate(name="name_value"), + ) + + +@pytest.mark.asyncio +async def test_create_tag_template_flattened_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.create_tag_template), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(tags.TagTemplate()) + # Call the method with a truthy value for each flattened field, + # using the keyword arguments to the method. + response = await client.create_tag_template( + parent="parent_value", + tag_template_id="tag_template_id_value", + tag_template=tags.TagTemplate(name="name_value"), + ) + + # Establish that the underlying call was made with the expected + # request object values. 
+ assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0].parent == "parent_value" + + assert args[0].tag_template_id == "tag_template_id_value" + + assert args[0].tag_template == tags.TagTemplate(name="name_value") + + +@pytest.mark.asyncio +async def test_create_tag_template_flattened_error_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Attempting to call a method with both a request object and flattened + # fields is an error. + with pytest.raises(ValueError): + await client.create_tag_template( + datacatalog.CreateTagTemplateRequest(), + parent="parent_value", + tag_template_id="tag_template_id_value", + tag_template=tags.TagTemplate(name="name_value"), + ) + + +def test_get_tag_template( + transport: str = "grpc", request_type=datacatalog.GetTagTemplateRequest +): + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = request_type() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._transport.get_tag_template), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = tags.TagTemplate( + name="name_value", display_name="display_name_value", + ) + + response = client.get_tag_template(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.GetTagTemplateRequest() + + # Establish that the response is the type that we expect. 
+ assert isinstance(response, tags.TagTemplate) + + assert response.name == "name_value" + + assert response.display_name == "display_name_value" + + +def test_get_tag_template_from_dict(): + test_get_tag_template(request_type=dict) + + +@pytest.mark.asyncio +async def test_get_tag_template_async(transport: str = "grpc_asyncio"): + client = DataCatalogAsyncClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = datacatalog.GetTagTemplateRequest() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.get_tag_template), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + tags.TagTemplate(name="name_value", display_name="display_name_value",) + ) + + response = await client.get_tag_template(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0] == request + + # Establish that the response is the type that we expect. + assert isinstance(response, tags.TagTemplate) + + assert response.name == "name_value" + + assert response.display_name == "display_name_value" + + +def test_get_tag_template_field_headers(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = datacatalog.GetTagTemplateRequest() + request.name = "name/value" + + # Mock the actual call within the gRPC stub, and fake the request. 
+ with mock.patch.object( + type(client._transport.get_tag_template), "__call__" + ) as call: + call.return_value = tags.TagTemplate() + + client.get_tag_template(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "name=name/value",) in kw["metadata"] + + +@pytest.mark.asyncio +async def test_get_tag_template_field_headers_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = datacatalog.GetTagTemplateRequest() + request.name = "name/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.get_tag_template), "__call__" + ) as call: + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(tags.TagTemplate()) + + await client.get_tag_template(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "name=name/value",) in kw["metadata"] + + +def test_get_tag_template_flattened(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._transport.get_tag_template), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = tags.TagTemplate() + + # Call the method with a truthy value for each flattened field, + # using the keyword arguments to the method. 
+ client.get_tag_template(name="name_value",) + + # Establish that the underlying call was made with the expected + # request object values. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0].name == "name_value" + + +def test_get_tag_template_flattened_error(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Attempting to call a method with both a request object and flattened + # fields is an error. + with pytest.raises(ValueError): + client.get_tag_template( + datacatalog.GetTagTemplateRequest(), name="name_value", + ) + + +@pytest.mark.asyncio +async def test_get_tag_template_flattened_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.get_tag_template), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(tags.TagTemplate()) + # Call the method with a truthy value for each flattened field, + # using the keyword arguments to the method. + response = await client.get_tag_template(name="name_value",) + + # Establish that the underlying call was made with the expected + # request object values. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0].name == "name_value" + + +@pytest.mark.asyncio +async def test_get_tag_template_flattened_error_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Attempting to call a method with both a request object and flattened + # fields is an error. 
+ with pytest.raises(ValueError): + await client.get_tag_template( + datacatalog.GetTagTemplateRequest(), name="name_value", + ) + + +def test_update_tag_template( + transport: str = "grpc", request_type=datacatalog.UpdateTagTemplateRequest +): + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = request_type() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._transport.update_tag_template), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = tags.TagTemplate( + name="name_value", display_name="display_name_value", + ) + + response = client.update_tag_template(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.UpdateTagTemplateRequest() + + # Establish that the response is the type that we expect. + assert isinstance(response, tags.TagTemplate) + + assert response.name == "name_value" + + assert response.display_name == "display_name_value" + + +def test_update_tag_template_from_dict(): + test_update_tag_template(request_type=dict) + + +@pytest.mark.asyncio +async def test_update_tag_template_async(transport: str = "grpc_asyncio"): + client = DataCatalogAsyncClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = datacatalog.UpdateTagTemplateRequest() + + # Mock the actual call within the gRPC stub, and fake the request. 
+ with mock.patch.object( + type(client._client._transport.update_tag_template), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + tags.TagTemplate(name="name_value", display_name="display_name_value",) + ) + + response = await client.update_tag_template(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0] == request + + # Establish that the response is the type that we expect. + assert isinstance(response, tags.TagTemplate) + + assert response.name == "name_value" + + assert response.display_name == "display_name_value" + + +def test_update_tag_template_field_headers(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = datacatalog.UpdateTagTemplateRequest() + request.tag_template.name = "tag_template.name/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._transport.update_tag_template), "__call__" + ) as call: + call.return_value = tags.TagTemplate() + + client.update_tag_template(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ( + "x-goog-request-params", + "tag_template.name=tag_template.name/value", + ) in kw["metadata"] + + +@pytest.mark.asyncio +async def test_update_tag_template_field_headers_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. 
+ request = datacatalog.UpdateTagTemplateRequest() + request.tag_template.name = "tag_template.name/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.update_tag_template), "__call__" + ) as call: + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(tags.TagTemplate()) + + await client.update_tag_template(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ( + "x-goog-request-params", + "tag_template.name=tag_template.name/value", + ) in kw["metadata"] + + +def test_update_tag_template_flattened(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._transport.update_tag_template), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = tags.TagTemplate() + + # Call the method with a truthy value for each flattened field, + # using the keyword arguments to the method. + client.update_tag_template( + tag_template=tags.TagTemplate(name="name_value"), + update_mask=field_mask.FieldMask(paths=["paths_value"]), + ) + + # Establish that the underlying call was made with the expected + # request object values. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0].tag_template == tags.TagTemplate(name="name_value") + + assert args[0].update_mask == field_mask.FieldMask(paths=["paths_value"]) + + +def test_update_tag_template_flattened_error(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Attempting to call a method with both a request object and flattened + # fields is an error. 
+ with pytest.raises(ValueError): + client.update_tag_template( + datacatalog.UpdateTagTemplateRequest(), + tag_template=tags.TagTemplate(name="name_value"), + update_mask=field_mask.FieldMask(paths=["paths_value"]), + ) + + +@pytest.mark.asyncio +async def test_update_tag_template_flattened_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.update_tag_template), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(tags.TagTemplate()) + # Call the method with a truthy value for each flattened field, + # using the keyword arguments to the method. + response = await client.update_tag_template( + tag_template=tags.TagTemplate(name="name_value"), + update_mask=field_mask.FieldMask(paths=["paths_value"]), + ) + + # Establish that the underlying call was made with the expected + # request object values. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0].tag_template == tags.TagTemplate(name="name_value") + + assert args[0].update_mask == field_mask.FieldMask(paths=["paths_value"]) + + +@pytest.mark.asyncio +async def test_update_tag_template_flattened_error_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Attempting to call a method with both a request object and flattened + # fields is an error. 
+ with pytest.raises(ValueError): + await client.update_tag_template( + datacatalog.UpdateTagTemplateRequest(), + tag_template=tags.TagTemplate(name="name_value"), + update_mask=field_mask.FieldMask(paths=["paths_value"]), + ) + + +def test_delete_tag_template( + transport: str = "grpc", request_type=datacatalog.DeleteTagTemplateRequest +): + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = request_type() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._transport.delete_tag_template), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = None + + response = client.delete_tag_template(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.DeleteTagTemplateRequest() + + # Establish that the response is the type that we expect. + assert response is None + + +def test_delete_tag_template_from_dict(): + test_delete_tag_template(request_type=dict) + + +@pytest.mark.asyncio +async def test_delete_tag_template_async(transport: str = "grpc_asyncio"): + client = DataCatalogAsyncClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = datacatalog.DeleteTagTemplateRequest() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.delete_tag_template), "__call__" + ) as call: + # Designate an appropriate return value for the call. 
+ call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(None) + + response = await client.delete_tag_template(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0] == request + + # Establish that the response is the type that we expect. + assert response is None + + +def test_delete_tag_template_field_headers(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = datacatalog.DeleteTagTemplateRequest() + request.name = "name/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._transport.delete_tag_template), "__call__" + ) as call: + call.return_value = None + + client.delete_tag_template(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "name=name/value",) in kw["metadata"] + + +@pytest.mark.asyncio +async def test_delete_tag_template_field_headers_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = datacatalog.DeleteTagTemplateRequest() + request.name = "name/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.delete_tag_template), "__call__" + ) as call: + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(None) + + await client.delete_tag_template(request) + + # Establish that the underlying gRPC stub method was called. 
+ assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "name=name/value",) in kw["metadata"] + + +def test_delete_tag_template_flattened(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._transport.delete_tag_template), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = None + + # Call the method with a truthy value for each flattened field, + # using the keyword arguments to the method. + client.delete_tag_template( + name="name_value", force=True, + ) + + # Establish that the underlying call was made with the expected + # request object values. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0].name == "name_value" + + assert args[0].force is True + + +def test_delete_tag_template_flattened_error(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Attempting to call a method with both a request object and flattened + # fields is an error. + with pytest.raises(ValueError): + client.delete_tag_template( + datacatalog.DeleteTagTemplateRequest(), name="name_value", force=True, + ) + + +@pytest.mark.asyncio +async def test_delete_tag_template_flattened_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.delete_tag_template), "__call__" + ) as call: + # Designate an appropriate return value for the call. 
+ call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(None) + # Call the method with a truthy value for each flattened field, + # using the keyword arguments to the method. + response = await client.delete_tag_template(name="name_value", force=True,) + + # Establish that the underlying call was made with the expected + # request object values. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0].name == "name_value" + + assert args[0].force is True + + +@pytest.mark.asyncio +async def test_delete_tag_template_flattened_error_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Attempting to call a method with both a request object and flattened + # fields is an error. + with pytest.raises(ValueError): + await client.delete_tag_template( + datacatalog.DeleteTagTemplateRequest(), name="name_value", force=True, + ) + + +def test_create_tag_template_field( + transport: str = "grpc", request_type=datacatalog.CreateTagTemplateFieldRequest +): + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = request_type() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._transport.create_tag_template_field), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = tags.TagTemplateField( + name="name_value", + display_name="display_name_value", + is_required=True, + order=540, + ) + + response = client.create_tag_template_field(request) + + # Establish that the underlying gRPC stub method was called. 
+ assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.CreateTagTemplateFieldRequest() + + # Establish that the response is the type that we expect. + assert isinstance(response, tags.TagTemplateField) + + assert response.name == "name_value" + + assert response.display_name == "display_name_value" + + assert response.is_required is True + + assert response.order == 540 + + +def test_create_tag_template_field_from_dict(): + test_create_tag_template_field(request_type=dict) + + +@pytest.mark.asyncio +async def test_create_tag_template_field_async(transport: str = "grpc_asyncio"): + client = DataCatalogAsyncClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = datacatalog.CreateTagTemplateFieldRequest() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.create_tag_template_field), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + tags.TagTemplateField( + name="name_value", + display_name="display_name_value", + is_required=True, + order=540, + ) + ) + + response = await client.create_tag_template_field(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0] == request + + # Establish that the response is the type that we expect. 
+ assert isinstance(response, tags.TagTemplateField) + + assert response.name == "name_value" + + assert response.display_name == "display_name_value" + + assert response.is_required is True + + assert response.order == 540 + + +def test_create_tag_template_field_field_headers(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = datacatalog.CreateTagTemplateFieldRequest() + request.parent = "parent/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._transport.create_tag_template_field), "__call__" + ) as call: + call.return_value = tags.TagTemplateField() + + client.create_tag_template_field(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "parent=parent/value",) in kw["metadata"] + + +@pytest.mark.asyncio +async def test_create_tag_template_field_field_headers_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = datacatalog.CreateTagTemplateFieldRequest() + request.parent = "parent/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.create_tag_template_field), "__call__" + ) as call: + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + tags.TagTemplateField() + ) + + await client.create_tag_template_field(request) + + # Establish that the underlying gRPC stub method was called. 
+ assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "parent=parent/value",) in kw["metadata"] + + +def test_create_tag_template_field_flattened(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._transport.create_tag_template_field), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = tags.TagTemplateField() + + # Call the method with a truthy value for each flattened field, + # using the keyword arguments to the method. + client.create_tag_template_field( + parent="parent_value", + tag_template_field_id="tag_template_field_id_value", + tag_template_field=tags.TagTemplateField(name="name_value"), + ) + + # Establish that the underlying call was made with the expected + # request object values. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0].parent == "parent_value" + + assert args[0].tag_template_field_id == "tag_template_field_id_value" + + assert args[0].tag_template_field == tags.TagTemplateField(name="name_value") + + +def test_create_tag_template_field_flattened_error(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Attempting to call a method with both a request object and flattened + # fields is an error. 
+ with pytest.raises(ValueError): + client.create_tag_template_field( + datacatalog.CreateTagTemplateFieldRequest(), + parent="parent_value", + tag_template_field_id="tag_template_field_id_value", + tag_template_field=tags.TagTemplateField(name="name_value"), + ) + + +@pytest.mark.asyncio +async def test_create_tag_template_field_flattened_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.create_tag_template_field), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + tags.TagTemplateField() + ) + # Call the method with a truthy value for each flattened field, + # using the keyword arguments to the method. + response = await client.create_tag_template_field( + parent="parent_value", + tag_template_field_id="tag_template_field_id_value", + tag_template_field=tags.TagTemplateField(name="name_value"), + ) + + # Establish that the underlying call was made with the expected + # request object values. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0].parent == "parent_value" + + assert args[0].tag_template_field_id == "tag_template_field_id_value" + + assert args[0].tag_template_field == tags.TagTemplateField(name="name_value") + + +@pytest.mark.asyncio +async def test_create_tag_template_field_flattened_error_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Attempting to call a method with both a request object and flattened + # fields is an error. 
+ with pytest.raises(ValueError): + await client.create_tag_template_field( + datacatalog.CreateTagTemplateFieldRequest(), + parent="parent_value", + tag_template_field_id="tag_template_field_id_value", + tag_template_field=tags.TagTemplateField(name="name_value"), + ) + + +def test_update_tag_template_field( + transport: str = "grpc", request_type=datacatalog.UpdateTagTemplateFieldRequest +): + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = request_type() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._transport.update_tag_template_field), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = tags.TagTemplateField( + name="name_value", + display_name="display_name_value", + is_required=True, + order=540, + ) + + response = client.update_tag_template_field(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.UpdateTagTemplateFieldRequest() + + # Establish that the response is the type that we expect. 
+ assert isinstance(response, tags.TagTemplateField) + + assert response.name == "name_value" + + assert response.display_name == "display_name_value" + + assert response.is_required is True + + assert response.order == 540 + + +def test_update_tag_template_field_from_dict(): + test_update_tag_template_field(request_type=dict) + + +@pytest.mark.asyncio +async def test_update_tag_template_field_async(transport: str = "grpc_asyncio"): + client = DataCatalogAsyncClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = datacatalog.UpdateTagTemplateFieldRequest() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.update_tag_template_field), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + tags.TagTemplateField( + name="name_value", + display_name="display_name_value", + is_required=True, + order=540, + ) + ) + + response = await client.update_tag_template_field(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0] == request + + # Establish that the response is the type that we expect. + assert isinstance(response, tags.TagTemplateField) + + assert response.name == "name_value" + + assert response.display_name == "display_name_value" + + assert response.is_required is True + + assert response.order == 540 + + +def test_update_tag_template_field_field_headers(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. 
+ request = datacatalog.UpdateTagTemplateFieldRequest() + request.name = "name/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._transport.update_tag_template_field), "__call__" + ) as call: + call.return_value = tags.TagTemplateField() + + client.update_tag_template_field(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "name=name/value",) in kw["metadata"] + + +@pytest.mark.asyncio +async def test_update_tag_template_field_field_headers_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = datacatalog.UpdateTagTemplateFieldRequest() + request.name = "name/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.update_tag_template_field), "__call__" + ) as call: + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + tags.TagTemplateField() + ) + + await client.update_tag_template_field(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "name=name/value",) in kw["metadata"] + + +def test_update_tag_template_field_flattened(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. 
+ with mock.patch.object( + type(client._transport.update_tag_template_field), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = tags.TagTemplateField() + + # Call the method with a truthy value for each flattened field, + # using the keyword arguments to the method. + client.update_tag_template_field( + name="name_value", + tag_template_field=tags.TagTemplateField(name="name_value"), + update_mask=field_mask.FieldMask(paths=["paths_value"]), + ) + + # Establish that the underlying call was made with the expected + # request object values. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0].name == "name_value" + + assert args[0].tag_template_field == tags.TagTemplateField(name="name_value") + + assert args[0].update_mask == field_mask.FieldMask(paths=["paths_value"]) + + +def test_update_tag_template_field_flattened_error(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Attempting to call a method with both a request object and flattened + # fields is an error. + with pytest.raises(ValueError): + client.update_tag_template_field( + datacatalog.UpdateTagTemplateFieldRequest(), + name="name_value", + tag_template_field=tags.TagTemplateField(name="name_value"), + update_mask=field_mask.FieldMask(paths=["paths_value"]), + ) + + +@pytest.mark.asyncio +async def test_update_tag_template_field_flattened_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.update_tag_template_field), "__call__" + ) as call: + # Designate an appropriate return value for the call. 
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(
+            tags.TagTemplateField()
+        )
+        # Call the method with a truthy value for each flattened field,
+        # using the keyword arguments to the method.
+        response = await client.update_tag_template_field(
+            name="name_value",
+            tag_template_field=tags.TagTemplateField(name="name_value"),
+            update_mask=field_mask.FieldMask(paths=["paths_value"]),
+        )
+
+        # Establish that the underlying call was made with the expected
+        # request object values.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0].name == "name_value"
+
+        assert args[0].tag_template_field == tags.TagTemplateField(name="name_value")
+
+        assert args[0].update_mask == field_mask.FieldMask(paths=["paths_value"])
+
+
+@pytest.mark.asyncio
+async def test_update_tag_template_field_flattened_error_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Attempting to call a method with both a request object and flattened
+    # fields is an error.
+    with pytest.raises(ValueError):
+        await client.update_tag_template_field(
+            datacatalog.UpdateTagTemplateFieldRequest(),
+            name="name_value",
+            tag_template_field=tags.TagTemplateField(name="name_value"),
+            update_mask=field_mask.FieldMask(paths=["paths_value"]),
+        )
+
+
+def test_rename_tag_template_field(
+    transport: str = "grpc", request_type=datacatalog.RenameTagTemplateFieldRequest
+):
+    client = DataCatalogClient(
+        credentials=credentials.AnonymousCredentials(), transport=transport,
+    )
+
+    # Everything is optional in proto3 as far as the runtime is concerned,
+    # and we are mocking out the actual API, so just send an empty request.
+    request = request_type()
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._transport.rename_tag_template_field), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+ call.return_value = tags.TagTemplateField( + name="name_value", + display_name="display_name_value", + is_required=True, + order=540, + ) + + response = client.rename_tag_template_field(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.RenameTagTemplateFieldRequest() + + # Establish that the response is the type that we expect. + assert isinstance(response, tags.TagTemplateField) + + assert response.name == "name_value" + + assert response.display_name == "display_name_value" + + assert response.is_required is True + + assert response.order == 540 + + +def test_rename_tag_template_field_from_dict(): + test_rename_tag_template_field(request_type=dict) + + +@pytest.mark.asyncio +async def test_rename_tag_template_field_async(transport: str = "grpc_asyncio"): + client = DataCatalogAsyncClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = datacatalog.RenameTagTemplateFieldRequest() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.rename_tag_template_field), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + tags.TagTemplateField( + name="name_value", + display_name="display_name_value", + is_required=True, + order=540, + ) + ) + + response = await client.rename_tag_template_field(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0] == request + + # Establish that the response is the type that we expect. 
+ assert isinstance(response, tags.TagTemplateField) + + assert response.name == "name_value" + + assert response.display_name == "display_name_value" + + assert response.is_required is True + + assert response.order == 540 + + +def test_rename_tag_template_field_field_headers(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = datacatalog.RenameTagTemplateFieldRequest() + request.name = "name/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._transport.rename_tag_template_field), "__call__" + ) as call: + call.return_value = tags.TagTemplateField() + + client.rename_tag_template_field(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "name=name/value",) in kw["metadata"] + + +@pytest.mark.asyncio +async def test_rename_tag_template_field_field_headers_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = datacatalog.RenameTagTemplateFieldRequest() + request.name = "name/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.rename_tag_template_field), "__call__" + ) as call: + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + tags.TagTemplateField() + ) + + await client.rename_tag_template_field(request) + + # Establish that the underlying gRPC stub method was called. 
+ assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "name=name/value",) in kw["metadata"] + + +def test_rename_tag_template_field_flattened(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._transport.rename_tag_template_field), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = tags.TagTemplateField() + + # Call the method with a truthy value for each flattened field, + # using the keyword arguments to the method. + client.rename_tag_template_field( + name="name_value", + new_tag_template_field_id="new_tag_template_field_id_value", + ) + + # Establish that the underlying call was made with the expected + # request object values. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0].name == "name_value" + + assert args[0].new_tag_template_field_id == "new_tag_template_field_id_value" + + +def test_rename_tag_template_field_flattened_error(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Attempting to call a method with both a request object and flattened + # fields is an error. + with pytest.raises(ValueError): + client.rename_tag_template_field( + datacatalog.RenameTagTemplateFieldRequest(), + name="name_value", + new_tag_template_field_id="new_tag_template_field_id_value", + ) + + +@pytest.mark.asyncio +async def test_rename_tag_template_field_flattened_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. 
+    with mock.patch.object(
+        type(client._client._transport.rename_tag_template_field), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(
+            tags.TagTemplateField()
+        )
+        # Call the method with a truthy value for each flattened field,
+        # using the keyword arguments to the method.
+        response = await client.rename_tag_template_field(
+            name="name_value",
+            new_tag_template_field_id="new_tag_template_field_id_value",
+        )
+
+        # Establish that the underlying call was made with the expected
+        # request object values.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0].name == "name_value"
+
+        assert args[0].new_tag_template_field_id == "new_tag_template_field_id_value"
+
+
+@pytest.mark.asyncio
+async def test_rename_tag_template_field_flattened_error_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Attempting to call a method with both a request object and flattened
+    # fields is an error.
+    with pytest.raises(ValueError):
+        await client.rename_tag_template_field(
+            datacatalog.RenameTagTemplateFieldRequest(),
+            name="name_value",
+            new_tag_template_field_id="new_tag_template_field_id_value",
+        )
+
+
+def test_delete_tag_template_field(
+    transport: str = "grpc", request_type=datacatalog.DeleteTagTemplateFieldRequest
+):
+    client = DataCatalogClient(
+        credentials=credentials.AnonymousCredentials(), transport=transport,
+    )
+
+    # Everything is optional in proto3 as far as the runtime is concerned,
+    # and we are mocking out the actual API, so just send an empty request.
+    request = request_type()
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._transport.delete_tag_template_field), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+ call.return_value = None + + response = client.delete_tag_template_field(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.DeleteTagTemplateFieldRequest() + + # Establish that the response is the type that we expect. + assert response is None + + +def test_delete_tag_template_field_from_dict(): + test_delete_tag_template_field(request_type=dict) + + +@pytest.mark.asyncio +async def test_delete_tag_template_field_async(transport: str = "grpc_asyncio"): + client = DataCatalogAsyncClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = datacatalog.DeleteTagTemplateFieldRequest() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.delete_tag_template_field), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(None) + + response = await client.delete_tag_template_field(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0] == request + + # Establish that the response is the type that we expect. + assert response is None + + +def test_delete_tag_template_field_field_headers(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = datacatalog.DeleteTagTemplateFieldRequest() + request.name = "name/value" + + # Mock the actual call within the gRPC stub, and fake the request. 
+ with mock.patch.object( + type(client._transport.delete_tag_template_field), "__call__" + ) as call: + call.return_value = None + + client.delete_tag_template_field(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "name=name/value",) in kw["metadata"] + + +@pytest.mark.asyncio +async def test_delete_tag_template_field_field_headers_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = datacatalog.DeleteTagTemplateFieldRequest() + request.name = "name/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.delete_tag_template_field), "__call__" + ) as call: + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(None) + + await client.delete_tag_template_field(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "name=name/value",) in kw["metadata"] + + +def test_delete_tag_template_field_flattened(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._transport.delete_tag_template_field), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = None + + # Call the method with a truthy value for each flattened field, + # using the keyword arguments to the method. 
+        client.delete_tag_template_field(
+            name="name_value", force=True,
+        )
+
+        # Establish that the underlying call was made with the expected
+        # request object values.
+        assert len(call.mock_calls) == 1
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0].name == "name_value"
+
+        assert args[0].force is True
+
+
+def test_delete_tag_template_field_flattened_error():
+    client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Attempting to call a method with both a request object and flattened
+    # fields is an error.
+    with pytest.raises(ValueError):
+        client.delete_tag_template_field(
+            datacatalog.DeleteTagTemplateFieldRequest(), name="name_value", force=True,
+        )
+
+
+@pytest.mark.asyncio
+async def test_delete_tag_template_field_flattened_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.delete_tag_template_field), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(None)
+        # Call the method with a truthy value for each flattened field,
+        # using the keyword arguments to the method.
+        response = await client.delete_tag_template_field(
+            name="name_value", force=True,
+        )
+
+        # Establish that the underlying call was made with the expected
+        # request object values.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0].name == "name_value"
+
+        assert args[0].force is True
+
+
+@pytest.mark.asyncio
+async def test_delete_tag_template_field_flattened_error_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Attempting to call a method with both a request object and flattened
+    # fields is an error.
+ with pytest.raises(ValueError): + await client.delete_tag_template_field( + datacatalog.DeleteTagTemplateFieldRequest(), name="name_value", force=True, + ) + + +def test_create_tag(transport: str = "grpc", request_type=datacatalog.CreateTagRequest): + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = request_type() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.create_tag), "__call__") as call: + # Designate an appropriate return value for the call. + call.return_value = tags.Tag( + name="name_value", + template="template_value", + template_display_name="template_display_name_value", + column="column_value", + ) + + response = client.create_tag(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.CreateTagRequest() + + # Establish that the response is the type that we expect. + assert isinstance(response, tags.Tag) + + assert response.name == "name_value" + + assert response.template == "template_value" + + assert response.template_display_name == "template_display_name_value" + + +def test_create_tag_from_dict(): + test_create_tag(request_type=dict) + + +@pytest.mark.asyncio +async def test_create_tag_async(transport: str = "grpc_asyncio"): + client = DataCatalogAsyncClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = datacatalog.CreateTagRequest() + + # Mock the actual call within the gRPC stub, and fake the request. 
+ with mock.patch.object( + type(client._client._transport.create_tag), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + tags.Tag( + name="name_value", + template="template_value", + template_display_name="template_display_name_value", + ) + ) + + response = await client.create_tag(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0] == request + + # Establish that the response is the type that we expect. + assert isinstance(response, tags.Tag) + + assert response.name == "name_value" + + assert response.template == "template_value" + + assert response.template_display_name == "template_display_name_value" + + +def test_create_tag_field_headers(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = datacatalog.CreateTagRequest() + request.parent = "parent/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.create_tag), "__call__") as call: + call.return_value = tags.Tag() + + client.create_tag(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "parent=parent/value",) in kw["metadata"] + + +@pytest.mark.asyncio +async def test_create_tag_field_headers_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. 
+ request = datacatalog.CreateTagRequest() + request.parent = "parent/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.create_tag), "__call__" + ) as call: + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(tags.Tag()) + + await client.create_tag(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "parent=parent/value",) in kw["metadata"] + + +def test_create_tag_flattened(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.create_tag), "__call__") as call: + # Designate an appropriate return value for the call. + call.return_value = tags.Tag() + + # Call the method with a truthy value for each flattened field, + # using the keyword arguments to the method. + client.create_tag( + parent="parent_value", tag=tags.Tag(name="name_value"), + ) + + # Establish that the underlying call was made with the expected + # request object values. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0].parent == "parent_value" + + assert args[0].tag == tags.Tag(name="name_value") + + +def test_create_tag_flattened_error(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Attempting to call a method with both a request object and flattened + # fields is an error. 
+    with pytest.raises(ValueError):
+        client.create_tag(
+            datacatalog.CreateTagRequest(),
+            parent="parent_value",
+            tag=tags.Tag(name="name_value"),
+        )
+
+
+@pytest.mark.asyncio
+async def test_create_tag_flattened_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.create_tag), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(tags.Tag())
+        # Call the method with a truthy value for each flattened field,
+        # using the keyword arguments to the method.
+        response = await client.create_tag(
+            parent="parent_value", tag=tags.Tag(name="name_value"),
+        )
+
+        # Establish that the underlying call was made with the expected
+        # request object values.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0].parent == "parent_value"
+
+        assert args[0].tag == tags.Tag(name="name_value")
+
+
+@pytest.mark.asyncio
+async def test_create_tag_flattened_error_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Attempting to call a method with both a request object and flattened
+    # fields is an error.
+    with pytest.raises(ValueError):
+        await client.create_tag(
+            datacatalog.CreateTagRequest(),
+            parent="parent_value",
+            tag=tags.Tag(name="name_value"),
+        )
+
+
+def test_update_tag(transport: str = "grpc", request_type=datacatalog.UpdateTagRequest):
+    client = DataCatalogClient(
+        credentials=credentials.AnonymousCredentials(), transport=transport,
+    )
+
+    # Everything is optional in proto3 as far as the runtime is concerned,
+    # and we are mocking out the actual API, so just send an empty request.
+    request = request_type()
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(type(client._transport.update_tag), "__call__") as call: + # Designate an appropriate return value for the call. + call.return_value = tags.Tag( + name="name_value", + template="template_value", + template_display_name="template_display_name_value", + column="column_value", + ) + + response = client.update_tag(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.UpdateTagRequest() + + # Establish that the response is the type that we expect. + assert isinstance(response, tags.Tag) + + assert response.name == "name_value" + + assert response.template == "template_value" + + assert response.template_display_name == "template_display_name_value" + + +def test_update_tag_from_dict(): + test_update_tag(request_type=dict) + + +@pytest.mark.asyncio +async def test_update_tag_async(transport: str = "grpc_asyncio"): + client = DataCatalogAsyncClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = datacatalog.UpdateTagRequest() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.update_tag), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + tags.Tag( + name="name_value", + template="template_value", + template_display_name="template_display_name_value", + ) + ) + + response = await client.update_tag(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0] == request + + # Establish that the response is the type that we expect. 
+ assert isinstance(response, tags.Tag) + + assert response.name == "name_value" + + assert response.template == "template_value" + + assert response.template_display_name == "template_display_name_value" + + +def test_update_tag_field_headers(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = datacatalog.UpdateTagRequest() + request.tag.name = "tag.name/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.update_tag), "__call__") as call: + call.return_value = tags.Tag() + + client.update_tag(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "tag.name=tag.name/value",) in kw["metadata"] + + +@pytest.mark.asyncio +async def test_update_tag_field_headers_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = datacatalog.UpdateTagRequest() + request.tag.name = "tag.name/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.update_tag), "__call__" + ) as call: + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(tags.Tag()) + + await client.update_tag(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. 
+ _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "tag.name=tag.name/value",) in kw["metadata"] + + +def test_update_tag_flattened(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.update_tag), "__call__") as call: + # Designate an appropriate return value for the call. + call.return_value = tags.Tag() + + # Call the method with a truthy value for each flattened field, + # using the keyword arguments to the method. + client.update_tag( + tag=tags.Tag(name="name_value"), + update_mask=field_mask.FieldMask(paths=["paths_value"]), + ) + + # Establish that the underlying call was made with the expected + # request object values. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0].tag == tags.Tag(name="name_value") + + assert args[0].update_mask == field_mask.FieldMask(paths=["paths_value"]) + + +def test_update_tag_flattened_error(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Attempting to call a method with both a request object and flattened + # fields is an error. + with pytest.raises(ValueError): + client.update_tag( + datacatalog.UpdateTagRequest(), + tag=tags.Tag(name="name_value"), + update_mask=field_mask.FieldMask(paths=["paths_value"]), + ) + + +@pytest.mark.asyncio +async def test_update_tag_flattened_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.update_tag), "__call__" + ) as call: + # Designate an appropriate return value for the call. 
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(tags.Tag())
+        # Call the method with a truthy value for each flattened field,
+        # using the keyword arguments to the method.
+        response = await client.update_tag(
+            tag=tags.Tag(name="name_value"),
+            update_mask=field_mask.FieldMask(paths=["paths_value"]),
+        )
+
+        # Establish that the underlying call was made with the expected
+        # request object values.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0].tag == tags.Tag(name="name_value")
+
+        assert args[0].update_mask == field_mask.FieldMask(paths=["paths_value"])
+
+
+@pytest.mark.asyncio
+async def test_update_tag_flattened_error_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Attempting to call a method with both a request object and flattened
+    # fields is an error.
+    with pytest.raises(ValueError):
+        await client.update_tag(
+            datacatalog.UpdateTagRequest(),
+            tag=tags.Tag(name="name_value"),
+            update_mask=field_mask.FieldMask(paths=["paths_value"]),
+        )
+
+
+def test_delete_tag(transport: str = "grpc", request_type=datacatalog.DeleteTagRequest):
+    client = DataCatalogClient(
+        credentials=credentials.AnonymousCredentials(), transport=transport,
+    )
+
+    # Everything is optional in proto3 as far as the runtime is concerned,
+    # and we are mocking out the actual API, so just send an empty request.
+    request = request_type()
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(type(client._transport.delete_tag), "__call__") as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = None
+
+        response = client.delete_tag(request)
+
+        # Establish that the underlying gRPC stub method was called.
+ assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.DeleteTagRequest() + + # Establish that the response is the type that we expect. + assert response is None + + +def test_delete_tag_from_dict(): + test_delete_tag(request_type=dict) + + +@pytest.mark.asyncio +async def test_delete_tag_async(transport: str = "grpc_asyncio"): + client = DataCatalogAsyncClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = datacatalog.DeleteTagRequest() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.delete_tag), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(None) + + response = await client.delete_tag(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0] == request + + # Establish that the response is the type that we expect. + assert response is None + + +def test_delete_tag_field_headers(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = datacatalog.DeleteTagRequest() + request.name = "name/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.delete_tag), "__call__") as call: + call.return_value = None + + client.delete_tag(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. 
+ _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "name=name/value",) in kw["metadata"] + + +@pytest.mark.asyncio +async def test_delete_tag_field_headers_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = datacatalog.DeleteTagRequest() + request.name = "name/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.delete_tag), "__call__" + ) as call: + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(None) + + await client.delete_tag(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "name=name/value",) in kw["metadata"] + + +def test_delete_tag_flattened(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.delete_tag), "__call__") as call: + # Designate an appropriate return value for the call. + call.return_value = None + + # Call the method with a truthy value for each flattened field, + # using the keyword arguments to the method. + client.delete_tag(name="name_value",) + + # Establish that the underlying call was made with the expected + # request object values. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0].name == "name_value" + + +def test_delete_tag_flattened_error(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Attempting to call a method with both a request object and flattened + # fields is an error. 
+    with pytest.raises(ValueError):
+        client.delete_tag(
+            datacatalog.DeleteTagRequest(), name="name_value",
+        )
+
+
+@pytest.mark.asyncio
+async def test_delete_tag_flattened_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.delete_tag), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(None)
+        # Call the method with a truthy value for each flattened field,
+        # using the keyword arguments to the method.
+        response = await client.delete_tag(name="name_value",)
+
+        # Establish that the underlying call was made with the expected
+        # request object values.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0].name == "name_value"
+
+
+@pytest.mark.asyncio
+async def test_delete_tag_flattened_error_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Attempting to call a method with both a request object and flattened
+    # fields is an error.
+    with pytest.raises(ValueError):
+        await client.delete_tag(
+            datacatalog.DeleteTagRequest(), name="name_value",
+        )
+
+
+def test_list_tags(transport: str = "grpc", request_type=datacatalog.ListTagsRequest):
+    client = DataCatalogClient(
+        credentials=credentials.AnonymousCredentials(), transport=transport,
+    )
+
+    # Everything is optional in proto3 as far as the runtime is concerned,
+    # and we are mocking out the actual API, so just send an empty request.
+    request = request_type()
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(type(client._transport.list_tags), "__call__") as call:
+        # Designate an appropriate return value for the call.
+ call.return_value = datacatalog.ListTagsResponse( + next_page_token="next_page_token_value", + ) + + response = client.list_tags(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.ListTagsRequest() + + # Establish that the response is the type that we expect. + assert isinstance(response, pagers.ListTagsPager) + + assert response.next_page_token == "next_page_token_value" + + +def test_list_tags_from_dict(): + test_list_tags(request_type=dict) + + +@pytest.mark.asyncio +async def test_list_tags_async(transport: str = "grpc_asyncio"): + client = DataCatalogAsyncClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = datacatalog.ListTagsRequest() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.list_tags), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + datacatalog.ListTagsResponse(next_page_token="next_page_token_value",) + ) + + response = await client.list_tags(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0] == request + + # Establish that the response is the type that we expect. + assert isinstance(response, pagers.ListTagsAsyncPager) + + assert response.next_page_token == "next_page_token_value" + + +def test_list_tags_field_headers(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. 
+ request = datacatalog.ListTagsRequest() + request.parent = "parent/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.list_tags), "__call__") as call: + call.return_value = datacatalog.ListTagsResponse() + + client.list_tags(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "parent=parent/value",) in kw["metadata"] + + +@pytest.mark.asyncio +async def test_list_tags_field_headers_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = datacatalog.ListTagsRequest() + request.parent = "parent/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.list_tags), "__call__" + ) as call: + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + datacatalog.ListTagsResponse() + ) + + await client.list_tags(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "parent=parent/value",) in kw["metadata"] + + +def test_list_tags_flattened(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.list_tags), "__call__") as call: + # Designate an appropriate return value for the call. 
+        call.return_value = datacatalog.ListTagsResponse()
+
+        # Call the method with a truthy value for each flattened field,
+        # using the keyword arguments to the method.
+        client.list_tags(parent="parent_value",)
+
+        # Establish that the underlying call was made with the expected
+        # request object values.
+        assert len(call.mock_calls) == 1
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0].parent == "parent_value"
+
+
+def test_list_tags_flattened_error():
+    client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Attempting to call a method with both a request object and flattened
+    # fields is an error.
+    with pytest.raises(ValueError):
+        client.list_tags(
+            datacatalog.ListTagsRequest(), parent="parent_value",
+        )
+
+
+@pytest.mark.asyncio
+async def test_list_tags_flattened_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.list_tags), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(
+            datacatalog.ListTagsResponse()
+        )
+        # Call the method with a truthy value for each flattened field,
+        # using the keyword arguments to the method.
+        response = await client.list_tags(parent="parent_value",)
+
+        # Establish that the underlying call was made with the expected
+        # request object values.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0].parent == "parent_value"
+
+
+@pytest.mark.asyncio
+async def test_list_tags_flattened_error_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Attempting to call a method with both a request object and flattened
+    # fields is an error.
+    with pytest.raises(ValueError):
+        await client.list_tags(
+            datacatalog.ListTagsRequest(), parent="parent_value",
+        )
+
+
+def test_list_tags_pager():
+    client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(type(client._transport.list_tags), "__call__") as call:
+        # Set the response to a series of pages.
+        call.side_effect = (
+            datacatalog.ListTagsResponse(
+                tags=[tags.Tag(), tags.Tag(), tags.Tag(),], next_page_token="abc",
+            ),
+            datacatalog.ListTagsResponse(tags=[], next_page_token="def",),
+            datacatalog.ListTagsResponse(tags=[tags.Tag(),], next_page_token="ghi",),
+            datacatalog.ListTagsResponse(tags=[tags.Tag(), tags.Tag(),],),
+            RuntimeError,
+        )
+
+        metadata = (
+            gapic_v1.routing_header.to_grpc_metadata((("parent", ""),)),
+        )
+        pager = client.list_tags(request={})
+
+        assert pager._metadata == metadata
+
+        results = [i for i in pager]
+        assert len(results) == 6
+        assert all(isinstance(i, tags.Tag) for i in results)
+
+
+def test_list_tags_pages():
+    client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(type(client._transport.list_tags), "__call__") as call:
+        # Set the response to a series of pages.
+        call.side_effect = (
+            datacatalog.ListTagsResponse(
+                tags=[tags.Tag(), tags.Tag(), tags.Tag(),], next_page_token="abc",
+            ),
+            datacatalog.ListTagsResponse(tags=[], next_page_token="def",),
+            datacatalog.ListTagsResponse(tags=[tags.Tag(),], next_page_token="ghi",),
+            datacatalog.ListTagsResponse(tags=[tags.Tag(), tags.Tag(),],),
+            RuntimeError,
+        )
+        pages = list(client.list_tags(request={}).pages)
+        for page, token in zip(pages, ["abc", "def", "ghi", ""]):
+            assert page.raw_page.next_page_token == token
+
+
+@pytest.mark.asyncio
+async def test_list_tags_async_pager():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.list_tags),
+        "__call__",
+        new_callable=mock.AsyncMock,
+    ) as call:
+        # Set the response to a series of pages.
+        call.side_effect = (
+            datacatalog.ListTagsResponse(
+                tags=[tags.Tag(), tags.Tag(), tags.Tag(),], next_page_token="abc",
+            ),
+            datacatalog.ListTagsResponse(tags=[], next_page_token="def",),
+            datacatalog.ListTagsResponse(tags=[tags.Tag(),], next_page_token="ghi",),
+            datacatalog.ListTagsResponse(tags=[tags.Tag(), tags.Tag(),],),
+            RuntimeError,
+        )
+        async_pager = await client.list_tags(request={},)
+        assert async_pager.next_page_token == "abc"
+        responses = []
+        async for response in async_pager:
+            responses.append(response)
+
+        assert len(responses) == 6
+        assert all(isinstance(i, tags.Tag) for i in responses)
+
+
+@pytest.mark.asyncio
+async def test_list_tags_async_pages():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.list_tags),
+        "__call__",
+        new_callable=mock.AsyncMock,
+    ) as call:
+        # Set the response to a series of pages.
+ call.side_effect = ( + datacatalog.ListTagsResponse( + tags=[tags.Tag(), tags.Tag(), tags.Tag(),], next_page_token="abc", + ), + datacatalog.ListTagsResponse(tags=[], next_page_token="def",), + datacatalog.ListTagsResponse(tags=[tags.Tag(),], next_page_token="ghi",), + datacatalog.ListTagsResponse(tags=[tags.Tag(), tags.Tag(),],), + RuntimeError, + ) + pages = [] + async for page in (await client.list_tags(request={})).pages: + pages.append(page) + for page, token in zip(pages, ["abc", "def", "ghi", ""]): + assert page.raw_page.next_page_token == token + + +def test_set_iam_policy( + transport: str = "grpc", request_type=iam_policy.SetIamPolicyRequest +): + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = request_type() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.set_iam_policy), "__call__") as call: + # Designate an appropriate return value for the call. + call.return_value = policy.Policy(version=774, etag=b"etag_blob",) + + response = client.set_iam_policy(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0] == iam_policy.SetIamPolicyRequest() + + # Establish that the response is the type that we expect. 
+ assert isinstance(response, policy.Policy) + + assert response.version == 774 + + assert response.etag == b"etag_blob" + + +def test_set_iam_policy_from_dict(): + test_set_iam_policy(request_type=dict) + + +@pytest.mark.asyncio +async def test_set_iam_policy_async(transport: str = "grpc_asyncio"): + client = DataCatalogAsyncClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = iam_policy.SetIamPolicyRequest() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.set_iam_policy), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + policy.Policy(version=774, etag=b"etag_blob",) + ) + + response = await client.set_iam_policy(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0] == request + + # Establish that the response is the type that we expect. + assert isinstance(response, policy.Policy) + + assert response.version == 774 + + assert response.etag == b"etag_blob" + + +def test_set_iam_policy_field_headers(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = iam_policy.SetIamPolicyRequest() + request.resource = "resource/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.set_iam_policy), "__call__") as call: + call.return_value = policy.Policy() + + client.set_iam_policy(request) + + # Establish that the underlying gRPC stub method was called. 
+        assert len(call.mock_calls) == 1
+        _, args, _ = call.mock_calls[0]
+        assert args[0] == request
+
+    # Establish that the field header was sent.
+    _, _, kw = call.mock_calls[0]
+    assert ("x-goog-request-params", "resource=resource/value",) in kw["metadata"]
+
+
+@pytest.mark.asyncio
+async def test_set_iam_policy_field_headers_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Any value that is part of the HTTP/1.1 URI should be sent as
+    # a field header. Set these to a non-empty value.
+    request = iam_policy.SetIamPolicyRequest()
+    request.resource = "resource/value"
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.set_iam_policy), "__call__"
+    ) as call:
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(policy.Policy())
+
+        await client.set_iam_policy(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+        assert args[0] == request
+
+    # Establish that the field header was sent.
+    _, _, kw = call.mock_calls[0]
+    assert ("x-goog-request-params", "resource=resource/value",) in kw["metadata"]
+
+
+# Named distinctly from test_set_iam_policy_from_dict above, which would
+# otherwise be silently shadowed by a second definition of the same name.
+def test_set_iam_policy_from_dict_foreign():
+    client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),)
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(type(client._transport.set_iam_policy), "__call__") as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = policy.Policy()
+
+        response = client.set_iam_policy(
+            request={
+                "resource": "resource_value",
+                "policy": policy.Policy(version=774),
+            }
+        )
+        call.assert_called()
+
+
+def test_set_iam_policy_flattened():
+    client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(type(client._transport.set_iam_policy), "__call__") as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = policy.Policy()
+
+        # Call the method with a truthy value for each flattened field,
+        # using the keyword arguments to the method.
+        client.set_iam_policy(resource="resource_value",)
+
+        # Establish that the underlying call was made with the expected
+        # request object values.
+        assert len(call.mock_calls) == 1
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0].resource == "resource_value"
+
+
+def test_set_iam_policy_flattened_error():
+    client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Attempting to call a method with both a request object and flattened
+    # fields is an error.
+    with pytest.raises(ValueError):
+        client.set_iam_policy(
+            iam_policy.SetIamPolicyRequest(), resource="resource_value",
+        )
+
+
+@pytest.mark.asyncio
+async def test_set_iam_policy_flattened_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.set_iam_policy), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(policy.Policy())
+        # Call the method with a truthy value for each flattened field,
+        # using the keyword arguments to the method.
+        response = await client.set_iam_policy(resource="resource_value",)
+
+        # Establish that the underlying call was made with the expected
+        # request object values.
+ assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0].resource == "resource_value" + + +@pytest.mark.asyncio +async def test_set_iam_policy_flattened_error_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Attempting to call a method with both a request object and flattened + # fields is an error. + with pytest.raises(ValueError): + await client.set_iam_policy( + iam_policy.SetIamPolicyRequest(), resource="resource_value", + ) + + +def test_get_iam_policy( + transport: str = "grpc", request_type=iam_policy.GetIamPolicyRequest +): + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = request_type() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.get_iam_policy), "__call__") as call: + # Designate an appropriate return value for the call. + call.return_value = policy.Policy(version=774, etag=b"etag_blob",) + + response = client.get_iam_policy(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0] == iam_policy.GetIamPolicyRequest() + + # Establish that the response is the type that we expect. 
+ assert isinstance(response, policy.Policy) + + assert response.version == 774 + + assert response.etag == b"etag_blob" + + +def test_get_iam_policy_from_dict(): + test_get_iam_policy(request_type=dict) + + +@pytest.mark.asyncio +async def test_get_iam_policy_async(transport: str = "grpc_asyncio"): + client = DataCatalogAsyncClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = iam_policy.GetIamPolicyRequest() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.get_iam_policy), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + policy.Policy(version=774, etag=b"etag_blob",) + ) + + response = await client.get_iam_policy(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0] == request + + # Establish that the response is the type that we expect. + assert isinstance(response, policy.Policy) + + assert response.version == 774 + + assert response.etag == b"etag_blob" + + +def test_get_iam_policy_field_headers(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = iam_policy.GetIamPolicyRequest() + request.resource = "resource/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.get_iam_policy), "__call__") as call: + call.return_value = policy.Policy() + + client.get_iam_policy(request) + + # Establish that the underlying gRPC stub method was called. 
+        assert len(call.mock_calls) == 1
+        _, args, _ = call.mock_calls[0]
+        assert args[0] == request
+
+    # Establish that the field header was sent.
+    _, _, kw = call.mock_calls[0]
+    assert ("x-goog-request-params", "resource=resource/value",) in kw["metadata"]
+
+
+@pytest.mark.asyncio
+async def test_get_iam_policy_field_headers_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Any value that is part of the HTTP/1.1 URI should be sent as
+    # a field header. Set these to a non-empty value.
+    request = iam_policy.GetIamPolicyRequest()
+    request.resource = "resource/value"
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.get_iam_policy), "__call__"
+    ) as call:
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(policy.Policy())
+
+        await client.get_iam_policy(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+        assert args[0] == request
+
+    # Establish that the field header was sent.
+    _, _, kw = call.mock_calls[0]
+    assert ("x-goog-request-params", "resource=resource/value",) in kw["metadata"]
+
+
+# Named distinctly from test_get_iam_policy_from_dict above, which would
+# otherwise be silently shadowed by a second definition of the same name.
+def test_get_iam_policy_from_dict_foreign():
+    client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),)
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(type(client._transport.get_iam_policy), "__call__") as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = policy.Policy()
+
+        response = client.get_iam_policy(
+            request={
+                "resource": "resource_value",
+                "options": options.GetPolicyOptions(requested_policy_version=2598),
+            }
+        )
+        call.assert_called()
+
+
+def test_get_iam_policy_flattened():
+    client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(type(client._transport.get_iam_policy), "__call__") as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = policy.Policy()
+
+        # Call the method with a truthy value for each flattened field,
+        # using the keyword arguments to the method.
+        client.get_iam_policy(resource="resource_value",)
+
+        # Establish that the underlying call was made with the expected
+        # request object values.
+        assert len(call.mock_calls) == 1
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0].resource == "resource_value"
+
+
+def test_get_iam_policy_flattened_error():
+    client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Attempting to call a method with both a request object and flattened
+    # fields is an error.
+    with pytest.raises(ValueError):
+        client.get_iam_policy(
+            iam_policy.GetIamPolicyRequest(), resource="resource_value",
+        )
+
+
+@pytest.mark.asyncio
+async def test_get_iam_policy_flattened_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.get_iam_policy), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(policy.Policy())
+        # Call the method with a truthy value for each flattened field,
+        # using the keyword arguments to the method.
+        response = await client.get_iam_policy(resource="resource_value",)
+
+        # Establish that the underlying call was made with the expected
+        # request object values.
+ assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0].resource == "resource_value" + + +@pytest.mark.asyncio +async def test_get_iam_policy_flattened_error_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Attempting to call a method with both a request object and flattened + # fields is an error. + with pytest.raises(ValueError): + await client.get_iam_policy( + iam_policy.GetIamPolicyRequest(), resource="resource_value", + ) + + +def test_test_iam_permissions( + transport: str = "grpc", request_type=iam_policy.TestIamPermissionsRequest +): + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = request_type() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._transport.test_iam_permissions), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = iam_policy.TestIamPermissionsResponse( + permissions=["permissions_value"], + ) + + response = client.test_iam_permissions(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0] == iam_policy.TestIamPermissionsRequest() + + # Establish that the response is the type that we expect. 
+ assert isinstance(response, iam_policy.TestIamPermissionsResponse) + + assert response.permissions == ["permissions_value"] + + +def test_test_iam_permissions_from_dict(): + test_test_iam_permissions(request_type=dict) + + +@pytest.mark.asyncio +async def test_test_iam_permissions_async(transport: str = "grpc_asyncio"): + client = DataCatalogAsyncClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = iam_policy.TestIamPermissionsRequest() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.test_iam_permissions), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + iam_policy.TestIamPermissionsResponse(permissions=["permissions_value"],) + ) + + response = await client.test_iam_permissions(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0] == request + + # Establish that the response is the type that we expect. + assert isinstance(response, iam_policy.TestIamPermissionsResponse) + + assert response.permissions == ["permissions_value"] + + +def test_test_iam_permissions_field_headers(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = iam_policy.TestIamPermissionsRequest() + request.resource = "resource/value" + + # Mock the actual call within the gRPC stub, and fake the request. 
+    with mock.patch.object(
+        type(client._transport.test_iam_permissions), "__call__"
+    ) as call:
+        call.return_value = iam_policy.TestIamPermissionsResponse()
+
+        client.test_iam_permissions(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls) == 1
+        _, args, _ = call.mock_calls[0]
+        assert args[0] == request
+
+    # Establish that the field header was sent.
+    _, _, kw = call.mock_calls[0]
+    assert ("x-goog-request-params", "resource=resource/value",) in kw["metadata"]
+
+
+@pytest.mark.asyncio
+async def test_test_iam_permissions_field_headers_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Any value that is part of the HTTP/1.1 URI should be sent as
+    # a field header. Set these to a non-empty value.
+    request = iam_policy.TestIamPermissionsRequest()
+    request.resource = "resource/value"
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.test_iam_permissions), "__call__"
+    ) as call:
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(
+            iam_policy.TestIamPermissionsResponse()
+        )
+
+        await client.test_iam_permissions(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+        assert args[0] == request
+
+    # Establish that the field header was sent.
+    _, _, kw = call.mock_calls[0]
+    assert ("x-goog-request-params", "resource=resource/value",) in kw["metadata"]
+
+
+def test_test_iam_permissions_from_dict_foreign():
+    client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),)
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._transport.test_iam_permissions), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+ call.return_value = iam_policy.TestIamPermissionsResponse() + + response = client.test_iam_permissions( + request={ + "resource": "resource_value", + "permissions": ["permissions_value"], + } + ) + call.assert_called() + + +def test_credentials_transport_error(): + # It is an error to provide credentials and a transport instance. + transport = transports.DataCatalogGrpcTransport( + credentials=credentials.AnonymousCredentials(), + ) + with pytest.raises(ValueError): + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # It is an error to provide a credentials file and a transport instance. + transport = transports.DataCatalogGrpcTransport( + credentials=credentials.AnonymousCredentials(), + ) + with pytest.raises(ValueError): + client = DataCatalogClient( + client_options={"credentials_file": "credentials.json"}, + transport=transport, + ) + + # It is an error to provide scopes and a transport instance. + transport = transports.DataCatalogGrpcTransport( + credentials=credentials.AnonymousCredentials(), + ) + with pytest.raises(ValueError): + client = DataCatalogClient( + client_options={"scopes": ["1", "2"]}, transport=transport, + ) + + +def test_transport_instance(): + # A client may be instantiated with a custom transport instance. + transport = transports.DataCatalogGrpcTransport( + credentials=credentials.AnonymousCredentials(), + ) + client = DataCatalogClient(transport=transport) + assert client._transport is transport + + +def test_transport_get_channel(): + # A client may be instantiated with a custom transport instance. 
+ transport = transports.DataCatalogGrpcTransport( + credentials=credentials.AnonymousCredentials(), + ) + channel = transport.grpc_channel + assert channel + + transport = transports.DataCatalogGrpcAsyncIOTransport( + credentials=credentials.AnonymousCredentials(), + ) + channel = transport.grpc_channel + assert channel + + +def test_transport_grpc_default(): + # A client should use the gRPC transport by default. + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + assert isinstance(client._transport, transports.DataCatalogGrpcTransport,) + + +def test_data_catalog_base_transport_error(): + # Passing both a credentials object and credentials_file should raise an error + with pytest.raises(exceptions.DuplicateCredentialArgs): + transport = transports.DataCatalogTransport( + credentials=credentials.AnonymousCredentials(), + credentials_file="credentials.json", + ) + + +def test_data_catalog_base_transport(): + # Instantiate the base transport. + with mock.patch( + "google.cloud.datacatalog_v1.services.data_catalog.transports.DataCatalogTransport.__init__" + ) as Transport: + Transport.return_value = None + transport = transports.DataCatalogTransport( + credentials=credentials.AnonymousCredentials(), + ) + + # Every method on the transport should just blindly + # raise NotImplementedError. 
+ methods = ( + "search_catalog", + "create_entry_group", + "get_entry_group", + "update_entry_group", + "delete_entry_group", + "list_entry_groups", + "create_entry", + "update_entry", + "delete_entry", + "get_entry", + "lookup_entry", + "list_entries", + "create_tag_template", + "get_tag_template", + "update_tag_template", + "delete_tag_template", + "create_tag_template_field", + "update_tag_template_field", + "rename_tag_template_field", + "delete_tag_template_field", + "create_tag", + "update_tag", + "delete_tag", + "list_tags", + "set_iam_policy", + "get_iam_policy", + "test_iam_permissions", + ) + for method in methods: + with pytest.raises(NotImplementedError): + getattr(transport, method)(request=object()) + + +def test_data_catalog_base_transport_with_credentials_file(): + # Instantiate the base transport with a credentials file + with mock.patch.object( + auth, "load_credentials_from_file" + ) as load_creds, mock.patch( + "google.cloud.datacatalog_v1.services.data_catalog.transports.DataCatalogTransport._prep_wrapped_messages" + ) as Transport: + Transport.return_value = None + load_creds.return_value = (credentials.AnonymousCredentials(), None) + transport = transports.DataCatalogTransport( + credentials_file="credentials.json", quota_project_id="octopus", + ) + load_creds.assert_called_once_with( + "credentials.json", + scopes=("https://www.googleapis.com/auth/cloud-platform",), + quota_project_id="octopus", + ) + + +def test_data_catalog_auth_adc(): + # If no credentials are provided, we should use ADC credentials. + with mock.patch.object(auth, "default") as adc: + adc.return_value = (credentials.AnonymousCredentials(), None) + DataCatalogClient() + adc.assert_called_once_with( + scopes=("https://www.googleapis.com/auth/cloud-platform",), + quota_project_id=None, + ) + + +def test_data_catalog_transport_auth_adc(): + # If credentials and host are not provided, the transport class should use + # ADC credentials. 
+ with mock.patch.object(auth, "default") as adc: + adc.return_value = (credentials.AnonymousCredentials(), None) + transports.DataCatalogGrpcTransport( + host="squid.clam.whelk", quota_project_id="octopus" + ) + adc.assert_called_once_with( + scopes=("https://www.googleapis.com/auth/cloud-platform",), + quota_project_id="octopus", + ) + + +def test_data_catalog_host_no_port(): + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), + client_options=client_options.ClientOptions( + api_endpoint="datacatalog.googleapis.com" + ), + ) + assert client._transport._host == "datacatalog.googleapis.com:443" + + +def test_data_catalog_host_with_port(): + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), + client_options=client_options.ClientOptions( + api_endpoint="datacatalog.googleapis.com:8000" + ), + ) + assert client._transport._host == "datacatalog.googleapis.com:8000" + + +def test_data_catalog_grpc_transport_channel(): + channel = grpc.insecure_channel("http://localhost/") + + # Check that if channel is provided, mtls endpoint and client_cert_source + # won't be used. + callback = mock.MagicMock() + transport = transports.DataCatalogGrpcTransport( + host="squid.clam.whelk", + channel=channel, + api_mtls_endpoint="mtls.squid.clam.whelk", + client_cert_source=callback, + ) + assert transport.grpc_channel == channel + assert transport._host == "squid.clam.whelk:443" + assert not callback.called + + +def test_data_catalog_grpc_asyncio_transport_channel(): + channel = aio.insecure_channel("http://localhost/") + + # Check that if channel is provided, mtls endpoint and client_cert_source + # won't be used. 
+ callback = mock.MagicMock() + transport = transports.DataCatalogGrpcAsyncIOTransport( + host="squid.clam.whelk", + channel=channel, + api_mtls_endpoint="mtls.squid.clam.whelk", + client_cert_source=callback, + ) + assert transport.grpc_channel == channel + assert transport._host == "squid.clam.whelk:443" + assert not callback.called + + +@mock.patch("grpc.ssl_channel_credentials", autospec=True) +@mock.patch("google.api_core.grpc_helpers.create_channel", autospec=True) +def test_data_catalog_grpc_transport_channel_mtls_with_client_cert_source( + grpc_create_channel, grpc_ssl_channel_cred +): + # Check that if channel is None, but api_mtls_endpoint and client_cert_source + # are provided, then a mTLS channel will be created. + mock_cred = mock.Mock() + + mock_ssl_cred = mock.Mock() + grpc_ssl_channel_cred.return_value = mock_ssl_cred + + mock_grpc_channel = mock.Mock() + grpc_create_channel.return_value = mock_grpc_channel + + transport = transports.DataCatalogGrpcTransport( + host="squid.clam.whelk", + credentials=mock_cred, + api_mtls_endpoint="mtls.squid.clam.whelk", + client_cert_source=client_cert_source_callback, + ) + grpc_ssl_channel_cred.assert_called_once_with( + certificate_chain=b"cert bytes", private_key=b"key bytes" + ) + grpc_create_channel.assert_called_once_with( + "mtls.squid.clam.whelk:443", + credentials=mock_cred, + credentials_file=None, + scopes=("https://www.googleapis.com/auth/cloud-platform",), + ssl_credentials=mock_ssl_cred, + quota_project_id=None, + ) + assert transport.grpc_channel == mock_grpc_channel + + +@mock.patch("grpc.ssl_channel_credentials", autospec=True) +@mock.patch("google.api_core.grpc_helpers_async.create_channel", autospec=True) +def test_data_catalog_grpc_asyncio_transport_channel_mtls_with_client_cert_source( + grpc_create_channel, grpc_ssl_channel_cred +): + # Check that if channel is None, but api_mtls_endpoint and client_cert_source + # are provided, then a mTLS channel will be created. 
+ mock_cred = mock.Mock() + + mock_ssl_cred = mock.Mock() + grpc_ssl_channel_cred.return_value = mock_ssl_cred + + mock_grpc_channel = mock.Mock() + grpc_create_channel.return_value = mock_grpc_channel + + transport = transports.DataCatalogGrpcAsyncIOTransport( + host="squid.clam.whelk", + credentials=mock_cred, + api_mtls_endpoint="mtls.squid.clam.whelk", + client_cert_source=client_cert_source_callback, + ) + grpc_ssl_channel_cred.assert_called_once_with( + certificate_chain=b"cert bytes", private_key=b"key bytes" + ) + grpc_create_channel.assert_called_once_with( + "mtls.squid.clam.whelk:443", + credentials=mock_cred, + credentials_file=None, + scopes=("https://www.googleapis.com/auth/cloud-platform",), + ssl_credentials=mock_ssl_cred, + quota_project_id=None, + ) + assert transport.grpc_channel == mock_grpc_channel + + +@pytest.mark.parametrize( + "api_mtls_endpoint", ["mtls.squid.clam.whelk", "mtls.squid.clam.whelk:443"] +) +@mock.patch("google.api_core.grpc_helpers.create_channel", autospec=True) +def test_data_catalog_grpc_transport_channel_mtls_with_adc( + grpc_create_channel, api_mtls_endpoint +): + # Check that if channel and client_cert_source are None, but api_mtls_endpoint + # is provided, then a mTLS channel will be created with SSL ADC. + mock_grpc_channel = mock.Mock() + grpc_create_channel.return_value = mock_grpc_channel + + # Mock google.auth.transport.grpc.SslCredentials class. 
+ mock_ssl_cred = mock.Mock() + with mock.patch.multiple( + "google.auth.transport.grpc.SslCredentials", + __init__=mock.Mock(return_value=None), + ssl_credentials=mock.PropertyMock(return_value=mock_ssl_cred), + ): + mock_cred = mock.Mock() + transport = transports.DataCatalogGrpcTransport( + host="squid.clam.whelk", + credentials=mock_cred, + api_mtls_endpoint=api_mtls_endpoint, + client_cert_source=None, + ) + grpc_create_channel.assert_called_once_with( + "mtls.squid.clam.whelk:443", + credentials=mock_cred, + credentials_file=None, + scopes=("https://www.googleapis.com/auth/cloud-platform",), + ssl_credentials=mock_ssl_cred, + quota_project_id=None, + ) + assert transport.grpc_channel == mock_grpc_channel + + +@pytest.mark.parametrize( + "api_mtls_endpoint", ["mtls.squid.clam.whelk", "mtls.squid.clam.whelk:443"] +) +@mock.patch("google.api_core.grpc_helpers_async.create_channel", autospec=True) +def test_data_catalog_grpc_asyncio_transport_channel_mtls_with_adc( + grpc_create_channel, api_mtls_endpoint +): + # Check that if channel and client_cert_source are None, but api_mtls_endpoint + # is provided, then a mTLS channel will be created with SSL ADC. + mock_grpc_channel = mock.Mock() + grpc_create_channel.return_value = mock_grpc_channel + + # Mock google.auth.transport.grpc.SslCredentials class. 
+ mock_ssl_cred = mock.Mock() + with mock.patch.multiple( + "google.auth.transport.grpc.SslCredentials", + __init__=mock.Mock(return_value=None), + ssl_credentials=mock.PropertyMock(return_value=mock_ssl_cred), + ): + mock_cred = mock.Mock() + transport = transports.DataCatalogGrpcAsyncIOTransport( + host="squid.clam.whelk", + credentials=mock_cred, + api_mtls_endpoint=api_mtls_endpoint, + client_cert_source=None, + ) + grpc_create_channel.assert_called_once_with( + "mtls.squid.clam.whelk:443", + credentials=mock_cred, + credentials_file=None, + scopes=("https://www.googleapis.com/auth/cloud-platform",), + ssl_credentials=mock_ssl_cred, + quota_project_id=None, + ) + assert transport.grpc_channel == mock_grpc_channel + + +def test_tag_template_field_path(): + project = "squid" + location = "clam" + tag_template = "whelk" + field = "octopus" + + expected = "projects/{project}/locations/{location}/tagTemplates/{tag_template}/fields/{field}".format( + project=project, location=location, tag_template=tag_template, field=field, + ) + actual = DataCatalogClient.tag_template_field_path( + project, location, tag_template, field + ) + assert expected == actual + + +def test_parse_tag_template_field_path(): + expected = { + "project": "oyster", + "location": "nudibranch", + "tag_template": "cuttlefish", + "field": "mussel", + } + path = DataCatalogClient.tag_template_field_path(**expected) + + # Check that the path construction is reversible. 
+ actual = DataCatalogClient.parse_tag_template_field_path(path) + assert expected == actual + + +def test_tag_template_path(): + project = "squid" + location = "clam" + tag_template = "whelk" + + expected = "projects/{project}/locations/{location}/tagTemplates/{tag_template}".format( + project=project, location=location, tag_template=tag_template, + ) + actual = DataCatalogClient.tag_template_path(project, location, tag_template) + assert expected == actual + + +def test_parse_tag_template_path(): + expected = { + "project": "octopus", + "location": "oyster", + "tag_template": "nudibranch", + } + path = DataCatalogClient.tag_template_path(**expected) + + # Check that the path construction is reversible. + actual = DataCatalogClient.parse_tag_template_path(path) + assert expected == actual + + +def test_entry_path(): + project = "squid" + location = "clam" + entry_group = "whelk" + entry = "octopus" + + expected = "projects/{project}/locations/{location}/entryGroups/{entry_group}/entries/{entry}".format( + project=project, location=location, entry_group=entry_group, entry=entry, + ) + actual = DataCatalogClient.entry_path(project, location, entry_group, entry) + assert expected == actual + + +def test_parse_entry_path(): + expected = { + "project": "oyster", + "location": "nudibranch", + "entry_group": "cuttlefish", + "entry": "mussel", + } + path = DataCatalogClient.entry_path(**expected) + + # Check that the path construction is reversible. 
+ actual = DataCatalogClient.parse_entry_path(path) + assert expected == actual + + +def test_tag_path(): + project = "squid" + location = "clam" + entry_group = "whelk" + entry = "octopus" + tag = "oyster" + + expected = "projects/{project}/locations/{location}/entryGroups/{entry_group}/entries/{entry}/tags/{tag}".format( + project=project, + location=location, + entry_group=entry_group, + entry=entry, + tag=tag, + ) + actual = DataCatalogClient.tag_path(project, location, entry_group, entry, tag) + assert expected == actual + + +def test_parse_tag_path(): + expected = { + "project": "nudibranch", + "location": "cuttlefish", + "entry_group": "mussel", + "entry": "winkle", + "tag": "nautilus", + } + path = DataCatalogClient.tag_path(**expected) + + # Check that the path construction is reversible. + actual = DataCatalogClient.parse_tag_path(path) + assert expected == actual + + +def test_entry_group_path(): + project = "squid" + location = "clam" + entry_group = "whelk" + + expected = "projects/{project}/locations/{location}/entryGroups/{entry_group}".format( + project=project, location=location, entry_group=entry_group, + ) + actual = DataCatalogClient.entry_group_path(project, location, entry_group) + assert expected == actual + + +def test_parse_entry_group_path(): + expected = { + "project": "octopus", + "location": "oyster", + "entry_group": "nudibranch", + } + path = DataCatalogClient.entry_group_path(**expected) + + # Check that the path construction is reversible. 
+ actual = DataCatalogClient.parse_entry_group_path(path) + assert expected == actual diff --git a/tests/unit/gapic/datacatalog_v1beta1/__init__.py b/tests/unit/gapic/datacatalog_v1beta1/__init__.py new file mode 100644 index 00000000..8b137891 --- /dev/null +++ b/tests/unit/gapic/datacatalog_v1beta1/__init__.py @@ -0,0 +1 @@ + diff --git a/tests/unit/gapic/datacatalog_v1beta1/test_data_catalog.py b/tests/unit/gapic/datacatalog_v1beta1/test_data_catalog.py new file mode 100644 index 00000000..427f8b6b --- /dev/null +++ b/tests/unit/gapic/datacatalog_v1beta1/test_data_catalog.py @@ -0,0 +1,6833 @@ +# -*- coding: utf-8 -*- + +# Copyright 2020 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
+# + +import os +import mock + +import grpc +from grpc.experimental import aio +import math +import pytest +from proto.marshal.rules.dates import DurationRule, TimestampRule + +from google import auth +from google.api_core import client_options +from google.api_core import exceptions +from google.api_core import gapic_v1 +from google.api_core import grpc_helpers +from google.api_core import grpc_helpers_async +from google.auth import credentials +from google.auth.exceptions import MutualTLSChannelError +from google.cloud.datacatalog_v1beta1.services.data_catalog import ( + DataCatalogAsyncClient, +) +from google.cloud.datacatalog_v1beta1.services.data_catalog import DataCatalogClient +from google.cloud.datacatalog_v1beta1.services.data_catalog import pagers +from google.cloud.datacatalog_v1beta1.services.data_catalog import transports +from google.cloud.datacatalog_v1beta1.types import common +from google.cloud.datacatalog_v1beta1.types import datacatalog +from google.cloud.datacatalog_v1beta1.types import gcs_fileset_spec +from google.cloud.datacatalog_v1beta1.types import ( + gcs_fileset_spec as gcd_gcs_fileset_spec, +) +from google.cloud.datacatalog_v1beta1.types import schema +from google.cloud.datacatalog_v1beta1.types import schema as gcd_schema +from google.cloud.datacatalog_v1beta1.types import search +from google.cloud.datacatalog_v1beta1.types import table_spec +from google.cloud.datacatalog_v1beta1.types import table_spec as gcd_table_spec +from google.cloud.datacatalog_v1beta1.types import tags +from google.cloud.datacatalog_v1beta1.types import timestamps +from google.iam.v1 import iam_policy_pb2 as iam_policy # type: ignore +from google.iam.v1 import options_pb2 as options # type: ignore +from google.iam.v1 import policy_pb2 as policy # type: ignore +from google.oauth2 import service_account +from google.protobuf import field_mask_pb2 as field_mask # type: ignore +from google.protobuf import timestamp_pb2 as timestamp # type: ignore +from google.type 
import expr_pb2 as expr # type: ignore + + +def client_cert_source_callback(): + return b"cert bytes", b"key bytes" + + +# If default endpoint is localhost, then default mtls endpoint will be the same. +# This method modifies the default endpoint so the client can produce a different +# mtls endpoint for endpoint testing purposes. +def modify_default_endpoint(client): + return ( + "foo.googleapis.com" + if ("localhost" in client.DEFAULT_ENDPOINT) + else client.DEFAULT_ENDPOINT + ) + + +def test__get_default_mtls_endpoint(): + api_endpoint = "example.googleapis.com" + api_mtls_endpoint = "example.mtls.googleapis.com" + sandbox_endpoint = "example.sandbox.googleapis.com" + sandbox_mtls_endpoint = "example.mtls.sandbox.googleapis.com" + non_googleapi = "api.example.com" + + assert DataCatalogClient._get_default_mtls_endpoint(None) is None + assert ( + DataCatalogClient._get_default_mtls_endpoint(api_endpoint) == api_mtls_endpoint + ) + assert ( + DataCatalogClient._get_default_mtls_endpoint(api_mtls_endpoint) + == api_mtls_endpoint + ) + assert ( + DataCatalogClient._get_default_mtls_endpoint(sandbox_endpoint) + == sandbox_mtls_endpoint + ) + assert ( + DataCatalogClient._get_default_mtls_endpoint(sandbox_mtls_endpoint) + == sandbox_mtls_endpoint + ) + assert DataCatalogClient._get_default_mtls_endpoint(non_googleapi) == non_googleapi + + +@pytest.mark.parametrize("client_class", [DataCatalogClient, DataCatalogAsyncClient]) +def test_data_catalog_client_from_service_account_file(client_class): + creds = credentials.AnonymousCredentials() + with mock.patch.object( + service_account.Credentials, "from_service_account_file" + ) as factory: + factory.return_value = creds + client = client_class.from_service_account_file("dummy/file/path.json") + assert client._transport._credentials == creds + + client = client_class.from_service_account_json("dummy/file/path.json") + assert client._transport._credentials == creds + + assert client._transport._host == 
"datacatalog.googleapis.com:443" + + +def test_data_catalog_client_get_transport_class(): + transport = DataCatalogClient.get_transport_class() + assert transport == transports.DataCatalogGrpcTransport + + transport = DataCatalogClient.get_transport_class("grpc") + assert transport == transports.DataCatalogGrpcTransport + + +@pytest.mark.parametrize( + "client_class,transport_class,transport_name", + [ + (DataCatalogClient, transports.DataCatalogGrpcTransport, "grpc"), + ( + DataCatalogAsyncClient, + transports.DataCatalogGrpcAsyncIOTransport, + "grpc_asyncio", + ), + ], +) +@mock.patch.object( + DataCatalogClient, "DEFAULT_ENDPOINT", modify_default_endpoint(DataCatalogClient) +) +@mock.patch.object( + DataCatalogAsyncClient, + "DEFAULT_ENDPOINT", + modify_default_endpoint(DataCatalogAsyncClient), +) +def test_data_catalog_client_client_options( + client_class, transport_class, transport_name +): + # Check that if channel is provided we won't create a new one. + with mock.patch.object(DataCatalogClient, "get_transport_class") as gtc: + transport = transport_class(credentials=credentials.AnonymousCredentials()) + client = client_class(transport=transport) + gtc.assert_not_called() + + # Check that if channel is provided via str we will create a new one. + with mock.patch.object(DataCatalogClient, "get_transport_class") as gtc: + client = client_class(transport=transport_name) + gtc.assert_called() + + # Check the case api_endpoint is provided. + options = client_options.ClientOptions(api_endpoint="squid.clam.whelk") + with mock.patch.object(transport_class, "__init__") as patched: + patched.return_value = None + client = client_class(client_options=options) + patched.assert_called_once_with( + credentials=None, + credentials_file=None, + host="squid.clam.whelk", + scopes=None, + api_mtls_endpoint="squid.clam.whelk", + client_cert_source=None, + quota_project_id=None, + ) + + # Check the case api_endpoint is not provided and GOOGLE_API_USE_MTLS is + # "never". 
+ with mock.patch.dict(os.environ, {"GOOGLE_API_USE_MTLS": "never"}): + with mock.patch.object(transport_class, "__init__") as patched: + patched.return_value = None + client = client_class() + patched.assert_called_once_with( + credentials=None, + credentials_file=None, + host=client.DEFAULT_ENDPOINT, + scopes=None, + api_mtls_endpoint=client.DEFAULT_ENDPOINT, + client_cert_source=None, + quota_project_id=None, + ) + + # Check the case api_endpoint is not provided and GOOGLE_API_USE_MTLS is + # "always". + with mock.patch.dict(os.environ, {"GOOGLE_API_USE_MTLS": "always"}): + with mock.patch.object(transport_class, "__init__") as patched: + patched.return_value = None + client = client_class() + patched.assert_called_once_with( + credentials=None, + credentials_file=None, + host=client.DEFAULT_MTLS_ENDPOINT, + scopes=None, + api_mtls_endpoint=client.DEFAULT_MTLS_ENDPOINT, + client_cert_source=None, + quota_project_id=None, + ) + + # Check the case api_endpoint is not provided, GOOGLE_API_USE_MTLS is + # "auto", and client_cert_source is provided. + with mock.patch.dict(os.environ, {"GOOGLE_API_USE_MTLS": "auto"}): + options = client_options.ClientOptions( + client_cert_source=client_cert_source_callback + ) + with mock.patch.object(transport_class, "__init__") as patched: + patched.return_value = None + client = client_class(client_options=options) + patched.assert_called_once_with( + credentials=None, + credentials_file=None, + host=client.DEFAULT_MTLS_ENDPOINT, + scopes=None, + api_mtls_endpoint=client.DEFAULT_MTLS_ENDPOINT, + client_cert_source=client_cert_source_callback, + quota_project_id=None, + ) + + # Check the case api_endpoint is not provided, GOOGLE_API_USE_MTLS is + # "auto", and default_client_cert_source is provided. 
+ with mock.patch.dict(os.environ, {"GOOGLE_API_USE_MTLS": "auto"}): + with mock.patch.object(transport_class, "__init__") as patched: + with mock.patch( + "google.auth.transport.mtls.has_default_client_cert_source", + return_value=True, + ): + patched.return_value = None + client = client_class() + patched.assert_called_once_with( + credentials=None, + credentials_file=None, + host=client.DEFAULT_MTLS_ENDPOINT, + scopes=None, + api_mtls_endpoint=client.DEFAULT_MTLS_ENDPOINT, + client_cert_source=None, + quota_project_id=None, + ) + + # Check the case api_endpoint is not provided, GOOGLE_API_USE_MTLS is + # "auto", but client_cert_source and default_client_cert_source are None. + with mock.patch.dict(os.environ, {"GOOGLE_API_USE_MTLS": "auto"}): + with mock.patch.object(transport_class, "__init__") as patched: + with mock.patch( + "google.auth.transport.mtls.has_default_client_cert_source", + return_value=False, + ): + patched.return_value = None + client = client_class() + patched.assert_called_once_with( + credentials=None, + credentials_file=None, + host=client.DEFAULT_ENDPOINT, + scopes=None, + api_mtls_endpoint=client.DEFAULT_ENDPOINT, + client_cert_source=None, + quota_project_id=None, + ) + + # Check the case api_endpoint is not provided and GOOGLE_API_USE_MTLS has + # unsupported value. 
+ with mock.patch.dict(os.environ, {"GOOGLE_API_USE_MTLS": "Unsupported"}): + with pytest.raises(MutualTLSChannelError): + client = client_class() + + # Check the case quota_project_id is provided + options = client_options.ClientOptions(quota_project_id="octopus") + with mock.patch.object(transport_class, "__init__") as patched: + patched.return_value = None + client = client_class(client_options=options) + patched.assert_called_once_with( + credentials=None, + credentials_file=None, + host=client.DEFAULT_ENDPOINT, + scopes=None, + api_mtls_endpoint=client.DEFAULT_ENDPOINT, + client_cert_source=None, + quota_project_id="octopus", + ) + + +@pytest.mark.parametrize( + "client_class,transport_class,transport_name", + [ + (DataCatalogClient, transports.DataCatalogGrpcTransport, "grpc"), + ( + DataCatalogAsyncClient, + transports.DataCatalogGrpcAsyncIOTransport, + "grpc_asyncio", + ), + ], +) +def test_data_catalog_client_client_options_scopes( + client_class, transport_class, transport_name +): + # Check the case scopes are provided. + options = client_options.ClientOptions(scopes=["1", "2"],) + with mock.patch.object(transport_class, "__init__") as patched: + patched.return_value = None + client = client_class(client_options=options) + patched.assert_called_once_with( + credentials=None, + credentials_file=None, + host=client.DEFAULT_ENDPOINT, + scopes=["1", "2"], + api_mtls_endpoint=client.DEFAULT_ENDPOINT, + client_cert_source=None, + quota_project_id=None, + ) + + +@pytest.mark.parametrize( + "client_class,transport_class,transport_name", + [ + (DataCatalogClient, transports.DataCatalogGrpcTransport, "grpc"), + ( + DataCatalogAsyncClient, + transports.DataCatalogGrpcAsyncIOTransport, + "grpc_asyncio", + ), + ], +) +def test_data_catalog_client_client_options_credentials_file( + client_class, transport_class, transport_name +): + # Check the case credentials file is provided. 
+    options = client_options.ClientOptions(credentials_file="credentials.json")
+    with mock.patch.object(transport_class, "__init__") as patched:
+        patched.return_value = None
+        client = client_class(client_options=options)
+        patched.assert_called_once_with(
+            credentials=None,
+            credentials_file="credentials.json",
+            host=client.DEFAULT_ENDPOINT,
+            scopes=None,
+            api_mtls_endpoint=client.DEFAULT_ENDPOINT,
+            client_cert_source=None,
+            quota_project_id=None,
+        )
+
+
+def test_data_catalog_client_client_options_from_dict():
+    with mock.patch(
+        "google.cloud.datacatalog_v1beta1.services.data_catalog.transports.DataCatalogGrpcTransport.__init__"
+    ) as grpc_transport:
+        grpc_transport.return_value = None
+        client = DataCatalogClient(client_options={"api_endpoint": "squid.clam.whelk"})
+        grpc_transport.assert_called_once_with(
+            credentials=None,
+            credentials_file=None,
+            host="squid.clam.whelk",
+            scopes=None,
+            api_mtls_endpoint="squid.clam.whelk",
+            client_cert_source=None,
+            quota_project_id=None,
+        )
+
+
+def test_search_catalog(
+    transport: str = "grpc", request_type=datacatalog.SearchCatalogRequest
+):
+    client = DataCatalogClient(
+        credentials=credentials.AnonymousCredentials(), transport=transport,
+    )
+
+    # Everything is optional in proto3 as far as the runtime is concerned,
+    # and we are mocking out the actual API, so just send an empty request.
+    request = request_type()
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(type(client._transport.search_catalog), "__call__") as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = datacatalog.SearchCatalogResponse(
+            next_page_token="next_page_token_value",
+        )
+
+        response = client.search_catalog(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls) == 1
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0] == datacatalog.SearchCatalogRequest()
+
+    # Establish that the response is the type that we expect.
+    assert isinstance(response, pagers.SearchCatalogPager)
+
+    assert response.next_page_token == "next_page_token_value"
+
+
+def test_search_catalog_from_dict():
+    test_search_catalog(request_type=dict)
+
+
+@pytest.mark.asyncio
+async def test_search_catalog_async(transport: str = "grpc_asyncio"):
+    client = DataCatalogAsyncClient(
+        credentials=credentials.AnonymousCredentials(), transport=transport,
+    )
+
+    # Everything is optional in proto3 as far as the runtime is concerned,
+    # and we are mocking out the actual API, so just send an empty request.
+    request = datacatalog.SearchCatalogRequest()
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.search_catalog), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(
+            datacatalog.SearchCatalogResponse(next_page_token="next_page_token_value",)
+        )
+
+        response = await client.search_catalog(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0] == request
+
+    # Establish that the response is the type that we expect.
+    assert isinstance(response, pagers.SearchCatalogAsyncPager)
+
+    assert response.next_page_token == "next_page_token_value"
+
+
+def test_search_catalog_flattened():
+    client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(type(client._transport.search_catalog), "__call__") as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = datacatalog.SearchCatalogResponse()
+
+        # Call the method with a truthy value for each flattened field,
+        # using the keyword arguments to the method.
+        client.search_catalog(
+            scope=datacatalog.SearchCatalogRequest.Scope(
+                include_org_ids=["include_org_ids_value"]
+            ),
+            query="query_value",
+        )
+
+        # Establish that the underlying call was made with the expected
+        # request object values.
+        assert len(call.mock_calls) == 1
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0].scope == datacatalog.SearchCatalogRequest.Scope(
+            include_org_ids=["include_org_ids_value"]
+        )
+
+        assert args[0].query == "query_value"
+
+
+def test_search_catalog_flattened_error():
+    client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Attempting to call a method with both a request object and flattened
+    # fields is an error.
+    with pytest.raises(ValueError):
+        client.search_catalog(
+            datacatalog.SearchCatalogRequest(),
+            scope=datacatalog.SearchCatalogRequest.Scope(
+                include_org_ids=["include_org_ids_value"]
+            ),
+            query="query_value",
+        )
+
+
+@pytest.mark.asyncio
+async def test_search_catalog_flattened_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.search_catalog), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(
+            datacatalog.SearchCatalogResponse()
+        )
+        # Call the method with a truthy value for each flattened field,
+        # using the keyword arguments to the method.
+        response = await client.search_catalog(
+            scope=datacatalog.SearchCatalogRequest.Scope(
+                include_org_ids=["include_org_ids_value"]
+            ),
+            query="query_value",
+        )
+
+        # Establish that the underlying call was made with the expected
+        # request object values.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0].scope == datacatalog.SearchCatalogRequest.Scope(
+            include_org_ids=["include_org_ids_value"]
+        )
+
+        assert args[0].query == "query_value"
+
+
+@pytest.mark.asyncio
+async def test_search_catalog_flattened_error_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Attempting to call a method with both a request object and flattened
+    # fields is an error.
+    with pytest.raises(ValueError):
+        await client.search_catalog(
+            datacatalog.SearchCatalogRequest(),
+            scope=datacatalog.SearchCatalogRequest.Scope(
+                include_org_ids=["include_org_ids_value"]
+            ),
+            query="query_value",
+        )
+
+
+def test_search_catalog_pager():
+    client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(type(client._transport.search_catalog), "__call__") as call:
+        # Set the response to a series of pages.
+        call.side_effect = (
+            datacatalog.SearchCatalogResponse(
+                results=[
+                    search.SearchCatalogResult(),
+                    search.SearchCatalogResult(),
+                    search.SearchCatalogResult(),
+                ],
+                next_page_token="abc",
+            ),
+            datacatalog.SearchCatalogResponse(results=[], next_page_token="def",),
+            datacatalog.SearchCatalogResponse(
+                results=[search.SearchCatalogResult(),], next_page_token="ghi",
+            ),
+            datacatalog.SearchCatalogResponse(
+                results=[search.SearchCatalogResult(), search.SearchCatalogResult(),],
+            ),
+            RuntimeError,
+        )
+
+        metadata = ()
+        pager = client.search_catalog(request={})
+
+        assert pager._metadata == metadata
+
+        results = [i for i in pager]
+        assert len(results) == 6
+        assert all(isinstance(i, search.SearchCatalogResult) for i in results)
+
+
+def test_search_catalog_pages():
+    client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(type(client._transport.search_catalog), "__call__") as call:
+        # Set the response to a series of pages.
+        call.side_effect = (
+            datacatalog.SearchCatalogResponse(
+                results=[
+                    search.SearchCatalogResult(),
+                    search.SearchCatalogResult(),
+                    search.SearchCatalogResult(),
+                ],
+                next_page_token="abc",
+            ),
+            datacatalog.SearchCatalogResponse(results=[], next_page_token="def",),
+            datacatalog.SearchCatalogResponse(
+                results=[search.SearchCatalogResult(),], next_page_token="ghi",
+            ),
+            datacatalog.SearchCatalogResponse(
+                results=[search.SearchCatalogResult(), search.SearchCatalogResult(),],
+            ),
+            RuntimeError,
+        )
+        pages = list(client.search_catalog(request={}).pages)
+        for page, token in zip(pages, ["abc", "def", "ghi", ""]):
+            assert page.raw_page.next_page_token == token
+
+
+@pytest.mark.asyncio
+async def test_search_catalog_async_pager():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.search_catalog),
+        "__call__",
+        new_callable=mock.AsyncMock,
+    ) as call:
+        # Set the response to a series of pages.
+        call.side_effect = (
+            datacatalog.SearchCatalogResponse(
+                results=[
+                    search.SearchCatalogResult(),
+                    search.SearchCatalogResult(),
+                    search.SearchCatalogResult(),
+                ],
+                next_page_token="abc",
+            ),
+            datacatalog.SearchCatalogResponse(results=[], next_page_token="def",),
+            datacatalog.SearchCatalogResponse(
+                results=[search.SearchCatalogResult(),], next_page_token="ghi",
+            ),
+            datacatalog.SearchCatalogResponse(
+                results=[search.SearchCatalogResult(), search.SearchCatalogResult(),],
+            ),
+            RuntimeError,
+        )
+        async_pager = await client.search_catalog(request={},)
+        assert async_pager.next_page_token == "abc"
+        responses = []
+        async for response in async_pager:
+            responses.append(response)
+
+        assert len(responses) == 6
+        assert all(isinstance(i, search.SearchCatalogResult) for i in responses)
+
+
+@pytest.mark.asyncio
+async def test_search_catalog_async_pages():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.search_catalog),
+        "__call__",
+        new_callable=mock.AsyncMock,
+    ) as call:
+        # Set the response to a series of pages.
+        call.side_effect = (
+            datacatalog.SearchCatalogResponse(
+                results=[
+                    search.SearchCatalogResult(),
+                    search.SearchCatalogResult(),
+                    search.SearchCatalogResult(),
+                ],
+                next_page_token="abc",
+            ),
+            datacatalog.SearchCatalogResponse(results=[], next_page_token="def",),
+            datacatalog.SearchCatalogResponse(
+                results=[search.SearchCatalogResult(),], next_page_token="ghi",
+            ),
+            datacatalog.SearchCatalogResponse(
+                results=[search.SearchCatalogResult(), search.SearchCatalogResult(),],
+            ),
+            RuntimeError,
+        )
+        pages = []
+        async for page in (await client.search_catalog(request={})).pages:
+            pages.append(page)
+        for page, token in zip(pages, ["abc", "def", "ghi", ""]):
+            assert page.raw_page.next_page_token == token
+
+
+def test_create_entry_group(
+    transport: str = "grpc", request_type=datacatalog.CreateEntryGroupRequest
+):
+    client = DataCatalogClient(
+        credentials=credentials.AnonymousCredentials(), transport=transport,
+    )
+
+    # Everything is optional in proto3 as far as the runtime is concerned,
+    # and we are mocking out the actual API, so just send an empty request.
+    request = request_type()
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._transport.create_entry_group), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = datacatalog.EntryGroup(
+            name="name_value",
+            display_name="display_name_value",
+            description="description_value",
+        )
+
+        response = client.create_entry_group(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls) == 1
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0] == datacatalog.CreateEntryGroupRequest()
+
+    # Establish that the response is the type that we expect.
+    assert isinstance(response, datacatalog.EntryGroup)
+
+    assert response.name == "name_value"
+
+    assert response.display_name == "display_name_value"
+
+    assert response.description == "description_value"
+
+
+def test_create_entry_group_from_dict():
+    test_create_entry_group(request_type=dict)
+
+
+@pytest.mark.asyncio
+async def test_create_entry_group_async(transport: str = "grpc_asyncio"):
+    client = DataCatalogAsyncClient(
+        credentials=credentials.AnonymousCredentials(), transport=transport,
+    )
+
+    # Everything is optional in proto3 as far as the runtime is concerned,
+    # and we are mocking out the actual API, so just send an empty request.
+    request = datacatalog.CreateEntryGroupRequest()
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.create_entry_group), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(
+            datacatalog.EntryGroup(
+                name="name_value",
+                display_name="display_name_value",
+                description="description_value",
+            )
+        )
+
+        response = await client.create_entry_group(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0] == request
+
+    # Establish that the response is the type that we expect.
+    assert isinstance(response, datacatalog.EntryGroup)
+
+    assert response.name == "name_value"
+
+    assert response.display_name == "display_name_value"
+
+    assert response.description == "description_value"
+
+
+def test_create_entry_group_field_headers():
+    client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Any value that is part of the HTTP/1.1 URI should be sent as
+    # a field header. Set these to a non-empty value.
+    request = datacatalog.CreateEntryGroupRequest()
+    request.parent = "parent/value"
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._transport.create_entry_group), "__call__"
+    ) as call:
+        call.return_value = datacatalog.EntryGroup()
+
+        client.create_entry_group(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls) == 1
+        _, args, _ = call.mock_calls[0]
+        assert args[0] == request
+
+    # Establish that the field header was sent.
+    _, _, kw = call.mock_calls[0]
+    assert ("x-goog-request-params", "parent=parent/value",) in kw["metadata"]
+
+
+@pytest.mark.asyncio
+async def test_create_entry_group_field_headers_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Any value that is part of the HTTP/1.1 URI should be sent as
+    # a field header. Set these to a non-empty value.
+    request = datacatalog.CreateEntryGroupRequest()
+    request.parent = "parent/value"
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.create_entry_group), "__call__"
+    ) as call:
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(
+            datacatalog.EntryGroup()
+        )
+
+        await client.create_entry_group(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+        assert args[0] == request
+
+    # Establish that the field header was sent.
+    _, _, kw = call.mock_calls[0]
+    assert ("x-goog-request-params", "parent=parent/value",) in kw["metadata"]
+
+
+def test_create_entry_group_flattened():
+    client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._transport.create_entry_group), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = datacatalog.EntryGroup()
+
+        # Call the method with a truthy value for each flattened field,
+        # using the keyword arguments to the method.
+        client.create_entry_group(
+            parent="parent_value",
+            entry_group_id="entry_group_id_value",
+            entry_group=datacatalog.EntryGroup(name="name_value"),
+        )
+
+        # Establish that the underlying call was made with the expected
+        # request object values.
+        assert len(call.mock_calls) == 1
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0].parent == "parent_value"
+
+        assert args[0].entry_group_id == "entry_group_id_value"
+
+        assert args[0].entry_group == datacatalog.EntryGroup(name="name_value")
+
+
+def test_create_entry_group_flattened_error():
+    client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Attempting to call a method with both a request object and flattened
+    # fields is an error.
+    with pytest.raises(ValueError):
+        client.create_entry_group(
+            datacatalog.CreateEntryGroupRequest(),
+            parent="parent_value",
+            entry_group_id="entry_group_id_value",
+            entry_group=datacatalog.EntryGroup(name="name_value"),
+        )
+
+
+@pytest.mark.asyncio
+async def test_create_entry_group_flattened_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.create_entry_group), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(
+            datacatalog.EntryGroup()
+        )
+        # Call the method with a truthy value for each flattened field,
+        # using the keyword arguments to the method.
+        response = await client.create_entry_group(
+            parent="parent_value",
+            entry_group_id="entry_group_id_value",
+            entry_group=datacatalog.EntryGroup(name="name_value"),
+        )
+
+        # Establish that the underlying call was made with the expected
+        # request object values.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0].parent == "parent_value"
+
+        assert args[0].entry_group_id == "entry_group_id_value"
+
+        assert args[0].entry_group == datacatalog.EntryGroup(name="name_value")
+
+
+@pytest.mark.asyncio
+async def test_create_entry_group_flattened_error_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Attempting to call a method with both a request object and flattened
+    # fields is an error.
+    with pytest.raises(ValueError):
+        await client.create_entry_group(
+            datacatalog.CreateEntryGroupRequest(),
+            parent="parent_value",
+            entry_group_id="entry_group_id_value",
+            entry_group=datacatalog.EntryGroup(name="name_value"),
+        )
+
+
+def test_update_entry_group(
+    transport: str = "grpc", request_type=datacatalog.UpdateEntryGroupRequest
+):
+    client = DataCatalogClient(
+        credentials=credentials.AnonymousCredentials(), transport=transport,
+    )
+
+    # Everything is optional in proto3 as far as the runtime is concerned,
+    # and we are mocking out the actual API, so just send an empty request.
+    request = request_type()
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._transport.update_entry_group), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = datacatalog.EntryGroup(
+            name="name_value",
+            display_name="display_name_value",
+            description="description_value",
+        )
+
+        response = client.update_entry_group(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls) == 1
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0] == datacatalog.UpdateEntryGroupRequest()
+
+    # Establish that the response is the type that we expect.
+    assert isinstance(response, datacatalog.EntryGroup)
+
+    assert response.name == "name_value"
+
+    assert response.display_name == "display_name_value"
+
+    assert response.description == "description_value"
+
+
+def test_update_entry_group_from_dict():
+    test_update_entry_group(request_type=dict)
+
+
+@pytest.mark.asyncio
+async def test_update_entry_group_async(transport: str = "grpc_asyncio"):
+    client = DataCatalogAsyncClient(
+        credentials=credentials.AnonymousCredentials(), transport=transport,
+    )
+
+    # Everything is optional in proto3 as far as the runtime is concerned,
+    # and we are mocking out the actual API, so just send an empty request.
+    request = datacatalog.UpdateEntryGroupRequest()
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.update_entry_group), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(
+            datacatalog.EntryGroup(
+                name="name_value",
+                display_name="display_name_value",
+                description="description_value",
+            )
+        )
+
+        response = await client.update_entry_group(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0] == request
+
+    # Establish that the response is the type that we expect.
+    assert isinstance(response, datacatalog.EntryGroup)
+
+    assert response.name == "name_value"
+
+    assert response.display_name == "display_name_value"
+
+    assert response.description == "description_value"
+
+
+def test_update_entry_group_field_headers():
+    client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Any value that is part of the HTTP/1.1 URI should be sent as
+    # a field header. Set these to a non-empty value.
+    request = datacatalog.UpdateEntryGroupRequest()
+    request.entry_group.name = "entry_group.name/value"
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._transport.update_entry_group), "__call__"
+    ) as call:
+        call.return_value = datacatalog.EntryGroup()
+
+        client.update_entry_group(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls) == 1
+        _, args, _ = call.mock_calls[0]
+        assert args[0] == request
+
+    # Establish that the field header was sent.
+    _, _, kw = call.mock_calls[0]
+    assert ("x-goog-request-params", "entry_group.name=entry_group.name/value",) in kw[
+        "metadata"
+    ]
+
+
+@pytest.mark.asyncio
+async def test_update_entry_group_field_headers_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Any value that is part of the HTTP/1.1 URI should be sent as
+    # a field header. Set these to a non-empty value.
+    request = datacatalog.UpdateEntryGroupRequest()
+    request.entry_group.name = "entry_group.name/value"
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.update_entry_group), "__call__"
+    ) as call:
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(
+            datacatalog.EntryGroup()
+        )
+
+        await client.update_entry_group(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+        assert args[0] == request
+
+    # Establish that the field header was sent.
+    _, _, kw = call.mock_calls[0]
+    assert ("x-goog-request-params", "entry_group.name=entry_group.name/value",) in kw[
+        "metadata"
+    ]
+
+
+def test_update_entry_group_flattened():
+    client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._transport.update_entry_group), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = datacatalog.EntryGroup()
+
+        # Call the method with a truthy value for each flattened field,
+        # using the keyword arguments to the method.
+        client.update_entry_group(
+            entry_group=datacatalog.EntryGroup(name="name_value"),
+            update_mask=field_mask.FieldMask(paths=["paths_value"]),
+        )
+
+        # Establish that the underlying call was made with the expected
+        # request object values.
+        assert len(call.mock_calls) == 1
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0].entry_group == datacatalog.EntryGroup(name="name_value")
+
+        assert args[0].update_mask == field_mask.FieldMask(paths=["paths_value"])
+
+
+def test_update_entry_group_flattened_error():
+    client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Attempting to call a method with both a request object and flattened
+    # fields is an error.
+    with pytest.raises(ValueError):
+        client.update_entry_group(
+            datacatalog.UpdateEntryGroupRequest(),
+            entry_group=datacatalog.EntryGroup(name="name_value"),
+            update_mask=field_mask.FieldMask(paths=["paths_value"]),
+        )
+
+
+@pytest.mark.asyncio
+async def test_update_entry_group_flattened_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.update_entry_group), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(
+            datacatalog.EntryGroup()
+        )
+        # Call the method with a truthy value for each flattened field,
+        # using the keyword arguments to the method.
+        response = await client.update_entry_group(
+            entry_group=datacatalog.EntryGroup(name="name_value"),
+            update_mask=field_mask.FieldMask(paths=["paths_value"]),
+        )
+
+        # Establish that the underlying call was made with the expected
+        # request object values.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0].entry_group == datacatalog.EntryGroup(name="name_value")
+
+        assert args[0].update_mask == field_mask.FieldMask(paths=["paths_value"])
+
+
+@pytest.mark.asyncio
+async def test_update_entry_group_flattened_error_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Attempting to call a method with both a request object and flattened
+    # fields is an error.
+    with pytest.raises(ValueError):
+        await client.update_entry_group(
+            datacatalog.UpdateEntryGroupRequest(),
+            entry_group=datacatalog.EntryGroup(name="name_value"),
+            update_mask=field_mask.FieldMask(paths=["paths_value"]),
+        )
+
+
+def test_get_entry_group(
+    transport: str = "grpc", request_type=datacatalog.GetEntryGroupRequest
+):
+    client = DataCatalogClient(
+        credentials=credentials.AnonymousCredentials(), transport=transport,
+    )
+
+    # Everything is optional in proto3 as far as the runtime is concerned,
+    # and we are mocking out the actual API, so just send an empty request.
+    request = request_type()
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(type(client._transport.get_entry_group), "__call__") as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = datacatalog.EntryGroup(
+            name="name_value",
+            display_name="display_name_value",
+            description="description_value",
+        )
+
+        response = client.get_entry_group(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls) == 1
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0] == datacatalog.GetEntryGroupRequest()
+
+    # Establish that the response is the type that we expect.
+    assert isinstance(response, datacatalog.EntryGroup)
+
+    assert response.name == "name_value"
+
+    assert response.display_name == "display_name_value"
+
+    assert response.description == "description_value"
+
+
+def test_get_entry_group_from_dict():
+    test_get_entry_group(request_type=dict)
+
+
+@pytest.mark.asyncio
+async def test_get_entry_group_async(transport: str = "grpc_asyncio"):
+    client = DataCatalogAsyncClient(
+        credentials=credentials.AnonymousCredentials(), transport=transport,
+    )
+
+    # Everything is optional in proto3 as far as the runtime is concerned,
+    # and we are mocking out the actual API, so just send an empty request.
+    request = datacatalog.GetEntryGroupRequest()
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.get_entry_group), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(
+            datacatalog.EntryGroup(
+                name="name_value",
+                display_name="display_name_value",
+                description="description_value",
+            )
+        )
+
+        response = await client.get_entry_group(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0] == request
+
+    # Establish that the response is the type that we expect.
+    assert isinstance(response, datacatalog.EntryGroup)
+
+    assert response.name == "name_value"
+
+    assert response.display_name == "display_name_value"
+
+    assert response.description == "description_value"
+
+
+def test_get_entry_group_field_headers():
+    client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Any value that is part of the HTTP/1.1 URI should be sent as
+    # a field header. Set these to a non-empty value.
+    request = datacatalog.GetEntryGroupRequest()
+    request.name = "name/value"
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(type(client._transport.get_entry_group), "__call__") as call:
+        call.return_value = datacatalog.EntryGroup()
+
+        client.get_entry_group(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls) == 1
+        _, args, _ = call.mock_calls[0]
+        assert args[0] == request
+
+    # Establish that the field header was sent.
+    _, _, kw = call.mock_calls[0]
+    assert ("x-goog-request-params", "name=name/value",) in kw["metadata"]
+
+
+@pytest.mark.asyncio
+async def test_get_entry_group_field_headers_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Any value that is part of the HTTP/1.1 URI should be sent as
+    # a field header. Set these to a non-empty value.
+    request = datacatalog.GetEntryGroupRequest()
+    request.name = "name/value"
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.get_entry_group), "__call__"
+    ) as call:
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(
+            datacatalog.EntryGroup()
+        )
+
+        await client.get_entry_group(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+        assert args[0] == request
+
+    # Establish that the field header was sent.
+    _, _, kw = call.mock_calls[0]
+    assert ("x-goog-request-params", "name=name/value",) in kw["metadata"]
+
+
+def test_get_entry_group_flattened():
+    client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(type(client._transport.get_entry_group), "__call__") as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = datacatalog.EntryGroup()
+
+        # Call the method with a truthy value for each flattened field,
+        # using the keyword arguments to the method.
+        client.get_entry_group(
+            name="name_value", read_mask=field_mask.FieldMask(paths=["paths_value"]),
+        )
+
+        # Establish that the underlying call was made with the expected
+        # request object values.
+        assert len(call.mock_calls) == 1
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0].name == "name_value"
+
+        assert args[0].read_mask == field_mask.FieldMask(paths=["paths_value"])
+
+
+def test_get_entry_group_flattened_error():
+    client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Attempting to call a method with both a request object and flattened
+    # fields is an error.
+    with pytest.raises(ValueError):
+        client.get_entry_group(
+            datacatalog.GetEntryGroupRequest(),
+            name="name_value",
+            read_mask=field_mask.FieldMask(paths=["paths_value"]),
+        )
+
+
+@pytest.mark.asyncio
+async def test_get_entry_group_flattened_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.get_entry_group), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(
+            datacatalog.EntryGroup()
+        )
+        # Call the method with a truthy value for each flattened field,
+        # using the keyword arguments to the method.
+        response = await client.get_entry_group(
+            name="name_value", read_mask=field_mask.FieldMask(paths=["paths_value"]),
+        )
+
+        # Establish that the underlying call was made with the expected
+        # request object values.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0].name == "name_value"
+
+        assert args[0].read_mask == field_mask.FieldMask(paths=["paths_value"])
+
+
+@pytest.mark.asyncio
+async def test_get_entry_group_flattened_error_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Attempting to call a method with both a request object and flattened
+    # fields is an error.
+    with pytest.raises(ValueError):
+        await client.get_entry_group(
+            datacatalog.GetEntryGroupRequest(),
+            name="name_value",
+            read_mask=field_mask.FieldMask(paths=["paths_value"]),
+        )
+
+
+def test_delete_entry_group(
+    transport: str = "grpc", request_type=datacatalog.DeleteEntryGroupRequest
+):
+    client = DataCatalogClient(
+        credentials=credentials.AnonymousCredentials(), transport=transport,
+    )
+
+    # Everything is optional in proto3 as far as the runtime is concerned,
+    # and we are mocking out the actual API, so just send an empty request.
+    request = request_type()
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._transport.delete_entry_group), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = None
+
+        response = client.delete_entry_group(request)
+
+        # Establish that the underlying gRPC stub method was called.
+ assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.DeleteEntryGroupRequest() + + # Establish that the response is the type that we expect. + assert response is None + + +def test_delete_entry_group_from_dict(): + test_delete_entry_group(request_type=dict) + + +@pytest.mark.asyncio +async def test_delete_entry_group_async(transport: str = "grpc_asyncio"): + client = DataCatalogAsyncClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = datacatalog.DeleteEntryGroupRequest() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.delete_entry_group), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(None) + + response = await client.delete_entry_group(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0] == request + + # Establish that the response is the type that we expect. + assert response is None + + +def test_delete_entry_group_field_headers(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = datacatalog.DeleteEntryGroupRequest() + request.name = "name/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._transport.delete_entry_group), "__call__" + ) as call: + call.return_value = None + + client.delete_entry_group(request) + + # Establish that the underlying gRPC stub method was called. 
+ assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "name=name/value",) in kw["metadata"] + + +@pytest.mark.asyncio +async def test_delete_entry_group_field_headers_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = datacatalog.DeleteEntryGroupRequest() + request.name = "name/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.delete_entry_group), "__call__" + ) as call: + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(None) + + await client.delete_entry_group(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "name=name/value",) in kw["metadata"] + + +def test_delete_entry_group_flattened(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._transport.delete_entry_group), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = None + + # Call the method with a truthy value for each flattened field, + # using the keyword arguments to the method. + client.delete_entry_group(name="name_value",) + + # Establish that the underlying call was made with the expected + # request object values. 
+        assert len(call.mock_calls) == 1
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0].name == "name_value"
+
+
+def test_delete_entry_group_flattened_error():
+    client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Attempting to call a method with both a request object and flattened
+    # fields is an error.
+    with pytest.raises(ValueError):
+        client.delete_entry_group(
+            datacatalog.DeleteEntryGroupRequest(), name="name_value",
+        )
+
+
+@pytest.mark.asyncio
+async def test_delete_entry_group_flattened_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.delete_entry_group), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(None)
+        # Call the method with a truthy value for each flattened field,
+        # using the keyword arguments to the method.
+        response = await client.delete_entry_group(name="name_value",)
+
+        # Establish that the underlying call was made with the expected
+        # request object values.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0].name == "name_value"
+
+
+@pytest.mark.asyncio
+async def test_delete_entry_group_flattened_error_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Attempting to call a method with both a request object and flattened
+    # fields is an error.
+ with pytest.raises(ValueError): + await client.delete_entry_group( + datacatalog.DeleteEntryGroupRequest(), name="name_value", + ) + + +def test_list_entry_groups( + transport: str = "grpc", request_type=datacatalog.ListEntryGroupsRequest +): + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = request_type() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._transport.list_entry_groups), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = datacatalog.ListEntryGroupsResponse( + next_page_token="next_page_token_value", + ) + + response = client.list_entry_groups(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.ListEntryGroupsRequest() + + # Establish that the response is the type that we expect. + assert isinstance(response, pagers.ListEntryGroupsPager) + + assert response.next_page_token == "next_page_token_value" + + +def test_list_entry_groups_from_dict(): + test_list_entry_groups(request_type=dict) + + +@pytest.mark.asyncio +async def test_list_entry_groups_async(transport: str = "grpc_asyncio"): + client = DataCatalogAsyncClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = datacatalog.ListEntryGroupsRequest() + + # Mock the actual call within the gRPC stub, and fake the request. 
+ with mock.patch.object( + type(client._client._transport.list_entry_groups), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + datacatalog.ListEntryGroupsResponse( + next_page_token="next_page_token_value", + ) + ) + + response = await client.list_entry_groups(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0] == request + + # Establish that the response is the type that we expect. + assert isinstance(response, pagers.ListEntryGroupsAsyncPager) + + assert response.next_page_token == "next_page_token_value" + + +def test_list_entry_groups_field_headers(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = datacatalog.ListEntryGroupsRequest() + request.parent = "parent/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._transport.list_entry_groups), "__call__" + ) as call: + call.return_value = datacatalog.ListEntryGroupsResponse() + + client.list_entry_groups(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "parent=parent/value",) in kw["metadata"] + + +@pytest.mark.asyncio +async def test_list_entry_groups_field_headers_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. 
+ request = datacatalog.ListEntryGroupsRequest() + request.parent = "parent/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.list_entry_groups), "__call__" + ) as call: + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + datacatalog.ListEntryGroupsResponse() + ) + + await client.list_entry_groups(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "parent=parent/value",) in kw["metadata"] + + +def test_list_entry_groups_flattened(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._transport.list_entry_groups), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = datacatalog.ListEntryGroupsResponse() + + # Call the method with a truthy value for each flattened field, + # using the keyword arguments to the method. + client.list_entry_groups(parent="parent_value",) + + # Establish that the underlying call was made with the expected + # request object values. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0].parent == "parent_value" + + +def test_list_entry_groups_flattened_error(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Attempting to call a method with both a request object and flattened + # fields is an error. 
+    with pytest.raises(ValueError):
+        client.list_entry_groups(
+            datacatalog.ListEntryGroupsRequest(), parent="parent_value",
+        )
+
+
+@pytest.mark.asyncio
+async def test_list_entry_groups_flattened_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.list_entry_groups), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(
+            datacatalog.ListEntryGroupsResponse()
+        )
+        # Call the method with a truthy value for each flattened field,
+        # using the keyword arguments to the method.
+        response = await client.list_entry_groups(parent="parent_value",)
+
+        # Establish that the underlying call was made with the expected
+        # request object values.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0].parent == "parent_value"
+
+
+@pytest.mark.asyncio
+async def test_list_entry_groups_flattened_error_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Attempting to call a method with both a request object and flattened
+    # fields is an error.
+    with pytest.raises(ValueError):
+        await client.list_entry_groups(
+            datacatalog.ListEntryGroupsRequest(), parent="parent_value",
+        )
+
+
+def test_list_entry_groups_pager():
+    client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._transport.list_entry_groups), "__call__"
+    ) as call:
+        # Set the response to a series of pages.
+        call.side_effect = (
+            datacatalog.ListEntryGroupsResponse(
+                entry_groups=[
+                    datacatalog.EntryGroup(),
+                    datacatalog.EntryGroup(),
+                    datacatalog.EntryGroup(),
+                ],
+                next_page_token="abc",
+            ),
+            datacatalog.ListEntryGroupsResponse(
+                entry_groups=[], next_page_token="def",
+            ),
+            datacatalog.ListEntryGroupsResponse(
+                entry_groups=[datacatalog.EntryGroup(),], next_page_token="ghi",
+            ),
+            datacatalog.ListEntryGroupsResponse(
+                entry_groups=[datacatalog.EntryGroup(), datacatalog.EntryGroup(),],
+            ),
+            RuntimeError,
+        )
+
+        metadata = ()
+        metadata = tuple(metadata) + (
+            gapic_v1.routing_header.to_grpc_metadata((("parent", ""),)),
+        )
+        pager = client.list_entry_groups(request={})
+
+        assert pager._metadata == metadata
+
+        results = [i for i in pager]
+        assert len(results) == 6
+        assert all(isinstance(i, datacatalog.EntryGroup) for i in results)
+
+
+def test_list_entry_groups_pages():
+    client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._transport.list_entry_groups), "__call__"
+    ) as call:
+        # Set the response to a series of pages.
+        call.side_effect = (
+            datacatalog.ListEntryGroupsResponse(
+                entry_groups=[
+                    datacatalog.EntryGroup(),
+                    datacatalog.EntryGroup(),
+                    datacatalog.EntryGroup(),
+                ],
+                next_page_token="abc",
+            ),
+            datacatalog.ListEntryGroupsResponse(
+                entry_groups=[], next_page_token="def",
+            ),
+            datacatalog.ListEntryGroupsResponse(
+                entry_groups=[datacatalog.EntryGroup(),], next_page_token="ghi",
+            ),
+            datacatalog.ListEntryGroupsResponse(
+                entry_groups=[datacatalog.EntryGroup(), datacatalog.EntryGroup(),],
+            ),
+            RuntimeError,
+        )
+        pages = list(client.list_entry_groups(request={}).pages)
+        for page, token in zip(pages, ["abc", "def", "ghi", ""]):
+            assert page.raw_page.next_page_token == token
+
+
+@pytest.mark.asyncio
+async def test_list_entry_groups_async_pager():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.list_entry_groups),
+        "__call__",
+        new_callable=mock.AsyncMock,
+    ) as call:
+        # Set the response to a series of pages.
+        call.side_effect = (
+            datacatalog.ListEntryGroupsResponse(
+                entry_groups=[
+                    datacatalog.EntryGroup(),
+                    datacatalog.EntryGroup(),
+                    datacatalog.EntryGroup(),
+                ],
+                next_page_token="abc",
+            ),
+            datacatalog.ListEntryGroupsResponse(
+                entry_groups=[], next_page_token="def",
+            ),
+            datacatalog.ListEntryGroupsResponse(
+                entry_groups=[datacatalog.EntryGroup(),], next_page_token="ghi",
+            ),
+            datacatalog.ListEntryGroupsResponse(
+                entry_groups=[datacatalog.EntryGroup(), datacatalog.EntryGroup(),],
+            ),
+            RuntimeError,
+        )
+        async_pager = await client.list_entry_groups(request={},)
+        assert async_pager.next_page_token == "abc"
+        responses = []
+        async for response in async_pager:
+            responses.append(response)
+
+        assert len(responses) == 6
+        assert all(isinstance(i, datacatalog.EntryGroup) for i in responses)
+
+
+@pytest.mark.asyncio
+async def test_list_entry_groups_async_pages():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.list_entry_groups),
+        "__call__",
+        new_callable=mock.AsyncMock,
+    ) as call:
+        # Set the response to a series of pages.
+ call.side_effect = ( + datacatalog.ListEntryGroupsResponse( + entry_groups=[ + datacatalog.EntryGroup(), + datacatalog.EntryGroup(), + datacatalog.EntryGroup(), + ], + next_page_token="abc", + ), + datacatalog.ListEntryGroupsResponse( + entry_groups=[], next_page_token="def", + ), + datacatalog.ListEntryGroupsResponse( + entry_groups=[datacatalog.EntryGroup(),], next_page_token="ghi", + ), + datacatalog.ListEntryGroupsResponse( + entry_groups=[datacatalog.EntryGroup(), datacatalog.EntryGroup(),], + ), + RuntimeError, + ) + pages = [] + async for page in (await client.list_entry_groups(request={})).pages: + pages.append(page) + for page, token in zip(pages, ["abc", "def", "ghi", ""]): + assert page.raw_page.next_page_token == token + + +def test_create_entry( + transport: str = "grpc", request_type=datacatalog.CreateEntryRequest +): + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = request_type() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.create_entry), "__call__") as call: + # Designate an appropriate return value for the call. + call.return_value = datacatalog.Entry( + name="name_value", + linked_resource="linked_resource_value", + display_name="display_name_value", + description="description_value", + type=datacatalog.EntryType.TABLE, + integrated_system=common.IntegratedSystem.BIGQUERY, + gcs_fileset_spec=gcs_fileset_spec.GcsFilesetSpec( + file_patterns=["file_patterns_value"] + ), + ) + + response = client.create_entry(request) + + # Establish that the underlying gRPC stub method was called. 
+ assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.CreateEntryRequest() + + # Establish that the response is the type that we expect. + assert isinstance(response, datacatalog.Entry) + + assert response.name == "name_value" + + assert response.linked_resource == "linked_resource_value" + + assert response.display_name == "display_name_value" + + assert response.description == "description_value" + + +def test_create_entry_from_dict(): + test_create_entry(request_type=dict) + + +@pytest.mark.asyncio +async def test_create_entry_async(transport: str = "grpc_asyncio"): + client = DataCatalogAsyncClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = datacatalog.CreateEntryRequest() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.create_entry), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + datacatalog.Entry( + name="name_value", + linked_resource="linked_resource_value", + display_name="display_name_value", + description="description_value", + ) + ) + + response = await client.create_entry(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0] == request + + # Establish that the response is the type that we expect. 
+ assert isinstance(response, datacatalog.Entry) + + assert response.name == "name_value" + + assert response.linked_resource == "linked_resource_value" + + assert response.display_name == "display_name_value" + + assert response.description == "description_value" + + +def test_create_entry_field_headers(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = datacatalog.CreateEntryRequest() + request.parent = "parent/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.create_entry), "__call__") as call: + call.return_value = datacatalog.Entry() + + client.create_entry(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "parent=parent/value",) in kw["metadata"] + + +@pytest.mark.asyncio +async def test_create_entry_field_headers_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = datacatalog.CreateEntryRequest() + request.parent = "parent/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.create_entry), "__call__" + ) as call: + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(datacatalog.Entry()) + + await client.create_entry(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. 
+ _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "parent=parent/value",) in kw["metadata"] + + +def test_create_entry_flattened(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.create_entry), "__call__") as call: + # Designate an appropriate return value for the call. + call.return_value = datacatalog.Entry() + + # Call the method with a truthy value for each flattened field, + # using the keyword arguments to the method. + client.create_entry( + parent="parent_value", + entry_id="entry_id_value", + entry=datacatalog.Entry(name="name_value"), + ) + + # Establish that the underlying call was made with the expected + # request object values. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0].parent == "parent_value" + + assert args[0].entry_id == "entry_id_value" + + assert args[0].entry == datacatalog.Entry(name="name_value") + + +def test_create_entry_flattened_error(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Attempting to call a method with both a request object and flattened + # fields is an error. + with pytest.raises(ValueError): + client.create_entry( + datacatalog.CreateEntryRequest(), + parent="parent_value", + entry_id="entry_id_value", + entry=datacatalog.Entry(name="name_value"), + ) + + +@pytest.mark.asyncio +async def test_create_entry_flattened_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.create_entry), "__call__" + ) as call: + # Designate an appropriate return value for the call. 
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(datacatalog.Entry())
+        # Call the method with a truthy value for each flattened field,
+        # using the keyword arguments to the method.
+        response = await client.create_entry(
+            parent="parent_value",
+            entry_id="entry_id_value",
+            entry=datacatalog.Entry(name="name_value"),
+        )
+
+        # Establish that the underlying call was made with the expected
+        # request object values.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0].parent == "parent_value"
+
+        assert args[0].entry_id == "entry_id_value"
+
+        assert args[0].entry == datacatalog.Entry(name="name_value")
+
+
+@pytest.mark.asyncio
+async def test_create_entry_flattened_error_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Attempting to call a method with both a request object and flattened
+    # fields is an error.
+    with pytest.raises(ValueError):
+        await client.create_entry(
+            datacatalog.CreateEntryRequest(),
+            parent="parent_value",
+            entry_id="entry_id_value",
+            entry=datacatalog.Entry(name="name_value"),
+        )
+
+
+def test_update_entry(
+    transport: str = "grpc", request_type=datacatalog.UpdateEntryRequest
+):
+    client = DataCatalogClient(
+        credentials=credentials.AnonymousCredentials(), transport=transport,
+    )
+
+    # Everything is optional in proto3 as far as the runtime is concerned,
+    # and we are mocking out the actual API, so just send an empty request.
+    request = request_type()
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(type(client._transport.update_entry), "__call__") as call:
+        # Designate an appropriate return value for the call.
+ call.return_value = datacatalog.Entry( + name="name_value", + linked_resource="linked_resource_value", + display_name="display_name_value", + description="description_value", + type=datacatalog.EntryType.TABLE, + integrated_system=common.IntegratedSystem.BIGQUERY, + gcs_fileset_spec=gcs_fileset_spec.GcsFilesetSpec( + file_patterns=["file_patterns_value"] + ), + ) + + response = client.update_entry(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.UpdateEntryRequest() + + # Establish that the response is the type that we expect. + assert isinstance(response, datacatalog.Entry) + + assert response.name == "name_value" + + assert response.linked_resource == "linked_resource_value" + + assert response.display_name == "display_name_value" + + assert response.description == "description_value" + + +def test_update_entry_from_dict(): + test_update_entry(request_type=dict) + + +@pytest.mark.asyncio +async def test_update_entry_async(transport: str = "grpc_asyncio"): + client = DataCatalogAsyncClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = datacatalog.UpdateEntryRequest() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.update_entry), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + datacatalog.Entry( + name="name_value", + linked_resource="linked_resource_value", + display_name="display_name_value", + description="description_value", + ) + ) + + response = await client.update_entry(request) + + # Establish that the underlying gRPC stub method was called. 
+ assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0] == request + + # Establish that the response is the type that we expect. + assert isinstance(response, datacatalog.Entry) + + assert response.name == "name_value" + + assert response.linked_resource == "linked_resource_value" + + assert response.display_name == "display_name_value" + + assert response.description == "description_value" + + +def test_update_entry_field_headers(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = datacatalog.UpdateEntryRequest() + request.entry.name = "entry.name/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.update_entry), "__call__") as call: + call.return_value = datacatalog.Entry() + + client.update_entry(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "entry.name=entry.name/value",) in kw["metadata"] + + +@pytest.mark.asyncio +async def test_update_entry_field_headers_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = datacatalog.UpdateEntryRequest() + request.entry.name = "entry.name/value" + + # Mock the actual call within the gRPC stub, and fake the request. 
+ with mock.patch.object( + type(client._client._transport.update_entry), "__call__" + ) as call: + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(datacatalog.Entry()) + + await client.update_entry(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "entry.name=entry.name/value",) in kw["metadata"] + + +def test_update_entry_flattened(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.update_entry), "__call__") as call: + # Designate an appropriate return value for the call. + call.return_value = datacatalog.Entry() + + # Call the method with a truthy value for each flattened field, + # using the keyword arguments to the method. + client.update_entry( + entry=datacatalog.Entry(name="name_value"), + update_mask=field_mask.FieldMask(paths=["paths_value"]), + ) + + # Establish that the underlying call was made with the expected + # request object values. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0].entry == datacatalog.Entry(name="name_value") + + assert args[0].update_mask == field_mask.FieldMask(paths=["paths_value"]) + + +def test_update_entry_flattened_error(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Attempting to call a method with both a request object and flattened + # fields is an error. 
+    with pytest.raises(ValueError):
+        client.update_entry(
+            datacatalog.UpdateEntryRequest(),
+            entry=datacatalog.Entry(name="name_value"),
+            update_mask=field_mask.FieldMask(paths=["paths_value"]),
+        )
+
+
+@pytest.mark.asyncio
+async def test_update_entry_flattened_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.update_entry), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(datacatalog.Entry())
+
+        # Call the method with a truthy value for each flattened field,
+        # using the keyword arguments to the method.
+        response = await client.update_entry(
+            entry=datacatalog.Entry(name="name_value"),
+            update_mask=field_mask.FieldMask(paths=["paths_value"]),
+        )
+
+        # Establish that the underlying call was made with the expected
+        # request object values.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0].entry == datacatalog.Entry(name="name_value")
+
+        assert args[0].update_mask == field_mask.FieldMask(paths=["paths_value"])
+
+
+@pytest.mark.asyncio
+async def test_update_entry_flattened_error_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Attempting to call a method with both a request object and flattened
+    # fields is an error.
+    with pytest.raises(ValueError):
+        await client.update_entry(
+            datacatalog.UpdateEntryRequest(),
+            entry=datacatalog.Entry(name="name_value"),
+            update_mask=field_mask.FieldMask(paths=["paths_value"]),
+        )
+
+
+def test_delete_entry(
+    transport: str = "grpc", request_type=datacatalog.DeleteEntryRequest
+):
+    client = DataCatalogClient(
+        credentials=credentials.AnonymousCredentials(), transport=transport,
+    )
+
+    # Everything is optional in proto3 as far as the runtime is concerned,
+    # and we are mocking out the actual API, so just send an empty request.
+    request = request_type()
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(type(client._transport.delete_entry), "__call__") as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = None
+
+        response = client.delete_entry(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls) == 1
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0] == datacatalog.DeleteEntryRequest()
+
+    # Establish that the response is the type that we expect.
+    assert response is None
+
+
+def test_delete_entry_from_dict():
+    test_delete_entry(request_type=dict)
+
+
+@pytest.mark.asyncio
+async def test_delete_entry_async(transport: str = "grpc_asyncio"):
+    client = DataCatalogAsyncClient(
+        credentials=credentials.AnonymousCredentials(), transport=transport,
+    )
+
+    # Everything is optional in proto3 as far as the runtime is concerned,
+    # and we are mocking out the actual API, so just send an empty request.
+    request = datacatalog.DeleteEntryRequest()
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.delete_entry), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(None)
+
+        response = await client.delete_entry(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0] == request
+
+    # Establish that the response is the type that we expect.
+    assert response is None
+
+
+def test_delete_entry_field_headers():
+    client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Any value that is part of the HTTP/1.1 URI should be sent as
+    # a field header. Set these to a non-empty value.
+    request = datacatalog.DeleteEntryRequest()
+    request.name = "name/value"
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(type(client._transport.delete_entry), "__call__") as call:
+        call.return_value = None
+
+        client.delete_entry(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls) == 1
+        _, args, _ = call.mock_calls[0]
+        assert args[0] == request
+
+    # Establish that the field header was sent.
+    _, _, kw = call.mock_calls[0]
+    assert ("x-goog-request-params", "name=name/value",) in kw["metadata"]
+
+
+@pytest.mark.asyncio
+async def test_delete_entry_field_headers_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Any value that is part of the HTTP/1.1 URI should be sent as
+    # a field header. Set these to a non-empty value.
+    request = datacatalog.DeleteEntryRequest()
+    request.name = "name/value"
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.delete_entry), "__call__"
+    ) as call:
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(None)
+
+        await client.delete_entry(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+        assert args[0] == request
+
+    # Establish that the field header was sent.
+    _, _, kw = call.mock_calls[0]
+    assert ("x-goog-request-params", "name=name/value",) in kw["metadata"]
+
+
+def test_delete_entry_flattened():
+    client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(type(client._transport.delete_entry), "__call__") as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = None
+
+        # Call the method with a truthy value for each flattened field,
+        # using the keyword arguments to the method.
+        client.delete_entry(name="name_value",)
+
+        # Establish that the underlying call was made with the expected
+        # request object values.
+        assert len(call.mock_calls) == 1
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0].name == "name_value"
+
+
+def test_delete_entry_flattened_error():
+    client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Attempting to call a method with both a request object and flattened
+    # fields is an error.
+    with pytest.raises(ValueError):
+        client.delete_entry(
+            datacatalog.DeleteEntryRequest(), name="name_value",
+        )
+
+
+@pytest.mark.asyncio
+async def test_delete_entry_flattened_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.delete_entry), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(None)
+
+        # Call the method with a truthy value for each flattened field,
+        # using the keyword arguments to the method.
+        response = await client.delete_entry(name="name_value",)
+
+        # Establish that the underlying call was made with the expected
+        # request object values.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0].name == "name_value"
+
+
+@pytest.mark.asyncio
+async def test_delete_entry_flattened_error_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Attempting to call a method with both a request object and flattened
+    # fields is an error.
+    with pytest.raises(ValueError):
+        await client.delete_entry(
+            datacatalog.DeleteEntryRequest(), name="name_value",
+        )
+
+
+def test_get_entry(transport: str = "grpc", request_type=datacatalog.GetEntryRequest):
+    client = DataCatalogClient(
+        credentials=credentials.AnonymousCredentials(), transport=transport,
+    )
+
+    # Everything is optional in proto3 as far as the runtime is concerned,
+    # and we are mocking out the actual API, so just send an empty request.
+    request = request_type()
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(type(client._transport.get_entry), "__call__") as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = datacatalog.Entry(
+            name="name_value",
+            linked_resource="linked_resource_value",
+            display_name="display_name_value",
+            description="description_value",
+            type=datacatalog.EntryType.TABLE,
+            integrated_system=common.IntegratedSystem.BIGQUERY,
+            gcs_fileset_spec=gcs_fileset_spec.GcsFilesetSpec(
+                file_patterns=["file_patterns_value"]
+            ),
+        )
+
+        response = client.get_entry(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls) == 1
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0] == datacatalog.GetEntryRequest()
+
+    # Establish that the response is the type that we expect.
+    assert isinstance(response, datacatalog.Entry)
+
+    assert response.name == "name_value"
+
+    assert response.linked_resource == "linked_resource_value"
+
+    assert response.display_name == "display_name_value"
+
+    assert response.description == "description_value"
+
+
+def test_get_entry_from_dict():
+    test_get_entry(request_type=dict)
+
+
+@pytest.mark.asyncio
+async def test_get_entry_async(transport: str = "grpc_asyncio"):
+    client = DataCatalogAsyncClient(
+        credentials=credentials.AnonymousCredentials(), transport=transport,
+    )
+
+    # Everything is optional in proto3 as far as the runtime is concerned,
+    # and we are mocking out the actual API, so just send an empty request.
+    request = datacatalog.GetEntryRequest()
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.get_entry), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(
+            datacatalog.Entry(
+                name="name_value",
+                linked_resource="linked_resource_value",
+                display_name="display_name_value",
+                description="description_value",
+            )
+        )
+
+        response = await client.get_entry(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0] == request
+
+    # Establish that the response is the type that we expect.
+    assert isinstance(response, datacatalog.Entry)
+
+    assert response.name == "name_value"
+
+    assert response.linked_resource == "linked_resource_value"
+
+    assert response.display_name == "display_name_value"
+
+    assert response.description == "description_value"
+
+
+def test_get_entry_field_headers():
+    client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Any value that is part of the HTTP/1.1 URI should be sent as
+    # a field header. Set these to a non-empty value.
+    request = datacatalog.GetEntryRequest()
+    request.name = "name/value"
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(type(client._transport.get_entry), "__call__") as call:
+        call.return_value = datacatalog.Entry()
+
+        client.get_entry(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls) == 1
+        _, args, _ = call.mock_calls[0]
+        assert args[0] == request
+
+    # Establish that the field header was sent.
+    _, _, kw = call.mock_calls[0]
+    assert ("x-goog-request-params", "name=name/value",) in kw["metadata"]
+
+
+@pytest.mark.asyncio
+async def test_get_entry_field_headers_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Any value that is part of the HTTP/1.1 URI should be sent as
+    # a field header. Set these to a non-empty value.
+    request = datacatalog.GetEntryRequest()
+    request.name = "name/value"
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.get_entry), "__call__"
+    ) as call:
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(datacatalog.Entry())
+
+        await client.get_entry(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+        assert args[0] == request
+
+    # Establish that the field header was sent.
+    _, _, kw = call.mock_calls[0]
+    assert ("x-goog-request-params", "name=name/value",) in kw["metadata"]
+
+
+def test_get_entry_flattened():
+    client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(type(client._transport.get_entry), "__call__") as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = datacatalog.Entry()
+
+        # Call the method with a truthy value for each flattened field,
+        # using the keyword arguments to the method.
+        client.get_entry(name="name_value",)
+
+        # Establish that the underlying call was made with the expected
+        # request object values.
+        assert len(call.mock_calls) == 1
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0].name == "name_value"
+
+
+def test_get_entry_flattened_error():
+    client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Attempting to call a method with both a request object and flattened
+    # fields is an error.
+    with pytest.raises(ValueError):
+        client.get_entry(
+            datacatalog.GetEntryRequest(), name="name_value",
+        )
+
+
+@pytest.mark.asyncio
+async def test_get_entry_flattened_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.get_entry), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(datacatalog.Entry())
+
+        # Call the method with a truthy value for each flattened field,
+        # using the keyword arguments to the method.
+        response = await client.get_entry(name="name_value",)
+
+        # Establish that the underlying call was made with the expected
+        # request object values.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0].name == "name_value"
+
+
+@pytest.mark.asyncio
+async def test_get_entry_flattened_error_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Attempting to call a method with both a request object and flattened
+    # fields is an error.
+    with pytest.raises(ValueError):
+        await client.get_entry(
+            datacatalog.GetEntryRequest(), name="name_value",
+        )
+
+
+def test_lookup_entry(
+    transport: str = "grpc", request_type=datacatalog.LookupEntryRequest
+):
+    client = DataCatalogClient(
+        credentials=credentials.AnonymousCredentials(), transport=transport,
+    )
+
+    # Everything is optional in proto3 as far as the runtime is concerned,
+    # and we are mocking out the actual API, so just send an empty request.
+    request = request_type()
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(type(client._transport.lookup_entry), "__call__") as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = datacatalog.Entry(
+            name="name_value",
+            linked_resource="linked_resource_value",
+            display_name="display_name_value",
+            description="description_value",
+            type=datacatalog.EntryType.TABLE,
+            integrated_system=common.IntegratedSystem.BIGQUERY,
+            gcs_fileset_spec=gcs_fileset_spec.GcsFilesetSpec(
+                file_patterns=["file_patterns_value"]
+            ),
+        )
+
+        response = client.lookup_entry(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls) == 1
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0] == datacatalog.LookupEntryRequest()
+
+    # Establish that the response is the type that we expect.
+    assert isinstance(response, datacatalog.Entry)
+
+    assert response.name == "name_value"
+
+    assert response.linked_resource == "linked_resource_value"
+
+    assert response.display_name == "display_name_value"
+
+    assert response.description == "description_value"
+
+
+def test_lookup_entry_from_dict():
+    test_lookup_entry(request_type=dict)
+
+
+@pytest.mark.asyncio
+async def test_lookup_entry_async(transport: str = "grpc_asyncio"):
+    client = DataCatalogAsyncClient(
+        credentials=credentials.AnonymousCredentials(), transport=transport,
+    )
+
+    # Everything is optional in proto3 as far as the runtime is concerned,
+    # and we are mocking out the actual API, so just send an empty request.
+    request = datacatalog.LookupEntryRequest()
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.lookup_entry), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(
+            datacatalog.Entry(
+                name="name_value",
+                linked_resource="linked_resource_value",
+                display_name="display_name_value",
+                description="description_value",
+            )
+        )
+
+        response = await client.lookup_entry(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0] == request
+
+    # Establish that the response is the type that we expect.
+    assert isinstance(response, datacatalog.Entry)
+
+    assert response.name == "name_value"
+
+    assert response.linked_resource == "linked_resource_value"
+
+    assert response.display_name == "display_name_value"
+
+    assert response.description == "description_value"
+
+
+def test_list_entries(
+    transport: str = "grpc", request_type=datacatalog.ListEntriesRequest
+):
+    client = DataCatalogClient(
+        credentials=credentials.AnonymousCredentials(), transport=transport,
+    )
+
+    # Everything is optional in proto3 as far as the runtime is concerned,
+    # and we are mocking out the actual API, so just send an empty request.
+    request = request_type()
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(type(client._transport.list_entries), "__call__") as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = datacatalog.ListEntriesResponse(
+            next_page_token="next_page_token_value",
+        )
+
+        response = client.list_entries(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls) == 1
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0] == datacatalog.ListEntriesRequest()
+
+    # Establish that the response is the type that we expect.
+    assert isinstance(response, pagers.ListEntriesPager)
+
+    assert response.next_page_token == "next_page_token_value"
+
+
+def test_list_entries_from_dict():
+    test_list_entries(request_type=dict)
+
+
+@pytest.mark.asyncio
+async def test_list_entries_async(transport: str = "grpc_asyncio"):
+    client = DataCatalogAsyncClient(
+        credentials=credentials.AnonymousCredentials(), transport=transport,
+    )
+
+    # Everything is optional in proto3 as far as the runtime is concerned,
+    # and we are mocking out the actual API, so just send an empty request.
+    request = datacatalog.ListEntriesRequest()
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.list_entries), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(
+            datacatalog.ListEntriesResponse(next_page_token="next_page_token_value",)
+        )
+
+        response = await client.list_entries(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0] == request
+
+    # Establish that the response is the type that we expect.
+    assert isinstance(response, pagers.ListEntriesAsyncPager)
+
+    assert response.next_page_token == "next_page_token_value"
+
+
+def test_list_entries_field_headers():
+    client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Any value that is part of the HTTP/1.1 URI should be sent as
+    # a field header. Set these to a non-empty value.
+    request = datacatalog.ListEntriesRequest()
+    request.parent = "parent/value"
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(type(client._transport.list_entries), "__call__") as call:
+        call.return_value = datacatalog.ListEntriesResponse()
+
+        client.list_entries(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls) == 1
+        _, args, _ = call.mock_calls[0]
+        assert args[0] == request
+
+    # Establish that the field header was sent.
+    _, _, kw = call.mock_calls[0]
+    assert ("x-goog-request-params", "parent=parent/value",) in kw["metadata"]
+
+
+@pytest.mark.asyncio
+async def test_list_entries_field_headers_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Any value that is part of the HTTP/1.1 URI should be sent as
+    # a field header. Set these to a non-empty value.
+    request = datacatalog.ListEntriesRequest()
+    request.parent = "parent/value"
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.list_entries), "__call__"
+    ) as call:
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(
+            datacatalog.ListEntriesResponse()
+        )
+
+        await client.list_entries(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+        assert args[0] == request
+
+    # Establish that the field header was sent.
+    _, _, kw = call.mock_calls[0]
+    assert ("x-goog-request-params", "parent=parent/value",) in kw["metadata"]
+
+
+def test_list_entries_flattened():
+    client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(type(client._transport.list_entries), "__call__") as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = datacatalog.ListEntriesResponse()
+
+        # Call the method with a truthy value for each flattened field,
+        # using the keyword arguments to the method.
+        client.list_entries(parent="parent_value",)
+
+        # Establish that the underlying call was made with the expected
+        # request object values.
+        assert len(call.mock_calls) == 1
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0].parent == "parent_value"
+
+
+def test_list_entries_flattened_error():
+    client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Attempting to call a method with both a request object and flattened
+    # fields is an error.
+    with pytest.raises(ValueError):
+        client.list_entries(
+            datacatalog.ListEntriesRequest(), parent="parent_value",
+        )
+
+
+@pytest.mark.asyncio
+async def test_list_entries_flattened_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.list_entries), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(
+            datacatalog.ListEntriesResponse()
+        )
+
+        # Call the method with a truthy value for each flattened field,
+        # using the keyword arguments to the method.
+        response = await client.list_entries(parent="parent_value",)
+
+        # Establish that the underlying call was made with the expected
+        # request object values.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0].parent == "parent_value"
+
+
+@pytest.mark.asyncio
+async def test_list_entries_flattened_error_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Attempting to call a method with both a request object and flattened
+    # fields is an error.
+    with pytest.raises(ValueError):
+        await client.list_entries(
+            datacatalog.ListEntriesRequest(), parent="parent_value",
+        )
+
+
+def test_list_entries_pager():
+    client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(type(client._transport.list_entries), "__call__") as call:
+        # Set the response to a series of pages.
+        call.side_effect = (
+            datacatalog.ListEntriesResponse(
+                entries=[
+                    datacatalog.Entry(),
+                    datacatalog.Entry(),
+                    datacatalog.Entry(),
+                ],
+                next_page_token="abc",
+            ),
+            datacatalog.ListEntriesResponse(entries=[], next_page_token="def",),
+            datacatalog.ListEntriesResponse(
+                entries=[datacatalog.Entry(),], next_page_token="ghi",
+            ),
+            datacatalog.ListEntriesResponse(
+                entries=[datacatalog.Entry(), datacatalog.Entry(),],
+            ),
+            RuntimeError,
+        )
+
+        metadata = ()
+        metadata = tuple(metadata) + (
+            gapic_v1.routing_header.to_grpc_metadata((("parent", ""),)),
+        )
+        pager = client.list_entries(request={})
+
+        assert pager._metadata == metadata
+
+        results = [i for i in pager]
+        assert len(results) == 6
+        assert all(isinstance(i, datacatalog.Entry) for i in results)
+
+
+def test_list_entries_pages():
+    client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(type(client._transport.list_entries), "__call__") as call:
+        # Set the response to a series of pages.
+        call.side_effect = (
+            datacatalog.ListEntriesResponse(
+                entries=[
+                    datacatalog.Entry(),
+                    datacatalog.Entry(),
+                    datacatalog.Entry(),
+                ],
+                next_page_token="abc",
+            ),
+            datacatalog.ListEntriesResponse(entries=[], next_page_token="def",),
+            datacatalog.ListEntriesResponse(
+                entries=[datacatalog.Entry(),], next_page_token="ghi",
+            ),
+            datacatalog.ListEntriesResponse(
+                entries=[datacatalog.Entry(), datacatalog.Entry(),],
+            ),
+            RuntimeError,
+        )
+        pages = list(client.list_entries(request={}).pages)
+        for page, token in zip(pages, ["abc", "def", "ghi", ""]):
+            assert page.raw_page.next_page_token == token
+
+
+@pytest.mark.asyncio
+async def test_list_entries_async_pager():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.list_entries),
+        "__call__",
+        new_callable=mock.AsyncMock,
+    ) as call:
+        # Set the response to a series of pages.
+        call.side_effect = (
+            datacatalog.ListEntriesResponse(
+                entries=[
+                    datacatalog.Entry(),
+                    datacatalog.Entry(),
+                    datacatalog.Entry(),
+                ],
+                next_page_token="abc",
+            ),
+            datacatalog.ListEntriesResponse(entries=[], next_page_token="def",),
+            datacatalog.ListEntriesResponse(
+                entries=[datacatalog.Entry(),], next_page_token="ghi",
+            ),
+            datacatalog.ListEntriesResponse(
+                entries=[datacatalog.Entry(), datacatalog.Entry(),],
+            ),
+            RuntimeError,
+        )
+        async_pager = await client.list_entries(request={},)
+        assert async_pager.next_page_token == "abc"
+        responses = []
+        async for response in async_pager:
+            responses.append(response)
+
+        assert len(responses) == 6
+        assert all(isinstance(i, datacatalog.Entry) for i in responses)
+
+
+@pytest.mark.asyncio
+async def test_list_entries_async_pages():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.list_entries),
+        "__call__",
+        new_callable=mock.AsyncMock,
+    ) as call:
+        # Set the response to a series of pages.
+        call.side_effect = (
+            datacatalog.ListEntriesResponse(
+                entries=[
+                    datacatalog.Entry(),
+                    datacatalog.Entry(),
+                    datacatalog.Entry(),
+                ],
+                next_page_token="abc",
+            ),
+            datacatalog.ListEntriesResponse(entries=[], next_page_token="def",),
+            datacatalog.ListEntriesResponse(
+                entries=[datacatalog.Entry(),], next_page_token="ghi",
+            ),
+            datacatalog.ListEntriesResponse(
+                entries=[datacatalog.Entry(), datacatalog.Entry(),],
+            ),
+            RuntimeError,
+        )
+        pages = []
+        async for page in (await client.list_entries(request={})).pages:
+            pages.append(page)
+        for page, token in zip(pages, ["abc", "def", "ghi", ""]):
+            assert page.raw_page.next_page_token == token
+
+
+def test_create_tag_template(
+    transport: str = "grpc", request_type=datacatalog.CreateTagTemplateRequest
+):
+    client = DataCatalogClient(
+        credentials=credentials.AnonymousCredentials(), transport=transport,
+    )
+
+    # Everything is optional in proto3 as far as the runtime is concerned,
+    # and we are mocking out the actual API, so just send an empty request.
+    request = request_type()
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._transport.create_tag_template), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = tags.TagTemplate(
+            name="name_value", display_name="display_name_value",
+        )
+
+        response = client.create_tag_template(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls) == 1
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0] == datacatalog.CreateTagTemplateRequest()
+
+    # Establish that the response is the type that we expect.
+    assert isinstance(response, tags.TagTemplate)
+
+    assert response.name == "name_value"
+
+    assert response.display_name == "display_name_value"
+
+
+def test_create_tag_template_from_dict():
+    test_create_tag_template(request_type=dict)
+
+
+@pytest.mark.asyncio
+async def test_create_tag_template_async(transport: str = "grpc_asyncio"):
+    client = DataCatalogAsyncClient(
+        credentials=credentials.AnonymousCredentials(), transport=transport,
+    )
+
+    # Everything is optional in proto3 as far as the runtime is concerned,
+    # and we are mocking out the actual API, so just send an empty request.
+    request = datacatalog.CreateTagTemplateRequest()
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.create_tag_template), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(
+            tags.TagTemplate(name="name_value", display_name="display_name_value",)
+        )
+
+        response = await client.create_tag_template(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0] == request
+
+    # Establish that the response is the type that we expect.
+    assert isinstance(response, tags.TagTemplate)
+
+    assert response.name == "name_value"
+
+    assert response.display_name == "display_name_value"
+
+
+def test_create_tag_template_field_headers():
+    client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Any value that is part of the HTTP/1.1 URI should be sent as
+    # a field header. Set these to a non-empty value.
+    request = datacatalog.CreateTagTemplateRequest()
+    request.parent = "parent/value"
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._transport.create_tag_template), "__call__"
+    ) as call:
+        call.return_value = tags.TagTemplate()
+
+        client.create_tag_template(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls) == 1
+        _, args, _ = call.mock_calls[0]
+        assert args[0] == request
+
+    # Establish that the field header was sent.
+    _, _, kw = call.mock_calls[0]
+    assert ("x-goog-request-params", "parent=parent/value",) in kw["metadata"]
+
+
+@pytest.mark.asyncio
+async def test_create_tag_template_field_headers_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Any value that is part of the HTTP/1.1 URI should be sent as
+    # a field header. Set these to a non-empty value.
+    request = datacatalog.CreateTagTemplateRequest()
+    request.parent = "parent/value"
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.create_tag_template), "__call__"
+    ) as call:
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(tags.TagTemplate())
+
+        await client.create_tag_template(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+        assert args[0] == request
+
+    # Establish that the field header was sent.
+    _, _, kw = call.mock_calls[0]
+    assert ("x-goog-request-params", "parent=parent/value",) in kw["metadata"]
+
+
+def test_create_tag_template_flattened():
+    client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._transport.create_tag_template), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = tags.TagTemplate()
+
+        # Call the method with a truthy value for each flattened field,
+        # using the keyword arguments to the method.
+        client.create_tag_template(
+            parent="parent_value",
+            tag_template_id="tag_template_id_value",
+            tag_template=tags.TagTemplate(name="name_value"),
+        )
+
+        # Establish that the underlying call was made with the expected
+        # request object values.
+        assert len(call.mock_calls) == 1
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0].parent == "parent_value"
+
+        assert args[0].tag_template_id == "tag_template_id_value"
+
+        assert args[0].tag_template == tags.TagTemplate(name="name_value")
+
+
+def test_create_tag_template_flattened_error():
+    client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Attempting to call a method with both a request object and flattened
+    # fields is an error.
+    with pytest.raises(ValueError):
+        client.create_tag_template(
+            datacatalog.CreateTagTemplateRequest(),
+            parent="parent_value",
+            tag_template_id="tag_template_id_value",
+            tag_template=tags.TagTemplate(name="name_value"),
+        )
+
+
+@pytest.mark.asyncio
+async def test_create_tag_template_flattened_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.create_tag_template), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(tags.TagTemplate())
+
+        # Call the method with a truthy value for each flattened field,
+        # using the keyword arguments to the method.
+        response = await client.create_tag_template(
+            parent="parent_value",
+            tag_template_id="tag_template_id_value",
+            tag_template=tags.TagTemplate(name="name_value"),
+        )
+
+        # Establish that the underlying call was made with the expected
+        # request object values.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0].parent == "parent_value"
+
+        assert args[0].tag_template_id == "tag_template_id_value"
+
+        assert args[0].tag_template == tags.TagTemplate(name="name_value")
+
+
+@pytest.mark.asyncio
+async def test_create_tag_template_flattened_error_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Attempting to call a method with both a request object and flattened
+    # fields is an error.
+    with pytest.raises(ValueError):
+        await client.create_tag_template(
+            datacatalog.CreateTagTemplateRequest(),
+            parent="parent_value",
+            tag_template_id="tag_template_id_value",
+            tag_template=tags.TagTemplate(name="name_value"),
+        )
+
+
+def test_get_tag_template(
+    transport: str = "grpc", request_type=datacatalog.GetTagTemplateRequest
+):
+    client = DataCatalogClient(
+        credentials=credentials.AnonymousCredentials(), transport=transport,
+    )
+
+    # Everything is optional in proto3 as far as the runtime is concerned,
+    # and we are mocking out the actual API, so just send an empty request.
+    request = request_type()
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._transport.get_tag_template), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = tags.TagTemplate(
+            name="name_value", display_name="display_name_value",
+        )
+
+        response = client.get_tag_template(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls) == 1
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0] == datacatalog.GetTagTemplateRequest()
+
+    # Establish that the response is the type that we expect.
+    assert isinstance(response, tags.TagTemplate)
+
+    assert response.name == "name_value"
+
+    assert response.display_name == "display_name_value"
+
+
+def test_get_tag_template_from_dict():
+    test_get_tag_template(request_type=dict)
+
+
+@pytest.mark.asyncio
+async def test_get_tag_template_async(transport: str = "grpc_asyncio"):
+    client = DataCatalogAsyncClient(
+        credentials=credentials.AnonymousCredentials(), transport=transport,
+    )
+
+    # Everything is optional in proto3 as far as the runtime is concerned,
+    # and we are mocking out the actual API, so just send an empty request.
+    request = datacatalog.GetTagTemplateRequest()
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.get_tag_template), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(
+            tags.TagTemplate(name="name_value", display_name="display_name_value",)
+        )
+
+        response = await client.get_tag_template(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0] == request
+
+    # Establish that the response is the type that we expect.
+    assert isinstance(response, tags.TagTemplate)
+
+    assert response.name == "name_value"
+
+    assert response.display_name == "display_name_value"
+
+
+def test_get_tag_template_field_headers():
+    client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Any value that is part of the HTTP/1.1 URI should be sent as
+    # a field header. Set these to a non-empty value.
+    request = datacatalog.GetTagTemplateRequest()
+    request.name = "name/value"
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._transport.get_tag_template), "__call__"
+    ) as call:
+        call.return_value = tags.TagTemplate()
+
+        client.get_tag_template(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls) == 1
+        _, args, _ = call.mock_calls[0]
+        assert args[0] == request
+
+    # Establish that the field header was sent.
+    _, _, kw = call.mock_calls[0]
+    assert ("x-goog-request-params", "name=name/value",) in kw["metadata"]
+
+
+@pytest.mark.asyncio
+async def test_get_tag_template_field_headers_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Any value that is part of the HTTP/1.1 URI should be sent as
+    # a field header. Set these to a non-empty value.
+    request = datacatalog.GetTagTemplateRequest()
+    request.name = "name/value"
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.get_tag_template), "__call__"
+    ) as call:
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(tags.TagTemplate())
+
+        await client.get_tag_template(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+        assert args[0] == request
+
+    # Establish that the field header was sent.
+    _, _, kw = call.mock_calls[0]
+    assert ("x-goog-request-params", "name=name/value",) in kw["metadata"]
+
+
+def test_get_tag_template_flattened():
+    client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._transport.get_tag_template), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = tags.TagTemplate()
+
+        # Call the method with a truthy value for each flattened field,
+        # using the keyword arguments to the method.
+        client.get_tag_template(name="name_value",)
+
+        # Establish that the underlying call was made with the expected
+        # request object values.
+        assert len(call.mock_calls) == 1
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0].name == "name_value"
+
+
+def test_get_tag_template_flattened_error():
+    client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Attempting to call a method with both a request object and flattened
+    # fields is an error.
+    with pytest.raises(ValueError):
+        client.get_tag_template(
+            datacatalog.GetTagTemplateRequest(), name="name_value",
+        )
+
+
+@pytest.mark.asyncio
+async def test_get_tag_template_flattened_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.get_tag_template), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(tags.TagTemplate())
+
+        # Call the method with a truthy value for each flattened field,
+        # using the keyword arguments to the method.
+        response = await client.get_tag_template(name="name_value",)
+
+        # Establish that the underlying call was made with the expected
+        # request object values.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0].name == "name_value"
+
+
+@pytest.mark.asyncio
+async def test_get_tag_template_flattened_error_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Attempting to call a method with both a request object and flattened
+    # fields is an error.
+    with pytest.raises(ValueError):
+        await client.get_tag_template(
+            datacatalog.GetTagTemplateRequest(), name="name_value",
+        )
+
+
+def test_update_tag_template(
+    transport: str = "grpc", request_type=datacatalog.UpdateTagTemplateRequest
+):
+    client = DataCatalogClient(
+        credentials=credentials.AnonymousCredentials(), transport=transport,
+    )
+
+    # Everything is optional in proto3 as far as the runtime is concerned,
+    # and we are mocking out the actual API, so just send an empty request.
+    request = request_type()
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._transport.update_tag_template), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = tags.TagTemplate(
+            name="name_value", display_name="display_name_value",
+        )
+
+        response = client.update_tag_template(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls) == 1
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0] == datacatalog.UpdateTagTemplateRequest()
+
+    # Establish that the response is the type that we expect.
+    assert isinstance(response, tags.TagTemplate)
+
+    assert response.name == "name_value"
+
+    assert response.display_name == "display_name_value"
+
+
+def test_update_tag_template_from_dict():
+    test_update_tag_template(request_type=dict)
+
+
+@pytest.mark.asyncio
+async def test_update_tag_template_async(transport: str = "grpc_asyncio"):
+    client = DataCatalogAsyncClient(
+        credentials=credentials.AnonymousCredentials(), transport=transport,
+    )
+
+    # Everything is optional in proto3 as far as the runtime is concerned,
+    # and we are mocking out the actual API, so just send an empty request.
+    request = datacatalog.UpdateTagTemplateRequest()
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.update_tag_template), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(
+            tags.TagTemplate(name="name_value", display_name="display_name_value",)
+        )
+
+        response = await client.update_tag_template(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0] == request
+
+    # Establish that the response is the type that we expect.
+    assert isinstance(response, tags.TagTemplate)
+
+    assert response.name == "name_value"
+
+    assert response.display_name == "display_name_value"
+
+
+def test_update_tag_template_field_headers():
+    client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Any value that is part of the HTTP/1.1 URI should be sent as
+    # a field header. Set these to a non-empty value.
+    request = datacatalog.UpdateTagTemplateRequest()
+    request.tag_template.name = "tag_template.name/value"
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._transport.update_tag_template), "__call__"
+    ) as call:
+        call.return_value = tags.TagTemplate()
+
+        client.update_tag_template(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls) == 1
+        _, args, _ = call.mock_calls[0]
+        assert args[0] == request
+
+    # Establish that the field header was sent.
+    _, _, kw = call.mock_calls[0]
+    assert (
+        "x-goog-request-params",
+        "tag_template.name=tag_template.name/value",
+    ) in kw["metadata"]
+
+
+@pytest.mark.asyncio
+async def test_update_tag_template_field_headers_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Any value that is part of the HTTP/1.1 URI should be sent as
+    # a field header. Set these to a non-empty value.
+    request = datacatalog.UpdateTagTemplateRequest()
+    request.tag_template.name = "tag_template.name/value"
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.update_tag_template), "__call__"
+    ) as call:
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(tags.TagTemplate())
+
+        await client.update_tag_template(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+        assert args[0] == request
+
+    # Establish that the field header was sent.
+    _, _, kw = call.mock_calls[0]
+    assert (
+        "x-goog-request-params",
+        "tag_template.name=tag_template.name/value",
+    ) in kw["metadata"]
+
+
+def test_update_tag_template_flattened():
+    client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._transport.update_tag_template), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = tags.TagTemplate()
+
+        # Call the method with a truthy value for each flattened field,
+        # using the keyword arguments to the method.
+        client.update_tag_template(
+            tag_template=tags.TagTemplate(name="name_value"),
+            update_mask=field_mask.FieldMask(paths=["paths_value"]),
+        )
+
+        # Establish that the underlying call was made with the expected
+        # request object values.
+        assert len(call.mock_calls) == 1
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0].tag_template == tags.TagTemplate(name="name_value")
+
+        assert args[0].update_mask == field_mask.FieldMask(paths=["paths_value"])
+
+
+def test_update_tag_template_flattened_error():
+    client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Attempting to call a method with both a request object and flattened
+    # fields is an error.
+    with pytest.raises(ValueError):
+        client.update_tag_template(
+            datacatalog.UpdateTagTemplateRequest(),
+            tag_template=tags.TagTemplate(name="name_value"),
+            update_mask=field_mask.FieldMask(paths=["paths_value"]),
+        )
+
+
+@pytest.mark.asyncio
+async def test_update_tag_template_flattened_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.update_tag_template), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(tags.TagTemplate())
+
+        # Call the method with a truthy value for each flattened field,
+        # using the keyword arguments to the method.
+        response = await client.update_tag_template(
+            tag_template=tags.TagTemplate(name="name_value"),
+            update_mask=field_mask.FieldMask(paths=["paths_value"]),
+        )
+
+        # Establish that the underlying call was made with the expected
+        # request object values.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0].tag_template == tags.TagTemplate(name="name_value")
+
+        assert args[0].update_mask == field_mask.FieldMask(paths=["paths_value"])
+
+
+@pytest.mark.asyncio
+async def test_update_tag_template_flattened_error_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Attempting to call a method with both a request object and flattened
+    # fields is an error.
+    with pytest.raises(ValueError):
+        await client.update_tag_template(
+            datacatalog.UpdateTagTemplateRequest(),
+            tag_template=tags.TagTemplate(name="name_value"),
+            update_mask=field_mask.FieldMask(paths=["paths_value"]),
+        )
+
+
+def test_delete_tag_template(
+    transport: str = "grpc", request_type=datacatalog.DeleteTagTemplateRequest
+):
+    client = DataCatalogClient(
+        credentials=credentials.AnonymousCredentials(), transport=transport,
+    )
+
+    # Everything is optional in proto3 as far as the runtime is concerned,
+    # and we are mocking out the actual API, so just send an empty request.
+    request = request_type()
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._transport.delete_tag_template), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = None
+
+        response = client.delete_tag_template(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls) == 1
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0] == datacatalog.DeleteTagTemplateRequest()
+
+    # Establish that the response is the type that we expect.
+    assert response is None
+
+
+def test_delete_tag_template_from_dict():
+    test_delete_tag_template(request_type=dict)
+
+
+@pytest.mark.asyncio
+async def test_delete_tag_template_async(transport: str = "grpc_asyncio"):
+    client = DataCatalogAsyncClient(
+        credentials=credentials.AnonymousCredentials(), transport=transport,
+    )
+
+    # Everything is optional in proto3 as far as the runtime is concerned,
+    # and we are mocking out the actual API, so just send an empty request.
+    request = datacatalog.DeleteTagTemplateRequest()
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.delete_tag_template), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(None)
+
+        response = await client.delete_tag_template(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0] == request
+
+    # Establish that the response is the type that we expect.
+    assert response is None
+
+
+def test_delete_tag_template_field_headers():
+    client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Any value that is part of the HTTP/1.1 URI should be sent as
+    # a field header. Set these to a non-empty value.
+    request = datacatalog.DeleteTagTemplateRequest()
+    request.name = "name/value"
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._transport.delete_tag_template), "__call__"
+    ) as call:
+        call.return_value = None
+
+        client.delete_tag_template(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls) == 1
+        _, args, _ = call.mock_calls[0]
+        assert args[0] == request
+
+    # Establish that the field header was sent.
+    _, _, kw = call.mock_calls[0]
+    assert ("x-goog-request-params", "name=name/value",) in kw["metadata"]
+
+
+@pytest.mark.asyncio
+async def test_delete_tag_template_field_headers_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Any value that is part of the HTTP/1.1 URI should be sent as
+    # a field header. Set these to a non-empty value.
+    request = datacatalog.DeleteTagTemplateRequest()
+    request.name = "name/value"
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.delete_tag_template), "__call__"
+    ) as call:
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(None)
+
+        await client.delete_tag_template(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+        assert args[0] == request
+
+    # Establish that the field header was sent.
+    _, _, kw = call.mock_calls[0]
+    assert ("x-goog-request-params", "name=name/value",) in kw["metadata"]
+
+
+def test_delete_tag_template_flattened():
+    client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._transport.delete_tag_template), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = None
+
+        # Call the method with a truthy value for each flattened field,
+        # using the keyword arguments to the method.
+        client.delete_tag_template(
+            name="name_value", force=True,
+        )
+
+        # Establish that the underlying call was made with the expected
+        # request object values.
+        assert len(call.mock_calls) == 1
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0].name == "name_value"
+
+        assert args[0].force is True
+
+
+def test_delete_tag_template_flattened_error():
+    client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Attempting to call a method with both a request object and flattened
+    # fields is an error.
+    with pytest.raises(ValueError):
+        client.delete_tag_template(
+            datacatalog.DeleteTagTemplateRequest(), name="name_value", force=True,
+        )
+
+
+@pytest.mark.asyncio
+async def test_delete_tag_template_flattened_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.delete_tag_template), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(None)
+
+        # Call the method with a truthy value for each flattened field,
+        # using the keyword arguments to the method.
+        response = await client.delete_tag_template(name="name_value", force=True,)
+
+        # Establish that the underlying call was made with the expected
+        # request object values.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0].name == "name_value"
+
+        assert args[0].force is True
+
+
+@pytest.mark.asyncio
+async def test_delete_tag_template_flattened_error_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Attempting to call a method with both a request object and flattened
+    # fields is an error.
+    with pytest.raises(ValueError):
+        await client.delete_tag_template(
+            datacatalog.DeleteTagTemplateRequest(), name="name_value", force=True,
+        )
+
+
+def test_create_tag_template_field(
+    transport: str = "grpc", request_type=datacatalog.CreateTagTemplateFieldRequest
+):
+    client = DataCatalogClient(
+        credentials=credentials.AnonymousCredentials(), transport=transport,
+    )
+
+    # Everything is optional in proto3 as far as the runtime is concerned,
+    # and we are mocking out the actual API, so just send an empty request.
+    request = request_type()
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._transport.create_tag_template_field), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = tags.TagTemplateField(
+            name="name_value",
+            display_name="display_name_value",
+            is_required=True,
+            order=540,
+        )
+
+        response = client.create_tag_template_field(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls) == 1
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0] == datacatalog.CreateTagTemplateFieldRequest()
+
+    # Establish that the response is the type that we expect.
+    assert isinstance(response, tags.TagTemplateField)
+
+    assert response.name == "name_value"
+
+    assert response.display_name == "display_name_value"
+
+    assert response.is_required is True
+
+    assert response.order == 540
+
+
+def test_create_tag_template_field_from_dict():
+    test_create_tag_template_field(request_type=dict)
+
+
+@pytest.mark.asyncio
+async def test_create_tag_template_field_async(transport: str = "grpc_asyncio"):
+    client = DataCatalogAsyncClient(
+        credentials=credentials.AnonymousCredentials(), transport=transport,
+    )
+
+    # Everything is optional in proto3 as far as the runtime is concerned,
+    # and we are mocking out the actual API, so just send an empty request.
+    request = datacatalog.CreateTagTemplateFieldRequest()
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.create_tag_template_field), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(
+            tags.TagTemplateField(
+                name="name_value",
+                display_name="display_name_value",
+                is_required=True,
+                order=540,
+            )
+        )
+
+        response = await client.create_tag_template_field(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0] == request
+
+    # Establish that the response is the type that we expect.
+    assert isinstance(response, tags.TagTemplateField)
+
+    assert response.name == "name_value"
+
+    assert response.display_name == "display_name_value"
+
+    assert response.is_required is True
+
+    assert response.order == 540
+
+
+def test_create_tag_template_field_field_headers():
+    client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Any value that is part of the HTTP/1.1 URI should be sent as
+    # a field header. Set these to a non-empty value.
+    request = datacatalog.CreateTagTemplateFieldRequest()
+    request.parent = "parent/value"
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._transport.create_tag_template_field), "__call__"
+    ) as call:
+        call.return_value = tags.TagTemplateField()
+
+        client.create_tag_template_field(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls) == 1
+        _, args, _ = call.mock_calls[0]
+        assert args[0] == request
+
+    # Establish that the field header was sent.
+    _, _, kw = call.mock_calls[0]
+    assert ("x-goog-request-params", "parent=parent/value",) in kw["metadata"]
+
+
+@pytest.mark.asyncio
+async def test_create_tag_template_field_field_headers_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Any value that is part of the HTTP/1.1 URI should be sent as
+    # a field header. Set these to a non-empty value.
+    request = datacatalog.CreateTagTemplateFieldRequest()
+    request.parent = "parent/value"
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.create_tag_template_field), "__call__"
+    ) as call:
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(
+            tags.TagTemplateField()
+        )
+
+        await client.create_tag_template_field(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+        assert args[0] == request
+
+    # Establish that the field header was sent.
+    _, _, kw = call.mock_calls[0]
+    assert ("x-goog-request-params", "parent=parent/value",) in kw["metadata"]
+
+
+def test_create_tag_template_field_flattened():
+    client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._transport.create_tag_template_field), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = tags.TagTemplateField()
+
+        # Call the method with a truthy value for each flattened field,
+        # using the keyword arguments to the method.
+        client.create_tag_template_field(
+            parent="parent_value",
+            tag_template_field_id="tag_template_field_id_value",
+            tag_template_field=tags.TagTemplateField(name="name_value"),
+        )
+
+        # Establish that the underlying call was made with the expected
+        # request object values.
+        assert len(call.mock_calls) == 1
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0].parent == "parent_value"
+
+        assert args[0].tag_template_field_id == "tag_template_field_id_value"
+
+        assert args[0].tag_template_field == tags.TagTemplateField(name="name_value")
+
+
+def test_create_tag_template_field_flattened_error():
+    client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Attempting to call a method with both a request object and flattened
+    # fields is an error.
+    with pytest.raises(ValueError):
+        client.create_tag_template_field(
+            datacatalog.CreateTagTemplateFieldRequest(),
+            parent="parent_value",
+            tag_template_field_id="tag_template_field_id_value",
+            tag_template_field=tags.TagTemplateField(name="name_value"),
+        )
+
+
+@pytest.mark.asyncio
+async def test_create_tag_template_field_flattened_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.create_tag_template_field), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(
+            tags.TagTemplateField()
+        )
+
+        # Call the method with a truthy value for each flattened field,
+        # using the keyword arguments to the method.
+        response = await client.create_tag_template_field(
+            parent="parent_value",
+            tag_template_field_id="tag_template_field_id_value",
+            tag_template_field=tags.TagTemplateField(name="name_value"),
+        )
+
+        # Establish that the underlying call was made with the expected
+        # request object values.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0].parent == "parent_value"
+
+        assert args[0].tag_template_field_id == "tag_template_field_id_value"
+
+        assert args[0].tag_template_field == tags.TagTemplateField(name="name_value")
+
+
+@pytest.mark.asyncio
+async def test_create_tag_template_field_flattened_error_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Attempting to call a method with both a request object and flattened
+    # fields is an error.
+    with pytest.raises(ValueError):
+        await client.create_tag_template_field(
+            datacatalog.CreateTagTemplateFieldRequest(),
+            parent="parent_value",
+            tag_template_field_id="tag_template_field_id_value",
+            tag_template_field=tags.TagTemplateField(name="name_value"),
+        )
+
+
+def test_update_tag_template_field(
+    transport: str = "grpc", request_type=datacatalog.UpdateTagTemplateFieldRequest
+):
+    client = DataCatalogClient(
+        credentials=credentials.AnonymousCredentials(), transport=transport,
+    )
+
+    # Everything is optional in proto3 as far as the runtime is concerned,
+    # and we are mocking out the actual API, so just send an empty request.
+    request = request_type()
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._transport.update_tag_template_field), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = tags.TagTemplateField(
+            name="name_value",
+            display_name="display_name_value",
+            is_required=True,
+            order=540,
+        )
+
+        response = client.update_tag_template_field(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls) == 1
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0] == datacatalog.UpdateTagTemplateFieldRequest()
+
+    # Establish that the response is the type that we expect.
+ assert isinstance(response, tags.TagTemplateField) + + assert response.name == "name_value" + + assert response.display_name == "display_name_value" + + assert response.is_required is True + + assert response.order == 540 + + +def test_update_tag_template_field_from_dict(): + test_update_tag_template_field(request_type=dict) + + +@pytest.mark.asyncio +async def test_update_tag_template_field_async(transport: str = "grpc_asyncio"): + client = DataCatalogAsyncClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = datacatalog.UpdateTagTemplateFieldRequest() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.update_tag_template_field), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + tags.TagTemplateField( + name="name_value", + display_name="display_name_value", + is_required=True, + order=540, + ) + ) + + response = await client.update_tag_template_field(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0] == request + + # Establish that the response is the type that we expect. + assert isinstance(response, tags.TagTemplateField) + + assert response.name == "name_value" + + assert response.display_name == "display_name_value" + + assert response.is_required is True + + assert response.order == 540 + + +def test_update_tag_template_field_field_headers(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. 
+ request = datacatalog.UpdateTagTemplateFieldRequest() + request.name = "name/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._transport.update_tag_template_field), "__call__" + ) as call: + call.return_value = tags.TagTemplateField() + + client.update_tag_template_field(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "name=name/value",) in kw["metadata"] + + +@pytest.mark.asyncio +async def test_update_tag_template_field_field_headers_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = datacatalog.UpdateTagTemplateFieldRequest() + request.name = "name/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.update_tag_template_field), "__call__" + ) as call: + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + tags.TagTemplateField() + ) + + await client.update_tag_template_field(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "name=name/value",) in kw["metadata"] + + +def test_update_tag_template_field_flattened(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. 
+ with mock.patch.object( + type(client._transport.update_tag_template_field), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = tags.TagTemplateField() + + # Call the method with a truthy value for each flattened field, + # using the keyword arguments to the method. + client.update_tag_template_field( + name="name_value", + tag_template_field=tags.TagTemplateField(name="name_value"), + update_mask=field_mask.FieldMask(paths=["paths_value"]), + ) + + # Establish that the underlying call was made with the expected + # request object values. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0].name == "name_value" + + assert args[0].tag_template_field == tags.TagTemplateField(name="name_value") + + assert args[0].update_mask == field_mask.FieldMask(paths=["paths_value"]) + + +def test_update_tag_template_field_flattened_error(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Attempting to call a method with both a request object and flattened + # fields is an error. + with pytest.raises(ValueError): + client.update_tag_template_field( + datacatalog.UpdateTagTemplateFieldRequest(), + name="name_value", + tag_template_field=tags.TagTemplateField(name="name_value"), + update_mask=field_mask.FieldMask(paths=["paths_value"]), + ) + + +@pytest.mark.asyncio +async def test_update_tag_template_field_flattened_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.update_tag_template_field), "__call__" + ) as call: + # Designate an appropriate return value for the call. 
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(
+            tags.TagTemplateField()
+        )
+        # Call the method with a truthy value for each flattened field,
+        # using the keyword arguments to the method.
+        response = await client.update_tag_template_field(
+            name="name_value",
+            tag_template_field=tags.TagTemplateField(name="name_value"),
+            update_mask=field_mask.FieldMask(paths=["paths_value"]),
+        )
+
+        # Establish that the underlying call was made with the expected
+        # request object values.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0].name == "name_value"
+
+        assert args[0].tag_template_field == tags.TagTemplateField(name="name_value")
+
+        assert args[0].update_mask == field_mask.FieldMask(paths=["paths_value"])
+
+
+@pytest.mark.asyncio
+async def test_update_tag_template_field_flattened_error_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Attempting to call a method with both a request object and flattened
+    # fields is an error.
+    with pytest.raises(ValueError):
+        await client.update_tag_template_field(
+            datacatalog.UpdateTagTemplateFieldRequest(),
+            name="name_value",
+            tag_template_field=tags.TagTemplateField(name="name_value"),
+            update_mask=field_mask.FieldMask(paths=["paths_value"]),
+        )
+
+
+def test_rename_tag_template_field(
+    transport: str = "grpc", request_type=datacatalog.RenameTagTemplateFieldRequest
+):
+    client = DataCatalogClient(
+        credentials=credentials.AnonymousCredentials(), transport=transport,
+    )
+
+    # Everything is optional in proto3 as far as the runtime is concerned,
+    # and we are mocking out the actual API, so just send an empty request.
+    request = request_type()
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._transport.rename_tag_template_field), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+ call.return_value = tags.TagTemplateField( + name="name_value", + display_name="display_name_value", + is_required=True, + order=540, + ) + + response = client.rename_tag_template_field(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.RenameTagTemplateFieldRequest() + + # Establish that the response is the type that we expect. + assert isinstance(response, tags.TagTemplateField) + + assert response.name == "name_value" + + assert response.display_name == "display_name_value" + + assert response.is_required is True + + assert response.order == 540 + + +def test_rename_tag_template_field_from_dict(): + test_rename_tag_template_field(request_type=dict) + + +@pytest.mark.asyncio +async def test_rename_tag_template_field_async(transport: str = "grpc_asyncio"): + client = DataCatalogAsyncClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = datacatalog.RenameTagTemplateFieldRequest() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.rename_tag_template_field), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + tags.TagTemplateField( + name="name_value", + display_name="display_name_value", + is_required=True, + order=540, + ) + ) + + response = await client.rename_tag_template_field(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0] == request + + # Establish that the response is the type that we expect. 
+ assert isinstance(response, tags.TagTemplateField) + + assert response.name == "name_value" + + assert response.display_name == "display_name_value" + + assert response.is_required is True + + assert response.order == 540 + + +def test_rename_tag_template_field_field_headers(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = datacatalog.RenameTagTemplateFieldRequest() + request.name = "name/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._transport.rename_tag_template_field), "__call__" + ) as call: + call.return_value = tags.TagTemplateField() + + client.rename_tag_template_field(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "name=name/value",) in kw["metadata"] + + +@pytest.mark.asyncio +async def test_rename_tag_template_field_field_headers_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = datacatalog.RenameTagTemplateFieldRequest() + request.name = "name/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.rename_tag_template_field), "__call__" + ) as call: + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + tags.TagTemplateField() + ) + + await client.rename_tag_template_field(request) + + # Establish that the underlying gRPC stub method was called. 
+ assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "name=name/value",) in kw["metadata"] + + +def test_rename_tag_template_field_flattened(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._transport.rename_tag_template_field), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = tags.TagTemplateField() + + # Call the method with a truthy value for each flattened field, + # using the keyword arguments to the method. + client.rename_tag_template_field( + name="name_value", + new_tag_template_field_id="new_tag_template_field_id_value", + ) + + # Establish that the underlying call was made with the expected + # request object values. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0].name == "name_value" + + assert args[0].new_tag_template_field_id == "new_tag_template_field_id_value" + + +def test_rename_tag_template_field_flattened_error(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Attempting to call a method with both a request object and flattened + # fields is an error. + with pytest.raises(ValueError): + client.rename_tag_template_field( + datacatalog.RenameTagTemplateFieldRequest(), + name="name_value", + new_tag_template_field_id="new_tag_template_field_id_value", + ) + + +@pytest.mark.asyncio +async def test_rename_tag_template_field_flattened_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. 
+    with mock.patch.object(
+        type(client._client._transport.rename_tag_template_field), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(
+            tags.TagTemplateField()
+        )
+        # Call the method with a truthy value for each flattened field,
+        # using the keyword arguments to the method.
+        response = await client.rename_tag_template_field(
+            name="name_value",
+            new_tag_template_field_id="new_tag_template_field_id_value",
+        )
+
+        # Establish that the underlying call was made with the expected
+        # request object values.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0].name == "name_value"
+
+        assert args[0].new_tag_template_field_id == "new_tag_template_field_id_value"
+
+
+@pytest.mark.asyncio
+async def test_rename_tag_template_field_flattened_error_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Attempting to call a method with both a request object and flattened
+    # fields is an error.
+    with pytest.raises(ValueError):
+        await client.rename_tag_template_field(
+            datacatalog.RenameTagTemplateFieldRequest(),
+            name="name_value",
+            new_tag_template_field_id="new_tag_template_field_id_value",
+        )
+
+
+def test_delete_tag_template_field(
+    transport: str = "grpc", request_type=datacatalog.DeleteTagTemplateFieldRequest
+):
+    client = DataCatalogClient(
+        credentials=credentials.AnonymousCredentials(), transport=transport,
+    )
+
+    # Everything is optional in proto3 as far as the runtime is concerned,
+    # and we are mocking out the actual API, so just send an empty request.
+    request = request_type()
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._transport.delete_tag_template_field), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+ call.return_value = None + + response = client.delete_tag_template_field(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.DeleteTagTemplateFieldRequest() + + # Establish that the response is the type that we expect. + assert response is None + + +def test_delete_tag_template_field_from_dict(): + test_delete_tag_template_field(request_type=dict) + + +@pytest.mark.asyncio +async def test_delete_tag_template_field_async(transport: str = "grpc_asyncio"): + client = DataCatalogAsyncClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = datacatalog.DeleteTagTemplateFieldRequest() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.delete_tag_template_field), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(None) + + response = await client.delete_tag_template_field(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0] == request + + # Establish that the response is the type that we expect. + assert response is None + + +def test_delete_tag_template_field_field_headers(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = datacatalog.DeleteTagTemplateFieldRequest() + request.name = "name/value" + + # Mock the actual call within the gRPC stub, and fake the request. 
+ with mock.patch.object( + type(client._transport.delete_tag_template_field), "__call__" + ) as call: + call.return_value = None + + client.delete_tag_template_field(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "name=name/value",) in kw["metadata"] + + +@pytest.mark.asyncio +async def test_delete_tag_template_field_field_headers_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = datacatalog.DeleteTagTemplateFieldRequest() + request.name = "name/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.delete_tag_template_field), "__call__" + ) as call: + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(None) + + await client.delete_tag_template_field(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "name=name/value",) in kw["metadata"] + + +def test_delete_tag_template_field_flattened(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._transport.delete_tag_template_field), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = None + + # Call the method with a truthy value for each flattened field, + # using the keyword arguments to the method. 
+        client.delete_tag_template_field(
+            name="name_value", force=True,
+        )
+
+        # Establish that the underlying call was made with the expected
+        # request object values.
+        assert len(call.mock_calls) == 1
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0].name == "name_value"
+
+        assert args[0].force is True
+
+
+def test_delete_tag_template_field_flattened_error():
+    client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Attempting to call a method with both a request object and flattened
+    # fields is an error.
+    with pytest.raises(ValueError):
+        client.delete_tag_template_field(
+            datacatalog.DeleteTagTemplateFieldRequest(), name="name_value", force=True,
+        )
+
+
+@pytest.mark.asyncio
+async def test_delete_tag_template_field_flattened_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.delete_tag_template_field), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(None)
+        # Call the method with a truthy value for each flattened field,
+        # using the keyword arguments to the method.
+        response = await client.delete_tag_template_field(
+            name="name_value", force=True,
+        )
+
+        # Establish that the underlying call was made with the expected
+        # request object values.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0].name == "name_value"
+
+        assert args[0].force is True
+
+
+@pytest.mark.asyncio
+async def test_delete_tag_template_field_flattened_error_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Attempting to call a method with both a request object and flattened
+    # fields is an error.
+    with pytest.raises(ValueError):
+        await client.delete_tag_template_field(
+            datacatalog.DeleteTagTemplateFieldRequest(), name="name_value", force=True,
+        )
+
+
+def test_create_tag(transport: str = "grpc", request_type=datacatalog.CreateTagRequest):
+    client = DataCatalogClient(
+        credentials=credentials.AnonymousCredentials(), transport=transport,
+    )
+
+    # Everything is optional in proto3 as far as the runtime is concerned,
+    # and we are mocking out the actual API, so just send an empty request.
+    request = request_type()
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(type(client._transport.create_tag), "__call__") as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = tags.Tag(
+            name="name_value",
+            template="template_value",
+            template_display_name="template_display_name_value",
+            column="column_value",
+        )
+
+        response = client.create_tag(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls) == 1
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0] == datacatalog.CreateTagRequest()
+
+    # Establish that the response is the type that we expect.
+    assert isinstance(response, tags.Tag)
+
+    assert response.name == "name_value"
+
+    assert response.template == "template_value"
+
+    assert response.template_display_name == "template_display_name_value"
+
+    assert response.column == "column_value"
+
+
+def test_create_tag_from_dict():
+    test_create_tag(request_type=dict)
+
+
+@pytest.mark.asyncio
+async def test_create_tag_async(transport: str = "grpc_asyncio"):
+    client = DataCatalogAsyncClient(
+        credentials=credentials.AnonymousCredentials(), transport=transport,
+    )
+
+    # Everything is optional in proto3 as far as the runtime is concerned,
+    # and we are mocking out the actual API, so just send an empty request.
+    request = datacatalog.CreateTagRequest()
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object( + type(client._client._transport.create_tag), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + tags.Tag( + name="name_value", + template="template_value", + template_display_name="template_display_name_value", + ) + ) + + response = await client.create_tag(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0] == request + + # Establish that the response is the type that we expect. + assert isinstance(response, tags.Tag) + + assert response.name == "name_value" + + assert response.template == "template_value" + + assert response.template_display_name == "template_display_name_value" + + +def test_create_tag_field_headers(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = datacatalog.CreateTagRequest() + request.parent = "parent/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.create_tag), "__call__") as call: + call.return_value = tags.Tag() + + client.create_tag(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "parent=parent/value",) in kw["metadata"] + + +@pytest.mark.asyncio +async def test_create_tag_field_headers_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. 
+ request = datacatalog.CreateTagRequest() + request.parent = "parent/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.create_tag), "__call__" + ) as call: + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(tags.Tag()) + + await client.create_tag(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "parent=parent/value",) in kw["metadata"] + + +def test_create_tag_flattened(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.create_tag), "__call__") as call: + # Designate an appropriate return value for the call. + call.return_value = tags.Tag() + + # Call the method with a truthy value for each flattened field, + # using the keyword arguments to the method. + client.create_tag( + parent="parent_value", tag=tags.Tag(name="name_value"), + ) + + # Establish that the underlying call was made with the expected + # request object values. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0].parent == "parent_value" + + assert args[0].tag == tags.Tag(name="name_value") + + +def test_create_tag_flattened_error(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Attempting to call a method with both a request object and flattened + # fields is an error. 
+    with pytest.raises(ValueError):
+        client.create_tag(
+            datacatalog.CreateTagRequest(),
+            parent="parent_value",
+            tag=tags.Tag(name="name_value"),
+        )
+
+
+@pytest.mark.asyncio
+async def test_create_tag_flattened_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.create_tag), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(tags.Tag())
+        # Call the method with a truthy value for each flattened field,
+        # using the keyword arguments to the method.
+        response = await client.create_tag(
+            parent="parent_value", tag=tags.Tag(name="name_value"),
+        )
+
+        # Establish that the underlying call was made with the expected
+        # request object values.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0].parent == "parent_value"
+
+        assert args[0].tag == tags.Tag(name="name_value")
+
+
+@pytest.mark.asyncio
+async def test_create_tag_flattened_error_async():
+    client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Attempting to call a method with both a request object and flattened
+    # fields is an error.
+    with pytest.raises(ValueError):
+        await client.create_tag(
+            datacatalog.CreateTagRequest(),
+            parent="parent_value",
+            tag=tags.Tag(name="name_value"),
+        )
+
+
+def test_update_tag(transport: str = "grpc", request_type=datacatalog.UpdateTagRequest):
+    client = DataCatalogClient(
+        credentials=credentials.AnonymousCredentials(), transport=transport,
+    )
+
+    # Everything is optional in proto3 as far as the runtime is concerned,
+    # and we are mocking out the actual API, so just send an empty request.
+    request = request_type()
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(type(client._transport.update_tag), "__call__") as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = tags.Tag(
+            name="name_value",
+            template="template_value",
+            template_display_name="template_display_name_value",
+            column="column_value",
+        )
+
+        response = client.update_tag(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls) == 1
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0] == datacatalog.UpdateTagRequest()
+
+    # Establish that the response is the type that we expect.
+    assert isinstance(response, tags.Tag)
+
+    assert response.name == "name_value"
+
+    assert response.template == "template_value"
+
+    assert response.template_display_name == "template_display_name_value"
+
+    assert response.column == "column_value"
+
+
+def test_update_tag_from_dict():
+    test_update_tag(request_type=dict)
+
+
+@pytest.mark.asyncio
+async def test_update_tag_async(transport: str = "grpc_asyncio"):
+    client = DataCatalogAsyncClient(
+        credentials=credentials.AnonymousCredentials(), transport=transport,
+    )
+
+    # Everything is optional in proto3 as far as the runtime is concerned,
+    # and we are mocking out the actual API, so just send an empty request.
+    request = datacatalog.UpdateTagRequest()
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.update_tag), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(
+            tags.Tag(
+                name="name_value",
+                template="template_value",
+                template_display_name="template_display_name_value",
+            )
+        )
+
+        response = await client.update_tag(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0] == request
+
+    # Establish that the response is the type that we expect.
+ assert isinstance(response, tags.Tag) + + assert response.name == "name_value" + + assert response.template == "template_value" + + assert response.template_display_name == "template_display_name_value" + + +def test_update_tag_field_headers(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = datacatalog.UpdateTagRequest() + request.tag.name = "tag.name/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.update_tag), "__call__") as call: + call.return_value = tags.Tag() + + client.update_tag(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "tag.name=tag.name/value",) in kw["metadata"] + + +@pytest.mark.asyncio +async def test_update_tag_field_headers_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = datacatalog.UpdateTagRequest() + request.tag.name = "tag.name/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.update_tag), "__call__" + ) as call: + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(tags.Tag()) + + await client.update_tag(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. 
+ _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "tag.name=tag.name/value",) in kw["metadata"] + + +def test_update_tag_flattened(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.update_tag), "__call__") as call: + # Designate an appropriate return value for the call. + call.return_value = tags.Tag() + + # Call the method with a truthy value for each flattened field, + # using the keyword arguments to the method. + client.update_tag( + tag=tags.Tag(name="name_value"), + update_mask=field_mask.FieldMask(paths=["paths_value"]), + ) + + # Establish that the underlying call was made with the expected + # request object values. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0].tag == tags.Tag(name="name_value") + + assert args[0].update_mask == field_mask.FieldMask(paths=["paths_value"]) + + +def test_update_tag_flattened_error(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Attempting to call a method with both a request object and flattened + # fields is an error. + with pytest.raises(ValueError): + client.update_tag( + datacatalog.UpdateTagRequest(), + tag=tags.Tag(name="name_value"), + update_mask=field_mask.FieldMask(paths=["paths_value"]), + ) + + +@pytest.mark.asyncio +async def test_update_tag_flattened_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.update_tag), "__call__" + ) as call: + # Designate an appropriate return value for the call. 
+ call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(tags.Tag()) + # Call the method with a truthy value for each flattened field, + # using the keyword arguments to the method. + response = await client.update_tag( + tag=tags.Tag(name="name_value"), + update_mask=field_mask.FieldMask(paths=["paths_value"]), + ) + + # Establish that the underlying call was made with the expected + # request object values. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0].tag == tags.Tag(name="name_value") + + assert args[0].update_mask == field_mask.FieldMask(paths=["paths_value"]) + + + @pytest.mark.asyncio + async def test_update_tag_flattened_error_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Attempting to call a method with both a request object and flattened + # fields is an error. + with pytest.raises(ValueError): + await client.update_tag( + datacatalog.UpdateTagRequest(), + tag=tags.Tag(name="name_value"), + update_mask=field_mask.FieldMask(paths=["paths_value"]), + ) + + + def test_delete_tag(transport: str = "grpc", request_type=datacatalog.DeleteTagRequest): + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = request_type() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.delete_tag), "__call__") as call: + # Designate an appropriate return value for the call. + call.return_value = None + + response = client.delete_tag(request) + + # Establish that the underlying gRPC stub method was called. 
+ assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.DeleteTagRequest() + + # Establish that the response is the type that we expect. + assert response is None + + +def test_delete_tag_from_dict(): + test_delete_tag(request_type=dict) + + +@pytest.mark.asyncio +async def test_delete_tag_async(transport: str = "grpc_asyncio"): + client = DataCatalogAsyncClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = datacatalog.DeleteTagRequest() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.delete_tag), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(None) + + response = await client.delete_tag(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0] == request + + # Establish that the response is the type that we expect. + assert response is None + + +def test_delete_tag_field_headers(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = datacatalog.DeleteTagRequest() + request.name = "name/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.delete_tag), "__call__") as call: + call.return_value = None + + client.delete_tag(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. 
+ _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "name=name/value",) in kw["metadata"] + + +@pytest.mark.asyncio +async def test_delete_tag_field_headers_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = datacatalog.DeleteTagRequest() + request.name = "name/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.delete_tag), "__call__" + ) as call: + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(None) + + await client.delete_tag(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "name=name/value",) in kw["metadata"] + + +def test_delete_tag_flattened(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.delete_tag), "__call__") as call: + # Designate an appropriate return value for the call. + call.return_value = None + + # Call the method with a truthy value for each flattened field, + # using the keyword arguments to the method. + client.delete_tag(name="name_value",) + + # Establish that the underlying call was made with the expected + # request object values. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0].name == "name_value" + + +def test_delete_tag_flattened_error(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Attempting to call a method with both a request object and flattened + # fields is an error. 
+ with pytest.raises(ValueError): + client.delete_tag( + datacatalog.DeleteTagRequest(), name="name_value", + ) + + + @pytest.mark.asyncio + async def test_delete_tag_flattened_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.delete_tag), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(None) + # Call the method with a truthy value for each flattened field, + # using the keyword arguments to the method. + response = await client.delete_tag(name="name_value",) + + # Establish that the underlying call was made with the expected + # request object values. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0].name == "name_value" + + + @pytest.mark.asyncio + async def test_delete_tag_flattened_error_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Attempting to call a method with both a request object and flattened + # fields is an error. + with pytest.raises(ValueError): + await client.delete_tag( + datacatalog.DeleteTagRequest(), name="name_value", + ) + + + def test_list_tags(transport: str = "grpc", request_type=datacatalog.ListTagsRequest): + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = request_type() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.list_tags), "__call__") as call: + # Designate an appropriate return value for the call. 
+ call.return_value = datacatalog.ListTagsResponse( + next_page_token="next_page_token_value", + ) + + response = client.list_tags(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.ListTagsRequest() + + # Establish that the response is the type that we expect. + assert isinstance(response, pagers.ListTagsPager) + + assert response.next_page_token == "next_page_token_value" + + +def test_list_tags_from_dict(): + test_list_tags(request_type=dict) + + +@pytest.mark.asyncio +async def test_list_tags_async(transport: str = "grpc_asyncio"): + client = DataCatalogAsyncClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = datacatalog.ListTagsRequest() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.list_tags), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + datacatalog.ListTagsResponse(next_page_token="next_page_token_value",) + ) + + response = await client.list_tags(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0] == request + + # Establish that the response is the type that we expect. + assert isinstance(response, pagers.ListTagsAsyncPager) + + assert response.next_page_token == "next_page_token_value" + + +def test_list_tags_field_headers(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. 
+ request = datacatalog.ListTagsRequest() + request.parent = "parent/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.list_tags), "__call__") as call: + call.return_value = datacatalog.ListTagsResponse() + + client.list_tags(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "parent=parent/value",) in kw["metadata"] + + +@pytest.mark.asyncio +async def test_list_tags_field_headers_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = datacatalog.ListTagsRequest() + request.parent = "parent/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.list_tags), "__call__" + ) as call: + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + datacatalog.ListTagsResponse() + ) + + await client.list_tags(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "parent=parent/value",) in kw["metadata"] + + +def test_list_tags_flattened(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.list_tags), "__call__") as call: + # Designate an appropriate return value for the call. 
+ call.return_value = datacatalog.ListTagsResponse() + + # Call the method with a truthy value for each flattened field, + # using the keyword arguments to the method. + client.list_tags(parent="parent_value",) + + # Establish that the underlying call was made with the expected + # request object values. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0].parent == "parent_value" + + + def test_list_tags_flattened_error(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Attempting to call a method with both a request object and flattened + # fields is an error. + with pytest.raises(ValueError): + client.list_tags( + datacatalog.ListTagsRequest(), parent="parent_value", + ) + + + @pytest.mark.asyncio + async def test_list_tags_flattened_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.list_tags), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + datacatalog.ListTagsResponse() + ) + # Call the method with a truthy value for each flattened field, + # using the keyword arguments to the method. + response = await client.list_tags(parent="parent_value",) + + # Establish that the underlying call was made with the expected + # request object values. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0].parent == "parent_value" + + + @pytest.mark.asyncio + async def test_list_tags_flattened_error_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Attempting to call a method with both a request object and flattened + # fields is an error. 
+ with pytest.raises(ValueError): + await client.list_tags( + datacatalog.ListTagsRequest(), parent="parent_value", + ) + + + def test_list_tags_pager(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.list_tags), "__call__") as call: + # Set the response to a series of pages. + call.side_effect = ( + datacatalog.ListTagsResponse( + tags=[tags.Tag(), tags.Tag(), tags.Tag(),], next_page_token="abc", + ), + datacatalog.ListTagsResponse(tags=[], next_page_token="def",), + datacatalog.ListTagsResponse(tags=[tags.Tag(),], next_page_token="ghi",), + datacatalog.ListTagsResponse(tags=[tags.Tag(), tags.Tag(),],), + RuntimeError, + ) + + metadata = () + metadata = tuple(metadata) + ( + gapic_v1.routing_header.to_grpc_metadata((("parent", ""),)), + ) + pager = client.list_tags(request={}) + + assert pager._metadata == metadata + + results = [i for i in pager] + assert len(results) == 6 + assert all(isinstance(i, tags.Tag) for i in results) + + + def test_list_tags_pages(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.list_tags), "__call__") as call: + # Set the response to a series of pages. 
+ call.side_effect = ( + datacatalog.ListTagsResponse( + tags=[tags.Tag(), tags.Tag(), tags.Tag(),], next_page_token="abc", + ), + datacatalog.ListTagsResponse(tags=[], next_page_token="def",), + datacatalog.ListTagsResponse(tags=[tags.Tag(),], next_page_token="ghi",), + datacatalog.ListTagsResponse(tags=[tags.Tag(), tags.Tag(),],), + RuntimeError, + ) + pages = list(client.list_tags(request={}).pages) + for page, token in zip(pages, ["abc", "def", "ghi", ""]): + assert page.raw_page.next_page_token == token + + + @pytest.mark.asyncio + async def test_list_tags_async_pager(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.list_tags), + "__call__", + new_callable=mock.AsyncMock, + ) as call: + # Set the response to a series of pages. + call.side_effect = ( + datacatalog.ListTagsResponse( + tags=[tags.Tag(), tags.Tag(), tags.Tag(),], next_page_token="abc", + ), + datacatalog.ListTagsResponse(tags=[], next_page_token="def",), + datacatalog.ListTagsResponse(tags=[tags.Tag(),], next_page_token="ghi",), + datacatalog.ListTagsResponse(tags=[tags.Tag(), tags.Tag(),],), + RuntimeError, + ) + async_pager = await client.list_tags(request={},) + assert async_pager.next_page_token == "abc" + responses = [] + async for response in async_pager: + responses.append(response) + + assert len(responses) == 6 + assert all(isinstance(i, tags.Tag) for i in responses) + + + @pytest.mark.asyncio + async def test_list_tags_async_pages(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.list_tags), + "__call__", + new_callable=mock.AsyncMock, + ) as call: + # Set the response to a series of pages. 
+ call.side_effect = ( + datacatalog.ListTagsResponse( + tags=[tags.Tag(), tags.Tag(), tags.Tag(),], next_page_token="abc", + ), + datacatalog.ListTagsResponse(tags=[], next_page_token="def",), + datacatalog.ListTagsResponse(tags=[tags.Tag(),], next_page_token="ghi",), + datacatalog.ListTagsResponse(tags=[tags.Tag(), tags.Tag(),],), + RuntimeError, + ) + pages = [] + async for page in (await client.list_tags(request={})).pages: + pages.append(page) + for page, token in zip(pages, ["abc", "def", "ghi", ""]): + assert page.raw_page.next_page_token == token + + +def test_set_iam_policy( + transport: str = "grpc", request_type=iam_policy.SetIamPolicyRequest +): + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = request_type() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.set_iam_policy), "__call__") as call: + # Designate an appropriate return value for the call. + call.return_value = policy.Policy(version=774, etag=b"etag_blob",) + + response = client.set_iam_policy(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0] == iam_policy.SetIamPolicyRequest() + + # Establish that the response is the type that we expect. 
+ assert isinstance(response, policy.Policy) + + assert response.version == 774 + + assert response.etag == b"etag_blob" + + +def test_set_iam_policy_from_dict(): + test_set_iam_policy(request_type=dict) + + +@pytest.mark.asyncio +async def test_set_iam_policy_async(transport: str = "grpc_asyncio"): + client = DataCatalogAsyncClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = iam_policy.SetIamPolicyRequest() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.set_iam_policy), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + policy.Policy(version=774, etag=b"etag_blob",) + ) + + response = await client.set_iam_policy(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0] == request + + # Establish that the response is the type that we expect. + assert isinstance(response, policy.Policy) + + assert response.version == 774 + + assert response.etag == b"etag_blob" + + +def test_set_iam_policy_field_headers(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = iam_policy.SetIamPolicyRequest() + request.resource = "resource/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.set_iam_policy), "__call__") as call: + call.return_value = policy.Policy() + + client.set_iam_policy(request) + + # Establish that the underlying gRPC stub method was called. 
+ assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "resource=resource/value",) in kw["metadata"] + + + @pytest.mark.asyncio + async def test_set_iam_policy_field_headers_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = iam_policy.SetIamPolicyRequest() + request.resource = "resource/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.set_iam_policy), "__call__" + ) as call: + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(policy.Policy()) + + await client.set_iam_policy(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "resource=resource/value",) in kw["metadata"] + + + def test_set_iam_policy_from_dict_foreign(): + # Named distinctly from test_set_iam_policy_from_dict above, which this + # definition would otherwise silently shadow. + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.set_iam_policy), "__call__") as call: + # Designate an appropriate return value for the call. + call.return_value = policy.Policy() + + response = client.set_iam_policy( + request={ + "resource": "resource_value", + "policy": policy.Policy(version=774), + } + ) + call.assert_called() + + + def test_set_iam_policy_flattened(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. 
+ with mock.patch.object(type(client._transport.set_iam_policy), "__call__") as call: + # Designate an appropriate return value for the call. + call.return_value = policy.Policy() + + # Call the method with a truthy value for each flattened field, + # using the keyword arguments to the method. + client.set_iam_policy(resource="resource_value",) + + # Establish that the underlying call was made with the expected + # request object values. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0].resource == "resource_value" + + + def test_set_iam_policy_flattened_error(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Attempting to call a method with both a request object and flattened + # fields is an error. + with pytest.raises(ValueError): + client.set_iam_policy( + iam_policy.SetIamPolicyRequest(), resource="resource_value", + ) + + + @pytest.mark.asyncio + async def test_set_iam_policy_flattened_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.set_iam_policy), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(policy.Policy()) + # Call the method with a truthy value for each flattened field, + # using the keyword arguments to the method. + response = await client.set_iam_policy(resource="resource_value",) + + # Establish that the underlying call was made with the expected + # request object values. 
+ assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0].resource == "resource_value" + + +@pytest.mark.asyncio +async def test_set_iam_policy_flattened_error_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Attempting to call a method with both a request object and flattened + # fields is an error. + with pytest.raises(ValueError): + await client.set_iam_policy( + iam_policy.SetIamPolicyRequest(), resource="resource_value", + ) + + +def test_get_iam_policy( + transport: str = "grpc", request_type=iam_policy.GetIamPolicyRequest +): + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = request_type() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.get_iam_policy), "__call__") as call: + # Designate an appropriate return value for the call. + call.return_value = policy.Policy(version=774, etag=b"etag_blob",) + + response = client.get_iam_policy(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0] == iam_policy.GetIamPolicyRequest() + + # Establish that the response is the type that we expect. 
+ assert isinstance(response, policy.Policy) + + assert response.version == 774 + + assert response.etag == b"etag_blob" + + +def test_get_iam_policy_from_dict(): + test_get_iam_policy(request_type=dict) + + +@pytest.mark.asyncio +async def test_get_iam_policy_async(transport: str = "grpc_asyncio"): + client = DataCatalogAsyncClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = iam_policy.GetIamPolicyRequest() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.get_iam_policy), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + policy.Policy(version=774, etag=b"etag_blob",) + ) + + response = await client.get_iam_policy(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0] == request + + # Establish that the response is the type that we expect. + assert isinstance(response, policy.Policy) + + assert response.version == 774 + + assert response.etag == b"etag_blob" + + +def test_get_iam_policy_field_headers(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = iam_policy.GetIamPolicyRequest() + request.resource = "resource/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.get_iam_policy), "__call__") as call: + call.return_value = policy.Policy() + + client.get_iam_policy(request) + + # Establish that the underlying gRPC stub method was called. 
+ assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "resource=resource/value",) in kw["metadata"] + + + @pytest.mark.asyncio + async def test_get_iam_policy_field_headers_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = iam_policy.GetIamPolicyRequest() + request.resource = "resource/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.get_iam_policy), "__call__" + ) as call: + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(policy.Policy()) + + await client.get_iam_policy(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "resource=resource/value",) in kw["metadata"] + + + def test_get_iam_policy_from_dict_foreign(): + # Named distinctly from test_get_iam_policy_from_dict above, which this + # definition would otherwise silently shadow. + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.get_iam_policy), "__call__") as call: + # Designate an appropriate return value for the call. + call.return_value = policy.Policy() + + response = client.get_iam_policy( + request={ + "resource": "resource_value", + "options": options.GetPolicyOptions(requested_policy_version=2598), + } + ) + call.assert_called() + + + def test_get_iam_policy_flattened(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. 
+ with mock.patch.object(type(client._transport.get_iam_policy), "__call__") as call: + # Designate an appropriate return value for the call. + call.return_value = policy.Policy() + + # Call the method with a truthy value for each flattened field, + # using the keyword arguments to the method. + client.get_iam_policy(resource="resource_value",) + + # Establish that the underlying call was made with the expected + # request object values. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0].resource == "resource_value" + + + def test_get_iam_policy_flattened_error(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Attempting to call a method with both a request object and flattened + # fields is an error. + with pytest.raises(ValueError): + client.get_iam_policy( + iam_policy.GetIamPolicyRequest(), resource="resource_value", + ) + + + @pytest.mark.asyncio + async def test_get_iam_policy_flattened_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.get_iam_policy), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(policy.Policy()) + # Call the method with a truthy value for each flattened field, + # using the keyword arguments to the method. + response = await client.get_iam_policy(resource="resource_value",) + + # Establish that the underlying call was made with the expected + # request object values. 
+ assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0].resource == "resource_value" + + +@pytest.mark.asyncio +async def test_get_iam_policy_flattened_error_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Attempting to call a method with both a request object and flattened + # fields is an error. + with pytest.raises(ValueError): + await client.get_iam_policy( + iam_policy.GetIamPolicyRequest(), resource="resource_value", + ) + + +def test_test_iam_permissions( + transport: str = "grpc", request_type=iam_policy.TestIamPermissionsRequest +): + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = request_type() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._transport.test_iam_permissions), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = iam_policy.TestIamPermissionsResponse( + permissions=["permissions_value"], + ) + + response = client.test_iam_permissions(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0] == iam_policy.TestIamPermissionsRequest() + + # Establish that the response is the type that we expect. 
+ assert isinstance(response, iam_policy.TestIamPermissionsResponse) + + assert response.permissions == ["permissions_value"] + + +def test_test_iam_permissions_from_dict(): + test_test_iam_permissions(request_type=dict) + + +@pytest.mark.asyncio +async def test_test_iam_permissions_async(transport: str = "grpc_asyncio"): + client = DataCatalogAsyncClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = iam_policy.TestIamPermissionsRequest() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.test_iam_permissions), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + iam_policy.TestIamPermissionsResponse(permissions=["permissions_value"],) + ) + + response = await client.test_iam_permissions(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0] == request + + # Establish that the response is the type that we expect. + assert isinstance(response, iam_policy.TestIamPermissionsResponse) + + assert response.permissions == ["permissions_value"] + + +def test_test_iam_permissions_field_headers(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = iam_policy.TestIamPermissionsRequest() + request.resource = "resource/value" + + # Mock the actual call within the gRPC stub, and fake the request. 
+ with mock.patch.object( + type(client._transport.test_iam_permissions), "__call__" + ) as call: + call.return_value = iam_policy.TestIamPermissionsResponse() + + client.test_iam_permissions(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "resource=resource/value",) in kw["metadata"] + + +@pytest.mark.asyncio +async def test_test_iam_permissions_field_headers_async(): + client = DataCatalogAsyncClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = iam_policy.TestIamPermissionsRequest() + request.resource = "resource/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.test_iam_permissions), "__call__" + ) as call: + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + iam_policy.TestIamPermissionsResponse() + ) + + await client.test_iam_permissions(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "resource=resource/value",) in kw["metadata"] + + +def test_test_iam_permissions_from_dict_foreign(): + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._transport.test_iam_permissions), "__call__" + ) as call: + # Designate an appropriate return value for the call.
+ call.return_value = iam_policy.TestIamPermissionsResponse() + + response = client.test_iam_permissions( + request={ + "resource": "resource_value", + "permissions": ["permissions_value"], + } + ) + call.assert_called() + + +def test_credentials_transport_error(): + # It is an error to provide credentials and a transport instance. + transport = transports.DataCatalogGrpcTransport( + credentials=credentials.AnonymousCredentials(), + ) + with pytest.raises(ValueError): + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # It is an error to provide a credentials file and a transport instance. + transport = transports.DataCatalogGrpcTransport( + credentials=credentials.AnonymousCredentials(), + ) + with pytest.raises(ValueError): + client = DataCatalogClient( + client_options={"credentials_file": "credentials.json"}, + transport=transport, + ) + + # It is an error to provide scopes and a transport instance. + transport = transports.DataCatalogGrpcTransport( + credentials=credentials.AnonymousCredentials(), + ) + with pytest.raises(ValueError): + client = DataCatalogClient( + client_options={"scopes": ["1", "2"]}, transport=transport, + ) + + +def test_transport_instance(): + # A client may be instantiated with a custom transport instance. + transport = transports.DataCatalogGrpcTransport( + credentials=credentials.AnonymousCredentials(), + ) + client = DataCatalogClient(transport=transport) + assert client._transport is transport + + +def test_transport_get_channel(): + # A client may be instantiated with a custom transport instance. 
+ transport = transports.DataCatalogGrpcTransport( + credentials=credentials.AnonymousCredentials(), + ) + channel = transport.grpc_channel + assert channel + + transport = transports.DataCatalogGrpcAsyncIOTransport( + credentials=credentials.AnonymousCredentials(), + ) + channel = transport.grpc_channel + assert channel + + +def test_transport_grpc_default(): + # A client should use the gRPC transport by default. + client = DataCatalogClient(credentials=credentials.AnonymousCredentials(),) + assert isinstance(client._transport, transports.DataCatalogGrpcTransport,) + + +def test_data_catalog_base_transport_error(): + # Passing both a credentials object and credentials_file should raise an error + with pytest.raises(exceptions.DuplicateCredentialArgs): + transport = transports.DataCatalogTransport( + credentials=credentials.AnonymousCredentials(), + credentials_file="credentials.json", + ) + + +def test_data_catalog_base_transport(): + # Instantiate the base transport. + with mock.patch( + "google.cloud.datacatalog_v1beta1.services.data_catalog.transports.DataCatalogTransport.__init__" + ) as Transport: + Transport.return_value = None + transport = transports.DataCatalogTransport( + credentials=credentials.AnonymousCredentials(), + ) + + # Every method on the transport should just blindly + # raise NotImplementedError. 
+ methods = ( + "search_catalog", + "create_entry_group", + "update_entry_group", + "get_entry_group", + "delete_entry_group", + "list_entry_groups", + "create_entry", + "update_entry", + "delete_entry", + "get_entry", + "lookup_entry", + "list_entries", + "create_tag_template", + "get_tag_template", + "update_tag_template", + "delete_tag_template", + "create_tag_template_field", + "update_tag_template_field", + "rename_tag_template_field", + "delete_tag_template_field", + "create_tag", + "update_tag", + "delete_tag", + "list_tags", + "set_iam_policy", + "get_iam_policy", + "test_iam_permissions", + ) + for method in methods: + with pytest.raises(NotImplementedError): + getattr(transport, method)(request=object()) + + +def test_data_catalog_base_transport_with_credentials_file(): + # Instantiate the base transport with a credentials file + with mock.patch.object( + auth, "load_credentials_from_file" + ) as load_creds, mock.patch( + "google.cloud.datacatalog_v1beta1.services.data_catalog.transports.DataCatalogTransport._prep_wrapped_messages" + ) as Transport: + Transport.return_value = None + load_creds.return_value = (credentials.AnonymousCredentials(), None) + transport = transports.DataCatalogTransport( + credentials_file="credentials.json", quota_project_id="octopus", + ) + load_creds.assert_called_once_with( + "credentials.json", + scopes=("https://www.googleapis.com/auth/cloud-platform",), + quota_project_id="octopus", + ) + + +def test_data_catalog_auth_adc(): + # If no credentials are provided, we should use ADC credentials. + with mock.patch.object(auth, "default") as adc: + adc.return_value = (credentials.AnonymousCredentials(), None) + DataCatalogClient() + adc.assert_called_once_with( + scopes=("https://www.googleapis.com/auth/cloud-platform",), + quota_project_id=None, + ) + + +def test_data_catalog_transport_auth_adc(): + # If credentials and host are not provided, the transport class should use + # ADC credentials. 
+ with mock.patch.object(auth, "default") as adc: + adc.return_value = (credentials.AnonymousCredentials(), None) + transports.DataCatalogGrpcTransport( + host="squid.clam.whelk", quota_project_id="octopus" + ) + adc.assert_called_once_with( + scopes=("https://www.googleapis.com/auth/cloud-platform",), + quota_project_id="octopus", + ) + + +def test_data_catalog_host_no_port(): + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), + client_options=client_options.ClientOptions( + api_endpoint="datacatalog.googleapis.com" + ), + ) + assert client._transport._host == "datacatalog.googleapis.com:443" + + +def test_data_catalog_host_with_port(): + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), + client_options=client_options.ClientOptions( + api_endpoint="datacatalog.googleapis.com:8000" + ), + ) + assert client._transport._host == "datacatalog.googleapis.com:8000" + + +def test_data_catalog_grpc_transport_channel(): + channel = grpc.insecure_channel("http://localhost/") + + # Check that if channel is provided, mtls endpoint and client_cert_source + # won't be used. + callback = mock.MagicMock() + transport = transports.DataCatalogGrpcTransport( + host="squid.clam.whelk", + channel=channel, + api_mtls_endpoint="mtls.squid.clam.whelk", + client_cert_source=callback, + ) + assert transport.grpc_channel == channel + assert transport._host == "squid.clam.whelk:443" + assert not callback.called + + +def test_data_catalog_grpc_asyncio_transport_channel(): + channel = aio.insecure_channel("http://localhost/") + + # Check that if channel is provided, mtls endpoint and client_cert_source + # won't be used. 
+ callback = mock.MagicMock() + transport = transports.DataCatalogGrpcAsyncIOTransport( + host="squid.clam.whelk", + channel=channel, + api_mtls_endpoint="mtls.squid.clam.whelk", + client_cert_source=callback, + ) + assert transport.grpc_channel == channel + assert transport._host == "squid.clam.whelk:443" + assert not callback.called + + +@mock.patch("grpc.ssl_channel_credentials", autospec=True) +@mock.patch("google.api_core.grpc_helpers.create_channel", autospec=True) +def test_data_catalog_grpc_transport_channel_mtls_with_client_cert_source( + grpc_create_channel, grpc_ssl_channel_cred +): + # Check that if channel is None, but api_mtls_endpoint and client_cert_source + # are provided, then a mTLS channel will be created. + mock_cred = mock.Mock() + + mock_ssl_cred = mock.Mock() + grpc_ssl_channel_cred.return_value = mock_ssl_cred + + mock_grpc_channel = mock.Mock() + grpc_create_channel.return_value = mock_grpc_channel + + transport = transports.DataCatalogGrpcTransport( + host="squid.clam.whelk", + credentials=mock_cred, + api_mtls_endpoint="mtls.squid.clam.whelk", + client_cert_source=client_cert_source_callback, + ) + grpc_ssl_channel_cred.assert_called_once_with( + certificate_chain=b"cert bytes", private_key=b"key bytes" + ) + grpc_create_channel.assert_called_once_with( + "mtls.squid.clam.whelk:443", + credentials=mock_cred, + credentials_file=None, + scopes=("https://www.googleapis.com/auth/cloud-platform",), + ssl_credentials=mock_ssl_cred, + quota_project_id=None, + ) + assert transport.grpc_channel == mock_grpc_channel + + +@mock.patch("grpc.ssl_channel_credentials", autospec=True) +@mock.patch("google.api_core.grpc_helpers_async.create_channel", autospec=True) +def test_data_catalog_grpc_asyncio_transport_channel_mtls_with_client_cert_source( + grpc_create_channel, grpc_ssl_channel_cred +): + # Check that if channel is None, but api_mtls_endpoint and client_cert_source + # are provided, then a mTLS channel will be created. 
+ mock_cred = mock.Mock() + + mock_ssl_cred = mock.Mock() + grpc_ssl_channel_cred.return_value = mock_ssl_cred + + mock_grpc_channel = mock.Mock() + grpc_create_channel.return_value = mock_grpc_channel + + transport = transports.DataCatalogGrpcAsyncIOTransport( + host="squid.clam.whelk", + credentials=mock_cred, + api_mtls_endpoint="mtls.squid.clam.whelk", + client_cert_source=client_cert_source_callback, + ) + grpc_ssl_channel_cred.assert_called_once_with( + certificate_chain=b"cert bytes", private_key=b"key bytes" + ) + grpc_create_channel.assert_called_once_with( + "mtls.squid.clam.whelk:443", + credentials=mock_cred, + credentials_file=None, + scopes=("https://www.googleapis.com/auth/cloud-platform",), + ssl_credentials=mock_ssl_cred, + quota_project_id=None, + ) + assert transport.grpc_channel == mock_grpc_channel + + +@pytest.mark.parametrize( + "api_mtls_endpoint", ["mtls.squid.clam.whelk", "mtls.squid.clam.whelk:443"] +) +@mock.patch("google.api_core.grpc_helpers.create_channel", autospec=True) +def test_data_catalog_grpc_transport_channel_mtls_with_adc( + grpc_create_channel, api_mtls_endpoint +): + # Check that if channel and client_cert_source are None, but api_mtls_endpoint + # is provided, then a mTLS channel will be created with SSL ADC. + mock_grpc_channel = mock.Mock() + grpc_create_channel.return_value = mock_grpc_channel + + # Mock google.auth.transport.grpc.SslCredentials class. 
+ mock_ssl_cred = mock.Mock() + with mock.patch.multiple( + "google.auth.transport.grpc.SslCredentials", + __init__=mock.Mock(return_value=None), + ssl_credentials=mock.PropertyMock(return_value=mock_ssl_cred), + ): + mock_cred = mock.Mock() + transport = transports.DataCatalogGrpcTransport( + host="squid.clam.whelk", + credentials=mock_cred, + api_mtls_endpoint=api_mtls_endpoint, + client_cert_source=None, + ) + grpc_create_channel.assert_called_once_with( + "mtls.squid.clam.whelk:443", + credentials=mock_cred, + credentials_file=None, + scopes=("https://www.googleapis.com/auth/cloud-platform",), + ssl_credentials=mock_ssl_cred, + quota_project_id=None, + ) + assert transport.grpc_channel == mock_grpc_channel + + +@pytest.mark.parametrize( + "api_mtls_endpoint", ["mtls.squid.clam.whelk", "mtls.squid.clam.whelk:443"] +) +@mock.patch("google.api_core.grpc_helpers_async.create_channel", autospec=True) +def test_data_catalog_grpc_asyncio_transport_channel_mtls_with_adc( + grpc_create_channel, api_mtls_endpoint +): + # Check that if channel and client_cert_source are None, but api_mtls_endpoint + # is provided, then a mTLS channel will be created with SSL ADC. + mock_grpc_channel = mock.Mock() + grpc_create_channel.return_value = mock_grpc_channel + + # Mock google.auth.transport.grpc.SslCredentials class. 
+ mock_ssl_cred = mock.Mock() + with mock.patch.multiple( + "google.auth.transport.grpc.SslCredentials", + __init__=mock.Mock(return_value=None), + ssl_credentials=mock.PropertyMock(return_value=mock_ssl_cred), + ): + mock_cred = mock.Mock() + transport = transports.DataCatalogGrpcAsyncIOTransport( + host="squid.clam.whelk", + credentials=mock_cred, + api_mtls_endpoint=api_mtls_endpoint, + client_cert_source=None, + ) + grpc_create_channel.assert_called_once_with( + "mtls.squid.clam.whelk:443", + credentials=mock_cred, + credentials_file=None, + scopes=("https://www.googleapis.com/auth/cloud-platform",), + ssl_credentials=mock_ssl_cred, + quota_project_id=None, + ) + assert transport.grpc_channel == mock_grpc_channel + + +def test_tag_template_path(): + project = "squid" + location = "clam" + tag_template = "whelk" + + expected = "projects/{project}/locations/{location}/tagTemplates/{tag_template}".format( + project=project, location=location, tag_template=tag_template, + ) + actual = DataCatalogClient.tag_template_path(project, location, tag_template) + assert expected == actual + + +def test_parse_tag_template_path(): + expected = { + "project": "octopus", + "location": "oyster", + "tag_template": "nudibranch", + } + path = DataCatalogClient.tag_template_path(**expected) + + # Check that the path construction is reversible. 
+ actual = DataCatalogClient.parse_tag_template_path(path) + assert expected == actual + + +def test_entry_path(): + project = "squid" + location = "clam" + entry_group = "whelk" + entry = "octopus" + + expected = "projects/{project}/locations/{location}/entryGroups/{entry_group}/entries/{entry}".format( + project=project, location=location, entry_group=entry_group, entry=entry, + ) + actual = DataCatalogClient.entry_path(project, location, entry_group, entry) + assert expected == actual + + +def test_parse_entry_path(): + expected = { + "project": "oyster", + "location": "nudibranch", + "entry_group": "cuttlefish", + "entry": "mussel", + } + path = DataCatalogClient.entry_path(**expected) + + # Check that the path construction is reversible. + actual = DataCatalogClient.parse_entry_path(path) + assert expected == actual + + +def test_entry_group_path(): + project = "squid" + location = "clam" + entry_group = "whelk" + + expected = "projects/{project}/locations/{location}/entryGroups/{entry_group}".format( + project=project, location=location, entry_group=entry_group, + ) + actual = DataCatalogClient.entry_group_path(project, location, entry_group) + assert expected == actual + + +def test_parse_entry_group_path(): + expected = { + "project": "octopus", + "location": "oyster", + "entry_group": "nudibranch", + } + path = DataCatalogClient.entry_group_path(**expected) + + # Check that the path construction is reversible. 
+ actual = DataCatalogClient.parse_entry_group_path(path) + assert expected == actual + + +def test_tag_template_field_path(): + project = "squid" + location = "clam" + tag_template = "whelk" + field = "octopus" + + expected = "projects/{project}/locations/{location}/tagTemplates/{tag_template}/fields/{field}".format( + project=project, location=location, tag_template=tag_template, field=field, + ) + actual = DataCatalogClient.tag_template_field_path( + project, location, tag_template, field + ) + assert expected == actual + + +def test_parse_tag_template_field_path(): + expected = { + "project": "oyster", + "location": "nudibranch", + "tag_template": "cuttlefish", + "field": "mussel", + } + path = DataCatalogClient.tag_template_field_path(**expected) + + # Check that the path construction is reversible. + actual = DataCatalogClient.parse_tag_template_field_path(path) + assert expected == actual + + +def test_tag_path(): + project = "squid" + location = "clam" + entry_group = "whelk" + entry = "octopus" + tag = "oyster" + + expected = "projects/{project}/locations/{location}/entryGroups/{entry_group}/entries/{entry}/tags/{tag}".format( + project=project, + location=location, + entry_group=entry_group, + entry=entry, + tag=tag, + ) + actual = DataCatalogClient.tag_path(project, location, entry_group, entry, tag) + assert expected == actual + + +def test_parse_tag_path(): + expected = { + "project": "nudibranch", + "location": "cuttlefish", + "entry_group": "mussel", + "entry": "winkle", + "tag": "nautilus", + } + path = DataCatalogClient.tag_path(**expected) + + # Check that the path construction is reversible. 
+ actual = DataCatalogClient.parse_tag_path(path) + assert expected == actual diff --git a/tests/unit/gapic/datacatalog_v1beta1/test_policy_tag_manager.py b/tests/unit/gapic/datacatalog_v1beta1/test_policy_tag_manager.py new file mode 100644 index 00000000..de5f0342 --- /dev/null +++ b/tests/unit/gapic/datacatalog_v1beta1/test_policy_tag_manager.py @@ -0,0 +1,3683 @@ +# -*- coding: utf-8 -*- + +# Copyright 2020 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +# + +import os +import mock + +import grpc +from grpc.experimental import aio +import math +import pytest +from proto.marshal.rules.dates import DurationRule, TimestampRule + +from google import auth +from google.api_core import client_options +from google.api_core import exceptions +from google.api_core import gapic_v1 +from google.api_core import grpc_helpers +from google.api_core import grpc_helpers_async +from google.auth import credentials +from google.auth.exceptions import MutualTLSChannelError +from google.cloud.datacatalog_v1beta1.services.policy_tag_manager import ( + PolicyTagManagerAsyncClient, +) +from google.cloud.datacatalog_v1beta1.services.policy_tag_manager import ( + PolicyTagManagerClient, +) +from google.cloud.datacatalog_v1beta1.services.policy_tag_manager import pagers +from google.cloud.datacatalog_v1beta1.services.policy_tag_manager import transports +from google.cloud.datacatalog_v1beta1.types import policytagmanager +from google.iam.v1 import iam_policy_pb2 as iam_policy # type: ignore +from google.iam.v1 import options_pb2 as options # type: ignore +from google.iam.v1 import policy_pb2 as policy # type: ignore +from google.oauth2 import service_account +from google.protobuf import field_mask_pb2 as field_mask # type: ignore +from google.type import expr_pb2 as expr # type: ignore + + +def client_cert_source_callback(): + return b"cert bytes", b"key bytes" + + +# If default endpoint is localhost, then default mtls endpoint will be the same. +# This method modifies the default endpoint so the client can produce a different +# mtls endpoint for endpoint testing purposes. +def modify_default_endpoint(client): + return ( + "foo.googleapis.com" + if ("localhost" in client.DEFAULT_ENDPOINT) + else client.DEFAULT_ENDPOINT + ) + + +def test__get_default_mtls_endpoint(): + api_endpoint = "example.googleapis.com" + api_mtls_endpoint = "example.mtls.googleapis.com" + sandbox_endpoint = "example.sandbox.googleapis.com" + sandbox_mtls_endpoint = "example.mtls.sandbox.googleapis.com" + non_googleapi = "api.example.com" + + assert PolicyTagManagerClient._get_default_mtls_endpoint(None) is None + assert ( + PolicyTagManagerClient._get_default_mtls_endpoint(api_endpoint) + == api_mtls_endpoint + ) + assert ( + PolicyTagManagerClient._get_default_mtls_endpoint(api_mtls_endpoint) + == api_mtls_endpoint + ) + assert ( + PolicyTagManagerClient._get_default_mtls_endpoint(sandbox_endpoint) + == sandbox_mtls_endpoint + ) + assert ( + PolicyTagManagerClient._get_default_mtls_endpoint(sandbox_mtls_endpoint) + == sandbox_mtls_endpoint + ) + assert ( + PolicyTagManagerClient._get_default_mtls_endpoint(non_googleapi) + == non_googleapi + ) + + +@pytest.mark.parametrize( + "client_class", [PolicyTagManagerClient, PolicyTagManagerAsyncClient] +) +def test_policy_tag_manager_client_from_service_account_file(client_class): + creds = credentials.AnonymousCredentials() + with mock.patch.object( + service_account.Credentials, "from_service_account_file" + ) as factory:
+ factory.return_value = creds + client = client_class.from_service_account_file("dummy/file/path.json") + assert client._transport._credentials == creds + + client = client_class.from_service_account_json("dummy/file/path.json") + assert client._transport._credentials == creds + + assert client._transport._host == "datacatalog.googleapis.com:443" + + +def test_policy_tag_manager_client_get_transport_class(): + transport = PolicyTagManagerClient.get_transport_class() + assert transport == transports.PolicyTagManagerGrpcTransport + + transport = PolicyTagManagerClient.get_transport_class("grpc") + assert transport == transports.PolicyTagManagerGrpcTransport + + +@pytest.mark.parametrize( + "client_class,transport_class,transport_name", + [ + (PolicyTagManagerClient, transports.PolicyTagManagerGrpcTransport, "grpc"), + ( + PolicyTagManagerAsyncClient, + transports.PolicyTagManagerGrpcAsyncIOTransport, + "grpc_asyncio", + ), + ], +) +@mock.patch.object( + PolicyTagManagerClient, + "DEFAULT_ENDPOINT", + modify_default_endpoint(PolicyTagManagerClient), +) +@mock.patch.object( + PolicyTagManagerAsyncClient, + "DEFAULT_ENDPOINT", + modify_default_endpoint(PolicyTagManagerAsyncClient), +) +def test_policy_tag_manager_client_client_options( + client_class, transport_class, transport_name +): + # Check that if channel is provided we won't create a new one. + with mock.patch.object(PolicyTagManagerClient, "get_transport_class") as gtc: + transport = transport_class(credentials=credentials.AnonymousCredentials()) + client = client_class(transport=transport) + gtc.assert_not_called() + + # Check that if channel is provided via str we will create a new one. + with mock.patch.object(PolicyTagManagerClient, "get_transport_class") as gtc: + client = client_class(transport=transport_name) + gtc.assert_called() + + # Check the case api_endpoint is provided. 
+ options = client_options.ClientOptions(api_endpoint="squid.clam.whelk") + with mock.patch.object(transport_class, "__init__") as patched: + patched.return_value = None + client = client_class(client_options=options) + patched.assert_called_once_with( + credentials=None, + credentials_file=None, + host="squid.clam.whelk", + scopes=None, + api_mtls_endpoint="squid.clam.whelk", + client_cert_source=None, + quota_project_id=None, + ) + + # Check the case api_endpoint is not provided and GOOGLE_API_USE_MTLS is + # "never". + with mock.patch.dict(os.environ, {"GOOGLE_API_USE_MTLS": "never"}): + with mock.patch.object(transport_class, "__init__") as patched: + patched.return_value = None + client = client_class() + patched.assert_called_once_with( + credentials=None, + credentials_file=None, + host=client.DEFAULT_ENDPOINT, + scopes=None, + api_mtls_endpoint=client.DEFAULT_ENDPOINT, + client_cert_source=None, + quota_project_id=None, + ) + + # Check the case api_endpoint is not provided and GOOGLE_API_USE_MTLS is + # "always". + with mock.patch.dict(os.environ, {"GOOGLE_API_USE_MTLS": "always"}): + with mock.patch.object(transport_class, "__init__") as patched: + patched.return_value = None + client = client_class() + patched.assert_called_once_with( + credentials=None, + credentials_file=None, + host=client.DEFAULT_MTLS_ENDPOINT, + scopes=None, + api_mtls_endpoint=client.DEFAULT_MTLS_ENDPOINT, + client_cert_source=None, + quota_project_id=None, + ) + + # Check the case api_endpoint is not provided, GOOGLE_API_USE_MTLS is + # "auto", and client_cert_source is provided. 
+ with mock.patch.dict(os.environ, {"GOOGLE_API_USE_MTLS": "auto"}): + options = client_options.ClientOptions( + client_cert_source=client_cert_source_callback + ) + with mock.patch.object(transport_class, "__init__") as patched: + patched.return_value = None + client = client_class(client_options=options) + patched.assert_called_once_with( + credentials=None, + credentials_file=None, + host=client.DEFAULT_MTLS_ENDPOINT, + scopes=None, + api_mtls_endpoint=client.DEFAULT_MTLS_ENDPOINT, + client_cert_source=client_cert_source_callback, + quota_project_id=None, + ) + + # Check the case api_endpoint is not provided, GOOGLE_API_USE_MTLS is + # "auto", and default_client_cert_source is provided. + with mock.patch.dict(os.environ, {"GOOGLE_API_USE_MTLS": "auto"}): + with mock.patch.object(transport_class, "__init__") as patched: + with mock.patch( + "google.auth.transport.mtls.has_default_client_cert_source", + return_value=True, + ): + patched.return_value = None + client = client_class() + patched.assert_called_once_with( + credentials=None, + credentials_file=None, + host=client.DEFAULT_MTLS_ENDPOINT, + scopes=None, + api_mtls_endpoint=client.DEFAULT_MTLS_ENDPOINT, + client_cert_source=None, + quota_project_id=None, + ) + + # Check the case api_endpoint is not provided, GOOGLE_API_USE_MTLS is + # "auto", but client_cert_source and default_client_cert_source are None. 
+ with mock.patch.dict(os.environ, {"GOOGLE_API_USE_MTLS": "auto"}): + with mock.patch.object(transport_class, "__init__") as patched: + with mock.patch( + "google.auth.transport.mtls.has_default_client_cert_source", + return_value=False, + ): + patched.return_value = None + client = client_class() + patched.assert_called_once_with( + credentials=None, + credentials_file=None, + host=client.DEFAULT_ENDPOINT, + scopes=None, + api_mtls_endpoint=client.DEFAULT_ENDPOINT, + client_cert_source=None, + quota_project_id=None, + ) + + # Check the case api_endpoint is not provided and GOOGLE_API_USE_MTLS has + # unsupported value. + with mock.patch.dict(os.environ, {"GOOGLE_API_USE_MTLS": "Unsupported"}): + with pytest.raises(MutualTLSChannelError): + client = client_class() + + # Check the case quota_project_id is provided + options = client_options.ClientOptions(quota_project_id="octopus") + with mock.patch.object(transport_class, "__init__") as patched: + patched.return_value = None + client = client_class(client_options=options) + patched.assert_called_once_with( + credentials=None, + credentials_file=None, + host=client.DEFAULT_ENDPOINT, + scopes=None, + api_mtls_endpoint=client.DEFAULT_ENDPOINT, + client_cert_source=None, + quota_project_id="octopus", + ) + + +@pytest.mark.parametrize( + "client_class,transport_class,transport_name", + [ + (PolicyTagManagerClient, transports.PolicyTagManagerGrpcTransport, "grpc"), + ( + PolicyTagManagerAsyncClient, + transports.PolicyTagManagerGrpcAsyncIOTransport, + "grpc_asyncio", + ), + ], +) +def test_policy_tag_manager_client_client_options_scopes( + client_class, transport_class, transport_name +): + # Check the case scopes are provided. 
+ options = client_options.ClientOptions(scopes=["1", "2"],) + with mock.patch.object(transport_class, "__init__") as patched: + patched.return_value = None + client = client_class(client_options=options) + patched.assert_called_once_with( + credentials=None, + credentials_file=None, + host=client.DEFAULT_ENDPOINT, + scopes=["1", "2"], + api_mtls_endpoint=client.DEFAULT_ENDPOINT, + client_cert_source=None, + quota_project_id=None, + ) + + +@pytest.mark.parametrize( + "client_class,transport_class,transport_name", + [ + (PolicyTagManagerClient, transports.PolicyTagManagerGrpcTransport, "grpc"), + ( + PolicyTagManagerAsyncClient, + transports.PolicyTagManagerGrpcAsyncIOTransport, + "grpc_asyncio", + ), + ], +) +def test_policy_tag_manager_client_client_options_credentials_file( + client_class, transport_class, transport_name +): + # Check the case credentials file is provided. + options = client_options.ClientOptions(credentials_file="credentials.json") + with mock.patch.object(transport_class, "__init__") as patched: + patched.return_value = None + client = client_class(client_options=options) + patched.assert_called_once_with( + credentials=None, + credentials_file="credentials.json", + host=client.DEFAULT_ENDPOINT, + scopes=None, + api_mtls_endpoint=client.DEFAULT_ENDPOINT, + client_cert_source=None, + quota_project_id=None, + ) + + +def test_policy_tag_manager_client_client_options_from_dict(): + with mock.patch( + "google.cloud.datacatalog_v1beta1.services.policy_tag_manager.transports.PolicyTagManagerGrpcTransport.__init__" + ) as grpc_transport: + grpc_transport.return_value = None + client = PolicyTagManagerClient( + client_options={"api_endpoint": "squid.clam.whelk"} + ) + grpc_transport.assert_called_once_with( + credentials=None, + credentials_file=None, + host="squid.clam.whelk", + scopes=None, + api_mtls_endpoint="squid.clam.whelk", + client_cert_source=None, + quota_project_id=None, + ) + + +def test_create_taxonomy( + transport: str = "grpc", request_type=policytagmanager.CreateTaxonomyRequest +): + client = PolicyTagManagerClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = request_type() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.create_taxonomy), "__call__") as call: + # Designate an appropriate return value for the call. + call.return_value = policytagmanager.Taxonomy( + name="name_value", + display_name="display_name_value", + description="description_value", + activated_policy_types=[ + policytagmanager.Taxonomy.PolicyType.FINE_GRAINED_ACCESS_CONTROL + ], + ) + + response = client.create_taxonomy(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0] == policytagmanager.CreateTaxonomyRequest() + + # Establish that the response is the type that we expect. + assert isinstance(response, policytagmanager.Taxonomy) + + assert response.name == "name_value" + + assert response.display_name == "display_name_value" + + assert response.description == "description_value" + + assert response.activated_policy_types == [ + policytagmanager.Taxonomy.PolicyType.FINE_GRAINED_ACCESS_CONTROL + ] + + +def test_create_taxonomy_from_dict(): + test_create_taxonomy(request_type=dict) + + +@pytest.mark.asyncio +async def test_create_taxonomy_async(transport: str = "grpc_asyncio"): + client = PolicyTagManagerAsyncClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request.
+ request = policytagmanager.CreateTaxonomyRequest() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.create_taxonomy), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + policytagmanager.Taxonomy( + name="name_value", + display_name="display_name_value", + description="description_value", + activated_policy_types=[ + policytagmanager.Taxonomy.PolicyType.FINE_GRAINED_ACCESS_CONTROL + ], + ) + ) + + response = await client.create_taxonomy(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0] == request + + # Establish that the response is the type that we expect. + assert isinstance(response, policytagmanager.Taxonomy) + + assert response.name == "name_value" + + assert response.display_name == "display_name_value" + + assert response.description == "description_value" + + assert response.activated_policy_types == [ + policytagmanager.Taxonomy.PolicyType.FINE_GRAINED_ACCESS_CONTROL + ] + + +def test_create_taxonomy_field_headers(): + client = PolicyTagManagerClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = policytagmanager.CreateTaxonomyRequest() + request.parent = "parent/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.create_taxonomy), "__call__") as call: + call.return_value = policytagmanager.Taxonomy() + + client.create_taxonomy(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. 
+ _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "parent=parent/value",) in kw["metadata"] + + +@pytest.mark.asyncio +async def test_create_taxonomy_field_headers_async(): + client = PolicyTagManagerAsyncClient( + credentials=credentials.AnonymousCredentials(), + ) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = policytagmanager.CreateTaxonomyRequest() + request.parent = "parent/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.create_taxonomy), "__call__" + ) as call: + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + policytagmanager.Taxonomy() + ) + + await client.create_taxonomy(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "parent=parent/value",) in kw["metadata"] + + +def test_create_taxonomy_flattened(): + client = PolicyTagManagerClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.create_taxonomy), "__call__") as call: + # Designate an appropriate return value for the call. + call.return_value = policytagmanager.Taxonomy() + + # Call the method with a truthy value for each flattened field, + # using the keyword arguments to the method. + client.create_taxonomy( + parent="parent_value", + taxonomy=policytagmanager.Taxonomy(name="name_value"), + ) + + # Establish that the underlying call was made with the expected + # request object values. 
+ assert len(call.mock_calls) == 1
+ _, args, _ = call.mock_calls[0]
+
+ assert args[0].parent == "parent_value"
+
+ assert args[0].taxonomy == policytagmanager.Taxonomy(name="name_value")
+
+
+def test_create_taxonomy_flattened_error():
+ client = PolicyTagManagerClient(credentials=credentials.AnonymousCredentials(),)
+
+ # Attempting to call a method with both a request object and flattened
+ # fields is an error.
+ with pytest.raises(ValueError):
+ client.create_taxonomy(
+ policytagmanager.CreateTaxonomyRequest(),
+ parent="parent_value",
+ taxonomy=policytagmanager.Taxonomy(name="name_value"),
+ )
+
+
+@pytest.mark.asyncio
+async def test_create_taxonomy_flattened_async():
+ client = PolicyTagManagerAsyncClient(
+ credentials=credentials.AnonymousCredentials(),
+ )
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(
+ type(client._client._transport.create_taxonomy), "__call__"
+ ) as call:
+ # Designate an appropriate return value for the call.
+ call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(
+ policytagmanager.Taxonomy()
+ )
+ # Call the method with a truthy value for each flattened field,
+ # using the keyword arguments to the method.
+ response = await client.create_taxonomy(
+ parent="parent_value",
+ taxonomy=policytagmanager.Taxonomy(name="name_value"),
+ )
+
+ # Establish that the underlying call was made with the expected
+ # request object values.
+ assert len(call.mock_calls)
+ _, args, _ = call.mock_calls[0]
+
+ assert args[0].parent == "parent_value"
+
+ assert args[0].taxonomy == policytagmanager.Taxonomy(name="name_value")
+
+
+@pytest.mark.asyncio
+async def test_create_taxonomy_flattened_error_async():
+ client = PolicyTagManagerAsyncClient(
+ credentials=credentials.AnonymousCredentials(),
+ )
+
+ # Attempting to call a method with both a request object and flattened
+ # fields is an error.
+ with pytest.raises(ValueError): + await client.create_taxonomy( + policytagmanager.CreateTaxonomyRequest(), + parent="parent_value", + taxonomy=policytagmanager.Taxonomy(name="name_value"), + ) + + +def test_delete_taxonomy( + transport: str = "grpc", request_type=policytagmanager.DeleteTaxonomyRequest +): + client = PolicyTagManagerClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = request_type() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.delete_taxonomy), "__call__") as call: + # Designate an appropriate return value for the call. + call.return_value = None + + response = client.delete_taxonomy(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0] == policytagmanager.DeleteTaxonomyRequest() + + # Establish that the response is the type that we expect. + assert response is None + + +def test_delete_taxonomy_from_dict(): + test_delete_taxonomy(request_type=dict) + + +@pytest.mark.asyncio +async def test_delete_taxonomy_async(transport: str = "grpc_asyncio"): + client = PolicyTagManagerAsyncClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = policytagmanager.DeleteTaxonomyRequest() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.delete_taxonomy), "__call__" + ) as call: + # Designate an appropriate return value for the call. 
+ call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(None) + + response = await client.delete_taxonomy(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0] == request + + # Establish that the response is the type that we expect. + assert response is None + + +def test_delete_taxonomy_field_headers(): + client = PolicyTagManagerClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = policytagmanager.DeleteTaxonomyRequest() + request.name = "name/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.delete_taxonomy), "__call__") as call: + call.return_value = None + + client.delete_taxonomy(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "name=name/value",) in kw["metadata"] + + +@pytest.mark.asyncio +async def test_delete_taxonomy_field_headers_async(): + client = PolicyTagManagerAsyncClient( + credentials=credentials.AnonymousCredentials(), + ) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = policytagmanager.DeleteTaxonomyRequest() + request.name = "name/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.delete_taxonomy), "__call__" + ) as call: + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(None) + + await client.delete_taxonomy(request) + + # Establish that the underlying gRPC stub method was called. 
+ assert len(call.mock_calls)
+ _, args, _ = call.mock_calls[0]
+ assert args[0] == request
+
+ # Establish that the field header was sent.
+ _, _, kw = call.mock_calls[0]
+ assert ("x-goog-request-params", "name=name/value",) in kw["metadata"]
+
+
+def test_delete_taxonomy_flattened():
+ client = PolicyTagManagerClient(credentials=credentials.AnonymousCredentials(),)
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(type(client._transport.delete_taxonomy), "__call__") as call:
+ # Designate an appropriate return value for the call.
+ call.return_value = None
+
+ # Call the method with a truthy value for each flattened field,
+ # using the keyword arguments to the method.
+ client.delete_taxonomy(name="name_value",)
+
+ # Establish that the underlying call was made with the expected
+ # request object values.
+ assert len(call.mock_calls) == 1
+ _, args, _ = call.mock_calls[0]
+
+ assert args[0].name == "name_value"
+
+
+def test_delete_taxonomy_flattened_error():
+ client = PolicyTagManagerClient(credentials=credentials.AnonymousCredentials(),)
+
+ # Attempting to call a method with both a request object and flattened
+ # fields is an error.
+ with pytest.raises(ValueError):
+ client.delete_taxonomy(
+ policytagmanager.DeleteTaxonomyRequest(), name="name_value",
+ )
+
+
+@pytest.mark.asyncio
+async def test_delete_taxonomy_flattened_async():
+ client = PolicyTagManagerAsyncClient(
+ credentials=credentials.AnonymousCredentials(),
+ )
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(
+ type(client._client._transport.delete_taxonomy), "__call__"
+ ) as call:
+ # Designate an appropriate return value for the call.
+ call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(None)
+ # Call the method with a truthy value for each flattened field,
+ # using the keyword arguments to the method.
+ response = await client.delete_taxonomy(name="name_value",) + + # Establish that the underlying call was made with the expected + # request object values. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0].name == "name_value" + + +@pytest.mark.asyncio +async def test_delete_taxonomy_flattened_error_async(): + client = PolicyTagManagerAsyncClient( + credentials=credentials.AnonymousCredentials(), + ) + + # Attempting to call a method with both a request object and flattened + # fields is an error. + with pytest.raises(ValueError): + await client.delete_taxonomy( + policytagmanager.DeleteTaxonomyRequest(), name="name_value", + ) + + +def test_update_taxonomy( + transport: str = "grpc", request_type=policytagmanager.UpdateTaxonomyRequest +): + client = PolicyTagManagerClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = request_type() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.update_taxonomy), "__call__") as call: + # Designate an appropriate return value for the call. + call.return_value = policytagmanager.Taxonomy( + name="name_value", + display_name="display_name_value", + description="description_value", + activated_policy_types=[ + policytagmanager.Taxonomy.PolicyType.FINE_GRAINED_ACCESS_CONTROL + ], + ) + + response = client.update_taxonomy(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0] == policytagmanager.UpdateTaxonomyRequest() + + # Establish that the response is the type that we expect. 
+ assert isinstance(response, policytagmanager.Taxonomy) + + assert response.name == "name_value" + + assert response.display_name == "display_name_value" + + assert response.description == "description_value" + + assert response.activated_policy_types == [ + policytagmanager.Taxonomy.PolicyType.FINE_GRAINED_ACCESS_CONTROL + ] + + +def test_update_taxonomy_from_dict(): + test_update_taxonomy(request_type=dict) + + +@pytest.mark.asyncio +async def test_update_taxonomy_async(transport: str = "grpc_asyncio"): + client = PolicyTagManagerAsyncClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = policytagmanager.UpdateTaxonomyRequest() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.update_taxonomy), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + policytagmanager.Taxonomy( + name="name_value", + display_name="display_name_value", + description="description_value", + activated_policy_types=[ + policytagmanager.Taxonomy.PolicyType.FINE_GRAINED_ACCESS_CONTROL + ], + ) + ) + + response = await client.update_taxonomy(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0] == request + + # Establish that the response is the type that we expect. 
+ assert isinstance(response, policytagmanager.Taxonomy) + + assert response.name == "name_value" + + assert response.display_name == "display_name_value" + + assert response.description == "description_value" + + assert response.activated_policy_types == [ + policytagmanager.Taxonomy.PolicyType.FINE_GRAINED_ACCESS_CONTROL + ] + + +def test_update_taxonomy_field_headers(): + client = PolicyTagManagerClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = policytagmanager.UpdateTaxonomyRequest() + request.taxonomy.name = "taxonomy.name/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.update_taxonomy), "__call__") as call: + call.return_value = policytagmanager.Taxonomy() + + client.update_taxonomy(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "taxonomy.name=taxonomy.name/value",) in kw[ + "metadata" + ] + + +@pytest.mark.asyncio +async def test_update_taxonomy_field_headers_async(): + client = PolicyTagManagerAsyncClient( + credentials=credentials.AnonymousCredentials(), + ) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = policytagmanager.UpdateTaxonomyRequest() + request.taxonomy.name = "taxonomy.name/value" + + # Mock the actual call within the gRPC stub, and fake the request. 
+ with mock.patch.object( + type(client._client._transport.update_taxonomy), "__call__" + ) as call: + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + policytagmanager.Taxonomy() + ) + + await client.update_taxonomy(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "taxonomy.name=taxonomy.name/value",) in kw[ + "metadata" + ] + + +def test_update_taxonomy_flattened(): + client = PolicyTagManagerClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.update_taxonomy), "__call__") as call: + # Designate an appropriate return value for the call. + call.return_value = policytagmanager.Taxonomy() + + # Call the method with a truthy value for each flattened field, + # using the keyword arguments to the method. + client.update_taxonomy(taxonomy=policytagmanager.Taxonomy(name="name_value"),) + + # Establish that the underlying call was made with the expected + # request object values. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0].taxonomy == policytagmanager.Taxonomy(name="name_value") + + +def test_update_taxonomy_flattened_error(): + client = PolicyTagManagerClient(credentials=credentials.AnonymousCredentials(),) + + # Attempting to call a method with both a request object and flattened + # fields is an error. 
+ with pytest.raises(ValueError):
+ client.update_taxonomy(
+ policytagmanager.UpdateTaxonomyRequest(),
+ taxonomy=policytagmanager.Taxonomy(name="name_value"),
+ )
+
+
+@pytest.mark.asyncio
+async def test_update_taxonomy_flattened_async():
+ client = PolicyTagManagerAsyncClient(
+ credentials=credentials.AnonymousCredentials(),
+ )
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(
+ type(client._client._transport.update_taxonomy), "__call__"
+ ) as call:
+ # Designate an appropriate return value for the call.
+ call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(
+ policytagmanager.Taxonomy()
+ )
+ # Call the method with a truthy value for each flattened field,
+ # using the keyword arguments to the method.
+ response = await client.update_taxonomy(
+ taxonomy=policytagmanager.Taxonomy(name="name_value"),
+ )
+
+ # Establish that the underlying call was made with the expected
+ # request object values.
+ assert len(call.mock_calls)
+ _, args, _ = call.mock_calls[0]
+
+ assert args[0].taxonomy == policytagmanager.Taxonomy(name="name_value")
+
+
+@pytest.mark.asyncio
+async def test_update_taxonomy_flattened_error_async():
+ client = PolicyTagManagerAsyncClient(
+ credentials=credentials.AnonymousCredentials(),
+ )
+
+ # Attempting to call a method with both a request object and flattened
+ # fields is an error.
+ with pytest.raises(ValueError):
+ await client.update_taxonomy(
+ policytagmanager.UpdateTaxonomyRequest(),
+ taxonomy=policytagmanager.Taxonomy(name="name_value"),
+ )
+
+
+def test_list_taxonomies(
+ transport: str = "grpc", request_type=policytagmanager.ListTaxonomiesRequest
+):
+ client = PolicyTagManagerClient(
+ credentials=credentials.AnonymousCredentials(), transport=transport,
+ )
+
+ # Everything is optional in proto3 as far as the runtime is concerned,
+ # and we are mocking out the actual API, so just send an empty request.
+ request = request_type() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.list_taxonomies), "__call__") as call: + # Designate an appropriate return value for the call. + call.return_value = policytagmanager.ListTaxonomiesResponse( + next_page_token="next_page_token_value", + ) + + response = client.list_taxonomies(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0] == policytagmanager.ListTaxonomiesRequest() + + # Establish that the response is the type that we expect. + assert isinstance(response, pagers.ListTaxonomiesPager) + + assert response.next_page_token == "next_page_token_value" + + +def test_list_taxonomies_from_dict(): + test_list_taxonomies(request_type=dict) + + +@pytest.mark.asyncio +async def test_list_taxonomies_async(transport: str = "grpc_asyncio"): + client = PolicyTagManagerAsyncClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = policytagmanager.ListTaxonomiesRequest() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.list_taxonomies), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + policytagmanager.ListTaxonomiesResponse( + next_page_token="next_page_token_value", + ) + ) + + response = await client.list_taxonomies(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0] == request + + # Establish that the response is the type that we expect. 
+ assert isinstance(response, pagers.ListTaxonomiesAsyncPager) + + assert response.next_page_token == "next_page_token_value" + + +def test_list_taxonomies_field_headers(): + client = PolicyTagManagerClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = policytagmanager.ListTaxonomiesRequest() + request.parent = "parent/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.list_taxonomies), "__call__") as call: + call.return_value = policytagmanager.ListTaxonomiesResponse() + + client.list_taxonomies(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "parent=parent/value",) in kw["metadata"] + + +@pytest.mark.asyncio +async def test_list_taxonomies_field_headers_async(): + client = PolicyTagManagerAsyncClient( + credentials=credentials.AnonymousCredentials(), + ) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = policytagmanager.ListTaxonomiesRequest() + request.parent = "parent/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.list_taxonomies), "__call__" + ) as call: + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + policytagmanager.ListTaxonomiesResponse() + ) + + await client.list_taxonomies(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. 
+ _, _, kw = call.mock_calls[0]
+ assert ("x-goog-request-params", "parent=parent/value",) in kw["metadata"]
+
+
+def test_list_taxonomies_flattened():
+ client = PolicyTagManagerClient(credentials=credentials.AnonymousCredentials(),)
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(type(client._transport.list_taxonomies), "__call__") as call:
+ # Designate an appropriate return value for the call.
+ call.return_value = policytagmanager.ListTaxonomiesResponse()
+
+ # Call the method with a truthy value for each flattened field,
+ # using the keyword arguments to the method.
+ client.list_taxonomies(parent="parent_value",)
+
+ # Establish that the underlying call was made with the expected
+ # request object values.
+ assert len(call.mock_calls) == 1
+ _, args, _ = call.mock_calls[0]
+
+ assert args[0].parent == "parent_value"
+
+
+def test_list_taxonomies_flattened_error():
+ client = PolicyTagManagerClient(credentials=credentials.AnonymousCredentials(),)
+
+ # Attempting to call a method with both a request object and flattened
+ # fields is an error.
+ with pytest.raises(ValueError):
+ client.list_taxonomies(
+ policytagmanager.ListTaxonomiesRequest(), parent="parent_value",
+ )
+
+
+@pytest.mark.asyncio
+async def test_list_taxonomies_flattened_async():
+ client = PolicyTagManagerAsyncClient(
+ credentials=credentials.AnonymousCredentials(),
+ )
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(
+ type(client._client._transport.list_taxonomies), "__call__"
+ ) as call:
+ # Designate an appropriate return value for the call.
+ call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(
+ policytagmanager.ListTaxonomiesResponse()
+ )
+ # Call the method with a truthy value for each flattened field,
+ # using the keyword arguments to the method.
+ response = await client.list_taxonomies(parent="parent_value",)
+
+ # Establish that the underlying call was made with the expected
+ # request object values.
+ assert len(call.mock_calls)
+ _, args, _ = call.mock_calls[0]
+
+ assert args[0].parent == "parent_value"
+
+
+@pytest.mark.asyncio
+async def test_list_taxonomies_flattened_error_async():
+ client = PolicyTagManagerAsyncClient(
+ credentials=credentials.AnonymousCredentials(),
+ )
+
+ # Attempting to call a method with both a request object and flattened
+ # fields is an error.
+ with pytest.raises(ValueError):
+ await client.list_taxonomies(
+ policytagmanager.ListTaxonomiesRequest(), parent="parent_value",
+ )
+
+
+def test_list_taxonomies_pager():
+ client = PolicyTagManagerClient(credentials=credentials.AnonymousCredentials(),)
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(type(client._transport.list_taxonomies), "__call__") as call:
+ # Set the response to a series of pages.
+ call.side_effect = (
+ policytagmanager.ListTaxonomiesResponse(
+ taxonomies=[
+ policytagmanager.Taxonomy(),
+ policytagmanager.Taxonomy(),
+ policytagmanager.Taxonomy(),
+ ],
+ next_page_token="abc",
+ ),
+ policytagmanager.ListTaxonomiesResponse(
+ taxonomies=[], next_page_token="def",
+ ),
+ policytagmanager.ListTaxonomiesResponse(
+ taxonomies=[policytagmanager.Taxonomy(),], next_page_token="ghi",
+ ),
+ policytagmanager.ListTaxonomiesResponse(
+ taxonomies=[policytagmanager.Taxonomy(), policytagmanager.Taxonomy(),],
+ ),
+ RuntimeError,
+ )
+
+ metadata = ()
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("parent", ""),)),
+ )
+ pager = client.list_taxonomies(request={})
+
+ assert pager._metadata == metadata
+
+ results = [i for i in pager]
+ assert len(results) == 6
+ assert all(isinstance(i, policytagmanager.Taxonomy) for i in results)
+
+
+def test_list_taxonomies_pages():
+ client = PolicyTagManagerClient(credentials=credentials.AnonymousCredentials(),)
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(type(client._transport.list_taxonomies), "__call__") as call:
+ # Set the response to a series of pages.
+ call.side_effect = (
+ policytagmanager.ListTaxonomiesResponse(
+ taxonomies=[
+ policytagmanager.Taxonomy(),
+ policytagmanager.Taxonomy(),
+ policytagmanager.Taxonomy(),
+ ],
+ next_page_token="abc",
+ ),
+ policytagmanager.ListTaxonomiesResponse(
+ taxonomies=[], next_page_token="def",
+ ),
+ policytagmanager.ListTaxonomiesResponse(
+ taxonomies=[policytagmanager.Taxonomy(),], next_page_token="ghi",
+ ),
+ policytagmanager.ListTaxonomiesResponse(
+ taxonomies=[policytagmanager.Taxonomy(), policytagmanager.Taxonomy(),],
+ ),
+ RuntimeError,
+ )
+ pages = list(client.list_taxonomies(request={}).pages)
+ for page, token in zip(pages, ["abc", "def", "ghi", ""]):
+ assert page.raw_page.next_page_token == token
+
+
+@pytest.mark.asyncio
+async def test_list_taxonomies_async_pager():
+ client = PolicyTagManagerAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(
+ type(client._client._transport.list_taxonomies),
+ "__call__",
+ new_callable=mock.AsyncMock,
+ ) as call:
+ # Set the response to a series of pages.
+ call.side_effect = (
+ policytagmanager.ListTaxonomiesResponse(
+ taxonomies=[
+ policytagmanager.Taxonomy(),
+ policytagmanager.Taxonomy(),
+ policytagmanager.Taxonomy(),
+ ],
+ next_page_token="abc",
+ ),
+ policytagmanager.ListTaxonomiesResponse(
+ taxonomies=[], next_page_token="def",
+ ),
+ policytagmanager.ListTaxonomiesResponse(
+ taxonomies=[policytagmanager.Taxonomy(),], next_page_token="ghi",
+ ),
+ policytagmanager.ListTaxonomiesResponse(
+ taxonomies=[policytagmanager.Taxonomy(), policytagmanager.Taxonomy(),],
+ ),
+ RuntimeError,
+ )
+ async_pager = await client.list_taxonomies(request={},)
+ assert async_pager.next_page_token == "abc"
+ responses = []
+ async for response in async_pager:
+ responses.append(response)
+
+ assert len(responses) == 6
+ assert all(isinstance(i, policytagmanager.Taxonomy) for i in responses)
+
+
+@pytest.mark.asyncio
+async def test_list_taxonomies_async_pages():
+ client = PolicyTagManagerAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(
+ type(client._client._transport.list_taxonomies),
+ "__call__",
+ new_callable=mock.AsyncMock,
+ ) as call:
+ # Set the response to a series of pages.
+        call.side_effect = (
+            policytagmanager.ListTaxonomiesResponse(
+                taxonomies=[
+                    policytagmanager.Taxonomy(),
+                    policytagmanager.Taxonomy(),
+                    policytagmanager.Taxonomy(),
+                ],
+                next_page_token="abc",
+            ),
+            policytagmanager.ListTaxonomiesResponse(
+                taxonomies=[], next_page_token="def",
+            ),
+            policytagmanager.ListTaxonomiesResponse(
+                taxonomies=[policytagmanager.Taxonomy(),], next_page_token="ghi",
+            ),
+            policytagmanager.ListTaxonomiesResponse(
+                taxonomies=[policytagmanager.Taxonomy(), policytagmanager.Taxonomy(),],
+            ),
+            RuntimeError,
+        )
+        pages = []
+        async for page in (await client.list_taxonomies(request={})).pages:
+            pages.append(page)
+        for page, token in zip(pages, ["abc", "def", "ghi", ""]):
+            assert page.raw_page.next_page_token == token
+
+
+def test_get_taxonomy(
+    transport: str = "grpc", request_type=policytagmanager.GetTaxonomyRequest
+):
+    client = PolicyTagManagerClient(
+        credentials=credentials.AnonymousCredentials(), transport=transport,
+    )
+
+    # Everything is optional in proto3 as far as the runtime is concerned,
+    # and we are mocking out the actual API, so just send an empty request.
+    request = request_type()
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(type(client._transport.get_taxonomy), "__call__") as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = policytagmanager.Taxonomy(
+            name="name_value",
+            display_name="display_name_value",
+            description="description_value",
+            activated_policy_types=[
+                policytagmanager.Taxonomy.PolicyType.FINE_GRAINED_ACCESS_CONTROL
+            ],
+        )
+
+        response = client.get_taxonomy(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls) == 1
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0] == policytagmanager.GetTaxonomyRequest()
+
+    # Establish that the response is the type that we expect.
+    assert isinstance(response, policytagmanager.Taxonomy)
+
+    assert response.name == "name_value"
+
+    assert response.display_name == "display_name_value"
+
+    assert response.description == "description_value"
+
+    assert response.activated_policy_types == [
+        policytagmanager.Taxonomy.PolicyType.FINE_GRAINED_ACCESS_CONTROL
+    ]
+
+
+def test_get_taxonomy_from_dict():
+    test_get_taxonomy(request_type=dict)
+
+
+@pytest.mark.asyncio
+async def test_get_taxonomy_async(transport: str = "grpc_asyncio"):
+    client = PolicyTagManagerAsyncClient(
+        credentials=credentials.AnonymousCredentials(), transport=transport,
+    )
+
+    # Everything is optional in proto3 as far as the runtime is concerned,
+    # and we are mocking out the actual API, so just send an empty request.
+    request = policytagmanager.GetTaxonomyRequest()
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.get_taxonomy), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(
+            policytagmanager.Taxonomy(
+                name="name_value",
+                display_name="display_name_value",
+                description="description_value",
+                activated_policy_types=[
+                    policytagmanager.Taxonomy.PolicyType.FINE_GRAINED_ACCESS_CONTROL
+                ],
+            )
+        )
+
+        response = await client.get_taxonomy(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0] == request
+
+    # Establish that the response is the type that we expect.
+    assert isinstance(response, policytagmanager.Taxonomy)
+
+    assert response.name == "name_value"
+
+    assert response.display_name == "display_name_value"
+
+    assert response.description == "description_value"
+
+    assert response.activated_policy_types == [
+        policytagmanager.Taxonomy.PolicyType.FINE_GRAINED_ACCESS_CONTROL
+    ]
+
+
+def test_get_taxonomy_field_headers():
+    client = PolicyTagManagerClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Any value that is part of the HTTP/1.1 URI should be sent as
+    # a field header. Set these to a non-empty value.
+    request = policytagmanager.GetTaxonomyRequest()
+    request.name = "name/value"
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(type(client._transport.get_taxonomy), "__call__") as call:
+        call.return_value = policytagmanager.Taxonomy()
+
+        client.get_taxonomy(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls) == 1
+        _, args, _ = call.mock_calls[0]
+        assert args[0] == request
+
+    # Establish that the field header was sent.
+    _, _, kw = call.mock_calls[0]
+    assert ("x-goog-request-params", "name=name/value",) in kw["metadata"]
+
+
+@pytest.mark.asyncio
+async def test_get_taxonomy_field_headers_async():
+    client = PolicyTagManagerAsyncClient(
+        credentials=credentials.AnonymousCredentials(),
+    )
+
+    # Any value that is part of the HTTP/1.1 URI should be sent as
+    # a field header. Set these to a non-empty value.
+    request = policytagmanager.GetTaxonomyRequest()
+    request.name = "name/value"
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.get_taxonomy), "__call__"
+    ) as call:
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(
+            policytagmanager.Taxonomy()
+        )
+
+        await client.get_taxonomy(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+        assert args[0] == request
+
+    # Establish that the field header was sent.
+    _, _, kw = call.mock_calls[0]
+    assert ("x-goog-request-params", "name=name/value",) in kw["metadata"]
+
+
+def test_get_taxonomy_flattened():
+    client = PolicyTagManagerClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(type(client._transport.get_taxonomy), "__call__") as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = policytagmanager.Taxonomy()
+
+        # Call the method with a truthy value for each flattened field,
+        # using the keyword arguments to the method.
+        client.get_taxonomy(name="name_value",)
+
+        # Establish that the underlying call was made with the expected
+        # request object values.
+        assert len(call.mock_calls) == 1
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0].name == "name_value"
+
+
+def test_get_taxonomy_flattened_error():
+    client = PolicyTagManagerClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Attempting to call a method with both a request object and flattened
+    # fields is an error.
+    with pytest.raises(ValueError):
+        client.get_taxonomy(
+            policytagmanager.GetTaxonomyRequest(), name="name_value",
+        )
+
+
+@pytest.mark.asyncio
+async def test_get_taxonomy_flattened_async():
+    client = PolicyTagManagerAsyncClient(
+        credentials=credentials.AnonymousCredentials(),
+    )
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.get_taxonomy), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(
+            policytagmanager.Taxonomy()
+        )
+        # Call the method with a truthy value for each flattened field,
+        # using the keyword arguments to the method.
+        response = await client.get_taxonomy(name="name_value",)
+
+        # Establish that the underlying call was made with the expected
+        # request object values.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0].name == "name_value"
+
+
+@pytest.mark.asyncio
+async def test_get_taxonomy_flattened_error_async():
+    client = PolicyTagManagerAsyncClient(
+        credentials=credentials.AnonymousCredentials(),
+    )
+
+    # Attempting to call a method with both a request object and flattened
+    # fields is an error.
+    with pytest.raises(ValueError):
+        await client.get_taxonomy(
+            policytagmanager.GetTaxonomyRequest(), name="name_value",
+        )
+
+
+def test_create_policy_tag(
+    transport: str = "grpc", request_type=policytagmanager.CreatePolicyTagRequest
+):
+    client = PolicyTagManagerClient(
+        credentials=credentials.AnonymousCredentials(), transport=transport,
+    )
+
+    # Everything is optional in proto3 as far as the runtime is concerned,
+    # and we are mocking out the actual API, so just send an empty request.
+    request = request_type()
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._transport.create_policy_tag), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = policytagmanager.PolicyTag(
+            name="name_value",
+            display_name="display_name_value",
+            description="description_value",
+            parent_policy_tag="parent_policy_tag_value",
+            child_policy_tags=["child_policy_tags_value"],
+        )
+
+        response = client.create_policy_tag(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls) == 1
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0] == policytagmanager.CreatePolicyTagRequest()
+
+    # Establish that the response is the type that we expect.
+    assert isinstance(response, policytagmanager.PolicyTag)
+
+    assert response.name == "name_value"
+
+    assert response.display_name == "display_name_value"
+
+    assert response.description == "description_value"
+
+    assert response.parent_policy_tag == "parent_policy_tag_value"
+
+    assert response.child_policy_tags == ["child_policy_tags_value"]
+
+
+def test_create_policy_tag_from_dict():
+    test_create_policy_tag(request_type=dict)
+
+
+@pytest.mark.asyncio
+async def test_create_policy_tag_async(transport: str = "grpc_asyncio"):
+    client = PolicyTagManagerAsyncClient(
+        credentials=credentials.AnonymousCredentials(), transport=transport,
+    )
+
+    # Everything is optional in proto3 as far as the runtime is concerned,
+    # and we are mocking out the actual API, so just send an empty request.
+    request = policytagmanager.CreatePolicyTagRequest()
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.create_policy_tag), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(
+            policytagmanager.PolicyTag(
+                name="name_value",
+                display_name="display_name_value",
+                description="description_value",
+                parent_policy_tag="parent_policy_tag_value",
+                child_policy_tags=["child_policy_tags_value"],
+            )
+        )
+
+        response = await client.create_policy_tag(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0] == request
+
+    # Establish that the response is the type that we expect.
+    assert isinstance(response, policytagmanager.PolicyTag)
+
+    assert response.name == "name_value"
+
+    assert response.display_name == "display_name_value"
+
+    assert response.description == "description_value"
+
+    assert response.parent_policy_tag == "parent_policy_tag_value"
+
+    assert response.child_policy_tags == ["child_policy_tags_value"]
+
+
+def test_create_policy_tag_field_headers():
+    client = PolicyTagManagerClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Any value that is part of the HTTP/1.1 URI should be sent as
+    # a field header. Set these to a non-empty value.
+    request = policytagmanager.CreatePolicyTagRequest()
+    request.parent = "parent/value"
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._transport.create_policy_tag), "__call__"
+    ) as call:
+        call.return_value = policytagmanager.PolicyTag()
+
+        client.create_policy_tag(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls) == 1
+        _, args, _ = call.mock_calls[0]
+        assert args[0] == request
+
+    # Establish that the field header was sent.
+    _, _, kw = call.mock_calls[0]
+    assert ("x-goog-request-params", "parent=parent/value",) in kw["metadata"]
+
+
+@pytest.mark.asyncio
+async def test_create_policy_tag_field_headers_async():
+    client = PolicyTagManagerAsyncClient(
+        credentials=credentials.AnonymousCredentials(),
+    )
+
+    # Any value that is part of the HTTP/1.1 URI should be sent as
+    # a field header. Set these to a non-empty value.
+    request = policytagmanager.CreatePolicyTagRequest()
+    request.parent = "parent/value"
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.create_policy_tag), "__call__"
+    ) as call:
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(
+            policytagmanager.PolicyTag()
+        )
+
+        await client.create_policy_tag(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+        assert args[0] == request
+
+    # Establish that the field header was sent.
+    _, _, kw = call.mock_calls[0]
+    assert ("x-goog-request-params", "parent=parent/value",) in kw["metadata"]
+
+
+def test_create_policy_tag_flattened():
+    client = PolicyTagManagerClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._transport.create_policy_tag), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = policytagmanager.PolicyTag()
+
+        # Call the method with a truthy value for each flattened field,
+        # using the keyword arguments to the method.
+        client.create_policy_tag(
+            parent="parent_value",
+            policy_tag=policytagmanager.PolicyTag(name="name_value"),
+        )
+
+        # Establish that the underlying call was made with the expected
+        # request object values.
+        assert len(call.mock_calls) == 1
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0].parent == "parent_value"
+
+        assert args[0].policy_tag == policytagmanager.PolicyTag(name="name_value")
+
+
+def test_create_policy_tag_flattened_error():
+    client = PolicyTagManagerClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Attempting to call a method with both a request object and flattened
+    # fields is an error.
+    with pytest.raises(ValueError):
+        client.create_policy_tag(
+            policytagmanager.CreatePolicyTagRequest(),
+            parent="parent_value",
+            policy_tag=policytagmanager.PolicyTag(name="name_value"),
+        )
+
+
+@pytest.mark.asyncio
+async def test_create_policy_tag_flattened_async():
+    client = PolicyTagManagerAsyncClient(
+        credentials=credentials.AnonymousCredentials(),
+    )
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.create_policy_tag), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(
+            policytagmanager.PolicyTag()
+        )
+        # Call the method with a truthy value for each flattened field,
+        # using the keyword arguments to the method.
+        response = await client.create_policy_tag(
+            parent="parent_value",
+            policy_tag=policytagmanager.PolicyTag(name="name_value"),
+        )
+
+        # Establish that the underlying call was made with the expected
+        # request object values.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0].parent == "parent_value"
+
+        assert args[0].policy_tag == policytagmanager.PolicyTag(name="name_value")
+
+
+@pytest.mark.asyncio
+async def test_create_policy_tag_flattened_error_async():
+    client = PolicyTagManagerAsyncClient(
+        credentials=credentials.AnonymousCredentials(),
+    )
+
+    # Attempting to call a method with both a request object and flattened
+    # fields is an error.
+    with pytest.raises(ValueError):
+        await client.create_policy_tag(
+            policytagmanager.CreatePolicyTagRequest(),
+            parent="parent_value",
+            policy_tag=policytagmanager.PolicyTag(name="name_value"),
+        )
+
+
+def test_delete_policy_tag(
+    transport: str = "grpc", request_type=policytagmanager.DeletePolicyTagRequest
+):
+    client = PolicyTagManagerClient(
+        credentials=credentials.AnonymousCredentials(), transport=transport,
+    )
+
+    # Everything is optional in proto3 as far as the runtime is concerned,
+    # and we are mocking out the actual API, so just send an empty request.
+    request = request_type()
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._transport.delete_policy_tag), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = None
+
+        response = client.delete_policy_tag(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls) == 1
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0] == policytagmanager.DeletePolicyTagRequest()
+
+    # Establish that the response is the type that we expect.
+    assert response is None
+
+
+def test_delete_policy_tag_from_dict():
+    test_delete_policy_tag(request_type=dict)
+
+
+@pytest.mark.asyncio
+async def test_delete_policy_tag_async(transport: str = "grpc_asyncio"):
+    client = PolicyTagManagerAsyncClient(
+        credentials=credentials.AnonymousCredentials(), transport=transport,
+    )
+
+    # Everything is optional in proto3 as far as the runtime is concerned,
+    # and we are mocking out the actual API, so just send an empty request.
+    request = policytagmanager.DeletePolicyTagRequest()
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.delete_policy_tag), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(None)
+
+        response = await client.delete_policy_tag(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0] == request
+
+    # Establish that the response is the type that we expect.
+    assert response is None
+
+
+def test_delete_policy_tag_field_headers():
+    client = PolicyTagManagerClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Any value that is part of the HTTP/1.1 URI should be sent as
+    # a field header. Set these to a non-empty value.
+    request = policytagmanager.DeletePolicyTagRequest()
+    request.name = "name/value"
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._transport.delete_policy_tag), "__call__"
+    ) as call:
+        call.return_value = None
+
+        client.delete_policy_tag(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls) == 1
+        _, args, _ = call.mock_calls[0]
+        assert args[0] == request
+
+    # Establish that the field header was sent.
+    _, _, kw = call.mock_calls[0]
+    assert ("x-goog-request-params", "name=name/value",) in kw["metadata"]
+
+
+@pytest.mark.asyncio
+async def test_delete_policy_tag_field_headers_async():
+    client = PolicyTagManagerAsyncClient(
+        credentials=credentials.AnonymousCredentials(),
+    )
+
+    # Any value that is part of the HTTP/1.1 URI should be sent as
+    # a field header. Set these to a non-empty value.
+    request = policytagmanager.DeletePolicyTagRequest()
+    request.name = "name/value"
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.delete_policy_tag), "__call__"
+    ) as call:
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(None)
+
+        await client.delete_policy_tag(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+        assert args[0] == request
+
+    # Establish that the field header was sent.
+    _, _, kw = call.mock_calls[0]
+    assert ("x-goog-request-params", "name=name/value",) in kw["metadata"]
+
+
+def test_delete_policy_tag_flattened():
+    client = PolicyTagManagerClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._transport.delete_policy_tag), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = None
+
+        # Call the method with a truthy value for each flattened field,
+        # using the keyword arguments to the method.
+        client.delete_policy_tag(name="name_value",)
+
+        # Establish that the underlying call was made with the expected
+        # request object values.
+        assert len(call.mock_calls) == 1
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0].name == "name_value"
+
+
+def test_delete_policy_tag_flattened_error():
+    client = PolicyTagManagerClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Attempting to call a method with both a request object and flattened
+    # fields is an error.
+    with pytest.raises(ValueError):
+        client.delete_policy_tag(
+            policytagmanager.DeletePolicyTagRequest(), name="name_value",
+        )
+
+
+@pytest.mark.asyncio
+async def test_delete_policy_tag_flattened_async():
+    client = PolicyTagManagerAsyncClient(
+        credentials=credentials.AnonymousCredentials(),
+    )
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.delete_policy_tag), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(None)
+        # Call the method with a truthy value for each flattened field,
+        # using the keyword arguments to the method.
+        response = await client.delete_policy_tag(name="name_value",)
+
+        # Establish that the underlying call was made with the expected
+        # request object values.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0].name == "name_value"
+
+
+@pytest.mark.asyncio
+async def test_delete_policy_tag_flattened_error_async():
+    client = PolicyTagManagerAsyncClient(
+        credentials=credentials.AnonymousCredentials(),
+    )
+
+    # Attempting to call a method with both a request object and flattened
+    # fields is an error.
+    with pytest.raises(ValueError):
+        await client.delete_policy_tag(
+            policytagmanager.DeletePolicyTagRequest(), name="name_value",
+        )
+
+
+def test_update_policy_tag(
+    transport: str = "grpc", request_type=policytagmanager.UpdatePolicyTagRequest
+):
+    client = PolicyTagManagerClient(
+        credentials=credentials.AnonymousCredentials(), transport=transport,
+    )
+
+    # Everything is optional in proto3 as far as the runtime is concerned,
+    # and we are mocking out the actual API, so just send an empty request.
+    request = request_type()
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._transport.update_policy_tag), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = policytagmanager.PolicyTag(
+            name="name_value",
+            display_name="display_name_value",
+            description="description_value",
+            parent_policy_tag="parent_policy_tag_value",
+            child_policy_tags=["child_policy_tags_value"],
+        )
+
+        response = client.update_policy_tag(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls) == 1
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0] == policytagmanager.UpdatePolicyTagRequest()
+
+    # Establish that the response is the type that we expect.
+    assert isinstance(response, policytagmanager.PolicyTag)
+
+    assert response.name == "name_value"
+
+    assert response.display_name == "display_name_value"
+
+    assert response.description == "description_value"
+
+    assert response.parent_policy_tag == "parent_policy_tag_value"
+
+    assert response.child_policy_tags == ["child_policy_tags_value"]
+
+
+def test_update_policy_tag_from_dict():
+    test_update_policy_tag(request_type=dict)
+
+
+@pytest.mark.asyncio
+async def test_update_policy_tag_async(transport: str = "grpc_asyncio"):
+    client = PolicyTagManagerAsyncClient(
+        credentials=credentials.AnonymousCredentials(), transport=transport,
+    )
+
+    # Everything is optional in proto3 as far as the runtime is concerned,
+    # and we are mocking out the actual API, so just send an empty request.
+    request = policytagmanager.UpdatePolicyTagRequest()
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.update_policy_tag), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(
+            policytagmanager.PolicyTag(
+                name="name_value",
+                display_name="display_name_value",
+                description="description_value",
+                parent_policy_tag="parent_policy_tag_value",
+                child_policy_tags=["child_policy_tags_value"],
+            )
+        )
+
+        response = await client.update_policy_tag(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0] == request
+
+    # Establish that the response is the type that we expect.
+    assert isinstance(response, policytagmanager.PolicyTag)
+
+    assert response.name == "name_value"
+
+    assert response.display_name == "display_name_value"
+
+    assert response.description == "description_value"
+
+    assert response.parent_policy_tag == "parent_policy_tag_value"
+
+    assert response.child_policy_tags == ["child_policy_tags_value"]
+
+
+def test_update_policy_tag_field_headers():
+    client = PolicyTagManagerClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Any value that is part of the HTTP/1.1 URI should be sent as
+    # a field header. Set these to a non-empty value.
+    request = policytagmanager.UpdatePolicyTagRequest()
+    request.policy_tag.name = "policy_tag.name/value"
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._transport.update_policy_tag), "__call__"
+    ) as call:
+        call.return_value = policytagmanager.PolicyTag()
+
+        client.update_policy_tag(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls) == 1
+        _, args, _ = call.mock_calls[0]
+        assert args[0] == request
+
+    # Establish that the field header was sent.
+    _, _, kw = call.mock_calls[0]
+    assert ("x-goog-request-params", "policy_tag.name=policy_tag.name/value",) in kw[
+        "metadata"
+    ]
+
+
+@pytest.mark.asyncio
+async def test_update_policy_tag_field_headers_async():
+    client = PolicyTagManagerAsyncClient(
+        credentials=credentials.AnonymousCredentials(),
+    )
+
+    # Any value that is part of the HTTP/1.1 URI should be sent as
+    # a field header. Set these to a non-empty value.
+    request = policytagmanager.UpdatePolicyTagRequest()
+    request.policy_tag.name = "policy_tag.name/value"
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.update_policy_tag), "__call__"
+    ) as call:
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(
+            policytagmanager.PolicyTag()
+        )
+
+        await client.update_policy_tag(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+        assert args[0] == request
+
+    # Establish that the field header was sent.
+    _, _, kw = call.mock_calls[0]
+    assert ("x-goog-request-params", "policy_tag.name=policy_tag.name/value",) in kw[
+        "metadata"
+    ]
+
+
+def test_update_policy_tag_flattened():
+    client = PolicyTagManagerClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._transport.update_policy_tag), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = policytagmanager.PolicyTag()
+
+        # Call the method with a truthy value for each flattened field,
+        # using the keyword arguments to the method.
+        client.update_policy_tag(
+            policy_tag=policytagmanager.PolicyTag(name="name_value"),
+        )
+
+        # Establish that the underlying call was made with the expected
+        # request object values.
+        assert len(call.mock_calls) == 1
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0].policy_tag == policytagmanager.PolicyTag(name="name_value")
+
+
+def test_update_policy_tag_flattened_error():
+    client = PolicyTagManagerClient(credentials=credentials.AnonymousCredentials(),)
+
+    # Attempting to call a method with both a request object and flattened
+    # fields is an error.
+    with pytest.raises(ValueError):
+        client.update_policy_tag(
+            policytagmanager.UpdatePolicyTagRequest(),
+            policy_tag=policytagmanager.PolicyTag(name="name_value"),
+        )
+
+
+@pytest.mark.asyncio
+async def test_update_policy_tag_flattened_async():
+    client = PolicyTagManagerAsyncClient(
+        credentials=credentials.AnonymousCredentials(),
+    )
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._client._transport.update_policy_tag), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(
+            policytagmanager.PolicyTag()
+        )
+        # Call the method with a truthy value for each flattened field,
+        # using the keyword arguments to the method.
+        response = await client.update_policy_tag(
+            policy_tag=policytagmanager.PolicyTag(name="name_value"),
+        )
+
+        # Establish that the underlying call was made with the expected
+        # request object values.
+        assert len(call.mock_calls)
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0].policy_tag == policytagmanager.PolicyTag(name="name_value")
+
+
+@pytest.mark.asyncio
+async def test_update_policy_tag_flattened_error_async():
+    client = PolicyTagManagerAsyncClient(
+        credentials=credentials.AnonymousCredentials(),
+    )
+
+    # Attempting to call a method with both a request object and flattened
+    # fields is an error.
+    with pytest.raises(ValueError):
+        await client.update_policy_tag(
+            policytagmanager.UpdatePolicyTagRequest(),
+            policy_tag=policytagmanager.PolicyTag(name="name_value"),
+        )
+
+
+def test_list_policy_tags(
+    transport: str = "grpc", request_type=policytagmanager.ListPolicyTagsRequest
+):
+    client = PolicyTagManagerClient(
+        credentials=credentials.AnonymousCredentials(), transport=transport,
+    )
+
+    # Everything is optional in proto3 as far as the runtime is concerned,
+    # and we are mocking out the actual API, so just send an empty request.
+    request = request_type()
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+    with mock.patch.object(
+        type(client._transport.list_policy_tags), "__call__"
+    ) as call:
+        # Designate an appropriate return value for the call.
+        call.return_value = policytagmanager.ListPolicyTagsResponse(
+            next_page_token="next_page_token_value",
+        )
+
+        response = client.list_policy_tags(request)
+
+        # Establish that the underlying gRPC stub method was called.
+        assert len(call.mock_calls) == 1
+        _, args, _ = call.mock_calls[0]
+
+        assert args[0] == policytagmanager.ListPolicyTagsRequest()
+
+    # Establish that the response is the type that we expect.
+    assert isinstance(response, pagers.ListPolicyTagsPager)
+
+    assert response.next_page_token == "next_page_token_value"
+
+
+def test_list_policy_tags_from_dict():
+    test_list_policy_tags(request_type=dict)
+
+
+@pytest.mark.asyncio
+async def test_list_policy_tags_async(transport: str = "grpc_asyncio"):
+    client = PolicyTagManagerAsyncClient(
+        credentials=credentials.AnonymousCredentials(), transport=transport,
+    )
+
+    # Everything is optional in proto3 as far as the runtime is concerned,
+    # and we are mocking out the actual API, so just send an empty request.
+    request = policytagmanager.ListPolicyTagsRequest()
+
+    # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object( + type(client._client._transport.list_policy_tags), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + policytagmanager.ListPolicyTagsResponse( + next_page_token="next_page_token_value", + ) + ) + + response = await client.list_policy_tags(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0] == request + + # Establish that the response is the type that we expect. + assert isinstance(response, pagers.ListPolicyTagsAsyncPager) + + assert response.next_page_token == "next_page_token_value" + + +def test_list_policy_tags_field_headers(): + client = PolicyTagManagerClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = policytagmanager.ListPolicyTagsRequest() + request.parent = "parent/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._transport.list_policy_tags), "__call__" + ) as call: + call.return_value = policytagmanager.ListPolicyTagsResponse() + + client.list_policy_tags(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "parent=parent/value",) in kw["metadata"] + + +@pytest.mark.asyncio +async def test_list_policy_tags_field_headers_async(): + client = PolicyTagManagerAsyncClient( + credentials=credentials.AnonymousCredentials(), + ) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. 
+ request = policytagmanager.ListPolicyTagsRequest() + request.parent = "parent/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.list_policy_tags), "__call__" + ) as call: + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + policytagmanager.ListPolicyTagsResponse() + ) + + await client.list_policy_tags(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "parent=parent/value",) in kw["metadata"] + + +def test_list_policy_tags_flattened(): + client = PolicyTagManagerClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._transport.list_policy_tags), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = policytagmanager.ListPolicyTagsResponse() + + # Call the method with a truthy value for each flattened field, + # using the keyword arguments to the method. + client.list_policy_tags(parent="parent_value",) + + # Establish that the underlying call was made with the expected + # request object values. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0].parent == "parent_value" + + +def test_list_policy_tags_flattened_error(): + client = PolicyTagManagerClient(credentials=credentials.AnonymousCredentials(),) + + # Attempting to call a method with both a request object and flattened + # fields is an error. 
+ with pytest.raises(ValueError):
+ client.list_policy_tags(
+ policytagmanager.ListPolicyTagsRequest(), parent="parent_value",
+ )
+
+
+@pytest.mark.asyncio
+async def test_list_policy_tags_flattened_async():
+ client = PolicyTagManagerAsyncClient(
+ credentials=credentials.AnonymousCredentials(),
+ )
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(
+ type(client._client._transport.list_policy_tags), "__call__"
+ ) as call:
+ # Designate an appropriate return value for the call.
+ call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(
+ policytagmanager.ListPolicyTagsResponse()
+ )
+ # Call the method with a truthy value for each flattened field,
+ # using the keyword arguments to the method.
+ response = await client.list_policy_tags(parent="parent_value",)
+
+ # Establish that the underlying call was made with the expected
+ # request object values.
+ assert len(call.mock_calls)
+ _, args, _ = call.mock_calls[0]
+
+ assert args[0].parent == "parent_value"
+
+
+@pytest.mark.asyncio
+async def test_list_policy_tags_flattened_error_async():
+ client = PolicyTagManagerAsyncClient(
+ credentials=credentials.AnonymousCredentials(),
+ )
+
+ # Attempting to call a method with both a request object and flattened
+ # fields is an error.
+ with pytest.raises(ValueError):
+ await client.list_policy_tags(
+ policytagmanager.ListPolicyTagsRequest(), parent="parent_value",
+ )
+
+
+def test_list_policy_tags_pager():
+ client = PolicyTagManagerClient(credentials=credentials.AnonymousCredentials(),)
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(
+ type(client._transport.list_policy_tags), "__call__"
+ ) as call:
+ # Set the response to a series of pages.
+ call.side_effect = (
+ policytagmanager.ListPolicyTagsResponse(
+ policy_tags=[
+ policytagmanager.PolicyTag(),
+ policytagmanager.PolicyTag(),
+ policytagmanager.PolicyTag(),
+ ],
+ next_page_token="abc",
+ ),
+ policytagmanager.ListPolicyTagsResponse(
+ policy_tags=[], next_page_token="def",
+ ),
+ policytagmanager.ListPolicyTagsResponse(
+ policy_tags=[policytagmanager.PolicyTag(),], next_page_token="ghi",
+ ),
+ policytagmanager.ListPolicyTagsResponse(
+ policy_tags=[
+ policytagmanager.PolicyTag(),
+ policytagmanager.PolicyTag(),
+ ],
+ ),
+ RuntimeError,
+ )
+
+ metadata = ()
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("parent", ""),)),
+ )
+ pager = client.list_policy_tags(request={})
+
+ assert pager._metadata == metadata
+
+ results = [i for i in pager]
+ assert len(results) == 6
+ assert all(isinstance(i, policytagmanager.PolicyTag) for i in results)
+
+
+def test_list_policy_tags_pages():
+ client = PolicyTagManagerClient(credentials=credentials.AnonymousCredentials(),)
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(
+ type(client._transport.list_policy_tags), "__call__"
+ ) as call:
+ # Set the response to a series of pages.
+ call.side_effect = (
+ policytagmanager.ListPolicyTagsResponse(
+ policy_tags=[
+ policytagmanager.PolicyTag(),
+ policytagmanager.PolicyTag(),
+ policytagmanager.PolicyTag(),
+ ],
+ next_page_token="abc",
+ ),
+ policytagmanager.ListPolicyTagsResponse(
+ policy_tags=[], next_page_token="def",
+ ),
+ policytagmanager.ListPolicyTagsResponse(
+ policy_tags=[policytagmanager.PolicyTag(),], next_page_token="ghi",
+ ),
+ policytagmanager.ListPolicyTagsResponse(
+ policy_tags=[
+ policytagmanager.PolicyTag(),
+ policytagmanager.PolicyTag(),
+ ],
+ ),
+ RuntimeError,
+ )
+ pages = list(client.list_policy_tags(request={}).pages)
+ for page, token in zip(pages, ["abc", "def", "ghi", ""]):
+ assert page.raw_page.next_page_token == token
+
+
+@pytest.mark.asyncio
+async def test_list_policy_tags_async_pager():
+ client = PolicyTagManagerAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(
+ type(client._client._transport.list_policy_tags),
+ "__call__",
+ new_callable=mock.AsyncMock,
+ ) as call:
+ # Set the response to a series of pages.
+ call.side_effect = (
+ policytagmanager.ListPolicyTagsResponse(
+ policy_tags=[
+ policytagmanager.PolicyTag(),
+ policytagmanager.PolicyTag(),
+ policytagmanager.PolicyTag(),
+ ],
+ next_page_token="abc",
+ ),
+ policytagmanager.ListPolicyTagsResponse(
+ policy_tags=[], next_page_token="def",
+ ),
+ policytagmanager.ListPolicyTagsResponse(
+ policy_tags=[policytagmanager.PolicyTag(),], next_page_token="ghi",
+ ),
+ policytagmanager.ListPolicyTagsResponse(
+ policy_tags=[
+ policytagmanager.PolicyTag(),
+ policytagmanager.PolicyTag(),
+ ],
+ ),
+ RuntimeError,
+ )
+ async_pager = await client.list_policy_tags(request={},)
+ assert async_pager.next_page_token == "abc"
+ responses = []
+ async for response in async_pager:
+ responses.append(response)
+
+ assert len(responses) == 6
+ assert all(isinstance(i, policytagmanager.PolicyTag) for i in responses)
+
+
+@pytest.mark.asyncio
+async def test_list_policy_tags_async_pages():
+ client = PolicyTagManagerAsyncClient(credentials=credentials.AnonymousCredentials(),)
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(
+ type(client._client._transport.list_policy_tags),
+ "__call__",
+ new_callable=mock.AsyncMock,
+ ) as call:
+ # Set the response to a series of pages.
+ call.side_effect = ( + policytagmanager.ListPolicyTagsResponse( + policy_tags=[ + policytagmanager.PolicyTag(), + policytagmanager.PolicyTag(), + policytagmanager.PolicyTag(), + ], + next_page_token="abc", + ), + policytagmanager.ListPolicyTagsResponse( + policy_tags=[], next_page_token="def", + ), + policytagmanager.ListPolicyTagsResponse( + policy_tags=[policytagmanager.PolicyTag(),], next_page_token="ghi", + ), + policytagmanager.ListPolicyTagsResponse( + policy_tags=[ + policytagmanager.PolicyTag(), + policytagmanager.PolicyTag(), + ], + ), + RuntimeError, + ) + pages = [] + async for page in (await client.list_policy_tags(request={})).pages: + pages.append(page) + for page, token in zip(pages, ["abc", "def", "ghi", ""]): + assert page.raw_page.next_page_token == token + + +def test_get_policy_tag( + transport: str = "grpc", request_type=policytagmanager.GetPolicyTagRequest +): + client = PolicyTagManagerClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = request_type() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.get_policy_tag), "__call__") as call: + # Designate an appropriate return value for the call. + call.return_value = policytagmanager.PolicyTag( + name="name_value", + display_name="display_name_value", + description="description_value", + parent_policy_tag="parent_policy_tag_value", + child_policy_tags=["child_policy_tags_value"], + ) + + response = client.get_policy_tag(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0] == policytagmanager.GetPolicyTagRequest() + + # Establish that the response is the type that we expect. 
+ assert isinstance(response, policytagmanager.PolicyTag) + + assert response.name == "name_value" + + assert response.display_name == "display_name_value" + + assert response.description == "description_value" + + assert response.parent_policy_tag == "parent_policy_tag_value" + + assert response.child_policy_tags == ["child_policy_tags_value"] + + +def test_get_policy_tag_from_dict(): + test_get_policy_tag(request_type=dict) + + +@pytest.mark.asyncio +async def test_get_policy_tag_async(transport: str = "grpc_asyncio"): + client = PolicyTagManagerAsyncClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = policytagmanager.GetPolicyTagRequest() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.get_policy_tag), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + policytagmanager.PolicyTag( + name="name_value", + display_name="display_name_value", + description="description_value", + parent_policy_tag="parent_policy_tag_value", + child_policy_tags=["child_policy_tags_value"], + ) + ) + + response = await client.get_policy_tag(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0] == request + + # Establish that the response is the type that we expect. 
+ assert isinstance(response, policytagmanager.PolicyTag) + + assert response.name == "name_value" + + assert response.display_name == "display_name_value" + + assert response.description == "description_value" + + assert response.parent_policy_tag == "parent_policy_tag_value" + + assert response.child_policy_tags == ["child_policy_tags_value"] + + +def test_get_policy_tag_field_headers(): + client = PolicyTagManagerClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = policytagmanager.GetPolicyTagRequest() + request.name = "name/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.get_policy_tag), "__call__") as call: + call.return_value = policytagmanager.PolicyTag() + + client.get_policy_tag(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "name=name/value",) in kw["metadata"] + + +@pytest.mark.asyncio +async def test_get_policy_tag_field_headers_async(): + client = PolicyTagManagerAsyncClient( + credentials=credentials.AnonymousCredentials(), + ) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = policytagmanager.GetPolicyTagRequest() + request.name = "name/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.get_policy_tag), "__call__" + ) as call: + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + policytagmanager.PolicyTag() + ) + + await client.get_policy_tag(request) + + # Establish that the underlying gRPC stub method was called. 
+ assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "name=name/value",) in kw["metadata"] + + +def test_get_policy_tag_flattened(): + client = PolicyTagManagerClient(credentials=credentials.AnonymousCredentials(),) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.get_policy_tag), "__call__") as call: + # Designate an appropriate return value for the call. + call.return_value = policytagmanager.PolicyTag() + + # Call the method with a truthy value for each flattened field, + # using the keyword arguments to the method. + client.get_policy_tag(name="name_value",) + + # Establish that the underlying call was made with the expected + # request object values. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0].name == "name_value" + + +def test_get_policy_tag_flattened_error(): + client = PolicyTagManagerClient(credentials=credentials.AnonymousCredentials(),) + + # Attempting to call a method with both a request object and flattened + # fields is an error. + with pytest.raises(ValueError): + client.get_policy_tag( + policytagmanager.GetPolicyTagRequest(), name="name_value", + ) + + +@pytest.mark.asyncio +async def test_get_policy_tag_flattened_async(): + client = PolicyTagManagerAsyncClient( + credentials=credentials.AnonymousCredentials(), + ) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.get_policy_tag), "__call__" + ) as call: + # Designate an appropriate return value for the call. 
+ call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(
+ policytagmanager.PolicyTag()
+ )
+ # Call the method with a truthy value for each flattened field,
+ # using the keyword arguments to the method.
+ response = await client.get_policy_tag(name="name_value",)
+
+ # Establish that the underlying call was made with the expected
+ # request object values.
+ assert len(call.mock_calls)
+ _, args, _ = call.mock_calls[0]
+
+ assert args[0].name == "name_value"
+
+
+@pytest.mark.asyncio
+async def test_get_policy_tag_flattened_error_async():
+ client = PolicyTagManagerAsyncClient(
+ credentials=credentials.AnonymousCredentials(),
+ )
+
+ # Attempting to call a method with both a request object and flattened
+ # fields is an error.
+ with pytest.raises(ValueError):
+ await client.get_policy_tag(
+ policytagmanager.GetPolicyTagRequest(), name="name_value",
+ )
+
+
+def test_get_iam_policy(
+ transport: str = "grpc", request_type=iam_policy.GetIamPolicyRequest
+):
+ client = PolicyTagManagerClient(
+ credentials=credentials.AnonymousCredentials(), transport=transport,
+ )
+
+ # Everything is optional in proto3 as far as the runtime is concerned,
+ # and we are mocking out the actual API, so just send an empty request.
+ request = request_type()
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(type(client._transport.get_iam_policy), "__call__") as call:
+ # Designate an appropriate return value for the call.
+ call.return_value = policy.Policy(version=774, etag=b"etag_blob",)
+
+ response = client.get_iam_policy(request)
+
+ # Establish that the underlying gRPC stub method was called.
+ assert len(call.mock_calls) == 1
+ _, args, _ = call.mock_calls[0]
+
+ assert args[0] == iam_policy.GetIamPolicyRequest()
+
+ # Establish that the response is the type that we expect.
+ assert isinstance(response, policy.Policy) + + assert response.version == 774 + + assert response.etag == b"etag_blob" + + +def test_get_iam_policy_from_dict(): + test_get_iam_policy(request_type=dict) + + +@pytest.mark.asyncio +async def test_get_iam_policy_async(transport: str = "grpc_asyncio"): + client = PolicyTagManagerAsyncClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = iam_policy.GetIamPolicyRequest() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.get_iam_policy), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + policy.Policy(version=774, etag=b"etag_blob",) + ) + + response = await client.get_iam_policy(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0] == request + + # Establish that the response is the type that we expect. + assert isinstance(response, policy.Policy) + + assert response.version == 774 + + assert response.etag == b"etag_blob" + + +def test_get_iam_policy_field_headers(): + client = PolicyTagManagerClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = iam_policy.GetIamPolicyRequest() + request.resource = "resource/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.get_iam_policy), "__call__") as call: + call.return_value = policy.Policy() + + client.get_iam_policy(request) + + # Establish that the underlying gRPC stub method was called. 
+ assert len(call.mock_calls) == 1
+ _, args, _ = call.mock_calls[0]
+ assert args[0] == request
+
+ # Establish that the field header was sent.
+ _, _, kw = call.mock_calls[0]
+ assert ("x-goog-request-params", "resource=resource/value",) in kw["metadata"]
+
+
+@pytest.mark.asyncio
+async def test_get_iam_policy_field_headers_async():
+ client = PolicyTagManagerAsyncClient(
+ credentials=credentials.AnonymousCredentials(),
+ )
+
+ # Any value that is part of the HTTP/1.1 URI should be sent as
+ # a field header. Set these to a non-empty value.
+ request = iam_policy.GetIamPolicyRequest()
+ request.resource = "resource/value"
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(
+ type(client._client._transport.get_iam_policy), "__call__"
+ ) as call:
+ call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(policy.Policy())
+
+ await client.get_iam_policy(request)
+
+ # Establish that the underlying gRPC stub method was called.
+ assert len(call.mock_calls)
+ _, args, _ = call.mock_calls[0]
+ assert args[0] == request
+
+ # Establish that the field header was sent.
+ _, _, kw = call.mock_calls[0]
+ assert ("x-goog-request-params", "resource=resource/value",) in kw["metadata"]
+
+
+def test_get_iam_policy_from_dict_foreign():
+ client = PolicyTagManagerClient(credentials=credentials.AnonymousCredentials(),)
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(type(client._transport.get_iam_policy), "__call__") as call:
+ # Designate an appropriate return value for the call.
+ call.return_value = policy.Policy() + + response = client.get_iam_policy( + request={ + "resource": "resource_value", + "options": options.GetPolicyOptions(requested_policy_version=2598), + } + ) + call.assert_called() + + +def test_set_iam_policy( + transport: str = "grpc", request_type=iam_policy.SetIamPolicyRequest +): + client = PolicyTagManagerClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = request_type() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.set_iam_policy), "__call__") as call: + # Designate an appropriate return value for the call. + call.return_value = policy.Policy(version=774, etag=b"etag_blob",) + + response = client.set_iam_policy(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0] == iam_policy.SetIamPolicyRequest() + + # Establish that the response is the type that we expect. + assert isinstance(response, policy.Policy) + + assert response.version == 774 + + assert response.etag == b"etag_blob" + + +def test_set_iam_policy_from_dict(): + test_set_iam_policy(request_type=dict) + + +@pytest.mark.asyncio +async def test_set_iam_policy_async(transport: str = "grpc_asyncio"): + client = PolicyTagManagerAsyncClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = iam_policy.SetIamPolicyRequest() + + # Mock the actual call within the gRPC stub, and fake the request. 
+ with mock.patch.object( + type(client._client._transport.set_iam_policy), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + policy.Policy(version=774, etag=b"etag_blob",) + ) + + response = await client.set_iam_policy(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0] == request + + # Establish that the response is the type that we expect. + assert isinstance(response, policy.Policy) + + assert response.version == 774 + + assert response.etag == b"etag_blob" + + +def test_set_iam_policy_field_headers(): + client = PolicyTagManagerClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = iam_policy.SetIamPolicyRequest() + request.resource = "resource/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client._transport.set_iam_policy), "__call__") as call: + call.return_value = policy.Policy() + + client.set_iam_policy(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "resource=resource/value",) in kw["metadata"] + + +@pytest.mark.asyncio +async def test_set_iam_policy_field_headers_async(): + client = PolicyTagManagerAsyncClient( + credentials=credentials.AnonymousCredentials(), + ) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = iam_policy.SetIamPolicyRequest() + request.resource = "resource/value" + + # Mock the actual call within the gRPC stub, and fake the request. 
+ with mock.patch.object(
+ type(client._client._transport.set_iam_policy), "__call__"
+ ) as call:
+ call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(policy.Policy())
+
+ await client.set_iam_policy(request)
+
+ # Establish that the underlying gRPC stub method was called.
+ assert len(call.mock_calls)
+ _, args, _ = call.mock_calls[0]
+ assert args[0] == request
+
+ # Establish that the field header was sent.
+ _, _, kw = call.mock_calls[0]
+ assert ("x-goog-request-params", "resource=resource/value",) in kw["metadata"]
+
+
+def test_set_iam_policy_from_dict_foreign():
+ client = PolicyTagManagerClient(credentials=credentials.AnonymousCredentials(),)
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(type(client._transport.set_iam_policy), "__call__") as call:
+ # Designate an appropriate return value for the call.
+ call.return_value = policy.Policy()
+
+ response = client.set_iam_policy(
+ request={
+ "resource": "resource_value",
+ "policy": policy.Policy(version=774),
+ }
+ )
+ call.assert_called()
+
+
+def test_test_iam_permissions(
+ transport: str = "grpc", request_type=iam_policy.TestIamPermissionsRequest
+):
+ client = PolicyTagManagerClient(
+ credentials=credentials.AnonymousCredentials(), transport=transport,
+ )
+
+ # Everything is optional in proto3 as far as the runtime is concerned,
+ # and we are mocking out the actual API, so just send an empty request.
+ request = request_type()
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(
+ type(client._transport.test_iam_permissions), "__call__"
+ ) as call:
+ # Designate an appropriate return value for the call.
+ call.return_value = iam_policy.TestIamPermissionsResponse(
+ permissions=["permissions_value"],
+ )
+
+ response = client.test_iam_permissions(request)
+
+ # Establish that the underlying gRPC stub method was called.
+ assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0] == iam_policy.TestIamPermissionsRequest() + + # Establish that the response is the type that we expect. + assert isinstance(response, iam_policy.TestIamPermissionsResponse) + + assert response.permissions == ["permissions_value"] + + +def test_test_iam_permissions_from_dict(): + test_test_iam_permissions(request_type=dict) + + +@pytest.mark.asyncio +async def test_test_iam_permissions_async(transport: str = "grpc_asyncio"): + client = PolicyTagManagerAsyncClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = iam_policy.TestIamPermissionsRequest() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.test_iam_permissions), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + iam_policy.TestIamPermissionsResponse(permissions=["permissions_value"],) + ) + + response = await client.test_iam_permissions(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0] == request + + # Establish that the response is the type that we expect. + assert isinstance(response, iam_policy.TestIamPermissionsResponse) + + assert response.permissions == ["permissions_value"] + + +def test_test_iam_permissions_field_headers(): + client = PolicyTagManagerClient(credentials=credentials.AnonymousCredentials(),) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. 
+ request = iam_policy.TestIamPermissionsRequest()
+ request.resource = "resource/value"
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(
+ type(client._transport.test_iam_permissions), "__call__"
+ ) as call:
+ call.return_value = iam_policy.TestIamPermissionsResponse()
+
+ client.test_iam_permissions(request)
+
+ # Establish that the underlying gRPC stub method was called.
+ assert len(call.mock_calls) == 1
+ _, args, _ = call.mock_calls[0]
+ assert args[0] == request
+
+ # Establish that the field header was sent.
+ _, _, kw = call.mock_calls[0]
+ assert ("x-goog-request-params", "resource=resource/value",) in kw["metadata"]
+
+
+@pytest.mark.asyncio
+async def test_test_iam_permissions_field_headers_async():
+ client = PolicyTagManagerAsyncClient(
+ credentials=credentials.AnonymousCredentials(),
+ )
+
+ # Any value that is part of the HTTP/1.1 URI should be sent as
+ # a field header. Set these to a non-empty value.
+ request = iam_policy.TestIamPermissionsRequest()
+ request.resource = "resource/value"
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(
+ type(client._client._transport.test_iam_permissions), "__call__"
+ ) as call:
+ call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(
+ iam_policy.TestIamPermissionsResponse()
+ )
+
+ await client.test_iam_permissions(request)
+
+ # Establish that the underlying gRPC stub method was called.
+ assert len(call.mock_calls)
+ _, args, _ = call.mock_calls[0]
+ assert args[0] == request
+
+ # Establish that the field header was sent.
+ _, _, kw = call.mock_calls[0]
+ assert ("x-goog-request-params", "resource=resource/value",) in kw["metadata"]
+
+
+def test_test_iam_permissions_from_dict_foreign():
+ client = PolicyTagManagerClient(credentials=credentials.AnonymousCredentials(),)
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object( + type(client._transport.test_iam_permissions), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = iam_policy.TestIamPermissionsResponse() + + response = client.test_iam_permissions( + request={ + "resource": "resource_value", + "permissions": ["permissions_value"], + } + ) + call.assert_called() + + +def test_credentials_transport_error(): + # It is an error to provide credentials and a transport instance. + transport = transports.PolicyTagManagerGrpcTransport( + credentials=credentials.AnonymousCredentials(), + ) + with pytest.raises(ValueError): + client = PolicyTagManagerClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # It is an error to provide a credentials file and a transport instance. + transport = transports.PolicyTagManagerGrpcTransport( + credentials=credentials.AnonymousCredentials(), + ) + with pytest.raises(ValueError): + client = PolicyTagManagerClient( + client_options={"credentials_file": "credentials.json"}, + transport=transport, + ) + + # It is an error to provide scopes and a transport instance. + transport = transports.PolicyTagManagerGrpcTransport( + credentials=credentials.AnonymousCredentials(), + ) + with pytest.raises(ValueError): + client = PolicyTagManagerClient( + client_options={"scopes": ["1", "2"]}, transport=transport, + ) + + +def test_transport_instance(): + # A client may be instantiated with a custom transport instance. + transport = transports.PolicyTagManagerGrpcTransport( + credentials=credentials.AnonymousCredentials(), + ) + client = PolicyTagManagerClient(transport=transport) + assert client._transport is transport + + +def test_transport_get_channel(): + # A client may be instantiated with a custom transport instance. 
+ transport = transports.PolicyTagManagerGrpcTransport( + credentials=credentials.AnonymousCredentials(), + ) + channel = transport.grpc_channel + assert channel + + transport = transports.PolicyTagManagerGrpcAsyncIOTransport( + credentials=credentials.AnonymousCredentials(), + ) + channel = transport.grpc_channel + assert channel + + +def test_transport_grpc_default(): + # A client should use the gRPC transport by default. + client = PolicyTagManagerClient(credentials=credentials.AnonymousCredentials(),) + assert isinstance(client._transport, transports.PolicyTagManagerGrpcTransport,) + + +def test_policy_tag_manager_base_transport_error(): + # Passing both a credentials object and credentials_file should raise an error + with pytest.raises(exceptions.DuplicateCredentialArgs): + transport = transports.PolicyTagManagerTransport( + credentials=credentials.AnonymousCredentials(), + credentials_file="credentials.json", + ) + + +def test_policy_tag_manager_base_transport(): + # Instantiate the base transport. + with mock.patch( + "google.cloud.datacatalog_v1beta1.services.policy_tag_manager.transports.PolicyTagManagerTransport.__init__" + ) as Transport: + Transport.return_value = None + transport = transports.PolicyTagManagerTransport( + credentials=credentials.AnonymousCredentials(), + ) + + # Every method on the transport should just blindly + # raise NotImplementedError. 
+ methods = ( + "create_taxonomy", + "delete_taxonomy", + "update_taxonomy", + "list_taxonomies", + "get_taxonomy", + "create_policy_tag", + "delete_policy_tag", + "update_policy_tag", + "list_policy_tags", + "get_policy_tag", + "get_iam_policy", + "set_iam_policy", + "test_iam_permissions", + ) + for method in methods: + with pytest.raises(NotImplementedError): + getattr(transport, method)(request=object()) + + +def test_policy_tag_manager_base_transport_with_credentials_file(): + # Instantiate the base transport with a credentials file + with mock.patch.object( + auth, "load_credentials_from_file" + ) as load_creds, mock.patch( + "google.cloud.datacatalog_v1beta1.services.policy_tag_manager.transports.PolicyTagManagerTransport._prep_wrapped_messages" + ) as Transport: + Transport.return_value = None + load_creds.return_value = (credentials.AnonymousCredentials(), None) + transport = transports.PolicyTagManagerTransport( + credentials_file="credentials.json", quota_project_id="octopus", + ) + load_creds.assert_called_once_with( + "credentials.json", + scopes=("https://www.googleapis.com/auth/cloud-platform",), + quota_project_id="octopus", + ) + + +def test_policy_tag_manager_auth_adc(): + # If no credentials are provided, we should use ADC credentials. + with mock.patch.object(auth, "default") as adc: + adc.return_value = (credentials.AnonymousCredentials(), None) + PolicyTagManagerClient() + adc.assert_called_once_with( + scopes=("https://www.googleapis.com/auth/cloud-platform",), + quota_project_id=None, + ) + + +def test_policy_tag_manager_transport_auth_adc(): + # If credentials and host are not provided, the transport class should use + # ADC credentials. 
+ with mock.patch.object(auth, "default") as adc: + adc.return_value = (credentials.AnonymousCredentials(), None) + transports.PolicyTagManagerGrpcTransport( + host="squid.clam.whelk", quota_project_id="octopus" + ) + adc.assert_called_once_with( + scopes=("https://www.googleapis.com/auth/cloud-platform",), + quota_project_id="octopus", + ) + + +def test_policy_tag_manager_host_no_port(): + client = PolicyTagManagerClient( + credentials=credentials.AnonymousCredentials(), + client_options=client_options.ClientOptions( + api_endpoint="datacatalog.googleapis.com" + ), + ) + assert client._transport._host == "datacatalog.googleapis.com:443" + + +def test_policy_tag_manager_host_with_port(): + client = PolicyTagManagerClient( + credentials=credentials.AnonymousCredentials(), + client_options=client_options.ClientOptions( + api_endpoint="datacatalog.googleapis.com:8000" + ), + ) + assert client._transport._host == "datacatalog.googleapis.com:8000" + + +def test_policy_tag_manager_grpc_transport_channel(): + channel = grpc.insecure_channel("http://localhost/") + + # Check that if channel is provided, mtls endpoint and client_cert_source + # won't be used. + callback = mock.MagicMock() + transport = transports.PolicyTagManagerGrpcTransport( + host="squid.clam.whelk", + channel=channel, + api_mtls_endpoint="mtls.squid.clam.whelk", + client_cert_source=callback, + ) + assert transport.grpc_channel == channel + assert transport._host == "squid.clam.whelk:443" + assert not callback.called + + +def test_policy_tag_manager_grpc_asyncio_transport_channel(): + channel = aio.insecure_channel("http://localhost/") + + # Check that if channel is provided, mtls endpoint and client_cert_source + # won't be used. 
+ callback = mock.MagicMock() + transport = transports.PolicyTagManagerGrpcAsyncIOTransport( + host="squid.clam.whelk", + channel=channel, + api_mtls_endpoint="mtls.squid.clam.whelk", + client_cert_source=callback, + ) + assert transport.grpc_channel == channel + assert transport._host == "squid.clam.whelk:443" + assert not callback.called + + +@mock.patch("grpc.ssl_channel_credentials", autospec=True) +@mock.patch("google.api_core.grpc_helpers.create_channel", autospec=True) +def test_policy_tag_manager_grpc_transport_channel_mtls_with_client_cert_source( + grpc_create_channel, grpc_ssl_channel_cred +): + # Check that if channel is None, but api_mtls_endpoint and client_cert_source + # are provided, then a mTLS channel will be created. + mock_cred = mock.Mock() + + mock_ssl_cred = mock.Mock() + grpc_ssl_channel_cred.return_value = mock_ssl_cred + + mock_grpc_channel = mock.Mock() + grpc_create_channel.return_value = mock_grpc_channel + + transport = transports.PolicyTagManagerGrpcTransport( + host="squid.clam.whelk", + credentials=mock_cred, + api_mtls_endpoint="mtls.squid.clam.whelk", + client_cert_source=client_cert_source_callback, + ) + grpc_ssl_channel_cred.assert_called_once_with( + certificate_chain=b"cert bytes", private_key=b"key bytes" + ) + grpc_create_channel.assert_called_once_with( + "mtls.squid.clam.whelk:443", + credentials=mock_cred, + credentials_file=None, + scopes=("https://www.googleapis.com/auth/cloud-platform",), + ssl_credentials=mock_ssl_cred, + quota_project_id=None, + ) + assert transport.grpc_channel == mock_grpc_channel + + +@mock.patch("grpc.ssl_channel_credentials", autospec=True) +@mock.patch("google.api_core.grpc_helpers_async.create_channel", autospec=True) +def test_policy_tag_manager_grpc_asyncio_transport_channel_mtls_with_client_cert_source( + grpc_create_channel, grpc_ssl_channel_cred +): + # Check that if channel is None, but api_mtls_endpoint and client_cert_source + # are provided, then a mTLS channel will be created. 
+ mock_cred = mock.Mock() + + mock_ssl_cred = mock.Mock() + grpc_ssl_channel_cred.return_value = mock_ssl_cred + + mock_grpc_channel = mock.Mock() + grpc_create_channel.return_value = mock_grpc_channel + + transport = transports.PolicyTagManagerGrpcAsyncIOTransport( + host="squid.clam.whelk", + credentials=mock_cred, + api_mtls_endpoint="mtls.squid.clam.whelk", + client_cert_source=client_cert_source_callback, + ) + grpc_ssl_channel_cred.assert_called_once_with( + certificate_chain=b"cert bytes", private_key=b"key bytes" + ) + grpc_create_channel.assert_called_once_with( + "mtls.squid.clam.whelk:443", + credentials=mock_cred, + credentials_file=None, + scopes=("https://www.googleapis.com/auth/cloud-platform",), + ssl_credentials=mock_ssl_cred, + quota_project_id=None, + ) + assert transport.grpc_channel == mock_grpc_channel + + +@pytest.mark.parametrize( + "api_mtls_endpoint", ["mtls.squid.clam.whelk", "mtls.squid.clam.whelk:443"] +) +@mock.patch("google.api_core.grpc_helpers.create_channel", autospec=True) +def test_policy_tag_manager_grpc_transport_channel_mtls_with_adc( + grpc_create_channel, api_mtls_endpoint +): + # Check that if channel and client_cert_source are None, but api_mtls_endpoint + # is provided, then a mTLS channel will be created with SSL ADC. + mock_grpc_channel = mock.Mock() + grpc_create_channel.return_value = mock_grpc_channel + + # Mock google.auth.transport.grpc.SslCredentials class. 
+ mock_ssl_cred = mock.Mock() + with mock.patch.multiple( + "google.auth.transport.grpc.SslCredentials", + __init__=mock.Mock(return_value=None), + ssl_credentials=mock.PropertyMock(return_value=mock_ssl_cred), + ): + mock_cred = mock.Mock() + transport = transports.PolicyTagManagerGrpcTransport( + host="squid.clam.whelk", + credentials=mock_cred, + api_mtls_endpoint=api_mtls_endpoint, + client_cert_source=None, + ) + grpc_create_channel.assert_called_once_with( + "mtls.squid.clam.whelk:443", + credentials=mock_cred, + credentials_file=None, + scopes=("https://www.googleapis.com/auth/cloud-platform",), + ssl_credentials=mock_ssl_cred, + quota_project_id=None, + ) + assert transport.grpc_channel == mock_grpc_channel + + +@pytest.mark.parametrize( + "api_mtls_endpoint", ["mtls.squid.clam.whelk", "mtls.squid.clam.whelk:443"] +) +@mock.patch("google.api_core.grpc_helpers_async.create_channel", autospec=True) +def test_policy_tag_manager_grpc_asyncio_transport_channel_mtls_with_adc( + grpc_create_channel, api_mtls_endpoint +): + # Check that if channel and client_cert_source are None, but api_mtls_endpoint + # is provided, then a mTLS channel will be created with SSL ADC. + mock_grpc_channel = mock.Mock() + grpc_create_channel.return_value = mock_grpc_channel + + # Mock google.auth.transport.grpc.SslCredentials class. 
+ mock_ssl_cred = mock.Mock() + with mock.patch.multiple( + "google.auth.transport.grpc.SslCredentials", + __init__=mock.Mock(return_value=None), + ssl_credentials=mock.PropertyMock(return_value=mock_ssl_cred), + ): + mock_cred = mock.Mock() + transport = transports.PolicyTagManagerGrpcAsyncIOTransport( + host="squid.clam.whelk", + credentials=mock_cred, + api_mtls_endpoint=api_mtls_endpoint, + client_cert_source=None, + ) + grpc_create_channel.assert_called_once_with( + "mtls.squid.clam.whelk:443", + credentials=mock_cred, + credentials_file=None, + scopes=("https://www.googleapis.com/auth/cloud-platform",), + ssl_credentials=mock_ssl_cred, + quota_project_id=None, + ) + assert transport.grpc_channel == mock_grpc_channel + + +def test_taxonomy_path(): + project = "squid" + location = "clam" + taxonomy = "whelk" + + expected = "projects/{project}/locations/{location}/taxonomies/{taxonomy}".format( + project=project, location=location, taxonomy=taxonomy, + ) + actual = PolicyTagManagerClient.taxonomy_path(project, location, taxonomy) + assert expected == actual + + +def test_parse_taxonomy_path(): + expected = { + "project": "octopus", + "location": "oyster", + "taxonomy": "nudibranch", + } + path = PolicyTagManagerClient.taxonomy_path(**expected) + + # Check that the path construction is reversible. 
+ actual = PolicyTagManagerClient.parse_taxonomy_path(path) + assert expected == actual + + +def test_policy_tag_path(): + project = "squid" + location = "clam" + taxonomy = "whelk" + policy_tag = "octopus" + + expected = "projects/{project}/locations/{location}/taxonomies/{taxonomy}/policyTags/{policy_tag}".format( + project=project, location=location, taxonomy=taxonomy, policy_tag=policy_tag, + ) + actual = PolicyTagManagerClient.policy_tag_path( + project, location, taxonomy, policy_tag + ) + assert expected == actual + + +def test_parse_policy_tag_path(): + expected = { + "project": "oyster", + "location": "nudibranch", + "taxonomy": "cuttlefish", + "policy_tag": "mussel", + } + path = PolicyTagManagerClient.policy_tag_path(**expected) + + # Check that the path construction is reversible. + actual = PolicyTagManagerClient.parse_policy_tag_path(path) + assert expected == actual diff --git a/tests/unit/gapic/datacatalog_v1beta1/test_policy_tag_manager_serialization.py b/tests/unit/gapic/datacatalog_v1beta1/test_policy_tag_manager_serialization.py new file mode 100644 index 00000000..9676d368 --- /dev/null +++ b/tests/unit/gapic/datacatalog_v1beta1/test_policy_tag_manager_serialization.py @@ -0,0 +1,967 @@ +# -*- coding: utf-8 -*- + +# Copyright 2020 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
+# + +import os +import mock + +import grpc +from grpc.experimental import aio +import math +import pytest +from proto.marshal.rules.dates import DurationRule, TimestampRule + +from google import auth +from google.api_core import client_options +from google.api_core import exceptions +from google.api_core import grpc_helpers +from google.api_core import grpc_helpers_async +from google.auth import credentials +from google.auth.exceptions import MutualTLSChannelError +from google.cloud.datacatalog_v1beta1.services.policy_tag_manager_serialization import ( + PolicyTagManagerSerializationAsyncClient, +) +from google.cloud.datacatalog_v1beta1.services.policy_tag_manager_serialization import ( + PolicyTagManagerSerializationClient, +) +from google.cloud.datacatalog_v1beta1.services.policy_tag_manager_serialization import ( + transports, +) +from google.cloud.datacatalog_v1beta1.types import policytagmanager +from google.cloud.datacatalog_v1beta1.types import policytagmanagerserialization +from google.oauth2 import service_account + + +def client_cert_source_callback(): + return b"cert bytes", b"key bytes" + + +# If default endpoint is localhost, then default mtls endpoint will be the same. +# This method modifies the default endpoint so the client can produce a different +# mtls endpoint for endpoint testing purposes. 
+def modify_default_endpoint(client): + return ( + "foo.googleapis.com" + if ("localhost" in client.DEFAULT_ENDPOINT) + else client.DEFAULT_ENDPOINT + ) + + +def test__get_default_mtls_endpoint(): + api_endpoint = "example.googleapis.com" + api_mtls_endpoint = "example.mtls.googleapis.com" + sandbox_endpoint = "example.sandbox.googleapis.com" + sandbox_mtls_endpoint = "example.mtls.sandbox.googleapis.com" + non_googleapi = "api.example.com" + + assert PolicyTagManagerSerializationClient._get_default_mtls_endpoint(None) is None + assert ( + PolicyTagManagerSerializationClient._get_default_mtls_endpoint(api_endpoint) + == api_mtls_endpoint + ) + assert ( + PolicyTagManagerSerializationClient._get_default_mtls_endpoint( + api_mtls_endpoint + ) + == api_mtls_endpoint + ) + assert ( + PolicyTagManagerSerializationClient._get_default_mtls_endpoint(sandbox_endpoint) + == sandbox_mtls_endpoint + ) + assert ( + PolicyTagManagerSerializationClient._get_default_mtls_endpoint( + sandbox_mtls_endpoint + ) + == sandbox_mtls_endpoint + ) + assert ( + PolicyTagManagerSerializationClient._get_default_mtls_endpoint(non_googleapi) + == non_googleapi + ) + + +@pytest.mark.parametrize( + "client_class", + [PolicyTagManagerSerializationClient, PolicyTagManagerSerializationAsyncClient], +) +def test_policy_tag_manager_serialization_client_from_service_account_file( + client_class, +): + creds = credentials.AnonymousCredentials() + with mock.patch.object( + service_account.Credentials, "from_service_account_file" + ) as factory: + factory.return_value = creds + client = client_class.from_service_account_file("dummy/file/path.json") + assert client._transport._credentials == creds + + client = client_class.from_service_account_json("dummy/file/path.json") + assert client._transport._credentials == creds + + assert client._transport._host == "datacatalog.googleapis.com:443" + + +def test_policy_tag_manager_serialization_client_get_transport_class(): + transport = 
PolicyTagManagerSerializationClient.get_transport_class() + assert transport == transports.PolicyTagManagerSerializationGrpcTransport + + transport = PolicyTagManagerSerializationClient.get_transport_class("grpc") + assert transport == transports.PolicyTagManagerSerializationGrpcTransport + + +@pytest.mark.parametrize( + "client_class,transport_class,transport_name", + [ + ( + PolicyTagManagerSerializationClient, + transports.PolicyTagManagerSerializationGrpcTransport, + "grpc", + ), + ( + PolicyTagManagerSerializationAsyncClient, + transports.PolicyTagManagerSerializationGrpcAsyncIOTransport, + "grpc_asyncio", + ), + ], +) +@mock.patch.object( + PolicyTagManagerSerializationClient, + "DEFAULT_ENDPOINT", + modify_default_endpoint(PolicyTagManagerSerializationClient), +) +@mock.patch.object( + PolicyTagManagerSerializationAsyncClient, + "DEFAULT_ENDPOINT", + modify_default_endpoint(PolicyTagManagerSerializationAsyncClient), +) +def test_policy_tag_manager_serialization_client_client_options( + client_class, transport_class, transport_name +): + # Check that if channel is provided we won't create a new one. + with mock.patch.object( + PolicyTagManagerSerializationClient, "get_transport_class" + ) as gtc: + transport = transport_class(credentials=credentials.AnonymousCredentials()) + client = client_class(transport=transport) + gtc.assert_not_called() + + # Check that if channel is provided via str we will create a new one. + with mock.patch.object( + PolicyTagManagerSerializationClient, "get_transport_class" + ) as gtc: + client = client_class(transport=transport_name) + gtc.assert_called() + + # Check the case api_endpoint is provided. 
+ options = client_options.ClientOptions(api_endpoint="squid.clam.whelk") + with mock.patch.object(transport_class, "__init__") as patched: + patched.return_value = None + client = client_class(client_options=options) + patched.assert_called_once_with( + credentials=None, + credentials_file=None, + host="squid.clam.whelk", + scopes=None, + api_mtls_endpoint="squid.clam.whelk", + client_cert_source=None, + quota_project_id=None, + ) + + # Check the case api_endpoint is not provided and GOOGLE_API_USE_MTLS is + # "never". + with mock.patch.dict(os.environ, {"GOOGLE_API_USE_MTLS": "never"}): + with mock.patch.object(transport_class, "__init__") as patched: + patched.return_value = None + client = client_class() + patched.assert_called_once_with( + credentials=None, + credentials_file=None, + host=client.DEFAULT_ENDPOINT, + scopes=None, + api_mtls_endpoint=client.DEFAULT_ENDPOINT, + client_cert_source=None, + quota_project_id=None, + ) + + # Check the case api_endpoint is not provided and GOOGLE_API_USE_MTLS is + # "always". + with mock.patch.dict(os.environ, {"GOOGLE_API_USE_MTLS": "always"}): + with mock.patch.object(transport_class, "__init__") as patched: + patched.return_value = None + client = client_class() + patched.assert_called_once_with( + credentials=None, + credentials_file=None, + host=client.DEFAULT_MTLS_ENDPOINT, + scopes=None, + api_mtls_endpoint=client.DEFAULT_MTLS_ENDPOINT, + client_cert_source=None, + quota_project_id=None, + ) + + # Check the case api_endpoint is not provided, GOOGLE_API_USE_MTLS is + # "auto", and client_cert_source is provided. 
+ with mock.patch.dict(os.environ, {"GOOGLE_API_USE_MTLS": "auto"}): + options = client_options.ClientOptions( + client_cert_source=client_cert_source_callback + ) + with mock.patch.object(transport_class, "__init__") as patched: + patched.return_value = None + client = client_class(client_options=options) + patched.assert_called_once_with( + credentials=None, + credentials_file=None, + host=client.DEFAULT_MTLS_ENDPOINT, + scopes=None, + api_mtls_endpoint=client.DEFAULT_MTLS_ENDPOINT, + client_cert_source=client_cert_source_callback, + quota_project_id=None, + ) + + # Check the case api_endpoint is not provided, GOOGLE_API_USE_MTLS is + # "auto", and default_client_cert_source is provided. + with mock.patch.dict(os.environ, {"GOOGLE_API_USE_MTLS": "auto"}): + with mock.patch.object(transport_class, "__init__") as patched: + with mock.patch( + "google.auth.transport.mtls.has_default_client_cert_source", + return_value=True, + ): + patched.return_value = None + client = client_class() + patched.assert_called_once_with( + credentials=None, + credentials_file=None, + host=client.DEFAULT_MTLS_ENDPOINT, + scopes=None, + api_mtls_endpoint=client.DEFAULT_MTLS_ENDPOINT, + client_cert_source=None, + quota_project_id=None, + ) + + # Check the case api_endpoint is not provided, GOOGLE_API_USE_MTLS is + # "auto", but client_cert_source and default_client_cert_source are None. 
+ with mock.patch.dict(os.environ, {"GOOGLE_API_USE_MTLS": "auto"}): + with mock.patch.object(transport_class, "__init__") as patched: + with mock.patch( + "google.auth.transport.mtls.has_default_client_cert_source", + return_value=False, + ): + patched.return_value = None + client = client_class() + patched.assert_called_once_with( + credentials=None, + credentials_file=None, + host=client.DEFAULT_ENDPOINT, + scopes=None, + api_mtls_endpoint=client.DEFAULT_ENDPOINT, + client_cert_source=None, + quota_project_id=None, + ) + + # Check the case api_endpoint is not provided and GOOGLE_API_USE_MTLS has + # unsupported value. + with mock.patch.dict(os.environ, {"GOOGLE_API_USE_MTLS": "Unsupported"}): + with pytest.raises(MutualTLSChannelError): + client = client_class() + + # Check the case quota_project_id is provided + options = client_options.ClientOptions(quota_project_id="octopus") + with mock.patch.object(transport_class, "__init__") as patched: + patched.return_value = None + client = client_class(client_options=options) + patched.assert_called_once_with( + credentials=None, + credentials_file=None, + host=client.DEFAULT_ENDPOINT, + scopes=None, + api_mtls_endpoint=client.DEFAULT_ENDPOINT, + client_cert_source=None, + quota_project_id="octopus", + ) + + +@pytest.mark.parametrize( + "client_class,transport_class,transport_name", + [ + ( + PolicyTagManagerSerializationClient, + transports.PolicyTagManagerSerializationGrpcTransport, + "grpc", + ), + ( + PolicyTagManagerSerializationAsyncClient, + transports.PolicyTagManagerSerializationGrpcAsyncIOTransport, + "grpc_asyncio", + ), + ], +) +def test_policy_tag_manager_serialization_client_client_options_scopes( + client_class, transport_class, transport_name +): + # Check the case scopes are provided. 
+ options = client_options.ClientOptions(scopes=["1", "2"],) + with mock.patch.object(transport_class, "__init__") as patched: + patched.return_value = None + client = client_class(client_options=options) + patched.assert_called_once_with( + credentials=None, + credentials_file=None, + host=client.DEFAULT_ENDPOINT, + scopes=["1", "2"], + api_mtls_endpoint=client.DEFAULT_ENDPOINT, + client_cert_source=None, + quota_project_id=None, + ) + + +@pytest.mark.parametrize( + "client_class,transport_class,transport_name", + [ + ( + PolicyTagManagerSerializationClient, + transports.PolicyTagManagerSerializationGrpcTransport, + "grpc", + ), + ( + PolicyTagManagerSerializationAsyncClient, + transports.PolicyTagManagerSerializationGrpcAsyncIOTransport, + "grpc_asyncio", + ), + ], +) +def test_policy_tag_manager_serialization_client_client_options_credentials_file( + client_class, transport_class, transport_name +): + # Check the case credentials file is provided. + options = client_options.ClientOptions(credentials_file="credentials.json") + with mock.patch.object(transport_class, "__init__") as patched: + patched.return_value = None + client = client_class(client_options=options) + patched.assert_called_once_with( + credentials=None, + credentials_file="credentials.json", + host=client.DEFAULT_ENDPOINT, + scopes=None, + api_mtls_endpoint=client.DEFAULT_ENDPOINT, + client_cert_source=None, + quota_project_id=None, + ) + + +def test_policy_tag_manager_serialization_client_client_options_from_dict(): + with mock.patch( + "google.cloud.datacatalog_v1beta1.services.policy_tag_manager_serialization.transports.PolicyTagManagerSerializationGrpcTransport.__init__" + ) as grpc_transport: + grpc_transport.return_value = None + client = PolicyTagManagerSerializationClient( + client_options={"api_endpoint": "squid.clam.whelk"} + ) + grpc_transport.assert_called_once_with( + credentials=None, + credentials_file=None, + host="squid.clam.whelk", + scopes=None, + 
api_mtls_endpoint="squid.clam.whelk", + client_cert_source=None, + quota_project_id=None, + ) + + +def test_import_taxonomies( + transport: str = "grpc", + request_type=policytagmanagerserialization.ImportTaxonomiesRequest, +): + client = PolicyTagManagerSerializationClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = request_type() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._transport.import_taxonomies), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = policytagmanagerserialization.ImportTaxonomiesResponse() + + response = client.import_taxonomies(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0] == policytagmanagerserialization.ImportTaxonomiesRequest() + + # Establish that the response is the type that we expect. + assert isinstance(response, policytagmanagerserialization.ImportTaxonomiesResponse) + + +def test_import_taxonomies_from_dict(): + test_import_taxonomies(request_type=dict) + + +@pytest.mark.asyncio +async def test_import_taxonomies_async(transport: str = "grpc_asyncio"): + client = PolicyTagManagerSerializationAsyncClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = policytagmanagerserialization.ImportTaxonomiesRequest() + + # Mock the actual call within the gRPC stub, and fake the request. 
+ with mock.patch.object( + type(client._client._transport.import_taxonomies), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + policytagmanagerserialization.ImportTaxonomiesResponse() + ) + + response = await client.import_taxonomies(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0] == request + + # Establish that the response is the type that we expect. + assert isinstance(response, policytagmanagerserialization.ImportTaxonomiesResponse) + + +def test_import_taxonomies_field_headers(): + client = PolicyTagManagerSerializationClient( + credentials=credentials.AnonymousCredentials(), + ) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = policytagmanagerserialization.ImportTaxonomiesRequest() + request.parent = "parent/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._transport.import_taxonomies), "__call__" + ) as call: + call.return_value = policytagmanagerserialization.ImportTaxonomiesResponse() + + client.import_taxonomies(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "parent=parent/value",) in kw["metadata"] + + +@pytest.mark.asyncio +async def test_import_taxonomies_field_headers_async(): + client = PolicyTagManagerSerializationAsyncClient( + credentials=credentials.AnonymousCredentials(), + ) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. 
+ request = policytagmanagerserialization.ImportTaxonomiesRequest() + request.parent = "parent/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.import_taxonomies), "__call__" + ) as call: + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + policytagmanagerserialization.ImportTaxonomiesResponse() + ) + + await client.import_taxonomies(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "parent=parent/value",) in kw["metadata"] + + +def test_export_taxonomies( + transport: str = "grpc", + request_type=policytagmanagerserialization.ExportTaxonomiesRequest, +): + client = PolicyTagManagerSerializationClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = request_type() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._transport.export_taxonomies), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = policytagmanagerserialization.ExportTaxonomiesResponse() + + response = client.export_taxonomies(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + + assert args[0] == policytagmanagerserialization.ExportTaxonomiesRequest() + + # Establish that the response is the type that we expect. 
+ assert isinstance(response, policytagmanagerserialization.ExportTaxonomiesResponse) + + +def test_export_taxonomies_from_dict(): + test_export_taxonomies(request_type=dict) + + +@pytest.mark.asyncio +async def test_export_taxonomies_async(transport: str = "grpc_asyncio"): + client = PolicyTagManagerSerializationAsyncClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # Everything is optional in proto3 as far as the runtime is concerned, + # and we are mocking out the actual API, so just send an empty request. + request = policytagmanagerserialization.ExportTaxonomiesRequest() + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.export_taxonomies), "__call__" + ) as call: + # Designate an appropriate return value for the call. + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + policytagmanagerserialization.ExportTaxonomiesResponse() + ) + + response = await client.export_taxonomies(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + + assert args[0] == request + + # Establish that the response is the type that we expect. + assert isinstance(response, policytagmanagerserialization.ExportTaxonomiesResponse) + + +def test_export_taxonomies_field_headers(): + client = PolicyTagManagerSerializationClient( + credentials=credentials.AnonymousCredentials(), + ) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = policytagmanagerserialization.ExportTaxonomiesRequest() + request.parent = "parent/value" + + # Mock the actual call within the gRPC stub, and fake the request. 
+ with mock.patch.object( + type(client._transport.export_taxonomies), "__call__" + ) as call: + call.return_value = policytagmanagerserialization.ExportTaxonomiesResponse() + + client.export_taxonomies(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) == 1 + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "parent=parent/value",) in kw["metadata"] + + +@pytest.mark.asyncio +async def test_export_taxonomies_field_headers_async(): + client = PolicyTagManagerSerializationAsyncClient( + credentials=credentials.AnonymousCredentials(), + ) + + # Any value that is part of the HTTP/1.1 URI should be sent as + # a field header. Set these to a non-empty value. + request = policytagmanagerserialization.ExportTaxonomiesRequest() + request.parent = "parent/value" + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client._client._transport.export_taxonomies), "__call__" + ) as call: + call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( + policytagmanagerserialization.ExportTaxonomiesResponse() + ) + + await client.export_taxonomies(request) + + # Establish that the underlying gRPC stub method was called. + assert len(call.mock_calls) + _, args, _ = call.mock_calls[0] + assert args[0] == request + + # Establish that the field header was sent. + _, _, kw = call.mock_calls[0] + assert ("x-goog-request-params", "parent=parent/value",) in kw["metadata"] + + +def test_credentials_transport_error(): + # It is an error to provide credentials and a transport instance. 
+ transport = transports.PolicyTagManagerSerializationGrpcTransport( + credentials=credentials.AnonymousCredentials(), + ) + with pytest.raises(ValueError): + client = PolicyTagManagerSerializationClient( + credentials=credentials.AnonymousCredentials(), transport=transport, + ) + + # It is an error to provide a credentials file and a transport instance. + transport = transports.PolicyTagManagerSerializationGrpcTransport( + credentials=credentials.AnonymousCredentials(), + ) + with pytest.raises(ValueError): + client = PolicyTagManagerSerializationClient( + client_options={"credentials_file": "credentials.json"}, + transport=transport, + ) + + # It is an error to provide scopes and a transport instance. + transport = transports.PolicyTagManagerSerializationGrpcTransport( + credentials=credentials.AnonymousCredentials(), + ) + with pytest.raises(ValueError): + client = PolicyTagManagerSerializationClient( + client_options={"scopes": ["1", "2"]}, transport=transport, + ) + + +def test_transport_instance(): + # A client may be instantiated with a custom transport instance. + transport = transports.PolicyTagManagerSerializationGrpcTransport( + credentials=credentials.AnonymousCredentials(), + ) + client = PolicyTagManagerSerializationClient(transport=transport) + assert client._transport is transport + + +def test_transport_get_channel(): + # A client may be instantiated with a custom transport instance. + transport = transports.PolicyTagManagerSerializationGrpcTransport( + credentials=credentials.AnonymousCredentials(), + ) + channel = transport.grpc_channel + assert channel + + transport = transports.PolicyTagManagerSerializationGrpcAsyncIOTransport( + credentials=credentials.AnonymousCredentials(), + ) + channel = transport.grpc_channel + assert channel + + +def test_transport_grpc_default(): + # A client should use the gRPC transport by default. 
+ client = PolicyTagManagerSerializationClient( + credentials=credentials.AnonymousCredentials(), + ) + assert isinstance( + client._transport, transports.PolicyTagManagerSerializationGrpcTransport, + ) + + +def test_policy_tag_manager_serialization_base_transport_error(): + # Passing both a credentials object and credentials_file should raise an error + with pytest.raises(exceptions.DuplicateCredentialArgs): + transport = transports.PolicyTagManagerSerializationTransport( + credentials=credentials.AnonymousCredentials(), + credentials_file="credentials.json", + ) + + +def test_policy_tag_manager_serialization_base_transport(): + # Instantiate the base transport. + with mock.patch( + "google.cloud.datacatalog_v1beta1.services.policy_tag_manager_serialization.transports.PolicyTagManagerSerializationTransport.__init__" + ) as Transport: + Transport.return_value = None + transport = transports.PolicyTagManagerSerializationTransport( + credentials=credentials.AnonymousCredentials(), + ) + + # Every method on the transport should just blindly + # raise NotImplementedError. 
+ methods = ( + "import_taxonomies", + "export_taxonomies", + ) + for method in methods: + with pytest.raises(NotImplementedError): + getattr(transport, method)(request=object()) + + +def test_policy_tag_manager_serialization_base_transport_with_credentials_file(): + # Instantiate the base transport with a credentials file + with mock.patch.object( + auth, "load_credentials_from_file" + ) as load_creds, mock.patch( + "google.cloud.datacatalog_v1beta1.services.policy_tag_manager_serialization.transports.PolicyTagManagerSerializationTransport._prep_wrapped_messages" + ) as Transport: + Transport.return_value = None + load_creds.return_value = (credentials.AnonymousCredentials(), None) + transport = transports.PolicyTagManagerSerializationTransport( + credentials_file="credentials.json", quota_project_id="octopus", + ) + load_creds.assert_called_once_with( + "credentials.json", + scopes=("https://www.googleapis.com/auth/cloud-platform",), + quota_project_id="octopus", + ) + + +def test_policy_tag_manager_serialization_auth_adc(): + # If no credentials are provided, we should use ADC credentials. + with mock.patch.object(auth, "default") as adc: + adc.return_value = (credentials.AnonymousCredentials(), None) + PolicyTagManagerSerializationClient() + adc.assert_called_once_with( + scopes=("https://www.googleapis.com/auth/cloud-platform",), + quota_project_id=None, + ) + + +def test_policy_tag_manager_serialization_transport_auth_adc(): + # If credentials and host are not provided, the transport class should use + # ADC credentials. 
+ with mock.patch.object(auth, "default") as adc: + adc.return_value = (credentials.AnonymousCredentials(), None) + transports.PolicyTagManagerSerializationGrpcTransport( + host="squid.clam.whelk", quota_project_id="octopus" + ) + adc.assert_called_once_with( + scopes=("https://www.googleapis.com/auth/cloud-platform",), + quota_project_id="octopus", + ) + + +def test_policy_tag_manager_serialization_host_no_port(): + client = PolicyTagManagerSerializationClient( + credentials=credentials.AnonymousCredentials(), + client_options=client_options.ClientOptions( + api_endpoint="datacatalog.googleapis.com" + ), + ) + assert client._transport._host == "datacatalog.googleapis.com:443" + + +def test_policy_tag_manager_serialization_host_with_port(): + client = PolicyTagManagerSerializationClient( + credentials=credentials.AnonymousCredentials(), + client_options=client_options.ClientOptions( + api_endpoint="datacatalog.googleapis.com:8000" + ), + ) + assert client._transport._host == "datacatalog.googleapis.com:8000" + + +def test_policy_tag_manager_serialization_grpc_transport_channel(): + channel = grpc.insecure_channel("http://localhost/") + + # Check that if channel is provided, mtls endpoint and client_cert_source + # won't be used. + callback = mock.MagicMock() + transport = transports.PolicyTagManagerSerializationGrpcTransport( + host="squid.clam.whelk", + channel=channel, + api_mtls_endpoint="mtls.squid.clam.whelk", + client_cert_source=callback, + ) + assert transport.grpc_channel == channel + assert transport._host == "squid.clam.whelk:443" + assert not callback.called + + +def test_policy_tag_manager_serialization_grpc_asyncio_transport_channel(): + channel = aio.insecure_channel("http://localhost/") + + # Check that if channel is provided, mtls endpoint and client_cert_source + # won't be used. 
+ callback = mock.MagicMock() + transport = transports.PolicyTagManagerSerializationGrpcAsyncIOTransport( + host="squid.clam.whelk", + channel=channel, + api_mtls_endpoint="mtls.squid.clam.whelk", + client_cert_source=callback, + ) + assert transport.grpc_channel == channel + assert transport._host == "squid.clam.whelk:443" + assert not callback.called + + +@mock.patch("grpc.ssl_channel_credentials", autospec=True) +@mock.patch("google.api_core.grpc_helpers.create_channel", autospec=True) +def test_policy_tag_manager_serialization_grpc_transport_channel_mtls_with_client_cert_source( + grpc_create_channel, grpc_ssl_channel_cred +): + # Check that if channel is None, but api_mtls_endpoint and client_cert_source + # are provided, then an mTLS channel will be created. + mock_cred = mock.Mock() + + mock_ssl_cred = mock.Mock() + grpc_ssl_channel_cred.return_value = mock_ssl_cred + + mock_grpc_channel = mock.Mock() + grpc_create_channel.return_value = mock_grpc_channel + + transport = transports.PolicyTagManagerSerializationGrpcTransport( + host="squid.clam.whelk", + credentials=mock_cred, + api_mtls_endpoint="mtls.squid.clam.whelk", + client_cert_source=client_cert_source_callback, + ) + grpc_ssl_channel_cred.assert_called_once_with( + certificate_chain=b"cert bytes", private_key=b"key bytes" + ) + grpc_create_channel.assert_called_once_with( + "mtls.squid.clam.whelk:443", + credentials=mock_cred, + credentials_file=None, + scopes=("https://www.googleapis.com/auth/cloud-platform",), + ssl_credentials=mock_ssl_cred, + quota_project_id=None, + ) + assert transport.grpc_channel == mock_grpc_channel + + +@mock.patch("grpc.ssl_channel_credentials", autospec=True) +@mock.patch("google.api_core.grpc_helpers_async.create_channel", autospec=True) +def test_policy_tag_manager_serialization_grpc_asyncio_transport_channel_mtls_with_client_cert_source( + grpc_create_channel, grpc_ssl_channel_cred +): + # Check that if channel is None, but api_mtls_endpoint and client_cert_source + #
are provided, then an mTLS channel will be created. + mock_cred = mock.Mock() + + mock_ssl_cred = mock.Mock() + grpc_ssl_channel_cred.return_value = mock_ssl_cred + + mock_grpc_channel = mock.Mock() + grpc_create_channel.return_value = mock_grpc_channel + + transport = transports.PolicyTagManagerSerializationGrpcAsyncIOTransport( + host="squid.clam.whelk", + credentials=mock_cred, + api_mtls_endpoint="mtls.squid.clam.whelk", + client_cert_source=client_cert_source_callback, + ) + grpc_ssl_channel_cred.assert_called_once_with( + certificate_chain=b"cert bytes", private_key=b"key bytes" + ) + grpc_create_channel.assert_called_once_with( + "mtls.squid.clam.whelk:443", + credentials=mock_cred, + credentials_file=None, + scopes=("https://www.googleapis.com/auth/cloud-platform",), + ssl_credentials=mock_ssl_cred, + quota_project_id=None, + ) + assert transport.grpc_channel == mock_grpc_channel + + +@pytest.mark.parametrize( + "api_mtls_endpoint", ["mtls.squid.clam.whelk", "mtls.squid.clam.whelk:443"] +) +@mock.patch("google.api_core.grpc_helpers.create_channel", autospec=True) +def test_policy_tag_manager_serialization_grpc_transport_channel_mtls_with_adc( + grpc_create_channel, api_mtls_endpoint +): + # Check that if channel and client_cert_source are None, but api_mtls_endpoint + # is provided, then an mTLS channel will be created with SSL ADC. + mock_grpc_channel = mock.Mock() + grpc_create_channel.return_value = mock_grpc_channel + + # Mock google.auth.transport.grpc.SslCredentials class.
+ mock_ssl_cred = mock.Mock() + with mock.patch.multiple( + "google.auth.transport.grpc.SslCredentials", + __init__=mock.Mock(return_value=None), + ssl_credentials=mock.PropertyMock(return_value=mock_ssl_cred), + ): + mock_cred = mock.Mock() + transport = transports.PolicyTagManagerSerializationGrpcTransport( + host="squid.clam.whelk", + credentials=mock_cred, + api_mtls_endpoint=api_mtls_endpoint, + client_cert_source=None, + ) + grpc_create_channel.assert_called_once_with( + "mtls.squid.clam.whelk:443", + credentials=mock_cred, + credentials_file=None, + scopes=("https://www.googleapis.com/auth/cloud-platform",), + ssl_credentials=mock_ssl_cred, + quota_project_id=None, + ) + assert transport.grpc_channel == mock_grpc_channel + + +@pytest.mark.parametrize( + "api_mtls_endpoint", ["mtls.squid.clam.whelk", "mtls.squid.clam.whelk:443"] +) +@mock.patch("google.api_core.grpc_helpers_async.create_channel", autospec=True) +def test_policy_tag_manager_serialization_grpc_asyncio_transport_channel_mtls_with_adc( + grpc_create_channel, api_mtls_endpoint +): + # Check that if channel and client_cert_source are None, but api_mtls_endpoint + # is provided, then an mTLS channel will be created with SSL ADC. + mock_grpc_channel = mock.Mock() + grpc_create_channel.return_value = mock_grpc_channel + + # Mock google.auth.transport.grpc.SslCredentials class.
+ mock_ssl_cred = mock.Mock() + with mock.patch.multiple( + "google.auth.transport.grpc.SslCredentials", + __init__=mock.Mock(return_value=None), + ssl_credentials=mock.PropertyMock(return_value=mock_ssl_cred), + ): + mock_cred = mock.Mock() + transport = transports.PolicyTagManagerSerializationGrpcAsyncIOTransport( + host="squid.clam.whelk", + credentials=mock_cred, + api_mtls_endpoint=api_mtls_endpoint, + client_cert_source=None, + ) + grpc_create_channel.assert_called_once_with( + "mtls.squid.clam.whelk:443", + credentials=mock_cred, + credentials_file=None, + scopes=("https://www.googleapis.com/auth/cloud-platform",), + ssl_credentials=mock_ssl_cred, + quota_project_id=None, + ) + assert transport.grpc_channel == mock_grpc_channel diff --git a/tests/unit/gapic/v1/test_data_catalog_client_v1.py b/tests/unit/gapic/v1/test_data_catalog_client_v1.py deleted file mode 100644 index 6d36cdcd..00000000 --- a/tests/unit/gapic/v1/test_data_catalog_client_v1.py +++ /dev/null @@ -1,1268 +0,0 @@ -# -*- coding: utf-8 -*- -# -# Copyright 2020 Google LLC -# -# Licensed under the Apache License, Version 2.0 (the "License"); -# you may not use this file except in compliance with the License. -# You may obtain a copy of the License at -# -# https://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, software -# distributed under the License is distributed on an "AS IS" BASIS, -# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -# See the License for the specific language governing permissions and -# limitations under the License. 
- -"""Unit tests.""" - -import mock -import pytest - -from google.cloud import datacatalog_v1 -from google.cloud.datacatalog_v1.proto import datacatalog_pb2 -from google.cloud.datacatalog_v1.proto import search_pb2 -from google.cloud.datacatalog_v1.proto import tags_pb2 -from google.iam.v1 import iam_policy_pb2 -from google.iam.v1 import policy_pb2 -from google.protobuf import empty_pb2 - - -class MultiCallableStub(object): - """Stub for the grpc.UnaryUnaryMultiCallable interface.""" - - def __init__(self, method, channel_stub): - self.method = method - self.channel_stub = channel_stub - - def __call__(self, request, timeout=None, metadata=None, credentials=None): - self.channel_stub.requests.append((self.method, request)) - - response = None - if self.channel_stub.responses: - response = self.channel_stub.responses.pop() - - if isinstance(response, Exception): - raise response - - if response: - return response - - -class ChannelStub(object): - """Stub for the grpc.Channel interface.""" - - def __init__(self, responses=[]): - self.responses = responses - self.requests = [] - - def unary_unary(self, method, request_serializer=None, response_deserializer=None): - return MultiCallableStub(method, self) - - -class CustomException(Exception): - pass - - -class TestDataCatalogClient(object): - def test_search_catalog(self): - # Setup Expected Response - next_page_token = "" - results_element = {} - results = [results_element] - expected_response = {"next_page_token": next_page_token, "results": results} - expected_response = datacatalog_pb2.SearchCatalogResponse(**expected_response) - - # Mock the API response - channel = ChannelStub(responses=[expected_response]) - patch = mock.patch("google.api_core.grpc_helpers.create_channel") - with patch as create_channel: - create_channel.return_value = channel - client = datacatalog_v1.DataCatalogClient() - - # Setup Request - scope = {} - query = "query107944136" - - paged_list_response = client.search_catalog(scope, query) - 
resources = list(paged_list_response) - assert len(resources) == 1 - - assert expected_response.results[0] == resources[0] - - assert len(channel.requests) == 1 - expected_request = datacatalog_pb2.SearchCatalogRequest( - scope=scope, query=query - ) - actual_request = channel.requests[0][1] - assert expected_request == actual_request - - def test_search_catalog_exception(self): - channel = ChannelStub(responses=[CustomException()]) - patch = mock.patch("google.api_core.grpc_helpers.create_channel") - with patch as create_channel: - create_channel.return_value = channel - client = datacatalog_v1.DataCatalogClient() - - # Setup request - scope = {} - query = "query107944136" - - paged_list_response = client.search_catalog(scope, query) - with pytest.raises(CustomException): - list(paged_list_response) - - def test_create_entry_group(self): - # Setup Expected Response - name = "name3373707" - display_name = "displayName1615086568" - description = "description-1724546052" - expected_response = { - "name": name, - "display_name": display_name, - "description": description, - } - expected_response = datacatalog_pb2.EntryGroup(**expected_response) - - # Mock the API response - channel = ChannelStub(responses=[expected_response]) - patch = mock.patch("google.api_core.grpc_helpers.create_channel") - with patch as create_channel: - create_channel.return_value = channel - client = datacatalog_v1.DataCatalogClient() - - # Setup Request - parent = client.location_path("[PROJECT]", "[LOCATION]") - entry_group_id = "entryGroupId-43122680" - - response = client.create_entry_group(parent, entry_group_id) - assert expected_response == response - - assert len(channel.requests) == 1 - expected_request = datacatalog_pb2.CreateEntryGroupRequest( - parent=parent, entry_group_id=entry_group_id - ) - actual_request = channel.requests[0][1] - assert expected_request == actual_request - - def test_create_entry_group_exception(self): - # Mock the API response - channel = 
ChannelStub(responses=[CustomException()]) - patch = mock.patch("google.api_core.grpc_helpers.create_channel") - with patch as create_channel: - create_channel.return_value = channel - client = datacatalog_v1.DataCatalogClient() - - # Setup request - parent = client.location_path("[PROJECT]", "[LOCATION]") - entry_group_id = "entryGroupId-43122680" - - with pytest.raises(CustomException): - client.create_entry_group(parent, entry_group_id) - - def test_get_entry_group(self): - # Setup Expected Response - name_2 = "name2-1052831874" - display_name = "displayName1615086568" - description = "description-1724546052" - expected_response = { - "name": name_2, - "display_name": display_name, - "description": description, - } - expected_response = datacatalog_pb2.EntryGroup(**expected_response) - - # Mock the API response - channel = ChannelStub(responses=[expected_response]) - patch = mock.patch("google.api_core.grpc_helpers.create_channel") - with patch as create_channel: - create_channel.return_value = channel - client = datacatalog_v1.DataCatalogClient() - - # Setup Request - name = client.entry_group_path("[PROJECT]", "[LOCATION]", "[ENTRY_GROUP]") - - response = client.get_entry_group(name) - assert expected_response == response - - assert len(channel.requests) == 1 - expected_request = datacatalog_pb2.GetEntryGroupRequest(name=name) - actual_request = channel.requests[0][1] - assert expected_request == actual_request - - def test_get_entry_group_exception(self): - # Mock the API response - channel = ChannelStub(responses=[CustomException()]) - patch = mock.patch("google.api_core.grpc_helpers.create_channel") - with patch as create_channel: - create_channel.return_value = channel - client = datacatalog_v1.DataCatalogClient() - - # Setup request - name = client.entry_group_path("[PROJECT]", "[LOCATION]", "[ENTRY_GROUP]") - - with pytest.raises(CustomException): - client.get_entry_group(name) - - def test_update_entry_group(self): - # Setup Expected Response - name = 
"name3373707" - display_name = "displayName1615086568" - description = "description-1724546052" - expected_response = { - "name": name, - "display_name": display_name, - "description": description, - } - expected_response = datacatalog_pb2.EntryGroup(**expected_response) - - # Mock the API response - channel = ChannelStub(responses=[expected_response]) - patch = mock.patch("google.api_core.grpc_helpers.create_channel") - with patch as create_channel: - create_channel.return_value = channel - client = datacatalog_v1.DataCatalogClient() - - # Setup Request - entry_group = {} - - response = client.update_entry_group(entry_group) - assert expected_response == response - - assert len(channel.requests) == 1 - expected_request = datacatalog_pb2.UpdateEntryGroupRequest( - entry_group=entry_group - ) - actual_request = channel.requests[0][1] - assert expected_request == actual_request - - def test_update_entry_group_exception(self): - # Mock the API response - channel = ChannelStub(responses=[CustomException()]) - patch = mock.patch("google.api_core.grpc_helpers.create_channel") - with patch as create_channel: - create_channel.return_value = channel - client = datacatalog_v1.DataCatalogClient() - - # Setup request - entry_group = {} - - with pytest.raises(CustomException): - client.update_entry_group(entry_group) - - def test_delete_entry_group(self): - channel = ChannelStub() - patch = mock.patch("google.api_core.grpc_helpers.create_channel") - with patch as create_channel: - create_channel.return_value = channel - client = datacatalog_v1.DataCatalogClient() - - # Setup Request - name = client.entry_group_path("[PROJECT]", "[LOCATION]", "[ENTRY_GROUP]") - - client.delete_entry_group(name) - - assert len(channel.requests) == 1 - expected_request = datacatalog_pb2.DeleteEntryGroupRequest(name=name) - actual_request = channel.requests[0][1] - assert expected_request == actual_request - - def test_delete_entry_group_exception(self): - # Mock the API response - channel = 
ChannelStub(responses=[CustomException()]) - patch = mock.patch("google.api_core.grpc_helpers.create_channel") - with patch as create_channel: - create_channel.return_value = channel - client = datacatalog_v1.DataCatalogClient() - - # Setup request - name = client.entry_group_path("[PROJECT]", "[LOCATION]", "[ENTRY_GROUP]") - - with pytest.raises(CustomException): - client.delete_entry_group(name) - - def test_list_entry_groups(self): - # Setup Expected Response - next_page_token = "" - entry_groups_element = {} - entry_groups = [entry_groups_element] - expected_response = { - "next_page_token": next_page_token, - "entry_groups": entry_groups, - } - expected_response = datacatalog_pb2.ListEntryGroupsResponse(**expected_response) - - # Mock the API response - channel = ChannelStub(responses=[expected_response]) - patch = mock.patch("google.api_core.grpc_helpers.create_channel") - with patch as create_channel: - create_channel.return_value = channel - client = datacatalog_v1.DataCatalogClient() - - # Setup Request - parent = client.entry_group_path("[PROJECT]", "[LOCATION]", "[ENTRY_GROUP]") - - paged_list_response = client.list_entry_groups(parent) - resources = list(paged_list_response) - assert len(resources) == 1 - - assert expected_response.entry_groups[0] == resources[0] - - assert len(channel.requests) == 1 - expected_request = datacatalog_pb2.ListEntryGroupsRequest(parent=parent) - actual_request = channel.requests[0][1] - assert expected_request == actual_request - - def test_list_entry_groups_exception(self): - channel = ChannelStub(responses=[CustomException()]) - patch = mock.patch("google.api_core.grpc_helpers.create_channel") - with patch as create_channel: - create_channel.return_value = channel - client = datacatalog_v1.DataCatalogClient() - - # Setup request - parent = client.entry_group_path("[PROJECT]", "[LOCATION]", "[ENTRY_GROUP]") - - paged_list_response = client.list_entry_groups(parent) - with pytest.raises(CustomException): - 
list(paged_list_response) - - def test_create_entry(self): - # Setup Expected Response - name = "name3373707" - linked_resource = "linkedResource1544625012" - user_specified_type = "userSpecifiedType-940364963" - user_specified_system = "userSpecifiedSystem-1776119406" - display_name = "displayName1615086568" - description = "description-1724546052" - expected_response = { - "name": name, - "linked_resource": linked_resource, - "user_specified_type": user_specified_type, - "user_specified_system": user_specified_system, - "display_name": display_name, - "description": description, - } - expected_response = datacatalog_pb2.Entry(**expected_response) - - # Mock the API response - channel = ChannelStub(responses=[expected_response]) - patch = mock.patch("google.api_core.grpc_helpers.create_channel") - with patch as create_channel: - create_channel.return_value = channel - client = datacatalog_v1.DataCatalogClient() - - # Setup Request - parent = client.entry_group_path("[PROJECT]", "[LOCATION]", "[ENTRY_GROUP]") - entry_id = "entryId-2093663224" - entry = {} - - response = client.create_entry(parent, entry_id, entry) - assert expected_response == response - - assert len(channel.requests) == 1 - expected_request = datacatalog_pb2.CreateEntryRequest( - parent=parent, entry_id=entry_id, entry=entry - ) - actual_request = channel.requests[0][1] - assert expected_request == actual_request - - def test_create_entry_exception(self): - # Mock the API response - channel = ChannelStub(responses=[CustomException()]) - patch = mock.patch("google.api_core.grpc_helpers.create_channel") - with patch as create_channel: - create_channel.return_value = channel - client = datacatalog_v1.DataCatalogClient() - - # Setup request - parent = client.entry_group_path("[PROJECT]", "[LOCATION]", "[ENTRY_GROUP]") - entry_id = "entryId-2093663224" - entry = {} - - with pytest.raises(CustomException): - client.create_entry(parent, entry_id, entry) - - def test_update_entry(self): - # Setup Expected 
Response - name = "name3373707" - linked_resource = "linkedResource1544625012" - user_specified_type = "userSpecifiedType-940364963" - user_specified_system = "userSpecifiedSystem-1776119406" - display_name = "displayName1615086568" - description = "description-1724546052" - expected_response = { - "name": name, - "linked_resource": linked_resource, - "user_specified_type": user_specified_type, - "user_specified_system": user_specified_system, - "display_name": display_name, - "description": description, - } - expected_response = datacatalog_pb2.Entry(**expected_response) - - # Mock the API response - channel = ChannelStub(responses=[expected_response]) - patch = mock.patch("google.api_core.grpc_helpers.create_channel") - with patch as create_channel: - create_channel.return_value = channel - client = datacatalog_v1.DataCatalogClient() - - # Setup Request - entry = {} - - response = client.update_entry(entry) - assert expected_response == response - - assert len(channel.requests) == 1 - expected_request = datacatalog_pb2.UpdateEntryRequest(entry=entry) - actual_request = channel.requests[0][1] - assert expected_request == actual_request - - def test_update_entry_exception(self): - # Mock the API response - channel = ChannelStub(responses=[CustomException()]) - patch = mock.patch("google.api_core.grpc_helpers.create_channel") - with patch as create_channel: - create_channel.return_value = channel - client = datacatalog_v1.DataCatalogClient() - - # Setup request - entry = {} - - with pytest.raises(CustomException): - client.update_entry(entry) - - def test_delete_entry(self): - channel = ChannelStub() - patch = mock.patch("google.api_core.grpc_helpers.create_channel") - with patch as create_channel: - create_channel.return_value = channel - client = datacatalog_v1.DataCatalogClient() - - # Setup Request - name = client.entry_path("[PROJECT]", "[LOCATION]", "[ENTRY_GROUP]", "[ENTRY]") - - client.delete_entry(name) - - assert len(channel.requests) == 1 - 
expected_request = datacatalog_pb2.DeleteEntryRequest(name=name) - actual_request = channel.requests[0][1] - assert expected_request == actual_request - - def test_delete_entry_exception(self): - # Mock the API response - channel = ChannelStub(responses=[CustomException()]) - patch = mock.patch("google.api_core.grpc_helpers.create_channel") - with patch as create_channel: - create_channel.return_value = channel - client = datacatalog_v1.DataCatalogClient() - - # Setup request - name = client.entry_path("[PROJECT]", "[LOCATION]", "[ENTRY_GROUP]", "[ENTRY]") - - with pytest.raises(CustomException): - client.delete_entry(name) - - def test_get_entry(self): - # Setup Expected Response - name_2 = "name2-1052831874" - linked_resource = "linkedResource1544625012" - user_specified_type = "userSpecifiedType-940364963" - user_specified_system = "userSpecifiedSystem-1776119406" - display_name = "displayName1615086568" - description = "description-1724546052" - expected_response = { - "name": name_2, - "linked_resource": linked_resource, - "user_specified_type": user_specified_type, - "user_specified_system": user_specified_system, - "display_name": display_name, - "description": description, - } - expected_response = datacatalog_pb2.Entry(**expected_response) - - # Mock the API response - channel = ChannelStub(responses=[expected_response]) - patch = mock.patch("google.api_core.grpc_helpers.create_channel") - with patch as create_channel: - create_channel.return_value = channel - client = datacatalog_v1.DataCatalogClient() - - # Setup Request - name = client.entry_path("[PROJECT]", "[LOCATION]", "[ENTRY_GROUP]", "[ENTRY]") - - response = client.get_entry(name) - assert expected_response == response - - assert len(channel.requests) == 1 - expected_request = datacatalog_pb2.GetEntryRequest(name=name) - actual_request = channel.requests[0][1] - assert expected_request == actual_request - - def test_get_entry_exception(self): - # Mock the API response - channel = 
ChannelStub(responses=[CustomException()])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1.DataCatalogClient()
-
-        # Setup request
-        name = client.entry_path("[PROJECT]", "[LOCATION]", "[ENTRY_GROUP]", "[ENTRY]")
-
-        with pytest.raises(CustomException):
-            client.get_entry(name)
-
-    def test_lookup_entry(self):
-        # Setup Expected Response
-        name = "name3373707"
-        linked_resource = "linkedResource1544625012"
-        user_specified_type = "userSpecifiedType-940364963"
-        user_specified_system = "userSpecifiedSystem-1776119406"
-        display_name = "displayName1615086568"
-        description = "description-1724546052"
-        expected_response = {
-            "name": name,
-            "linked_resource": linked_resource,
-            "user_specified_type": user_specified_type,
-            "user_specified_system": user_specified_system,
-            "display_name": display_name,
-            "description": description,
-        }
-        expected_response = datacatalog_pb2.Entry(**expected_response)
-
-        # Mock the API response
-        channel = ChannelStub(responses=[expected_response])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1.DataCatalogClient()
-
-        response = client.lookup_entry()
-        assert expected_response == response
-
-        assert len(channel.requests) == 1
-        expected_request = datacatalog_pb2.LookupEntryRequest()
-        actual_request = channel.requests[0][1]
-        assert expected_request == actual_request
-
-    def test_lookup_entry_exception(self):
-        # Mock the API response
-        channel = ChannelStub(responses=[CustomException()])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1.DataCatalogClient()
-
-        with pytest.raises(CustomException):
-            client.lookup_entry()
-
-    def test_list_entries(self):
-        # Setup Expected Response
-        next_page_token = ""
-        entries_element = {}
-        entries = [entries_element]
-        expected_response = {"next_page_token": next_page_token, "entries": entries}
-        expected_response = datacatalog_pb2.ListEntriesResponse(**expected_response)
-
-        # Mock the API response
-        channel = ChannelStub(responses=[expected_response])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1.DataCatalogClient()
-
-        # Setup Request
-        parent = client.entry_group_path("[PROJECT]", "[LOCATION]", "[ENTRY_GROUP]")
-
-        paged_list_response = client.list_entries(parent)
-        resources = list(paged_list_response)
-        assert len(resources) == 1
-
-        assert expected_response.entries[0] == resources[0]
-
-        assert len(channel.requests) == 1
-        expected_request = datacatalog_pb2.ListEntriesRequest(parent=parent)
-        actual_request = channel.requests[0][1]
-        assert expected_request == actual_request
-
-    def test_list_entries_exception(self):
-        channel = ChannelStub(responses=[CustomException()])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1.DataCatalogClient()
-
-        # Setup request
-        parent = client.entry_group_path("[PROJECT]", "[LOCATION]", "[ENTRY_GROUP]")
-
-        paged_list_response = client.list_entries(parent)
-        with pytest.raises(CustomException):
-            list(paged_list_response)
-
-    def test_create_tag_template(self):
-        # Setup Expected Response
-        name = "name3373707"
-        display_name = "displayName1615086568"
-        expected_response = {"name": name, "display_name": display_name}
-        expected_response = tags_pb2.TagTemplate(**expected_response)
-
-        # Mock the API response
-        channel = ChannelStub(responses=[expected_response])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1.DataCatalogClient()
-
-        # Setup Request
-        parent = client.location_path("[PROJECT]", "[LOCATION]")
-        tag_template_id = "tagTemplateId-2020335141"
-        tag_template = {}
-
-        response = client.create_tag_template(parent, tag_template_id, tag_template)
-        assert expected_response == response
-
-        assert len(channel.requests) == 1
-        expected_request = datacatalog_pb2.CreateTagTemplateRequest(
-            parent=parent, tag_template_id=tag_template_id, tag_template=tag_template
-        )
-        actual_request = channel.requests[0][1]
-        assert expected_request == actual_request
-
-    def test_create_tag_template_exception(self):
-        # Mock the API response
-        channel = ChannelStub(responses=[CustomException()])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1.DataCatalogClient()
-
-        # Setup request
-        parent = client.location_path("[PROJECT]", "[LOCATION]")
-        tag_template_id = "tagTemplateId-2020335141"
-        tag_template = {}
-
-        with pytest.raises(CustomException):
-            client.create_tag_template(parent, tag_template_id, tag_template)
-
-    def test_get_tag_template(self):
-        # Setup Expected Response
-        name_2 = "name2-1052831874"
-        display_name = "displayName1615086568"
-        expected_response = {"name": name_2, "display_name": display_name}
-        expected_response = tags_pb2.TagTemplate(**expected_response)
-
-        # Mock the API response
-        channel = ChannelStub(responses=[expected_response])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1.DataCatalogClient()
-
-        # Setup Request
-        name = client.tag_template_path("[PROJECT]", "[LOCATION]", "[TAG_TEMPLATE]")
-
-        response = client.get_tag_template(name)
-        assert expected_response == response
-
-        assert len(channel.requests) == 1
-        expected_request = datacatalog_pb2.GetTagTemplateRequest(name=name)
-        actual_request = channel.requests[0][1]
-        assert expected_request == actual_request
-
-    def test_get_tag_template_exception(self):
-        # Mock the API response
-        channel = ChannelStub(responses=[CustomException()])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1.DataCatalogClient()
-
-        # Setup request
-        name = client.tag_template_path("[PROJECT]", "[LOCATION]", "[TAG_TEMPLATE]")
-
-        with pytest.raises(CustomException):
-            client.get_tag_template(name)
-
-    def test_update_tag_template(self):
-        # Setup Expected Response
-        name = "name3373707"
-        display_name = "displayName1615086568"
-        expected_response = {"name": name, "display_name": display_name}
-        expected_response = tags_pb2.TagTemplate(**expected_response)
-
-        # Mock the API response
-        channel = ChannelStub(responses=[expected_response])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1.DataCatalogClient()
-
-        # Setup Request
-        tag_template = {}
-
-        response = client.update_tag_template(tag_template)
-        assert expected_response == response
-
-        assert len(channel.requests) == 1
-        expected_request = datacatalog_pb2.UpdateTagTemplateRequest(
-            tag_template=tag_template
-        )
-        actual_request = channel.requests[0][1]
-        assert expected_request == actual_request
-
-    def test_update_tag_template_exception(self):
-        # Mock the API response
-        channel = ChannelStub(responses=[CustomException()])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1.DataCatalogClient()
-
-        # Setup request
-        tag_template = {}
-
-        with pytest.raises(CustomException):
-            client.update_tag_template(tag_template)
-
-    def test_delete_tag_template(self):
-        channel = ChannelStub()
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1.DataCatalogClient()
-
-        # Setup Request
-        name = client.tag_template_path("[PROJECT]", "[LOCATION]", "[TAG_TEMPLATE]")
-        force = False
-
-        client.delete_tag_template(name, force)
-
-        assert len(channel.requests) == 1
-        expected_request = datacatalog_pb2.DeleteTagTemplateRequest(
-            name=name, force=force
-        )
-        actual_request = channel.requests[0][1]
-        assert expected_request == actual_request
-
-    def test_delete_tag_template_exception(self):
-        # Mock the API response
-        channel = ChannelStub(responses=[CustomException()])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1.DataCatalogClient()
-
-        # Setup request
-        name = client.tag_template_path("[PROJECT]", "[LOCATION]", "[TAG_TEMPLATE]")
-        force = False
-
-        with pytest.raises(CustomException):
-            client.delete_tag_template(name, force)
-
-    def test_create_tag_template_field(self):
-        # Setup Expected Response
-        name = "name3373707"
-        display_name = "displayName1615086568"
-        is_required = True
-        order = 106006350
-        expected_response = {
-            "name": name,
-            "display_name": display_name,
-            "is_required": is_required,
-            "order": order,
-        }
-        expected_response = tags_pb2.TagTemplateField(**expected_response)
-
-        # Mock the API response
-        channel = ChannelStub(responses=[expected_response])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1.DataCatalogClient()
-
-        # Setup Request
-        parent = client.tag_template_path("[PROJECT]", "[LOCATION]", "[TAG_TEMPLATE]")
-        tag_template_field_id = "tagTemplateFieldId-92144832"
-        tag_template_field = {}
-
-        response = client.create_tag_template_field(
-            parent, tag_template_field_id, tag_template_field
-        )
-        assert expected_response == response
-
-        assert len(channel.requests) == 1
-        expected_request = datacatalog_pb2.CreateTagTemplateFieldRequest(
-            parent=parent,
-            tag_template_field_id=tag_template_field_id,
-            tag_template_field=tag_template_field,
-        )
-        actual_request = channel.requests[0][1]
-        assert expected_request == actual_request
-
-    def test_create_tag_template_field_exception(self):
-        # Mock the API response
-        channel = ChannelStub(responses=[CustomException()])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1.DataCatalogClient()
-
-        # Setup request
-        parent = client.tag_template_path("[PROJECT]", "[LOCATION]", "[TAG_TEMPLATE]")
-        tag_template_field_id = "tagTemplateFieldId-92144832"
-        tag_template_field = {}
-
-        with pytest.raises(CustomException):
-            client.create_tag_template_field(
-                parent, tag_template_field_id, tag_template_field
-            )
-
-    def test_update_tag_template_field(self):
-        # Setup Expected Response
-        name_2 = "name2-1052831874"
-        display_name = "displayName1615086568"
-        is_required = True
-        order = 106006350
-        expected_response = {
-            "name": name_2,
-            "display_name": display_name,
-            "is_required": is_required,
-            "order": order,
-        }
-        expected_response = tags_pb2.TagTemplateField(**expected_response)
-
-        # Mock the API response
-        channel = ChannelStub(responses=[expected_response])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1.DataCatalogClient()
-
-        # Setup Request
-        name = client.tag_template_field_path(
-            "[PROJECT]", "[LOCATION]", "[TAG_TEMPLATE]", "[FIELD]"
-        )
-        tag_template_field = {}
-
-        response = client.update_tag_template_field(name, tag_template_field)
-        assert expected_response == response
-
-        assert len(channel.requests) == 1
-        expected_request = datacatalog_pb2.UpdateTagTemplateFieldRequest(
-            name=name, tag_template_field=tag_template_field
-        )
-        actual_request = channel.requests[0][1]
-        assert expected_request == actual_request
-
-    def test_update_tag_template_field_exception(self):
-        # Mock the API response
-        channel = ChannelStub(responses=[CustomException()])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1.DataCatalogClient()
-
-        # Setup request
-        name = client.tag_template_field_path(
-            "[PROJECT]", "[LOCATION]", "[TAG_TEMPLATE]", "[FIELD]"
-        )
-        tag_template_field = {}
-
-        with pytest.raises(CustomException):
-            client.update_tag_template_field(name, tag_template_field)
-
-    def test_rename_tag_template_field(self):
-        # Setup Expected Response
-        name_2 = "name2-1052831874"
-        display_name = "displayName1615086568"
-        is_required = True
-        order = 106006350
-        expected_response = {
-            "name": name_2,
-            "display_name": display_name,
-            "is_required": is_required,
-            "order": order,
-        }
-        expected_response = tags_pb2.TagTemplateField(**expected_response)
-
-        # Mock the API response
-        channel = ChannelStub(responses=[expected_response])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1.DataCatalogClient()
-
-        # Setup Request
-        name = client.tag_template_field_path(
-            "[PROJECT]", "[LOCATION]", "[TAG_TEMPLATE]", "[FIELD]"
-        )
-        new_tag_template_field_id = "newTagTemplateFieldId-1668354591"
-
-        response = client.rename_tag_template_field(name, new_tag_template_field_id)
-        assert expected_response == response
-
-        assert len(channel.requests) == 1
-        expected_request = datacatalog_pb2.RenameTagTemplateFieldRequest(
-            name=name, new_tag_template_field_id=new_tag_template_field_id
-        )
-        actual_request = channel.requests[0][1]
-        assert expected_request == actual_request
-
-    def test_rename_tag_template_field_exception(self):
-        # Mock the API response
-        channel = ChannelStub(responses=[CustomException()])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1.DataCatalogClient()
-
-        # Setup request
-        name = client.tag_template_field_path(
-            "[PROJECT]", "[LOCATION]", "[TAG_TEMPLATE]", "[FIELD]"
-        )
-        new_tag_template_field_id = "newTagTemplateFieldId-1668354591"
-
-        with pytest.raises(CustomException):
-            client.rename_tag_template_field(name, new_tag_template_field_id)
-
-    def test_delete_tag_template_field(self):
-        channel = ChannelStub()
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1.DataCatalogClient()
-
-        # Setup Request
-        name = client.tag_template_field_path(
-            "[PROJECT]", "[LOCATION]", "[TAG_TEMPLATE]", "[FIELD]"
-        )
-        force = False
-
-        client.delete_tag_template_field(name, force)
-
-        assert len(channel.requests) == 1
-        expected_request = datacatalog_pb2.DeleteTagTemplateFieldRequest(
-            name=name, force=force
-        )
-        actual_request = channel.requests[0][1]
-        assert expected_request == actual_request
-
-    def test_delete_tag_template_field_exception(self):
-        # Mock the API response
-        channel = ChannelStub(responses=[CustomException()])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1.DataCatalogClient()
-
-        # Setup request
-        name = client.tag_template_field_path(
-            "[PROJECT]", "[LOCATION]", "[TAG_TEMPLATE]", "[FIELD]"
-        )
-        force = False
-
-        with pytest.raises(CustomException):
-            client.delete_tag_template_field(name, force)
-
-    def test_create_tag(self):
-        # Setup Expected Response
-        name = "name3373707"
-        template = "template-1321546630"
-        template_display_name = "templateDisplayName-532252787"
-        column = "column-1354837162"
-        expected_response = {
-            "name": name,
-            "template": template,
-            "template_display_name": template_display_name,
-            "column": column,
-        }
-        expected_response = tags_pb2.Tag(**expected_response)
-
-        # Mock the API response
-        channel = ChannelStub(responses=[expected_response])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1.DataCatalogClient()
-
-        # Setup Request
-        parent = client.tag_path(
-            "[PROJECT]", "[LOCATION]", "[ENTRY_GROUP]", "[ENTRY]", "[TAG]"
-        )
-        tag = {}
-
-        response = client.create_tag(parent, tag)
-        assert expected_response == response
-
-        assert len(channel.requests) == 1
-        expected_request = datacatalog_pb2.CreateTagRequest(parent=parent, tag=tag)
-        actual_request = channel.requests[0][1]
-        assert expected_request == actual_request
-
-    def test_create_tag_exception(self):
-        # Mock the API response
-        channel = ChannelStub(responses=[CustomException()])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1.DataCatalogClient()
-
-        # Setup request
-        parent = client.tag_path(
-            "[PROJECT]", "[LOCATION]", "[ENTRY_GROUP]", "[ENTRY]", "[TAG]"
-        )
-        tag = {}
-
-        with pytest.raises(CustomException):
-            client.create_tag(parent, tag)
-
-    def test_update_tag(self):
-        # Setup Expected Response
-        name = "name3373707"
-        template = "template-1321546630"
-        template_display_name = "templateDisplayName-532252787"
-        column = "column-1354837162"
-        expected_response = {
-            "name": name,
-            "template": template,
-            "template_display_name": template_display_name,
-            "column": column,
-        }
-        expected_response = tags_pb2.Tag(**expected_response)
-
-        # Mock the API response
-        channel = ChannelStub(responses=[expected_response])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1.DataCatalogClient()
-
-        # Setup Request
-        tag = {}
-
-        response = client.update_tag(tag)
-        assert expected_response == response
-
-        assert len(channel.requests) == 1
-        expected_request = datacatalog_pb2.UpdateTagRequest(tag=tag)
-        actual_request = channel.requests[0][1]
-        assert expected_request == actual_request
-
-    def test_update_tag_exception(self):
-        # Mock the API response
-        channel = ChannelStub(responses=[CustomException()])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1.DataCatalogClient()
-
-        # Setup request
-        tag = {}
-
-        with pytest.raises(CustomException):
-            client.update_tag(tag)
-
-    def test_delete_tag(self):
-        channel = ChannelStub()
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1.DataCatalogClient()
-
-        # Setup Request
-        name = client.entry_path("[PROJECT]", "[LOCATION]", "[ENTRY_GROUP]", "[ENTRY]")
-
-        client.delete_tag(name)
-
-        assert len(channel.requests) == 1
-        expected_request = datacatalog_pb2.DeleteTagRequest(name=name)
-        actual_request = channel.requests[0][1]
-        assert expected_request == actual_request
-
-    def test_delete_tag_exception(self):
-        # Mock the API response
-        channel = ChannelStub(responses=[CustomException()])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1.DataCatalogClient()
-
-        # Setup request
-        name = client.entry_path("[PROJECT]", "[LOCATION]", "[ENTRY_GROUP]", "[ENTRY]")
-
-        with pytest.raises(CustomException):
-            client.delete_tag(name)
-
-    def test_list_tags(self):
-        # Setup Expected Response
-        next_page_token = ""
-        tags_element = {}
-        tags = [tags_element]
-        expected_response = {"next_page_token": next_page_token, "tags": tags}
-        expected_response = datacatalog_pb2.ListTagsResponse(**expected_response)
-
-        # Mock the API response
-        channel = ChannelStub(responses=[expected_response])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1.DataCatalogClient()
-
-        # Setup Request
-        parent = client.entry_path(
-            "[PROJECT]", "[LOCATION]", "[ENTRY_GROUP]", "[ENTRY]"
-        )
-
-        paged_list_response = client.list_tags(parent)
-        resources = list(paged_list_response)
-        assert len(resources) == 1
-
-        assert expected_response.tags[0] == resources[0]
-
-        assert len(channel.requests) == 1
-        expected_request = datacatalog_pb2.ListTagsRequest(parent=parent)
-        actual_request = channel.requests[0][1]
-        assert expected_request == actual_request
-
-    def test_list_tags_exception(self):
-        channel = ChannelStub(responses=[CustomException()])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1.DataCatalogClient()
-
-        # Setup request
-        parent = client.entry_path(
-            "[PROJECT]", "[LOCATION]", "[ENTRY_GROUP]", "[ENTRY]"
-        )
-
-        paged_list_response = client.list_tags(parent)
-        with pytest.raises(CustomException):
-            list(paged_list_response)
-
-    def test_set_iam_policy(self):
-        # Setup Expected Response
-        version = 351608024
-        etag = b"21"
-        expected_response = {"version": version, "etag": etag}
-        expected_response = policy_pb2.Policy(**expected_response)
-
-        # Mock the API response
-        channel = ChannelStub(responses=[expected_response])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1.DataCatalogClient()
-
-        # Setup Request
-        resource = "resource-341064690"
-        policy = {}
-
-        response = client.set_iam_policy(resource, policy)
-        assert expected_response == response
-
-        assert len(channel.requests) == 1
-        expected_request = iam_policy_pb2.SetIamPolicyRequest(
-            resource=resource, policy=policy
-        )
-        actual_request = channel.requests[0][1]
-        assert expected_request == actual_request
-
-    def test_set_iam_policy_exception(self):
-        # Mock the API response
-        channel = ChannelStub(responses=[CustomException()])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1.DataCatalogClient()
-
-        # Setup request
-        resource = "resource-341064690"
-        policy = {}
-
-        with pytest.raises(CustomException):
-            client.set_iam_policy(resource, policy)
-
-    def test_get_iam_policy(self):
-        # Setup Expected Response
-        version = 351608024
-        etag = b"21"
-        expected_response = {"version": version, "etag": etag}
-        expected_response = policy_pb2.Policy(**expected_response)
-
-        # Mock the API response
-        channel = ChannelStub(responses=[expected_response])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1.DataCatalogClient()
-
-        # Setup Request
-        resource = "resource-341064690"
-
-        response = client.get_iam_policy(resource)
-        assert expected_response == response
-
-        assert len(channel.requests) == 1
-        expected_request = iam_policy_pb2.GetIamPolicyRequest(resource=resource)
-        actual_request = channel.requests[0][1]
-        assert expected_request == actual_request
-
-    def test_get_iam_policy_exception(self):
-        # Mock the API response
-        channel = ChannelStub(responses=[CustomException()])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1.DataCatalogClient()
-
-        # Setup request
-        resource = "resource-341064690"
-
-        with pytest.raises(CustomException):
-            client.get_iam_policy(resource)
-
-    def test_test_iam_permissions(self):
-        # Setup Expected Response
-        expected_response = {}
-        expected_response = iam_policy_pb2.TestIamPermissionsResponse(
-            **expected_response
-        )
-
-        # Mock the API response
-        channel = ChannelStub(responses=[expected_response])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1.DataCatalogClient()
-
-        # Setup Request
-        resource = "resource-341064690"
-        permissions = []
-
-        response = client.test_iam_permissions(resource, permissions)
-        assert expected_response == response
-
-        assert len(channel.requests) == 1
-        expected_request = iam_policy_pb2.TestIamPermissionsRequest(
-            resource=resource, permissions=permissions
-        )
-        actual_request = channel.requests[0][1]
-        assert expected_request == actual_request
-
-    def test_test_iam_permissions_exception(self):
-        # Mock the API response
-        channel = ChannelStub(responses=[CustomException()])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1.DataCatalogClient()
-
-        # Setup request
-        resource = "resource-341064690"
-        permissions = []
-
-        with pytest.raises(CustomException):
-            client.test_iam_permissions(resource, permissions)
diff --git a/tests/unit/gapic/v1beta1/test_data_catalog_client_v1beta1.py b/tests/unit/gapic/v1beta1/test_data_catalog_client_v1beta1.py
deleted file mode 100644
index 9c04c196..00000000
--- a/tests/unit/gapic/v1beta1/test_data_catalog_client_v1beta1.py
+++ /dev/null
@@ -1,1268 +0,0 @@
-# -*- coding: utf-8 -*-
-#
-# Copyright 2020 Google LLC
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-#     https://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-"""Unit tests."""
-
-import mock
-import pytest
-
-from google.cloud import datacatalog_v1beta1
-from google.cloud.datacatalog_v1beta1.proto import datacatalog_pb2
-from google.cloud.datacatalog_v1beta1.proto import search_pb2
-from google.cloud.datacatalog_v1beta1.proto import tags_pb2
-from google.iam.v1 import iam_policy_pb2
-from google.iam.v1 import policy_pb2
-from google.protobuf import empty_pb2
-
-
-class MultiCallableStub(object):
-    """Stub for the grpc.UnaryUnaryMultiCallable interface."""
-
-    def __init__(self, method, channel_stub):
-        self.method = method
-        self.channel_stub = channel_stub
-
-    def __call__(self, request, timeout=None, metadata=None, credentials=None):
-        self.channel_stub.requests.append((self.method, request))
-
-        response = None
-        if self.channel_stub.responses:
-            response = self.channel_stub.responses.pop()
-
-        if isinstance(response, Exception):
-            raise response
-
-        if response:
-            return response
-
-
-class ChannelStub(object):
-    """Stub for the grpc.Channel interface."""
-
-    def __init__(self, responses=[]):
-        self.responses = responses
-        self.requests = []
-
-    def unary_unary(self, method, request_serializer=None, response_deserializer=None):
-        return MultiCallableStub(method, self)
-
-
-class CustomException(Exception):
-    pass
-
-
-class TestDataCatalogClient(object):
-    def test_search_catalog(self):
-        # Setup Expected Response
-        next_page_token = ""
-        results_element = {}
-        results = [results_element]
-        expected_response = {"next_page_token": next_page_token, "results": results}
-        expected_response = datacatalog_pb2.SearchCatalogResponse(**expected_response)
-
-        # Mock the API response
-        channel = ChannelStub(responses=[expected_response])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1beta1.DataCatalogClient()
-
-        # Setup Request
-        scope = {}
-        query = "query107944136"
-
-        paged_list_response = client.search_catalog(scope, query)
-        resources = list(paged_list_response)
-        assert len(resources) == 1
-
-        assert expected_response.results[0] == resources[0]
-
-        assert len(channel.requests) == 1
-        expected_request = datacatalog_pb2.SearchCatalogRequest(
-            scope=scope, query=query
-        )
-        actual_request = channel.requests[0][1]
-        assert expected_request == actual_request
-
-    def test_search_catalog_exception(self):
-        channel = ChannelStub(responses=[CustomException()])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1beta1.DataCatalogClient()
-
-        # Setup request
-        scope = {}
-        query = "query107944136"
-
-        paged_list_response = client.search_catalog(scope, query)
-        with pytest.raises(CustomException):
-            list(paged_list_response)
-
-    def test_get_entry(self):
-        # Setup Expected Response
-        name_2 = "name2-1052831874"
-        linked_resource = "linkedResource1544625012"
-        user_specified_type = "userSpecifiedType-940364963"
-        user_specified_system = "userSpecifiedSystem-1776119406"
-        display_name = "displayName1615086568"
-        description = "description-1724546052"
-        expected_response = {
-            "name": name_2,
-            "linked_resource": linked_resource,
-            "user_specified_type": user_specified_type,
-            "user_specified_system": user_specified_system,
-            "display_name": display_name,
-            "description": description,
-        }
-        expected_response = datacatalog_pb2.Entry(**expected_response)
-
-        # Mock the API response
-        channel = ChannelStub(responses=[expected_response])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1beta1.DataCatalogClient()
-
-        # Setup Request
-        name = client.entry_path("[PROJECT]", "[LOCATION]", "[ENTRY_GROUP]", "[ENTRY]")
-
-        response = client.get_entry(name)
-        assert expected_response == response
-
-        assert len(channel.requests) == 1
-        expected_request = datacatalog_pb2.GetEntryRequest(name=name)
-        actual_request = channel.requests[0][1]
-        assert expected_request == actual_request
-
-    def test_get_entry_exception(self):
-        # Mock the API response
-        channel = ChannelStub(responses=[CustomException()])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1beta1.DataCatalogClient()
-
-        # Setup request
-        name = client.entry_path("[PROJECT]", "[LOCATION]", "[ENTRY_GROUP]", "[ENTRY]")
-
-        with pytest.raises(CustomException):
-            client.get_entry(name)
-
-    def test_lookup_entry(self):
-        # Setup Expected Response
-        name = "name3373707"
-        linked_resource = "linkedResource1544625012"
-        user_specified_type = "userSpecifiedType-940364963"
-        user_specified_system = "userSpecifiedSystem-1776119406"
-        display_name = "displayName1615086568"
-        description = "description-1724546052"
-        expected_response = {
-            "name": name,
-            "linked_resource": linked_resource,
-            "user_specified_type": user_specified_type,
-            "user_specified_system": user_specified_system,
-            "display_name": display_name,
-            "description": description,
-        }
-        expected_response = datacatalog_pb2.Entry(**expected_response)
-
-        # Mock the API response
-        channel = ChannelStub(responses=[expected_response])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1beta1.DataCatalogClient()
-
-        response = client.lookup_entry()
-        assert expected_response == response
-
-        assert len(channel.requests) == 1
-        expected_request = datacatalog_pb2.LookupEntryRequest()
-        actual_request = channel.requests[0][1]
-        assert expected_request == actual_request
-
-    def test_lookup_entry_exception(self):
-        # Mock the API response
-        channel = ChannelStub(responses=[CustomException()])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1beta1.DataCatalogClient()
-
-        with pytest.raises(CustomException):
-            client.lookup_entry()
-
-    def test_create_entry_group(self):
-        # Setup Expected Response
-        name = "name3373707"
-        display_name = "displayName1615086568"
-        description = "description-1724546052"
-        expected_response = {
-            "name": name,
-            "display_name": display_name,
-            "description": description,
-        }
-        expected_response = datacatalog_pb2.EntryGroup(**expected_response)
-
-        # Mock the API response
-        channel = ChannelStub(responses=[expected_response])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1beta1.DataCatalogClient()
-
-        # Setup Request
-        parent = client.location_path("[PROJECT]", "[LOCATION]")
-        entry_group_id = "entryGroupId-43122680"
-
-        response = client.create_entry_group(parent, entry_group_id)
-        assert expected_response == response
-
-        assert len(channel.requests) == 1
-        expected_request = datacatalog_pb2.CreateEntryGroupRequest(
-            parent=parent, entry_group_id=entry_group_id
-        )
-        actual_request = channel.requests[0][1]
-        assert expected_request == actual_request
-
-    def test_create_entry_group_exception(self):
-        # Mock the API response
-        channel = ChannelStub(responses=[CustomException()])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1beta1.DataCatalogClient()
-
-        # Setup request
-        parent = client.location_path("[PROJECT]", "[LOCATION]")
-        entry_group_id = "entryGroupId-43122680"
-
-        with pytest.raises(CustomException):
-            client.create_entry_group(parent, entry_group_id)
-
-    def test_update_entry_group(self):
-        # Setup Expected Response
-        name = "name3373707"
-        display_name = "displayName1615086568"
-        description = "description-1724546052"
-        expected_response = {
-            "name": name,
-            "display_name": display_name,
-            "description": description,
-        }
-        expected_response = datacatalog_pb2.EntryGroup(**expected_response)
-
-        # Mock the API response
-        channel = ChannelStub(responses=[expected_response])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1beta1.DataCatalogClient()
-
-        # Setup Request
-        entry_group = {}
-
-        response = client.update_entry_group(entry_group)
-        assert expected_response == response
-
-        assert len(channel.requests) == 1
-        expected_request = datacatalog_pb2.UpdateEntryGroupRequest(
-            entry_group=entry_group
-        )
-        actual_request = channel.requests[0][1]
-        assert expected_request == actual_request
-
-    def test_update_entry_group_exception(self):
-        # Mock the API response
-        channel = ChannelStub(responses=[CustomException()])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1beta1.DataCatalogClient()
-
-        # Setup request
-        entry_group = {}
-
-        with pytest.raises(CustomException):
-            client.update_entry_group(entry_group)
-
-    def test_get_entry_group(self):
-        # Setup Expected Response
-        name_2 = "name2-1052831874"
-        display_name = "displayName1615086568"
-        description = "description-1724546052"
-        expected_response = {
-            "name": name_2,
-            "display_name": display_name,
-            "description": description,
-        }
-        expected_response = datacatalog_pb2.EntryGroup(**expected_response)
-
-        # Mock the API response
-        channel = ChannelStub(responses=[expected_response])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1beta1.DataCatalogClient()
-
-        # Setup Request
-        name = client.entry_group_path("[PROJECT]", "[LOCATION]", "[ENTRY_GROUP]")
-
-        response = client.get_entry_group(name)
-        assert expected_response == response
-
-        assert len(channel.requests) == 1
-        expected_request = datacatalog_pb2.GetEntryGroupRequest(name=name)
-        actual_request = channel.requests[0][1]
-        assert expected_request == actual_request
-
-    def test_get_entry_group_exception(self):
-        # Mock the API response
-        channel = ChannelStub(responses=[CustomException()])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1beta1.DataCatalogClient()
-
-        # Setup request
-        name = client.entry_group_path("[PROJECT]", "[LOCATION]", "[ENTRY_GROUP]")
-
-        with pytest.raises(CustomException):
-            client.get_entry_group(name)
-
-    def test_delete_entry_group(self):
-        channel = ChannelStub()
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1beta1.DataCatalogClient()
-
-        # Setup Request
-        name = client.entry_group_path("[PROJECT]", "[LOCATION]", "[ENTRY_GROUP]")
-
-        client.delete_entry_group(name)
-
-        assert len(channel.requests) == 1
-        expected_request = datacatalog_pb2.DeleteEntryGroupRequest(name=name)
-        actual_request = channel.requests[0][1]
-        assert expected_request == actual_request
-
-    def test_delete_entry_group_exception(self):
-        # Mock the API response
-        channel = ChannelStub(responses=[CustomException()])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1beta1.DataCatalogClient()
-
-        # Setup request
-        name = client.entry_group_path("[PROJECT]", "[LOCATION]", "[ENTRY_GROUP]")
-
-        with pytest.raises(CustomException):
-            client.delete_entry_group(name)
-
-    def test_list_entry_groups(self):
-        # Setup Expected Response
-        next_page_token = ""
-        entry_groups_element = {}
-        entry_groups = [entry_groups_element]
-        expected_response = {
-            "next_page_token": next_page_token,
-            "entry_groups": entry_groups,
-        }
-        expected_response = datacatalog_pb2.ListEntryGroupsResponse(**expected_response)
-
-        # Mock the API response
-        channel = ChannelStub(responses=[expected_response])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1beta1.DataCatalogClient()
-
-        # Setup Request
-        parent = client.entry_group_path("[PROJECT]", "[LOCATION]", "[ENTRY_GROUP]")
-
-        paged_list_response = client.list_entry_groups(parent)
-        resources = list(paged_list_response)
-        assert len(resources) == 1
-
-        assert expected_response.entry_groups[0] == resources[0]
-
-        assert len(channel.requests) == 1
-        expected_request = datacatalog_pb2.ListEntryGroupsRequest(parent=parent)
-        actual_request = channel.requests[0][1]
-        assert expected_request == actual_request
-
-    def test_list_entry_groups_exception(self):
-        channel = ChannelStub(responses=[CustomException()])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1beta1.DataCatalogClient()
-
-        # Setup request
-        parent = client.entry_group_path("[PROJECT]", "[LOCATION]", "[ENTRY_GROUP]")
-
-        paged_list_response = client.list_entry_groups(parent)
-        with pytest.raises(CustomException):
-            list(paged_list_response)
-
-    def test_create_entry(self):
-        # Setup Expected Response
-        name = "name3373707"
-        linked_resource = "linkedResource1544625012"
-        user_specified_type = "userSpecifiedType-940364963"
-        user_specified_system = "userSpecifiedSystem-1776119406"
-        display_name = "displayName1615086568"
-        description = "description-1724546052"
-        expected_response = {
-            "name": name,
-            "linked_resource": linked_resource,
-            "user_specified_type": user_specified_type,
-            "user_specified_system": user_specified_system,
-            "display_name": display_name,
-            "description": description,
-        }
-        expected_response = datacatalog_pb2.Entry(**expected_response)
-
-        # Mock the API response
-        channel = ChannelStub(responses=[expected_response])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1beta1.DataCatalogClient()
-
-        # Setup Request
-        parent = client.entry_group_path("[PROJECT]", "[LOCATION]", "[ENTRY_GROUP]")
-        entry_id = "entryId-2093663224"
-        entry = {}
-
-        response = client.create_entry(parent, entry_id, entry)
-        assert expected_response == response
-
-        assert len(channel.requests) == 1
-        expected_request = datacatalog_pb2.CreateEntryRequest(
-            parent=parent, entry_id=entry_id, entry=entry
-        )
-        actual_request = channel.requests[0][1]
-        assert expected_request == actual_request
-
-    def test_create_entry_exception(self):
-        # Mock the API response
-        channel = ChannelStub(responses=[CustomException()])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1beta1.DataCatalogClient()
-
-        # Setup request
-        parent = client.entry_group_path("[PROJECT]", "[LOCATION]", "[ENTRY_GROUP]")
-        entry_id = "entryId-2093663224"
-        entry = {}
-
-        with pytest.raises(CustomException):
-            client.create_entry(parent, entry_id, entry)
-
-    def test_update_entry(self):
-        # Setup Expected Response
-        name = "name3373707"
-        linked_resource = "linkedResource1544625012"
-        user_specified_type = "userSpecifiedType-940364963"
-        user_specified_system =
"userSpecifiedSystem-1776119406" - display_name = "displayName1615086568" - description = "description-1724546052" - expected_response = { - "name": name, - "linked_resource": linked_resource, - "user_specified_type": user_specified_type, - "user_specified_system": user_specified_system, - "display_name": display_name, - "description": description, - } - expected_response = datacatalog_pb2.Entry(**expected_response) - - # Mock the API response - channel = ChannelStub(responses=[expected_response]) - patch = mock.patch("google.api_core.grpc_helpers.create_channel") - with patch as create_channel: - create_channel.return_value = channel - client = datacatalog_v1beta1.DataCatalogClient() - - # Setup Request - entry = {} - - response = client.update_entry(entry) - assert expected_response == response - - assert len(channel.requests) == 1 - expected_request = datacatalog_pb2.UpdateEntryRequest(entry=entry) - actual_request = channel.requests[0][1] - assert expected_request == actual_request - - def test_update_entry_exception(self): - # Mock the API response - channel = ChannelStub(responses=[CustomException()]) - patch = mock.patch("google.api_core.grpc_helpers.create_channel") - with patch as create_channel: - create_channel.return_value = channel - client = datacatalog_v1beta1.DataCatalogClient() - - # Setup request - entry = {} - - with pytest.raises(CustomException): - client.update_entry(entry) - - def test_delete_entry(self): - channel = ChannelStub() - patch = mock.patch("google.api_core.grpc_helpers.create_channel") - with patch as create_channel: - create_channel.return_value = channel - client = datacatalog_v1beta1.DataCatalogClient() - - # Setup Request - name = client.entry_path("[PROJECT]", "[LOCATION]", "[ENTRY_GROUP]", "[ENTRY]") - - client.delete_entry(name) - - assert len(channel.requests) == 1 - expected_request = datacatalog_pb2.DeleteEntryRequest(name=name) - actual_request = channel.requests[0][1] - assert expected_request == actual_request - - def 
test_delete_entry_exception(self): - # Mock the API response - channel = ChannelStub(responses=[CustomException()]) - patch = mock.patch("google.api_core.grpc_helpers.create_channel") - with patch as create_channel: - create_channel.return_value = channel - client = datacatalog_v1beta1.DataCatalogClient() - - # Setup request - name = client.entry_path("[PROJECT]", "[LOCATION]", "[ENTRY_GROUP]", "[ENTRY]") - - with pytest.raises(CustomException): - client.delete_entry(name) - - def test_list_entries(self): - # Setup Expected Response - next_page_token = "" - entries_element = {} - entries = [entries_element] - expected_response = {"next_page_token": next_page_token, "entries": entries} - expected_response = datacatalog_pb2.ListEntriesResponse(**expected_response) - - # Mock the API response - channel = ChannelStub(responses=[expected_response]) - patch = mock.patch("google.api_core.grpc_helpers.create_channel") - with patch as create_channel: - create_channel.return_value = channel - client = datacatalog_v1beta1.DataCatalogClient() - - # Setup Request - parent = client.entry_group_path("[PROJECT]", "[LOCATION]", "[ENTRY_GROUP]") - - paged_list_response = client.list_entries(parent) - resources = list(paged_list_response) - assert len(resources) == 1 - - assert expected_response.entries[0] == resources[0] - - assert len(channel.requests) == 1 - expected_request = datacatalog_pb2.ListEntriesRequest(parent=parent) - actual_request = channel.requests[0][1] - assert expected_request == actual_request - - def test_list_entries_exception(self): - channel = ChannelStub(responses=[CustomException()]) - patch = mock.patch("google.api_core.grpc_helpers.create_channel") - with patch as create_channel: - create_channel.return_value = channel - client = datacatalog_v1beta1.DataCatalogClient() - - # Setup request - parent = client.entry_group_path("[PROJECT]", "[LOCATION]", "[ENTRY_GROUP]") - - paged_list_response = client.list_entries(parent) - with 
pytest.raises(CustomException): - list(paged_list_response) - - def test_create_tag_template(self): - # Setup Expected Response - name = "name3373707" - display_name = "displayName1615086568" - expected_response = {"name": name, "display_name": display_name} - expected_response = tags_pb2.TagTemplate(**expected_response) - - # Mock the API response - channel = ChannelStub(responses=[expected_response]) - patch = mock.patch("google.api_core.grpc_helpers.create_channel") - with patch as create_channel: - create_channel.return_value = channel - client = datacatalog_v1beta1.DataCatalogClient() - - # Setup Request - parent = client.location_path("[PROJECT]", "[LOCATION]") - tag_template_id = "tagTemplateId-2020335141" - tag_template = {} - - response = client.create_tag_template(parent, tag_template_id, tag_template) - assert expected_response == response - - assert len(channel.requests) == 1 - expected_request = datacatalog_pb2.CreateTagTemplateRequest( - parent=parent, tag_template_id=tag_template_id, tag_template=tag_template - ) - actual_request = channel.requests[0][1] - assert expected_request == actual_request - - def test_create_tag_template_exception(self): - # Mock the API response - channel = ChannelStub(responses=[CustomException()]) - patch = mock.patch("google.api_core.grpc_helpers.create_channel") - with patch as create_channel: - create_channel.return_value = channel - client = datacatalog_v1beta1.DataCatalogClient() - - # Setup request - parent = client.location_path("[PROJECT]", "[LOCATION]") - tag_template_id = "tagTemplateId-2020335141" - tag_template = {} - - with pytest.raises(CustomException): - client.create_tag_template(parent, tag_template_id, tag_template) - - def test_get_tag_template(self): - # Setup Expected Response - name_2 = "name2-1052831874" - display_name = "displayName1615086568" - expected_response = {"name": name_2, "display_name": display_name} - expected_response = tags_pb2.TagTemplate(**expected_response) - - # Mock the API 
response - channel = ChannelStub(responses=[expected_response]) - patch = mock.patch("google.api_core.grpc_helpers.create_channel") - with patch as create_channel: - create_channel.return_value = channel - client = datacatalog_v1beta1.DataCatalogClient() - - # Setup Request - name = client.tag_template_path("[PROJECT]", "[LOCATION]", "[TAG_TEMPLATE]") - - response = client.get_tag_template(name) - assert expected_response == response - - assert len(channel.requests) == 1 - expected_request = datacatalog_pb2.GetTagTemplateRequest(name=name) - actual_request = channel.requests[0][1] - assert expected_request == actual_request - - def test_get_tag_template_exception(self): - # Mock the API response - channel = ChannelStub(responses=[CustomException()]) - patch = mock.patch("google.api_core.grpc_helpers.create_channel") - with patch as create_channel: - create_channel.return_value = channel - client = datacatalog_v1beta1.DataCatalogClient() - - # Setup request - name = client.tag_template_path("[PROJECT]", "[LOCATION]", "[TAG_TEMPLATE]") - - with pytest.raises(CustomException): - client.get_tag_template(name) - - def test_update_tag_template(self): - # Setup Expected Response - name = "name3373707" - display_name = "displayName1615086568" - expected_response = {"name": name, "display_name": display_name} - expected_response = tags_pb2.TagTemplate(**expected_response) - - # Mock the API response - channel = ChannelStub(responses=[expected_response]) - patch = mock.patch("google.api_core.grpc_helpers.create_channel") - with patch as create_channel: - create_channel.return_value = channel - client = datacatalog_v1beta1.DataCatalogClient() - - # Setup Request - tag_template = {} - - response = client.update_tag_template(tag_template) - assert expected_response == response - - assert len(channel.requests) == 1 - expected_request = datacatalog_pb2.UpdateTagTemplateRequest( - tag_template=tag_template - ) - actual_request = channel.requests[0][1] - assert expected_request == 
actual_request - - def test_update_tag_template_exception(self): - # Mock the API response - channel = ChannelStub(responses=[CustomException()]) - patch = mock.patch("google.api_core.grpc_helpers.create_channel") - with patch as create_channel: - create_channel.return_value = channel - client = datacatalog_v1beta1.DataCatalogClient() - - # Setup request - tag_template = {} - - with pytest.raises(CustomException): - client.update_tag_template(tag_template) - - def test_delete_tag_template(self): - channel = ChannelStub() - patch = mock.patch("google.api_core.grpc_helpers.create_channel") - with patch as create_channel: - create_channel.return_value = channel - client = datacatalog_v1beta1.DataCatalogClient() - - # Setup Request - name = client.tag_template_path("[PROJECT]", "[LOCATION]", "[TAG_TEMPLATE]") - force = False - - client.delete_tag_template(name, force) - - assert len(channel.requests) == 1 - expected_request = datacatalog_pb2.DeleteTagTemplateRequest( - name=name, force=force - ) - actual_request = channel.requests[0][1] - assert expected_request == actual_request - - def test_delete_tag_template_exception(self): - # Mock the API response - channel = ChannelStub(responses=[CustomException()]) - patch = mock.patch("google.api_core.grpc_helpers.create_channel") - with patch as create_channel: - create_channel.return_value = channel - client = datacatalog_v1beta1.DataCatalogClient() - - # Setup request - name = client.tag_template_path("[PROJECT]", "[LOCATION]", "[TAG_TEMPLATE]") - force = False - - with pytest.raises(CustomException): - client.delete_tag_template(name, force) - - def test_create_tag_template_field(self): - # Setup Expected Response - name = "name3373707" - display_name = "displayName1615086568" - is_required = True - order = 106006350 - expected_response = { - "name": name, - "display_name": display_name, - "is_required": is_required, - "order": order, - } - expected_response = tags_pb2.TagTemplateField(**expected_response) - - # Mock the 
API response - channel = ChannelStub(responses=[expected_response]) - patch = mock.patch("google.api_core.grpc_helpers.create_channel") - with patch as create_channel: - create_channel.return_value = channel - client = datacatalog_v1beta1.DataCatalogClient() - - # Setup Request - parent = client.tag_template_path("[PROJECT]", "[LOCATION]", "[TAG_TEMPLATE]") - tag_template_field_id = "tagTemplateFieldId-92144832" - tag_template_field = {} - - response = client.create_tag_template_field( - parent, tag_template_field_id, tag_template_field - ) - assert expected_response == response - - assert len(channel.requests) == 1 - expected_request = datacatalog_pb2.CreateTagTemplateFieldRequest( - parent=parent, - tag_template_field_id=tag_template_field_id, - tag_template_field=tag_template_field, - ) - actual_request = channel.requests[0][1] - assert expected_request == actual_request - - def test_create_tag_template_field_exception(self): - # Mock the API response - channel = ChannelStub(responses=[CustomException()]) - patch = mock.patch("google.api_core.grpc_helpers.create_channel") - with patch as create_channel: - create_channel.return_value = channel - client = datacatalog_v1beta1.DataCatalogClient() - - # Setup request - parent = client.tag_template_path("[PROJECT]", "[LOCATION]", "[TAG_TEMPLATE]") - tag_template_field_id = "tagTemplateFieldId-92144832" - tag_template_field = {} - - with pytest.raises(CustomException): - client.create_tag_template_field( - parent, tag_template_field_id, tag_template_field - ) - - def test_update_tag_template_field(self): - # Setup Expected Response - name_2 = "name2-1052831874" - display_name = "displayName1615086568" - is_required = True - order = 106006350 - expected_response = { - "name": name_2, - "display_name": display_name, - "is_required": is_required, - "order": order, - } - expected_response = tags_pb2.TagTemplateField(**expected_response) - - # Mock the API response - channel = ChannelStub(responses=[expected_response]) - 
patch = mock.patch("google.api_core.grpc_helpers.create_channel") - with patch as create_channel: - create_channel.return_value = channel - client = datacatalog_v1beta1.DataCatalogClient() - - # Setup Request - name = client.tag_template_field_path( - "[PROJECT]", "[LOCATION]", "[TAG_TEMPLATE]", "[FIELD]" - ) - tag_template_field = {} - - response = client.update_tag_template_field(name, tag_template_field) - assert expected_response == response - - assert len(channel.requests) == 1 - expected_request = datacatalog_pb2.UpdateTagTemplateFieldRequest( - name=name, tag_template_field=tag_template_field - ) - actual_request = channel.requests[0][1] - assert expected_request == actual_request - - def test_update_tag_template_field_exception(self): - # Mock the API response - channel = ChannelStub(responses=[CustomException()]) - patch = mock.patch("google.api_core.grpc_helpers.create_channel") - with patch as create_channel: - create_channel.return_value = channel - client = datacatalog_v1beta1.DataCatalogClient() - - # Setup request - name = client.tag_template_field_path( - "[PROJECT]", "[LOCATION]", "[TAG_TEMPLATE]", "[FIELD]" - ) - tag_template_field = {} - - with pytest.raises(CustomException): - client.update_tag_template_field(name, tag_template_field) - - def test_rename_tag_template_field(self): - # Setup Expected Response - name_2 = "name2-1052831874" - display_name = "displayName1615086568" - is_required = True - order = 106006350 - expected_response = { - "name": name_2, - "display_name": display_name, - "is_required": is_required, - "order": order, - } - expected_response = tags_pb2.TagTemplateField(**expected_response) - - # Mock the API response - channel = ChannelStub(responses=[expected_response]) - patch = mock.patch("google.api_core.grpc_helpers.create_channel") - with patch as create_channel: - create_channel.return_value = channel - client = datacatalog_v1beta1.DataCatalogClient() - - # Setup Request - name = client.tag_template_field_path( - 
"[PROJECT]", "[LOCATION]", "[TAG_TEMPLATE]", "[FIELD]" - ) - new_tag_template_field_id = "newTagTemplateFieldId-1668354591" - - response = client.rename_tag_template_field(name, new_tag_template_field_id) - assert expected_response == response - - assert len(channel.requests) == 1 - expected_request = datacatalog_pb2.RenameTagTemplateFieldRequest( - name=name, new_tag_template_field_id=new_tag_template_field_id - ) - actual_request = channel.requests[0][1] - assert expected_request == actual_request - - def test_rename_tag_template_field_exception(self): - # Mock the API response - channel = ChannelStub(responses=[CustomException()]) - patch = mock.patch("google.api_core.grpc_helpers.create_channel") - with patch as create_channel: - create_channel.return_value = channel - client = datacatalog_v1beta1.DataCatalogClient() - - # Setup request - name = client.tag_template_field_path( - "[PROJECT]", "[LOCATION]", "[TAG_TEMPLATE]", "[FIELD]" - ) - new_tag_template_field_id = "newTagTemplateFieldId-1668354591" - - with pytest.raises(CustomException): - client.rename_tag_template_field(name, new_tag_template_field_id) - - def test_delete_tag_template_field(self): - channel = ChannelStub() - patch = mock.patch("google.api_core.grpc_helpers.create_channel") - with patch as create_channel: - create_channel.return_value = channel - client = datacatalog_v1beta1.DataCatalogClient() - - # Setup Request - name = client.tag_template_field_path( - "[PROJECT]", "[LOCATION]", "[TAG_TEMPLATE]", "[FIELD]" - ) - force = False - - client.delete_tag_template_field(name, force) - - assert len(channel.requests) == 1 - expected_request = datacatalog_pb2.DeleteTagTemplateFieldRequest( - name=name, force=force - ) - actual_request = channel.requests[0][1] - assert expected_request == actual_request - - def test_delete_tag_template_field_exception(self): - # Mock the API response - channel = ChannelStub(responses=[CustomException()]) - patch = 
mock.patch("google.api_core.grpc_helpers.create_channel") - with patch as create_channel: - create_channel.return_value = channel - client = datacatalog_v1beta1.DataCatalogClient() - - # Setup request - name = client.tag_template_field_path( - "[PROJECT]", "[LOCATION]", "[TAG_TEMPLATE]", "[FIELD]" - ) - force = False - - with pytest.raises(CustomException): - client.delete_tag_template_field(name, force) - - def test_create_tag(self): - # Setup Expected Response - name = "name3373707" - template = "template-1321546630" - template_display_name = "templateDisplayName-532252787" - column = "column-1354837162" - expected_response = { - "name": name, - "template": template, - "template_display_name": template_display_name, - "column": column, - } - expected_response = tags_pb2.Tag(**expected_response) - - # Mock the API response - channel = ChannelStub(responses=[expected_response]) - patch = mock.patch("google.api_core.grpc_helpers.create_channel") - with patch as create_channel: - create_channel.return_value = channel - client = datacatalog_v1beta1.DataCatalogClient() - - # Setup Request - parent = client.tag_path( - "[PROJECT]", "[LOCATION]", "[ENTRY_GROUP]", "[ENTRY]", "[TAG]" - ) - tag = {} - - response = client.create_tag(parent, tag) - assert expected_response == response - - assert len(channel.requests) == 1 - expected_request = datacatalog_pb2.CreateTagRequest(parent=parent, tag=tag) - actual_request = channel.requests[0][1] - assert expected_request == actual_request - - def test_create_tag_exception(self): - # Mock the API response - channel = ChannelStub(responses=[CustomException()]) - patch = mock.patch("google.api_core.grpc_helpers.create_channel") - with patch as create_channel: - create_channel.return_value = channel - client = datacatalog_v1beta1.DataCatalogClient() - - # Setup request - parent = client.tag_path( - "[PROJECT]", "[LOCATION]", "[ENTRY_GROUP]", "[ENTRY]", "[TAG]" - ) - tag = {} - - with pytest.raises(CustomException): - 
client.create_tag(parent, tag) - - def test_update_tag(self): - # Setup Expected Response - name = "name3373707" - template = "template-1321546630" - template_display_name = "templateDisplayName-532252787" - column = "column-1354837162" - expected_response = { - "name": name, - "template": template, - "template_display_name": template_display_name, - "column": column, - } - expected_response = tags_pb2.Tag(**expected_response) - - # Mock the API response - channel = ChannelStub(responses=[expected_response]) - patch = mock.patch("google.api_core.grpc_helpers.create_channel") - with patch as create_channel: - create_channel.return_value = channel - client = datacatalog_v1beta1.DataCatalogClient() - - # Setup Request - tag = {} - - response = client.update_tag(tag) - assert expected_response == response - - assert len(channel.requests) == 1 - expected_request = datacatalog_pb2.UpdateTagRequest(tag=tag) - actual_request = channel.requests[0][1] - assert expected_request == actual_request - - def test_update_tag_exception(self): - # Mock the API response - channel = ChannelStub(responses=[CustomException()]) - patch = mock.patch("google.api_core.grpc_helpers.create_channel") - with patch as create_channel: - create_channel.return_value = channel - client = datacatalog_v1beta1.DataCatalogClient() - - # Setup request - tag = {} - - with pytest.raises(CustomException): - client.update_tag(tag) - - def test_delete_tag(self): - channel = ChannelStub() - patch = mock.patch("google.api_core.grpc_helpers.create_channel") - with patch as create_channel: - create_channel.return_value = channel - client = datacatalog_v1beta1.DataCatalogClient() - - # Setup Request - name = client.entry_path("[PROJECT]", "[LOCATION]", "[ENTRY_GROUP]", "[ENTRY]") - - client.delete_tag(name) - - assert len(channel.requests) == 1 - expected_request = datacatalog_pb2.DeleteTagRequest(name=name) - actual_request = channel.requests[0][1] - assert expected_request == actual_request - - def 
test_delete_tag_exception(self): - # Mock the API response - channel = ChannelStub(responses=[CustomException()]) - patch = mock.patch("google.api_core.grpc_helpers.create_channel") - with patch as create_channel: - create_channel.return_value = channel - client = datacatalog_v1beta1.DataCatalogClient() - - # Setup request - name = client.entry_path("[PROJECT]", "[LOCATION]", "[ENTRY_GROUP]", "[ENTRY]") - - with pytest.raises(CustomException): - client.delete_tag(name) - - def test_list_tags(self): - # Setup Expected Response - next_page_token = "" - tags_element = {} - tags = [tags_element] - expected_response = {"next_page_token": next_page_token, "tags": tags} - expected_response = datacatalog_pb2.ListTagsResponse(**expected_response) - - # Mock the API response - channel = ChannelStub(responses=[expected_response]) - patch = mock.patch("google.api_core.grpc_helpers.create_channel") - with patch as create_channel: - create_channel.return_value = channel - client = datacatalog_v1beta1.DataCatalogClient() - - # Setup Request - parent = client.entry_path( - "[PROJECT]", "[LOCATION]", "[ENTRY_GROUP]", "[ENTRY]" - ) - - paged_list_response = client.list_tags(parent) - resources = list(paged_list_response) - assert len(resources) == 1 - - assert expected_response.tags[0] == resources[0] - - assert len(channel.requests) == 1 - expected_request = datacatalog_pb2.ListTagsRequest(parent=parent) - actual_request = channel.requests[0][1] - assert expected_request == actual_request - - def test_list_tags_exception(self): - channel = ChannelStub(responses=[CustomException()]) - patch = mock.patch("google.api_core.grpc_helpers.create_channel") - with patch as create_channel: - create_channel.return_value = channel - client = datacatalog_v1beta1.DataCatalogClient() - - # Setup request - parent = client.entry_path( - "[PROJECT]", "[LOCATION]", "[ENTRY_GROUP]", "[ENTRY]" - ) - - paged_list_response = client.list_tags(parent) - with pytest.raises(CustomException): - 
list(paged_list_response) - - def test_set_iam_policy(self): - # Setup Expected Response - version = 351608024 - etag = b"21" - expected_response = {"version": version, "etag": etag} - expected_response = policy_pb2.Policy(**expected_response) - - # Mock the API response - channel = ChannelStub(responses=[expected_response]) - patch = mock.patch("google.api_core.grpc_helpers.create_channel") - with patch as create_channel: - create_channel.return_value = channel - client = datacatalog_v1beta1.DataCatalogClient() - - # Setup Request - resource = "resource-341064690" - policy = {} - - response = client.set_iam_policy(resource, policy) - assert expected_response == response - - assert len(channel.requests) == 1 - expected_request = iam_policy_pb2.SetIamPolicyRequest( - resource=resource, policy=policy - ) - actual_request = channel.requests[0][1] - assert expected_request == actual_request - - def test_set_iam_policy_exception(self): - # Mock the API response - channel = ChannelStub(responses=[CustomException()]) - patch = mock.patch("google.api_core.grpc_helpers.create_channel") - with patch as create_channel: - create_channel.return_value = channel - client = datacatalog_v1beta1.DataCatalogClient() - - # Setup request - resource = "resource-341064690" - policy = {} - - with pytest.raises(CustomException): - client.set_iam_policy(resource, policy) - - def test_get_iam_policy(self): - # Setup Expected Response - version = 351608024 - etag = b"21" - expected_response = {"version": version, "etag": etag} - expected_response = policy_pb2.Policy(**expected_response) - - # Mock the API response - channel = ChannelStub(responses=[expected_response]) - patch = mock.patch("google.api_core.grpc_helpers.create_channel") - with patch as create_channel: - create_channel.return_value = channel - client = datacatalog_v1beta1.DataCatalogClient() - - # Setup Request - resource = "resource-341064690" - - response = client.get_iam_policy(resource) - assert expected_response == response 
- - assert len(channel.requests) == 1 - expected_request = iam_policy_pb2.GetIamPolicyRequest(resource=resource) - actual_request = channel.requests[0][1] - assert expected_request == actual_request - - def test_get_iam_policy_exception(self): - # Mock the API response - channel = ChannelStub(responses=[CustomException()]) - patch = mock.patch("google.api_core.grpc_helpers.create_channel") - with patch as create_channel: - create_channel.return_value = channel - client = datacatalog_v1beta1.DataCatalogClient() - - # Setup request - resource = "resource-341064690" - - with pytest.raises(CustomException): - client.get_iam_policy(resource) - - def test_test_iam_permissions(self): - # Setup Expected Response - expected_response = {} - expected_response = iam_policy_pb2.TestIamPermissionsResponse( - **expected_response - ) - - # Mock the API response - channel = ChannelStub(responses=[expected_response]) - patch = mock.patch("google.api_core.grpc_helpers.create_channel") - with patch as create_channel: - create_channel.return_value = channel - client = datacatalog_v1beta1.DataCatalogClient() - - # Setup Request - resource = "resource-341064690" - permissions = [] - - response = client.test_iam_permissions(resource, permissions) - assert expected_response == response - - assert len(channel.requests) == 1 - expected_request = iam_policy_pb2.TestIamPermissionsRequest( - resource=resource, permissions=permissions - ) - actual_request = channel.requests[0][1] - assert expected_request == actual_request - - def test_test_iam_permissions_exception(self): - # Mock the API response - channel = ChannelStub(responses=[CustomException()]) - patch = mock.patch("google.api_core.grpc_helpers.create_channel") - with patch as create_channel: - create_channel.return_value = channel - client = datacatalog_v1beta1.DataCatalogClient() - - # Setup request - resource = "resource-341064690" - permissions = [] - - with pytest.raises(CustomException): - client.test_iam_permissions(resource, 
permissions)
diff --git a/tests/unit/gapic/v1beta1/test_policy_tag_manager_client_v1beta1.py b/tests/unit/gapic/v1beta1/test_policy_tag_manager_client_v1beta1.py
deleted file mode 100644
index 21a3e5b4..00000000
--- a/tests/unit/gapic/v1beta1/test_policy_tag_manager_client_v1beta1.py
+++ /dev/null
@@ -1,613 +0,0 @@
-# -*- coding: utf-8 -*-
-#
-# Copyright 2020 Google LLC
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-#     https://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-"""Unit tests."""
-
-import mock
-import pytest
-
-from google.cloud import datacatalog_v1beta1
-from google.cloud.datacatalog_v1beta1.proto import policytagmanager_pb2
-from google.iam.v1 import iam_policy_pb2
-from google.iam.v1 import policy_pb2
-from google.protobuf import empty_pb2
-
-
-class MultiCallableStub(object):
-    """Stub for the grpc.UnaryUnaryMultiCallable interface."""
-
-    def __init__(self, method, channel_stub):
-        self.method = method
-        self.channel_stub = channel_stub
-
-    def __call__(self, request, timeout=None, metadata=None, credentials=None):
-        self.channel_stub.requests.append((self.method, request))
-
-        response = None
-        if self.channel_stub.responses:
-            response = self.channel_stub.responses.pop()
-
-        if isinstance(response, Exception):
-            raise response
-
-        if response:
-            return response
-
-
-class ChannelStub(object):
-    """Stub for the grpc.Channel interface."""
-
-    def __init__(self, responses=[]):
-        self.responses = responses
-        self.requests = []
-
-    def unary_unary(self, method, request_serializer=None, response_deserializer=None):
-        return MultiCallableStub(method, self)
-
-
-class CustomException(Exception):
-    pass
-
-
-class TestPolicyTagManagerClient(object):
-    def test_create_taxonomy(self):
-        # Setup Expected Response
-        name = "name3373707"
-        display_name = "displayName1615086568"
-        description = "description-1724546052"
-        expected_response = {
-            "name": name,
-            "display_name": display_name,
-            "description": description,
-        }
-        expected_response = policytagmanager_pb2.Taxonomy(**expected_response)
-
-        # Mock the API response
-        channel = ChannelStub(responses=[expected_response])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1beta1.PolicyTagManagerClient()
-
-        # Setup Request
-        parent = client.location_path("[PROJECT]", "[LOCATION]")
-
-        response = client.create_taxonomy(parent)
-        assert expected_response == response
-
-        assert len(channel.requests) == 1
-        expected_request = policytagmanager_pb2.CreateTaxonomyRequest(parent=parent)
-        actual_request = channel.requests[0][1]
-        assert expected_request == actual_request
-
-    def test_create_taxonomy_exception(self):
-        # Mock the API response
-        channel = ChannelStub(responses=[CustomException()])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1beta1.PolicyTagManagerClient()
-
-        # Setup request
-        parent = client.location_path("[PROJECT]", "[LOCATION]")
-
-        with pytest.raises(CustomException):
-            client.create_taxonomy(parent)
-
-    def test_delete_taxonomy(self):
-        channel = ChannelStub()
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1beta1.PolicyTagManagerClient()
-
-        # Setup Request
-        name = client.taxonomy_path("[PROJECT]", "[LOCATION]", "[TAXONOMY]")
-
-        client.delete_taxonomy(name)
-
-        assert len(channel.requests) == 1
-        expected_request = policytagmanager_pb2.DeleteTaxonomyRequest(name=name)
-        actual_request = channel.requests[0][1]
-        assert expected_request == actual_request
-
-    def test_delete_taxonomy_exception(self):
-        # Mock the API response
-        channel = ChannelStub(responses=[CustomException()])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1beta1.PolicyTagManagerClient()
-
-        # Setup request
-        name = client.taxonomy_path("[PROJECT]", "[LOCATION]", "[TAXONOMY]")
-
-        with pytest.raises(CustomException):
-            client.delete_taxonomy(name)
-
-    def test_update_taxonomy(self):
-        # Setup Expected Response
-        name = "name3373707"
-        display_name = "displayName1615086568"
-        description = "description-1724546052"
-        expected_response = {
-            "name": name,
-            "display_name": display_name,
-            "description": description,
-        }
-        expected_response = policytagmanager_pb2.Taxonomy(**expected_response)
-
-        # Mock the API response
-        channel = ChannelStub(responses=[expected_response])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1beta1.PolicyTagManagerClient()
-
-        response = client.update_taxonomy()
-        assert expected_response == response
-
-        assert len(channel.requests) == 1
-        expected_request = policytagmanager_pb2.UpdateTaxonomyRequest()
-        actual_request = channel.requests[0][1]
-        assert expected_request == actual_request
-
-    def test_update_taxonomy_exception(self):
-        # Mock the API response
-        channel = ChannelStub(responses=[CustomException()])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1beta1.PolicyTagManagerClient()
-
-        with pytest.raises(CustomException):
-            client.update_taxonomy()
-
-    def test_list_taxonomies(self):
-        # Setup Expected Response
-        next_page_token = ""
-        taxonomies_element = {}
-        taxonomies = [taxonomies_element]
-        expected_response = {
-            "next_page_token": next_page_token,
-            "taxonomies": taxonomies,
-        }
-        expected_response = policytagmanager_pb2.ListTaxonomiesResponse(
-            **expected_response
-        )
-
-        # Mock the API response
-        channel = ChannelStub(responses=[expected_response])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1beta1.PolicyTagManagerClient()
-
-        # Setup Request
-        parent = client.location_path("[PROJECT]", "[LOCATION]")
-
-        paged_list_response = client.list_taxonomies(parent)
-        resources = list(paged_list_response)
-        assert len(resources) == 1
-
-        assert expected_response.taxonomies[0] == resources[0]
-
-        assert len(channel.requests) == 1
-        expected_request = policytagmanager_pb2.ListTaxonomiesRequest(parent=parent)
-        actual_request = channel.requests[0][1]
-        assert expected_request == actual_request
-
-    def test_list_taxonomies_exception(self):
-        channel = ChannelStub(responses=[CustomException()])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1beta1.PolicyTagManagerClient()
-
-        # Setup request
-        parent = client.location_path("[PROJECT]", "[LOCATION]")
-
-        paged_list_response = client.list_taxonomies(parent)
-        with pytest.raises(CustomException):
-            list(paged_list_response)
-
-    def test_get_taxonomy(self):
-        # Setup Expected Response
-        name_2 = "name2-1052831874"
-        display_name = "displayName1615086568"
-        description = "description-1724546052"
-        expected_response = {
-            "name": name_2,
-            "display_name": display_name,
-            "description": description,
-        }
-        expected_response = policytagmanager_pb2.Taxonomy(**expected_response)
-
-        # Mock the API response
-        channel = ChannelStub(responses=[expected_response])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1beta1.PolicyTagManagerClient()
-
-        # Setup Request
-        name = client.taxonomy_path("[PROJECT]", "[LOCATION]", "[TAXONOMY]")
-
-        response = client.get_taxonomy(name)
-        assert expected_response == response
-
-        assert len(channel.requests) == 1
-        expected_request = policytagmanager_pb2.GetTaxonomyRequest(name=name)
-        actual_request = channel.requests[0][1]
-        assert expected_request == actual_request
-
-    def test_get_taxonomy_exception(self):
-        # Mock the API response
-        channel = ChannelStub(responses=[CustomException()])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1beta1.PolicyTagManagerClient()
-
-        # Setup request
-        name = client.taxonomy_path("[PROJECT]", "[LOCATION]", "[TAXONOMY]")
-
-        with pytest.raises(CustomException):
-            client.get_taxonomy(name)
-
-    def test_create_policy_tag(self):
-        # Setup Expected Response
-        name = "name3373707"
-        display_name = "displayName1615086568"
-        description = "description-1724546052"
-        parent_policy_tag = "parentPolicyTag2071382466"
-        expected_response = {
-            "name": name,
-            "display_name": display_name,
-            "description": description,
-            "parent_policy_tag": parent_policy_tag,
-        }
-        expected_response = policytagmanager_pb2.PolicyTag(**expected_response)
-
-        # Mock the API response
-        channel = ChannelStub(responses=[expected_response])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1beta1.PolicyTagManagerClient()
-
-        # Setup Request
-        parent = client.taxonomy_path("[PROJECT]", "[LOCATION]", "[TAXONOMY]")
-
-        response = client.create_policy_tag(parent)
-        assert expected_response == response
-
-        assert len(channel.requests) == 1
-        expected_request = policytagmanager_pb2.CreatePolicyTagRequest(parent=parent)
-        actual_request = channel.requests[0][1]
-        assert expected_request == actual_request
-
-    def test_create_policy_tag_exception(self):
-        # Mock the API response
-        channel = ChannelStub(responses=[CustomException()])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1beta1.PolicyTagManagerClient()
-
-        # Setup request
-        parent = client.taxonomy_path("[PROJECT]", "[LOCATION]", "[TAXONOMY]")
-
-        with pytest.raises(CustomException):
-            client.create_policy_tag(parent)
-
-    def test_delete_policy_tag(self):
-        channel = ChannelStub()
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1beta1.PolicyTagManagerClient()
-
-        # Setup Request
-        name = client.policy_tag_path(
-            "[PROJECT]", "[LOCATION]", "[TAXONOMY]", "[POLICY_TAG]"
-        )
-
-        client.delete_policy_tag(name)
-
-        assert len(channel.requests) == 1
-        expected_request = policytagmanager_pb2.DeletePolicyTagRequest(name=name)
-        actual_request = channel.requests[0][1]
-        assert expected_request == actual_request
-
-    def test_delete_policy_tag_exception(self):
-        # Mock the API response
-        channel = ChannelStub(responses=[CustomException()])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1beta1.PolicyTagManagerClient()
-
-        # Setup request
-        name = client.policy_tag_path(
-            "[PROJECT]", "[LOCATION]", "[TAXONOMY]", "[POLICY_TAG]"
-        )
-
-        with pytest.raises(CustomException):
-            client.delete_policy_tag(name)
-
-    def test_update_policy_tag(self):
-        # Setup Expected Response
-        name = "name3373707"
-        display_name = "displayName1615086568"
-        description = "description-1724546052"
-        parent_policy_tag = "parentPolicyTag2071382466"
-        expected_response = {
-            "name": name,
-            "display_name": display_name,
-            "description": description,
-            "parent_policy_tag": parent_policy_tag,
-        }
-        expected_response = policytagmanager_pb2.PolicyTag(**expected_response)
-
-        # Mock the API response
-        channel = ChannelStub(responses=[expected_response])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1beta1.PolicyTagManagerClient()
-
-        response = client.update_policy_tag()
-        assert expected_response == response
-
-        assert len(channel.requests) == 1
-        expected_request = policytagmanager_pb2.UpdatePolicyTagRequest()
-        actual_request = channel.requests[0][1]
-        assert expected_request == actual_request
-
-    def test_update_policy_tag_exception(self):
-        # Mock the API response
-        channel = ChannelStub(responses=[CustomException()])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1beta1.PolicyTagManagerClient()
-
-        with pytest.raises(CustomException):
-            client.update_policy_tag()
-
-    def test_list_policy_tags(self):
-        # Setup Expected Response
-        next_page_token = ""
-        policy_tags_element = {}
-        policy_tags = [policy_tags_element]
-        expected_response = {
-            "next_page_token": next_page_token,
-            "policy_tags": policy_tags,
-        }
-        expected_response = policytagmanager_pb2.ListPolicyTagsResponse(
-            **expected_response
-        )
-
-        # Mock the API response
-        channel = ChannelStub(responses=[expected_response])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1beta1.PolicyTagManagerClient()
-
-        # Setup Request
-        parent = client.taxonomy_path("[PROJECT]", "[LOCATION]", "[TAXONOMY]")
-
-        paged_list_response = client.list_policy_tags(parent)
-        resources = list(paged_list_response)
-        assert len(resources) == 1
-
-        assert expected_response.policy_tags[0] == resources[0]
-
-        assert len(channel.requests) == 1
-        expected_request = policytagmanager_pb2.ListPolicyTagsRequest(parent=parent)
-        actual_request = channel.requests[0][1]
-        assert expected_request == actual_request
-
-    def test_list_policy_tags_exception(self):
-        channel = ChannelStub(responses=[CustomException()])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1beta1.PolicyTagManagerClient()
-
-        # Setup request
-        parent = client.taxonomy_path("[PROJECT]", "[LOCATION]", "[TAXONOMY]")
-
-        paged_list_response = client.list_policy_tags(parent)
-        with pytest.raises(CustomException):
-            list(paged_list_response)
-
-    def test_get_policy_tag(self):
-        # Setup Expected Response
-        name_2 = "name2-1052831874"
-        display_name = "displayName1615086568"
-        description = "description-1724546052"
-        parent_policy_tag = "parentPolicyTag2071382466"
-        expected_response = {
-            "name": name_2,
-            "display_name": display_name,
-            "description": description,
-            "parent_policy_tag": parent_policy_tag,
-        }
-        expected_response = policytagmanager_pb2.PolicyTag(**expected_response)
-
-        # Mock the API response
-        channel = ChannelStub(responses=[expected_response])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1beta1.PolicyTagManagerClient()
-
-        # Setup Request
-        name = client.policy_tag_path(
-            "[PROJECT]", "[LOCATION]", "[TAXONOMY]", "[POLICY_TAG]"
-        )
-
-        response = client.get_policy_tag(name)
-        assert expected_response == response
-
-        assert len(channel.requests) == 1
-        expected_request = policytagmanager_pb2.GetPolicyTagRequest(name=name)
-        actual_request = channel.requests[0][1]
-        assert expected_request == actual_request
-
-    def test_get_policy_tag_exception(self):
-        # Mock the API response
-        channel = ChannelStub(responses=[CustomException()])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1beta1.PolicyTagManagerClient()
-
-        # Setup request
-        name = client.policy_tag_path(
-            "[PROJECT]", "[LOCATION]", "[TAXONOMY]", "[POLICY_TAG]"
-        )
-
-        with pytest.raises(CustomException):
-            client.get_policy_tag(name)
-
-    def test_get_iam_policy(self):
-        # Setup Expected Response
-        version = 351608024
-        etag = b"21"
-        expected_response = {"version": version, "etag": etag}
-        expected_response = policy_pb2.Policy(**expected_response)
-
-        # Mock the API response
-        channel = ChannelStub(responses=[expected_response])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1beta1.PolicyTagManagerClient()
-
-        # Setup Request
-        resource = "resource-341064690"
-
-        response = client.get_iam_policy(resource)
-        assert expected_response == response
-
-        assert len(channel.requests) == 1
-        expected_request = iam_policy_pb2.GetIamPolicyRequest(resource=resource)
-        actual_request = channel.requests[0][1]
-        assert expected_request == actual_request
-
-    def test_get_iam_policy_exception(self):
-        # Mock the API response
-        channel = ChannelStub(responses=[CustomException()])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1beta1.PolicyTagManagerClient()
-
-        # Setup request
-        resource = "resource-341064690"
-
-        with pytest.raises(CustomException):
-            client.get_iam_policy(resource)
-
-    def test_set_iam_policy(self):
-        # Setup Expected Response
-        version = 351608024
-        etag = b"21"
-        expected_response = {"version": version, "etag": etag}
-        expected_response = policy_pb2.Policy(**expected_response)
-
-        # Mock the API response
-        channel = ChannelStub(responses=[expected_response])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1beta1.PolicyTagManagerClient()
-
-        # Setup Request
-        resource = "resource-341064690"
-        policy = {}
-
-        response = client.set_iam_policy(resource, policy)
-        assert expected_response == response
-
-        assert len(channel.requests) == 1
-        expected_request = iam_policy_pb2.SetIamPolicyRequest(
-            resource=resource, policy=policy
-        )
-        actual_request = channel.requests[0][1]
-        assert expected_request == actual_request
-
-    def test_set_iam_policy_exception(self):
-        # Mock the API response
-        channel = ChannelStub(responses=[CustomException()])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1beta1.PolicyTagManagerClient()
-
-        # Setup request
-        resource = "resource-341064690"
-        policy = {}
-
-        with pytest.raises(CustomException):
-            client.set_iam_policy(resource, policy)
-
-    def test_test_iam_permissions(self):
-        # Setup Expected Response
-        expected_response = {}
-        expected_response = iam_policy_pb2.TestIamPermissionsResponse(
-            **expected_response
-        )
-
-        # Mock the API response
-        channel = ChannelStub(responses=[expected_response])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1beta1.PolicyTagManagerClient()
-
-        # Setup Request
-        resource = "resource-341064690"
-        permissions = []
-
-        response = client.test_iam_permissions(resource, permissions)
-        assert expected_response == response
-
-        assert len(channel.requests) == 1
-        expected_request = iam_policy_pb2.TestIamPermissionsRequest(
-            resource=resource, permissions=permissions
-        )
-        actual_request = channel.requests[0][1]
-        assert expected_request == actual_request
-
-    def test_test_iam_permissions_exception(self):
-        # Mock the API response
-        channel = ChannelStub(responses=[CustomException()])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1beta1.PolicyTagManagerClient()
-
-        # Setup request
-        resource = "resource-341064690"
-        permissions = []
-
-        with pytest.raises(CustomException):
-            client.test_iam_permissions(resource, permissions)
diff --git a/tests/unit/gapic/v1beta1/test_policy_tag_manager_serialization_client_v1beta1.py b/tests/unit/gapic/v1beta1/test_policy_tag_manager_serialization_client_v1beta1.py
deleted file mode 100644
index fc1f1206..00000000
--- a/tests/unit/gapic/v1beta1/test_policy_tag_manager_serialization_client_v1beta1.py
+++ /dev/null
@@ -1,145 +0,0 @@
-# -*- coding: utf-8 -*-
-#
-# Copyright 2020 Google LLC
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-#     https://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-"""Unit tests."""
-
-import mock
-import pytest
-
-from google.cloud import datacatalog_v1beta1
-from google.cloud.datacatalog_v1beta1.proto import policytagmanagerserialization_pb2
-
-
-class MultiCallableStub(object):
-    """Stub for the grpc.UnaryUnaryMultiCallable interface."""
-
-    def __init__(self, method, channel_stub):
-        self.method = method
-        self.channel_stub = channel_stub
-
-    def __call__(self, request, timeout=None, metadata=None, credentials=None):
-        self.channel_stub.requests.append((self.method, request))
-
-        response = None
-        if self.channel_stub.responses:
-            response = self.channel_stub.responses.pop()
-
-        if isinstance(response, Exception):
-            raise response
-
-        if response:
-            return response
-
-
-class ChannelStub(object):
-    """Stub for the grpc.Channel interface."""
-
-    def __init__(self, responses=[]):
-        self.responses = responses
-        self.requests = []
-
-    def unary_unary(self, method, request_serializer=None, response_deserializer=None):
-        return MultiCallableStub(method, self)
-
-
-class CustomException(Exception):
-    pass
-
-
-class TestPolicyTagManagerSerializationClient(object):
-    def test_import_taxonomies(self):
-        # Setup Expected Response
-        expected_response = {}
-        expected_response = policytagmanagerserialization_pb2.ImportTaxonomiesResponse(
-            **expected_response
-        )
-
-        # Mock the API response
-        channel = ChannelStub(responses=[expected_response])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1beta1.PolicyTagManagerSerializationClient()
-
-        # Setup Request
-        parent = client.location_path("[PROJECT]", "[LOCATION]")
-
-        response = client.import_taxonomies(parent)
-        assert expected_response == response
-
-        assert len(channel.requests) == 1
-        expected_request = policytagmanagerserialization_pb2.ImportTaxonomiesRequest(
-            parent=parent
-        )
-        actual_request = channel.requests[0][1]
-        assert expected_request == actual_request
-
-    def test_import_taxonomies_exception(self):
-        # Mock the API response
-        channel = ChannelStub(responses=[CustomException()])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1beta1.PolicyTagManagerSerializationClient()
-
-        # Setup request
-        parent = client.location_path("[PROJECT]", "[LOCATION]")
-
-        with pytest.raises(CustomException):
-            client.import_taxonomies(parent)
-
-    def test_export_taxonomies(self):
-        # Setup Expected Response
-        expected_response = {}
-        expected_response = policytagmanagerserialization_pb2.ExportTaxonomiesResponse(
-            **expected_response
-        )
-
-        # Mock the API response
-        channel = ChannelStub(responses=[expected_response])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1beta1.PolicyTagManagerSerializationClient()
-
-        # Setup Request
-        parent = client.location_path("[PROJECT]", "[LOCATION]")
-        taxonomies = []
-
-        response = client.export_taxonomies(parent, taxonomies)
-        assert expected_response == response
-
-        assert len(channel.requests) == 1
-        expected_request = policytagmanagerserialization_pb2.ExportTaxonomiesRequest(
-            parent=parent, taxonomies=taxonomies
-        )
-        actual_request = channel.requests[0][1]
-        assert expected_request == actual_request
-
-    def test_export_taxonomies_exception(self):
-        # Mock the API response
-        channel = ChannelStub(responses=[CustomException()])
-        patch = mock.patch("google.api_core.grpc_helpers.create_channel")
-        with patch as create_channel:
-            create_channel.return_value = channel
-            client = datacatalog_v1beta1.PolicyTagManagerSerializationClient()
-
-        # Setup request
-        parent = client.location_path("[PROJECT]", "[LOCATION]")
-        taxonomies = []
-
-        with pytest.raises(CustomException):
-            client.export_taxonomies(parent, taxonomies)