error "google.api_core.exceptions.InvalidArgument: 400 Cannot parse as CloudRegion." while try to create BigLake table with google-cloud-bigquery-biglake
#12631
Open
tiltgod opened this issue on Apr 26, 2024 · 2 comments
I already have a BigLake table named "config" in the same region, created with a BigQuery stored procedure for Apache Spark:
projects/dudledood-sql-project-1/locations/us/catalogs/iceberg_catalog/database/iceberg_warehouse/tables/config
But I can't create another one with:
create_table_request = bigquery_biglake_v1.CreateTableRequest(
    # intended: projects/dudledood-sql-project-1/locations/us/catalogs/iceberg_catalog/database/iceberg_warehouse
    parent="projects/{dudledood-sql-project-1}/locations/{us}/catalogs/{iceberg_catalog}/database/{iceberg_warehouse}",
    table_id=table_id,
)
# Make the request
create_table_response = self.biglake_client.create_table(request=create_table_request)
# Handle the response
print(create_table_response)
Full error (from the Airflow spark_submit log, 2024-04-26 11:31:49 UTC):

Traceback (most recent call last):
  File "/home/***/.local/lib/python3.11/site-packages/google/api_core/grpc_helpers.py", line 76, in error_remapped_callable
    return callable_(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/***/.local/lib/python3.11/site-packages/grpc/_channel.py", line 1176, in __call__
    return _end_unary_response_blocking(state, call, False, None)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/***/.local/lib/python3.11/site-packages/grpc/_channel.py", line 1005, in _end_unary_response_blocking
    raise _InactiveRpcError(state)  # pytype: disable=not-instantiable
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
        status = StatusCode.INVALID_ARGUMENT
        details = "Cannot parse as CloudRegion."
        debug_error_string = "UNKNOWN:Error received from peer ipv4:142.250.199.10:443 {created_time:"2024-04-26T11:31:49.70895308+00:00", grpc_status:3, grpc_message:"Cannot parse as CloudRegion."}"
>

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/opt/***/dags/script/iceberg_insertion_sparkappp.py", line 28, in <module>
    biglake_client.create_biglake_table(table_id=table_id)
  File "/tmp/localPyFiles-2899bd6c-42be-48e5-b653-e6df2dad4414/client.py", line 164, in create_biglake_table
    create_table_response = self.biglake_client.create_table(request=create_table_request)
                            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/***/.local/lib/python3.11/site-packages/google/cloud/bigquery_biglake_v1/services/metastore_service/client.py", line 1882, in create_table
    response = rpc(
               ^^^^
  File "/home/***/.local/lib/python3.11/site-packages/google/api_core/gapic_v1/method.py", line 131, in __call__
    return wrapped_func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/***/.local/lib/python3.11/site-packages/google/api_core/grpc_helpers.py", line 78, in error_remapped_callable
    raise exceptions.from_grpc_error(exc) from exc
google.api_core.exceptions.InvalidArgument: 400 Cannot parse as CloudRegion.
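A likely cause, going by the error text: the parent string is passed with literal curly braces, so the service receives "{us}" as the location and cannot parse it as a CloudRegion. Below is a minimal sketch of the same request with the placeholders filled in; the project, location, catalog, and database names are assumed from the existing table's path above, and the new table name is hypothetical.

# Sketch only: resource names assumed from the existing table's path;
# otherwise the request mirrors the original snippet.
from google.cloud import bigquery_biglake_v1

client = bigquery_biglake_v1.MetastoreServiceClient()

# Build the parent without literal braces; the generated client's
# database_path() helper assembles the canonical resource string.
parent = bigquery_biglake_v1.MetastoreServiceClient.database_path(
    project="dudledood-sql-project-1",
    location="us",
    catalog="iceberg_catalog",
    database="iceberg_warehouse",
)

table_id = "config_v2"  # hypothetical name for the new table

create_table_request = bigquery_biglake_v1.CreateTableRequest(
    parent=parent,
    table_id=table_id,
)
create_table_response = client.create_table(request=create_table_request)
print(create_table_response)

Note that database_path() emits a "databases" (plural) collection segment, which matches the v1 resource pattern; the singular "database" segment in the hand-built string above may be a second problem once the braces are removed.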