Tables exist within datasets. See the BigQuery documentation for more information on tables.
List the tables belonging to a dataset with the ~google.cloud.bigquery.client.Client.list_tables
method:
../samples/list_tables.py
Get a table resource with the ~google.cloud.bigquery.client.Client.get_table
method:
../samples/get_table.py
Determine if a table exists with the ~google.cloud.bigquery.client.Client.get_table
method:
../samples/table_exists.py
Browse data rows in a table with the ~google.cloud.bigquery.client.Client.list_rows
method:
../samples/browse_table_data.py
Create an empty table with the ~google.cloud.bigquery.client.Client.create_table
method:
../samples/create_table.py
Create a clustered table with the ~google.cloud.bigquery.client.Client.create_table
method:
../samples/create_table_clustered.py
Create an integer range partitioned table with the ~google.cloud.bigquery.client.Client.create_table
method:
../samples/create_table_range_partitioned.py
Load table data from a file with the ~google.cloud.bigquery.client.Client.load_table_from_file
method:
../samples/load_table_file.py
Create a clustered table from a query result:
../samples/client_query_destination_table_clustered.py
Create a clustered table when you load data with the ~google.cloud.bigquery.client.Client.load_table_from_file
method:
../samples/load_table_clustered.py
Load a CSV file from Cloud Storage with the ~google.cloud.bigquery.client.Client.load_table_from_uri
method:
../samples/load_table_uri_csv.py
See also: Loading CSV data from Cloud Storage.
Load a JSON file from Cloud Storage:
../samples/load_table_uri_json.py
See also: Loading JSON data from Cloud Storage.
Load a Parquet file from Cloud Storage:
../samples/load_table_uri_parquet.py
See also: Loading Parquet data from Cloud Storage.
Load an Avro file from Cloud Storage:
../samples/load_table_uri_avro.py
See also: Loading Avro data from Cloud Storage.
Load an ORC file from Cloud Storage:
../samples/load_table_uri_orc.py
See also: Loading ORC data from Cloud Storage.
Load a CSV file from Cloud Storage and auto-detect schema:
../samples/load_table_uri_autodetect_csv.py
Load a JSON file from Cloud Storage and auto-detect schema:
../samples/load_table_uri_autodetect_json.py
Update a property in a table's metadata with the ~google.cloud.bigquery.client.Client.update_table
method:
../snippets.py
Insert rows into a table's data with the ~google.cloud.bigquery.client.Client.insert_rows
method:
../samples/table_insert_rows.py
Insert rows into a table's data with the ~google.cloud.bigquery.client.Client.insert_rows
method, achieving a higher write limit:
../samples/table_insert_rows_explicit_none_insert_ids.py
Note that inserting data with None
row insert IDs disables best-effort deduplication, so retried requests can produce duplicate rows. See also: Streaming inserts.
Add an empty column to an existing table with the ~google.cloud.bigquery.client.Client.update_table
method:
../samples/add_empty_column.py
Copy a table with the ~google.cloud.bigquery.client.Client.copy_table
method:
../samples/copy_table.py
Copy table data to Google Cloud Storage with the ~google.cloud.bigquery.client.Client.extract_table
method:
../snippets.py
Delete a table with the ~google.cloud.bigquery.client.Client.delete_table
method:
../samples/delete_table.py
Restore a deleted table from a snapshot by using the ~google.cloud.bigquery.client.Client.copy_table
method:
../samples/undelete_table.py
Replace the table data with an Avro file from Cloud Storage:
../samples/load_table_uri_truncate_avro.py
Replace the table data with a CSV file from Cloud Storage:
../samples/load_table_uri_truncate_csv.py
Replace the table data with a JSON file from Cloud Storage:
../samples/load_table_uri_truncate_json.py
Replace the table data with an ORC file from Cloud Storage:
../samples/load_table_uri_truncate_orc.py
Replace the table data with a Parquet file from Cloud Storage:
../samples/load_table_uri_truncate_parquet.py