feat: Add log export GCS bucket object versioning (#317)
* Add log export GCS bucket object versioning

* Improve wording

Co-authored-by: Bharath KKB <bharathkrishnakb@gmail.com>

vovinacci and bharathkkb committed Mar 31, 2021
1 parent d9468db commit cb0e622
Showing 4 changed files with 19 additions and 8 deletions.
1 change: 1 addition & 0 deletions 1-org/envs/shared/README.md
@@ -27,6 +27,7 @@
| log\_export\_storage\_force\_destroy | (Optional) If set to true, delete all contents when destroying the resource; otherwise, destroying the resource will fail if contents are present. | `bool` | `false` | no |
| log\_export\_storage\_location | The location of the storage bucket used to export logs. | `string` | `"US"` | no |
| log\_export\_storage\_retention\_policy | Configuration of the bucket's data retention policy for how long objects in the bucket should be retained. | <pre>object({<br> is_locked = bool<br> retention_period_days = number<br> })</pre> | `null` | no |
+| log\_export\_storage\_versioning | (Optional) Toggles bucket versioning, which retains a noncurrent object version when the live object version is replaced or deleted. | `bool` | `false` | no |
| org\_audit\_logs\_project\_alert\_pubsub\_topic | The name of the Cloud Pub/Sub topic where budget related messages will be published, in the form of `projects/{project_id}/topics/{topic_id}` for the org audit logs project. | `string` | `null` | no |
| org\_audit\_logs\_project\_alert\_spent\_percents | A list of percentages of the budget to alert on when threshold is exceeded for the org audit logs project. | `list(number)` | <pre>[<br> 0.5,<br> 0.75,<br> 0.9,<br> 0.95<br>]</pre> | no |
| org\_audit\_logs\_project\_budget\_amount | The amount to use as the budget for the org audit logs project. | `number` | `1000` | no |
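Enabling the new input from a consuming root module is a one-line change. A minimal sketch of a `terraform.tfvars` for the shared environment, assuming the input names from the table above (the values shown are illustrative, not recommendations):

```hcl
# Keep noncurrent versions of exported log objects when they are
# overwritten or deleted.
log_export_storage_versioning = true

# Optional: pair versioning with a retention policy on the same bucket.
log_export_storage_retention_policy = {
  is_locked             = false
  retention_period_days = 365
}

log_export_storage_location = "US"
```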
13 changes: 7 additions & 6 deletions 1-org/envs/shared/log_sinks.tf
@@ -40,7 +40,7 @@ resource "random_string" "suffix" {

module "log_export_to_biqquery" {
source = "terraform-google-modules/log-export/google"
-  version = "~> 5.0"
+  version = "~> 5.1.0"
destination_uri = module.bigquery_destination.destination_uri
filter = local.main_logs_filter
log_sink_name = "sk-c-logging-bq"
@@ -55,7 +55,7 @@ module "log_export_to_biqquery" {

module "bigquery_destination" {
source = "terraform-google-modules/log-export/google//modules/bigquery"
-  version = "~> 5.0"
+  version = "~> 5.1.0"
project_id = module.org_audit_logs.project_id
dataset_name = "audit_logs"
log_sink_writer_identity = module.log_export_to_biqquery.writer_identity
@@ -69,7 +69,7 @@ module "bigquery_destination" {

module "log_export_to_storage" {
source = "terraform-google-modules/log-export/google"
-  version = "~> 5.0"
+  version = "~> 5.1.0"
destination_uri = module.storage_destination.destination_uri
filter = local.all_logs_filter
log_sink_name = "sk-c-logging-bkt"
@@ -81,14 +81,15 @@ module "log_export_to_storage" {

module "storage_destination" {
source = "terraform-google-modules/log-export/google//modules/storage"
-  version = "~> 5.0"
+  version = "~> 5.1.0"
project_id = module.org_audit_logs.project_id
storage_bucket_name = "bkt-${module.org_audit_logs.project_id}-org-logs-${random_string.suffix.result}"
log_sink_writer_identity = module.log_export_to_storage.writer_identity
uniform_bucket_level_access = true
location = var.log_export_storage_location
retention_policy = var.log_export_storage_retention_policy
force_destroy = var.log_export_storage_force_destroy
+  versioning = var.log_export_storage_versioning
}

/******************************************
@@ -97,7 +98,7 @@ module "storage_destination" {

module "log_export_to_pubsub" {
source = "terraform-google-modules/log-export/google"
-  version = "~> 5.0"
+  version = "~> 5.1.0"
destination_uri = module.pubsub_destination.destination_uri
filter = local.main_logs_filter
log_sink_name = "sk-c-logging-pub"
@@ -109,7 +110,7 @@ module "log_export_to_pubsub" {

module "pubsub_destination" {
source = "terraform-google-modules/log-export/google//modules/pubsub"
-  version = "~> 5.0"
+  version = "~> 5.1.0"
project_id = module.org_audit_logs.project_id
topic_name = "tp-org-logs-${random_string.suffix.result}"
log_sink_writer_identity = module.log_export_to_pubsub.writer_identity
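The `storage_destination` module passes the new `versioning` input through to the log export bucket it manages. A rough sketch of the bucket configuration this ends up applying, written against the `google_storage_bucket` resource directly (this is not the module's actual source; the project, name, and suffix values are illustrative):

```hcl
resource "google_storage_bucket" "log_export" {
  # Mirrors "bkt-${project_id}-org-logs-${suffix}" from the module call.
  name                        = "bkt-example-project-org-logs-abcd"
  project                     = "example-project"
  location                    = "US"
  uniform_bucket_level_access = true
  force_destroy               = false

  # Driven by var.log_export_storage_versioning; when enabled, replacing
  # or deleting a live object keeps the old copy as a noncurrent version.
  versioning {
    enabled = true
  }
}
```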
6 changes: 6 additions & 0 deletions 1-org/envs/shared/variables.tf
@@ -114,6 +114,12 @@ variable "log_export_storage_force_destroy" {
default = false
}

+variable "log_export_storage_versioning" {
+  description = "(Optional) Toggles bucket versioning, which retains a noncurrent object version when the live object version is replaced or deleted."
+  type        = bool
+  default     = false
+}

variable "audit_logs_table_delete_contents_on_destroy" {
description = "(Optional) If set to true, delete all the tables in the dataset when destroying the resource; otherwise, destroying the resource will fail if tables are present."
type = bool
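One operational caveat worth noting: with versioning on, noncurrent versions accumulate indefinitely unless something prunes them, and this commit adds no pruning. A hypothetical companion lifecycle rule (not part of this change; resource and names are illustrative) that caps the number of retained noncurrent versions could look like:

```hcl
resource "google_storage_bucket" "example" {
  name     = "example-versioned-logs"
  project  = "example-project"
  location = "US"

  versioning {
    enabled = true
  }

  # Delete any object version once at least 10 newer versions of the
  # same object exist; num_newer_versions only ever matches noncurrent
  # versions, so the live object is never affected.
  lifecycle_rule {
    action {
      type = "Delete"
    }
    condition {
      num_newer_versions = 10
    }
  }
}
```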
7 changes: 5 additions & 2 deletions README.md
@@ -68,9 +68,12 @@ example-organization
Among the six projects created under the common folder, two projects (`prj-c-logging`, `prj-c-billing-logs`) are used for logging.
The first is for organization-wide audit logs and the latter for billing logs.
In both cases the logs are collected into BigQuery datasets, which can then be used for general querying, dashboarding & reporting. Logs are also exported to Pub/Sub and a GCS bucket.
-_The various audit log types being captured in BigQuery are retained for 30 days._

-For billing data, a BigQuery dataset is created with permissions attached, however you will need to configure a billing export [manually](https://cloud.google.com/billing/docs/how-to/export-data-bigquery), as there is no easy way to automate this at the moment.
+**Notes**:
+
+- Log export to GCS bucket has optional object versioning support via `log_export_storage_versioning`.
+- The various audit log types being captured in BigQuery are retained for 30 days.
+- For billing data, a BigQuery dataset is created with permissions attached; however, you will need to configure a billing export [manually](https://cloud.google.com/billing/docs/how-to/export-data-bigquery), as there is no easy way to automate this at the moment.

#### DNS Hub

