
observability-datahub-k8s-terraform

Read logs and metrics from Kafka topics and send them to S3, splitting GDPR audit logs from technical logs.

How it works

Workflow diagram

The logs topic contains JSON-based messages. Two key fields are used:

  • log_type from the main message
  • _log_type from the nested JSON payload carried in the message field

If log_type is application, the nested JSON is extracted into the main message and its _log_type field is evaluated.

{
    "log_type":"application",
    "message":"{\"_log_type\":\"audit\",\"message\":\"Hello World from app !\"}"
}
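
Once the nested payload is decoded and merged into the main message, the record above is handled roughly like the following (an illustrative intermediate form, not the splitter's exact output):

{
    "log_type":"application",
    "_log_type":"audit",
    "message":"Hello World from app !"
}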

If either field's value is audit, the message is routed to the audit logs topic; other messages are considered technical logs.

{
    "log_type":"audit",
    "message":"Hello World from audit !"
}

How to use it

Each resource is deployed as a Terraform module (see the Modules section below).

The provided Makefile drives the deployment. You must first create a Terraform variables file called values.tfvars.

For example:

$ cat values.tfvars
namespace          = "obs-datahub"
kafka_cluster_name = "datahub"
zk_replicas        = 3

source_topics = ["logs"]

storage_class = "openshift-storage.noobaa.io"

isOpenshift     = true
operatorSource  = "community-operator"
sourceNamespace = "openshift-marketplace"
startingCSV     = "strimzi-cluster-operator.v0.31.1"

$ make help
apply                Apply the plan
destroy              Destroy the deployment
help                 This help message
init                 Initialize the environment
operator             Creates the required operator
plan.out             Create the plan
show-modules         Prints the modules
show                 Prints the resources
$ make operator && make apply
[…]
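
The Makefile wraps the usual Terraform workflow (init, plan, apply, destroy). Alternatively, the repository can be consumed as a child module; the sketch below is illustrative only: the variable names come from the Inputs section, while the values and the unpinned git ref are placeholders.

module "observability_datahub" {
  source = "git::https://github.com/mgrzybek/observability-datahub-k8s-terraform"

  # Required inputs
  namespace     = "obs-datahub"
  storage_class = "openshift-storage.noobaa.io"

  # Optional inputs, overriding the defaults documented below
  kafka_cluster_name = "datahub"
  source_topics      = ["logs"]
  zk_replicas        = 3
}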

Requirements

No requirements.

Providers

| Name | Version |
|------|---------|
| kubernetes | 2.14.0 |

Modules

| Name | Source | Version |
|------|--------|---------|
| auditlogs_bucket | git::https://github.com/mgrzybek/terraform-module-k8s-bucket-claim | n/a |
| auditlogs_topic | git::https://github.com/mgrzybek/terraform-module-strimzi-topic | n/a |
| cluster | ./strimzi-cluster | n/a |
| operator | ./strimzi-operator | n/a |
| splitter | git::https://github.com/mgrzybek/terraform-module-k8s-logstash-logs-splitter | n/a |
| techlogs_bucket | git::https://github.com/mgrzybek/terraform-module-k8s-bucket-claim | n/a |
| techlogs_topic | git::https://github.com/mgrzybek/terraform-module-strimzi-topic | n/a |

Resources

| Name | Type |
|------|------|
| kubernetes_namespace.logs | resource |

Inputs

| Name | Description | Type | Default | Required |
|------|-------------|------|---------|----------|
| auditlogs_bucket | Name of the bucket to create to store the audit logs | string | "auditlogs" | no |
| auditlogs_topic | Target Kafka topic to push audit logs | string | "audit" | no |
| channel | Channel used to download the operator | string | "stable" | no |
| isOpenshift | Is it deployed on OpenShift? | bool | false | no |
| kafka_cluster_name | Name of the cluster created | string | "kafka-logs" | no |
| kafka_data_size | Size of the PV claimed to store Kafka’s data | string | "1Gi" | no |
| kafka_replicas | Number of data nodes deployed | number | 1 | no |
| namespace | The namespace used to deploy the module | string | n/a | yes |
| operatorSource | n/a | string | "operatorhubio-catalog" | no |
| sourceNamespace | Marketplace used to download the operator | string | "olm" | no |
| source_topics | Names of the topics to listen to | list(string) | ["logs"] | no |
| splitter_replicas | Number of replicas to deploy | number | 1 | no |
| startingCSV | Version to install | string | "strimzi-cluster-operator.v0.31.1" | no |
| storage_class | Storage class to use in the ObjectBucketClaim | string | n/a | yes |
| techlogs_bucket | Name of the bucket to create to store the technical logs | string | "techlogs" | no |
| techlogs_topic | Target Kafka topic to push technical logs | string | "techlogs" | no |
| zk_data_size | Size of the PV claimed to store Zookeeper’s data | string | "1Gi" | no |
| zk_replicas | Number of pods deployed for Zookeeper | number | 1 | no |
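
Only namespace and storage_class have no default. Assuming the OLM catalog defaults (channel, operatorSource, sourceNamespace, startingCSV) fit your cluster, a minimal values.tfvars can therefore be as small as the following sketch; the storage class name is a hypothetical example and must match a bucket-capable storage class on your cluster.

namespace     = "obs-datahub"
storage_class = "rook-ceph-bucket"   # hypothetical ObjectBucketClaim storage class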

Outputs

No outputs.
