Contributing

This document provides guidelines for contributing to the module.

Dependencies

The following dependencies must be installed on the development system. At a minimum, the workflow described below relies on `make` and Docker Engine.

Generating Documentation for Inputs and Outputs

The Inputs and Outputs tables in the READMEs of the root module, submodules, and example modules are automatically generated based on the variables and outputs of the respective modules. These tables must be refreshed if the module interfaces are changed.

Templating

To more cleanly handle cases where desired functionality would require complex duplication of Terraform resources (e.g. PR 51), this repository is largely generated from the `autogen` directory.

The root module is generated by running `make build`. Changes to this repository should be made in the `autogen` directory where appropriate.

Note: The correct sequence to update the repo using autogen functionality is to run `make build`. This renders the various Terraform files and then generates the Terraform documentation using `terraform-docs`.
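
Putting that note into a concrete sequence, a change to the templated sources might look like the following sketch (only the `make` target named in this document is used; the comments restate the behaviour described above):

```
# 1. Edit the templated sources under the autogen directory, not the generated files.
# 2. Regenerate the repository; per the note above, this renders the Terraform
#    files and then refreshes the terraform-docs documentation.
make build
```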

Autogeneration of documentation from .tf files

To generate new Inputs and Outputs tables, run:

```
make docker_generate_docs
```

Integration Testing

Integration tests are used to verify the behaviour of the root module, submodules, and example modules. Additions, changes, and fixes should be accompanied by tests.

The integration tests are run using Kitchen, Kitchen-Terraform, and InSpec. These tools are packaged within a Docker image for convenience.

The general strategy for these tests is to verify the behaviour of the example modules, thus ensuring that the root module, submodules, and example modules are all functionally correct.

Six test-kitchen instances are defined:

* deploy-service
* node-pool
* shared-vpc
* simple-regional
* simple-zonal
* stub-domains

The test-kitchen instances in `test/fixtures/` wrap identically-named examples in the `examples/` directory.
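
As an illustration of that pairing (a sketch inferred from the instance names listed above; actual directory contents may differ):

```
# Each instance name corresponds to both an example and the fixture that wraps it:
ls examples/simple-regional        # the example module under test
ls test/fixtures/simple-regional   # the test-kitchen fixture wrapping it
```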

Test Environment

The easiest way to test the module is in an isolated test project. The setup for such a project is defined in the `test/setup` directory.

To use this setup, you need a service account with Project Creator access on a folder; the Billing Account User role is also required. Export the Service Account credentials to your environment like so:

```
export SERVICE_ACCOUNT_JSON=$(< credentials.json)
```

Note that `SERVICE_ACCOUNT_JSON` holds the contents of the credentials file; if you see errors pertaining to credential type, ensure this variable contains valid JSON, and not, for example, a path.
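
As an optional sanity check, something like the following can confirm the variable holds JSON rather than a path (a sketch assuming `python3` is available; any JSON validator, such as `jq`, works just as well):

```
# Prints a parse error if SERVICE_ACCOUNT_JSON does not contain valid JSON
# (for example, if it was accidentally set to a file path).
echo "$SERVICE_ACCOUNT_JSON" | python3 -m json.tool > /dev/null && echo "SERVICE_ACCOUNT_JSON contains valid JSON"
```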

You will also need to set a few environment variables:

```
export TF_VAR_org_id="your_org_id"
export TF_VAR_folder_id="your_folder_id"
export TF_VAR_billing_account="your_billing_account_id"
```

With these settings in place, you can prepare a test project using Docker:

```
make docker_test_prepare
```

Noninteractive Execution

Run `make docker_test_integration` to test all of the example modules noninteractively, using the prepared test project.
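
Putting the environment setup and the test run together, an end-to-end noninteractive session might look like this sketch (all values are placeholders; the variable names and `make` targets are the ones documented above):

```
export SERVICE_ACCOUNT_JSON=$(< credentials.json)
export TF_VAR_org_id="your_org_id"
export TF_VAR_folder_id="your_folder_id"
export TF_VAR_billing_account="your_billing_account_id"

make docker_test_prepare       # provision the isolated test project
make docker_test_integration   # run all integration tests against it
```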

Interactive Execution

1. Run `make docker_run` to start the testing Docker container in interactive mode.

2. Run `kitchen_do create <EXAMPLE_NAME>` to initialize the working directory for an example module.

3. Run `kitchen_do converge <EXAMPLE_NAME>` to apply the example module.

4. Run `kitchen_do verify <EXAMPLE_NAME>` to test the example module.

5. Run `kitchen_do destroy <EXAMPLE_NAME>` to destroy the example module state.
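
Taken together, an interactive session for a single example might look like the following sketch (using `simple-regional`, one of the instances listed above):

```
make docker_run

# Inside the container:
kitchen_do create simple-regional    # initialize the working directory
kitchen_do converge simple-regional  # apply the example module
kitchen_do verify simple-regional    # run the InSpec tests against it
kitchen_do destroy simple-regional   # tear down the example module state
```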

Linting and Formatting

Many of the files in the repository can be linted or formatted to maintain a standard of quality.

Execution

Run `make docker_test_lint`.