Getting organised

The Terraform documentation doesn’t explicitly provide a recommended project structure, so we came up with our own.

To improve the development experience and make the project clearer for everyone, we decided to split our project into three distinct groups (modules, blueprints and environments), along with a master go script.

├── modules
│   ├── resource_group
│   │   └── main.tf
│   └── storage_account
│       └── main.tf
├── blueprints
│   ├── commerce
│   │   └── main.tf
│   ├── finance
│   │   ├── orders.tf
│   │   ├── payments.tf
│   │   └── variables.tf
│   └── plan.tf
├── environments
│   ├── test
│   │   └── main.tf
│   └── production
│       └── main.tf
└── go.ps1

Modules

Terraform modules allow users to group logical sets of resources. All .tf files within a folder are bundled together to form a module, which can be referenced from anywhere else in the project using its folder path.

We have built upon this to create a reusable set of components, which are declared using common standards to ensure that infrastructure of the same type conforms to the requirements of the system. Anything that needs to be specifically configured is set using variables.

The max topic size is explicitly set in the modules definition
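As a sketch of what such a module might look like, assuming the Azure provider (the module layout, variable names and default size here are illustrative, not the original code):

```hcl
# modules/service_bus_topic/main.tf -- illustrative sketch

variable "name" {
  description = "Name of the topic"
}

variable "namespace_name" {
  description = "Service bus namespace that hosts the topic"
}

variable "resource_group_name" {
  description = "Resource group containing the namespace"
}

resource "azurerm_servicebus_topic" "topic" {
  name                = var.name
  namespace_name      = var.namespace_name
  resource_group_name = var.resource_group_name

  # The max topic size is pinned inside the module, so every
  # topic created from it conforms to the same standard.
  max_size_in_megabytes = 1024
}

output "id" {
  value = azurerm_servicebus_topic.topic.id
}
```

Because the variables and outputs sit alongside the resource that uses them, the whole module fits in a single main.tf.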

Terraform lets you declare variables and outputs in separate files, and this is what you’ll find in most online examples. I prefer to keep them in the same file as the resources that use them wherever possible. This keeps modules clean and concise whilst keeping the project size down.

Even if a module consists of just one .tf file, it must be included in a folder to be referenced as a module from other locations in the project. Having a single file in a folder is a code smell but unfortunately there isn’t any way around it.

Blueprints

We use blueprints to define infrastructure that needs to be built from a combination of modules. This allows us to logically group resources that reflect the requirements of specific parts of our system and domain. Slightly more specific variables can be defined at this level; they may vary from one resource to another but are of the same type.

For example, if we are required to provide a topic that subscribers can listen to for published messages, we can create a blueprint that defines all the resources and variables needed, such as a resource group, a service bus namespace and a topic.

Blueprint referencing multiple modules with domain specific variables
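As an illustrative sketch (the module paths, output names and variables are assumptions), a commerce blueprint composing those modules could look like:

```hcl
# blueprints/commerce/main.tf -- illustrative sketch

variable "environment" {
  description = "Environment the blueprint is deployed to"
}

# Each module is referenced by its folder path.
module "resource_group" {
  source = "../../modules/resource_group"
  name   = "commerce-${var.environment}"
}

module "namespace" {
  source              = "../../modules/service_bus_namespace"
  name                = "commerce-${var.environment}"
  resource_group_name = module.resource_group.name
}

module "orders_topic" {
  source              = "../../modules/service_bus_topic"
  name                = "orders"
  namespace_name      = module.namespace.name
  resource_group_name = module.resource_group.name
}
```

The domain specific values, such as the topic name, live in the blueprint, while the standards (like the max topic size) stay inside the modules.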

The number of resources required for a particular part of the system can often be large, so a single file referencing every module quickly becomes messy and difficult to maintain. This is a good time to split these files up but keep them grouped within the same blueprint folder, like the finance example above. At this point, variables and outputs should be extracted into separate files for clarity, as they will be used across multiple resource files.

The plan.tf file at the root of the folder references each blueprint to create a complete plan of the system. It’s essentially the master plan and is referenced by each environment to generate its infrastructure.
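For instance (blueprint names and variables are illustrative), plan.tf simply pulls each blueprint in as a module:

```hcl
# blueprints/plan.tf -- illustrative sketch

variable "environment" {
  description = "Passed down to every blueprint"
}

module "commerce" {
  source      = "./commerce"
  environment = var.environment
}

module "finance" {
  source      = "./finance"
  environment = var.environment
}
```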

Environments

An environment represents the top level of the system, such as test or production. The plan is referenced from here and allows for environment specific variables to be set. This ensures that all environments will be consistently created using the same resources.

Example environment definition
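A sketch of a test environment along these lines (the paths and variable names are assumptions):

```hcl
# environments/test/main.tf -- illustrative sketch

# Reference the master plan so every environment is built
# from exactly the same set of blueprints.
module "plan" {
  source      = "../../blueprints"
  environment = "test"
}
```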

The Terraform state and provider are also declared here so that environment specific backend storage and client secrets can be set. It’s important to note that secrets are never set in these files or checked into source control. Instead, we replace placeholder values at runtime with the go script.
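The backend and provider declarations might look like the following, with placeholder tokens that get swapped for real secrets at runtime (the token format and names are assumptions, not the actual convention used):

```hcl
# environments/test/main.tf -- illustrative sketch of state and provider config

terraform {
  backend "azurerm" {
    storage_account_name = "terraformstatetest"
    container_name       = "state"
    key                  = "test.terraform.tfstate"
    access_key           = "__storage_access_key__" # replaced at runtime, never committed
  }
}

provider "azurerm" {
  subscription_id = "__subscription_id__" # replaced at runtime
  tenant_id       = "__tenant_id__"       # replaced at runtime
  client_id       = "__client_id__"       # replaced at runtime
  client_secret   = "__client_secret__"   # replaced at runtime
}
```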

Go script

The go.ps1 script is the entry point into our Terraform project. It wraps Terraform, giving us extra control over our workflow and build pipelines. The same script is used both for development on local machines and for applying the infrastructure across all environments in the build pipeline, which enforces a consistent development and deployment process.

This PowerShell script allows us to control the commands used to plan, apply and monitor the system during the deployment process. It also replaces any secrets in the .tf files and reverts them afterwards, so that no sensitive data is saved or compromised.