Building an Azure DevOps Pipeline for Terraform with Azure Storage Backend

In this article, I’ll walk you through creating an Azure DevOps pipeline for deploying infrastructure using Terraform. The pipeline utilizes an Azure Storage backend for state management. I’ll also discuss the issues I faced during implementation and how I resolved them.


Pipeline Overview

This Azure DevOps pipeline is designed to:

  1. Initialize Terraform with an Azure Storage backend.

  2. Validate the Terraform configuration.

  3. Generate a Terraform execution plan.

  4. Archive the generated plan for later use.


Pipeline YAML Configuration

Here’s the YAML file used to configure the pipeline:

trigger:
  branches:
    include:
    - main
  paths:
    include:
    - azureinfra/*

pool:
  name: 'tfagent' 

stages:
- stage: Build
  jobs:
  - job: Build
    pool:
      name: 'tfagent'
    steps:
    - task: TerraformInstaller@1
      inputs:
        terraformVersion: 'latest'

    # Debugging Step to Confirm .tf Files Location
    - script: |
        echo "Listing files in the azureinfra folder:"
        echo "$(System.DefaultWorkingDirectory)/azureinfra"
        ls -al $(System.DefaultWorkingDirectory)/azureinfra
      displayName: 'Debug azureinfra Folder Contents'

    - task: TerraformTaskV4@4
      inputs:
        provider: 'azurerm'
        command: 'init'
        commandOptions: '-migrate-state'
        workingDirectory: '$(System.DefaultWorkingDirectory)/azureinfra'
        backendServiceArm: 'Free Trial(f7d7399c-da85-4d03-b12c-d6608ce5f808)'
        backendAzureRmResourceGroupName: 'manualtest'
        backendAzureRmStorageAccountName: 'tfstatetest1'
        backendAzureRmContainerName: 'tfstatetest'
        backendAzureRmKey: 'test.terraform.tfstate'

    - task: TerraformTaskV4@4
      inputs:
        provider: 'azurerm'
        command: 'validate'
        workingDirectory: '$(System.DefaultWorkingDirectory)/azureinfra'

    - task: TerraformTaskV4@4
      inputs:
        provider: 'azurerm'
        command: 'plan'
        commandOptions: '-out $(Build.SourcesDirectory)/tfplanfile'
        workingDirectory: '$(System.DefaultWorkingDirectory)/azureinfra'
        environmentServiceNameAzureRM: 'Free Trial(f7d7399c-da85-4d03-b12c-d6608ce5f808)'

    - task: ArchiveFiles@2
      inputs:
        rootFolderOrFile: '$(Build.SourcesDirectory)/'
        includeRootFolder: false
        archiveType: 'zip'
        archiveFile: '$(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip'
        replaceExistingArchive: true

    - task: PublishBuildArtifacts@1
      inputs:
        PathtoPublish: '$(Build.ArtifactStagingDirectory)'
        ArtifactName: '$(Build.BuildId)-build'
        publishLocation: 'Container'

Step-by-Step Breakdown

1. Terraform Installation

We use the TerraformInstaller task to install Terraform on the agent. Here it pulls the latest release, which keeps the agent up to date; if you need fully reproducible builds, you can pin an exact version instead (an example follows below).

- task: TerraformInstaller@1
  inputs:
    terraformVersion: 'latest'
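
To pin a specific version, set terraformVersion to the release you want; the version number below is only an illustration:

- task: TerraformInstaller@1
  inputs:
    terraformVersion: '1.9.5'   # example pin; use whichever release your modules are tested against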

2. Debugging Directory and Files

Before initializing Terraform, a script step lists the contents of the working directory so you can confirm that all .tf files are where the pipeline expects them. This is especially helpful when the pipeline fails to find configuration files.

- script: |
    echo "Listing files in the azureinfra folder:"
    ls -al $(System.DefaultWorkingDirectory)/azureinfra
  displayName: 'Debug azureinfra Folder Contents'

3. Terraform Initialization

The pipeline initializes Terraform with an Azure Storage backend. The backend configuration ensures that Terraform state is stored securely in Azure Storage, enabling team collaboration and preventing local file corruption.

Key settings include:

  • Resource group: The Azure resource group containing the storage account.

  • Storage account: Stores the Terraform state file.

  • Container name: The blob storage container where the state file resides.

- task: TerraformTaskV4@4
  inputs:
    provider: 'azurerm'
    command: 'init'
    commandOptions: '-migrate-state'
    workingDirectory: '$(System.DefaultWorkingDirectory)/azureinfra'
    backendServiceArm: 'Free Trial(f7d7399c-da85-4d03-b12c-d6608ce5f808)'
    backendAzureRmResourceGroupName: 'manualtest'
    backendAzureRmStorageAccountName: 'tfstatetest1'
    backendAzureRmContainerName: 'tfstatetest'
    backendAzureRmKey: 'test.terraform.tfstate'
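
The pipeline assumes the backend resource group, storage account, and container already exist. If they don't, you can create them once up front. Here's a minimal sketch using an AzureCLI@2 step with the same names as the backend settings above; the location is just an example:

- task: AzureCLI@2
  inputs:
    azureSubscription: 'Free Trial(f7d7399c-da85-4d03-b12c-d6608ce5f808)'
    scriptType: 'bash'
    scriptLocation: 'inlineScript'
    inlineScript: |
      # One-time bootstrap of the Terraform state storage (the commands are idempotent)
      az group create --name manualtest --location eastus
      az storage account create --name tfstatetest1 --resource-group manualtest --sku Standard_LRS
      az storage container create --name tfstatetest --account-name tfstatetest1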

4. Validation and Planning

Terraform configuration is validated to catch syntax errors or misconfigurations. Once validated, a plan file is generated to show the changes that Terraform will apply.

- task: TerraformTaskV4@4
  inputs:
    provider: 'azurerm'
    command: 'validate'
    workingDirectory: '$(System.DefaultWorkingDirectory)/azureinfra'
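
Once validation passes, the plan step (also shown in the full pipeline above) writes the execution plan to a file so it can be archived:

- task: TerraformTaskV4@4
  inputs:
    provider: 'azurerm'
    command: 'plan'
    commandOptions: '-out $(Build.SourcesDirectory)/tfplanfile'
    workingDirectory: '$(System.DefaultWorkingDirectory)/azureinfra'
    environmentServiceNameAzureRM: 'Free Trial(f7d7399c-da85-4d03-b12c-d6608ce5f808)'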

5. Archiving and Publishing

The sources directory, which now contains the generated plan file (tfplanfile), is zipped and published as a build artifact so the plan can be consumed later (for example, by an apply stage) or kept for reference.

- task: ArchiveFiles@2
  inputs:
    rootFolderOrFile: '$(Build.SourcesDirectory)/'
    includeRootFolder: false
    archiveType: 'zip'
    archiveFile: '$(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip'
    replaceExistingArchive: true
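
The archive is then published as a build artifact (again, taken from the full pipeline above):

- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: '$(Build.BuildId)-build'
    publishLocation: 'Container'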

Challenges and Resolutions

1. Missing Terraform Configuration Files

Issue: Terraform couldn’t find .tf files in the working directory.

Resolution: Added a debugging step to list directory contents and updated the workingDirectory to point to the correct folder: $(System.DefaultWorkingDirectory)/azureinfra.

If you get stuck on a similar path problem, the same debug step can be reused to confirm where your files actually land on the agent.

The pipeline relies on a few predefined directory variables that are worth understanding: $(System.DefaultWorkingDirectory), the folder on the agent where the repository is checked out; $(Build.SourcesDirectory), which points to the same location by default; and $(Build.ArtifactStagingDirectory), where artifacts are staged before being published.

Read more about predefined variables here:
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/variables?view=azure-devops&tabs=azure-devops-cli%2Cbatch
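
If you want to see what these variables resolve to on your agent, a throwaway step like this prints them (a minimal sketch; remove it once the paths are confirmed):

- script: |
    echo "System.DefaultWorkingDirectory: $(System.DefaultWorkingDirectory)"
    echo "Build.SourcesDirectory:         $(Build.SourcesDirectory)"
    echo "Build.ArtifactStagingDirectory: $(Build.ArtifactStagingDirectory)"
  displayName: 'Debug predefined directory variables'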


2. Backend Defaulted to Local

Issue: Despite configuring Azure Storage as the backend, Terraform defaulted to using the local backend.

Resolution: Added the -migrate-state option to the init command, which tells Terraform to move any existing local state into the newly configured Azure Storage backend. Note that Terraform only uses a remote backend that is declared in the configuration, so the terraform block must contain at least an empty backend "azurerm" {} declaration; the pipeline task then supplies the details via the backendAzureRm* inputs.


3. Insufficient Permissions

Issue: The Azure DevOps pipeline failed to access the storage account due to missing permissions.

Resolution: Granted the service principal behind the pipeline's service connection the following roles on the storage account (a sketch of the equivalent Azure CLI step follows the list):

  • Storage Blob Data Contributor

  • Storage Blob Data Reader
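
If you prefer to script the role assignments, the equivalent Azure CLI calls look roughly like this. This is a sketch: the object id and service connection name are placeholders, and whatever identity runs it (an admin service connection as shown here, or you in Cloud Shell) needs permission to create role assignments, e.g. Owner or User Access Administrator on the resource group:

- task: AzureCLI@2
  inputs:
    azureSubscription: '<admin service connection>'   # placeholder; must have rights to assign roles
    scriptType: 'bash'
    scriptLocation: 'inlineScript'
    inlineScript: |
      SP_OBJECT_ID="<pipeline service principal object id>"   # placeholder
      SCOPE=$(az storage account show --name tfstatetest1 --resource-group manualtest --query id -o tsv)
      for ROLE in "Storage Blob Data Contributor" "Storage Blob Data Reader"; do
        az role assignment create \
          --assignee-object-id "$SP_OBJECT_ID" \
          --assignee-principal-type ServicePrincipal \
          --role "$ROLE" \
          --scope "$SCOPE"
      done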


4. State Migration Issues

Issue: Migrating state from local to Azure Storage caused initialization errors.

Resolution:

  1. Used the -reconfigure flag with terraform init (see the snippet after this list).

  2. Verified the backend configuration and re-initialized the state.
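
In pipeline terms, that amounts to temporarily swapping the init task's commandOptions from -migrate-state to -reconfigure, which tells Terraform to reconfigure the backend without attempting to migrate existing state. For example:

- task: TerraformTaskV4@4
  inputs:
    provider: 'azurerm'
    command: 'init'
    commandOptions: '-reconfigure'
    workingDirectory: '$(System.DefaultWorkingDirectory)/azureinfra'
    backendServiceArm: 'Free Trial(f7d7399c-da85-4d03-b12c-d6608ce5f808)'
    backendAzureRmResourceGroupName: 'manualtest'
    backendAzureRmStorageAccountName: 'tfstatetest1'
    backendAzureRmContainerName: 'tfstatetest'
    backendAzureRmKey: 'test.terraform.tfstate'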


Best Practices

  1. Use Remote State Management: Always use a remote backend for production environments to avoid state conflicts.

  2. Debug Early: Add steps to verify the presence of necessary files and directories.

  3. Secure Service Connections: Use Azure DevOps service connections to securely manage authentication.

  4. Automate Artifact Archival: Archive Terraform plans to maintain a record of infrastructure changes.

Conclusion

This Azure DevOps pipeline simplifies Terraform-based infrastructure deployment while adhering to best practices like remote state management and validation. By addressing common challenges, we ensure a seamless experience for both developers and operators.

Would you like to see further optimizations or integrations in such pipelines? Let me know in the comments!