I have a simple Terraform configuration that creates a storage account. When I plan and apply it from my personal VM, everything is fine and stays in sync.
When I automate the same process with GitHub Actions, the very first plan says it will destroy the storage account and all of the child module resources. The workflow runs on a runner, which is a different machine from my VM.
My guess is that because it is a different machine, Terraform wants to destroy the resource first, but I am not sure. I don't want it destroyed. Is there a way to use terraform import to first import the resource on the runner so that it doesn't get destroyed?
Can I somehow avoid this scenario? I can provide the files as well.
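For example, is the idea to run something like this on the runner before the first plan? (the storage account name, subscription ID and resource group below are placeholders I made up):

terraform import 'module.storageAccount["examplestorage"].azurerm_storage_account.storageAccount' /subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/examplestorage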
The solution works after adding the backend configuration, as Marko suggested in the comments.

main.tf:
terraform {
  required_providers {
    azurerm = {
      source  = "hashicorp/azurerm"
      version = "~> 3.0"
    }
  }

  backend "azurerm" {
    resource_group_name  = "resource-group"
    storage_account_name = "xxxxxxx"
    container_name       = "terraform-state"
    key                  = "terraform.tfstate"
  }
}

provider "azurerm" {
  features {}
  skip_provider_registration = true
}
# ############################ Storage Account ###########################
locals {
  storageAccount = fileexists("${var.storageAccountfilepath}/storageAccount.json") ? jsondecode(file("${var.storageAccountfilepath}/storageAccount.json")) : null
}

module "storageAccount" {
  for_each            = { for element in coalesce(local.storageAccount, []) : element.name => element }
  source              = "./storageAccount"
  storageAccountName  = each.value.name
  resource_group_name = var.resource_group_name
  filepath            = "${var.storageAccountfilepath}${each.value.name}"
}
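For context, storageAccount.json is just a list of storage account definitions; the module keys on the name field. An illustrative example (names made up):

[
  { "name": "examplestorage001" },
  { "name": "examplestorage002" }
]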
storageAccount.tf:
############### Storage Account ##########################
resource "azurerm_storage_account" "storageAccount" {
name = var.storageAccountName
resource_group_name = var.resource_group_name
location = var.location
account_tier = "Standard"
account_replication_type = "LRS"
table_encryption_key_type = "Account"
queue_encryption_key_type = "Account"
infrastructure_encryption_enabled = "false"
tags = {
env = "sandbox-test"
owner = "som"
CreatedBy = "soms"
}
blob_properties {
change_feed_enabled = false
container_delete_retention_policy {
days = 7
}
delete_retention_policy {
days = 7
}
}
lifecycle {
prevent_destroy = true
}
}
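The module's variables.tf is not included in the post; a rough sketch of what it declares, inferred from the references above (the location default here is only an example):

variable "storageAccountName" {
  type = string
}

variable "resource_group_name" {
  type = string
}

variable "location" {
  type    = string
  default = "westeurope" # example value only; the real default is not shown in the post
}

variable "filepath" {
  type = string
}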
My GitHub Actions workflow:
name: Terraform Workflow

on:
  push:
    branches:
      - develop
  # pull_request:
  #   branches:
  #     - main

# Special permissions required for OIDC authentication
permissions:
  id-token: write
  contents: read
  pull-requests: write

# Path to the Terraform configuration inside the repository
env:
  TERRAFORM_PATH: './terraform'

jobs:
  terraform-plan:
    name: 'Terraform Plan'
    runs-on: [Windows, NonSOX]
    env:
      # This is needed since we are running Terraform with read-only permissions.
      ARM_SKIP_PROVIDER_REGISTRATION: true
    outputs:
      tfplanExitCode: ${{ steps.tf-plan.outputs.exitcode }}
    steps:
      - name: Checkout
        uses: actions/checkout@v3

      - name: Setup Terraform
        uses: hashicorp/setup-terraform@v2
        with:
          terraform_wrapper: false

      - name: Get secrets from vault
        uses: hashicorp/vault-action@v2
        id: get_secrets
        with:
          exportToken: true
          method: jwt
          path: github-actions-oidc
          url: xxxxxxx
          role: actions-runner-finance
          tlsSkipVerify: true
          secrets: |
            /finance/data/terraform/AZURE_CLIENT_ID AZURE_CLIENT_ID | AZURE_CLIENT_ID ;
            /finance/data/terraform/AZURE_SUBSCRIPTION_ID AZURE_SUBSCRIPTION_ID | AZURE_SUBSCRIPTION_ID ;
            /finance/data/terraform/AZURE_TENANT_ID AZURE_TENANT_ID | AZURE_TENANT_ID ;
            /finance/data/terraform/ARM_CLIENT_SECRET ARM_CLIENT_SECRET | ARM_CLIENT_SECRET ;
      - name: Terraform Init
        run: |
          pushd './${{ env.TERRAFORM_PATH }}'
          terraform init

      - name: Terraform Format
        run: |
          pushd './${{ env.TERRAFORM_PATH }}'
          terraform fmt -check

      - name: Terraform Plan
        id: tf-plan
        env:
          ARM_CLIENT_ID: ${{ steps.get_secrets.outputs.AZURE_CLIENT_ID }}
          ARM_SUBSCRIPTION_ID: ${{ steps.get_secrets.outputs.AZURE_SUBSCRIPTION_ID }}
          ARM_TENANT_ID: ${{ steps.get_secrets.outputs.AZURE_TENANT_ID }}
          ARM_CLIENT_SECRET: ${{ steps.get_secrets.outputs.ARM_CLIENT_SECRET }}
        run: |
          pushd './${{ env.TERRAFORM_PATH }}'
          terraform plan -detailed-exitcode -no-color -out tfplan
          # expose the plan exit code as a step output (read by the job output above)
          echo "exitcode=$LASTEXITCODE" >> $env:GITHUB_OUTPUT
          if ($LASTEXITCODE -eq 1) {
            echo "Terraform Plan Failed!"
            exit 1
          } else {
            exit 0
          }
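For reference, terraform plan -detailed-exitcode exits with 0 when there are no changes, 1 on error, and 2 when the plan contains changes, which is why the exit code is exposed as the tfplanExitCode job output. A follow-up apply job could gate on it roughly like this (only a sketch, assuming the plan job also publishes tfplan as an artifact and that the apply job repeats the same Vault/ARM credential setup as the plan job):

  terraform-apply:
    name: 'Terraform Apply'
    needs: terraform-plan
    # run only when the plan reported pending changes (exit code 2)
    if: needs.terraform-plan.outputs.tfplanExitCode == 2
    runs-on: [Windows, NonSOX]
    steps:
      - name: Checkout
        uses: actions/checkout@v3
      - name: Setup Terraform
        uses: hashicorp/setup-terraform@v2
        with:
          terraform_wrapper: false
      # credential env vars and an artifact download of tfplan are assumed here
      - name: Terraform Apply
        run: |
          pushd './${{ env.TERRAFORM_PATH }}'
          terraform init
          terraform apply -auto-approve tfplan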
The solution, as Marko suggested, was to add the Terraform backend configuration; being a newbie, I had missed that. Without a remote backend, each machine keeps its own local terraform.tfstate, so plans run from different machines disagree about what already exists; pointing both at the shared azurerm backend fixes that. I have added the backend block to main.tf in the original post above.
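One more note for anyone hitting the same thing: if state already exists locally from earlier runs, terraform init detects the newly added backend block and offers to copy the local state into it; the non-interactive form (run from the terraform directory) is:

terraform init -migrate-state

After the migration, both my VM and the GitHub runner read the same state file from the terraform-state container, so neither side plans a destroy.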