Saturday, October 22, 2022

Managing SECRETS with TERRAFORM - Azure, GCP, AWS

 1.  Environment Variables

 

Keep PLAIN TEXT secrets out of your code using Env Variables.

 

A.      Declare variables in your .tf configuration files, marking them sensitive:

variable "password" {

        sensitive   = true

        type        = string

        description = "The password for the DB master user"

}

B.      Now pass this variable to the Terraform resource blocks that need those secrets:

resource "xyz" "abc" {

        attr1    = val1

        attrn    = valn

        password = var.password

}

C.      Now use export with the TF_VAR_ prefix to set the environment variable in any Linux shell:

export TF_VAR_password=rootroot

D.     Now, run $ terraform apply

E.      NOTE: We can use CLI tools like pass or 1Password with this technique.

F.       -: Everyone using our code needs to set these variables first

G.     +: Easy; keeps plain-text secrets out of code and VCS; test friendly
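As a minimal, runnable sketch of the technique above (the value and the pass entry name are placeholders, not real secrets):

```shell
# Terraform maps any environment variable named TF_VAR_<name> onto
# variable "<name>" declared in the configuration.
export TF_VAR_password='example-placeholder'   # placeholder value

# A secret-manager CLI can supply the value instead, e.g. with pass
# (commented out; assumes a hypothetical "db/master" entry exists):
# export TF_VAR_password="$(pass show db/master)"

printenv TF_VAR_password
```

A terraform plan or apply run in the same shell then picks the value up without prompting for it.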

 

 

2.  ENCRYPTED Files (e.g., KMS, PGP, SOPS)

The secrets are encrypted and the ciphertext is stored in a file, which we check into VCS. The remaining problem is storing the KEY FILE. We store the key file with our cloud provider (AWS, GCP bucket, Azure Blob Storage, etc.), where the cloud automatically encrypts it and lets us control user access to it.
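A minimal sketch of the encrypt-then-commit idea, using openssl with a passphrase standing in for the cloud-held key (real setups use SOPS with a KMS or PGP key; all file names and the passphrase here are illustrative):

```shell
# Encrypt the plain-text vars file; only the .enc ciphertext is
# committed to VCS. The passphrase stands in for the cloud-held key.
echo 'db_password = "example"' > secrets.auto.tfvars
openssl enc -aes-256-cbc -pbkdf2 -pass pass:demo-key \
  -in secrets.auto.tfvars -out secrets.auto.tfvars.enc
rm secrets.auto.tfvars

# Decrypt just before running terraform:
openssl enc -d -aes-256-cbc -pbkdf2 -pass pass:demo-key \
  -in secrets.auto.tfvars.enc -out secrets.auto.tfvars
cat secrets.auto.tfvars
```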

A.      +: Secrets are out of our VCS and code; secrets are encrypted, versioned, and packaged in the cloud; no extra scripts required.

B.      -: Each cloud has a different way of doing it; requires running lots of commands in the cloud terminal; logs can tell that a key was used but not what it decrypted; incurs cost; not test friendly.

C.      TF with different cloud providers' encrypted storage.

                                           

                                           I.            TF with AZURE

 

First, create these resources in Azure using the Azure CLI:

# Create resource group

az group create --name $RESOURCE_GROUP_NAME --location eastus

 

# Create storage account

az storage account create --resource-group $RESOURCE_GROUP_NAME --name $STORAGE_ACCOUNT_NAME --sku Standard_LRS --encryption-services blob

 

# Create blob container (storage account names must be globally unique; do not allow public access to state)

az storage container create --name $CONTAINER_NAME --account-name $STORAGE_ACCOUNT_NAME

 

Get the STORAGE ACCESS KEY into a variable:

ACCOUNT_KEY=$(az storage account keys list --resource-group $RESOURCE_GROUP_NAME --account-name $STORAGE_ACCOUNT_NAME --query '[0].value' -o tsv)

 

Declare an environment variable holding that key:

export ARM_ACCESS_KEY=$ACCOUNT_KEY

 

Terraform code:

terraform {

  required_providers {

    azurerm = {

      source  = "hashicorp/azurerm"

      version = "=2.46.0"

    }

  }

    backend "azurerm" {

        resource_group_name  = "-----------"

        storage_account_name = "-----------"

        container_name       = "----------"

        key                  = "terraform.tfstate"

    }

 

}

 

provider "azurerm" {

  features {}

}

 

resource "azurerm_resource_group" "state-demo-secure" {

  name     = "state-demo"

  location = "eastus"

}

Now, run $ terraform init and then $ terraform apply

 

                                        II.            TF with GCP

 

 

Make sure you have the necessary Cloud Storage permissions on your user account:

·         storage.buckets.create

·         storage.buckets.list

·         storage.objects.get

·         storage.objects.create

·         storage.objects.delete

·         storage.objects.update

In Cloud Shell, type: gcloud services enable [service name]
            e.g.: gcloud services enable storage.googleapis.com

In your Terraform config file, add the GCS bucket you want to store your state file in:

resource "random_id" "bucket_prefix" {
  byte_length = 8
}
 
resource "google_storage_bucket" "default" {
  name          = "${random_id.bucket_prefix.hex}-bucket-tfstate"
  force_destroy = false
  location      = "US"
  storage_class = "STANDARD"
  versioning {
    enabled = true
  }
}
               

Now, add this block to configure your GCS bucket as the backend:

terraform {
 backend "gcs" {
   bucket  = "BUCKET_NAME" 
# google_storage_bucket.default.name
   prefix  = "terraform/state"
 }
}

Run $ terraform init and then $ terraform apply

 

 

                                      III.            TF with AWS

 

In your Terraform config file, add this code:

terraform {

        backend "s3" {

                        bucket = "unique-bucket-name"

                        key = "state-file-name"

                        region = "ap-southeast-2"

                        encrypt = true

                        dynamodb_table = "terraform-lock"

        }

}
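The backend block only points at the bucket and lock table; both must exist before terraform init runs. A sketch of creating them (the settings shown are assumptions, with placeholder names matching the backend block):

```hcl
# Sketch: the S3 bucket holding the state and the DynamoDB table used
# for state locking. DynamoDB locking requires a string hash key
# named "LockID".
resource "aws_s3_bucket" "tf_state" {
  bucket = "unique-bucket-name"
}

resource "aws_dynamodb_table" "tf_lock" {
  name         = "terraform-lock"
  billing_mode = "PAY_PER_REQUEST"
  hash_key     = "LockID"

  attribute {
    name = "LockID"
    type = "S"
  }
}
```

Note these resources are typically created in a separate bootstrap configuration (or manually), since the backend cannot store state in a bucket it has not created yet.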

 

3.  Secret Stores (Vault, AWS Secrets Manager, GCP Secret Manager)

A.      +: Keeps plain-text secrets out of VCS and code; secrets are stored in a dedicated secret store with encryption and strict access control; everything is defined in code with no extra scripts; good web UI experience; secrets rotation is supported, meaning secrets are automatically changed after a defined time interval; we get detailed audit logs.

B.      -: Secrets are not versioned, packaged, and tested together with the code, which can cause errors; expensive for large teams; not test friendly.
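To illustrate the technique, a hedged sketch of reading a secret from AWS Secrets Manager at plan/apply time via a data source (the secret name "prod/db/master" and the resource types "xyz"/"abc" are placeholders, not real names):

```hcl
# Fetch the current version of a secret from AWS Secrets Manager
# instead of storing its value in code or VCS.
data "aws_secretsmanager_secret_version" "db" {
  secret_id = "prod/db/master" # assumed secret name
}

resource "xyz" "abc" {
  password = data.aws_secretsmanager_secret_version.db.secret_string
}
```

The secret value still ends up in the Terraform state file, so the state backend itself should be encrypted as described in section 2.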

 


