GitHub Actions - Workflow Patterns
Pipelines for cross-compiling Go programs, deploying AWS infrastructure with Terraform, and more...
Introduction
I’m late to GitHub Actions, but I’m a total convert.
Over the last seven days, I’ve worked on two projects that have benefited from using them. I’m documenting the general patterns here for my own reference, but if you find them useful, feel free to use them.
Cross Compiling Go to OSX & Windows
I’m working on a project that adds extra functionality to an SQLite database. Go is good at cross-compiling code for other operating systems and architectures, as long as you avoid cgo-enabled packages. My SQLite extensions project depends on a cgo-enabled package, and I’ve tried various ways to cross-compile it without success.
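As a rough illustration of the difference (these commands are illustrative, not taken from the project):

# Pure-Go code cross-compiles with nothing more than environment variables.
GOOS=windows GOARCH=amd64 go build ./...
GOOS=darwin GOARCH=amd64 go build ./...

# A cgo dependency changes that: it needs CGO_ENABLED=1 plus a C cross-toolchain
# for each target (the mingw compiler below is just an example), which is why the
# workflows that follow build natively on macOS and Windows runners instead.
CGO_ENABLED=1 GOOS=windows GOARCH=amd64 CC=x86_64-w64-mingw32-gcc go build ./...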
The GitHub Actions workflow below uses a macos-11 runner to build the OSX binary. I have another workflow that uses the windows-latest runner to build the Windows binary.
My local development machine runs Ubuntu, so I can build the Linux version locally. When I push the code up to GitHub, the two workflows run, and within a minute I have Windows and OSX versions of my project.
name: Build OSX Version

on:
  push:
    branches: [ "main" ]
  pull_request:
    branches: [ "main" ]

jobs:
  build:
    runs-on: macos-11
    steps:
      - uses: actions/checkout@v3
      - name: 'Set up Go'
        uses: actions/setup-go@v3
        with:
          go-version: 1.19
      - name: 'Install Go Packages'
        run: |
          go get -u go.riyazali.net/sqlite
          go get -u github.com/leekchan/accounting
      - name: 'Build'
        run: go build -v -buildmode=c-shared -o s0-sqlite-extensions-osx-amd64.so ./...
      - name: 'Upload Artifact'
        uses: actions/upload-artifact@v3
        with:
          name: osx-binary
          path: s0-sqlite-extensions-osx-amd64.so
          retention-days: 2
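For completeness, the windows-latest workflow follows the same shape. The version below is a sketch rather than the exact file from the repository, and the output filename is my assumption (Windows shared libraries normally use a .dll extension):

name: Build Windows Version

on:
  push:
    branches: [ "main" ]

jobs:
  build:
    runs-on: windows-latest
    steps:
      - uses: actions/checkout@v3
      - name: 'Set up Go'
        uses: actions/setup-go@v3
        with:
          go-version: 1.19
      - name: 'Install Go Packages'
        run: |
          go get -u go.riyazali.net/sqlite
          go get -u github.com/leekchan/accounting
      - name: 'Build'
        # Output name is an assumption; the real workflow may use a different one.
        run: go build -v -buildmode=c-shared -o s0-sqlite-extensions-windows-amd64.dll ./...
      - name: 'Upload Artifact'
        uses: actions/upload-artifact@v3
        with:
          name: windows-binary
          path: s0-sqlite-extensions-windows-amd64.dll
          retention-days: 2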
Deploying AWS Infrastructure with Terraform
The second project involved working on some shared Terraform code with a third party. As we were both contributing to the same repository, we set up GitHub Actions to deploy the infrastructure using the Terraform actions.
Before we started, I migrated the state file from my local machine to an S3 bucket. I added the backend block below to my main Terraform file and then re-initialised the project.
terraform {
  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~> 4.42.0"
    }
  }

  required_version = ">= 1.3.4"

  backend "s3" {
    bucket = "projectx-terraform-state-files"
    key    = "prod/dns/terraform.tfstate"
    region = "eu-west-2"
  }
}
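With the backend block in place, re-initialising moves the existing local state up to the bucket. This is a sketch of that one-off step; in Terraform 1.x the -migrate-state flag copies the current state to the newly configured backend:

# Run once, locally, after adding the backend "s3" block.
# Assumes the bucket already exists and AWS credentials are available.
terraform init -migrate-state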
With the state file safely stored in an S3 bucket, I introduced the workflow below. It runs when code is pushed to the main or development branch.
The workflow sets up the AWS credentials by pulling them from GitHub’s repository secrets store. It then checks out the code, initialises Terraform, and runs the plan command to see what needs to change.
The final step runs only if the checked-out code is from the main branch: it applies the changes to the AWS infrastructure.
name: Terraform Infrastructure Change Management Pipeline

on:
  push:
    branches: [ "main", "development" ]

permissions:
  id-token: write
  contents: read

jobs:
  build:
    runs-on: ubuntu-latest
    defaults:
      run:
        shell: bash
        working-directory: ./production
    steps:
      - name: 'Configure AWS Credentials'
        uses: aws-actions/configure-aws-credentials@v2
        with:
          aws-access-key-id: ${{ secrets.AWS_PRODUCTION_ACCESS_KEY }}
          aws-secret-access-key: ${{ secrets.AWS_PRODUCTION_ACCESS_SECRET }}
          aws-region: eu-west-2
      - name: 'Checkout'
        uses: actions/checkout@v3
      - name: 'Setup Terraform'
        uses: hashicorp/setup-terraform@v2
        with:
          terraform_version: 1.4.6
      - name: 'Terraform Init'
        run: terraform init
      - name: 'Terraform Plan'
        run: terraform plan -input=false
        continue-on-error: true
      - name: 'Terraform Apply'
        if: github.ref == 'refs/heads/main' && github.event_name == 'push'
        run: terraform apply -auto-approve -input=false
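The two AWS secrets referenced above need to exist in the repository before the workflow can run. One way to create them is with the GitHub CLI; the values shown here are placeholders:

gh secret set AWS_PRODUCTION_ACCESS_KEY --body "AKIA..."
gh secret set AWS_PRODUCTION_ACCESS_SECRET --body "wJalr..."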
Compiling and Deploying a Go Lambda Function
Most of my web applications use the AWS serverless technology known as Lambda.
The GitHub workflow below shows the steps I take to compile my Go Lambda function and deploy it to AWS. The workflow also compresses the Go binary with UPX to keep the deployment package as small as possible and reduce cold start times. After compressing the binary, it zips the file before finally pushing it to AWS.
name: Compile & Deploy Production

on:
  push:
    branches: [ "main" ]
  pull_request:
    branches: [ "main" ]

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: 'Set up Go'
        uses: actions/setup-go@v3
        with:
          go-version: 1.19
      - name: 'Install Go Packages'
        run: go mod tidy
      - name: 'Compile Go Binary'
        run: GOARCH=arm64 GOOS=linux go build -ldflags="-w -s" -o bootstrap main.go state.go templates.go
      - name: 'Compress Go Binary with UPX'
        uses: svenstaro/upx-action@v2
        with:
          files: |
            bootstrap
          args: -9
          strip: false
      - name: 'ZIP Binary'
        uses: montudor/action-zip@v0.1.0
        with:
          args: zip -qq -r ./main.zip . -i bootstrap
      - name: 'Deploy to AWS Production Account'
        uses: appleboy/lambda-action@master
        with:
          aws_access_key_id: ${{ secrets.AWS_PRODUCTION_ACCESS_KEY }}
          aws_secret_access_key: ${{ secrets.AWS_PRODUCTION_ACCESS_SECRET }}
          aws_region: eu-west-2
          function_name: arn:aws:lambda:eu-west-2:561675481801:function:prod-central-lpres-viewer-default
          zip_file: main.zip
          publish: true
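The binary is named bootstrap because the function presumably runs on Lambda's provided.al2 custom runtime on arm64, which looks for an executable with that name. The handler itself is the usual aws-lambda-go shape; this is a minimal sketch, not the project's actual code:

// main.go - a minimal sketch of the handler shape this workflow assumes;
// the real main.go, state.go and templates.go are not shown in the post.
package main

import (
	"context"

	"github.com/aws/aws-lambda-go/events"
	"github.com/aws/aws-lambda-go/lambda"
)

// handler is a placeholder; the real function's behaviour isn't covered here.
func handler(ctx context.Context, req events.APIGatewayV2HTTPRequest) (events.APIGatewayV2HTTPResponse, error) {
	return events.APIGatewayV2HTTPResponse{
		StatusCode: 200,
		Headers:    map[string]string{"Content-Type": "text/html"},
		Body:       "<h1>Hello from Lambda</h1>",
	}, nil
}

func main() {
	// lambda.Start wires the handler into the custom runtime's bootstrap loop.
	lambda.Start(handler)
}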