In my previous post I gave a guide on how I set up my static website using Hugo and AWS. In this (thankfully) much briefer post, I’ll explain how I set up an automatic build and deployment of my new website content by hooking up CircleCI, a container-based continuous integration provider, to my GitHub repo. This post will focus less on the AWS side of things (I’m so tired of taking screenshots), and more on the automation side of things. It assumes that you’re using GitHub for version control on your static content. If you’re not currently using version control, I highly recommend it, and I’d further recommend using GitHub because of the great integrations (and well, git is awesome).

First steps

Create a CircleCI account

Create an account with CircleCI (free) using your GitHub account, and then add the repository holding your Hugo content via the Add Repo menu option on the left side.

Create an AWS IAM user for your CircleCI build

For security reasons, it’s best to restrict access to users with highly targeted credentials. For example, a micro-service that needs to read from some files in an S3 bucket should have its own dedicated user, with Read-Only permissions. A lot has been written about best practices for permissions, so I won’t get into many of the details. I will say however that since an AWS IAM user may belong to multiple groups, it can be helpful to create groups based on different levels of access for different resources, and assign a user to the groups accordingly. A useful tool, regardless of the AWS resource you’re trying to lock down, is the AWS Policy Generator.

In our use case, we should create a CircleCI IAM user (which could have a more specific name, like CircleCI-StaticSite), who will need to be able to run an aws s3 sync command and a CloudFront cache invalidation as part of our automation.

S3 bucket permissions

For your bucket access, I’d recommend restricting use to both the specific bucket, and the specific actions required. The bucket policy I used looks like the one below, with List, Delete, Put, and PutAcl permissions, to allow the AWS cli tool to run the sync command.

{
  "Version": "2012-10-17",
  "Id": "Policy1470665480320",
  "Statement": [
    {
      "Sid": "Stmt1470663567916",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::<your-user-arn-id>"
      },
      "Action": [
        "s3:DeleteObject",
        "s3:PutObjectAcl",
        "s3:PutObject"
      ],
      "Resource": "arn:aws:s3:::<your-bucket>/*"
    },
    {
      "Sid": "Stmt1470665474830",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::<your-user-arn-id>"
      },
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::<your-bucket>"
    }
  ]
}

If your particular use case keeps all files within a specific folder, I would make the permissions even more restrictive.
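For example, if all of your synced content lived under a hypothetical blog/ prefix (the folder name here is just a placeholder), the Resource entries in the statement above could be narrowed accordingly:

```json
{
  "Resource": "arn:aws:s3:::<your-bucket>/blog/*"
}
```

The ListBucket statement would still target the bucket itself, but you could add a Condition on the s3:prefix key to restrict listing to that folder as well.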

CloudFront permissions

Compared to S3, CloudFront currently does not offer resource-level IAM restrictions, so this policy will be more open in comparison. We can, however, limit a user's actions, so we can at least restrict it to cache invalidations only.

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "Stmt1470758342165",
      "Action": [
        "cloudfront:CreateInvalidation",
        "cloudfront:GetInvalidation",
        "cloudfront:ListDistributions"
      ],
      "Effect": "Allow",
      "Resource": "*"
    }
  ]
}

In the policy above we allow listing distributions and creating invalidations, both of which are used by the automation's AWS CLI commands.

Add your automation user’s keys to the Circle project build

If you navigate to your repo's settings in Circle, you'll see a field for AWS Permissions. This is where you should add the Access Key and Secret Key for your automation user, both so they're stored securely and so they're available to your build within Circle.

The Automation

The Makefile

This Makefile is brief and consists of a few distinct portions:

all: Specifies which of the subcommands should be run on make and make all

clean: Removes the generated "public" folder if you're building your Hugo files locally

build: Runs the Hugo build that outputs the necessary static files for your website

sync-s3: As its name suggests, this simply syncs the public folder to our S3 bucket, setting the permissions so that people can view the generated content once it's synced

invalidate-cache: The first thing this command does is enable the CloudFront preview, since CLI support for it isn't yet considered part of the default toolset within the CLI. It also uses a wildcard for everything under "/page/*", since CloudFront charges you per invalidation path (but don't worry, only after the first 1,000 a month), and a wildcard invalidation counts the same as a single, full-path invalidation
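Putting those targets together, a minimal sketch of the Makefile might look something like the following. The bucket name and distribution ID are placeholders, and the exact Hugo invocation and invalidation paths will depend on your site:

```makefile
all: build sync-s3 invalidate-cache

clean:
	rm -rf public

build:
	hugo

sync-s3:
	aws s3 sync public/ s3://<your-bucket>/ --acl public-read --delete

invalidate-cache:
	aws configure set preview.cloudfront true
	aws cloudfront create-invalidation --distribution-id <your-distribution-id> --paths "/index.html" "/page/*"
```

The --acl public-read flag is why the IAM policy above needs s3:PutObjectAcl in addition to s3:PutObject.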

The Circle.yml

This file is fairly simple too, but I'll explain it anyway in case you're not familiar with Circle's configuration settings.

machine/environment: This section specifies which environment variables we want to make available to our container

dependencies/cache_directories: Here we specify which locations we want to cache for future builds. In this case we're caching our Hugo install location

dependencies/override: Circle tries to infer which tools and commands may be needed to run your build, and override is where you specify any dependencies it wouldn't be able to infer. In this case we check our cached directory to see if Hugo is present, and if not, we download the Hugo version specified by the environment variable, extract it, and move it to the install directory

test/override: Circle is of the opinion (which I share) that the base case of no tests is a failing case. In order to have a successful build, I overrode the tests with a simple echo, given that there's not much code to test in a generated static website. That being said, you could do something clever here, like looking for all non-draft pages and ensuring the count in the "public" folder matches your expected number of files

deployment/aws/branch: Here you can specify which branches should trigger the deployment on new code pushes. Personally, I only want to run updates when the "master" branch is changed

deployment/aws/commands: This is where you tell Circle which commands it should run on its deployments. Since we defined make all to run build, sync-s3, and invalidate-cache, that's all we'll need.
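The sections above can be sketched as a circle.yml like the one below. The Hugo version, cache path, and download URL are placeholders (the exact release URL and archive layout depend on the Hugo version you pin), so adjust them for your setup:

```yaml
machine:
  environment:
    HUGO_VERSION: "0.17"

dependencies:
  cache_directories:
    - "~/hugo"
  override:
    # Download Hugo only if it isn't already in the cached directory
    - if [ ! -x ~/hugo/hugo ]; then
        mkdir -p ~/hugo &&
        wget -O /tmp/hugo.tar.gz "https://github.com/spf13/hugo/releases/download/v${HUGO_VERSION}/hugo_${HUGO_VERSION}_linux_amd64.tar.gz" &&
        tar -xzf /tmp/hugo.tar.gz -C ~/hugo;
      fi

test:
  override:
    - echo "No tests for a generated static site"

deployment:
  aws:
    branch: master
    commands:
      - make all
```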



Conclusion