I recently found myself creating a CI/CD pipeline for a three-page static HTML website, and to be honest, despite how long it took to set up, I think it was worth it! In this post I write about my journey and reflections.

Breaking point

Last week I started working on our website as we relaunch to offer only a single service, Software Development and CTO-on-Demand (moodio if you're looking!), and do some rebranding. During a period of particularly frequent updates as I fixed spelling, copy, and small CSS errors, I started getting agitated with having to repeat a series of command-line commands ((↑ ↑ ↑) × 3), followed by another series of command-line inputs in my Linux shell ((↑ ↑) × 2).

Now, five commands might not seem like a lot, but done repeatedly, with different steps sometimes taking a couple of minutes, it gave me a lot of time to reflect. I took a long hard look at myself and I was disappointed. I felt like a philistine: manually calling my gulp tasks to package everything minified, calling my Docker build commands without even bothering to change the version tag anymore (do you know how many left-arrow presses it even takes to get to the tag?), and pushing the image. Following this I'd open my Kubernetes shell to delete and recreate the deployment. Since I didn't properly tag the Docker images, I couldn't just change the image name to the new tag, unfortunately.

I decided enough was enough! I knew I had to make a change, so I decided to practice what I preach. I would automate the whole thing, I would version control it, and as soon as I checked in any changes I would expect to see them online (in production; it's still a static website, so I'm obviously not going to have a dev environment for it).

And so began my journey to do what any software developer with no plans for a Sunday evening would do: I decided to spend the next couple of hours putting a system in place that would help save me minutes! Using Docker, git, Visual Studio Online and a Kubernetes cluster I already had up, I would create a CI/CD pipeline for my static three-page HTML website.

The Pipeline

For HTML development, I have a pretty simple template built on Gulp and SCSS that includes most things I need and a lot of CSS prewritten. It includes a lot of SCSS mixins and functions to help with cross-browser compatibility, and a list of variables I can change to set most of the styling in a document, such as sizes, fonts, and spacing. For development it's great: it gives me all the basics I need to quickly start a project and something to build up from.

Step 1: .git

The first step was to set up a git repo. I use Visual Studio Online, so I logged in, set up a new project, and did the initial commit.

Step 2: \dist

As a CI/CD pipeline can't involve any manual work once something is committed, the next step was to set up gulp tasks for all the work I was previously doing manually. I set up a group of gulp tasks for this: the group compresses all the images, minifies and concatenates all the JavaScript into a single file, replaces all the relative script tags with the new concatenated version, minifies and cleans the HTML and CSS, and pushes everything into a new folder named dist. Once this was done I had a folder with just the files I needed, everything minified and either concatenated or compressed.

For anyone interested, I've included the full gulpfile in the appendix; however, you'll need to visit the individual plugins' pages to see how they work.

Step 3: docker build, docker push

Now that I had a production-ready website that PageSpeed Insights would be happy with, the next step was putting it all in a Docker image. To do this I set up a simple dockerfile which copies the dist folder over.

FROM nginx:alpine

WORKDIR /usr/share/nginx/html

COPY dist .

Then I'd run a build command on it, tag the image appropriately, and push it to my private repo.
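In command form, that step looks roughly like this (the registry and image names here are illustrative, not my actual private repo):

```shell
# Tag with the registry path so the push goes to the private repo
docker build -t myregistry.example.com/website:latest .
docker push myregistry.example.com/website:latest
```
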

The dockerfile didn't require any changes from what I previously had.

Step 4: Kubectl apply

For the deployment, I am hosting it on a Kubernetes cluster I already have running, so I quickly threw together a service and deployment config YAML file.

I created a load-balanced service that connects to a deployment consisting of 2 replicas of the Docker image I had created previously. Did I need 2 pods? No, but it was load balanced, and load balancing to a single pod felt stupid.
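The YAML looked roughly like this (a sketch: the names and image path are illustrative, not the exact file):

```yaml
# Hypothetical names; the real file points at my private registry.
apiVersion: v1
kind: Service
metadata:
  name: website
spec:
  type: LoadBalancer         # gives the site a public IP
  ports:
  - port: 80
    targetPort: 80
  selector:
    app: website
---
apiVersion: apps/v1
kind: Deployment
metadata:
  name: website
spec:
  replicas: 2                # overkill, but see above
  selector:
    matchLabels:
      app: website
  template:
    metadata:
      labels:
        app: website
    spec:
      containers:
      - name: website
        image: myregistry.example.com/website:latest
        ports:
        - containerPort: 80
```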

Now, once I had updated and pushed the Docker image, I would simply change the YAML file to update the tag to the latest version, or use a kubectl set image command to do it from the command line. While that's true in theory, it would require me to tag correctly, and as I mentioned earlier I wasn't bothered; as far as I was concerned when doing it manually, everything was tagged :latest, and only :latest. So instead I deleted the deployment, then reapplied the YAML file. Importantly, I only deleted the deployment and not the service, so I would keep my public load-balanced IP and avoid having it released back.
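In shell terms, the manual redeploy amounted to something like this (deployment and file names are illustrative):

```shell
# Delete only the deployment; the service (and its external IP) stays up
kubectl delete deployment website

# Reapply the config; the unchanged service spec is a no-op,
# the deployment is recreated and pulls the new :latest image
kubectl apply -f website.yaml
```
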

Step 5: Putting it all together in Visual Studio Online

Now the final piece was to put it all together! To do this I used Visual Studio Online, where, as I previously mentioned, I had set up the git repository as well.

First I set up a new build definition. It was relatively simple.

The overall definition should look like this; Visual Studio Online includes templates for npm and gulp to add those two tasks.

Luckily this was relatively straightforward, and as Visual Studio would take care of iterating the version number, I could now finally begin to directly modify the image using the correct version number! To do this, in the final task, I connect to the Kubernetes cluster and simply run a kubectl set image command. To connect Visual Studio Online to your Kubernetes cluster, you simply copy the config file from your cluster (~/.kube/config) to Visual Studio Online and give it the cluster URL.
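The final task then boils down to one command; a sketch using the VSTS build ID variable as the image tag (the deployment, container, and registry names here are illustrative):

```shell
# $(Build.BuildId) is expanded by VSTS at build time, so every deploy
# gets a properly versioned image instead of :latest
kubectl set image deployment/website website=myregistry.example.com/website:$(Build.BuildId)
```
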

And once it was all correctly configured and I had worked out all the kinks, I finally had my CI/CD pipeline for my static HTML website! No longer do I have to repeatedly press ↑ ↑ ↑ to recall my previous commands. I now simply check in my changes, and about a minute later I get an email telling me the build completed successfully and the updated website is online.

Reflection

While spending a couple of hours to configure something that would only save me a few minutes a day might seem like overkill, I honestly don't care. I enjoyed putting it together, and smile a little every time I get an email saying that the build succeeded. Even my fiancée was impressed after I spent an hour on the phone explaining to her exactly what I did, with a brief background on CI/CD, DevOps, Jenkins, Agile, Lean Start-ups, and some other high-level concepts. She assured me she listened to everything I said and was very impressed; she only left the phone on speaker so that she could better take in everything I was saying.

As for whether it will pay off, I think in the long run it will. I don't do a lot of websites, but at least I now have a CI/CD template I can quickly deploy for any new HTML project I start, and it can always be extended for other, non-static frameworks.

Pitfalls

A few pitfalls I came across, because to be honest I had many failed attempts before it all finally worked together.

The hosted VSTS build agents DO NOT have Docker engines installed! Make sure to either use your own build agent or set up a connection to an engine. The link for the Docker-based VSTS agent is https://hub.docker.com/r/microsoft/vsts-agent/.

The repository name should be all lowercase! The Docker task by default uses the repository name as the image name, and the engine will throw an error if you use capital letters.

Scroll all the way down when copying the config file! I spent hours trying to troubleshoot why it wouldn't connect to Kubernetes, only to realise I hadn't copied the config file completely and had missed the certificate key data field.

When copying over the config file, make sure to remove all line breaks from the certificates; otherwise they will cause a base64 decoding error in the build.

Appendix

Gulp Tasks

For the gulp tasks, create the gulpfile.js file, and add the dependencies from the package.json section below into the devDependencies section of your package.json (if you don't already have a package.json file, run npm init; if you don't have Node Package Manager (npm) installed, download it first).
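Alternatively, rather than editing package.json by hand, the same devDependencies can be installed from the command line (npm will pick newer versions than those listed below):

```shell
npm install --save-dev gulp gulp-sass gulp-strip-css-comments gulp-remove-empty-lines \
  gulp-minify gulp-clean-css gulp-concat gulp-imagemin gulp-html-replace \
  gulp-htmlmin browser-sync
```
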

To use it, open a new command prompt/shell, navigate to the folder the gulpfile.js is in, and type:

gulp sass

gulp

gulpfile.js

/*
 * Place <!-- build:css --> <!-- endbuild -->
 * around CSS link tags, to replace with cleaned/minified CSS,
 * and <!-- build:js --> <!-- endbuild --> around JS tags.
 * e.g.:
 * <!-- build:js -->
 * <script src="scripts/slider.js" async></script>
 * <script src="scripts/hammer.min.js" async></script>
 * <!-- endbuild -->
 * will replace both script tags with
 * <script src="scripts/site.min.js" async></script>
 */
'use strict';

var gulp = require('gulp');
var sass = require('gulp-sass');
var stripCssComments = require('gulp-strip-css-comments');
var removeEmptyLines = require('gulp-remove-empty-lines');
var minify = require('gulp-minify');
var cleanCSS = require('gulp-clean-css');
var concat = require('gulp-concat');
var imagemin = require('gulp-imagemin');
var htmlreplace = require('gulp-html-replace');
var htmlmin = require('gulp-htmlmin');
var browserSync = require('browser-sync').create();

// Development server with live reload on SCSS/HTML changes
gulp.task('default', function() {
    browserSync.init({
        server: {
            baseDir: ""
        },
        cors: true
    });

    gulp.watch('sass/**/*.scss', ['sass']);
    gulp.watch('**/*.html').on('change', browserSync.reload);
    //gulp.watch('Scripts/*.js', ['minify']);
});

// Compile, clean and minify SCSS for development
gulp.task('sass', function() {
    return gulp.src('sass/main.scss')
        .pipe(sass().on('error', sass.logError))
        .pipe(stripCssComments())
        .pipe(removeEmptyLines())
        .pipe(cleanCSS())
        .pipe(gulp.dest('CSS'))
        .pipe(browserSync.stream());
});

gulp.task('browser-sync', function() {
    browserSync.init({
        server: {
            baseDir: ""
        }
    });
});

// Minify and concatenate scripts for development
gulp.task('minify', function() {
    return gulp.src('scripts/*.js')
        .pipe(minify({
            ext: {
                src: '.js',
                min: '.min.js'
            },
            ignoreFiles: ['.min.js'],
            noSource: true
        }))
        .pipe(concat('site.js'))
        .pipe(gulp.dest('Scripts/dist'));
});

gulp.task('imagemin', function() {
    return gulp.src('media/**/*')
        .pipe(imagemin())
        .pipe(gulp.dest('media/dist'));
});

/* dockerbuild tasks: build everything into dist/ for the Docker image */
gulp.task('dockerbuild', ['docker-minify', 'docker-sass', 'docker-imagemin', 'docker-copy']);

gulp.task('docker-minify', function() {
    return gulp.src('scripts/*.js')
        .pipe(minify({
            ext: {
                src: '.js',
                min: '.min.js'
            },
            ignoreFiles: ['.min.js'],
            noSource: true
        }))
        .pipe(concat('site.min.js'))
        .pipe(gulp.dest('dist/scripts'));
});

gulp.task('docker-sass', function() {
    return gulp.src('sass/main.scss')
        .pipe(sass().on('error', sass.logError))
        .pipe(stripCssComments())
        .pipe(removeEmptyLines())
        .pipe(cleanCSS())
        .pipe(gulp.dest('dist/css'))
        .pipe(browserSync.stream());
});

gulp.task('docker-imagemin', function() {
    return gulp.src('media/**/*')
        .pipe(imagemin())
        .pipe(gulp.dest('dist/media'));
});

// Rewrite the build:css/build:js blocks, minify the HTML and copy it to dist/
gulp.task('docker-copy', function() {
    return gulp.src('*.html')
        .pipe(htmlreplace({
            'css': 'css/main.css',
            'js': { src: 'scripts/site.min.js', tpl: '<script src="%s" async></script>' }
        }))
        .pipe(htmlmin({ collapseWhitespace: true }))
        .pipe(gulp.dest('dist/'));
});

package.json devDependencies

"devDependencies": {
    "browser-sync": "^2.17.5",
    "gulp": "^3.9.1",
    "gulp-clean-css": "^2.3.2",
    "gulp-concat": "^2.6.1",
    "gulp-html-replace": "^1.6.2",
    "gulp-htmlmin": "^3.0.0",
    "gulp-imagemin": "^3.1.1",
    "gulp-minify": "0.0.14",
    "gulp-remove-empty-lines": "0.0.8",
    "gulp-sass": "^2.3.2",
    "gulp-strip-css-comments": "^1.2.0"
}

Helpful links