UrlDiff - Simple visual regression testing

A few days ago, a small CSS change in the Smartphone Section caused the 3D-Printers Section to go haywire in Chrome. I did not notice before pushing it to the production server: for one, because I use Firefox when I am coding, and second, because I was focusing on the Smartphone Section.



To prevent this in the future, I decided it's time for automated visual regression testing!

I took a look at diff.io, but that would cost $200/month and would still be somewhat limited at 258 daily page comparisons. If you have 50 pages in your test suite and have already run 5 tests today, then what? I also took a look at Ghost Inspector. While it has a lot of nice functionality, it failed on the Product Chart pages for some reason.

Existing self-hosted solutions like this one based on wraith come with a complex set of dependencies, and they run in two passes: first they render two sets of screenshots and write them to disk, then they compare them and report the number of differences.

Thinking about it, I decided that my favorite solution would be a simple shell script that visually compares every page of my development server with the corresponding page on the production server. One by one, without hitting the disk at all, and halting as soon as a difference is detected.

Not long ago, I read about cutycapt, a command line tool that renders websites via WebKit. Could it be used to compare two versions without much overhead?



Installation is easy:

apt-get install cutycapt

Now let's pass the output of two cutycapt calls to cmp:

$ cmp <(cutycapt --out-format=bmp --out=/dev/stdout --url=http://server1.com) <(cutycapt --out-format=bmp --out=/dev/stdout --url=http://server2.com)

That outputs:

/dev/fd/63 /dev/fd/62 differ: byte 3, line 1

Wow, that's nice! Visual comparison from the shell with just one dependency and no temp files.
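Those /dev/fd paths in the output come from bash process substitution: each <(...) expression is replaced by a file descriptor path that cmp can read like a regular file. A quick way to see this for yourself, with no cutycapt involved:

```shell
# Each <(...) becomes a /dev/fd/NN path pointing at that command's output
echo <(true) <(true)
# prints something like: /dev/fd/63 /dev/fd/62

# cmp reads those paths like ordinary files, so identical streams compare equal
cmp <(echo same) <(echo same) && echo "no difference"
```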



Time to think up a little config file for the tests. I immediately knew I wanted something simple like this:

urldiff.conf

www.server1.com
www.server2.com
/
/about
/blog
/animals/dogs
/animals/cats
...

Turns out the script to process it only needs 16 lines of bash:

urldiff.sh

c() { cutycapt --out-format=bmp --out=/dev/stdout "$@"; }
{
read prefix1
read prefix2
while read -r url
do
  echo $url
  a="$prefix1$url"
  b="$prefix2$url"
  while ! cmp -s <(c --url=$a) <(c --url=$b)
  do
    echo "$url differs. Please fix and press enter to recheck."
    read < /dev/tty
  done
done
} < urldiff.conf

That's it. Just put all your URLs in urldiff.conf, then run urldiff.sh anytime to assert that no visual regression took place.



When urldiff hits a page that is different, it will kindly ask you to fix your stuff and then check the page again.
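That recheck is just the inner while loop: cmp is re-run until the two renders finally match. The same pattern can be sketched with two plain files standing in for the screenshots (the file contents here are made up for the demo):

```shell
# Two temp files that initially differ, like two renders of a broken page
a=$(mktemp); b=$(mktemp)
printf 'v1' > "$a"
printf 'v2' > "$b"

# Loop until cmp -s (silent compare) reports the files as identical
while ! cmp -s "$a" "$b"
do
  echo "files differ, fixing..."
  printf 'v1' > "$b"   # in urldiff you would fix the CSS and press enter instead
done
echo "match"
```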



Even though not explicitly coded into the script, some convenient additional functionality is automatically available:



If one of your servers needs authentication, you can simply put it into the config file:

urldiff.conf

http://name:password@server1.com
http://server2.com
/
/blog
/about
...

You can pass all cutycapt parameters by appending them to the URL. Let's say your blog has some funky JavaScript effect that needs 1.5 seconds to complete. Then add a 2-second delay to that URL so the final state gets rendered:

urldiff.conf

http://name:password@server1.com
http://server2.com
/
/blog --delay=2000
/about
...
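This works because of shell word splitting: the unquoted URL variable is split at the space, so everything after the first space in a config line ends up as extra arguments to cutycapt. A minimal illustration of that splitting, with printf standing in for cutycapt (the server name is just an example):

```shell
# A config line with an appended parameter, as read by the script
url="/blog --delay=2000"
a="http://server1.com$url"

# Unquoted $a word-splits, so --delay=2000 becomes its own argument
printf '<%s> ' --url=$a
# prints: <--url=http://server1.com/blog> <--delay=2000>

echo
# The quoted form would keep it as one (wrong) argument instead:
printf '<%s> ' "--url=$a"
# prints: <--url=http://server1.com/blog --delay=2000>
```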

That's it. Happy urldiffing!