To get the speed data into Loggly it needs to be (1) reformatted into JSON (a format Loggly interprets nicely) and (2) sent to Loggly. To do this, save the following small script as /usr/local/bin/speedtest-loggly:
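Putting together the two lines explained below, the script looks something like this (the #!/bin/sh line is an assumption; the two lines themselves are the ones walked through in the rest of this section):

```shell
#!/bin/sh
# Reformat the output of `speedtest-cli --simple` as JSON and send it to syslog.
MSG=`speedtest-cli --simple | perl -pe 's/^(.*): (.*) (.*?)(\/s)?\n/"$1_$3": $2, /m' | cut -d ',' -f 1-3`
logger "{\"speed\": { $MSG }}"
```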

Don’t forget to do chmod a+x /usr/local/bin/speedtest-loggly to make it executable. As always, the script must also have Unix line endings, not DOS ones (so run dos2unix on it if you’ve copied and pasted from a Windows machine). Test it out by typing speedtest-loggly: it will take a while to run but should report nothing to the console. If you then do tail /var/log/syslog you should see a new log message at the end of the syslog looking something like:
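The exact line depends on your hostname and syslog configuration, but it should look roughly like this (the hostname, tag, date and figures here are made up for illustration):

```
Mar  3 14:05:01 raspberrypi pi: {"speed": { "Ping_ms": 23.456, "Download_Mbit": 50.2, "Upload_Mbit": 10.1 }}
```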

You can then get it to run every hour by putting a symbolic link to the script in the /etc/cron.hourly directory:
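Assuming the script lives at /usr/local/bin/speedtest-loggly as above, the link can be made with something like:

```shell
sudo ln -s /usr/local/bin/speedtest-loggly /etc/cron.hourly/speedtest-loggly
```

Note that run-parts, which executes the scripts in cron.hourly, ignores file names containing dots, so the dot-free name here matters.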

That should all now be working. If you want to understand what’s going on, read on; otherwise jump ahead to Viewing the data in Loggly.

The script explained

We need to reformat the data produced by speedtest-cli --simple into JSON format so that Loggly can display it nicely for us. In long-hand, the data needs to end up looking like:
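Assuming speedtest-cli reports ping in ms and speeds in Mbit/s (the figures here are made up), the long-hand version would be something like:

```json
{
  "speed": {
    "Ping_ms": 23.456,
    "Download_Mbit": 50.2,
    "Upload_Mbit": 10.1
  }
}
```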

We are actually going to transform it to look like this, but without the spaces and new lines, so it will end up as:
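With the same made-up figures, the compact version looks like:

```json
{"speed": { "Ping_ms": 23.456, "Download_Mbit": 50.2, "Upload_Mbit": 10.1 }}
```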

This describes a dataset we are labelling “speed”, which has three parts, each with a label and a value. As the required format is not so different from the --simple output of speedtest-cli, it can be transformed by a perl regular expression and a cut command in the first line of the script:

MSG=`speedtest-cli --simple | perl -pe 's/^(.*): (.*) (.*?)(\/s)?\n/"$1_$3": $2, /m' | cut -d ',' -f 1-3`

There are plenty of references to be found regarding regular expressions (including man perlrequick and man perlretut): I’ll just explain what this one is doing. We take the output from the speedtest-cli --simple command and pipe it as input into the perl command. The -pe arguments tell perl to automatically print (-p) each line it processes (with an assumed loop over the incoming contents) and to execute (-e) the one-line script that follows inline between the single quotes (see perl --help for a little more detail).

The regular expression (or “regexp”) is a substitution (it starts with s) and does multi-line matching (it ends with m), so we can look for the new-line character in the expression. The ^ matches the start of a line; the expression stores away (indicated by the parentheses) everything up to the : (which is the label), then finds and stores the number, skips another space, and then finds and stores the units, with a possible (but ignored) /s matched using (\/s)?. The \/ is an escaped slash character: without the back-slash first, it would be treated specially as ending that part of the regexp. We ignore the /s because it would make the JSON invalid. The regexp then replaces everything it matched with "$1_$3": $2, (that is: the label, an underscore and the units, without any /s, surrounded by double quotes; then a colon and a space; then the number; then a comma and a space).

Processing the output through perl in this way gives us something like:
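With made-up figures, the perl stage on its own produces a single line ending in a stray comma and space:

```
"Ping_ms": 23.456, "Download_Mbit": 50.2, "Upload_Mbit": 10.1, 
```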

This is nearly what we need, but we have to get rid of the last comma, as it is not valid JSON to include it. To do this we use the cut command: we pipe the output of perl into cut, get it to split the input at every comma (the -d ',' argument says to use comma as the field delimiter rather than the default tab) and then output just the first three fields (using the -f 1-3 argument). These are all the fields we want, and since cut joins them back together with the delimiter only between fields, the trailing comma is left off!

The first line starts with MSG=` and ends with another so-called back-tick (top left of most keyboards). This assigns a value to the MSG variable in bash (the shell we are using). Make sure not to put any spaces either side of the equals sign (even if that would make it more human-readable). The back-ticks surrounding the combination of speedtest-cli, perl and cut mean that the pipeline is evaluated and its result set as the value of MSG.

Let’s now look at the second line:

logger "{\"speed\": { $MSG }}"

Here we massage the data a little further with the "{\"speed\": { $MSG }}" part. This adds {"speed": { to the start and }} to the end. The double quotes that we want to add are escaped with back-slashes so that the shell doesn’t treat them specially, and we use double quotes on the outside to hold it all together and so that $MSG is evaluated (often called “interpolated”). In bash, you don’t use a dollar sign when you assign to a variable, but you put one on the front when you want to use the variable’s value.
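You can see the effect of the escaping and interpolation with echo, using a canned MSG value (the figures are made up; JSON is just a hypothetical variable name for the demo):

```shell
# Wrap a canned MSG value the same way the script does, to show the escaping.
MSG='"Ping_ms": 23.456, "Download_Mbit": 50.2, "Upload_Mbit": 10.1'
JSON="{\"speed\": { $MSG }}"
echo "$JSON"
```

The back-slashed quotes come out as literal double quotes and $MSG is replaced by its value, giving one complete JSON object on a single line.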

Finally, now that the message is in JSON format (albeit with no new lines, but that’s fine), we can send it to the syslog with the simple logger command.

The script has used all three types of quote marks: