Monitoring Python cron jobs with Babis

Over at MozMeao we are using APScheduler to schedule the execution of periodic tasks, like Django management commands to clear sessions or to fetch new job listings for the Mozilla Careers website.

A couple of services, including HealthChecks.io and Dead Man's Snitch, provide monitoring of cron job execution. The idea is that you ping a URL after each successful run of the cron job. If the service does not receive a ping within a predefined time window, it triggers notifications to let you know.

With shell scripts this is as simple as running curl after your command:

$ ./manage.py clearsessions && curl https://hchk.io/XXXX

For Python-based scripts like APScheduler's jobs, I created a tool to help with that:

Babis provides a function decorator that pings monitor URLs for you, either before your function starts or after it finishes. With both the before and after options combined, the time required to complete the run can be calculated.

You can also rate-limit your pings. So if you're running a cron job every minute but your check window is 15 minutes, you can play nicely and avoid DoSing your monitor by defining a rate of at most one request per 15 minutes with 1/15m.
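To make the rate idea concrete, a string like 1/15m can be parsed into a minimum interval between pings, and any ping arriving sooner than that is simply skipped. The sketch below is my own illustration of how such a limiter could work; the names `parse_rate` and `RateLimitedPing` are assumptions, not Babis's internals:

```python
import time

_UNITS = {'s': 1, 'm': 60, 'h': 3600, 'd': 86400}


def parse_rate(rate):
    """Turn a rate string like '1/15m' into a minimum interval in seconds."""
    count, _, period = rate.partition('/')
    unit = _UNITS[period[-1]]  # last character is the time unit
    return int(period[:-1]) * unit / int(count)


class RateLimitedPing:
    """Allow a ping only if the minimum interval has elapsed since the last one."""

    def __init__(self, rate):
        self.min_interval = parse_rate(rate)
        self.last_ping = None

    def allow(self):
        now = time.monotonic()
        if self.last_ping is None or now - self.last_ping >= self.min_interval:
            self.last_ping = now
            return True
        return False
```

With '1/15m' the limiter lets the first ping through and drops every ping for the next 900 seconds, so a job running every minute still contacts the monitor at most once per check window.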

In some cases network hiccups or monitor service maintenance can cause Babis to fail. With the silent_failures flag you can ignore any failures to ping the defined URLs.
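The effect of such a flag can be sketched as a small guard around the ping call: swallow the error when failures should be silent, re-raise otherwise. This is my illustration of the idea, not Babis's actual code, and `guarded_ping` is a hypothetical helper:

```python
def guarded_ping(ping, url, silent_failures=False):
    """Call ping(url), optionally swallowing any failure.

    A sketch of the idea behind a silent_failures flag: the cron job
    itself should not crash just because the monitor is unreachable.
    """
    try:
        ping(url)
    except Exception:
        if not silent_failures:
            raise
        # Monitor unreachable or under maintenance; carry on silently.
```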

The most common use of Babis is to ping a URL after the function has returned without raising an exception:

@babis.decorator(ping_after='https://hchk.io/XXXX')
def cron_job():
    pass
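The mechanics behind a decorator like this fit in a few lines. The following is a minimal sketch of the pattern, not Babis's real implementation; I've made the HTTP call injectable so the transport can be swapped out (for example, in tests):

```python
import functools
import urllib.request


def _default_ping(url):
    # A simple GET at the monitor URL, like curl in the shell example.
    urllib.request.urlopen(url, timeout=10)


def ping_decorator(ping_before=None, ping_after=None, ping=_default_ping):
    """Ping monitor URLs around a function call.

    A sketch of the idea behind Babis: ping_after only fires if the
    wrapped function returns without raising an exception.
    """
    def wrapper(func):
        @functools.wraps(func)
        def inner(*args, **kwargs):
            if ping_before:
                ping(ping_before)
            result = func(*args, **kwargs)
            # Only reached when func returned without raising.
            if ping_after:
                ping(ping_after)
            return result
        return inner
    return wrapper
```

Decorating a job with only ping_after then behaves exactly like the shell one-liner above: the monitor URL is hit only when the job finishes cleanly.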