I needed to do some work with Splunk, the log monitoring and analytics toolkit. Specifically, I needed to combine disparate logs from different systems to solve a problem that had been interfering with our systems for a long time. I had a look around for existing modules on the CPAN, but they were either embedded into bigger things that I didn't want to have to deal with, or didn't work for me for maintenance reasons. So I decided to write my own. The library I wrote is available here in draft form.

Now, because this was written by me for debugging purposes, I don't think it should be a CPAN module - not without a lot of hardening - but I wanted to share it anyway, as a handy example of doing API integration.

I created a new dist and wrote a test:

#!/usr/bin/env perl
use warnings;
use strict;

use Test::More;
use FindBin qw/$Bin/;
use lib "$Bin/../lib";

use SplunkPipe;

my $pipe = SplunkPipe->new;
isa_ok($pipe, 'SplunkPipe');
isa_ok($pipe->client, 'SplunkPipe::Client');

done_testing;

This was just enough for me to start sketching out an API for my client.

SplunkPipe turned out to be pretty simple in the end:

package SplunkPipe;
use Moo;

use SplunkPipe::Client;

has 'client' => (
    is => 'lazy',
);

sub _build_client {
    return SplunkPipe::Client->new();
}

1;

SplunkPipe::Client is a bit more fun. There's some commentary inline.

package SplunkPipe::Client;
use Moo;

LWP::UserAgent::JSON is a handy little module that will give you back $response->json_content for responses which have a JSON content-type.

extends 'LWP::UserAgent::JSON';

Splunk SSL certificates seem to always be self-signed. This was one of the pain points with a CPAN module I looked at.

use IO::Socket::SSL;
use Path::Tiny;

Here's how you glue non-Moo classes into Moo:

sub FOREIGNBUILDARGS {
    my ($class, %args) = @_;
    $args{ssl_opts} = {
        verify_hostname => 0,
        SSL_verify_mode => IO::Socket::SSL::SSL_VERIFY_NONE,
    };
    return %args;
}

I did have a separate authenticate method, but decided it wasn't worth it; we can just authenticate on construction:

sub BUILD {
    my ($self) = @_;
    # authenticate on construction
    my $res = $self->post('/services/auth/login', {
        username => $self->splunk_credentials->{username},
        password => $self->splunk_credentials->{password},
    });
    die "Unable to authenticate with content " . $res->content . "\n"
        unless $res->status_line eq '200 OK';

Oh yes, this is me parsing XML with a regular expression:

    my ($key) = $res->content =~ m{<sessionKey>(.+)</sessionKey>}ms;
    $self->default_header( Authorization => "Splunk $key" );
}

I happen to have my credentials kicking around elsewhere on my box:

has 'credential_file' => (
    is      => 'ro',
    default => sub {
        return path("$ENV{HOME}/.creds");
    },
);

It's an lhs=rhs style config file:

has 'splunk_credentials' => (
    is      => 'ro',
    default => sub {
        my ($self) = @_;
        my @data  = $self->credential_file->lines;
        my $creds = { map { /(\S.*?)=(\S+)/ } @data };
        return $creds;
    },
);

has server => (
    is      => 'ro',
    default => 'https://my.splunk.instance:8089',
);
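For reference, a .creds file in that style might look like this (these are made-up values; the real keys just need to match what splunk_credentials looks up):

```
username=admin
password=hunter2
```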

This is the interesting bit. For all API operations we only want to send a path; the hostname is constant. Therefore I can just prepend the scheme/host/port fragment to every request:

sub _mangle_request {
    my ($orig, $self, @args) = @_;
    $args[0] = '/' . $args[0] unless $args[0] =~ m{^/};
    $args[0] = $self->server . $args[0];

The other thing is that Splunk responses return XML by default. In most cases this is overridable by appending ?output_mode=[whatever] to each request as a query parameter. However some responses, especially authentication, will only return XML. This little snippet ensures that when possible we get JSON back.

    if ( $args[0] !~ m{auth/login} ) {
        $args[0] = "$args[0]?output_mode=json";
    }
    return $self->$orig(@args);
}

And finally we hook into the LWP guts to make that happen on every request. Splunk only uses GET, POST and DELETE.

around get    => \&_mangle_request;
around post   => \&_mangle_request;
around delete => \&_mangle_request;

1;

So the final thing to do was to create some search methods in the SplunkPipe class. First, the worker that kicks off a search. You get a search ID back, which you can use for polling and retrieving the results:

sub do_search {
    my ($self, $search) = @_;
    $search = "search $search";
    my $res = $self->client->post('/services/search/jobs', { search => $search });
    return $res->json_content->{sid};
}
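With output_mode=json applied by the client, the response body to that POST is roughly of this form (the sid value here is invented for illustration):

```
{ "sid": "1465238820.1234" }
```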

Second, Splunk searches can take a while, so we need a way to get the status. There's an options hash here in case I need the full response; otherwise we mostly just want to know whether the search is finished or not:

sub get_search_status {
    my ($self, $sid, %options) = @_;
    my $res = $self->client->get("/services/search/jobs/$sid");
    return $res->json_content if $options{full};
    return $res->json_content->{entry}->[0]->{content}->{isDone};
}

Finally a little worker to get the search results:

sub get_search_results {
    my ($self, $sid) = @_;
    my $res = $self->client->get("/services/search/jobs/$sid/results");
    return $res->json_content;
}
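A caller can tie the three methods together with a simple polling loop. This is just a sketch of hypothetical usage, not part of the module - the search string, poll interval and retry cap are arbitrary, and it assumes a live Splunk instance behind $pipe:

```perl
# Poll until the job reports isDone, then fetch the results.
my $sid   = $pipe->do_search('index=main error | head 10');  # made-up search
my $tries = 0;
until ( $pipe->get_search_status($sid) ) {
    die "Search $sid did not finish in time\n" if ++$tries > 12;
    sleep 5;    # arbitrary poll interval
}
my $results = $pipe->get_search_results($sid);
```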

In reality, what I did was extend my test to exercise SplunkPipe::Client requests and responses until I had working output, and once that worked I refactored it into methods on SplunkPipe. Clearly, if the functionality of this module grows, I should probably refactor those methods into a SplunkPipe::Search class. Here's what the final parts of my test looked like. I used a subset of the problem I was working on for test data.

my $date_range = 'earliest="06/06/2016:18:47:00" latest="06/06/2016:18:58:00"';
my $search     = '[some search string here] ' . $date_range;

my $sid    = $pipe->do_search($search);
my $status = $pipe->get_search_status($sid);
is $status, '0';

diag "Snoozing for 5 secs while job completes";
sleep 5;

$status = $pipe->get_search_status($sid);
is $status, '1' or die "Job is not done :(\n";

my $results = $pipe->get_search_results($sid);
is ref($results), 'HASH', "got a hashref from the search results";

done_testing;

It took me most of the day to work this up. That was coming from zero knowledge of the Splunk API, with the usual interruptions a busy workplace gives you and a spot of semi-related yak shaving. However, once I was done, it was another 30 minutes' work to write a script that actually solved my problem. And I now have a reusable tool, as I'm pretty sure I'll need to do similar things again. It also opens up the possibility of some low-key monitoring of our logs. Yes, you can do most or all of this from the Splunk front end. But I wanted tools I was comfortable with, and some scaffolding to help me learn how to make better use of Splunk.