Installing Logstash is easy. The problems only start once you have to configure it. This post shares some of the tricks the ELK team at Comperio has found helpful.

Write configuration on the command line using the -e flag

If you want to test a simple filter configuration, you can enter it straight on the command line using the -e flag.

bin\logstash.bat agent -e 'filter { mutate { add_field => { "fish" => "salmon" } } }'

After starting Logstash with the -e flag, simply type your test input into the console. (The defaults for input and output are stdin and stdout, so you don't have to specify them.)

Test syntax with --configtest

After modifying the configuration, you can have Logstash check the syntax of the file by using the --configtest (or -t) flag on the command line.
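For example, assuming your configuration is saved as logstash.conf (the filename here is illustrative), a syntax check looks like this:

```
bin/logstash agent -f logstash.conf --configtest
```

Logstash then validates the file and exits instead of starting the pipeline, so it is cheap to run after every edit.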

Use stdin and stdout in the config file

If your filter configurations are more involved, you can use the stdin input and stdout output in the config file. If you need to pass a JSON object into Logstash, specify the json codec on the input.

input {
  stdin { codec => json }
}

filter {
  if ![clicked] {
    mutate { add_field => [ "clicked", false ] }
  }
}

output {
  stdout { codec => json }
}

Use output stdout with codec => rubydebug

Using the rubydebug codec prints each event as a pretty, indented object on the console, which makes nested fields much easier to read.
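The output section is a one-liner:

```
output {
  stdout { codec => rubydebug }
}
```

Each event is then printed as an indented hash with one field per line (message, @version, @timestamp, plus any fields your filters add), instead of a single dense line of JSON.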

Use the --verbose or --debug command line flags

If you want to see more details about what Logstash is really doing, start it up using the --verbose or --debug flags. Be aware that this slows down processing speed greatly!

Send Logstash output to a log file.

Using the -l "logfile.log" command line flag will store Logstash's own log output to a file. Just watch your disk space: in combination with the --verbose or --debug flags, these files can be humongous.
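Combining the flags from above (config and log file names are illustrative):

```
bin/logstash agent -f logstash.conf --verbose -l logstash.log
```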

When using the file input: delete the .sincedb files in your $HOME directory

The file input plugin stores information about how far Logstash has gotten in processing each file in .sincedb files in the user's $HOME directory. If you want to re-process your logs, you have to delete these files.
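The cleanup itself is just rm ~/.sincedb* — the sketch below stages it in a scratch directory standing in for $HOME (and uses made-up sincedb file names), so it is safe to run anywhere:

```shell
# Use a temporary directory as a stand-in for $HOME.
FAKE_HOME=$(mktemp -d)

# The file input writes bookkeeping files named .sincedb_<hash>; fake two of them.
touch "$FAKE_HOME/.sincedb_29f3a12" "$FAKE_HOME/.sincedb_8c01bb4"

# Delete them so Logstash would re-read the logs from the beginning.
rm -f "$FAKE_HOME"/.sincedb*

ls -A "$FAKE_HOME"   # nothing left
rmdir "$FAKE_HOME"
```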

Use the generator input stage

You can add text lines you want to run through the filter and output stages directly in the config file by using the generator input plugin.

input {
  generator {
    lines => [
      '{"@message":"fisk"}',
      '{"@message": {"fisk":true}}',
      '{"notMessage": {"fisk":true}}',
      '{"@message": {"clicked":true}}'
    ]
    codec => "json"
    count => 5
  }
}

Use mutate add_tag after each successful stage.

If you are developing configuration on a live system, adding tags after each stage makes it easy to find the log events in Kibana/Elasticsearch.

filter {
  mutate { add_tag => "before conditional" }

  if [@message][clicked] {
    mutate { add_tag => "already had it clicked here" }
  } else {
    mutate { add_field => [ "[@message][clicked]", false ] }
  }

  mutate { add_tag => "after conditional" }
}

Developing grok filters with the grok debugger app

The grok filter comes with a range of prebuilt patterns, but you will find the need to develop your own pretty soon. That's when you open your browser to https://grokdebug.herokuapp.com/. Paste in a representative line from your log, and you can start testing matching patterns. There is also a discover mode that will try to figure out some fields for you.
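Once a pattern matches in the debugger, it drops straight into a grok filter. A minimal sketch (the field names and pattern here are illustrative, not from a real log):

```
filter {
  grok {
    match => [ "message", "%{IP:client} %{WORD:method} %{URIPATHPARAM:request}" ]
  }
}
```

For a line like 55.3.244.1 GET /index.html, this extracts client, method and request as separate fields on the event.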

The grok constructor, http://grokconstructor.appspot.com/do/construction offers an incremental mode, which I have found quite helpful to work with. You can paste in a selection of log lines, and it will offer a range of possibilities you can choose from, trying to match one field at a time.

SISO

If possible, pre-format logs so Logstash has less work to do. If you have the option to output logs as valid JSON, you don't need grok filters since all the fields are already there.
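For instance, if the application already writes one JSON object per line, a file input with the json codec gives you fully structured events with no grok at all (the path is illustrative):

```
input {
  file {
    path => "/var/log/myapp/app.json"
    codec => "json"
  }
}
```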

This has been a short run-through of the tips and tricks we remember using. If you know any other nice ways to develop Logstash configurations, please comment below.