Taking posts from Lobste.rs and putting them on Mastodon

If you post a programming article to Hacker News, Reddit, or Lobste.rs, you will notice that soon after it reaches the front page, it gets posted to Twitter automatically.

But why settle for Twitter when you can have this on Mastodon? In this article, we will write a Mastodon bot that regularly checks the Lobste.rs front page and posts new links to Mastodon.

Since this is a Mastodon bot, let’s start by sending a post to our followers.

Sending a Mastodon post

To interact with Mastodon, we are going to use a library called Tooter. To get the API keys you need, log in to Mastodon and go to Settings > Development > New Application. Once you create an application, the page will show all the API keys you need.
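Before any of the snippets below will run, the libraries need to be loaded. Assuming you use Quicklisp, all the libraries used in this article can be pulled in at once:

```lisp
;; Load every library used in this article.
;; Assumes Quicklisp is already installed and loaded in your Lisp image.
(ql:quickload '(:tooter :drakma :plump :babel))
```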

(defun get-mastodon-client ()
  (make-instance 'tooter:client
                 :base "https://botsin.space"
                 :name "lobsterbot"
                 :key "Your client key"
                 :secret "Your client secret"
                 :access-token "Your access token"))

This function will create a Mastodon client whenever you call it. Now, let’s send our first message.

(tooter:make-status (get-mastodon-client) "Hello world!")

Now that we can send messages, the next step in our project is to fetch the RSS feed.

Fetching the RSS feed

Fetching resources over HTTP is really straightforward in Common Lisp: the Drakma library provides an easy-to-use function called http-request. To get the contents of my blog, all you need to do is

(drakma:http-request "https://gkbrk.com")

So let’s write a function that takes a feed URL and returns the RSS items.

There is one case we need to handle here. When you fetch text/html, Drakma handles the decoding for you; but it doesn't do this when we fetch application/rss. Instead, it returns a byte array.

(defvar *feed-url* "https://lobste.rs/rss")

(defun get-rss-feed ()
  "Gets the RSS feed of Lobste.rs"
  (let* ((xml-text (babel:octets-to-string (drakma:http-request *feed-url*)))
         (xml-tree (plump:parse xml-text)))
    (plump:get-elements-by-tag-name xml-tree "item")))

This function fetches the RSS feed, parses the XML, and returns the <item> tags in it. In our case, these tags contain each post on Lobste.rs.
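For reference, each item in the feed looks roughly like this (a simplified sketch with made-up values; real entries carry a few more fields, such as the description and author):

```xml
<item>
  <title>An interesting article</title>
  <link>https://example.com/an-interesting-article</link>
  <guid>https://lobste.rs/s/abc123</guid>
</item>
```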

A struct in Common Lisp is similar to a struct in C and other languages. It is one object that stores multiple fields.

(defstruct lobsters-post
  title
  url
  guid)

Getting and setting fields of a struct can be done like this.

;; Pretend that we have a post called p
(setf (lobsters-post-title p) "An interesting article") ; Set the title
(print (lobsters-post-title p))                         ; Print the title

Let’s map the RSS tags to our struct fields.

(defun find-first-element (tag node)
  "Search the XML node for the given tag name and return the text of the first one"
  (plump:render-text (car (plump:get-elements-by-tag-name node tag))))

(defun parse-rss-item (item)
  "Parse an RSS item into a lobsters-post"
  (let ((post (make-lobsters-post)))
    (setf (lobsters-post-title post) (find-first-element "title" item))
    (setf (lobsters-post-url post) (find-first-element "link" item))
    (setf (lobsters-post-guid post) (find-first-element "guid" item))
    post))

Now, we can make the previous get-rss-feed function return lobsters-post structs instead of raw XML nodes.

(defun get-rss-feed ()
  "Gets the RSS feed of Lobste.rs"
  (let* ((xml-text (babel:octets-to-string (drakma:http-request *feed-url*)))
         ;; Tell the parser that we want XML tags instead of HTML
         ;; This is needed because <link> is a self-closing tag in HTML
         (plump:*tag-dispatchers* plump:*xml-tags*)
         (xml-tree (plump:parse xml-text))
         (items (plump:get-elements-by-tag-name xml-tree "item")))
    (reverse (map 'list #'parse-rss-item items))))

Posting the first link to Mastodon

(defun share-post (item)
  "Takes a lobsters-post and posts it on Mastodon"
  (tooter:make-status (get-mastodon-client)
                      (format nil "~a - ~a ~a"
                              (lobsters-post-title item)
                              (lobsters-post-guid item)
                              (lobsters-post-url item))))

(share-post (car (get-rss-feed)))

Keeping track of shared posts

We don’t want our bot to keep posting the same links. One solution to this is to keep all the links we already posted in a file called links.txt.

Every time we come across a link, we will record it in our “database”. This just appends the link followed by a newline to the file. Not very fancy, but certainly enough for our purposes.

(defun record-link-seen (item)
  "Writes a link to the links file to keep track of it"
  (with-open-file (stream "links.txt"
                          :direction :output
                          :if-exists :append
                          :if-does-not-exist :create)
    (format stream "~a~%" (lobsters-post-guid item))))

In order to filter our links before posting, we will go through each line in that file and check if our link is in there.

(defun is-link-seen (item)
  "Returns if we have processed a link before"
  (with-open-file (stream "links.txt"
                          :if-does-not-exist :create)
    (loop for line = (read-line stream nil)
          while line
          when (string= line (lobsters-post-guid item))
            return t)))
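As a quick sanity check, the two functions round-trip like this (a hypothetical REPL session, assuming a post p whose guid has not been recorded yet):

```lisp
;; Hypothetical REPL session; p is a lobsters-post we made up
(is-link-seen p)     ; => NIL, the guid is not in links.txt yet
(record-link-seen p) ; appends the guid to links.txt
(is-link-seen p)     ; => T, the guid is now found in the file
```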

Now let’s wrap this all up by creating a task that

1. Fetches the RSS feed
2. Gets the top 10 posts
3. Filters out the links that we shared before
4. Posts them to Mastodon

(defun run-mastodon-bot ()
  (let* ((first-ten (subseq (get-rss-feed) 0 10))
         (new-links (remove-if #'is-link-seen first-ten)))
    (loop for item in new-links
          do (share-post item)
             (record-link-seen item))))

How you schedule this to run regularly is up to you. Set up a cron job, create a systemd timer, or just run it manually whenever you like.
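For example, a crontab entry along these lines would run the bot every 30 minutes with SBCL (the file path is hypothetical; adjust it and the Lisp implementation to taste):

```shell
# Run the bot every 30 minutes; the path to the script is made up
*/30 * * * * sbcl --non-interactive --load /home/bot/lobsterbot.lisp --eval '(run-mastodon-bot)'
```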

You can find the full code here.