This post was inspired by a story that showed up on my Facebook feed recently, An A.I. Wrote a Christmas Song and It’s Really, Really Creepy. The system discussed in the article was a neural network, a topic I'd like to dig deeper into in a future article; but for this one I wanted to explore the use of a Markov chain to produce something that sounded "Christmasy."

My goal is to learn more about Markov chains, find a library or two to help me build them, learn enough about MIDI files to read existing Christmas carols and generate a new one, and finally to generate some Christmas lyrics by feeding the Markov chain library the lyrics to real Christmas carols.

Before going any further, I want to remind the reader that most of the posts on this blog are really about the author just exploring something they know very little about, and hopefully coming out a little smarter at the end of the process. In my case, I know of Markov chains (as in I know they exist), and I know more or less what they do, but I've never played around with them. Beyond that, I know nothing of MIDI files. Or music. In other words....

Markov Chains

So.... what the heck is a Markov chain? I know... let's start with Wikipedia! (Opens Wikipedia, scrolls down a bit...)

Joking aside, I was looking for a gentler introduction, so I hit Google and landed on Markov Chains Explained Visually, a fantastic short explanation of the topic with interactive animations. I strongly recommend checking it out; it's a very quick read.

Markov chains, named after Andrey Markov, are mathematical systems that hop from one "state" (a situation or set of values) to another. For example, if you made a Markov chain model of a baby's behavior, you might include "playing," "eating," "sleeping," and "crying" as states, which together with other behaviors could form a "state space": a list of all possible states. In addition, on top of the state space, a Markov chain tells you the probability of hopping, or "transitioning," from one state to any other state, e.g., the chance that a baby currently playing will fall asleep in the next five minutes without crying first.

OK, so given our current state, a Markov chain helps us predict what the next state will be.

Markov chains help us build better models. For instance, say we wanted a simple model of the weather, where each day is either sunny or rainy. A naive approach would be to flip a coin (or call rand.Next(0,2)==1) to decide whether a day is rainy. The problem is that this model tends to flip-flop between rainy and sunny days. In real life (at least where I live), rainy days tend to clump together; so in our model we'd want to say, "if today is rainy, tomorrow is 70% likely to also be rainy."
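As a quick illustrative sketch (mine, not from the original post), here's that weather model as a loop. The only difference from the coin flip is that tomorrow's probability of rain depends on today's state:

```csharp
using System;

// A two-state weather chain: each day's forecast depends only on
// the previous day, using the 70% example from above.
class WeatherChain
{
    static void Main()
    {
        var rand = new Random();
        bool rainy = false; // start with a sunny day

        for (int day = 1; day <= 10; day++)
        {
            // If today is rainy, tomorrow is rainy with probability 0.7;
            // if today is sunny, tomorrow is rainy with probability 0.3.
            double pRain = rainy ? 0.7 : 0.3;
            rainy = rand.NextDouble() < pRain;
            Console.WriteLine("Day {0}: {1}", day, rainy ? "Rainy" : "Sunny");
        }
    }
}
```

Run it a few times and you'll see the rainy days clump together, just like the description above.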

In a transition matrix representation of a Markov chain, we'd describe that like so:

            Rainy         Sunny
Rainy       P(R|R) = 0.7  P(S|R) = 0.3
Sunny       P(R|S) = 0.3  P(S|S) = 0.7

The website noted above has a neat animation tool that shows you the results of playing around with the transition matrix on a state diagram; you should check it out. Here's a screen capture of the above transition matrix entered into their tool.

Alright, so now we know what Markov chains are and what they're good for. Specifically, we want to use them for the following:

Given a note, produce the next (Christmasy) note

Given a word, produce the next (Christmasy) word for a lyric

MarkovSharp

So, now armed with knowledge, I set out to look for a library to help me build Markov chains. Now I have to admit, given the date I started (December 22), I didn't think I'd finish before Christmas, since I still had to learn about Markov chains, learn about MIDI, and write the code. I assumed that my Christmas post would be late, or that I'd have to re-title it A Very Markov New Year's. But then I landed on MarkovSharp, "an easy to use C# implementation of an N-state Markov model. MarkovSharp exposes the notion of a model strategy, which allows you to use pre-defined model strategies, or create your own." I scrolled down to check out the examples and found that the two strategies provided out of the box produce words... and MIDI!

MarkovSharp is pretty easy to use out of the box. You simply provide it with source material (e.g. a list of strings) and tell it to learn. From there it can provide you with state transitions, given the current state. For example:
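Basic usage, per the library's README (a sketch; the namespace and example sentences are taken from MarkovSharp's own documentation, and may differ slightly by version):

```csharp
using System;
using System.Linq;
using MarkovSharp.TokenisationStrategies;

class Example
{
    static void Main()
    {
        // StringMarkov is MarkovSharp's built-in word-level strategy;
        // the 1 is the model "level" (how many prior states it considers).
        var model = new StringMarkov(1);

        // Feed it source material, one phrase at a time.
        model.Learn("Frankly, my dear, I don't give a damn");
        model.Learn("Where we're going, we don't need roads");

        // Walk() generates new phrases from the learned transitions.
        Console.WriteLine(model.Walk().First());
    }
}
```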



Pretty simple!





Markov Christmas





The project (code available on GitHub) is pretty simple; Chris Core's library did most of the hard work.

To make music, the program finds songs in the Song directory, flattens their tracks, and feeds them into the Markov model. For the output, I use a separate model to pick which division to use (the division controls the song's tempo, which I learned by playing around).
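The original snippet isn't reproduced here, but following MarkovSharp's Learn/Walk pattern, the training loop looks roughly like this. This is a sketch, not the post's exact code: SanfordMidiMarkov is the library's MIDI strategy (built on Sanford.Multimedia.Midi), I'm assuming Learn accepts a flattened Track, and FlattenTracks is a hypothetical stand-in for the track-merging step:

```csharp
using System.IO;
using System.Linq;
using MarkovSharp.TokenisationStrategies;
using Sanford.Multimedia.Midi;

class SongBuilder
{
    static void Main()
    {
        // SanfordMidiMarkov is MarkovSharp's MIDI strategy; 1 is the model level.
        var model = new SanfordMidiMarkov(1);

        foreach (var file in Directory.GetFiles("Song", "*.mid"))
        {
            var sequence = new Sequence();
            sequence.Load(file);
            // Train the model on each song's flattened track.
            model.Learn(FlattenTracks(sequence));
        }

        // The generated song is simply the model's first output.
        var song = model.Walk().First();
    }

    // Hypothetical helper: merge every track in the sequence into one.
    // (In the simplest case, a one-track file needs no merging at all.)
    static Track FlattenTracks(Sequence sequence)
    {
        return sequence[0];
    }
}
```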


After the model is trained, I simply take the first output from the model as the song.



Generating lyrics is only slightly more complicated than the demo code provided by the library. I feed in the lyrics to a small handful of Christmas songs and pull 18 lines from the output.
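A minimal sketch of that approach (again mine, not the post's exact code; lyrics.txt is a hypothetical input file with one lyric line per row, and I'm assuming Walk takes a count, per MarkovSharp's documented overload):

```csharp
using System;
using System.IO;
using MarkovSharp.TokenisationStrategies;

class LyricsBuilder
{
    static void Main()
    {
        // Train a word-level model on carol lyrics, one line per Learn call.
        var model = new StringMarkov(1);
        foreach (var line in File.ReadAllLines("lyrics.txt"))
        {
            model.Learn(line);
        }

        // Pull 18 generated lines out of the model.
        foreach (var lyric in model.Walk(18))
        {
            Console.WriteLine(lyric);
        }
    }
}
```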

The Result

The melody can be downloaded here: MarkovChristmas.mp3 (6.70 MB). The lyrics are below.

A Very Markov Christmas (2016)

Here comes Santa Claus comes tonight!

I'm dreaming of

Marshmallows for toasting

And may all

There'll be much mistletoeing

Please have snow and brightTis the Christmas card I write

Sing we joyous all

I'm dreaming of a White Christmas tree, O Christmas card I used to call

and Dancer and

Brings to have snow

And everyone telling you just the harp and mistletoeIt's the new, ye lads and if you recall

And caroling out in the most wonderful time of the other reindeer games

For boys and Vixen

Hail the most famous reindeer

The most wonderful time of all together! Fa la

Oh what a peep

Oh what a peep indeed.

To all of our readers, have a happy and happy holiday!