One of the big things making waves in Astronomy right now (and all of science, really) is the application of Machine Learning, or Artificial Intelligence. In astronomy we have a bunch of problems that can be broadly labelled as “Categorisation Problems”, e.g. “Is that thing a star or a galaxy?” or “Does this galaxy have a bar or not?”. These are tasks that are very, very easy to train humans to do, and Citizen Science projects like Galaxy Zoo have been utilizing the brains of millions of humans to do tasks just like this. However, we can’t always rely on people: sometimes there just isn’t enough time (the data needs to be shipped out next week), other times the project just isn’t interesting enough to capture the million people you need to help (drawing spiral arms on galaxies). For projects like these and many more, astronomers are now turning to Machine Learning techniques.

Let me put this softly to start with. Computers are really dumb. Super dumb. Dumb as bricks. The human brain, with 5 minutes of training, can look at a picture of a galaxy and tell you whether or not it has a bar with reasonably good accuracy. A computer, on the other hand, being really really dumb, can’t, because it’s really hard to teach one what a bar is. The biggest hurdle, it seems, is that we don’t know how humans do it: we can look at a picture, recognise the features, and then point them out in a different picture, all without having to do maths or matrix manipulation in our heads.

So I, being the aspiring, determined and dedicated data analyst I am (lol, I made a joke), decided to set out and learn a little bit of machine learning. Since I don’t have any cool work stuff to do this on, I decided to try my hand at some text generation, in particular making up spell names for D&D/Pathfinder. The results are pretty hilarious.

Neural Nets

A Neural Net is a pretty common type of AI. It’s built on layers of “Neurons” or “Nodes”, each of which does SOMETHING to some input data, and eventually gives out some output data. The input might be an image of a galaxy, and the output will be a probability of whether or not the image contains a gravitational lens. The SOMETHING in the middle is complicated though, because the net is made up of “Hidden Layers”, which do math wizardry to transform the input into the output. Some of this wizardry you code yourself, telling the net to look for arcs or starkly different regions of colour for example, and some of it the net “learns” itself. It’s a creepy self-learning black box, which hopefully won’t learn that you are using it and grow a desire for revenge.
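To make that a bit less hand-wavy, here’s a toy sketch of what one of those layers actually does: a weighted sum of the inputs, squashed through a nonlinearity. The weights here are random made-up numbers (a real net would learn them), and the “image features” are just placeholder values.

```python
import numpy as np

def layer(x, W, b):
    # One "hidden layer": weighted sum of the inputs, then a nonlinearity.
    # tanh squashes the result into the range [-1, 1].
    return np.tanh(W @ x + b)

# Toy example: 4 input features (stand-ins for image measurements),
# 3 hidden neurons, and a final sigmoid giving a probability-like output.
rng = np.random.default_rng(0)
x = rng.normal(size=4)                          # fake "galaxy image" features
W1, b1 = rng.normal(size=(3, 4)), np.zeros(3)   # made-up learned weights
w2, b2 = rng.normal(size=3), 0.0

hidden = layer(x, W1, b1)
prob = 1 / (1 + np.exp(-(w2 @ hidden + b2)))    # sigmoid: "lens or not?" score
print(round(float(prob), 3))
```

Stack enough of these layers and let the weights adjust themselves during training, and you have the black box.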

For the task I’m doing, we are going to use a “Recurrent Neural Net” or RNN, which basically means the net loops over itself, using the output from the last step as the input for the next step. If, for example, the net decides the first word of a spell is “Ray”, it will use that as the input to find the next word, say “of”. Finally, it uses the words “Ray of” as the next input, finding “Monkeys” as the final word. Thus, “Ray of Monkeys”.
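The looping works through a “hidden state” that gets carried along from word to word. This isn’t the actual code I used — just a tiny sketch with made-up random weights and a four-word vocabulary — but it shows the recurrence: each step mixes the current word with the hidden state left over from the previous words.

```python
import numpy as np

vocab = ["Ray", "of", "Monkeys", "<END>"]
V, H = len(vocab), 8  # vocabulary size, hidden-state size

rng = np.random.default_rng(1)
Wxh = rng.normal(scale=0.1, size=(H, V))  # input -> hidden
Whh = rng.normal(scale=0.1, size=(H, H))  # hidden -> hidden (the "recurrent" bit)
Why = rng.normal(scale=0.1, size=(V, H))  # hidden -> output

def rnn_step(word_idx, h):
    x = np.zeros(V)
    x[word_idx] = 1.0                              # one-hot encode the word
    h = np.tanh(Wxh @ x + Whh @ h)                 # new state "remembers" the past
    logits = Why @ h
    probs = np.exp(logits) / np.exp(logits).sum()  # softmax over next-word choices
    return probs, h

h = np.zeros(H)
probs, h = rnn_step(vocab.index("Ray"), h)  # feed "Ray", get next-word odds
probs, h = rnn_step(vocab.index("of"), h)   # feed "of"; h still carries "Ray"
```

With trained weights instead of random ones, `probs` would actually put high probability on “Monkeys” after seeing “Ray of”.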

I’m not going to go into all the nitty gritty details, but I followed the guide and code over on this WILDML post about RNNs. I used the spell database from D20PFSRD.com, from which I took just the spell names, as I thought trying to learn spell descriptions might be too much for now. I trained the net on near enough the full list of spells (2500 out of 2600), with 25 epochs. The loss function gets down to about its lowest after just 5 epochs, so we might even be overfitting here.
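The data prep is roughly what the WILDML tutorial does (this is a simplified sketch, not my exact script): wrap each spell name in start/end tokens, build a vocabulary, and turn each name into a training pair where the target is the same sequence shifted one word along.

```python
# Hypothetical mini-dataset; the real one has ~2500 spell names.
spells = ["Acid Arrow", "Chain Lightning", "Ray of Enfeeblement"]

START, END = "SENTENCE_START", "SENTENCE_END"
tokenised = [[START] + name.split() + [END] for name in spells]

# Map every word to an integer index.
vocab = sorted({w for s in tokenised for w in s})
word_to_index = {w: i for i, w in enumerate(vocab)}

# Training pairs: X is each sequence minus its last token,
# y is the same sequence shifted one step (the word to predict next).
X = [[word_to_index[w] for w in s[:-1]] for s in tokenised]
y = [[word_to_index[w] for w in s[1:]] for s in tokenised]
```

Each epoch then just means one full pass over those pairs, nudging the weights to make the shifted words more likely.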

The Results

Pathfinder has about 2500 spells in its rules, which is kind of crazy when you think about it. Some classics include:

Acid Arrow
Bless
Beast Shape
Black Tentacles
Chain Lightning
Fireball
Dominate Person
Mass Invisibility
Protection from Evil

I set up the script to generate a few hundred spells, the full list can be found here, but I’ve picked out a few of my favourites, with a description of what I think they would do:

Ward Of Blood - Protect yourself with BLOOD; get more AC when there's more blood around, or something
Skeleton Of Chaos - Summon a Skeleton buddy who gets Smite Law and other Chaotic abilities
Extreme Terrain - Make the terrain X-TREME!!!
Mad Buoyancy - Things float on water, but can't control where they go!
Linebreaker Nimbus - Summon a cloud of solid fog which bull rushes enemies
Detect Weapon - Detect the presence of weapons within a 60ft cone; determine the type, material and enhancement bonuses by concentrating
Toxic Of Death, Mass - Some area-effect poison spell?
Blade Reminder - Deal damage to an enemy equal to the damage it took from a weapon attack last turn

A curious result from this project is that because the spell names are all quite short, the net often comes up with spells that already exist. This happens particularly for spells that only have one word, like “Pain” and “Heal”. I suppose, as these words aren’t used very often in spells with multiple words, the net sees them and finds the END_SENTENCE token to be the most likely next step.
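You can see why with a toy next-word table (entirely made up, just to illustrate the mechanics): generation keeps sampling from the learned probabilities until it draws the end token, so a word that almost always appears alone ends the spell almost immediately.

```python
import numpy as np

# Made-up next-word probabilities. After a standalone word like "Heal",
# most of the probability mass sits on the end token, so generation stops.
probs_after = {
    "<START>": {"Heal": 0.5, "Ray": 0.5},
    "Heal":    {"<END>": 0.9, "of": 0.1},   # one-word spells end right here
    "Ray":     {"of": 1.0},
    "of":      {"Monkeys": 1.0},
    "Monkeys": {"<END>": 1.0},
}

def generate(rng):
    word, out = "<START>", []
    while True:
        nxt = probs_after[word]
        word = rng.choice(list(nxt), p=list(nxt.values()))
        if word == "<END>":
            return " ".join(out)
        out.append(word)

print(generate(np.random.default_rng(2)))
```

Nine times out of ten this toy model spits out plain old “Heal” — exactly the existing-spell effect above.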

Another interesting thing is the way the net treats Roman numerals, which appear in spells like “Beast Shape II” and “Summon Nature’s Ally IV”. Because the net can’t learn anything about context, it just treats these like any other words, so we get results like “Funeral of II” and “Invisibility Of IV”. To get this kind of context right, we need to move to something like a Long Short Term Memory network (LSTM), which is the state of the art for these kinds of problems.

The next step with this would be to see if the net can generate spell descriptions based on their names, but something tells me the sample size won’t be big enough for that. I’m also curious if it could work out potential spell levels as well? That is almost definitely crazy talk though.