Generating RNN Text on Spell

Or: the mystical art of turning a list of D&D spells into a much longer list of much weirder spells

with Janelle Shane

Level: hapless apprentice. Able to use a terminal to install packages (ideally you already have python installed) and to change folders and move files, that sort of thing. No coding required for this tutorial.

So! You want to become a wizard!

Using Spell is a fine way to amplify the awesomeness of your mystic spells — a fine way to run them on mighty GPUs rather than on your lowly laptop. And the GPUs are located a safe distance away, so you are not responsible for their care and feeding, or for evading their ravenous nighttime hunting flights.

Using Spell is a lot like running commands on your own computer, except:

1. It enforces the use of git for file-tracking (so that ye do not cast a spell for which there is no counterspell).

2. The command runs remotely, harnessing the power of Spell’s GPUs (yes, you tap into the very mana of the universe, available at a reasonable rate).

3. It doesn’t overwrite files in your directory unless you run a spell cp runs/## command

Yet using Spell is also different from running commands on remote machines. The nice thing is that there’s no virtual machine to manually start and stop. You invoke the mighty GPU just for the commands that require it, and it flaps back to its lofty perch when the command is done, picking its teeth with its talons. No more forgetting to stop the expensive virtual machine during your year-long quest for the legendary harmonic orb, returning at last, haggard and battleweary, to discover that you have performed the financial equivalent of leaving your sink running for a year, flooding your dungeons beyond repair.

And the commands for running a spell remotely are, conveniently, about the same as those for running locally.

And so! Let us begin.

First, register for an account.

Pick your wizard name (your username) and your super secret word of power (your password).

Note that an enemy wizard who knows your name and word of power will have great power over you, so only reveal these to wizards you trust completely. Never reveal them to Tretcher Twestybeard, who is a wizard of great cunning and duplicity.

When you mutter your name and word of power, you will see the following vision. Do not be alarmed.

You may do this entire tutorial with the CPU plan if you wish, but the free GPU credit that comes with a GPU account is many times more than you’ll need for completing this tutorial. I trained dozens of models on a K80 GPU (you know, the kind with the pretty golden wings) for this tutorial and ended up using just over $1.00 of credit. So, you might as well sign up for the GPU plan.

Now, take a deep breath, center your power within yourself, and check out the handy Get Started section.

The first instruction says to type something into the terminal. (If you’re on a Mac, your terminal is under Applications/Utilities.)

pip install spell

This command uses the python package installer, pip, to install spell and all of its dependencies. (You’ll need python already installed)

My own installation was not uneventful. Several times the skies flashed dark and voices boomed out alarming errors in red font. It seems my familiar miniconda, who had installed python for me, had taken it upon itself to defend python from all changes. I rolled up my robe sleeves and began to wrestle with it, cursing all the while. The miniconda is a lot stronger than it looks, and we finally arrived at a compromise: I would be allowed to install spell only if I was willing to let the miniconda wall it off for my protection.

By running the following command, I created a safe environment in which to install and run Spell.

$ conda create -m -n spellenv

And, to invoke this environment, I need to type the following in any new terminal window:

$ source activate spellenv

Then finally, I was able to install Spell via pip install spell .

(Admission: the first time I tried this I accidentally typed pip install shell , which also worked but installed something entirely different. Wizards must be careful not to mispronounce their incantations, or the results may be quite different from intended.)

Also not recommended: pip install smell (your computer will take on the odor of your choice of cheese), pip install swell (your computer will double in size at odd intervals).

(okay not really.)

To see if the install worked, simply intone this word:

$ spell

And LO:

(spellenv) Azraras-Book-of-Power:~ azrarastoutfrogg$ spell

Usage: spell [OPTIONS] COMMAND [ARGS]...

Options:

-h, --help Show this message and exit.

--version Show the version and exit.

Commands:

cp Retrieve a file or directory

feedback Provide feedback to the Spell team

help Display help information

hyper Create hyperparameter searches

info Describes a run. Displays info such as start and end time as well as run parameters such as apts, pips, and mounts.

jupyter Start a Spell Jupyter session

keys Manage public SSH keys registered with Spell

kill Kill a current run

login Log in with your username or email

logout Log out of current session

logs Retrieve logs for a run

ls List resource files

model-servers Manage model servers

passwd Set a new password for your Spell account

ps Display run statuses

rm Specify one or more resources to delete. These can be [run_id] or uploads/[directory]

run Execute a new run

stats Display performance statistics for a run

status Display account and billing information

stop Stop a run with status 'Running'

upload Upload a file or directory

whoami Display current user information

workflow Execute a new workflow

workspaces List workspaces

(spellenv) Azraras-Book-of-Power:~ azrarastoutfrogg$

The universe is opened to you. Your mind dazzles with possibilities. Spells beyond your comprehension! The very power to kill, or to find out who you truly are.

The next command is to log in! These would be the credentials you used when you created your account.

YES. Your wizard name and your word of power. If you have managed to keep them safe from the wizard Twestybeard.

Type:

$ spell login

Enter yon credentials.

And, once again, LO!



(spellenv) Azraras-Book-of-Power:~ azrarastoutfrogg$ spell login

Enter your spell username or email: azrara@stoutfrogg.com

Enter your spell password:

Hello, Azrara Stoutfrogg!

(spellenv) Azraras-Book-of-Power:~ azrarastoutfrogg$

Now you are ready to cast your first spell, a simple cantrip that every apprentice wizard learns first.

If I run this command on my own computer, without invoking the power of Spell, I merely say “echo” (that is, “repeat after me”) and then the phrase that I want echoed. I hear my voice repeated back to me.

(spellenv) Azraras-Book-of-Power:~ azrarastoutfrogg$ echo hello-world

hello-world

(spellenv) Azraras-Book-of-Power:~ azrarastoutfrogg$

If I run this command using Spell, then a lot more happens (including fun emoji). A remote computer leaps to life to run this command for me, with Spell handling the request, and transferring data/programs to and from the remote computer. It takes, um, 30 seconds, and involves double confetti at the end.

(spellenv) Azraras-Book-of-Power:~ azrarastoutfrogg$ spell run echo hello-world

Counting objects: 13, done.

Delta compression using up to 8 threads.

Compressing objects: 100% (13/13), done.

Writing objects: 100% (13/13), 5.06 MiB | 102.00 KiB/s, done.

Total 13 (delta 6), reused 0 (delta 0)

To git.spell.run:aiweirdness/d8f3f7b5925131ebeeba9e32c7e75b6b7c1b52d7.git

* [new branch] HEAD -> br_e8118d3b24aa8a2823900b518fc0e219d1dd5dcc

💫 Casting spell #75…

✨ Stop viewing logs with ^C

✨ Machine_Requested… done

✨ Building… done

✨ Run is running

hello-world

Retrieving modified or new files from the run

✨ Saving… done

✨ Pushing… done

🎉 Total run time: 31.07015s

🎉 Run 75 complete

(spellenv) Azraras-Book-of-Power:~ azrarastoutfrogg$

But wait! We will accelerate the speed of this simple command by harnessing the mighty power (and pretty golden wings) of the K80 GPU!

(spellenv) Azraras-Book-of-Power:~ azrarastoutfrogg$ spell run --machine-type K80 echo hello-world

Everything up-to-date

💫 Casting spell #76…

✨ Stop viewing logs with ^C

✨ Machine_Requested… done

✨ Building… done

✨ Run is running

hello-world

Retrieving modified or new files from the run

✨ Saving… done

✨ Pushing… done

🎉 Total run time: 36.483634s

🎉 Run 76 complete

(spellenv) Azraras-Book-of-Power:~ azrarastoutfrogg$

It takes, um, 36 seconds. We have just performed the computational equivalent of summoning a mighty dragon to open a package of cookies for us.

Never fear. Much like pack-llamas, who are a royal pain beyond imagining for a 1-mile hike and a light picnic lunch, the Spell approach proves its worthiness on the longer journeys.

So, let us try something more complex! Let us use the power of Spell to generate powerful new spells for Dungeons & Dragons! Starting from a list of existing spells, I’ll train a machine learning program called a recurrent neural network (RNN) to generate more. The program I’ll use is textgenrnn, an open-source project by Max Woolf.

First, to keep this endeavor separate from our other projects, we’ll create a folder just for spell and do our work in there.

(spellenv) Azraras-Book-of-Power:~ azrarastoutfrogg$ mkdir spell

(spellenv) Azraras-Book-of-Power:~ azrarastoutfrogg$ cd spell

(spellenv) Azraras-Book-of-Power:spell azrarastoutfrogg$

Now, we get textgenrnn:

We’ll also need a python wrapper file that I wrote for textgenrnn, which lets us build and train new textgenrnn models with a single command, and also handles saving and loading models for us. (This wrapper’s not specific to Spell — you can use it to run textgenrnn locally as well)

And then copy our dataset of D&D spell names into its own folder inside textgenrnn/data . Let’s call the folder dd_spells , so the file lives at textgenrnn/data/dd_spells/Combined_DnD_spells.txt (we can get it from github, but don’t do it via git clone, because Spell will be confused if the file belongs to two nested git repositories at once).

Things to download: textgenrnn python save wrapper, ice cream dataset, d&d spells dataset, d&d character bios dataset.

Use the download ZIP button; don’t git clone inside textgenrnn!

And a handy dandy diagram of where the files we downloaded should be (and how to rename the folders if you want to copy/paste my code).

textgenrnn

|-- data

| |-- dd_spells (don't git clone me!)

| | |-- Combined_DnD_spells.txt

| |-- ice_cream (don't git clone me either!)

| | |-- IceCream_sorted.txt

| |-- dd_bios (no cloning for me!)

| | |-- dd_bios.csv

|-- textgenrnn_save_wrapper.py

Now I’ve got all the downloaded datasets in my data folder, ready to go. And I’ve got the wrapper python file we’ll use for running textgenrnn at the root (top) folder.

I need to tell my local project that these exist, though. Spell enforces the religious use of git, so it’s worth looking up basic git commands, but here are the ones we’ll mostly use:

$ git add -A -n

Adds all new/changed files to the git repository, handles deleted files and folders as well. The -n makes it a dry run. You only need to run this command if you want to see what’s changed.

$ git add -A

Adds all new/changed files to the git repository, handles deleted files and folders as well. Does it for real this time.

$ git commit -m "Explain what I was trying to do"

Commits these changes to the git repository.

If you forget to do these, Spell will complain when you try to run the spell remotely.
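If you would like to rehearse these incantations somewhere safe first, here is the whole sequence in a throwaway repository (the file name, identity, and commit message are made up for illustration):

```shell
# Practice in a scratch repository so no real project is harmed
scratch=$(mktemp -d)
cd "$scratch"
git init -q .
git config user.name "Azrara Stoutfrogg"   # placeholder identity for the demo
git config user.email "azrara@example.com"

echo "Fireball" > spells.txt   # a made-up data file

git add -A -n    # dry run: lists what WOULD be staged
git add -A       # stage everything for real
git commit -q -m "add spell list"

git log --oneline   # one commit, ready for spell run
```

Once the commit exists, Spell has a snapshot it can ship to the remote machine.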

Now, the magic happens! (Pun intended. Puns are a powerful source of mana.)

To train a new model from the text file in data/dd_spells/Combined_DnD_spells.txt and save in weights/dd_spells :

$ spell run --machine-type K80 --pip textgenrnn \

"python textgenrnn_save_wrapper.py --model_name=dd_spells \

--new_model --num_epochs=1 \

--data_file=dd_spells/Combined_DnD_spells.txt"

(If you want to use a CPU instead, leave out --machine-type K80 .)

You’ll see a lot of output, but you may sit back, sip your dragonflower tea, and wait for sweet spells to come. If you’re using the K80, training will take about a minute.

(spellenv) Azraras-Book-of-Power:textgenrnn azrarastoutfrogg$ spell run --machine-type K80 --pip textgenrnn \

"python textgenrnn_save_wrapper.py \

--model_name=dd_spells \

--new_model --num_epochs=1 \

--data_file=dd_spells/Combined_DnD_spells.txt"

Everything up-to-date

💫 Casting spell #77…

✨ Stop viewing logs with ^C

✨ Machine_Requested… done

✨ Building… done

✨ Run is running

Using TensorFlow backend.

2018-10-30 03:30:28.448246: I

...

Training new model w/ 2-layer, 128-cell LSTMs

Training on 253,682 character sequences.

Epoch 1/1

   1/1981 [..............................] - ETA: 1:43:24 - loss: 3.9558

   3/1981 [..............................] - ETA: 35:08 - loss: 4.4925

   5/1981 [..............................] - ETA: 21:28 - loss: 4.2348

   7/1981 [..............................] - ETA: 15:36 - loss: 3.9939

   9/1981 [..............................] - ETA: 12:20 - loss: 3.7671

  11/1981 [..............................] - ETA: 10:15 - loss: 3.6544

  13/1981 [..............................] - ETA: 8:48 - loss: 3.5629

  15/1981 [..............................] - ETA: 7:44 - loss: 3.4864

  17/1981 [..............................] - ETA: 6:55 - loss: 3.4237

  19/1981 [..............................] - ETA: 6:16 - loss: 3.3784

  22/1981 [..............................] - ETA: 5:30 - loss: 3.3378

  25/1981 [..............................] - ETA: 4:56 - loss: 3.2933

  28/1981 [..............................] - ETA: 4:28 - loss: 3.2542

  31/1981 [..............................] - ETA: 4:06 - loss: 3.2299

  34/1981 [..............................] - ETA: 3:48 - loss: 3.2108

  37/1981 [..............................] - ETA: 3:32 - loss: 3.1897

… (trimmed output from the intervening lines)

1975/1981 [============================>.] - ETA: 0s - loss: 1.8494

1978/1981 [============================>.] - ETA: 0s - loss: 1.8488

1981/1981 [==============================] - 47s 24ms/step - loss: 1.8480

####################

Temperature: 0.2

####################

Stone of the Sharm Spell Share Sharow of the Share

####################

Temperature: 0.5

####################

Alarce of the Stone Summon Conding Peath Chee Freath

####################

Temperature: 1.0

####################

Shiff of Canst Wall of Turous Enival Chental Redicting

✨ Saving… done

✨ Pushing… done

🎉 Total run time: 1m20.255094s

🎉 Run 77 complete

(spellenv) Azraras-Book-of-Power:textgenrnn azrarastoutfrogg$

The above command had textgenrnn look at the entire dataset just once (that is, one epoch). Loss is textgenrnn’s measure of its own progress — the better it thinks it is at matching the input dataset, the lower the loss. Then, at the end of the epoch, it output some example spells. The temperature is like a creativity level, governing how likely textgenrnn is to go with its top choice when adding new letters.
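Temperature isn’t magic (well, not only magic); it just rescales the model’s probabilities before sampling. Here is a minimal sketch in plain Python, with made-up next-letter probabilities standing in for the model’s real output:

```python
import math
import random

def sample_with_temperature(probs, temperature, rng):
    """Sample an index from `probs` after rescaling by `temperature`.

    Low temperature sharpens the distribution (the top choice almost
    always wins); temperature 1.0 leaves it unchanged; high temperature
    flattens it toward uniform weirdness.
    """
    logits = [math.log(p) / temperature for p in probs]
    peak = max(logits)
    weights = [math.exp(x - peak) for x in logits]  # softmax, shifted for numerical stability
    return rng.choices(range(len(probs)), weights=weights, k=1)[0]

# Made-up probabilities for the three candidate next letters
probs = [0.7, 0.2, 0.1]
rng = random.Random(42)

cautious = [sample_with_temperature(probs, 0.2, rng) for _ in range(1000)]
creative = [sample_with_temperature(probs, 1.0, rng) for _ in range(1000)]

print(cautious.count(0))  # nearly all 1000 picks are the top choice
print(creative.count(0))  # roughly 700: the original 0.7 probability
```

At temperature 0.2 the 0.7 probability gets raised to the fifth power before renormalizing, so the top choice wins about 99.8% of the time; at 1.0 the model’s own odds are used as-is.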

Our progress is automatically saved on Spell, and we can use our saved models to keep training this model or to generate more spells. We can use this command to see all the model files that textgenrnn_save_wrapper.py saved for us.

# replace 77 with your run id

$ spell ls runs/77

- - weights

$ spell ls runs/77/weights

- - dd_spells

$ spell ls runs/77/weights/dd_spells

188 Nov 05 10:35 textgenrnn_config.json

461 Nov 05 10:35 textgenrnn_vocab.json

1117528 Nov 05 10:35 textgenrnn_weights.hdf5

Behold! We have our weights/dd_spells directory as expected.

Let’s generate more spells!

The python wrapper automatically saves our model in weights/dd_spells , which we saw was saved to the output of runs/77 .

To sample from the model saved in weights/dd_spells , we will need to add (or in Spell’s parlance, mount) the weights directory to our new run. We can do this using the --mount or -m flag.

The information to the left of the colon tells Spell what we’re mounting (in this case the directory runs/77/weights ) and the information to the right of the colon tells Spell what to call the mounted directory in our new run (in this case, we’re keeping the name of the directory the same, weights , since that’s what our wrapper is expecting).

$ spell run --pip textgenrnn --mount runs/77/weights:weights \

"python textgenrnn_save_wrapper.py \

--model_name=dd_spells --n_gen=10 --temperature=0.2"
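The --mount value is just a colon-separated source:destination pair. A toy illustration in Python (not Spell’s actual parsing code, just the shape of the flag):

```python
def parse_mount(value):
    """Split a source:destination mount flag into its two halves."""
    source, _, destination = value.partition(":")
    return source, destination

src, dest = parse_mount("runs/77/weights:weights")
print(src)   # what we're mounting, from a previous run
print(dest)  # what the directory is called inside the new run
```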

Note that we ran this using CPU, because we don’t need GPU acceleration for this one. This command will give us 10 spells at a very low temperature. That is, spells that the model deems “safe”.

Sharm of the Shape

Sharing Shape

Sharm of the Strike

Shadow Sharing Thants

Sharm of the Sharity

Sharper of the Sharing Chants

Chant of the Share

Sharm of the Sharm

Chanter Share

Shadow Stone

Suppose we wanted something more exotic. Use a higher temperature!

$ spell run --pip textgenrnn --mount runs/77/weights:weights \

"python textgenrnn_save_wrapper.py \

--model_name=dd_spells --n_gen=10 --temperature=1.0"

Then we get the following:

Weallin Shoct of mutQilefry Lightted Undick Drickes Dimp of the Aumthaus Touch of the Four Warves, Uningal Peats Shael of the Braveos Count of the Nsites, Greater Repengue Anlumonrat/isclaunt Seleak's Cosu Constal Sheep ob the Enroud

They are certainly more exotic, that’s for sure. Cast them with great caution, for these are the spells created by a very incompletely trained model.

If we seek spells of just one type, we can specify a beginning that each spell should have. Let us try fire, perhaps?

$ spell run --pip textgenrnn --mount runs/77/weights:weights "python textgenrnn_save_wrapper.py --model_name=dd_spells --n_gen=10 --temperature=1.0 --prefix='flaming'"

flaming tofy

flaming Summonin

flaming Bersoous

flaming Licks

flaming Entral

flaminggshosmity

flaming Weith

flaming Breath of tag Tatce

flaming Presictralat

flaming Fire, Greath

Note: do NOT cast “flaming tofy”, it is NOT candy. Don’t ask how I know.

Let us train our model to a higher level of sophistication, so our spells are not quite so volatile.

To get textgenrnn_save_wrapper.py and Spell to play nicely, we first need to download the weights from our earlier run, 77. To save the results and add them to git, we do as before:

$ spell cp runs/77

$ git add -A && git commit -m "results from first D&D spell training"

Now let’s load a model from dd_spells and train for another epoch on the text file in data/dd_spells/Combined_DnD_spells.txt (notice I’m using the shortcuts -t for --machine-type and -m for --mount ):

$ spell run -t K80 --pip textgenrnn -m runs/77/weights/dd_spells:weights/dd_spells "python textgenrnn_save_wrapper.py --model_name=dd_spells --num_epochs=1 --data_file=dd_spells/Combined_DnD_spells.txt"

Note that we have removed --new_model from our command. This means instead of starting the model from scratch, we’re starting from our saved model.

As before, the model goes and trains without us, and after a minute or so (about 10x longer if trained on CPU), it returns with answers from the new, improved model.

1977/1981 [============================>.] - ETA: 0s - loss: 1.6396

1980/1981 [============================>.] - ETA: 0s - loss: 1.6394

1981/1981 [==============================] - 45s 23ms/step - loss: 1.6394

####################

Temperature: 0.2

####################

Summon Shape IIII Shadow Shape Spell Shape

####################

Temperature: 0.5

####################

Sonic Star Trap Spirit Hail of Death

####################

Temperature: 1.0

####################

Obo Starg Preserve Trump Jightning Hadance

✨ Saving… done

✨ Pushing… done

🎉 Total run time: 1m10.063085s

🎉 Run 79 complete

However, as indicated by one of the spells the model generated at temperature 1.0, this model appears to be cursed. It happens. This is a sobering reminder that the outputs of text-generating neural nets are not to be used unless first screened by a human of good taste. A neural net may generate delight one moment, and Badthnig Slurr in the next. We may be comforted to know that this neural net has no idea what words mean.

Nevertheless, sensing a great disturbance in the balance of the universe, we may decide to train our neural network to generate things less likely to create/summon/preserve chaotic evil abominations. We may decide to train it to generate ice cream flavors instead.

We can keep training on the same model, but just specify a different data_file . Since each spell we cast is saved in its own run on Spell, we don’t need to worry about overwriting our previous model. We just need to note the run id that resulted in each different model.

Here, we load a model from dd_spells , train on the text file in ice_cream/IceCream_sorted.txt and save the new model as dd_ice_cream .

$ spell run -t K80 --pip textgenrnn -m runs/77/weights/dd_spells:weights/dd_spells "python textgenrnn_save_wrapper.py --model_name=dd_spells --save_name=dd_ice_cream --num_epochs=1 --data_file=ice_cream/IceCream_sorted.txt"

The result is something like this:

####################

Temperature: 0.2

####################

Caramel Chocolate Chocolate Chocolate Banana Caramel Chip

####################

Temperature: 0.5

####################

Batterberry Cream Coffee Heanted Madge Caramel Cheesecake Black Cerry Cheese

####################

Temperature: 1.0

####################

Blicking Dark Caramel Salted Wanana Maded Peanut Burt Cheesecacach Beanbutter Appleder

✨ Saving… done

✨ Pushing… done

🎉 Total run time: 32.842697s

🎉 Run 80 complete

To sample from this trained model, we just mount the outputted model from our ice cream run ( runs/80/weights/dd_ice_cream ) and change the name of the model we’re sampling from.

$ spell run -t K80 --pip textgenrnn -m runs/80/weights/dd_ice_cream:weights/dd_ice_cream "python textgenrnn_save_wrapper.py --model_name=dd_ice_cream --n_gen=10 --temperature=0.6"

Result:

Boney Poconut

Fresh Licked Chocolate Beap

Oane Coffee Almond Pream

S'n HOrdes Backberry

Mandy Almond Cream

Milk Chocolate Pecan Honey

Frownie Dourted Pover

Malted And Meppermint

Ored Chocolate Chip

Chrockely Pie

Delightfully, the model has not completely forgotten about its past as a D&D spell-generating model.

So, by using the mighty power of Spell’s GPUs, we can train models much faster than we could on a CPU. It’s nice to have to wait seconds rather than many minutes for glorious (or cursed) new D&D spells to appear. But where this comes in really handy is training on larger, more complex datasets, text that spans entire paragraphs rather than a couple of words.

For my next demonstration, I will attempt to generate the very fabric of our universe, the mysterious forces that make us who we are, that tragically orphan needless numbers of us, that make us wandering misfits with cool swords, that hide strange numbers of lost princes, princesses, and princex among us. Yes, I am speaking of D&D character bios.

To train for sentences and paragraphs, where each line has something to do with the previous line, we need only add another flag to our call.

$ spell run -t K80 --pip textgenrnn \

"python textgenrnn_save_wrapper.py --model_name=dd_bios \

--large_text --new_model --num_epochs=6 \

--data_file=dd_bios/dd_bios.csv"

✨ Run is saving

Nov 08 14:12:32: saving: Saving 'weights/dd_bios'

Nov 08 14:12:32: saving: Saving 'textgenrnn_config.json'

Nov 08 14:12:32: saving: Saving 'textgenrnn_weights.hdf5'

Nov 08 14:12:33: saving: Saving 'textgenrnn_vocab.json'

Nov 08 14:12:33: saving: Compressing saved files

✨ Run is pushing

Nov 08 14:12:35: pushing: Saving build environment for future runs

✨ Total run time: 1h0m9.898649s

✨ Run 81 complete

Note that this HAS to be a new model. We can’t train a --large_text model starting from a regular model.

Sampling is a little different, too, since we want one large chunk of text. We can specify the length of our text with --max_gen_length .

$ spell run --pip textgenrnn --mount runs/81/weights/dd_bios:weights/dd_bios "python textgenrnn_save_wrapper.py --model_name=dd_bios --large_text --n_gen=1 --max_gen_length=6000 --temperature=0.4"

And when we sample, we see the life story of a fellow adventurer laid bare.

Cras the Griffstorgoun

Half-elf

Ranger

“Raria was raised in a simple and crazy with the ship who were the surface and a small village of the wards of the trees of the collective and started a path of the world with his demon and his head on the community of a group of her parents were her around and wanted to be a strength of respect. He has been actually a strange noble town of the Summer of an adventure in the destruction of the temple of the Gracks of the Cales of the Older charming the way of the halfling with the princess of the forest of his mother who would be her first child, and the only thing along the cave in the group of course, and he was the trade of an adventurer that he was the stone of the town of the world, and the religions of the slavers were a few months of the manor he could be the creatures of the streets of his mother and the strength of an underground to his tendency at the age of 15 and bear the new world with a power of the family of the Surrowan. Thording her family was a great crowd and the right that he was a storm of the Neerly start of the Aris, and a bit of her father and so he was born in the world. He can take the same archiness in the dribble of the wilds. Traditionally, he was provided with a protective and leave her to do what they were the only child of the library. The former family was a strength and as the Hunter was only to be a way of his learning and she had spent in the heart of the streets needed to the River and the elves of the Grando.

The life story makes strangely little sense. I have not heard of the temple of the Gracks of the Cales of the Older, or the elves of the Grando, nor do I know what is meant by Thording. Sampling at a higher temperature setting only makes the words weirder.

Kirshalisan NhaKay Aestraus

Human

Warlock

“Sister is a life in a sking of pushed that the village couldnt show strong with a despite of the Empire of More:pission. One mustak’s father came more than a resparent who scraps for questions for the manor he gets at the lords. When he met Kalia had died to recondect to some of the company. At 32 years as the nest of the truth, what they harden gorguttel! In tinges by a facing for the boy maxest. Ive what he liked the Sense of adventures to the fellow and shiping with apprenticed by his way in the age of his head.

There is one more strategy we can bring to bear — insisting that the neural network use only words that it has seen before, and only the most common words.

This method trains a model that builds text word by word instead of letter by letter. The default vocabulary size is 20,000 words, so it can still generate a LOT of weird stuff. But it can make better use of its memory, since the phrase, say, “I cast lemonsflight” is now only 3 tokens long instead of 19 characters.
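To see the memory savings, compare how long the same phrase is in character tokens versus word tokens (plain Python; textgenrnn’s real word-level tokenizer does more housekeeping, such as capping the vocabulary, which this sketch skips):

```python
phrase = "I cast lemonsflight"

char_tokens = list(phrase)    # character-level: one token per letter or space
word_tokens = phrase.split()  # word-level: one token per word

print(len(char_tokens))  # 19 tokens for a character-level model
print(len(word_tokens))  # 3 tokens for a word-level model
```

Shorter sequences mean the recurrent network’s fixed-size memory has to stretch over fewer steps to remember the start of a sentence.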

Here’s the command to create a new model named dd_bios_word and train it on our file dd_bios.csv .

$ spell run --machine-type K80 --pip textgenrnn \

"python textgenrnn_save_wrapper.py --model_name=dd_bios_word \

--new_model --large_text --word_level --num_epochs=5 \

--data_file=dd_bios/dd_bios.csv"

####################

Temperature: 1.0

####################

in that situations , my anger was mischief by my heritage . after being expected , officially spending more time , zaela realized that they had just seen having things to gain arrogant , much anyway . thus wasn ' t toward her in names . however her parents were now treated her some joy to her new mission , and gave her at the retreat . she proved strangely pretty hard , but was being quite enough for a loud attempt to lie and unlikely on her own finger . her brother married , long lived for the natanian as he learned of magic . however , he loved and studied for his upbringing . having left to live in the forest with his adoptive mother , some well , and some early fewer waking upon his fallen and mother traveled 6 since , tending the emergency visit of the by families . this news of the prince was always clear the feathers to faded , a breland king , a dwarven into the heart of a great city . unfortunately , when the man died a child and his birth , left his father in the deepest caves , having been studying the middle unexpectedly ' s apprentice , not kindly to goblins , becoming a people of hire . "

✨ Saving… done

✨ Pushing… done

🎉 Total run time: 19m47.463368s

🎉 Run 83 complete

And to sample the generated text:

$ spell run --pip textgenrnn -m runs/83/weights/dd_bios_word:weights/dd_bios_word "python textgenrnn_save_wrapper.py --model_name=dd_bios_word --large_text --word_level --n_gen=20 --max_gen_length=6000 --temperature=1.0"

half — elf “ here is the understanding of the tavern . monk above — orcs , micah found his first custom from the community of both selling the fighting sparkle of the world . it now exploring the recent lily of the back of the city of neverwinter , nearly whims and takes a few more finnton expensive style books on jewelry . “ the bad people have seen straight to her for music , though she knows nothing in her mind — arcane or thats bone . for wonder , she heard from good memory that her got talking and noya got on itself . warlock born only child warlock on strict long few years ago , not this need . if she discover the past my story came upon , the fire decided that i had an day ! and luck , i still woke up in there . much about said ( you would be forward to me ? investigation a pirate voice appeared and i could have enough control moving me ? i sleep ! , maybe others write , i know it had an air danger . i was no longer welcome into the world . eventually , we wandered the incident , left their suspicions about the shadow , lands on the price . it was here im encouraged to certainly avoid my hands . we would them believe the cook we were the next thing .

If I’ve got this right, we have a half-elf monk named “Here is the Understanding of the Tavern”, and a nameless, raceless warlock who had an day and possibly an air danger. Truly, artificial intelligence is astounding.

Now that we’re done casting spells, don’t forget to close your virtual machine and clean up your — oh, that’s right. There’s no cleanup to do. The GPUs are already back on their perches, singing their famous choral music. You’re only charged for the time they’re actually running your GPU commands, and not for storage, or for the time you spend mucking about changing directories and forgetting where you put your spells and looking up those pesky incantations. Scoop yourself a dish of chrockely pie ice cream, settle down into a comfy chair, and stream some serials to your crystal ball. You’ve earned it.