Truly generative musical scores in games have been few and far between, and “music games” has traditionally meant arcade-style rhythm games in which you repeat phrases or whole songs as accurately as possible. Pugs Luv Beats breaks those molds. Part of a vanguard of new gaming creations that generate dynamic music on the fly, it marries grid-based sequencing with resource-gathering gameplay, blurring music making and play together. The interactively produced music could itself become a new way of delivering a musical signature via sound packs.

And beneath it all lurks a free and open source library, libpd – the embeddable version of the tried-and-true free graphical music environment Pure Data. (That library is now on GitHub, and vastly updated, by the way, and we’re expecting a book soon from the library’s principal author, Peter Brinkmann.)

Oh, yeah, and don’t forget about some seriously addictive gameplay and adorable pugs. I’m suddenly not concerned about the 15 hours of Europe-to-North-America travel I’m doing tomorrow.

Here’s what the gameplay looks like, since it’s much easier to see:

Pugs Luv Beats was just approved on the iTunes App Store for iPhone and iPad.

Co-creator Yann Seznec (The Amazing Rolo) is a terrific musician; I just caught up with him in Edinburgh and Berlin and watched him play a homebrewed pig-gut instrument with Matthew Herbert for the performance piece “One Pig,” on tour at Berghain. Working with Pd allowed Yann to focus on those musical impulses rather than just engineering, and to try things he otherwise would never have imagined on a mobile title. So I asked Yann to walk us through how the project was built. He responded with an exhaustively-detailed examination of the evolution of this title, right down to the Pd patches. (Click through for high-res versions.) If your New Year’s Resolution involves doing something with patching, you might want to hang onto these answers. Here’s Yann:

The origins of Pugs Luv Beats date back about two years. After making [musical iPhone game] Mujik, Jon (Jonathan Brodsky, aka jonbro) and I were trying to think of other approaches to music mobile app design, and we started thinking more and more about games. Music games, as a whole, offer an oddly passive and traditionalist experience – you play along with a premade track, and you are judged on your accuracy and flair (which is strangely reminiscent of the music conservatory mindset…). Obviously there are exceptions (RjDj’s Dimensions, Elektroplankton, etc.), but there you go.

Particularly interesting to me was the idea that game mechanics are often very similar to compositional techniques. For example, when Sonic runs at normal speed he collects rings at one rate; when he powers up and goes super fast, he collects rings at a much higher rate. This could be compared to introducing a melody and then speeding it up – and when there are two players, doing this with two melodies. Instant fugue!

We started looking at how we could make a music game where the music and the game elements were fully intertwined and augmented by each other. So Jon prototyped a space shooter drum machine. It was awesome.

To make a (very very very long and boring) story short, our idea and prototype landed us some funding from Channel 4 and Creative Scotland to work on games that focus on musical creativity and composition.

For various reasons, we decided to put aside the space shooter drum machine for a while, and start from scratch. After going through several full prototyping iterations we eventually settled on a core game mechanic that turned out to be in many ways similar to a Tenori-on [Yamaha grid instrument]/Boiingg-style [monome hardware patch] music generation system – in our final prototype, you controlled a series of little dots that moved around the screen, creating loops. This is super fun from a musical perspective because it’s easy and rewarding within a few seconds, and when you have several loops going it can gain some pretty serious rhythmic and melodic depth.

The key from there for us was turning this into a game. We had been using free Internet graphics packs up until then (we hadn’t yet hired our artist, Sean), one of which featured a ladybug, so we had been referring to the main characters as ‘bugs’. During some discussion one of us accidentally said ‘pugs’, and the game idea was born. We constructed a story about pugs and their love for beets (like the vegetable) which create beats (ha!), and how their love turned into greed and got out of control, destroying their world. The game, therefore, is about helping the pugs rebuild their lost civilization by guiding them to create beats. You grow your galaxy by collecting beats, which you do most efficiently when you dress your pugs up in costumes. What’s not to like?

To get to the part that I imagine CDM readers are most interested in: the app development was done by Jon using openFrameworks, [lightweight language] Lua, and our own game engine called Blud, and the audio is all done in Pure Data using libpd (through ofxPd). In hindsight, we started using libpd really late in the game, just at the very end of the prototyping stage, which was rather silly. Our adoption of libpd basically made our dev cycle about a million times more efficient. My background is as a musician and sound designer, and I have very little coding knowledge. I do, however, have lots of knowledge of Max/MSP, so picking up Pure Data was pretty easy. This allowed Jon to completely pass off all the audio processing (not to mention aesthetic sound design choices) to me, saving him loads of time, giving me direct control over the sound, and letting me test and prototype different approaches to audio within an environment that I knew would be recreated in the game. Also, as Jon mentioned to me recently, by using Pd we are able to take advantage of 20 years of audio DSP research and development. Pretty amazing.

How it all works:

The entire audio engine is contained within this patch. Pardon the messiness.

The simplest part of the patch is the “sounds” section, which is used to play back simple sound effects, for the most part linked to interface actions in the game. I did this by creating a very simple patch which plays a sound when it receives a bang. Which sound it plays is dictated by the argument (in this case, the sound of discovering a new capsule). The process for adding a new sound, then, is as simple as adding the sound file to the /assets/sounds/ folder, making a new instance of “sounds.pd”, and naming it the same as the new sound. Jon, in the project code, created a list called “sounds” which is sent into Pure Data. When that list contains “capsule”, a bang is sent into that subpatch, and the sound is played.
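As a rough illustration of that routing – in Python rather than Pd, with hypothetical class and sound names standing in for the actual patch – the logic amounts to: each named player responds only when the incoming “sounds” list contains its own name.

```python
# Hedged sketch of the sound-effect routing described above; this is not
# the game's code, just the same dispatch idea expressed in Python.

class SoundPlayer:
    """Stands in for one instance of sounds.pd, created with a name argument."""
    def __init__(self, name):
        self.name = name          # e.g. "capsule" maps to /assets/sounds/capsule
        self.play_count = 0

    def bang(self):
        self.play_count += 1      # in Pd, receiving a bang triggers sample playback

def route_sounds(message, players):
    """Dispatch an incoming 'sounds' list to any players whose name it contains."""
    for name in message:
        if name in players:
            players[name].bang()

# Hypothetical sound names; "capsule" is the one mentioned in the text.
players = {name: SoundPlayer(name) for name in ("capsule", "click", "fanfare")}
route_sounds(["capsule"], players)   # the game discovered a new capsule
```

Adding a new effect then means dropping a file in the folder and adding one more named player – no dispatch code changes.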

A more complex version of what can be done with this type of data is heard in the voice of Mr Puggles, who helps you learn how to play the game. Mr Puggles pops on and off the screen to guide you through the first few worlds, and when he does, he sends Pure Data a “puggleShow” or “puggleHide” signal. I wanted to give Puggles a funny synthesizer voice that was different every time – dead simple in Pd. To do that, I take the puggleShow bang and use it to trigger five more bangs, spaced out over a second. Each of these bangs triggers a random number which is translated into a MIDI note. This note controls the pitch of two oscillators (a sine and a sawtooth), one of which is detuned so the two sit at slightly different pitches. These are played through a short volume envelope and a filter which is also controlled by a random number generator. Result? Hilarious beeping, boopy Mr Puggles voice, all coming from one bang.
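The note-generation part of that voice can be sketched outside Pd. This is an assumption-laden Python sketch, not the actual patch: the note range (MIDI 60–84) and the half-semitone detune are guesses, and the envelope and random filter are left out entirely.

```python
import random

A4 = 440.0

def midi_to_hz(note):
    """Standard MIDI-note-to-frequency conversion."""
    return A4 * 2 ** ((note - 69) / 12)

def puggle_voice(seed=None, blips=5, detune=0.5):
    """Sketch of the puggleShow logic: five random notes spaced over one
    second, each voiced by a detuned sine/sawtooth oscillator pair."""
    rng = random.Random(seed)
    events = []
    for i in range(blips):
        note = rng.randint(60, 84)          # hypothetical "cheerful" note range
        events.append({
            "time": i * (1.0 / blips),      # blips spaced out over one second
            "saw_hz": midi_to_hz(note),
            "sine_hz": midi_to_hz(note + detune),  # slight detune between the pair
        })
    return events

blips = puggle_voice(seed=1)
```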

Every time a player buys or selects a planet, a short list is sent to Pure Data consisting of the planet BPM and a random number seed. The BPM is used to calculate delay times and such, and the random number seed is used to create a sort of musical identity for the planet. This is done by choosing a “beat library” and a musical mode.
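The key property here is determinism: the same seed must always yield the same identity, so a planet sounds the same every visit. A minimal Python sketch of that idea – the library names, the mode list, and the sixteenth-note delay calculation are all assumptions, not details from the patch:

```python
import random

BEAT_LIBRARIES = ["snow_kit", "lava_kit", "grass_kit"]   # hypothetical names
MODES = ["ionian", "dorian", "phrygian", "lydian", "mixolydian", "aeolian"]

def planet_identity(bpm, seed):
    """Seeded per-planet setup: the seed deterministically picks a beat
    library and a mode, and the BPM yields a delay time (here, one
    sixteenth note in milliseconds, as an example of 'delay times and such')."""
    rng = random.Random(seed)
    return {
        "beat_library": rng.choice(BEAT_LIBRARIES),
        "mode": rng.choice(MODES),
        "sixteenth_ms": 60000.0 / bpm / 4,
    }

a = planet_identity(120, seed=42)
b = planet_identity(120, seed=42)   # revisiting the planet: identical result
```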

The mode is created by building a lookup table that chooses the notes from a chromatic scale that would be used in a particular mode. For example, a major scale (Ionian mode) uses notes 1, 3, 5, 6, 8, 10, and 12. Each melodic sound library I used consists of a full chromatic octave, and the notes that are played on any given planet are controlled by this table. This ensures not only that all of the different sound libraries being played on a planet will be in the same key, but also that a planet will have a strong melodic identity.
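The Ionian row below is taken straight from the text (1-indexed positions in the chromatic octave); the Aeolian row is my own addition for comparison, and the wraparound on out-of-range degrees is an assumption about how such a table would be read.

```python
# Mode lookup tables: scale degree -> 1-indexed note in the chromatic octave.

MODE_TABLES = {
    "ionian":  [1, 3, 5, 6, 8, 10, 12],   # major scale, as stated in the text
    "aeolian": [1, 3, 4, 6, 8, 9, 11],    # natural minor, added for comparison
}

def degree_to_chromatic(mode, degree):
    """Map a 1-indexed scale degree to its chromatic note, wrapping past
    the seventh degree back to the start of the table."""
    table = MODE_TABLES[mode]
    return table[(degree - 1) % len(table)]
```

So degree 5 of the major scale lands on chromatic note 8 (the perfect fifth, seven semitones up), which is why every library on a planet stays in key: they all index the same table.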

The sound libraries in the game are all controlled by the pugs on the planets. As they run around, each time they land they will trigger a sound. The type of sound is dependent on what terrain they are on – thus, if they run through the snow they play a toy piano, if they run through lava a distorted guitar, etc. There are two states of playing the sound, one if the player deliberately tells the pug to go to that tile, and the second if the pug is traveling over that tile to get somewhere else. It’s super easy to do that kind of thing in Pd; just set up two different ‘play sound’ envelopes, maybe a little extra delay or reverb, and you’re done!
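A tiny sketch of that terrain dispatch, under stated assumptions: the snow/toy-piano and lava/guitar pairings come from the text, while the "grass"/marimba entry and the "accent"/"passing" envelope labels are hypothetical.

```python
# Hypothetical terrain-to-instrument mapping; only snow and lava are from the text.
TERRAIN_SOUNDS = {
    "snow": "toy_piano",
    "lava": "distorted_guitar",
    "grass": "marimba",       # assumption
}

def pug_lands(terrain, deliberate):
    """Pick a sound library from the terrain, and one of two voicings:
    a full 'accent' envelope when the player sent the pug to this tile,
    a quieter 'passing' envelope when it is just traveling through."""
    library = TERRAIN_SOUNDS[terrain]
    envelope = "accent" if deliberate else "passing"
    return library, envelope
```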

The final piece of the puzzle for making the pugs running around into music is to make each tile be a different note. The terrain of each planet is created by making a sort of height map, where different heights correspond with different terrain types (grass, water, snow, etc). This also means that each tile has a unique number between 0 and 1. When the player buys or selects a planet, a giant random number table is generated in Pure Data which creates a number between 1 and 13 for each possible value between 0 and 1. That number is then used to pick the note from the mode. This somewhat convoluted approach again lets us make sure that each planet will have a unique, but fully reproducible, musical character.
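A sketch of that table, with one loud assumption: the text doesn't say how finely the 0-to-1 height range is quantized, so the table resolution below is invented. The 1-to-13 range matches a chromatic octave inclusive of the upper octave note.

```python
import random

def build_note_table(seed, resolution=128):
    """Seeded per-planet note table: for each quantized height value in
    [0, 1], pick a note from 1 to 13. The resolution is an assumption."""
    rng = random.Random(seed)
    return [rng.randint(1, 13) for _ in range(resolution)]

def height_to_note(table, height):
    """Quantize a tile height in [0, 1] to an index into the table."""
    index = min(int(height * len(table)), len(table) - 1)
    return table[index]

table = build_note_table(seed=7)
```

Because the table is rebuilt from the planet's seed each time, the mapping is random-sounding yet fully reproducible, exactly the property the text describes.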

The actual playing of the sounds is probably the messiest part of the patch structure. Purists look away now.

I wanted to make sure this part of the patch was as flexible as possible, so I ended up using the soundfiler and tabread~ objects rather than tabplay~ – which works great in practice, though it does look rather uncouth. Additionally, I had some limitations imposed upon the structure of the patch – namely, I had to keep the number of tables down as much as possible to save on memory. So each sound bank has two-voice polyphony – but there are many sound banks, and the beats and sound effects aren’t counted in this, so the limitation is not really audible in the final product at all. It did mean I had to work out a decent voice-allocation system, though!
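The text doesn't describe the allocation scheme itself, so here is a sketch of one common approach for a two-voice bank: take a free voice if one exists, otherwise steal the oldest sounding voice.

```python
class VoiceAllocator:
    """Sketch of a two-voice-per-bank allocator using oldest-voice stealing.
    This is one standard scheme, not necessarily the one used in the game."""
    def __init__(self, voices=2):
        self.voices = [None] * voices   # note currently held by each voice slot
        self.order = []                 # slot indices, oldest trigger first

    def trigger(self, note):
        """Assign a note to a voice slot and return the slot index."""
        if None in self.voices:
            slot = self.voices.index(None)   # use a free voice
        else:
            slot = self.order.pop(0)         # both busy: steal the oldest
        self.voices[slot] = note
        self.order.append(slot)
        return slot

alloc = VoiceAllocator()
```

With only two voices per bank, a third trigger silently replaces the oldest note – which, spread across many banks, is exactly why the limit goes unheard.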

I think my memory issues were probably my only problem with using Pd in this project – though only indirectly. As I mentioned, they were hardly a problem artistically; it just took me a while to get used to the idea that not everything I patched on a computer would work on an iPhone. Similarly, I had to be very careful about things like relative volumes. In a generative music game like Pugs Luv Beats, the player can quite easily send 15 pugs running around making sound, which mounts up pretty quickly. It means that all of the patches and sounds need to be designed to withstand lots of triggering without distorting. None of these things are real problems; all they require is regular testing on devices and simulators – something every mobile developer is already used to.
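The text doesn't say how the patches keep levels in check, but one common way to survive 15 pugs at once is equal-power gain scaling under a fixed headroom ceiling – a hedged sketch, not the game's actual gain staging:

```python
import math

def voice_gain(active_voices, headroom_db=-3.0):
    """Scale each voice's gain by 1/sqrt(n) so that the summed power of n
    uncorrelated voices stays roughly constant, under a headroom ceiling.
    The -3 dB ceiling is an arbitrary example value."""
    if active_voices < 1:
        return 0.0
    ceiling = 10 ** (headroom_db / 20)       # dB -> linear amplitude
    return ceiling / math.sqrt(active_voices)
```

Quadrupling the number of sounding pugs halves each voice's amplitude, so the mix gets denser without ever slamming into the output.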

That’s the Pure Data audio engine in a nutshell. The end result is a flexible and powerful audio engine that sounds really great and is fully integrated into Pugs Luv Beats. The game is a great combination of music, silliness, and strategy – there’s a bit of something in there for everyone. You can definitely just play with the game to make beats, or you can try and collect all of the costumes, or you can try and make the most efficient planet ever. You can also explore the galaxies being made by your Game Center friends, to hear what they’re up to.