Spellcasters Beat Detection Adventure

My first experience with rhythm-based games was the addictive PaRappa the Rapper and later the hugely popular Guitar Hero. It wasn’t until we started discussing ideas for the core mechanics in Spellcasters that I really started to think about how to implement beat detection. This devlog is an insight into my thought process for implementing a system; it’s by no means a definitive guide. I am going to keep this fairly high level rather than getting bogged down in the nitty gritty.

So let’s get to it. First I considered how I could go about detecting a beat and I had a few initial ideas:

Fast Fourier Transform

I have an engineering background, so one of my first thoughts was to transform the music into the frequency domain and trigger the beat from a selected frequency range. The image below attempts to visualise what the transformation does, but if you’re not familiar it’s worth having a quick read about the FFT and the frequency domain.

Now thankfully Unity has a function (AudioSource.GetSpectrumData) that lets me do this, so I don’t have to create it completely from scratch myself. I had previously used it for a music visualiser, moving objects in the scene based on the amplitudes of the frequency bands, kinda like those jumping bars you saw on HiFi systems. Well, now you know what they are.
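To give a feel for the API, here is a minimal sketch (not our actual visualiser code) of reading the spectrum each frame and watching the energy in the lowest bins. The array size, the bins summed, and the threshold are placeholder values you would tune per track.

```csharp
using UnityEngine;

[RequireComponent(typeof(AudioSource))]
public class SpectrumWatcher : MonoBehaviour
{
    AudioSource source;
    // Array length must be a power of two between 64 and 8192.
    readonly float[] spectrum = new float[256];

    void Awake()
    {
        source = GetComponent<AudioSource>();
    }

    void Update()
    {
        // Fills the array with amplitude per frequency bin, lowest bins first.
        source.GetSpectrumData(spectrum, 0, FFTWindow.BlackmanHarris);

        // Sum a few of the lowest bins as a crude "bass energy" measure.
        float bassEnergy = spectrum[0] + spectrum[1] + spectrum[2];
        if (bassEnergy > 0.1f) // placeholder threshold
        {
            // Might be a beat; in practice this needs smoothing and
            // peak picking, which is exactly the hard part discussed below.
        }
    }
}
```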

The problem with this approach is knowing which frequency bands to check and how they vary from track to track, or even within a track. It’s one thing to react to the bands visually; it’s another to determine whether a button press was on the beat. This approach relies on building a generic solution that analyses the song and determines the correct button timings. That might be possible, but I dismissed it for two reasons: it would require a lot of work to build a reliable system, and more importantly, given the scope of the project, we didn’t need anything that sophisticated.

Beats Per Minute (BPM)

My next approach was to map the beats with delta time and check whether the player hit the button at the correct time. This was decoupled from the music playing, so as long as the timer started at the same time as the music, all was good. The catch was that I needed to know the BPM of the song. I have drawn a diagram below showing how this might work.

I actually built a rough prototype of this to see if it was feasible. At first glance it appeared to work, but actually playing through the game you could get de-syncs with the music. Small timing errors accumulated between the game timer and the audio playback, and the result was the beat feeling off.

So I did some digging and learned about how audio is played in Unity. AudioSource.timeSamples could be used to sync with the music rather than relying on a timer. This was more accurate and wasn’t prone to de-syncing, keeping my detection correctly coupled to the music. Using this in combination with AudioClip.samples, I could work out my beats in terms of samples with the information available.
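The sample-based idea can be sketched roughly like this, assuming a known BPM for the track. The class and member names are mine for illustration; the key point is that AudioSource.timeSamples is the playhead position in samples, so beat positions expressed in samples can never drift from the audio.

```csharp
using UnityEngine;

[RequireComponent(typeof(AudioSource))]
public class SampleBeatClock : MonoBehaviour
{
    public float bpm = 120f; // set per track

    AudioSource source;
    int samplesPerBeat;

    void Start()
    {
        source = GetComponent<AudioSource>();
        // clip.frequency is the sample rate (e.g. 44100 Hz);
        // 60 / bpm is the length of one beat in seconds.
        samplesPerBeat = Mathf.RoundToInt(source.clip.frequency * 60f / bpm);
        source.Play();
    }

    // Which beat we are on right now.
    public int CurrentBeat => source.timeSamples / samplesPerBeat;

    // How far through the current beat we are, from 0 to 1.
    public float BeatPhase =>
        (source.timeSamples % samplesPerBeat) / (float)samplesPerBeat;
}
```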

I should point out that to determine whether a button was pressed on the beat, I would check if the sample at which the button was pressed was close enough to the actual beat (between the OnBeat and OffBeat in the diagram above). This is how I was going to apply the detection for all methods. We could also return an accuracy value ranging from -1 (too early) to 1 (too late) if required.
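That window check can be sketched as pure sample arithmetic. The names and the window size are my assumptions, not the game’s code: a press counts as on-beat if it lands within `windowSamples` of the nearest beat, and the signed offset is normalised to the -1 (early) to 1 (late) range mentioned above.

```csharp
using UnityEngine;

public static class BeatJudge
{
    // pressSample: AudioSource.timeSamples captured when the button went down.
    // Returns true if the press was within the window; accuracy is the
    // signed, clamped offset from the beat (-1 = earliest, 1 = latest).
    public static bool TryJudge(int pressSample, int samplesPerBeat,
                                int windowSamples, out float accuracy)
    {
        // Distance past the most recent beat, in samples.
        int offset = pressSample % samplesPerBeat;

        // If we are closer to the *next* beat, treat the press as early
        // relative to that beat, giving a negative offset.
        if (offset > samplesPerBeat / 2)
            offset -= samplesPerBeat;

        accuracy = Mathf.Clamp((float)offset / windowSamples, -1f, 1f);
        return Mathf.Abs(offset) <= windowSamples;
    }
}
```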

At this point the initial prototype was enough to get us into the Dare 2019 competition; it demonstrated our core idea. We were also able to fire events from the OnBeat and OffBeat to make the world come alive by dancing to the beat. It rocked!!

You can see a super early prototype video here. This was way before our pitch but I think it’s nice to see where it all started.

Leveraging what's out there - Koreographer

Making great systems takes time, and sometimes it’s better to let someone else do the hard work. I understood the concept, but we wanted to be able to do more with it. I knew about some really cool assets on the Unity Asset Store that could do what we needed, and one such asset was Koreographer. Getting into Dare gave us some budget to offset the cost, so we went for it.

What I like about this is that we can map information to our songs, and all we need to do is listen for the events in code wherever we need them. I won’t go on about all the features, as I’m not trying to push the asset, but it’s worth checking out. It gave us a tool for our BPM detection and let us encode any data we wanted to bring our world alive. Making a system like this myself would have taken so long that the game would never have happened.

Spellcasters Update

All of this launched in the first iteration of our game, which we took to the Dare play party. We are working really hard to iterate on the game and look forward to sharing our new content really soon, but here is our current trailer to whet your appetite.

We will be taking our next iteration to EGX. We look forward to seeing you all there.

Good luck implementing beat detection, and thanks for reading. If you have any questions, comment below.

You can follow Spellcasters as it develops over on our Twitter and Reddit pages.