Update: This video might explain it better

An idea popped out of Radio 1 Interactive a while ago: a device that measures 'rock' - how much the band and the crowd are rocking at a gig - called The Rockterscale. It would display the amount of rock at the venue and on the web in real time, maybe even showing it at other gigs and encouraging bands and crowds to out-rock each other. But until now, no one had really tried building it. We were due another hardware hacking session, so we decided to build the Rockterscale. Two intensive days later we had these...

First, we have the Hat of Rock which measures the amount of head thrashing. Suitable for both fans and the band.

Next, the dance floor measures movement, and a force sensor hooked up to an improvised crash barrier at the front measures the crowd pushing up to it.

A webcam mounted on the ceiling measures the overall crowd movement, and then there's the music itself. Audio processing code measures the loudness of the song and the spread of the frequencies in it - a high value would be a "wall of sound"-like effect. It also does some beat detection. So we have six measurements and an equation...

Hat + Floor + Crush + Crowd + Loud + Phat = ROCK

Each sensor generates a measure of 'rock' between 0 and 9 (what unit of measurement would that be?) and then sends the data to the displays. A separate team were working on the output side and they built a big screen display and a physical scale.
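The post doesn't spell out how the six 0-9 readings are combined into the single ROCK value, so here's a minimal sketch assuming one simple approach: rescale the average of the six readings onto the 0-11 combined scale. The function name and the averaging method are my guesses, not the original implementation.

```python
# Hypothetical sketch of combining the six sensor readings.
# Each reading is 0-9; the combined Rockterscale tops out at 11.
# The original combination method isn't described, so this simply
# rescales the average of the six readings onto 0-11.

def rock_score(hat, floor, crush, crowd, loud, phat):
    """Map six 0-9 readings onto the 0-11 combined Rockterscale."""
    readings = [hat, floor, crush, crowd, loud, phat]
    average = sum(readings) / len(readings)  # still on the 0-9 scale
    return round(average * 11 / 9, 1)        # rescale to 0-11
```

With everything maxed out the score hits 11 and, presumably, the fireworks go off.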

The big screen shows six scales representing the six sensors, with a combined scale at the top - if that reaches 11 then fireworks go off, or something like that. The Rockterscale logo pulses in time with the beat.

And the physical Rockterscale is built from a guitar-shaped pointer which also uses the combined reading. When it hits 11?

The LED turns on.

\m/

That was it. But what happens next? We had to dismantle it on the day because we were using up a meeting room, but it does have real potential for deployment at a gig. Certainly the video and audio processing could be used, though I'm not sure how the video code would cope with the lighting conditions. The rest of the sensors would probably need a bit more work to make them robust, reliable and a bit more standalone, and I'm not sure many singers would agree to wear the accelerometer-fitted hat with a USB cable going down their back.

Some technical details for those who are interested...

First, the sensors. The hat has a 3-axis accelerometer mounted in the top which gives an orientation reading in each of the three axes; we differentiate these readings to give a movement value. The cardboard floor mat also has an accelerometer attached and works in the same way. A force sensor is mounted between an improvised barrier and a table, and needs a reasonable amount of pushing to register. A webcam mounted above the audience measures the amount of movement in the whole image using Processing: the image is divided into an 8x6 grid, and the difference between the pixels in each grid square from one frame to the next gives 48 movement values every second (represented as the blobs in the photo above). The audio processing is based on the aubio C library and measures loudness and spectral spread, as well as doing beat detection.
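The webcam measure is the easiest part to sketch. The original ran in Processing; this is a plain-Python rework of the same idea, assuming grayscale frames stored as 2-D lists of pixel values (the frame size here is made up for illustration):

```python
# Sketch of the webcam movement measure: divide each frame into an
# 8x6 grid and sum the per-pixel differences between consecutive
# frames within each cell, giving 48 movement values per frame.

GRID_COLS, GRID_ROWS = 8, 6

def grid_motion(prev, curr):
    """Return 48 per-cell movement values for two same-sized grayscale frames."""
    height, width = len(curr), len(curr[0])
    cell_w, cell_h = width // GRID_COLS, height // GRID_ROWS
    values = []
    for gy in range(GRID_ROWS):
        for gx in range(GRID_COLS):
            diff = 0
            for y in range(gy * cell_h, (gy + 1) * cell_h):
                for x in range(gx * cell_w, (gx + 1) * cell_w):
                    diff += abs(curr[y][x] - prev[y][x])
            values.append(diff)
    return values
```

A big value in one cell means a lot of movement in that patch of the crowd - those are the blobs in the photo.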

All the sensors were powered through Arduino boards hooked up to MacBooks - we didn't have time to make them standalone. Each sensor produced a stream of readings (0-9 on the Rockterscale) up to 10 times per second. These were sent asynchronously over the local network using Open Sound Control (OSC) to a single Processing application doing the presentation work. The screen display and the guitar pointer (driven by a stepper motor via an Arduino) were both built with Processing.
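An OSC message carrying one reading is simple enough to build by hand. This sketch encodes a single-integer message per the OSC 1.0 spec; the address "/rock/hat" and the port are made up for illustration, since the post doesn't give the real address scheme:

```python
import struct

# Sketch of the OSC wiring: each sensor sends its 0-9 reading as a
# one-integer OSC message over UDP to the Processing display app.

def osc_pad(data: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, per the OSC spec."""
    return data + b"\x00" * (4 - len(data) % 4)

def osc_message(address: str, value: int) -> bytes:
    """Build a minimal OSC packet carrying one 32-bit big-endian integer."""
    return (osc_pad(address.encode()) +  # padded address pattern
            osc_pad(b",i") +             # type tag: one int32 argument
            struct.pack(">i", value))    # the reading itself

packet = osc_message("/rock/hat", 7)
# Sending is then just socket.sendto(packet, (host, port)) over UDP.
```

Real projects would normally reach for an OSC library, but the format is small enough that hand-rolling it is a nice way to see what's on the wire.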

Why we do this

As always, we hold these workshops to get people thinking differently, to provide inspiration, as team-building, to get people away from their day job for a bit and to build something which might even be useful. Previous hacks have included the DABagotchi and Dog Vader.