For the Grimes version at Moogfest, the space is divided into four "zones," with each one corresponding to a particular piece of the song. As you move through the exhibit, you control a variety of options for the track's instrumentation by pressing into a fine mesh screen. The only two parts of "Realiti" that remain constant are Grimes' vocal track and the driving bass beat. All of the other sounds are literally in the hands of the 20 users moving through the space at any given time.

"Each time we approach this with a new artist, it's a completely new challenge," Listen's founding partner Steve Milton says. "In this case, Grimes' vocal is really commanding in this song, so we wanted that to envelop the space. We also felt there's a danciness to the track, so we wanted to maintain that. The rest is up to the audience to fill in."

In one quadrant, hand movements control the volume of a particular piece of the song, while in another they add effects like delay or reverb. "As people interact with the mesh, they're bringing to life different parts of the music," Milton continues. "They're not only affecting volume and pitch, they're affecting a number of different parameters: different types of EQ or adding delays, reverbs and all sorts of effects to those tracks."
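To picture the "one gesture, many parameters" idea, here's a minimal sketch in Python. The parameter names and response curves are assumptions for illustration only; the installation's actual mappings live in MaxMSP and Ableton Live and aren't public in this detail.

```python
def map_gesture(position: float) -> dict:
    """Fan a single normalized hand position (0.0-1.0) out to
    several mix controls at once. Curves are hypothetical."""
    p = max(0.0, min(position, 1.0))  # clamp to [0, 1]
    return {
        "volume": p,             # linear: deeper press, louder stem
        "delay_send": p ** 2,    # eased in: delay only kicks in on deep presses
        "reverb_send": 1.0 - p,  # inverted: reverb fades as you press harder
    }
```

So a light touch leaves the stem quiet and washed in reverb, while a deep press brings it forward dry with a touch of delay.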

So how does this whole thing work? Each of the installation's four sections has its own Kinect camera, which maps the surface of the mesh netting and tracks hand movements into and out of the space. That motion is then translated into control of the stems, or individual parts of the song, using a combination of MaxMSP and Ableton Live. Think of it as a mini-nightclub that lets you control the music as you move to the beat.
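The core of that pipeline is turning a depth reading into a control value. Here's a rough Python sketch of one plausible step: converting how far a hand pushes the mesh toward the sensor into a 0-127 MIDI CC value that a DAW like Ableton Live could map to a volume fader or effect send. The depth thresholds are made up for illustration, not measurements from the installation.

```python
REST_DEPTH_MM = 2000  # mesh at rest, ~2 m from the sensor (assumed)
MAX_PRESS_MM = 400    # deepest press we respond to (assumed)

def press_to_cc(depth_mm: int) -> int:
    """Convert a raw depth reading to a MIDI CC value (0-127).

    A hand pushing the mesh toward the sensor lowers the depth
    reading; deeper presses produce larger CC values.
    """
    press = REST_DEPTH_MM - depth_mm          # how far the mesh moved
    press = max(0, min(press, MAX_PRESS_MM))  # clamp to the valid range
    return round(press / MAX_PRESS_MM * 127)
```

Clamping matters here: depth sensors are noisy, and a stray reading past the rest position or beyond the deepest press should pin to the ends of the range rather than send the mix parameter flying.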

The remix created while you're inside the installation is one of a kind, and that's the whole idea. Sure, the exhibit shows off what Microsoft's Kinect is capable of, but anyone experiencing the setup in person is also creating a mix that will likely never be repeated.

"Every time anybody goes through the space, their movements and interactions change the song just a little bit," Serokas said. "I'd guess that it would probably be really hard to have you go through and have the same thing happen."

While the first version of the project was being produced with Matthew Dear, Microsoft was careful to document the process so that other creatives could tap into the resources afterward. You can grab the code that's used to drive the Kinect sensors on GitHub and play with it at home. "If you want to build a 20-by-40-foot structure and put the Kinects up, you can re-create this or try some things for yourself on a smaller scale," Serokas explained.

The "Inside the Music" project will continue to evolve; both Microsoft and Listen say they learn new things each time they team up with another artist. And thankfully they're sharing that knowledge with all the hackers and tinkerers who helped make the Kinect famous in the first place. Microsoft is taking notice of the ways musicians are using its tech as well, so this won't be the last time we'll see the company collaborate with an artist on an interactive installation.