Almost two years ago I was approached by artist and producer Adriano Clemente about creating a Max for Live device that would let him generate sound-responsive visuals during his live sets. We had worked together on a number of interactive projects for other artists and had talked about wanting to collaborate on something of our own. A Max for Live device seemed natural: Adriano teaches Ableton Live at Dubspot in NYC, and I had been teaching and using Max for years to develop interactive sound and video pieces for myself and others.

As a DJ, Adriano knows what tools are helpful to have during a live set. For him, the ability to automate parameters was key, so one of the first things I set out to do in my programming was develop an easy way to assign LFOs and other control data to the knobs and sliders in the device. We devised the drop-down menu located above each of the UI elements. The menu system lets you assign control of a visual effect to one of three LFOs, or to the high, mid, or low frequencies in the audio track. The LFOs can be switched between several BPM-synchronized note values and free-running rates specified in Hz.
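To make the rate logic concrete, here is a minimal sketch of how a control LFO like this could work, with the rate derived either from a note value at the session tempo or set freely in Hz. The function names and parameters are illustrative assumptions, not the device's actual internals (which live in a Max patch):

```python
import math

def lfo_rate_hz(bpm, note_value=None, free_hz=None):
    """Return the LFO frequency in Hz.

    note_value is a fraction of a whole note (e.g. 0.25 for a
    quarter note), so one LFO cycle spans that rhythmic duration
    at the given BPM. If free_hz is given, the LFO free-runs.
    """
    if free_hz is not None:
        return free_hz                      # free-running mode
    beats_per_cycle = note_value * 4        # whole note = 4 beats in 4/4
    seconds_per_cycle = beats_per_cycle * 60.0 / bpm
    return 1.0 / seconds_per_cycle

def lfo_value(t, rate_hz):
    """Sample a sine LFO at time t (seconds), normalized to 0..1
    so it can drive a knob or slider directly."""
    return 0.5 + 0.5 * math.sin(2 * math.pi * rate_hz * t)
```

At 120 BPM a quarter-note LFO completes one cycle every half second (2 Hz), which is the kind of tempo-locked motion the sync mode gives you.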

Once we had this skeleton of the device figured out, we moved on to deciding which visual effects to include. We knew we wanted the ability to crossfade between two videos, because it was something we hadn't yet seen in a Max for Live device. I also wanted to expose newcomers to the material that sits almost hidden in Max's examples folders, so I added in some of the compositing ideas lurking in the program's subfolders. We wanted the device to be very functional, so we made sure that all of the fundamental effects you would expect in a VJ tool were there: hue, saturation, and color balancing, as well as more boutique things like color inversion and kaleidoscopic effects.
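For readers new to compositing, two of the operations mentioned above are simple enough to sketch in a few lines: a linear crossfade between frames and a hue rotation. This is a plain-Python illustration over pixel tuples, purely to show the math; the device itself does this on the GPU via Jitter, not with code like this:

```python
import colorsys

def crossfade(frame_a, frame_b, mix):
    """Blend two frames (lists of (r, g, b) floats in 0..1).

    mix=0.0 shows frame_a only, mix=1.0 shows frame_b only.
    """
    return [tuple((1 - mix) * a + mix * b for a, b in zip(pa, pb))
            for pa, pb in zip(frame_a, frame_b)]

def rotate_hue(frame, offset):
    """Shift every pixel's hue by offset (fraction of the color wheel)."""
    out = []
    for r, g, b in frame:
        h, s, v = colorsys.rgb_to_hsv(r, g, b)
        out.append(colorsys.hsv_to_rgb((h + offset) % 1.0, s, v))
    return out
```

In the device, the `mix` amount is exactly the kind of parameter you would hand to an LFO or to the audio's frequency bands via the drop-down menus.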