Max for Live Jitter Device

In my last post, I wrote a tutorial on creating a fairly basic Max for Live Jitter device and provided a free download (it can be read here).

In this post, I am going to talk through the Max for Live device I developed, with help from Robin Price, for my final-year project in college. The link to download it is at the bottom of the page.

The device is similar to the one from the last blog post: the jit.catch~ object is used for audio analysis, and both the jit.gl.gridshape and jit.gl.mesh objects are used.

The gridshape is output as a matrix by giving the object the @matrixoutput attribute. The gridshape sends out X, Y and Z coordinates, and in the case of this patch the audio matrices are added to the Z plane for animation. This is achieved with a mathematical operation using the jit.op object: "jit.op @op pass pass +" means the signal from jit.catch~ passes through the X plane, passes through the Y plane, and is added to the Z plane.
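As a rough sketch of that signal path (object arguments are indicative, not copied from the patch):

```
jit.gl.gridshape @matrixoutput 1      <- outputs X, Y and Z coordinate planes
        |
jit.op @op pass pass +                <- right inlet: audio matrix from jit.catch~
        |                                (X passed, Y passed, audio added to Z)
jit.gl.mesh                           <- draws the displaced geometry
```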

Creating a Texture

An initial texture is taken from jit.catch~: the matrices from the object are sent to jit.op, where a mathematical operation is performed. In this case the amplitude of the captured signal is increased, which makes the quieter sounds more visible.

From here the signal is sent to the matrix and then to jit.gl.texture, where a texture is created.

The texture is named texture1. By adding the @texture attribute to the jit.gl.mesh object, the newly created texture can be applied directly to the shape; the attribute used here is @texture texture1.
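Sketched end to end, the texture chain looks something like this (the name texture1 and the boost operation are indicative):

```
jit.catch~ 1
    |
jit.op @op *                  <- boost the amplitude of the captured signal
    |
jit.matrix
    |
jit.gl.texture @name texture1

jit.gl.mesh @texture texture1 <- the named texture is applied to the shape
```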

A further texture is added, obtained from the examples folder in Max 7. This JavaScript file creates the main texture, which will be seen on screen behind the electronic music performer.

The jit.gl.shader object is used, referencing the file by name. In Jitter there are three different types of texture mapping.

Object Linear

Applies the texture in a fixed manner relative to the object's coordinate system. As the object is rotated and positioned, the texture stays the same.

Eye Linear

As the object rotates, the texture changes.

Sphere map

Environment mapping: the surface is rendered as though it is reflecting the surrounding environment, so the texture changes as the model moves.

Setting the @tex_map attribute to 1 on the jit.gl.mesh object sets the texture mapping to object linear. The @poly_mode attribute is set to 0 1, which means the front of the rendered shape will be drawn solid while the back will be drawn as wireframe.
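Both settings can be typed directly into the object box, something like:

```
jit.gl.mesh @tex_map 1 @poly_mode 0 1
```

Here @tex_map 1 selects object-linear mapping, and @poly_mode takes two values, one for front faces and one for back faces (0 = filled, 1 = wireframe).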

To allow switching to full screen, the key and sel objects are used. Using ASCII, where each key on the keyboard is assigned a number, any key can be used to trigger a message in Max/MSP. In this project the escape key, ASCII number 27, is set to switch between full screen: once the key is pressed it flips the toggle on, which activates the fullscreen message being sent to the jit.window object.
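A minimal sketch of that key-handling chain (message wording is indicative):

```
key                 <- reports the ASCII number of any key pressed
 |
sel 27              <- bangs only when escape (ASCII 27) is pressed
 |
toggle              <- flips between 0 and 1 on each press
 |
[fullscreen $1]     <- message box substitutes the toggle state
 |
jit.window
```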

With the shape stationary on the screen, it was decided to animate the shape and allow the viewpoint to be changed. To change the viewpoint, the jit.gl.camera object is used.

The ability to change the camera position, lens angle (zoom) and camera rotation adds variation to the video output.

To animate the OpenGL shape, the jit.anim.drive object is used. With this object, OpenGL shapes can be rotated, moved to a specified location and scaled to a specified size.

Using the turn 1 1 1 message enables the audio-analysis output to rotate through 360 degrees, with each number representing the X, Y and Z axis respectively.

A message and a dial to adjust the speed of rotation are added, and this will be the first of the project's live UI objects.
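A rough sketch of the rotation control, assuming the dial value is substituted into the turn message (the exact scaling in the patch may differ):

```
dial
 |
[turn $1 $1 $1]     <- dial value sets degrees of rotation per frame on X, Y, Z
 |
jit.anim.drive      <- bound to the jit.gl.mesh it animates
```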

jit.gl.camera and jit.anim.drive (figure 1)

The next stage of the patch is to implement MIDI-mappable parameters, meaning the user can map live UI objects to a hardware MIDI controller and change how the video is displayed.

A live UI object is one which is recognised by Ableton Live and available to map to a MIDI controller.

In Max for Live, any attribute which takes an integer or float can be controlled using an Ableton Live-specific dial, fader, toggle or button. Creating a message with the attribute name followed by $1 allows the value to be changed by a live object. This can be seen in figure 1 above.
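For example, a live.dial could be wired to an attribute like this (the scale attribute here is just an illustration, not necessarily one used in the patch):

```
live.dial           <- MIDI-mappable in Ableton Live
 |
[scale $1]          <- attribute name followed by $1; dial value is substituted
 |
jit.gl.mesh
```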

Preset Object

Due to the customisable parameters in the patch, the next step was to implement a preset system, so that if the user finds an interesting output he/she can save it as a preset to be recalled during a live performance.

Three objects are required to store and recall presets: pattrstorage, autopattr and preset. Double-clicking the pattrstorage object opens the client window, which lists all live UI objects and their current values. To avoid confusion, each live UI object is given a unique scripting name, which is viewable in the client window. To give an object a scripting name, select the live UI object in question, press Command-I, and change the scripting name to a unique name. See figure 2.

Preset (figure 2)

The preset graphical user interface is used to save and recall presets. Holding shift and clicking on an empty slot stores all current values; once a preset is stored, the empty slot changes to yellow (this colour can be changed in the inspector menu). The pattrstorage object takes a snapshot of all values and stores it in the selected slot.
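The three objects sit together in the patch roughly like this (the storage name mypresets is a hypothetical example):

```
autopattr                 <- exposes UI objects with scripting names to the pattr system
pattrstorage mypresets    <- stores/recalls snapshots; double-click to open the client window
preset                    <- GUI slots: shift-click to store, click to recall
```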

The Max for Live patch can be downloaded here. If you have any questions, feel free to leave a comment or follow. Thank you for reading.