AV16-4 – A Pure Data audio-visual experiment – Using The AV16-4


The videos above show the finished, or latest, incarnation of the AV16-4 audio-visual performance instrument. In creating a performance-ready instrument, control assignments were changed based on experience gained during testing, producing something more fluid to use in a performance situation.

Pure Data is especially suitable for prototyping applications in this way because of the ease with which relatively global changes can be made to suit the needs of the performer. Applications such as Resolume Avenue allow complex routing of control devices to almost all parameters, for both audio and video, and parameters can be linked to create complex, connected movements. Pure Data, though, offers wider scope, allowing data from one element of an instrument to be fed directly to any other part of the patch, or indeed to any other patch.

Here is a discussion of how the prototype interface was adapted from the previous experiments to create the version above:


An issue with the sequencer during experimentation was the inability to make direct changes to the sequence during performance. The chosen controls, the joystick and the Leap Motion controller, did not lend themselves to changing sequence elements mid-performance.

The sequencer element was enhanced through the creation of a selection of pre-composed patterns, constructed in Pure Data as a list. Upon triggering by a joystick button, these sequences were unpacked and sent to the sample selector vsliders to be selected by the counter. This enabled a greater variety of sonic and rhythmic textures to be created during performance. This process was replicated with the video sequencer to allow a range of video textures to be triggered via the sequencer.
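A minimal sketch of this pattern-bank idea, written here in Python for clarity (the pattern contents, step count, and class names are illustrative, not taken from the patch): each pre-composed pattern is a list of sample indices, a joystick button "unpacks" one into the per-step slider values, and the running counter reads one step per metro tick.

```python
PATTERNS = [
    [0, 2, 0, 3, 0, 2, 1, 3],  # pattern 0: illustrative groove
    [1, 1, 2, 1, 1, 1, 3, 2],  # pattern 1: busier texture
]

class StepSequencer:
    def __init__(self, steps=8):
        self.sliders = [0] * steps  # stands in for the sample-selector vsliders
        self.step = 0               # stands in for the Pd counter

    def load_pattern(self, index):
        """Triggered by a joystick button: unpack a pattern list into the sliders."""
        self.sliders = list(PATTERNS[index])

    def tick(self):
        """One metro bang: return the sample index at the current step, then advance."""
        value = self.sliders[self.step]
        self.step = (self.step + 1) % len(self.sliders)
        return value

seq = StepSequencer()
seq.load_pattern(0)
first_bar = [seq.tick() for _ in range(8)]
print(first_bar)  # the unpacked pattern, read back step by step
```

The same structure applies to the video sequencer: a second set of slider values selects video clips from the same counter.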

A ‘glitch’ style control was added to the sequencer, implemented using the Z-rotate axis of the joystick. The axis data was scaled with the maxlib/scale object and assigned to the metro object, sweeping its period from 180 to 1 milliseconds and creating a rapid, cyclical effect that adds to the available timbres. Combined with the Freeverb and delay effects, this adds an interesting element to the instrument.
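The scaling itself can be sketched as a simple linear mapping, equivalent to what maxlib/scale does in the patch (the 0–255 input range for the joystick axis is an assumption here):

```python
def scale(value, in_lo, in_hi, out_lo, out_hi):
    """Linear range mapping, analogous to the maxlib/scale object in Pd."""
    return out_lo + (value - in_lo) * (out_hi - out_lo) / (in_hi - in_lo)

def glitch_period(z_rotate):
    """Map a Z-rotate reading (assumed 0-255) onto a metro period of 180-1 ms."""
    return scale(z_rotate, 0, 255, 180.0, 1.0)

print(glitch_period(0))    # axis at rest: slow metro, 180.0 ms
print(glitch_period(255))  # axis fully rotated: 1.0 ms, rapid glitch
```

Because the output range is inverted (180 down to 1), rotating the stick further makes the metro fire faster.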

From the original experiments, the tempo changes assigned to the controls on the base of the joystick unit were dropped in favour of sequence selection. These tempo changes remain accessible from the TouchOSC interface should they be required. Below is a screenshot of the sequencer:

[Screenshot: avj16-4 mk1 sequencer]


In the initial experiments, the assignments for video manipulation were focussed on the rotateXYZ and translateXYZ controls. Manipulating only these movement-based parameters created a sense of interactivity, but was not effective as a VJ-style tool.

To create effects more useful from a VJ perspective, the colour gain parameters were manipulated in two places: red on the X-axis, and green and blue on the Y-axis. Sweeping joystick movements therefore produce extreme saturation, and the demonstration video shows this being used rhythmically to transform the video content.

Movement manipulation was also implemented on the joystick's X-axis, providing an effect while transitioning across colour changes. The rotateXYZ function was mapped across a full 360-degree range, spinning the image in use. Again, this is shown in the demonstration video.
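The two video mappings can be sketched together (Python here stands in for the Pd/GEM message routing; the normalised 0–1 axis range is an assumption):

```python
def colour_gains(x, y):
    """Joystick X drives the red gain; Y drives green and blue together.
    Inputs assumed normalised to 0-1."""
    red = x
    green = blue = y
    return red, green, blue

def rotate_angle(x):
    """Map the X-axis onto a full 360-degree rotation for rotateXYZ."""
    return x * 360.0

print(colour_gains(1.0, 0.0))  # stick hard right: fully saturated red
print(rotate_angle(0.5))       # stick centred: image rotated 180 degrees
```

Tying saturation and rotation to the same physical gesture is what makes a single joystick sweep read as one rhythmic transformation of the image.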

[Screenshot: avj16-4 mk1 video controls]


Reverb (using Freeverb) and delay effects were implemented in the sample player and mapped to the Leap Motion controller, using GecoMIDI's ability to ‘see’ both open- and closed-hand movements. The closed-hand movement was mapped to the Freeverb Pure Data external: X-axis and Y-axis movements controlled the dry and wet signals. The data received for the dry signal was reversed, so that the dry level decays as the wet signal increases, creating a balanced sound as the hand moves away from the Leap Motion. Room size was controlled by moving the hand towards and away from the controller on the Z-axis. This allowed all three parameters of the Freeverb external to be controlled.
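The inverted dry/wet relationship can be sketched as follows (the exact axis-to-parameter assignment and the normalised 0–1 ranges are assumptions for illustration):

```python
def freeverb_controls(x, y, z):
    """Sketch of the closed-hand Freeverb mapping.
    Inputs assumed normalised to 0-1; axis assignment is illustrative."""
    dry = 1.0 - x    # reversed: dry level decays as the hand moves
    wet = y          # wet level rises with the other axis
    room_size = z    # hand distance from the controller on the Z-axis
    return dry, wet, room_size

print(freeverb_controls(0.25, 0.25, 0.5))
```

Reversing one axis is what keeps the overall level roughly balanced: as the wet signal grows, the dry signal makes room for it rather than stacking on top.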

Delay functionality was implemented using techniques inspired by the series of tutorials created by Dr Rafael Hernandez, in which the delwrite~ object is used to feed back a copy of the original signal. In the av16-drum.pd patch the delay lines are linked to the tempo dictated by the sequencer patch, creating a tempo-synced multi-delay effect.
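Deriving the delay times from the sequencer tempo can be sketched as below (the specific beat divisions are illustrative; the patch's actual subdivisions are not stated):

```python
def delay_times_ms(bpm, divisions=(1.0, 0.75, 0.5)):
    """Convert the sequencer tempo to tempo-synced delay times in ms.
    One beat at `bpm` lasts 60000/bpm ms; each delay line takes a
    fraction of that beat."""
    beat_ms = 60000.0 / bpm
    return [beat_ms * d for d in divisions]

print(delay_times_ms(120))  # at 120 BPM: [500.0, 375.0, 250.0]
```

Because the times are computed from the sequencer's tempo value rather than fixed, the delays stay locked to the groove when the tempo changes.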

The Leap Motion's open-hand movements were assigned to the delays: X-axis movement to one delay line, Y-axis movement to the second. This enabled the delay signals to be operated individually or together.

[Screenshot: avj16-4 mk1 drum patch]


