junXion – OSX Data Routing Application

junXion v4 Screengrab

junXion calls itself an ‘input device to MIDI translator’, and is an application available for the OSX operating system that processes input information from a wide variety of devices and converts this information into MIDI or OSC. Like GlovePIE, junXion can take data from devices not normally used for music or AV performance, such as touchscreens, joysticks and mice, and allow this data to be converted and used within any MIDI or OSC capable application.


Developed by Steim, junXion offers Mac / OSX users the ability to utilise data from non-music HID (Human Interface Device) hardware within a MIDI or OSC application.

Alongside more common HIDs such as joysticks, mice and trackpads, junXion is also able to work with video information, responding to contrast, movement or colour. junXion can track objects in space through colour recognition, converting this information into up to six ‘event’ parameters.
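Colour recognition of this kind can be sketched as keeping only those pixels whose RGB value falls close to a chosen target colour. The function below is a minimal illustration of that idea; the squared-Euclidean distance metric, the function name and the tolerance parameter are assumptions for the sketch, not junXion's documented internals.

```python
def colour_match(frame, target, tolerance):
    """Mark pixels whose RGB colour is within `tolerance` of `target`.

    `frame` is a list of rows of (r, g, b) tuples. Returns a binary mask
    (True = matching pixel). Squared Euclidean distance in RGB space is an
    illustrative choice; junXion's actual matching metric may differ.
    """
    tr, tg, tb = target
    return [[(r - tr) ** 2 + (g - tg) ** 2 + (b - tb) ** 2 <= tolerance ** 2
             for (r, g, b) in row]
            for row in frame]
```

Given such a mask, a tracker only needs to know which pixels matched; the original colour values play no further role.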

As discussed within the junXion v4 manual, video tracking is dependent on light changes, which in a daytime, uncontrolled environment are subject to natural fluctuations, rendering it somewhat difficult to utilise. However, within a controlled performance environment, where the lighting can be precisely controlled, this limitation is largely avoided.

Video tracking within junXion allows the user to track video objects and generate data for six events – Detected, Coverage, X-pos center, Y-pos center, Width and Height.
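The six events can be pictured as simple statistics over the ‘on’ pixels of a tracked object. The sketch below computes plausible values for each event name from a binary mask; the arithmetic (centroid for position, bounding box for size) is an illustrative guess, not junXion's actual implementation.

```python
def track_events(mask):
    """Compute six junXion-style video events from a binary pixel mask.

    `mask` is a list of rows; truthy entries are 'on' pixels belonging to
    the tracked object. Event names follow the junXion v4 manual; the
    maths here is an illustrative sketch only.
    """
    coords = [(x, y) for y, row in enumerate(mask)
                     for x, px in enumerate(row) if px]
    total = sum(len(row) for row in mask)
    if not coords:
        return {"Detected": 0, "Coverage": 0.0, "X-pos center": 0.0,
                "Y-pos center": 0.0, "Width": 0, "Height": 0}
    xs = [x for x, _ in coords]
    ys = [y for _, y in coords]
    return {
        "Detected": 1,                        # an object is present
        "Coverage": len(coords) / total,      # fraction of frame covered
        "X-pos center": sum(xs) / len(xs),    # centroid column
        "Y-pos center": sum(ys) / len(ys),    # centroid row
        "Width": max(xs) - min(xs) + 1,       # bounding-box width
        "Height": max(ys) - min(ys) + 1,      # bounding-box height
    }
```

Each of these values could then be scaled into a MIDI controller range or an OSC message, which is the role junXion itself performs.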

As with many video applications, latency is often a consideration, and this is still the case with junXion. Although not significant in typical use, it must be taken into account where precision timing is required. junXion samples video at between 24 and 30 frames per second.

The basic premise of the Video Input tracking is to interpret a video image and to convert it into Video Input event (= Sensor) information according to specifications set up by the user. Interpretation is done on a frame-by-frame basis, but the program retains knowledge of previous frames, so the system is able to interpret movement.

The user has to specify which objects in the incoming video image are to be looked at and which parts of the image are to be ignored. This is done on the basis of the brightness and/or colour attributes of elements in the image. As can be seen in the right part of the Video Object Edit window, there are four types of video filters. The central idea behind these image filters is to single out those objects in a scene which are important, i.e. to filter out the parts of an image that you don’t want junXion to interpret.

In the end, when junXion attempts to track objects, the only thing that counts is which pixels in the image are ‘on’ and which are ‘off’: ‘off’ pixels are absolutely black, while ‘on’ pixels hold any value other than absolutely black. Colour or brightness information is no longer relevant at that stage.
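The filter stage described above can be sketched as a brightness window that zeroes out unwanted pixels, leaving absolutely black (‘off’) everywhere except the pixels worth tracking. The function name and threshold parameters below are assumptions for illustration; only the on/off principle comes from the manual.

```python
def brightness_filter(frame, low, high):
    """Keep only pixels whose brightness falls inside [low, high].

    `frame` is a list of rows of greyscale values (0-255). Pixels outside
    the window become 0 (absolutely black = 'off'); pixels inside keep
    their value ('on'). This mimics the role of junXion's video filters
    as described in the manual, not their actual implementation.
    """
    return [[px if low <= px <= high else 0 for px in row]
            for row in frame]
```

After filtering, the tracker only distinguishes zero from non-zero pixels; the surviving brightness values themselves no longer matter.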



junXion @ Steim