Rhizomatic!

Heterogeneous network of interacting musical objects.

Motivation

Music is most fun if you don’t have to play it yourself. So we wanted to create an installation that, starting from some initial configuration, would produce “self-composing” music on its own. While it plays, user interaction should also be possible by changing the configuration.

Overview

The idea was to have a set of objects generate sounds (either autonomously or when triggered) and have each object’s output stimulate the other objects around it. In practice, this is realized by placing various things (with different shapes and colors) on a white table, moving them around, and removing them again. On the one hand, the objects emit pulses (or waves) according to their rhythmical pattern; on the other hand, they may react to the pulses received from other objects.

Ideally, each object would have its own little “behavioural pattern” built in, but to keep the project realizable, we decided to limit ourselves to basically two classes of objects:

Active objects produce continuous sound (loops, in our case) and accordingly emit continuous rhythmic pulses which trigger the reactive objects. Reactive objects produce short sounds when triggered and, in turn, emit pulses to which other objects react. The pulses spread at a certain speed from the originating object and lose energy on their way, so by varying the distance between objects you change both the relative rhythmical placement of their sounds and the volume of the reacting object’s output.

(Concept diagram)
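
As a rough illustration of this behaviour (not part of the actual MAX/MSP patch), the following Python sketch derives a pulse’s arrival time and remaining energy from the distance between two objects. The propagation speed and attenuation constants, as well as the function name, are assumptions chosen for the example only.

```python
import math

# Assumed constants for illustration only; the real patch works in
# whatever scale the camera coordinates provide.
PULSE_SPEED = 200.0   # pixels per second the pulse front travels
ATTENUATION = 0.004   # energy lost per pixel of distance

def pulse_arrival(src_pos, dst_pos, emit_time, energy=1.0):
    """Return (arrival_time, remaining_energy) of a pulse travelling
    from src_pos to dst_pos. Positions are (x, y) in camera pixels."""
    dx = dst_pos[0] - src_pos[0]
    dy = dst_pos[1] - src_pos[1]
    distance = math.hypot(dx, dy)
    arrival_time = emit_time + distance / PULSE_SPEED
    remaining_energy = max(0.0, energy - ATTENUATION * distance)
    return arrival_time, remaining_energy

# Moving two objects further apart delays the trigger and lowers
# the volume of the reacting object's output:
print(pulse_arrival((100, 100), (160, 180), emit_time=0.0))
print(pulse_arrival((100, 100), (400, 420), emit_time=0.0))
```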

Realization

Flow of information

Objects are recognized by a camera. Based on its color, a bounding box is determined for each object using jit.findbounds. The positions and sizes of these boxes are used to calculate the distances between objects. When an active object is recognized, a sample automatically starts to play, and signals are broadcast from the object at the same time. These signals arrive at the other, reactive objects with a delay; the amount of delay depends on the distance between each pair of objects.
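
The Python sketch below mimics only the downstream arithmetic of that chain: it takes the per-colour bounding boxes that jit.findbounds would report, computes the distance between the box centres, and turns it into a trigger delay. The distance-to-milliseconds factor and the function names are illustrative assumptions, not part of the patch.

```python
import math

MS_PER_PIXEL = 5.0  # assumed mapping from pixel distance to delay

def box_centre(box):
    """box = (min_x, min_y, max_x, max_y) for one colour (assumed layout)."""
    min_x, min_y, max_x, max_y = box
    return ((min_x + max_x) / 2.0, (min_y + max_y) / 2.0)

def trigger_delay_ms(box_a, box_b):
    """Delay before a pulse from object A reaches object B."""
    ax, ay = box_centre(box_a)
    bx, by = box_centre(box_b)
    return math.hypot(bx - ax, by - ay) * MS_PER_PIXEL

# Example: an active object and a reactive object on the table.
active = (90, 120, 130, 160)
reactive = (300, 80, 340, 120)
print(trigger_delay_ms(active, reactive))  # larger distance -> longer delay
```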

Accomplished Tasks

Problems

Due to our limited knowledge of MAX/MSP and the limited time available, the system is configured so that signal broadcasting depends on the length of the sample audio file. Our initial idea was to do beat detection, so that samples with a fast beat would send more signals than samples with a slow beat. The workaround we applied is to divide the length of the sample into four equal parts and send signals to the other objects at these intervals. This is clearly a point for improvement: with a more advanced beat detection mechanism, the melody created by the setup might sound more harmonic and less like noise.
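
For clarity, here is the workaround expressed as a small Python sketch; the function name and the use of seconds are illustrative, the actual timing lives in the MAX/MSP patch.

```python
def pulse_times(sample_length_s, divisions=4):
    """Workaround pulse schedule: split the sample length into equal
    parts and emit a signal at each boundary instead of detecting beats."""
    interval = sample_length_s / divisions
    return [i * interval for i in range(divisions)]

# A 6-second loop broadcasts four pulses, 1.5 s apart:
print(pulse_times(6.0))   # [0.0, 1.5, 3.0, 4.5]
```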

Our intention was also to recognize an unlimited number of objects. But because we did not find out how to create dynamic objects in the MAX/MSP environment, we have limited ourselves to a fixed number of objects, namely two active objects and three reactive objects. Our guess is that scripting is necessary to create objects while the patch is running.
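
For comparison, this is roughly the bookkeeping we had in mind, sketched in Python: an open-ended registry of objects instead of two plus three fixed slots. The class and method names are purely illustrative and say nothing about how such dynamic creation would actually be scripted inside MAX/MSP.

```python
class TableModel:
    """Sketch of the desired bookkeeping: an open-ended registry of
    objects rather than a fixed set of active and reactive slots."""

    def __init__(self):
        self.objects = {}   # object_id -> {"kind": ..., "pos": (x, y)}

    def add(self, object_id, kind, pos):
        self.objects[object_id] = {"kind": kind, "pos": pos}

    def remove(self, object_id):
        self.objects.pop(object_id, None)

    def reactive_neighbours(self, object_id):
        """All reactive objects that should receive pulses from object_id."""
        return [oid for oid, obj in self.objects.items()
                if oid != object_id and obj["kind"] == "reactive"]

table = TableModel()
table.add("loop1", "active", (100, 100))
table.add("hit1", "reactive", (300, 200))
table.add("hit2", "reactive", (180, 260))
print(table.reactive_neighbours("loop1"))   # ['hit1', 'hit2']
```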

Future work

Usage

Why "Rhizomatic!"?

Some pointers:

Sample Sources

Due to popular demand we’re revealing the sources of the sounds used in our demo: