Sounding Colors

Members

Ray Bohnenberger, Björn Kuhlmann, Christoph Röttgen

Introduction

In our project we want to produce music using an iSight camera and the Apple Motion Sensor of the latest PowerBooks. The project is intended for two simultaneous players creating music collaboratively.

The idea was to explore the colors of everyday objects in a new way. For children, for example, it could be quite interesting to "play" with different colors by making music with them.

The camera input determines the type of instruments, the melody and the harmonies. The RGB values are mapped to three different instruments: one for the melody, one for the bass and one for supporting percussion. A rhythm is provided by the system to keep the music more structured.
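
The actual implementation is a Max/MSP patch, so the following is only a rough Python stand-in for the mapping idea. The MIDI channel numbers, the C major scale and the note ranges are illustrative assumptions, not the values used in the patch.

```python
MELODY_CH, BASS_CH, PERC_CH = 0, 1, 9   # assumed MIDI channels (9 = GM drums)
C_MAJOR = [60, 62, 64, 65, 67, 69, 71]  # one octave of C major, MIDI note numbers

def rgb_to_notes(r, g, b):
    """Map 0-255 RGB components to one note per instrument."""
    melody = C_MAJOR[r * len(C_MAJOR) // 256]      # red drives the melody pitch
    bass = C_MAJOR[g * len(C_MAJOR) // 256] - 24   # green drives the bass, two octaves down
    perc = 35 + b * 10 // 256                      # blue picks a GM drum key (35-44)
    return [(MELODY_CH, melody), (BASS_CH, bass), (PERC_CH, perc)]

print(rgb_to_notes(200, 40, 120))  # a reddish pixel -> high melody note
```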

The input from the Apple Motion Sensor influences the speed and volume.
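
A sketch of this second mapping, again as a Python stand-in: it assumes the sensor reports signed tilt values of roughly -128 to 127 per axis, as tools such as ams-tracker expose them. The tempo and volume ranges are arbitrary illustrative choices.

```python
def clamp(v, lo, hi):
    return max(lo, min(hi, v))

def motion_to_tempo_volume(tilt_x, tilt_y):
    """Map two tilt axes to a tempo (BPM) and a MIDI volume (0-127)."""
    tempo = 120 + clamp(tilt_x, -128, 127) * 60 / 128  # forward/back tilt: 60-180 BPM
    volume = clamp(64 + tilt_y // 2, 0, 127)           # sideways tilt: softer or louder
    return tempo, volume

print(motion_to_tempo_volume(30, -40))  # slight forward tilt, tilted left
```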

[Images: Apple Motion Sensor and iSight]

Tasks

- discussing a new project (iDrums was not "expressive" enough)
- getting familiar with Max/MSP and MIDI
- mapping camera inputs to sounds
- adding a background rhythm
- embedding the Apple Motion Sensor

Problems and challenges

Different problems occurred while implementing our system. A traditional instrument is in most cases quite complex and hard to learn; in our system we wanted to produce music without requiring any knowledge of notes and rhythm, so we had to simplify some things. The RGB values of the camera are mapped to different pitches and harmonies. The grey value determines how fast a melody is played: dark colors produce a slow melody, while bright colors produce faster ones.
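
As a minimal sketch of this brightness rule (in Python rather than Max/MSP, since the real implementation lives in the patch): the luma weights below are the standard ITU-R BT.601 ones, while the note-length range is an illustrative assumption.

```python
def luma(r, g, b):
    """Perceived brightness of an RGB pixel (0-255), ITU-R BT.601 weights."""
    return 0.299 * r + 0.587 * g + 0.114 * b

def note_duration_ms(r, g, b, slow=800, fast=150):
    """Interpolate linearly: luma 0 -> long, slow notes; luma 255 -> short, fast notes."""
    return slow + (fast - slow) * luma(r, g, b) / 255

print(note_duration_ms(30, 30, 30))     # dark grey -> ~724 ms per note
print(note_duration_ms(240, 240, 240))  # near white -> ~188 ms per note
```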

A big problem is the quality of the camera images. Printed color tables appear much greyer on screen than they are in reality, especially under artificial light. Another problem of the iSight is its autofocus, which we were not able to disable: even when the camera is kept still, the colors permanently change a little bit.
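
One conceivable mitigation, which is not part of our patch, would be to smooth the incoming color values so that small automatic camera adjustments do not constantly retrigger new notes. A sketch using an exponential moving average (the alpha value is chosen arbitrarily):

```python
class RGBSmoother:
    """Exponential moving average over incoming RGB frames."""
    def __init__(self, alpha=0.2):  # lower alpha = steadier, but slower to react
        self.alpha = alpha
        self.state = None

    def update(self, rgb):
        if self.state is None:
            self.state = list(rgb)
        else:
            self.state = [self.alpha * new + (1 - self.alpha) * old
                          for new, old in zip(rgb, self.state)]
        return tuple(round(c) for c in self.state)

smoother = RGBSmoother()
for frame in [(200, 40, 120), (205, 38, 118), (198, 42, 123)]:
    print(smoother.update(frame))  # frame-to-frame jitter is damped
```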

Possibilities for future work

In a future version of the system a pressure sensor could be mounted on the iSight. It would detect the pressure of a finger: every time the sensor is pressed, the melody instrument plays a note, with a volume that depends on the pressure.
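
Since this sensor does not exist yet, the following is a purely hypothetical sketch; the 0.0-1.0 pressure range and the noise threshold are assumptions about the unbuilt hardware.

```python
PRESS_THRESHOLD = 0.05  # assumed noise floor of the hypothetical sensor

def pressure_to_note_on(pressure, pitch):
    """Return a (pitch, velocity) MIDI note-on, or None below the threshold."""
    if pressure < PRESS_THRESHOLD:
        return None
    velocity = min(127, int(pressure * 127))  # harder press -> louder note
    return (pitch, velocity)

print(pressure_to_note_on(0.7, 64))   # firm press -> (64, 88)
print(pressure_to_note_on(0.02, 64))  # too light -> None
```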

List of references

Link to ams-tracker (for Apple Motion Sensor): http://www.kernelthread.com/software/ams/

Links to your Max/MSP patches

Max/MSP patch: midi-final

iDrums (cancelled)

In our project we want to realise drums that can be played with gestures alone. For this purpose we will use a video camera to recognize gestures of the hands or maybe even the feet, and/or infrared batons.

We haven't decided yet whether the user plays the drums continuously, or whether he can record some samples or loops that will be mixed together.

If we find a good and efficient way to recognize gestures and map them to sounds, we may add a second instrument for a second user, e.g. a guitar, to support collaborative music making.

Our goal is not to implement a perfect mapping that simulates real drums, but rather a simplified way of creating sounds that can be learned easily even by rhythmically untrained users.