Project: iTheremin

Thorsten Karrer Daniel Beck
thorsten(dot)karrer(at)rwth-aachen(dot)de danbee(at)gmx(dot)de

IDEA

The idea of our project is to turn the well-known "drumming on the edge of a table" into something more musical. One should be able to play simple tunes by knocking on the table. The system should output MIDI data so that you can play any instrument you like.

The name of the project was inspired by the Theremin, an early electronic instrument constructed by Lev Sergeyevich Termen around 1920. The Theremin can be played without being touched by the performer, in which respect it resembles our first prototype.


CONCEPT

A microphone listens for the knock-sounds and a video camera watches the edge of the table. The edge is virtually divided into several regions and a note is assigned to each region. When the system detects movement in a certain region it outputs the corresponding note (see picture below; red region is activated). The amplitude of the note depends on how hard you knock on the table and thus on the amplitude of the captured knock.
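The iTheremin itself is a Max/MSP patch, but the core mapping can be sketched in a few lines of Python. The function names, the pentatonic pitch set, and the normalized 0..1 amplitude range are illustrative assumptions, not part of the actual patch:

```python
# Hypothetical sketch: each table-edge region carries a pitch from a
# pentatonic scale, and knock amplitude becomes MIDI velocity.

PENTATONIC = [60, 62, 64, 67, 69]  # C major pentatonic, one note per region

def note_for_region(region: int) -> int:
    """Return the MIDI pitch assigned to a table-edge region."""
    return PENTATONIC[region]

def velocity_for_amplitude(amplitude: float) -> int:
    """Map a normalized knock amplitude (0..1) to MIDI velocity (1..127)."""
    amplitude = min(max(amplitude, 0.0), 1.0)
    return max(1, round(amplitude * 127))

def note_on(region: int, amplitude: float):
    """Combine camera (region) and microphone (amplitude) into a note-on."""
    return note_for_region(region), velocity_for_amplitude(amplitude)
```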

A possible extension might be to use two or more cameras, either to enable multiple users to play simultaneously or to add new functionality (e.g. pitch bend).

What it should look like...

IMPLEMENTATION

STEP 1: Video-Recognition
We thought the video recognition would be the hardest part of the project, but thanks to the Cyclops object it turned out to be pretty simple. In our first prototype (Patch) we defined five zones, each of which fired whenever the luminance of the block containing the zone exceeded a certain threshold. The output was used to trigger a note-on event whenever a bright object (e.g. your hand) entered the zone and a note-off event when it left. Each zone was assigned a different pitch on a pentatonic scale to avoid non-harmonic music. The mapping between zone numbers and pitch values was stored in an instrument definition table, so new mappings for different instruments (e.g. drums) could be added easily.
Later we combined several adjacent zones into a single pitch-zone to make targeting easier. In the end, the number of pitch-zones and their widths were also stored in the instrument definition table: the user selects an instrument, and the pitch-zones are arranged automatically according to its definition.
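A minimal sketch of the instrument-definition idea, assuming a table that lists each zone's pitch and relative width (the dictionary format and pitch choices here are illustrative, not the patch's actual data layout):

```python
# Hypothetical instrument definition table: for each instrument, a list of
# (midi_pitch, relative_width) pairs describing its pitch-zones.
INSTRUMENTS = {
    "pentatonic": [(60, 1), (62, 1), (64, 1), (67, 1), (69, 1)],
    "drums":      [(36, 2), (38, 2), (42, 1)],  # kick, snare, hi-hat
}

def arrange_zones(instrument: str, edge_width: int):
    """Divide the table edge (in pixels) into pitch-zones per the definition."""
    definition = INSTRUMENTS[instrument]
    total = sum(w for _, w in definition)
    zones, x = [], 0
    for pitch, w in definition:
        width = edge_width * w // total
        zones.append((pitch, x, x + width))  # (pitch, left, right)
        x += width
    return zones

def pitch_at(zones, x: int):
    """Return the pitch whose zone contains horizontal position x, if any."""
    for pitch, left, right in zones:
        if left <= x < right:
            return pitch
    return None
```

Selecting a different instrument then only means rebuilding the zone list; the detection logic stays untouched.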

Cyclops window with five zones


STEP 2: Detecting the KNOCK
Since the camera signal cannot provide all necessary information our plan was to extract the following information from the microphone signal (Patch):
  • the velocity of the note to be played, derived from the volume of the knock
  • the exact time to trigger the note
In order to get that data we applied the following transformations:
  • low-pass filtering by averaging over a 150 ms window
  • Schmitt-triggering the resulting signal to get clean rising and falling edges
  • beginning with the rising edge, analyzing subsequent samples of the low-pass filtered signal to determine the maximum amplitude, which is output on the falling edge
  • the resulting value is used to calculate the velocity and to trigger the note-on event
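The chain above (moving-average low-pass, Schmitt trigger, peak hold between edges) can be sketched in plain Python. The thresholds, window size, and one-sample-per-millisecond assumption are illustrative placeholders, not the patch's actual values:

```python
from collections import deque

class KnockDetector:
    """Sketch of the knock-detection chain: low-pass, Schmitt trigger, peak hold."""

    def __init__(self, window=150, high=0.2, low=0.1):
        self.window = deque(maxlen=window)  # ~150 ms at 1 sample/ms (assumed)
        self.high, self.low = high, low     # Schmitt trigger thresholds
        self.active = False                 # between rising and falling edge
        self.peak = 0.0

    def feed(self, sample: float):
        """Feed one amplitude sample; return the peak on a falling edge, else None."""
        self.window.append(abs(sample))
        env = sum(self.window) / len(self.window)  # moving-average low-pass
        if not self.active and env > self.high:    # rising edge
            self.active, self.peak = True, env
        elif self.active:
            self.peak = max(self.peak, env)        # track maximum amplitude
            if env < self.low:                     # falling edge: emit peak
                self.active = False
                return self.peak
        return None
```

The returned peak would then be mapped to velocity and used to trigger the note-on event.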
STEP 3: Putting both together...
...resulted in several hours of tweaking, solving timing issues, dirty hacking, and removing bug after bug. This took all the time we had planned to spend on cleaning up the patch ;-))

What a mess!


STEP 4: Nice visual feedback
In order to make the usage more intuitive, we provide the user with fullscreen video feedback. It shows the playing area with the current arrangement of the pitch-zones drawn as red rectangles. Whenever a pitch-zone is activated by placing a hand inside it, the corresponding rectangle is filled semi-transparently. To create the illusion that the video is recorded from the user's position (although the camera is mounted on top of the display), we flipped the video accordingly.
Video output
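The flipping and the semi-transparent fill can both be sketched with NumPy, treating a frame as a (height, width, channels) array. The function names and the 0.5 alpha value are illustrative assumptions:

```python
import numpy as np

def mirror_frame(frame: np.ndarray) -> np.ndarray:
    """Flip a (H, W, C) frame left-to-right so feedback matches the user's view."""
    return frame[:, ::-1, :]

def highlight_zone(frame: np.ndarray, left: int, right: int, alpha=0.5) -> np.ndarray:
    """Blend a semi-transparent red fill over an activated pitch-zone."""
    out = frame.astype(np.float32)
    red = np.array([255.0, 0.0, 0.0])
    out[:, left:right, :] = (1 - alpha) * out[:, left:right, :] + alpha * red
    return out.astype(frame.dtype)
```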


SEE THE iTheremin IN ACTION

Video 1 (6.1 MB)
Video 2 (12.9 MB)

DOWNLOAD

Patch and Instrument Definition



Modified: 22-Jul-2004