E-mail addresses are firstname.lastname@example.org.
The idea behind the project was to enable interaction with a computer without standard input devices such as a keyboard or mouse. We were looking for "input devices" that people carry with them wherever they go and that need no special training.
So the concept of "InstruFace" was born.
Because the eyes and the mouth are two parts of the face that nearly everyone can move in the same way, we chose them to control digital audio.
After starting the program, the user gets a brief introduction in the form of a short movie that explains how to use the patcher. By opening and widening the mouth, the user changes the pitch and velocity
of the included samples. Samples can be switched by closing the left eye and time-stretched by closing the right eye.
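The mapping from mouth movement to sound could look like the following minimal Python sketch (a hypothetical illustration only; the actual mapping is implemented inside the Max/MSP patch, and the normalized tracker values are an assumption):

```python
# Hypothetical sketch of a mouth-to-MIDI mapping. Assumes the face tracker
# delivers mouth opening and mouth width normalized to the range 0.0 .. 1.0.

def mouth_to_midi(opening: float, width: float,
                  low_note: int = 48, high_note: int = 84) -> tuple[int, int]:
    """Map mouth opening to MIDI pitch and mouth width to velocity."""
    opening = min(max(opening, 0.0), 1.0)   # clamp tracker noise
    width = min(max(width, 0.0), 1.0)
    pitch = round(low_note + opening * (high_note - low_note))
    velocity = round(1 + width * 126)       # keep velocity in 1 .. 127
    return pitch, velocity
```

A half-open, half-wide mouth, for example, would land in the middle of both ranges: `mouth_to_midi(0.5, 0.5)` yields pitch 66 at velocity 64.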
Closing both eyes for five seconds stops the recognition and switches to a slideshow of screenshots taken during the session.
Ten seconds after playback, the patcher returns to its initial state and waits for the next user to come by (the included movement detection restarts the session).
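The session loop described above can be modeled as a small state machine. The following Python sketch is only an illustration of that control flow (the real logic lives in the Max patch; state names and the `next_state` helper are our own):

```python
# Hypothetical model of the patcher's session loop as a state machine.
from enum import Enum, auto

class State(Enum):
    IDLE = auto()        # waiting for the movement detection to trigger
    INTRO = auto()       # short movie explaining the controls
    PLAYING = auto()     # face tracking controls the samples
    SLIDESHOW = auto()   # screenshots taken during the session

EYES_CLOSED_STOP = 5.0   # seconds of closed eyes that end the recognition
SLIDESHOW_LENGTH = 10.0  # seconds before returning to the initial state

def next_state(state, movement, eyes_closed_s, slideshow_s, intro_done):
    if state == State.IDLE and movement:
        return State.INTRO
    if state == State.INTRO and intro_done:
        return State.PLAYING
    if state == State.PLAYING and eyes_closed_s >= EYES_CLOSED_STOP:
        return State.SLIDESHOW
    if state == State.SLIDESHOW and slideshow_s >= SLIDESHOW_LENGTH:
        return State.IDLE
    return state
```

Because the final transition leads back to IDLE, the patcher forms the constant loop mentioned under the tasks below.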
Tasks that had to be accomplished
- creating a program that, once started, needs no maintenance (it runs in a constant loop)
- tracking the eyes and mouth reliably without any preparation of the face (except the InstruCap)
- creating and managing user-defined MIDI samples
- a modular design that keeps the patcher easily upgradeable
Brief description of problems encountered
The ambient light was a constant problem for the subpatcher responsible for tracking the colors: even small changes in lighting led to a high rate of misrecognitions.
For that reason we had to spend a lot of time finding an efficient way to analyze the input signals.
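One common way to make such a detection signal robust, shown here only as a sketch and not necessarily the method used in the patch, is to smooth the raw match score and apply hysteresis, so that brief lighting flickers cannot toggle the detection:

```python
# Sketch: stabilizing a noisy 0..1 match score from a color tracker.
# Exponential smoothing filters out flicker; two separate thresholds
# (hysteresis) prevent rapid on/off toggling near the decision boundary.

class StableDetector:
    def __init__(self, on_threshold=0.6, off_threshold=0.4, alpha=0.3):
        self.on_threshold = on_threshold    # level needed to switch on
        self.off_threshold = off_threshold  # level needed to switch off
        self.alpha = alpha                  # smoothing factor (0..1)
        self.level = 0.0
        self.active = False

    def update(self, raw: float) -> bool:
        # Exponentially smooth the raw score from the tracker.
        self.level = self.alpha * raw + (1.0 - self.alpha) * self.level
        # Hysteresis: different thresholds for switching on and off.
        if not self.active and self.level >= self.on_threshold:
            self.active = True
        elif self.active and self.level <= self.off_threshold:
            self.active = False
        return self.active
```

With these defaults a single noisy frame never flips the state; the smoothed level has to stay high (or low) for several frames before the detection changes.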
In the beginning we intended to implement a real instrument, so that users could create sounds of their own. Because of the tracking problems, we decided to use a set of fixed samples instead, so
that everyone - even those who have never used InstruFace before - can create more or less pleasing sounds.
Possibilities for future work
- tracking without InstruCap
- implementing a system to optimize the color tracking (independent of the lighting)
- giving the user the possibility to create their own sounds, as on a real instrument
- taking screenshots whenever a funny face is detected
- building some kind of visualization (e.g. frame around the face)
- including more parts of the face
Project files including sound samples, Max/MSP patch etc.
InstruFace startup video
Who is InstruFace?