PO3: Sources that might be interesting

The following sources may be of interest; they were cited in the papers of Teresa Marrin and Wolfgang Samminger:
  • Allen, P. E. and R. B. Dannenberg. (1990). Tracking Musical Beats in Real Time. Proceedings of the International Computer Music Conference. San Francisco: Computer Music Association, pp. 140-143.
  • Anderson, D. and R. Kuivila. (1991). Formula: A Programming Language for Expressive Computer Music. IEEE Computer: 12-21.
  • Arcos, J. L., Dolores Canamero, and Ramon Lopez de Mantaras. (1998). Affect-Driven Generation of Expressive Musical Performances. "Emotional and Intelligent: the Tangled Knot of Cognition," AAAI Fall Symposium.
  • Armstrong, D. F., William C. Stokoe, and Sherman E. Wilcox (1995). Gesture and the Nature of Language. Cambridge, England, Cambridge University Press.
  • Berlioz, Hector (1843). Grand Traité d'instrumentation et d'orchestration moderne. Paris, Schönenberger (reprinted: Paris-Brussels, Lemoine, 1844).
  • Berlioz, H. and R. Strauss. (1948). Treatise on Orchestration. New York, Edwin Kalmus.
  • Bernsee, S. M. (1995-2002). Time and Pitch Scaling of Audio Signals.
  • Bilmes, J. A. (1993). Timing is of the Essence: Perceptual and Computational Techniques for Representing, Learning, and Reproducing Expressive Timing in Percussive Rhythm. Master's Thesis, Media Laboratory. Cambridge, MA, M.I.T.
  • Bobick, A. and A. Wilson. (1995). Using Configuration States for the Representation and Recognition of Gesture. Cambridge, MA, MIT Media Laboratory Perceptual Computing Section.
  • Bobick, A. F. and Y. A. Ivanov. (1998). Action Recognition using Probabilistic Parsing. Computer Vision and Pattern Recognition, IEEE Computer Society.
  • Bonada, J. (2000). Automatic Technique in Frequency Domain for Near-Lossless Time-Scale Modification of Audio. Proceedings of the International Computer Music Conference, Staatsbibliothek zu Berlin, Berlin, Germany.
  • Bordegoni, M. (1994). "Parallel use of hand gestures and force-input device for interacting with 3D and virtual reality environments." International Journal of Human-Computer Interaction 6(4): 391-413.
  • Boult, Sir Adrian. (1968). A Handbook on the Technique of Conducting. London, Paterson's Publications, Ltd.
  • Brecht, B., and Garnett, G. (1995). Conductor Follower. Proceedings of the International Computer Music Conference, Banff Centre for the Arts, Banff, Canada.
  • Bremmer, J. and H. Roodenburg. (1991). A Cultural History of Gesture. Ithaca, New York, Cornell University Press.
  • Bäuml, B. J. and F. H. Bäuml. (1975). A Dictionary of Gestures. Metuchen, NJ.
  • Burt, W. (1994). "Thoughts on Physicality and Interaction in Current Electronic Music and Art." Continuum 8(1).
  • Campbell, L. W., David A. Becker, Ali Azarbayejani, Aaron F. Bobick, and Alex Pentland. (1996). Invariant features for 3-D gesture recognition. Second International Workshop on Face and Gesture Recognition.
  • Camurri, A., Matteo Ricchetti, Massimiliano Di Stefano, and Alessandro Stroscio. (1998). EyesWeb - toward gesture and affect recognition in dance/music interactive systems. XII Colloquium on Musical Informatics.
  • Choi, I. (1998). From Motion to Emotion: Synthesis of Interactivity with Gestural Primitives. "Emotional and Intelligent: the Tangled Knot of Cognition." AAAI Fall Symposium.
  • Darrell, T. J., Irfan A. Essa, and Alex P. Pentland. (1995). Task-specific Gesture Analysis in Real-Time using Interpolated Views. Cambridge, MA, M.I.T. Media Laboratory Perceptual Computing Section.
  • Darrell, T. J. and A. P. Pentland. (1993). Space Time Gestures. IEEE Computer Society Conference on Computer Vision and Pattern Recognition.
  • Darrell, T. J. and A. P. Pentland. (1995). Attention-driven Expression and Gesture Analysis in an Interactive Environment. Cambridge, MA, M.I.T. Media Laboratory Vision and Modeling Group, Technical Report #364.
  • DIVA - Digital Interactive Virtual Acoustics
  • Doscher, J. (1996). Using Micro-Machined Accelerometers in Joysticks, 3DOF and 6DOF Systems: A New Paradigm for the Human Computer Interface. Norwood, MA, Analog Devices, Inc. Transportation and Industrial Products Division.
  • Fels, S. and G. Hinton. (1997). "Glove-TalkII: A neural network interface which maps gestures to parallel formant speech synthesizer controls." IEEE Transactions on Neural Networks.
  • Gibet, S., A. Braffort, C. Collet, F. Forest, R. Gherbi and T. Lebourq. (1996). Gesture in Human-Machine Communication: capture, analysis-synthesis, recognition, semantics. Gesture Workshop, London, Springer-Verlag.
  • Haflich, S. and M. Burns. (1983). Following a Conductor: The Engineering of an Input Device. Proceedings of the International Computer Music Conference, Eastman School of Music, Rochester, NY.
  • Harling, P. (1997). Gesture-Based Interaction. http://dcpul.cs.york.ac.uk/~philiph/gestures.html
  • Harling, P. A. and A. D. N. Edwards. (1997). Hand Tension as a Gesture Segmentation Cue. Gesture Workshop '96, London, Springer-Verlag.
  • Ilmonen, T. (1998). Tracking Conductor of an Orchestra Using Artificial Neural Networks. STeP'98: Human and Artificial Information Processing, Finnish Conference on Artificial Intelligence, Jyväskylä, Finland.
  • Ilmonen, T. and T. Takala. (1999). Conductor Following with Artificial Neural Networks. Proceedings of the International Computer Music Conference, Beijing, China.
  • Kjeldsen, R. and J. Kender. (1995). Visual Hand Gesture Recognition for Window System Control. International Workshop on Automatic Face- and Gesture-Recognition, Zurich, Switzerland.
  • Krom, M. W. (1996). Machine Perception of Natural Musical Conducting Gestures. M.S. Thesis, Media Laboratory. Cambridge, MA, M.I.T.
  • Kurtenbach, G. and E.A. Hulteen. (1990). Gestures in Human-Computer Communication. The Art of Computer-Human Interface Design. B. Laurel, ed. Reading, MA, Addison-Wesley Publishing Company: 309-317.
  • Lee, M., Freed, A., Wessel, D. (1991). Real-Time Neural Network Processing of Gestural and Acoustic Signals. International Computer Music Conference, Montreal, International Computer Music Association.
  • Lee, M., Garnett, Guy E. and Wessel, D. (1992). An Adaptive Conductor Follower. International Computer Music Conference, San Jose State University, International Computer Music Association. http://www.cnmat.berkeley.edu/~adrian/AES95/.
  • Logemann, G. W. (1989). Experiments with a Gestural Controller. International Computer Music Conference, International Computer Music Association, pp. 184-185.
  • Machover, T. (1996). The Brain Opera. http://brainop.media.mit.edu.
  • Marrin Nakra, T. (1999). Searching for Meaning in Gestural Data: Interpretive Feature Extraction and Signal Processing for Affective and Expressive Content. Trends in Gestural Control of Music. M. Wanderley, ed. Paris, IRCAM.
  • Marrin, T. (1996). Toward an Understanding of Musical Gesture: Mapping Expressive Intention with the Digital Baton. M.S. Thesis, Media Laboratory. Cambridge, MA, Massachusetts Institute of Technology.
  • Marrin, T. (1997). Possibilities for the Digital Baton as a General-Purpose Gestural Interface. CHI '97 Conference on Human Factors in Computing Systems, pages 311-312.
  • Marrin, T. and J. Paradiso. (1997). The Digital Baton: a Versatile Performance Instrument. International Computer Music Conference, Thessaloniki, Greece, pages 313-316.
  • Marrin, T. and R. Picard. (1998). Analysis of Affective Musical Expression with the Conductor's Jacket. XII Colloquium for Musical Informatics, Gorizia, Italy, pages 61-64.
  • Marrin, T. and R. Picard. (1998). The Conductor's Jacket: a Device for Recording Expressive Musical Gestures. International Computer Music Conference, Ann Arbor, MI, pages 215-219.
  • Mathews, M. V. (1991). The Conductor Program and Mechanical Baton. Current Directions in Computer Music Research. M.V. Mathews and J. R. Pierce, eds. Cambridge, MIT Press.
  • Metois, E. (1996). Musical Sound Information: Musical Gesture and Embedding Synthesis. Ph.D. Thesis, Media Laboratory. Cambridge, MA, Massachusetts Institute of Technology.
  • Modler, P. (1998). Expressing Emotions: Using Symbolic and Parametric Gestures in Interactive Systems. "Emotional and Intelligent: the Tangled Knot of Cognition." AAAI Fall Symposium.
  • Morita, H., Shuji Hashimoto, and Sadamu Ohteru. (1991). A Computer Music System that Follows a Human Conductor. Computer Magazine. 24: 44-53.
  • Mulder, A. (1996). "Getting a Grip on Alternate Controllers: Addressing the Variability of Gestural Expression in Musical Instrument Design." Leonardo Music Journal 6: 33-40.
  • Mulder, A., S. Fels and K. Mase. (1997). Empty-handed Gesture Analysis in Max/FTS. Kansei -- The Technology of Emotion, AIMI International Workshop, Genova, Italy.
  • Quek, F. (1993). Hand Gesture Interface for Human-Machine Interaction. Virtual Reality Systems.
  • Rovan, B. and M. Wanderley. (1998). Gestural Controllers: Strategies for Expressive Application. SEAMUS.
  • Rovan, J. B., Marcelo M. Wanderley, Shlomo Dubnov and Philippe Depalle, (1997). Instrumental Gestural Mapping Strategies as Expressivity Determinants in Computer Music Performance. KANSEI - The Technology of Emotion.
  • Rudolf, M. (1994). The Grammar of Conducting. New York, Schirmer Books.
  • Sawada, H., Shin'ya Ohkura, and Shuji Hashimoto. (1995). Gesture Analysis Using 3D Acceleration Sensor for Music Control. International Computer Music Conference.
  • Segen, J., A. Majumder, and J. Gluckman. (2000). Virtual Dance and Music Conducted by a Human Conductor. Bell Laboratories, Holmdel, NJ.
  • Starner, T. and A. Pentland. (1995). Real-Time American Sign Language Recognition from Video Using Hidden Markov Models. Cambridge, MA, MIT Media Laboratory Perceptual Computing Section, Technical Report #375.
  • Usa, S. and Y. Mochida. (1998). "A conducting recognition system on the model of musicians' process." Journal of the Acoustical Society of Japan 19(4).
  • Usa, S. and Y. Mochida. (1998). A Multi-Modal Conducting Simulator. Proceedings of the International Computer Music Conference, University of Michigan, Ann Arbor, MI.
  • Wanderley, M. (1996). 3D Position Signals Acquisition System with application in real-time processing. ICSPAT.
  • Wanderley, M. (1997). Instrumental Gestural Mapping Strategies as Expressivity Determinants in Computer Music Performance. Kansei - the Technology of Emotion Workshop.
  • Wanderley, M. M. (1999). Non-obvious Performer Gestures in Instrumental Music. III Gesture Workshop, Gif-sur-Yvette - France.
  • Watanabe, T. and M. Yachida. (1998). Real Time Gesture Recognition Using Eigenspace from Multi Input Image Sequences. IEEE Conference on Face and Gesture, Nara, Japan.
  • Wexelblat, A. D. (1994). A Feature-Based Approach to Continuous-Gesture Analysis. Master's Thesis, Media Laboratory. Cambridge, MA, M.I.T.
  • Wilson, A., Aaron Bobick, and Justine Cassell. (1996). Recovering the Temporal Structure of Natural Gesture. Second International Conference on Automatic Face and Gesture Recognition.
  • Wilson, A. and A. Bobick. (1995). Learning Visual Behavior for Gesture Analysis. IEEE Symposium on Computer Vision.
  • Wilson, A. and A. F. Bobick. (1998). Nonlinear PHMMs for the Interpretation of Parameterized Gesture. IEEE Computer Society Conference on Computer Vision and Pattern Recognition.
  • Wilson, A. and A. F. Bobick. (1998). Recognition and Interpretation of Parametric Gesture. International Conference on Computer Vision.
  • Wilson, A. and A. F. Bobick. (1999). Parametric Hidden Markov Models for Gesture Recognition. IEEE Transactions on Pattern Analysis and Machine Intelligence.
  • Wilson, A. D. and A. F. Bobick. (1999). Realtime Online Adaptive Gesture Recognition. International Workshop on Recognition, Analysis and Tracking of Faces and Gestures in Real Time Systems.
  • Wundt, W. M. (1973). The Language of Gestures. The Hague, Mouton & Company. Reprinted from 1921 edition.