Hand Postures for Sonification Control

The paper is available from Springer.

Authors: Thomas Hermann, Claudia Nölker and Helge Ritter
In: Ipke Wachsmuth, Timo Sowa (Eds.), Gesture and Sign Language in Human-Computer Interaction, Proceedings of the International Gesture Workshop, Springer, 2001

This page reports further results on the interactive control of sonifications using a computer-vision-based hand posture recognition system. Sound examples for Section 3 of the paper, demonstrating the real-time control of soundscapes described in the first part of the paper, are available at GREFIT: Visual Recognition of Hand Postures. Sound examples for Section 4 are presented below.

Section 4.1 Using Hand Postures to Control Parameter Mapping Sonifications

The first sound example presents different sonifications of the Iris dataset, obtained by interactively modifying the parameter ranges while the sonification rendering is cycled.
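To illustrate the principle, here is a minimal sketch of a parameter mapping sonification in Python. It is not the authors' implementation: the mapping (first data column to pitch, second to amplitude), the frequency and amplitude ranges, and the stand-in random data in place of the Iris measurements are all assumptions for illustration. The adjustable `ranges` argument plays the role of the parameter ranges that the hand postures modify interactively.

```python
import numpy as np


def parameter_mapping_sonification(data, ranges, sample_rate=22050, note_dur=0.15):
    """Render each data row as a short sine tone.

    Hypothetical mapping for illustration: column 0 controls pitch,
    column 1 controls amplitude, each scaled into the user-adjustable
    intervals given in `ranges` (the quantities a posture interface
    could modify between rendering cycles).
    """
    (f_lo, f_hi), (a_lo, a_hi) = ranges
    # Normalize every column to [0, 1] over the dataset.
    mins, maxs = data.min(axis=0), data.max(axis=0)
    norm = (data - mins) / (maxs - mins)
    n = int(sample_rate * note_dur)
    t = np.arange(n) / sample_rate
    env = np.hanning(n)  # fade in/out to avoid clicks between tones
    tones = []
    for row in norm:
        freq = f_lo + row[0] * (f_hi - f_lo)   # pitch from column 0
        amp = a_lo + row[1] * (a_hi - a_lo)    # loudness from column 1
        tones.append(amp * env * np.sin(2 * np.pi * freq * t))
    return np.concatenate(tones)


# Stand-in data (10 rows, 4 numeric columns, like the Iris features):
rng = np.random.default_rng(0)
data = rng.uniform(0.0, 1.0, size=(10, 4))
audio = parameter_mapping_sonification(data, ranges=((220.0, 880.0), (0.2, 0.8)))
```

Re-rendering with widened or narrowed `ranges` changes how strongly the data differences are audible, which is the effect the interactive range adjustment in the sound example demonstrates.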

Section 4.2 Using Hand Postures to Amplify Contrast in Sonifications

The sound examples for this section will be supplied at a later time (sorry).