Real-Time Control of Sonification Models with a Haptic Interface
Authors: Thomas Hermann, Jan Krause and Helge Ritter
ICAD 2002, July 2002, Japan
Sound Demonstrations for the Audio-Haptic Ball Interface
- Sounds for atomic collisions:
Each model mass is assigned a material according to the data within
the neuron's Voronoi cell. In our case, binary classification data
is used, and an object's type is A (or B) if data from class A (or
B) dominates the cell.
- Collision between an A object and an A object: sound
- Collision between a B object and a B object: sound
- Collision between an A object and a B object: sound
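The material assignment described above can be sketched as follows. This is a hypothetical illustration, not the authors' implementation: each data point is assigned to its nearest neuron (defining the Voronoi cells), and each neuron takes the material of the majority class in its cell. The function name and array layout are assumptions.

```python
import numpy as np

def assign_materials(neurons, data, labels):
    """Assign each neuron the material ('A' or 'B') of the majority
    class among the data points in its Voronoi cell.

    neurons: (K, d) array of neuron positions
    data:    (N, d) array of data points
    labels:  (N,) array of class labels, 'A' or 'B'
    """
    # Nearest-neuron index of every data point defines the Voronoi cells.
    dists = np.linalg.norm(data[:, None, :] - neurons[None, :, :], axis=2)
    nearest = np.argmin(dists, axis=1)

    materials = []
    for k in range(len(neurons)):
        cell_labels = labels[nearest == k]
        n_a = np.sum(cell_labels == 'A')
        n_b = np.sum(cell_labels == 'B')
        materials.append('A' if n_a >= n_b else 'B')
    return materials
```

With this assignment, a collision between two masses simply looks up the two materials to select one of the three sounds above (A-A, B-B, or A-B).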
- Table 1: Sound examples for synthetic datasets from binary classification
A) two separated classes -- sound
B) two classes showing little overlap along one axis -- sound
C) two classes that overlap (inseparable classes) -- sound
Note: The sonification model was excited by shaking the interface device.
The shaking excitation is given by the accelerations a_x(t) and a_y(t).
During the first half of each sound example, the interface is shaken along
the x axis; during the second half, along the y axis. Each example lasts
about 5 sec.
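The excitation by shaking can be sketched as a simple coupling between the measured device accelerations a_x(t), a_y(t) and the velocities of the model masses. This is a minimal sketch under assumed names and an assumed coupling constant, not the authors' code:

```python
import numpy as np

def shake_kick(velocities, a_x, a_y, gain=0.05):
    """Apply one excitation step of the shaking interaction.

    Every model mass receives a velocity kick proportional to the
    measured device acceleration (a_x, a_y). 'gain' is a hypothetical
    coupling constant; the real model may use a different coupling.

    velocities: (K, 2) array of 2-d mass velocities
    returns:    updated (K, 2) velocity array
    """
    kick = gain * np.array([a_x, a_y])
    return velocities + kick
```

Shaking along the x axis (a_y near zero) thus excites motion, and hence collisions, mainly along x; shaking along y excites the orthogonal direction, which is what the two halves of each sound example demonstrate.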
Table 2: Sound examples for data-solid sonifications using a Growing Neural
Gas (GNG) network on dataset (B) (see above) at different network complexities.
Sound example for shaking while the GNG adaptation proceeds, and thus while
the data-solid structure changes over time: sound
example. One can hear that, as the GNG grows, more and more
neurons exist (more collisions). From the pitch it can be perceived that
the number of data points that have a given neuron as their nearest neighbor
decreases: new neurons are added between neurons that are frequently activated.
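The pitch cue described above can be illustrated with a simple mapping. The sketch below assumes (as a plausible reading of the text, not a confirmed detail of the model) that a neuron's "mass" is proportional to the number of data points in its Voronoi cell and that its collision pitch scales like 1/sqrt(mass), as for a spring-mass oscillator; all constants are illustrative.

```python
import math

def collision_pitch(n_points, ref_count=100, base_freq=220.0):
    """Hypothetical pitch mapping: frequency rises as the number of
    data points in a neuron's Voronoi cell falls.

    Assumes frequency ~ 1/sqrt(mass) with mass proportional to the
    point count; ref_count and base_freq are illustrative constants.
    """
    mass = max(n_points, 1)
    return base_freq * math.sqrt(ref_count / mass)
```

Under this mapping, halving the points per cell raises the pitch by a factor of sqrt(2), so the rising pitch during GNG growth directly conveys shrinking receptive fields.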