Aiding the Development of Piano Gesture Recognition with a Gesture Enriched Instrument

Aristotelis Hadjakos, Erwin Aitenbichler, and Max Mühlhäuser

Telecooperation Group, TU Darmstadt, Hochschulstr. 10, 64289 Darmstadt, Germany {telis, erwin, max}@tk.informatik.tu-darmstadt.de

1 Introduction

Recognition of instrumental playing movements has recently drawn attention, especially in the domain of bowed string instruments. Peiper et al. have used decision trees to classify bowing movements [1]. Rasamimanana et al. have used k-NN (k-Nearest Neighbours) in a feature space spanned by the minimal and maximal acceleration of a bowing gesture [2]. Young has used a combination of PCA (principal component analysis) and k-NN to recognize bowing gestures [3]. Although the system of Peiper et al. provides real-time visual feedback, none of these studies report using a gesture-enriched instrument to assist the development of a recognition method, as proposed in this paper.

In this paper we discuss the development of a method that distinguishes three piano playing gestures: a touch with pronation, i.e., a counterclockwise (clockwise) rotation of the right (left) arm; a touch with supination, i.e., a clockwise (counterclockwise) rotation of the right (left) arm; and a touch with no rotation. We built a gesture-enriched piano that assisted us in developing the recognition method. The instrument plays a unique sound for each recognized movement by generating MIDI messages, which allowed us to test the recognition method under realistic conditions. This led to the identification of problems that we had not anticipated and that were initially not reflected in our training set. After identifying these problems, we extended our corpus and used the new data to improve the method.
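As an illustration of this feedback loop, below is a minimal sketch of a gesture-to-MIDI mapping; the mido library, the port handling, and the note assignments are our assumptions, since the paper does not describe its implementation:

```python
# Hypothetical sketch of the gesture-to-sound feedback described above; the
# paper does not specify its implementation. The mido library, the port
# handling, and the note assignments are our assumptions.
import mido

# Assumed mapping: one distinct MIDI note per recognized gesture class.
GESTURE_NOTES = {"pronation": 60, "no_rotation": 64, "supination": 67}

def play_gesture_sound(port: mido.ports.BaseOutput, gesture: str) -> None:
    """Emit a unique sound for the recognized gesture as a MIDI note-on."""
    note = GESTURE_NOTES[gesture]
    port.send(mido.Message("note_on", note=note, velocity=80))
    port.send(mido.Message("note_off", note=note))

# Usage: port = mido.open_output(); play_gesture_sound(port, "pronation")
```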

2 Recognition Method

The recognition is based on data from inertial sensors, which we developed and custom-built. Prior to storage or recognition, the values obtained from the accelerometers and gyroscopes are transformed into physical quantities. Our recognition method uses as its main features the mean rate of turn, measured at the wrist, in two successive time slices of 8 samples each, denoted ts1 and ts2 (see Fig. 1 top).
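To make the feature definition concrete, the following minimal NumPy sketch computes ts1 and ts2 from a stream of wrist rate-of-turn samples; the array layout, the onset alignment, and the function name are our assumptions:

```python
import numpy as np

def extract_features(rate_of_turn: np.ndarray, onset: int) -> tuple[float, float]:
    """Mean rate of turn in two successive 8-sample time slices (ts1, ts2).

    rate_of_turn: gyroscope-derived rate of turn at the wrist, one value per
    sample. onset: index of the first sample of ts1; how the slices are
    aligned with the key press is our assumption, not detailed in the paper.
    """
    ts1 = float(np.mean(rate_of_turn[onset:onset + 8]))
    ts2 = float(np.mean(rate_of_turn[onset + 8:onset + 16]))
    return ts1, ts2
```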

Because of the many variations when executing a touch, it is difficult to obtain a truly representative corpus of examples. Consider, e.g., our initial corpus (see Fig. 1 top left): it seems possible to recognize the movements based on ts2 alone. This is, however, not the case, because of passive forearm rotation generated by the muscular forces on the fingers and by the interaction between the playing apparatus and the key. The gesture-enriched instrument helped us to develop the recognition method, as it helped us to spot this and similar issues.

The final version of our method uses adaptive thresholding on ts1 based on the MIDI velocity of the played note (see Fig. 1 bottom). Additionally, if ts2 exceeds a certain value, the threshold on ts1 is modified. The method obtains high recognition rates on the examples of the final corpus.
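The following sketch illustrates the structure of such a decision rule; all constants, the form of the velocity adaptation, and the sign convention are our assumptions, not the values used in the paper:

```python
def classify_touch(ts1: float, ts2: float, midi_velocity: int) -> str:
    """Classify a touch as pronation, supination, or no rotation.

    Illustrative only: the threshold constants, the linear velocity
    adaptation, and the sign convention are invented; the paper reports
    the approach, not these values.
    """
    # Threshold on ts1, adapted to the MIDI velocity of the played note.
    threshold = 50.0 + 2.0 * midi_velocity

    # If ts2 exceeds a certain value, the threshold on ts1 is modified.
    if abs(ts2) > 300.0:
        threshold *= 1.5

    if ts1 < -threshold:
        return "pronation"    # assumed: negative rate of turn = pronation
    if ts1 > threshold:
        return "supination"
    return "no_rotation"
```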

! "! #! $! %! &!! &"!

!$!!

!'!!

!#!!

!(!!

!"!!

!&!!

!

&!!

"!!

(!!

#!!

)*+*,-./01234

526.,7/21.,"8,6.9:,;93.,0<,3=;:,>?@7A

!!"" !#$" !#"" !%$" !%"" !$" " $" %"" %$" #""

!!""

!#""

!%""

"

%""

#""

!""

&""

'()*+,-(.*+%/+)*01+203*+45+3621+789,:

'()*+,-(.*+#/+)*01+203*+45+3621+789,:

!!"" !#"" !$"" " $"" #"" !""

!%""

!&""

!'""

!!""

!#""

!$""

"

$""

#""

!""

'""

()*+,-.)/+,#0,*+12,314+,56,4732,89:-;

()*+,-.)/+,$0,*+12,314+,56,4732,89:-;

No Rotation

Pronation No Rotation

No Rotation

Pronation Pronation

Supination

Supination

Supination

Fig. 1. Initial corpus with 776 examples (top left), full corpus with 11615 examples (top right). The noise onts2depends on the loudness of the played note (bottom)

References

1. Peiper, C., Warden, D., Garnett, G.: An interface for real-time classification of articulations produced by violin bowing. In: Proc. of NIME 2003 (2003)

2. Rasamimanana, N.H., Fléty, E., Bevilacqua, F.: Gesture analysis of violin bow strokes. In: GW 2005. LNCS, Springer (2006)

3. Young, D.: Classification of common violin bowing techniques using gesture data from a playable measurement system. In: Proc. of NIME 2008 (2008)
