6.6 Knowledge and Expertise

The hypothesis that implicit knowledge matches psychoacoustical features is confirmed by the high correlation of mean chill-potential with the number of experienced chills, r_s = .80 (p < .01). However, many parameters that could play a role in the emotional perception of music and contribute to implicit knowledge, such as harmonicity, subjective experiences and structural properties of the musical pieces, were not investigated in this study.
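For illustration, a minimal sketch of how such a rank correlation can be computed is given below (in Python, using scipy.stats.spearmanr); the arrays are hypothetical placeholders, not the data of this study.

    import numpy as np
    from scipy.stats import spearmanr

    # Hypothetical per-piece values: mean rated chill-potential and the
    # number of chills actually experienced while listening to that piece.
    mean_chill_potential = np.array([2.1, 3.4, 4.0, 1.8, 3.9, 2.7])
    experienced_chills = np.array([1, 4, 6, 0, 5, 2])

    # Spearman rank correlation, analogous to the r_s reported above
    r_s, p_value = spearmanr(mean_chill_potential, experienced_chills)
    print(f"r_s = {r_s:.2f}, p = {p_value:.3f}")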

All the investigated parameters, including the psychoacoustical, structural and performance parameters, enable participants to rate the chill-potential of musical excerpts as short as 20 seconds. The hypotheses that participants are able to rate the emotional expression and the potential of the music to elicit chills independently of their own emotional involvement, and that they are aware of the musical parameters used by composers and performers to evoke emotions and elicit chills in listeners, are thereby also confirmed.

7 Outlook

7.1 Psychophysiology

In order to confirm the findings of the first experiment, it should be replicated with a more homogeneous group of listeners, i.e. one matched for age and gender. This initial exploratory approach has nevertheless generated several new insights into the origins of chills and their psychophysiological correlates.

As the frequency of chills differed from that reported by Panksepp (1995), the experiments should be repeated with groups of students. Furthermore, psychophysiological parameters could be measured simultaneously in a group of participants to investigate the hypothesis that the feeling of community affects the emotional experience of listening.

Schubert (2004) showed a time delay between musical structural events and self-reported emotions. With the methodology presented in this thesis, such musical events can now be linked to psychophysiology. The accuracy of the calculated delay between chill onset and the increase in HR and SCR should, however, be improved in future work, for instance by using more participants and within-participant repetitions.
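One possible way to estimate such a delay is sketched below: the skin conductance signal is epoched around each reported chill onset, the epochs are averaged, and the latency of the steepest rise is taken as the delay. The sampling rate, window length and all variable names are assumptions for illustration, not the procedure used in this thesis.

    import numpy as np

    def estimate_scr_delay(scr, chill_onsets, fs=10.0, window_s=10.0):
        # scr: 1-D skin conductance signal; chill_onsets: sample indices of
        # reported chill onsets; fs: sampling rate in Hz (assumed);
        # window_s: post-onset window length in seconds (assumed).
        win = int(window_s * fs)
        epochs = [scr[o:o + win] for o in chill_onsets if o + win <= len(scr)]
        mean_epoch = np.mean(epochs, axis=0)
        # latency (in s) of the largest sample-to-sample increase in the average
        return np.argmax(np.diff(mean_epoch)) / fs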

7.2 EMuJoy

The methods for measuring continuous self-reports were further developed with the EMuJoy software, illustrating the need for a standard technique for investigating emotional self-reports in response to multi-modal stimuli. The open-source status of the software furthers this cause by ensuring that there are no limits on further development and adaptation of the program, and will hopefully encourage contributions from researchers adapting the software to their needs. Furthermore, the application is not restricted to use in non-profit organizations; fields of application are conceivable particularly in advertising and the music industry.

7.3 Psychoacoustics

The parameters which the participants identified as predictors of the emotional impact of the music should be systematically varied in subsequent experiments. Increasing or decreasing the loudness between 8 and 18 Bark in chill pieces could influence the number of chills experienced by participants; based on the present findings, one would predict that increasing the loudness results in an increase of perceived arousal. The parameters which have been shown to play a role in the emotional perception of the music should also be analyzed in detail. Such studies are necessary for a more thorough understanding of the role of psychoacoustical parameters in emotion.
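A minimal sketch of such a manipulation is given below: the energy in the 8 to 18 Bark region of a stimulus is scaled by a gain factor and recombined with the rest of the signal. The Bark-to-Hz conversion (Traunmüller), the filter design and all parameter values are assumptions made for illustration; they are not taken from the experiments reported here.

    import numpy as np
    from scipy.signal import butter, sosfiltfilt

    def bark_to_hz(z):
        # Approximate Bark-to-Hz conversion (Traunmueller, 1990)
        return 1960.0 * (z + 0.53) / (26.28 - z)

    def vary_band_loudness(signal, fs, gain, low_bark=8, high_bark=18):
        # Scale the 8-18 Bark band of the signal by `gain` and recombine it
        # with the unchanged remainder (zero-phase filtering keeps the two
        # parts roughly phase-coherent; this is an approximation).
        lo, hi = bark_to_hz(low_bark), bark_to_hz(high_bark)  # ~915 Hz .. ~4390 Hz
        sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
        band = sosfiltfilt(sos, signal)
        return (signal - band) + gain * band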

7.4 Postface

Musical and psychoacoustical factors, particularly loudness and the density of events, play a role in the experience of strong emotions with music. Although no general one-to-one relation between structural events in music and the encoding and decoding of emotions was found, several contributing factors were identified. It was also shown that listeners are aware of these factors.

In summary, no evidence was found that people experience music independently of culture or personality. This caused me to reject the hypothesis that music works universally. Though universally functioning music might exist, the examples used in the experiments did not seem to be representative of such pieces.

Given the large set of pieces used in the studies, one can conclude that such universal music is a rare exception rather than the rule.

I believe that music encodes rather than induces emotions, despite cases in which an apparent “emotion” is caused by a startle, uncomfortably loud music, etc. This belief is supported by the finding that familiar music is experienced as more emotional than unfamiliar pieces. Furthermore, no evidence was found that music acts in a stimulus-response manner (in contrast to simple sounds, for example those that startle). Lastly, several personal factors were found to influence the experience of chills, including gender and length of choir membership.

Furthermore, psychological factors not presented here were identified and can be found in Grewe et al. (2006b). These include the findings that sensation seeking, high reward dependence, musical preferences and listening situations also play an important role in the emotional experience of music.

The thesis started with a quotation from Victor Hugo: “Music expresses that which cannot be said and on which it is impossible to be silent.” Through this exploratory study, many new insights into the wonderful and mysterious experience of music listening were gained. However, much still remains unsaid, or, according to Hugo, is perhaps impossible to articulate. Yet we should not stop trying to explain these phenomena; rather, we should continue to investigate the experience of music and emotions. The increasing number of recent publications on music and its cognitive and developmental effects supports Hugo’s quotation. This is especially true of neuroimaging studies investigating emotional and other effects of music (see for example Peretz & Zatorre, 2005, or Limb, 2006), as such research brings together many methods to attempt to express the ineffable nature of music.

A Tables