3. The Use of Visual Context for Language Processing

3.7 Emotions and Emotional Facial Expressions

3.7.1 Identification and Processing of Emotions

3.7.1.3 Children

Children seem to find positive emotions easier to process than negative ones. This might be one of the reasons why they show, similar to older adults, a preference towards positive emotions.

since these three are tightly intertwined and moreover crucial for successful human communication (Denham et al., 2011).

Children’s understanding of emotions emerges quickly but has been shown to improve gradually over time. Happy facial expressions are produced and identified earliest and most accurately, followed by sadness and then by other negative emotions such as fear and anger (Bullock & Russell, 1984; Widen & Russell, 2003).

This development takes place from a few months of age until early adolescence (see Herba & Phillips, 2004, for a review). Children begin to use emotion-descriptive language and to talk about their emotions at around 16-20 months of age (Fivush, Brotman, Buckner, & Goodman, 2000; Ridgeway, Waters, & Kuczaj, 1985). By the age of 2, basic emotional expressions such as happiness, sadness, anger and fear are already commonly used to describe related states such as crying and hurting (Wellman, Harris, Banerjee, & Sinclair, 1995). By the age of 4-5, children can reliably label the most basic emotional facial expressions such as happiness, sadness and anger, and at that age they also understand causal reasons for these emotions. However, the ability to comprehend the influence of false beliefs and conflicting desires on emotions, as well as the possibility of hiding emotions, only seems to emerge between 5 and 6 years of age (Pons & Harris, 2005).

The younger the children are, the more they seem to rely on facial expressions as opposed to situational information when emotional cues are conflicting. Gnepp (1983) showed 3-, 6- and 11-year-old children a series of pictures portraying children with emotional expressions in situational settings. The emotional facial expressions of the children in the pictures could either match or mismatch the situational setting, e.g., a picture could show a smiling boy with his broken bicycle or a crying boy with his broken bicycle. The children were then asked to indicate how the child in the picture felt and how they could tell that the child in the picture felt that way. The results indicated that the youngest children made judgments on the basis of the character’s facial expression alone, whereas the older children made significantly more affective judgments consistent with the situation. In addition, the older the children were, the better they were at creating a story that reconciled the conflicting facial and situational cues (Gnepp, 1983).

However, Morton and Trehub (2001) suggested that when emotional linguistic content and emotional prosody matched or mismatched, children younger than 9 years of age relied mostly on the content, whereas adults primarily relied on paralanguage in judging the happiness or sadness of a speaker. Only at the age of 10 did half of the children evaluate the speaker’s feelings on the basis of the paralinguistic cues. Nevertheless, when the content could not be judged because it was presented in a foreign language, even the youngest children, at 4 years of age, could accurately judge the emotional feeling of the speaker based on the emotional prosody alone (Morton & Trehub, 2001).

Gnepp (1983) and Morton and Trehub (2001) thus demonstrate that the ability to correctly use, process and identify emotional information does not only develop with age. Crucially, it also varies as a function of how the emotional information is presented and with which other information it co-occurs. Hence, the modality and the way in which emotional content is presented affect the way children (and adults) process this information and the ease with which they can do so.

Apart from the fact that the processing of emotional material seems to depend on the way in which it is presented, children also already react differently to emotional information of opposing valence. Not only do they understand and produce positive emotional expressions earlier than negative ones, they also seem to categorize the former more accurately than the latter (Durand, Gallay, Seigneuric, Robichon, & Baudouin, 2007).

Durand et al. (2007) asked 5-6-, 7-8-, 9-10- and 11-12-year-old children and a group of adults to recognize happy, sad, disgusted, angry, fearful and neutral facial expressions and found that the youngest group could only accurately recognize happy and sad facial expressions (see also Gross & Ballif, 1991). The recognition of fear, anger and disgust needed an additional 2-4 years to reach adult-level competence. Interestingly, neutral facial expressions also seem to be difficult for 5-6-year-old children to recognize, since children under the age of 9 showed a tendency to assign an emotional valence even to neutral facial expressions. Hence, correctly processing, identifying and categorizing emotional information is not an easy task and requires years of development and learning until adult competence is reached (Durand et al., 2007).

That emotional information is special compared to non-emotional information becomes even clearer when looking at children’s ERP responses. In contrast to neutral stimuli, emotionally positive and negative stimuli elicited an increased positivity between 500 and 1500 ms after stimulus presentation. This late positive potential (LPP) in response to emotional compared with neutral material has also been observed in adults. Thus, 5-8-year-old children already show adult-like brain responses in the on-line processing of emotional pictures (Hajcak & Dennis, 2009), even though they might not yet be able to fully identify all emotional categories. Although children know implicitly and explicitly from a very early age onwards that there is a difference between positive and negative emotional material, a full adult-like understanding of the different specific emotions takes time to develop.

However, research regarding a bias towards positive or negative emotional material in children, as has been found for older and younger adults, has not yet reached a clear consensus. Thomas et al. (2001), for example, investigated the amygdala activation of children and adults in response to fearful and neutral facial expressions and found increased activity for neutral compared to fearful expressions in children, and the opposite effect in adults. However, they did not test amygdala activation in response to positive emotional facial expressions. Moreover, children often assign an emotional valence to neutral expressions instead of categorizing them as non-emotional (Durand et al., 2007; Gross & Ballif, 1991; Hortaçsu & Ekinci, 1992; Tottenham, Phuong, Flannery, Gabard-Durnam, & Goff, 2013). Pagliaccio et al. (2013) did use positive emotional facial expressions in addition to negative and neutral expressions. They found no differences in the amygdala activation of 7-12-year-olds in response to emotional faces of different valences, but an increased activation in response to emotional compared to neutral facial expressions.

In contrast, Vaish, Grossmann, and Woodward (2008) argue that the negativity bias found in younger adults is already present in children. However, they only review previous studies that did not directly address the question of an emotional bias, and claim that the negativity bias in children has been shown as a byproduct in these studies. Moreover, most studies in their review suggest that extreme negative emotions/events are avoided more strongly than positive emotions/events are pursued. This is, for example, the case when infants are asked whether they want to cross a visual cliff based on the fearful, angry or happy facial expression of their mother, who stands on the other side of the visual cliff. Children crossed the cliff fewer times when the facial expression of the mother was fearful than when it was happy. This effect was driven by the fearful expression: none of the infants crossed in the fearful condition, whereas 14 of the 19 infants crossed in the happy condition (Sorce, Emde, Campos, & Klinnert, 1985).

For this reason among others, Vaish et al. (2008) argue for a negativity bias.

However, the task of crossing a cliff is a negative (and potentially dangerous) event and should be avoided regardless of the facial expression of the person standing on the other side. Hence, the claim that children show a negativity bias like adults, based on studies such as Sorce et al. (1985), seems questionable. Vaish et al. (2008) themselves mention that most of their reviewed studies do not provide intensity ratings. Moreover, the fact that children often interpret neutral expressions as negative does not generally favor the assumption of a negativity bias in children, but rather suggests that neutral as well as negative facial expressions are overall more complex and diverse than positive facial expressions.

This is further supported by Berman, Chambers, and Graham (2016), who measured 3- and 5-year-olds’ fixations towards happy, sad and neutral facial expressions while the children listened to happy-, sad- or neutral-sounding but semantically neutral sentences. They found that children could not match neutral expressions with the neutral vocal emotion. Moreover, although there was no behavioral difference between positive and negative face-prosody matching, children needed more time to fixate the positive facial expression when hearing a positive voice than to fixate the negative facial expression when hearing a negative voice. They fixated the matching negative face already 200 ms after the onset of the sad-sounding vocal affect, but needed 600 ms longer to fixate the happy face in the positive valence condition.

This finding could be interpreted as indicating a negativity bias in children similar to that found in younger adults. However, there is more evidence speaking in favor of a positivity bias in children, similar to that observed in older adults. First of all, children understand and use positive emotional information earlier and more accurately than negatively valenced emotions (for a review see Herba & Phillips, 2004). Moreover, when asked to generate as many words as possible for an imagined positive or negative emotional state, children produced fewer words related to negative feelings than to positive ones, and the older the children were (primary vs. secondary school), the more negative feelings they described (Doost, Moradi, Taghavi, Yule, & Dalgleish, 1999). When 7-9-year-old children and adults rated pictures for valence, arousal and complexity, children rated positive and neutral pictures as more positive than adults did, whereas there was no significant difference between age groups in the ratings of aversive images (Cordon, Melinder, Goodman, & Edelstein, 2013). Furthermore, children are not only more accurate, but also faster in identifying positive emotional facial expressions than negative ones (De Sonneville et al., 2002; Richards, French, Nash, Hadwin, & Donnelly, 2007).

Moreover, in a gaze-cueing task, Niedźwiecka and Tomalski (2015) found that 12-month-old infants oriented their gaze more rapidly to targets cued by happy compared to angry or fearful faces.

In addition, research reported by Todd, Evans, Morris, Lewis, and Taylor (2010) further underlines that children, just like older adults, show a positivity rather than a negativity bias (for a review see also Todd et al., 2012). In a functional magnetic resonance imaging (fMRI) study, they analyzed 3-8-year-olds’ and young adults’ brain responses to emotional facial expressions and found that children, but not adults, showed greater amygdala responses for happy than for angry facial expressions. The adults, who were between 18 and 33 years old, showed greater amygdala activation for angry compared to happy expressions, in line with their negativity bias. Children’s activation towards angry facial expressions increased with age (Todd et al., 2010).

In conclusion, even though emotion processing, just like face processing, is present from a very early age onwards, becoming an expert in a specific domain takes time and practice (for a review on expertise and face processing see Tarr & Cheng, 2003).

It seems that the processing, identification and interpretation of emotions are biased with regard to valence, depending on the age of the perceiver. Children presumably start out with a preference towards positive emotional material, maybe because there is less variation and complexity within the range of positively valenced information.

The older they get, the more this preference shifts towards the arguably more complex negative emotional information. However, this negativity bias in younger adults seems to change again into a positivity bias and an avoidance of negative emotions in older age.

Regardless of the perceiver’s age, emotions are social cues that help us understand our interlocutor and are hence a means of establishing successful communication.

Still, not many studies have addressed the question of how emotions influence the way we understand language in real time, and even fewer have taken the emotional biases of different age groups into account. Moreover, to our knowledge, there are to date no studies focusing on the real-time effect of visual emotional cues on language processing in children.

However, in two off-line studies, Ruffman, Slade, Rowlandson, Rumsey, and Garnham (2003) investigated to what extent language in 3-5-year-old children relates to emotion understanding, focusing specifically on the correlation between emotion understanding and syntax and semantics. They tested children’s understanding of syntax and semantics using the comprehension of word order and embedded clauses.

The sentence stimuli were carefully designed to permit a clear distinction between syntax and semantics. Moreover, Ruffman et al. (2003) asked the same children to identify emotions (happiness, fear, sadness, anger and surprise) from facial